Tagged: Electron Microscopy

  • richardmitnick 7:46 am on November 9, 2022
    Tags: "Inspiration at the atomic scale", , , Electron Microscopy, , , James LeBeau, , , , , , , , With new techniques in electron microscopy James LeBeau explores the nanoscale landscape within materials to understand their properties.   

    From The School of Engineering AT The Massachusetts Institute of Technology: “Inspiration at the atomic scale” 

    11.9.22
    Zach Winn

    MIT Associate Professor James LeBeau develops new techniques for gathering and analyzing data in electron microscopy to better understand material properties in fields including electronics, photonics, quantum mechanics, and energy storage. “Science is truly a creative outlet,” LeBeau says. Photo: Adam Glanzman.

    With new techniques in electron microscopy, James LeBeau explores the nanoscale landscape within materials to understand their properties.

    To explain why he loves electron microscopy, Associate Professor James LeBeau uses an analogy: He likens the technique, which uses beams of electrons to illuminate materials at a scale thousands of times smaller than conventional microscopes, to the inverse of astronomy.

    “It’s discovering things that no human has ever seen before that really captures the imagination,” LeBeau says. “There is a beauty to the way atoms are arranged in materials, particularly at defects, which give rise to all sorts of material behavior.”

    LeBeau has used that passion to develop new techniques for collecting and interpreting data in electron microscopy that can be used to describe materials more comprehensively. He’s applied those techniques to explain materials’ behavior in fields from electronics and optics to energy storage, quantum computing, and more.

    “Beyond explaining material properties, there’s also a significant computational component to electron microscopy as it’s used to analyze data that may have been overlooked previously and to make conclusions about the data in new ways. And, with the creation of the MIT Schwarzman College of Computing, it’s an exciting time to be at MIT,” he says.

    Discovering a passion

    LeBeau became interested in engineering while helping his father build and repair things around the house, and he discovered a love for science at a young age.

    “Science can provide an explanation of the world around us beyond supernatural beliefs,” LeBeau says. “For me, science was about making sense of the world.”

    LeBeau first learned about materials science through the technical high school he attended in Indiana. But it wasn’t until he was an undergraduate at Rensselaer Polytechnic Institute in New York that a few pivotal experiences helped set his course in life.

    During his first year, he participated in a project using data science to predict material properties.

    “After that I was hooked, and at that point I knew I wanted to go the academic route,” he recalls. “Just being able to explore things and have that academic freedom really appealed to me.”

    A few years later, in 2005, LeBeau participated in a summer research program for undergraduates at what is now the Materials Research Laboratory at MIT.

    The experience, in which he integrated biopolymers into a casting process, stoked his interest in using materials science for sustainability. The passion of the researchers around MIT also left a lasting impression on him.

    Finally, as a senior, LeBeau got his first taste of electron microscopy.

    “We’d be in the lab in the middle of the night analyzing these materials, and that excitement caught my attention pretty early on,” LeBeau says. “It didn’t really matter how much I was working — I loved doing it, and that set the stage for the rest of my career.”

    During his PhD at the University of California-Santa Barbara, LeBeau was part of a team that showed that scanning transmission electron microscopy theory and experiment are in very good agreement and, in turn, that attograms (one millionth of a trillionth of a gram) of material could be weighed directly from electron microscopy images without the need for external microscope calibration standards.
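    To get a feel for the scale involved, the mass of a tiny particle can be computed directly from a count of its atoms. The sketch below is purely illustrative (the gold composition and atom count are assumed values, not figures from LeBeau's work); it simply shows why such measurements land in the attogram range.

```python
# Illustrative sketch: the mass of a small particle from an atom count,
# expressed in attograms (1 ag = 1e-18 g). The element (gold) and atom
# count (~3 million) are assumptions chosen for illustration.

AVOGADRO = 6.02214076e23   # atoms per mole (exact SI value)
MOLAR_MASS_AU = 196.96657  # g/mol for gold

def mass_attograms(n_atoms: int, molar_mass: float) -> float:
    """Mass in attograms of n_atoms of an element with the given molar mass."""
    grams = n_atoms * molar_mass / AVOGADRO
    return grams * 1e18  # grams -> attograms

if __name__ == "__main__":
    # A ~3-million-atom gold particle weighs roughly a thousand attograms.
    print(f"{mass_attograms(3_000_000, MOLAR_MASS_AU):.1f} ag")
```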

    LeBeau also discovered a passion for cycling through the mountains near UC Santa Barbara's campus, an activity he continues by biking thousands of miles a year, including to MIT nearly every day regardless of the weather.

    After his PhD, LeBeau accepted a faculty position at North Carolina State University, where he worked for eight years before a similar position opened up at MIT in 2019.

    Since his move to MIT, LeBeau has helped the Institute adopt state-of-the-art electron microscopy equipment that researchers from across campus have taken advantage of at MIT.nano and elsewhere.

    “As an electron microscopist, the equipment I use is extremely expensive to maintain and necessitates that it becomes a shared resource. I’m happy that’s the case because ultimately users from across campus benefit from these tools and advance their science through this shared infrastructure,” LeBeau says. “More broadly, the microscope routinely challenges what people thought they knew about the materials they are studying. The results are always exciting.”

    Creativity and quantification

    When it’s his group’s turn on the microscope, LeBeau says they try to go after hard problems that require new ways of collecting and interpreting data.

    “We choose questions that are not easy to answer through other methods and that require new ways to extract information from our datasets to make conclusions,” LeBeau says.

    One type of material LeBeau has studied is relaxor ferroelectrics, which are used in applications including ultrasound, actuators, and energy storage. The materials have been studied for decades but are extremely heterogeneous at the nanoscale, making it difficult to explain their electromechanical properties. By analyzing the materials' structure using new electron microscopy techniques, LeBeau's group was able to explain their properties in a way that could help create more sustainable versions of these materials, which currently contain lead.

    “Impact is always at the forefront of everything we do,” LeBeau explains. “When we go after problems, the application space is very important because it tells us if the insights can change the way an entire space operates.”

    One area of LeBeau’s research explores ways to use machine learning to help the microscope collect data more quickly than a human could.

    “Transmission electron microscopy in general is often a very slow technique,” LeBeau explains. “But you can imagine a case where a self-driving microscope is able to align a microscope and sample much faster, and in a much more reproducible way, than a human can. Doing so would enable us to collect a full statistical description of the material. That’s where machine learning can play a role: in pulling more data out of what we’ve already acquired but also in the acquisition itself.”
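    As a toy illustration of one ingredient of such a "self-driving" microscope: any automated alignment routine needs a sharpness score to compare focus settings. The gradient-energy metric below is a generic autofocus heuristic, not the specific machine-learning approach used in LeBeau's lab.

```python
# Minimal sketch of an autofocus score: sum of squared differences between
# neighboring pixels. Sharper images (stronger local contrast) score higher,
# so an automated loop could step the focus and keep the best setting.
# The tiny 3x3 "images" are invented illustrative data.

def sharpness(image):
    """Gradient energy of a 2D image given as a list of rows."""
    score = 0.0
    for row in image:                      # horizontal neighbor differences
        score += sum((a - b) ** 2 for a, b in zip(row, row[1:]))
    for r1, r2 in zip(image, image[1:]):   # vertical neighbor differences
        score += sum((a - b) ** 2 for a, b in zip(r1, r2))
    return score

blurry = [[5, 5, 5], [5, 6, 5], [5, 5, 5]]  # low contrast
sharp = [[0, 9, 0], [9, 0, 9], [0, 9, 0]]   # strong atomic-column-like contrast
assert sharpness(sharp) > sharpness(blurry)
```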

    Indeed, making electron microscopy more quantitative and reproducible has been a theme of LeBeau’s career. But he doesn’t believe quantifying something comes at the expense of creativity.

    “Science is truly a creative outlet,” LeBeau says. “The creativity comes from not only creating new experiment design or theories, but also from deciding how to present your data in visually appealing and informative ways. There’s a major creative element to what we do.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    The MIT School of Engineering is one of the five schools of the Massachusetts Institute of Technology, located in Cambridge, Massachusetts. The School of Engineering has eight academic departments and two interdisciplinary institutes. The School grants SB, MEng, SM, engineer’s degrees, and PhD or ScD degrees. The school is the largest at MIT as measured by undergraduate and graduate enrollments and faculty members.

    Departments and initiatives:

    Departments:

    Aeronautics and Astronautics (Course 16)
    Biological Engineering (Course 20)
    Chemical Engineering (Course 10)
    Civil and Environmental Engineering (Course 1)
    Electrical Engineering and Computer Science (Course 6, joint department with MIT Schwarzman College of Computing)
    Materials Science and Engineering (Course 3)
    Mechanical Engineering (Course 2)
    Nuclear Science and Engineering (Course 22)

    Institutes:

    Institute for Medical Engineering and Science
    Health Sciences and Technology program (joint MIT-Harvard, “HST” in the course catalog)

    (Departments and degree programs are commonly referred to by course catalog numbers on campus.)

    Laboratories and research centers

    Abdul Latif Jameel Water and Food Systems Lab
    Center for Advanced Nuclear Energy Systems
    Center for Computational Engineering
    Center for Materials Science and Engineering
    Center for Ocean Engineering
    Center for Transportation and Logistics
    Industrial Performance Center
    Institute for Soldier Nanotechnologies
    Koch Institute for Integrative Cancer Research
    Laboratory for Information and Decision Systems
    Laboratory for Manufacturing and Productivity
    Materials Processing Center
    Microsystems Technology Laboratories
    MIT Lincoln Laboratory Beaver Works Center
    Novartis-MIT Center for Continuous Manufacturing
    Ocean Engineering Design Laboratory
    Research Laboratory of Electronics
    SMART Center
    Sociotechnical Systems Research Center
    Tata Center for Technology and Design

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology Haystack Observatory, Westford, Massachusetts, USA; altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after the Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT's first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded under the Morrill Land-Grant Colleges Act, which funded institutions "to promote the liberal and practical education of the industrial classes", making MIT a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology's involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology's Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper's Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, the Massachusetts Institute of Technology had become the nation's largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE, guidance systems for ballistic missiles, and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and the Massachusetts Institute of Technology's defense research. In this period, MIT's various departments were researching helicopters, smart bombs, and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. In response to the protests, the Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to "greater strength and unity" after these times of turmoil. However, six MIT students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, remain indignant about MIT's role in military research and its suppression of these protests. (Richard Leacock's film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO (aLIGO) was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced LIGO.

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique that served as the essential blueprint for LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 10:48 am on August 24, 2022
    Tags: "New quantum technology combines free electrons and photons", , Electron Microscopy, , , , Whenever an electron interacts with the vacuum evanescent field of the ring resonator a photon can be generated.   

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “New quantum technology combines free electrons and photons” 


    8.24.22
    Andrea Testa
    Guanhao Huang

    Scientists from EPFL, the Max Planck Institute for Multidisciplinary Sciences and the University of Göttingen have successfully created electron-photon pairs for the first time in a controlled way, using integrated photonic circuits on a chip. Using a new technique, they could precisely detect the involved particles. The findings of the study expand the toolbox of quantum technology.

    Faster computers, tap-proof communication, sensors beyond the standard quantum limit: quantum technologies have the potential to revolutionize our lives just as the invention of the computer or the internet once did. Experts worldwide are trying to translate findings from basic research into quantum technologies.

    To this end, they sometimes require individual particles, such as photons, the elementary particles of light, with special properties. However, obtaining individual particles is complicated and requires complex methods. Various applications already use free electrons to generate light, as is the case in X-ray tubes.

    In a new study, recently published in the journal Science [below], scientists from EPFL's Laboratory of Photonics and Quantum Measurement, the Göttingen Max Planck Institute for Multidisciplinary Sciences (MPI-NAT), and the University of Göttingen demonstrate a novel method for generating cavity photons using free electrons, in the form of pair states. To do so, they used chip-based photonic integrated circuits in an electron microscope.

    An optical chip with ring-shaped light storage, called a microring resonator, and a fiber-optic coupling. The chip is only three millimeters wide, and the ring resonator at its tip has a radius of 0.114 millimeters. © Armin Feist / Max Planck Institute for Multidisciplinary Sciences.

    Fundamental Particle Physics in Electron Microscopes

    In the experiment, the beam of an electron microscope passes over an integrated photonic chip built into the instrument, consisting of a microring resonator and optical-fiber output ports. This new approach, using photonic structures fabricated at EPFL for transmission electron microscope (TEM) experiments performed at MPI-NAT, was established in a recent study [Nature (below)].

    Whenever an electron interacts with the vacuum evanescent field of the ring resonator, a photon can be generated. In this process, obeying the laws of energy and momentum conservation, the electron loses the energy quantum of a single photon. Through this interaction, the system evolves into a pair state. Thanks to a newly developed measurement method, the scientists could precisely and simultaneously detect both the electron energy and the generated photons, revealing the underlying electron-photon pair states.
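    The energy bookkeeping here is simple to check: when the electron generates one photon, it loses exactly E = hc/λ. The short sketch below computes that loss for an assumed 1550 nm (telecom-band) resonator wavelength, a typical value for integrated microring resonators rather than a figure quoted in this article.

```python
# Sketch of the conservation argument: the electron's energy loss equals
# the energy of the single photon it generates, E = h*c/lambda.
# The 1550 nm wavelength is an assumed illustrative value.

H = 6.62607015e-34          # Planck constant, J*s (exact SI value)
C = 2.99792458e8            # speed of light, m/s (exact SI value)
J_PER_EV = 1.602176634e-19  # joules per electronvolt (exact SI value)

def photon_energy_ev(wavelength_m: float) -> float:
    """Energy of one photon of the given wavelength, in electronvolts."""
    return H * C / wavelength_m / J_PER_EV

# For a 1550 nm cavity photon the electron loses about 0.8 eV, a tiny but
# spectroscopically measurable fraction of a typical ~100-200 keV beam energy.
print(f"energy loss per photon: {photon_energy_ev(1550e-9):.3f} eV")
```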

    Future quantum technology with free electrons

    Besides observing this process for the first time at the single-particle level, these findings establish a novel concept for generating single photons or single electrons. Specifically, measuring the pair state enables heralded particle sources, in which the detection of one particle signals the generation of the other. This is necessary for many applications in quantum technology and adds to its growing toolset.

    “The method opens up fascinating new possibilities in electron microscopy. In the field of quantum optics, entangled photon pairs already improve imaging. With our work, such concepts can now be explored with electrons,” explains Claus Ropers, MPI-NAT Director.

    In the first proof-of-principle experiment, the researchers make use of the generated correlated electron-photon pairs for photonic mode imaging, achieving a three-order-of-magnitude contrast enhancement. Dr. Yujia Yang, a postdoc at EPFL and a co-lead author of the study, adds: “We believe our work will have a substantial impact on the future development of electron microscopy by harnessing the power of quantum technology.”
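    The contrast gain from heralding can be illustrated with a toy Monte Carlo model (all probabilities below are invented for illustration, not taken from the paper): only electrons that actually generated a photon produce a coincident “herald” click, so gating the electron signal on photon detection rejects almost all uncorrelated background.

```python
# Toy coincidence-gating model. p_pair, p_bkg and p_dark are assumed example
# numbers; the point is only that gating suppresses background by ~1/p_dark.
import random

random.seed(0)

def simulate(n_events=200_000, p_pair=0.01, p_bkg=0.20, p_dark=1e-3):
    """Return ((signal, background) ungated, (signal, background) gated)."""
    sig = bkg = sig_g = bkg_g = 0
    for _ in range(n_events):
        if random.random() < p_pair:        # electron-photon pair created
            sig += 1
            sig_g += 1                      # photon click heralds this electron
        elif random.random() < p_bkg:       # electron without a photon partner
            bkg += 1
            if random.random() < p_dark:    # rare accidental coincidence
                bkg_g += 1
    return (sig, bkg), (sig_g, bkg_g)

(sig, bkg), (sig_g, bkg_g) = simulate()
print(f"ungated signal-to-background: {sig / bkg:.3f}")
print(f"gated   signal-to-background: {sig_g / max(1, bkg_g):.1f}")
```

    With an accidental-coincidence probability of 10^-3, the gated signal-to-background ratio improves by roughly three orders of magnitude, qualitatively matching the enhancement reported in the study.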

    A particular challenge for future quantum technology is how to interface different physical systems. “For the first time, we bring free electrons into the toolbox of quantum information science. More broadly, coupling free electrons and light using integrated photonics could open the way to a new class of hybrid quantum technologies,” says Tobias Kippenberg, professor at EPFL and head of the Laboratory of Photonics and Quantum Measurement.

    The work from the collaboration between the two teams contributes to the currently emerging field of free-electron quantum optics, and demonstrates a powerful experimental platform for event-based and photon-gated electron spectroscopy and imaging. “Our work represents a critical step to utilize quantum optics concepts in electron microscopy. We plan to further explore future directions like electron-heralded exotic photonic states, and noise reduction in electron microscopy,” says Guanhao Huang, PhD student at EPFL and co-lead author of the study.

    Science papers:
    Science
    Nature 2021

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    EPFL bloc

    EPFL campus

    The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is The Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich] (CH). Associated with several specialized research institutes, the two universities form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles Polytechniques Fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École Polytechnique Fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École Spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR) and John Gay the then professor and rector of the Académie de Lausanne. At its inception it had only 11 students and the offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganized and acquired the status of a university in 1890, the technical faculty changed its name to École d’Ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich (CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) has started to develop into the field of life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.

    Organization

    EPFL is organized into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences
    Institute of Mathematics
    Institute of Chemical Sciences and Engineering
    Institute of Physics
    European Centre of Atomic and Molecular Computations
    Bernoulli Center
    Biomedical Imaging Research Center
    Interdisciplinary Center for Electron Microscopy
    MPG-EPFL Centre for Molecular Nanosciences and Technology
    Swiss Plasma Center
    Laboratory of Astrophysics

    School of Engineering

    Institute of Electrical Engineering
    Institute of Mechanical Engineering
    Institute of Materials
    Institute of Microengineering
    Institute of Bioengineering

    School of Architecture, Civil and Environmental Engineering

    Institute of Architecture
    Civil Engineering Institute
    Institute of Urban and Regional Sciences
    Environmental Engineering Institute

    School of Computer and Communication Sciences

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Networking
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing
    Systems

    School of Life Sciences

    Bachelor-Master Teaching Section in Life Sciences and Technologies
    Brain Mind Institute
    Institute of Bioengineering
    Swiss Institute for Experimental Cancer Research
    Global Health Institute
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics
    NCCR Synaptic Bases of Mental Diseases

    College of Management of Technology

    Swiss Finance Institute at EPFL
    Section of Management of Technology and Entrepreneurship
    Institute of Technology and Public Policy
    Institute of Management of Technology and Entrepreneurship
    Section of Financial Engineering

    College of Humanities

    Human and social sciences teaching program

    EPFL Middle East

    Section of Energy Management and Sustainability

    In addition to the eight schools there are seven closely related institutions:

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École Cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

     
  • richardmitnick 2:51 pm on January 11, 2022 Permalink | Reply
    Tags: "Catalyst surface analysed at atomic resolution", , Atomic Probe Tomography, , Electron Microscopy, , , ,   

    From The Ruhr-Universität Bochum (DE): “Catalyst surface analysed at atomic resolution” 

    From The Ruhr-Universität Bochum (DE)

    1

    Members of the Bochum-based research team in the lab: Weikai Xiang, Chenglong Luan and Tong Li (from left to right) © Privat.

    Catalyst surfaces have rarely been imaged in such detail before. And yet, every single atom can play a decisive role in catalytic activity.

    A German-Chinese research team has visualised the three-dimensional structure of the surface of catalyst nanoparticles at atomic resolution. This structure plays a decisive role in the activity and stability of the particles. The detailed insights were achieved with a combination of atom probe tomography, spectroscopy and electron microscopy. Nanoparticle catalysts can be used, for example, in the production of hydrogen for the chemical industry. To optimise the performance of future catalysts, it is essential to understand how that performance is affected by the three-dimensional surface structure.

    Researchers from the Ruhr-Universität Bochum, The University of Duisburg-Essen [Universität Duisburg-Essen](DE) and The MPG Institute for Chemical Energy Conversion [Max-Planck-Institut für chemische Energieumwandlung](DE) cooperated on the project as part of the Collaborative Research Centre “Heterogeneous oxidation catalysis in the liquid phase”.

    At RUB, a team headed by Weikai Xiang and Professor Tong Li from Atomic-scale Characterisation worked together with the Chair of Electrochemistry and Nanoscale Materials and the Chair of Industrial Chemistry. Institutes in Shanghai, China, and Didcot, UK, were also involved. The team presents their findings in the journal Nature Communications, published online on 10 January 2022.

    Particles observed during the catalysis process

    The researchers studied two different types of cobalt iron oxide nanoparticles around ten nanometres in size. They analysed the particles during catalysis of the so-called oxygen evolution reaction. This is one of the two half-reactions of water splitting for hydrogen production: splitting water with electrical energy yields hydrogen and oxygen. The bottleneck in the development of more efficient production processes is the partial reaction in which oxygen is formed, i.e. the oxygen evolution reaction. This reaction changes the catalyst surface, which becomes inactive over time. The structural and compositional changes on the surface play a decisive role in the activity and stability of the electrocatalysts.
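    For reference, the water-splitting half-reactions mentioned above are standard electrochemistry (written here in acidic notation); the oxygen evolution reaction at the anode is the sluggish step:

```latex
% Anode: oxygen evolution reaction (OER), the bottleneck half-reaction
2\,\mathrm{H_2O} \longrightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \\
% Cathode: hydrogen evolution reaction (HER)
4\,\mathrm{H^+} + 4\,e^- \longrightarrow 2\,\mathrm{H_2} \\
% Overall: water splitting
2\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{H_2} + \mathrm{O_2}
```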

    For nanoparticles around ten nanometres in size, obtaining detailed information about what happens on the catalyst surface during the reaction remains a challenge. Using atom probe tomography, the group successfully visualised the distribution of the different types of atoms in the cobalt iron oxide catalysts in three dimensions. By combining it with other methods, they showed how the structure and composition of the surface changed during the catalysis process – and how this change affected the catalytic performance.
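    Atom probe tomography yields a three-dimensional point cloud of identified atoms. A common way to expose surface enrichment of the kind discussed here is to bin atoms by radius from the particle centre. The sketch below uses entirely synthetic data (the particle size, elements, and enrichment levels are invented assumptions), not the study’s dataset.

```python
# Hedged sketch of a radial composition profile from an APT-like point cloud.
# All data below are synthetic; a real analysis would use reconstructed atom
# positions from the instrument.
import math
import random

random.seed(1)

def radial_composition(atoms, r_max, n_bins=5):
    """atoms: list of (element, x, y, z). Returns Fe fraction per radial shell."""
    counts = [{"Fe": 0, "Co": 0} for _ in range(n_bins)]
    for elem, x, y, z in atoms:
        r = math.sqrt(x * x + y * y + z * z)
        b = min(int(r / r_max * n_bins), n_bins - 1)
        counts[b][elem] += 1
    return [c["Fe"] / max(1, c["Fe"] + c["Co"]) for c in counts]

# Synthetic ~10 nm particle with an (assumed) Fe-enriched shell and Co-rich core.
atoms = []
for _ in range(20000):
    x, y, z = (random.uniform(-5.0, 5.0) for _ in range(3))
    r = math.sqrt(x * x + y * y + z * z)
    if r > 5.0:
        continue  # keep only points inside the spherical particle
    p_fe = 0.2 if r < 4.0 else 0.6  # surface enrichment in the outer shell
    atoms.append(("Fe" if random.random() < p_fe else "Co", x, y, z))

print([round(f, 2) for f in radial_composition(atoms, 5.0)])
```

    The outermost shell shows a markedly higher Fe fraction than the core, which is the kind of core-to-surface compositional change the technique resolves.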

    “Atom probe tomography has enormous potential to provide atomic insights into the compositional changes on the surface of catalyst nanoparticles during important catalytic reactions such as oxygen evolution reaction for hydrogen production or CO2 reduction,” concludes Tong Li.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Ruhr-Universität Bochum (DE) is a public university located in the southern hills of the central Ruhr area in Bochum. It was founded in 1962 as the first new public university in Germany after World War II. Instruction began in 1965.

    The Ruhr-University Bochum is one of the largest universities in Germany and part of the Deutsche Forschungsgemeinschaft, the most important German research funding organization.

    The RUB was very successful in the Excellence Initiative of the German Federal and State Governments (2007), a competition between Germany’s most prestigious universities. It was one of the few institutions left competing for the title of an “elite university”, but did not succeed in the last round of the competition. There are currently nine universities in Germany that hold this title.

    The University of Bochum was one of the first universities in Germany to introduce international bachelor’s and master’s degrees, which replaced the traditional German Diplom and Magister. Except for a few special cases (for example in Law) these degrees are offered by all faculties of the Ruhr-University. Currently, the university offers a total of 184 different study programs from all academic fields represented at the university.

     
  • richardmitnick 5:43 pm on December 23, 2021 Permalink | Reply
    Tags: , "Researchers use electron microscope to turn nanotube into tiny transistor", Apple says the chip which powers the future iPhones contains 15 billion transistors., , Electron Microscopy, In recent years researchers have made significant steps in developing nanotransistors which are so small that millions of them could fit onto the head of a pin., It remains a great challenge to control the chirality of individual carbon nanotubes., , , Researchers created a transistor that's 25000 times smaller than the width of a human hair., Semiconducting carbon nanotubes are promising for fabricating energy-efficient nanotransistors to build beyond-silicon microprocessors., ShouId I sell my Intel stock?   

    From The Queensland University of Technology (AU) via phys.org : “Researchers use electron microscope to turn nanotube into tiny transistor” 

    From The Queensland University of Technology (AU)

    via

    phys.org

    December 23, 2021

    1
    A designer’s view of a single-wall carbon nanotube intramolecular junction, with metallic portions at the left and right ends and an ultrashort (~3.0 nm) semiconducting channel in between. Credit: The National University of Science and Technology MISiS[Национальный исследовательский технологический университет МИСиС](RU).

    An international team of researchers has used a unique tool inserted into an electron microscope to create a transistor that’s 25,000 times smaller than the width of a human hair.

    The research, published in the journal Science, involves researchers from Japan, China, Russia and Australia who have worked on the project that began five years ago.

    The Queensland University of Technology Center for Materials Science co-director Professor Dmitri Golberg, who led the research project, said the result was a “very interesting fundamental discovery” which could pave the way for the development of tiny transistors for future generations of advanced computing devices.

    “In this work, we have shown it is possible to control the electronic properties of an individual carbon nanotube,” Professor Golberg said.

    The researchers created the tiny transistor by simultaneously applying a force and a low voltage, which heated a carbon nanotube made up of a few layers until the outer tube shells separated, leaving just a single-layer nanotube.

    The heat and strain then changed the “chirality” of the nanotube, meaning the pattern in which the carbon atoms joined together to form the single-atomic layer of the nanotube wall was rearranged.

    With the carbon atoms connected in this new structure, the nanotube was transformed into a transistor.
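    Why a chirality change can switch a nanotube between metallic and semiconducting follows from a textbook zone-folding rule (general nanotube physics, not a result of this paper): an (n, m) tube is metallic when n − m is divisible by 3, and semiconducting otherwise.

```python
# Textbook zone-folding rule for single-wall carbon nanotubes: the chiral
# indices (n, m) determine whether the tube is metallic or semiconducting.

def is_metallic(n: int, m: int) -> bool:
    """An (n, m) nanotube is metallic when (n - m) is divisible by 3."""
    return (n - m) % 3 == 0

# Armchair tubes (n, n) are always metallic; most zigzag tubes (n, 0) are not.
for n, m in [(10, 10), (9, 0), (10, 0), (13, 6)]:
    kind = "metallic" if is_metallic(n, m) else "semiconducting"
    print(f"({n},{m}) -> {kind}")
```

    Rearranging the atoms of a metallic segment into indices that fail this divisibility test yields the semiconducting channel at the heart of the junction.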

    Professor Golberg’s team members from The National University of Science and Technology MISiS[Национальный исследовательский технологический университет МИСиС](RU) created a theory explaining the changes in the atomic structure and properties observed in the transistor.

    Lead author Dr. Dai-Ming Tang, from The International Center for Materials Nanoarchitectonics[材料の国際センター](JP), said the research had demonstrated the ability to manipulate the molecular properties of the nanotube to fabricate a nanoscale electrical device.

    Dr. Tang began working on the project five years ago when Professor Golberg headed up the research group at this center.

    “Semiconducting carbon nanotubes are promising for fabricating energy-efficient nanotransistors to build beyond-silicon microprocessors,” Dr. Tang said.

    “However, it remains a great challenge to control the chirality of individual carbon nanotubes, which uniquely determines the atomic geometry and electronic structure.

    “In this work, we designed and fabricated carbon nanotube intramolecular transistors by altering the local chirality of a metallic nanotube segment by heating and mechanical strain.”

    Professor Golberg said the research in demonstrating the fundamental science in creating the tiny transistor was a promising step towards building beyond-silicon microprocessors.

    Transistors, which are used to switch and amplify electronic signals, are often called the “building blocks” of all electronic devices, including computers. For example, Apple says the chip which powers its latest iPhones contains 15 billion transistors.

    The computer industry has been focused on developing smaller and smaller transistors for decades, but faces the limitations of silicon.

    In recent years, researchers have made significant steps in developing nanotransistors which are so small that millions of them could fit onto the head of a pin [Should I sell my Intel stock? (ed.)].

    “Miniaturization of transistors down to nanometer scale is a great challenge of the modern semiconducting industry and nanotechnology,” Professor Golberg said.

    “The present discovery, although not practical for a mass-production of tiny transistors, shows a novel fabrication principle and opens up a new horizon of using thermomechanical treatments of nanotubes for obtaining the smallest transistors with desired characteristics.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Queensland University of Technology (QUT) (AU) is a public research university located in the urban coastal city of Brisbane, Queensland, Australia. The Queensland University of Technology is located on two campuses in the Brisbane area viz. Gardens Point and Kelvin Grove. The university in its current form was founded in 1989, when the Queensland Institute of Technology (QIT) was made a university through The Queensland University of Technology Act 1988, with the resulting Queensland University of Technology beginning its operations from January 1989. In 1990, the Brisbane College of Advanced Education merged with The Queensland University of Technology.

    In 2020, The Queensland University of Technology had 52,672 students enrolled (39,156 undergraduate, 10,390 postgraduate, and 661 non-award students), employed 5,049 full-time equivalent (FTE) staff members, and reported a total revenue of $1.054 billion against a total expenditure of $1.028 billion.

    The Queensland University of Technology was a member of the Australian Technology Network of universities, but withdrew participation on 28 September 2018.

    History

    The Queensland University of Technology (QUT) has a history that dates to 1849 when the Brisbane School of Arts was established. Queensland Institute of Technology (QIT) succeeded the Central Technical College and was formed in 1965. The current Queensland University of Technology was established as a university in 1989 from the merger of several predecessor institutions listed below:

    Brisbane School of Arts (1849)
    Brisbane Technical College (1882)
    Central Technical College (1908)
    Queensland Institute of Technology (1965)

    Brisbane College of Advanced Education was formed in 1982, which itself is a combination of multiple predecessor institutions shown in the list below:

    Brisbane Kindergarten Training College (1911)
    Brisbane Kindergarten Teachers College (1965)
    Queensland Teachers’ Training College (1914)
    Kelvin Grove Teachers College (1961)
    Kelvin Grove College of Advanced Education (1976)
    Kedron Park Teachers College (1961)
    North Brisbane College of Advanced Education (1974)

    In 1988, The Queensland University of Technology Act was passed for the grant of university status to Queensland Institute of Technology (QIT). As a result, QIT was granted university status and was operational as Queensland University of Technology (QUT) beginning in January 1989. The Brisbane College of Advanced Education joined with QUT in 1990.

    The Gardens Point campus was once entirely housed in the 19th-century former Government House of Queensland. In 1909, during the relocation of the governor’s residence, the Old Government House and the surrounding five hectares were set aside for both a university and a technical college. The first university on the site was the University of Queensland, which was moved to St Lucia in 1945, where it remains today.

    Research

    The Queensland University of Technology establishes collaborative research partnerships between academia, industry, government and community actors. The university is a key member of the Brisbane Diamantina Health Partners, Queensland’s first academic health science system. QUT attracts national grants and industry funding and has a number of research centres, including:

    Research institutes

    Research Council Centre of Excellence for the Digital Child
    Centre for Agriculture and the Bioeconomy
    Centre for Biomedical Technologies
    Centre for Data Science
    Centre for Future Enterprise
    Centre for Genomics and Personalised Health
    Centre for Healthcare Transformation
    Centre for Justice
    Centre for Materials Science
    Centre for Robotics
    Digital Media Research Centre
    Australian Centre for Entrepreneurship Research
    Australian Centre for Health Law Research
    Australian Centre for Health Services Innovation
    Australian Centre for Philanthropy and Nonprofit Studies
    Australia-China Centre for Tissue Engineering and Regenerative Medicine
    Cancer and Palliative Care Outcomes Centre
    Centre for a Waste-Free World
    Centre for Accident Research and Road Safety
    Centre for Behavioural Economics, Society and Technology
    Centre for Clean Energy Technologies and Practices
    Centre for Decent Work and Industry

    Indigenous Research Centres

    Curumba Institute
    National Indigenous Research and Knowledges Network

    Research infrastructure

    Biorefining Research Facility
    Central Analytical Research Facility
    Design and Fabrication Facility
    Digital Observatory
    eResearch
    Medical Engineering Research Facility
    Samford Ecological Research Facility
    Research Engineering Facility
    Visualisation and Interactive Solutions for Engagement and Research

    Former research institutes

    Institute of Health and Biomedical Innovation
    Institute for Future Environments

     
  • richardmitnick 11:55 am on December 23, 2021 Permalink | Reply
    Tags: "Integrated photonics meet electron microscopy", , Electron Microscopy, Integrated photonics circuits based on low-loss silicon nitride have made tremendous progress and are intensively driving the progress of many emerging technologies and fundamental science., Interfacing electron microscopy with photonics has the potential to uniquely bridge atomic scale imaging with coherent spectroscopy., , MPG Institute for Biophysical Chemistry [MPG Institut für Biophysikaliche Chemie](DE), , Researchers have successfully demonstrated extremely efficient electron beam modulation using integrated photonic microresonators., Scientists in Switzerland and Germany have achieved efficient electron-beam modulation using integrated photonics – circuits that guide light on a chip., Simplification and efficiency increase in the optical control of electron beams.,   

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “Integrated photonics meet electron microscopy” 

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH)

    23.12.21
    Professor Claus Ropers, MPG Institute for Biophysical Chemistry [MPG Institut für Biophysikaliche Chemie](DE); Arslan Raja and Nik Papageorgiou, EPFL.

    1

    Scientists in Switzerland and Germany have achieved efficient electron-beam modulation using integrated photonics – circuits that guide light on a chip. The experiments could lead to entirely new quantum measurement schemes in electron microscopy.

    The transmission electron microscope (TEM) can image molecular structures at the atomic scale by using electrons instead of light, and has revolutionized materials science and structural biology. The past decade has seen a lot of interest in combining electron microscopy with optical excitations, trying, for example, to control and manipulate the electron beam by light. But a major challenge has been the rather weak interaction of propagating electrons with photons.

    In a new study, researchers have successfully demonstrated extremely efficient electron beam modulation using integrated photonic microresonators. The study was led by Professor Tobias J. Kippenberg at EPFL and by Professor Claus Ropers at the MPG Institute for Biophysical Chemistry [MPG Institut für Biophysikaliche Chemie](DE) and The University of Göttingen [Georg-August-Universität Göttingen](DE), and is published in Nature.

    The two laboratories formed an unconventional collaboration, joining the usually unconnected fields of electron microscopy and integrated photonics. Photonic integrated circuits can guide light on a chip with ultra-low losses and enhance optical fields using micro-ring resonators. In the experiments conducted by Ropers’ group, an electron beam was steered through the optical near field of a photonic circuit, allowing the electrons to interact with the enhanced light. The researchers then probed the interaction by measuring the energy of electrons that had absorbed or emitted tens to hundreds of photon energies. The photonic chips were engineered by Kippenberg’s group, built in such a way that the speed of light in the micro-ring resonators exactly matched the speed of the electrons, drastically increasing the electron-photon interaction.
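    The matching condition can be sketched from standard relativistic kinematics: the optical mode’s phase velocity c/n_eff must equal the electron velocity set by the microscope’s accelerating voltage. The beam energies below are common TEM values chosen for illustration, not the paper’s operating parameters.

```python
# Sketch of the electron-light phase-matching condition. The listed beam
# energies are illustrative assumptions; only the kinematics is standard.
import math

ME_C2_KEV = 510.99895  # electron rest energy, keV

def electron_beta(e_kin_kev: float) -> float:
    """v/c of an electron with kinetic energy e_kin_kev (relativistic)."""
    gamma = 1.0 + e_kin_kev / ME_C2_KEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

def matching_n_eff(e_kin_kev: float) -> float:
    """Effective index at which the light's phase velocity equals the electron's."""
    return 1.0 / electron_beta(e_kin_kev)

for ekev in (80.0, 120.0, 200.0):
    print(f"{ekev:5.0f} keV: beta = {electron_beta(ekev):.3f}, "
          f"phase-matching n_eff = {matching_n_eff(ekev):.3f}")
```

    The required effective indices (roughly 1.4 to 2.0 over this energy range) sit conveniently within reach of dielectric waveguide design, which is what makes the chip-TEM combination workable.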

    2
    The experimental setup, showing a transmission electron microscope and silicon nitride microresonator used to demonstrate the electron-photon interaction. Image credit: Murat Sivis.

    The technique enables a strong modulation of the electron beam with only a few milliwatts from a continuous-wave laser – a power level generated by a common laser pointer. The approach constitutes a dramatic simplification and efficiency increase in the optical control of electron beams; it can be seamlessly implemented in a regular transmission electron microscope, which could make the scheme much more widely applicable.

    “Integrated photonics circuits based on low-loss silicon nitride have made tremendous progress and are intensively driving the progress of many emerging technologies and fundamental science such as LiDAR, telecommunication, and quantum computing, and now prove to be a new ingredient for electron beam manipulation,” says Kippenberg.

    “Interfacing electron microscopy with photonics has the potential to uniquely bridge atomic scale imaging with coherent spectroscopy,” adds Ropers. “For the future, we expect this to yield an unprecedented understanding and control of microscopic optical excitations.”

    The researchers plan to further extend their collaboration in the direction of new forms of quantum optics and attosecond metrology for free electrons.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    EPFL bloc

    EPFL campus

    The Swiss Federal Institute of Technology in Lausanne [EPFL-École polytechnique fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is The Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich)](CH) . Associated with several specialized research institutes, the two universities form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR) and John Gay the then professor and rector of the Académie de Lausanne. At its inception it had only 11 students and the offices was located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganised and acquired the status of a university in 1890, the technical faculty changed its name to École d’ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich(CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) has started to develop into the field of life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly; as of 2012 roughly 14,000 people study or work on campus, about 9,300 of them Bachelor’s, Master’s, or PhD students. The environment at modern-day EPFL(CH) is highly international, with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus, and the university has two official languages, French and English.

    Organization

    EPFL is organised into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences (SB, Jan S. Hesthaven)

    Institute of Mathematics (MATH, Victor Panaretos)
    Institute of Chemical Sciences and Engineering (ISIC, Lyndon Emsley)
    Institute of Physics (IPHYS, Harald Brune)
    European Centre of Atomic and Molecular Computations (CECAM, Ignacio Pagonabarraga Mora)
    Bernoulli Center (CIB, Nicolas Monod)
    Biomedical Imaging Research Center (CIBM, Rolf Gruetter)
    Interdisciplinary Center for Electron Microscopy (CIME, Cécile Hébert)
    Max Planck-EPFL Centre for Molecular Nanosciences and Technology (CMNT, Thomas Rizzo)
    Swiss Plasma Center (SPC, Ambrogio Fasoli)
    Laboratory of Astrophysics (LASTRO, Jean-Paul Kneib)

    School of Engineering (STI, Ali Sayed)

    Institute of Electrical Engineering (IEL, Giovanni De Micheli)
    Institute of Mechanical Engineering (IGM, Thomas Gmür)
    Institute of Materials (IMX, Véronique Michaud)
    Institute of Microengineering (IMT, Olivier Martin)
    Institute of Bioengineering (IBI, Matthias Lütolf)

    School of Architecture, Civil and Environmental Engineering (ENAC, Claudia R. Binder)

    Institute of Architecture (IA, Luca Ortelli)
    Civil Engineering Institute (IIC, Eugen Brühwiler)
    Institute of Urban and Regional Sciences (INTER, Philippe Thalmann)
    Environmental Engineering Institute (IIE, David Andrew Barry)

    School of Computer and Communication Sciences (IC, James Larus)

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Networking
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing
    Systems

    School of Life Sciences (SV, Gisou van der Goot)

    Bachelor-Master Teaching Section in Life Sciences and Technologies (SSV)
    Brain Mind Institute (BMI, Carmen Sandi)
    Institute of Bioengineering (IBI, Melody Swartz)
    Swiss Institute for Experimental Cancer Research (ISREC, Douglas Hanahan)
    Global Health Institute (GHI, Bruno Lemaitre)
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics (CPG)
    NCCR Synaptic Bases of Mental Diseases (NCCR-SYNAPSY)

    College of Management of Technology (CDM)

    Swiss Finance Institute at EPFL (CDM-SFI, Damir Filipovic)
    Section of Management of Technology and Entrepreneurship (CDM-PMTE, Daniel Kuhn)
    Institute of Technology and Public Policy (CDM-ITPP, Matthias Finger)
    Institute of Management of Technology and Entrepreneurship (CDM-MTEI, Ralf Seifert)
    Section of Financial Engineering (CDM-IF, Julien Hugonnier)

    College of Humanities (CDH, Thomas David)

    Human and social sciences teaching program (CDH-SHS, Thomas David)

    EPFL Middle East (EME, Dr. Franco Vigliotti)

    Section of Energy Management and Sustainability (MES, Prof. Maher Kayal)

    In addition to the eight schools there are seven closely related institutions:

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

     
  • richardmitnick 10:57 am on December 31, 2020 Permalink | Reply
    Tags: "An Existential Crisis in Neuroscience", , , DNNs are mathematical models that string together chains of simple functions that approximate real neurons., , Electron Microscopy, It’s clear now that while science deals with facts a crucial part of this noble endeavor is making sense of the facts., , ,   

    From Nautilus: “An Existential Crisis in Neuroscience” 

    From Nautilus

    December 30, 2020 [Re-issued “Maps” issue January 23, 2020.]
    Grigori Guitchounts

    A rendering of dendrites (red)—a neuron’s branching processes—and protruding spines that receive synaptic information, along with a saturated reconstruction (multicolored cylinder) from a mouse cortex. Credit: Lichtman Lab at Harvard University.

    We’re mapping the brain in amazing detail—but our brain can’t understand the picture.

    On a chilly evening last fall, I stared into nothingness out of the floor-to-ceiling windows in my office on the outskirts of Harvard’s campus. As a purplish-red sun set, I sat brooding over my dataset on rat brains. I thought of the cold windowless rooms in downtown Boston, home to Harvard’s high-performance computing center, where computer servers were holding on to a precious 48 terabytes of my data. I had recorded the 13 trillion numbers in this dataset as part of my Ph.D. experiments, asking how the visual parts of the rat brain respond to movement.

    Printed on paper, the dataset would fill 116 billion pages, double-spaced. When I recently finished writing the story of my data, the magnum opus fit on fewer than two dozen printed pages. Performing the experiments turned out to be the easy part. I had spent the last year agonizing over the data, observing and asking questions. The answers left out large chunks that did not pertain to the questions, like a map leaves out irrelevant details of a territory.
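
    Those figures can be sanity-checked with quick back-of-envelope arithmetic. A minimal sketch using only the totals quoted above; the derived bytes-per-value and values-per-page numbers are my own, not stated in the article:

    ```python
    # Back-of-envelope check of the dataset figures quoted above.
    numbers = 13e12       # 13 trillion recorded values
    total_bytes = 48e12   # 48 terabytes
    pages = 116e9         # 116 billion double-spaced pages

    print(f"{total_bytes / numbers:.1f} bytes per value")  # 3.7, close to a 32-bit float
    print(f"{numbers / pages:.0f} values per page")        # 112
    ```

    At roughly four bytes per value the 48 terabytes are consistent with single-precision samples, and the point stands: no paper printout could hold them.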

    But, as massive as my dataset sounds, it represents just a tiny chunk of a dataset taken from the whole brain. And the questions it asks—Do neurons in the visual cortex do anything when an animal can’t see? What happens when inputs to the visual cortex from other brain regions are shut off?—are small compared to the ultimate question in neuroscience: How does the brain work?

    LIVING COLOR: This electron microscopy image of a slice of mouse cortex, which shows different neurons labeled by color, is just the beginning. “We’re working on a cortical slab of a human brain, where every synapse and every connection of every nerve cell is identifiable,” says Harvard’s Jeff Lichtman. “It’s amazing.” Credit: Lichtman Lab at Harvard University.

    The nature of the scientific process is such that researchers have to pick small, pointed questions. Scientists are like diners at a restaurant: We’d love to try everything on the menu, but choices have to be made. And so we pick our field, and subfield, read up on the hundreds of previous experiments done on the subject, design and perform our own experiments, and hope the answers advance our understanding. But if we have to ask small questions, then how do we begin to understand the whole?

    Neuroscientists have made considerable progress toward understanding brain architecture and aspects of brain function. We can identify brain regions that respond to the environment, activate our senses, generate movements and emotions. But we don’t know how different parts of the brain interact with and depend on each other. We don’t understand how their interactions contribute to behavior, perception, or memory. Technology has made it easy for us to gather behemoth datasets, but I’m not sure understanding the brain has kept pace with the size of the datasets.

    Some serious efforts, however, are now underway to map brains in full. One approach, called connectomics, strives to chart the entirety of the connections among neurons in a brain. In principle, a complete connectome would contain all the information necessary to provide a solid base on which to build a holistic understanding of the brain. We could see what each brain part is, how it supports the whole, and how it ought to interact with the other parts and the environment. We’d be able to place our brain in any hypothetical situation and have a good sense of how it would react.

    The question of how we might begin to grasp the entirety of the organ that generates our minds has been pressing me for a while. Like most neuroscientists, I’ve had to cultivate two clashing ideas: striving to understand the brain and knowing that’s likely an impossible task. I was curious how others tolerate this doublethink, so I sought out Jeff Lichtman, a leader in the field of connectomics and a professor of molecular and cellular biology at Harvard.

    Lichtman’s lab happens to be down the hall from mine, so on a recent afternoon, I meandered over to his office to ask him about the nascent field of connectomics and whether he thinks we’ll ever have a holistic understanding of the brain. His answer—“No”—was not reassuring, but our conversation was a revelation, and shed light on the questions that had been haunting me. How do I make sense of gargantuan volumes of data? Where does science end and personal interpretation begin? Were humans even capable of weaving today’s reams of information into a holistic picture? I was now on a dark path, questioning the limits of human understanding, unsettled by a future filled with big data and small comprehension.

    Lichtman likes to shoot first, ask questions later. The 68-year-old neuroscientist’s weapon of choice is a 61-beam electron microscope, which Lichtman’s team uses to visualize the tiniest of details in brain tissue. The way neurons are packed in a brain would make canned sardines look like they have a highly evolved sense of personal space. To make any sense of these images, and in turn of what the brain is doing, the parts of neurons have to be annotated in three dimensions, the result of which is a wiring diagram. Done at the scale of an entire brain, the effort constitutes a complete wiring diagram, or the connectome.

    To capture that diagram, Lichtman employs a machine that can only be described as a fancy deli slicer. The machine cuts pieces of brain tissue into 30-nanometer-thick sections, which it then pastes onto a tape conveyor belt. The tape goes on silicon wafers, and into Lichtman’s electron microscope, where billions of electrons blast the brain slices, generating images that reveal nanometer-scale features of neurons, their axons, dendrites, and the synapses through which they exchange information. The Technicolor images are a beautiful sight that evokes a fantastic thought: The mysteries of how brains create memories, thoughts, perceptions, feelings—consciousness itself—must be hidden in this labyrinth of neural connections.

    THE MAPMAKER: Jeff Lichtman, a leader in brain mapping, says the word “understanding” has to undergo a revolution in reference to the human brain. “There’s no point when you can suddenly say, ‘I now understand the brain,’ just as you wouldn’t say, ‘I now get New York City.’” Credit: Lichtman Lab at Harvard University.

    A complete human connectome will be a monumental technical achievement. A complete wiring diagram for a mouse brain alone would take up two exabytes. That’s 2 billion gigabytes; by comparison, estimates of the data footprint of all books ever written come out to less than 100 terabytes, or 0.005 percent of a mouse brain. But Lichtman is not daunted. He is determined to map whole brains, exorbitant exabyte-scale storage be damned.
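
    The storage comparison is easy to verify. A minimal arithmetic sketch with the figures quoted above:

    ```python
    # Sanity-check: 100 TB for every book ever written vs. a 2 EB mouse connectome.
    GIGABYTE, TERABYTE, EXABYTE = 10**9, 10**12, 10**18

    mouse_connectome = 2 * EXABYTE   # two exabytes
    all_books = 100 * TERABYTE       # upper estimate for all books ever written

    print(mouse_connectome // GIGABYTE)                  # 2000000000, i.e. 2 billion gigabytes
    print(f"{100 * all_books / mouse_connectome:.3f}%")  # 0.005%
    ```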

    Lichtman’s office is a spacious place with floor-to-ceiling windows overlooking a tree-lined walkway and an old circular building that, in the days before neuroscience even existed as a field, used to house a cyclotron. He was wearing a deeply black sweater, which contrasted with his silver hair and olive skin. When I asked if a completed connectome would give us a full understanding of the brain, he didn’t pause in his answer. I got the feeling he had thought a great deal about this question on his own.

    “I think the word ‘understanding’ has to undergo an evolution,” Lichtman said, as we sat around his desk. “Most of us know what we mean when we say ‘I understand something.’ It makes sense to us. We can hold the idea in our heads. We can explain it with language. But if I asked, ‘Do you understand New York City?’ you would probably respond, ‘What do you mean?’ There’s all this complexity. If you can’t understand New York City, it’s not because you can’t get access to the data. It’s just there’s so much going on at the same time. That’s what a human brain is. It’s millions of things happening simultaneously among different types of cells, neuromodulators, genetic components, things from the outside. There’s no point when you can suddenly say, ‘I now understand the brain,’ just as you wouldn’t say, ‘I now get New York City.’ ”

    “But we understand specific aspects of the brain,” I said. “Couldn’t we put those aspects together and get a more holistic understanding?”

    “I guess I would retreat to another beachhead, which is, ‘Can we describe the brain?’ ” Lichtman said. “There are all sorts of fundamental questions about the physical nature of the brain we don’t know. But we can learn to describe them. A lot of people think ‘description’ is a pejorative in science. But that’s what the Hubble telescope does. That’s what genomics does. They describe what’s actually there. Then from that you can generate your hypotheses.”

    “Why is description an unsexy concept for neuroscientists?”

    “Biologists are often seduced by ideas that resonate with them,” Lichtman said. That is, they try to bend the world to their idea rather than the other way around. “It’s much better—easier, actually—to start with what the world is, and then make your idea conform to it,” he said. Instead of a hypothesis-testing approach, we might be better served by following a descriptive, or hypothesis-generating methodology. Otherwise we end up chasing our own tails. “In this age, the wealth of information is an enemy to the simple idea of understanding,” Lichtman said.

    “How so?” I asked.

    “Let me put it this way,” Lichtman said. “Language itself is a fundamentally linear process, where one idea leads to the next. But if the thing you’re trying to describe has a million things happening simultaneously, language is not the right tool. It’s like understanding the stock market. The best way to make money on the stock market is probably not by understanding the fundamental concepts of economy. It’s by understanding how to utilize this data to know what to buy and when to buy it. That may have nothing to do with economics but with data and how data is used.”

    “Maybe human brains aren’t equipped to understand themselves,” I offered.

    “And maybe there’s something fundamental about that idea: that no machine can have an output more sophisticated than itself,” Lichtman said. “What a car does is trivial compared to its engineering. What a human brain does is trivial compared to its engineering. Which is the great irony here. We have this false belief there’s nothing in the universe that humans can’t understand because we have infinite intelligence. But if I asked you if your dog can understand something you’d say, ‘Well, my dog’s brain is small.’ Well, your brain is only a little bigger,” he continued, chuckling. “Why, suddenly, are you able to understand everything?”

    Was Lichtman daunted by what a connectome might achieve? Did he see his efforts as Sisyphean?

    “It’s just the opposite,” he said. “I thought at this point we would be less far along. Right now, we’re working on a cortical slab of a human brain, where every synapse is identified automatically, every connection of every nerve cell is identifiable. It’s amazing. To say I understand it would be ridiculous. But it’s an extraordinary piece of data. And it’s beautiful. From a technical standpoint, you really can see how the cells are connected together. I didn’t think that was possible.”

    Lichtman stressed his work was about more than a comprehensive picture of the brain. “If you want to know the relationship between neurons and behavior, you gotta have the wiring diagram,” he said. “The same is true for pathology. There are many incurable diseases, such as schizophrenia, that don’t have a biomarker related to the brain. They’re probably related to brain wiring but we don’t know what’s wrong. We don’t have a medical model of them. We have no pathology. So in addition to fundamental questions about how the brain works and consciousness, we can answer questions like, Where did mental disorders come from? What’s wrong with these people? Why are their brains working so differently? Those are perhaps the most important questions to human beings.”

    Late one night, after a long day of trying to make sense of my data, I came across a short story by Jorge Luis Borges that seemed to capture the essence of the brain mapping problem. In the story, “On Exactitude in Science,” a man named Suarez Miranda wrote of an ancient empire that, through the use of science, had perfected the art of map-making. While early maps were nothing but crude caricatures of the territories they aimed to represent, new maps grew larger and larger, filling in ever more details with each edition. Over time, Borges wrote, “the Art of Cartography attained such Perfection that the map of a single Province occupied the entirety of a City, and the map of the Empire, the entirety of a Province.” Still, the people craved more detail. “In time, those Unconscionable Maps no longer satisfied, and the Cartographers Guilds struck a Map of the Empire whose size was that of the Empire, and which coincided point for point with it.”

    The Borges story reminded me of Lichtman’s view that the brain may be too complex to be understood by humans in the colloquial sense, and that describing it may be a better goal. Still, the idea made me uncomfortable. Much like storytelling, or even information processing in the brain, descriptions must leave some details out. For a description to convey relevant information, the describer has to know which details are important and which are not. Knowing which details are irrelevant requires having some understanding about the thing you’re describing. Will my brain, as intricate as it may be, ever be able to make sense of the two exabytes in a mouse brain?

    Humans have a critical weapon in this fight. Machine learning has been a boon to brain mapping, and the self-reinforcing relationship promises to transform the whole endeavor. Deep learning algorithms (also known as deep neural networks, or DNNs) have in the past decade allowed machines to perform cognitive tasks once thought impossible for computers—not only object recognition, but text transcription and translation, or playing games like Go or chess. DNNs are mathematical models that string together chains of simple functions that approximate real neurons. These algorithms were inspired directly by the physiology and anatomy of the mammalian cortex, but are crude approximations of real brains, based on data gathered in the 1960s. Yet they have surpassed expectations of what machines can do.
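
    That definition (chains of simple functions that crudely approximate neurons) can be made concrete in a few lines. The sketch below wires two tiny layers together; the weights are arbitrary illustrative values, not learned from data as a real network’s would be:

    ```python
    def neuron(inputs, weights, bias):
        """One model 'neuron': a weighted sum of its inputs passed through a
        rectified linear unit (ReLU), the simple nonlinearity most DNNs use."""
        total = sum(w * x for w, x in zip(weights, inputs)) + bias
        return max(0.0, total)

    def layer(inputs, weight_rows, biases):
        """A layer is just many such neurons reading the same inputs."""
        return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

    # Chain the simple functions: each layer's output feeds the next layer.
    x = [0.5, -1.0]
    hidden = layer(x, [[1.0, -0.5], [0.3, 0.8]], [0.0, 0.1])
    output = layer(hidden, [[1.0, 1.0]], [0.0])
    print(hidden, output)  # [1.0, 0.0] [1.0]
    ```

    Stacking dozens of such layers, and fitting the weights to data, is essentially all that separates this toy from the networks that transcribe text or play Go.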

    The secret to Lichtman’s progress with mapping the human brain is machine intelligence. Lichtman’s team, in collaboration with Google, is using deep networks to annotate the millions of images from brain slices their microscopes collect. Each scan from an electron microscope is just a set of pixels. Human eyes easily recognize the boundaries of each blob in the image (a neuron’s soma, axon, or dendrite, in addition to everything else in the brain), and with some effort can tell where a particular bit from one slice appears on the next slice. This kind of labeling and reconstruction is necessary to make sense of the vast datasets in connectomics, and has traditionally required armies of undergraduate students or citizen scientists to manually annotate all chunks. DNNs trained on image recognition are now doing the heavy lifting automatically, turning a job that took months or years into one that’s complete in a matter of hours or days. Recently, Google identified each neuron, axon, dendrite, and dendritic spine—and every synapse—in slices of the human cerebral cortex. “It’s unbelievable,” Lichtman said.
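
    The labeling step can be caricatured in miniature: treat a slice as a binary image and give every connected blob of foreground pixels its own ID. The flood-fill sketch below is a toy stand-in for the deep-network segmentation described above, not the actual pipeline, which must also match blobs across consecutive slices:

    ```python
    def label_blobs(image):
        """Label 4-connected blobs of 1s in a binary 2D grid via flood fill."""
        rows, cols = len(image), len(image[0])
        labels = [[0] * cols for _ in range(rows)]
        count = 0
        for r in range(rows):
            for c in range(cols):
                if image[r][c] == 1 and labels[r][c] == 0:
                    count += 1                  # found a new, unlabeled blob
                    stack = [(r, c)]
                    while stack:
                        i, j = stack.pop()
                        if 0 <= i < rows and 0 <= j < cols \
                                and image[i][j] == 1 and labels[i][j] == 0:
                            labels[i][j] = count
                            stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
        return labels, count

    # Two separate "neurite" cross-sections in a tiny invented slice.
    slice_img = [[1, 1, 0, 0],
                 [0, 1, 0, 1],
                 [0, 0, 0, 1]]
    labels, n = label_blobs(slice_img)
    print(n)  # 2
    ```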

    Scientists still need to understand the relationship between those minute anatomical features and dynamical activity profiles of neurons—the patterns of electrical activity they generate—something the connectome data lacks. This is a point on which connectomics has received considerable criticism, mainly by way of example from the worm: Neuroscientists have had the complete wiring diagram of the worm C. elegans for a few decades now, but arguably do not understand the 300-neuron creature in its entirety; how its brain connections relate to its behaviors is still an active area of research.

    Still, structure and function go hand-in-hand in biology, so it’s reasonable to expect one day neuroscientists will know how specific neuronal morphologies contribute to activity profiles. It wouldn’t be a stretch to imagine a mapped brain could be kickstarted into action on a massive server somewhere, creating a simulation of something resembling a human mind. The next leap constitutes the dystopias in which we achieve immortality by preserving our minds digitally, or machines use our brain wiring to make super-intelligent machines that wipe humanity out. Lichtman didn’t entertain the far-out ideas in science fiction, but acknowledged that a network that would have the same wiring diagram as a human brain would be scary. “We wouldn’t understand how it was working any more than we understand how deep learning works,” he said. “Now, suddenly, we have machines that don’t need us anymore.”

    Yet a masterly deep neural network still doesn’t grant us a holistic understanding of the human brain. That point was driven home to me last year at a Computational and Systems Neuroscience conference, a meeting of the who’s-who in neuroscience, which took place outside Lisbon, Portugal. In a hotel ballroom, I listened to a talk by Arash Afraz, a 40-something neuroscientist at the National Institute of Mental Health in Bethesda, Maryland. The model neurons in DNNs are to real neurons what stick figures are to people, and the way they’re connected is equally sketchy, he suggested.

    Afraz is short, with a dark horseshoe mustache and balding dome covered partially by a thin ponytail, reminiscent of Matthew McConaughey in True Detective. As sturdy Atlantic waves crashed into the docks below, Afraz asked the audience if we remembered René Magritte’s Ceci n’est pas une pipe painting, which depicts a pipe with the title written out below it. Afraz pointed out that the model neurons in DNNs are not real neurons, and the connections among them are not real either. He displayed a classic diagram of interconnections among brain areas found through experimental work in monkeys—a jumble of boxes with names like V1, V2, LIP, MT, HC, each a different color, and black lines connecting the boxes seemingly at random and in more combinations than seems possible. In contrast to the dizzying heap of connections in real brains, DNNs typically connect different brain areas in a simple chain, from one “layer” to the next. Try explaining that to a rigorous anatomist, Afraz said, as he flashed a meme of a shocked baby orangutan cum anatomist. “I’ve tried, believe me,” he said.

    I, too, have been curious why DNNs are so simple compared to real brains. Couldn’t we improve their performance simply by making them more faithful to the architecture of a real brain? To get a better sense for this, I called Andrew Saxe, a computational neuroscientist at Oxford University. Saxe agreed that it might be informative to make our models truer to reality. “This is always the challenge in the brain sciences: We just don’t know what the important level of detail is,” he told me over Skype.

    How do we make these decisions? “These judgments are often based on intuition, and our intuitions can vary wildly,” Saxe said. “A strong intuition among many neuroscientists is that individual neurons are exquisitely complicated: They have all of these back-propagating action potentials, they have dendritic compartments that are independent, they have all these different channels there. And so a single neuron might even itself be a network. To caricature that as a rectified linear unit”—the simple mathematical model of a neuron in DNNs—“is clearly missing out on so much.”

    As 2020 has arrived, I have thought a lot about what I have learned from Lichtman, Afraz, and Saxe and the holy grail of neuroscience: understanding the brain. I have found myself revisiting my undergrad days, when I held science up as the only method of knowing that was truly objective (I also used to think scientists would be hyper-rational, fair beings paramountly interested in the truth—so perhaps this just shows how naive I was).

    It’s clear to me now that while science deals with facts, a crucial part of this noble endeavor is making sense of the facts. The truth is screened through an interpretive lens even before experiments start. Humans, with all our quirks and biases, choose what experiment to conduct in the first place, and how to do it. And the interpretation continues after data are collected, when scientists have to figure out what the data mean. So, yes, science gathers facts about the world, but it is humans who describe it and try to understand it. All these processes require filtering the raw data through a personal sieve, sculpted by the language and culture of our times.

    It seems likely that Lichtman’s two exabytes of brain slices, and even my 48 terabytes of rat brain data, will not fit through any individual human mind. Or at least no human mind is going to orchestrate all this data into a panoramic picture of how the human brain works. As I sat at my office desk, watching the setting sun tint the cloudless sky a light crimson, my mind reached a chromatic, if mechanical, future. The machines we have built—the ones architected after cortical anatomy—fall short of capturing the nature of the human brain. But they have no trouble finding patterns in large datasets. Maybe one day, as they grow stronger and incorporate more of the brain’s cortical anatomy, they will be able to explain those patterns back to us, solving the puzzle of the brain’s interconnections, creating a picture we understand. Out my window, the sparrows were chirping excitedly, not ready to call it a day.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 1:05 pm on August 20, 2020 Permalink | Reply
    Tags: "2D Electronics Get an Atomic Tuneup", , Electron Microscopy, , , , , , TUNING THE BAND GAP   

    From Lawrence Berkeley National Lab: “2D Electronics Get an Atomic Tuneup” 


    From Lawrence Berkeley National Lab

    August 20, 2020
    Theresa Duque
    tnduque@lbl.gov
    (510) 495-2418

    Scientists at Berkeley Lab, UC Berkeley demonstrate tunable, atomically thin semiconductors.

    Electron microscopy experiments revealed meandering stripes formed by metal atoms of rhenium and niobium in the lattice structure of a 2D transition metal dichalcogenide alloy. (Image courtesy of Amin Azizi.)

    To tune the band gap, a key parameter in controlling the electrical conductivity and optical properties of semiconductors, researchers typically engineer alloys, a process in which two or more materials are combined to achieve properties that otherwise could not be achieved by a pristine material.

    But engineering band gaps of conventional semiconductors via alloying has often been a guessing game, because scientists have not had a technique to directly “see” whether the alloy’s atoms are arranged in a specific pattern, or randomly dispersed.

    Now, as reported in Physical Review Letters, a research team led by Alex Zettl and Marvin Cohen – senior faculty scientists in the Materials Sciences Division at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), and professors of physics at UC Berkeley – has demonstrated a new technique that could engineer the band gap needed to improve the performance of semiconductors for next-generation electronics such as optoelectronics, thermoelectrics, and sensors.

    For the current study, the researchers examined monolayer and multilayer samples of a 2D transition metal dichalcogenide (TMD) material made of the alloy rhenium niobium disulfide.

    Electron microscopy experiments revealed meandering stripes formed by metal atoms of rhenium and niobium in the lattice structure of the 2D TMD alloy.

    A statistical analysis confirmed what the research team had suspected – that metal atoms in the 2D TMD alloy prefer to be adjacent to the other metal atoms, “which is in stark contrast to the random structure of other TMD alloys of the same class,” said lead author Amin Azizi, a postdoctoral researcher in the Zettl lab at UC Berkeley.
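
    The flavor of that statistical analysis can be shown on a toy one-dimensional "alloy": count how often neighboring sites hold the same metal and compare against the random-alloy baseline of about one half. The chain below and its 50/50 composition are invented for illustration; the actual analysis was performed on a 2D crystal lattice:

    ```python
    import random

    def like_neighbor_fraction(chain):
        """Fraction of adjacent site pairs occupied by the same atom type."""
        pairs = list(zip(chain, chain[1:]))
        return sum(a == b for a, b in pairs) / len(pairs)

    random.seed(0)
    ordered = ["Re"] * 50 + ["Nb"] * 50   # fully segregated stripes
    mixed = ordered[:]
    random.shuffle(mixed)                 # a random alloy of the same composition

    print(f"{like_neighbor_fraction(ordered):.2f}")  # 0.99: like atoms cluster
    print(f"{like_neighbor_fraction(mixed):.2f}")    # near 0.50 for a random alloy
    ```

    A measured fraction well above the random baseline is the signature that like atoms prefer to sit next to each other, as the team found for rhenium and niobium.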

    Calculations performed at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) by Mehmet Dogan, a postdoctoral researcher in the Cohen lab at UC Berkeley, demonstrated that such atomic ordering can modify the material’s band gap.

    [Images in the original post: NERSC supercomputers at Berkeley Lab, including the Cray Cori II (named for Gerty Cori, the first American woman to win a Nobel Prize in science), the Hopper Cray XE6 (named for Grace Hopper, one of the first programmers of the Harvard Mark I computer), the Edison Cray XC30, GPFS storage for life sciences, the Genepool cluster dedicated to the DOE Joint Genome Institute’s computing needs, the PDSF cluster serving physics, astrophysics, and nuclear science collaborations, and the planned Cray Shasta “Perlmutter” AMD Epyc/Nvidia pre-exascale system.]

    NERSC is a DOE Office of Science User Facility.

    Optical spectroscopy measurements performed at Berkeley Lab’s Advanced Light Source revealed that the band gap of the 2D TMD alloy can be additionally tuned by adjusting the number of layers in the material.

    Also, the band gap of the monolayer alloy is similar to that of silicon – which is “just right” for many electronic and optical applications, Azizi said. And the 2D TMD alloy has the added benefits of being flexible and transparent.

    The researchers next plan to explore the sensing and optoelectronic properties of new devices based on the 2D TMD alloy.

    Co-authors with Azizi, Cohen, and Zettl include Jeffrey D. Cain, Mehmet Dogan, Rahmatollah Eskandari, Emily G. Glazer, and Xuanze Yu.

    The Advanced Light Source and NERSC are DOE Office of Science user facilities co-located at Berkeley Lab.

    This work was supported by the DOE Office of Science. Additional funding was provided by the National Science Foundation.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    LBNL Molecular Foundry

    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

     
  • richardmitnick 7:14 am on July 13, 2020 Permalink | Reply
    Tags: (DMSE)-Department of Materials Science and Engineering, , Electron Microscopy, Frances Ross, , , MIT.nano facility,   

    From MIT News: “A wizard of ultrasharp imaging” Frances Ross 

    MIT News

    From MIT News

    July 12, 2020
    David L. Chandler

    To oversee its new cutting-edge electron microscopy systems, MIT sought out Frances Ross’ industry-honed expertise.

    1
    “I’m hoping that MIT becomes a center for electron microscopy,” professor Frances Ross says. “There is nothing that exists with the capabilities that we are aiming for here.” Photo: Jared Charney

    1
    A specially designed transmission electron microscope in MIT Materials Research Laboratory’s newly renovated Electron Microscopy (EM) Shared Facility in Building 13. Photo, Denis Paiste, Materials Research Laboratory.

    Though Frances Ross and her sister Caroline Ross both ended up on the faculty of MIT’s Department of Materials Science and Engineering, they got there by quite different pathways. While Caroline followed a more traditional academic route and has spent most of her career at MIT, Frances Ross spent most of her professional life working in the industrial sector, as a microscopy specialist at IBM.

    3
    IBM Research Ultra High Vacuum-Transmission Electron Microscope Lab In 360.

    It wasn’t until 2018 that she arrived at MIT to oversee the new state-of-the-art electron microscope systems being installed in the new MIT.nano facility.

    Frances, who bears a strong family resemblance to her sister, says “it’s confused a few people, if they don’t know there are two of us.”

    The sisters grew up in London in a strongly science- and materials-oriented family. Their father, who worked first as a scientist and then as a lawyer, is currently working on his third PhD degree, in classics. Their mother, a gemologist, specializes in precisely matching diamonds, and oversees certification testing for the profession.

    After earning her doctorate at Cambridge University in materials science, specializing in electron microscopy, Frances Ross went on to do a postdoc at Bell Labs in New Jersey, and then to the National Center for Electron Microscopy at the University of California at Berkeley. From there she continued her work in electron microscopy at IBM in Yorktown Heights, New York, where she spent 20 years working on development and application of electron microscope technology to studying crystal growth.

    When MIT built its new cutting-edge nanotechnology fabrication and analysis facility, MIT.nano, it was clear that state-of-the-art microscope technology would need to be a key feature of the new center. That’s when Ross was hired as a professor, along with Professor Jim LeBeau and Research Scientist Rami Dana, who had an academic and industrial research background, to oversee the creation, development, and application of those microscopes for the Department of Materials Science and Engineering (DMSE) and the wider MIT community.

    “Currently, our students have to go to other places to do high-performance microscopy, so they might go to Harvard, or one of the national labs,” says Ross, who is the Ellen Swallow Richards Professor in Materials Science and Engineering. “Very many advances in the instrumentation have come together over the last few years, so that if your equipment is a little older, it’s actually a big disadvantage in electron microscopy. This is an area where MIT had not invested for a little while, and therefore, once they made that decision, the jump is going to be very significant. We’re going to have a state-of-the-art imaging capability.”

    There will be two major electron microscope systems for materials science, which are gradually taking shape inside the vibration-isolated basement level of MIT.nano, alongside two others already installed that are specialized for biomedical imaging.

    One of these will be an advanced version of a standard electron microscope, she says, that will have a unique combination of features. “There is nothing that exists with the capabilities that we are aiming for here.”

    The most important of these, she says, is the quality of the vacuum inside the microscope: “In most of our experiments, we want to start with a surface that’s atomically clean.” For example, “we could start with atomically clean silicon, and then add some germanium. How do the germanium atoms add onto the silicon surface? That’s a very important question for microelectronics. But if the sample is in an environment that’s not well-controlled, then the results you get will depend on how dirty the vacuum is. Contamination may affect the process, and you can’t be sure that what you’re seeing is what happens in real life.” Ross is working with the manufacturers to reach exceptional levels of cleanliness in the vacuum of the electron microscope system being developed now.

    But ultra-high-quality vacuum is just one of its attributes. “We combine the good vacuum with capabilities to heat the sample, and flow gases, and record images at high speed,” Ross says. “Perhaps most importantly for a lot of our experiments, we use lower-energy electrons to do the imaging, because for many interesting materials like 2D materials, such as graphene, boron nitride, and related structures, the high-energy electrons that are normally used will damage the sample.”

    Putting that all together, she says, “is a unique instrument that will give us real insights into surface reactions, crystal growth processes, materials transformations, catalysis, all kinds of reactions involving nanostructure formation and chemistry on the surfaces of 2D materials.”

    Other instruments and capabilities are also being added to MIT’s microscopy portfolio. A new scanning transmission electron microscope is already installed in MIT.nano and is providing high-resolution structural and chemical analysis of samples for several projects at MIT. Another new capability is a special sample holder that allows researchers to make movies of unfolding processes in water or other liquids in the microscope. This allows detailed monitoring, at up to 100 frames per second, of a variety of phenomena, such as solution-phase growth, unfolding chemical reactions, or electrochemical processes such as battery charging and discharging. Making movies of processes taking place in water, she says, “is something of a new field for electron microscopy.”

    Ross already has set up an ultra-high vacuum electron microscope in DMSE but without the resolution and low-voltage operation of the new instrument. And finally, an ultra-high vacuum scanning tunneling microscope has just started to produce images and will measure current flow through nanoscale materials.

    In their free time, Ross and her husband Brian enjoy sailing, mostly off the coast of Maine, with their two children, Kathryn and Eric. As a hobby she collects samples of beach sand. “I have a thousand different kinds of sand from various places, and a lot of them from Massachusetts,” she says. “Everywhere I go, that’s my souvenir.”

    But with her intense focus on developing this new world-class microscopy facility, there’s little time for anything else these days. Her aim is to ensure that it’s the best facility possible.

    “I’m hoping that MIT becomes a center for electron microscopy,” she says. “You know, with all the interesting materials science and physics that goes on here, it matches up very well with this unique instrumentation, this high-quality combination of imaging and analysis. These unique characterization capabilities really complement the rest of the science that happens here.”

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 11:42 am on February 24, 2020 Permalink | Reply
    Tags: "A Simple Retrofit Transforms Ordinary Electron Microscopes Into High-Speed Atom-Scale Cameras", , Electron Microscopy,   

    From NIST: “A Simple Retrofit Transforms Ordinary Electron Microscopes Into High-Speed Atom-Scale Cameras” 


    From NIST

    February 24, 2020

    Ben P. Stein
    benjamin.stein@nist.gov
    (301) 975-2763

    Patented “beam chopper” provides cost-effective way to investigate super-fast processes important for tomorrow’s technology.

    1
    Credit: N. Hanacek/NIST

    Researchers at the National Institute of Standards and Technology (NIST) and their collaborators have developed a way to retrofit the transmission electron microscope — a long-standing scientific workhorse for making crisp microscopic images — so that it can also create high-quality movies of super-fast processes at the atomic and molecular scale. Compatible with electron microscopes old and new, the retrofit promises to enable fresh insights into everything from microscopic machines to next-generation computer chips and biological tissue by making this moviemaking capability more widely available to laboratories everywhere.

    “We want to be able to look at things in materials science that happen really quickly,” said NIST scientist June Lau. She reports the first proof-of-concept operation of this retrofitted design with her colleagues in the journal Review of Scientific Instruments. The team designed the retrofit to be a cost-effective add-on to existing instruments. “It’s expected to be a fraction of the cost of a new electron microscope,” she said.

    A nearly 100-year-old invention, the electron microscope remains an essential tool in many scientific laboratories. A popular version is known as the transmission electron microscope (TEM), which fires electrons through a target sample to produce an image. Modern versions of the microscope can magnify objects by as much as 50 million times. Electron microscopes have helped to determine the structure of viruses, test the operation of computer circuits, and reveal the effectiveness of new drugs.

    “Electron microscopes can look at very tiny things on the atomic scale,” Lau said. “They are great. But historically, they look at things that are fixed in time. They’re not good at viewing moving targets,” she said.

    In the last 15 years, laser-assisted electron microscopes made videos possible, but such systems have been complex and expensive. While these setups can capture events that last from nanoseconds (billionths of a second) to femtoseconds (quadrillionths of a second), a laboratory must often buy a newer microscope to accommodate this capability as well as a specialized laser, with a total investment that can run into the millions of dollars. A lab also needs in-house laser-physics expertise to help set up and operate such a system.

    “Frankly, not everyone has that capacity,” Lau said.

    In contrast, the retrofit enables TEMs of any age to make high-quality movies on the scale of picoseconds (trillionths of a second) by using a relatively simple “beam chopper.” In principle, the beam chopper can be used in any manufacturer’s TEM. To install it, NIST researchers open the microscope column directly under the electron source, insert the beam chopper and close up the microscope again. Lau and her colleagues have successfully retrofitted three TEMs of different capabilities and vintage.

    Like a stroboscope, this beam chopper releases precisely timed pulses of electrons that can capture frames of important repeating or cyclic processes.

    “Imagine a Ferris wheel, which moves in a cyclical and repeatable way,” Lau said. “If we’re recording it with a pinhole camera, it will look blurry. But we want to see individual cars. I can put a shutter in front of the pinhole camera so that the shutter speed matches the movement of the wheel. We can time the shutter to open whenever a designated car goes to the top. In this way I can make a stack of images that shows each car at the top of the Ferris wheel,” she said.

    Like the light shutter, the beam chopper interrupts a continuous electron beam. But unlike the shutter, which has an aperture that opens and closes, this beam aperture stays open all the time, eliminating the need for a complex mechanical part.

    Instead, the beam chopper generates a radio frequency (RF) electromagnetic wave in the direction of the electron beam. The wave causes the traveling electrons to behave “like corks bobbing up and down on the surface of a water wave,” Lau said.

    Riding this wave, the electrons follow an undulating path as they approach the aperture. Most electrons are blocked except for the ones that are perfectly aligned with the aperture. The frequency of the RF wave is tunable, so that electrons hit the sample anywhere from 40 million to 12 billion times per second. As a result, researchers can capture important processes in the sample at time intervals from about a nanosecond to 10 picoseconds.
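As a back-of-the-envelope sketch (not NIST's code), the spacing between successive electron pulses is simply the reciprocal of the tunable pulse rate quoted above:

```python
def pulse_period_seconds(pulses_per_second):
    """Time between successive electron pulses for a given repetition rate."""
    return 1.0 / pulses_per_second

# The two ends of the quoted tuning range: 40 million to 12 billion pulses/s.
for rate in (40e6, 12e9):
    period = pulse_period_seconds(rate)
    print(f"{rate:.1e} pulses/s -> {period * 1e12:.0f} ps between pulses")
# 4.0e+07 pulses/s -> 25000 ps between pulses  (i.e., 25 ns)
# 1.2e+10 pulses/s -> 83 ps between pulses
```

This is only the repetition period; the usable time resolution also depends on the pulse width and on how the sample's cyclic process is synchronized to the RF drive.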

    In this way, the NIST-retrofitted microscope can capture atom-scale details of the back-and-forth movements in tiny machines such as microelectromechanical systems (MEMS) and nanoelectromechanical systems (NEMS). It can potentially study the regularly repeating signals in antennas used for high-speed communications and probe the movement of electric currents in next-generation computer processors.

    In one demo, the researchers wanted to prove that a retrofitted microscope functioned as it did before the retrofit. They imaged gold nanoparticles in both the traditional “continuous” mode and the pulsed beam mode. The images in the pulsed mode had comparable clarity and resolution to the still images.

    “We designed it so it should be the same,” Lau said.

    2
    A transmission electron microscope (TEM) image of gold (Au) nanoparticles magnified 200,000 times with a continuous electron beam (left) and a pulsed beam (right). The scale is 5 nanometers (nm).

    The beam chopper can also do double duty, pumping RF energy into the material sample and then taking pictures of the results. The researchers demonstrated this ability by injecting microwaves (a form of radio wave) into a metallic, comb-shaped MEMS device. The microwaves create electric fields within the MEMS device and cause the incoming pulses of electrons to deflect. These electron deflections enable researchers to build movies of the microwaves propagating through the MEMS comb.

    Lau and her colleagues hope their invention can soon make new scientific discoveries. For example, it could investigate the behavior of quickly changing magnetic fields in molecular-scale memory devices that promise to store more information than before.

    The researchers spent six years inventing and developing their beam chopper and have received several patents and an R&D 100 Award for their work. Co-authors in the work included Brookhaven National Laboratory in Upton, New York, and Euclid Techlabs in Bolingbrook, Illinois.

    One of the things that makes Lau most proud is that their design can breathe new life into any TEM, including the 25-year-old unit that performed the latest demonstration. The design gives labs everywhere the potential to use their microscopes to capture important fast-moving processes in tomorrow’s materials.

    “Democratizing science was the whole motivation,” Lau said.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    NIST Mission, Vision, Core Competencies, and Core Values

    NIST’s mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.
    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

     
  • richardmitnick 12:37 pm on February 21, 2019 Permalink | Reply
    Tags: "Big Data at the Atomic Scale: New Detector Reaches New Frontier in Speed", A new detector that can capture atomic-scale images in millionths-of-a-second increments., , , Electron Microscopy, known as the “4D Camera” (for Dynamic Diffraction Direct Detector), , , NCEM-National Center for Electron Microscopy, The Molecular Foundry, The new detector, The Transmission Electron Aberration-corrected Microscope (TEAM 0.5) at Berkeley Lab   

    From Lawrence Berkeley National Lab: “Big Data at the Atomic Scale: New Detector Reaches New Frontier in Speed” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    February 21, 2019
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    1
    The Transmission Electron Aberration-corrected Microscope (TEAM 0.5) at Berkeley Lab has been upgraded with a new detector that can capture atomic-scale images in millionths-of-a-second increments. (Credit: Thor Swift/Berkeley Lab)


    This video provides an overview of the R&D effort to upgrade an electron microscope at Berkeley Lab’s Molecular Foundry with a superfast detector, the 4D Camera. The detector, which is linked to a supercomputer at Berkeley Lab via a high-speed data connection, can capture more images at a faster rate, revealing atomic-scale details across much larger areas than was possible before. (Credit: Marilyn Chung/Berkeley Lab)

    Advances in electron microscopy – using electrons as imaging tools to see things well beyond the reach of conventional microscopes that use light – have opened up a new window into the nanoscale world and brought a wide range of samples into focus as never before.

    Electron microscopy experiments can only use a fraction of the possible information generated as the microscope’s electron beam interacts with samples. Now, a team at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has designed a new kind of electron detector that captures all of the information in these interactions.

    This new tool, a superfast detector installed Feb. 12 at Berkeley Lab’s Molecular Foundry, a nanoscale science user facility, captures more images at a faster rate, revealing atomic-scale details across much larger areas than was possible before. The Molecular Foundry and its world-class electron microscopes in the National Center for Electron Microscopy (NCEM) provide access to researchers from around the world.

    Faster imaging can also reveal important changes that samples are undergoing and provide movies vs. isolated snapshots. It could, for example, help scientists to better explore working battery and microchip components at the atomic scale before the onset of damage.

    The detector, which has a special direct connection to the Cori supercomputer at the Lab’s National Energy Research Scientific Computing Center (NERSC), will enable scientists to record atomic-scale images with timing measured in microseconds, or millionths of a second – 100 times faster than possible with existing detectors.

    NERSC Cray Cori II supercomputer at NERSC at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    “It is the fastest electron detector ever made,” said Andrew Minor, NCEM facility director at the Molecular Foundry.

    “It opens up a new time regime to explore with high-resolution microscopy. No one has ever taken continuous movies at this time resolution” using electron imaging, he said. “What happens there? There are all kinds of dynamics that might happen. We just don’t know because we’ve never been able to look at them before.” The new movies could reveal tiny deformations and movements in materials, for example, and show chemistry in action.

    The development of the new detector, known as the “4D Camera” (for Dynamic Diffraction Direct Detector), is the latest in a string of pioneering innovations in electron microscopy, atomic-scale imaging, and high-speed data transfer and computing at Berkeley Lab that span several decades.

    “Our group has been working for some time on making better detectors for microscopy,” said Peter Denes, a Berkeley Lab senior scientist and a longtime pioneer in the development of electron microscopy tools.

    “You get a whole scattering pattern instead of just one point, and you can go back and reanalyze the data to find things that maybe you weren’t focusing on before,” Denes said. This quickly produces a complete image of a sample by scanning across it with an electron beam and capturing information based on the electrons that scatter off the sample.

    Mary Scott, a faculty scientist at the Molecular Foundry, said that the unique geometry of the new detector allows studies of both light and heavyweight elements in materials side by side. “The reason you might want to perform one of these more complicated experiments would be to measure the positions of light elements, particularly in materials that might be really sensitive to the electron beam – like lithium in a battery material – and ideally you would be able to also precisely measure the positions of heavy elements in that same material,” she said.

    The new detector has been installed on the Transmission Electron Aberration-corrected Microscope 0.5 (TEAM 0.5) at the Molecular Foundry, which set high-resolution records when it launched at NCEM a decade ago and allows visiting researchers to access single-atom resolution for some samples. The detector will generate a whopping 4 terabytes of data per minute.

    “The amount of data is equivalent to watching about 60,000 HD movies simultaneously,” said Peter Ercius, a staff scientist at the Molecular Foundry who specializes in 3D atomic-scale imaging.
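A quick sanity check on those figures, assuming decimal units (1 TB = 10^12 bytes) and an assumed rate of roughly 9 Mbit/s for one HD movie stream (our assumption, not the article's):

```python
# 4 terabytes per minute, expressed as a sustained bit rate.
bytes_per_minute = 4e12
bits_per_second = bytes_per_minute * 8 / 60
print(f"{bits_per_second / 1e9:.0f} Gbit/s sustained")   # 533 Gbit/s sustained

# How many assumed ~9 Mbit/s HD streams that rate corresponds to.
hd_stream_bps = 9e6
print(f"~{bits_per_second / hd_stream_bps:,.0f} simultaneous HD streams")
```

With those assumptions the arithmetic lands near the article's "about 60,000 HD movies" comparison.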

    Brent Draney, a networking architect at Berkeley Lab’s NERSC, said that Ercius and Denes had approached NERSC to see what it would take to build a system that could handle this huge, 400-gigabit stream of data produced by the 4D Camera.

    His response: “We actually already have a system capable of doing that. What we really needed to do is to build a network between the microscope and the supercomputer.”

    2
    A technician works on the TEAM 0.5 microscope. The microscope has been upgraded with a superfast detector called the 4D Camera that can capture atomic-scale images in millionths-of-a-second increments. (Credit: Thor Swift/Berkeley Lab)

    Camera data is transferred over about 100 fiber-optic connections into a high-speed ethernet connection that is about 1,000 times faster than the average home network, said Ian Johnson, a staff scientist in Berkeley Lab’s Engineering Division. The network connects the Foundry to the Cori supercomputer at NERSC.

    Berkeley Lab’s Energy Sciences Network (ESnet), which connects research centers with high-speed data networks, participated in the effort.

    Ercius said, “The supercomputer will analyze the data in about 20 seconds in order to provide rapid feedback to the scientists at the microscope to tell if the experiment was successful or not.”

    Jim Ciston, another Molecular Foundry staff scientist, said, “We’ll actually capture every electron that comes through the sample as it’s scattered. Through this really large data set we’ll be able to perform ‘virtual’ experiments on the sample – we won’t have to go back and take new data from different imaging conditions.”
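The "virtual experiment" idea can be sketched with synthetic data (nothing here comes from the TEAM 0.5 instrument): a 4D-STEM dataset stores one full diffraction pattern per scan position, so virtual bright-field and dark-field detectors are just different sums over the stored detector pixels, applied after the fact:

```python
import numpy as np

rng = np.random.default_rng(0)
scan_y, scan_x, det_y, det_x = 8, 8, 32, 32  # tiny toy dimensions
# One full (det_y, det_x) diffraction pattern per (scan_y, scan_x) position.
data = rng.poisson(5.0, size=(scan_y, scan_x, det_y, det_x)).astype(float)

# Radial distance of each detector pixel from the pattern center.
yy, xx = np.mgrid[:det_y, :det_x]
r = np.hypot(yy - det_y / 2, xx - det_x / 2)

# Virtual bright-field: sum the central disk of each pattern.
# Virtual annular dark-field: sum an outer ring of the same stored patterns.
bright_field = data[..., r < 6].sum(axis=-1)
dark_field = data[..., (r > 10) & (r < 15)].sum(axis=-1)

print(bright_field.shape, dark_field.shape)  # one image per scan grid: (8, 8) (8, 8)
```

The pixel radii and Poisson counts above are arbitrary; the point is that both images are derived from one recorded dataset, without re-imaging the sample under different detector geometries.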

    The work on the new detector and its supporting data systems should benefit other facilities that produce high volumes of data, such as the Advanced Light Source and its planned upgrade, and the LCLS-II project at SLAC National Accelerator Laboratory, Ciston noted.

    LBNL Advanced Light Source

    SLAC LCLS-II

    The Advanced Light Source, ESnet, Molecular Foundry, and NERSC are DOE Office of Science User Facilities.

    The development of the 4D Camera was supported by the Accelerator and Detector Research Program of the Department of Energy’s Office of Basic Energy Sciences, and work at the Molecular Foundry was supported by the DOE’s Office of Basic Energy Sciences.

    3
    This computer chip is a component in a superfast detector called the 4D Camera. The detector is an upgrade for a powerful electron microscope at Berkeley Lab’s Molecular Foundry. (Credit: Marilyn Chung/Berkeley Lab)

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    DOE Seal

     