Tagged: Electronics

  • richardmitnick 4:36 pm on November 9, 2022 Permalink | Reply
    Tags: Electronics, "New Theory of Electron Spin to Aid Quantum Devices", Mass - charge - spin, Spintronics (development of quantum electronic devices that use spin in memory storage and information processing), Qubits (the basic unit of information used in quantum computing)

    From The California Institute of Technology: “New Theory of Electron Spin to Aid Quantum Devices” 

    Caltech Logo

    From The California Institute of Technology

    11.9.22
    Emily Velasco
    (626) 372‑0067
    evelasco@caltech.edu

    Credit: Caltech.

    Electrons—those little subatomic particles that help make up the atoms in our bodies and the electricity flowing through your phone or computer right now—have some properties like mass and charge that will be familiar to anyone who has taken a high school physics class. But electrons also have a more abstract property known as spin, which describes how they interact with magnetic fields.

    Electron spin is of particular importance to a field of research called spintronics, which aims to develop quantum electronic devices that use spin in memory storage and information processing. Spin is also central to qubits—the basic unit of information used in quantum computing.

    The problem with using spin in these quantum devices is that its quantum states can be easily disrupted. To be used in a device, the electron spins need to preserve their quantum state for as long as possible to avoid loss of information. This is known as spin coherence, and it is so delicate that even the tiny vibrations of the atoms that make up the device can wipe out the spin state irreversibly.
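As a rough illustration of what "preserving the quantum state for as long as possible" means, spin coherence is often summarized by a characteristic coherence time, with the remaining coherence decaying exponentially. The sketch below uses hypothetical numbers, not values from the paper:

```python
import math

def spin_coherence(t_ns: float, t2_ns: float) -> float:
    """Fraction of the initial spin coherence remaining after t_ns
    nanoseconds, assuming simple exponential decay with coherence
    time t2_ns. Illustrative model only, not the paper's formalism."""
    return math.exp(-t_ns / t2_ns)

# Hypothetical coherence time of 1 ns -- the "billionth of a second"
# scale mentioned later in the article.
t2 = 1.0
for t in (0.1, 0.5, 1.0, 3.0):
    print(f"t = {t:.1f} ns -> coherence fraction = {spin_coherence(t, t2):.3f}")
```

After one coherence time only about 37% of the original coherence survives, which is why even small perturbations from atomic vibrations matter so much.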

    In a new paper published in the journal Physical Review Letters [below], Marco Bernardi, professor of applied physics, physics and materials science; and Jinsoo Park (MS ’20, PhD ’22), postdoctoral scholar research associate in applied physics and materials science, have developed a new theory and numerical calculations to predict spin decoherence in materials with high accuracy. Bernardi explains:

    “Existing theories of spin relaxation and decoherence focus on simple models and qualitative understanding. After years of systematic efforts, my group has developed computational tools to study quantitatively how electrons interact and move in materials.

    This new paper has taken our work a few steps further: we have adapted a theory of electrical transport to study spin, and discovered that this method can capture two main mechanisms governing spin decoherence in materials—spin scattering off atomic vibrations, and spin precession modified by atomic vibrations. This unified treatment allows us to study the behavior of the electron spin in a wide range of materials and devices essential for future quantum technologies. It is almost startling that in some cases we can predict spin decoherence times with an accuracy of a few percent of the measured values—down to a billionth of a second—and access microscopic details of spin motion beyond the reach of experiments. Ironically, our research tools—computers and quantum mechanics—can now be used to develop new computers that use quantum mechanics.”
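When two decoherence mechanisms act at once, a common back-of-the-envelope approximation (a Matthiessen-like rule I am assuming here for simplicity, not the paper's unified treatment) is that independent channels combine at the level of rates, so the faster channel dominates:

```python
def combined_t2(t2_a_ns: float, t2_b_ns: float) -> float:
    """Combine two independent decoherence channels by adding their
    rates (1/T2). A Matthiessen-like sketch with hypothetical inputs,
    not the formalism of the paper."""
    return 1.0 / (1.0 / t2_a_ns + 1.0 / t2_b_ns)

# Hypothetical channel times: 2 ns (scattering-limited) and
# 6 ns (precession-limited).
print(f"combined T2 = {combined_t2(2.0, 6.0):.2f} ns")
```

The combined coherence time (1.5 ns here) is always shorter than either channel alone, which is why capturing both mechanisms in one framework matters for accurate predictions.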

    A companion paper describing the theory in detail is published in Physical Review B [below].

    Science papers:
    Physical Review Letters
    Physical Review B
    Those with institutional credentials can see the full science papers for detailed material with images.

    Funding for the research was provided by the National Science Foundation.

    See the full article here.




    Please help promote STEM in your local schools.

    Stem Education Coalition

    Caltech campus

    The California Institute of Technology is a private research university in Pasadena, California. The university is known for its strength in science and engineering, and is one of a small group of institutes of technology in the United States that are primarily devoted to the instruction of pure and applied sciences.

    The California Institute of Technology was founded as a preparatory and vocational school by Amos G. Throop in 1891 and began attracting influential scientists such as George Ellery Hale, Arthur Amos Noyes, and Robert Andrews Millikan in the early 20th century. The vocational and preparatory schools were disbanded and spun off in 1910 and the college assumed its present name in 1920. In 1934, The California Institute of Technology was elected to the Association of American Universities, and the antecedents of the National Aeronautics and Space Administration’s Jet Propulsion Laboratory, which The California Institute of Technology continues to manage and operate, were established between 1936 and 1943 under Theodore von Kármán.

    The California Institute of Technology has six academic divisions with strong emphasis on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. First-year students are required to live on campus, and 95% of undergraduates remain in the on-campus House System at The California Institute of Technology. Although The California Institute of Technology has a strong tradition of practical jokes and pranks, student life is governed by an honor code which allows faculty to assign take-home examinations. The California Institute of Technology Beavers compete in 13 intercollegiate sports in the NCAA Division III’s Southern California Intercollegiate Athletic Conference (SCIAC).

    As of October 2020, there are 76 Nobel laureates who have been affiliated with The California Institute of Technology, including 40 alumni and faculty members (41 prizes, with chemist Linus Pauling being the only individual in history to win two unshared prizes). In addition, 4 Fields Medalists and 6 Turing Award winners have been affiliated with The California Institute of Technology. There are 8 Crafoord Laureates and 56 non-emeritus faculty members (as well as many emeritus faculty members) who have been elected to one of the United States National Academies. Four Chief Scientists of the U.S. Air Force have been affiliated with the Institute, and 71 affiliates have won the United States National Medal of Science or Technology. Numerous faculty members are associated with the Howard Hughes Medical Institute as well as the National Aeronautics and Space Administration. According to a 2015 Pomona College study, The California Institute of Technology ranked number one in the U.S. for the percentage of its graduates who go on to earn a PhD.

    Research

    The California Institute of Technology is classified among “R1: Doctoral Universities – Very High Research Activity”. Caltech was elected to The Association of American Universities in 1934 and remains a research university with “very high” research activity, primarily in STEM fields. The largest federal agencies contributing to research are the National Aeronautics and Space Administration, the National Science Foundation, the Department of Health and Human Services, the Department of Defense, and the Department of Energy.

    In 2005, The California Institute of Technology had 739,000 square feet (68,700 m^2) dedicated to research: 330,000 square feet (30,700 m^2) to physical sciences, 163,000 square feet (15,100 m^2) to engineering, and 160,000 square feet (14,900 m^2) to biological sciences.

    In addition to managing NASA-JPL/Caltech, The California Institute of Technology also operates the Caltech Palomar Observatory; the Owens Valley Radio Observatory; the Caltech Submillimeter Observatory; the W. M. Keck Observatory at the Mauna Kea Observatory; the Laser Interferometer Gravitational-Wave Observatory in Livingston, Louisiana, and Hanford, Washington; and the Kerckhoff Marine Laboratory in Corona del Mar, California. The Institute launched the Kavli Nanoscience Institute at The California Institute of Technology in 2006; the Keck Institute for Space Studies in 2008; and is also the current home of the Einstein Papers Project. The Spitzer Science Center, part of the Infrared Processing and Analysis Center located on The California Institute of Technology campus, is the data analysis and community support center for NASA’s Spitzer Infrared Space Telescope [no longer in service].

    The California Institute of Technology partnered with University of California at Los Angeles to establish a Joint Center for Translational Medicine (UCLA-Caltech JCTM), which conducts experimental research into clinical applications, including the diagnosis and treatment of diseases such as cancer.

    The California Institute of Technology operates several Total Carbon Column Observing Network stations as part of an international collaborative effort of measuring greenhouse gases globally. One station is on campus.

     
  • richardmitnick 7:46 am on November 9, 2022 Permalink | Reply
    Tags: "Inspiration at the atomic scale", Electronics, James LeBeau, With new techniques in electron microscopy James LeBeau explores the nanoscale landscape within materials to understand their properties

    From The School of Engineering AT The Massachusetts Institute of Technology: “Inspiration at the atomic scale” 

    From The School of Engineering

    At

    The Massachusetts Institute of Technology

    11.9.22
    Zach Winn

    MIT Associate Professor James LeBeau develops new techniques for gathering and analyzing data in electron microscopy to better understand material properties in fields including electronics, photonics, quantum mechanics, and energy storage. “Science is truly a creative outlet,” LeBeau says. Photo: Adam Glanzman.

    With new techniques in electron microscopy James LeBeau explores the nanoscale landscape within materials to understand their properties.

    To explain why he loves electron microscopy, Associate Professor James LeBeau uses an analogy: He likens the technique, which uses beams of electrons to illuminate materials at a scale thousands of times smaller than conventional microscopes, to the inverse of astronomy.

    “It’s discovering things that no human has ever seen before that really captures the imagination,” LeBeau says. “There is a beauty to the way atoms are arranged in materials, particularly at defects, which give rise to all sorts of material behavior.”

    LeBeau has used that passion to develop new techniques for collecting and interpreting data in electron microscopy that can be used to describe materials more comprehensively. He’s applied those techniques to explain materials’ behavior in fields from electronics and optics to energy storage, quantum computing, and more.

    “Beyond explaining material properties, there’s also a significant computational component to electron microscopy as it’s used to analyze data that may have been overlooked previously and to make conclusions about the data in new ways. And, with the creation of the MIT Schwarzman College of Computing, it’s an exciting time to be at MIT,” he says.

    Discovering a passion

    LeBeau became interested in engineering while helping his father build and repair things around the house, and he discovered a love for science at a young age.

    “Science can provide an explanation of the world around us beyond supernatural beliefs,” LeBeau says. “For me, science was about making sense of the world.”

    LeBeau first learned about materials science through the technical high school he attended in Indiana. But it wasn’t until he was an undergraduate at Rensselaer Polytechnic Institute in New York that a few pivotal experiences helped set his course in life.

    During his first year, he participated in a project using data science to predict material properties.

    “After that I was hooked, and at that point I knew I wanted to go the academic route,” he recalls. “Just being able to explore things and have that academic freedom really appealed to me.”

    A few years later, in 2005, LeBeau participated in a summer research program for undergraduates at what is now the Materials Research Laboratory at MIT.

    The experience, in which he integrated biopolymers into a casting process, stoked his interest in using materials science for sustainability. The passion of the researchers around MIT also left a lasting impression on him.

    Finally, as a senior, LeBeau got his first taste of electron microscopy.

    “We’d be in the lab in the middle of the night analyzing these materials, and that excitement caught my attention pretty early on,” LeBeau says. “It didn’t really matter how much I was working — I loved doing it, and that set the stage for the rest of my career.”

    During his PhD at the University of California-Santa Barbara, LeBeau was part of a team that showed that scanning transmission electron microscopy theory and experiment are in very good agreement and, in turn, that attograms (one millionth of a trillionth of a gram) of material could be weighed directly from electron microscopy images without the need for external microscope calibration standards.

    LeBeau also discovered a passion for cycling through the mountains near UC Santa Barbara’s campus, an activity he continues by biking thousands of miles a year, including to MIT nearly every day regardless of the weather.

    After his PhD, LeBeau accepted a faculty position at North Carolina State University, where he worked for eight years before a similar position opened up at MIT in 2019.

    Since his move to MIT, LeBeau has helped the Institute adopt state-of-the-art electron microscopy equipment that researchers from across campus have used in MIT.nano and elsewhere.

    “As an electron microscopist, the equipment I use is extremely expensive to maintain and necessitates that it becomes a shared resource. I’m happy that’s the case because ultimately users from across campus benefit from these tools and advance their science through this shared infrastructure,” LeBeau says. “More broadly, the microscope routinely challenges what people thought they knew about the materials they are studying. The results are always exciting.”

    Creativity and quantification

    When it’s his group’s turn on the microscope, LeBeau says they try to go after hard problems that require new ways of collecting and interpreting data.

    “We choose questions that are not easy to answer through other methods and that require new ways to extract information from our datasets to make conclusions,” LeBeau says.

    One type of material LeBeau has studied is relaxor ferroelectrics, which are used for applications including ultrasounds, actuators, and energy storage. The materials have been studied for decades but are extremely heterogeneous at the nanoscale, making it difficult to explain their electromechanical properties. By analyzing the materials’ structure using new electron microscopy techniques, LeBeau’s group was able to explain their properties in a way that could help create more sustainable versions of the materials, which currently contain lead.

    “Impact is always at the forefront of everything we do,” LeBeau explains. “When we go after problems, the application space is very important because it tells us if the insights can change the way an entire space operates.”

    One area of LeBeau’s research explores ways to use machine learning to help the microscope collect data more quickly than a human could.

    “Transmission electron microscopy in general is often a very slow technique,” LeBeau explains. “But you can imagine a case where a self-driving microscope is able to align a microscope and sample much faster, and in a much more reproducible way, than a human can. Doing so would enable us to collect a full statistical description of the material. That’s where machine learning can play a role: in pulling more data out of what we’ve already acquired but also in the acquisition itself.”
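The "self-driving microscope" idea LeBeau describes can be sketched as an optimizer maximizing an image-quality metric over an instrument setting. The toy below is entirely hypothetical (the sharpness function, the best-focus value, and the golden-section search are my assumptions for illustration, not LeBeau's actual method):

```python
def sharpness(focus: float, best_focus: float = 2.7) -> float:
    """Stand-in for an image sharpness metric (e.g. gradient variance
    of an acquired frame). Peaks at a best-focus value the search
    routine does not know in advance. Hypothetical."""
    return -(focus - best_focus) ** 2

def auto_focus(lo: float, hi: float, steps: int = 60) -> float:
    """Golden-section search for the focus setting that maximizes
    sharpness -- a toy stand-in for automated microscope alignment."""
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(steps):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if sharpness(c) >= sharpness(d):
            b = d  # maximum lies in [a, d]
        else:
            a = c  # maximum lies in [c, b]
    return (a + b) / 2

print(f"estimated best focus: {auto_focus(0.0, 10.0):.2f}")
```

In a real instrument each call to the metric would require acquiring a frame, so a sample-efficient search like this is exactly where automation beats manual alignment on speed and reproducibility.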

    Indeed, making electron microscopy more quantitative and reproducible has been a theme of LeBeau’s career. But he doesn’t believe quantifying something comes at the expense of creativity.

    “Science is truly a creative outlet,” LeBeau says. “The creativity comes from not only creating new experiment design or theories, but also from deciding how to present your data in visually appealing and informative ways. There’s a major creative element to what we do.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The MIT School of Engineering is one of the five schools of the Massachusetts Institute of Technology, located in Cambridge, Massachusetts. The School of Engineering has eight academic departments and two interdisciplinary institutes. The School grants SB, MEng, SM, engineer’s degrees, and PhD or ScD degrees. The school is the largest at MIT as measured by undergraduate and graduate enrollments and faculty members.

    Departments and initiatives:

    Departments:

    Aeronautics and Astronautics (Course 16)
    Biological Engineering (Course 20)
    Chemical Engineering (Course 10)
    Civil and Environmental Engineering (Course 1)
    Electrical Engineering and Computer Science (Course 6, joint department with MIT Schwarzman College of Computing)
    Materials Science and Engineering (Course 3)
    Mechanical Engineering (Course 2)
    Nuclear Science and Engineering (Course 22)

    Institutes:

    Institute for Medical Engineering and Science
    Health Sciences and Technology program (joint MIT-Harvard, “HST” in the course catalog)

    (Departments and degree programs are commonly referred to by course catalog numbers on campus.)

    Laboratories and research centers

    Abdul Latif Jameel Water and Food Systems Lab
    Center for Advanced Nuclear Energy Systems
    Center for Computational Engineering
    Center for Materials Science and Engineering
    Center for Ocean Engineering
    Center for Transportation and Logistics
    Industrial Performance Center
    Institute for Soldier Nanotechnologies
    Koch Institute for Integrative Cancer Research
    Laboratory for Information and Decision Systems
    Laboratory for Manufacturing and Productivity
    Materials Processing Center
    Microsystems Technology Laboratories
    MIT Lincoln Laboratory Beaver Works Center
    Novartis-MIT Center for Continuous Manufacturing
    Ocean Engineering Design Laboratory
    Research Laboratory of Electronics
    SMART Center
    Sociotechnical Systems Research Center
    Tata Center for Technology and Design

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA. Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO (aLIGO) was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and was funded by the National Science Foundation.

    Caltech/MIT Advanced LIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique that served as the essential blueprint for LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 7:59 am on October 29, 2022 Permalink | Reply
    Tags: "University of Chicago researchers take inspiration from soil to create new material with promise for medical and biofuel technology", , , , , Electronics, , Microbes can be put to work producing molecules such as biofuels., Microbes often get a bad rap but there are many times when we actually want microbes to grow., , The lab found that the droplets of liquid metal boosted the growth of bacteria.,   

    From The University of Chicago: “University of Chicago researchers take inspiration from soil to create new material with promise for medical and biofuel technology” 

    U Chicago bloc

    From The University of Chicago

    10.28.22
    Louise Lerner

    A new University of Chicago experiment mimics the structure of soil to create materials that can interact with their environment, with promise for electronics, medicine, and biofuel technology. Above: a 3D X-ray reconstruction of the soil-like material, with red representing liquid metal and white representing the rest of the components. The entire piece is just 13 microns, about the size of a red blood cell.

    A handful of soil is a miracle not only to a farmer, but also to an engineer: “It can respond to a range of stimuli,” said chemist Bozhi Tian.

    Bozhi Tian. Credit: The University of Chicago.

    “If you shine light or heat on it, if you step on it, if you add water, if you add chemicals—the soil changes in response and in turn, this affects the microbes or plants living in the soil. There are so many things we can learn from this.”

    Tian and his laboratory at the University of Chicago are taking inspiration from nature to engineer new systems with a range of potential applications. Their latest experiment mimics the structure of soil to create materials that can interact with their environment, with promise for electronics, medicine, and biofuel technology. It has multiple potential applications; preliminary tests have shown the material can boost the growth of microbes and may be able to help treat gut disorders.

    In a study described in Nature Chemistry [below], the team designed a springy substance composed of tiny particles of clay, starch, and droplets of liquid metal. The clay and starch create structure with lots of nooks and crannies, but it’s flexible enough that the material can also adapt and respond to the conditions around it.

    The experiment in schematic. From Nature Chemistry.

    Much like real soil, these nooks and crannies create the perfect spots for microbes to flourish. “We found the porosity is very important; we call it the partitioning effect,” said Tian. “I think of it like a meeting—if you break a large meeting or class into smaller sections there will be more interaction.”

    Yiliang Lin (left) and Xiang Gao (right) in the Tian lab at the University of Chicago. They are co-first authors on the research along with Jiping Yue and Yin Fang. Courtesy Tian lab.

    Microbes often get a bad rap, but there are many times when we actually want microbes to grow. For example, doctors think that digestive diseases like colitis partially stem from a lack of diverse microbes in the gut, so a goal of medicine is to boost them. In fact, preliminary tests showed the new “soil” material reduced symptoms of colitis in mice.

    Microbes can also be put to work producing molecules such as biofuels, which are used as a renewable alternative to fuels like gasoline. Tian’s lab found that their material encouraged the growth of the biofilms used in biofuel production. It may extend to other uses, too; “This is potentially a more environmentally friendly method to make various chemicals used in industrial production,” said Jiping Yue, a scientist in Tian’s lab and a co-first author on the study.

    In the course of their experiments, the lab also found that the droplets of liquid metal boosted the growth of bacteria. “We’re not yet sure about the mechanism, but if you leave out the liquid metal, the biofilms and the gut microbiome diversity both drop,” said Tian. They theorized it could have to do with providing a source of metal ions, which are abundant in the body and used in enzymes.

    Interestingly, the lab also found that they could make rewriteable circuits by burning patterns into the substance with a laser or drawing them with a pen. The heat or pressure causes the droplets of liquid metal in the substance to melt and join together, forming lines of conductivity. This circuit can then be undone chemically. “That means it is a rewriteable memory; you could think of using this approach for constructing a neuromorphic computing chip from soil-like materials,” said Yiliang Lin, the lead author of the study, formerly a postdoctoral scholar at UChicago and now an assistant professor at the National University of Singapore.

    Moreover, Tian is excited about the nature-inspired approach.

    “Soil is just the beginning; if you think about this as a bigger picture, there are many other places to get inspiration,” he said. “Can we use this knowledge to design new material or chemical systems? There are numerous ways we can learn from nature.”

    The research made use of the University of Chicago Materials Research Science and Engineering Center (MRSEC), the Electron Microscopy Service of the University of Illinois-Chicago, BioCryo facility of Northwestern University, and the Advanced Photon Source and Center for Nanoscale Materials at The DOE’s Argonne National Laboratory.

    Science paper:
    Nature Chemistry

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Chicago Campus

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with University of Chicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    University of Chicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: DOE’s Argonne National Laboratory, DOE’s Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.
    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts. The University of Chicago is a private research university in Chicago, Illinois. Founded in 1890, its main campus is located in Chicago’s Hyde Park neighborhood. It enrolled 16,445 students in Fall 2019, including 6,286 undergraduates and 10,159 graduate students. The University of Chicago is ranked among the top universities in the world by major education publications, and it is among the most selective in the United States.

    The university is composed of one undergraduate college and five graduate research divisions, which contain all of the university’s graduate programs and interdisciplinary committees. Chicago has eight professional schools: the Law School, the Booth School of Business, the Pritzker School of Medicine, the School of Social Service Administration, the Harris School of Public Policy, the Divinity School, the Graham School of Continuing Liberal and Professional Studies, and the Pritzker School of Molecular Engineering. The university has additional campuses and centers in London, Paris, Beijing, Delhi, and Hong Kong, as well as in downtown Chicago.

    University of Chicago scholars have played a major role in the development of many academic disciplines, including economics, law, literary criticism, mathematics, religion, sociology, and the behavioralism school of political science, establishing the Chicago schools in various fields. Chicago’s Metallurgical Laboratory produced the world’s first man-made, self-sustaining nuclear reaction in Chicago Pile-1 beneath the viewing stands of the university’s Stagg Field. Advances in chemistry led to the “radiocarbon revolution” in the carbon-14 dating of ancient life and objects. The university research efforts include administration of DOE’s Fermi National Accelerator Laboratory and DOE’s Argonne National Laboratory, as well as the U Chicago Marine Biological Laboratory in Woods Hole, Massachusetts (MBL). The university is also home to the University of Chicago Press, the largest university press in the United States. The Barack Obama Presidential Center is expected to be housed at the university and will include both the Obama presidential library and offices of the Obama Foundation.

    The University of Chicago’s students, faculty, and staff have included 100 Nobel laureates as of 2020, giving it the fourth-most affiliated Nobel laureates of any university in the world. The university’s faculty members and alumni also include 10 Fields Medalists, 4 Turing Award winners, 52 MacArthur Fellows, 26 Marshall Scholars, 27 Pulitzer Prize winners, 20 National Humanities Medalists, 29 living billionaire graduates, and have won eight Olympic medals.

    Research

    According to the National Science Foundation, University of Chicago spent $423.9 million on research and development in 2018, ranking it 60th in the nation. It is classified among “R1: Doctoral Universities – Very high research activity” and is a founding member of the Association of American Universities and was a member of the Committee on Institutional Cooperation from 1946 through June 29, 2016, when the group’s name was changed to the Big Ten Academic Alliance. The University of Chicago is not a member of the rebranded consortium, but will continue to be a collaborator.

    The university operates more than 140 research centers and institutes on campus. Among these are the Oriental Institute—a museum and research center for Near Eastern studies owned and operated by the university—and a number of National Resource Centers, including the Center for Middle Eastern Studies. Chicago also operates or is affiliated with several research institutions apart from the university proper. The university manages DOE’s Argonne National Laboratory, part of the United States Department of Energy’s national laboratory system, and co-manages DOE’s Fermi National Accelerator Laboratory, a nearby particle physics laboratory, as well as a stake in the Apache Point Observatory in Sunspot, New Mexico.
    _____________________________________________________________________________________

    SDSS Telescope at Apache Point Observatory, near Sunspot NM, USA, Altitude 2,788 meters (9,147 ft).

    Apache Point Observatory, near Sunspot, New Mexico. Altitude 2,788 meters (9,147 ft).
    _____________________________________________________________________________________

    Faculty and students at the adjacent Toyota Technological Institute at Chicago collaborate with the university. In 2013, the university formed an affiliation with the formerly independent Marine Biological Laboratory in Woods Hole, Mass. Although formally unrelated, the National Opinion Research Center is located on Chicago’s campus.

     
  • richardmitnick 9:13 pm on October 20, 2022 Permalink | Reply
    Tags: "Deep learning with light", "Silicon photonics", , , , , Electronics, , , , ,   

    From The Massachusetts Institute of Technology: “Deep learning with light” 

    From The Massachusetts Institute of Technology

    10.20.22
    Adam Zewe

    This rendering shows a novel piece of hardware, called a smart transceiver, that uses technology known as “silicon photonics” to dramatically accelerate one of the most memory-intensive steps of running a machine-learning model. This can enable an edge device, like a smart home speaker, to perform computations with more than a hundred-fold improvement in energy efficiency. Image: Alex Sludds. Edited by MIT News.

    Ask a smart home device for the weather forecast, and it takes several seconds for the device to respond. One reason this latency occurs is because connected devices don’t have enough memory or power to store and run the enormous machine-learning models needed for the device to understand what a user is asking of it. The model is stored in a data center that may be hundreds of miles away, where the answer is computed and sent to the device.

    MIT researchers have created a new method for computing directly on these devices, which drastically reduces this latency. Their technique shifts the memory-intensive steps of running a machine-learning model to a central server where components of the model are encoded onto light waves.

    The waves are transmitted to a connected device using fiber optics, which enables tons of data to be sent lightning-fast through a network. The receiver then employs a simple optical device that rapidly performs computations using the parts of a model carried by those light waves.

    This technique leads to more than a hundredfold improvement in energy efficiency when compared to other methods. It could also improve security, since a user’s data do not need to be transferred to a central location for computation.

    This method could enable a self-driving car to make decisions in real-time while using just a tiny percentage of the energy currently required by power-hungry computers. It could also allow a user to have a latency-free conversation with their smart home device, be used for live video processing over cellular networks, or even enable high-speed image classification on a spacecraft millions of miles from Earth.

    “Every time you want to run a neural network, you have to run the program, and how fast you can run the program depends on how fast you can pipe the program in from memory. Our pipe is massive — it corresponds to sending a full feature-length movie over the internet every millisecond or so. That is how fast data comes into our system. And it can compute as fast as that,” says senior author Dirk Englund, an associate professor in the Department of Electrical Engineering and Computer Science (EECS) and member of the MIT Research Laboratory of Electronics.

    Joining Englund on the paper are lead author and EECS grad student Alexander Sludds; EECS grad student Saumil Bandyopadhyay; Research Scientist Ryan Hamerly; and others from MIT, the MIT Lincoln Laboratory, and Nokia Corporation. The research is published today in Science [below].

    Lightening the load

    Neural networks are machine-learning models that use layers of connected nodes, or neurons, to recognize patterns in datasets and perform tasks, like classifying images or recognizing speech. But these models can contain billions of weight parameters, which are numeric values that transform input data as they are processed. These weights must be stored in memory. At the same time, the data transformation process involves billions of algebraic computations, which require a great deal of power to perform.
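    Concretely, the “transformation” in each layer is a weight matrix applied to the incoming data, which is why a model with billions of weights requires billions of multiplications. The sketch below is a generic dense layer written for illustration, not the researchers’ code:

    ```python
    def dense_layer(x, W, b):
        """One fully connected layer: multiply the inputs by the stored
        weights, add a bias, then apply a ReLU nonlinearity."""
        return [max(sum(w * xi for w, xi in zip(row, x)) + bj, 0.0)
                for row, bj in zip(W, b)]

    x = [1.0, 2.0, 0.5]                      # input data (e.g., sensor readings)
    W = [[0.1, 0.2, 0.3],                    # weight parameters that must be
         [-0.4, 0.5, 0.6]]                   # fetched from memory on every run
    b = [0.0, 0.1]
    y = dense_layer(x, W, b)                 # one of billions of such steps in a real model
    ```

    Every one of those multiplies needs its weight moved from memory to the compute unit first, which is exactly the data-movement bottleneck the next paragraphs describe.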

    The process of fetching data (the weights of the neural network, in this case) from memory and moving them to the parts of a computer that do the actual computation is one of the biggest limiting factors to speed and energy efficiency, says Sludds.

    “So our thought was, why don’t we take all that heavy lifting — the process of fetching billions of weights from memory — move it away from the edge device and put it someplace where we have abundant access to power and memory, which gives us the ability to fetch those weights quickly?” he says.

    The neural network architecture they developed, “Netcast”, involves storing weights in a central server that is connected to a novel piece of hardware called a smart transceiver. This smart transceiver, a thumb-sized chip that can receive and transmit data, uses technology known as “silicon photonics” to fetch trillions of weights from memory each second.

    It receives weights as electrical signals and imprints them onto light waves. Since the weight data are encoded as bits (1s and 0s) the transceiver converts them by switching lasers; a laser is turned on for a 1 and off for a 0. It combines these light waves and then periodically transfers them through a fiber optic network so a client device doesn’t need to query the server to receive them.
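    As a rough numerical picture of that on-off keying, each weight can be quantized to a fixed number of bits and each bit mapped to a laser pulse (1 = laser on, 0 = laser off). The helper below is purely illustrative; the function name and the simple uniform quantization scheme are assumptions, not the Netcast implementation:

    ```python
    def weights_to_ook_frames(weights, bits=8):
        """Quantize each weight in [-1, 1] to `bits` bits and emit the
        on/off laser pattern (1 = on, 0 = off), most significant bit first.
        Toy model of on-off keying for illustration only."""
        frames = []
        for w in weights:
            # map the weight to an unsigned integer code 0 .. 2**bits - 1
            code = round((w + 1.0) / 2.0 * (2**bits - 1))
            frames.append([(code >> i) & 1 for i in reversed(range(bits))])
        return frames

    pattern = weights_to_ook_frames([0.5, -1.0], bits=4)
    # [[1, 0, 1, 1], [0, 0, 0, 0]]
    ```

    Streaming such frames continuously is what lets the server “push” weights to the client without being queried, as the article notes.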

    “Optics is great because there are many ways to carry data within optics. For instance, you can put data on different colors of light, and that enables a much higher data throughput and greater bandwidth than with electronics,” explains Bandyopadhyay.

    Trillions per second

    Once the light waves arrive at the client device, a simple optical component known as a broadband “Mach-Zehnder” modulator uses them to perform super-fast, analog computation. This involves encoding input data from the device, such as sensor information, onto the weights. Then it sends each individual wavelength to a receiver that detects the light and measures the result of the computation.

    The researchers devised a way to use this modulator to do trillions of multiplications per second, which vastly increases the speed of computation on the device while using only a tiny amount of power.
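    Numerically, the modulator-plus-detector step behaves like an analog multiply-accumulate: each arriving light pulse carries one weight, the modulator scales it by the local input value, and the detector integrates the products over time into a dot product. The toy model below captures only that arithmetic, not the optical physics, and its name and signature are invented for illustration:

    ```python
    def receiver_mac(weights, inputs):
        """Model the receiver: one weight arrives per time slot, the
        Mach-Zehnder modulator scales it by the local input, and the
        photodetector accumulates the products into a dot product."""
        acc = 0.0
        for w, x in zip(weights, inputs):
            acc += w * x   # one analog multiply per arriving pulse
        return acc

    result = receiver_mac([0.2, -0.5, 1.0], [1.0, 2.0, 3.0])
    ```

    Doing trillions of these products per second in the optical domain, rather than in digital logic, is where the claimed speed and energy gains come from.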

    “In order to make something faster, you need to make it more energy efficient. But there is a trade-off. We’ve built a system that can operate with about a milliwatt of power but still do trillions of multiplications per second. In terms of both speed and energy efficiency, that is a gain of orders of magnitude,” Sludds says.

    They tested this architecture by sending weights over an 86-kilometer fiber that connects their lab to MIT Lincoln Laboratory. Netcast enabled machine-learning with high accuracy — 98.7 percent for image classification and 98.8 percent for digit recognition — at rapid speeds.

    “We had to do some calibration, but I was surprised by how little work we had to do to achieve such high accuracy out of the box. We were able to get commercially relevant accuracy,” adds Hamerly.

    Moving forward, the researchers want to iterate on the smart transceiver chip to achieve even better performance. They also want to miniaturize the receiver, which is currently the size of a shoe box, down to the size of a single chip so it could fit onto a smart device like a cell phone.

    “Using photonics and light as a platform for computing is a really exciting area of research with potentially huge implications on the speed and efficiency of our information technology landscape,” says Euan Allen, a Royal Academy of Engineering Research Fellow at the University of Bath, who was not involved with this work. “The work of Sludds et al. is an exciting step toward seeing real-world implementations of such devices, introducing a new and practical edge-computing scheme whilst also exploring some of the fundamental limitations of computation at very low (single-photon) light levels.”

    The research is funded, in part, by NTT Research, the National Science Foundation, the Air Force Office of Scientific Research, the Air Force Research Laboratory, and the Army Research Office.

    Science paper:
    Science

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology‘s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, the Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and the Massachusetts Institute of Technology‘s defense research. In this period, MIT’s various departments were researching helicopters, smart bombs, and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 8:03 am on October 19, 2022 Permalink | Reply
    Tags: "CD-SAXS": critical dimension small angle X-ray scattering, "Pioneering the Future of Semiconductor Measurement Techniques", “X-ray diffraction” provides detailed data about the chemical structure and physical properties of materials., Electronics, ,   

    From The National Institute of Standards and Technology: “Pioneering the Future of Semiconductor Measurement Techniques” 

    From The National Institute of Standards and Technology

    10.18.22

    1

    Today, CD-SAXS (critical dimension small angle X-ray scattering) is an industry-wide measurement technique for next-generation semiconductor fabrication. Developed by NIST researchers Wen-Li Wu and Joe Kline, CD-SAXS matured from a laboratory technique into a practical application with the help of industry partners through multiple cooperative research and development agreements. This collaborative effort was needed to drive this technology to the market in the quickest time possible. About 60 patent applications can be traced to this research. Currently, all major tool vendors and semiconductor fabrication facilities have CD-SAXS development projects underway that can be traced directly to this NIST innovation.

    Most electronic devices that are used every day by millions of people contain semiconductors. These semiconductors efficiently allow electricity to flow through them, playing a pivotal role in powering current and future technologies. The semiconductor manufacturing industry currently generates approximately $300 billion in revenue annually.

    Twenty years ago, a small team of NIST researchers was focused on measuring transistors. Transistors are semiconductor devices that amplify, control, and generate electrical signals. Transistors, in turn, are the active component of integrated circuits, also known as microchips. A single microchip can contain billions of transistors. These microchips have been developed to be as small as a fingernail with transistors each the size of a single strand of DNA. The measurements must be spot on for every component to operate correctly, and this can be a difficult challenge, especially as semiconductors get smaller each year. Traditionally, transistors have been measured with electron microscopes, but as transistors have decreased in size, electron microscopes have been unable to keep up.

    The NIST team, led by Wen-Li Wu and Joe Kline, sought a way to measure the shape and size of the billions of complex nanoscale transistors that power electronics. The industry needed new measurement methods that could rapidly and non-destructively quantify those dimensions. Rapid, accurate measurements were required to ensure that the manufacturing and supply chain would not be disrupted, negatively impacting the industry.

    Around this time, Wu and Kline’s NIST team turned to X-ray diffraction, a long-established technique that provides detailed data about the chemical structure and physical properties of materials. NIST researchers had used X-ray diffraction to study the arrangement of atoms in materials, so the team saw an opportunity to apply the technique to nanometer-scale transistors, which are similar in size to the structures of DNA and proteins. Wu and Kline began to develop the method for application to the semiconductor industry. Their proof of concept was the measurement of the shapes, angles, and feature roughness of semiconductors. To move this new measurement into common practice, they needed to prove to the industry that these quantities could be reliably quantified.

    This method, known as critical dimension small angle X-ray scattering (CD-SAXS), can measure the shape of three-dimensional nanostructures in semiconductor devices non-destructively. This innovative measurement technique was designed at a synchrotron facility, which provided an immensely powerful source of X-rays.
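    CD-SAXS works by fitting a model of the feature shape to the measured scattering pattern. A minimal sketch of the underlying idea, assuming an idealized rectangular line profile (this toy model and its function name are illustrative only, not NIST's analysis software): for a line of width w, the intensity falls off as a squared sinc of the scattering vector q, so the fringe spacing in the pattern directly encodes the line width.

```python
import math

def grating_intensity(q: float, width_nm: float) -> float:
    """Toy form-factor intensity for a rectangular line of given width.

    I(q) ~ [sin(q*w/2) / (q*w/2)]^2 -- the squared-sinc fringe pattern
    that a CD-SAXS fit inverts to recover the line width w.
    """
    x = q * width_nm / 2.0
    if x == 0.0:
        return 1.0  # limit of sinc at q = 0
    return (math.sin(x) / x) ** 2

# Intensity minima sit at q = 2*pi*n / w, so the fringe period reveals
# the feature size: a hypothetical 20 nm line gives minima spaced
# 2*pi/20 ~ 0.314 nm^-1 apart in q.
width = 20.0  # nm (illustrative value)
q_min = 2.0 * math.pi / width
print(f"first minimum at q = {q_min:.3f} 1/nm")
```

    Narrower lines push the minima farther apart in q, which is why the technique remains sensitive as transistors shrink.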

    The transition from concept to industrial application was a significant hurdle, as CD-SAXS was vastly different from the semiconductor metrology techniques of the time. It was so different that Wu and Kline had to spearhead the charge to build out the entire infrastructure from scratch, from hardware specifications to software and data analysis tools. Along with this came the need for sustained interaction with the semiconductor industry to create acceptance in the market. They needed to convince semiconductor manufacturers to adopt the technique; doing so would disrupt the existing market for semiconductor measurement but was essential to pave the way for the future of semiconductors as they continue to be produced at ever smaller scales.

    The team quickly designed a CD-SAXS prototype in the lab by building out the entire infrastructure, performing feasibility studies, and developing the needed analytical methods and software. During this time, private-public partnerships were initiated between the NIST team and businesses, which was a necessary step in expediting the technology to manufacturing facilities to keep up with semiconductor production. Industry leaders collaborated with the NIST team to use the CD-SAXS prototype for measurements, allowing the NIST team to validate the superiority of CD-SAXS against standard test methods.

    CD-SAXS continues to be widely adopted across the semiconductor manufacturing industry as semiconductors continuously decrease in size. Industry must have a highly accurate method to measure semiconductors as they have become and will continue to be a major component of technologies found and used in everyday life.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress: “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for U.S. measures and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington, D.C., and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units and for measurement of light. In 1905, a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterwards, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. Around the same time, the Standards Western Automatic Computer (SWAC) was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the National Bureau of Standards became the National Institute of Standards and Technology in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for housing NIST‑F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.
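    The arithmetic connecting that resonance to timekeeping is exact by definition: since 1967, the SI second has been defined as 9,192,631,770 cycles of the cesium-133 hyperfine transition, so an atomic clock keeps time by counting cycles. A schematic sketch (illustrative only, not NIST instrumentation code):

```python
CS_HYPERFINE_HZ = 9_192_631_770  # SI definition of the second (exact)

def elapsed_seconds(cycles_counted: int) -> float:
    """Convert a count of cesium hyperfine oscillations to elapsed time."""
    return cycles_counted / CS_HYPERFINE_HZ

# One full day corresponds to an enormous but exactly known cycle count.
cycles_per_day = CS_HYPERFINE_HZ * 86_400
print(f"cycles in one day: {cycles_per_day}")
```

    Because the frequency is fixed by definition rather than measured, the clock's accuracy depends only on how faithfully it realizes that resonance.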

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments, which they use in many research fields (materials science, fuel cells, biotechnology, etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes the Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of the NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 9:13 am on October 7, 2022 Permalink | Reply
    Tags: "Why NIST Is Putting Its CHIPS Into U.S. Manufacturing", A typical integrated circuit today contains billions of tiny on-off switches known as transistors., An area of major excitement at NIST is “advanced packaging.”, Artificial diamonds are currently used as the semiconductors in chips for aerospace applications., “Integrated circuits”, Cell phones send and receive Wi-Fi and cellular signals thanks to semiconductor chips inside them., Chips also abound on the exteriors of homes inside everything from security cameras to solar panels., Chips typically need to go through a dizzying series of steps-and different suppliers-before they become finished products., CPUs and GPUs in computers, Digital cameras contain chips that detect light and turn it into an image., Electronics, Gallium nitride is resistant to damage from cosmic rays and other radiation in space so it’s commonly the material of choice for electronic devices in satellites., Light emitting diodes (LEDs) on chips, Manufacturers typically mass-produce dozens of integrated circuits on a single semiconductor wafer and then dice the wafer to separate the individual pieces., Measurement science plays a key role in up to 50% of semiconductor manufacturing steps., Memory chips store data., , NIST has the measurement science and technical standards expertise that is needed by the U.S. chip industry., President Joe Biden recently signed into law the "CHIPS Act"., Semiconductor chips, Silicon carbide can handle larger amounts of electricity and voltage than other materials so it has been used in chips for electric vehicles., Silicon is a type of material known as a semiconductor., Silicon is the most frequently used raw material for chips., The average car can have upward of 1200 chips in it., , Today’s cars are computers on wheels.   

    From The National Institute of Standards and Technology: “Why NIST Is Putting Its CHIPS Into U.S. Manufacturing” 

    From The National Institute of Standards and Technology

    10.7.22

    Ben P. Stein

    1
    A NIST NanoFab user works with an optical microscope and computer software to inspect samples and take pictures.
    Credit: B. Hayes/NIST.

    Right after the pandemic hit, I bought a new vacuum cleaner. I wanted to step up my housecleaning skills since I knew I’d be home a lot more. I was able to buy mine right away, but friends who wanted new appliances weren’t so lucky. My relatives had to wait months for their new refrigerator to arrive. And it wasn’t just appliances. New cars were absent from dealership lots, while used cars commanded a premium. What do all these things have in common? Semiconductor chips.

    The pandemic disrupted the global supply chain, and semiconductor chips were particularly vulnerable. The chip shortage delivered a wakeup call for our country to make our supply chain more resilient and increase domestic manufacturing of chips, which are omnipresent in modern life.

    “To an astonishing degree, the products and services we encounter every day are powered by semiconductor chips,” says Mike Molnar, director of NIST’s Office of Advanced Manufacturing.

    Think about your kitchen. Dishwashers have chips that sense how dirty your loads are and precisely time their cleaning cycles to reduce your energy and water bills. Some rice cookers use chips with “fuzzy logic” to judge how long to cook rice. Many toasters now have chips that make sure your bread is perfectly browned.

    We commonly think of chips as the “brains” that crunch numbers, and that is certainly true for the CPUs in computers, but chips do all sorts of useful things. Memory chips store data. Digital cameras contain chips that detect light and turn it into an image. Modern TVs produce their colorful displays with arrays of light emitting diodes (LEDs) on chips. Phones send and receive Wi-Fi and cellular signals thanks to semiconductor chips inside them. Chips also abound on the exteriors of homes, inside everything from security cameras to solar panels.

    The average car can have upward of 1,200 chips in it, and you can’t make a new car unless you have all of them. “Today’s cars are computers on wheels,” an auto mechanic said to me a few years ago, and his words were never more on point than during the height of the pandemic. In 2021, the chip shortage was estimated to have caused a loss of $110 billion in new vehicle sales worldwide.

    The chips in today’s cars are a combination of low-tech, mature chips and high-tech, state-of-the-art processors (which you’ll especially find in electric vehicles and those that have autonomous driving capabilities).

    2
    It takes a lot of chemistry to make a computer chip. Here a NanoFab user is working with acids while wearing the proper personal protective equipment (PPE). Credit: B. Hayes/NIST.

    Whether mature or cutting-edge, chips typically need to go through a dizzying series of steps — and different suppliers — before they become finished products. And most of this work is currently done outside this country. The U.S., once a leader in chip manufacturing, currently holds only about a 12% share of the market.

    To reestablish our nation’s leadership in chip manufacturing, Congress recently passed, and President Joe Biden recently signed into law, the “CHIPS Act”. The CHIPS Act aims to help U.S. manufacturers grow an ecosystem in which they produce both mature and state-of-the-art chips at all stages of the manufacturing process and supply chain, and NIST is going to play a big role in this effort.

    The Dirt on Semiconductor Chips

    Silicon is the most frequently used raw material for chips, and one of the most abundant atomic elements on Earth. To give you a sense of its abundance, silicon and oxygen are the main ingredients of most beach sand and major components of glass, rocks and soil (which means that you can also find silicon in actual, not just metaphorical, dirt).

    3
    Making a “wafer” of semiconductor material, like the one shown here, is the first step for making a chip.
    Credit: MS Mikel/Shutterstock.

    Silicon is a type of material known as a semiconductor. Electricity flows through semiconductors better than it does through insulators (such as rubber and cotton), but not quite as well as it does through conductors (such as metals and water).

    But that’s a good thing. In semiconductors, you can control electric current precisely — and without any moving parts. By applying a small voltage to them, you can either cause current to flow or to stop — making the semiconductor (or a small region within it) act like a conductor or insulator depending on what you want to do.
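    The on/off behaviour described above can be sketched as a toy model. This is only an illustration of the idea, not a physical device model: a real transistor has continuous current-voltage behaviour, and the threshold voltage below is a made-up value.

    ```python
    # Toy model of a voltage-controlled switch (illustrative only).
    V_THRESHOLD = 0.7  # volts; hypothetical switching threshold

    def switch_current(gate_voltage: float, supply_current: float = 1.0) -> float:
        """Full current flows when the gate voltage reaches the threshold
        (conductor-like); otherwise none flows (insulator-like)."""
        return supply_current if gate_voltage >= V_THRESHOLD else 0.0

    print(switch_current(1.2))  # gate "on": prints 1.0
    print(switch_current(0.1))  # gate "off": prints 0.0
    ```

    That small-voltage control, with no moving parts, is what lets billions of these switches be packed together and flipped billions of times per second.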

    The first step for making a chip is to start with a thin slice of a semiconductor material, known as a “wafer,” often round in shape. On top of the wafer, manufacturers then create complex miniature electric circuits, commonly called “integrated circuits” (ICs) because they are embedded as one piece on the wafer. A typical IC today contains billions of tiny on-off switches known as transistors that enable a chip to perform a wide range of complex tasks from sending signals to processing information. Increasingly, these circuits also have “photonic” components in which light travels alongside electricity.

    Manufacturers typically mass-produce dozens of ICs on a single semiconductor wafer and then dice the wafer to separate the individual pieces. When each of them is packaged as a self-contained device, you have a “chip,” which can then be placed in smartphones, computers and so many other products.
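    As a rough back-of-envelope on the dicing step: a standard die-per-wafer estimate divides the wafer area by the die area and subtracts a correction for partial dies lost at the round edge. The wafer and die sizes below are illustrative, not from the article.

    ```python
    import math

    wafer_diameter_mm = 300.0  # a common wafer size
    die_side_mm = 10.0         # hypothetical 10 mm x 10 mm die

    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2  # ~70,686 mm^2
    die_area = die_side_mm ** 2                          # 100 mm^2

    # First-order estimate: gross dies minus partial dies at the wafer edge.
    dies = int(wafer_area / die_area
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area))

    print(dies)  # prints 640
    ```

    Real counts come out lower still once saw streets, test structures and defective dies are accounted for.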

    4
    An array of photonic integrated circuit chips, which use light to process information. These diced photonics chips are ready for assembly and packaging at AIM Photonics, an Albany, New York-based research facility that is part of the national Manufacturing USA network. Credit: AIM Photonics.

    Though silicon is the most commonly used raw material for chips, other semiconductors are used depending on the application. For example, gallium nitride is resistant to damage from cosmic rays and other radiation in space, so it’s commonly the material of choice for electronic devices in satellites. Gallium arsenide is frequently employed to make LEDs, because silicon typically produces heat instead of light if you try to make an LED with it.

    Non-silicon semiconductors are used in the growing field of “power electronics” in vehicles and energy systems such as wind and solar. Silicon carbide can handle larger amounts of electricity and voltage than other materials, so it has been used in chips for electric vehicles to perform functions such as converting DC battery power into the AC power delivered to the motors.

    Diamonds are semiconductors too — and they have the greatest ability to conduct heat of any known material. Artificial diamonds are currently used as the semiconductors in chips for aerospace applications, as they can draw heat away from the power loads generated in those chips.

    So Why NIST?

    Measurement science plays a key role in up to 50% of semiconductor manufacturing steps, according to a recent NIST report. Good measurements enable manufacturers to mass-produce high-quality, high-performance chips.

    NIST has the measurement science and technical standards expertise that is needed by the U.S. chip industry, and our programs to advance manufacturing and support manufacturing networks across the U.S. mean we can partner with industry to find out what they need and deliver on it.

    5
    This is a test chip NIST has developed, as part of a research and development agreement with Google, for measuring the performance of semiconductor devices used in a range of advanced applications such as artificial intelligence. Credit: B. Hoskins/NIST.

    NIST researchers already work on semiconductor materials for many reasons. For example, researchers have developed new ways to measure semiconductor materials in order to detect defects (such as a stray aluminum atom in silicon) that could cause chips to malfunction. As electronic components get smaller, chips need to be increasingly free of such defects.

    “Modern chips may contain over 100 billion complex nanodevices that are less than 50 atoms across — all must work nearly identically for the chip to function,” the NIST report points out.
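    One standard way to see why defects matter at this scale is the first-order Poisson yield model from semiconductor-manufacturing textbooks (a generic approximation, not an analysis from the NIST report): the fraction of working dies falls exponentially with defect density times die area. The numbers below are illustrative.

    ```python
    import math

    defect_density_per_cm2 = 0.1  # hypothetical defects per square centimetre
    die_area_cm2 = 1.0            # hypothetical 1 cm^2 die

    # Poisson yield: probability that a die contains zero defects.
    yield_fraction = math.exp(-defect_density_per_cm2 * die_area_cm2)
    print(f"{yield_fraction:.1%}")  # prints 90.5%
    ```

    Double either the die area or the defect density and the yield drops to about 81.9%, which is why defect detection pays off most on large, dense chips.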

    Flexible and Printable Chips

    NIST researchers also measure the properties of new materials that could be useful for future inventions. All of the semiconductor materials I mentioned above are brittle and can’t be bent. But devices with chips — from pacemakers to blood pressure monitors to defibrillators — are increasingly being made with flexible materials so they can be “wearable” and you can attach them comfortably to the contours of your body. NIST researchers have been at the forefront of the work to develop these “flexible” chips.

    6
    A circuit made from organic thin-film transistors is fabricated on a flexible plastic substrate. Credit: Patrick Mansell/Penn State.

    Researchers are also studying materials that could serve as “printable” chips that would be cheaper and more environmentally friendly. Instead of going through the complicated multistep process of making chips in a factory, we are developing ways to print circuits directly onto materials such as paper using technology that’s similar to ink-jet printers.

    And while we’ve lost a lot of overall chip manufacturing share, U.S. companies still make many of the machines that carry out the individual steps for fabricating chips, such as those that deposit ultrathin layers of material on top of semiconductors. But what if, instead of these machines being shipped abroad, more domestic manufacturers developed expertise in using them?

    To support this effort, NIST researchers are planning to perform measurements with these very machines in their labs. They will study materials that these machines use and the manufacturing processes associated with them. The information from the NIST work could help more domestic manufacturers develop the know-how for making chips. This work can help create an ecosystem with many domestic chip manufacturers, not just a few, leading to a more resilient supply chain.

    7
    Three researchers at NIST’s NanoFab talk science with a state-of-the-art Atomic Layer Deposition (ALD) system in the background. Credit: B. Hayes/NIST.

    “Reliance on only one supplier is problematic, as we saw with the recent shortage in baby formula,” NIST’s Jyoti Malhotra pointed out to me. Malhotra serves on the senior leadership team of NIST’s Manufacturing Extension Partnership (MEP). MEP has been connecting NIST labs to the U.S. suppliers and manufacturers who produce materials, components, devices and equipment enabling U.S. chip manufacturing.

    Advanced Packaging

    Last but not least, an area of major excitement at NIST is “advanced packaging.” No, we don’t mean the work of those expert gift-wrappers you may find at stores during the holiday season. When we talk about chip packaging, we’re referring to everything that goes around a chip to protect it from damage and connect it to the rest of the device. Advanced packaging takes things to the next level: It uses ingenious techniques during the chipmaking process to connect multiple chips to each other and the rest of the device in as tiny a space as possible.

    But it’s about more than just making a smartphone that fits in your pocket. Advanced packaging also makes our devices faster and more energy-efficient, because information exchanged between chips travels over shorter distances, which in turn reduces energy consumption.

    One great byproduct of advanced packaging’s innovations can be found on my wrist — namely, the smartwatch I wear for my long-distance runs. My watch uses GPS to measure how far I ran. It also measures my heart rate, and after my workouts, it uploads my running data wirelessly to my phone. Its battery lasts for days; it had plenty of juice left even after I ran a full marathon last month.

    Twenty years ago, running watches were big and clunky, with much less functionality. My friends and I had a particular model with a huge face and a bulky slab that fit over the insides of our wrists. When a friend and I opened up his watch to replace his battery, we saw that the GPS receiver was on a completely separate circuit board from the rest of the watch electronics.

    9
    A running friend of mine still has his old running watch, and he recently took a picture of it alongside the modern one that he now uses. The GPS chip in the old watch is on its own circuit board underneath the buttons, apart from the rest of the watch electronics. The modern watch has all the electronic components beneath the small watch face. Credit: Ron Weber.

    Under the small and thin face of my current watch you will find all its electronics, including a GPS sensor, battery, heart-rate monitor, wireless communications device and so many other things.

    Further development of advanced packaging could produce even more powerful devices for monitoring a patient’s vitals, measuring pollutants in the environment, and increasing situational awareness for soldiers in the field.

    10
    This illustration shows the staggering number of ultrathin semiconductor layers that are possible thanks to “advanced packaging” techniques. When I saw this, it reminded me of one of those amazing sandwiches that the cartoon character Dagwood would eat, but I think this is even more impressive! Credit: DoE 3DFeM center at Penn State University.

    Advanced packaging is also a potential niche for domestic manufacturers to grow global market share (currently at 3% for this part of the chipmaking process). Chips are becoming so complex that design and manufacturing processes, once separate steps, are now increasingly intertwined — and the U.S. remains a world leader in chip design. NIST’s measurements to support advanced packaging in chips and standards for the packaging process could give domestic manufacturers a decisive edge in this area.

    All the NIST experts I’ve spoken to talk about a future in which chip manufacturers work increasingly closely with their customers, such as automakers. The benefit of closer relationships is that customers could collaborate with manufacturers to create more customized chips that bring about completely new products.

    And as we’ve seen, incorporating chips into existing products tends to make them “smart,” whether it’s an appliance figuring out how long to bake the bread, or solar panels that maximize electricity production by coordinating the power output from individual panels. With more domestic manufacturers on the scene, there are more opportunities to incorporate chips into products — that could also be manufactured in the U.S.A.

    I first encountered semiconductor chips in the 1970s, when the U.S. was a dominant force in chip manufacturing. Inside a department store with my mom, I saw pocket calculators on display, and they fascinated me. You could punch their number keys and they would instantly solve any addition or multiplication problem. As a 6-year-old, I thought that they had little brains in them!

    Since then, semiconductor chips have been a big part of my life. And after the pandemic, I realize I can’t take them for granted. I’m glad to be part of an agency that is working to create a more resilient supply chain — and bring back chip manufacturing in this country.
    __________________________________________________

    Semiconductor Chip Glossary

    Semiconductor: Material that can act either as a conductor or an insulator of electricity, depending on small changes in voltage

    Silicon: Semiconductor material that serves as the basis for many circuits in industry

    Transistor: Simple switch, made with a semiconductor material, that turns on or off depending on changes in voltage and can combine with other transistors to create complex devices

    Integrated circuit: Many transistors (anywhere from several to billions) combined to make a small circuit on a chip

    Wafer: Thin piece of semiconductor material (such as silicon) that we use as a base for building multiple integrated circuits

    Lithography: Process of etching into or building onto the surface of a wafer in order to produce patterns of integrated circuits

    Chip: Self-contained piece including the semiconductor surface and integrated circuit, independently packaged for use in electronics such as cellphones or computers

    Fab: Industrial facility where raw silicon wafers become fully functioning electronic chips
    __________________________________________________

    11
    NIST graphic designer Brandon Hayes and me in our bunny suits as we prepared to enter the NIST NanoFab, where Brandon took many amazing pictures, several of which you see in this blog post. Look for more NanoFab photos from Brandon as we continue to cover this topic in the coming months and years!
    Credit: J. Zhang/NIST

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterwards, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST-F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.
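    For a sense of the precision involved: the SI second is defined as 9,192,631,770 cycles of the cesium-133 transition that NIST-F1 measures. The fractional frequency uncertainty used below is only an order-of-magnitude figure for a cesium fountain clock of this class, not NIST’s published specification.

    ```python
    CESIUM_HZ = 9_192_631_770  # cycles of the cesium-133 transition per SI second

    fractional_uncertainty = 1e-15         # illustrative order of magnitude
    seconds_per_year = 365.25 * 24 * 3600  # ~31.6 million seconds

    # Years until an error of one whole second could accumulate at that uncertainty:
    years_to_drift_one_second = 1 / (fractional_uncertainty * seconds_per_year)
    print(f"{years_to_drift_one_second:,.0f} years")  # roughly 32 million years
    ```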

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).
    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes the Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of the NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 9:00 pm on August 3, 2022 Permalink | Reply
    Tags: "Engineers develop new integration route for tiny transistors", , Breaking the bottleneck for future electronics, Electronics, filling a gap in semiconductor applications due to silicon’s opaque and rigid nature, , Solving the semiconductor scaling issue, The new miniaturized devices matched the performance of current silicon-semiconductor field-effect transistors., The potential for large-scale production of a 2D field-effect transistor – a device used to control current in electronics., The scientists hope to see whether the material can be used to build all the circuits for an entire computer on one chip., , UNSW Materials and Manufacturing Futures Institute (MMFI)   

    From The University of New South Wales (AU): “Engineers develop new integration route for tiny transistors” 

    U NSW bloc

    From The University of New South Wales (AU)

    8.3.22

    The transparent and flexible material could pave the way for emerging 2D electronic applications.

    1
    Researchers from the Materials and Manufacturing Futures Institute designed the material. Photo: Robert Largent.

    Researchers from UNSW Sydney have developed a tiny, transparent and flexible material to be used as a novel dielectric (insulator) component in transistors. The new material would let transistors do what conventional silicon semiconductor electronics cannot: get smaller without compromising their function.

    The research, recently published in Nature [below], indicates the potential for large-scale production of a 2D field-effect transistor – a device used to control current in electronics. The new material could help overcome the challenges of nanoscale silicon semiconductor production for dependable capacitance (electrical charge stored) and efficient switching behaviour.

    According to the researchers, this is one of the crucial bottlenecks to solve for the development of a new generation of futuristic electronic devices, from augmented reality, flexible displays and new wearables to many applications yet to be discovered.

    “Not only does it pave a critical pathway to overcome the fundamental limit of the current silicon semiconductor industry in miniaturization, but it also fills a gap in semiconductor applications due to silicon’s opaque and rigid nature,” says Professor Sean Li, UNSW Materials and Manufacturing Futures Institute (MMFI) Director and principal investigator on the research. “Simultaneously, the elastic and slim nature could enable the accomplishment of flexible and transparent 2D electronics.”

    Solving the semiconductor scaling issue

    A transistor is a small semiconductive device used as a switch for electronic signals, and they are an essential component of integrated circuits. All electronics, from flashlights to hearing aids to laptops, are made possible by various arrangements and interactions of transistors with other components like resistors and capacitors.

    As transistors have become smaller and more powerful over time, so too have electronics. Think of your mobile phone: a compact hand-held computer with more processing power than the computers that sent the first astronauts to the moon.

    But there’s a scaling problem. Developing more powerful future electronics will require transistors with sub-nanometre thickness – a size conventional silicon semiconductors can’t reach.

    “As microelectronic miniaturization occurs, the materials currently being used are pushed to their limits because of energy loss and dissipation as signals pass from one transistor to the next,” says Prof. Li.

    Microelectronic devices continue to shrink to achieve higher speeds; the smallest transistors currently made of silicon-based semiconductors are 3 nanometres.

    To get an idea of just how small these devices need to be – imagine one centimetre on a ruler and then count the 10 millimetres of that centimetre. Now, in one of those millimetres, count another one million tiny segments – each of those is one nanometre or nm.
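    The ruler picture above works out as simple unit arithmetic:

    ```python
    MM_PER_CM = 10
    NM_PER_MM = 1_000_000  # one million nanometres per millimetre

    nm_per_cm = MM_PER_CM * NM_PER_MM
    print(nm_per_cm)  # prints 10000000: ten million nanometres in a centimetre

    # How many 3 nm transistor lengths fit across one millimetre tick?
    print(NM_PER_MM // 3)  # prints 333333
    ```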

    “With such limits, there has been an enormous drive to radically innovate new materials and technologies to meet the insatiable demands of the global microelectronics market,” says Prof. Li.

    Breaking the bottleneck for future electronics

    For the research, MMFI engineers fabricated the transparent field-effect transistors using a freestanding single-crystal strontium titanate (STO) membrane as the gate dielectric. They discovered their new miniaturized devices matched the performance of current silicon-semiconductor field-effect transistors.

    “The key innovation of this work is that we transformed conventional 3D bulk materials into a quasi-2D form without degrading their properties,” says Dr Jing-Kai Huang, the paper’s lead author. “This means it can be freely assembled, like LEGO blocks, with other materials to create high-performance transistors for a variety of emerging and undiscovered applications.”

    The MMFI academics drew on their diverse expertise to complete the work.

    “Fabricating devices involves people from different fields. Through MMFI, we have established connections with academics who are experts in the 2D electric device fields as well as the semiconductor industry,” says Dr Ji Zhang, a co-author of the paper.

    “The first project was to fabricate the freestanding STO and to study its electrical properties. As the project progressed, it evolved into fabricating 2D transistors using freestanding STO. With the help from the platform established by MMFI, we were able to work together to finish the project.”

    The team is now working towards wafer-scale production. In other words, they hope to see whether the material can be used to build all the circuits for an entire computer on one chip.

    “Extensive data sets were collected to support the performance of these 2D electronics, indicating the technology’s promise for large-size wafer production and industrial adoption,” says Dr Junjie Shi, another co-author of the paper.

    “Achieving this will enable us to fabricate more complex circuits with a density closer to commercial products. This is the crucial step to make our technology reach people,” says Dr Huang.

    The researchers also say their development is a promising step toward a new era of electronics and local manufacturing resilience.

    “From shifting geopolitics and the pandemic, we have seen more disruption in the global semiconductor supply chain, and we believe this is also an opportunity for Australia to join and strengthen this supply chain with our unique technology in the near future,” Dr Huang says.

    Currently, the technology is protected by two Australian provisional patent applications, with MMFI and UNSW looking to commercialize the intellectual property and bring it to market.

    “We are currently fabricating logic circuits with the transistors,” says Prof. Li. “At the same time, we are approaching several leading industries in the Asia-Pacific region to attract investment and establish a semiconductor manufacturing capability in NSW via industrialization of this technology.”

    Science paper:
    Nature

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    UNSW Campus

    The University of New South Wales is an Australian public university with its largest campus in the Sydney suburb of Kensington.

    Established in 1949, UNSW is a research university, ranked 44th in the world in the 2021 QS World University Rankings and 67th in the world in the 2021 Times Higher Education World University Rankings. UNSW is one of the founding members of the Group of Eight, a coalition of Australian research-intensive universities, and of Universitas 21, a global network of research universities. It has international exchange and research partnerships with over 200 universities around the world.

    According to the 2021 QS World University Rankings by Subject, UNSW is ranked top 20 in the world for Law, Accounting and Finance, and 1st in Australia for Mathematics, Engineering and Technology. UNSW also leads Australia in Medicine, where the median ATAR (Australian university entrance examination results) of its Medical School students is higher than any other Australian medical school. UNSW enrolls the highest number of Australia’s top 500 high school students academically, and produces more millionaire graduates than any other Australian university.

    The university comprises seven faculties, through which it offers bachelor’s, master’s and doctoral degrees. The main campus is in the Sydney suburb of Kensington, 7 kilometres (4.3 mi) from the Sydney CBD. The creative arts faculty, UNSW Art & Design, is located in Paddington, and subcampuses are located in the Sydney CBD as well as several other suburbs, including Randwick and Coogee. Research stations are located throughout the state of New South Wales.

    The university’s second largest campus, known as UNSW Canberra at ADFA (formerly known as UNSW at ADFA), is situated in Canberra, in the Australian Capital Territory (ACT). ADFA is the military academy of the Australian Defence Force, and UNSW Canberra is the only national academic institution with a defence focus.

    Research centres

    The university has a number of purpose-built research facilities, including:

    UNSW Lowy Cancer Research Centre is Australia’s first facility bringing together researchers in childhood and adult cancers, as well as one of the country’s largest cancer-research facilities, housing up to 400 researchers.
    The Mark Wainwright Analytical Centre is a centre for the faculties of science, medicine, and engineering. It is used to study the structure and composition of biological, chemical, and physical materials.
    UNSW Canberra Cyber is a cyber-security research and teaching centre.
    The Sino-Australian Research Centre for Coastal Management (SARCCM) has a multidisciplinary focus, and works collaboratively with the Ocean University of China [中國海洋大學](CN) in coastal management research.

     
  • richardmitnick 7:57 am on June 19, 2021 Permalink | Reply
    Tags: "Lawrence Livermore team designs semiconductor switch for next-generation communications", , , Electronics, , Laser-driven semiconductor switch   

    From DOE’s Lawrence Livermore National Laboratory (US) : “Lawrence Livermore team designs semiconductor switch for next-generation communications” 

    From DOE’s Lawrence Livermore National Laboratory (US)

    6.18.21

    Jeremy Thomas
    thomas244@llnl.gov
    925-422-5539

    1
    Lawrence Livermore National Laboratory engineers have designed a new kind of laser-driven semiconductor switch that can theoretically achieve higher speeds at higher voltages than existing photoconductive devices. If the device could be realized, it could be miniaturized and incorporated into satellites to enable communication systems beyond 5G, potentially transferring more data at a faster rate and over longer distances, according to researchers.

    Lawrence Livermore National Laboratory (LLNL) engineers have designed a new kind of laser-driven semiconductor switch that can theoretically achieve higher speeds at higher voltages than existing photoconductive devices. The development of such a device could enable next-generation satellite communication systems capable of transferring more data at a faster rate, and over longer distances, according to the research team.

    Scientists at LLNL and the University of Illinois Urbana-Champaign (US) (UIUC) reported on the design and simulation of the novel photoconductive device in a paper published in the IEEE Journal of the Electron Devices Society. The device utilizes a high-powered laser to generate an electron charge cloud in the base material gallium nitride while under extreme electric fields.

    Unlike normal semiconductors, in which electrons move faster as the applied electrical field is increased, gallium nitride expresses a phenomenon called negative differential mobility, where the generated electron cloud doesn’t disperse, but actually slows down at the front of the cloud. This allows the device to create extremely fast pulses and high voltage signals at frequencies approaching one terahertz when exposed to electromagnetic radiation, researchers said.
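    Negative differential mobility can be sketched with a standard empirical velocity-field relation: drift velocity rises linearly at low field, peaks, then falls toward a saturation value. The parameter values below are illustrative placeholders, not figures from the LLNL/UIUC paper:

    ```python
    # Toy velocity-field curve exhibiting negative differential mobility.
    # Parameters are illustrative, not taken from the paper.
    def drift_velocity(E, mu=8000.0, v_sat=1.0e7, E_c=4.0e3):
        """Drift velocity (cm/s) vs field E (V/cm) using a common
        empirical form: rises as mu*E at low field, peaks near E_c,
        then falls toward v_sat -- the NDM signature."""
        r = (E / E_c) ** 4
        return (mu * E + v_sat * r) / (1.0 + r)

    # Below the peak, more field means faster electrons...
    print(drift_velocity(100.0) < drift_velocity(4.0e3))   # True
    # ...but past the peak, increasing the field *slows* them:
    print(drift_velocity(4.0e3) > drift_velocity(2.0e4))   # True
    ```

    It is this velocity roll-off at high field that keeps the generated electron cloud from dispersing, enabling the short, high-voltage pulses described above.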

    “The goal of this project is to build a device that is significantly more powerful than existing technology but also can operate at very high frequencies,” said LLNL engineer and project principal investigator Lars Voss. “It works in a unique mode, where the output pulse can actually be shorter in time than the input pulse of the laser — almost like a compression device. You can compress an optical input into an electrical output, so it lets you potentially generate extremely high speed and very high-power radio frequency waveforms.”

    If the photoconductive switch modeled in the paper could be realized, it could be miniaturized and incorporated into satellites to enable communication systems beyond 5G, potentially transferring more data at a faster rate and over longer distances, Voss said.

    High-power, high-frequency technology is one of the last areas where solid state devices have yet to replace vacuum tubes, Voss added. New compact semiconductor technologies capable of operating at more than 300 gigahertz (GHz) while delivering a watt or more of output power are in high demand for such applications, and while some high electron mobility transistors can reach frequencies above 300 GHz, they are generally limited in output power, researchers reported.

    “Modeling and simulation of this new switch will provide guidance to experiments, reduce costs of test structures, improve the turnaround and success rate of laboratory tests by preventing trial and error and enable correct interpretation of experimental data,” said lead author Shaloo Rakheja, an assistant professor in the Department of Electrical and Computer Engineering and resident faculty at the Holonyak Micro and Nanotechnology Laboratory at UIUC.

    Researchers are building the switches at LLNL and are exploring other materials such as gallium arsenide to optimize performance.

    “Gallium arsenide expresses the negative differential mobility at lower electric fields than gallium nitride, so it’s a great model to understand the tradeoffs of the effect with more accessible testing,” said LLNL postdoctoral researcher and co-author Karen Dowling.

    Funded by the Laboratory Directed Research and Development program, the project aims to demonstrate a conduction device that can operate at 100 GHz and at high power. Future work will examine the impact of laser heating on the electron charge cloud, as well as improving understanding of the device’s operation under an electrical-optical simulation framework, the team reported.

    The simulation work was performed by lead author Rakheja and Kexin Li at UIUC. The project’s original principal investigator Adam Conway, formerly of LLNL, also contributed.

    See the full article here.



    Operated by Lawrence Livermore National Security, LLC, for the Department of Energy’s National Nuclear Security Administration

    DOE’s Lawrence Livermore National Laboratory (LLNL) (US) is an American federal research facility in Livermore, California, United States, founded by the University of California-Berkeley (US) in 1952. A Federally Funded Research and Development Center (FFRDC), it is primarily funded by the U.S. Department of Energy (DOE) and managed and operated by Lawrence Livermore National Security, LLC (LLNS), a partnership of the University of California, Bechtel, BWX Technologies, AECOM, and Battelle Memorial Institute in affiliation with the Texas A&M University System (US). In 2012, the laboratory had the synthetic chemical element livermorium named after it.
    LLNL is self-described as “a premier research and development institution for science and technology applied to national security.” Its principal responsibility is ensuring the safety, security and reliability of the nation’s nuclear weapons through the application of advanced science, engineering and technology. The Laboratory also applies its special expertise and multidisciplinary capabilities to preventing the proliferation and use of weapons of mass destruction, bolstering homeland security and solving other nationally important problems, including energy and environmental security, basic science and economic competitiveness.

    The Laboratory is located on a one-square-mile (2.6 km^2) site at the eastern edge of Livermore. It also operates a 7,000-acre (28 km^2) remote experimental test site, called Site 300, situated about 15 miles (24 km) southeast of the main lab site. LLNL has an annual budget of about $1.5 billion and a staff of roughly 5,800 employees.

    LLNL was established in 1952 as the University of California Radiation Laboratory at Livermore, an offshoot of the existing UC Radiation Laboratory at Berkeley. It was intended to spur innovation and provide competition to the nuclear weapon design laboratory at Los Alamos in New Mexico, home of the Manhattan Project that developed the first atomic weapons. Edward Teller and Ernest Lawrence, director of the Radiation Laboratory at Berkeley, are regarded as the co-founders of the Livermore facility.

    The new laboratory was sited at a former naval air station of World War II. It was already home to several UC Radiation Laboratory projects that were too large for its location in the Berkeley Hills above the UC campus, including one of the first experiments in the magnetic approach to confined thermonuclear reactions (i.e. fusion). About half an hour southeast of Berkeley, the Livermore site provided much greater security for classified projects than an urban university campus.

    Lawrence tapped 32-year-old Herbert York, a former graduate student of his, to run Livermore. Under York, the Lab had four main programs: Project Sherwood (the magnetic-fusion program), Project Whitney (the weapons-design program), diagnostic weapon experiments (both for the DOE’s Los Alamos National Laboratory (US) and Livermore laboratories), and a basic physics program. York and the new lab embraced the Lawrence “big science” approach, tackling challenging projects with physicists, chemists, engineers, and computational scientists working together in multidisciplinary teams. Lawrence died in August 1958 and shortly after, the university’s board of regents named both laboratories for him, as the Lawrence Radiation Laboratory.

    Historically, the DOE’s Lawrence Berkeley National Laboratory (US) and Livermore laboratories have had very close relationships on research projects, business operations, and staff. The Livermore Lab was established initially as a branch of the Berkeley laboratory. The Livermore lab was not officially severed administratively from the Berkeley lab until 1971. To this day, in official planning documents and records, Lawrence Berkeley National Laboratory is designated as Site 100, Lawrence Livermore National Lab as Site 200, and LLNL’s remote test location as Site 300.

    The laboratory was renamed Lawrence Livermore Laboratory (LLL) in 1971. On October 1, 2007, LLNS assumed management of LLNL from the University of California, which had exclusively managed and operated the Laboratory since its inception 55 years before. The laboratory was honored in 2012 by having the synthetic chemical element livermorium named after it. The LLNS takeover of the laboratory has been controversial. In May 2013, an Alameda County jury awarded over $2.7 million to five former laboratory employees who were among 430 employees LLNS laid off during 2008. The jury found that LLNS breached a contractual obligation to terminate the employees only for “reasonable cause.” The five plaintiffs also have pending age discrimination claims against LLNS, which will be heard by a different jury in a separate trial. There are 125 co-plaintiffs awaiting trial on similar claims against LLNS. The May 2008 layoff was the first layoff at the laboratory in nearly 40 years.

    On March 14, 2011, the City of Livermore officially expanded the city’s boundaries to annex LLNL and move it within the city limits. The unanimous vote by the Livermore city council expanded Livermore’s southeastern boundaries to cover 15 land parcels covering 1,057 acres (4.28 km^2) that comprise the LLNL site. The site was formerly an unincorporated area of Alameda County. The LLNL campus continues to be owned by the federal government.

    LLNL/NIF

    DOE Seal

    NNSA

     
  • richardmitnick 9:59 pm on June 17, 2021 Permalink | Reply
    Tags: "An Atomic Look At Lithium-rich Batteries", , , Electronics, Lithium-rich oxides are promising cathode material classes because they have been shown to have much higher storage capacity., Researchers believes a paradigm shift is necessary to make a significant impact in battery technology for these industries., The anionic reduction-oxidation mechanism in lithium-rich cathodes., The electrification of heavy-duty vehicles and aircraft requires batteries with more energy density., The team set out to provide conclusive evidence for the redox mechanism utilizing Compton scattering.   

    From Carnegie Mellon University (US) : “An Atomic Look At Lithium-rich Batteries” 

    From Carnegie Mellon University (US)

    June 17, 2021
    Lisa Kulick
    lkulick@andrew.cmu.edu

    International team of researchers makes groundbreaking observation.

    1

    Batteries have come a long way since Volta first stacked copper and zinc discs together 200 years ago. While the technology has continued to evolve from lead-acid to lithium-ion, many challenges still exist — like achieving higher density and suppressing dendrite growth. Experts are racing to address the growing, global need for energy-efficient and safe batteries.

    The electrification of heavy-duty vehicles and aircraft requires batteries with more energy density. A team of researchers believes a paradigm shift is necessary to make a significant impact in battery technology for these industries. This shift would take advantage of the anionic reduction-oxidation mechanism in lithium-rich cathodes. Findings published in Nature mark the first direct observation of this anionic redox reaction in a lithium-rich battery material.

    Collaborating institutions included Carnegie Mellon University, Northeastern University (US), Lappeenranta-Lahti University of Technology [LUT-yliopisto](FI) in Finland, and institutions in Japan including Gunma University [群馬大学](JP), Japan Synchrotron Radiation Research Institute (JASRI) [高輝度光科学研究センター](JP), Yokohama National University [横浜国立大学](JP), Kyoto University [京都大学](JP) and Ritsumeikan University [立命館大学](JP).

    Lithium-rich oxides are a promising class of cathode materials because they have been shown to have much higher storage capacity. But there is an “AND problem” that battery materials must satisfy — the material must be capable of fast charging, be stable at extreme temperatures, and cycle reliably for thousands of cycles. To address this, scientists need a clear understanding of how these oxides work at the atomic level and what role their underlying electrochemical mechanisms play.

    Normal Li-ion batteries work by cationic redox, when a metal ion changes its oxidation state as lithium is inserted or removed. Within this insertion framework, only one lithium-ion can be stored per metal-ion. Lithium-rich cathodes, however, can store much more. Researchers attribute this to the anionic redox mechanism — in this case, oxygen redox. This is the mechanism credited with the high capacity of the materials, nearly doubling the energy storage compared to conventional cathodes. Although this redox mechanism has emerged as the leading contender among battery technologies, it signifies a pivot in materials chemistry research.

    The team set out to provide conclusive evidence for the redox mechanism utilizing Compton scattering, the phenomenon by which a photon deviates from a straight trajectory after interacting with a particle (usually an electron). The researchers performed sophisticated theoretical and experimental studies at SPring-8, the world’s largest third-generation synchrotron radiation facility, which is operated by JASRI.

    Synchrotron radiation consists of the narrow, powerful beams of electromagnetic radiation produced when electron beams are accelerated to nearly the speed of light and forced into a curved path by a magnetic field. With X-rays of this intensity, Compton scattering becomes observable.
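    The Compton wavelength shift itself is textbook physics; a short calculation (not taken from the study) shows the scale of the effect:

    ```python
    import math

    # Compton scattering: a photon scattering off an electron shifts in
    # wavelength by delta_lambda = (h / (m_e * c)) * (1 - cos(theta)).
    # Constants are CODATA values; this is a textbook illustration,
    # not a calculation from the paper.
    H = 6.62607015e-34      # Planck constant, J*s
    M_E = 9.1093837015e-31  # electron mass, kg
    C = 2.99792458e8        # speed of light, m/s

    COMPTON_WAVELENGTH = H / (M_E * C)   # ~2.43e-12 m

    def compton_shift(theta_deg):
        """Wavelength shift in metres for a given scattering angle."""
        return COMPTON_WAVELENGTH * (1 - math.cos(math.radians(theta_deg)))

    print(compton_shift(90.0))   # one Compton wavelength, ~2.43e-12 m
    print(compton_shift(180.0))  # maximum shift (backscatter), ~4.85e-12 m
    ```

    The picometre scale of the shift is why an intense, tightly collimated synchrotron source is needed to resolve it.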

    The researchers observed how the electronic orbital that lies at the heart of the reversible and stable anionic redox activity can be imaged and visualized and its character and symmetry determined. This scientific first can be game-changing for future battery technology.

    While previous research has proposed alternative explanations of the anionic redox mechanism, it could not provide a clear image of the quantum mechanical electronic orbitals associated with redox reactions, because these orbitals cannot be measured by standard experiments.

    The research team had an “A ha!” moment when they first saw the agreement in redox character between theory and experimental results. “We realized that our analysis could image the oxygen states that are responsible for the redox mechanism, which is something fundamentally important for battery research,” explained Hasnain Hafiz, lead author of the study who carried out this work during his time as a postdoctoral research associate at Carnegie Mellon.

    “We have conclusive evidence in support of the anionic redox mechanism in a lithium-rich battery material,” said Venkat Viswanathan, associate professor of mechanical engineering at Carnegie Mellon. “Our study provides a clear picture of the workings of a lithium-rich battery at the atomic scale and suggests pathways for designing next-generation cathodes to enable electric aviation. The design for high-energy density cathodes represents the next frontier for batteries.”

    See the full article here.


    Carnegie Mellon University (US) is a global research university with more than 12,000 students, 95,000 alumni, and 5,000 faculty and staff.
    CMU has been a birthplace of innovation since its founding in 1900.
    Today, we are a global leader bringing groundbreaking ideas to market and creating successful startup businesses.
    Our award-winning faculty members are renowned for working closely with students to solve major scientific, technological and societal challenges. We put a strong emphasis on creating things—from art to robots. Our students are recruited by some of the world’s most innovative companies.
    We have campuses in Pittsburgh, Qatar and Silicon Valley, and degree-granting programs around the world, including Africa, Asia, Australia, Europe and Latin America.

    Established by Andrew Carnegie as the Carnegie Technical Schools, the university became the Carnegie Institute of Technology in 1912 and began granting four-year degrees. In 1967, the Carnegie Institute of Technology merged with the Mellon Institute of Industrial Research, formerly a part of the University of Pittsburgh. Since then, the university has operated as a single institution.

    The university has seven colleges and independent schools, including the College of Engineering, College of Fine Arts, Dietrich College of Humanities and Social Sciences, Mellon College of Science, Tepper School of Business, Heinz College of Information Systems and Public Policy, and the School of Computer Science. The university has its main campus located 3 miles (5 km) from Downtown Pittsburgh, and the university also has over a dozen degree-granting locations in six continents, including degree-granting campuses in Qatar and Silicon Valley.

    Past and present faculty and alumni include 20 Nobel Prize laureates, 13 Turing Award winners, 23 Members of the American Academy of Arts and Sciences (US), 22 Fellows of the American Association for the Advancement of Science (US), 79 Members of the National Academies, 124 Emmy Award winners, 47 Tony Award laureates, and 10 Academy Award winners. Carnegie Mellon enrolls 14,799 students from 117 countries and employs 1,400 faculty members.
    Research

    Carnegie Mellon University is classified among “R1: Doctoral Universities – Very High Research Activity”. For the 2006 fiscal year, the university spent $315 million on research. The primary recipients of this funding were the School of Computer Science ($100.3 million), the Software Engineering Institute ($71.7 million), the College of Engineering ($48.5 million), and the Mellon College of Science ($47.7 million). The research money comes largely from federal sources, with a federal investment of $277.6 million. The federal agencies that invest the most money are the National Science Foundation (US) and the Department of Defense (US), which contribute 26% and 23.4% of the total university research budget respectively.

    The recognition of Carnegie Mellon as one of the best research facilities in the nation has a long history—as early as the 1987 Federal budget Carnegie Mellon University was ranked as third in the amount of research dollars with $41.5 million, with only Massachusetts Institute of Technology (US) and Johns Hopkins University (US) receiving more research funds from the Department of Defense.

    The Pittsburgh Supercomputing Center (PSC) (US) is a joint effort between Carnegie Mellon, University of Pittsburgh (US), and Westinghouse Electric Company. Pittsburgh Supercomputing Center was founded in 1986 by its two scientific directors, Dr. Ralph Roskies of the University of Pittsburgh and Dr. Michael Levine of Carnegie Mellon. Pittsburgh Supercomputing Center is a leading partner in the TeraGrid, the National Science Foundation’s cyberinfrastructure program.
    Scarab lunar rover is being developed by the RI.

    The Robotics Institute (RI) is a division of the School of Computer Science and considered to be one of the leading centers of robotics research in the world. The Field Robotics Center (FRC) has developed a number of significant robots, including Sandstorm and H1ghlander, which finished second and third in the DARPA Grand Challenge, and Boss, which won the DARPA Urban Challenge. The Robotics Institute has partnered with a spinoff company, Astrobotic Technology Inc., to land a CMU robot on the moon by 2016 in pursuit of the Google Lunar XPrize. The robot, known as Andy, is designed to explore lunar pits, which might include entrances to caves. The RI is primarily sited at Carnegie Mellon’s main campus in Newell-Simon hall.

    The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the U.S. Department of Defense and operated by Carnegie Mellon, with offices in Pittsburgh, Pennsylvania, USA; Arlington, Virginia, and Frankfurt, Germany. The SEI publishes books on software engineering for industry, government and military applications and practices. The organization is known for its Capability Maturity Model (CMM) and Capability Maturity Model Integration (CMMI), which identify essential elements of effective system and software engineering processes and can be used to rate the level of an organization’s capability for producing quality systems. The SEI is also the home of CERT/CC, the federally funded computer security organization. The CERT Program’s primary goals are to ensure that appropriate technology and systems management practices are used to resist attacks on networked systems and to limit damage and ensure continuity of critical services subsequent to attacks, accidents, or failures.

    The Human–Computer Interaction Institute (HCII) is a division of the School of Computer Science and is considered one of the leading centers of human–computer interaction research, integrating computer science, design, social science, and learning science. Such interdisciplinary collaboration is the hallmark of research done throughout the university.

    The Language Technologies Institute (LTI) is another unit of the School of Computer Science and is famous for being one of the leading research centers in the area of language technologies. The primary research focus of the institute is on machine translation, speech recognition, speech synthesis, information retrieval, parsing and information extraction. Until 1996, the institute existed as the Center for Machine Translation that was established in 1986. From 1996 onwards, it started awarding graduate degrees and the name was changed to Language Technologies Institute.

    Carnegie Mellon is also home to the Carnegie School of management and economics. This intellectual school grew out of the Tepper School of Business in the 1950s and 1960s and focused on the intersection of behavioralism and management. Several management theories, most notably bounded rationality and the behavioral theory of the firm, were established by Carnegie School management scientists and economists.

    Carnegie Mellon also develops cross-disciplinary and university-wide institutes and initiatives to take advantage of strengths in various colleges and departments and develop solutions in critical social and technical problems. To date, these have included the Cylab Security and Privacy Institute, the Wilton E. Scott Institute for Energy Innovation, the Neuroscience Institute (formerly known as BrainHub), the Simon Initiative, and the Disruptive Healthcare Technology Institute.

    Carnegie Mellon has made a concerted effort to attract corporate research labs, offices, and partnerships to the Pittsburgh campus. Apple Inc., Intel, Google, Microsoft, Disney, Facebook, IBM, General Motors, Bombardier Inc., Yahoo!, Uber, Tata Consultancy Services, Ansys, Boeing, Robert Bosch GmbH, and the Rand Corporation have established a presence on or near campus. In collaboration with Intel, Carnegie Mellon has pioneered research into claytronics.

     
  • richardmitnick 10:57 am on June 2, 2021 Permalink | Reply
    Tags: "World’s smallest and best acoustic amplifier emerges from 50-year-old hypothesis", Acousto-electric devices reveal new road to miniaturizing wireless tech., , Electronics   

    From DOE’s Sandia National Laboratories (US) : “World’s smallest and best acoustic amplifier emerges from 50-year-old hypothesis” 

    From DOE’s Sandia National Laboratories (US)

    June 2, 2021

    Troy Rummler
    trummle@sandia.gov
    505-249-3632

    Acousto-electric devices reveal new road to miniaturizing wireless tech.

    1
    Scientists Matt Eichenfield, left, and Lisa Hackett led the team at Sandia National Laboratories that created the world’s smallest and best acoustic amplifier. Photo by Bret Latter.

    Scientists at Sandia National Laboratories have built the world’s smallest and best acoustic amplifier. And they did it using a concept that was all but abandoned for almost 50 years.

    According to a paper published May 13 in Nature Communications, the device is more than 10 times more effective than the earlier versions. The design and future research directions hold promise for smaller wireless technology.

    Modern cell phones are packed with radios to send and receive phone calls, text messages and high-speed data. The more radios in a device, the more it can do. While most radio components, including amplifiers, are electronic, they can potentially be made smaller and better as acoustic devices. This means they would use sound waves instead of electrons to process radio signals.

    “Acoustic wave devices are inherently compact because the wavelengths of sound at these frequencies are so small — smaller than the diameter of a human hair,” Sandia scientist Lisa Hackett said. But until now, using sound waves has been impossible for many of these components.

    Sandia’s acoustic, 276-megahertz amplifier, measuring a mere 0.0008 square inch (0.5 square millimeter), demonstrates the vast, largely untapped potential for making radios smaller through acoustics. To amplify 2 gigahertz frequencies, which carry much of modern cell phone traffic, the device would be even smaller, 0.00003 square inch (0.02 square millimeter), a footprint that would comfortably fit inside a grain of table salt and is more than 10 times smaller than current state-of-the-art technologies.
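    As a rough sanity check on those size claims (not taken from the paper itself), the acoustic wavelength is λ = v/f. The sound velocity below is an assumed, typical surface-acoustic-wave value of about 4,000 m/s; the velocity in Sandia's actual device is not given in this article.

```python
# Back-of-the-envelope acoustic wavelength check: lambda = v / f.
# ASSUMPTION: a typical surface-acoustic-wave velocity of ~4,000 m/s.
SOUND_VELOCITY_M_S = 4_000.0

def acoustic_wavelength_um(frequency_hz: float) -> float:
    """Acoustic wavelength in micrometers at the given frequency."""
    return SOUND_VELOCITY_M_S / frequency_hz * 1e6

wl_276mhz = acoustic_wavelength_um(276e6)  # ~14.5 um
wl_2ghz = acoustic_wavelength_um(2e9)      # ~2 um
hair_diameter_um = 70.0                    # a typical human hair

print(f"276 MHz: {wl_276mhz:.1f} um, 2 GHz: {wl_2ghz:.1f} um")
print(f"Both smaller than a {hair_diameter_um:.0f} um hair: "
      f"{wl_276mhz < hair_diameter_um and wl_2ghz < hair_diameter_um}")
```

    Under that assumed velocity, both wavelengths come out well below a hair's diameter, consistent with Hackett's remark, and the 2 GHz wavelength is roughly 7 times shorter than the 276 MHz one, which is why the higher-frequency device can be so much smaller.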

    The team also created the first acoustic circulator, another crucial radio component that separates transmitted and received signals. Together, the petite parts represent an essentially uncharted path toward making all technologies that send and receive information with radio waves smaller and more sophisticated, said Sandia scientist Matt Eichenfield.

    “We are the first to show that it’s practical to make the functions that are normally being done in the electronic domain in the acoustic domain,” Eichenfield said.

    Resurrecting a decades-old design

    Scientists tried making acoustic radio-frequency amplifiers decades ago, but the last major academic papers from these efforts were published in the 1970s.

    Without modern nanofabrication technologies, their devices performed too poorly to be useful. Boosting a signal by a factor of 100 with the old devices required 0.4 inch (1 centimeter) of space and 2,000 volts of electricity. They also generated lots of heat, requiring more than 500 milliwatts of power.

    The new amplifier improves on the versions built in the ‘70s by more than a factor of 10 in several respects. It can boost signal strength by a factor of 100 in just 0.008 inch (0.2 millimeter), with only 36 volts of electricity and 20 milliwatts of power.
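    The improvement factors implied by those figures are easy to tabulate. One caveat: the decibel conversion below assumes the factor-of-100 boost refers to power gain, which the article does not specify.

```python
# Old (1970s) vs. new (Sandia) acousto-electric amplifier figures,
# both delivering a factor-of-100 boost, as quoted in this article.
import math

old = {"length_mm": 10.0, "voltage_v": 2000.0, "power_mw": 500.0}
new = {"length_mm": 0.2, "voltage_v": 36.0, "power_mw": 20.0}

# ASSUMPTION: interpreting the 100x boost as power gain gives 20 dB.
gain_db = 10 * math.log10(100)

for key in old:
    factor = old[key] / new[key]
    print(f"{key}: {factor:.0f}x improvement")
print(f"gain: {gain_db:.0f} dB in {new['length_mm']} mm")
```

    The device is thus about 50 times shorter, uses roughly 56 times less voltage, and dissipates 25 times less power than its 1970s predecessors for the same boost.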

    2
    An acousto-electric chip, top, produced at Sandia National Laboratories includes a radio-frequency amplifier, circulator and filter. An image taken by scanning electron microscopy shows details of the amplifier. Photo by Bret Latter. Microscopy image courtesy of Matt Eichenfield.

    Previous researchers hit a dead end trying to enhance acoustic devices, which are not capable of amplification or circulation on their own, by using layers of semiconductor materials. For their concept to work well, the added material must be very thin and very high quality, but scientists only had techniques to make one or the other.

    Decades later, Sandia developed techniques to do both in order to improve photovoltaic cells by adding a series of thin layers of semiconducting materials. The Sandia scientist leading that effort happened to share an office with Eichenfield.

    “I had some pretty heavy peripheral exposure. I heard about it all the time in my office,” Eichenfield said. “So fast forward probably three years later, I was reading these papers out of curiosity about this acousto-electric amplifier work and reading about what they tried to do, and I realized that this work that Sandia had done to develop these techniques for essentially taking very, very thin semiconductors and transferring them onto other materials was exactly what we would need to make these devices realize all their promise.”

    Sandia made its amplifier with semiconductor materials that are 83 layers of atoms thick — 1,000 times thinner than a human hair.
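    A rough consistency check on that thickness claim: the article gives only the layer count, so the per-layer spacing below is an assumption for illustration, taken to be on the order of the lattice constant of a III-V semiconductor such as InGaAs (~0.59 nm), along with an assumed ~50-micrometer hair.

```python
# Rough stack-thickness estimate from the quoted layer count.
# ASSUMPTIONS: ~0.59 nm per atomic layer (III-V lattice constant)
# and a ~50 um human hair; neither value is given in the article.
LAYER_SPACING_NM = 0.59
NUM_LAYERS = 83

stack_thickness_nm = NUM_LAYERS * LAYER_SPACING_NM  # ~49 nm
hair_thickness_nm = 50_000.0

ratio = hair_thickness_nm / stack_thickness_nm
print(f"stack: {stack_thickness_nm:.0f} nm, "
      f"~{ratio:.0f}x thinner than a hair")
```

    With those assumed numbers the stack comes out near 50 nanometers, about a thousand times thinner than a hair, matching the article's comparison to within the precision of the inputs.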

    Fusing an ultrathin semiconducting layer onto a dissimilar acoustic device took an intricate process of growing crystals on top of other crystals, bonding them to yet other crystals and then chemically removing 99.99% of the materials to produce a perfectly smooth contact surface. Nanofabrication methods like this are collectively called heterogeneous integration and are a research area of growing interest at Sandia’s Microsystems Engineering, Science and Applications complex and throughout the semiconductor industry.

    Amplifiers, circulators and filters are normally produced separately because they are dissimilar technologies, but Sandia produced them all on the same acousto-electric chip. The more technologies that can be made on the same chip, the simpler and more efficient manufacturing becomes. The team’s research shows that the remaining radio signal processing components could conceivably be made as extensions of the devices already demonstrated.

    Work was funded by Sandia’s Laboratory Directed Research and Development program and the Center for Integrated Nanotechnologies, a user facility jointly operated by Sandia and Los Alamos national laboratories.

    So how long until these petite radio parts are inside your phone? Probably not for a while, Eichenfield said. Converting mass-produced, commercial products like cell phones to all acousto-electric technology would require a massive overhaul of the manufacturing infrastructure, he said. But for small productions of specialized devices, the technology holds more immediate promise.

    The Sandia team is now exploring whether they can adapt their technology to improve all-optical signal processing, too. They are also interested in finding out if the technology can help isolate and manipulate single quanta of sound, called phonons, which would potentially make it useful for controlling and making measurements in some quantum computers.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus.

    DOE’s Sandia National Laboratories (US), managed and operated by the National Technology and Engineering Solutions of Sandia (a wholly owned subsidiary of Honeywell International), is one of three National Nuclear Security Administration (US) research and development laboratories in the United States. Its primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons and high technology. Headquartered in Central New Mexico near the Sandia Mountains, on Kirtland Air Force Base in Albuquerque, Sandia also has a campus in Livermore, California, next to DOE’s Lawrence Livermore National Laboratory (US), and a test facility in Waimea, Kauai, Hawaii.

    It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste.

    Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research including computational biology; mathematics (through its Computer Science Research Institute); materials science; alternative energy; psychology; MEMS; and cognitive science initiatives.

    Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until its decommissioning, and now hosts the Red Storm supercomputer, originally known as Thor’s Hammer.


    Sandia is also home to the Z Machine.

    The Z Machine is the largest X-ray generator in the world and is designed to test materials under conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data to aid in computer modeling of nuclear weapons. In December 2016, it was announced that National Technology and Engineering Solutions of Sandia, under the direction of Honeywell International, would take over the management of Sandia National Laboratories starting on May 1, 2017.


     