Tagged: Laser Technology

  • richardmitnick 10:27 am on January 23, 2022
    Tags: "Harnessing noise in optical computing for AI", , Artificial intelligence and machine learning are currently affecting our lives in a myriad of small but impactful ways., Artificial neural networks are bedrock technology for AI and machine learning., Cloud computing data centers used by AI and machine learning applications worldwide are already devouring more electrical power per year than some small countries., Laser Technology, The computers used for AI and machine learning demand energy and lots of it., The University of Washington Paul G. Allen College of Electrical and Computer of Engineering (US), This level of energy consumption is unsustainable.   

    From The University of Washington Paul G. Allen College of Electrical and Computer Engineering (US): “Harnessing noise in optical computing for AI”

    From The University of Washington Paul G. Allen College of Electrical and Computer Engineering (US)

    at

    The University of Washington (US)

    January 21, 2022
    Wayne Gillam

    An illustration of the UW ECE-led research team’s integrated optical computing chip and “handwritten” numbers it generated. The chip contains an artificial neural network that can learn how to write like a human in its own, distinct style. This optical computing system uses “noise” (stray photons from lasers and thermal background radiation) to augment its creative capabilities. The system is also approximately 10 times faster than comparable conventional digital computers and more energy efficient, helping to put AI and machine learning on a path toward environmental sustainability. Illustration by Changming Wu.

    Artificial intelligence and machine learning are currently affecting our lives in a myriad of small but impactful ways. For example, AI and machine learning applications help to interpret voice commands given to our phones and electronic devices, such as Alexa, and recommend entertainment we might enjoy through services such as Netflix and Spotify. In the near future, it’s predicted that AI and machine learning will have an even larger impact on society through activities such as driving fully autonomous vehicles, enabling complex scientific research and facilitating medical discoveries.

    But the computers used for AI and machine learning demand energy and lots of it. Currently, the need for computing power related to these technologies is doubling roughly every three to four months. And cloud computing data centers used by AI and machine learning applications worldwide are already devouring more electrical power per year than some small countries. Knowing this, it’s easy to see that this level of energy consumption is unsustainable, and if left unchecked, will come with serious environmental consequences for us all.

    UW ECE Professor Mo Li and graduate student Changming Wu have been working toward addressing this daunting challenge over the last couple of years, developing new optical computing hardware for AI and machine learning that is faster and much more energy efficient than conventional electronics. They have already engineered an optical computing system that uses laser light to transmit information and do computing by using phase-change material similar to what is in a CD or DVD-ROM to record data. Laser light transmits data much faster than electrical signals, and phase-change material can retain data using little to no energy. With these advantages, their optical computing system has proven to be much more energy efficient and over 10 times faster than comparable digital computers.
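
    To make that architecture concrete, here is a minimal numerical sketch (hypothetical sizes and values throughout, not the team's actual device) of the general idea behind such a photonic accelerator: inputs are encoded as light intensities, the stored weights are the optical transmission levels of non-volatile phase-change cells, and each photodetector reads out a weighted sum in a single pass of light.

```python
import numpy as np

# Illustrative-only model of an optical matrix-vector multiply.
rng = np.random.default_rng(0)

n_inputs, n_outputs = 8, 4

# Phase-change cells hold the weights as static transmission levels (0 to 1),
# so retaining the weights between operations costs little to no energy.
transmission = rng.uniform(0.0, 1.0, size=(n_outputs, n_inputs))

def optical_matvec(input_intensities, noise_scale=0.01):
    """Each output photodetector sums the light passing through one row of
    phase-change cells; a small Gaussian term stands in for stray photons
    and thermal background (the "noise" discussed below)."""
    clean = transmission @ input_intensities
    return clean + rng.normal(0.0, noise_scale, size=clean.shape)

x = rng.uniform(0.0, 1.0, size=n_inputs)  # inputs encoded as light intensities
print(optical_matvec(x))                  # noisy weighted sums, one per detector
```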

    Now, Li and Wu are addressing another key challenge, the ‘noise’ inherent to optical computing itself. This noise essentially comes from stray light particles, photons, that interfere with computing precision. These errant photons come from the operation of lasers within the device and background thermal radiation. In a new paper published on Jan. 21 in Science Advances, Li, Wu and their research team demonstrate a first-of-its-kind optical computing system for AI and machine learning that not only mitigates this noise but actually uses some of it as input to help enhance the creative output of the artificial neural network within the system. This work resulted from an interdisciplinary collaboration of Li’s research group at the UW with computer scientists Yiran Chen and Xiaoxuan Yang at Duke University (US) and material scientists Ichiro Takeuchi and Heshan Yu at The University of Maryland (US).

    “We’ve built an optical computer that is faster than a conventional digital computer,” said Wu, who is the paper’s lead author. “And also, this optical computer can create new things based on random inputs generated from the optical noise that most researchers tried to evade.”

    Using noise to enhance AI creativity

    Artificial neural networks are bedrock technology for AI and machine learning. These networks function in many respects like the human brain, taking in and processing information from various inputs and generating useful outputs. In short, they are capable of learning.

    In this research work, the team connected Li and Wu’s optical computing core to a special type of artificial neural network called a Generative Adversarial Network, or GAN, which has the capacity to creatively produce outputs. The team employed several different noise mitigation techniques, which included using some of the noise generated by the optical computing core to serve as random inputs for the GAN. The team found that this technique not only made the system more robust, but it also had the surprising effect of enhancing the network’s creativity, allowing it to generate outputs with more varying styles.
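
    The sketch below illustrates that idea in ordinary Python, with entirely hypothetical sizes and an untrained one-layer stand-in for the real generator: rather than drawing the GAN's random latent vector from a software pseudorandom generator, it is derived from readouts of the hardware's own optical noise.

```python
import numpy as np

# Conceptual sketch only; every size, scaling and weight here is a stand-in.
rng = np.random.default_rng(1)

def read_optical_noise(n_samples):
    """Stand-in for repeated readouts of the optical core with no signal
    applied, i.e. stray laser photons plus thermal background radiation."""
    return rng.normal(loc=0.0, scale=0.05, size=n_samples)

def noise_to_latent(readings):
    """Normalize raw hardware noise into a zero-mean, unit-variance latent."""
    return (readings - readings.mean()) / (readings.std() + 1e-8)

latent_dim, side = 16, 28
W = rng.normal(0.0, 0.1, size=(side * side, latent_dim))  # untrained weights

def generator(z):
    """One-layer stand-in for a trained generator that maps a latent vector
    to a 28x28 "handwritten digit" image."""
    return np.tanh(W @ z).reshape(side, side)

z = noise_to_latent(read_optical_noise(latent_dim))
fake_digit = generator(z)  # a different noise readout yields a different output
print(fake_digit.shape)    # (28, 28)
```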

    To experimentally test the image creation abilities of their device, the team assigned the GAN the task of learning how to handwrite the number “7” like a human. The optical computer could not simply print out the number according to a prescribed font. It had to learn the task much like a child would, by looking at visual samples of handwriting and practicing until it could write the number correctly. Of course, the optical computer didn’t have a human hand for writing, so its form of “handwriting” was to generate digital images that had a style similar to the samples it had studied but were not identical to them.

    “Instead of training the network to read handwritten numbers, we trained the network to learn to write numbers, mimicking visual samples of handwriting that it was trained on,” Li said. “We, with the help of our computer science collaborators at Duke University, also showed that the GAN can mitigate the negative impact of the optical computing hardware noises by using a training algorithm that is robust to errors and noises. More than that, the network actually uses the noises as random input that is needed to generate output instances.”

    After learning from handwritten samples of the number seven, which were from a standard AI-training image set, the GAN practiced writing “7” until it could do it successfully. Along the way, it developed its own, distinct writing style. The team was also able to get the device to write numbers from one to 10 in computer simulations.

    As a result of this research, the team was able to show that an optical computing device could power a sophisticated form of artificial intelligence, and that the noise inherent to integrated optoelectronics was not a barrier, but in fact could be used to enhance AI creativity. They also showed that the technology in their device was scalable, and that it would be possible for it to be deployed widely, for instance, in cloud computing data centers worldwide.

    Next steps for the research team will be to build their device at a larger scale using current semiconductor manufacturing technology. So, instead of constructing the next iteration of the device in a lab, the team plans to use an industrial semiconductor foundry to achieve wafer-scale technology. A larger scale device will further improve performance and allow the research team to do more complex tasks beyond handwriting generation such as creating artwork and even videos.

    “This optical system represents a computer hardware architecture that can enhance the creativity of artificial neural networks used in AI and machine learning, but more importantly, it demonstrates the viability for this system at a large scale where noise and errors can be mitigated and even harnessed,” Li said. “AI applications are growing so fast that in the future, their energy consumption will be unsustainable. This technology has the potential to help reduce that energy consumption, making AI and machine learning environmentally sustainable — and very fast, achieving higher performance overall.”

    This research is financially supported by The Office of Naval Research (US) and The National Science Foundation (US). For more information, contact Mo Li.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    About the University of Washington Paul G. Allen College of Electrical and Computer Engineering (US)

    Mission, Facts, and Stats

    Our mission is to develop outstanding engineers and ideas that change the world.

    Faculty:
    275 faculty (25.2% women)
    Achievements:

    128 NSF Young Investigator/Early Career Awards since 1984
    32 Sloan Foundation Research Awards
    2 MacArthur Foundation Fellows (2007 and 2011)

    A national leader in educating engineers, each year the College turns out new discoveries, inventions and top-flight graduates, all contributing to the strength of our economy and the vitality of our community.

    Engineering innovation

    PEOPLE: Innovation at UW ECE is exemplified by our outstanding faculty and by the exceptional group of students they advise and mentor. Students receive a robust education through a strong technical foundation, group project work and hands-on research opportunities. Our faculty work in dynamic research areas with diverse opportunities for projects and collaborations. Through their research, they address complex global challenges in health, energy, technology and the environment, and receive significant research and education grants.

    IMPACT: We continue to expand our innovation ecosystem by promoting an entrepreneurial mindset in our teaching and through diverse partnerships. The field of electrical and computer engineering is at the forefront of solving emerging societal challenges, empowered by innovative ideas from our community. As our department evolves, we are dedicated to expanding our faculty and student body to meet the growing demand for engineers. We welcomed six new faculty hires in the 2018-2019 academic year. Our meaningful connections and collaborations place the department as a leader in the field.

    Engineers drive the innovation economy and are vital to solving society’s most challenging problems. The College of Engineering is a key part of a world-class research university in a thriving hub of aerospace, biotechnology, global health and information technology innovation. Over 50% of UW startups in FY18 came from the College of Engineering.

    Commitment to diversity and access

    The College of Engineering is committed to developing and supporting a diverse student body and faculty that reflect and elevate the populations we serve. We are a national leader in women in engineering; 25.5% of our faculty are women compared to 17.4% nationally. We offer a robust set of diversity programs for students and faculty.
    Research and commercialization

    The University of Washington is an engine of economic growth, today ranked third in the nation for the number of startups launched each year, with 65 companies having been started in the last five years alone by UW students and faculty, or with technology developed here. The College of Engineering is a key contributor to these innovations, and engineering faculty, students or technology are behind half of all UW startups. In FY19, UW received $1.58 billion in total research awards from federal and nonfederal sources.


    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world.
    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 10:13 pm on January 22, 2022
    Tags: "This New Record in Laser Beam Stability Could Help Answer Physics' Biggest Questions", , , Laser Technology, , , ,   

    From The University of Western Australia (AU) via Science Alert (AU): “This New Record in Laser Beam Stability Could Help Answer Physics’ Biggest Questions”


    From The University of Western Australia (AU)

    via

    Science Alert (AU)

    The laser setup at the University of Western Australia. Credit: D. Gozzard/UWA.

    22 JANUARY 2022
    DAVID NIELD

    Scientists are on a mission to create a global network of atomic clocks that will enable us to, among other things, better understand the fundamental laws of physics, investigate dark matter, and navigate across Earth and space more precisely.

    However, to be at their most effective, these clocks will need to be reliably and speedily linked together through layers of the atmosphere, which is far from easy. New research outlines a successful experiment with a laser beam that has been kept stable across a distance of 2.4 kilometers (1.5 miles).

    For comparison, the new link is around 100 times more stable than anything that’s been put together before. It also demonstrates stability that’s around 1,000 times better than the atomic clocks these lasers could be used to monitor.

    “The result shows that the phase and amplitude stabilization technologies presented in this paper can provide the basis for ultra-precise timescale comparison of optical atomic clocks through the turbulent atmosphere,” write the researchers in their published paper [Physical Review Letters].

    The system builds on research carried out last year in which scientists developed a laser link capable of holding its own through the atmosphere with unprecedented stability.

    In the new study, researchers shot a laser beam from a fifth-floor window to a reflector 1.2 kilometers (0.74 miles) away. The beam was then bounced back to the source, covering the full 2.4-kilometer path, and the link was held stable for a period of five minutes.

    Using noise reduction techniques, temperature controls, and tiny adjustments to the reflector, the team was able to keep the laser stable through the pockets of fluctuating air. Because the air is calmer and less dense higher in the atmosphere, the turbulence encountered at ground level over this path is likely equivalent to that of a ground-to-satellite link several hundred kilometers long.
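
    As a rough illustration of what keeping such a link steady involves, the toy loop below measures the phase error accumulated over the round trip and drives a corrective actuator to cancel it. It is a bare proportional feedback loop with arbitrary numbers, not the UWA team's actual phase and amplitude stabilization system.

```python
import numpy as np

# Toy model: atmospheric turbulence as a random-walk phase drift, corrected
# by a proportional feedback loop acting on an actuator (e.g. the reflector).
rng = np.random.default_rng(2)

n_steps = 5000
gain = 0.5                                          # feedback gain
drift = np.cumsum(rng.normal(0.0, 0.02, n_steps))   # uncorrected phase drift

correction = 0.0
residual = np.empty(n_steps)
for i in range(n_steps):
    error = drift[i] - correction    # phase error measured on the return beam
    correction += gain * error       # nudge the actuator toward cancellation
    residual[i] = error

print("uncorrected RMS phase drift:", round(float(np.std(drift)), 3))
print("stabilized RMS phase error: ", round(float(np.std(residual)), 3))
```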

    While laser accuracy has remained fairly constant for a decade or so, we’ve seen some significant improvements recently, including a laser setup operated by the Boulder Atomic Clock Optical Network (BACON) Collaboration and tested last March [Nature].

    That setup involved a pulse laser rather than the continuous wave laser tested in this new study. Both have their advantages in different scenarios, but continuous wave lasers offer better stability and can transfer more data in a set period of time.

    “Both systems beat the current best atomic clock, so we’re splitting hairs here, but our ultimate precision is better,” says astrophysicist David Gozzard from the University of Western Australia.

    Once an atomic clock network is put together, scientists will be able to put Albert Einstein’s Theory of General Relativity to the test, and probe how its incompatibility with what we know about quantum physics could be resolved.

    By very precisely comparing the time-keeping of two atomic clocks – one on Earth and one in space – scientists are eventually hoping to be able to work out where General Relativity does and doesn’t hold up. If Einstein’s ideas are correct, the clock further away from Earth’s gravity should tick ever-so-slightly faster.
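
    For a sense of scale, the gravitational part of that effect follows the standard weak-field relation (a textbook formula, not a number from the article): two clocks separated by a height $\Delta h$ in Earth's gravity differ in rate by roughly

\[
\frac{\Delta \nu}{\nu} \approx \frac{g\,\Delta h}{c^{2}} \approx \frac{9.8 \times 1000}{(3 \times 10^{8})^{2}} \approx 1 \times 10^{-13}
\]

    for a height difference of one kilometer. Resolving shifts that small is why the comparison link must be far more stable than the clocks it connects.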

    But its usefulness doesn’t stop there. Lasers like this could eventually be used for managing the launching of objects into orbit, for communications between Earth and space, or for connecting two points in space.

    “Of course, you can’t run fiber optic cable to a satellite,” says Gozzard.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Western Australia is a public research university in the Australian state of Western Australia. The university’s main campus is in Perth, the state capital, with a secondary campus in Albany and various other facilities elsewhere.

    UWA was established in 1911 by an act of the Parliament of Western Australia and began teaching students two years later. It is the sixth-oldest university in Australia and was Western Australia’s only university until the establishment of Murdoch University (AU) in 1973. Because of its age and reputation, UWA is classed as one of the “sandstone universities”, an informal designation given to the oldest university in each state. The university also belongs to several more formal groupings, including The Group of Eight (AU) and The Matariki Network of Universities. In recent years, UWA has generally been ranked either in the bottom half of the world’s top 100 universities or just outside it, depending on the system used.

    Alumni of UWA include one Prime Minister of Australia (Bob Hawke), five Justices of the High Court of Australia (including one Chief Justice, Robert French, now Chancellor), one Governor of the Reserve Bank (H. C. Coombs), various federal cabinet ministers, and seven of Western Australia’s eight most recent premiers. In 2018 alumnus mathematician Akshay Venkatesh was a recipient of the Fields Medal. As of 2021, the university had produced 106 Rhodes Scholars. Two members of the UWA faculty, Barry Marshall and Robin Warren, won Nobel Prizes as a result of research at the university.

    History

    The university was established in 1911 following the tabling of proposals by a royal commission in September 1910. The original campus, which received its first students in March 1913, was located on Irwin Street in the centre of Perth, and consisted of several buildings situated between Hay Street and St Georges Terrace. Irwin Street was also known as “Tin Pan Alley” as many buildings featured corrugated iron roofs. These buildings served as the university campus until 1932, when the campus relocated to its present-day site in Crawley.

    The founding chancellor, Sir John Winthrop Hackett, died in 1916, and bequeathed property which, after being carefully managed for ten years, yielded £425,000 to the university, a far larger sum than expected. This allowed the construction of the main buildings. Many buildings and landmarks within the university bear his name, including Winthrop Hall and Hackett Hall. In addition, his bequest funded many scholarships, because he did not wish eager students to be deterred from studying because they could not afford to do so.

    During UWA’s first decade there was controversy about whether the policy of free education was compatible with high expenditure on professorial chairs and faculties. An “old student” publicised his concern in 1921 that there were 13 faculties serving only 280 students.

    A remnant of the original buildings survives to this day in the form of the “Irwin Street Building”, so called after its former location. In the 1930s it was transported to the new campus and served a number of uses till its 1987 restoration, after which it was moved across campus to James Oval. Recently, the building has served as the Senate meeting room and is currently in use as a cricket pavilion and office of the university archives. The building has been heritage-listed by both the National Trust and the Australian Heritage Council.

    The university introduced the Doctorate of Philosophy degree in 1946 and made its first award in October 1950 to Warwick Bottomley for his research of the chemistry of native plants in Western Australia.

     
  • richardmitnick 5:22 pm on January 19, 2022
    Tags: "Crystallography for the Misfit Crystals", , , , , , Francis Crick-who famously co-discovered the shape of DNA- said: “If you want to understand function study structure.” This remains a tenet of biology; chemistry and materials science., Laser Technology, Linac Coherent Light Source [LCLS] at DOE's SLAC National Accelerator Laboratory (US)., , Molecular Foundry at Berkeley Lab, National Energy Research Scientific Computing Center [NERSC] at Berkeley Lab, Serial femtosecond X-ray crystallography process, smSFX uses an X-ray free electron laser (XFEL)., smSFX: small-molecule serial femtosecond X-ray crystallography, SPring-8 Angstrom Compact free electron LAser (SACLA) at Riken [理研](JP)., X-ray crystallography is most straightforward when the material can be grown into a large single crystal., X-ray crystallography: a technique that maps the density of electrons in a molecule based on how beams of X-ray radiation diffract through the spaces between atoms in the sample.   

    From DOE’s Lawrence Berkeley National Laboratory (US): “Crystallography for the Misfit Crystals” 

    From DOE’s Lawrence Berkeley National Laboratory (US)

    January 19, 2022
    Aliyah Kovner
    akovner@lbl.gov

    An illustration of the serial femtosecond X-ray crystallography process, showing a jet of liquid solvent combined with the sample particles being blasted with the laser beam to capture diffraction data. This action is completed in just a few femtoseconds – that is quadrillionths of a second, or a few millionths of one billionth of a second. Credit: Ella Maru Studio.

    Francis Crick, who famously co-discovered the shape of DNA, once said: “If you want to understand function, study structure.” Many decades later, this remains a tenet of biology, chemistry, and materials science.

    A key breakthrough in the quest for DNA’s structure came from X-ray crystallography, a technique that maps the density of electrons in a molecule based on how beams of X-ray radiation diffract through the spaces between atoms in the sample. The diffraction patterns generated by crystallography can then be used to deduce the overall molecular structure. Thanks to a steady stream of advances over the decades, X-ray crystallography is now exponentially more powerful than it was in Crick’s time, and can even reveal the placement of individual atoms.
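
    The geometric heart of the method is Bragg's law (a textbook relation, included here for orientation rather than taken from the article), which ties the angles at which X-rays diffract to the spacing between planes of atoms:

\[
n\lambda = 2 d \sin\theta .
\]

    For example, 1.0-angstrom X-rays giving a first-order ($n = 1$) reflection at $2\theta = 30^{\circ}$ imply an interplanar spacing of $d = \lambda / (2 \sin 15^{\circ}) \approx 1.9$ angstroms; measuring many such reflections constrains the full three-dimensional arrangement of atoms.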

    Yet the process is not easy. As the name implies, it requires crystals – specifically, purified samples of the molecule of interest, coaxed into a crystal form. And not all molecules form picture-ready crystals.

    “X-ray crystallography is most straightforward when the material can be grown into a large single crystal,” said Nicholas Sauter, a senior computer scientist at Lawrence Berkeley National Laboratory (Berkeley Lab), in the Molecular Biophysics and Integrated Bioimaging (MBIB) division. “However, most substances instead form powders composed of small granules, whose X-ray diffraction patterns are harder to disentangle.”

    Sauter is co-leading a team working to provide a better way for scientists to study the structures of the many materials that don’t form tidy single crystals, such as solar absorbers and metal-organic frameworks: two diverse material groups with huge potential for combating climate change and producing renewable energy.

    Their new technique, called small-molecule serial femtosecond X-ray crystallography, or smSFX, supercharges traditional crystallography with the addition of custom-built image processing algorithms and an X-ray free electron laser (XFEL). The XFEL, built from a fusion of particle accelerator and laser-based physics, can point X-ray beams that are much more powerful, focused, and speedy than other X-ray sources for crystallography. The entire process, from X-ray pulse to diffraction image, is completed in a few quadrillionths of a second.

    “It’s diffraction before destruction,” said Daniel Paley, an MBIB project scientist and author on the team’s new paper, published today in Nature. “The idea is that the crystal is going to explode instantly when it’s hit by this beam of photons, but with a femtosecond pulse, you collect all the diffraction data before the damage occurs. It’s really cool.”

    Part of the XFEL where the sample is injected into the path of the X-ray beam. This XFEL facility, called the SPring-8 Angstrom Compact free electron LAser (SACLA), is in Japan. The team traveled there and performed their experiments in 2019. Credit: Nate Hohman/The University of Connecticut (US).
    http://www.lightsources.org/facility/sacla

    SACLA Free-Electron Laser, Riken [理研] (JP), Japan.

    Paley and co-leader Aaron Brewster, a research scientist in MBIB, developed the algorithms needed to convert XFEL data into high-quality diffraction patterns that can be analyzed to reveal the unit cell – the basic unit of a crystal that is repeated over and over in three dimensions – of each tiny crystalline grain within the sample.

    When you have a true powder, Paley explained, it’s like having a million crystals that are all jumbled together, full of imperfections, and scrambled in every possible orientation. Rather than diffracting the whole jumble together and getting a muddied readout of electron densities (which is what happens with existing powder diffraction techniques), smSFX is so precise that it can diffract individual granules, one at a time. “This gives it a special sharpening effect,” he said. “So that is actually the kind of secret sauce of this whole method. Normally you shoot all million at once, but now you shoot 10,000 all in sequence.”
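
    A small numerical toy (one-dimensional, with made-up lattice constants, and in no way the team's actual algorithms) makes that sharpening effect concrete: summing the diffraction of thousands of jumbled, slightly imperfect grains smears the peaks, while a pattern from a single grain stays sharp enough to index on its own.

```python
import numpy as np

# Powder diffraction (all grains summed) versus one serial femtosecond shot.
rng = np.random.default_rng(3)
two_theta = np.linspace(5.0, 60.0, 4000)     # detector angle axis, degrees

def grain_pattern(cell, width=0.05, wavelength=1.0):
    """Gaussian Bragg peaks of a 1-D 'crystal' with lattice constant `cell`
    (angstroms), for the first five diffraction orders."""
    orders = np.arange(1, 6)
    angles = 2.0 * np.degrees(np.arcsin(orders * wavelength / (2.0 * cell)))
    return sum(np.exp(-((two_theta - a) / width) ** 2) for a in angles)

def peak_fwhm(pattern, lo, hi):
    """Full width at half maximum of the single peak between lo and hi (deg)."""
    mask = (two_theta >= lo) & (two_theta <= hi)
    x, y = two_theta[mask], pattern[mask]
    above = x[y > y.max() / 2.0]
    return above.max() - above.min()

# A "powder": 2,000 grains whose lattice constants scatter around 6 angstroms.
cells = rng.normal(loc=6.0, scale=0.05, size=2000)
powder = sum(grain_pattern(c) for c in cells)
single = grain_pattern(cells[0])             # one grain, one femtosecond shot

print("powder 5th-order peak FWHM:  ", round(peak_fwhm(powder, 45, 55), 2), "deg")
print("single-grain 5th-order FWHM: ", round(peak_fwhm(single, 45, 55), 2), "deg")
```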

    The cherry on top is that smSFX is performed without freezing the sample or exposing it to a vacuum – another benefit for the delicate materials studied by materials scientists. “No fancy vacuum chamber required,” said Sauter.

    (Left) The team, pictured in 2019, preparing for an XFEL session with their mascot. (Right) An image of the sample injection apparatus, full of a sample of mithrene, a metallic-organic material that glows blue when exposed to UV light. Credit: Nate Hohman/University of Connecticut.

    In the new study, the team demonstrated proof-of-principle for smSFX, then went one step further. They reported the previously unknown structures of two metal-organic materials known as chalcogenolates. Nathan Hohman, a chemical physicist at the University of Connecticut and the project’s third co-leader, studies chalcogenolates for their semiconducting and light-interaction properties, which could make them ideal for next-generation transistors, photovoltaics (solar cells and panels), energy storage devices, and sensors.

    “Every single one of these is a special snowflake – growing them is really difficult,” said Hohman. With smSFX, he and graduate student Elyse Schriber were able to successfully diffract powdered chalcogenolates and examine the structures to learn why some of the silver-based materials glow bright blue under UV light, a phenomenon that the scientists affectionately compare to Frodo’s sword in The Lord of the Rings.

    “There is a huge array of fascinating physical and even chemical dynamics that occur at ultrafast timescales, and our experiment could help to connect the dots between a material’s structure and its function,” said Schriber, a Berkeley Lab affiliate and researcher in Hohman’s lab. “After further improvements are made to streamline the smSFX process, we can imagine programs to offer this technique to other researchers. These types of programs are integral for increasing access to light source facilities, especially for smaller universities and colleges.”

    An illustrated collage composed of all the diffraction data gathered at the SACLA. Credit: Nate Hohman/University of Connecticut.

    This work involved the use of the SACLA free-electron laser in Japan [above], the Linac Coherent Light Source at DOE’s SLAC National Accelerator Laboratory (US), and the National Energy Research Scientific Computing Center [below] and Molecular Foundry [below], two U.S. Department of Energy Office of Science user facilities located at Berkeley Lab.

    SLAC LCLS

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    LBNL Molecular Foundry

    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) (US) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of The National Academy of Sciences (US), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to The National Academy of Engineering (US), and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (US) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the University of California- Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley (US) physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.

    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. Part of the team put together during this period includes two other young scientists who went on to establish large laboratories; J. Robert Oppenheimer founded DOE’s Los Alamos Laboratory (US), and Robert Wilson founded Fermi National Accelerator Laboratory(US).

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy (US). The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory (US)) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy (US), with management from the University of California (US). Companies such as Intel were funding the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab tradition that continues today.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science (US):

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    LBNL/ALS

    DOE’s Lawrence Berkeley National Laboratory (US) Advanced Light Source.
    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of the ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute (US) supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory (US), DOE’s Oak Ridge National Laboratory (US)(ORNL), DOE’s Pacific Northwest National Laboratory (US) (PNNL), and the HudsonAlpha Institute for Biotechnology (US). The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry (US) [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center (US) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center(US) at Lawrence Berkeley National Laboratory.

    Cray Cori II supercomputer at National Energy Research Scientific Computing Center(US) at DOE’s Lawrence Berkeley National Laboratory(US), named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    NERSC Hopper Cray XE6 supercomputer.

    NERSC Cray XC30 Edison supercomputer.

    NERSC GPFS for Life Sciences.

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF computer cluster in 2003.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Science Network (US) is a high-speed network infrastructure optimized for very large scientific data flows. ESNet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (US) (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory (US), the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science (US), and DOE’s Lawrence Livermore National Laboratory (US) (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology (US) and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory (US) leads JCESR and Berkeley Lab is a major partner.

    The University of California-Berkeley (US) is a public land-grant research university in Berkeley, California. Established in 1868 as the state’s first land-grant university, it was the first campus of the University of California (US) system and a founding member of the Association of American Universities (US). Its 14 colleges and schools offer over 350 degree programs and enroll some 31,000 undergraduate and 12,000 graduate students. Berkeley is ranked among the world’s top universities by major educational publications.

    Berkeley hosts many leading research institutes, including the Mathematical Sciences Research Institute and the Space Sciences Laboratory. It founded and maintains close relationships with three national laboratories: DOE’s Lawrence Berkeley National Laboratory (US), DOE’s Lawrence Livermore National Laboratory (US) and DOE’s Los Alamos National Laboratory (US). It has played a prominent role in many scientific advances, from the Manhattan Project and the discovery of 16 chemical elements to breakthroughs in computer science and genomics. Berkeley is also known for student activism and the Free Speech Movement of the 1960s.

    Berkeley alumni and faculty count among their ranks 110 Nobel laureates (34 alumni), 25 Turing Award winners (11 alumni), 14 Fields Medalists, 28 Wolf Prize winners, 103 MacArthur “Genius Grant” recipients, 30 Pulitzer Prize winners, and 19 Academy Award winners. The university has produced seven heads of state or government; five chief justices, including Chief Justice of the United States Earl Warren; 21 cabinet-level officials; 11 governors; and 25 living billionaires. It is also a leading producer of Fulbright Scholars, MacArthur Fellows, and Marshall Scholars. Berkeley alumni, widely recognized for their entrepreneurship, have founded many notable companies.

    Berkeley’s athletic teams compete in Division I of the NCAA, primarily in the Pac-12 Conference, and are collectively known as the California Golden Bears. The university’s teams have won 107 national championships, and its students and alumni have won 207 Olympic medals.

    Made possible by President Lincoln’s signing of the Morrill Act in 1862, the University of California was founded in 1868 as the state’s first land-grant university by inheriting certain assets and objectives of the private College of California and the public Agricultural, Mining, and Mechanical Arts College. Although this process is often mistaken for a merger, the Organic Act created a “completely new institution” and did not actually merge the two precursor entities into the new university. The Organic Act states that the “University shall have for its design, to provide instruction and thorough and complete education in all departments of science, literature and art, industrial and professional pursuits, and general education, and also special courses of instruction in preparation for the professions”.

    Ten faculty members and 40 students made up the fledgling university when it opened in Oakland in 1869. Frederick H. Billings, a trustee of the College of California, suggested that a new campus site north of Oakland be named in honor of Anglo-Irish philosopher George Berkeley. The university began admitting women the following year. In 1870, Henry Durant, founder of the College of California, became its first president. With the completion of North and South Halls in 1873, the university relocated to its Berkeley location with 167 male and 22 female students.

    Beginning in 1891, Phoebe Apperson Hearst made several large gifts to Berkeley, funding a number of programs and new buildings and sponsoring, in 1898, an international competition in Antwerp, Belgium, where French architect Émile Bénard submitted the winning design for a campus master plan.

    20th century

    In 1905, the University Farm was established near Sacramento, ultimately becoming the University of California-Davis. In 1919, Los Angeles State Normal School became the southern branch of the University, which ultimately became the University of California-Los Angeles. By the 1920s, the number of campus buildings had grown substantially and included twenty structures designed by architect John Galen Howard.

    In 1917, one of the nation’s first ROTC programs was established at Berkeley and its School of Military Aeronautics began training pilots, including Gen. Jimmy Doolittle. Berkeley ROTC alumni include former Secretary of Defense Robert McNamara and Army Chief of Staff Frederick C. Weyand as well as 16 other generals. In 1926, future fleet admiral Chester W. Nimitz established the first Naval ROTC unit at Berkeley.

    In the 1930s, Ernest Lawrence helped establish the Radiation Laboratory (now DOE’s Lawrence Berkeley National Laboratory (US)) and invented the cyclotron, which won him the Nobel physics prize in 1939. Using the cyclotron, Berkeley professors and Berkeley Lab researchers went on to discover 16 chemical elements—more than any other university in the world. In particular, during World War II and following Glenn Seaborg’s then-secret discovery of plutonium, Ernest Orlando Lawrence’s Radiation Laboratory began to contract with the U.S. Army to develop the atomic bomb. Physics professor J. Robert Oppenheimer was named scientific head of the Manhattan Project in 1942. Along with the Lawrence Berkeley National Laboratory, Berkeley founded and was then a partner in managing two other labs, Los Alamos National Laboratory (1943) and Lawrence Livermore National Laboratory (1952).

    By 1942, the American Council on Education ranked Berkeley second only to Harvard University (US) in the number of distinguished departments.

    In 1952, the University of California reorganized itself into a system of semi-autonomous campuses, with each campus given its own chancellor, and Clark Kerr became Berkeley’s first Chancellor, while Robert Gordon Sproul remained in place as the President of the University of California.

    Berkeley gained a worldwide reputation for political activism in the 1960s. In 1964, the Free Speech Movement organized student resistance to the university’s restrictions on political activities on campus—most conspicuously, student activities related to the Civil Rights Movement. The arrest in Sproul Plaza of Jack Weinberg, a recent Berkeley alumnus and chair of Campus CORE, in October 1964, prompted a series of student-led acts of formal remonstrance and civil disobedience that ultimately gave rise to the Free Speech Movement, a movement that would prevail and serve as precedent for student opposition to America’s involvement in the Vietnam War.

    In 1982, the Mathematical Sciences Research Institute (MSRI) was established on campus with support from the National Science Foundation and at the request of three Berkeley mathematicians — Shiing-Shen Chern, Calvin Moore and Isadore M. Singer. The institute is now widely regarded as a leading center for collaborative mathematical research, drawing thousands of visiting researchers from around the world each year.

    21st century

    In the current century, Berkeley has become less politically active and more focused on entrepreneurship and fundraising, especially for STEM disciplines.

    Modern Berkeley students are less politically radical, with a greater percentage of moderates and conservatives than in the 1960s and 70s. Democrats outnumber Republicans on the faculty by a ratio of 9:1. On the whole, Democrats outnumber Republicans on American university campuses by a ratio of 10:1.

    In 2007, the Energy Biosciences Institute was established with funding from BP, and Stanley Hall, a research facility and headquarters for the California Institute for Quantitative Biosciences, opened. The next few years saw the dedication of the Center for Biomedical and Health Sciences, funded by a lead gift from billionaire Li Ka-shing; the opening of Sutardja Dai Hall, home of the Center for Information Technology Research in the Interest of Society; and the unveiling of Blum Hall, housing the Blum Center for Developing Economies. Supported by a grant from alumnus James Simons, the Simons Institute for the Theory of Computing was established in 2012. In 2014, Berkeley and its sister campus, the University of California-San Francisco (US), established the Innovative Genomics Institute, and, in 2020, an anonymous donor pledged $252 million to help fund a new center for computing and data science.

    Since 2000, Berkeley alumni and faculty have received 40 Nobel Prizes, behind only Harvard and Massachusetts Institute of Technology (US) among US universities; five Turing Awards, behind only MIT and Stanford; and five Fields Medals, second only to Princeton University (US). According to PitchBook, Berkeley ranks second, just behind Stanford University, in producing VC-backed entrepreneurs.

    UC Berkeley Seal

     
  • richardmitnick 12:10 pm on January 18, 2022
    Tags: "Protein controlled by both light and temperature can inform cell signal pathways", , , , , Compared to previous probes this research was based on a single protein called BcLOV4., Laser Technology, , The field of optogenetics relies on such proteins to better understand and manipulate these processes., , The scientists serendipitously discovered that BcLOV4 could sense not only light but also temperature., This research will open new horizons for both basic science and translational research.   

    From Penn Today and The Penn School of Engineering and Applied Science (US): “Protein controlled by both light and temperature can inform cell signal pathways” 

    From Penn Today

    and

    The Penn School of Engineering and Applied Science (US)

    at


    University of Pennsylvania

    January 14, 2022
    Melissa Pappas

    Most organisms have proteins that react to light. Even creatures that don’t have eyes or other visual organs use these proteins to regulate many cellular processes, such as transcription, translation, cell growth and cell survival.

    The brighter edges of the cells in the middle and upper right panels show the optogenetic proteins collecting at the membrane after light exposure. At higher temperatures, however, the proteins become rapidly inactivated and thus do not stay at the membrane, resulting in the duller edges seen in the bottom right panel. Image: Penn Engineering Today.

    The field of optogenetics relies on such proteins to better understand and manipulate these processes. Using lasers and genetically engineered versions of these naturally occurring proteins, known as probes, researchers can precisely activate and deactivate a variety of cellular pathways, just like flipping a switch.

    Now, Penn Engineering researchers have described a new type of optogenetic protein that can be controlled not only by light, but also by temperature, allowing for a higher degree of control in the manipulation of cellular pathways. The research will open new horizons for both basic science and translational research.

    Lukasz Bugaj, assistant professor in bioengineering, Bomyi Lim, assistant professor in chemical and biomolecular engineering, Brian Chow, associate professor in bioengineering, and graduate students William Benman in Bugaj’s lab, Hao Deng in Lim’s lab, and Erin Berlew and Ivan Kuznetsov in Chow’s lab, published their study in Nature Chemical Biology. Arndt Siekmann, associate professor of cell and developmental biology at the Perelman School of Medicine, and Caitlyn Parker, a research technician in his lab, also contributed to this research.

    “Compared to previous probes, ours were based on a single protein called BcLOV4, which was recently described by Brian Chow’s lab,” says Bugaj. “As a single protein, BcLOV4 can stimulate signals in a manner that required multiple proteins in previous approaches, thus making it simpler and easier to use.”

    The authors successfully showed that BcLOV4-based probes could stimulate the Ras and PI3K pathways in mammalian cells, as well as in zebrafish and fruit flies, two common model organisms.

    “However, in the course of our experiments, we serendipitously discovered that BcLOV4 could sense not only light, but also temperature,” says Bugaj. “As far as we know, this type of dual light and temperature sensitivity is a completely new feature for photosensory proteins.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Pennsylvania School of Engineering and Applied Science is an undergraduate and graduate school of the University of Pennsylvania. The School offers programs that emphasize hands-on study of engineering fundamentals (with an offering of approximately 300 courses) while encouraging students to leverage the educational offerings of the broader University. Engineering students can also take advantage of research opportunities through interactions with Penn’s School of Medicine, School of Arts and Sciences and the Wharton School.

    Penn Engineering offers bachelor’s, master’s and Ph.D. degree programs in contemporary fields of engineering study. The nationally ranked bioengineering department offers the School’s most popular undergraduate degree program. The Jerome Fisher Program in Management and Technology, offered in partnership with the Wharton School, allows students to simultaneously earn a Bachelor of Science degree in Economics as well as a Bachelor of Science degree in Engineering. SEAS also offers several master’s programs, which include: Executive Master’s in Technology Management, Master of Biotechnology, Master of Computer and Information Technology, Master of Computer and Information Science and a Master of Science in Engineering in Telecommunications and Networking.

    History

    The study of engineering at the University of Pennsylvania can be traced back to 1850 when the University trustees adopted a resolution providing for a professorship of “Chemistry as Applied to the Arts”. In 1852, the study of engineering was further formalized with the establishment of the School of Mines, Arts and Manufactures. The first Professor of Civil and Mining Engineering was appointed in 1852. The first graduate of the school received his Bachelor of Science degree in 1854. Since that time, the school has grown to six departments. In 1973, the school was renamed as the School of Engineering and Applied Science.

    The early growth of the school benefited from the generosity of two Philadelphians: John Henry Towne and Alfred Fitler Moore. Towne, a mechanical engineer and railroad developer, bequeathed the school a gift of $500,000 upon his death in 1875. The main administration building for the school still bears his name. Moore was a successful entrepreneur who made his fortune manufacturing telegraph cable. A 1923 gift from Moore established the Moore School of Electrical Engineering, which is the birthplace of the first electronic general-purpose Turing-complete digital computer, ENIAC, in 1946.

    During the latter half of the 20th century the school continued to break new ground. In 1958, Barbara G. Mandell became the first woman to enroll as an undergraduate in the School of Engineering. In 1965, the university acquired two sites that were formerly used as U.S. Army Nike Missile Base (PH 82L and PH 82R) and created the Valley Forge Research Center. In 1976, the Management and Technology Program was created. In 1990, a Bachelor of Applied Science in Biomedical Science and Bachelor of Applied Science in Environmental Science were first offered, followed by a master’s degree in Biotechnology in 1997.

    The school continues to expand with the addition of the Melvin and Claire Levine Hall for computer science in 2003, Skirkanich Hall for bioengineering in 2006, and the Krishna P. Singh Center for Nanotechnology in 2013.

    Academics

    Penn’s School of Engineering and Applied Science is organized into six departments:

    Bioengineering
    Chemical and Biomolecular Engineering
    Computer and Information Science
    Electrical and Systems Engineering
    Materials Science and Engineering
    Mechanical Engineering and Applied Mechanics

    The school’s Department of Bioengineering, originally named Biomedical Electronic Engineering, consistently garners a top-ten ranking at both the undergraduate and graduate level from U.S. News & World Report. The department also houses the George H. Stephenson Foundation Educational Laboratory & Bio-MakerSpace (aka Biomakerspace) for training undergraduate through PhD students. It is Philadelphia’s and Penn’s only Bio-MakerSpace and it is open to the Penn community, encouraging a free flow of ideas, creativity, and entrepreneurship between Bioengineering students and students throughout the university.

    Founded in 1893, the Department of Chemical and Biomolecular Engineering is “America’s oldest continuously operating degree-granting program in chemical engineering.”

    The Department of Electrical and Systems Engineering is recognized for its research in electroscience, systems science and network systems and telecommunications.

    Originally established in 1946 as the School of Metallurgical Engineering, the Materials Science and Engineering Department “includes cutting edge programs in nanoscience and nanotechnology, biomaterials, ceramics, polymers, and metals.”

    The Department of Mechanical Engineering and Applied Mechanics draws its roots from the Department of Mechanical and Electrical Engineering, which was established in 1876.

    Each department houses one or more degree programs. The Chemical and Biomolecular Engineering, Materials Science and Engineering, and Mechanical Engineering and Applied Mechanics departments each house a single degree program.

    Bioengineering houses two programs (both a Bachelor of Science in Engineering degree as well as a Bachelor of Applied Science degree). Electrical and Systems Engineering offers four Bachelor of Science in Engineering programs: Electrical Engineering, Systems Engineering, Computer Engineering, and the Networked & Social Systems Engineering, the latter two of which are co-housed with Computer and Information Science (CIS). The CIS department, like Bioengineering, offers Computer and Information Science programs under both bachelor programs. CIS also houses Digital Media Design, a program jointly operated with PennDesign.

    Research

    Penn’s School of Engineering and Applied Science is a research institution. SEAS research strives to advance science and engineering and to achieve a positive impact on society. Faculty at Penn’s School of Engineering and Applied Science have created several centers for advanced study.

    U Penn campus

    Academic life at University of Pennsylvania (US) is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

    The University of Pennsylvania (US) is a private Ivy League research university in Philadelphia, Pennsylvania. The university claims a founding date of 1740 and is one of the nine colonial colleges chartered prior to the U.S. Declaration of Independence. Benjamin Franklin, Penn’s founder and first president, advocated an educational program that trained leaders in commerce, government, and public service, similar to a modern liberal arts curriculum.

    Penn has four undergraduate schools as well as twelve graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences; the School of Engineering and Applied Science; the Wharton School; and the School of Nursing. Penn’s “One University Policy” allows students to enroll in classes in any of Penn’s twelve schools. Among its highly ranked graduate and professional schools are a law school whose first professor wrote the first draft of the United States Constitution, the first school of medicine in North America (Perelman School of Medicine, 1765), and the first collegiate business school (Wharton School, 1881).

    Penn is also home to the first “student union” building and organization (Houston Hall, 1896), the first Catholic student club in North America (Newman Center, 1893), the first double-decker college football stadium (Franklin Field, 1924 when second deck was constructed), and Morris Arboretum, the official arboretum of the Commonwealth of Pennsylvania. The first general-purpose electronic computer (ENIAC) was developed at Penn and formally dedicated in 1946. In 2019, the university had an endowment of $14.65 billion, the sixth-largest endowment of all universities in the United States, as well as a research budget of $1.02 billion. The university’s athletics program, the Quakers, fields varsity teams in 33 sports as a member of the NCAA Division I Ivy League conference.

    As of 2018, distinguished alumni and/or Trustees include three U.S. Supreme Court justices; 32 U.S. senators; 46 U.S. governors; 163 members of the U.S. House of Representatives; eight signers of the Declaration of Independence and seven signers of the U.S. Constitution (four of whom signed both representing two-thirds of the six people who signed both); 24 members of the Continental Congress; 14 foreign heads of state and two presidents of the United States, including Donald Trump. As of October 2019, 36 Nobel laureates; 80 members of the American Academy of Arts and Sciences(US); 64 billionaires; 29 Rhodes Scholars; 15 Marshall Scholars and 16 Pulitzer Prize winners have been affiliated with the university.

    History

    The University of Pennsylvania considers itself the fourth-oldest institution of higher education in the United States, though this is contested by Princeton University (US) and Columbia University (US). The university also considers itself the first university in the United States with both undergraduate and graduate studies.

    In 1740, a group of Philadelphians joined together to erect a great preaching hall for the traveling evangelist George Whitefield, who toured the American colonies delivering open-air sermons. The building was designed and built by Edmund Woolley and was the largest building in the city at the time, drawing thousands of people the first time a sermon was preached in it. It was initially planned to serve as a charity school as well, but a lack of funds forced plans for the chapel and school to be suspended. According to Franklin’s autobiography, it was in 1743 when he first had the idea to establish an academy, “thinking the Rev. Richard Peters a fit person to superintend such an institution”. However, Peters declined a casual inquiry from Franklin and nothing further was done for another six years. In the fall of 1749, now more eager to create a school to educate future generations, Benjamin Franklin circulated a pamphlet titled Proposals Relating to the Education of Youth in Pensilvania, his vision for what he called a “Public Academy of Philadelphia”. Unlike the other colonial colleges that existed in 1749—Harvard University (US), William & Mary (US), Yale University (US), and The College of New Jersey (US)—Franklin’s new school would not focus merely on education for the clergy. He advocated an innovative concept of higher education, one which would teach both the ornamental knowledge of the arts and the practical skills necessary for making a living and doing public service. The proposed program of study could have become the nation’s first modern liberal arts curriculum, although it was never implemented because Anglican priest William Smith (1727-1803), who became the first provost, and other trustees strongly preferred the traditional curriculum.

    Franklin assembled a board of trustees from among the leading citizens of Philadelphia, the first such non-sectarian board in America. At the first meeting of the 24 members of the board of trustees on November 13, 1749, the issue of where to locate the school was a prime concern. Although a lot across Sixth Street from the old Pennsylvania State House (later renamed and famously known since 1776 as “Independence Hall”), was offered without cost by James Logan, its owner, the trustees realized that the building erected in 1740, which was still vacant, would be an even better site. The original sponsors of the dormant building still owed considerable construction debts and asked Franklin’s group to assume their debts and, accordingly, their inactive trusts. On February 1, 1750, the new board took over the building and trusts of the old board. On August 13, 1751, the “Academy of Philadelphia”, using the great hall at 4th and Arch Streets, took in its first secondary students. A charity school also was chartered on July 13, 1753 by the intentions of the original “New Building” donors, although it lasted only a few years. On June 16, 1755, the “College of Philadelphia” was chartered, paving the way for the addition of undergraduate instruction. All three schools shared the same board of trustees and were considered to be part of the same institution. The first commencement exercises were held on May 17, 1757.

    The institution of higher learning was known as the College of Philadelphia from 1755 to 1779. In 1779, not trusting then-provost the Reverend William Smith’s “Loyalist” tendencies, the revolutionary State Legislature created a University of the State of Pennsylvania. The result was a schism, with Smith continuing to operate an attenuated version of the College of Philadelphia. In 1791, the legislature issued a new charter, merging the two institutions into a new University of Pennsylvania with twelve men from each institution on the new board of trustees.

    Penn has three claims to being the first university in the United States, according to university archives director Mark Frazier Lloyd: the 1765 founding of the first medical school in America made Penn the first institution to offer both “undergraduate” and professional education; the 1779 charter made it the first American institution of higher learning to take the name of “University”; and existing colleges were established as seminaries (although, as detailed earlier, Penn adopted a traditional seminary curriculum as well).

    After being located in downtown Philadelphia for more than a century, the campus was moved across the Schuylkill River to property purchased from the Blockley Almshouse in West Philadelphia in 1872, where it has since remained in an area now known as University City. Although Penn began operating as an academy or secondary school in 1751 and obtained its collegiate charter in 1755, it initially designated 1750 as its founding date; this is the year that appears on the first iteration of the university seal. Sometime later in its early history, Penn began to consider 1749 as its founding date and this year was referenced for over a century, including at the centennial celebration in 1849. In 1899, the board of trustees voted to adjust the founding date earlier again, this time to 1740, the date of “the creation of the earliest of the many educational trusts the University has taken upon itself”. The board of trustees voted in response to a three-year campaign by Penn’s General Alumni Society to retroactively revise the university’s founding date to appear older than Princeton University, which had been chartered in 1746.

    Research, innovations and discoveries

    Penn is classified as an “R1” doctoral university: “Highest research activity.” Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn’s research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health(US).

    In line with its well-known interdisciplinary tradition, Penn’s research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing; the Center for Global Women’s Health at the Nursing School; the $13 million Morris Arboretum’s Horticulture Center; the $15 million Jay H. Baker Retailing Center at Wharton; and the $13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers hosting a research community of over 4,300 faculty and over 1,100 postdoctoral fellows, 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research President Amy Gutmann established the “Penn Integrates Knowledge” title awarded to selected Penn professors “whose research and teaching exemplify the integration of knowledge”. These professors hold endowed professorships and joint appointments between Penn’s schools.

    Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, only behind Columbia University(US) and Cornell University(US) (Harvard University(US) did not report data). It also has one of the highest numbers of post-doctoral appointees (933 in number for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale University(US)) and tenth nationally.

    In most disciplines Penn professors’ productivity is among the highest in the nation and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology. According to the National Research Council nearly three-quarters of Penn’s 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields.

    Penn’s research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school; the first university teaching hospital; the first business school; and the first student union, Penn was also the cradle of other significant developments. In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973 and it regularly introduced novel curricula for which BusinessWeek wrote, “Wharton is on the crest of a wave of reinvention and change in management education”.

    Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering.

    ENIAC UPenn

    It was here also where the world’s first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the Rubella and Hepatitis B vaccines were developed at Penn; the discovery of cancer’s link with genes; cognitive therapy; Retin-A (the cream used to treat acne), Resistin; the Philadelphia gene (linked to chronic myelogenous leukemia) and the technology behind PET Scans were all discovered by Penn Med researchers. More recent gene research has led to the discovery of the genes for fragile X syndrome, the most common form of inherited mental retardation; spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs.

    Conductive polymer was also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets’s method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones) and the “Wharton Model” developed by Nobel-laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations also belonged to Penn professor Robert Eilers, who put it into practice during then-President Nixon’s health reform in the 1970s.

    International partnerships

    Students can study abroad for a semester or a year at partner institutions such as the London School of Economics(UK), University of Barcelona [Universitat de Barcelona](ES), Paris Institute of Political Studies [Institut d’études politiques de Paris](FR), University of Queensland(AU), University College London(UK), King’s College London(UK), Hebrew University of Jerusalem(IL) and University of Warwick(UK).

     
  • richardmitnick 5:27 pm on January 14, 2022 Permalink | Reply
    Tags: "Earth’s interior is cooling faster than expected", , Laser Technology, Measuring the thermal conductivity of bridgmanite in the laboratory under the pressure and temperature conditions that prevail inside the Earth., Measuring the thermal conductivity of bridgmanite in the laboratory., Scientists are studying the thermal conductivity of the boundary between the Earth’s core and mantle. The boundary layer is formed mainly of the mineral bridgmanite.,   

    From The Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH): “Earth’s interior is cooling faster than expected” 

    From The Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH)

    1.14.22
    Peter Rüegg

    Researchers at ETH Zürich have demonstrated in the lab how well a mineral common at the boundary between the Earth’s core and mantle conducts heat. This leads them to suspect that the Earth’s heat may dissipate sooner than previously thought.

    1
    The Earth’s core gives off heat to the mantle (orange to dark red), which contributes to the slow cooling of the Earth. Photograph: iStock / Rost-9D.

    The evolution of our Earth is the story of its cooling: 4.5 billion years ago, extreme temperatures prevailed on the surface of the young Earth, and it was covered by a deep ocean of magma. Over millions of years, the planet’s surface cooled to form a brittle crust. However, the enormous thermal energy emanating from the Earth’s interior set dynamic processes in motion, such as mantle convection, plate tectonics and volcanism.

    The Tectonic Plates of the world were mapped in 1996, Geological Survey (US).

    Still unanswered, though, are the questions of how fast the Earth cooled and how long it might take for this ongoing cooling to bring the aforementioned heat-driven processes to a halt.

    One possible answer may lie in the thermal conductivity of the minerals that form the boundary between the Earth’s core and mantle.

    This boundary layer is relevant because it is here that the viscous rock of the Earth’s mantle is in direct contact with the hot iron-nickel melt of the planet’s outer core. The temperature gradient between the two layers is very steep, so there is potentially a lot of heat flowing here. The boundary layer is formed mainly of the mineral bridgmanite. However, researchers have a hard time estimating how much heat this mineral conducts from the Earth’s core to the mantle because experimental verification is very difficult.

    Now, ETH Professor Motohiko Murakami and his colleagues from The Carnegie Institution for Science (US) have developed a sophisticated measuring system that enables them to measure the thermal conductivity of bridgmanite in the laboratory under the pressure and temperature conditions that prevail inside the Earth. For the measurements, they used a recently developed optical absorption measurement system in a diamond unit heated with a pulsed laser.

    2
    Measuring device for determining the thermal conductivity of bridgmanite under high pressure and extreme temperature. Credit Murakami M, et al, 2021.

    “This measurement system let us show that the thermal conductivity of bridgmanite is about 1.5 times higher than assumed,” Murakami says. This suggests that the heat flow from the core into the mantle is also higher than previously thought. Greater heat flow, in turn, increases mantle convection and accelerates the cooling of the Earth. This may cause plate tectonics, which is kept going by the convective motions of the mantle, to decelerate faster than researchers were expecting based on previous heat conduction values.
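    To see why a 1.5-fold jump in conductivity matters, recall that conductive heat flux follows Fourier’s law, q = k·ΔT/d, so the flux scales linearly with the conductivity k. The short Python sketch below illustrates that scaling; the temperature drop and layer thickness are rough placeholder assumptions, not values from the study.

    # Illustrative only: Fourier's law says conductive heat flux scales linearly
    # with thermal conductivity k. The temperature drop and layer thickness are
    # rough placeholder values, not figures from the ETH Zurich study.

    def heat_flux(k, delta_T, thickness):
        """Steady-state conductive heat flux in W/m^2 across a layer."""
        return k * delta_T / thickness

    delta_T = 1000.0        # assumed temperature drop across the boundary layer, K
    thickness = 200e3       # assumed boundary-layer thickness, m
    k_old = 4.0             # placeholder for the previously assumed conductivity, W/(m*K)
    k_new = 1.5 * k_old     # the reported result: about 1.5 times higher

    print(heat_flux(k_old, delta_T, thickness))  # baseline flux
    print(heat_flux(k_new, delta_T, thickness))  # 1.5x the baseline, since flux is linear in k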

    Murakami and his colleagues have also shown that rapid cooling of the mantle will change the stable mineral phases at the core-mantle boundary. When it cools, bridgmanite turns into the mineral post-perovskite. But as soon as post-perovskite appears at the core-mantle boundary and begins to dominate, the cooling of the mantle might indeed accelerate even further, the researchers estimate, since this mineral conducts heat even more efficiently than bridgmanite.

    “Our results could give us a new perspective on the evolution of the Earth’s dynamics. They suggest that Earth, like the other rocky planets Mercury and Mars, is cooling and becoming inactive much faster than expected,” Murakami explains.

    However, he cannot say how long it will take, for example, for convection currents in the mantle to stop. “We still don’t know enough about these kinds of events to pin down their timing.” To do that calls first for a better understanding of how mantle convection works in spatial and temporal terms. Moreover, scientists need to clarify how the decay of radioactive elements in the Earth’s interior – one of the main sources of heat – affects the dynamics of the mantle.

    Science paper:
    Earth and Planetary Science Letters

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus

    The Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH) is a public research university in the city of Zürich, Switzerland. Founded by the Swiss Federal Government in 1854 with the stated mission to educate engineers and scientists, the school focuses exclusively on science, technology, engineering and mathematics. Like its sister institution The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH), it is part of The Swiss Federal Institutes of Technology Domain (ETH Domain), part of The Swiss Federal Department of Economic Affairs, Education and Research [EAER] [Eidgenössisches Departement für Wirtschaft, Bildung und Forschung] [Département fédéral de l’économie, de la formation et de la recherche] (CH).

    The university is an attractive destination for international students thanks to low tuition fees of 809 CHF per semester, PhD and graduate salaries that are amongst the world’s highest, and a world-class reputation in academia and industry. There are currently 22,200 students from over 120 countries, of which 4,180 are pursuing doctoral degrees. In the 2021 edition of the QS World University Rankings ETH Zürich is ranked 6th in the world and 8th by the Times Higher Education World Rankings 2020. In the 2020 QS World University Rankings by subject it is ranked 4th in the world for engineering and technology (2nd in Europe) and 1st for earth & marine science.

    As of November 2019, 21 Nobel laureates, 2 Fields Medalists, 2 Pritzker Prize winners, and 1 Turing Award winner have been affiliated with the Institute, including Albert Einstein. Other notable alumni include John von Neumann and Santiago Calatrava. It is a founding member of the IDEA League and the International Alliance of Research Universities (IARU) and a member of the CESAER network.

    ETH Zürich was founded on 7 February 1854 by the Swiss Confederation and began giving its first lectures on 16 October 1855 as a polytechnic institute (eidgenössische polytechnische Schule) at various sites throughout the city of Zurich. It was initially composed of six faculties: architecture, civil engineering, mechanical engineering, chemistry, forestry, and an integrated department for the fields of mathematics, natural sciences, literature, and social and political sciences.

    It is locally still known as Polytechnikum, or simply as Poly, derived from the original name eidgenössische polytechnische Schule, which translates to “federal polytechnic school”.

    ETH Zürich is a federal institute (i.e., under direct administration by the Swiss government), whereas The University of Zürich [Universität Zürich ] (CH) is a cantonal institution. The decision for a new federal university was heavily disputed at the time; the liberals pressed for a “federal university”, while the conservative forces wanted all universities to remain under cantonal control, worried that the liberals would gain more political power than they already had. In the beginning, both universities were co-located in the buildings of the University of Zürich.

    From 1905 to 1908, under the presidency of Jérôme Franel, the course program of ETH Zürich was restructured to that of a real university and ETH Zürich was granted the right to award doctorates. In 1909 the first doctorates were awarded. In 1911, it was given its current name, Eidgenössische Technische Hochschule. In 1924, another reorganization structured the university in 12 departments. However, it now has 16 departments.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    Reputation and ranking

    ETH Zürich is ranked among the top universities in the world. Typically, popular rankings place the institution as the best university in continental Europe and ETH Zürich is consistently ranked among the top 1-5 universities in Europe, and among the top 3-10 best universities of the world.

    Historically, ETH Zürich has achieved its reputation particularly in the fields of chemistry, mathematics and physics. There are 32 Nobel laureates who are associated with ETH Zürich, the most recent of whom is Richard F. Heck, awarded the Nobel Prize in chemistry in 2010. Albert Einstein is perhaps its most famous alumnus.

    In 2018, the QS World University Rankings placed ETH Zürich at 7th overall in the world. In 2015, ETH Zürich was ranked 5th in the world in Engineering, Science and Technology, just behind the Massachusetts Institute of Technology(US), Stanford University(US) and University of Cambridge(UK). In 2015, ETH Zürich also ranked 6th in the world in Natural Sciences, and in 2016 ranked 1st in the world for Earth & Marine Sciences for the second consecutive year.

    In 2016, Times Higher Education World University Rankings ranked ETH Zürich 9th overall in the world and 8th in the world in the field of Engineering & Technology, just behind the Massachusetts Institute of Technology(US), Stanford University(US), California Institute of Technology(US), Princeton University(US), University of Cambridge(UK), Imperial College London(UK) and University of Oxford(UK) .

    In a comparison of Swiss universities by swissUP Ranking and in rankings published by CHE comparing the universities of German-speaking countries, ETH Zürich traditionally is ranked first in natural sciences, computer science and engineering sciences.

    In the survey CHE ExcellenceRanking on the quality of Western European graduate school programs in the fields of biology, chemistry, physics and mathematics, ETH Zürich was assessed as one of the three institutions to have excellent programs in all the considered fields, the other two being Imperial College London (UK) and the University of Cambridge (UK).

     
  • richardmitnick 10:38 am on January 12, 2022 Permalink | Reply
    Tags: , At the dawn of the 20th century a new theory of matter and energy was emerging., , Could a quantum worldview prove useful outside the lab?, Information Theory: a blend of math and computer science, Laser Technology, One of the main questions quantum mechanics addressed was the nature of light-particle or wave, , Peter Shor: a fast-factoring algorithm for a quantum computer-a computer whose bits exist in superposition and can be entangled., Physicists developed a new system of mechanics to describe what seemed to be a quantized and uncertain probabilistic world-Heisenberg's Uncertainty Principle, , , , , , , Shor’s algorithm is of particular interest in encryption because of the difficulty of identifying the prime factors of large numbers., Shor’s algorithm was designed to quickly divide large numbers into their prime factors., The second quantum revolution also relies on and encompasses new ways of using technology to manipulate matter at the quantum level., Today’s quantum computers are not yet advanced enough to implement Shor’s algorithm., , Vacuum tubes, What changed was Shor’s introduction of error-correcting codes.   

    From Symmetry: “The second quantum revolution” 

    Symmetry Mag

    From Symmetry

    01/12/22
    Daniel Garisto

    1
    Illustration by Ana Kova / Sandbox Studio, Chicago.

    Inventions like the transistor and laser changed the world. What changes will the second quantum revolution bring?

    For physicists trying to harness the power of electricity, no tool was more important than the vacuum tube. This lightbulb-like device controlled the flow of electricity and could amplify signals. In the early 20th century, vacuum tubes were used in radios, televisions and long-distance telephone networks.

    But vacuum tubes had significant drawbacks: They generated heat; they were bulky; and they had a propensity to burn out. Physicists at Bell Labs, AT&T’s research arm, were interested in finding a replacement.

    Applying their knowledge of quantum mechanics—specifically how electrons flowed between materials with electrical conductivity—they found a way to mimic the function of vacuum tubes without those shortcomings.

    They had invented the transistor. At the time, the invention did not grace the front page of any major news publications. Even the scientists themselves couldn’t have appreciated just how important their device would be.

    First came the transistor radio, popularized in large part by the new Japanese company Sony. Spreading portable access to radio broadcasts changed music and connected disparate corners of the world.

    Transistors then paved the way for NASA’s Apollo Project, which first took humans to the moon. And perhaps most importantly, transistors were made smaller and smaller, shrinking room-sized computers and magnifying their power to eventually create laptops and smartphones.

    These quantum-inspired devices are central to every single modern electronic application that uses some computing power, such as cars, cellphones and digital cameras. You would not be reading this sentence without transistors, which are an important part of what is now called the First Quantum Revolution.

    Quantum physicists Jonathan Dowling and Gerard Milburn coined the term “quantum revolution” in a 2002 paper [The Royal Society]. In it, they argue that we have now entered a new era, a Second Quantum Revolution. “It just dawned on me that actually there was a whole new technological frontier opening up,” says Milburn, professor emeritus at The University of Queensland (AU).

    This second quantum revolution is defined by developments in technologies like quantum computing and quantum sensing, brought on by a deeper understanding of the quantum world and precision control down to the level of individual particles.

    A quantum understanding

    At the dawn of the 20th century a new theory of matter and energy was emerging. Unsatisfied with classical explanations about the strange behavior of particles, physicists developed a new system of mechanics to describe what seemed to be a quantized, uncertain, probabilistic world.

    One of the main questions quantum mechanics addressed was the nature of light. Eighteenth-century physicists believed light was a particle. Nineteenth-century physicists proved it had to be a wave. Twentieth-century physicists resolved the problem by redefining particles using the principles of quantum mechanics. They proposed that particles of light, now called photons, had some probability of existing in a given location—a probability that could be represented as a wave and even experience interference like one.

    This newfound picture of the world helped make sense of results such as those of the double-slit experiment, which showed that particles like electrons and photons could behave as if they were waves.
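    As a rough illustration of that wave picture, the Python sketch below adds the amplitudes from two idealized point slits and squares the result to get a detection probability, which produces the familiar bright-and-dark fringes. The wavelength, slit spacing and screen distance are arbitrary assumed values chosen only to make the numbers easy to read.

    # Toy two-slit sketch (idealized point slits, small-angle approximation):
    # each slit contributes a wave amplitude; the detection probability is the
    # squared magnitude of the summed amplitudes, giving interference fringes.
    import numpy as np

    wavelength = 500e-9        # assumed wavelength, 500 nm
    slit_separation = 100e-6   # assumed slit spacing, 100 micrometres
    screen_distance = 1.0      # assumed slit-to-screen distance, 1 m

    x = np.linspace(-1e-2, 1e-2, 2001)                  # screen positions, m
    phase = 2 * np.pi * slit_separation * x / (screen_distance * wavelength)

    amplitude = 1.0 + np.exp(1j * phase)                # equal contributions from the two slits
    probability = np.abs(amplitude) ** 2                # bright and dark fringes

    print("fringe spacing (m):", wavelength * screen_distance / slit_separation)  # 5 mm
    print("max / min:", probability.max().round(2), probability.min().round(2))   # ~4 and ~0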

    But could a quantum worldview prove useful outside the lab?

    At first, “quantum was usually seen as just a source of mystery and confusion and all sorts of strange paradoxes,” Milburn says.

    But after World War II, people began figuring out how to use those paradoxes to get things done. Building on new quantum ideas about the behavior of electrons in metals and other materials, Bell Labs researchers William Shockley, John Bardeen and Walter Brattain created the first transistors. They realized that sandwiching semiconductors together could create a device that would allow electrical current to flow in one direction, but not another. Other technologies, such as atomic clocks and the nuclear magnetic resonance used for MRI scans, were also products of the first quantum revolution.

    Another important and, well, visible quantum invention was the laser.

    In the 1950s, optical physicists knew that hitting certain kinds of atoms with a few photons at the right energy could lead them to emit more photons with the same energy and direction as the initial photons. This effect would cause a cascade of photons, creating a stable, straight beam of light unlike anything seen in nature. Today, lasers are ubiquitous, used in applications from laser pointers to barcode scanners to life-saving medical techniques.

    All of these devices were made possible by studies of the quantum world. Both the laser and transistor rely on an understanding of quantized atomic energy levels. Milburn and Dowling suggest that the technologies of the first quantum revolution are unified by “the idea that matter particles sometimes behaved like waves, and that light waves sometimes acted like particles.”

    For the first time, scientists were using their understanding of quantum mechanics to create new tools that could be used in the classical world.

    The second quantum revolution

    Many of these developments were described to the public without resorting to the word “quantum,” as this Bell Labs video about the laser attests.

    One reason for the disconnect was that the first quantum revolution didn’t make full use of quantum mechanics. “The systems were too noisy. In a sense, the full richness of quantum mechanics wasn’t really accessible,” says Ivan Deutsch, a quantum physicist at The University of New Mexico (US). “You can get by with a fairly classical picture.”

    The stage for the second quantum revolution was set in the 1960s, when the Northern Irish physicist John Stewart Bell [B.Sc. The Queen’s University of Belfast (NIR); Ph.D. The University of Birmingham (UK); The European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung] (CH) [CERN]; Stanford University (US)] shook the foundations of quantum mechanics. Bell proposed that entangled particles were correlated in strange quantum ways and could not be explained with so-called “hidden variables.” Tests performed in the ’70s and ’80s confirmed that measuring one entangled particle really did seem to determine the state of the other, faster than any signal could travel between the two.

    The other critical ingredient for the second quantum revolution was information theory, a blend of math and computer science developed by pioneers like Claude Shannon and Alan Turing. In 1994, combining new insight into the foundations of quantum mechanics with information theory led the mathematician Peter Shor to introduce a fast-factoring algorithm for a quantum computer, a computer whose bits exist in superposition and can be entangled.
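    To make “bits in superposition that can be entangled” a bit more concrete, here is a minimal numpy sketch (an illustration of the general idea, not anything from the article itself): a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate then entangles it with a second qubit, producing the Bell state (|00⟩ + |11⟩)/√2.

    # Minimal two-qubit statevector sketch (illustrative only): Hadamard on qubit 0
    # creates a superposition; CNOT (control qubit 0, target qubit 1) entangles it
    # with qubit 1, yielding the Bell state (|00> + |11>)/sqrt(2).
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],                 # basis order: |00>, |01>, |10>, |11>
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
    state = np.kron(H, I) @ state                  # superposition on qubit 0
    state = CNOT @ state                           # entangle the two qubits

    print(np.round(state.real, 3))                 # [0.707 0. 0. 0.707]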

    Shor’s algorithm was designed to quickly divide large numbers into their prime factors. Using the algorithm, a quantum computer could solve the problem much more efficiently than a classical one. It was the clearest early demonstration of the worth of quantum computing.

    “It really made the whole idea of quantum information, a new concept that those of us who had been working in related areas, instantly appreciated,” Deutsch says. “Shor’s algorithm suggested the possibilities new quantum tech could have over existing classical tech, galvanizing research across the board.”

    Shor’s algorithm is of particular interest in encryption because the difficulty of identifying the prime factors of large numbers is precisely what keeps data private online. To unlock encrypted information, a computer must know the prime factors of a large number associated with it. Use a large enough number, and the puzzle of guessing its prime factors can take a classical computer thousands of years. With Shor’s algorithm, the guessing game can take just moments.
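    The quantum part of Shor’s algorithm is finding the period of the function f(x) = a^x mod N exponentially faster than a classical machine can; the step that turns a period into factors is entirely classical number theory. The sketch below shows only that classical reduction, finding the period by slow brute force, for the toy values N = 15 and a = 7 (my own illustrative choice, not an example from the article).

    # Classical sketch of the reduction behind Shor's algorithm: factoring N
    # reduces to finding the period r of a^x mod N. Here r is found by brute
    # force, which is exactly the step a quantum computer speeds up.
    from math import gcd

    def find_period(a, N):
        """Smallest r > 0 with a**r % N == 1, by brute force."""
        x, r = a % N, 1
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    N, a = 15, 7                       # toy values for illustration
    assert gcd(a, N) == 1              # a must share no factor with N
    r = find_period(a, N)              # r = 4 here
    if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
        print(gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N))  # 3 5: the factors of 15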

    Today’s quantum computers are not yet advanced enough to implement Shor’s algorithm. But as Deutsch points out, skeptics once doubted a quantum computer was even possible.

    “Because there was a kind of trade-off,” he says. “The kind of exponential increase in computational power that might come from quantum superpositions would be counteracted exactly, by exponential sensitivity to noise.”

    While inventions like the transistor required knowledge of quantum mechanics, the device itself wasn’t in a delicate quantum state, so it could be described semi-classically. Quantum computers, on the other hand, require delicate quantum connections.

    What changed was Shor’s introduction of error-correcting codes. By combining concepts from classical information theory with quantum mechanics, Shor showed that, in theory, even the delicate state of a quantum computer could be preserved.
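    As a loose classical analogy for what error correction buys you (the quantum codes Shor introduced are subtler, since qubits cannot simply be copied or read out mid-computation), the sketch below encodes one bit as three, pushes it through a noisy channel and recovers it by majority vote; the residual error rate drops well below the raw flip rate.

    # Classical analogy only: a 3-bit repetition code corrects any single bit
    # flip by majority vote. Quantum error-correcting codes are more delicate,
    # but the redundancy-plus-correction idea is the same.
    import random

    def encode(bit):
        return [bit, bit, bit]                      # store the bit three times

    def noisy_channel(codeword, p_flip=0.1):
        return [b ^ 1 if random.random() < p_flip else b for b in codeword]

    def decode(codeword):
        return 1 if sum(codeword) >= 2 else 0       # majority vote

    random.seed(0)
    failures = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(10_000))
    print("raw flip rate: 0.1, residual error rate:", failures / 10_000)  # roughly 0.03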

    Beyond quantum computing, the second quantum revolution also relies on and encompasses new ways of using technology to manipulate matter at the quantum level.

    Using lasers, researchers have learned to sap the energy of atoms and cool them. Like a soccer player dribbling a ball up field with a series of taps, lasers can cool atoms to billionths of a degree above absolute zero—far colder than conventional cooling techniques. In 1995, scientists used laser cooling to observe a long-predicted state of matter: the Bose-Einstein condensate.

    Other quantum optical techniques have been developed to make ultra-precise measurements.

    Classical interferometers, like the type used in the famous Michelson-Morley experiment that measured the speed of light in different directions to search for signs of a hypothetical aether, looked at the interference pattern of light. New matter-wave interferometers exploit the principle that everything—not just light—has a wavefunction. Measuring changes in the phase of atoms, which have far shorter wavelengths than light, could give unprecedented control to experiments that attempt to measure the smallest effects, like those of gravity.
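    A quick back-of-the-envelope comparison shows why atoms make such sensitive interferometer probes: the de Broglie wavelength λ = h/(mv) of a slow, heavy atom is orders of magnitude shorter than an optical wavelength. The numbers below assume a rubidium-87-like atom moving at about 1 m/s, purely as an illustration.

    # Back-of-the-envelope comparison (illustrative values only).
    h = 6.626e-34              # Planck constant, J*s
    m_atom = 87 * 1.66e-27     # mass of a rubidium-87-like atom, kg
    v = 1.0                    # assumed atom speed after laser cooling, m/s

    lambda_atom = h / (m_atom * v)     # de Broglie wavelength, ~4.6 nanometres
    lambda_light = 500e-9              # a typical visible wavelength, 500 nanometres

    print(f"atom de Broglie wavelength: {lambda_atom:.2e} m")
    print(f"visible light wavelength:   {lambda_light:.2e} m  (~100x longer)")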

    With laboratories and companies around the world focused on advancements in quantum science and applications, the second quantum revolution has only begun. As Bardeen put it in his Nobel lecture, we may be at another “particularly opportune time … to add another small step in the control of nature for the benefit of [hu]mankind.”

    See the full article here .



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:12 pm on January 11, 2022 Permalink | Reply
    Tags: "Physicists detect a hybrid particle held together by uniquely intense 'glue'", Antiferromagnets, , , Laser Technology, , , , The discovery could offer a route to smaller and faster electronic devices.,   

    From The Massachusetts Institute of Technology (US) : “Physicists detect a hybrid particle held together by uniquely intense ‘glue'” 

    MIT News

    From The Massachusetts Institute of Technology (US)

    January 10, 2022
    Jennifer Chu

    The discovery could offer a route to smaller and faster electronic devices.

    1
    MIT physicists have detected a hybrid particle in an unusual, two-dimensional magnetic material. The hybrid particle is a mashup of an electron and a phonon. Image: Christine Daniloff, MIT.

    In the particle world, sometimes two is better than one. Take, for instance, electron pairs. When two electrons are bound together, they can glide through a material without friction, giving the material special superconducting properties. Such paired electrons, or Cooper pairs, are a kind of hybrid particle — a composite of two particles that behaves as one, with properties that are greater than the sum of its parts.

    Now MIT physicists have detected another kind of hybrid particle in an unusual, two-dimensional magnetic material. They determined that the hybrid particle is a mashup of an electron and a phonon (a quasiparticle that is produced from a material’s vibrating atoms). When they measured the force between the electron and phonon, they found that the glue, or bond, was 10 times stronger than any other electron-phonon hybrid known to date.

    The particle’s exceptional bond suggests that its electron and phonon might be tuned in tandem; for instance, any change to the electron should affect the phonon, and vice versa. In principle, an electronic excitation, such as voltage or light, applied to the hybrid particle could stimulate the electron as it normally would, and also affect the phonon, which influences a material’s structural or magnetic properties. Such dual control could enable scientists to apply voltage or light to a material to tune not just its electrical properties but also its magnetism.

    The results are especially relevant, as the team identified the hybrid particle in nickel phosphorus trisulfide (NiPS3), a two-dimensional material that has attracted recent interest for its magnetic properties. If these properties could be manipulated, for instance through the newly detected hybrid particles, scientists believe the material could one day be useful as a new kind of magnetic semiconductor, which could be made into smaller, faster, and more energy-efficient electronics.

    “Imagine if we could stimulate an electron, and have magnetism respond,” says Nuh Gedik, professor of physics at MIT. “Then you could make devices very different from how they work today.”

    Gedik and his colleagues have published their results today in the journal Nature Communications. His co-authors include Emre Ergeçen, Batyr Ilyas, Dan Mao, Hoi Chun Po, Mehmet Burak Yilmaz, and Senthil Todadri at MIT, along with Junghyun Kim and Je-Geun Park of The Seoul National University [서울대학교](KR).

    Particle sheets

    The field of modern condensed matter physics is focused, in part, on the search for interactions in matter at the nanoscale. Such interactions, between a material’s atoms, electrons, and other subatomic particles, can lead to surprising outcomes, such as superconductivity and other exotic phenomena. Physicists look for these interactions by condensing chemicals onto surfaces to synthesize sheets of two-dimensional materials, which could be made as thin as one atomic layer.

    In 2018, a research group in Korea discovered some unexpected interactions in synthesized sheets of NiPS3, a two-dimensional material that becomes an antiferromagnet at very low temperatures of around 150 kelvins, or -123 degrees Celsius. The microstructure of an antiferromagnet resembles a honeycomb lattice of atoms whose spins are opposite to that of their neighbor. In contrast, a ferromagnetic material is made up of atoms with spins aligned in the same direction.

    In probing NiPS3, that group discovered that an exotic excitation becomes visible when the material is cooled below its antiferromagnetic transition, though the exact nature of the interactions responsible for this was unclear. Another group found signs of a hybrid particle, but its exact constituents and its relationship with this exotic excitation were also not clear.

    Gedik and his colleagues wondered if they might detect the hybrid particle, and tease out the two particles making up the whole, by catching their signature motions with a super-fast laser.

    Magnetically visible

    Normally, the motion of electrons and other subatomic particles is too fast to image, even with the world’s fastest camera. The challenge, Gedik says, is similar to taking a photo of a person running. The resulting image is blurry because the camera’s shutter, which lets in light to capture the image, is not fast enough, and the person is still running in the frame before the shutter can snap a clear picture.

    To get around this problem, the team used an ultrafast laser that emits light pulses lasting only 25 femtoseconds (one femtosecond is 1 millionth of 1 billionth of a second). They split the laser pulse into two separate pulses and aimed them at a sample of NiPS3. The two pulses were set with a slight delay from each other so that the first stimulated, or “kicked” the sample, while the second captured the sample’s response, with a time resolution of 25 femtoseconds. In this way, they were able to create ultrafast “movies” from which the interactions of different particles within the material could be deduced.
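    A toy model helps show how such a delay scan encodes particle motions (this is generic coherent-oscillation bookkeeping, not the team’s actual data or analysis): if the reflectivity change oscillates at a phonon frequency and slowly decays, a Fourier transform of the signal versus pump-probe delay recovers that frequency, which can then be quoted as an energy. The 7.5 THz frequency and 2 ps decay time below are assumed placeholders.

    # Toy pump-probe trace (assumed values, not the experiment's data): a damped
    # oscillation at a phonon frequency, sampled every 25 fs, is Fourier
    # transformed to recover the frequency and the corresponding energy.
    import numpy as np

    dt = 25e-15                              # 25 fs steps, matching the pulse duration
    t = np.arange(0, 5e-12, dt)              # pump-probe delays from 0 to 5 ps
    f_phonon = 7.5e12                        # assumed phonon frequency, Hz
    signal = np.exp(-t / 2e-12) * np.cos(2 * np.pi * f_phonon * t)

    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(t), d=dt)
    f_peak = freqs[np.argmax(spectrum)]      # frequency of the strongest oscillation

    h, e = 6.626e-34, 1.602e-19
    print(f"recovered frequency: {f_peak/1e12:.1f} THz, energy: {h*f_peak/e*1e3:.0f} meV")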

    In particular, they measured the precise amount of light reflected from the sample as a function of time between the two pulses. This reflection should change in a certain way if hybrid particles are present. This turned out to be the case when the sample was cooled below 150 kelvins, when the material becomes antiferromagnetic.

    “We found this hybrid particle was only visible below a certain temperature, when magnetism is turned on,” says Ergeçen.

    To identify the specific constituents of the particle, the team varied the color, or frequency, of the first laser and found that the hybrid particle was visible when the frequency of the reflected light was around a particular type of transition known to happen when an electron moves between two d-orbitals. They also looked at the spacing of the periodic pattern visible within the reflected light spectrum and found it matched the energy of a specific kind of phonon. This clarified that the hybrid particle consists of excitations of d-orbital electrons and this specific phonon.

    They did some further modeling based on their measurements and found the force binding the electron with the phonon is about 10 times stronger than what’s been estimated for other known electron-phonon hybrids.

    “One potential way of harnessing this hybrid particle is, it could allow you to couple to one of the components and indirectly tune the other,” Ilyas says. “That way, you could change the properties of a material, like the magnetic state of the system.”

    This research was supported, in part, by the U.S. Department of Energy and the Gordon and Betty Moore Foundation.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology (US) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory (US), the MIT Bates Research and Engineering Center (US), and the Haystack Observatory (US), as well as affiliated laboratories such as the Broad Institute of MIT and Harvard(US) and Whitehead Institute (US).

    Massachusetts Institute of Technology-Haystack Observatory (US), Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology (US) adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology (US) . The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology (US) is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia (US), wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology (US) was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst (US). In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology (US) was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology (US) faculty and alumni rebuffed Harvard University (US) president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology (US) administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology (US) catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities (US) in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology (US) that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology (US)’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology (US)’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology (US) became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war ended. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology (US) profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology (US) between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology (US) no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology (US)’s defense research. In this period Massachusetts Institute of Technology (US)’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. Massachusetts Institute of Technology (US) ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT (US) Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology (US) students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology (US) over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology (US)’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology (US) has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology (US) classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology (US) was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology (US) launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology (US) announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology (US) faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology (US) has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology (US) community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology (US) announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology (US) community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO (US) was designed and constructed by a team of scientists from California Institute of Technology (US), Massachusetts Institute of Technology (US), and industrial contractors, and funded by the National Science Foundation (US).

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology (US) physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology (US) graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of Massachusetts Institute of Technology (US) is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 12:07 pm on January 6, 2022 Permalink | Reply
    Tags: "Physicists watch as ultracold atoms form a crystal of quantum tornadoes", A key crossover from classical to quantum behavior, , , In the quantum world a fluid reaches a limit to how thin it can get., In the quantum world the behavior of individual atoms is governed by the eerie principle that a particle’s location is a probability., Laser Technology, MIT physicists have directly observed the interplay of interactions and quantum mechanics in a particular state of matter: a spinning fluid of ultracold atoms., , Quantum Hall fluids: clouds of electrons floating in magnetic fields,   

    From The Massachusetts Institute of Technology (US) : “Physicists watch as ultracold atoms form a crystal of quantum tornadoes” 

    MIT News

    From The Massachusetts Institute of Technology (US)

    January 5, 2022
    Jennifer Chu

    The new observations record a key crossover from classical to quantum behavior.

    1
    Just like the formation of weather patterns on Earth, here a spinning fluid of quantum particles breaks up into a crystal formed from swirling, tornado-like structures. Credit: Courtesy of the researchers.

    The world we experience is governed by classical physics. How we move, where we are, and how fast we’re going are all determined by the classical assumption that we can only exist in one place at any one moment in time.

    But in the quantum world the behavior of individual atoms is governed by the eerie principle that a particle’s location is a probability. An atom, for instance, has a certain chance of being in one location and another chance of being at another location, at the same exact time.

    When particles interact, purely as a consequence of these quantum effects, a host of odd phenomena should ensue. But observing such purely quantum mechanical behavior of interacting particles amid the overwhelming noise of the classical world is a tricky undertaking.

    Now, MIT physicists have directly observed the interplay of interactions and quantum mechanics in a particular state of matter: a spinning fluid of ultracold atoms. Researchers have predicted that, in a rotating fluid, interactions will dominate and drive the particles to exhibit exotic, never-before-seen behaviors.

    In a study published today in Nature, the MIT team reports rapidly rotating a quantum fluid of ultracold atoms. They watched as the initially round cloud of atoms first deformed into a thin, needle-like structure. Then, at the point when classical effects should be suppressed, leaving only interactions and quantum laws to dominate the atoms’ behavior, the needle spontaneously broke into a crystalline pattern resembling a string of miniature quantum tornadoes.

    “This crystallization is driven purely by interactions, and tells us we’re going from the classical world to the quantum world,” says Richard Fletcher, assistant professor of physics at MIT.

    The results are the first direct, in-situ documentation of the evolution of a rapidly-rotating quantum gas. Martin Zwierlein, the Thomas A. Frank Professor of Physics at MIT, says the evolution of the spinning atoms is broadly similar to how Earth’s rotation spins up large-scale weather patterns.

    “The Coriolis effect that explains Earth’s rotational effect is similar to the Lorentz force that explains how charged particles behave in a magnetic field,” Zwierlein notes. “Even in classical physics, this gives rise to intriguing pattern formation, like clouds wrapping around the Earth in beautiful spiral motions. And now we can study this in the quantum world.”
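    To make the analogy concrete, here is the standard textbook correspondence behind Zwierlein’s comparison (added for context; the notation below is mine, not the paper’s):

    \[
    \mathbf{F}_{\mathrm{Coriolis}} = 2m\,\mathbf{v} \times \boldsymbol{\Omega},
    \qquad
    \mathbf{F}_{\mathrm{Lorentz}} = q\,\mathbf{v} \times \mathbf{B},
    \]

    so a neutral atom of mass m viewed in a frame rotating at angular velocity Ω feels the same kind of sideways push as a particle of charge q in a magnetic field, with the identification qB_eff = 2mΩ. This is why a rapidly rotating cloud of neutral atoms can stand in for electrons in a strong magnetic field.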

    The study’s coauthors include Biswaroop Mukherjee, Airlia Shaffer, Parth B. Patel, Zhenjie Yan, Cedric Wilson, and Valentin Crépel, who are all affiliated with the MIT-Harvard Center for Ultracold Atoms and MIT’s Research Laboratory of Electronics.

    Spinning stand-ins

    In the 1980s, physicists began observing a new family of matter known as quantum Hall fluids, which consists of clouds of electrons floating in magnetic fields. Instead of repelling each other and forming a crystal, as classical physics would predict, the particles adjusted their behavior to what their neighbors were doing, in a correlated, quantum way.

    “People discovered all kinds of amazing properties, and the reason was, in a magnetic field, electrons are (classically) frozen in place — all their kinetic energy is switched off, and what’s left is purely interactions,” Fletcher says. “So, this whole world emerged. But it was extremely hard to observe and understand.”

    In particular, electrons in a magnetic field move in very small orbits that are hard to observe. Zwierlein and his colleagues reasoned that, as the motion of atoms under rotation occurs at much larger length scales, they might be able to use ultracold atoms as stand-ins for electrons and watch the identical physics.

    “We thought, let’s get these cold atoms to behave as if they were electrons in a magnetic field, but that we could control precisely,” Zwierlein says. “Then we can visualize what individual atoms are doing, and see if they obey the same quantum mechanical physics.”

    Weather in a carousel

    In their new study, the physicists used lasers to trap a cloud of about 1 million sodium atoms, and cooled the atoms to temperatures of about 100 nanokelvins. They then used a system of electromagnets to generate a trap to confine the atoms, and collectively spun the atoms around, like marbles in a bowl, at about 100 rotations per second.

    The team imaged the cloud with a camera, capturing a perspective similar to a child’s when facing towards the center on a playground carousel. After about 100 milliseconds, the researchers observed that the atoms spun into a long, needle-like structure, which reached a critical, quantum thinness.

    “In a classical fluid, like cigarette smoke, it would just keep getting thinner,” Zwierlein says. “But in the quantum world a fluid reaches a limit to how thin it can get.”

    “When we saw it had reached this limit, we had good reason to think we were knocking on the door of interesting quantum physics,” adds Fletcher, who, together with Zwierlein, published the results up to this point in an earlier Science paper. “Then the question was, what would this needle-thin fluid do under the influence of rotation and interactions alone?”
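    As a rough, back-of-the-envelope sketch of the limit Fletcher describes (an illustrative estimate, not a number taken from the paper), the minimum width in the rotation-as-magnetic-field picture is set by the effective magnetic length sqrt(ħ/(2mΩ)). The short Python estimate below plugs in the sodium mass and the roughly 100 rotations per second quoted earlier in this article:

    import math

    # Illustrative estimate only: in the rotation <-> magnetic-field analogy,
    # q*B_eff = 2*m*Omega, and the minimum cloud width is roughly the magnetic
    # length l = sqrt(hbar / (q*B_eff)) = sqrt(hbar / (2*m*Omega)).
    hbar = 1.054571817e-34           # reduced Planck constant, J*s
    m_na = 23 * 1.66053906660e-27    # mass of a sodium-23 atom, kg
    omega = 2 * math.pi * 100.0      # assumed rotation rate: ~100 rotations per second

    l_eff = math.sqrt(hbar / (2 * m_na * omega))
    print(f"effective magnetic length ~ {l_eff * 1e6:.1f} micrometers")
    # prints roughly 1.5 micrometers: a micron-scale floor on how thin the
    # needle-like cloud can get, unlike a classical fluid that can thin forever.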

    In their new paper, the team took their experiment a crucial step further, to see how the needle-like fluid would evolve. As the fluid continued to spin, they observed a quantum instability starting to kick in: The needle began to waver, then corkscrew, and finally broke into a string of rotating blobs, or miniature tornadoes — a quantum crystal, arising purely from the interplay of the rotation of the gas, and forces between the atoms.

    “This evolution connects to the idea of how a butterfly in China can create a storm here, due to instabilities that set off turbulence,” Zwierlein explains. “Here, we have quantum weather: The fluid, just from its quantum instabilities, fragments into this crystalline structure of smaller clouds and vortices. And it’s a breakthrough to be able to see these quantum effects directly.”

    This research was supported, in part, by the National Science Foundation, the Air Force Office of Scientific Research, the Office of Naval Research, the Vannevar Bush Faculty Fellowship, and DARPA.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology (US) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory (US), the MIT Bates Research and Engineering Center (US), and the Haystack Observatory (US), as well as affiliated laboratories such as the Broad Institute of MIT and Harvard (US) and the Whitehead Institute (US).

    Massachusetts Institute of Technology-Haystack Observatory (US), Westford, Massachusetts, USA. Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology (US) adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.


     
  • richardmitnick 11:12 am on December 26, 2021 Permalink | Reply
    Tags: "In a Smooth Move Ions Ditch Disorder and Keep Their Memories", A new way for atomic ions to host disturbances that do not fade away., , Laser Technology, , , The Joint Quantum Institute (US), Trapped ions   

    From The Joint Quantum Institute (US): “In a Smooth Move Ions Ditch Disorder and Keep Their Memories” 

    JQI bloc

    From The Joint Quantum Institute (US)

    At


    The University of Maryland (US)

    December 20, 2021

    William Morong
    wmorong@umd.edu

    1
    Researchers have demonstrated a new way for atomic ions to host disturbances that do not fade away. Credit: E. Edwards/JQI.

    A Persian adage, notably wielded by Abe Lincoln and the band “OK Go”, expresses the ephemeral nature of the world: “This, too, shall pass.”

    Physicists have their own version of this rule. It says that wiggles and wrinkles—really any small disturbances—tend to get ironed out over time. For instance, a couple drops of blue food coloring mixed into some cake batter will impart a blue tint to the whole batch; fresh water from a river funneled into the salty ocean will spread out and make a slightly less salty ocean; and a gush of cold wind entering your room will mingle with the air inside and reach a single, cooler temperature. The basic idea is that, given enough time, everything will reach equilibrium, regardless of where it started.

    There are a few notable exceptions to this equanimous rule. In the quantum world of atoms and electrons, particles confined in a container made of electric and magnetic fields—akin to a bowl confining cake batter—can get stuck in place if the container isn’t smooth. When this “bowl” is rough, disorderly, and random, the particles can’t make up their minds about which way to go and instead stay put. Oddly, even when a bunch of these localized particles are allowed to influence each other, they can manage to stay localized, not exchanging energy and avoiding equilibrium. This effect, known as many-body localization (MBL), imparts particles with a kind of memory of where they started.

    Now, scientists have found a new way to create disturbances that do not fade away. Instead of relying on disorder to freeze things in place, they tipped the quantum particles’ container to one side—a trick that is easier to conjure in the lab. A collaboration between the experimental group of College Park Professor Christopher Monroe and the theoretical group of JQI Fellow Alexey Gorshkov, who is also a Fellow of the Joint Center for Quantum Information and Computer Science and a physicist at the National Institute of Standards and Technology, has used trapped ions to implement this new technique, confirming that it prevents their quantum particles from reaching equilibrium. The team also measured the slowed spread of information with the new tipping technique for the first time. They published their results recently in the journal Nature.

    “One advantage of this method of many-body localization is that we don’t need that disorder,” says Fangli Liu, former graduate student in physics at the University of Maryland (now a research scientist at QuEra Computing) and lead theorist on the work. “In the original system the disorder is realized in a random form. But with this method, each time you do a measurement you will have exactly the same result. It gives us the possibility to more efficiently use this many-body localization to do something interesting.”

    Instead of color (as in the dough example) or temperature (in the case of air in your room), the disturbance in the JQI experiment was in the ions’ spins—their little internal magnets that can point up or down (or a bit of both at the same time, as in a quantum superposition). These ion spins sit in a container shaped not like a bowl but instead like a single row of an egg carton, with each ion residing in a different dimple of the container. Normally, after some time all spins would point in the same direction uniformly, with no memory of whether each spin pointed up or down to begin with.

    By controlling the ions individually, the scientists can prepare one spin that points up while the rest point down. With an egg carton container that’s flat (like it’s sitting on a table), the single spin disturbance can hop between ions, chatting with neighbors and ultimately causing all the ions to agree on a uniform configuration. In traditional many-body localization, where randomness and disorder rule the day, the egg-carton dimples become offset up or down from each other in a random way, paralyzing each spin in its spot.

    Instead of adding disorder, the team tilted the egg carton, offsetting each dimple a little higher than its neighbor to the left in a smooth, consistent way. This caused the spins to get localized as well, but for a very different reason. Quantum particles have wave-like properties, and once they start rolling down in the direction of a tilt, they can get reflected by the edges of the egg carton dimples. So instead of rolling downhill forever, they roll down and bounce back up over and over again, which confines them to their small region of the container.

    For a single particle, this pinning mechanism has been known since the 1930s. But whether it would persist in the face of interactions between many particles, and halt equilibration, has only recently been explored. Indeed, the idea that tilting the egg carton would result in a breakdown of equilibration was only proposed in 2019.
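    To see the single-particle version of this pinning in action, here is a minimal numerical sketch (my own illustration with made-up parameters, a tilted tight-binding chain rather than the trapped-ion system itself): a quantum particle hopping between the dimples of a tilted “egg carton” stays put when the tilt is large.

    import numpy as np

    # Minimal single-particle sketch of the "tilted egg carton" (Wannier-Stark) effect.
    # Parameters are illustrative, not taken from the experiment.
    L = 61          # number of sites ("dimples")
    J = 1.0         # hopping strength between neighboring sites
    F = 4.0         # tilt per site; set F = 0.0 to watch the particle spread instead

    # Tight-binding Hamiltonian: nearest-neighbor hopping plus a linear tilt.
    H = np.diag(F * np.arange(L)).astype(complex)
    H -= J * (np.eye(L, k=1) + np.eye(L, k=-1))

    # Start the particle in the central dimple and evolve under exp(-iHt) (hbar = 1).
    psi0 = np.zeros(L, dtype=complex)
    psi0[L // 2] = 1.0
    evals, evecs = np.linalg.eigh(H)

    for t in (0.0, 2.0, 10.0):
        psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))
        spread = np.sqrt(np.sum(np.abs(psi_t) ** 2 * (np.arange(L) - L // 2) ** 2))
        print(f"t = {t:5.1f}   rms distance from start = {spread:.2f} sites")

    With the tilt switched off, the particle spreads steadily across the chain; with a large tilt, its wave packet just sloshes within a fraction of a site, the single-particle seed of the many-body localization studied in the experiment.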

    The JQI team confirmed this in their experiment. Using tightly focused lasers, they adjusted each ion individually and prepared them in a highly disturbed state, with spins pointing in alternating directions. At the same time, they had extra lasers shining on all the ions together, allowing them to talk to each other even while far apart. If the tilt was high enough, the team found, the ions’ spins remained in their original configuration for an extended period, refusing to succumb to equilibrium.

    3
    Normally, ion spins that start out pointing in opposite directions will interact and reach an equilibrium, with no trace of where they started. But when the tilt in their container is large enough, they keep pointing in their original direction, creating a many-body localized state that remembers its initial configuration. Credit: Adapted from article by the authors/JQI.

    In addition to a conceptual leap, creating MBL without disorder may come with certain practical advantages. First, it is experimentally easier to implement a smooth tilt (in fact, a small tilt was present in the JQI experiment whether they wanted it or not). Second, it makes measurements much more straightforward. And third, this method is immune to an accidental breakdown of MBL. In regular disorder-based MBL, the random offsets of the dimples need to be large. If they aren’t, localization can break down in some spots and infect the whole batch. With a smooth tilt, there’s no such risk.

    This opens the possibility of using many-body localization to create a robust memory. MBL might help maintain quantum information in future quantum computers or help preserve curiosities like time crystals or topological phases.

    In the past year, two other experiments realizing this method were reported. The team of H. Wang in Hangzhou, China, set it up using superconducting qubits [Physical Review Letters], and Monika Aidelsburger’s team in Munich, Germany, made it happen [Nature Communications] with ultracold atoms.

    “There’s a lot of shared themes between our three papers,” says William Morong, a postdoctoral researcher at JQI and lead author on the work, “and I would say all of them together give a more complete picture of the phenomenon than each does individually.”

    The JQI group was the only one, however, to demonstrate another key property of many-body localization: the slow spread of entanglement between their ions. The team used a technique adapted from nuclear magnetic resonance imaging to measure the crawling pace with which entanglement spread across their atoms, a hallmark of MBL.

    “I think that our work shows the exciting progress that has been made in modern quantum simulation platforms,” Morong says. “We are reaching the point where we have enough control over collections of quantum particles in these platforms that we can read a theoretical paper describing some interesting effect that emerges in a specific system, program in the forces that we need to create this effect for ourselves, and measure subtle signatures in the quantum entanglement between the particles that are only revealed when you can observe each particle individually.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    JQI is supported by The Gordon and Betty Moore Foundation (US).

    We are on the verge of a new technological revolution as the strange and unique properties of quantum physics become relevant and exploitable in the context of information science and technology.

    The Joint Quantum Institute (JQI) (US) is pursuing that goal through the work of leading quantum scientists from the Department of Physics of The University of Maryland (UMD (US)), The National Institute of Standards and Technology (US) and the Laboratory for Physical Sciences (LPS). Each institution brings to JQI major experimental and theoretical research programs that are dedicated to the goals of controlling and exploiting quantum systems.

    U Maryland Campus

    Driven by the pursuit of excellence,

     
  • richardmitnick 11:48 am on December 24, 2021 Permalink | Reply
    Tags: "Lasers and Ultracold Atoms for a Changing Earth", , , Applying new technology rooted in quantum mechanics and relativity to terrestrial and space geodesy will sharpen our understanding of how the planet responds to natural and human-induced changes., , Improving technology for laser interferometric ranging between spacecraft to achieve nanometer-scale accuracy, Laser altimetry, Laser Technology, Measuring Earth’s gravity field from space requires precisely monitoring the changing distance between paired orbiting satellites., NASA Grace mission, NASA Grace-FO mission, , The future of high-precision geodesy lies in the development and application of novel technologies based on quantum mechanics and relativity.   

    From Eos: “Lasers and Ultracold Atoms for a Changing Earth” 

    From AGU
    Eos news bloc

    From Eos

    20 December 2021
    Michel Van Camp
    F. Pereira dos Santos
    Michael Murböck
    Gérard Petit and
    Jürgen Müller

    Applying new technology rooted in quantum mechanics and relativity to terrestrial and space geodesy will sharpen our understanding of how the planet responds to natural and human-induced changes.

    1
    Credit: VENTRIS/Science Photo Library via Getty Images.

    Quantum mechanics rules the atomic world, challenging our intuitions based on Newton’s classical mechanics. And yet atoms share at least one commonality with Newton’s apple and with you and me: They experience gravity and fall in the same way.

    Of course, observing free-falling atoms requires extremely sophisticated experimental devices, which became available only in the 1990s with the advent of laser cooling. An atom’s temperature reflects how fast it moves, so cooling atoms eases their manipulation, allowing scientists to measure their free fall and to quantify and study the effects of gravity with extraordinary precision. Creating samples of ultracold atoms involves slowing the atoms using the momentum of photons in specialized laser beams.
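    Two standard textbook relations convey the scales involved (included here for context; they are not quoted from the article). Each absorbed photon changes an atom’s velocity by the recoil velocity ħk/m, and Doppler cooling on its own reaches temperatures of order

    \[
    T_D = \frac{\hbar \Gamma}{2 k_B},
    \]

    where Γ is the linewidth of the cooling transition, which works out to on the order of 100 microkelvin for common alkali atoms; additional techniques then push the atoms colder still.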

    Today novel developments in methods using ultracold atoms and laser technologies open enhanced prospects for applying quantum physics in both satellite and terrestrial geodesy—the science of measuring the shape, rotation, and gravity of Earth—and for improving measurement reference systems. Such methods have great potential for more accurately monitoring how the Earth system is responding to natural and human-induced forcing, from the planet’s solid surface shifting in response to tectonic and magmatic movements to sea level rising in response to melting glaciers.

    Taking Earth’s Measure

    Earth’s shape is always changing, even if the changes are mostly imperceptible to us humans. In the subsurface, large convection currents and plate tectonics influence each other, shifting huge masses of rock around and causing earthquakes and volcanic eruptions. On the surface, the ocean, atmosphere, glaciers, rivers, and aquifers never rest either—nor do we as we excavate rock, extract groundwater and oil, and generally move mass around. All these movements subtly affect not only the planet’s shape but also its rotation and its gravitational field.

    2
    Fig. 1. The colored bubbles indicate the ranges of spatial resolution (in kilometers) and signal amplitude (in equivalent water height, EWH) characteristic of mass change processes related to continental hydrology (yellow), ice sheets and glaciers (pink), ocean processes (blue), and volcanoes and earthquakes (gray). The current measurement limits of laser interferometric ranging methods (e.g., aboard the Gravity Recovery and Climate Experiment (GRACE) and GRACE Follow-On (GRACE-FO) missions; solid black line) and of terrestrial absolute gravimetry (dashed green line) are shown, along with the directions of improvement in these technologies (arrows) needed to cover more of the ranges of the processes. Credit: IfE/LUH.

    NASA Grace

    National Aeronautics Space Agency (US)/GFZ German Research Centre for Geosciences [Deutsches Forschungszentrum für Geowissenschaften](GFZ)(DE) Grace-FO satellites.

    Geodetic methods allow us to measure minute quantities that tell scientists a lot about Earth’s size, shape, and makeup. As such, geodesy is essential to all branches of geophysics: tectonics, seismology, volcanology, oceanography, hydrology, glaciology, geomagnetism, climatology, meteorology, planetology, and even metrology, physics, and astronomy. Measuring these changes sheds light on many important Earth processes, such as mass loss from polar ice sheets, yet making these measurements accurately remains a challenging task (Figure 1).

    Determining the elevation of an ice sheet’s surface, to gauge whether it might have lost or gained mass, is often done using laser altimetry—that is, by observing the travel time of a laser beam emitted from a plane or a satellite and reflected off the ice surface back up to the observer. It’s a powerful technique, but the laser does not necessarily distinguish between light, fresh snow and dense, old ice, introducing uncertainty into the measurement and into our understanding of the ice sheet’s change.
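    The ranging principle itself is simple time-of-flight measurement (a generic relation, not specific to any one instrument): the one-way distance follows from the round-trip travel time Δt of the laser pulse as

    \[
    d = \frac{c\,\Delta t}{2},
    \]

    so a timing uncertainty of one nanosecond already corresponds to about 15 centimeters in range, and centimeter-level elevation measurements require timing the pulse to better than roughly 70 picoseconds.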

    Beyond this ambiguity, what happens if Earth’s crust beneath the ice cap is deforming and influencing the elevation of the ice surface? Moreover, the altimeter’s observation is relative: The elevation of the ice sheet surface is measured with respect to the position of the observing aircraft or satellite, which itself must be determined in comparison to a reference height datum (typically sea level). This feat requires measuring quantities that are exceedingly small compared with the size of Earth. If you drew a circle representing Earth on a standard piece of printer paper, even the 20-kilometer difference in height between Mount Everest’s peak and the bottoms of abyssal oceanic trenches would be thinner than your pencil line!

    Meanwhile, measuring variation in Earth’s rotation means determining its instantaneous orientation relative to fixed stars to within a fraction of a thousandth of an arc second—roughly the angle Earth rotates through in a few tens of microseconds. Assessing velocities and deformations of the tectonic plates requires determining positions at the millimeter scale. And detecting groundwater mass changes requires measuring the associated gravitational effect of a 1-centimeter-thick layer of water (i.e., equivalent water height, or EWH) spread over a 160,000-square-kilometer area. In other words, changes in Earth’s rotation, deformations, and gravity must be measured with precisions that are 10 orders of magnitude shorter than the length of the day, smaller than Earth’s diameter, and weaker than gravity itself, respectively.
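    To put the last comparison in perspective, a thin, laterally extended layer of water of thickness h changes surface gravity by roughly the Bouguer-plate value 2πGρh. The short sketch below (an illustrative estimate of my own, not a calculation from the article) shows that a 1-centimeter EWH signal really is about ten orders of magnitude weaker than gravity itself:

    import math

    # Gravitational signal of a thin, wide layer of water, in the infinite
    # Bouguer-plate approximation (an illustrative order-of-magnitude estimate).
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    rho_water = 1000.0   # density of water, kg/m^3
    h = 0.01             # 1 cm equivalent water height, in meters

    delta_g = 2 * math.pi * G * rho_water * h   # in m/s^2
    g = 9.81                                    # nominal surface gravity, m/s^2
    print(f"delta_g   ~ {delta_g:.2e} m/s^2  (about {delta_g * 1e8:.2f} microGal)")
    print(f"delta_g/g ~ {delta_g / g:.1e}")     # roughly 4e-10, i.e. ~10 orders below g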

    The Challenges of Attraction

    Performing gravity measurements and analyses remains especially demanding. For land-based measurements, gravimeters are generally cumbersome, expensive, tricky to use and, in the case of the most precise superconducting instruments, require a high-wattage (1,500-watt) continuous power supply. In addition, most gravimeters, including superconducting instruments, offer only relative measurements—that is, they inform us about spatial and temporal variations in gravitational attraction, but they drift with time and do not provide the absolute value of gravitational acceleration (about 9.8 meters per second squared). Absolute gravimeters do, but these instruments are rare, expensive (costing roughly $500,000 apiece), and heavy. And as most are mechanical, wear and tear prevents their use for continuous measurements.

    2
    This absolute gravimeter developed by the SYRTE (Time and Space Reference Systems) department at the Paris Observatory uses ultracold-atom technology to make high-precision measurements of gravity. Credit: Sébastien Merlet, LNE-SYRTE.

    Moreover, terrestrial gravimeters are mostly sensitive to the mass distribution nearby, within a radius of a few hundred meters of the instrument. This sensitivity and scale allow observation of rapid and small-scale changes, such as those caused by flash floods, in small watersheds, glaciers, and volcanic systems, but they complicate data gathering over larger areas.

    On the other hand, space-based gravimetry, realized in the Gravity Recovery and Climate Experiment mission and its follow-on mission, GRACE-FO, is blind to structures smaller than a few hundred kilometers. However, it offers unique information of homogeneous quality about mass anomalies over larger areas within Earth or at its surface. These missions can detect and monitor a mass change equivalent to a 1-centimeter EWH spread over a 400- × 400-kilometer area, with a temporal resolution of 10 days.
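    As a rough sense of scale, and assuming nothing beyond the numbers quoted above, that detection threshold corresponds to moving a little over a gigatonne of water:

```python
# Mass of a 1 cm equivalent-water-height (EWH) layer over a 400 km x 400 km cell.
RHO_WATER = 1000.0            # density of water, kg/m^3
area_m2 = 400e3 * 400e3       # 400 km x 400 km, in m^2
ewh_m = 0.01                  # 1 cm of equivalent water height, in m

mass_kg = RHO_WATER * area_m2 * ewh_m
print(f"mass ≈ {mass_kg:.1e} kg ≈ {mass_kg / 1e12:.1f} gigatonnes")
```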

    To monitor change from important Earth processes—from flooding and volcanism to glacier melting and groundwater movement—reliably and across scales, we need gravitational data with better spatiotemporal resolution and higher accuracy than are currently available (Figure 1). We also need highly stable and accurate reference systems to provide the fundamental backbone required to monitor sea level changes and tectonic and human-induced deformation. The needed improvements can be achieved only by using innovative quantum technologies.

    The past few years have seen new efforts to develop such technologies for many uses. In 2018, for example, the European Commission began a long-term research and innovation initiative called Quantum Flagship. For geodetic applications, efforts are being coordinated and supported largely through the Novel Sensors and Quantum Technology for Geodesy (QuGe) program, a worldwide initiative organized under the umbrella of the International Association of Geodesy and launched in 2019. QuGe fosters synergies in technology development, space mission requirements, and geodetic and geophysical modeling by organizing workshops and conference sessions and by otherwise providing a platform where experts from different fields can collaborate.

    A Quantum Upgrade for Gravity Sensing

    QuGe emphasizes three pillars of development. The first focuses on investigations of ultracold-atom technologies for gravimetry on the ground and in space. Quantum gravimetry will benefit a comprehensive set of applications, from fast, localized gravity surveys and exploration to observing regional and global Earth system processes with high spatial and temporal resolution.

    On Earth, the ideal instrument is an absolute, rather than relative, gravimeter capable of taking continuous measurements. This is not possible with a classical mechanical absolute gravimeter, in which a test mass is repeatedly dropped and lifted. In atomic instruments, there are no moving parts and no mechanical wear; instead, lasers control clouds of free-falling rubidium atoms. Recent achievements should enable production of such instruments on a larger scale, allowing scientists to establish dense networks of absolute gravimetric instruments to monitor, for example, aquifer and volcanic systems.
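    In a typical cold-atom gravimeter, g is inferred from the interferometer phase accumulated by the free-falling atoms, delta_phi ≈ k_eff · g · T², where k_eff is the effective wave number of the interrogation lasers and T the time between laser pulses. The article does not give instrument parameters, so the rubidium wavelength and pulse spacing below are typical textbook values used only to sketch the idea:

```python
import math

# Invert the atom-interferometer phase, delta_phi ≈ k_eff * g * T^2, to recover g.
# All numbers are illustrative assumptions, not parameters from the article.
wavelength = 780e-9                      # Rb D2 line, m
k_eff = 2 * (2 * math.pi / wavelength)   # counter-propagating Raman beams, rad/m
T = 0.1                                  # time between interferometer pulses, s

g_true = 9.81                            # the "unknown" gravity being measured
delta_phi = k_eff * g_true * T**2        # phase the instrument would register, rad

g_measured = delta_phi / (k_eff * T**2)  # inversion applied to the measured phase
print(f"phase ≈ {delta_phi:.3e} rad, recovered g = {g_measured:.4f} m/s^2")
```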

    Today, achieving dense coverage with gravimetric surveys, with measurements made at perhaps dozens of points, involves huge effort, and sampling rates remain poor, with measurements typically repeated only once every month, year, or longer. Moreover, errors related to instrument calibration and drift remain problematic. Alternatively, a fixed instrument provides a measurement every second, but at only a single location. The ability to measure gravity continuously at multiple locations, without the difficulties of drifting instruments, will allow much less ambiguous interpretations of gravity changes and the related geophysical phenomena.

    Measuring Earth’s gravity field from space requires precisely monitoring the changing distance between paired orbiting satellites—as in the GRACE-FO mission—which accelerate and decelerate slightly as they are tugged more or less by the gravitational pull exerted by different masses on Earth. However, the satellites can also speed up and slow down because of forces other than changes in Earth’s gravity field, including aerodynamic drag in the thin upper atmosphere. Currently, these other forces acting on the satellites are measured using electrostatic, suspended-mass accelerometers, which also tend to exhibit gradual, low-frequency drifts that hamper their accuracy.

    The performance of these traditional accelerometers is thus challenged by quantum sensors, which have already demonstrated improved long-term stability and lower noise levels on the ground. In addition, hybrid systems combining the benefits of quantum accelerometers with electrostatic accelerometers, which still provide higher measurement rates, could cover a wider range of slower and faster accelerations and could greatly support navigation and inertial sensing on the ground and in space. Quantum accelerometers will also serve as a basis for developing the next generation of gravity-monitoring missions, such as the follow-on to the Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission, which will measure gravity differences in 3D and allow higher-resolution mapping of Earth’s static gravity field.
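    One simple way to picture the hybridization mentioned above is as a blend in which the electrostatic sensor supplies the fast variations while the occasional, drift-free quantum measurements pin down the slow bias. The sketch below is a conceptual illustration under that assumption only; it is not a description of any flight algorithm, and all noise and drift levels are invented for the example:

```python
import numpy as np

# Conceptual hybridization: the electrostatic accelerometer is sampled fast but
# drifts slowly; the quantum accelerometer is sampled slowly but is drift-free.
# The slowly varying bias is estimated from their difference and removed.
rng = np.random.default_rng(0)
t = np.arange(0.0, 1000.0, 0.1)                  # 10 Hz electrostatic samples, s
truth = 1e-6 * np.sin(2 * np.pi * t / 200.0)     # "true" acceleration, m/s^2

drift = 1e-6 * (t / 1000.0)                      # slow bias of the classical sensor
electrostatic = truth + drift + 1e-8 * rng.standard_normal(t.size)

quantum_idx = np.arange(0, t.size, 100)          # one quantum sample every 10 s
quantum = truth[quantum_idx] + 5e-9 * rng.standard_normal(quantum_idx.size)

# Estimate the bias at the quantum epochs and interpolate it between them.
bias_estimate = np.interp(t, t[quantum_idx], electrostatic[quantum_idx] - quantum)
hybrid = electrostatic - bias_estimate

print(f"RMS error, electrostatic only: {np.std(electrostatic - truth):.1e} m/s^2")
print(f"RMS error, hybridized:         {np.std(hybrid - truth):.1e} m/s^2")
```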

    Wide-Ranging Improvement

    The second pillar of QuGe focuses on improving technology for laser interferometric ranging between spacecraft to achieve nanometer-scale accuracy, which will become the standard for future geodetic gravity-sensing missions. This method involves comparing the difference in phase between two laser beams: a reference beam and a test beam received back from the second satellite. Such optical measurements are much more precise than similar measurements using microwave ranging or mechanical devices, allowing intersatellite distances to be observed with an accuracy of tens of nanometers or better compared with micrometer accuracies achieved with microwaves.
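    As a simple illustration of how interferometric phase maps onto distance, assume a round-trip measurement (the beam travels to the far spacecraft and back) at a wavelength of 1,064 nanometers, a value typical of such instruments and used here as an assumption. One full phase cycle then corresponds to half a wavelength of one-way range change:

```python
import math

# Convert interferometric phase to one-way range change for a round-trip link:
#   delta_range = wavelength * delta_phase / (4 * pi)
WAVELENGTH = 1064e-9  # m, assumed laser wavelength

def phase_to_range(delta_phase_rad: float) -> float:
    """One-way intersatellite range change implied by a phase change (m)."""
    return WAVELENGTH * delta_phase_rad / (4 * math.pi)

one_cycle = phase_to_range(2 * math.pi)        # one fringe = lambda / 2
print(f"one phase cycle -> {one_cycle * 1e9:.0f} nm of range change")

# Phase resolution needed to sense a 10 nm change in range:
needed_phase = 10e-9 * 4 * math.pi / WAVELENGTH
print(f"10 nm of range  -> {needed_phase * 1e3:.0f} mrad of phase")
```

    In other words, reaching tens-of-nanometers ranging accuracy means resolving the interference phase to a small fraction of a cycle.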

    High-precision laser ranging was successfully tested in 2017 during the Laser Interferometer Space Antenna (LISA) Pathfinder mission, in which the main goal was to hold the spacecraft as motionless as possible to test technology for use in future missions that will seek to detect gravitational waves with a space-based observatory. It has also been applied successfully in the GRACE-FO mission, demonstrating the superior performance for intersatellite tracking of laser interferometry over traditional microwave-based ranging methods used in the original GRACE mission.

    Although extremely useful, recent satellite gravity missions give only rather coarse pictures of global mass variations. Enhanced monitoring of intersatellite distances should improve the spatial resolution at which a 1-centimeter EWH signal can be resolved to about 200 kilometers or finer, from the present 400 kilometers. This improvement will allow better separation of overlapping effects, such as continental versus oceanic mass contributions along coastlines, changes in neighboring subsurface aquifers, and variations in glaciers and nearby groundwater tables.

    Even more refined concepts, like intersatellite tracking using laser interferometry for multiple satellite pairs or among a swarm of satellites, might be realized as well within the coming years. Using more satellites in next-generation geodetic missions would yield data with higher temporal and spatial resolution and accuracy—and hence with greater ability to distinguish smaller-scale processes—than are available with current two-satellite configurations.

    Measuring Height with Optical Clocks

    QuGe’s third pillar of development focuses on applying general relativity and optical clocks to improve measurement reference systems. Einstein told us that gravity distorts space and time. In particular, a clock closer to a mass—or, say, at a lower elevation on Earth’s surface, closer to the planet’s center of mass—runs slower than one farther away. Hence, comparing the ticking rates of accurate clocks placed at different locations on Earth informs us about height differences, a technique called chronometric leveling. This technique has been achieved by comparing outputs from highly precise optical clocks connected by optical links over distances on the order of 1,000 kilometers.
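    In the weak-field approximation, the fractional frequency difference between two clocks separated in height is delta_f / f ≈ g · delta_h / c², so a 1-centimeter height difference corresponds to a shift of roughly one part in 10^18. A minimal sketch of the conversion (the relation is standard relativistic physics rather than something spelled out in the article):

```python
# Chronometric leveling in the weak-field approximation:
#   delta_f / f ≈ g * delta_h / c^2
g = 9.81            # local gravitational acceleration, m/s^2
c = 299_792_458.0   # speed of light, m/s

def height_difference(fractional_frequency_shift: float) -> float:
    """Height difference (m) implied by a measured fractional frequency shift."""
    return fractional_frequency_shift * c**2 / g

# A clock comparison resolving 1e-18 in fractional frequency corresponds to:
print(f"1e-18 in delta_f/f <-> {height_difference(1e-18) * 100:.1f} cm of height")
```

    This is why clock comparisons at the 10^-18 level translate into the centimeter-consistent height reference discussed below.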

    Today, height systems are referenced to mean sea level in some way, for example, through tide gauges. However, sea level is not stable enough to be used as a reference.

    4
    The transportable optical clock of the PTB (left) is housed inside a trailer (right). Credit: PTB Braunschweig, CC BY 4.0.

    Optical clocks keep time by measuring the high frequency of laser light that is kept locked to the transition frequency between two given energy levels of electrons in ultracold (laser-cooled) atoms or ions. These clocks have demonstrated at least a 100-fold improvement in accuracy over the usual atomic clocks, which measure lower-frequency microwave transitions. With a global network of such optical clocks, if we can remotely compare the clocks’ frequencies with the same accuracy, we could realize a global height reference with 1-centimeter consistency. One can even imagine the reference clocks being placed in a high satellite orbit, far from the noisy Earth environment, to serve as a stable reference for terrestrial height systems and improve measurement accuracy.

    In addition to chronometric leveling, such clocks will improve the accuracy of the International Atomic Time standard—the basis for the Coordinated Universal Time used for civil timekeeping—and will have many other impacts on science and technology. For example, global navigation satellite systems could provide better navigation by using more predictable clocks on satellites, which would have the added advantage of requiring less input from the ground operators controlling the satellite orbits. Space navigation could rely on one-way range measurements instead of on more time-consuming two-way ranging if a spacecraft’s clock were highly accurate. And radio astronomers could make use of more stable frequency references for easier processing and better results in very long baseline interferometry experiments. More fundamental applications are also envisioned for optical clocks, such as detecting gravitational waves, testing the constancy of the fundamental constants of physics, and even redefining the second.

    The Best Tools for the Job

    Our knowledge of Earth’s shape and gravity and the subtle shifts they undergo in response to numerous natural and human-induced processes has grown immensely as geodetic methods and tools have matured. But with current technologies, the clarity and confidence with which we can discern these changes remain limited. Such limitations, namely insufficient accuracy and resolution in time and space, will become increasingly important as we look to better understand and predict the consequences of accelerating, or perhaps even previously unrecognized, changes occurring as the planet responds to warming temperatures and other anthropogenic influences.

    The future of high-precision geodesy lies in the development and application of novel technologies based on quantum mechanics and relativity. QuGe is working to ensure that the Earth and planetary sciences benefit from the vast potential of these technologies. In particular, ultracold-atom accelerometry, high-precision laser ranging between satellites, and relativistic geodesy with optical clocks are very promising approaches that will overcome problems of classical gravimetric Earth observations. With such advances, we will have the best tools available not only to understand vital geophysical processes but also to better navigate on Earth and in space and to discern the fundamental physics that underlie our world.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     