
  • richardmitnick 1:40 pm on February 27, 2020
    Tags: "‘Flash photography’ at the LHC", Physics

    From Symmetry: “‘Flash photography’ at the LHC” 


    02/27/20
    Sarah Charley

    Photo by Tom Bullock

    An extremely fast new detector inside the CMS detector will allow physicists to get a sharper image of particle collisions.

    Some of the best commercially available high-speed cameras can capture thousands of frames every second. They produce startling videos of water balloons popping and hummingbirds flying in ultra-slow motion.

    But what if you want to capture an image of a process so fast that it looks blurry if the shutter is open for even a billionth of a second? This is the type of challenge scientists on experiments like CMS and ATLAS face as they study particle collisions at CERN’s Large Hadron Collider.

    When the LHC is operating to its full potential, bunches of about 100 billion protons cross each other’s paths every 25 nanoseconds. During each crossing, which lasts about 2 nanoseconds, about 50 protons collide and produce new particles. Figuring out which particle came from which collision can be a daunting task.
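As a sanity check on these figures, the implied rates work out as follows (a quick sketch using only the numbers quoted above, not official machine parameters):

```python
# Rough LHC collision bookkeeping, using only the numbers quoted above.
CROSSING_INTERVAL_S = 25e-9      # bunches cross every 25 nanoseconds
COLLISIONS_PER_CROSSING = 50     # about 50 proton collisions per crossing

crossings_per_second = 1 / CROSSING_INTERVAL_S
collisions_per_second = crossings_per_second * COLLISIONS_PER_CROSSING

print(f"{crossings_per_second:.1e} bunch crossings per second")  # 4.0e+07
print(f"{collisions_per_second:.1e} collisions per second")      # 2.0e+09
```

So the detectors see some two billion proton collisions every second, which is why attributing each particle to the right collision is so hard.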

    “Usually in ATLAS and CMS, we measure the charge, energy and momentum of a particle, and also try to infer where it was produced,” says Karri DiPetrillo, a postdoctoral fellow working on the CMS experiment at the US Department of Energy’s Fermilab. “We’ve had timing measurements before—on the order of nanoseconds, which is sufficient to assign particles to the correct bunch crossing, but not enough to resolve the individual collisions within the same bunch.”

    Thanks to a new type of detector DiPetrillo and her collaborators are building for the CMS experiment, this is about to change.

    CERN/CMS Detector

    Physicists on the CMS experiment are devising a new detector capable of creating a more accurate timestamp for passing particles. The detector will separate the 2-nanosecond bursts of particles into several consecutive snapshots—a feat a bit like taking 30 billion pictures a second.
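The comparison to 30 billion pictures a second follows directly from the timestamp precision: a resolution of roughly 33 picoseconds corresponds to about 3 × 10^10 frames per second and slices each 2-nanosecond burst into dozens of snapshots. A sketch of the arithmetic (the 33 ps figure is an assumption picked from the 30-to-50-picosecond range quoted later in the article):

```python
# How the "30 billion pictures a second" analogy follows from the quoted
# timestamp precision (illustrative arithmetic only).
TIMESTAMP_PRECISION_S = 33e-12   # assumed value within the 30-50 ps range
BURST_DURATION_S = 2e-9          # each bunch crossing lasts about 2 ns

frames_per_second = 1 / TIMESTAMP_PRECISION_S
snapshots_per_burst = BURST_DURATION_S / TIMESTAMP_PRECISION_S

print(f"{frames_per_second:.1e} frames per second")           # 3.0e+10
print(f"{snapshots_per_burst:.0f} snapshots per 2 ns burst")  # 61
```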

    This will help physicists with a mounting challenge at the LHC: collision pileup.

    Picking apart which particle tracks came from which collision is a challenge. A planned upgrade to the intensity of the LHC will increase the number of collisions per bunch crossing by a factor of four—that is, from 50 to 200 proton collisions—making that challenge even greater.

    Currently, physicists look at where the collisions occurred along the beamline as a way to identify which particular tracks came from which collision. The new timing detector will add another dimension to that.

    “These time stamps will enable us to determine when in time different collisions occurred, effectively separating individual bunch crossings into multiple ‘frames,’” says DiPetrillo.

    DiPetrillo and fellow US scientists working on the project are supported by DOE’s Office of Science, which is also contributing support for the detector development.

    According to DiPetrillo, being able to separate the collisions based on when they occur will have huge downstream impacts on every aspect of the research. “Disentangling different collisions cleans up our understanding of an event so well that we’ll effectively gain three more years of data at the High-Luminosity LHC. This increase in statistics will give us more precise measurements, and more chances to find new particles we’ve never seen before,” she says.

    The precise time stamps will also help physicists search for heavy, slow moving particles they might have missed in the past.

    “Most particles produced at the LHC travel at close to the speed of light,” DiPetrillo says. “But a very heavy particle would travel slower. If we see a particle arriving much later than expected, our timing detector could flag that for us.”
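The size of that delay is easy to estimate. Assuming, for illustration, a 1-metre flight path to the timing layer and a particle moving at half the speed of light (both made-up illustration values, not actual CMS geometry), the latecomer arrives nanoseconds behind a light-speed particle, enormous compared with the detector's 30-to-50-picosecond resolution:

```python
# Time-of-flight delay of a slow particle relative to a light-speed one.
# The 1 m flight path and beta = 0.5 are illustrative assumptions, not
# actual CMS geometry.
C = 299_792_458.0        # speed of light, m/s
PATH_M = 1.0             # assumed flight path to the timing layer
BETA = 0.5               # particle speed as a fraction of c

delay_s = PATH_M / (BETA * C) - PATH_M / C
print(f"arrival delay: {delay_s * 1e12:.0f} ps")  # ~3336 ps
```

Even a much smaller velocity deficit would still produce a delay far above the timing resolution, which is what makes the flagging scheme workable.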

    The new timing detector inside CMS will consist of a 5-meter-long cylindrical barrel made from 160,000 individual scintillating crystals, each approximately the width and length of a matchstick. This crystal barrel will be capped on its open ends with disks containing delicately layered radiation-hard silicon sensors. The barrel, about 2 meters in diameter, will surround the inner detectors that compose CMS’s tracking system closest to the collision point. DiPetrillo and her colleagues are currently working out how the various sensors and electronics at each end of the barrel will coordinate to give a time stamp within 30 to 50 picoseconds.

    “Normally when a particle passes through a detector, the energy it deposits is converted into an electrical pulse that rises steeply and then falls slowly over the course of a few nanoseconds,” says Joel Butler, the Fermilab scientist coordinating this project. “To register one of these passing particles in under 50 picoseconds, we need a signal that reaches its peak even faster.”

    Scientists can use the steep rising slopes of these signals to separate the collisions not only in space, but also in time. In the barrel of the detector, a particle passing through the crystals will release a burst of light that will be recorded by specialized electronics. Based on when the intense flash of light arrives at each sensor, physicists will be able to calculate the particle’s exact location and when it passed. Particles will also produce a quick pulse in the endcaps, which are made from a new type of silicon sensor that amplifies the signal. Each silicon sensor is about the size of a domino and can determine the location of a passing particle to within 1.3 millimeters.
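The two-ended readout described above can be illustrated with a little timing algebra: the difference of the light arrival times at the two ends gives the position along the bar, and their mean gives the crossing time. This is a minimal sketch of the principle, with an assumed bar length and effective light speed rather than real CMS parameters:

```python
# Sketch of position/time reconstruction from light arrival times at the
# two ends of a scintillating bar. Bar length and effective light speed
# are illustrative assumptions, not real CMS values.
C = 299_792_458.0              # speed of light in vacuum, m/s
V_EFF = 0.5 * C                # assumed effective light speed in the crystal
BAR_LENGTH_M = 0.057           # assumed bar length (matchstick-sized)

def hit_position(t_left_s: float, t_right_s: float) -> float:
    """Distance from the bar centre, positive toward the right end."""
    # Light reaches the nearer end sooner, so the time difference encodes x.
    return 0.5 * V_EFF * (t_left_s - t_right_s)

def hit_time(t_left_s: float, t_right_s: float) -> float:
    """When the particle crossed: mean arrival time minus half the transit."""
    return 0.5 * (t_left_s + t_right_s) - 0.5 * BAR_LENGTH_M / V_EFF
```

For a hit 1 cm to the right of centre at time zero, the left-end pulse arrives later than the right-end one, and the two formulas recover x = 0.01 m and t = 0.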

    The physicists working on the timing detector plan to have all the components ready and installed inside CMS for the start-up of the High-Luminosity LHC in 2027.

    “High-precision timing is a new concept in high-energy physics,” says DiPetrillo. “I think it will be the direction we pursue for future detectors and colliders because of its huge physics potential. For me, it’s an incredibly exciting and novel project to be on right now.”

    LHC

    CERN map

    CERN LHC. Image: Maximilien Brice and Julien Marius Ordan.

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS. Image: Claudia Marcelloni, CERN/ATLAS.

    ALICE

    CMS

    LHCb

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:56 am on February 26, 2020
    Tags: Light-emitting defects in materials that may someday enable quantum-based technologies, Multiscale microscopy, Physics, Scientists have been investigating hexagonal boron nitride which has a significant downside: It emits light in a rainbow of different hues, The data is very rich and provides a clear classification of quantum defects in this material, The scientists were able to trace the material’s colorful emission to specific atomic defects, We wanted to know the source of the multi-color emission with the ultimate goal of gaining control over emission

    From Stanford University: “Stanford researchers shine light on the defects responsible for messy behavior in quantum materials” 

    From Stanford University

    February 24, 2020
    Taylor Kubota

    Researchers are investigating light-emitting defects in materials that may someday enable quantum-based technologies, such as quantum computers, quantum networks or engines that run on light. Once understood, these defects can become controllable features.

    Researchers studied a material capable of emitting bright quantum light. Materials like this could someday enable the creation of quantum computers, which would be much faster and more efficient than current computers. (Image credit: Getty Images)

    In a future built on quantum technologies, planes and spaceships could be fueled by the momentum of light. Quantum computers will crunch through complex problems spanning chemistry to cryptography with greater speed and energy efficiency than existing processors. But before this future can come to pass, we need bright, on-demand, predictable sources of quantum light.

    Toward this end, a team of Stanford University materials scientists, physicists and engineers, in collaboration with labs at Harvard University and the University of Technology Sydney, has been investigating hexagonal boron nitride, a material that can emit bright light as a single photon – a quantum unit of light – at a time. And it can do this at room temperature, making it easier to use compared to alternative quantum sources.

    Unfortunately, hexagonal boron nitride has a significant downside: It emits light in a rainbow of different hues. “While this emission is beautiful, the color currently can’t be controlled,” said Fariah Hayee, the lead author and a graduate student in the lab of Jennifer Dionne, associate professor of materials science and engineering at Stanford. “We wanted to know the source of the multi-color emission, with the ultimate goal of gaining control over emission.”

    By employing a combination of microscopic methods, the scientists were able to trace the material’s colorful emission to specific atomic defects. A group led by co-author Prineha Narang, assistant professor of computational materials science at Harvard University, also developed a new theory to predict the color of defects by accounting for how light, electrons and heat interact in the material.

    “We needed to know how these defects couple to the environment and if that could be used as a fingerprint to identify and control them,” said Christopher Ciccarino, a graduate student in the NarangLab at Harvard University and co-author of the paper.

    The researchers describe their technique and different categories of defects in a paper published in the Feb. 24 issue of the journal Nature Materials.

    Multiscale microscopy

    Identifying the defects that give rise to quantum emission is a bit like searching for a friend in a crowded city without a cellphone. You know they are there, but you have to scan the full city to find their precise location.

    By stretching the capabilities of a one-of-a-kind, modified electron microscope developed by the Dionne lab, the scientists were able to match the local, atomic-scale structure of hexagonal boron nitride with its unique color emission. Over the course of hundreds of experiments, they bombarded the material with electrons and visible light and recorded the pattern of light emission. They also studied how the periodic arrangement of atoms in hexagonal boron nitride influenced the emission color.

    “The challenge was to tease out the results from what can seem to be a very messy quantum system. Just one measurement doesn’t tell the whole picture,” said Hayee. “But taken together, and combined with theory, the data is very rich and provides a clear classification of quantum defects in this material.”

    In addition to their specific findings about types of defect emissions in hexagonal boron nitride, the process the team developed to collect and classify these quantum spectra could, on its own, be transformative for a range of quantum materials.

    “Materials can be made with near atomic-scale precision, but we still don’t fully understand how different atomic arrangements influence their opto-electronic properties,” said Dionne, who is also director of the Photonics at Thermodynamic Limits Energy Frontier Research Center (PTL-EFRC). “Our team’s approach reveals light emission at the atomic-scale, en route to a host of exciting quantum optical technologies.”

    A superposition of disciplines

    Although the focus now is on understanding which defects give rise to certain colors of quantum emission, the eventual aim is to control their properties. For example, the team envisions strategic placement of quantum emitters, as well as turning their emission on and off for future quantum computers.

    Research in this field requires a cross-disciplinary approach. This work brought together materials scientists, physicists and electrical engineers, both experimentalists and theorists, including Tony Heinz, professor of applied physics at Stanford’s School of Humanities and Sciences and of photon science at the SLAC National Accelerator Laboratory, and Jelena Vučković, the Jensen Huang Professor in Global Leadership in the School of Engineering.

    “We were able to lay the groundwork for creating quantum sources with controllable properties, such as color, intensity and position,” said Dionne. “Our ability to study this problem from several different angles demonstrates the advantages of an interdisciplinary approach.”

    Additional Stanford co-authors of this paper include Leo Yu, a postdoctoral scholar in the Heinz lab, and Jingyuan Linda Zhang, who was a graduate student in the Ginzton Laboratory during this research. Other co-authors include researchers from the University of Technology Sydney in Australia. Dionne is also a member of Stanford Bio-X, an affiliate of the Precourt Institute for Energy and a member of the Wu Tsai Neurosciences Institute at Stanford. Vučković is also a professor of electrical engineering and a member of Stanford Bio-X and of the Wu Tsai Neurosciences Institute.

    This research was funded by the Department of Energy, Stanford’s Diversifying Academia, Recruiting Excellence Doctoral Fellowship Program, the National Science Foundation and the Betty and Gordon Moore Foundation.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Stanford University campus. No image credit

    Stanford University

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

     
  • richardmitnick 1:29 pm on February 22, 2020
    Tags: "Time-resolved measurement in a memory device", Data can be stored in magnetic tunnel junctions virtually without any error and in less than a nanosecond, Physics, The researchers replaced the isolated metal dot by a magnetic tunnel junction, Tomorrow’s memory devices

    From ETH Zürich: “Time-resolved measurement in a memory device” 


    From ETH Zürich

    19.02.2020
    Oliver Morsch

    Researchers at ETH have measured the timing of single writing events in a novel magnetic memory device with a resolution of less than 100 picoseconds. Their results are relevant for the next generation of main memories based on magnetism.

    The chip produced by IMEC for the experiments at ETH. The tunnel junctions used to measure the timing of the magnetisation reversal are located at the centre (Image courtesy of IMEC).

    At the Department of Materials of ETH in Zürich, Pietro Gambardella and his collaborators investigate tomorrow’s memory devices. These should be fast, retain data reliably for a long time and also be cheap. So-called magnetic “random access memories” (MRAM) achieve this squaring of the circle by combining fast switching via electric currents with durable data storage in magnetic materials. A few years ago researchers could already show that a certain physical effect – the spin-orbit torque – makes particularly fast data storage possible. Now Gambardella’s group, together with the R&D centre IMEC in Belgium, has managed to temporally resolve the exact dynamics of a single such storage event – and to use a few tricks to make it even faster.

    Magnetising with single spins

    To store data magnetically, one has to invert the direction of magnetisation of a ferromagnetic (that is, permanently magnetic) material in order to represent the information as a logic value, 0 or 1. In older technologies, such as magnetic tapes or hard drives, this is achieved through magnetic fields produced inside current-carrying coils. Modern MRAM memories, by contrast, directly use the spins of electrons, which are magnetic, much like small compass needles, and flow directly through a magnetic layer as an electric current. In Gambardella’s experiments, electrons with opposite spin directions are spatially separated by the spin-orbit interaction. This, in turn, creates an effective magnetic field, which can be used to invert the direction of magnetisation of a tiny metal dot.

    “We know from earlier experiments, in which we stroboscopically scanned a single magnetic metal dot with X-rays, that the magnetisation reversal happens very fast, in about a nanosecond,” says Eva Grimaldi, a post-doc in Gambardella’s group. “However, those were mean values averaged over many reversal events. Now we wanted to know how exactly a single such event takes place and to show that it can work on an industry-compatible magnetic memory device.”

    Time resolution through a tunnel junction

    Electron microscope image of the magnetic tunnel junction (MTJ, at the centre) and of the electrodes for controlling and measuring the reversal process. (Image: P. Gambardella / ETH Zürich)

    To do so, the researchers replaced the isolated metal dot with a magnetic tunnel junction. Such a tunnel junction contains two magnetic layers separated by an insulating layer that is only one nanometre thick. Depending on the spin direction – along the magnetisation of the magnetic layers, or opposite to it – the electrons can tunnel through that insulating layer more or less easily. This results in an electrical resistance that depends on the alignment of the magnetisation in one layer with respect to the other and thus represents “0” and “1”. From the time dependence of that resistance during a reversal event, the researchers could reconstruct the exact dynamics of the process. In particular, they found that the magnetisation reversal happens in two stages: an incubation stage, during which the magnetisation stays constant, and the actual reversal stage, which lasts less than a nanosecond.
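The two-stage picture can be made concrete with a toy analysis: given a sampled resistance-versus-time trace, find when the resistance first leaves its initial level (end of incubation) and when it first approaches its final level (end of reversal). The threshold fractions and the synthetic trace below are illustrative assumptions, not the analysis actually used by the ETH group:

```python
# Toy extraction of incubation and reversal times from a tunnel-junction
# resistance trace. Thresholds and the synthetic trace are illustrative
# assumptions only.
def reversal_stages(times_s, resistance_ohm, frac=0.1):
    """Return (t_start, t_end): when the trace has moved `frac` of the way
    from the initial to the final resistance level, and when it is within
    `frac` of the final level. Assumes a monotonic low-to-high switch."""
    r0, r1 = resistance_ohm[0], resistance_ohm[-1]
    t_start = next(t for t, r in zip(times_s, resistance_ohm)
                   if r > r0 + frac * (r1 - r0))
    t_end = next(t for t, r in zip(times_s, resistance_ohm)
                 if r > r1 - frac * (r1 - r0))
    return t_start, t_end

# Synthetic trace: 1 ns of incubation at 1000 ohm, then a linear rise to
# 2000 ohm over 0.3 ns, sampled every 10 ps.
times = [i * 10e-12 for i in range(200)]
trace = [1000.0 if t < 1e-9
         else min(2000.0, 1000.0 + (t - 1e-9) / 0.3e-9 * 1000.0)
         for t in times]
t_start, t_end = reversal_stages(times, trace)
print(f"incubation ends near {t_start * 1e9:.2f} ns; "
      f"reversal lasts about {(t_end - t_start) * 1e9:.2f} ns")
```

On this synthetic trace, the extracted reversal stage comes out to roughly a quarter of a nanosecond, the same order as the sub-nanosecond stage reported in the experiment.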

    The magnetic tunnel junction (yellow and red disks) in which the magnetisation of the red disk is inverted by electron spins (blue and yellow arrows). The reversal process is measured through the tunnel resistance (vertical blue arrows).

    Small fluctuations

    “For a fast and reliable memory device it is essential that the time fluctuations between the individual reversal events are minimized,” explains Gambardella’s PhD student Viola Krizakova. So, based on their data the scientists developed a strategy to make those fluctuations as small as possible. To that end, they changed the current pulses used to control the magnetisation reversal in such a way as to introduce two additional physical phenomena. The so-called spin-transfer torque, as well as a short voltage pulse during the reversal stage, reduced the total time for the reversal event to less than 0.3 nanoseconds, with temporal fluctuations of less than 0.2 nanoseconds.

    Application-ready technology

    “Putting all of this together, we have found a method whereby data can be stored in magnetic tunnel junctions virtually without any error and in less than a nanosecond,” says Gambardella. Moreover, the collaboration with the research centre IMEC made it possible to test the new technology directly on an industry-compatible wafer. Kevin Garello, a former post-doc from Gambardella’s lab, produced the chips containing the tunnel contacts for the experiments at ETH and optimized the materials for them. In principle, the technology would, therefore, be immediately ready for use in a new generation of MRAM.

    Gambardella stresses that MRAM memories are particularly interesting because, differently from conventional main memories such as SRAM or DRAM, they don’t lose their information when the computer is switched off, but are still equally fast. He concedes, though, that the market for MRAM memories currently does not demand such high writing speeds, since other technical bottlenecks such as power losses caused by large switching currents limit the access times. In the meantime, he and his co-workers are already planning further improvements: they want to shrink the tunnel junctions and use different materials that use current more efficiently.

    Science paper:
    E. Grimaldi et al., “Single-shot dynamics of spin–orbit torque and spin transfer torque switching in three-terminal magnetic tunnel junctions,” Nature Nanotechnology.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment; to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
  • richardmitnick 3:25 pm on February 21, 2020
    Tags: "Scientists predict state of matter that can conduct both electricity and energy perfectly", Physics

    From University of Chicago via phys.org: “Scientists predict state of matter that can conduct both electricity and energy perfectly” 

    From University of Chicago

    via

    phys.org

    February 21, 2020

    From left: Shiva Safaei, David Mazziotti, and LeeAnn Sager discuss their finding that a dual state of matter with both fermion and exciton condensates could exist. Credit: University of Chicago

    Three scientists from the University of Chicago have run the numbers, and they believe there may be a way to make a material that could conduct both electricity and energy with 100% efficiency—never losing any to heat or friction.

    The breakthrough, published Feb. 18 in Physical Review B, suggests a framework for an entirely new type of matter, which could have very useful technological applications in the real world. Though the prediction is based on theory, efforts are underway to test it experimentally.

    “We started out trying to answer a really basic question, to see if it was even possible—we thought these two properties might be incompatible in one material,” said co-author and research adviser David Mazziotti, a professor of Chemistry and the James Franck Institute and an expert in molecular electronic structure. “But to our surprise, we found the two states actually become entangled at a quantum level, and so reinforce each other.”

    Since an untold amount of energy is lost off power lines, engines and machinery every year, scientists are eager to find more efficient alternatives. “In many ways, this is the most important question of the 21st century—how to generate and move energy with minimal loss,” Mazziotti said.

    We’ve known about superconductors—a kind of material that can conduct electricity forever with nearly zero loss—for more than a century. But it was only in the last few years that scientists managed to make a similar material in the laboratory which can conduct energy with nearly zero loss, called an exciton condensate.

    But both superconductors and exciton condensates are tricky materials to make and to keep functioning—partly because scientists don’t fully understand how they work and the theory behind them is incomplete. We do know, however, that both involve the action of quantum physics.

    UChicago graduate student LeeAnn Sager began to wonder how the two states could be generated in the same material. Mazziotti’s group specializes in exploring the properties and structures of materials and chemicals using computation, so she began plugging different combinations into a computer model. “We scanned through many possibilities, and then to our surprise, found a region where both states could exist together,” she said.

    It appears that in the right configuration, the two states actually become entangled—a quantum phenomenon in which systems become inextricably linked together. This challenges the conventional notion that the two states are unrelated, and may open a new field of dual exciton and fermion pair condensates.

    Using some advanced mathematics, they showed that thanks to the quantum entanglement, the dual condensates should theoretically exist even at the macroscopic size—that is, visible to the human eye.

    “This implies that such condensates may be realizable in novel materials, such as a double layer of superconductors,” Sager said.

    The scientists are working with experimental groups to see if the prediction can be achieved in real materials.

    “Being able to combine superconductivity and exciton condensates would be amazing for lots of applications—electronics, spintronics, quantum computing,” said Shiva Safaei, a postdoctoral researcher and the third author on the paper. “Though this is a first step, it looks extremely promising.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Science X in 100 words

    Science X™ is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004 (Physorg.com), Science X’s readership has grown steadily to include 5 million scientists, researchers, and engineers every month. Science X publishes approximately 200 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Science X community members enjoy access to many personalized features such as social networking, a personal home page set-up, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

     
  • richardmitnick 1:47 pm on February 21, 2020
    Tags: "Otago physicists grab individual atoms in ground-breaking experiment", Physics, The experiment improves on current knowledge by offering a previously unseen view into the microscopic world surprising researchers with the results, The University of Otago, Trapping and cooling of three atoms to a temperature of about a millionth of a Kelvin using highly focused laser beams in a hyper-evacuated (vacuum) chamber

    From The University of Otago, NZ: “Otago physicists grab individual atoms in ground-breaking experiment” 


    From The University of Otago

    20 February 2020

    Associate Professor Mikkel Andersen
    Department of Physics
    University of Otago
    Tel +64 3 479 7805
    Email mikkel.andersen@otago.ac.nz

    Mark Hathaway
    Senior Communications Adviser
    University of Otago
    Mob +64 21 279 5016
    Email mark.hathaway@otago.ac.nz

    LASER-cooled atom cloud viewed through microscope camera.

    In a first for quantum physics, University of Otago researchers have “held” individual atoms in place and observed previously unseen complex atomic interactions.

    A myriad of equipment including lasers, mirrors, a vacuum chamber, and microscopes assembled in Otago’s Department of Physics, plus a lot of time, energy, and expertise, have provided the ingredients to investigate this quantum process, which until now was only understood through statistical averaging from experiments involving large numbers of atoms.

    The experiment improves on current knowledge by offering a previously unseen view into the microscopic world, surprising researchers with the results.

    “Our method involves the individual trapping and cooling of three atoms to a temperature of about a millionth of a Kelvin using highly focused laser beams in a hyper-evacuated (vacuum) chamber, around the size of a toaster. We slowly combine the traps containing the atoms to produce controlled interactions that we measure,” says Associate Professor Mikkel F. Andersen of Otago’s Department of Physics.

    Mikkel Andersen (left) and Marvin Weyland in the physics lab.

    When the three atoms approach each other, two form a molecule, and all receive a kick from the energy released in the process. A microscope camera allows the process to be magnified and viewed.

    “Two atoms alone can’t form a molecule; it takes at least three to do chemistry. Our work is the first time this basic process has been studied in isolation, and it turns out that it gave several surprising results that were not expected from previous measurements in large clouds of atoms,” says Postdoctoral Researcher Marvin Weyland, who spearheaded the experiment.

    For example, the researchers were able to see the exact outcome of individual processes, and observed a new process where two of the atoms leave the experiment together. Until now, this level of detail has been impossible to observe in experiments with many atoms.

    “By working at this molecular level, we now know more about how atoms collide and react with one another. With development, this technique could provide a way to build and control single molecules of particular chemicals,” Weyland adds.

    Associate Professor Andersen admits the technique and level of detail can be difficult to comprehend for those outside the world of quantum physics; however, he believes the applications of this science will be useful in the development of future quantum technologies that might impact society as much as the earlier quantum technologies that enabled modern computers and the Internet.

    “Research on being able to build on a smaller and smaller scale has powered much of the technological development over the past decades. For example, it is the sole reason that today’s cellphones have more computing power than the supercomputers of the 1980s. Our research tries to pave the way for being able to build at the very smallest scale possible, namely the atomic scale, and I am thrilled to see how our discoveries will influence technological advancements in the future,” Associate Professor Andersen says.

    The experiment’s findings [Physical Review Letters] showed that molecule formation took much longer than expected compared with other experiments and theoretical calculations, which currently are insufficient to explain this phenomenon. While the researchers suggest mechanisms that may explain the discrepancy, they highlight the need for further theoretical development in this area of experimental quantum mechanics.

    This completely New Zealand-based research was primarily carried out by members of the University of Otago’s Department of Physics, with assistance from theoretical physicists at Massey University.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Otago, founded in 1869 by an ordinance of the Otago Provincial Council, is New Zealand’s oldest university. The new University was given 100,000 acres of pastoral land as an endowment and authorised to grant degrees in Arts, Medicine, Law and Music.

    The University opened in July 1871 with a staff of just three Professors, one to teach Classics and English Language and Literature, another having responsibility for Mathematics and Natural Philosophy, and the third to cover Mental and Moral Philosophy and Political Economy. The following year a Professor of Natural Science joined the staff. With a further endowment provided in 1872, the syllabus was widened and new lectureships established: lectures in Law started in 1873, and in 1875 courses began in Medicine. Lectures in Mining were given from 1872, and in 1878 a School of Mines was established.

    The University was originally housed in a building (later the Stock Exchange) on the site of John Wickliffe House in Princes Street but it moved to its present site with the completion of the northern parts of the Clocktower and Geology buildings in 1878 and 1879.

    The School of Dentistry was founded in 1907 and the School of Home Science (later Consumer and Applied Sciences) in 1911. Teaching in Accountancy and Commerce subjects began in 1912. Various new chairs and lectureships were established in the years between the two world wars, and in 1946 teaching began in the Faculty of Theology. The School of Physical Education was opened in 1947.

    A federal University of New Zealand was established by statute in 1870 and became the examining and degree-granting body for all New Zealand university institutions until 1961. The University of Otago had conferred just one Bachelor of Arts degree, on Mr Alexander Watt Williamson, when in 1874 it became an affiliated college of the University of New Zealand.

    In 1961 the University of New Zealand was disestablished, and the power to confer degrees was restored to the University of Otago by the University of Otago Amendment Act 1961.

    Since 1961, when its roll was about 3,000, the University has expanded considerably (in 2016 there were over 20,000 students enrolled) and has broadened its range of qualifications to include undergraduate programmes in Surveying, Pharmacy, Medical Laboratory Science, Teacher Education, Physiotherapy, Applied Science, Dental Technology, Radiation Therapy, Dental Hygiene and Dental Therapy (now combined in an Oral Health programme), Biomedical Sciences, Social Work, and Performing Arts, as well as specialised postgraduate programmes in a variety of disciplines.

    Although the University’s main campus is in Dunedin, it also has Health Sciences campuses in Christchurch (University of Otago, Christchurch) and Wellington (University of Otago, Wellington) (established in 1972 and 1977 respectively), an information and teaching centre in central Auckland (1996), and an information office in Wellington (2001).

    The Dunedin College of Education merged with the University on 1 January 2007, and this added a further campus in Invercargill.

     
  • richardmitnick 4:40 pm on February 18, 2020 Permalink | Reply
    Tags: MIP (multiprover interactive proof), Physics

    From Science News: “How a quantum technique highlights math’s mysterious link to physics” 

    From Science News

    February 17, 2020
    Tom Siegfried

    Verifying proofs to very hard math problems is possible with infinite quantum entanglement.

    A technique that relies on quantum entanglement (illustrated) expands the realm of mathematical problems for which the solution could (in theory) be verified. inkoly/iStock/Getty Images Plus.

    It has long been a mystery why pure math can reveal so much about the nature of the physical world.

    Antimatter was discovered in Paul Dirac’s equations before being detected in cosmic rays. Quarks appeared in symbols sketched out on a napkin by Murray Gell-Mann several years before they were confirmed experimentally. Einstein’s equations for gravity suggested the universe was expanding a decade before Edwin Hubble provided the proof. Einstein’s math also predicted gravitational waves a full century before behemoth apparatuses detected those waves (which were produced by collisions of black holes — also first inferred from Einstein’s math).

    Nobel laureate physicist Eugene Wigner alluded to math’s mysterious power as the “unreasonable effectiveness of mathematics in the natural sciences.” Somehow, Wigner said, math devised to explain known phenomena contains clues to phenomena not yet experienced — the math gives more out than was put in. “The enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious and … there is no rational explanation for it,” Wigner wrote in 1960.

    But maybe there’s a new clue to what that explanation might be. Perhaps math’s peculiar power to describe the physical world has something to do with the fact that the physical world also has something to say about mathematics.

    At least that’s a conceivable implication of a new paper that has startled the interrelated worlds of math, computer science and quantum physics.

    In an enormously complicated 165-page paper, computer scientist Zhengfeng Ji and colleagues present a result that penetrates to the heart of deep questions about math, computing and their connection to reality. It’s about a procedure for verifying the solutions to very complex mathematical propositions, even some that are believed to be impossible to solve. In essence, the new finding boils down to demonstrating a vast gulf between infinite and almost infinite, with huge implications for certain high-profile math problems. Seeing into that gulf, it turns out, requires the mysterious power of quantum physics.

    Everybody involved has long known that some math problems are too hard to solve (at least without unlimited time), but a proposed solution could be rather easily verified. Suppose someone claims to have the answer to such a very hard problem. Their proof is much too long to check line by line. Can you verify the answer merely by asking that person (the “prover”) some questions? Sometimes, yes. But for very complicated proofs, probably not. If there are two provers, though, both in possession of the proof, asking each of them some questions might allow you to verify that the proof is correct (at least with very high probability). There’s a catch, though — the provers must be kept separate, so they can’t communicate and therefore collude on how to answer your questions. (This approach is called MIP, for multiprover interactive proof.)
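
    The multiprover idea can be illustrated with a toy classical example (a simplification for intuition, not the construction in the paper): two provers claim to hold a valid 3-coloring of a graph, and the verifier cross-examines them separately about randomly chosen edges. Because the provers cannot communicate, inconsistent answers eventually betray a bluff. The graph and coloring below are invented for illustration:

    ```python
    import random

    # A toy graph plus a valid 3-coloring -- the shared "proof" both provers hold.
    EDGES = [(0, 1), (1, 2), (2, 0), (2, 3)]
    GOOD = {0: "red", 1: "green", 2: "blue", 3: "red"}
    BAD = {0: "red", 1: "red", 2: "red", 3: "red"}   # not a valid coloring

    def verify(coloring, rounds=100):
        """Cross-examine two separated provers about randomly chosen edges."""
        def prover(vertex):               # both provers answer from the same proof
            return coloring[vertex]
        for _ in range(rounds):
            u, v = random.choice(EDGES)
            a_u, a_v = prover(u), prover(v)       # ask prover A about both endpoints
            w = random.choice([u, v])
            b = prover(w)                         # independently ask prover B about one
            if a_u == a_v:                        # a proper coloring never repeats across an edge
                return False
            if b != (a_u if w == u else a_v):     # the separated answers must agree
                return False
        return True
    ```

    Here honest provers always pass, and a proof with a flaw on every edge always fails; in general the verifier only gains high statistical confidence, not certainty.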

    Verifying a proof without actually seeing it is not that strange a concept. Many examples exist for how a prover can convince you that they know the answer to a problem without actually telling you the answer. A standard method for coding secret messages, for example, relies on using a very large number (perhaps hundreds of digits long) to encode the message. It can be decoded only by someone who knows the prime factors that, when multiplied together, produce the very large number. It’s impossible to figure out those prime numbers (within the lifetime of the universe) even with an army of supercomputers. So if someone can decode your message, they’ve proved to you that they know the primes, without needing to tell you what they are.
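
    The asymmetry this scheme relies on, that multiplying primes is trivial while recovering them is expensive, can be seen even at toy scale (real systems use numbers hundreds of digits long, far beyond brute force):

    ```python
    def trial_factor(n):
        """Recover the prime factors of n by brute-force trial division --
        fine at toy scale, hopeless for hundreds-of-digit numbers."""
        d, factors = 2, []
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    # Multiplying the secret primes takes no time at all...
    p, q = 104723, 104729
    n = p * q
    # ...but undoing the multiplication is the expensive direction,
    # and the cost explodes as the primes get longer.
    assert trial_factor(n) == [p, q]
    ```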

    Someday, though, calculating those primes might be feasible, with a future-generation quantum computer. Today’s quantum computers are relatively rudimentary, but in principle, an advanced model could crack codes by calculating the prime factors for enormously big numbers.

    That power stems, at least in part, from the weird phenomenon known as quantum entanglement. And it turns out that, similarly, quantum entanglement boosts the power of MIP provers. By sharing an infinite amount of quantum entanglement, MIP provers can verify vastly more complicated proofs than nonquantum MIP provers.

    It is obligatory to say that entanglement is what Einstein called “spooky action at a distance.” But it’s not action at a distance, and it just seems spooky. Quantum particles (say photons, particles of light) from a common origin (say, both spit out by a single atom) share a quantum connection that links the results of certain measurements made on the particles even if they are far apart. It may be mysterious, but it’s not magic. It’s physics.

    Say two provers share a supply of entangled photon pairs. They can convince a verifier that they have a valid proof for some problems. But for a large category of extremely complicated problems, this method works only if the supply of such entangled particles is infinite. A large amount of entanglement is not enough. It has to be literally unlimited. A huge but finite amount of entanglement can’t even approximate the power of an infinite amount of entanglement.

    As Emily Conover explains in her report for Science News, this discovery proves false a couple of widely believed mathematical conjectures. One, known as Tsirelson’s problem, specifically suggested that a sufficient amount of entanglement could approximate what you could do with an infinite amount. Tsirelson’s problem was mathematically equivalent to another open problem, known as Connes’ embedding conjecture, which has to do with the algebra of operators, the kinds of mathematical expressions that are used in quantum mechanics to represent quantities that can be observed.

    Refuting the Connes conjecture, and showing that MIP plus entanglement could be used to verify immensely complicated proofs, stunned many in the mathematical community. (One expert, upon hearing the news, compared his feces to bricks.) But the new work isn’t likely to make any immediate impact in the everyday world. For one thing, all-knowing provers do not exist, and if they did they would probably have to be future super-AI quantum computers with unlimited computing capability (not to mention an unfathomable supply of energy). Nobody knows how to do that in even Star Trek’s century.

    Still, pursuit of this discovery quite possibly will turn up deeper implications for math, computer science and quantum physics.

    It probably won’t shed any light on controversies over the best way to interpret quantum mechanics, as computer science theorist Scott Aaronson notes in his blog about the new finding. But perhaps it could provide some sort of clues regarding the nature of infinity. That might be good for something, perhaps illuminating whether infinity plays a meaningful role in reality or is a mere mathematical idealization.

    On another level, the new work raises an interesting point about the relationship between math and the physical world. The existence of quantum entanglement, a (surprising) physical phenomenon, somehow allows mathematicians to solve problems that seem to be strictly mathematical. Wondering why physics helps out math might be just as entertaining as contemplating math’s unreasonable effectiveness in helping out physics. Maybe even one will someday explain the other.

    See the full article here.




     
  • richardmitnick 4:01 pm on February 17, 2020 Permalink | Reply
    Tags: "UNTIL THE END OF TIME: Mind, Matter, and Our Search for Meaning in an Evolving Universe" by Brian Greene, Physics

    From The New York Times: “Just a Few Billion Years Left to Go” 

    From The New York Times

    Feb. 17, 2020
    Dennis Overbye

    UNTIL THE END OF TIME
    Mind, Matter, and Our Search for Meaning in an Evolving Universe
    By Brian Greene

    Brian Greene’s main idea, his own grand, unified theory of human endeavor, is that we want to transcend death by attaching ourselves to something permanent that will outlast us. Credit Elena Seibert

    “In the fullness of time all that lives will die.” With this bleak truth Brian Greene, a physicist and mathematician at Columbia University, the author of best-selling books like The Elegant Universe and co-founder of the yearly New York celebration of science and art known as the World Science Festival, sets off in Until the End of Time on the ultimate journey, a meditation on how we go on doing what we do, why and how it will end badly, and why it matters anyway.

    For going on is what we do, building bridges, spaceships and families, composing great symphonies and other works of art, directing movies, and waging wars and presidential campaigns, even though not only are we going to die, but so is all life everywhere in the fullness of eternity, according to what science now thinks it knows about us and the universe.

    Until the End of Time is encyclopedic in its ambition and its erudition, often heartbreaking, stuffed with too many profundities that I wanted to quote, as well as potted descriptions of the theories of a galaxy of contemporary thinkers, from Chomsky to Hawking, and anecdotes from Greene’s own life — of which we should wish for more — that had me laughing.

    It is also occasionally afflicted with stretches of prose that seem as if eternity will come before you ever get through them, especially when Greene is discussing challenging topics like entropy. If I really understood entropy, I suspect I would be writing this review in an office at M.I.T., not an apartment on Manhattan’s Upper West Side.

    Greene’s main idea, his own grand unified theory of human endeavor, expanding on the thoughts of people like Otto Rank, Jean-Paul Sartre and Oswald Spengler, is that we want to transcend death by attaching ourselves to something permanent that will outlast us: art, science, our families and so forth.

    For Greene this impulse has taken the form of a lifetime devotion to mathematics and physics, of the search for laws and truths that transcend time and place. “The enchantment of a mathematical proof might be that it stands forever,” he writes.

    If he dies, the work lives on as part of the body of science and knowledge. But as a cosmologist, he knows this is an illusion: “As our trek across time will make clear, life is likely transient, and all understanding that arose with its emergence will almost certainly dissolve with its conclusion. Nothing is permanent. Nothing is absolute.”

    Depressing. But in a Starbucks one day, he says, he had a realization, a sort of conversion to gratitude. Life and thought might occupy only a minute oasis in cosmic time, but, he writes, “If you take that in fully, envisioning a future bereft of stars and planets and things that think, your regard for our era can appreciate toward reverence.” Or maybe, he jokes, he was just losing his mind.

    This book, then, is a love letter to the ephemeral cosmic moment when everything is possible. Reading it is like riding an escalator up through a giant department store. On the lower floors you find things like time, energy, gravity and the Big Bang, and biology.

    The universe is expanding — why? So far the best explanation is that a virulent antigravitational force dubbed “inflation” — and strangely allowed by Einstein’s equations — briefly switched on during the first split trillionth of a second of time and sent everything flying, but astronomers still lack the smoking-gun proof.

    All living creatures that we know about on Earth share the same genetic tool kit, based on DNA. And we are all battery-operated, deriving energy from a molecule called adenosine triphosphate, ATP for short. In order to keep going, Greene tells us, each cell in your body consumes some 10 million of these molecules every second.
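
    Scaled up across a whole body, those numbers are staggering. A rough estimate, assuming the commonly cited figure of about 37 trillion cells in a human body (a number not given in the article):

    ```python
    # Back-of-envelope total ATP turnover for a whole body.
    ATP_PER_CELL_PER_SECOND = 10_000_000    # from the text: ~10 million per cell
    CELLS_PER_BODY = 3.7e13                 # assumption: ~37 trillion cells

    total_per_second = ATP_PER_CELL_PER_SECOND * CELLS_PER_BODY
    print(f"{total_per_second:.1e} ATP molecules consumed per second")  # ~3.7e+20
    ```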

    Upward we go through the emporium of ideas to floors dedicated to consciousness, free will, language and religion. We don’t linger long on any floor. Greene is like one of those custom shopping consultants. He knows the wares, the ideas being pitched in every department. He drags in all the experts — from Proust to Hawking — and tries to be an honest broker about the answers to questions we can’t really answer.

    Why do humans tell stories? Was there an evolutionary advantage to be gained from taking time out from the hunt to sit around the campfire and gab — a bonding experience? Is the shared imagination a way to practice navigating unknown territory, or a guide for living your life?

    Can physics explain not just how the mind — neurons and electrochemical impulses — works but also explain the feeling of having a mind, that is to say consciousness? Greene is cautiously hopeful it can. “That the mind can do all it does is extraordinary. That the mind may accomplish all it does with nothing more than the kinds of ingredients and types of forces holding together my coffee cup, makes it more extraordinary still. Consciousness would be demystified without being diminished.”

    But he’s not always sure. Admitting that the neurophysical facts shed only “a monochrome light” on human experience, he extols art as another dimension. “We gain access to worlds otherwise uncharted,” he says. “As Proust emphasized, this is to be celebrated. Only through art, he noted, can we enter the secret universe of another, the only journey in which we truly ‘fly from star to star,’ a journey that cannot be navigated by ‘direct and conscious methods.’”

    Two main themes run through this story. The first is natural selection, the endless inventive process of evolution that keeps molding organisms into more and more complex arrangements and codependencies. The second is what Greene calls the “entropic-two step.” This refers to the physical property known as entropy. In thermodynamics it denotes the amount of heat — wasted energy — inevitably produced by a steam engine, for example as it goes through its cycle of expansion and contraction. It’s the reason you can’t build a perpetual motion machine. In modern physics it’s a measure of disorder and information. Entropy is a big concept in information theory and black holes, as well as in biology.
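
    The information-theoretic sense of entropy that Greene invokes can be made concrete with Shannon's formula, sketched here for illustration (the example distributions are invented):

    ```python
    import math

    def shannon_entropy(probs):
        """Entropy in bits: the average information needed to record one outcome.
        The more disordered (uniform) the distribution, the higher the entropy."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally disordered: 1 bit per toss.
    assert shannon_entropy([0.5, 0.5]) == 1.0
    # A heavily biased coin is nearly predictable, so it carries little information.
    assert shannon_entropy([0.99, 0.01]) < 0.1
    ```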

    We are all little steam engines, apparently, and everything we accomplish has a cost. That is why your exhaust pipe gets too hot to touch, or why your desk tends to get more cluttered by the end of the day.

    In the end, Greene says, entropy will get us all, and everything else in the universe, tearing down what evolution has built. “The entropic two-step and the evolutionary forces of selection enrich the pathway from order to disorder with prodigious structure, but whether stars or black holes, planets or people, molecules or atoms, things ultimately fall apart,” he writes.

    In a virtuosic final section Greene describes how this will work by inviting us to climb an allegorical Empire State Building; on each floor the universe is 10 times older. If the first floor is Year 10, we now are just above the 10th (10 billion years). By the time we get to the 11th floor the sun will be gone and with it probably any life on Earth. As we climb higher we are exposed to expanses of time that make the current age of the universe look like less than the blink of an eye.

    Eventually the Milky Way galaxy will fall into a black hole. On about the 38th floor of the future, when the universe is 100 trillion trillion trillion years old, protons, the building blocks of atoms, will dissolve out from under us, leaving space populated by a thin haze of lightweight electrons and a spittle of radiation.
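
    The allegory has simple arithmetic behind it: the floor number is just the base-10 logarithm of the universe's age in years. A small sketch of that mapping:

    ```python
    import math

    def floor_of(age_in_years):
        """Each floor multiplies the age of the universe tenfold:
        floor n corresponds to 10**n years."""
        return math.log10(age_in_years)

    # Today (~13.8 billion years): just above the 10th floor.
    # Protons dissolving (~1e38 years): around the 38th floor.
    print(round(floor_of(1.38e10), 1), round(floor_of(1e38)))
    ```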

    In the far, far, far, far future, even holding a thought will require more energy than will be available in the vastly dissipated universe. It will be an empty and cold place that doesn’t remember us. “Nabokov’s description of a human life as a ‘brief crack of light between two eternities of darkness’ may apply to the phenomenon of life itself,” Greene writes.

    In the end it is up to us to make of this what we will. We can contemplate eternity, Greene concludes, “and even though we can reach for eternity, apparently we cannot touch eternity.”

    See the full article here.



     
  • richardmitnick 10:34 am on February 14, 2020 Permalink | Reply
    Tags: "Light Sources Form Data Solution Task Force", Physics

    From Brookhaven National Lab: “Light Sources Form Data Solution Task Force” 

    From Brookhaven National Lab

    February 12, 2020
    Stephanie Kossman
    skossman@bnl.gov

    New collaboration between scientists at the five U.S. Department of Energy light source facilities will develop flexible software to easily process big data.

    BNL NSLS-II

    LBNL ALS

    ANL Advanced Photon Source

    SLAC SSRL Campus

    SLAC LCLS

    Above are the five DOE light sources: Brookhaven National Laboratory’s National Synchrotron Light Source II (NSLS-II), Lawrence Berkeley National Laboratory’s Advanced Light Source (ALS), Argonne National Laboratory’s Advanced Photon Source (APS), and SLAC National Accelerator Laboratory’s Stanford Synchrotron Radiation Lightsource (SSRL) and Linac Coherent Light Source (LCLS).

    Light source facilities are tackling some of today’s biggest scientific challenges, from designing new quantum materials to revealing protein structures. But as these facilities continue to become more technologically advanced, processing the wealth of data they produce has become a challenge of its own. By 2028, the five U.S. Department of Energy (DOE) Office of Science light sources will produce data at the exabyte scale, or on the order of billions of gigabytes, each year. Now, scientists have come together to develop synergistic software to solve that challenge.
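
    For a sense of scale, one exabyte per year works out to a sustained rate of roughly 32 gigabytes every second; a quick conversion (decimal units assumed):

    ```python
    # One exabyte per year, expressed as a sustained data rate.
    EXABYTE = 1e18                          # bytes (decimal definition)
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    rate_gb_per_second = EXABYTE / SECONDS_PER_YEAR / 1e9
    print(f"{rate_gb_per_second:.0f} GB/s sustained")  # ~32 GB/s
    ```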

    With funding from DOE for a two-year pilot program, scientists from the five light sources have formed a Data Solution Task Force that will demonstrate, build, and implement software, cyberinfrastructure, and algorithms that address universal needs between all five facilities. These needs range from real-time data analysis capabilities to data storage and archival resources.

    “It is exciting to see the progress that is being made by all the light sources working together to produce solutions that will be deployed across the whole DOE complex,” said Stuart Campbell, leader of the data acquisition, management and analysis group at the National Synchrotron Light Source II (NSLS-II), a DOE Office of Science user facility at DOE’s Brookhaven National Laboratory.

    In addition, the new software will be designed to facilitate multimodal research—studies that combine data collected from multiple experimental stations, called beamlines. Typically, each beamline at a light source uses custom-built data acquisition software that is incompatible with another beamline’s, making it difficult for scientists to collect and compare data from multiple experimental stations. The task force aims to develop flexible software that can be deployed at multiple beamlines across all five facilities, expanding the possibilities for scientific collaboration.

    Members of the task force met at NSLS-II for a project kickoff meeting in August of 2019.

    To develop the new software, the task force will start by building on existing solutions already in place at the five light sources. Two of the key components are Bluesky, an open source software that was created at NSLS-II, and Xi-CAM, which was developed at the Advanced Light Source (ALS) and the Center for Advanced Mathematics for Energy Research Applications—both at DOE’s Lawrence Berkeley National Laboratory. Together, Bluesky and Xi-CAM will provide capabilities like live visualization and interactivity, data processing tools, and the ability to export data in real time into nearly any file format.
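
    The kind of pipeline Bluesky enables can be caricatured as a stream of named documents ("start", "event", "stop") fanned out to any number of subscribed consumers, such as live plots or file exporters. The sketch below is a simplified illustration of that streaming pattern; the class and method names are illustrative, not Bluesky's actual API:

    ```python
    # A stripped-down caricature of a streaming "document model":
    # a run emits named documents and every subscriber sees the same stream.
    class ToyRunEngine:
        def __init__(self):
            self.subscribers = []

        def subscribe(self, callback):
            """Register a consumer; each gets every (name, document) pair."""
            self.subscribers.append(callback)

        def emit(self, name, doc):
            for callback in self.subscribers:
                callback(name, doc)

        def scan(self, readings):
            """A toy scan: one start document, one event per reading, one stop."""
            self.emit("start", {"plan": "scan", "num_points": len(readings)})
            for seq, value in enumerate(readings):
                self.emit("event", {"seq_num": seq, "data": value})
            self.emit("stop", {"exit_status": "success"})

    received = []
    engine = ToyRunEngine()
    engine.subscribe(lambda name, doc: received.append(name))
    engine.scan([0.1, 0.4, 0.9])
    print(received)  # ['start', 'event', 'event', 'event', 'stop']
    ```

    Because every consumer sees the same document stream, live visualization, processing, and export can be attached or swapped without touching the acquisition code, which is what makes the pattern portable across beamlines.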

    Each of the five light sources in the task force is bringing unique tools and skillsets to help develop a more robust and scalable solution to extract scientific knowledge from data for the nation’s light sources.

    “There is tremendous enthusiasm at the light sources for solving the data challenge,” said Alexander Hexemer, senior scientist and computing program lead at ALS. “We strongly believe this will be the path forward for light sources to work together in the future.”

    With the task force in its early stages, researchers have begun running test experiments on beamlines at NSLS-II and installing Bluesky and Xi-CAM at the Advanced Photon Source, a DOE Office of Science user facility at DOE’s Argonne National Laboratory.

    By the end of the two-year pilot project, “we plan to deliver a set of tools that will provide an end-to-end software solution for the targeted scientific areas that can be deployed and used on different beamlines across all the DOE light sources,” Campbell said.

    Alongside the task force pilot, the five light sources are working with DOE to develop data systems solutions that will scale to the unprecedented data rates that will be produced in the near future, using the new generation of “exascale” computers being built by DOE.

    See the full article here.




    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 6:47 pm on February 13, 2020 Permalink | Reply
    Tags: "University of Chicago to build instrumentation for upgrades to the Large Hadron Collider", Physics

    From University of Chicago: “University of Chicago to build instrumentation for upgrades to the Large Hadron Collider” 


    From University of Chicago

    Feb 13, 2020
    Natalie Lund

    The ATLAS detector at the Large Hadron Collider. UChicago scientists will build components for an upgrade to the detector. CERN.

    Faculty, students, engineers to design and build systems for ATLAS experiment.

    In 2012, scientists and the public around the world rejoiced at the news that CERN’s Large Hadron Collider had discovered the long-sought Higgs boson—a particle regarded as a linchpin in the Standard Model of particle physics, the theory that describes the fundamental forces and classifies all known elementary particles.

    Standard Model of Particle Physics, Quantum Diaries

    CERN CMS Higgs Event May 27, 2012

    CERN ATLAS Higgs Event

    Despite the breakthrough, subsequent collisions in the machine have yet to produce evidence of what physicists call “new physics”: science that could address the areas where the Standard Model seems to break down—like dark matter, dark energy and why there is more matter than antimatter. So now, the particle accelerator and its detectors are getting an upgrade.

    On Feb. 5, the National Science Foundation and the National Science Board gave the green light for $75 million in funding for upgrades to the ATLAS experiment, one of the collider’s two seven-story-high, half-a-football-field-long detectors—opening the doors for the discovery of new particles and rare processes. Approximately $5.5 million will go to the University of Chicago, a founding member of the ATLAS experiment, to design and build several components for the upgraded detector.

    “These upgrades will help the physics community answer glaring questions surrounding the structure of the fundamental particle universe,” said Asst. Prof. David Miller, a particle physicist who has worked extensively on the ATLAS detector and is co-leading the University’s participation in the upgrade. “Why do the fundamental particles that we know about exist in the first place? What is the underlying pattern and structure behind them?”

    The upgrades, which are estimated for completion in 2026, will allow researchers to study the Higgs boson in greater detail; continue the hunt for dark matter, which comprises 25% of our universe and has never been directly detected; and identify new particles, interactions, and physical properties such as new symmetries or spatial dimensions.

    The upgrades to the LHC itself will increase its luminosity—the intensity of its proton beams—by a factor of ten, substantially increasing the number of particle collisions that occur in a given amount of time. The ATLAS detector, the “camera” that captures images of those collisions, must therefore also be upgraded to filter larger quantities of data at high speed and to withstand more intense radiation.

    “The biggest challenge with our existing detector is separating the signal from the background. For every interesting particle you produce, there are probably something like a million standard particle decays that look about the same,” said Prof. Mark Oreglia, a renowned expert in collider research and development and the other leader for the project.

    Researchers at the University of Chicago will build portions of the calorimeter, the system that measures the energy of the particles that enter the detector; and the trigger, which tells the detector what images, or “events” to record or ignore.

    The challenge for the new calorimeter is building an instrument so sensitive that it can instantaneously measure the light and energy from the roughly 200 proton-proton collisions that will occur in each bunch crossing, 40 million times per second, while remaining robust enough to withstand the intense radiation those collisions produce.
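    For a sense of scale, the figures quoted above multiply out to billions of collisions per second. A quick back-of-envelope check:

    ```python
    # Back-of-envelope collision rate at the upgraded LHC, using the
    # numbers quoted above: ~200 proton-proton collisions per bunch
    # crossing, with crossings occurring 40 million times per second.
    bunch_crossing_rate_hz = 40e6
    collisions_per_crossing = 200

    collisions_per_second = bunch_crossing_rate_hz * collisions_per_crossing
    print(f"{collisions_per_second:.0e} collisions per second")  # 8e+09
    ```

    That is 8 billion collisions every second that the calorimeter must keep up with.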

    UChicago researchers already have built prototypes of some components and sent them for rigorous testing to ensure they could withstand the LHC’s increased intensity. Construction of electronics is slated to begin this spring, with undergraduate students participating in the testing of the boards to look for short circuits and other flaws.

    CERN staff member Irakli Minashvili asks UChicago undergraduate student Hadar Lazar for the results of a test she is running on ATLAS detector electronics. Courtesy Mark Oreglia.

    Another challenge posed by the upgraded LHC is the volume of data produced by the collisions.

    “In their raw form, the data volume is nearly one petabyte per second, so there’s no way we can save that amount,” Miller said. “We have to come up with clever ways to determine what to keep and what to throw away.”
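    The petabyte-per-second figure can be turned into a rough per-event budget. The kept-event rate below is a purely illustrative assumption, not a number from the article:

    ```python
    # Rough arithmetic behind the data-volume problem. The ~1 PB/s raw
    # rate and the 40 million bunch crossings per second come from the
    # article; the trigger output rate is a hypothetical placeholder.
    raw_bytes_per_second = 1e15      # ~1 petabyte of raw data per second
    crossings_per_second = 40e6      # one potential "event" per bunch crossing

    bytes_per_event = raw_bytes_per_second / crossings_per_second
    print(f"~{bytes_per_event / 1e6:.0f} MB per raw event")  # ~25 MB

    kept_events_per_second = 1e4     # illustrative trigger output rate
    rejection_factor = crossings_per_second / kept_events_per_second
    print(f"the trigger must discard all but 1 in ~{rejection_factor:.0f} events")
    ```

    Even at tens of megabytes per event, only a tiny fraction of events can be kept, which is why the trigger's decisions matter so much.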

    Miller’s team is partnering with UChicago Computer Science faculty Assoc. Prof. Risi Kondor and Asst. Prof. Yuxin Chen, tapping their pioneering work in machine learning to develop innovative algorithms and software to tackle this unprecedented task.

    “Machine learning helps us detect patterns in the data and uncover features that we might not otherwise have seen,” Miller said. “For example, I’m working with Risi Kondor to build a completely new type of neural network whose inherent structure reflects known symmetries of nature.”
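    The article does not specify the architecture, but one common way to build a known symmetry into a network's structure is to make its output invariant to operations that should not matter physically, such as the order in which an event's particles are listed. A minimal "deep sets"-style sketch of that idea, with all weights and sizes purely illustrative:

    ```python
    import numpy as np

    # Toy illustration of encoding a symmetry in a network's structure:
    # summing per-particle features makes the output invariant to the
    # ordering of particles in an event (a permutation symmetry). This
    # is a generic sketch, not the specific architecture from the article.
    rng = np.random.default_rng(0)
    W_phi = rng.normal(size=(4, 8))   # per-particle feature map
    W_rho = rng.normal(size=8)        # readout applied to pooled features

    def network(particles):
        # particles: (n_particles, 4) array of, e.g., four-momenta
        per_particle = np.tanh(particles @ W_phi)  # same map for every particle
        pooled = per_particle.sum(axis=0)          # order-independent pooling
        return float(pooled @ W_rho)

    event = rng.normal(size=(5, 4))
    shuffled = event[rng.permutation(5)]
    assert np.isclose(network(event), network(shuffled))  # order doesn't matter
    ```

    Because the pooling step ignores ordering, the symmetry holds by construction rather than having to be learned from data.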

    The $75 million from the National Science Foundation will complement $163 million in funding from the U.S. Department of Energy to support the upgrade to the detector.

    The project will involve multiple universities as well as national laboratories. ATLAS is a large international collaboration consisting of 3000 scientists from 182 institutions and 38 countries.

    Additional researchers and groups involved with the construction are Young-Kee Kim, the Louis Block Distinguished Service Professor of Physics; Melvyn Shochet, the Kersten Distinguished Service Professor of Physics; and the staff of the Enrico Fermi Institute’s Electronics Development Group (EDG) and MANIAC Lab. These include EDG director Mary Heintz; EDG research professor Kelby Anderson; EDG engineers Mircea Bogdan and Fukun Tang; and MANIAC Lab director, research professor Robert Gardner, who heads an NSF effort called the Scalable Systems Laboratory to develop systems for processing all of the data collected and to provide platforms for complex data analysis tasks.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.


    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

     
  • richardmitnick 12:14 pm on February 12, 2020 Permalink | Reply
    Tags: Atom or noise?, Physics, Stanford’s Department of Bioengineering

    From SLAC National Accelerator Lab: “Atom or noise? New method helps cryo-EM researchers tell the difference” 

    From SLAC National Accelerator Lab

    February 11, 2020
    Nathan Collins

    Cryogenic electron microscopy can in principle make out individual atoms in a molecule, but distinguishing the crisp from the blurry parts of an image can be a challenge. A new mathematical method may help.

    Cryogenic electron microscopy, or cryo-EM, has reached the point where researchers could in principle image individual atoms in a 3D reconstruction of a molecule – but just because they could see those details doesn’t always mean they do. Now, researchers at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University have proposed a new way to quantify how accurate such reconstructions are and, in the process, how confident they can be in their molecular interpretations. The study was published February 10 in Nature Methods.

    Cryo-EM works by freezing biological molecules, which can contain thousands of atoms, so they can be imaged under an electron microscope. By aligning and combining many two-dimensional images, researchers can compute three-dimensional maps of an entire molecule; the technique has been used to study everything from battery failure to the way viruses invade cells. One issue, however, has been hard to solve: how to accurately assess the true level of detail, or resolution, at every point in such maps, and in turn determine which atomic features are truly visible.

    A cryo-EM map of the molecule apoferritin (left) and a detail of the map showing the atomic model researchers use to construct Q-scores. (Image courtesy Greg Pintilie)

    Wah Chiu, a professor at SLAC and Stanford; Grigore Pintilie, a computational scientist in Chiu’s group; and colleagues devised the new measures, known as Q-scores, to address that issue. To compute Q-scores, scientists start by building and adjusting an atomic model until it best matches the corresponding cryo-EM-derived 3D map. Then they compare the map to an idealized version in which each atom is well resolved, revealing to what degree the map truly resolves the atoms in the model.
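    The published Q-score formula is more involved, but the core idea of scoring a map's agreement with an idealized, well-resolved version can be sketched as follows. Everything below (the Gaussian widths, sampling radii, and noise level) is an illustrative assumption, not taken from the paper:

    ```python
    import numpy as np

    # Schematic of the Q-score idea: compare map values sampled around a
    # modeled atom position with an idealized, well-resolved (Gaussian-like)
    # profile, and score the agreement with a normalized cross-correlation.
    # This illustrates the concept only, not the published formula.
    def resolvability_score(map_values, ideal_values):
        u = map_values - map_values.mean()
        v = ideal_values - ideal_values.mean()
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    radii = np.linspace(0.0, 2.0, 50)         # sampling radii around an atom (angstroms)
    ideal = np.exp(-radii**2 / (2 * 0.6**2))  # idealized well-resolved profile

    sharp = ideal + 0.01 * np.random.default_rng(1).normal(size=radii.size)
    blurry = np.exp(-radii**2 / (2 * 1.5**2))  # over-smoothed density

    print(resolvability_score(sharp, ideal))   # close to 1: atom well resolved
    print(resolvability_score(blurry, ideal))  # lower: atom poorly resolved
    ```

    A score near 1 means the map around that atom looks like an atom should; lower scores flag regions where the model's atomic detail outruns what the map actually resolves.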

    The researchers validated their approach on large molecules, including a protein called apoferritin that they studied in the Stanford-SLAC Cryo-EM Facilities. Kaiming Zhang, a research scientist in Chiu’s group, produced 3D maps close to the highest resolution reached to date: up to 1.75 angstroms, less than a fifth of a nanometer. Using such maps, the team showed how Q-scores varied in predictable ways depending on both the overall resolution and which parts of a molecule they examined. Pintilie and Chiu say they hope Q-scores will help biologists and others using cryo-EM better understand and interpret 3D maps and the resulting atomic models.

    The study was performed in collaboration with researchers from Stanford’s Department of Bioengineering. Molecular graphics and analysis were performed using the University of California, San Francisco’s Chimera software package. The project was funded by the National Institutes of Health.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC/LCLS


    SLAC/LCLS II projected view


    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

    SSRL and LCLS are DOE Office of Science user facilities.

     