Tagged: Symmetry

  • richardmitnick 10:53 am on September 7, 2021 Permalink | Reply
    Tags: "Curious physics results could shed light on dark matter", Symmetry

    From Symmetry: “Curious physics results could shed light on dark matter” 

    Madeleine O’Keefe

    Even experiments that aren’t looking for dark matter directly could give us hints about the mysterious substance that permeates our universe.

    Scientists love a mystery. It’s satisfying when a prediction is shown to be correct, but it’s intriguing when an experiment turns up a result that deviates from expectations.

    Several such anomalies have shown up in recent years in particle physics and astrophysics.

    Sometimes results like these can be explained by faulty equipment. Sometimes they disappear with more rigorous measurement. But sometimes they stay put and demand to be understood.

    What’s interesting about some recent anomalies is that they each have the potential to be explained by the influence of an undiscovered particle or force. Any one of them could be a sign of the existence of dark matter—a mysterious substance that makes up about 85% of the matter in our universe.


    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the movement of the Coma Cluster. Some 30 years later, Vera Rubin, a woman in STEM who was denied the Nobel Prize, did much of the foundational work on dark matter.

    Fritz Zwicky from http://palomarskies.blogspot.com.

    Coma cluster via NASA/ESA Hubble.

    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars at the edges of galaxies orbit just as fast as stars near their centers, whereas Keplerian dynamics predicts that orbital speeds should fall off with distance from the center, the way the outer planets of the solar system orbit more slowly than the inner ones. The only way to explain these flat rotation curves is if each visible galaxy is embedded in a much larger, unseen structure whose mass keeps the outer stars moving as quickly as the inner ones.
    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
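    The contrast between the Keplerian expectation and Rubin's flat rotation curves can be made concrete with a quick numerical sketch. If all of a galaxy's mass sat at its center, orbital speed would fall off as v = sqrt(GM/r), so a star at four times the radius should move at half the speed. (The mass and radii below are purely illustrative numbers, not real galactic data.)

```python
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_visible = 1e41         # illustrative central (visible) mass, in kg

def v_keplerian(r):
    """Orbital speed if all mass sits at the center: v = sqrt(G*M/r)."""
    return math.sqrt(G * M_visible / r)

# In Keplerian dynamics, quadrupling the radius should halve the speed.
r1, r2 = 1e20, 4e20      # two illustrative galactocentric radii, in meters
print(v_keplerian(r1) / v_keplerian(r2))  # prints 2.0

# Rubin instead measured roughly constant speeds out to the edge, which
# requires the enclosed mass M(r) to grow in proportion to r: a dark halo.
```
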

    Astronomer Vera Rubin at the Lowell Observatory in 1965. Credit: The Carnegie Institution for Science.

    Vera Rubin measuring spectra. Credit: Emilio Segrè Visual Archives, AIP, SPL.

    Vera Rubin with the Department of Terrestrial Magnetism (DTM) image-tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Dark Matter Research

    Inside the Axion Dark Matter eXperiment (ADMX) at the University of Washington (US). Credit: Mark Stone, University of Washington.

    We call it dark matter—not just because it is impossible to see, but because it’s figuratively opaque. We know very little about it even though it is ubiquitous. We know something is responsible for the clumping of galaxies and the movements of stars, but we have never observed it directly.

    Experiments around the world have been designed exclusively to search for this mysterious substance. But it’s not just direct searches that could point us in the right direction. For example, the following anomalies, found in non–dark-matter experiments, may help shed light on this otherwise-dark part of the universe.

    Muon g-2

    DOE’s Fermi National Accelerator Laboratory (US) Muon g-2 storage ring. As muons race around the ring, their spin axes twirl, reflecting the influence of unseen particles.

    The muon (the electron’s heavier cousin) acts strangely in a magnetic field. Earlier this year, DOE’s Fermi National Accelerator Laboratory (US) Muon g-2 collaboration announced a measurement of the amount a muon “wobbles” in a magnetic field, upholding a 2001 result. The problem is that both results seem to differ greatly from what the Standard Model predicts.

    Scientists know that the behavior of fundamental particles such as the muon is influenced by other subatomic characters. A muon can momentarily emit and reabsorb virtual versions of other fundamental particles, an action called a quantum fluctuation.

    To understand the wobble of the muon, theorists have made complicated calculations registering the effects of all of the possible fluctuations that could occur. If the experimental measurements and theorists’ predictions hold, it could be that the predictions are missing a particle.
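    The size of the disagreement can be checked directly from the published numbers: the combined 2021 experimental value of the muon's anomalous magnetic moment versus the 2020 Standard Model consensus prediction. (This is a rough back-of-the-envelope sketch using those published figures, not an official analysis.)

```python
import math

# Values in units of 1e-11, from the April 2021 Muon g-2 announcement
# (combined experimental value vs. the 2020 Standard Model prediction).
a_exp, err_exp = 116_592_061, 41   # experimental anomalous magnetic moment
a_sm,  err_sm  = 116_591_810, 43   # Standard Model prediction

diff = a_exp - a_sm                  # excess wobble, in units of 1e-11
sigma = math.hypot(err_exp, err_sm)  # uncertainties added in quadrature
print(diff, round(diff / sigma, 1))  # prints 251 4.2
```

    A 4.2-sigma tension is tantalizing but still short of the 5-sigma threshold physicists use to claim a discovery.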

    “It turns out, there are some beautiful ideas on how to rectify this problem that also very naturally solve the dark matter problem,” says Jonathan Feng, professor at The University of California-Irvine (US), who first drew this connection in a paper [Physical Review Letters] with Konstantin Matchev a few days after the original 2001 announcement.

    Specifically, Feng mentions the neutralino, a dark matter candidate predicted by supersymmetry. If theorists add the influence of virtual neutralinos into the mix, it could alter their prediction in a way that lines up with experimental results.

    Of course, there are also explanations that don’t have any direct connections to dark matter, and the anomaly could even disappear as the calculations get more precise.

    As the Muon g-2 collaboration continues to take measurements and theorists continue their own calculations, it’s possible that the theoretical and experimental values of g-2 will converge. However, many physicists suspect that this anomaly will remain, and if it does, it may have a natural solution through dark matter.

    The Hubble tension

    Our universe is expanding, but the rate of the universe’s expansion, called the Hubble constant (H0), is the subject of one of the biggest disputes in modern cosmology. That’s because the Hubble constant has been calculated using two different methods that yield irreconcilable results.

    This tension has existed for decades and persists even as the measurements get more precise. Although one method uses measurements from the “early” universe (shortly after the Big Bang) and the other uses measurements from the “late” universe (closer to present day), they should still arrive at the same value for H0. But with a 9% difference between results, there is either a major experimental error or something is off in our current understanding of the universe.
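    The roughly 9% gap is easy to reproduce with representative circa-2021 measurements. (The Planck CMB and SH0ES Cepheid/supernova values below are assumed for illustration; they are not quoted in this article.)

```python
import math

# Representative circa-2021 Hubble constant values, in km/s/Mpc (assumptions):
h0_early, err_early = 67.4, 0.5   # "early" universe (CMB, Planck)
h0_late,  err_late  = 73.2, 1.3   # "late" universe (Cepheids/supernovae, SH0ES)

pct_gap = 100 * (h0_late - h0_early) / h0_early
tension = (h0_late - h0_early) / math.hypot(err_early, err_late)
print(round(pct_gap), round(tension, 1))   # prints 9 4.2
```

    With these inputs, the two methods disagree by about 9%, a roughly 4-sigma tension that shrinking error bars have failed to erase.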

    Could dark matter account for the discrepancy? Sophia Gad-Nasr, a PhD candidate in cosmology at The University of California-Irvine (US), says it is possible, but only if dark matter decays. “The way that the expansion works is that the amount of matter that we have will… kind of pull against the stretch of dark energy [the mysterious force driving the expansion of the universe],” she says. “So, if you decrease that [amount of matter] by decaying the dark matter, then eventually there will be less matter to combat the pull of dark energy.

    Dark Energy Survey

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    So that would yield a larger number [for H0], which we are actually seeing right now.”

    But, Gad-Nasr points out, if dark matter decays, that has many more implications in other parts of the universe, and things start to get complicated. “We want the simplest explanation that we can have, and I don’t think we’ve come up with one yet,” she says.

    Gad-Nasr says she suspects that the discrepancy might lie in our poor understanding of dark energy instead.

    The KOTO excess

    One of the most recent curious results that may be linked to dark matter came in 2019 from the KOTO experiment at the Japan Proton Accelerator Research Complex (J-PARC), which studies a very rare decay of a subatomic particle called a kaon. The decay is so rare in predictions based on the Standard Model that the collaboration didn’t expect to see it at all when they began taking data. Surprisingly, they observed four potential occurrences of the decay.

    In March 2020, researchers proposed three explanations for the seeming excess [Physical Review Letters], including one suggesting it could be due to a new metastable particle. To avoid violating any other measurements, this new particle would need to decay after traveling about 1 meter, a length scale characteristic of particles that frequently occur in dark matter models.

    However, the excess might not hold. The result was presented preliminarily at a workshop in 2019, and further analyses showed that it was not statistically significant.

    “We are not ready yet to put all our hopes in this anomaly,” says Felix Kling, a postdoc at DOE’s SLAC National Accelerator Laboratory (US), who is working on a dark matter experiment called FASER that will test the KOTO anomaly.

    Still, he says, researchers will need to collect more data before they can make a ruling either way.

    “If the anomaly is really there, then what we expect to see is that the longer we run, the more signal we see,” he says. “In my own opinion, we should simply keep an eye on future updates from the KOTO collaboration and see what happens.”

    The proton radius puzzle

    In 2010, the Charge Radius Experiment with Muonic Atoms (CREMA) made an incredibly precise measurement of the proton’s radius by shooting a laser beam at (as its name suggests) hydrogen atoms made with muons instead of electrons.

    Strangely, the measurement was nearly 4% smaller than the then-official value set in 2006 by the Committee on Data of the International Science Council (CODATA), derived from multiple spectroscopy experiments that used ordinary hydrogen. The discrepancy persisted even as CODATA updated their value every four years. By 2017, additional experimental measurements were also showing support for CREMA’s smaller proton radius.
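    The roughly 4% gap follows directly from the two published radii: the CODATA-2006 value from ordinary hydrogen and the 2010 CREMA muonic-hydrogen measurement. (An illustrative back-of-the-envelope calculation using those standard published figures.)

```python
import math

# Proton charge radius in femtometers, with quoted uncertainties:
r_codata, err_codata = 0.8768, 0.0069    # CODATA-2006, ordinary hydrogen
r_muonic, err_muonic = 0.84184, 0.00067  # CREMA 2010, muonic hydrogen

pct_smaller = 100 * (r_codata - r_muonic) / r_codata
tension = (r_codata - r_muonic) / math.hypot(err_codata, err_muonic)
print(round(pct_smaller, 1), round(tension))   # prints 4.0 5
```

    The muonic measurement is about 4% smaller, and because it is roughly ten times more precise, the disagreement works out to about 5 standard deviations.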

    The unexpected result caused excitement among theorists. Why did the different experimental methods produce such disparate values? Some wondered if “new physics”—like dark matter—was causing a difference between the ways that electrons and muons behave, resulting in the discrepancy between radius measurements with ordinary and muonic hydrogen. Perhaps, as some theorists have suggested, there is a new dark-matter particle that interacts with muons and not electrons, which could solve both the proton radius puzzle and the Muon g-2 anomaly.

    It’s a possibility, though it might not be necessary anymore: In 2019, a spectroscopy experiment using ordinary hydrogen measured a smaller proton radius that agreed with the muonic measurement, suggesting that the puzzle is solved. Still, other spectroscopy experiments continue to yield the larger radius, leading some physicists to believe it’s not over yet.

    More accurate experiments are likely needed to solve the proton radius puzzle definitively.

    How will we know?

    Ultimately, only time—and more precise measurements and predictions—will tell whether these anomalies stick around.

    “We need more developments both on the theory and experimental grounds,” says JiJi Fan, a theoretical physicist and professor at Brown University. “We need other hints beyond these to determine whether these results have any connection to dark matter.”

    If there’s one thing physicists can agree on, it’s that the Standard Model cannot explain the longstanding evidence that there’s something about the way our universe is put together that we don’t understand.

    “We need to keep looking,” Kling says. “If we don’t look everywhere where we can, then we will not find anything.

    “Sometimes these anomalous results can give us the right hint. Many discoveries in physics or particle physics within the last 100 years kind of arise because there was some anomalous result that just suddenly appeared that no one expected but it was just there. Then we started to learn something new.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 11:29 am on August 31, 2021 Permalink | Reply
    Tags: "Photographing the HL-LHC" Photo Essay, Symmetry

    From Symmetry: “Photographing the HL-LHC” Photo Essay 

    Samuel Hertzog

    A CERN photographer and videographer writes about his experiences documenting the ongoing upgrade that will turn the Large Hadron Collider into the High-Luminosity LHC.

    “It’s August 2019, and I’m a photographer employed by CERN to create audiovisual content for CERN’s internal and external communication. Today a colleague and I are photographing the ongoing civil engineering for new passages, caverns and shafts that will enlarge CERN’s subterranean accelerator complex. When completed, they will house the powering, protection and cryogenic systems for the High-Luminosity LHC. These upgrades will increase the collision rate by a factor of five beyond the LHC’s design value and enable the experiments to search for new physics and phenomena that were previously out of reach.

    A security officer guides us, making sure we stay out of the way of the heavy machinery while he shows us his favorite spots. The lighting is dim, which makes navigating the rocky and uneven pathway even more treacherous.

    Photo by Maximilien Brice.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “Our mission is to collect photos and video footage that both convey the feel of the place and document the action. In just a short time, with limited recording gear and the addition of bulky gloves, boots, masks and protective glasses, we rush to set up our shots.

    Two things stand out: The scale of the place, and how rough an area it is. This, to a photographer, is a sign that it is time to break out the wide-angle lenses and get right up close to the workers. We want to create an immersive feeling for the viewer, a sense that they are right there with us taking in the entire scene.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “Before coming to CERN in winter 2019, I primarily focused on wildlife photography and filmmaking. Working at CERN is unlike anything I’ve done before. I often say shooting the CERN caverns is where a top photographer can really make their mark. You are faced with huge structures but very little room to maneuver. It’s dark, so you need to hold for long exposures. But there are also lots of people and machines moving at all times. To balance all these factors at once is a real test of your skills.

    Toward the end of 2019, the workers break through the wall and connect the new tunnel to the one that holds the LHC. Project leaders and the Director General of CERN hold a ceremony to commemorate the moment. The heads of CERN dress in work suits and descend the shaky metallic steps to pose for a photo and sit for a short interview under bright lights we set up for the occasion. It feels almost like being in a photo studio 100 meters underground.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “In May 2021—18 months after the subterranean photoshoot—we return to the HL-LHC tunnels. The crews have been working 24/7 to get the tunnel construction completed before the LHC restart in Spring 2022. We are told that dust is no longer the issue, but vertigo might be. The temporary elevator is being replaced, so our way down is essentially a large bucket suspended by a rope. No room for unsteady nerves on this site!

    Courtesy of Samuel Hertzog and Jules Ordan

    “When we reach the bottom, the tunnel is radically different. We find ourselves in a clean, white entrance hall, with our path illuminated at regular intervals by elegant blue lights.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “The challenge is now less technically extreme. Creatively, however, this is a whole new game. We still have the heavy machinery and workers in high-vis uniforms. But otherwise, the surroundings are pure science fiction. We respond with a change in style, paying attention to symmetry, proportions and structure to convey the modern, elegant environment.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “It is a photographer’s duty to be adaptable and quick to come up with new ideas when documenting, and CERN’s ever-changing environments certainly test those skills. Conditions and constraints ultimately bring out creativity. It is remarkable to me to look back and see not only the evolution of the location but also of my own perspective.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 11:17 am on August 26, 2021 Permalink | Reply
    Tags: "Teaching a particle detector new tricks", Symmetry

    From Symmetry: “Teaching a particle detector new tricks” 

    Sarah Charley

    Scientists hoping to find new, long-lived particles at the Large Hadron Collider recently realized they may already have the detector to do it.

    The CMS Detector at the European Organization for Nuclear Research (CERN).

    Physicist Cristián Peña grew up in Talca, a small town a few hours south of Santiago, Chile. “The Andes run all the way through the country,” he says. “No matter where you look, you always have the mountains.”

    At the age of 13, he first aspired to climb them.

    Over the years, as his mountaineering skills grew, so did his inventory of tools. Ice axes, crampons and ropes expanded his horizons.

    In Peña’s work as a scientist at the DOE’s Fermi National Accelerator Laboratory (US), he applies this same mindset: He creates the tools his experiment needs to explore new terrain.

    “Detector work is key,” he says.

    Peña’s current focus is the CMS detector, one of two large, general-purpose detectors at the Large Hadron Collider. Peña and colleagues want to use CMS to search for a class of theoretical particles with long lifetimes.

    While working through the problem, they realized that an ideal long-lived particle detector is already installed inside CMS: the CMS muon system. The question was whether they could hack it to do something new.

    Courtesy of CMS Collaboration.

    Long-lived particles

    When scientists designed the CMS detector in the 1990s, they had the most popular and mathematically resilient models of particle physics in mind. As far as they knew, the most interesting particles would live just a fraction of a fraction of a second before transforming into well understood secondary particles, such as photons and electrons. CMS would catch signals from those secondary particles and use them as a trail back to the original.

    The prompt-decay assumption worked in the search for Higgs bosons. But scientists are now realizing that this “live fast, die young” model might not apply to every interesting thing that comes out of a collision at the LHC. Peña says he sees this as a sign that it’s time for the experiment to evolve.

    “If you’re a little kid and you walk a mile in the forest, it’s all completely new,” he says. “Now we have more experience and want to push new frontiers.”

    For CMS scientists, that means finding better ways to look for particles with long lifetimes.

    Long-lived particles are not a radical new concept. Neutrons, for example, live for about 14 minutes outside the confines of an atomic nucleus. And protons are so long-lived that scientists aren’t sure whether they decay at all. If undiscovered particles are moving into the detector before becoming visible, they could be hiding in plain sight.

    “Previously, we hadn’t really thought to look for long-lived particles,” says Christina Wang, a graduate student at The California Institute of Technology (US) working on the CMS experiment. “Now, we have to find new ways to use the CMS detector to see them.”

    A new idea

    Peña was thinking about long-lived particles while attending a conference in Aspen, Colorado, in March 2019.

    “There were a bunch of whiteboards, and we were throwing around ideas,” he says. “In that type of situation, you go with the vibe. There’s a lot of creativity and you start thinking outside the box.”

    Peña and his colleagues visualized what an ideal long-lived particle detector might look like. They would need a detector that was far from the collision point. And they would need shielding to filter out the secondary particles that are the stars of the show in traditional searches.

    “When you look at the CMS muon system,” Peña says, “that’s exactly what it is.”

    Muons, often called the heavier cousins of electrons, are produced during the high-energy collisions inside the LHC. A muon can travel long distances, which is why CMS and its sister experiment, ATLAS, have massive detectors in their outer layers solely dedicated to capturing and recording muon tracks.

    Peña ran a quick simulation to see if the CMS muon system would be sensitive to the firework-like signatures of long-lived particles. “It was quick and dirty,” he says, “but it looked feasible.”

    After the conference, Peña returned to his regular activities. A few months later, Caltech rising sophomore Nathan Suri joined Professor Maria Spiropulu’s lab as a summer student, working with Wang. Peña, who was also collaborating with Spiropulu’s research group, assigned Suri the muon detector idea as his summer project.

    “I was always encouraged to give ideas to young, talented people and let them run with it,” Peña says.

    Suri was excited to take on the challenge. “I was in love with the originality of the project,” he says. “I was eager to sink my teeth into it.”

    Testing the concept

    Suri started by scanning event displays of simulated long-lived particle decays to look for any shared visual patterns. He then explored the original technical design report for the CMS muon detector system to see just how sensitive it could be to these patterns.

    “Looking at the unique detector design and highly sensitive elements, I was able to realize what a powerful tool it was,” he says.

    By the end of the summer, Suri’s work had shown that not only was it feasible to use the muon system to detect long-lived particles, but that CMS scientists could use pre-existing LHC data to get a jump start on the search.

    “At this point, the floodgates opened,” Suri says.

    In fall 2019, Wang took the lead on the project. Suri had shown that the idea was possible; Wang wanted to know if it was realistic.

    So far, they had been working with processed data from the muon system, which was not adapted to the kind of search they wanted to do. “All the reconstruction techniques used in the muon system are optimized to detect muons,” Wang says.

    Wang, Peña and Caltech Professor Si Xie set up a Zoom meeting with muon system experts to ask for advice.

    “They were really surprised that we wanted to use the muon system to infer long-lived particles,” Wang says. “They were like, ‘It’s not designed to do that.’ They thought it was a weird idea.”

    The experts suggested the team should try looking at the raw data instead.

    Doing so would require extracting unprocessed information from tapes and then developing new software and simulations that could reinterpret thousands of raw detector hits. The task would be arduous, if not impossible.

    After the muon system experts left the call, Wang remembers, “we were still in the Zoom room and like, ‘Do we want to continue this?’”

    She says it was not a serious question. Of course they did.

    A trigger of their own

    In fall 2020, Martin Kwok started a postdoctoral position at Fermilab. “We’re encouraged to talk to as many groups as we can and think about what we want to work on most,” he says.

    He met with Fermilab researcher Artur Apresyan, who told him about the collaboration with Caltech to convert the CMS muon system into a long-lived particle detector. “It was immediately attractive,” Kwok says. “It’s not very often that we get to explore new uses for our detector.”

    Wang and her colleagues had forged ahead with the idea, extracting, processing, and analyzing raw data recorded by the CMS muon system between 2016 and 2018.

    It had worked, but the dataset they had available to study was not ideal.

    The LHC generates around a billion collisions every second—much more than scientists can record and process. So scientists use filters called triggers to quickly evaluate and sort fresh collision data.

    For every billion collisions, only about 1000 are deemed “interesting” by the triggers and saved for further analysis. Wang and her colleagues had determined the filters closest to what they were looking for were the ones programmed to look for signs of dark matter.

    Apresyan pitched to Kwok that he could design a new trigger, one actually meant to look for signs of long-lived particles. They could install it in the CMS muon system before the LHC restarts operation in spring 2022.

    With a dedicated trigger, they could increase the number of events deemed “interesting” for long-lived particle searches by up to a factor of 30. “It’s not often that we see a 30-times increase in our ability to capture potential signal events,” Kwok says.
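    The figures quoted above imply a strikingly small acceptance, and show what a factor-of-30 gain means in practice. (A back-of-the-envelope sketch using only the numbers in this article.)

```python
# Figures from the article: the LHC produces ~1 billion collisions per
# second, of which the triggers keep only about 1000 for further analysis.
collisions_per_second = 1_000_000_000
kept_per_second = 1_000

acceptance = kept_per_second / collisions_per_second
print(acceptance)   # prints 1e-06, i.e. one collision in a million survives

# A dedicated long-lived-particle trigger could capture up to 30 times more
# events relevant to the search (upper estimate quoted in the article).
dedicated_gain = 30
print(kept_per_second * dedicated_gain)   # prints 30000
```
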

    Kwok was up for the challenge. And it was a challenge.

    “The price of doing something different—of doing something innovative—is that you have to invent your own tools,” Kwok says.

    The CMS collaboration consists of thousands of scientists all using collective research tools that they developed and honed over the last two decades. “It’s a bit like building with Legos,” Kwok says. “All the pieces are there, and depending on how you use and combine them, you can make almost anything.”

    But developing this specialized trigger was less like picking the right Legos and more like creating a new Lego piece out of melted plastic.

    Kwok dug into the experiment’s archives in search of his raw materials. He found an old piece of software that had been developed by CMS but rarely used. “This left-over tool that faded out of popularity turned out to be very handy,” he says.

    Kwok and his collaborators also had to investigate if integrating a new trigger into the muon system was even possible. “There’s only so much bandwidth in the electronics to send information upstream,” Kwok says.

    “I’m thankful that our collaboration ancestors designed the CMS muon system with a few unused bits. Otherwise, we would have had to reinvent the whole triggering scheme.”

    What started as a feasibility study has now evolved into an international effort, with many more institutions contributing to data analysis and trigger R&D. The US institutions contributing to this research are funded by the Department of Energy (US) and the National Science Foundation (US).

    “Because we don’t have dedicated long-lived particle triggers yet, we have a low efficiency,” Wang says. “But we showed that it’s possible—and not only possible, but we are overhauling the CMS trigger system to further improve the sensitivity.”

    The LHC is scheduled to continue into the 2030s, with several major accelerator and detector upgrades along the way. Wang says that to keep probing nature at its most fundamental level, scientists must remain at the frontier of detector technology and question every assumption.

    “Then new areas to explore will naturally follow,” she says. “Long-lived particles are just one of these new areas. We’re just getting started.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 8:30 pm on August 24, 2021 Permalink | Reply
    Tags: "Can light melt atoms into goo?", Brookhaven National Laboratory (US) Relativistic Heavy Ion Collider, Symmetry

    From Symmetry: “Can light melt atoms into goo?” 

    Sarah Charley

    Courtesy of Christopher Plumberg

    The ATLAS experiment at the European Organization for Nuclear Research (CERN) sees possible evidence of quark-gluon plasma production during collisions between photons and heavy nuclei inside the Large Hadron Collider.

    Photons—the massless particles also known as the quanta of light—are having a moment in physics research.

    Scientists at the Large Hadron Collider have recently studied how, imbued with enough energy, photons can bounce off of one another like massive particles do. Scientists at the LHC and at the DOE’s Brookhaven National Laboratory (US) Relativistic Heavy Ion Collider have also reported seeing photons colliding and converting that energy into massive particles.

    The photon’s most recent seemingly impossible feat? Smashing so hard into a lead nucleus that the collision seems to produce the same state of matter that existed moments after the Big Bang.

    Simulated quark-gluon plasma formation. Courtesy of Christopher Plumberg.

“I did not expect that photons could produce a quark-gluon plasma until I actually saw the results,” says theoretical nuclear physicist Jacquelyn Noronha-Hostler, an assistant professor at the University of Illinois Urbana-Champaign (US).

    Scientists at the LHC at CERN and at RHIC at DOE’s Brookhaven National Laboratory (US) have known for years they could produce small amounts of quark-gluon plasma in collisions between heavy ions. But this is the first time scientists have reported possible evidence of quark-gluon plasma in the aftermath of a collision between the nucleus of a heavy ion and a massless particle of light.

    The scenario seems unlikely. Unlikely, but not impossible, says ATLAS physicist Dennis Perepelitsa, who is an assistant professor at The University of Colorado-Boulder (US).

    “In quantum mechanics, everything that is not forbidden is compulsory,” Perepelitsa says. “If it can happen, it will happen. The question is just how often.”

    Collisions between photons and lead nuclei are common inside the LHC. Perepelitsa and his colleagues are the first to examine them to find out whether they ever produce a quark-gluon plasma. Their first round of results indicate the answer could be yes, an insight that might provide a new understanding of fluid dynamics.

    Scientists contributing to LHC research from US institutions are funded by the Department of Energy (US) and the National Science Foundation (US).

    The Large Light Collider

Perepelitsa and his colleagues on the ATLAS experiment went looking for collisions between photons and nuclei, called photonuclear collisions, in data collected during the lead-ion runs at the LHC. These runs take place during the few weeks just before the LHC’s winter shutdown in each year of the collider’s operation.

    Lead nuclei are made up of protons and neutrons, which are made up of even smaller fundamental particles called quarks. “You can think of the nucleus like a bag of quarks,” Noronha-Hostler says.

    This bag of quarks is held together by gluons, which “glue” small groups of quarks into composite particles called hadrons.

    When two lead nuclei collide at high energy inside the LHC, the gluons can lose their grip, causing the protons and neutrons to melt and merge into a quark-gluon plasma. The now-free quarks and gluons pull on each other, holding together as the plasma expands and cools.

    Eventually, the quarks cool enough to reform into distinct hadrons. Scientists can reconstruct the production, size and shape of the original quark-gluon plasma based on the number, identities and paths of hadrons that escape into their detectors.

During the lead-ion runs at the LHC, nuclei aren’t the only things colliding. Because they have a positive charge, lead nuclei carry strong electromagnetic fields that grow in intensity as they accelerate. Their electromagnetic fields spit out high-energy photons, which can also collide—a fairly common occurrence. “There’s a lot of photons, and the nucleus is big,” Perepelitsa says.

    Despite their frequency, no one had ever closely examined the detailed patterns of these kinds of photonuclear collisions at the LHC. For this reason, ATLAS scientists had to develop a specialized trigger that could pick out the photon-zapped lead ions from everything else.

    According to Blair Seidlitz, a graduate student at CU Boulder, this was tricky. “People have a lot more experience triggering on lead-lead collisions,” he says.

    Luckily, photonuclear collisions have a special asymmetrical shape due to the momentum differences between the tiny photon and the massive lead ion: “It’s like a truck hitting a trash can,” Seidlitz says. “All the debris from the collision will move in the direction of the truck.”

    Seidlitz designed a trigger that looked for collisions that generated a small number of particles, had a skewed shape, and saw remnants of the partially obliterated lead ion embedded in special detectors 140 meters away from the collision point.

    After collecting and analyzing the data, Seidlitz, Perepelitsa and their colleagues saw a particle-flow signature characteristic of a quark-gluon plasma.

    The finding alone is not enough to prove the formation of a quark-gluon plasma, but it’s a first clue. “There are always potential competing explanations, and we need to look for other signatures of quark-gluon plasma that could be there,” Perepelitsa says, “but we haven’t measured them yet.”

    If the photonuclear collisions are indeed creating quark-gluon plasma, it could be a kind of quantum trick, Perepelitsa says.

    Perepelitsa and his colleagues are dubious that a massless photon could pack a powerful enough punch to melt part of a lead nucleus, which contains 82 protons and 126 neutrons. “It would be like throwing a needle into a bowling ball,” he says.

    Instead, he thinks that just before impact, these photons are undergoing a transformation originally predicted by Nobel Laureate Paul Dirac.

    A quantum transformation

    In 1931, Dirac published a paper predicting a new type of particle. The particle would share the mass of the electron but have the opposite charge [positron]. Also, he predicted, “if it collides with an electron, the two will have a chance of annihilating one another.”

It was the positron, the first predicted particle of antimatter. In 1932, California Institute of Technology (US) physicist Carl Anderson discovered the particle, and physicists later spotted the annihilation process Dirac had predicted as well.

    When matter and antimatter meet, the two particles are destroyed, releasing their energy in the form of a pair of photons.

    Scientists also see this process happening in reverse, Noronha-Hostler says. “Two photons can interact and create a quark-antiquark pair.”

    Before annihilating, that quark-antiquark pair can bind together to make a hadron.
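As a rough energy-balance illustration (added here; the numbers are standard textbook values, not from the article): when an electron and positron annihilate at rest, each photon carries the electron's rest energy, and the reverse process only has enough invariant mass to create a particle-antiparticle pair if the photon energies are large enough.

```latex
% Annihilation at rest: each photon carries the electron rest energy
e^{+} + e^{-} \rightarrow \gamma + \gamma , \qquad
E_{\gamma} = m_{e} c^{2} \approx 0.511\ \text{MeV}

% Reverse (pair-creation) threshold for two head-on photons of
% energies E_1 and E_2 producing a pair of particles of mass m:
E_{1} E_{2} \geq \left( m c^{2} \right)^{2}
```

This threshold is why the high-energy photons radiated by accelerated lead nuclei, unlike ordinary light, can convert into massive quark-antiquark pairs.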

    Perepelitsa and his colleagues suspect that the collisions they’ve observed, in which photons appear to be colliding with lead nuclei and creating a small amount of quark-gluon plasma, are not actually collisions between nuclei and photons. Instead, they’re collisions between nuclei and those tiny, ephemeral hadrons.

    This makes more sense, Perepelitsa says, as hadrons are bigger in size than photons and are capable of more substantial interactions. “It’s no longer a needle going into a bowling ball, but more like a bullet.”

    The smallest drop

    For now, the exact mechanism that may be causing this quark-gluon plasma signature within photonuclear collisions remains a mystery. Whatever is going on, Noronha-Hostler says figuring out these collisions could be an important step in quark-gluon plasma research.

    LHC scientists’ usual method of studying the quark-gluon plasma has been to examine crashes between lead nuclei, which create a complex soup of quarks and gluons. “We thought originally that the only way we could produce a quark gluon plasma was two massive nuclei hitting each other,” she says. “And then experimentalists started playing around and running smaller things, like protons. With photonuclear collisions, that’s even smaller.”

    If photonuclear collisions are creating quark-gluon plasma, it’s in the form of a tiny droplet composed of a few vaporized protons and neutrons.

    Scientists are hoping to study these droplets to learn more about how liquids behave on subatomic scales.

    “We’re pushing to the most extremes in fluid dynamics,” Noronha-Hostler says. “Not only do we have something that is moving at the speed of light and at the highest temperatures known to humanity, but it looks like we are going to be able to answer ‘What is the smallest droplet of a liquid?’ No other field can do that.”


  • richardmitnick 4:18 pm on August 17, 2021 Permalink | Reply
Tags: "The search for the sterile neutrino", Clyde Cowan's and Fred Reines' detector eventually picked up enough signals to confirm the existence of neutrinos., Enrico Fermi figured out a complete theory of nuclear beta decay incorporating the new particle which he christened the “neutrino.”, In the 1950s physicists Clyde Cowan and Fred Reines at the DOE's Los Alamos National Laboratory (US) designed a detector to catch neutrinos., Nuclear beta decay, Pauli bought Baade a case of champagne to celebrate the discovery., Physicist Walter Baade made a bet with Pauli that his undiscoverable particle would one day be found., Symmetry, Wolfgang Pauli posited that an atom undergoing beta decay actually emitted more than one particle; it was just that the second particle had neither charge nor mass.

    From Symmetry: “The search for the sterile neutrino” 

    Symmetry Mag

    From Symmetry

    Mary Magnuson

    Back when it was theorized, scientists weren’t sure they would ever detect the neutrino; now they’re searching for a version of the particle that could be even more elusive.

    Neutrinos. Credit: J-PARC T2K Neutrino Experiment.

    In Germany in 1930, a group of scientists held a conference on nuclear physics, and they invited Wolfgang Pauli. The Austrian physicist was known as the originator of the Pauli exclusion principle, work that furthered scientists’ understanding of matter and would eventually earn Pauli a Nobel Prize in Physics.

    Pauli couldn’t attend the German conference; he had a conflict in Zürich. Instead, he sent the attendees a letter that would turn out to be one of the more significant correspondences in physics history. In it, he predicted the existence of what would eventually be known as the neutrino.

    Scientists have since discovered and studied the properties of the theoretical particle. But big questions remain, including whether an undiscovered type of neutrino could be hiding from researchers’ detectors.

    An undetectable particle

    In his letter to the conference attendees, Pauli detailed ideas he’d had about beta decay, a process that had been troubling the nuclear physicists.

    In beta decay, an unstable atom releases energy in the form of a particle (called a beta particle). Scientists studying beta decay found that the energy of the beta particle was not enough to account for the total energy the decaying atom lost.

    Pauli had an idea about where the missing energy could be. He posited that an atom undergoing beta decay actually emitted more than one particle; it was just that the second particle had neither charge nor mass and was therefore undetectable by the technology of the day.
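In modern notation (an illustration added here, not part of the original article), the process in question is neutron beta decay, a three-body decay whose released energy is shared among the outgoing particles:

```latex
n \rightarrow p + e^{-} + \bar{\nu}_{e} , \qquad
Q = \left( m_{n} - m_{p} - m_{e} \right) c^{2} \approx 0.782\ \text{MeV}
```

Because the antineutrino carries off a variable share of the energy Q, the electron emerges with a continuous spectrum of energies rather than the single fixed energy a two-body decay would give; that continuous spectrum was exactly the deficit that troubled the nuclear physicists.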

    That was the problem, though. If the particle were undetectable, there would be no way to test whether Pauli’s theory was correct. Pauli lamented that proposing the existence of an undetectable particle was “something no theorist should ever do” and kept the letter informal rather than write an official paper about an idea he was too uncomfortable to fully claim.

    But the idea of an energy-carrying “ghost particle” resonated with many researchers, including Enrico Fermi. A few years later, Fermi figured out a complete theory of nuclear beta decay incorporating the new particle, which he christened the “neutrino.” Fermi theorized that the neutrino interacted through an unknown force, now known as the weak force, which interacts only at extremely short range.

    Not all scientists were as pessimistic as Pauli about detecting the neutrino. Physicist Walter Baade even made a bet with Pauli that his undiscoverable particle would one day be found.

    In the 1950s physicists Clyde Cowan and Fred Reines at the DOE’s Los Alamos National Laboratory (US) designed a detector to catch neutrinos. It would detect the particles passing through and occasionally interacting via the weak force. To ensure they’d nab a few, they planned to set up their device near the most extreme collection of unstable atoms undergoing beta decay that they could create: a nuclear blast.

    Aside from the technical challenges it would take to study a nuclear explosion, it turned out that a nuclear bomb would also produce a lot of background radiation that would make it difficult to isolate the signals from the neutrinos. So Cowan and Reines changed their plans. Instead, they set up their detector in South Carolina next to a nuclear reactor.

    While the reactor produced neutrinos much more slowly than a bomb, the detector eventually picked up enough signals to confirm the existence of neutrinos. Pauli bought Baade a case of champagne to celebrate the discovery.

    More than meets the eye

    Scientists had detected the undetectable particle. But there was a lot left to learn about it.

    In the 1960s, astrophysicists Raymond Davis and John Bahcall measured neutrinos coming from the sun with an experiment installed in Homestake Gold Mine in South Dakota. They detected only a third as many of the particles as they expected.

    In the ’90s, researchers at the Sudbury Neutrino Observatory in Canada and the Super-K experiment in Japan determined the cause of the missing neutrinos.

The neutrino could “oscillate,” or shift between the three different types or “flavors”: electron neutrinos, muon neutrinos and tau neutrinos. Oscillation implies mass, so the discovery also showed that neutrinos are not massless, as scientists had previously thought.
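In the standard two-flavor approximation (a textbook formula, not quoted in the article), the probability that a neutrino born as a muon neutrino is detected as an electron neutrino after traveling a distance L with energy E is:

```latex
P(\nu_{\mu} \rightarrow \nu_{e}) =
\sin^{2}(2\theta)\,
\sin^{2}\!\left( 1.27\, \frac{\Delta m^{2}\,[\mathrm{eV}^{2}]\; L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]} \right)
```

Here Δm² is the difference of the squared masses of two neutrino mass states. If all neutrinos were massless, Δm² would be zero and the probability would vanish everywhere, which is why observing oscillation implies that neutrinos have mass.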

    Scientists had worked out theories about how neutrinos should oscillate, but those theories were put to the test in an experiment at Los Alamos called the Liquid Scintillator Neutrino Detector.

    LSND studied a beam of neutrinos—specifically, muon antineutrinos—to see how many of them oscillated to a different type over a short distance. Its results indicated more of them than anticipated had transformed into electron antineutrinos.

    Scientists wondered: Could this elevated number of oscillations point to the influence of an even more elusive ghost particle? One that did not even interact through the weak force? Was it time to bet another case of champagne?

    In 2002, a similar experiment at DOE’s Fermi National Accelerator Laboratory (US), named after Enrico Fermi, followed up. The MiniBooNE experiment operated at a different energy level and used a different experimental methodology than LSND; it recorded an excess in electron neutrinos as well.

    The results could possibly be accounted for if neutrinos were oscillating in strange ways—say, to more than three flavors. Because it would interact even less strongly with matter, scientists called the hypothetical missing neutrino flavor a “sterile” neutrino.

    There doesn’t have to be just one type of sterile neutrino, says Harvard University (US) physicist Carlos Argüelles-Delgado. In the Standard Model of physics, for example, many particles and phenomena come in sets of three; maybe the sterile neutrinos do, too.

    The incredible number of neutrino experiments.

    Conflicting anomalies

Argüelles-Delgado works on the IceCube Neutrino Observatory (US), an experiment run by the University of Wisconsin–Madison. Based in Antarctica, IceCube looks at neutrinos emitted by the sun and other astronomical phenomena, such as supernovae.

    Although sterile neutrinos aren’t its focus, IceCube is one of several experiments offering input into the current search. So far IceCube has mostly just tightened constraints around what sterile neutrinos could be, Argüelles-Delgado says.

    “IceCube has not found any conclusive evidence of oscillations that are compatible with MiniBooNE,” he says. “And we have found no conclusive evidence for a sterile neutrino. However, we have found hints of sterile neutrinos… We have found something that hints towards the right direction.”

Another experiment that has provided valuable information to the search is MINOS at DOE’s Fermi National Accelerator Laboratory (US), which from 2005 to 2012 studied a beam of neutrinos produced at Fermilab, sampling the particles both close to the origin of the beam and far from it, in a mine in Minnesota. The experiment did not find anything that might suggest the existence of sterile neutrinos.

    With LSND and MiniBooNE seeing different results than IceCube and MINOS, Fermilab particle physicist Pedro Machado says it’s nearly impossible to find cohesive evidence for sterile neutrinos.

    Also contributing to the conversation are a group of experiments that harken back to the first detection of neutrinos: reactor experiments such as Daya Bay in China, Double Chooz in France and RENO in Korea.

    To date, they haven’t found any reliable evidence to back up MiniBooNE or LSND, says Virginia Tech physicist Patrick Huber. But they have been involved in their own conflict between theory and experiment. In 2011, a group of theorists recalculated the expected number of electron antineutrinos that reactor experiments should have seen. They found that their prediction did not match the experimental measurement.

    Since the discovery of the possible anomaly, researchers at Daya Bay, as well as theorists and other experimentalists, have continued working on their models and studying the decay processes that produce these antineutrinos.

    Experiments on the horizon

    All of these different experiments offer input into the question of sterile neutrinos, but they offer it at different angles—using different methods and examining different sources of neutrinos. In the near future, scientists may get answers from experiments that try to match the perspectives of the original instigators of the sterile neutrino debate.

    At the Japan Proton Accelerator Research Complex, an experiment called JSNS^2 plans to check the LSND observation. “We aim to confirm or defeat the existence of a sterile neutrino with the same experiment as LSND,” says Takasumi Maruyama, a J-PARC researcher on JSNS^2. “There are lots of experiments, but we have to understand what’s going on with the same neutrino source and same neutrino interaction.

    “So 20 years after the LSND results, I think it’s a nice time to follow up on the LSND experiment.”

    At Fermilab, an experiment called MicroBooNE aims to reproduce MiniBooNE’s measurements in more detail. Researchers expect initial results later this year.

    MicroBooNE also is a part of a larger experiment involving a series of three detectors known as the Short Baseline Neutrino Program at Fermilab.

    The three detectors will paint a detailed picture of neutrino behavior by examining oscillations at three different distances from the source of a neutrino beam. The ICARUS detector, which is the farthest from the source, will start collecting physics data this fall. Construction of the Short Baseline Neutrino Detector, which is the closest to the source, is underway.
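The logic of sampling one beam at several distances can be sketched numerically with the standard two-flavor oscillation formula. In the sketch below the detector baselines and beam energy are approximate public figures for the Short-Baseline Neutrino Program, and the mixing parameters are purely illustrative assumptions, not measured values:

```python
import math

def p_appear(L_km, E_GeV, sin2_2theta, dm2_eV2):
    """Two-flavor nu_mu -> nu_e appearance probability.

    The factor 1.27 absorbs the unit conversions for L in km,
    E in GeV, and the mass-squared splitting dm2 in eV^2.
    """
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# Hypothetical sterile-neutrino-like parameters (illustrative only):
SIN2_2THETA = 0.003
DM2 = 1.0  # eV^2

# Approximate baselines: SBND ~110 m, MicroBooNE ~470 m, ICARUS ~600 m,
# with a beam energy of roughly 0.7 GeV.
for name, L in [("SBND", 0.110), ("MicroBooNE", 0.470), ("ICARUS", 0.600)]:
    prob = p_appear(L, 0.7, SIN2_2THETA, DM2)
    print(f"{name:11s} L = {L:.3f} km  P(nu_mu -> nu_e) = {prob:.2e}")
```

Comparing the rates measured at the three distances maps out the dependence on L, which is what distinguishes a genuine oscillation from a flat background or a mismodeled beam.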

    Huber says he imagines sterile neutrinos will remain hidden, possibly until even more precise, future detectors can be designed and built. It could be that they will never be found, either because they don’t exist or because Pauli’s predicted particle has an undetectable side to it after all.

    Argüelles-Delgado says that, whether or not upcoming experiments are able to find sterile neutrinos, science will benefit from the search.

    “In particle physics when there are hints, you have to pursue those hints,” he says, “because some hints will end up being challenges that just enable you to improve your detector technology and techniques—and other hints will do that and also let you discover new physics. So you always win.”


  • richardmitnick 10:52 am on July 28, 2021 Permalink | Reply
Tags: "Changing a name without forfeiting credit", A scientist who changes their name will now be able to contact a single representative at their institution who will pursue the rest of the process on their behalf., Now that digital publication is the norm publishers have started to create new policies., Symmetry, Transgender scientists want their name changes to be recognized.

    From Symmetry and DOE’s Lawrence Berkeley National Laboratory (US): “Changing a name without forfeiting credit” 

    Symmetry Mag

    From Symmetry


    DOE’s Lawrence Berkeley National Laboratory (US)

    Mary Magnuson

    A group of US national laboratories, publishers, journals and other organizations is making it easier for researchers to update their names on past publications.

    Six months: That’s how long it took materials scientist Amalie Trewartha to reclaim her work.

    When she finally finished, she felt a burden lift. For the first time in years, she could fill out a CV without facing a choice: Should she abandon the portion of her research published under her old name? Or claim it and risk facing discrimination for being transgender?

    When the last of the journals she contacted updated its records, “I felt like I had ownership of my work in a way that I didn’t before,” Trewartha says. “To go back and see the papers that previously reminded me of a less pleasant part of my life changed was really nice. I felt like I was able to be proud of my work in a way that I wasn’t previously.”

    People change their names for a variety of reasons. Trewartha changed her name to reflect her gender. Other people change their names after marriage or divorce, or for religious or cultural reasons.

For a long time, publishers met requests to update names on previously published papers with a firm “no.” When publishing exclusively meant printing and physically distributing academic journals, such revisions were not possible. But now that digital publication is the norm, publishers have started to create new policies.

    The challenge for researchers like Trewartha is that scientists don’t just publish in a single journal. When each publisher has its own policy, its own representatives to be contacted, and its own bureaucratic hoops to jump through, it’s no wonder an already-busy researcher took half a year to get through it all.

    Enter Joerg Heber, Research Integrity Officer at DOE’s Lawrence Berkeley National Lab (US), and Lady Idos, the lab’s Chief Equity, Diversity and Inclusion Officer. Earlier this year, they joined a push already decades in the making to make changing a name on a past paper not just possible—but simple.

    Today they, along with partners from all 17 US Department of Energy national laboratories, publishers, journals and other organizations, are announcing an initiative to streamline and centralize the name-change process.

Instead of having to contact multiple journals to update past publications, a scientist who changes their name will be able to contact a single representative at their institution, who will pursue the rest of the process on their behalf.

    Participants in the initiative include major publishers such as the American Chemical Society (US), the American Physical Society (US), the American Society for Microbiology (US), arXiv.org e-Print archive (US), eLife, Elsevier (NL), protocols.io, Scopus [Elsevier], Springer Nature, and Wiley (US). Heber and Trewartha consulted with members of the Name Change Policy Working Group, formed at the University of California-Irvine (US).

    Participation of the national laboratories is driven by the individual institutions, not a DOE or federal directive.

    Deputizing the laboratories to handle name changes simplifies the identity-verification process, Heber says. When considering a name change for an individual, publishers want to make sure the request is coming from the right person. It helps to have an institution vouch for the scientist. Plus, a streamlined process reduces the number of possibly stressful, personal conversations a scientist must go through to claim work that is rightfully theirs.

    Researchers are not required to seek a name change through their institutions; they can still go about it individually, if they’d prefer.

    Miriam Blake, science & technology publications manager at DOE’s Pacific Northwest National Lab (US), is a member of the initiative. She and around 30 people in various departments across the different national labs have met virtually once a month to collaborate. She says it has been an opportunity to evaluate what structures exist to aid transgender researchers in updating their names for published works and figuring out how the labs can support them further.

    “The publishers have been so accommodating, and the labs have been eager to get involved,” Blake says. “We have a long way to go, but every step that we take to make this kind of work visible and make it easier for folks going through any transition in their life that previously was difficult is good for society.”

    Trewartha contributed to the effort by offering her name-change journey as a test case. She worked with Heber and Idos throughout those six months of name-change requests to examine what different policies looked like at different publishers and to identify the hitches in the process.

She also took the time to offer her feedback to the publishers. Trewartha says that, while this communication took time, she took it on in the hope that the researchers who come after her will have a better experience.

    Heber says he hopes more publishing partners are on the way. He says it’s already been rewarding to see how many groups have wanted to get involved. “It’s great seeing how so many publishers and other institutions are all constructively working on this,” he says. “We always run into open doors, because everybody wants to improve the situation.”

While the initiative marks a big step, Trewartha says, it’s not a complete solution. For one, papers don’t represent a researcher’s entire body of scholarly work. Trewartha says she still has conference proceedings, old data sets, talk recordings and more that haven’t been updated. There’s currently no easy way to update every paper that cites a researcher’s past work under an old name. And aggregation services like Google Scholar don’t re-index frequently, so a researcher’s information might take a while to change.

    “An entire ecosystem of places feed off this data,” Trewartha says. “The publishers have been good about changing the names on papers, but there’s just so much downstream stuff that needs to be changed. It’s difficult to round up every possible instance.”

    Even if it’s impossible to get every single instance of a name changed, Trewartha says, the progress this group has made thus far represents an important culture shift in the academic community.

    “The whole system of publication is built with an assumption that people will have one name” that never changes, Trewartha says. “It’s been obvious for many years that this doesn’t work for everyone.

    “This is a recognition that the system of publication and the system of academia has to work for everyone, and it has to be inclusive.”


    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) (US) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (US) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley (US) physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.



    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.

Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team put together during this period included two other young scientists who went on to establish major laboratories: J. Robert Oppenheimer founded DOE’s Los Alamos National Laboratory (US), and Robert Wilson founded DOE’s Fermi National Accelerator Laboratory (US).


    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.


After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy (US). The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory (US)) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1,700 memos are available online, hosted by the Laboratory.

    The lab remains owned by the Department of Energy (US) and managed by the University of California (US). Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity; conduct basic research for a secure energy future; understand living systems to improve the environment, health, and energy supply; understand matter and energy in the universe; build and safely operate leading scientific facilities for the nation; and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab tradition that continues today.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science (US):

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.


    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute (US) supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory (US), DOE’s Oak Ridge National Laboratory (US) (ORNL), DOE’s Pacific Northwest National Laboratory (US) (PNNL), and the HudsonAlpha Institute for Biotechnology (US). The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry (US) [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center (US) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center(US) at Lawrence Berkeley National Laboratory

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Sciences Network (US) (ESnet) is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (US) (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratories (US), the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science (US), and DOE’s Lawrence Livermore National Laboratory (US) (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology (US) and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory (US) leads JCESR and Berkeley Lab is a major partner.

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 12:13 pm on July 20, 2021 Permalink | Reply
    Tags: "A Video Tour of the Standard Model", Symmetry   

    From Quanta Magazine (US) via Symmetry: “A Video Tour of the Standard Model” 

    From Quanta Magazine


    Symmetry Mag


    July 16, 2021
    Kevin Hartnett

    Standard Model of Particle Physics. Credit: Quanta Magazine.

    The Standard Model: The Most Successful Scientific Theory Ever.
    Video: The Standard Model of particle physics is the most successful scientific theory of all time. In this explainer, Cambridge University physicist David Tong recreates the model, piece by piece, to provide some intuition for how the fundamental building blocks of our universe fit together.
    Emily Buder/Quanta Magazine.
    Kristina Armitage and Rui Braz for Quanta Magazine.

    Recently, Quanta has explored the collaboration between physics and mathematics on one of the most important ideas in science: quantum field theory. The basic objects of a quantum field theory are quantum fields, which spread across the universe and, through their fluctuations, give rise to the most fundamental phenomena in the physical world. We’ve emphasized the unfinished business in both physics and mathematics — the ways in which physicists still don’t fully understand a theory they wield so effectively, and the grand rewards that await mathematicians if they can provide a full description of what quantum field theory actually is.

    This incompleteness, however, does not mean the work has been unsatisfying so far.

    For our final entry in this “Math Meets QFT” series, we’re exploring the most prominent quantum field theory of them all: the Standard Model. As the University of Cambridge (UK) physicist David Tong puts it in the accompanying video, it’s “the most successful scientific theory of all time” despite being saddled with a “rubbish name.”

    The Standard Model describes physics in the three spatial dimensions and one time dimension of our universe. It captures the interplay between a dozen quantum fields representing fundamental particles and a handful of additional fields representing forces. The Standard Model ties them all together into a single equation that scientists have confirmed countless times, often with astonishing accuracy. In the video, Professor Tong walks us through that equation term by term, introducing us to all the pieces of the theory and how they fit together. The Standard Model is complicated, but it is easier to work with than many other quantum field theories. That’s because sometimes the fields of the Standard Model interact with each other quite feebly, as writer Charlie Wood described in the second piece in our series.

    From Quanta Magazine: “Mathematicians Prove 2D Version of Quantum Gravity Really Works”

    The Standard Model has been a boon for physics, but it’s also had a bit of a hangover effect. It’s been extraordinarily effective at explaining experiments we can do here on Earth, but it can’t account for several major features of the wider universe, including the action of gravity at short distances and the presence of dark matter and dark energy. Physicists would like to move beyond the Standard Model to an even more encompassing physical theory. But, as the physicist Davide Gaiotto put it in the first piece in our series, the glow of the Standard Model is so strong that it’s hard to see beyond it.

    From Quanta Magazine: “The Mystery at the Heart of Physics That Only Math Can Solve”

    And that, maybe, is where math comes in. Mathematicians will have to develop a fresh perspective on quantum field theory if they want to understand it in a self-consistent and rigorous way. There’s reason to hope that this new vantage will resolve many of the biggest open questions in physics.

    The process of bringing QFT into math may take some time — maybe even centuries, as the physicist Nathan Seiberg speculated in the third piece in our series — but it’s also already well underway. By now, math and quantum field theory have indisputably met. It remains to be seen what happens as they really get to know each other.

    From Quanta Magazine: “Nathan Seiberg on How Math Might Complete the Ultimate Physics Theory”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 11:26 am on July 20, 2021 Permalink | Reply
    Tags: "Homebound astrophysicists miss mountaintops", Symmetry   

    From Symmetry: “Homebound astrophysicists miss mountaintops” 

    Symmetry Mag

    From Symmetry

    Mary Magnuson

    When the COVID-19 pandemic hit, travel bans and stay-at-home orders meant astrophysicists needed to find a new way to conduct their observations.

    Photo by Reidar Hahn, DOE’s Fermi National Accelerator Laboratory (US).

    High in the Chilean Andes, about halfway between the Pacific coast and the border with Argentina, sits the Cerro Tololo Inter-American Observatory.

    At the end of a winding road into the mountains, a group of white and silver domes stand stark against the dusty earth.

    It takes researchers three flights and a shuttle bus ride up the switchbacks to reach the observatory from the DOE’s Fermi National Accelerator Laboratory (US) near Chicago. The trip takes about 24 hours one-way, and many astrophysicists in the Dark Energy Survey collaboration make it several times a year. They’re headed to the Victor M. Blanco 4-meter telescope, home to the Dark Energy Camera.

    Dark Energy Survey

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    At least, that’s what they were doing, before a global pandemic threw a wrench in their travel plans.

    Researchers using the Dark Energy Camera aren’t the only ones who ran into issues over the past year or so. When the pandemic hit, observations stopped short for the Dark Energy Spectroscopic Instrument at Kitt Peak National Observatory in Arizona. Both DECam and DESI receive funding from the Department of Energy.


    Dark Energy Spectroscopic Instrument


    Not only was it difficult to travel to the observatory; once there, several people needed to work in the control room together, something they could no longer do, says Fermilab astrophysicist Elizabeth Buckley-Geer.

    After a few months in shutdown, DESI restarted observations. They pared down the in-person team to a single operator—and sometimes a lead observer, who could work in a separate room.

    Astrophysicists who normally made long journeys to the telescope instead scanned the stars from their own homes, using the same web-based software they’d used at the observatory, while connected to a virtual private network.

    DES researchers Sahar Allam and Douglas Tucker, who are married, have observed from home since even before the beginning of the pandemic. The setup in their office is fairly simple. Tucker says he connects a laptop to two other monitors. While they work, their black-and-white cat wanders between the screens.

    Tucker and Allam both say that flipping through lots of tabs becomes a necessity, as they’re used to having double the number of monitors in the control room. During observing shifts, the remote researchers stay in contact with the telescope operator via Zoom call.

    Buckley-Geer says she has a similar setup in her home office.

    “Personally, I think it’s somewhat better to be in the control room seeing the instrument,” she says. “But it works. I mean, we haven’t had any big disasters or problems, and we’re taking very good data.”

    While remote observing isn’t entirely new, it hasn’t been practiced at this scale before, says Antonella Palmese, who works at Fermilab on both DESI and projects using DECam. Many labs house remote observing centers where scientists can connect to observatories remotely. But when the labs went virtual during the pandemic, so did the centers.

    Palmese says she’d observed remotely from Fermilab plenty of times, but doing it from home was different.

    “I’m grateful for the opportunity to be able to get data, but I would say it’s definitely not as exciting,” Palmese says. “One of the nice things about being an astronomer is being able to travel to the telescope and learn more about the instrument. It’s just a different experience.”

    Buckley-Geer notes that remote observing has some advantages. Reducing travel cuts carbon emissions, as well as saving time and money.

    Palmese says she’d take the long trip to Chile once a year and stay at the observatory for around a week. But while remote observing, all she has to do is set an alarm and take a few steps into her living room.

    One unforeseen advantage to switching to observing from home, Palmese says, was the ability to take advantage of time zones. International researchers, who might not normally make it out to the telescope at all, could pick up daytime observing shifts.

    There’s no guarantee when in-person observing will resume. Even when it does, Palmese and Buckley-Geer guess that some adjustments will stick around.

    “We designed the whole system to be able to operate remotely [from the beginning] because that’s how we debugged problems and things like that,” Buckley-Geer says. “But we’ve given remote operating much more testing and much more use than we ever, ever envisioned.”

    Still, Palmese says she looks forward to observing in-person again. She says she used to get a lot of her work done while observing in Chile because during downtime, she had her collaborators right there with her.

    Palmese, Allam and Tucker say they miss in-person observing for reasons other than productivity.

    “A lot of the time you’re inside the dome, in a lit room with a lot of terminals,” Tucker says. “But every once in a while, you go outside.

    “And when your eyes adjust to the dark, you see the Milky Way spread over the sky. In Chile, you see the Magellanic Clouds. You can see galaxies which are visible by eye. And on the Andes mountain range, the Pacific Ocean is just about 30 miles away. So if you look outwards over the ocean, you see the sea fog coming in.”

    Allam shares the sentiment. “It’s just beautiful,” she says. “Since we do it for years and years, it’s emotional. If you do it once, even just for your soul, you will fall in love.”


  • richardmitnick 3:19 pm on July 13, 2021 Permalink | Reply
    Tags: "Blink and it’s gone", Anomaly detection-scanning for anything that deviates from the Standard Model-something that artificial intelligence could help with., In about half the time it takes a human to blink the CMS experiment’s triggers have processed and discarded 99.9975% of the data., Once we lose the data we lose it forever., Physicists use specialized systems called trigger systems to decide which collisions to retain for analysis and which ones to discard., Symmetry, The challenge of deciding in a split second which data to keep some scientists say could be met with Artificial Intelligence., The trigger is the only component to observe every collision.   

    From Symmetry: “Blink and it’s gone” 

    Symmetry Mag

    From Symmetry

    Eoin O’Carroll

    Fast electronics and artificial intelligence are helping physicists capture data and decide what to keep and what to throw away.

    Illustration by Sandbox Studio, Chicago with Ana Kova.

    The nucleus of the atom was discovered a century ago thanks to scientists who didn’t blink.

    Working in pitch darkness at the University of Manchester (UK) between 1909 and 1913, research assistants Hans Geiger and Ernest Marsden peered through microscopes to count flashes of alpha particles on a fluorescent screen. The task demanded total concentration, and the scientists could count accurately for only about a minute before fatigue set in. The physicist and science historian Siegmund Brandt wrote that Geiger and Marsden maintained their focus by ingesting strong coffee and “a pinch of strychnine.”

    Modern particle detectors use sensitive electronics instead of microscopes and rat poison to observe particle collisions, but now there’s a new challenge. Instead of worrying about blinking and missing interesting particle interactions, physicists worry about accidentally throwing them away.

    The Large Hadron Collider at CERN produces collisions at a rate of 40 million per second, generating enough data to fill more than 140,000 one-terabyte storage drives every hour. Capturing all those events is impossible, so the electronics have to make some tough choices.
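    Those two figures are consistent under a rough back-of-the-envelope check. The per-event size of about 1 megabyte is an illustrative assumption (the article quotes only the totals):

    ```python
    collisions_per_second = 40e6   # 40 million collisions per second
    bytes_per_event = 1e6          # ~1 MB of raw data per event (illustrative assumption)

    bytes_per_hour = collisions_per_second * bytes_per_event * 3600
    one_tb_drives_per_hour = bytes_per_hour / 1e12

    # ~144,000 drives per hour, matching "more than 140,000" in the text.
    assert one_tb_drives_per_hour > 140_000
    ```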

    To decide which collisions to retain for analysis and which ones to discard, physicists use specialized systems called trigger systems. The trigger is the only component to observe every collision. In about half the time it takes a human to blink, the CMS experiment’s triggers have processed and discarded 99.9975% of the data.
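    Keeping only 0.0025% of a 40-million-per-second stream leaves a far more manageable rate, as a quick calculation using just the figures quoted above shows:

    ```python
    collisions_per_second = 40e6
    discarded_fraction = 0.999975  # 99.9975% of events are thrown away

    kept_per_second = collisions_per_second * (1 - discarded_fraction)

    # Roughly 1,000 events per second survive the trigger.
    assert round(kept_per_second) == 1000
    ```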

    Depending on how a trigger is programmed, it could be the first to capture evidence of new phenomena—or to lose it.

    “Once we lose the data, we lose it forever,” says Georgia Karagiorgi, a professor of physics at Columbia University (US) and the US project manager for the data acquisition system for the Deep Underground Neutrino Experiment.

    “We need to be constantly looking. We can’t close our eyes.”

    The challenge of deciding in a split second which data to keep, some scientists say, could be met with artificial intelligence.

    A numbers game

    Discovering new subatomic phenomena often requires amassing a colossal dataset, most of it uninteresting.

    Geiger and Marsden learned this the hard way. Working under the direction of Ernest Rutherford, the two scientists sought to reveal the structures of atoms by sending streams of alpha particles through sheets of gold foil and observing how the particles scattered. They found that for about every 8000 particles that passed straight through the foil, one particle would bounce away as though it had collided with something solid. That was the atom’s nucleus, and its discovery sent physics itself on a new trajectory.

    By today’s physics’ standards, Geiger and Marsden’s 1-in-8000 odds look like a safe bet. The Higgs boson is thought to appear in only one out of every 5 billion collisions in the LHC. And scientists have only a small window of time in which to catch them.

    “At CMS we have a massive amount of data,” says Princeton University (US) physicist Isobel Ojalvo, who has been heavily involved in upgrading the CMS trigger system. “We’re only able to store that data for about three and a half [millionths of a second] before we make decisions about keeping it or throwing it away.”
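    Put together, the figures quoted here — one Higgs per roughly 5 billion collisions, and a decision window of about 3.5 microseconds at a 40 MHz collision rate — imply the following rough sketch (using only the numbers in the article):

    ```python
    collision_rate_hz = 40e6       # LHC bunch-crossing rate
    higgs_per_collision = 1 / 5e9  # one Higgs in about 5 billion collisions
    buffer_seconds = 3.5e-6        # time an event can be held before the decision

    seconds_per_higgs = 1 / (collision_rate_hz * higgs_per_collision)
    crossings_buffered = collision_rate_hz * buffer_seconds

    # A Higgs-producing collision happens only about once every two minutes...
    assert round(seconds_per_higgs) == 125
    # ...while only ~140 bunch crossings fit inside the trigger's decision window.
    assert round(crossings_buffered) == 140
    ```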

    The triggers will soon need to get even faster. In the LHC’s Run 3, set to begin in March 2022, the total number of collisions will equal that of the two previous runs combined. The collision rate will increase dramatically during the LHC’s High-Luminosity era, which is scheduled to begin in 2027 and continue through the 2030s.

    That’s when the collider’s luminosity, a measure of how tightly the crossing beams are packed with particles, is set to increase tenfold over its original design value.

    Collecting this data is important because in the coming decade, scientists will intensify their searches for phenomena that are just as mysterious to today’s physicists as atomic nuclei were to Geiger and Marsden.

    A new physics

    In 2012, the Higgs boson became the last confirmed elementary particle of the Standard Model, the equation that succinctly describes all known forms of matter and predicts with astonishing accuracy how they interact.

    But there are strong signs that the Standard Model, which has guided physics for nearly 50 years, won’t have the last word.

    In April, for instance, preliminary results from the Muon g-2 experiment at the DOE’s Fermi National Accelerator Laboratory (US) offered tantalizing hints that the muon may be interacting with a force or particle the Standard Model doesn’t include. Identifying these phenomena and many others may require a new understanding.

    “Given that we have not seen [beyond the Standard Model] physics yet, we need to revolutionize how we collect our data to enable processing data rates at least an order of magnitude higher than achieved thus far,” says Massachusetts Institute of Technology (US) physicist Mike Williams, who is a member of the Institute for Research and Innovation in Software for High-Energy Physics (US), IRIS-HEP.

    Physicists agree that future triggers will need to be faster, but there’s less consensus on how they should be programmed.

    “How do we make discoveries when we don’t know what to look for?” asks Peter Elmer, executive director and principal investigator for IRIS-HEP. “We don’t want to throw anything away that might hint at new physics.”

    There are two different schools of thought, Ojalvo says.

    The more conservative approach is to search for signatures that match theoretical predictions. “Another way,” she says, “is to look for things that are different from everything else.”

    This second option, known as anomaly detection, would scan not for specific signatures, but for anything that deviates from the Standard Model, something that artificial intelligence could help with.

    “In the past, we guessed the model and used the trigger system to pick those signatures up,” Ojalvo says.

    But “now we’re not finding the new physics that we believe is out there,” Ojalvo says. “It may be that we cannot create those interactions in present-day colliders, but we also need to ask ourselves if we’ve turned over every stone.”

    Instead of searching one-by-one for signals predicted by each theory, physicists could deploy to a collider’s trigger system an unsupervised machine-learning algorithm, Ojalvo says. They could train the algorithm only on the collisions it observes, without reference to any other dataset. Over time, the algorithm would learn to distinguish common collision events from rare ones. The approach would not require knowing any details in advance about what new signals might be, and it would avoid bias toward one theory or another.
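    A minimal sketch of that idea, far simpler than anything that would run on real trigger hardware: estimate what a “typical” event looks like from the observed stream itself, with no labels and no external reference dataset, then score each event by its distance from that baseline. The two features, numbers, and thresholds here are invented purely for illustration.

    ```python
    import math
    import random
    import statistics

    def train_baseline(events):
        """Learn per-feature mean and spread from the events actually observed,
        with no labels and no reference to any other dataset."""
        n_features = len(events[0])
        means = [statistics.fmean(e[i] for e in events) for i in range(n_features)]
        stds = [statistics.pstdev([e[i] for e in events]) for i in range(n_features)]
        return means, stds

    def anomaly_score(event, means, stds):
        """Distance from the learned 'typical collision', in units of standard
        deviations; a crude stand-in for a learned density model."""
        return math.sqrt(sum(((x - m) / s) ** 2
                             for x, m, s in zip(event, means, stds)))

    random.seed(0)
    # Simulated stream of common events: two made-up features drawn from the bulk.
    common = [(random.gauss(50.0, 5.0), random.gauss(0.0, 1.0)) for _ in range(5000)]
    means, stds = train_baseline(common)

    typical = (51.0, 0.2)    # close to the bulk: low score, safe to discard
    outlier = (120.0, 6.0)   # far from anything seen: high score, keep it
    assert anomaly_score(typical, means, stds) < anomaly_score(outlier, means, stds)
    ```

    A real trigger would use a far richer learned model and would have to make this decision within the microsecond-scale budget described earlier; the point of the sketch is only that nothing about the signal needs to be assumed in advance.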

    MIT physicist Philip Harris says that recent advances in artificial intelligence are fueling a growing interest in this approach—but that advocates of “theoryless searches” remain a minority in the physics community.

    More generally, says Harris, using AI for triggers can create opportunities for more innovative ways to acquire data. “The algorithm will be able to recognize the beam conditions and adapt their choices,” he says. “Effectively, it can change itself.”

    Programming triggers calls for tradeoffs between efficiency, breadth, accuracy and feasibility. “All of this is wonderful in theory,” says Karagiorgi. “It’s all about hardware resource constraints, power resource constraints, and, of course, cost.”

    “Thankfully,” she adds, “we don’t need strychnine.”


  • richardmitnick 9:17 pm on July 6, 2021 Permalink | Reply
    Tags: "What is a photon?", Symmetry   

    From Symmetry: “What is a photon?” 

    Symmetry Mag

    From Symmetry

    Amanda Solliday
    Kathryn Jepsen

    The fundamental particle of light is both ordinary and full of surprises.

    Credit: Single-Photon Workshop 2019. http://www.eventideib.polimi.it/en/events/single-photon-workshop-2019/

    What physicists refer to as photons, other people might just call light. As quanta of light, photons are the smallest possible packets of electromagnetic energy. If you are reading this article on a screen or a page, streams of photons are carrying the images of the words to your eyes.

    In science, photons are used for more than just illumination.

    “They’re ubiquitous,” says Richard Ruiz, a research associate at the Institute of Nuclear Physics – Polish Academy of Sciences [Instytut Fizyki Jądrowej Polskiej Akademii Nauk] (PL), and a theorist looking for new physics at the Large Hadron Collider.

    “Photons are everywhere in particle physics, so you almost forget about them.”

    The photon has fueled centuries of discovery, and it remains an important tool today.

    From wave, to particle, to boson

    People have investigated the nature of light since ancient times, with early insights coming from philosophers and scholars in Egypt, Mesopotamia, India and Greece. Between the late 17th and early 20th centuries, scientists went back and forth on the answer to one question in particular: Does light behave as a particle or as a wave?

    In 1690, Christiaan Huygens published Traité de la Lumière, his treatise on light. In it, he described light as being made up of waves that moved through the ether, which was thought to permeate space.

    Isaac Newton declared in his 1704 book Opticks that he disagreed. When light reflects off of a surface, it acts like a bouncing ball; the angle at which it approaches the surface is equal to the angle at which it bounces off. Newton argued that this phenomenon, among other things, could be explained if light were made up of particles, which he called “corpuscules.”

    A glass prism refracts a beam of white light into a rainbow of colors. Newton noticed that when the light was then refracted again, through a second prism, it did not divide any further; the rainbow colors stayed the same.

    Newton said this could be explained by assuming that white light was made up of many different corpuscules of different sizes. Red light was made up of the biggest corpuscules; violet was made up of the smallest. Newton said their different sizes caused the corpuscules to be pulled through the glass at different, accelerated speeds. This spread them out, producing the rainbow of colors that could not be broken down further by a second prism.

    Newton’s corpuscular model had a significant drawback, though.

    When light travels through a small hole, it spreads out just like ripples in water. Newton’s corpuscular model couldn’t explain this behavior, and Huygens’ wave model could.

    Still, scientists were generally inclined to dismiss Huygens and listen to Newton—he did write Principia, one of the most important books in the history of science, after all.

    But Huygens’ model received some support in 1801, when Thomas Young conducted the double slit experiment. In the experiment, Young sent a beam of light through two small holes, side-by-side, and found that the light passing through them formed a particular pattern. At regular intervals the intersecting ripples emanating from the two holes interfered either constructively—combining to make brighter light—or destructively—canceling one another out. Just like waves.
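    The fringe pattern Young observed follows directly from the wave picture: the brightness at each angle depends on whether the path difference between the two holes is a whole or half-integer number of wavelengths. A minimal sketch (the wavelength and slit separation below are illustrative values, not from Young's actual apparatus):

```python
import math

# Illustrative values, not from Young's 1801 setup.
WAVELENGTH = 500e-9       # green light, meters
SLIT_SEPARATION = 50e-6   # distance between the two holes, meters

def relative_intensity(theta_rad):
    """Two-slit interference: bright fringes where the path difference
    d*sin(theta) is a whole number of wavelengths (constructive),
    dark fringes at half-integer multiples (destructive)."""
    phase = math.pi * SLIT_SEPARATION * math.sin(theta_rad) / WAVELENGTH
    return math.cos(phase) ** 2

# Center of the pattern: the ripples arrive in phase -> maximum brightness.
print(relative_intensity(0.0))  # 1.0
# First dark fringe: path difference of half a wavelength -> cancellation.
theta_dark = math.asin(WAVELENGTH / (2 * SLIT_SEPARATION))
print(round(relative_intensity(theta_dark), 6))  # 0.0
```

    The alternation between 1 and 0 at regular angular intervals is exactly the pattern of bright and dark bands Young saw, something no stream of independent corpuscules could produce.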

    About five decades later, another experiment put Huygens’ model definitively in the lead.

    In 1850, Léon Foucault compared the speed of light through air with the speed of light through water and found that, contrary to Newton’s assertions, light did not move faster in the denser medium. Instead, just like a wave would, it slowed down.
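    The wave model's prediction that Foucault confirmed can be stated compactly: in a medium of refractive index n, light travels at v = c / n. A quick sketch with standard textbook index values:

```python
SPEED_OF_LIGHT = 299_792_458  # m/s in vacuum

def speed_in_medium(refractive_index):
    """Wave-model prediction: light slows in a denser medium, v = c / n."""
    return SPEED_OF_LIGHT / refractive_index

# Water (n ~ 1.33): about 2.25e8 m/s -- slower than in air (n ~ 1.0003),
# as Foucault measured, contradicting the corpuscular prediction that
# denser media should speed light up.
print(speed_in_medium(1.33))
print(speed_in_medium(1.0003))
```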

    Eleven years later, James Clerk Maxwell published On Physical Lines of Force, in which he predicted the existence of electromagnetic waves. Maxwell noted their similarity to lightwaves, leading him to conclude that the two were one and the same.

    It seemed that Huygens’ wave model had won the day. But in 1900, Max Planck came up with an idea that would spark a brand new concept of light.

    Planck explained some puzzling behaviors of radiation by describing the energy of electromagnetic waves as divided into individual packets. In 1905, Albert Einstein built on Planck’s concept of energy packets and finally settled the corpuscule-versus-wave debate—by declaring it a tie.

    As Einstein explained, light behaves as both a particle and a wave, with the energy of each particle of light corresponding to the frequency of the wave.
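    That correspondence is the Planck-Einstein relation, E = h * f, where h is Planck's constant and f the frequency of the wave. A minimal sketch of the arithmetic:

```python
# Planck-Einstein relation: photon energy is proportional to frequency.
PLANCK = 6.62607015e-34       # J*s (exact by SI definition)
SPEED_OF_LIGHT = 299_792_458  # m/s

def photon_energy_joules(wavelength_m):
    """E = h * f, with f = c / wavelength."""
    frequency = SPEED_OF_LIGHT / wavelength_m
    return PLANCK * frequency

# A 500 nm (green) photon carries roughly 4e-19 J, i.e. about 2.5 eV.
energy = photon_energy_joules(500e-9)
print(energy)
print(energy / 1.602176634e-19)  # convert joules to electronvolts
```

    Higher-frequency (shorter-wavelength) light therefore means more energetic photons, which is the key to the photoelectric effect described next.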

    His evidence came from studies of the photoelectric effect—the way in which light knocked electrons loose from metal. If light traveled only in a continuous wave, then shining a light on metal for long enough would always dislodge an electron, because the energy the light transferred to the electron would accumulate over time.

    But the photoelectric effect didn’t work that way. In 1902 Philipp Lenard had observed that only light above a certain energy—or lightwaves above a certain frequency—could pry an electron loose from the metal. And it seemed to do so on contact, immediately.
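    Einstein's photoelectric equation captures Lenard's threshold: an electron escapes only if the photon energy h * f exceeds the metal's work function, and the excess becomes the electron's kinetic energy. A sketch (the 2.3 eV work function below is roughly that of sodium, used here purely as an illustrative value):

```python
PLANCK_EV = 4.135667696e-15  # Planck's constant in eV*s

def max_electron_energy_ev(frequency_hz, work_function_ev):
    """Einstein's photoelectric equation: K_max = h*f - phi.
    Returns None when the photon energy is below the work function:
    no electron is ejected, no matter how intense or prolonged the light."""
    kinetic = PLANCK_EV * frequency_hz - work_function_ev
    return kinetic if kinetic > 0 else None

# Reddish light (5e14 Hz): each photon carries ~2.07 eV, below the
# 2.3 eV threshold, so no electrons come loose.
print(max_electron_energy_ev(5e14, 2.3))   # None
# Ultraviolet light (1e15 Hz): ~4.14 eV per photon, well above threshold.
print(max_electron_energy_ev(1e15, 2.3))   # ~1.84 eV
```

    The all-or-nothing behavior, set by frequency rather than by intensity or exposure time, is what the continuous-wave picture could not explain.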

    In this case, the light was acting more like a particle, an individual packet of energy.

    Still convinced of the wave model of light, Robert Millikan set out to disprove Einstein’s hypothesis. Millikan took careful measurements of the relationship between the light and electrons involved in the photoelectric effect. To his surprise, he confirmed each of Einstein’s predictions.

    Einstein’s study of the photoelectric effect earned him his sole Nobel Prize in 1921.

    In 1923, Arthur Compton provided additional support for Einstein’s model of light. Compton aimed high-energy light at materials, and he successfully predicted the angles at which electrons released by the collisions would scatter. He did it by presuming the light would act like tiny billiard balls.
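    The billiard-ball picture yields the Compton scattering formula: the photon's wavelength grows by (h / m_e c)(1 - cos theta) when it scatters off an electron at angle theta. A minimal sketch:

```python
import math

PLANCK = 6.62607015e-34           # J*s
ELECTRON_MASS = 9.1093837015e-31  # kg
SPEED_OF_LIGHT = 299_792_458      # m/s

# The electron's Compton wavelength, about 2.43e-12 m.
COMPTON_WAVELENGTH = PLANCK / (ELECTRON_MASS * SPEED_OF_LIGHT)

def compton_shift_m(scatter_angle_rad):
    """Wavelength shift of a photon scattering off an electron,
    treating both as colliding particles (billiard balls)."""
    return COMPTON_WAVELENGTH * (1 - math.cos(scatter_angle_rad))

print(compton_shift_m(0.0))      # forward scattering: no shift
# Head-on backscatter (180 degrees) gives the maximum shift: twice
# the Compton wavelength, about 4.85e-12 m.
print(compton_shift_m(math.pi))
```

    The shift depends only on the scattering angle, not on the light's intensity, just as a particle collision would predict.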

    Chemist Gilbert Lewis came up with a name for these billiard balls. In a 1926 letter to the journal Nature, he called them “photons.”

    The way that scientists think about photons has continued to evolve in more recent years. For one, the photon is now known as a “gauge boson.”

    Gauge bosons are force-carrying particles that enable matter particles to interact via the fundamental forces. Atoms, for example, stick together because the positively charged protons in their nuclei exchange photons with the negatively charged electrons that orbit them—an interaction via the electromagnetic force.

    Secondly, the photon is now thought of as a particle, a wave, and an excitation—kind of like a wave—in a quantum field.

    A quantum field, such as the electromagnetic field, is a kind of energy and potential spread throughout space. Physicists think of every particle as an excitation of a quantum field.

    “I like to think of a quantum field as a calm pond surface where you don’t see anything,” Ruiz says. “Then you put a pebble on the surface, and the water pops up a bit. That’s a particle.”

    Photons as a tool

    Radio waves and microwaves; infrared and ultraviolet light; X-rays and gamma rays: All of these are light, and all of them are made up of photons.

    Photons are at work all around you. They travel through optical fibers to deliver internet, cable and cell phone signals. They are used in plastics upcycling, to break down objects into small building blocks that can be used in new materials. They are used in hospitals, in beams that target and destroy cancerous tissues.

    And they are key to all kinds of scientific research.

    Photons are essential in cosmology: the study of the past, present and future of the universe. Scientists study stars by examining the electromagnetic radiation they emit, such as radio waves and visible light. Astronomers develop maps of our galaxy and its neighbors by imaging the microwave sky. They detect space dust that blocks their view of distant stars by detecting its infrared light.

    Scientists collect strong signals, in the form of ultraviolet radiation, X-rays, and gamma rays emitted by energetic objects from our galaxy and beyond. And they detect weak signals, such as the faint pattern of light known as the cosmic microwave background, which serves as a record of the state of the universe about 380,000 years after the Big Bang, when it first became transparent to light.

    Photons also remain important in physics.

    In 2012, scientists at the Large Hadron Collider discovered the Higgs boson by studying its decay into pairs of photons.

    Physicist Donna Strickland won a share of the Nobel Prize in Physics in 2018 for her work developing ultrashort, high-intensity laser pulses, formed from highly focused high-energy light.

    Machines called light sources create intense beams of X-rays, ultraviolet light and infrared light to help scientists break down the steps of the fastest chemical processes and examine materials in molecular detail.

    “Across the electromagnetic spectrum, photons can provide us with so much information about the world,” says Jennifer Dionne, associate professor of materials science and engineering at Stanford University (US).

    Dionne conducts research in the field of nanophotonics, a subfield of physics in which scientists control light and study its interactions with molecules and nano-sized structures. Among other projects, her lab uses photons to up the effectiveness of catalysts, substances used to kick off high-efficiency chemical reactions.

    “Light—photons—are a reagent in chemistry that people don’t always think about,” Dionne says. “People often think about adding new chemicals to enable a certain reaction or controlling the temperature or pH of a solution. Light can bring a whole new dimension and an entirely new tool kit.”

    Some physicists are even looking for new types of photons. Theoretical “dark photons” would serve as a new kind of gauge boson, mediating the interactions between particles of dark matter.

    “Photons are always full of surprises,” Dionne says.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.
