Tagged: Particle Physics

  • richardmitnick 2:20 pm on May 14, 2019 Permalink | Reply
    Tags: LS2, Particle Physics, Superconducting magnet circuits

    From CERN: “LS2 Report: consolidating the energy extraction systems of LHC superconducting magnet circuits” 


    From CERN

    13 May, 2019
    Anaïs Schaeffer

    The LS2 team from the NRC Kurchatov-IHEP Institute, Protvino, Russia, with a 13 kA energy extraction system (Image: NRC Kurchatov-IHEP Institute)

    In the LHC, 1232 superconducting dipole magnets and 392 quadrupole magnets guide and focus the beams around the accelerator’s 27-kilometre ring, which is divided into eight sectors. These magnets operate at very low temperatures – 1.9 K or −271.3 °C – where even a tiny amount of energy released inside a magnet can warm its windings to above the critical temperature, causing the loss of superconductivity: this is called a quench. When this happens, the energy stored in the affected magnet has to be safely extracted in a short time to avoid damage to the magnet coil.

    To do so, two protection elements are activated: at the level of the quenching magnet, a diode diverts the current into a parallel by-pass circuit in less than a second; at the level of the circuit, 13 kA energy extraction systems absorb the energy of the whole magnet circuit in a few minutes. There are equivalent extraction systems installed for about 200 corrector circuits with currents up to 600 A.

    “In the framework of a long-lasting and fruitful collaboration between CERN and the Russian Federation, energy extraction systems for quench protection of the LHC superconducting magnets were designed in close partnership with two Russian institutes, the NRC Kurchatov-IHEP Institute in Protvino for the 13 kA systems and the Budker Institute in Novosibirsk for the 600 A systems. Russian industry was involved in the manufacturing of the parts of these systems,” explains Félix Rodríguez Mateos, leader of the Electrical Engineering (EE) section in the Machine Protection and Electrical Integrity (MPE) group of CERN’s Technology department.

    With a wealth of expertise and know-how, the Russian teams have continuously provided invaluable support to the MPE group. “Our Russian colleagues come to CERN for every year-end technical stop (YETS) and long shutdown to help us perform preventive maintenance and upgrade activities on the energy extraction systems,” says Rodríguez Mateos.

    During LS2, an extensive maintenance campaign is being performed on the 13 kA systems, which already count 10 years of successful operation in the LHC. “We are currently replacing an element, the arcing contact, in each one of the 256 electromechanical switches of the energy extraction systems to ensure their continuous reliable operation throughout the next runs,” adds Rodríguez Mateos. “In February, we fully replaced 32 switches at Point 8 of the accelerator in anticipation of consolidation for the future HL-LHC.”

    During LS2, the Electrical Engineering section is involved in many other activities that will be the subject of future articles.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

  • richardmitnick 12:04 pm on May 14, 2019 Permalink | Reply
    Tags: Model-dependent vs model-independent research, Particle Physics

    From Symmetry: “Casting a wide net” 

    From Symmetry

    Jim Daley

    Illustration by Sandbox Studio, Chicago

    In their quest to discover physics beyond the Standard Model, physicists weigh the pros and cons of different search strategies.

    On October 30, 1975, theorists John Ellis, Mary K. Gaillard and D.V. Nanopoulos published a paper [Science Direct] titled “A Phenomenological Profile of the Higgs Boson.” They ended their paper with a note to their fellow scientists.

    “We should perhaps finish with an apology and a caution,” it said. “We apologize to experimentalists for having no idea what is the mass of the Higgs boson… and for not being sure of its couplings to other particles, except that they are probably all very small.

    “For these reasons, we do not want to encourage big experimental searches for the Higgs boson, but we do feel that people performing experiments vulnerable to the Higgs boson should know how it may turn up.”

    What the theorists were cautioning against was a model-dependent search, a search for a particle predicted by a certain model—in this case, the Standard Model of particle physics.

    Standard Model of Particle Physics

    It shouldn’t have been too much of a worry. Around then, most particle physicists’ experiments were general searches, not based on predictions from a particular model, says Jonathan Feng, a theoretical particle physicist at the University of California, Irvine.

    Using early particle colliders, physicists smashed electrons and protons together at high energies and looked to see what came out. Samuel Ting and Burton Richter, who shared the 1976 Nobel Prize in physics for the discovery of the charm quark, for example, were not looking for the particle with any theoretical prejudice, Feng says.

    That began to change in the 1980s and ’90s. That’s when physicists began exploring elegant new theories such as supersymmetry, which could tie up many of the Standard Model’s theoretical loose ends—and which predict the existence of a whole slew of new particles for scientists to try to find.

    Of course, there was also the Higgs boson. Even though scientists didn’t have a good prediction of its mass, they had good motivations for thinking it was out there waiting to be discovered.

    And it was. Almost 40 years after the theorists’ tongue-in-cheek warning about searching for the Higgs, Ellis found himself sitting in the main auditorium at CERN next to experimentalist Fabiola Gianotti, the spokesperson of the ATLAS experiment at the Large Hadron Collider who, along with CMS spokesperson Joseph Incandela, had just co-announced the discovery of the particle he had once so pessimistically described.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    Model-dependent vs model-independent

    Scientists’ searches for particles predicted by certain models continue, but in recent years, searches for new physics independent of those models have begun to enjoy a resurgence as well.

    “A model-independent search is supposed to distill the essence from a whole bunch of specific models and look for something that’s independent of the details,” Feng says. The goal is to find an interesting common feature of those models, he explains. “And then I’m going to just look for that phenomenon, irrespective of the details.”

    Particle physicist Sara Alderweireldt uses model-independent searches in her work on the ATLAS experiment at the Large Hadron Collider.

    CERN ATLAS (Image: Claudia Marcelloni, CERN/ATLAS)

    Alderweireldt says that while many high-energy particle physics experiments are designed to make very precise measurements of a specific aspect of the Standard Model, a model-independent search allows physicists to take a wider view and search more generally for new particles or interactions. “Instead of zooming in, we try to look in as many places as possible in a consistent way.”

    Such a search makes room for the unexpected, she says. “You’re not dependent on the prior interpretation of something you would be looking for.”
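    A toy version of "looking in as many places as possible in a consistent way" is a sliding-window bump hunt: compare every window of a spectrum against the smooth background expectation and flag the largest excess, without assuming in advance where a new particle should appear. This is only an illustration of the general idea with made-up numbers, not the ATLAS procedure:

```python
import math
import random

random.seed(1)

# Toy spectrum: a smoothly falling background with an injected "signal"
# bump in bins 30-33 (a hypothetical new particle, for illustration).
n_bins = 60
background = [1000 * math.exp(-0.05 * i) for i in range(n_bins)]
observed = [random.gauss(b, math.sqrt(b)) for b in background]
for i in range(30, 34):
    observed[i] += 80

def scan(observed, expected, width=4):
    """Slide a window over every position and return the largest excess.

    The significance here is naive (no look-elsewhere correction); a real
    search must account for the many windows it tried.
    """
    best_z, best_start = 0.0, None
    for start in range(len(observed) - width + 1):
        obs = sum(observed[start:start + width])
        exp = sum(expected[start:start + width])
        z = (obs - exp) / math.sqrt(exp)
        if z > best_z:
            best_z, best_start = z, start
    return best_z, best_start

z, where = scan(observed, background)
print(f"largest excess: {z:.1f} sigma in the window starting at bin {where}")
```

    The scan finds the injected bump without being told where to look, which is the appeal of the approach; the price, as the article notes, is that any one model is constrained less strongly than by a dedicated search.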

    Theorist Patrick Fox and experimentalist Anadi Canepa, both at Fermilab, collaborate on searches for new physics.

    In Canepa’s work on the CMS experiment, the other general-purpose particle detector at the LHC, many of the searches are model-independent.

    While the nature of these searches allows them to “cast a wider net,” Fox says, “they are in some sense shallower, because they don’t manage to strongly constrain any one particular model.”

    At the same time, “by combining the results from many independent searches, we are getting closer to one dedicated search,” Canepa says. “Developing both model-dependent and model-independent searches is the approach adopted by the CMS and ATLAS experiments to fully exploit the unprecedented potential of the LHC.”

    Driven by data and powered by machine learning

    Model-dependent searches focus on a single assumption or look for evidence of a specific final state following an experimental particle collision. Model-independent searches are far broader—and how broad is largely driven by the speed at which data can be processed.

    “We have better particle detectors, and more advanced algorithms and statistical tools that are enabling us to understand searches in broader terms,” Canepa says.

    One reason model-independent searches are gaining prominence is because now there is enough data to support them. Particle detectors are recording vast quantities of information, and modern computers can run simulations faster than ever before, she says. “We are able to do model-independent searches because we are able to better understand much larger amounts of data and extreme regions of parameter and phase space.”

    Machine learning is a key part of this processing power, Canepa says. “That’s really a change of paradigm, because it really made us make a major leap forward in terms of sensitivity [to new signals]. It really allows us to benefit from understanding the correlations that we didn’t capture in a more classical approach.”

    These broader searches are an important part of modern particle physics research, Fox says.

    “At a very basic level, our job is to bequeath to our descendants a better understanding of nature than we got from our ancestors,” he says. “One way to do that is to produce lots of information that will stand the test of time, and one way of doing that is with model-independent searches.”

    Models go in and out of fashion, he adds. “But model-independent searches don’t feel like they will.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 9:09 am on May 13, 2019 Permalink | Reply
    Tags: CLIC, Particle Physics, Roadmap for the future of the discipline, The European Strategy Group

    From CERN: “In Granada, the European particle physics community prepares decisions for the future of the field” 


    From CERN

    13 May, 2019



    Geneva and Granada. The European particle physics community is meeting this week in Granada, Spain, to discuss the roadmap for the future of the discipline. The aim of the symposium is to define scientific priorities and technological approaches for the coming years and to consider plans for the medium- and long-term future. An important focus of the discussions will be assessing the various options for the period beyond the lifespan of the Large Hadron Collider.

    “The Granada symposium is an important step in the process of updating the European Strategy for Particle Physics and aims to prioritise our scientific goals and prepare for the upcoming generation of facilities and experiments,” said the President of the CERN Council, Ursula Bassler. “The discussions will focus on the scientific reach of potential new projects, the associated technological challenges and the resources required.”

    The European Strategy Group, which was established to coordinate the update process, has received 160 contributions from the scientific community setting out their views on possible future projects and experiments. The symposium in Granada will provide an opportunity to assess and discuss them.

    “The intent is to make sure that we have a good understanding of the science priorities of the community and of all the options for realising them,” said the Chair of the European Strategy Group, Professor Halina Abramowicz. “This will ensure that the European Strategy Group is well informed when deciding about the strategy update.”

    The previous update of the European Strategy, approved in May 2013, recommended that design and feasibility studies be conducted in order for Europe “to be in a position to propose an ambitious post-LHC accelerator project.” Over the last few years, in collaboration with partners from around the world, Europe has therefore been engaging in R&D and design projects for a range of ambitious post-LHC facilities under the CLIC and FCC umbrellas.

    CLIC collider

    CERN FCC: the proposed Future Circular Collider, a ~100 km-circumference successor to the LHC

    A study to investigate the potential to build projects that are complementary to high-energy colliders, exploiting the opportunities offered by CERN’s unique accelerator complex, was also launched by CERN in 2016. These contributions will feed into the discussion, which will also take into account the worldwide particle physics landscape and developments in related fields.

    “At least two decades will be needed to design and build a new collider to succeed the LHC. Such a machine should maximise the potential for new discoveries and enable major steps forward in our understanding of fundamental physics,” said CERN Director-General Fabiola Gianotti. “It is not too early to start planning for it, as it will take time to develop the new technologies needed for its implementation.”

    The Granada symposium will be followed up with the compilation of a “briefing book” and with a Strategy Drafting Session, which will take place in Bad Honnef, Germany, from 20 to 24 January 2020. The update of the European Strategy for Particle Physics is due to be completed and approved by the CERN Council in May 2020.

    An online Q&A session will be held on Thursday 16 May at 4 p.m. CEST.

    Reporters interested in participating are invited to register by sending an e-mail to press@cern.ch.


    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

  • richardmitnick 12:54 pm on May 10, 2019 Permalink | Reply
    Tags: “Belle II will accumulate more than 50 times the data sample of the original Belle experiment at KEK”, “We are developing the data-distribution software working not only with Belle II colleagues but also with colleagues at CERN”, “We store an entire copy of the Belle II data and we have the computing resources to process that data and make it available to collaborators around the world”, Belle II detector, Benefitting from our own experience at the RHIC & ATLAS Computing Center, Brookhaven’s magnet division constructed 43 custom-designed corrector magnets, Particle Physics, Physicists and engineers in the Laboratory’s Superconducting Magnet Division made contributions essential to upgrading the KEK accelerator helping to transform it into SuperKEKB, Physicists will search for signs of “new physics”, SuperKEKB accelerator, SuperKEKB collides electrons with their antimatter counterparts known as positrons, The corrector magnets are installed on each side of the Belle II detector

    From Brookhaven National Lab: “Brookhaven Lab and the Belle II Experiment” 

    From Brookhaven National Lab

    May 7, 2019
    Karen McNulty Walsh

    Tracking particle smashups and detector conditions from half a world away, scientists seek answers to big physics mysteries.

    SuperKEKB accelerator and Belle II detector at the interaction region.(Credit: Belle II/KEK)

    If you think keeping track of the photos on your mobile phone is a challenge, imagine how daunting the job would be if your camera were taking thousands of photos every second. That’s the task faced by particle physicists working on the Belle II experiment at Japan’s SuperKEKB particle accelerator, which started its first physics run in late March. Belle II physicists will sift through “snapshots” of millions of subatomic smashups per day—as well as data on the conditions of the “camera” at the time of each collision—to seek answers to some of the biggest questions in physics.

    A key part of the experiment is taking place half a world away, using computing resources and expertise at the U.S. Department of Energy’s Brookhaven National Laboratory, the lead laboratory for U.S. collaborators on Belle II.

    “We store an entire copy of the Belle II data, and we have the computing resources to process that data and make it available to collaborators around the world,” said Benedikt Hegner, a physicist in Brookhaven Lab’s Computational Sciences Initiative. To date, Brookhaven’s Scientific Data and Computing Center (SDCC) has handled up to 95 percent of the experiment’s entire computing workload—reconstructing particles from simulated events prior to the experiment’s startup, and since late March, from live collision events. SDCC will continue that role for the experiment’s first three years, thereafter maintaining some 30 percent of the data-transfer and storage responsibility while transitioning the rest to other Belle II member nations that have powerful GRID computing capabilities.

    “We are developing the data-distribution software, working not only with Belle II colleagues but also with colleagues at CERN, the European laboratory for particle physics research, learning from their experience managing datasets from the Large Hadron Collider (LHC)—as well as our own experience at the RHIC & ATLAS Computing Center,” Hegner said.

    Benedikt Hegner in the Scientific Data and Computing Center at Brookhaven Lab, which stores and processes Belle II data and makes it available to collaborators around the world.

    Brookhaven also hosts Belle II’s “conditions database”—an archive of the detector’s conditions at the time of each recorded collision. This database tracks millions of variables—for example, the detector’s level of electronic noise, millimeter-scale movements of the detector due to the strong magnetic field, and variations in electronic response due to small temperature changes—all of which need to be properly taken into account to make sense of Belle II’s measurements.

    “This is the first time a particle physics experiment’s conditions database is being hosted at a distant location,” Hegner noted. Tracking the conditions helps calibrate the detector and even feeds input to the “trigger” systems that decide which collisions to record. “If we’re having trouble with our system, Belle II will eventually see that during data collection. So, the reliability of our services is essential,” Hegner said.
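    The core mechanism of such a database, an interval-of-validity lookup, can be sketched in a few lines: each calibration payload is valid from a given run number until a later entry supersedes it. Everything below (the class, tag name, and values) is hypothetical, for illustration only, and is not Belle II's actual schema:

```python
import bisect

class ConditionsDB:
    """Toy interval-of-validity (IOV) store: each payload is valid from its
    first run number until a later entry supersedes it."""

    def __init__(self):
        self._runs = {}      # tag -> sorted list of first-valid run numbers
        self._payloads = {}  # tag -> payloads, parallel to _runs

    def store(self, tag, first_valid_run, payload):
        runs = self._runs.setdefault(tag, [])
        payloads = self._payloads.setdefault(tag, [])
        i = bisect.bisect_left(runs, first_valid_run)
        runs.insert(i, first_valid_run)
        payloads.insert(i, payload)

    def get(self, tag, run):
        """Return the payload whose interval of validity covers `run`."""
        i = bisect.bisect_right(self._runs[tag], run) - 1
        if i < 0:
            raise KeyError(f"no conditions for {tag!r} at run {run}")
        return self._payloads[tag][i]

db = ConditionsDB()
db.store("ecl_noise_level", 1, {"mean": 1.2})     # valid from run 1
db.store("ecl_noise_level", 100, {"mean": 1.4})   # recalibration at run 100
print(db.get("ecl_noise_level", 57))    # -> {'mean': 1.2}
print(db.get("ecl_noise_level", 250))   # -> {'mean': 1.4}
```

    The binary search makes the lookup fast even with millions of entries, which matters when every recorded collision must be matched to the detector conditions in force at that moment.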

    But Brookhaven’s involvement in Belle II goes beyond cataloging collisions and crunching the numbers. Physicists and engineers in the Laboratory’s Superconducting Magnet Division made contributions essential to upgrading the KEK accelerator, helping to transform it into SuperKEKB, and members of Brookhaven Lab’s physics department are looking forward to analyzing Belle II data and being part of the upgraded facility’s discoveries.

    Improved magnets, more collisions, “new physics”?

    Like its predecessor, SuperKEKB collides electrons with their antimatter counterparts, known as positrons. To keep collision rates high, these beams must be tightly focused. But the magnetic fields guiding the particles in one beam can have unwanted effects in the adjacent beam, causing the particles to spread. To fine-tune the fields of the accelerator magnets and counteract these adjacent-beam effects, Brookhaven’s magnet division constructed 43 custom-designed corrector magnets. These corrector magnets are installed on each side of the Belle II detector, making adjustments to both the incoming and outgoing beams to maintain high beam intensity, or “luminosity.” High luminosity results in higher collision rates, so physicists at Brookhaven and around the world will have more data to analyze.
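    The arithmetic behind "more luminosity, more data" is a one-liner: the event rate for any process is the luminosity times its cross-section. The numbers below are approximate and purely illustrative (the luminosity is roughly SuperKEKB's published design target; the cross-section and operating year are round assumed values):

```python
# The event rate for any process is R = L * sigma (luminosity times
# cross-section). All numbers are approximate and for illustration only.
luminosity = 8e35        # cm^-2 s^-1, roughly SuperKEKB's design target
sigma_bb = 1.1e-33       # cm^2: ~1.1 nb for e+e- -> Upsilon(4S) -> B Bbar

rate = luminosity * sigma_bb      # B-meson pairs per second
seconds_per_year = 1e7            # typical accelerator "operating year" (assumed)
per_year = rate * seconds_per_year

print(f"~{rate:.0f} B-meson pairs per second")
print(f"~{per_year:.1e} pairs per operating year")
```

    At rates like these, billions of B-meson pairs accumulate per year, which is how Belle II can exceed the original Belle data sample many times over.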

    Corrector magnets: a leak-field cancellation coil being wound by Brookhaven Lab magnet division technician Thomas Van Winckel.

    “Belle II will accumulate more than 50 times the data sample of the original Belle experiment at KEK,” said Brookhaven physicist David Jaffe, who is coordinating Brookhaven Lab scientists’ involvement in the project.

    By scouring reconstructed images of the particles emerging from these electron-positron collisions, physicists will search for signs of “new physics”—something that cannot be explained by the particles and forces already included in the Standard Model, the world’s reigning (and well-tested) theory of particle physics.

    One particular area of interest is the decay of beauty and charm mesons—particles made of two quarks, one of which is a heavy “beauty” or “charm” quark. These “heavy flavor” mesons are created in abundance in electron-positron collisions at the SuperKEKB accelerator.

    “SuperKEKB is called a ‘B factory’ because it is optimized for the production of beauty mesons. It also produces an abundance of charm mesons,” Jaffe said. “While many physicists on Belle II will be investigating the behavior of beauty mesons, the Brookhaven team will be exploiting the huge sample of charm mesons to look for possible discoveries.”

    For example, if heavy flavor mesons measured by Belle II decay (transform into other particles) differently than predicted by the Standard Model, such a discrepancy would be an indication that some new, previously undiscovered particle might be taking part in the action.

    Evidence of new particles might help account for the mysterious dark matter that makes up some 27 percent of the universe, or offer clues about dark energy, which accounts for another 68 percent (with the remaining 5 percent made of the ordinary matter we see around us). Such a discovery might also help explain why today’s universe is made of matter rather than a mix of matter and antimatter, even though scientists believe both were created in equal amounts at the very beginning of time.

    To grasp how shocking this matter-antimatter asymmetry is, think of the common laundry experience of losing a random sock in the dryer. But imagine if every time you did the laundry—even a billion loads, each with a billion pairs of socks labeled “left” and “right”—you always ended up with a single unpaired left sock and never a lone right sock. That’s what it’s like for physicists trying to understand why the universe ended up with only matter. There must be some difference in the way matter and antimatter behave to explain this anomaly.

    There is evidence that matter and antimatter behave differently from several well-known experiments studying meson decays. These include a Nobel Prize-winning experiment at Brookhaven’s Alternating Gradient Synchrotron, which studied the decay of mesons containing a strange quark in the 1960s. More recently, several experiments studying beauty meson decays at other B factories—the original Belle at KEK, the BaBar experiment at the SLAC National Accelerator Laboratory in the U.S., and the LHCb experiment at CERN—observed similar asymmetries. But thus far, the matter-antimatter asymmetry observed in beauty and strange mesons follows the pattern predicted by the Standard Model, and is not sufficient to explain the matter-antimatter asymmetry of the universe.

    LHCb also recently observed a smaller level of matter-antimatter asymmetry in charm meson decays for the first time. It is unclear if this new observation is consistent with the Standard Model or due to new particles that preferentially interact with charm quarks. Additional measurements are needed to solve this mystery.

    Physicist David Jaffe is coordinating Brookhaven Lab’s contributions to Belle II.

    “What we’ll do at Belle II is like many, many trips to the laundromat, where we carefully launder our ‘charmed’ socks and use different methods to dry them. We’ll use our observations from these different loads of charmed laundry to map out what happens in charm meson decays to higher precision than ever before,” explained Jaffe. “Then we’ll compare those observations to our expectations from the Standard Model to see if we’ve found evidence for new particles.”

    The Belle II experiment, Jaffe noted, complements LHCb. “Belle II has a different range of features that enable contrasting studies of the charm mesons,” he said. “We are starting to accumulate large data samples to help us make the precision measurements we need to resolve these questions. Once we’ve confirmed the technical capabilities of the experiment, we will move on to data analysis and the possibility of discovery.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL Center for Functional Nanomaterials



    BNL RHIC Campus

    BNL/RHIC Star Detector


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 8:01 am on May 10, 2019 Permalink | Reply
    Tags: “Q&A: SLAC/Stanford researchers prepare for a new quantum revolution”, Particle Physics, Quantum squeezing, The most exciting opportunities in quantum control make use of a phenomenon known as entanglement

    From SLAC National Accelerator Lab: “Q&A: SLAC/Stanford researchers prepare for a new quantum revolution” 

    From SLAC National Accelerator Lab

    May 9, 2019
    Manuel Gnida

    Monika Schleier-Smith and Kent Irwin explain how their projects in quantum information science could help us better understand black holes and dark matter.

    The tech world is abuzz about quantum information science (QIS). This emerging technology explores bizarre quantum effects that occur on the smallest scales of matter and could potentially revolutionize the way we live.

    Quantum computers would outperform today’s most powerful supercomputers; data transfer technology based on quantum encryption would be more secure; exquisitely sensitive detectors could pick up fainter-than-ever signals from all corners of the universe; and new quantum materials could enable superconductors that transport electricity without loss.

    In December 2018, President Trump signed the National Quantum Initiative Act into law, which will mobilize $1.2 billion over the next five years to accelerate the development of quantum technology and its applications. Three months earlier, the Department of Energy had already announced $218 million in funding for 85 QIS research awards.

    The Fundamental Physics and Technology Innovation directorates of DOE’s SLAC National Accelerator Laboratory recently joined forces with Stanford University on a new initiative called Q-FARM to make progress in the field. In this Q&A, two Q-FARM scientists explain how they will explore the quantum world through projects funded by DOE QIS awards in high-energy physics.

    Monika Schleier-Smith, assistant professor of physics at Stanford, wants to build a quantum simulator made of atoms to test how quantum information spreads. The research, she said, could even lead to a better understanding of black holes.

    Kent Irwin, professor of physics at Stanford and professor of photon science and of particle physics and astrophysics at SLAC, works on quantum sensors that would open new avenues to search for the identity of the mysterious dark matter that makes up most of the universe.

    Monika Schleier-Smith and Kent Irwin are the principal investigators of three quantum information science projects in high-energy physics at SLAC. (Farrin Abbott/Dawn Harmer/SLAC National Accelerator Laboratory)

    What exactly is quantum information science?

    Irwin: If we look at the world on the smallest scales, everything we know is already “quantum.” On this scale, the properties of atoms, molecules and materials follow the rules of quantum mechanics. QIS strives to make significant advances in controlling those quantum effects that don’t exist on larger scales.

    Schleier-Smith: We’re truly witnessing a revolution in the field in the sense that we’re getting better and better at engineering systems with carefully designed quantum properties, which could pave the way for a broad range of future applications.

    What does quantum control mean in practice?

    Schleier-Smith: The most exciting opportunities in quantum control make use of a phenomenon known as entanglement – a type of correlation that doesn’t exist in the “classical,” non-quantum world. Let me give you a simple analogy: Imagine that we flip two coins. Classically, whether one coin shows heads or tails is independent of what the other coin shows. But if the two coins are instead in an entangled quantum state, looking at the result for one “coin” automatically determines the result for the other one, even though the coin toss still looks random for either coin in isolation.

    Entanglement thus provides a fundamentally new way of encoding information – not in the states of individual “coins” or bits but in correlations between the states of different qubits. This capability could potentially enable transformative new ways of computing, where problems that are intrinsically difficult to solve on classical computers might be more efficiently solved on quantum ones. A challenge, however, is that entangled states are exceedingly fragile: any measurement of the system – even unintentional – necessarily changes the quantum state. So a major area of quantum control is to understand how to generate and preserve this fragile resource.
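    Schleier-Smith's coin analogy can be mimicked in a few lines of ordinary code. To be clear about what this shows: a classical simulation can reproduce the single perfect correlation she describes (each coin random on its own, the pair always matching), but not every quantum prediction, since real entangled states also violate Bell inequalities that no classical model can:

```python
import random

random.seed(0)

def classical_pair():
    # Two independent fair coins: each is random, with no relation between them.
    return random.choice("HT"), random.choice("HT")

def entangled_pair():
    # Caricature of a maximally entangled state: the joint outcome is random,
    # but the two results are perfectly correlated.
    outcome = random.choice("HT")
    return outcome, outcome

trials = 10_000
classical_match = sum(a == b for a, b in (classical_pair() for _ in range(trials)))
entangled_match = sum(a == b for a, b in (entangled_pair() for _ in range(trials)))

print(f"classical pairs agree ~{classical_match / trials:.0%} of the time")  # ~50%
print(f"entangled pairs agree  {entangled_match / trials:.0%} of the time")  # 100%
```

    Looked at one coin at a time, both versions are indistinguishable fair coin flips; the information lives entirely in the correlation, which is the point of the analogy.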

    At the same time, certain quantum technologies can also take advantage of the extreme sensitivity of quantum states to perturbations. One application is in secure telecommunications: If a sender and receiver share information in the form of quantum bits, an eavesdropper cannot go undetected, because her measurement necessarily changes the quantum state.

    Another very promising application is quantum sensing, where the idea is to reduce noise and enhance sensitivity by controlling quantum correlations, for instance, through quantum squeezing.

    What is quantum squeezing?

    Irwin: Quantum mechanics sets limits on how we can measure certain things in nature. For instance, we can’t perfectly measure both the position and momentum of a particle. The very act of measuring one changes the other. This is called the Heisenberg uncertainty principle. When we search for dark matter, we need to measure an electromagnetic signal extremely well, but Heisenberg tells us that we can’t measure the strength and timing of this signal without introducing uncertainty.

    Quantum squeezing allows us to evade limits on measurement set by Heisenberg by putting all the uncertainty into one thing (which we don’t care about), and then measuring the other with much greater precision. So, for instance, if we squeeze all of the quantum uncertainty in an electromagnetic signal into its timing, we can measure its strength much better than quantum mechanics would ordinarily allow. This lets us search for an electromagnetic signal from dark matter much more quickly and sensitively than is otherwise possible.
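    The trade-off Irwin describes can be written down directly for a single mode. This is a generic textbook sketch in units where ħ = 1, not SLAC's actual analysis: squeezing by a parameter r shrinks the noise in one quadrature by e⁻ʳ and inflates the conjugate one by eʳ, leaving the Heisenberg product untouched:

```python
import math

# Vacuum noise in natural units (hbar = 1): both quadratures share the
# minimum uncertainty allowed by Heisenberg, sigma_x * sigma_p = 1/2.
HBAR = 1.0
sigma_x0 = sigma_p0 = math.sqrt(HBAR / 2)

def squeeze(r):
    """Return (sigma_x, sigma_p) after squeezing with parameter r.

    Squeezing shrinks the noise in one quadrature by e^-r and inflates the
    conjugate quadrature by e^r, so the Heisenberg product is unchanged.
    """
    return sigma_x0 * math.exp(-r), sigma_p0 * math.exp(r)

for r in (0.0, 1.0, 2.0):
    sx, sp = squeeze(r)
    print(f"r={r}: sigma_x={sx:.3f}, sigma_p={sp:.3f}, product={sx * sp:.3f}")
```

    In a dark matter search of the kind Irwin describes, the quiet quadrature would carry the quantity of interest (say, signal strength), while the inflated uncertainty is pushed into the quantity one does not need to know (timing).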

    Kent Irwin (at left with Dale Li) leads efforts at SLAC and Stanford to build quantum sensors for exquisitely sensitive detectors. (Andy Freeberg/SLAC National Accelerator Laboratory)

    What types of sensors are you working on?

    Irwin: My team is exploring quantum techniques to develop sensors that could break new ground in the search for dark matter.

    We’ve known since the 1930s that the universe contains much more matter than the ordinary type that we can see with our eyes and telescopes – the matter made up of atoms. Whatever dark matter is, it’s a new type of particle that we don’t understand yet. Most of today’s dark matter detectors search for relatively heavy particles, called weakly interacting massive particles, or WIMPs.

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LBNL LZ project at SURF, Lead, SD, USA

    But what if dark matter particles were so light that they wouldn’t leave a trace in those detectors? We want to develop sensors that would be able to “see” much lighter dark matter particles.

    There would be so many of these very light dark matter particles that they would behave much more like waves than individual particles. So instead of looking for collisions of individual dark matter particles within a detector, which is how WIMP detectors work, we want to look for dark matter waves, which would be detected like a very weak AM radio signal.

    In fact, we even call one of our projects “Dark Matter Radio.” It works like the world’s most sensitive AM radio. But it’s also placed in the world’s most perfect radio shield, made up of a material called a superconductor, which keeps all normal radio waves out. However, unlike real AM radio signals, dark matter waves would be able to go right through the shield and produce a signal. So we are looking for a very weak AM radio station made by dark matter at an unknown frequency.

    Quantum sensors can make this radio much more sensitive, for instance by using quantum tricks such as squeezing and entanglement. So the Dark Matter Radio will not only be the world’s most sensitive AM radio; it will also be better than the Heisenberg uncertainty principle would normally allow.
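    The "radio station at an unknown frequency" idea can be illustrated with a short numerical sketch (a toy model, not the Dark Matter Radio's actual analysis pipeline): bury a faint sinusoid in noise and recover its frequency from the power spectrum of a long recording.

```python
import numpy as np

# Toy sketch: hide a faint sinusoidal "dark matter signal" at an unknown
# frequency in noise, then recover it by Fourier-transforming a long time
# series. The longer you listen, the more the narrowband signal stands out
# above the noise floor.
rng = np.random.default_rng(0)

fs = 1000.0                # sample rate, Hz (arbitrary toy units)
n = 2 ** 16                # number of samples
t = np.arange(n) / fs
f_signal = 137.0           # the "unknown" frequency to be recovered
amplitude = 0.05           # far below the noise level of 1.0

data = amplitude * np.sin(2 * np.pi * f_signal * t) + rng.normal(0.0, 1.0, n)

# Power spectrum: the coherent signal piles up in a single frequency bin,
# while the noise spreads over all ~n/2 bins.
spectrum = np.abs(np.fft.rfft(data)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
f_found = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
```

    Even though the signal is twenty times weaker than the noise sample-by-sample, its power concentrates in one frequency bin while the noise spreads across tens of thousands of them, so the peak sticks out cleanly.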

    What are the challenges of QIS?

    Schleier-Smith: There is a lot we need to learn about controlling quantum correlations before we can make broad use of them in future applications. For example, the sensitivity of entangled quantum states to perturbations is great for sensor applications. However, for quantum computing it’s a major challenge because perturbations of information encoded in qubits will introduce errors, and nobody knows for sure how to correct for them.

    To make progress in that area, my team is studying a question that is very fundamental to our ability to control quantum correlations: How does information actually spread in quantum systems?

    The model system we’re using for these studies consists of atoms that are laser-cooled and optically trapped. We use light to controllably turn on interactions between the atoms, as a means of generating entanglement. By measuring the speed with which quantum information can spread in the system, we hope to understand how to design the structure of the interactions to generate entanglement most efficiently. We view the system of cold atoms as a quantum simulator that allows us to study principles that are also applicable to other physical systems.
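    As a minimal illustration of the kind of correlations involved (generic textbook qubits, not the cold-atom system itself), one can generate a Bell state and verify that a single qubit then carries one full bit of entanglement entropy:

```python
import numpy as np

# Build a maximally entangled two-qubit Bell state with a Hadamard followed
# by a CNOT, then quantify the entanglement via the entropy of one qubit's
# reduced density matrix.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

psi0 = np.array([1.0, 0.0, 0.0, 0.0])          # |00>
bell = CNOT @ np.kron(H, I2) @ psi0            # (|00> + |11>) / sqrt(2)

# Reduced density matrix of qubit A: trace out qubit B.
rho = np.outer(bell, bell)
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

eigs = np.linalg.eigvalsh(rho_A)
entropy = -sum(p * np.log2(p) for p in eigs if p > 1e-12)
```

    The entropy evaluates to exactly 1 bit: each qubit on its own looks like a fair coin, even though the pair's joint state is perfectly definite.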

    In this area of quantum simulation, one major thrust has been to advance understanding of solid-state systems, by trapping atoms in arrays that mimic the structure of a crystalline material. In my lab, we are additionally working to extend the ideas and tools of quantum simulation in new directions. One prospect that I am particularly excited about is to use cold atoms to simulate what happens to quantum information in black holes.

    Monika Schleier-Smith (at center with graduate students Emily Davis and Eric Cooper) uses laser-cooled atoms in her lab at Stanford to study the transfer of quantum information. (Dawn Harmer/SLAC National Accelerator Laboratory)

    What do cold atoms have to do with black holes?

    Schleier-Smith: The idea that there might be any connection between quantum systems we can build in the lab and black holes has its origins in a long-standing theoretical problem: When particles fall into a black hole, what happens to the information they contained? There were compelling arguments that the information should be lost, but that would contradict the laws of quantum mechanics.

    More recently, theoretical physicists – notably my Stanford colleague Patrick Hayden – found a resolution to this problem: We should think of the black hole as a highly chaotic system that “scrambles” the information as fast as physically possible. It’s almost like shredding documents, but quantum information scrambling is much richer in that the result is a highly entangled quantum state.

    Although precisely recreating such a process in the lab will be very challenging, we hope to look at one of its key features already in the near term. In order for information scrambling to happen, information needs to be transferred through space exponentially fast. This, in turn, requires quantum interactions to occur over long distances, which is quite counterintuitive because interactions in nature typically become weaker with distance. With our quantum simulator, we are able to study interactions between distant atoms by sending information back and forth with photons, particles of light.

    What do you hope will happen in QIS over the next few years?

    Irwin: We need to prove that, in real applications, quantum technology is superior to the technology that we already have. We are in the early stages of this new quantum revolution, but this is already starting to happen. The things we’re learning now will help us make a leap in developing future technology, such as universal quantum computers and next-generation sensors. The work we do on quantum sensors will enable new science, not only in dark matter research. At SLAC, I also see potential for quantum-enhanced sensors in X-ray applications, which could provide us with new tools to study advanced materials and understand how biomolecules work.

    Schleier-Smith: QIS offers plenty of room for breakthroughs. There are many open questions we still need to answer about how to engineer the properties of quantum systems in order to harness them for technology, so it’s imperative that we continue to broadly advance our understanding of complex quantum systems. Personally, I hope that we’ll be able to better connect experimental observations with the latest theoretical advances. Bringing all this knowledge together will help us build the technologies of the future.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition


    SLAC/LCLS II projected view

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 11:26 am on May 5, 2019 Permalink | Reply
    Tags: 'Where Does A Proton’s Mass Come From?', 99.8% of the proton’s mass comes from gluons, Antiquarks, Asymptotic freedom: the particles that mediate this force are known as gluons, Particle Physics, The production of Higgs bosons is dominated by gluon-gluon collisions at the LHC, The strong interaction is the most powerful interaction in the entire known Universe

    From Ethan Siegel: “Ask Ethan: ‘Where Does A Proton’s Mass Come From?'” 

    From Ethan Siegel
    May 4, 2019

    The three valence quarks of a proton contribute to its spin, but so do the gluons, sea quarks and antiquarks, and orbital angular momentum as well. The electrostatic repulsion and the attractive strong nuclear force, in tandem, are what give the proton its size, and the properties of quark mixing are required to explain the suite of free and composite particles in our Universe. (APS/ALAN STONEBRAKER)

    The whole should equal the sum of its parts, but doesn’t. Here’s why.

    The whole is equal to the sum of its constituent parts. That’s how everything works, from galaxies to planets to cities to molecules to atoms. If you take all the components of any system and look at them individually, you can clearly see how they all fit together to add up to the entire system, with nothing missing and nothing left over. The total amount you have is equal to the amounts of all the different parts of it added together.

    So why isn’t that the case for the proton? It’s made of three quarks, but if you add up the quark masses, they not only don’t equal the proton’s mass, they don’t come close. This is the puzzle that Barry Duffey wants us to address, asking:

    “What’s happening inside protons? Why does [its] mass so greatly exceed the combined masses of its constituent quarks and gluons?”

    In order to find out, we have to take a deep look inside.

    The composition of the human body, by atomic number and by mass. The whole of our bodies is equal to the sum of its parts, until you get down to an extremely fundamental level. At that point, we can see that we’re actually more than the sum of our constituent components. (ED UTHMAN, M.D., VIA WEB2.AIRMAIL.NET/UTHMAN (L); WIKIMEDIA COMMONS USER ZHAOCAROL (R))

    There’s a hint that comes just from looking at your own body. If you were to divide yourself up into smaller and smaller bits, you’d find — in terms of mass — the whole was equal to the sum of its parts. Your body’s bones, fat, muscles and organs sum up to an entire human being. Breaking those down further, into cells, still allows you to add them up and recover the same mass you have today.

    Cells can be divided into organelles, organelles are composed of individual molecules, molecules are made of atoms; at each stage, the mass of the whole is no different than that of its parts. But when you break atoms into protons, neutrons and electrons, something interesting happens. At that level, there’s a tiny but noticeable discrepancy: the combined mass of free protons, neutrons and electrons exceeds the mass of an entire human by right around 1%. The difference is real.

    From macroscopic scales down to subatomic ones, the sizes of the fundamental particles play only a small role in determining the sizes of composite structures. Whether the building blocks are truly fundamental and/or point-like particles is still not known. (MAGDALENA KOWALSKA / CERN / ISOLDE TEAM)


    Like all known organisms, human beings are carbon-based life forms. Carbon atoms are made up of six protons and six neutrons, but if you look at the mass of a carbon atom, it’s approximately 0.8% lighter than the sum of the individual component particles that make it up. The culprit here is nuclear binding energy; when you have atomic nuclei bound together, their total mass is smaller than the mass of the protons and neutrons that comprise them.

    The way carbon is formed is through the nuclear fusion of hydrogen into helium and then helium into carbon; the energy released is what powers most types of stars in both their normal and red giant phases. That “lost mass” is where the energy powering stars comes from, thanks to Einstein’s E = mc². As stars burn through their fuel, they produce more tightly-bound nuclei, releasing the energy difference as radiation.
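    The 0.8% figure is easy to check with rounded particle masses (a back-of-envelope estimate, not a precision calculation):

```python
# Back-of-envelope check of the ~0.8% mass deficit for carbon-12, using
# standard particle masses in MeV/c^2 (rounded values are plenty here).
m_proton = 938.272
m_neutron = 939.565
m_electron = 0.511
m_carbon12 = 12 * 931.494   # 12 atomic mass units, by definition of the amu

parts = 6 * (m_proton + m_neutron + m_electron)   # 6 of each constituent
deficit_percent = 100 * (parts - m_carbon12) / parts
```

    The roughly 92 MeV deficit per carbon-12 atom, about 0.8% of the total, is precisely the binding energy released when the nucleus was assembled.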

    In between the 2nd and 3rd brightest stars of the constellation Lyra, the blue giant stars Sheliak and Sulafat, the Ring Nebula shines prominently in the night skies. Throughout all phases of a star’s life, including the giant phase, nuclear fusion powers them, with the nuclei becoming more tightly bound and the energy emitted as radiation coming from the transformation of mass into energy via E = mc². (NASA, ESA, DIGITIZED SKY SURVEY 2)

    NASA/ESA Hubble Telescope

    ESO Online Digitized Sky Survey Telescopes

    Caltech Palomar Samuel Oschin 48 inch Telescope, located in San Diego County, California, United States, altitude 1,712 m (5,617 ft)

    Australian Astronomical Observatory, Siding Spring Observatory, near Coonabarabran, New South Wales, Australia, 1.2m UK Schmidt Telescope, Altitude 1,165 m (3,822 ft)

    From http://archive.eso.org/dss/dss

    This is how most types of binding energy work: the reason it’s harder to pull apart multiple things that are bound together is because they released energy when they were joined, and you have to put energy in to free them again. That’s why it’s such a puzzling fact that when you take a look at the particles that make up the proton — the up, up, and down quarks at the heart of them — their combined masses are only 0.2% of the mass of the proton as a whole. But the puzzle has a solution that’s rooted in the nature of the strong force itself.

    The way quarks bind into protons is fundamentally different from all the other forces and interactions we know of. Instead of the force getting stronger when objects get closer, like the gravitational, electric, or magnetic forces, the attractive force goes down to zero when quarks get arbitrarily close. And instead of the force getting weaker when objects get farther away, the force pulling quarks back together gets stronger the farther away they get.

    The internal structure of a proton, with quarks, gluons, and quark spin shown. The nuclear force acts like a spring, with negligible force when unstretched but large, attractive forces when stretched to large distances. (BROOKHAVEN NATIONAL LABORATORY)

    This property of the strong nuclear force is known as asymptotic freedom, and the particles that mediate this force are known as gluons. Somehow, the energy binding the proton together, responsible for the other 99.8% of the proton’s mass, comes from these gluons. The whole of matter, somehow, weighs much, much more than the sum of its parts.

    This might sound like an impossibility at first, as the gluons themselves are massless particles. But you can think of the forces they give rise to as springs: asymptoting to zero when the springs are unstretched, but becoming very large the greater the amount of stretching. In fact, the amount of energy between two quarks whose distance gets too large can become so great that it’s as though additional quark/antiquark pairs exist inside the proton: sea quarks.
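    The spring analogy is often made quantitative with a Cornell-type potential between quarks (a standard phenomenological form, not one quoted in the article):

```latex
V(r) \;\simeq\; -\frac{4}{3}\,\frac{\alpha_s}{r} \;+\; \sigma r ,
\qquad \sigma \sim 1\ \mathrm{GeV/fm}.
```

    The Coulomb-like term dominates at short distances, while the linear term σr grows without bound as quarks separate, so pulling them apart eventually stores enough energy to create new quark/antiquark pairs instead of freeing a quark.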

    When two protons collide, it isn’t just the quarks making them up that can collide, but the sea quarks, gluons, and beyond that, field interactions. All can provide insights into the spin of the individual components, and allow us to create potentially new particles if high enough energies and luminosities are reached. (CERN / CMS COLLABORATION)

    Those of you familiar with quantum field theory might have the urge to dismiss the gluons and the sea quarks as just being virtual particles: calculational tools used to arrive at the right result. But that’s not true at all, and we’ve demonstrated that with high-energy collisions between either two protons or a proton and another particle, like an electron or photon.

    The collisions performed at the Large Hadron Collider at CERN are perhaps the greatest test of all for the internal structure of the proton. When two protons collide at these ultra-high energies, most of them simply pass by one another, failing to interact. But when two internal, point-like particles collide, we can reconstruct exactly what it was that smashed together by looking at the debris that comes out.

    A Higgs boson event as seen in the Compact Muon Solenoid detector at the Large Hadron Collider. This spectacular collision is 15 orders of magnitude below the Planck energy, but it’s the precision measurements of the detector that allow us to reconstruct what happened back at (and near) the collision point. Theoretically, the Higgs gives mass to the fundamental particles; however, the proton’s mass is not due to the mass of the quarks and gluons that compose it. (CERN / CMS COLLABORATION)

    Under 10% of the collisions occur between two quarks; the overwhelming majority are gluon-gluon collisions, with quark-gluon collisions making up the remainder. Moreover, not every quark-quark collision in protons occurs between either up or down quarks; sometimes a heavier quark is involved.

    Although it might make us uncomfortable, these experiments teach us an important lesson: the particles that we use to model the internal structure of protons are real. In fact, the discovery of the Higgs boson itself was only possible because of this, as the production of Higgs bosons is dominated by gluon-gluon collisions at the LHC. If all we had were the three valence quarks to rely on, we would have seen different rates of production of the Higgs than we did.

    Before the mass of the Higgs boson was known, we could still calculate the expected production rates of Higgs bosons from proton-proton collisions at the LHC. The top channel is clearly production by gluon-gluon collisions. I (E. Siegel) have added the yellow highlighted region to indicate where the Higgs boson was discovered. (CMS COLLABORATION (DORIGO, TOMMASO FOR THE COLLABORATION) ARXIV:0910.3489)

    As always, though, there’s still plenty more to learn. We presently have a solid model of the average gluon density inside a proton, but if we want to know where the gluons are actually more likely to be located, that requires more experimental data, as well as better models to compare the data against. Recent advances by theorists Björn Schenke and Heikki Mäntysaari may be able to provide those much needed models. As Mäntysaari detailed:

    “It is very accurately known how large the average gluon density is inside a proton. What is not known is exactly where the gluons are located inside the proton. We model the gluons as located around the three [valence] quarks. Then we control the amount of fluctuations represented in the model by setting how large the gluon clouds are, and how far apart they are from each other. […] The more fluctuations we have, the more likely this process [producing a J/ψ meson] is to happen.”

    A schematic of the world’s first electron-ion collider (EIC). Adding an electron ring (red) to the Relativistic Heavy Ion Collider (RHIC) at Brookhaven would create the eRHIC: a proposed deep inelastic scattering experiment that could improve our knowledge of the internal structure of the proton significantly. (BROOKHAVEN NATIONAL LABORATORY-CAD ERHIC GROUP)

    The combination of this new theoretical model and the ever-improving LHC data will better enable scientists to understand the internal, fundamental structure of protons, neutrons and nuclei in general, and hence to understand where the mass of the known objects in the Universe comes from. From an experimental point of view, the greatest boon would be a next-generation electron-ion collider, which would enable us to perform deep inelastic scattering experiments to reveal the internal makeup of these particles as never before.

    But there’s another theoretical approach that can take us even farther into the realm of understanding where the proton’s mass comes from: Lattice QCD.

    A better understanding of the internal structure of a proton, including how the “sea” quarks and gluons are distributed, has been achieved through both experimental improvements and new theoretical developments in tandem. (BROOKHAVEN NATIONAL LABORATORY)

    The difficult part with the quantum field theory that describes the strong force — quantum chromodynamics (QCD) — is that the standard approach we take to doing calculations is no good. Typically, we’d look at the effects of particle couplings: the charged quarks exchange a gluon and that mediates the force. They could exchange gluons in a way that creates a particle-antiparticle pair or an additional gluon, and that should be a correction to a simple one-gluon exchange. They could create additional pairs or gluons, which would be higher-order corrections.

    We call this approach taking a perturbative expansion in quantum field theory, with the idea that calculating higher and higher-order contributions will give us a more accurate result.
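    Schematically (standard quantum field theory notation, not from the article), an amplitude is expanded in powers of the coupling:

```latex
\mathcal{A} \;=\; \mathcal{A}_0 \;+\; \alpha\,\mathcal{A}_1 \;+\; \alpha^2\,\mathcal{A}_2 \;+\; \cdots
```

    For QED, α ≈ 1/137, so each successive term is a small correction; for QCD at proton-scale energies the strong coupling is of order one, and the series fails to converge.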

    Today, Feynman diagrams are used in calculating every fundamental interaction spanning the strong, weak, and electromagnetic forces, including in high-energy and low-temperature/condensed conditions. But this approach, which relies on a perturbative expansion, is only of limited utility for the strong interaction, since the series diverges, rather than converges, as you add more and more loops for QCD. (DE CARVALHO, VANUILDO S. ET AL. NUCL. PHYS. B875 (2013) 738–756)

    Richard Feynman © Open University

    But this approach, which works so well for quantum electrodynamics (QED), fails spectacularly for QCD. The strong force works differently, and so these corrections get very large very quickly. Adding more terms, instead of converging towards the correct answer, diverges and takes you away from it. Fortunately, there is another way to approach the problem: non-perturbatively, using a technique called Lattice QCD.

    By treating space and time as a grid (or lattice of points) rather than a continuum, where the lattice is arbitrarily large and the spacing is arbitrarily small, you overcome this problem in a clever way. Whereas in standard, perturbative QCD, the continuous nature of space means that you lose the ability to calculate interaction strengths at small distances, the lattice approach means there’s a cutoff at the size of the lattice spacing. Quarks exist at the intersections of grid lines; gluons exist along the links connecting grid points.

    As your computing power increases, you can make the lattice spacing smaller, which improves your calculational accuracy. Over the past three decades, this technique has led to an explosion of solid predictions, including the masses of light nuclei and the reaction rates of fusion under specific temperature and energy conditions. The mass of the proton, from first principles, can now be theoretically predicted to within 2%.
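    The effect of shrinking the lattice spacing can be seen in a much simpler setting (a 1-D quantum harmonic oscillator rather than QCD; purely illustrative): discretize space on a grid, diagonalize the Hamiltonian, and watch the ground-state energy converge to the exact value of 1/2 in natural units.

```python
import numpy as np

# Discretize a 1-D quantum harmonic oscillator on a lattice and compute its
# ground-state energy. The exact answer (hbar = m = omega = 1) is 0.5; the
# discretized answer approaches it as the lattice spacing shrinks.
def ground_energy(spacing, half_width=8.0):
    x = np.arange(-half_width, half_width + spacing / 2, spacing)
    n = len(x)
    # Kinetic term from the second-difference operator, potential on the diagonal.
    kinetic = (np.diag(np.full(n, 2.0))
               - np.diag(np.ones(n - 1), 1)
               - np.diag(np.ones(n - 1), -1)) / (2 * spacing ** 2)
    hamiltonian = kinetic + np.diag(0.5 * x ** 2)
    return np.linalg.eigvalsh(hamiltonian)[0]

coarse = ground_energy(0.4)   # coarse lattice: noticeable discretization error
fine = ground_energy(0.1)     # finer lattice: much closer to the exact 0.5
```

    The discretization error falls roughly as the square of the spacing, mirroring how finer lattices buy accuracy in Lattice QCD at the cost of computing power.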

    As computational power and Lattice QCD techniques have improved over time, so has the accuracy to which various quantities about the proton, such as its component spin contributions, can be computed. By reducing the lattice spacing size, which can be done simply by raising the computational power employed, we can better predict the mass of not only the proton, but of all the baryons and mesons. (LABORATOIRE DE PHYSIQUE DE CLERMONT / ETM COLLABORATION)

    It’s true that the individual quarks, whose masses are determined by their coupling to the Higgs boson, cannot even account for 1% of the mass of the proton. Rather, it’s the strong force, described by the interactions between quarks and the gluons that mediate them, that are responsible for practically all of it.

    The strong nuclear force is the most powerful interaction in the entire known Universe. When you go inside a particle like the proton, it’s so powerful that it — not the mass of the proton’s constituent particles — is primarily responsible for the total energy (and therefore mass) of the normal matter in our Universe. Quarks may be point-like, but the proton is huge by comparison: 8.4 × 10^-16 m in radius. Confining its component particles, which the binding energy of the strong force does, is what’s responsible for 99.8% of the proton’s mass.

    See the full article here.



    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 2:08 pm on May 3, 2019 Permalink | Reply
    Tags: "A quantum leap in particle simulation", Particle Physics, Particles called fermions which are the building blocks of matter; and particles called bosons which are field particles and tug on the matter particles.

    From Fermi National Accelerator Lab: “A quantum leap in particle simulation” 

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    May 2, 2019
    Leah Hesla

    A group of scientists at the Department of Energy’s Fermilab has figured out how to use quantum computing to simulate the fundamental interactions that hold together our universe.

    In a paper published in Physical Review Letters, Fermilab researchers fill a conspicuous gap in modeling the subatomic world using quantum computers, addressing a family of particles that, until recently, has been relatively neglected in quantum simulations.

    The fundamental particles that make up our universe can be divided into two groups: particles called fermions, which are the building blocks of matter, and particles called bosons, which are field particles and tug on the matter particles.

    In recent years, scientists have successfully developed quantum algorithms to compute systems made of fermions. But they’ve had a much tougher time doing the same for boson systems.

    For the first time, Fermilab scientist Alexandru Macridin has found a way to model systems containing both fermions and bosons on general-purpose quantum computers, opening a door to realistic simulations of the subatomic realm. His work is part of the Fermilab quantum science program.

    “The representation of bosons in quantum computing was never addressed very well in the literature before,” Macridin said. “Our method worked, and better than we expected.”

    The relative obscurity of bosons in quantum-computation literature has partly to do with bosons themselves and partly with the way quantum-computing research has evolved.

    Over the last decade, the development of quantum algorithms focused strongly on simulating purely fermionic systems, such as molecules in quantum chemistry.

    “But in high-energy physics, we also have bosons, and high-energy physicists are particularly interested in the interactions between bosons and fermions,” said Fermilab scientist Jim Amundson, a co-author on the Physical Review Letters paper. “So we took existing fermion models and extended them to include bosons, and we did that in a novel way.”

    The biggest barrier to modeling bosons related to the properties of a qubit — a quantum bit.

    Mapping the states

    A qubit has two states: 1 and 0.

    Similarly, a fermion state has two distinct modes: occupied and unoccupied.

    The qubit’s two-state property means it can represent a fermion state pretty straightforwardly: One qubit state is assigned to “occupied,” and the other, “unoccupied.”

    (You might remember something about the occupation of states from high school chemistry: An atom’s electron orbitals can each be occupied by a maximum of one electron. So they’re either occupied or not. Those orbitals, in turn, combine to form the electron shells that surround the nucleus.)

    The one-to-one mapping between qubit state and fermion state makes it easy to determine the number of qubits you’ll need to simulate a fermionic process. If you’re dealing with a system of 40 fermion states, like a molecule with 40 orbitals, you’ll need 40 qubits to represent it.
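    The bookkeeping is simple to make concrete (an illustration of the occupation-number encoding only, not a full fermion-to-qubit transformation such as Jordan-Wigner):

```python
from itertools import product

# With one qubit per fermion state, every basis bitstring of n qubits labels
# exactly one configuration of n orbitals: '1' for occupied, '0' for empty.
n_orbitals = 3
configurations = ["".join(bits) for bits in product("01", repeat=n_orbitals)]
# '101' means: orbital 0 occupied, orbital 1 empty, orbital 2 occupied.
n_configs = len(configurations)   # 2**3 = 8 basis states for 3 qubits
```

    Three orbitals give eight configurations, and in general 40 orbitals give 2^40 configurations, yet only 40 qubits are needed to label them all, which is exactly the economy that makes quantum simulation attractive.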

    In a quantum simulation, a researcher sets up qubits to represent the initial state of, say, a molecular process. Then the qubits are manipulated according to an algorithm that reflects how that process evolves.

    More complex processes need a greater number of qubits. As the number grows, so does the computing power needed to carry it out. But even with only a handful of qubits at one’s disposal, researchers are able to tackle some interesting problems related to fermion processes.

    “There’s a well-developed theory for how to map fermions onto qubits,” said Fermilab theorist Roni Harnik, a co-author of the paper.

    Bosons, nature’s force particles, are a different story. The business of mapping them gets complicated quickly. That’s partly because, unlike the restricted, two-choice fermion state, boson states are highly accommodating.

    A system of bosons can be modeled as a system of harmonic oscillators, a phenomenon that occurs everywhere in nature. The motion of a spring bobbing up and down and the vibration of a plucked string are both examples of harmonic oscillators. In quantum mechanics, harmonic oscillator motion is described by wave functions, several of which are shown here. A Fermilab team recently found a way to represent wave functions for bosonic systems on a quantum computer. Image: Allen McC

    Accommodating bosons

    Since only one fermion can occupy a particular fermion quantum state, that state is either occupied or not — 1 or 0.

    In contrast, a boson state can be variably occupied, accommodating one boson, a zillion bosons, or anything in between. That makes it tough to map bosons to qubits. With only two possible states, a single qubit cannot, by itself, represent a boson state.

    With bosons, the question is not whether the qubit represents an occupied or unoccupied state, but rather, how many qubits are needed to represent the boson state.

    “Scientists have come up with ways to encode bosons into qubits that would require a large number of qubits to give you accurate results,” Amundson said.

    A prohibitively large number, in many cases. By some methods, a useful simulation would need millions of qubits to faithfully model a boson process, like the transformation of a particle that ultimately produces a particle of light, which is a type of boson.

    And that’s just in representing the process’s initial setup, let alone letting it evolve.

    Macridin’s solution was to recast the boson system as something else, something very familiar to physicists — a harmonic oscillator.

    Harmonic oscillators are everywhere in nature, from the subatomic to the astronomical scales. The vibration of molecules, the pulse of current through a circuit, the up-and-down bob of a loaded spring, the motion of a planet around a star — all are harmonic oscillators. Even bosonic particles, like those Macridin looked to simulate, can be treated like tiny harmonic oscillators. Thanks to their ubiquity, harmonic oscillators are well-understood and can be modeled precisely.

    With a background in condensed-matter physics — the study of nature a couple of notches up from its particle foundation — Macridin was familiar with modeling harmonic oscillators in crystals. He found a way to represent a harmonic oscillator on a quantum computer, mapping such systems to qubits with exceptional precision and enabling the precise simulation of bosons on quantum computers.

    And at a low qubit cost: Representing a discrete harmonic oscillator on a quantum computer requires only a few qubits, even if the oscillator represents a large number of bosons.

    “Our method requires a relatively small number of qubits for boson states — exponentially smaller than what was proposed by other groups before,” Macridin said. “For other methods to do the same thing, they would probably need an orders-of-magnitude larger number of qubits.”

    Macridin estimates that six qubits per boson state is enough to explore interesting physics problems.
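    The arithmetic behind an estimate like that is straightforward: n qubits can index 2^n oscillator levels, so truncating a boson mode to 2^n levels costs only n qubits, whereas an occupation-style "unary" encoding spends roughly one qubit per level. A minimal sketch (the cutoff values below are illustrative, not taken from the paper):

```python
from math import ceil, log2

def qubits_binary(levels: int) -> int:
    """Qubits needed to index `levels` oscillator levels in a binary encoding."""
    return ceil(log2(levels))

def qubits_unary(levels: int) -> int:
    """Occupation-style encoding: roughly one qubit per level."""
    return levels

for cutoff in (16, 64, 1024):
    print(f"{cutoff} levels: {qubits_binary(cutoff)} qubits (binary) "
          f"vs {qubits_unary(cutoff)} (unary)")
# Six qubits already index a 64-level truncation of a single boson mode.
```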

    Simulation success

    As a trial of Macridin’s mapping method, the Fermilab group first tapped into quantum field theory, a branch of physics that focuses on modeling subatomic structures. They successfully modeled the interaction of electrons in a crystal with the vibrations of the atoms that form the crystal. The ‘unit’ of that vibration is a boson called a phonon.

    Using a quantum simulator at nearby Argonne National Laboratory, they modeled the electron-phonon system and — voila! — showed that they could calculate the system’s properties with high accuracy using only about 20 qubits. The simulator is a classical computer that mimics the behavior of a quantum computer of up to 35 qubits. Argonne researchers leverage the simulator and their expertise in scalable algorithms to explore the potential impact of quantum computing in key areas such as quantum chemistry and quantum materials.
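    The roughly 35-qubit ceiling on such simulators comes from statevector memory: an n-qubit state holds 2^n complex amplitudes, so memory doubles with every added qubit. A back-of-the-envelope check (assuming 16 bytes per double-precision complex amplitude; the simulator's actual internals may differ):

```python
def statevector_bytes(n_qubits: int, bytes_per_amp: int = 16) -> int:
    """Memory for a full statevector: 2**n complex amplitudes."""
    return (2 ** n_qubits) * bytes_per_amp

print(statevector_bytes(20) // 2**20, "MiB")  # 16 MiB: a laptop handles this
print(statevector_bytes(35) // 2**30, "GiB")  # 512 GiB: large-cluster territory
```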

    “We showed that the technique worked,” Harnik said.

    They further showed that, by representing bosons as harmonic oscillators, one could efficiently and accurately describe systems involving fermion-boson interactions.

    “It turned out to be a good fit,” Amundson said.

    “I’d started with one idea, and it didn’t work, so then I changed the representation of the bosons,” Macridin said. “And it worked well. It makes the simulation of fermion-boson systems feasible for near-term quantum computers.”

    Universal application

    The Fermilab group’s simulation is not the first time scientists have modeled bosons in quantum computers. But in the other cases, scientists used hardware specifically designed to simulate bosons, so the simulated evolution of a boson system would happen naturally, so to speak, on those special computers.

    The Fermilab group’s approach is the first that can be applied efficiently in a general-purpose, digital quantum computer, also called a universal quantum computer.

    The next step for Macridin, Amundson and other particle physicists at Fermilab is to use their method on problems in high-energy physics.

    “In nature, fermion-boson interactions are fundamental. They appear everywhere,” Macridin said. “Now we can extend our algorithm to various theories in our field.”

    Their achievement extends beyond particle physics. Amundson says their group has heard from materials scientists who think the work could be useful in solving real-world problems in the foreseeable future.

    “We introduced bosons in a new way that requires fewer resources,” Amundson said. “It really opens up a whole new class of quantum simulations.”

    This work was funded by the DOE Office of Science. Learn more about this result in Physical Review Letters [above]. Visit the Fermilab quantum science website [above] to learn about other quantum initiatives.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

    FNAL MINERvA front face Photo Reidar Hahn


    FNAL Muon g-2 studio

    FNAL Short-Baseline Near Detector under construction

    FNAL Mu2e solenoid

    Dark Energy Camera [DECam], built at FNAL

    FNAL DUNE Argon tank at SURF


    FNAL Don Lincoln


    FNAL Cryomodule Testing Facility

    FNAL MINOS Far Detector in the Soudan Mine in northern Minnesota

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    FNAL/NOvA experiment map

    FNAL NOvA Near Detector


    FNAL Holometer

  • richardmitnick 3:02 pm on May 2, 2019 Permalink | Reply
    Tags: , , , , , , Particle Physics, ,   

    From University of Chicago: “Scientists invent way to trap mysterious ‘dark world’ particle at Large Hadron Collider” 

    U Chicago bloc

    From University of Chicago

    Apr 17, 2019 [Just found this via social media]
    Louise Lerner

    Courtesy of Zarija Lukic/Berkeley Lab

    A new paper outlines a method to directly detect particles from the ‘dark world’ using the Large Hadron Collider. Until now we’ve only been able to make indirect measurements and simulations, such as the visualization of dark matter above.

    CERN LHC Maximilien Brice and Julien Marius Ordan

    Higgs boson could be tied to a dark particle, serving as a ‘portal to the dark world’.

    Now that they’ve identified the Higgs boson, scientists at the Large Hadron Collider have set their sights on an even more elusive target.

    All around us is dark matter and dark energy—the invisible stuff that binds the galaxy together, but which no one has been able to directly detect. “We know for sure there’s a dark world, and there’s more energy in it than there is in ours,” said LianTao Wang, a University of Chicago professor of physics who studies how to find signals in large particle accelerators like the LHC.

    Wang, along with scientists from the University and UChicago-affiliated Fermilab, think they may be able to lead us to its tracks; in a paper published April 3 in Physical Review Letters, they laid out an innovative method for stalking dark matter in the LHC by exploiting a potential particle’s slightly slower speed.

    While the dark world makes up more than 95% of the universe, scientists only know it exists from its effects—like a poltergeist you can only see when it pushes something off a shelf. For example, we know there’s dark matter because we can see gravity acting on it—it helps keep our galaxies from flying apart.

    Theorists think there’s one particular kind of dark particle that only occasionally interacts with normal matter. It would be heavier and longer-lived than other known particles, with a lifetime up to one tenth of a second. A few times in a decade, researchers believe, this particle can get caught up in the collisions of protons that the LHC is constantly creating and measuring.

    “One particularly interesting possibility is that these long-lived dark particles are coupled to the Higgs boson in some fashion—that the Higgs is actually a portal to the dark world,” said Wang, referring to the last holdout particle in physicists’ grand theory of how the universe works, discovered at the LHC in 2012.

    Standard Model of Particle Physics

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    “It’s possible that the Higgs could actually decay into these long-lived particles.”

    The only problem is sorting out these events from the rest; there are more than a billion collisions per second in the 27-kilometer LHC, and each one of these sends subatomic chaff spraying in all directions.

    Wang, UChicago postdoctoral fellow Jia Liu and Fermilab scientist Zhen Liu (now at the University of Maryland) proposed a new way to search by exploiting one particular aspect of such a dark particle. “If it’s that heavy, it costs energy to produce, so its momentum would not be large—it would move more slowly than the speed of light,” said Liu, the first author on the study.

    That time delay would set it apart from all the rest of the normal particles. Scientists would only need to tweak the system to look for particles that are produced and then decay a bit more slowly than everything else.

    The difference is on the order of a nanosecond—a billionth of a second—or smaller. But the LHC already has detectors sophisticated enough to catch this difference; a recent study using data collected from the last run found that the method should work, and the detectors will become even more sensitive as part of the upgrade that is currently underway.
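    The size of that delay follows from relativistic kinematics: a particle of mass m and momentum p moves at a fraction β = p/E of the speed of light, with E = √(p² + m²), so over a flight path L it lags a light-speed particle by Δt = (L/c)(1/β − 1). A rough sketch with invented numbers (not the benchmark points from the paper):

```python
import math

C = 0.299792458  # speed of light, metres per nanosecond

def delay_ns(mass_gev: float, momentum_gev: float, path_m: float) -> float:
    """Lag behind a light-speed particle over path_m, in nanoseconds."""
    energy = math.hypot(momentum_gev, mass_gev)  # E = sqrt(p^2 + m^2)
    beta = momentum_gev / energy                 # v/c
    return (path_m / C) * (1.0 / beta - 1.0)

# A hypothetical heavy particle (50 GeV mass, 50 GeV momentum) crossing ~1 m
print(delay_ns(50.0, 50.0, 1.0))  # roughly 1.4 ns of timing lag
```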

    “We anticipate this method will increase our sensitivity to long-lived dark particles by more than an order of magnitude—while using capabilities we already have at the LHC,” Liu said.

    Experimentalists are already working to build the trap: When the LHC turns back on in 2021, after boosting its luminosity tenfold, all three of the major detectors will be implementing the new system, the scientists said. “We think it has great potential for discovery,” Liu said.


    CERN/CMS Detector

    CERN/ALICE Detector

    “If the particle is there, we just have to find a way to dig it out,” Wang said. “Usually, the key is finding the question to ask.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

  • richardmitnick 1:01 pm on May 2, 2019 Permalink | Reply
    Tags: , An unexpected signature, , , It’s not always about what you discover, Nature might be tough with us- but maybe nature is testing us and making us stronger., , Particle Physics, , Taking a closer look, Why the force of gravity is so much weaker than other known forces like electromagnetism. There is only one right answer. We haven’t found it yet.   

    From Symmetry: “The unseen progress of the LHC” 

    Symmetry Mag
    From Symmetry

    Sarah Charley


    CERN map

    CERN LHC Maximilien Brice and Julien Marius Ordan

    CERN LHC particles

    It’s not always about what you discover.

    About seven years ago, physicist Stephane Willocq at the University of Massachusetts became enthralled with a set of theories that predicted the existence of curled-up extra dimensions hiding within our classical four dimensions of spacetime.

    “The idea of extra spatial dimensions is appealing because it allows us to look at the fundamental problems in particle physics from a different viewpoint,” Willocq says.

    As an experimental physicist, Willocq can do more than ponder. At the Large Hadron Collider at CERN, he put his pet theories to the test.

    Models based on those theories predicted how curled-up extra dimensions would affect the outcome of proton-proton collisions at the LHC. They predicted the collisions would produce more high-energy particles than expected.

    After several searches, Willocq and his colleagues found nothing out of the ordinary. “It was a great idea and disappointing to see it fade away, bit by bit,” he says, “but that’s how scientific progress works—finding the right idea by process of elimination.”

    The LHC research program is famous for discovering and studying the long-sought Higgs boson. But out of the spotlight, scientists have been using the LHC for an equally important scientific endeavor: testing, constraining and eliminating hundreds of theories that propose solutions to outstanding problems in physics, such as why the force of gravity is so much weaker than other known forces like electromagnetism.

    “There is only one right answer,” Willocq says. “We haven’t found it yet.”

    Now that scientists are at the end of the second run of the LHC, they have covered a huge amount of ground, eliminating the simplest versions of numerous theoretical ideas. They’ve covered four times as much phase space as previous searches for heavy new particles and set strict limits on what is physically possible.

    These studies don’t get the same attention as the Higgs boson, but these null results—results that don’t support a certain hypothesis—have moved physics forward as well.

    An unexpected signature

    Having chased down their most obvious leads, physicists are now adapting their methodology and considering new possibilities in their pursuit of new physics.

    Thus far, physicists have often used a straightforward formula to look for new particles. Massive particles produced in particle collisions will almost instantly decay, transforming into more stable particles. If scientists can measure all of those particles, they can reconstruct the mass and properties of the original particle that produced them.
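    The reconstruction step relies on the invariant mass: summing the decay products’ energies and momenta gives back the parent’s mass via m² = (ΣE)² − |Σp|². A toy version of the calculation (the two photon four-vectors below are fabricated purely for illustration):

```python
import math

def invariant_mass(particles):
    """m = sqrt((sum E)^2 - |sum p|^2) for (E, px, py, pz) four-vectors in GeV."""
    E = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

# Two toy photons, back to back in their parent's rest frame
photons = [(62.5, 0.0, 0.0, 62.5),
           (62.5, 0.0, 0.0, -62.5)]
print(invariant_mass(photons))  # 125.0: a Higgs-like mass peak entry
```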

    This worked wonderfully when scientists discovered the top quark in 1995 and the Higgs boson in 2012. But finding the next new thing might take a different tactic.

    “Finding new physics is more challenging than we expected it to be,” says University of Wisconsin physicist Tulika Bose of the CMS experiment. “Challenging situations make people come up with clever ideas.”

    One idea is that maybe scientists have been so focused on instantly decaying particles that they’ve been missing a whole host of particles that can travel up to several meters before falling apart. This would look like a firework exploding randomly in one of the detector subsystems.

    Scientists are rethinking how they reconstruct the data as a way to cast a bigger net and potentially catch particles with signatures like these. “If we only used our standard analysis methods, we would definitely not be sensitive to anything like this,” Bose says. “We’re no longer just reloading previous analyses but exploring innovative ideas.”

    Taking a closer look

    Since looking for excess particles coming out of collisions has yet to yield evidence of extra spatial dimensions, Willocq has decided to devote some of his efforts to a different method used at LHC experiments: precision measurements.

    Models also make predictions about properties of particles such as how often they decay into one set of particles versus another set. If precise measurements show deviations from predictions by the Standard Model of particle physics, it can mean that something new is at play.
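    Concretely, a deviation is quantified by comparing a measured rate with its Standard Model prediction in units of the combined uncertainty. A toy illustration of that comparison (every number below is invented):

```python
import math

def pull(measured, sigma_meas, predicted, sigma_pred):
    """Deviation in standard deviations, uncertainties combined in quadrature."""
    return (measured - predicted) / math.hypot(sigma_meas, sigma_pred)

# Invented branching-ratio numbers, purely to show the arithmetic
print(pull(3.0e-9, 0.6e-9, 3.6e-9, 0.2e-9))  # about -0.95: consistent with SM
```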

    “Several new physics models predict an enhanced rate of rare subatomic processes,” Bose says. “However, their rates are so low that we have not been able to measure them yet.”

    In the past, precision measurements of well-known particles have overturned seemingly bulletproof paradigms. In the 1940s, for example, the measurement of a property called the “magnetic moment” of the neutron showed that it was not a fundamental particle, as had been previously assumed. This eventually helped lead to the discovery of particles that make up neutrons: quarks.

    Another example is the measurement of the mismatched decays of certain matter and antimatter particles, which led to the prediction of a new group of quarks—later confirmed by the discoveries of the bottom and top quarks.

    The plan for the LHC research program is to collect a huge amount of data, which will give scientists the resolution they need to examine every shadowy corner of the Standard Model.

    “This work naturally pushes our search methods towards making more detailed and higher precision measurements that will help us constrain possible deviations by new physics,” Willocq says.

    Because many of these predictions have never been thoroughly tested, scientists are hoping that they’ll find a few small deviations that could open the door to a new era of physics research. “Nature might be tough with us,” Bose says, “but maybe nature is testing us and making us stronger.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 2:38 pm on May 1, 2019 Permalink | Reply
    Tags: , Charged particles travel faster than light through the quantum vacuum of space that surrounds pulsars., , Dame Susan Jocelyn Bell Burnell (1943 – ) still working, Particle Physics, ,   

    From SPACE.com: “Faster-Than-Light Particles Emit Superbright Gamma Rays that Circle Pulsars” 

    space-dot-com logo

    From SPACE.com

    Yasemin Saplakoglu

    The Vela pulsar that lives 1,000 light years from our planet. (Image: © NASA/CXC/Univ of Toronto/M.Durant et al)

    Charged particles travel faster than light through the quantum vacuum of space that surrounds pulsars. As these electrons and protons fly by pulsars, they create the ultrabright gamma-ray flashes emitted by the rapidly twirling neutron stars, new research reveals.

    These gamma-rays, called Cherenkov emissions, are also found in powerful particle accelerators on Earth, such as the Large Hadron Collider near Geneva, Switzerland, and are the source of the bluish-white glow in the waters of a nuclear reactor.


    CERN map

    CERN LHC Tunnel

    CERN LHC particles

    Daya Bay, nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China

    But until now, no one thought that pulsar emissions consisted of Cherenkov radiation.

    That’s in part because of Albert Einstein’s famous theory of relativity, which holds that nothing can travel faster than light in a vacuum. Because of that principle, scientists previously thought that Cherenkov emissions couldn’t happen in the quantum vacuum of space surrounding pulsars. That area is mostly devoid of matter but home to ghostly quantum particles that flicker in and out of existence.

    So, does this new research mean Einstein’s landmark theory was just violated? Not at all, said study co-author Dino Jaroszynski, a professor of physics at the University of Strathclyde in Scotland.

    Pulsars create crushingly strong electromagnetic fields in the quantum vacuum surrounding the stars. These fields warp, or polarize, the vacuum, essentially creating speed bumps that slow down light particles, Jaroszynski told Live Science. Meanwhile, charged particles such as protons and electrons zoom through these fields, racing past light.

    As charged particles fly through this field, they displace electrons along their path and emit radiation, which gathers into an electromagnetic wave. This wave, like an optical version of a sonic boom, is what we see as the gamma-ray flash, according to a statement.
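    The underlying Cherenkov condition is simple: radiation is emitted once a particle’s speed βc exceeds the phase velocity c/n of light in the medium, and it comes out at an angle cos θ = 1/(nβ). A sketch for an ordinary medium (water’s n ≈ 1.33 is a standard value; the effective index of the polarized vacuum around a pulsar is only barely above 1 and is not quoted in the article):

```python
import math

def cherenkov_angle_deg(beta: float, n: float):
    """Cherenkov emission angle in degrees, or None below threshold."""
    if beta * n <= 1.0:
        return None  # slower than light's phase velocity: no emission
    return math.degrees(math.acos(1.0 / (n * beta)))

print(cherenkov_angle_deg(0.999, 1.33))  # ~41 degrees: reactor-pool glow
print(cherenkov_angle_deg(0.700, 1.33))  # None: below threshold
```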

    The team still doesn’t know exactly how bright these gamma-ray flashes are, Jaroszynski said.

    “What we do know is that, under the right conditions, vacuum Cherenkov radiation outshines synchrotron radiation,” he added, referring to another type of radiation that is emitted from pulsars by charged particles moving along a curved path.

    But the new findings could have implications beyond pulsars, the researchers said.

    “This is a very exciting new prediction because it could provide answers to basic questions such as what is the origin of the gamma-ray glow at the centre of galaxies?” Jaroszynski said in the statement. “It provides a new way of testing some of the most fundamental theories of science by pushing them to their limits.”

    The researchers reported their findings April 25 in the journal Physical Review Letters.

    See the full article here.

    Women in STEM – Dame Susan Jocelyn Bell Burnell

    Dame Susan Jocelyn Bell Burnell, discovered pulsars with radio astronomy. Jocelyn Bell at the Mullard Radio Astronomy Observatory, Cambridge University, taken for the Daily Herald newspaper in 1968. Denied the Nobel.

    Dame Susan Jocelyn Bell Burnell at work on the first pulsar chart, pictured at the Four Acre Array in 1967. Image courtesy of Mullard Radio Astronomy Observatory.

    Dame Susan Jocelyn Bell Burnell 2009

    Dame Susan Jocelyn Bell Burnell (1943 – ), still working, from http://www.famousirishscientists.weebly.com


    Please help promote STEM in your local schools.

    Stem Education Coalition
