Tagged: Particle Accelerators

  • richardmitnick 2:20 pm on May 14, 2019 Permalink | Reply
    Tags: LS2, Particle Accelerators, Superconducting magnet circuits

    From CERN: “LS2 Report: consolidating the energy extraction systems of LHC superconducting magnet circuits” 

    From CERN

    13 May, 2019
    Anaïs Schaeffer

    The LS2 team from the NRC Kurchatov-IHEP Institute, Protvino, Russia, with a 13 kA energy extraction system (Image: NRC Kurchatov-IHEP Institute)

    In the LHC, 1232 superconducting dipole magnets and 392 quadrupole magnets guide and focus the beams around the accelerator’s 27-kilometre ring, which is divided into eight sectors. These magnets operate at very low temperatures – 1.9 K or −271.3 °C – where even a tiny amount of energy released inside a magnet can warm its windings to above the critical temperature, causing the loss of superconductivity: this is called a quench. When this happens, the energy stored in the affected magnet has to be safely extracted in a short time to avoid damage to the magnet coil.

    To do so, two protection elements are activated: at the level of the quenching magnet, a diode diverts the current into a parallel by-pass circuit in less than a second; at the level of the circuit, 13 kA energy extraction systems absorb the energy of the whole magnet circuit in a few minutes. There are equivalent extraction systems installed for about 200 corrector circuits with currents up to 600 A.
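    The scale of the protection problem can be sketched with back-of-the-envelope numbers: a magnet storing energy E = ½LI² discharges through a dump resistance R with time constant τ = L/R. The inductance and resistance values below are illustrative assumptions for a rough sketch, not official CERN figures:

```python
# Rough scale of LHC magnet-circuit protection (illustrative values only).
# Stored magnetic energy: E = 1/2 * L * I^2; discharge time constant: tau = L / R.

L_dipole = 0.1       # H, assumed inductance of one main dipole
I_nominal = 11_850   # A, approximate dipole current at 6.5 TeV per beam

E_dipole = 0.5 * L_dipole * I_nominal**2
print(f"Energy stored in one dipole: {E_dipole / 1e6:.1f} MJ")

# A sector chains 154 dipoles (1232 / 8 sectors) in series; the 13 kA
# extraction system switches a dump resistance (value assumed) into the loop.
L_sector = 154 * L_dipole   # H
R_dump = 0.15               # ohm, assumed total dump resistance
tau = L_sector / R_dump
print(f"Sector discharge time constant: {tau:.0f} s")
```

    With these assumed values each dipole holds a few megajoules and the sector current decays over minutes, consistent with the "few minutes" extraction time quoted above.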

    “In the framework of a long-lasting and fruitful collaboration between CERN and the Russian Federation, energy extraction systems for quench protection of the LHC superconducting magnets were designed in close partnership with two Russian institutes, the NRC Kurchatov-IHEP Institute in Protvino for the 13 kA systems and the Budker Institute in Novosibirsk for the 600 A systems. Russian industry was involved in the manufacturing of the parts of these systems,” explains Félix Rodríguez Mateos, leader of the Electrical Engineering (EE) section in the Machine Protection and Electrical Integrity (MPE) group of CERN’s Technology department.

    With a wealth of expertise and know-how, the Russian teams have continuously provided invaluable support to the MPE group. “Our Russian colleagues come to CERN for every year-end technical stop (YETS) and long shutdown to help us perform preventive maintenance and upgrade activities on the energy extraction systems,” says Rodríguez Mateos.

    During LS2, an extensive maintenance campaign is being performed on the 13 kA systems, which already count 10 years of successful operation in the LHC. “We are currently replacing an element, the arcing contact, in each one of the 256 electromechanical switches of the energy extraction systems to ensure their continuous reliable operation throughout the next runs,” adds Rodríguez Mateos. “In February, we fully replaced 32 switches at Point 8 of the accelerator in anticipation of consolidation for the future HL-LHC.”

    During LS2, the Electrical Engineering section is involved in many other activities that will be the subject of future articles.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

     
  • richardmitnick 12:04 pm on May 14, 2019 Permalink | Reply
    Tags: Model-dependent vs model-independent research, Particle Accelerators

    From Symmetry: “Casting a wide net” 

    From Symmetry

    05/14/19
    Jim Daley

    Illustration by Sandbox Studio, Chicago

    In their quest to discover physics beyond the Standard Model, physicists weigh the pros and cons of different search strategies.

    On October 30, 1975, theorists John Ellis, Mary K. Gaillard and D.V. Nanopoulos published a paper [Science Direct] titled “A Phenomenological Profile of the Higgs Boson.” They ended their paper with a note to their fellow scientists.

    “We should perhaps finish with an apology and a caution,” it said. “We apologize to experimentalists for having no idea what is the mass of the Higgs boson… and for not being sure of its couplings to other particles, except that they are probably all very small.

    “For these reasons, we do not want to encourage big experimental searches for the Higgs boson, but we do feel that people performing experiments vulnerable to the Higgs boson should know how it may turn up.”

    What the theorists were cautioning against was a model-dependent search, a search for a particle predicted by a certain model—in this case, the Standard Model of particle physics.

    Standard Model of Particle Physics

    It shouldn’t have been too much of a worry. Around then, most particle physicists’ experiments were general searches, not based on predictions from a particular model, says Jonathan Feng, a theoretical particle physicist at the University of California, Irvine.

    Using early particle accelerators, physicists smashed electrons, positrons and protons together at high energies and looked to see what came out. Samuel Ting and Burton Richter, who shared the 1976 Nobel Prize in physics for the discovery of the charm quark, for example, were not looking for the particle with any theoretical prejudice, Feng says.

    That began to change in the 1980s and ’90s. That’s when physicists began exploring elegant new theories such as supersymmetry, which could tie up many of the Standard Model’s theoretical loose ends—and which predict the existence of a whole slew of new particles for scientists to try to find.

    Of course, there was also the Higgs boson. Even though scientists didn’t have a good prediction of its mass, they had good motivations for thinking it was out there waiting to be discovered.

    And it was. Almost 40 years after the theorists’ tongue-in-cheek warning about searching for the Higgs, Ellis found himself sitting in the main auditorium at CERN next to experimentalist Fabiola Gianotti, the spokesperson of the ATLAS experiment at the Large Hadron Collider who, along with CMS spokesperson Joseph Incandela, had just co-announced the discovery of the particle he had once so pessimistically described.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    Model-dependent vs model-independent

    Scientists’ searches for particles predicted by certain models continue, but in recent years, searches for new physics independent of those models have begun to enjoy a resurgence as well.

    “A model-independent search is supposed to distill the essence from a whole bunch of specific models and look for something that’s independent of the details,” Feng says. The goal is to find an interesting common feature of those models, he explains. “And then I’m going to just look for that phenomenon, irrespective of the details.”

    Particle physicist Sara Alderweireldt uses model-independent searches in her work on the ATLAS experiment at the Large Hadron Collider.

    CERN ATLAS (Image: Claudia Marcelloni, CERN/ATLAS)

    Alderweireldt says that while many high-energy particle physics experiments are designed to make very precise measurements of a specific aspect of the Standard Model, a model-independent search allows physicists to take a wider view and search more generally for new particles or interactions. “Instead of zooming in, we try to look in as many places as possible in a consistent way.”

    Such a search makes room for the unexpected, she says. “You’re not dependent on the prior interpretation of something you would be looking for.”

    Theorist Patrick Fox and experimentalist Anadi Canepa, both at Fermilab, collaborate on searches for new physics.


    In Canepa’s work on the CMS experiment, the other general-purpose particle detector at the LHC, many of the searches are model-independent.

    While the nature of these searches allows them to “cast a wider net,” Fox says, “they are in some sense shallower, because they don’t manage to strongly constrain any one particular model.”

    At the same time, “by combining the results from many independent searches, we are getting closer to one dedicated search,” Canepa says. “Developing both model-dependent and model-independent searches is the approach adopted by the CMS and ATLAS experiments to fully exploit the unprecedented potential of the LHC.”

    Driven by data and powered by machine learning

    Model-dependent searches focus on a single assumption or look for evidence of a specific final state following an experimental particle collision. Model-independent searches are far broader—and how broad is largely driven by the speed at which data can be processed.

    “We have better particle detectors, and more advanced algorithms and statistical tools that are enabling us to understand searches in broader terms,” Canepa says.

    One reason model-independent searches are gaining prominence is because now there is enough data to support them. Particle detectors are recording vast quantities of information, and modern computers can run simulations faster than ever before, she says. “We are able to do model-independent searches because we are able to better understand much larger amounts of data and extreme regions of parameter and phase space.”

    Machine learning is a key part of this processing power, Canepa says. “That’s really a change of paradigm, because it really made us make a major leap forward in terms of sensitivity [to new signals]. It really allows us to benefit from understanding the correlations that we didn’t capture in a more classical approach.”

    These broader searches are an important part of modern particle physics research, Fox says.

    “At a very basic level, our job is to bequeath to our descendants a better understanding of nature than we got from our ancestors,” he says. “One way to do that is to produce lots of information that will stand the test of time, and one way of doing that is with model-independent searches.”

    Models go in and out of fashion, he adds. “But model-independent searches don’t feel like they will.”

    See the full article here.



    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:09 am on May 13, 2019 Permalink | Reply
    Tags: CLIC, Particle Accelerators, Roadmap for the future of the discipline, The European Strategy Group

    From CERN: “In Granada, the European particle physics community prepares decisions for the future of the field” 

    From CERN

    13 May, 2019

    The European particle physics community is meeting this week in Granada, Spain, to discuss the roadmap for the future of the discipline.

    Geneva and Granada. The aim of the symposium is to define scientific priorities and technological approaches for the coming years and to consider plans for the medium- and long-term future. An important focus of the discussions will be assessing the various options for the period beyond the lifespan of the Large Hadron Collider.

    “The Granada symposium is an important step in the process of updating the European Strategy for Particle Physics and aims to prioritise our scientific goals and prepare for the upcoming generation of facilities and experiments,” said the President of the CERN Council, Ursula Bassler. “The discussions will focus on the scientific reach of potential new projects, the associated technological challenges and the resources required.”

    The European Strategy Group, which was established to coordinate the update process, has received 160 contributions from the scientific community setting out their views on possible future projects and experiments. The symposium in Granada will provide an opportunity to assess and discuss them.

    “The intent is to make sure that we have a good understanding of the science priorities of the community and of all the options for realising them,” said the Chair of the European Strategy Group, Professor Halina Abramowicz. “This will ensure that the European Strategy Group is well informed when deciding about the strategy update.”

    The previous update of the European Strategy, approved in May 2013, recommended that design and feasibility studies be conducted in order for Europe “to be in a position to propose an ambitious post-LHC accelerator project.” Over the last few years, in collaboration with partners from around the world, Europe has therefore been engaging in R&D and design projects for a range of ambitious post-LHC facilities under the CLIC and FCC umbrellas.


    CLIC collider

    CERN FCC Future Circular Collider, details of the proposed 100 km-circumference successor to the LHC

    A study to investigate the potential to build projects that are complementary to high-energy colliders, exploiting the opportunities offered by CERN’s unique accelerator complex, was also launched by CERN in 2016. These contributions will feed into the discussion, which will also take into account the worldwide particle physics landscape and developments in related fields.

    “At least two decades will be needed to design and build a new collider to succeed the LHC. Such a machine should maximise the potential for new discoveries and enable major steps forward in our understanding of fundamental physics,” said CERN Director-General Fabiola Gianotti. “It is not too early to start planning for it as it will take time to develop the new technologies needed for its implementation.”

    The Granada symposium will be followed up with the compilation of a “briefing book” and with a Strategy Drafting Session, which will take place in Bad Honnef, Germany, from 20 to 24 January 2020. The update of the European Strategy for Particle Physics is due to be completed and approved by the CERN Council in May 2020.

    An online Q&A session will be held on Thursday 16 May – 4pm CEST

    Reporters interested in participating are invited to register by sending an e-mail to press@cern.ch

    https://europeanstrategy.cern/

    See the full article here.



     
  • richardmitnick 12:54 pm on May 10, 2019 Permalink | Reply
    Tags: “Belle II will accumulate more than 50 times the data sample of the original Belle experiment at KEK”, “We are developing the data-distribution software working not only with Belle II colleagues but also with colleagues at CERN”, “We store an entire copy of the Belle II data and we have the computing resources to process that data and make it available to collaborators around the world”, Belle II detector, Benefitting from our own experience at the RHIC & ATLAS Computing Center, Brookhaven’s magnet division constructed 43 custom-designed corrector magnets, Particle Accelerators, Physicists and engineers in the Laboratory’s Superconducting Magnet Division made contributions essential to upgrading the KEK accelerator helping to transform it into SuperKEKB, Physicists will search for signs of “new physics”—something that cannot be explained by the particles and forces already included in the Standard Model, SuperKEKB accelerator, SuperKEKB collides electrons with their antimatter counterparts known as positrons, The corrector magnets are installed on each side of the Belle II detector

    From Brookhaven National Lab: “Brookhaven Lab and the Belle II Experiment” 

    From Brookhaven National Lab

    May 7, 2019
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Tracking particle smashups and detector conditions from half a world away, scientists seek answers to big physics mysteries.

    SuperKEKB accelerator and Belle II detector at the interaction region. (Credit: Belle II/KEK)

    If you think keeping track of the photos on your mobile phone is a challenge, imagine how daunting the job would be if your camera were taking thousands of photos every second. That’s the task faced by particle physicists working on the Belle II experiment at Japan’s SuperKEKB particle accelerator, which started its first physics run in late March. Belle II physicists will sift through “snapshots” of millions of subatomic smashups per day—as well as data on the conditions of the “camera” at the time of each collision—to seek answers to some of the biggest questions in physics.

    A key part of the experiment is taking place half a world away, using computing resources and expertise at the U.S. Department of Energy’s Brookhaven National Laboratory, the lead laboratory for U.S. collaborators on Belle II.

    “We store an entire copy of the Belle II data, and we have the computing resources to process that data and make it available to collaborators around the world,” said Benedikt Hegner, a physicist in Brookhaven Lab’s Computational Sciences Initiative. To date, Brookhaven’s Scientific Data and Computing Center (SDCC) has handled up to 95 percent of the experiment’s entire computing workload—reconstructing particles from simulated events prior to the experiment’s startup, and since late March, from live collision events. SDCC will continue that role for the experiment’s first three years, thereafter maintaining some 30 percent of the data-transfer and storage responsibility while transitioning the rest to other Belle II member nations that have powerful GRID computing capabilities.

    “We are developing the data-distribution software, working not only with Belle II colleagues but also with colleagues at CERN, the European laboratory for particle physics research, learning from their experience managing datasets from the Large Hadron Collider (LHC)—as well as our own experience at the RHIC & ATLAS Computing Center,” Hegner said.

    Benedikt Hegner in the Scientific Data and Computing Center at Brookhaven Lab, which stores and processes Belle II data and makes it available to collaborators around the world.

    Brookhaven also hosts Belle II’s “conditions database”—an archive of the detector’s conditions at the time of each recorded collision. This database tracks millions of variables—for example, the detector’s level of electronic noise, millimeter-scale movements of the detector due to the strong magnetic field, and variations in electronic response due to small temperature changes—all of which need to be properly taken into account to make sense of Belle II’s measurements.

    “This is the first time a particle physics experiment’s conditions database is being hosted at a distant location,” Hegner noted. Tracking the conditions helps calibrate the detector and even feeds input to the “trigger” systems that decide which collisions to record. “If we’re having trouble with our system, Belle II will eventually see that during data collection. So, the reliability of our services is essential,” Hegner said.
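    The core idea behind such an archive can be illustrated with a toy "interval of validity" lookup: each recorded set of conditions is valid from the run at which it was taken until the next set supersedes it. This sketch is purely hypothetical and is not Belle II's actual schema or software:

```python
import bisect

class ConditionsDB:
    """Toy interval-of-validity store: each payload of detector conditions
    is valid from its start run until the next entry's start run."""
    def __init__(self):
        self._starts = []    # sorted run numbers
        self._payloads = []  # conditions valid from the matching run onward

    def add(self, start_run, payload):
        i = bisect.bisect_left(self._starts, start_run)
        self._starts.insert(i, start_run)
        self._payloads.insert(i, payload)

    def get(self, run):
        # Find the latest entry whose start run is <= the requested run.
        i = bisect.bisect_right(self._starts, run) - 1
        if i < 0:
            raise KeyError(f"no conditions recorded at or before run {run}")
        return self._payloads[i]

db = ConditionsDB()
db.add(100, {"noise_level": 0.8, "alignment_mm": 0.02})
db.add(250, {"noise_level": 1.1, "alignment_mm": 0.05})
print(db.get(300)["noise_level"])  # prints 1.1 (run 250 entry still valid)
```

    The real system tracks millions of such variables and must answer these lookups reliably for every recorded collision, which is why the availability of the remote hosting matters so much.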

    But Brookhaven’s involvement in Belle II goes beyond cataloging collisions and crunching the numbers. Physicists and engineers in the Laboratory’s Superconducting Magnet Division made contributions essential to upgrading the KEK accelerator, helping to transform it into SuperKEKB, and members of Brookhaven Lab’s physics department are looking forward to analyzing Belle II data and being part of the upgraded facility’s discoveries.

    Improved magnets, more collisions, “new physics”?

    Like its predecessor, SuperKEKB collides electrons with their antimatter counterparts, known as positrons. To keep collision rates high, these beams must be tightly focused. But the magnetic fields guiding the particles in one beam can have unwanted effects in the adjacent beam, causing the particles to spread. To fine-tune the fields of the accelerator magnets and counteract these adjacent-beam effects, Brookhaven’s magnet division constructed 43 custom-designed corrector magnets. These corrector magnets are installed on each side of the Belle II detector, making adjustments to both the incoming and outgoing beams to maintain high beam intensity, or “luminosity.” High luminosity results in higher collision rates, so physicists at Brookhaven and around the world will have more data to analyze.

    Corrector magnets: leak-field cancel coil being wound by Brookhaven Lab magnet division technician Thomas Van Winckel.

    “Belle II will accumulate more than 50 times the data sample of the original Belle experiment at KEK,” said Brookhaven physicist David Jaffe, who is coordinating Brookhaven Lab scientists’ involvement in the project.

    By scouring reconstructed images of the particles emerging from these electron-positron collisions, physicists will search for signs of “new physics”—something that cannot be explained by the particles and forces already included in the Standard Model, the world’s reigning (and well-tested) theory of particle physics.

    One particular area of interest is the decay of beauty and charm mesons—particles made of two quarks, one of which is a heavy “beauty” or “charm” quark. These “heavy flavor” mesons are created in abundance in electron-positron collisions at the SuperKEKB accelerator.

    “SuperKEKB is called a ‘B factory’ because it is optimized for the production of beauty mesons. It also produces an abundance of charm mesons,” Jaffe said. “While many physicists on Belle II will be investigating the behavior of beauty mesons, the Brookhaven team will be exploiting the huge sample of charm mesons to look for possible discoveries.”

    For example, if heavy flavor mesons measured by Belle II decay (transform into other particles) differently than predicted by the Standard Model, such a discrepancy would be an indication that some new, previously undiscovered particle might be taking part in the action.

    Evidence of new particles might help account for the mysterious dark matter that makes up some 27 percent of the universe, or offer clues about dark energy, which accounts for another 68 percent (with the remaining 5 percent made of the ordinary matter we see around us). Such a discovery might also help explain why today’s universe is made of matter rather than a mix of matter and antimatter, even though scientists believe both were created in equal amounts at the very beginning of time.

    To grasp how shocking this matter-antimatter asymmetry is, think of the common laundry experience of losing a random sock in the dryer. But imagine if every time you did the laundry—even a billion loads, each with a billion pairs of socks labeled “left” and “right”—you always ended up with a single unpaired left sock and never a lone right sock. That’s what it’s like for physicists trying to understand why the universe ended up with only matter. There must be some difference in the way matter and antimatter behave to explain this anomaly.

    There is evidence that matter and antimatter behave differently from several well-known experiments studying meson decays. These include a Nobel Prize-winning experiment at Brookhaven’s Alternating Gradient Synchrotron, which studied the decay of mesons containing a strange quark in the 1960s. More recently, several experiments studying beauty meson decays at other B factories—the original Belle at KEK, the BaBar experiment at the SLAC National Accelerator Laboratory in the U.S., and the LHCb experiment at CERN—observed similar asymmetries. But thus far, the matter-antimatter asymmetry observed in beauty and strange mesons follows the pattern predicted by the Standard Model, and is not sufficient to explain the matter-antimatter asymmetry of the universe.

    LHCb also recently observed a smaller level of matter-antimatter asymmetry in charm meson decays for the first time. It is unclear if this new observation is consistent with the Standard Model or due to new particles that preferentially interact with charm quarks. Additional measurements are needed to solve this mystery.

    Physicist David Jaffe is coordinating Brookhaven Lab’s contributions to Belle II.

    “What we’ll do at Belle II is like many, many trips to the laundromat where we carefully launder our ‘charmed’ socks and use different methods to dry them. We’ll use our observations from these different loads of charmed laundry to map out what happens in charm meson decays to higher precision than ever before,” explained Jaffe. “Then we’ll compare those observations to our expectations from the Standard Model to see if we’ve found evidence for new particles.”

    The Belle II experiment, Jaffe noted, complements LHCb. “Belle II has a different range of features that enable contrasting studies of the charm mesons,” he said. “We are starting to accumulate large data samples to help us make the precision measurements we need to resolve these questions. Once we’ve confirmed the technical capabilities of the experiment, we will move on to data analysis and the possibility of discovery.”

    See the full article here.




    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 11:26 am on May 5, 2019 Permalink | Reply
    Tags: 'Where Does A Proton’s Mass Come From?', 99.8% of the proton’s mass comes from gluons, Antiquarks, Asymptotic freedom: the particles that mediate this force are known as gluons, Particle Accelerators, The production of Higgs bosons is dominated by gluon-gluon collisions at the LHC, The strong interaction is the most powerful interaction in the entire known Universe

    From Ethan Siegel: “Ask Ethan: ‘Where Does A Proton’s Mass Come From?'” 

    From Ethan Siegel
    May 4, 2019

    The three valence quarks of a proton contribute to its spin, but so do the gluons, sea quarks and antiquarks, and orbital angular momentum as well. The electrostatic repulsion and the attractive strong nuclear force, in tandem, are what give the proton its size, and the properties of quark mixing are required to explain the suite of free and composite particles in our Universe. (APS/ALAN STONEBRAKER)

    The whole should equal the sum of its parts, but doesn’t. Here’s why.

    The whole is equal to the sum of its constituent parts. That’s how everything works, from galaxies to planets to cities to molecules to atoms. If you take all the components of any system and look at them individually, you can clearly see how they all fit together to add up to the entire system, with nothing missing and nothing left over. The total amount you have is equal to the amounts of all the different parts of it added together.

    So why isn’t that the case for the proton? It’s made of three quarks, but if you add up the quark masses, they not only don’t equal the proton’s mass, they don’t come close. This is the puzzle that Barry Duffey wants us to address, asking:

    “What’s happening inside protons? Why does [its] mass so greatly exceed the combined masses of its constituent quarks and gluons?”

    In order to find out, we have to take a deep look inside.

    The composition of the human body, by atomic number and by mass. The whole of our bodies is equal to the sum of its parts, until you get down to an extremely fundamental level. At that point, we can see that we’re actually more than the sum of our constituent components. (ED UTHMAN, M.D., VIA WEB2.AIRMAIL.NET/UTHMAN (L); WIKIMEDIA COMMONS USER ZHAOCAROL (R))

    There’s a hint that comes just from looking at your own body. If you were to divide yourself up into smaller and smaller bits, you’d find — in terms of mass — the whole was equal to the sum of its parts. Your body’s bones, fat, muscles and organs sum up to an entire human being. Breaking those down further, into cells, still allows you to add them up and recover the same mass you have today.

    Cells can be divided into organelles, organelles are composed of individual molecules, molecules are made of atoms; at each stage, the mass of the whole is no different than that of its parts. But when you break atoms into protons, neutrons and electrons, something interesting happens. At that level, there’s a tiny but noticeable discrepancy: the individual protons, neutrons and electrons are off by right around 1% from an entire human. The difference is real.

    From macroscopic scales down to subatomic ones, the sizes of the fundamental particles play only a small role in determining the sizes of composite structures. Whether the building blocks are truly fundamental and/or point-like particles is still not known. (MAGDALENA KOWALSKA / CERN / ISOLDE TEAM)

    CERN ISOLDE

    Like all known organisms, human beings are carbon-based life forms. Carbon atoms are made up of six protons and six neutrons, but if you look at the mass of a carbon atom, it’s approximately 0.8% lighter than the sum of the individual component particles that make it up. The culprit here is nuclear binding energy; when you have atomic nuclei bound together, their total mass is smaller than the mass of the protons and neutrons that comprise them.

    The way carbon is formed is through the nuclear fusion of hydrogen into helium and then helium into carbon; the energy released is what powers most types of stars in both their normal and red giant phases. That “lost mass” is where the energy powering stars comes from, thanks to Einstein’s E = mc². As stars burn through their fuel, they produce more tightly-bound nuclei, releasing the energy difference as radiation.
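    The ~0.8% figure can be checked directly with standard particle masses and E = mc². Since the carbon-12 atom defines the atomic mass unit (exactly 12 u), the mass defect follows from simple arithmetic:

```python
# Mass defect of carbon-12: the bound atom is lighter than its free parts.
m_proton = 1.007276    # u
m_neutron = 1.008665   # u
m_electron = 0.000549  # u
U_TO_MEV = 931.494     # energy equivalent of 1 u, in MeV (E = mc^2)

parts = 6 * (m_proton + m_neutron + m_electron)  # 6 p + 6 n + 6 e
m_c12_atom = 12.0                                # u, exact by definition

defect = parts - m_c12_atom
fraction = defect / parts
print(f"Carbon-12 is {fraction:.1%} lighter than the sum of its parts")
print(f"Binding energy released in forming it: {defect * U_TO_MEV:.1f} MeV")
```

    The result reproduces the roughly 0.8% deficit quoted above, equivalent to about 92 MeV of binding energy per carbon nucleus.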

    In between the 2nd and 3rd brightest stars of the constellation Lyra, the blue giant stars Sheliak and Sulafat, the Ring Nebula shines prominently in the night skies. Throughout all phases of a star’s life, including the giant phase, nuclear fusion powers them, with the nuclei becoming more tightly bound and the energy emitted as radiation coming from the transformation of mass into energy via E = mc². (NASA, ESA, DIGITIZED SKY SURVEY 2)


    This is how most types of binding energy work: it’s harder to pull apart things that are bound together because they released energy when they were joined, and you have to put that energy back in to free them again. That’s why it’s such a puzzling fact that the particles making up the proton — the up, up, and down quarks at its heart — have combined masses of only 0.2% of the mass of the proton as a whole. But the puzzle has a solution that’s rooted in the nature of the strong force itself.
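However the quark masses are counted (the exact percentage depends on the renormalization scheme used to define a quark mass, which is why quoted figures vary), the tally comes out tiny. A sketch with rounded PDG current-quark masses (assumed values):

```python
# Valence-quark rest masses vs. the proton mass, in MeV/c^2.
# Rounded PDG current-quark values; the precise fraction is scheme-dependent.
M_UP, M_DOWN = 2.2, 4.7
M_PROTON = 938.272

valence = 2 * M_UP + M_DOWN        # proton = up + up + down
fraction = valence / M_PROTON

print(f"valence quark masses ≈ {valence:.1f} MeV")       # ≈ 9.1 MeV
print(f"fraction of proton mass ≈ {100 * fraction:.1f}%")
```

Under any counting, the three valence quarks supply at most a percent or so of the proton's mass; everything else must come from somewhere other than the quarks' rest masses.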

    The way quarks bind into protons is fundamentally different from all the other forces and interactions we know of. Instead of the force getting stronger when objects get closer, like the gravitational, electric, or magnetic forces, the attractive force goes down to zero when quarks get arbitrarily close. And instead of the force getting weaker when objects get farther away, the force pulling quarks back together gets stronger the farther away they get.

    5
    The internal structure of a proton, with quarks, gluons, and quark spin shown. The nuclear force acts like a spring, with negligible force when unstretched but large, attractive forces when stretched to large distances. (BROOKHAVEN NATIONAL LABORATORY)

    This property of the strong nuclear force is known as asymptotic freedom, and the particles that mediate this force are known as gluons. Somehow, the energy binding the proton together, responsible for the other 99.8% of the proton’s mass, comes from these gluons. The whole of matter, somehow, weighs much, much more than the sum of its parts.

    This might sound like an impossibility at first, as the gluons themselves are massless particles. But you can think of the forces they give rise to as springs: asymptoting to zero when the springs are unstretched, but becoming very large the greater the amount of stretching. In fact, the amount of energy between two quarks whose distance gets too large can become so great that it’s as though additional quark/antiquark pairs exist inside the proton: sea quarks.
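The spring picture can be made semi-quantitative with the standard "confining string" cartoon (an illustration only, not a full QCD calculation; the string tension of roughly 1 GeV per femtometer is a commonly quoted ballpark, assumed here). Once the energy stored in the stretched string exceeds twice the pion mass, it becomes energetically favorable to snap the string by creating a quark/antiquark pair, which is one way to picture where sea quarks come from:

```python
# Toy confining-string model: E(r) ≈ sigma * r, with an assumed string
# tension sigma ~ 1 GeV/fm (ballpark figure, not a precise QCD result).
SIGMA = 1000.0        # MeV per femtometer
M_PION = 139.6        # MeV/c^2: the lightest quark-antiquark bound state

def string_energy(r_fm):
    """Energy (MeV) stored in the 'string' at quark separation r_fm (fm)."""
    return SIGMA * r_fm

# Separation at which creating a quark/antiquark pair becomes possible:
r_snap = 2 * M_PION / SIGMA
print(f"string breaks near r ≈ {r_snap:.2f} fm")   # ≈ 0.28 fm
```

The string "breaks" at a fraction of the proton's own size, so pair creation is happening constantly inside the proton, consistent with the sea-quark picture above.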

    6
    When two protons collide, it isn’t just the quarks making them up that can collide, but the sea quarks, gluons, and beyond that, field interactions. All can provide insights into the spin of the individual components, and allow us to create potentially new particles if high enough energies and luminosities are reached. (CERN / CMS COLLABORATION)

    Those of you familiar with quantum field theory might have the urge to dismiss the gluons and the sea quarks as just being virtual particles: calculational tools used to arrive at the right result. But that’s not true at all, and we’ve demonstrated that with high-energy collisions between either two protons or a proton and another particle, like an electron or photon.

    The collisions performed at the Large Hadron Collider at CERN are perhaps the greatest test of all for the internal structure of the proton. When two protons collide at these ultra-high energies, most of them simply pass by one another, failing to interact. But when two internal, point-like particles collide, we can reconstruct exactly what it was that smashed together by looking at the debris that comes out.

    7
    A Higgs boson event as seen in the Compact Muon Solenoid detector at the Large Hadron Collider. This spectacular collision is 15 orders of magnitude below the Planck energy, but it’s the precision measurements of the detector that allow us to reconstruct what happened back at (and near) the collision point. Theoretically, the Higgs gives mass to the fundamental particles; however, the proton’s mass is not due to the mass of the quarks and gluons that compose it. (CERN / CMS COLLABORATION)

    Under 10% of the collisions occur between two quarks; the overwhelming majority are gluon-gluon collisions, with quark-gluon collisions making up the remainder. Moreover, not every quark-quark collision involves only up or down quarks; sometimes a heavier quark is involved.

    Although it might make us uncomfortable, these experiments teach us an important lesson: the particles that we use to model the internal structure of protons are real. In fact, the discovery of the Higgs boson itself was only possible because of this, as the production of Higgs bosons is dominated by gluon-gluon collisions at the LHC. If all we had were the three valence quarks to rely on, we would have seen different rates of production of the Higgs than we did.

    8
    Before the mass of the Higgs boson was known, we could still calculate the expected production rates of Higgs bosons from proton-proton collisions at the LHC. The top channel is clearly production by gluon-gluon collisions. I (E. Siegel) have added the yellow highlighted region to indicate where the Higgs boson was discovered. (CMS COLLABORATION (DORIGO, TOMMASO FOR THE COLLABORATION) ARXIV:0910.3489)

    As always, though, there’s still plenty more to learn. We presently have a solid model of the average gluon density inside a proton, but if we want to know where the gluons are actually more likely to be located, that requires more experimental data, as well as better models to compare the data against. Recent advances by theorists Björn Schenke and Heikki Mäntysaari may be able to provide those much needed models. As Mäntysaari detailed:

    “It is very accurately known how large the average gluon density is inside a proton. What is not known is exactly where the gluons are located inside the proton. We model the gluons as located around the three [valence] quarks. Then we control the amount of fluctuations represented in the model by setting how large the gluon clouds are, and how far apart they are from each other. […] The more fluctuations we have, the more likely this process [producing a J/ψ meson] is to happen.”

    9
    A schematic of the world’s first electron-ion collider (EIC). Adding an electron ring (red) to the Relativistic Heavy Ion Collider (RHIC) at Brookhaven would create the eRHIC: a proposed deep inelastic scattering experiment that could improve our knowledge of the internal structure of the proton significantly. (BROOKHAVEN NATIONAL LABORATORY-CAD ERHIC GROUP)

    The combination of this new theoretical model and the ever-improving LHC data will better enable scientists to understand the internal, fundamental structure of protons, neutrons and nuclei in general, and hence to understand where the mass of the known objects in the Universe comes from. From an experimental point of view, the greatest boon would be a next-generation electron-ion collider, which would enable us to perform deep inelastic scattering experiments to reveal the internal makeup of these particles as never before.

    But there’s another theoretical approach that can take us even farther into the realm of understanding where the proton’s mass comes from: Lattice QCD.

    10
    A better understanding of the internal structure of a proton, including how the “sea” quarks and gluons are distributed, has been achieved through both experimental improvements and new theoretical developments in tandem. (BROOKHAVEN NATIONAL LABORATORY)

    The difficult part with the quantum field theory that describes the strong force — quantum chromodynamics (QCD) — is that the standard approach we take to doing calculations is no good. Typically, we’d look at the effects of particle couplings: the charged quarks exchange a gluon and that mediates the force. They could exchange gluons in a way that creates a particle-antiparticle pair or an additional gluon, and that should be a correction to a simple one-gluon exchange. They could create additional pairs or gluons, which would be higher-order corrections.

    We call this approach taking a perturbative expansion in quantum field theory, with the idea that calculating higher and higher-order contributions will give us a more accurate result.

    11
    Today, Feynman diagrams are used in calculating every fundamental interaction spanning the strong, weak, and electromagnetic forces, including in high-energy and low-temperature/condensed conditions. But this approach, which relies on a perturbative expansion, is only of limited utility for the strong interactions, as this approach diverges, rather than converges, when you add more and more loops for QCD. (DE CARVALHO, VANUILDO S. ET AL. NUCL.PHYS. B875 (2013) 738–756)

    Richard Feynman © Open University

    But this approach, which works so well for quantum electrodynamics (QED), fails spectacularly for QCD. The strong force works differently, and so these corrections get very large very quickly. Adding more terms, instead of converging towards the correct answer, diverges and takes you away from it. Fortunately, there is another way to approach the problem: non-perturbatively, using a technique called Lattice QCD.
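The flavor of this failure can be shown with a toy asymptotic series (the factorial coefficients below are invented for illustration; real QED/QCD coefficients are far subtler). With a small coupling, the partial sums settle down for many orders before misbehaving; with a large coupling, they run away almost immediately:

```python
import math

def partial_sums(coupling, n_terms):
    """Partial sums of a factorially divergent toy series: sum_n n! * g^n."""
    total, sums = 0.0, []
    for n in range(n_terms):
        total += math.factorial(n) * coupling**n
        sums.append(total)
    return sums

weak = partial_sums(0.01, 10)    # QED-like small coupling: terms shrink at first
strong = partial_sums(0.5, 10)   # QCD-like large coupling: terms grow at once

print(f"weak coupling, 10 terms:   {weak[-1]:.4f}")   # hovers near 1.01
print(f"strong coupling, 10 terms: {strong[-1]:.1f}") # already running away
```

For the small coupling, each extra term is a tiny correction; for the large one, every added term makes the answer worse, which is the qualitative sense in which perturbation theory "diverges" for the strong force.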

    By treating space and time as a grid (or lattice of points) rather than a continuum, where the lattice is arbitrarily large and the spacing is arbitrarily small, you overcome this problem in a clever way. Whereas in standard, perturbative QCD, the continuous nature of space means that you lose the ability to calculate interaction strengths at small distances, the lattice approach means there’s a cutoff at the size of the lattice spacing. Quarks exist at the intersections of grid lines; gluons exist along the links connecting grid points.

    As your computing power increases, you can make the lattice spacing smaller, which improves your calculational accuracy. Over the past three decades, this technique has led to an explosion of solid predictions, including the masses of light nuclei and the reaction rates of fusion under specific temperature and energy conditions. The mass of the proton, from first principles, can now be theoretically predicted to within 2%.
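The payoff from a shrinking lattice spacing can be illustrated with a far simpler discretized problem (a toy finite-difference derivative, standing in for the vastly more elaborate lattice-QCD machinery): the discretization error falls as the grid spacing shrinks, which is exactly the trade that extra computing power buys:

```python
import math

def centered_derivative(f, x, a):
    """Approximate f'(x) on a grid of spacing a; the error shrinks as O(a^2)."""
    return (f(x + a) - f(x - a)) / (2 * a)

exact = math.cos(1.0)  # d/dx sin(x) at x = 1
for a in (0.1, 0.01, 0.001):
    approx = centered_derivative(math.sin, 1.0, a)
    print(f"spacing {a}: error = {abs(approx - exact):.2e}")
```

Each tenfold reduction in the spacing cuts the error by roughly a factor of a hundred in this toy, at the price of a finer (and more expensive) grid.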

    12
    As computational power and Lattice QCD techniques have improved over time, so has the accuracy to which various quantities about the proton, such as its component spin contributions, can be computed. By reducing the lattice spacing size, which can be done simply by raising the computational power employed, we can better predict the mass of not only the proton, but of all the baryons and mesons. (LABORATOIRE DE PHYSIQUE DE CLERMONT / ETM COLLABORATION)

    It’s true that the individual quarks, whose masses are determined by their coupling to the Higgs boson, cannot even account for 1% of the mass of the proton. Rather, it’s the strong force, described by the interactions between quarks and the gluons that mediate them, that is responsible for practically all of it.

    The strong nuclear force is the most powerful interaction in the entire known Universe. When you go inside a particle like the proton, it’s so powerful that it — not the mass of the proton’s constituent particles — is primarily responsible for the total energy (and therefore mass) of the normal matter in our Universe. Quarks may be point-like, but the proton is huge by comparison, with a radius of about 8.4 × 10^-16 m. Confining its component particles, which the binding energy of the strong force does, is what’s responsible for 99.8% of the proton’s mass.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 2:08 pm on May 3, 2019 Permalink | Reply
    Tags: "A quantum leap in particle simulation", , , , Particle Accelerators, , Particles called fermions which are the building blocks of matter; and particles called bosons which are field particles and tug on the matter particles.,   

    From Fermi National Accelerator Lab: “A quantum leap in particle simulation” 

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    May 2, 2019
    Leah Hesla

    A group of scientists at the Department of Energy’s Fermilab has figured out how to use quantum computing to simulate the fundamental interactions that hold together our universe.

    In a paper published in Physical Review Letters, Fermilab researchers fill a conspicuous gap in modeling the subatomic world using quantum computers, addressing a family of particles that, until recently, has been relatively neglected in quantum simulations.

    The fundamental particles that make up our universe can be divided into two groups: particles called fermions, which are the building blocks of matter, and particles called bosons, which are field particles and tug on the matter particles.

    In recent years, scientists have successfully developed quantum algorithms to compute systems made of fermions. But they’ve had a much tougher time doing the same for boson systems.

    For the first time, Fermilab scientist Alexandru Macridin has found a way to model systems containing both fermions and bosons on general-purpose quantum computers, opening a door to realistic simulations of the subatomic realm. His work is part of the Fermilab quantum science program.

    “The representation of bosons in quantum computing was never addressed very well in the literature before,” Macridin said. “Our method worked, and better than we expected.”

    The relative obscurity of bosons in quantum-computation literature has partly to do with bosons themselves and partly with the way quantum-computing research has evolved.

    Over the last decade, the development of quantum algorithms focused strongly on simulating purely fermionic systems, such as molecules in quantum chemistry.

    “But in high-energy physics, we also have bosons, and high-energy physicists are particularly interested in the interactions between bosons and fermions,” said Fermilab scientist Jim Amundson, a co-author on the Physical Review Letters paper. “So we took existing fermion models and extended them to include bosons, and we did that in a novel way.”

    The biggest barrier to modeling bosons related to the properties of a qubit — a quantum bit.

    Mapping the states

    A qubit has two states: 1 and 0.

    Similarly, a fermion state has two distinct modes: occupied and unoccupied.

    The qubit’s two-state property means it can represent a fermion state pretty straightforwardly: One qubit state is assigned to “occupied,” and the other, “unoccupied.”

    (You might remember something about the occupation of states from high school chemistry: An atom’s electron orbitals can each be occupied by a maximum of one electron. So they’re either occupied or not. Those orbitals, in turn, combine to form the electron shells that surround the nucleus.)

    The one-to-one mapping between qubit state and fermion state makes it easy to determine the number of qubits you’ll need to simulate a fermionic process. If you’re dealing with a system of 40 fermion states, like a molecule with 40 orbitals, you’ll need 40 qubits to represent it.
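The bookkeeping can be made concrete with a hypothetical helper (purely illustrative, and far short of the Jordan-Wigner-style machinery real algorithms use to handle fermionic signs):

```python
def fermion_state_to_qubits(occupations):
    """Map a list of fermionic occupation numbers (each 0 or 1) to a qubit
    basis-state label, using exactly one qubit per fermion mode/orbital."""
    assert all(n in (0, 1) for n in occupations), "fermion modes: occupied or not"
    return "|" + "".join(str(n) for n in occupations) + ">"

# A 40-orbital molecule needs 40 qubits; here, 6 orbitals holding 2 electrons:
state = fermion_state_to_qubits([1, 1, 0, 0, 0, 0])
print(state)           # |110000>
print(len(state) - 2)  # qubits used = number of fermion modes: 6
```

One qubit per mode, with no overhead: this is the simple arithmetic that makes fermionic simulations easy to size.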

    In a quantum simulation, a researcher sets up qubits to represent the initial state of, say, a molecular process. Then the qubits are manipulated according to an algorithm that reflects how that process evolves.

    More complex processes need a greater number of qubits. As the number grows, so does the computing power needed to carry it out. But even with only a handful of qubits at one’s disposal, researchers are able to tackle some interesting problems related to fermion processes.

    “There’s a well-developed theory for how to map fermions onto qubits,” said Fermilab theorist Roni Harnik, a co-author of the paper.

    Bosons, nature’s force particles, are a different story. The business of mapping them gets complicated quickly. That’s partly because, unlike the restricted, two-choice fermion state, boson states are highly accommodating.

    2
    A system of bosons can be modeled as a system of harmonic oscillators, a phenomenon that occurs everywhere in nature. The motion of a spring bobbing up and down and the vibration of a plucked string are both examples of harmonic oscillators. In quantum mechanics, the harmonic oscillator motion is described by typical wave functions. Several (typical) wave functions are shown here. A Fermilab team recently found a way to represent wave functions for bosonic systems on a quantum computer. Image: Allen McC

    Accommodating bosons

    Since only one fermion can occupy a particular fermion quantum state, that state is either occupied or not — 1 or 0.

    In contrast, a boson state can be variably occupied, accommodating one boson, a zillion bosons, or anything in between. That makes it tough to map bosons to qubits. With only two possible states, a single qubit cannot, by itself, represent a boson state.

    With bosons, the question is not whether the qubit represents an occupied or unoccupied state, but rather, how many qubits are needed to represent the boson state.

    “Scientists have come up with ways to encode bosons into qubits that would require a large number of qubits to give you accurate results,” Amundson said.

    A prohibitively large number, in many cases. By some methods, a useful simulation would need millions of qubits to faithfully model a boson process, such as the transformation of a particle that ultimately produces a photon (a particle of light, which is a type of boson).

    And that’s just in representing the process’s initial setup, let alone letting it evolve.

    Macridin’s solution was to recast the boson system as something else, something very familiar to physicists — a harmonic oscillator.

    Harmonic oscillators are everywhere in nature, from the subatomic to the astronomical scales. The vibration of molecules, the pulse of current through a circuit, the up-and-down bob of a loaded spring, the motion of a planet around a star — all are harmonic oscillators. Even bosonic particles, like those Macridin looked to simulate, can be treated like tiny harmonic oscillators. Thanks to their ubiquity, harmonic oscillators are well-understood and can be modeled precisely.

    With a background in condensed-matter physics — the study of nature a couple of notches up from its particle foundation — Macridin was familiar with modeling harmonic oscillators in crystals. He found a way to represent a harmonic oscillator on a quantum computer, mapping such systems to qubits with exceptional precision and enabling the precise simulation of bosons on quantum computers.

    And at a low qubit cost: Representing a discrete harmonic oscillator on a quantum computer requires only a few qubits, even if the oscillator represents a large number of bosons.

    “Our method requires a relatively small number of qubits for boson states — exponentially smaller than what was proposed by other groups before,” Macridin said. “For other methods to do the same thing, they would probably need an orders-of-magnitude larger number of qubits.”

    Macridin estimates that six qubits per boson state is enough to explore interesting physics problems.
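The qubit economics are easy to check: n qubits can label 2^n occupation levels, so six qubits already give a 64-level truncation of one boson mode. A minimal numerical sketch of such a truncated harmonic oscillator follows (this illustrates only the counting and the truncation effect; the actual encoding in the paper is assumed to be more sophisticated):

```python
import math

def lowering_operator(n_levels):
    """Truncated harmonic-oscillator lowering operator a, as an n x n matrix:
    a|k> = sqrt(k) |k-1>."""
    a = [[0.0] * n_levels for _ in range(n_levels)]
    for k in range(1, n_levels):
        a[k - 1][k] = math.sqrt(k)
    return a

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    return [list(row) for row in zip(*A)]

n_qubits = 6
n_levels = 2 ** n_qubits      # six qubits label 2^6 = 64 oscillator levels
a = lowering_operator(n_levels)
adag = transpose(a)           # raising operator (real matrix, so just a^T)

# Commutator [a, a†]: identity on every level except the truncation edge.
aad = matmul(a, adag)
ada = matmul(adag, a)
comm_first = aad[0][0] - ada[0][0]      # = 1.0, just like a true oscillator
comm_last = aad[-1][-1] - ada[-1][-1]   # = 1 - n_levels: the truncation artifact

print(n_levels, comm_first, comm_last)  # 64 1.0 -63.0
```

The commutator check shows the truncation behaves like a genuine oscillator on the low-lying levels and only breaks down at the cutoff, which is exactly the trade-off a finite qubit register imposes.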

    Simulation success

    As a trial of Macridin’s mapping method, the Fermilab group first tapped into quantum field theory, a branch of physics that focuses on modeling subatomic structures. They successfully modeled the interaction of electrons in a crystal with the vibrations of the atoms that form the crystal. The ‘unit’ of that vibration is a boson called a phonon.

    Using a quantum simulator at nearby Argonne National Laboratory, they modeled the electron-phonon system and — voila! — they showed they could calculate, with high accuracy, the system’s properties using only about 20 qubits. The simulator is a classical computer that simulates how a quantum computer, up to 35 qubits, works. Argonne researchers leverage the simulator and their expertise in scalable algorithms to explore the potential impact of quantum computing in key areas such as quantum chemistry and quantum materials.

    “We showed that the technique worked,” Harnik said.

    They further showed that, by representing bosons as harmonic oscillators, one could efficiently and accurately describe systems involving fermion-boson interactions.

    “It turned out to be a good fit,” Amundson said.

    “I’d started with one idea, and it didn’t work, so then I changed the representation of the bosons,” Macridin said. “And it worked well. It makes the simulation of fermion-boson systems feasible for near-term quantum computers.”

    Universal application

    The Fermilab group’s simulation is not the first time scientists have modeled bosons in quantum computers. But in the other cases, scientists used hardware specifically designed to simulate bosons, so the simulated evolution of a boson system would happen naturally, so to speak, on those special computers.

    The Fermilab group’s approach is the first that can be applied efficiently in a general-purpose, digital quantum computer, also called a universal quantum computer.

    The next step for Macridin, Amundson and other particle physicists at Fermilab is to use their method on problems in high-energy physics.

    “In nature, fermion-boson interactions are fundamental. They appear everywhere,” Macridin said. “Now we can extend our algorithm to various theories in our field.”

    Their achievement extends beyond particle physics. Amundson says their group has heard from materials scientists who think the work could be useful in solving real-world problems in the foreseeable future.

    “We introduced bosons in a new way that requires fewer resources,” Amundson said. “It really opens up a whole new class of quantum simulations.”

    This work was funded by the DOE Office of Science. Learn more about this result in Physical Review Letters [above]. Visit the Fermilab quantum science website [above] to learn about other quantum initiatives.

    See the full article here.




    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

    FNAL MINERvA front face Photo Reidar Hahn

    FNAL DAMIC

    FNAL Muon g-2 studio

    FNAL Short-Baseline Near Detector under construction

    FNAL Mu2e solenoid

    Dark Energy Camera [DECam], built at FNAL

    FNAL DUNE Argon tank at SURF

    FNAL/MicrobooNE

    FNAL Don Lincoln

    FNAL/MINOS

    FNAL Cryomodule Testing Facility

    FNAL MINOS Far Detector in the Soudan Mine in northern Minnesota

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    FNAL/NOvA experiment map

    FNAL NOvA Near Detector

    FNAL ICARUS

    FNAL Holometer

     
  • richardmitnick 3:02 pm on May 2, 2019 Permalink | Reply
    Tags: , , , , , Particle Accelerators, , ,   

    From University of Chicago: “Scientists invent way to trap mysterious ‘dark world’ particle at Large Hadron Collider” 

    U Chicago bloc

    From University of Chicago

    Apr 17, 2019 [Just found this via social media]
    Louise Lerner

    1
    Courtesy of Zarija Lukic/Berkeley Lab

    A new paper outlines a method to directly detect particles from the ‘dark world’ using the Large Hadron Collider. Until now we’ve only been able to make indirect measurements and simulations, such as the visualization of dark matter above.

    CERN LHC Maximilien Brice and Julien Marius Ordan

    Higgs boson could be tied to a dark particle, serve as ‘portal to the dark world’.

    Now that they’ve identified the Higgs boson, scientists at the Large Hadron Collider have set their sights on an even more elusive target.

    All around us is dark matter and dark energy—the invisible stuff that binds the galaxy together, but which no one has been able to directly detect. “We know for sure there’s a dark world, and there’s more energy in it than there is in ours,” said LianTao Wang, a University of Chicago professor of physics who studies how to find signals in large particle accelerators like the LHC.

    Wang, along with scientists from the University and UChicago-affiliated Fermilab, think they may be able to lead us to its tracks; in a paper published April 3 in Physical Review Letters, they laid out an innovative method for stalking dark matter in the LHC by exploiting a potential particle’s slightly slower speed.

    While the dark world makes up more than 95% of the universe, scientists only know it exists from its effects—like a poltergeist you can only see when it pushes something off a shelf. For example, we know there’s dark matter because we can see gravity acting on it—it helps keep our galaxies from flying apart.

    Theorists think there’s one particular kind of dark particle that only occasionally interacts with normal matter. It would be heavier and longer-lived than other known particles, with a lifetime up to one tenth of a second. A few times in a decade, researchers believe, this particle can get caught up in the collisions of protons that the LHC is constantly creating and measuring.

    “One particularly interesting possibility is that these long-lived dark particles are coupled to the Higgs boson in some fashion—that the Higgs is actually a portal to the dark world,” said Wang, referring to the last holdout particle in physicists’ grand theory of how the universe works, discovered at the LHC in 2012.

    Standard Model of Particle Physics

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    “It’s possible that the Higgs could actually decay into these long-lived particles.”

    The only problem is sorting out these events from the rest; there are more than a billion collisions per second in the 27-kilometer LHC, and each one of these sends subatomic chaff spraying in all directions.

    Wang, UChicago postdoctoral fellow Jia Liu and Fermilab scientist Zhen Liu (now at the University of Maryland) proposed a new way to search by exploiting one particular aspect of such a dark particle. “If it’s that heavy, it costs energy to produce, so its momentum would not be large—it would move more slowly than the speed of light,” said Liu, the first author on the study.

    That time delay would set it apart from all the rest of the normal particles. Scientists would only need to tweak the system to look for particles that are produced and then decay a bit more slowly than everything else.

    The difference is on the order of a nanosecond — a billionth of a second — or smaller. But the LHC already has detectors sophisticated enough to catch this difference; a recent study using data collected from the last run found that the method should work, and the detectors will become even more sensitive as part of the upgrade that is currently underway.
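The size of that lag is easy to estimate with special relativity: a particle's speed is β = p/E, so a heavy particle at modest momentum travels noticeably below light speed. The mass, momentum, and flight path below are illustrative assumptions, not numbers from the paper:

```python
import math

C = 299_792_458.0          # speed of light, m/s

def time_delay(mass_gev, momentum_gev, path_m):
    """Arrival-time lag (s) of a massive particle relative to light over path_m."""
    energy = math.hypot(mass_gev, momentum_gev)   # E^2 = p^2 + m^2 (c = 1 units)
    beta = momentum_gev / energy                  # v/c
    return path_m / (beta * C) - path_m / C

# Assumed example: a 100 GeV particle with 400 GeV of momentum
# crossing 10 m of detector:
delay = time_delay(100.0, 400.0, 10.0)
print(f"{delay * 1e9:.2f} ns")   # order of a nanosecond
```

For these assumed numbers the lag lands right around a nanosecond, the scale the article says LHC timing detectors can already resolve.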

    “We anticipate this method will increase our sensitivity to long-lived dark particles by more than an order of magnitude—while using capabilities we already have at the LHC,” Liu said.

    Experimentalists are already working to build the trap: When the LHC turns back on in 2021, after a tenfold boost in luminosity, all three of the major detectors will implement the new system, the scientists said. “We think it has great potential for discovery,” Liu said.

    CERN ATLAS Credit CERN SCIENCE PHOTO LIBRARY


    CERN/CMS Detector


    CERN/ALICE Detector

    “If the particle is there, we just have to find a way to dig it out,” Wang said. “Usually, the key is finding the question to ask.”

    See the full article here.



    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

     
  • richardmitnick 1:01 pm on May 2, 2019 Permalink | Reply
    Tags: , An unexpected signature, , , It’s not always about what you discover, Nature might be tough with us- but maybe nature is testing us and making us stronger., Particle Accelerators, , , Taking a closer look, Why the force of gravity is so much weaker than other known forces like electromagnetism. There is only one right answer. We haven’t found it yet.   

    From Symmetry: “The unseen progress of the LHC” 

    Symmetry Mag
    From Symmetry

    05/02/19
    Sarah Charley

    LHC

    CERN map


    CERN LHC Maximilien Brice and Julien Marius Ordan


    CERN LHC particles

    It’s not always about what you discover.

    About seven years ago, physicist Stephane Willocq at the University of Massachusetts became enthralled with a set of theories that predicted the existence of curled-up extra dimensions hiding within our classical four dimensions of spacetime.

    “The idea of extra spatial dimensions is appealing because it allows us to look at the fundamental problems in particle physics from a different viewpoint,” Willocq says.

    As an experimental physicist, Willocq can do more than ponder. At the Large Hadron Collider at CERN, he put his pet theories to the test.

    Models based on those theories predicted how curled-up extra dimensions would affect the outcome of proton-proton collisions at the LHC. They predicted the collisions would produce more high-energy particles than expected.

    After several searches, Willocq and his colleagues found nothing out of the ordinary. “It was a great idea and disappointing to see it fade away, bit by bit,” he says, “but that’s how scientific progress works—finding the right idea by process of elimination.”

    The LHC research program is famous for discovering and studying the long-sought Higgs boson. But out of the spotlight, scientists have been using the LHC for an equally important scientific endeavor: testing, constraining and eliminating hundreds of theories that propose solutions to outstanding problems in physics, such as why the force of gravity is so much weaker than other known forces like electromagnetism.

    “There is only one right answer,” Willocq says. “We haven’t found it yet.”

    Now that scientists are at the end of the second run of the LHC, they have covered a huge amount of ground, eliminating the simplest versions of numerous theoretical ideas. They’ve covered four times as much phase space as previous searches for heavy new particles and set strict limits on what is physically possible.

    These studies don’t get the same attention as the Higgs boson, but these null results—results that don’t support a certain hypothesis—have moved physics forward as well.

    An unexpected signature

    Having chased down their most obvious leads, physicists are now adapting their methodology and considering new possibilities in their pursuit of new physics.

    Thus far, physicists have often used a straightforward formula to look for new particles. Massive particles produced in particle collisions will almost instantly decay, transforming into more stable particles. If scientists can measure all of those particles, they can reconstruct the mass and properties of the original particle that produced them.

    This worked wonderfully when scientists discovered the top quark in 1995 and the Higgs boson in 2012. But finding the next new thing might take a different tactic.

    “Finding new physics is more challenging than we expected it to be,” says University of Wisconsin physicist Tulika Bose of the CMS experiment. “Challenging situations make people come up with clever ideas.”

    One idea is that maybe scientists have been so focused on instantly decaying particles that they’ve been missing a whole host of particles that can travel up to several meters before falling apart. This would look like a firework exploding randomly in one of the detector subsystems.

    Scientists are rethinking how they reconstruct the data as a way to cast a bigger net and potentially catch particles with signatures like these. “If we only used our standard analysis methods, we would definitely not be sensitive to anything like this,” Bose says. “We’re no longer just reloading previous analyses but exploring innovative ideas.”

    Taking a closer look

    Since looking for excess particles coming out of collisions has yet to yield evidence of extra spatial dimensions, Willocq has decided to devote some of his efforts to a different method used at LHC experiments: precision measurements.

    Models also make predictions about properties of particles such as how often they decay into one set of particles versus another set. If precise measurements show deviations from predictions by the Standard Model of particle physics, it can mean that something new is at play.

    “Several new physics models predict an enhanced rate of rare subatomic processes,” Bose says. “However, their rates are so low that we have not been able to measure them yet.”

    In the past, precision measurements of well-known particles have overturned seemingly bulletproof paradigms. In the 1940s, for example, the measurement of a property called the “magnetic moment” of the neutron showed that it was not a fundamental particle, as had been previously assumed. This eventually helped lead to the discovery of particles that make up neutrons: quarks.

    Another example is the measurement of the mismatched decays of certain matter and antimatter particles, which led to the prediction of a new group of quarks—later confirmed by the discoveries of the top and bottom quarks.

    The plan for the LHC research program is to collect a huge amount of data, which will give scientists the resolution they need to examine every shadowy corner of the Standard Model.

    “This work naturally pushes our search methods towards making more detailed and higher precision measurements that will help us constrain possible deviations by new physics,” Willocq says.

    Because many of these predictions have never been thoroughly tested, scientists are hoping that they’ll find a few small deviations that could open the door to a new era of physics research. “Nature might be tough with us,” Bose says, “but maybe nature is testing us and making us stronger.”

See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:32 pm on April 18, 2019 Permalink | Reply
    Tags: "When Beauty Gets in the Way of Science", , , , , , Particle Accelerators, , , , ,   

    From Nautilus: “When Beauty Gets in the Way of Science” 

    Nautilus

    From Nautilus

    April 18, 2019
    Sabine Hossenfelder

    Insisting that new ideas must be beautiful blocks progress in particle physics.

    When Beauty Gets in the Way of Science. Nautilus

    The biggest news in particle physics is no news. In March, one of the most important conferences in the field, Rencontres de Moriond, took place. It is an annual meeting at which experimental collaborations present preliminary results. But the recent data from the Large Hadron Collider (LHC), currently the world’s largest particle collider, has not revealed anything new.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    Forty years ago, particle physicists thought themselves close to a final theory for the structure of matter. At that time, they formulated the Standard Model of particle physics to describe the elementary constituents of matter and their interactions.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    After that, they searched for the predicted, but still missing, particles of the Standard Model. In 2012, they confirmed the last missing particle, the Higgs boson.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The Higgs boson is necessary to make sense of the rest of the Standard Model. Without it, the other particles would not have masses, and probabilities would not properly add up to one. Now, with the Higgs in the bag, the Standard Model is complete; all Pokémon caught.

    1
HIGGS HANGOVER: After the Large Hadron Collider (above) confirmed the Higgs boson, which validated the Standard Model, it’s produced nothing newsworthy, and is unlikely to, says physicist Sabine Hossenfelder. Shutterstock

    The Standard Model may be physicists’ best shot at the structure of fundamental matter, but it leaves them wanting. Many particle physicists think it is simply too ugly to be nature’s last word. The 25 particles of the Standard Model can be classified by three types of symmetries that correspond to three fundamental forces: The electromagnetic force, and the strong and weak nuclear forces. Physicists, however, would rather there was only one unified force. They would also like to see an entirely new type of symmetry, the so-called “supersymmetry,” because that would be more appealing.

    2
Supersymmetry builds on the Standard Model, with many new supersymmetric particles, represented here with a tilde (~) on them. (From the movie “Particle Fever,” reproduced by Mark Levinson)

    Oh, and additional dimensions of space would be pretty. And maybe also parallel universes. Their wish list is long.

    It has become common practice among particle physicists to use arguments from beauty to select the theories they deem worthy of further study. These criteria of beauty are subjective and not evidence-based, but they are widely believed to be good guides to theory development. The most often used criteria of beauty in the foundations of physics are presently simplicity and naturalness.

    By “simplicity,” I don’t mean relative simplicity, the idea that the simplest theory is the best (a.k.a. “Occam’s razor”). Relying on relative simplicity is good scientific practice. The desire that a theory be simple in absolute terms, in contrast, is a criterion from beauty: There is no deep reason that the laws of nature should be simple. In the foundations of physics, this desire for absolute simplicity presently shows in physicists’ hope for unification or, if you push it one level further, in the quest for a “Theory of Everything” that would merge the three forces of the Standard Model with gravity.

The other criterion of beauty, naturalness, requires that pure numbers that appear in a theory (i.e., those without units) should neither be very large nor very small; instead, these numbers should be close to one. Exactly how close these numbers should be to one is debatable, which is already an indicator of the non-scientific nature of this argument. Indeed, the inability of particle physicists to quantify just when a lack of naturalness becomes problematic highlights the fact that an unnatural theory is utterly unproblematic. It is just not beautiful.
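As a rough numerical sketch of what “unnatural” means here, consider the dimensionless ratio of the Planck scale to the Higgs mass, the number at the heart of the hierarchy problem. The values below are round, order-of-magnitude figures used purely for illustration:

```python
# Illustrative only: round, order-of-magnitude values for the two mass scales.
planck_mass_gev = 1.2e19  # Planck scale, in GeV
higgs_mass_gev = 125.0    # Higgs boson mass, in GeV

# A "natural" theory would prefer dimensionless numbers of order one;
# this ratio is instead enormous.
ratio = planck_mass_gev / higgs_mass_gev
print(f"{ratio:.1e}")  # ~1e17, nowhere near one
```

Nothing in the Standard Model forbids such a number; the naturalness argument objects to it only on aesthetic grounds.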

    Anyone who has a look at the literature of the foundations of physics will see that relying on such arguments from beauty has been a major current in the field for decades. It has been propagated by big players in the field, including Steven Weinberg, Frank Wilczek, Edward Witten, Murray Gell-Mann, and Sheldon Glashow. Countless books popularized the idea that the laws of nature should be beautiful, written, among others, by Brian Greene, Dan Hooper, Gordon Kane, and Anthony Zee. Indeed, this talk about beauty has been going on for so long that at this point it seems likely most people presently in the field were attracted by it in the first place. Little surprise, then, they can’t seem to let go of it.

    Trouble is, relying on beauty as a guide to new laws of nature is not working.

Since the 1980s, dozens of experiments have looked for evidence of unified forces, supersymmetric particles, and other particles invented to beautify the Standard Model. Physicists have conjectured hundreds of hypothetical particles, from “gluinos” and “wimps” to “branons” and “cuscutons,” each of which they invented to remedy a perceived lack of beauty in the existing theories. These particles are supposed to aid beauty, for example, by increasing the amount of symmetries, by unifying forces, or by explaining why certain numbers are small. Unfortunately, not a single one of those particles has ever been seen. Measurements have merely confirmed the Standard Model over and over again. And a theory of everything, if it exists, is as elusive today as it was in the 1970s. The Large Hadron Collider is only the most recent in a long series of searches that failed to confirm those beauty-based predictions.

    These decades of failure show that postulating new laws of nature just because they are beautiful according to human standards is not a good way to put forward scientific hypotheses. It’s not the first time this has happened. Historical precedents are not difficult to find. Relying on beauty did not work for Kepler’s Platonic solids, it did not work for Einstein’s idea of an eternally unchanging universe, and it did not work for the oh-so-pretty idea, popular at the end of the 19th century, that atoms are knots in an invisible ether. All of these theories were once considered beautiful, but are today known to be wrong. Physicists have repeatedly told me about beautiful ideas that didn’t turn out to be beautiful at all. Such hindsight is not evidence that arguments from beauty work, but rather that our perception of beauty changes over time.

    That beauty is subjective is hardly a breakthrough insight, but physicists are slow to learn the lesson—and that has consequences. Experiments that test ill-motivated hypotheses are at high risk to only find null results; i.e., to confirm the existing theories and not see evidence of new effects. This is what has happened in the foundations of physics for 40 years now. And with the new LHC results, it happened once again.

The data analyzed so far shows no evidence for supersymmetric particles, extra dimensions, or any other physics that would not be compatible with the Standard Model. In the past two years, particle physicists were excited about an anomaly in the interaction rates of different leptons. The Standard Model predicts these rates should be identical, but the data demonstrates a slight difference. This “lepton anomaly” has persisted in the new data, but—against particle physicists’ hopes—it did not increase in significance, and is hence not a sign of new particles. The LHC collaborations succeeded in measuring the violation of symmetry in the decay of composite particles called “D-mesons,” but the measured effect is, once again, consistent with the Standard Model. The data stubbornly repeat: Nothing new to see here.

    Of course it’s possible there is something to find in the data yet to be analyzed. But at this point we already know that all previously made predictions for new physics were wrong, meaning that there is now no reason to expect anything new to appear.

Yes, null results—like the recent LHC measurements—are also results. They rule out some hypotheses. But null results are not very useful results if you want to develop a new theory. A null result says: “Let’s not go this way.” A result says: “Let’s go that way.” If there are many ways to go, discarding some of them does not help much.

    To find the way forward in the foundations of physics, we need results, not null-results. When testing new hypotheses takes decades of construction time and billions of dollars, we have to be careful what to invest in. Experiments have become too costly to rely on serendipitous discoveries. Beauty-based methods have historically not worked. They still don’t work. It’s time that physicists take note.

    And it’s not like the lack of beauty is the only problem with the current theories in the foundations of physics. There are good reasons to think physics is not done. The Standard Model cannot be the last word, notably because it does not contain gravity and fails to account for the masses of neutrinos. It also describes neither dark matter nor dark energy, which are necessary to explain galactic structures.

    So, clearly, the foundations of physics have problems that require answers. Physicists should focus on those. And we currently have no reason to think that colliding particles at the next higher energies will help solve any of the existing problems. New effects may not appear until energies are a billion times higher than what even the next larger collider could probe. To make progress, then, physicists must, first and foremost, learn from their failed predictions.

    So far, they have not. In 2016, the particle physicists Howard Baer, Vernon Barger, and Jenny List wrote an essay for Scientific American arguing that we need a larger particle collider to “save physics.” The reason? A theory the authors had proposed themselves, that is natural (beautiful!) in a specific way, predicts such a larger collider should see new particles. This March, Kane, a particle physicist, used similar beauty-based arguments in an essay for Physics Today. And a recent comment in Nature Reviews Physics about a big, new particle collider planned in Japan once again drew on the same motivations from naturalness that have already not worked for the LHC. Even the particle physicists who have admitted their predictions failed do not want to give up beauty-based hypotheses. Instead, they have argued we need more experiments to test just how wrong they are.

    Will this latest round of null-results finally convince particle physicists that they need new methods of theory-development? I certainly hope so.

As an ex-particle physicist myself, I understand very well the desire to have an all-encompassing theory for the structure of matter. I can also relate to the appeal of theories such as supersymmetry or string theory. And, yes, I quite like the idea that we live in one of infinitely many universes that together make up the “multiverse.” But, as the latest LHC results drive home once again, the laws of nature care heartily little about what humans find beautiful.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 11:37 am on April 16, 2019 Permalink | Reply
    Tags: , , , , Particle Accelerators, , ,   

    From Symmetry: “A collision of light” 

    Symmetry Mag
    From Symmetry

    04/16/19
    Sarah Charley

    1
    Natasha Hartono

    One of the latest discoveries from the LHC takes the properties of photons beyond what your electrodynamics teacher will tell you in class.

    Professor Anne Sickles is currently teaching a laboratory class at the University of Illinois in which her students will measure what happens when two photons meet.

What they will find is that the overlapping waves of light get brighter when two peaks align and dimmer when a peak meets a trough. She tells her students that this is a process called interference, and that—unlike charged particles, which can merge, bond and interact—light waves can only add or subtract.
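The classical rule described here is just superposition: two equal waves double in amplitude when their peaks align and cancel when a peak meets a trough. A minimal Python illustration (a sketch, not the actual lab exercise):

```python
import math

def superpose(t, phase_shift):
    """Amplitude of two overlapping unit waves at time t, offset by phase_shift."""
    return math.sin(t) + math.sin(t + phase_shift)

# Sample one full cycle.
samples = [k / 1000 * 2 * math.pi for k in range(1000)]

# Peaks aligned (zero phase shift): the waves add -> brighter.
in_phase = max(superpose(t, 0.0) for t in samples)

# Peak meets trough (half-cycle shift): the waves cancel -> dimmer.
out_of_phase = max(abs(superpose(t, math.pi)) for t in samples)

print(in_phase)      # 2.0 (constructive interference)
print(out_of_phase)  # ~0.0 (destructive interference)
```

In the classical theory this adding and subtracting is all light can do; the quantum theory, as the article goes on to explain, allows more.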

    “We teach undergraduates the classical theory,” Sickles says. “But there are situations where effects forbidden in the classical theory are allowed in the quantum theory.”

    Sickles is a collaborator on the ATLAS experiment at CERN and studies what happens when particles of light meet inside the Large Hadron Collider.

    CERN ATLAS Credit CERN SCIENCE PHOTO LIBRARY

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    For most of the year, the LHC collides protons, but for about a month each fall, the LHC switches things up and collides heavy atomic nuclei, such as lead ions. The main purpose of these lead collisions is to study a hot and dense subatomic fluid called the quark-gluon plasma, which is harder to create in collisions of protons. But these ion runs also enable scientists to turn the LHC into a new type of machine: a photon-photon collider.

“This result demonstrates that photons can scatter off each other and change each other’s direction,” says Peter Steinberg, an ATLAS scientist at Brookhaven National Laboratory.

    When heavy nuclei are accelerated in the LHC, they are encased within an electromagnetic aura generated by their large positive charges.

    As the nuclei travel faster and faster, their surrounding fields are squished into disks, making them much more concentrated. When two lead ions pass closely enough that their electromagnetic fields swoosh through one another, the high-energy photons which ultimately make up these fields can interact. In rare instances, a photon from one lead ion will merge with a photon from an oncoming lead ion, and they will ricochet in different directions.

    However, according to Steinberg, it’s not as simple as two solid particles bouncing off each other. Light particles are both chargeless and massless, and must go through a quantum mechanical loophole (literally called a quantum loop) to interact with one another.

    “That’s why this process is so rare,” he says. “They have no way to bounce off of each other without help.”

    When the two photons see each other inside the LHC, they sometimes overreact with excitement and split themselves into an electron and positron pair. These electron-positron pairs are not fully formed entities, but rather unstable quantum fluctuations that scientists call virtual particles. The four virtual particles swirl into each other and recombine to form two new photons, which scatter off at weird angles into the detector.

    “It’s like a quantum-mechanical square dance,” Steinberg says.

    When ATLAS first saw hints of this process in 2017, they had only 13 candidate events with the correct characteristics (collisions that resulted in two low-energy photons inside the detector and nothing else).

    After another two years of data taking, they have now collected 59 candidate events, bumping this original observation into the statistical certainty of a full-fledged discovery.
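The jump from “hints” to “full-fledged discovery” is a statement about statistical significance: how unlikely the observed count would be if it were only background. A minimal sketch of that idea, using the 13-event figure from 2017 against an assumed background expectation of 2.6 events (a hypothetical number chosen for illustration, not ATLAS’s published background estimate):

```python
import math
from statistics import NormalDist

def poisson_tail(n_obs, background_mean):
    """P(X >= n_obs) for a Poisson-distributed background count."""
    cdf = sum(math.exp(-background_mean) * background_mean**k / math.factorial(k)
              for k in range(n_obs))
    return 1.0 - cdf

# Hypothetical illustration: 13 observed events, assumed background of 2.6.
p_value = poisson_tail(13, 2.6)
significance = NormalDist().inv_cdf(1.0 - p_value)  # one-sided z-score

print(f"p = {p_value:.1e}, significance = {significance:.1f} sigma")
```

With these assumed numbers the tail probability comes out to a few parts per million, around 4.5 sigma—“evidence” by the particle-physics convention, but short of the 5 sigma threshold customarily required to claim a discovery. More data shrinks the p-value, which is how 59 events can cross the line that 13 could not.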

    Steinberg sees this discovery as a big win for quantum electrodynamics, a theory about the quantum behavior of light that predicted this interaction. “This amazingly precise theory, which was developed in the first half of the 20th century, made a prediction that we are finally able to confirm many decades later.”

    Sickles says she is looking forward to exploring these kinds of light-by-light interactions and figuring out what else they could teach us about the laws of physics. “It’s one thing to see something,” she says. “It’s another thing to study it.”

See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     