Tagged: CERN LHC Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 12:24 pm on April 30, 2020 Permalink | Reply
    Tags: "The large boson-boson collider", , , CERN LHC, , , , , , Weak Interaction   

    From Symmetry: “The large boson-boson collider” 


    04/30/20
    Sarah Charley

Image courtesy of CERN.

    Scientists study rare, one-in-a-trillion heavy boson collisions happening inside the LHC.

    The Large Hadron Collider is the world’s most powerful particle accelerator. It accelerates and smashes protons and other atomic nuclei to study the fundamental properties of matter.


    Normally scientists look at the particles produced during these collisions to learn about the laws of nature. But scientists can also learn about subatomic matter by peering into the collisions themselves and asking: What exactly is doing the colliding?

    When the answer to that question involves rarely seen, massive particles, it gives scientists a unique way to study the Higgs boson.

    Protons are not solid spheres, but composite particles containing even tinier components called quarks and gluons.

Image: the quark structure of the proton. Credit: Arpad Horvath

    “As far as we know the quarks and gluons are point-like particles with no internal structure,” says Aram Apyan, a research associate at the US Department of Energy’s Fermi National Accelerator Laboratory.

    According to Apyan, two quarks cannot actually hit each other; they don’t have volume or surfaces. So what really happens when these point-like particles collide?

    “When we talk about two quarks colliding, what we really mean is that they are very close to each other spatially and exchanging particles,” says Richard Ruiz, a theorist at Université Catholique de Louvain in Belgium. “Namely, they exchange force-carrying bosons.”

    All elementary matter particles (like quarks and electrons) communicate with each other through bosons. For instance, quarks know to bind together by throwing bosons called gluons back and forth, which carry the message, “Stick together!”

    Almost every collision inside the LHC starts with an exchange of bosons (the only exceptions are when matter particles meet antimatter particles).

The lion’s share of LHC collisions happens when two passing energetic gluons meet, fuse and then transform into all sorts of particles through the wonders of quantum mechanics.

    Gluons carry the strong interaction, which pulls quarks together into particles like protons and neutrons. Gluon-gluon collisions are so powerful that the protons they are a part of are ripped apart and the original quarks in those protons are consumed.

In extremely rare instances, colliding quarks can also interact through a different force: the weak interaction, which is carried by the massive W and Z bosons. The weak interaction mediates all nuclear decay and fusion, such as when the protons in the center of the sun are squished and squeezed into helium nuclei.

The weak interaction passes the message, “Time to change!” and inspires quarks to take on a new identity–for instance, to change from a down quark to an up quark or vice versa.

    Although it may seem counterintuitive, the W and Z bosons that carry the weak interaction are extremely heavy–roughly 80 times more massive than the protons the LHC smashes together. For two minuscule quarks to produce two enormous W or Z bosons simultaneously, they need access to a big pot of excess energy.

    That’s where the LHC comes in; by accelerating protons to nearly the speed of light, it produces the most energetic collisions ever seen in a particle accelerator. “The LHC is special,” Ruiz says. “The LHC is the first collider in which we have evidence of W and Z boson scattering; the weak interaction bosons themselves are colliding.”

    Even inside the LHC, weak interaction boson-boson collisions are extremely rare. This is because the range of the weak interaction extends to only about 0.1% of the diameter of a proton. (Compare this to the range of the strong interaction, which is equivalent to the proton’s diameter.)
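That tiny range follows from the W and Z masses. As a rough Yukawa-range estimate (standard textbook numbers, added here for illustration, not from the article):

\lambda_W \;\approx\; \frac{\hbar}{m_W c} \;=\; \frac{\hbar c}{m_W c^2} \;=\; \frac{197.3\ \mathrm{MeV\,fm}}{80{,}400\ \mathrm{MeV}} \;\approx\; 2.5\times10^{-3}\ \mathrm{fm},

which is indeed about 0.1% of the proton’s roughly 1.7 fm diameter.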

    “This range is quite small,” Apyan says. “Two quarks have to be extremely close and radiate a W or Z boson simultaneously for there to be a chance of the bosons colliding.”

    Apyan studies collisions in which two colliding quarks simultaneously release a W or Z boson, which then scatter off one another before transforming into more stable particles. Unlike other processes, the W and Z boson collisions maintain their quarks, which then fly off into the detector as the proton falls apart. “This process has a nice signature,” Apyan says. “The remnants of the original quarks end up in our detector, and we see them as jets of particles very close to the beampipe.”

The probability of this happening during an LHC collision is about one in a trillion. Luckily, the LHC generates about 600 million proton-proton collisions every second. At this rate, scientists are able to see this extremely rare event about once every half hour when the LHC is running.
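The arithmetic behind that rate is worth checking; a quick sketch using the article’s round numbers:

# Expected rate of W/Z boson-boson scattering events, from the figures above.
collision_rate = 600e6   # proton-proton collisions per second
probability = 1e-12      # chance a given collision produces boson-boson scattering
seconds_per_event = 1 / (collision_rate * probability)
print(f"one event every {seconds_per_event / 60:.0f} minutes")   # ~28 minutes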

    These heavy boson-boson collisions inside the LHC provide physicists with a unique view of the subatomic world, Ruiz says.

    Creating and scattering bosons allows physicists to see how their mathematical models hold up under stringent experimental tests. This can allow them to search for physics beyond the Standard Model.

    The scattering of W and Z bosons is a particularly pertinent test for the strength of the Higgs field. “The coupling strength between the Higgs boson and W and Z bosons is proportional to the masses of the W and Z bosons, and this raises many interesting questions,” Apyan says.

    Even small tweaks to the Higgs field could have major implications for the properties of Z and W bosons and how they ricochet off each other. By studying how these particles collide inside the LHC, scientists are able to open yet another window into the properties of the Higgs.

See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:38 am on March 30, 2020 Permalink | Reply
    Tags: "Big Labs Replace Data Taking with New Priorities", , Brookhaven National Laboratory’s National Synchrotron Light Source II (NSLS-II), , CERN LHC, ,   

    From “Physics”: “Big Labs Replace Data Taking with New Priorities” 


    March 30, 2020
    Katherine Wright

    Large research facilities have curtailed data collection and shut their doors—but their scientists are busier than ever, and some have joined the fight against COVID-19.

Image: Brookhaven National Laboratory.

    Despite being stuck at home, John Hill has never been busier. As the director of Brookhaven National Laboratory’s National Synchrotron Light Source II (NSLS-II)—a state-of-the-art x-ray facility in New York—Hill spent the middle of March rapidly ramping down experiments in response to the COVID-19 pandemic. As of midday March 23rd, only two of the 28 beamlines at NSLS-II remained operational, and only a handful of staff were onsite.


    Like many large facilities, NSLS-II shut its doors to comply with government guidelines and to keep staff safe and healthy. These rapid closures have, in some cases, brought experiments to a grinding halt—LIGO in the US and Virgo in Italy both stopped their search for gravitational-wave signals on Friday, more than a month earlier than they’d planned before the pandemic. At other places, such as CERN in Switzerland, where detectors were already off, long-awaited upgrades are now on indefinite hold.


    The scientists in charge of major experiments are now scrambling to rethink the next few months. And even though most researchers from these facilities are housebound, they are still hard at work. Most have switched their attention to data analysis, paper writing, and software development, interspersed with virtual meetings. A small number of others remain onsite, busy with an unexpected focus: new experiments to understand the virus that causes COVID-19.

    “The last three weeks have been nonstop planning,” says David Reitze, the director of LIGO, who spoke from his home in California. “We had to react very quickly.” Reitze has been in daily meetings with his LIGO and Virgo colleagues to implement remote working procedures for its 1300 international team members, who are used to frequent in-person meetings. Until Friday, LIGO had kept a skeleton crew running its two detectors, but the collaboration decided to turn the detectors off. Reitze says it was sad to end early, but he wanted to keep the staff safe.

    In addition to making contingency plans, researchers are busy replacing their usual in-person meetings with virtual ones. LIGO and Virgo canceled their biannual conference, scheduled for March 16th at Lake Geneva in Wisconsin. “The risk of becoming a central spreader of the disease just didn’t seem like a good idea,” says Patrick Brady, the current spokesperson for the LIGO collaboration. Instead, attendees live-streamed talks, with around 200 people watching the plenary session. “It was a chance to focus on science,” Reitze says. “That brought a bit of normalcy.”

    The researchers interviewed for this story say that they expect little to no impact on the scientific output from their institutions—for now. Many places have loads of data to analyze already. The gravitational-wave community has 11 months of data and over 50 events to analyze from the latest observation run. Scientists using Diamond Light Source, a synchrotron in the UK, just finished a cycle of experiments.

    Diamond Light Source, located at the Harwell Science and Innovation Campus in Oxfordshire U.K.

    And CERN has been undergoing upgrades since December 2018, so many of the scientists were already focused on data analysis. “Eighteen months after a shutdown you often get a spike in publications as people write up all the work that they have perhaps fallen behind with,” says Andrew Harrison, the CEO of Diamond.

    That said, this shutdown is different from most. LIGO and Virgo expect to submit papers from this run later than initially planned, having added a four-week extension to their writing activities. “Because of the sense of anxiety that many people are feeling, we decided to relax our timelines,” Brady says.

    Physics output could be impacted if the shutdowns extend beyond a few months. At CERN, for example, the ongoing instrument and equipment upgrades have been deemed nonessential activities and are now on hiatus. “We had to drop the screwdrivers,” says Giovanni Passaleva, the spokesperson for CERN’s Large Hadron Collider Beauty (LHCb) experiment.


Stopping the upgrade could potentially delay the LHC’s planned restart in May 2021. But with no equipment to attend to, Passaleva notes that updates to LHCb’s software are “going faster than before.” And his group is still committed to its daily coffee hour, only now they do it online. “It’s very important that we keep connections with each other,” he says.

    In-person interactions are still possible at some labs. At NSLS-II, a handful of scientists are onsite helping researchers from pharmaceutical companies and academia study the crystal structures of synthetic versions of proteins found in the virus that causes COVID-19. Their goal is to use this information to develop drugs for treating those infected with the disease, Hill says. Similar experiments are ongoing at the Advanced Photon Source at Argonne National Laboratory, Illinois, and will start tomorrow at two beamlines at Diamond, which as of Friday had received a dozen applications for experiments to study the proteins in the virus.


    Hill says it’s good for NSLS-II that it can continue to contribute. “We don’t feel we are sitting powerless, watching this disease come—we are actively trying to fight it.” Harrison echoes this sentiment, saying he is pleased that basic science is considered essential work and that Diamond can contribute in the effort to understand the new disease. “It’s very positive that governments are engaging [with scientists],” he says. He also thinks the situation has forced scientists to refocus their priorities. “The things you thought were important just completely change,” he says.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments (physics@aps.org).

     
  • richardmitnick 8:41 am on March 25, 2020 Permalink | Reply
    Tags: "Plasma polarised by spin-orbit effect", , , , , CERN LHC, , , ,   

    From CERN Courier: “Plasma polarised by spin-orbit effect” 



    23 March 2020

    A report from the ALICE experiment

Fig. 1. The spin alignment of (spin-1) K*0 mesons (red circles) can be characterised by deviations from ρ00 = 1/3, estimated here versus their transverse momenta, pT. The same variable was estimated for (spin-0) K0S mesons (magenta stars), and for K*0 mesons produced in proton–proton collisions with negligible angular momentum (hollow orange circles), as systematic tests. Credit: CERN

Spin-orbit coupling causes fine structure in atomic physics and shell structure in nuclear physics, and is a key ingredient in the field of spintronics in materials sciences. It is also expected to affect the development of the quickly rotating quark–gluon plasma (QGP) created in non-central collisions of lead nuclei at LHC energies. As such plasmas are created by the collisions of lead nuclei that almost miss each other, they have very high angular momenta of the order of 10⁷ħ – equivalent to the order of 10²¹ revolutions per second. While the extreme magnetic fields generated by spectating nucleons (of the order of 10¹⁴ T, CERN Courier Jan/Feb 2020 p17) quickly decay as the spectator nucleons pass by, the plasma’s angular momentum is sustained throughout the evolution of the system as it is a conserved quantity. These extreme angular momenta are expected to lead to spin-orbit interactions that polarise the quarks in the plasma along the direction of the angular momentum of the plasma’s rotation. This should in turn cause the spins of vector (spin-1) mesons to align if hadronisation proceeds via the recombination of partons or by fragmentation. To study this effect, the ALICE collaboration recently measured the spin alignment of the decay products of neutral K* and φ vector mesons produced in non-central Pb–Pb collisions.

Spin alignment can be studied by measuring the angular distribution of the decay products of the vector mesons. It is quantified by the probability ρ00 of finding a vector meson in the spin state 0 with respect to the direction of the angular momentum of the rotating QGP, which is approximately perpendicular to the plane defined by the beam direction and the impact parameter of the two colliding nuclei. In the absence of spin-alignment effects, the probability of finding a vector meson in any of the three spin states (–1, 0, 1) should be equal, with ρ00 = 1/3.
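Concretely, ρ00 is extracted from the polar-angle distribution of the decay daughters relative to the quantization axis, which takes the standard form used in such spin-alignment analyses:

\frac{dN}{d\cos\theta^{*}} \;\propto\; (1-\rho_{00}) + (3\rho_{00}-1)\cos^{2}\theta^{*},

which is flat for ρ00 = 1/3 and curves as ρ00 deviates from that value.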

    The ALICE collaboration measured the angular distributions of neutral K* and φ vector mesons via their hadronic decays to Kπ and KK pairs, respectively. ρ00 was found to deviate from 1/3 for low-pT and mid-central collisions at a level of 3σ (figure 1). The corresponding results for φ mesons show a deviation of ρ00 values from 1/3 at a level of 2σ. The observed pT dependence of ρ00 is expected if quark polarisation via spin-orbit coupling is subsequently transferred to the vector mesons by hadronisation, via the recombination of a quark and an anti-quark from the quark–gluon plasma. The data are also consistent with the initial angular momentum of the hot and dense matter being highest for mid-central collisions and decreasing towards zero for central and peripheral collisions.

    The results are surprising, however, as corresponding quark-polarisation values obtained from studies with Λ hyperons are compatible with zero. A number of systematic tests have been carried out to verify these surprising results. K0S mesons do indeed yield ρ00 = 1/3, indicating no spin alignment, as must be true for a spin-zero particle. For proton–proton collisions, the absence of initial angular momentum also leads to ρ00 = 1/3, consistent with the observed neutral K* spin alignment being the result of spin-orbit coupling.

    The present measurements are a step towards experimentally establishing possible spin-orbit interactions in the relativistic-QCD matter of the quark–gluon plasma. In the future, higher statistics measurements in Run 3 will significantly improve the precision, and studies with the charged K*, which has a magnetic moment seven times larger than neutral K*, may even allow a direct observation of the effect of the strong magnetic fields initially experienced by the quark–gluon plasma.
    Further reading

    ALICE Collaboration 2019 arXiv:1910.14408.

    ALICE Collaboration 2019 arXiv:1909.01281.

See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition


     
  • richardmitnick 3:14 pm on March 10, 2020 Permalink | Reply
    Tags: "Accounting for the Higgs", , , CERN LHC, , , , , ,   

    From Symmetry: “Accounting for the Higgs” 


    03/10/20
    Sarah Charley

    Only a fraction of collision events that look like they produce a Higgs boson actually produce a Higgs boson. Luckily, it doesn’t matter.

Image: CERN CMS Higgs event, May 27, 2012.

    I’ll let you in on a little secret: Even though physicists have produced millions of Higgs bosons at the Large Hadron Collider, they’ve never actually seen one. Higgs bosons are fragile things that dissolve immediately after they’re born. But as they die, they produce other particles, which, if they’re created at the LHC, can travel through a particle detector and leave recognizable signatures.

    Here’s another secret: Higgs signatures are identical to the signatures of numerous other processes. In fact, every time the Higgs signs its name in a detector, there are many more background processes leaving the exact same marks.

    For instance, one of the Higgs boson’s cleanest signatures is two photons with a combined mass of around 125 billion electronvolts. But for every 10 diphotons that look like a Higgs signature, only about one event actually belongs to a Higgs.

    So how can scientists study something that they cannot see and cannot isolate? They employ the same technique FBI agents use to uncover illegal money laundering schemes: accounting.

    In money laundering, “dirty” money (from illegal activities) is mixed with “clean” money from a legitimate business like a car wash. It all looks the same, so determining which Benjamins came from drugs versus which came from detailing is impossible. But agents don’t need to look at the individual dollars; they just need to look for suspiciously large spikes in profit that cannot be explained by regular business activities.

    In physics, the accounting comes from a much-loved set of equations called the Standard Model.


    Physicists have spent decades building and perfecting the Standard Model, which tells them what percentage of the time different subatomic processes should happen. Scientists know which signatures are associated with which processes, so if they see a signature more often than expected, it means there is something happening outside the purview of the Standard Model: a new process.
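A toy version of that accounting, with invented numbers (purely illustrative, not from any real analysis):

import math

# Compare observed event counts for one signature against the Standard Model
# prediction; an excess that is large relative to the statistical noise
# signals a new process.
expected = 10000                    # events predicted by the Standard Model
observed = 10550                    # events actually counted
excess = observed - expected
significance = excess / math.sqrt(expected)   # simple Poisson significance
print(f"excess of {excess} events, about {significance:.1f} sigma")   # ~5.5 sigma

By convention, physicists treat roughly 5 sigma as the threshold for claiming a discovery; the 2012 Higgs observation had to meet that standard.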

    Clever accounting is how scientists originally discovered the Higgs boson in 2012. Theorists predicted what the Higgs signatures should look like, and when physicists went searching, they consistently saw some of these signatures more frequently than they could explain without the Higgs boson. When scientists added the mathematics for the Higgs boson into the equations, the predictions matched the data.

    Today, physicists use this accounting method to search for new particles. Many of these new particles are predicted to be rarer than Higgs bosons (for reference, Higgs bosons are produced in about one in a billion collisions). Many processes are also less clear-cut, and just the act of standardizing the accounting is a challenge. (To return to the money laundering analogy, it would be like FBI agents investigating an upscale bar, where a sudden excess could be explained by a generous tip.)

    To find these complex and subtle signs of mischief, scientists need a huge amount of data and a finely tuned model. Future runs of the LHC will be dedicated to building up these enormous datasets so that scientists can dig through the books for numbers that the Standard Model cannot explain.

See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:38 pm on January 29, 2020 Permalink | Reply
    Tags: "Particle Physics Turns to Quantum Computing for Solutions to Tomorrow’s Big-Data Problems", , CERN LHC, , , , ,   

    From Lawrence Berkeley National Lab: “Particle Physics Turns to Quantum Computing for Solutions to Tomorrow’s Big-Data Problems” 


    January 29, 2020
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Berkeley Lab researchers testing several techniques, technologies to be ready for the incoming deluge of particle data.

Display of a simulated High-Luminosity Large Hadron Collider (HL-LHC) particle collision event in an upgraded ATLAS detector. The event has an average of 200 collisions per particle bunch crossing. (Credit: ATLAS Collaboration/CERN)

    Giant-scale physics experiments are increasingly reliant on big data and complex algorithms fed into powerful computers, and managing this multiplying mass of data presents its own unique challenges.

    To better prepare for this data deluge posed by next-generation upgrades and new experiments, physicists are turning to the fledgling field of quantum computing to find faster ways to analyze the incoming info.


Profiles of three student researchers who have participated in Berkeley Lab-led quantum-computing research projects appear later in this article.

    In a conventional computer, memory takes the form of a large collection of bits, and each bit has only two values: a one or zero, akin to an on or off position. In a quantum computer, meanwhile, data is stored in quantum bits, or qubits. A qubit can represent a one, a zero, or a mixed state in which it is both a one and a zero at the same time.

    By tapping into this and other quantum properties, quantum computers hold the potential to handle larger datasets and quickly work through some problems that would trip up even the world’s fastest supercomputers. For other types of problems, though, conventional computers will continue to outperform quantum machines.
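A minimal state-vector sketch of that difference (illustrative Python, not any lab’s code):

import numpy as np

# A classical bit is 0 or 1. A qubit's state is a complex two-component
# vector a|0> + b|1> with |a|^2 + |b|^2 = 1.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
psi = (zero + one) / np.sqrt(2)    # equal superposition: "one and zero at the same time"

# Measurement collapses the state; the squared amplitudes give the odds.
p0, p1 = abs(psi[0])**2, abs(psi[1])**2
print(f"{p0:.2f} {p1:.2f}")        # 0.50 0.50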

    The High Luminosity Large Hadron Collider (HL-LHC) Project, a planned upgrade of the world’s largest particle accelerator at the CERN laboratory in Europe, will come on line in 2026.


    It will produce billions of particle events per second – five to seven times more data than its current maximum rate – and CERN is seeking new approaches to rapidly and accurately analyze this data.

    In these particle events, positively charged subatomic particles called protons collide, producing sprays of other particles, including quarks and gluons, from the energy of the collision. The interactions of particles can also cause other particles – like the Higgs boson – to pop into existence.

    Tracking the creation and precise paths (called “tracks”) of these particles as they travel through layers of a particle detector – while excluding the unwanted mess, or “noise” produced in these events – is key in analyzing the collision data.

    The data will be like a giant 3D connect-the-dots puzzle that contains many separate fragments, with little guidance on how to connect the dots.

    To address this next-gen problem, a group of student researchers and other scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have been exploring a wide range of new solutions.

    One such approach is to develop and test a variety of algorithms tailored to different types of quantum-computing systems. Their aim: Explore whether these technologies and techniques hold promise for reconstructing these particle tracks better and faster than conventional computers can.

    Particle detectors work by detecting energy that is deposited in different layers of the detector materials. In the analysis of detector data, researchers work to reconstruct the trajectory of specific particles traveling through the detector array. Computer algorithms can aid this process through pattern recognition, and particles’ properties can be detailed by connecting the dots of individual “hits” collected by the detector and correctly identifying individual particle trajectories.
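As a cartoon of that connect-the-dots step, here is a deliberately naive track builder that links each hit to the nearest hit in the next detector layer (toy data; real reconstruction uses far more sophisticated pattern recognition, such as combinatorial Kalman filters):

import math

# Hits are (x, y) points grouped by detector layer; two particles, three layers.
layers = [
    [(0.0, 0.1), (0.0, -0.2)],   # layer 0
    [(1.0, 0.3), (1.0, -0.5)],   # layer 1
    [(2.0, 0.5), (2.0, -0.8)],   # layer 2
]

tracks = [[hit] for hit in layers[0]]    # seed one candidate track per first-layer hit
for layer in layers[1:]:
    for track in tracks:
        # Greedily extend each track with the closest hit in the next layer.
        track.append(min(layer, key=lambda hit: math.dist(hit, track[-1])))

for i, track in enumerate(tracks):
    print(f"track {i}: {track}")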

A new wheel-shaped muon detector is part of the ATLAS detector upgrade at CERN. This wheel-shaped detector measures more than 30 feet in diameter. (Credit: Julien Marius Ordan/CERN)

    Heather Gray, an experimental particle physicist at Berkeley Lab and a UC Berkeley physics professor, leads the Berkeley Lab-based R&D effort – Quantum Pattern Recognition for High-Energy Physics (HEP.QPR) – that seeks to identify quantum technologies to rapidly perform this pattern-recognition process in very-high-volume collision data. This R&D effort is funded as part of the DOE’s QuantISED (Quantum Information Science Enabled Discovery for High Energy Physics) portfolio.

    The HEP.QPR project is also part of a broader initiative to boost quantum information science research at Berkeley Lab and across U.S. national laboratories.

    Other members of the HEP.QPR group are: Wahid Bhimji, Paolo Calafiura, Wim Lavrijsen, and former postdoctoral researcher Illya Shapoval, who explored quantum algorithms for associative memory. Bhimji is a big data architect at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC). Calafiura is chief software architect of CERN’s ATLAS experiment and a member of Berkeley Lab’s Computational Research Division (CRD). And Lavrijsen is a CRD software engineer who is also involved in CERN’s ATLAS experiment.

    Members of the HEP.QPR project have collaborated with researchers at the University of Tokyo and from Canada on the development of quantum algorithms in high-energy physics, and jointly organized a Quantum Computing Mini-Workshop at Berkeley Lab in October 2019.

    Gray and Calafiura were also involved in a CERN-sponsored competition, launched in mid-2018, that challenged computer scientists to develop machine-learning-based techniques to accurately reconstruct particle tracks using a simulated set of HL-LHC data known as TrackML. Machine learning is a form of artificial intelligence in which algorithms can become more efficient and accurate through a gradual training process akin to human learning. Berkeley Lab’s quantum-computing effort in particle-track reconstruction also utilizes this TrackML set of simulated data.

    Berkeley Lab and UC Berkeley are playing important roles in the rapidly evolving field of quantum computing through their participation in several quantum-focused efforts, including The Quantum Information Edge, a research alliance announced in December 2019.

    The Quantum Information Edge is a nationwide alliance of national labs, universities, and industry advancing the frontiers of quantum computing systems to address scientific challenges and maintain U.S. leadership in next-generation information technology. It is led by the DOE’s Berkeley Lab and Sandia National Laboratories.

The series of articles listed below profiles three student researchers who have participated in Berkeley Lab-led efforts to apply quantum computing to the pattern-recognition problem in particle physics:

Lucy Linder, while working as a researcher at Berkeley Lab, developed her master’s thesis – supervised by Berkeley Lab staff scientist Paolo Calafiura – about the potential application of a quantum-computing technique called quantum annealing for finding particle tracks. She remotely accessed quantum-computing machines at D-Wave Systems Inc. in Canada and at Los Alamos National Laboratory in New Mexico.

Linder’s approach was to first format the simulated particle-track data as something known as a QUBO (quadratic unconstrained binary optimization) problem, which casts the task as an equation over binary variables, each either a one or a zero. This QUBO formatting also helped prepare the data for analysis by a quantum annealer, which uses qubits to help identify the best possible solution by applying a physics principle that describes how systems naturally seek the lowest-possible energy state. (A toy QUBO appears after the link below.)
    Read More
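To make the idea concrete, here is a toy QUBO with an invented three-segment cost matrix, solved by brute-force enumeration rather than an annealer (illustrative only, not Linder’s formulation):

import itertools
import numpy as np

# A QUBO asks for the binary vector x minimizing x^T Q x. Here x_i = 1 means
# "keep candidate track segment i"; diagonal entries reward good segments and
# the off-diagonal entry penalizes keeping two conflicting segments together.
Q = np.array([
    [-1.0, 0.0, 2.0],    # segment 0: good, but conflicts with segment 2
    [ 0.0, -1.0, 0.0],   # segment 1: good, no conflicts
    [ 0.0, 0.0, -0.5],   # segment 2: weaker candidate
])

best = min(itertools.product([0, 1], repeat=3),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print(best)   # (1, 1, 0): keep segments 0 and 1, drop the conflicting one

A quantum annealer searches for the same minimum physically, by relaxing a network of qubits into its lowest-energy configuration.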

Eric Rohm, an undergraduate student working on a contract at Berkeley Lab as part of the DOE’s Science Undergraduate Laboratory Internship program, developed a quantum approximate optimization algorithm (QAOA) using quantum-computing resources at Rigetti Computing in Berkeley, California. He was supervised by Berkeley Lab physicist Heather Gray.

    This approach used a blend of conventional and quantum computing techniques to develop a custom algorithm. The algorithm, still in refinement, has been tested on the Rigetti Quantum Virtual Machine, a conventional computer that simulates a small quantum computer. The algorithm may eventually be tested on a Rigetti quantum processing unit that is equipped with actual qubits.
    Read More
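For a flavor of what a QAOA actually does, here is a bare-bones single-layer (p = 1) QAOA for MaxCut on one two-qubit edge, simulated with dense matrices. This only illustrates the alternating cost/mixer structure, under toy parameters; it is not the algorithm Rohm developed:

import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)

C = np.diag([0, 1, 1, 0]).astype(complex)   # cut value of |00>, |01>, |10>, |11>
B = np.kron(X, I) + np.kron(I, X)           # transverse-field mixer
plus = np.ones(4, dtype=complex) / 2        # |++>: uniform superposition start

def expected_cut(gamma, beta):
    # One QAOA layer: a cost rotation, then a mixer rotation.
    psi = expm(-1j * beta * B) @ expm(-1j * gamma * C) @ plus
    return (psi.conj() @ C @ psi).real

angles = [(g, b) for g in np.linspace(0, np.pi, 40)
          for b in np.linspace(0, np.pi, 40)]
best = max(angles, key=lambda ab: expected_cut(*ab))
print(best, expected_cut(*best))   # the best angles push the expected cut toward 1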

Amitabh Yadav, a student research associate at Berkeley Lab since November who is supervised by Gray and Berkeley Lab software engineer Wim Lavrijsen, is working to apply a quantum version of a conventional technique called the Hough transform to identify and reconstruct particle tracks using IBM’s Quantum Experience, a form of quantum computing.

The classical Hough transform technique can be used to detect specific features such as lines, curves, and circles in complex patterns, and the quantum Hough transform technique could potentially call out more complex shapes from exponentially larger datasets.
Read More
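For reference, here is a compact sketch of the classical Hough transform finding one straight line among a handful of 2D hits (toy data and coarse binning, purely illustrative):

import math

# Collinear points show up as a peak in (theta, rho) accumulator space,
# where rho = x*cos(theta) + y*sin(theta).
hits = [(0.0, 0.1), (1.0, 1.1), (2.0, 2.1), (3.0, 3.1),   # four hits on the line y = x + 0.1
        (0.5, 2.0)]                                        # one unrelated "noise" hit

votes = {}
for x, y in hits:
    for k in range(180):                        # sweep theta across [0, 180) degrees
        theta = math.radians(k)
        rho = round(x * math.cos(theta) + y * math.sin(theta), 1)   # coarse rho bins
        votes[(k, rho)] = votes.get((k, rho), 0) + 1

(k, rho), count = max(votes.items(), key=lambda kv: kv[1])
print(f"best line: theta={k} deg, rho={rho}, votes={count}")   # peak bin collects the 4 collinear hits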

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.


     
  • richardmitnick 1:19 pm on January 21, 2020 Permalink | Reply
    Tags: , , , CERN LHC, , , ,   

    From Symmetry: “The other dark matter candidate” 


    01/21/20
    Laura Dattaro

Inside the ADMX experiment hall at the University of Washington. Credit: Mark Stone/University of Washington.

    As technology improves, scientists discover new ways to search for theorized dark matter particles called axions.

    In the early 1970s, physics had a symmetry problem. According to the Standard Model, the guiding framework of particle physics, a symmetry between particles and forces in our universe and a mirror version should be broken.


    It was broken by the weak force, a fundamental force involved in processes like radioactive decay.

    This breaking should feed into the interactions mediated by another fundamental force, the strong force. But experiments show that, unlike the weak force, the strong force obeys mirror symmetry perfectly. No one could explain it.

    The problem confounded physicists for years. Then, in 1977, physicists Roberto Peccei and Helen Quinn found a solution: a mechanism that, if it existed, would cause the strong force to obey this symmetry and right the Standard Model.

    Shortly after, Frank Wilczek and Steven Weinberg—both of whom went on to win the Nobel Prize—realized that this mechanism creates an entirely new particle. Wilczek ultimately dubbed this new particle the axion, after a dish detergent with the same name, for its ability to “clean up” the symmetry problem.

    Several years later, the theoretical axion was found not only to solve the symmetry problem, but also to be a possible candidate for dark matter, the missing matter that scientists think makes up 85% of the universe but the true nature of which is unknown.


    Despite its theoretical promise, though, the axion stayed in relative obscurity, due to a combination of its strange nature and being outshone by another new dark matter candidate, called a WIMP, that seemed even more like a sure thing.

    But today, four decades after they were first theorized, axions are once again enjoying a moment in the sun, and may even be on the verge of detection, poised to solve two major problems in physics at once.

    “I think WIMPs have one last hurrah as these multiton experiments come online,” says MIT physicist Lindley Winslow. “Since they’re not done building those yet, we have to take a deep breath and see if we find something.

    “But if you ask me the thing we need to be ramping up, it’s axions. Because the axion has to be there, or we have other problems.”

    Around the time the axion was proposed, physicists were developing a theory called Supersymmetry, which called for a partner for every known particle.


    The newly proposed dark matter candidate called a WIMP—or weakly interacting massive particle—fit beautifully with the theory of Supersymmetry, making physicists all but certain they’d both be discovered.

    Even more promising was that both the supersymmetric particles and the theorized WIMPs could be detected at the Large Hadron Collider at CERN.


    “People just knew nature was going to deliver supersymmetric particles at the LHC,” says University of Washington physicist Leslie Rosenberg. “The LHC was a machine built to get a Nobel Prize for detecting Supersymmetry.”

Experiments at the LHC made another Nobel-worthy discovery: the Higgs boson. But evidence of both WIMPs and Supersymmetry has yet to appear.


    Axions are even trickier than WIMPs. They’re theorized to be extremely light—a millionth of an electronvolt or so, about a trillion times lighter than the already tiny electron—making them next to impossible to produce or study in a traditional particle physics experiment. They even earned the nickname “invisible axion” for the unlikeliness they’d ever be seen.

    But axions don’t need to be made in a detector to be discovered. If axions are dark matter, they were created at the beginning of the universe and exist, free-floating, throughout space. Theorists believe they also should be created inside of stars, and because they’re so light and weakly interacting, they’d be able to escape into space, much like other lightweight particles called neutrinos. That means they exist all around us, as many as 10 trillion per cubic centimeter, waiting to be detected.

    In 1983, newly minted physics professor Pierre Sikivie decided to tackle this problem, taking inspiration from a course he had just taught on electromagnetism. Sikivie discovered that axions have another unusual property: In the presence of an electromagnetic field, they should sometimes spontaneously convert to easily detectable photons.

    “What I found is that it was impossible or extremely difficult to produce and detect axions,” Sikivie says. “But if you ask a less ambitious goal of detecting the axions that are already there, axions already there either as dark matter or as axions emitted by the sun, that actually became feasible.”

    When Rosenberg, then a postdoc working on cosmic rays at the University of Chicago, heard about Sikivie’s breakthrough—what he calls “Pierre’s Great Idea”—he knew he wanted to dedicate his work to the search.

    “Pierre’s paper hit me like a rock in the head,” Rosenberg says. “Suddenly, this thing that was the invisible axion, which I thought was so compelling, is detectable.”

    Rosenberg began work on what’s now called the Axion Dark Matter Experiment, or ADMX. The concept behind the experiment is relatively simple: Use a large magnet to create an electromagnetic field, and wait for the axions to convert to photons, which can then be detected with quantum sensors.
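To see why the experiment listens in the microwave band: the converted photon carries essentially the axion’s rest energy, so its frequency directly encodes the axion mass. A quick estimate, assuming a benchmark mass of one microelectronvolt (an illustrative value, not a measured one):

# f = E / h, with the photon energy set by the axion rest energy.
h = 4.1357e-15           # Planck constant in eV*s
axion_mass_ev = 1e-6     # assumed benchmark: a one-microelectronvolt axion
f_hz = axion_mass_ev / h
print(f"{f_hz / 1e6:.0f} MHz")   # ~242 MHz: a microwave photon, hence microwave-cavity detectors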

    When work on ADMX began, the technology wasn’t sensitive enough to pick up the extremely light axions. While Rosenberg kept the project moving forward, much of the field has focused on WIMPs, building ever-larger dark matter detectors to find them.

    But neither WIMPs nor supersymmetric particles have been discovered, pushing scientists to think creatively about what happens next.

    “That’s caused a lot of people to re-evaluate what other dark matter models we have,” says University of Michigan theorist Ben Safdi. “And when people have done that re-evaluation, the axion is the natural candidate that’s still floating around. The downfall of the WIMP has been matched exactly by the rise of axions in terms of popularity.”

See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 3:29 pm on January 14, 2020 Permalink | Reply
    Tags: , , , , CERN LHC, Dilepton channel, Drell–Yan process, , , Searching for new physics in the TeV regime by looking for the decays of new particles., The dark photon (Zd)?   

    From CERN Courier: “CMS goes scouting for dark photons” 



    6 December 2019
    A report from the CMS experiment

    One of the best strategies for searching for new physics in the TeV regime is to look for the decays of new particles. The CMS collaboration has searched in the dilepton channel for particles with masses above a few hundred GeV since the start of LHC data taking. Thanks to newly developed triggers, the searches are now being extended to the more difficult lower range of masses. A promising possible addition to the Standard Model (SM) that could exist in this mass range is the dark photon (Zd). Its coupling with SM particles and production rate depend on the value of a kinetic mixing coefficient ε, and the resulting strength of the interaction of the Zd with ordinary matter may be several orders of magnitude weaker than the electroweak interaction.

    The CMS collaboration has recently presented results of a search for a narrow resonance decaying to a pair of muons in the mass range from 11.5 to 200 GeV. This search looks for a strikingly sharp peak on top of a smooth dimuon mass spectrum that arises mainly from the Drell–Yan process. At masses below approximately 40 GeV, conventional triggers are the main limitation for this analysis as the thresholds on the muon transverse momenta (pT), which are applied online to reduce the rate of events saved for offline analysis, introduce a significant kinematic acceptance loss, as evident from the red curve in figure 1.

Fig. 1. Dimuon invariant-mass distributions obtained from data collected by the standard dimuon triggers (red) and the dimuon scouting triggers (green).
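The search variable itself is simple to compute: the invariant mass of the muon pair, from its summed four-momentum. A minimal sketch with hypothetical muon four-vectors (GeV units; muons are effectively massless at these energies, so E ≈ |p|):

import math

# m^2 = (E1 + E2)^2 - |p1 + p2|^2, in natural units (c = 1)
def inv_mass(p1, p2):
    E = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in range(1, 4))
    return math.sqrt(E**2 - (px**2 + py**2 + pz**2))

mu_plus = (24.92, 3.0, 24.0, 6.0)     # (E, px, py, pz), invented values
mu_minus = (20.01, -2.0, -19.5, 4.0)
print(f"{inv_mass(mu_plus, mu_minus):.1f} GeV")   # ~43.6 GeV

A new particle would appear as a narrow pile-up of events at one value of this mass; the scouting triggers exist to keep the low-mass muon pairs that the standard triggers would throw away.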

    A dedicated set of high-rate dimuon “scouting” triggers, with some additional kinematic constraints on the dimuon system and significantly lower muon pT thresholds, was deployed during Run 2 to overcome this limitation. Only a minimal amount of high-level information from the online reconstruction is stored for the selected events. The reduced event size allows significantly higher trigger rates, up to two orders of magnitude higher than the standard muon triggers. The green curve in figure 1 shows the dimuon invariant mass distribution obtained from data collected with the scouting triggers. The increase in kinematic acceptance for low masses can be well appreciated.

The full data sets collected with the muon scouting and standard dimuon triggers during Run 2 are used to probe masses below 45 GeV, and between 45 and 200 GeV, respectively, excluding the mass range from 75 to 110 GeV where Z-boson production dominates. No significant resonant peaks are observed, and limits are set on ε² at 90% confidence as a function of the Zd mass (figure 2). These are among the world’s most stringent constraints on dark photons in this mass range.

Fig. 2. Upper limits on ε² as a function of the Zd mass. Results obtained with data collected by the dimuon scouting triggers are to the left of the dashed line. Constraints from measurement of the electroweak observables are shown in light blue.

See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition


     
  • richardmitnick 12:32 pm on December 9, 2019 Permalink | Reply
    Tags: "Part of a disCERNing crowd", , Australian astrophysicist Martin White discusses life with and around the Large Hadron Collider., CERN LHC, , , ,   

    From COSMOS Magazine: “Part of a disCERNing crowd” 


    09 December 2019

    Australian astrophysicist Martin White discusses life with and around the Large Hadron Collider.

An aerial view of the CERN site, enlivened by Martin White’s hand-written annotations. Credit: Atlas experiment / CERN

    It’s lunchtime, and I am standing with a colleague under the main site of the CERN laboratory, trying to work out whether to go right or left.

    With the rainy Geneva winter in full swing, he informs me that he’s found a hidden entrance to a network of tunnels under the foyer of CERN’s main building and has worked out how to get to the fabled Restaurant 2 without getting wet.

    All we have to do is follow his secret route through the tunnels, which it transpires is so secret that he himself has forgotten it. After half an hour squeezing past hanging cables and scary radiation warnings, we emerge starving exactly where we started out.

    This is life at CERN in a nutshell – an endless search for the unknown conducted in a spirit of frivolity by permanently hungry practitioners. Established in 1954, the European Organisation for Nuclear Research (CERN) hosts the largest particle accelerator ever built by humankind, named, rather appropriately, the Large Hadron Collider (LHC).

    It also has an ambitious and wide-ranging program of other experiments, which test various aspects of particle and nuclear physics, and develop practical spin-off applications of the cutting-edge technology required to push our understanding of the universe to deeper and deeper levels.

    Having lived there on and off for many years, the question I get asked more than any other is: “What does a person at CERN actually do all day?”

Martin White – proudly part of “an endless search for the unknown”. Credit: Glenn Hunt

    I never had a typical day at CERN, since my work brought me into contact with computer scientists, civil and electrical engineers, medical physicists, theoretical physicists, accelerator experts, and detector physicists.

    The only common thread was attendance at a large number of meetings which, when located at opposite ends of the main site, led to frantic daily runs of a few kilometres that contributed to a significant weight loss – until I discovered the CERN cake selection.

    The preferred language is English, but it’s not easy to recognise it as such, due to a heavy reliance on jargon and acronyms.

    Moreover, I met physicists who could answer me in English, before translating for an Italian colleague, and mocking my question in German to a bystander.

    Nevertheless, I am always surprised at how quickly the exotic becomes normalised at CERN, whether that means getting acclimatised to constantly being surrounded by extraordinarily smart people or becoming used to dinner party statements like “I have a terrible day tomorrow – I have to reassemble the positron accumulator!”

My work at CERN has involved the ATLAS experiment, one of the seven experiments at the LHC; its job is to filter and record the results of proton-proton collisions that occur more than one billion times a second.

    The middle of this detector is effectively a giant digital camera, consisting of 6.3 million strips of silicon, and my first job at CERN was to write the software that monitored each of these strips individually to confirm that the system was operating smoothly.

    I am one of CERN’s 12,000 users, and like most of them I have worked for various universities and research institutes scattered around the world, with frequent travel to the CERN laboratory as an external participant.

    The intense lure of CERN is that it remains the best international facility for discovering the new particles and laws of nature that would explain both how the Universe works on its smallest scales, and how it operated 0.0000000001 seconds after the Big Bang.

    The Standard Model of particle physics that I learnt as an undergraduate, and now pass on to my students, remains incapable of explaining most of the matter in the Universe, and it is widely believed that the LHC will finally shift us to a higher plane of understanding.


See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 8:15 am on October 2, 2019 Permalink | Reply
    Tags: "How AI could change science", , , , , CERN LHC, , , Kavli Institute for Cosmological Physics,   

    From University of Chicago: “How AI could change science” 


    Oct 1, 2019
    Louise Lerner
    Rob Mitchum

At the University of Chicago, researchers are using artificial intelligence’s ability to analyze massive amounts of data in applications from scanning for supernovae to finding new drugs. Credit: shutterstock.com

    Researchers at the University of Chicago seek to shape an emerging field.

    AI technology is increasingly used to open up new horizons for scientists and researchers. At the University of Chicago, researchers are using it for everything from scanning the skies for supernovae to finding new drugs from millions of potential combinations and developing a deeper understanding of the complex phenomena underlying the Earth’s climate.

    Today’s AI commonly works by starting from massive data sets, from which it figures out its own strategies to solve a problem or make a prediction—rather than rely on humans explicitly programming it how to reach a conclusion. The results are an array of innovative applications.

    “Academia has a vital role to play in the development of AI and its applications. While the tech industry is often focused on short-term returns, realizing the full potential of AI to improve our world requires long-term vision,” said Rebecca Willett, professor of statistics and computer science at the University of Chicago and a leading expert on AI foundations and applications in science. “Basic research at universities and national laboratories can establish the fundamentals of artificial intelligence and machine learning approaches, explore how to apply these technologies to solve societal challenges, and use AI to boost scientific discovery across fields.”

Prof. Rebecca Willett gives an introduction to her research on AI and data science foundations. Photo by Clay Kerr

    Willett is one of the featured speakers at the InnovationXLab Artificial Intelligence Summit hosted by UChicago-affiliated Argonne National Laboratory, which will soon be home to the most powerful computer in the world—and it’s being designed with an eye toward AI-style computing. The Oct. 2-3 summit showcases the U.S. Department of Energy lab, bringing together industry, universities, and investors with lab innovators and experts.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer

    The workshop comes as researchers around UChicago and the labs are leading new explorations into AI.

    For example, say that Andrew Ferguson, an associate professor at the Pritzker School of Molecular Engineering, wants to look for a new vaccine or flexible electronic materials. New materials are essentially just different combinations of chemicals and molecules, but there are literally billions of such combinations. How do scientists pick which ones to make and test in the lab? AI could quickly narrow down the list.

    “There are many areas where the Edisonian approach—that is, having an army of assistants make and test hundreds of different options for the lightbulb—just isn’t practical,” Ferguson said.
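    As a hedged illustration of that narrowing-down step, the sketch below trains a surrogate model on a handful of “measured” compounds and uses it to rank a large untested pool, so only the most promising candidates go to the lab. Every feature and number here is invented for the example.

```python
# AI-guided screening sketch: rank a huge candidate pool with a surrogate
# model trained on a small set of measured compounds (all data synthetic).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
measured_X = rng.random((50, 8))          # 50 compounds we already tested
measured_y = measured_X @ rng.random(8)   # their (stand-in) measured property

surrogate = GaussianProcessRegressor().fit(measured_X, measured_y)

candidates = rng.random((100_000, 8))     # a large untested pool
predicted = surrogate.predict(candidates)
top_10 = np.argsort(predicted)[-10:]      # send only these to the wet lab
print("candidates selected for synthesis:", top_10)
```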

    Then there’s the question of what happens if AI takes a turn at being the scientist. Some are wondering whether AI models could propose new experiments that might never have occurred to their human counterparts.

    “For example, when someone programmed the rules for the game of Go into an AI, it invented strategies never seen in thousands of years of humans playing the game,” said Brian Nord, an associate scientist in the Kavli Institute for Cosmological Physics and UChicago-affiliated Fermi National Accelerator Laboratory.

    “Maybe sometimes it will have more interesting ideas than we have.”

    Ferguson agreed: “If we write down the laws of physics and input those, what can AI tell us about the universe?”

    3
    Scenes from the 2016 matches of Go, an ancient Chinese game far more complex than chess, between Google’s AI “AlphaGo” and world champion Go player Lee Sedol. The match ended with the AI winning 4-1. Image courtesy of Bob van den Hoek.

    But ensuring those applications are accurate, equitable, and effective requires more basic computer science research into the fundamentals of AI. UChicago scientists are exploring ways to reduce bias in model predictions, to use advanced tools even when data is scarce, and to develop “explainable AI” systems that produce more actionable insights and raise trust among users of those models.

    “Most AIs right now just spit out an answer without any context. But a doctor, for example, is not going to accept a cancer diagnosis unless they can see why and how the AI got there,” Ferguson said.

    With the right calibration, however, researchers see a world of uses for AI. To name just a few: Willett, in collaboration with scientists from Argonne and the Department of Geophysical Sciences, is using machine learning to study clouds and their effect on weather and climate. Chicago Booth economist Sendhil Mullainathan is studying ways in which machine learning technology could change the way we approach social problems, such as policies to alleviate poverty; while neurobiologist David Freedman, a professor in the University’s Division of Biological Sciences, is using machine learning to understand how brains interpret sights and sounds and make decisions.

    Below are looks into three projects at the University showcasing the breadth of AI applications happening now.

    The depths of the universe to the structures of atoms

    We’re getting better and better at building telescopes to scan the sky and accelerators to smash particles at ever-higher energies. What comes along with that, however, is more and more data. For example, the Large Hadron Collider in Europe generates one petabyte of data per second; for perspective, in less than five minutes, that would fill up the world’s most powerful supercomputer.
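    The arithmetic behind that perspective is quick to check. Assuming a storage system of roughly 250 petabytes (about what the leading supercomputer of the era had attached; the exact figure is an assumption here):

```python
# Back-of-envelope check: how long a 1 PB/s stream takes to fill ~250 PB.
data_rate_pb_per_s = 1.0      # LHC raw collision data, petabytes per second
storage_pb = 250.0            # assumed storage capacity of the machine

seconds_to_fill = storage_pb / data_rate_pb_per_s
print(f"filled in {seconds_to_fill:.0f} s = {seconds_to_fill / 60:.1f} minutes")
# -> about 4 minutes, i.e. "less than five minutes"
```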

    [Standard CERN image suite: the LHC, CERN map, CERN LHC (Maximilien Brice and Julien Marius Ordan), CERN LHC particles, and the four major project collaborations: ATLAS, ALICE, CMS and LHCb.]

    That’s way too much data to store. “You need to quickly pick out the interesting events to keep, and dump the rest,” Nord said.
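    A toy version of that keep-or-dump decision (a trigger, in collider jargon) might look like the following; the event format and threshold are invented for illustration.

```python
# Stream events past a cheap selection; store only the ones that pass.
def trigger(event, energy_threshold_gev=100.0):
    """Keep an event only if it looks interesting enough to store."""
    return event["total_energy_gev"] > energy_threshold_gev

stream = [{"total_energy_gev": e} for e in (12.0, 450.0, 7.0, 230.0)]
kept = [ev for ev in stream if trigger(ev)]
print(f"kept {len(kept)} of {len(stream)} events")
```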

    But see the related article from UC Santa Barbara: “Breaking Data out of the Silos.”

    Similarly, each night hundreds of telescopes scan the sky. Existing computer programs are pretty good at picking interesting things out of the images, but there’s room to improve. (After LIGO detected the gravitational waves from two neutron stars crashing together in 2017, telescopes around the world had rooms full of people frantically looking through sky photos to find the point of light the collision created.)

    MIT /Caltech Advanced aLigo


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Years ago, Nord was sitting and scanning telescope images to look for gravitational lensing, an effect in which massive objects bend light as it passes them.

    Gravitational Lensing NASA/ESA

    “We were spending all this time doing this by hand, and I thought, surely there has to be a better way,” he said. In fact, the capabilities of AI were just turning a corner; Nord began writing programs to search for lensing with neural networks. Others had the same idea; the technique is now emerging as a standard approach to find gravitational lensing.
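    For flavor, here is a minimal sketch (in PyTorch) of the kind of convolutional classifier such a search might train on image cutouts; the architecture and image size are placeholders, not the published lensing pipelines.

```python
# A small CNN that maps a sky-image cutout to a lens / not-a-lens score.
import torch
import torch.nn as nn

classifier = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 1),            # one logit per image
)

batch = torch.randn(8, 1, 64, 64)          # eight 64x64 cutouts (random here)
logits = classifier(batch)
print(torch.sigmoid(logits).squeeze())     # lens probabilities
```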

    This year Nord is partnering with computer scientist Yuxin Chen to explore what they call a “self-driving telescope”: a framework that could optimize when and where to point telescopes to gather the most interesting data.

    “I view this collaboration between AI and science, in general, to be in a very early phase of development,” Chen said. “The outcome of the research project will not only have transformative effects in advancing the basic science, but it will also allow us to use the science involved in the physical processes to inform AI development.”
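    A heavily simplified sketch of that optimize-where-to-point loop follows; using predicted model uncertainty as the score for where to look next is a common acquisition heuristic, offered here as an assumption rather than as Nord and Chen’s actual framework.

```python
# Greedy observing loop: each night, point at the sky field where the
# model is most uncertain, then shrink that uncertainty after observing.
import numpy as np

rng = np.random.default_rng(1)
predicted_uncertainty = rng.random(100)   # model uncertainty per sky field

for night in range(3):
    target = int(np.argmax(predicted_uncertainty))  # most informative field
    print(f"night {night}: observe field {target}")
    predicted_uncertainty[target] *= 0.1            # we learned a lot there
```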

    Disentangling style and content for art and science

    In recent years, popular apps have sprung up that can transform photographs into different artistic forms—from generic modes such as charcoal sketches or watercolors to the specific styles of Dali, Monet and other masters. These “style transfer” apps use tools from the cutting edge of computer vision—primarily the neural networks that prove adept at image classification for applications such as image search and facial recognition.

    But beyond the novelty of turning your selfie into a Picasso, these tools kick-start a deeper conversation around the nature of human perception. From a young age, humans are capable of separating the content of an image from its style; that is, recognizing that photos of an actual bear, a stuffed teddy bear, or a bear made out of LEGOs all depict the same animal. What’s simple for humans can stump today’s computer vision systems, but Assoc. Profs. Jason Salavon and Greg Shakhnarovich think the “magic trick” of style transfer could help them catch up.
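    One well-known way to make “style” and “content” computable, the recipe of Gatys et al. that underlies many of these apps, is to summarize style as the correlations between a network’s feature channels (a Gram matrix), while content lives in the feature maps themselves. The sketch below shows that split; it is not Salavon and Shakhnarovich’s specific models.

```python
# Style = channel-correlation (Gram) summary of conv features;
# content = the feature maps themselves.
import torch

def gram_matrix(features):
    """Correlations between feature channels: a compact style summary."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

feature_map = torch.randn(64, 32, 32)   # stand-in conv-layer activations
style = gram_matrix(feature_map)        # 64x64 style summary
content = feature_map                   # content representation
print(style.shape, content.shape)
```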

    4
    This triptych of images demonstrates how neural networks can transform images into different artistic styles.

    “The fact that we can look at pictures that artists create and still understand what’s in them, even though they sometimes look very different from reality, seems to be closely related to the holy grail of machine perception: what makes the content of the image understandable to people,” said Shakhnarovich, an associate professor at the Toyota Technological Institute at Chicago.

    Salavon and Shakhnarovich are collaborating on new style transfer approaches that separate, capture and manipulate content and style, unlocking new potential for art and science. These new models could transform a headshot into a much more distorted style, such as the distinctive caricatures of The Simpsons, or teach self-driving cars to better understand road signs in different weather conditions.

    “We’re in a global arms race for making cool things happen with these technologies. From what would be called practical space to cultural space, there’s a lot of action,” said Salavon, an associate professor in the Department of Visual Arts at the University of Chicago and an artist who makes “semi-autonomous art”. “But ultimately, the idea is to get to some computational understanding of the ‘essence’ of images. That’s the rich philosophical question.”

    5
    Researchers hope to use AI to decode nature’s rules for protein design, in order to create synthetic proteins with a range of applications. Image courtesy of Emw / CC BY-SA 3.0

    Learning nature’s rules for protein design

    Nature is an unparalleled engineer. Millions of years of evolution have created molecular machines capable of countless functions and survival in challenging environments, like deep sea vents. Scientists have long sought to harness these design skills and decode nature’s blueprints to build custom proteins of their own for applications in medicine, energy production, environmental clean-up and more. But only recently have the computational and biochemical technologies needed to create that pipeline become possible.

    Ferguson and Prof. Rama Ranganathan are bringing these pieces together in an ambitious project funded by a Center for Data and Computing seed grant. Combining recent advancements in machine learning and synthetic biology, they will build an iterative pipeline to learn nature’s rules for protein design, then remix them to create synthetic proteins with elevated or even new functions and properties.

    “It’s not just rebuilding what nature built, we can push it beyond what nature has ever shown us before,” said Ranganathan. “This proposal is basically the starting point for building a whole framework of data-driven molecular engineering.”

    “The way we think of this project is we’re trying to mimic millions of years of evolution in the lab, using computation and experiments instead of natural selection,” Ferguson said.
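    In that spirit, here is a toy version of the learn-then-remix loop: fit a simple statistical model to “natural” sequences, then sample new candidates from the learned rules. A real pipeline would use far richer models; everything below is illustrative.

```python
# Learn per-position amino-acid frequencies from (synthetic) natural
# sequences, then sample a new synthetic protein from those frequencies.
import numpy as np

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")
rng = np.random.default_rng(0)

natural = ["".join(rng.choice(AMINO_ACIDS, 12)) for _ in range(200)]

length = len(natural[0])
counts = np.zeros((length, len(AMINO_ACIDS)))
for seq in natural:
    for pos, aa in enumerate(seq):
        counts[pos, AMINO_ACIDS.index(aa)] += 1
probs = counts / counts.sum(axis=1, keepdims=True)

new_protein = "".join(rng.choice(AMINO_ACIDS, p=probs[pos]) for pos in range(length))
print("synthetic candidate:", new_protein)
```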

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

     
  • richardmitnick 8:35 pm on August 29, 2019 Permalink | Reply
    Tags: "Forget About Electrons And Protons; The Unstable Muon Could Be The Future Of Particle Physics", , CERN LHC, , , , MICE collaboration — which stands for Muon Ionization Cooling Experiment — continues to push this technology to new heights and may make a muon collider a real possibility for the future.,   

    From Ethan Siegel: “Forget About Electrons And Protons; The Unstable Muon Could Be The Future Of Particle Physics” 

    From Ethan Siegel
    Aug 29, 2019

    1
    The particle tracks emanating from a high energy collision at the LHC in 2014 show the creation of many new particles. It’s only because of the high-energy nature of this collision that new masses can be created. (WIKIMEDIA COMMONS USER PCHARITO)

    Electron-positron or proton-proton colliders are all the rage. But the unstable muon might be the key to unlocking the next frontier.

    If you want to probe the frontiers of fundamental physics, you have to collide particles at very high energies: with enough energy that you can create the unstable particles and states that don’t exist in our everyday, low-energy Universe. So long as you obey the Universe’s conservation laws and have enough free energy at your disposal, you can create any massive particle (and/or its antiparticle) from that energy via Einstein’s E = mc².

    Traditionally, there have been two strategies to do this.

    1. Collide electrons moving in one direction with positrons moving in the opposite direction, tuning your beams to whatever energy corresponds to the mass of particles you wish to produce.
    2. Collide protons in one direction with either other protons or anti-protons in the other, reaching higher energies but creating a much messier, less controllable signal to extract.

    One Nobel Laureate, Carlo Rubbia, has called for physicists to build something entirely novel: a muon collider.

    2
    Carlo Rubbia at the 62nd Lindau Nobel Laureate Meeting on July 4, 2012. Markus Pössel (user name: Mapos)

    It’s ambitious and presently impractical, but it just might be the future of particle physics.

    3
    The particles and antiparticles of the Standard Model have now all been directly detected, with the last holdout, the Higgs Boson, falling at the LHC earlier this decade.

    Standard Model of Particle Physics

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    All of these particles can be created at LHC energies, and the masses of the particles lead to fundamental constants that are absolutely necessary to describe them fully. These particles can be well-described by the physics of the quantum field theories underlying the Standard Model, but they do not describe everything, like dark matter. (E. SIEGEL / BEYOND THE GALAXY)

    Above, you can see the particles and antiparticles of the Standard Model, which have now all been discovered. The Large Hadron Collider (LHC) at CERN discovered the Higgs boson, the long-sought-after last holdout, earlier this decade.

    While there’s still much science left to be done at the LHC — it’s only taken 2% of all the data it will acquire by the end of the 2030s — particle physicists are already looking ahead to the next generation of future colliders.

    5
    A hypothetical new accelerator, either a long linear one or one inhabiting a large tunnel beneath the Earth, could dwarf the sensitivity to new particles that prior and current colliders can achieve. Even at that, there’s no guarantee we’ll find anything new, but we’re certain to find nothing new if we fail to try. ILC collaboration

    All of the plans put forth involve scaled-up versions of existing technologies that have been used in past and/or current accelerators. We know how to accelerate electrons, positrons, and protons in a straight line. We know how to bend them into a circle, and how to maximize both the energy of the collisions and the number of particles colliding per second. Larger, more energetic versions of existing technologies are the simplest approach.

    FNAL/Tevatron map

    CERN map

    Future Circular Collider (FCC) Larger LHC

    CERN FCC Future Circular Collider map

    CERN Future Circular Collider

    The scale of the proposed Future Circular Collider (FCC), compared with the LHC presently at CERN and the Tevatron, formerly operational at Fermilab. The Future Circular Collider is perhaps the most ambitious proposal for a next-generation collider to date, including both lepton and proton options as various phases of its proposed scientific programme. (PCHARITO / WIKIMEDIA COMMONS)

    Of course, there are both benefits and drawbacks to each method we could use. You can build a linear collider, but the energy you can reach is limited by how much energy you can impart to the particles per unit distance as well as by how long you build your accelerator. The drawback is that, without a continuous injection of circulating particles, linear colliders have lower collision rates and take longer to collect the same amount of data.

    The other main style of collider is the style currently used at CERN: circular colliders. Instead of only getting one continuous shot to accelerate your particles before giving them the opportunity to collide, you speed them up while bending them in a circle, adding more and more particles to each clockwise and counterclockwise beam with every revolution. You set up your detectors at designated collision points, and measure what comes out.

    6
    A candidate Higgs event in the ATLAS detector. Note how even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles. This is only the case because the Higgs gives mass to the fundamental constituents that compose these particles. At high enough energies, the currently most-fundamental particles known may yet split apart themselves. (THE ATLAS COLLABORATION / CERN)

    CERN ATLAS Image Claudia Marcelloni

    This is the preferred method, so long as your tunnel is long enough and your magnets are strong enough, for both electron/positron and proton/proton colliders. Compared to linear colliders, with a circular collider, you get

    greater numbers of particles inside the beam at any one time,
    second and third and thousandth chances for particles that missed one another on the prior pass through,
    and much greater collision rates overall, particularly for lower-energy heavy particles like the Z-boson.

    In general, electron/positron colliders are better for precision studies of known particles, while proton/proton colliders are better for probing the energy frontier.

    7
    A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. The energies achieved by the LHC are sufficient for creating Higgs bosons; previous electron-positron colliders could not achieve the necessary energies. (ATLAS COLLABORATION/CERN)

    In fact, if you compare the LHC — which collides protons with protons — with the previous collider in the same tunnel (LEP, which collided electrons with positrons), you’d find something that surprises most people: the particles inside LEP went much, much faster than the ones inside the LHC!

    CERN LEP Collider

    Everything in this Universe is limited by the speed of light in a vacuum: 299,792,458 m/s. It’s impossible to accelerate any massive particle to that speed, much less past it. At the LHC, particles get accelerated up to extremely high energies of 7 TeV per particle. Considering that a proton’s rest energy is only 938 MeV (or 0.000938 TeV), it’s easy to see how it reaches a speed of 299,792,455 m/s.

    But the electrons and positrons at LEP went even faster: 299,792,457.9964 m/s. Yet despite these enormous speeds, they only reached energies of ~110 GeV, or 1.6% the energies achieved at the LHC.
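    Both quoted speeds follow directly from special relativity: gamma = E / (m c²), and v = c · sqrt(1 − 1/gamma²). Taking the proton’s rest energy as 0.938 GeV and assuming roughly 104 GeV per beam for LEP electrons at its peak (consistent with the ~110 GeV figure above):

```python
# Recover the quoted beam speeds from the beam energies.
import math

c = 299_792_458.0   # speed of light, m/s

def speed(energy_gev, rest_energy_gev):
    gamma = energy_gev / rest_energy_gev
    return c * math.sqrt(1.0 - 1.0 / gamma**2)

print(f"LHC proton,   7 TeV:   {speed(7000.0, 0.938):.0f} m/s")     # ~299,792,455
print(f"LEP electron, 104 GeV: {speed(104.0, 0.000511):.4f} m/s")   # ~299,792,457.9964
```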

    Let’s understand how colliding particles create new ones. First, the energy available for creating new particles — the “E” in E = mc² — comes from the center-of-mass energy of the two colliding particles. In a proton-proton collision, it’s the internal structures that collide: quarks and gluons. The energy of each proton is divided up among many constituent particles, and these particles zip around inside the proton as well. When two of them collide, the energy available for creating new particles might still be large (up to 2 or 3 TeV), but isn’t the full-on 14 TeV.

    But the electron-positron idea is a lot cleaner: they’re not composite particles, and they don’t have internal structure or energy divided among constituents. Accelerate an electron and positron to the same speed in opposite directions, and 100% of that energy goes into creating new particles. But it won’t be anywhere near 14 TeV.

    8
    A number of the various lepton colliders, with their luminosity (a measure of the collision rate and the number of detections one can make) as a function of center-of-mass collision energy. Note that the red line, a circular collider option, offers many more collisions than the linear version, but its advantage shrinks as energy increases. Beyond about 380 GeV, circular colliders cannot reach those energies, and a linear collider like CLIC is the far superior option. (GRANADA STRATEGY MEETING SUMMARY SLIDES / LUCIE LINSSEN (PRIVATE COMMUNICATION))

    Even though electrons and positrons go much faster than protons do, the total amount of energy a particle possesses is determined by its speed and also its original mass. Even though the electrons and positrons are much closer to the speed of light, it takes nearly 2,000 of them to make up as much rest mass as a proton. They have a greater speed but a much lower rest mass, and hence, a lower energy overall.

    There’s a good physics reason why, even with the same radius ring and the same strong magnetic fields to bend them into a circle, electrons won’t reach the same energy as protons: synchrotron radiation. When you accelerate a charged particle with a magnetic field, it gives off radiation, which carries energy away.

    9
    Relativistic electrons and positrons can be accelerated to very high speeds, but will emit synchrotron radiation (blue) at high enough energies, preventing them from moving faster. This synchrotron radiation is the relativistic analog of the radiation predicted by Rutherford so many years ago, and has a gravitational analogy if you replace the electromagnetic fields and charges with gravitational ones. (CHUNG-LI DONG, JINGHUA GUO, YANG-YUAN CHEN, AND CHANG CHING-LIN, ‘SOFT-X-RAY SPECTROSCOPY PROBES NANOMATERIAL-BASED DEVICES’)

    The amount of energy radiated away is dependent on the field strength (squared), the energy of the particle (squared), but also on the inherent charge-to-mass ratio of the particle (to the fourth power). Since electrons and positrons have the same charge as the proton, but just 1/1836th of a proton’s mass, that synchrotron radiation is the limiting factor for electron-positron systems in a circular collider. You’d need a circular collider 100 km around just to be able to create a pair of top-antitop quarks in a next-generation particle accelerator using electrons and positrons.
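    Because the loss scales with the charge-to-mass ratio to the fourth power, the electron-to-proton comparison at the same energy and bending radius is simply (m_p / m_e)⁴:

```python
# Synchrotron-loss ratio between an electron and a proton of equal energy
# in the same ring: (m_p / m_e)**4, since both carry the same charge.
m_p_over_m_e = 1836.15
ratio = m_p_over_m_e ** 4
print(f"electron radiates ~{ratio:.1e} times more than a proton")
# -> ~1.1e13: why circular e+e- machines top out far below proton machines
```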

    This is where the big idea of using muons comes in. Muons (and anti-muons) are the cousins of electrons (and positrons), being:

    fundamental (and not composite) particles,
    206 times as massive as an electron (with a much smaller charge-to-mass ratio and far less synchrotron radiation),
    and, unlike electrons or positrons, fundamentally unstable.

    That last difference is the present dealbreaker: muons have a mean lifetime of just 2.2 microseconds before decaying away.

    10
    An earlier design plan (now defunct) for a full-scale muon-antimuon collider at Fermilab, the source of the world’s second-most powerful particle accelerator behind the LHC at CERN. (FERMILAB)

    In the future, however, we might be able to work around that anyway. You see, Einstein’s special relativity tells us that as particles move closer and closer to the speed of light, time dilates for the particle in the observer’s reference frame. In other words, if we make a muon move fast enough, we dramatically increase the time it lives before decaying; this is the same physics that lets cosmic-ray muons, created in the upper atmosphere, survive long enough to reach the ground and pass through us all the time!

    If we could accelerate a muon up to the same 6.5 TeV in energy that LHC protons achieved during their prior data-taking run, that muon would live for 135,000 microseconds instead of 2.2 microseconds: enough time to circle the LHC some 1,500 times before decaying away. If you could collide a muon/anti-muon pair at those speeds, you’d have 100% of that energy — all 13 TeV of it — available for particle creation.
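    The numbers in that paragraph check out (using the muon’s rest energy of about 0.1057 GeV and the LHC’s ~26.7 km circumference):

```python
# Time-dilated muon lifetime at 6.5 TeV, and the flight distance in LHC laps
# (the muon moves at essentially c, so distance ~ c * dilated lifetime).
tau_rest_us = 2.2                       # rest-frame lifetime, microseconds
muon_rest_energy_gev = 0.1057
beam_energy_gev = 6500.0
c = 299_792_458.0                       # m/s
lhc_circumference_m = 26_700.0          # ~26.7 km

gamma = beam_energy_gev / muon_rest_energy_gev
tau_lab_us = gamma * tau_rest_us
laps = (tau_lab_us * 1e-6) * c / lhc_circumference_m
print(f"gamma = {gamma:.0f}, lab-frame lifetime = {tau_lab_us:,.0f} microseconds")
print(f"roughly {laps:,.0f} laps of the LHC before decaying")
```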

    11
    The prototype MICE 201-megahertz RF module, with the copper cavity mounted, is shown during assembly at Fermilab. This apparatus could focus and collimate a muon beam, enabling the muons to be accelerated and survive for much longer than 2.2 microseconds. (Y. TORUN / IIT / FERMILAB TODAY)

    Humanity can always choose to build a bigger ring or invest in producing stronger-field magnets; those are easy ways to go to higher energies in particle physics. But there’s no cure for synchrotron radiation with electrons and positrons; you’d have to use heavier particles instead. There’s no cure for energy being distributed among multiple constituent particles inside a proton; you’d have to use fundamental particles instead.

    The muon is the one particle that could solve both of these issues. The only drawback is that they’re unstable, and difficult to keep alive for a long time. However, they’re easy to make: smash a proton beam into a piece of acrylic and you’ll produce pions, which will decay into both muons and anti-muons. Accelerate those muons to high energy and collimate them into beams, and you can put them in a circular collider.

    12
    While many unstable particles, both fundamental and composite, can be produced in particle physics, only protons, neutrons (bound in nuclei) and the electron are stable, along with their antimatter counterparts and the photon. Everything else is short-lived, but if muons can be kept at high enough speeds, they might live long enough to forge a next-generation particle collider out of. (CONTEMPORARY PHYSICS EDUCATION PROJECT (CPEP), U.S. DEPARTMENT OF ENERGY / NSF / LBNL)

    The MICE collaboration — which stands for Muon Ionization Cooling Experiment — continues to push this technology to new heights, and may make a muon collider a real possibility for the future. The goal is to reveal whatever secrets nature might have waiting in store for us, and these are secrets we cannot predict. As Carlo Rubbia himself said,

    “…these fundamental choices are coming from nature, not from individuals. Theorists can do what they like, but nature is the one deciding in the end….”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan.

     