Tagged: CERN LHC

  • richardmitnick 9:36 pm on September 25, 2020 Permalink | Reply
    Tags: "Synchrotrons face a data deluge", CERN LHC, EBS-Extremely Brilliant Source, European Synchrotron Radiation Facility, National Synchrotron Light Source II

    From Physics Today: “Synchrotrons face a data deluge” 


    From Physics Today

    25 Sep 2020
    Rahul Rao

    The latest upgrades at light sources around the world are only accelerating the growth in data rates, and automation is struggling to keep pace.

    The storage ring of the European Synchrotron Radiation Facility sits at the confluence of the Drac and Isère rivers in Grenoble, France. Credit: ESRF/Jocelyn Chavy.

    In August the European Synchrotron Radiation Facility (ESRF) in Grenoble, France, opened the Extremely Brilliant Source (EBS), a newly upgraded light source that is 100 times as brilliant as its predecessor and some 100 billion times as powerful as the average hospital x-ray source.

    The ESRF in Grenoble, France, re-opened as the Extremely Brilliant Source (EBS).

    There’s something else EBS can do better than its predecessor: produce data.

    The light source has a theoretical capacity to produce 1 petabyte of data per day, says Harald Reichert, ESRF’s director of research in physical sciences. “We don’t have the capacity to analyze this data, or even store it, if that machine fires continuously, day after day.”

    Since the 1980s, both beamline photon fluxes and detector data rates have grown far faster than the rate of improvement described by Moore’s law. Even as scientists turn to automation, the unique nature of synchrotrons, with their myriad applications, makes automating their outputs uniquely complicated.

    In the early 2000s, three months’ worth of data from a detector could fit into a 100-megabyte archive, says Stefan Vogt, an x-ray scientist at Argonne National Laboratory’s Advanced Photon Source (APS).


    ANL Advanced Photon Source.

    “These days, for these specific types of experiments, it’s several terabytes.” APS will soon be capable of producing around 200 petabytes per year. A single beamline can produce as much as 12 gigabytes per second.

    The data transfer and storage infrastructure can and does fail to keep up, resulting in lags that can stretch up to days, according to James Holton, a biophysicist at Lawrence Berkeley National Laboratory’s Advanced Light Source.

    LBNL Advanced Light Source.
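    As a back-of-the-envelope check of the rates quoted above (the arithmetic is ours, not the article's), a single beamline sustaining 12 gigabytes per second really does approach a petabyte per day:

    ```python
    def daily_volume_tb(rate_gb_per_s: float) -> float:
        """Terabytes produced per day at a sustained rate in GB/s."""
        return rate_gb_per_s * 86_400 / 1_000  # 86 400 s/day, 1 000 GB/TB

    # A single 12 GB/s beamline, run flat out:
    per_beamline = daily_volume_tb(12)   # ~1037 TB/day, i.e. roughly 1 PB/day
    print(f"12 GB/s beamline: {per_beamline:.0f} TB/day")

    # APS at ~200 PB/year averages out, facility-wide, to:
    aps_avg = 200 * 1_000 / 365          # ~548 TB/day
    print(f"APS average: {aps_avg:.0f} TB/day")
    ```

    The gap between a single beamline's peak rate and the facility-wide average reflects the fact that beamlines do not run flat out continuously.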

    The rapid inflation of data also makes it difficult to future-proof new beamlines. “You have to be making choices now against what the computing infrastructure is going to look like in about five years’ time,” says Graeme Winter, an x-ray crystallographer at the Diamond Light Source in the UK.

    Diamond Light Source, located at the Harwell Science and Innovation Campus in Oxfordshire U.K.

    Upgrading the storage infrastructure only shifts the bottleneck further downstream. There, automation can pick up the reins. Not only can AI, machine learning, and neural networks help in analysis, but they can make data much more manageable by throwing away poor-quality data. They can also reduce excess by stopping data collection in mid-experiment when certain conditions have been reached.
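    A minimal sketch of the two ideas in this paragraph, discarding poor-quality data and halting collection once a condition is met, might look like the following (the signal-to-noise threshold and the simulated frame stream are invented purely for illustration; real beamline filters are far more involved):

    ```python
    import random

    def good_enough(frame_snr: float, threshold: float = 5.0) -> bool:
        """Toy quality gate: keep a detector frame only if its
        signal-to-noise ratio clears a threshold."""
        return frame_snr >= threshold

    def collect(frames, target_good: int = 100):
        """Keep good frames; stop acquisition early once enough
        have been collected (the mid-experiment stopping condition)."""
        kept = []
        for snr in frames:
            if good_enough(snr):
                kept.append(snr)
            if len(kept) >= target_good:
                break
        return kept

    random.seed(0)
    stream = (random.gauss(5.0, 2.0) for _ in range(10_000))
    kept = collect(stream, target_good=100)
    print(f"kept {len(kept)} frames, discarded the rest")
    ```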

    Indeed, the Large Hadron Collider (LHC), which CERN claims can produce around 25 gigabytes per second recording the aftermath of particle collisions, relies on a worldwide computing grid to handle its data deluge.

    CERN LHC Maximilien Brice and Julien Marius Ordan.


    MonALISA LHC Computing GridMap: monalisa.caltech.edu/ml/_client.beta

    But Reichert says that the LHC’s detectors capture each collision in essentially the same way, producing predictable data sets that lend themselves well to automation. In contrast, a synchrotron facility can host multiple beamlines with a far more diverse array of applications, such as determining protein crystal structures, charting the brain’s neural connections, and watching chemical catalysis and additive manufacturing in real time. “They have nothing in common,” says Reichert. “They use different detectors. The data is completely different.”

    Consequently, it’s often left to the users of each beamline and application to develop their own specialized firmware and algorithms. When large synchrotrons like APS host dozens of beamlines, some of which are deeply customizable, the volume of specialized use cases renders a streamlined system like CERN’s impractical.

    The sample chamber within a beamline at the National Synchrotron Light Source II in Upton, New York. State-of-the-art synchrotron beamlines can produce up to 12 gigabytes of data per second. Credit: Brookhaven National Laboratory, CC BY-NC-ND 2.0

    BNL NSLS II.

    When users do take up the challenge of building tools, the programmers who make them often approach the problem differently from the scientists who need them. Earlier this year, crystallographers used Diamond to aid drug discovery efforts by scanning a protease in the virus responsible for COVID-19. Collecting the data took mere days, but it took longer to convert the observational data into structures detailed enough for chemists to use. The researchers’ tools also provided output that was difficult for chemists to interpret. Without an effective means of communicating and checking their results with chemists, the project was delayed by weeks.

    Frank von Delft, a macromolecular crystallographer at Diamond, says that programmers should focus on making their tools easier to use. “When that’s achieved,” he says, “your whole platform suddenly becomes powerful.” In particular, he cites Phenix, a crystallography tool that can help determine molecular structures. Phenix is one of the most popular tools of its kind, in large part thanks to a graphical user interface designed by a biologist rather than a career programmer.

    Fortunately, the future seems to be pointing toward greater streamlining, including at the synchrotron end. Traditionally, facilities left the data-handling part of their science to the users, but the enormous data volumes, as well as other factors such as more computation shifting to the cloud, are changing that.

    Reichert believes each synchrotron facility should help provide scientists the tools they need and assist with the computation. “When we give [a scientist] beam time,” he says, “we’d better ask the question: What do you do with the data, and what kind of help do you need to actually get an answer to your scientific problem and put the answer out in the open?”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Our mission

    The mission of Physics Today is to be a unifying influence for the diverse areas of physics and the physics-related sciences.

    It does that in three ways:

    • by providing authoritative, engaging coverage of physical science research and its applications without regard to disciplinary boundaries;
    • by providing authoritative, engaging coverage of the often complex interactions of the physical sciences with each other and with other spheres of human endeavor; and
    • by providing a forum for the exchange of ideas within the scientific community.”

     
  • richardmitnick 7:53 am on August 28, 2020 Permalink | Reply
    Tags: "CMS experiment at CERN releases fifth batch of open data", CERN LHC

    From CERN CMS: “CMS experiment at CERN releases fifth batch of open data” 


    From CERN CMS

    27 August, 2020
    Achintya Rao

    All research-quality data recorded by CMS during the first two years of LHC operation are now publicly available.

    An artistic representation of the CMS detector made of pulsating lines. (Image: Achintya Rao/CERN.)

    The CMS collaboration at CERN has released into the open 18 new datasets, comprising proton–proton collision data recorded by CMS at the Large Hadron Collider (LHC) during the second half of 2011. LHC data are unique and are of interest to the scientific community as well as to those in education. Preserving the data and the knowledge to analyse them is critical. CMS has therefore committed to releasing its research data openly, with up to 100% being made available 10 years after recording them; the embargo gives the scientists working on CMS adequate time to analyse the data themselves.

    The total data volume of this latest release is 96 terabytes. Not only does this batch complement the data from the first half of 2011, released back in 2016, it also provides additional tools, workflows and examples as well as improved documentation for analysing the data using cloud technologies. The data and related materials are available on the CERN Open Data portal, an open repository built using CERN’s home-grown and open source software, Invenio.

    Previous releases from CMS included the full recorded data volume from 2010 and half the volumes from 2011 and 2012 (the first “run” of the LHC). Special “derived datasets”, some for education and others for data science, have allowed people around the world to “rediscover” the Higgs boson in CMS open data. Novel papers have also been published using CMS data, by scientists unaffiliated with the collaboration.

    In the past, those interested in analysing CMS open data needed to install the CMS software onto virtual machines to re-create the appropriate analysis environment. This made it challenging to scale up a full analysis for research use, a task that requires considerable computing resources. With this batch, CMS has updated the documentation for using software containers with all the software pre-installed and added workflows running on them, allowing the data to be easily analysed in the cloud, either at universities or using commercial providers. Some of the new workflows are also integrated with REANA, the CERN platform for reusable analyses.

    CMS and the CERN Open Data team have been working closely with current and potential users of the open data – in schools, in higher education and in research – to improve the services offered. The search functionality of the portal has been updated with feedback from teachers who participated in dedicated workshops at CERN in previous years, the documentation has been enhanced based on conversations with research users and a new online forum has been established to provide support. In September, CMS is organising a virtual workshop for theoretical physicists interested in using the open data.

    “We are thrilled to be able to release these new data and tools from CMS into the public domain,” says Kati Lassila-Perini, who has co-led the CMS project for open data and data preservation since its inception. “We look forward to seeing how the steps we have taken to improve the usability of our public data are received by the community of users, be it in education or in research.”

    You can read more about the latest open-data release from CMS on the CERN Open Data portal: opendata.cern.ch/docs/cms-completes-2010-2011-pp-data

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    CMS
    CERN CMS New

     
  • richardmitnick 3:27 pm on August 3, 2020 Permalink | Reply
    Tags: "CERN experiments announce first indications of a rare Higgs boson process", CERN LHC, The ATLAS and CMS experiments at CERN have announced new results which show that the Higgs boson decays into two muons.

    From CERN: “CERN experiments announce first indications of a rare Higgs boson process” 



    From CERN

    3 August, 2020

    The ATLAS and CMS experiments at CERN have announced new results which show that the Higgs boson decays into two muons.

    Candidate event displays of a Higgs boson decaying into two muons as recorded by CMS (left) and ATLAS (right). (Image: CERN)

    Geneva. At the 40th ICHEP conference, the ATLAS and CMS experiments announced new results which show that the Higgs boson decays into two muons. The muon is a heavier copy of the electron, one of the elementary particles that constitute the matter content of the Universe. While electrons are classified as a first-generation particle, muons belong to the second generation. The physics process of the Higgs boson decaying into muons is a rare phenomenon as only about one Higgs boson in 5000 decays into muons. These new results have pivotal importance for fundamental physics because they indicate for the first time that the Higgs boson interacts with second-generation elementary particles.

    Physicists at CERN have been studying the Higgs boson since its discovery in 2012 in order to probe the properties of this very special particle. The Higgs boson, produced from proton collisions at the Large Hadron Collider, disintegrates – referred to as decay – almost instantaneously into other particles. One of the main methods of studying the Higgs boson’s properties is by analysing how it decays into the various fundamental particles and the rate of disintegration.

    CMS achieved evidence of this decay at the 3σ level, which means that the chance of the observed muon-pair excess arising from a statistical fluctuation is less than one in 700. ATLAS’s 2σ result means the chances are about one in 40. The combination of both results would raise the significance well above 3σ and provides strong evidence for the Higgs boson decay to two muons.
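    These sigma-to-probability figures are one-sided Gaussian tail probabilities, which can be checked with a few lines of standard-library Python:

    ```python
    from math import erfc, sqrt

    def one_sided_p(z: float) -> float:
        """One-sided Gaussian tail probability for a z-sigma excess."""
        return 0.5 * erfc(z / sqrt(2))

    for z in (2, 3, 5):
        p = one_sided_p(z)
        # 2 sigma -> about 1 in 44; 3 sigma -> about 1 in 740;
        # 5 sigma (the discovery threshold) -> about 1 in 3.5 million.
        print(f"{z} sigma -> p = {p:.2e} (about 1 in {1 / p:,.0f})")
    ```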

    “CMS is proud to have achieved this sensitivity to the decay of Higgs bosons to muons, and to show the first experimental evidence for this process. The Higgs boson seems to interact also with second-generation particles in agreement with the prediction of the Standard Model, a result that will be further refined with the data we expect to collect in the next run,” said Roberto Carlin, spokesperson for the CMS experiment.

    The Higgs boson is the quantum manifestation of the Higgs field, which gives mass to elementary particles it interacts with, via the Brout-Englert-Higgs mechanism. By measuring the rate at which the Higgs boson decays into different particles, physicists can infer the strength of their interaction with the Higgs field: the higher the rate of decay into a given particle, the stronger its interaction with the field. So far, the ATLAS and CMS experiments have observed the Higgs boson decays into different types of bosons such as W and Z, and heavier fermions such as tau leptons. The interaction with the heaviest quarks, the top and bottom, was measured in 2018. Muons are much lighter in comparison and their interaction with the Higgs field is weaker. Interactions between the Higgs boson and muons had, therefore, not previously been seen at the LHC.

    Standard Model of Particle Physics, Quantum Diaries

    “This evidence of Higgs boson decays to second-generation matter particles complements a highly successful Run 2 Higgs physics programme. The measurements of the Higgs boson’s properties have reached a new stage in precision and rare decay modes can be addressed. These achievements rely on the large LHC dataset, the outstanding efficiency and performance of the ATLAS detector and the use of novel analysis techniques,” said Karl Jakobs, ATLAS spokesperson.

    What makes these studies even more challenging is that, at the LHC, for every predicted Higgs boson decaying to two muons, there are thousands of muon pairs produced through other processes that mimic the expected experimental signature. The characteristic signature of the Higgs boson’s decay to muons is a small excess of events that cluster near a muon-pair mass of 125 GeV, which is the mass of the Higgs boson. Isolating the Higgs boson to muon-pair interactions is no easy feat. To do so, both experiments measure the energy, momentum and angles of muon candidates from the Higgs boson’s decay. In addition, the sensitivity of the analyses was improved through methods such as sophisticated background modelling strategies and other advanced techniques such as machine-learning algorithms. CMS combined four separate analyses, each optimised to categorise physics events with possible signals of a specific Higgs boson production mode. ATLAS divided their events into 20 categories that targeted specific Higgs boson production modes.
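    The "small excess of events near 125 GeV" idea can be caricatured with a toy spectrum and a sideband subtraction (this is an illustration with invented numbers, not the experiments' actual statistical treatment, which relies on full background modelling and machine learning):

    ```python
    import random

    random.seed(1)

    # Toy dimuon-mass spectrum (GeV): a falling exponential background
    # plus a small Gaussian "signal" peak at 125 GeV.
    background = [110 + random.expovariate(1 / 30) for _ in range(20_000)]
    signal = [random.gauss(125, 1.5) for _ in range(400)]
    masses = background + signal

    def count_in_window(ms, lo, hi):
        return sum(lo <= m < hi for m in ms)

    # Count events in a 6 GeV window around 125 GeV, then estimate the
    # background under the peak from two equal-width sidebands.
    window = count_in_window(masses, 122, 128)
    sidebands = count_in_window(masses, 116, 122) + count_in_window(masses, 128, 134)
    excess = window - sidebands / 2
    print(f"events in window: {window}, estimated excess: {excess:.0f}")
    ```

    Averaging the two sidebands only roughly estimates a falling background, which is one reason the real analyses use far more sophisticated background models.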

    The results, which are so far consistent with the Standard Model predictions, used the full data set collected from the second run of the LHC. With more data to be recorded from the particle accelerator’s next run and with the High-Luminosity LHC, the ATLAS and CMS collaborations expect to reach the sensitivity (5 sigma) needed to establish the discovery of the Higgs boson decay to two muons and constrain possible theories of physics beyond the Standard Model that would affect this decay mode of the Higgs boson.

    Scientific materials

    Papers:
    CMS physics analysis summary: https://cds.cern.ch/record/2725423
    ATLAS paper on arXiv: https://arxiv.org/abs/2007.07830

    Physics briefings:
    CMS: https://cmsexperiment.web.cern.ch/news/cms-sees-evidence-higgs-boson-decaying-muons
    ATLAS: https://atlas.cern/updates/physics-briefing/new-search-rare-higgs-decays-muons

    Event displays and plots:
    CMS: https://cds.cern.ch/record/2720665?ln=en
    http://cds.cern.ch/record/2725728
    ATLAS: https://cds.cern.ch/record/2725717?ln=en
    https://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HIGG-2019-14

    Images:
    CMS muon system:

    ATLAS muon spectrometer:

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Tunnel

    SixTRack CERN LHC particles

     
  • richardmitnick 7:28 am on July 31, 2020 Permalink | Reply
    Tags: "Looking forward: ATLAS measures proton scattering when light turns into matter", ATLAS Forward Proton (AFP) spectrometer (fig No.1 in post), CERN LHC, Physicists studied data recorded by the AFP spectrometer throughout 2017 to establish direct evidence of these scattered protons when matter – electron–positron or muon–antimuon pairs – are cr,

    From CERN ATLAS: “Looking forward: ATLAS measures proton scattering when light turns into matter” 



    From CERN ATLAS

    30th July 2020
    ATLAS Collaboration

    Today, at the International Conference for High Energy Physics (ICHEP 2020), the ATLAS Collaboration announced first results using the ATLAS Forward Proton (AFP) spectrometer (Figure 1). With this instrument, physicists directly observed and measured the long sought-after prediction of proton scattering when particles of light turn into matter.

    Figure 1: Schematic diagram of ATLAS Forward Proton (AFP) spectrometer relative to the main ATLAS detector (not to scale). After the incident proton beams intersect, the leptons are detected by the main ATLAS detector and the scattered proton is detected by AFP. (Image: ATLAS Collaboration/CERN)

    In 1928, theoretical physicist Paul Dirac predicted the existence of the positron, the positively-charged antimatter partner to the electron. When brought together, this matter–antimatter pair annihilates into two particles of light (photons). Remarkably, quantum mechanics predicts that the reverse can also occur. Two photons with sufficient energy can turn into a matter–antimatter pair, as shown in Figure 2.

    Figure 2: Diagram of a pair of photons (γ) turning into a pair of leptons (electrons or muons) (ℓ). The scattered protons (p) can remain intact in such interactions, but are deflected from their paths along the beam so that they can be measured in a proton spectrometer. (Image: ATLAS Collaboration/CERN)

    To observe this phenomenon, physicists can use the LHC, a proton–proton collider, as a photon–photon collider. Usually, particles are created when protons collide head-on and break apart. However, if two protons pass very close to each other, they can scatter via the electromagnetic force, producing photons that turn into a matter–antimatter pair. The two protons remain intact and continue along the LHC beam pipe, where the AFP spectrometer can detect them. Observing these intact scattered protons is a hallmark of a photon–photon collision.

    The AFP spectrometer is unique in many ways. Installed in 2017, it is one of the newest additions to the ATLAS experiment. Its stations sit on either side of the main ATLAS cavern, just over 200 metres downstream from the collision point, as shown in Figure 1. Its silicon detectors reach directly into the LHC beam pipe, to within two millimetres of the proton beam itself. If a scattered proton emits a photon and loses a few per cent of its energy, the LHC magnets deflect the proton into the AFP spectrometer. These scattered protons are among the highest-energy particles measured at the LHC.

    Physicists studied data recorded by the AFP spectrometer throughout 2017 to establish direct evidence of these scattered protons when matter – electron–positron or muon–antimuon pairs – are created from the interaction of two photons. This was achieved by comparing the proton energy loss measured by the AFP spectrometer from the proton deflection angle to the produced matter–antimatter pair recorded in the central ATLAS experiment, as shown in Figure 3. If the scattered proton arose while photons turned into matter, the measurements from both locations are predicted to be equal (within the measurement precision).
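    The matching logic described here can be sketched in a few lines (the tolerance and the event values below are invented; the real analysis uses the detectors' actual measurement resolutions):

    ```python
    def is_match(xi_afp: float, xi_ll: float, tol: float = 0.005) -> bool:
        """Toy matching criterion: the proton's fractional energy loss seen
        by AFP should equal the value inferred from the lepton pair in the
        central detector, within the measurement precision."""
        return abs(xi_afp - xi_ll) <= tol

    # Hypothetical (xi_AFP, xi_ll) pairs: two signal-like, one accidental.
    events = [(0.042, 0.041), (0.103, 0.104), (0.061, 0.020)]
    matched = [e for e in events if is_match(*e)]
    print(f"{len(matched)} of {len(events)} events match")
    ```

    Events passing this kind of equality test populate the signal peak in Figure 3; accidental pairings form the flat background.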

    The ATLAS Collaboration has observed this striking phenomenon, recording 180 events that have an intact proton detected by the AFP spectrometer and a matching electron–positron or muon–antimuon pair measured in the main ATLAS detector. The expected background from accidentally matching forward protons amounts to about 20 events. The statistical significance of this result thus exceeds 9 standard deviations in each of the electron and muon channels.

    This landmark measurement using the AFP spectrometer provides valuable information about how often the protons stay intact, which is challenging to calculate from theory. These measurements are important tests of how light interacts with matter at the highest laboratory energies. Certain theories predict such interactions are modified by new particles that could explain the mysterious dark matter in our universe. With more data, physicists can use the AFP to search for these phenomena in new ways.

    Figure 3: The fractional proton energy loss measured by the AFP spectrometer (ξAFP) is compared to that measured from the electron or muon pairs in the central ATLAS detector (ξll). A signal peak is observed when these two quantities are approximately equal, indicating that the scattered proton emitted a photon that produced the lepton pair. The labels A and C denote opposite sides of the collision point along the beam line. (Image: ATLAS Collaboration/CERN)

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     
  • richardmitnick 7:24 am on June 26, 2020 Permalink | Reply
    Tags: "Experiment at CERN makes the first observation of rare events producing three massive force carriers simultaneously", CERN LHC, The simultaneous production of three W or Z bosons, subatomic "mediator particles" that carry the weak interaction.

    From Caltech: “Experiment at CERN makes the first observation of rare events producing three massive force carriers simultaneously” 


    From Caltech

    June 25, 2020
    Emily Velasco
    626‑395‑6487
    evelasco@caltech.edu


    Modern physics knows a great deal about how the universe works, from the grand scale of galaxies down to the infinitesimally small size of quarks and gluons. Still, the answers to some major mysteries, such as the nature of dark matter and origin of gravity, have remained out of reach.

    Caltech physicists and their colleagues have made a new observation of very rare events that could help take physics beyond its current understanding of the world. They did so using the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC), the largest and most powerful particle accelerator in existence, at the European Organization for Nuclear Research (CERN) in Geneva, Switzerland.


    The new observation involves the simultaneous production of three W or Z bosons, subatomic “mediator particles” that carry the weak interaction—one of the four known fundamental forces—which is responsible for the phenomenon of radioactivity and is an essential ingredient in the sun’s thermonuclear processes.

    Bosons are a class of particles that also include photons, which make up light; the Higgs boson, which is thought to be responsible for giving mass to matter; and gluons, which bind nuclei together. The W and Z bosons are similar to each other in that they both carry the weak force but are different in that the Z boson has no electric charge. The existence of these bosons, along with other subatomic particles like gluons and neutrinos, is explained by what is known as the Standard Model of particle physics.

    Caltech graduate student Zhicai Zhang (MS ’18), a member of the High Energy Physics research team led by Harvey Newman, the Marvin L. Goldberger Professor of Physics, and Maria Spiropulu, the Shang-Yi Ch’en Professor of Physics, is one of the principal contributors to the new observation, working together with other team members.

    The events producing the trios of bosons occur when intense bunches of high-energy protons that have been accelerated to nearly the speed of light are brought into a head-on collision at a few points along the circular path of the LHC. When two protons collide, the quarks and gluons in the protons are forced apart, and as that happens, W and Z bosons can pop into existence; in very rare cases, they appear as triplets: WWW, WWZ, WZZ, and ZZZ. Such triplets of W and Z bosons, Newman says, are only produced in one out of 10 trillion proton-proton collisions.

    These events are recorded using the CMS, which surrounds one of the collision points along the LHC’s path. Zhang says these events are 50 times rarer than those used to discover the Higgs boson.
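    Taking the "one out of 10 trillion" figure above at face value, and assuming a typical LHC inelastic collision rate of roughly a billion per second (our assumption for illustration, not a number from the article), the expected production rate works out to only a handful of events per day of running:

    ```python
    TRIPLET_PROB = 1e-13       # "one out of 10 trillion" collisions
    COLLISIONS_PER_SEC = 1e9   # assumed order-of-magnitude LHC collision rate

    # Expected triboson events produced per day of continuous running.
    per_day = TRIPLET_PROB * COLLISIONS_PER_SEC * 86_400
    print(f"~{per_day:.1f} triboson events produced per day")  # ~8.6
    ```

    Note that this counts events produced, not events detected; trigger and reconstruction efficiencies reduce the recorded sample further.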

    “With the LHC creating an enormous number of collisions, we can see things that are very rare, like the production of these bosons,” Newman says.

    It is possible for the W and Z bosons to self-interact, allowing them to create still more W and Z bosons; these may manifest as events with two or three massive bosons. Such production is rare, however: the more bosons involved, the less often it happens. The production of two massive bosons has previously been observed and measured with good precision at the LHC.

    The creation of these bosons was not the specific goal of the experiment, Newman says. By collecting enough data, including many events with boson triplets and other rare events, researchers will be able to test the Standard Model’s predictions with increasing precision and may eventually find and be able to study the new interactions that lie beyond it.

    “We know from observing the rotation and distribution of galaxies that there must be dark matter exerting its gravitational influence, but dark matter doesn’t fit into the Standard Model. There is no room in it for dark particles, nor does it include gravity, and it simply does not work at the energy scales typical of the early universe in the first moments after the Big Bang. We know that there is a more fundamental yet-to-be-discovered theory than the Standard Model,” Newman says.

    The next three-year experimental run, scheduled for 2021–24, is already being prepared. At the end of that run, the equipment will be upgraded to increase its data-collection capacity 30-fold. “There is a lot of unrealized potential. The masses of data we have already collected still represent only a few percent of what we expect to collect following major upgrades of both CMS and the LHC, at the High Luminosity LHC that is scheduled to run for 10 years beginning in 2027. We are only at the very beginning of this 30-year physics program,” he says.

    A paper describing their findings, titled, “Observation of heavy triboson production in leptonic final states in proton-proton collisions at √s = 13 TeV” is available at CERN [ http://cds.cern.ch/record/2714899 ].

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

    Caltech campus

     
  • richardmitnick 12:24 pm on April 30, 2020 Permalink | Reply
    Tags: "The large boson-boson collider", CERN LHC, Weak Interaction

    From Symmetry: “The large boson-boson collider” 

    From Symmetry

    04/30/20
    Sarah Charley

    Courtesy of CERN

    Scientists study rare, one-in-a-trillion heavy boson collisions happening inside the LHC.

    The Large Hadron Collider is the world’s most powerful particle accelerator. It accelerates and smashes protons and other atomic nuclei to study the fundamental properties of matter.


    Normally scientists look at the particles produced during these collisions to learn about the laws of nature. But scientists can also learn about subatomic matter by peering into the collisions themselves and asking: What exactly is doing the colliding?

    When the answer to that question involves rarely seen, massive particles, it gives scientists a unique way to study the Higgs boson.

    Protons are not solid spheres, but composite particles containing even tinier components called quarks and gluons.

    The quark structure of the proton 16 March 2006 Arpad Horvath

    “As far as we know the quarks and gluons are point-like particles with no internal structure,” says Aram Apyan, a research associate at the US Department of Energy’s Fermi National Accelerator Laboratory.

    According to Apyan, two quarks cannot actually hit each other; they don’t have volume or surfaces. So what really happens when these point-like particles collide?

    “When we talk about two quarks colliding, what we really mean is that they are very close to each other spatially and exchanging particles,” says Richard Ruiz, a theorist at Université Catholique de Louvain in Belgium. “Namely, they exchange force-carrying bosons.”

    All elementary matter particles (like quarks and electrons) communicate with each other through bosons. For instance, quarks know to bind together by throwing bosons called gluons back and forth, which carry the message, “Stick together!”

    Almost every collision inside the LHC starts with an exchange of bosons (the only exceptions are when matter particles meet antimatter particles).

The lion’s share of LHC collisions happens when two passing energetic gluons meet, fuse and then transform into all sorts of particles through the wonders of quantum mechanics.

    Gluons carry the strong interaction, which pulls quarks together into particles like protons and neutrons. Gluon-gluon collisions are so powerful that the protons they are a part of are ripped apart and the original quarks in those protons are consumed.

    In extremely rare instances, colliding quarks can also interact through a different force: the weak interaction, which is carried by the massive W and Z bosons. The weak interaction arbitrates all nuclear decay and fusion, such as when the protons in the center of the sun are squished and squeezed into helium nuclei.

The weak interaction passes the message, “Time to change!” and inspires quarks to take on a new identity–for instance, to change from a down quark to an up quark or vice versa.

    Although it may seem counterintuitive, the W and Z bosons that carry the weak interaction are extremely heavy–roughly 80 times more massive than the protons the LHC smashes together. For two minuscule quarks to produce two enormous W or Z bosons simultaneously, they need access to a big pot of excess energy.
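As a rough check on that factor of 80 (a quick calculation added here, using rounded Particle Data Group mass values rather than numbers from the article):

```python
# Illustrative check of the boson-to-proton mass ratio quoted above.
# Masses in GeV (rounded PDG values).
M_W = 80.4   # W boson
M_Z = 91.2   # Z boson
M_P = 0.938  # proton

print(round(M_W / M_P))  # 86 -- "roughly 80 times" the proton mass
print(round(M_Z / M_P))  # 97
```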

    That’s where the LHC comes in; by accelerating protons to nearly the speed of light, it produces the most energetic collisions ever seen in a particle accelerator. “The LHC is special,” Ruiz says. “The LHC is the first collider in which we have evidence of W and Z boson scattering; the weak interaction bosons themselves are colliding.”

    Even inside the LHC, weak interaction boson-boson collisions are extremely rare. This is because the range of the weak interaction extends to only about 0.1% of the diameter of a proton. (Compare this to the range of the strong interaction, which is equivalent to the proton’s diameter.)

    “This range is quite small,” Apyan says. “Two quarks have to be extremely close and radiate a W or Z boson simultaneously for there to be a chance of the bosons colliding.”

    Apyan studies collisions in which two colliding quarks simultaneously release a W or Z boson, which then scatter off one another before transforming into more stable particles. Unlike other processes, the W and Z boson collisions maintain their quarks, which then fly off into the detector as the proton falls apart. “This process has a nice signature,” Apyan says. “The remnants of the original quarks end up in our detector, and we see them as jets of particles very close to the beampipe.”

    The probability of this happening during an LHC collision is about one in a trillion. Luckily, the LHC generates about 600 million proton-proton collisions every second. At this rate, scientists are able to see this extremely rare event about once every other minute when the LHC is running.

    These heavy boson-boson collisions inside the LHC provide physicists with a unique view of the subatomic world, Ruiz says.

    Creating and scattering bosons allows physicists to see how their mathematical models hold up under stringent experimental tests. This can allow them to search for physics beyond the Standard Model.

    The scattering of W and Z bosons is a particularly pertinent test for the strength of the Higgs field. “The coupling strength between the Higgs boson and W and Z bosons is proportional to the masses of the W and Z bosons, and this raises many interesting questions,” Apyan says.

    Even small tweaks to the Higgs field could have major implications for the properties of Z and W bosons and how they ricochet off each other. By studying how these particles collide inside the LHC, scientists are able to open yet another window into the properties of the Higgs.

See the full article here.



    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:38 am on March 30, 2020 Permalink | Reply
    Tags: "Big Labs Replace Data Taking with New Priorities", , Brookhaven National Laboratory’s National Synchrotron Light Source II (NSLS-II), , CERN LHC, ,   

    From “Physics”: “Big Labs Replace Data Taking with New Priorities” 

From “Physics”

    March 30, 2020
    Katherine Wright

    Large research facilities have curtailed data collection and shut their doors—but their scientists are busier than ever, and some have joined the fight against COVID-19.

Image: Brookhaven National Laboratory.

    Despite being stuck at home, John Hill has never been busier. As the director of Brookhaven National Laboratory’s National Synchrotron Light Source II (NSLS-II)—a state-of-the-art x-ray facility in New York—Hill spent the middle of March rapidly ramping down experiments in response to the COVID-19 pandemic. As of midday March 23rd, only two of the 28 beamlines at NSLS-II remained operational, and only a handful of staff were onsite.

Images: the BNL NSLS-II facility and its interior.

    Like many large facilities, NSLS-II shut its doors to comply with government guidelines and to keep staff safe and healthy. These rapid closures have, in some cases, brought experiments to a grinding halt—LIGO in the US and Virgo in Italy both stopped their search for gravitational-wave signals on Friday, more than a month earlier than they’d planned before the pandemic. At other places, such as CERN in Switzerland, where detectors were already off, long-awaited upgrades are now on indefinite hold.

Images: the Caltech/MIT Advanced LIGO installations at Hanford, WA and Livingston, LA, and the CERN LHC and CERN map (photo credits: Maximilien Brice and Julien Marius Ordan, CERN).

    The scientists in charge of major experiments are now scrambling to rethink the next few months. And even though most researchers from these facilities are housebound, they are still hard at work. Most have switched their attention to data analysis, paper writing, and software development, interspersed with virtual meetings. A small number of others remain onsite, busy with an unexpected focus: new experiments to understand the virus that causes COVID-19.

    “The last three weeks have been nonstop planning,” says David Reitze, the director of LIGO, who spoke from his home in California. “We had to react very quickly.” Reitze has been in daily meetings with his LIGO and Virgo colleagues to implement remote working procedures for its 1300 international team members, who are used to frequent in-person meetings. Until Friday, LIGO had kept a skeleton crew running its two detectors, but the collaboration decided to turn the detectors off. Reitze says it was sad to end early, but he wanted to keep the staff safe.

    In addition to making contingency plans, researchers are busy replacing their usual in-person meetings with virtual ones. LIGO and Virgo canceled their biannual conference, scheduled for March 16th at Lake Geneva in Wisconsin. “The risk of becoming a central spreader of the disease just didn’t seem like a good idea,” says Patrick Brady, the current spokesperson for the LIGO collaboration. Instead, attendees live-streamed talks, with around 200 people watching the plenary session. “It was a chance to focus on science,” Reitze says. “That brought a bit of normalcy.”

    The researchers interviewed for this story say that they expect little to no impact on the scientific output from their institutions—for now. Many places have loads of data to analyze already. The gravitational-wave community has 11 months of data and over 50 events to analyze from the latest observation run. Scientists using Diamond Light Source, a synchrotron in the UK, just finished a cycle of experiments.

    Diamond Light Source, located at the Harwell Science and Innovation Campus in Oxfordshire U.K.

    And CERN has been undergoing upgrades since December 2018, so many of the scientists were already focused on data analysis. “Eighteen months after a shutdown you often get a spike in publications as people write up all the work that they have perhaps fallen behind with,” says Andrew Harrison, the CEO of Diamond.

    That said, this shutdown is different from most. LIGO and Virgo expect to submit papers from this run later than initially planned, having added a four-week extension to their writing activities. “Because of the sense of anxiety that many people are feeling, we decided to relax our timelines,” Brady says.

    Physics output could be impacted if the shutdowns extend beyond a few months. At CERN, for example, the ongoing instrument and equipment upgrades have been deemed nonessential activities and are now on hiatus. “We had to drop the screwdrivers,” says Giovanni Passaleva, the spokesperson for CERN’s Large Hadron Collider Beauty (LHCb) experiment.

    CERN LHCb chamber, LHC

    Stopping the upgrade could potentially delay LHC’s plan to restart in May 2021. But with no equipment to attend to, Passaleva notes that updates to LHCb’s software are “going faster than before.” And his group is still committed to its daily coffee hour, only now they do it online. “It’s very important that we keep connections with each other,” he says.

    In-person interactions are still possible at some labs. At NSLS-II, a handful of scientists are onsite helping researchers from pharmaceutical companies and academia study the crystal structures of synthetic versions of proteins found in the virus that causes COVID-19. Their goal is to use this information to develop drugs for treating those infected with the disease, Hill says. Similar experiments are ongoing at the Advanced Photon Source at Argonne National Laboratory, Illinois, and will start tomorrow at two beamlines at Diamond, which as of Friday had received a dozen applications for experiments to study the proteins in the virus.

    ANL Advanced Photon Source

    Hill says it’s good for NSLS-II that it can continue to contribute. “We don’t feel we are sitting powerless, watching this disease come—we are actively trying to fight it.” Harrison echoes this sentiment, saying he is pleased that basic science is considered essential work and that Diamond can contribute in the effort to understand the new disease. “It’s very positive that governments are engaging [with scientists],” he says. He also thinks the situation has forced scientists to refocus their priorities. “The things you thought were important just completely change,” he says.

See the full article here.


    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments (physics@aps.org).

     
  • richardmitnick 8:41 am on March 25, 2020 Permalink | Reply
    Tags: "Plasma polarised by spin-orbit effect", , , , , CERN LHC, , , ,   

    From CERN Courier: “Plasma polarised by spin-orbit effect” 


    From CERN Courier

    23 March 2020

    A report from the ALICE experiment

Fig. 1. The spin alignment of (spin-1) K*0 mesons (red circles) can be characterised by deviations from ρ00 = 1/3, which is estimated here versus their transverse momenta, pT. The same variable was estimated for (spin-0) K0S mesons (magenta stars), and K*0 mesons produced in proton–proton collisions with negligible angular momentum (hollow orange circles), as systematic tests. Credit: CERN

Spin-orbit coupling causes fine structure in atomic physics and shell structure in nuclear physics, and is a key ingredient in the field of spintronics in materials sciences. It is also expected to affect the development of the quickly rotating quark–gluon plasma (QGP) created in non-central collisions of lead nuclei at LHC energies. As such plasmas are created by the collisions of lead nuclei that almost miss each other, they have very high angular momenta of the order of 10⁷ħ – equivalent to the order of 10²¹ revolutions per second. While the extreme magnetic fields generated by spectating nucleons (of the order of 10¹⁴ T, CERN Courier Jan/Feb 2020 p17) quickly decay as the spectator nucleons pass by, the plasma’s angular momentum is sustained throughout the evolution of the system as it is a conserved quantity. These extreme angular momenta are expected to lead to spin-orbit interactions that polarise the quarks in the plasma along the direction of the angular momentum of the plasma’s rotation. This should in turn cause the spins of vector (spin-1) mesons to align if hadronisation proceeds via the recombination of partons or by fragmentation. To study this effect, the ALICE collaboration recently measured the spin alignment of the decay products of neutral K* and φ vector mesons produced in non-central Pb–Pb collisions.

    Spin alignment can be studied by measuring the angular distribution of the decay products of the vector mesons. It is quantified by the probability ρ00 of finding a vector meson in a spin state 0 with respect to the direction of the angular momentum of the rotating QGP, which is approximately perpendicular to the plane of the beam direction and the impact parameter of the two colliding nuclei. In the absence of spin-alignment effects, the probability of finding a vector meson in any of the three spin states (–1, 0, 1) should be equal, with ρ00 = 1/3.
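In the standard spin-density-matrix formalism (the convention assumed here), ρ00 enters through the polar-angle distribution of the decay daughters:

```latex
\frac{\mathrm{d}N}{\mathrm{d}\cos\theta^{*}} \propto (1-\rho_{00}) + (3\rho_{00}-1)\cos^{2}\theta^{*}
```

For ρ00 = 1/3 the distribution is flat; any spin alignment shows up as a cos²θ* modulation, with θ* measured relative to the quantization axis.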

    The ALICE collaboration measured the angular distributions of neutral K* and φ vector mesons via their hadronic decays to Kπ and KK pairs, respectively. ρ00 was found to deviate from 1/3 for low-pT and mid-central collisions at a level of 3σ (figure 1). The corresponding results for φ mesons show a deviation of ρ00 values from 1/3 at a level of 2σ. The observed pT dependence of ρ00 is expected if quark polarisation via spin-orbit coupling is subsequently transferred to the vector mesons by hadronisation, via the recombination of a quark and an anti-quark from the quark–gluon plasma. The data are also consistent with the initial angular momentum of the hot and dense matter being highest for mid-central collisions and decreasing towards zero for central and peripheral collisions.

    The results are surprising, however, as corresponding quark-polarisation values obtained from studies with Λ hyperons are compatible with zero. A number of systematic tests have been carried out to verify these surprising results. K0S mesons do indeed yield ρ00 = 1/3, indicating no spin alignment, as must be true for a spin-zero particle. For proton–proton collisions, the absence of initial angular momentum also leads to ρ00 = 1/3, consistent with the observed neutral K* spin alignment being the result of spin-orbit coupling.

    The present measurements are a step towards experimentally establishing possible spin-orbit interactions in the relativistic-QCD matter of the quark–gluon plasma. In the future, higher statistics measurements in Run 3 will significantly improve the precision, and studies with the charged K*, which has a magnetic moment seven times larger than neutral K*, may even allow a direct observation of the effect of the strong magnetic fields initially experienced by the quark–gluon plasma.
    Further reading

    ALICE Collaboration 2019 arXiv:1910.14408.

    ALICE Collaboration 2019 arXiv:1909.01281.

See the full article here.



Images: the four major experiment collaborations – ATLAS, ALICE, CMS and LHCb – plus the CERN LHC, CERN map and the SixTrack LHC particle visualisation.

     
  • richardmitnick 3:14 pm on March 10, 2020 Permalink | Reply
    Tags: "Accounting for the Higgs", , , CERN LHC, , , , , ,   

    From Symmetry: “Accounting for the Higgs” 

From Symmetry

    03/10/20
    Sarah Charley

    Only a fraction of collision events that look like they produce a Higgs boson actually produce a Higgs boson. Luckily, it doesn’t matter.

    CERN CMS Higgs Event May 27, 2012

Images: the CERN LHC and CERN map (photo credits: Maximilien Brice and Julien Marius Ordan, CERN), and the four major experiment collaborations – ATLAS (credit: Claudia Marcelloni, CERN/ATLAS), ALICE, CMS and LHCb.

    I’ll let you in on a little secret: Even though physicists have produced millions of Higgs bosons at the Large Hadron Collider, they’ve never actually seen one. Higgs bosons are fragile things that dissolve immediately after they’re born. But as they die, they produce other particles, which, if they’re created at the LHC, can travel through a particle detector and leave recognizable signatures.

    Here’s another secret: Higgs signatures are identical to the signatures of numerous other processes. In fact, every time the Higgs signs its name in a detector, there are many more background processes leaving the exact same marks.

    For instance, one of the Higgs boson’s cleanest signatures is two photons with a combined mass of around 125 billion electronvolts. But for every 10 diphotons that look like a Higgs signature, only about one event actually belongs to a Higgs.
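That combined (invariant) mass is computed from the two photon energies and their opening angle. A minimal sketch, with illustrative numbers chosen here (not taken from any real event):

```python
import math

def diphoton_mass(e1, e2, opening_angle):
    """Invariant mass (GeV) of two massless photons with energies e1, e2
    (GeV) separated by opening_angle (radians): m^2 = 2*e1*e2*(1 - cos(theta))."""
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(opening_angle)))

# Two hypothetical 62.5 GeV photons emitted back-to-back reconstruct
# to the Higgs mass region:
print(round(diphoton_mass(62.5, 62.5, math.pi), 1))  # 125.0
```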

    So how can scientists study something that they cannot see and cannot isolate? They employ the same technique FBI agents use to uncover illegal money laundering schemes: accounting.

    In money laundering, “dirty” money (from illegal activities) is mixed with “clean” money from a legitimate business like a car wash. It all looks the same, so determining which Benjamins came from drugs versus which came from detailing is impossible. But agents don’t need to look at the individual dollars; they just need to look for suspiciously large spikes in profit that cannot be explained by regular business activities.

    In physics, the accounting comes from a much-loved set of equations called the Standard Model.

    Standard Model of Particle Physics, Quantum Diaries

    Physicists have spent decades building and perfecting the Standard Model, which tells them what percentage of the time different subatomic processes should happen. Scientists know which signatures are associated with which processes, so if they see a signature more often than expected, it means there is something happening outside the purview of the Standard Model: a new process.

    Clever accounting is how scientists originally discovered the Higgs boson in 2012. Theorists predicted what the Higgs signatures should look like, and when physicists went searching, they consistently saw some of these signatures more frequently than they could explain without the Higgs boson. When scientists added the mathematics for the Higgs boson into the equations, the predictions matched the data.
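A toy version of this accounting, with made-up numbers: compare the observed count of a signature against the Standard Model expectation and ask whether the excess is larger than the expected statistical fluctuation.

```python
import math

def excess_significance(observed, expected):
    """Approximate significance of an excess over a Poisson background,
    (n - b) / sqrt(b). Real analyses use far more careful statistics."""
    return (observed - expected) / math.sqrt(expected)

background = 10000.0  # events the Standard Model predicts (hypothetical)
observed = 10500      # events actually counted (hypothetical)
print(round(excess_significance(observed, background), 1))  # 5.0
```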

    Today, physicists use this accounting method to search for new particles. Many of these new particles are predicted to be rarer than Higgs bosons (for reference, Higgs bosons are produced in about one in a billion collisions). Many processes are also less clear-cut, and just the act of standardizing the accounting is a challenge. (To return to the money laundering analogy, it would be like FBI agents investigating an upscale bar, where a sudden excess could be explained by a generous tip.)

    To find these complex and subtle signs of mischief, scientists need a huge amount of data and a finely tuned model. Future runs of the LHC will be dedicated to building up these enormous datasets so that scientists can dig through the books for numbers that the Standard Model cannot explain.

See the full article here.




     
  • richardmitnick 1:38 pm on January 29, 2020 Permalink | Reply
    Tags: "Particle Physics Turns to Quantum Computing for Solutions to Tomorrow’s Big-Data Problems", , CERN LHC, , , , ,   

    From Lawrence Berkeley National Lab: “Particle Physics Turns to Quantum Computing for Solutions to Tomorrow’s Big-Data Problems” 

    From Lawrence Berkeley National Lab

    January 29, 2020
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Berkeley Lab researchers testing several techniques, technologies to be ready for the incoming deluge of particle data.

Display of a simulated High-Luminosity Large Hadron Collider (HL-LHC) particle collision event in an upgraded ATLAS detector. The event has an average of 200 collisions per particle bunch crossing. (Credit: ATLAS Collaboration/CERN)

    Giant-scale physics experiments are increasingly reliant on big data and complex algorithms fed into powerful computers, and managing this multiplying mass of data presents its own unique challenges.

    To better prepare for this data deluge posed by next-generation upgrades and new experiments, physicists are turning to the fledgling field of quantum computing to find faster ways to analyze the incoming info.


    In a conventional computer, memory takes the form of a large collection of bits, and each bit has only two values: a one or zero, akin to an on or off position. In a quantum computer, meanwhile, data is stored in quantum bits, or qubits. A qubit can represent a one, a zero, or a mixed state in which it is both a one and a zero at the same time.

    By tapping into this and other quantum properties, quantum computers hold the potential to handle larger datasets and quickly work through some problems that would trip up even the world’s fastest supercomputers. For other types of problems, though, conventional computers will continue to outperform quantum machines.
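A classical sketch of the qubit idea (purely illustrative – simulating amplitudes this way does not scale, which is exactly why quantum hardware is interesting):

```python
import random
from math import sqrt

# A single qubit modeled as two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring it collapses the state to 0 or 1
# with probabilities |alpha|^2 and |beta|^2.
def measure(alpha, beta):
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Equal superposition: "a one and a zero at the same time" until measured.
alpha = beta = 1 / sqrt(2)
random.seed(0)
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly 5000 / 5000
```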

    The High Luminosity Large Hadron Collider (HL-LHC) Project, a planned upgrade of the world’s largest particle accelerator at the CERN laboratory in Europe, will come on line in 2026.

Images: the CERN LHC and CERN map (photo credits: Maximilien Brice and Julien Marius Ordan, CERN), and the four major experiment collaborations – ATLAS (credit: Claudia Marcelloni, CERN/ATLAS), ALICE, CMS and LHCb.

    It will produce billions of particle events per second – five to seven times more data than its current maximum rate – and CERN is seeking new approaches to rapidly and accurately analyze this data.

    In these particle events, positively charged subatomic particles called protons collide, producing sprays of other particles, including quarks and gluons, from the energy of the collision. The interactions of particles can also cause other particles – like the Higgs boson – to pop into existence.

    Tracking the creation and precise paths (called “tracks”) of these particles as they travel through layers of a particle detector – while excluding the unwanted mess, or “noise” produced in these events – is key in analyzing the collision data.

    The data will be like a giant 3D connect-the-dots puzzle that contains many separate fragments, with little guidance on how to connect the dots.

    To address this next-gen problem, a group of student researchers and other scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have been exploring a wide range of new solutions.

    One such approach is to develop and test a variety of algorithms tailored to different types of quantum-computing systems. Their aim: Explore whether these technologies and techniques hold promise for reconstructing these particle tracks better and faster than conventional computers can.

    Particle detectors work by detecting energy that is deposited in different layers of the detector materials. In the analysis of detector data, researchers work to reconstruct the trajectory of specific particles traveling through the detector array. Computer algorithms can aid this process through pattern recognition, and particles’ properties can be detailed by connecting the dots of individual “hits” collected by the detector and correctly identifying individual particle trajectories.

A new wheel-shaped muon detector is part of the ATLAS detector upgrade at CERN. This wheel-shaped detector measures more than 30 feet in diameter. (Credit: Julien Marius Ordan/CERN)

    Heather Gray, an experimental particle physicist at Berkeley Lab and a UC Berkeley physics professor, leads the Berkeley Lab-based R&D effort – Quantum Pattern Recognition for High-Energy Physics (HEP.QPR) – that seeks to identify quantum technologies to rapidly perform this pattern-recognition process in very-high-volume collision data. This R&D effort is funded as part of the DOE’s QuantISED (Quantum Information Science Enabled Discovery for High Energy Physics) portfolio.

    The HEP.QPR project is also part of a broader initiative to boost quantum information science research at Berkeley Lab and across U.S. national laboratories.

    Other members of the HEP.QPR group are: Wahid Bhimji, Paolo Calafiura, Wim Lavrijsen, and former postdoctoral researcher Illya Shapoval, who explored quantum algorithms for associative memory. Bhimji is a big data architect at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC). Calafiura is chief software architect of CERN’s ATLAS experiment and a member of Berkeley Lab’s Computational Research Division (CRD). And Lavrijsen is a CRD software engineer who is also involved in CERN’s ATLAS experiment.

    Members of the HEP.QPR project have collaborated with researchers at the University of Tokyo and from Canada on the development of quantum algorithms in high-energy physics, and jointly organized a Quantum Computing Mini-Workshop at Berkeley Lab in October 2019.

    Gray and Calafiura were also involved in a CERN-sponsored competition, launched in mid-2018, that challenged computer scientists to develop machine-learning-based techniques to accurately reconstruct particle tracks using a simulated set of HL-LHC data known as TrackML. Machine learning is a form of artificial intelligence in which algorithms can become more efficient and accurate through a gradual training process akin to human learning. Berkeley Lab’s quantum-computing effort in particle-track reconstruction also utilizes this TrackML set of simulated data.

    Berkeley Lab and UC Berkeley are playing important roles in the rapidly evolving field of quantum computing through their participation in several quantum-focused efforts, including The Quantum Information Edge, a research alliance announced in December 2019.

    The Quantum Information Edge is a nationwide alliance of national labs, universities, and industry advancing the frontiers of quantum computing systems to address scientific challenges and maintain U.S. leadership in next-generation information technology. It is led by the DOE’s Berkeley Lab and Sandia National Laboratories.

The series of articles listed below profiles three student researchers who have participated in Berkeley Lab-led efforts to apply quantum computing to the pattern-recognition problem in particle physics:

    Lucy Linder, while working as a researcher at Berkeley Lab, developed her master’s thesis – supervised by Berkeley Lab staff scientist Paolo Calafiura – about the potential application of a quantum-computing technique called quantum annealing for finding particle tracks. She remotely accessed quantum-computing machines at D-Wave Systems Inc. in Canada and at Los Alamos National Laboratory in New Mexico.

    Linder’s approach was to first format the particle-track simulated data as something known as a QUBO (quadratic unconstrained binary optimization) problem that formulated the problem as an equation with binary values: either a one or a zero. This QUBO formatting also helped prepare the data for analysis by a quantum annealer, which uses qubits to help identify the best possible solution by applying a physics principle that describes how objects naturally seek the lowest-possible energy state.
    Read More
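The shape of a QUBO can be seen in a tiny brute-force version (the matrix below is invented for illustration, not Linder’s actual track formulation; a real annealer searches problems far too large to enumerate):

```python
from itertools import product

# A QUBO assigns an energy x^T Q x to each binary vector x; the solver's
# job is to find the assignment with the lowest energy. For tiny problems
# we can simply enumerate every assignment.
def solve_qubo(Q):
    n = len(Q)
    best = None
    for x in product((0, 1), repeat=n):
        energy = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if best is None or energy < best[1]:
            best = (x, energy)
    return best

Q = [[-1, 2, 0],
     [0, -1, 0],
     [0, 0, -2]]
print(solve_qubo(Q))  # ((0, 1, 1), -3)
```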

    Eric Rohm, an undergraduate student working on a contract at Berkeley Lab as part of the DOE’s Science Undergraduate Laboratory Internship program, developed a quantum approximate optimization algorithm (QAOA) using quantum-computing resources at Rigetti Computing in Berkeley, California. He was supervised by Berkeley Lab physicist Heather Gray.

    This approach used a blend of conventional and quantum computing techniques to develop a custom algorithm. The algorithm, still in refinement, has been tested on the Rigetti Quantum Virtual Machine, a conventional computer that simulates a small quantum computer. The algorithm may eventually be tested on a Rigetti quantum processing unit that is equipped with actual qubits.
    Read More

Amitabh Yadav, a student research associate at Berkeley Lab since November who is supervised by Gray and Berkeley Lab software engineer Wim Lavrijsen, is working to apply a quantum version of a conventional technique called the Hough transform to identify and reconstruct particle tracks using IBM’s Quantum Experience, a cloud-based quantum-computing platform.

    The classical Hough transform technique can detect specific features such as lines, curves, and circles in complex patterns, and the quantum Hough transform technique could potentially pick out more complex shapes from exponentially larger datasets. Read More
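
    A minimal sketch of the classical version shows the idea (the point coordinates here are invented for illustration): every hit votes for all lines passing through it in a (theta, rho) parameter space, and collinear hits, the signature of a straight track, pile their votes into one accumulator cell.

    ```python
    import numpy as np

    # Points lying on the line y = x, the kind of collinear hit
    # pattern a straight particle track would leave in a detector.
    points = [(t, t) for t in range(1, 6)]

    # Parameterize lines as rho = x*cos(theta) + y*sin(theta) and let
    # every point vote for all (theta, rho) pairs consistent with it.
    n_theta, n_rho, rho_max = 180, 101, 10.0
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    accumulator = np.zeros((n_theta, n_rho), dtype=int)

    for x, y in points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        bins = np.round((rhos + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        valid = (bins >= 0) & (bins < n_rho)
        accumulator[np.arange(n_theta)[valid], bins[valid]] += 1

    # The accumulator peak identifies the line shared by all points.
    t_idx, r_idx = np.unravel_index(accumulator.argmax(), accumulator.shape)
    theta_peak = thetas[t_idx]
    rho_peak = r_idx / (n_rho - 1) * 2 * rho_max - rho_max
    print(theta_peak, rho_peak)  # ~3*pi/4 and ~0: the line y = x
    ```

    The cost of this voting grows with the number of hits and the resolution of the parameter grid, which is where a quantum formulation could in principle offer an advantage on much larger datasets.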

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    LBNL Molecular Foundry

    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

     