
  • richardmitnick, 10:52 am on March 25, 2019
    Tags: ExaLearn, Physics

    From insideHPC: “ExaLearn Project to bring Machine Learning to Exascale” 

    From insideHPC

    March 24, 2019

    As supercomputers become ever more capable in their march toward exascale levels of performance, scientists can run increasingly detailed and accurate simulations to study problems ranging from cleaner combustion to the nature of the universe. But such simulations come at a steep computational cost. Enter ExaLearn, a new machine learning project supported by DOE’s Exascale Computing Project (ECP), which aims to develop tools to help scientists overcome this challenge by applying machine learning to very large experimental datasets and simulations.

    The first research area for ExaLearn’s surrogate models will be in cosmology, to support projects such as the LSST (Large Synoptic Survey Telescope), now under construction in Chile and shown here in an artist’s rendering. (Todd Mason, Mason Productions Inc. / LSST Corporation)

    The challenge is that these powerful simulations require lots of computer time. That is, they are “computationally expensive,” consuming 10 to 50 million CPU hours for a single simulation. For example, running a 50-million-hour simulation on all 658,784 compute cores of the Cori supercomputer at NERSC would take more than three days.
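A quick back-of-the-envelope check of that figure (a sketch only; the core count and CPU-hour total are taken from the article, and perfect parallel scaling is assumed):

```python
# Ideal wall-clock time for a 50-million-CPU-hour simulation spread across
# all 658,784 Cori compute cores, assuming perfect parallel scaling.
cpu_hours = 50_000_000
cores = 658_784

wall_hours = cpu_hours / cores
wall_days = wall_hours / 24

print(f"{wall_hours:.1f} hours, i.e. about {wall_days:.2f} days")
```

With these numbers the ideal runtime comes out to roughly 76 hours, consistent with the article’s “more than three days.”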


    NERSC Cray Cori II supercomputer at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    Running thousands of these simulations, which are needed to explore wide ranges in parameter space, would be intractable.

    One of the areas ExaLearn is focusing on is surrogate models. Surrogate models, often known as emulators, are built to provide rapid approximations of more expensive simulations. This allows a scientist to generate additional simulations more cheaply – running much faster on many fewer processors. To do this, the team will need to run thousands of computationally expensive simulations over a wide parameter space to train the computer to recognize patterns in the simulation data. This then allows the computer to create a computationally cheap model, easily interpolating between the parameters it was initially trained on to fill in the blanks between the results of the more expensive models.
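The training-then-interpolation workflow described above can be sketched with a toy emulator. This is purely illustrative (a one-parameter stand-in function and a piecewise-linear fit), not ExaLearn’s actual software:

```python
# Toy illustration of the surrogate-model idea: run an "expensive" simulation
# at a few training parameter values, then answer new queries by cheap
# interpolation between the stored results.
import math

def expensive_simulation(param):
    # Stand-in for a multi-million-CPU-hour run.
    return math.sin(param) + 0.1 * param

# Training phase: sample the parameter space coarsely with full simulations.
train_params = [0.0, 0.5, 1.0, 1.5, 2.0]
train_results = [expensive_simulation(p) for p in train_params]

def surrogate(param):
    """Cheap piecewise-linear emulator that fills in between training runs."""
    for (p0, r0), (p1, r1) in zip(zip(train_params, train_results),
                                  zip(train_params[1:], train_results[1:])):
        if p0 <= param <= p1:
            t = (param - p0) / (p1 - p0)
            return r0 + t * (r1 - r0)
    raise ValueError("parameter outside the trained range")

# Query a parameter value that was never simulated directly.
print(surrogate(0.75))  # close to expensive_simulation(0.75), at trivial cost
```

Real emulators replace the linear fit with deep networks over many parameters, but the division of labor is the same: expensive runs up front, cheap interpolation afterwards.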

    “Training can also take a long time, but then we expect these models to generate new simulations in just seconds,” said Peter Nugent, deputy director for science engagement in the Computational Research Division at LBNL.

    From Cosmology to Combustion

    Nugent is leading the effort to develop the so-called surrogate models as part of ExaLearn. The first research area will be cosmology, followed by combustion. But the team expects the tools to benefit a wide range of disciplines.

    “Many DOE simulation efforts could benefit from having realistic surrogate models in place of computationally expensive simulations,” ExaLearn Principal Investigator Frank Alexander of Brookhaven National Lab said at the recent ECP Annual Meeting.

    “These can be used to quickly flesh out parameter space, help with real-time decision making and experimental design, and determine the best areas to perform additional simulations.”

    The surrogate models and related simulations will aid in cosmological analyses to reduce systematic uncertainties in observations by telescopes and satellites. Such observations generate massive datasets that are currently limited by systematic uncertainties. Since we only have a single universe to observe, the only way to address these uncertainties is through simulations, so creating cheap but realistic and unbiased simulations greatly speeds up the analysis of these observational datasets. A typical cosmology experiment now requires sub-percent level control of statistical and systematic uncertainties. This then requires the generation of thousands to hundreds of thousands of computationally expensive simulations to beat down the uncertainties.
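The “beat down the uncertainties” step follows the standard Monte Carlo scaling, in which averaging N independent realisations shrinks the statistical error roughly as 1/sqrt(N). A small sketch (the 10% per-simulation scatter is an invented illustrative number, not a value from the article):

```python
# Why sub-percent statistical precision demands thousands of realisations:
# averaging N independent simulations shrinks the statistical uncertainty
# roughly like 1/sqrt(N) (standard Monte Carlo scaling).
import math

per_sim_scatter = 0.10  # assumed ~10% scatter per realisation (illustrative)

for n_sims in (100, 1_000, 10_000, 100_000):
    uncertainty = per_sim_scatter / math.sqrt(n_sims)
    print(f"{n_sims:>7} simulations -> ~{uncertainty:.3%} statistical uncertainty")
```

Under this scaling, reaching sub-percent precision from percent-level scatter requires ten-thousand-fold ensembles, which is where cheap surrogates pay off.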

    These parameters are critical in light of two upcoming programs:

    The Dark Energy Spectroscopic Instrument, or DESI, is an advanced instrument on a telescope located in Arizona that is expected to begin surveying the universe this year.

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)

    DESI seeks to map the large-scale structure of the universe over an enormous volume and a wide range of look-back times (based on “redshift,” or the shift of the light of distant objects toward redder wavelengths). Targeting about 30 million pre-selected galaxies across one-third of the night sky, scientists will use DESI’s redshift data to construct 3D maps of the universe. About 10 terabytes (TB) of raw data per year will be transferred from the observatory to NERSC. After the data are run through the pipelines at NERSC (using millions of CPU hours), about 100 TB per year of data products will be made available in data releases approximately once a year throughout DESI’s five years of operations.
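The relation between redshift and look-back time comes from integrating the cosmic expansion history. A minimal sketch in a flat Lambda-CDM model; the parameter values (H0 = 70 km/s/Mpc, Omega_m = 0.3) are illustrative assumptions, not DESI measurements:

```python
# Look-back time as a function of redshift in flat Lambda-CDM:
#   t_lb(z) = integral_0^z dz' / ((1 + z') * H(z'))
import math

H0 = 70.0                      # Hubble constant, km/s/Mpc (assumed)
OMEGA_M, OMEGA_L = 0.3, 0.7    # matter / dark-energy fractions (assumed)
HUBBLE_TIME_GYR = 977.8 / H0   # 1/H0 in Gyr (977.8 converts km/s/Mpc -> 1/Gyr)

def lookback_time_gyr(z, steps=100_000):
    """Midpoint-rule integration of dz' / ((1+z') H(z')) from 0 to z."""
    dz = z / steps
    total = 0.0
    for i in range(steps):
        zp = (i + 0.5) * dz
        e_z = math.sqrt(OMEGA_M * (1 + zp) ** 3 + OMEGA_L)  # H(z)/H0
        total += dz / ((1 + zp) * e_z)
    return HUBBLE_TIME_GYR * total

print(f"look-back time to z = 1: {lookback_time_gyr(1.0):.1f} Gyr")
```

With these parameters the look-back time to z = 1 comes out to about 7.7 billion years, more than half the age of the universe.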

    The Large Synoptic Survey Telescope, or LSST, is currently being built on a mountaintop in Chile.


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    When completed in 2021, the LSST will take more than 800 panoramic images each night with its 3.2-billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1,000 times during the survey, and each of its 30-second observations will be able to detect objects 10 million times fainter than those visible to the human eye. A powerful data system will compare new images with previous ones to detect changes in the brightness and position of objects as big as far-distant galaxy clusters and as small as nearby asteroids.

    For these programs, the ExaLearn team will first target large-scale structure simulations of the universe since the field is more developed than others and the scale of the problem size can easily be ramped up to an exascale machine learning challenge.

    As an example of how ExaLearn will advance the field, Nugent said a researcher could run a suite of simulations with the parameters of the universe consisting of 30 percent dark energy and 70 percent dark matter, then a second suite with 25 percent and 75 percent, respectively. Each of these simulations generates three-dimensional maps of tens of billions of galaxies in the universe and of how they cluster and spread apart over time. Using a surrogate model trained on these simulations, the researcher could then quickly generate the output of a simulation between these values, at 27.5 and 72.5 percent, without needing to run a new, costly simulation; that output, too, would show the evolution of the galaxies in the universe as a function of time. The goal of the ExaLearn software suite is that such results, along with their uncertainties and biases, would be a byproduct of the training, so that one would know the generated models are consistent with a full simulation.

    Toward this end, Nugent’s team will build on two projects already underway at Berkeley Lab: CosmoFlow and CosmoGAN. CosmoFlow is a deep learning 3D convolutional neural network that can predict cosmological parameters with unprecedented accuracy using the Cori supercomputer at NERSC. CosmoGAN is exploring the use of generative adversarial networks to create cosmological weak lensing convergence maps — maps of the matter density of the universe as would be observed from Earth — at lower computational costs.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

  • richardmitnick, 10:22 am on March 25, 2019
    Tags: KEK Inter-University Research Institute Corporation, Physics

    From KEK Inter-University Research Institute Corporation: “SuperKEKB Phase 3 (Belle II Physics Run) Starts” 

    From KEK Inter-University Research Institute Corporation


    On March 11th, 2019, Phase 3 operation of the SuperKEKB project began successfully, marking a major milestone in the development of Japan’s leading particle collider. This phase will be the physics run of the project, in which the Belle II experiment will start taking data with a fully instrumented detector.

    The KEKB accelerator, operated from 1999 to 2010, currently holds the world record luminosity for an electron-positron collider. SuperKEKB, its successor, plans to reach a luminosity 40 times greater over its lifetime.

    Belle II and SuperKEKB are poised to become the world’s first Super B factory facility. Belle II aims to accumulate 50 times more data than its predecessor, Belle, and to seek out new physics hidden in subatomic particles that could shed light on mysteries of the early universe.

    Belle II KEK High Energy Accelerator Research Organization Tsukuba, Japan

    The Belle experiment, which completed data taking in 2010, along with its competitor in the United States, BaBar, demonstrated Charge-Parity Violation (CPV) in weak interactions of B mesons.

    SLAC BaBar

    This discovery was explicitly recognized by the Nobel Foundation and resulted in the 2008 Nobel Prize in Physics being awarded to Professors Makoto Kobayashi and Toshihide Maskawa for their work developing the theory of CPV in weak interactions.

    A major upgrade, the Belle II/SuperKEKB facility, began construction at the end of 2010. SuperKEKB will achieve its goal of 40 times KEKB’s luminosity by shrinking the beams at the collision point to “nano-beam” size, 20 times smaller than the beam sizes achieved at KEKB, while simultaneously doubling the beam currents. These changes will result in much larger quantities of data as well as greater beam backgrounds. Belle II was designed to handle these conditions.
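The quoted factor of 40 decomposes as stated, in the simplified picture where luminosity scales inversely with beam size at the interaction point and linearly with beam current (a simplification of the full luminosity formula):

```python
# Rough consistency check of SuperKEKB's design goal: beams squeezed to 1/20
# of KEKB's size at the interaction point, beam currents doubled.
beam_size_reduction = 20
current_increase = 2

luminosity_gain = beam_size_reduction * current_increase
print(luminosity_gain)  # 40, the quoted factor over KEKB
```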

    In February 2016, Phase 1 commissioning of the SuperKEKB accelerator was successfully completed. Low-emittance Ampère-level beams were circulated in both rings, but no collisions were possible. This was followed by the installation of the superconducting final focus magnets and the Belle II outer detector. Phase 2, the pilot run of Belle II, began in March of 2018, with the first collisions recorded in the early hours of April 26th. Initial results from Phase 2 were shown at international conferences in 2018.

    For Phase 3, installation of the full VerteX Detector (VXD) was completed. With this change, Belle II is now fully equipped and ready to take physics data.

    See the full article here.



    KEK-Accelerator Laboratory

    KEK, the High Energy Accelerator Research Organization, is one of the world’s leading accelerator science research laboratories, using high-energy particle beams and synchrotron light sources to probe the fundamental properties of matter. With state-of-the-art infrastructure, KEK is advancing our understanding of the universe that surrounds us, its mechanisms and their control. Our mission is:

    • To make discoveries that address the most compelling questions in a wide range of fields, including particle physics, nuclear physics, materials science, and life science. We at KEK strive to make the most effective use of the funds entrusted by Japanese citizens for the benefit of all, by adding to knowledge and improving the technology that protects the environment and serves the economy, academia, and public health; and

    • To act as an Inter-University Research Institute Corporation, a center of excellence that promotes academic research by fulfilling the needs of researchers in universities across the country and by cooperating extensively with researchers abroad; and

    • To promote national and international collaborative research activities by providing advanced research facilities and opportunities. KEK is committed to be in the forefront of accelerator science in Asia-Oceania, and to cooperate closely with other institutions, especially with Asian laboratories.

    Established in 1997 in a reorganization of the Institute of Nuclear Study, University of Tokyo (established in 1955), the National Laboratory for High Energy Physics (established in 1971), and the Meson Science Laboratory of the University of Tokyo (established in 1988), KEK serves as a center of excellence for domestic and foreign researchers, providing a wide variety of research opportunities. In addition to the activities at the Tsukuba Campus, KEK is now jointly operating a high-intensity proton accelerator facility (J-PARC) in Tokai village, together with the Japan Atomic Energy Agency (JAEA). Over 600 scientists, engineers, students and staff perform research activities on the Tsukuba and Tokai campuses. KEK attracts nearly 100,000 national and international researchers every year (total man-days), and provides excellent research facilities and opportunities to many students and post-doctoral fellows each year.

  • richardmitnick, 9:43 am on March 25, 2019
    Tags: "In a new quantum simulator light behaves like a magnet", Physics

    From École Polytechnique Fédérale de Lausanne: “In a new quantum simulator, light behaves like a magnet” 


    From École Polytechnique Fédérale de Lausanne

    Nik Papageorgiou

    Physicists at EPFL propose a new “quantum simulator”: a laser-based device that can be used to study a wide range of quantum systems. Studying it, the researchers have found that photons can behave like magnetic dipoles at temperatures close to absolute zero, following the laws of quantum mechanics. The simple simulator can be used to better understand the properties of complex materials under such extreme conditions.

    When subject to the laws of quantum mechanics, systems made of many interacting particles can display behaviour so complex that its quantitative description defies the capabilities of the most powerful computers in the world. In 1981, the visionary physicist Richard Feynman argued we can simulate such complex behavior using an artificial apparatus governed by the very same quantum laws – what has come to be known as a “quantum simulator”.

    One example of a complex quantum system is that of magnets at very low temperatures. Close to absolute zero (-273.15°C), magnetic materials may undergo what is known as a “quantum phase transition”. Like a conventional phase transition (e.g. ice melting into water, or water evaporating into steam), the system switches between two states, except that close to the transition point the system manifests quantum entanglement, the most profound feature predicted by quantum mechanics. Studying this phenomenon in real materials is an astoundingly challenging task for experimental physicists.

    But physicists led by Vincenzo Savona at EPFL have now come up with a quantum simulator that promises to solve the problem. “The simulator is a simple photonic device that can easily be built and run with current experimental techniques,” says Riccardo Rota, the postdoc at Savona’s lab who led the study. “But more importantly, it can simulate the complex behavior of real, interacting magnets at very low temperatures.”

    The simulator may be built using superconducting circuits, the same technological platform used in modern quantum computers. The circuits are coupled to laser fields in such a way as to cause an effective interaction among light particles (photons). “When we studied the simulator, we found that the photons behaved in the same way as magnetic dipoles across the quantum phase transition in real materials,” says Rota. In short, photons can now be used to run a virtual experiment on quantum magnets instead of having to set up the experiment itself.

    “We are theorists,” says Savona. “We came up with the idea for this particular quantum simulator and modelled its behavior using traditional computer simulations, which can be done when the quantum simulator addresses a small enough system. Our findings prove that the quantum simulator we propose is viable, and we are now in talks with experimental groups who would like to actually build and use it.”

    Understandably, Rota is excited: “Our simulator can be applied to a broad class of quantum systems, allowing physicists to study several complex quantum phenomena. It is a truly remarkable advance in the development of quantum technologies.”

    Science paper:
    Riccardo Rota, Fabrizio Minganti, Cristiano Ciuti, Vincenzo Savona.
    “Quantum Critical Regime in a Quadratically Driven Nonlinear Photonic Lattice”
    Physical Review Letters

    See the full article here.



    EPFL is Europe’s most cosmopolitan technical university. It receives students, professors and staff from over 120 nationalities. With both a Swiss and an international calling, it is guided by a constant wish to open up; its missions of teaching, research and partnership impact various circles: universities and engineering schools, developing and emerging countries, secondary schools and gymnasiums, industry and the economy, political circles and the general public.

  • richardmitnick, 8:11 am on March 25, 2019
    Tags: "Highlights from Moriond: ATLAS explores the full Run 2 dataset", Physics

    From CERN ATLAS: “Highlights from Moriond: ATLAS explores the full Run 2 dataset” 

    CERN/ATLAS detector

    23rd March 2019
    Pierre Savard

    Figure 1: The highest-mass dijet event measured by ATLAS (mass = 8.12 TeV). (Image: ATLAS Collaboration/CERN)

    This week, particle physicists from around the world gathered in La Thuile, Italy, for the annual Rencontres de Moriond conference on Electroweak Interactions and Unified Theories. It was one of the first major conferences to be held following the recent completion of the Large Hadron Collider’s (LHC) second operation period (Run 2). The ATLAS Collaboration unveiled a wide range of new results, including new analyses using the full Run 2 dataset, as well as some high-profile studies of Higgs, electroweak and heavy-ion physics.

    Figure 2: The invariant mass spectrum of two electrons compared with the Standard Model prediction, and with putative signals from a Z’ boson. (Image: ATLAS Collaboration/CERN)

    First search results using the full Run 2 dataset

    Over the course of Run 2 of the LHC – from 2015 to 2018 – the ATLAS experiment collected 139 inverse femtobarns of proton-proton collision data for analysis. Though this data-taking period concluded just a few months ago, ATLAS physicists have already reported on a variety of new searches using the full Run 2 dataset. So far, all of these new searches are in agreement with the Standard Model expectation.

    The first of these analyses, released in a paper submitted to Physics Letters B, is a search for heavy neutral gauge bosons (denoted Z’) decaying into lepton pairs. The sensitivity of this analysis has increased significantly over the lifetime of the LHC, as seen in Figure 3. The new result sets exclusion limits on specific theoretical models up to a mass of 5.1 TeV.

    A similar search was also conducted by looking for new particles – or “resonances” – decaying to two jets of particles. These “dijet” searches reveal events with the highest energies observed at the LHC; an example of such an event can be seen in Figure 1. No evidence of significant resonant structures was observed in the mass spectrum probed with the Run 2 dataset.

    ATLAS physicists also presented a search for new particles decaying to two weak bosons (W or Z), where the weak bosons then decay to two jets each. As the resonance would be very heavy, the weak bosons produced would be highly energetic and would generate overlapping jets as they decay. Thus, identifying the weak bosons is particularly challenging, and required the development of new reconstruction and analysis techniques. These have substantially improved the sensitivity of the analysis, as illustrated in Figure 4, setting significantly improved constraints on the allowed parameter space for such heavy resonances decaying to W or Z bosons.

    New searches for supersymmetry were also presented. One analysis focused on electroweak production of supersymmetric particles called “charginos” and “sleptons” decaying into two electrons or muons, along with missing transverse momentum. A second analysis looked for long-lived supersymmetric partners of the top quark that decay further away from the collision point. Many other searches are ongoing at ATLAS that will probe vast regions of yet-unexplored supersymmetric parameter space.

    Figure 3: Ratio of the observed limit to the Z’ cross section for the combination of the channels with two electrons and two muons. (Image: ATLAS Collaboration/CERN)

    Figure 4: Comparison between the current and previous expected limits on the cross section times branching ratio for WW+WZ production. An extrapolation of the expected limits from the previous results to the current dataset size, assuming no change to the previous analysis strategy or its uncertainties, is also shown. (Image: ATLAS Collaboration/CERN)

    First measurement of the Higgs boson using the full Run 2 dataset

    ATLAS also released a new measurement of the rare production cross section of the Higgs boson in association with two top quarks (ttH), followed by the similarly rare decay of the Higgs boson to two photons. The ttH production process was first observed in 2018, though it required the combination of many Higgs decay channels. Using the full Run 2 dataset, the observation of ttH in a single Higgs decay channel – into a pair of photons – is now possible. This allowed for a measurement of the production rate with an uncertainty of 25%, with a central value that is compatible with the Standard Model prediction.

    An updated combination of Higgs analyses was also presented, setting new constraints on the Higgs coupling to other particles, as well as interesting indirect constraints on the elusive self-coupling of the Higgs boson with itself. This update includes analyses that use 80 fb-1 (the data taken from 2015 to 2017) and represents the most comprehensive and precise set of Higgs properties measurements presented by the collaboration to date. Other results reporting first evidence for the rare electroweak processes involving the production of three weak bosons were also shown at the conference.

    Observation of light-by-light scattering

    The scattering of light by light involves two incoming photons scattering off each other and producing two outgoing photons. This is a purely quantum mechanical effect that is not predicted by the classical theory of electromagnetism. Observing it requires a very intense source of photons, which can be achieved by using the enormous electric fields generated by fully ionised lead ions. As the ions cross each other, the intense electric fields supply a beam of photons that can collide, effectively turning the Large Hadron Collider into a “Large Photon Collider”.

    Evidence for this process at the LHC was first reported by ATLAS in 2017 in Nature Physics, and was also seen by CMS. Using the much larger dataset collected in 2018, ATLAS was able to clearly observe this process with a significance of over 8 standard deviations and measure the cross section with an uncertainty of 19%. This was one of the first results presented at Moriond.

    New measurement of CP violation

    Figure 5: The measured values of φs and ∆Γs, compared to measurements by other LHC experiments. (Image: ATLAS Collaboration/CERN)

    The observed asymmetry between matter and antimatter in the Universe (a symmetry breaking known as “CP violation”) is one of the unresolved puzzles in particle physics. As the Standard Model is only able to explain part of this asymmetry, there is great motivation to search for additional sources in the form of new or larger CP-violating phases. The LHC produces copious samples of B mesons that are used to measure CP-violating processes. In a new analysis using 80 fb-1 of data, ATLAS investigated the decay of B-sub-s (Bs) mesons, which are composed of a bottom quark and a strange quark. Specifically, Bs → J/ψ φ decays were investigated to measure the CP-violating phase φs, the average decay width (Γs), and the width difference (∆Γs) between the physical Bs meson states.

    In the Standard Model, φs is predicted to be small. However, physics beyond the Standard Model could increase the size of the observed CP violation by enhancing the mixing phase φs with respect to the Standard Model value. The measured values of φs and ∆Γs are shown in Figure 5, and compared to measurements by other LHC experiments and to the Standard Model prediction.

    A week of rich and exciting results

    This week, ATLAS and other LHC experiments presented important new results, deepening our understanding of particle physics. The first results with the full Run 2 dataset represent the first steps in the realisation of what will be a rich and exciting Run 2 physics programme. Though Moriond EW is now drawing to a close, the Moriond QCD conference starts on its heels on Sunday 24 March; expect more exciting new results.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition


  • richardmitnick, 2:12 pm on March 22, 2019
    Tags: Muoscope (a new small-scale portable muon telescope), Physics

    From CERN CMS: “A ‘muoscope’ with CMS technology” 


    From CERN CMS

    22 March, 2019
    Cristina Agrigoroae

    The resistive plate chambers (RPC) at CMS are fast gaseous detectors that provide a muon trigger system (Image: CERN)

    Particle physicists are experts at seeing invisible things, and their detection techniques have already found many applications in medical imaging and the analysis of artworks. Researchers from the CMS experiment at the Large Hadron Collider are developing a new application based on one of the experiment’s particle detectors: a small-scale, portable muon telescope that will allow imaging of visually inaccessible spaces.

    CERN CMS Muoscope- a new, small-scale, portable muon telescope developed by the CMS Collaborators from Ghent University and the University of Louvain in Belgium

    Earth’s atmosphere is constantly bombarded by particles arriving from outer space. Interacting with atmospheric matter, these particles produce cascades of new particles, generating a flux of muons, heavier cousins of the electron. These cosmic-ray muons continue their journey towards the Earth’s surface, travelling through almost all material objects.

    This “superpower” of muons makes them the perfect partners for seeing through thick walls or other visually challenging subjects. Volcanic eruptions, enigmatic ancient pyramids, underground caves and tunnels: these can all be scanned and explored from the inside using muography, an imaging method using naturally occurring background radiation in the form of cosmic-ray muons.

    Large-area muon telescopes have been developed in recent years for many different applications, some of which use technology developed for the LHC detectors. The muon telescope conceived by CMS researchers from two Belgian universities, Ghent University and the Catholic University of Louvain, is compact and light and therefore easy to transport. It is nonetheless able to perform muography at high resolution. It will be the first spin-off for muography using the CMS Resistive Plate Chambers (RPC) technology. A first prototype of the telescope, also baptised a “muoscope”, has been built with four RPC planes with an active area of 16×16 cm. The same prototype was used in the “UCL to Mars” project; it was tested for its robustness in a simulation of Mars-like conditions in the Utah Desert, where it operated for one month and later came back fully functional.
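For a sense of scale, the counting rate of such a plane can be estimated from the textbook sea-level muon flux of roughly one muon per square centimetre per minute through a horizontal surface. This is an order-of-magnitude figure from the literature, not a measured property of this prototype:

```python
# Order-of-magnitude muon rate through one 16 cm x 16 cm RPC plane,
# using the canonical sea-level flux of ~1 muon / cm^2 / minute.
flux_per_cm2_per_min = 1.0   # textbook approximation for a horizontal surface
area_cm2 = 16 * 16           # active area of one plane

rate_per_min = flux_per_cm2_per_min * area_cm2
print(f"about {rate_per_min:.0f} muons per minute ({rate_per_min / 60:.1f} per second)")
```

A few hundred muons per minute per plane explains why muography exposures of dense objects still take days to months: only a small fraction of those muons traverse the target along useful directions.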

    Other CMS technologies have been used in muon tomography for security and environmental protection, as well as for homeland security.

    Learn more about the muon telescope here.

    See the full article here.


  • richardmitnick, 11:45 am on March 21, 2019
    Tags: "LHCb discovers matter-antimatter asymmetry in charm quarks", Physics

    From Symmetry: “LHCb discovers matter-antimatter asymmetry in charm quarks” 

    From Symmetry

    Sarah Charley

    A new observation by the LHCb experiment finds that charm quarks behave differently than their antiparticle counterparts.

    Artwork by Sandbox Studio, Chicago with Ana Kova

    CERN/LHCb detector

    Scientists on the LHCb experiment at the Large Hadron Collider at CERN have discovered a new way in which matter and antimatter behave differently.

    With 99.9999 percent statistical certainty, LHCb scientists have observed a difference between the decays of matter and antimatter particles containing charm quarks. This discovery opens up a new realm to study the differences between matter and antimatter and could help explain why we live in a matter-dominated universe.
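A statistical certainty like the 99.9999 percent quoted here maps onto the “number of sigma” language physicists use via the Gaussian tail probability. A small sketch using the one-sided convention common in particle physics:

```python
# Convert a significance in standard deviations into a tail probability and
# the corresponding "statistical certainty", using the one-sided Gaussian
# tail p = 0.5 * erfc(n / sqrt(2)).
import math

def one_sided_p_value(n_sigma):
    """Probability of a Gaussian fluctuation at least n_sigma above the mean."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

for n in (3, 5):
    p = one_sided_p_value(n)
    print(f"{n} sigma: p = {p:.2e}, certainty = {(1 - p) * 100:.5f}%")
```

Under this convention the conventional 5-sigma discovery threshold corresponds to a certainty of about 99.99997 percent, slightly stronger than the 99.9999 percent quoted above.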

    “This is a major breakthrough in experimental physics,” says Sheldon Stone, a professor at Syracuse University and collaborator on the LHCb experiment. “There’s been many attempts to make this measurement, but until now, no one had ever seen it. It’s a huge milestone in antimatter research.”

    Every structure in the universe—from the tiniest speck of dust to the mightiest star—is built from matter. But there is an equally qualified material for the job: antimatter. Antimatter is nearly identical to matter, except that its charge and magnetic properties are reversed. Precision studies of antihydrogen atoms, for example, have shown that their characteristics match those of hydrogen atoms to within about one part in a billion.

    Matter and antimatter cannot coexist in the same physical space because if they come into contact, they annihilate each other. This equal-but-opposite nature of matter and antimatter poses a conundrum for cosmologists, who theorize that the same amount of matter and antimatter should have exploded into existence during the birth of our universe. But if that’s true, all of that matter and antimatter should have annihilated one another, leaving nothing but energy behind.

    Particle physicists are looking for any tiny differences between matter and antimatter which could help explain why matter won out over antimatter in the early universe.

    Lucky for them, antimatter is not a totally extinct species. “We don’t usually see antimatter in our world,” says Ivan Polyakov, a postdoc at Syracuse University and internal LHCb reviewer for this new analysis. “But it can be produced when ordinary matter particles are smashed together at high energies, such as they do inside the Large Hadron Collider.”

    The main way scientists study the tiny and rare particles produced during the LHC’s collisions is by mapping how they decay and transform into more-stable byproducts.

    “This gives us a sort of family lineage for our ­particle of interest,” says Cesar da Silva, a scientist from Los Alamos National Lab and also a LHCb collaborator. “Once stable particles are measured by the detector, we can trace their ancestors to find the primordial generation of particles in the collision.

    “Because of quantum mechanics, we cannot predict what each single unstable particle will decay into, but we can figure out the probabilities for each possible outcome.”

    The new LHCb study looked at the decays of particles consisting of two bound quarks—the internal structural components of particles like protons and neutrons. One version of this particle (called D0 by scientists) contained a charm quark and the antimatter version of the up quark, called an anti-up quark. The other version contained the reverse, an up quark and an anti-charm quark.

    Scientists on the LHCb experiment identified tens of millions of both D0 and anti-D0 particles and counted how many times each transformed into one set of byproducts (a pair of particles called pions) versus another possible set (a pair of particles called kaons).

    With everything else being equal, the ratio of these two possible outcomes should have been identical for both D0 and anti-D0 particles. But scientists found that the two ratios differed by about a tenth of a percent—evidence that these charmed matter and antimatter particles are not totally interchangeable.
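The counting at the heart of the measurement can be sketched in a few lines of Python. This is a toy illustration only: the event counts below are invented, and the real LHCb analysis corrects the raw asymmetries for production and detection effects before extracting the CP-violating difference.

```python
# Toy CP-asymmetry calculation. All event counts here are INVENTED for
# illustration; they are not LHCb data.
def raw_asymmetry(n_particle: int, n_antiparticle: int) -> float:
    """A_raw = (N - Nbar) / (N + Nbar) for one decay channel."""
    return (n_particle - n_antiparticle) / (n_particle + n_antiparticle)

# LHCb measures the *difference* of asymmetries between the kaon and pion
# channels, which cancels most production and detection biases.
a_kk = raw_asymmetry(44_000_000, 44_088_000)    # D0 / anti-D0 -> K+ K-
a_pipi = raw_asymmetry(14_007_000, 14_000_000)  # D0 / anti-D0 -> pi+ pi-

delta_a_cp = a_kk - a_pipi
print(f"Delta A_CP ~ {delta_a_cp:.5f}")  # about a tenth of a percent
```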

    “They might look nearly identical from the outside, but they behave differently,” Polyakov says. “This is the puzzle of antimatter.”

    The idea that matter and antimatter particles behave slightly differently is not new and has been observed previously in studies of particles containing strange quarks and bottom quarks. What makes this study unique is that it is the first time this asymmetry has been observed in particles containing charm quarks.

    Previous experiments—including BaBar, Belle and CDF—endeavored to make this same measurement but fell short of collecting enough data to tease out such a subtle effect.

    SLAC BaBar

    Belle II detector, KEK High Energy Accelerator Research Organization, Tsukuba, Japan

    FNAL/Tevatron CDF detector

    The huge amount of data generated since the start of LHC Run 2 combined with the introduction of more advanced methods to tag the particles of interest enabled scientists to collect enough matter and antimatter D0 particles to finally see these decay differences beyond a shadow of a doubt.

    The next step is to see how this measurement fits with the theoretical models, which are still a little fuzzy on this prediction.

    “Theorists will need to figure out if the Standard Model can explain this,” Stone says.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    “We’re pushing our field and this result will certainly be in the history books.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 10:40 am on March 21, 2019 Permalink | Reply
    Tags: "All together now: adding more pieces to the Higgs boson puzzle", , , , , , Physics   

    From CERN ATLAS: “All together now: adding more pieces to the Higgs boson puzzle” 

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    CERN ATLAS another view Image Claudia Marcelloni ATLAS CERN



    18th March 2019

    Figure 1: Cross sections times branching fractions for the main Higgs production modes at the LHC (ggF, VBF, VH and ttH+tH) in each relevant decay mode (γγ, WW, ZZ, ττ, bb). All values are normalized to Standard Model predictions. In addition, the combined results for each production cross-section are also shown, assuming the Standard Model values for the branching ratios into each decay mode. (Image: ATLAS Collaboration/CERN)

    The Higgs boson was discovered in 2012 by the ATLAS and CMS experiments, but its rich interaction properties (its coupling to other particles) remain a puzzle whose pieces the experiments on the Large Hadron Collider (LHC) are bringing together.

    Fortunately, the LHC provides many windows into measuring Higgs boson couplings. There are four main ways to produce the Higgs boson: through the fusion of two gluons (gluon fusion, or ggF); through the fusion of weak vector bosons (VBF); in association with a W or Z boson (VH); or in association with one or more top quarks (ttH+tH). There are also five main channels in which Higgs bosons can decay: into pairs of photons, W or Z bosons, tau leptons or b quarks. Each of these processes brings unique insights into the Higgs boson’s properties – and a separate piece in the puzzle of its true nature.

    Thanks to the unprecedented number of Higgs bosons produced at the LHC, all of the above production and decay modes have now been observed. In a new result presented by the ATLAS Collaboration, utilising data collected up to 2017, the measurements of each of these processes have reached the five-standard-deviation significance threshold, past which their existence is considered established.
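The "five standard deviations" convention can be made concrete: it is the probability that a background-only Gaussian fluctuation produces an excess at least that large. A minimal sketch using only the Python standard library:

```python
import math

def pvalue_from_sigma(z: float) -> float:
    """One-sided tail probability of a z-sigma Gaussian fluctuation."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# 3 sigma ("evidence") vs 5 sigma ("observation"):
for z in (3.0, 5.0):
    print(f"{z:.0f} sigma -> p = {pvalue_from_sigma(z):.1e}")
# 5 sigma corresponds to p ~ 2.9e-7, roughly a one-in-3.5-million chance
# of the background alone faking the signal.
```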

    The Higgs boson yields for most of the combinations of production and decay have been measured (see Figure 1) and found to agree with Standard Model predictions. The measurements of the cross sections for each production mode in proton–proton collisions at 13 TeV, assuming the decays occur as predicted by the Standard Model, are the most precise obtained to date.

    Figure 2: Combined measurements of the cross sections for the different production phase-space regions (STXS) considered in the analysis, normalized by the Standard Model expectations. These regions are defined by different ranges of Higgs boson transverse momentum, the number of associated jets, and the vector-boson transverse momentum in VH production with the vector boson decaying leptonically. In this combination, the Higgs decay branching fractions are fixed to their Standard Model values. (Image: ATLAS Collaboration/CERN)

    As physicists have placed these new pieces, they’ve also begun to explore the Higgs boson puzzle in a new way. In the latest analyses, instead of counting Higgs bosons inclusively in the major production and decay modes, ATLAS physicists have measured Higgs boson topologies separately for smaller regions of phase-space: different ranges of Higgs boson transverse momentum, numbers of associated jets, and numbers and kinematic properties of associated weak bosons and top quarks. Using these smaller puzzle pieces, called “simplified template cross sections” (STXS), allows physicists to better separate the measurement process from the interpretation in terms of theoretical properties. Ultimately, it provides a finer-grained picture of Higgs boson couplings at the LHC and more stringent tests of the Standard Model.
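A crude way to picture STXS binning is as a classification of Higgs events by production mode and kinematics. The region names and boundaries below are hypothetical simplifications invented for illustration, not the official STXS scheme:

```python
# Hypothetical, heavily simplified STXS-style classifier. The real STXS
# definitions are far more granular; these region names and boundaries
# are inventions for illustration only.
def stxs_region(production: str, higgs_pt_gev: float, n_jets: int) -> str:
    if production == "ggF":
        if higgs_pt_gev > 200.0:
            return "ggF_ptH_gt200"
        return f"ggF_{min(n_jets, 2)}j_ptH_le200"
    if production == "VBF":
        return "VBF_2j" if n_jets >= 2 else "VBF_rest"
    return production  # VH, ttH, ... left inclusive in this sketch

for prod, pt, nj in [("ggF", 250.0, 1), ("ggF", 40.0, 0), ("VBF", 90.0, 2)]:
    print(f"{prod:4s} pT={pt:5.1f} GeV, {nj} jets -> {stxs_region(prod, pt, nj)}")
```

Sorting events into such bins first, and only then comparing each bin with theory, is what decouples the measurement from its interpretation.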

    Among the STXS regions considered in the analysis, some have already been measured with good precision at the LHC (see Figure 2), and no deviation from the Standard Model has been observed so far. These measurements allow physicists to further enhance the sensitivity to the couplings of the Higgs boson to the other elementary particles. They have also set constraints on new-physics theories – such as the “two-Higgs-doublet model”, which introduces additional Higgs bosons, and the hMSSM supersymmetric model – that are more stringent than those previously reported by ATLAS.

    These measurements will continue to improve as more data from Run 2 and beyond are included, providing a yet-finer picture of the properties of the Higgs boson.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN Courier

    Quantum Diaries

    CERN map

    CERN LHC Grand Tunnel
    CERN LHC particles


    Quantum Diaries

  • richardmitnick 10:24 am on March 21, 2019 Permalink | Reply
    Tags: "ATLAS measures Higgs boson coupling to top quark in diphoton channel with full Run 2 dataset", , , , , , Physics   

    From CERN ATLAS: “ATLAS measures Higgs boson coupling to top quark in diphoton channel with full Run 2 dataset” 

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    CERN ATLAS another view Image Claudia Marcelloni ATLAS CERN



    18th March 2019

    Figure 1: Visualisation of an event from the tt̄H(γγ) analysis. The event contains two photon candidates (green towers), while the jets (b-jets) are shown as yellow (blue) cones. (Image: ATLAS Collaboration/CERN)

    In 2018, the ATLAS and CMS Collaborations announced the observation of the production of the Higgs boson in association with a top-quark pair, known as “ttH” production. This result was the first observation of the Higgs boson coupling to quarks. It was followed shortly by the observation of Higgs boson decays to bottom quarks.

    As only about 1% of the Higgs bosons are produced in association with a top-quark pair at the Large Hadron Collider (LHC), achieving this observation was especially challenging. It was accomplished by searching across many different Higgs boson decay channels, including decays to two W or Z bosons (WW* or ZZ*), a pair of tau leptons, a pair of b-quarks, and a pair of photons (“diphoton”). Their combination established ttH production with a significance of 6.3 standard deviations. The diphoton channel alone, using 80 fb-1 of data recorded by ATLAS between 2015 and 2017, provided an observed significance of 4.1 standard deviations (for 3.7 standard deviations expected when assuming ttH production to occur as predicted by the Standard Model).
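As a rough intuition for how several channels add up to a combined 6.3 standard deviations: independent Gaussian measurements of the same signal combine approximately in quadrature. The per-channel significances below are invented for illustration; the actual ATLAS combination is a simultaneous likelihood fit, not this shortcut.

```python
import math

# Naive illustration: independent Gaussian channels measuring the same
# signal combine roughly in quadrature. The per-channel significances
# are MADE UP for this sketch.
channel_sigmas = [4.1, 3.2, 2.5, 1.9, 1.4]  # e.g. diphoton, multilepton, ...
combined = math.sqrt(sum(z * z for z in channel_sigmas))
print(f"combined significance ~ {combined:.1f} sigma")
```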

    At the Rencontres de Moriond (La Thuile, Italy), the ATLAS Collaboration presented an updated measurement of ttH production in the diphoton channel. The result examines the full Run 2 dataset – 139 fb-1 collected between 2015 and 2018 – to observe ttH production in a single channel with a significance of 4.9 standard deviations (for 4.2 expected).

    Figure 2: The ttH signal in the diphoton invariant mass spectrum. Events from the different analysis categories are weighted according to the category sensitivity to the ttH signal. The ttH signal manifests itself as a localised resonant bump in the red curve, representing the fit to the data of the signal and background shapes. The other Higgs production modes provide a small contribution to the resonant peak, as shown by the green dashed line. (Image: ATLAS Collaboration/CERN)

    The analysis techniques utilised in the new result closely followed those employed in the previously published analysis – with a few exceptions. To cope with the intense 2018 data-taking conditions, ATLAS physicists revised their data calibration and selection mechanisms. In particular, the result utilises a revised procedure for differentiating photons arising from a Higgs boson decay from those induced, for example, by hadron jets, as well as an adapted photon energy calibration. Additionally, ATLAS implemented a new calibration for hadron jets, especially those originating from bottom quarks, whose presence in the event is used to identify the decay of top quarks.

    The ttH cross section times the Higgs-to-diphoton branching fraction (the probability that a Higgs boson will decay into a photon pair) was measured to be 1.58 ± 0.39 fb. Its ratio to the Standard Model prediction is 1.38 ± 0.41, in agreement with unity.
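The quoted ratio can be reproduced with naive Gaussian error propagation. The Standard Model prediction value and uncertainty used below are inferred from the quoted numbers, not taken from the ATLAS paper:

```python
import math

# Naive Gaussian error propagation for the signal strength
# mu = measured / predicted. The SM prediction (~1.15 fb with ~16%
# relative uncertainty) is an ASSUMPTION inferred from the quoted ratio.
meas, meas_err = 1.58, 0.39   # fb, measured ttH cross section x BR(H -> diphoton)
sm, sm_err = 1.15, 0.19       # fb, assumed Standard Model prediction

mu = meas / sm
mu_err = mu * math.sqrt((meas_err / meas) ** 2 + (sm_err / sm) ** 2)
print(f"mu = {mu:.2f} +/- {mu_err:.2f}")  # consistent with the quoted 1.38 +/- 0.41
```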

    ATLAS is now working on extending the analysis of the diphoton channel – which is sensitive both to ttH and the other Higgs production modes – to the full Run 2 dataset. This complete diphoton measurement will allow for an even more sensitive test of the Higgs mechanism, and will further refine the ttH measurement.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN Courier

    Quantum Diaries

    CERN map

    CERN LHC Grand Tunnel
    CERN LHC particles

  • richardmitnick 8:39 am on March 21, 2019 Permalink | Reply
    Tags: "Superconducting nanowires could be used to detect dark matter", A promising new sensor based on tiny superconducting wires, , , , Physics, The team’s prototype already shows the potential of this approach, Yonit Hochberg at Hebrew University of Jerusalem in Israel and a few colleagues   

    From M.I.T. Technology Review: “Superconducting nanowires could be used to detect dark matter” 

    MIT Technology Review
    From M.I.T. Technology Review


    March 20, 2019
    No writer credit or image credits

    One of the great scientific searches of our time is the hunt for dark matter. Physicists believe this stuff fills the universe and think they can see evidence of it in the way galaxies rotate. Indeed, galaxies spin so quickly that they ought to fly apart unless some hidden mass is generating enough gravitational force to hold them together.

    That evidence has set physicists scrabbling to find dark matter on Earth. They’ve constructed dozens of observatories, most of them in underground caverns deep beneath the surface, where background noise is low. At stake is scientific fame and fortune, with the group that finds dark matter likely to be richly rewarded.

    But so far physicists have found precisely nothing. If it is out there, dark matter is very well hidden. Or physicists have been looking in the wrong place. One possibility is that dark matter particles are too small for current experiments to see. So physicists desperately want better, more sensitive ways to detect these things.


    Enter Yonit Hochberg at Hebrew University of Jerusalem in Israel and a few colleagues, who have developed a promising new sensor based on tiny superconducting wires. The team’s prototype already shows the potential of this approach.

    The principle behind the new device is straightforward. Cool certain metals below a critical temperature and they conduct with no resistance. But as soon as their temperature rises above this threshold, the superconducting behavior disappears.

    Physicists know that dark matter particles cannot interact strongly with visible matter; otherwise, physicists would already have seen them. But dark matter particles can collide head-on with ordinary particles.

    These collisions are rare because ordinary matter is mostly empty space, so dark matter particles can pass straight through. But when they do collide with an atomic nucleus or electron in a lattice, for example, the collision causes the lattice to vibrate, thereby raising its temperature.

    It is this rise in temperature that superconducting nanowires are good at revealing. The heating causes a small portion of the wire to stop superconducting, and this in turn creates a voltage pulse that is easy to measure. What’s more, such a device produces few, if any, false positives.

    Hochberg and Co have put their idea through its paces by building a prototype. This device consists of a set of tungsten silicide nanowires just 140 nanometers wide (a human hair is about 100,000 nanometers wide) and 400 micrometers long. The entire apparatus sits just a few millidegrees above absolute zero, so that the tungsten silicide wires become superconductors.
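A back-of-envelope check shows why the detector mass is so tiny. The film thickness and tungsten silicide density used below are typical assumed values, not figures from the article:

```python
# Back-of-envelope mass of a single nanowire. Width and length are from
# the article; the film thickness (~5 nm) and tungsten silicide density
# (~9 g/cm^3) are ASSUMED typical values.
width_m = 140e-9        # 140 nanometers
length_m = 400e-6       # 400 micrometers
thickness_m = 5e-9      # assumed
density_kg_m3 = 9.0e3   # assumed

mass_kg = width_m * length_m * thickness_m * density_kg_m3
print(f"single-wire mass ~ {mass_kg * 1e15:.1f} pg")
# One straight wire weighs picograms; a full device meanders many such
# wires, reaching the nanogram scale mentioned in the article.
```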

    The team then looked for the voltage pulses that might reveal a dark matter collision. With appropriate shielding in place, they found no pulses during the 10,000-second duration of their measurements.

    That places important constraints on the type of dark matter that could be present and its density. It also places constraints on other types of particles that physicists speculate might exist.
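The "no pulses in 10,000 seconds" result translates into a rate limit via standard Poisson counting: with zero observed events and negligible background, the 90% confidence-level upper limit on the expected count is about 2.3 events. A minimal sketch:

```python
import math

# Classical Poisson counting: observing zero events when the expectation
# is mu happens with probability exp(-mu), so the 90% CL upper limit
# solves exp(-mu) = 0.10, i.e. mu = -ln(0.10) ~ 2.30 events.
cl = 0.90
n_upper = -math.log(1.0 - cl)     # ~2.30 events
exposure_s = 10_000.0             # measurement duration from the article
rate_limit = n_upper / exposure_s
print(f"< {n_upper:.2f} events in {exposure_s:.0f} s, i.e. < {rate_limit:.1e} events/s")
```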

    One of these is the “dark photon”—essentially the dark matter equivalent of the ordinary photon. If they exist, then the new sensor did not detect a single one. “The results from this device already place meaningful bounds on dark matter-electron interactions, including the strongest terrestrial bounds on sub-eV dark photon absorption to date,” say Hochberg and Co.

    That’s impressive work, given that the mass of the nanowires is just a few nanograms. The next stage is to fabricate them on a larger scale. Hochberg and Co say that the technology is relatively mature, so this should be possible on a short time scale. Indeed, they estimate that an academic lab could churn out a thousand 200-nanometer detectors with a total mass of 1.3 grams in just a year. “An industrial effort could realize many times that number,” they point out.

    So a kilogram-scale detector could be feasible in the not too distant future. Such a machine would rival those already in operation in the search for dark matter, but it would look at different energies in a different way.

    So it may be that one day, superconducting nanowires will discover dark matter—if it exists at all.

    Science paper:
    Detecting Dark Matter with Superconducting Nanowires

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

  • richardmitnick 3:55 pm on March 20, 2019 Permalink | Reply
    Tags: , , , , , , Physics   

    From ALICE at CERN: “The subterranean ballet of ALICE” 

    From ALICE at CERN

    19 March, 2019
    Corinne Pralavorio

    During the long shutdown of CERN’s accelerators, the ALICE experiment at the LHC is removing and refurbishing or replacing the majority of its detectors.

    CERN ALICE Time Projection Chamber (Image: Maximilien Brice/CERN)

    The experiment caverns of the Large Hadron Collider (LHC) are staging a dazzling performance during Long Shutdown 2 (LS2). The resplendent sub-detectors, released from their underground homes, are performing a fascinating ballet. At the end of February, ALICE removed the two trackers, the inner tracker system and the time projection chamber, from the detector. At the very start of the long shutdown, on 3 December 2018, the teams began disconnecting the dozens of sub-detectors. And finally, on 25 February, the two trackers were ready to be removed.

    The trackers are located around the collision points and are used to reconstruct the tracks of the particles produced in the collisions. The data they generate are essential for identifying the particles and understanding what happened during the collision. ALICE’s inner tracker is a 1.5-metre-long tube, 1 metre in diameter. It will be replaced with a new, much more precise detector closer to the collision point, formed of seven pixel layers and containing a total of 12.5 billion pixels. The current detector is still in the cavern and could spend its retirement as a museum piece in an exhibition above ground.

    CERN ALICE internal tracker system (Image: Maximilien Brice/ Julien Ordan CERN)

    The time projection chamber is an imposing cylinder, measuring 5.1 metres in length and 5.6 metres in diameter, weighing an enormous 15 tonnes. The huge sub-detector was nonetheless hoisted out in just four hours, to be transferred to a building where it will undergo a complete metamorphosis. The current detector is based on multiwire proportional chamber technology. To increase the detector’s acquisition speed by a factor of 100, the readout system will be equipped with much faster components called gas electron multipliers (GEMs), and the electronics will be completely replaced. The teams have started the renovation work, which should take around 11 months.

    At present, the removal process is continuing in the cavern. Most of the calorimeters have been removed for refurbishment. Around 50 people are hard at work at the experiment.

    After the removal of the two trackers, ALICE’s heart is now empty. (Image: Julien Ordan/CERN)

    To find out more about the major work in progress at ALICE, see these articles on the website and in the CERN Courier.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN/ALICE Detector

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier
