Tagged: Particle Physics

  • richardmitnick 1:57 pm on July 22, 2019
    Tags: CERN NA64, Particle Physics

    From CERN: “NA64 casts light on dark photons” 



    From CERN

    22 July, 2019
    Ana Lopes

    The NA64 collaboration has placed new limits on the interaction between a photon and its hypothetical dark-matter counterpart.

    The NA64 experiment (Image: CERN)

    Without dark matter, most galaxies in the universe would not hold together. Scientists are pretty sure about this. However, they have not been able to observe dark matter and the particles that comprise it directly. They have only been able to infer its presence through the gravitational pull it exerts on visible matter.

    Fritz Zwicky discovered dark matter while observing the motion of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did much of the pioneering work on dark matter.

    Fritz Zwicky, from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    One hypothesis is that dark matter consists of particles that interact with each other and with visible matter through a new force carried by a particle called the dark photon. In a recent study, the collaboration behind the NA64 experiment at CERN describes how it has tried to hunt down such dark photons.

    NA64 is a fixed-target experiment. A beam of particles is fired onto a fixed target to look for particles and phenomena produced by collisions between the beam particles and atomic nuclei in the target. Specifically, the experiment uses an electron beam of 100 GeV energy from the Super Proton Synchrotron accelerator.

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator. (Image: Julien Ordan/CERN)

    In the new study, the NA64 team looked for dark photons using the missing-energy technique: although dark photons would escape through the NA64 detector unnoticed, they would carry away energy that can be identified by analysing the energy budget of the collisions.
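    The selection logic of the missing-energy technique can be sketched in a few lines. This is an illustrative toy, not NA64's actual analysis code: the 50 GeV threshold and the event energies below are invented for the example.

```python
# Toy sketch of a missing-energy selection: an event is a signal candidate
# when the energy measured in the detector falls well short of the 100 GeV
# beam energy, the deficit being what an invisible dark photon could have
# carried away. Threshold and event energies are hypothetical.

BEAM_ENERGY_GEV = 100.0

def missing_energy(detected_energy_gev):
    """Energy unaccounted for in the event's energy budget."""
    return BEAM_ENERGY_GEV - detected_energy_gev

def is_signal_candidate(detected_energy_gev, threshold_gev=50.0):
    """Flag events whose missing energy exceeds the (illustrative) threshold."""
    return missing_energy(detected_energy_gev) > threshold_gev

# Toy events: total energy seen in the detector, in GeV.
events = [99.2, 101.1, 38.5, 97.0, 12.3]
candidates = [e for e in events if is_signal_candidate(e)]
print(candidates)  # [38.5, 12.3]
```

    In the real experiment the energy budget is built from calorimeter and tracker measurements, and many mundane backgrounds must be rejected before a deficit counts as a candidate.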

    The team analysed data collected in 2016, 2017 and 2018, which together corresponded to a whopping hundred billion electrons hitting the target. They found no evidence of dark photons in the data but their analysis resulted in the most stringent bounds yet on the strength of the interaction between a photon and a dark photon for dark-photon masses between 1 MeV and 0.2 GeV.

    These bounds imply that a 1-MeV dark photon would interact with an electron with a force that is at least one hundred thousand times weaker than the electromagnetic force carried by a photon, whereas a 0.2-GeV dark photon would interact with an electron with a force that is at least one thousand times weaker. The collaboration anticipates obtaining even stronger limits with the upgraded detector, which is expected to be completed in 2021.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Tunnel

    CERN LHC particles

     
  • richardmitnick 12:49 pm on July 21, 2019
    Tags: "A golden era of exploration: ATLAS highlights from EPS-HEP 2019", Particle Physics

    From CERN ATLAS: “A golden era of exploration: ATLAS highlights from EPS-HEP 2019” 



    From CERN ATLAS

    20th July 2019
    Katarina Anthony

    Event display of a Higgs boson candidate decaying in the four-lepton channel. (Image: ATLAS Collaboration/CERN)

    Eight years of operation. Over 10,000 trillion high-energy proton collisions. One critical new particle discovery. Countless new insights into our universe. The Large Hadron Collider (LHC) has been breaking records since data-taking began in 2010 – and yet, for ATLAS and its fellow LHC experiments, a golden era of exploration is only just beginning.

    Figure 1: New ATLAS measurement of the Higgs boson decaying in the four-lepton channel, using the full LHC Run-2 dataset. The distribution of the invariant mass of the four leptons (m4l) is shown. The Higgs boson corresponds to the excess of events (blue) over the non-resonant ZZ* background (red) at 125 GeV. (Image: ATLAS Collaboration/CERN)

    This week, the ATLAS Collaboration presented 25 new results at the European Physical Society’s High-Energy Physics conference (EPS-HEP) in Ghent, Belgium. The new analyses examine the largest-ever proton–proton collision dataset from the LHC, recorded during Run 2 of the accelerator (2015–2018) at the 13 TeV energy frontier.

    The new data have been fertile ground for ATLAS. New precision measurements of the Higgs boson, observations of key electroweak processes and high-precision tests of the Standard Model are among the highlights described below; find the full list of ATLAS public results using the full Run-2 dataset here.

    Studying the Higgs discovery channels

    Just over seven years ago, the Higgs boson was an elusive particle, out of reach from physicists for nearly five decades. Today, not only is the Higgs boson frequently observed, it is studied with such precision as to become a powerful tool for exploration.

    Key to these accomplishments are the so-called “Higgs discovery channels”: H→γγ, where the Higgs boson decays into two photons, and H→ZZ*→4l, where it decays via two Z bosons into four leptons. Though rare, these decays are easily identified in the ATLAS detector, making them essential to both the particle’s discovery and study.
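    The quantity underlying both channels, the invariant mass of the decay products, can be reconstructed with a short toy calculation (illustrative four-vectors, not ATLAS data; units are GeV with c = 1):

```python
import math

# Sketch of invariant-mass reconstruction, as used for m4l or the diphoton
# mass: sum the decay products' four-momenta and take m = sqrt(E^2 - |p|^2).

def invariant_mass(particles):
    """particles: list of (E, px, py, pz) four-momenta in GeV."""
    E  = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(E**2 - px**2 - py**2 - pz**2)

# Toy Higgs at rest decaying to four massless leptons: the momenta cancel
# and the energies sum to 125 GeV, so m4l comes out at the Higgs mass.
leptons = [
    (31.25,  31.25,   0.0,   0.0),
    (31.25, -31.25,   0.0,   0.0),
    (31.25,   0.0,  31.25,   0.0),
    (31.25,   0.0, -31.25,   0.0),
]
print(round(invariant_mass(leptons), 2))  # 125.0
```

    In data, of course, the Higgs boson appears not as a single value but as an excess of events peaking near 125 GeV over a smooth background, as in Figures 1 and 2.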

    ATLAS presented new explorations of the Higgs boson in these channels (Figures 1 and 2), yielding greater insight into its behaviour. The new results benefit from the large full Run-2 dataset, as well as a number of new improvements to the analysis techniques. For example, ATLAS physicists now utilise Deep-Learning Neural Networks to assign the Higgs-boson events to specific production modes.

    All four Higgs-boson production modes can now be clearly identified in a single decay channel. ATLAS’ studies of the Higgs boson have advanced so quickly, in fact, that rare processes – such as its production in association with a top-quark pair, observed only just last year – can now be seen in just a single decay channel. The new sensitivity allowed physicists to measure kinematic properties of the Higgs boson with unprecedented precision (Figure 3). These are sensitive to new physics processes, making their exploration of particular interest to the collaboration.

    Figure 2: Distribution of the invariant mass of the two photons in the ATLAS measurement of H→γγ using the full Run-2 dataset. The Higgs boson corresponds to the excess of events observed at 125 GeV with respect to the non-resonant background (dashed line). (Image: ATLAS Collaboration/CERN)

    Figure 3: Differential cross section for the transverse momentum (pT,H) of the Higgs boson from the two individual channels (H→ZZ*→4ℓ, H→γγ) and their combination. (Image: ATLAS Collaboration/CERN)

    Searching for unseen properties of the Higgs boson

    Having accomplished the observation of Higgs boson interactions with third-generation quarks and leptons, ATLAS physicists are turning their focus to the lighter, second-generation of fermions: muons, charm quarks and strange quarks. While their interactions with the Higgs boson are described by the Standard Model, they have – so far – remained relegated to theory. Results from the ATLAS Collaboration are backing up these theories with real data.

    At EPS-HEP, ATLAS presented a new search for the Higgs boson decaying into muon pairs. This already-rare process is made all the more difficult to detect by background Standard Model processes, which produce muon pairs in abundance.

    Figure 4: ATLAS search for the Higgs boson decaying to two muons. The plot shows the weighted muon-pair invariant-mass spectrum (mμμ) summed over all categories. (Image: ATLAS Collaboration/CERN)

    The new result utilised novel machine learning techniques to provide ATLAS’ most sensitive result yet, with a moderate excess of 1.5 standard deviations expected for the predicted signal. In agreement with this prediction, only a small excess of 0.8 standard deviations is present around the Higgs-boson mass in the data (Figure 4).

    “This result shows that we are now close to the sensitivity required to test the Standard Model’s predictions for this very rare decay of the Higgs boson,” said ATLAS spokesperson Karl Jakobs from the University of Freiburg, Germany. “However, a definitive statement on the second generation will require the larger datasets that will be provided by the LHC in Run 3 and by the High-Luminosity LHC.”

    ATLAS’ growing sensitivity was also clearly on display in the collaboration’s new “di-Higgs” search, where two Higgs bosons are formed via the fusion of two vector bosons. Though one of the rarest Standard Model processes explored by ATLAS, its study gives unique insight into the previously-untested relationship between vector-boson and Higgs-boson pairs. A small variation of this coupling relative to the Standard Model value would result in a dramatic rise in the measured cross section. The new search, though it found no signal, sets the first constraints on this relationship.

    Entering the Higgs sector

    The Higgs mechanism, giving mass to all elementary particles, is directly connected with profound questions about our universe, including the stability and energy of the vacuum, the “naturalness” of a world described by the Standard Model, and more. As such, the exploration of the Higgs sector is not limited to direct measurements of the Higgs boson – it instead requires a broad experimental programme that will extend over decades.

    A perfect example of this came in ATLAS’ new observation of the electroweak production of two jets in association with a pair of Z bosons. The Z and W bosons are the force carriers of weak interactions and, as they both have a spin of 1, are known as “vector bosons”. The Higgs boson is a vital mediator in “vector-boson scattering”, an electroweak process that contributes to the pair production of vector bosons (WW, WZ and ZZ) with jets. Measurements of these production processes are key for the study of electroweak symmetry breaking via the Higgs mechanism.

    The new ATLAS result – with a statistical significance of 5.5 standard deviations (Figure 5) – completes the experiment’s observation of vector-boson scattering in these critical processes, and sparks new ways to test the Standard Model.

    Figure 5: Observed and predicted distributions (BDT) in the signal regions of Z-boson pairs decaying to four leptons. The electroweak production of the Z-boson pair is shown in red; the error bars on the data points (black) show the statistical uncertainty on data. (Image: ATLAS Collaboration/CERN)

    Figure 6: Summary of the mass limits on supersymmetry models set by the ATLAS searches for Supersymmetry. Results are quoted for the nominal cross section in both a region of near-maximal mass reach and a demonstrative alternative scenario, in order to display the range in model space of search sensitivity. (Image: ATLAS Collaboration/CERN)

    Probing new physics

    As the community enters the tenth year of supersymmetry searches at the LHC, the ATLAS Collaboration continues to take a broad approach to the hunt. ATLAS is committed to providing theory-independent and signature-based searches in addition to the highly targeted, model-dependent ones.

    Along with new, updated limits on various supersymmetry searches using the full Run-2 dataset (Figure 6), ATLAS once again highlighted new searches (first presented at the LHCP2019 conference) for superpartners produced through the electroweak interaction. Generated at extremely low rates at the LHC and decaying into Standard Model particles that are themselves difficult to reconstruct, such supersymmetry searches can only be described by the iconic quote: “not because it is easy, but because it is hard”.

    Overall, the results place strong constraints on important supersymmetric scenarios, which will inform theory developments and future ATLAS searches. Further, they provide examples of how advanced reconstruction techniques can help improve ATLAS’ sensitivity to new physics searches.

    Asymmetric top-quark production

    The Standard Model continued to show its strength in ATLAS’ new precision measurement of charge asymmetry in top-quark pairs (Figure 7). This intriguing imbalance – where top and antitop quarks are not produced equally at all angles with respect to the proton beam direction – is among the most subtle, difficult and yet vital properties to measure in the study of top quarks.

    The effect of this asymmetry is predicted to be extremely small; however, new physics processes interfering with the known production modes could lead to larger (or even smaller) values. ATLAS found evidence of this imbalance at a significance of four standard deviations, with a value compatible with the Standard Model. The result marks an important milestone for the field, following decades of measurements that began at the Tevatron proton–antiproton collider, the predecessor of the LHC in the USA.
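    The definition behind the measurement can be sketched concretely. In LHC top-quark analyses the charge asymmetry counts events in which the top quark is produced at larger absolute rapidity than the antitop versus the reverse; the rapidity values below are invented toy numbers, not ATLAS data.

```python
# Toy sketch of the charge-asymmetry observable:
#   A_C = (N(|y_t| > |y_tbar|) - N(|y_t| < |y_tbar|)) / (sum of both)

def charge_asymmetry(events):
    """events: list of (y_top, y_antitop) rapidity pairs."""
    forward  = sum(1 for yt, ytbar in events if abs(yt) > abs(ytbar))
    backward = sum(1 for yt, ytbar in events if abs(yt) < abs(ytbar))
    return (forward - backward) / (forward + backward)

# Toy sample with a slight excess of tops at larger |y|; in real data the
# Standard Model effect is at the sub-percent level.
toy_events = [(1.2, 0.8), (0.5, 1.4), (2.0, 1.1), (0.3, 0.9), (1.7, 0.2)]
print(charge_asymmetry(toy_events))  # 0.2  (3 forward, 2 backward)
```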


    Figure 7: Measured values of the charge asymmetry (Ac) as a function of the invariant mass of the top quark pair system (mtt) in data. (Image: ATLAS Collaboration/CERN)

    Following the data

    As EPS-HEP 2019 drew to a close, it was clear that exploration of the high-energy frontier remains far from complete. With the LHC – and its upcoming HL-LHC upgrade – set to continue apace, the future of high-energy physics will be guided by the results of ATLAS and its fellow experiments at the energy frontier.

    “Our community is living through data-driven times,” said ATLAS Deputy Spokesperson Andreas Hoecker from CERN. “Experimental results must guide the high-energy physics community to the next stage of exploration. This requires a broad and diverse particle physics research programme. The ATLAS Collaboration is up to taking this challenge!”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     
  • richardmitnick 9:53 am on July 21, 2019
    Tags: Cecilia Payne, Particle Physics, Payne discovered that hydrogen and helium are the dominant elements of the stars (1925 Ph.D. thesis)

    From COSMOS Magazine: Women in STEM- “This week in science history: The woman who found hydrogen in the stars is born” Cecilia Payne 


    From COSMOS Magazine

    THIS POST IS DEDICATED TO L.Z. OF RUTGERS PHYSICS AND HP, WHO BROUGHT CECILIA PAYNE TO MY ATTENTION. I HOPE HE SEES THIS. IF HE SEES IT, HE CAN ADVISE ME BY EMAIL.

    Meet the Woman Who Discovered the Composition of the Stars, Cecilia Payne. Mental Floss, Caitlin Schneider August 26, 2015

    Cecilia Payne is today recognised as an equal to Newton and Einstein, but it wasn’t always so.

    10 May 2018
    Jeff Glorfeld

    Cecilia Payne, photographed in 1951. Bettmann / Contributor / Getty Images

    Cecilia Payne, born on May 10, 1900, in Wendover, England, began her scientific career in 1919 with a scholarship to Cambridge University, where she studied physics. But in 1923 she received a fellowship to move to the United States and study astronomy at Harvard. Her 1925 thesis, Stellar Atmospheres, was described at the time by renowned Russian-American astronomer Otto Struve as “the most brilliant PhD thesis ever written in astronomy”.

    In January 2015, Richard Williams of the American Physical Society wrote: “By calculating the abundance of chemical elements from stellar spectra, her work began a revolution in astrophysics.”

    In 1925 Payne received the first PhD in astronomy from Radcliffe, Harvard’s college for women, because Harvard itself did not grant doctoral degrees to women.

    In the early 1930s she met Sergey Gaposchkin, a Russian astronomer who could not return to the Soviet Union because of his politics. Payne was able to find a position at Harvard for him. They married in 1934.

    Finally, in 1956, she achieved two Harvard firsts: she became its first female professor, and the first woman to become department chair.

    In a 2016 article about Payne for New York magazine, writer Dava Sobel reports that when she arrived at Harvard, Payne found the school had a collection of several hundred thousand glass photographs of the night sky, taken over a period of 40 years. Many of these images stretched starlight into strips, or spectra, marked by naturally occurring lines that revealed the constituent elements.

    As she painstakingly examined these plates, Payne reached her controversial – and groundbreaking – conclusion: that unlike on Earth, hydrogen and helium are the dominant elements of the stars.

    At the time, most scientists believed that because stars contained familiar elements such as silicon, aluminium and iron, similar to Earth’s make-up, they would be present in the same proportions, with only small amounts of hydrogen.

    Although the presence of hydrogen in stars had been known since the 1860s, when chemical analysis at a distance first became possible, no one expected the great abundance claimed by Payne.

    Richard Williams, writing for the American Physical Society in 2015, said: “The giants – Copernicus, Newton, and Einstein – each in his turn, brought a new view of the universe. Payne’s discovery of the cosmic abundance of the elements did no less.”

    However, at the time of her thesis publication the foremost authority on stellar composition, Henry Norris Russell, of Princeton University, convinced Payne that her conclusions had to be wrong, encouraging her to write that her percentages of hydrogen and helium were “improbably high” and therefore “almost certainly not real”.

    But in brilliant vindication, Russell devoted the next four years to studying Payne’s findings, and in the Astrophysical Journal he agreed with her and cited her 1925 study, concluding for the record that the great abundance of hydrogen “can hardly be doubted”.

    Cecilia Payne-Gaposchkin died on December 7, 1979.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 12:35 pm on July 18, 2019
    Tags: "CMS releases open data for Machine Learning", Particle Physics

    From CERN CMS: “CMS releases open data for Machine Learning” 


    From CERN CMS

    17 July, 2019

    CMS has now provided open access to 100% of its research data recorded in proton–proton collisions in 2010.

    (Image: Fermilab/CERN)

    The CMS collaboration at CERN has released its fourth batch of open data to the public. With this release, which brings the volume of its open data to more than 2 PB (or two million GB), CMS has now provided open access to 100% of its research data recorded in proton–proton collisions in 2010, in line with the collaboration’s data-release policy. The release also includes several new data and simulation samples. The new release builds upon and expands the scope of the successful use of CMS open data in research and in education.

    In this release, CMS open data address the ever-growing application of machine learning (ML) to challenges in high-energy physics. According to a recent paper, collaboration with the data-science and ML community is considered a high priority to help advance the application of state-of-the-art algorithms in particle physics. CMS has therefore also made available samples that can help foster such collaboration.

    “Modern machine learning is having a transformative impact on collider physics, from event reconstruction and detector simulation to searches for new physics,” remarks Jesse Thaler, an Associate Professor at MIT, who is working on ML using CMS open data with two doctoral students, Patrick Komiske and Eric Metodiev. “The performance of machine-learning techniques, however, is directly tied to the quality of the underlying training data. With the extra information provided in the latest data release from CMS, outside users can now investigate novel strategies on fully realistic samples, which will likely lead to exciting advances in collider data analysis.”

    The ML datasets, derived from millions of CMS simulation events for previous and future runs of the Large Hadron Collider, focus on solving a number of representative challenges for particle identification, tracking and distinguishing between multiple collisions that occur in each crossing of proton bunches. All the datasets come with extensive documentation on what they contain, how to use them and how to reproduce them with modified content.

    In its policy on data preservation and open access, CMS commits to releasing 100% of its analysable data within ten years of collecting them. Around half of the proton–proton collision data collected at a centre-of-mass energy of 7 TeV in 2010 were released in the first CMS release, in 2014, and the remaining data are included in this new release. In addition, a small sample of unprocessed raw data from the LHC’s Run 1 (2010 to 2012) is also released. These samples will help test the chain for processing CMS data using the legacy software environment.

    Reconstructed data and simulations from the CASTOR calorimeter, which was used by CMS in 2010, are also available and represent the first release of data from the very-forward region of CMS. Finally, CMS has released instructions and examples on how to generate simulated events and how to analyse data in isolated “containers”, within which one has access to the CMS software environment required for specific datasets. It is also easier to search through the simulated data and to discover the provenance of datasets.

    As before, the data are released into the public domain under the Creative Commons CC0 waiver via the CERN Open Data portal. The portal is openly developed by the CERN Information Technology department, in cooperation with the experimental collaborations who release open data on it.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


     
  • richardmitnick 11:07 am on July 18, 2019
    Tags: Birth of the Moon, Particle Physics, Replicating the forces that generate new planets, Sarah T. Stewart

    From Nautilus: Women in STEM-“She Rewrote the Moon’s Origin Story” Sarah T. Stewart 


    From Nautilus

    July 18, 2019
    Brian Gallagher


    Fire When Ready: In her lab, Sarah T. Stewart (above) tries to replicate the forces that generate new planets. She employs “light gas guns, essentially cannons,” she says, to fire disks—at eight kilometers per second—toward minerals, vaporizing them, to generate the pressures and temperatures needed for planet formation. John D. & Catherine T. MacArthur Foundation.

    Fifty years ago, in the Oval Office, Richard Nixon made what he called the “most historic phone call ever.” Houston had put him through to the men on the moon. “It’s a great honor and privilege for us to be here,” Neil Armstrong said, “representing not only the United States but men of peace of all nations, and with interest and a curiosity and a vision for the future.” The Apollo missions—a daring feat of passion and reason—weren’t just for show. In reaching the moon in 1969, fulfilling John F. Kennedy’s promise seven years earlier to go there not because it would be easy, but hard, humanity tested its limits—as well as the lunar soil.

    The samples the astronauts brought back to Earth have revolutionized our understanding of the moon’s origins, leading scientists to imagine new models of how our planet, and its companion, emerged. One of those scientists is Sarah T. Stewart, a planetary physicist at the University of California, Davis. Last year she won a MacArthur Foundation Fellowship, unofficially known as the “genius grant,” for her work on the origin of Earth’s moon. Her theory upends one held for decades.

    Stewart’s bold vision grows out of a love for science planted in high school in O’Fallon, Illinois. “I had phenomenal math and physics teachers,” she said. “So when I went to college, I wanted to be a physics major.” At Harvard, where she studied astronomy and physics, “I met amazing scientists, and that sparked a whole career.” She earned her Ph.D. at Caltech.

    Nautilus spoke to Stewart last year about the scientific significance of the Apollo lunar landings, as well as how her laboratory experiments, which replicate the pressures and temperatures of planetary collisions, informed her model of the moon’s birth.

    How significant were the Apollo moon landings to science?

    This July marks the 50th anniversary of the Apollo moon landing. The rock samples that the Apollo missions brought back basically threw out every previous idea for the origin of the moon. Before the Apollo results were in, a Russian astronomer named Viktor Safronov had been developing models of how planets grow. He found that they grow into these sub- or proto-planet-size bodies that would then collide. A couple of different groups then independently proposed that a giant impact made a disc around the Earth that the moon accreted from. Over the past 50 years, that model became quantitative, predictive. Simulations showed that the moon should be made primarily out of the object that struck the proto-Earth. But the Apollo mission found that the moon is practically a twin of the Earth, particularly its mantle, in major elements and in isotopic ratios: The different weight elements are like fingerprints, present in the same abundances. Every single small asteroid and planet in the solar system has a different fingerprint, except the Earth and the moon. So the giant impact hypothesis was wrong. It’s a lesson in how science works—the giant impact hypothesis hung on for so long because there was no alternative model that hadn’t already been disproven.

    How is your proposal for the moon’s birth different?

    We changed the giant impact. And by changing it we specifically removed one of the original constraints. The original giant impact was proposed to set the length of day of the Earth, because angular momentum—the rotational equivalent of linear momentum—is a physical quantity that is conserved: If we go backwards in time, the moon comes closer to the Earth. At the time the moon grew, the Earth would have been spinning with a five-hour day. So all of the giant impact models were tuned to give us a five-hour day for the Earth right after the giant impact. What we did was say, “Well, what if there were a way to change the angular momentum after the moon formed?” That would have to be through a dynamical interaction with the sun. What that means is that we could start the Earth spinning much faster—we were exploring models where the Earth had a two- to three-hour day after the giant impact.
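    The five-hour-day figure can be checked with a rough conservation-of-angular-momentum estimate. This is an illustrative back-of-envelope sketch, not Stewart's model: it assumes a rigid Earth with today's moment of inertia, a circular lunar orbit, and no solar torques.

```python
import math

# If the Earth-Moon system's angular momentum is conserved, winding the moon
# back onto the Earth returns its orbital angular momentum to Earth's spin,
# giving the length of day right after the moon formed.

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.97e24        # kg
R_EARTH = 6.371e6        # m
I_EARTH = 0.33 * M_EARTH * R_EARTH**2   # Earth's moment of inertia, kg m^2
M_MOON = 7.35e22         # kg
A_MOON = 3.84e8          # current Earth-moon distance, m

omega_now = 2 * math.pi / 86164                      # sidereal spin rate, s^-1
L_spin = I_EARTH * omega_now                         # Earth's spin ang. mom.
L_orbit = M_MOON * math.sqrt(G * M_EARTH * A_MOON)   # moon's orbital ang. mom.

omega_early = (L_spin + L_orbit) / I_EARTH   # all angular momentum in spin
day_hours = 2 * math.pi / omega_early / 3600
print(round(day_hours, 1))  # about 4 hours with these simplified inputs
```

    The crude estimate lands near the four-to-five-hour day quoted in the interview; the high-spin model then starts the Earth even faster, at a two- to three-hour day.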

    What did a faster-spinning Earth do to your models?

    The surprising new thing is that when the Earth is hot, vaporized, and spinning quickly, it isn’t a planet anymore. There’s a boundary beyond which all of the Earth material cannot physically stay in an object that rotates altogether—we call that the co-rotation limit. A body that exceeds the co-rotation limit forms a new object that we named a synestia, a Greek-derived word that is meant to represent a connected structure. A synestia is a different object than a planet plus a disc. It has different internal dynamics. In this hot vaporized state, the hot gas in the disc can’t fall onto the planet, because the planet has an atmosphere that’s pushing that gas out. What ends up happening is that the rock vapor that forms a synestia cools by radiating to space, forms magma rain in the outer parts of the synestia, and that magma rain accretes to form the moon within the rock vapor that later cools to become the Earth.
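    The co-rotation limit itself follows from a simple balance: at the radius where the centrifugal acceleration of co-rotating material equals gravity (omega^2 r = GM/r^2), material can no longer remain part of a single rigidly rotating body. A rough sketch with illustrative numbers, not Stewart's actual calculation:

```python
import math

# Co-rotation radius for a body of mass M spinning with a given period:
# solve omega^2 r = G M / r^2  =>  r = (G M / omega^2)^(1/3).

G = 6.674e-11        # m^3 kg^-1 s^-2
M_EARTH = 5.97e24    # kg

def corotation_radius_m(spin_period_s, mass_kg=M_EARTH):
    omega = 2 * math.pi / spin_period_s
    return (G * mass_kg / omega**2) ** (1.0 / 3.0)

# For a post-impact Earth-mass body with a three-hour day:
r = corotation_radius_m(3 * 3600)
print(round(r / 1e6, 1))  # ~10.6 thousand km, well beyond Earth's 6,371 km radius
```

    So a hot, rapidly spinning, partly vaporized Earth-mass body extends past this limit, which is why the material forms a synestia rather than a planet with a detached disc.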

    How did the idea of a synestia come about?

    In 2012, Matija Ćuk and I published a paper that was a high-spin model for the origin of the moon. We changed the impact event, but we didn’t realize that after the impact, things were completely different. It just wasn’t anything we ever extracted from the simulations. It wasn’t until two years later when my student Simon Lock and I were looking at different plots, plots we had never made before out of the same simulations, that we realized that we had been interpreting what happened next incorrectly. There was a bona fide eureka moment where we’re sitting together talking about how the disc would evolve around the Earth after the impact, and realizing that it wasn’t a standard disc. These synestias have probably been sitting in people’s computer systems for quite some time without anyone ever actually identifying them as something different.

    Was the size of the synestia beyond the moon’s current orbit?

    It could have been bigger. Exactly how big it was depends on the energy of the event and how fast it was spinning. We don’t have precise constraints on that, because a range of synestias could make the moon.

    How long was the Earth in a synestia state?

    The synestia was very large, but it didn’t last very long. Because rock vapor is very hot, and where we are in the solar system is far enough away from the sun that our mean temperature is cooler than rock vapor, the synestia cooled very quickly. So it could last a thousand years or so before looking like a normal planet again. Exactly how long it lasts depends on what else is happening in the solar system around the Earth. In order to be a long lived object it would need to be very close to the star.

    What was the size of the object that struck proto-Earth?

    We can’t tell, because a variety of mass ratios, impact angles, and impact velocities can make a synestia that has enough mass and angular momentum in it to make our moon. I don’t know that we will ever know for sure exactly what hit us. There may be ways for us to constrain the possibilities. One way to do that is to look deep in the Earth for clues about how large the event could have been. There are chemical tracers from the deep mantle that indicate that the Earth wasn’t completely melted and mixed, even by the moon-forming event. Those reach the surface through what are called ocean island basalts, sometimes called mantle plumes, from near the core-mantle boundary, up through the whole mantle to the surface. Those tracers could serve as a constraint on the event being too big. And because the mantles of the Earth and the moon are very similar, that similarity can be used to determine what is too small an event. That would give us a range that can probably be satisfied by a number of different impact configurations.

    How much energy does it take to form a synestia?

    Giant impacts are tremendously energetic events. The energy of the event, in terms of the kinetic energy of the impact, is released over hours. The power involved is similar to the power, or luminosity, of the sun. We really cannot think of the Earth as looking anything like the Earth when you’ve just dumped the energy of the sun into this planet.
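    The sun comparison follows from simple kinetic-energy bookkeeping: impact energy divided by the hours over which it is released. The impactor mass, speed, and release time below are illustrative assumptions (roughly a Mars-mass body at escape-velocity speeds), not parameters of the actual moon-forming event:

    ```python
    L_SUN = 3.8e26           # W, solar luminosity

    # Assumed, illustrative impact parameters:
    m_impactor = 6.4e23      # kg, roughly the mass of Mars
    v_impact = 1.0e4         # m/s, of order the mutual escape velocity
    release_time = 6 * 3600  # s, energy released "over hours"

    kinetic_energy = 0.5 * m_impactor * v_impact**2   # J
    mean_power = kinetic_energy / release_time        # W
    print(f"mean power ~ {mean_power / L_SUN:.1f} x solar luminosity")
    ```

    Even with these conservative numbers the mean power comes out within an order of magnitude of the sun’s luminosity, which is the point being made in the answer above.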

    How common are synestias?

    We actually think that synestias should happen quite frequently during rocky planet formation. We haven’t looked at the gas giant planets. There are some different physics that happen with those. But for growing rocky bodies like the Earth, we attempted to estimate the statistics of how often there should be synestias. And for Earth-mass bodies anywhere in the universe probably, the body is a synestia at least once while it’s growing. The likelihood of making a synestia goes up as the bodies become larger. Super-Earths also should have been a synestia at some point.

    You say that all of the pressures and temperatures reached during planet formation are now accessible in the laboratory. First, give us a sense of the magnitude of those pressures and temperatures, and then tell us how accessing them in labs is possible.

    The center of the Earth is at several thousand degrees, and has hundreds of gigapascals of pressure, about three million times more pressure than at the surface. Jupiter’s center is even hotter. The center-of-Jupiter pressures can be reached temporarily during a giant impact, as the bodies are colliding together. A giant impact and the center of Jupiter are about the limits of the pressures and temperatures reached during planet formation: so tens of thousands of degrees, and a million times the pressure of the Earth. To replicate that, we need to dump energy into our rock or mineral very quickly in order to generate a shockwave that reaches these amplitudes in pressure and temperature. We use major minerals in the Earth, or rocky planets, so we’ve studied iron, quartz, forsterite, enstatite, and different alloy compositions of those. Other people have studied the hydrogen-helium mixture for Jupiter, and ices for Uranus and Neptune. In my lab we have light gas guns, essentially cannons. And, using compressed hydrogen, we can launch a metal flyer plate (literally a thin disk) to almost eight kilometers per second. We can reach the core pressures in the Earth, but I can’t reach the range of giant impacts or the center of Jupiter in my lab. But the Sandia Z machine, which is a big capacitor that launches metal plates using a magnetic force, can reach 40 kilometers per second.

    Sandia Z machine

    And with the National Ignition Facility laser at Lawrence Livermore National Lab, we can reach the pressures at the center of Jupiter.


    National Ignition Facility at LLNL
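    A standard way to estimate the pressures these launchers reach is the Rankine-Hugoniot momentum jump with a linear shock-velocity/particle-velocity relation, Us = c0 + s·up; in a symmetric impact the particle velocity is half the flyer speed. The Hugoniot parameters below are assumed, silicate-like round numbers, not fitted data for any specific mineral:

    ```python
    def shock_pressure(rho0, c0, s, flyer_speed):
        """Hugoniot pressure for a symmetric planar impact.
        Linear shock relation: Us = c0 + s * up.
        Momentum jump (from negligible initial pressure): P = rho0 * Us * up.
        In a symmetric impact the particle velocity is half the flyer speed."""
        up = flyer_speed / 2.0
        us = c0 + s * up
        return rho0 * us * up

    # Assumed silicate-like Hugoniot parameters (round numbers):
    RHO0 = 3200.0  # kg/m^3, initial density
    C0 = 5000.0    # m/s, bulk sound speed intercept
    S = 1.4        # dimensionless slope

    p_gun = shock_pressure(RHO0, C0, S, 8e3)   # light-gas gun, ~8 km/s flyer
    p_z = shock_pressure(RHO0, C0, S, 40e3)    # Sandia Z machine, ~40 km/s flyer
    print(f"gas gun:   ~{p_gun / 1e9:.0f} GPa")
    print(f"Z machine: ~{p_z / 1e9:.0f} GPa")
    ```

    With these illustrative numbers the gas gun lands near Earth-core pressures (on the order of 100 GPa) and the Z machine in terapascal, center-of-Jupiter territory, matching the ordering described above.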

    What happens to the flyer plates when they’re shot?

    The target simply gets turned to dust after being vaporized and then cooling again. They’re very destructive experiments. You have to make real-time measurements, of the wave itself and how fast it’s traveling, within tens of nanoseconds. Those measurements can be translated to pressure. My group has spent a lot of time developing ways to measure temperature, and to find phase boundaries. The work that led to the origin of the moon was specifically studying what it takes to vaporize Earth materials, and to determine the boiling points of rocks. We needed to know when material would be vaporized in order to calculate when something would become a synestia.

    How do you use your experimental results?

    What runs in our code is a simplified version of a planet. With our experiments we can simulate a simplified planet and then infer the behavior of the more complicated chemical system. Once we’ve determined the pressure and temperature of the average system, you can ask more detailed questions about the multi-component chemistry of a real planet. In the moon paper that was published last year, there are two big sections: one that does the simplified modeling of the giant impact, which gives us the pressure-temperature range in the synestia, and another that looks at the chemistry of a system that starts at these high pressures and temperatures and cools, but now using a more realistic model for the Earth.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 7:59 am on July 17, 2019 Permalink | Reply
    Tags: "Bottomonium particles don’t go with the flow", , , , , , Particle Physics,   

    From CERN: “Bottomonium particles don’t go with the flow” 

    Cern New Bloc

    Cern New Particle Event


    From CERN

    16 July, 2019
    Ana Lopes

    The first measurement, by the ALICE [below] collaboration, of an elliptic flow for bottomonium particles could help shed light on the early universe.

    A few millionths of a second after the Big Bang, the universe was so dense and hot that the quarks and gluons that make up protons, neutrons and other hadrons existed freely in what is known as the quark–gluon plasma. The ALICE experiment at the Large Hadron Collider (LHC) can recreate this plasma in high-energy collisions of beams of heavy ions of lead. However, ALICE, as well as any other collision experiments that can recreate the plasma, cannot observe this state of matter directly. The presence and properties of the plasma can only be deduced from the signatures it leaves on the particles that are produced in the collisions.

    In a new article, presented at the ongoing European Physical Society conference on High-Energy Physics, the ALICE collaboration reports the first measurement of one such signature – the elliptic flow – for upsilon particles produced in lead–lead LHC collisions.

    The upsilon is a bottomonium particle, consisting of a bottom (often also called beauty) quark and its antiquark. Bottomonia and their charm-quark counterparts, charmonium particles, are excellent probes of the quark–gluon plasma. They are created in the initial stages of a heavy-ion collision and therefore experience the entire evolution of the plasma, from the moment it is produced to the moment it cools down and gives way to a state in which hadrons can form.

    One indication that the quark–gluon plasma forms is the collective motion, or flow, of the produced particles. This flow is generated by the expansion of the hot plasma after the collision, and its magnitude depends on several factors, including: the particle type and mass; how central, or “head on”, the collision is; and the momenta of the particles at right angles to the collision line. One type of flow, called elliptic flow, results from the initial elliptic shape of non-central collisions.

    In their new study, the ALICE team determined the elliptic flow of the upsilons by observing the pairs of muons (heavier cousins of the electron) into which they transform, or “decay”. They found that the magnitude of the upsilon elliptic flow for a range of momenta and collision centralities is small, making the upsilons the first hadrons that don’t seem to exhibit a significant elliptic flow.

    The results are consistent with the prediction that the upsilons are largely split up into their constituent quarks in the early stages of their interaction with the plasma, and they pave the way to higher-precision measurements using data from ALICE’s upgraded detector, which will be able to record ten times more upsilons. Such data should also cast light on the curious case of the J/psi flow. This lighter charmonium particle has a larger flow and is believed to re-form after being split up by the plasma.
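    Elliptic flow is quantified by the second Fourier coefficient v2 of the particles’ azimuthal distribution, dN/dphi ∝ 1 + 2·v2·cos 2(phi − Psi), where Psi is the event-plane angle. A toy Monte Carlo sketch of the idea (the flow magnitude and event plane here are made-up inputs; a real analysis must reconstruct Psi from the data and correct for its resolution):

    ```python
    import math
    import random

    def sample_phi(v2, psi, rng):
        """Rejection-sample an azimuth from dN/dphi ∝ 1 + 2*v2*cos(2*(phi - psi))."""
        while True:
            phi = rng.uniform(0.0, 2.0 * math.pi)
            # Envelope: the kernel's maximum value is 1 + 2*v2.
            if rng.uniform(0.0, 1.0 + 2.0 * v2) <= 1.0 + 2.0 * v2 * math.cos(2.0 * (phi - psi)):
                return phi

    rng = random.Random(1)
    v2_true, psi = 0.08, 0.3   # assumed flow magnitude and event-plane angle
    phis = [sample_phi(v2_true, psi, rng) for _ in range(200_000)]

    # Estimator with a known event plane: v2 = <cos 2(phi - psi)>.
    v2_est = sum(math.cos(2.0 * (p - psi)) for p in phis) / len(phis)
    print(f"v2 estimate: {v2_est:.3f}")
    ```

    The average of cos 2(phi − Psi) picks out exactly the second harmonic of the distribution, which is why a small measured value, as reported here for the upsilon, indicates little collective elliptic motion.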

    See the full article here.



    Meet CERN in a variety of places:

    Quantum Diaries
    QuantumDiaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN map

    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 4:25 pm on July 16, 2019 Permalink | Reply
    Tags: , , Michigan State University, , , Particle Physics,   

    From U Wisconsin IceCube Collaboration: A Flock of Articles on NSF Grant to Upgrade IceCube 

    U Wisconsin ICECUBE neutrino detector at the South Pole

    From U Wisconsin IceCube Collaboration

    From U Wisconsin: “UW lab gears up for another Antarctic drilling campaign”

    With news that the National Science Foundation (NSF) and international partners will support an upgrade to the IceCube neutrino detector at the South Pole, the UW–Madison lab that built the novel drill used to bore mile-deep holes in the Antarctic ice is gearing up for another drilling campaign.

    The UW’s Physical Sciences Laboratory (PSL), which specializes in making customized equipment for UW–Madison researchers, will once again lead drilling operations. The $37 million upgrade announced this week (July 16, 2019) will expand the IceCube detector by adding seven new strings of 108 optical modules each to study the basic properties of neutrinos, phantom-like particles that emanate from black holes and exploding stars, but that also cascade through Earth’s atmosphere as a result of colliding subatomic particles.

    1
    “It takes a crew of 30 people to run this 24/7. It’s the people that make it work,” says Bob Paulos, director of the Physical Sciences Lab. Photo: Bryce Richter

    See the full article here .

    From U Wisconsin: “IceCube: Antarctic neutrino detector to get $37 million upgrade”

    2
    The IceCube Neutrino Observatory is located at NSF’s Amundsen-Scott South Pole Station. Management and operation of the observatory is through the Wisconsin IceCube Particle Astrophysics Center at UW–Madison. Raffaela Busse, IceCube / NSF

    IceCube, the Antarctic neutrino detector that in July of 2018 helped unravel one of the oldest riddles in physics and astronomy — the origin of high-energy neutrinos and cosmic rays — is getting an upgrade.

    This month, the National Science Foundation (NSF) approved $23 million in funding to expand the detector and its scientific capabilities. Seven new strings of optical modules will be added to the 86 existing strings, adding more than 700 new, enhanced optical modules to the 5,160 sensors already embedded in the ice beneath the geographic South Pole.

    The upgrade, to be installed during the 2022–23 polar season, will receive additional support from international partners in Japan and Germany as well as from Michigan State University and the University of Wisconsin–Madison. Total new investment in the detector will be about $37 million.

    See the full article here .

    From Niels Bohr Institute: “A new Upgrade for the IceCube detector”

    3
    Illustration of the IceCube laboratory under the South Pole. The sensors detecting neutrinos are attached to the strings lowered into the ice. The upgrade will take place in the Deep Core area. Illustration: IceCube/NSF

    Neutrino Research:

    The IceCube Neutrino Observatory in Antarctica is about to get a significant upgrade. This huge detector consists of 5,160 sensors embedded in a 1x1x1 km volume of glacial ice deep beneath the geographic South Pole. The purpose of the installation is to detect neutrinos, the “ghost particles” of the Universe. The IceCube Upgrade will add more than 700 new and enhanced optical sensors in the deepest, purest ice, greatly improving the observatory’s ability to measure low-energy neutrinos produced in the Earth’s atmosphere. The research in neutrinos at the Niels Bohr Institute, University of Copenhagen, is led by Associate Professor Jason Koskinen.

    See the full article here .

    From Michigan State University: “Upgrade for neutrino detector, thanks to NSF grant”

    5
    The IceCube Neutrino Observatory, the Antarctic detector that identified the first likely source of high-energy neutrinos and cosmic rays, is getting an upgrade. Courtesy of IceCube

    The IceCube Neutrino Observatory, the Antarctic detector that identified the first likely source of high-energy neutrinos and cosmic rays, is getting an upgrade.

    The National Science Foundation is upgrading the IceCube detector, extending its scientific capabilities to lower energies, and bridging IceCube to smaller neutrino detectors worldwide. The upgrade will insert seven strings of optical modules at the bottom center of the 86 existing strings, adding more than 700 new, enhanced optical modules to the 5,160 sensors already embedded in the ice beneath the geographic South Pole.

    The upgrade will include two new types of sensor modules, which will be tested for a ten-times-larger future extension of IceCube – IceCube-Gen2. The modules to be deployed in this first extension will be two to three times more sensitive than the ones that make up the current detector. This is an important benefit for neutrino studies, but it becomes even more relevant for planning the larger IceCube-Gen2.

    The $37 million extension, to be deployed during the 2022-23 polar field season, has now secured $23 million in NSF funding. Last fall, the upgrade office was set up, thanks to initial funding from NSF and additional support from international partners in Japan and Germany as well as from Michigan State University and the University of Wisconsin-Madison.

    See the full article here .

    From U Wisconsin IceCube: “The IceCube Upgrade: An international effort”

    The IceCube Upgrade project is an international collaboration made possible not only by support from the National Science Foundation but also thanks to significant contributions from partner institutions in the U.S. and around the world. Our national and international collaborators play a huge role in manufacturing new sensors, developing firmware, and much more. Learn more about a few of our partner institutions below.

    8
    The Chiba University group poses with one of the new D-Egg optical detectors. Credit: Chiba University

    Chiba University is responsible for the new D-Egg optical detectors, 300 of which will be deployed on the new Upgrade strings. A D-Egg is 30 percent smaller than the original IceCube DOM, but its photon detection effective area is twice as large thanks to two 8-inch PMTs in the specially designed egg-shaped vessel made of UV-transparent glass. Its up-down symmetric detection efficiency is expected to improve our precision for measuring Cherenkov light from neutrino interactions. The newly designed flasher devices in the D-Egg will also give a better understanding of optical characteristics in glacial ice to improve the resolution of arrival directions of cosmic neutrinos.

    See the full article here .

    From DESY: “Neutrino observatory IceCube receives significant upgrade”

    6
    Deep down in the perpetual ice of Antarctica IceCube watches out for a faint bluish glow that indicates a rare collision of a cosmic neutrino within the ice. Artist’s concept: DESY, Science Communication Lab

    Particle detector at the South Pole will be expanded to comprise a neutrino laboratory

    The international neutrino observatory IceCube at the South Pole will be considerably expanded in the coming years. In addition to the existing 5160 sensors, a further 700 optical modules will be installed in the perpetual ice of Antarctica. The National Science Foundation in the USA has approved 23 million US dollars for the expansion. The Helmholtz Centres DESY and Karlsruhe Institute of Technology (KIT) are supporting the construction of 430 new optical modules with a total of 5.7 million euros (6.4 million US dollars), which will turn the observatory into a neutrino laboratory. IceCube, for which Germany, with a total of nine participating universities and the two Helmholtz Centres, is the most important partner after the USA, published convincing indications last year of a first source of high-energy neutrinos from the cosmos.

    See the full article here .

    See the full articles above .

    IceCube is a particle detector at the South Pole that records the interactions of a nearly massless sub-atomic particle called the neutrino. IceCube searches for neutrinos from the most violent astrophysical sources: events like exploding stars, gamma ray bursts, and cataclysmic phenomena involving black holes and neutron stars. The IceCube telescope is a powerful tool to search for dark matter, and could reveal the new physical processes associated with the enigmatic origin of the highest energy particles in nature. In addition to exploring the background of neutrinos produced in the atmosphere, IceCube studies the neutrinos themselves; their energies far exceed those produced by accelerator beams. IceCube is the world’s largest neutrino detector, encompassing a cubic kilometer of ice.

    IceCube employs more than 5000 detectors lowered on 86 strings into almost 100 holes in the Antarctic ice NSF B. Gudbjartsson, IceCube Collaboration

    Lunar Icecube

    IceCube DeepCore annotated

    IceCube PINGU annotated


    DM-Ice II at IceCube annotated

     
  • richardmitnick 12:21 pm on July 16, 2019 Permalink | Reply
    Tags: , being replaced by LBNL Lux Zeplin project, , ending, , Lead, , Particle Physics, SD, , U Washington LUX Dark matter Experiment at SURF, ,   

    From Lawrence Berkeley National Lab: “Some Assembly Required: Scientists Piece Together the Largest U.S.-Based Dark Matter Experiment” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    July 16, 2019

    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Major deliveries in June set the stage for the next phase of work on LUX-ZEPLIN project.

    1
    Lower (left) and upper photomultiplier tube arrays are prepared for LZ at the Sanford Underground Research Facility in Lead, South Dakota. (Credit: Matt Kapust/SURF)

    Most of the remaining components needed to fully assemble an underground dark matter-search experiment called LUX-ZEPLIN (LZ) arrived at the project’s South Dakota home during a rush of deliveries in June.

    When complete, LZ will be the largest, most sensitive U.S.-based experiment yet that is designed to directly detect dark matter particles. Scientists around the world have been trying for decades to solve the mystery of dark matter, which makes up about 85 percent of all matter in the universe, though we have so far only detected it indirectly through observed gravitational effects.

    The bulk of the digital components for LZ’s electronics system, which is designed to transmit and record signals from ever-slight particle interactions in LZ’s core detector vessel, were among the new arrivals at the Sanford Underground Research Facility (SURF). SURF, the site of a former gold mine now dedicated to a broad spectrum of scientific research, was also home to a predecessor search experiment called LUX.

    U Washington LUX Dark matter Experiment at SURF, Lead, SD, USA

    A final set of snugly fitting acrylic vessels, which will be filled with a special liquid designed to identify false dark matter signals in LZ’s inner detector, also arrived at SURF in June.

    3
    An intricately thin wire grid is visible (click image to view larger size) atop an array of photomultiplier tubes. The components are part of the LZ inner detector. (Credit: Matt Kapust/SURF)

    In addition, the last two of four intricately woven wire grids that are essential to maintaining a constant electric field and extracting signals from the experiment’s inner detector, also called the time projection chamber, arrived in June (see related article).

    “LZ achieved major milestones in June. It was the busiest single month for delivering things to SURF — it was the peak,” said LZ Project Director Murdock Gilchriese of the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab). Berkeley Lab is the lead institution for the LZ project, which is supported by an international collaboration that has about 37 participating institutions and about 250 researchers and technical support crew members.

    “A few months from now all of the action on LZ is going to be at SURF — we are already getting close to having everything there,” Gilchriese said.

    Mike Headley, executive director at SURF, said, “We’ve been collectively preparing for these deliveries for some time and everything has gone very well. It’s been exciting to see the experiment assembly work progress and we look forward to lowering the assembled detector a mile underground for installation.”

    4
    Components for the LUX-ZEPLIN project are stored inside a water tank nearly a mile below ground. The inner detector will be installed on the central mount pictured here, and acrylic vessels (wrapped in white) will fit snugly around this inner detector. (Credit: Matt Kapust/SURF)

    All of these components will be transported down a shaft and installed in a nearly mile-deep research cavern. The rock above provides a natural shield against much of the constant bombardment of particles raining down on the planet’s surface that produce unwanted “noise.”

    LZ components have also been painstakingly tested and selected to ensure that the materials they are made of do not themselves interfere with particle signals that researchers are trying to tease out.

    LZ is particularly focused on finding a type of theoretical particle called a weakly interacting massive particle or WIMP by triggering a unique sequence of light and electrical signals in a tank filled with 10 metric tons of highly purified liquid xenon, which is among Earth’s rarest elements. The properties of xenon atoms allow them to produce light in certain particle interactions.

    Proof of dark matter particles would fundamentally change our understanding of the makeup of the universe, as our current Standard Model of Physics does not account for their existence.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    Assembly of the liquid xenon time projection chamber for LZ is now about 80 percent complete, Gilchriese said. When fully assembled later this month this inner detector will contain about 500 photomultiplier tubes. The tubes are designed to amplify and transmit signals produced within the chamber.

    5
    An array of photomultiplier tubes that are designed to detect signals occurring within LZ’s liquid xenon tank. (Credit: Matt Kapust/SURF)

    Once assembled, the time projection chamber will be lowered carefully into a custom titanium vessel already at SURF. Before it is filled with xenon, this chamber will be lowered to a depth of about 4,850 feet. It will be carried in a frame that is specially designed to minimize vibrations, and then floated into the experimental cavern across a temporarily assembled metal runway on air-pumped pucks known as air skates.

    Finally, it will be lowered into a larger outer titanium vessel, already underground, to form the final vacuum-insulated cryostat needed to house the liquid xenon.

    That daylong journey, planned in September, will be a nail-biting experience for the entire project team, noted Berkeley Lab’s Simon Fiorucci, LZ deputy project manager.

    “It will certainly be the most stressful — this is the thing that really cannot fail. Once we’re done with this, a lot of our risk disappears and a lot of our planning becomes easier,” he said, adding, “This will be the biggest milestone that’s left besides having liquid xenon in the detector.”

    Project crews will soon begin testing the xenon circulation system, already installed underground, that will continually circulate xenon through the inner detector, further purify it, and reliquify it. Fiorucci said researchers will use about 250 pounds of xenon for these early tests.

    Work is also nearing completion on LZ’s cryogenic cooling system that is required to convert xenon gas to its liquid form.

    6
    Researchers from the University of Rochester in June installed six racks of electronics hardware that will be used to process signals from the LZ experiment. (Credit: University of Rochester)

    LZ digital electronics, which will ultimately connect to the arrays of photomultiplier tubes and enable the readout of signals from particle interactions, were designed, developed, delivered, and installed by University of Rochester researchers and technical staff at SURF in June.

    “All of our electronics have been designed specifically for LZ with the goal of maximizing our sensitivity for the smallest possible signals,” said Frank Wolfs, a professor of physics and astronomy at the University of Rochester who is overseeing the university’s efforts.

    He noted that more than 28 miles of coaxial cable will connect the photomultiplier tubes and their amplifying electronics – which are undergoing tests at UC Davis – to the digitizing electronics. “The successful installation of the digital electronics and the online network and computing infrastructure in June makes us eager to see the first signals emerge from LZ,” Wolfs added.

    Also in June, LZ participants exercised high-speed data connections from the site of the experiment to the surface level at SURF and then to Berkeley Lab. Data captured by the detectors’ electronics will ultimately be transferred to LZ’s primary data center, the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab via the Energy Sciences Network (ESnet), a high-speed nationwide data network based at Berkeley Lab.

    NERSC

    NERSC Cray Cori II supercomputer at NERSC at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    NERSC Hopper Cray XE6 supercomputer


    LBL NERSC Cray XC30 Edison supercomputer


    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Future:

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer

    NERSC is a DOE Office of Science User Facility.

    The production of the custom acrylic tanks (see related article), which will contain a fluid known as a liquid scintillator, was overseen by LZ participants at the University of California, Santa Barbara.

    5
    The top three acrylic tanks for the LUX-ZEPLIN outer detector during testing at the fabrication vendor. These tanks are now at the Sanford Underground Research Facility in Lead, South Dakota. (Credit: LZ Collaboration)

    “The partnership between LZ and SURF is tremendous, as evidenced by the success of the assembly work to date,” Headley said. “We’re proud to be a part of the LZ team and host this world-leading experiment in South Dakota.”

    NERSC and ESnet are DOE Office of Science User Facilities.

    Major support for LZ comes from the DOE Office of Science, the South Dakota Science and Technology Authority, the U.K.’s Science & Technology Facilities Council, and by collaboration members in the U.S., U.K., South Korea, and Portugal.

    More:

    For information about LZ and the LZ collaboration, visit: http://lz.lbl.gov/

    See the full article here .


    LBNL campus

    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.


     
  • richardmitnick 9:17 am on July 16, 2019 Permalink | Reply
    Tags: , “Photonic” and “spintronic” materials that transfer digitized information based on the spin of photons or electrons respectively is also made possible thanks to these results., Particle Physics, , , , Weyl semimetals- a class of quantum materials have bulk quantum states whose electrical properties can be controlled using light., With quantum computing all platforms are light-based so it’s the photon which is the carrier of quantum information.   

    From Penn Today: “Unique electrical properties in quantum materials can be controlled using light” 


    From Penn Today

    July 15, 2019
    Erica K. Brockmeier

    Insights from quantum physics have allowed engineers to incorporate components used in circuit boards, optical fibers, and control systems in new applications ranging from smartphones to advanced microprocessors. But, even with significant progress made in recent years, researchers are still looking for new and better ways to control the uniquely powerful electronic properties of quantum materials.

    A new study from Penn researchers found that Weyl semimetals, a class of quantum materials, have bulk quantum states whose electrical properties can be controlled using light. The project was led by Ritesh Agarwal and graduate student Zhurun Ji in the School of Engineering and Applied Science in collaboration with Charles Kane, Eugene Mele, and Andrew M. Rappe in the School of Arts and Sciences, along with Zheng Liu from Nanyang Technological University. Penn’s Zachariah Addison, Gerui Liu, Wenjing Liu, and Heng Gao, and Nanyang’s Peng Yu, also contributed to the work. Their findings were published in Nature Materials.

    A hint of these unconventional photogalvanic properties, or the ability to generate electric current using light, was first reported by Agarwal in silicon. His group was able to control the movement of electrical current by changing the chirality, or the inherent symmetry of the arrangement of silicon atoms, on the surface of the material.

    “At that time, we were also trying to understand the properties of topological insulators, but we could not prove that what we were seeing was coming from those unique surface states,” Agarwal explains.

    A microscopic image of multiple electrodes on a sheet of Weyl semimetal, with red and blue arrows depicting the circular movement of the light-induced electrical current under either left- (blue) or right-circularly polarized (red) light. (Photo: Zhurun Ji)

    Then, while conducting new experiments on Weyl semimetals, where the unique quantum states exist in the bulk of the material, Agarwal and Ji got results that didn’t match any theories that could explain how the electrical field was moving when activated by light. Instead of the electrical current flowing in a single direction, the current moved around the semimetal in a swirling circular pattern.

    Agarwal and Ji turned to Kane and Mele to help develop a new theoretical framework that could explain what they were seeing. After conducting new, extremely thorough experiments to iteratively eliminate all other possible explanations, the physicists were able to narrow the possible explanations to a single theory related to the structure of the light beam.

    “When you shine light on matter, it’s natural to think about a beam of light as laterally uniform,” says Mele. “What made these experiments work is that the beam has a boundary, and what made the current circulate had to do with its behavior at the edge of the beam.”

    Using this new theoretical framework, and incorporating Rappe’s insights on the electron energy levels inside the material, Ji was able to confirm the unique circular movements of the electrical current. The scientists also found that the current’s direction could be controlled by changing the light beam’s structure, such as changing the direction of its polarization or the frequency of the photons.

    “Previously, when people did optoelectronic measurements, they always assumed that light is a plane wave. But we broke that limitation and demonstrated that not only light polarization but also the spatial dispersion of light can affect the light-matter interaction process,” says Ji.
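The left- and right-circularly polarized beams that steer the current can be written as Jones vectors, and their photon-spin expectation values have opposite sign; flipping between them is the handle used to reverse the current's circulation. Below is a minimal illustrative sketch (not from the paper; note that sign conventions for handedness differ between optics texts):

```python
import math

# Jones vectors for left- and right-circular polarization, (1, +/- i)/sqrt(2).
left = (1 / math.sqrt(2), 1j / math.sqrt(2))
right = (1 / math.sqrt(2), -1j / math.sqrt(2))

def spin_z(e):
    """Expectation of the photon spin operator S_z = [[0, -i], [i, 0]]
    (in units of hbar) for a Jones vector e = (e_x, e_y)."""
    ex, ey = e
    sx = -1j * ey  # (S_z e)_x
    sy = 1j * ex   # (S_z e)_y
    return (ex.conjugate() * sx + ey.conjugate() * sy).real

print(spin_z(left), spin_z(right))  # +1.0 and -1.0: opposite helicity
```

The sign flip between the two helicities is the single bit of control the light's polarization provides; the spatial structure of the beam, as the article notes, supplies the rest.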

    This work not only allows researchers to better observe quantum phenomena, but also provides a way to engineer and control unique quantum properties simply by changing light-beam patterns. “The idea that the modulation of light’s polarization and intensity can change how an electrical charge is transported could be a powerful design idea,” says Mele.

    Future development of “photonic” and “spintronic” materials that transfer digitized information based on the spin of photons or electrons respectively is also made possible thanks to these results. Agarwal hopes to expand this work to include other optical beam patterns, such as “twisted light,” which could be used to create new quantum computing materials that allow more information to be encoded onto a single photon of light.

    “With quantum computing, all platforms are light-based, so it’s the photon which is the carrier of quantum information. If we can configure our detectors on a chip, everything can be integrated, and we can read out the state of the photon directly,” Agarwal says.

    Agarwal and Mele emphasize the “heroic” effort made by Ji, including an additional year’s measurements made while running an entirely new set of experiments that were crucial to the interpretation of the study. “I’ve rarely seen a graduate student faced with that challenge who was able not only to rise to it but to master it. She had the initiative to do something new, and she got it done,” says Mele.

    This research was supported by the Office of Naval Research (grant N00014-17-1-2661), the U.S. Army Research Office (grant W911NF-17-1-0436), the U.S. Department of Energy (grants DE-FG02-84ER45118 and DE-FG02-07ER46431), and a National Science Foundation Materials Research Science and Engineering Centers seed grant.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Academic life at Penn is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

     
  • richardmitnick 12:10 pm on July 15, 2019 Permalink | Reply
    Tags: , , , , , , , , Particle Physics,   

    From CERN: “Exploring the Higgs boson ‘discovery channels’” 

    Cern New Bloc

    Cern New Particle Event


    From CERN

    12th July 2019
    ATLAS Collaboration

    Event display of a two-electron two-muon ZH candidate. The Higgs candidate can be seen on the left, with the two leading electrons represented by green tracks and green EM calorimeter deposits (pT = 22 and 120 GeV) and the two subleading muons indicated by two red tracks (pT = 34 and 43 GeV). Recoiling against the four-lepton candidate in the left hemisphere is a dimuon pair in the right hemisphere, indicated by two red tracks (pT = 139 and 42 GeV), with an invariant mass of 91.5 GeV, which agrees well with the mass of the Z boson. (Image: ATLAS Collaboration/CERN)
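The invariant-mass check quoted in the caption follows from the standard collider formula for two (approximately massless) leptons, m² = 2·pT1·pT2·(cosh Δη − cos Δφ). The pT values below are from the caption; the angular coordinates are hypothetical, chosen only so the pair reconstructs to the quoted 91.5 GeV:

```python
import math

def dimuon_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass (GeV) of two approximately massless leptons,
    m^2 = 2*pt1*pt2*(cosh(deta) - cos(dphi))."""
    return math.sqrt(2.0 * pt1 * pt2 *
                     (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

# pT = 139 and 42 GeV from the caption; eta/phi are illustrative values.
m = dimuon_mass(139.0, 0.0, 0.0, 42.0, 0.0, 1.284)
print(f"{m:.1f} GeV")  # close to the Z boson mass of ~91.2 GeV
```

In the real analysis the angles come from the reconstructed muon tracks; the point here is only how a pair of momenta maps to a resonance mass.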

    At the 2019 European Physical Society’s High-Energy Physics conference (EPS-HEP) taking place in Ghent, Belgium, the ATLAS and CMS collaborations presented a suite of new results. These include several analyses using the full dataset from the second run of CERN’s Large Hadron Collider (LHC), recorded at a collision energy of 13 TeV between 2015 and 2018. Among the highlights are the latest precision measurements involving the Higgs boson. In only seven years since its discovery, scientists have carefully studied several of the properties of this unique particle, which is increasingly becoming a powerful tool in the search for new physics.

    The results include new searches for transformations (or “decays”) of the Higgs boson into pairs of muons and into pairs of charm quarks. Both ATLAS and CMS also measured previously unexplored properties of decays of the Higgs boson that involve electroweak bosons (the W, the Z and the photon) and compared these with the predictions of the Standard Model (SM) of particle physics. ATLAS and CMS will continue these studies over the course of the LHC’s Run 3 (2021 to 2023) and in the era of the High-Luminosity LHC (from 2026 onwards).

    The Higgs boson is the quantum manifestation of the all-pervading Higgs field, which gives mass to elementary particles it interacts with, via the Brout-Englert-Higgs mechanism. Scientists look for such interactions between the Higgs boson and elementary particles, either by studying specific decays of the Higgs boson or by searching for instances where the Higgs boson is produced along with other particles. The Higgs boson decays almost instantly after being produced in the LHC and it is by looking through its decay products that scientists can probe its behaviour.

    In the LHC’s Run 1 (2010 to 2012), decays of the Higgs boson involving pairs of electroweak bosons were observed. Now, the complete Run 2 dataset – around 140 inverse femtobarns each, the equivalent of over 10 000 trillion collisions – provides a much larger sample of Higgs bosons to study, allowing measurements of the particle’s properties to be made with unprecedented precision. ATLAS and CMS have measured the so-called “differential cross-sections” of the bosonic decay processes, which look at not just the production rate of Higgs bosons but also the distribution and orientation of the decay products relative to the colliding proton beams. These measurements provide insight into the underlying mechanism that produces the Higgs bosons. Both collaborations determined that the observed rates and distributions are compatible with those predicted by the Standard Model, within the current statistical uncertainties.
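The "10 000 trillion collisions" figure is just the expected-event formula N = L × σ applied to the integrated luminosity. A rough back-of-the-envelope sketch, using approximate round-number cross-sections (my assumptions, not from the article: ~80 mb for inelastic proton-proton collisions at 13 TeV and ~55 pb for total Higgs production):

```python
# Expected event counts from integrated luminosity: N = L * sigma.
# Cross-section values are approximate round numbers (assumptions).
L = 140.0        # integrated luminosity per experiment, in fb^-1
mb_to_fb = 1e12  # 1 millibarn = 1e12 femtobarns
pb_to_fb = 1e3   # 1 picobarn  = 1e3 femtobarns

n_collisions = L * 80.0 * mb_to_fb  # ~1.1e16, i.e. over 10 000 trillion
n_higgs = L * 55.0 * pb_to_fb       # ~7.7 million Higgs bosons produced
print(f"{n_collisions:.2e} collisions, {n_higgs:.2e} Higgs bosons")
```

The ratio of the two numbers is the reason Higgs analyses are hard: only about one collision in a billion produces a Higgs boson at all.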

    Since the strength of the Higgs boson’s interaction is proportional to the mass of elementary particles, it interacts most strongly with the heaviest generation of fermions, the third. Previously, ATLAS and CMS had each observed these interactions. However, interactions with the lighter second-generation fermions – muons, charm quarks and strange quarks – are considerably rarer. At EPS-HEP, both collaborations reported on their searches for the elusive second-generation interactions.
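Because the partial width for H→ff̄ scales roughly as the square of the fermion mass (ignoring phase-space and color factors), the mass hierarchy quoted above translates directly into a suppression factor. A quick sketch for the muon versus its third-generation counterpart, the tau:

```python
# Why second-generation decays are rare: Gamma(H -> f fbar) ~ m_f^2,
# so H->mumu is suppressed relative to H->tautau by (m_mu / m_tau)^2.
m_mu, m_tau = 0.1057, 1.777  # lepton masses in GeV (PDG values, rounded)
suppression = (m_mu / m_tau) ** 2
print(f"H->mumu / H->tautau ~ {suppression:.4f}")  # roughly 0.0035
```

A rate nearly 300 times smaller than the already-challenging tau channel is what makes H→μμ one of the benchmark rare decays for Run 3 and the High-Luminosity LHC.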
    ATLAS presented their first result from searches for Higgs bosons decaying to pairs of muons (H→μμ) with the full Run 2 dataset. This search is complicated by the large background of more typical SM processes that produce pairs of muons. “This result shows that we are now close to the sensitivity required to test the Standard Model’s predictions for this very rare decay of the Higgs boson,” says Karl Jakobs, the ATLAS spokesperson. “However, a definitive statement on the second generation will require the larger datasets that will be provided by the LHC in Run 3 and by the High-Luminosity LHC.”
    CMS presented their first result on searches for decays of Higgs bosons to pairs of charm quarks (H→cc). When a Higgs boson decays into quarks, these elementary particles immediately produce jets of particles. “Identifying jets formed by charm quarks and isolating them from other types of jets is a huge challenge,” says Roberto Carlin, spokesperson for CMS. “We’re very happy to have shown that we can tackle this difficult decay channel. We have developed novel machine-learning techniques to help with this task.”

    3
    An event recorded by CMS showing a candidate for a Higgs boson produced in association with two top quarks. The Higgs boson and top quarks decay leading to a final state with seven jets (orange cones), an electron (green line), a muon (red line) and missing transverse energy (pink line) (Image: CMS/CERN)

    The Higgs boson also acts as a mediator of physics processes in which electroweak bosons scatter or bounce off each other. Studies of these processes with very high statistics serve as powerful tests of the Standard Model. ATLAS presented the first-ever measurement of the scattering of two Z bosons. Observing this scattering completes the picture for the W and Z bosons, as ATLAS had previously observed the WZ scattering process and both collaborations had observed the WW process. CMS presented the first observation of electroweak-boson scattering that results in the production of a Z boson and a photon.
    “The experiments are making big strides in the monumental task of understanding the Higgs boson,” says Eckhard Elsen, CERN’s Director of Research and Computing. “After observation of its coupling to the third-generation fermions, the experiments have now shown that they have the tools at hand to address the even more challenging second generation. The LHC’s precision physics programme is in full swing.”

    See the full article here.



     