Tagged: CERN LHC

  • richardmitnick 1:48 pm on September 27, 2018
    Tags: CERN LHC, LHCb experiment discovers two perhaps three new particles

    From CERN: “LHCb experiment discovers two, perhaps three, new particles” 


    From CERN

    27 Sep 2018
    Ana Lopes

    The LHCb experiment at CERN. (Image: CERN)

    It could be three for the price of one. The LHCb collaboration has found two never-before-seen particles, as well as hints of another new particle, in high-energy proton collisions at the Large Hadron Collider (LHC). Future studies of the properties of these new particles will shed light on the strong force that binds subatomic particles called quarks together.

    The new particles are predicted by the well-established quark model, and belong to the same family of particles as the protons that the LHC accelerates and collides: baryons, which are made up of three quarks. But the type of quarks they contain are different: whereas protons contain two up quarks and one down quark, the new particles, dubbed Σb(6097)+ and Σb(6097)-, are bottom baryons composed of one bottom quark and two up quarks (buu) or one bottom quark and two down quarks (bdd) respectively. Four relatives of these particles, known as Σb+, Σb-, Σb*+ and Σb*-, were first observed at a Fermilab experiment, but this is the first time that their two higher-mass counterparts, Σb(6097)+ and Σb(6097)-, have been detected.

    The LHCb collaboration found these particles using the classic particle-hunting technique of looking for an excess of events, or bump, over a smooth background of events in data from particle collisions. In this case, the researchers looked for such bumps in the mass distribution of a two-particle system consisting of a neutral baryon called Λb0 and a charged quark-antiquark particle called the π meson. They found two bumps corresponding to the Σb(6097)+ and Σb(6097)- particles, with the whopping significances of 12.7 and 12.6 standard deviations respectively; five standard deviations is the usual threshold to claim the discovery of a new particle. The 6097 in the names refers to the approximate masses of the new particles in MeV, about six times more massive than the proton.
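
    To make the bump-hunting idea concrete, here is a minimal toy sketch in Python (not LHCb's actual analysis code) of how an excess of events over a smooth background translates into a significance in standard deviations; the event counts below are invented purely for illustration.

        import math

        def counting_significance(n_observed, n_background):
            """Toy counting-experiment significance: the excess over the expected
            background, divided by the Poisson fluctuation of that background."""
            return (n_observed - n_background) / math.sqrt(n_background)

        # Invented numbers: one mass bin with a smooth background expectation
        # and a clear excess of candidate events sitting on top of it.
        print(f"{counting_significance(11300, 10000):.1f} standard deviations")

        # The '6097' in the particle names is the approximate mass in MeV;
        # compared with the proton (~938 MeV) that is roughly
        print(f"{6097 / 938:.1f} times the proton mass")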

    The third particle, named Zc-(4100) by the LHCb collaboration, is a possible candidate for a different type of quark beast, one made not of the usual two or three quarks but of four quarks (strictly speaking, two quarks and two antiquarks), two of which are heavy charm quarks. Such exotic mesons, sometimes described as “tetraquarks”, as well as five-quark particles called “pentaquarks”, have long been predicted to exist but have only relatively recently been discovered. Searching for structures in the decays of heavier B mesons, the LHCb researchers detected evidence for Zc-(4100) with a significance of more than three standard deviations, short of the threshold for discovery. Future studies with more data, at LHCb or at other experiments, may be able to boost or disprove this evidence.

    The new findings, described in two papers posted online and submitted for publication to physics journals, represent another step in physicists’ understanding of the strong force, one of the four fundamental forces of nature.

    For more information, see the LHCb website.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS, LHCb

    OTHER PROJECTS AT CERN: AEGIS, ALPHA, AMS, ASACUSA, ATRAP, AWAKE, CAST, CLOUD, COMPASS, DIRAC, ISOLDE, LHCf, NA62, nTOF, TOTEM, UA9, ProtoDUNE

     
  • richardmitnick 3:39 pm on September 25, 2018
    Tags: Argonne's Theta supercomputer, Aurora exascale supercomputer, CERN LHC

    From Argonne National Laboratory ALCF: “Argonne team brings leadership computing to CERN’s Large Hadron Collider” 

    From Argonne National Laboratory ALCF

    ANL ALCF supercomputers: Cetus, Theta, Aurora and Mira

    September 25, 2018
    Madeleine O’Keefe

    CERN’s Large Hadron Collider (LHC), the world’s largest particle accelerator, expects to produce around 50 petabytes of data this year. This is equivalent to nearly 15 million high-definition movies—an amount so enormous that analyzing it all poses a serious challenge to researchers.


    A team of collaborators from the U.S. Department of Energy’s (DOE) Argonne National Laboratory is working to address this issue with computing resources at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility. Since 2015, this team has worked with the ALCF on multiple projects to explore ways supercomputers can help meet the growing needs of the LHC’s ATLAS experiment.

    The efforts are especially important given what is coming up for the accelerator. In 2026, the LHC will undergo an ambitious upgrade to become the High-Luminosity LHC (HL-LHC). The aim of this upgrade is to increase the LHC’s luminosity—the number of events detected per second—by a factor of 10. “This means that the HL-LHC will be producing about 20 times more data per year than what ATLAS will have on disk at the end of 2018,” says Taylor Childers, a member of the ATLAS collaboration and computer scientist at the ALCF who is leading the effort at the facility. “CERN’s computing resources are not going to grow by that factor.”

    Luckily for CERN, the ALCF already operates some of the world’s most powerful supercomputers for science, and the facility is in the midst of planning for an upgrade of its own. In 2021, Aurora—the ALCF’s next-generation system, and the first exascale machine in the country—is scheduled to come online.

    It will provide the ATLAS experiment with an unprecedented resource for analyzing the data coming out of the LHC—and soon, the HL-LHC.

    CERN/ATLAS detector

    Why ALCF?

    CERN may be best known for smashing particles, which physicists do to study the fundamental laws of nature and gather clues about how the particles interact. This involves a lot of computationally intense calculations that benefit from the use of the DOE’s powerful computing systems.

    The ATLAS detector is an 82-foot-tall, 144-foot-long cylinder with magnets, detectors, and other instruments layered around the central beampipe like an enormous 7,000-ton Swiss roll. When protons collide in the detector, they send a spray of subatomic particles flying in all directions, and this particle debris generates signals in the detector’s instruments. Scientists can use these signals to discover important information about the collision and the particles that caused it in a computational process called reconstruction. Childers compares this process to arriving at the scene of a car crash that has nearly completely obliterated the vehicles and trying to figure out the makes and models of the cars and how fast they were going. Reconstruction is also performed on simulated data in the ATLAS analysis framework, called Athena.

    An ATLAS physics analysis consists of three steps. First, in event generation, researchers use the physics that they know to model the kinds of particle collisions that take place in the LHC. In the next step, simulation, they generate the subsequent measurements the ATLAS detector would make. Finally, reconstruction algorithms are run on both simulated and real data, the output of which can be compared to see differences between theoretical prediction and measurement.
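
    The three-step structure lends itself to a simple chained pipeline. The sketch below is a schematic stand-in, not the Athena framework: the function names and data formats are hypothetical, and the point is only that the output of each step feeds the next, which is what makes running the steps back-to-back on one machine attractive (as Childers explains further down).

        def generate_events(n_events):
            """Step 1: event generation - model the collisions expected from known physics.
            A placeholder list of dicts stands in for real generated events."""
            return [{"id": i, "truth": "generated collision"} for i in range(n_events)]

        def simulate_detector(events):
            """Step 2: simulation - produce the measurements the ATLAS detector would record."""
            return [{"id": e["id"], "hits": "simulated detector response"} for e in events]

        def reconstruct(detector_data):
            """Step 3: reconstruction - run the same algorithms used on real data to
            recover particle candidates from the recorded (or simulated) signals."""
            return [{"id": d["id"], "candidates": "reconstructed particles"} for d in detector_data]

        # Chaining the steps back-to-back keeps the intermediate outputs local
        # instead of shipping them between different Grid sites.
        events = reconstruct(simulate_detector(generate_events(1000)))
        print(len(events), "events reconstructed")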

    “If we understand what’s going on, we should be able to simulate events that look very much like the real ones,” says Tom LeCompte, a physicist in Argonne’s High Energy Physics division and former physics coordinator for ATLAS.

    “And if we see the data deviate from what we know, then we know we’re either wrong, we have a bug, or we’ve found new physics,” says Childers.

    Some of these simulations, however, are too complicated for the Worldwide LHC Computing Grid, which LHC scientists have used to handle data processing and analysis since 2002.

    MonALISA LHC Computing Grid map: http://monalisa.caltech.edu/ml/_client.beta

    The Grid is an international distributed computing infrastructure that links 170 computing centers across 42 countries, allowing data to be accessed and analyzed in near real-time by an international community of more than 10,000 physicists working on various LHC experiments.

    The Grid has served the LHC well so far, but as demand for new science increases, so does the required computing power.

    That’s where the ALCF comes in.

    In 2011, when LeCompte returned to Argonne after serving as ATLAS physics coordinator, he started looking for the next big problem he could help solve. “Our computing needs were growing faster than it looked like we would be able to fulfill them, and we were beginning to notice that there were problems we were trying to solve with existing computing that just weren’t able to be solved,” he says. “It wasn’t just an issue of having enough computing; it was an issue of having enough computing in the same place. And that’s where the ALCF really shines.”

    LeCompte worked with Childers and ALCF computer scientist Tom Uram to use Mira, the ALCF’s 10-petaflops IBM Blue Gene/Q supercomputer, to carry out calculations to improve the performance of the ATLAS software.

    MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    Together they scaled Alpgen, a Monte Carlo-based event generator, to run efficiently on Mira, enabling the generation of millions of particle collision events in parallel. “From start to finish, we ended up processing events more than 20 times as fast, and used all of Mira’s 49,152 processors to run the largest-ever event generation job,” reports Uram.
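
    As a rough illustration of that parallelization pattern (not Alpgen itself, and not the actual ALCF job scripts), the sketch below uses mpi4py to split an event-generation workload across MPI ranks; generate_event is a hypothetical stand-in for a real Monte Carlo generator call.

        # Run with, for example: mpiexec -n 8 python generate.py
        import random
        from mpi4py import MPI

        def generate_event(seed):
            # Hypothetical stand-in for a Monte Carlo event-generator call.
            return {"weight": random.Random(seed).random()}

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        TOTAL_EVENTS = 1_000_000
        # Each rank generates an independent, non-overlapping slice of the workload.
        my_events = [generate_event(seed) for seed in range(rank, TOTAL_EVENTS, size)]

        # Gather only the per-rank counts on rank 0 to confirm the full sample exists.
        counts = comm.gather(len(my_events), root=0)
        if rank == 0:
            print("events generated:", sum(counts))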

    But they weren’t going to stop there. Simulation, which takes up around five times more Grid computing than event generation, was the next challenge to tackle.

    Moving forward with Theta

    In 2017, Childers and his colleagues were awarded a two-year allocation from the ALCF Data Science Program (ADSP), a pioneering initiative designed to explore and improve computational and data science methods that will help researchers gain insights into very large datasets produced by experimental, simulation, or observational methods. The goal is to deploy Athena on Theta, the ALCF's 11.69-petaflops Intel-Cray supercomputer, and develop an end-to-end workflow that couples all the steps together, improving on the current execution model for ATLAS jobs, which involves a many-step workflow executed on the Grid.

    ANL ALCF Theta Cray XC40 supercomputer

    “Each of those steps—event generation, simulation, and reconstruction—has input data and output data, so if you do them in three different locations on the Grid, you have to move the data with it,” explains Childers. “Ideally, you do all three steps back-to-back on the same machine, which reduces the amount of time you have to spend moving data around.”

    Enabling portions of this workload on Theta promises to expedite the production of simulation results, discovery, and publications, as well as increase the collaboration’s data analysis reach, thus moving scientists closer to new particle physics.

    One challenge the group has encountered so far is that, unlike other computers on the Grid, Theta cannot reach out to the job server at CERN to receive computing tasks. To solve this, the ATLAS software team developed Harvester, a Python edge service that can retrieve jobs from the server and submit them to Theta. In addition, Childers developed Yoda, an MPI-enabled wrapper that launches these jobs on each compute node.
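
    The following is only a schematic of the edge-service pattern just described, not Harvester's or Yoda's real interfaces: the job-server URL, the job fields and the launcher command line are all hypothetical, and a production service would add authentication, bookkeeping and error handling.

        import subprocess
        import time

        import requests  # third-party HTTP client

        JOB_SERVER = "https://jobserver.example.org/next-job"  # hypothetical endpoint

        def fetch_job():
            """Poll the central job server; the supercomputer cannot reach out to
            CERN itself, so an edge service does the polling on its behalf."""
            response = requests.get(JOB_SERVER, timeout=30)
            return response.json() if response.status_code == 200 else None

        def launch_on_compute_nodes(job):
            """Hand the retrieved job to a local MPI launcher (illustrative command only)."""
            subprocess.run(["mpiexec", "-n", str(job["ranks"]), job["executable"]], check=True)

        while True:
            job = fetch_job()
            if job is None:
                time.sleep(60)  # nothing queued; try again later
                continue
            launch_on_compute_nodes(job)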

    Harvester and Yoda are now being integrated into the ATLAS production system. The team has just started testing this new workflow on Theta, where it has already simulated over 12 million collision events. Simulation is the only step that is “production-ready,” meaning it can accept jobs from the CERN job server.

    The team also has a running end-to-end workflow—which includes event generation and reconstruction—for ALCF resources. For now, the local ATLAS group is using it to run simulations investigating if machine learning techniques can be used to improve the way they identify particles in the detector. If it works, machine learning could provide a more efficient, less resource-intensive method for handling this vital part of the LHC scientific process.

    “Our traditional methods have taken years to develop and have been highly optimized for ATLAS, so it will be hard to compete with them,” says Childers. “But as new tools and technologies continue to emerge, it’s important that we explore novel approaches to see if they can help us advance science.”

    Upgrade computing, upgrade science

    As CERN's quest for new science intensifies, as it will with the HL-LHC upgrade in 2026, the computational requirements for handling the influx of data become ever more demanding.

    “With the scientific questions that we have right now, you need that much more data,” says LeCompte. “Take the Higgs boson, for example. To really understand its properties and whether it’s the only one of its kind out there takes not just a little bit more data but takes a lot more data.”

    This makes the ALCF’s resources—especially its next-generation exascale system, Aurora—more important than ever for advancing science.

    Depiction of ANL ALCF Cray Shasta Aurora exascale supercomputer

    Aurora, scheduled to come online in 2021, will be capable of one billion billion calculations per second—that's 100 times more computing power than Mira. It is just starting to be integrated into the ATLAS efforts through a new project selected for the Aurora Early Science Program (ESP) led by Jimmy Proudfoot, an Argonne Distinguished Fellow in the High Energy Physics division. Proudfoot says that the effective utilization of Aurora will be key to ensuring that ATLAS continues delivering discoveries on a reasonable timescale. Since greater computing capacity expands the range of analyses that can be done, systems like Aurora may even enable new analyses not yet envisioned.

    The ESP project, which builds on the progress made by Childers and his team, has three components that will help prepare Aurora for effective use in the search for new physics: enable ATLAS workflows for efficient end-to-end production on Aurora, optimize ATLAS software for parallel environments, and update algorithms for exascale machines.

    “The algorithms apply complex statistical techniques which are increasingly CPU-intensive and which become more tractable—and perhaps only possible—with the computing resources provided by exascale machines,” explains Proudfoot.

    In the years leading up to Aurora’s run, Proudfoot and his team, which includes collaborators from the ALCF and Lawrence Berkeley National Laboratory, aim to develop the workflow to run event generation, simulation, and reconstruction. Once Aurora becomes available in 2021, the group will bring their end-to-end workflow online.

    The stated goals of the ATLAS experiment—from searching for new particles to studying the Higgs boson—only scratch the surface of what this collaboration can do. Along the way to groundbreaking science advancements, the collaboration has developed technology for use in fields beyond particle physics, like medical imaging and clinical anesthesia.

    These contributions and the LHC’s quickly growing needs reinforce the importance of the work that LeCompte, Childers, Proudfoot, and their colleagues are doing with ALCF computing resources.

    “I believe DOE’s leadership computing facilities are going to play a major role in the processing and simulation of the future rounds of data that will come from the ATLAS experiment,” says LeCompte.

    This research is supported by the DOE Office of Science. ALCF computing time and resources were allocated through the ASCR Leadership Computing Challenge, the ALCF Data Science Program, and the Early Science Program for Aurora.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About ALCF

    The Argonne Leadership Computing Facility’s (ALCF) mission is to accelerate major scientific discoveries and engineering breakthroughs for humanity by designing and providing world-leading computing facilities in partnership with the computational science community.

    We help researchers solve some of the world’s largest and most complex problems with our unique combination of supercomputing resources and expertise.

    ALCF projects cover many scientific disciplines, ranging from chemistry and biology to physics and materials science. Examples include modeling and simulation efforts to:

    Discover new materials for batteries
    Predict the impacts of global climate change
    Unravel the origins of the universe
    Develop renewable energy technologies

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 2:32 pm on September 25, 2018
    Tags: CERN LHC

    From ALICE at CERN: “What the LHC upgrade brings to CERN” 


    From ALICE at CERN

    25 September 2018
    Rashmi Raniwala
    Sudhir Raniwala

    Six years after its discovery, the Higgs boson has validated a prediction. Soon, an upgrade to the Large Hadron Collider will allow CERN scientists to produce more of these particles for testing the Standard Model of physics.

    FNAL magnets such as this one, mounted on a test stand at Fermilab, are being developed for the High-Luminosity LHC. (Photo: Reidar Hahn)

    Six years after the Higgs boson was discovered at the CERN Large Hadron Collider (LHC), particle physicists announced last week that they have observed how the elusive particle decays.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    The finding, presented by the ATLAS and CMS collaborations, is the observation of the Higgs boson decaying to fundamental particles known as bottom quarks.

    In 2012, the Nobel-winning discovery of the Higgs boson validated the Standard Model of physics, which also predicts that about 60% of the time a Higgs boson will decay to a pair of bottom quarks.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    According to CERN, “testing this prediction is crucial because the result will either lend support to the Standard Model — which is built upon the idea that the Higgs field endows quarks and other fundamental particles with mass — or rock its foundations and point to new physics”.

    The Higgs boson was detected by studying collisions of particles at different energies. But Higgs bosons last for only about one zeptosecond (0.000000000000000000001 seconds), so detecting them and studying their properties requires an incredible amount of energy and advanced detectors. CERN announced earlier this year that the LHC is getting a massive upgrade, which will be completed by 2026.

    Why study particles?

    Particle physics probes nature at extreme scales, to understand the fundamental constituents of matter. Just like grammar and vocabulary guide (and constrain) our communication, particles communicate with each other in accordance with certain rules which are embedded in what are known as the ‘four fundamental interactions’. The particles and three of these interactions are successfully described by a unified approach known as the Standard Model. The SM is a framework that required the existence of a particle called the Higgs boson, and one of the major aims of the LHC was to search for the Higgs boson.

    How are such tiny particles studied?

    Protons are collected in bunches, accelerated to nearly the speed of light and made to collide. Many particles emerge from such a collision, termed an event. The emergent particles exhibit an apparently random pattern but follow underlying laws that govern part of their behaviour. Studying the patterns in the emission of these particles helps us understand the properties and structure of particles.

    Initially, the LHC provided collisions at unprecedented energies, allowing us to focus on exploring new territory. But it is now time to increase the discovery potential of the LHC by recording a larger number of events.


    So, what will an upgrade mean?

    Now that the Higgs boson has been discovered, it is imperative to study the properties of the newly discovered particle and its effect on all other particles. This requires a large number of Higgs bosons. The SM has its shortcomings, and there are alternative models that fill these gaps. The validity of these and other models that provide an alternative to the SM can be tested by checking their predictions experimentally. Some of these predicted signals, including those for "dark matter", "supersymmetric particles" and other deep mysteries of nature, are very rare and hence difficult to observe, further motivating the need for a High-Luminosity LHC (HL-LHC).

    Imagine trying to find a rare variety of diamond amongst a very large number of apparently similar looking pieces. The time taken to find the coveted diamond will depend on the number of pieces provided per unit time for inspection, and the time taken in inspection. To complete this task faster, we need to increase the number of pieces provided and inspect faster. In the process, some new pieces of diamond, hitherto unobserved and unknown, may be discovered, changing our perspective about rare varieties of diamonds.

    Once the LHC is upgraded, the collision rate will increase, and so will the probability of the rarest events. In addition, discerning the properties of the Higgs boson will require a copious supply of them. After the upgrade, the total number of Higgs bosons produced in one year may be about 5 times the number produced currently, and in the same period the total data recorded may be more than 20 times greater.

    With the proposed luminosity (a measure of the number of protons crossing per unit area per unit time) of the HL-LHC, the experiments will be able to record about 25 times more data in the same period as for LHC running. The beam in the LHC has about 2,800 bunches, each of which contains about 115 billion protons. The HL-LHC will have about 170 billion protons in each bunch, contributing to an increase in luminosity by a factor of 1.5.
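
    A quick back-of-the-envelope check of the bunch numbers quoted above (simple arithmetic only; the real luminosity gain also depends on beam optics, crossing angle and other factors not included here):

        PROTONS_PER_BUNCH_LHC = 115e9     # ~115 billion protons per bunch today
        PROTONS_PER_BUNCH_HLLHC = 170e9   # ~170 billion planned for the HL-LHC
        BUNCHES_PER_BEAM = 2800           # approximate number of bunches in the ring

        # The bunch-intensity increase alone gives roughly the factor of 1.5 quoted above.
        print(f"bunch-intensity factor: {PROTONS_PER_BUNCH_HLLHC / PROTONS_PER_BUNCH_LHC:.2f}")
        print(f"protons per beam (HL-LHC): {PROTONS_PER_BUNCH_HLLHC * BUNCHES_PER_BEAM:.2e}")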

    How will it be upgraded?

    The protons are kept together in the bunch using special magnetic field configurations, formed using quadrupole magnets. Focusing the bunch into a smaller size requires stronger fields, and therefore greater currents, necessitating the use of superconducting cables. Newer technology and a new material (niobium-tin) will be used to produce the required magnetic fields, about 1.5 times stronger than the present ones (roughly 8 tesla today, rising to around 12 tesla).

    The creation of long coils for such fields is being tested. New equipment will be installed over 1.2 km of the 27-km LHC ring close to the two major experiments (ATLAS and CMS), for focusing and squeezing the bunches just before they cross.

    CERN crab cavities that will be used in the HL-LHC


    FNAL Crab cavities for the HL-LHC

    Hundred-metre cables of superconducting material (superconducting links) with the capacity to carry up to 100,000 amperes will be used to connect the power converters to the accelerator. The LHC gets the protons from an accelerator chain, which will also need to be upgraded to meet the requirements of the high luminosity.

    Since the length of each bunch is a few cm, to increase the number of collisions a slight tilt is being produced in the bunches just before the collisions to increase the effective area of overlap. This is being done using ‘crab cavities’.

    The experimental particle physics community in India has actively participated in the experiments ALICE and CMS. The HL-LHC will require an upgrade of these too. Both the design and the fabrication of the new detectors, and the ensuing data analysis will have a significant contribution from the Indian scientists.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS, LHCb

     
  • richardmitnick 2:02 pm on September 25, 2018
    Tags: CERN LHC, LSND

    From Symmetry: “How not to be fooled in physics” 

    From Symmetry

    09/25/18
    Laura Dattaro

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Particle physicists and astrophysicists employ a variety of tools to avoid erroneous results.

    In the 1990s, an experiment conducted in Los Alamos, about 35 miles northwest of the capital of New Mexico, appeared to find something odd.

    Scientists designed the Liquid Scintillator Neutrino Detector experiment at the US Department of Energy’s Los Alamos National Laboratory to count neutrinos, ghostly particles that come in three types and rarely interact with other matter.

    LSND experiment at Los Alamos National Laboratory and Virginia Tech

    LSND was looking for evidence of neutrino oscillation, or neutrinos changing from one type to another.

    Several previous experiments had seen indications of such oscillations, which show that neutrinos have small masses not incorporated into the Standard Model, the ruling theory of particle physics. LSND scientists wanted to double-check these earlier measurements.

    By studying a nearly pure source of one type of neutrinos—muon neutrinos—LSND did find evidence of oscillation to a different type of neutrinos, electron neutrinos. However, they found many more electron neutrinos in their detector than predicted, creating a new puzzle.

    This excess could have been a sign that neutrinos oscillate between not three but four different types, suggesting the existence of a possible new type of neutrino, called a sterile neutrino, which theorists had suggested as a possible way to incorporate tiny neutrino masses into the Standard Model.

    Or there could be another explanation. The question is: What? And how can scientists guard against being fooled in physics?

    Brand new thing

    Many physicists are looking for results that go beyond the Standard Model. They come up with experiments to test its predictions; if what they find doesn’t match up, they have potentially discovered something new.

    “Do we see what we expected from the calculations if all we have there is the Standard Model?” says Paris Sphicas, a researcher at CERN.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    “If the answer is yes, then it means we have nothing new. If the answer is no, then you have the next question, which is, ‘Is this within the uncertainties of our estimates? Could this be a result of a mistake in our estimates?’ And so on and so on.”

    A long list of possible factors can trick scientists into thinking they’ve made a discovery. A big part of scientific research is identifying them and finding ways to test what’s really going on.

    “The community standard for discovery is a high bar, and it ought to be,” says Yale University neutrino physicist Bonnie Fleming. “It takes time to really convince ourselves we’ve really found something.”

    In the case of the LSND anomaly, scientists wonder whether unaccounted-for background events tipped the scales or if some sort of mechanical problem caused an error in the measurement.

    Scientists have designed follow-up experiments to see if they can reproduce the result. An experiment called MiniBooNE, hosted by Fermi National Accelerator Laboratory, recently reported seeing signs of a similar excess. Other experiments, such as the MINOS experiment, also at Fermilab, have not seen it, complicating the search.

    FNAL/MiniBooNE

    FNAL Minos map


    FNAL/MINOS


    FNAL MINOS Far Detector in the Soudan Mine in northern Minnesota

    “[LSND and MiniBooNE] are clearly measuring an excess of events over what they expect,” says MINOS co-spokesperson Jenny Thomas, a physicist at University College London. “Are those important signal events, or are they a background they haven’t estimated properly? That’s what they are up against.”

    Managing expectations

    Much of the work in understanding a signal involves preparatory work before one is even seen.

    In designing an experiment, researchers need to understand what physics processes can produce or mimic the signal being sought, events that are often referred to as “background.”

    Physicists can predict backgrounds through simulations of experiments. Some types of detector backgrounds can be identified through “null tests,” such as pointing a telescope at a blank wall. Other backgrounds can be identified through tests with the data itself, such as so-called “jack-knife tests,” which involve splitting data into subsets—say, data from Monday and data from Tuesday—which by design must produce the same results. Any inconsistencies would warn scientists about a signal that appears in just one subset.
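
    As a minimal illustration of the split-consistency idea (a simplified stand-in for the jack-knife tests mentioned above, with invented numbers), one can split a measurement into two subsets that should agree by construction and express their difference in units of its statistical uncertainty:

        import math
        import statistics

        def split_consistency(subset_a, subset_b):
            """Difference between the two subset means, in units of the combined
            standard error; a large value flags an inconsistency to investigate."""
            mean_a, mean_b = statistics.mean(subset_a), statistics.mean(subset_b)
            err_a = statistics.stdev(subset_a) / math.sqrt(len(subset_a))
            err_b = statistics.stdev(subset_b) / math.sqrt(len(subset_b))
            return (mean_a - mean_b) / math.sqrt(err_a**2 + err_b**2)

        # Invented example: "Monday" and "Tuesday" subsets of the same measurement.
        monday = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]
        tuesday = [10.0, 10.2, 9.7, 10.1, 10.0, 9.9]
        print(f"difference: {split_consistency(monday, tuesday):+.2f} standard errors")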

    Researchers looking at a specific signal work to develop a deep understanding of what other physics processes could produce the same signature in their detector. MiniBooNE, for example, studies a beam primarily made of muon neutrinos to measure how often those neutrinos oscillate to other flavors. But it will occasionally pick up stray electron neutrinos, which look like muon neutrinos that have transformed. Beyond that, other physics processes can mimic the signal of an electron neutrino event.

    “We know we’re going to be faked by those, so we have to do the best job to estimate how many of them there are,” Fleming says. “Whatever excess we find has to be in addition to those.”

    Even more variable than a particle beam: human beings. While science strives to be an objective measurement of facts, the process itself is conducted by a collection of people whose actions can be colored by biases, personal stories and emotion. A preconceived notion that an experiment will (or won’t) produce a certain result, for example, could influence a researcher’s work in subtle ways.

    “I think there’s a stereotype that scientists are somehow dispassionate, cold, calculating observers of reality,” says Brian Keating, an astrophysicist at University of California San Diego and author of the book Losing the Nobel Prize, which chronicles how the desire to make a prize-winning discovery can steer a scientist away from best practices. “In reality, the truth is we actually participate in it, and there are sociological elements at work that influence a human being. Scientists, despite the stereotypes, are very much human beings.”

    Staying cognizant of this fact and incorporating methods for removing bias are especially important if a particular claim upends long-standing knowledge—such as, for example, our understanding of neutrinos. In these cases, scientists know to adhere to the adage: Extraordinary claims require extraordinary evidence.

    “If you’re walking outside your house and you see a car, you probably think, ‘That’s a car,’” says Jonah Kanner, a research scientist at Caltech. “But if you see a dragon, you might think, ‘Is that really a dragon? Am I sure that’s a dragon?’ You’d want a higher level of evidence.”


    Dragon or discovery?

    Physicists have been burned by dragons before. In 1969, for example, a scientist named Joe Weber announced that he had detected gravitational waves: ripples in the fabric of space-time first predicted by Albert Einstein in 1916. Such a detection, which many had thought was impossible to make, would have proved a key tenet of relativity. Weber rocketed to momentary fame, until other physicists found they could not replicate his results.

    The false discovery rocked the gravitational wave community, which, over the decades, became increasingly cautious about making such announcements.

    So in 2009, as the Laser Interferometer Gravitational Wave Observatory, or LIGO, came online for its next science run, the scientific collaboration came up with a way to make sure collaboration members stayed skeptical of their results. They developed a method of adding a false or simulated signal into the detector data stream without alerting the majority of the 800 or so researchers on the team. They called it a blind injection. The rest of the members knew an injection was possible, but not guaranteed.

    “We’d been not detecting signals for 30 years,” Kanner, a member of the LIGO collaboration, says. “How clear or obvious would the signature have to be for everyone to believe it?… It forced us to push our algorithms and our statistics and our procedures, but also to test the sociology and see if we could get a group of people to agree on this.”

    In late 2010, the team got the alert they had been waiting for: The computers detected a signal. For six months, hundreds of scientists analyzed the results, eventually concluding that the signal looked like gravitational waves. They wrote a paper detailing the evidence, and more than 400 team members voted on its approval. Then a senior member told them it had all been faked.

    Picking out and spending so much time examining such an artificial signal may seem like a waste of time, but the test worked just as intended. The exercise forced the scientists to work through all of the ways they would need to scrutinize a real result before one ever came through. It forced the collaboration to develop new tests and approaches to demonstrating the consistency of a possible signal in advance of a real event.

    “It was designed to keep us honest in a sense,” Kanner says. “Everyone to some extent goes in with some guess or expectation about what’s going to come out of that experiment. Part of the idea of the blind injection was to try and tip the scales on that bias, where our beliefs about whether we thought nature should produce an event would be less important.”

    All of the hard work paid off: In September 2015, when an authentic signal hit the LIGO detectors, scientists knew what to do. In 2016, the collaboration announced the first confirmed direct detection of gravitational waves. One year later, the discovery won the Nobel Prize.


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger

    ESA/eLISA the future of gravitational wave research

    Skymap showing how adding Virgo to LIGO helps reduce the size of the source-likely region in the sky. (Credit: Giuseppe Greco, Virgo Urbino group)

    No easy answers

    While blind injections worked for the gravitational waves community, each area of physics presents its own unique challenges.

    Neutrino physicists have an extremely small sample size with which to work, because their particles interact so rarely. That’s why experiments such as the NOvA experiment and the upcoming Deep Underground Neutrino experiment use such enormous detectors.

    FNAL/NOvA experiment map


    FNAL NOvA detector in northern Minnesota


    FNAL NOvA Near Detector

    Astronomers have even fewer samples: They have just one universe to study, and no way to conduct controlled experiments. That’s why they conduct decades-long surveys, to collect as much data as possible.

    Researchers working at the Large Hadron Collider have no shortage of interactions to study—an estimated 600 million events are detected every second.


    But due to the enormous size, cost and complexity of the technology, scientists have built only one LHC. That’s why inside the collider sit multiple different detectors, which can check one another’s work by measuring the same things in a variety of ways with detectors of different designs.

    CERN ATLAS


    CERN/CMS Detector



    CERN ALICE detector


    CERN LHCb chamber, LHC

    While there are many central tenets to checking a result—knowing your experiment and background well, running simulations and checking that they agree with your data, testing alternative explanations of a suspected result—there’s no comprehensive checklist that every physicist performs. Strategies vary from experiment to experiment, among fields and over time.

    Scientists must do everything they can to test a result, because in the end, it will need to stand up to the scrutiny of their peers. Fellow physicists will question the new result, subject it to their own analyses, try out alternative interpretations, and, ultimately, try to repeat the measurement in a different way. Especially if they’re dealing with dragons.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:53 pm on September 14, 2018
    Tags: CERN LHC

    From CERN: “The LHC prepares for the future” 


    From CERN

    14 Sep 2018
    Corinne Pralavorio

    View of the CERN Control Centre, where the operators control the LHC. (Image: Maximilien Brice/CERN)

    The Large Hadron Collider is stopping proton collisions for five days this week to undergo numerous tests.


    Accelerator specialists need to test the LHC when it is not in production mode, and only a few weeks remain in which they can do so. At the end of the year, CERN's accelerators will be shut down for a major two-year upgrade programme that will result in a renovated accelerator complex using more intense beams and higher energy. Scientists are conducting research to prepare for this new stage and the next, the High-Luminosity LHC.

    “We have many requests from CERN’s teams because these periods of machine development allow components to be tested in real conditions and the results of simulations to be checked,” says Jan Uythoven, the head of the machine development programme. No fewer than twenty-four tests are scheduled for what will be this year’s third testing period.

    One of the major areas of research focuses on beam stability: perturbations are systematically tracked and corrected by the LHC operators. When instabilities arise, the operators stop the beams and dump them. “To keep high-intensity beams stable, we have to improve the fine-tuning of the LHC,” Jan Uythoven adds. Extensive research is therefore being carried out to better understand these instabilities, with operators causing them deliberately in order to study how the beams behave.

    The operators are also testing new optics for the High-Luminosity LHC or, in other words, a new way of adjusting the magnets to increase the beam concentration at the collision points. Another subject of study is the heat generated by the more intense future beams, which raises the temperature in the magnets' cores towards the limit at which the superconducting state can be maintained. Lastly, tests are also being carried out on new components. In particular, innovative collimators were installed at the start of the year. Collimators are protective devices that stop particles that stray from the beam trajectory, preventing them from damaging the accelerator.

    After this five-day test period, the LHC will stop running completely for a technical stop lasting another five days, during which teams will carry out repairs and maintenance. The technical stop will be followed by five weeks of proton collisions before the next period of machine development and the lead-ion run.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries
    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS, LHCb

    OTHER PROJECTS AT CERN: AEGIS, ALPHA, AMS, ASACUSA, ATRAP, AWAKE, CAST, CLOUD, COMPASS, DIRAC, ISOLDE, LHCf, NA62, nTOF, TOTEM, UA9, ProtoDUNE

     
  • richardmitnick 9:26 pm on September 13, 2018
    Tags: Because we only looked at one-millionth of the data that's out there. Perhaps the nightmare is one we've brought upon ourselves, CERN LHC, Every 25 nanoseconds there's a chance of a collision, Has The Large Hadron Collider Accidentally Thrown Away The Evidence For New Physics?, Most of CERN's data from the LHC has been lost forever., Only 0.0001% of the total data can be saved for analysis, Out of every one million collisions that occurs at the LHC only one of them has all of its data written down and recorded., We think we're doing the smart thing by choosing to save what we're saving but we can't be sure

    From Ethan Siegel: “Has The Large Hadron Collider Accidentally Thrown Away The Evidence For New Physics?” 

    From Ethan Siegel
    Sep 13, 2018

    The Universe is out there, waiting for you to discover it.

    The ATLAS particle detector of the Large Hadron Collider (LHC) at the European Nuclear Research Center (CERN) in Geneva, Switzerland. Built inside an underground tunnel 27 km (17 miles) in circumference, CERN’s LHC is the world’s largest and most powerful particle collider and the largest single machine in the world. It can only record a tiny fraction of the data it collects. No image credit.

    Over at the Large Hadron Collider, protons simultaneously circle clockwise and counterclockwise, smashing into one another while moving at 99.9999991% the speed of light apiece. At two specific points designed to have the greatest numbers of collisions, enormous particle detectors were constructed and installed: the CMS and ATLAS detectors. After billions upon billions of collisions at these enormous energies, the LHC has brought us further in our hunt for the fundamental nature of the Universe and our understanding of the elementary building blocks of matter.

    Earlier this month, the LHC celebrated 10 years of operation, with the discovery of the Higgs boson marking its crowning achievement. Yet despite these successes, no new particles, interactions, decays, or fundamental physics has been found. Worst of all is this: most of CERN’s data from the LHC has been lost forever.


    The CMS Collaboration, whose detector is shown prior to final assembly here, has released their latest, most comprehensive results ever. There is no indication of physics beyond the Standard Model in the results. (Image: CERN/Maximilien Brice)

    This is one of the least well-understood pieces of the high-energy physics puzzle, at least among the general public. The LHC hasn’t just lost most of its data: it’s lost a whopping 99.9999% of it. That’s right: out of every one million collisions that occur at the LHC, only one has all of its data written down and recorded.

    It’s something that happened out of necessity, due to the limitations imposed by the laws of nature themselves, as well as what technology can presently do. But in making that decision, there’s a tremendous fear made all the more palpable by the fact that, other than the much-anticipated Higgs, nothing new has been discovered. The fear is this: that there is new physics waiting to be discovered, but we’ve missed it by throwing this data away.

    A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. This is an interesting event, but for every event we record, a million others get discarded. (Image: ATLAS Collaboration/CERN)

    We didn’t have a choice in the matter, really. Something had to be thrown away. The way the LHC works is by accelerating protons as close to the speed of light as possible in opposite directions and smashing them together. This is how particle accelerators have worked best for generations. According to Einstein, a particle’s energy is a combination of its rest-mass energy (which you may recognize as E = mc2) and the energy of its motion, also known as its kinetic energy. The faster you go — or more accurately, the closer you get to the speed of light — the higher energy-per-particle you can achieve.

    At the LHC, we collide protons together at 299,792,455 m/s, just 3 m/s shy of the speed of light itself. By smashing them together at such high speeds, moving in opposite directions, we make it possible for otherwise impossible particles to exist.

    The reason is this: all particles (and antiparticles) that we can create have a certain amount of energy inherent to them, in the form of their mass-at-rest. When you smash two particles together, some of that energy has to go into the individual components of those particles, both their rest energy and their kinetic energy (i.e., their energy-of-motion).

    But if you have enough energy, some of that energy can also go into the production of new particles! This is where E = mc2 gets really interesting: not only do all particles with a mass (m) have an energy (E) inherent to their existence, but if you have enough available energy, you can create new particles. At the LHC, humanity has achieved collisions with more available energy for the creation of new particles than in any other laboratory in history.

    The energy-per-particle is around 7 TeV, meaning each proton achieves approximately 7,000 times its rest-mass energy in the form of kinetic energy. But collisions are rare and protons aren’t just tiny, they’re mostly empty space. In order to get a large probability of a collision, you need to put more than one proton in at a time; you inject your protons in bunches instead.

    At full intensity, this means that there are many tiny bunches of protons going clockwise and counterclockwise inside the LHC whenever it’s running. The LHC tunnels are approximately 26 kilometers long, with only 7.5 meters (or around 25 feet) separating each bunch. As these bunches of beams go around, they get squeezed as they interact at the mid-point of each detector. Every 25 nanoseconds, there’s a chance of a collision.
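
    The numbers in the last few paragraphs can be checked with a few lines of arithmetic; this is just a sanity check of the relativistic kinematics, not anything from the LHC software, and it assumes the 6.5 TeV per-beam energy of the 2015-2018 runs.

        import math

        C = 299_792_458.0                 # speed of light, m/s
        PROTON_REST_ENERGY_GEV = 0.938272
        BEAM_ENERGY_GEV = 6500.0          # per-proton beam energy in Run 2 (6.5 TeV)

        # E = gamma * m * c^2, and v = c * sqrt(1 - 1/gamma^2)
        gamma = BEAM_ENERGY_GEV / PROTON_REST_ENERGY_GEV
        v = C * math.sqrt(1.0 - 1.0 / gamma**2)
        print(f"gamma ~ {gamma:.0f}, speed ~ {v:,.0f} m/s ({C - v:.1f} m/s below c)")

        # Bunches travel essentially at c, so a 25 ns spacing corresponds to:
        print(f"bunch separation ~ {C * 25e-9:.1f} m")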

    So what do you do? Do you have a small number of collisions and record every one? That’s a waste of energy and potential data.

    Instead, you pump in enough protons in each bunch to ensure you have a good collision every time two bunches pass through. And every time you have a collision, particles rip through the detector in all directions, triggering the complex electronics and circuitry that allow us to reconstruct what was created, when, and where in the detector. It’s like a giant explosion, and only by measuring all the pieces of shrapnel that come out can we reconstruct what happened (and what new things were created) at the point of ignition.

    CERN CMS Higgs Event

    The problem that then arises, however, is in taking all of that data and recording it. The detectors themselves are big: 22 meters for CMS and 46 meters long for ATLAS. At any given time, there are particles arising from three different collisions in CMS and six separate collisions in ATLAS. In order to record data, there are two steps that must occur:

    1. The data has to be moved into the detector’s memory, which is limited by the speed of your electronics. Even at the speed of light, we can only “remember” about 1-in-1,000 collisions.
    2. The data in memory has to be written to disk (or some other permanent device), and that’s a much slower process than storing data in memory. Only about 1-in-1,000 collisions that the memory stores can be written to disk.

    That’s why, with the necessity of taking both of these steps, only 0.0001% of the total data can be saved for analysis.
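
    As a toy illustration of that two-stage reduction (not the real trigger system, whose decisions are based on the physics content of each event rather than fixed fractions), the numbers multiply out as follows:

        CROSSINGS_PER_SECOND = 40_000_000   # one potential collision every 25 ns
        MEMORY_KEEP = 1 / 1000              # fraction surviving the fast, in-memory selection
        DISK_KEEP = 1 / 1000                # fraction of those that can be written to disk

        saved = MEMORY_KEEP * DISK_KEEP
        print(f"fraction of collisions saved: {saved:.4%}")                  # 0.0001%
        print(f"events written per second:    {CROSSINGS_PER_SECOND * saved:.0f}")

        # Writing one event takes roughly 1/40 of a second, during which about a
        # million further bunch crossings (25 ns apart) go by unrecorded:
        print(f"crossings during one write:   {(1 / 40) / 25e-9:,.0f}")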

    How do we know we’re saving the right pieces of data? The ones where it’s most likely we’re creating new particles, seeing the importance of new interactions, or observing new physics?

    When you have proton-proton collisions, most of what comes out are normal particles, in the sense that they’re made up almost exclusively of up-and-down quarks. (This means particles like protons, neutrons, and pions.) And most collisions are glancing collisions, meaning that most of the particles wind up hitting the detector in the forwards or backwards direction.

    So, to take that first step, we try and look for particle tracks of relatively high-energies that go in the transverse direction, rather than forwards or backwards. We try and put into the detector’s memory the events that we think had the most available energy (E) for creating new particles, of the highest mass (m) possible. Then, we quickly perform a computational scan of what’s in the detector’s memory to see if it’s worth writing to disk or not. If we choose to do so, that’s the only thing that detector will be writing for approximately the next 1/40th of a second or so.

    1/40th of a second might not seem like much, but it’s approximately 25,000,000 nanoseconds: enough time for about a million bunches to collide.

    The particle tracks emanating from a high energy collision at the LHC in 2014. Only 1-in-1,000,000 such collisions have been written down and saved; the majority have been lost.

    We think we’re doing the smart thing by choosing to save what we’re saving, but we can’t be sure. In 2010, the CERN Data Centre passed an enormous data milestone: 10 Petabytes of data. By the end of 2013, they had passed 100 Petabytes of data; in 2017, they passed the 200 Petabyte milestone. Yet for all of it, we know that we’ve thrown away — or failed to record — about 1,000,000 times that amount. We may have collected hundreds of Petabytes, but we’ve discarded, and lost forever, hundreds of Zettabytes: more than the total amount of internet data created in a year.

    The total amount of data that’s been collected by the LHC far outstrips the total amount of data sent-and-received over the internet over the last 10 years. But only 0.0001% of that data has been written down and saved; the rest is gone for good. No image credit.

    It’s eminently possible that the LHC created new particles, saw evidence of new interactions, and observed and recorded all the signs of new physics. And it’s also possible, due to our ignorance of what we were looking for, we’ve thrown it all away, and will continue to do so. The nightmare scenario — of no new physics beyond the Standard Model — appears to be coming true. But the real nightmare is the very real possibility that the new physics is there, we’ve built the perfect machine to find it, we’ve found it, and we’ll never realize it because of the decisions and assumptions we’ve made. The real nightmare is that we’ve fooled ourselves into believing the Standard Model is right, because we only looked at one-millionth of the data that’s out there. Perhaps the nightmare is one we’ve brought upon ourselves.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    See the full article here.

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 7:42 pm on September 11, 2018
    Tags: A cool fact about this accelerator? It’s 60 years old and includes a Van de Graaff accelerator that was repurposed for use with CASPAR, “We are made of stardust.” Carl Sagan, CERN LHC, LUNA at Gran Sasso, Stellar burning and evolutionary phases in stars

    From Sanford Underground Research Facility: “CASPAR” 


    From Sanford Underground Research Facility

    Constance Walter
    Photos by Matt Kapust

    CASPAR is a low-energy particle accelerator that allows researchers to study processes that take place inside collapsing stars.

    “We are made of stardust.”

    While that statement may sound like a song title from the 1960s, it was actually made by astrophysicist and science fiction author Carl Sagan.

    Carl Sagan NASA/JPL

    And he was right. The nuclear burning inside collapsing stars produces the elements that make up and sustain life on Earth: carbon, nitrogen, iron and calcium, to name a few. Even lead, gold and the rock beneath our feet come from stars.

    The Compact Accelerator System for Performing Astrophysical Research (CASPAR) collaboration uses a low-energy accelerator to better understand how elements are produced in the Universe and at what rate and how much energy is produced during the process.

    “Unlike other underground experiments, we look at many different interactions and are not focused on discovering just one event,” said Dan Robertson, research associate professor at the University of Notre Dame. “All of these details give us a better understanding of the life of a star and what material is kicked out into the Universe during explosive stellar events.”
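
    As a concrete illustration of the "how much energy" part, here is a minimal sketch that computes the energy released (the Q-value) by one well-known helium-burning reaction, 12C + 4He -> 16O. The choice of reaction and the atomic-mass values are illustrative, not taken from the article:

        # Q-value of the 12C(alpha, gamma)16O reaction from atomic masses (illustrative example).
        U_TO_MEV = 931.494          # energy equivalent of one atomic mass unit, in MeV

        masses_u = {                # atomic masses in unified atomic mass units (approximate)
            "He-4": 4.002602,
            "C-12": 12.000000,
            "O-16": 15.994915,
        }

        # Energy released = (mass of reactants - mass of products) * c^2
        q_value_mev = (masses_u["C-12"] + masses_u["He-4"] - masses_u["O-16"]) * U_TO_MEV
        print(f"Q(12C + 4He -> 16O + gamma) ~ {q_value_mev:.2f} MeV")   # roughly 7.2 MeV per reaction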

    2

    Studying the stars from underground

    Although it may seem counter-intuitive, going nearly a mile underground at Sanford Lab gives the CASPAR team a perfect place to study those stellar environments. CASPAR is one of just two underground accelerators in the world studying stellar environments. The other is the Laboratory for Underground Nuclear Astrophysics (LUNA), which is located at Gran Sasso National Laboratory in Italy and has been in existence for 25 years. Frank Strieder, principal investigator for the project and an associate professor of physics at South Dakota School of Mines & Technology (SD Mines), worked on that experiment for 22 years.

    LUNA-Laboratory for Underground Nuclear Astrophysics, which is located at Gran Sasso National Laboratory in Italy

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    “Both experiments are studying stellar burning and evolutionary phases in stars, but the work is different,” Strieder said. “And with our accelerator, we can cover a larger energy range than previous underground experiments.”

    3
    The accelerator

    The most famous particle accelerator in the world is the 17-mile-long Large Hadron Collider, located in Switzerland and France, which accelerates protons to energies of up to 7 trillion electronvolts as it hurls them toward each other at nearly the speed of light.
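
    For a sense of what "nearly the speed of light" means at those energies, here is a minimal relativistic estimate, assuming the 7-TeV design beam energy quoted above:

        import math

        # How close to the speed of light is a 7 TeV proton? (rough estimate)
        E_TOTAL_GEV = 7000.0        # design beam energy, in GeV
        M_PROTON_GEV = 0.938272     # proton rest-mass energy, in GeV

        gamma = E_TOTAL_GEV / M_PROTON_GEV        # Lorentz factor
        beta = math.sqrt(1.0 - 1.0 / gamma**2)    # speed as a fraction of c

        print(f"Lorentz factor: ~{gamma:,.0f}")
        print(f"Speed: {beta:.9f} c (about {(1 - beta) * 3e8:.0f} m/s short of light speed)")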

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    CASPAR, on the other hand, is a 50-foot-long system that includes a Van de Graaff accelerator, which uses radio-frequency energy to generate up to 1.1 million volts and accelerate a beam of protons or alpha particles toward a target.

    Robertson compares the accelerator to a tabletop version of the Van de Graaff used in high school or at a science museum—touch the polished metal dome and your hair stands on end. “Think of the accelerator as generating and storing a large voltage which then repels ionized particles (which we create at its heart) away from it.”
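
    Robertson's picture, a stored voltage that pushes ionized particles away, translates directly into beam energy: the kinetic energy gained equals the particle's charge times the accelerating voltage. A minimal sketch using the 1.1-million-volt figure quoted above (the choice of a singly charged proton beam is an assumption for illustration):

        import math

        # Kinetic energy and speed of a proton accelerated through 1.1 million volts.
        E_CHARGE = 1.602e-19        # elementary charge, in coulombs
        M_PROTON = 1.673e-27        # proton mass, in kg
        C_LIGHT = 2.998e8           # speed of light, in m/s

        voltage = 1.1e6             # accelerating potential quoted above, in volts
        charge_state = 1            # assume a singly charged (H+) beam for illustration

        energy_joules = charge_state * E_CHARGE * voltage     # E = qV
        energy_mev = energy_joules / 1.602e-13                 # convert joules to MeV
        speed = math.sqrt(2 * energy_joules / M_PROTON)        # classical approximation, fine at ~1 MeV

        print(f"Beam energy: {energy_mev:.2f} MeV")
        print(f"Proton speed: {speed:.2e} m/s (~{speed / C_LIGHT:.1%} of c)")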

    A cool fact about this accelerator? It’s 60 years old and was repurposed for use with CASPAR.

    4
    The target

    Every reaction the CASPAR team investigates requires two elements to interact—a projectile and a target. The target material varies according to the interaction they want to study and could include anything from nitrogen and carbon up to magnesium. The target elements are usually deposited on a heavier backing material for stability and are kept extremely cold.

    The team bombards the target with either a proton beam or alpha beam generated in the accelerator. The power the beam dissipates in the target is up to 100 watts, “which is the same power as a good light bulb,” Strieder said.
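
    Taken together, roughly 100 watts of beam power at roughly 1.1 million volts pins down the beam current. A minimal sketch of that arithmetic (treating the beam as singly charged particles is an assumption for illustration):

        # Beam current implied by ~100 W of beam power at ~1.1 MV, using P = I * V.
        E_CHARGE = 1.602e-19            # elementary charge, in coulombs

        beam_power_w = 100.0            # beam power dissipated in the target, from the article
        voltage_v = 1.1e6               # accelerating potential, from the article

        current_a = beam_power_w / voltage_v          # I = P / V
        particles_per_s = current_a / E_CHARGE        # assumes singly charged ions

        print(f"Beam current: ~{current_a * 1e6:.0f} microamps")
        print(f"Particles on target: ~{particles_per_s:.1e} per second")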

    What’s LIGO got to do with it?

    In late 2017, the Laser Interferometer Gravitational-Wave Observatory (LIGO) recorded a violent collision of two neutron stars—this came on top of several earlier observations of black hole mergers that emitted gravitational waves. Observations made after the collision reinforce the need for measurements like those CASPAR hopes to take, explained Strieder.


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger

    ESA/eLISA the future of gravitational wave research

    1
    Sky map showing how adding Virgo to LIGO helps reduce the size of the likely source region on the sky. (Credit: Giuseppe Greco, Virgo Urbino group)
    See also https://sciencesprings.wordpress.com/2017/10/16/from-ucsc-a-uc-santa-cruz-special-report-neutron-stars-gravitational-waves-and-all-the-gold-in-the-universe/

    “The basic point is that from the information we learned from this cataclysmic event, we can calculate the amount of heavy element material produced,” Strieder said. “And then compare it with the heavy elements found in our planetary system.”

    1,100,000
    volts generated by CASPAR

    7,700,000,000
    volts generated by the LHC

    5
    Collecting data

    CASPAR achieved first beam in July 2017 and began full operations earlier this year. The accelerator runs for several days at a time, collecting data using a germanium detector.

    “We are recording the number of reactions that occur per time period, and in what conditions,” Robertson said. “For example, what energy did the interacting particle have prior to striking the target? The measurement of radiation and particles emitted during the interaction helps us backtrack what happened in the target material and at what rate. This can then be extrapolated to events in a star and scaled up for the star’s massive size.”
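
    The bookkeeping Robertson describes, counts recorded under known beam and target conditions, reduces to a simple yield-and-cross-section estimate. Here is a minimal, hypothetical sketch; every numerical input is a placeholder, not CASPAR data:

        # Toy estimate of a reaction cross section from detector counts (all inputs are placeholders).
        E_CHARGE = 1.602e-19                 # elementary charge, in coulombs

        counts = 1200                        # gamma rays recorded by the detector (hypothetical)
        live_time_s = 3600.0                 # measurement time in seconds (hypothetical)
        detector_efficiency = 0.05           # fraction of emitted gammas actually detected (hypothetical)
        beam_current_a = 50e-6               # proton beam current in amps (hypothetical)
        target_atoms_per_cm2 = 1e18          # areal density of target nuclei (hypothetical)

        reactions = counts / detector_efficiency                       # correct for detection efficiency
        beam_particles = (beam_current_a / E_CHARGE) * live_time_s     # protons delivered during the run
        yield_per_proton = reactions / beam_particles

        cross_section_cm2 = yield_per_proton / target_atoms_per_cm2
        cross_section_barn = cross_section_cm2 / 1e-24                 # 1 barn = 1e-24 cm^2

        print(f"Reaction rate: {reactions / live_time_s:.1f} per second")
        print(f"Cross section: ~{cross_section_barn:.1e} barn")

    A real analysis also folds in beam-energy loss in the target, backgrounds and angular effects, but the basic structure of the measurement is the same.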

    6
    A lofty goal

    The end goal for the field of nuclear astrophysics is to complete the puzzle of how everything in the Universe is made, and of the locations and processes that govern that production. The experiments studying stellar processes are each looking at individual puzzle pieces without knowing what the complete picture is.

    “Only as we understand how these pieces fit can we begin to put the whole puzzle together,” Robertson said. “CASPAR’s unique location deep underground means it is able to more clearly investigate the images previously obscured by cosmic interference.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE
    LBNE

     
  • richardmitnick 12:08 pm on September 9, 2018 Permalink | Reply
    Tags: , , , CERN LHC, First successful test of a proton-driven plasma wakefield accelerator, , ILC-International Linear Collider, , ,   

    From Sanford Underground Research Facility via SingularityHub: “This Breakthrough New Particle Accelerator Is Small But Mighty” 

    SURF logo
    Sanford Underground levels

    From Sanford Underground Research Facility

    via

    SingularityHub

    Sep 04, 2018
    Edd Gent

    CERN AWAKE

    Particle accelerators have become crucial tools for understanding the fundamental nature of our universe, but they are incredibly big and expensive. That could change, though, after scientists validated a new approach that could usher in a generation of smaller, more powerful accelerators.

    The discovery of the Higgs boson in 2012 was a scientific triumph that helped validate decades of theoretical research. But finding it required us to build the 17-mile-long Large Hadron Collider (LHC) beneath Switzerland and France, which cost about $13.25 billion.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    Now scientists at CERN, the organization that runs the LHC, have published results in Nature from the first successful test of a proton-driven plasma wakefield accelerator. The machine is the first working demonstration of an idea dreamt up only in 2009, one that could achieve considerably higher energies over shorter distances than older approaches.

    The idea of using wakefields to accelerate particles has been around since the 1970s. By firing a high-energy beam into a plasma—the fourth state of matter, essentially a gas whose electrons have come loose from their atoms or molecules—it’s possible to get its soup of electrons to oscillate.

    This creates something akin to the wake formed as a ship passes through water, and by shooting another beam of electrons into the plasma at a specific angle, you can get the electrons to effectively ride this plasma wave, accelerating them to much higher speeds.
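
    The size of that wake is set by how dense the plasma is: the denser the electron soup, the faster it oscillates and the shorter the wave. A minimal sketch of the standard plasma-frequency formula, using an assumed electron density of about 7 x 10^14 per cubic centimeter (a ballpark AWAKE-like value, not a figure from this article):

        import math

        # Plasma frequency and wake wavelength for an assumed electron density (illustrative).
        E_CHARGE = 1.602e-19        # elementary charge, C
        M_ELECTRON = 9.109e-31      # electron mass, kg
        EPSILON_0 = 8.854e-12       # vacuum permittivity, F/m
        C_LIGHT = 2.998e8           # speed of light, m/s

        n_e = 7e14 * 1e6            # electrons per m^3 (assumed ~7e14 per cm^3)

        omega_p = math.sqrt(n_e * E_CHARGE**2 / (EPSILON_0 * M_ELECTRON))   # plasma angular frequency
        wake_wavelength = 2 * math.pi * C_LIGHT / omega_p                   # spacing between wake crests

        print(f"Plasma frequency: {omega_p / (2 * math.pi):.1e} Hz")
        print(f"Wake wavelength:  {wake_wavelength * 1e3:.1f} mm")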

    Previous approaches have relied on lasers or electron beams to create these wakefields, but their energy dissipates quickly, so they can only accelerate particles over short distances. That means reaching higher energies would likely require multiple stages [Nature]. Protons, on the other hand, are easy to accelerate and can maintain high energies over very long distances, so a wakefield accelerator driven by them is able to accelerate particles to much higher speeds in a single stage.

    In its first demonstration, the AWAKE experiment boosted electrons to 2 GeV, or 2 billion electronvolts (a measure of energy also commonly used as a unit of momentum in particle physics) over 10 meters. In theory, the same approach could achieve 1 TeV (1,000 GeV) if scaled up to 1 kilometer long (0.6 miles).
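
    Those two figures imply very different average accelerating gradients, which is the number that really matters for accelerator size. A quick check of the arithmetic (the 1-TeV-over-1-kilometer figure is the article's projection, so the gradient it implies is likewise a projection, not a measured value):

        # Average accelerating gradients implied by the figures quoted above.
        demo_energy_gev = 2.0           # energy gain in the first AWAKE demonstration
        demo_length_m = 10.0            # achieved over 10 meters of plasma

        projected_energy_gev = 1000.0   # 1 TeV, the scaled-up projection
        projected_length_m = 1000.0     # over roughly 1 kilometer

        demo_gradient_mev_per_m = 1e3 * demo_energy_gev / demo_length_m
        projected_gradient_mev_per_m = 1e3 * projected_energy_gev / projected_length_m

        print(f"Demonstrated gradient: {demo_gradient_mev_per_m:.0f} MeV/m")
        print(f"Projected gradient:    {projected_gradient_mev_per_m:.0f} MeV/m")

    Note that the kilometer-scale projection assumes roughly a fivefold improvement over the average gradient demonstrated so far.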

    CERN AWAKE schematic


    CERN AWAKE

    That pales in comparison with the energies reached by the LHC, which smashes protons together at peak energies of 13 TeV. But proton collisions are messy, because protons are made up of lots of smaller fundamental particles, so analyzing the results is a time-consuming and tricky task.

    That’s why most designs for future accelerators plan to use lighter particles like electrons, which will create cleaner collisions [PhysicsWorld]. Current theories also consider electrons to be fundamental particles (i.e., they don’t break into smaller parts), but smashing them into other particles at higher speeds may prove that wrong [New Scientist].

    These particles lose energy far more quickly than protons do in circular accelerators like the LHC, so most proposals are for linear accelerators. That means that, unlike in the LHC, where particles are boosted repeatedly as they circulate around the ring, all the acceleration has to be done in a single pass. The proposed International Linear Collider (ILC) is expected to cost $7 billion [Science] and will require a 20- to 40-kilometer-long tunnel to reach 0.25 TeV.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    That’s because, like the LHC, it will rely on radio frequency cavities, which bounce high-intensity radio waves around inside a metallic chamber to create an electric field that accelerates the particles. Reaching higher energies requires many such RF cavities and therefore long and costly tunnels. That makes the promise of reaching TeVs over just a few kilometers with proton-driven wakefield accelerators very promising.
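
    The length-versus-gradient trade-off can be made explicit with a rough estimate of the active accelerating length a linear machine needs. The ~30 MV/m figure below is an assumed, typical value for superconducting RF cavities, not a number from the article, and real tunnels are longer still because of focusing magnets, two opposing linacs and other infrastructure:

        # Rough active accelerating length needed to reach a given beam energy.
        def active_length_km(beam_energy_gev, gradient_mv_per_m):
            """Length (km) to reach beam_energy_gev at a constant accelerating gradient."""
            return beam_energy_gev * 1e3 / gradient_mv_per_m / 1e3   # GeV -> MV, then m -> km

        beam_energy_gev = 125.0        # per-beam energy for 0.25 TeV collisions
        rf_gradient = 30.0             # MV/m, assumed typical superconducting RF value
        wakefield_gradient = 1000.0    # MV/m, the proton-driven wakefield projection discussed above

        print(f"RF cavities:      ~{active_length_km(beam_energy_gev, rf_gradient):.1f} km per beam")
        print(f"Plasma wakefield: ~{active_length_km(beam_energy_gev, wakefield_gradient):.2f} km per beam")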

    But it’s probably a bit early to rip up the ILC’s designs quite yet. Building devices that can generate useful experimental results will require substantial improvements in the beam quality, which is currently somewhat lacking. The current approach also requires a powerful proton source—in this case, CERN’s Super Proton Synchrotron—so it’s more complicated than just building the accelerator.

    The Super Proton Synchrotron (SPS), CERN’s second-largest accelerator.

    Nonetheless, AWAKE deputy spokesperson Matthew Wing told Science that the collaboration could be doing practical experiments within five years, and that within 20 years the technology could be used to convert the LHC into an electron-proton collider at roughly a tenth of the cost of a more conventional radio-frequency cavity design.

    5
    Last year, physicists working on the Advanced Wakefield collaboration at CERN added an electron source and beamline (pictured) to their plasma wakefield accelerator. Maximilien Brice, Julien Ordan/CERN

    That could make it possible to determine whether electrons truly are fundamental particles, potentially opening up entirely new frontiers in physics and rewriting our understanding of the universe.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE
    LBNE

     
  • richardmitnick 3:00 pm on September 4, 2018 Permalink | Reply
    Tags: , , CERN LHC, , , , , , , ,   

    From University at Buffalo: “UB physicists awarded $1.45 million to study inner workings of the universe” 

    U Buffalo bloc.

    From University at Buffalo

    September 4, 2018
    Charlotte Hsu

    1
    Photo illustration: Left to right: University at Buffalo physicists Avto Kharchilava, Ia Iashvili and Salvatore Rappoccio. Credit: Douglas Levere / University at Buffalo / CERN

    Funding comes as the field marks its latest big discovery — the observation of the Higgs boson’s most common mode of decay.

    University at Buffalo scientists have received $1.45 million from the National Science Foundation (NSF) for research in high-energy physics, a field that uses particle accelerators to smash beams of protons into one another at near-light speeds, generating data that illuminates the fundamental laws of nature.

    The grant was awarded to Salvatore Rappoccio, PhD, associate professor of physics in the UB College of Arts and Sciences, and UB physics professors Ia Iashvili, PhD, and Avto Kharchilava, PhD.

    The funding began Sept. 1, just days after the latest big discovery in high-energy physics.

    On Aug. 28, an international team of thousands of researchers — including Iashvili, Kharchilava and Rappoccio — announced that they had observed the Higgs boson, a subatomic particle, decaying into a pair of lighter particles called a bottom quark and antibottom quark.

    The sighting took place at the world’s most powerful particle accelerator, the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN).

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    The finding deepens our understanding of why objects have mass. It also validates the Standard Model, a set of equations that physicists use to describe elementary particles and the way they behave (in essence, the way the universe works).

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    For Kharchilava, the discovery was over a decade in the making. He and his UB students had been searching for evidence of the Higgs boson transforming into bottom quarks since around 2005.

    “I was looking for this decay for almost 15 years, when we began the search at Fermilab, which operated the Tevatron collider,” he says. “We did not succeed back then because we did not have enough data and precision, so now we have more data and better precision and we have finally made the discovery.”


    FNAL/Tevatron map



    FNAL/Tevatron CDF detector


    FNAL/Tevatron DZero detector

    The new NSF funding will enable the UB scientists to continue their work on the Higgs boson, the Standard Model and the hunt for new phenomena in physics.


    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Buffalo Campus

    UB is a premier, research-intensive public university and a member of the Association of American Universities. As the largest, most comprehensive institution in the 64-campus State University of New York system, our research, creative activity and people positively impact the world.

     
  • richardmitnick 12:47 pm on September 4, 2018 Permalink | Reply
    Tags: , , , CERN LHC, CISE-NSF's Office of Advanced Cyberinfrastructure in the Directorate for Computer and Information Science and Engineering, , IRIS-HEP-Institute for Research and Innovation in Software for High-Energy Physics, Molecular Sciences Software Institute and the Science Gateways Community Institute, MPS-NSF Division of Physics in the Directorate for Mathematical and Physical Sciences, , SCAILFIN-Scalable Cyberinfrastructure for Artificial Intelligence and Likelihood-Free Inference   

    From University of Illinois Physics: “University of Illinois part of $25 million software institute to enable discoveries in high-energy physics” 

    U Illinois bloc

    From University of Illinois Physics

    U Illinois Physics bloc

    9/4/2018
    Siv Schwink

    1
    A data visualization from a simulation of a collision between two protons, of the kind that will occur at the High-Luminosity Large Hadron Collider (HL-LHC). On average, up to 200 collisions will be visible in the collider’s detectors at the same time. Shown here is a design for the Inner Tracker of the ATLAS detector, one of the hardware upgrades planned for the HL-LHC. Image courtesy of the ATLAS Experiment © 2018 CERN

    CERN/ATLAS detector

    Today, the National Science Foundation (NSF) announced its launch of the Institute for Research and Innovation in Software for High-Energy Physics (IRIS-HEP).

    The $25 million software-focused institute will tackle the unprecedented torrent of data that will come from the high-luminosity running of the Large Hadron Collider (LHC), the world’s most powerful particle accelerator located at CERN near Geneva, Switzerland.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    The High-Luminosity LHC (HL-LHC) will provide scientists with a unique window into the subatomic world to search for new phenomena and to study the properties of the Higgs boson in great detail.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    The 2012 discovery at the LHC of the Higgs boson—a particle central to our fundamental theory of nature—led to the Nobel Prize in physics a year later and has provided scientists with a new tool for further discovery.

    The HL-LHC will begin operations around 2026, continuing into the 2030s. It will produce more than 1 billion particle collisions every second, from which only a tiny fraction will reveal new science, because the phenomena that physicists want to study have a very low probability per collision of occurring. The HL-LHC’s tenfold increase in luminosity—a measure of the number of particle collisions occurring in a given amount of time—will enable physicists to study familiar processes at an unprecedented level of detail and observe rare new phenomena present in nature.

    But the increased luminosity also leads to more complex collision data. The tenfold increase in required data processing and storage cannot be met without new software tools for intelligent data filtering, which record only the most interesting collision events and enable scientists to analyze the data more efficiently.
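
    To see why intelligent filtering is unavoidable, here is a rough sketch of the raw versus recorded data rates. The 1-billion-collisions-per-second figure is from the article; the per-event size and the recorded event rate are assumed, order-of-magnitude values:

        # Why the HL-LHC needs aggressive online filtering: raw vs. recorded data rates (ballpark).
        collisions_per_s = 1e9          # "more than 1 billion particle collisions every second"
        event_size_bytes = 1e6          # ~1 MB per event (assumed, order of magnitude)
        recorded_events_per_s = 1e4     # ~10,000 events per second written to storage (assumed)

        raw_rate = collisions_per_s * event_size_bytes            # bytes per second if everything were kept
        recorded_rate = recorded_events_per_s * event_size_bytes  # bytes per second after filtering

        print(f"Raw rate if nothing were filtered: {raw_rate / 1e15:.0f} PB/s")
        print(f"Recorded rate after filtering:     {recorded_rate / 1e9:.0f} GB/s")
        print(f"Events rejected for every one kept: {collisions_per_s / recorded_events_per_s:.0f}")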

    Over the next five years, IRIS-HEP will focus on developing innovative software for use in particle physics research with the HL-LHC as the key science driver. It will also create opportunities for training and education in related areas of computational and data science and outreach to the general public. The institute will also work to increase participation from women and minorities who are underrepresented in high-energy physics research.

    IRIS-HEP brings together multidisciplinary teams of researchers and educators from 17 universities. Among them is Mark Neubauer, a professor of physics at the University of Illinois at Urbana-Champaign and a faculty affiliate with the National Center for Supercomputing Applications (NCSA) in Urbana.

    2

    Neubauer is a member of the ATLAS Experiment, which generates and analyzes data from particle collisions at the LHC. Neubauer will serve on the IRIS-HEP Executive Committee and coordinate the institute’s activities to develop and evolve the strategic vision of the institute.

    Neubauer, along with colleagues Peter Elmer (Princeton) and Michael Sokoloff (Cincinnati), led a community-wide, NSF-funded effort to conceptualize the institute and was a key member of the group that developed the IRIS-HEP proposal. That conceptualization process, which involved 18 workshops over the last two years, brought together key national and international partners from the high-energy physics, computer science, industry and data-science communities to produce more than eight community position papers, most notably a strategic plan for the institute and a roadmap for HEP software and computing R&D over the next decade. The participants reviewed two decades of approaches to LHC data processing and analysis and developed strategies to address the challenges and opportunities that lie ahead. IRIS-HEP emerged from that effort.

    “IRIS-HEP will serve as a new intellectual hub of software development for the international high-energy physics community,” comments Neubauer. “The founding of this Institute will do much more than fund software development to support the HL-LHC science; it will provide fertile ground for new ideas and innovation, empower early-career researchers interested in software and computing aspects of data-enabled science through mentoring and training to support their professional development, and will redefine the traditional boundaries of the high-energy physics community.”

    Neubauer will receive NSF funding through IRIS-HEP to contribute to the institute’s efforts in software research and innovation. He plans to collaborate with Daniel S. Katz, NCSA’s assistant director for scientific software and applications, to put together a team to research new approaches and systems for data analysis and innovative algorithms that apply machine learning and other approaches to accelerate computation on modern computing architectures.

    In related research also beginning in the current Fall semester, Neubauer and Katz through a separate NSF award with Kyle Cranmer (NYU), Heiko Mueller (NYU) and Michael Hildreth (Notre Dame) will be collaborating on the Scalable Cyberinfrastructure for Artificial Intelligence and Likelihood-Free Inference (SCAILFIN) Project. SCAILFIN aims to maximize the potential of artificial intelligence and machine learning to improve new physics searches at the LHC, while addressing current issues in software and data sustainability by making data analyses more reusable and reproducible.

    Katz says he is looking forward to delving into these projects: “How to build tools that make more sense of the data, how to make the software more sustainable so there is less rewriting, how to write software that is portable across different systems and compatible with future hardware changes—these are tremendous challenges. And these questions really are timely. They fit into the greater dialogue that is ongoing in both the computer science and the information science communities. I’m excited for this opportunity to meld the most recent work from these complementary fields together with work in physics.”

    Neubauer concludes, “The quest to understand the fundamental building blocks of nature and their interactions is one of the oldest and most ambitious of human scientific endeavors. The HL-LHC will represent a big step forward in this quest and is a top priority for the US particle physics community. As is common in frontier-science experiments pushing at the boundaries of knowledge, it comes with daunting challenges. The LHC experiments are making large investments to upgrade their detectors to be able to operate in the challenging HL-LHC environment.

    “A significant investment in R&D for software used to acquire, manage, process and analyze the huge volume of data that will be generated during the HL-LHC era will be critical to maximize the scientific return on investment in the accelerator and detectors. This is not a problem that could be solved by gains from hardware technology evolution or computing resources alone. The institute will support early-career scientists to develop innovative software over the next five to ten years, to get us where we need to be to do our science during the HL-LHC era. I am elated to see such a large investment by the NSF in this area for high-energy physics.”

    IRIS-HEP is co-funded by NSF’s Office of Advanced Cyberinfrastructure (OAC) in the Directorate for Computer and Information Science and Engineering (CISE) and by the NSF Division of Physics in the Directorate for Mathematical and Physical Sciences (MPS). IRIS-HEP is the latest NSF contribution to the 40-nation LHC effort. It is the third OAC software institute, following the Molecular Sciences Software Institute and the Science Gateways Community Institute.

    See the full University of Illinois article on this subject here.
    See the full Cornell University article on the subject here.
    See the full Princeton University article on this subject here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Illinois campus

    The University of Illinois at Urbana-Champaign community of students, scholars, and alumni is changing the world.

    With our land-grant heritage as a foundation, we pioneer innovative research that tackles global problems and expands the human experience. Our transformative learning experiences, in and out of the classroom, are designed to produce alumni who desire to make a significant, societal impact.

     