Tagged: Symmetry Magazine

  • richardmitnick 2:28 pm on January 14, 2017
    Tags: Symmetry Magazine, Twinkles

    From Symmetry: “Twinkle, twinkle, little supernova” 


    01/12/17
    Ricarda Laasch

    [Image credit: Phil Marshall, SLAC]

    Using Twinkles, the new simulation of images of our night sky, scientists get ready for a gigantic cosmological survey unlike any before.

    Almost every worthwhile performance is preceded by a rehearsal, and scientific performances are no exception. Engineers test a car’s airbag deployment using crash test dummies before incorporating them into the newest model. Space scientists fire a rocket booster in a test environment before attaching it to a spacecraft in flight.

    One of the newest “training grounds” for astrophysicists is called Twinkles. The Twinkles dataset, which has not yet been released, consists of thousands of simulated, highly realistic images of the night sky, full of supernovae and quasars. The simulated-image database will help scientists rehearse a future giant cosmological survey called LSST.

    [Images: LSST camera, built at SLAC; LSST telescope, currently under construction at Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes]

    LSST, short for the Large Synoptic Survey Telescope, is under construction in Chile and will conduct a 10-year survey of our universe, covering the entire southern sky once a year. Scientists will use LSST images to explore our galaxy, to learn more about supernovae and to shine a light on the mysterious dark energy that is responsible for the accelerating expansion of our universe.

    It’s a tall order, and it needs a well-prepared team. Scientists designed LSST using simulations and predictions for its scientific capabilities. But Twinkles’ thousands of images will give them an even better chance to see how accurately their LSST analysis tools can measure the changing brightness of supernovae and quasars. That’s the advantage of using simulated data. Scientists don’t know about all the objects in the sky above our heads, but they do know their simulated sky—there, they already know the answers. If the analysis tools make a calculation error, they’ll see it.
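
    As a rough illustration of that truth-matching idea, here is a minimal sketch (Python; the numbers and names are invented for illustration and are not Twinkles code): compare the brightnesses a pipeline reports against the values injected into the simulation and look for a systematic offset.

    ```python
    # Toy truth-matching check on a simulated catalog.
    import numpy as np

    rng = np.random.default_rng(42)

    true_mags = rng.uniform(20.0, 24.0, size=1000)                # injected "truth"
    measured_mags = true_mags + rng.normal(0.0, 0.02, size=1000)  # toy pipeline output

    residuals = measured_mags - true_mags
    print(f"mean residual: {residuals.mean():+.4f} mag")
    print(f"scatter:       {residuals.std():.4f} mag")
    # A mean residual far from zero would flag a bias in the analysis
    # tools, detectable precisely because the truth is known here.
    ```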

    The findings will be a critical addition to LSST’s measurements of certain cosmological parameters, where a small deviation can have a huge impact on the outcome.

    “We want to understand the whole path of the light: From other galaxies through space to our solar system and our planet, then through our atmosphere to the telescope – and from there through our data-taking system and image processing,” says Phil Marshall, a scientist at the US Department of Energy’s SLAC National Accelerator Laboratory who leads the Twinkles project. “Twinkles is our way to go all the way back and study the whole picture instead of one single aspect.”

    Scientists simulate the images as realistically as possible to figure out if some systematic errors add up or intertwine with each other. If they do, it could create unforeseen problems, and scientists of course want to deal with them before LSST starts.

    Twinkles also lets scientists practice sorting out a different kind of problem: how a large collaboration spread across the whole globe can perform numerous scientific searches simultaneously on the same massive amounts of data.

    Richard Dubois, senior scientist at SLAC and co-leader of the software infrastructure team, works with his team of computing experts to develop methods for handling the data coherently across the whole collaboration and to advise scientists on specific tools that will make their lives easier.

    “Chaos is a real danger; so we need to keep it in check,” Dubois says. “So with Twinkles, we test software solutions and databases that help us to keep our heads above water.”

    The first test analysis using Twinkles images will start toward the end of the year. During the first go, scientists will extract type Ia supernovae and quasars and learn how to interpret the automated LSST measurements.

    “We hid both types of objects in the Twinkles data,” Marshall says. “Now we can see whether they look the way they’re supposed to.”

    LSST will start up in 2022, and the first LSST data will be released at the end of 2023.

    “High accuracy cosmology will be hard,” Marshall says. “So we want to be ready to start learning more about our universe right away!”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:31 am on January 11, 2017
    Tags: How heavy is a neutrino?, Symmetry Magazine

    From Symmetry: “How heavy is a neutrino?” 


    01/10/17
    Kathryn Jepsen


    The question is more complicated than it seems.

    Neutrinos are elementary particles first discovered six decades ago.

    Over the years, scientists have learned several surprising things about them. But they have yet to answer what might sound like a basic question: How much do neutrinos weigh? The answer could be key to understanding the nature of the strange particles and of our universe.

    To understand why figuring out the mass of neutrinos is such a challenge, first you must understand that there’s more than one way to picture a neutrino.

    Neutrinos come in three flavors: electron, muon and tau. When a neutrino hits a neutrino detector, a muon, electron or tau particle is produced. When you catch a neutrino accompanied by an electron, you call it an electron neutrino, and so on.

    Knowing this, you might be forgiven for thinking that there are three types of neutrinos: electron neutrinos, muon neutrinos and tau neutrinos. But that’s not quite right.

    That’s because every neutrino is actually a quantum superposition of all three flavors. Depending on the energy of a neutrino and where you catch it on its journey, it has a different likelihood of appearing as electron-flavored, muon-flavored or tau-flavored.
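
    To get a feel for how that likelihood changes with distance and energy, here is a toy two-flavor version of the standard oscillation formula (Python; a simplification of the full three-flavor mixing described here, with illustrative parameter values):

    ```python
    # Toy two-flavor oscillation: probability that a neutrino produced as
    # muon-flavored is still muon-flavored after traveling L km at E GeV.
    #   P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
    import numpy as np

    def survival_prob(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=1.0):
        return 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

    # Illustrative, roughly NOvA-like baseline and beam energy:
    print(survival_prob(L_km=810, E_GeV=2.0))  # far below 1: most have changed flavor
    ```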

    Armed with this additional insight, you might be forgiven for thinking that, when all is said and done, there is actually just one type of neutrino. But that’s even less right.

    Scientists count three types of neutrino after all. Each one has a different mass and is a different mixture of the three neutrino flavors. These neutrino types are called the three neutrino mass states.

    [Illustration: Sandbox Studio, Chicago with Corinne Mucha]

    A weighty problem

    We know that the masses of these three types of neutrinos are small. We know that the flavor mixture of the first neutrino mass state is heavy on electron flavor. We know that the second is more of an even blend of electron, muon and tau. And we know that the third is mostly muon and tau.

    We know that the masses of the first two neutrinos are close together and that the third is the odd one out. What we don’t know is whether the third one is lighter or heavier than the others.

    The question of whether this third mass state is the heaviest or the lightest mass state is called the neutrino mass hierarchy (or neutrino mass ordering) problem.


    Easy as 1,2,3—or 3,1,2?

    Some models that unify the different forces in the Standard Model of particle physics predict that the neutrino mass ordering will follow the pattern 1, 2, 3—what they call a normal hierarchy. Other models predict that the mass ordering will follow the pattern 3, 1, 2—an inverted hierarchy. Knowing whether the hierarchy is normal or inverted can help theorists answer other questions.

    For example, four forces—the strong, weak, electromagnetic and gravitational forces—govern the interactions of the smallest building blocks of matter. Some theorists think that, in the early universe, these four forces were united into a single force. Most theories about the unification of forces predict a normal neutrino mass hierarchy.

    Scientists’ current best tools for figuring out the neutrino mass hierarchy are long-baseline neutrino experiments, most notably one called NOvA.

    [Images: FNAL/NOvA experiment; NOvA map; FNAL NOvA Near Detector]

    Electron drag

    The NOvA detector, located in northern Minnesota near the Canadian border, studies a beam of neutrinos that originates at Fermi National Accelerator Laboratory in Illinois.

    Neutrinos very rarely interact with other matter. That means they can travel 500 miles straight through the Earth from the source to the detector. In fact, it’s important that they do so, because as they travel, they pass through trillions of electrons.

    This affects the electron-flavor neutrinos—and only the electron-flavor neutrinos—making them seem more massive. Since the first and second mass states contain more electron flavor than the third, those two experience the strongest electron interactions as they move through the Earth.

    This interaction has different effects on neutrinos and antineutrinos—and the effects depend on the mass hierarchy. If the hierarchy is normal, muon neutrinos will be more likely to turn into electron neutrinos, and muon antineutrinos will be less likely to turn into electron antineutrinos. If the hierarchy is inverted, the opposite will happen.

    So if NOvA scientists see that, after traveling through miles of rock and dirt, more muon neutrinos and fewer muon antineutrinos than expected have shifted flavors, it will be a sign the mass hierarchy is normal. If they see fewer muon neutrinos and more muon antineutrinos have shifted flavors, it will be a sign that the mass hierarchy is inverted.
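
    The sign flip described above can be made concrete with a leading-order approximation of the appearance probability in matter. The sketch below (Python) is a hedged toy: it keeps only the dominant oscillation term, neglects the solar term and CP violation, and uses illustrative parameter values for a roughly NOvA-like baseline.

    ```python
    # Toy nu_mu -> nu_e appearance probability in matter, leading-order
    # one-term approximation (solar term and CP phase neglected).
    import numpy as np

    SIN2_2TH13 = 0.085   # illustrative mixing parameters
    SIN2_TH23 = 0.5

    def appearance_prob(L_km, E_GeV, dm31_eV2, antineutrino=False, rho=2.8):
        delta = 1.267 * dm31_eV2 * L_km / E_GeV   # oscillation phase (radians)
        a = 7.6e-5 * rho * E_GeV                  # approximate matter potential (eV^2)
        if antineutrino:
            a = -a                                # matter effect flips sign
        x = a / dm31_eV2
        return SIN2_TH23 * SIN2_2TH13 * np.sin((1 - x) * delta) ** 2 / (1 - x) ** 2

    for ordering, dm31 in [("normal", +2.5e-3), ("inverted", -2.5e-3)]:
        p_nu = appearance_prob(810, 2.0, dm31)
        p_nubar = appearance_prob(810, 2.0, dm31, antineutrino=True)
        print(f"{ordering:8s}  P(nu) = {p_nu:.4f}   P(nubar) = {p_nubar:.4f}")
    # Normal ordering enhances the neutrino rate and suppresses the
    # antineutrino rate; inverted ordering does the opposite.
    ```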

    The change is subtle. It will take years of data collection to get the first hint of an answer. Another, shorter long-baseline neutrino experiment, T2K, is taking related measurements. The JUNO experiment under construction in China aims to measure the mass hierarchy in a different way. The definitive measurement likely won’t come until the next generation of long-baseline experiments, DUNE in the US and the proposed Hyper-Kamiokande experiment in Japan.

    [Images: T2K experiment and map, Japan]

    [Image: JUNO neutrino detector, at Kaiping, Jiangmen in southern China]

    [Image: FNAL LBNF/DUNE, from Fermilab to SURF in Lead, South Dakota, USA]

    [Image: Hyper-Kamiokande, a neutrino physics laboratory located underground in the Mozumi Mine of the Kamioka Mining and Smelting Co. near the Kamioka section of the city of Hida in Gifu Prefecture, Japan]

    Neutrinos are some of the most abundant particles in the universe. As we slowly uncover their secrets, they give us more clues about how our universe works.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:48 pm on January 5, 2017
    Tags: Symmetry Magazine

    From Symmetry: “Anything to declare?” A Really Cool Article by Sarah Charley 


    01/05/17
    Sarah Charley

    [Image: A scientist at CERN removes a delicate half-disk of pixels from its custom-made box, which was designed to fit snugly in an airplane seat. Photo courtesy of John Conway]

    John Conway knows the exact width of airplane aisles (15 inches). He also personally knows the Transportation Security Administration operations manager at Chicago’s O’Hare Airport. That’s because Conway has spent the last decade transporting extremely sensitive detector equipment in commercial airline cabins.

    “We have a long history of shipping particle detectors through commercial carriers and having them arrive broken,” says Conway, who is a physicist at the University of California, Davis. “So in 2007 we decided to start carrying them ourselves. Our equipment is our baby, so who better to transport it than the people whose work depends on it?”

    Their instrument isn’t musical, but it’s just as fragile and irreplaceable as a vintage Italian cello, and it travels the same way. Members of the CMS experiment collaboration at the CERN research center tested different approaches for shipping the instrument by embedding accelerometers in the packages. Their best method for safety and cost-effectiveness? Reserving a seat on the plane for the delicate cargo.

    [Images: CMS detector at CERN; CMS Higgs event]

    In November Conway accompanied parts of the new CMS pixel detector from the Department of Energy’s Fermi National Accelerator Laboratory [FNAL] near Chicago to CERN in Geneva. The pixels are very thin silicon chips mounted inside a long cylindrical tube. This new part will sit in the heart of the CMS experiment and record data from the high-energy particle collisions generated by the Large Hadron Collider [LHC].

    [Images: LHC at CERN; LHC map and tunnel]

    “It functions like the sensor inside a digital camera,” Conway says, “except it has 45 megapixels and takes 40 million pictures every second.”
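
    Taking that quote at face value gives a striking back-of-envelope number (Python; a deliberately naive estimate that ignores the zero suppression and triggering that drastically reduce what is actually read out):

    ```python
    # Naive raw data rate implied by "45 megapixels, 40 million pictures
    # per second", assuming a single bit per pixel per frame.
    pixels = 45e6
    frames_per_second = 40e6
    bits_per_pixel = 1

    terabytes_per_second = pixels * frames_per_second * bits_per_pixel / 8 / 1e12
    print(f"~{terabytes_per_second:.0f} TB/s before any data reduction")
    ```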

    Scientists and engineers assembled and tested these delicate silicon disks at Fermilab before Conway and two colleagues escorted them to Geneva. The development and construction of the component pieces took place at Fermilab and universities around the United States.

    Conway and his colleagues reserved an economy seat for each custom-made container and then accompanied these precious packages through check-in, security and all the way to their final destination at CERN. And although these packages did not leave Fermilab through the shipping department, each carried its own official paperwork.

    “We’d get a lot of weird looks when rolling them onto the airplane,” Conway says. “One time the flight crew kept joking that we were transporting dinosaur eggs.”

    After four trips by three people across the Atlantic, all 12 components of the US-built pixel detectors are at CERN and ready for integration with their European counterparts. This winter the completed new pixel detector will replace its time-worn predecessor currently inside the CMS detector.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 10:31 am on December 21, 2016
    Tags: 2016 year in particle physics, Symmetry Magazine

    From Symmetry: “2016 year in particle physics” 


    12/20/16
    Jim Siegrist, US DOE Office of High Energy Physics

    Scientists furthered studies of the Higgs boson, neutrinos, dark matter, dark energy and cosmic inflation and continued the search for undiscovered particles, forces and principles.

    Working together, particle physicists from the US and around the globe made exciting advances this year in our understanding of the universe at the smallest and largest scales.

    The LIGO experiment made the first detection of gravitational waves, originally predicted by Albert Einstein in 1916 in his general theory of relativity.

    [Images: Caltech/MIT Advanced LIGO installations at Hanford, WA, and Livingston, LA, USA; gravitational waves, credit MPI for Gravitational Physics/W.Benger-Zib; Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project]

    And scientists have pushed closer to the next big discovery at experiments such as those at the Large Hadron Collider and at ultra-sensitive underground neutrino detectors.

    [Images: LHC at CERN]

    [Image: FNAL LBNF/DUNE, from Fermilab to SURF]

    The pursuit of particle physics is a truly international effort. It takes the combined resources and expertise of partnering nations to develop and use unique world-class facilities and advanced technology detectors.

    Efforts in particle physics can be divided into five intertwined lines of inquiry: explorations of the Higgs boson, neutrinos, dark matter, cosmic acceleration and the unknown. Following this community vision enabled physicists to make major scientific advances in 2016 and set the stage for a fascinating future.

    Using the Higgs boson as a new tool for discovery

    [Images: CMS and ATLAS Higgs events at CERN]

    The discovery of the Higgs boson in 2012 at the Large Hadron Collider at CERN opened a new door to understanding the universe. In 2016, the LHC produced roughly the same number of particle collisions that it did during all of its previous years of operation combined. At its current collision rate, it produces a Higgs boson about once per second.

    While it will take time for the ATLAS and CMS experiment collaborations to digest this deluge of data, early results are already probing for any signs of unexpected Higgs boson behavior. In August, the ATLAS and CMS collaborations used data from the highest energy LHC collisions to “rediscover” the Higgs boson and confirm that it agrees with the predictions of the Standard Model of particle physics—so far.

    [Images: ATLAS and CMS detectors at CERN]

    Deviations from the predictions would signal new physics beyond the Standard Model.

    [Image: The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column and the Higgs boson in the fifth.]

    Since the LHC aims to continue running at its record pace for the next two years and more than double the delivered particle collisions to the experiments, this window to the universe is only beginning to open. The latest theoretical calculations of all of the major ways a Higgs boson can be produced and decay will enable rigorous new tests of the Standard Model.

    US scientists are also ramping up efforts with their international partners to develop future upgrades for a High-Luminosity LHC that would provide 10 times the collisions and launch an era of high-precision Higgs-boson physics.


    Scientists have made significant progress this year in the development of more powerful superconducting magnets for the HL-LHC, including the production of a successful prototype that is currently the strongest accelerator magnet ever created.

    Pursuing the physics associated with neutrino mass

    In 2016, several experiments continued to study ghostly neutrinos—particles so pervasive and aloof that 100 trillion of them pass through you each second. In the late ’90s and early ’00s, experiments in Japan and Canada found proof that these peculiar particles have some mass and that they can transform between types of neutrino as they travel.

    A global program of experiments aims to address numerous remaining questions about neutrinos. Long-baseline experiments study the particles as they fly through the earth between Tokai and Kamioka in Japan or between Illinois and Minnesota in the US. These experiments aim to discern what masses neutrinos have and whether there are differences between the transformations of neutrinos and their antimatter partners, antineutrinos.

    [Images: Super-Kamiokande experiment, Japan; FNAL/NOvA experiment]

    In July, the T2K experiment in Japan announced that their data showed a possible difference between the rate at which a muon neutrino turns into an electron neutrino and the rate at which a muon antineutrino turns into an electron antineutrino. The T2K data hint at a combination of neutrino properties that would also give the NOvA experiment in the US their most favorable chance of making a discovery about neutrinos in the next few years.

    In China, construction is underway for the Jiangmen Underground Neutrino Observatory, which will investigate neutrino mass in an effort to determine which neutrino is the lightest.

    [Image: JUNO neutrino detector, China]

    In the longer term, particle physicists aim to answer these questions definitively by hosting the world-class Long-Baseline Neutrino Facility, which would send a high-intensity neutrino beam 800 miles from Illinois to South Dakota. There, the international Deep Underground Neutrino Experiment, a mile beneath the surface, would enable precision neutrino science.

    Identifying the new physics of dark matter

    Overwhelming indirect evidence indicates that more than a quarter of the mass and energy in the observable universe is made up of an invisible substance called dark matter. But the nature of dark matter remains a mystery. Little is known about it other than that it interacts through gravity.

    To guide the experimental search for dark matter, theorists have studied the possible interactions that known particles might have with a wide variety of potential dark matter candidates with possible masses ranging over more than a dozen orders of magnitude.

    Huge sensitive detectors, such as the Large Underground Xenon, or LUX, experiment located a mile beneath the Black Hills of South Dakota, directly search for the dark matter particles that may be continually passing through Earth. This year, LUX completed the world’s most sensitive search for direct evidence of dark matter, improving upon its own previous world’s best search by a factor of four and narrowing the hiding space for an important class of theoretical dark matter particles.

    [Image: LUX-ZEPLIN (LZ) project at SURF]

    In addition, data from the Fermi Gamma-ray Space Telescope and other facilities continued to tighten constraints on dark matter through indirect searches.

    [Image: NASA Fermi Gamma-ray Space Telescope]

    This sets the stage for a suite of complementary next-generation experiments—including LZ, SuperCDMS-SNOLAB and ADMX-G2 in the US—that aim to significantly improve sensitivity and reveal the nature of dark matter.

    Understanding cosmic acceleration

    Particle physicists turn to the sky in their efforts to investigate a different mystery: Our universe is expanding at an accelerating rate. Scientists seek to understand the nature of dark energy, which is responsible for overcoming the force of gravity and pushing our universe apart.

    Large-scale, ground-based cosmic surveys aim to measure the long-term expansion history of the universe and improve our understanding of dark energy. This year, scientists on the Baryon Oscillation Spectroscopic Survey used their final data set, comprising 1.5 million galaxies and quasars, to make improved measurements of the cosmological scale of the universe and the rate of cosmic structure growth.

    [Image: Baryon Oscillation Spectroscopic Survey (BOSS) supercluster]

    These measurements will allow theorists to test and refine models that aim to explain the origin of the current era of cosmic acceleration.

    Through efforts that include private sector partnerships and international collaborations, US physicists aim to rapidly usher in the era of precision cosmology—and shed light on dark energy—with the ongoing Dark Energy Survey and the upcoming Dark Energy Spectroscopic Instrument and Large Synoptic Survey Telescope.

    [Image: LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Arizona, USA]

    [Images: LSST camera, built at SLAC; LSST telescope, currently under construction at Cerro Pachón, Chile]

    Community efforts are also underway to develop a next-generation cosmic microwave background experiment, CMB-S4. Precision measurements from CMB-S4 will not only advance dark energy studies and provide cosmic constraints on neutrino properties but also offer a way to probe the early era of cosmic acceleration known as inflation, which occurred at energies far greater than any that can be achieved in an accelerator on Earth.

    Exploring the unknown

    Oftentimes, results from an experiment show a hint of something new and unexpected, and scientists must design new technology to determine if what they’ve seen is real. But between 2015 and 2016, scientists at the LHC both raised and answered their own question.

    In late 2015, LHC scientists found an unexpected bump in their data, a possible first hint of a new particle. Theorists were on the case; early in 2016 they laid the framework for possible interpretations of the data and explored how it might impact the Standard Model of particle physics. But in August, experimentalists had gathered enough new data to deem the hint a statistical fluctuation.

    Stimulated by the discovery of pentaquark and tetraquark states, some theorists have predicted that bound states of four b quarks should soon be observable at the LHC.

    [Images: CERN LHCb pentaquark; tetraquark, physicsworld.com]

    Experimentalists continue to test theorists’ predictions against data by performing high-precision measurements or studying extremely rare particle decays at experiments such as the LHCb experiment at the LHC, the upcoming Belle II experiment in Japan and the Muon g-2 and Muon to Electron Conversion experiments at Fermi National Accelerator Laboratory.

    [Image: FNAL Muon g-2]

    Investing in the future of discovery science

    The world-class facilities and experiments that enable the global program of particle physics are built on a foundation of advanced technology. Ongoing research and development of particle accelerator and detector technology seed the long-term future prospects for discovery.

    In 2016, scientists and engineers continued to make advances in particle accelerator technology to prepare to build next-generation machines and possible far-future facilities.

    Advances in the efficiency of superconducting radio-frequency cavities will lead to cost savings in building and operating machines such as the Linac Coherent Light Source II. In February, researchers at the Berkeley Lab Laser Accelerator, or BELLA, demonstrated the first multi-stage accelerator based on “tabletop” laser-plasma technology. This key step is necessary to push toward far-future particle colliders that could be thousands of times shorter than conventional accelerators.

    These results reflect only a small portion of the total scientific output of the particle physics community in 2016. The stage is set for exciting discoveries that will advance our understanding of the universe.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 7:22 am on December 10, 2016
    Tags: SESAME Synchrotron, Symmetry Magazine

    From Symmetry: “SESAME to open in 2017” 


    12/09/16
    Troy Rummler

    The first synchrotron radiation source in the Middle East is running tests before its planned 2017 start.

    [Image: SESAME particle accelerator interior, Allan, Jordan. Noemi Caraban, SESAME]

    [Image: SESAME (Synchrotron-light for Experimental Science and Applications in the Middle East) campus]

    Scientists and engineers at the first synchrotron radiation source in the Middle East have begun commissioning, a major milestone before officially starting operations in 2017.

    When fully operational, the facility in Allan, Jordan, called SESAME, will mark a major victory for science in the region and also for its international backers. Like CERN, SESAME was established under the auspices of UNESCO, but it is now an independent intergovernmental organization and aims to facilitate peace through scientific collaboration that might supersede political divisions. Countries and labs the world over have responded to that vision by contributing to SESAME’s design, instrumentation and construction.

    SESAME, which stands for Synchrotron-light for Experimental Science and Applications in the Middle East, is a 133-meter-circumference storage ring built to produce intense radiation, ranging from infrared to X-rays, given off by electrons circling inside it at high energies. At the heart of SESAME are injector components from BESSY I, a Berlin-based synchrotron that was decommissioned in 1999, donated to SESAME and upgraded to support a completely new 2.5-GeV storage ring. With funding provided in part by the European Commission and construction led by CERN in collaboration with SESAME, the new ring is on par with most modern synchrotrons.

    Now that the machine is largely complete, technicians can perform quality testing before researchers gain access and determine whether the light source can accomplish its scientific mission.

    “The first scientific mission of SESAME is to promote excellence in science in the Middle East,” says Zehra Sayers, chair of SESAME’s scientific committee and also a faculty member at Sabanci University in Istanbul, Turkey.

    Over the past decade, SESAME has organized users’ meetings each year to discuss and develop proposed research plans. That community is now over 200 strong. The international facility hosts members from Bahrain, Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, the Palestinian Authority and Turkey.

    [Image: 14th SESAME users’ meeting. Noemi Caraban, SESAME]

    “It is very important for us to be able to perform high quality science at SESAME,” Sayers says. “Because that is what will make it viable, only then people will want to come here to do experiments, and only then people will think that this is really where they can find answers to their questions.”

    Dozens of synchrotrons in other locations throughout the world have already proven themselves as research hubs. Synchrotrons create ultra-bright light radiation and channel it into instruments used for advanced imaging research, with applications ranging from materials science to drug discovery.

    No synchrotrons existed in the Middle East until now. Political turbulence can make access to other facilities abroad challenging. Sayers says she is confident that SESAME will fill the need for a local laboratory.

    The new facility creates an opportunity for regional scientists to collaborate, for example, to study shared cultural heritage. The SESAME light source will be used to identify materials in ancient, cultural artifacts such as textiles and dyes, parchments and inks, and could reveal new information about how the materials were originally prepared.

    Researchers will initially have access to two beamlines of different wavelengths when operations begin. The facility has capacity for 25 beamlines, and it is expected that within a year two more beamlines will become available. As beamlines are added, the number of applications will grow to encompass diverse fields such as archeology, molecular biology, materials science and environmental science.

    The potential diversity is one of SESAME’s greatest strengths, says Maher Attal, who is coordinating the commissioning process. Twelve straight sections of the machine can accommodate insertion devices: series of small dipole magnets that tune the spectrum of the emitted synchrotron light. This makes SESAME a “third generation” light source. SESAME’s materials science beamline, which will come into operation in 2017 or 2018, will be the first to be supplied with light from such a device.

    SESAME is undergoing a period of testing and quality control that usually takes several months. After technicians install and test the individual components, they will guide the beam through the whole machine at low energy to allow scientists to perfect its alignment, then to make measurements and corrections if its performance deviates too far from predicted values. The machine then must pass the same inspections at its maximum energy before the synchrotron officially opens.

    “We expect to deliver the first photon beam to the users in April 2017,” Attal says.

    Scientists will be watching and waiting.

    “We owe it to the region to make SESAME a success,” Sayers says. “It will be a ray of hope in a time of turmoil.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:55 pm on December 6, 2016
    Tags: Deep learning takes on physics, Symmetry Magazine

    From Symmetry: “Deep learning takes on physics” 


    12/06/16
    Molly Olmstead

    [Illustration by Sandbox Studio, Chicago with Ana Kova]

    Can the same type of technology Facebook uses to recognize faces also recognize particles?

    When you upload a photo of one of your friends to Facebook, you set into motion a complex behind-the-scenes process. An algorithm whirs away, analyzing the pixels in the photo until it spits out your friend’s name. This same cutting-edge technique enables self-driving cars to distinguish pedestrians and other vehicles from the scenery around them.

    Can this technology also be used to tell a muon from an electron? Many physicists believe so. Researchers in the field are beginning to adapt it to analyze particle physics data.

    Proponents hope that using deep learning will save experiments time, money and manpower, freeing physicists to do other, less tedious work. Others hope deep learning will improve the experiments’ performance, making them better able to identify particles and analyze data than any algorithm used before. And while physicists don’t expect deep learning to be a cure-all, some think it could be key to warding off an impending data-processing crisis.

    Neural networks

    Up until now, computer scientists have often coded algorithms by hand, a task that requires countless hours of work with complex computer languages. “We still do great science,” says Gabe Perdue, a scientist at Fermi National Accelerator Laboratory. “But I think we could do better science.”

    Deep learning, on the other hand, requires a different kind of human input.

    One way to conduct deep learning is to use a convolutional neural network, or CNN. CNNs are modeled after human visual perception. Humans process images using a network of neurons in the body; CNNs process images through layers of inputs called nodes. People train CNNs by feeding them pre-processed images. Using these inputs, an algorithm continuously tweaks the weight it places on each node and learns to identify patterns and points of interest. As the algorithm refines these weights, it becomes more and more accurate, often outperforming humans.

    Convolutional neural networks break down data processing in a way that short-circuits steps by tying multiple weights together, meaning fewer elements of the algorithm have to be adjusted.
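
    To make that concrete, here is a minimal sketch in PyTorch, purely illustrative (the “detector image” shapes and the two-class labels are invented, not any experiment’s real setup). The convolutional layers are the tied-weight pieces described above: each slides a small shared kernel across the entire image, so far fewer parameters need adjusting than in a fully connected network.

    ```python
    # Tiny illustrative CNN classifier, e.g. muon vs. electron event images.
    import torch
    import torch.nn as nn

    class TinyCNN(nn.Module):
        def __init__(self, n_classes=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 8 shared 3x3 kernels
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 32x32 -> 16x16
                nn.Conv2d(8, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.MaxPool2d(2),                             # 16x16 -> 8x8
            )
            self.classifier = nn.Linear(16 * 8 * 8, n_classes)

        def forward(self, x):
            x = self.features(x)
            return self.classifier(x.flatten(1))             # class scores

    model = TinyCNN()
    toy_images = torch.randn(4, 1, 32, 32)   # a batch of four fake "event images"
    print(model(toy_images).shape)           # torch.Size([4, 2])
    ```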

    CNNs have been around since the late ’90s. But in recent years, breakthroughs have led to more affordable hardware for processing graphics, bigger data sets for training and innovations in the design of the CNNs themselves. As a result, more and more researchers are starting to use them.

    The development of CNNs has led to advances in speech recognition and translation, as well as in other tasks traditionally completed by humans. A London-based company owned by Google used a CNN to create AlphaGo, a computer program that in March beat the second-ranked international player of Go, a strategy board game far more complex than chess.

    CNNs have made it much more feasible to handle previously prohibitively large amounts of image-based data—the kind of amounts seen often in high-energy physics.

    Reaching the field of physics

    CNNs became practical around the year 2006 with the emergence of big data and graphics processing units, which have the necessary computing power to process large amounts of information. “There was a big jump in accuracy, and people have been innovating like wild on top of that ever since,” Perdue says.

    Around a year ago, researchers at various high-energy experiments began to consider the possibility of applying CNNs to their experiments. “We’ve turned a physics problem into, ‘Can we tell a car from a bicycle?’” says SLAC National Accelerator Laboratory researcher Michael Kagan. “We’re just figuring out how to recast problems in the right way.”

    For the most part, CNNs will be used for particle identification and classification and particle-track reconstruction. A couple of experiments are already using CNNs to analyze particle interactions, with high levels of accuracy. Researchers at the NOvA neutrino experiment, for example, have applied a CNN to their data.

    [Image: FNAL/NOvA experiment]

    “This thing was really designed for identifying pictures of dogs and cats and people, but it’s also pretty good at identifying these physics events,” says Fermilab scientist Alex Himmel. “The performance was very good—equivalent to 30 percent more data in our detector.”

    Scientists on experiments at the Large Hadron Collider hope to use deep learning to make their experiments more autonomous, says CERN physicist Maurizio Pierini.

    [Images: LHC at CERN]

    “We’re trying to replace humans on a few tasks. It’s much more costly to have a person watching things than a computer.”

    CNNs promise to be useful outside of detector physics as well. On the astrophysics side, some scientists are working on developing CNNs that can discover new gravitational lenses, massive celestial objects such as galaxy clusters that can distort light from distant galaxies behind them. The process of scanning the telescope data for signs of lenses is highly time-consuming, and normal pattern-recognizing programs have a hard time distinguishing their features.

    “It’s fair to say we’ve only begun to scratch the surface when it comes to using these tools,” says Alex Radovic, a postdoctoral fellow at The College of William & Mary who works on the NOvA experiment at Fermilab.

    [Illustration by Sandbox Studio, Chicago with Ana Kova]

    The upcoming data flood

    Some believe neural networks could help avert what they see as an upcoming data processing crisis.

    An upgraded version of the Large Hadron Collider planned for 2025 will produce roughly 10 times as much data.


    The Dark Energy Spectroscopic Instrument will collect data from about 35 million cosmic objects, and the Large Synoptic Survey Telescope will capture high-resolution video of nearly 40 billion galaxies.

    [Image: LBNL/DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory, starting in 2018]

    [Images: LSST camera, built at SLAC; LSST telescope, currently under construction at Cerro Pachón, Chile]

    Data streams promise to grow, but previously exponential growth in the power of computer chips is predicted to falter. That means greater amounts of data will become increasingly expensive to process.

    “You may need 100 times more capability for 10 times more collisions,” Pierini says. “We are going toward a dead end for the traditional way of doing things.”

    Not all experiments are equally fit for the technology, however.

    “I think this’ll be the right tool sometimes, but it won’t be all the time,” Himmel says. “The more dissimilar your data is from natural images, the less useful the networks are going to be.”

    Most physicists would agree that CNNs are not appropriate for data analysis at experiments that are just starting up, for example—neural networks are not very transparent about how they do their calculations. “It would be hard to convince people that they have discovered things,” Pierini says. “I still think there’s value to doing things with paper and pen.”

    In some cases, the challenges of running a CNN will outweigh the benefits. For one, the data need to be converted to image form if they aren’t already. And the networks require huge amounts of data for the training—sometimes millions of images taken from simulations. Even then, simulations aren’t as good as real data. So the networks have to be tested with real data and other cross-checks.

    “There’s a high standard for physicists to accept anything new,” says Amir Farbin, an associate professor of physics at The University of Texas, Arlington. “There’s a lot of hoops to jump through to convince everybody this is right.”

    Looking to the future

    For those who are already convinced, CNNs spawn big dreams for faster physics and the possibility of something unexpected.

    Some look forward to using neural networks for detecting anomalies in the data—which could indicate a flaw in a detector or possibly a hint of a new discovery. Rather than trying to find specific signs of something new, researchers looking for new discoveries could simply direct a CNN to work through the data and try to find what stands out. “You don’t have to specify which new physics you’re searching for,” Pierini says. “It’s a much more open-minded way of taking data.”

    Someday, researchers might even begin to tackle physics data with unsupervised learning. In unsupervised learning, as the name suggests, an algorithm would train on vast amounts of data without human guidance. Scientists would be able to give algorithms data, and the algorithms would be able to figure out for themselves what conclusions to draw from it.

    “If you had something smart enough, you could use it to do all types of things,” Perdue says. “If it could infer a new law of nature or something, that would be amazing.”

    “But,” he adds, “I would also have to go look for new employment.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:08 pm on December 2, 2016
    Tags: Symmetry Magazine

    From Symmetry: “Viewing our turbulent universe” 


    12/02/16
    Liz Kruesi

    Construction has begun for the Cherenkov Telescope Array [CTA], a discovery machine that will study the highest energy objects and events across the entire sky.

    [Image credit: Daniel Mazin, CTA Observatory]

    Billions of light-years away, a supermassive black hole is spewing high-energy radiation, launching it far outside of the confines of its galaxy. Some of the gamma rays released by that turbulent neighborhood travel unimpeded across the universe, untouched by the magnetic fields threading the cosmos, toward our small, rocky, blue planet.

    We have space-based devices, such as the Fermi Gamma-ray Space Telescope, that can detect those messengers, allowing us to see into the black hole’s extreme environment or search for evidence of dark matter.

    [Image: NASA Fermi Gamma-ray Space Telescope]

    But Earth’s atmosphere blocks gamma rays. When they meet the atmosphere, sequences of interactions with gas molecules break them into a shower of fast-moving secondary particles. Some of those generated particles—which could be, for example, fast-moving electrons and their antiparticles, positrons—speed through the atmosphere so quickly that they generate a faint flash of blue light, called Cherenkov radiation.
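
    The condition for that flash is simply that the particle outruns light in the medium, v > c/n. A back-of-envelope sketch (Python; the sea-level refractive index is used for illustration, though n falls with altitude) gives the threshold energy for an electron in air:

    ```python
    # Cherenkov threshold for an electron in air: light is emitted only if
    # the electron's speed exceeds c/n, i.e. its Lorentz factor exceeds
    # 1 / sqrt(1 - 1/n^2).
    import math

    n = 1.00029          # refractive index of air near sea level (illustrative)
    m_e = 0.511          # electron rest energy in MeV

    gamma_threshold = 1.0 / math.sqrt(1.0 - 1.0 / n**2)
    print(f"E_threshold ~ {gamma_threshold * m_e:.0f} MeV")   # roughly 21 MeV
    ```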

    A special type of telescope—large mirrors fitted with small reflective cones to funnel the faint light—can detect this blue flash in the atmosphere. Three observatories equipped with Cherenkov telescopes look at the sky during moonless hours of the night: VERITAS in Arizona has an array of four; MAGIC in La Palma, Spain, has two; and HESS in Namibia, Africa, has an array of five.

    [Images: CfA/VERITAS, Arizona, USA; MAGIC Cherenkov gamma-ray telescope on the Canary island of La Palma, Spain; HESS Cherenkov array, located on the Cranz family farm, Göllschau, in Namibia, near the Gamsberg]

    All three observatories have operated for at least 10 years, revealing a gamma-ray sky to astrophysicists.

    “Those telescopes really have helped to open the window, if you like, on this particular region of the electromagnetic spectrum,” says Paula Chadwick, a gamma-ray astronomer at Durham University in the United Kingdom. But that new window has also hinted at how much more there is to learn.

    “It became pretty clear that what we needed was a much bigger instrument to give us much better sensitivity,” she says. And so gamma-ray scientists have been working since 2005 to develop the next-generation Cherenkov observatory: “a discovery machine,” as Stefan Funk of Germany’s Erlangen Centre for Astroparticle Physics calls it, that will reveal the highest energy objects and events across the entire sky. This is the Cherenkov Telescope Array (CTA), and construction has begun.

    Ironing out the details

    As of now, nearly 1400 researchers and engineers from 32 countries are members of the CTA collaboration, and membership continues to grow. “If we look at the number of CTA members as a function of time, it’s essentially a linear increase,” says CTA spokesperson Werner Hofmann.

    Technology is being developed in laboratories spread across the globe: in Germany, Italy, the United Kingdom, Japan, the United States (where the work is supported by the NSF; given CTA’s primarily astrophysical science mission, it is not part of the Department of Energy’s High Energy Physics program) and elsewhere. Those nearly 1400 researchers are working together to gain a better understanding of how our universe works. “It’s the science that’s got everybody together, got everybody excited, and devoting so much of their time and energy to this,” Chadwick says.

    [Image credit: G. Pérez, IAC, SMM]

    The CTA will be split between two locations, with one array in the Northern Hemisphere and a larger one in the Southern Hemisphere. The dual location enables a view of the entire sky.

    CTA’s northern site will host four large telescopes (23 meters wide) and 15 medium telescopes (12 meters wide). The southern site will also host four large telescopes, plus 25 medium and 70 small telescopes (4 meters) that will use three different designs. The small telescopes are equipped to capture the highest energy gamma rays, which emanate, for example, from the center of our galaxy. That high-energy source is visible only from the Southern Hemisphere.
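
    For reference, a quick tally of the counts quoted above (they add up to the 118 telescopes cited at the end of this article):

    ```python
    # CTA telescope counts per site, as quoted above.
    north = 4 + 15         # large + medium telescopes, Northern Hemisphere
    south = 4 + 25 + 70    # large + medium + small telescopes, Southern Hemisphere
    print(north, south, north + south)   # 19 99 118
    ```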

    In July 2015, the CTA Observatory (CTAO) council—the official governing body that acts on behalf of the observatory—chose their top locations in each hemisphere. And in 2016, the council has worked to make those preferences official. On September 19 the council and the Instituto de Astrofísica de Canarias signed an agreement stating that the Roque de los Muchachos Observatory on the Canary Island of La Palma would host the northern array and its 19 constituent telescopes. This same site hosts the current-generation Cherenkov array MAGIC.


    Construction of the foundation is progressing at the La Palma site to prepare for a prototype of the large telescope. The telescope itself is expected to be complete in late 2017.

    “It’s an incredibly aggressive schedule,” Hofmann says. “With a bit of luck we’ll have the first of these big telescopes operational at La Palma a year from now.”

    While the large telescope prototype is being built on the La Palma site, the medium and small prototype telescopes are being built in laboratories across the globe and installed at observatories similarly scattered. The prototypes’ optical designs and camera technologies need to be tested in a variety of environments. For example, the team working on one of the small telescope designs has a prototype on the slope of Mount Etna in Sicily. There, volcanic ash sometimes batters the mirrors and attached camera, providing a test to ensure CTA telescopes and instruments can withstand the environment. Unlike optical telescopes, which sit in protective domes, Cherenkov telescopes are exposed to the open air.

    The CTAO council expects to complete negotiations with the European Southern Observatory before the end of 2016 to finalize plans for the southern array. The current plan is to build 99 telescopes in Chile.


    This year, the council also chose the location of the CTA Science Management Center, which will be the central point of data processing, software updates and science coordination. This building, which will be located at Deutsches Elektronen-Synchrotron (also known as DESY) outside of Berlin, has not yet been built, but Hofmann says that should happen in 2018.


    The observatory is on track for the first trial observations (essentially, testing) in 2021 and the first regular observations beginning in 2022. How close the project’s construction stays to this outlined schedule depends on funding from nations across the globe. But if the finances remain on track, then in 2024, the full observatory should be complete, and its 118 telescopes will then look for bright flashes of Cherenkov light signaling a violent event or object in the universe.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 1:13 pm on November 21, 2016
    Tags: Symmetry Magazine, What to do with the data that is here and what is coming

    From Symmetry: “What to do with the data?” 


    11/15/16
    Manuel Gnida

    [Illustration by Sandbox Studio, Chicago with Corinne Mucha]

    Physicists and scientific computing experts prepare for an onslaught of petabytes.

    Rapid advances in computing constantly translate into new technologies in our everyday lives. The same is true for high-energy physics. The field has always been an early adopter of new technologies, applying them in ever more complex experiments that study fine details of nature’s most fundamental processes. However, these sophisticated experiments produce floods of complex data that become increasingly challenging to handle and analyze.

    Researchers estimate that a decade from now, computing resources may have a hard time keeping up with the slew of data produced by state-of-the-art discovery machines. CERN’s Large Hadron Collider, for example, already generates tens of petabytes (millions of gigabytes) of data per year, and it will produce ten times more after a future high-luminosity upgrade.

    [Images: HL-LHC; LHC at CERN]

    Big data challenges like these are not limited to high-energy physics. When the Large Synoptic Survey Telescope begins observing the entire southern sky in never-before-seen detail, it will create a stream of 10 million time-dependent events every night and a catalog of 37 billion astronomical objects over 10 years.

    [Images: LSST camera, built at SLAC; LSST telescope, currently under construction at Cerro Pachón, Chile]

    Another example is the future LCLS-II X-ray laser at the Department of Energy’s SLAC National Accelerator Laboratory, which will fire up to a million X-ray pulses per second at materials to provide unprecedented views of atoms in motion. It will also generate tons of scientific data.

    [Image: SLAC/LCLS-II schematic]

    To make things more challenging, all big data applications will have to compete for available computing resources, for example when shuttling information around the globe via shared networks.

    What are the tools researchers will need to handle future data piles, sift through them and identify interesting science? How will they be able to do it as fast as possible? How will they move and store tremendous data volumes efficiently and reliably? And how can they possibly accomplish all of this while facing budgets that are expected to stay flat?

    “Clearly, we’re at a point where we need to discuss in what direction scientific computing should be going in order to address increasing computational demands and expected shortfalls,” says Richard Mount, head of computing for SLAC’s Elementary Particle Physics Division.

    Mount co-chaired the 22nd International Conference on Computing in High-Energy and Nuclear Physics (CHEP 2016), held Oct. 10–14 in San Francisco, where more than 500 physicists and computing experts brainstormed possible solutions.

    Here are some of their ideas.

    Exascale supercomputers

    Scientific computing has greatly benefited from what is known as Moore’s law—the observation that the performance of computer chips has doubled roughly every 18 months for the past several decades. This trend has allowed scientists to handle data from increasingly sophisticated machines and perform ever more complex calculations in reasonable amounts of time.

    Moore’s law, based on the fact that hardware engineers were able to squeeze more and more transistors into computer chips, has recently reached its limits because transistor densities have begun to cause problems with heat.

    Instead, modern hardware architectures involve multiple processor cores that run in parallel to speed up performance. Today’s fastest supercomputers, which are used for demanding calculations such as climate modeling and cosmological simulations, have millions of cores and can perform tens of millions of billions of computing operations per second.

    “In the US, we have a presidential mandate to further push the limits of this technology,” says Debbie Bard, a big-data architect at the National Energy Research Scientific Computing Center. “The goal is to develop computing systems within the next 10 years that will allow calculations on the exascale, corresponding to at least a billion billion operations per second.”

    Software reengineering

    Running more data analyses on supercomputers could help address some of the foreseeable computing shortfalls in high-energy physics, but the approach comes with its very own challenges.

    “Existing analysis codes have to be reengineered,” Bard says. “This is a monumental task, considering that many have been developed over several decades.”

    Maria Girone, chief technology officer at CERN openlab, a collaboration of public and private partners developing IT solutions for the global LHC community and other scientific research, says, “Computer chip manufacturers keep telling us that our software only uses a small percentage of today’s processor capabilities. To catch up with the technology, we need to rewrite software in a way that it can be adapted to future hardware developments.”

    Part of this effort will be educating members of the high-energy physics community to write more efficient software.
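
    One concrete example of what “more efficient software” means: much of a modern chip’s power sits in vector units that apply one instruction to many values at once, and code has to be structured to use them. A toy sketch (NumPy standing in for the vectorized kernels real experiment code would target; the calibration numbers are invented):

    import numpy as np

    energies = np.random.default_rng(0).uniform(1.0, 100.0, size=1_000_000)

    # Scalar style: the interpreter touches one value at a time.
    calibrated_slow = [e * 1.02 + 0.5 for e in energies]

    # Vectorized style: one call hands the whole array to compiled,
    # SIMD-friendly code that can use the processor's vector units.
    calibrated_fast = energies * 1.02 + 0.5

    assert np.allclose(calibrated_slow, calibrated_fast)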

    “This was much easier in the past when the hardware was less complicated,” says Makoto Asai, who leads SLAC’s team for the development of Geant4, a widely used simulation toolkit for high-energy physics and many other applications. “We must learn the new architectures and make them more understandable for physicists, who will have to write software for our experiments.”

    Smarter networks and cloud computing

    Today, LHC computing is accomplished with the Worldwide LHC Computing Grid, or WLCG, a network of more than 170 linked computer centers in 42 countries that provides the necessary resources to store, distribute and analyze the tens of petabytes of data produced by LHC experiments annually.

    “The WLCG is working very successfully, but it doesn’t always operate in the most cost-efficient way,” says Ian Fisk, deputy director for computing at the Simons Foundation and former computing coordinator of the CMS experiment at the LHC.

    “We need to move large amounts of data and store many copies so that they can be analyzed in various locations. In fact, two-thirds of the computing-related costs are due to storage, and we need to ask ourselves if computing can evolve so that we don’t have to distribute LHC data so widely.”

    More use of cloud services that offer internet-based, on-demand computing could be a viable solution for remote data processing and analysis without reproducing data.

    Commercial clouds have the capacity and capability to take on big data: Google receives billions of photos per day and hundreds of hours of video every minute, posing technical challenges that have led to the development of powerful computing, storage and networking solutions.

    Deep machine learning for data analysis

    While conventional computer algorithms perform only operations that they are explicitly programmed to perform, machine learning uses algorithms that learn from the data and successively become better at analyzing them.

    In the case of deep learning, data are processed in several computational layers that form a network of algorithms inspired by biological neural networks. Deep learning methods are particularly good at finding patterns in data. Search engines, text and speech recognition, and computer vision are all examples.
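
    To make “computational layers” concrete, here is a toy forward pass through a two-layer network in NumPy (our sketch, with untrained random weights and invented feature counts; a real analysis would learn the weights from data):

    import numpy as np

    rng = np.random.default_rng(42)

    def layer(x, weights, bias):
        # One computational layer: a linear map followed by a nonlinearity (ReLU).
        return np.maximum(0.0, x @ weights + bias)

    # Toy network: 4 detector features -> 8 hidden units -> 1 output score.
    w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
    w2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    features = rng.normal(size=(10, 4))             # 10 mock events
    hidden = layer(features, w1, b1)
    scores = 1 / (1 + np.exp(-(hidden @ w2 + b2)))  # sigmoid: signal-like score per event
    print(scores.ravel().round(3))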

    “There are many areas where we can learn from technology developments outside the high-energy physics realm,” says Craig Tull, who co-chaired CHEP 2016 and is head of the Science Software Systems Group at Lawrence Berkeley National Laboratory. “Machine learning is a very good example. It could help us find interesting patterns in our data and detect anomalies that could potentially hint at new science.”

    At present, machine learning in high-energy physics is in its infancy, but researchers have begun implementing it in the analysis of data from a number of experiments, including ATLAS at the LHC, the Daya Bay neutrino experiment in China and multiple experiments at Fermi National Accelerator Laboratory near Chicago.

    CERN/ATLAS detector

    Daya Bay, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China

    FNAL LBNF/DUNE from FNAL to SURF

    Quantum computing

    The most futuristic approach to scientific computing is quantum computing, an idea that goes back to the 1980s when it was first brought up by Richard Feynman and other researchers.

    Unlike conventional computers, which encode information as a series of bits that can have only one of two values, quantum computers use quantum bits, or qubits, which can exist in a superposition of both values at once. The number of states available to a system of qubits grows exponentially with the number of qubits, and this is the source of a quantum computer’s potential power.

    A simple one-qubit system could be an atom that can be in its ground state, excited state or a superposition of both, all at the same time.
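
    For small systems, that state can be simulated directly, which also shows why classical simulation of large quantum computers is hopeless. A minimal sketch (our illustration):

    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)   # the |0> basis state

    # The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    psi = H @ ket0
    print(np.abs(psi) ** 2)                  # [0.5 0.5]: even odds for each outcome

    # An n-qubit state needs 2**n complex amplitudes, so the memory required
    # for a classical simulation doubles with every qubit added.
    print(2 ** 300)                          # roughly 2e90, versus ~1e80 atoms in the universe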

    “A quantum computer with 300 qubits will have more states than there are atoms in the universe,” said Professor John Martinis from the University of California, Santa Barbara, during his presentation at CHEP 2016. “We’re at a point where these qubit systems work quite well and can perform simple calculations.”

    Martinis has teamed up with Google to build a quantum computer. In a year or so, he says, they will have built the first 50-qubit system. Then, it will take days or weeks for the largest supercomputers to validate the calculations done within a second on the quantum computer.

    We might soon find out in what directions scientific computing in high-energy physics will develop: The community will give the next update at CHEP 2018 in Bulgaria.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:22 am on November 21, 2016 Permalink | Reply
    Tags: , , , , , , Symmetry Magazine   

    From Symmetry: “Q and A: What more can we learn about the Higgs?” 

    Symmetry Mag
    Symmetry

    11/17/16
    Angela Anderson

    1
    Illustration by Sandbox Studio, Chicago with Ana Kova

    Four physicists discuss Higgs boson research since the discovery.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    More than two decades before the discovery of the Higgs boson, four theoretical physicists wrote a comprehensive handbook called The Higgs Hunter’s Guide. The authors—Sally Dawson of the Department of Energy’s Brookhaven National Laboratory; John F. Gunion from the University of California, Davis; Howard E. Haber from the University of California, Santa Cruz; and Gordon Kane from the University of Michigan—were recently recognized for “instrumental contributions to the theory of the properties, reactions and signatures of the Higgs boson” as recipients of the American Physical Society’s 2017 J.J. Sakurai Prize for Theoretical Physics.

    They are still investigating the particle that completed the Standard Model, and some are hunting different Higgs bosons that could take particle physics beyond that model.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Dawson, Gunion and Haber recently attended the Higgs Couplings 2016 workshop at SLAC National Accelerator Laboratory, where physicists gathered to talk about the present and future of Higgs research. Symmetry interviewed all four to find out what’s on the horizon.

    S: What is meant by “Higgs couplings”?
    JG: The Higgs is an unstable particle that lasts a very short time in the detector before it decays into pairs of things like top quarks, gluons, and photons. The rates and relative importance of these decays are determined by the couplings of the Higgs boson to these different particles. And that’s what the workshop is all about, trying to determine whether or not the couplings predicted in the Standard Model agree with the couplings that are measured experimentally.

    SD: Right, we can absolutely say how much of the time we expect the Higgs to decay to the known particles, so a comparison of our predictions with the experimental measurements tells us whether there’s any possible deviation from our Standard Model.

    JG: For us what would be really exciting is if we did see deviations. However, that probably requires more precision than we currently have experimentally.

    GK: But we don’t all agree on that, in the sense that I would prefer that it almost exactly agree with the Standard Model predictions because of a theory that I like that says it should. But most of the people in the world would prefer what John and Sally said.
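
    For reference (our addition, the standard tree-level textbook expression rather than anything quoted from the workshop): the predicted rate for a Higgs decay to a fermion-antifermion pair grows with the square of the fermion mass, which is to say with the square of its coupling,

    \Gamma(H \to f\bar{f}) = \frac{N_c \, G_F \, m_H \, m_f^2}{4\sqrt{2}\,\pi} \left(1 - \frac{4 m_f^2}{m_H^2}\right)^{3/2},

    where N_c is 3 for quarks and 1 for leptons. Comparing each measured rate against a prediction like this is what “measuring the couplings” means in practice.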

    S: How many people are working in Higgs research now worldwide?

    GK: I did a search for “Higgs” in the title of scientific papers after 2011 on arXiv.org and came up with 5211 hits; there are several authors per paper, of course, and some have written multiple papers, so we can only estimate.

    SD: There are roughly 5000 people on each experiment, ATLAS and CMS, and some fraction of those work on Higgs research, but it’s really too hard to calculate. They all contribute in different ways. Let’s just say many thousands of experimentalists and theorists worldwide.

    S: What are Higgs researchers hoping to accomplish?

    HH: There are basically two different avenues. One is called the precision Higgs program designed to improve precision in the current data. The other direction addresses a really simple question: Is the Higgs boson a solo act or not? If additional Higgs-like particles exist, will they be discovered in future LHC experiments?

    SD: I think everybody would like to see more Higgs bosons. We don’t know if there are more, but everybody is hoping.

    JG: If you were Gordy [Kane] who only believes in one Higgs boson, you would be working to confirm with greater and greater precision that the Higgs boson you see has precisely the properties predicted in the Standard Model. This will take more and more luminosity and maybe some future colliders like a high luminosity LHC or an e+e- collider.

    HH: The precision Higgs program is a long-term effort because the high luminosity LHC is set to come online in the mid 2020s and is imagined to continue for another 10 years. There are a lot of people trying to predict what precision you could ultimately achieve in the various measurements of Higgs boson properties that will be made by the mid 2030s. Right now we have a set of measurements with statistical and systematic errors of about 20 percent. By the end of the high luminosity LHC, we anticipate that the size of the measurement errors can be reduced to around 10 percent and maybe in some cases to 5 percent.

    S: How has research on the topic changed since the Higgs discovery?

    SD: People no longer build theoretical models that don’t have a Higgs in them. You have to make sure that your model is consistent with what we know experimentally. You can’t just build a crazy model; it has to be a model with a Higgs with roughly the properties we’ve observed, and that is actually pretty restrictive.

    JG: Many theoretical models have either been eliminated or considerably constrained. For example, the supersymmetric models that are theoretically attractive kind of expect a Higgs boson of this mass, but only after pushing parameters to a bit of an extreme. There’s also an issue called naturalness: In the Standard Model alone there is no reason why the Higgs boson should have such a light mass as we see, whereas in some of these theories it is natural to see the Higgs boson at this mass. So that’s a very important topic of research—looking for those models that are in a certain sense naturally predicting what we see and finding additional experimental signals associated with such models.

    GK: For example, the supersymmetric theories predict that there will be five Higgs bosons with different masses. The extent to which the electroweak symmetry is broken by each of the five depends on their couplings, but there should be five discovered eventually if the others exist.

    HH: There’s also a slightly different attitude to the research today. Before the Higgs boson was discovered, it was known that the Standard Model was theoretically inconsistent without the Higgs boson. It had to be there in some form. It wasn’t going to be that we ran the LHC and saw nothing—no Higgs boson and nothing else. This is called a no-lose theorem. Now, having discovered the Higgs boson, you cannot guarantee that additional new phenomena exist that must be discovered at the LHC. In other words, the Standard Model itself, with the Higgs boson, is a theoretically consistent theory. Nevertheless, not all fundamental phenomena can be explained by Standard Model physics (such as neutrino masses, dark matter and the gravitational force), so we know that new phenomena beyond the Standard Model must be present at some very high-energy scale. However, there is no longer a no-lose theorem stating that these new phenomena must appear at the energy scale probed at the LHC.

    S: How have the new capabilities of the LHC changed the game?

    SD: We have way more Higgs bosons; that’s really how it’s changed. Since the energy is higher we can potentially make heavier new particles.

    GK: There were about a million Higgs bosons produced in the first run of the LHC, and there will be more than twice that in the second run, but they can only find a small fraction of those in the detector because of background noise and some other things. It’s very hard. It takes clever experimenters. To find a couple of hundred Higgs you need to produce a million.

    CERN/LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    SD: Most of the time the Higgs decays into something we can’t see in our detector. But as the measurements get better and better, experimentalists who have been extracting the couplings are quantifying more properties of the Higgs decays. So instead of just counting how many Higgs bosons decay to two Z bosons, they will look at where the two Z bosons are in the detector or the energy of the Z bosons.

    S: Are there milestones you are looking forward to?

    GK: Confirming the Standard Model Higgs with even more precision. The decay the Higgs boson was discovered in—two photons—could happen in any other kind of particle. But the decay to W boson pairs is the one that you need for it to break the electroweak symmetry [a symmetry between the masses of the particles associated with the electromagnetic and weak forces], which is what it should do according to the Standard Model.

    SD: So, one of the things we will see a lot of in the next year or two is better measurements of the Higgs decay into the bottom quarks. Within a few years, we should learn whether or not there are more Higgs bosons. Measuring the couplings to the desired precision will take 20 years or more.

    JG: There’s another thing people are thinking about, which is how the Higgs can be connected to the important topic of dark matter. We are working on models that establish such a connection, but most of these models, of course, have extra Higgs bosons. It’s even possible that one of those extra Higgs bosons might be invisible dark matter. So the question is whether the Higgs we can see tells us something about dark matter Higgs bosons or other dark matter particles, such as the invisible particles that are present in supersymmetry.

    S: Are there other things still to learn?

    JG: There are many possible connections between Higgs bosons, in a generic sense, and the history of the universe. For example, it could be that a Higgs-like particle called the inflaton is responsible for the expansion of the universe. As a second example, generalized Higgs boson models could explain the preponderance of matter over antimatter in the current universe.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 12:14 pm on November 8, 2016 Permalink | Reply
    Tags: , , , Symmetry Magazine   

    From Symmetry: “The origins of dark matter” 

    Symmetry Mag
    Symmetry

    11/08/16
    Matthew R. Francis

    1
    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    Theorists think dark matter was forged in the hot aftermath of the Big Bang.

    Transitions are everywhere we look. Water freezes, melts, or boils; chemical bonds break and form to make new substances out of different arrangements of atoms.

    The universe itself went through major transitions in early times. New particles were created and destroyed continually until things cooled enough to let them survive.

    CMB per ESA/Planck

    Those particles include ones we know about, such as the Higgs boson or the top quark.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    But they could also include dark matter, invisible particles that we presently know only through their gravitational effects.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    In cosmic terms, dark matter particles could be a “thermal relic,” forged in the hot early universe and then left behind during the transitions to more moderate later eras. One of these transitions, known as “freeze-out,” changed the nature of the whole universe.

    The hot cosmic freezer

    On average, today’s universe is a pretty boring place. If you pick a random spot in the cosmos, it’s far more likely to be in intergalactic space than, say, the heart of a star or even inside an alien solar system. That spot is probably cold, dark and quiet.

    The same wasn’t true for a random spot shortly after the Big Bang.

    “The universe was so hot that particles were being produced from photons smashing into other photons, from photons hitting electrons, and from electrons hitting positrons and producing these very heavy particles,” says Matthew Buckley of Rutgers University.

    The entire cosmos was a particle-smashing party, but parties aren’t meant to last. This one lasted only a trillionth of a second. After that came the cosmic freeze-out.

    During the freeze-out, the universe expanded and cooled enough for particles to collide far less frequently and catastrophically.

    “One of these massive particles floating through the universe is finding fewer and fewer antimatter versions of itself to collide with and annihilate,” Buckley says.

    “Eventually the universe would get large enough and cold enough that the rate of production and the rate of annihilation basically go to zero, and you just get a relic abundance, these few particles that are floating out there lonely in space.”

    Many physicists think dark matter is a thermal relic, created in huge numbers before the cosmos was a half-second old and lingering today because it barely interacts with any other particle.

    A WIMPy miracle

    One reason to think of dark matter as a thermal relic is an interesting coincidence known as the “WIMP miracle.”

    WIMP stands for “weakly interacting massive particle,” and WIMPs are the most widely accepted candidates for dark matter. Theory says WIMPs are likely heavier than protons and interact via the weak force, or at least via interactions related to the weak force.

    The last bit is important, because freeze-out for a specific particle depends on what forces affect it and the mass of the particle. Thermal relics made by the weak force were born early in the universe’s history because particles need to be jammed in tight for the weak force, which only works across short distances, to be a factor.

    “If dark matter is a thermal relic, you can calculate how big the interaction [between dark matter particles] needs to be,” Buckley says.

    Both the primordial light known as the cosmic microwave background and the behavior of galaxies tell us that most dark matter must be slow-moving (“cold” in the language of physics). That means interactions between dark matter particles must be low in strength.

    “Through what is perhaps a very deep fact about the universe,” Buckley says, “that interaction turns out to be the strength of what we know as the weak nuclear force.”

    That’s the WIMP miracle: The numbers are perfect to make just the right amount of WIMPy matter.
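
    The arithmetic behind that coincidence fits in a few lines. A back-of-envelope sketch (the standard textbook approximation, not a calculation from the article):

    # Thermal relic abundance, rough rule of thumb:
    #   Omega * h^2  ~  (3e-27 cm^3/s) / <sigma v>
    sigma_v_weak = 3e-26    # cm^3/s, a typical weak-scale annihilation cross section
    omega_h2 = 3e-27 / sigma_v_weak
    print(f"predicted Omega h^2 ~ {omega_h2:.2f}")   # ~0.1, close to the ~0.12 observed

    Plug in a weak-force-scale interaction strength and out comes roughly the measured dark matter density.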

    The big catch, though, is that experiments haven’t found any WIMPs yet. It’s too soon to say WIMPs don’t exist, but it does rule out some of the simpler theoretical predictions about them.

    Ultimately, the WIMP miracle could just be a coincidence. Instead of the weak force, dark matter could involve a new force of nature that doesn’t affect ordinary matter strongly enough to detect. In that scenario, says Jessie Shelton of the University of Illinois at Urbana-Champaign, “you could have thermal freeze-out, but the freeze-out is of dark matter to some other dark field instead of [something in] the Standard Model.”

    In that scenario, dark matter would still be a thermal relic but not a WIMP.

    For Shelton, Buckley, and many other physicists, the dark matter search is still full of possibilities.

    “We have really compelling reasons to look for thermal WIMPs,” Shelton says. “It’s worth remembering that this is only one tiny corner of a much broader space of possibilities.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     