Tagged: SLAC

  • richardmitnick 7:48 pm on September 28, 2017 Permalink | Reply
    Tags: SLAC, Stanford PULSE Institute

    From SLAC: “A Potential New and Easy Way to Make Attosecond Laser Pulses: Focus a Laser on Ordinary Glass” 


    SLAC Lab

    September 28, 2017
    Glennda Chui

    1
    In this illustration, a near-infrared laser beam hits a piece of ordinary glass and triggers a process called high harmonic generation. It produces laser light pulses (top right) that are just billionths of a billionth of a second, or attoseconds, long, and the photons in those pulses are much higher energy than those in the original beam. The insets zoom in on how this happens. When the incoming laser light knocks electrons (e-) out of atoms in the glass, they fly away, loop back and reconnect with either their home atom (lower right) or a neighboring atom (upper left). These reconnections generate bright bursts of light, forming a “train” of attosecond pulses that leaves the glass and can be used to probe electron movements in solids. (Greg Stewart/SLAC National Accelerator Laboratory)

    This novel method could shrink the equipment needed to make laser pulses that are billionths of a billionth of a second long for studying ultra-speedy electron movements in solids, chemical reactions and future electronics.

    The discovery 30 years ago that laser light can be boosted to much higher energies and shorter pulses – just billionths of a billionth of a second, or attoseconds, long – is the basis of attosecond science, where researchers observe and try to control the movements of electrons. Electrons are key players in chemical reactions, biological processes, electronics, solar cells and other technologies, and only pulses this short can make snapshots of their incredibly swift moves.

    Now scientists from the Stanford PULSE Institute at the Department of Energy’s SLAC National Accelerator Laboratory have found a potential new way to make attosecond laser pulses using ordinary glass – in this case, the cover slip from a microscope slide.

    The discovery, reported in Nature Communications today, was a real surprise and opens new possibilities for attosecond science and technology, including the ability to probe ultra-speedy electron motions inside glasses and other solid materials. It could also dramatically shrink the size and cost of the setups needed to produce these tiny pulses, to the point where you might be able to generate pulses inside a fiber optic cable that delivers them to where they’re needed.

    “With today’s methods, you have to shine the laser beam through a special gas jet or through a crystal that has to be grown with great care at ultra-cold temperatures,” said Yong Sing You, a postdoctoral researcher at PULSE and lead author of the study. “But this is exciting because you can use everyday glass, which is cheap and easily available, at room temperature. If you were to put your eyeglasses into the experiment, it would still work, and it would not even damage the glasses.”

    2
    Postdoctoral researcher Yong Sing You, left, and staff scientist Shambhu Ghimire in the PULSE laser lab at SLAC where the experiments were carried out. (Chris Smith/SLAC National Accelerator Laboratory)

    A String of Surprises

    The process that generates attosecond laser pulses is called high harmonic generation, or HHG. Much like pressing on a guitar string produces a note that’s higher in pitch, shining laser light through certain materials changes the nature of the light, shifting it to higher energies and shorter pulses than a laser can reach on its own.

    Most of the time this is done in a gas. Incoming photons, or particles of light, from the laser hit atoms in the gas and liberate some of their electrons. The freed electrons fly away, loop back and reconnect with their home atoms. This reconnection generates attosecond bursts of light that combine to form an attosecond laser pulse.
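
    To get a feel for the energy up-conversion involved, here is a rough back-of-the-envelope sketch in Python (not from the paper) that converts a drive-laser wavelength to photon energy and lists the photon energies of a few harmonic orders. The 800-nanometer near-infrared wavelength and the specific odd orders are illustrative assumptions, not values reported in the study.

        # Rough photon-energy arithmetic for high harmonic generation (illustrative only).
        H_C = 1239.84  # eV * nm, Planck's constant times the speed of light

        drive_wavelength_nm = 800.0                    # assumed near-infrared drive laser
        fundamental_ev = H_C / drive_wavelength_nm     # about 1.55 eV per photon

        for order in (3, 9, 15, 21):                   # odd harmonic orders, chosen for illustration
            harmonic_ev = order * fundamental_ev
            harmonic_nm = drive_wavelength_nm / order
            print(f"harmonic {order}: {harmonic_ev:.1f} eV ({harmonic_nm:.0f} nm)")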

    Starting in 2010, a series of experiments led by PULSE researchers Shambhu Ghimire and David Reis showed HHG can be produced in ways that were previously thought unlikely or even impossible: by beaming laser light into a crystal, frozen argon gas or an atomically thin semiconductor material.

    Unlike a gas, whose atoms are so far apart that you can think of them as behaving independently, atoms in a solid are so close together that scientists thought electrons freed by an incoming laser pulse would hit neighboring atoms, scatter and never return home to make that crucial reconnection. But it turned out this was not the case, Reis said: “There’s something about the orderly structure of the crystal that allows electrons to move throughout the lattice in a way that doesn’t dissipate their energy or give them a kick in some other direction. Even if they connect with a neighboring atom, they can still participate in HHG.”

    Fundamental Science with Practical Potential

    The fact that glass could generate HHG was also a surprise, said Ghimire, who helped lead the latest study. Because it’s amorphous, meaning that its silicon and oxygen atoms are arranged in no particular order, it did not seem like a good candidate.

    But glass’s random nature was just what the team needed to answer the fundamental scientific question at the heart of the study: How do the density and crystallinity of a material – the degree to which its atoms are arranged in an orderly lattice – independently affect its ability to produce HHG? A piece of glass and a quartz crystal are both made of silicon and oxygen, and they’re roughly the same density; only the arrangement of their atoms is different. So comparing the two should provide some answers.

    The scientists put the glass cover slip in their apparatus and hit it with pulses from their infrared laser beam.

    “You might think, again, that this wouldn’t work, because the electrons would bounce off their neighbors and never make it back home,” said Reis, who was not involved in the current paper. “But the surprising thing is that even in glass, if you hit the glass hard enough but not so hard that you break it, it works fine, although by a slightly different process.”

    The ability to produce HHG in glass and other solids is exciting, he said, because it has the potential to shrink the equipment needed to do this from the size of a lab bench to maybe just a few nanometers – billionths of a meter – in size.

    Ghimire added that producing harmonics in glass has potential technological applications. For instance, it produces the short wavelengths of laser light needed to design masks for patterning nanometer-scale features on semiconductor chips.

    “For this, they want as much intensity as possible, and also an easy way to deliver light to their samples,” he said. “Being able to produce short-wavelength laser light in normal glass would bring us a couple of steps closer to something they could actually use. We could even generate the short-wavelength light in the glass portion of optical fibers that then deliver it to wherever they wanted it.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 3:02 pm on August 30, 2017 Permalink | Reply
    Tags: Artificial Intelligence Analyzes Gravitational Lenses 10 Million Times Faster, SLAC

    From SLAC: “Artificial Intelligence Analyzes Gravitational Lenses 10 Million Times Faster” 


    SLAC Lab

    August 30, 2017
    Andrew Gordon
    agordon@slac.stanford.edu
    (650) 926-2282
    Written by Manuel Gnida

    SLAC and Stanford researchers demonstrate that brain-mimicking ‘neural networks’ can revolutionize the way astrophysicists analyze their most complex data, including extreme distortions in spacetime that are crucial for our understanding of the universe.

    Researchers from the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University have for the first time shown that neural networks – a form of artificial intelligence – can accurately analyze the complex distortions in spacetime known as gravitational lenses 10 million times faster than traditional methods.

    Gravitational Lensing NASA/ESA

    “Analyses that typically take weeks to months to complete, that require the input of experts and that are computationally demanding, can be done by neural nets within a fraction of a second, in a fully automated way and, in principle, on a cell phone’s computer chip,” said postdoctoral fellow Laurence Perreault Levasseur, a co-author of a study published today in Nature.

    1
    KIPAC scientists have for the first time used artificial neural networks to analyze complex distortions in spacetime, called gravitational lenses, demonstrating that the method is 10 million times faster than traditional analyses. (Greg Stewart/SLAC National Accelerator Laboratory)

    Lightning Fast Complex Analysis

    The team at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC), a joint institute of SLAC and Stanford, used neural networks to analyze images of strong gravitational lensing, where the image of a faraway galaxy is multiplied and distorted into rings and arcs by the gravity of a massive object, such as a galaxy cluster, that’s closer to us. The distortions provide important clues about how mass is distributed in space and how that distribution changes over time – properties linked to invisible dark matter that makes up 85 percent of all matter in the universe and to dark energy that’s accelerating the expansion of the universe.

    Until now this type of analysis has been a tedious process that involves comparing actual images of lenses with a large number of computer simulations of mathematical lensing models. This can take weeks to months for a single lens.

    But with the neural networks, the researchers were able to do the same analysis in a few seconds, which they demonstrated using real images from NASA’s Hubble Space Telescope and simulated ones.

    To train the neural networks in what to look for, the researchers showed them about half a million simulated images of gravitational lenses for about a day. Once trained, the networks were able to analyze new lenses almost instantaneously with a precision that was comparable to traditional analysis methods. In a separate paper, submitted to The Astrophysical Journal Letters, the team reports how these networks can also determine the uncertainties of their analyses.
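
    As an illustration of the training setup described above, here is a minimal sketch of a convolutional network that regresses lens-model parameters directly from an image. It is written in Python with PyTorch purely as an example; the framework, architecture, image size and parameter count are assumptions, not details taken from the paper, and the data below are random stand-ins for the half-million simulated lens images.

        import torch
        import torch.nn as nn

        class LensNet(nn.Module):
            """Toy convolutional network mapping a lens image to a few lens-model parameters."""
            def __init__(self, n_params=5):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
                    nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
                    nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.head = nn.Linear(64, n_params)   # e.g. Einstein radius, ellipticity, ...

            def forward(self, x):
                return self.head(self.features(x).flatten(1))

        model = LensNet()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()

        # One illustrative training step on fake data standing in for simulated lens images.
        images = torch.randn(32, 1, 96, 96)      # batch of simulated lens images (random here)
        true_params = torch.randn(32, 5)         # corresponding lens-model parameters (random here)
        loss = loss_fn(model(images), true_params)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    Once such a network is trained on many simulated systems, evaluating it on a new image is a single forward pass, which is why the analysis time drops from weeks of model fitting to a fraction of a second.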

    Prepared for Data Floods of the Future

    “The neural networks we tested – three publicly available neural nets and one that we developed ourselves – were able to determine the properties of each lens, including how its mass was distributed and how much it magnified the image of the background galaxy,” said the study’s lead author Yashar Hezaveh, a NASA Hubble postdoctoral fellow at KIPAC.

    This goes far beyond recent applications of neural networks in astrophysics, which were limited to solving classification problems, such as determining whether an image shows a gravitational lens or not.

    The ability to sift through large amounts of data and perform complex analyses very quickly and in a fully automated fashion could transform astrophysics in a way that is much needed for future sky surveys that will look deeper into the universe – and produce more data – than ever before.

    The Large Synoptic Survey Telescope (LSST), for example, whose 3.2-gigapixel camera is currently under construction at SLAC, will provide unparalleled views of the universe and is expected to increase the number of known strong gravitational lenses from a few hundred today to tens of thousands.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

    “We won’t have enough people to analyze all these data in a timely manner with the traditional methods,” Perreault Levasseur said. “Neural networks will help us identify interesting objects and analyze them quickly. This will give us more time to ask the right questions about the universe.”

    3
    KIPAC researchers used images of strongly lensed galaxies taken with the Hubble Space Telescope to test the performance of neural networks, which promise to speed up complex astrophysical analyses tremendously. (Yashar Hezaveh/Laurence Perreault Levasseur/Phil Marshall/Stanford/SLAC National Accelerator Laboratory; NASA/ESA)

    A Revolutionary Approach

    Neural networks are inspired by the architecture of the human brain, in which a dense network of neurons quickly processes and analyzes information.

    In the artificial version, the “neurons” are single computational units that are associated with the pixels of the image being analyzed. The neurons are organized into layers, up to hundreds of layers deep. Each layer searches for features in the image. Once the first layer has found a certain feature, it transmits the information to the next layer, which then searches for another feature within that feature, and so on.

    “The amazing thing is that neural networks learn by themselves what features to look for,” said KIPAC staff scientist Phil Marshall, a co-author of the paper. “This is comparable to the way small children learn to recognize objects. You don’t tell them exactly what a dog is; you just show them pictures of dogs.”

    But in this case, Hezaveh said, “It’s as if they not only picked photos of dogs from a pile of photos, but also returned information about the dogs’ weight, height and age.”

    3
    Scheme of an artificial neural network, with individual computational units organized into hundreds of layers. Each layer searches for certain features in the input image (at left). The last layer provides the result of the analysis. The researchers used particular kinds of neural networks, called convolutional neural networks, in which individual computational units (neurons, gray spheres) of each layer are also organized into 2-D slabs that bundle information about the original image into larger computational units. (Greg Stewart/SLAC National Accelerator Laboratory)

    Although the KIPAC scientists ran their tests on the Sherlock high-performance computing cluster at the Stanford Research Computing Center, they could have done their computations on a laptop or even on a cell phone, they said. In fact, one of the neural networks they tested was designed to work on iPhones.

    “Neural nets have been applied to astrophysical problems in the past with mixed outcomes,” said KIPAC faculty member Roger Blandford, who was not a co-author on the paper. “But new algorithms combined with modern graphics processing units, or GPUs, can produce extremely fast and reliable results, as the gravitational lens problem tackled in this paper dramatically demonstrates. There is considerable optimism that this will become the approach of choice for many more data processing and analysis problems in astrophysics and other fields.”

    Part of this work was funded by the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 4:44 pm on July 26, 2017 Permalink | Reply
    Tags: SLAC, SLAC’s ultrafast 'electron camera' reveals unusual atomic motions that could be crucial for the efficiency of next-generation perovskite solar cells

    From SLAC: “Atomic Movies May Help Explain Why Perovskite Solar Cells Are More Efficient” 


    SLAC Lab

    July 26, 2017
    Andrew Gordon
    agordon@slac.stanford.edu
    (650) 926-2282

    SLAC’s ultrafast ‘electron camera’ reveals unusual atomic motions that could be crucial for the efficiency of next-generation perovskite solar cells.

    1
    According to a new SLAC study, atoms in perovskites respond to light with unusual rotational motions and distortions that could explain the high efficiency of these next-generation solar cell materials. (Greg Stewart/SLAC National Accelerator Laboratory.)

    In recent years, perovskites have taken the solar cell industry by storm. They are cheap, easy to produce and very flexible in their applications. Their efficiency at converting light into electricity has grown faster than that of any other material – from under four percent in 2009 to over 20 percent in 2017 – and some experts believe that perovskites could eventually outperform the most common solar cell material, silicon. But despite their popularity, researchers don’t know why perovskites are so efficient.

    Now experiments with a powerful “electron camera” at the Department of Energy’s SLAC National Accelerator Laboratory have discovered that light whirls atoms around in perovskites, potentially explaining the high efficiency of these next-generation solar cell materials and providing clues for making better ones.

    “We’ve taken a step toward solving the mystery,” said Aaron Lindenberg from the Stanford Institute for Materials and Energy Sciences (SIMES) and the Stanford PULSE Institute for ultrafast science, which are jointly operated by Stanford University and SLAC. “We recorded movies that show that certain atoms in a perovskite respond to light within trillionths of a second in a very unusual manner. This may facilitate the transport of electric charges through the material and boost its efficiency.”

    The study was published today in Science Advances.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 12:58 pm on July 19, 2017 Permalink | Reply
    Tags: New Droplet-on-Tape Method Assists Biochemical Research at X-Ray Lasers, SLAC

    From SLAC: “New Droplet-on-Tape Method Assists Biochemical Research at X-Ray Lasers” 


    SLAC Lab

    February 27, 2017 [Never saw this one before]

    1
    Acoustic droplet ejection allows scientists to deposit nanoliters of sample directly into the X-ray beam, considerably increasing the efficiency of sample consumption. A femtosecond pulse from an X-ray free-electron laser then intersects with a droplet that contains protein crystals. (SLAC National Accelerator Laboratory)

    SLAC/LCLS

    2
    As the drops move forward, they are hit with pulses of visible light or treated with oxygen gas, which triggers different chemical reactions depending on the sample studied. (SLAC National Accelerator Laboratory)

    Biological samples studied with intense X-rays at free-electron lasers are destroyed within nanoseconds after they are exposed. Because of this, the samples need to be continually refreshed to allow the many images needed for an experiment to be obtained. Conventional methods use jets that supply a continuous stream of samples, but this can be very wasteful as the X-rays only interact with a tiny fraction of the injected material.

    To help address this issue, scientists at the Department of Energy’s Lawrence Berkeley National Laboratory, SLAC National Accelerator Laboratory, Brookhaven National Laboratory, and other institutes designed a new assembly-line system that rapidly replaces exposed samples by moving droplets along a miniature conveyor belt, timed to coincide with the arrival of the X-ray pulses.

    The droplet-on-tape system now allows the team to study biochemical reactions in real time, over timescales ranging from microseconds to seconds, revealing the stages of these complex reactions.

    In their approach, protein solution or crystals are precisely deposited in tiny liquid drops, formed as ultrasound waves push the liquid onto a moving tape. As the drops move forward, they are hit with pulses of visible light or treated with oxygen gas, which triggers different chemical reactions depending on the sample studied. This allows the study of processes such as photosynthesis, by which plants absorb light from the sun and convert it into usable energy.
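
    The delay range mentioned above (microseconds to seconds) follows from simple geometry: how far upstream of the X-ray focus a drop is excited, divided by the tape speed. The Python sketch below illustrates that arithmetic; the tape speed and distances are made-up numbers, and the 120-pulses-per-second repetition rate is the LCLS figure quoted elsewhere on this page, not a parameter taken from the Nature Methods paper.

        # Illustrative timing arithmetic for droplet-on-tape sample delivery (numbers are assumptions).
        xray_rep_rate_hz = 120.0        # LCLS pulse rate; drops are deposited in step with the pulses
        tape_speed_mm_s = 300.0         # assumed tape speed

        droplet_spacing_mm = tape_speed_mm_s / xray_rep_rate_hz   # spacing so each pulse meets a fresh drop
        print(f"droplet spacing: {droplet_spacing_mm:.2f} mm")

        # The pump-probe delay is set by where along the tape the drop is excited or gassed:
        for distance_mm in (0.3, 3.0, 30.0, 300.0):
            delay_s = distance_mm / tape_speed_mm_s
            print(f"excite {distance_mm:6.1f} mm upstream of the X-rays -> delay {delay_s:8.4f} s")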

    Finally, powerful X-ray pulses from SLAC’s X-ray laser, the Linac Coherent Light Source (LCLS), probe the drops. In this study, published in Nature Methods, the X-ray light scattered from the sample onto two different detectors simultaneously, one for X-ray crystallography and the other for X-ray emission spectroscopy. These two complementary methods provide information about the geometric and electronic structure of the proteins’ catalytic sites, and allowed the researchers to watch with atomic precision how the protein structures changed during the reaction.

    Below, see the conveyor belt in action at LCLS, a Department of Energy Office of Science User Facility.

    3
    Droplet-on-tape conveyor belt system delivers samples at the Linac Coherent Light Source (LCLS). (SLAC National Accelerator Laboratory)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 5:34 pm on July 5, 2017 Permalink | Reply
    Tags: Getting ready for LCLS-II, SLAC

    From SLAC: “SLAC’s Electron Hub Gets New ‘Metro Map’ for World’s Most Powerful X-Ray Laser” 


    SLAC Lab

    July 5, 2017
    Manuel Gnida

    1
    A reconfiguration of SLAC’s historic Beam Switch Yard (BSY) will include electron transport lines needed for LCLS-II, a major upgrade to the Linac Coherent Light Source (LCLS) X-ray laser. (Greg Stewart/SLAC National Accelerator Laboratory)

    The central hub for powerful electron beams at the Department of Energy’s SLAC National Accelerator Laboratory is getting a makeover to prepare for the installation of LCLS-II – a major upgrade to the Linac Coherent Light Source (LCLS), the world’s first hard X-ray free-electron laser. LCLS-II will deliver the most powerful X-rays ever made in a lab, with beams that are 10,000 times brighter than before, opening up unprecedented research opportunities in chemistry, materials science, biology and energy research.

    2
    Central portion of the BSY before (left) and after the Reconfiguration Project. (Scott DeBarger/SLAC National Accelerator Laboratory)

    A Monumental Clean-up Operation

    To clear the path for LCLS-II, crews first had to remove all unnecessary materials from the BSY – a monumental task considering SLAC’s rich history in accelerator science and the legacy material it created.

    “When experiments end, most of the old equipment is typically left in place,” says SLAC’s Mark Woodley, an optics designer involved in the BSY Reconfiguration Project. “Only the things that are in the way of new experiments are taken out.”

    In its early days in the 1960s, the linac delivered electron beams to three experimental stations. There was one line going straight into the lab’s research yard. Today this line continues to the LCLS undulator. Pulsed magnets in the BSY could divert the beam into End Stations A and B via two beamlines that branched off the central line.

    In 1980, two more branches were added to feed electrons and positrons, the antiparticle siblings of electrons, into the two storage rings of the PEP accelerator (PEP-II from 1999). In 1987, another two branches were needed to deliver beams to the two arms of the Stanford Linear Collider (SLC).

    Most of the old materials left behind in the BSY by these experiments have now been cleared – a job that took 300 employees and subcontractors almost 24,000 hours of work in the period from December 2016 to May 2017. They removed 325 cubic yards, or about 24 tons, of material – enough to fill eight sea-land shipping containers – and more than 300,000 feet of cables.

    “Considering the monumental task we had ahead of us, it’s truly impressive how well this project went,” DeBarger says. “It involved many people from inside and outside the lab, and every single one of them was absolutely needed.”

    Building the Future of X-ray Science

    After clearing out the BSY, members of the Reconfiguration Project installed a new beamline that runs from the copper linac to the current LCLS undulator. In parallel, the system to extract electrons for the End Station A line was put in place by another project team.

    “We also installed the very first LCLS-II beam pipe at the end of a ‘muon shield’ that is constructed of 5- and 10-ton steel blocks and shields the beam transport hall downstream of the BSY, allowing access while beams are tuned in the BSY,” says Dean Hanquist, control account manager on Chan’s team.

    “In the end, we had to make sure that everything works properly again for LCLS, which has now resumed its experimental program,” says BSY Area Physicist Tonee Smith. “For example, all of the magnets used in the beamline to focus the electron beam and make small corrections to it were refurbished, and we had to remeasure and test them.”

    The remaining beamlines and junctions will be installed during a yearlong LCLS downtime, which will start in the summer of 2018. Once completed, the new BSY “metro system” will be ready to transport electron trains to the new X-ray laser facility, where they will power groundbreaking X-ray science for years to come.

    4
    Crew members gather at the conclusion of the BSY Reconfiguration Project. (Dawn Harmer/SLAC National Accelerator Laboratory)

    For questions or comments, contact the SLAC Office of Communications at communications@slac.stanford.edu.

    5
    The interior of the east portion of the Beam Switch Yard (BSY) showing three “tracks” that electrons accelerated in SLAC’s linear accelerator can be directed into. All of the beams for LCLS and LCLS-II are sent through the central tunnel. In early 2017, as part of the LCLS-II project, the steel Muon Shield was reconfigured to permit installation of a new beamline that will transport beams to a new Soft X-Ray Undulator. (Chris Smith/SLAC National Accelerator Laboratory)

    6
    Workers install the shield pipe that will position and protect the LCLS-II vacuum chamber within the Muon Shield. (Chris Smith/SLAC National Accelerator Laboratory)

    7
    Surveyors Bryan Rutledge and Francis Gaudreault measure the position of the LCLS beamline prior to its disassembly. (Chris Smith/SLAC National Accelerator Laboratory)

    8
    Mechanical Engineer Alev Ibrahimov, left, and Transport Systems CAM Dean Hanquist inspect the LCLS-II installation location in Sector 30. (Dawn Harmer/SLAC National Accelerator Laboratory)

    9
    Rigger Scot Johnson positions a movable hoist. (Chris Smith/SLAC National Accelerator Laboratory)

    10
    A crane removes the D-10 Tune-up Dump. This dump has five apertures, visible at the end of the device, which over the years allowed beams to head to various downstream experimental areas including LCLS, End Station A, End Station B and SPEAR. (Chris Smith/SLAC National Accelerator Laboratory)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 12:35 pm on June 17, 2017 Permalink | Reply
    Tags: Axion theory, Helen Quinn and Roberto Peccei, Peccei-Quinn symmetry, SLAC

    From Quanta: “Roberto Peccei and Helen Quinn, Driving Around Stanford in a Clunky Jeep” 

    Quanta Magazine
    Quanta Magazine

    June 15, 2017
    Thomas Lin
    Olena Shmahalo, Art Director
    Lucy Reading-Ikkanda, graphics

    1
    Ryan Schude for Quanta Magazine
    Helen Quinn and Roberto Peccei walking toward Stanford University’s new science and engineering quad. Behind them is the main quad, the oldest part of the campus. “If you look at a campus map,” said Quinn, who along with Peccei proposed Peccei-Quinn symmetry, “you will see the axis that goes through the middle of both quadrangle areas. We are on that line between the two.”

    Four decades ago, Helen Quinn and Roberto Peccei took on one of the great problems in theoretical particle physics: the strong charge-parity (CP) problem. Why does the symmetry between matter and antimatter break in weak interactions, which are responsible for nuclear decay, but not in strong interactions, which hold matter together?

    “The academic year 1976-77 was particularly exciting for me because Helen Quinn and Steven Weinberg were visiting the Stanford department of physics,” Peccei told Quanta in an email. “Helen and I had similar interests and we soon started working together.”

    Encouraged by Weinberg, who would go on to win a Nobel Prize in physics in 1979 for his work on the unification of electroweak interactions, Quinn and Peccei zeroed in on a CP-violating interaction whose strength can be characterized by an angular variable, theta. They knew theta had to be small, but no one had an elegant mechanism for explaining its smallness.

    “Steve liked to discuss physics over lunch, and Helen and I often joined him,” Peccei said. “Steve invariably brought up the theta problem in our lunch discussions, urging us to find a natural solution for why it was so small.”

    Quinn said by email that she and Peccei knew two things: The problem goes away if any quarks have zero mass (which seems to make theta irrelevant), and “in the very early hot universe all the quarks have zero mass.” They wondered how it could be that “theta is irrelevant in the early universe but matters once it cools enough that the quarks get their masses?”

    They proceeded to draft a “completely wrong paper based on conclusions we drew from this set of facts,” Quinn said. They went to Weinberg, whose comments helped clarify their thinking and, she said, “put us on the right track.”

    They realized they could naturally arrive at a zero value for theta by requiring a new symmetry, now known as the Peccei-Quinn mechanism. Besides being one of the popular proposed solutions to the strong CP problem, Peccei-Quinn symmetry also predicts the existence of a hypothetical “axion” particle, which has become a mainstay in theories of supersymmetry and cosmic inflation and has been proposed as a candidate for dark matter.
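
    Schematically, and in standard textbook notation rather than anything quoted from the Quanta piece, the offending term and the Peccei-Quinn fix can be summarized as

        \mathcal{L}_{\theta} = \theta \, \frac{g_s^{2}}{32\pi^{2}} \, G^{a}_{\mu\nu} \tilde{G}^{a\,\mu\nu},
        \qquad
        V(a) \propto 1 - \cos\!\left(\theta + \frac{a}{f_a}\right).

    Promoting theta to a dynamical field a(x) (the axion, with decay constant f_a) means the potential V(a) is minimized where theta + a/f_a = 0, so the effective CP-violating angle relaxes to zero and the leftover oscillations of the field appear as a new light particle.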

    2
    Peccei and Quinn discussing their proposed symmetry with the aid of a sombrero. Ryan Schude for Quanta Magazine

    That year at Stanford, Quinn and Peccei regularly interacted with the theory group at the Stanford Linear Accelerator Center (SLAC) as well as with another group from the University of California, Santa Cruz.

    “We formed a large and active group of theorists, which created a wonderful atmosphere of open discussion and collaboration,” Quinn said, adding that she recalls “riding with Roberto back and forth from Stanford to SLAC in his yellow and clunky Jeep, talking physics ideas as we went.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 9:29 pm on June 7, 2017 Permalink | Reply
    Tags: Belle, SLAC, Vera Lüth

    From SLAC: Women in STEM – “Q&A: SLAC’s Vera Lüth Discusses the Search for New Physics” 


    SLAC Lab

    June 7, 2017
    Manuel Gnida

    4
    Vera Lüth, professor emerita of experimental particle physics at SLAC. (Dawn Harmer/SLAC National Accelerator Laboratory)

    Data from BABAR, Belle and LHCb experiments hint at phenomena beyond the Standard Model of particle physics.

    SLAC BABAR

    1
    An electron-positron annihilation producing a pair of B mesons as recorded by the BABAR detector at the PEP-II storage rings. Among the reconstructed curved particle tracks is a muon (bottom left). The direction of the associated anti-neutrino (dashed arrow) is identified as missing momentum. Both particles originate from the same B-meson decay. (SLAC National Accelerator Laboratory)

    KEK Belle detector, at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, Ibaraki Prefecture, Japan

    CERN LHCb chamber, LHC

    The Standard Model of particle physics describes the properties and interactions of the constituents of matter.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The development of this theory began in the early 1960s, and in 2012 the last piece of the puzzle was solved by the discovery of the Higgs boson at the Large Hadron Collider (LHC) at CERN in Switzerland.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Experiments have confirmed time and again the Standard Model’s very accurate predictions.

    Yet, researchers have reasons to believe that physics beyond the Standard Model exists and should be found. For instance, the Standard Model does not explain why matter dominates over antimatter in the universe. It also does not provide clues about the nature of dark matter – the invisible substance that is five times more prevalent than the regular matter we observe.

    In this Q&A, particle physicist Vera Lüth discusses scientific results that potentially hint at physics beyond the Standard Model. The professor emerita of experimental particle physics at the Department of Energy’s SLAC National Accelerator Laboratory is co-author of a review article published today in Nature that summarizes the findings of three experiments: BABAR at SLAC, Belle in Japan and LHCb at CERN.

    What are the hints of new physics that you describe in your article?

    The hints originate from studies of an elementary particle, known as the B meson – an unstable particle produced in the collision of powerful particle beams. More precisely, these studies looked at decays of the B meson that involve leptons – electrically charged elementary particles and their associated neutrinos. There are three charged leptons: the electron, a critical component of atoms discovered in 1897; the muon, first observed in cosmic rays in 1937; and the much heavier tau, discovered at the SPEAR electron-positron (e+e-) storage ring at SLAC in 1975 by Martin Perl.

    Due to their very different masses, the three leptons also have very different lifetimes. The electron is stable, whereas the muon and tau decay in a matter of microseconds and a fraction of a picosecond, respectively. A fundamental assumption of the Standard Model is that the interactions of the three charged leptons are the same if their different masses and lifetimes are taken into account.

    Over many years, different experiments have tested this assumption – referred to as “lepton universality” – and to date no definite violation of this rule has been observed. We now have indications that the rates for B meson decays involving tau leptons are larger than expected compared to the measured rates of decays involving electrons or muons, taking into account the differences in mass. This observation would violate lepton universality, a fundamental assumption of the Standard Model.

    What does a violation of the Standard Model actually mean?

    It means that there is evidence for phenomena that we cannot explain in the context of the Standard Model. If such a phenomenon is firmly established, the Standard Model needs to be extended – by introducing new fundamental particles and also new interactions related to these particles.

    In recent years, searches for fundamentally new phenomena have relied on high-precision measurements to detect deviations from Standard Model predictions or on searches for new particles or interactions with properties that differ from known ones.

    What exactly are the BABAR, Belle and LHCb experiments?

    They are three experiments that have challenged lepton universality.

    Belle and BABAR were two experiments specifically designed to study B mesons with unprecedented precision – particles that are five times heavier than the proton and contain a bottom or b quark. These studies were performed at e+e- storage rings that are commonly referred to as B factories and operate at colliding-beam energies just high enough to produce a pair of B mesons, and no other particle. BABAR operated at SLAC’s PEP-II from 1999 to 2008, Belle at KEKB in Japan from 1999 to 2010. The great advantage of these experiments is that the B mesons are produced pairwise, each decaying into lighter particles – on average five charged particles and a similar number of photons.

    The LHCb experiment is continuing to operate at the proton-proton collider LHC, with collision energies that exceed those of the B factories by more than a factor of 1,000. At this higher energy, B mesons are produced at a much larger rate than at B factories. However, at each crossing of the beams, hundreds of other particles are produced in addition to B mesons. This feature tremendously complicates the identification of B meson decays.

    To study lepton universality, all three experiments focus on B decays involving a charged lepton and an associated neutrino. A neutrino doesn’t leave a trace in the detector, but its presence is detected as missing energy and momentum in an individual B decay.

    What evidence do you have so far for a potential violation of lepton universality?

    All three experiments have identified specific B meson decays and have compared the rates of decays involving an electron or muon to those involving the higher mass tau lepton. All three experiments observe higher-than-expected decay rates for the decays with a tau. The average value of the reported results, taking into account the statistical and systematic uncertainties, exceeds the Standard Model expectation by four standard deviations.
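
    As a concrete illustration of how such results are combined, the Python sketch below forms an inverse-variance-weighted average of several measurements of a decay-rate ratio and expresses its distance from a predicted value in standard deviations. Every number is a placeholder, not the actual BABAR, Belle or LHCb values, and the real combination also has to account for correlations between measurements.

        # Toy combination of measurements of a decay-rate ratio (all values are made up).
        measurements = [(0.32, 0.02), (0.33, 0.03), (0.31, 0.02)]   # (value, total uncertainty)
        sm_expectation = 0.27                                       # placeholder prediction

        weights = [1.0 / sigma**2 for _, sigma in measurements]
        average = sum(w * value for (value, _), w in zip(measurements, weights)) / sum(weights)
        avg_sigma = (1.0 / sum(weights)) ** 0.5

        significance = (average - sm_expectation) / avg_sigma
        print(f"weighted average = {average:.3f} +/- {avg_sigma:.3f}")
        print(f"deviation from expectation = {significance:.1f} standard deviations")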

    This enhancement is intriguing, but not considered sufficient to unambiguously establish a violation of lepton universality. To claim a discovery, particle physicists generally demand a significance of at least five standard deviations. However, the fact that this enhancement was detected by three experiments, operating in very different environments, deserves attention. Nevertheless, more data will be needed, and are expected in the not too distant future.

    What was your role in this research?

    As the technical coordinator of the BABAR collaboration during the construction of the detector, I was the liaison between the physicists and the engineering teams, supported by the BABAR project management team at SLAC. With more than 500 BABAR members from 11 countries, this was a challenging task, but with the combined expertise and dedication of the collaboration the detector was completed and ready to take data in four years.

    Once data became available, I rejoined SLAC’s Research Group C and took over its leadership from Jonathan Dorfan. As convener of the physics working group on B decays involving leptons, I coordinated various analyses by scientists from different external groups, among them SLAC postdocs and graduate students, and helped to develop the analysis tools needed for precision measurements.

    Almost 10 years ago, we started updating an earlier analysis performed under the leadership of Jeff Richman of the University of California, Santa Barbara on B decays involving tau leptons and extended it to the complete BABAR data set. This resulted in the surprisingly large decay rate. The analysis was the topic of the PhD thesis of my last graduate student, Manuel Franco Sevilla, who over the course of four years made a number of absolutely critical contributions that significantly improved the precision of this measurement, and thereby enhanced its significance.

    What keeps you excited about particle physics?

    Over the past 50 years that I have been working in particle physics, I have witnessed enormous progress in theory and experiments leading to our current understanding of matter’s constituents and their interactions at the most fundamental level. But there are still many unanswered questions, from very basic ones like “Why do particles have certain masses and not others?” to questions about the grand scale of things, such as “What is the origin of the universe, and is there more than one?”

    Lepton universality is one of the Standard Model’s fundamental assumptions. If it were violated, unexpected new physics processes must exist. This would be a major breakthrough – even more surprising than the discovery of the Higgs boson, which was predicted to exist many decades ago.

    What results do you expect in the near future?

    There is actually a lot going on in the field. LHCb researchers are collecting more data and will try to find out if lepton universality is indeed violated. My guess is that we should know the answer by the end of this year. A confirmation will be a great event and will undoubtedly trigger intense experimental and theoretical research.

    At present we do not understand the origin of the observed enhancement. We first assumed that it could be related to a charged partner of the Higgs boson. Although the observed features did not match the expectations, an extension of the Higgs model could do so. Another possible explanation that can neither be confirmed nor excluded is the presence of so-called leptoquarks. These open questions will remain a very exciting topic that needs to be addressed by experiments and theoretical work.

    Recently, LHCb scientists have reported an interesting result indicating that certain B meson decays more often include an electron pair than a muon pair. However, the significance of this new finding is only about 2.6 standard deviations, so it’s too early to draw any conclusions. BABAR and Belle have not confirmed this observation.

    At the next-generation B factory, Super-KEKB in Japan, the new Belle II experiment is scheduled to begin its planned 10-year research program in 2018. The expected very large new data sets will open up many opportunities for searches for these and other indications of physics beyond the Standard Model.

    4
    Super-KEKB in Japan

    5
    Belle II at the SuperKEKB accelerator complex at KEK in Tsukuba, Ibaraki Prefecture, Japan

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 8:05 am on April 27, 2017 Permalink | Reply
    Tags: Paul Fuoss, SLAC

    From SLAC: “Where Scientist Meets Machine: A Fresh Approach to Experimental Design at SLAC X-Ray Laser” 


    SLAC Lab

    April 26, 2017
    Glennda Chui

    1
    Paul Fuoss, the new head of experimental design at SLAC’s Linac Coherent Light Source X-ray laser. (Dawn Harmer/SLAC National Accelerator Laboratory)

    Paul Fuoss’s Mission is to Make Experiments at LCLS and Other Light Sources More Productive and User-Friendly

    Big leaps in technology require big leaps in design ­– entirely new approaches that can take full advantage of everything the technology has to offer.

    That’s the thinking behind a new initiative at the Department of Energy’s SLAC National Accelerator Laboratory. To make sure experimenters can get the most out of a major X-ray laser upgrade that will produce beams that are 10,000 times brighter and deliver up to a million pulses per second, the lab has created a new position – head of experimental design at the Linac Coherent Light Source – and hired a world-renowned X-ray scientist to fill it.

    2
    The first 21 of 33 undulators in place in the LCLS Undulator Hall. (Photo by Mike Zurawel.)

    Paul Fuoss (pronounced “foos”) will look at LCLS and the LCLS-II upgrade from a fresh perspective and work with scientists and engineers across the lab to design instruments, user-friendly control systems and experimental flows that take full advantage of this technological leap.

    Although the upgrade won’t be finished until the early 2020s, there’s really no time to lose, said LCLS Director Mike Dunne.

    “We’re on the verge of a transformation of our science capabilities that is simply unattainable today. When you take these big leaps you have to fundamentally rethink how you approach the science and the design of experiments,” Dunne said.

    “You can’t just do it the way you did before but a bit better. You have to approach it from a completely new thought process: What is the scientific knowledge you’re trying to get out, and what is the scientific data that might illuminate that new understanding, and how does that translate back into how you obtain that data, and how does that influence how you design the facility?”

    Taming Complexity to Make Science More Productive

    For Fuoss, the broader goal is to increase productivity and improve the experiences of scientists at X-ray light sources everywhere.

    “Experiments have gotten a lot more complex over the past 20 years, not just at LCLS but at synchrotron light sources, too,” he said. “We’ve gone from controlling experiments with a single computer and detecting a single pixel of data at a time to using multiple computers and detecting more like a million pixels at once. Our ability to integrate different tools and computers and visualize the data has often not kept up with the technology. And at LCLS, that complexity is going to increase dramatically in a few years when the LCLS-II upgrade becomes operational.”

    One way to make working with LCLS more streamlined and intuitive is to incorporate user-friendly features into the instruments that come on board as part of LCLS-II.

    “A lot of that will be working with the scientists and engineers who are designing those instruments to get the building blocks for user compatibility in there,” Fuoss said. “It’s not part of the core training of scientists and engineers, so we expect we will need to reach out to people who have that expertise and get them to help us.”

    Another way, he said, is to create tools that let scientists visualize their data as it’s being collected, so they can understand what is going on in real time.

    “There are a lot of different pieces that need to be coordinated,” Fuoss said. “All of them are currently being done, but we need to bring a unified focus and make sure there are no unnecessary barriers. Ultimately, you want to integrate this kind of thing into everyone’s day-to-day development activities.”

    X-Rays, Inventions and Human Interfaces

    Fuoss has deep roots at SLAC. Originally from South Dakota, where he grew up on a ranch, he earned a degree in physics at South Dakota School of Mines and Technology and came to Stanford University in 1975 for graduate school. He wound up doing his graduate research at SLAC, using X-rays from what later became the Stanford Synchrotron Radiation Lightsource (SSRL) to investigate materials.

    3
    SSRL-Stanford Synchrotron Radiation Lightsource – Stanford University

    After earning a PhD, Fuoss went on to do research at Bell Laboratories, AT&T Laboratories and Argonne National Laboratory. He’s been an active user of SSRL and other light sources and has developed a number of new techniques for exploring materials with X-rays, many of which are now standard at light sources around the world; in 2015 he received SLAC’s Farrel W. Lytle Award for this work. Fuoss also played a role in designing LCLS.

    In the mid-1990s, while a researcher at AT&T Laboratories, Fuoss took a six-year detour into the world of human interface design and human factors research – the study of how people interact with technology, from airplane cockpits to your office copier. Back then, he focused on making telecommunications systems and web interfaces more user friendly. This experience can also be applied to LCLS experimental design.

    “Paul has an incredible background,” Dunne said. “He brings that deep understanding of the nature of X-ray science, an understanding of all the instruments and the technical pieces, and then an understanding of what we’re trying to achieve scientifically.”

    Getting the Most out of Beam Time

    Unlike synchrotron light sources, which may have dozens of X-ray beamlines and many experiments going on simultaneously, the current version of LCLS has just one powerful beam, a billion times brighter than any available before, whose pulses arrive up to 120 times per second. In theory this limits the facility to doing one experiment at a time.

    But in the seven years since it opened, scientists and engineers have come up with a number of ways to get around that limitation, such as splitting the beam so it can be delivered to two or more experiments at once. At the same time, they reduced the down time between experiments by scheduling similar experiments back to back, so they don’t have to change out equipment as often. These and other measures increased the number of experiments run per year by 72 percent from 2014 to 2016, and LCLS recently passed the milestone of hosting more than 1,000 users per year.

    LCLS-II will add a second X-ray laser beam, further increasing the facility’s capacity. By continuing to find ways to squeeze in more experiments while making the way people interact with LCLS more straightforward, Fuoss said, “We can improve productivity and allow the scientific users to have a more hands-on role in the actual data collection. That will both reduce the load on the LCLS staff and lead to a better experience for the scientists who are coming here to use it.”

    LCLS and SSRL are DOE Office of Science User Facilities.

    For questions or comments, contact the SLAC Office of Communications at communications@slac.stanford.edu.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 8:08 am on April 25, 2017 Permalink | Reply
    Tags: SLAC

    From SLAC: “Machine Learning Dramatically Streamlines Search for More Efficient Chemical Reactions” 


    SLAC Lab

    April 24, 2017
    Glennda Chui

    1
    A diagram shows the many possible paths one simple catalytic reaction can theoretically take – in this case, conversion of syngas, a combination of carbon monoxide (CO) and hydrogen (H2), to acetaldehyde. Machine learning allowed SUNCAT theorists to prune away the least likely paths and identify the most likely one (red) so scientists can focus on making it more efficient. (Zachary Ulissi/SUNCAT)

    Even a simple chemical reaction can be surprisingly complicated. That’s especially true for reactions involving catalysts, which speed up the chemistry that makes fuel, fertilizer and other industrial goods. In theory, a catalytic reaction may follow thousands of possible paths, and it can take years to identify which one it actually takes so scientists can tweak it and make it more efficient.

    Now researchers at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University have taken a big step toward cutting through this thicket of possibilities. They used machine learning – a form of artificial intelligence – to prune away the least likely reaction paths, so they can concentrate their analysis on the few that remain and save a lot of time and effort.

    The method will work for a wide variety of complex chemical reactions and should dramatically speed the development of new catalysts, the team reported in Nature Communications.

    ‘A Daunting Task’

    “Designing a novel catalyst to speed a chemical reaction is a very daunting task,” said Thomas Bligaard, a staff scientist at the SUNCAT Center for Interface Science and Catalysis, a joint SLAC/Stanford institute where the research took place. “There’s a huge amount of experimental work that normally goes into it.”

    For instance, he said, finding a catalyst that turns nitrogen from the air into ammonia – considered one of the most important developments of the 20th century because it made the large-scale production of fertilizer possible, helping to launch the Green Revolution – took decades of testing various reactions one by one.

    Even today, with the help of supercomputer simulations that predict the results of reactions by applying theoretical models to huge databases on the behavior of chemicals and catalysts, the search can take years, because until now it has relied largely on human intuition to pick possible winners out of the many available reaction paths.

    “We need to know what the reaction is, and what are the most difficult steps along the reaction path, in order to even think about making a better catalyst,” said Jens Nørskov, a professor at SLAC and Stanford and director of SUNCAT.

    “We also need to know whether the reaction makes only the product we want or if it also makes undesirable byproducts. We’ve basically been making reasonable assumptions about these things, and we really need a systematic theory to guide us.”

    Trading Human Intuition for Machine Learning

    For this study, the team looked at a reaction that turns syngas, a combination of carbon monoxide and hydrogen, into fuels and industrial chemicals. The syngas flows over the surface of a rhodium catalyst, which like all catalysts is not consumed in the process and can be used over and over. This triggers chemical reactions that can produce a number of possible end products, such as ethanol, methane or acetaldehyde.

    “In this case there are thousands of possible reaction pathways – an infinite number, really – with hundreds of intermediate steps,” said Zachary Ulissi, a postdoctoral researcher at SUNCAT. “Usually what would happen is that a graduate student or postdoctoral researcher would go through them one at a time, using their intuition to pick what they think are the most likely paths. This can take years.”

    The new method ditches intuition in favor of machine learning, where a computer uses a set of problem-solving rules to learn patterns from large amounts of data and then predict similar patterns in new data. It’s a behind-the-scenes tool in an increasing number of technologies, from self-driving cars to fraud detection and online purchase recommendations.

    Rapid Weeding

    The data used in this process came from past studies of chemicals and their properties, including calculations that predict the bond energies between atoms based on principles of quantum mechanics. The researchers were especially interested in two factors that determine how easily a catalytic reaction proceeds: How strongly the reacting chemicals bond to the surface of the catalyst and which steps in the reaction present the most significant barriers to going forward. These are known as rate-limiting steps.

    A reaction will seek out the path that takes the least energy, Ulissi explained, much like a highway designer will choose a route between mountains rather than waste time looking for an efficient way to go over the top of a peak. With machine learning the researchers were able to analyze the reaction pathways over and over, each time eliminating the least likely paths and fine-tuning the search strategy for the next round.
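
    The excerpt does not spell out the machine-learning machinery itself, so purely as an illustration of the pruning idea, here is a sketch in Python using a Gaussian-process surrogate from scikit-learn: a model trained on the few barriers already computed with expensive quantum calculations predicts barriers (with uncertainties) for all remaining candidate steps, and only candidates that could plausibly lie on a low-energy pathway are kept for the next round. The descriptors, data and threshold are all made up; this is not the SUNCAT group’s code.

        # Sketch of surrogate-model pruning of candidate reaction steps (illustrative only).
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(0)

        # Hypothetical descriptors (e.g. bond or adsorption energies) for 200 candidate steps,
        # and barrier heights for the 20 steps already evaluated with expensive calculations.
        candidate_descriptors = rng.normal(size=(200, 3))
        evaluated_idx = rng.choice(200, size=20, replace=False)
        evaluated_barriers = rng.normal(loc=1.0, scale=0.3, size=20)   # eV, made-up training data

        surrogate = GaussianProcessRegressor().fit(
            candidate_descriptors[evaluated_idx], evaluated_barriers)

        # Keep only candidates whose predicted barrier could plausibly undercut the best known one.
        mean, std = surrogate.predict(candidate_descriptors, return_std=True)
        plausible = np.where(mean - 2.0 * std < evaluated_barriers.min())[0]
        print(f"kept {plausible.size} of {len(candidate_descriptors)} candidate steps for closer analysis")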

    Once everything was set up, Ulissi said, “It only took seconds or minutes to weed out the paths that were not interesting. In the end there were only about 10 reaction barriers that were important.” The new method, he said, has the potential to reduce the time needed to identify a reaction pathway from years to months.

    Andrew Medford, a former SUNCAT graduate student who is now an assistant professor at the Georgia Institute of Technology, also contributed to this research, which was funded by the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 7:34 am on April 20, 2017 Permalink | Reply
    Tags: SLAC, Virtual tour of LCLS   

    From SLAC: “Virtual Tours of LCLS” 


    SLAC Lab

    The Linac Coherent Light Source (LCLS) at SLAC allows scientists to see the world in femtosecond resolution. Click on the images below to take virtual tours of the Undulator Hall and the Near Experimental Hall (NEH) at LCLS. Also check out our LCLS album on Flickr for photos of the facility.

    Undulator Hall: view the video images. Click on the blue circle to navigate.

    Near Experimental Hall (NEH): view the video images. Click on the blue circle to navigate.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     