Tagged: SLAC National Accelerator Lab

  • richardmitnick 9:20 pm on December 15, 2014 Permalink | Reply
    Tags: SLAC National Accelerator Lab

    From SLAC: “Is the Higgs Boson a Piece of the Matter-Antimatter Puzzle?” 

    December 15, 2014

    A SLAC Theorist and Colleagues Lay Out a Possible Way to Tell if the Higgs is Involved

    Several experiments, including the BaBar experiment at the Department of Energy’s SLAC National Accelerator Laboratory, have helped explain some – but not all – of the imbalance between matter and antimatter in the universe. Now a SLAC theorist and his colleagues have laid out a possible method for determining if the Higgs boson is involved.

    In a paper published in Physical Review D, they suggest that scientists at CERN’s Large Hadron Collider (LHC), where the Higgs was discovered, look for a specific kind of Higgs decay when the collider starts up again in 2015. The details of that decay could tell them whether or not the Higgs has a say in the matter-antimatter imbalance.

    “The time to plan a search strategy is now,” said Matt Dolan, a research associate in SLAC’s Particle Theory group and co-author of the paper. “That way, when the LHC begins to operate at full strength we’ll be ready.”

    Why there’s more matter than antimatter is one of the biggest questions confounding particle physicists and cosmologists, and it cuts to the heart of our own existence. In the time following the Big Bang, when the budding universe cooled enough for matter to form, most matter-antimatter particle pairs that popped into existence annihilated each other. Yet something tipped the balance in favor of matter, or we – and stars, planets, galaxies, life – would not be here.

    The recently discovered Higgs boson is directly connected to the issues of mass and matter. Asking whether the Higgs is involved in the preponderance of matter over antimatter seems a reasonable question.

    The paper is based on a phenomenon called CP – or charge-parity – violation, the same phenomenon investigated by BaBar. CP violation means that nature treats a particle and its oppositely charged mirror-image version differently.

    “Searching for CP violation at the LHC is tricky,” Dolan said. “We’ve just started to look into the properties of the Higgs, and the experiments must be very carefully designed if we are to improve our understanding of how the Higgs behaves under different conditions.”

    First, researchers need to confirm that the Higgs fits into the Standard Model, our current best explanation of matter, energy and the processes that turned them into us. A Higgs that fits the Standard Model where CP violation is concerned is called CP-even; one that does not is called CP-odd. A tell-tale sign that the Higgs is involved in CP violation is if it’s a mixture of even and odd.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The theorists proposed that experimenters look for a process in which a Higgs decays into two tau particles, which are like supersized cousins of electrons, while the remainder of the energy from the original proton-proton collision sprays outward in two jets. Any mix of CP-even and CP-odd in the Higgs is revealed by the angle between the two jets.

    In this illustration, two protons collide at high energy, producing a Higgs boson that instantly decays, producing two tau particles. The rest of the energy from the collision sprays outward in two jets (pink cones). Measuring the angle between these jets could reveal whether or not the Higgs is involved in charge-parity (CP) violation, which says that nature treats a particle and its oppositely charged antiparticle differently. A SLAC researcher and his colleagues propose such an experiment in a recent paper in Physical Review D. (SLAC National Accelerator Laboratory)
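
    A toy Monte Carlo can illustrate the idea (this is a sketch of the general principle, not the authors' actual analysis; the distribution shape, modulation amplitude and mixing angle below are assumptions chosen for clarity). If the azimuthal angle between the two jets follows 1 + a·cos(2(Δφ − α)), with α = 0 for a pure CP-even Higgs, the mixing angle can be read off from the first Fourier moments of the measured angles:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_dphi(alpha, a=0.4, n=200_000):
    """Rejection-sample the azimuthal angle between the two jets from a
    toy distribution f(dphi) ~ 1 + a*cos(2*(dphi - alpha)).
    alpha = 0 mimics a pure CP-even Higgs; alpha != 0 signals CP mixing."""
    out = []
    while len(out) < n:
        x = rng.uniform(-np.pi, np.pi, n)      # uniform proposals
        u = rng.uniform(0.0, 1.0 + a, n)       # bound on the unnormalized density
        keep = x[u < 1.0 + a * np.cos(2 * (x - alpha))]
        out.extend(keep.tolist())
    return np.array(out[:n])

def estimate_alpha(dphi):
    """Recover the mixing angle from the first Fourier moments:
    <cos 2*dphi> is proportional to cos 2*alpha, <sin 2*dphi> to sin 2*alpha."""
    return 0.5 * np.arctan2(np.mean(np.sin(2 * dphi)),
                            np.mean(np.cos(2 * dphi)))

for alpha_true in (0.0, np.pi / 4):
    alpha_hat = estimate_alpha(sample_dphi(alpha_true))
    print(f"true alpha = {alpha_true:.3f}, estimated = {alpha_hat:.3f}")
```

    The practical point is that a nonzero recovered α would flag a CP-odd admixture; the real experimental challenge is doing this with backgrounds, detector effects and limited statistics.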

    “This is a very high-profile and involved analysis,” said Philip Harris, a staff physicist at CERN and co-author of the paper along with Martin Jankowiak of the University of Heidelberg and Michael Spannowsky of Durham University. A member of the CMS collaboration, Harris focuses on Higgs-to-tau-tau decays, evidence of which has only recently begun to mount.

    “I wanted to add a CP violation measurement to our analysis, and what Matt, Martin and Michael proposed is the most viable avenue,” Harris said, adding that he’s looking forward to all the data the LHC will generate when it starts up again early next year at its full design strength.

    “Even with just a few months of data we can start to make real statements about the Higgs and CP violation,” he said.

    Citation: Matthew Dolan et al., Physical Review D, 21 October 2014, doi:10.1103/PhysRevD.90.073008

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 3:17 pm on November 11, 2014 Permalink | Reply
    Tags: SLAC National Accelerator Lab

    From Symmetry: “The November Revolution” 


    November 11, 2014
    Amanda Solliday

    Forty years ago today, two different research groups announced the discovery of the same new particle and redefined how physicists view the universe.

    On November 11, 1974, members of the Cornell high-energy physics group could have spent the lulls during their lunch meeting chatting about the aftermath of Nixon’s resignation or the upcoming Big Red hockey season.

    But on that particular Monday, the most sensational topic was physics-related. One of the researchers in the audience stood up to report that two labs on opposite sides of the country were about to announce the same thing: the discovery of a new particle that heralded the birth of the Standard Model of particle physics.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    “Nobody at the meeting knew what the hell it was,” says physicist Kenneth Lane of Boston University, a former postdoctoral researcher at Cornell. Lane, among others, would spend the next few years describing the theory and consequences of this new particle.

    It isn’t often that a discovery comes along that forces everyone to reevaluate the way the world works. It’s even rarer for two groups to make such a discovery at the same time, using different methods.

    One announcement would come from a research group led by MIT physicist Sam Ting at Brookhaven National Laboratory in New York. The other was to come from a team headed by physicist Burton Richter at SLAC National Accelerator Laboratory, then called the Stanford Linear Accelerator Center, in California. Word traveled fast.

    “We started getting all sorts of inquiries and congratulations before we even finished writing the paper,” Richter says. “Somebody told a friend, and then a friend told another friend.”

    Ting called the new particle the J particle. Richter called it psi. It became known as J/psi, the discovery that sparked the November Revolution.

    Independently, the researchers at Brookhaven and SLAC had designed two complementary experiments.

    Ting and his team had made the discovery using a proton machine, shooting an intense beam of particles at a fixed target. Ting was interested in how photons, particles of light, turn into heavy photons, particles with mass, and he wanted to know how many of these types of heavy photons existed in nature. So his team—consisting of 13 scientists from MIT with help from researchers at Brookhaven—designed and built a detector that would accept a wide range of heavy photon masses.

    “The experiment was quite difficult,” Ting says. “I guess when you’re younger, you’re more courageous.”

    In early summer 1974, they started the experiment at a high mass, around 4 to 5 billion electronvolts. They saw nothing. Later, they lowered the mass and soon saw a peak near 3 billion electronvolts that indicated a high production rate of a previously unknown particle.

    At SLAC, Richter had created a new type of collider, the Stanford Positron Electron Asymmetric Rings (SPEAR). His research group used a beam of electrons produced by a linear accelerator and stored the particles in a ring of magnets. Then, they would generate positrons in a linear accelerator and inject them in the other direction. The detector was able to look at everything produced in electron-positron collisions.

    The goal was to determine the masses of known elementary particles, but the researchers saw strange effects in the summer of 1974. They looked at that particular region with finer resolution, and over the weekend of November 9-10, discovered a tall, thin energy peak around 3 billion electronvolts.
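
    A toy scan shows why such a narrow state is so easy to step over with coarse energy steps and so dramatic with fine ones (all numbers here are illustrative; the real J/psi is even narrower than the effective width assumed below, which stands in for beam energy spread):

```python
import numpy as np

M, GAMMA = 3.097, 0.003   # toy resonance: mass (GeV) and effective width (GeV)

def cross_section(E, background=1.0, peak=100.0):
    """Toy cross section: flat background plus a Lorentzian resonance."""
    return background + peak * (GAMMA / 2) ** 2 / ((E - M) ** 2 + (GAMMA / 2) ** 2)

coarse = np.arange(2.55, 3.5, 0.100)   # 100 MeV steps: every point misses the spike
fine   = np.arange(3.00, 3.2, 0.001)   # 1 MeV steps: the spike stands out clearly

print("max signal/background, coarse scan:", cross_section(coarse).max())
print("max signal/background, fine scan:  ", cross_section(fine).max())
```

    On the coarse grid the resonance is indistinguishable from background fluctuations; only the fine scan over the suspicious region, like SLAC's over that November weekend, reveals the tall, thin peak.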

    At the time, Ting visited SLAC as part of an advisory committee. The laboratory’s director, Pief Panofsky, asked Richter to meet with him.

    “He called and said, ‘It sounds like you guys have found the same thing,’” Richter says.

    Both researchers sent their findings to the journal Physical Review Letters. Their papers were published in the same issue. Other labs quickly replicated and confirmed the results.

    At the time, the basic pieces of today’s Standard Model of particle physics were still falling into place. Just a decade before, it had resembled the periodic table of the elements, including a wide, unruly collection of different types of particles called hadrons.

    Theorists Murray Gell-Mann and George Zweig were the first to propose that all of those different types of hadrons were actually made up of the same building blocks, called quarks. This model included three types of quark: up, down and strange. Other theorists—Sheldon Lee Glashow, James Bjorken, and then also John Iliopoulos and Luciano Maiani—proposed the existence of a fourth quark.

    On the day of the J/psi announcement, the Cornell researchers talked about the findings well into the afternoon. One of the professors in the department, Ken Wilson, made a connection between the discovery and a seminar given earlier that fall by Tom Appelquist, a physicist at Harvard University. Appelquist had been working with his colleague David Politzer to describe something they called “charmonium,” a bound state of a new type of quark and antiquark.

    “Only a few of us were thinking about the idea of a fourth quark,” says Appelquist, now a professor at Yale. “Ken called me right after the discovery and urged me to get our paper out ASAP.”

    The J/psi news inspired many other theorists to pick up their chalk as well.

    “It was clear from day one that J/psi was a major discovery,” Appelquist says. “It almost completely reoriented the theoretical community. Everyone wanted to think about it.”

    Less than two weeks after the initial discovery, Richter’s group also found psi-prime, a relative of J/psi that showed even more cracks in the three-quark model.

    “There was a whole collection of possibilities of what could exist outside the current model, and people were speculating about what that may be,” Richter says. “Our experiment pruned the weeds.”

    The findings of the J/psi teams triggered additional searches for unknown elementary particles, exploration that would reveal the final shape of the Standard Model. In 1976, the two experiment leaders were awarded the Nobel Prize for their achievement.

    In 1977, scientists at Fermilab discovered the fifth quark, the bottom quark. In 1995, they discovered the sixth one, the top.

    Today, theorists and experimentalists are still driven to answer questions not explained by the current prevailing model. Does supersymmetry exist? What are dark matter and dark energy? What particles have we yet to discover?

    Standard Model of Supersymmetry

    “If the answers are found, it will take us even deeper into what we are supposed to be doing as high-energy physicists,” Lane says. “But it probably isn’t going to be this lightning flash that happens on one Monday afternoon.”

    Ting and Richter
    Courtesy of: SLAC National Accelerator Laboratory

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 3:09 pm on November 8, 2014 Permalink | Reply
    Tags: SLAC National Accelerator Lab

    From The Conversation: “Cheaper, more compact particle accelerators are a step closer” 

    The Conversation

    Scientists working on an experiment at the SLAC National Accelerator Laboratory in the US have taken a step forward in developing a technology which could significantly reduce the size of particle accelerators. The technology is able to accelerate particles more rapidly than conventional accelerators at a much smaller size.

    Before the big bang. SLAC National Accelerator Laboratory

    One of the most impressive aspects of particle accelerators used for research, such as the Large Hadron Collider (LHC) at CERN, is their sheer physical size.

    Yet even with a circumference of 27km, the LHC would be smaller than most of the next generation of proposed colliders. For example, the International Linear Collider (ILC), a possible future collider of electrons and positrons (anti-electrons), could be 31km long, and there is even a proposal for a circular accelerator with an 80km circumference that could be built at CERN as part of the Future Circular Colliders (FCC) project.

    ILC schematic

    With the discovery of the Higgs boson at the LHC in 2012, coupled with the absence of other new phenomena, the particle physics landscape has become, perhaps surprisingly, very open. While the Standard Model could appear to be a complete theory, several undeniable observations tell us that there is more to the story. The nature of dark matter, the origin of the baryon asymmetry in the universe and the mystery of the very small neutrino masses all tell us to keep looking for answers. Are the required new phenomena to be found at higher energies, or have they escaped detection because of very small couplings? The FCC would address these fundamental open issues of particle physics.

    The size of all of these machines is determined by our ability to build structures that can transfer energy to particles allowing us to accelerate them to greater speeds. The higher the speed, the greater the energy when these particle beams collide, giving scientists a better chance of answering fundamental questions about the universe. This is because higher energy collisions can create conditions that are similar to those existing when the universe was born.

    Most current accelerators use a structure called an “rf cavity”, a carefully designed “box” through which the particle beam passes. The cavity transfers electromagnetic energy into the kinetic energy of particles, accelerating them. However, there is a limit to the amount of energy that an rf cavity can transfer to particles. This is because, despite operating in a vacuum, there is a risk that increasing electromagnetic fields can lead to lightning-like discharges of energy.

    However, even routine experiments in places like the LHC require more energy than a single rf cavity can provide. The current solution is either to arrange many cavities in a straight line, in a linear machine such as SLAC’s, or to pass the beam through the same cavities many times, in a circular machine such as the LHC.
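
    A back-of-envelope calculation shows why these machines get so long. Assume an ILC-like target of 250 GeV per beam and an average accelerating gradient of about 31.5 MV/m (both figures are illustrative assumptions, not design specifications):

```python
beam_energy_gev = 250.0        # assumed target energy per beam, ILC-like
gradient_mv_per_m = 31.5       # assumed average accelerating gradient (MV/m)

# Each metre of active cavity adds gradient_mv_per_m MeV of energy, so the
# required active length is simply the total energy divided by the gradient.
active_length_m = beam_energy_gev * 1e3 / gradient_mv_per_m
print(f"active accelerating structure per beam: {active_length_m / 1e3:.1f} km")
```

    Roughly 8 km of active structure per beam; add a second beam plus focusing, sources and detectors, and a footprint of some 30 km follows naturally.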

    Either solution presents challenges and requires a large machine to fit in the many parts needed. This raises the costs. Any technology which can increase the acceleration with smaller parts and without the need for more machinery will make future accelerators more compact.

    This matters because particle accelerators are not just for particle physicists. They are increasingly used in medicine, industry and security. For example, accelerators provide X-rays and particle beams for cancer therapy, for the fabrication of minuscule devices and for scanning the contents of everything from suitcases to freight containers.

    The new technology, which promises more compact particle accelerators, has just been described in a study published in Nature. The study suggests that if bunches of electrons are passed through a short column of lithium-vapour “plasma” in rapid succession, the electric field of the plasma can transfer enough energy to accelerate particles hundreds of times more rapidly than the LHC does – all within a column just 30cm long.

    Plasma is a state of matter where atoms are broken down into positively charged ions and negatively charged electrons. Most of the matter in the sun exists as plasma, but we can create that state on Earth using high energy lasers.

    The electric field between particles in a plasma can be extremely high. In this experiment, as the bunch of electrons passes through the plasma it pushes the plasma’s electrons aside, leaving behind it a region of oscillating electrons. It is this oscillation which generates the “wakefield” that can then be used to accelerate a second, trailing bunch of electrons following very close behind the first.
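
    The scale of those fields can be estimated from the plasma density alone via the cold wave-breaking limit, E0 = mₑ·c·ωₚ/e (the density below is an assumed value typical of such experiments, not a figure taken from the Nature paper):

```python
import math

e    = 1.602e-19     # elementary charge (C)
m_e  = 9.109e-31     # electron mass (kg)
c    = 2.998e8       # speed of light (m/s)
eps0 = 8.854e-12     # vacuum permittivity (F/m)

n = 1e17 * 1e6       # assumed plasma density: 1e17 cm^-3, converted to m^-3

omega_p = math.sqrt(n * e**2 / (eps0 * m_e))   # plasma frequency (rad/s)
E0 = m_e * c * omega_p / e                     # wave-breaking field (V/m)

print(f"plasma frequency: {omega_p:.2e} rad/s")
print(f"wave-breaking field: {E0 / 1e9:.0f} GV/m")
print(f"energy gain over 0.3 m at this field: {E0 * 0.3 / 1e9:.1f} GeV")
```

    Tens of GV/m, compared with the tens of MV/m at which conventional rf cavities risk breakdown – roughly the thousand-fold difference behind the claim above.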

    Although previous experiments have shown even greater gains in energy, what makes this experiment interesting is the number of electrons accelerated and how uniformly they acquire energy. Being able to accelerate large numbers of particles to the same energy simultaneously is a prerequisite for any future practical use of this technology, called “plasma wakefield acceleration”.

    Other groups around the world including the AWAKE collaboration at CERN and the ALPHA-X collaboration based at the University of Strathclyde are pursuing different approaches to plasma wakefield acceleration using proton beams or lasers to generate the wakefield. Meanwhile there are already tentative designs being proposed for future accelerators that could make use of this technology, if accelerating large numbers of particles simultaneously can be made reliable.

    See the full article here.


  • richardmitnick 12:59 pm on September 9, 2014 Permalink | Reply
    Tags: SLAC National Accelerator Lab

    From SLAC: “Buckyballs and Diamondoids Join Forces in Tiny Electronic Gadget” 

    September 9, 2014
    Press Office Contact: Andrew Gordon, agordon@slac.stanford.edu, (650) 926-2282

    Scientists Craft Two Exotic Forms of Carbon into a Molecule for Steering Electron Flow

    Scientists have married two unconventional forms of carbon – one shaped like a soccer ball, the other a tiny diamond – to make a molecule that conducts electricity in only one direction. This tiny electronic component, known as a rectifier, could play a key role in shrinking chip components down to the size of molecules to enable faster, more powerful devices.

    An international team led by researchers at SLAC National Accelerator Laboratory and Stanford University joined two offbeat carbon molecules – diamondoids, the square cages at left, and buckyballs, the soccer-ball shapes at right – to create “buckydiamondoids,” center. These hybrid molecules function as rectifiers, conducting electrons in only one direction, and could help pave the way to molecular electronic devices. (Manoharan Lab/Stanford University)

    “We wanted to see what new, emergent properties might come out when you put these two ingredients together to create a ‘buckydiamondoid,’” said Hari Manoharan of the Stanford Institute for Materials and Energy Sciences (SIMES) at the Department of Energy’s SLAC National Accelerator Laboratory. “What we got was basically a one-way valve for conducting electricity – clearly more than the sum of its parts.”

    The research team, which included scientists from Stanford University and from institutions in Belgium, Germany and Ukraine, reported its results September 9, 2014, in Nature Communications.

    Two Offbeat Carbon Characters Meet Up

    Many electronic circuits have three basic components: a material that conducts electrons; rectifiers, which commonly take the form of diodes, to steer that flow in a single direction; and transistors to switch the flow on and off. Scientists combined two offbeat ingredients – buckyballs and diamondoids – to create the new diode-like component.

    Buckyballs – short for buckminsterfullerenes – are hollow carbon spheres whose 1985 discovery earned three scientists a Nobel Prize in chemistry. Diamondoids are tiny carbon cages bonded together as they are in diamonds, but weighing less than a billionth of a billionth of a carat. Both are subjects of a lot of research aimed at understanding their properties and finding ways to use them.

    In 2007, a team led by researchers from SLAC and Stanford discovered that a single layer of diamondoids on a metal surface can efficiently emit a beam of electrons. Manoharan and his colleagues wondered: What would happen if they paired an electron-emitting diamondoid with another molecule that likes to grab electrons? Buckyballs are just that sort of electron-grabbing molecule.

    A Very Small Valve for Channeling Electron Flow

    For this study, diamondoids were produced in the SLAC laboratory of SIMES researchers Jeremy Dahl and Robert Carlson, who are world experts in extracting the tiny diamonds from petroleum. They were then shipped to Germany, where chemists at Justus-Liebig University figured out how to attach them to buckyballs.

    The resulting buckydiamondoids, which are just a few nanometers long, were tested in SIMES laboratories at Stanford. A team led by graduate student Jason Randel and postdoctoral researcher Francis Niestemski used a scanning tunneling microscope to make images of the hybrid molecules and measure their electronic behavior. They discovered the hybrid is an excellent rectifier: The electrical current flowing through the molecule was up to 50 times stronger in one direction, from electron-spitting diamondoid to electron-catching buckyball, than in the opposite direction. This is something neither component can do on its own.
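
    For intuition about what a 50-fold asymmetry means, compare with a textbook diode. The Shockley law below is only a familiar stand-in – the buckydiamondoid's actual transport mechanism (electron tunneling between the two carbon cages) is quite different, and the parameter values are illustrative:

```python
import math

def diode_current(v, i_s=1e-12, n=1.5, v_t=0.02585):
    """Shockley diode law: a standard textbook rectifier model.
    i_s: saturation current (A); n: ideality factor; v_t: thermal voltage (V)."""
    return i_s * (math.exp(v / (n * v_t)) - 1.0)

def rectification_ratio(v):
    """Forward-to-reverse current ratio at bias +/- v volts."""
    return abs(diode_current(v) / diode_current(-v))

for v in (0.05, 0.10, 0.15):
    print(f"bias {v:.2f} V -> rectification ratio {rectification_ratio(v):.1f}")
```

    In this toy model the asymmetry grows exponentially with bias, reaching roughly 50 near 0.15 V; the hybrid molecule achieves a comparable one-way behavior in a structure only nanometers long.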

    An image made with a scanning tunneling microscope shows hybrid buckydiamondoid molecules on a gold surface. The buckyball end of each molecule is attached to the surface, with the diamondoid end sticking up; both are clearly visible. The area shown here is 5 nanometers on a side. (H. Manoharan et al, Nature Communications)

    Illustration of a buckydiamondoid molecule under a scanning tunneling microscope (STM). The sharp metallic tip of the STM ends in a single atom; as it scans over a sample, electrons tunnel from the tip into the sample. In this study the STM made images of the buckydiamondoids and probed their electronic properties. (SLAC National Accelerator Laboratory)

    While this is not the first molecular rectifier ever invented, it’s the first one made from just carbon and hydrogen, a simplicity researchers find appealing, said Manoharan, who is an associate professor of physics at Stanford. The next step, he said, is to see if transistors can be constructed from the same basic ingredients.

    “Buckyballs are easy to make – they can be isolated from soot – and the type of diamondoid we used here, which consists of two tiny cages, can be purchased commercially,” he said. “And now that our colleagues in Germany have figured out how to bind them together, others can follow the recipe. So while our research was aimed at gaining fundamental insights about a novel hybrid molecule, it could lead to advances that help make molecular electronics a reality.”

    Other research collaborators came from the Catholic University of Louvain in Belgium and Kiev Polytechnic Institute in Ukraine. The primary funding for the work came from the U.S. Department of Energy Office of Science.

    See the full article here.



  • richardmitnick 4:37 pm on August 12, 2014 Permalink | Reply
    Tags: SLAC National Accelerator Lab

    From SLAC Lab: “Construction of Large Synoptic Survey Telescope to Begin” 

    August 4, 2014

    LSST Will Capture Unprecedented View of Night Sky

    On August 1, 2014, the National Science Foundation (NSF) announced an award to the Association of Universities for Research in Astronomy (AURA) to manage construction of the Large Synoptic Survey Telescope (LSST); with this announcement, construction of the LSST observatory can begin.

    When the LSST observatory starts surveying the entire visible southern sky from a Chilean mountaintop in October 2022, it will produce a unique view of the universe – the widest, fastest survey of the night sky ever undertaken. LSST’s vast public archive of data will dramatically advance knowledge of the dark energy and dark matter that make up much of the universe, as well as of galaxy formation and potentially hazardous asteroids. The LSST is expected to see “engineering first light” by 2020.

    LSST Camera
    SLAC is leading the construction of the 3,200-megapixel LSST camera, which will be the size of a small car and will weigh more than 3 tons. The digital camera will be the largest ever built for astronomy, allowing LSST to create an unprecedented public archive of data – about 6 million gigabytes per year, the equivalent of shooting roughly 800,000 images with a regular eight-megapixel digital camera every night. (SLAC National Accelerator Laboratory)
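
    The caption's comparison is easy to check with rough arithmetic (the bytes-per-image figure is an assumption, roughly 2.5 bytes per pixel for a raw eight-megapixel frame):

```python
archive_gb_per_year = 6e6        # "about 6 million gigabytes per year"
nights_per_year = 365
mb_per_8mp_image = 20.0          # assumed raw size of one 8-megapixel image (MB)

gb_per_night = archive_gb_per_year / nights_per_year
images_per_night = gb_per_night * 1e3 / mb_per_8mp_image
print(f"{gb_per_night:,.0f} GB per night ~= "
      f"{images_per_night:,.0f} eight-megapixel images")
```

    About 16,000 GB per night works out to roughly 800,000 such images, consistent with the comparison in the caption.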

    LSST is an NSF and DOE partnership. NSF is responsible for the telescope and site, education and outreach, and the data management system, and DOE is providing the camera and related instrumentation. The National Research Council’s Astronomy and Astrophysics decadal survey ranked the LSST as the top new ground-based priority for the field in its 2010 report “New Worlds, New Horizons.”

    See the full article here.




  • richardmitnick 8:45 pm on August 5, 2014 Permalink | Reply
    Tags: SLAC National Accelerator Lab

    From SLAC: “Rebooted Muon Experiment Tests Detector Design at SLAC” 


    August 5, 2014
    Last year, a monster magnet set out from Brookhaven National Lab on an epic, 35-day trek by land and sea to its new home at Fermilab, where it will serve as the heart of a search for evidence of new subatomic particles. Last month, with much less fanfare, researchers came to the End Station Test Beam (ESTB) facility at the Department of Energy’s SLAC National Accelerator Laboratory to test the eyes and nerves of the same experiment: a cutting-edge design for a new detector.

    Muon g-2 magnet to be transported to Fermilab.

    The goal of the experiment, called Muon g-2 (pronounced gee-minus-two), is to precisely measure a property of muons by studying the way their spins precess, or wobble like a slowing top, in the grip of a powerful magnet. Researchers can track this spin by observing the muon’s decay into electrons, their lighter, longer-lived siblings.
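
    The frequency being measured can be sketched from the anomalous precession relation ω_a = a_μ·e·B/m_μ. The 1.45 T field is the commonly quoted nominal value for the g-2 storage ring; treat this as a back-of-envelope estimate, not the experiment's analysis:

```python
import math

a_mu = 1.16592e-3          # muon anomalous magnetic moment, a = (g - 2) / 2
e    = 1.602e-19           # elementary charge (C)
B    = 1.45                # assumed storage-ring magnetic field (T)
m_mu = 1.8835e-28          # muon mass (kg)

omega_a = a_mu * e * B / m_mu          # anomalous precession (rad/s)
f_a = omega_a / (2 * math.pi)          # same, as a frequency (Hz)
print(f"anomalous precession: {f_a / 1e3:.0f} kHz "
      f"(period {1e6 / f_a:.2f} microseconds)")
```

    A wobble of a few hundred kilohertz, slow compared with the detector response, which is why precisely timing the decay electrons lets researchers trace the spin's precession.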

    In the experiment’s original incarnation at Brookhaven, researchers discovered the spin rate is a tiny bit different from what theory says it should be – a difference that could indicate the influence of unknown virtual particles that pop into existence from the vacuum, affect the muons, and disappear once more.

    However, the researchers at Brookhaven weren’t able to measure the property precisely enough to know for sure. That prompted the relocation of the experiment – including the headline-grabbing move of the giant ring magnet – to Fermilab, with its more powerful muon beam.

    More Muons = More Data

    To take advantage of more muons, and thus more data, a team led by University of Washington physicist David Hertzog developed a new detector design for the experiment, a novel combination of lead-fluoride crystals and silicon photomultiplier chips that they hope will capture more information about the escaping electrons.

    Hertzog and his colleagues brought some of the crystals and silicon chips to SLAC’s ESTB facility, where electrons from the linear accelerator could stand in for the electrons from muon decays – delivered in a controlled, easily tracked way, unlike what the detectors will face during the actual experiment.

    “These detectors will need to catch a tremendous number of muon decays, pinpointing their times and the energies of the electrons,” Hertzog said. “The electrons at ESTB can be delivered one at a time and with known energies, so we can see how the crystals and silicon photomultipliers respond.”

    The tests at ESTB have been much more low-key than the magnet’s 3200-mile trek, but Hertzog said his team can also look back at a successful venture.

    “This experiment has been really enjoyable,” Hertzog said. “We’ve got good data and our system seems to be working well.”

    See the full article here.




  • richardmitnick 9:45 pm on July 31, 2014 Permalink | Reply
    Tags: SLAC National Accelerator Lab

    From SLAC Lab: “Despite Extensive Analysis, Fermi Bubbles Defy Explanation” 


    July 31, 2014

    Scientists from Stanford and the Department of Energy’s SLAC National Accelerator Laboratory have analyzed more than four years of data from NASA’s Fermi Gamma-ray Space Telescope, along with data from other experiments, to create the most detailed portrait yet of two towering bubbles that stretch tens of thousands of light-years above and below our galaxy.

    This artist’s representation shows the Fermi bubbles towering above and below the galaxy. (NASA’s Goddard Space Flight Center)


    The bubbles, which shine most brightly in energetic gamma rays, were discovered almost four years ago by a team of Harvard astrophysicists led by Douglas Finkbeiner who combed through data from Fermi’s main instrument, the Large Area Telescope.


    The new portrait, described in a paper that has been accepted for publication in The Astrophysical Journal, reveals several puzzling features, said Dmitry Malyshev, a postdoctoral researcher at the Kavli Institute for Particle Astrophysics and Cosmology who co-led the analysis.

    For example, the outlines of the bubbles are quite sharp, and the bubbles themselves glow in nearly uniform gamma rays over their colossal surfaces, like two 30,000-light-year-tall incandescent bulbs screwed into the center of the galaxy.

    Their size is another puzzle. The farthest reaches of the Fermi bubbles boast some of the highest-energy gamma rays, but there’s no discernible cause for them that far from the galaxy.

    Finally, although the parts of the bubbles closest to the galactic plane shine in microwaves as well as gamma rays, about two-thirds of the way out the microwaves fade and only gamma rays are detectable. Not only is this different from other galactic bubbles, but it makes the researchers’ work that much more challenging, said Malyshev’s co-lead, KIPAC postdoctoral researcher Anna Franckowiak.

    KIPAC researchers Dmitry Malyshev (left) and Anna Franckowiak with the magazine issues containing the popular articles about the Fermi bubbles they each wrote: Malyshev’s appears in the July 2014 issue of Scientific American, Franckowiak’s in the July 2014 issue of Physics Today. (SLAC National Accelerator Laboratory)

    “Since the Fermi bubbles have no known counterparts in other wavelengths in areas high above the galactic plane, all we have to go on for clues are the gamma rays themselves,” she said.

    What Blew The Bubbles?

    Soon after the initial discovery, theorists jumped in, offering several explanations for the bubbles’ origins. For example, they could have been created by huge jets of accelerated matter blasting out from the supermassive black hole at the center of our galaxy. Or they could have been formed by a population of giant stars, born from the plentiful gas surrounding the black hole, all exploding as supernovae at roughly the same time.

    “There are several models that explain them, but none of the models is perfect,” Malyshev said. “The bubbles are rather mysterious.”

    Creating the portrait wasn’t easy.

    “It’s very tricky to model,” said Franckowiak. “We had to remove all the foreground gamma-ray emissions from the data before we could clearly see the bubbles.”

    From the vantage point of most Earth-bound telescopes, all but the highest-energy gamma rays are completely screened out by our atmosphere. It wasn’t until the era of orbiting gamma-ray observatories like Fermi that scientists discovered how common extraterrestrial gamma rays really are. Pulsars, supermassive black holes in other galaxies and supernovae are all gamma-ray point sources, just as distant stars are point sources of visible light, and all of those gamma rays had to be scrubbed from the Fermi data. Hardest to remove were the galactic diffuse emissions, a gamma-ray fog that fills the galaxy, produced by cosmic rays interacting with interstellar particles.

    “Subtracting all those contributions didn’t subtract the bubbles,” Franckowiak said. “The bubbles do exist and their properties are robust.” In other words, the bubbles don’t disappear when other gamma-ray sources are pulled out of the Fermi data – in fact, they stand out quite clearly.
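    The procedure the researchers describe – fit the known foreground components to the map, subtract them, and see whether the bubbles survive – can be illustrated with a toy one-dimensional sketch. The map, templates and amplitudes below are invented purely for illustration; the real analysis fits full-sky Fermi-LAT data with far more detailed emission models.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    npix = 1000  # pixels of a toy 1-D "sky map"
    x = np.linspace(-3, 3, npix)

    # Invented foreground templates: a diffuse galactic "fog" and point sources.
    diffuse = np.exp(-x**2)
    point_src = (rng.random(npix) < 0.01) * rng.uniform(5.0, 10.0, npix)

    # The signal we hope survives subtraction: a flat-brightness "bubble".
    bubble = ((np.arange(npix) > 600) & (np.arange(npix) < 800)).astype(float)

    # Observed map = foregrounds (with unknown amplitudes) + bubble + noise.
    sky = 3.0 * diffuse + point_src + bubble + rng.normal(0.0, 0.05, npix)

    # Fit only the known foreground templates to the observed map...
    A = np.column_stack([diffuse, point_src])
    coeffs, *_ = np.linalg.lstsq(A, sky, rcond=None)

    # ...and subtract them. The residual is the candidate bubble emission.
    residual = sky - A @ coeffs
    print(f"bubble-region residual: {residual[600:800].mean():.2f}")
    print(f"off-bubble residual:    {residual[:600].mean():.2f}")
    ```

    If the bubbles were merely an artifact of imperfect foreground modeling, the residual would vanish along with the templates; that it does not is the sense in which the researchers call the bubbles robust.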

    Franckowiak says more data is necessary before they can narrow down the origin of the bubbles any further.

    “What would be very interesting would be to get a better view of them closer to the galactic center,” she said, “but the galactic gamma-ray emissions are so bright we’d need to get a lot better at being able to subtract them.”

    Fermi is continuing to gather the data Franckowiak wants, but for now, both researchers said, there are a lot of open questions.

    See the full article here.


  • richardmitnick 11:02 pm on July 16, 2014 Permalink | Reply
    Tags: , , , , , SLAC National Accelerator Lab   

    From SLAC Lab: “Antimatter Once More Flowing to Experiments at SLAC” 

    SLAC Lab

    July 16, 2014
    No Writer Credit

    Following an absence of six years, beams of positrons – the antimatter twins of electrons – are once more streaming through the linear accelerator to waiting experiments at the Department of Energy’s SLAC National Accelerator Laboratory.

    SLAC Linac 2
    SLAC 1.9 mile (3 kilometer) long Klystron Gallery

    Positrons were last seen at SLAC during a particle physics experiment called BaBar, which studied electron-positron collisions. Because matter and antimatter annihilate into a flash of pure energy when they collide, the particles born from these collisions are easier to track than particles resulting from proton-proton collisions.

    When BaBar stopped taking data in 2008, the equipment to create the positrons was left in place. Then SLAC opened the Facility for Advanced Accelerator Experimental Tests (FACET) in 2012, where researchers can explore cutting-edge technologies to power the next generation of accelerators – including accelerators that send matter smashing into antimatter. Positrons were a necessary addition to the FACET toolbox.



    The instruments at the heart of the Facility for Advanced Accelerator Experimental Tests, such as the plasma oven seen here, can now experiment on positrons. (Matt Beardsley/SLAC National Accelerator Laboratory)

    Beginning in March, a cross-lab team led by SLAC accelerator physicists Jerry Yocky and Nate Lipkowitz started getting the equipment up and running again. By June, the final push began.

    “A lot of people had to work really hard, but it was fun and exciting to bring them back,” Yocky said. “SLAC is unique. We make the most positrons anywhere, plus our linac can accelerate them to the highest energy available anywhere in the world today.”

    Three experiments were able to use the positron beam before FACET shut down for maintenance and upgrades in early July.

    Ioan Tudosa, a member of the Stanford Institute for Materials and Energy Sciences (SIMES) at SLAC, exposed magnetic materials to the positron beam. The experiment was a continuation of previous work at FACET on ultrafast magnetic switching. “We don’t expect a very different result from previous experiments with electrons, but surprises can occur,” he said.

    Stanford graduate student Spencer Gessner has been working on ways to accelerate both electrons and positrons in clouds of hot, ionized gas called plasmas. “We saw definite acceleration of the positrons in the plasma,” he said. “When you add that to the acceleration we’ve already seen from electrons in the plasma, I think we’ve really taken our first steps toward compact accelerators that are still powerful enough for electron-positron colliders.”

    Gessner is also exploring “hollow channel” plasma acceleration, in which a laser drills through neutral gas, forming a tunnel with walls of plasma. Theory suggests both electron and positron beams can be accelerated through such tubes, and, said Gessner, the research team was able to begin exploring how the positron beam and the tube interacted.

    “It’s step zero for hollow channel experiments, but it’s a big step zero,” he said.

    This image shows a cross-section of a plasma-walled channel (orange ring) with a positron bunch (bright dot) making a bull’s eye in the center. (E200 Collaboration)

    See the full article here.


  • richardmitnick 11:43 am on July 15, 2014 Permalink | Reply
    Tags: , , , , KEK Laboratory, , SLAC National Accelerator Lab   

    From interactions.org: “KEK: 50 years from the discovery of ‘CP-violation'” 


    11 July 2014
    Professor Yoshihide Sakai
    Co-spokesperson, the Belle Collaboration
    The High Energy Accelerator Research Organization

    Public Relations Office, High Energy Accelerator Research Organization (KEK), Japan
    Saeko Okada
    Senior Press Officer, Public Relations Office, KEK
    TEL: +81-29-879-6046
    FAX: +81-29-879-6049
    E-mail: press@kek.jp

    Belle and Babar complete a joint book on their experimental work to prove the Kobayashi-Maskawa theory of CP-violation

    The joint publication was completed last month. To celebrate this achievement, the first special editions of the book are presented to Drs. Cronin, Kobayashi and Maskawa today at the 50 Years of CP Violation conference held in London.

    In 1993 the SLAC National Accelerator Laboratory in California and the KEK laboratory near Tokyo in Japan embarked on a quest to understand the nature of CP violation, a tiny difference between matter and antimatter that is vital for our existence. This effect was discovered in the decay of a particle called a kaon in 1964. These kaons exhibited strange behaviour compared with other particles studied at the time, and we now refer to the quark that causes that behaviour as a strange (or just s) quark. The amount of CP violation in kaon decays is insufficient to explain how the universe came to be dominated by matter.

    SLAC Campus
    SLAC National Accelerator Lab

    KEK lab

    SLAC and KEK constructed so-called B Factories – particle accelerators and detectors built to produce large numbers of bottom (or beauty) particles, which contain b quarks – in order to study CP violation. The B Factory mission was to explore the phenomenon of CP violation in these particles. Twenty-one years on, the two international collaborations have come to the end of a global collaborative project: a weighty tome over 900 pages in length, detailing all aspects of the Physics of the B Factories and their detectors, BaBar and Belle. The physics harvest from the two experiments includes many notable discoveries: CP violation in B decays, first studies of some very rare B decays, and a host of new particles. The breakthroughs have continued more recently with the determination of mixing in neutral charm mesons, a discovery that paves the way for the next generation of experiments to search for certain types of CP violation in charm-meson decays. Almost a thousand papers have been published by the two experiments over their lifetimes.

    The original flagship measurements of the B Factories were found to be consistent with the Cabibbo-Kobayashi-Maskawa matrix description of CP violation. This provides the Standard Model of particle physics with a description of CP violation, as predicted by Kobayashi and Maskawa in 1973.
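    For readers who want the formula behind the prose: in the Standard Model the quark sector mixes through the Cabibbo-Kobayashi-Maskawa matrix, and with three generations that matrix carries exactly one irreducible complex phase – the phase Kobayashi and Maskawa identified as a possible source of CP violation. In the standard notation:

    ```latex
    % Weak-interaction (primed) and mass eigenstates of the down-type quarks
    % are related by the unitary CKM matrix:
    \begin{pmatrix} d' \\ s' \\ b' \end{pmatrix}
    =
    \begin{pmatrix}
    V_{ud} & V_{us} & V_{ub} \\
    V_{cd} & V_{cs} & V_{cb} \\
    V_{td} & V_{ts} & V_{tb}
    \end{pmatrix}
    \begin{pmatrix} d \\ s \\ b \end{pmatrix},
    \qquad
    % CP violation is possible only if the rephasing-invariant
    % Jarlskog quantity is nonzero:
    J = \operatorname{Im}\!\left( V_{us}\, V_{cb}\, V_{ub}^{*}\, V_{cs}^{*} \right) \neq 0 .
    ```

    Measurements give J of order $10^{-5}$ – nonzero, so the Standard Model does violate CP, but far too feebly to account for the matter-dominated universe.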

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The B Factory confirmation of the Kobayashi-Maskawa mechanism was quickly followed by Kobayashi and Maskawa sharing a Nobel Prize (in 2008) for their insightful work. The Cabibbo-Kobayashi-Maskawa matrix is now known to provide the leading description of CP violation. However, while this was an important step forward for the field, the amount of CP violation in the Standard Model remains about a billion times too small to explain the matter-dominated universe that we live in. As a result, the focus of the field has turned from understanding how nature behaves to the much more subtle task of trying to understand if there are small deviations from this leading description that have been missed so far.

    A new book has been written as a collaboration between the two teams of physicists working on BaBar and Belle, with the help of the theory community. This is envisioned to be a pedagogical resource for the next generation of experimentalists to work in this field. Preparations started in 2008 and the concept was solidified through a number of international meetings over the past six years. This effort brought together experts from the global flavour physics communities from four continents. The KEK B Factory is in the process of being upgraded and should recommence data taking as a “Super B Factory” with a physics programme resuming in 2016. A decade from now someone will surely need to write a book on the Physics of the Super B Factory.


  • richardmitnick 4:05 pm on July 10, 2014 Permalink | Reply
    Tags: , , , , SLAC National Accelerator Lab,   

    From SLAC: “Uncertainty Gives Scientists New Confidence in Search for Novel Materials “ 

    SLAC Lab

    July 10, 2014
    Andrew Gordon, agordon@slac.stanford.edu, (650) 926-2282

    Scientists at Stanford University and the Department of Energy’s SLAC National Accelerator Laboratory have found a way to estimate uncertainties in computer calculations that are widely used to speed the search for new materials for industry, electronics, energy, drug design and a host of other applications. The technique, reported in the July 11 issue of Science, should quickly be adopted in studies that produce some 30,000 scientific papers per year.

    “Over the past 10 years our ability to calculate the properties of materials and chemicals, such as reactivity and mechanical strength, has increased enormously. It’s totally exploded,” said Jens Nørskov, a professor at SLAC and Stanford and director of the SUNCAT Center for Interface Science and Catalysis, who led the research.

    “As more and more researchers use computer simulations to predict which materials have the interesting properties we’re looking for – part of a process called ‘materials by design’ – knowing the probability for error in these calculations is essential,” he said. “It tells us exactly how much confidence we can put in our results.”

    Nørskov and his colleagues have been at the forefront of developing this approach, using it to find better and cheaper catalysts to speed ammonia synthesis and generate hydrogen gas for fuel, among other things. But the technique they describe in the paper can be broadly applied to all kinds of scientific studies.

    This image shows the results of calculations aimed at determining which of six chemical elements would make the best catalyst for promoting an ammonia synthesis reaction. Researchers at SLAC and Stanford used Density Functional Theory (DFT) to calculate the strength of the bond between nitrogen atoms and the surfaces of the catalysts. The bond strength, plotted on the horizontal axis, is a key factor in determining the reaction speed, plotted on the vertical axis. Based on thousands of these calculations, which yielded a range of results (colored dots) that reveal the uncertainty involved, researchers estimated an 80 percent chance that ruthenium (Ru, in red) will be a better catalyst than iron (Fe, in orange). (Andrew Medford and Aleksandra Vojvodic/SUNCAT, Callie Cullum)

    Speeding the Material Design Cycle

    The set of calculations involved in this study is known as DFT, for Density Functional Theory. It predicts bond energies between atoms based on the principles of quantum mechanics. DFT calculations allow scientists to predict hundreds of chemical and materials properties, from the electronic structures of compounds to density, hardness, optical properties and reactivity.

    Because researchers use approximations to simplify the calculations – otherwise they’d take too much computer time – each of these calculated material properties could be off by a fairly wide margin.

    To estimate the size of those errors, the team applied a statistical method: They calculated each property thousands of times, each time tweaking one of the variables to produce slightly different results. That variation in results represents the possible range of error.

    “Even with the estimated uncertainties included, when we compared the calculated properties of different materials we were able to see clear trends,” said Andrew J. Medford, a graduate student with SUNCAT and first author of the study. “We could predict, for instance, that ruthenium would be a better catalyst for synthesizing ammonia than cobalt or nickel, and say what the likelihood is of our prediction being right.”
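    A toy version of that statistical method – recompute a property many times under perturbations of an uncertain model, then read the spread as an error bar and as a head-to-head probability – might look like the sketch below. The "materials", energies and target value are invented for illustration, not real DFT output; the actual work uses ensembles of approximate exchange-correlation functionals.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_samples = 5000

    # Hypothetical nitrogen bond energies (eV) for two candidate catalysts.
    # Each sample perturbs the underlying model, mimicking an ensemble of
    # slightly different approximations; all numbers are made up.
    ru_bond = rng.normal(loc=-1.3, scale=0.15, size=n_samples)  # "ruthenium"
    fe_bond = rng.normal(loc=-0.9, scale=0.15, size=n_samples)  # "iron"

    # Suppose the ideal bond strength for a fast reaction is -1.2 eV; the
    # better catalyst is the one whose bond energy lands closer to it.
    target = -1.2
    p_ru_better = np.mean(np.abs(ru_bond - target) < np.abs(fe_bond - target))

    # The spread of the ensemble is the uncertainty estimate.
    print(f"Ru bond energy: {ru_bond.mean():.2f} +/- {ru_bond.std():.2f} eV")
    print(f"P(Ru is the better catalyst) = {p_ru_better:.2f}")
    ```

    The spread turns a single prediction into a probabilistic statement like the one in the figure caption – e.g. an 80 percent chance that ruthenium will outperform iron.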

    An Essential New Tool for Thousands of Studies

    DFT calculations are used in the Materials Genome Initiative to search through millions of solids and compounds, and are also widely used in drug design, said Kieron Burke, a professor of chemistry and physics at the University of California, Irvine, who was not involved in the study.

    “There were roughly 30,000 papers published last year using DFT,” he said. “I believe the technique they’ve developed will become absolutely necessary for these kinds of calculations in all fields in a very short period of time.”

    Thomas Bligaard, a senior staff scientist in charge of theoretical method development at SUNCAT, said the team has a lot of work ahead in implementing these ideas, especially in calculations attempting to make predictions of new phenomena or new functional materials.

    Other researchers involved in the study were Jess Wellendorff, Aleksandra Vojvodic, Felix Studt, and Frank Abild-Pedersen of SUNCAT and Karsten W. Jacobsen of the Technical University of Denmark. Funding for the research came from the DOE Office of Science.

    See the full article here.

