Tagged: Symmetry Magazine

  • richardmitnick 2:02 pm on September 22, 2014
    Tags: Symmetry Magazine

    From Symmetry: “Cosmic dust proves prevalent” 


    September 22, 2014
    Kathryn Jepsen

    Space dust accounts for at least some of the possible signal of cosmic inflation the BICEP2 experiment announced in March. How much remains to be seen.

    Space is full of dust, according to a new analysis from the European Space Agency’s Planck experiment.


    That includes the area of space studied by the BICEP2 experiment, which in March announced seeing a faint pattern left over from the big bang that could tell us about the first moments after the birth of the universe.

    Gravitational Wave Background from BICEP2

    The Planck analysis, which started before March, was not meant as a direct check of the BICEP2 result. It does, however, reveal that the level of dust in the area BICEP2 scientists studied is both significant and higher than they thought.

    “There is still a wide range of possibilities left open,” writes astronomer Jan Tauber, ESA project scientist for Planck, in an email. “It could be that all of the signal is due to dust; but part of the signal could certainly be due to primordial gravitational waves.”

    BICEP2 scientists study the cosmic microwave background, a uniform bath of radiation permeating the universe that formed when the universe first cooled enough after the big bang to be transparent to light. BICEP2 scientists found a pattern within the cosmic microwave background, one that would indicate that not long after the big bang, the universe went through a period of exponential expansion called cosmic inflation. The BICEP2 result was announced as the first direct evidence of this process.

    The problem is that the same pattern, called B-mode polarization, also appears in space dust. The BICEP2 team subtracted the then-known influence of the dust from their result. But based on today’s Planck result, they didn’t manage to scrub all of it.

    How much the dust influenced the BICEP2 result remains to be seen.

    In November, Planck scientists will release their own analysis of B-mode polarization in the cosmic microwave background, in addition to a joint analysis with BICEP2 specifically intended to check the BICEP2 result. These results could answer the question of whether BICEP2 really saw evidence of cosmic inflation.

    “While we can say the dust level is significant,” writes BICEP2 co-leader Jamie Bock of Caltech and NASA’s Jet Propulsion Laboratory, “we really need to wait for the joint BICEP2-Planck paper that is coming out in the fall to get the full answer.”

    [Me? I am rooting for my homey, Alan Guth, from Highland Park, NJ, USA]

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 1:51 pm on September 4, 2014
    Tags: GEANT4, Symmetry Magazine

    From Symmetry: “Forecasting the future” 


    September 04, 2014
    Monica Friedlander

    Physicists and other scientists use the GEANT4 toolkit to identify problems before they occur.

    Physicists can tell the future—or at least foresee multiple possible versions of it. They do this through computer simulations. Simulations can help scientists predict what will happen when a particular kind of particle hits a particular kind of material in a particle detector. But physicists are not the only scientists interested in predicting how particles and other matter will interact. This information is critical in multiple fields, especially those concerned about the effects of radiation.

    At CERN in 1974, scientists created the first version of GEANT (Geometry and Tracking) to help physicists create simulations. Today it is in its fourth iteration, developed by an international collaboration of about 100 scientists from 19 countries. Anyone can download the toolkit to a personal computer, use the C++ programming language to plug in details about the particle and material in question, and find out what will happen when the two meet.
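    GEANT4 itself is a large C++ toolkit and can’t be reproduced here, but the core idea it implements—tracking particles step by step through matter and sampling interactions from known probabilities—can be illustrated with a toy Monte Carlo. The attenuation coefficient and slab thickness below are invented for illustration; this is a sketch of the method, not GEANT4 code.

```python
import math
import random

def transmitted_fraction(mu, thickness, n=100_000, seed=42):
    """Toy Monte Carlo: propagate photons through a slab of material.

    Each photon travels a free path drawn from an exponential
    distribution with attenuation coefficient `mu` (1/cm); it escapes
    if that path exceeds the slab `thickness` (cm).
    """
    rng = random.Random(seed)
    escaped = sum(1 for _ in range(n)
                  if rng.expovariate(mu) > thickness)
    return escaped / n

mu, t = 0.2, 5.0             # hypothetical values: 0.2/cm, 5 cm slab
mc = transmitted_fraction(mu, t)
analytic = math.exp(-mu * t)  # Beer-Lambert law for comparison
print(round(mc, 3), round(analytic, 3))
```

    With enough simulated particles, the sampled transmission converges on the analytic answer—the same statistical principle a full GEANT4 simulation relies on, only with far richer physics.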

    Geant4 at CERN

    GEANT4 is used in some of the most advanced accelerator experiments in the world, but its user base has grown beyond the particle physics community.

    Space science and astrophysics

    NASA, the European Space Agency, Japan Aerospace Exploration Agency and many space-related companies such as Boeing and Lockheed Martin have all used GEANT4 simulations to assess radiation hazards in space.

    Most current and near-future unmanned spacecraft—headed to places like Earth orbit, the moon, Mars, Venus, Mercury, Jupiter and nearby asteroids—have their radiation hardness evaluated with GEANT4 simulations.

    Even one particle of radiation hitting a memory cell in an integrated circuit device onboard the International Space Station or other space equipment can have serious consequences. Based on information from GEANT4 simulations, scientists and engineers have made key modifications to the structure of integrated circuits and have optimized shielding structures to minimize radiation exposure.

    GEANT4 is also used for astrophysics studies. Recently scientists used it to simulate the acceleration of electrons in a solar flare and the emission of gamma rays from a pulsar.

    Air travel

    Earth’s atmosphere and magnetic field shield the planet from most cosmic radiation, but some does make it through. High altitudes are the most exposed to this radiation, as they are the least protected by layers of air, so airline passengers and crews receive higher doses of radiation than people on the ground. Airlines are using GEANT4 to calculate those doses and to find ways to better protect the people on board.

    Medical applications


    Imaging procedures such as CT and PET scans, which create detailed images of areas inside the body, are standard in modern diagnostic medicine. With GEANT4, medical researchers can simulate how such a procedure distributes radiation in the body while taking into account not only the motion of the machine but also that of the patient. The simulation can be done at all energy ranges and for all kinds of particles or scanning procedures.


    Traditional radiation therapy used to target cancer cells can inadvertently irradiate healthy cells in the patient. This is a serious concern, especially for younger patients who are at risk of developing secondary tumors later in life as a result. GEANT4 simulations are used to analyze and minimize this risk. By simulating the chain of interactions from beam to body, scientists can predict how various organs will be affected by treatment.


    An important part of modern security is the non-destructive scanning of cargo containers to prevent unauthorized and illegal transportation of radioactive material. Scientists use GEANT4 to simulate and optimize such scanning systems. GEANT4-based simulation is also used to optimize apparatuses for detecting explosives.

    Information technology and computing

    Just as GEANT4 can simulate the many possible ways a particle might interact in a detector, it can also simulate the many possible ways that a typical scientific code might run. GEANT4 scientists at SLAC National Accelerator Laboratory have been working with cutting-edge Silicon Valley companies such as Intel, Nvidia and Google to help them optimize their products for large-scale scientific computing.




  • richardmitnick 1:17 pm on September 3, 2014
    Tags: Symmetry Magazine

    From Symmetry: “Watching ‘the clock’ at the LHC” 


    September 03, 2014
    Sarah Charley

    As time ticks down to the restart of the Large Hadron Collider, scientists are making sure their detectors run like clockwork.

    Photo by Antonio Saba, CERN

    For the last two years, the Large Hadron Collider at CERN has been quietly slumbering while engineers and technicians prime it for the next run of data-taking in the summer of 2015.

    But this has been anything but a break for researchers from the LHC experiments.

    “Two years seems like a long time, but it goes by really fast,” says Michael Williams, a researcher on the LHCb experiment and assistant professor of physics at the Massachusetts Institute of Technology. “I think now it’s becoming a reality that running is coming soon, and it’s exciting.”

    CERN LHCb New

    One of the biggest tasks the collaborations are confronting right now is calibrating all the individual components so that their timing is completely synchronized. This synchronization of the components—called “the clock”—allows physicists to reconstruct the flights of particles through the different parts of the detector to form a picture of the entire collision event.

    “The clock is the foundation on which everything stands. It’s the heartbeat of the detector,” says UCLA physicist and CMS run coordinator Greg Rakness. “If the clock isn’t working, then the data makes no sense.”

    CERN CMS New

    The four largest LHC detectors—called ALICE, ATLAS, CMS and LHCb—each consist of dozens of smaller subdetectors, which in turn are supported by myriads of electronics and supporting subsystems. A huge challenge is ensuring that all of the subdetectors, electronics and supporting software are functioning as one single unit.



    “We have 18 different detectors that make up ALICE, and we have several different detection techniques,” says Federico Ronchetti, a scientist associated with CERN and the Italian laboratory INFN who serves as the ALICE experiment’s 2015 run coordinator. “You have to combine the different pieces of information to produce an event. This integration is one of the most critical parts of the overall detector commissioning.”

    As Rakness says: “In the end, it’s one detector.”

    In addition to being in time with themselves, the LHC detectors must be in time with the LHC. During this next run, high-energy bunches of protons accelerated inside the LHC will collide every 25 nanoseconds. If a detector’s timing is out of sync with the accelerator, scientists will have no way of accurately reconstructing the particle collisions.

    If the detector were out of sync with the LHC, it would mistakenly show large chunks of energy suddenly going missing—just what physicists expect would happen if a rarely interacting particle, such as a dark matter particle, passed through the detector.
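    At a 25-nanosecond bunch spacing the machine delivers 40 million crossings per second, and every detector hit has to be stamped with the right one. A toy sketch of that bookkeeping (the hit time and offset are invented numbers, not real calibration values):

```python
BUNCH_SPACING_NS = 25  # LHC bunch spacing for the next run, from the article

def crossing_number(hit_time_ns, clock_offset_ns=0.0):
    """Assign a detector hit to a bunch crossing by its (possibly
    miscalibrated) timestamp."""
    return int((hit_time_ns + clock_offset_ns) // BUNCH_SPACING_NS)

hit = 1012.0  # a hit 1012 ns into data-taking (invented)
print(crossing_number(hit))                         # correctly synchronized
print(crossing_number(hit, clock_offset_ns=25.0))   # one clock period off
print(1e9 / BUNCH_SPACING_NS)                       # crossings per second
```

    A clock offset of even one period silently shifts hits into the wrong event—exactly the kind of fake "missing energy" the calibration work guards against.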

    “What better way to create a fake ‘new physics’ signal than if half the detector is out of sync?” Rakness says. “You’d have new physics all the time!”

    Even though the task is daunting, the LHC researchers charged with commissioning the detectors are confident that they and their detectors will be ready for the accelerator’s second run in early 2015.

    “We understand our detector much better now,” says Kendall Reeves, a researcher for the University of Texas, Dallas, who works on the ATLAS experiment. “We have the experience from Run 1 to help out—and having that experience is invaluable. We are in a much better position now than we were at the beginning of Run 1.”

    “Nothing is too complicated,” Rakness says. “In the end, this whole complicated chain breaks down to a step-by-step process. And then it ticks.”

    CERN LHC particles

    LHC tunnel

    CERN LHC map




  • richardmitnick 1:01 pm on September 2, 2014
    Tags: Symmetry Magazine

    From Symmetry: “Detectors in daily life” 


    September 02, 2014
    Calla Cofield

    Ask someone what our world would look like if there were no cars or no telephones and they’ll probably have an immediate answer. It’s easy to imagine how our lives would be different without these things. But what if there were no particle detectors? Most people aren’t sure what would change if these devices disappeared—even though particle detectors are at work all around us.

    Tree diagram showing the relationship between types and classification of most common particle detectors

    Particle detectors play a role in undertakings from drug development to medical imaging, from protecting astronauts to dating ancient artifacts, from testing materials to understanding the universe.

    A particle detector is a device built to observe and identify particles. Some measure photons, particles of light such as the visible light from stars or the invisible X-rays we use to examine broken bones. Other particle detectors identify protons, neutrons and electrons—the particles that make up atoms—or even entire atoms. Some detectors are designed to detect antimatter. By sensing the electric charge and interactions of particles, the detectors can determine their presence, their energy and their motion.

    Without particle detectors, climate models and weather forecasts would be a bit cloudy.

    One of the most common tools used in modern weather forecasting is radar—radio waves sent through the air, bounced off objects (such as raindrops) and collected by radio-frequency (RF) detectors. Using radar, scientists can gather information about the location, size and distribution of raindrops or other precipitation, as well as wind speed and direction and other variables. Some more advanced instruments apply the same technique using microwaves or lasers instead of radio waves. Radar is also a primary tool in the study of tornadoes and other severe storms, often with the goal of improving safety measures for people on the ground.
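    The range measurement at the heart of radar is a simple time-of-flight calculation: the pulse travels out to the target and back at the speed of light, so the one-way distance is c·t/2. A minimal sketch (the echo time is an invented example, not real radar data):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def echo_range_km(round_trip_seconds):
    """Range to a radar target: the pulse travels out and back,
    so the one-way distance is c * t / 2."""
    return C * round_trip_seconds / 2 / 1000.0

# A storm cell whose echo returns after 200 microseconds (illustrative)
print(round(echo_range_km(200e-6), 1))  # about 30 km away
```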

    The Earth’s climate—the long-term behavior of the atmosphere—is largely created by the interplay between two fluids: water and air. Fluids can be studied in detail in the laboratory using experimental devices that carefully track their complex motions using X-rays and other techniques that rely on particle detectors. Work done in the lab feeds into a growing understanding of our planet’s complex climate.

    At the European research center CERN, the Cosmics Leaving Outdoor Droplets experiment, or CLOUD, is investigating the link between energetic particles from space, known as cosmic rays, and the formation of clouds in our atmosphere. The experiment suggests that the rays create aerosols, which act as seeds for new clouds. CLOUD scientists recreate atmospheric conditions inside a sealed chamber and use detectors to carefully monitor the changes taking place. CLOUD’s findings should help scientists better understand the effects of clouds and aerosols on Earth’s climate.

    Time for your checkup

    Particle detectors continue to change the way medical personnel see the human body—or more specifically, how they see into it. Many types of medical scanners, including PET (positron emission tomography) and MRI (magnetic resonance imaging) rely on particle detectors.

    Patients undergoing a PET scan receive an injection of molecules designed to accumulate in the hardest-working parts of the body, including the heart, the brain and organs such as the liver—as well as in trouble spots, such as tumors. Shortly after they arrive, a radioactive isotope bound to each molecule decays and emits a lightweight antimatter particle called a positron, which annihilates with a nearby electron to create a pair of gamma rays. These gamma rays carry information about the body part being studied. They exit the body, are picked up by particle detectors, and the information is translated into images.
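    The energy of each annihilation gamma ray follows directly from the electron’s rest mass via E = mc², which is why PET detectors are tuned to look for 511 keV photons. A quick check with standard constants:

```python
# Each annihilation photon carries the rest energy of one electron, E = m_e * c^2.
M_E = 9.109_383_7e-31   # electron mass, kg (CODATA value)
C = 299_792_458.0       # speed of light, m/s
EV = 1.602_176_634e-19  # joules per electron volt

energy_kev = M_E * C**2 / EV / 1000.0
print(round(energy_kev))  # ~511 keV, the signature PET scanners look for
```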

    MRI machines surround the body with strong magnetic fields. Manipulating those fields can cause hydrogen atoms inside the body to emit faint radio signals. Different body tissues emit slightly different signals, and a highly specialized RF detector measures those subtleties, revealing tissue structure.

    Even X-ray film is a simple type of particle detector: It turns black where it is exposed to X-rays and remains translucent where the X-rays are blocked by bones and teeth. More advanced X-ray detectors used today can image tissue as well.

    In addition, various types of particle detectors are used in radiation therapy for cancer, which bombards cancer cells with high-energy photons, protons or whole atoms. These particles are forms of ionizing radiation, meaning they can damage the molecules in cancer cells, including the cells’ DNA, making it more difficult for the cells to divide and, ideally, halting the growth of the tumor.

    At the corner of physics and pharmaceuticals

    Particle detectors help scientists shine a light on viruses and develop drugs to fight them.

    Intense, focused beams of light created by machines known as synchrotrons provide a window to the microscopic world. Shining such light on certain viruses—like those that cause strains of influenza, or the human immunodeficiency virus—reveals their unique, detailed structure. Particle detectors capture information about the interaction between the photons and the viruses and feed it back to the scientists. Armed with this information, researchers have engineered drugs that specifically target those viruses.

    Art and history

    Head to your local museum, and you may see evidence of particle detectors at work.

    Synchrotrons and other material analysis instruments can reveal the chemical make-up of art, artifacts and fossils. For anthropologists, this information can help determine the age and origin of objects. Art historians use X-ray, infrared, UV and visible light, and even beams of electrons, to gather detailed information about great works. This information is particularly important when deciding how best to restore and preserve paintings, sculptures and more. And in some cases, as with a painting by Vincent van Gogh, studies with particle detectors have revealed hidden treasures below the surface: earlier works of art painted over by the artist.

    By sea and by air

    Take a trip to the airport and you’ll be asked to place your bag on an X-ray scanner before you step through a metal detector or millimeter scanner—all of which use particle detectors.

    Cargo containers shipped by air or by sea undergo similar scanning processes. Hundreds to thousands of containers may pass through a major airport or seaport every day and must be scanned quickly and efficiently to ensure dangerous materials aren’t inside. Cargo scanners primarily use X-rays, but in recent years developments have been made using neutrons. Within this busy industry, scientists and engineers are being challenged to build better scanners and improved particle detectors.

    The final frontier

    In the hostile environment of space, astronauts require protection from solar radiation.

    High-energy photons and other particles ejected by the sun threaten to seriously harm humans who venture above the clouds. Down on the ground, earthlings are protected from most of these particles by the atmosphere. In space, particle detectors play a vital role in keeping astronauts safe from radiation. NASA scientists use particle detectors to carefully monitor each astronaut’s exposure to radiation. Satellite detectors look out for intense bursts of radiation from the sun, and provide a warning to astronauts when excess amounts are headed their way.

    The list goes on

    Particle detectors help test the structural integrity of industrial equipment, such as steam turbines and airplane engines. Smart phones and other computing devices contain semiconductors that are engineered using particle detectors in a process called ion implantation. In geology, neutrons and neutron detectors are used to identify oil reserves and rare minerals deep underground. Electron beams sterilize food, food packaging and medical equipment for consumer safety—and they use particle detectors. Muons—particles raining down from outer space—have been used to probe geophysical phenomena, like the internal structure of volcanoes and mountains. More recently, scientists have been working on devices that use muons to search for radioactive materials in waste containers or cargo ships.

    And of course, physicists use particle detectors to learn more about the universe, from finding subatomic particles such as the Higgs boson, to searching for new particles that would explain dark matter and other unsolved phenomena, to looking back in time at the first few seconds after the big bang.

    Particle detectors are used every day, making us safer, healthier and more knowledgeable and helping us get things done.




  • richardmitnick 11:58 am on August 29, 2014
    Tags: JUNO Experiment, Symmetry Magazine

    From Symmetry: “Massive neutrino experiment proposed in China” 


    August 29, 2014
    Calla Cofield

    China’s neutrino physics program could soon expand with a new experiment aimed at cracking a critical neutrino mystery.

    Physicists have proposed building one of the largest-ever neutrino experiments in the city of Jiangmen, China, about 60 miles outside of Hong Kong. It could help answer a fundamental question about the nature of neutrinos.

    Jiangmen Underground Neutrino Observatory

    The Jiangmen Underground Neutrino Observatory, or JUNO, gained official status in 2013 and established its collaboration this month. Scientists are currently awaiting approval to start constructing JUNO’s laboratory near the Yangjiang and Taishan nuclear power plants. If it is built, current projections anticipate it will start taking data in 2020.

    The plan is to bury the laboratory in a mountain under roughly half a mile of rock and earth, a shield from distracting cosmic rays. From this subterranean seat, JUNO’s primary scientific goal would be to resolve the question of neutrino mass. There are three known neutrino types, or flavors: electron, muon and tau. Scientists know the differences between the squares of the neutrino masses, but not the masses themselves—so they don’t yet know which neutrino is heaviest or lightest.

    “This is very important for our understanding of the neutrino picture,” says Yifang Wang, spokesperson for JUNO and director of the Institute of High Energy Physics of the Chinese Academy of Sciences. “For almost every neutrino model, you need to know which neutrino is heavier and which one is lighter. It has an impact on almost every other question about neutrinos.”

    To reach this goal, JUNO needs to acquire a hoard of data, which requires two key elements: a large detector and a high influx of neutrinos.

    The proposed detector is a liquid scintillator—the same basic setup used to detect neutrinos for the first time in 1956. The detector consists primarily of an acrylic sphere 34.5 meters (or nearly 115 feet) in diameter, filled with fluid engineered specifically for detecting neutrinos. When a neutrino interacts with the fluid, a chain reaction creates two tiny flashes of light. An additional sphere, made of photomultiplier tubes, would surround the acrylic sphere and capture these light signals.

    The more fluid the detector has, the more neutrino interactions the experiment can expect to see. Current liquid scintillator experiments include the Borexino experiment at the Gran Sasso Laboratory in Italy, which contains 300 tons of target liquid, and KamLAND in Japan, which contains a 1000-ton target. If plans go ahead, JUNO will be the largest liquid scintillator detector ever built, containing 20,000 tons of target liquid.
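    The quoted target mass can be sanity-checked against the sphere’s dimensions. The scintillator density below is an assumption (roughly that of linear alkylbenzene, a common liquid-scintillator base); with it, the 34.5-meter sphere lands in the same ballpark as the quoted 20,000 tons:

```python
import math

DIAMETER_M = 34.5        # acrylic sphere diameter quoted in the article
DENSITY_T_PER_M3 = 0.86  # assumed scintillator density, metric tons per m^3

# Volume of a sphere: (4/3) * pi * r^3
volume_m3 = (4.0 / 3.0) * math.pi * (DIAMETER_M / 2) ** 3
mass_tons = volume_m3 * DENSITY_T_PER_M3
print(round(volume_m3), round(mass_tons))  # ~21,500 m^3, ~18,500 tons
```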

    To discover the mass order of the three neutrino flavors, JUNO will look specifically at electron antineutrinos produced by the two nearby nuclear power plants.

    “Only in Asia are there relatively new reactor power plants that can have four to six reactor cores in the same place,” Wang says. With the potential to run four to six cores each, the Chinese reactors would send a dense shower of neutrinos toward JUNO’s detector. Over time, a picture of the antineutrino energies would emerge. The order of the neutrino masses influences what that energy spectrum looks like.

    Experiment representatives say JUNO could reach this goal by 2026.

    It’s possible that the NOvA experiment in the United States or the T2K experiment in Japan, both of which are currently taking data, could make a measurement of the neutrino mass hierarchy before JUNO. At least four proposed experiments could also reach the same goal. But only JUNO would make the measurement via this particular approach.

    The JUNO experiment would also tackle various other questions about the nature of neutrinos and refine some previously made measurements. If a supernova went off in our galaxy, JUNO would be able to observe the neutrinos it released. JUNO would also be the largest and most sensitive detector for geoneutrinos, which are produced by the decay of radioactive elements in the earth.

    Six nations have officially joined China in the collaboration: the Czech Republic, France, Finland, Germany, Italy and Russia. US scientists are actively participating in JUNO, but the United States is not currently an official member of the collaboration.




  • richardmitnick 2:13 pm on August 5, 2014
    Tags: Symmetry Magazine

    From Symmetry: “Neutrino researchers pull double duty” 


    August 05, 2014
    Hanae Armitage

    Neutrino researchers work collaboratively, sharing and comparing results to help advance the field of neutrino physics.

    For Philip Rodrigues, a postdoc at the University of Rochester, receiving a new dataset from the MINERvA neutrino experiment means two things: that one of the neutrino experiments in which he participates has met a milestone and that the other can verify some of its predictions.

    Rodrigues, who is a member of both MINERvA in the US and the T2K experiment in Japan, is not the only neutrino physicist to double dip like this. More than 50 percent of neutrino researchers work on multiple projects simultaneously.

    Scientists stand with the MINERvA neutrino detector, located 330 feet underground at Fermi National Accelerator Laboratory.

    T2K experiment passes five-sigma threshold

    “You want the scientists designing future generations of experiments to have a broad experience in current neutrino research,” says Fermilab physicist Debbie Harris, co-leader of the MINERvA neutrino experiment. “So it’s great to have people on multiple projects.”

    Unlike collaborative researchers such as Rodrigues, the neutrino itself is extremely antisocial. We can’t see it, we can’t feel it, and we don’t entirely understand it. But it may be important for understanding the formation of the universe.

    The elusive nature of neutrinos makes working together even more appealing. Scientists who share Fermilab’s neutrino beamline meet regularly to discuss neutrino flux, the quantity of neutrinos per unit area observed in the detectors, and how that information can inform their respective projects.

    “It’s impossible to have one detector that can measure every last thing about the interaction at every neutrino energy that’s important,” Harris says. “So that’s why we need to have a lot of different experiments to help each other make these measurements.”

    Neutrino experiments are usually in one of two categories: interaction experiments and oscillation experiments. The primary goal of interaction experiments is to observe the way neutrinos interact with different materials. The primary goal of oscillation experiments is to observe the way neutrinos, which come in three types, change from one type to the next. Both types of experiments can give researchers insight into neutrino characteristics such as their masses and how the different types of neutrinos relate to each other.

    Both kinds of experiments shoot extremely intense beams of neutrinos at particle detectors, but the placement of the detector depends on the type of experiment. Detectors for oscillation experiments are located much farther away, miles from the neutrino source, to give the particles time to change.
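    The dependence on distance comes from the standard two-flavor oscillation formula, P = sin²(2θ) · sin²(1.27 · Δm² · L / E), with the baseline L in kilometers, the beam energy E in GeV and the mass splitting Δm² in eV². The parameter values below are illustrative, not any experiment’s published numbers:

```python
import math

def oscillation_probability(theta, dm2_ev2, baseline_km, energy_gev):
    """Two-flavor approximation:
    P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with L in km, E in GeV and dm2 in eV^2."""
    return (math.sin(2 * theta) ** 2
            * math.sin(1.27 * dm2_ev2 * baseline_km / energy_gev) ** 2)

theta, dm2 = 0.8, 2.5e-3  # illustrative mixing angle (rad) and splitting (eV^2)
for L in (1, 295, 735):   # a near detector vs two far-detector baselines, km
    print(L, round(oscillation_probability(theta, dm2, L, 0.6), 4))
```

    At a near detector the oscillation has barely begun, while hundreds of kilometers downstream the probability can be large—which is why oscillation experiments put their far detectors miles from the source.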

    Data from interaction experiments is critical for scientists at oscillation experiments to understand how the particles will interact in their detectors.

    “Neutrinos are neutrinos, and we can measure how they interact with different nuclei, and those results can help us constrain models,” Harris says. “Then those models can be used for experiments that use the same type of target for their far detector.”

    In addition, data from similar experiments can be used to double-check one another.

    “I think the more data we can get, and the more measurements we can take, the more input we have to help us understand what’s going on in terms of the physics,” Rodrigues says. “It’s very useful, both for the individual experiment, as well as the advancement of the field as a whole.”




  • richardmitnick 2:47 pm on August 1, 2014
    Tags: Symmetry Magazine

    From Symmetry: “Using the Higgs boson to search for clues” 


    August 01, 2014
    Sarah Charley

    After a new particle such as the Higgs boson is discovered, scientists want to measure all of its properties as accurately as possible. Not only does this help determine how it fits into our greater understanding of matter, but it can also provide hints of what we don’t yet know.

    “To some people, the discovery of the Higgs completed everything,” says Colin Jessop, a professor of physics at the University of Notre Dame. “But to particle physicists, it is the beginning of everything.”

    Historically, precision measurements of known particles have yielded priceless information. In the 1960s and ’70s, for example, scientists from SLAC National Accelerator Laboratory and the Massachusetts Institute of Technology proved the existence of quarks, fundamental particles that make up protons and neutrons, by making precise measurements of protons. More recently, scientists at Fermi National Accelerator Laboratory and CERN carefully measured the mass of the heaviest of the quarks, the top quark, and of the W boson, a particle that carries the force that mediates atomic decay, to help them estimate the mass of the then-undiscovered Higgs boson.

    The quark structure of the proton and the neutron. (The color assignment of individual quarks is not important, only that all three colors are present.)

    Fermilab Tevatron

    A collision event involving top quarks

    Now, scientists at the Large Hadron Collider are hoping that precision measurements of the Higgs boson will help them solve the next big mysteries, such as the origin of dark matter.

    LHC tube

    CERN LHC map

    “The reason we proposed the concept of dark matter is because we cannot explain the total mass of the universe,” says Swagato Banerjee, a postdoc at the University of Wisconsin. “And the only way we know how fundamental particles acquire mass is through the Higgs mechanism. So if dark matter is fundamental, it has to interact with the Higgs to acquire mass, at least in our known framework.”

    When the Higgs is produced at the LHC, it quickly decays into lighter, more stable particles. If the Higgs interacts with dark matter, then it should be able to decay into dark matter. Currently LHC scientists are studying all the possible ways the Higgs can decay into other particles to search for any unexplained decays that could be hints of something new, like dark matter.
    Hunting for new processes

    If scientists find that the Higgs does something unexpected, it could be a clue that we don’t yet grasp the full picture, says Maria Cepeda Hermida, a postdoc at the University of Wisconsin.

    “If you only look for what you think exists, then you could miss something important,” Cepeda says. “We have to look outside of the box for what we don’t expect to see if there are any new surprises.”

    Cepeda is involved in a study that searches for evidence of the Higgs boson disobeying the well-defined laws of the Standard Model of particle physics. Recent studies from the CMS and ATLAS experiments at the LHC showed that the Higgs boson decays directly to particles of matter, some of which have a special characteristic called flavor.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The CMS detector at CERN


    The family of matter particles called leptons is made up of particles of three flavors: electron, muon and tau. Because the Higgs boson has no associated flavor, the Standard Model predicts that the sum of its decay products will also have no flavor. A particle of a certain flavor and its antiparticle will cancel one another out.

    Cepeda is part of an analysis group Jessop leads that is looking for evidence of the Higgs breaking this rule by decaying to two particles of different flavors, namely, a tau and a muon.

    The preliminary results from this study restrict the likelihood of this process enormously—down to less than three out of every 200 decays. But they cannot rule it out entirely. In fact, scientists saw a small deviation from their predictions, which is either the result of a normal statistical fluctuation or the first glimpse of a new process.

    If scientists find that the Higgs can break the laws of flavor, it would help explain why fundamental particles come in a variety of masses.

    “We see families of particles that are identical except in their mass,” says Roni Harnik, a theorist at Fermilab. “So what mechanism ‘decides’ that the tau is much heavier than the muon, which in turn is much heavier than the electron? In the Standard Model, it is the Higgs alone.”

    Evidence for this process would also open up many other possibilities, Harnik says.

    “Seeing this decay would teach us something profound: that the Higgs boson is not the exclusive source of mass in the universe, and that we have more interesting things to discover.”
    A mystery in itself

    The Higgs boson may be physicists’ best tool to look for new particles because the Standard Model predicts that it interacts with everything that has mass. But the Higgs itself still holds many mysteries. For instance, the mass of the Higgs is precariously perched between two stable zones predicted by the Standard Model.

    “If there is just the Standard Model Higgs and nothing else, then the mass of the Higgs boson is theoretically unstable,” says Hideki Okawa, a postdoc at Brookhaven National Laboratory. “Many people think that there should be something else that stabilizes the Higgs mass.”

    The Higgs is also a unique particle, unlike anything else ever observed by scientists. So the question remains: Is the Higgs alone? Or are there others?

    “We think there could be more Higgs bosons,” Okawa says. “This would make the theory more natural.”

    The restart of the LHC in early 2015 will give scientists more data to probe the properties of the Higgs boson even further and search for new physics or phenomena just out of reach.

    “Finding the Higgs opens more questions than it answers,” Banerjee says. “But it also focuses the questions we had before its discovery.”

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.

    ScienceSprings is powered by MAINGEAR computers

  • richardmitnick 1:23 pm on July 29, 2014 Permalink | Reply
    Tags: Symmetry Magazine, Synchrotron science

    From Symmetry: “Partnership generates bright ideas for photon science” 


    July 29, 2014
    Calla Cofield

    Photon science, a spin-off of particle physics, has returned to its roots for help developing better, faster detectors.

    In the late 1940s, scientists doing fundamental physics research at the General Electric Research Laboratory in Schenectady, New York, noticed a bright arc of light coming from their particle accelerator. As a beam of electrons whipped around the accelerator’s circular track, photons trickled away like water from a punctured hose.

    At the time, this was considered a problem; the leaking photons were sapping energy from the electron beam. But scientists at labs around the world were already looking into the phenomenon, and not long after, circular particle accelerators were being built explicitly to capture the escaping light.

    Today, these instruments are called synchrotrons, and they serve as powerful tools for studying the atomic and molecular structure of a seemingly limitless number of materials.

    Despite this shared origin, synchrotron science and particle physics long existed largely independently of one another. However, recent developments in the design and construction of particle detectors for synchrotron experiments—as well as new light source instruments—have sparked a reunion.

    The custom-detector revolution

    Modern synchrotrons generate powerful beams of light—infrared, ultraviolet, or X-ray—and aim them at a sample—such as a protein being tested for use in a drug. The light interacts with the sample, bouncing off of it, passing through it or being absorbed into it. (Imagine a beam of sunlight diffracting in a crystal or reflecting off the face of a watch.) By detecting how the sample changes the light, scientists can gather all kinds of information about its structure, make-up and behavior.

    A synchrotron facility can host dozens of experiments at a time. The detector plays a vital role in each one: It captures the light, which becomes the data, which holds the answers to the experimenter’s questions.

    And yet from the 1950s through the 1990s, the vast majority of detectors used at synchrotrons were not built specifically for these experiments. Designers and engineers would usually buy off-the-shelf X-ray detectors intended for other purposes, or adapt used detectors as best they could to fit the needs of the users.

    Heinz Graafsma, head of the detector group at DESY, the German Electron Synchrotron, says the science coming out of synchrotrons during this time of patchwork detectors was fantastic, thanks largely to the dramatic improvements to the quality of the light beam. But that same rapid advancement made developing detectors “like shooting at a moving target,” Graafsma says. Customized detectors could take as long as a decade to design and build, and in that time the brightness of the light beam could go up by two or three orders of magnitude, rendering the detector obsolete.

    The lack of custom detectors may also simply have come down to tight budgets, says Sol Gruner, former director of the Cornell High Energy Synchrotron Source, or CHESS.

    Cornell MacCHESS

    Frustrated with the limits of detector technology, Gruner became one of the early pioneers to build custom detectors for synchrotrons, allowing scientists to conduct experiments that could not be done otherwise. His work in the 1990s helped set the stage for a cultural shift in synchrotron detectors.
    A renewed partnership

    In the last 15 years, things have changed, especially with the advent of the free-electron laser—a kind of synchrotron on steroids.

    The free-electron laser FELIX at the FOM Institute for Plasma Physics Rijnhuizen (nl), Nieuwegein, The Netherlands.

    Photon scientists have begun to face some of the same challenges as particle physicists: Scientists at light sources increasingly must collect huge amounts of data at a dizzying rate.

    So they have begun to look to particle physicists for technological insight. The partnerships that have developed have turned out to be beneficial for both sides.

    The Linac Coherent Light Source, an X-ray free-electron laser at SLAC National Accelerator Laboratory, has six beamlines open for users. The LCLS produced 6 petabytes of data in its first five years of operation and currently averages 1.5 petabytes per year, a volume of stored data comparable to that of the major experiments at the Large Hadron Collider at CERN.
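The stated data volume is easy to sanity-check. This back-of-the-envelope sketch (not from the article; its only input is the quoted 1.5 petabytes per year) converts that figure into an average sustained throughput:

```python
# Rough check: what sustained data rate does 1.5 petabytes per year
# correspond to? Uses decimal petabytes (1 PB = 1e15 bytes).

SECONDS_PER_YEAR = 365.25 * 24 * 3600
petabytes_per_year = 1.5  # the figure quoted above

bytes_per_year = petabytes_per_year * 1e15
rate_mb_per_s = bytes_per_year / SECONDS_PER_YEAR / 1e6

print(f"average rate ≈ {rate_mb_per_s:.0f} MB/s")  # roughly 50 MB/s, around the clock
```

In other words, the facility must continuously absorb on the order of tens of megabytes every second, which is why dedicated data teams borrowed from particle physics are needed.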


    “There’s a big team, it’s actually bigger than the detector team, handling the big data that comes out of the LCLS,” says Chris Kenny, head of the LCLS detector group. “A lot of the know-how and a lot of the people were taken directly from particle physics.”

    At the National Synchrotron Light Source at Brookhaven National Laboratory, a group led by Pete Siddons is developing a detector that will use something called a Vertically Integrated Photon Imaging Chip, designed by high-energy physicists at Fermilab. VIPIC is an example of a circuit built with a specific purpose in mind, rather than a generic circuit that can have many applications. High-energy physics helped pioneer the creation of application-specific integrated circuits, called ASICs.

    Brookhaven NSLS

    With the advanced capability of the VIPIC chip, the researchers hope the new detector will allow synchrotron users to watch fast processes as they take place. This could include watching materials undergo phase transitions, such as the change between liquid and solid.

    Siddons and his NSLS detector team are building the silicon detectors that will capture the light, but they need particle physicists to fabricate the highly specialized integrated circuits for sorting all the incoming information.

    “Making integrated circuits is a very, very specialized, tricky business,” Siddons says.

    Physicists and engineers at Fermilab design the circuits, which are then fabricated at commercial foundries. Fermilab scientists then put the circuits and particle sensors together into large integrated systems. The collaborative project will also involve contributions by scientists at Argonne National Laboratory and AGH University of Science and Technology in Krakow, Poland.

    “You need very expensive software tools to do it—for doing the design and layout and simulating and checking,” Siddons says. “And they have that, because the high-energy physics community has been building large detector systems forever.”

    Benefits for both sides

    Compared to those used in synchrotron science, particle physics detectors live very long lives.

    “We may build a few large scale detectors a decade,” says Ron Lipton, a Fermilab scientist who has been involved with the development of several large scale particle physics detectors and is a collaborator on the VIPIC chip.

    Partnering with synchrotron science has given particle physicists a chance to develop and test new technologies on a shorter time scale, he says.

    Scientists at the Paul Scherrer Institute in Switzerland used chip technology from particle physics experiments at CERN to create one of the most widely used custom synchrotron detectors: the Pilatus detector. Researchers at Fermilab and DESY say the work put into technologies like this has already fed new information and new ideas back into particle physics.

    Collaboration also provides work for detector engineers and increases the market need for detector components, which drives down costs, Lipton says.

    These days, more synchrotron facilities employ scientists and engineers to design custom detectors for lab use or for future commercialization. Half a dozen detectors, designed especially for light sources, are already available.

    “It is at the moment very exciting,” Graafsma says. “There are budgets available. The facilities see this as an important issue. But also the technology is now available. So we can really build the detectors we want.”

    See the full article here.


  • richardmitnick 8:58 pm on July 24, 2014 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “How to weigh a galaxy cluster” 


    July 24, 2014
    Glenn Roberts Jr.

    The study of galaxy clusters is bringing scientists closer to an understanding of the “dark” universe.

    Step on a scale and you’ll get a quick measure of your weight. Weighing galaxy clusters, groups of hundreds or thousands of galaxies bound together by gravity, isn’t so easy.

    But scientists have many ways to do it. This is fortunate for particle astrophysics; determining the mass of galaxy clusters led to the discovery of dark matter, and it’s key to the continuing study of the “dark” universe: dark matter and dark energy.

    “Galaxy cluster measurements are one of the most powerful probes of cosmology we have,” says Steve Allen, an associate professor of physics at SLAC National Accelerator Laboratory and Stanford University.

    When you weigh a galaxy cluster, what you see is not all that you get. Decades ago, when scientists first estimated the masses of galaxy clusters based on the motions of the galaxies within them, they realized that something strange was going on. The galaxies were moving faster than expected, which implied that the clusters were more massive than previously thought based on the amount of light they emitted. The prevailing explanation today is that galaxy clusters contain vast amounts of dark matter.

    Measurements of the masses of galaxy clusters can tell scientists about the sizes and shapes of the dark matter “halos” enveloping them and can help them determine the effects of dark energy, which scientists think is driving the universe’s accelerating expansion.

    Today, researchers use a combination of simulations and space- and ground-based telescope observations to estimate the total masses of galaxy clusters.

    Redshift, blueshift: Just as an ambulance’s siren seems higher in pitch as it approaches and lower as it speeds into the distance, the light of objects traveling away from us is shifted to longer, “redder” wavelengths, and the light of those traveling toward us is shifted to shorter, “bluer” wavelengths. Measurements of these shifts in light coming from galaxies orbiting a galaxy cluster can tell scientists how much gravitational pull the cluster has, which is related to its mass.
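As a minimal illustration of the redshift method, the sketch below uses the low-speed Doppler approximation v ≈ cz, where z is the fractional wavelength shift. The wavelengths are made-up example numbers, not measurements from any real cluster:

```python
# Illustrative only: convert a measured wavelength shift into a
# line-of-sight velocity via the low-speed Doppler approximation
#   z = (λ_observed − λ_emitted) / λ_emitted,   v ≈ c · z

C_KM_S = 299_792.458  # speed of light, km/s

def doppler_velocity(lam_emitted_nm: float, lam_observed_nm: float) -> float:
    """Return the line-of-sight velocity in km/s (positive = redshift)."""
    z = (lam_observed_nm - lam_emitted_nm) / lam_emitted_nm
    return C_KM_S * z

# Example: a hydrogen-alpha line emitted at 656.3 nm, observed at 658.5 nm
v = doppler_velocity(656.3, 658.5)
print(f"recession velocity ≈ {v:.0f} km/s")
```

Measuring such velocities for many galaxies orbiting a cluster constrains the cluster's gravitational pull, and hence its mass.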

    Courtesy of NASA

    Gravitational lensing: Gravitational lensing, theorized by Albert Einstein, occurs when the light from a distant galaxy is bent by the gravitational pull of a massive object between it and the viewer. This bending distorts the image of the background galaxy (pictured above). Where the effects are strong, the process can cause dramatic distortions; multiple images of the galaxy can appear. Typically, however, the effects are subtle and require careful measurements to detect. The greater the lensing effect caused by a galaxy cluster, the larger the galaxy cluster’s mass.
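For the strong-lensing case, the mass enclosed within the so-called Einstein radius follows from the lens geometry alone. The sketch below assumes the simple point-lens formula and purely illustrative distances; none of these numbers come from the article:

```python
# A sketch of the strong-lensing mass estimate. For a lens with
# Einstein radius θ_E, the enclosed mass is
#   M(θ_E) = θ_E² · (c² / 4G) · (D_l · D_s / D_ls)
# where D_l, D_s, D_ls are the distances to the lens, to the source,
# and from lens to source. All inputs below are illustrative.

import math

G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8                      # speed of light, m/s
GPC = 3.086e25                   # meters per gigaparsec
M_SUN = 1.989e30                 # kg
ARCSEC = math.pi / (180 * 3600)  # radians per arcsecond

def einstein_mass(theta_arcsec, d_l_gpc, d_s_gpc, d_ls_gpc):
    """Mass (in solar masses) enclosed within the Einstein radius."""
    theta = theta_arcsec * ARCSEC
    dist = (d_l_gpc * d_s_gpc / d_ls_gpc) * GPC
    return theta**2 * C**2 / (4 * G) * dist / M_SUN

# A 30-arcsecond Einstein ring, lens at 1 Gpc, source at 2 Gpc:
m = einstein_mass(30, d_l_gpc=1, d_s_gpc=2, d_ls_gpc=1)
print(f"enclosed mass ≈ {m:.1e} solar masses")  # a few 10^14 M_sun, a typical cluster scale
```

Real analyses model extended mass distributions rather than point lenses, but the scaling is the same: a larger lensing distortion implies a larger cluster mass.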

    X-rays: Galaxy clusters are filled with superhot, 10- to 100-million-degree gas that shines brightly at X-ray wavelengths. Scientists use X-ray data from space telescopes to find and study massive galaxy clusters. They can use the measured properties of the gas to infer the clusters’ masses.

    The Sunyaev-Zel’dovich effect: The Sunyaev-Zel’dovich effect is a shift in the wavelength of the Cosmic Microwave Background—light left over from the big bang—that occurs when this light passes through the hot gas in a galaxy cluster. The size of the wavelength shift can tell scientists the mass of the galaxy cluster it passed through.

    Cosmic Background Radiation Planck
    CMB (ESA/Planck)

    “These methods are much more powerful in combination than alone,” says Aaron Roodman, a faculty member at the Kavli Institute for Particle Astrophysics and Cosmology at SLAC National Accelerator Laboratory.

    Forthcoming data from the Dark Energy Survey, the under-construction Large Synoptic Survey Telescope and Dark Energy Spectroscopic Instrument, improved Sunyaev-Zel’dovich effect measurements, and the soon-to-be-launched ASTRO-H and eROSITA X-ray telescopes should further improve galaxy cluster mass estimates and advance cosmology. Computer simulations are also playing an important role in testing and improving mass estimates based on observational data.

    LSST Telescope

    DESI Dark Energy Spectroscopic Instrument

    Astro H

    eROSITA Max Planck

    Even with an extensive toolkit, it remains a challenging business to weigh galaxy clusters, says Marc Kamionkowski, a theoretical physicist and professor of physics and astronomy at Johns Hopkins University. They are constantly changing; they continue to suck in matter; their dark matter halos can overlap; and no two are alike.

    “It’s like asking how many birds are in my backyard,” he says.

    Despite this, Allen says he sees no roadblocks to pushing mass estimates to within a few percent accuracy.

    “We will be able to take full advantage of these amazing new data sets that are coming along,” he says. “We are going to see rapid advances.”

    See the full article here.


  • richardmitnick 8:48 pm on July 11, 2014 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “US reveals its next generation of dark matter experiments” 


    July 11, 2014
    Kathryn Jepsen

    Together, the three experiments will search for a variety of types of dark matter particles.

    Two US federal funding agencies announced today which experiments they will support in the next generation of the search for dark matter.

    The Department of Energy and National Science Foundation will back the Super Cryogenic Dark Matter Search-SNOLAB, or SuperCDMS; the LUX-Zeplin experiment, or LZ; and the next iteration of the Axion Dark Matter eXperiment, ADMX-Gen2.

    SuperCDMS detector



    “We wanted to pool limited resources to put together the most optimal unified national dark matter program we could create,” says Michael Salamon, who manages DOE’s dark matter program.

    Second-generation dark matter experiments are defined as experiments that will be at least 10 times as sensitive as the current crop of dark matter detectors.

    Program directors from the two federal funding agencies decided which experiments to pursue based on the advice of a panel of outside experts. Both agencies have committed to working to develop the new projects as expeditiously as possible, says Jim Whitmore, program director for particle astrophysics in the division of physics at NSF.

    Physicists have seen plenty of evidence of the existence of dark matter through its strong gravitational influence, but they do not know what it looks like as individual particles. That’s why the funding agencies put together a varied particle-hunting team.

    Both LZ and SuperCDMS will look for a type of dark matter particle called the WIMP, or weakly interacting massive particle. ADMX-Gen2 will search for a different kind of dark matter particle called the axion.

    LZ is capable of identifying WIMPs with a wide range of masses, including those much heavier than any particle the Large Hadron Collider at CERN could produce. SuperCDMS will specialize in looking for light WIMPs with masses lower than 10 GeV. (And of course both LZ and SuperCDMS are willing to stretch their boundaries a bit if called upon to double-check one another’s results.)

    If a WIMP hits the LZ detector, a high-tech barrel of liquid xenon, it will produce quanta of light, called photons. If a WIMP hits the SuperCDMS detector, a stack of hockey-puck-sized crystals of silicon or germanium, it will produce quanta of sound, called phonons.

    “But if you detect just one kind of signal, light or sound, you can be fooled,” says LZ spokesperson Harry Nelson of the University of California, Santa Barbara. “A number of things can fake it.”

    SuperCDMS and LZ will be located underground—SuperCDMS at SNOLAB in Ontario, Canada, and LZ at the Sanford Underground Research Facility in South Dakota—to shield the detectors from some of the most common fakers: cosmic rays. But they will still need to deal with natural radiation from the decay of uranium and thorium in the rock around them: “One member of the decay chain, lead-210, has a half-life of 22 years,” says SuperCDMS spokesperson Blas Cabrera of Stanford University. “It’s a little hard to wait that one out.”

    To combat this, both experiments collect a second signal, in addition to light or sound—charge. The ratio of the two signals lets them know whether the light or sound came from a dark matter particle or something else.

    SuperCDMS will be especially skilled at this kind of differentiation, which is why the experiment should excel at searching for hard-to-hear low-mass particles.

    LZ’s strength, on the other hand, stems from its size.

    Dark matter particles are constantly flowing through the Earth, so their interaction points in a dark matter detector should be distributed evenly throughout. Quanta of radiation, however, can be stopped by much less significant barriers—alpha particles by a piece of paper, beta particles by a sandwich. Even gamma ray particles, which are harder to stop, cannot reach the center of LZ’s 7-ton detector. When a particle with the right characteristics interacts in the center of LZ, scientists will know to get excited.
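The self-shielding argument can be made quantitative with a simple exponential-attenuation sketch. The attenuation length and detector depth below are rough assumptions for MeV-scale gammas in liquid xenon, not figures from the article:

```python
# Illustrative self-shielding estimate (inputs are assumptions).
# Photons are absorbed exponentially as they travel through matter:
#   f_survive = exp(−d / λ)
# where λ is the attenuation length. For MeV-scale gammas in liquid
# xenon λ is of order several centimeters, so essentially none reach
# the middle of a meter-scale detector.

import math

attenuation_length_cm = 6.0  # assumed λ for ~1 MeV gammas in liquid xenon
depth_to_center_cm = 70.0    # assumed distance from the wall to the center

f_survive = math.exp(-depth_to_center_cm / attenuation_length_cm)
print(f"fraction of gammas reaching the center ≈ {f_survive:.1e}")
```

With these assumed numbers, fewer than one gamma in ten thousand penetrates to the center, while dark matter interactions remain evenly distributed throughout the volume.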

    The ADMX detector, on the other hand, approaches the dark matter search with a more delicate touch. The dark matter axions ADMX scientists are looking for are too light for even SuperCDMS to find.

    If an axion passes through a magnetic field, it can convert into a photon. The ADMX team encourages this subtle transformation by placing its detector within a strong magnetic field, and then tries to detect the change.

    “It’s a lot like an AM radio,” says ADMX-Gen2 co-spokesperson Gray Rybka of the University of Washington in Seattle.

    The experiment slowly turns the dial, tuning itself to watch for one axion mass at a time. Its main background noise is heat.

    “The more noise there is, the harder it is to hear and the slower you have to tune,” Rybka says.

    In its current iteration, it would take around 100 years for the experiment to get through all of the possible channels. But with the addition of a super-cooling refrigerator, ADMX-Gen2 will be able to search all of its current channels, plus many more, in the span of just three years.
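The speedup from cooling follows from the Dicke radiometer equation: the integration time needed per frequency channel, at fixed signal-to-noise, grows as the square of the system noise temperature. The temperatures in this sketch are assumptions chosen only to reproduce the quoted 100-year-to-3-year improvement, not ADMX's actual specifications:

```python
# Why cooling speeds up the axion scan: in a Dicke radiometer the
# time to reach a fixed signal-to-noise in one channel scales as
#   t ∝ T_sys²
# so halving the noise temperature quadruples the scan speed.
# The temperatures below are illustrative assumptions.

def scan_time_ratio(t_sys_old_kelvin: float, t_sys_new_kelvin: float) -> float:
    """How many times faster the scan runs after cooling."""
    return (t_sys_old_kelvin / t_sys_new_kelvin) ** 2

speedup = scan_time_ratio(2.0, 0.35)  # assumed before/after noise temperatures
print(f"speedup ≈ {speedup:.0f}x")    # ~33x, roughly 100 years → 3 years
```

A dilution refrigerator pushing the system noise toward the quantum limit is what buys ADMX-Gen2 this factor.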

    With SuperCDMS, LZ and ADMX-Gen2 in the works, the next several years of the dark matter search could be some of its most interesting.

    See the full article here.

