Tagged: Symmetry Magazine

  • richardmitnick 7:11 pm on February 4, 2016
    Tags: Neutrino mass experiments, Symmetry Magazine

    From Symmetry: “Weighing the lightest particle” 

    Symmetry

    02/04/16
    Diana Kwon

    Physicists are using one of the oldest laws of nature to find the mass of the elusive neutrino.

    Neutrinos are everywhere. Every second, 100 trillion of them pass through your body unnoticed, hardly ever interacting. Though exceedingly abundant, they are the lightest particles of matter, and physicists around the world are taking on the difficult challenge of measuring their mass.

    For a long time, physicists thought neutrinos were massless. This belief was overturned by the discovery that neutrinos oscillate between three flavors: electron, muon and tau. This happens because each flavor contains a mixture of three mass types, neutrino-1, neutrino-2 and neutrino-3, which travel at slightly different speeds.
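
    As a rough illustration of how oscillation works, here is a two-flavor version of the standard oscillation probability (a common simplification; real analyses use the full three-flavor treatment, and the parameter values below are only illustrative):

    ```python
    import math

    def osc_prob(theta, dm2_ev2, L_km, E_GeV):
        """Two-flavor oscillation probability:
        P = sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])."""
        return math.sin(2 * theta) ** 2 * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

    # Illustrative values roughly in the atmospheric-oscillation regime:
    theta = math.radians(45)   # mixing angle
    dm2 = 2.5e-3               # squared-mass splitting in eV^2
    print(osc_prob(theta, dm2, L_km=295, E_GeV=0.6))  # close to 1: near-maximal flavor change
    ```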

    According to the measurements taken so far, neutrinos must weigh less than 2 electronvolts (a minute fraction of the mass of the tiny electron, which weighs 511,000 electronvolts). A new generation of experiments is attempting to lower this limit—and possibly even identify the actual mass of this elusive particle.

    Where did the energy go?

    Neutrinos were first proposed by the Austrian-born theoretical physicist Wolfgang Pauli to resolve a problem with beta decay. In the process of beta decay, a neutron in an unstable nucleus transforms into a proton while emitting an electron. Something about this process was especially puzzling to scientists. During the decay, some energy seemed to go missing, breaking the well-established law of energy conservation.

    Pauli suggested that the disappearing energy was slipping away in the form of another particle. This particle was later dubbed the neutrino, or “little neutral one,” by the Italian physicist Enrico Fermi.

    Scientists are now applying the principle of energy conservation to direct neutrino mass experiments. By very precisely measuring the energy of electrons released during the decay of unstable atoms, physicists can deduce the mass of neutrinos.

    “The heavier the neutrino is, the less energy is left over to be carried by the electron,” says Boris Kayser, a theoretical physicist at Fermilab. “So there is a maximum energy that an electron can have when a neutrino is emitted.”
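
    Kayser’s statement is energy conservation written down: the electron’s maximum energy is the total energy released in the decay minus the energy locked up in the neutrino’s rest mass. A minimal sketch, using tritium’s roughly 18.6-kiloelectronvolt decay energy and purely hypothetical neutrino masses:

    ```python
    Q_keV = 18.6                      # approximate energy released in tritium beta decay
    for m_nu_eV in (0.0, 0.2, 2.0):   # hypothetical neutrino masses, in electronvolts
        # The electron endpoint drops by the neutrino's rest-mass energy:
        E_max_keV = Q_keV - m_nu_eV / 1000.0
        print(f"m_nu = {m_nu_eV} eV -> electron endpoint ~ {E_max_keV:.4f} keV")
    ```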

    These experiments are considered direct because they rely on fewer assumptions than other neutrino mass investigations. Indirect approaches, by contrast, infer the mass from neutrinos’ imprints on other observable things, such as galaxy clustering.

    Detecting the kinks

    Of the direct neutrino mass experiments, KATRIN, which is based at the Karlsruhe Institute of Technology in Germany, is the closest to beginning its search.

    “If everything works as planned, I think we’ll have very beautiful results in 2017,” says Guido Drexlin, a physicist at KIT and co-spokesperson for KATRIN.

    KATRIN plans to measure the energy of the electrons released from the decay of the radioactive isotope tritium. It will do so by using a giant tank tuned to a precise voltage that allows only electrons above a specific energy to pass through to the detector at the other side. Physicists can use this information to plot the rate of decays at any given energy.

    KIT Katrin experiment
    KATRIN

    The mass of a neutrino will cause a disturbance in the shape of this graph [no graph is present]. Each neutrino mass type should create its own kink. KATRIN, with a peak sensitivity of 0.2 electronvolts (a factor of 100 better than previous experiments), will look for a “broad kink” that physicists can use to calculate the average neutrino mass.

    Another tritium experiment, Project 8, is attempting a completely different method to measure neutrino mass. The experimenters plan to detect the energy of each individual electron ejected from a beta decay by measuring the frequency of its spiraling motion in a magnetic field. Though still in the early stages, it has the potential to go beyond KATRIN’s sensitivity, giving physicists high hopes for its future.
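
    The idea behind Project 8 can be sketched with one formula: an electron spiraling in a magnetic field emits at its cyclotron frequency, which depends on the field and on the electron’s energy. The field value below is assumed for illustration only, not a description of Project 8’s actual hardware:

    ```python
    import math

    e, m_e, c = 1.602176634e-19, 9.1093837015e-31, 2.99792458e8

    def cyclotron_freq(ke_keV, B_tesla):
        """Relativistic cyclotron frequency f = e*B / (2*pi*gamma*m_e)."""
        gamma = 1.0 + ke_keV * 1e3 * e / (m_e * c**2)
        return e * B_tesla / (2 * math.pi * gamma * m_e)

    # An 18.6 keV electron (near the tritium endpoint) in an assumed 1-tesla field:
    print(f"{cyclotron_freq(18.6, 1.0) / 1e9:.2f} GHz")  # ~27 GHz; small energy changes shift this frequency
    ```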

    Project 8 Full setup
    Project 8 Full setup. I am told that “most of the workhorse hardware is at the University of Washington in Seattle”.

    “KATRIN is the furthest along—it will come out with guns blazing,” says Joseph Formaggio, a physicist at MIT and Project 8 co-spokesperson. “But if they see a signal, the first thing people are going to want to know is whether the kink they see is real. And we can come in and do another experiment with a completely different method.”

    Cold capture

    Others are looking for these telltale kinks using a completely different element, holmium, which decays through a process called electron capture. In these events, an electron in an unstable atom combines with a proton, turning it into a neutron while releasing a neutrino.

    Physicists are measuring the very small amount of energy released in this decay by enclosing the holmium source in microscopic detectors that are operated at very low temperatures (typically below minus 459.2 degrees Fahrenheit). Each holmium decay leads to a tiny increase of the detector’s temperature (about 1/1000 of a degree Fahrenheit).
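
    The measurement principle is simply a temperature rise equal to the deposited energy divided by the detector’s heat capacity. A minimal sketch with illustrative numbers (holmium-163 releases about 2.8 kiloelectronvolts per decay; the heat capacity below is an assumed order of magnitude, not a quoted detector specification):

    ```python
    E_decay_J = 2.8e3 * 1.602e-19   # ~2.8 keV holmium-163 decay energy, in joules
    C_detector_J_per_K = 1e-12      # assumed heat capacity of the tiny absorber
    dT_K = E_decay_J / C_detector_J_per_K
    print(f"Temperature rise ~ {dT_K * 1e3:.2f} mK")  # a fraction of a millikelvin per decay
    ```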

    “To lower the limit on the electron neutrino mass, you need a good thermometer that can measure these very small changes of temperature with high precision,” says Loredana Gastaldo, a Heidelberg University physicist and spokesperson for the ECHo experiment.

    There are currently three holmium experiments, ECHo and HOLMES in Europe and NuMECS in the US, which are in various stages of testing their detectors and producing isotopes of holmium.

    The holmium and tritium experiments will help lower the limit on how heavy neutrinos can be, but it may be that none will be able to definitively determine their mass. It will likely require a combination of both direct and indirect neutrino mass experiments to provide scientists with the answers they seek—or, physicists might even find completely unexpected results.

    “Don’t bet on neutrinos,” Formaggio says. “They’re kind of unpredictable.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 5:31 pm on February 2, 2016
    Tags: Radioactivity, Symmetry Magazine

    From Symmetry: “This radioactive life” 

    Symmetry

    02/02/16
    Chris Patrick

    radioactive

    Radiation is everywhere. The question is: How much?

    An overly plump atomic nucleus just can’t keep itself together.

    When an atom has too many protons or neutrons, it’s inherently unstable. Although it might sit tight for a while, eventually it can’t hold itself together any longer and it spontaneously decays, spitting out energy in the form of waves or particles.

    The end result is a smaller, more stable nucleus. The spit-out waves and particles are known as radiation, and the process of nuclear decay that produces them is called radioactivity.

    Radiation is a part of life. There are radioactive elements in most of the materials we encounter on a daily basis, which constantly spray us with radiation. For the average American, this adds up to a dose of about 620 millirem of radiation every year. That’s roughly equivalent to 10 abdominal X-rays.

    Scientists use the millirem unit to express how much a radiation dose damages the human body. A person receives 1 millirem during an airline flight from one U.S. coast to the other.
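
    For a sense of scale, the numbers above can be combined directly; the abdominal X-ray figure is simply back-calculated from the article’s “10 X-rays per year” comparison:

    ```python
    annual_dose_mrem = 620
    flight_mrem = 1                        # one coast-to-coast flight
    xray_mrem = annual_dose_mrem / 10      # a year's dose ~ 10 abdominal X-rays
    print(f"One abdominal X-ray ~ {xray_mrem:.0f} mrem")
    print(f"Coast-to-coast flights to match a year's dose: {annual_dose_mrem / flight_mrem:.0f}")
    ```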

    But where exactly does our annual dose of radiation come from? Looking at sources, we can split the dosage in two nearly equal parts: About half comes from natural background radiation and half comes from manmade sources.

    Natural background radiation originates from outer space, the atmosphere, the ground, and our own bodies. There’s radon in the air we breathe, radium in the water we drink and miscellaneous radioactive elements in the food we eat. Some of these pass through our bodies without much ado, but some get incorporated into our molecules. When the nuclei eventually decay, our own bodies expose us to tiny doses of radiation.

    “We’re exposed to background radiation whether we like it or not,” says Sayed Rokni, radiation safety officer and radiation protection department head at SLAC National Accelerator Laboratory. “That exists no matter what we do. I wouldn’t advise it, but we could choose not to have dental X-rays. But we can’t choose not to be exposed to terrestrial radiation—radiation that is in the crust of the earth, or from cosmic radiation.”

    It’s no reason to panic, though.

    “The human species, and everything around us, has evolved over the ages while receiving radiation from natural sources. It has formed us. So clearly there is an acceptable level of radiation,” Rokni says.

    Any radiation not considered background comes from manmade sources, primarily through diagnostic or therapeutic medical procedures. In the early 1980s, medical procedures accounted for 15 percent of an American’s yearly radiation exposure—they now account for 48 percent.

    “The amount of natural background radiation has stayed the same,” says Don Cossairt, Fermilab radiation protection manager. “But radiation from medical procedures has blossomed, perhaps with corresponding dramatic improvements in treating many diseases and ailments.”

    Growth in the use of medical imaging has raised the average American’s yearly exposure from its 1980s average of 360 millirem to 620 millirem. Today’s annual average is not regarded as harmful to health by any regulatory authority.

    While medical procedures make up most of the manmade radiation we receive, about 2 percent of the overall annual dose comes from radiation emitted by some consumer products. Most of these products are probably in your home right now. Simply examining the average kitchen, one finds a cornucopia of items, both manmade consumer products and natural foods, that emit enough radiation to be detected with a Geiger counter.

    Are there Brazil nuts in your pantry? They’re the most radioactive food there is. A Brazil nut tree’s roots reach deep underground, where there is more radium; the tree absorbs this radioactive element and passes it on to its nuts. Brazil nuts also contain potassium, which occurs in tandem with potassium-40, a naturally occurring radioactive isotope.

    Potassium-40 is the most prevalent radioactive element in the food we eat. Potassium-packed bananas are well known for their radioactivity, so much so that a banana’s worth of radioactivity is used as an informal measurement of radiation. It’s called the Banana Equivalent Dose. One BED is equal to 0.01 millirem. A typical chest X-ray is somewhere around 200 to 1000 BED. A fatal dose of radiation is about 50 million BED in one sitting.

    Some other potassium-40-containing munchies that emit radiation include carrots, potatoes, lima and kidney beans and red meat. From food and water alone, the average person receives an annual internal dose of about 30 millirem. That’s 3000 bananas!
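
    The banana bookkeeping in the last two paragraphs is just unit conversion; no new measurements are involved:

    ```python
    BED_mrem = 0.01                      # one Banana Equivalent Dose, in millirem
    food_water_mrem = 30                 # average annual internal dose from food and water
    print(food_water_mrem / BED_mrem)    # 3000 bananas per year
    print(2 / BED_mrem, 10 / BED_mrem)   # a 2-10 mrem chest X-ray -> roughly 200-1000 BED
    print(50e6 * BED_mrem / 1000)        # ~50 million BED -> ~500 rem, the fatal range
    ```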

    Even the dish off of which you’re eating may be giving you a slight dose of radiation. The glaze of some older ceramics contains uranium, thorium or good ol’ potassium-40 to make it a certain color, especially red-orange pottery made pre-1960s. Likewise, some yellowish and greenish antique glassware contains uranium as a colorant. Though this dinnerware might make a Geiger counter click, it’s still safe to eat with.

    Your smoke detector, which usually hangs silently on the ceiling until its batteries go dead, is radioactive too. That’s how it can save you from a burning building: A small amount of americium-241 in the device allows it to detect when there’s smoke in the air.

    “It’s not dangerous unless you take it out in the garage and beat it up with a hammer to release the radioactivity,” Cossairt says. The World Nuclear Association notes that the americium dioxide found in smoke detectors is insoluble and would “pass through the digestive tract without delivering a significant radiation dose.”

    Granite countertops also contain uranium and thorium, which decay into radon gas. Most of the gas gets trapped in the countertop, but some can be released and add a small amount to the radon level in a home—which primarily comes from the soil a structure sits on.

    Granite doesn’t just emit radiation inside the home. People living in areas with more granite rock receive an extra boost of radiation per year.

    Yearly radiation exposure varies significantly depending on where you live. People at higher altitudes receive a greater yearly dose of radiation from space.

    But not to worry if you live in a locale with lots of altitude and granite, like Denver, Colorado. “No health effect due to radiation exposure has ever been correlated with people living at higher altitudes,” Cossairt says. Similarly, no one has noted a correlation between health and the increased dose of radiation from environmental granite rock.

    It doesn’t matter if you’re living at altitude or sea level, in the Rocky Mountains or on Maryland’s Eastern Shore—radiation is everywhere. But annual doses from background and manmade sources aren’t enough to worry about. So enjoy your banana and feel free to grab another handful of Brazil nuts.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 3:58 pm on January 20, 2016
    Tags: Antineutrinos, Symmetry Magazine

    From Symmetry: “Is the neutrino its own antiparticle?” 

    Symmetry

    01/20/16
    Signe Brewster

    The mysterious particle could hold the key to why matter won out over antimatter in the early universe.

    Artwork by Sandbox Studio, Chicago with Ana Kova

    Almost every particle has an antimatter counterpart: a particle with the same mass but opposite charge, among other qualities.

    This seems to be true of neutrinos, tiny particles that are constantly streaming through us. Judging by the particles released when a neutrino interacts with other matter, scientists can tell when they’ve caught a neutrino versus an antineutrino.

    But certain characteristics of neutrinos and antineutrinos make scientists wonder: Are they one and the same? Are neutrinos their own antiparticles?

    This isn’t unheard of. Gluons and even Higgs bosons are thought to be their own antiparticles. But if scientists discover neutrinos are their own antiparticles, it could be a clue as to where they get their tiny masses—and whether they played a part in the existence of our matter-dominated universe.

    Dirac versus Majorana

    The idea of the antiparticle came about in 1928 when British physicist Paul Dirac developed what became known as the Dirac equation. His work sought to explain what happened when electrons moved at close to the speed of light. But his calculations resulted in a strange requirement: that electrons sometimes have negative energy.

    “When Dirac wrote down his equation, that’s when he learned antiparticles exist,” says André de Gouvêa, a theoretical physicist and professor at Northwestern University. “Antiparticles are a consequence of his equation.”

    In 1932, physicist Carl Anderson discovered the antimatter partner of the electron that Dirac had foreseen. He called it the positron—a particle like an electron but with a positive charge.

    Dirac predicted that, in addition to having opposite charges, antimatter partners should have opposite handedness as well.

    A particle is considered right-handed if its spin is in the same direction as its motion. It is considered left-handed if its spin is in the opposite direction.

    Dirac’s equation allowed for neutrinos and anti-neutrinos to be different particles, and, as a result, four types of neutrino were possible: left- and right-handed neutrinos and left- and right-handed antineutrinos. But if the neutrinos had no mass, as scientists thought at the time, only left-handed neutrinos and right-handed antineutrinos needed to exist.

    In 1937, Italian physicist Ettore Majorana debuted another theory: Neutrinos and antineutrinos are actually the same thing. The Majorana equation described neutrinos that, if they happened to have mass after all, could turn into antineutrinos and then back into neutrinos again.

    Artwork by Sandbox Studio, Chicago with Ana Kova

    The matter-antimatter imbalance

    Whether neutrino masses were zero remained a mystery until 1998, when the Super-Kamiokande and SNO experiments found they do indeed have very small masses—an achievement recognized with the 2015 Nobel Prize for Physics.

    Super-Kamiokande Detector
    Super-Kamiokande neutrino detector

    SNOLAB
    SNO detector [under construction]

    Since then, experiments have cropped up across Asia, Europe and North America searching for hints that the neutrino is its own antiparticle.

    The key to finding this evidence is something called lepton number conservation. Scientists consider it a fundamental law of nature that lepton number is conserved, meaning that the number of leptons minus the number of antileptons involved in an interaction should remain the same before and after the interaction occurs.

    Scientists think that, just after the big bang, the universe should have contained equal amounts of matter and antimatter. The two types of particles should have interacted, gradually canceling one another until nothing but energy was left behind. Somehow, that’s not what happened.

    Finding out that lepton number is not conserved would open up a loophole that would allow for the current imbalance between matter and antimatter. And neutrino interactions could be the place to find that loophole.

    Neutrinoless double-beta decay

    Scientists are looking for lepton number violation in a process called double beta decay, says SLAC theorist Alexander Friedland, who specializes in the study of neutrinos.

    In its common form, double beta decay is a process in which a nucleus decays into a different nucleus and emits two electrons and two antineutrinos. This balances leptonic matter and antimatter both before and after the decay process, so it conserves lepton number.

    If neutrinos are their own antiparticles, it’s possible that the antineutrinos emitted during double beta decay could annihilate one another and disappear, violating lepton number conservation. This is called neutrinoless double beta decay.
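
    The bookkeeping behind lepton number is simple: electrons count as +1 and antineutrinos as -1. A minimal sketch of the two decays just described:

    ```python
    LEPTON_NUMBER = {"electron": +1, "antineutrino": -1}

    def total_lepton_number(products):
        return sum(LEPTON_NUMBER[p] for p in products)

    # Ordinary double beta decay: two electrons and two antineutrinos are emitted.
    print(total_lepton_number(["electron", "electron", "antineutrino", "antineutrino"]))  # 0: conserved

    # Neutrinoless double beta decay: only the two electrons appear.
    print(total_lepton_number(["electron", "electron"]))  # +2: lepton number is violated
    ```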

    Such a process would favor matter over antimatter, creating an imbalance.

    “Theoretically it would cause a profound revolution in our understanding of where particles get their mass,” Friedland says. “It would also tell us there has to be some new physics at very, very high energy scales—that there is something new in addition to the Standard Model we know and love.”

    Standard Model
    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    It’s possible that neutrinos and antineutrinos are different particles and that the additional neutrino and antineutrino states called for in Dirac’s equation exist. The two missing states could be so elusive that physicists have yet to spot them.

    But spotting evidence of neutrinoless double beta decay would be a sign that Majorana had the right idea instead—neutrinos and antineutrinos are the same.

    “These are very difficult experiments,” de Gouvêa says. “They’re similar to dark matter experiments in the sense they have to be done in very quiet environments with very clean detectors and no radioactivity from anything except the nucleus you’re trying to study.”

    Physicists are still evaluating their understanding of the elusive particles.

    “There have been so many surprises coming out of neutrino physics,” says Reina Maruyama, a professor at Yale University associated with the CUORE neutrinoless double beta decay experiment.

    CUORE experiment
    CUORE neutrinoless double beta decay experiment at Gran Sasso in Italy.

    “I think it’s really exciting to think about what we don’t know.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 3:56 pm on January 19, 2016
    Tags: Symmetry Magazine

    From Symmetry: “A speed trap for dark matter” 

    Symmetry

    01/19/16
    Manuel Gnida

    Analyzing the motion of X-ray sources could help researchers identify dark matter signals.

    ASTRO-H, an X-ray satellite of the Japan Aerospace Exploration Agency

    Dark matter or not dark matter? That is the question when it comes to the origin of intriguing X-ray signals scientists have found coming from space.

    In a theory paper published today in Physical Review Letters, scientists have suggested a surprisingly simple way of finding the answer: by setting up a speed trap for the enigmatic particles.

    Eighty-five percent of all matter in the universe is dark: It doesn’t emit light, nor does it interact much with regular matter other than through gravity.

    The nature of dark matter remains one of the biggest mysteries of modern physics. Most researchers believe that the invisible substance is made of fundamental particles, but so far they’ve evaded detection. One way scientists hope to prove their particle assumption is by searching the sky for energetic light that would emerge when dark matter particles decayed or annihilated each other in space.

    Over the past couple of years, several groups analyzing data from two X-ray satellites—the European Space Agency’s XMM-Newton and NASA’s Chandra X-ray space observatories—reported the detection of faint X-rays with a well-defined energy of 3500 electronvolts (3.5 keV).

    ESA XMM Newton
    ESA/XMM-Newton

    NASA Chandra Telescope
    NASA/Chandra

    The signal emanated from the center of the Milky Way; its nearest neighbor galaxy, Andromeda; and a number of galaxy clusters.

    Andromeda Galaxy. Adam Evans

    Some scientists believe it might be a telltale sign of decaying dark matter particles called sterile neutrinos—hypothetical heavier siblings of the known neutrinos produced in fusion reactions in the sun, radioactive decays and other nuclear processes. However, other researchers argue that there could be more mundane astrophysical origins such as hot gases.

    There might be a straightforward way of distinguishing between the two possibilities, suggest researchers from Ohio State University and the Kavli Institute for Particle Astrophysics and Cosmology [KIPAC], a joint institute of Stanford University and SLAC National Accelerator Laboratory.

    It involves taking a closer look at the Doppler shifts of the X-ray signal. The Doppler effect is the shift of a signal to higher or lower frequencies depending on the relative velocity between the signal source and its observer. It’s used, for instance, in roadside speed traps by the police, but it could also help astrophysicists “catch” dark matter particles.

    “On average, dark matter moves differently than gas,” says study co-author Ranjan Laha from KIPAC. “Dark matter has random motion, whereas gas rotates with the galaxies to which it is confined. By measuring the Doppler shifts in different directions, we can in principle tell whether a signal—X-rays or any other frequency—stems from decaying dark matter particles or not.”

    Researchers would even know if the signal were caused by the observation instrument itself, because then the Doppler shift would be zero for all directions.

    Although this is a promising approach, it can’t yet be applied to the 3.5-keV X-rays because the associated Doppler shifts are very small. Current instruments either don’t have enough energy resolution for the analysis or they don’t operate in the right energy range.
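
    The size of the expected effect shows why. For gas or dark matter moving at typical galactic speeds of a few hundred kilometers per second, the non-relativistic Doppler formula shifts a 3500-electronvolt line by only a few electronvolts (the velocity below is an illustrative value):

    ```python
    E_line_eV = 3500.0    # the candidate X-ray line
    v_m_s = 200e3         # illustrative line-of-sight speed, about 200 km/s
    c = 2.998e8
    print(f"Doppler shift ~ {E_line_eV * v_m_s / c:.1f} eV")  # ~2 eV, demanding eV-scale energy resolution
    ```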

    However, this situation may change very soon with ASTRO-H, an X-ray satellite of the Japan Aerospace Exploration Agency, whose launch is planned for early this year. As the researchers show in their paper, it will have just the right specifications to return a verdict on the mystery X-ray line. Dark matter had better watch its speed.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:33 pm on January 14, 2016
    Tags: Symmetry Magazine

    From Symmetry: “Exploring the dark universe with supercomputers” 

    Symmetry


    01/14/16
    Katie Elyce Jones

    Next-generation telescopic surveys will work hand-in-hand with supercomputers to study the nature of dark energy.

    The 2020s could see a rapid expansion in dark energy research.

    For starters, two powerful new instruments will scan the night sky for distant galaxies. The Dark Energy Spectroscopic Instrument, or DESI, will measure the distances to about 35 million cosmic objects, and the Large Synoptic Survey Telescope, or LSST, will capture high-resolution videos of nearly 40 billion galaxies.

    DESI Dark Energy Spectroscopic Instrument
    LBL DESI

    LSST Exterior
    LSST Telescope
    LSST Camera
    LSST, the building that will house it in Chile, and the camera, being built at SLAC

    Both projects will probe how dark energy—the phenomenon that scientists think is causing the universe to expand at an accelerating rate—has shaped the structure of the universe over time.

    But scientists use more than telescopes to search for clues about the nature of dark energy. Increasingly, dark energy research is taking place not only at mountaintop observatories with panoramic views but also in the chilly, humming rooms that house state-of-the-art supercomputers.

    The central question in dark energy research is whether it exists as a cosmological constant—a repulsive force that counteracts gravity, as Albert Einstein suggested a century ago—or if there are factors influencing the acceleration rate that scientists can’t see. Alternatively, Einstein’s theory of gravity [General Relativity] could be wrong.

    “When we analyze observations of the universe, we don’t know what the underlying model is because we don’t know the fundamental nature of dark energy,” says Katrin Heitmann, a senior physicist at Argonne National Laboratory. “But with computer simulations, we know what model we’re putting in, so we can investigate the effects it would have on the observational data.”

    A simulation shows how matter is distributed in the universe over time. Katrin Heitmann, et al., Argonne National Laboratory

    Growing a universe

    Heitmann and her Argonne colleagues use their cosmology code, called HACC, on supercomputers to simulate the structure and evolution of the universe. The supercomputers needed for these simulations are built from hundreds of thousands of connected processors and typically crunch well over a quadrillion calculations per second.

    The Argonne team recently finished a high-resolution simulation of the universe expanding and changing over 13 billion years, most of its lifetime. Now the data from their simulations is being used to develop processing and analysis tools for the LSST, and packets of data are being released to the research community so cosmologists without access to a supercomputer can make use of the results for a wide range of studies.

    Risa Wechsler, a scientist at SLAC National Accelerator Laboratory and Stanford University professor, is the co-spokesperson of the DESI experiment. Wechsler is producing simulations that are being used to interpret measurements from the ongoing Dark Energy Survey, as well as to develop analysis tools for future experiments like DESI and LSST.

    Dark Energy Survey
    Dark Energy Camera
    CTIO Victor M Blanco 4m Telescope
    DES, The DECam camera, built at FNAL, and the Victor M Blanco 4 meter telescope in Chile that houses the camera.

    “By testing our current predictions against existing data from the Dark Energy Survey, we are learning where the models need to be improved for the future,” Wechsler says. “Simulations are our key predictive tool. In cosmological simulations, we start out with an early universe that has tiny fluctuations, or changes in density, and gravity allows those fluctuations to grow over time. The growth of structure becomes more and more complicated and is impossible to calculate with pen and paper. You need supercomputers.”
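
    The linear part of that growth can in fact be sketched on a laptop; it is the fully nonlinear clustering that demands supercomputers. A toy calculation of the linear growth factor for a flat universe containing matter and a cosmological constant (the parameter values are illustrative, not the values used in the groups’ production simulations):

    ```python
    import numpy as np

    omega_m, omega_l = 0.3, 0.7   # illustrative matter and dark-energy fractions

    def E(a):
        """H(a)/H0 for a flat universe with matter and a cosmological constant."""
        return np.sqrt(omega_m / a**3 + omega_l)

    def growth_factor(a, n=20000):
        """D(a) = 2.5 * omega_m * E(a) * integral_0^a da' / (a' * E(a'))^3."""
        ap = np.linspace(1e-6, a, n)
        f = 1.0 / (ap * E(ap)) ** 3
        integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(ap))   # trapezoid rule
        return 2.5 * omega_m * E(a) * integral

    # Fluctuations grow by roughly this factor between a = 0.1 (redshift 9) and today (a = 1):
    print(growth_factor(1.0) / growth_factor(0.1))   # ~8: a bit less than matter-only growth, slowed by dark energy
    ```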

    Supercomputers have become extremely valuable for studying dark energy because—unlike dark matter, which scientists might be able to create in particle accelerators—dark energy can only be observed at the galactic scale.

    “With dark energy, we can only see its effect between galaxies,” says Peter Nugent, division deputy for scientific engagement at the Computational Cosmology Center at Lawrence Berkeley National Laboratory.

    Trial and error bars

    “There are two kinds of errors in cosmology,” Heitmann says. “Statistical errors, meaning we cannot collect enough data, and systematic errors, meaning that there is something in the data that we don’t understand.”

    Computer modeling can help reduce both.

    DESI will collect about 10 times more data than its predecessor, the Baryon Oscillation Spectroscopic Survey, and LSST will generate 30 laptops’ worth of data each night. But even these enormous data sets do not fully eliminate statistical error.

    LBL BOSS
    LBL BOSS telescope

    Simulation can support observational evidence by modeling similar conditions to see if the same results appear consistently.

    “We’re basically creating the same size data set as the entire observational set, then we’re creating it again and again—producing up to 10 to 100 times more data than the observational sets,” Nugent says.

    Processing such large amounts of data requires sophisticated analyses. Simulations make this possible.

    To program the tools that will compare observational and simulated data, researchers first have to model what the sky will look like through the lens of the telescope. In the case of LSST, this is done before the telescope is even built.

    After populating a simulated universe with galaxies that are similar in distribution and brightness to real galaxies, scientists modify the results to account for the telescope’s optics, Earth’s atmosphere, and other limiting factors. By simulating the end product, they can efficiently process and analyze the observational data.

    Simulations are also an ideal way to tackle many sources of systematic error in dark energy research. By all appearances, dark energy acts as a repulsive force. But if other, inconsistent properties of dark energy emerge in new data or observations, different theories and a way of validating them will be needed.

    “If you want to look at theories beyond the cosmological constant, you can make predictions through simulation,” Heitmann says.

    A conventional way to test new scientific theories is to introduce change into a system and compare it to a control. But in the case of cosmology, we are stuck in our universe, and the only way scientists may be able to uncover the nature of dark energy—at least in the foreseeable future—is by unleashing alternative theories in a virtual universe.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:04 pm on January 12, 2016
    Tags: Symmetry Magazine

    From Symmetry: “Black holes” 

    Symmetry

    01/12/16
    Ali Sundermier

    Let yourself be pulled into the weird world of black holes.


    Imagine, somewhere in the galaxy, the corpse of a star so dense that it punctures the fabric of space and time. So dense that it devours any surrounding matter that gets too close, pulling it into a riptide of gravity that nothing, not even light, can escape.

    And once matter crosses over the point of no return, the event horizon, it spirals helplessly toward an almost infinitely small point, a point where spacetime is so curved that all our theories break down: the singularity. No one gets out alive.

    Black holes sound too strange to be real. But they are actually pretty common in space. Dozens are known, there are probably millions more in the Milky Way, and perhaps a billion times that number lurk in galaxies beyond our own. Scientists also believe there could be a supermassive black hole at the center of nearly every galaxy, including our own. The makings and dynamics of these monstrous warpings of spacetime have been confounding scientists for centuries.

    A history of black holes

    It all started in England in 1665, when an apple broke from the branch of a tree and fell to the ground. Watching from his garden at Woolsthorpe Manor, Isaac Newton began thinking about the apple’s descent: a line of thought that, two decades later, ended with his conclusion that there must be some sort of universal force governing the motion of apples and cannonballs and even planetary bodies. He called it gravity.

    Newton realized that any object with mass would have a gravitational pull. He found that as mass increases, gravity increases. To escape an object’s gravity, you would need to reach its escape velocity. To escape the gravity of Earth, you would need to travel at a rate of roughly 11 kilometers per second.
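
    Newton’s escape-velocity figure follows from v = sqrt(2GM/r). A quick check for Earth:

    ```python
    import math

    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M_earth = 5.972e24   # kg
    R_earth = 6.371e6    # m

    v_esc = math.sqrt(2 * G * M_earth / R_earth)
    print(f"{v_esc / 1000:.1f} km/s")   # ~11.2 km/s
    ```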

    It was Newton’s discovery of the laws of gravity and motion that, 100 years later, led Reverend John Michell, a British polymath, to the conclusion that if there were a star much more massive or much more compressed than the sun, its escape velocity could surpass even the speed of light. He called these objects “dark stars.” Twelve years later, French scientist and mathematician Pierre Simon de Laplace arrived at the same conclusion and offered mathematical proof for the existence of what we now know as black holes.

    In 1915, Albert Einstein set forth the revolutionary theory of general relativity, which regarded space and time as a curved four-dimensional object. Rather than viewing gravity as a force, Einstein saw it as a warping of space and time itself. A massive object, such as the sun, would create a dent in spacetime, a gravitational well, causing any surrounding objects, such as the planets in our solar system, to follow a curved path around it.

    A month after Einstein published this theory, German physicist Karl Schwarzschild discovered something fascinating in Einstein’s equations. Schwarzschild found a solution that led scientists to the conclusion that a region of space could become so warped that it would create a gravitational well that no object could escape.
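
    Schwarzschild’s solution gives the size of that inescapable region: the Schwarzschild radius r = 2GM/c², the radius within which a mass M would have to be squeezed. For the sun and Earth:

    ```python
    G, c = 6.674e-11, 2.998e8

    def schwarzschild_radius_m(M_kg):
        return 2 * G * M_kg / c**2

    print(f"Sun:   {schwarzschild_radius_m(1.989e30) / 1000:.1f} km")   # ~3 km
    print(f"Earth: {schwarzschild_radius_m(5.972e24) * 1000:.1f} mm")   # ~9 mm
    ```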

    Up until 1967, these mysterious regions of spacetime had not been granted a universal title. Scientists tossed around terms like “collapsar” or “frozen star” when discussing the dark plots of inescapable gravity. At a conference in New York, physicist John Wheeler popularized the term “black hole.”

    How to find a black hole

    During star formation, gravity compresses matter until it is stopped by the star’s internal pressure. If the internal pressure does not stop the compression, it can result in the formation of a black hole.

    Some black holes are formed when massive stars collapse. Others, scientists believe, were formed very early in the universe, a billion years after the big bang.

    There is no limit to how immense a black hole can be; some are more than a billion times the mass of the sun. According to general relativity, there is also no limit to how small they can be (although quantum mechanics suggests otherwise). Black holes grow in mass as they continue to devour their surrounding matter. Smaller black holes accrete matter from a companion star while the larger ones feed off of any matter that gets too close.

    Black holes are bounded by an event horizon, beyond which not even light can escape. Because no light can get out, it is impossible to see past this surface into a black hole. But just because you can’t see a black hole doesn’t mean you can’t detect one.

    Scientists can detect black holes by looking at the motion of nearby stars and gas, as well as matter accreted from the black hole’s surroundings. This matter spins around the black hole, creating a flat disk called an accretion disk. The whirling matter loses energy and gives off radiation in the form of X-rays and other electromagnetic radiation before it eventually passes the event horizon.

    This is how astronomers identified Cygnus X-1 in 1971. Cygnus X-1 was found as part of a binary star system in which an extremely hot and bright star called a blue supergiant formed an accretion disk around an invisible object. The binary star system was emitting X-rays, which are not usually produced by blue supergiants. By calculating how far and fast the visible star was moving, astronomers were able to calculate the mass of the unseen object. Although it was compressed into a volume smaller than the Earth, the object was more than six times as massive as our sun.
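
    The logic of that measurement can be sketched with the binary “mass function,” which uses only the visible star’s orbital period and velocity amplitude: f = P K³ / (2πG). The numbers below are illustrative values of roughly the right size for the Cygnus X-1 system, not the published analysis, and the final mass also depends on the companion star’s mass and the orbit’s tilt:

    ```python
    import math

    G = 6.674e-11
    M_sun = 1.989e30

    P = 5.6 * 86400   # orbital period, seconds (roughly 5.6 days; illustrative)
    K = 75e3          # visible star's line-of-sight velocity amplitude, m/s (illustrative)

    f_M = P * K**3 / (2 * math.pi * G)   # sets the scale of the unseen object's mass
    print(f"Mass function ~ {f_M / M_sun:.2f} solar masses")
    # Folding in the blue supergiant's estimated mass (tens of suns) and the orbit's
    # inclination pushes the unseen object well above six solar masses.
    ```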

    Several different experiments study black holes. The Event Horizon Telescope [EHT] will look at black holes in the nucleus of our galaxy and a nearby galaxy, M87. Its resolution is high enough to image flowing gas around the event horizon.

    Event Horizon Telescope map
    EHT

    Scientists can also do reverberation mapping, which uses X-ray telescopes to look for time differences between emissions from various locations near the black hole to understand the orbits of gas and photons around the black hole.

    The Laser Interferometer Gravitational-Wave Observatory, or LIGO, seeks to identify the merger of two black holes, which would emit gravitational radiation, or gravitational waves, as they spiral into each other.

    Caltech Ligo
    MIT/Caltech Advanced LIGO

    In addition to accretion disks, black holes also have winds and incredibly bright jets erupting from them along their rotation axis, shooting out matter and radiation at nearly the speed of light. Scientists are still working to understand how these jets form.

    What we don’t know

    Scientists have learned that black holes are not as black as they once thought them to be. Some information might escape them. In 1974, Stephen Hawking published results that showed that black holes should radiate energy, or Hawking radiation.
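
    Hawking’s result assigns every black hole a temperature that is inversely proportional to its mass, T = ħc³ / (8πGMk_B), which is why the effect is utterly negligible for astrophysical black holes. A quick estimate for a black hole of one solar mass:

    ```python
    import math

    hbar, c, G, k_B = 1.055e-34, 2.998e8, 6.674e-11, 1.381e-23
    M_sun = 1.989e30

    def hawking_temperature_K(M_kg):
        """Hawking temperature T = hbar * c^3 / (8 * pi * G * M * k_B)."""
        return hbar * c**3 / (8 * math.pi * G * M_kg * k_B)

    print(f"{hawking_temperature_K(M_sun):.1e} K")   # ~6e-8 K, far colder than the cosmic microwave background
    ```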

    Matter-antimatter pairs are constantly being produced throughout the universe, even outside the event horizon of a black hole. Quantum theory predicts that one particle might be dragged in before the pair has a chance to annihilate, and the other might escape in the form of Hawking radiation. This contradicts the picture general relativity paints of a black hole from which nothing can escape.

    But as a black hole radiates Hawking radiation, it slowly evaporates until it eventually vanishes. So what happens to all the information encoded on its horizon? Does it disappear, which would violate quantum mechanics? Or is it preserved, as quantum mechanics would predict? One theory is that the Hawking radiation contains all of that information. When the black hole evaporates and disappears, it has already preserved the information of everything that fell into it, radiating it out into the universe.

    Black holes give scientists an opportunity to test general relativity in very extreme gravitational fields. They see black holes as an opportunity to answer one of the biggest questions in particle physics theory: Why can’t we square quantum mechanics with general relativity?

    Beyond the event horizon, black holes curve into one of the darkest mysteries in physics. Scientists can’t explain what happens when objects cross the event horizon and spiral towards the singularity. General relativity and quantum mechanics collide and Einstein’s equations explode into infinities. Black holes might even house gateways to other universes called wormholes and violent fountains of energy and matter called white holes, though it seems very unlikely that nature would allow these structures to exist.

    Sometimes reality is stranger than fiction.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:30 pm on January 5, 2016
    Tags: Symmetry Magazine

    From Symmetry: “The booming science of dwarf galaxies” 

    Symmetry

    01/05/16
    Manuel Gnida

    Dragonfly Telephoto Array Discovers Seven Dwarf Galaxies around Messier 101

    U Toronto Dunlap Dragonfly telescope Array
    U Toronto Dunlap Institute Dragonfly Array

    Messier 101 (the Pinwheel Galaxy). This image is from the NASA/ESA Hubble Space Telescope; it is presented here so that the viewer can approximate the locations of the dwarf galaxies. Dwarf galaxies are associated with most major galaxies.

    NASA Hubble Telescope
    NASA/ESA Hubble

    A recent uptick in the discovery of the smallest, oldest galaxies benefits studies of dark matter, galaxy formation and the evolution of the universe.

    Galaxies are commonly perceived as gigantic spirals full of billions to trillions of stars. Yet some galaxies, called dwarf galaxies, can harbor just a few hundred suns.

    The recent discovery of 20 new potential dwarf galaxies fueled a boom in the science of these faint objects, which are valuable tools to study dark matter, galaxy formation and cosmic history.

    Ten years ago, only about a dozen dwarf galaxies were known. This number quickly doubled after the Sloan Digital Sky Survey [SDSS] began its second phase of operation in 2005. SDSS-II took better images of the sky than ever before, and researchers started using computer programs to identify dwarf galaxies in them.

    SDSS Telescope
    SDSS telescope at Apache Point, NM, USA

    The number of potential dwarf galaxies has spiked of late, largely due to results from the first two years of the new Dark Energy Survey, which can see objects 10 times as faint.

    DECam
    CTIO Victor M Blanco 4m Telescope
    DECam, built at FNAL, and the CTIO Victor M Blanco telescope in Chile in which it is housed.

    The total number of known dwarf galaxy candidates orbiting our Milky Way—not all of them have been confirmed as galaxies yet—has now reached about 50.

    “The precise number is being updated on almost a weekly basis during recent months,” says Keith Bechtol of the University of Wisconsin, Madison, one of the lead authors of two DES papers, published in March and August, announcing the discoveries of potential satellite galaxies. “These are truly exciting times for this type of research.”

    Dim lights for dark matter research

    In general, the term “dwarf galaxy” refers to a galaxy less than a tenth the size of our Milky Way, which is made of 100 billion stars. So not all dwarf galaxies are truly dwarfish. In fact, two of these objects in the southern night sky, called the Magellanic Clouds, are so large that they are visible to the naked eye.

    Large Magellanic Cloud. Adrian Pingstone in December 2003

    Small Magellanic Cloud (SMC) via ESO/Digitized Sky Survey 2

    However, researchers are particularly interested in the faintest dwarf galaxies. They make excellent laboratories in which to study dark matter—the invisible form of matter that is five times more prevalent than its visible counterpart but whose nature remains a mystery.

    Scientists’ best guess is that it’s made of fundamental particles, with hypothetical weakly interacting massive particles, or WIMPs, as the top contenders. Researchers think WIMPs could produce gamma rays as they decay or annihilate each other in space. They’re searching for this radiation with sensitive gamma-ray telescopes.

    Ultra-faint dwarf galaxies orbiting the Milky Way are ideal targets for this search for two reasons. First, because they have high ratios of dark matter to regular matter and are relatively close to us, they could produce detectable dark matter signals.

    “The motions of stars in ultra-faint galaxies are so fast that they are best explained if there is 100 to 1000 times more dark matter than the masses of all the stars taken together,” Bechtol says.
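
    Bechtol’s statement can be turned into an order-of-magnitude estimate: the dynamical mass enclosed by stars moving with velocity dispersion σ at radius R is roughly σ²R/G. The numbers below are illustrative of an ultra-faint dwarf, not a specific measurement:

    ```python
    G = 6.674e-11
    M_sun = 1.989e30
    pc = 3.086e16                 # one parsec, in meters

    sigma = 4e3                   # stellar velocity dispersion, ~4 km/s (illustrative)
    R = 100 * pc                  # rough half-light radius (illustrative)
    M_stars = 1e3 * M_sun         # a "few hundred suns" worth of stars

    M_dyn = sigma**2 * R / G
    print(f"Dynamical mass ~ {M_dyn / M_sun:.1e} solar masses")
    print(f"Dark-to-luminous mass ratio ~ {M_dyn / M_stars:.0f}")   # hundreds, as in the quote
    ```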

    Second, these galaxies are the oldest known galaxies. Their busiest days are in the past; most formed their stars more than 10 billion years ago. This, together with the fact that they have few stars and little gas, makes them very “clean” objects for the dark matter search.

    By contrast, in the center of the much younger Milky Way, which is also rich in dark matter, stars are still forming and many other astrophysical processes are producing gamma-ray signals that could obscure signs of dark matter.

    “If we ever saw gamma rays coming from these ultra-faint galaxies, it would be a smoking gun for dark matter,” says researcher Andrea Albert of the Kavli Institute for Particle Astrophysics and Cosmology, a joint institute of Stanford University and the SLAC National Accelerator Laboratory. She is involved in the dark matter analysis of the recently discovered DES dwarf galaxy candidates with the Fermi Gamma-ray Space Telescope.

    No convincing sign of WIMPs has yet been found coming from dwarf galaxies, including the most recent DES candidate dwarf galaxies, as preliminary results presented at the 2015 Topics in Astroparticle and Underground Physics conference in Torino, Italy, suggest.

    But even the absence of a signal is progress because it sets limits on what dark matter can and cannot be.

    Excavation tools for astrophysical archeology

    Researchers also study dwarf galaxies because they hope to learn about the history of our cosmic neighborhood and the formation of galaxies like our own.

    Current models suggest that galaxies don’t start out as enormous objects with a gazillion stars, but rather as small structures that merge with others into larger ones. Dwarf galaxies are at the bottom of this hierarchy and are believed to be the building blocks of larger galaxies.

    “The way we see the Milky Way and its satellites today is only a snapshot in time,” says astronomer Marla Geha of Yale University. “Our own galaxy is a merger of smaller galaxies, and it’s still merging.”

    In other words, the Milky Way itself may have started out as a dwarf galaxy with only a few hundred to a few thousand stars. Today, it is a large galaxy, and simulations suggest that in 4 to 5 billion years the Milky Way will merge with the Andromeda Galaxy, the nearest major galaxy in our cosmic neighborhood.

    But dwarf galaxies can give us insight into more than just our local neighborhood. By better understanding dwarf galaxies, researchers can also study the evolution of the whole universe.

    Since dark matter is abundant and interacts gravitationally with itself and regular matter, it has influenced the cosmos and its structures ever since the big bang. In fact, we know today that galaxies are embedded in clumps, or halos, of dark matter that have formed in the expanding universe. These halos, in turn, are surrounded by smaller halos that harbor dwarf satellite galaxies.

    “The discovery of an increasing number of dwarf galaxies is exciting because our cosmological theories predict that the Milky Way has a few hundred of them,” Geha says. “If we don’t find enough satellites, we will need to adjust our models. However, considering that we haven’t surveyed the entire sky yet and haven’t been looking deeply enough, we’re mostly on track for our predictions.”

    Some of the dwarf galaxy candidates discovered by DES earlier this year could potentially orbit the Magellanic Clouds, the largest satellites of the Milky Way. If confirmed, the result would be quite fascinating, says KIPAC researcher Risa Wechsler.

    “Satellites of satellites are predicted by our models of dark matter,” she says. “Either we’re seeing these types of systems for the first time, or there is something we don’t understand about how these satellite galaxies are distributed in the sky.”

    A better and better view

    So far, studies of dwarf galaxies have largely been restricted to satellites of our Milky Way, and researchers believe that much could be learned from studying more distant ones.

    “One question we would like to answer is why the faintest dwarf galaxies are so extreme in size, age and dark matter content,” Geha says. “Is this because the ones we can observe are affected by their proximity to the Milky Way, or are these properties common to all dwarf galaxies in the universe?”

    For the ultimate test, researchers want to be able to detect even fainter objects and look farther into space than they can with DES. They’ll be able to do so once the Large Synoptic Survey Telescope comes online in 2022. The telescope’s 3.2-gigapixel camera will produce the deepest views of the night sky ever observed.

    And if that’s not enough, NASA is planning a space mission called the Wide-Field Infrared Survey Telescope, which could spot ultra-faint dwarf galaxies that evade even LSST’s watchful eye.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 3:27 pm on December 11, 2015
    Tags: Symmetry Magazine

    From Symmetry: “The next gamma-ray eye on the sky” 

    Symmetry

    12/11/15
    Liz Kruesi

    Scientists have successfully tested the first prototype camera for the Cherenkov Telescope Array.

    DESY/Milde Science Comm./Exozet

    Telescope arrays VERITAS, HESS and MAGIC have spied active supermassive black holes, the remnants of the explosions of massive stars, binary star systems, and galaxies actively churning out new stars.

    Veritas Telescope
    VERITAS

    HESS Cherenkov Array
    HESS Cherenkov Array

    MAGIC Telescope
    MAGIC

    This is possible thanks to what all of these cosmic objects have in common: They are all sources of high-energy gamma rays. VERITAS, HESS and MAGIC all look for the optical light produced when those gamma rays interact with Earth’s atmosphere.

    One gamma-ray source that continues to elude these powerful telescopes is the brightest electromagnetic event known to occur in the universe: a gamma-ray burst. But a new telescope array currently under development might be able to catch one.

    The Cherenkov Telescope Array, or CTA, will cover a substantially larger area on the ground, making it an enormous “bucket” to collect incoming gamma-ray-produced radiation.

    Cherenkov Telescope Array
    CTA

    It will also be able to collect data during almost twice as many hours per year as current arrays.

    The array will study the entire range of gamma-ray sources. It also has the capability to detect the annihilation signature of dark matter particles.

    “We’re really hoping to find something new, some new type of high-energy astrophysical phenomenon,” says Rene Ong, the CTA consortium co-spokesperson.

    Scientists successfully operated the first CTA prototype camera in late November. The full array is scheduled to start running in the 2020s.

    The usefulness of gamma rays

    Gamma rays are almost ideal messengers of high-energy particle astrophysics. They are created in the most energetic processes in the universe. And, like all other forms of light, they are electrically neutral and thus aren’t buffeted by galactic magnetic fields as they travel through space. This means scientists can use them to point back to their sources.

    The drawback is that these messengers can’t make it through Earth’s atmosphere. Instead, they interact and produce a shower of lower-energy particles.

    If some of those are traveling at a velocity faster than the speed of light in the gaseous medium of the atmosphere, they will create flashes of light peaking between blue and ultraviolet, akin to a sonic boom following a supersonic jet. This light is called Cherenkov radiation, and it’s what ground-based high-energy gamma-ray telescopes actually detect.
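
    The “faster than light in the medium” condition sets a threshold: a particle radiates only if its speed exceeds c divided by the refractive index n, and the light is emitted at an angle whose cosine is 1/(nβ). A rough estimate for electrons in air near sea level (shower particles higher up see even thinner air, so the real threshold is somewhat higher):

    ```python
    import math

    n = 1.0003        # approximate refractive index of air at sea level
    m_e_MeV = 0.511   # electron rest energy

    beta_thr = 1.0 / n
    gamma_thr = 1.0 / math.sqrt(1.0 - beta_thr**2)
    print(f"Cherenkov threshold for electrons ~ {gamma_thr * m_e_MeV:.0f} MeV")        # ~21 MeV
    print(f"Cherenkov angle for beta ~ 1: {math.degrees(math.acos(1.0 / n)):.2f} deg") # ~1.4 degrees
    ```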

    VERITAS in Arizona, HESS in Namibia, and MAGIC on the Canary island of La Palma are arrays of optical telescopes that have been detecting this light for about a decade. VERITAS contains four of these scopes, HESS has five, and MAGIC has two. The weak light reflects off each segmented primary mirror and is funneled to a “camera.” Each telescope’s camera is made of hundreds to thousands of photomultiplier tubes which convert the incoming photons into electrical signals.

    With the next-generation CTA, scientists hope to catch a gamma-ray burst with a ground-based telescope array for the first time. They want to know the underlying physics of these blasts, the sources of which are thought to be located millions to billions of light-years away.

    Scientists have seen gamma-ray bursts with space-based instruments, such as the Fermi Gamma-Ray Space Telescope and Swift.

    NASA Fermi Telescope
    NASA/Fermi

    NASA SWIFT Telescope
    NASA/Swift

    But only a ground-based array could detect their highest-energy gamma rays, those above 100 billion electronvolts. And a large ground-based array such as the CTA, which will cover 10 square kilometers in the south and 1 square kilometer in the north, would be able to capture much more information.

    Building the CTA

    An international consortium of nearly 1300 researchers from 31 countries is working toward building the CTA. The array will focus on a wider gamma ray energy range than the currently operating instruments—seeing between 20 billion electronvolts and 300 trillion electronvolts—and will do so with 10 times the sensitivity.

    The CTA will consist of two detection sites on Earth, one in each hemisphere. At Cerro Paranal in Chile’s Atacama Desert, approximately 100 telescopes spread across an area of about 10 square kilometers will scan the Southern sky. On the Spanish island of La Palma, some 19 telescopes will watch the Northern sky. The CTA Observatory is in the final negotiations with representatives from both locations to finalize the agreements to host the arrays.

    The northern and southern arrays will each have four large telescopes, each 23 meters wide and spaced about 100 meters apart from one another, clustered toward the center of the array. Moving outward will be telescopes in the 10-to-12-meter range. The northern array will have 15 of these medium-sized telescopes, while the southern array will have 25. The Cerro Paranal location additionally will host approximately 70 4-meter-wide telescopes, farther out from the array’s center.

    The 70 small telescopes will use new detectors made of silicon. These have several advantages over the current design, says University of Oxford graduate student Andrea De Franco, “but the most sexy for us is they can resist bright night-sky background.”

    That means they can detect Cherenkov light even in bright moonlight, something VERITAS, HESS and MAGIC cannot do. This new technology will let the CTA observatory operate for about 16 to 17 percent of the hours in a year; current arrays can observe during only about 10 percent.

    Work in progress

    CTA is in the development phase right now, meaning the consortium members are developing and testing the hardware, verifying how to deploy and operate the instruments, and simulating the best layout of those telescopes at each site.

    In October, the CTA project began constructing the large telescope prototype at La Palma.

    Two medium-sized telescope prototypes are also under construction: A two-mirror design with a 10-meter primary mirror is being built in southern Arizona; a prototype of a single-mirror, 12-meter-wide design is in testing in Berlin, and its camera is nearly complete.

    All three small-sized prototypes are well underway. A single-mirror, 4-meter design has been constructed in Krakow, Poland; a two-mirror, 4-meter design is operational near Mount Etna, Italy; and another two-mirror, 4-meter design was just inaugurated December 1 outside of Paris.

    De Franco has spent the last two years building and testing the camera for the Paris-based prototype, in addition to helping commission it before the inauguration. On November 26, he and his colleagues proved the design was working—even with the City of Light nearby. The camera recorded Cherenkov light, making it the first CTA prototype to be fully working and observing.

    De Franco says the light was more likely part of a particle shower caused by an incoming cosmic ray than by a gamma ray. But even so, the detection marked another step forward along the path to building science’s next gamma-ray eye on the sky.

    The next step will be to construct and deploy the pre-production telescopes at the actual array sites.

    “Ideally, [each of these] is identical to the final production telescope,” says CTA Project Manager Christopher Townsley. “It’s just that we will always learn something from putting it in the desert.”

    Members of the CTA project expect to begin this phase in spring 2017, depending on the availability of funding.

    Once the pre-production telescopes are operational, data collection can begin, though it won’t be anywhere near the quality expected from the full observatory. According to the current timeline, most of the telescopes at both arrays will be complete in 2020 or 2021.

    At that point, the data will surpass what today’s best gamma-ray instruments can obtain. And CTA will only get better from there.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 5:17 pm on December 9, 2015 Permalink | Reply
    Tags: , , , Symmetry Magazine   

    From Symmetry: “Save the particles” 

    Symmetry

    12/09/15
    Sarah Charley

    To learn more about the particles they collide, physicists turn their attention to a less destructive type of collision in the LHC.

    CMS detector. Maximilien Brice, CERN

    Every second, the Large Hadron Collider generates millions of particle collisions. Scientists watching these interactions usually look out for only the most spectacular ones.


    But recently they’ve also taken an interest in some gentler moments, during which the accelerated particles interact with photons, quanta of light.

    When charged particles—like the protons the LHC usually collides or the lead ions it is colliding right now—are forced around bends in an accelerator, they lose energy in the form of light, known as synchrotron radiation.

    Originally, physicists perceived this photon leak as a nuisance. But today, laboratories around the world specifically build accelerators to produce it. They can use this high-energy light to take high-speed images of materials and processes in the tiniest detail.

    Scientists are now using the LHC as a kind of light source to figure out what’s going on inside the protons and ions they collide.

    The LHC’s accelerated particles are chock-full of energy. When protons collide—or, more specifically, when the quarks and gluons that make up protons interact—their energy is converted into mass, which manifests as other particles, such as Higgs bosons.

    Those particles decay back into energy as they sail through particle detectors set up around the collision points, leaving their signatures behind. Physicists usually study these particles, the ones created in collisions.

    In proton-photon collisions, however, they can study the protons themselves. That’s because photons can traverse a particle’s core without rupturing its structure. They pass harmlessly through the proton, creating new particles along the way.

    “When a high-energy light wave hits a proton, it produces particles—all kinds of particles—without breaking the proton,” says Daniel Tapia Takaki, an assistant professor at the University of Kansas who is a part of the CMS collaboration. “These particles are recorded by our detector and allow us to reconstruct an unprecedentedly high-quality picture of what’s inside.”

    Tapia Takaki is interested in using these photon-induced interactions to study the density of gluons inside high-energy protons and nuclei.

    As a proton is accelerated to close to the speed of light, its gluons swell and eventually split—like cells dividing in an embryo. Scientists want to know: Just how packed are gluons inside these protons? And what can that tell us about what happens when they collide?

    The Standard Model—the well-vetted theory describing the properties of subatomic particles—predicts that the density of gluons inside a proton is directly related to the likelihood that the proton will spit out a pair of charm quarks in the form of a J/psi particle during a proton-photon interaction.

    The Standard Model of elementary particles (schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    “So by measuring the J/psi’s production rate very precisely, we can automatically have access to the density of gluons,” Tapia Takaki says.
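
    At leading order in perturbative QCD, the relationship Tapia Takaki describes can be written compactly; the proportionality below is the standard textbook result for exclusive J/psi photoproduction and is added here for orientation rather than quoted from the article:

        \left.\frac{d\sigma}{dt}\right|_{t=0}\!\left(\gamma p \to J/\psi\, p\right)
            \;\propto\; \bigl[\, x\, g(x, Q^2) \,\bigr]^2,
        \qquad x \simeq \frac{M_{J/\psi}^2}{W_{\gamma p}^2},
        \qquad Q^2 \simeq \frac{M_{J/\psi}^2}{4},

    where g(x, Q^2) is the gluon density, W_{\gamma p} is the photon-proton collision energy, and M_{J/\psi} is the J/psi mass. Because the rate scales with the square of the gluon density, a precise measurement of J/psi production pins down the gluon content rather directly.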

    Prior to joining the CMS experiment, Tapia Takaki worked with colleagues on the ALICE experiment to conduct a similar study of photon-lead interactions.


    Tapia Takaki plans to study the lead ions currently being collided in the LHC in more detail with his current team.

    The trickiest part of these studies isn’t applying the equation, but identifying the collisions, Tapia Takaki says.

    To identify subtle proton-photon and photon-lead collisions, Tapia Takaki and his colleagues must carefully program their experiments to cherry-pick and record events in which there’s no evidence of protons colliding—yet there is still evidence of the production of low-energy particles.
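
    A heavily simplified sketch, in Python, of the kind of selection being described: keep only events that look quiet (no sign of a head-on collision) yet contain a handful of low-energy particles. The field names and thresholds are invented for illustration and do not correspond to the actual CMS trigger configuration.

        # Toy event filter illustrating the selection described above.
        # Field names and thresholds are placeholders, not the real CMS trigger menu.
        def looks_photon_induced(event):
            quiet_forward = event["forward_energy_gev"] < 5.0   # no debris from a broken-up proton or ion
            few_tracks = 1 <= event["n_charged_tracks"] <= 4    # only a handful of particles produced
            soft_tracks = event["max_track_pt_gev"] < 2.0       # and they carry little energy
            return quiet_forward and few_tracks and soft_tracks

        events = [
            {"forward_energy_gev": 250.0, "n_charged_tracks": 48, "max_track_pt_gev": 35.0},  # ordinary collision
            {"forward_energy_gev": 1.2, "n_charged_tracks": 2, "max_track_pt_gev": 1.4},      # photon-induced candidate
        ]

        selected = [e for e in events if looks_photon_induced(e)]
        print(f"kept {len(selected)} of {len(events)} events")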

    “It’s challenging because the interactions of light with protons or lead ions take place all the time,” Tapia Takaki says. “We had to find a way to record these events without overloading the detector’s bandwidth.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:40 pm on December 3, 2015 Permalink | Reply
    Tags: , , Symmetry Magazine   

    From Symmetry: “Holometer rules out first theory of space-time correlations” 

    Symmetry

    12/03/15
    Andre Salles

    The extremely sensitive quantum-spacetime-measuring tool will serve as a template for continuing scientific exploration.


    Our common sense and the laws of physics assume that space and time are continuous. The Holometer, an experiment based at the US Department of Energy’s Fermi National Accelerator Laboratory, challenges this assumption.

    We know that energy on the atomic level, for instance, is not continuous and comes in small, indivisible amounts. The Holometer was built to test if space and time behave the same way.

    In a new result, “Search for space-time correlations from the Planck scale with the Fermilab Holometer,” released this week after a year of data-taking, the Holometer collaboration has announced that it has ruled out one theory of a pixelated universe to a high level of statistical significance.

    If space and time were not continuous, everything would be pixelated, like a digital image.

    When you zoom in far enough, you see that a digital image is not smooth, but made up of individual pixels. An image can only store as much data as the number of pixels allows. If the universe were similarly segmented, then there would be a limit to the amount of information space-time could contain.
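
    For a sense of scale, here is a rough comparison between the pixels of a photograph and the hypothetical “pixels” of space-time; the Planck length used below is a standard physical constant, and the comparison itself is only illustrative:

        # Illustrative arithmetic only: how finely would space be "pixelated" at the Planck scale?
        planck_length_m = 1.6e-35        # approximate Planck length, in meters
        photo_megapixels = 10e6          # a typical digital photo, for comparison

        print(f"Planck-length 'pixels' along one meter: ~{1.0 / planck_length_m:.1e}")           # ~6e34
        print(f"pixels along one edge of a 10-megapixel photo: ~{photo_megapixels ** 0.5:.0f}")  # ~3162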

    The main theory the Holometer was built to test was posited by Craig Hogan, a professor of astronomy and physics at the University of Chicago and the head of Fermilab’s Center for Particle Astrophysics. The Holometer did not detect the amount of correlated holographic noise—quantum jitter—that this particular model of space-time predicts.

    But as Hogan emphasizes, it’s just one theory, and with the Holometer, this team of scientists has proven that space-time can be probed at an unprecedented level.

    “This is just the beginning of the story,” Hogan says. “We’ve developed a new way of studying space and time that we didn’t have before. We weren’t even sure we could attain the sensitivity we did.”

    The Holometer isn’t much to look at. It’s a small array of lasers and mirrors with a trailer for a control room.

    During an exceptionally snowy winter, Aaron Chou and Vanderbilt University student Brittany Kamai make their way to the Holometer’s modest home base, a relatively isolated trailer on the Fermilab prairie. Photo: Reidar Hahn, Fermilab

    But the low-tech look of the device belies the fact that it is an unprecedentedly sensitive instrument, able to measure movements that last only a millionth of a second and distances that are a billionth of a billionth of a meter—a thousand times smaller than a single proton.

    The Holometer uses a pair of laser interferometers placed close to one another, each sending a 1-kilowatt beam of light through a beam splitter and down two perpendicular arms, 40 meters each. The light is then reflected back into the beam splitter where the two beams recombine.

    If no motion has occurred, then the recombined beam will be the same as the original beam. But if fluctuations in brightness are observed, researchers will then analyze these fluctuations to see if the splitter is moving in a certain way, being carried along on a jitter of space itself.
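
    To make the “fluctuations in brightness” concrete, here is a minimal sketch of the textbook Michelson-interferometer response: a tiny change in the difference between the two arm lengths shifts the intensity of the recombined beam. The wavelength and displacements below are placeholders rather than the Holometer’s actual operating parameters.

        import math

        # Textbook two-beam interference: intensity of the recombined beam as a function
        # of the arm-length difference. Placeholder numbers, not Holometer-specific.
        def recombined_intensity(delta_length_m, wavelength_m=1.064e-6, peak_intensity=1.0):
            # The light travels each arm out and back, so the path difference is twice delta_length_m.
            phase = 2.0 * math.pi * (2.0 * delta_length_m) / wavelength_m
            return peak_intensity * math.cos(phase / 2.0) ** 2

        # Even displacements far smaller than the laser wavelength nudge the output intensity.
        for dl in (0.0, 1e-12, 1e-9):
            print(f"arm-length difference {dl:.0e} m -> relative intensity {recombined_intensity(dl):.12f}")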

    According to Fermilab’s Aaron Chou, project manager of the Holometer experiment, the collaboration looked to the work done to design other, similar instruments, such as the one used in the Laser Interferometer Gravitational-Wave Observatory [LIGO] experiment.


    Chou says that once the Holometer team realized that this technology could be used to study the quantum fluctuation they were after, the work of other collaborations using laser interferometers (including LIGO) was invaluable.

    “No one has ever applied this technology in this way before,” Chou says. “A small team, mostly students, built an instrument nearly as sensitive as LIGO’s to look for something completely different.”

    The challenge for researchers using the Holometer is to eliminate all other sources of movement until they are left with a fluctuation they cannot explain. According to Fermilab’s Chris Stoughton, a scientist on the Holometer experiment, the process of taking data was one of constantly adjusting the machine to remove more noise.

    “You would run the machine for a while, take data, and then try to get rid of all the fluctuation you could see before running it again,” he says. “The origin of the phenomenon we’re looking for is a billion billion times smaller than a proton, and the Holometer is extremely sensitive, so it picks up a lot of outside sources, such as wind and traffic.”
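
    A toy illustration, in Python, of why running two interferometers side by side helps with exactly this problem: noise that is independent in each instrument averages away in the cross-correlation, while a shared (“correlated”) jitter survives. The amplitudes are arbitrary and exaggerated for the demonstration; this is not Holometer data or analysis code.

        import numpy as np

        rng = np.random.default_rng(0)
        n_samples = 200_000

        # Hypothetical jitter common to both instruments, exaggerated for visibility.
        shared_jitter = 0.1 * rng.standard_normal(n_samples)
        signal_a = shared_jitter + rng.standard_normal(n_samples)   # instrument A plus its own noise
        signal_b = shared_jitter + rng.standard_normal(n_samples)   # instrument B plus its own noise

        auto_a = np.mean(signal_a * signal_a)    # dominated by A's independent noise (~1.01)
        cross = np.mean(signal_a * signal_b)     # independent noise cancels, leaving ~0.01 shared power

        print(f"auto-power of A:        {auto_a:.4f}")
        print(f"cross-power of A and B: {cross:.4f}")
        # In the real experiment the shared component being sought is far smaller,
        # which is why the collaboration integrates data for so long.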

    If the Holometer were to see holographic noise that researchers could not eliminate, it might be detecting noise that is intrinsic to space-time, which may mean that information in our universe could actually be encoded in tiny packets in two dimensions.

    The fact that the Holometer ruled out his theory to a high level of significance proves that it can probe time and space at previously unimagined scales, Hogan says. It also proves that if this quantum jitter exists, it is either much smaller than the Holometer can detect, or is moving in directions the current instrument is not configured to observe.

    So what’s next? Hogan says the Holometer team will continue to take and analyze data, and will publish more general and more sensitive studies of holographic noise. The collaboration already released a result related to the study of gravitational waves.

    And Hogan is already putting forth a new model of holographic structure that would require similar instruments of the same sensitivity, but different configurations sensitive to the rotation of space. The Holometer, he says, will serve as a template for an entirely new field of experimental science.

    “It’s new technology, and the Holometer is just the first example of a new way of studying exotic correlations,” Hogan says. “It is just the first glimpse through a newly invented microscope.”

    The Holometer experiment is supported by funding from the DOE Office of Science. The Holometer collaboration includes scientists from Fermilab, the University of Chicago, the Massachusetts Institute of Technology and the University of Michigan.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     