Tagged: Symmetry Magazine

  • richardmitnick 2:29 pm on November 6, 2014 Permalink | Reply
    Tags: Astrostatistics, Symmetry Magazine

    From Symmetry: “The rise of astrostatistics” 


    November 04, 2014
    Lori Ann White

    Astrophysicists and cosmologists are turning to statisticians to help them analyze an ever-increasing deluge of data.

    Artwork by Sandbox Studio, Chicago with Kimberly Boustead

    In late 1801 the orbit of the newly discovered asteroid Ceres carried it behind the sun, and astronomers worried they had lost it forever. A young mathematical prodigy named Carl Friedrich Gauss developed a new statistical technique to find it. Called “least squares regression,” that technique is now a fundamental method of statistical analysis.
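    Gauss's insight survives essentially unchanged today: least squares picks the model parameters that minimize the summed squared differences between predictions and observations. A minimal sketch of the closed-form straight-line case (the observations below are made up for illustration, not Ceres data):

    ```python
    # Least squares fit of a straight line y = a + b*x, solved in closed form
    # by minimizing the sum of squared residuals. Illustrative data only.

    def fit_line(xs, ys):
        """Return (a, b) minimizing sum((y - (a + b*x))**2)."""
        n = len(xs)
        sx, sy = sum(xs), sum(ys)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, ys))
        b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        a = (sy - b * sx) / n
        return a, b

    # Noisy samples of (roughly) y = 2 + 3x
    xs = [0.0, 1.0, 2.0, 3.0, 4.0]
    ys = [2.1, 4.9, 8.2, 10.9, 14.1]
    a, b = fit_line(xs, ys)
    print(round(a, 2), round(b, 2))
    ```

    Gauss's orbit problem was nonlinear and far harder, but the principle is the same: the recovered parameters are the ones the data jointly agree on best.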

    For about 200 years after that, however, astronomers and statisticians had little to do with one another. But in the last decade or so, astronomy and statistics have finally begun to formalize a promising relationship. Together they are developing the new discipline of astrostatistics.

    Jogesh Babu, a Pennsylvania State University professor and the director of the Penn State Center for Astrostatistics, remembers when the new age of astrostatistics dawned for him. Twenty-five years ago, when Babu’s focus was statistical theory, astronomy professor Eric Feigelson asked to meet with him to talk about a problem. At the end of the conversation, Babu says, “we realized we both speak English but we didn’t understand a word the other said.”

    To address that disconnect, the statistician and the astrophysicist organized a continuing series of conferences at Penn State. They also wrote a book, Astrostatistics, which effectively christened the new field. But collaborations between astrophysicists and statisticians remained small and scattered, only really starting to pick up in 2006, says Babu.

    “The development of statistical techniques useful to advanced astronomical research progressed very slowly, and until recently almost all analyses had to be done by hand,” says statistician Joseph Hilbe, a statistics professor at Arizona State University. Before the advent of computers with sufficient capacity to do the work, certain useful calculations could take statisticians weeks or even months to complete, he says.

    In addition, says Tom Loredo, an astrostatistician at Cornell University, “astrophysicists are some of the more mathematically literate scientists, and we thought we could do it on our own.”

    Other fields had already embraced statistics. Statistics is vital to all branches of biology—especially epidemiology, medical research, and public health—as well as to geology. In fact, in the 1990s Hilbe developed some of the first advanced statistical tools used to analyze Medicare data. Statisticians also contribute to the social sciences, economics, environmental and ecological sciences, and to the insurance and risk analysis industries.

    Slowly, though, astronomers began to realize that they might be able to benefit from the expert help of a statistician.

    “I believe the large surveys shocked astronomers with how much data there is,” Hilbe says. “The Sloan Digital Sky Survey [one of the first automated and digitized comprehensive astronomical sky surveys] told them they needed statistics.”

    Although he was aware of Babu’s and Feigelson’s nascent community, Hilbe decided to go bigger. He founded the International Statistical Institute’s Astrostatistics Interest Group, the first interest group or committee authorized by an astronomical or statistical association, in 2008. The formation of working groups within the American Astronomical Society and the International Astronomical Union followed in 2012. In the same year Hilbe was elected the first president of the newly formed International Astrostatistical Association.

    All told, about 700 scientists belong to the various groups, which have been gathered together under the umbrella of the Astrostatistics and Astroinformatics Portal, hosted by Penn State and with Feigelson and Hilbe as co-editors. The IAA also sponsors the new Cosmostatistics Consortium.

    One of the recently formed groups is the LSST Informatics and Statistical Science collaboration, organized in preparation for the Large Synoptic Survey Telescope, which, beginning in 2022, will photograph the entire southern sky every three days for 10 years. Babu and his collaborator Feigelson are members, as is Loredo.


    “One of the virtues of big data is that it gives you access to rare events,” Loredo says. He likens it to sifting through the trillions of bytes of Large Hadron Collider data to find a handful of Higgs bosons.


    “Now that we have a billion galaxies, what are the rare events that we wouldn’t ever see in only a million galaxies? Studying those will require statistical methods that are as good with small data sets as with big data sets.”
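    Loredo's point can be made concrete with a toy Poisson calculation. Suppose some event type occurs in one galaxy in ten million (an assumed rate, purely for illustration): it is nearly invisible in a million-galaxy survey but guaranteed in a billion-galaxy one.

    ```python
    import math

    # Toy illustration of rare events in big data. The per-galaxy rate
    # below is an assumed number, chosen only to make the scaling vivid.
    rate = 1e-7  # assumed probability of the rare event per galaxy

    for n_galaxies in (1_000_000, 1_000_000_000):
        expected = rate * n_galaxies             # mean of the Poisson count
        p_at_least_one = 1 - math.exp(-expected)
        print(f"{n_galaxies:>13,} galaxies: expect {expected:g} events, "
              f"P(>=1) = {p_at_least_one:.3f}")
    ```

    With a million galaxies the survey expects a tenth of an event; with a billion it expects a hundred, enough to study the population statistically, which is exactly the regime Loredo describes.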

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 3:18 pm on October 28, 2014 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “Scientists mull potential gamma-ray study sites” 


    October 28, 2014
    Kelen Tuttle

    An international panel is working to determine the two locations from which the Cherenkov Telescope Array will observe the gamma-ray sky.


    Somewhere in the Southern Hemisphere, about 100 state-of-the-art telescopes will dot the otherwise empty landscape for half a kilometer in every direction. Meanwhile, in the Northern Hemisphere, a swath of land a little over a third the size will house about 20 additional telescopes, every one of them pointing toward the heavens each night for a full-sky view of the most energetic—and enigmatic—processes in the universe.

    This is the plan for the Cherenkov Telescope Array Observatory, the world’s largest and most sensitive gamma-ray detector. The first of the two arrays is scheduled to begin taking data in 2016, with the other coming online by 2020. At that point, CTA’s telescopes will observe gamma rays produced in some of the universe’s most violent events—everything from supernovas to supermassive black holes.

    Yet where exactly the telescopes will be built remains to be seen.

    Scientists representing the 29-country CTA consortium met last week to discuss the next steps toward narrowing down potential sites in the Northern Hemisphere: two in the United States (both in Arizona), one in Mexico and one in the Canary Islands. Although details from that meeting remain confidential, the CTA resource board is expected to begin negotiations with the potential host countries within the next few months. That will be the final step before the board makes its decision, says Rene Ong, co-spokesperson of CTA and a professor of physics and astronomy at UCLA.

    “Whichever site it goes to, it will be very important in that country,” Ong says. “It’s a major facility, and it will bring with it a huge amount of intellectual capital.”

    Site selection for the Southern Hemisphere is a bit further along. Last April, the CTA resource board narrowed down that list to two potential sites: one in southern Namibia and one in northern Chile. The board is now in the process of choosing between the sites based on factors including weather, operating costs, existing infrastructure like roads and utilities, and host country contributions. A final decision is expected soon.

    Artwork by: Sandbox Studio, Chicago

    “The consortium went through an exhaustive 3-year process of examining the potential sites, and all of the sites now being considered will deliver on the science,” says CTA Project Scientist Jim Hinton, a professor of physics and astronomy at the University of Leicester. “We’re happy that we have so many really good potential sites. If we reach an impasse with one, we can still keep moving forward with the others.”

    Scientists do not completely understand how high-energy gamma rays are created. Previous studies suggest that they stream from jets of plasma pouring out of enormous black holes, supernovae and other extreme environments, but the processes that create the rays—as well as the harsh environments where they are produced—remain mysterious.

    To reach its goal of better understanding high-energy gamma rays, CTA needs to select two sites—one in the Northern Hemisphere and one in the Southern Hemisphere—to see the widest possible swath of sky. In addition, the view from the two sites will overlap just enough to allow experimenters to better calibrate their instruments, reducing error and ensuring accurate measurements.

    With 10 times the sensitivity of previous experiments, CTA will fill in the many blank regions in our gamma-ray map of the universe. Gamma-rays with energies up to 100 gigaelectronvolts have already been mapped by the Fermi Gamma-ray Space Telescope and others; CTA will cover energies up to 100,000 gigaelectronvolts. It will survey more of the sky than any previous such experiment and be significantly better at determining the origin of each gamma ray, allowing researchers to finally understand the astrophysical processes that produce these energetic rays.


    CTA may also offer insight into dark matter. If a dark matter particle were to naturally decay or interact with its antimatter partner to release a flash of energy, the telescope array could theoretically detect that flash. In fact, CTA is one of very few instruments that could see such flashes with energies above 100 gigaelectronvolts.

    “I’m optimistic that we’ll see something totally new and unexpected,” Ong says. “Obviously I can’t tell you what it will be—otherwise it wouldn’t be unexpected—but history tells us that when you make a big step forward in capability, you tend to see something totally new. And that’s just what we’re doing here.”

    See the full article here.




  • richardmitnick 1:51 pm on October 15, 2014 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “Top quark still raising questions” 


    October 15, 2014
    Troy Rummler

    Why are scientists still interested in the heaviest fundamental particle nearly 20 years after its discovery?

    “What happens to a quark deferred?” the poet Langston Hughes might have asked, had he been a physicist. If scientists lost interest in a particle after its discovery, much of what it could show us about the universe would remain hidden. A dedicated community of scientists therefore stays focused on intimately understanding the top quark’s properties.

    Photo by Reidar Hahn, Fermilab

    Case in point: Top 2014, an annual workshop on top quark physics, recently convened in Cannes, France, to address the latest questions and scientific results surrounding the heavyweight particle discovered in 1995 (early top quark event pictured above).

    Top and Higgs: a dynamic duo?

    A major question addressed at the workshop, held from September 29 to October 3, was whether top quarks have a special connection with Higgs bosons. The two particles, weighing in at about 173 and 125 billion electronvolts, respectively, dwarf other fundamental particles (the bottom quark, for example, has a mass of about 4 billion electronvolts and a whole proton sits at just below 1 billion electronvolts).

    Prevailing theory dictates that particles gain mass through interactions with the Higgs field, so why do top quarks interact so much more with the Higgs than do any other known particles?
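    The question has a quantitative edge. In the standard model, a fermion's coupling to the Higgs field (its Yukawa coupling) is proportional to its mass: at tree level, y_f = √2·m_f/v, with the Higgs vacuum expectation value v ≈ 246 GeV. A quick check with the masses quoted above (the tree-level textbook relation, ignoring higher-order corrections) shows the top's coupling is almost exactly 1, far larger than any other fermion's:

    ```python
    import math

    V_HIGGS = 246.0  # Higgs vacuum expectation value in GeV

    def yukawa(mass_gev):
        """Tree-level standard-model Yukawa coupling, y_f = sqrt(2)*m_f/v."""
        return math.sqrt(2) * mass_gev / V_HIGGS

    for name, mass_gev in [("top", 173.0), ("bottom", 4.18), ("electron", 0.000511)]:
        print(f"{name:8s} y = {yukawa(mass_gev):.4g}")
    ```

    That the top's Yukawa coupling lands so close to unity, while every other fermion's is tiny, is part of why physicists suspect the top-Higgs relationship may be special.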

    Direct measurements of top-Higgs interactions depend on recording collisions that produce the two side-by-side. This hasn’t happened yet at high enough rates to be seen; these events theoretically require higher energies than the Tevatron or even the LHC’s initial run could supply. But scientists are hopeful for results from the next run at the LHC.

    “We are already seeing a few tantalizing hints,” says Martijn Mulders, staff scientist at CERN. “After a year of data-taking at the higher energy, we expect to see a clear signal.” No one knows for sure until it happens, though, so Mulders and the rest of the top quark community are waiting anxiously.

    A sensitive probe to new physics

    Measured very precisely, top and anti-top quark production at colliders has started to reveal some deviations from expected values. But in the last year, theorists have responded by calculating an unprecedented layer of mathematical corrections, which refine the expectations and promise to realign the slightly rogue numbers.

    Precision is an important, ongoing effort. If researchers aren’t able to reconcile such deviations, the logical conclusion is that the difference represents something they don’t know about—new particles, new interactions, new physics beyond the standard model.


    The challenge of extremely precise measurements can also drive the formation of new research alliances. Earlier this year, the first Fermilab-CERN joint announcement of collaborative results set a world standard for the mass of the top quark.


    Such accuracy hones methods applied to other questions in physics, too, the same way that research on W bosons, discovered in 1983, led to the methods Mulders began using to measure the top quark mass in 2005. In fact, top quark production is now so well controlled that it has become a tool itself to study detectors.

    Forward-backward synergy

    With the upcoming restart in 2015, the LHC will produce millions of top quarks, giving researchers troves of data to further physics. But scientists will still need to factor in the background noise and data-skewing inherent in the instruments themselves, called systematic uncertainty.

    “The CDF and DZero experiments at the Tevatron are mature,” says Andreas Jung, senior postdoc at Fermilab. “It’s shut down, so the understanding of the detectors is very good, and thus the control of systematic uncertainties is also very good.”


    Jung has been combing through the old data with his colleagues and publishing new results, even though the Tevatron hasn’t collided particles since 2011. The two labs combined their respective strengths to produce their joint results, but scientists still have much to learn about the top quark, and they now have a new arsenal of tools with which to learn it.

    “DZero published a paper in Nature in 2004 about the measurement of the top quark mass that was based on 22 events,” Mulders says. “And now we are working with millions of events. It’s incredible to see how things have evolved over the years.”

    See the full article here.




  • richardmitnick 1:38 pm on October 3, 2014 Permalink | Reply
    Tags: LIGO, Symmetry Magazine

    From Symmetry: “To catch a gravitational wave” 


    October 03, 2014
    Jessica Orwig

    Advanced LIGO, designed to detect gravitational waves, will eventually be 1000 times more powerful than its predecessor.

    Forty years ago, a professor and a student with access to a radio telescope in Puerto Rico made the first discovery of a binary pulsar: a cosmic dance between a pair of small, dense, rapidly rotating neutron stars, called pulsars, in orbit around one another.

    Scientists noticed that their do-si-do was gradually speeding up, which served as indirect evidence for a phenomenon predicted by Albert Einstein called gravitational waves.

    Today in Livingston, Louisiana, and Hanford, Washington, scientists are preparing the next stage of a pair of experiments that they hope will detect gravitational waves directly within the next five years. They’re called the Laser Interferometer Gravitational-Wave Observatory, or LIGO.

    Distorting the fabric of spacetime

    Gravitational waves are faint ripples in the fabric of spacetime thought to propagate throughout the universe. According to the theory of general relativity, objects with mass—and therefore gravitational pull—should emit these waves whenever they accelerate. Scientists think the stars in the binary pulsar that Russell Hulse and Joseph Taylor discovered in 1974 are being pulled closer and closer together because they are losing minuscule amounts of energy each year through the emission of gravitational waves.

    If a gravitational wave from a binary pulsar passes through Livingston or Hanford, the LIGO experiments will be waiting. In summer 2015, scientists will begin collecting data with Advanced LIGO, the next stage of LIGO, equipped with more powerful lasers and more finely tuned sensors. By 2020, Advanced LIGO will be 1000 times more likely than its predecessor to detect gravitational waves.

    “We’ll be able to see well beyond the local group, up to 300 megaparsecs away, which includes thousands of galaxies,” says Mario Diaz, a professor at the University of Texas at Brownsville and director of the Center for Gravitational Wave Astronomy. “That’s the reason why pretty much everyone agrees if gravitational waves exist then Advanced LIGO has to see them.”

    Eventually joining LIGO in its attempt to catch a gravitational wave will be the VIRGO Interferometer at the European Gravitational Observatory in Italy and the Kamioka Gravitational Wave Detector at the Kamioka Mine in Japan. VIRGO started its search in 2007 and is currently undergoing upgrades. KAGRA is expected to begin operations in 2018. By the time KAGRA comes online, all three instruments should have similar levels of sensitivity.

    Advanced LIGO

    LIGO is made up of two identical laser interferometers, one in Louisiana and the other in Washington.


    At a laser interferometer, scientists take a single, powerful laser beam and split it in two. The two beams then travel down two equally long tunnels. At the end of each tunnel, each beam hits a mirror and reflects back.

    The tunnels are perpendicular to one another, creating a giant “L.” Because of this, the reflected beams return to the same spot and cancel each other out. That is, unless a gravitational wave intervenes.

    The light path through a Michelson interferometer. The two light rays with a common source combine at the half-silvered mirror to reach the detector. They may either interfere constructively (strengthening in intensity) if their light waves arrive in phase, or interfere destructively (weakening in intensity) if they arrive out of phase, depending on the exact distances between the three mirrors.

    If a gravitational wave passes through, it will distort the fabric of spacetime in which the observatory sits. This will warp the physical distance between the mirrors, giving one of the laser beams the advantage in reaching its final destination first. Because the beams will not cancel one another out, they will produce a signal in the detector.
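    The logic above can be sketched with a textbook Michelson-interferometer toy model (deliberately simplified; LIGO's real optical layout adds recycling cavities and much more). A length difference dL between the two arms produces a round-trip phase difference of 4π·dL/λ, and on a dark fringe the power leaking to the detector scales as sin² of half that phase:

    ```python
    import math

    # Toy Michelson interferometer, not LIGO's actual optical configuration.
    WAVELENGTH = 1064e-9  # m, the Nd:YAG laser wavelength LIGO uses

    def dark_port_fraction(dL):
        """Fraction of input power reaching the detector for arm difference dL."""
        phase = 4 * math.pi * dL / WAVELENGTH  # round-trip phase difference
        return math.sin(phase / 2) ** 2

    print(dark_port_fraction(0.0))            # equal arms: beams cancel
    print(dark_port_fraction(WAVELENGTH / 4)) # quarter-wave arm offset: full signal
    ```

    With equal arms the detector sees nothing; any arm-length difference, such as one induced by a passing gravitational wave, lets light through in proportion to the distortion.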

    Advanced LIGO isn’t any bigger than LIGO, says Fred Raab of Caltech, head of the LIGO Hanford Observatory. Scientists are transforming the experiment from the inside. “That was part of the strategy for building LIGO… it’s the upgrades to technology that really counts.”

    The impressive part, says Gabriela Gonzalez, LIGO spokesperson and professor at Louisiana State University, is the minuscule size of the change in distance and the technology’s capability to detect it.

    “The [tunnels] are 4 kilometers long, and we have sensitivities to about 10⁻¹⁸ meters,” Gonzalez says. “We can tell how 4 kilometers one way differs from 4 kilometers the other way by a change that is a thousandth the size of a proton diameter.”
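    Gonzalez's arithmetic can be checked directly: a displacement sensitivity of about 10⁻¹⁸ m over a 4 km arm is a strain of a few parts in 10²², and it is indeed roughly a thousandth of a proton diameter (taken below as an approximate 1.7×10⁻¹⁵ m):

    ```python
    ARM_LENGTH = 4_000.0       # m, length of each LIGO arm
    SENSITIVITY = 1e-18        # m, smallest detectable arm-length change
    PROTON_DIAMETER = 1.7e-15  # m, approximate value assumed for this check

    strain = SENSITIVITY / ARM_LENGTH  # dimensionless strain dL/L
    print(f"strain ~ {strain:.1e}")
    print(f"fraction of a proton diameter: {SENSITIVITY / PROTON_DIAMETER:.4f}")
    ```

    The strain works out to about 2.5×10⁻²², and the sensitivity is about 0.0006 of a proton diameter, consistent with the "thousandth" in the quote.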

    Scientists built two identical machines 1865 miles apart because the wavelength of the gravitational waves they’re looking for should be about that long; if they measure the same signal in both detectors simultaneously, it will be a good indication that the signature is genuine.

    One of the new features of Advanced LIGO will be an additional mirror that will enable scientists to enhance sensitivity to different frequencies of gravitational waves. With different frequencies come different levels of spacetime distortion and hence different changes in the distance between the two mirrors. The different signals will tell scientists something about the properties of gravitational waves and their sources.

    “The extra mirror allows us to apply a boost in sensitivity to a smaller range of frequencies in the search band,” Raab says. “It works kind of like the treble/bass adjustment in your car stereo. You still hear the music, but with different frequencies enhanced.”

    Straight to the source

    Scientists at Advanced LIGO would like to identify the sources of gravitational waves.

    They most likely come from binary neutron stars like the one Hulse and Taylor discovered. But they could also originate in systems that right now exist only in theory, such as black hole binaries and neutron star-black hole binary systems.

    Christopher Berry, a research fellow at the University of Birmingham, is part of a team that is designing a way to quickly estimate where in the sky the source of a gravitational wave might originate in order to share that information with astronomers around the world, who could take a closer look.

    “You can analyze the data to determine quantities like mass, orientation and location,” he says. “One of the things we want to do with parameter estimation is quickly estimate where in the sky a source came from and then tell people with telescopes to point there.”

    Gravitational waves could also come from the same systems that produce gamma-ray bursts, the brightest known electromagnetic events in the universe. Scientists think that gamma-ray bursts may come from merging binary neutron stars, a hypothesis LIGO could investigate.

    Determining a link between gamma-ray bursts and binary neutron stars would be one outstanding achievement for Advanced LIGO, but the future observatory has potential for more, Berry says.

    “We can see inside the sun using neutrinos, and gravitational waves are yet another way to look at the universe,” he says. “We can make discoveries we weren’t expecting.”

    See the full article here.




  • richardmitnick 3:30 pm on September 30, 2014 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “Accelerating the fight against cancer” 


    September 30, 2014
    Glennda Chui

    As charged-particle therapies grow in popularity, physicists are working with other experts to make them smaller, cheaper and more effective—and more available to cancer patients in the United States.

    Once physicists started accelerating particles to high energies in the 1930s, it didn’t take them long to think of a killer app for this new technology: zapping tumors.

    Standard radiation treatments, which had already been around for decades, send X-rays straight through the tumor and out the other side of the body, damaging healthy tissue both coming and going. But protons and ions—atoms stripped of electrons—slow when they hit the body and come to a stop, depositing most of their destructive energy at their stopping point. If you tune a beam of protons or ions so they stop inside a tumor, you can deliver the maximum dose of radiation while sparing healthy tissue and minimizing side effects. This makes it ideal for treating children, whose developing bodies are particularly sensitive to radiation damage, and for cancers very close to vital tissues such as the optic nerves or spinal cord.


    Today, nearly 70 years after American particle physicist Robert Wilson came up with the idea, proton therapy has been gaining traction worldwide and in the United States, where 14 centers are treating patients and nine more are under construction. Ions such as carbon, helium and oxygen are being used to treat patients in Germany, Italy, China and Japan. More than 120,000 patients had been treated with various forms of charged-particle therapy by the end of 2013, according to the Particle Therapy Co-Operative Group.

    New initiatives from CERN research center in Europe and the Department of Energy and National Cancer Institute in the United States are aimed at moving the technology along, assessing its strengths and limitations and making it more affordable.

    And physicists are still deeply involved. No one knows more about building and operating particle accelerators and detectors. But there’s a lot more to know. So they’ve been joining forces with physicians, engineers, biologists, computer scientists and other experts to make the equipment smaller, lighter, cheaper and more efficient and to improve the way treatments are done.


    “As you get closer to the patient, you leave the world accelerator physicists live in and get closer to the land of people who have PhDs in medical physics,” says Stephen Peggs, an accelerator physicist at Brookhaven National Laboratory.

    “It’s alignment, robots and patient ergonomics, which require just the right skill sets, which is why it’s fun, of course, and one reason why it’s interesting—designing with patients in mind.”

    Knowing where to stop

    The collaborations that make charged-particle therapy work go back a long way. The first experimental treatments took place in 1954 at what is now Lawrence Berkeley National Laboratory. Later, scientists at Fermi National Accelerator Laboratory designed and built the circular accelerator at the heart of the first hospital-based proton therapy center in the United States, opened in 1990 at California’s Loma Linda University Medical Center.

    A number of private companies have jumped into the field, opening treatment centers, selling equipment and developing more compact and efficient treatment systems that are designed to cut costs. ProTom International, for instance, recently received US Food and Drug Administration approval for a system that’s small enough and light enough to ship on a plane and move in through a door, so it will no longer be necessary to build the treatment center around it. Other players include ProCure, Mevion, IBA, Varian Medical Systems, ProNova, Hitachi, Sumitomo and Mitsubishi.

    The goal of any treatment scheme is to get the beam to stop in exactly the right spot; the most advanced systems scan a beam back and forth to “paint” the 3-D volume of the tumor with great precision. Aiming it is not easy, though. Not only is every patient’s body different—a unique conglomeration of organs and tissues of varying densities—but every patient breathes, so the target is in constant motion.

    Doctors use X-ray CT scans—the CT stands for “computed tomography”—to make a 3-D image of the tumor and its surroundings so they can calculate the ideal stopping point for the proton beam. But since protons don’t travel through the body exactly the same way X-rays do—their paths are shifted by tiny, rapid changes in the tissues they encounter along the way—their end points can differ slightly from the predicted ones.

    Physicists are trying to reduce that margin of error with a technology called proton CT.

    There are 49 charged-particle treatment centers operating worldwide, including 14 in the United States, and 27 more under construction. This map shows the number of patients treated through the end of 2013 in centers that are now in operation. Source: Particle Therapy Co-Operative Group.
    Artwork by: Sandbox Studio, Chicago with Shawna X.

    Reconnoitering with proton CT

    The idea is simple: Use protons rather than X-rays to make the images. The protons are tuned to high enough energies that they go through the body without stopping, depositing about one-tenth as much radiation along their path as X-rays do.

    Detectors in front of and behind the body pinpoint where each proton beam enters and leaves, and a separate detector measures how much energy the protons lose as they pass through tissues. By directing proton beams through the patient from different angles, doctors can create a 3-D image that tells them, much more accurately than X-rays, how to tune the proton beam so it stops inside the tumor.
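    In effect, each proton measures a line integral: its energy loss approximates the integral of relative stopping power (RSP) along its path through the body. A toy one-ray version, with made-up RSP values standing in for tissue, bone and lung:

    ```python
    # Toy version of a single proton CT measurement. A proton's energy loss
    # is (approximately) the line integral of relative stopping power along
    # its path; here it crosses one row of a made-up 2D phantom.

    # Water-like tissue (RSP ~ 1.0), bone (~1.6), lung (~0.3); 1 cm cells.
    rsp_row = [1.0, 1.0, 1.6, 1.6, 0.3, 1.0, 1.0]
    CELL_WIDTH_CM = 1.0

    # Water-equivalent path length for a proton crossing this row
    wepl = sum(rsp * CELL_WIDTH_CM for rsp in rsp_row)
    print(f"WEPL = {wepl:.1f} cm")
    ```

    Collecting such line integrals for many protons at many angles produces the data that tomographic reconstruction turns into a 3-D RSP map, the step that lets doctors predict where a treatment beam will actually stop.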

    Two teams are now in friendly competition, testing rival ways to perform proton CT on “phantom” human heads made of plastic. Both approaches are based on detectors that are staples in particle physics.

    One team is made up of researchers from Northern Illinois University, Fermilab, Argonne National Laboratory and the University of Delhi in India and funded by the US Army Medical Research Acquisition Center in Maryland. They use a pair of fiber trackers on each side of the phantom head to pinpoint where the proton beams enter and exit. Each tracker contains thousands of thin plastic fibers. When a proton hits a fiber, it gives off a flash of light that is picked up by another physics standby—a silicon photomultiplier—and conveyed to a detector.

    The team is testing this system, which includes computers and software for turning the data into images, at the CDH Proton Center in Warrenville, Illinois.

    “The point is to demonstrate you can get the image quality you need to target the treatment more accurately with a lower radiation dose level than with X-ray CT,” says Peter Wilson, principal investigator for the Fermilab part of the project.

    The second project, a collaboration between researchers at Loma Linda, the University of California, Santa Cruz, and Baylor University, is financed by a $2 million grant from the National Institutes of Health. Their proton CT system is based on silicon strip detectors the Santa Cruz group developed for the Fermi Gamma-ray Space Telescope and the ATLAS experiment at CERN, among others. It’s being tested at Loma Linda.

    “We know how to detect charged particles with silicon detectors. Charged particles for us are duck soup,” says UCSC particle physicist Hartmut Sadrozinski, who has been working with these detectors for more than 30 years. Since a single scan requires tracking about a billion protons, the researchers also introduced software packages developed for high-energy physics to analyze the high volume of data coming into the detector.

    Proton CT will have to get a lot faster before it’s ready for the treatment room. In experiments with the phantom head, the system can detect a million protons per second, completing a scan in about 10 minutes, Sadrozinski says; the goal is to bring that down to 2 to 3 minutes, reducing the time the patient has to hold still and ensuring accurate images and dose delivery.

    Trimming the size and cost of ion therapy

    The first ion therapy center opened in Japan in 1994; by the end of 2013 centers in Japan, China, Germany and Italy had treated nearly 13,000 patients.

    There’s reason to think ions could be more effective than protons or X-rays for treating certain types of cancer, according to a recent review of the field published in Radiation Oncology by researchers from the National Cancer Institute and Walter Reed National Military Medical Center. Ions deliver a more powerful punch than protons, causing more damage to a tumor’s DNA, and patient treatments have shown promise.

    But the high cost of building and operating treatment centers has held the technology back, the researchers wrote; and long-term research on possible side effects, including the possibility of triggering secondary cancers, is lacking.

    The cost of building ion treatment centers is higher in part because the ions are so much heavier than protons. You need bigger magnets to steer them around an accelerator, and heavier equipment to deliver them to the patient.
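    The "bigger magnets" point can be made quantitative with the standard magnetic-rigidity relation, B·rho = p/q: the bending field times bending radius a magnet must supply grows with momentum per unit charge. A sketch comparing a therapy proton with a therapy carbon ion (the 250 MeV and 430 MeV-per-nucleon kinetic energies are typical clinical values, not figures from the article):

```python
import math

def rigidity_Tm(kinetic_GeV_per_u, mass_GeV_per_u, nucleons, charge):
    """Magnetic rigidity B*rho in tesla-metres for an ion.

    Relativistic momentum per nucleon: p = sqrt(T^2 + 2*T*m),
    then B*rho [T*m] = 3.3356 * p_total [GeV/c] / charge.
    """
    p_per_u = math.sqrt(kinetic_GeV_per_u**2 + 2 * kinetic_GeV_per_u * mass_GeV_per_u)
    return 3.3356 * p_per_u * nucleons / charge

proton = rigidity_Tm(0.250, 0.9383, nucleons=1, charge=1)   # 250 MeV proton
carbon = rigidity_Tm(0.430, 0.9315, nucleons=12, charge=6)  # 430 MeV/u carbon-12

print(f"proton: {proton:.2f} T*m, carbon: {carbon:.2f} T*m, ratio {carbon / proton:.1f}x")
```

The carbon beam needs roughly 2.7 times the bending strength of the proton beam, which is a large part of why ion gantries like Heidelberg's weigh hundreds of tons.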

    Two projects at Brookhaven National Laboratory aim to bring the size and cost of the equipment down.

    One team, led by accelerator physicist Dejan Trbojevic, has developed and patented a simpler, less expensive gantry that rotates around a stationary patient to aim an ion beam at a tumor from various angles. Gantries for ion therapy can be huge—the one in use at the Heidelberg Ion-Beam Therapy Center in Germany weighs 670 tons and is as tall as a jetliner. The new design shrinks the gantry by making a single set of simpler, smaller magnets do double duty, both bending and focusing the particle beam.

    In the second project, Brookhaven scientists are working with a Virginia company, Best Medical International, to design a system for treating patients with protons, carbon ions and other ion beams. Called the ion Rapidly Cycling Medical Synchrotron (iRCMS), it is designed to deliver ions to patients in smaller, more rapid pulses. With smaller pulses, the diameter of the beam also shrinks, along with the size of the magnets used to steer it. Brookhaven is building one of the system’s three magnet girders, radio-frequency acceleration cavities and a power supply for a prototype system. The end product must be simple and reliable enough for trained hospital technicians to operate for years.

    “A particle accelerator for cancer treatment has to be industrial, robust—not the high-tech, high-performance, typical machine we’re used to,” says Brookhaven accelerator physicist Steve Peggs, one of the lead scientists on the project. “It’s more like a Nissan than a Ferrari.”

    Artwork by: Sandbox Studio, Chicago with Shawna X.

    Launching a CERN initiative for cancer treatment

    CERN, the international particle physics center in Geneva, is best known to many as the place where the Higgs boson was discovered in 2012. In 1996 it began collaborating on a study called PIMMS (the Proton-Ion Medical Machine Study) that designed a system for delivering both proton and ion treatments. That system evolved into the equipment at the heart of two ion therapy centers: CNAO, the National Center for Oncological Treatment in Pavia, Italy, which treated its first patient in 2011, and MedAustron, scheduled to open in Austria in 2015.

    Now scientists at CERN want to spearhead an international collaboration to design a new, more compact treatment system that will incorporate the latest particle physics technologies. It’s part of a larger CERN initiative launched late last year with a goal of contributing to a global system for treating cancer with charged-particle beams.

    Part of an existing CERN accelerator, the Low Energy Ion Ring, will be converted into a facility to provide various types of charged-particle beams for research into how they affect healthy and cancerous tissue. The lab will also consider developing detectors for making medical images and controlling the treatment beam, investigating ways to control the dose the patient receives and adapting large-scale computing for medical applications.

    CERN Low Energy Ion Ring

    CERN will provide seed funding and seek out other funding from foundations, philanthropists and other sources, such as the European Union.

    “Part of CERN’s mission is knowledge transfer,” says Steve Myers, director of the medical initiative, who spent the past five years running the Large Hadron Collider as director of accelerators and technology for CERN.

    CERN LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles

    “We would like to make the technologies we have developed for particle physics available to other fields of research simply because we think it’s a nice thing to do,” he says. “All the things we do are related to the same goal, which is treating cancer tumors in the most effective and efficient way possible.”

    Expanding the options in the US

    In the US, the biggest barrier to setting up ion treatment centers is financial: Treatment centers cost hundreds of millions of dollars. Unlike in Europe and Asia, no government funding is available, so these projects have to attract private investors. But without rigorous studies showing that ion therapy is worth the added cost in terms of eradicating cancer, slowing its spread or improving patients’ lives, investors are reluctant to pony up money and insurance companies are reluctant to pay for treatments.

    Studies that rigorously compare the results of proton or ion treatment with standard radiation therapy are just starting, says James Deye, program director for medical physics at the National Cancer Institute’s radiation research program.

    The need for more research on ion therapy has caught the attention of the Department of Energy, whose Office of High Energy Physics oversees fundamental, long-term accelerator research in the US. A 2010 report, “Accelerators for America’s Future,” identified ion therapy as one of a number of areas where accelerator research and development could make important contributions to society.

    In January 2013, more than 60 experts from the US, Japan and Europe met at a workshop sponsored by the DOE and NCI to identify areas where more research is needed on both the hardware and medical sides to develop the ion therapy systems of the future. Ideally, the participants concluded, future facilities should offer treatment with multiple types of charged particles—from protons to lithium, helium, boron and carbon ions—to allow researchers to compare their effectiveness and individual patients to get more than one type of treatment.

    In June, the DOE’s Accelerator Stewardship program asked researchers to submit proposals for seed funding to improve accelerator and beam delivery systems for ion therapy.

    “If there are accelerator technologies that can better enable this type of treatment, our job is to apply our R&D and technical skills to try to improve their ability to do so,” says Michael Zisman, an accelerator physicist from Lawrence Berkeley National Laboratory who is temporarily detailed to the DOE Office of High Energy Physics.

    “Ideally we hope there will be partnerships between labs, industry, universities and medical facilities,” he says. “We don’t want good technology ideas in search of a problem. We rather want to make sure our customers are identifying real problems that we believe the application of improved accelerator technology can actually solve.”

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 2:02 pm on September 22, 2014 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “Cosmic dust proves prevalent” 


    September 22, 2014
    Kathryn Jepsen

    Space dust accounts for at least some of the possible signal of cosmic inflation the BICEP2 experiment announced in March. How much remains to be seen.

    Space is full of dust, according to a new analysis from the European Space Agency’s Planck experiment.


    That includes the area of space studied by the BICEP2 experiment, which in March announced seeing a faint pattern left over from the big bang that could tell us about the first moments after the birth of the universe.

    Gravitational Wave Background from BICEP2

    The Planck analysis, which started before March, was not meant as a direct check of the BICEP2 result. It does, however, reveal that the level of dust in the area BICEP2 scientists studied is both significant and higher than they thought.

    “There is still a wide range of possibilities left open,” writes astronomer Jan Tauber, ESA project scientist for Planck, in an email. “It could be that all of the signal is due to dust; but part of the signal could certainly be due to primordial gravitational waves.”

    BICEP2 scientists study the cosmic microwave background, a uniform bath of radiation permeating the universe that formed when the universe first cooled enough after the big bang to be transparent to light. BICEP2 scientists found a pattern within the cosmic microwave background, one that would indicate that not long after the big bang, the universe went through a period of exponential expansion called cosmic inflation. The BICEP2 result was announced as the first direct evidence of this process.

    The problem is that the same pattern, called B-mode polarization, also appears in space dust. The BICEP2 team subtracted the then-known influence of the dust from their result. But based on today’s Planck result, they didn’t manage to scrub all of it.

    How much the dust influenced the BICEP2 result remains to be seen.

    In November, Planck scientists will release their own analysis of B-mode polarization in the cosmic microwave background, in addition to a joint analysis with BICEP2 specifically intended to check the BICEP2 result. These results could answer the question of whether BICEP2 really saw evidence of cosmic inflation.

    “While we can say the dust level is significant,” writes BICEP2 co-leader Jamie Bock of Caltech and NASA’s Jet Propulsion Laboratory, “we really need to wait for the joint BICEP2-Planck paper that is coming out in the fall to get the full answer.”

    [Me? I am rooting for my homey, Alan Guth, from Highland Park, NJ, USA]

    See the full article here.




  • richardmitnick 1:51 pm on September 4, 2014 Permalink | Reply
    Tags: GEANT4, Symmetry Magazine

    From Symmetry: “Forecasting the future” 


    September 04, 2014
    Monica Friedlander

    Physicists and other scientists use the GEANT4 toolkit to identify problems before they occur.

    Physicists can tell the future—or at least foresee multiple possible versions of it. They do this through computer simulations. Simulations can help scientists predict what will happen when a particular kind of particle hits a particular kind of material in a particle detector. But physicists are not the only scientists interested in predicting how particles and other matter will interact. This information is critical in multiple fields, especially those concerned about the effects of radiation.

    At CERN in 1974, scientists created the first version of GEANT (Geometry and Tracking) to help physicists create simulations. Today it is in its fourth iteration, developed by an international collaboration of about 100 scientists from 19 countries. Anyone can download the system to a personal computer, use the C++ programming language to plug in details about the particle and material in question, and find out what will happen when the two meet.
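    GEANT4 itself is a large C++ toolkit, but the Monte Carlo idea at its core can be illustrated in a few lines: step a particle through a material and, at each step, roll dice against an interaction probability set by the material's mean free path. The sketch below is a toy illustration of that idea, not GEANT4 code; the 5 cm mean free path and 10 cm slab are invented values:

```python
import math
import random

def fraction_surviving(thickness_cm, mean_free_path_cm,
                       n_particles=100_000, step_cm=0.1, seed=1):
    """Toy Monte Carlo: fraction of particles crossing a slab without interacting.

    At each step the chance of an interaction is step / mean_free_path
    (a good approximation when the step is much shorter than the path).
    """
    rng = random.Random(seed)
    p_interact = step_cm / mean_free_path_cm
    survived = 0
    for _ in range(n_particles):
        x = 0.0
        while x < thickness_cm:
            if rng.random() < p_interact:
                break  # particle interacted inside the slab
            x += step_cm
        else:
            survived += 1  # crossed the whole slab untouched
    return survived / n_particles

sim = fraction_surviving(thickness_cm=10.0, mean_free_path_cm=5.0)
analytic = math.exp(-10.0 / 5.0)  # Beer-Lambert attenuation law
print(f"simulated {sim:.3f} vs analytic {analytic:.3f}")
```

Real GEANT4 replaces the toy interaction probability with tabulated cross sections for each particle type, energy and material, and lets users describe full detector geometries in C++.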

    Geant4 at CERN

    GEANT4 is used in some of the most advanced accelerator experiments in the world, but its user base has grown beyond the particle physics community.

    Space science and astrophysics

    NASA, the European Space Agency, Japan Aerospace Exploration Agency and many space-related companies such as Boeing and Lockheed Martin have all used GEANT4 simulations to assess radiation hazards in space.

    Most current and near-future unmanned spacecraft—headed to places like Earth orbit, the moon, Mars, Venus, Mercury, Jupiter and nearby asteroids—rely on GEANT4 simulations to assess their radiation hardness.

    Even one particle of radiation hitting a memory cell in an integrated circuit device onboard the International Space Station or other space equipment can have serious consequences. Based on information from GEANT4 simulations, scientists and engineers have made key modifications to the structure of integrated circuits and have optimized shielding structures to minimize radiation exposure.

    GEANT4 is also used for astrophysics studies. Recently scientists used it to simulate the acceleration of electrons in a solar flare and the emission of gamma rays from a pulsar.

    Air travel

    Earth’s atmosphere and magnetic field shield the planet from most cosmic radiation, but some does make it through. Areas at high altitudes are the most exposed to this radiation, as they are the least protected by layers of air, so airline passengers and crews receive higher doses of radiation than people below on the ground. Airplane companies are using GEANT4 to calculate the dose received by airline passengers and crews to help them find ways to better protect the people inside.

    Medical applications


    Imaging procedures such as CT and PET scans, which create detailed images of areas inside the body, are standard in modern diagnostic medicine. With GEANT4, medical researchers can simulate how such a procedure distributes radiation in the body while taking into account not only the motion of the machine but also that of the patient. The simulation can be done at all energy ranges and for all kinds of particles or scanning procedures.


    Traditional radiation therapy used to target cancer cells can inadvertently irradiate healthy cells in the patient. This is a serious concern, especially for younger patients who are at risk of developing secondary tumors later in life as a result. GEANT4 simulations are used to analyze and minimize this risk. By simulating the chain of interactions from beam to body, scientists can predict how various organs will be affected by treatment.


    Security

    An important part of modern security is the non-destructive scanning of cargo containers to prevent unauthorized and illegal transportation of radioactive material. Scientists use GEANT4 to simulate and optimize such scanning systems. GEANT4-based simulation is also used to optimize apparatuses for detecting explosives.

    Information technology and computing

    Just as GEANT4 can simulate the many possible ways a particle might interact in a detector, it can also simulate the many possible ways that a typical scientific code might run. GEANT4 scientists at SLAC National Accelerator Laboratory have been working with cutting-edge Silicon Valley companies such as Intel, Nvidia and Google to help them optimize their products for large-scale scientific computing.

    See the full article here.




  • richardmitnick 1:17 pm on September 3, 2014 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “Watching ‘the clock’ at the LHC” 


    September 03, 2014
    Sarah Charley

    As time ticks down to the restart of the Large Hadron Collider, scientists are making sure their detectors run like clockwork.

    Photo by Antonio Saba, CERN

    For the last two years, the Large Hadron Collider at CERN has been quietly slumbering while engineers and technicians prime it for the next run of data-taking in the summer of 2015.

    But this has been anything but a break for researchers from the LHC experiments.

    “Two years seems like a long time, but it goes by really fast,” says Michael Williams, a researcher on the LHCb experiment and assistant professor of physics at the Massachusetts Institute of Technology. “I think now it’s becoming a reality that running is coming soon, and it’s exciting.”

    CERN LHCb New

    One of the biggest tasks the collaborations are confronting right now is calibrating all the individual components so that their timing is completely synchronized. This synchronization of the components—called “the clock”—allows physicists to reconstruct the flights of particles through the different parts of the detector to form a picture of the entire collision event.

    “The clock is the foundation on which everything stands. It’s the heartbeat of the detector,” says UCLA physicist and CMS run coordinator Greg Rakness. “If the clock isn’t working, then the data makes no sense.”

    CERN CMS New

    The four largest LHC detectors—called ALICE, ATLAS, CMS and LHCb—each consist of dozens of smaller subdetectors, which in turn are supported by myriad electronics and supporting subsystems. A huge challenge is ensuring that all of the subdetectors, electronics and supporting software function as a single unit.



    “We have 18 different detectors that make up ALICE, and we have several different detection techniques,” says Federico Ronchetti, a scientist associated with CERN and the Italian laboratory INFN who serves as the ALICE experiment’s 2015 run coordinator. “You have to combine the different pieces of information to produce an event. This is an integration, one of the most critical parts of the overall detector commissioning.”

    As Rakness says: “In the end, it’s one detector.”

    In addition to being in time with themselves, the LHC detectors must be in time with the LHC. During this next run, high-energy bunches of protons accelerated inside the LHC will collide every 25 nanoseconds. If a detector’s timing is out of sync with the accelerator, scientists will have no way of accurately reconstructing the particle collisions.
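    That 25-nanosecond figure follows directly from the machine's geometry: the LHC's roughly 26.7-kilometre circumference is divided into 3564 bunch slots, each sweeping past a fixed point at very nearly the speed of light. A quick check (the circumference and slot count are standard published LHC parameters, not figures from this article):

```python
C = 26_659.0        # LHC circumference in metres
c = 299_792_458.0   # speed of light, m/s
slots = 3564        # RF bunch slots around the ring

revolution_time_s = C / c  # one lap takes ~89 microseconds
spacing_ns = revolution_time_s / slots * 1e9
print(f"bunch spacing: {spacing_ns:.2f} ns")  # ~24.95 ns, quoted as 25 ns
```

Every subdetector clock has to stay locked to that 40-million-ticks-per-second rhythm, which is why synchronization dominates the commissioning work described here.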

    If the detector were out of sync with the LHC, it would mistakenly show large chunks of energy suddenly going missing—just what physicists expect would happen if a rarely interacting particle, such as a dark matter particle, passed through the detector.

    “What better way to create a fake ‘new physics’ signal than if half the detector is out of sync?” Rakness says. “You’d have new physics all the time!”

    Even though the task is daunting, the LHC researchers charged with commissioning the detectors are confident that they and their detectors will be ready for the accelerator’s second run in early 2015.

    “We understand our detector much better now,” says Kendall Reeves, a researcher at the University of Texas at Dallas who works on the ATLAS experiment. “We have the experience from Run 1 to help out—and having that experience is invaluable. We are in a much better position now than we were at the beginning of Run 1.”

    “Nothing is too complicated,” Rakness says. “In the end, this whole complicated chain breaks down to a step-by-step process. And then it ticks.”

    CERN LHC Map
    LHC Tunnel

    See the full article here.




  • richardmitnick 1:01 pm on September 2, 2014 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “Detectors in daily life” 


    September 02, 2014
    Calla Cofield

    Ask someone what our world would look like if there were no cars or no telephones and they’ll probably have an immediate answer. It’s easy to imagine how our lives would be different without these things. But what if there were no particle detectors? Most people aren’t sure what would change if these devices disappeared—even though particle detectors are at work all around us.

    Tree diagram showing the relationship between types and classification of most common particle detectors

    Particle detectors play a role in undertakings from drug development to medical imaging, from protecting astronauts to dating ancient artifacts, from testing materials to understanding the universe.

    A particle detector is a device built to observe and identify particles. Some measure photons, particles of light such as the visible light from stars or the invisible X-rays we use to examine broken bones. Other particle detectors identify protons, neutrons and electrons—the particles that make up atoms—or even entire atoms. Some detectors are designed to detect antimatter. By sensing the electric charge and interactions of particles, the detectors can determine their presence, their energy and their motion.

    Without particle detectors, climate models and weather forecasts would be a bit cloudy.

    One of the most common tools used in modern weather forecasting is radar—radio waves sent through the air, bounced off objects (such as raindrops) and collected by radio-frequency (RF) detectors. Using radar, scientists can gather information about the location, size and distribution of raindrops or other precipitation, as well as wind speed and direction and other variables. Some more advanced instruments apply the same technique using microwaves or lasers instead of radio waves. Radar is also a primary tool in the study of tornadoes and other severe storms, often with the goal of improving safety measures for people on the ground.
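    The ranging part of radar is simple timing: a pulse travels out to the raindrops and back at the speed of light, so the distance is half the round-trip delay times c (the Doppler shift of the echo supplies the wind-speed information separately). A minimal sketch of the range calculation, where the 0.5-millisecond delay is an invented example value:

```python
C_LIGHT = 299_792_458.0  # speed of light, m/s

def echo_range_km(round_trip_s):
    """Distance to a radar target, from the round-trip echo delay.

    The pulse covers the distance twice (out and back), hence the factor of 2.
    """
    return C_LIGHT * round_trip_s / 2 / 1000

print(f"{echo_range_km(0.5e-3):.1f} km")  # a 0.5 ms echo puts the storm ~75 km away
```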

    The Earth’s climate—the long-term behavior of the atmosphere—is largely created by the interplay between two fluids: water and air. Fluids can be studied in detail in the laboratory using experimental devices that carefully track their complex motions using X-rays and other techniques that rely on particle detectors. Work done in the lab feeds into a growing understanding of our planet’s complex climate.

    At CERN, the European research center, the Cosmics Leaving Outdoor Droplets experiment, or CLOUD, is investigating the link between energetic particles from space, known as cosmic rays, and the formation of clouds in our atmosphere. The experiment suggests that the rays create aerosols, which act as seeds for new clouds. CLOUD scientists recreate atmospheric conditions inside a sealed chamber and use detectors to carefully monitor the changes taking place. CLOUD’s findings should help scientists better understand the effects of clouds and aerosols on Earth’s climate.

    Time for your checkup

    Particle detectors continue to change the way medical personnel see the human body—or more specifically, how they see into it. Many types of medical scanners, including PET (positron emission tomography) and MRI (magnetic resonance imaging) rely on particle detectors.

    Patients undergoing a PET scan receive an injection of molecules designed to accumulate in the hardest-working parts of the body, including the heart, the brain and organs such as the liver—as well as in trouble spots, such as tumors. Shortly after they arrive, a radioactive element bound to each molecule decays and emits a lightweight antimatter particle called a positron, which annihilates with a nearby electron, producing a pair of gamma rays that fly off in opposite directions. These gamma rays carry information about the body part being studied. They exit the body, are picked up by particle detectors, and the information is translated into images.
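    Annihilation actually yields two 511 keV gamma rays emitted back-to-back, and that geometry is what lets a PET scanner localize the source: the two detector hits define a "line of response" through the annihilation point, and many such lines intersect at the source. The toy reconstruction below illustrates this; the detector ring, source position and noiseless geometry are all invented simplifications, not a real scanner algorithm:

```python
import math
import random

def simulate_hits(source, radius, n_events, seed=7):
    """Back-to-back photon pairs from a point source inside a detector ring.

    Returns, for each event, the pair of points where the two photons
    cross the ring -- all a real scanner's electronics would record.
    """
    rng = random.Random(seed)
    sx, sy = source
    events = []
    for _ in range(n_events):
        theta = rng.uniform(0.0, math.pi)
        dx, dy = math.cos(theta), math.sin(theta)
        # Intersect the line source + t*(dx, dy) with the circle |p| = radius:
        # t^2 + 2*b*t + c = 0 with b = s.d and c = |s|^2 - R^2.
        b = sx * dx + sy * dy
        c = sx * sx + sy * sy - radius * radius
        t1 = -b + math.sqrt(b * b - c)
        t2 = -b - math.sqrt(b * b - c)
        events.append(((sx + t1 * dx, sy + t1 * dy),
                       (sx + t2 * dx, sy + t2 * dy)))
    return events

def reconstruct(events):
    """Least-squares point closest to every line of response (2x2 normal equations)."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x1, y1), (x2, y2) in events:
        norm = math.hypot(x2 - x1, y2 - y1)
        dx, dy = (x2 - x1) / norm, (y2 - y1) / norm
        # The projector (I - d d^T) measures offset perpendicular to the line.
        a11 += 1 - dx * dx; a12 += -dx * dy; a22 += 1 - dy * dy
        b1 += (1 - dx * dx) * x1 - dx * dy * y1
        b2 += -dx * dy * x1 + (1 - dy * dy) * y1
    det = a11 * a22 - a12 * a12
    return ((b1 * a22 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

hits = simulate_hits(source=(1.0, -0.5), radius=40.0, n_events=200)
x, y = reconstruct(hits)
print(f"recovered source: ({x:.3f}, {y:.3f})")  # ~ (1.000, -0.500)
```

Real PET reconstruction works with millions of noisy lines of response and builds a full 3D activity map rather than locating a single point, but the underlying geometric principle is the same.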

    MRI machines surround the body with strong magnetic fields. Manipulating those fields can cause hydrogen atoms inside the body to emit faint radio signals. Different body tissues emit slightly different signals, and a highly specialized RF detector measures those subtleties, revealing tissue structure.

    Even X-ray film is a simple type of particle detector: It turns black where it is exposed to X-rays and remains translucent where the X-rays are blocked by bones and teeth. More advanced X-ray detectors used today can image tissue as well.

    In addition, various types of particle detectors are used in radiation therapy for cancer, which bombards cancer cells with high-energy photons, protons or whole atoms. These particles are forms of ionizing radiation, meaning they can damage the molecules in cancer cells, including the cells’ DNA, making it more difficult for the cells to divide and, ideally, halting the growth of the tumor.

    At the corner of physics and pharmaceuticals

    Particle detectors help scientists shine a light on viruses and develop drugs to fight them.

    Intense, focused beams of light created by machines known as synchrotrons provide a window to the microscopic world. Shining such light on certain viruses—like those that cause strains of influenza, or the human immunodeficiency virus—reveals their unique, detailed structure. Particle detectors capture information about the interaction between the photons and the viruses and feed it back to the scientists. Armed with this information, researchers have engineered drugs that specifically target those viruses.

    Art and history

    Head to your local museum, and you may see evidence of particle detectors at work.

    Synchrotrons and other material analysis instruments can reveal the chemical make-up of art, artifacts and fossils. For anthropologists, this information can help determine the age and origin of objects. Art historians use X-ray, infrared, UV and visible light, and even beams of electrons, to gather detailed information about great works. This information is particularly important when deciding how best to restore and preserve paintings, sculptures and more. And in some cases, as with a painting by Vincent van Gogh, studies with particle detectors have revealed hidden treasures below the surface: earlier works of art painted over by the artist.

    By sea and by air

    Take a trip to the airport and you’ll be asked to place your bag on an X-ray scanner before you step through a metal detector or millimeter scanner—all of which use particle detectors.

    Cargo containers shipped by air or by sea undergo similar scanning processes. Hundreds to thousands of containers may pass through a major airport or seaport every day and must be scanned quickly and efficiently to ensure dangerous materials aren’t inside. Cargo scanners primarily use X-rays, but in recent years developments have been made using neutrons. Within this busy industry, scientists and engineers are being challenged to build better scanners and improved particle detectors.

    The final frontier

    In the hostile environment of space, astronauts require protection from solar radiation.

    High-energy photons and other particles ejected by the sun threaten to seriously harm humans who venture above the clouds. Down on the ground, earthlings are protected from most of these particles by the atmosphere. In space, particle detectors play a vital role in keeping astronauts safe from radiation. NASA scientists use particle detectors to carefully monitor each astronaut’s exposure to radiation. Satellite detectors look out for intense bursts of radiation from the sun, and provide a warning to astronauts when excess amounts are headed their way.

    The list goes on

    Particle detectors help test the structural integrity of industrial equipment, such as steam turbines and airplane engines. Smart phones and other computing devices contain semiconductors that are engineered using particle detectors in a process called ion implantation. In geology, neutrons and neutron detectors are used to identify oil reserves and rare minerals deep underground. Electron beams sterilize food, food packaging and medical equipment for consumer safety—and they use particle detectors. Muons—particles raining down from outer space—have been used to probe geophysical phenomena, like the internal structure of volcanoes and mountains. More recently, scientists have been working on devices that use muons to search for radioactive materials in waste containers or cargo ships.

    And of course, physicists use particle detectors to learn more about the universe, from finding subatomic particles such as the Higgs boson, to searching for new particles that would explain dark matter and other unsolved phenomena, to looking back in time at the first few seconds after the big bang.

    Particle detectors are used every day, making us safer, healthier and more knowledgeable and helping us get things done.

    See the full article here.




  • richardmitnick 11:58 am on August 29, 2014 Permalink | Reply
    Tags: JUNO Experiment, Symmetry Magazine

    From Symmetry: “Massive neutrino experiment proposed in China” 


    August 29, 2014
    Calla Cofield

    China’s neutrino physics program could soon expand with a new experiment aimed at cracking a critical neutrino mystery.

    Physicists have proposed building one of the largest-ever neutrino experiments in the city of Jiangmen, China, about 60 miles outside of Hong Kong. It could help answer a fundamental question about the nature of neutrinos.

    Jiangmen Underground Neutrino Observatory

    The Jiangmen Underground Neutrino Observatory, or JUNO, gained official status in 2013 and established its collaboration this month. Scientists are currently awaiting approval to start constructing JUNO’s laboratory near the Yangjiang and Taishan nuclear power plants. If it is built, current projections anticipate it will start taking data in 2020.

    The plan is to bury the laboratory in a mountain under roughly half of a mile of rock and earth, a shield from distracting cosmic rays. From this subterranean seat, JUNO’s primary scientific goal would be to resolve the question of neutrino mass. There are three known neutrino types, or flavors: electron, muon and tau. Scientists know the difference between the masses of each neutrino, but not their specific values—so they don’t yet know which neutrino is heaviest or lightest.

    “This is very important for our understanding of the neutrino picture,” says Yifang Wang, spokesperson for JUNO and director of the Institute of High Energy Physics of the Chinese Academy of Sciences. “For almost every neutrino model, you need to know which neutrino is heavier and which one is lighter. It has an impact on almost every other question about neutrinos.”

    To reach this goal, JUNO needs to acquire a hoard of data, which requires two key elements: a large detector and a high influx of neutrinos.

    The proposed detector design is a liquid scintillator—the same basic setup used to detect neutrinos for the first time in 1956. The detector consists primarily of an acrylic sphere 34.5 meters (nearly 115 feet) in diameter, filled with fluid engineered specifically for detecting neutrinos. When a neutrino interacts with the fluid, a chain reaction creates two tiny flashes of light. An additional sphere, made of photomultiplier tubes, would surround the acrylic sphere and capture these light signals.

    The more fluid the detector has, the more neutrino interactions the experiment can expect to see. Current liquid scintillator experiments include the Borexino experiment at the Gran Sasso Laboratory in Italy, which contains 300 tons of target liquid, and KamLAND in Japan, which contains a 1,000-ton target. If plans go ahead, JUNO will be the largest liquid scintillator detector ever built, containing 20,000 tons of target liquid.
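    The 20,000-ton figure is consistent with the sphere's geometry: a 34.5-metre-diameter sphere holds about 21,500 cubic metres, and linear-alkylbenzene-based scintillator is slightly less dense than water. A quick check (the ~0.86 tons-per-cubic-metre density is a typical literature value for such scintillators, not a figure from the article):

```python
import math

diameter_m = 34.5
density_t_per_m3 = 0.86  # typical density of LAB-based liquid scintillator

volume_m3 = (4 / 3) * math.pi * (diameter_m / 2) ** 3  # sphere volume
mass_t = volume_m3 * density_t_per_m3
print(f"volume: {volume_m3:,.0f} m^3, scintillator mass: {mass_t:,.0f} tons")
```

The result lands within about ten percent of the quoted 20,000 tons, roughly twenty times KamLAND's target mass.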

    To discover the mass order of the three neutrino flavors, JUNO will look specifically at electron antineutrinos produced by the two nearby nuclear power plants.

    “Only in Asia are there relatively new reactor power plants that can have four to six reactor cores in the same place,” Wang says. With the potential to run four to six cores each, the Chinese reactors would send a dense shower of neutrinos toward JUNO’s detector. Over time, a picture of the antineutrino energies would emerge. The order of the neutrino masses influences what that energy spectrum looks like.

    Experiment representatives say JUNO could reach this goal by 2026.

    It’s possible that the NOvA experiment in the United States or the T2K experiment in Japan, both of which are currently taking data, could make a measurement of the neutrino mass hierarchy before JUNO. At least four proposed experiments could also reach the same goal. But only JUNO would make the measurement via this particular approach.

    The JUNO experiment would also tackle various other questions about the nature of neutrinos and refine some previously made measurements. If a supernova went off in our galaxy, JUNO would be able to observe the neutrinos it released. JUNO would also be the largest and most sensitive detector for geoneutrinos, which are produced by the decay of radioactive elements in the earth.

    Six nations have officially joined China in the collaboration: the Czech Republic, France, Finland, Germany, Italy and Russia. US scientists are actively participating in JUNO, but the United States is not currently an official member of the collaboration.

    See the full article here.



