Tagged: Physics Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 8:45 am on December 7, 2016 Permalink | Reply
    Tags: Physics, Researchers discover hot hydrogen atoms in Earth's upper atmosphere

    From U Illinois: “Researchers discover hot hydrogen atoms in Earth’s upper atmosphere” 


    University of Illinois

    No writer credit found

    A schematic diagram of the Global Ultraviolet Imager observational geometry. The TIMED satellite is orbiting at 625 km and viewing in the anti-sunward limb direction. No image credit.


    A team of University of Illinois researchers has discovered the existence of hot atomic hydrogen (H) atoms in an upper layer of Earth’s atmosphere known as the thermosphere. This finding, which the authors report today in Nature Communications, significantly changes current understanding of the H distribution and its interaction with other atmospheric constituents.

    Because H atoms are very light, they can easily overcome a planet’s gravitational force and permanently escape into interplanetary space. The ongoing atmospheric escape of H atoms is one reason why Earth’s sister planet, Mars, has lost the majority of its water. In addition, H atoms play a critical role in the physics governing the Earth’s upper atmosphere and also serve as an important shield for societies’ technological assets, such as the numerous satellites in low earth orbit, against the harsh space environment.
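    The escape argument can be put in rough numbers. The sketch below (an illustrative back-of-the-envelope calculation, with an assumed exobase altitude of ~500 km and an assumed temperature of 1000 K, neither taken from the article) compares Earth's escape velocity with the typical thermal speed of a hydrogen atom: the average atom is bound, but the fast tail of the velocity distribution is not, so a steady trickle escapes.

```python
import math

# Illustrative only: compare Earth's escape velocity with the typical
# thermal speed of a hydrogen atom in the upper atmosphere.
G = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24            # kg
R_EXOBASE = 6.371e6 + 500e3   # m; assumed exobase altitude of ~500 km
K_B = 1.381e-23               # Boltzmann constant, J/K
M_H = 1.674e-27               # mass of a hydrogen atom, kg

def escape_velocity(mass, radius):
    """Speed needed to escape a body's gravity from a given radius."""
    return math.sqrt(2 * G * mass / radius)

def thermal_speed(temperature, particle_mass):
    """Most probable speed of a Maxwell-Boltzmann distribution."""
    return math.sqrt(2 * K_B * temperature / particle_mass)

v_esc = escape_velocity(M_EARTH, R_EXOBASE)   # ~10.8 km/s
v_th = thermal_speed(1000.0, M_H)             # ~4.1 km/s at 1000 K

# The mean H atom is gravitationally bound (v_th < v_esc), but the
# high-speed tail of the Maxwell-Boltzmann distribution exceeds v_esc,
# which is why hydrogen leaks away over geological time.
print(f"escape velocity: {v_esc/1e3:.1f} km/s")
print(f"thermal speed of H at 1000 K: {v_th/1e3:.1f} km/s")
```

For a heavier atom such as oxygen, the thermal speed drops by a factor of four, which is why Earth keeps its oxygen but slowly loses hydrogen.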

    Lara Waldrop

    “Hot H atoms had been theorized to exist at very high altitudes, above several thousand kilometers, but our discovery that they exist as low as 250 kilometers was truly surprising,” said ECE ILLINOIS Assistant Professor Lara Waldrop, principal investigator of the project. Waldrop is also affiliated with the Coordinated Science Lab at Illinois. “This result suggests that current atmospheric models are missing some key physics that impacts many different studies, ranging from atmospheric escape to the thermal structure of the upper atmosphere.”

    The discovery was enabled by the development of new numerical techniques and their application to years’ worth of remote sensing measurements acquired by NASA’s Thermosphere Ionosphere Mesosphere Energetics and Dynamics (TIMED) satellite. “Classical assumptions about upper atmospheric physics didn’t allow for the presence of hot H atoms at these heights,” recalled Dr. Jianqi Qin, the ECE ILLINOIS research scientist who developed the data analysis technique.

    Dr. Jianqi Qin

    “Once we changed our approach to avoid this unphysical assumption, we were able to correctly interpret the data for the first time.”

    Atomic hydrogen efficiently scatters ultraviolet radiation emitted by the sun, and the amount of scattered light depends sensitively on the number of H atoms present in the atmosphere. As a result, remote observations of the scattered H emission, such as those made by NASA’s TIMED satellite, can be used to probe the abundance and spatial distribution of this key atmospheric constituent. In order to extract information about the upper atmosphere from such measurements, one needs to calculate exactly how the solar photons are scattered, which falls squarely within Qin’s expertise.

    Under support from the National Science Foundation and NASA, the researchers developed a model of the radiative transfer of the scattered emission along with a new analysis technique that incorporated a transition region between the lower and upper extents of the H distribution. “It turns out that the new model fits the measurements perfectly,” said Qin. “Our analysis of the TIMED data led to the counter-intuitive finding that the temperature of the H atoms in the thermosphere increases significantly with declining solar activity, in contrast to the ambient atmospheric temperature, which decreases with declining solar activity.”
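    The retrieval idea can be illustrated with a toy model (my own construction, not the authors' radiative-transfer code): in an optically thin limit, the scattered brightness seen above a given tangent altitude is proportional to the H column density there, so for an isothermal exponential profile n(z) = n0·exp(-z/H) the brightness fall-off with altitude encodes the scale height H, and hence the temperature. A simple grid-search least-squares fit recovers it from noisy synthetic data:

```python
import math
import random

def column(z_km, n0, scale_height_km):
    """Column density above altitude z for an exponential profile:
    integral of n0*exp(-z'/H) from z to infinity = n0*H*exp(-z/H)."""
    return n0 * scale_height_km * math.exp(-z_km / scale_height_km)

# Synthetic "observations" with a true scale height of 600 km
# and 2% multiplicative noise (all numbers are made up).
random.seed(0)
altitudes = [250 + 100 * i for i in range(10)]
obs = [column(z, 1.0, 600.0) * (1 + 0.02 * random.gauss(0, 1))
       for z in altitudes]

# Grid-search least squares for the best-fit scale height.
best_h, best_sse = None, float("inf")
for h in range(300, 901, 5):
    sse = sum((column(z, 1.0, h) - y) ** 2 for z, y in zip(altitudes, obs))
    if sse < best_sse:
        best_h, best_sse = h, sse

print(f"retrieved scale height: {best_h} km (true: 600 km)")
```

The real analysis is far more involved (multiple scattering, a transition region between hot and cold populations), but the principle is the same: forward-model the emission and adjust the atmospheric parameters until the model matches the data.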

    Their results also show that the presence of such hot H atoms in the thermosphere significantly affects the distribution of the H atoms throughout the entire atmosphere. The origin of such hot H atoms, previously thought not to be able to exist in the thermosphere, is still a mystery. “We know that there must be a source of hot H atoms, either in the local thermosphere or in more distant layers of the atmosphere, but we do not have a solid answer yet,” said Waldrop.

    Qin added, “We will definitely keep working on this puzzle, because knowledge about the H density distribution is critical to the investigation of our atmospheric system as well as its response to space weather, which affects many space-based technologies that are so important for our modern society.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    The University of Illinois at Urbana-Champaign community of students, scholars, and alumni is changing the world.

    With our land-grant heritage as a foundation, we pioneer innovative research that tackles global problems and expands the human experience. Our transformative learning experiences, in and out of the classroom, are designed to produce alumni who desire to make a significant, societal impact.

  • richardmitnick 1:55 pm on December 6, 2016 Permalink | Reply
    Tags: Physics, This genius map explains how everything in physics is connected

    From Science Alert: “This genius map explains how everything in physics is connected” 




    1 DEC 2016

    Physics is a huge, complex field. It also happens to be one of the most fascinating, dealing with everything from black holes and wormholes to quantum teleportation and gravitational waves.

    But unless you have an innate knowledge of the field, it’s pretty hard to figure out how all these concepts actually fit together – and how they tie in with things like the physics of inertia and circuits that we learned in high school.

    After all, everyone is constantly trying to prove Einstein wrong, and Stephen Hawking has famously struggled to come up with a ‘theory of everything’, so it’s easy to get confused about how things do actually fit together in physics (if at all).

    To straighten that out once and for all, YouTuber Dominic Walliman has created a map that shows how the many branches of physics link together, from the earliest days of classical physics and Isaac Newton, all the way through to Einstein’s relativity and quantum physics (with a little bit of philosophy thrown in there for good measure).

    If just the thought of a physics map breaks you out in an anxious sweat, don’t worry: we promise it’s a lot less scary when you see it.

    You can buy a poster version of the map here, and also download a higher res version.

    If that still just makes you feel a little nauseous, don’t worry, because Walliman has also created an amazing animation that takes you through this map step by step, and summarises the history of physics, in just 8 delightful minutes.

    Access the mp4 video here.

    It takes you all the way from Newton’s falling apple to today’s scientists trying to peer inside black holes and find a theory to unify gravity with quantum mechanics.

    The video shows that there’s a gaping “chasm of ignorance” that physicists need to fill in before we can truly understand how the Universe works. This includes things like dark matter and energy, which work in theory, but so far have never been directly observed or explained.

    The bottom line in all of this is that, the more we learn, the more we realise how much we have left to discover, and that’s one of the things we love the most about science.

    So, for anyone who’s ever hurt their brain by trying to think about what the Universe is expanding into, or what exactly space-time is made of, this is for you. Because when the history of physics is broken down into a palatable 8 minutes, it suddenly doesn’t seem so scary after all.

    Access the mp4 video here.

    See the full article here.


  • richardmitnick 12:55 pm on December 6, 2016 Permalink | Reply
    Tags: Deep learning takes on physics, Physics

    From Symmetry: “Deep learning takes on physics” 



    Molly Olmstead

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Can the same type of technology Facebook uses to recognize faces also recognize particles?

    When you upload a photo of one of your friends to Facebook, you set into motion a complex behind-the-scenes process. An algorithm whirs away, analyzing the pixels in the photo until it spits out your friend’s name. This same cutting-edge technique enables self-driving cars to distinguish pedestrians and other vehicles from the scenery around them.

    Can this technology also be used to tell a muon from an electron? Many physicists believe so. Researchers in the field are beginning to adapt it to analyze particle physics data.

    Proponents hope that using deep learning will save experiments time, money and manpower, freeing physicists to do other, less tedious work. Others hope they will improve the experiments’ performance, making them better able to identify particles and analyze data than any algorithm used before. And while physicists don’t expect deep learning to be a cure-all, some think it could be key to warding off an impending data-processing crisis.

    Neural networks

    Up until now, computer scientists have often coded algorithms by hand, a task that requires countless hours of work with complex computer languages. “We still do great science,” says Gabe Perdue, a scientist at Fermi National Accelerator Laboratory. “But I think we could do better science.”

    Deep learning, on the other hand, requires a different kind of human input.

    One way to conduct deep learning is to use a convolutional neural network, or CNN. CNNs are modeled after human visual perception. Humans process images using a network of neurons in the body; CNNs process images through layers of inputs called nodes. People train CNNs by feeding them pre-processed images. Using these inputs, an algorithm continuously tweaks the weight it places on each node and learns to identify patterns and points of interest. As the algorithm refines these weights, it becomes more and more accurate, often outperforming humans.

    Convolutional neural networks break down data processing in a way that short-circuits steps by tying multiple weights together, meaning fewer elements of the algorithm have to be adjusted.
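    The shared-weight convolution described above can be sketched in a few lines of plain Python (a hypothetical hand-picked kernel for illustration, not a trained network): one small kernel is slid across the whole image, so the same handful of weights is reused at every position.

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (strictly, cross-correlation, as in
    most deep-learning libraries) with a single shared-weight kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            # The SAME kernel weights are applied at every (i, j):
            # this weight tying is what makes CNNs so parameter-efficient.
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out

# A tiny image with a vertical 0->1 edge, and a kernel that responds
# to exactly that pattern -- the kind of low-level feature early CNN
# layers learn on their own during training.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
print(conv2d(image, kernel))  # strongest response along the middle column
```

In a real CNN the kernel weights are not hand-picked like this; they start random and are adjusted by training, and many kernels are stacked in layers.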

    CNNs have been around since the late ’90s. But in recent years, breakthroughs have led to more affordable hardware for processing graphics, bigger data sets for training and innovations in the design of the CNNs themselves. As a result, more and more researchers are starting to use them.

    The development of CNNs has led to advances in speech recognition and translation, as well as in other tasks traditionally completed by humans. A London-based company owned by Google used a CNN to create AlphaGo, a computer program that in March beat the second-ranked international player of Go, a strategy board game far more complex than chess.

    CNNs have made it much more feasible to handle previously prohibitively large amounts of image-based data—the kind of amounts seen often in high-energy physics.

    Reaching the field of physics

    CNNs became practical around the year 2006 with the emergence of big data and graphics processing units, which have the necessary computing power to process large amounts of information. “There was a big jump in accuracy, and people have been innovating like wild on top of that ever since,” Perdue says.

    Around a year ago, researchers at various high-energy experiments began to consider the possibility of applying CNNs to their experiments. “We’ve turned a physics problem into, ‘Can we tell a car from a bicycle?’” says SLAC National Accelerator Laboratory researcher Michael Kagan. “We’re just figuring out how to recast problems in the right way.”

    For the most part, CNNs will be used for particle identification and classification and particle-track reconstruction. A couple of experiments are already using CNNs to analyze particle interactions, with high levels of accuracy. Researchers at the NOvA neutrino experiment, for example, have applied a CNN to their data.

    FNAL/NOvA experiment

    “This thing was really designed for identifying pictures of dogs and cats and people, but it’s also pretty good at identifying these physics events,” says Fermilab scientist Alex Himmel. “The performance was very good—equivalent to 30 percent more data in our detector.”

    Scientists on experiments at the Large Hadron Collider hope to use deep learning to make their experiments more autonomous, says CERN physicist Maurizio Pierini.


    “We’re trying to replace humans on a few tasks. It’s much more costly to have a person watching things than a computer.”

    CNNs promise to be useful outside of detector physics as well. On the astrophysics side, some scientists are working on developing CNNs that can discover new gravitational lenses, massive celestial objects such as galaxy clusters that can distort light from distant galaxies behind them. The process of scanning the telescope data for signs of lenses is highly time-consuming, and normal pattern-recognizing programs have a hard time distinguishing their features.

    “It’s fair to say we’ve only begun to scratch the surface when it comes to using these tools,” says Alex Radovic, a postdoctoral fellow at The College of William & Mary who works on the NOvA experiment at Fermilab.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    The upcoming data flood

    Some believe neural networks could help avert what they see as an upcoming data processing crisis.

    An upgraded version of the Large Hadron Collider planned for 2025 will produce roughly 10 times as much data.


    The Dark Energy Spectroscopic Instrument will collect data from about 35 million cosmic objects, and the Large Synoptic Survey Telescope will capture high-resolution video of nearly 40 billion galaxies.

    LBNL/DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory starting in 2018

    LSST/Camera, built at SLAC
    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón Chile

    Data streams promise to grow, but previously exponential growth in the power of computer chips is predicted to falter. That means greater amounts of data will become increasingly expensive to process.

    “You may need 100 times more capability for 10 times more collisions,” Pierini says. “We are going toward a dead end for the traditional way of doing things.”

    Not all experiments are equally fit for the technology, however.

    “I think this’ll be the right tool sometimes, but it won’t be all the time,” Himmel says. “The more dissimilar your data is from natural images, the less useful the networks are going to be.”

    Most physicists would agree that CNNs are not appropriate for data analysis at experiments that are just starting up, for example—neural networks are not very transparent about how they do their calculations. “It would be hard to convince people that they have discovered things,” Pierini says. “I still think there’s value to doing things with paper and pen.”

    In some cases, the challenges of running a CNN will outweigh the benefits. For one, the data need to be converted to image form if they aren’t already. And the networks require huge amounts of data for the training—sometimes millions of images taken from simulations. Even then, simulations aren’t as good as real data. So the networks have to be tested with real data and other cross-checks.

    “There’s a high standard for physicists to accept anything new,” says Amir Farbin, an associate professor of physics at The University of Texas, Arlington. “There’s a lot of hoops to jump through to convince everybody this is right.”

    Looking to the future

    For those who are already convinced, CNNs spawn big dreams for faster physics and the possibility of something unexpected.

    Some look forward to using neural networks for detecting anomalies in the data—which could indicate a flaw in a detector or possibly a hint of a new discovery. Rather than trying to find specific signs of something new, researchers looking for new discoveries could simply direct a CNN to work through the data and try to find what stands out. “You don’t have to specify which new physics you’re searching for,” Pierini says. “It’s a much more open-minded way of taking data.”

    Someday, researchers might even begin to tackle physics data with unsupervised learning. In unsupervised learning, as the name suggests, an algorithm would train on vast amounts of data without human guidance. Scientists would simply give algorithms data, and the algorithms would figure out for themselves what conclusions to draw from it.

    “If you had something smart enough, you could use it to do all types of things,” Perdue says. “If it could infer a new law of nature or something, that would be amazing.”

    “But,” he adds, “I would also have to go look for new employment.”

    See the full article here.


    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 7:00 am on December 3, 2016 Permalink | Reply
    Tags: In Practice: What do data analysts do all day?, Physics, The appeal of the unknown

    From CERN: “In Practice: What do data analysts do all day?” 



    2 Dec 2016
    Kathryn Coldham
    Kate Kahle
    Harriet Jarlett

    CMS physicist Nadjieh Jafari switched from theoretical to experimental physics early on in her career. “It was an easy decision,” she says. “Once I saw CERN, it became my quest.” (Image: Sophia Bennett/CERN)

    Another day, another mountain of data to analyse. In 2016, CERN’s Large Hadron Collider produced more collisions than in all previous years of operation put together. Experimental physicists spend much of their professional lives analysing collision data, working towards a potential discovery or to sharpen our picture of nature. But when the day-to-day findings become predictable, do physicists lose motivation?

    What if there’s nothing there?

    CERN has made headlines with its discoveries, but does this mean today’s researchers are just seeking fame and fortune? For most, being front-page news is not what stokes their physics passion, as they stare at their computer screens for hours. Instead, it’s the knowledge and excitement of understanding our universe at the most fundamental level.

    Siegfried Foertsch, run coordinator of the ALICE experiment, is motivated by “the completely new discoveries that lie around the corner. They’ve become ascertainable because of the new energies that the LHC machine is providing.”

    Sitting in the ALICE control room, Siegfried explains: “I think what motivates people in these experiments is that you are entering terra incognita, it’s completely new science. It drives most people in these big experiments, it’s about new discoveries.” (Image: Sophia Bennett/CERN)

    These headline-worthy discoveries are rare. Instead, researchers make small, incremental findings day-by-day. “It doesn’t bother me that it’s not going to make front-page news. I know that within the particle physics community the research is important and that’s enough,” says Sneha Malde of the LHCb experiment.

    For CMS physicist Anne-Marie Magnan, her colleagues provide the much-needed push.

    “We have deadlines, so if you are part of an analysis you have pressure to make progress and you put personal pressure on yourself because you want to see the result. If you’re on a review committee you have deadlines, you need to provide feedback, the same if you’re managing a subgroup, you’re responsible for the group to show results at conferences. So you push people and they push you back to try and make progress,” she explains.

    Magnan analyses data to search for Higgs bosons. She describes her daily work as “programming, mostly. A lot of interaction with people, I have students to Skype with and when they say ‘I’m stuck, I don’t know what to do’ we chat and find solutions. At some points I’ve been a subgroup convener. There you encourage people to make progress and provide feedback on their analyses.”

    “It’s an exercise of patience because, after time, the incremental findings lead to a result. And even if you’re just working towards a result, you still have to solve technical problems each day,” explains Leticia Cunqueiro Mendez, a senior postdoctoral researcher working with the ALICE detector.

    Building bonds: the road to success

    Each of these small, incremental discoveries is documented in a research paper. At CERN, these papers are often authored by hundreds, even thousands, of people, as was the case with the papers announcing the Higgs discovery. And the authors aren’t just experimental physicists; students, technicians, engineers and computer scientists are often equally involved.

    Having a high level of motivation can only get a physicist so far; working with others is the route to success.

    “People need each other here,” says Siegfried Foertsch, “the idea of a physicist without an engineer at CERN is unthinkable, and similarly vice versa. It’s symbiotic.”

    “I think the work of the technicians is a major contribution to the applied physics that I’m involved in. They are the unsung heroes in most of what we do to some extent,” says David Francis, Project Leader of the ATLAS Trigger and Data Acquisition System.

    For Cunqueiro Mendez, “the main thing is to know the possibilities of your detector and to have an interesting idea of what physics might be observable. For this you need interaction with the theorists so, in principle, you have to be reading papers and attending conferences. Here at CERN, you can meet your theory colleagues for a coffee and discuss your possibilities.”

    Eeney meeney miney mo

    Working with others can be collaborative, but it can also be competitive. There is a point of pride for one experiment to beat the competition to a discovery.

    Sneha Malde standing in the corridor outside of her office (Image: Maximilien Brice/CERN)

    While the ATLAS and CMS experiments perform similar searches, the LHCb and ALICE experiments have particular fields of study, and the work that the associated physicists do differs as a result.

    “Bump searches” are what physicists call attempts to find statistically significant peaks in the data; the presence of a bump could indicate the existence of a new particle. Some of these searches are done at ATLAS and CMS, where new particles are the name of the game. At LHCb and ALICE, physicists instead aim for precision measurements of known phenomena rather than searches for new particles.
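    The idea of a bump search can be sketched with a toy example (purely illustrative, with invented numbers, not any experiment's statistical machinery): compare binned event counts against a smooth background expectation and flag bins whose excess would be improbable under ordinary Poisson fluctuations.

```python
import math

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu): one minus the sum of the
    probabilities of observing fewer than k events."""
    return 1.0 - sum(math.exp(-mu) * mu ** i / math.factorial(i)
                     for i in range(k))

# Invented binned counts: a flat expected background of 100 events per
# bin, with a suspicious excess planted in bin 4.
background = [100] * 10
observed = [96, 104, 99, 101, 150, 103, 98, 102, 97, 100]

for i, (obs, bkg) in enumerate(zip(observed, background)):
    p = poisson_sf(obs, bkg)
    if p < 1e-4:  # crude local-significance threshold
        print(f"bin {i}: {obs} observed vs {bkg} expected, p = {p:.2e}")
```

Real analyses are far more careful: the background shape is itself fitted, systematic uncertainties are folded in, and the "look-elsewhere effect" of scanning many bins is accounted for before anyone claims a discovery.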

    “I don’t think I would be very happy just looking at empty plots with nothing in them, which could happen in bump searches if they don’t find anything new,” muses Malde. “I like the precision measurement aspect of LHCb’s data.”

    Studying and searching for different things means the data plots for different experiments look very different.

    “I like having obvious things in my plots. I like nice bumps, big ones. We have lots of bumps that don’t disappear, and they are really big peaks. We don’t have bumps, we have mountains!” – Sneha Malde, LHCb data analyst

    ATLAS physicist Anatolli Romaniouk marvels at this range of LHC experiments. They “embrace an incredible field of physics, they search for everything.”

    “This is physics; if we know what we are searching for, then we don’t need experiments. If you know what exactly you want to find, it’s already found, or will be found soon. That’s why our experiments are beautiful because these experiments embrace an incredible field of physics, the LHC, it searches for everything,” explains Romaniouk.

    The beauty of the unknown

    ATLAS physicist Anatolli Romaniouk has worked at CERN since 1990. The students he sees in the collaboration “know a bit of electronics, data acquisition and data analysis, very often they do it from second year of university and this is interesting. I find this brilliant, that they practice real physics at an early stage of their education.” (Image: Sophia Bennett/CERN)

    The appeal of the unknown, the as yet undiscovered, ignites the curiosity in the physicists and fuels them in their analyses.

    “When you have something in theory and think that it could be real – that it could exist – then you start to really think how you can look for it and try to find it,” says CMS physicist Nadjieh Jafari. “You build your experiment based on the theories. The CMS’s muon system was perfectly designed to discover the Higgs boson but at the moment of designing it, it was just an idea that we might find it. For me, that’s the most beautiful part of what we do.”

    See the full article here.


    Meet CERN in a variety of places:

    Cern Courier





    Quantum Diaries

  • richardmitnick 7:02 pm on December 2, 2016 Permalink | Reply
    Tags: Physics

    From Ethan Siegel: “What every layperson should know about string theory” 



    The idea that instead of 0-dimensional particles, it’s 1-dimensional strings that fundamentally make up the Universe is at the core of string theory. Image credit: flickr user Trailfan, via https://www.flickr.com/photos/7725050@N06/631503428.

    If you’ve ever wondered just why it has piqued the interest of so many, have a look inside.

    “I just think too many nice things have happened in string theory for it to be all wrong. Humans do not understand it very well, but I just don’t believe there is a big cosmic conspiracy that created this incredible thing that has nothing to do with the real world.” -Edward Witten

    It’s one of the most brilliant, controversial and unproven ideas in all of physics: string theory. At the heart of string theory is the thread of an idea that’s run through physics for centuries, that at some fundamental level, all the different forces, particles, interactions and manifestations of reality are tied together as part of the same framework. Instead of four independent fundamental forces — strong, electromagnetic, weak and gravitational — there’s one unified theory that encompasses all of them. In many regards, string theory is the best contender for a quantum theory of gravitation, which just happens to unify at the highest-energy scales. Although there’s no experimental evidence for it, there are compelling theoretical reasons to think it might be true. A year ago, the top living string theorist, Ed Witten, wrote a piece on what every physicist should know about string theory. Here’s what that means, translated for non-physicists.

    The difference between standard quantum field theory interactions (L), for point-like particles, and string theory interactions (R), for closed strings. Image credit: Wikimedia Commons user Kurochka.

    When it comes to the laws of nature, it’s remarkable how many similarities there are between seemingly unrelated phenomena. The way that two massive bodies gravitate, according to Newton’s laws, is almost identical to the way that electrically charged particles attract-or-repel. The way a pendulum oscillates is completely analogous to the way a mass on a spring moves back-and-forth, or the way a planet orbits a star. Gravitational waves, water waves and light waves all share remarkably similar features, despite arising from fundamentally different physical origins. And in the same vein, although most don’t realize it, the quantum theory of a single particle and how you’d approach a quantum theory of gravity are similarly analogous.

    A Feynman diagram representing electron-electron scattering, which requires summing over all the possible histories of the particle-particle interactions. Image credit: Dmitri Fedorov.

    The way quantum field theory works is that you take a particle and you perform a mathematical “sum over histories.” You can’t just calculate where the particle was and where it is and how it got to be there, since there’s an inherent, fundamental quantum uncertainty to nature. Instead, you add up all the possible ways it could have arrived at its present state, appropriately weighted probabilistically, and that’s how you calculate the state of a single particle. Because Einstein’s General Relativity isn’t concerned with particles but rather the curvature of spacetime, you don’t average over all possible histories of a particle, but rather over all possible spacetime geometries.
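    The "sum over histories" can be made concrete with a toy discrete example (my own illustration, not a full path integral): for a particle hopping one step left or right on a line, enumerate every possible N-step path, give each one equal weight, and add the contributions path by path. The resulting amplitude to arrive at a point is just the count of histories ending there.

```python
from itertools import product

def sum_over_paths(n_steps):
    """Add up all 2**n_steps possible hop histories, bucketed by where
    each history ends -- a discrete caricature of summing over paths.
    (Here every path gets equal weight; in real quantum mechanics each
    path would carry a complex phase exp(i*S/hbar) set by its action.)"""
    amplitude = {}
    for path in product((-1, +1), repeat=n_steps):
        x = sum(path)                           # endpoint of this history
        amplitude[x] = amplitude.get(x, 0) + 1  # equal weight per path
    return amplitude

amp = sum_over_paths(4)
# 2^4 = 16 histories in total; endpoints follow the binomial counts
print({x: amp[x] for x in sorted(amp)})  # {-4: 1, -2: 4, 0: 6, 2: 4, 4: 1}
```

Replacing the equal weights with complex phases turns this counting exercise into a genuine (if crude) path integral; replacing the paths with spacetime geometries is the analogous step toward quantum gravity described in the text.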

    Gravity, governed by Einstein, and everything else (strong, weak and electromagnetic interactions), governed by quantum physics, are the two independent rules known to govern everything in our Universe. Image credit: SLAC National Accelerator Laboratory.

    Working in three spatial dimensions is very difficult, but if you go down to one dimension, things become very simple. The only possible one-dimensional surfaces are an open string, where there are two separate, unattached ends, or a closed string, where the two ends are attached to form a loop. In addition, the spatial curvature — so complicated in three dimensions — becomes trivial. So what we’re left with, if we want to add in matter, is a set of scalar fields (just like certain types of particles) and the cosmological constant (which acts just like a mass term): a beautiful analogy.

    The extra degrees of freedom a particle gains from being in multiple dimensions don’t play much of a role; so long as you can define a momentum vector, that’s the main dimension that matters. In one dimension, therefore, quantum gravity looks just like a free quantum particle in any arbitrary number of dimensions. The next step is to incorporate interactions, and to go from a free particle with no scattering amplitudes or cross-sections to one that can play a physical role, coupled to the Universe.

    A graph with trivalent vertices is a key component of constructing the path integral relevant for 1-D quantum gravity. Image credit: Phys. Today 68, 11, 38 (2015).

    Graphs, like the one above, allow us to describe the physical concept of action in quantum gravity. If we write down all the possible combinations of such graphs and sum over them — applying the same laws like conservation of momentum that we always enforce — we can complete the analogy. Quantum gravity in one dimension is very much like a single particle interacting in any number of dimensions.

    The probability of finding a quantum particle at any particular location is never 100%; the probability is spread out over both space and time. Image credit: Wikimedia Commons user Maschen.

    The next step would be to move from one spatial dimension to 3+1 dimensions: where the Universe has three spatial dimensions and one time dimension. But doing it for gravity may be very challenging. Instead, there might be a better approach in working in the opposite direction. Instead of calculating how a single particle (a zero-dimensional entity) behaves in any number of dimensions, maybe we could calculate how a string, whether open or closed (a one-dimensional entity) behaves. And then, from that, we can look for analogies to a more complete theory of quantum gravity in a more realistic number of dimensions.

    Feynman diagrams (top) are based off of point particles and their interactions. Converting them into their string theory analogues (bottom) gives rise to surfaces which can have non-trivial curvature. Image credit: Phys. Today 68, 11, 38 (2015).

    Instead of points and interactions, we immediately start working with surfaces. And once you have a true, multi-dimensional surface, that surface can be curved in non-trivial ways. You start getting very interesting behavior out; behavior that just might be at the root of the spacetime curvature we experience in our Universe as General Relativity. While 1D quantum gravity gave us quantum field theory for particles in a possibly curved spacetime, it didn’t describe gravitation itself. The subtle piece of the puzzle that was missing? There was no correspondence between operators, or the functions that represent quantum mechanical forces and properties, and states, or how the particles and their properties evolve over time. But if we move from point-like particles to string-like entities, that correspondence shows up.

    Deforming the spacetime metric can be represented by the fluctuation (labelled ‘p’), and if you apply it to the string analogues, it describes a spacetime fluctuation and corresponds to a quantum state of the string. Image credit: Phys. Today 68, 11, 38 (2015).

    There’s a real operator-state correspondence, where a fluctuation in the spacetime metric (i.e., an operator) automatically represents a state in the quantum mechanical description of a string’s properties. So you can get a quantum theory of gravity in spacetime from string theory. But that’s not all you get: you also get quantum gravity unified with the other particles and forces in spacetime, the ones that correspond to the other operators in the field theory of the string. There’s also the operator that describes the spacetime geometry’s fluctuations, and the other quantum states of the string. The biggest news about string theory is that it can give you a working quantum theory of gravity.

    Brian Greene presenting on String Theory. Image credit: NASA/Goddard/Wade Sisler.

    That doesn’t mean it’s a foregone conclusion, however, that string theory is the path to quantum gravity. The great hope of string theory is that these analogies will hold up at all scales, and that there will be an unambiguous, one-to-one mapping of the string picture onto the Universe we observe around us. Right now, there are only a few sets of dimensions that the string/superstring picture is self-consistent in, and the most promising one doesn’t give us the four-dimensional gravity of Einstein, but rather a 10-dimensional Brans-Dicke theory of gravity. In order to recover the gravity of our Universe, you must “get rid of” six dimensions and take the Brans-Dicke coupling constant, ω, to infinity. How this happens remains an open challenge for string theory.

    A 2-D projection of a Calabi-Yau manifold, one popular method of compactifying the extra, unwanted dimensions of String Theory. Image credit: Wikimedia Commons user Lunch.

    But string theory offers a path to quantum gravity, and if we make judicious choices of the form "the math works out this way," we can get both General Relativity and the Standard Model out of it. It's the only idea, to date, that gives us this, and that's why it's so hotly pursued. No matter whether you tout string theory's successes or failures, or how you feel about its lack of verifiable predictions, it will no doubt remain one of the most active areas of theoretical physics research, and at the core of a great many physicists' dreams of an ultimate theory.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan.

  • richardmitnick 7:05 pm on November 30, 2016 Permalink | Reply
    Tags: Physics

    From Quanta: “The Case Against Dark Matter” 

    Quanta Magazine

    November 29, 2016
    Natalie Wolchover

    Erik Verlinde
    Ilvy Njiokiktjien for Quanta Magazine

    For 80 years, scientists have puzzled over the way galaxies and other cosmic structures appear to gravitate toward something they cannot see. This hypothetical “dark matter” seems to outweigh all visible matter by a startling ratio of five to one, suggesting that we barely know our own universe. Thousands of physicists are doggedly searching for these invisible particles.

    But the dark matter hypothesis assumes scientists know how matter in the sky ought to move in the first place. This month, a series of developments has revived a long-disfavored argument that dark matter doesn’t exist after all. In this view, no missing matter is needed to explain the errant motions of the heavenly bodies; rather, on cosmic scales, gravity itself works in a different way than either Isaac Newton or Albert Einstein predicted.

    The latest attempt to explain away dark matter is a much-discussed proposal by Erik Verlinde, a theoretical physicist at the University of Amsterdam who is known for bold and prescient, if sometimes imperfect, ideas. In a dense 51-page paper posted online on Nov. 7, Verlinde casts gravity as a byproduct of quantum interactions and suggests that the extra gravity attributed to dark matter is an effect of “dark energy” — the background energy woven into the space-time fabric of the universe.

    Instead of hordes of invisible particles, “dark matter is an interplay between ordinary matter and dark energy,” Verlinde said.

    To make his case, Verlinde has adopted a radical perspective on the origin of gravity that is currently in vogue among leading theoretical physicists. Einstein defined gravity as the effect of curves in space-time created by the presence of matter. According to the new approach, gravity is an emergent phenomenon. Space-time and the matter within it are treated as a hologram that arises from an underlying network of quantum bits (called “qubits”), much as the three-dimensional environment of a computer game is encoded in classical bits on a silicon chip. Working within this framework, Verlinde traces dark energy to a property of these underlying qubits that supposedly encode the universe. On large scales in the hologram, he argues, dark energy interacts with matter in just the right way to create the illusion of dark matter.

    In his calculations, Verlinde rediscovered the equations of “modified Newtonian dynamics,” or MOND. This 30-year-old theory makes an ad hoc tweak to the famous “inverse-square” law of gravity in Newton’s and Einstein’s theories in order to explain some of the phenomena attributed to dark matter. That this ugly fix works at all has long puzzled physicists. “I have a way of understanding the MOND success from a more fundamental perspective,” Verlinde said.

    Many experts have called Verlinde’s paper compelling but hard to follow. While it remains to be seen whether his arguments will hold up to scrutiny, the timing is fortuitous. In a new analysis of galaxies published on Nov. 9 in Physical Review Letters, three astrophysicists led by Stacy McGaugh of Case Western Reserve University in Cleveland, Ohio, have strengthened MOND’s case against dark matter.

    The researchers analyzed a diverse set of 153 galaxies, and for each one they compared the rotation speed of visible matter at any given distance from the galaxy’s center with the amount of visible matter contained within that galactic radius. Remarkably, these two variables were tightly linked in all the galaxies by a universal law, dubbed the “radial acceleration relation.” This makes perfect sense in the MOND paradigm, since visible matter is the exclusive source of the gravity driving the galaxy’s rotation (even if that gravity does not take the form prescribed by Newton or Einstein). With such a tight relationship between gravity felt by visible matter and gravity given by visible matter, there would seem to be no room, or need, for dark matter.
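The tightness of that link is easy to appreciate in formula form. Below is a sketch of the single-parameter fitting function McGaugh and colleagues report for the radial acceleration relation, quoted from memory, so treat the exact form and the scale g† ≈ 1.2e-10 m/s² as assumptions to check against the paper: the observed acceleration tracks the visible-matter acceleration at high accelerations and is boosted at low ones.

```python
import numpy as np

g_dag = 1.2e-10  # m/s^2; fitted acceleration scale (assumed value)

def g_obs(g_bar):
    """Radial acceleration relation: observed acceleration as a
    function of the acceleration implied by visible matter alone."""
    return g_bar / (1.0 - np.exp(-np.sqrt(g_bar / g_dag)))

g_bar = np.array([1e-12, 1e-10, 1e-8])   # m/s^2, low to high
boost = g_obs(g_bar) / g_bar             # "missing gravity" factor
print(boost)  # large boost at low accelerations, approaching 1 at high
```

The single scale g_dag doing all the work here is why MOND advocates read this relation as evidence against particle dark matter.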

    Even as dark matter proponents rise to its defense, a third challenge has materialized. In new research that has been presented at seminars and is under review by the Monthly Notices of the Royal Astronomical Society, a team of Dutch astronomers has conducted what they call the first test of Verlinde’s theory: In comparing his formulas to data from more than 30,000 galaxies, Margot Brouwer of Leiden University in the Netherlands and her colleagues found that Verlinde correctly predicts the gravitational distortion or “lensing” of light from the galaxies — another phenomenon that is normally attributed to dark matter. This is somewhat to be expected, as MOND’s original developer, the Israeli astrophysicist Mordehai Milgrom, showed years ago that MOND accounts for gravitational lensing data. Verlinde’s theory will need to succeed at reproducing dark matter phenomena in cases where the old MOND failed.

    Kathryn Zurek, a dark matter theorist at Lawrence Berkeley National Laboratory, said Verlinde’s proposal at least demonstrates how something like MOND might be right after all. “One of the challenges with modified gravity is that there was no sensible theory that gives rise to this behavior,” she said. “If [Verlinde’s] paper ends up giving that framework, then that by itself could be enough to breathe more life into looking at [MOND] more seriously.”

    The New MOND

    In Newton’s and Einstein’s theories, the gravitational attraction of a massive object drops in proportion to the square of the distance away from it. This means stars orbiting around a galaxy should feel less gravitational pull — and orbit more slowly — the farther they are from the galactic center. Stars’ velocities do drop as predicted by the inverse-square law in the inner galaxy, but instead of continuing to drop as they get farther away, their velocities level off beyond a certain point. The “flattening” of galaxy rotation speeds, discovered by the astronomer Vera Rubin in the 1970s, is widely considered to be Exhibit A in the case for dark matter — explained, in that paradigm, by dark matter clouds or “halos” that surround galaxies and give an extra gravitational acceleration to their outlying stars.
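A quick numerical illustration of the inverse-square expectation (the enclosed mass below is a made-up round number for a large spiral, roughly 10^11 solar masses, not a measurement): Newtonian circular speeds fall off as 1/sqrt(r), whereas Rubin's observed curves level off.

```python
import numpy as np

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 2e41             # kg; assumed enclosed visible mass (~1e11 suns)

r_kpc = np.array([5.0, 10.0, 20.0, 40.0])
r = r_kpc * 3.086e19                 # kiloparsecs -> metres

# Circular orbit: v = sqrt(G M / r), the inverse-square-law prediction.
v = np.sqrt(G * M / r) / 1e3         # km/s
print(v)  # falls steadily with radius; real galaxies flatten out instead
```

Doubling the radius should cut the speed by sqrt(2); the flat observed curves are the discrepancy that dark matter halos (or MOND) are invoked to explain.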

    Searches for dark matter particles have proliferated — with hypothetical “weakly interacting massive particles” (WIMPs) and lighter-weight “axions” serving as prime candidates — but so far, experiments have found nothing.

    Lucy Reading-Ikkanda for Quanta Magazine

    Meanwhile, in the 1970s and 1980s, some researchers, including Milgrom, took a different tack. Many early attempts at tweaking gravity were easy to rule out, but Milgrom found a winning formula: When the gravitational acceleration felt by a star drops below a certain level — precisely 0.00000000012 meters per second per second, or 100 billion times weaker than we feel on the surface of the Earth — he postulated that gravity somehow switches from an inverse-square law to something close to an inverse-distance law. “There’s this magic scale,” McGaugh said. “Above this scale, everything is normal and Newtonian. Below this scale is where things get strange. But the theory does not really specify how you get from one regime to the other.”
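Milgrom's prescription is easy to state in code. A sketch using the "simple" interpolating function mu(x) = x/(1+x), which is one common choice among several and not the only one, with the galactic mass again an assumed round number: solving g * mu(g/a0) = gN gives a closed form, and the resulting circular speed flattens toward (G*M*a0)^(1/4) at large radius.

```python
import numpy as np

a0 = 1.2e-10                # m/s^2, Milgrom's scale quoted in the article
G, M = 6.674e-11, 2e41      # SI units; M is an assumed galactic mass

def mond_accel(gN):
    # Closed-form solution of g * (g/a0) / (1 + g/a0) = gN:
    # reduces to gN when gN >> a0, and to sqrt(gN * a0) when gN << a0.
    return 0.5 * gN + np.sqrt(0.25 * gN**2 + gN * a0)

r = np.logspace(19.5, 22, 6)                       # ~1 to ~300 kpc, in metres
v = np.sqrt(mond_accel(G * M / r**2) * r) / 1e3    # circular speed, km/s

v_flat = (G * M * a0) ** 0.25 / 1e3                # asymptotic flat speed, km/s
print(v[-1], v_flat)
```

The deep-MOND limit v^4 = G*M*a0 is exactly the flat rotation curve and, as a bonus, the observed Tully-Fisher scaling between luminosity and rotation speed.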

    Physicists do not like magic; when other cosmological observations seemed far easier to explain with dark matter than with MOND, they left the approach for dead. Verlinde’s theory revitalizes MOND by attempting to reveal the method behind the magic.

    Verlinde, ruddy and fluffy-haired at 54 and lauded for highly technical string theory calculations, first jotted down a back-of-the-envelope version of his idea in 2010. It built on a famous paper he had written months earlier, in which he boldly declared that gravity does not really exist. By weaving together numerous concepts and conjectures at the vanguard of physics, he had concluded that gravity is an emergent thermodynamic effect, related to increasing entropy (or disorder). Then, as now, experts were uncertain what to make of the paper, though it inspired fruitful discussions.

    The particular brand of emergent gravity in Verlinde’s paper turned out not to be quite right, but he was tapping into the same intuition that led other theorists to develop the modern holographic description of emergent gravity and space-time — an approach that Verlinde has now absorbed into his new work.

    In this framework, bendy, curvy space-time and everything in it is a geometric representation of pure quantum information — that is, data stored in qubits. Unlike classical bits, qubits can exist simultaneously in two states (0 and 1) with varying degrees of probability, and they become “entangled” with each other, such that the state of one qubit determines the state of the other, and vice versa, no matter how far apart they are. Physicists have begun to work out the rules by which the entanglement structure of qubits mathematically translates into an associated space-time geometry. An array of qubits entangled with their nearest neighbors might encode flat space, for instance, while more complicated patterns of entanglement give rise to matter particles such as quarks and electrons, whose mass causes the space-time to be curved, producing gravity. “The best way we understand quantum gravity currently is this holographic approach,” said Mark Van Raamsdonk, a physicist at the University of British Columbia in Vancouver who has done influential work on the subject.
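The entanglement being described is easy to see in miniature. A toy example (mine, not Verlinde's construction): a two-qubit Bell pair, where tracing out one qubit leaves the other maximally mixed, carrying exactly one bit of entanglement entropy. It is this kind of entropy, tallied across the qubit network, that the holographic dictionary converts into geometry.

```python
import numpy as np

# Bell state (|00> + |11>) / sqrt(2): the qubits' states are perfectly
# correlated no matter how far apart they are.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)

# Density matrix, reshaped to (a, b, a', b') so we can trace out qubit B.
rho = np.outer(bell, bell.conj()).reshape(2, 2, 2, 2)
rho_a = np.trace(rho, axis1=1, axis2=3)   # reduced state of qubit A

# Von Neumann entropy in bits: 1.0 means maximal entanglement.
evals = np.linalg.eigvalsh(rho_a)
entropy = -np.sum(evals * np.log2(evals))
print(entropy)
```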

    The mathematical translations are rapidly being worked out for holographic universes with an Escher-esque space-time geometry known as anti-de Sitter (AdS) space, but universes like ours, which have de Sitter geometries, have proved far more difficult. In his new paper, Verlinde speculates that it’s exactly the de Sitter property of our native space-time that leads to the dark matter illusion.

    De Sitter space-times like ours stretch as you look far into the distance. For this to happen, space-time must be infused with a tiny amount of background energy — often called dark energy — which drives space-time apart from itself. Verlinde models dark energy as a thermal energy, as if our universe has been heated to an excited state. (AdS space, by contrast, is like a system in its ground state.) Verlinde associates this thermal energy with long-range entanglement between the underlying qubits, as if they have been shaken up, driving entangled pairs far apart. He argues that this long-range entanglement is disrupted by the presence of matter, which essentially removes dark energy from the region of space-time that it occupied. The dark energy then tries to move back into this space, exerting a kind of elastic response on the matter that is equivalent to a gravitational attraction.

    Because of the long-range nature of the entanglement, the elastic response becomes increasingly important in larger volumes of space-time. Verlinde calculates that it will cause galaxy rotation curves to start deviating from Newton’s inverse-square law at exactly the magic acceleration scale pinpointed by Milgrom in his original MOND theory.

    Van Raamsdonk calls Verlinde’s idea “definitely an important direction.” But he says it’s too soon to tell whether everything in the paper — which draws from quantum information theory, thermodynamics, condensed matter physics, holography and astrophysics — hangs together. Either way, Van Raamsdonk said, “I do find the premise interesting, and feel like the effort to understand whether something like that could be right could be enlightening.”

    One problem, said Brian Swingle of Harvard and Brandeis universities, who also works in holography, is that Verlinde lacks a concrete model universe like the ones researchers can construct in AdS space, giving him more wiggle room for making unproven speculations. “To be fair, we’ve gotten further by working in a more limited context, one which is less relevant for our own gravitational universe,” Swingle said, referring to work in AdS space. “We do need to address universes more like our own, so I hold out some hope that his new paper will provide some additional clues or ideas going forward.”

    Access mp4 video here .

    The Case for Dark Matter

    Verlinde could be capturing the zeitgeist the way his 2010 entropic-gravity paper did. Or he could be flat-out wrong. The question is whether his new and improved MOND can reproduce phenomena that foiled the old MOND and bolstered belief in dark matter.

    One such phenomenon is the Bullet cluster, a galaxy cluster in the process of colliding with another.

    X-ray photo by Chandra X-ray Observatory of the Bullet Cluster (1E0657-56). Exposure time was 0.5 million seconds (~140 hours) and the scale is shown in megaparsecs. Redshift (z) = 0.3, meaning its light has wavelengths stretched by a factor of 1.3. Based on today’s theories this shows the cluster to be about 4 billion light years away.
    In this photograph, a rapidly moving galaxy cluster with a shock wave trailing behind it appears to have struck another cluster at high speed. The gases collide, and the gravitational fields of the stars and galaxies interact. When the clusters collided, black-body temperature readings showed the gas reaching 160 million degrees, emitting X-rays of great intensity and claiming the title of hottest known galaxy cluster.
    Studies of the Bullet cluster, announced in August 2006, provide the best evidence to date for the existence of dark matter.
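The redshift arithmetic in the caption above checks out directly: a redshift z stretches every wavelength by a factor of 1 + z (the rest-frame wavelength below is a hypothetical example, not a measured line).

```python
z = 0.3                       # Bullet cluster redshift from the caption
rest_nm = 500.0               # hypothetical rest-frame wavelength, nm
observed_nm = rest_nm * (1 + z)
print(observed_nm / rest_nm)  # the factor-of-1.3 stretch quoted above
```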

    Superimposed mass density contours, caused by gravitational lensing of dark matter. Photograph taken with Hubble Space Telescope.
    Date 22 August 2006

    The visible matter in the two clusters crashes together, but gravitational lensing suggests that a large amount of dark matter, which does not interact with visible matter, has passed right through the crash site. Some physicists consider this indisputable proof of dark matter. However, Verlinde thinks his theory will be able to handle the Bullet cluster observations just fine. He says dark energy’s gravitational effect is embedded in space-time and is less deformable than matter itself, which would have allowed the two to separate during the cluster collision.

    But the crowning achievement for Verlinde’s theory would be to account for the suspected imprints of dark matter in the cosmic microwave background (CMB), ancient light that offers a snapshot of the infant universe.

    CMB per ESA/Planck

    The snapshot reveals the way matter at the time repeatedly contracted due to its gravitational attraction and then expanded due to self-collisions, producing a series of peaks and troughs in the CMB data. Because dark matter does not interact, it would only have contracted without ever expanding, and this would modulate the amplitudes of the CMB peaks in exactly the way that scientists observe. One of the biggest strikes against the old MOND was its failure to predict this modulation and match the peaks’ amplitudes. Verlinde expects that his version will work — once again, because matter and the gravitational effect of dark energy can separate from each other and exhibit different behaviors. “Having said this,” he said, “I have not calculated this all through.”

    While Verlinde confronts these and a handful of other challenges, proponents of the dark matter hypothesis have some explaining of their own to do when it comes to McGaugh and his colleagues’ recent findings about the universal relationship between galaxy rotation speeds and their visible matter content.

    In October, responding to a preprint of the paper by McGaugh and his colleagues, two teams of astrophysicists independently argued that the dark matter hypothesis can account for the observations. They say the amount of dark matter in a galaxy’s halo would have precisely determined the amount of visible matter the galaxy ended up with when it formed. In that case, galaxies’ rotation speeds, even though they’re set by dark matter and visible matter combined, will exactly correlate with either their dark matter content or their visible matter content (since the two are not independent). However, computer simulations of galaxy formation do not currently indicate that galaxies’ dark and visible matter contents will always track each other. Experts are busy tweaking the simulations, but Arthur Kosowsky of the University of Pittsburgh, one of the researchers working on them, says it’s too early to tell if the simulations will be able to match all 153 examples of the universal law in McGaugh and his colleagues’ galaxy data set. If not, then the standard dark matter paradigm is in big trouble. “Obviously this is something that the community needs to look at more carefully,” Zurek said.

    Even if the simulations can be made to match the data, McGaugh, for one, considers it an implausible coincidence that dark matter and visible matter would conspire to exactly mimic the predictions of MOND at every location in every galaxy. “If somebody were to come to you and say, ‘The solar system doesn’t work on an inverse-square law, really it’s an inverse-cube law, but there’s dark matter that’s arranged just so that it always looks inverse-square,’ you would say that person is insane,” he said. “But that’s basically what we’re asking to be the case with dark matter here.”

    Given the considerable indirect evidence and near consensus among physicists that dark matter exists, it still probably does, Zurek said. “That said, you should always check that you’re not on a bandwagon,” she added. “Even though this paradigm explains everything, you should always check that there isn’t something else going on.”

    See the full article here .


    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

  • richardmitnick 11:12 am on November 30, 2016 Permalink | Reply
    Tags: Physics

    From Physics: “Focus: More Hints of Exotic Cosmic-Ray Origin” 


    November 28, 2016
    Michael Schirber

    The Alpha Magnetic Spectrometer (AMS) aboard the International Space Station

    New Space Station data support a straightforward model of cosmic-ray propagation through the Galaxy but also add to previous signs of undiscovered cosmic-ray sources such as dark matter.

    Observing the constant rain of cosmic rays hitting Earth can provide information on the “magnetic weather” in other parts of the Galaxy. A new high-precision measurement of two cosmic-ray elements, boron and carbon, supports a specific model of the magnetic turbulence that deflects cosmic rays on their journey through the Galaxy. The data, which come from the Alpha Magnetic Spectrometer (AMS) aboard the International Space Station, appear to rule out alternative models for cosmic-ray propagation. The failure of these models—which were devised to explain recent observations of cosmic-ray antimatter—implies a possible exotic origin for some cosmic rays.

    The majority of cosmic rays are particles or nuclei produced in supernovae or other astrophysical sources. However, as these so-called primary cosmic rays travel through the Galaxy to Earth, they collide with gas atoms in the interstellar medium. The collisions produce a secondary class of cosmic rays with masses and energies that differ from primary cosmic rays. To investigate the relationship of the two classes, astrophysicists often look at the ratio of the number of detections of two nuclei, such as boron and carbon. For the most part, carbon cosmic rays have a primary origin, whereas boron is almost exclusively created in secondary processes. A relatively high boron-to-carbon (B/C) ratio in a certain energy range implies that the relevant cosmic rays are traversing a lot of gas before reaching us. “The B/C ratio tells you how cosmic rays propagate through space,” says AMS principal investigator Samuel Ting of MIT.

    Previous measurements of the B/C ratio have had large errors of 15% or more, especially at high energy, mainly because of the brief data collection time available for balloon-based detectors. But the AMS has been operating on the Space Station for five years, and over this time it has collected more than 80 billion cosmic rays. The AMS detectors measure the charges of these cosmic rays, allowing the elements to be identified. The collaboration has detected over ten million carbon and boron nuclei, with energies per nucleon ranging from a few hundred MeV up to a few TeV.

    The B/C ratio decreases with energy because higher-energy cosmic rays tend to take a more direct path to us (and therefore experience fewer collisions producing boron). By contrast, lower-energy cosmic rays are diverted more strongly by magnetic fields, so they bounce around like pinballs among magnetic turbulence regions in the Galaxy. Several theories have been proposed to describe the size and spacing of these turbulent regions, and these theories lead to predictions for the energy dependence of the B/C ratio. However, previous B/C observations have not been precise enough to favor one theory over another. The AMS data show very clearly that the B/C ratio is proportional to the energy raised to the -1/3 power. This result matches a prediction based on a theory of magnetohydrodynamics developed in 1941 by the Russian mathematician Andrey Kolmogorov [1].
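The quoted scaling is simple enough to verify on synthetic numbers (the values below are made up for illustration, not AMS data): generate a B/C ratio that falls as E^(-1/3) and recover the index with a straight-line fit on log-log axes, which is essentially what one reads off the published spectrum.

```python
import numpy as np

# Synthetic boron-to-carbon ratio following the Kolmogorov prediction.
E = np.logspace(0, 3, 20)      # energy per nucleon, arbitrary GeV range
bc = 0.3 * E ** (-1.0 / 3.0)   # normalisation 0.3 is invented

# A power law is a straight line in log-log space; fit its slope.
slope, intercept = np.polyfit(np.log(E), np.log(bc), 1)
print(slope)   # recovers the -1/3 power-law index
```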

    These results conflict with models that predict that the B/C ratio should exhibit some more complex energy dependence, such as kinks in the B/C spectrum at specific energies [2]. Theorists proposed these models to explain anomalous observations—by AMS and other experiments—that showed an increase in the number of positrons (anti-electrons) reaching Earth relative to electrons at high energy (see 3 April 2013 Viewpoint). The idea was that these “excess” positrons are—like boron—produced in collisions between cosmic rays and interstellar gas. But such a scenario would require that cosmic rays encounter additional scattering sites, not just magnetically turbulent regions. By ruling out these models, the AMS results support the alternative explanation—a new primary cosmic ray source that emits positrons. Candidates include pulsars and dark matter, but a lot of mystery still surrounds the unexplained positron data.

    Igor Moskalenko from Stanford University is very surprised at the close match between the data and the Kolmogorov model. He expected that the ratio would deviate from a single power law in a way that might provide clues to the origin of the excess positrons. “This is a dramatic result that should lead to much better understanding of interstellar magnetohydrodynamic turbulence and propagation of cosmic rays,” he says. “On the other hand, it is very much unexpected in that it makes recent discoveries in astrophysics of cosmic rays even more puzzling.”

    This research is published in Physical Review Letters.


    1. A. N. Kolmogorov, “The Local Structure of Turbulence in Incompressible Viscous Fluid for Very Large Reynolds Numbers,” Dokl. Akad. Nauk SSSR 30, 301 (1941).
    2. A. E. Vladimirov, G. Jóhannesson, I. V. Moskalenko, and T. A. Porter, “Testing the Origin of High-Energy Cosmic Rays,” Astrophys. J. 752, 68 (2012).

    See the full article here .


    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments (physics@aps.org).

  • richardmitnick 10:56 am on November 30, 2016 Permalink | Reply
    Tags: Physics

    From Vox: The Map of Physics Video 

    Access the mp4 video here .
    Watch, enjoy, learn.

    See the full article here .


  • richardmitnick 2:58 pm on November 29, 2016 Permalink | Reply
    Tags: Physics, SNO+

    From UC Davis: “SNO+ Neutrino Detector Gets Ready For Run” 

    UC Davis bloc

    UC Davis

    November 29th, 2016
    Andy Fell

    SNO+ neutrino detector being filled with ultrapure water. The detector will search for neutrinos from distant supernovae and nuclear reactors. Credit: SNO+ Collaboration

    Not a still from a science fiction movie, but the SNO+ neutrino detector being filled with very pure water prior to starting operations. Located over a mile underground in a mine in Ontario, Canada, the SNO+ detector consists of an acrylic sphere 12 meters in diameter filled with 800 tonnes of scintillation fluid, floating in a bath of ultrapure water surrounded by 10,000 photomultiplier tubes that will detect flashes of light from passing neutrinos.

    The color in the photograph isn’t computer enhanced, said Professor Robert Svoboda, chair of the Department of Physics, who is a member of the SNO+ team.

    “The blue color comes from the fact the water is very pure – more than 10,000 times purer than Lake Tahoe,” Svoboda said.

    Svoboda and other UC Davis physicists, including graduate students Morgan Askins and Teal Pershing and postdocs Vincent Fischer and Leon Pickard, helped build SNO+ and will be working on analyzing data from the experiment. It will measure neutrinos from the Sun, from distant supernovae, and from nuclear reactors in the U.S. and Canada. It will search for a rare form of radioactivity, predicted but not yet observed, which would show whether neutrinos are different from other fundamental particles.

    SNO+ will be a new kilo-tonne-scale liquid scintillator detector that will study neutrinos. The experiment will be located approximately 2 km underground in VALE’s Creighton mine near Sudbury, Ontario, Canada. The heart of the SNO+ detector will be a 12 m diameter acrylic sphere filled with approximately 800 tonnes of liquid scintillator, which will float in a water bath. This volume will be monitored by about 10,000 photomultiplier tubes (PMTs), which are very sensitive light detectors. The acrylic sphere, PMTs and PMT support structure will be re-used from the SNO experiment. Therefore, SNO+ will look almost exactly the same as SNO (shown above) except that, in addition to the ropes that currently hold the acrylic vessel up, ropes will be added to hold the vessel down once it is filled with (buoyant) scintillator.

    Liquid scintillator is an organic liquid that gives off light when charged particles pass through it. SNO+ will detect neutrinos when they interact with electrons and nuclei in the detector to produce charged particles which, in turn, create light as they pass through the scintillator. The flash of light is then detected by the PMT array. This process is very similar to the way in which SNO detected neutrinos except that, in the SNO experiment, the light was produced through the Cherenkov process rather than by scintillation. It is this similarity in detection schemes that allows the SNO detector to be so efficiently converted for use as a liquid scintillator detector.
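    One practical difference between the two detection schemes: Cherenkov light is only emitted when a charged particle moves faster than light does in the medium (beta > 1/n), whereas scintillation light has no such velocity threshold. A short sketch of that Cherenkov threshold for electrons in water follows; the refractive index and electron rest energy used here are standard textbook values, not figures from the article.

```python
import math

# Cherenkov emission requires beta > 1/n; scintillation does not,
# which is part of why a scintillator detector can see lower-energy events.
N_WATER = 1.33               # refractive index of water (assumed)
ELECTRON_MASS_MEV = 0.511    # electron rest energy in MeV

beta_threshold = 1.0 / N_WATER
gamma_threshold = 1.0 / math.sqrt(1.0 - beta_threshold**2)
kinetic_threshold_mev = (gamma_threshold - 1.0) * ELECTRON_MASS_MEV

print(f"Cherenkov threshold for electrons in water: ~{kinetic_threshold_mev:.2f} MeV")
```

    Electrons below roughly a quarter of an MeV emit no Cherenkov light at all, but would still produce scintillation light in SNO+.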

    The scintillator in the SNO+ experiment will be primarily composed of linear alkyl benzene (LAB), which is a new scintillator for this type of experiment. LAB was chosen because it has good light output, is quite transparent, and is a “nice” chemical to work with (it has properties much like those of mineral oil). It also seems to be compatible with acrylic (which is obviously important for SNO+). LAB is used commercially in the manufacture of dish soap, among other things, which means that it is available in the large quantities needed for SNO+ at a relatively low price. As an added bonus, there is a plant in Quebec that produces very good LAB, meaning that SNO+ can have a “local supplier” of high-quality scintillator.



    The SNO+ collaboration:
    University of Alberta:
    A. Bialek, A. Hallin, M. Hedayatipoor, C. Krauss, P. Mekarski, L. Sibley, K. Singh, J. Soukup

    Armstrong Atlantic State University:
    J. Secrest

    University of California, Berkeley / LBNL:
    F. Descamps, C. Dock, G. Orebi Gann, K. Haghighi, K. Kamdin, T. Kaptanohku, P. McKenna, K. Nuckolls, C. Weisser

    Black Hills State University:
    K. Keeter

    Brookhaven National Laboratory:
    W. Beriguete, S. Hans, L. Hu, R. Rosero, M. Yeh, Y. Williamson

    University of California, Davis:
    M. Askins, C. Grant, R. Svoboda

    University of Chicago:
    E. Blucher, J. Cushing, K. Labe, T. LaTorre, M. Strait

    Dresden University of Technology:
    V. Lozza, B. von Krosigk, F. Krüger, M. Reinhard, A. Soerensen, K. Zuber

    Laurentian University:
    D. Chauhan, E.D. Hallman, C. Kraus, T. Shantz, M. Schwendener, C. Virtue

    LIP Lisbon and Coimbra:
    R. Alves, S. Andringa, L. Gurriana, A. Maio, J. Maneira, G. Prior

    University of Lancaster:
    H. Okeeffe, L. Kormos

    University of Liverpool:
    N. McCauley, H. J. Rose, R. Stainforth, J. Walker

    University of North Carolina at Chapel Hill:
    M. Howe, J. Wikerson

    Oxford University:
    S. Biller, I. Coulter, N. Jelley, C. Jones, J. Ligard, K. Majumdar, A. Reichold, L. Segui

    University of Pennsylvania:
    E. Beier, R. Bonventre, W.J. Heintzelman, J. Klein, P. Keener, R. Knapik, A. Mastbaum, T. Shokair, R. Van Berg

    Queen Mary, University of London:
    E. Arushanova, A. Back, S. Langrock, F. Di Lodovico, P. Jones, J. Wilson

    Queen’s University:

    S. Asahi, M. Boulay, M. Chen, N. Fatemi-Ghomi, P.J. Harvey, C. Hearns, A. McDonald, A. Noble, M. Seddighim, T. Sonley, E. O’Sullivan, P. Skensved, I. Takashi

    C. Beaudoin, G. Bellehumeur, O. Chkvorets, B. Cleveland, F. Duncan, R. Ford, N. Gagnon, C. Jillings, S. Korte, I. Lawson, T. O’Malley, M. Schumaker, E. Vazquez-Jauregui

    University of Sheffield:
    J. McMillan

    University of Sussex:
    E. Falk, J. Hartnell, M. Mottram, S. Peeters, J. Sinclair, J. Waterfield, R. White

    R. Helmer

    University of Washington:
    L. Kippenbrock, T. Major, J. Tatar, N. Tolich

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UC Davis Campus

    The University of California, Davis, is a major public research university located in Davis, California, just west of Sacramento. It encompasses 5,300 acres of land, making it the second largest UC campus in terms of land ownership, after UC Merced.

  • richardmitnick 2:20 pm on November 29, 2016 Permalink | Reply
    Tags: Physics   

    From CERN: “A new ring to slow down antimatter” 



    28 Nov 2016
    Corinne Pralavorio
    Posted by Harriet Kim Jarlett

    The new deceleration ring ELENA will slow down antimatter particles further than ever to improve the efficiency of experiments studying antimatter. (Image: Maximilien Brice/CERN)

    You could mistake ELENA for a miniature accelerator. But, unlike most accelerators, it’s housed in a hangar, and you can take it all in at a single glance. The biggest difference, though, is that it doesn’t accelerate particles but decelerates them.

    CERN’s brand-new machine measures just 30 metres in circumference and has just begun its first tests with beam.

    The ELENA (Extra Low ENergy Antiproton) deceleration ring will be connected to the Antiproton Decelerator (AD), which has been in service since 2000. The AD is a unique facility that enables the study of antimatter.

    Antimatter can be thought of as a mirror image of matter, and it remains a mystery for physicists. For example, matter and antimatter should have been created in equal quantities at the time of the Big Bang, the event at the origin of our Universe. But antimatter seems to have disappeared from the Universe. Where it went is one of the many questions physicists are trying to answer with the AD machine.

    The 182-metre-circumference ring decelerates antiprotons (the anti-particles of protons) to 5.3 MeV, the lowest energy possible in a machine of this size. The antiprotons are then sent to experiments where they are studied or used to produce atoms of antimatter. The slower the antiprotons (i.e. the less energy they have), the easier it is for the experiments to study or manipulate them.

    And this is where ELENA comes in. Coupled with the AD, this small ring will slow the antiprotons down even further, reducing their energy by a factor of 50, from 5.3 MeV to just 0.1 MeV. In addition, the density of the beams will be improved. The experiments will be able to trap 10 to 100 times more antiprotons, improving efficiency and paving the way for new studies.
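    To put those energies in terms of speed, the antiproton’s velocity follows from relativistic kinematics. A minimal sketch, assuming the standard antiproton rest energy of about 938.3 MeV (the same as the proton’s; this value is not stated in the article):

```python
import math

# How fast is an antiproton at the AD's 5.3 MeV versus ELENA's 0.1 MeV?
PBAR_REST_MEV = 938.272  # antiproton rest energy (assumed standard value)

def beta(kinetic_mev: float) -> float:
    """Speed as a fraction of c for a given kinetic energy in MeV."""
    gamma = 1.0 + kinetic_mev / PBAR_REST_MEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

for t in (5.3, 0.1):
    print(f"T = {t} MeV -> v = {beta(t):.3f} c")
```

    Cutting the kinetic energy by a factor of ~50 reduces the speed by a factor of ~7, since at these non-relativistic speeds kinetic energy scales roughly with the square of velocity.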

    Decelerating beams is just as complicated as accelerating them. The slower the particles, the harder it is to control their trajectories. At low energy, beams are more sensitive to outside interference, such as the Earth’s magnetic field. ELENA is therefore equipped with magnets that are optimised to operate with very weak fields. An electron cooling system concentrates and decelerates the beams.

    Now that the components of the new decelerator have been installed, the teams have begun the first tests with beam.

    “After five years of development and construction, this is a very important stage. We are going to continue the tests over the coming weeks to see if everything is working as planned,” explains Christian Carli, ELENA project leader. “GBAR, the first experiment to be connected to ELENA, should receive its first antiprotons in 2017.”

    The other experiments will be connected during the second long shutdown of CERN’s accelerators in 2019-2020. ELENA will supply antiprotons to four experiments in parallel.

    Several experiments are studying antimatter and its properties: ALPHA, ASACUSA, ATRAP and BASE. GBAR and AEGIS are working more specifically on the effect of gravity on antimatter.

    You can read more about ELENA in the CERN Courier.

    See the full article here.


