Tagged: QCD: Quantum Chromodynamics

  • richardmitnick 10:54 am on July 27, 2019 Permalink | Reply
    Tags: "Ask Ethan: Can We Really Get A Universe From Nothing?", , , , , Because dark energy is a property of space itself when the Universe expands the dark energy density must remain constant., , , , , Galaxies that are gravitationally bound will merge together into groups and clusters while the unbound groups and clusters will accelerate away from one another., , Heisenberg uncertainty principle, Negative gravity?, QCD: Quantum Chromodynamics,   

    From Ethan Siegel: “Ask Ethan: Can We Really Get A Universe From Nothing?” 

    From Ethan Siegel
    July 27, 2019

    Our entire cosmic history is theoretically well-understood in terms of the frameworks and rules that govern it. It’s only by observationally confirming and revealing various stages in our Universe’s past that must have occurred, like when the first stars and galaxies formed, and how the Universe expanded over time, that we can truly come to understand what makes up our Universe and how it expands and gravitates in a quantitative fashion. The relic signatures imprinted on our Universe from an inflationary state before the hot Big Bang give us a unique way to test our cosmic history, subject to the same fundamental limitations that all frameworks possess. (NICOLE RAGER FULLER / NATIONAL SCIENCE FOUNDATION)

    And does it require the idea of ‘negative gravity’ in order to work?

    The biggest question that we’re even capable of asking, with our present knowledge and understanding of the Universe, is where did everything we can observe come from? If it came from some sort of pre-existing state, we’ll want to know exactly what that state was like and how our Universe came from it. If it emerged out of nothingness, we’d want to know how we went from nothing to the entire Universe, and what if anything caused it. At least, that’s what our Patreon supporter Charles Buchanan wants to know, asking:

    “One concept bothers me. Perhaps you can help. I see it used in many places, but never really explained. “A universe from Nothing” and the concept of negative gravity. As I learned my Newtonian physics, you could put the zero point of the gravitational potential anywhere, only differences mattered. However, Newtonian physics never deals with situations where matter is created… Can you help solidify this for me, preferably on [a] conceptual level, maybe with a little calculation detail?”

    Gravitation might seem like a straightforward force, but an incredible number of aspects are anything but intuitive. Let’s take a deeper look.

    Countless scientific tests of Einstein’s general theory of relativity have been performed, subjecting the idea to some of the most stringent constraints ever obtained by humanity. Einstein’s first solution was for the weak-field limit around a single mass, like the Sun; he applied these results to our Solar System with dramatic success. We can view this orbit as Earth (or any planet) being in free-fall around the Sun, traveling in a straight-line path in its own frame of reference. All masses and all sources of energy contribute to the curvature of spacetime. (LIGO SCIENTIFIC COLLABORATION / T. PYLE / CALTECH / MIT)

    Caltech/MIT Advanced LIGO

    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced LIGO detector installation, Hanford, WA, USA

    Caltech/MIT Advanced LIGO detector installation, Livingston, LA, USA

    LSC LIGO Scientific Collaboration

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger

    Gravity is talking. Lisa will listen. Dialogos of Eide

    ESA/eLISA the future of gravitational wave research

    Localizations of gravitational-wave signals detected by LIGO (GW150914, LVT151012, GW151226, GW170104) and, more recently, by the LIGO-Virgo network (GW170814, GW170817) after Virgo came online in August 2017.

    Sky map showing how adding Virgo to LIGO helps reduce the size of the likely source region in the sky. (Credit: Giuseppe Greco, Virgo Urbino group)

    If you have two point masses located some distance apart in your Universe, they’ll experience an attractive force that compels them to gravitate towards one another. But this attractive force that you perceive, in the context of relativity, comes with two caveats.

    The first caveat is simple and straightforward: these two masses will experience an acceleration towards one another, but whether they wind up moving closer to one another or not is entirely dependent on how the space between them evolves. Unlike in Newtonian gravity, where space is a fixed quantity and only the masses within that space can evolve, everything is changeable in General Relativity. Not only do matter and energy move and accelerate due to gravitation, but the very fabric of space itself can expand, contract, or otherwise flow. All masses still move through space, but space itself is no longer stationary.

    The ‘raisin bread’ model of the expanding Universe, where relative distances increase as the space (dough) expands. The farther away any two raisins are from one another, the greater the observed redshift will be by the time the light is received. The redshift-distance relation predicted by the expanding Universe is borne out in observations, and has been consistent with what’s been known going all the way back to the 1920s. (NASA / WMAP SCIENCE TEAM)

    NASA/WMAP 2001 to 2010

    The second caveat is that the two masses you’re considering, even if you’re extremely careful about accounting for what’s in your Universe, are most likely not the only forms of energy around. There are bound to be other masses in the form of normal matter, dark matter, and neutrinos. There’s the presence of radiation, from both electromagnetic and gravitational waves. There’s even dark energy: a type of energy inherent to the fabric of space itself.

    Now, here’s a scenario that might exemplify where your intuition leads you astray: what happens if these masses, for the volume they occupy, have less total energy than the average energy density of the surrounding space?

    The gravitational attraction (blue) of overdense regions and the relative repulsion (red) of the underdense regions, as they act on the Milky Way. Even though gravity is always attractive, there is an average amount of attraction throughout the Universe, and regions with lower energy densities than that will experience (and cause) an effective repulsion with respect to the average. (YEHUDA HOFFMAN, DANIEL POMARÈDE, R. BRENT TULLY, AND HÉLÈNE COURTOIS, NATURE ASTRONOMY 1, 0036 (2017))

    You can imagine three different scenarios:

    1. The first mass has a below-average energy density while the second has an above-average value.
    2. The first mass has an above-average energy density while the second has a below-average value.
    3. Both the first and second masses have a below-average energy density compared to the rest of space.

    In the first two scenarios, the above-average mass will begin growing as it pulls on the matter/energy all around it, while the below-average mass will start shrinking, as it’s less able to hold onto its own mass in the face of its surroundings. These two masses will effectively repel one another; even though gravitation is always attractive, the intervening matter is preferentially attracted to the heavier-than-average mass. This causes the lower-mass object to act like it’s both repelling and being repelled by the heavier-mass object, the same way a balloon held underwater will still be attracted to Earth’s center, but will be forced away from it owing to the (buoyant) effects of the water.

    The Earth’s crust is thinnest over the ocean and thickest over mountains and plateaus, as the principle of buoyancy dictates and as gravitational experiments confirm. Just as a balloon submerged in water will accelerate away from the center of the Earth, a region with below-average energy density will accelerate away from an overdense region, as average-density regions will be more preferentially attracted to the overdense region than the underdense region will. (USGS)

    So what’s going to happen if you have two regions of space with below-average densities, surrounded by regions of just average density? They’ll both shrink, giving up their remaining matter to the denser regions around them. But as far as motions go, they’ll accelerate towards one another, with exactly the same magnitude they’d accelerate at if they were both overdense regions that exceeded the average density by equivalent amounts.
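    The effective repulsion described above can be made quantitative with a Newtonian sketch: the acceleration at radius r from the center of a uniform spherical density perturbation is g = (4π/3) G Δρ r, so an underdensity produces an outward pull of exactly the same magnitude as an equal overdensity's inward pull. The numerical values below (mean density, distance, contrast) are illustrative assumptions, not figures from the article.

```python
import math

# Acceleration induced by a spherical region whose density differs from the
# cosmic mean by a fractional contrast delta: g = (4*pi/3) * G * (delta*rho) * r.
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
rho_mean = 8.6e-27       # rough mean cosmic density, kg/m^3 (assumed value)
r = 3.086e23             # 10 Mpc in meters (assumed distance)

def peculiar_acceleration(delta, rho=rho_mean, r=r):
    """Acceleration induced by a fractional density contrast delta."""
    return (4.0 * math.pi / 3.0) * G * (delta * rho) * r

g_over = peculiar_acceleration(+0.5)   # 50% overdense region: inward pull
g_under = peculiar_acceleration(-0.5)  # 50% underdense region: outward push

# Equal magnitude, opposite sign -- an underdense "void" repels exactly as
# strongly as an equivalent overdensity attracts.
print(g_over, g_under)   # g_under == -g_over
```

This is why two underdense regions accelerate toward one another with the same magnitude as two equivalent overdense ones: the sign of the contrast flips twice.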

    You might be wondering why it’s important to think about these concerns when talking about a Universe from nothing. After all, if your Universe is full of matter and energy, it’s pretty hard to understand how that’s relevant to making sense of the concept of something coming from nothing. But just as our intuition can lead us astray when thinking about matter and energy on the spacetime playing field of General Relativity, it’s a comparable situation when we think about nothingness.

    A representation of flat, empty space with no matter, energy or curvature of any type. With the exception of small quantum fluctuations, space in an inflationary Universe becomes incredibly flat like this, except in a 3D grid rather than a 2D sheet. Space is stretched flat, and particles are rapidly driven away. (AMBER STUVER / LIVING LIGO)

    You very likely think about nothingness as a philosopher would: the complete absence of everything. Zero matter, zero energy, an absolutely zero value for all the quantum fields in the Universe, etc. You think of space that’s completely flat, with nothing around to cause its curvature anywhere.

    If you think this way, you’re not alone: there are many different ways to conceive of “nothing.” You might even be tempted to take away space, time, and the laws of physics themselves, too. The problem, if you start doing that, is that you lose your ability to predict anything at all. The type of nothingness you’re thinking about, in this context, is what we call unphysical.

    If you want to think about nothing in a physical sense, you have to keep certain things. You need spacetime and the laws of physics, for example; you cannot have a Universe without them.

    A visualization of QCD illustrates how particle/antiparticle pairs pop out of the quantum vacuum for very small amounts of time as a consequence of Heisenberg uncertainty.

    The quantum vacuum is interesting because it demands that empty space itself isn’t so empty, but is filled with all the particles, antiparticles and fields in various states that are demanded by the quantum field theory that describes our Universe. Put this all together, and you find that empty space has a zero-point energy that’s actually greater than zero. (DEREK B. LEINWEBER)

    But here’s the kicker: if you have spacetime and the laws of physics, then by definition you have quantum fields permeating the Universe everywhere you go. You have a fundamental “jitter” to the energy inherent to space, due to the quantum nature of the Universe. (And the Heisenberg uncertainty principle, which is unavoidable.)

    Put these ingredients together — because you can’t have a physically sensible “nothing” without them — and you’ll find that space itself doesn’t have zero energy inherent to it, but energy with a finite, non-zero value. Just as there’s a finite zero-point energy (that’s greater than zero) for an electron bound to an atom, the same is true for space itself. Empty space, even with zero curvature, even devoid of particles and external fields, still has a finite energy density to it.
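    The energy-time uncertainty relation, ΔE·Δt ≥ ħ/2, puts a number on how long such a quantum "jitter" can persist. As a back-of-the-envelope illustration (not a rigorous QFT calculation), here is the maximum lifetime of a fluctuation carrying the rest energy of an electron-positron pair:

```python
# Energy-time uncertainty: dE * dt >= hbar / 2, so a fluctuation of energy
# dE can persist for at most roughly hbar / (2 * dE).
hbar = 1.054571817e-34   # reduced Planck constant, J*s
eV = 1.602176634e-19     # joules per electronvolt

pair_energy = 2 * 0.511e6 * eV   # rest energy of an e+e- pair, ~1.022 MeV

max_lifetime = hbar / (2 * pair_energy)   # seconds
print(f"{max_lifetime:.2e} s")            # ~3e-22 seconds
```

A fluctuation lasting ~10⁻²² seconds is far too brief to observe directly, but its effects show up statistically as the non-zero energy of "empty" space.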

    The four possible fates of the Universe with only matter, radiation, curvature and a cosmological constant allowed. The top three possibilities are for a Universe whose fate is determined by the balance of matter/radiation with spatial curvature alone; the bottom one includes dark energy. Only the bottom “fate” aligns with the evidence. (E. SIEGEL / BEYOND THE GALAXY)

    From the perspective of quantum field theory, this is conceptualized as the zero-point energy of the quantum vacuum: the lowest-energy state of empty space. In the framework of General Relativity, however, it appears in a different sense: as the value of a cosmological constant, which itself is the energy of empty space, independent of curvature or any other form of energy density.

    Although we do not know how to calculate the value of this energy density from first principles, we can calculate the effects it has on the expanding Universe. As your Universe expands, every form of energy that exists within it contributes to not only how your Universe expands, but how that expansion rate changes over time. From multiple independent lines of evidence — including the Universe’s large-scale structure, the cosmic microwave background, and distant supernovae — we have been able to determine how much energy is inherent to space itself.

    Constraints on dark energy from three independent sources: supernovae, the CMB (cosmic microwave background) and BAO (which is a wiggly feature seen in the correlations of large-scale structure). Note that even without supernovae, we’d need dark energy for certain, and also that there are uncertainties and degeneracies between the amount of dark matter and dark energy that we’d need to accurately describe our Universe. (SUPERNOVA COSMOLOGY PROJECT, AMANULLAH, ET AL., AP.J. (2010))

    This form of energy is what we presently call dark energy, and it’s responsible for the observed accelerated expansion of the Universe. Although it’s been a part of our conceptions of reality for more than two decades now, we don’t fully understand its true nature. All we can say is that when we measure the expansion rate of the Universe, our observations are consistent with dark energy being a cosmological constant with a specific magnitude, and not with any of the alternatives that evolve significantly over cosmic time.
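    The measured expansion rate pins down the total energy density through the Friedmann equation, ρ_crit = 3H₀²/(8πG), and dark energy is the fraction Ω_Λ of that total. The values of H₀ and Ω_Λ below are typical published figures, assumed here for illustration:

```python
import math

G = 6.674e-11                       # gravitational constant, m^3 kg^-1 s^-2
Mpc = 3.086e22                      # one megaparsec in meters
H0 = 67.7 * 1000 / Mpc              # Hubble constant (67.7 km/s/Mpc) in s^-1
Omega_Lambda = 0.69                 # dark-energy fraction (assumed value)

rho_crit = 3 * H0**2 / (8 * math.pi * G)   # critical density, kg/m^3
rho_lambda = Omega_Lambda * rho_crit       # dark-energy density, kg/m^3

print(f"critical density ~ {rho_crit:.1e} kg/m^3")
print(f"dark energy      ~ {rho_lambda:.1e} kg/m^3")
# ~6e-27 kg/m^3: a few proton masses per cubic meter, spread over all of space
```

The striking point is how tiny this density is locally, yet because it fills all of space uniformly, it dominates the Universe's energy budget today.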

    Because dark energy causes distant galaxies to appear to recede from one another more and more quickly as time goes on — since the space between those galaxies is expanding — it’s often called negative gravity. This is not only highly informal, but incorrect. Gravity is only positive, never negative. But even positive gravity, as we saw earlier, can have effects that look very much like repulsion.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    How energy density changes over time in a Universe dominated by matter (top), radiation (middle), and a cosmological constant (bottom). Note that dark energy doesn’t change in density as the Universe expands, which is why it comes to dominate the Universe at late times. (E. SIEGEL)

    If there were greater amounts of dark energy present within our spatially flat Universe, the expansion rate would be greater. But this is true for all forms of energy in a spatially flat Universe: dark energy is no exception. The only difference between dark energy and the more commonly encountered forms of energy, like matter and radiation, is that as the Universe expands, the densities of matter and radiation decrease.

    But because dark energy is a property of space itself, when the Universe expands, the dark energy density must remain constant. As time goes on, galaxies that are gravitationally bound will merge together into groups and clusters, while the unbound groups and clusters will accelerate away from one another. That’s the ultimate fate of the Universe if dark energy is real.
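    These scalings — matter diluting as a⁻³, radiation as a⁻⁴, dark energy staying constant — can be evolved directly to see why dark energy takes over at late times. The density parameters below are typical measured values, assumed here for illustration:

```python
# Today's energy fractions (at scale factor a = 1), in units of the
# critical density; assumed representative values.
Omega_m, Omega_r, Omega_L = 0.31, 9e-5, 0.69

def densities(a):
    """Relative energy densities at scale factor a: matter, radiation, dark energy."""
    return Omega_m * a**-3, Omega_r * a**-4, Omega_L

# Deep past (a = 1e-4): radiation dominates.
m, r, L = densities(1e-4)
print(r > m > L)        # True

# Far future (a = 100): dark energy dominates overwhelmingly.
m, r, L = densities(100.0)
print(L > m > r)        # True
```

Matter and dark energy cross over at a = (Ω_m/Ω_Λ)^(1/3) ≈ 0.77, i.e. only a few billion years ago — which is why the accelerated expansion is a relatively recent feature of cosmic history.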

    Laniakea supercluster. From Nature The Laniakea supercluster of galaxies R. Brent Tully, Hélène Courtois, Yehuda Hoffman & Daniel Pomarède at http://www.nature.com/nature/journal/v513/n7516/full/nature13674.html. Milky Way is the red dot.

    So why do we say we have a Universe that came from nothing? Because the value of dark energy may have been much higher in the distant past: before the hot Big Bang. A Universe with a very large amount of dark energy in it will behave identically to a Universe undergoing cosmic inflation. In order for inflation to end, that energy has to get converted into matter and radiation. The evidence strongly points to that happening some 13.8 billion years ago.

    When it did, though, a small amount of dark energy remained behind. Why? Because the zero-point energy of the quantum fields in our Universe isn’t zero, but a finite, greater-than-zero value. Our intuition may not be reliable when we consider the physical concepts of nothing and negative/positive gravity, but that’s why we have science. When we do it right, we wind up with physical theories that accurately describe the Universe we measure and observe.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 4:47 pm on September 24, 2018 Permalink | Reply
    Tags: A New Single-Photon Sensor for Quantum Imaging, Berkeley Quantum, Figuring out how to extend the search for dark matter particles, From Quantum Gravity to Quantum Technology, A Quantum Leap Toward Expanding the Search for Dark Matter, QCD: Quantum Chromodynamics, U.S. Department of Energy’s Office of High Energy Physics, University of Massachusetts Amherst

    From Lawrence Berkeley National Lab: “A Quantum Leap Toward Expanding the Search for Dark Matter” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    September 24, 2018
    Glenn Roberts Jr.
    (510) 486-5582

    A visualization of a massive galaxy cluster that shows dark matter density (purple filaments) overlaid with the gas velocity field. (Credit: Illustris Collaboration)

    Figuring out how to extend the search for dark matter particles – dark matter describes the stuff that makes up an estimated 85 percent of the total mass of the universe yet so far has only been measured by its gravitational effects – is a bit like building a better mousetrap…that is, a mousetrap for a mouse you’ve never seen, will never see directly, may be joined by an odd assortment of other mice, or may not be a mouse after all.

    Now, through a new research program supported by the U.S. Department of Energy’s Office of High Energy Physics (HEP), a consortium of researchers from the DOE’s Lawrence Berkeley National Laboratory (Berkeley Lab), UC Berkeley, and the University of Massachusetts Amherst will develop sensors that enlist the seemingly weird properties of quantum physics to probe for dark matter particles in new ways, with increased sensitivity, and in uncharted regions. Maurice Garcia-Sciveres, a Berkeley Lab physicist, is leading this Quantum Sensors HEP-Quantum Information Science (QIS) Consortium.

    Quantum technologies are emerging as promising alternatives to the more conventional “mousetraps” that researchers have previously used to track down elusive particles. And the DOE, through the same HEP office, is also supporting a collection of other research efforts led by Berkeley Lab scientists that tap into quantum theory, properties, and technologies in the QIS field.

    These efforts include:

    Unraveling the Quantum Structure of Quantum Chromodynamics in Parton Shower Monte Carlo Generators – This effort will develop computer programs that test the interactions between fundamental particles in extreme detail. Current computer simulations are limited by classical algorithms, though quantum algorithms could more accurately model these interactions and could provide a better way to compare with and understand particle events measured at CERN’s Large Hadron Collider, the world’s most powerful particle collider. Berkeley Lab’s Christian Bauer, a senior research scientist, will lead this effort.
    Quantum Pattern Recognition (QPR) for High-Energy Physics – Increasingly powerful particle accelerators require vastly faster computer algorithms to monitor and sort through billions of particle events per second, and this effort will develop and study the potential of quantum-based algorithms for pattern recognition to reconstruct charged particles. Such algorithms have the potential for significant speed improvements and increased precision. Led by Berkeley Lab physicist and Divisional Fellow Heather Gray, this effort will involve high-energy physics and high-performance computing expertise in Berkeley Lab’s Physics Division and at the Lab’s National Energy Research Scientific Computing Center, a DOE Office of Science User Facility, and also at UC Berkeley.
    Skipper-CCD, a New Single-Photon Sensor for Quantum Imaging – For the past six years, Berkeley Lab and Fermi National Accelerator Laboratory (Fermilab) have been collaborating in the development of a detector for astrophysics experiments that can detect the smallest individual unit of light, known as a photon. This Skipper-CCD detector was successfully demonstrated in the summer of 2017 with an incredibly low noise that allowed the detection of even individual electrons. As a next step, this Fermilab-led effort will seek to image pairs of photons that exist in a state of quantum entanglement, meaning their properties are inherently related – even over long distances – such that the measurement of one of the particles necessarily defines the properties of the other. Steve Holland, a senior scientist and engineer at Berkeley Lab who is a pioneer in the development of high-performance silicon detectors for a range of uses, is leading Berkeley Lab’s participation in this project.
    Geometry and Flow of Quantum Information: From Quantum Gravity to Quantum Technology – This effort will develop quantum algorithms and simulations for properties, including error correction and information scrambling, that are relevant to black hole theories and to quantum computing involving highly connected arrays of superconducting qubits – the basic units of a quantum computer. Researchers will also compare these with more classical methods. UC Berkeley is heading up this research program, and Irfan Siddiqi, a scientist in Berkeley Lab’s Materials Sciences Division and founding director of the Center for Quantum Coherent Science at UC Berkeley, is leading Berkeley Lab’s involvement.
    Siddiqi is also leading a separate research program, Field Programmable Gate Array-based Quantum Control for High-Energy Physics Simulations with Qutrits, that will develop specialized tools and logic families for high-energy-physics-focused quantum computing. This effort involves Berkeley Lab’s Accelerator Technology and Applied Physics Division.

    These projects are also part of Berkeley Quantum, a partnership that harnesses the expertise and facilities of Berkeley Lab and UC Berkeley to advance U.S. quantum capabilities by conducting basic research, fabricating and testing quantum-based devices and technologies, and educating the next generation of researchers.

    Also, across several of its offices, the DOE has announced support for a wave of other R&D efforts (see a related news release) that will foster collaborative innovation in quantum information science at Berkeley Lab, at other national labs, and at partner institutions.

    At Berkeley Lab, the largest HEP-funded QIS-related undertaking will include a multidisciplinary team in the development and demonstration of quantum sensors to look for very-low-mass dark matter particles – so-called “light dark matter” – by instrumenting two different detectors.

    One of these detectors will use liquid helium at a very low temperature where otherwise familiar phenomena such as heat and thermal conductivity display quantum behavior. The other detector will use specially fabricated crystals of gallium arsenide (see a related article), also chilled to cryogenic temperatures. The ideas for how these experiments can search for very light dark matter sprang from theory work at Berkeley Lab.

    “There’s a lot of unexplored territory in low-mass dark matter,” said Natalie Roe, director of the Physics Division at Berkeley Lab and the principal investigator for the Lab’s HEP-related quantum efforts. “We have all the pieces to pull this together: in theory, experiments, and detectors.”

    This image of the Andromeda Galaxy, taken from a 1970 study by astronomers Vera Rubin and W. Kent Ford Jr., shows points (dots) that were tracked at different distances from the galaxy center. The selected points unexpectedly were found to rotate at a similar rate, which provides evidence for the existence of dark matter. (Credit: Vera Rubin, W. Kent Ford Jr.)

    Garcia-Sciveres, who is leading the effort in applying quantum sensors to the low-mass dark matter search, noted that other major efforts – such as the Berkeley Lab-led LUX-ZEPLIN (LZ) experiment that is taking shape in South Dakota – will help root out whether dark matter particles known as WIMPs (weakly interacting massive particles) exist with masses comparable to that of atoms. But LZ and similar experiments are not designed to detect dark matter particles of much lower masses.

    LBNL Lux Zeplin project at SURF

    “The traditional WIMP dark matter experiments haven’t found anything yet,” he said. “And there is a lot of theoretical work on models that favor particles of a lower mass than experiments like LZ can measure,” he added. “This has motivated people to really look hard at how you can detect very-low-mass particles. It’s not so easy. It’s a very small signal that has to be detected without any background noise.”

    Researchers hope to develop quantum sensors that are better at filtering out the noise of unwanted signals. While a traditional WIMP experiment is designed to sense the recoil of an entire atomic nucleus after it is “kicked” by a dark matter particle, very-low-mass dark matter particles will bounce right off nuclei without affecting them, like a flea bouncing off an elephant.
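    The flea-and-elephant picture follows from elastic-scattering kinematics: the maximum energy a particle of mass m_χ moving at speed v can transfer to a nucleus of mass m_N is E_max = 2μ²v²/m_N, with μ the reduced mass. The particle masses and velocity below are illustrative assumptions, not values from the article:

```python
# Why light dark matter evades nuclear-recoil detectors: maximum elastic
# energy transfer E_max = 2 * mu^2 * v^2 / m_N, with mu the reduced mass.
m_chi = 1e6          # hypothetical 1 MeV/c^2 dark matter candidate, in eV/c^2
m_N = 122e9          # xenon nucleus, ~122 GeV/c^2, in eV/c^2
beta = 220e3 / 3e8   # typical galactic speed ~220 km/s, as a fraction of c

mu = m_chi * m_N / (m_chi + m_N)      # reduced mass (~m_chi when m_chi << m_N)
E_max = 2 * mu**2 * beta**2 / m_N     # maximum recoil energy, in eV

print(f"max nuclear recoil ~ {E_max:.1e} eV")
# ~1e-5 eV: far below the ~keV thresholds of traditional WIMP detectors,
# but comparable to the meV-scale quanta that quantum sensors target.
```

A micro-electronvolt recoil is hopeless for a conventional detector, which is exactly why the program instead looks for the tiny quantum vibrations (phonons, rotons) such a collision can excite.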

    The goal of the new effort is to sense the low-mass particles via their energy transfer in the form of very feeble quantum vibrations, which go by names like “phonons” or “rotons,” for example, Garcia-Sciveres said.

    “You would never be able to tell that an invisible flea hits an elephant by watching the elephant. But what if every time an invisible flea hits an elephant at one end of the herd, a visible flea is flung away from an elephant at the other end of the herd?” he said.

    “You could use these sensors to watch for such slight signals in a very cold crystal or superfluid helium, where an incoming dark matter particle is like the invisible flea, and the outgoing visible flea is a quantum vibration that must be detected.”

    The particle physics community has held some workshops to brainstorm the possibilities for low-mass dark matter detection. “This is a new regime. This is an area where there aren’t even any measurements yet. There is a promise that QIS techniques can help give us more sensitivity to the small signals we’re looking for,” Garcia-Sciveres added. “Let’s see if that’s true.”

    The demonstration detectors will each have about 1 cubic centimeter of detector material. Dan McKinsey, a Berkeley Lab faculty senior scientist and UC Berkeley physics professor who is responsible for the development of the liquid helium detector, said that the detectors will be constructed on the UC Berkeley campus. Both are designed to be sensitive to particles with a mass lighter than protons – the positively charged particles that reside in atomic nuclei.

    A schematic for low-mass dark matter particle detection in a planned superfluid helium (He) experiment. (Credit: Berkeley Lab)

    The superfluid helium detector will make use of a process called “quantum evaporation,” in which rotons and phonons cause individual helium atoms to be evaporated from the surface of superfluid helium.

    Kathryn Zurek, a Berkeley Lab physicist and pioneering theorist in the search for very-low-mass dark matter particles who is working on the quantum sensor project, said the technology to detect such “whispers” of dark matter didn’t exist just a decade ago but “has made major gains in the last few years.” She also noted, “There had been a fair amount of skepticism about how realistic it would be to look for this light-mass dark matter, but the community has moved more broadly in that direction.”

    There are many synergies in the expertise and capabilities that have developed both at Berkeley Lab and on the UC Berkeley campus that make it a good time – and the right place – to develop and apply quantum technologies to the hunt for dark matter, Zurek said.

    Theories developed at Berkeley Lab suggest that certain exotic materials exhibit quantum states or “modes” that low-mass dark matter particles can couple with, which would make the particles detectable – like the “visible flea” referenced above.

    “These ideas are the motivation for building these experiments to search for light dark matter,” Zurek said. “This is a broad and multipronged approach, and the idea is that it will be a stepping stone to a larger effort.”

    The new project will draw from a deep experience in building other types of particle detectors, and R&D in ultrasensitive sensors that operate at the threshold where an electrically conducting material becomes a superconductor – the “tipping point” that is sensitive to the slightest fluctuations. Versions of these sensors are already used to search for slight temperature variations in the relic microwave light that spans the universe.

    At the end of the three-year demonstration, researchers could perhaps turn their sights to more exotic types of detector materials in larger volumes.

    “I’m excited to see this program move forward, and I think it will become a significant research direction in the Physics Division at Berkeley Lab,” Zurek said, adding that the program could also demonstrate ultrasensitive detectors that have applications in other fields of science.

    More info:

    Read a news release that summarizes all of the Berkeley Lab quantum information science awards announced Sept. 24
    Berkeley Lab to Build an Advanced Quantum Computing Testbed
    About Berkeley Quantum

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California


  • richardmitnick 4:33 pm on August 20, 2018 Permalink | Reply
    Tags: , Anomalies, , Branes, , , , , , Parity violation, QCD: Quantum Chromodynamics, , , , , , The second superstring revolution, Theorist John Schwarz   

    From Caltech: “Long and Winding Road: A Conversation with String Theory Pioneer John Schwarz” 



    Whitney Clavin
    (626) 395-1856

    John Schwarz discusses the history and evolution of superstring theory.

    John Schwarz. Credit: Seth Hansen for Caltech

    The decades-long quest for a theory that would unify all the known forces—from the microscopic quantum realm to the macroscopic world where gravity dominates—has had many twists and turns. The current leading theory, known as superstring theory and more informally as string theory, grew out of an approach to theoretical particle physics, called S-matrix theory, which was popular in the 1960s. Caltech’s John H. Schwarz, the Harold Brown Professor of Theoretical Physics, Emeritus, began working on the problem in 1971, while a junior faculty member at Princeton University. He moved to Caltech in 1972, where he continued his research with various collaborators from other universities. Their studies in the 1970s and 1980s would dramatically shift the evolution of the theory and, in 1984, usher in what’s known as the first superstring revolution.

    Essentially, string theory postulates that our universe is made up, at its most fundamental level, of infinitesimally tiny vibrating strings and contains 10 dimensions—three for space, one for time, and six other spatial dimensions curled up in such a way that we don’t perceive them in everyday life or even with the most sensitive experimental searches to date. One of the many states of a string is thought to correspond to the particle that carries the gravitational force, the graviton, thereby linking the two pillars of fundamental physics—quantum mechanics and the general theory of relativity, which includes gravity.

    We sat down with Schwarz to discuss the history and evolution of string theory and how the theory itself might have moved past strings.

    What are the earliest origins of string theory?

    The first study often regarded as the beginning of string theory came from an Italian physicist named Gabriele Veneziano in 1968. He discovered a mathematical formula that had many of the properties that people were trying to incorporate in a fundamental theory of the strong nuclear force [a fundamental force that holds nuclei together]. This formula was kind of pulled out of the blue, and ultimately Veneziano and others realized, within a couple years, that it was actually describing a quantum theory of a string—a one-dimensional extended object.

    How did the field grow after this paper?

    In the early ’70s, there were several hundred people worldwide working on string theory. But then everything changed when quantum chromodynamics, or QCD—which was developed by Caltech’s Murray Gell-Mann [Nobel Laureate, 1969] and others—became the favored theory of the strong nuclear force. Almost everyone was convinced QCD was the right way to go and stopped working on string theory. The field shrank down to just a handful of people in the course of a year or two. I was one of the ones who remained.

    How did Gell-Mann become interested in your work?

    Gell-Mann is the one who brought me to Caltech and was very supportive of my work. He took an interest in studies I had done with a French physicist, André Neveu, when we were at Princeton. Neveu and I introduced a second string theory. The initial Veneziano version had many problems. There are two kinds of fundamental particles called bosons and fermions, and the Veneziano theory only described bosons. The one I developed with Neveu included fermions. And not only did it include fermions but it led to the discovery of a new kind of symmetry that relates bosons and fermions, which is called supersymmetry. Because of that discovery, this version of string theory is called superstring theory.

    When did the field take off again?

    A pivotal change happened after work I did with another French physicist, Joël Scherk, whom Gell-Mann and I had brought to Caltech as a visitor in 1974. During that period, we realized that many of the problems we were having with string theory could be turned into advantages if we changed the purpose. Instead of insisting on constructing a theory of the strong nuclear force, we took this beautiful theory and asked what it was good for. And it turned out it was good for gravity. Neither of us had worked on gravity. It wasn’t something we were especially interested in but we realized that this theory, which was having trouble describing the strong nuclear force, gives rise to gravity. Once we realized this, I knew what I would be doing for the rest of my career. And I believe Joël felt the same way. Unfortunately, he died six years later. He made several important discoveries during those six years, including a supergravity theory in 11 dimensions.

    Surprisingly, the community didn’t respond very much to our papers and lectures. We were generally respected and never had a problem getting our papers published, but there wasn’t much interest in the idea. We were proposing a quantum theory of gravity, but in that era physicists who worked on quantum theory weren’t interested in gravity, and physicists who worked on gravity weren’t interested in quantum theory.

    That changed after I met Michael Green [a theoretical physicist then at the University of London and now at the University of Cambridge], at the CERN cafeteria in Switzerland in the summer of 1979. Our collaboration was very successful, and Michael visited Caltech for several extended visits over the next few years. We published a number of papers during that period, which are much cited, but our most famous work was something we did in 1984, which had to do with a problem known as anomalies.

    What are anomalies in string theory?

    One of the facts of nature is that there is what’s called parity violation, which means that the fundamental laws are not invariant under mirror reflection. For example, a neutrino always spins clockwise and not counterclockwise, so it would look wrong viewed in a mirror. When you try to write down a fundamental theory with parity violation, mathematical inconsistencies often arise when you take account of quantum effects. This is referred to as the anomaly problem. It appeared that one couldn’t make a theory based on strings without encountering these anomalies, which, if that were the case, would mean strings couldn’t give a realistic theory. Green and I discovered that these anomalies cancel one another in very special situations.

    When we released our results in 1984, the field exploded. That’s when Edward Witten [a theoretical physicist at the Institute for Advanced Study in Princeton], probably the most influential theoretical physicist in the world, got interested. Witten and three collaborators wrote a paper early in 1985 making a particular proposal for what to do with the six extra dimensions, the ones other than the four for space and time. That proposal looked, at the time, as if it could give a theory that is quite realistic. These developments, together with the discovery of another version of superstring theory, constituted the first superstring revolution.

    Richard Feynman was here at Caltech during that time, before he passed away in 1988. What did he think about string theory?

    After the 1984 to 1985 breakthroughs in our understanding of superstring theory, the subject no longer could be ignored. At that time it acquired some prominent critics, including Richard Feynman and Stephen Hawking. Feynman’s skepticism of superstring theory was based mostly on the concern that it could not be tested experimentally. This was a valid concern, which my collaborators and I shared. However, Feynman did want to learn more, so I spent several hours explaining the essential ideas to him. Thirty years later, it is still true that there is no smoking-gun experimental confirmation of superstring theory, though it has proved its value in other ways. The most likely possibility for experimental support in the foreseeable future would be the discovery of supersymmetry particles. So far, they have not shown up.

    What was the second superstring revolution about?

    The second superstring revolution occurred 10 years later in the mid ’90s. What happened then is that string theorists discovered what happens when particle interactions become strong. Before, we had been studying weakly interacting systems. But as you crank up the strength of the interaction, a 10th dimension of space can emerge. New objects called branes also emerge. Strings are one dimensional; branes have all sorts of dimensions ranging from zero to nine. An important class of these branes, called D-branes, was discovered by the late Joseph Polchinski [BS ’75]. Strings do have a special role, but when the system is strongly interacting, the strings become less fundamental. It’s possible that in the future the subject will get a new name, but until we understand better what the theory is, which we’re still struggling with, it’s premature to invent a new name.

    What can we say now about the future of string theory?

    It’s now over 30 years since a large community of scientists began pooling their talents, and there’s been enormous progress in those 30 years. But the more big problems we solve, the more new questions arise. So, you don’t even know the right questions to ask until you solve the previous questions. Interestingly, some of the biggest spin-offs of our efforts to find the most fundamental theory of nature are in pure mathematics.

    Do you think string theory will ultimately unify the forces of nature?

    Yes, but I don’t think we’ll have a final answer in my lifetime. The journey has been worth it, even if it did take some unusual twists and turns. I’m convinced that, in other intelligent civilizations throughout the galaxy, similar discoveries will occur, or already have occurred, in a different sequence than ours. We’ll find the same result and reach the same conclusions as other civilizations, but we’ll get there by a very different route.

    See the full article here.



    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”


  • richardmitnick 11:34 am on November 9, 2017 Permalink | Reply
    Tags: , But what is matter exactly, Einstein: m = E/c2. This is the great insight (not E = mc2), Frank Wilczek, , Higgs field, , , , Physics Has Demoted Mass, QCD: Quantum Chromodynamics, Quarks are quantum wave-particles   

    From Nautilus: “Physics Has Demoted Mass” 



    November 9, 2017
    Jim Baggott

    You’re sitting here, reading this article. Maybe it’s a hard copy, or an e-book on a tablet computer or e-reader. It doesn’t matter. Whatever you’re reading it on, we can be reasonably sure it’s made of some kind of stuff: paper, card, plastic, perhaps containing tiny metal electronic things on printed circuit boards. Whatever it is, we call it matter or material substance. It has a characteristic property that we call solidity. It has mass.

    But what is matter, exactly? Imagine a cube of ice, measuring a little over one inch (or 2.7 centimeters) in length. Imagine holding this cube of ice in the palm of your hand. It is cold, and a little slippery. It weighs hardly anything at all, yet we know it weighs something.

    Let’s make our question a little more focused. What is this cube of ice made of? And, an important secondary question: What is responsible for its mass?


    To understand what a cube of ice is made of, we need to draw on the learning acquired by the chemists. Building on a long tradition established by the alchemists, these scientists distinguished between different chemical elements, such as hydrogen, carbon, and oxygen. Research on the relative weights of these elements and the combining volumes of gases led John Dalton and Louis Gay-Lussac to the conclusion that different chemical elements consist of atoms with different weights which combine according to a set of rules involving whole numbers of atoms.

    The mystery of the combining volumes of hydrogen and oxygen gas to produce water was resolved when it was realized that hydrogen and oxygen are both diatomic gases, H2 and O2. Water is then a compound consisting of two hydrogen atoms and one oxygen atom, H2O.

    This partly answers our first question. Our cube of ice consists of molecules of H2O organized in a regular array. We can also make a start on our second question. Avogadro’s law states that a mole of chemical substance will contain about 6 × 10^23 discrete “particles.” Now, we can interpret a mole of substance simply as its molecular weight scaled up to gram quantities. Hydrogen (in the form of H2) has a relative molecular weight of 2, implying that each hydrogen atom has a relative atomic weight of 1. Oxygen (O2) has a relative molecular weight of 32, implying that each oxygen atom has a relative atomic weight of 16. Water (H2O) therefore has a relative molecular weight of 2 × 1 + 16 = 18.

    It so happens that our cube of ice weighs about 18 grams, which means that it represents a mole of water, more or less. According to Avogadro’s law it must therefore contain about 6 × 10^23 molecules of H2O. This would appear to provide a definitive answer to our second question. The mass of the cube of ice derives from the mass of the hydrogen and oxygen atoms present in 6 × 10^23 molecules of H2O.
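
    The arithmetic here is easy to verify. A minimal sketch in Python, using the rounded atomic weights quoted above:

```python
# Avogadro's number: discrete particles per mole of substance
N_A = 6.022e23

# Rounded relative atomic weights, as used in the text
m_H = 1    # hydrogen
m_O = 16   # oxygen

# Relative molecular weight of water, H2O
m_H2O = 2 * m_H + m_O   # 2 * 1 + 16 = 18

# An 18-gram cube of ice is therefore about one mole of water
grams = 18
moles = grams / m_H2O
molecules = moles * N_A  # roughly 6 x 10^23 molecules of H2O
print(molecules)
```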

    But, of course, we can go further. We learned from J.J. Thomson, Ernest Rutherford, Niels Bohr, and many other physicists in the early 20th century that all atoms consist of a heavy, central nucleus surrounded by light, orbiting electrons. We subsequently learned that the central nucleus consists of protons and neutrons. The number of protons in the nucleus determines the chemical identity of the element: A hydrogen atom has one proton, an oxygen atom has eight (this is called the atomic number). But the total mass or weight of the nucleus is determined by the total number of protons and neutrons in the nucleus.

    Hydrogen still has only one (its nucleus consists of a single proton—no neutrons). The most common isotope of oxygen has—guess what?—16 (eight protons and eight neutrons). It’s obviously no coincidence that these proton and neutron counts are the same as the relative atomic weights I quoted above.

    If we ignore the light electrons, then we would be tempted to claim that the mass of the cube of ice resides in all the protons and neutrons in the nuclei of its hydrogen and oxygen atoms. Each molecule of H2O contributes 10 protons and eight neutrons, so if there are 6 × 10^23 molecules in the cube and we ignore the small difference in mass between a proton and a neutron, we conclude that the cube contains in total about 18 times this figure, or 108 × 10^23 protons and neutrons.
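
    The nucleon count works the same way; a sketch using the same rounded numbers:

```python
molecules = 6e23  # H2O molecules in the cube, from Avogadro's law

# Per H2O molecule: two hydrogen atoms (1 proton each) and one
# oxygen atom (8 protons and 8 neutrons)
protons = 2 * 1 + 8             # 10 protons
neutrons = 8                    # 8 neutrons
nucleons = protons + neutrons   # 18 nucleons per molecule

total_nucleons = nucleons * molecules  # 108 x 10^23, i.e. 1.08e25
print(total_nucleons)
```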

    So far, so good. But we’re not quite done yet. We now know that protons and neutrons are not elementary particles. They consist of quarks. A proton contains two up quarks and a down quark, a neutron two down quarks and an up quark. And the color force binding the quarks together inside these larger particles is carried by massless gluons.

    Okay, so surely we just keep going. If, once again, we approximate the masses of the up and down quarks as the same, we just multiply by three and turn 108 × 10^23 protons and neutrons into 324 × 10^23 up and down quarks. We conclude that this is where all the mass resides. Yes?

    No. This is where our naïve atomic preconceptions unravel. We can look up the masses of the up and down quarks on the Particle Data Group website. The up and down quarks are so light that their masses can’t be measured precisely and only ranges are quoted. The following are all reported in units of MeV/c2. In these units the mass of the up quark is given as 2.3 with a range from 1.8 to 3.0. The down quark is a little heavier, 4.8, with a range from 4.5 to 5.3. Compare these with the mass of the electron, about 0.51 measured in the same units.

    Now comes the shock. In the same units of MeV/c2 the proton mass is 938.3, the neutron 939.6. The combination of two up quarks and a down quark gives us only 9.4, or just 1 percent of the mass of the proton. The combination of two down quarks and an up quark gives us only 11.9, or just 1.3 percent of the mass of the neutron. About 99 percent of the masses of the proton and neutron seem to be unaccounted for. What’s gone wrong?
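
    The shortfall is simple to reproduce from the central values quoted above (a sketch; all masses in MeV/c2):

```python
# Central-value masses in MeV/c^2, as quoted in the text
m_up, m_down = 2.3, 4.8
m_proton, m_neutron = 938.3, 939.6

# Proton = two up quarks + one down; neutron = two down + one up
quarks_in_proton = 2 * m_up + m_down    # 9.4
quarks_in_neutron = 2 * m_down + m_up   # 11.9

# Fraction of each nucleon's mass accounted for by quark rest masses
print(quarks_in_proton / m_proton)      # about 0.01  (1 percent)
print(quarks_in_neutron / m_neutron)    # about 0.013 (1.3 percent)
```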

    To answer this question, we need to recognize what we’re dealing with. Quarks are not self-contained “particles” of the kind that the Greeks or the mechanical philosophers might have imagined. They are quantum wave-particles; fundamental vibrations or fluctuations of elementary quantum fields. The up and down quarks are only a few times heavier than the electron, and we’ve demonstrated the electron’s wave-particle nature in countless laboratory experiments. We need to prepare ourselves for some odd, if not downright bizarre behavior.

    And let’s not forget the massless gluons. Or special relativity, and E = mc2. Or the difference between “bare” and “dressed” mass. And, last but not least, let’s not forget the role of the Higgs field in the “origin” of the mass of all elementary particles. To try to understand what’s going on inside a proton or neutron we need to reach for quantum chromodynamics, the quantum field theory of the color force between quarks.


    Quarks and gluons possess color “charge.” Just what is this, exactly? We have no way of really knowing. We do know that color is a property of quarks and gluons and there are three types, which physicists have chosen to call red, green, and blue. But, just as nobody has ever “seen” an isolated quark or gluon, so more or less by definition nobody has ever seen a naked color charge. In fact, quantum chromodynamics (QCD) suggests that if a color charge could be exposed like this it would have a near-infinite energy. Aristotle’s maxim was that “nature abhors a vacuum.” Today we might say: “nature abhors a naked color charge.”

    So, what would happen if we could somehow create an isolated quark with a naked color charge? Its energy would go up through the roof, more than enough to conjure virtual gluons out of “empty” space. Just as the electron moving through its own self-generated electromagnetic field gathers a covering of virtual photons, so the exposed quark gathers a covering of virtual gluons. Unlike photons, the gluons themselves carry color charge and they are able to reduce the energy by, in part, masking the exposed color charge. Think of it this way: The naked quark is acutely embarrassed, and it quickly dresses itself with a covering of gluons.

    This isn’t enough, however. The energy is high enough to produce not only virtual particles (like a kind of background noise or hiss), but elementary particles, too. In the scramble to cover the exposed color charge, an anti-quark is produced which pairs with the naked quark to form a meson. A quark is never—but never—seen without a chaperone.

    But this still doesn’t do it. To cover the color charge completely we would need to put the anti-quark in precisely the same place at precisely the same time as the quark. Heisenberg’s uncertainty principle won’t let nature pin down the quark and anti-quark in this way. Remember that a precisely pinned-down position implies an infinite momentum, and a precisely pinned-down moment in time implies an infinite energy. Nature has no choice but to settle for a compromise. It can’t cover the color charge completely, but it can mask it with the anti-quark and the virtual gluons. The energy is at least reduced to a manageable level.

    This kind of thing also goes on inside the proton and neutron. Within the confines of their host particles, the three quarks rattle around relatively freely. But, once again, their color charges must be covered, or at least the energy of the exposed charges must be reduced. Each quark produces a blizzard of virtual gluons that pass back and forth between them, together with quark–anti-quark pairs. Physicists sometimes call the three quarks that make up a proton or a neutron “valence” quarks, as there’s enough energy inside these particles for a further sea of quark–anti-quark pairs to form. The valence quarks are not the only quarks inside these particles.

    What this means is that the mass of the proton and neutron can be traced largely to the energy of the gluons and the sea of quark–anti-quark pairs that are conjured from the color field.

    How do we know? Well, it must be admitted that it is actually really rather difficult to perform calculations using QCD. The color force is extremely strong, and the corresponding energies of color-force interactions are therefore very high. Remember that the gluons also carry color charge, so everything interacts with everything else. Virtually anything can happen, and keeping track of all the possible virtual and elementary-particle permutations is very demanding.

    This means that although the equations of QCD can be written down in a relatively straightforward manner, they cannot be solved analytically, on paper. And the mathematical sleight-of-hand used so successfully in QED no longer applies: because the color force is so strong, the perturbative techniques that work for QED break down. Physicists have had no choice but to solve the equations numerically, on a computer, instead.

    Considerable progress was made with a version of QCD called “QCD-lite.” This version considered only massless gluons and up and down quarks, and further assumed that the quarks themselves are also massless (so, literally, “lite”). Calculations based on these approximations yielded a proton mass that was found to be just 10 percent lighter than the measured value.

    Let’s stop to think about that for a bit. A simplified version of QCD in which we assume that no particles have mass to start with nevertheless predicts a mass for the proton that is 90 percent right. The conclusion is quite startling. Most of the mass of the proton comes from the energy of the interactions of its constituent quarks and gluons.

    John Wheeler used the phrase “mass without mass” to describe the effects of superpositions of gravitational waves which could concentrate and localize energy such that a black hole is created. If this were to happen, it would mean that a black hole—the ultimate manifestation of super-high-density matter—had been created not from the matter in a collapsing star but from fluctuations in spacetime. What Wheeler really meant was that this would be a case of creating a black hole (mass) from gravitational energy.

    But Wheeler’s phrase is more than appropriate here. Frank Wilczek, one of the architects of QCD, used it in connection with his discussion of the results of the QCD-lite calculations. If much of the mass of a proton and neutron comes from the energy of interactions taking place inside these particles, then this is indeed “mass without mass,” meaning that we get the behavior we tend to ascribe to mass without the need for mass as a property.

    Does this sound familiar? Recall that in Einstein’s seminal addendum to his 1905 paper on special relativity the equation he derived is actually m = E/c2. This is the great insight (not E = mc2). And Einstein was surely prescient when he wrote: “the mass of a body is a measure of its energy content.”[1] Indeed, it is. In his book The Lightness of Being, Wilczek wrote:[2]

    “If the body is a human body, whose mass overwhelmingly arises from the protons and neutrons it contains, the answer is now clear and decisive. The inertia of that body, with 95 percent accuracy, is its energy content.”

    In the fission of a U-235 nucleus, some of the energy of the color fields inside its protons and neutrons is released, with potentially explosive consequences. In the proton–proton chain involving the fusion of four protons, the conversion of two up quarks into two down quarks, forming two neutrons in the process, results in the release of a little excess energy from its color fields. Mass does not convert to energy. Energy is instead passed from one kind of quantum field to another.
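
    A quick way to see m = E/c2 at work is to recover the proton’s mass in kilograms from its energy content. A sketch, where the conversion constants are standard values assumed here rather than taken from the article:

```python
# m = E/c^2: recover the proton's mass in kilograms from its
# energy content. Conversion constants (assumed):
MeV_to_J = 1.602e-13   # joules per MeV
c = 2.998e8            # speed of light in m/s

E_proton = 938.3       # MeV, the proton mass-energy quoted above
m_proton_kg = E_proton * MeV_to_J / c**2
print(m_proton_kg)     # roughly 1.67e-27 kg, the familiar proton mass
```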

    Where does this leave us? We’ve certainly come a long way since the ancient Greek atomists speculated about the nature of material substance, 2,500 years ago. But for much of this time we’ve held to the conviction that matter is a fundamental part of our physical universe. We’ve been convinced that it is matter that has energy. And, although matter may be reducible to microscopic constituents, for a long time we believed that these would still be recognizable as matter—they would still possess the primary quality of mass.

    Modern physics teaches us something rather different, and deeply counter-intuitive. As we worked our way ever inward—matter into atoms, atoms into sub-atomic particles, sub-atomic particles into quantum fields and forces—we lost sight of matter completely. Matter lost its tangibility. It lost its primacy as mass became a secondary quality, the result of interactions between intangible quantum fields. What we recognize as mass is a behavior of these quantum fields; it is not a property that belongs or is necessarily intrinsic to them.

    Despite the fact that our physical world is filled with hard and heavy things, it is instead the energy of quantum fields that reigns supreme. Mass becomes simply a physical manifestation of that energy, rather than the other way around.

    This is conceptually quite shocking, but at the same time extraordinarily appealing. The great unifying feature of the universe is the energy of quantum fields, not hard, impenetrable atoms. Perhaps this is not quite the dream that philosophers might have held fast to, but a dream nevertheless.


    1. Einstein, A. Does the inertia of a body depend upon its energy-content? Annalen der Physik 18 (1905).

    2. Wilczek, F. The Lightness of Being. Basic Books, New York, NY (2008).

    Photocollage credits: Physicsworld.com; Thatree Thitivongvaroon / Getty Images

    See the full article here.


    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

  • richardmitnick 12:14 pm on August 25, 2017 Permalink | Reply
    Tags: , Basic science research seeks to improve our understanding of the world around us, , , Center for Frontiers of Nuclear Science, , , Nucleons, QCD: Quantum Chromodynamics,   

    From BNL: “Research Center Established to Explore the Least Understood and Strongest Force Behind Visible Matter” 

    Brookhaven Lab

    August 22, 2017
    Peter Genzer
    (631) 344-3174

    In an Electron-Ion Collider, a beam of electrons (e-) would scatter off a beam of protons or atomic nuclei, generating virtual photons (γ)—particles of light that penetrate the proton or nucleus to tease out the structure of the quarks and gluons within.

    Science can explain only a small portion of the matter that makes up the universe, from the earth we walk on to the stars we see at night. Stony Brook University and the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory (BNL) have established the Center for Frontiers of Nuclear Science to help scientists better understand the building blocks of visible matter. The new Center will push the frontiers of knowledge about quarks, gluons and their interactions that form protons, neutrons, and ultimately 99.9 percent of the mass of atoms – the bulk of the visible universe.

    “The Center for Frontiers in Nuclear Science will bring us closer to understanding our universe in ways in which it has never before been possible,” said Samuel L. Stanley Jr., MD, President of Stony Brook University. “Thanks to the vision of the Simons Foundation, scientists from Stony Brook, Brookhaven Laboratory and many other institutions are now empowered to pursue the big ideas that will lead to new knowledge about the structure of the building blocks of everything in the universe today.”

    Bolstered by a new $5 million grant from the Simons Foundation and augmented by $3 million in research grants received by Stony Brook University, the Center will be a research and education hub to ultimately help scientists unravel more secrets of the universe’s strongest and least-understood force to advance both fundamental science and applications that transform our lives.

    Jim Simons, PhD, Chairman of the Simons Foundation said, “Nuclear physics is a deep and important discipline, casting light on many poorly understood facets of matter in our universe. It is a pleasure to support research in this area conducted by members of the outstanding team to be assembled by Brookhaven Lab and Stony Brook University. We much look forward to the results of this effort.”

    “Basic science research seeks to improve our understanding of the world around us, and it can take human understanding to wonderful and unexpected places,” said Marilyn Simons, President of the Simons Foundation. “Exploring the qualities and behaviors of fundamental particles seems likely to do just that.”

    The Center brings together current Stony Brook faculty and BNL staff, and scientists around the world with students and new scientific talent to investigate the structure of nucleons and nuclei at a fundamental level. Despite the importance of nucleons in all visible matter, scientists know less about their internal structure and dynamics than about any other component of visible matter. Over the next several decades, the Center is slated to become a leading international intellectual hub for quantum chromodynamics (QCD), a branch of physics that describes the properties of nucleons, starting from the interactions of the quarks and gluons inside them.

    An Electron-Ion Collider would probe the inner microcosm of protons to help scientists understand how interactions among quarks (colored spheres) and glue-like gluons (yellow) generate the proton’s essential properties and the large-scale structure of the visible matter in the universe today.

    As part of the Center’s mission as a destination of research, collaboration and education for international scientists and students, workshops and seminars are planned for scientists to discuss and investigate theoretical concepts and promote experimental measurements to advance QCD-based nuclear science. The Center will support graduate education in nuclear science and conduct visitor programs to support and promote the Center’s role as an international research hub for physics related to a proposed Electron Ion Collider (EIC).

    One of the central aspects of the Center’s focus during its first few years will be activities on the science of a proposed EIC, a powerful new particle accelerator that would create rapid-fire, high-resolution “snapshots” of quarks and gluons contained in nucleons and complex nuclei. An EIC would enable scientists to see deep inside these objects and explore the still mysterious structures and interactions of quarks and gluons, opening up a new frontier in nuclear physics.

    “The role of quarks and gluons in determining the properties of protons and neutrons remains one of the greatest unsolved mysteries in physics,” said Doon Gibbs, Ph.D., Brookhaven Lab Director. “An Electron Ion Collider would reveal the internal structure of these atomic building blocks, a key part of the quest to understand the matter we’re made of.”

    Building an EIC and its research program in the United States would strengthen and expand U.S. leadership in nuclear physics and stimulate economic benefits well into the 2040s. In 2015, the DOE and the National Science Foundation’s Nuclear Science Advisory Committee recommended an EIC as the highest priority for new facility construction. Similar to explorations of fundamental particles and forces that have driven our nation’s scientific, technological, and economic progress for the past century — from the discovery of electrons that power our sophisticated computing and communications devices to our understanding of the cosmos — groundbreaking nuclear science research at an EIC will spark new innovations and technological advances.

    Stony Brook and BNL have internationally renowned programs in nuclear physics that focus on understanding QCD. Stony Brook’s nuclear physics group has recently expanded its expertise by adding faculty in areas such as electron scattering and neutrino science. BNL operates the Relativistic Heavy Ion Collider (RHIC), a DOE Office of Science User Facility and the world’s most versatile particle collider. RHIC has pioneered the study of quark-gluon matter at high temperatures and densities—known as quark-gluon plasma—and is exploring the limits of normal nuclear matter. Together, these programs cover a major part of the course charted by the U.S. nuclear science community in its 2015 Long Range Plan.

    Abhay Deshpande, PhD, Professor of experimental nuclear physics in the Department of Physics and Astronomy in the College of Arts and Sciences at Stony Brook University, has been named Director of the Center. Professor Deshpande has promoted an EIC for more than two decades and helped create a ~700-member global scientific community (the EIC Users Group, EICUG) interested in pursuing the science of an EIC. In the fall of 2016, he was elected as the first Chair of its Steering Committee, effectively serving as its spokesperson, a position from which he has stepped down to direct the new Center. Concurrently with his position as Center Director, Dr. Deshpande also serves as Director of EIC Science at Brookhaven Lab.

    Scientists at the Center, working with EICUG, will have a specific focus on QCD inside the nucleon and how it shapes fundamental nucleon properties, such as spin and mass; the role of high-density many-body QCD and gluons in nuclei; the quark-gluon plasma at the high temperature frontier; and the connections of QCD to weak interactions and nuclear astrophysics. Longer term, the Center’s programmatic focus is expected to reflect the evolution of nuclear science priorities in the United States.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 12:35 pm on June 10, 2017 Permalink | Reply
    Tags: , , Irish Centre for High-End Computing, , , PRACE, QCD: Quantum Chromodynamics, , Sinéad Ryan, ,   

    From Science Node: Women in STEM – “A day in the life of an Irish particle physicist” Sinéad Ryan 

    Science Node bloc

    Science Node

    02 Jun, 2017
    Tristan Fitzpatrick

    Sinéad Ryan is a quantum chromodynamics expert in Dublin. She relies on PRACE HPC resources to calculate the mass of quarks, gluons, and hadrons — and uncover the secrets of the universe.

    Uncovering the mysteries of the cosmos is just another day in the office for Sinéad Ryan.


    Ryan, professor of theoretical high energy physics at Trinity College Dublin, specializes in quantum chromodynamics (QCD). The field examines how quarks and gluons form hadrons, the fundamental starting point of our universe.

    “Quarks and gluons are the building blocks for everything in the world around us and for our universe,” says Ryan. “The question is, how do these form the matter that we see around us?”

    To answer this, Ryan performs numerical simulations on high-performance computing (HPC) resources managed by the Partnership for Advanced Computing in Europe (PRACE).

    “I think PRACE is crucial for our field,” says Ryan, “and I’m sure other people would tell you the same thing.”

    When quarks are pulled apart, the energy between them grows, similar to the tension in a rubber band as it is stretched. Eventually, enough energy is stored to create more quarks, which then form hadrons in accordance with Einstein’s equation E = mc².
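The rubber-band picture can be made semi-quantitative. In a simple confinement model the stored energy grows linearly with separation, E(r) ≈ σr; once it exceeds the mass-energy of a light quark-antiquark pair, the "string" snaps and new hadrons appear. A toy estimate, assuming a textbook string tension of roughly 1 GeV/fm and a rough light-pair energy (both values are illustrative assumptions, not measured inputs):

```python
# Toy estimate of QCD string breaking: the energy stored between two
# separating quarks grows roughly linearly, E(r) ~ sigma * r.  When the
# stored energy exceeds the mass-energy of a light quark-antiquark pair
# (via E = mc^2), the string can break and produce new hadrons.
SIGMA = 900.0            # MeV per femtometre, assumed string tension
PAIR_ENERGY = 2 * 300.0  # MeV, rough mass-energy of a light q-qbar pair

# Separation at which enough energy is stored to create the pair
r_break = PAIR_ENERGY / SIGMA  # femtometres
print(f"string breaks after roughly {r_break:.2f} fm of stretching")
```

The answer lands well under a femtometre, which is why free quarks are never seen: pulling them apart just manufactures more hadrons.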

    The problem, according to Ryan, comes in solving the equations of QCD. PRACE’s HPC resources make Ryan’s work possible because they enable her to run simulations on a larger scale than simple pen and paper would allow.

    “It’s a huge multi-dimensional integral to solve, and we’re talking about million-by-million matrices that we must invert,” says Ryan.

    “This is where HPC comes in. If you want to make predictions in the theory, you need to be able to do the simulations numerically.”
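The matrix inversions Ryan describes are typically handled with iterative Krylov-subspace methods such as conjugate gradient, which only ever need matrix-vector products; that is what makes million-row sparse systems tractable on HPC machines. A minimal pure-Python sketch on a hypothetical 3×3 system (the real lattice Dirac operator is vastly larger, sparse, and solved in parallel):

```python
# Minimal conjugate-gradient solver, the same family of iterative method
# used (at vastly larger scale) to invert the lattice Dirac operator in
# QCD simulations.  The 3x3 system here is a toy stand-in: real lattice
# matrices have millions of rows but are sparse, so only matrix-vector
# products are ever needed.

def mat_vec(A, x):
    """Dense matrix-vector product (stands in for the sparse operator)."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for a symmetric positive-definite matrix A."""
    x = [0.0] * len(b)
    r = b[:]              # residual b - A x (x starts at zero)
    p = r[:]              # initial search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = mat_vec(A, p)
        alpha = rs_old / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:
            break
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

# Toy symmetric positive-definite system
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
x = conjugate_gradient(A, b)
print("solution:", x)
```

Because the method touches the matrix only through `mat_vec`, scaling it up is a matter of distributing that one product, which is exactly the kind of workload PRACE machines are built for.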

    In Ireland, the Irish Centre for High-End Computing is one resource Ryan has tapped in her research, but PRACE enables her and her collaborators to access resources not just locally but across the world.

    IITAC IBM supercomputer

    “This sort of work tends to be very collaborative and international,” says Ryan. “We can apply through PRACE for time on HPC machines throughout Europe. In my field, any machine anywhere is fair game.”

    Besides providing resources, PRACE also determines whether HPC resources are suitable for the kinds of research questions scientists are interested in answering.

    “PRACE’s access to these facilities means that good science gets done on these machines,” says Ryan. “These are computations that are based around fundamental questions posed by people who have a track record for doing good science and asking the right questions. I think that’s crucial.”

    Without PRACE’s support, Ryan’s work examining how quarks and gluons form matter and the beginnings of our universe would be greatly diminished, leaving us one step further from uncovering the building blocks of the universe.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

  • richardmitnick 2:30 pm on February 14, 2017 Permalink | Reply
    Tags: , , , QCD: Quantum Chromodynamics,   

    From CERN ALICE: “QGP: 17 years after the public announcement…” 

    CERN New Masthead


    31 January 2017
    Virginia Greco

    Interview with Luciano Maiani, DG of CERN from 1999 to 2003, who gave the announcement talk of the discovery of QGP at the SPS.

    CERN Super Proton Synchrotron

    About 25 years after its first theoretical prediction, the new state of matter called quark-gluon plasma (QGP) was observed at CERN’s SPS. The public announcement was made on the 10th of February 2000 by Luciano Maiani, then Director General of CERN. At the event organized by ALICE to celebrate the 30-year anniversary of the first heavy-ion collisions at the SPS, Maiani gave his account of this piece of physics history.

    We had an interview with him after the seminar.

    After one year of your mandate as DG of CERN you had the honour and the responsibility to announce that evidence of the existence of QGP had been found at the SPS. How did you experience those events?

    At that time I was not an expert in heavy ion physics, because I hadn’t worked in the field. Nevertheless, I was aware of the phase transition issue and of the two existing visions about what happens to nuclear matter at very high temperature. On one side there was the theory that matter would break down into a gas of quarks and gluons (and temperature could be freely increased), on the other side the model of Hagedorn about the existence of an upper limit of the temperature reachable, which could be estimated from the hadron spectrum to be 170-180 MeV.

    With the development of QCD it was possible to combine these two models. In particular, in 1975 Nicola Cabibbo and Giorgio Parisi suggested that the Hagedorn limit temperature is just the critical temperature of a phase transition from a gas of hadrons, made of confined quarks, to a gas of deconfined quarks and gluons (the QGP). These works had convinced the experts in the field.
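To put the 170-180 MeV critical temperature into everyday units, divide by the Boltzmann constant: it corresponds to roughly two trillion kelvin, about a hundred thousand times hotter than the Sun's core. A quick conversion sketch:

```python
# Convert the Hagedorn / QCD critical temperature from particle-physics
# units (MeV) to kelvin via T = E / k_B.
K_B = 8.617e-11     # Boltzmann constant in MeV per kelvin
T_CRIT_MEV = 170.0  # lower end of the 170-180 MeV estimate quoted above

t_kelvin = T_CRIT_MEV / K_B
print(f"{T_CRIT_MEV} MeV corresponds to about {t_kelvin:.2e} K")
```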

    When the moment came to decide whether to make a public announcement about what the SPS had found, I discussed with many of the people involved, such as Claude Detraz, who was Director for Fixed Target and Future Programmes during my mandate, Reinhard Stock and Hans Specht. After examining the data and collecting opinions, I concluded that we had convincing signals that what we were observing was indeed the quark-gluon plasma.

    But the public announcement was cautious, wasn’t it? Was there still some doubt?

    I think that the announcement was quite clear. I have the text of it with me, it reads: “The data provide evidence for colour deconfinement in the early collision stage and for a collective explosion of the collision fireball in its late stages. The new state of matter exhibits many of the characteristic features of the theoretically predicted Quark-Gluon Plasma.” The key word is “evidence”, not discovery, and the evidence was there, indeed.

    In the talk I gave at that time I also described the concept of quark deconfinement using an analogy with the snow on the Jura mountains, which I particularly like. We can consider a quark as a skier: when the temperature is not very low, on the mountain there are only patches of snow in which the skier can move. When the temperature decreases and the snow increases, the skier can move across bigger and bigger spaces, up to the point where he or she can freely sweep long distances. The same can be said for a quark confined in a hadron (the patch), which becomes free when the temperature increases.

    Of course at that moment the still-popular idea was that we were dealing with a phase transition to a gaseous state in which quarks and gluons would be asymptotically free. Later RHIC showed that the situation is more complicated and that this new state is much more like a liquid with very low viscosity than like a gas.

    The announcement came just a few months before the start of the programme of RHIC. Was there some controversy about the timing?

    The Solenoidal Tracker at the Relativistic Heavy Ion Collider (RHIC)

    We were almost at the conclusion of a long and careful experimental programme at the SPS, so a summing-up was in order. In addition, as I said, we thought there were the elements for a public announcement. And this has been proved right by later experiments.

    Somebody thought that it would make RHIC, which was about to enter into operation, appear useless. But that was not the case, since much more was left to study. Indeed, in the same announcement talk I said: “the higher energies of RHIC and LHC are needed to complete the picture and provide a full characterization of the Quark-Gluon Plasma”.

    In your opinion, what is the future of this branch of research?

    Well, there are still many open problems, things that need to be studied further.

    It is very important to explore the properties of this new state of matter and the connected phenomena, to get a more precise physical picture of the new state.

    Personally, I think that there is also another possible line of research in this field: to study the production of those exotic hadronic resonances that are not included in the scheme of baryons and mesons (i.e. three-quark or quark-antiquark structures). These resonances have been observed in CMS and LHCb in pp collisions, and it would be interesting to study how they are produced in heavy-ion collisions. It could give us indications about what these objects are, tell us if they are molecules made of colourless hadrons or new states which are configurations of quarks and antiquarks (different from mesons) that include subcomponents connected by colour bonds.

    ALICE could provide an important contribution to this research. It is not easy to observe such exotic states in heavy-ion collisions but I think it is worth trying.


    An iconic view of the universe
    Inflationary Universe. NASA/WMAP

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Meet CERN in a variety of places:

    Cern Courier

    CERN/ATLAS detector


    CERN/CMS Detector




    CERN/LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

    Quantum Diaries

  • richardmitnick 5:57 pm on June 20, 2016 Permalink | Reply
    Tags: , , , , QCD: Quantum Chromodynamics   

    From Don Lincoln at FNAL: “QCD: Quantum Chromodynamics” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    FNAL Don Lincoln
    Don Lincoln

    The strongest force in the universe is the strong nuclear force, and it governs the behavior of quarks and gluons inside protons and neutrons. The theory that describes this force is quantum chromodynamics, or QCD. In this video, Fermilab’s Dr. Don Lincoln explains the intricacies of this dominant component of the Standard Model.

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Watch, enjoy, learn.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.
