Tagged: Dark Energy

  • richardmitnick 3:36 pm on February 16, 2018 Permalink | Reply
    Tags: Dark Energy, European backed missions

    From CERN Courier: “Europe defines astroparticle strategy” 


    CERN Courier

    Feb 16, 2018


    Multi-messenger astronomy, neutrino physics and dark matter are among several topics in astroparticle physics set to take priority in Europe in the coming years, according to a report by the Astroparticle Physics European Consortium (APPEC).

    The APPEC strategy for 2017–2026, launched at an event in Brussels on 9 January, is the culmination of two years of consultation with the astroparticle and related communities. It involved some 20 agencies in 16 countries and includes representation from the European Committee for Future Accelerators, CERN and the European Southern Observatory (ESO).

    Lying at the intersection of astronomy, particle physics and cosmology, astroparticle physics is well placed to search for signs of physics beyond the standard models of particle physics and cosmology. As a relatively new field, however, European astroparticle physics does not have dedicated intergovernmental organisations such as CERN or ESO to help drive it. In 2001, European scientific agencies founded APPEC to promote cooperation and coordination, and specifically to formulate a strategy for the field.

    Building on earlier strategies released in 2008 and 2011, APPEC’s latest roadmap presents 21 recommendations spanning scientific issues, organisational aspects and societal factors such as education and industry, helping Europe to exploit tantalising potential for new discoveries in the field.

    The recent detection of gravitational waves from the merger of two neutron stars (CERN Courier December 2017 p16) opens a new line of exploration based on the complementary power of charged cosmic rays, electromagnetic waves, neutrinos and gravitational waves for the study of extreme events such as supernovae, black-hole mergers and the Big Bang itself. “We need to look at cross-fertilisation between these modes to maximise the investment in facilities,” says APPEC chair Antonio Masiero of the INFN and the University of Padova. “This is really going to become big.”

    APPEC strongly supports Europe’s next-generation ground-based gravitational interferometer, the Einstein Telescope, and the space-based LISA detector.

    ASPERA Einstein Telescope

    ESA/NASA eLISA, the space-based future of gravitational-wave research

    In the neutrino sector, KM3NeT is being completed for high-energy cosmic neutrinos at its site in Sicily, as well as for precision studies of atmospheric neutrinos at its French site near Toulon.

    Artist’s impression of the KM3NeT neutrino telescope

    Europe is also heavily involved in the upgrade of the leading cosmic-ray facility the Pierre Auger Observatory in Argentina.

    Pierre Auger Observatory in the western Mendoza Province, Argentina, near the Andes, at an altitude of 1330 m–1620 m, average ~1400 m

    Significant R&D work is taking place at CERN’s neutrino platform for the benefit of long- and short-baseline neutrino experiments in Japan and the US (CERN Courier July/August 2016 p21), and Europe is host to several important neutrino experiments. Among them are KATRIN at KIT in Germany, which is about to begin measurements of the neutrino absolute mass scale, and experiments searching for neutrinoless double-beta decay (NDBD) such as GERDA and CUORE at INFN’s Gran Sasso National Laboratory (CERN Courier December 2017 p8).


    KIT KATRIN experiment

    CUORE experiment, UC Berkeley, at the Italian National Institute for Nuclear Physics’ (INFN) Gran Sasso National Laboratories (LNGS): a search for neutrinoless double-beta decay

    Gran Sasso: Laboratori Nazionali del Gran Sasso, located in the Abruzzo region of central Italy

    There are plans to join forces with experiments in the US to build the next generation of NDBD detectors. APPEC has a similar vision for dark matter, aiming to converge next year on plans for an “ultimate” 100-tonne scale detector based on xenon and argon via the DARWIN and Argo projects.

    DARWIN Dark Matter experiment

    APPEC also supports ESA’s Euclid mission, which will establish European leadership in dark-energy research, and encourages continued European participation in the US-led DES and LSST ground-based projects.

    Dark Energy Camera [DECam], built at FNAL


    LSST telescope, currently under construction on Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    Following on from ESA’s successful Planck mission, APPEC strongly endorses a European-led satellite mission, such as COrE, to map the cosmic microwave background. The consortium also plans to enhance its interactions with its present observers, ESO and CERN, in areas of mutual interest.

    ESA/Planck

    “It is important at this time to put together the human forces,” says Masiero. “APPEC will exercise influence in the European Strategy for Particle Physics, and has a significant role to play in the next European Commission Framework Project, FP9.”

    A substantial investment is needed to build the next generation of astroparticle-physics research infrastructure, the report concedes. According to Masiero, European agencies within APPEC currently invest around €80 million per year in astroparticle-related activities, in addition to funding large research infrastructures. A major effort is necessary for Europe to keep its leading position. “Many young people are drawn into science by challenges like dark matter and, together with Europe’s existing research infrastructures in the field, we have a high technological level and are pushing industries to develop new technologies,” continues Masiero. “There are great opportunities ahead in European astroparticle physics.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 9:28 am on January 18, 2018 Permalink | Reply
    Tags: Dark Energy, Λ Cosmological Constant, Speed of universe’s expansion remains elusive, Type 1a supernovas as “standard candles”? Maybe not

    From ScienceNews: “Speed of universe’s expansion remains elusive” 

    ScienceNews

    January 16, 2018
    Tom Siegfried

    In August of 2011, researchers discovered SN 2011fe, a type Ia supernova 21 million light-years away in galaxy M101 (images show the galaxy before and after the supernova, with the supernova circled at right). Studies using type Ia supernovas as “standard candles” to measure how fast the universe expands (the Hubble constant) produce a result in conflict with other data used to infer the cosmic growth rate. NASA, Swift, Peter Brown, Univ. of Utah

    NASA Neil Gehrels Swift Observatory

    Unless you are a recent arrival from another universe, you’ve no doubt heard that this one is expanding. It’s getting bigger all the time. What’s more, its growth rate is accelerating. Every day, the universe expands a little bit faster than it did the day before.

    Those day-to-day differences are negligible, though, for astronomers trying to measure the universe’s expansion rate. They want to know how fast it is expanding “today,” meaning the current epoch of cosmic history. That rate is important for understanding how the universe works, knowing what its ultimate fate will be and even what it is made of. After all, the prime mission of the Hubble Space Telescope when it was launched in 1990 was to help determine that expansion rate (known, not coincidentally, as the Hubble constant, named for the astronomer Edwin Hubble).

    Since then, evidence from Hubble (the telescope) and other research projects has established a reasonably precise answer for the Hubble constant: 73, in the units commonly used for this purpose. (It means that two independent astronomical bodies separated by 3.26 million light-years will appear to be moving away from each other at 73 kilometers per second.) Sure, there’s a margin of error, but not much. The latest analysis from one team, led by Nobel laureate Adam Riess, puts the Hubble constant in the range of 72–75, as reported in a paper posted online January 3 in The Astrophysical Journal. Considering that as late as the 1980s astronomers argued about whether the Hubble constant was closer to 40 or 90, that’s quite an improvement in precision.
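    The parenthetical definition of the units is just Hubble’s law, v = H0 × d. A minimal sketch of that arithmetic (function names are mine, and 3.26 million light-years per megaparsec is the standard conversion):

```python
# Hubble's law, v = H0 * d, with the article's quoted value of H0.
H0 = 73.0              # km/s per megaparsec
LY_PER_MPC = 3.26e6    # one megaparsec is about 3.26 million light-years

def ly_to_mpc(light_years: float) -> float:
    """Convert a distance in light-years to megaparsecs."""
    return light_years / LY_PER_MPC

def recession_velocity_km_s(distance_mpc: float) -> float:
    """Apparent recession speed (km/s) at the given distance (Mpc)."""
    return H0 * distance_mpc

# Two bodies separated by 3.26 million light-years (1 Mpc) recede at 73 km/s:
print(recession_velocity_km_s(ly_to_mpc(3.26e6)))  # 73.0
```

    At SN 2011fe’s quoted distance of 21 million light-years the same relation gives only a few hundred km/s, which is why such nearby supernovae anchor the distance ladder rather than probe the expansion directly.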

    But there’s a snag in this success. Current knowledge of the universe suggests a way to predict what the Hubble constant ought to be. And that prediction gives a probable range of only 66–68. The two methods don’t match.

    “This is very surprising, I think, and very interesting,” Riess, of the Space Telescope Science Institute in Baltimore, said in a talk January 9 at a meeting of the American Astronomical Society.

    It’s surprising because astrophysicists and cosmologists thought they had pretty much figured the universe out. It’s made up of a little bit of ordinary matter, a lot of some exotic “dark matter” of unknown identity, and even more of a mysterious energy permeating the vacuum of space, exerting gravitational repulsion. Remember that acceleration of the expansion rate? It implies the existence of such energy. Because nobody knows what it is, people call it “dark energy,” while suspecting that its real name is lambda, the Greek letter that stands for “cosmological constant.” (It’s called a constant because any part of space should possess the same amount of vacuum energy.) Dark energy contributes something like 70 percent of the total mass-energy content of the universe, various lines of evidence indicate.

    If all that’s right, then it’s not all that hard to infer how fast the universe should be expanding today. You just take the recipe of matter, dark matter and dark energy and add some ghostly subatomic particles known as neutrinos. Then you carefully measure the temperature of deep space, where the only heat is the faint glow remaining from the Big Bang. That glow, the cosmic microwave background radiation, varies slightly in temperature from point to point. From the size of those variations, you can calculate how far the radiation from the Big Bang has been traveling to reach our telescopes. Combine that with the universe’s mass-energy recipe, and you can calculate how fast the universe is expanding. (You can, in fact, do this calculation at home with the proper mathematical utensils.)
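    The “calculation at home” alluded to above can be sketched for one closely related quantity: given a mass-energy recipe and a value of H0, the age of a flat universe follows from integrating the Friedmann equation. The parameter values below (H0 = 67, matter fraction ≈ 0.315) are illustrative Planck-like numbers of my choosing, not the article’s actual CMB analysis:

```python
import math

# Age of a flat Lambda-CDM universe: t0 = integral_0^1 da / (a * H(a)),
# where H(a) = H0 * sqrt(Omega_m / a^3 + Omega_Lambda).
H0 = 67.0                 # km/s per Mpc (the CMB-inferred value in the text)
OMEGA_M = 0.315           # matter (ordinary + dark), illustrative
OMEGA_L = 1.0 - OMEGA_M   # dark energy, assuming a flat universe

KM_PER_MPC = 3.0857e19    # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16    # seconds in one billion years

def age_gyr(n: int = 200_000) -> float:
    """Numerically integrate da / (a * H(a)) over a in (0, 1), midpoint rule."""
    h0_per_sec = H0 / KM_PER_MPC  # H0 converted to 1/s
    total = 0.0
    for i in range(n):
        a = (i + 0.5) / n  # midpoint of each sub-interval of the scale factor
        h = h0_per_sec * math.sqrt(OMEGA_M / a**3 + OMEGA_L)
        total += 1.0 / (a * h) / n
    return total / SEC_PER_GYR

print(f"{age_gyr():.1f}")  # ~13.9 billion years
```

    Raising H0 to 73 with the same recipe shrinks the computed age by several hundred million years, which is one way to feel the tension the article describes.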

    An international team’s project using cosmic microwave background [CMB] data inferred a Hubble constant of 67, substantially less than the 73 or 74 based on actually measuring the expansion (by analyzing how the light from distant supernova explosions has dimmed over time).

    CMB per ESA/Planck

    When this discrepancy first showed up a few years ago, many experts believed it was just a mirage that would fade with more precise measurement. But it hasn’t.

    “This starts to get pretty serious,” Riess said at the astronomy meeting. “In both cases these are very mature measurements. This is not the first time around for either of these projects.”

    One commonly proposed explanation contends that the supernova studies are measuring the local value of the Hubble constant. Perhaps we live in a bubble, with much less matter than average, skewing expansion measurements. In that case, the cosmic microwave background data might provide a better picture of the “global” expansion rate for the whole universe. But supernovas observed by the Hubble telescope extend far enough out to refute that possibility, Riess said.

    “Even if you thought we lived in a void…, you still are basically stuck with the same problem.”

    Consequently it seems most likely that something is wrong with the matter-energy recipe for the universe (technically, the cosmological standard model) used in making the expansion rate prediction. Maybe the vacuum energy driving cosmic acceleration is not a cosmological constant after all, but some other sort of field filling space. Such a field could vary in strength over time and throw off the calculations based on a constant vacuum energy. But Riess pointed out that the evidence is growing stronger and stronger that the vacuum energy is just the cosmological constant. “I would say there we have less and less wiggle room.”

    Another possibility, appealing to many theorists, is the existence of a new particle, perhaps a fourth neutrino or some other relativistic (moving very rapidly) particle zipping around in the early universe.

    “Relativistic particles — theorists have no trouble inventing new ones, ones that don’t violate anything else,” Riess said. “Many of them are quite giddy about the prospect of some evidence for that. So that would not be a long reach.”

    Other assumptions built into the current cosmological standard model might also need to be revised. Dark matter, for example, is presumed to be very aloof from other forms of matter and energy. But if it interacted with radiation in the early universe, it could have an effect similar to that of relativistic particles, changing how the energy in the early universe is divided up among its components. Such a change in energy balance would alter how much the universe expands at early times, corrupting the calibrations needed to infer the current expansion rate.

    It’s not the first time that determining the Hubble constant has provoked controversy. Edwin Hubble himself initially (in the 1930s) vastly overestimated the expansion rate. Using his rate, calculations indicated that the universe was much younger than the Earth, an obvious contradiction. Even by the 1990s, some Hubble constant estimates suggested an age for the universe of under 10 billion years, whereas many stars appeared to be several billion years older than that.
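    The contradiction described here is visible in the Hubble time, 1/H0, a rough age scale for an expanding universe. Hubble’s early estimate was roughly 500 (km/s)/Mpc; a quick conversion (a sketch, with standard unit constants) shows why that was untenable:

```python
# Hubble time, 1/H0: a rough age scale for an expanding universe.
KM_PER_MPC = 3.0857e19   # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16   # seconds in one billion years

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """1/H0, converted from (km/s)/Mpc units into billions of years."""
    return KM_PER_MPC / h0_km_s_mpc / SEC_PER_GYR

# Hubble's early ~500 (km/s)/Mpc implies a universe of only ~2 billion years,
# younger than the Earth (~4.5 billion years) -- the contradiction above.
print(f"{hubble_time_gyr(500.0):.1f}")  # ~2.0
print(f"{hubble_time_gyr(73.0):.1f}")   # ~13.4
```

    The true age also depends on the mass-energy recipe, which is exactly why the missing dark energy skewed the 1990s age estimates mentioned below.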

    Hubble’s original error could be traced to lack of astronomical knowledge. His early overestimates turned out to be signals of a previously unknown distinction between different generations of stars, some younger and some older, Riess pointed out. That threw off distance estimates to some stars that Hubble used to estimate the expansion rate. Similarly, in the 1990s the expansion rate implied too young a universe because dark energy was not then known to exist and therefore was not taken into account when calculating the universe’s age.

    So the current discrepancy, Riess suggested, might also be a signal of some astronomical unknown, whether a new particle, new interactions of matter and radiation, or a phenomenon even more surprising — something that would really astound a visitor from another universe.

    See the full article here.

    Science News is edited for an educated readership of professionals, scientists and other science enthusiasts. Written by a staff of experienced science journalists, it treats science as news, reporting accurately and placing findings in perspective. Science News and its writers have won many awards for their work; here’s a list of many of them.

    Published since 1922, the biweekly print publication reaches about 90,000 dedicated subscribers and is available via the Science News app on Android, Apple and Kindle Fire devices. Updated continuously online, the Science News website attracted over 12 million unique online viewers in 2016.

    Science News is published by the Society for Science & the Public, a nonprofit 501(c)(3) organization dedicated to public engagement in scientific research and education.


     
  • richardmitnick 3:29 pm on December 1, 2017 Permalink | Reply
    Tags: André Maeder, Dark Energy, Katie Mack, The strongest evidence for dark matter comes not from the motions of stars and galaxies “but from the behavior of matter on cosmological scales as measured by signatures in the cosmic microwave back

    From COSMOS: “Radical dark matter theory prompts robust rebuttals” 

    Cosmos Magazine bloc

    COSMOS Magazine

    01 December 2017
    Richard A Lovett

    Most cosmologists invoke dark energy to explain the accelerating expansion of the universe. A few are not so certain. Mina De La O / Getty Images

    In 1887, physicists Albert Michelson and Edward Morley set up an array of prisms and mirrors in an elegant attempt to measure the passage of the Earth through what was then known as “luminiferous ether” – a mysterious substance through which light waves were believed to propagate, like sound waves through air.

    The experiment should have worked, but in one of the most famous results of nineteenth-century physics, no ether movement was detected. That was a head-scratcher until 1905, when Albert Einstein took the results at face value and used them as a cornerstone in developing his theory of relativity.

    Today, physicists are hunting for two equally mysterious commodities: dark matter and dark energy. And maybe, suggests a recent line of research from astrophysicist André Maeder at the University of Geneva, Switzerland, they too don’t exist, and scientists need to again revise their theories, this time to look for ways to explain the universe without the need for either of them.

    Dark matter was first proposed all the way back in 1933, when astrophysicists realised there wasn’t enough visible matter to explain the motions of stars and galaxies. Instead, there appeared to be a hidden component contributing to the gravitational forces affecting their motion. It is now believed that even though we still have not successfully observed it, dark matter is five times more prevalent in the universe than normal matter.

    Dark energy came into the picture more recently, when astrophysicists realised that the expansion of the universe could not be explained without the existence of some kind of energy that provides a repulsive force that steadily accelerates the rate at which galaxies are flying away from each other. Dark energy is believed to be even more prevalent than dark matter, comprising a full 70% of the universe’s total mass-energy.

    Maeder’s argument, published in a series of papers this year in The Astrophysical Journal, is that maybe we don’t need dark matter and dark energy to explain these effects. Maybe it’s our concept of Einsteinian space-time that’s wrong.

    His argument begins with the conventional cosmological understanding that the universe started with a Big Bang, about 13.8 billion years ago, followed by continual expansion. But in this model, there is a possibility that hasn’t been taken into account, he says: “By that I mean the scale invariance of empty space; in other words the empty space and its properties do not change following a dilation or contraction.”

    If so, that would affect our entire understanding of gravity and the evolution of the universe.

    Based on this hypothesis, Maeder found that with the right parameters he could explain the expansion of the universe without dark energy. He could also explain the motion of stars and galaxies without the need for dark matter.

    To say that Maeder’s ideas are controversial is an understatement. Katie Mack, an astrophysicist at the University of Melbourne in Australia, calls them “massively overhyped.” And physicist and blogger Sabine Hossenfelder of the Frankfurt Institute for Advanced Studies, Germany, wrote that while Maeder “clearly knows his stuff,” he does not yet have “a consistent theory.”

    Specifically, Mack notes that the strongest evidence for dark matter comes not from the motions of stars and galaxies, “but from the behavior of matter on cosmological scales, as measured by signatures in the cosmic microwave background [CMB] and the distribution of galaxies.” Gravitational lensing of distant objects by nearer galaxies also reveals the existence of dark matter, she says.

    CMB per ESA/Planck

    ESA/Planck

    Gravitational Lensing NASA/ESA

    Also, she notes that while there are a “whole heap” of ways to modify Einstein’s theories, these are “nothing new and not especially interesting.”

    The challenge, she says, is to reproduce everything, including “dark matter and dark energy’s biggest successes.” Until a new theory can produce “precise agreement” with measurements of a wide range of cosmic variables, she says, there’s no reason “at all” to throw out the existing theory.

    Dark matter researcher Benjamin Roberts, at the University of Nevada, Reno, US, agrees. “The evidence for dark matter is very substantial and comes from a large number of sources,” he says. “Until a single theory can explain all of these observations, there is no reason to doubt the existence of dark matter.”

    That said, this doesn’t mean that “new physics” theories such as Maeder’s should be ignored. “They should be, and are, taken seriously,” he says.

    Or as Maeder puts it, “Nothing can ever be taken for granted.”

    See the full article here.


     
  • richardmitnick 12:56 pm on November 25, 2017 Permalink | Reply
    Tags: Dark Energy, Scientific Theories Never Die Not Unless Scientists Choose To Let Them

    From Ethan Siegel: “Scientific Theories Never Die, Not Unless Scientists Choose To Let Them” 

    From Ethan Siegel

    Nov 23, 2017

    As wonderful as the evidence that supports or invalidates a theory is, it can never truly kill the ones that don’t work out.

    When it comes to science, we like to think that we formulate hypotheses, test them, throw away the ones that fail to match, and continue testing the successful ones until only the best ideas are left. But the truth is a lot muddier than that. The actual process of science involves tweaking your initial hypothesis over and over, trying to pull it in line with what we already know. It involves a leap of faith that when you formulate your theory correctly, the predictions it makes will be even more successful, across the board, than any other alternatives. And when things don’t work out, it doesn’t always necessitate abandoning your original hypothesis. In fact, most scientists don’t. In a very real way, scientific theories can never truly be killed. The only way they ever go away is if people stop working on them.

    Without dark energy, the Universe wouldn’t be accelerating. But to explain the distant supernovae we see, among other features, dark energy (or something that mimics it exactly) appears to be necessary. Possible models of the expanding Universe. Image credit: NASA & ESA.

    When distant supernovae were first discovered to be fainter than they otherwise should have been based on their redshift, it brought about a revolution in cosmology. The way the Universe expands is inextricably linked to the matter and energy present within it, and so the goal of cosmology, for a long time, was to measure the expansion rate and how it changes over time.

    The expectation was that it would either recollapse or expand forever, or remain in an in-between state right on the border between those two. Instead, these supernovae showed that a fourth option was most likely: the most distant galaxies of all were speeding up as they moved away from us. There must be some new form of energy in the Universe — dark energy — different from all other forms of energy, permeating all of space.

    The Bubble Nebula lies on the outskirts of a supernova remnant from an explosion thousands of years ago. If distant supernovae are in dustier environments than their modern-day counterparts, perhaps they’re not indicative of dark energy after all. Image credit: T.A. Rector/University of Alaska Anchorage, H. Schweiker/WIYN and NOAO/AURA/NSF.


    NOAO WIYN 3.5 meter telescope at Kitt Peak, AZ, USA, Altitude 2,096 m (6,877 ft)

    But for many years, most physicists and astronomers approached this idea with skepticism, wondering if there weren’t another explanation. Perhaps, one alternative theory posited, space wasn’t expanding with an extra value due to some form of dark energy, but rather there was something occurring at large distances to block the light. So that became a proposition: there was some additional dust in the distant Universe, and the reason the supernovae appeared fainter wasn’t because they were farther away due to an extra expansion of space, but because dust was blocking the light.

    Infrared light penetrates more dust and gas than visible light, allowing details to become visible in this nebula. Similarly, blue light is blocked preferentially compared to red light, indicating that if dust were responsible for dimming supernovae, they’d appear different in color from their nearby counterparts. Image credit: NASA, ESA, and the Hubble Heritage Team (STScI/AURA), and J. Hester.

    NASA/ESA Hubble Telescope

    Dust grains, however, come in particular sizes, and the size of the dust grains determines which wavelengths of light are preferentially blocked, with most dust better at blocking blue than red light. Measurements of different wavelengths of light, however, showed that both red and blue light were reduced by equal amounts.

    Was that sufficient to rule out the “dust” theory? In that incarnation, yes. But what if the dust in the distant Universe was of a new type, that blocked all the wavelengths of light equally? This undiscovered type of dust, dubbed “grey dust,” could block all wavelengths equally. So we needed some way to put that to the test, and that involved looking at supernovae at a variety of distances, to see whether dust would continue to block more and more light at greater distances, as more and more “grey dust” would tend to do.

    The observation of even more distant supernovae allowed us to discern the difference between ‘grey dust’ and dark energy, ruling the former out. But the modification of ‘replenishing grey dust’ is still indistinguishable from dark energy. Image credit: A.G. Riess et al. (2004), The Astrophysical Journal, Volume 607, Number 2.

    It didn’t. So does that mean dark energy must be real? Not necessarily, because you can modify your “grey dust” explanation to include dust that changes in density and location over time: “replenishing grey dust.” By the addition of enough extra free parameters, caveats, behaviors, or modifications to your theory, you can literally salvage any idea. As long as you’re willing to tweak what you’ve come up with sufficiently, you can never rule anything out.

    There have been many ideas in this vein that have the same problem (or feature) inherent to them: so long as you’re willing to make the theory more complicated, you can fit any data that comes back. The discovery of the CMB ruled out the Steady-State theory, but its proponents added reflected starlight to explain that leftover glow. When the spectrum of the CMB was measured, ruling out reflected starlight, they added a series of bursts and “mini-bangs” in the past, creating a Quasi-Steady-State theory. When the fluctuations in the CMB’s temperature were discovered, ruling that out, its proponents tweaked it still further.

    Three different types of measurements, distant stars and galaxies, the large scale structure of the Universe, and the fluctuations in the CMB, tell us the expansion history of the Universe, and rule out alternatives to the Big Bang. Image credit: NASA/ESA Hubble (top L), SDSS (top R), ESA and the Planck Collaboration (bottom).

    SDSS Telescope at Apache Point Observatory, NM, USA, Altitude 2,788 meters (9,147 ft)

    ESA/Planck

    This behavior isn’t unique to scientists, but has been a feature (or bug) of science for centuries. It led Max Planck, more than 100 years ago, to make the following now-famous statement:

    “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

    This is often paraphrased as “physics advances one funeral at a time,” owing to the fact that ideas cannot be proven wrong as we commonly think. Rather, they need to be tweaked so thoroughly and so frequently that they lose their predictive power, instead always playing catch-up as new observations come in.

    Combining quantum field theory and the standard model of particle physics with General Relativity enables us to calculate practically everything we can conceive of in the Universe at a fundamental level. Image credit: SLAC National Accelerator Laboratory.

    SLAC Campus

    It’s why theories like quantum field theory and general relativity are so powerful: even after all these decades, they’re still making new predictions that are being successfully borne out by experiment. It’s why dark matter is here to stay, as its successful predictions include the speeds of galaxy pairs, the large-scale cosmic web, the fluctuations in the CMB, baryon acoustic oscillations, gravitational lensing and more. It’s why cosmic inflation — with its successful predictions including superhorizon fluctuations, the acoustic peaks in the Big Bang’s leftover glow, the departure from scale invariance, etc. — is the leading theory for the origin of the Big Bang. And it’s why their alternatives are so thoroughly fringe.

    Alan Guth, Highland Park High School and M.I.T., who first proposed cosmic inflation

    HPHS Owls

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    Alan Guth’s notes. http://www.bestchinanews.com/Explore/4730.html

    As ripples through space arising from distant gravitational waves pass through our Solar System, including Earth, they ever-so-slightly compress and expand the space around them. Alternatives can be constrained incredibly tightly thanks to our measurements in this regime. Image credit: European Gravitational Observatory, Lionel BRET/EUROLIOS.


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    You can always add another loophole, parameter, or epicycle to your own pet theory to make it be “not ruled out.” I, along with most physicists, feel this way about a great many non-standard alternatives, including MOND, f(R) gravity, the Quasi-Steady-State model, tired-light cosmology, the plasma Universe, and so on. At some point, you just have to say “enough.” You have to recognize that the level of contortions you need to perform are absurd, and that these theories don’t have any useful predictive power. They’re simply an example of special pleading.

    The warm-hot intergalactic medium (WHIM) has been seen before, along incredibly overdense regions, like the Sculptor wall, illustrated above. But it’s conceivable that there are still surprises out there in the Universe, and our current understanding will once again be subject to a revolution. Image credit: Spectrum: NASA/CXC/Univ. of California Irvine/T. Fang. Illustration: CXC/M. Weiss.

    Of course, their adherents don’t think so. They think they’re being marginalized, oppressed, ignored, or not taken seriously. On very rare occasions, they’re actually correct, and that’s when a scientific revolution occurs. It’s important to keep your mind open to those possibilities, to explore them, and to consider what it would look like if these alternatives were correct after all. But for the overwhelming majority of scientists working on these alternative ideas, their life’s work will turn out to be a blind alley, and their ideas will die out when they (and possibly their students) die. It’s both sad and tragic to look back at history and realize that the last decades of the scientific careers of Einstein, Hoyle, Burbidge, Schrödinger, and many more were a total waste. But whether even the most brilliant scientist accepts a new scientific truth or not is irrelevant. Our knowledge and understanding march forward.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan.

     
  • richardmitnick 2:00 pm on September 29, 2017 Permalink | Reply
    Tags: , , , , , Dark Energy, ,   

    From CfA: “New Insights on Dark Energy” 

    Harvard Smithsonian Center for Astrophysics


    Center For Astrophysics

    Inflationary Universe. NASA/WMAP

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    The universe is not only expanding – it is accelerating outward, driven by what is commonly referred to as “dark energy.”

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7200 feet

    The term is a poetic analogy to the label for dark matter, the mysterious material that dominates the matter in the universe and that really is dark because it does not radiate light (it reveals itself via its gravitational influence on galaxies).

    Dark Matter Research

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    Two explanations are commonly advanced to explain dark energy. The first, as Einstein once speculated, is that gravity itself causes objects to repel one another when they are far enough apart (he added this “cosmological constant” term to his equations). The second explanation hypothesizes (based on our current understanding of elementary particle physics) that the vacuum has properties that provide energy to the cosmos for expansion.

    For several decades cosmologists have successfully used a relativistic equation with dark matter and dark energy to explain increasingly precise observations of the cosmic microwave background, the cosmological distribution of galaxies, and other large-scale cosmic features.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    But as the observations have improved, some apparent discrepancies have emerged. One of the most notable is the age of the universe: there is an almost 10% difference between measurements inferred from the Planck satellite data and those from so-called Baryon Acoustic Oscillation experiments. The former relies on far-infrared and submillimeter measurements of the cosmic microwave background [CMB], the latter on the spatial distribution of visible galaxies.
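    A discrepancy like this is usually quantified both as a fractional difference and as a "tension" in units of the combined standard deviation. A minimal sketch of that arithmetic (the ages and error bars below are illustrative placeholders, not the actual Planck or BAO numbers):

```python
import math

def fractional_difference(x1, x2):
    """Fractional difference between two measurements, relative to their mean."""
    return abs(x1 - x2) / ((x1 + x2) / 2)

def tension_sigma(x1, err1, x2, err2):
    """Gaussian tension between two independent measurements,
    in units of the combined (quadrature-summed) standard deviation."""
    return abs(x1 - x2) / math.sqrt(err1**2 + err2**2)

# Hypothetical age estimates in billions of years, chosen to illustrate
# an "almost 10%" difference:
age_a, err_a = 13.8, 0.1
age_b, err_b = 12.6, 0.4

print(fractional_difference(age_a, age_b))          # ~0.09, i.e. almost 10%
print(tension_sigma(age_a, err_a, age_b, err_b))    # ~2.9 sigma
```

The point of the sigma measure is that a large fractional difference matters only if both error bars are small compared to it.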

    BOSS Supercluster Baryon Oscillation Spectroscopic Survey (BOSS)

    CMB per ESA/Planck

    ESA/Planck

    CfA astronomer Daniel Eisenstein was a member of a large consortium of scientists who suggest that most of the difference between these two methods, which sample different components of the cosmic fabric, could be reconciled if the dark energy were not constant in time. The scientists apply sophisticated statistical techniques to the relevant cosmological datasets and conclude that if the dark energy term varied slightly as the universe expanded (though still subject to other constraints), it could explain the discrepancy. Direct evidence for such a variation would be a dramatic breakthrough, but so far has not been obtained. One of the team’s major new experiments, the Dark Energy Spectroscopic Instrument (DESI) Survey…

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    …could settle the matter. It will map over twenty-five million galaxies in the universe, reaching back to objects only a few billion years after the big bang, and should be completed sometime in the mid-2020s.
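    A dark-energy term that "varied slightly as the universe expanded" is commonly parametrized with the standard CPL equation of state w(a) = w0 + wa(1 − a), where a is the cosmic scale factor (a = 1 today); a constant cosmological constant corresponds to w0 = −1, wa = 0. A minimal sketch of the resulting density evolution — the textbook formula, not code from the paper cited below:

```python
import math

def rho_de_ratio(a, w0=-1.0, wa=0.0):
    """Dark-energy density relative to today, rho_DE(a) / rho_DE(a=1),
    for the CPL equation of state w(a) = w0 + wa * (1 - a).
    Integrating d(ln rho)/d(ln a) = -3 * (1 + w(a)) gives:
    rho/rho_0 = a**(-3*(1 + w0 + wa)) * exp(-3 * wa * (1 - a))."""
    return a ** (-3.0 * (1.0 + w0 + wa)) * math.exp(-3.0 * wa * (1.0 - a))

# A pure cosmological constant (w0 = -1, wa = 0) has the same density
# at every epoch:
assert rho_de_ratio(0.5) == 1.0

# With an evolving equation of state, the dark-energy density at a = 0.5
# (when the universe was half its present scale) differs from today's:
print(rho_de_ratio(0.5, w0=-0.9, wa=0.3))
```

It is exactly this kind of departure from a constant density that surveys such as DESI aim to detect or rule out.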

    Reference(s):

    Dynamical Dark Energy in Light of the Latest Observations, Gong-Bo Zhao et al., Nature Astronomy 1, 627 (2017)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory (SAO) is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory (HCO), founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

     
  • richardmitnick 2:17 pm on September 26, 2017 Permalink | Reply
    Tags: , , Dark Energy, EGS Collab- Enhanced Geothermal Systems Collaboration, , Listening to the Earth to harness geothermal energy, SIGMA-V,   

    From SURF: “Listening to the Earth to harness geothermal energy “ 

    SURF logo
    Sanford Underground levels

    Sanford Underground Research facility

    September 25, 2017
    Constance Walter

    Geothermal energy has the potential to power 100 million homes in America.

    Hunter Knox and Bill Roggenthen from South Dakota School of Mines lower sensors down a set of holes that were drilled for the kISMET experiment. Matthew Kapust

    As a geophysicist, Hunter Knox has worked all over the world testing bridges, dams and levees, and listening to the sounds of the earth. She even peered toward the center of the earth from a volcano in Antarctica with an open lava lake.

    “I’m a seismologist. It’s what I do.”

    Now the field coordinator from Sandia National Laboratories (SNL) is setting her sights on Sanford Lab’s 4850 Level, where she’s planning the logistics for SIGMA-V, a project under the auspices of the Enhanced Geothermal Systems Collaboration (EGS Collab).

    Led by Lawrence Berkeley National Laboratory, the EGS Collab recently received a $9 million grant from the Department of Energy to study geothermal systems. It is believed this clean-energy technology could power up to 100 million American homes.

    But before that can happen, more studies need to be done.

    “We need to better understand how fractures created in deep, hard-rock environments can be used to produce geothermal energy,” Knox said.

    Building on data collected from the recent kISMET experiment at Sanford Lab, the collaboration hopes to expand its understanding of the rock stress and incorporate additional equipment to meet the needs of EGS technology.

    “A typical geothermal system mines heat from the earth by extracting steam or hot water,” said Tim Kneafsey, principal investigator for EGS Collab and a staff earth scientist with LBNL. But for that to happen, three things are needed: hot rock, fluid and the ability for fluid to move through rock.

    “These conditions are not met everywhere,” Kneafsey said. “There is a lot of accessible hot rock, but it may be missing the permeability or fluid or both.”

    “We know fracturing rock can be done. But can it be effective for geothermal purposes? We need good, well-monitored field tests of fracturing, particularly in crystalline rock, to better understand that,” he said.

    That’s where SIGMA-V—or Stimulation Investigations for Geothermal Modeling and Analysis—comes in. “SIGMA-V is shorthand for vertical stress,” Kneafsey said.

    The goal of the project is to collect data for better predictive geomechanical models, which will in turn give the team a better understanding of the earth’s subsurface. The team will drill two boreholes, one for injection and one for production. Each will be 60 meters long, oriented along the direction of minimum horizontal stress. Six additional monitoring boreholes will contain seismic, electrical and fiber-optic sensors.

    When the holes are drilled, the team will place “straddle packers”—a mandrel, or pipe, with a deflated balloon near each end—inside them. Once inside, they will inflate the balloons and flow water down the pipe to create a sealed-off section. They will continue to pump water until the rock fractures, then use the monitoring equipment to listen for acoustic emissions, the sounds that tell them what is happening within the rock.

    “One of the problems with EGS is that it is difficult to maintain the fracture network,” Knox said. “Since the boreholes are hard to drill in these hot and very hard rocks and the fracture networks can’t be sustained, it is challenging to maintain an adequate heat exchanger to pull the energy out. We want to figure out how to maintain these networks so we can use the heat for energy.”

    And so, she’ll continue to listen to the rock nearly a mile underground and, perhaps, learn the secret to using it for geothermal energy.

    Forging ahead

    Data collected from SIGMA-V will be applied toward the Frontier Observatory for Research in Geothermal Energy (FORGE), a flagship DOE geothermal project, Kneafsey said. FORGE aims to develop technologies needed to create large-scale, economically sustainable heat exchange systems, thus paving the way for a reproducible approach that will reduce risks associated with EGS development.

    The two FORGE sites are in Fallon, Nevada, which is led by Sandia National Laboratories; and Milford, Utah, led by the University of Utah. The FORGE initiative will include innovative drilling techniques, reservoir stimulation techniques and well connectivity and flow-testing efforts.

    The EGS Collab includes researchers from eight national labs—LBNL, SNL, Lawrence Livermore National Laboratory, Pacific Northwest National Laboratory, Idaho National Laboratory, Los Alamos National Laboratory, the National Renewable Energy Laboratory, and Oak Ridge National Laboratory—and six universities: South Dakota School of Mines and Technology, Stanford, University of Wisconsin, University of Oklahoma, Colorado School of Mines and Penn State.

    Some information for this article was provided by LBNL: http://newscenter.lbl.gov/2017/07/20/berkeley-lab-lead-multimillion-dollar-geothermal-energy-project/

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX Dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE)—a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab—is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE

     
  • richardmitnick 9:55 pm on September 5, 2017 Permalink | Reply
    Tags: , , , , , Dark Energy, , , , , , , ,   

    From Symmetry: “What can particles tell us about the cosmos?” 

    Symmetry Mag
    Symmetry

    09/05/17
    Amanda Solliday

    The minuscule and the immense can reveal quite a bit about each other.

    In particle physics, scientists study the properties of the smallest bits of matter and how they interact. Another branch of physics—astrophysics—creates and tests theories about what’s happening across our vast universe.

    The current theoretical framework that describes elementary particles and their forces, known as the Standard Model, is based on experiments that started in 1897 with the discovery of the electron. Today, we know that there are six leptons, six quarks, four force carriers and a Higgs boson. Scientists all over the world predicted the existence of these particles and then carried out the experiments that led to their discoveries. Learn all about the who, what, where and when of the discoveries that led to a better understanding of the foundations of our universe.

    While particle physics and astrophysics appear to focus on opposite ends of a spectrum, scientists in the two fields actually depend on one another. Several current lines of inquiry link the very large to the very small.

    The seeds of cosmic structure

    For one, particle physicists and astrophysicists both ask questions about the growth of the early universe.

    In her office at Stanford University, Eva Silverstein explains her work parsing the mathematical details of the fastest period of that growth, called cosmic inflation.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    “To me, the subject is particularly interesting because you can understand the origin of structure in the universe,” says Silverstein, a professor of physics at Stanford and the Kavli Institute for Particle Astrophysics and Cosmology. “This paradigm known as inflation accounts for the origin of structure in the most simple and beautiful way a physicist can imagine.”

    Scientists think that after the Big Bang, the universe cooled, and particles began to combine into hydrogen atoms. This process released previously trapped photons—elementary particles of light.

    The glow from that light, called the cosmic microwave background, lingers in the sky today.

    CMB per ESA/Planck

    Scientists measure different characteristics of the cosmic microwave background to learn more about what happened in those first moments after the Big Bang.

    According to scientists’ models, a pattern that first formed on the subatomic level eventually became the underpinning of the structure of the entire universe. Places that were dense with subatomic particles—or even just virtual fluctuations of subatomic particles—attracted more and more matter. As the universe grew, these areas of density became the locations where galaxies and galaxy clusters formed. The very small grew up to be the very large.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark Matter

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7200 feet

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA


    FNAL DUNE Argon tank at SURF


    Surf-Dune/LBNF Caverns at Sanford



    SURF building in Lead SD USA

    “It’s amazing that we can probe what was going on almost 14 billion years ago,” Silverstein says. “We can’t learn everything that was going on, but we can still learn an incredible amount about the contents and interactions.”

    For many scientists, “the urge to trace the history of the universe back to its beginnings is irresistible,” wrote theoretical physicist Steven Weinberg in his 1977 book The First Three Minutes. The Nobel laureate added, “From the start of modern science in the sixteenth and seventeenth centuries, physicists and astronomers have returned again and again to the problem of the origin of the universe.”

    Searching in the dark

    Particle physicists and astrophysicists both think about dark matter and dark energy. Astrophysicists want to know what made up the early universe and what makes up our universe today. Particle physicists want to know whether there are undiscovered particles and forces out there for the finding.

    “Dark matter makes up most of the matter in the universe, yet no known particles in the Standard Model [of particle physics] have the properties that it should possess,” says Michael Peskin, a professor of theoretical physics at SLAC.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    “Dark matter should be very weakly interacting, heavy or slow-moving, and stable over the lifetime of the universe.”

    There is strong evidence for dark matter through its gravitational effects on ordinary matter in galaxies and clusters. These observations indicate that the universe is made up of roughly 5 percent normal matter, 25 percent dark matter and 70 percent dark energy. But to date, scientists have not directly observed dark energy or dark matter.

    “This is really the biggest embarrassment for particle physics,” Peskin says. “However much atomic matter we see in the universe, there’s five times more dark matter, and we have no idea what it is.”

    But scientists have powerful tools to try to understand some of these unknowns. Over the past several years, the number of models of dark matter has been expanding, along with the number of ways to detect it, says Tom Rizzo, a senior scientist at SLAC and head of the theory group.

    Some experiments search for direct evidence of a dark matter particle colliding with a matter particle in a detector. Others look for indirect evidence of dark matter particles interfering in other processes or hiding in the cosmic microwave background. If dark matter has the right properties, scientists could potentially create it in a particle accelerator such as the Large Hadron Collider.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Physicists are also actively hunting for signs of dark energy. It is possible to measure the properties of dark energy by observing the motion of clusters of galaxies at the largest distances that we can see in the universe.

    “Every time that we learn a new technique to observe the universe, we typically get lots of surprises,” says Marcelle Soares-Santos, a Brandeis University professor and a researcher on the Dark Energy Survey. “And we can capitalize on these new ways of observing the universe to learn more about cosmology and other sides of physics.”

    Forces at play

    Particle physicists and astrophysicists find their interests also align in the study of gravity. For particle physicists, gravity is the one basic force of nature that the Standard Model does not quite explain. Astrophysicists want to understand the important role gravity played and continues to play in the formation of the universe.

    In the Standard Model, each force has what’s called a force-carrier particle or a boson. Electromagnetism has photons. The strong force has gluons. The weak force has W and Z bosons. When particles interact through a force, they exchange these force-carriers, transferring small amounts of information called quanta, which scientists describe through quantum mechanics.

    General relativity explains how the gravitational force works on large scales: Earth pulls on our own bodies, and planetary objects pull on each other. But it is not understood how gravity is transmitted by quantum particles.

    Discovering a subatomic force-carrier particle for gravity would help explain how gravity works on small scales and inform a quantum theory of gravity that would connect general relativity and quantum mechanics.

    Compared to the other fundamental forces, gravity interacts with matter very weakly, but the strength of the interaction quickly becomes larger with higher energies. Theorists predict that at high enough energies, such as those seen in the early universe, quantum gravity effects are as strong as the other forces. Gravity played an essential role in transferring the small-scale pattern of the cosmic microwave background into the large-scale pattern of our universe today.
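    The energy at which quantum-gravity effects are expected to become comparable to the other forces is the Planck energy, the unique energy scale you can build from ħ, c and G. A quick sketch of that estimate:

```python
import math

# Fundamental constants (SI units, CODATA values)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # Newton's constant, m^3 / (kg s^2)
J_PER_GEV = 1.602176634e-10  # joules per GeV

# Planck energy: E_P = sqrt(hbar * c^5 / G)
E_planck_J = math.sqrt(hbar * c**5 / G)
E_planck_GeV = E_planck_J / J_PER_GEV

print(f"{E_planck_GeV:.3e} GeV")  # ~1.22e19 GeV
```

For comparison, the Large Hadron Collider reaches collision energies of order 10^4 GeV, some fifteen orders of magnitude below this scale, which is why quantum-gravity effects are invisible in accelerator experiments but mattered in the early universe.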

    “Another way that these effects can become important for gravity is if there’s some process that lasts a long time,” Silverstein says. “Even if the energies aren’t as high as they would need to be sensitive to effects like quantum gravity instantaneously.”

    Physicists are modeling gravity over lengthy time scales in an effort to reveal these effects.

    Our understanding of gravity is also key in the search for dark matter. Some scientists think that dark matter does not actually exist; they say the evidence we’ve found so far is actually just a sign that we don’t fully understand the force of gravity.

    Big ideas, tiny details

    Learning more about gravity could tell us about the dark universe, which could also reveal new insight into how structure in the universe first formed.

    Scientists are trying to “close the loop” between particle physics and the early universe, Peskin says. As scientists probe space and go back further in time, they can learn more about the rules that govern physics at high energies, which also tells us something about the smallest components of our world.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 7:06 am on July 3, 2017 Permalink | Reply
    Tags: and the Big Bang?, , , , Can faster-than-light particles explain dark matter, , , Dark Energy, Tachyons   

    From COSMOS: “Can faster-than-light particles explain dark matter, dark energy, and the Big Bang?” 

    Cosmos Magazine bloc

    COSMOS

    30 June 2017
    Robyn Arianrhod

    Tachyons may explain dark matter, dark energy and the black holes at the core of many galaxies. Andrzej Wojcicki / Science Photo Library / Getty.

    Here are six big questions about our universe that current physics can’t answer:

    What is dark energy, the mysterious energy that appears to be accelerating the expansion of the universe?
    What is dark matter, the invisible substance we can only detect by its gravitational effect on stars and galaxies?
    What caused inflation, the blindingly fast expansion of the universe immediately after the Big Bang?
    For that matter, what caused the Big Bang?
    Are there many possible Big Bangs or universes?
    Is there a telltale characteristic associated with the death of a universe?

    Despite the efforts of some of the world’s brightest brains, the Standard Model of particle physics – our current best theory of how the universe works at a fundamental level – has no solution to these stumpers.

    A compelling new theory claims to solve all six in a single sweep. The answer, according to a paper published in European Physical Journal C by Herb Fried from Brown University and Yves Gabellini from INLN-Université de Nice, may be a kind of particle called a tachyon.

    Tachyons are hypothetical particles that travel faster than light. According to Einstein’s special theory of relativity – and according to experiment so far – in our ‘real’ world, particles can never travel faster than light. Which is just as well: if they did, our ideas about cause and effect would be thrown out the window, because it would be possible to see an effect manifest before its cause.

    Although it is elegantly simple in conception, Fried and Gabellini’s model is controversial because it requires the existence of these tachyons: specifically electrically charged, fermionic tachyons and anti-tachyons, fluctuating as virtual particles in the quantum vacuum (QV). (The idea of virtual particles per se is nothing new: in the Standard Model, forces like electromagnetism are regarded as fields of virtual particles constantly ducking in and out of existence. Taken together, all these virtual particles make up the quantum vacuum.)

    But special relativity, though it bars faster-than-light travel for ordinary matter and photons, does not entirely preclude the existence of tachyons. As Fried explains, “In the presence of a huge-energy event, such as a supernova explosion or the Big Bang itself, perhaps these virtual tachyons can be torn out of the QV and sent flying into the real vacuum (RV) of our everyday world, as real particles that have yet to be measured.”

    If these tachyons do cross the speed-of-light boundary, the researchers believe that their high masses and small distances of interaction would introduce into our world an immeasurably small amount of ‘a-causality’.

    Fried and Gabellini arrived at their tachyon-based model while trying to find an explanation for the dark energy throughout space that appears to fuel the accelerating expansion of the universe. They first proposed that dark energy is produced by fluctuations of virtual pairs of electrons and positrons.

    However, this model ran into mathematical difficulties with unexpected imaginary numbers. In special relativity, the rest mass of a tachyon is an imaginary number, unlike the rest mass of ordinary particles. While the equations and imaginary numbers in the new model involve far more than simple masses, the idea is suggestive: Gabellini realized that by including fluctuating pairs of tachyons and anti-tachyons, he and Fried could cancel and remove the unwanted imaginary numbers from their calculations. What is more, a huge bonus followed from this creative response to mathematical necessity: Gabellini and Fried realized that by adding their tachyons to the model, they could explain inflation too.
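    The imaginary rest mass is less paradoxical than it sounds. In the relativistic energy formula E = mc² / √(1 − v²/c²), a speed above c makes the square root imaginary, so an imaginary rest mass yields a real, positive energy. A small sketch of this standard textbook observation using complex arithmetic (units with c = 1):

```python
import cmath

def tachyon_energy(m_magnitude, beta):
    """Relativistic energy E = m / sqrt(1 - beta^2) in units where c = 1,
    for a tachyon with imaginary rest mass m = i*|m| and beta = v/c > 1."""
    m = 1j * m_magnitude                 # imaginary rest mass
    return m / cmath.sqrt(1 - beta**2)   # sqrt of a negative number is imaginary

E = tachyon_energy(1.0, 2.0)  # |m| = 1, v = 2c
# The two imaginary factors cancel, leaving a real energy:
print(E)  # ~(0.577+0j), i.e. |m| / sqrt(beta^2 - 1)
```

A curious consequence of the same formula is that a tachyon's energy decreases as its speed increases, the opposite of ordinary matter.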

    “This assumption [of fluctuating tachyon-anti-tachyon pairs] cannot be negated by any experimental test,” says Fried – and the model fits beautifully with existing experimental data on dark energy and inflation energy.

    Of course, both Fried and Gabellini recognize that many physicists are wary of theories based on such radical assumptions.

    But, taken as a whole, their model suggests the possibility of a unifying mechanism that gives rise not only to inflation and dark energy, but also to dark matter. Calculations suggest that these high-energy tachyons would re-absorb almost all of the photons they emit and hence be invisible.

    And there is more: as Fried explains, “If a very high-energy tachyon flung into the real vacuum (RV) were then to meet and annihilate with an anti-tachyon of the same species, this tiny quantum ‘explosion’ of energy could be the seed of another Big Bang, giving rise to a new universe. That ‘seed’ would be an energy density, at that spot of annihilation, which is so great that a ‘tear’ occurs in the surface separating the Quantum Vacuum from the RV, and the huge energies stored in the QV are able to blast their way into the RV, producing the Big Bang of a new universe. And over the course of multiple eons, this situation could happen multiple times.”

    This model – like any model of such non-replicable phenomena as the creation of the universe – may be simply characterized as a tantalizing set of speculations. Nevertheless, it not only fits with data on inflation and dark energy, but also offers a possible solution to yet another observed mystery.

    Within the last few years, astronomers have realized that the black hole at the centre of our Milky Way galaxy is ‘supermassive’, containing the mass of a million or more suns. And the same sort of supermassive black hole (SMBH) may be seen at the centres of many other galaxies in our current universe.

    Exactly how such objects form is still an open question. The energy stored in the QV is normally large enough to counteract the gravitational tendency of galaxies to collapse in on themselves. In the theory of Fried and Gabellini, however, when a new universe forms, a huge amount of the QV energy from the old universe escapes through the ‘tear’ made by the tachyon-anti-tachyon annihilation (the new Big Bang). Eventually, even faraway parts of the old universe will be affected, as the old universe’s QV energy leaks into the new universe like air escaping through a hole in a balloon. The decrease in this QV-energy buffer against gravity in the old universe suggests that as the old universe dies, many of its galaxies will form SMBHs in the new universe, each containing the mass of the old galaxy’s former suns and planets. Some of these new SMBHs may form the centres of new galaxies in the new universe.

    “This may not be a very pleasant picture,” says Fried, speaking of the possible fate of our own universe. “But it is at least scientifically consistent.”

    And in the weird, untestable world of Big Bangs and multiple universes, consistency may be the best we can hope for.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:44 am on June 14, 2017 Permalink | Reply
Tags: Dark Energy, Human waste used as biosolids for fertilizer, Macdonald campus in Ste-Anne-de-Bellevue, McGill gets $3 million to fund research into cutting greenhouse gases, Mitigating greenhouse gas emissions caused by water and fertilizer use in agriculture

    From McGill via Montreal Gazette: “McGill gets $3 million to fund research into cutting greenhouse gases” 

    McGill University


    Montreal Gazette

    June 14, 2017
    John Meagher

    McGill professor Grant Clark displays human waste used as biosolids for fertilizer, on test fields at Macdonald campus on Monday. The federal government is investing in the university to conduct research on greenhouse gas mitigation in agriculture. Pierre Obendrauf / Montreal Gazette

    McGill University researchers at Macdonald campus in Ste-Anne-de-Bellevue got some welcome news Monday when the federal government announced nearly $3 million in funding for research projects that will help farmers cut greenhouse gas emissions.

    Local Liberal MP Francis Scarpaleggia and Jean-Claude Poissant, Parliamentary Secretary for the Minister of Agriculture, announced $2.9 million in funding at a press conference for two McGill projects aimed at mitigating greenhouse gas emissions caused by water and fertilizer use in agriculture.

    Scarpaleggia said the funding will “enable our agricultural sector to be a world leader and to develop new clean technologies and practices to enhance the economic and environmental sustainability of Canadian farms.”

    A project led by Prof. Chandra Madramootoo, of McGill’s Department of Bioresource Engineering, will receive more than $1.6 million to study the effects of different water management systems in Eastern Canada.

    The aim is to provide information on water-management practices that reduce greenhouse gas emissions while increasing agricultural productivity.

    The second project, headed by McGill Prof. Grant Clark, also of the Department of Bioresource Engineering, will receive $1.3 million. The project will research best management practices for the use of municipal bio-solids, a by-product of wastewater treatment plants, as a crop fertilizer.

    “I’m a firm believer in science-based policy,” Clark said. “And we require the support of government to develop the knowledge to promote that policy.

    “I would also like to acknowledge the government’s support of real concrete action to (address) climate change and reduce greenhouse gas emissions.”

Clark said the research project will examine how to “reduce, reuse, recycle, reclaim” the use of nutrients and organics in agriculture.

“If we are going to develop a sustainable agricultural system, we must be conscious of how we conserve resources, reduce inputs as well as reduce greenhouse gas emissions and build and preserve the health of our soils,” he said.

    “We are interested in linking the intensive food production required to support a growing global population with the recycling of organic wastes from our municipal centres,” Clark added.

    “The objective of the program is to use the residual solids from the treatment of municipal waste waters, or biosolids, as fertilizers for agricultural production. So this mirrors the natural cycling of nutrients or organic carbon that we see in nature. However, we can’t just go out and poop in the field. The cycle is a little more involved in order that we preserve public health and hygiene.”

    Scarpaleggia described the research work being done at the Macdonald campus in Ste-Anne as “world class.”

    “The federal government has always recognized the enormous value of Macdonald campus as a world-class research facility,” said the MP for Lac-St-Louis riding.

“They’re doing groundbreaking work here in many areas of agriculture, including water management, which is a particular interest of mine. So it’s very important to channel some research funds to Macdonald campus.”

Scarpaleggia said the McGill projects being funded by the federal government will promote job growth in the green economy.

    “As we move ahead with climate change policies, we are, as a consequence, stimulating research, stimulating industrial innovation. We’re making that jump to the green economy with all its benefits in terms of employment and high value-added jobs.”

The federal funding, which comes from the Agricultural Greenhouse Gases Program (AGGP), was announced on behalf of Lawrence MacAulay, the Minister of Agriculture and Agri-Food Canada.

    “The Government of Canada continues to invest in research with partners like McGill University in order to provide our farmers with the best strategies for adapting to climate change and for producing more quality food for a growing population while keeping agriculture clean and sustainable,” said Poissant.

The AGGP is a $27-million initiative aimed at helping the agricultural sector adjust to climate change and improve soil and water conservation. McGill’s agronomists and scientists are involved in 20 new research projects being conducted across Canada, from British Columbia to the Maritimes.


    All about McGill

    With some 300 buildings, more than 38,500 students and 250,000 living alumni, and a reputation for excellence that reaches around the globe, McGill has carved out a spot among the world’s greatest universities.
    Founded in Montreal, Quebec, in 1821, McGill is a leading Canadian post-secondary institution. It has two campuses, 11 faculties, 11 professional schools, 300 programs of study and some 39,000 students, including more than 9,300 graduate students. McGill attracts students from over 150 countries around the world, its 8,200 international students making up 21 per cent of the student body.

     
  • richardmitnick 2:16 pm on June 10, 2017 Permalink | Reply
Tags: Dark Energy, The largest virtual Universe ever simulated, U Zürich

    From U Zürich: “The largest virtual Universe ever simulated.” 

    University of Zürich

    9 June 2017
    Contact
    Prof. Dr. Romain Teyssier
    romain.teyssier@uzh.ch
    Institute for Computational Science
    University of Zurich
    +41 44 635 60 20

    Dr. Joachim Stadel
    stadel@physik.uzh.ch
    Institute for Computational Science
    University of Zurich
+41 44 635 58 16

Researchers from the University of Zürich have simulated the formation of our entire Universe with a large supercomputer. A gigantic catalogue of about 25 billion virtual galaxies has been generated from 2 trillion digital particles. This catalogue is being used to calibrate the experiments on board the Euclid satellite, which will be launched in 2020 to investigate the nature of dark matter and dark energy.

    ESA/Euclid spacecraft

The Cosmic Web: a section of the virtual universe, a billion light-years across, showing how dark matter is distributed in space, with dark matter halos as the yellow clumps, interconnected by dark filaments. Cosmic voids, shown as the white areas, are the lowest-density regions in the Universe. (Image: Joachim Stadel, UZH)

Over a period of three years, a group of astrophysicists from the University of Zürich has developed and optimised a revolutionary code to describe with unprecedented accuracy the dynamics of dark matter and the formation of large-scale structures in the Universe. As Joachim Stadel, Douglas Potter and Romain Teyssier report in their recently published paper [Computational Astrophysics and Cosmology], the code (called PKDGRAV3) has been designed to make optimal use of the available memory and processing power of modern supercomputing architectures, such as the “Piz Daint” supercomputer of the Swiss National Supercomputing Center (CSCS). The code was executed on this world-leading machine for only 80 hours, and generated a virtual universe of two trillion (i.e., two thousand billion or 2 × 10^12) macro-particles representing the dark matter fluid, from which a catalogue of 25 billion virtual galaxies was extracted.

    Cray Piz Daint supercomputer of the Swiss National Supercomputing Center (CSCS)
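The kind of calculation a code like PKDGRAV3 performs, collisionless particles evolving under their own gravity, can be sketched with a toy direct-summation leapfrog integrator. The sketch below is purely illustrative and is not the production algorithm: all names and parameters are our own, and real cosmological codes use hierarchical tree or mesh methods with periodic, comoving coordinates to reach trillions of particles rather than the O(N²) summation shown here.

```python
import numpy as np

def leapfrog_nbody(pos, vel, masses, dt, n_steps, G=1.0, softening=0.05):
    """Evolve point masses under self-gravity with a kick-drift-kick
    leapfrog integrator, using direct O(N^2) pairwise summation."""
    def accel(pos):
        # Pairwise separations: diff[i, j] = pos[j] - pos[i]
        diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
        dist2 = (diff ** 2).sum(axis=-1) + softening ** 2  # softened distance^2
        inv_r3 = dist2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)  # a particle exerts no force on itself
        # a_i = G * sum_j m_j (r_j - r_i) / |r_ij|^3
        return G * (diff * inv_r3[..., np.newaxis]
                    * masses[np.newaxis, :, np.newaxis]).sum(axis=1)

    acc = accel(pos)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc   # kick
        pos += dt * vel         # drift
        acc = accel(pos)
        vel += 0.5 * dt * acc   # kick
    return pos, vel

# Toy run: 100 equal-mass particles starting at rest in a unit box
rng = np.random.default_rng(0)
pos = rng.random((100, 3))
vel = np.zeros((100, 3))
masses = np.full(100, 1.0 / 100)
pos, vel = leapfrog_nbody(pos, vel, masses, dt=1e-3, n_steps=10)
```

Even this toy version shows why scale matters: direct summation costs N² force evaluations per step, which is precisely why production codes rely on tree algorithms to handle particle counts in the trillions.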

    Studying the composition of the dark universe

Thanks to the high precision of their calculation, featuring a dark matter fluid evolving under its own gravity, the researchers have simulated the formation of small concentrations of matter, called dark matter halos, in which we believe galaxies like the Milky Way form.

Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    The challenge of this simulation was to model galaxies as small as one tenth of the Milky Way, in a volume as large as our entire observable Universe. This was the requirement set by the European Euclid mission, whose main objective is to explore the dark side of the Universe.

    Measuring subtle distortions

Indeed, about 95 percent of the Universe is dark. The cosmos consists of 23 percent dark matter and 72 percent dark energy. “The nature of dark energy remains one of the main unsolved puzzles in modern science,” says Romain Teyssier, UZH professor for computational astrophysics.
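As a quick sanity check on these quoted fractions (the roughly 5 percent remainder being ordinary matter, a figure the article implies rather than states):

```python
# Cosmic energy budget as quoted in the article
dark_matter = 0.23
dark_energy = 0.72
dark_total = dark_matter + dark_energy   # the "about 95 percent" dark total
ordinary = 1.0 - dark_total              # ~5 percent: atoms, stars, gas
print(f"dark: {dark_total:.0%}, ordinary: {ordinary:.0%}")
```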

    Earthbound science of Dark Energy

    Dark Energy Camera [DECam], built at FNAL


NOAO/CTIO Victor M Blanco 4m Telescope at Cerro Tololo, Chile, which houses the DECam

A puzzle that can be cracked only through indirect observation: when the Euclid satellite captures the light of billions of galaxies in large areas of the sky, astronomers will measure very subtle distortions that arise from the deflection of light of these background galaxies by a foreground, invisible distribution of mass – dark matter. “That is comparable to the distortion of light by a somewhat uneven glass pane,” says Joachim Stadel from the Institute for Computational Science of the UZH.

    Optimizing observation strategies of the satellite

This new virtual galaxy catalogue will help optimize the observational strategy of the Euclid experiment and minimize various sources of error, before the satellite embarks on its six-year data-collecting mission in 2020. “Euclid will perform a tomographic map of our Universe, tracing back more than 10 billion years of cosmic evolution,” Stadel says. From the Euclid data, researchers will obtain new information on the nature of this mysterious dark energy, but also hope to discover new physics beyond the standard model, such as a modified version of general relativity or a new type of particle.


    The University of Zürich (UZH, German: Universität Zürich), located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, medicine and a new faculty of philosophy.

Currently, the university has seven faculties: Philosophy, Human Medicine, Economic Sciences, Law, Mathematics and Natural Sciences, Theology and Veterinary Medicine. The university offers the widest range of subjects and courses of any Swiss higher education institution.

     