Tagged: Dark Energy

  • richardmitnick 7:06 am on July 3, 2017 Permalink | Reply
    Tags: and the Big Bang?, Can faster-than-light particles explain dark matter, Dark Energy, Tachyons

    From COSMOS: “Can faster-than-light particles explain dark matter, dark energy, and the Big Bang?” 

    Cosmos Magazine bloc

    COSMOS

    30 June 2017
    Robyn Arianrhod

    Tachyons may explain dark matter, dark energy and the black holes at the core of many galaxies. Andrzej Wojcicki / Science Photo Library / Getty.

    Here are six big questions about our universe that current physics can’t answer:

    What is dark energy, the mysterious energy that appears to be accelerating the expansion of the universe?
    What is dark matter, the invisible substance we can only detect by its gravitational effect on stars and galaxies?
    What caused inflation, the blindingly fast expansion of the universe immediately after the Big Bang?
    For that matter, what caused the Big Bang?
    Are there many possible Big Bangs or universes?
    Is there a telltale characteristic associated with the death of a universe?

    Despite the efforts of some of the world’s brightest brains, the Standard Model of particle physics – our current best theory of how the universe works at a fundamental level – has no solution to these stumpers.

    A compelling new theory claims to solve all six in a single sweep. The answer, according to a paper published in European Physical Journal C by Herb Fried from Brown University and Yves Gabellini from INLN-Université de Nice, may be a kind of particle called a tachyon.

    Tachyons are hypothetical particles that travel faster than light. According to Einstein’s special theory of relativity – and according to experiment so far – in our ‘real’ world, particles can never travel faster than light. Which is just as well: if they did, our ideas about cause and effect would be thrown out the window, because it would be possible to see an effect manifest before its cause.

    Although it is elegantly simple in conception, Fried and Gabellini’s model is controversial because it requires the existence of these tachyons: specifically electrically charged, fermionic tachyons and anti-tachyons, fluctuating as virtual particles in the quantum vacuum (QV). (The idea of virtual particles per se is nothing new: in the Standard Model, forces like electromagnetism are regarded as fields of virtual particles constantly ducking in and out of existence. Taken together, all these virtual particles make up the quantum vacuum.)

    But special relativity, though it bars faster-than-light travel for ordinary matter and photons, does not entirely preclude the existence of tachyons. As Fried explains, “In the presence of a huge-energy event, such as a supernova explosion or the Big Bang itself, perhaps these virtual tachyons can be torn out of the QV and sent flying into the real vacuum (RV) of our everyday world, as real particles that have yet to be measured.”

    If these tachyons do cross the speed-of-light boundary, the researchers believe that their high masses and small distances of interaction would introduce into our world an immeasurably small amount of ‘a-causality’.

    Fried and Gabellini arrived at their tachyon-based model while trying to find an explanation for the dark energy throughout space that appears to fuel the accelerating expansion of the universe. They first proposed that dark energy is produced by fluctuations of virtual pairs of electrons and positrons.

    However, this model ran into mathematical difficulties involving unexpected imaginary numbers. In special relativity, the rest mass of a tachyon is an imaginary number, unlike the rest mass of ordinary particles. While the equations and imaginary numbers in the new model involve far more than simple masses, the idea is suggestive: Gabellini realized that by including fluctuating pairs of tachyons and anti-tachyons, he and Fried could cancel and remove the unwanted imaginary numbers from their calculations. What is more, a huge bonus followed from this creative response to mathematical necessity: Gabellini and Fried realized that by adding their tachyons to the model, they could explain inflation too.
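
    To see why a tachyon’s rest mass comes out imaginary, recall the ordinary relativistic energy relation (a standard textbook illustration, not an equation taken from Fried and Gabellini’s paper):

    \[ E = \frac{m_0 c^2}{\sqrt{1 - v^2/c^2}} \]

    For a particle with v > c the square root becomes imaginary, so the energy E can only remain a real, measurable quantity if the rest mass m_0 is itself imaginary. That is the sense in which tachyons carry an imaginary rest mass.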

    “This assumption [of fluctuating tachyon-anti-tachyon pairs] cannot be negated by any experimental test,” says Fried – and the model fits beautifully with existing experimental data on dark energy and inflation energy.

    Of course, both Fried and Gabellini recognize that many physicists are wary of theories based on such radical assumptions.

    But, taken as a whole, their model suggests the possibility of a unifying mechanism that gives rise not only to inflation and dark energy, but also to dark matter. Calculations suggest that these high-energy tachyons would re-absorb almost all of the photons they emit and hence be invisible.

    And there is more: as Fried explains, “If a very high-energy tachyon flung into the real vacuum (RV) were then to meet and annihilate with an anti-tachyon of the same species, this tiny quantum ‘explosion’ of energy could be the seed of another Big Bang, giving rise to a new universe. That ‘seed’ would be an energy density, at that spot of annihilation, which is so great that a ‘tear’ occurs in the surface separating the Quantum Vacuum from the RV, and the huge energies stored in the QV are able to blast their way into the RV, producing the Big Bang of a new universe. And over the course of multiple eons, this situation could happen multiple times.”

    This model – like any model of such non-replicable phenomena as the creation of the universe – may be simply characterized as a tantalizing set of speculations. Nevertheless, it not only fits with data on inflation and dark energy, but also offers a possible solution to yet another observed mystery.

    Within the last few years, astronomers have realized that the black hole at the centre of our Milky Way galaxy is ‘supermassive’, containing the mass of a million or more suns. And the same sort of supermassive black hole (SMBH) may be seen at the centres of many other galaxies in our current universe.

    Exactly how such objects form is still an open question. The energy stored in the QV is normally large enough to counteract the gravitational tendency of galaxies to collapse in on themselves. In the theory of Fried and Gabellini, however, when a new universe forms, a huge amount of the QV energy from the old universe escapes through the ‘tear’ made by the tachyon-anti-tachyon annihilation (the new Big Bang). Eventually, even faraway parts of the old universe will be affected, as the old universe’s QV energy leaks into the new universe like air escaping through a hole in a balloon. The decrease in this QV-energy buffer against gravity in the old universe suggests that as the old universe dies, many of its galaxies will form SMBHs in the new universe, each containing the mass of the old galaxy’s former suns and planets. Some of these new SMBHs may form the centres of new galaxies in the new universe.

    “This may not be a very pleasant picture,” says Fried, speaking of the possible fate of our own universe. “But it is at least scientifically consistent.”

    And in the weird, untestable world of Big Bangs and multiple universes, consistency may be the best we can hope for.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:44 am on June 14, 2017 Permalink | Reply
    Tags: Dark Energy, Human waste used as biosolids for fertilizer, Macdonald campus in Ste-Anne-de-Bellevue, McGill gets $3 million to fund research into cutting greenhouse gases, Mitigating greenhouse gas emissions caused by water and fertilizer use in agriculture

    From McGill via Montreal Gazette: “McGill gets $3 million to fund research into cutting greenhouse gases” 

    McGill University

    McGill University


    Montreal Gazette

    June 14, 2017
    John Meagher

    McGill professor Grant Clark displays human waste used as biosolids for fertilizer, on test fields at Macdonald campus on Monday. The federal government is investing in the university to conduct research on greenhouse gas mitigation in agriculture. Pierre Obendrauf / Montreal Gazette

    McGill University researchers at Macdonald campus in Ste-Anne-de-Bellevue got some welcome news Monday when the federal government announced nearly $3 million in funding for research projects that will help farmers cut greenhouse gas emissions.

    Local Liberal MP Francis Scarpaleggia and Jean-Claude Poissant, Parliamentary Secretary for the Minister of Agriculture, announced $2.9 million in funding at a press conference for two McGill projects aimed at mitigating greenhouse gas emissions caused by water and fertilizer use in agriculture.

    Scarpaleggia said the funding will “enable our agricultural sector to be a world leader and to develop new clean technologies and practices to enhance the economic and environmental sustainability of Canadian farms.”

    A project led by Prof. Chandra Madramootoo, of McGill’s Department of Bioresource Engineering, will receive more than $1.6 million to study the effects of different water management systems in Eastern Canada.

    The aim is to provide information on water-management practices that reduce greenhouse gas emissions while increasing agricultural productivity.

    The second project, headed by McGill Prof. Grant Clark, also of the Department of Bioresource Engineering, will receive $1.3 million. The project will research best management practices for the use of municipal bio-solids, a by-product of wastewater treatment plants, as a crop fertilizer.

    “I’m a firm believer in science-based policy,” Clark said. “And we require the support of government to develop the knowledge to promote that policy.

    “I would also like to acknowledge the government’s support of real concrete action to (address) climate change and reduce greenhouse gas emissions.”

    Clark said the research project will examine how to “reduce, reuse, recycle, reclaim” the use of nutrients and organics in agriculture.

    “If we are going to develop a sustainable agricultural system, we must be conscious of how we conserve resources, reduce inputs as well as reduce greenhouse gas emissions and build and preserve the health of our soils,” he said.

    “We are interested in linking the intensive food production required to support a growing global population with the recycling of organic wastes from our municipal centres,” Clark added.

    “The objective of the program is to use the residual solids from the treatment of municipal waste waters, or biosolids, as fertilizers for agricultural production. So this mirrors the natural cycling of nutrients or organic carbon that we see in nature. However, we can’t just go out and poop in the field. The cycle is a little more involved in order that we preserve public health and hygiene.”

    Scarpaleggia described the research work being done at the Macdonald campus in Ste-Anne as “world class.”

    “The federal government has always recognized the enormous value of Macdonald campus as a world-class research facility,” said the MP for Lac-St-Louis riding.

    “They’re doing groundbreaking work here in any areas of agriculture, including water management, which is a particular interest of mine. So it’s very important to channel some research funds to Macdonald campus.”

    Scarpaleggia said the McGill projects being funded by federal government will promote job growth in the green economy.

    “As we move ahead with climate change policies, we are, as a consequence, stimulating research, stimulating industrial innovation. We’re making that jump to the green economy with all its benefits in terms of employment and high value-added jobs.”

    The federal funding, which comes from the Agricultural Greenhouse Gases Program (AGGP), was made on behalf of Lawrence MacAuley, the Minister of Agriculture and Agri-Food Canada.

    “The Government of Canada continues to invest in research with partners like McGill University in order to provide our farmers with the best strategies for adapting to climate change and for producing more quality food for a growing population while keeping agriculture clean and sustainable,” said Poissant.

    The AGGP is a $27-million initiative aimed at helping the agricultural sector adjust to climate change and improve soil and water conservation. McGill’s agronomists and scientists are involved in 20 new research projects being conducted across Canada, from British Columbia to the Maritimes.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    All about McGill

    With some 300 buildings, more than 38,500 students and 250,000 living alumni, and a reputation for excellence that reaches around the globe, McGill has carved out a spot among the world’s greatest universities.
    Founded in Montreal, Quebec, in 1821, McGill is a leading Canadian post-secondary institution. It has two campuses, 11 faculties, 11 professional schools, 300 programs of study and some 39,000 students, including more than 9,300 graduate students. McGill attracts students from over 150 countries around the world, its 8,200 international students making up 21 per cent of the student body.

     
  • richardmitnick 2:16 pm on June 10, 2017 Permalink | Reply
    Tags: Dark Energy, The largest virtual Universe ever simulated, U Zürich

    From U Zürich: “The largest virtual Universe ever simulated.” 

    University of Zürich

    9 June 2017
    Contact
    Prof. Dr. Romain Teyssier
    romain.teyssier@uzh.ch
    Institute for Computational Science
    University of Zurich
    +41 44 635 60 20

    Dr. Joachim Stadel
    stadel@physik.uzh.ch
    Institute for Computational Science
    University of Zurich
    Phone: +41 44 635 58 16

    Researchers from the University of Zürich have simulated the formation of our entire Universe with a large supercomputer. A gigantic catalogue of about 25 billion virtual galaxies has been generated from 2 trillion digital particles. This catalogue is being used to calibrate the experiments on board the Euclid satellite, which will be launched in 2020 with the objective of investigating the nature of dark matter and dark energy.

    ESA/Euclid spacecraft

    The Cosmic Web: A section of the virtual universe, a billion light years across, showing how dark matter is distributed in space, with dark matter halos (the yellow clumps) interconnected by dark filaments. Cosmic voids, shown as the white areas, are the lowest density regions in the Universe. (Image: Joachim Stadel, UZH)

    Over a period of three years, a group of astrophysicists from the University of Zürich has developed and optimised a revolutionary code to describe with unprecedented accuracy the dynamics of dark matter and the formation of large-scale structures in the Universe. As Joachim Stadel, Douglas Potter and Romain Teyssier report in their recently published paper [Computational Astrophysics and Cosmology], the code (called PKDGRAV3) has been designed to use optimally the available memory and processing power of modern supercomputing architectures, such as the “Piz Daint” supercomputer of the Swiss National Computing Center (CSCS). The code was executed on this world-leading machine for only 80 hours, and generated a virtual universe of two trillion (i.e., two thousand billion, or 2 × 10^12) macro-particles representing the dark matter fluid, from which a catalogue of 25 billion virtual galaxies was extracted.

    Cray Piz Daint supercomputer of the Swiss National Supercomputing Center (CSCS)

    Studying the composition of the dark universe

    Thanks to the high precision of their calculation, featuring a dark matter fluid evolving under its own gravity, the researchers have simulated the formation of small concentrations of matter, called dark matter halos, in which we believe galaxies like the Milky Way form.

    Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    The challenge of this simulation was to model galaxies as small as one tenth of the Milky Way, in a volume as large as our entire observable Universe. This was the requirement set by the European Euclid mission, whose main objective is to explore the dark side of the Universe.

    Measuring subtle distortions

    Indeed, about 95 percent of the Universe is dark. The cosmos consists of 23 percent dark matter and 72 percent dark energy. “The nature of dark energy remains one of the main unsolved puzzles in modern science,” says Romain Teyssier, UZH professor for computational astrophysics.

    Earthbound science of Dark Energy

    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile

    A puzzle that can be cracked only through indirect observation: When the Euclid satellite captures the light of billions of galaxies in large areas of the sky, astronomers will measure very subtle distortions that arise from the deflection of light of these background galaxies by a foreground, invisible distribution of mass – dark matter. “That is comparable to the distortion of light by a somewhat uneven glass pane,” says Joachim Stadel from the Institute for Computational Science of the UZH.

    Optimizing observation strategies of the satellite

    This new virtual galaxy catalogue will help optimize the observational strategy of the Euclid experiment and minimize various sources of error, before the satellite embarks on its six-year data collecting mission in 2020. “Euclid will perform a tomographic map of our Universe, tracing back in time more than 10 billion years of evolution in the cosmos,” Stadel says. From the Euclid data, researchers will obtain new information on the nature of this mysterious dark energy, but also hope to discover new physics beyond the standard model, such as a modified version of general relativity or a new type of particle.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The University of Zürich (UZH, German: Universität Zürich), located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, medicine and a new faculty of philosophy.

    Currently, the university has seven faculties: Philosophy, Human Medicine, Economic Sciences, Law, Mathematics and Natural Sciences, Theology and Veterinary Medicine. The university offers the widest range of subjects and courses of any Swiss higher education institution.

     
  • richardmitnick 8:31 am on May 29, 2017 Permalink | Reply
    Tags: Dark Energy, Harnessing the energy generated when freshwater meets saltwater

    From Penn State via phys.org: “Harnessing the energy generated when freshwater meets saltwater” 

    Penn State Bloc

    Pennsylvania State University

    phys.org

    May 29, 2017
    Jennifer Matthews

    Credit: Pennsylvania State University

    Penn State researchers have created a new hybrid technology that produces unprecedented amounts of electrical power where seawater and freshwater combine at the coast.

    “The goal of this technology is to generate electricity from where the rivers meet the ocean,” said Christopher Gorski, assistant professor in environmental engineering at Penn State. “It’s based on the difference in the salt concentrations between the two water sources.”

    That difference in salt concentration has the potential to generate enough energy to meet up to 40 percent of global electricity demands. Though methods currently exist to capture this energy, the two most successful methods, pressure retarded osmosis (PRO) and reverse electrodialysis (RED), have thus far fallen short.

    PRO, the most common system, selectively allows water to transport through a semi-permeable membrane, while rejecting salt. The osmotic pressure created from this process is then converted into energy by turning turbines.

    “PRO is so far the best technology in terms of how much energy you can get out,” Gorski said. “But the main problem with PRO is that the membranes that transport the water through foul, meaning that bacteria grows on them or particles get stuck on their surfaces, and they no longer transport water through them.”

    This occurs because the holes in the membranes are incredibly small, so they become blocked easily. In addition, PRO doesn’t have the ability to withstand the necessary pressures of super salty waters.

    The second technology, RED, uses an electrochemical gradient to develop voltages across ion-exchange membranes.

    “Ion exchange membranes only allow either positively charged ions to move through them or negatively charged ions,” Gorski explained. “So only the dissolved salt is going through, and not the water itself.”

    Here, the energy is created when chloride or sodium ions are kept from crossing ion-exchange membranes as a result of selective ion transport. Ion-exchange membranes don’t require water to flow through them, so they don’t foul as easily as the membranes used in PRO; however, the problem with RED is that it doesn’t have the ability to produce large amounts of power.
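
    The size of the voltage a single ion-exchange membrane can develop is set by the ratio of salt concentrations on its two sides. As a rough illustration (using the standard Nernst relation, not a figure from the Penn State study), for a monovalent ion at room temperature:

    \[ E = \frac{RT}{zF}\ln\frac{c_{\mathrm{sea}}}{c_{\mathrm{fresh}}} \approx 25.7\ \mathrm{mV} \times \ln\frac{c_{\mathrm{sea}}}{c_{\mathrm{fresh}}} \]

    With seawater roughly 30 times saltier than river water, that works out to on the order of 90 millivolts per membrane, which is why practical RED systems stack many membrane pairs in series and still struggle to reach high power densities.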

    Photograph of the concentration flow cell. Two plates clamp the cell together, which contains two narrow channels fed with either synthetic freshwater or seawater through the plastic lines. Credit: Pennsylvania State University

    A third technology, capacitive mixing (CapMix), is a relatively new method also being explored. CapMix is an electrode-based technology that captures energy from the voltage that develops when two identical electrodes are sequentially exposed to two different kinds of water with varying salt concentrations, such as freshwater and seawater. Like RED, the problem with CapMix is that it’s not able to yield enough power to be viable.

    Gorski, along with Bruce Logan, Evan Pugh Professor and the Stan and Flora Kappe Professor of Environmental Engineering, and Taeyoung Kim, post-doctoral scholar in environmental engineering, may have found a solution to these problems. The researchers have combined both the RED and CapMix technologies in an electrochemical flow cell.

    “By combining the two methods, they end up giving you a lot more energy,” Gorski said.

    The team constructed a custom-built flow cell in which two channels were separated by an anion-exchange membrane. A copper hexacyanoferrate electrode was then placed in each channel, and graphite foil was used as a current collector. The cell was then sealed using two end plates with bolts and nuts. Once built, one channel was fed with synthetic seawater, while the other channel was fed with synthetic freshwater. Periodically switching the water’s flow paths allowed the cell to recharge and further produce power. From there, they examined how the cutoff voltage used for switching flow paths, external resistance and salt concentrations influenced peak and average power production.

    “There are two things going on here that make it work,” said Gorski. “The first is you have the salt going to the electrodes. The second is you have the chloride transferring across the membrane. Since both of these processes generate a voltage, you end up developing a combined voltage at the electrodes and across the membrane.”

    To determine the gained voltage of the flow cell depending on the type of membrane used and salinity difference, the team recorded open-circuit cell voltages while feeding two solutions at 15 milliliters per minute. Through this method, they identified that stacking multiple cells did influence electricity production. At 12.6 watts per square meter, this technology leads to peak power densities that are unprecedentedly high compared to previously reported RED (2.9 watts per square meter), and on par with the maximum calculated values for PRO (9.2 watts per square meter), but without the fouling problems.

    “What we’ve shown is that we can bring that power density up to what people have reported for pressure retarded osmosis and to a value much higher than what has been reported if you use these two processes alone,” Gorski said.

    Though the results are promising, the researchers want to do more research on the stability of the electrodes over time and want to know how other elements in seawater, like magnesium and sulfate, might affect the performance of the cell.

    “Pursuing renewable energy sources is important,” Gorski said. “If we can do carbon neutral energy, we should.”

    No science paper referenced.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Penn State Campus

    WHAT WE DO BEST

    We teach students that the real measure of success is what you do to improve the lives of others, and they learn to be hard-working leaders with a global perspective. We conduct research to improve lives. We add millions to the economy through projects in our state and beyond. We help communities by sharing our faculty expertise and research.

    Penn State lives close by no matter where you are. Our campuses are located from one side of Pennsylvania to the other. Through Penn State World Campus, students can take courses and work toward degrees online from anywhere on the globe that has Internet service.

    We support students in many ways, including advising and counseling services for school and life; diversity and inclusion services; social media sites; safety services; and emergency assistance.

    Our network of more than a half-million alumni is accessible to students when they want advice and to learn about job networking and mentor opportunities as well as what to expect in the future. Through our alumni, Penn State lives all over the world.

    The best part of Penn State is our people. Our students, faculty, staff, alumni, and friends in communities near our campuses and across the globe are dedicated to education and fostering a diverse and inclusive environment.

     
  • richardmitnick 9:41 pm on May 18, 2017 Permalink | Reply
    Tags: Dark Energy

    From Nautilus: “The Physicist Who Denies Dark Matter” Revised and Improved from post of 2017/03/01 

    Nautilus

    Nautilus

    May 18, 2017
    Oded Carmeli

    Mordehai Milgrom. Cosmos on Nautilus

    Maybe Newtonian physics doesn’t need dark matter to work.

    “He is one of those dark matter people,” Mordehai Milgrom said about a colleague stopping by his office at the Weizmann Institute of Science. Milgrom introduced us, telling me that his friend is searching for evidence of dark matter in a project taking place just down the hall.

    “There are no ‘dark matter people’ and ‘MOND people,’ ” his colleague retorted.

    http://www.astro.umd.edu/~ssm/mond/

    “I am ‘MOND people,’” Milgrom proudly proclaimed, referring to Modified Newtonian Dynamics, his theory that fixes Newtonian physics instead of postulating the existence of dark matter and dark energy—two things that, according to the standard model of cosmology, constitute 95.1 percent of the total mass-energy content of the universe.

    This friendly incident is indicative of Mordehai (“Moti”) Milgrom’s calmly quixotic character. There is something almost misleading about the 70-year-old physicist wearing shorts in the hot Israeli summer, whose soft voice breaks whenever he gets excited. Nothing about his pleasant demeanor reveals that this man claims to be the third person to correct Newtonian physics: First Max Planck (with quantum theory), then Einstein (with relativity), now Milgrom.

    This year marks Milgrom’s 50th year at the Weizmann.


    Weizmann Institute Campus

    I visited him there to learn more about how it feels to be a science maverick, what he appreciates about Thomas Kuhn’s The Structure of Scientific Revolutions, and why he thinks dark matter and dark energy don’t exist.

    NASA

    What inspired you to dedicate your life to the motion of stars?

    I remember very vividly the way physics struck me. I was 16 and I thought: Here is a way to understand how things work, far beyond the understanding of my peers. It wasn’t a long-term plan. It was a daily attraction. I simply loved physics, the same way other people love art or sports. I never dreamed of one day making a major discovery, like correcting Newton.

    I had a terrific physics teacher at school, but when you study textbook material, you’re studying done deals. You still don’t see the effort that goes into making breakthrough science, when things are unclear and advances are made intuitively and often go wrong. They don’t teach you that at school. They teach you that science always goes forward: You have a body of knowledge, and then someone discovers something and expands that body of knowledge. But it doesn’t really work that way. The progress of science is never linear.

    How did you get involved with the problem of dark matter?

    Toward the end of my Ph.D., the physics department here wanted to expand. So they asked three top Ph.D. students working on particle physics to choose a new field. We chose astrophysics, and the Weizmann Institute pulled some strings with institutions abroad so they would accept us as postdocs. And so I went to Cornell to fill my gaps in astrophysics.

    After a few years in high energy astrophysics, working on the physics of X-ray radiation in space, I decided to move to yet another field: The dynamics of galaxies. It was a few years after the first detailed measurements of the speed of stars orbiting spiral galaxies came in. And, well, there was a problem with the measurements.

    To understand this problem, one needs to wrap one’s head around some celestial rotations. Our planet orbits the sun, which, in turn, orbits the center of the Milky Way galaxy. Inside solar systems, the gravitational pull from the mass of the sun and the speed of the planets are in balance. By Newton’s laws, this is why Mercury, the innermost planet in our solar system, orbits the sun at over 100,000 miles per hour, while the outermost planet, Neptune, is crawling at just over 10,000 miles per hour.
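
    Those planetary speeds follow directly from Newton’s laws: for a roughly circular orbit of radius r around a central mass M, balancing gravity against the centripetal acceleration gives (a standard result, added here for context):

    \[ \frac{GMm}{r^2} = \frac{mv^2}{r} \quad\Longrightarrow\quad v = \sqrt{\frac{GM}{r}} \]

    so orbital speed falls off as one over the square root of distance, which is why Mercury, orbiting far closer to the sun, moves almost ten times faster than Neptune.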

    Milky Way NASA/JPL-Caltech /ESO R. Hurt

    Now, you might assume that the same logic would apply to galaxies: The farther away the star is from the galaxy’s center, the slower it revolves around it; however, while at smaller radiuses the measurements were as predicted by Newtonian physics, farther stars proved to move much faster than predicted from the gravitational pull of the mass we see in these galaxies. The observed gap got a lot wider when, in the late 1970s, radio telescopes were able to detect and measure the cold gas clouds at the outskirts of galaxies. These clouds orbit the galactic center five times farther than the stars, and thus the anomaly grew to become a major scientific puzzle.

    One way to solve this puzzle is to simply add more matter. If there is too little visible mass at the center of galaxies to account for the speed of stars and gas, perhaps there is more matter than meets the eye, matter that we cannot see, dark matter.

    What made you first question the very existence of dark matter?

    What struck me was some regularity in the anomaly. The rotational velocities were not just larger than expected, they became constant with radius. Why? Sure, if there was dark matter, the speed of stars would be greater, but the rotation curves, meaning the rotational speed drawn as a function of the radius, could still go up and down depending on its distribution. But they didn’t. That really struck me as odd. So, in 1980, I went on my sabbatical at the Institute for Advanced Study in Princeton with the following hunch: If the rotational speeds are constant, then perhaps we’re looking at a new law of nature. If Newtonian physics can’t predict the fixed curves, perhaps we should fix Newton, instead of making up a whole new class of matter just to fit our measurements.

    If you’re going to change the laws of nature that work so well in our own solar system, you need to find a property that differentiates solar systems from galaxies. So I made up a chart of different properties, such as size, mass, speed of rotation, etc. For each parameter, I put in the Earth, the solar system and some galaxies. For example, galaxies are bigger than solar systems, so perhaps Newton’s laws don’t work over large distances? But if this was the case, you would expect the rotation anomaly to grow bigger in bigger galaxies, while, in fact, it is not. So I crossed that one out and moved on to the next properties.

    I finally struck gold with acceleration: The pace at which the velocity of objects changes.

    NASA

    We usually think of earthbound cars that accelerate in the same direction, but imagine a merry-go-round. You could be going in circles and still accelerate. Otherwise, you would simply fall off. The same goes for celestial merry-go-rounds. And it’s in acceleration that we find a big difference in scales, one that justifies modifying Newton: The normal acceleration for a star orbiting the center of a galaxy is about a hundred million times smaller than that of the Earth orbiting the sun.

    For those small accelerations, MOND introduces a new constant of nature, called a0. If you studied physics in high school, you probably remember Newton’s second law: force equals mass times acceleration, or F = ma. While this is a perfectly good tool when dealing with accelerations much greater than a0, such as those of the planets around our sun, I suggested that at significantly lower accelerations, lower even than that of our sun around the galactic center, force becomes proportional to the square of the acceleration, or F = ma²/a0.

    To put it in other words: According to Newton’s laws, the rotation speed of stars around galactic centers should decrease the farther the star is from the center of mass. If MOND is correct, it should reach a constant value, thus eliminating the need for dark matter.
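
    A back-of-the-envelope way to see this (a sketch of the deep-MOND limit, not Milgrom’s full formulation): equate the modified law with the gravitational pull on a star of mass m circling a galaxy of visible mass M at radius r, using a = v²/r for circular motion:

    \[ \frac{m a^2}{a_0} = \frac{GMm}{r^2}, \qquad a = \frac{v^2}{r} \quad\Longrightarrow\quad v^4 = G M a_0 \]

    The radius drops out entirely, so far from the galactic centre the rotation speed settles to a constant value fixed only by the visible mass, which is exactly the flat rotation curve the measurements show.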

    What did your colleagues at Princeton think about all this?

    I didn’t share these thoughts with my colleagues at Princeton. I was afraid to come across as, well, crazy. And then, in 1981, when I already had a clear idea of MOND, I didn’t want anyone to jump on my wagon, so to speak, which is even crazier when you think about it. Needless to say [laughs] no one jumped on my wagon, even when I desperately wanted them to.

    Well, you were 35 and you proposed to fix Newton.

    Why not? What’s the big deal? If something doesn’t work, fix it. I wasn’t trying to be bold. I was very naïve at the time. I didn’t understand that scientists are just as swayed as other people by conventions and interests.

    Like Thomas Kuhn’s The Structure of Scientific Revolutions.


    I love that book. I read it several times. It showed me how my life’s story has happened to so many other scientists throughout history. Sure, it’s easy to make fun of people who once objected to what we now know is good science, but are we any different? Kuhn stresses that these objectors are usually good scientists with good reasons to object. It is just that the dissenters usually have a unique point of view of things that is not shared by most others. I laugh about it now, because MOND has made such progress, but there were times when I felt depressed and isolated.

    What’s it like being a science maverick?

    By and large, the last 35 years have been exciting and rewarding exactly because I have been advocating a maverick paradigm. I am a loner by nature, and despite the daunting and doubting times, I much prefer this to being carried with the general flow. I was quite confident in the basic validity of MOND from the very start, which helped me a lot in taking all this in stride, but there are two great advantages to the lingering opposition to MOND: Firstly, it gave me time to make more contributions to MOND than I would had the community jumped on the MOND wagon early on. Secondly, once MOND is accepted, the long and wide resistance to it will only have proven how nontrivial an idea it is.

    By the end of my sabbatical in Princeton, I had secretly written three papers introducing MOND to the world. Publishing them, however, was a whole different story. At first I sent my kernel paper to journals such as Nature and Astrophysical Journal Letters, and it got rejected almost off-hand. It took a long time until all three papers were published, side by side, in Astrophysical Journal.

    The first person to hear about MOND was my wife Yvonne. Frankly, tears come to my eyes when I say this. Yvonne is not a scientist, but she has been my greatest supporter.

    The first scientist to back MOND was another physics maverick: The late Professor Jacob Bekenstein, who was the first to suggest that black holes should have a well-defined entropy, later dubbed the Bekenstein-Hawking entropy. After I submitted the initial MOND trilogy, I sent the preprints to several astrophysicists, but Jacob was the first scientist I discussed MOND with. He was enthusiastic and encouraging from the very start.

    Slowly but surely, this tiny opposition to dark matter grew from just two physicists to several hundred proponents, or at least scientists who take MOND seriously. Dark matter is still the scientific consensus, but MOND is now a formidable opponent that proclaims the emperor has no clothes, that dark matter is our generation’s ether.

    So what happened? As far as dark matter is concerned, nothing really. A host of experiments searching for dark matter, including the Large Hadron Collider, many underground experiments and several space missions, have failed to directly observe its very existence. Meanwhile, MOND was able to accurately predict the rotation of more and more spiral galaxies—over 150 galaxies to date, to be precise.

    All of them? Some papers claim that MOND wasn’t able to predict the dynamics of certain galaxies.

    That’s true and it’s perfectly fine, because MOND’s predictions are based on measurements. Given the distribution of regular, visible matter alone, MOND can predict the dynamics of galaxies. But that prediction is based on our initial measurements. We measure the light coming in from a galaxy to calculate its mass, but we often don’t know the distance to that galaxy for sure, so we don’t know for certain just how massive that galaxy really is. And there are other variables, such as molecular gas, that we can’t observe at all. So yes, some galaxies don’t perfectly match MOND’s predictions, but all in all, it’s almost a miracle that we have enough data on galaxies to prove MOND right, over and over again.

    Your opponents say MOND’s greatest flaw is its incompatibility with relativistic physics.

    In 2004, Bekenstein proposed his TeVeS, or Relativistic Gravitational Theory for MOND.

    http://astroweb.case.edu/ssm/mond/

    Since then, several different relativistic MOND formulations have been put forth, including one by me, called Bimetric MOND, or BIMOND.

    So, no, incorporating MOND into Einsteinian physics is no longer a challenge. I hear this statement still made, but only from people who parrot others, who themselves are not abreast of the developments of the last 10 years. There are several relativistic versions of MOND. What remains a challenge is demonstrating that MOND can account for the mass anomalies in cosmology.

    Another argument that cosmologists often make is that dark matter is needed not just for motion within galaxies, but on even larger scales. What does MOND have to say about that?

    According to the Big Bang theory, the universe began as a uniform singularity 13.8 billion years ago. And, just as in galaxies, observations made of the cosmic background radiation from the early universe suggest that the gravity of all the matter in the universe is simply not enough to form the different patterns we currently see, like galaxies and stars, in just 13.8 billion years. Once again, dark matter was called to the rescue: It does not emit radiation, but it does engage visible material with gravitation. And so, starting from the 1980s, the new cosmological dogma was that dark matter constituted a staggering 95 percent of all matter in the universe. That lasted, well, right until the bomb hit us in 1998.

    It turned out that the expansion of the universe is accelerating, not decelerating like all of us originally thought.

    Timeline of the universe, assuming a cosmological constant. Coldcreation/wikimedia, CC BY-SA

    Any form of genuine matter, dark or not, should have slowed down acceleration. And so a whole new type of entity was invented: Dark energy. Now the accepted cosmology is that the universe is made up of 70 percent dark energy, 25 percent dark matter, and 5 percent regular matter.

    Dark energy depiction. (Image: Volker Springel/Max Planck Institute for Astrophysics)

    But dark energy is just a quick fix, the same as dark matter is. And just as in galaxies, you can either invent a whole new type of energy and then spend years trying to understand its properties, or you can try fixing your theory.

    Among other things, MOND points to a very deep connection between structure and dynamics in galaxies and cosmology. This is not expected in accepted physics. Galaxies are tiny structures within the grand scale of the universe, and those structures can behave differently without contradicting the current cosmological consensus. However, MOND creates this connection, binding the two.

    This connection is surprising: For whatever reason, the MOND constant a0 is close to the acceleration that characterizes the universe itself. In fact, MOND’s constant equals the speed of light squared, divided by the radius of the universe.
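
    The coincidence can be checked with rough numbers (an order-of-magnitude estimate, not figures quoted in the interview):

    \[ \frac{c^2}{R_{\mathrm{universe}}} \approx \frac{(3\times 10^{8}\ \mathrm{m/s})^2}{\sim 4\times 10^{26}\ \mathrm{m}} \approx 2\times 10^{-10}\ \mathrm{m/s^2} \]

    which is the same order of magnitude as the measured MOND constant, a0 ≈ 1.2 × 10^-10 m/s².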

    So, indeed, to your question, the conundrum pointed to is valid at present. MOND doesn’t have a sufficient cosmology yet, but we’re working on it. And once we fully understand MOND, I believe we’ll also fully understand the expansion of the universe, and vice versa: A new cosmological theory would explain MOND. Wouldn’t that be amazing?

    What do you think about the proposed unified theories of physics, which merge MOND with quantum mechanics?

    These all hark back to my 1999 paper on MOND as a vacuum effect, where it was pointed out that the quantum vacuum in a universe such as ours may produce MOND behavior within galaxies, with the cosmological constant appearing in the guise of the MOND acceleration constant, a0. But I am greatly gratified to see these propositions put forth, especially because they are made by people outside the traditional MOND community. It is very important that researchers from other backgrounds become interested in MOND and bring new ideas to further our understanding of its origin.

    And what if you had a unified theory of physics that explains everything? What then?

    You know, I’m not a religious person, but I often think about our tiny blue dot, and the painstaking work we physicists do here. Who knows? Perhaps somewhere out there, in one of those galaxies I spent my life researching, there already is a known unified theory of physics, with a variation of MOND built into it. But then I think: So what? We still had fun doing the math. We still had the thrill of trying to wrap our heads around the universe, even if the universe never noticed it at all.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 8:19 am on May 17, 2017 Permalink | Reply
    Tags: Dark Energy, New Explanation for Dark Energy? Tiny Fluctuations of Time and Space

    From Universe Today: “New Explanation for Dark Energy? Tiny Fluctuations of Time and Space” 

    universe-today

    Universe Today

    16 May , 2017
    Matt Williams

    A new study from researchers from the University of British Columbia offers a new explanation of Dark Energy. Credit: NASA

    Since the late 1920s, astronomers have been aware of the fact that the Universe is in a state of expansion. Initially predicted by Einstein’s Theory of General Relativity, this realization has gone on to inform the most widely-accepted cosmological model – the Big Bang Theory. However, things became somewhat confusing during the 1990s, when improved observations showed that the Universe’s rate of expansion has been accelerating for billions of years.

    This led to the theory of Dark Energy, a mysterious invisible force that is driving the expansion of the cosmos. Much like Dark Matter, which explained the “missing mass”, it then became necessary to find this elusive energy, or at least provide a coherent theoretical framework for it. A new study from the University of British Columbia (UBC) seeks to do just that by postulating that the Universe is expanding due to fluctuations in space and time.

    The study – which was recently published in the journal Physical Review D – was led by Qingdi Wang, a PhD student with the Department of Physics and Astronomy at UBC. Under the supervision of UBC Professor William Unruh (the man who proposed the Unruh Effect) and with assistance from Zhen Zhu (another PhD student at UBC), they provide a new take on Dark Energy.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe) Credit: Alex Mittelmann

    Inflationary Universe. NASA/WMAP

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 12:44 pm on May 9, 2017 Permalink | Reply
    Tags: Dark Energy, Detecting infrared light

    From JPL-Caltech: “NASA Delivers Detectors for ESA’s Euclid Spacecraft” 

    NASA JPL Banner

    JPL-Caltech

    May 9, 2017
    Elizabeth Landau
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-6425
    elizabeth.landau@jpl.nasa.gov

    Giuseppe Racca
    Euclid Project Manager
    Directorate of Science
    European Space Agency
    giuseppe.racca@esa.int

    René Laureijs
    Euclid Project Scientist
    Directorate of Science
    European Space Agency
    Rene.Laureijs@esa.int

    ESA/Euclid spacecraft

    Three detector systems for the Euclid mission, led by ESA (European Space Agency), have been delivered to Europe for the spacecraft’s near-infrared instrument. The detector systems are key components of NASA’s contribution to this upcoming mission to study some of the biggest questions about the universe, including those related to the properties and effects of dark matter and dark energy — two critical, but invisible phenomena that scientists think make up the vast majority of our universe.

    “The delivery of these detector systems is a milestone for what we hope will be an extremely exciting mission, the first space mission dedicated to going after the mysterious dark energy,” said Michael Seiffert, the NASA Euclid project scientist based at NASA’s Jet Propulsion Laboratory, Pasadena, California, which manages the development and implementation of the detector systems.

    Euclid will carry two instruments: a visible-light imager (VIS) and a near-infrared spectrometer and photometer (NISP). A special light-splitting plate on the Euclid telescope enables incoming light to be shared by both instruments, so they can carry out observations simultaneously.

    The spacecraft, scheduled for launch in 2020, will observe billions of faint galaxies and investigate why the universe is expanding at an accelerating pace. Astrophysicists think dark energy is responsible for this effect, and Euclid will explore this hypothesis and help constrain dark energy models. This census of distant galaxies will also reveal how galaxies are distributed in our universe, which will help astrophysicists understand how the delicate interplay of the gravity of dark matter, luminous matter and dark energy forms large-scale structures in the universe.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Additionally, the location of galaxies in relation to each other tells scientists how they are clustered. Dark matter, an invisible substance accounting for over 80 percent of matter in our universe, can cause subtle distortions in the apparent shapes of galaxies. That is because its gravity bends light that travels from a distant galaxy toward an observer, which changes the appearance of the galaxy when it is viewed from a telescope.

    Gravitational Lensing NASA/ESA

    Euclid’s combination of visible and infrared instruments will examine this distortion effect and allow astronomers to probe dark matter and the effects of dark energy.

    Detecting infrared light, which is invisible to the human eye, is especially important for studying the universe’s distant galaxies. Much like the Doppler effect for sound, where a siren’s pitch seems higher as it approaches and lower as it moves away, the frequency of light from an astronomical object gets shifted with motion. Light from objects that are traveling away from us appears redder, and light from those approaching us appears bluer. Because the universe is expanding, distant galaxies are moving away from us, so their light gets stretched out to longer wavelengths. Between 6 and 10 billion light-years away, galaxies are brightest in infrared light.
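
    The amount of stretching is usually quoted as a redshift z, defined by comparing observed and emitted wavelengths (the standard definition, added here for clarity):

    \[ z = \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}} \]

    A galaxy whose light arrives at twice the wavelength it was emitted at has z = 1; over the 6-to-10-billion-light-year range mentioned above, visible light from young galaxies is shifted well into the near-infrared band that Euclid’s infrared detectors are built to measure.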

    JPL procured the NISP detector systems, which were manufactured by Teledyne Imaging Sensors of Camarillo, California. They were tested at JPL and at NASA’s Goddard Space Flight Center, Greenbelt, Maryland, before being shipped to France and the NISP team.

    Each detector system consists of a detector, a cable and a “readout electronics chip” that converts infrared light to data signals read by an onboard computer and transmitted to Earth for analysis. Sixteen detectors will fly on Euclid, each composed of 2040 by 2040 pixels. They will cover a field of view slightly larger than twice the area covered by a full moon. The detectors are made of a mercury-cadmium-telluride mixture and are designed to operate at extremely cold temperatures.
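
    For scale, the combined near-infrared focal plane works out to (simple arithmetic, not a figure quoted in the article):

    \[ 16 \times 2040 \times 2040 \approx 6.7\times 10^{7}\ \text{pixels} \]

    roughly 67 megapixels of infrared-sensitive detector area operating at those extremely cold temperatures.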

    “The U.S. Euclid team has overcome many technical hurdles along the way, and we are delivering superb detectors that will enable the collection of unprecedented data during the mission,” said Ulf Israelsson, the NASA Euclid project manager, based at JPL.

    Delivery to ESA of the next set of detectors for NISP is planned in early June. The Centre de Physique de Particules de Marseille, France, will provide further characterization of the detector systems. The final detector focal plane will then be assembled at the Laboratoire d’Astrophysique de Marseille, and integrated with the rest of NISP for instrument tests.

    For more information about Euclid, visit:

    http://sci.esa.int/Euclid

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     
  • richardmitnick 12:58 pm on April 18, 2017 Permalink | Reply
    Tags: Dark Energy

    From EarthSky: “Who needs dark energy?” 


    EarthSky

    April 17, 2017
    Brian Koberlein

    Dark energy is thought to be the driver for the expansion of the universe. But do we need dark energy to account for an expanding universe?

    Image via Brian Koberlein/ One Universe at a Time.

    Our universe is expanding. We’ve known this for nearly a century, and modern observations continue to support this. Not only is our universe expanding, it is doing so at an ever-increasing rate. But the question remains as to what drives this cosmic expansion. The most popular answer is what we call dark energy. But do we need dark energy to account for an expanding universe? Perhaps not.

    The idea of dark energy comes from a property of general relativity known as the cosmological constant. The basic idea of general relativity is that the presence of matter curves the space around it (https://briankoberlein.com/2013/09/09/the-attraction-of-curves/). As a result, light and matter are deflected from simple straight paths in a way that resembles a gravitational force. The simplest mathematical model in relativity just describes this connection between matter and curvature, but it turns out that the equations also allow for an extra parameter, the cosmological constant, that can give space an overall rate of expansion. The cosmological constant perfectly describes the observed properties of dark energy, and it arises naturally in general relativity, so it’s a reasonable model to adopt.
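
    In equations, the cosmological constant enters the acceleration form of the Friedmann equations that govern the expansion in general relativity (a standard expression, included for reference rather than taken from the article):

    \[ \frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^2}\right) + \frac{\Lambda c^2}{3} \]

    Ordinary matter and radiation make the first term negative, decelerating the expansion, while a positive Λ contributes a constant positive term, which is the accelerating behaviour attributed to dark energy.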

    In classical relativity, the presence of a cosmological constant simply means that cosmic expansion is just a property of spacetime. But our universe is also governed by the quantum theory, and the quantum world doesn’t play well with the cosmological constant. One solution to this issue is that quantum vacuum energy might be driving cosmic expansion, but in quantum theory vacuum fluctuations would probably make the cosmological constant far larger than what we observe, so it isn’t a very satisfactory answer.

    Despite the unexplainable weirdness of dark energy, it matches observations so well that it has become part of the concordance model for cosmology, also known as the Lambda-CDM model. Here the Greek letter Lambda is the symbol for dark energy, and CDM stands for Cold Dark Matter.

    In this model there is a simple way to describe the overall shape of the cosmos, known as the Friedmann–Lemaître–Robertson–Walker (FLRW) metric. The only catch is that this assumes matter is distributed evenly throughout the universe. In the real universe matter is clumped together into clusters of galaxies, so the FLRW metric is only an approximation to the real shape of the universe. Since dark energy makes up about 70% of the mass/energy of the universe, the FLRW metric is generally thought to be a good approximation. But what if it isn’t?

    A new paper argues just that. Since matter clumps together, space would be more highly curved in those regions. In the large voids between the clusters of galaxies, there would be less space curvature. Relative to the clustered regions, the voids would appear to be expanding similarly to the appearance of dark energy. Using this idea the team ran computer simulations of a universe using this cluster effect rather than dark energy. They found that the overall structure evolved similarly to dark energy models.

    That would seem to support the idea that dark energy might be an effect of clustered galaxies.

    It’s an interesting idea, but there are reasons to be skeptical. While such clustering can have some effect on cosmic expansion, it wouldn’t be nearly as strong as we observe. While this particular model seems to explain the scale at which the clustering of galaxies occurs, it doesn’t explain other effects, such as observations of distant supernovae which strongly support dark energy. Personally, I don’t find this new model very convincing, but I think ideas like this are certainly worth exploring. If the model can be further refined, it could be worth another look.

    Paper: Gábor Rácz, et al., Concordance cosmology without dark energy, Monthly Notices of the Royal Astronomical Society: Letters (2017), DOI: 10.1093/mnrasl/slx026


    Dark Energy Camera [DECam], built at FNAL

    The Victor M. Blanco 4-meter telescope at Cerro Tololo, Chile, which houses DECam


     
  • richardmitnick 9:10 am on March 30, 2017 Permalink | Reply
    Tags: , , , , Dark Energy,   

    From RAS: “Explaining the accelerating expansion of the universe without dark energy” 

    Royal Astronomical Society

    30 March 2017

    Enigmatic ‘dark energy’, thought to make up 68% of the universe, may not exist at all, according to a Hungarian-American team. The researchers believe that standard models of the universe fail to take account of its changing structure, but that once this is done the need for dark energy disappears. The team publish their results in a paper in Monthly Notices of the Royal Astronomical Society.

    1
    A still from an animation that shows the expansion of the universe in the standard ‘Lambda Cold Dark Matter’ cosmology, which includes dark energy (top left panel, red); the new Avera model, which considers the structure of the universe and eliminates the need for dark energy (top middle panel, blue); and the Einstein-de Sitter cosmology, the original model without dark energy (top right panel, green). The panel at the bottom shows the increase of the ‘scale factor’ (an indication of the size of the universe) as a function of time, where 1 Gya is 1 billion years. The growth of structure can also be seen in the top panels. One dot roughly represents an entire galaxy cluster. Units of scale are in megaparsecs (Mpc), where 1 Mpc is around 30 million million million km. Credit: István Csabai et al.

    Our universe was formed in the Big Bang, 13.8 billion years ago, and has been expanding ever since. The key piece of evidence for this expansion is Hubble’s law, based on observations of galaxies, which states that on average, the speed with which a galaxy moves away from us is proportional to its distance.

    Astronomers measure this velocity of recession by looking at lines in the spectrum of a galaxy, which shift more towards red the faster the galaxy is moving away. From the 1920s, mapping the velocities of galaxies led scientists to conclude that the whole universe is expanding, and that it began life as a vanishingly small point.
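    As a minimal worked example (the numbers are made up, and the simple linear relation only holds for nearby galaxies where z << 1), here is how a measured wavelength shift becomes a recession velocity and a distance via Hubble’s law:

```python
# Minimal sketch with illustrative numbers; valid only at low redshift (z << 1).
C_KM_S = 299_792.458   # speed of light in km/s
H0 = 70.0              # assumed Hubble constant in km/s per Mpc

lambda_rest = 656.3    # H-alpha emission line at rest, in nm
lambda_obs = 663.0     # the same line observed in a galaxy's spectrum, in nm (assumed)

z = (lambda_obs - lambda_rest) / lambda_rest  # redshift
v = C_KM_S * z                                # recession velocity, low-z approximation
d = v / H0                                    # Hubble's law: distance in Mpc

print(f"z = {z:.4f}, v ~ {v:.0f} km/s, d ~ {d:.0f} Mpc")
```

    With these made-up numbers the galaxy recedes at roughly 3,000 km/s and sits about 44 Mpc away.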

    In the second half of the twentieth century, astronomers found evidence for unseen ‘dark matter’ by observing that something extra was needed to explain the motion of stars within galaxies. Dark matter is now thought to make up 27% of the content of the universe (in contrast, ‘ordinary’ matter amounts to only 5%).

    Observations of the explosions of white dwarf stars in binary systems, so-called Type Ia supernovae, in the 1990s then led scientists to the conclusion that a third component, dark energy, made up 68% of the cosmos, and is responsible for driving an acceleration in the expansion of the universe.
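    The supernova argument rests on Type Ia explosions having a nearly uniform intrinsic brightness, so comparing how bright one looks with how bright it really is gives its distance. A minimal sketch of that step, using assumed illustrative magnitudes rather than values from the article:

```python
# Minimal sketch: distance from the distance modulus m - M = 5 * log10(d / 10 pc).
M_PEAK = -19.3      # typical peak absolute magnitude of a Type Ia supernova (assumed)
m_observed = 24.0   # assumed apparent magnitude of a distant supernova

d_parsec = 10 ** ((m_observed - M_PEAK) / 5.0 + 1.0)
print(f"Luminosity distance ~ {d_parsec / 1e9:.1f} billion parsecs")
# The 1990s surveys found distant supernovae fainter (farther away) than a
# decelerating universe predicts -- the observation attributed to dark energy.
```

    Distant supernovae turning out systematically fainter than expected is the measurement that the accelerating-expansion claim, and hence dark energy, rests on.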

    In the new work, the researchers, led by PhD student Gábor Rácz of Eötvös Loránd University in Hungary, question the existence of dark energy and suggest an alternative explanation. They argue that conventional models of cosmology (the study of the origin and evolution of the universe) rely on approximations that ignore its structure and assume that matter has a uniform density.

    “Einstein’s equations of general relativity that describe the expansion of the universe are so complex mathematically, that for a hundred years no solutions accounting for the effect of cosmic structures have been found. We know from very precise supernova observations that the universe is accelerating, but at the same time we rely on coarse approximations to Einstein’s equations which may introduce serious side-effects, such as the need for dark energy, in the models designed to fit the observational data.” explains Dr László Dobos, co-author of the paper, also at Eötvös Loránd University.

    In practice, normal and dark matter appear to fill the universe with a foam-like structure, where galaxies are located on the thin walls between bubbles, and are grouped into superclusters. The insides of the bubbles are in contrast almost empty of both kinds of matter.

    Using a computer simulation to model the effect of gravity on the distribution of millions of particles of dark matter, the scientists reconstructed the evolution of the universe, including the early clumping of matter, and the formation of large scale structure.

    Unlike conventional simulations, which assume a smoothly expanding universe, taking the structure into account led to a model in which different regions of the cosmos expand at different rates. The average expansion rate, though, is consistent with present observations, which suggest an overall acceleration.

    Dr Dobos adds: “The theory of general relativity is fundamental in understanding the way the universe evolves. We do not question its validity; we question the validity of the approximate solutions. Our findings rely on a mathematical conjecture which permits the differential expansion of space, consistent with general relativity, and they show how the formation of complex structures of matter affects the expansion. These issues were previously swept under the rug but taking them into account can explain the acceleration without the need for dark energy.”

    If this finding is upheld, it could have a significant impact on models of the universe and the direction of research in physics. For the past 20 years, astronomers and theoretical physicists have speculated on the nature of dark energy, but it remains an unsolved mystery. With the new model, Csabai and his collaborators expect at the very least to start a lively debate.


    The Royal Astronomical Society (RAS), founded in 1820, encourages and promotes the study of astronomy, solar-system science, geophysics and closely related branches of science.

     
  • richardmitnick 2:39 pm on March 24, 2017 Permalink | Reply
    Tags: , , , , Dark Energy, , , ,   

    From WIRED: “Astronomers Don’t Point This Telescope—The Telescope Points Them” 

    Wired logo

    WIRED

    03.23.17
    Sarah Scoles

    1
    U Texas Austin McDonald Observatory Hobby-Eberly Telescope

    The hills of West Texas rise in waves around the Hobby-Eberly Telescope, a powerful instrument encased in a dome that looks like the Epcot ball. Soon, it will become more powerful still: Scientists recently primed the telescope to find evidence of dark energy in the early universe, prying open its eye so it can see and process a wide swath of sky. On April 8, scientists will dedicate the new telescope, capping off the $40 million upgrade and beginning the real work.

    The dark energy experiment, called Hetdex, isn’t how astronomy has traditionally been done. In the classical model, a lone astronomer goes to a mountaintop and solemnly points a telescope at one predetermined object. But Hetdex won’t look for any objects in particular; it will just scan the sky and churn petabytes of the resulting data through a silicon visual cortex. That’s only possible because of today’s steroidal computers, which let scientists analyze, store, and send such massive quantities of data.

    “Dark energy is not only terribly important for astronomy, it’s the central problem for physics. It’s been the bone in our throat for a long time.”

    Steven Weinberg
    Nobel Laureate
    University of Texas at Austin

    The hope is that so-called blind surveys like this one will find stuff astronomers never even knew to look for. In this realm, computers take over curation of the sky, telling astronomers what is interesting and worthy of further study, rather than the other way around. These wide-eyed projects are becoming a standard part of astronomers’ arsenal, and the greatest part about them is that their best discoveries are still totally TBD.

    Big Sky Country

    To understand dark energy—that mysterious stuff that pulls the taffy of spacetime—the Hetdex team needed Hobby-Eberly to study one million galaxies 9-11 billion light-years away as they fly away from Earth. To get that many galaxies in a reasonable amount of time, they broadened the view of its 91 tessellated stop-sign-shaped mirrors by a factor of 100. They also created an instrument called Virus, with 35,000 optical fibers that send the light from the universe to a spectrograph, which splits it up into constituent wavelengths. All that data can determine both how far away a galaxy is and how fast it’s traveling away from Earth.
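    At these distances, a redshift is best read as a direct measure of how much the universe has stretched since the light was emitted: 1 + z = λ_observed / λ_rest. A minimal sketch with assumed numbers (the choice of the Lyman-alpha line and the observed wavelength below are illustrative assumptions, not details from the article):

```python
# Minimal sketch with illustrative numbers: cosmological redshift as a stretch factor.
LAMBDA_REST = 121.6   # Lyman-alpha emission line at rest, in nm (ultraviolet)
lambda_obs = 486.4    # assumed observed wavelength after redshifting, in nm (visible)

z = lambda_obs / LAMBDA_REST - 1.0
print(f"z = {z:.1f}: wavelengths (and space itself) have stretched by a factor of {1 + z:.1f}")
```

    A redshift of about 3 corresponds to light emitted when the universe was a quarter of its present size — the kind of early, distant galaxy the survey is built to catch in bulk.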

    But when a telescope takes a ton of data down from the sky, scientists can also uncover the unexpected. Hetdex’s astronomers will find more than just the stretch marks of dark energy. They’ll discover things about supermassive black holes, star formation, dark matter, and the ages of stars in nearby galaxies.

    The classical method still has advantages; if you know exactly what you want to look at, you write up a nice proposal to Hubble and explain why a fixed gaze at the Whirlpool Galaxy would yield significant results. “But what you see is what you get,” says astronomer Douglas Hudgins. “This is an object, and the science of that object is what you’re stuck with.”


     