Tagged: ESA/Euclid

  • richardmitnick 8:04 am on November 27, 2017
    Tags: ESA/Euclid, Simulating the universe using Einstein’s theory of gravity may solve cosmic puzzles

    From ScienceNews: “Simulating the universe using Einstein’s theory of gravity may solve cosmic puzzles” 

    ScienceNews

    November 25, 2017
    Emily Conover

    Until recently, simulations of the universe haven’t given its lumps their due.

    UNEVEN TERRAIN Universe simulations that consider general relativity (one shown) may shift knowledge of the cosmos. James Mertens

    If the universe were a soup, it would be more of a chunky minestrone than a silky-smooth tomato bisque.

    Sprinkled with matter that clumps together due to the insatiable pull of gravity, the universe is a network of dense galaxy clusters and filaments — the hearty beans and vegetables of the cosmic stew. Meanwhile, relatively desolate pockets of the cosmos, known as voids, make up a thin, watery broth in between.

    Until recently, simulations of the cosmos’s history haven’t given the lumps their due. The physics of those lumps is described by general relativity, Albert Einstein’s theory of gravity. But that theory’s equations are devilishly complicated to solve. To simulate how the universe’s clumps grow and change, scientists have fallen back on approximations, such as the simpler but less accurate theory of gravity devised by Isaac Newton.

    Relying on such approximations, some physicists suggest, could be mucking with measurements, resulting in a not-quite-right inventory of the cosmos’s contents. A rogue band of physicists suggests that a proper accounting of the universe’s clumps could explain one of the deepest mysteries in physics: Why is the universe expanding at an increasingly rapid rate?

    The accepted explanation for that accelerating expansion is an invisible pressure called dark energy. In the standard theory of the universe, dark energy makes up about 70 percent of the universe’s “stuff” — its matter and energy. Yet scientists still aren’t sure what dark energy is, and finding its source is one of the most vexing problems of cosmology.

    Perhaps, the dark energy doubters suggest, the speeding up of the expansion has nothing to do with dark energy. Instead, the universe’s clumpiness may be mimicking the presence of such an ethereal phenomenon.

    Most physicists, however, feel that proper accounting for the clumps won’t have such a drastic impact. Robert Wald of the University of Chicago, an expert in general relativity, says that lumpiness is “never going to contribute anything that looks like dark energy.” So far, observations of the universe have been remarkably consistent with predictions based on simulations that rely on approximations.

    _____________________________________________________________________________

    Growing a lumpy universe

    The universe has gradually grown lumpier throughout its history. During inflation, rapid expansion magnified tiny quantum fluctuations into minute density variations. Over time, additional matter glommed on to dense spots due to the stronger gravitational pull from the extra mass. After 380,000 years, those blips were imprinted as hot and cold spots in the cosmic microwave background, the oldest light in the universe. Lumps continued growing for billions of years, forming stars, planets, galaxies and galaxy clusters.
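    The growth of those early density blips can be made quantitative. A minimal sketch, using the standard linear-growth equation of cosmological perturbation theory (a textbook result, not quoted from the article):

    ```latex
    % Linear growth of a small density contrast \delta \equiv \delta\rho / \bar{\rho}
    % in an expanding universe: cosmic expansion (the 2H\dot{\delta} drag term)
    % competes with gravity (the 4\pi G \bar{\rho}\,\delta source term).
    \ddot{\delta} + 2H\dot{\delta} - 4\pi G \bar{\rho}\,\delta = 0
    ```

    Here H is the Hubble expansion rate and ρ̄ the average matter density; overdense spots grow, but only as fast as the tug-of-war between gravity and expansion allows.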


    _____________________________________________________________________________

    As observations become more detailed, though, even slight inaccuracies in simulations could become troublesome. Already, astronomers are charting wide swaths of the sky in great detail, and planning more extensive surveys. To translate telescope images of starry skies into estimates of properties such as the amount of matter in the universe, scientists need accurate simulations of the cosmos’s history. If the detailed physics of clumps is important, then simulations could go slightly astray, sending estimates off-kilter. Some scientists already suggest that the lumpiness is behind a puzzling mismatch of two estimates of how fast the universe is expanding.

    Researchers are attempting to clear up the debate by conquering the complexities of general relativity and simulating the cosmos in its full, lumpy glory. “That is really the new frontier,” says cosmologist Sabino Matarrese of the University of Padua in Italy, “something that until a few years ago was considered to be science fiction.” In the past, he says, scientists didn’t have the tools to complete such simulations. Now researchers are sorting out the implications of the first published results of the new simulations. So far, dark energy hasn’t been explained away, but some simulations suggest that certain especially sensitive measurements of how light is bent by matter in the universe might be off by as much as 10 percent.

    Soon, simulations may finally answer the question: How much do lumps matter? The idea that cosmologists might have been missing a simple answer to a central problem of cosmology incessantly nags some skeptics. For them, results of the improved simulations can’t come soon enough. “It haunts me. I can’t let it go,” says cosmologist Rocky Kolb of the University of Chicago.

    Smooth universe

    By observing light from different eras in the history of the cosmos, cosmologists can compute the properties of the universe, such as its age and expansion rate. But to do this, researchers need a model, or framework, that describes the universe’s contents and how those ingredients evolve over time. Using this framework, cosmologists can perform computer simulations of the universe to make predictions that can be compared with actual observations.

    COSMIC WEB Clumps and filaments of matter thread through a simulated universe 2 billion light years across. This simulation incorporates some aspects of Einstein’s theory of general relativity, allowing for detailed results while avoiding the difficulties of the full-fledged theory.

    After Einstein introduced his theory in 1915, physicists set about figuring out how to use it to explain the universe. It wasn’t easy, thanks to general relativity’s unwieldy, difficult-to-solve suite of equations. Meanwhile, observations made in the 1920s indicated that the universe wasn’t static as previously expected; it was expanding. Eventually, researchers converged on a solution to Einstein’s equations known as the Friedmann-Lemaître-Robertson-Walker metric. Named after its discoverers, the FLRW metric describes a simplified universe that is homogeneous and isotropic, meaning that it appears identical at every point in the universe and in every direction. In this idealized cosmos, matter would be evenly distributed, no clumps. Such a smooth universe would expand or contract over time.
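    For orientation (added here, not part of the ScienceNews text), the FLRW line element for a spatially flat universe can be written as:

    ```latex
    % FLRW metric for a homogeneous, isotropic, spatially flat universe.
    % The scale factor a(t) carries the entire expansion history; matter's
    % lumpiness never appears in this idealized description.
    ds^2 = -c^2\,dt^2 + a(t)^2\left(dx^2 + dy^2 + dz^2\right)
    ```

    A single function of time, the scale factor a(t), describes how distances between points stretch as the universe expands.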

    A smooth-universe approximation is sensible, because when we look at the big picture, averaging over the structures of galaxy clusters and voids, the universe is remarkably uniform. It’s similar to the way that a single spoonful of minestrone soup might be mostly broth or mostly beans, but from bowl to bowl, the overall bean-to-broth ratios match.

    In 1998, cosmologists revealed that not only was the universe expanding, but its expansion was also accelerating (SN: 2/2/08, p. 74). Observations of distant exploding stars, or supernovas, indicated that the space between us and them was expanding at an increasing clip. But gravity should slow the expansion of a universe evenly filled with matter. To account for the observed acceleration, scientists needed another ingredient, one that would speed up the expansion. So they added dark energy to their smooth-universe framework.

    Now, many cosmologists follow a basic recipe to simulate the universe — treating the cosmos as if it has been run through an imaginary blender to smooth out its lumps, adding dark energy and calculating the expansion via general relativity. On top of the expanding slurry, scientists add clumps and track their growth using approximations, such as Newtonian gravity, which simplifies the calculations.
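    In that recipe, the expansion of the smoothed-out universe is governed by the Friedmann equation; a standard form for a flat universe, with a cosmological constant playing the role of dark energy (shown for reference, not quoted from the article), is:

    ```latex
    % Friedmann equation for a flat FLRW universe: the expansion rate H is set by
    % the average matter density \rho_m, radiation density \rho_r, and a
    % cosmological constant \Lambda (the simplest model of dark energy).
    H^2 \equiv \left(\frac{\dot a}{a}\right)^2
        = \frac{8\pi G}{3}\left(\rho_{\rm m} + \rho_{\rm r}\right) + \frac{\Lambda c^2}{3}
    ```

    As matter and radiation thin out with the expansion, the constant Λ term eventually dominates and the expansion accelerates; that is the job dark energy does in the standard framework.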

    In most situations, Newtonian gravity and general relativity are near-twins. Throw a ball while standing on the surface of the Earth, and it doesn’t matter whether you use general relativity or Newtonian mechanics to calculate where the ball will land — you’ll get the same answer. But there are subtle differences. In Newtonian gravity, matter directly attracts other matter. In general relativity, gravity is the result of matter and energy warping spacetime, creating curves that alter the motion of objects (SN: 10/17/15, p. 16). The two theories diverge in extreme gravitational environments. In general relativity, for example, hulking black holes produce inescapable pits that reel in light and matter (SN: 5/31/14, p. 16). The question, then, is whether the difference between the two theories has any impact in lumpy-universe simulations.
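    The contrast can be written down compactly. In Newtonian gravity a single potential responds to mass density; in general relativity the full content of matter and energy curves spacetime (standard equations, included here for illustration):

    ```latex
    % Newtonian gravity: one linear equation; the potential \Phi is sourced by
    % the mass density \rho.
    \nabla^2 \Phi = 4\pi G \rho

    % General relativity: the Einstein field equations; spacetime curvature
    % (G_{\mu\nu}) responds to the full energy-momentum tensor (T_{\mu\nu}).
    G_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
    ```

    The Newtonian equation is one linear equation; Einstein's are ten coupled nonlinear ones, which is why fully general-relativistic simulations are so much more expensive.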

    Most cosmologists are comfortable with the status quo simulations because observations of the heavens seem to fit neatly together like interlocking jigsaw puzzle pieces. Predictions based on the standard framework agree remarkably well with observations of the cosmic microwave background — ancient light released when the universe was just 380,000 years old (SN: 3/21/15, p. 7). And measurements of cosmological parameters — the fraction of dark energy and matter, for example — are generally consistent, whether they are made using the light from galaxies or the cosmic microwave background [CMB].

    CMB per ESA/Planck


    ESA/Planck

    An image from the Two-Micron All Sky Survey of 1.6 million galaxies in infrared light reveals how matter clumps into galaxy clusters and filaments. Future large-scale surveys may require improved simulations that use general relativity to track the evolution of lumps over time. T.H. Jarrett, J. Carpenter & R. Hurt, obtained as part of 2MASS, a joint project of Univ. of Massachusetts and the Infrared Processing and Analysis Center/Caltech, funded by NASA and NSF.


    Caltech 2MASS Telescopes, a joint project of the University of Massachusetts and the Infrared Processing and Analysis Center (IPAC) at Caltech, at the Whipple Observatory on Mt. Hopkins south of Tucson, AZ, and at the Cerro Tololo Inter-American Observatory near La Serena, Chile.

    Dethroning dark energy

    Some cosmologists hope to explain the universe’s accelerating expansion by fully accounting for the universe’s lumpiness, with no need for the mysterious dark energy.

    These researchers argue that clumps of matter can alter how the universe expands, when the clumps’ influence is tallied up over wide swaths of the cosmos. That’s because, in general relativity, the expansion of each local region of space depends on how much matter is within. Voids expand faster than average; dense regions expand more slowly. Because the universe is mostly made up of voids, this effect could produce an overall expansion and potentially an acceleration. Known as backreaction, this idea has lingered in obscure corners of physics departments for decades, despite many claims that backreaction’s effect is small or nonexistent.
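    One common way to make "tallied up over wide swaths" precise is spatial averaging. In the Buchert averaging formalism, a standard starting point for backreaction studies (sketched here from the general literature, not from the article), the averaged expansion of a region obeys:

    ```latex
    % Buchert-averaged acceleration equation for a spatial domain D.
    % a_D is the effective scale factor of the domain, <\rho>_D the averaged matter
    % density, and Q_D the "kinematical backreaction" term built from fluctuations
    % in the local expansion rate \theta and the shear \sigma.
    3\,\frac{\ddot{a}_D}{a_D} = -4\pi G\,\langle \rho \rangle_D + Q_D,
    \qquad
    Q_D = \frac{2}{3}\left(\langle \theta^2 \rangle_D - \langle \theta \rangle_D^2\right)
          - 2\,\langle \sigma^2 \rangle_D
    ```

    If the spread between fast-expanding voids and slow-expanding dense regions were large enough, Q_D could mimic an accelerating term; the argument is over whether it can plausibly be anywhere near that large.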

    Backreaction continues to appeal to some researchers because they don’t have to invent new laws of physics to explain the acceleration of the universe. “If there is an alternative which is based only upon traditional physics, why throw that away completely?” Matarrese asks.

    Most cosmologists, however, think explaining away dark energy just based on the universe’s lumps is unlikely. Previous calculations have indicated any effect would be too small to account for dark energy, and would produce an acceleration that changes in time in a way that disagrees with observations.

    “My personal view is that it’s a much smaller effect,” says astrophysicist Hayley Macpherson of Monash University in Melbourne, Australia. “That’s just basically a gut feeling.” Theories that include dark energy explain the universe extremely well, she points out. How could that be if the whole approach is flawed?

    New simulations by Macpherson and others that model how lumps evolve in general relativity may be able to gauge the importance of backreaction once and for all. “Up until now, it’s just been too hard,” says cosmologist Tom Giblin of Kenyon College in Gambier, Ohio.

    To perform the simulations, researchers needed to get their hands on supercomputers capable of grinding through the equations of general relativity as the simulated universe evolves over time. Because general relativity is so complex, such simulations are much more challenging than those that use approximations, such as Newtonian gravity. But, a seemingly distinct topic helped lay some of the groundwork: gravitational waves, or ripples in the fabric of spacetime.

    SPECKLED SPACETIME A lumpy universe, recently simulated using general relativity, shows clumps of matter (pink and yellow) that beget stars and galaxies. H. Macpherson, Paul Lasky, Daniel Price.

    The Advanced Laser Interferometer Gravitational-Wave Observatory, LIGO, searches for the tremors of cosmic dustups such as colliding black holes (SN: 10/28/17, p. 8).


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    ESA/eLISA the future of gravitational wave research

    Skymap showing how adding Virgo to LIGO helps reduce the size of the likely source region in the sky. (Credit: Giuseppe Greco, Virgo Urbino group)

    In preparation for this search, physicists honed their general relativity skills on simulations of the spacetime storm kicked up by black holes, predicting what LIGO might see and building up the computational machinery to solve the equations of general relativity. Now, cosmologists have adapted those techniques and unleashed them on entire, lumpy universes.

    The first lumpy-universe simulations to use full general relativity were unveiled in June 2016 in Physical Review Letters. Giblin and colleagues reported their results simultaneously with Eloisa Bentivegna of the University of Catania in Italy and Marco Bruni of the University of Portsmouth in England.

    So far, the simulations have not been able to account for the universe’s acceleration. “Nearly everybody is convinced [the effect] is too small to explain away the need for dark energy,” says cosmologist Martin Kunz of the University of Geneva. Kunz and colleagues reached the same conclusion in their lumpy-universe simulations, which have one foot in general relativity and one in Newtonian gravity. They reported their first results in Nature Physics in March 2016.

    Backreaction aficionados still aren’t dissuaded. “Before saying the effect is too small to be relevant, I would, frankly, wait a little bit more,” Matarrese says. And the new simulations have potential caveats. For example, some simulated universes behave like an old arcade game — if you walk to one edge of the universe, you cross back over to the other side, like Pac-Man exiting the right side of the screen and reappearing on the left. That geometry would suppress the effects of backreaction in the simulation, says Thomas Buchert of the University of Lyon in France. “This is a good beginning,” he says, but there is more work to do on the simulations. “We are in infancy.”

    Different assumptions in a simulation can lead to disparate results, Bentivegna says. As a result, she doesn’t think that her lumpy, general-relativistic simulations have fully closed the door on efforts to dethrone dark energy. For example, tricks of light might be making it seem like the universe’s expansion is accelerating, when in fact it isn’t.

    When astronomers observe far-away sources like supernovas, the light has to travel past all of the lumps of matter between the source and Earth. That journey could make it look like there’s an acceleration when none exists. “It’s an optical illusion,” Bentivegna says. She and colleagues see such an effect in a simulation reported in March in the Journal of Cosmology and Astroparticle Physics. But, she notes, this work simulated an unusual universe, in which matter sits on a grid — not a particularly realistic scenario.

    For most other simulations, the effect of optical illusions remains small. That leaves many cosmologists, including Giblin, even more skeptical of the possibility of explaining away dark energy: “I feel a little like a downer,” he admits.

    Lumps (gray) within this simulated universe change the path light takes (yellow lines), potentially affecting observations. Matter bends space, slightly altering the light’s trajectory from that in a smooth universe. James Mertens.

    Surveying the skies

    Subtle effects of lumps could still be important. In Hans Christian Andersen’s The Princess and the Pea, the princess felt a tiny pea beneath an impossibly tall stack of mattresses. Likewise, cosmologists’ surveys are now so sensitive that even if the universe’s lumps have a small impact, estimates could be thrown out of whack.

    The Dark Energy Survey, for example, has charted 26 million galaxies using the Victor M. Blanco Telescope in Chile, measuring how the light from those galaxies is distorted by the intervening matter on the journey to Earth.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    In a set of papers posted online August 4 at arXiv.org, scientists with the Dark Energy Survey reported new measurements of the universe’s properties, including the amount of matter (both dark and normal) and how clumpy that matter is (SN: 9/2/17, p. 32). The results are consistent with those from the cosmic microwave background [CMB] — light emitted billions of years earlier.

    To make the comparison, cosmologists took the measurements from the cosmic microwave background, early in the universe, and used simulations to extrapolate to what galaxies should look like later in the universe’s history. It’s like taking a baby’s photograph, precisely computing the number and size of wrinkles that should emerge as the child ages and finding that your picture agrees with a snapshot taken decades later. The matching results so far confirm cosmologists’ standard picture of the universe — dark energy and all.

    “So far, it has not yet been important for the measurements that we’ve made to actually include general relativity in those simulations,” says Risa Wechsler, a cosmologist at Stanford University and a founding member of the Dark Energy Survey. But, she says, for future measurements, “these effects could become more important.” Cosmologists are edging closer to Princess and the Pea territory.

    Those future surveys include the Dark Energy Spectroscopic Instrument, DESI, set to kick off in 2019 at Kitt Peak National Observatory near Tucson; the European Space Agency’s Euclid satellite, launching in 2021; and the Large Synoptic Survey Telescope in Chile, which is set to begin collecting data in 2023.

    LBNL/DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory, Altitude 2,120 m (6,960 ft)

    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)

    ESA/Euclid spacecraft

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    If cosmologists keep relying on simulations that don’t use general relativity to account for lumps, certain kinds of measurements of weak lensing — the bending of light due to matter acting like a lens — could be off by up to 10 percent, Giblin and colleagues reported at arXiv.org in July. “There is something that we’ve been ignoring by making approximations,” he says.

    That 10 percent could screw up all kinds of estimates, from how dark energy changes over the universe’s history to how fast the universe is currently expanding, to the calculations of the masses of ethereal particles known as neutrinos. “You have to be extremely certain that you don’t get some subtle effect that gets you the wrong answers,” Geneva’s Kunz says, “otherwise the particle physicists are going to be very angry with the cosmologists.”

    Some estimates may already be showing problem signs, such as the conflicting estimates of the cosmic expansion rate (SN: 8/6/16, p. 10). Using the cosmic microwave background, cosmologists find a slower expansion rate than they do from measurements of supernovas. If this discrepancy is real, it could indicate that dark energy changes over time. But before jumping to that conclusion, there are other possible causes to rule out, including the universe’s lumps.

    Until the issue of lumps is smoothed out, scientists won’t know how much lumpiness matters to the cosmos at large. “I think it’s rather likely that it will turn out to be an important effect,” Kolb says. Whether it explains away dark energy is less certain. “I want to know the answer so I can get on with my life.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 2:16 pm on June 10, 2017
    Tags: ESA/Euclid, The largest virtual Universe ever simulated, U Zürich

    From U Zürich: “The largest virtual Universe ever simulated.” 

    University of Zürich

    9 June 2017
    Contact
    Prof. Dr. Romain Teyssier
    romain.teyssier@uzh.ch
    Institute for Computational Science
    University of Zurich
    +41 44 635 60 20

    Dr. Joachim Stadel
    stadel@physik.uzh.ch
    Institute for Computational Science
    University of Zurich
    Phone: +41 44 635 58 16

    Researchers from the University of Zürich have simulated the formation of our entire Universe with a large supercomputer. A gigantic catalogue of about 25 billion virtual galaxies has been generated from 2 trillion digital particles. This catalogue is being used to calibrate the experiments on board the Euclid satellite, which will be launched in 2020 with the objective of investigating the nature of dark matter and dark energy.

    ESA/Euclid spacecraft

    The Cosmic Web: a section of the virtual universe, a billion light-years across, showing how dark matter is distributed in space, with dark matter halos (the yellow clumps) interconnected by dark filaments. Cosmic voids, shown as the white areas, are the lowest-density regions in the Universe. (Image: Joachim Stadel, UZH)

    Over a period of three years, a group of astrophysicists from the University of Zürich has developed and optimised a revolutionary code to describe with unprecedented accuracy the dynamics of dark matter and the formation of large-scale structures in the Universe. As Joachim Stadel, Douglas Potter and Romain Teyssier report in their recently published paper [Computational Astrophysics and Cosmology], the code (called PKDGRAV3) has been designed to make optimal use of the available memory and processing power of modern supercomputing architectures, such as the “Piz Daint” supercomputer of the Swiss National Supercomputing Center (CSCS). The code was executed on this world-leading machine for only 80 hours, and generated a virtual universe of two trillion (i.e., two thousand billion, or 2 × 10^12) macro-particles representing the dark matter fluid, from which a catalogue of 25 billion virtual galaxies was extracted.
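    The press release does not describe PKDGRAV3’s internals, but the basic object such codes evolve is simply a large set of particles moving under their mutual gravity. The toy sketch below (a direct-summation leapfrog integrator in Python, written for illustration only) conveys the idea; a two-trillion-particle production code relies on far more sophisticated tree and fast-multipole methods, periodic boundaries and comoving coordinates.

    ```python
    import numpy as np

    def leapfrog_step(pos, vel, mass, dt, G=1.0, eps=1e-2):
        """Advance an N-body system by one kick-drift-kick leapfrog step.

        pos, vel : (N, 3) arrays of particle positions and velocities
        mass     : (N,)  array of particle masses
        eps      : softening length, avoids singular forces in close encounters
        """
        def accelerations(p):
            # Pairwise separation vectors: diff[i, j] = p[j] - p[i]
            diff = p[None, :, :] - p[:, None, :]
            dist2 = (diff ** 2).sum(axis=-1) + eps ** 2
            inv_d3 = dist2 ** -1.5
            np.fill_diagonal(inv_d3, 0.0)  # no self-force
            # Acceleration on particle i: G * sum_j m_j (p_j - p_i) / |p_j - p_i|^3
            return G * (diff * inv_d3[..., None] * mass[None, :, None]).sum(axis=1)

        vel = vel + 0.5 * dt * accelerations(pos)  # half kick
        pos = pos + dt * vel                       # drift
        vel = vel + 0.5 * dt * accelerations(pos)  # half kick
        return pos, vel

    # A tiny random system, purely for demonstration.
    rng = np.random.default_rng(0)
    pos = rng.standard_normal((64, 3))
    vel = np.zeros((64, 3))
    mass = np.full(64, 1.0 / 64)
    for _ in range(200):
        pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
    ```

    This brute-force version costs O(N²) per step; the whole art of codes like PKDGRAV3 is doing the equivalent work for trillions of particles in far less than O(N²) time.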

    Cray Piz Daint supercomputer of the Swiss National Supercomputing Center (CSCS)

    Studying the composition of the dark universe

    Thanks to the high precision of their calculation, featuring a dark matter fluid evolving under its own gravity, the researchers have simulated the formation of small concentrations of matter, called dark matter halos, in which we believe galaxies like the Milky Way form.

    Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    The challenge of this simulation was to model galaxies as small as one tenth of the Milky Way, in a volume as large as our entire observable Universe. This was the requirement set by the European Euclid mission, whose main objective is to explore the dark side of the Universe.

    Measuring subtle distortions

    Indeed, about 95 percent of the Universe is dark. The cosmos consists of about 23 percent dark matter and 72 percent dark energy. “The nature of dark energy remains one of the main unsolved puzzles in modern science,” says Romain Teyssier, UZH professor for computational astrophysics.

    Earthbound science of Dark Energy

    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile

    A puzzle that can be cracked only through indirect observation: when the Euclid satellite captures the light of billions of galaxies in large areas of the sky, astronomers will measure very subtle distortions that arise from the deflection of light of these background galaxies by a foreground, invisible distribution of mass – dark matter. “That is comparable to the distortion of light by a somewhat uneven glass pane,” says Joachim Stadel from the Institute for Computational Science of the UZH.

    Optimizing observation strategies of the satellite

    This new virtual galaxy catalogue will help optimize the observational strategy of the Euclid experiment and minimize various sources of error, before the satellite embarks on its six-year data-collecting mission in 2020. “Euclid will perform a tomographic map of our Universe, tracing back in time more than 10 billion years of evolution in the cosmos,” Stadel says. From the Euclid data, researchers will obtain new information on the nature of this mysterious dark energy, but also hope to discover new physics beyond the standard model, such as a modified version of general relativity or a new type of particle.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The University of Zürich (UZH, German: Universität Zürich), located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, medicine and a new faculty of philosophy.

    Currently, the university has seven faculties: Philosophy, Human Medicine, Economic Sciences, Law, Mathematics and Natural Sciences, Theology and Veterinary Medicine. The university offers the widest range of subjects and courses of any Swiss higher education institution.

     
  • richardmitnick 12:44 pm on May 9, 2017
    Tags: Detecting infrared light, ESA/Euclid

    From JPL-Caltech: “NASA Delivers Detectors for ESA’s Euclid Spacecraft” 

    NASA JPL Banner

    JPL-Caltech

    May 9, 2017
    Elizabeth Landau
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-6425
    elizabeth.landau@jpl.nasa.gov

    Giuseppe Racca
    Euclid Project Manager
    Directorate of Science
    European Space Agency
    giuseppe.racca@esa.int

    René Laureijs
    Euclid Project Scientist
    Directorate of Science
    European Space Agency
    Rene.Laureijs@esa.int

    ESA/Euclid spacecraft

    Three detector systems for the Euclid mission, led by ESA (European Space Agency), have been delivered to Europe for the spacecraft’s near-infrared instrument. The detector systems are key components of NASA’s contribution to this upcoming mission to study some of the biggest questions about the universe, including those related to the properties and effects of dark matter and dark energy — two critical, but invisible phenomena that scientists think make up the vast majority of our universe.

    “The delivery of these detector systems is a milestone for what we hope will be an extremely exciting mission, the first space mission dedicated to going after the mysterious dark energy,” said Michael Seiffert, the NASA Euclid project scientist based at NASA’s Jet Propulsion Laboratory, Pasadena, California, which manages the development and implementation of the detector systems.

    Euclid will carry two instruments: a visible-light imager (VIS) and a near-infrared spectrometer and photometer (NISP). A special light-splitting plate on the Euclid telescope enables incoming light to be shared by both instruments, so they can carry out observations simultaneously.

    The spacecraft, scheduled for launch in 2020, will observe billions of faint galaxies and investigate why the universe is expanding at an accelerating pace. Astrophysicists think dark energy is responsible for this effect, and Euclid will explore this hypothesis and help constrain dark energy models. This census of distant galaxies will also reveal how galaxies are distributed in our universe, which will help astrophysicists understand how the delicate interplay of the gravity of dark matter, luminous matter and dark energy forms large-scale structures in the universe.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Additionally, the location of galaxies in relation to each other tells scientists how they are clustered. Dark matter, an invisible substance accounting for over 80 percent of matter in our universe, can cause subtle distortions in the apparent shapes of galaxies. That is because its gravity bends light that travels from a distant galaxy toward an observer, which changes the appearance of the galaxy when it is viewed from a telescope.

    Gravitational Lensing NASA/ESA

    Euclid’s combination of visible and infrared instruments will examine this distortion effect and allow astronomers to probe dark matter and the effects of dark energy.

    Detecting infrared light, which is invisible to the human eye, is especially important for studying the universe’s distant galaxies. Much like the Doppler effect for sound, where a siren’s pitch seems higher as it approaches and lower as it moves away, the frequency of light from an astronomical object gets shifted with motion. Light from objects that are traveling away from us appears redder, and light from those approaching us appears bluer. Because the universe is expanding, distant galaxies are moving away from us, so their light gets stretched out to longer wavelengths. Between 6 and 10 billion light-years away, galaxies are brightest in infrared light.
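    The stretching can be quantified by the redshift z; a standard definition (added here for clarity, not part of the JPL text):

    ```latex
    % Cosmological redshift: the observed wavelength is stretched relative to the
    % emitted wavelength by the growth of the scale factor a(t) between emission
    % and observation.
    1 + z = \frac{\lambda_{\rm obs}}{\lambda_{\rm emit}} = \frac{a(t_{\rm obs})}{a(t_{\rm emit})}
    ```

    At z ≈ 1, for instance, visible light emitted at 600 nanometers arrives at roughly 1,200 nanometers, well into the near-infrared range that NISP is built to detect.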

    JPL procured the NISP detector systems, which were manufactured by Teledyne Imaging Sensors of Camarillo, California. They were tested at JPL and at NASA’s Goddard Space Flight Center, Greenbelt, Maryland, before being shipped to France and the NISP team.

    Each detector system consists of a detector, a cable and a “readout electronics chip” that converts infrared light to data signals read by an onboard computer and transmitted to Earth for analysis. Sixteen detectors will fly on Euclid, each composed of 2040 by 2040 pixels. They will cover a field of view slightly larger than twice the area covered by a full moon. The detectors are made of a mercury-cadmium-telluride mixture and are designed to operate at extremely cold temperatures.
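    Those figures imply a focal plane of roughly 67 million near-infrared pixels; a quick back-of-the-envelope check (a hypothetical snippet that simply restates the numbers in the paragraph above):

    ```python
    # Back-of-the-envelope pixel count for the NISP focal plane,
    # using the figures quoted in the article.
    detectors = 16
    pixels_per_side = 2040

    total_pixels = detectors * pixels_per_side ** 2
    print(f"{total_pixels:,} pixels")  # 66,585,600 -> roughly 67 megapixels
    ```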

    “The U.S. Euclid team has overcome many technical hurdles along the way, and we are delivering superb detectors that will enable the collection of unprecedented data during the mission,” said Ulf Israelsson, the NASA Euclid project manager, based at JPL.

    Delivery to ESA of the next set of detectors for NISP is planned for early June. The Centre de Physique des Particules de Marseille, France, will provide further characterization of the detector systems. The final detector focal plane will then be assembled at the Laboratoire d’Astrophysique de Marseille and integrated with the rest of NISP for instrument tests.

    For more information about Euclid, visit:

    http://sci.esa.int/Euclid

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge [1], on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     