Tagged: Cosmology

  • richardmitnick 4:17 pm on June 13, 2016 Permalink | Reply
Tags: Cosmology

From LBL: “Researchers Gear Up Galaxy-seeking Robots for a Test Run”

    Berkeley Logo

    Berkeley Lab

    June 13, 2016
    Glenn Roberts Jr
    geroberts@lbl.gov
    510-486-5582

Parker Fagrelius of Berkeley Lab and UC Berkeley inspects ProtoDESI, a prototype system for the Dark Energy Spectroscopic Instrument. ProtoDESI will be tested at the Mayall Telescope in Arizona in August and September. (Credit: Paul Mueller/Berkeley Lab)

    A prototype system, designed as a test for a planned array of 5,000 galaxy-seeking robots, is taking shape at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab).

    Dubbed ProtoDESI, the scaled-down, 10-robot system will help scientists achieve the pinpoint accuracy needed to home in on millions of galaxies, quasars and stars with the Dark Energy Spectroscopic Instrument (DESI) planned for the Mayall Telescope at Kitt Peak National Observatory near Tucson, Ariz. ProtoDESI will be installed on the Mayall Telescope this August and September.

LBL/DESI spectroscopic instrument

NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA
NOAO Mayall 4 m telescope interior

    The full DESI project, which is managed by Berkeley Lab, involves about 200 scientists and about 45 institutions from around the globe. DESI will provide the most detailed 3-D map of the universe and probe the secrets of dark energy, which is accelerating the universe’s expansion. It is also expected to improve our understanding of dark matter, the infant universe, and the structure of our own galaxy.

Milky Way NASA/JPL-Caltech/ESO R. Hurt

Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

ProtoDESI will have 10 rodlike robots (above, left)—10 inches long and designed to point fiber-optic cables at sky objects and gather light—and 16 light-emitting devices (middle) to ensure the system is targeting correctly. The ProtoDESI setup (right, with robots and light rods shown in yellow circle) will be tested at the Mayall Telescope in Arizona from August to September.

DESI robots (right) will poke out from 10 wedge-shaped “petals” that will be fitted together in a Focal Plate Assembly (left).

DESI’s robots can point their fiber-optic cables (red dots, upper left) at any sky object (blue dots) within a 12-millimeter-diameter area. A rendering of the robotic array (right) with an overlay of a star field that can be reached by one robot.

A rendering of DESI inside the Mayall Telescope (left). The robotic array is contained in a gray-shaded structure pointed toward the top of the dome.

    (Credits: Berkeley Lab, University of Michigan, DESI Collaboration, NOAO.)

    The thin, cylindrical robots that will be tested in ProtoDESI each carry a fiber-optic cable that will be precisely pointed at selected objects in the night sky in order to capture their light. A predecessor galaxy-measuring project, called BOSS, required the light-gathering cables to be routinely plugged by hand into metal plates with holes drilled to match the position of pre-selected sky objects. DESI will automate and greatly speed up this process.

Each 10-inch-long robot has two small motors that allow two independent rotating motions to position a fiber anywhere within a circular area 12 millimeters in diameter. In the completed DESI array, these motions will enable the 5,000 robots to cover every point above their elliptical metal base, which measures about 2.5 feet across.

    That requires precise, software-controlled choreography so that the tightly packed robots don’t literally bump heads as they spin into new positions several times each hour to collect light from different sets of pre-selected sky objects.
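For intuition, here is a minimal sketch of that two-rotation (“theta-phi”) geometry. The arm lengths are illustrative assumptions chosen to give the 12-millimeter patrol diameter quoted above; this is not the DESI control software.

```python
# Toy model of a two-axis fiber positioner: two stacked rotations
# place the fiber tip anywhere in a small disk ("patrol area").
import math

R1 = 3.0  # mm, central arm length (hypothetical)
R2 = 3.0  # mm, fiber arm length (hypothetical); R1 + R2 = 6 mm reach

def fiber_tip(theta, phi):
    """Fiber-tip (x, y) in mm for base rotation theta and elbow
    rotation phi, both in radians, measured from the robot axis."""
    x = R1 * math.cos(theta) + R2 * math.cos(theta + phi)
    y = R1 * math.sin(theta) + R2 * math.sin(theta + phi)
    return x, y

def reachable(x, y):
    """A target is reachable if it lies in the annulus swept by the
    two arms: |R1 - R2| <= r <= R1 + R2 (a full 6 mm disk here)."""
    r = math.hypot(x, y)
    return abs(R1 - R2) <= r <= R1 + R2

print(fiber_tip(0.0, math.pi / 2))  # elbow bent 90 degrees -> (3.0, 3.0)
print(reachable(5.0, 2.0))          # True: inside the 6 mm patrol radius
```

The collision problem follows directly from this geometry: neighboring robots’ patrol disks overlap, so the control software must sequence the theta and phi moves to keep the arms clear of one another.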

“The main goal of ProtoDESI is to be able to fix fibers on actual objects and hold them there,” said Parker Fagrelius, who is managing the ProtoDESI project at Berkeley Lab. Fagrelius is a UC Berkeley graduate student who is also an affiliate in the Physics Division at Berkeley Lab. ProtoDESI’s robots, assembled at the University of Michigan and then shipped to Berkeley Lab, are positioned far enough apart that they won’t accidentally collide during their initial test run.

    While DESI’s robots will primarily target galaxies, ProtoDESI will use mostly bright, familiar stars to tune its robotic positioning system and ensure the system is accurately tracking with the motion of objects in the sky. Mounted next to the positioners is a custom digital camera known as the GFA (for guide, focus and alignment) that will remain targeted on a “guide star”—a bright star that will aid the tracking of other objects targeted by the robot-pointed fibers. Several Spanish research institutions in Barcelona and Madrid are responsible for this GFA system.

    “We’ll choose the fields we look at quite carefully,” Fagrelius said. The robots will initially fix on isolated sky objects so that they don’t mistakenly point at the wrong objects.

    In addition to the 10-robot system, ProtoDESI is equipped with a set of 16 light-emitting rods—shaped similarly to the robots—that project small points of blue light onto a camera to calibrate the positioning system. The completed project will include 120 of these devices, called “illuminated fiducials.”

    The fibers carried by the robots each have a core that is 107 microns (millionths of a meter) wide. After repositioning, the fibers will be backlit to project points of light on a camera that can help to fine-tune their individual positions, if needed. Yale University is supplying this fiber-view camera and also the fiducials.

A view of the ProtoDESI setup under assembly at Berkeley Lab, with the underside of the robotic fiber-positioners visible at left. (Credit: Paul Mueller/Berkeley Lab)

    Fagrelius will join a team of researchers at Kitt Peak’s 4-meter Mayall telescope in early August to run through a checklist of ProtoDESI tests. About 28 researchers from nine institutions in the DESI collaboration are working on ProtoDESI, including six Berkeley Lab researchers.

    Researchers will test the auto-positioning system by slightly shifting the pointing of the telescope and the fibers—a process known as “dithering”—to see how the components readjust to find the correct targets. A digital camera will measure light streaming in from the fibers to determine if the robots are properly targeting sky objects.
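The logic of a dither test fits in a few lines. This is a hypothetical sketch assuming a simple Gaussian model for how much light couples into a fiber as the pointing is stepped around the nominal position; the actual ProtoDESI analysis is more involved.

```python
# Toy dither test: step the pointing on a grid around the nominal
# position; the offset that maximizes the fiber flux reveals (and
# cancels) the true positioning error. All numbers are illustrative.
import numpy as np

true_err = np.array([0.3, -0.2])  # arcsec; unknown in a real test

def fiber_flux(dither):
    """Gaussian falloff of coupled flux with total pointing miss."""
    miss = true_err + dither
    return np.exp(-(miss ** 2).sum() / (2 * 0.5 ** 2))

steps = np.linspace(-1.0, 1.0, 21)  # 0.1 arcsec dither grid
grid = [np.array([dx, dy]) for dx in steps for dy in steps]
best = max(grid, key=fiber_flux)
print("estimated pointing error:", -best)  # ~ [0.3, -0.2]
```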

ProtoDESI will test 10 robots like the one in this diagram. Each one can rotate in two different ways and is designed to point a fiber-optic cable at sky objects to collect their light. (Credit: MNRAS, DOI: 10.1093/mnras/stv541)

    “ProtoDESI will show us how the software and positioners are working together,” Fagrelius said. “All of the things we learn along the way from ProtoDESI will be built back into the plans for DESI’s commissioning.” Some measurements and pre-testing with ProtoDESI will be conducted at Berkeley Lab even before ProtoDESI moves to the Mayall telescope, she added.

The full robotic array planned for DESI will be segmented into 10 pie-wedge-shaped “petals,” each containing 500 robots. The first petal will be fully assembled by October at Berkeley Lab and tested at the lab through December. The multi-petal design will allow engineers to remove and replace individual petals.

    Each robot will have an electronic circuit board and wiring, and on the final DESI project each robot’s fiber-optic cable will be spliced to a 42-meter-long fiber-optic cable that will run to a light-measuring device known as a spectrograph (ProtoDESI will not have a spectrograph).

The completed project will feature 10 high-resolution spectrographs that will measure the properties of objects’ light, telling us how fast faraway galaxies are moving away from us and how they are distributed, and helping us trace the universe’s expansion history back 12 billion years.
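As a toy illustration of what a spectrograph measurement yields — these are standard textbook formulas with made-up numbers, not DESI data:

```python
# Compare an observed emission-line wavelength to its laboratory
# value to get the redshift z; 1 + z is the factor by which the
# universe has stretched since the light left the galaxy.
H_ALPHA_REST = 656.28  # nm, hydrogen-alpha line measured in the lab

def redshift(lambda_obs, lambda_rest=H_ALPHA_REST):
    """z = (observed - rest) / rest."""
    return (lambda_obs - lambda_rest) / lambda_rest

z = redshift(984.42)  # H-alpha observed at ~984 nm (illustrative)
print(f"z = {z:.2f}; light stretched by a factor of {1 + z:.2f}")
```

Collecting millions of such redshifts, together with the galaxies’ positions on the sky, is what turns DESI’s spectra into a 3-D map of cosmic expansion.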

A camera test of a type of robotic fiber-optic positioner (left and center) that will be tested in ProtoDESI. (Credit: MNRAS, DOI: 10.1093/mnras/stv541)

    Joe Silber, a Berkeley Lab engineer working on DESI systems that include its robotics, said the fiber-optic cables are among the most sensitive components in DESI. “If there is too tight of a bend or you stress the fiber, it will degrade its performance,” he said, noting that there have already been tests of the repeated bends and twists to the cables caused by the movement of the robots. Over the lifetime of DESI the ends of the fiber-optic cables will be turned almost 200,000 times, he said. Installation of DESI is expected to begin in 2018.

    Fagrelius said she looks forward to the ProtoDESI run at Mayall. “September will have a lot more clear nights than August. There should be four weeks of decent time that we can get on sky,” she said, and other tests can be conducted even when viewing is obscured by weather.

    DESI is supported by the U.S. Department of Energy Office of Science; additional support for DESI is provided by the U.S. National Science Foundation, Division of Astronomical Sciences under contract to the National Optical Astronomy Observatory; the Science and Technologies Facilities Council of the United Kingdom; the Gordon and Betty Moore Foundation; the Heising-Simons Foundation; the National Council of Science and Technology of Mexico, and DESI member institutions. The DESI scientists are honored to be permitted to conduct astronomical research on Iolkam Du’ag (Kitt Peak), a mountain with particular significance to the Tohono O’odham Nation.


    This video shows the rotating motions of a robotic fiber-optic positioner. ProtoDESI will test a group of 10 robotic positioners, and DESI will feature 5,000 robots. (Credit: Berkeley Lab)
    Access mp4 video here .


    A simulation of the movements of 499 DESI robots, carefully choreographed to avoid bumping into one another, as seen from above. ProtoDESI is testing 10 robots for the completed DESI project, which will have 5,000 robots. (Credit: Joe Silber/Berkeley Lab)
    Access mp4 video here .

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 9:33 pm on January 14, 2016 Permalink | Reply
Tags: Cosmology

    From Symmetry: “Exploring the dark universe with supercomputers” 

    Symmetry


    01/14/16
    Katie Elyce Jones

    Next-generation telescopic surveys will work hand-in-hand with supercomputers to study the nature of dark energy.

    The 2020s could see a rapid expansion in dark energy research.

    For starters, two powerful new instruments will scan the night sky for distant galaxies. The Dark Energy Spectroscopic Instrument, or DESI, will measure the distances to about 35 million cosmic objects, and the Large Synoptic Survey Telescope, or LSST, will capture high-resolution videos of nearly 40 billion galaxies.

    DESI Dark Energy Spectroscopic Instrument
    LBL DESI

    LSST Exterior
    LSST Telescope
    LSST Camera
    LSST, the building that will house it in Chile, and the camera, being built at SLAC

    Both projects will probe how dark energy—the phenomenon that scientists think is causing the universe to expand at an accelerating rate—has shaped the structure of the universe over time.

    But scientists use more than telescopes to search for clues about the nature of dark energy. Increasingly, dark energy research is taking place not only at mountaintop observatories with panoramic views but also in the chilly, humming rooms that house state-of-the-art supercomputers.

    The central question in dark energy research is whether it exists as a cosmological constant—a repulsive force that counteracts gravity, as Albert Einstein suggested a century ago—or if there are factors influencing the acceleration rate that scientists can’t see. Alternatively, Einstein’s theory of gravity [General Relativity] could be wrong.
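In the language of general relativity, the cosmological constant \(\Lambda\) enters Einstein’s field equations as a single extra term:

\[
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},
\]

and the central question above amounts to asking whether this one constant is the whole story of dark energy, or whether the acceleration demands something more elaborate.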

    “When we analyze observations of the universe, we don’t know what the underlying model is because we don’t know the fundamental nature of dark energy,” says Katrin Heitmann, a senior physicist at Argonne National Laboratory. “But with computer simulations, we know what model we’re putting in, so we can investigate the effects it would have on the observational data.”

A simulation shows how matter is distributed in the universe over time. Katrin Heitmann, et al., Argonne National Laboratory

    Growing a universe

    Heitmann and her Argonne colleagues use their cosmology code, called HACC, on supercomputers to simulate the structure and evolution of the universe. The supercomputers needed for these simulations are built from hundreds of thousands of connected processors and typically crunch well over a quadrillion calculations per second.

    The Argonne team recently finished a high-resolution simulation of the universe expanding and changing over 13 billion years, most of its lifetime. Now the data from their simulations is being used to develop processing and analysis tools for the LSST, and packets of data are being released to the research community so cosmologists without access to a supercomputer can make use of the results for a wide range of studies.

    Risa Wechsler, a scientist at SLAC National Accelerator Laboratory and Stanford University professor, is the co-spokesperson of the DESI experiment. Wechsler is producing simulations that are being used to interpret measurements from the ongoing Dark Energy Survey, as well as to develop analysis tools for future experiments like DESI and LSST.

    Dark Energy Survey
    Dark Energy Camera
    CTIO Victor M Blanco 4m Telescope
    DES, The DECam camera, built at FNAL, and the Victor M Blanco 4 meter telescope in Chile that houses the camera.

    “By testing our current predictions against existing data from the Dark Energy Survey, we are learning where the models need to be improved for the future,” Wechsler says. “Simulations are our key predictive tool. In cosmological simulations, we start out with an early universe that has tiny fluctuations, or changes in density, and gravity allows those fluctuations to grow over time. The growth of structure becomes more and more complicated and is impossible to calculate with pen and paper. You need supercomputers.”
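To make that concrete, here is a toy version of the core N-body computation: a direct-summation leapfrog step with softened Newtonian gravity, in arbitrary units with unit-mass particles. It is a sketch for intuition only; production codes like HACC use far more sophisticated solvers to handle hundreds of billions of particles.

```python
# Toy N-body box: tiny density fluctuations clump under gravity.
import numpy as np

G, SOFT, DT = 1.0, 0.05, 0.01               # illustrative units
rng = np.random.default_rng(0)
pos = rng.normal(scale=1.0, size=(256, 3))  # slightly clumpy start
vel = np.zeros((256, 3))

def accelerations(pos):
    """Pairwise softened gravity on unit-mass particles. O(N^2) --
    fine for a toy, hopeless at cosmological particle counts."""
    diff = pos[None, :, :] - pos[:, None, :]   # vectors r_j - r_i
    r2 = (diff ** 2).sum(axis=-1) + SOFT ** 2  # softened |r|^2
    np.fill_diagonal(r2, np.inf)               # no self-force
    return G * (diff / r2[..., None] ** 1.5).sum(axis=1)

for _ in range(100):                           # kick-drift-kick leapfrog
    vel += 0.5 * DT * accelerations(pos)
    pos += DT * vel
    vel += 0.5 * DT * accelerations(pos)
```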

    Supercomputers have become extremely valuable for studying dark energy because—unlike dark matter, which scientists might be able to create in particle accelerators—dark energy can only be observed at the galactic scale.

    “With dark energy, we can only see its effect between galaxies,” says Peter Nugent, division deputy for scientific engagement at the Computational Cosmology Center at Lawrence Berkeley National Laboratory.

    Trial and error bars

    “There are two kinds of errors in cosmology,” Heitmann says. “Statistical errors, meaning we cannot collect enough data, and systematic errors, meaning that there is something in the data that we don’t understand.”

    Computer modeling can help reduce both.

    DESI will collect about 10 times more data than its predecessor, the Baryon Oscillation Spectroscopic Survey, and LSST will generate 30 laptops’ worth of data each night. But even these enormous data sets do not fully eliminate statistical error.

    LBL BOSS
    LBL BOSS telescope

    Simulation can support observational evidence by modeling similar conditions to see if the same results appear consistently.

    “We’re basically creating the same size data set as the entire observational set, then we’re creating it again and again—producing up to 10 to 100 times more data than the observational sets,” Nugent says.

    Processing such large amounts of data requires sophisticated analyses. Simulations make this possible.

    To program the tools that will compare observational and simulated data, researchers first have to model what the sky will look like through the lens of the telescope. In the case of LSST, this is done before the telescope is even built.

    After populating a simulated universe with galaxies that are similar in distribution and brightness to real galaxies, scientists modify the results to account for the telescope’s optics, Earth’s atmosphere, and other limiting factors. By simulating the end product, they can efficiently process and analyze the observational data.
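A minimal sketch of that last step, with illustrative numbers (real pipelines model the optics and atmosphere in far more detail):

```python
# Forward-model a "perfect" simulated image into a mock observation:
# blur with a Gaussian point-spread function standing in for optics
# plus atmosphere, then add sky noise.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
truth = np.zeros((128, 128))
truth[64, 64] = 1000.0                 # an idealized compact galaxy

seeing_sigma_pix = 2.5                 # blur width in pixels (assumed)
observed = gaussian_filter(truth, seeing_sigma_pix)
observed += rng.normal(scale=0.5, size=observed.shape)  # sky noise
```

Analysis tools tuned on such mock observations can then be pointed at the real data the moment it arrives.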

    Simulations are also an ideal way to tackle many sources of systematic error in dark energy research. By all appearances, dark energy acts as a repulsive force. But if other, inconsistent properties of dark energy emerge in new data or observations, different theories and a way of validating them will be needed.

    “If you want to look at theories beyond the cosmological constant, you can make predictions through simulation,” Heitmann says.

    A conventional way to test new scientific theories is to introduce change into a system and compare it to a control. But in the case of cosmology, we are stuck in our universe, and the only way scientists may be able to uncover the nature of dark energy—at least in the foreseeable future—is by unleashing alternative theories in a virtual universe.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 8:57 pm on January 14, 2016 Permalink | Reply
Tags: Cosmology, IBT

    From IBT: “‘Green Pea’ Galaxies May Hold The Key To Understanding The Early Universe” 

    IBT bloc

    International Business Times

    01/13/16
    Charles Poladian

The galaxy J0925 is known as a “green pea” galaxy. Researchers used the Hubble Space Telescope to observe UV radiation being emitted by this galaxy. Photo: Ivana Orlitová, Astronomical Institute, Czech Academy of Sciences (Prague)

The universe began with a big bang followed by a great expansion [inflation]. The subsequent creation of hydrogen and helium led to the dark ages of the universe, but something happened that caused hydrogen to heat up — the process known as [re]ionization, in which superheated gases obtain a positive or negative charge — and ushered in the era of the visible universe.

    All of this took place within a billion years after the big bang. A study released Wednesday shows that the first galaxies in the early universe may have been the catalyst behind cosmic reionization.

The dark ages of the universe comprised neutral helium, hydrogen, dark matter and normal matter. Gravity would soon pull all of this together to create the first stars. The creation of the first stars would pave the way for the first galaxies. The young stars and early galaxies were hot enough to strip electrons from the neutral gases. Cosmic reionization started out like a flashlight revealing what was out in the darkness. As more stars and galaxies formed, more gas was ionized. Soon, the lights were turned on and the early universe became visible.
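For a sense of the energies involved (standard atomic physics, not a number from the study): stripping the electron from neutral hydrogen takes 13.6 eV, so ionizing photons must have wavelengths shorter than about 91.2 nm, deep in the ultraviolet:

\[
\lambda_{\max} = \frac{hc}{E_{\mathrm{ion}}} \approx \frac{1240\ \mathrm{eV\,nm}}{13.6\ \mathrm{eV}} \approx 91.2\ \mathrm{nm}.
\]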

    “Though the Epoch of Reionization took place deep in the universe’s past, it lies at the very frontier of our current cosmological observations. The more researchers learn about this period, in fact, the more it reveals about the end of the cosmic dark ages, the first stars and galaxies and the structure of our universe,” Stanford University’s Kavli Institute for Particle Astrophysics and Cosmology explained.

Infographic detailing the history of the universe. Photo: S. G. Djorgovski et al., Caltech

Stars emit the UV radiation and ionizing photons necessary to heat and strip surrounding gas. Galaxies were believed to have triggered cosmic reionization, but researchers had yet to find a galaxy emitting enough ionizing radiation to reionize hydrogen. Galaxies need to eject the ionizing photons instead of absorbing them, according to the researchers from the University of Geneva.

The researchers focused on tiny galaxies known as “green pea” galaxies for their compact size and green appearance.

Pea Scientific Montage. Galaxy Zoo

These active star-forming galaxies, located between 1.5 billion and 5 billion light-years from Earth, are similar to galaxies in the early universe. If these galaxies were emitting radiation that could heat and strip hydrogen, it’s likely similar galaxies were doing the same thing 13 billion years ago.

    The researchers found 5,000 green pea galaxies using the Sloan [Digital Sky] Survey’s [SDSS] collection of more than 1 million galaxies.

    SDSS Telescope
    SDSS telescope at Apache Point, NM, USA

    After finding potential candidates, the Hubble Space Telescope’s ability to detect UV radiation was used to determine if any green peas were emitting radiation.

    NASA Hubble Telescope
    NASA/ESA Hubble

    One such galaxy, J0925 — located 3 billion light-years from Earth — was emitting UV radiation and ejecting photons. This is just the first step in understanding what caused cosmic reionization in the early universe. The researchers hope to use Hubble for further observations of J0925 and other galaxies that could be emitting radiation.

    See the full post here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:36 am on December 15, 2015 Permalink | Reply
Tags: Cosmology, XXL Survey

    From ESO: “XXL Hunt for Galaxy Clusters” 


    European Southern Observatory

    15 December 2015
    Marguerite Pierre
    CEA
    Saclay, France
    Email: marguerite.pierre@cea.fr

    Richard Hook
    ESO Public Information Officer
    Garching bei München, Germany
    Tel: +49 89 3200 6655
    Cell: +49 151 1537 3591
Email: rhook@eso.org

    Observations from ESO telescopes provide crucial third dimension in probe of Universe’s dark side


    ESO telescopes have provided an international team of astronomers with the gift of the third dimension in a plus-sized hunt for the largest gravitationally bound structures in the Universe — galaxy clusters. Observations by the VLT and the NTT complement those from other observatories across the globe and in space as part of the XXL survey — one of the largest ever such quests for clusters.

    Galaxy clusters are massive congregations of galaxies that host huge reservoirs of hot gas — the temperatures are so high that X-rays are produced. These structures are useful to astronomers because their construction is believed to be influenced by the Universe’s notoriously strange components — dark matter and dark energy. By studying their properties at different stages in the history of the Universe, galaxy clusters can shed light on the Universe’s poorly understood dark side.

    The team, consisting of over 100 astronomers from around the world, started a hunt for the cosmic monsters in 2011. Although the high-energy X-ray radiation that reveals their location is absorbed by the Earth’s atmosphere, it can be detected by X-ray observatories in space. Thus, they combined an ESA XMM-Newton survey — the largest time allocation ever granted for this orbiting telescope — with observations from ESO and other observatories.

    ESA XMM Newton
    ESA/XMM-Newton

    The result is a huge and growing collection of data across the electromagnetic spectrum [1], collectively called the XXL survey.

    “The main goal of the XXL survey is to provide a well-defined sample of some 500 galaxy clusters out to a distance when the Universe was half its current age,” explains XXL principal investigator Marguerite Pierre of CEA, Saclay, France.

    The XMM-Newton telescope imaged two patches of sky — each one hundred times the area of the full Moon — in an attempt to discover a huge number of previously unknown galaxy clusters. The XXL survey team have now released their findings in a series of papers using the 100 brightest clusters discovered [2].

Observations from the EFOSC2 instrument installed on the New Technology Telescope (NTT), along with the FORS instrument attached to ESO’s Very Large Telescope (VLT), were also used to carefully analyse the light coming from galaxies within these galaxy clusters.

    ESO EFOSC2
    EFOSC2 instrument

    ESO FORS1
    FORS1

    Crucially, this allowed the team to measure the precise distances to the galaxy clusters, providing the three-dimensional view of the cosmos required to perform precise measurements of dark matter and dark energy [3].

    The XXL survey is expected to produce many exciting and unexpected results, but even with one fifth of the final expected data, some surprising and important findings have already appeared.

    One paper reports the discovery of five new superclusters — clusters of galaxy clusters — adding to those already known, such as our own, the Laniakea Supercluster.

The Laniakea Supercluster

    Another reports followup observations of one particular galaxy cluster (informally known as XLSSC-116), located over six billion light-years away [4]. In this cluster unusually bright diffuse light was observed using MUSE on the VLT.

    ESO MUSE
    MUSE

    “This is the first time that we are able to study in detail the diffuse light in a distant galaxy cluster, illustrating the power of MUSE for such valuable studies,” explained co-author Christoph Adami of the Laboratoire d’Astrophysique, Marseille, France.

The team have also used the data to confirm the idea that galaxy clusters in the past were scaled-down versions of those we observe today — an important finding for the theoretical understanding of the evolution of clusters over the life of the Universe.

    The simple act of counting galaxy clusters in the XXL data has also confirmed a strange earlier result — there are fewer distant clusters than expected based on predictions from the cosmological parameters measured by ESA’s Planck telescope.

    ESA Planck
    ESA/Planck

The reason for this discrepancy is unknown; however, the team hope to get to the bottom of this cosmological curiosity with the full sample of clusters in 2017.

    These four important results are just a foretaste of what is to come in this massive survey of some of the most massive objects in the Universe.

    Notes

[1] The XXL survey has combined archival data as well as new observations of galaxy clusters covering the wavelength range from 1 × 10⁻⁴ μm (X-ray, observed with XMM) to 492 μm (submillimetre range, observed with the Giant Metrewave Radio Telescope [GMRT]).

    Giant Metrewave Radio Telescope
    GMRT

    [2] The galaxy clusters reported in the thirteen papers are found at redshifts between z = 0.05 and z = 1.05, which correspond to when the Universe was approximately 13 and 5.7 billion years old, respectively.

    [3] Probing the galaxy clusters required their precise distances to be known. While approximate distances — photometric redshifts — can be measured by analysing their colours at different wavelengths, more accurate spectroscopic redshifts are needed. Spectroscopic redshifts were also sourced from archival data, as part of the VIMOS Public Extragalactic Redshift Survey (VIPERS), the VIMOS-VLT Deep Survey (VVDS) and the GAMA survey.

From VIPERS

From GAMA

    [4] This galaxy cluster was found to be at a redshift of z = 0.543.

    More information

    A description of the survey, and some of the early science results, will be presented in a series of papers to appear in the journal Astronomy & Astrophysics on 15 December 2015.

XXL is an international project based around an XMM Very Large Programme surveying two 25-square-degree extragalactic fields at a depth of ~5 × 10⁻¹⁵ erg cm⁻² s⁻¹ in the [0.5–2] keV band for point-like sources. The XXL website is found here. Multi-band information and spectroscopic follow-up of the X-ray sources are obtained through a number of survey programmes, summarised here.

    Links:

    XXL Survey
    Scientific Papers in Astronomy & Astrophysics

    The full XXL CONSORTIUM:
    C. Adami (Laboratoire d’Astrophysique, Marseille, FR)
    S. Alis (Observatoire de la Cote d’Azur, Nice, FR)
    A. Alshino (University of Bahrain, BH)
    B. Altieri (European Space Astronomy Center, Madrid, SP)
    N. Baran (University of Zagreb, HR)
    S. Basilakos (Research Center for Astronomy, Academy of Athens, GR)
    C. Benoist (Observatoire de la Cote d’Azur, Nice, FR)
M. Birkinshaw (University of Bristol, UK)
    A. Bongiorno (Rome Observatory, Italy)
    V. Bouillot (Observatoire de Paris, FR)
    M. Bremer (University of Bristol, UK)
    T. Broadhurst (Basque University, Bilbao, SP)
    M. Brusa (INAF-OABO, Bologna, IT)
A. Butler (University of Western Australia, AU)
    N. Cappelluti (INAF-OABO, Bologna, IT)
    A. Cappi (INAF-OABO, Bologna, IT)
    T. Chantavat (Naresuan University, TH)
    L. Chiappetti (INAF-IASF, Milano, IT)
    P. Ciliegi (INAF-OABO, Bologna, IT)
    F. Civano (H. S. Center for Astrophysics, Cambridge, US)
    A. Comastri (INAF-OABO, Bologna, IT)
    P. S. Corasaniti (Observatoire de Paris, FR)
    J. Coupon (ASIAA, Taipei, TW)
    N. Clerc (Service d’Astrophysique CEA, Saclay, FR)
    C. De Breuck (ESO Garching, DE)
    J. Delhaize (University of Zagreb, HR)
    J. Democles (University of Birmingham, UK)
    Sh. Desai (University of Illinois, US)
    J. Devriendt (University of Oxford, UK)
    O. Dore (JPL Caltech, Pasadena, US)
    Y. Dubois (University of Oxford, UK)
    D. Eckert (ISCD, Geneva Observatory, CH)
    L. Edwards (Mount Allison Observatory, CA)
    D. Elbaz (Service d’Astrophysique CEA, Saclay, FR)
    A. Elyiv (University of Liege, BE)
    S. Ettori (INAF-OABO, Bologna, IT)
    A. E. Evrard (University of Michigan, Ann Arbor, US)
    L. Faccioli (Service d’Astrophysique CEA, Saclay, FR)
    A. Farahi (University of Michigan, Ann Arbor, US)
    C. Ferrari (Observatoire de la Cote d’Azur, FR)
    F. Finet (Aryabhatta Research institute for Observational Science, IN)
    F. Fiore (Observatory of Roma, IT)
    S. Fotopoulou (ISCD, Geneva Observatory, CH)
    W. Forman (H. S. Center for Astrophysics, Cambridge, US)
    E. Freeland (Stockholm University)
    P. Gandhi (ISAS, JAXA, Sagamihara, JP)
F. Gastaldello (INAF-IASF, Milan, IT)
    I. Georgantopoulos (Observatory of Athens, GR)
    P. Gilles (University of Bristol, UK)
    R. Gilli (INAF-OABO, Bologna, IT)
    A. Goulding (H. S. Center for Astrophysics, Cambridge, US)
    Ch. Gordon (University of Oxford, UK)
    L. Guennou (University of Kwazulu-Natal, ZA)
    V. Guglielmo (Observatory of Padova, IT)
    R. C. Hickox (Durham University, UK)
    C. Horellou (Chalmers University of Technology, Onsala, SE)
    K. Husband (University of Bristol, UK)
M. Huynh (University of Western Australia, AU)
    A. Iovino (INAF-OAB, Brera, IT)
    Ch. Jones (H. S. Center for Astrophysics, Cambridge, US)
    S. Lavoie (University of Victoria, CA)
    A. Le Brun (Service d’Astrophysique CEA, Saclay, FR)
    J.-P. Le Fevre (Service d’Informatique CEA, Saclay, FR)
    M. Lieu (University of Birmingham, UK)
    C.A Lin (Service d’Astrophysique CEA, Saclay, FR)
    M. Kilbinger (Service d’Astrophysique CEA, Saclay, FR)
    E. Koulouridis (Service d’Astrophysique CEA, Saclay, FR)
    Ch. Lidman (Australian Astronomical Observatory, Epping, AU)
M. Maturi (ITA/ZAH Heidelberg, DE)
    B. Maughan (University of Bristol, UK)
    A. Mantz (University of Chicago, US)
    S. Maurogordato (Observatoire de la Cote d’Azur, Nice, FR)
    I. McCarthy (University of Liverpool, UK)
S. McGee (Leiden University, NL)
    F. Menanteau (University of Illinois, US)
    J.-B. Melin (Service de Physique des Particules CEA, Saclay, FR)
    O. Melnyk (University of Liege, BE)
    J. Mohr (University of Munich, DE)
    S. Molnar (ASIAA, Taipei, TW)
    E. Mörtsell (Stockholm University, SE)
    L. Moscardini (University of Bologna, IT)
S. S. Murray (Johns Hopkins, Baltimore, US)
    M. Novak (University of Zagreb, HR)
    F. Pacaud (Argelander-Institut fur Astronomie, Bonn, DE)
    S. Paltani (ISCD, Geneva Observatory, CH)
    S. Paulin-Henriksson (Service d’Astrophysique CEA, Saclay, FR)
    E. Piconcelli (INAF, Roma Observatory, IT)
    M. Pierre (Service d’Astrophysique CEA, Saclay, FR)
    T. Plagge (University of Chicago, US)
    M. Plionis (Aristotle University of Thessaloniki, Department of Physics, GR)
    B. Poggianti (Observatory of Padova, IT)
    D. Pomarede (Service d’Informatique CEA, Saclay, FR)
E. Pompei (European Southern Observatory, Garching, DE)
    T. Ponman (University of Birmingham, UK)
    M. E. Ramos Ceja (Argelander-Institut fur Astronomie, Bonn, DE)
    P. Ranalli (Observatory of Athens, GR)
    D. Rapetti (Copenhagen University, DK)
    S. Raychaudhury (University of Birmingham, UK)
    T. Reiprich (Argelander-Institut fur Astronomie, Bonn, DE)
    H. Rottgering (Leiden Observatory, NL)
    E. Rozo (SLAC National Accelerator Laboratory, US)
    E. Rykoff (SLAC National Accelerator Laboratory, US)
    T. Sadibekova (Service d’Astrophysique CEA, Saclay, FR)
    M. Sahlén (University of Oxford, UK)
    J. Santos (INAF – Osservatorio Astronomico di Arcetri, IT)
    J.-L. Sauvageot (Service d’Astrophysique CEA, Saclay, FR)
    C. Schimd (Laboratoire d’Astrophysique, Marseille, FR)
    M. Sereno (University of Bologna, IT)
    J. Silk (University of Oxford, UK)
    G.P. Smith (University of Birmingham, UK)
    V. Smolcic (University of Zagreb, HR)
    S. Snowden (NASA, GSFC, US)
    D. Spergel (Princeton University, US)
    A. Stanford (University of California, Davis, US)
    J. Surdej (University of Liege, BE)
    K. Umetsu (ASIAA, Taipei, TW)
    P. Valageas (Institut de Physique Theorique du CEA, Saclay, FR)
    A. Valotti (Service d’Astrophysique CEA, Saclay, FR)
    I. Valtchanov (European Space Astronomy Center, Madrid, SP)
    C. Vignali (University of Bologna, IT)
    J. Willis (University of Victoria, CA)
    F. Ziparo (University of Birmingham, UK)

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition
    Visit ESO in Social Media-

    Facebook

    Twitter

    YouTube

    ESO Bloc Icon

    ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

    ESO LaSilla
    LaSilla

    ESO VLT Interferometer
    VLT

    ESO Vista Telescope
    VISTA

    ESO VLT Survey telescope
    VLT Survey Telescope

    ESO NTT
    NTT

    ALMA Array
    ALMA

    ESO E-ELT
    E-ELT

    ESO APEX
    Atacama Pathfinder Experiment (APEX) Telescope

     
  • richardmitnick 4:56 pm on December 11, 2015 Permalink | Reply
Tags: Cosmology, FQXI, Parallel universes

From FQXI: “Detecting Parallel Universes Hidden Inside Black Holes — The First Proof of the Multiverse?”

    FQXI bloc

    FQXI

    Dec. 10, 2015
FQXi Administrator Zeeya Merali

Garriga et al, arXiv:1512.01819v2

    It’s hard to say what’s the most exciting element of this new paper on parallel universes, the inflationary multiverse, and black holes, by Tufts cosmologist (and FQXi member) Alex Vilenkin and colleagues. Is it the idea that black holes hide baby universes inside them — inflating their own spacetimes — connected to our universe by wormholes? Could it be that, according to the authors, astronomers may soon be able to find evidence to confirm this crazy notion? Perhaps it’s the fact that this paper could be presenting the first way to find definitive evidence that an inflationary multiverse of parallel worlds exists. Oh yes, and the authors also say that such black holes could have seeded supermassive black holes — the origin of which remains a mystery — *and*, in some of the scenarios they’ve looked at, they could comprise dark matter, the invisible stuff that makes up most of the matter in the universe.

    Phew! No wonder the paper by Vilenkin along with Jaume Garriga, at the University of Barcelona, and Jun Zhang also at Tufts, is almost 50 pages long! (Black Holes and the Multiverse arXiv:1512.01819v2.)

    Let’s take this piece by piece. Vilenkin sent me the paper, which he has just posted to the physics preprint server, arXiv, because, for him, what’s exciting is that it provides a “new way to test multiverse models observationally.” Their analysis is based on inflation theory — the idea that our universe underwent a phase of rapid expansion, or inflation, in its early history. This is now a pretty mainstream notion, which serves to solve a number of mysteries about the state of our universe today. It has also had good observational backing since various satellites have now measured the slight temperature differences in the afterglow of the big bang — the cosmic microwave background [CMB] radiation — and found patterns that match those predicted by inflationary models.

    Cosmic Microwave Background  Planck
    CMB per ESA/Planck

    ESA Planck
    ESA/Planck

    (There are still alternative proposals out there to explain these features, however. See Sophie Hebden’s Faster than Light for an example.)

    Slightly more controversial is the idea that inflation forces us to accept that we live in a multiverse of neighbouring universes with potentially very different physical parameters than our cosmos. This stems from the realisation, by Vilenkin and others, that inflation is unlikely to have been a one-off event. Just as the patch of space that we now call home once inflated to create an entire cosmos for us to wonder at, other neighbouring patches are probably inflating all around us, creating parallel bubble universes nearby.

    The multiverse idea has been criticised because it’s tough to test. Almost by definition, parallel bubbles are spacetimes that are divorced from ours, and so we can’t interact with them directly. That hasn’t stopped cosmologists like Vilenkin, and our own Anthony Aguirre, from coming up with inventive ways we might be able to detect them. For instance, two neighbouring bubbles might collide and leave a scar on our universe, which we could pick out of the cosmic microwave background data. (See “When Worlds Collide” by Kate Becker.)

    In their new paper, Garriga, Vilenkin, and Zhang have investigated another possible consequence of inflationary cosmology — providing a new mechanism for the formation of black holes in our universe. We often talk about stellar mass black holes that were formed from the collapse of stars. There are also supermassive black holes that can be found at the centre of galaxies, which can have masses up to a billion times that of the Sun. Astrophysicists aren’t quite sure how those latter behemoths are formed.

According to Garriga, Vilenkin and Zhang, black holes could also have been formed by little bubbles of vacuum in our early universe. These would have expanded during our universe’s inflationary phase (as the cosmos they were embedded in was also growing around them). When inflation ended in our cosmos, these bubbles would, depending on their mass, either have collapsed down to a singularity (an infinitely dense point that we think lies at the core of a black hole) or, if they were heavier than some critical mass, their interiors would have continued to inflate into entirely new baby universes. Such a universe would look to us, from the outside, like a black hole, and would be connected to our universe by a wormhole. (See the image, taken from the paper, at the top of this post.)

    The team has also examined another mechanism in which black holes are formed inside spherical “domain walls” that are thought to be created during inflation. A domain wall is like a fracture or defect in space, created as the universe cools. You can think of it like a defect created in a cube of ice, where the crystal structure in the solid has misaligned as the water froze.

    The paper takes a detailed look at some of the possible properties of such black holes formed by these novel processes, including the masses they might have, and the sort of observable signs they might give out that astronomers could pick up. They caution that they would need to carry out comprehensive computer simulations to work out all possible signatures and the possible effects of, for instance, energy being siphoned off from our universe through the wormhole. But a preliminary analysis suggests that these novel black holes could provide noticeable signatures, in the form of gamma rays given out by the black holes, or distortions induced on the cosmic microwave background spectrum created by radiation that was emitted as gas accreted onto large black holes in the early universe.

    By looking at observational evidence that is already out there, the team can rule out inflationary black holes with certain parameters, but others are still allowed. Those that remain viable could have seeded today’s supermassive black holes, the team says. And for certain model parameters they have investigated, the number and mass of black holes they expect to see suggests that these black holes could make up the missing dark matter in the universe.

The authors also calculated that the baby universes could have very different physical parameters from one another. Thus the network of baby universes within black holes, linked by wormholes, would create an inflationary multiverse.

“We note that the mass distributions of black holes resulting from domain walls and from vacuum bubbles are expected to be different and can in principle be distinguished observationally,” the team writes in their paper. “If a black hole population produced by vacuum bubbles or domain walls is discovered, it could be regarded as evidence for the existence of a multiverse.”

It’s worth noting here that this isn’t the first time that physicists have suggested that black holes lead to parallel universes. For example, FQXi members Lee Smolin and Jorge Pullin have independently had similar ideas in the past. On the June 2013 edition of the podcast, you can hear Pullin talking about how loop quantum gravity predicts that black holes are tunnels to parallel worlds. (Smolin is also on that edition, talking about his book.) But this is the first analysis carried out using inflationary theory.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    To catalyze, support, and disseminate research on questions at the foundations of physics and cosmology, particularly new frontiers and innovative ideas integral to a deep understanding of reality but unlikely to be supported by conventional funding sources.

    Goals

    FQXi has five goals:

    To expand the purview of scientific inquiry to include scientific disciplines fundamental to a deep understanding of reality, but which are currently largely unsupported by conventional grant sources

    To redress incrementalism in research programming by establishing or expanding new “islands” of understanding via flexible funding of high-risk, high-reward research in these areas

    To forge and maintain useful collaborations between researchers working on foundational questions in physics, cosmology, and related fields

    To provide the public with a deeper understanding of known and future discoveries in these areas, and their potential implications for our worldview

    To create a logistically, intellectually, and financially self-sustaining independent Institute to accomplish these goals during and beyond the initial four year program beginning in 2006, thereby pioneering a new model of philanthropically-funded scientific research

    FQXi therefore aims to support research that is both foundational (with potentially significant and broad implications for our understanding of the deep or “ultimate” nature of reality) and unconventional (enabling research that, because of its speculative, non-mainstream, or high-risk nature, would otherwise go unperformed due to lack of funding).

     
  • richardmitnick 11:42 am on September 29, 2015 Permalink | Reply
Tags: Cosmology, THE Q CONTINUUM SIMULATION

    From AAS NOVA: “The Q Continuum Simulation” 

    AASNOVA

American Astronomical Society

    28 September 2015
    Susanna Kohler


Each frame in this image (click through to the full article for the full view!) represents a different stage in the simulated evolution of our universe, ending at the present day in the rightmost panel. In a recently published paper, Katrin Heitmann (Argonne National Laboratory) and collaborators reveal the results from — and challenges inherent in — the largest cosmological simulation currently available: the Q Continuum simulation. Evolving a volume of (1300 Mpc)³, this massive N-body simulation tracks over half a trillion particles as they clump together as a result of their mutual gravity, imitating the evolution of our universe over the last 13.8 billion years. Cosmological simulations such as this one are important for understanding observations, testing analysis pipelines, investigating the capabilities of future observing missions, and much more. For more information and the original image (as well as several other awesome images!), see the paper below.
    Citation:

    Katrin Heitmann et al 2015 ApJS 219 34. doi:10.1088/0067-0049/219/2/34

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:18 pm on September 28, 2015 Permalink | Reply
Tags: Cosmology

    From NOVA: “Could the Universe Be Lopsided?” 

    PBS NOVA

    NOVA

    28 Sep 2015
    Paul Halpern

    One hundred years ago, [Albert] Einstein re-envisioned space and time as a rippling, twisting, flexible fabric called spacetime. His theory of general relativity showed how matter and energy change the shape of this fabric. One might expect, therefore, that the fabric of the universe, strewn with stars, galaxies, and clouds of particles, would be like a college student’s dorm room: a mess of rumpled, crumpled garments.

    Indeed, if you look at the universe on the scale of stars, galaxies, and even galaxy clusters, you’ll find it puckered and furrowed by the gravity of massive objects. But take the wider view—the cosmologists’ view, which encompasses the entire visible universe—and the fabric of the universe is remarkably smooth and even, no matter which direction you turn. Look up, down, left, or right and count up the galaxies you see: you’ll find it’s roughly the same from every angle. The cosmic microwave background [CMB], the cooled-down relic of radiation from the early universe, demonstrates the same remarkable evenness on the very largest scale.

    Cosmic Background Radiation Planck
    CMB per ESA/Planck

    ESA Planck
    ESA/Planck satellite

A computer simulation of the ‘cosmic web’ reveals the great filaments, made largely of dark matter, located in the space between galaxies. By NASA, ESA, and E. Hallman (University of Colorado, Boulder), via Wikimedia Commons

Physicists call a universe that appears roughly similar in all directions isotropic. Because the geometry of spacetime is shaped by the distribution of matter and energy, an isotropic universe must possess a geometric structure that looks the same in all directions as well. The only three such possibilities for three-dimensional spaces are positively curved (the surface of a hypersphere, like a beach ball but in a higher dimension), negatively curved (the surface of a hyperboloid, shaped like a saddle or potato chip), or flat. Russian physicist [Alexander] Friedmann, Belgian cleric and mathematician Georges Lemaître and others incorporated these three geometries into some of the first cosmological solutions of Einstein’s equations. (By solutions, we mean mathematical descriptions of how the three spatial dimensions of the universe behave over time, given the type of geometry and the distribution of matter and energy.) Supplemented by the work of American physicist Howard Robertson and British mathematician Arthur Walker, this class of isotropic solutions has become the standard for descriptions of the universe in the Big Bang theory.
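For reference, all three isotropic possibilities are captured by a single line element, the Friedmann–Lemaître–Robertson–Walker metric:

\[
ds^2 = -c^2\,dt^2 + a(t)^2\left[\frac{dr^2}{1 - kr^2} + r^2\left(d\theta^2 + \sin^2\theta\, d\phi^2\right)\right],
\]

where the scale factor \(a(t)\) carries the expansion history and \(k = +1, 0, -1\) selects the hyperspherical, flat, or hyperbolic geometry.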

    However, in 1921 Edward Kasner—best known for his coining of the term “Googol” for the number 1 followed by 100 zeroes—demonstrated that there was another class of solutions to Einstein’s equations: anisotropic, or “lopsided,” solutions.

    Known as the Kasner solutions, these cosmic models describe a universe that expands in two directions while contracting in the third. That is clearly not the case with the actual universe, which has grown over time in all three directions. But the Kasner solutions become more intriguing when you apply them to a kind of theory called a Kaluza-Klein model, in which there are unseen extra dimensions beyond space and time. Thus space could theoretically have three expanding dimensions and a fourth, hidden, contracting dimension. Physicists Alan Chodos and Steven Detweiler explored this concept in their paper Where has the fifth dimension gone?
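Concretely, the Kasner line element scales each spatial direction by its own power of time:

\[
ds^2 = -dt^2 + t^{2p_1}\,dx^2 + t^{2p_2}\,dy^2 + t^{2p_3}\,dz^2,
\qquad p_1 + p_2 + p_3 = p_1^2 + p_2^2 + p_3^2 = 1,
\]

and apart from the trivial case \((1, 0, 0)\), those two constraints force one exponent to be negative while the other two are positive: two directions expand while the third contracts.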

    Kasner’s is far from the only anisotropic model of the universe. In 1951, physicist Abraham Taub applied the shape-shifting mathematics of Italian mathematician Luigi Bianchi to general relativity and revealed even more baroque classes of anisotropic solutions that expand, contract or pulsate differently in various directions. The most complex of these, categorized as Bianchi type-IX, turned out to have chaotic properties and was dubbed by physicist Charles Misner the “Mixmaster Universe” for its resemblance to the whirling, twirling kitchen appliance.

    Like a cake rising in a tray, while bubbling and quivering on the sides, the Mixmaster Universe expands and contracts, first in one dimension and then in another, while a third dimension just keeps expanding. Each oscillation is called a Kasner epoch. But then, after a certain number of pulses, the direction of pure expansion abruptly switches. The formerly uniformly expanding dimension starts pulsating, and one of those formerly pulsating starts uniformly expanding. It is as if the rising cake were suddenly turned on its side and another direction started rising instead, while the other directions, including the one that was previously rising, just bubbled.

    One of the weird things about the Mixmaster Universe is that if you tabulate the number of Kasner epochs in each era, before the behavior switches, it appears as random as a dice roll. For example, the universe might oscillate in two directions five times, switch, oscillate in two other directions 17 times, switch again, pulsate another way twice, and so forth—without a clear pattern. While the solution stems from deterministic general relativity, it seems unpredictable. This is called deterministic chaos.
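That dice-roll behavior can be reproduced in a few lines using the standard Belinskii–Khalatnikov–Lifshitz (BKL) parametrization of Kasner epochs by a single number u — a toy illustration of the map between epochs, not a general-relativity solver. Within an era, u drops by 1 each epoch; when it falls below 2, the axes swap roles and a new era begins with u → 1/(u − 1), so the era lengths trace the continued-fraction digits of the starting value: fully deterministic, yet effectively random.

```python
# Deterministic chaos in the Mixmaster Universe: era lengths from
# the BKL map. For u = pi the epoch counts come out 3, 7, 15, 1,
# 292, ... -- the continued-fraction digits of pi.
import math

def kasner_exponents(u):
    """Kasner exponents (p1, p2, p3) for parameter u >= 1; they
    satisfy p1 + p2 + p3 = p1^2 + p2^2 + p3^2 = 1."""
    d = 1.0 + u + u * u
    return (-u / d, (1.0 + u) / d, u * (1.0 + u) / d)

u = math.pi  # any irrational starting value will do
for era in range(5):
    epochs = int(u)   # number of Kasner epochs in this era
    ps = tuple(round(p, 3) for p in kasner_exponents(u))
    print(f"era {era}: {epochs} epochs, starting exponents {ps}")
    u -= epochs       # u drops by 1 per epoch ...
    u = 1.0 / u       # ... then bounces into a new era
```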

    Could the early moments of the universe have been chaotic, and then somehow regularized over time, like a smoothed-out pudding? Misner initially thought so, until he realized that the Mixmaster Universe couldn’t smooth out on its own. However, it could have started out “lopsided,” then been stretched out during an era of ultra-rapid expansion called inflation until its irregularities were lost from sight.

    As cosmologists have collected data from instruments such as the Hubble Space Telescope, Planck Satellite, and WMAP satellite (now retired), the bulk of the evidence supports the idea that our universe is indeed isotropic.

    NASA Hubble Telescope
    NASA/ESA Hubble

    WMAP
    NASA/WMAP

But a minority of researchers have used measurements of the velocities of galaxies and other observations — such as an odd alignment of temperature fluctuations in the cosmic microwave background dubbed the “Axis of Evil” — to assert that the universe could be slightly irregular after all.

For example, starting in 2008, Alexander Kashlinsky, a researcher at NASA’s Goddard Space Flight Center, and his colleagues have statistically analyzed cosmic microwave background data gathered first by the WMAP satellite and then by the Planck satellite to show that, in addition to their motion due to cosmic expansion, many galaxy clusters seem to be heading toward a particular direction on the sky. He dubbed this phenomenon “dark flow” and suggested that it is evidence of a previously unseen cosmic anisotropy known as a “tilt.” Although the mainstream astronomical community has disputed Kashlinsky’s conclusion, he has continued to gather statistical evidence for dark flow and the idea of tilted universes.

    Whether or not the universe really is “lopsided,” it is intriguing to study the rich range of solutions of Einstein’s general theory of relativity. Even if the preponderance of evidence today points to cosmic regularity, who knows when a new discovery might call that into question, and compel cosmologists to dust off alternative ideas. Such is the extraordinary flexibility of Einstein’s masterful theory: a century after its publication, physicists are still exploring its possibilities.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 11:22 am on July 26, 2015 Permalink | Reply
Tags: Cosmology, Time Travel

    From RT: “Time-traveling photons connect general relativity to quantum mechanics” 

    RT Logo

    RT

    23 Jun, 2014
    No Writer Credit

Space-time structure exhibiting closed paths in space (horizontal) and time (vertical). A quantum particle travels through a wormhole back in time and returns to the same location in space and time. (Photo credit: Martin Ringbauer)

    Scientists have simulated time travel by using particles of light acting as quantum particles sent away and then brought back to their original space-time location. This is a huge step toward marrying two of the most irreconcilable theories in physics.

    Since traveling all the way to a black hole to see if an object you’re holding would bend, break or put itself back together in inexplicable ways is a bit of a trek, scientists have decided to find a point of convergence between general relativity and quantum mechanics in lab conditions, and they achieved success.

Australian researchers from the UQ’s School of Mathematics and Physics wanted to resolve the discrepancies between two of our most commonly accepted physics theories, which is no easy task: on the one hand, you have Einstein’s theory of general relativity, which predicts the behavior of massive objects like planets and galaxies; on the other, you have the theory of quantum mechanics, whose laws completely clash with Einstein’s and which describes our world at the molecular level. And this is where things get interesting: we still have no concrete idea of all the principles of movement and interaction that underpin this theory.

    Natural laws of space and time simply break down there.

    The light particles used in the study are known as photons, and in this University of Queensland study they stood in for generic quantum particles, allowing the researchers to observe how such particles behave while moving through space and time.

    The team simulated the behavior of a single photon that travels back in time through a wormhole and meets its older self – an identical photon. “We used single photons to do this but the time-travel was simulated by using a second photon to play the part of the past incarnation of the time traveling photon,” said UQ Physics Professor Tim Ralph, as quoted by The Speaker.

    The findings were published in the journal Nature Communications and gained support from the country’s key institutions on quantum physics.

    Some of the biggest examples of why the two approaches can’t be reconciled concern the so-called space-time loop. Einstein’s equations permit paths that return to their starting point in both space and time, known as closed timelike curves, as Kurt Gödel showed in 1949. This presents a problem, commonly known as the ‘grandfather paradox’: if you were to travel back in time and prevent your grandparents from meeting, you would prevent your own birth, yet then you could never have traveled back to interfere in the first place.

    But, as Tim Ralph pointed out, it was suggested in 1991 that such situations could be avoided by harnessing quantum mechanics’ flexible laws: “The properties of quantum particles are ‘fuzzy’ or uncertain to start with, so this gives them enough wiggle room to avoid inconsistent time travel situations,” he said.
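    The theoretical backbone of that 1991 idea is David Deutsch’s model of quantum time travel, in which the state of the particle emerging from the wormhole must be self-consistent with the state that enters it. Below is a minimal numerical sketch of that consistency condition; the CNOT interaction and the input state are illustrative assumptions, not the settings used in the actual experiment.

```python
import numpy as np

# CNOT gate: the "present" qubit (first) controls a bit flip on the
# qubit traversing the closed timelike curve, or CTC (second).
U = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=complex)

rho_in = 0.5 * np.array([[1, 1],
                         [1, 1]], dtype=complex)   # input state |+><+|

def trace_out_first(rho4):
    # Partial trace over the first qubit of a 4x4 density matrix.
    return np.einsum('ijil->jl', rho4.reshape(2, 2, 2, 2))

# Deutsch's consistency condition: rho = Tr_1[ U (rho_in (x) rho) U^dag ].
# Iterating the map from a starting guess converges to a consistent state.
rho_ctc = np.array([[1, 0],
                    [0, 0]], dtype=complex)        # arbitrary initial guess
for _ in range(50):
    rho_ctc = trace_out_first(U @ np.kron(rho_in, rho_ctc) @ U.conj().T)

print(np.round(rho_ctc.real, 3))   # maximally mixed: a paradox-free history
```

    From this starting guess the iteration settles on a maximally mixed, that is maximally uncertain, state: the kind of quantum fuzziness Ralph describes is exactly the wiggle room that dissolves the paradox.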

    There are still regimes in which the meeting points between general relativity and quantum mechanics have never been tested, namely extreme conditions where relativity’s laws are pushed to their limits, such as near the event horizon of a black hole.

    But since it’s not really easy to approach one, the UQ scientists were content with testing out these points of convergence on photons.

    “Our study provides insights into where and how nature might behave differently from what our theories predict,” Professor Ralph said.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 1:19 pm on July 20, 2015 Permalink | Reply
    Tags: , , Cosmology,   

    From NOVA: “Black Holes Could Turn You Into a Hologram, and You Wouldn’t Even Notice” 

    PBS NOVA

    NOVA

    01 Jul 2015
    Tim De Chant

    1
    Black holes may not have event horizons, but fuzzy surfaces.

    Few things are as mysterious as black holes. Except, of course, what would happen to you if you fell into one.

    Physicists have been debating what might happen to anyone unfortunate enough to slip toward the singularity, and so far they’ve come up with approximately 2.5 ways you might die, from being stretched like spaghetti to being burnt to a crisp.

    The fiery hypothesis is a product of the “firewall” argument, which builds on Stephen Hawking’s prediction that black holes eventually evaporate, seemingly destroying everything inside. But that destruction would violate a fundamental principle of physics—that information cannot be destroyed—so other physicists, including Samir Mathur, have been searching for ways to resolve the contradiction.

    Here’s Marika Taylor, writing for The Conversation:

    The general relativity description of black holes suggests that once you go past the event horizon, the surface of a black hole, you can go deeper and deeper. As you do, space and time become warped until they reach a point called the “singularity” at which point the laws of physics cease to exist. (Although in reality, you would die pretty early on in this journey as you are pulled apart by intense tidal forces.)
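    A rough back-of-the-envelope sketch shows why those tidal forces end the journey early at a stellar-mass black hole but would go unnoticed at a supermassive one. The 2-meter infaller is hypothetical, and the Newtonian tidal formula is only an order-of-magnitude stand-in for the full general-relativistic treatment.

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
L = 2.0            # head-to-toe height of our hypothetical infaller, m

def tidal_at_horizon(mass_kg):
    """Newtonian head-to-toe tidal acceleration at the horizon, m/s^2."""
    r_s = 2 * G * mass_kg / c**2           # Schwarzschild radius
    return 2 * G * mass_kg * L / r_s**3    # tidal stretch: 2*G*M*L / r^3

for label, mass in [("stellar, 10 solar masses", 10 * M_SUN),
                    ("supermassive, 4 million solar masses", 4e6 * M_SUN)]:
    print(f"{label}: {tidal_at_horizon(mass):.1e} m/s^2")

# stellar, 10 solar masses: 2.1e+08 m/s^2  (lethal well outside the horizon)
# supermassive, 4 million solar masses: 1.3e-03 m/s^2  (imperceptible)
```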

    In Mathur’s universe, however, there is nothing beyond the fuzzy event horizon.

    Mathur’s take on black holes suggests that they aren’t surrounded by a point-of-no-return event horizon or a firewall that would incinerate you, but a fuzzball with small variations that maintain a record of the information that fell into it. What does touch the fuzzball is converted into a hologram. It’s not a perfect copy, but a doppelgänger of sorts.

    Perhaps more bizarrely, you wouldn’t even be aware of the transformation. Say you were to be sucked toward a black hole. At the point where you’d normally hit the event horizon, Mathur says, you’d instead touch the fuzzy surface. But you wouldn’t notice anything: the fuzzy surface would appear like any other part of space immediately around you. Everything would seem the same as it was, except that you’d be a hologram.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 4:24 pm on July 19, 2015 Permalink | Reply
    Tags: , Cosmology, , ,   

    From WIRED: “Chemists Invent New Letters for Nature’s Genetic Alphabet” 

    Wired logo

    Wired

    07.19.15
    Emily Singer

    1
    Olena Shmahalo/Quanta Magazine

    DNA stores our genetic code in an elegant double helix.

    1
    The structure of the DNA double helix. The atoms in the structure are colour-coded by element and the detailed structure of two base pairs are shown in the bottom right.

    But some argue that this elegance is overrated. “DNA as a molecule has many things wrong with it,” said Steven Benner, an organic chemist at the Foundation for Applied Molecular Evolution in Florida.

    Nearly 30 years ago, Benner sketched out better versions of both DNA and its chemical cousin RNA, adding new letters and other modifications that would expand their repertoire of chemical feats.

    2
    A hairpin loop from a pre-mRNA. Highlighted are the nucleobases (green) and the ribose-phosphate backbone (blue). Note that this is a single strand of RNA that folds back upon itself.

    He wondered why these improvements hadn’t occurred in living creatures. Nature has written the entire language of life using just four chemical letters: G, C, A and T. Did our genetic code settle on these four nucleotides for a reason? Or was this system one of many possibilities, selected by simple chance? Perhaps expanding the code could make it better.

    Benner’s early attempts at synthesizing new chemical letters failed. But with each false start, his team learned more about what makes a good nucleotide and gained a better understanding of the precise molecular details that make DNA and RNA work. The researchers’ efforts progressed slowly, as they had to design new tools to manipulate the extended alphabet they were building. “We have had to re-create, for our artificially designed DNA, all of the molecular biology that evolution took 4 billion years to create for natural DNA,” Benner said.

    Now, after decades of work, Benner’s team has synthesized artificially enhanced DNA that functions much like ordinary DNA, if not better. In two papers published in the Journal of the American Chemical Society last month, the researchers have shown that two synthetic nucleotides called P and Z fit seamlessly into DNA’s helical structure, maintaining the natural shape of DNA. Moreover, DNA sequences incorporating these letters can evolve just like traditional DNA, a first for an expanded genetic alphabet.

    The new nucleotides even outperform their natural counterparts. When challenged to evolve a segment that selectively binds to cancer cells, DNA sequences using P and Z did better than those without.

    “When you compare the four-nucleotide and six-nucleotide alphabet, the six-nucleotide version seems to have won out,” said Andrew Ellington, a biochemist at the University of Texas, Austin, who was not involved in the study.

    Benner has lofty goals for his synthetic molecules. He wants to create an alternative genetic system in which proteins—intricately folded molecules that perform essential biological functions—are unnecessary. Perhaps, Benner proposes, instead of our standard three-component system of DNA, RNA and proteins, life on other planets evolved with just two.

    Better Blueprints for Life

    The primary job of DNA is to store information. Its sequence of letters contains the blueprints for building proteins. Our current four-letter alphabet encodes 20 amino acids, which are strung together to create millions of different proteins. But a six-letter alphabet could encode as many as 216 possible amino acids and many, many more possible proteins.

    3
    Expanding the genetic alphabet dramatically expands the number of possible amino acids and proteins that cells can build, at least in theory. The existing four-letter alphabet produces 20 amino acids (small circle) while a six-letter alphabet could produce 216 possible amino acids. Olena Shmahalo/Quanta Magazine
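    The arithmetic behind those figures is straightforward: codons are read in triplets, so an alphabet of N letters permits N cubed distinct codons, the raw ceiling on how many amino acids it could specify. A quick check (the six-letter string is just shorthand for the P/Z-expanded alphabet):

```python
# An N-letter alphabet read in triplets gives N**3 possible codons,
# the upper bound on how many amino acids it could, in principle, encode.
for alphabet in ("GCAT", "GCATPZ"):        # natural vs. P/Z-expanded
    n = len(alphabet)
    print(f"{n} letters -> {n**3} codons")

# 4 letters -> 64 codons   (nature uses these for 20 amino acids + stops)
# 6 letters -> 216 codons
```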

    Why nature stuck with four letters is one of biology’s fundamental questions. Computers, after all, use a binary system with just two “letters”—0s and 1s. Yet two letters probably aren’t enough to create the array of biological molecules that make up life. “If you have a two-letter code, you limit the number of combinations you get,” said Ramanarayanan Krishnamurthy, a chemist at the Scripps Research Institute in La Jolla, Calif.

    On the other hand, additional letters could make the system more error prone. DNA bases come in pairs—G pairs with C and A pairs with T. It’s this pairing that endows DNA with the ability to pass along genetic information. With a larger alphabet, each letter has a greater chance of pairing with the wrong partner, and new copies of DNA might harbor more mistakes. “If you go past four, it becomes too unwieldy,” Krishnamurthy said.

    But perhaps the advantages of a larger alphabet can outweigh the potential drawbacks. Six-letter DNA could pack genetic information more densely. And perhaps six-letter RNA could take over some of the jobs now handled by proteins, which perform most of the work in the cell.
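    That density claim can be made precise: a base drawn from an N-letter alphabet carries log2(N) bits of information, so six letters hold roughly 29 percent more per base than four. A short comparison (the labels are just for readability):

```python
import math

# Information carried per base by an N-letter alphabet is log2(N) bits.
for n, label in [(2, "binary"), (4, "natural DNA"), (6, "six-letter DNA")]:
    print(f"{label}: {math.log2(n):.2f} bits per base")

# binary: 1.00, natural DNA: 2.00, six-letter DNA: 2.58 bits per base
```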

    Proteins have a much more flexible structure than DNA and RNA and are capable of folding into an array of complex shapes. A properly folded protein can act as a molecular lock, opening a chamber only for the right key. Or it can act as a catalyst, capturing and bringing together different molecules for chemical reactions.

    Adding new letters to RNA could give it some of these abilities. “Six letters can potentially fold into more, different structures than four letters,” Ellington said.

    Back when Benner was sketching out ideas for alternative DNA and RNA, it was this potential that he had in mind. According to the most widely held theory of life’s origins, RNA once performed both the information-storage job of DNA and the catalytic job of proteins. Benner realized that there are many ways to make RNA a better catalyst.

    “With just these little insights, I was able to write down the structures that are in my notebook as alternatives that would make DNA and RNA better,” Benner said. “So the question is: Why did life not make these alternatives? One way to find out was to make them ourselves, in the laboratory, and see how they work.”

    3
    Steven Benner’s lab notebook from 1985 outlining plans to synthesize “better” DNA and RNA by adding new chemical letters. Courtesy of Steven Benner

    It’s one thing to design new codes on paper, and quite another to make them work in real biological systems. Other researchers have created their own additions to the genetic code, in one case even incorporating new letters into living bacteria. But these other bases fit together a bit differently from natural ones, stacking on top of each other rather than linking side by side. This can distort the shape of DNA, particularly when a number of these bases cluster together. Benner’s P-Z pair, however, is designed to mimic natural bases.

    One of the new papers by Benner’s team shows that Z and P are yoked together by the same chemical bond that ties A to T and C to G. (This bond is known as Watson-Crick pairing, after the scientists who discovered DNA’s structure.) Millie Georgiadis, a chemist at Indiana University-Purdue University Indianapolis, along with Benner and other collaborators, showed that DNA strands that incorporate Z and P retain their proper helical shape if the new letters are strung together or interspersed with natural letters.

    “This is very impressive work,” said Jack Szostak, a chemist at Harvard University who studies the origin of life, and who was not involved in the study. “Finding a novel base pair that does not grossly disrupt the double-helical structure of DNA has been quite difficult.”

    The team’s second paper demonstrates how well the expanded alphabet works. Researchers started with a random library of DNA strands constructed from the expanded alphabet and then selected the strands that were able to bind to liver cancer cells but not to other cells. Of the 12 successful binders, the best had Zs and Ps in their sequences, while the weakest did not.

    “More functionality in the nucleobases has led to greater functionality in nucleic acids themselves,” Ellington said. In other words, the new additions appear to improve the alphabet, at least under these conditions.

    But additional experiments are needed to determine how broadly that’s true. “I think it will take more work, and more direct comparisons, to be sure that a six-letter version generally results in ‘better’ aptamers [short DNA strands] than four-letter DNA,” Szostak said. For example, it’s unclear whether the six-letter alphabet triumphed because it provided more sequence options or because one of the new letters is simply better at binding, Szostak said.

    Benner wants to expand his genetic alphabet even further, which could enhance its functional repertoire. He’s working on creating a 10- or 12-letter system and plans to move the new alphabet into living cells. Benner’s and others’ synthetic molecules have already proved useful in medical and biotech applications, such as diagnostic tests for HIV and other diseases. Indeed, Benner’s work helped to found the burgeoning field of synthetic biology, which seeks to build new life, in addition to forming useful tools from molecular parts.

    Why Life’s Code Is Limited

    Benner’s work and that of other researchers suggests that a larger alphabet has the capacity to enhance DNA’s function. So why didn’t nature expand its alphabet in the 4 billion years it has had to work on it? It could be because a larger repertoire has potential disadvantages. Some of the structures made possible by a larger alphabet might be of poor quality, with a greater risk of misfolding, Ellington said.

    Nature was also effectively locked into the system at hand when life began. “Once [nature] has made a decision about which molecular structures to place at the core of its molecular biology, it has relatively little opportunity to change those decisions,” Benner said. “By constructing unnatural systems, we are learning not only about the constraints at the time that life first emerged, but also about constraints that prevent life from searching broadly within the imagination of chemistry.”

    5
    The genetic code—made up of the four letters, A, T, G and C—stores the blueprint for proteins. DNA is first transcribed into RNA and then translated into proteins, which fold into specific shapes. Olena Shmahalo/Quanta Magazine

    Benner aims to make a thorough search of that chemical space, using his discoveries to make new and improved versions of both DNA and RNA. He wants to make DNA better at storing information and RNA better at catalyzing reactions. He hasn’t shown directly that the P-Z base pairs do that. But both bases have the potential to help RNA fold into more complex structures, which in turn could make proteins better catalysts. P has a place to add a “functional group,” a molecular structure that helps folding and is typically found in proteins. And Z has a nitro group, which could aid in molecular binding.

    In modern cells, RNA acts as an intermediary between DNA and proteins. But Benner ultimately hopes to show that the three-biopolymer system—DNA, RNA and proteins—that exists throughout life on Earth isn’t essential. With better-engineered DNA and RNA, he says, perhaps proteins are unnecessary.

    Indeed, the three-biopolymer system may have drawbacks, since information flows only one way, from DNA to RNA to proteins. If a DNA mutation produces a more efficient protein, that mutation will spread slowly, as organisms without it eventually die off.

    What if the more efficient protein could spread some other way, by directly creating new DNA? DNA and RNA can transmit information in both directions. So a helpful RNA mutation could theoretically be transformed into beneficial DNA. Adaptations could thus lead directly to changes in the genetic code.

    Benner predicts that a two-biopolymer system would evolve faster than our own three-biopolymer system. If so, this could have implications for life on distant planets. “If we find life elsewhere,” he said, “it would likely have the two-biopolymer system.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     