Updates from richardmitnick

• richardmitnick 5:44 pm on February 17, 2018

    From ESO: “7. Challenges in Obtaining an Image of a Supermassive Black Hole” 


    European Southern Observatory

    “Seeing a black hole” has been a long-cherished desire for many astronomers, but now, thanks to the Event Horizon Telescope (EHT) and the Global mm-VLBI Array (GMVA) projects, it may no longer be just a dream.

Event Horizon Telescope Array

Arizona Radio Observatory/Submillimeter-wave Astronomy (ARO/SMT)

ESO/APEX Atacama Pathfinder EXperiment

Combined Array for Research in Millimeter-wave Astronomy (CARMA), no longer in service

Atacama Submillimeter Telescope Experiment (ASTE)

Caltech Submillimeter Observatory (CSO)

IRAM NOEMA interferometer

Institut de Radioastronomie Millimetrique (IRAM) 30m

James Clerk Maxwell Telescope, Mauna Kea, Hawaii, USA

Large Millimeter Telescope Alfonso Serrano

CfA Submillimeter Array, Hawaii, SAO

ESO/NRAO/NAOJ ALMA Array, Chile

South Pole Telescope SPTPOL

Future Array/Telescopes

Plateau de Bure interferometer

NSF CfA Greenland Telescope

Global mm-VLBI Array

    To make it possible to image the shadow of the event horizon of Sagittarius A* [SgrA*], many researchers and cutting-edge technologies have been mobilised — because obtaining an image of a black hole is not as easy as snapping a photo with an ordinary camera.

    Sagittarius A* has a mass of approximately four million times that of the Sun, but it only looks like a tiny dot from Earth, 26 000 light-years away.

Sgr A*, the supermassive black hole at the center of the Milky Way. NASA’s Chandra X-ray Observatory

    NASA/Chandra Telescope

To capture its image, incredibly high resolution is needed. As explained in the fifth post of this blog series, the key is to use Very-Long-Baseline Interferometry (VLBI), a technique that combines the observing power of, and the data from, telescopes around the world to create a virtual giant radio telescope.

    The resolution of a telescope can be calculated from the radio wavelength the telescope is observing at and the size of the telescope — or in VLBI, the distance between the antennas. However, while actually observing, several kinds of noise and errors interfere with the telescope’s performance and affect the resolution.
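To put numbers on this, here is a rough back-of-the-envelope sketch (added for illustration, not part of the original post) that estimates the apparent size of the Sgr A* shadow from the mass and distance quoted above and compares it with the diffraction-limited resolution of an Earth-sized array observing at 1.3 mm. The physical constants and the factor of sqrt(27) for the shadow diameter are standard values, not taken from the article.

```python
# Back-of-the-envelope estimate (illustrative only):
# how big is the shadow of Sgr A*, and can an Earth-sized array resolve it?
import math

G = 6.674e-11        # gravitational constant [m^3 kg^-1 s^-2]
c = 2.998e8          # speed of light [m/s]
M_sun = 1.989e30     # solar mass [kg]
ly = 9.461e15        # light-year [m]

M = 4.0e6 * M_sun    # mass of Sgr A*: roughly four million solar masses
D = 26_000 * ly      # distance to the Galactic Centre

r_s = 2 * G * M / c**2                   # Schwarzschild radius
shadow_rad = math.sqrt(27) * r_s / D     # apparent shadow diameter (about 5.2 r_s), in radians

wavelength = 1.3e-3                      # EHT observing wavelength [m]
baseline = 1.27e7                        # roughly Earth's diameter: the longest possible baseline [m]
resolution_rad = wavelength / baseline   # diffraction-limited resolution ~ lambda / baseline

to_uas = 180 / math.pi * 3600 * 1e6      # radians -> microarcseconds
print(f"Sgr A* shadow diameter             : {shadow_rad * to_uas:.0f} microarcseconds")
print(f"Earth-baseline resolution at 1.3 mm: {resolution_rad * to_uas:.0f} microarcseconds")
```

Both numbers come out at a few tens of microarcseconds, which is why only a planet-spanning array observing at millimetre wavelengths has any chance of resolving the shadow.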

In VLBI, each antenna is equipped with an extremely precise atomic clock to record the time at which radio signals from the target object were received. The gathered data are synthesised using the times as a reference, so that the arrival time of the radio waves at each antenna can be accurately adjusted.
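As a toy illustration of what that time-referenced synthesis involves (a hypothetical sketch, not the EHT's actual correlator software), the snippet below generates a common noise-like signal, "records" it at two stations with independent receiver noise and a relative delay, and then recovers that delay from the peak of the cross-correlation, which is the basic operation a correlator performs before the data from different antennas can be combined.

```python
# Hypothetical sketch of delay recovery in a correlator (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n = 4096
true_delay = 37                                   # delay of station B relative to A, in samples

source = rng.normal(size=n + 100)                 # common noise-like signal from the source
station_a = source[:n] + 0.5 * rng.normal(size=n)                        # receiver noise at station A
station_b = np.roll(source, true_delay)[:n] + 0.5 * rng.normal(size=n)   # delayed copy plus noise

# Cross-correlate the two recordings and locate the peak: that lag is the
# geometric delay that must be compensated before the signals are combined.
lags = np.arange(-n + 1, n)
xcorr = np.correlate(station_b, station_a, mode="full")
print(f"recovered delay: {lags[np.argmax(xcorr)]} samples (true delay: {true_delay})")
```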

    But this process isn’t always straightforward because the Earth’s atmosphere blocks a certain range of wavelengths. Several kinds of molecules such as water vapour absorb a fraction of radio waves that pass through the atmosphere, with shorter wavelengths more susceptible to absorption. To minimise the effect of atmospheric absorption, radio telescopes are built at high and dry sites, but even then they are still not completely immune from the effect.

    The tricky part of this absorption effect is that the direction of a radio wave is slightly changed when it passes through the atmosphere containing water vapour. This means that the radio waves arrive at different times at each antenna, making it difficult to synthesise the data later using the time signal as a reference. And even worse: since VLBI utilises antennas located thousands of kilometres apart, it has to take into account the differences in the amount of water vapour in the sky above each site, as well as the large fluctuations of water vapour content during the observation period. In optical observations, these fluctuations make the light of a star flicker and lower the resolution. Radio observations have similar problems.

    “We have only a few ways to reduce this effect in VLBI observations,” explains Satoki Matsushita at the Academia Sinica Institute of Astronomy and Astrophysics (ASIAA) of Taiwan. “If there is a compact object emitting intense radiation near the target object, we can remove most of the effect of refraction of radio waves by water vapour by using such an intense radiation source as a reference. However, no such intense reference source has been found near Sagittarius A* so far. And even if there is a reference source, there are still necessary conditions that must be satisfied: the telescopes need to have the ability to observe the target object and reference object at the same time; or the telescopes need to have the high-speed drive mechanism to quickly switch the observation between the target object and the reference object. Unfortunately, not all telescopes participating in the EHT/GMVA observations have this capability. One of the methods to remove the effect is to equip each antenna with an instrument to measure the amount of water vapour, but ALMA is the only telescope that has adopted this method at this point.”

    Another major challenge in imaging a black hole is obtaining a high-quality image. By combining the data collected by antennas thousands of kilometres apart, VLBI achieves a resolution equivalent to a radio telescope several thousands of kilometres in diameter. However, VLBI also has a lot of large blank areas that are not covered by any of the antennas. These missing parts make it difficult for VLBI to reproduce a high-fidelity image of a target object from the synthesised data. This is a common problem for all radio interferometers, including ALMA, but it can be more serious in VLBI where the antennas are located very far apart.

    It might be natural to think that a higher resolution means a higher image quality, as is the case with an ordinary digital camera, but in radio observations the resolution and image quality are quite different things. The resolution of a telescope determines how close two objects can be to each other and yet still be resolved as separate objects, while the image quality defines the fidelity in reproducing the image of the structure of the observed object. For example, imagine a leaf, which has a variety of veins. The resolution is the ability to see thinner vein patterns, while the image quality is the ability to capture the overall spread of the leaf. In normal human experience, it would seem bizarre if you could see the very thin veins of a leaf but couldn’t grasp a complete view of the leaf — but such things happen in VLBI, since some portions of data are inevitably missing.

    This infographic illustrates how ALMA contributes to the EHT observations. With its shorter baseline, ALMA is sensitive to larger scales than the EHT and so ALMA can fill in the lower-resolution, larger-scale structures that the EHT misses. Credit: NRAO
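A small numerical sketch (illustrative only, not EHT software) shows the same effect the infographic describes: in the Fourier plane that an interferometer samples, deleting the lowest spatial frequencies (the short baselines that a compact array such as ALMA supplies) destroys the large-scale envelope, the overall "spread of the leaf", while the fine detail survives.

```python
# Illustrative sketch: remove the low spatial frequencies ("short baselines")
# from an image; fine structure survives but the large-scale envelope is lost.
import numpy as np

n = 256
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]

# Toy source: a broad envelope (large-scale structure) carrying fine ripples (small-scale detail)
envelope = np.exp(-(x**2 + y**2) / (2 * 40.0**2))
image = envelope * (1.0 + 0.3 * np.cos(2 * np.pi * x / 8.0))

# Go to the Fourier (baseline) plane and delete everything inside a small radius,
# mimicking an array that has no short spacings
vis = np.fft.fftshift(np.fft.fft2(image))
vis[np.sqrt(x**2 + y**2) < 10] = 0.0
reconstructed = np.fft.ifft2(np.fft.ifftshift(vis)).real

print(f"total flux before                        : {image.sum():.1f}")
print(f"total flux after removing short baselines: {reconstructed.sum():.1f}")  # close to zero
```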

    Researchers have been studying data processing methods to improve image quality for almost as long as the history of the radio interferometer itself, so there are some established methods that are already widely used, while others are still in an experimental phase. In the Event Horizon Telescope (EHT) and the Global mm-VLBI Array (GMVA) projects, which are both aiming to capture the shadow of a black hole’s event horizon for the first time, researchers began to develop effective image analysis methods using simulation data well before the start of the observations.

    A simulated image of the supermassive black hole at the centre of the M87 galaxy. The dark gap at the centre is the shadow of the black hole. Credit: Monika Moscibrodzka (Radboud University)

    The observations with the EHT and the GMVA were completed in April 2017. The data collected by the antennas around the world has been sent to the US and Germany, where data processing will be conducted with dedicated data-processing computers called correlators. The data from the South Pole Telescope, one of the participating telescopes in the EHT, will arrive at the end of 2017, and then data calibration and data synthesis will begin in order to produce an image, if possible. This process might take several months to achieve the goal of obtaining the first image of a black hole, which is eagerly awaited by black hole researchers and the general astronomical community worldwide.

    This lengthy time span between observations and results is normal in astronomy, as the reduction and analysis of the data is a careful, time-consuming process. Right now, all we can do is wait patiently for success to come — for a long-held dream of astronomers to be transformed into a reality.

For now, this is the last post in our blog series about the EHT and GMVA projects. When the results become available in early 2018, we’ll be back with what will hopefully be exciting new information about our turbulent and fascinating galactic centre.

See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition
    Visit ESO in Social Media-

    Facebook

    Twitter

    YouTube


    ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

    ESO LaSilla
    ESO/Cerro LaSilla 600 km north of Santiago de Chile at an altitude of 2400 metres.

    ESO VLT
    VLT at Cerro Paranal, with an elevation of 2,635 metres (8,645 ft) above sea level.

    ESO Vista Telescope
    ESO/Vista Telescope at Cerro Paranal, with an elevation of 2,635 metres (8,645 ft) above sea level.

    ESO NTT
    ESO/NTT at Cerro LaSilla 600 km north of Santiago de Chile at an altitude of 2400 metres.

    ESO VLT Survey telescope
    VLT Survey Telescope at Cerro Paranal with an elevation of 2,635 metres (8,645 ft) above sea level.

    ALMA Array
    ALMA on the Chajnantor plateau at 5,000 metres.

    ESO E-ELT
    ESO/E-ELT to be built at Cerro Armazones at 3,060 m.

    ESO APEX
    APEX Atacama Pathfinder 5,100 meters above sea level, at the Llano de Chajnantor Observatory in the Atacama desert.

    Leiden MASCARA instrument, La Silla, located in the southern Atacama Desert 600 kilometres (370 mi) north of Santiago de Chile at an altitude of 2,400 metres (7,900 ft)

    Leiden MASCARA cabinet at ESO Cerro la Silla located in the southern Atacama Desert 600 kilometres (370 mi) north of Santiago de Chile at an altitude of 2,400 metres (7,900 ft)

ESO Next Generation Transit Survey at Cerro Paranal, 2,635 metres (8,645 ft) above sea level

    SPECULOOS four 1m-diameter robotic telescopes 2016 in the ESO Paranal Observatory, 2,635 metres (8,645 ft) above sea level

    ESO TAROT telescope at Paranal, 2,635 metres (8,645 ft) above sea level

    ESO ExTrA telescopes at Cerro LaSilla at an altitude of 2400 metres

     
• richardmitnick 1:10 pm on February 17, 2018
Tags: A new approach to rechargeable batteries

    From MIT: “A new approach to rechargeable batteries” 

MIT News

    January 22, 2018 [Just now in social media.]
    David L. Chandler


    A type of battery first invented nearly five decades ago could catapult to the forefront of energy storage technologies, thanks to a new finding by researchers at MIT. Illustration modified from an original image by Felice Frankel

    A type of battery first invented nearly five decades ago could catapult to the forefront of energy storage technologies, thanks to a new finding by researchers at MIT. The battery, based on electrodes made of sodium and nickel chloride and using a new type of metal mesh membrane, could be used for grid-scale installations to make intermittent power sources such as wind and solar capable of delivering reliable baseload electricity.

    The findings are being reported today in the journal Nature Energy, by a team led by MIT professor Donald Sadoway, postdocs Huayi Yin and Brice Chung, and four others.

    Although the basic battery chemistry the team used, based on a liquid sodium electrode material, was first described in 1968, the concept never caught on as a practical approach because of one significant drawback: It required the use of a thin membrane to separate its molten components, and the only known material with the needed properties for that membrane was a brittle and fragile ceramic. These paper-thin membranes made the batteries too easily damaged in real-world operating conditions, so apart from a few specialized industrial applications, the system has never been widely implemented.

    But Sadoway and his team took a different approach, realizing that the functions of that membrane could instead be performed by a specially coated metal mesh, a much stronger and more flexible material that could stand up to the rigors of use in industrial-scale storage systems.

    “I consider this a breakthrough,” Sadoway says, because for the first time in five decades, this type of battery — whose advantages include cheap, abundant raw materials, very safe operational characteristics, and an ability to go through many charge-discharge cycles without degradation — could finally become practical.

    While some companies have continued to make liquid-sodium batteries for specialized uses, “the cost was kept high because of the fragility of the ceramic membranes,” says Sadoway, the John F. Elliott Professor of Materials Chemistry. “Nobody’s really been able to make that process work,” including GE, which spent nearly 10 years working on the technology before abandoning the project.

    As Sadoway and his team explored various options for the different components in a molten-metal-based battery, they were surprised by the results of one of their tests using lead compounds. “We opened the cell and found droplets” inside the test chamber, which “would have to have been droplets of molten lead,” he says. But instead of acting as a membrane, as expected, the compound material “was acting as an electrode,” actively taking part in the battery’s electrochemical reaction.

    “That really opened our eyes to a completely different technology,” he says. The membrane had performed its role — selectively allowing certain molecules to pass through while blocking others — in an entirely different way, using its electrical properties rather than the typical mechanical sorting based on the sizes of pores in the material.

    In the end, after experimenting with various compounds, the team found that an ordinary steel mesh coated with a solution of titanium nitride could perform all the functions of the previously used ceramic membranes, but without the brittleness and fragility. The results could make possible a whole family of inexpensive and durable materials practical for large-scale rechargeable batteries.

    The use of the new type of membrane can be applied to a wide variety of molten-electrode battery chemistries, he says, and opens up new avenues for battery design. “The fact that you can build a sodium-sulfur type of battery, or a sodium/nickel-chloride type of battery, without resorting to the use of fragile, brittle ceramic — that changes everything,” he says.

    The work could lead to inexpensive batteries large enough to make intermittent, renewable power sources practical for grid-scale storage, and the same underlying technology could have other applications as well, such as for some kinds of metal production, Sadoway says.

    Sadoway cautions that such batteries would not be suitable for some major uses, such as cars or phones. Their strong point is in large, fixed installations where cost is paramount, but size and weight are not, such as utility-scale load leveling. In those applications, inexpensive battery technology could potentially enable a much greater percentage of intermittent renewable energy sources to take the place of baseload, always-available power sources, which are now dominated by fossil fuels.

    The research team included Fei Chen, a visiting scientist from Wuhan University of Technology; Nobuyuki Tanaka, a visiting scientist from the Japan Atomic Energy Agency; MIT research scientist Takanari Ouchi; and postdocs Huayi Yin, Brice Chung, and Ji Zhao. The work was supported by the French oil company Total S.A. through the MIT Energy Initiative.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.


     
• richardmitnick 12:33 pm on February 17, 2018
Tags: Uncovering Genome Mysteries Project

    From Uncovering Genome Mysteries Project at WCG: “Analysis Underway on 30 Terabytes of Data” 



    World Community Grid (WCG)

    24 Nov 2017 [In social media just now]

    Summary
    The Uncovering Genome Mysteries data (all 30 terabytes) was transferred to the research teams in Brazil and Australia this year. Now, the researchers are analyzing this vast amount of data, and looking for ways to make it easy for other scientists and the public to understand.

    Background

    Last year, World Community Grid volunteers completed the calculations for the Uncovering Genome Mysteries project, which examined approximately 200 million genes from a wide variety of life forms to help discover new protein functions. The project’s main goals include:

    Discovering new protein functions and augmenting knowledge about biochemical processes in general
    Identifying how organisms interact with each other and the environment
    Documenting the current baseline microbial diversity, allowing a better understanding of how microorganisms change under environmental stresses, such as climate change
    Understanding and modeling complex microbial systems

The data generated by World Community Grid volunteers has been regrouped on the new bioinformatics server at the Oswaldo Cruz Foundation (Fiocruz), under the direction of Dr. Wim Degrave. Additionally, a full copy of all data has been sent to co-investigator Dr. Torsten Thomas and his team from the Centre for Marine Bio-Innovation & the School of Biological, Earth and Environmental Sciences at the University of New South Wales in Sydney, Australia. At the University of New South Wales, the results from the protein comparisons will help to interpret analyses of marine bacterial ecosystems, where micro-organisms, coral reefs, sponges and many other intriguing creatures interact and form living communities. The dataset, more than 30 terabytes in highly compressed form, took a few months to transfer from Brazil to Australia.

    Data Processing and Analysis at Fiocruz

The Fiocruz team has been busy with further processing of the project's primary output. In this workflow, the raw data are expanded and deciphered, matched with the correct inter-genome comparisons, checked for errors, tabulated, and associated with many different data objects to transform them into meaningful information.

The team is dealing with the rapidly growing size of the database, and has purchased and installed new hardware (600 TB) to help accommodate all the data. They also wish to build a database interface that appeals to the general public interested in biodiversity, and not only to scientists who specialize in the functional analysis of proteins encoded in the genomes of particular life forms.

Some of the data are currently being used in projects such as vaccine and drug design against arboviruses including Zika, dengue and yellow fever. They are also being used to understand how bacteria interact with their environment and how this is reflected in their metabolic pathways, for example when free-living bacteria are compared with close relatives that are human pathogens, such as Mycobacterium tuberculosis versus environmental mycobacteria.

    Searching for Partnerships

Fiocruz is looking for partnerships that would add extra data analytics and artificial intelligence to the project. The researchers would like to include visualizations of functional connections between organisms, as well as particularities of a wide variety of organisms, including archaea from deep-sea thermal vents; bacteria and protists (single-celled organisms that are not animals, plants or fungi) from soil, water, land, and sea, or important for human, animal, or plant health; and highly complex plant, animal, and human genomes.

    We thank everyone who participated in the World Community Grid portion of this project, and look forward to sharing more updates as we continue to analyze the data.

    See the full article here.

    Ways to access the blog:
    https://sciencesprings.wordpress.com
    http://facebook.com/sciencesprings

    Please help promote STEM in your local schools.

    Stem Education Coalition

World Community Grid (WCG) brings people together from across the globe to create the largest non-profit computing grid benefiting humanity. It does this by pooling surplus computer processing power. We believe that innovation combined with visionary scientific research and large-scale volunteerism can help make the planet smarter. Our success depends on like-minded individuals – like you.
    WCG projects run on BOINC software from UC Berkeley.

BOINC is a leader in the field(s) of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing.


    CAN ONE PERSON MAKE A DIFFERENCE? YOU BET!!

My BOINC
    “Download and install secure, free software that captures your computer’s spare power when it is on, but idle. You will then be a World Community Grid volunteer. It’s that simple!” You can download the software at either WCG or BOINC.

Please visit the project pages-

Smash Childhood Cancer

FightAIDS@home Phase II

OpenZika

Help Stop TB

Outsmart Ebola Together

Mapping Cancer Markers

Uncovering Genome Mysteries

Say No to Schistosoma

GO Fight Against Malaria

Drug Search for Leishmaniasis

Computing for Clean Water

The Clean Energy Project

Discovering Dengue Drugs – Together

Help Cure Muscular Dystrophy

Help Fight Childhood Cancer

Help Conquer Cancer

Human Proteome Folding

FightAIDS@Home

World Community Grid is a social initiative of IBM Corporation

IBM – Smarter Planet

     
• richardmitnick 8:33 am on February 17, 2018

    From Astronomy Magazine: “Celebrating Pluto’s discovery” 

Astronomy Magazine

    February 15, 2018
    Alison Klesman

This is Pluto as it appeared to the New Horizons spacecraft during its approach to the dwarf planet in July 2015. NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute.

    On February 18, 1930, Pluto was discovered by astronomer Clyde W. Tombaugh at the Lowell Observatory in Flagstaff, Arizona.

    Lowell Observatory, in Flagstaff, Arizona, USA

    Compared with the major planets in our solar system, Pluto has had a shorter but rockier history. Originally hailed as our solar system’s ninth planet, Pluto was reclassified as a dwarf planet by a 2006 vote of the International Astronomical Union — a move that remains controversial and challenged to this day.

Pluto, regardless of the category into which it is sorted, has played a vital role in our understanding of the formation and evolution of our solar system. We now know it is part of a family of objects called the Kuiper Belt, composed of icy, rocky remnants from the solar nebula’s earliest days. The Pluto system itself is larger than initially believed; its largest moon, Charon, wasn’t discovered until 1978, and only in the past two decades have astronomers uncovered four more tiny moons using the world’s most powerful telescopes.

    An artist’s concept shows New Horizons flying through the Pluto system. Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute.

Until 2015, Pluto remained a dim dot through Earthbound telescopes, and a mere few pixels on images taken by the orbiting Hubble Space Telescope. On July 14, 2015, the New Horizons spacecraft flew past the Pluto system, forever changing our view of this distant world. Astronomy celebrated the accomplishment with our Year of Pluto, a wealth of fascinating articles looking back over our past expectations, guesses, and dreams about Pluto, and highlighting the unrivaled success of New Horizons and the wealth of information it unlocked over the course of just a few short hours.

    Circling the Sun on an elliptical orbit tilted relative to the plane of the planets, Pluto takes about 248 (Earth) years to make one trip; the tiny, icy world has not yet completed even a single orbit since its discovery. But despite its distance and its still-controversial status, Pluto remains one of the most beloved and fascinating objects in our solar system. Below, you can find links to some of our favorite articles on the history of Pluto, leading up to its discovery, its naming, and the 2015 flyby. Or we invite you to explore our full library of Pluto articles here: Year of Pluto.
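The 248-year figure follows directly from Kepler's third law; the quick check below (added for illustration, with a rounded standard value for Pluto's average distance that does not appear in the article) also shows how little of one orbit Pluto has completed since 1930.

```python
# Quick illustrative check with Kepler's third law (P^2 = a^3 in years and AU).
a_au = 39.5                          # Pluto's semi-major axis in astronomical units (rounded)
period_years = a_au ** 1.5
fraction_since_discovery = (2018 - 1930) / period_years
print(f"orbital period: about {period_years:.0f} years")
print(f"fraction of one orbit completed since 1930: {fraction_since_discovery:.0%}")
```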

    And if, like many, you believe Pluto should regain its place among the rightful planets of our solar system, stay tuned — Astronomy will be featuring an exclusive on the definition of the word planet, and how we might rethink it, in an upcoming magazine issue and online bonus feature.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
• richardmitnick 8:19 am on February 17, 2018
Tags: Lensed quasar RXJ1131−1231

    From Astronomy: “Astronomers report a possible slew of extragalactic exoplanets” 

Astronomy Magazine

    February 09, 2018
    Mara Johnson-Groh

    Could a distant galaxy be home to a large population of unbound planets?

    Astronomers have identified a population of rogue planets – planets not bound to or orbiting parent stars – in a lensing galaxy sitting between Earth and a distant quasar.
    NASA/JPL-Caltech

    Discoveries of exoplanets in our galaxy exceed 3,700 to date, but if that’s not enough for you, astronomers are now probing outside of the Milky Way to find exoplanets in other galaxies. A group of researchers at the University of Oklahoma has just announced the discovery of a large population of free-floating planets in a galaxy 3.8 billion light-years away. Their results were published February 2 in The Astrophysical Journal Letters.

    The researchers used a method known as quasar microlensing, which has traditionally been used to study the disk-like regions around supermassive black holes where material gathers as it spirals in toward the event horizon.

    Credit: NASA/Jason Cowan (Astronomy Technology Center).

    When a distant quasar is eclipsed by a closer galaxy, the intervening galaxy will create several magnified replica images of the quasar. These replicas are further magnified by stars in the interloping galaxy to create a final super-magnified image that can be used to study the quasar in detail.

    Wild planets

    While studying the light emitted by the lensed quasar RXJ1131−1231 with the Chandra X-ray Observatory, the researchers noticed a particular wavelength of light emitted by iron was stronger than could be explained solely by the lensing effect of stars in the intervening galaxy.

    NASA/Chandra Telescope

By modeling their results, the researchers concluded that the shifted energy signature was most likely caused by a huge population of planets with masses ranging from that of our Moon to that of Jupiter. The model that best matched the data found a ratio of 2,000 planets for every main sequence star in the galaxy, which contains billions of stars. These planets are specifically “unbound” — not orbiting a star but wandering freely — as bound planets don’t have the same boosting effect seen in the data. Because the models only provided a wide range of potential planet masses, the researchers hope to pin down the mass distribution further with additional modeling.
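For a sense of the scales involved, the sketch below (a rough illustration, not the authors' analysis; the angular-diameter distances are placeholder values chosen only to be of the right order for a lens galaxy lying between us and a more distant quasar) computes the Einstein angular radius of a point-mass lens, the quantity that sets the size of a microlensing magnification pattern and that grows only as the square root of the lens mass.

```python
# Illustrative Einstein-radius scaling for point-mass lenses (not the published model).
import math

G, c = 6.674e-11, 2.998e8            # SI constants
M_sun = 1.989e30                     # solar mass [kg]
M_jupiter = 1.898e27                 # Jupiter mass [kg]
M_moon = 7.35e22                     # Moon mass [kg]

# Placeholder angular-diameter distances (assumed values for illustration only), in metres.
Gpc = 3.086e25
D_l, D_s, D_ls = 1.6 * Gpc, 1.8 * Gpc, 0.9 * Gpc

def einstein_radius_rad(mass_kg):
    """Angular Einstein radius of a point-mass lens: sqrt(4GM/c^2 * D_ls / (D_l * D_s))."""
    return math.sqrt(4 * G * mass_kg / c**2 * D_ls / (D_l * D_s))

for name, m in [("Moon-mass", M_moon), ("Jupiter-mass", M_jupiter), ("Solar-mass", M_sun)]:
    theta_uas = einstein_radius_rad(m) * 180 / math.pi * 3600 * 1e6   # radians -> microarcseconds
    print(f"{name:12s} lens: Einstein radius ~ {theta_uas:.3g} microarcseconds")
```

The square-root dependence on mass is why a single statistical lensing signature can be modelled by populations of lenses spanning masses from the Moon to Jupiter.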

    RX J1131-1231 is about 6 billion light-years away. It is a lensed quasar; gravitational lensing caused by an intervening elliptical galaxy (center, yellow) has magnified and multiplied the image of RX J1131 into four images (pink) as seen with the Chandra X-ray Observatory.
    X-ray: NASA/CXC/Univ of Michigan/R.C.Reis et al; Optical: NASA/STScI

    NASA/ESA Hubble Telescope

    These preliminary results may just be the first out of the floodgates. “There are also other galaxies we’re working on,” says Xinyu Dai, lead author of the paper and researcher at the University of Oklahoma. “We think there are some signatures showing the presence of a small mass population, but we need to run detailed models to see if this is true or not.”

    Other Sightings

    This isn’t the first time astronomers have claimed a discovery of an exoplanet outside our galaxy. A signature consistent with a three-Earth-mass planet was detected in a galaxy 4 billion light-years away, but the one-time chance nature of the alignment causing the microlensing meant the discovery could not be confirmed with further observations. Similarly, a different version of microlensing using a star instead of a galaxy was previously used to probe the Andromeda Galaxy. A team found deviations in the light that they believed could be caused by an exoplanet six times as massive as Jupiter, but again the detection was never confirmed.

The interloper star HIP 13044 was itself reported to host an exoplanet 25 percent larger than Jupiter, but subsequent follow-up found no evidence for the planet. Though this star is currently a part of the Milky Way, it originally came from a small galaxy that collided with the Milky Way six billion years ago.

    Vagabond stars like HIP 13044 may provide our best chance for examining exoplanets from other galaxies in detail. With current telescope technology, microlensing can point to a detection in other galaxies, but it cannot fully probe the properties of these candidates. Finding relatively nearby exoplanets around stars that originated abroad, however, may help us learn more about how exoplanets form and whether there are differences between planets born in different galaxies.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     
• richardmitnick 3:36 pm on February 16, 2018
Tags: European backed missions

    From CERN Courier: “Europe defines astroparticle strategy” 


    CERN Courier

    Feb 16, 2018


    Multi-messenger astronomy, neutrino physics and dark matter are among several topics in astroparticle physics set to take priority in Europe in the coming years, according to a report by the Astroparticle Physics European Consortium (APPEC).

    The APPEC strategy for 2017–2026, launched at an event in Brussels on 9 January, is the culmination of two years of consultation with the astroparticle and related communities. It involved some 20 agencies in 16 countries and includes representation from the European Committee for Future Accelerators, CERN and the European Southern Observatory (ESO).

    Lying at the intersection of astronomy, particle physics and cosmology, astroparticle physics is well placed to search for signs of physics beyond the standard models of particle physics and cosmology. As a relatively new field, however, European astroparticle physics does not have dedicated intergovernmental organisations such as CERN or ESO to help drive it. In 2001, European scientific agencies founded APPEC to promote cooperation and coordination, and specifically to formulate a strategy for the field.

    Building on earlier strategies released in 2008 and 2011, APPEC’s latest roadmap presents 21 recommendations spanning scientific issues, organisational aspects and societal factors such as education and industry, helping Europe to exploit tantalising potential for new discoveries in the field.

    The recent detection of gravitational waves from the merger of two neutron stars (CERN Courier December 2017 p16) opens a new line of exploration based on the complementary power of charged cosmic rays, electromagnetic waves, neutrinos and gravitational waves for the study of extreme events such as supernovae, black-hole mergers and the Big Bang itself. “We need to look at cross-fertilisation between these modes to maximise the investment in facilities,” says APPEC chair Antonio Masiero of the INFN and the University of Padova. “This is really going to become big.”

    APPEC strongly supports Europe’s next-generation ground-based gravitational interferometer, the Einstein Telescope, and the space-based LISA detector.

    ASPERA Einstein Telescope

ESA/NASA eLISA, a space-based detector and the future of gravitational wave research

    In the neutrino sector, KM3NeT is being completed for high-energy cosmic neutrinos at its site in Sicily, as well as for precision studies of atmospheric neutrinos at its French site near Toulon.

Artist’s impression of the KM3NeT neutrino telescope

Europe is also heavily involved in the upgrade of the leading cosmic-ray facility, the Pierre Auger Observatory in Argentina.

    Pierre Auger Observatory in the western Mendoza Province, Argentina, near the Andes, at an altitude of 1330 m–1620 m, average ~1400 m

    Significant R&D work is taking place at CERN’s neutrino platform for the benefit of long- and short-baseline neutrino experiments in Japan and the US (CERN Courier July/August 2016 p21), and Europe is host to several important neutrino experiments. Among them are KATRIN at KIT in Germany, which is about to begin measurements of the neutrino absolute mass scale, and experiments searching for neutrinoless double-beta decay (NDBD) such as GERDA and CUORE at INFN’s Gran Sasso National Laboratory (CERN Courier December 2017 p8).


    KIT Katrin experiment

    CUORE experiment UC Berkeley, experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS), a search for neutrinoless double beta decay

Laboratori Nazionali del Gran Sasso, located in the Abruzzo region of central Italy

    There are plans to join forces with experiments in the US to build the next generation of NDBD detectors. APPEC has a similar vision for dark matter, aiming to converge next year on plans for an “ultimate” 100-tonne scale detector based on xenon and argon via the DARWIN and Argo projects.

    DARWIN Dark Matter experiment

    APPEC also supports ESA’s Euclid mission, which will establish European leadership in dark-energy research, and encourages continued European participation in the US-led DES and LSST ground-based projects.

    Dark Energy Camera [DECam], built at FNAL


    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

Following on from ESA’s successful Planck mission, APPEC strongly endorses a European-led satellite mission, such as COrE, to map the cosmic microwave background. The consortium also plans to enhance its interactions with its present observers, ESO and CERN, in areas of mutual interest.

    ESA/Planck

    “It is important at this time to put together the human forces,” says Masiero. “APPEC will exercise influence in the European Strategy for Particle Physics, and has a significant role to play in the next European Commission Framework Project, FP9.”

    A substantial investment is needed to build the next generation of astroparticle-physics research, the report concedes. According to Masiero, European agencies within APPEC currently invest around €80 million per year in astroparticle-related activities, in addition to funding large research infrastructures. A major effort in Europe is necessary for it to keep its leading position. “Many young people are drawn into science by challenges like dark matter and, together with Europe’s existing research infrastructures in the field, we have a high technological level and are pushing industries to develop new technologies,” continues Masiero. “There are great opportunities ahead in European astroparticle physics.”

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
THE FOUR MAJOR PROJECT COLLABORATIONS

ATLAS

ALICE

CMS

LHCb

LHC

     
• richardmitnick 3:02 pm on February 16, 2018
Tags: Improving quantum information processing

    From ORNL: “Researchers demonstrate promising method for improving quantum information processing” 


    Oak Ridge National Laboratory

    February 16, 2018
    Scott Jones, Communications
    jonesg@ornl.gov
    865.241.6491

    Joseph Lukens, Pavel Lougovski and Nicholas Peters (from left), researchers with ORNL’s Quantum Information Science Group, are examining methods for encoding photons with quantum information that are compatible with the existing telecommunications infrastructure and that incorporate off-the-shelf components. Credit: Genevieve Martin/Oak Ridge National Laboratory, U.S. Department of Energy.

    A team of researchers led by the Department of Energy’s Oak Ridge National Laboratory has demonstrated a new method for splitting light beams into their frequency modes. The scientists can then choose the frequencies they want to work with and encode photons with quantum information. Their work could spur advancements in quantum information processing and distributed quantum computing.

    The team’s findings were published in Physical Review Letters.

    The frequency of light determines its color. When the frequencies are separated, as in a rainbow, each color photon can be encoded with quantum information, delivered in units known as qubits. Qubits are analogous to but different from classical bits, which have a value of either 0 or 1, because qubits are encoded with values of both 0 and 1 at the same time.

    The researchers liken quantum information processing to stepping into a hallway and being able to go both ways, whereas in classical computing just one path is possible.

    The team’s novel approach—featuring the first demonstration of a frequency tritter, an instrument that splits light into three frequencies—returned experimental results that matched their predictions and showed that many quantum information processing operations can be run simultaneously without increasing error. The quantum system performed as expected under increasingly complex conditions without degrading the encoded information.

    “Under our experimental conditions, we got a factor 10 better than typical error rates,” said Nicholas Peters, Quantum Communications team lead for ORNL’s Quantum Information Science Group. “This establishes our method as a frontrunner for high-dimensional frequency-based quantum information processing.”

    Photons can carry quantum information in superpositions—where photons simultaneously have multiple bit values—and the presence of two quantum systems in superposition can lead to entanglement, a key resource in quantum computing.

    Entanglement boosts the number of calculations a quantum computer could run, and the team’s focus on creating more complex frequency states aims to make quantum simulations more powerful and efficient. The researchers’ method is also notable because it demonstrates the Hadamard gate, one of the elemental circuits required for universal quantum computing.
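As a minimal numerical illustration (not the ORNL team's implementation, which acts on real photons using off-the-shelf telecom components), the Hadamard gate and a three-mode discrete Fourier transform, the mathematical analogue of a balanced frequency tritter, act on state vectors like this:

```python
# Minimal sketch of a Hadamard gate and a balanced three-mode "tritter" (illustrative only).
import numpy as np

# Hadamard on a qubit: |0> -> (|0> + |1>)/sqrt(2), an equal superposition
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
qubit = np.array([1, 0])                    # state |0>
print("after Hadamard:", H @ qubit)         # ~ [0.707, 0.707]

# 3x3 discrete Fourier transform: a balanced splitter among three modes (frequency bins)
w = np.exp(2j * np.pi / 3)
T = np.array([[1, 1,    1],
              [1, w,    w**2],
              [1, w**2, w**4]]) / np.sqrt(3)
qutrit = np.array([1, 0, 0])                # photon in the first frequency bin
amps = T @ qutrit
print("probabilities across the three bins:", np.round(np.abs(amps)**2, 3))  # [1/3, 1/3, 1/3]
```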

    “We were able to demonstrate extremely high-fidelity results right off the bat, which is very impressive for the optics approach,” said Pavel Lougovski, the project’s principal investigator. “We are carving out a subfield here at ORNL with our frequency-based encoding work.”

    The method leverages widely available telecommunications technology with off-the-shelf components while yielding high-fidelity results. Efforts to develop quantum repeaters, which extend the distance quantum information can be transmitted between physically separated computers, will benefit from this work.

    “The fact that our method is telecom network-compatible is a big advantage,” Lougovski said. “We could perform quantum operations on telecom networks if needed.”

    Peters added that their project demonstrates that unused fiber-optic bandwidth could be harnessed to reduce computational time by running operations in parallel.

    “Our work uses frequency’s main advantage—stability—to get very high fidelity and then do controlled frequency jumping when we want it,” said Wigner Fellow Joseph Lukens, who led the ORNL experiment. The researchers have experimentally shown that quantum systems can be transformed to yield desired outputs.

    The researchers suggest their method could be paired with existing beam-splitting technology, taking advantage of the strengths of both and bringing the scientific community closer to full use of frequency-based photonic quantum information processing.

    Peters, Lougovski and Lukens, all physicists with ORNL’s Quantum Information Science Group, collaborated with graduate student Hsuan-Hao Lu, professor Andrew Weiner, and colleagues at Purdue University. The team published the theory for their experiments in Optica in January 2017.

    This research is supported by ORNL’s Laboratory Directed Research and Development program and the National Science Foundation.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
• richardmitnick 2:45 pm on February 16, 2018
Tags: Magnetic Reconnection in the Sun

    From CfA: “Magnetic Reconnection in the Sun” 

    Harvard Smithsonian Center for Astrophysics



    February 16, 2018


    An ultraviolet picture of the sun’s chromosphere, the thin layer of solar atmosphere sandwiched between the visible surface, the photosphere, and the corona. Astronomers have developed a simulation to address magnetic reconnection in the chromosphere. The image was taken by the Hinode spacecraft.

    JAXA/NASA HINODE spacecraft



The Sun glows with a surface temperature of about 5500 degrees Celsius. On the other hand, its hot outer layer, the corona, has a temperature of over a million degrees and ejects a wind of charged particles at a rate equivalent to about one-millionth of the Moon’s mass each year. Some of these particles bombard the Earth, producing auroral glows and occasionally disrupting global communications. In between these two regions of the Sun is the chromosphere. Within this complex interface zone, only a few thousand kilometers deep, the density of the gas drops with height by a factor of about one million and the temperature increases. Almost all of the mechanical energy that drives solar activity is converted into heat and radiation within this interface zone.

    Charged particles are produced by the high temperatures of the gas, and their motions produce powerful, dynamic magnetic fields. Those field lines can sometimes break apart forcefully, but movement of the underlying charged particles often leads them to reconnect. There are two important, longstanding, and related questions about the hot solar wind: how is it heated, and how does the corona produce the wind? Astronomers suspect that magnetic reconnection in the chromosphere plays a key role.

    CfA astronomer Nicholas Murphy and his three colleagues have completed complex new simulations of magnetic reconnection in hot ionized gas like that present in the solar chromosphere. (The lead author on the study, Lei Ni, was a visitor to the CfA.) The scientists include for the first time the effects of incompletely ionized gas in lower temperature regions, certain particle-particle effects, and other details of the neutral and ionized gas interactions. They find that the neutral and ionized gas is well-coupled throughout the reconnection region, and conclude that reconnection can often occur in the cooler portions of the zone. They also note that new, high-resolution solar telescopes are capable of studying smaller and smaller regions of low ionization for which their results are particularly applicable.

    Science paper:
    Magnetic Reconnection in Strongly Magnetized Regions of the Low Solar Chromosphere, The Astrophysical Journal

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory (SAO) is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory (HCO), founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

     
• richardmitnick 2:17 pm on February 16, 2018
Tags: PRIMA

    From MIT: “Integrated simulations answer 20-year-old question in fusion research” 

MIT News

    February 16, 2018
    Leda Zimmerman

    To make fusion energy a reality, scientists must harness fusion plasma, a fiery gaseous maelstrom in which radioactive particles react to generate heat for electricity. But the turbulence of fusion plasma can confront researchers with unruly behaviors that confound attempts to make predictions and develop models. In experiments over the past two decades, an especially vexing problem has emerged: In response to deliberate cooling at its edges, fusion plasma inexplicably undergoes abrupt increases in central temperature.

    These counterintuitive temperature spikes, which fly against the physics of heat transport models, have not found an explanation — until now.

    A team led by Anne White, the Cecil and Ida Green Associate Professor in the Department of Nuclear Science and Engineering, and Pablo Rodriguez Fernandez, a graduate student in the department, has conducted studies that offer a new take on the complex physics of plasma heat transport and point toward more robust models of fusion plasma behavior. The results of their work appear this week in the journal Physical Review Letters. Rodriguez Fernandez is first author on the paper.

    In experiments using MIT’s Alcator C-Mod tokamak (a toroidal-shaped device that deploys a magnetic field to contain the star-furnace heat of plasma), the White team focused on the problem of turbulence and its impact on heating and cooling.

    Alcator C-Mod tokamak at MIT, no longer in operation

    In tokamaks, heat transport is typically dominated by turbulent movement of plasma, driven by gradients in plasma pressure.

    Hot and cold

    Scientists have a good grasp of turbulent transport of heat when the plasma is held at steady-state conditions. But when the plasma is intentionally perturbed, standard models of heat transport simply cannot capture plasma’s dynamic response.

    In one such case, the cold-pulse experiment, researchers perturb the plasma near its edge by injecting an impurity, which results in a rapid cooling of the edge.

    “Now, if I told you we cooled the edge of hot plasma, and I asked you what will happen at the center of the plasma, you would probably say that the center should cool down too,” says White. “But when scientists first did this experiment 20 years ago, they saw that edge cooling led to core heating in low-density plasmas, with the temperature in the core rising, and much faster than any standard transport model would predict.” Further mystifying researchers was the fact that at higher densities, the plasma core would cool down.
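To see why this is so counterintuitive, consider a toy version of the standard local picture (a deliberately simple sketch, not the team's PRIMA framework or a real transport code): pure diffusion of heat with a fixed diffusivity. In such a model, cooling the edge can only ever cool the core, and only after a diffusive delay.

```python
# Toy local transport model (illustration only, not PRIMA or a gyrokinetic code):
# 1D heat diffusion, dT/dt = chi * d2T/dx2, with the edge suddenly cooled.
import numpy as np

nr, nt = 100, 25000
dx, dt, chi = 1.0 / nr, 2e-5, 1.0            # normalised radius, time step, fixed diffusivity
T = 1.0 - np.linspace(0.0, 1.0, nr) ** 2     # peaked initial temperature profile
T[-5:] *= 0.2                                # the "cold pulse": abruptly cool the outer edge

core_start = T[0]
for _ in range(nt):
    # explicit finite-difference step (stable because chi*dt/dx^2 = 0.2 < 0.5)
    T[1:-1] += chi * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[0] = T[1]                              # zero-gradient condition at the plasma centre

print(f"core temperature: {core_start:.3f} -> {T[0]:.3f}")
# In this purely local, diffusive model the core can only cool; the prompt core
# heating seen in low-density experiments is exactly what such a model cannot produce.
```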

    Replicated many times, these cold-pulse experiments with their unlikely results defy what is called the standard local model for the turbulent transport of heat and particles in fusion devices. They also represent a major barrier to predictive modeling in high-performance fusion experiments such as ITER, the international nuclear fusion project, and MIT’s own proposed smaller-scale fusion reactor, ARC.

    MIT ARC Fusion Reactor

    ITER Tokamak in Saint-Paul-lès-Durance, which is in southern France

    To achieve a new perspective on heat transport during cold-pulse experiments, White’s team developed a unique twist.

“We knew that the plasma rotation, that is, how fast the plasma was spinning in the toroidal direction, would change during these cold-pulse experiments, which complicates the analysis quite a bit,” White notes. “This is because the coupling between momentum transport and heat transport in fusion plasmas is still not fully understood,” she explains. “We needed to unambiguously isolate one effect from the other.”

    As a first step, the team developed a new experiment that conclusively demonstrated how the cold-pulse phenomena associated with heat transport would occur irrespective of the plasma rotation state. With Rodriguez Fernandez as first author, White’s group reported this key result in the journal Nuclear Fusion in 2017.

    A new integrated simulation

    From there, a tour de force of modeling was needed to recreate the cold-pulse dynamics seen in the experiments. To tackle the problem, Rodriguez Fernandez built a new framework, called PRIMA, which allowed him to introduce cold-pulses in time-dependent simulations. Using special software that factored in the turbulence, radiation and heat transport physics inside a tokamak, PRIMA could model cold-pulse phenomena consistent with experimental measurements.

    “I spent a long time simulating the propagation of cold pulses by only using an increase in radiated power, which is the most intuitive effect of a cold-pulse injection,” Rodriguez Fernandez says.

    Because experimental data showed that the electron density increased with every cold pulse injection, Rodriguez Fernandez implemented an analogous effect in his simulations. He observed a very good match in amplitude and time-scales of the core temperature behavior. “That was an ‘aha!’ moment,” he recalls.

    Using PRIMA, Rodriguez Fernandez discovered that a competition between types of turbulent modes in the plasma could explain the cold-pulse experiments. These different modes, explains White, compete to become the dominant cause of the heat transport. “Whichever one wins will determine the temperature profile response, and determine whether the center heats up or cools down after the edge cooling,” she says.

    By determining the factors behind the center-heating phenomenon (the so-called nonlocal response) in cold-pulse experiments, White’s team has removed a central concern about limitations in the standard, predictive (local) model of plasma behavior. This means, says White, that “we are more confident that the local model can be used to predict plasma behavior in future high performance fusion plasma experiments — and eventually, in reactors.”

    “This work is of great significance for validating fundamental assumptions underpinning the standard model of core tokamak turbulence,” says Jonathan Citrin, Integrated Modelling and Transport Group leader at the Dutch Institute for Fundamental Energy Research (DIFFER), who was not involved in the research. “The work also validated the use of reduced models, which can be run without the need for supercomputers, allowing to predict plasma evolution over longer timescales compared to full-physics simulations,” says Citrin. “This was key to deciphering the challenging experimental observations discussed in the paper.”

    The work isn’t over for the team. As part of a separate collaboration between MIT and General Atomics, Plasma Science and Fusion Center scientists are installing a new laser ablation system to facilitate cold-pulse experiments at the DIII-D tokamak in San Diego, California, with first data expected soon. Rodriguez Fernandez has used the integrated simulation tool PRIMA to predict the cold-pulse behavior at DIII-D, and he will perform an experimental test of the predictions later this year to complete his PhD research.

    The research team included Brian Grierson and Xingqiu Yuan, research scientists at Princeton Plasma Physics Laboratory; Gary Staebler, research scientist at General Atomics; Martin Greenwald, Nathan Howard, Amanda Hubbard, Jerry Hughes, Jim Irby and John Rice, research scientists from the MIT Plasma Science and Fusion Center; and MIT grad students Norman Cao, Alex Creely, and Francesco Sciortino. The work was supported by the US DOE Fusion Energy Sciences.

See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.


     
• richardmitnick 1:30 pm on February 16, 2018

    From ESOblog: “How to Install a Planetarium: A conversation with engineer Max Rößner about his work on the ESO Supernova” 

    ESO 50 Large

    ESOblog


    Part of ESO Headquarters in Garching, Germany, is currently in a frenzy of activity as we prepare to open the ESO Supernova Planetarium & Visitor Centre in April 2018. This cutting-edge free astronomy centre is equipped with a 14-metre planetarium dome and an amazing exhibition that takes visitors on a journey to the stars. It’s a lot of work to install a planetarium system from scratch, but to engineer Max Rößner, the ESO Supernova is like a giant playground.

    ESO Supernova Planetarium, Garching Germany

    Q: What’s your role at the ESO Supernova Planetarium & Visitor Centre?

    A: I’d say that I am the Systems Engineer for the ESO Supernova planetarium. I concentrate on the technical implementation of the planetarium, integrating the projection and multimedia systems. Sometimes I also work on the content — such as the shows and night sky tours that will be played on the dome. There is quite a lot of pressure, as at the moment I am the only person who entirely understands the planetarium system, so in a way the project depends on me.

    Q: How do you know so much about planetariums?

    A: I’ve been working in planetariums for most of my life. I started presenting planetarium shows when I was about 10 or 11 in a small planetarium near Augsburg, which is about an hour from Munich, Germany. It is run by an association of volunteers and it was my first taste of these magical places. Of course in the beginning I worked in a voluntary capacity, but it also helps now that I am an engineer.

    Q: How is the ESO Supernova’s planetarium different to those you have previously worked in?

    A: It’s very different. The most obvious visible difference is that the ESO Supernova has an inclined dome — it is tilted by 25 degrees to allow for a better viewing experience. Overall, it’s a complex project, because we are actually implementing two different planetarium systems, from Zeiss and Evans & Sutherland (E&S). The market for planetarium systems is crowded, including Zeiss, E&S, and numerous others, and all of them have their pros and cons. Our system looks a lot like a DJ deck — we have an audio mixer, spotlights, and lots of effects!

    Max Rößner at the control board of the newly installed planetarium at the ESO Supernova Planetarium & Visitor Centre.
    Credit: ESO

    Another difference is that the ESO Supernova won’t use an optomechanical projector, usually used to project a nice starry sky. Instead, we are using a digital projection. Both types have positives and negatives. Optomechanical projectors are better at creating really precise stars — tiny, exact pinpricks of light. However, with the digital projection system there is much greater flexibility, and a much greater range in what we can show. For example, the presenter can even fly to a different location in space, which can’t be done with an optomechanical projector.

    Q: What kind of experience are you aiming to give visitors with these awesome systems?

    A: There is a joke in the planetarium world that people go to planetariums twice in their life: as a child and with their children. In the past, presenters generally gave a tour of the starry sky, including the Big Dipper and other famous constellations, and they would also point out some planets. But to meet the expectations of today’s audiences, we use more advanced technology to create shows that can be continually updated to reflect modern science, and that are more personal and adaptable.

    We want to avoid presenting a Hollywood-style film that has a clear beginning and neatly wrapped-up ending, so visitors just come, watch it and leave. Instead we want to create a dialogue with the audience, presenting each show with a more personal flair so each one is different. This can evolve depending on who is in the audience — such as their age or their background — and the questions people have throughout the show can also influence its direction.

    The ESO Science Outreach Network (ESON) visited the ESO Supernova in 2017 and had a sneak-peek of some of the delights to come, learning more about the Extremely Large Telescope (ELT) on a test fulldome show. Credit: ESO/P. Horálek

    Q: You mentioned that the dome is tilted — why?

    A: This is a philosophical question. A tilted planetarium dome does make it a little more difficult to orient the audience, as people are used to using the Earth’s horizon as a reference point for celestial objects. For example, it is a little harder to demonstrate that the Sun rises in the east and sets in the west, because the sky itself isn’t tilted! But with the planetarium seats, which are raised up ‘diagonally’ on a slope like cinema seats, your brain does seem to correct for this.

    An advantage of the tilted dome is that people don’t have to look up very far, so they can look at the dome in a comfortable way and feel fully immersed in the show.

    Q: The dome itself is perforated with holes. What are they for?

    A: There are a few technical and practical reasons for this:

    Ventilation: Fresh air comes in and used air goes out.
    Noise: We want the sound from within the planetarium to penetrate through the dome rather than bouncing off it completely, or we would end up with a chaotic chamber of echoing noise. The loudspeakers are also mounted behind the dome, and the sound needs to get through so we can hear it.
    Reflections: Similar to the problem of noise, we don’t want light to reflect around the dome from one area to another. The holes and the paint give the dome 58% reflectivity, reducing this problem.

    Q: How is content made differently for the curved screen of the planetarium?

    A: There are two ways to develop content for a planetarium. The first is to create films with a fisheye-like representation so that they display correctly on the dome. To achieve this, a film is split into the various projection fields and then warped to compensate for the curved surface of the dome. These pieces of the frame are then stored on individual PCs and fed to the different projectors.
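    As a rough illustration of that fisheye representation, the short Python sketch below maps a 3D viewing direction onto a circular “dome master” frame using the standard equidistant fisheye projection. It is only an illustrative sketch under simple assumptions (zenith along +z, a hemispherical field of view), not the actual Zeiss or E&S production pipeline; the subsequent splitting into per-projector tiles and warping is handled by the vendors’ calibration software.

import numpy as np

def direction_to_fisheye(v, image_radius=1.0):
    # Map a 3D viewing direction to (x, y) on an equidistant fisheye
    # "dome master" image. +z points at the dome zenith; directions more
    # than 90 degrees from the zenith fall outside the frame.
    x, y, z = v / np.linalg.norm(v)
    theta = np.arccos(np.clip(z, -1.0, 1.0))   # angle away from the zenith
    phi = np.arctan2(y, x)                     # azimuth around the zenith
    r = image_radius * theta / (np.pi / 2.0)   # equidistant: radius grows with angle
    return r * np.cos(phi), r * np.sin(phi)

# Example: a star 30 degrees from the zenith lands a third of the way
# out towards the rim of the dome master frame.
star = np.array([np.sin(np.radians(30.0)), 0.0, np.cos(np.radians(30.0))])
print(direction_to_fisheye(star))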

    Live shows are another type of content. They are created and rendered on the spot, at the moment you present them. For example, we can show the sky as the visitors would see it now, outside. Tomorrow the Moon will change its position a little, and the Sun will set a bit later as we head towards spring, so we can adjust for these changes every day. This is a native functionality of Digistar, which is the planetarium system created by E&S. It’s a little like Google Maps, except with time, and showing the Universe.
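    The “live” part can be sketched just as briefly: compute, for the audience’s location and the current instant, where the Sun (or any other body) sits in the sky, and feed that to the renderer. The example below uses the astropy library purely as an illustration of the idea (Digistar has its own internal ephemerides), and the Garching coordinates are approximate assumptions.

from astropy.coordinates import AltAz, EarthLocation, get_sun
from astropy.time import Time
import astropy.units as u

# Approximate coordinates for Garching, assumed for illustration only.
garching = EarthLocation(lat=48.26 * u.deg, lon=11.67 * u.deg, height=480 * u.m)

now = Time.now()                          # "the sky as the visitors would see it now"
frame = AltAz(obstime=now, location=garching)
sun = get_sun(now).transform_to(frame)

print(f"Sun altitude {sun.alt.deg:.1f} deg, azimuth {sun.az.deg:.1f} deg")
# Run the same lines tomorrow and the numbers shift slightly; that is the
# day-to-day change in sunrise and sunset the presenter can show live.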

    With just a few months until the opening of the ESO Supernova Planetarium & Visitor Centre in spring 2018, the interior of the Centre is coming together.
    Credit: ESO/P. Horálek

    Q: What’s the day-to-day work like in a planetarium?

    A: It’s great! I love having some freedom in making design decisions and seeing those decisions realised. It’s exciting to see something you have planned and worked on for such a long time coming into reality, and to know that you are a big part of it.

    Q: What has been the biggest challenge so far?

    A: We’ve faced so many challenges. One memorable moment was when we were trying to test the software, but nothing happened. Nothing turned on, and we just saw a black sky above us. Of course, we panicked — but it turned out that we had left the dust caps on the projectors! So luckily, that didn’t turn out to be too challenging to fix. Even specialists make mistakes!

    An actual challenge was to raise awareness about our operational requirements. For example, we had to clearly communicate to the architects that we need a low horizon, room for equipment, extra sockets, space in the server room, and so on. Essentially, we were concerned about the practical side of running a planetarium with limited manpower and how that would be balanced with the architectural priorities of design and aesthetics.

    Then there’s the pressure from the fact that the project is dependent on me, because the software is absolutely fundamental to the working of the planetarium. One of the most difficult things has been getting the two planetarium systems to work together in a unified way. We need the added computational power of the second system to realise our operational goals.

    Of course, another challenge is that funding has been a limiting factor in some ways. Any project is easier when you have boundless amounts of money, but that’s not the reality here — especially since the ESO Supernova will be a free, open-source visitor centre.

    A striking sunset shines upon the futuristic curves of the ESO Supernova Planetarium & Visitor Centre.
    Credit: P. Horálek/ESO

    Q: Another exciting part of the ESO Supernova project is the Data2Dome system. Tell us more about that.

    A: Up until now, planetariums have struggled to present really up-to-date content. First of all, the content — such as new films, video clips or images — has to be found on the internet, then downloaded, then uploaded to the planetarium system. A script then has to be written to present alongside the content. This means it can take weeks for new research findings from around the world to reach planetariums’ audiences. Other media, like the internet, TV and newspapers, are much faster. So there was a problem: a planetarium is meant to be the competence centre of astronomical knowledge in a community, but it was lagging behind.

    We wanted to streamline the process of bringing research from astronomers to audiences around the world. ESO’s outreach department collaborated with E&S and the International Planetarium Society to come up with a technical standard: Data2Dome.

    Essentially, this helps scientific organisations publish their content in such a way that it enables planetarium vendors to download the content directly into their software. Manually shuffling and downloading data is bypassed. NASA, ESA, ESO and many others can directly stream their content into planetariums worldwide. It’s a free and open standard, first implemented by E&S. In particular, it’s great for smaller planetariums that may not have the time to continuously create new content.

    People are already using this software around the world — Data2Dome is streaming content to planetariums as we speak.
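    The mechanics behind that streaming are, at heart, very simple: each provider hosts a machine-readable feed of its latest assets, and the planetarium software polls it and pulls down anything new. The Python sketch below illustrates only that general polling pattern; the feed URL and field names are placeholders assumed for the example, not the real Data2Dome schema, and in practice the dome software (for example Digistar) does this natively.

import json
import urllib.request

# Placeholder feed URL and field names, assumed purely for illustration;
# they are not the actual Data2Dome endpoints or schema.
FEED_URL = "https://example.org/d2d/feed.json"

def fetch_new_items(feed_url, already_seen):
    # Poll a provider's JSON feed and return the items we have not seen yet.
    with urllib.request.urlopen(feed_url) as response:
        feed = json.load(response)
    return [item for item in feed.get("items", [])
            if item.get("id") not in already_seen]

# A small planetarium could run something like this on a schedule and hand
# the new assets (fulldome clips, images, press releases) to the dome system.
seen = set()
for item in fetch_new_items(FEED_URL, seen):
    print("new content:", item.get("title"))
    seen.add(item.get("id"))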

    Q: What are you looking forward to most once the ESO Supernova opens?

    A: It will be great to have the planetarium fully working and engaging with the audience. I feel the planetarium is my brainchild, so seeing it finally come to life will be amazing.

    Q: You’ve been working in planetariums for so many years — do you still feel excited when a show begins?

    A: Of course. This space holds a certain fascination that has never left me. I still get goosebumps. Emotion is a key part of the planetarium experience: shows are not just meant to teach you, but to touch you. When you think about it, the entire known Universe is stored in the computers downstairs at the ESO Supernova…so in a small dome on the edge of a city in Germany, we can leave Earth and travel to a different part of the Universe.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition
    Visit ESO in Social Media-

    Facebook

    Twitter

    YouTube

    ESO Bloc Icon

    ESO is the foremost intergovernmental astronomy organisation in Europe and the world’s most productive ground-based astronomical observatory by far. It is supported by 16 countries: Austria, Belgium, Brazil, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Poland, Portugal, Spain, Sweden, Switzerland and the United Kingdom, along with the host state of Chile. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates three unique world-class observing sites in Chile: La Silla, Paranal and Chajnantor. At Paranal, ESO operates the Very Large Telescope, the world’s most advanced visible-light astronomical observatory and two survey telescopes. VISTA works in the infrared and is the world’s largest survey telescope and the VLT Survey Telescope is the largest telescope designed to exclusively survey the skies in visible light. ESO is a major partner in ALMA, the largest astronomical project in existence. And on Cerro Armazones, close to Paranal, ESO is building the 39-metre European Extremely Large Telescope, the E-ELT, which will become “the world’s biggest eye on the sky”.

    ESO LaSilla
    ESO/Cerro LaSilla 600 km north of Santiago de Chile at an altitude of 2400 metres.

    ESO VLT
    VLT at Cerro Paranal, with an elevation of 2,635 metres (8,645 ft) above sea level.

    ESO Vista Telescope
    ESO/Vista Telescope at Cerro Paranal, with an elevation of 2,635 metres (8,645 ft) above sea level.

    ESO NTT
    ESO/NTT at Cerro LaSilla 600 km north of Santiago de Chile at an altitude of 2400 metres.

    ESO VLT Survey telescope
    VLT Survey Telescope at Cerro Paranal with an elevation of 2,635 metres (8,645 ft) above sea level.

    ALMA Array
    ALMA on the Chajnantor plateau at 5,000 metres.

    ESO E-ELT
    ESO/E-ELT to be built at Cerro Armazones at 3,060 m.

    ESO APEX
    APEX Atacama Pathfinder 5,100 meters above sea level, at the Llano de Chajnantor Observatory in the Atacama desert.

    Leiden MASCARA instrument, La Silla, located in the southern Atacama Desert 600 kilometres (370 mi) north of Santiago de Chile at an altitude of 2,400 metres (7,900 ft)

    Leiden MASCARA cabinet at ESO Cerro la Silla located in the southern Atacama Desert 600 kilometres (370 mi) north of Santiago de Chile at an altitude of 2,400 metres (7,900 ft)

    ESO Next Generation Transit Survey at Cerro Paranel, 2,635 metres (8,645 ft) above sea level

    SPECULOOS four 1m-diameter robotic telescopes 2016 in the ESO Paranal Observatory, 2,635 metres (8,645 ft) above sea level

    ESO TAROT telescope at Paranal, 2,635 metres (8,645 ft) above sea level

    ESO ExTrA telescopes at Cerro LaSilla at an altitude of 2400 metres

     