Tagged: Spectroscopy

  • richardmitnick 4:57 pm on November 24, 2021 Permalink | Reply
    Tags: "Mapping Lithium Evolution of Giant Stars with LAMOST-Kepler Data", Asteroseismicity, RGB: red giant branch stars, Spectroscopy, Stars at RGB phase showed natural depletion of lithium along with their evolution.

    From The Chinese Academy of Sciences [中国科学院] (CN) : “Mapping Lithium Evolution of Giant Stars with LAMOST-Kepler Data” 

    From The Chinese Academy of Sciences [中国科学院] (CN)

    Nov 24, 2021
    XU Ang
    National Astronomical Observatories
    annxu@nao.cas.cn

    Credit: CC0 Public Domain.

    Mapping Lithium Evolution of Giant Stars with LAMOST-Kepler Data


    LAMOST telescope located in Xinglong Station, Hebei Province, China, Altitude 960 m (3,150 ft).

    Based on LAMOST and Kepler data, a new study led by astronomers from The National Astronomical Observatories of China [国家天文台] at The Chinese Academy of Sciences [中国科学院] (CN) has revealed evolutionary features of lithium in evolved stars, updating our understanding of the theory of stellar structure and evolution.

    The results were published in The Astrophysical Journal Letters.

    Surface lithium (Li) abundances display various patterns for stars of different types as well as at different evolutionary stages. The signatures of Li provide key information about internal stellar structure and evolution.

    However, due to the difficulty of classifying evolutionary stages, especially separating core helium burning (HeB) stars from red giant branch (RGB) bump stars by traditional approaches, the evolutionary features of Li from the RGB to the HeB stage have long remained unclear.

    With evolutionary stages determined from asteroseismic analysis and Li abundances from the spectroscopic survey, the researchers investigated the signatures of Li for stars evolving from the RGB to the HeB phase, using 1,848 giants selected in the LAMOST-Kepler/K2 fields.
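    The stage classification underlying this selection comes from asteroseismology: the gravity-mode period spacing measured in a giant’s oscillation spectrum separates RGB stars from HeB stars far more cleanly than surface properties can. Below is a minimal, illustrative sketch of that idea; the ~100 s dividing line, the Δν cut, and the example values are rough assumptions for illustration, not the exact criteria used in the paper.

```python
# Minimal, illustrative sketch: separating red-giant-branch (RGB) giants from
# core-helium-burning (HeB) giants with the asteroseismic gravity-mode period
# spacing (Delta Pi_1). Thresholds are approximate, assumed values, not the
# exact criteria used in the study.

def classify_evolutionary_stage(delta_nu_uhz: float, delta_pi1_s: float) -> str:
    """Rough stage label from the large frequency separation (micro-Hz)
    and the dipole-mode period spacing (seconds)."""
    if delta_pi1_s > 100.0:      # HeB giants typically show ~200-350 s spacings
        return "HeB"
    if delta_nu_uhz > 3.0:       # RGB giants cluster at roughly 50-80 s
        return "RGB"
    return "ambiguous"

# Hypothetical example values, roughly typical of Kepler red giants
print(classify_evolutionary_stage(delta_nu_uhz=4.1, delta_pi1_s=75.0))   # -> RGB
print(classify_evolutionary_stage(delta_nu_uhz=4.0, delta_pi1_s=300.0))  # -> HeB
```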

    They found that stars in the RGB phase show a natural depletion of lithium as they evolve; in particular, there is no obvious concentration of stars with anomalously high Li abundances near the bump. During the HeB phase, by contrast, there is no indication of obvious Li depletion.

    Furthermore, the Li abundances of most of the low-mass stars that have just started their HeB phase (zero-age HeB, ZAHeB) show an increase compared with the stars above the RGB bump. This suggests that the helium flash, which occurs between the RGB and HeB phases, might lead to moderate Li production.

    “A previous theoretical study speculated that the helium flash may produce Li; this work confirms that inference observationally and also provides a constraint on the amount of Li produced during the helium flash,” said Dr. ZHANG Jinghua, the first author of the study.

    However, the standard helium flash model cannot explain the high Li abundance of (super) Li-rich stars, given that the Li abundances of most low-mass ZAHeB stars are still below the lower limit of Li-rich stars in the classical definition. For (super) Li-rich stars, some special mechanisms should be considered during the helium flash. Other scenarios, such as mergers, could also be sources, given that Li-rich stars are found throughout the steady-state phase of HeB.

    “This is a timely paper on an interesting topic (Li in red giant stars) that has long been a puzzle, and is now being revitalized by the existence of new, larger, more precise, and better characterized datasets,” commented the reviewer of the paper.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Chinese Academy of Sciences [中国科学院] (CN) is the linchpin of China’s drive to explore and harness high technology and the natural sciences for the benefit of China and the world. Comprising a comprehensive research and development network, a merit-based learned society and a system of higher education, CAS brings together scientists and engineers from China and around the world to address both theoretical and applied problems using world-class scientific and management approaches.

    Since its founding, CAS has fulfilled multiple roles — as a national team and a locomotive driving national technological innovation, a pioneer in supporting nationwide S&T development, a think tank delivering S&T advice and a community for training young S&T talent.

    Now, as it responds to a nationwide call to put innovation at the heart of China’s development, CAS has further defined its development strategy by emphasizing greater reliance on democratic management, openness and talent in the promotion of innovative research. With the adoption of its Innovation 2020 programme in 2011, the academy has committed to delivering breakthrough science and technology, higher caliber talent and superior scientific advice. As part of the programme, CAS has also requested that each of its institutes define its “strategic niche” — based on an overall analysis of the scientific progress and trends in their own fields both in China and abroad — in order to deploy resources more efficiently and innovate more collectively.

    As it builds on its proud record, CAS aims for a bright future as one of the world’s top S&T research and development organizations.

     
  • richardmitnick 10:55 am on May 29, 2021 Permalink | Reply
    Tags: "Alien stars found in our Milky Way", Archeoastronomy, Spectroscopy

    From University of Birmingham (UK) via EarthSky : “Alien stars found in our Milky Way” 

    From University of Birmingham (UK)

    via


    EarthSky

    May 25, 2021
    Theresa Wiegert

    Infrared image of stars at the center of our Milky Way galaxy, via the Spitzer Space Telescope.

    Observing in infrared makes it possible to peer behind the gas clouds that otherwise cover the central region of the galaxy. There are around 10 million stars within just 3.3 light-years of the galactic center. These are dominated by red giants, the same kind of old stars found to be from another galaxy in this study. Image via National Aeronautics Space Agency (US)/ JPL-Caltech (US)/ S. Stolovy (NASA Spitzer Science Center (US)/California Institute of Technology (US)).

    Astronomers used a new technique – asteroseismology combined with spectroscopy – to pinpoint the ages of a sample of around 100 old red giant stars in the Milky Way.

    They were able to reach a much higher accuracy for the stars’ ages, they said in a statement on May 17, 2021. And they also found that a number of those red giant stars did not originate in the Milky Way! They are instead alien stars, which came here from another galaxy. Their original home in space was Gaia Enceladus (also known as the Gaia Sausage), a dwarf galaxy that collided and merged with our Milky Way galaxy about 10 billion years ago.

    Artist’s concept of the stars from dwarf galaxy Gaia Enceladus, which merged with the Milky Way some 10 billion years ago. The Milky Way is in the center of the illustration, shown from above, and the Gaia Enceladus stars – debris of the dwarf galaxy – are represented by little arrows – vectors – that show their position and the direction in which they move. The data are from a computer simulation. Image via European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU).

    This Hubble Space Telescope (HST) image of a dense swarm of stars shows the central region of the globular cluster NGC 2808 and its 3 generations of stars. NASA, European Space Agency, A. Sarajedini (University of Florida (US)) and G. Piotto (University of Padua [Università degli Studi di Padova] (IT))

    This new research was published on May 17, 2021, in the peer-reviewed journal Nature Astronomy.

    What do alien stars tell us?

    So the idea here is that the Milky Way galaxy had already started forming many of its stars before a dwarf galaxy came by and merged with our galaxy, bringing its own stars with it. This event took place around 8-11 billion years ago. In contrast, the age of the Milky Way is about 13.6 billion years, give or take a few.

    This merger, then, happened early in our galaxy’s history.

    The dwarf galaxy – or the remnants of it – goes today by the name Gaia Enceladus or the Gaia Sausage [above], because of the highly elongated, sausage-like shape it traces in data from the Gaia mission.

    In Greek mythology, Enceladus was the offspring of the goddess Gaia. It is also, incidentally, the name for one of Saturn’s moons.

    In this new research, the astronomers were able to identify stars that are remnants of the merger. These stars provide a way of looking back to the distant past, when the merger took place. Josefina Montalbán at the University of Birmingham is the lead author on the paper. She said:

    “The chemical composition, location and motion of the stars we can observe today in the Milky Way contain precious information about their origin. As we increase our knowledge of how and when these stars were formed, we can start to better understand how the merger of Gaia-Enceladus with the Milky Way affected the evolution of our galaxy.”

    How did astronomers find the stars?

    These astronomers had targeted a sample of 100 old stars observed with the Kepler mission.

    These are red giant stars, at the end of their lives.

    The team used data from three Milky Way research instruments, all dedicated to mapping and analyzing Milky Way stars, to measure the stars’ ages. One instrument was Kepler, as mentioned previously. The other two were the Gaia satellite and APOGEE.

    With data from these instruments, the astronomers used the technique of asteroseismology, which studies how stars oscillate. That is, the technique measures regular variations within the star. Asteroseismology is similar to helioseismology, the study of oscillations in the sun. Learning how a star oscillates gives astronomers information about its size and internal structure, which, in turn, lets them estimate its age.
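    A minimal sketch of how those oscillation observables translate into stellar properties is given by the standard asteroseismic scaling relations: the frequency of maximum power (nu_max) and the large frequency separation (Delta_nu), combined with a spectroscopic temperature, yield a mass and radius. The solar reference values and the example giant below are approximate, illustrative numbers, not values from this study.

```python
import math

# Minimal sketch of the standard asteroseismic scaling relations. Solar
# reference values are approximate.

NU_MAX_SUN = 3090.0    # micro-Hz
DELTA_NU_SUN = 135.1   # micro-Hz
TEFF_SUN = 5777.0      # K

def seismic_mass_radius(nu_max, delta_nu, teff):
    """Return (mass, radius) in solar units from the scaling relations."""
    radius = (nu_max / NU_MAX_SUN) * (delta_nu / DELTA_NU_SUN) ** -2 * math.sqrt(teff / TEFF_SUN)
    mass = (nu_max / NU_MAX_SUN) ** 3 * (delta_nu / DELTA_NU_SUN) ** -4 * (teff / TEFF_SUN) ** 1.5
    return mass, radius

# A hypothetical old red giant: nu_max ~ 30 micro-Hz, Delta_nu ~ 4.0 micro-Hz, Teff ~ 4800 K
m, r = seismic_mass_radius(30.0, 4.0, 4800.0)
print(f"M ~ {m:.2f} M_sun, R ~ {r:.1f} R_sun")   # roughly 0.9 M_sun, 10 R_sun
```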

    Team member Mathieu Vrard, at Ohio State University’s (US) Department of Astronomy, said:

    “[It] allows us to get very precise ages for the stars, which are important in determining the chronology of when events happened in the early Milky Way.”

    In addition, the astronomers also used spectroscopy – the study of the stellar spectrum – to learn the chemical composition of the stars. This also helps with age determination, and in combination, the methods let the astronomers determine the ages to an unprecedented precision.

    The astronomers noticed that a number of the stars were of the same age, and that this age was somewhat younger than that of most stars known to have started their lives in the Milky Way.

    Team member Andrea Miglio at the University of Bologna [Alma mater studiorum – Università di Bologna](IT) added:

    “We have shown the huge potential of asteroseismology in combination with spectroscopy to deliver precise, accurate relative ages for individual, very old, stars. Taken together, these measurements contribute to sharpen our view on the early years of our galaxy and promise a bright future for [Milky Way] archeoastronomy.”

    Now the researchers want to apply their approach to larger samples of stars to get a better view of the Milky Way’s formation history and evolution.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    University of Birmingham (UK) has been challenging and developing great minds for more than a century. Characterised by a tradition of innovation, research at the University has broken new ground, pushed forward the boundaries of knowledge and made an impact on people’s lives. We continue this tradition today and have ambitions for a future that will embed our work and recognition of the Birmingham name on the international stage.

    The University of Birmingham is a public research university located in Edgbaston, Birmingham, United Kingdom. It received its royal charter in 1900 as a successor to Queen’s College, Birmingham (founded in 1825 as the Birmingham School of Medicine and Surgery), and Mason Science College (established in 1875 by Sir Josiah Mason), making it the first English civic or ‘red brick’ university to receive its own royal charter. It is a founding member of both the Russell Group (UK) of British research universities and the international network of research universities, Universitas 21.

    The student population includes 23,155 undergraduate and 12,605 postgraduate students, which is the 7th largest in the UK (out of 169). The annual income of the institution for 2019–20 was £737.3 million of which £140.4 million was from research grants and contracts, with an expenditure of £667.4 million.

    The university is home to the Barber Institute of Fine Arts, housing works by Van Gogh, Picasso and Monet; the Shakespeare Institute; the Cadbury Research Library, home to the Mingana Collection of Middle Eastern manuscripts; the Lapworth Museum of Geology; and the 100-metre Joseph Chamberlain Memorial Clock Tower, which is a prominent landmark visible from many parts of the city. Academics and alumni of the university include former British Prime Ministers Neville Chamberlain and Stanley Baldwin, the British composer Sir Edward Elgar and eleven Nobel laureates.

    Scientific discoveries and inventions

    The university has been involved in many scientific breakthroughs and inventions. From 1925 until 1948, Sir Norman Haworth was Professor and Director of the Department of Chemistry. He was appointed Dean of the Faculty of Science and acted as Vice-Principal from 1947 until 1948. His research focused predominantly on carbohydrate chemistry in which he confirmed a number of structures of optically active sugars. By 1928, he had deduced and confirmed the structures of maltose, cellobiose, lactose, gentiobiose, melibiose, gentianose, raffinose, as well as the glucoside ring tautomeric structure of aldose sugars. His research helped to define the basic features of the starch, cellulose, glycogen, inulin and xylan molecules. He also contributed towards solving the problems with bacterial polysaccharides. He was a recipient of the Nobel Prize in Chemistry in 1937.

    The cavity magnetron was developed in the Department of Physics by Sir John Randall, Harry Boot and James Sayers. This was vital to the Allied victory in World War II. In 1940, the Frisch–Peierls memorandum, a document which demonstrated that the atomic bomb was more than simply theoretically possible, was written in the Physics Department by Sir Rudolf Peierls and Otto Frisch. The university also hosted early work on gaseous diffusion in the Chemistry department when it was located in the Hills building.

    Physicist Sir Mark Oliphant made a proposal for the construction of a proton synchrotron in 1943; however, he made no assertion that the machine would work. In 1945, phase stability was discovered; consequently, the proposal was revived, and construction of a machine that could surpass proton energies of 1 GeV began at the university. Because of a lack of funds, however, the machine did not start operating until 1953. The DOE’s Brookhaven National Laboratory (US) managed to beat them; they started their Cosmotron in 1952 and had it entirely working in 1953, before the University of Birmingham.

    In 1947, Sir Peter Medawar was appointed Mason Professor of Zoology at the university. His work involved investigating the phenomenon of tolerance and transplantation immunity. He collaborated with Rupert E. Billingham and they did research on problems of pigmentation and skin grafting in cattle. They used skin grafting to differentiate between monozygotic and dizygotic twins in cattle. Taking the earlier research of R. D. Owen into consideration, they concluded that actively acquired tolerance of homografts could be artificially reproduced. For this research, Medawar was elected a Fellow of the Royal Society. He left Birmingham in 1951 and joined the faculty at University College London (UK), where he continued his research on transplantation immunity. He was a recipient of the Nobel Prize in Physiology or Medicine in 1960.

     
  • richardmitnick 9:26 am on February 12, 2021 Permalink | Reply
    Tags: "Shaping Light Pulses with Deep Learning", Directly shaping arbitrary THz input pulses into a variety of desired waveforms., Ozcan Lab, Spectroscopy

    From UCLA via Optics & Photonics: “Shaping Light Pulses with Deep Learning” 

    UCLA bloc

    From UCLA

    via

    From Optics & Photonics

    11 February 2021
    William G. Schulz

    Illustration of an optical diffractive network, trained with deep learning techniques, to directly shape pulses of light. Credit: Ozcan Lab/UCLA.

    Direct engineering and control of terahertz pulses could boost access to those wavelengths for many powerful applications in spectroscopy, imaging, optical communications and more. But wrangling the phase and amplitude values of a continuum of frequencies in the THz band has proved challenging.

    Now, researchers at University of California, Los Angeles, led by OSA Fellows Aydogan Ozcan and Mona Jarrahi, say they have used deep learning and a 3D printer to create a passive network device that can directly shape arbitrary THz input pulses into a variety of desired waveforms [Nature Communications]. The team writes that these results further motivate “the development of all-optical machine learning and information processing platforms that can better harness the 4D spatiotemporal information carried by light.”

    Shaping any terahertz pulse

    The team’s method, Ozcan says, can directly shape any input THz pulse through passive light diffraction via deep-learning-designed, 3D-printed polymer wafers. It is fundamentally different, he says, from previous approaches that indirectly synthesize a desired THz pulse through optical-to-terahertz converters or shaping of the optical pump that interacts with THz sources.

    What is more, Ozcan adds, the deep-learning framework is flexible and versatile; it can be used to engineer THz pulses regardless of polarization state, beam shape, beam quality or aberrations of the specific generation mechanism.

    Diffractive optical networks

    In 2018, Ozcan’s group reported development of the first all-optical diffractive deep neural network, using 3D-printed polymer wafers with uneven surfaces for light diffraction. That work focused primarily on machine learning, with light propagating through the trained diffractive layers to execute an image-classification task, he says.

    But deep-learning-designed diffractive networks can also tackle inverse design problems in optics and photonics, Ozcan says, and the team’s new work in THz pulse shaping “highlights this unique opportunity.” They used diffractive optical networks—four wafers in a precisely stacked and spaced arrangement—to shape pulses by simultaneously controlling the relative phase and amplitude of each spectral component across a continuous and wide range of frequencies, the researchers write.
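    As a rough illustration of what such a network does physically, the sketch below models a single spectral component passing through four thin phase masks (standing in for the 3D-printed wafers), with free-space propagation between layers via the angular-spectrum method. The grid size, pixel pitch, layer spacing, and random phase values are placeholder assumptions; the actual networks use surfaces optimized by deep learning across the whole THz band.

```python
import numpy as np

# Minimal, illustrative forward model of a diffractive optical network for ONE
# spectral component: each layer is a thin phase mask, and propagation between
# layers uses the angular-spectrum method. All numbers are assumptions.

def angular_spectrum_propagate(field, wavelength, dx, distance):
    """Propagate a complex field a given distance in free space."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz_sq = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    kz = 2 * np.pi * np.sqrt(np.maximum(kz_sq, 0.0))   # evanescent terms clipped
    H = np.exp(1j * kz * distance)
    return np.fft.ifft2(np.fft.fft2(field) * H)

rng = np.random.default_rng(0)
n = 128
wavelength = 0.75e-3        # 0.4 THz expressed in metres (illustrative)
dx = 0.5e-3                 # pixel pitch (assumption)
layer_spacing = 30e-3       # distance between wafers (assumption)

# Input: a plane wave; layers: four random phase masks standing in for the
# deep-learning-optimised surfaces.
field = np.ones((n, n), dtype=complex)
layers = [np.exp(1j * rng.uniform(0, 2 * np.pi, (n, n))) for _ in range(4)]

for phase_mask in layers:
    field = field * phase_mask
    field = angular_spectrum_propagate(field, wavelength, dx, layer_spacing)

output_intensity = np.abs(field) ** 2
print("output intensity shape:", output_intensity.shape)
```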

    On-demand synthesis of new pulses

    A 3D-printed optical diffractive network that is used to engineer THz pulses. Credit: Ozcan Lab/UCLA.

    For on-demand synthesis of new pulses, Ozcan says, the team used a Lego-like physical transfer-learning approach. That is, by using deep learning to train a new layer or layers that replace part of an existing network model, the team found that new pulses can be synthesized.

    In terms of its footprint, the pulse-shaping framework has a compact design, with an axial length of approximately 250 wavelengths, Ozcan says. Moreover, he adds, it does not use any conventional optical components such as spatial light modulators, which makes it ideal for pulse shaping in the THz band—where high-resolution spatiotemporal modulation and control of complex wavefronts over a broad bandwidth represent a significant challenge.

    Improving efficiency

    To improve the efficiency of the network, Ozcan says, a switch to low-absorption polymers for the 3D-printing material could be beneficial. To further improve output efficiency, he says, antireflective coatings over diffractive surfaces could be used to reduce back reflections.

    Altogether, the capabilities of the deep-learning-designed diffractive network approach to pulse shaping enable a variety of new opportunities, Ozcan says. When merged with appropriate fabrication methods and materials, he adds, the approach can be used to directly engineer THz pulses generated through quantum cascade lasers, solid-state circuits and particle accelerators.

    “There is already commercial interest in licensing diffractive-network–related intellectual property,” Ozcan says, “and we expect this to accelerate as we continue demonstrating some of the unique advantages of this framework for various applications in machine learning, computer vision and optical design.”

    The team is also working on visible diffractive networks, which could benefit various applications in computer vision and computational imaging fields, says Ozcan, calling it “work in progress.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Optics and Photonics News (OPN) is The Optical Society’s monthly news magazine. It provides in-depth coverage of recent developments in the field of optics and offers busy professionals the tools they need to succeed in the optics industry, as well as informative pieces on a variety of topics such as science and society, education, technology and business. OPN strives to make the various facets of this diverse field accessible to researchers, engineers, businesspeople and students. Contributors include scientists and journalists who specialize in the field of optics. We welcome your submissions.

    UC LA Campus

    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

     
  • richardmitnick 11:34 am on November 30, 2020 Permalink | Reply
    Tags: "LAMOST-Kepler/K2 Survey Announces the First Light Result", Kepler photometry, LAMOST-Kepler project, Spectroscopy

    From National Astronomical Observatories of China (CN) and phys.org: “LAMOST-Kepler/K2 Survey Announces the First Light Result” 

    From National Astronomical Observatories of China (CN)

    at

    Chinese Academy of Sciences (CN)

    and


    From phys.org

    An international team led by Prof. Jian-Ning Fu and Dr. Weikai Zong, from Beijing Normal University, published the first-light result of the medium-resolution spectroscopic observations undertaken by the LAMOST-Kepler/K2 Survey.

    Phase II of the LAMOST-Kepler/K2 Survey.

    This result demonstrates that the medium-resolution spectrographs equipped on LAMOST perform to their design expectations. The article was published online this November in The Astrophysical Journal Supplement Series.

    NASA Kepler Telescope and K2 mission, March 7, 2009 – November 15, 2018.

    The LAMOST-Kepler/K2 Survey [science paper above] was launched based on the success of the LAMOST-Kepler project [RAA], a low-resolution spectroscopic survey carried out continuously since 2011.

    From LAMOST-Kepler project. Targets of scientific interest in the field of view (FOV) of the Kepler mission. The black dots refer to the centers of the 14 LK-fields that cover the Kepler FOV. The following color coding is used: green for standard targets, blue for KASC targets, and orange for planet targets. The LK-fields observed in 2011–2014 are indicated by the circles drawn with a full line going from thick to thin and from gray to black, respectively.

    Unlike the LAMOST-Kepler project, the LAMOST-Kepler/K2 Survey aims to collect medium-resolution time-series spectroscopy of about 55,000 stars distributed over the Kepler and K2 campaign fields, with higher priority given to targets with available Kepler photometry. Each of these input targets will be visited about 60 times between September 2018 and June 2023. The project is allocated one-sixth of the total LAMOST medium-resolution observing time.

    From May 2018 to June 2019, a total of thirteen LAMOST-Kepler/K2 Survey footprints were visited by LAMOST, yielding about 370,000 high-quality spectra of 28,000 stars. The internal uncertainties for the effective temperature, surface gravity, metallicity and radial velocity are 80 K, 0.08 dex, 0.05 dex and 1 km/s, respectively, at a signal-to-noise ratio of 20, which suggests that the performance of the LAMOST medium-resolution spectrographs meets the design expectations. External comparisons with Gaia and APOGEE show that the LAMOST stellar atmospheric parameters follow a good linear relationship, indicating that the quality of the LAMOST medium-resolution spectra is reliable.
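    As an illustration of the kind of external comparison mentioned above, the sketch below fits a straight line between a LAMOST-measured parameter and the corresponding APOGEE value for cross-matched stars and reports the scatter. The arrays are made-up placeholder values, not survey data.

```python
import numpy as np

# Minimal sketch of an external cross-check: compare effective temperatures
# measured by two surveys for the same (hypothetical) stars and fit a line.

teff_lamost = np.array([4800., 5100., 4950., 5300., 4700., 5200.])   # placeholder values
teff_apogee = np.array([4820., 5080., 4990., 5270., 4730., 5230.])   # placeholder values

slope, intercept = np.polyfit(teff_apogee, teff_lamost, 1)
scatter = np.std(teff_lamost - (slope * teff_apogee + intercept))
print(f"Teff(LAMOST) ~ {slope:.3f} * Teff(APOGEE) + {intercept:.0f} K, scatter ~ {scatter:.0f} K")
```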


    ESA (EU)/Gaia satellite.

    SDSS Apache Point Observatory Galactic Evolution Experiment – Apogee

    SDSS Telescope at Apache Point Observatory, near Sunspot, NM, USA, Altitude 2,788 meters (9,147 ft).

    Apache Point Observatory, near Sunspot, New Mexico, Altitude 2,788 meters (9,147 ft).

    The LAMOST-Kepler/K2 Survey is the first project dedicated to obtaining time series of spectra with the LAMOST medium-resolution spectrographs pointed toward the Kepler/K2 fields. These spectra will be very important for many scientific goals, including the discovery of new binaries or even brown dwarfs, the study of oscillation dynamics in large-amplitude pulsators, and the investigation of the variability of stellar activity.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Astronomical Observatories of the Chinese Academy of Sciences (NAOC) (CN) was officially founded in April 2001 through the merger of four observatories, three observing stations and one research center, all under the Chinese Academy of Sciences (CAS).

    NAOC is headquartered in Beijing and has four subordinate units across the country: the Yunnan Observatory (YNAO), the Nanjing Institute of Astronomical Optics and Technology (NIAOT), the Xinjiang Astronomical Observatory (XAO) and the Changchun Observatory.

    The headquarters of NAOC, located in Beijing and formerly known as the Beijing Astronomical Observatory, is simply referred to as NAOC. Established in 1958 and aiming at the forefront of astronomical science, NAOC conducts cutting-edge astronomical studies, operates major national facilities and develops state-of-the-art technological innovations. Applying astronomical methods and knowledge to fulfill national interests and needs is also an integral part of the mission of NAOC. NAOC hosts the Center for Astronomical Mega-Science of the Chinese Academy of Sciences (CAMS), which is a new initiative to establish a mechanism for reaching consensus in the construction of major facilities, operations and technology developments among the CAS core observatories (NAOC; the Purple Mountain Observatory, PMO; and the Shanghai Astronomical Observatory, SHAO). CAMS will strive for the sharing of financial and personnel resources and technical expertise among the three core observatories of CAS.

    NAOC’s main research involves cosmological large-scale structures, the formation and evolution of galaxies and stars, high-energy astrophysics, solar magnetism and activity, lunar and deep space exploration, and astronomical instrumentation.

    NAOC has seven major research divisions in the areas of optical astronomy, radio astronomy, galaxies and cosmology, space science, solar physics, lunar and deep space exploration, and applications in astronomy. These divisions encompass more than 50 research groups and house the CAS Key Laboratories of Optical Astronomy, Solar Activity, Lunar and Deep-Space Exploration, Space Astronomical Science and Technology, and Computational Astrophysics.

    NAOC also has three major observing stations: Xinglong, for optical and infrared astronomy; Huairou, for solar magnetics; and Miyun, for radio astronomy and satellite data downlinks. NAOC has been deeply involved in the China Lunar Exploration Program, from designing and managing lunar exploration satellite payload systems, to receiving, storing and analyzing the data transmitted by these satellites from space. NAOC also has a GPU super-cluster computing facility with 85 nodes at a peak performance of up to 280 teraflops.

    NAOC also publishes Research in Astronomy and Astrophysics (RAA), a journal catalogued by SCI.

    The Chinese Academy of Sciences (CN) is the linchpin of China’s drive to explore and harness high technology and the natural sciences for the benefit of China and the world. Comprising a comprehensive research and development network, a merit-based learned society and a system of higher education, CAS brings together scientists and engineers from China and around the world to address both theoretical and applied problems using world-class scientific and management approaches.

    Since its founding, CAS has fulfilled multiple roles — as a national team and a locomotive driving national technological innovation, a pioneer in supporting nationwide S&T development, a think tank delivering S&T advice and a community for training young S&T talent.

    Now, as it responds to a nationwide call to put innovation at the heart of China’s development, CAS has further defined its development strategy by emphasizing greater reliance on democratic management, openness and talent in the promotion of innovative research. With the adoption of its Innovation 2020 programme in 2011, the academy has committed to delivering breakthrough science and technology, higher caliber talent and superior scientific advice. As part of the programme, CAS has also requested that each of its institutes define its “strategic niche” — based on an overall analysis of the scientific progress and trends in their own fields both in China and abroad — in order to deploy resources more efficiently and innovate more collectively.

    As it builds on its proud record, CAS aims for a bright future as one of the world’s top S&T research and development organizations.

     
  • richardmitnick 12:22 pm on January 5, 2019 Permalink | Reply
    Tags: ‘Following the Water’, Fingerprinting Life, Spectroscopy, The habitable zone serves as a target selection tool, UCO Lick Observatory Mt Hamilton in San Jose California, UCR’s Alternative Earths Astrobiology Center

    From UC Riverside: “Are We Alone?” 

    UC Riverside bloc

    From UC Riverside

    May 24, 2018
    Sarah Nightingale

    Illustration by The Brave Union

    Forty years ago, the Voyager 2 spacecraft launched from Florida’s Cape Canaveral. Over the next decade, it swept across the solar system, sending back images of Jupiter’s volcanoes, Saturn’s rings, and for the first time, the icy atmospheres of Uranus and Neptune.

    NASA/Voyager 2

    UCR’s Tim Lyons, left, and Stephen Kane are some of the only researchers in the world using Earth’s history as a guide to finding life in outer space. (Photo by Kurt Miller)

    The mission was more than enough to encourage Stephen Kane, a teenager growing up in Australia, to study planetary science in college. By the time he’d graduated, scientists had detected the first planet outside our solar system, known as an exoplanet, inspiring him to join the hunt and look for more.

    Over the past two decades, Kane, now an associate professor of planetary astrophysics at UC Riverside, has discovered hundreds of alien planets. At first, he focused on identifying giant Jupiter-like planets, which he describes as “low-hanging fruit” due to their large sizes. But in 2011, the Kepler Space Telescope identified the first rocky planet — Kepler 10b. Unlike gas giants such as Jupiter, rocky planets could potentially harbor life.

    NASA/Kepler Telescope

    With the discovery of more Earth-sized planets on the horizon, Kane realized that astrophysicists would struggle to understand the data they were receiving about terrestrial planets and their atmospheres.

    “During the course of the ongoing Kepler mission, I sought out planetary and Earth scientists because they’ve spent hundreds of years studying the solar system and how the Earth’s atmosphere has been shaped by biological and geophysical processes, so they have a lot to bring to the table,” Kane said.

    In 2017, Kane formalized that collaboration by joining an interdisciplinary research group led by Tim Lyons, a distinguished professor of biogeochemistry in the Department of Earth Sciences and director of UCR’s Alternative Earths Astrobiology Center. Backed by roughly $7.5 million from NASA, the center, one of only a handful like it in the world, brings together geochemists, biologists, planetary scientists, and astrophysicists from UCR and partner institutions to search for life on distant worlds using a template defined by the only known planet with life: Earth.

    Astrobiology researchers study areas on Earth that hold evidence of ancient life, such as these stromatolites at the Hamelin Pool Marine Nature Reserve in Shark Bay, Australia. The rocky, dome-shaped structures formed in shallow water through the trapping of sedimentary grains by communities of microorganisms. (Photo by Mark Boyle)

    Fingerprinting Life

    Since its formation more than 4.5 billion years ago, Earth has undergone immense periods of geological and biological change.

    When the first life appeared — in the form of simple microbes — the sun was fainter, there were no continents, and there was no oxygen in the atmosphere. A new kind of life emerged around 2.7 billion years ago: photosynthetic bacteria that use the sun’s energy to convert carbon dioxide and water into food and oxygen gas. Multicellular life evolved from those bacteria, followed by more familiar lifeforms: fish about 530 million years ago, land plants 470 million years ago, and mammals 200 million years ago.

    “There are periods in the Earth’s past that are as different from one another as Earth is from an exoplanet,” Lyons said. “That is the concept of alternative Earths. You can slice the Earth’s history into chapters, pages, and even paragraphs, and there has been life evolving, thriving, surviving, and dying with each step. If we know what kind of atmospheres were present during the early stages of life on Earth, and their relationships to the evolving continents and oceans, we can look for similar signposts in our search for life on exoplanets.”

    While it might seem impossible to characterize ancient oceans and atmospheres, scientists can glean hints by studying rocks formed billions of years ago.

    “The chemical compositions of rocks are determined by the chemistry of the oceans and their life, and many of the gases in the atmosphere, through exchange with the oceans, are controlled by the same processes,” Lyons said. “These atmospheric fingerprints of life in the underlying oceans, or biosignatures, can be used as markers of life on other planets light years away.”

    The search for alien biosignatures typically centers on the gases produced by living creatures on Earth because they’re the only examples scientists have to work with. But Earth’s many chapters of inhabitation reveal the great number of possible gas combinations. Oxygen gas, ozone, and methane in a planet’s biosignature could all indicate the presence of life — and seeing them together could present an even stronger argument.

    The center’s search for life is different from the hunt for intelligent life. While those researchers probe for signs of alien civilizations, such as radio waves or powerful lasers, Lyons’ team is essentially looking for the byproducts of simple lifeforms.

    “As we’re exploring exoplanets, what we’re really trying to do is characterize their atmospheres,” he said. “If we see certain profiles of gases, then we may be detecting microbial waste products that are accumulating in the atmosphere.”

    The UCR team must also account for processes that produce the same gases without contributions from life, a phenomenon researchers call false positives. For example, a planetary atmosphere with abundant oxygen would be a promising biosignature, but that evidence could be misleading without fully addressing where the oxygen came from. Similarly, methane is a key biosignature, but there are many nonbiological ways to produce this gas on Earth. These distinctions require careful consideration of many factors, including seasonal patterns, tectonic activity, and the type of planet and its star, among other data.

    False negatives are another concern, Lyons said. In previous research on ancient organic-rich rocks collected in Western Australia and South Africa, his group showed that about two billion years passed between the moment organisms first started producing oxygen on Earth and when it accumulated at levels high enough to be detectable in the atmosphere. In that scenario, a classic biosignature, oxygen, could be missed.

    “It’s also entirely possible that on some planets oxygen is produced through photosynthesis in pockets in the ocean and you’d never see it in the atmosphere,” Lyons said. “We have to be very clever to consider the many possibilities for biosignatures, and Earth’s past gives us many to choose from.”

    Illustration by The Brave Union

    ‘Following the Water’

    With several hundred terrestrial planets confirmed and many more awaiting discovery, the search for life-bearing worlds is an almost overwhelming task.

    Astronomers are narrowing down their search by focusing on habitable zones — the orbital region around stars where it’s neither too hot nor too cold for liquid water to exist on the surface.

    “We know that liquid water is essential for life as we know it, and so we’re beginning our search by looking for planets that are capable of having similar environments to Earth. We call this approach ‘following the water,’” Kane said.
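    A minimal sketch of how a habitable zone is estimated in practice: scale the solar-system boundaries by the square root of the star’s luminosity, so that the stellar flux at each boundary matches the flux at the corresponding distance from the Sun. The 0.95 and 1.67 AU boundaries and the TRAPPIST-1 luminosity used below are approximate, assumed values for illustration.

```python
import math

# Minimal sketch of a habitable-zone estimate from stellar luminosity.
# The solar-system boundary values are approximate assumptions.

INNER_EDGE_SUN_AU = 0.95
OUTER_EDGE_SUN_AU = 1.67

def habitable_zone(luminosity_solar):
    """Return (inner, outer) habitable-zone distances in AU."""
    scale = math.sqrt(luminosity_solar)
    return INNER_EDGE_SUN_AU * scale, OUTER_EDGE_SUN_AU * scale

# TRAPPIST-1 is an ultracool dwarf with roughly 0.05% of the Sun's luminosity
inner, outer = habitable_zone(0.00055)
print(f"HZ: {inner:.3f} - {outer:.3f} AU")
```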

    While the habitable zone serves as a target selection tool, Kane said a planet nestled in this region won’t necessarily show signs of life — or even liquid water. Venus, for example, occupies the inner edge of the Sun’s habitable zone, but its scorching surface temperature has boiled away any liquid water that once existed.

    “We are extremely fortunate to have Venus in our solar system because it reminds us that a planet can be exactly the same size as Earth and still have things go catastrophically wrong,” Kane said.

    Equally important, being in the habitable zone doesn’t mean a planet will boast other factors that make Earth ideal for life. In addition to liquid water, the perfect candidate would have an insulating atmosphere and a protective magnetic field. It would also offer the right chemical ingredients for life and ways of recycling those elements over and over when continents collide, mountains lift up and wear down, and nutrients are swept back to the seas by rivers.

    “People question why we focus so intently on Earth, but the answer is obvious. We only know what we know about life because of what the Earth has given us,” said Lyons, who has spent decades reconstructing the conditions during which life evolved.

    “If I asked you to design a planet with the perfect conditions for life, you would design something just like Earth because it has all of these essential features,” he added. “We are studying how these building blocks have been assembled in different ways during Earth’s history and asking which of them are essential for life, which can be taken away. From that vantage point, we ask how they could be assembled in very different ways on other planets and still sustain life.”

    Kane said a distant planetary system called TRAPPIST-1, which NASA scientists discovered in 2017, could provide clues about the ingredients that are necessary for life.

    A size comparison of the planets of the TRAPPIST-1 system, lined up in order of increasing distance from their host star. The planetary surfaces are portrayed with an artist’s impression of their potential surface features, including water, ice, and atmospheres. NASA

    The TRAPPIST-1 star, an ultracool dwarf, is orbited by seven Earth-size planets (NASA).

    ESO Belgian robotic Trappist National Telescope at Cerro La Silla, Chile



    Although miniature compared to our own solar system — TRAPPIST-1 would easily fit inside Mercury’s orbit around the sun — it boasts seven planets, three of which are in the habitable zone. However, the planets don’t have moons, and they may not even have atmospheres.

    “We are finding that compact planetary systems orbiting faint stars are much more common than larger systems, so it’s important that we study them and find out if they could have habitable environments,” Kane said.

    An artist’s illustration of the possible surface of TRAPPIST-1f, one of the planets in the TRAPPIST-1 system.

    Remote Observations

    At about 40 light-years (235 trillion miles) from Earth, the TRAPPIST-1 system is relatively close, but we’re never going to go there.

    “The fascinating thing about astronomy as a science is that it’s all based on remote observations,” Kane said. “We are trying to squeeze every piece of information we can out of photons that we are receiving from a very distant object.”

    While scientists have studied the atmospheres of several dozen exoplanets, most are too distant to probe with current instruments. That situation is changing. In April, NASA launched its Transiting Exoplanet Survey Satellite, known as TESS, which will seek Earth-sized planets around more than 500,000 nearby stars.

    NASA/MIT TESS

    In May 2020, NASA plans to launch the James Webb Space Telescope, which will perform atmospheric studies of the rocky worlds discovered by TESS.

    NASA/ESA/CSA Webb Telescope annotated

    Like Kepler, TESS detects exoplanets using the transit method, which measures the minute dimming of a star as an orbiting planet passes between it and the Earth.

    Planet transit. NASA/Ames
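    A minimal sketch of the transit method’s basic observable: the fractional dip in starlight is roughly the ratio of the planet’s disk area to the star’s. The radii below are illustrative round numbers, and limb darkening and other real-world effects are ignored.

```python
# Minimal sketch of the transit-depth estimate behind the transit method.
# Radii are illustrative round numbers.

R_SUN_KM = 696_000.0
R_EARTH_KM = 6_371.0

def transit_depth(planet_radius_km: float, star_radius_km: float) -> float:
    """Fractional flux drop during transit, ignoring limb darkening."""
    return (planet_radius_km / star_radius_km) ** 2

depth = transit_depth(R_EARTH_KM, R_SUN_KM)
print(f"Earth-Sun transit depth: {depth:.2e} ({depth * 1e6:.0f} parts per million)")
```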

    Because light also passes through the atmosphere of planets, scientists will use the Webb telescope to identify the blanket of gases surrounding them through a technique called spectroscopy.

    Kane and Lyons are working with NASA to design missions that will directly image exoplanets in ways that will ensure that interdisciplinary teams such as theirs can properly interpret a wide variety of planetary processes.

    “As we design future missions, we must make sure they are equipped with the right instruments to detect biosignatures and geological processes, such as active volcanoes,” Kane said.

    UCR’s astrobiology team is one of only a few groups in the world studying ancient Earth to create a catalog of biosignatures that will inform mission design in NASA’s search for life on distant worlds. With quintillions — think the number of gallons of water in all of our oceans — of potentially habitable planets in the universe, Lyons said he is optimistic that we’ll find signs of life in the future.

    “Just like the Voyager missions were important because of what they taught us about our solar system — from the discovery of Jupiter’s rings to the first close-up glimpses of Uranus and Neptune — the TESS and James Webb missions, and more importantly the next generation of telescopes planned for the coming decades, are very likely to change our understanding of distant space,” Lyons said. And perhaps nestled in those discoveries will be an answer to the most fundamental of all questions, ‘are we alone?’

    Alternative Earths Astrobiology Center

    Founded in 2015

    One of 12 research teams funded by the NASA Astrobiology Institute, and one of only two using Earth’s history to guide exoplanet exploration

    Awarded $7.5 million over five years to cultivate a “search engine” for life on planets orbiting distant stars using Earth’s evolution over billions of years as a template

    Builds on existing UCR strengths in biogeochemistry, Earth history, and astrophysics

    Unites 66 researchers and staff at 11 institutions around the world, including primary partners led by former UCR graduate students now on the faculty at Yale and Georgia Tech

    A NASA illustration of TESS monitoring stars outside our solar system.

    Through the Looking Glass

    In April, the Transiting Exoplanet Survey Satellite (TESS) Mission launched with the goal of discovering new Earths and super-Earths around nearby stars. As a guest investigator on the TESS Mission, Stephen Kane will use University of California telescopes, including those at the Lick Observatory on Mount Hamilton, to help determine whether candidate exoplanets identified by TESS are actually planets.

    UCSC Lick Observatory, Mt Hamilton, in San Jose, California, Altitude 1,283 m (4,209 ft)

    UCSC Lick Automated Planet Finder telescope, Mount Hamilton, CA, USA

    The UCO Lick C. Donald Shane telescope is a 120-inch (3.0-meter) reflecting telescope located at the Lick Observatory, Mt Hamilton, in San Jose, California, Altitude 1,283 m (4,209 ft)

    UC Santa Cruz Shelley Wright at the 1-meter Nickel Telescope NIROSETI


    NIROSETI team, from left to right: Rem Stone (UCO Lick Observatory), Dan Werthimer (UC Berkeley), Jérôme Maire (U Toronto), Shelley Wright (UCSD), Patrick Dorval (U Toronto), and Richard Treffers (Starman Systems). (Image by Laurie Hatch)

    By studying the planet mass data obtained from the ground-based telescopes and planet diameter readings from spacecraft observations, Kane will also help determine the overall composition of the newly identified planets.
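    A minimal sketch of why combining the two measurements constrains composition: a mass from radial-velocity follow-up plus a radius from transit photometry yield a bulk density that can be compared with rock, ice, and gas. The example planet below is hypothetical.

```python
import math

# Minimal sketch: bulk density from a planet's mass and radius.
# The example planet values are hypothetical.

M_EARTH_KG = 5.972e24
R_EARTH_M = 6.371e6

def bulk_density(mass_earths: float, radius_earths: float) -> float:
    """Bulk density in g/cm^3 from mass and radius in Earth units."""
    mass_kg = mass_earths * M_EARTH_KG
    radius_m = radius_earths * R_EARTH_M
    volume_m3 = 4.0 / 3.0 * math.pi * radius_m ** 3
    return mass_kg / volume_m3 / 1000.0   # kg/m^3 -> g/cm^3

# A hypothetical 5 Earth-mass, 1.5 Earth-radius "super-Earth"
print(f"{bulk_density(5.0, 1.5):.1f} g/cm^3 (Earth is ~5.5 g/cm^3)")
```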

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Riverside Campus

    The University of California, Riverside is one of 10 universities within the prestigious University of California system, and the only UC located in Inland Southern California.

    Widely recognized as one of the most ethnically diverse research universities in the nation, UCR’s current enrollment is more than 21,000 students, with a goal of 25,000 students by 2020. The campus is in the midst of a tremendous growth spurt with new and remodeled facilities coming on-line on a regular basis.

    We are located approximately 50 miles east of downtown Los Angeles. UCR is also within easy driving distance of dozens of major cultural and recreational sites, as well as desert, mountain and coastal destinations.

     
  • richardmitnick 4:42 pm on November 19, 2015 Permalink | Reply
    Tags: Spectroscopy

    From CANDELS: “Preparing Multi Object Spectroscopy Observations” 

    Hubble CANDELS

    November 19, 2015
    Janine Pforr

    Although CANDELS is a photometric survey, many team members have proposed for, and been granted, observing time to obtain spectroscopy of CANDELS sources. Such additional data not only provide a more accurate measurement of the distances of galaxies (i.e., their redshifts), but also additional information to decode their properties, such as how many stars they are forming and how much dust they contain.

    Figure 1: Example pointing for a MOS observation with the GMOS instrument at the Gemini Telescope. The image in the background shows the targeted sky area. The cyan outline shows the field of view of the instrument with the gaps between the 3 CCD detectors. The dashed box shows the sky area in which the guide star needs to be placed. The red "arm" shows the arm that holds the camera that monitors the guide star.

    Classically, spectroscopy was carried out object by object, by placing one long slit where your object is located. This restricts the area that lets light through to the detector to a narrow slit, blocking out everything else around it. The light that enters the prism or grism through this slit is then dispersed according to its wavelength, creating a spectrum of the object. Bright spots highlight the presence of elements that emit at that frequency/wavelength, and dark spots tell us where certain elements absorbed light and stopped it from reaching us. You can imagine, though, that carrying out such observations object by object is very time consuming.

    In recent decades, though, galaxy-evolution studies have profited greatly from new instrumentation that allows us to observe many objects at the same time. This is true not only for taking images of the sky, but also for spectroscopic observations.

    One method to take spectroscopy of many objects at the same time is grism spectroscopy, which we showed you in our post about grism spectroscopy with the Hubble Space Telescope. In that case nothing in your field of view is masked out and everything is dispersed. If your field of view is very crowded, meaning you have a great many objects in your piece of sky, many spectra will overlap and will be hard to disentangle.

    Figure 2: I-band image of the piece of sky to be observed with Multi Object Spectroscopy within the mask-making software. The red outline shows the field of view of the instrument; the blue stripes mark the gaps between the detectors. All potential target objects are marked with different smaller symbols according to their priority (blue triangles, green boxes, white circles, and cyan diamonds for alignment stars).

    Another method is multi-object spectroscopy (MOS) via slit-masks. With this method you can take spectra for many objects at the same time by placing slits on many objects and blocking out the rest of the sky. This requires the creation of so-called MOS-masks in which the slit areas and the blocked out areas are clearly defined. This means that for every different observation you need a custom mask. Most current instruments require these masks to be prepared well in advance of the observation and to be cut out of plastic. This process isn’t feasible for a space telescope, but works very well on the ground. However, times are changing. For example, for the MOSFIRE (Multi-Object Spectrometer for InfraRed Exploration) instrument at the Keck Telescope, the masks are created on the fly and “bars” that create slits are then moved into the right position within the instrument.

    Keck MOSFIRE
    MOSFIRE

    Keck Observatory
    Keck Observatory Interior
    Keck Observatory

    Also for the upcoming James Webb Space Telescope a MOS unit will be available.

    NASA Webb Telescope
    NASA/Webb

    It is designed in such a way that little shutters open and close to produce slits and masked out areas. For many other instruments however, a mask is essentially one large piece of plastic that has lots of tiny slits cut out of it. The slits are placed exactly where you want to observe an object. To create such a mask is in principle relatively simple and I illustrate the process here with a series of images.

    I recently created some MOS masks for the Gemini Multi Object Spectrograph (GMOS) instrument at the Gemini Telescope to observe CANDELS galaxies and will use one of the masks I created as an example here to illustrate the process.

    Gemini Multi Object Spectrograph
    Gemini Multi Object Spectrograph (GMOS)

    Gemini North telescope
    Gemini North Interior
    Gemini North

    Firstly, an image of the desired piece of sky, in which the positions of the objects you want to observe are measured (Figure 2), and a list of objects, i.e. a catalogue, are required. From that list we pick our desired targets. Often these are selected based on specific properties and limited by their brightness to ensure the maximum success with the granted observation time. We also need a list of stars to guide the telescope and to align the mask properly. Guide stars are used to correct for the rotation of the Earth throughout the observation so that the telescope is pointing at the same portion of the sky the entire time. You can see an example pointing in the first figure.

    Figure 3: Zoom in [in original article] to show the placement of slits on some targets. Objects with blue triangles have highest priority, next are objects with green boxes, and then those with white circles. The yellow vertical stripes overlaid on an object show where the slit will be placed and cut out of the mask. The horizontal white lines mark the extension of the dispersed light, i.e. the spectrum of the object. Basically, all the light that hits the disperser when it comes through the vertically extended slit, is dispersed in the horizontal direction.

    Alignment stars are included on the mask to make sure all the slits are on the selected objects and not on some other piece of empty sky when the telescope operators define the pointing of the telescope. Then we take this image and list of targets and run them through the software provided for the given instrument. Usually, the original list of targets leaves room for other objects to be placed on the mask as well, so we basically work with a prioritized list of objects. The highest priority objects are "forced" onto the mask into the space left after placement of the alignment stars, to observe as many of the desired targets as possible. Then any available gaps are filled with objects of lower priority (a minimal sketch of this prioritized placement follows Figure 4 below). In Figures 3 and 4 you can see all the slits that were placed on this particular mask and a zoom in that shows you a slit.

    Figure 4: The finished mask. The red outline is the field of view of the instrument, the blue vertical lines mark the gaps in the detector. Each rectangle box shows where the spectrum of that object will extend. Yellow vertical lines mark the position of the slit on the selected object. The cyan rectangle boxes mark the position of the alignment stars.
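    Here is a minimal sketch of the prioritized placement idea described above: walk through the targets in priority order and accept each one only if its slit would not collide with one already placed. Real mask-design software (such as the GMOS tools) handles the two-dimensional geometry, the dispersion direction, and the alignment stars far more carefully; this shows only the core greedy loop, with hypothetical target positions.

```python
# Minimal sketch of prioritised slit placement: accept targets in priority
# order, skipping any whose slit would overlap one already placed.
# Target positions and the minimum separation are hypothetical.

def place_slits(targets, min_separation=10.0):
    """targets: list of (priority, y_position), priority 1 = highest.
    Returns the y positions that received a slit."""
    placed = []
    for _, y in sorted(targets, key=lambda t: t[0]):
        if all(abs(y - other) >= min_separation for other in placed):
            placed.append(y)
    return placed

# Hypothetical targets: (priority, y position on the mask in arcsec)
targets = [(1, 12.0), (1, 15.0), (2, 40.0), (2, 18.0), (3, 70.0), (3, 44.0)]
print(place_slits(targets))   # -> [12.0, 40.0, 70.0] with a 10" minimum separation
```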

    After this, the observer can manually remove objects that received a slit, for example if he or she wants the software to pick a different, more optimally placed object instead. There are usually a few iterations in which the slit placement is refined a bit more and the maximum number of objects is placed on the mask. And that's it, the mask is finished. All that is left to do is to create the masks for all the pointings in the same manner and send them off to the telescope and instrument support team for checking and approval. Once a mask is approved, all the necessary information is sent to the mask-cutting team, who cut out all the tiny slits. After the masks are cut, they are installed in the instrument, and then we wait anxiously for the completion of our observations if they are carried out by the support astronomers at the observatory (Figure 5), or hope for good weather if we travel to the telescope to carry out the observations ourselves.

    The CANDELS fields are currently targeted by astronomers all over the world with many observational programs on instruments such as DEIMOS (on the Keck Telescope), MOSFIRE (on the Keck Telescope), GMOS (on the Gemini Telescopes, described in this post) and VIMOS (at the VLT, for example with the VIMOS UltraDeep Survey).

    5
    Figure 5: Example observation from one of the GMOS masks. Each horizontal package of lines is the dispersed light from one slit. The bright vertical lines (a few are highlighted by the violet arrows) are emission lines caused by the night sky: elements in our atmosphere emit light at certain wavelengths, which is also detected and overlaps with the spectrum of the target object. The spectral traces of the target objects are highlighted by red arrows and appear as faint horizontal lines. In the red box we can clearly see two bright dots; these are emission lines in the target object, which we can use to determine its redshift and other properties. The green arrows point towards high-energy cosmic rays that hit the detector and cause a detection. In order to retrieve the spectra of the target objects, astronomers have to remove the cosmic rays and subtract the spectrum of the night sky, so that ideally only the spectra of the real targets are left in the end.
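    For readers curious what the clean-up described in the Figure 5 caption looks like in practice, here is a deliberately simplified numpy sketch of the two steps: flagging cosmic-ray hits and subtracting the night-sky emission. It illustrates the idea only and is not the actual GMOS reduction pipeline.

    import numpy as np

    def clean_2d_spectrum(data):
        """data: 2-D array, wavelength along axis 1, position along the slit on axis 0."""
        # Cosmic rays are sharp outliers: flag pixels far above the median of their
        # wavelength column and replace them with that median.
        col_median = np.median(data, axis=0)
        col_mad = np.median(np.abs(data - col_median), axis=0) + 1e-9
        cosmic_rays = (data - col_median) > 10 * 1.4826 * col_mad
        cleaned = np.where(cosmic_rays, col_median, data)

        # The night sky fills the whole slit, while the target covers only a few rows,
        # so the median along the slit is a reasonable sky estimate for each wavelength;
        # subtract it from every row.
        sky = np.median(cleaned, axis=0)
        return cleaned - sky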

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    About the CANDELS blog

    In late 2009, the Hubble Space Telescope began an ambitious program to map five carefully selected areas of the sky with its sensitive near-infrared camera, the Wide-Field Camera 3. The observations are important for addressing a wide variety of questions, from testing theories for the birth and evolution of galaxies, to refining our understanding of the geometry of the universe.

    This is a research blog written by people involved in the project. We aim to share some of the excitement of working at the scientific frontier, using one of the greatest telescopes ever built. We will also share some of the trials and tribulations of making the project work, from the complications of planning and scheduling the observations to the challenges of trying to understand the data. Along the way, we may comment on trends in astronomy or other such topics.

    CANDELS stands for the Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey. It builds on the legacy of the Hubble Deep Field, as well as the wider-area surveys called GOODS, AEGIS, COSMOS, and UKIDSS UDS. The CANDELS observations are designed to search for galaxies within about a billion years of the big bang, study galaxies at cosmic high-noon about 3 billion years after the big bang – when star-formation and black hole growth were at their peak intensity – and discover distant supernovae for refining our understanding of cosmic acceleration. You can find more details, and download the CANDELS data, from the CANDELS website.

    You can also use the Hubble Legacy Archive to view the CANDELS images.

     
  • richardmitnick 11:14 pm on July 17, 2015 Permalink | Reply
    Tags: , , Spectroscopy,   

    From The Conversation: “Explainer: seeing the universe through spectroscopic eyes” 

    Conversation
    The Conversation

    July 17, 2015
    Amanda Bauer

    When you look up on a clear night and see stars, what are you really looking at? A twinkling pinprick of light with a hint of colour?

    Imagine looking at a starry sky with eyes like prisms that separate the light from each star into its full rainbow of colour. Astronomers have built instruments to do just that, and spectroscopy is one of the most powerful tools in the astronomer’s box.

    The technique might not produce the well-known pretty pictures sent down by the Hubble Space Telescope, but for astronomers, a spectrum is worth a thousand pictures.

    Visible spectra reveal huge amounts of information about objects in the distant cosmos that we can’t learn any other way.

    So what is spectroscopy?

    Spectroscopy is the process of separating starlight into its constituent wavelengths, like a prism turning sunlight into a rainbow. The familiar colours of the rainbow correspond to different wavelengths of visible light.

    1
    The spectrum of visible light. Note the wavelength increases towards the red. Wikimedia, CC BY

    The human eye is sensitive to the visible spectrum – a narrow range of wavelengths within the entire electromagnetic spectrum. The visible spectrum covers roughly 390 nanometers to 780 nanometers (astronomers often use Angstroms, where one Angstrom is 10⁻¹⁰ metres, so visible light spans 3,900 to 7,800 Angstroms).

    Once visible starlight reaches the curved primary mirror of a telescope, it is reflected toward the focal point and can then be directed anywhere. If the light is sent directly to a camera, an image of the night sky is seen on a computer screen as a result.

    If the light is instead sent through a spectrograph before it hits the camera, then the light from the astronomical object gets separated into its basic parts.

    2
    The colours of the spectrum revealed as the light passes through a glass prism. Flickr/final gather, CC BY-ND

    A very simple spectrograph was used by [Sir] Isaac Newton in the 1660s when he dispersed light with a glass prism. Modern spectrographs consist of a series of optics, a dispersing element and a camera at the end. The light is digitised and sent to a computer, which astronomers use to inspect and analyse the resulting spectra.

    The video [in the original article] shows the path of distant starlight through the 4-metre Anglo-Australian Telescope (AAT) and a typical spectrograph, revealing real data at the end.

    Anglo Australian Telescope Exterior
    Anglo Australian Telescope Interior
    Anglo-Australian Telescope

    What do spectra teach us?

    A spectrum allows astronomers to determine many things about the object being viewed, such as how far away it is, its chemical makeup, age, formation history, temperature and more. While every astronomical object has a unique rainbow fingerprint, some general properties are universal.

    3
    Top: spiral galaxy spectrum. Bottom: non-star-forming galaxy spectrum. Screenshot from an Australian Astronomical Observatory video; author provided.

    Here we examine the galaxy spectra shown in the video. The spectrum of a galaxy is the combined light from its billions of stars and all other radiating matter in the galaxy, such as gas and dust.

    In the top spectrum you can see a few strong spikes. These are called "emission lines" and occur at discrete wavelengths set by atomic structure, as electrons jump between energy levels.

    The hydrogen spectrum is particularly important because 90% of the normal matter in the universe is hydrogen. Because of the details of hydrogen’s atomic structure, we recognise the strong hydrogen-alpha emission line at roughly 7,500 Angstroms in the top spectrum image.

    In a galaxy, only the youngest, most massive stars are hot enough to excite the surrounding hydrogen gas so that electrons populate the third energy level; when those electrons fall back to the second level, a hydrogen-alpha photon is emitted.

    Because of this, we know that the strength of the hydrogen-alpha line in a galaxy's spectrum indicates how many very young stars there are in the galaxy. Since the bottom spectrum shows no hydrogen-alpha emission, we can conclude that the bottom galaxy is not sparking new life in the form of shining stars, while the top galaxy harbours several hard-working stellar nurseries.
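    As a quick back-of-the-envelope check of the transition described above (an electron falling from the third to the second energy level), the Rydberg formula reproduces the hydrogen-alpha rest wavelength. This little calculation is an illustrative addition, not part of the original article.

    # 1/lambda = R_H * (1/n_low^2 - 1/n_high^2) for the n = 3 -> 2 transition.
    R_H = 1.0968e7                                   # Rydberg constant for hydrogen, per metre

    inverse_wavelength = R_H * (1 / 2**2 - 1 / 3**2)
    wavelength_angstrom = 1e10 / inverse_wavelength

    print(f"H-alpha rest wavelength ~ {wavelength_angstrom:.0f} Angstroms")
    # ~6565 in vacuum; the familiar 6563 figure is the wavelength measured in air.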

    In the bottom spectrum you can see a number of dips. These are called "absorption lines" because they appear in the spectrum when there is anything between the light's source and the observer on Earth absorbing the light. The absorbing material could be the extended layers of a star or interstellar clouds of gas or dust.

    The absorption lines close to each other below 5,000 Angstroms in the bottom spectrum are the calcium H and K lines and can be used to determine how quickly stars are zooming around the galaxy.

    In a galaxy how far, far away?

    A basic piece of information derived from a spectrum is the distance to the galaxy or, more specifically, how much the light has been stretched during its journey to Earth. Because the universe is expanding, the light emitted by the galaxy is stretched toward redder wavelengths as it innocently moves across space. We measure this as redshift.

    To determine the exact distance of a galaxy, astronomers measure the well-studied pattern of absorption and emission lines in the observed spectrum and compare it to the laboratory wavelengths of these features on Earth. The difference tells how much the light was stretched, and therefore how long the light was travelling through space, and consequently how far away the galaxy is.

    4
    The absorption lines ‘shift’ the farther away an object is, giving us an indication of its distance from us. Georg Wiora (Dr. Schorsch)/Wikimedia Commons, CC BY

    In the top galaxy spectrum mentioned earlier, we measure the strong red emission line of hydrogen-alpha to be at a wavelength of roughly 7,450 Angstroms. Since we know that line has a rest wavelength of 6,563 Angstroms, we calculate a redshift of 0.13, which means the light was travelling for 1.7 billion years before it reached our lucky telescope. The galaxy emitted that light when the universe was roughly 11.8 billion years old.
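    A minimal sketch of that calculation, using astropy and the numbers quoted above: the redshift follows directly from the observed and rest wavelengths, while the light-travel time depends on the assumed cosmology. Planck15 is used here purely as an example, so it gives a value close to, but not exactly, the 1.7 billion years quoted in the text.

    from astropy.cosmology import Planck15

    lambda_observed = 7450.0   # Angstroms, hydrogen-alpha in the top galaxy spectrum
    lambda_rest = 6563.0       # Angstroms, laboratory hydrogen-alpha wavelength

    z = (lambda_observed - lambda_rest) / lambda_rest
    print(f"redshift z = {z:.3f}")                               # ~0.135

    # Light-travel (lookback) time for that redshift under the Planck15 cosmology.
    print(f"lookback time = {Planck15.lookback_time(z):.2f}")    # ~1.8 Gyr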

    Australia’s strength in spectroscopy

    Australia has led the way internationally in spectroscopic technology development for the last 20 years, largely thanks to the use of fibre optics to direct galaxy light from the telescope structure to the spectrograph.

    A huge advantage of using optical fibres is that more than one spectrum can be obtained simultaneously, drastically improving the efficiency of the telescope observing time.

    Australian astronomers have also led the world in building robotic technologies to position the individual optical fibres. With these, the AAT and the UK Schmidt Telescope (both located at Siding Spring Observatory in New South Wales) have collected about a third of the 2.5 million galaxy spectra that humans have ever observed.

    UK Schmidt Telescope Exterior
    UK Schmidt Telescope Interior
    UK Schmidt Telescope

    While my own research uses hundreds of thousands of galaxy spectra for individual projects, it still amazes me to think that each one of these spectra is a composite collection of light created by hundreds of billions of stars gravitationally bound together in a single swirling galaxy, many of them similar to our own Milky Way home.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Conversation US launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues, and hopefully to allow for a better quality of public discourse and conversation.

     
  • richardmitnick 8:38 pm on July 30, 2014 Permalink | Reply
    Tags: , , , , , , Spectroscopy   

    From NASA/Webb: “Revolutionary Microshutter Technology Hurdles Significant Challenges” 

    NASA James Webb Header

    NASA James Webb Telescope

    James Webb Space Telescope
    July 29, 2014
    Lori Keesey
    NASA Goddard Space Flight Center, Greenbelt, Maryland

    NASA technologists have hurdled a number of significant technological challenges in their quest to improve an already revolutionary observing technology originally created for the James Webb Space Telescope.

    The team, led by Principal Investigator Harvey Moseley, a scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, has demonstrated that electrostatically actuated microshutter arrays — that is, those activated by applying a specific voltage — are as functional as the current technology’s magnetically activated arrays. This advance makes them a highly attractive capability for potential Explorer-class missions designed to perform multi-object observations.

    “We have identified real applications — three scientists want to use our microshutter arrays and the commercial sector has expressed interest,” said Mary Li, a Goddard engineer who is working with Moseley and other team members to fully develop this already groundbreaking observing technology. “The electrostatic concept has been fully demonstrated and our focus now is on making these devices highly reliable.”

    Progress, she said, is in large part due to the fact that the team successfully eliminated all macro-moving parts — in particular, a large magnet — and dramatically lowered the voltage needed to actuate the microshutter array. In addition, the team applied advanced electronic circuitry and manufacturing techniques to assure the microshutter arrays’ dependable operation in orbit, Li added.

    The Microshutter Breakthrough

    Considered among the most innovative technologies to fly on the Webb telescope, the microshutter assembly is created from micro-electro-mechanical technologies and comprises thousands of tiny shutters, each about the width of a human hair.

    Assembled on four postage-stamp-sized grids, or arrays, the 250,000 shutters open or close individually to allow only the light from targeted objects to enter Webb’s Near Infrared Spectrograph (NIRSpec), which will help identify types of stars and gases and measure their distances and motions. Because Webb will observe faint, far-away objects, it will take as long as a week for NIRSpec to gather enough light to obtain good spectra.

    NASA Webb NIRspec
    NASA/Webb NIRSpec

    NIRSpec’s microshutter array, however, enhances the instrument’s observing efficiency. It will allow scientists to gather spectral data on 100 objects at a time, vastly increasing the observatory’s productivity. When NASA launches the Webb telescope in 2018, it will represent a first for multi-object spectroscopy in space.

    This image shows a close-up view of the next-generation microshutter arrays during the fabrication process. The technology advances an already groundbreaking multi-object observing technique.
    Image Credit: NASA/Bill Hrybyk

    Quest to Improve Design

    Determined to make the microshutter technology more broadly available, Goddard technologists have spent the past four years experimenting with techniques to advance this capability.

    One of the first things the team did was eliminate the magnet that sweeps over the shutter arrays to activate them. As with all mechanical parts, the magnet takes up space, adds weight, and is prone to mechanical failure. Perhaps more important, the magnet cannot be easily scaled up in size without creating significant fabrication challenges. As a result, the instrument’s field of view — that is, the area that is observable through an instrument — is limited in size. This greatly impedes next-generation space observatories that will require larger fields of view.

    Magnetic activation also takes longer. With the Webb telescope, the magnet must first sweep over the array to open all the shutters before voltages are selectively applied to open or close specific shutters.

    Achieving the Voltage Sweet Spot and Other Milestones

    To accommodate the needs of future observatories, the team replaced the magnet with electrostatic actuation. When an alternating-current voltage is applied to electrodes on the front side of the microshutters, the shutters swing open; to latch the desired shutters, a direct-current voltage is then applied to electrodes on the back side. In other words, only the needed shutters are held open; the rest close again. “This reduction in cycles should allow us to extend the lifetime of the microshutter arrays 100 times or more,” Li explained.
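    To make the two-step addressing concrete, here is a toy software model of the scheme described above: a transient "open" signal acts on a whole row of shutters, and only the shutters that also receive the latching signal stay open. The row-by-row sweep and the data structures are illustrative assumptions; the real device is driven by analog electronics, not Python.

    import numpy as np

    def address_array(n_rows, n_cols, wanted):
        """Return a boolean map of which shutters end up open; `wanted` is a set of (row, col) pairs."""
        latched = np.zeros((n_rows, n_cols), dtype=bool)
        for row in range(n_rows):
            # Step 1: the AC voltage on this row's front electrodes swings every shutter open (transiently).
            row_open = np.ones(n_cols, dtype=bool)
            # Step 2: the DC voltage on the back electrodes latches only the selected shutters;
            # the rest fall closed again when the opening voltage is removed.
            selected = np.array([(row, col) in wanted for col in range(n_cols)])
            latched[row] = row_open & selected
        return latched

    shutter_map = address_array(4, 6, wanted={(0, 2), (1, 5), (3, 0)})
    print(shutter_map.astype(int))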

    And because the magnet no longer dictates the size of the array, its elimination will allow scientists to assemble much larger arrays for instruments whose fields of view are 50 times larger than Webb’s NIRSpec, she said.

    Just as significant is the voltage needed to actuate the arrays. When the effort first began four years ago, the team could open and close the shutters only with 1,000 volts. By 2011, the team had slashed that number to 80 volts — a level that could still exceed instrument voltage specifications. By last year, the team had achieved a major milestone by activating the shutters with just 30 volts — a voltage sweet spot, Li said.

    “But we also did something else,” she added.

    Through experimentation, the team used atomic layer deposition, a state-of-the-art fabrication technology, to fully insulate the tiny space between the electrodes, eliminating potential electrical crosstalk that could interfere with the arrays’ operation.

    The team also applied a very thin anti-stiction coating to prevent the shutters from sticking when opened. Before applying the coating, a 3,000-cycle laboratory test indicated that a third of the shutters stuck. After coating them, the team ran a 27,000-cycle test and not a single shutter adhered to the sides, Li said.

    Success Breeds Success; More Work Ahead

    Goddard engineers Devin Burns and Lance Oh are pictured here with the next-generation microshutter arrays.
    Image Credit: NASA/Bill Hrybyk

    Goddard engineer Lance Oh is one of several technologists developing a next-generation microshutter array technology originally developed for the James Webb Space Telescope.
    Image Credit: NASA/Bill Hrybyk

    As a result of the progress, Li said three astrophysicists now are interested in applying the technology to their own mission concepts, which include observing nearby star-forming regions in the ultraviolet, studying the origins of astronomical objects to better understand the cosmic order, and understanding how galaxies, stars, and black holes evolve. In fact, one of those scientists is so committed to advancing the microshutter array that he plans to demonstrate it during a sounding-rocket mission next year, Li said.

    Although spectroscopy — the study of the absorption and emission of light by matter — is the obvious beneficiary of the technology’s advance, Li said it is also applicable to lidar instruments, which measure distance by illuminating a target with a laser and analyzing the reflected light. A major automotive company has also expressed interest in the technology, she added.

    However, before others can use the new and improved microshutter technology, Li said the team must develop an assembly and packaging to house multiple arrays. “If you want to use the microshutter array on a large telescope, we need to make a larger field of view. To make this happen, we need to take multiple arrays and stitch them together,” Li said.

    Currently, the technology relies on a large computerized switch box — a heavy device unsuitable for spaceflight missions. The team plans to incorporate an integrated circuit, or silicon chip, that drives the switching functions. Placed next to the shutters, the circuit would take up only a fraction of the space. The team currently is identifying circuits from different vendors and plans to begin testing shortly.

    “In just four years, we have made great progress. A major private company has expressed interest in our technology, to say nothing of the three potential astrophysics missions,” Li said. “Given our progress, I am confident that we can make this technology more readily accessible to the optics community.”

    See the full article here.

    The James Webb Space Telescope will be a large infrared telescope with a 6.5-meter primary mirror. Launch is planned for later in the decade.

    Webb telescope will be the premier observatory of the next decade, serving thousands of astronomers worldwide. It will study every phase in the history of our Universe, ranging from the first luminous glows after the Big Bang, to the formation of solar systems capable of supporting life on planets like Earth, to the evolution of our own Solar System.

    Webb telescope was formerly known as the “Next Generation Space Telescope” (NGST); it was renamed in Sept. 2002 after a former NASA administrator, James Webb.

    Webb is an international collaboration between NASA, the European Space Agency (ESA), and the Canadian Space Agency (CSA). The NASA Goddard Space Flight Center is managing the development effort. The main industrial partner is Northrop Grumman; the Space Telescope Science Institute will operate Webb after launch.

    Several innovative technologies have been developed for Webb. These include a folding, segmented primary mirror, adjusted to shape after launch; ultra-lightweight beryllium optics; detectors able to record extremely weak signals; microshutters that enable programmable object selection for the spectrograph; and a cryocooler for cooling the mid-IR detectors to 7 K.

    There will be four science instruments on Webb: the Near InfraRed Camera (NIRCam), the Near InfraRed Spectrograph (NIRSpec), the Mid-InfraRed Instrument (MIRI), and the Fine Guidance Sensor/ Near InfraRed Imager and Slitless Spectrograph (FGS-NIRISS). Webb’s instruments will be designed to work primarily in the infrared range of the electromagnetic spectrum, with some capability in the visible range. It will be sensitive to light from 0.6 to 28 micrometers in wavelength.

    Webb has four main science themes: The End of the Dark Ages: First Light and Reionization, The Assembly of Galaxies, The Birth of Stars and Protoplanetary Systems, and Planetary Systems and the Origins of Life.

    Launch is scheduled for later in the decade on an Ariane 5 rocket. The launch will be from Arianespace’s ELA-3 launch complex at European Spaceport located near Kourou, French Guiana. Webb will be located at the second Lagrange point, about a million miles from the Earth.

    NASA

    ESA Icon Large

    Canadian Space Agency



     