Tagged: Cosmology

  • richardmitnick 10:33 am on November 23, 2017 Permalink | Reply
    Tags: Cosmology, GomX-4A and GomX-4B

    From ESA: “ESA’s latest technology CubeSat cleared for launch site” 

    ESA Space For Europe Banner

    European Space Agency

    23 November 2017
    No writer credit

    Testing CubeSat pair.

    GomX-4B, ESA’s latest and largest technology-testing CubeSat, will be launched from China early next year, together with the near-identical GomX-4A. The pair will test intersatellite communication links and propulsion while orbiting up to 4500 km apart.

    The cereal box-sized GomX-4B has been passed as ready to travel along with its twin from manufacturer GomSpace in Denmark in early December to begin launch preparations in China.

    “GomX-4B is scheduled to be launched on a Chinese Long March rocket on 1 February, along with GomX-4A, owned by the Danish Ministry of Defence,” says Roger Walker, heading ESA’s Technology CubeSat initiative.

    The majority of tests were made at GomSpace and other facilities in Denmark, apart from thermal–vacuum testing – ensuring that the CubeSats can withstand the hard vacuum and temperature extremes of low orbit – which took place at ESA’s technical centre in the Netherlands.

    GomX-4B with GomX-4A
    Released 13/10/2016
    Copyright GomSpace
    ESA has signed a contract for its biggest nanosatellite yet: GomX-4B will be a ‘6-unit’ CubeSat, intended to demonstrate miniaturised technologies, preparing the way for future operational nanosatellite constellations.
    GomX-4B is double the size of ESA’s first technology CubeSat, GomX-3, which was released from the International Space Station last year.
    The contract with Danish CubeSat specialist GomSpace is supported through the In-Orbit Demonstration element of ESA’s General Support Technology Programme, focused on readying new products for space and the marketplace.
    Aiming for flight in late 2017, GomX-4B will be launched and flown together with GomX-4A, designed by GomSpace for the Danish Ministry of Defence under a separate contract.
    The two CubeSats will stay linked through a new version of the software-defined radio system demonstrated on GomX-3, while their relative positions along their shared orbit are controlled at separations of up to 4500 km.
    Such intersatellite links will allow future CubeSat constellations to relay data quickly to users on the ground. The same radio system will also be used for rapid payload data downloads to Earth.
    NanoSpace in Sweden is contributing the highly miniaturised cold-gas thrusters for controlling the orbit, allowing future CubeSat-based constellations to be deployed quickly after launch.
    Additional technology payloads include a compact hyperspectral imager called HyperScout, developed by Cosine Research in the Netherlands; a miniaturised startracker from Innovative Solutions In Space, also in the Netherlands; an inhouse ESA experiment to test components for radiation hardness; and an ADS-B antenna for aircraft tracking, developed from the GomSpace system tested on GomX-3.

    CubeSats are nanosatellites based on standardised 10×10 cm units. GomX-4B is a ‘6-unit’ CubeSat, double the size of its predecessor GomX-3, which was released from the International Space Station in 2015.

    Roger adds, “The two CubeSats will test intersatellite link technology, routing data from one satellite to the other, then down to the ground station. Part of the ground testing ensured they could indeed talk to each other and the actual ground station on an end-to-end basis.”

    Once released from the rocket, the CubeSats will first orient themselves to align their antennas. Then GomX-4B will gradually fly away from its counterpart, pausing at around 100 km intervals with their intersatellite links activated to see how well they work.

    GomX-4B
    Released 23/11/2017
    Copyright ESA/GomSpace
    ESA’s GomX-4B CubeSat will test intersatellite links and propulsive orbit control techniques for future constellation operations with twin GomX-4A, which is owned by the Danish Ministry of Defence under a separate contract.
    Both ‘6-unit’ CubeSats are being built and tested by Danish nanosatellite specialist GomSpace.
    A video shows a simulation of GomX-4B’s ‘launch and early operations phase’.
    This is the crucial stage when the nanosatellite is released from its launcher and initially tumbles through space – reproduced by hand here – before deploying its antennas to link up with controllers back on Earth.
    Another video shows GomX-4B’s vibration test, simulating the violent shaking of its rocket launch.

    Their separation will be controlled by new cold-gas propulsion on GomX-4B contributed by Sweden’s NanoSpace company, using highly miniaturised thrusters.

    They will maintain their links through flat, patch antennas and software-controlled radios at a maximum distance of some 4500 km – a limit set by the operating concept of a minimum of 10 satellites equally spaced around the same orbital plane to form a future constellation.
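
    A quick back-of-envelope suggests where that ~4500 km ceiling comes from. This is our own sketch, not ESA's link budget; the 500 km altitude is an assumed representative low-Earth-orbit value:

```python
import math

# Ten satellites equally spaced around one low-Earth-orbit plane:
# each must talk to its nearest neighbour one-tenth of the way
# around the orbit.
R_EARTH_KM = 6371
ALTITUDE_KM = 500          # assumed representative LEO altitude
N_SATS = 10

orbit_radius = R_EARTH_KM + ALTITUDE_KM
circumference = 2 * math.pi * orbit_radius
arc_between_neighbors = circumference / N_SATS

print(f"Orbit circumference: {circumference:.0f} km")
print(f"Arc between neighbours: {arc_between_neighbors:.0f} km")
```

    With these inputs the neighbour-to-neighbour arc comes out near 4300 km, the same ballpark as the quoted 4500 km design limit.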

    “As well as operating together, the two also have separate payloads,” says Roger. “GomX-4B is the first CubeSat to fly our new HyperScout hyperspectral imager, developed by cosine Research in the Netherlands through ESA’s General Support Technology Programme.

    “HyperScout images Earth in 45 different spectral bands, gathering a wealth of environmental data – so much so, in fact, that the camera must perform its own processing to drastically reduce the amount needing to be sent back to the ground.”

    GomX-4B also carries a new small startracker for precise attitude determination developed by Innovative Solutions in Space in the Netherlands, an ESA test payload checking components’ susceptibility to space radiation, and a dedicated radio receiver to detect signals from worldwide air traffic.

    Magnetic cleaning
    Released 23/11/2017
    Copyright ESA/GomSpace
    Technology CubeSat GomX-4B undergoing ‘degaussing’ – reducing the magnetic fields of its component parts by applying an opposite magnetic field to it – at ESA’s Mobile Coil Facility at its technical centre in the Netherlands in June 2017. Such a magnetic cleaning procedure is needed to optimise the performance of the ‘magnetotorquers’ the nanosatellite will use for attitude control, with electromagnets reacting to Earth’s magnetic field.

    “Now that the testing has concluded, our main job is to keep the satellites’ batteries topped off, ahead of their transport to China,” concludes Roger. “Once they arrive, they will be checked and the propellant tanks filled.”

    The pair is flying as secondary payloads with China’s Seismo-Electromagnetic Satellite, CSES-1, designed to detect precursor signals of earthquakes in Earth’s ionosphere, an electrically active outer layer of the atmosphere.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, with 22 member states as of 2017. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through participation in the International Space Station program; the launch and operation of unmanned exploration missions to other planets and the Moon; Earth observation; science; telecommunication; maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana; and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands; Earth observation missions at ESRIN in Frascati, Italy; ESA Mission Control (ESOC) is in Darmstadt, Germany; the European Astronaut Centre (EAC), which trains astronauts for future missions, is situated in Cologne, Germany; and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA50 Logo large

     
  • richardmitnick 10:02 am on November 23, 2017 Permalink | Reply
    Tags: Cosmology, ING - Isaac Newton Group of Telescopes, TDE - stellar tidal disruption event

    From ING: “Stars Regularly Ripped Apart by Black Holes in Colliding Galaxies” 

    Isaac Newton Group of Telescopes Logo
    Isaac Newton Group of Telescopes

    31 March, 2017 [Appeared now in RSS]
    Javier Méndez (Public Relations Officer)
    outreach@ing.iac.es

    Based on spectroscopic observations taken with the William Herschel Telescope (WHT) in 2015, astronomers from the Department of Physics and Astronomy at the University of Sheffield have found the first evidence for a stellar tidal disruption event (TDE) in a galaxy with a massive on-going starburst.

    Until now, TDEs — in which stars are ripped apart by supermassive black holes in the nuclei of galaxies — had been found in surveys of many thousands of galaxies, and the rate deduced for such events was low: one event every 10,000 to 100,000 years per galaxy. However, the Sheffield team detected a TDE in repeat WHT spectroscopic observations of a sample of just 15 ultra-luminous infrared galaxies (ULIRGs) over a period of only 10 years. Since ULIRGs represent the peaks of major galaxy mergers, this suggests that the rate of TDEs is substantially enhanced in mergers.
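
    A rough way to see why a single detection is so striking is to compare rates per galaxy-year. This is our own back-of-envelope from the numbers quoted above, not the team's statistical analysis:

```python
# 1 TDE seen in 15 ULIRGs monitored over ~10 years, versus the field
# rate of one event every 10,000 to 100,000 years per galaxy.
galaxy_years = 15 * 10                 # survey exposure
observed_rate = 1 / galaxy_years       # events per galaxy per year
field_rate_high = 1e-4                 # 1 per 10,000 yr per galaxy
field_rate_low = 1e-5                  # 1 per 100,000 yr per galaxy

enhancement_low = observed_rate / field_rate_high
enhancement_high = observed_rate / field_rate_low
print(f"Implied enhancement: ~{enhancement_low:.0f}x to ~{enhancement_high:.0f}x")
```

    Even one event in so small a sample implies a TDE rate enhanced by roughly two to three orders of magnitude over ordinary galaxies, which is the heart of the paper's claim.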

    Artist’s impression of the tidal disruption event in F01004-2237. Credits and copyright: Mark Garlick.

    The team first observed the 15 ULIRGs in the sample with WHT/ISIS in 2005, during a project to study merger-induced star formation. However, when they observed the sample again in 2015 — this time to study the outflows driven by the active galactic nuclei (AGN) triggered in the mergers — they noticed that the nuclear spectrum of one galaxy (F01004-2237) appeared strikingly different. In particular, the object showed unusually strong and broad helium emission lines.

    Comparison of the optical spectrum of F01004-2237 taken in September 2015 using the ISIS spectrograph on the WHT with that taken in September 2000 using the STIS spectrograph on the Hubble Space Telescope (HST). Note the detection of a broad component to the He II 4686Å emission line in 2015 that is not seen in the Hβ and Hγ Balmer lines. ING.

    NASA/ESA Hubble Telescope

    Alerted to the possibility of an unusual transient event, the team then searched the Catalina Sky Survey database for evidence of variability, and discovered that F01004-2237 (z=0.117835) underwent a spectacular flare in its optical V-band light in 2010.

    Catalina Sky Survey (CSS) light curves for F01004-2237 (solid blue points) and the other 14 ULIRGs in the spectroscopic sample (black dotted lines and red dashed line). Note that, whereas F01004-2237 has shown a substantial flare in its V-band brightness (ΔmV=0.45±0.02 mag) over the ~10 years of the survey, none of the other sources have shown similar flares. The data used are available from the CSS data release 2 website.
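
    For scale, the flare amplitude quoted in the caption converts to a flux ratio via the standard astronomical magnitude relation:

```python
# Convert the V-band flare amplitude to a brightness ratio using
# flux_ratio = 10^(0.4 * delta_m).
delta_m = 0.45                      # magnitudes, from the light curve
flux_ratio = 10 ** (0.4 * delta_m)
print(f"Flux ratio: {flux_ratio:.2f}")
```

    A 0.45-magnitude flare corresponds to the nucleus brightening by roughly 50% in the V band.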

    The particular combination of variability and post-flare spectrum observed in F01004-2237 is unlike any known supernova or AGN, but is characteristic of TDEs. According to Clive Tadhunter, who led the research: “The enhanced rate of TDEs in major galaxy mergers is likely to be due to a combination of the high stellar densities associated with the circum-nuclear starbursts, and the movement of the two supermassive black holes from the progenitor galaxies through these dense star fields as they merge together.”

    The study, published in the journal Nature Astronomy, was supported by a grant from the UK Science and Technology Facilities Council.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    Isaac Newton Group telescopes


    ING 4 meter William Herschel Telescope at Roque de los Muchachos Observatory on La Palma in the Canary Islands, 2,396 m (7,861 ft)


    ING Isaac Newton 2.5m telescope at Roque de los Muchachos Observatory on La Palma in the Canary Islands, Spain, Altitude 2,344 m (7,690 ft)

     
  • richardmitnick 8:54 pm on November 22, 2017 Permalink | Reply
    Tags: Cosmology, The theory of Panspermia

    From Universe Today: “Galactic Panspermia: Interstellar Dust Could Transport Life from Star to Star” 

    universe-today

    Universe Today

    22 Nov , 2017
    Matt Williams

    A new study from the University of Edinburgh suggests that life could be distributed throughout the cosmos by interstellar dust. Credit: ESO/R. Fosbury (ST-ECF)

    The theory of Panspermia states that life exists throughout the cosmos, and is distributed between planets, stars and even galaxies by asteroids, comets, meteors and planetoids. In this view, life began on Earth about 4 billion years ago after microorganisms hitching a ride on space rocks landed on the surface. Over the years, considerable research has been devoted to demonstrating that various aspects of this theory work.

    The latest comes from the University of Edinburgh, where Professor Arjun Berera offers another possible method for the transport of life-bearing molecules. According to his recent study, space dust that periodically comes into contact with Earth’s atmosphere could be what brought life to our world billions of years ago. If true, this same mechanism could be responsible for the distribution of life throughout the Universe.

    For the sake of his study, which was recently published in Astrobiology under the title Space Dust Collisions as a Planetary Escape Mechanism, Prof. Berera examined the possibility that space dust could facilitate the escape of particles from Earth’s atmosphere. These include molecules that indicate the presence of life on Earth (aka. biosignatures), but also microbial life and molecules that are essential to life.

    The theory of Panspermia states that life is distributed throughout the Universe by microbes traveling on objects between star systems. Credit: NASA/Jenny Mottor

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:38 pm on November 22, 2017 Permalink | Reply
    Tags: Cosmology, No Missing Satellites?

    From astrobites: “No Missing Satellites?” 

    Astrobites bloc

    astrobites

    Nov 22, 2017
    Gourav Khullar

    Title: There is no missing satellites problem
    Authors: S. Y. Kim, A. H. G. Peter, and J. R. Hargis
    First Author’s Institution: Dept. of Astronomy, The Ohio State University, USA

    Status: Submitted to The Astrophysical Journal Letters (ApJL) (open access)

    Dark Matter : Structure in the Universe

    It can be said with tremendous confidence that the Lambda Cold Dark Matter (LCDM) model of the universe is doing a fantastic job so far as THE model of the universe, explaining most of its phenomena – the cosmic microwave background (CMB), the formation and evolution of objects in the universe, cosmic accelerated expansion, the evolution of various particle species, and even gravitational waves!

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    CMB per ESA/Planck

    Gravitational waves. Credit: MPI for Gravitational Physics/Werner Benger

    Both simulations and observations that incorporate LCDM to make predictions have been consistent with each other. That being said, there are a few areas in the field of astrophysics where LCDM is falling a little short of convincing; this is a cause for worry.

    Structure formation has been an exciting field for the past few decades, keeping astrophysicists busy with the mysteries of how objects like galaxies and galaxy clusters evolve from the primordial perturbations in the universe. Simulations of large-scale structure from the early universe to now (e.g. Bolshoi, Millennium, Illustris), and large sky surveys that observe billions of objects (e.g. SDSS, DES), indicate the presence of massive dark matter halos whose gravitational potential wells attract matter to form galaxies and eventually galaxy clusters.

    SDSS Telescope at Apache Point Observatory, NM, USA, Altitude 2,788 meters (9,147 ft)


    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope which houses the DECam at Cerro Tololo, Chile, housing DECam at an altitude of 7200 feet

    Moreover, a natural product of this is the presence of low-mass, low-brightness dwarf galaxies on the peripheries of larger galaxies, known as satellites. These are galaxies that are influenced by the gravitational potential of their host galaxies, but are stable astrophysical entities in themselves. The paradigm described above and its products are the centerpiece of an ongoing debate (one of the major LCDM issues), called the ‘Missing Satellites Problem’.

    Figure 1. Dwarf/Satellite galaxies seen in (left) a simulation with a Milky Way sized halo, and (right) observations of the Milky Way from sky surveys. The circles on the left are the highest mass satellite halos seen in simulations, and there are far too many similar-sized halos in the simulation than in observations. This is the MSP (From Weinberg et al. 2013).

    What is the problem?

    The ‘Missing Satellites Problem’ (MSP) is the discrepancy between the number of satellite galaxies seen when a Milky Way-type galaxy is simulated and the number of dwarf galaxies observed around our own Milky Way – we aren’t seeing enough satellites around us. A Milky Way-type galaxy is made of ~10^10.5 solar masses of stars, ~10^11 solar masses of gas, and a ~10^11.5 solar mass dark matter halo that contains the Milky Way and its satellite dwarf galaxies. While we are still in the process of discovering faint dwarf galaxies, astrophysicists wonder if there are indeed enough satellites to confirm simulation predictions.
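
    Turning those quoted round numbers into plain figures (our own arithmetic, using the text's order-of-magnitude mass budget):

```python
# Milky Way mass budget as quoted above; masses in solar masses.
m_stars = 10 ** 10.5
m_gas = 10 ** 11
m_halo = 10 ** 11.5

m_total = m_stars + m_gas + m_halo
baryon_fraction = (m_stars + m_gas) / m_total
print(f"Dark matter outweighs stars by ~{m_halo / m_stars:.0f}x")
print(f"Baryon fraction of this budget: {baryon_fraction:.2f}")
```

    In this budget the dark matter halo outweighs the stars roughly tenfold, which is why satellite counts probe the halo's substructure rather than the visible galaxy.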

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    What do we do?

    Astrobites is a daily astrophysical literature journal written by graduate students in astronomy. Our goal is to present one interesting paper per day in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.
    Why read Astrobites?

    Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.
    Our goal is to solve this problem, one paper at a time. In 5 minutes a day reading Astrobites, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in a new area of astronomy.

     
  • richardmitnick 8:16 pm on November 22, 2017 Permalink | Reply
    Tags: Can You Overwater a Planet?, Cosmology

    From Many Worlds: “Can You Overwater a Planet?” 

    NASA NExSS bloc

    NASA NExSS

    Many Worlds icon

    Many Worlds

    Posted on 2017-11-22 by Marc Kaufman
    By guest columnist Elizabeth Tasker

    Wherever we find water on Earth, we find life. It is a connection that extends to the most inhospitable locations, such as the acidic pools of Yellowstone, the black smokers on the ocean floor or the cracks in frozen glaciers. This intimate relationship led to the NASA maxim, “Follow the Water”, when searching for life on other planets.

    Yet it turns out you can have too much of a good thing. In the November NExSS Habitable Worlds workshop in Wyoming, researchers discussed what would happen if you over-watered a planet. The conclusions were grim.

    Despite oceans covering over 70% of our planet’s surface, the Earth is relatively water-poor, with water only making up approximately 0.1% of the Earth’s mass. This deficit is due to our location in the Solar System, which was too warm to incorporate frozen ices into the forming Earth. Instead, it is widely — though not exclusively — theorized that the Earth formed dry and water was later delivered by impacts from icy meteorites. It is a theory that two asteroid missions, NASA’s OSIRIS-REx and JAXA’s Hayabusa2, will test when they reach their destinations next year.

    NASA OSIRIS-REx Spacecraft

    JAXA/Hayabusa 2

    But not all planets orbit where they were formed. Around other stars, planets frequently show evidence of having migrated to their present orbit from a birth location elsewhere in the planetary system.

    One example is the seven planets orbiting the star TRAPPIST-1.

    A size comparison of the planets of the TRAPPIST-1 system, lined up in order of increasing distance from their host star. The planetary surfaces are portrayed with an artist’s impression of their potential surface features, including water, ice, and atmospheres. NASA


    The TRAPPIST-1 star, an ultracool dwarf, is orbited by seven Earth-size planets (NASA).

    Discovered…


    ESO Belgian robotic Trappist-South National Telescope at Cerro La Silla, Chile, 600 km north of Santiago de Chile at an altitude of 2400 metres.

    …in February this year, these Earth-sized worlds orbit in resonance, meaning that their orbital periods are close to ratios of small whole numbers. Such a pattern is thought to occur in systems of planets that formed further away from the star and migrated inwards.
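
    That near-resonant pattern can be seen directly in the orbital periods. The values below are rounded from the February 2017 discovery announcement and are illustrative only, not the discovery team's analysis:

```python
from fractions import Fraction

# Approximate TRAPPIST-1 orbital periods in days (rounded published
# values; treat as illustrative).
periods = {"b": 1.51, "c": 2.42, "d": 4.05, "e": 6.10, "f": 9.21, "g": 12.35}

names = list(periods)
for inner, outer in zip(names, names[1:]):
    ratio = periods[outer] / periods[inner]
    approx = Fraction(ratio).limit_denominator(5)  # nearest simple fraction
    print(f"P_{outer}/P_{inner} = {ratio:.3f} ≈ {approx}")
```

    Each adjacent pair of periods sits within about a percent of a simple fraction such as 3/2, 4/3, or 8/5 – the fingerprint of inward migration.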

    The TRAPPIST-1 worlds currently orbit in a temperate region where the levels of radiation from the star are similar to that received by our terrestrial worlds. Three of the planets orbit in the star’s habitable zone, where a planet like the Earth is most likely to exist.

    However, if these planets were born further from the star, they may have formed with a high fraction of their mass in ices. As the planets migrated inwards to more clement orbits, this ice would have melted to produce a deep ocean. The result would be water worlds.

    With more water than the Earth, such planets are unlikely to have any exposed land. This does not initially sound like a problem; life thrives in the Earth’s seas, from photosynthesizing algae to the largest mammals on the planet. The problem occurs with the planet itself.

    The clement environment on the Earth’s surface is dependent on our atmosphere. If this envelope of gas were stripped away, the Earth’s average global temperature would be about -18°C (-0.4°F): too cold for liquid water. Instead, this envelope of gases results in a global average of 15°C (59°F).
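
    The -18°C figure is the standard radiative-equilibrium estimate, and it can be reproduced from textbook values (the solar constant and Bond albedo below are assumed reference numbers, not from the article):

```python
# Radiative equilibrium: absorbed sunlight balances thermal emission,
# T_eq = (S * (1 - A) / (4 * sigma)) ** 0.25
SOLAR_CONSTANT = 1361.0   # W/m^2 at Earth's orbit
ALBEDO = 0.30             # Earth's Bond albedo
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

t_eq_kelvin = (SOLAR_CONSTANT * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
t_eq_celsius = t_eq_kelvin - 273.15
print(f"Equilibrium temperature: {t_eq_celsius:.1f} C")
```

    These inputs give about -18.6°C, matching the airless-Earth figure quoted above; the ~33-degree gap up to the observed 15°C average is the greenhouse effect.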

    Exactly how much heat is trapped by our atmosphere depends on the quantity of greenhouse gases such as carbon dioxide. On geological timescales, the carbon dioxide levels can be adjusted by a geological process known as the “carbon-silicate cycle”.

    In this cycle, carbon dioxide in the air dissolves in rainwater where it splashes down on the Earth’s silicate rocks. The resulting reaction is termed weathering. Weathering forms carbonates and releases minerals from the rocks that wash into the oceans. Eventually, the carbon is released back into the air as carbon dioxide through volcanoes.

    Continents are key for habitability not only because they are sources of minerals and needed elements, but also because they allow for plate tectonics – the movement and subsequent cracking of the planet’s crust that allows gases to escape. Those gases are needed to produce an atmosphere. (National Oceanic and Atmospheric Administration)

    The rate of weathering is sensitive to temperature, slowing when the planet is cool and increasing when the temperature rises. This allows the Earth to maintain an agreeable climate for life during small variations in our orbit due to the tug of our neighboring planets, or when the Sun was young and cooler. The minerals released by weathering are used by all life on Earth, in particular phosphorus, which forms part of our DNA.
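
    The thermostat behaviour described above can be sketched as a toy feedback loop. This is entirely schematic – the constants below are invented for illustration and carry no geophysical meaning:

```python
# Toy carbon-silicate thermostat: weathering (the CO2 sink) speeds up
# when the planet is warm and slows when it is cool, so the surface
# temperature relaxes back toward a set point despite external forcing.
def simulate(outgassing=1.0, forcing=0.0, steps=20000, dt=0.01):
    co2 = 1.0          # CO2 inventory in arbitrary units
    temp = 15.0        # surface temperature, C
    for _ in range(steps):
        temp = 15.0 + 5.0 * (co2 - 1.0) + forcing      # greenhouse warming
        weathering = outgassing * 2 ** ((temp - 15.0) / 10.0)  # T-sensitive sink
        co2 += dt * (outgassing - weathering)          # source minus sink
    return temp

# An external warming nudge (+2 C) is cancelled by extra weathering:
print(simulate(forcing=0.0))   # settles near 15 C
print(simulate(forcing=2.0))   # also settles near 15 C, with less CO2
```

    The steady state always sits where weathering balances volcanic outgassing, so the temperature returns to the set point and the CO2 level, not the climate, absorbs the forcing – the feedback the article describes.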

    However, this process requires land. And that is a commodity a water world lacks. Speaking at the Habitable Worlds workshop, Theresa Fisher, a graduate student at Arizona State University, warned against the effects of submerging your continents.

    Fisher considered the consequences of adding roughly five oceans of water to an Earth-sized planet, covering all land in a global sea. Regulation might still be feasible, because weathering could still occur with rock on the ocean floor, though at a much reduced efficiency. The planet might then be able to regulate carbon dioxide levels, but the large reduction in freed minerals with underwater weathering would be devastating for life.

    Despite being a key element for all life on Earth, phosphorus is not abundant on our planet. The low levels are why phosphorus is the main ingredient in fertilizer. Reduce the efficiency with which phosphorus is freed from rocks, and life will plummet.

    Such a situation is a big problem for finding a habitable world, warns Steven Desch, a professor at Arizona State University. Unless life is capable of strongly influencing the composition of the atmosphere, its presence will remain impossible to detect from Earth.

    “You need to have land not to have life, but to be able to detect life,” Desch concludes.

    However, considerations of detectability become irrelevant if even more water is added to the planet. Should an Earth-sized planet have fifty oceans of water (roughly 1% of the planet’s mass), the added weight will cause high-pressure ices to form on the ocean floor. A layer of thick ice would seal the planet’s rock away from the ocean and atmosphere, shutting down the carbon-silicate cycle. The planet would be unable to regulate its surface temperature, and trapped minerals would be inaccessible for life.
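
    As a quick check of the "fifty oceans ≈ 1% of the planet's mass" figure, using standard reference values for the ocean and Earth masses (our arithmetic, not from the article):

```python
# Reference values: Earth's oceans ~1.4e21 kg, Earth ~5.97e24 kg.
OCEAN_MASS_KG = 1.4e21
EARTH_MASS_KG = 5.97e24

fraction = 50 * OCEAN_MASS_KG / EARTH_MASS_KG
print(f"Fifty oceans = {100 * fraction:.1f}% of an Earth mass")
```

    Fifty ocean masses come to about 1.2% of an Earth mass, consistent with the "roughly 1%" quoted; the 2% stagnant-lid threshold mentioned below corresponds to roughly a hundred oceans.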

    Add still more water and Cayman Unterborn, a postdoctoral fellow at Arizona State, warns that the pressure will seal the planet’s lid. The Earth’s surface is divided into plates that are in continual motion. The plates melt as they slide under one another and fresh crust is formed where the plates pull apart. When the ocean weight reaches 2% of the planet’s mass, melting is suppressed and the planet’s crust grinds to a halt.

    A stagnant lid would prevent any gases trapped in the rocks during the planet’s formation from escaping. Such “degassing” is the main source of atmosphere for a rocky planet. Without it, the Earth-sized deep water world could cling only to an envelope of water vapor and any gas that may have escaped before the crust sealed shut.

    Unterborn’s calculations suggest that this fate awaits the TRAPPIST-1 planets, with the outer worlds plausibly having hundreds of oceans worth of water pressing down on the planet.

    So can we prove if TRAPPIST-1 and similarly migrated worlds are drowning in a watery grave? Aki Roberge, an astrophysicist at NASA Goddard Space Flight Center, notes that exoplanets are currently seen only as “dark shadows” briefly reducing their star’s light.

    However, the next generation of telescopes such as NASA’s James Webb Space Telescope, will aim to change this with observations of planetary atmospheres.

    NASA/ESA/CSA Webb Telescope annotated

    Intertwined with the planet’s geological and biological processes, this cloak of gases may reveal if the world is living or dead.

    Elizabeth Tasker is a planetary scientist and communicator at the Japanese space agency JAXA and the Earth-Life Science Institute (ELSI) in Tokyo. She is also author of a new book about planet formation titled The Planet Factory.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Many Worlds

    There are many worlds out there waiting to fire your imagination.

    Marc Kaufman is an experienced journalist, having spent three decades at The Washington Post and The Philadelphia Inquirer, and is the author of two books on searching for life and planetary habitability. While the “Many Worlds” column is supported by the Lunar Planetary Institute/USRA and informed by NASA’s NExSS initiative, any opinions expressed are the author’s alone.

    This site is for everyone interested in the burgeoning field of exoplanet detection and research, from the general public to scientists in the field. It will present columns, news stories and in-depth features, as well as the work of guest writers.

    About NExSS

    The Nexus for Exoplanet System Science (NExSS) is a NASA research coordination network dedicated to the study of planetary habitability. The goals of NExSS are to investigate the diversity of exoplanets and to learn how their history, geology, and climate interact to create the conditions for life. NExSS investigators also strive to put planets into an architectural context — as solar systems built over the eons through dynamical processes and sculpted by stars. Based on our understanding of our own solar system and habitable planet Earth, researchers in the network aim to identify where habitable niches are most likely to occur and which planets are most likely to be habitable. Leveraging current NASA investments in research and missions, NExSS will accelerate the discovery and characterization of other potentially life-bearing worlds in the galaxy, using a systems science approach.
    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System; advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program; exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons; and researching astrophysics topics, such as the Big Bang, through the Great Observatories (Hubble, Chandra, Spitzer) and associated programs. NASA shares data with various national and international organizations, such as data from JAXA’s Greenhouse Gases Observing Satellite.

     
  • richardmitnick 5:33 pm on November 22, 2017 Permalink | Reply
    Tags: Cosmology, Neutrino astronomy

    From LBNL: “How the Earth Stops High-Energy Neutrinos in Their Tracks” 

    Berkeley Logo

    Berkeley Lab

    November 22, 2017
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 520-0843

    Efforts of Berkeley Lab scientists are key in new analysis of data from Antarctic experiment.

    Illustration of how a muon interacts in the IceCube detector array. (Credit: IceCube Collaboration)


    IceCube has measured for the first time the probability that neutrinos are absorbed by Earth as a function of their energy and the amount of matter that they go through. This measurement of the neutrino cross section using Earth absorption has confirmed predictions from the Standard Model to energies up to 980 TeV. A detailed understanding of how high-energy neutrinos interact with Earth’s matter will allow scientists to use these particles to investigate the composition of Earth’s core and mantle. (Credit: IceCube Collaboration)


    U Wisconsin ICECUBE neutrino detector at the South Pole

    Neutrinos are abundant subatomic particles that are famous for passing through anything and everything, only very rarely interacting with matter. About 100 trillion neutrinos pass through your body every second.

    Now, scientists have demonstrated that the Earth stops energetic neutrinos—they do not go through everything. These high-energy neutrino interactions were seen by the IceCube detector, an array of 5,160 basketball-sized optical sensors deeply encased within a cubic kilometer of very clear Antarctic ice near the South Pole.

    IceCube’s sensors do not directly observe neutrinos, but instead measure flashes of blue light, known as Cherenkov radiation, emitted by muons and other fast-moving charged particles, which are created when neutrinos interact with the ice, and by the charged particles produced when the muons interact as they move through the ice. By measuring the light patterns from these interactions in or near the detector array, IceCube can estimate the neutrinos’ directions and energies.
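    The Cherenkov light that IceCube records is emitted at a fixed angle set by the refractive index of the ice. A minimal sketch of that geometry follows; the refractive index value is a typical figure for deep glacial ice, not a number taken from the article:

```python
import math

def cherenkov_angle_deg(n, beta=1.0):
    """Cherenkov emission angle from cos(theta) = 1 / (n * beta),
    valid only for a particle faster than light in the medium (n * beta > 1)."""
    if n * beta <= 1.0:
        raise ValueError("no Cherenkov emission: particle below threshold")
    return math.degrees(math.acos(1.0 / (n * beta)))

# Deep ice has an optical refractive index of roughly 1.31 (assumed value);
# a TeV-scale muon travels at effectively beta ~ 1.
print(f"Cherenkov angle in ice: {cherenkov_angle_deg(1.31):.1f} degrees")  # ~40 degrees
```

The fixed emission angle is what lets IceCube reconstruct a muon’s direction from the pattern of light arrival times across its sensors.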

    The study, published in the Nov. 22 issue of the journal Nature, was led by researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley.

    Spencer Klein, who leads Berkeley Lab’s IceCube research team, commented, “This analysis is important because it shows that IceCube can make real contributions to particle and nuclear physics, at energies above the reach of current accelerators.”

    Sandra Miarecki, who performed much of the data analysis while working toward her PhD as an IceCube researcher at Berkeley Lab and UC Berkeley, said, “It’s a multidisciplinary idea.” The analysis required input from geologists who have created models of the Earth’s interior from seismic studies. Physicists have used these models to help predict how neutrinos are absorbed in the Earth.

    “You create ‘pretend’ muons that simulate the response of the sensors,” Miarecki said. “You have to simulate their behavior, there has to be an ice model to simulate the ice’s behavior, you also have to have cosmic ray simulations, and you have to simulate the Earth using equations. Then you have to predict, probability-wise, how often a particular muon would come through the Earth.”

    The study’s results are based on one year of data from about 10,800 neutrino-related interactions, stemming from a natural supply of very energetic neutrinos from space that go through a thick and dense absorber: the Earth. The energy of the neutrinos was critical to the study, as higher energy neutrinos are more likely to interact with matter and be absorbed by the Earth.

    Scientists found that fewer energetic neutrinos made it all the way through the Earth to the IceCube detector than arrived along less obstructed, near-horizontal trajectories. The probability of neutrinos being absorbed by the Earth was consistent with expectations from the Standard Model of particle physics, which scientists use to explain the fundamental forces and particles in the universe. This probability—that neutrinos of a given energy will interact with matter—is what physicists refer to as a “cross section.”
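    The idea of measuring a cross section from Earth absorption boils down to a one-line attenuation law: the surviving fraction falls exponentially with the amount of matter crossed. The column depth and cross-section values below are rough illustrative numbers from a simple density model, not figures from the study:

```python
import math

AVOGADRO = 6.022e23  # nucleons per gram, to good approximation

def transmission(sigma_cm2, column_depth_g_cm2):
    """Fraction of neutrinos surviving a path with the given column depth,
    assuming absorption with per-nucleon cross section sigma."""
    optical_depth = sigma_cm2 * AVOGADRO * column_depth_g_cm2
    return math.exp(-optical_depth)

# Roughly 1.1e10 g/cm^2 of matter lies along a diameter-crossing path
# through Earth's core (illustrative value, assumed).
DIAMETER_COLUMN = 1.1e10

# Low-energy neutrinos (tiny sigma) sail through; near sigma ~ 1.5e-34 cm^2,
# reached in the tens-of-TeV range, the Earth becomes opaque along that path.
print(transmission(1e-36, DIAMETER_COLUMN))    # ~0.99: nearly transparent
print(transmission(1.5e-34, DIAMETER_COLUMN))  # ~0.37: optical depth ~ 1
```

Comparing the observed deficit of upward-going neutrinos against this kind of model, energy bin by energy bin, is what yields the cross-section measurement.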

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    “Understanding how neutrinos interact is key to the operation of IceCube,” explained Francis Halzen, principal investigator for the IceCube Neutrino Observatory and a University of Wisconsin–Madison professor of physics. “Precision measurements at the HERA accelerator in Hamburg, Germany, allow us to compute the neutrino cross section with great accuracy within the Standard Model—which would apply to IceCube neutrinos of much higher energies if the Standard Model is valid at these energies.”


    “We were of course hoping for some new physics to appear, but we unfortunately find that the Standard Model, as usual, withstands the test,” added Halzen.

    James Whitmore, program director in the National Science Foundation’s physics division, said, “IceCube was built to both explore the frontiers of physics and, in doing so, possibly challenge existing perceptions of the nature of the universe. This new finding and others yet to come are in that spirit of scientific discovery.”

    In this study, researchers measured the flux of muon neutrinos as a function of their energy and their incoming direction. Neutrinos with higher energies and with incoming directions closer to the North Pole are more likely to interact with matter on their way through Earth. (Credit: IceCube Collaboration)

    This study provides the first cross-section measurements for a neutrino energy range that is up to 1,000 times higher than previous measurements at particle accelerators. Most of the neutrinos selected for this study were more than a million times more energetic than the neutrinos produced by more familiar sources, like the sun or nuclear power plants. Researchers took care to ensure that the measurements were not distorted by detector problems or other uncertainties.

    “Neutrinos have quite a well-earned reputation of surprising us with their behavior,” said Darren Grant, spokesperson for the IceCube Collaboration and a professor of physics at the University of Alberta in Canada. “It is incredibly exciting to see this first measurement and the potential it holds for future precision tests.”

    In addition to providing the first measurement of the Earth’s absorption of neutrinos, the analysis shows that IceCube’s scientific reach is extending beyond its core focus on particle physics discoveries and the emerging field of neutrino astronomy into the fields of planetary science and nuclear physics. This analysis will also interest geophysicists who would like to use neutrinos to image the Earth’s interior, although this will require more data than was used in the current study.

    The neutrinos used in this analysis were mostly produced when hydrogen or heavier nuclei from high-energy cosmic rays, created outside the solar system, interacted with nitrogen or oxygen nuclei in the Earth’s atmosphere. This creates a cascade of particles, including several types of subatomic particles that decay, producing neutrinos. These particles rain down on the Earth’s surface from all directions.

    The analysis also included a small number of astrophysical neutrinos, which are produced outside of the Earth’s atmosphere, from cosmic accelerators unidentified to date, perhaps associated with supermassive black holes.

    The neutrino-interaction events that were selected for the study have energies of at least one trillion electron volts, or a teraelectronvolt (TeV), roughly the kinetic energy of a flying mosquito. At this energy, the Earth’s absorption of neutrinos is relatively small, and the lowest energy neutrinos in the study largely served as an absorption-free baseline. The analysis was sensitive to absorption in the energy range from 6.3 TeV to 980 TeV, limited at the high-energy end by a shortage of sufficiently energetic neutrinos.
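    The mosquito comparison is easy to sanity-check with a line of arithmetic; the mosquito’s mass and speed below are illustrative assumptions, not figures from the article:

```python
# Is 1 TeV really about the kinetic energy of a flying mosquito?
TEV_IN_JOULES = 1e12 * 1.602e-19   # 1 TeV expressed in joules (~1.6e-7 J)

# Illustrative mosquito: ~2.5 mg flying at ~0.4 m/s (assumed values)
mass_kg = 2.5e-6
speed_m_s = 0.4
kinetic_energy = 0.5 * mass_kg * speed_m_s**2  # ~2.0e-7 J

print(f"1 TeV       = {TEV_IN_JOULES:.2e} J")
print(f"mosquito KE = {kinetic_energy:.2e} J")
```

The two numbers land within a factor of about two of each other, which is all the comparison claims.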

    At these energies, each individual proton or neutron in a nucleus acts independently, so the absorption depends on the number of protons or neutrons that each neutrino encounters. The Earth’s core is particularly dense, so absorption is largest there. By comparison, the most energetic neutrinos that have been studied at human-built particle accelerators were at energies below 0.4 TeV. Researchers have used these accelerators to aim beams containing an enormous number of these lower energy neutrinos at massive detectors, but only a very tiny fraction yield interactions.

    IceCube researchers used data collected from May 2010 to May 2011, from a partial array of 79 “strings,” each containing 60 sensors embedded more than a mile deep in the ice.

    Gary Binder, a UC Berkeley graduate student affiliated with Berkeley Lab’s Nuclear Science Division, developed the software that was used to fit IceCube’s data to a model describing how neutrinos propagate through the Earth.

    From this, the software determined the cross section that best fit the data. University of Wisconsin–Madison student Chris Weaver developed the code for selecting the detection events that Miarecki used.

    Simulations to support the analysis have been conducted using supercomputers at the University of Wisconsin–Madison and at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC).

    NERSC Cray Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer



    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Physicists now hope to repeat the study using an expanded, multiyear analysis of data from the full 86-string IceCube array, which was completed in December 2010, and to look at higher ranges of neutrino energies for any hints of new physics beyond the Standard Model.

    IceCube Gen-2 DeepCore


    IceCube Gen-2 DeepCore PINGU

    IceCube has already detected multiple ultra-high-energy neutrinos, in the range of petaelectronvolts (PeV), which have a 1,000-times-higher energy than those detected in the TeV range.

    Klein said, “Once we can reduce the uncertainties and can look at slightly higher energies, we can look at things like nuclear effects in the Earth, and collective electromagnetic effects.”

    Binder added, “We can also study how much energy a neutrino transfers to a nucleus when it interacts, giving us another probe of nuclear structure and physics beyond the Standard Model.”

    A longer term goal is to build a larger detector, which would enable scientists to study neutrinos of even higher energies. The proposed IceCube-Gen2 would be 10 times larger than IceCube. Its larger size would enable the detector to collect more data from neutrinos at very high energies.

    Some scientists are looking to build an even larger detector, 100 cubic kilometers or more, using a new approach that searches for pulses of radio waves produced when very high energy neutrinos interact in the ice. Measurements of neutrino absorption by a radio-based detector could be used to search for new phenomena that go well beyond the physics accounted for in the Standard Model and could scrutinize the structure of atomic nuclei in greater detail than those of other experiments.

    Miarecki said, “This is pretty exciting – I couldn’t have thought of a more interesting project.”

    Berkeley Lab’s National Energy Research Scientific Computing Center is a DOE Office of Science User Facility.

    The work was supported by the U.S. National Science Foundation-Office of Polar Programs, U.S. National Science Foundation-Physics Division, University of Wisconsin Alumni Research Foundation, Grid Laboratory of Wisconsin (GLOW) grid infrastructure at the University of Wisconsin–Madison, Open Science Grid (OSG) grid infrastructure, National Energy Research Scientific Computing Center, Louisiana Optical Network Initiative (LONI) grid computing resources, U.S. Department of Energy Office of Nuclear Physics, and United States Air Force Academy; Natural Sciences and Engineering Research Council of Canada, WestGrid and Compute/Calcul Canada; Swedish Research Council, Swedish Polar Research Secretariat, Swedish National Infrastructure for Computing (SNIC), and Knut and Alice Wallenberg Foundation, Sweden; German Ministry for Education and Research (BMBF), Deutsche Forschungsgemeinschaft (DFG), Helmholtz Alliance for Astroparticle Physics (HAP), Initiative and Networking Fund of the Helmholtz Association, Germany; Fund for Scientific Research (FNRS-FWO), FWO Odysseus programme, Flanders Institute to encourage scientific and technological research in industry (IWT), Belgian Federal Science Policy Office (Belspo); Marsden Fund, New Zealand; Australian Research Council; Japan Society for Promotion of Science (JSPS); the Swiss National Science Foundation (SNSF), Switzerland; National Research Foundation of Korea (NRF); Villum Fonden, Danish National Research Foundation (DNRF), Denmark.

    The IceCube Neutrino Observatory was built under a National Science Foundation (NSF) Major Research Equipment and Facilities Construction grant, with assistance from partner funding agencies around the world. The NSF Office of Polar Programs and NSF Physics Division support the project with a Maintenance and Operations (M&O) grant. The University of Wisconsin–Madison is the lead institution for the IceCube Collaboration, coordinating data-taking and M&O activities.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 10:56 am on November 22, 2017 Permalink | Reply
    Tags: Comet 45P/Honda-Mrkos-Pajdušáková, Cosmology

    From Goddard: “NASA Telescope Studies Quirky Comet 45P” 

    NASA Goddard Banner
    NASA Goddard Space Flight Center

    Nov. 21, 2017
    Elizabeth Zubritsky
    elizabeth.a.zubritsky@nasa.gov
    NASA’s Goddard Space Flight Center in Greenbelt, Md.

    When comet 45P zipped past Earth early in 2017, researchers observing from NASA’s Infrared Telescope Facility, or IRTF, in Hawai’i gave the long-time trekker a thorough astronomical checkup. The results help fill in crucial details about ices in Jupiter-family comets and reveal that quirky 45P doesn’t quite match any comet studied so far.

    Like a doctor recording vital signs, the team measured the levels of nine gases released from the icy nucleus into the comet’s thin atmosphere, or coma. Several of these gases supply building blocks for amino acids, sugars and other biologically relevant molecules. Of particular interest were carbon monoxide and methane, which are so hard to detect in Jupiter-family comets that they’ve only been studied a few times before.

    Comet 45P/Honda-Mrkos-Pajdušáková is captured using a telescope on December 22 from Farm Tivoli in Namibia, Africa.
    Credits: Gerald Rhemann

    The gases all originate from the hodgepodge of ices, rock and dust that make up the nucleus. These native ices are thought to hold clues to the comet’s history and how it has been aging.

    “Comets retain a record of conditions from the early solar system, but astronomers think some comets might preserve that history more completely than others,” said Michael DiSanti, an astronomer at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, and lead author of the new study in The Astronomical Journal.

    The comet—officially named 45P/Honda-Mrkos-Pajdušáková—belongs to the Jupiter family of comets, frequent orbiters that loop around the Sun about every five to seven years. Much less is known about native ices in this group than in the long-haul comets from the Oort Cloud.

    To identify native ices, astronomers look for chemical fingerprints in the infrared part of the spectrum, beyond visible light. DiSanti and colleagues conducted their studies using the iSHELL high-resolution spectrograph recently installed at IRTF on the summit of Maunakea.

    NASA Infrared Telescope facility Mauna Kea, Hawaii, USA, 4,207 m (13,802 ft) above sea level

    With iSHELL, researchers can observe many comets that used to be considered too faint.

    The spectral range of the instrument makes it possible to detect many vaporized ices at once, which reduces the uncertainty when comparing the amounts of different ices. The instrument covers wavelengths starting at 1.1 micrometers in the near-infrared (the range of night-vision goggles) up to 5.3 micrometers in the mid-infrared region.

    iSHELL also has high enough resolving power to separate infrared fingerprints that fall close together in wavelength. This is particularly necessary in the cases of carbon monoxide and methane, because their fingerprints in comets tend to overlap with the same molecules in Earth’s atmosphere.

    “The combination of iSHELL’s high resolution and the ability to observe in the daytime at IRTF is ideal for studying comets, especially short-period comets,” said John Rayner, director of the IRTF, which is managed for NASA by the University of Hawai’i.

    While observing for two days in early January 2017—shortly after 45P’s closest approach to the Sun—the team made robust measurements of water, carbon monoxide, methane and six other native ices. For five ices, including carbon monoxide and methane, the researchers compared levels on the sun-drenched side of the comet to the shaded side. The findings helped fill in some gaps but also raised new questions.

    The results reveal that 45P is running so low on frozen carbon monoxide that it is officially considered depleted. By itself, this wouldn’t be too surprising, because carbon monoxide escapes into space easily when the Sun warms a comet. But methane is almost as likely to escape, so an object lacking carbon monoxide should have little methane. 45P, however, is rich in methane and is one of the rare comets that contains more methane than carbon monoxide ice.

    It’s possible that the methane is trapped inside other ice, making it more likely to stick around. But the researchers think the carbon monoxide might have reacted with hydrogen to form methanol. The team found that 45P has a larger-than-average share of frozen methanol.

    When this reaction took place is another question—one that gets to the heart of comet science. If the methanol was produced on grains of primordial ice before 45P formed, then the comet has always been this way. On the other hand, the levels of carbon monoxide and methanol in the coma might have changed over time, especially because Jupiter-family comets spend more time near the Sun than Oort Cloud comets do.

    “Comet scientists are like archaeologists, studying old samples to understand the past,” said Boncho Bonev, an astronomer at American University and the second author on the paper. “We want to distinguish comets as they formed from the processing they might have experienced, like separating historical relics from later contamination.”

    The team is now on the case to figure out how typical their results might be among similar comets. 45P was the first of five such short-period comets that are available for study in 2017 and 2018. On the heels of 45P were comets 2P/Encke and 41P/Tuttle-Giacobini-Kresak. Due next summer and fall is 21P/Giacobini–Zinner, and later will come 46P/Wirtanen, which is expected to remain within 10 million miles (16 million kilometers) of Earth throughout most of December 2018.

    “This research is groundbreaking,” said Faith Vilas, the solar and planetary research program director at the National Science Foundation, or NSF, which helped support the study. “This broadens our knowledge of the mix of molecular species coexisting in the nuclei of Jovian-family comets, and the differences that exist after many trips around the Sun.”

    “We’re excited to see this first publication from iSHELL, which was built through a partnership between NSF, the University of Hawai’i, and NASA,” said Kelly Fast, IRTF program scientist at NASA Headquarters. “This is just the first of many iSHELL results to come.”

    More information about NASA’s IRTF:
    http://irtfweb.ifa.hawaii.edu/

    More information about comets:
    http://www.nasa.gov/comets

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA’s Goddard Space Flight Center is home to the nation’s largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

    Named for American rocketry pioneer Dr. Robert H. Goddard, the center was established in 1959 as NASA’s first space flight complex. Goddard and its several facilities are critical in carrying out NASA’s missions of space exploration and scientific discovery.


    NASA/Goddard Campus

     
  • richardmitnick 10:50 am on November 22, 2017 Permalink | Reply
    Tags: Cosmology, JAXA MMX spacecraft, Seeing red: JHU's Applied Physics Lab will build 'eyeglasses' for Mars moon mission

    From JHU Applied Physics Lab: “Seeing red: JHU’s Applied Physics Lab will build ‘eyeglasses’ for Mars moon mission” 

    Johns Hopkins
    Johns Hopkins University

    Johns Hopkins Applied Physics Lab bloc
    JHU Applied Physics Lab

    Nov 17, 2017
    Michael Buckley

    Martian moon Deimos with the red planet Mars in the background. Image credit: Getty Images

    2024 launch planned for Japan Aerospace Exploration Agency mission.

    Scientists at the Johns Hopkins Applied Physics Laboratory have been tasked with building a pair of space-ready spectacles for a Japan-led mission to two moons of Mars.

    The instrument, a sophisticated gamma-ray and neutron spectrometer named MEGANE—pronounced meh-gah-nay, meaning eyeglasses in Japanese—will help scientists resolve one of the most enduring mysteries of the Red Planet: when and how the small moons formed.

    Planned for launch in 2024, the Martian Moons eXploration (MMX) mission being developed by the Japan Aerospace Exploration Agency will visit the Martian moons Phobos and Deimos, land on the surface of Phobos, collect a surface sample, and then return that sample to Earth. NASA is supporting the development of one of the spacecraft’s seven science instruments.

    “Solving the riddle of how Mars’ moons came to be will help us better understand how planets formed around our sun and, in turn, around other stars,” said Thomas Zurbuchen, associate administrator for NASA’s Science Mission Directorate at headquarters in Washington, D.C. “International partnerships like this provide high-quality science with high-impact return.”

    Artist’s concept of the Japan Aerospace Exploration Agency’s MMX mission to explore the two Martian moons, Phobos and Deimos; the inset shows the gamma-ray and neutron spectrometer—to be built by APL—that will measure the moons’ surface elemental composition.
    Image credit: APL/JAXA


    JAXA MMX spacecraft

    APL space scientist David Lawrence will lead the team developing MEGANE, also an acronym for Mars-moon Exploration with GAmma rays and NEutrons. The instrument will give the mission team the ability to “see” the elemental composition of Phobos and Deimos by measuring naturally emitted gamma rays and neutrons from the Martian moons. These gamma rays and neutrons are generated by cosmic rays that continually strike and penetrate their surfaces.

    The measurements will help scientists determine whether the Martian moons are captured asteroids or the result of a larger body hitting Mars. MEGANE data will also support selection of the site where MMX will gather the samples to be returned to Earth, and provide critical context as scientists study those samples.

    “Understanding how Phobos and Deimos formed has been a goal of the planetary science community for many years,” Lawrence said.

    APL has built 69 spacecraft and more than 200 specialized instruments that have collected critical scientific data from the sun to Pluto and beyond. The lab’s most recent Mars instrument—the powerful Compact Reconnaissance Imaging Spectrometer for Mars, or CRISM, aboard NASA’s Mars Reconnaissance Orbiter—uncovered a wide range of chemical evidence indicating where and when water was present on the Red Planet.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Johns Hopkins Applied Physics Lab Campus

    Founded on March 10, 1942—just three months after the United States entered World War II—APL was created as part of a federal government effort to mobilize scientific resources to address wartime challenges.

    APL was assigned the task of finding a more effective way for ships to defend themselves against enemy air attacks. The Laboratory designed, built, and tested a radar proximity fuze (known as the VT fuze) that significantly increased the effectiveness of anti-aircraft shells in the Pacific—and, later, ground artillery during the invasion of Europe. The product of the Laboratory’s intense development effort was later judged to be, along with the atomic bomb and radar, one of the three most valuable technology developments of the war.

    On the basis of that successful collaboration, the government, The Johns Hopkins University, and APL made a commitment to continue their strategic relationship. The Laboratory rapidly became a major contributor to advances in guided missiles and submarine technologies. Today, more than seven decades later, the Laboratory’s numerous and diverse achievements continue to strengthen our nation.

    APL continues to relentlessly pursue the mission it has followed since its first day: to make critical contributions to critical challenges for our nation.

    Johns Hopkins Campus

    The Johns Hopkins University opened in 1876, with the inauguration of its first president, Daniel Coit Gilman. “What are we aiming at?” Gilman asked in his installation address. “The encouragement of research … and the advancement of individual scholars, who by their excellence will advance the sciences they pursue, and the society where they dwell.”

    The mission laid out by Gilman remains the university’s mission today, summed up in a simple but powerful restatement of Gilman’s own words: “Knowledge for the world.”

    What Gilman created was a research university, dedicated to advancing both students’ knowledge and the state of human knowledge through research and scholarship. Gilman believed that teaching and research are interdependent, that success in one depends on success in the other. A modern university, he believed, must do both well. The realization of Gilman’s philosophy at Johns Hopkins, and at other institutions that later attracted Johns Hopkins-trained scholars, revolutionized higher education in America, leading to the research university system as it exists today.

     
  • richardmitnick 9:25 am on November 22, 2017 Permalink | Reply
    Tags: Cosmology, Preparing to Light Up the LSST Network

    From LSST: “Preparing to Light Up the LSST Network” 

    LSST

    Large Synoptic Survey Telescope

    November 16, 2017
    No writer credit found

    November 12, 2017 – LSST’s fiber-optic network, which will provide the 100 Gbps connectivity needed to move data from the summit of Cerro Pachón to all LSST operational sites and to multiple data centers, came one milestone closer to activation last week: the AURA LSST Dense Wavelength Division Multiplexing (DWDM) network equipment that LSST will use initially was installed in several key locations. DWDM equipment sends pulses of light down the fiber to transmit data, so a DWDM box is needed at each end of a fiber link for the network to be operational. In this installation project, the Summit–Base network DWDM equipment was set up in the La Serena computer room and in the communications hut on the summit of Cerro Pachón. The Santiago portion of the Base–Archive network was also addressed, with DWDM hardware installed in La Serena as well as at the National University Network (REUNA) facility in Santiago. The DWDM hardware in Santiago will be connected to AmLight DWDM equipment, which will transfer the data to Florida. There, it will be picked up by Florida LambdaRail (FLR), ESnet, and Internet2 for its journey to NCSA via Chicago.

    The primary South to North network traffic will be the transfer of raw image data from Cerro Pachón to the National Center for Supercomputing Applications (NCSA), where the data will be processed into scientific data products, including transient alerts, calibrated images, and catalogs. From there, a backup of the raw data will be made over the international network to IN2P3 in Lyon, France. IN2P3 will also perform half of the annual catalog processing. The network will also transfer data from North to South, returning the processed scientific data products to the Chilean Data Access Center (DAC), where they will be made available to the Chilean scientific community.

    The LSST Summit-Base and Base-Archive networks are on new fibers all the way to Santiago; there is also an existing fiber that provides a backup path from La Serena to Santiago. From Santiago to Florida, the data will travel on a new submarine fiber cable, with a backup on existing fiber cables. LSST currently shares the AURA fiber-optic network (connecting La Serena and the Summit) with the Gemini and CTIO telescopes, but will have its own dedicated DWDM equipment in 2018. Additional information on LSST data flow during LSST Operations is available here.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.
    LSST Interior

    LSST/Camera, built at SLAC

    The LSST is a new kind of telescope. Currently under construction in Chile, it is being built to rapidly survey the night-time sky. Compact and nimble, the LSST will move quickly between images, yet its large mirror and large field of view—almost 10 square degrees of sky, or 40 times the size of the full moon—work together to deliver more light from faint astronomical objects than any optical telescope in the world.

    From its mountaintop site in the foothills of the Andes, the LSST will take more than 800 panoramic images each night with its 3.2 billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1000 times during the survey. With a light-gathering power equal to a 6.7-m diameter primary mirror, each of its 30-second observations will be able to detect objects 10 million times fainter than those visible to the human eye. A powerful data system will compare new with previous images to detect changes in brightness and position of objects as big as far-distant galaxy clusters and as small as nearby asteroids.
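    The nightly data volume implied by the 3.2-gigapixel camera, the roughly 800 images per night, and the 100 Gbps link described earlier is easy to estimate. The bytes-per-pixel figure below is an assumption for illustration, not a published LSST specification:

```python
# Back-of-envelope: nightly raw-image volume and transfer time
# over a 100 Gbps fiber link.
PIXELS_PER_IMAGE = 3.2e9   # 3.2-gigapixel camera
BYTES_PER_PIXEL = 2        # assumed 16-bit raw pixels (not an official figure)
IMAGES_PER_NIGHT = 800

image_bytes = PIXELS_PER_IMAGE * BYTES_PER_PIXEL  # ~6.4 GB per image
night_bytes = image_bytes * IMAGES_PER_NIGHT      # ~5 TB per night

link_bytes_per_s = 100e9 / 8                      # 100 Gbps = 12.5 GB/s
print(f"per image: {image_bytes / 1e9:.1f} GB, "
      f"sent in {image_bytes / link_bytes_per_s:.2f} s")
print(f"per night: {night_bytes / 1e12:.1f} TB, "
      f"sent in {night_bytes / link_bytes_per_s / 60:.1f} min")
```

Under these assumptions a full night of raw images fits through the link in well under ten minutes of line time, which is why a single 100 Gbps path can keep up with the survey.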

    The LSST’s combination of telescope, mirror, camera, data processing, and survey will capture changes in billions of faint objects, and the data it provides will be used to create an animated, three-dimensional cosmic map with unprecedented depth and detail, giving us an entirely new way to look at the Universe. This map will serve a myriad of purposes, from locating that mysterious substance called dark matter and characterizing the properties of the even more mysterious dark energy, to tracking transient objects, to studying our own Milky Way Galaxy in depth. It will even be used to detect and track potentially hazardous asteroids—asteroids that might impact the Earth and cause significant damage.

    As with past technological advances that opened new windows of discovery, such a powerful system for exploring the faint and transient Universe will undoubtedly serve up surprises.

    Plans for sharing the data from LSST with the public are as ambitious as the telescope itself. Anyone with a computer will be able to view the moving map of the Universe created by the LSST, including objects a hundred million times fainter than can be observed with the unaided eye. The LSST project will provide analysis tools to enable both students and the public to participate in the process of scientific discovery. We invite you to learn more about LSST science.

    The LSST will be unique: no existing telescope or proposed camera could be retrofitted or re-designed to cover ten square degrees of sky with a collecting area of forty square meters. Named the highest priority for ground-based astronomy in the 2010 Decadal Survey, the LSST project formally began construction in July 2014.

     
  • richardmitnick 9:14 am on November 22, 2017 Permalink | Reply
    Tags: Cosmology, NASA TSIS 1 Total Solar Irradiance Spectral Solar Irradiance 1

    From Goddard: “NASA’s TSIS-1 Keeps an Eye on Sun’s Power Over Ozone” 

    NASA Goddard Banner
    NASA Goddard Space Flight Center

    Nov. 21, 2017
    Rani Gran
    rani.c.gran@nasa.gov
    NASA’s Goddard Space Flight Center, Greenbelt, Md.

    NASA TSIS 1 Total Solar Irradiance Spectral Solar Irradiance 1


    TSIS-1 will be affixed to the International Space Station in December 2017. TSIS-1 operates like a sunflower: it follows the Sun from ISS sunrise to ISS sunset, which happens every 90 minutes. At sunset, it rewinds, recalibrates and waits for the next sunrise.
    Credits: Courtesy NASA/LASP

    1
    Antarctic ozone hole, Oct. 10, 2017: Purple and blue represent areas of low ozone concentrations in the atmosphere; yellow and red are areas of higher concentrations. Carbon tetrachloride (CCl4), which was once used in applications such as dry cleaning and as a fire-extinguishing agent, was regulated in 1987 under the Montreal Protocol along with other chlorofluorocarbons that destroy ozone and contribute to the ozone hole over Antarctica. Credits: NASA’s Goddard Space Flight Center

    2
    The picture on the left shows a calm sun from October 2010. The right side, from October 2012, shows a much more active and varied solar atmosphere as the sun moves closer to peak solar activity, or solar maximum. NASA’s Solar Dynamics Observatory (SDO) captured both images.
    Credits: NASA’s Goddard Space Flight Center/SDO

    NASA/SDO

    High in the atmosphere, above weather systems, is a layer of ozone gas. Ozone is Earth’s natural sunscreen, absorbing the Sun’s most harmful ultraviolet radiation and protecting living things below. But ozone is vulnerable to certain gases made by humans that reach the upper atmosphere. Once there, they react in the presence of sunlight to destroy ozone molecules.

    Currently, several NASA and National Oceanic and Atmospheric Administration (NOAA) satellites track the amount of ozone in the upper atmosphere and the solar energy that drives the photochemistry that creates and destroys ozone. NASA is now ready to launch a new instrument to the International Space Station that will provide the most accurate measurements ever made of sunlight as seen from above Earth’s atmosphere — an important component for evaluating the long-term effects of ozone-destroying chemistry. The Total and Spectral solar Irradiance Sensor (TSIS-1) will measure the total amount of sunlight that reaches the top of Earth’s atmosphere and how that light is distributed between different wavelengths, including ultraviolet wavelengths that we cannot sense with our eyes, but are felt by our skin and harmful to our DNA.

    This is not the first time NASA has measured the total light energy from the Sun. TSIS-1 succeeds previous and current NASA missions to monitor incoming sunlight with technological upgrades that should improve stability, provide three times better accuracy and lower interference from other sources of light, according to Candace Carlisle, TSIS-1 project manager at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

    “We need to measure the full spectrum of sunlight and the individual wavelengths to evaluate how the Sun affects Earth’s atmosphere,” said Dong Wu, TSIS-1 project scientist at Goddard.

    TSIS-1 will see more than 1,000 wavelength bands from 200 to 2,400 nanometers. The visible part of the spectrum our eyes can see runs from about 390 nanometers (blue) to 700 nanometers (red). A nanometer is one billionth of a meter.
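As a rough illustration of how that 200-2,400 nm range breaks down, the Python sketch below labels wavelengths by band. The UV boundary values (280, 315 and 400 nm) follow common conventions and are assumptions here; the article itself quotes only the roughly 390-700 nm visible band:

```python
def band(wavelength_nm: float) -> str:
    """Rough spectral-band labels spanning TSIS-1's 200-2400 nm range.
    Boundaries follow common conventions: UV-C < 280 nm, UV-B < 315 nm,
    UV-A < 400 nm, visible to ~700 nm, then near-infrared."""
    if wavelength_nm < 280:
        return "UV-C"
    if wavelength_nm < 315:
        return "UV-B"
    if wavelength_nm < 400:
        return "UV-A"
    if wavelength_nm <= 700:
        return "visible"
    return "near-infrared"

for wl in (250, 300, 550, 1600):
    print(wl, "nm ->", band(wl))
```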

    “Each color or wavelength of light affects Earth’s atmosphere differently,” Wu said.

    TSIS-1 will see different types of ultraviolet (UV) light, including UV-B and UV-C. Each plays a different role in the ozone layer. UV-C rays are essential in creating ozone. UV-B rays and some naturally occurring chemicals regulate the abundance of ozone in the upper atmosphere. The amount of ozone is a balance between these natural production and loss processes. In the course of these processes, UV-C and UV-B rays are absorbed, preventing them from reaching Earth’s surface and harming living organisms. Thinning of the ozone layer has allowed more UV-B radiation to reach the ground.
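One way to see why only the shortest UV wavelengths can create ozone is to compare photon energies. The sketch below assumes the standard O2 photodissociation threshold of roughly 5.1 eV (near 242 nm), a figure not given in the article:

```python
# Energy per photon: E = h * c / wavelength. Splitting O2 (the first step in
# ozone creation) takes roughly 5.1 eV, which only UV-C photons can supply.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electronvolt

def photon_energy_eV(wavelength_nm: float) -> float:
    return h * c / (wavelength_nm * 1e-9) / eV

print(f"240 nm (UV-C): {photon_energy_eV(240):.2f} eV")  # enough to split O2
print(f"300 nm (UV-B): {photon_energy_eV(300):.2f} eV")  # absorbed by O3 instead
```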

    In the 1970s, scientists theorized that certain human-made chemicals found in spray cans, air conditioners and refrigerators could throw off the natural balance of ozone creation and depletion and cause an unnatural depletion of the protective ozone. In the 1980s, scientists observed ozone loss consistent with the concentrations of these chemicals and confirmed this theory.

    Ozone loss was far more severe than expected over the South Pole during the Antarctic spring (fall in the United States), a phenomenon that was named “the Antarctic ozone hole.” The discovery that human-made chemicals could have such a large effect on Earth’s atmosphere brought world leaders together. They created the Montreal Protocol, an international commitment to phase out ozone-depleting chemicals; adopted in 1987, it has since been ratified by every country that participates in the United Nations and has been updated to tighten constraints and cover additional ozone-depleting chemicals.

    A decade after the ratification of the Montreal Protocol, the amount of human-made ozone-destroying chemicals in the atmosphere peaked and began a slow decline. However, it takes decades for these chemicals to completely cycle out of the upper atmosphere, and the concentrations of these industrially produced molecules are not all decreasing as expected, while additional, new compounds are being created and released.

    More than three decades after ratification, NASA satellites have verified that ozone losses have stabilized and, in some specific locations, have even begun to recover due to reductions in the ozone-destroying chemicals regulated under the Montreal Protocol.

    As part of their work in monitoring the recovery of the ozone hole, scientists use computer models of the atmosphere that simulate the physical, chemical and weather processes in the atmosphere. These atmospheric models can then take input from ground and satellite observations of various atmospheric gases, both natural and human-produced, to help predict ozone layer recovery. They test the models by simulating past changes and then compare the results with satellite measurements to see if the simulations match past outcomes. To run the best possible simulation, the models also need accurate measurements of sunlight across the spectrum.

    “Atmospheric models need accurate measurements of sunlight across the spectrum to model the ozone layer correctly,” said Peter Pilewskie, TSIS-1 lead scientist at the Laboratory for Atmospheric and Space Physics in Boulder, Colorado. Scientists have learned that variations in UV radiance produce significant changes in the results of the computer simulations.

    Overall, solar energy output varies by approximately 0.1 percent — or about 1 watt per square meter between the most and least active part of an 11-year solar cycle. The solar cycle is marked by the alternating high and low activity periods of sunspots, dark regions of complex magnetic activity on the Sun’s surface. While UV light represents a tiny fraction of the total sunlight that reaches the top of Earth’s atmosphere, it fluctuates much more, anywhere from 3 to 10 percent, a change that in turn causes small changes in the chemical composition and thermal structure of the upper atmosphere.
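That “0.1 percent — or about 1 watt per square meter” is consistent with a mean total solar irradiance near 1361 W/m², the commonly cited value at the top of Earth’s atmosphere (an assumption here, not a figure from the article):

```python
# The article's "0.1 percent, or about 1 watt per square meter" implies a
# mean total solar irradiance (TSI) near 1361 W/m^2 -- an assumed value.
mean_tsi = 1361.0                    # W/m^2, assumed mean TSI
cycle_variation = 0.001 * mean_tsi   # 0.1% swing over an 11-year solar cycle
print(f"TSI swing over a solar cycle: ~{cycle_variation:.2f} W/m^2")
```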

    That’s where TSIS-1 comes in. “[TSIS] measurements of the solar spectrum are three times more accurate than previous instruments,” said Pilewskie. Its high-quality measurements will allow scientists to fine-tune their computer models and produce better simulations of the ozone layer’s behavior — as well as of other atmospheric processes influenced by sunlight, such as the movement of winds and weather.

    TSIS-1 joins a fleet of NASA’s Earth-observing missions that monitor nearly every aspect of the Earth system, watching for any changes in our environment that could harm life.

    For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future by deploying space-based sensors like TSIS-1. NASA’s Goddard Space Flight Center has overall responsibility for the development and operation of TSIS-1 on the International Space Station as part of the Earth Systematic Missions program. The Laboratory for Atmospheric and Space Physics at the University of Colorado Boulder, under contract with NASA, is responsible for providing the TSIS-1 measurements and ensuring their availability to the scientific community.

    See the full article here.


    NASA’s Goddard Space Flight Center is home to the nation’s largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

    Named for American rocketry pioneer Dr. Robert H. Goddard, the center was established in 1959 as NASA’s first space flight complex. Goddard and its several facilities are critical in carrying out NASA’s missions of space exploration and scientific discovery.


    NASA/Goddard Campus

     