Tagged: Basic Research

  • richardmitnick 2:29 pm on June 25, 2019 Permalink | Reply
    Tags: "Galaxy Clusters Caught in a First Kiss", , , Basic Research, , , Giant Metrewave Radio Telescope, JAXA/Suzaku satellite, , SDSS Telescope at Apache Point Observatory, SKA LOFAR core near Exloo Netherlands   

    From NASA Chandra: “Galaxy Clusters Caught in a First Kiss” 

    NASA Chandra Banner

    NASA/Chandra Telescope


    From NASA Chandra

    June 25, 2019
    Media contacts:
    Megan Watzke
    Chandra X-ray Center, Cambridge, Mass.
    617-496-7998
    mwatzke@cfa.harvard.edu

    Press images: Composite, X-ray, Optical, and Radio views of the colliding cluster pair.

    Credit: X-ray: NASA/CXC/RIKEN/L. Gu et al; Radio: NCRA/TIFR/GMRT; Optical: SDSS
    Press Image, Caption, and Videos

    Giant Metrewave Radio Telescope, an array of thirty telescopes, located near Pune in India

    SDSS Telescope at Apache Point Observatory, near Sunspot NM, USA, Altitude 2,788 meters (9,147 ft)

    For the first time, astronomers have found two giant clusters of galaxies that are just about to collide. This observation can be seen as a missing ‘piece of the puzzle’ in our understanding of the formation of structure in the Universe, since large-scale structures—such as galaxies and clusters of galaxies—are thought to grow by collisions and mergers. The result was published in Nature Astronomy on June 24th, 2019 and used data from NASA’s Chandra X-ray Observatory and other X-ray missions.

    Clusters of galaxies are the largest known bound objects and consist of hundreds of galaxies that each contain hundreds of billions of stars. Ever since the Big Bang, these objects have been growing by colliding and merging with each other. Due to their large size, with diameters of a few million light years, these collisions can take about a billion years to complete. Eventually the two colliding clusters will have merged into one bigger cluster.

    Because the merging process takes much longer than a human lifetime, we only see snapshots of the various stages of these collisions. The challenge is to find colliding clusters that are just at the stage of first touching each other.


    In theory, this stage has a relatively short duration and is therefore hard to find. It is like finding a raindrop that just touches the water surface in a photograph of a pond during a rain shower. Such a picture would show many falling droplets and ripples on the water surface, but only a few droplets in the process of merging with the pond. Similarly, astronomers have found many single clusters, as well as merged clusters with outgoing ripples indicating a past collision, but until now no two clusters just about to touch each other.

    An international team of astronomers now announced the discovery of two clusters on the verge of colliding. This enabled astronomers to test their computer simulations, which show that in the first moments a shock wave, analogous to the sonic boom produced by supersonic motion of an airplane, is created in between the clusters and travels out perpendicular to the merging axis. “These clusters show the first clear evidence for this type of merger shock,” says first author Liyi Gu from RIKEN national science institute in Japan and SRON Netherlands Institute for Space Research. “The shock created a hot belt region of 100-million-degree gas between the clusters, which is expected to extend up to, or even go beyond the boundary of the giant clusters. Therefore, the observed shock has a huge impact on the evolution of galaxy clusters and large scale structures.”

    Astronomers are planning to collect more ‘snapshots’ to ultimately build up a continuous model describing the evolution of cluster mergers. SRON-researcher Hiroki Akamatsu: “More merger clusters like this one will be found by eROSITA, an X-ray all-sky survey mission that will be launched this year.

    eRosita DLR MPG

    Two other upcoming X-ray missions, XRISM and Athena, will help us understand the role of these colossal merger shocks in the structure formation history.”

    JAXA XRISM spacecraft schematic

    ESA Athena

    Liyi Gu and his collaborators studied the colliding pair during an observation campaign, carried out with three X-ray satellites (ESA’s XMM-Newton satellite, NASA’s Chandra, and JAXA’s Suzaku satellite) and two radio telescopes (the Low-Frequency Array, a European project led by the Netherlands, and the Giant Metrewave Radio Telescope operated by National Centre for Radio Astrophysics of India).

    ESA/XMM Newton

    JAXA/Suzaku satellite

    SKA LOFAR core (“superterp”) near Exloo, Netherlands

    Other materials about the findings are available at:
    http://chandra.si.edu

    For more Chandra images, multimedia and related materials, visit:
    http://www.nasa.gov/chandra

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    NASA’s Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA’s Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra’s science and flight operations from Cambridge, Mass.

     
  • richardmitnick 1:06 pm on June 25, 2019 Permalink | Reply
    Tags: "The Low Density of Some Exoplanets is Confirmed", , , Basic Research, , , Kepler-9 and its planets Kepler-9b and Kepler-9c   

    From Harvard-Smithsonian Center for Astrophysics: “The Low Density of Some Exoplanets is Confirmed” 

    Harvard Smithsonian Center for Astrophysics


    From Harvard-Smithsonian Center for Astrophysics

    June 21, 2019

    The Kepler mission and its extension, called K2, discovered thousands of exoplanets.

    NASA/Kepler Telescope and K2; operational March 7, 2009 until November 15, 2018

    It detected them using the transit technique, measuring the dip in light intensity whenever an orbiting planet moved across the face of its host star as viewed from Earth.

    Planet transit. NASA/Ames

    Transits not only measure the orbital period; they can often determine the size of the exoplanet from the detailed depth and shape of its transit curve together with the host star’s properties. The transit method, however, does not measure the mass of the planet. The radial velocity method, by contrast, which measures the wobble of a host star under the gravitational pull of an orbiting exoplanet, allows for the measurement of its mass. Knowing a planet’s radius and mass allows for the determination of its average density, and hence clues to its composition.
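    To make the arithmetic concrete, here is a minimal Python sketch of that density calculation; the Earth constants are standard values, and the example planet is a hypothetical Saturn-like world, not one of the Kepler-9 planets.

        import math

        M_EARTH_G = 5.972e27   # Earth mass in grams
        R_EARTH_CM = 6.371e8   # Earth radius in centimeters

        def bulk_density(mass_earths, radius_earths):
            """Average density in g/cm^3 for a planet given in Earth units."""
            mass_g = mass_earths * M_EARTH_G
            volume_cm3 = (4.0 / 3.0) * math.pi * (radius_earths * R_EARTH_CM) ** 3
            return mass_g / volume_cm3

        # Sanity check: Earth itself should come out near 5.51 g/cm^3.
        print(bulk_density(1.0, 1.0))

        # Hypothetical Saturn-like exoplanet: ~45 Earth masses and ~8 Earth radii
        # gives ~0.48 g/cm^3, a "low density" planet like those discussed here.
        print(bulk_density(45.0, 8.0))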

    Radial Velocity Method-Las Cumbres Observatory

    About fifteen years ago, CfA astronomers and others realized that in planetary systems with multiple planets, the periodic gravitational tug of one planet on another will alter their orbital parameters. Although the transit method cannot directly measure exoplanet masses, it can detect these orbital variations and these can be modeled to infer masses. Kepler has identified hundreds of exoplanet systems with transit-timing variations, and dozens have been successfully modeled. Surprisingly, this procedure seemed to find a prevalence of exoplanets with very low densities. The Kepler-9 system, for example, appears to have two planets with densities respectively of 0.42 and 0.31 grams per cubic centimeter. (For comparison, the rocky Earth’s average density is 5.51 grams per cubic centimeter, water is, by definition, 1.0 grams per cubic centimeter, and the gas giant Saturn is 0.69 grams per cubic centimeter.) The striking results cast some doubt on one or more parts of the transit timing variation methodology and created a long-standing concern.

    CfA astronomers David Charbonneau, David Latham, Mercedes Lopez-Morales, and David Phillips, and their colleagues tested the reliability of the method by measuring the densities of the Kepler-9 planets using the radial velocity method, its two Saturn-like planets being among a small group of exoplanets whose masses can be measured (if just barely) with either technique.

    An artist’s depiction of Kepler-9 and its planets Kepler-9b and Kepler-9c. NASA

    They used the HARPS-N spectrometer on the Telescopio Nazionale Galileo in La Palma in sixteen observing epochs; HARPS-N can typically measure velocity variations with an error as tiny as about twenty miles an hour. Their results confirm the very low densities obtained by the transit-timing method, and verify the power of the transit-timing variation technique.

    Harps North at Telescopio Nazionale Galileo –

    Telescopio Nazionale Galileo, a 3.58-meter Italian telescope located at the Roque de los Muchachos Observatory on the island of La Palma in the Canary Islands, Spain; Altitude 2,396 m (7,861 ft)

    Science paper:
    HARPS-N Radial Velocities Confirm the Low Densities of the Kepler-9 Planets
    MNRAS

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory (SAO) is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory (HCO), founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

     
  • richardmitnick 12:40 pm on June 25, 2019 Permalink | Reply
    Tags: "NASA Technology Missions Launch on SpaceX Falcon Heavy", , , Basic Research, ,   

    From JPL-Caltech: “NASA Technology Missions Launch on SpaceX Falcon Heavy” 

    NASA JPL Banner

    From JPL-Caltech

    June 25, 2019

    Arielle Samuelson
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-0307
    arielle.a.samuelson@jpl.nasa.gov

    Clare Skelly
    Headquarters, Washington
    202-358-4273
    clare.a.skelly@nasa.gov

    Karen Fox
    Goddard Space Flight Center, Greenbelt, Md.
    301-286-6284
    karen.c.fox@nasa.gov

    A SpaceX Falcon Heavy rocket is ready for launch on the pad at Launch Complex 39A at NASA’s Kennedy Space Center in Florida on June 24, 2019. SpaceX and the U.S. Department of Defense will launch two dozen satellites to space, including four NASA payloads that are part of the Space Test Program-2, managed by the U.S. Air Force. Photo Credit: NASA/Kim Shiflett

    NASA technology demonstrations, which one day could help the agency get astronauts to Mars, and science missions, which will look at the space environment around Earth and how it affects us, have launched into space on a Falcon Heavy rocket.

    The NASA missions – including the Deep Space Atomic Clock and two instruments from NASA’s Jet Propulsion Laboratory in Pasadena, California – lifted off at 11:30 p.m. PDT (2:30 a.m. EDT) Tuesday from NASA’s Kennedy Space Center in Florida, as part of the Department of Defense’s Space Test Program-2 (STP-2) launch.

    “This launch was a true partnership across government and industry, and it marked an incredible first for the U.S. Air Force Space and Missile Systems Center,” said Jim Reuter, associate administrator for NASA’s Space Technology Mission Directorate. “The NASA missions aboard the Falcon Heavy also benefited from strong collaborations with industry, academia and other government organizations.”

    The missions, each with a unique set of objectives, will aid in smarter spacecraft design and benefit the agency’s Moon to Mars exploration plans by providing greater insight into the effects of radiation in space and testing an atomic clock that could change how spacecraft navigate.

    With launch and deployments complete, the missions will start to power on, communicate with Earth and collect data. They each will operate for about a year, providing enough time to mature the technologies and collect valuable science data. Below is more information about each mission, including notional timelines for key milestones.

    Enhanced Tandem Beacon Experiment

    Two NASA CubeSats making up the Enhanced Tandem Beacon Experiment (E-TBEx) deployed at 12:08 and 12:13 a.m. PDT (3:08 and 3:13 a.m. EDT). Working in tandem with NOAA’s COSMIC-2 mission – six satellites that each carry a radio occultation (GPS) receiver developed at JPL – E-TBEx will explore bubbles in the electrically-charged layers of Earth’s upper atmosphere, which can disrupt communications and GPS signals that we rely on every day. The CubeSats will send signals in several frequencies down to receiving stations on Earth. Scientists will measure any disruptions in these signals to determine how they’re being affected by the upper atmosphere.
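    One standard way multi-frequency beacon signals are turned into a measurement of the upper atmosphere is through the ionospheric group delay, which scales as 1/f²; delays at two frequencies then pin down the total electron content (TEC) along the path. The sketch below uses that generic textbook relation with invented numbers; it is an assumption about the kind of analysis involved, not the mission’s published pipeline.

        K = 40.3  # ionospheric constant in the group-delay relation, m^3/s^2

        def tec_from_delays(extra_path_f1_m, extra_path_f2_m, f1_hz, f2_hz):
            """Total electron content (electrons/m^2) from the excess signal path
            measured at two frequencies, using extra_path = K * TEC / f**2."""
            dP = extra_path_f1_m - extra_path_f2_m
            return dP / (K * (1.0 / f1_hz**2 - 1.0 / f2_hz**2))

        # Invented example: beacon tones at 150 MHz and 400 MHz (a common pairing),
        # with 24.0 m and 3.4 m of ionospheric excess path respectively.
        tec = tec_from_delays(24.0, 3.4, 150e6, 400e6)
        print(f"TEC ~ {tec:.2e} electrons/m^2 (~{tec / 1e16:.1f} TECU)")  # ~1.3 TECU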

    One to three weeks after launch: E-TBEx operators “check out” the CubeSats to make sure power, navigation/guidance and data systems are working in space as expected.
    Approximately three weeks after launch: Science beacons that send signals to antennas on Earth power up and begin transmitting to ground stations.
    About one year after launch: The E-TBEx mission ends.

    Deep Space Atomic Clock

    NASA’s Deep Space Atomic Clock is a toaster oven-size instrument traveling aboard a commercial satellite that was released into low-Earth orbit at 12:54 a.m. PDT (3:54 a.m. EDT). The unique atomic clock will test a new way for spacecraft to navigate in deep space. The technology could make GPS-like navigation possible at the Moon and Mars.

    NASA Deep Space Atomic Clock
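    As a back-of-the-envelope illustration of why clock precision matters for navigation (my own example, not from the mission team): range is inferred from signal travel time, so every nanosecond of clock error translates into about 30 centimeters of ranging error.

        C = 299_792_458.0  # speed of light, m/s

        def one_way_range_m(signal_travel_time_s):
            """Distance inferred from a one-way signal, given a precise onboard clock."""
            return C * signal_travel_time_s

        # A 1-nanosecond timing error corresponds to ~0.3 m of ranging error.
        print(one_way_range_m(1e-9))

        # Mars at ~225 million km: a one-way signal takes about 750 seconds.
        print(one_way_range_m(750.0) / 1e9, "million km")  # ~224.8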

    Two to four weeks after launch: The ultra-stable oscillator, part of the Deep Space Atomic Clock that keeps precise time, powers on to warm up in space.
    Four to seven weeks after launch: The full Deep Space Atomic Clock powers on.
    Three to four months after launch: Preliminary clock performance results are expected.
    One year after full power on: The Deep Space Atomic Clock mission ends, final data analysis begins.

    Green Propellant Infusion Mission

    The Green Propellant Infusion Mission (GPIM) deployed at 12:57 a.m. PDT (3:57 a.m. EDT) and immediately began to power on. GPIM will test a new propulsion system that runs on a high-performance and non-toxic spacecraft fuel. This technology could help propel constellations of small satellites in and beyond low-Earth orbit.

    Within a day of launch: Mission operators check out the small spacecraft.
    One to three weeks after launch: Mission operators ensure the propulsion system heaters and thrusters are operating as expected.
    During the first three months after launch: To demonstrate the performance of the spacecraft’s thrusters, GPIM performs three lowering burns that place it in an elliptical orbit; each time, GPIM gets closer to Earth at one particular point in its orbit (see the orbital-mechanics sketch after this list).
    Throughout the mission: Secondary instruments aboard GPIM measure space weather and test a system that continuously reports the spacecraft’s position and velocity.
    About 12 months after launch: Mission operators command a final thruster burn to deplete the fuel tank, a technical requirement for the end of mission.
    About 13 months after launch: The GPIM mission ends.
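    For a feel of the orbital mechanics behind those lowering burns, here is a rough vis-viva sketch; the altitudes and burn size are hypothetical, not GPIM’s actual flight profile.

        import math

        MU = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
        R_EARTH = 6_371_000.0  # mean Earth radius, m

        def visviva_speed(r, a):
            """Orbital speed at radius r on an orbit with semi-major axis a."""
            return math.sqrt(MU * (2.0 / r - 1.0 / a))

        # Hypothetical starting point: a 720 km circular orbit.
        r0 = R_EARTH + 720_000.0
        v_circ = visviva_speed(r0, r0)

        # A retrograde burn here lowers the opposite side of the orbit:
        # target an ellipse with perigee at 400 km and apogee still at 720 km.
        r_perigee = R_EARTH + 400_000.0
        a_ellipse = (r0 + r_perigee) / 2.0
        v_after = visviva_speed(r0, a_ellipse)

        print(f"delta-v for the burn: {v_circ - v_after:.1f} m/s")  # ~87 m/s here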

    Space Environment Testbeds

    The U.S. Air Force Research Laboratory’s Demonstration and Science Experiments (DSX) was the last spacecraft to be released from STP-2, at 3:04 a.m. PDT (6:04 a.m. EDT). Onboard is an instrument designed by JPL to measure spacecraft vibrations, and four NASA experiments that make up the Space Environment Testbeds (SET). SET will study how to better protect satellites from space radiation by analyzing the harsh environment of space near Earth and testing various strategies to mitigate the impacts. This information can be used to improve spacecraft design, engineering and operations in order to protect spacecraft from harmful radiation driven by the Sun.

    Three weeks after launch: SET turns on for check out and testing of all four experiments.
    Eight weeks after launch: Anticipated start of science data collection.
    About 12 months after check-out: The SET mission ends.

    In all, STP-2 delivered about two dozen satellites into three separate orbits around Earth. Kennedy Space Center engineers mentored Florida high school students who developed and built a CubeSat that also launched on STP-2.

    “It was gratifying to see 24 satellites launch as one,” said Nicola Fox, director of the Heliophysics Division in NASA’s Science Mission Directorate. “The space weather instruments and science CubeSats will teach us how to better protect our valuable hardware and astronauts in space, insights useful for the upcoming Artemis program and more.”

    GPIM and the Deep Space Atomic Clock are both part of the Technology Demonstration Missions program within NASA’s Space Technology Mission Directorate. The Space Communications and Navigation program within NASA’s Human Exploration and Operations Mission Directorate also provided funding for the atomic clock. SET and E-TBEx were both funded by NASA’s Science Mission Directorate.

    Learn more about NASA technology:

    https://www.nasa.gov/spacetech

    Find out how NASA is sending astronauts back to the Moon and on to Mars at:

    https://www.nasa.gov/topics/moon-to-mars

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     
  • richardmitnick 12:19 pm on June 25, 2019 Permalink | Reply
    Tags: "The future of particle accelerators may be autonomous", Basic Research, Fermilab | FAST Facility, FNAL PIP-II Injector Test (PIP2IT) facility, In December 2018 operators at LCLS at SLAC successfully tested an algorithm trained on simulations and actual data from the machine to tune the beam., ,   

    From Symmetry: “The future of particle accelerators may be autonomous” 

    Symmetry Mag
    From Symmetry

    06/25/19
    Caitlyn Buongiorno

    Illustration by Sandbox Studio, Chicago with Ana Kova

    Particle accelerators are some of the most complicated machines in science. Scientists are working on ways to run them with a diminishing amount of direction from humans.

    In 2015, operators at the Linac Coherent Light Source particle accelerator looked into how they were spending their time managing the machine.

    SLAC/LCLS

    They tracked the hours they spent on tasks like investigating problems and orchestrating new configurations of the particle beam for different experiments.

    They discovered that, if they could automate the process of tuning the beam—tweaking the magnets that keep the LCLS particle beam on its course through the machine—it would free up a few hundred hours each year.

    Scientists have been working to automate different aspects of the operation of accelerators since the 1980s. In today’s more autonomous era of self-driving cars and vacuuming robots, efforts are still going strong, and the next generation of particle accelerators promises to be more automated than ever. Scientists are using machine learning to optimize beamlines more efficiently, detect problems more effectively and create the simulations they need in real-time.

    Quicker fixes

    With any machine, there is a chance that a part might malfunction or break. In the case of an accelerator, that part might be one of the many magnets that direct the particle beam.

    If one magnet stops working, there are ways to circumvent the problem using the magnets around it. But it’s not easy. A particle accelerator is a nonlinear system; when an operator makes a change to it, all of the possible downstream effects of that change can be difficult to predict.

    “The human brain isn’t good at that kind of optimization,” says Dan Ratner, the leader of the strategic initiative for machine learning at the US Department of Energy’s SLAC National Accelerator Laboratory in California.

    An operator can find the solution by trial and error, but that can take some time. With machine learning, an autonomous accelerator could potentially do the same task many times faster.
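    As a toy illustration of handing that search to a machine (not the actual LCLS algorithm, which the article says was trained on simulations and machine data), one can give a gradient-free optimizer a mock "beam loss versus magnet settings" function and let it iterate in place of a human:

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        TRUE_BEST = rng.uniform(-1.0, 1.0, size=8)        # unknown ideal settings of 8 magnets
        A = np.eye(8) + 0.2 * rng.standard_normal((8, 8))
        H = A.T @ A                                       # positive-definite coupling matrix

        def beam_loss(settings):
            """Mock stand-in for a measured beam-quality penalty. On a real machine
            this response is nonlinear and must be measured, not written down."""
            d = np.asarray(settings) - TRUE_BEST
            return float(d @ H @ d)

        # Nelder-Mead needs no gradients, mirroring tuning from measured feedback alone.
        result = minimize(beam_loss, x0=np.zeros(8), method="Nelder-Mead",
                          options={"maxiter": 20_000, "maxfev": 20_000,
                                   "xatol": 1e-8, "fatol": 1e-12})
        print("residual loss:", result.fun)
        print("settings recovered:", np.allclose(result.x, TRUE_BEST, atol=0.05))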

    In December 2018, operators at LCLS at SLAC successfully tested an algorithm trained on simulations and actual data from the machine to tune the beam.

    Ratner doesn’t expect either LCLS or its upgrade, LCLS-II, scheduled to come online in 2021, to run without human operators, but he’s hoping to give operators a new tool. “Ultimately, we’re trying to free up operators for tasks that really need a human,” he says.

    SLAC/LCLS II projected view

    Practical predictions

    At Fermi National Accelerator Laboratory in Illinois, physicist Jean-Paul Carneiro is working on an upgrade to the lab’s accelerator complex in the hopes that it will one day run with little to no human intervention.

    He was recently awarded a two-year grant for the project through the University of Chicago’s FACCTS program—France And Chicago Collaborating in The Sciences. He is integrating a code developed by scientist Didier Uriot at France’s Saclay Nuclear Research Center into the lab’s PIP-II Injector Test (PIP2IT) facility.

    FNAL PIP-II Injector Test (PIP2IT) facility

    PIP2IT is the proving ground for technologies intended for PIP-II, the upgrade to Fermilab’s accelerator complex that will supply the world’s most intense beams of neutrinos for the international Deep Underground Neutrino Experiment.

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    Carneiro says autonomous accelerator operation would increase the usability of the beam for experiments by drastically reducing the accelerator’s downtime. On average, accelerators can currently expect to run at about 90% availability, he says. “If you want to achieve a 98 or 99% availability, the only way to do it is with a computer code.”

    Beyond quickly fixing tuning problems, another way to increase the availability of beam is to detect potential complications before they happen.

    Even in relatively stable areas, the Earth is constantly shifting under our feet—and shifting underground particle accelerators as well. People don’t feel these movements, but an accelerator beam certainly does. Over the course of a few days, these shifts can cause the beam to begin creeping away from its intended course. An autonomous accelerator could correct the beam’s path before a human would even notice the problem.
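    A caricature of that correction loop, offered as a minimal sketch with an invented drift model and gain: treat the beam offset as a slow random walk and apply a small proportional correction on every cycle, the way an automated orbit feedback might.

        import numpy as np

        rng = np.random.default_rng(1)

        def rms_offset(gain, steps=10_000, drift_um=0.5):
            """RMS beam offset (micrometers) under proportional feedback."""
            offset, history = 0.0, []
            for _ in range(steps):
                offset += drift_um * rng.standard_normal()  # slow ground motion
                offset -= gain * offset                     # corrector response
                history.append(offset)
            return float(np.sqrt(np.mean(np.square(history))))

        print("no feedback  :", rms_offset(gain=0.0))  # random walk, ~50 um RMS
        print("with feedback:", rms_offset(gain=0.1))  # held near ~1 um RMS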

    Lia Merminga, PIP-II project director at Fermilab, says she thinks the joint project with CEA Saclay is a fantastic opportunity for the laboratory. “Part of our laboratory’s mission is to advance the science and technology of particle accelerators. These advancements will free up accelerator physicists to focus their talent more on developing new ideas and concepts, while providing users with higher reliability and more efficient beam delivery, ultimately increasing the scientific output.”

    Speedy simulations

    Accelerator operators don’t spend all of their time troubleshooting; they also make changes to the beam to optimize it for specific experiments. Scientists can apply for time on an accelerator to conduct a study. The parameters they originally wanted sometimes change as they begin to conduct their experiment. Finding ways to automate this process would save operators and experimental physicists countless hours.

    Auralee Edelen, a research associate at SLAC, is doing just that by exploring how scientists can improve their models of different beam configurations and how to best achieve them.

    To map the many parameters of an entire beam line from start to end, scientists have thus far needed to use thousands of hours on a supercomputer—not always ideal for online adjustments or finding the best way to obtain a particular beam configuration. A machine learning model, on the other hand, could be trained to simulate what would happen if variables were changed, in under a second.
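    A minimal sketch of that surrogate-model idea, assuming scikit-learn as the machine learning library; the "expensive simulation" below is a toy function standing in for a real beamline code.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(2)

        def expensive_simulation(settings):
            """Stand-in for an hours-long physics simulation of the beamline."""
            return np.sin(settings).sum(axis=1) + 0.1 * (settings ** 2).sum(axis=1)

        # Train once on a batch of pre-computed simulations...
        X_train = rng.uniform(-2.0, 2.0, size=(2000, 5))  # 5 beamline settings
        y_train = expensive_simulation(X_train)
        surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                                 random_state=0).fit(X_train, y_train)

        # ...then query it in well under a second during operations.
        X_new = rng.uniform(-2.0, 2.0, size=(10, 5))
        print("surrogate errors:",
              np.round(surrogate.predict(X_new) - expensive_simulation(X_new), 2))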

    “This is one of the new capabilities of machine learning that we want to leverage,” Edelen says. “We’re just now getting to a point where we can integrate these models into the control system for operators to use.”

    In 2016 a neural network—a machine learning algorithm designed to recognize patterns—put this idea to the test at the Fermilab Accelerator Science and Technology facility [FAST].

    Fermilab | FAST Facility

    It completed what had been a 20-minute process to compare a few different simulations in under a millisecond. Edelen is expanding on her FAST research at LCLS, pushing the limits of what is currently possible.

    Simulations also come in handy when it isn’t possible for a scientist to take a measurement they want, because doing so would interfere with the beam. To get around this, scientists can use an algorithm to correlate the measurement with others that don’t affect the beam and infer what the desired measurement would have shown.

    Initial studies at FAST demonstrated that a neural network could use this technique to predict measurements. Now, SLAC’s Facility for Advanced Accelerator and Experimental Tests, or FACET, and its successor, FACET-II, are leading SLAC’s effort to refine this technique for the scientists who use their beam line.
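    The same inference trick in its simplest form, with ordinary linear regression standing in for the neural networks used in these studies: learn the correlation between non-invasive readings and a beam-disturbing measurement on training shots, then infer the latter without taking it.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(3)

        # Toy data: four non-invasive readings correlated with a destructive one.
        noninvasive = rng.standard_normal((500, 4))
        true_weights = np.array([1.5, -0.7, 0.3, 2.0])
        destructive = noninvasive @ true_weights + 0.05 * rng.standard_normal(500)

        model = LinearRegression().fit(noninvasive[:400], destructive[:400])

        # On new shots, infer the measurement we chose not to take.
        predicted = model.predict(noninvasive[400:])
        rms = np.sqrt(np.mean((predicted - destructive[400:]) ** 2))
        print("RMS inference error:", rms)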

    SLAC FACET

    FACET-II Design, Parameters and Capabilities

    “It’s an exciting time,” says Merminga. “Any one of these improvements would help advance the field of accelerator physics. I am delighted that PIP2IT is being used to test new concepts in accelerator operation.”

    Who knows—within the next few decades, autonomous accelerators may seem as mundane as roaming robotic vacuums.

    See the full article here.



    Please help promote STEM in your local schools.


    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 11:17 am on June 25, 2019 Permalink | Reply
    Tags: "Is This The Most Massive Star In The Universe?", , , Basic Research, ,   

    From Ethan Siegel: “Is This The Most Massive Star In The Universe?” 

    From Ethan Siegel

    June 24, 2019

    The largest group of newborn stars in our Local Group of galaxies, cluster R136, contains the most massive stars we’ve ever discovered: over 250 times the mass of our Sun for the largest. The brightest of the stars found here are more than 8,000,000 times as luminous as our Sun. And yet, there are still likely even more massive ones out there. (NASA, ESA, AND F. PARESCE, INAF-IASF, BOLOGNA, R. O’CONNELL, UNIVERSITY OF VIRGINIA, CHARLOTTESVILLE, AND THE WIDE FIELD CAMERA 3 SCIENCE OVERSIGHT COMMITTEE)

    At the core of the largest star-forming region of the Local Group sits the biggest star we know of.

    Mass is the single most important astronomical property in determining the lives of stars.
    The (modern) Morgan–Keenan spectral classification system, with the temperature range of each star class shown above it, in kelvin. Our Sun is a G-class star, producing light with an effective temperature of around 5800 K and a brightness of 1 solar luminosity. Stars can be as low in mass as 8% the mass of our Sun, where they’ll burn with ~0.01% our Sun’s brightness and live for more than 1000 times as long, but they can also rise to hundreds of times our Sun’s mass, with millions of times our Sun’s luminosity. (WIKIMEDIA COMMONS USER LUCASVB, ADDITIONS BY E. SIEGEL)

    Greater masses generally lead to higher temperatures, greater brightnesses, and shorter lifetimes.
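    For rough numbers behind that statement (standard textbook scalings, not figures from this article): on the main sequence, luminosity climbs steeply with mass, roughly L ∝ M^3.5, so lifetime, which goes as fuel over burn rate (t ∝ M/L), falls steeply.

        def lifetime_gyr(mass_solar):
            """Main-sequence lifetime from t ~ M / L with L ~ M**3.5.
            The exponent flattens for the most massive stars, so this
            underestimates their lifetimes; it is a rough scaling only."""
            sun_lifetime_gyr = 10.0
            return sun_lifetime_gyr * mass_solar ** -2.5

        for m in (0.08, 1.0, 20.0, 100.0):
            print(f"{m:6.2f} solar masses -> ~{lifetime_gyr(m):.2e} Gyr")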

    The active star-forming region, NGC 2363, is located in a nearby galaxy just 10 million light-years away. The brightest star visible here is NGC 2363-V1, visible as the isolated, bright star in the dark void at left. Despite being 6,300,000 times as bright as our Sun, it’s only 20 times as massive, having likely brightened recently as the result of an outburst. (LAURENT DRISSEN, JEAN-RENE ROY AND CARMELLE ROBERT (DEPARTMENT DE PHYSIQUE AND OBSERVATOIRE DU MONT MEGANTIC, UNIVERSITE LAVAL) AND NASA)

    Since massive stars burn through their fuel so quickly, the record holders are found in actively star-forming regions.

    The ‘supernova impostor’ of the 19th century precipitated a gigantic eruption, spewing many Suns’ worth of material into the interstellar medium from Eta Carinae. High mass stars like this within metal-rich galaxies, like our own, eject large fractions of mass in a way that stars within smaller, lower-metallicity galaxies do not. Eta Carinae might be over 100 times the mass of our Sun and is found in the Carina Nebula, but it is not among the most massive stars in the Universe. (NATHAN SMITH (UNIVERSITY OF CALIFORNIA, BERKELEY), AND NASA)

    Luminosity isn’t enough, as short-lived outbursts can cause exceptional, temporary brightening in typically massive stars.

    The star cluster NGC 3603 is located a little over 20,000 light-years away in our own Milky Way galaxy. The most massive star inside it is NGC 3603-B, a Wolf-Rayet star located at the centre of the HD 97950 cluster, which is contained within the large, overall star-forming region. (NASA, ESA AND WOLFGANG BRANDNER (MPIA), BOYKE ROCHAU (MPIA) AND ANDREA STOLTE (UNIVERSITY OF COLOGNE))

    Within our own Milky Way, massive star-forming regions, like NGC 3603, house many stars over 100 times our Sun’s mass.

    The star at the center of the Heart Nebula (IC 1805) is known as HD 15558, which is a massive O-class star that is also a member of a binary system. With a directly-measured mass of 152 solar masses, it is the most massive star we know of whose value is determined directly, rather than through evolutionary inferences. (S58Y / FLICKR)

    As a member of a binary system, HD 15558 A is the most massive star with a definitive value: 152 solar masses.

    The Large Magellanic Cloud, the fourth largest galaxy in our local group, with the giant star-forming region of the Tarantula Nebula (30 Doradus) just to the right and below the main galaxy. It is the largest star-forming region contained within our Local Group. (NASA, FROM WIKIMEDIA COMMONS USER ALFA PYXISDIS)

    However, all stellar mass records originate from the star forming region 30 Doradus in the Large Magellanic Cloud.

    A large section of the Tarantula Nebula, the largest star-forming region in the Local Group, imaged by the Ciel Austral team. At top, you can see the presence of hydrogen, sulfur, and oxygen, which reveals the rich gas and plasma structure of the LMC, while the lower view shows an RGB color composite, revealing reflection and emission nebulae. (CIEL AUSTRAL: JEAN CLAUDE CANONNE, PHILIPPE BERNHARD, DIDIER CHAPLAIN, NICOLAS OUTTERS AND LAURENT BOURGON)

    Known as the Tarantula Nebula, it has a mass of ~450,000 Suns and contains over 10,000 stars.

    The star forming region 30 Doradus, in the Tarantula Nebula in one of the Milky Way’s satellite galaxies, contains the largest, highest-mass stars known to humanity. The largest collection of bright, blue stars shown here is the ultra-dense star cluster R136, which contains nearly 100 stars that are approximately 100 solar masses or greater. Many of them have brightnesses that exceed a million solar luminosities. (NASA, ESA, AND E. SABBI (ESA/STSCI); ACKNOWLEDGMENT: R. O’CONNELL (UNIVERSITY OF VIRGINIA) AND THE WIDE FIELD CAMERA 3 SCIENCE OVERSIGHT COMMITTEE)

    The central star cluster, R136, contains 72 stars of the brightest, most massive classes.

    The cluster RMC 136 (R136) in the Tarantula Nebula in the Large Magellanic Cloud, is home to the most massive stars known. R136a1, the greatest of them all, is over 250 times the mass of the Sun. While professional telescopes are ideal for teasing out high-resolution details such as these stars in the Tarantula Nebula, wide-field views are better with the types of long-exposure times only available to amateurs. (EUROPEAN SOUTHERN OBSERVATORY/P. CROWTHER/C.J. EVANS)

    The record-holder is R136a1, some 260 times our Sun’s mass and 8,700,000 times as bright.

    An ultraviolet image and a spectrographic pseudo-image of the hottest, bluest stars at the core of R136. In this small component of the Tarantula Nebula alone, nine stars over 100 solar masses and dozens over 50 are identified through these measurements. The most massive star of all in here, R136a1, exceeds 250 solar masses, and is a candidate, later in its life, for photodisintegration. (ESA/HUBBLE, NASA, K.A. BOSTROEM (STSCI/UC DAVIS))

    Stars such as this cannot be individually resolved beyond our Local Group.

    An illustration of the first stars turning on in the Universe. Without metals to cool down the stars, only the largest clumps within a large-mass cloud can become stars. Until enough time has passed for gravity to affect larger scales, only the small scales can form structure early on. Without heavy elements to facilitate cooling, stars are expected to routinely exceed the mass thresholds of the most massive ones known today. (NASA)

    With NASA’s upcoming James Webb Space Telescope, we may discover Population III stars, which could reach thousands of solar masses.

    NASA/ESA/CSA Webb Telescope annotated


    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 8:16 am on June 25, 2019 Permalink | Reply
    Tags: "The highest-energy photons ever seen hail from the Crab Nebula", , , Basic Research, , , , , , The Tibet AS-gamma experiment, When a high-energy photon hits Earth’s atmosphere it creates a shower of other subatomic particles that can be detected on the ground.   

    From Science News: “The highest-energy photons ever seen hail from the Crab Nebula” 

    From Science News

    June 24, 2019
    Emily Conover

    Some of the supernova remnant’s gamma rays have more than 100 trillion electron volts of energy.

    CRAB FISHING Scientists hunting for high-energy photons raining down on Earth from space have found the most energetic light yet detected. It’s from the Crab Nebula, a remnant of an exploded star (shown in an image combining light seen by multiple telescopes).

    Physicists have spotted the highest-energy light ever seen. It emanated from the roiling remains left behind when a star exploded.

    This light made its way to Earth from the Crab Nebula, a remnant of a stellar explosion, or supernova, about 6,500 light-years away in the Milky Way. The Tibet AS-gamma experiment caught multiple particles of light — or photons — from the nebula with energies higher than 100 trillion electron volts, researchers report in a study accepted in Physical Review Letters. Visible light, for comparison, has just a few electron volts of energy.
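    For a sense of scale (my own unit conversion, not from the study): a photon’s energy fixes its wavelength through E = hc/λ, so a 100 TeV gamma ray has a wavelength roughly thirteen orders of magnitude shorter than visible light.

        H_EV_S = 4.135667696e-15  # Planck constant, eV*s
        C = 299_792_458.0         # speed of light, m/s

        def wavelength_m(energy_ev):
            """Photon wavelength from E = h * c / wavelength."""
            return H_EV_S * C / energy_ev

        print(wavelength_m(2.0))     # visible light, ~6.2e-7 m (620 nm)
        print(wavelength_m(100e12))  # 100 TeV gamma ray, ~1.2e-20 m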

    Tibet AS-gamma Experiment

    “This energy regime has not been accessible before,” says astrophysicist Petra Huentemeyer of Michigan Technological University in Houghton, who was not involved with the research. For physicists who study this high-energy light, known as gamma rays, “it’s an exciting time,” she says.

    In space, supernova remnants and other cosmic accelerators can boost subatomic particles such as electrons, photons and protons to extreme energies, much higher than those achieved in the most powerful earthly particle accelerators (SN: 10/1/05, p. 213). Protons in the Large Hadron Collider in Geneva, for example, reach a comparatively wimpy 6.5 trillion electron volts. Somehow, the cosmic accelerators vastly outperform humankind’s most advanced machines.

    “The question is: How does nature do it?” says physicist David Hanna of McGill University in Montreal.

    In the Crab Nebula, the initial explosion set up the conditions for acceleration, with magnetic fields and shock waves plowing through space, giving an energy boost to charged particles such as electrons. Low-energy photons in the vicinity get kicked to high energies when they collide with the speedy electrons, and ultimately, some of those photons make their way to Earth.

    When a high-energy photon hits Earth’s atmosphere, it creates a shower of other subatomic particles that can be detected on the ground. To capture that resulting deluge, Tibet AS-gamma uses nearly 600 particle detectors spread across an area of more than 65,000 square meters in Tibet. From the information recorded by the detectors, researchers can calculate the energy of the initial photon.

    But other kinds of spacefaring particles known as cosmic rays create particle showers that are much more plentiful. To select photons, cosmic rays, which are mainly composed of protons and atomic nuclei, need to be weeded out. So the researchers used underground detectors to look for muons — heavier relatives of electrons that are created in cosmic ray showers, but not in showers created by photons.

    Previous experiments have glimpsed photons with nearly 100 TeV, or trillion electron volts. Now, after about three years of gathering data, the researchers found 24 seemingly photon-initiated showers above 100 TeV, and some with energies as high as 450 TeV. Because the weeding out process isn’t perfect, the researchers estimate that around six of those showers could have come from cosmic rays mimicking photons, but the rest are the real deal.

    Researchers with Tibet AS-gamma declined to comment for this story, as the study has not yet been published.

    Looking for photons of ever higher energies could help scientists nail down the details of how the particles are accelerated. “There has to be a limit to how high the energy of the photons can go,” Hanna says. If scientists can pinpoint that maximum energy, that could help distinguish between various theoretical tweaks to how the particles get their oomph.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 4:40 pm on June 24, 2019 Permalink | Reply
    Tags: "The Interiors of Exoplanets May Well Hold the Key to Their Habitability", , , “The heart of habitability is in planetary interiors” concluded Carnegie geochemist George Cody, Basic Research, Cosmochemistry, , Deep Carbon Observatory’s Biology Meets Subduction project, Findings from the Curiosity rover that high levels of the gas methane had recently been detected on Mars., , , PREM-Preliminary Reference Earth Model, This idea that subsurface life on distant planets could be identified by their byproducts in the atmosphere has just taken on a new immediacy, We’ve only understood the Earth’s structure for the past hundred years.   

    From Many Worlds: “The Interiors of Exoplanets May Well Hold the Key to Their Habitability” 

    NASA NExSS bloc

    NASA NExSS

    Many Words icon

    From Many Worlds

    June 23, 2019
    Marc Kaufman

    Scientists have had a working — and evolving — understanding of the interior of the Earth for only a century or so. But determining whether a distant planet is truly habitable may require an understanding of its inner dynamics — which will for sure be a challenge to achieve. (Harvard-Smithsonian Center for Astrophysics)

    The quest to find habitable — and perhaps inhabited — planets and moons beyond Earth focuses largely on their location in a solar system and the nature of its host star, the eccentricity of its orbit, its size and rockiness, and the chemical composition of its atmosphere, assuming that it has one.

    Astronomy, astrophysics, cosmochemistry and many other disciplines have made significant progress in characterizing at least some of the billions of exoplanets out there, although measuring the chemical makeup of atmospheres remains an immature field.

    But what if these basic characteristics aren’t sufficient to answer necessary questions about whether a planet is habitable? What if more information — and even more difficult to collect information — is needed?

    That’s the position of many planetary scientists who argue that the dynamics of a planet’s interior are essential to understanding its habitability.

    With our existing capabilities, observing an exoplanet’s atmospheric composition will clearly be the first way to search for signatures of life elsewhere. But four scientists at the Carnegie Institution of Science — Anat Shahar, Peter Driscoll, Alycia Weinberger, and George Cody — argued in a recent perspective article in Science that a true picture of planetary habitability must consider how a planet’s atmosphere is linked to and shaped by what’s happening in its interior.

    They argue that on Earth, for instance, plate tectonics are crucial for maintaining a surface climate where life can fill every niche. And without the cycling of material between the planet’s surface and interior, the convection that drives the Earth’s magnetic field would not be possible and without a magnetic field, we would be bombarded by cosmic radiation.

    What makes a planet potentially habitable, and what signs suggest that it is not? This graphic from the Carnegie paper illustrates the differences. (Shahar et al.)

    “The perspective was our way to remind people that the only exoplanet observable right now is the atmosphere, but that the atmospheric composition is very much linked to planetary interiors and their evolution,” said lead author Shahar, who is trained in geological sciences. “If there is a hope to one day look for a biosignature, it is crucial we understand all the ways that interiors can influence the atmospheric composition so that the observations can then be better understood.”

    “We need a better understanding of how a planet’s composition and interior influence its habitability, starting with Earth,” she said. “This can be used to guide the search for exoplanets and star systems where life could thrive, signatures of which could be detected by telescopes.”

    It all starts with the formation process. Planets are born from the rotating ring of dust and gas that surrounds a young star.

    The elemental building blocks from which rocky planets form (silicon, magnesium, oxygen, carbon, iron, and hydrogen) are universal. But their abundances and the heating and cooling they experience in their youth will affect their interior chemistry and, in turn, defining factors such as ocean volume and atmospheric composition.

    “One of the big questions we need to ask is whether the geologic and dynamic features that make our home planet habitable can be produced on planets with different compositions,” Carnegie planetary scientist Peter Driscoll explained in a release.

    In the next decade as a new generation of telescopes come online, scientists will begin to search in earnest for biosignatures in the atmospheres of rocky exoplanets. But the colleagues say that these observations must be put in the context of a larger understanding of how a planet’s total makeup and interior geochemistry determines the evolution of a stable and temperate surface where life could perhaps arise and thrive.

    “The heart of habitability is in planetary interiors,” concluded Carnegie geochemist George Cody.

    Our knowledge of the Earth’s interior starts with these basic contours: it has a thin outer crust, a thick mantle, and a core the size of Mars. A basic question that can be asked and to some extent answered now is whether this structure is universal for small rocky planets. Will these three layers be present in some form in many other rocky planets as well?

    Earlier preliminary research published in The Astrophysical Journal suggests that the answer is yes – they will have interiors very similar to Earth’s.

    “We wanted to see how Earth-like these rocky planets are. It turns out they are very Earth-like,” said lead author Li Zeng of the Harvard-Smithsonian Center for Astrophysics (CfA).

    To reach this conclusion Zeng and his co-authors applied a computer model known as the Preliminary Reference Earth Model (PREM), which is the standard model for Earth’s interior. They adjusted it to accommodate different masses and compositions, and applied it to six known rocky exoplanets with well-measured masses and physical sizes.

    They found that the other planets, despite their differences from Earth, all should have a nickel/iron core containing about 30 percent of the planet’s mass. In comparison, about a third of the Earth’s mass is in its core. The remainder of each planet would be mantle and crust, just as with Earth.

    “We’ve only understood the Earth’s structure for the past hundred years. Now we can calculate the structures of planets orbiting other stars, even though we can’t visit them,” adds Zeng.

    The model assumes that distant exoplanets have chemical compositions similar to Earth. This is reasonable based on the relevant abundances of key chemical elements like iron, magnesium, silicon, and oxygen in nearby systems. However, planets forming in more or less metal-rich regions of the galaxy could show different interior structures.

    While thinking about exoplanetary interiors—and some day finding ways to investigate them — is intriguing and important, it’s also apparent that there’s a lot more to learn about role of the Earth’s interior in making the planet habitable.

    In 2017, for instance, an interdisciplinary group of early career scientists visited Costa Rica’s subduction zone (where the ocean floor sinks beneath the continent) to find out if subterranean microbes can affect geological processes that move carbon from Earth’s surface into the deep interior.

    Donato Giovannelli and Karen Lloyd collect samples from the crater lake in Poás Volcano in Costa Rica. (Katie Pratt)

    According to their new study in Nature, the answer is yes. The study shows that microbes consume and trap a small but measurable amount of the carbon sinking into the trench off Costa Rica’s Pacific coast. The microbes may also be involved in chemical processes that pull out even more carbon, leaving cement-like veins of calcite in the crust.

    In all, microbes and calcite precipitation combine to trap about 94 percent of the carbon squeezed out from the edge of the oceanic plate as it sinks into the mantle during subduction. This carbon remains naturally sequestered in the crust, where it cannot escape back to the surface through nearby volcanoes in the way that much carbon ultimately recycles.

    These unexpected findings have important implications for how much carbon moves from Earth’s surface into the interior, especially over geological timescales. The research is part of the Deep Carbon Observatory’s Biology Meets Subduction project.

    Overall, the study shows that biology has the power to affect carbon recycling and thereby deep Earth geology.

    “We already knew that microbes altered geological processes when they first began producing oxygen from photosynthesis,” said Donato Giovannelli of the University of Naples, Italy (whom I knew from time spent at the Earth-Life Science Institute in Tokyo). He is a specialist in extreme environments and researches what they can tell us about early Earth and possibly other planets.

    “I think there are probably even more ways that biology has had an outsized impact on geology, we just haven’t discovered them yet.”

    The findings also show, Giovannelli told me, that subsurface microbes might have a similarly outsized effect on the composition and balancing of atmospheres—“hinting to the possibility of detecting the indirect effect of subsurface life through atmosphere measurements of exoplanets,” he said.

    The 2003 finding by Michael Mumma and Geronimo Villanueva of NASA Goddard Space Flight Center showing signs of major plumes of methane on Mars. While some limited and seasonally determined concentrations of methane have been detected since, there has been nothing to compare with the earlier high methane readings on Mars — until just last week. (NASA/ M. Mumma et al)

    This idea that subsurface life on distant planets could be identified by their byproducts in the atmosphere has just taken on a new immediacy with findings from the Curiosity rover that high levels of the gas methane had recently been detected on Mars. Earlier research had suggested that Mars had some subsurface methane, but the amount appeared to be quite minimal — except as detected once back in 2003 by NASA scientists.

    None of the researchers now or in the past have claimed that they know the origin of the methane — whether it is produced biologically or through other planetary processes. But on Earth, some 90 percent of methane comes from biology — bacteria, plants, animals.

    Could, then, these methane plumes be a sign that life exists (or existed) below the surface of Mars? It’s possible, and highlights the great importance of what goes on below the surface of planets and moons.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Many Worlds
    There are many worlds out there waiting to fire your imagination.

    Marc Kaufman is an experienced journalist, having spent three decades at The Washington Post and The Philadelphia Inquirer, and is the author of two books on searching for life and planetary habitability. While the “Many Worlds” column is supported by the Lunar Planetary Institute/USRA and informed by NASA’s NExSS initiative, any opinions expressed are the author’s alone.

    This site is for everyone interested in the burgeoning field of exoplanet detection and research, from the general public to scientists in the field. It will present columns, news stories and in-depth features, as well as the work of guest writers.

    About NExSS

    The Nexus for Exoplanet System Science (NExSS) is a NASA research coordination network dedicated to the study of planetary habitability. The goals of NExSS are to investigate the diversity of exoplanets and to learn how their history, geology, and climate interact to create the conditions for life. NExSS investigators also strive to put planets into an architectural context — as solar systems built over the eons through dynamical processes and sculpted by stars. Based on our understanding of our own solar system and habitable planet Earth, researchers in the network aim to identify where habitable niches are most likely to occur, which planets are most likely to be habitable. Leveraging current NASA investments in research and missions, NExSS will accelerate the discovery and characterization of other potentially life-bearing worlds in the galaxy, using a systems science approach.
    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories (Hubble, Chandra, Spitzer) and associated programs. NASA shares data with various national and international organizations, such as the JAXA Greenhouse Gases Observing Satellite.

     
  • richardmitnick 12:44 pm on June 24, 2019 Permalink | Reply
    Tags: "NASA’s Fermi mission reveals record-setting gamma-ray bursts", , , , Basic Research, , ,   

    From Stanford University: “NASA’s Fermi mission reveals record-setting gamma-ray bursts” 

    Stanford University Name
    From Stanford University

    June 13, 2019

    NASA/DOE/FermiLAT Collaboration

    NASA/Fermi Gamma Ray Space Telescope

    NASA/Fermi LAT

    Stanford has played a leading role in compiling Fermi’s gamma-ray bursts catalogs ever since the space observatory launched nearly 11 years ago.

    For 10 years, NASA’s Fermi Gamma-ray Space Telescope has scanned the sky for gamma-ray bursts (GRBs), the universe’s most luminous explosions. A new catalog of the highest-energy blasts provides scientists with fresh insights into how they work.

    “Fermi is an ongoing experiment that keeps producing good science,” said Nicola Omodei, an astrophysicist at Stanford University’s School of Humanities and Sciences. “GRBs are really one of the most spectacular astronomical events that we witness.”

    The catalog was published in the June 13 edition of The Astrophysical Journal. More than 120 authors contributed to the paper, which was led by Omodei and Giacomo Vianello at Stanford, Magnus Axelsson at Stockholm University in Sweden, and Elisabetta Bissaldi at the National Institute of Nuclear Physics and Polytechnic University in Bari, Italy.

    Stanford has played a leading role in compiling Fermi’s GRB catalogs ever since the space observatory launched nearly 11 years ago. “All of the analysis tools and methods that led to the preparation of the catalogs were developed at Stanford and SLAC,” Omodei said. “We’ve continued to refine the analysis techniques and increase the sensitivity of the Fermi Large Area Telescope (LAT) to GRBs. For every GRB, we can characterize its duration, its temporal behavior, and its spectral properties.”

    GRBs emit gamma rays, the highest-energy form of light. Most GRBs occur when some types of massive stars run out of fuel and collapse to create new black holes. Others happen when two neutron stars, superdense remnants of stellar explosions, merge. Both kinds of cataclysmic events create jets of particles that move near the speed of light. The gamma rays are produced in collisions of fast-moving material inside the jets and when the jets interact with the environment around the star.

    Astronomers can distinguish the two GRB classes by the duration of their lower-energy gamma rays. Short bursts from neutron star mergers last less than 2 seconds, while long bursts typically continue for a minute or more. The new catalog, which includes 17 short and 169 long bursts, describes 186 events seen by Fermi’s Large Area Telescope (LAT) over the last 10 years.
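
    That 2-second dividing line lends itself to a simple classification rule. The sketch below is our own illustration, not the catalog team’s actual pipeline; the burst durations are approximate values for events mentioned in this article.

```python
# Classify gamma-ray bursts with the conventional 2-second duration cut.
# Durations below are approximate and for illustration only.

def classify_grb(duration_s: float) -> str:
    """Return 'short' or 'long' per the standard 2 s cut."""
    return "short" if duration_s < 2.0 else "long"

sample_bursts = {
    "GRB 081102B": 0.1,          # shortest burst in the LAT catalog
    "GRB 170817A": 1.7,          # the neutron-star merger event
    "GRB 160623A": 10 * 3600.0,  # longest burst, roughly 10 hours
}

for name, duration in sample_bursts.items():
    print(f"{name}: {duration:9.1f} s -> {classify_grb(duration)}")
```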

    Fermi observes these powerful bursts using two instruments. The LAT sees about one-fifth of the sky at any time and records gamma rays with energies above 30 million electron volts (MeV) — millions of times the energy of visible light. The Gamma-ray Burst Monitor (GBM) sees the entire sky that isn’t blocked by Earth and detects lower-energy emission. All told, the GBM has detected more than 2,300 GRBs so far.
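
    The “millions of times” comparison is easy to verify. Here is a back-of-the-envelope check, assuming a representative visible-light photon energy of about 2 eV (a value we supply; it is not quoted in the article):

```python
# Compare the LAT's 30 MeV threshold with a typical visible-light photon.
# The 2 eV photon energy is an assumed, representative value.

LAT_THRESHOLD_EV = 30e6   # 30 million electron volts
VISIBLE_PHOTON_EV = 2.0   # roughly the energy of a green-light photon

print(f"ratio = {LAT_THRESHOLD_EV / VISIBLE_PHOTON_EV:.1e}")  # ~1.5e+07, i.e. millions
```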

    Included in Fermi’s latest observation set are a number of record-setting and intriguing events, including the shortest burst ever recorded (GRB 081102B, which lasted just one-tenth of a second), the longest burst in the catalog (GRB 160623A, which remained illuminated for 10 hours), and the farthest known burst (GRB 080916C, located 12.2 billion light-years away in the constellation Carina).
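
    To put the farthest burst in context, light-travel distance converts directly into a fraction of cosmic history. A rough estimate, assuming the commonly quoted universe age of about 13.8 billion years (an assumption not stated in the article):

```python
# Fraction of cosmic history spanned by GRB 080916C's light.
# Assumes a universe age of ~13.8 billion years.

LIGHT_TRAVEL_GYR = 12.2   # billions of years the light has traveled
UNIVERSE_AGE_GYR = 13.8   # approximate age of the universe

print(f"{LIGHT_TRAVEL_GYR / UNIVERSE_AGE_GYR:.0%} of the way back to the Big Bang")  # ~88%
```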

    Also included in the new catalog is GRB 170817A, the first burst to have both its light and gravitational waves captured simultaneously. Light from the event — a product of two neutron stars crashing together — was recorded by Fermi’s GBM instrument, while the spacetime ripples it generated were detected by the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Virgo interferometer.


    VIRGO Gravitational Wave interferometer, near Pisa, Italy


    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger

    Gravity is talking. Lisa will listen. Dialogos of Eide

    ESA/eLISA the future of gravitational wave research

    Localizations of gravitational-wave signals detected by LIGO (GW150914, LVT151012, GW151226, GW170104) and, more recently, by the LIGO-Virgo network after Virgo came online in August 2017 (GW170814, GW170817)


    Skymap showing how adding Virgo to LIGO helps in reducing the size of the source-likely region in the sky. (Credit: Giuseppe Greco, Virgo Urbino group)

    “Now that LIGO and VIRGO have begun another observation period, the astrophysics community will be on the lookout for more joint GRB and gravitational wave events,” said Judy Racusin, a co-author and Fermi deputy project scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “This catalog was a monumental team effort, and the result helps us learn about the population of these events and prepares us for delving into future groundbreaking finds.”

    The Fermi Gamma-ray Space Telescope is an astrophysics and particle physics partnership managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Fermi was developed in collaboration with the U.S. Department of Energy, with important contributions from academic institutions and partners in France, Germany, Italy, Japan, Sweden and the United States.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Stanford University campus. No image credit

    Stanford University

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

     
  • richardmitnick 12:37 pm on June 23, 2019 Permalink | Reply
    Tags: 5 currently useful and operating telescopes to be removed from Mauna Kea, , , Basic Research, , , TMT work begins on Mauna Kea   

    From The New York Times: “In Hawaii, Construction to Begin on Disputed Telescope Project” 

    New York Times

    From The New York Times

    June 20, 2019
    Dennis Overbye

    TMT-Thirty Meter Telescope, proposed and now approved for Mauna Kea, Hawaii, USA, 4,207 m (13,802 ft) above sea level

    Gov. David Ige of Hawaii announced on Thursday that a “notice to proceed” had been issued for construction of a giant, long-contested telescope on Mauna Kea, the volcano on the Big Island that 13 major telescopes already call home. Construction could start as soon as July.

    Such an announcement has been anxiously awaited both by astronomers and by Hawaiian cultural activists since last year, when Hawaii’s Supreme Court restored the telescope’s building permit. As part of the deal, five telescopes currently operating on Mauna Kea will be shut down and their sites restored to their original condition.

    “We are all stewards of Mauna Kea,” Governor Ige said. He pledged to respect the rights and cultural traditions of the Hawaiian people, including the freedom to speak out against the telescope.

    He asked that further debate happen away from the mountain, where steep roads and limited water, oxygen and medical services pose a safety risk. As he spoke, arguments were already breaking out on Twitter and Facebook.

    “This decision of the Hawaiian Supreme Court is the law of the land, and it should be respected,” he said.

    The announcement was another skirmish, surely not the last, in the fight for control of the volcano’s petrified lava slopes and the sky overhead. The Thirty Meter Telescope would be the largest in the Northern Hemisphere. Hawaiian activists have long opposed it, contending that decades of telescope-building on Mauna Kea have polluted the mountain. In 2014, protesters disrupted a groundbreaking ceremony and blocked work vehicles from accessing the mountain.

    Mauna Kea is considered “ceded land” held in trust for the Hawaiian people, and some Hawaiians have argued that the spate of telescope construction atop the mountain has interfered with cultural and religious practices.

    The Thirty Meter Telescope would be built by an international collaboration called the TMT International Observatory. The project, which involves the University of California and the California Institute of Technology as well as Japan, China, India and Canada, is expected to cost $2 billion.

    In December 2015, the state’s Supreme Court invalidated a previous construction permit, on the grounds that the opponents had been deprived of due process because a state board had granted the permit before the opponents could be heard in a contested case hearing. The court awarded a new permit last year.

    At the time, astronomers with the project said they would build the telescope in the Canary Islands if denied in Hawaii.

    On Wednesday night, in a precursor to Thursday’s announcement, state authorities dismantled an assortment of structures that had been constructed on Mauna Kea by protesters.

    The structures included a pair of shacks called “hales,” one located across from a visitor center halfway up the mountain, where protests had been staged, and another at the base of the mountain that activists were using as a checkpoint.

    Also dismantled were two small stone monuments, or “ahus” — one on the road leading to the telescope site, the other in the middle of the site, according to a spokesman for the TMT project. They were built only recently, without a permit, and so were deemed by the court to have no historical value.

    But Kealoha Pisciotta, a leader of the opposition, called the dismantling a “desecration” and “a hostile and racist act,” in an email. “They call these Religious structures illegal structures but our rights are constitutionally protected and the right specifically protected is our right to ‘continue’ our practice,” she wrote.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 10:23 am on June 23, 2019 Permalink | Reply
    Tags: , Basic Research, , Mellanox HDR 200G InfiniBand is powering next-gen supercomputers,   

    From insideHPC: “Mellanox HDR 200G InfiniBand is powering next-gen supercomputers” 

    From insideHPC

    June 23, 2019

    Today Mellanox announced that HDR 200G InfiniBand is powering the next generation of supercomputers worldwide, enabling higher levels of research and scientific discovery. HDR 200G InfiniBand solutions include the ConnectX-6 adapters, Mellanox Quantum switches, LinkX cables and transceivers, and software packages. With the highest data throughput, extremely low latency, and smart In-Network Computing acceleration engines, HDR InfiniBand provides world-leading performance and scalability for the most demanding compute and data applications.

    HDR 200G InfiniBand introduces new offload and acceleration engines that deliver leading performance and scalability for high-performance computing, artificial intelligence, cloud, storage, and other applications. InfiniBand, a standards-based interconnect technology, enjoys the continuous development of new capabilities while maintaining backward and forward software compatibility. InfiniBand is the preferred choice for world-leading supercomputers, replacing lower-performance or proprietary interconnect options.

    “We are proud to have our HDR InfiniBand solutions accelerate supercomputers around the world, enhance research and discoveries, and advance Exascale programs,” said Gilad Shainer, senior vice president of marketing at Mellanox Technologies. “InfiniBand continues to gain market share and be selected by many research, educational and government institutes, weather and climate facilities, and commercial organizations. The technology advantages of InfiniBand make it the interconnect of choice for compute and storage infrastructures.”

    The Texas Advanced Computing Center’s (TACC) Frontera supercomputer, funded by the National Science Foundation, is the fastest supercomputer at any U.S. university and one of the most powerful systems in the world.

    TACC Frontera Dell EMC supercomputer fastest at any university

    Ranked #5 on the June 2019 TOP500 Supercomputers list, Frontera utilizes HDR InfiniBand, and in particular multiple 800-port HDR InfiniBand switches, to deliver unprecedented computing power for science and engineering.
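
    As a sanity check on those figures, the port arithmetic works out comfortably. The sketch below is our own illustrative calculation, not TACC’s published network design: a two-tier fat tree built from radix-r switches can reach on the order of r²/2 end points, so 800-port switches leave ample headroom for an 8,000+ node machine.

```python
# Rough capacity estimate for a two-tier non-blocking fat tree.
# Illustrative only -- not a description of Frontera's actual topology.

def fat_tree_capacity(radix: int) -> int:
    """Approximate max end points when each leaf switch splits its
    ports half to nodes and half to spine links: radix**2 / 2."""
    return radix * radix // 2

RADIX = 800  # port count quoted for the HDR InfiniBand switches
print(f"two-tier capacity: {fat_tree_capacity(RADIX):,} nodes")  # 320,000
print(f"fits 8,000+ nodes: {fat_tree_capacity(RADIX) >= 8000}")  # True
```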

    “HDR InfiniBand enabled us to build a world-leading, 8,000+ node, top 5 supercomputer that will serve our users’ needs for the next several years,” said Dan Stanzione, TACC Executive Director. “We appreciate the deep collaboration with Mellanox and are proud to host one of the fastest supercomputers in the world. We look forward to utilizing the advanced routing capabilities and the In-Network Computing acceleration engines to enhance our users’ research activities and scientific discoveries.”

    Located at the Mississippi State University High Performance Computing Collaboratory, the new HDR InfiniBand-based Orion supercomputer will accelerate the university’s research, educational and service activities.

    Dell EMC Orion supercomputer at Mississippi State University

    Ranked #62 on the June 2019 TOP500 list, the 1,800-node supercomputer leverages the performance advantages of HDR InfiniBand and its application acceleration engines to provide new levels of application performance and scalability.

    “HDR InfiniBand brings us leading performance and the ability to build very scalable and cost efficient supercomputers utilizing its high switch port density and configurable network topology,” said Trey Breckenridge, Director for High Performance Computing at Mississippi State University. “Over 16 years ago MSU became one of the first adopters of the InfiniBand technology in HPC. We are excited to continue that legacy by leveraging the latest InfiniBand technology to enhance the capabilities of our newest HPC system.”

    CSC, the Finnish IT Center for Science, and the Finnish Meteorological Institute selected HDR 200G InfiniBand to accelerate a multi-phase supercomputer program. The program will serve researchers in Finnish universities and research institutes, enhancing their research into climate science, renewable energy, astrophysics, nanomaterials, and bioscience, among a wide range of exploration activities. The first supercomputer is ranked #166 on the TOP500 list.

    “The new supercomputer will enable our researchers and scientists to leverage the most efficient HPC and AI platform to enhance their competitiveness for years to come,” said Pekka Lehtovuori, Director of services for research at CSC. “The HDR InfiniBand technology, and the Dragonfly+ network topology will provide our users with leading performance and scalability while optimizing our total cost of ownership.”

    Cygnus is the first HDR InfiniBand supercomputer in Japan, located in the Center for Computational Sciences at the University of Tsukuba.

    Cygnus FPGA GPU supercomputer at University of Tsukuba Japan

    Ranked #264 on the TOP500 list, Cygnus leverages HDR InfiniBand to connect CPUs, GPUs and FPGAs together, enabling accelerated research in the areas of astrophysics, particle physics, material science, life science, meteorology and artificial intelligence.

    The Center for Development of Advanced Computing (C-DAC) has selected HDR InfiniBand for India’s national supercomputing mission. The C-DAC HDR InfiniBand supercomputer advances India’s research, technology, and product development capabilities.

    “The Center for Development of Advanced Computing (C-DAC), an autonomous R&D institution under the Ministry of Electronics and IT, Government of India, with its focus on Advanced Computing, is uniquely positioned to establish a dependable and secure Exascale ecosystem offering services in various domains. As our nation embarks upon its most revolutionary phase of Digital Transformation, C-DAC has committed itself to explore and engage in avant-garde, visionary areas, excelling beyond its present areas of research and transforming human lives through technological advancement,” said Dr. Hemant Darbari, Director General, C-DAC.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    insideHPC
    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

     