Tagged: Basic Research

  • richardmitnick 3:55 pm on June 25, 2017 Permalink | Reply
    Tags: Basic Research, Interstellar medium, NASA CHESS - Colorado High-resolution Echelle Stellar Spectrograph, University of Colorado

    From Goddard: “NASA-Funded CHESS Mission Will Check Out the Space Between Stars” 

    NASA Goddard Banner
    NASA Goddard Space Flight Center

    June 23, 2017
    Lina Tran
    kathalina.k.tran@nasa.gov
    NASA’s Goddard Space Flight Center, Greenbelt, Md.

    The space between distant stars is not empty. Vast clouds of neutral atoms and molecules drift there, along with charged plasma particles; together they make up the interstellar medium, material that may, over millions of years, evolve into new stars and even planets. These floating interstellar reservoirs are the focus of the NASA-funded CHESS sounding rocket mission, which will check out the earliest stages of star formation.

    CHESS — short for the Colorado High-resolution Echelle Stellar Spectrograph — is a sounding rocket payload that will fly on a Black Brant IX suborbital sounding rocket early in the morning of June 27, 2017. CHESS measures light filtering through the interstellar medium to study the atoms and molecules within, which provides crucial information for understanding the lifecycle of stars.

    Floating clouds of the interstellar medium are the focus of the NASA-funded CHESS sounding rocket mission, which will check out the earliest stages of star formation. Here, the CHESS payload is integrated with the sounding rocket before launch.
    Credits: photo courtesy of Kevin France

    “The interstellar medium pervades the galaxy,” said Kevin France, the CHESS principal investigator at the University of Colorado, Boulder.

    “When massive stars explode as supernovae, they expel this raw material. It’s the insides of dead stars, turning into the next generation of stars and planets.”

    CHESS is a spectrograph, which provides information on how much of any given wavelength of light is present. It will train its eye on Beta Scorpii — a hot, brightly shining star in the Scorpius constellation that is well-positioned for the instrument to probe the material between the star and our own solar system. As light from Beta Scorpii streams toward Earth, atoms and molecules — including carbon, oxygen and hydrogen — block the light to varying degrees along the way.

    Scientists know which wavelengths are blocked by which species, so by looking at how much light reaches the space around Earth, they can assess all sorts of details about the space it travelled through to get there. CHESS data provide measurements such as which atoms and molecules are present in space, their temperatures, and how fast they're moving.
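
    As a back-of-the-envelope illustration (these are standard absorption-spectroscopy relations, not the CHESS team's actual pipeline): the depth of an absorption line encodes how much material lies along the sightline, and its Doppler shift encodes how fast that material is moving.

    I(lambda) = I_0(lambda) × e^(-tau(lambda)), where the optical depth tau scales with the column density of the absorbing species

    v = c × (lambda_observed - lambda_rest) / lambda_rest

    For example, a hydrogen Lyman-alpha line with rest wavelength 121.60 nm observed at 121.61 nm would imply a line-of-sight velocity of roughly (0.01 / 121.6) × 300,000 km/s ≈ 25 km/s.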

    The scientists also use CHESS data to evaluate how the interstellar cloud is structured, which can help them pinpoint where it stands in the process of star formation. It’s still not known exactly how long it takes for this material to be incorporated into new stars. But scientists know dense clouds can pave the way for the collapse at the very beginning of star formation.

    The flight of a sounding rocket is a short one; CHESS will fly for about 16 minutes total. Just six and a half of those minutes are spent making observations between 90 and 200 miles above the surface — observations that can only be made in space, above the atmosphere, which blocks the far-ultraviolet light CHESS observes. After the flight, the payload parachutes to the ground, where it can be recovered for future flights.

    This is the third flight for the CHESS payload in the past three years, and the mission's most detailed survey yet. The scientists have used each flight to test and improve the technology; the upcoming flight sports an upgraded diffraction grating, which reflects light and separates it into its component wavelengths.

    “A more efficient grating means the instrument is that many times more sensitive,” France said. “Compared to the first flight of CHESS, this third incarnation is about eight times more sensitive.”

    By flying rapidly developing instruments on relatively inexpensive sounding rockets, scientists are not only able to acquire high-quality science data, but also test and mature their instruments toward possible spaceflight. According to France, the CHESS instrument serves as a spectrograph prototype for LUVOIR, NASA's Large Ultraviolet/Optical/Infrared Surveyor mission concept.

    “Supporting technology and suborbital flight projects today directly translates into lower risk and shorter development time for NASA’s large missions in the next two decades,” France said.

    The launch window for CHESS opens at 1:10 a.m. EDT on June 27 at the White Sands Missile Range near Las Cruces, New Mexico. Precise timing of the launch will depend on weather conditions.

    CHESS is supported through NASA’s Sounding Rocket Program conducted at the agency’s Wallops Flight Facility, which is managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Orbital ATK provides mission planning, engineering services and field operations for the NASA Sounding Rocket Operations Contract. NASA’s Heliophysics Division manages the sounding rocket program for the agency.

    Related:

    More about NASA’s sounding rocket program
    Voyager 1 Helps Solve Interstellar Medium Mystery
    NASA’s IBEX Provides First View of the Solar System’s Tail

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA’s Goddard Space Flight Center is home to the nation’s largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

    Named for American rocketry pioneer Dr. Robert H. Goddard, the center was established in 1959 as NASA’s first space flight complex. Goddard and its several facilities are critical in carrying out NASA’s missions of space exploration and scientific discovery.


    NASA/Goddard Campus

     
  • richardmitnick 11:27 am on June 25, 2017 Permalink | Reply
    Tags: Basic Research, D.O.E. Office of Science, Lambda-Cold Dark Matter Accelerated Expansion of the Universe, LBNL/DESI Dark Energy Spectroscopic Instrument

    From US D.O.E. Office of Science: “Our Expanding Universe: Delving into Dark Energy” 

    DOE Main

    Department of Energy Office of Science

    06.21.17
    Shannon Brescher Shea
    shannon.shea@science.doe.gov

    Space is expanding ever more rapidly and scientists are researching dark energy to understand why.

    This diagram shows the timeline of the universe, from its beginnings in the Big Bang to today. Image courtesy of NASA/WMAP Science Team.

    The universe is growing a little bigger, a little faster, every day.

    And scientists don’t know why.

    If this continues, almost all other galaxies will be so far away from us that one day, we won’t be able to spot them with even the most sophisticated equipment. In fact, we’ll only be able to spot a few cosmic objects outside of the Milky Way. Fortunately, this won’t happen for billions of years.

    But it’s not supposed to be this way – at least according to theory. Based on the fact that gravity pulls galaxies together, Albert Einstein’s theory predicted that the universe should be expanding more slowly over time. But in 1998, astrophysicists were quite surprised when their observations showed that the universe was expanding ever faster. Astrophysicists call this phenomenon “cosmic acceleration.”

    “Whatever is driving cosmic acceleration is likely to dominate the future evolution of the universe,” said Josh Frieman, a researcher at the Department of Energy’s (DOE) Fermilab [FNAL] and director of the Dark Energy Survey.


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope at Cerro Tololo, Chile, which houses DECam

    While astrophysicists know little about it, they often use “dark energy” as shorthand for the cause of this expansion. Based on its effects, they estimate dark energy could make up 70 percent of the combined mass and energy of the universe. Something unknown that lies outside our current understanding of the laws of physics, yet is the major influence on the growth of the universe, adds up to one of the biggest mysteries in physics. DOE’s Office of Science is supporting a number of projects to investigate dark energy to better understand this phenomenon.

    The Start of the Universe

    Before scientists can understand what is causing the universe to expand now, they need to know what happened in the past. The energy from the Big Bang drove the universe’s early expansion. Since then, gravity and dark energy have engaged in a cosmic tug of war. Gravity pulls galaxies closer together; dark energy pushes them apart. Whether the universe is expanding or contracting depends on which force dominates, gravity or dark energy.

    Just after the Big Bang, the universe was much smaller and composed of an extremely high-energy plasma. This plasma was vastly different from anything today. It was so dense that it trapped all energy, including light. Unlike the current universe, which has expanses of “empty” space dotted by dense galaxies of stars, this plasma was nearly evenly distributed across that ancient universe.

    As the universe expanded and became less dense, it cooled. In a blip in cosmic time, protons and electrons combined to form neutral hydrogen atoms. When that happened, light was able to stream out into the universe to form what is now known as the “cosmic microwave background [CMB].”

    CMB per ESA/Planck


    ESA/Planck

    Today’s instruments that detect the cosmic microwave background provide scientists with a view of that early universe.
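
    A quick consistency check, using the standard textbook relation rather than anything in the article: the cosmic microwave background was released when the universe had cooled to roughly 3,000 K, at a redshift of about z = 1,100. Expansion stretches wavelengths, and therefore lowers the radiation temperature, by a factor of (1 + z):

    T_today = T_emission / (1 + z) ≈ 3,000 K / 1,100 ≈ 2.7 K

    That matches the faint 2.7 K microwave glow those instruments measure today.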

    Back then, gravity was the major force that influenced the structure of the universe. It slowed the rate of expansion and made it possible for matter to coalesce. Eventually, the first stars appeared about 400 million years after the Big Bang. Over the next several billion years, larger and larger structures formed: galaxies and galaxy clusters, containing billions to quadrillions (a million billion) of stars. While these cosmic objects formed, the space between galaxies continued to expand, but at an ever slower rate thanks to gravitational attraction.

    Lambda-Cold Dark Matter, accelerated expansion of the universe, Big Bang-inflation (timeline of the universe), 2010. Credit: Alex Mittelmann, Coldcreation.

    But somewhere between 3 and 7 billion years after the Big Bang, something happened: instead of the expansion slowing down, it sped up. Dark energy started to have a bigger influence than gravity. The expansion has been accelerating ever since.

    Scientists used three different types of evidence to work out this history of the universe. The original evidence in 1998 came from observations of a specific type of supernova [Type Ia]. Two other types of evidence in the early 2000s provided further support.

    “It was this sudden avalanche of results through cosmology,” said Eric Linder, a Berkeley Lab researcher and Office of Science Cosmic Frontier program manager.

    Now, scientists estimate that galaxies are getting 0.007 percent further away from each other every million years. But they still don’t know why.
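
    That figure follows from the measured expansion rate, the Hubble constant, roughly H_0 ≈ 70 km/s per megaparsec (a back-of-the-envelope check using standard values, not a calculation from the article):

    H_0 ≈ 70 km/s / (3.09 × 10^19 km) ≈ 2.3 × 10^-18 per second

    Over one million years (about 3.2 × 10^13 seconds), distances therefore grow by roughly 2.3 × 10^-18 × 3.2 × 10^13 ≈ 7 × 10^-5, or 0.007 percent.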

    What is Dark Energy?

    “Cosmic acceleration really points to something fundamentally different about how the forces of the universe work,” said Daniel Eisenstein, a Harvard University researcher and former director of the Sloan Digital Sky Survey. “We know of four major forces: gravity, electromagnetism, and the weak and strong forces. And none of those forces can explain cosmic acceleration.”

    So far, the evidence has spurred two competing theories.

    The leading theory is that dark energy is the “cosmological constant,” a concept Albert Einstein created in 1917 to balance his equations to describe a universe in equilibrium. Without this cosmological constant to offset gravity, a finite universe would collapse into itself.

    Today, scientists think the constant may represent the energy of the vacuum of space. Instead of being “empty,” this would mean space is actually exerting pressure on cosmic objects. If this idea is correct, the distribution of dark energy should be the same everywhere.
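
    The link between pressure and acceleration comes from the standard acceleration equation of cosmology (a textbook relation, not spelled out in the article), in which the expansion responds to both the energy density rho and the pressure p:

    a''/a = -(4 π G / 3) × (rho + 3p/c^2)

    Ordinary matter has essentially zero pressure, so it always decelerates the expansion. A cosmological constant has p = -rho c^2, which flips the sign of the right-hand side: the expansion accelerates. That is why a vacuum energy with negative pressure fits the observed speed-up.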

    All of the observations fit this idea – so far. But there’s a major issue. The theoretical equations and the physical measurements don’t match. When researchers calculate the cosmological constant using standard physics, they end up with a number that is off by a factor of about 10^120 (a 1 with 120 zeros after it).

    “It’s hard to make a math error that big,” joked Frieman.

    That major difference between observation and theory suggests that astrophysicists do not yet fully understand the origin of the cosmological constant, even if it is the cause of cosmic acceleration.

    The other possibility is that “dark energy” is the wrong label altogether. A competing theory posits that the universe is expanding ever more rapidly because gravity acts differently at very large scales from what Einstein’s theory predicts. While there’s less evidence for this theory than that for the cosmological constant, it’s still a possibility.

    The Biggest Maps of the Universe

    To collect evidence that can prove or disprove these theories, scientists are creating a visual history of the universe’s expansion. These maps will allow astrophysicists to see dark energy’s effects over time. Finding that the structure of the universe changed in a way that’s consistent with the cosmological constant’s influence would provide strong evidence for that theory.

    There are two types of surveys: imaging and spectroscopic. The Dark Energy Survey and Large Synoptic Survey Telescope (LSST) are imaging surveys, while the Baryon Oscillation Spectroscopic Survey (part of the Sloan Digital Sky Survey), eBOSS, and the Dark Energy Spectroscopic Instrument are spectroscopic.


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on Cerro Pachón, a 2,682-meter-high mountain in Chile's Coquimbo Region, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

    Supercluster map from the Baryon Oscillation Spectroscopic Survey (BOSS)

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Arizona, USA

    Imaging surveys use giant cameras – some the size of cars – to take photos of the night sky. The farther away the object, the longer the light has taken to reach us. Taking pictures of galaxies, galaxy clusters, and supernovae at various distances shows how the distribution of matter has changed over time. The Dark Energy Survey, which started collecting data in 2013, has already photographed more than 300 million galaxies. By the time it finishes in 2018, it will have taken pictures of about one-eighth of the entire night sky. The LSST will further expand what we know. When it starts in 2022, the LSST will use the world’s largest digital camera to take pictures of 20 billion galaxies.

    “That is an amazing number. It could be 10% of all of the galaxies in the observable universe,” said Steve Kahn, a professor of physics at Stanford and LSST project director.

    However, these imaging surveys miss a key data point: how fast the Milky Way and other galaxies are moving away from each other. Spectroscopic surveys, which spread each object's light into its component wavelengths, can provide that information. They can also more accurately estimate how far away galaxies are. Put together, this information allows astrophysicists to look back in time.
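
    The key quantity a spectroscopic survey measures is redshift (the relations below are generic textbook formulas, not specific to any one survey). A spectral line emitted at wavelength lambda_rest and observed at lambda_obs gives

    z = (lambda_obs - lambda_rest) / lambda_rest

    For relatively nearby galaxies the recession velocity is v ≈ cz, and Hubble's law then gives the distance, d ≈ v / H_0. A galaxy at z = 0.1, for instance, recedes at about 30,000 km/s, which for H_0 ≈ 70 km/s/Mpc puts it roughly 430 megaparsecs away.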

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the larger Sloan Digital Sky Survey, was one of the biggest projects to take, as the name implies, a spectroscopic approach. It mapped more than 1.2 million galaxies and quasars.

    Universe map from the Sloan Digital Sky Survey (SDSS) and the 2dF Galaxy Redshift Survey

    However, there’s a major gap in BOSS’s data. It could measure what was going on 5 billion years ago using bright galaxies and 10 billion years ago using bright quasars. But it had nothing about what was going on in-between. Unfortunately, this time period is most likely when dark energy started dominating.

    “Seven billion years ago, dark energy starts to really dominate and push the universe apart more rapidly. So we’re making these maps now that span that whole distance. We start in the backyard of the Milky Way, our own galaxy, and we go out to 7 billion light years,” said David Schlegel, a Berkeley Lab researcher who is the BOSS principal investigator. A distance of 7 billion light-years corresponds to light that left its source 7 billion years ago and is only now reaching our telescopes.

    Two new projects are filling that gap: the eBOSS survey and the Dark Energy Spectroscopic Instrument (DESI). eBOSS will target the missing time span from 5 to 7 billion years ago.

    SDSS eBOSS.

    DESI will go back even further – 11 billion light-years. Even though dark energy was weaker relative to gravity back then, surveying a larger volume of space will allow scientists to make even more precise measurements. DESI will also collect 10 times more data than BOSS. When it starts taking observations in 2019, it will measure light from 35 million galaxies and quasars.

    “We now realize that the majority of … the universe is stuff that we’ll never be able to directly measure using experiments here on Earth. We have to infer their properties by looking to the cosmos,” said Rachel Bean, a researcher at Cornell University who is the spokesperson for the LSST Dark Energy Science Collaboration. Solving the mystery of the galaxies rushing away from each other, “really does present a formidable challenge in physics. We have a lot of work to do.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of the Energy Department is to ensure America’s security and prosperity by addressing its energy, environmental and nuclear challenges through transformative science and technology solutions.

    Science Programs Organization

    The Office of Science manages its research portfolio through six program offices:

    Advanced Scientific Computing Research
    Basic Energy Sciences
    Biological and Environmental Research
    Fusion Energy Sciences
    High Energy Physics
    Nuclear Physics

    The Science Programs organization also includes the following offices:

    The Department of Energy’s Small Business Innovation Research and Small Business Technology Transfer Programs, which the Office of Science manages for the Department;
    The Workforce Development for Teachers and Students program sponsors programs helping develop the next generation of scientists and engineers to support the DOE mission, administer programs, and conduct research; and
    The Office of Project Assessment provides independent advice to the SC leadership regarding those activities essential to constructing and operating major research facilities.

     
  • richardmitnick 10:11 am on June 25, 2017 Permalink | Reply
    Tags: Basic Research, N44 in the LMC

    From Manu: “The Superbubble N44” 


    Manu Garcia, a friend from IAC.

    The universe around us.
    Astronomy, everything you wanted to know about our local universe and never dared to ask.

    N44. Credit: Gemini Observatory, AURA, NSF

    The immense emission nebula N44 in our neighboring galaxy, the Large Magellanic Cloud, has a gaping hole 250 light-years across, and astronomers are trying to figure out why. One possibility is that particle winds expelled by massive stars inside the bubble pushed out the brightly glowing gas.

    Large Magellanic Cloud. Adrian Pingstone December 2003

    However, that explanation turned out to be inconsistent with the measured wind speeds. Another possibility is that the expanding shells of old supernovas sculpted the unusual space cavern. Researchers recently detected an unexpected stream of X-ray-hot gas escaping from the superbubble N44. The featured image was taken in three very specific colors by the giant 8-meter Gemini South telescope on Cerro Pachón in Chile.

    NOAO Gemini Planet Imager on Gemini South


    Gemini South telescope on the summit of Cerro Pachón, Chile

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:50 am on June 25, 2017 Permalink | Reply
    Tags: Asteroid research, Basic Research, GeekWire

    From GeekWire: “Are asteroids leaving the spotlight? No way, say Asteroid Day activists” 


    GeekWire

    June 24, 2017

    Chelsey Ballarte
    Alan Boyle

    An artist’s concept shows asteroids zooming past Earth (NASA / Asteroid Day Illustration).

    NASA may be closing down its grand plan to study a piece of an asteroid up close, but the researchers who focus on near-Earth objects aren’t turning their backs on massive space boulders.

    They say it’s just a matter of time before we’ll be forced to head off a threatening asteroid. On Friday, they’ll be calling attention to the challenge — and what scientists and activists are doing to address it.

    For the past two years, the organizers of Asteroid Day have focused on June 30 as a time to turn an international spotlight on planetary defense. The date marks the anniversary of the Tunguska explosion, a presumed asteroid strike that destroyed half a million acres of forest in Siberia in 1908.

    This year, with the United Nations’ encouragement, 190 countries around the world are planning a total of more than 700 Asteroid Day events, ranging from planetarium shows and virtual reality tours to a 24-hour streaming video marathon.

    Seattle’s Museum of Flight is hosting Asteroid Awareness Day, which will feature fun activities for the kids and live-stream lectures for the grownups.

    Kids in Chile hold up drawings in celebration of Asteroid Day. No image credit.

    One of the organizers of the worldwide Asteroid Day event is former NASA astronaut Ed Lu, executive director of the nonprofit B612 Foundation’s Asteroid Institute.

    More than a decade ago, Lu proposed a method for diverting an asteroid [Nature] that was on a long-term collision course for Earth, by stationing a spacecraft near the asteroid and letting its gravitational attraction pull the asteroid ever so gently into a non-threatening trajectory.
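
    The scale of that gentle pull is easy to estimate (the spacecraft mass and standoff distance below are illustrative assumptions, not figures from Lu's proposal). A spacecraft of mass m hovering a distance d from the asteroid tugs on it with acceleration a = Gm/d^2. For a 1,000-kilogram spacecraft holding station 100 meters from the asteroid's center:

    a = (6.7 × 10^-11 × 1,000) / (100)^2 ≈ 7 × 10^-12 m/s^2

    Sustained for a decade (about 3 × 10^8 seconds), that imparts a velocity change of roughly 2 millimeters per second. Tiny, but applied decades ahead of a predicted impact, it can be enough to turn a hit into a miss.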

    The “gravity tractor” concept would have been tested during NASA’s Asteroid Redirect Mission, or ARM, which had been planned for the mid-2020s. The Obama administration championed ARM, but now that President Donald Trump is in the White House, it’s being defunded.

    Despite ARM’s cutoff, Lu says other studies focusing on how to deflect potentially threatening asteroids will move forward.

    “We still think a test of a gravity tractor would be very useful,” he told GeekWire.

    Lu points to future space shots such as the European Space Agency’s Asteroid Impact Mission and NASA’s Double Asteroid Redirection Test, or DART, which is designed to smash a probe into an asteroid and see how much its trajectory changes.

    ESA AIM Asteroid Impact Mission

    NASA DART Double Asteroid Redirection Test vehicle

    NASA’s OSIRIS-REx mission, launched last September, aims to bring bits of an asteroid back to Earth for study in 2023.

    NASA OSIRIS-REX OVIRS


    NASA OSIRIS-REx Spacecraft

    NASA says the data produced during ARM’s planning stages will be used to guide preparations for future missions. “Asteroid encounter mission concepts remain of interest due to the broad array of benefits for the human and robotic exploration, science, planetary defense and asteroidal resources communities,” the space agency said in this month’s ARM update.

    Concerns about asteroid threats have risen due to fossil evidence suggesting that cosmic impacts were behind ancient mass extinctions — including the demise of the dinosaurs 65 million years ago — as well as more recent events such as 2013’s Chelyabinsk meteor explosion and 1908’s Tunguska blast.

    Scientists estimate that a Tunguska-scale impact might happen every few centuries or so on average. NASA is already tracking more than 15,000 near-Earth asteroids, and Czech astronomers recently reported a new source of potentially hazardous space debris.
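
    To put "every few centuries" in perspective: if Tunguska-scale impacts arrive randomly at an assumed average rate of one per 300 years (an illustrative figure, not a number from the article), the chance of at least one occurring in the next century is

    P = 1 - e^(-100/300) ≈ 28 percent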

    The asteroid-tracking game is likely to become more complex in the 2020s, when the Large Synoptic Survey Telescope, or LSST, goes into full swing in Chile.


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on Cerro Pachón, a 2,682-meter-high mountain in Chile's Coquimbo Region, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

    The University of Washington’s DIRAC Institute and the B612 Foundation’s Asteroid Institute are already bracing for what’s expected to be a flood of LSST asteroid discoveries.

    Alan Fitzsimmons, an astrophysicist from Queen’s University Belfast who’s part of Europe’s NEOshield-2 project, says scientists and engineers are making great strides in detecting asteroids – and adds a caveat that’s fitting for Asteroid Day.

    “Astronomers find Near-Earth asteroids every day, and most are harmless,” he said in a news release. “But it is still possible the next Tunguska would take us by surprise, and although we are much better at finding larger asteroids, that does us no good if we are not prepared to do something about them.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:37 pm on June 24, 2017 Permalink | Reply
    Tags: Ask Ethan: Can Failed Stars Eventually Succeed?, Basic Research

    From Ethan Siegel: “Ask Ethan: Can Failed Stars Eventually Succeed?” 

    Ethan Siegel
    Jun 24, 2017


    The closest brown dwarf system to Earth, Luhman 16, contains enough total mass to form a red dwarf star if everything within it were combined. The question of whether this will ever happen in our Universe is an interesting one. Janella Williams, Penn State University.

    In the night sky, the most apparent things of all are the stars, found in every direction we dare to look. But for every star that gathers enough mass to ignite nuclear fusion in its core, burning hydrogen into helium and turning matter into energy via E = mc^2, there are many other objects that didn’t make it that far. Most collections of mass that start to form in a nebula never get big enough to become a star, and instead become fragmented gas clouds, asteroids, rocky worlds, gas giants, or brown dwarfs. The brown dwarfs are the “failed stars” of the Universe, having gathered enough mass to ignite some rare-isotope fusion reactions, but not enough to become true stars. But many brown dwarfs come in binary pairs, leading Ibnul Hussaini to wonder if they might, someday, merge:

    “Will the orbit of these [brown dwarfs] over a long period of time, eventually become smaller and smaller from the loss of energy through gravitational waves? Will they then eventually end up merging? If so, what happens in a [brown dwarf] merger? Will they merge to become an actual star that goes through fusion? Or is it something else entirely?”

    In astronomy, as in life, just because you didn’t make it on the first try doesn’t mean you’ll never get there. Let’s start by looking at the ones that make it.

    An illustration of a giant planet around a red dwarf star. The difference between a planet, a failed star, and a true star comes down to one thing only: mass. ESO.

    In order to ignite nuclear fusion in the core of a star — to get hydrogen nuclei to fuse — you need to reach a temperature of around 4,000,000 K. The gas that stars form from in interstellar space begins at relatively cold temperatures: just a few tens of degrees above absolute zero. But once gravitation kicks in, it causes this cloud of gas to collapse. When collapse occurs, the atoms inside gain speed, collide with each other, and heat up. If there were only a small number of atoms present, they’d emit that heat out into the interstellar medium, sending light streaming throughout the galaxy. But when you get large numbers of atoms together, they trap that heat, causing the interior of the gas cloud to heat up.

    The constellation of Orion, along with the great molecular cloud complex and including its brightest stars. Many new stars are presently forming here due to the collapse of gas, which traps the heat from stellar formation. Rogelio Bernal Andreo.

    If you form something very small, like the mass of an asteroid, Earth, or even Jupiter, you might heat up to thousands or even tens of thousands of degrees in your core, but you’ll still be very far away from that fusion temperature. But if you hit a certain critical mass — about thirteen times the mass of Jupiter — you’ll achieve a temperature of about 1,000,000 K. That’s not enough to begin fusing hydrogen into helium, but is a critical temperature for a very specific reaction: deuterium fusion. About 0.002% of the hydrogen in the Universe doesn’t just have a single proton as its nucleus, but rather a proton and a neutron bound together, known as a deuteron. At temperatures of a million degrees, a deuteron and a proton can fuse together into helium-3 (an uncommon isotope of helium), a reaction which releases energy.
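
    Written out as a reaction (standard nuclear physics, consistent with the figure below), deuterium fusion is:

    H-2 + H-1 → He-3 + gamma, releasing about 5.5 MeV per fusion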

    The proton-proton chain responsible for producing the vast majority of the Sun’s power is an example of nuclear fusion. In deuterium fusion, only the deuterium (H-2) + proton (H-1) going to helium-3 (He-3) reaction can occur. Borb / Wikimedia Commons.

    This is important! This release of energy, particularly during the protostar (i.e., star-formation) phase, generates high-energy radiation that pushes back against internal gravitational collapse, preventing the very center from getting too hot and hitting that 4,000,000 K threshold. This buys you extra time — tens of thousands of years or more — allowing you to gather more and more mass. Once you start fusing pure hydrogen (i.e., protons) in your core, the energy release is so intense that stars don’t grow any larger, so those early, first stages are critical. If it weren’t for deuterium fusion, the most massive stars would cap out at only about three times the mass of our Sun, instead of the hundreds of solar masses they reach in our backyard.

    A composite image of the first exoplanet ever directly imaged (red) and its brown dwarf parent star, as seen in the infrared. A true star would be much physically larger and higher in mass than the brown dwarf shown here. European Southern Observatory (ESO).

    In order to ever reach that 4,000,000 K temperature in your core, and thereby become a true star, you need a minimum of about 7.5% the mass of our Sun: around 1.5 × 10^29 kg of mass. To become a deuterium-fusing brown dwarf, also known as a failed star, you need somewhere between 2.5 × 10^28 kg and 1.5 × 10^29 kg of mass. And just as there are binary stars out there in great numbers, so, too, are there binary brown dwarfs.
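
    Those thresholds are easy to check against familiar masses (a quick calculation with standard values): the Sun's mass is about 2.0 × 10^30 kg, and 7.5 percent of that is the 1.5 × 10^29 kg hydrogen-burning limit quoted above. Jupiter's mass is about 1.9 × 10^27 kg, and thirteen Jupiters comes to the 2.5 × 10^28 kg deuterium-burning floor.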

    These are the two brown dwarfs that make up Luhman 16, and they may eventually merge together to create a star. NASA/JPL/Gemini Observatory/AURA/NSF.

    Gemini/North telescope at Mauna Kea, Hawaii, USA

    Gemini South telescope on the summit of Cerro Pachón, Chile

    NRAO/Karl V Jansky VLA, on the Plains of San Agustin fifty miles west of Socorro, NM, USA

    In fact, the closest brown dwarf to us, the system Luhman 16, is a binary system, while other brown dwarfs have been known to have giant planets orbiting them. In the specific case of Luhman 16, the masses of the two brown dwarfs are determined to be:

    Between 8.0 × 10^28 kg and 1.0 × 10^29 kg for the primary, and
    between 6.0 × 10^28 kg and 1.0 × 10^29 kg for the secondary.

    In other words, there’s an excellent chance that if these two failed stars, orbiting at about three times the Earth-Sun distance from one another, were to merge, they would form an actual star. In fact, any addition of mass that takes a failed star over that mass threshold to begin burning hydrogen in its core ought to do it.
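
    The arithmetic bears that out (simply summing the ranges above): the combined mass of the pair lies somewhere between 1.4 × 10^29 kg and 2.0 × 10^29 kg, straddling the roughly 1.5 × 10^29 kg hydrogen-burning threshold. If the true masses sit toward the upper ends of their ranges, the merger product would ignite as a genuine star.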

    The two brown dwarfs that make up Luhman 16 have been imaged twelve separate times by the Hubble Space Telescope, indicating their motion and relative orbits over a multi-year time period. Image credit: Hubble / ESA, L. Bedin / INAF.

    Ibnul’s hunch is on the right track: yes, it’s true that orbiting masses do emit gravitational waves, and that the emission of these waves will cause orbits to decay. But for these masses and distances, we’re talking about decay times of somewhere in the neighborhood of 10^200 years, which is much, much longer than the lifetime of the Universe. In fact, it’s much longer than the lifetime of any star at all, of the galaxy, or even of the galaxy’s central black hole. If you wait around for gravitational waves to turn this binary pair of brown dwarfs into a star, you’re going to be waiting a disappointingly long time.
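
    For reference, the decay time for a circular binary shrinking by gravitational-wave emission is given by the standard Peters (1964) formula (a textbook result, not derived in the article):

    t_GW = (5/256) × c^5 × a^4 / (G^3 × m_1 × m_2 × (m_1 + m_2))

    The steep dependence on the separation a is the culprit: at a few times the Earth-Sun distance, low-mass pairs like Luhman 16 take vastly longer than the universe's current age to spiral together, whereas the tightly bound black hole binaries that LIGO detects merge quickly.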

    The inspiral and merger scenario for brown dwarfs as well-separated as these two are would take a very long time due to gravitational waves. But collisions are quite likely. Just as red stars colliding produce blue straggler stars, brown dwarf collisions can make red dwarf stars. Melvyn B. Davies, Nature 462, 991-992 (2009).

    Every once in a while, you get random collisions between objects in space. Just the fact that stars, failed stars, rogue planets and more move through the galaxy, primarily influenced by gravitation, means that there’s a finite chance that you’ll just randomly get a collision between two objects. This is a much better strategy than waiting for gravitational waves to take your orbits down, except in the most extreme cases. On timescales of about 10^18 years, “only” about 100 million times older than the Universe presently is, brown dwarfs will randomly collide with either other brown dwarfs or stellar corpses, giving new life to a failed star. About 1% of brown dwarfs, according to current estimates, will meet that fate.

    The Sun’s atmosphere is not confined to the photosphere or even the corona, but rather extends out for millions of miles in space, even under non-flare or ejection conditions. NASA’s Solar Terrestrial Relations Observatory.

    NASA/STEREO spacecraft

    But even if you can’t wait for gravitational radiation, and even if you don’t get lucky enough to collide with another brown dwarf in interstellar space, you still have a chance to merge. We normally think of stars as having a certain extent in space: that they take up a certain volume. For that matter, that’s how we think of Earth’s atmosphere, too: as a hard edge, with a boundary between what we consider the atmosphere and outer space. How foolish is that! In reality, atoms and particles extend outward for millions of miles (or kilometers), with flares from stars reaching well beyond the orbit of Earth. It was recently discovered that brown dwarfs emit flares, too, so just as a satellite in low-Earth orbit is slowly dragged back down to our planet, the drag on a brown dwarf orbiting within another’s extended atmosphere will eventually draw the pair together. It won’t quite work for Luhman 16, but if the distance between the two failed stars were more like the Sun-Mercury distance, rather than the Sun-Ceres distance, this effect would have a shot.

    Luigi Bedin’s multi-year study observing the motions of the failed stars in Luhman 16 has shown us how their positions and motions have changed over time, with the cycloid nature resulting from Earth’s motion during the year. Hubble / ESA, L. Bedin / INAF.

    NASA/ESA Hubble Telescope

    So what happens if you do get a merger or a collision? These events are rare and will, for the most part, take much longer than the present age of the Universe to occur. By that point, even a brown dwarf will have burned up all of its deuterium, while the corpse will have cooled off to just a few degrees above absolute zero at the surface. But the energy of a collision or merger ought to create enough heat and pressure in the core that we should — so long as we cross that critical mass threshold — still ignite nuclear fusion in the core. The star will be low-mass, red in color, and extremely long-lived, burning for more than 10 trillion years. When a failed star at last ignites, it will most likely be the only star shining in the galaxy for its entire life; these events will be that rare and spaced out in time. Yet the type of star you become is interesting in its own right.

    When two brown dwarfs, far into the future, finally do merge together, they will likely be the only light shining in the night sky, as all other stars have gone out. The red dwarf that results will be the only primary light source left in the Universe at that time. user Toma/Space Engine; E. Siegel.

    It will burn its fuel so slowly that the helium-4 which gets made — the product of the core’s hydrogen fusion — will eventually convect out of the core, enabling more hydrogen to fuse in the core. The convection is efficient enough that 100% of the star’s hydrogen should burn to completion, leaving a solid mass of helium atoms. There won’t be enough mass to burn that helium any further, so the stellar remnant will contract down into a type of star that doesn’t yet exist in the Universe today: a helium white dwarf. It will take roughly a quadrillion years for this white dwarf to cool down and stop emitting light, during which time other brown dwarfs in the galaxy will collide and ignite. By the time a failed star finally succeeds and goes through its entire life cycle, becoming a black dwarf, another failed star will get its opportunity.

    An accurate size/color comparison of a white dwarf (L), Earth reflecting our Sun’s light (middle), and a black dwarf (R). When white dwarfs finally radiate the last of their energy away, they will all eventually become black dwarfs. BBC / GCSE (L) / SunflowerCosmos (R).

    If you managed to achieve some type of immortality, you could, in theory, travel from failed star to failed star, continuing on by drawing your energy from the Universe’s final, rare successes. Most failed stars will remain failures forever, but the few that succeed will be burning long after all other lights have gone out. As Winston Churchill famously said, “Success is not final, failure is not fatal: it is the courage to continue that counts.” Perhaps that applies even to the stars, even more so than to ourselves.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 4:21 pm on June 24, 2017 Permalink | Reply
    Tags: Basic Research, Extremophiles, MatISSE - Maturation of Instruments for Solar System Exploration, Oxford Nanopore, SETG - The Search for Extraterrestrial Genomes

    From Many Worlds: “In Search of Panspermia (and Life on Icy Moons)” 

    NASA NExSS bloc

    NASA NExSS

    Many Words icon

    Many Worlds

    2017-06-23
    Marc Kaufman

    Early Earth, like early Mars and no doubt many other planets, was bombarded by meteorites and comets. Could they have carried “living” microbes inside them?

    When scientists approach the question of how life began on Earth, or elsewhere, their efforts generally involve attempts to understand how non-biological molecules bonded, became increasingly complex, and eventually reached the point where they could replicate or could use sources of energy to make things happen. Ultimately, of course, life needed both.

    Researchers have been working for some time to understand this very long and winding process, and some have sought to make synthetic life out of selected components and energy. Some startling progress has been made in both of these endeavors, but many unexplained mysteries remain at the heart of the processes. And nobody is expecting the origin of life on Earth (or elsewhere) to be fully understood anytime soon.

    To further complicate the picture, the history of early Earth is one of extreme heat caused by meteorite bombardment and, most important, the enormous impact some 4.5 billion years ago of a Mars-sized planet, the collision that gave rise to our moon. As a result, many early Earth researchers think the planet was uninhabitable until about 4 billion years ago.

    Yet some argue that signs of Earth life 3.8 billion years ago have been detected in the rock record, and lifeforms were certainly present 3.5 billion years ago. Considering the painfully slow pace of early evolution — the planet, after all, supported only single-cell life for several billion years before multicellular life emerged — some researchers are skeptical about the likelihood of DNA-based life evolving in the relatively short window between when Earth became cool enough to support life and the earliest evidence of actual life.

    A DNA helix animation. Life on Earth is based on DNA, and some researchers have been working on ways to determine whether DNA life also exists on Mars or elsewhere in the solar system.

    So what else, from a scientific as opposed to a religious perspective, might have set into motion the process that made life out of non-life?

    One long considered yet generally quickly dismissed answer is getting new attention and a little more respect. It invokes panspermia, the sharing of life via meteorites from one planet to another, or delivery by comet.

    In this context, the question generally raised is whether Earth might have been seeded by early Martian life (if it existed). Mars, it is becoming increasingly accepted, was probably more habitable in its early period than Earth. But panspermia inherently could go the other way as well, or possibly even between solar systems.

    A team of prominent scientists at MIT and Harvard is sufficiently convinced of the plausibility of panspermia that they have spent a decade, and a fair amount of NASA and other funding, to design and produce an instrument that can be sent to Mars and potentially detect DNA or more primitive RNA.

    In other words, life not only similar to that on Earth, but actually delivered long ago from Earth. It’s called The Search for Extraterrestrial Genomes, or SETG.

    Gary Ruvkun is one of those researchers, a pioneering molecular biologist at Massachusetts General Hospital and professor of genetics at Harvard Medical School.

    I heard him speaking recently at a Space Sciences Board workshop on biosignatures, where he described the real (if slim) possibility that DNA or RNA-based life exists now on Mars, and the instrument that the SETG group is developing to detect it should it be there.

    Did meteorites spread life between planets, and maybe even solar systems? Some pretty distinguished people think that it may well have happened. This illustration is an artist’s rendering of the comet Siding Spring approaching Mars in 2014. (NASA)

    The logic of panspermia — or perhaps “dispermia” if between just two planets — is pretty straightforward, though with some significant question marks. Both Earth and Mars, it is well known, were pummeled by incoming meteorites in their earlier epochs, and those impacts are known to have sufficient force to send rock from the crash site into orbit.

    Mars meteorites have been found on Earth, and Earth meteorites no doubt have landed on Mars. Ruvkun said that recent work on the capacity of dormant microbes to survive the long, frigid and irradiated trip from planet to planet has been increasingly supportive.

    “Earth is filled with life in every nook and cranny, and that life is wildly diverse,” he told the workshop. “So if you’re looking for life on Mars, surely the first thing to look for is life like we find on Earth. Frankly, it would be kind of stupid not to.”

    The instrument being developed by the group, which is led by Ruvkun and Maria Zuber, MIT vice president for research and head of the Department of Earth, Atmospheric and Planetary Sciences, would potentially be part of a lander or rover science package and would search for DNA or RNA, using techniques based on the exploding knowledge of earthly genomics.

    The job is made easier, Ruvkun said, by the fact that the basic structure of DNA is the same throughout biology. What’s more, he said, there are about 400 specific gene sequences “that make up the core of biology — they’re found in everything from extremophiles and bacteria to worms and humans.”

    Those ubiquitous gene sequences, he said, were present more than 3 billion years ago in seemingly primitive lifeforms that were, in fact, not primitive at all. Rather, they had perfected some genetic pathways that were so good that they are still used by most everything alive today.
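
    To make that search strategy concrete, here is a minimal, hypothetical sketch (in Python, and emphatically not SETG's actual software) of how a sequencer read might be screened against a panel of conserved reference genes by counting shared k-mers; the reference fragments and the read are invented placeholders:

        # Minimal k-mer screen: flag reads that share enough short
        # substrings (k-mers) with conserved reference gene fragments.
        # All sequences below are invented placeholders.

        def kmers(seq, k=12):
            """Return the set of all length-k substrings of a DNA sequence."""
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        # Stand-ins for a few of the ~400 universally conserved sequences.
        REFERENCES = {
            "16S_rRNA_fragment": "AGAGTTTGATCCTGGCTCAGGACGAACGCTGGCGGC",
            "rpoB_fragment": "ATGGTTTACTCCTATACCGAGAAAAAACGTATTCGT",
        }

        def screen_read(read, k=12, min_hits=3):
            """Count k-mers shared with each reference; report likely matches."""
            read_kmers = kmers(read, k)
            hits = {name: len(read_kmers & kmers(ref, k))
                    for name, ref in REFERENCES.items()}
            return {name: n for name, n in hits.items() if n >= min_hits}

        # A read overlapping the 16S stand-in gets flagged; random DNA does not.
        print(screen_read("TTTGATCCTGGCTCAGGACGAACGCTGG"))

    A real pipeline would use approximate matching to tolerate sequencing errors and mutations, but the core idea, recognizing the deeply conserved sequences shared by all known life, is the same.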

    And how was it that these sophisticated life processes emerged not all that long (in astronomical or geological terms) after Earth cooled enough to be habitable? “Either life developed here super-fast or it came full-on as DNA life from afar,” Ruvkun said. It’s pretty clear which option he supports.

    Ruvkun said that the rest of the SETG team sees that kind of inter-planetary transfer — to Mars and from Mars — as entirely plausible, and that he takes panspermia a step further. He thinks it’s possible, though certainly not likely nor remotely provable today, that life has been around in the cosmos for as long as 10 billion years, jumping from one solar system and planet to another. Not likely, but an idea worth entertaining.

    A state-of-the-art instrument for reading DNA sequences in the field. The MIT/Harvard team is working with the company that makes it, and several others, on refining how it would do that kind of sequencing of live DNA on Mars. The extremely high-tech thumb drive weighs about 3 ounces. (Oxford Nanopore)

    Maria Zuber of MIT, who was the PI for the recent NASA GRAIL mission to the moon, has been part of the SETG team since near its inception, and MIT research scientist Christopher Carr is the project manager. Zuber said it was a rather low-profile effort at the start, but over the years has attracted many students and has won NASA funding three times including the currently running Maturation of Instruments for Solar System Exploration (MatISSE) grant.

    “I have made my career out of doing simple experiments. If you want to look for life beyond Earth, it helps to know what you’re looking for,” Zuber said.

    “We happen to know what life on Earth is like — DNA-based, or possibly RNA-based, which is what Gary is looking for as well. The point is that we know what to look for. There are so many possibilities of what life beyond Earth could be like that we might as well test the hypothesis that it, also, is DNA-based. It’s a low-probability result, but potentially very high value.”

    DNA sequencing instruments like the one her team is developing are taken into the field regularly by thousands of researchers, including some working with SETG. The technology has advanced so quickly that they can pick up a sample in a marsh or desert or any extreme locale and determine on the spot what DNA is present. That’s quite a change from the painstaking sequencing done by graduate students not that long ago.

    Panspermia, Zuber acknowledged, is a rather improbable idea. But when nature is concerned, she said “I’m reticent to say anything is impossible. After all, the universe is made up of the same elements as those on Earth, and so there’s a basic commonality.”

    Zuber said the instrument was not ready to compete for a spot on the 2020 mission to Mars, but she expects to have a sufficiently developed one ready to compete for a spot on the next Mars mission. Or perhaps on missions to Europa or the plumes of Enceladus.

    The possibility of life skipping from planet to planet clearly fascinates both scientists and the public. You may recall the excitement in the mid 1990s over the Martian meteorite ALH84001, which NASA researchers concluded contained remnants of Martian life. (That claim has since been largely refuted.)

    Of the roughly 61,000 meteorites found on Earth, only 134 were deemed to be Martian as of two years ago. But how many have sunk into oceans or lakes, or been lost in the omnipresence of life on Earth? Not surprisingly, the two spots that have yielded the most meteorites from Mars are Antarctica and the deserts of north Africa.

    And when thinking of panspermia, it’s worthwhile to consider the enormous amount of money and time put into keeping Earthly microbes from inadvertently hitching a ride to Mars or other planets and moons as part of a NASA mission.

    The NASA office of planetary protection has the goal of ensuring, as much as possible, that other celestial bodies don’t get contaminated with our biology. Inherent in that concern is the conclusion that our microbes could survive in deep space, could survive the scalding entry to another planet, and could possibly survive on the planet’s surface today. In other words, that panspermia (or dispermia) is in some circumstances possible.

    Testing whether a spacecraft has brought Earth life to Mars is actually another role that the SETG instrument could play. If a sample tested on Mars comes back with a DNA signature exactly like one on Earth — rather than one that might have come initially from Earth and then evolved over billions of years — then scientists will know that particular bit of biology was indeed a stowaway from Earth.

    Rather like how a very hardy microbe inside a meteorite might have possibly traveled long ago.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Many Worlds

    There are many worlds out there waiting to fire your imagination.

    Marc Kaufman is an experienced journalist, having spent three decades at The Washington Post and The Philadelphia Inquirer, and is the author of two books on searching for life and planetary habitability. While the “Many Worlds” column is supported by the Lunar Planetary Institute/USRA and informed by NASA’s NExSS initiative, any opinions expressed are the author’s alone.

    This site is for everyone interested in the burgeoning field of exoplanet detection and research, from the general public to scientists in the field. It will present columns, news stories and in-depth features, as well as the work of guest writers.

    About NExSS

    The Nexus for Exoplanet System Science (NExSS) is a NASA research coordination network dedicated to the study of planetary habitability. The goals of NExSS are to investigate the diversity of exoplanets and to learn how their history, geology, and climate interact to create the conditions for life. NExSS investigators also strive to put planets into an architectural context — as solar systems built over the eons through dynamical processes and sculpted by stars. Based on our understanding of our own solar system and habitable planet Earth, researchers in the network aim to identify where habitable niches are most likely to occur and which planets are most likely to be habitable. Leveraging current NASA investments in research and missions, NExSS will accelerate the discovery and characterization of other potentially life-bearing worlds in the galaxy, using a systems science approach.
    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories (Hubble, Chandra, Spitzer) and associated programs. NASA shares data with various national and international organizations, such as those behind JAXA’s Greenhouse Gases Observing Satellite.

     
  • richardmitnick 3:09 pm on June 24, 2017 Permalink | Reply
    Tags: Basic Research, CERN ProtoDUNE

    From Symmetry: “World’s biggest neutrino experiment moves one step closer” 

    Symmetry Mag

    Symmetry

    06/23/17
    Lauren Biron

    Photo by Maximilien Brice, CERN

    The startup of a 25-ton test detector at CERN advances technology for the Deep Underground Neutrino Experiment.

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA


    FNAL DUNE Argon tank at SURF


    Surf-Dune/LBNF Caverns at Sanford



    SURF building in Lead SD USA

    In a lab at CERN sits a very important box. It covers about three parking spaces and is more than a story tall. Sitting inside is a metal device that tracks energetic cosmic particles.

    CERN Proto DUNE Maximillian Brice

    This is a prototype detector, a stepping-stone on the way to the future Deep Underground Neutrino Experiment (DUNE). On June 21, it recorded its first particle tracks.

    So begins the largest ever test of an extremely precise method for measuring elusive particles called neutrinos, which may hold the key to why our universe looks the way it does and how it came into being.

    A two-phase detector

    The prototype detector is named WA105 3x1x1 (its dimensions in meters) and holds five active tons—3000 liters—of liquid argon. Argon is well suited to interacting with neutrinos, then transmitting the subsequent light and electrons for collection. Previous liquid argon neutrino detectors, such as ICARUS and MicroBooNE, detected signals from neutrinos using wires in the liquid argon. But crucially, this new test detector also holds a small amount of gaseous argon, earning it the special status of a two-phase detector.

    INFN Gran Sasso ICARUS, since moved to FNAL

    FNAL/ICARUS

    FNAL/MicrobooNE

    As particles pass through the detector, they interact with the argon atoms inside. Electrons are stripped off of atoms and drift through the liquid toward an “extraction grid,” which kicks them into the gas. There, large electron multipliers create a cascade of electrons, leading to a stronger signal that scientists can use to reconstruct the particle track in 3D. Previous tests of this method were conducted in small detectors using about 250 active liters of liquid argon.
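
    As a rough illustration of how a time projection chamber turns raw signals into 3D points (a generic sketch in Python, not the WA105 reconstruction code; the drift velocity and strip pitch are assumed values):

        # Generic TPC reconstruction: two perpendicular readout views give
        # x and y, and the electron drift time gives the depth coordinate.
        # The constants are illustrative, not WA105 calibration values.

        DRIFT_VELOCITY_MM_PER_US = 1.6   # assumed e- drift speed in liquid argon
        STRIP_PITCH_MM = 3.0             # assumed spacing between readout strips

        def reconstruct_hit(x_strip, y_strip, drift_time_us):
            """Map (strip index, strip index, drift time) to a 3D point in mm."""
            x = x_strip * STRIP_PITCH_MM
            y = y_strip * STRIP_PITCH_MM
            z = drift_time_us * DRIFT_VELOCITY_MM_PER_US
            return (x, y, z)

        # A straight cosmic-ray track shows up as a line of collinear hits.
        track = [reconstruct_hit(ch, ch, 10.0 * ch) for ch in range(5)]
        print(track)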

    “This is the first time anyone will demonstrate this technology at this scale,” says Sebastien Murphy, who led the construction of the detector at CERN.

    The 3x1x1 test detector represents a big jump in size compared to previous experiments, but it’s small compared to the end goal of DUNE, which will hold 40,000 active tons of liquid argon. Scientists say they will take what they learn and apply it (and some of the actual electronic components) to next-generation single- and dual-phase prototypes, called ProtoDUNE.

    The technology used for both types of detectors is a time projection chamber, or TPC. DUNE will stack many large modules snugly together like LEGO blocks to create the enormous DUNE detectors, which will catch neutrinos a mile underground at Sanford Underground Research Facility in South Dakota. Overall development of liquid argon TPCs has been going on for close to 40 years, and research and development on the dual-phase design for more than a decade. The idea for this particular dual-phase test detector came in 2013.

    “The main goal [with WA105 3x1x1] is to demonstrate that we can amplify charges in liquid argon detectors on the same large scale as we do in standard gaseous TPCs,” Murphy says.

    By studying neutrinos and antineutrinos that travel 800 miles through the Earth from the US Department of Energy’s Fermi National Accelerator Laboratory [FNAL] to the DUNE detectors, scientists aim to discover differences in the behavior of matter and antimatter. This could point the way toward explaining the abundance of matter over antimatter in the universe. The supersensitive detectors will also be able to capture neutrinos from exploding stars (supernovae), unveiling the formation of neutron stars and black holes. In addition, they allow scientists to hunt for a rare phenomenon called proton decay.

    “All the R&D we did for so many years and now want to do with ProtoDUNE is the homework we have to do,” says André Rubbia, the spokesperson for the WA105 3x1x1 experiment and former co-spokesperson for DUNE. “Ultimately, we are all extremely excited by the discovery potential of DUNE itself.”

    2
    One of the first tracks in the prototype detector, caused by a cosmic ray. André Rubbia

    Testing, testing, 3-1-1, check, check

    Making sure a dual-phase detector and its electronics work at cryogenic temperatures of minus 184 degrees Celsius (minus 300 degrees Fahrenheit) on a large scale is the primary duty of the prototype detector—but certainly not its only one. The membrane that surrounds the liquid argon and keeps it from spilling out will also undergo a rigorous test. Special cryogenic cameras look for any hot spots where the liquid argon is predisposed to boiling away and might cause voltage breakdowns near electronics.

    After many months of hard work, the cryogenic team and those working on the CERN neutrino platform have already successfully corrected issues with the cryostat, resulting in a stable level of incredibly pure liquid argon. The liquid argon has to be pristine and its level just below the large electron multipliers so that the electrons from the liquid will make it into the gaseous argon.

    “Adding components to a detector is never trivial, because you’re adding impurities such as water molecules and even dust,” says Laura Manenti, a research associate at University College London in the UK. “That is why the liquid argon in the 311—and the soon-to-come ProtoDUNEs—has to be recirculated and purified constantly.”

    While ultimately the full-scale DUNE detectors will sit in the most intense neutrino beam in the world, scientists are testing the WA105 3x1x1 components using muons from cosmic rays, high-energy particles arriving from space. These efforts are supported by many groups, including the Department of Energy’s Office of Science.

    The plan is now to run the experiment, gather as much data as possible, and then move on to even bigger territory.

    “The prospect of starting DUNE is very exciting, and we have to deliver the best possible detector,” Rubbia says. “One step at a time, we’re climbing a large mountain. We’re not at the top of Everest yet, but we’re reaching the first chalet.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:29 pm on June 24, 2017 Permalink | Reply
    Tags: , Basic Research, , SRM's -Standard Reference Materials   

    From NIST: ” Measurements Matter – How NIST Reference Materials Affect You” 

    NIST

    June 13, 2017
    Fran Webber

    In 2012, Consumer Reports announced startling findings—with potentially serious public health ramifications.

    The publication investigated arsenic levels in apple juice and rice and found levels of the toxin above those allowed in water by the Environmental Protection Agency. The articles pointed out that there were no rules about allowable levels for arsenic in food.

    The Food and Drug Administration responded by issuing a limit for arsenic levels in apple juice and, in 2016, for infant rice cereal. But the damage was already done.

    It’s a funny quirk of human psychology: we take the most important things for granted—until it all goes wrong.

    You probably don’t often question whether the food you buy in the grocery store is safe. Or if the lab where your doctor sends your samples accurately calculated your vitamin D levels.

    But imagine, for a moment, how much more difficult it would be to go about your daily life if you didn’t have the information those measurements provide.

    How would you decide what is safe and healthy to eat? How would you know if you were getting enough vitamin D or if your cholesterol levels were too high?

    That’s one of the big reasons NIST exists—to reduce uncertainty in our measurements and increase your confidence in the information you use to make important decisions in your daily life.

    And part of the way NIST does that is through Standard Reference Materials (SRMs).

    Standard Reference … what?

    The government has acronyms for seemingly everything. At NIST, one even has a registered trademark: SRM® is the “brand name” of our certified reference materials, the generic term for these vital tools. Many other organizations measure and distribute certified reference materials, but only NIST has SRMs.

    So what exactly is an SRM or certified reference material?


    NIST chemist Bob Watters provides an overview of how NIST’s standard reference materials, ranging from metal alloys to cholesterol samples, have helped industry make reliable measurements since the earliest days of the agency.

    It can be difficult to explain, because SRMs are actually a lot of different things. In fact, NIST sells more than 1,000 different types of SRMs, from gold nanoparticles to peanut butter.

    NIST has very carefully studied each of its SRMs, and it’s these characterizations, rather than the materials themselves, that customers pay for. SRMs serve a variety of purposes but are mostly used by other labs and members of industry to check their analytical measurements and to perform other kinds of quality-control tests.

    Steve Choquette, director of NIST’s Office of Reference Materials, says SRMs are like widgets, tools that provide a service or help you complete a task. In this case, SRMs give manufacturers access to a level of measurement accuracy they wouldn’t otherwise be able to obtain.

    “What an SRM really does is give our customers the highest quality measurements in a form they can easily use,” Choquette says.

    Peanut butter—SRM 2387—is an excellent example. NIST scientists know exactly how much fat, salt, sugar and other nutrients are in the peanut butter, and they’ve recorded those amounts on a certificate that’s sold with the SRM. When an SRM user measures the NIST peanut butter with his or her own instrument, he or she should get amounts that match the certificate. If not, the manufacturer knows the machine must be adjusted.
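
    As an illustration of that quality-control loop, here is a hedged Python sketch; the certified values, uncertainties and measurements below are invented for the example, not taken from the SRM 2387 certificate.

        # Illustrative check of an instrument against an SRM certificate.
        # All numbers here are made up for the example.
        certificate = {
            "fat_g_per_100g": (50.0, 1.0),       # (certified value, expanded uncertainty)
            "sodium_mg_per_100g": (470.0, 25.0),
        }

        def instrument_in_calibration(measured):
            """Return True if every measured value falls within the certified range."""
            for quantity, value in measured.items():
                certified, uncertainty = certificate[quantity]
                if abs(value - certified) > uncertainty:
                    return False  # out of tolerance: the instrument needs adjustment
            return True

        print(instrument_in_calibration({"fat_g_per_100g": 50.4,
                                         "sodium_mg_per_100g": 478.0}))  # True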

    NIST is a nonregulatory agency, which means it doesn’t set the rules for things like food and water safety. However, manufacturers frequently use NIST standards such as SRMs because they are a reliable, science-based means of demonstrating compliance with the rules set by regulatory agencies.

    Does your food measure up?

    Like the peanut butter SRM, many NIST SRMs are food products. These SRMs help the food industry comply with various U.S. food regulations such as those requiring nutrition facts labels. Regulators can be sure those labels are accurate when producers use SRMs to ensure their measurement instruments are properly calibrated.

    In the lab, Joe Katzenmeyer, senior scientist and strategic analytical manager at Land O’Lakes, uses the SRMs for nonfat milk powder, infant formula and meat homogenate (a canned pork and chicken mix).

    “We most often use NIST SRMs when developing a new testing procedure, and we need to know that a result is the ‘correct’ result,” Katzenmeyer said. “NIST values are established through a very thorough process and by labs across the country. This gives high credibility to their established values.”

    And that’s how you can be confident in the nutrition facts labels, too, so you can make healthy decisions about what to eat.

    2
    NIST SRM 2385, spinach. Credit: K. Irvine/NIST

    But NIST food SRMs don’t just help you accurately count your carbs.

    Remember the concern about arsenic in apple juice and rice? NIST already had a rice flour SRM, but NIST researchers recently added measurements for different types of arsenic. And, NIST is in the process of making an SRM for apple juice that will include levels for various forms of arsenic as well. Government agencies, like the Food and Drug Administration, can use these SRMs to ensure that arsenic levels in the foods we eat are safe.

    And both health and safety are driving forces behind another type of NIST SRMs—those for dietary supplements.

    Marketers can make some pretty strong claims about their products. But do so-called “superfoods” like green tea or blueberries live up to the hype? The first step in finding out is to carefully measure the properties of these foods.

    That’s why NIST makes SRMs for green tea and blueberries, as well as multivitamins, St. John’s Wort and Ginkgo biloba, among others.

    A medical measurement marvel

    Nearly 74 million Americans have high levels of LDL cholesterol—that’s the bad kind. Those with high cholesterol have twice the risk of heart disease as those with normal levels.

    Keeping tabs on your cholesterol can be a matter of life and death. So, when you or your loved one goes to the doctor’s office to give a blood sample, how do you know the result you get is right?

    If you’re thinking it’s because of NIST SRMs, you’d be right! NIST sells a number of SRMs that lab techs use to calibrate clinical laboratory equipment.

    But SRMs don’t just help maintain the status quo. They also help drive innovation.

    A new SRM for monoclonal antibodies—a large class of drugs for treating cancer and autoimmune diseases, among other things—could make these life-saving treatments more widely available.

    Monoclonal antibodies are large protein molecules designed to bind to disease-causing cells or proteins, triggering a patient’s immune system to attack and clear them from the body. Sales of these drugs in the U.S. reached $50 billion in 2015.

    3

    NIST’s monoclonal antibody reference material, NIST RM 8671, is shipped in cryovials packaged in dry ice. It should be stored in a frozen state at -80 °C (-112 °F). Shown is a sample that underwent extensive round-robin testing by more than 100 collaborators before the biological material, donated by MedImmune, was certified as a NIST RM. Credit: NIST

    Manufacturing a monoclonal antibody drug on a large scale is complex and involves the use of genetically engineered cells that churn out large quantities of the molecule. Testing to make sure that the molecules are being made correctly happens at many points in the manufacturing process. The NIST SRM is an important tool for assuring the quality of these test methods and of the final product.

    And, since patents on many monoclonal antibodies are set to expire in the next several years, many anticipate a growing market for biosimilar—or generic—versions of the drugs. Generics could save patients billions of dollars by 2020.

    But, this will mean a lot of testing and measurements to determine whether these generic versions are nearly identical to the branded versions. The NIST monoclonal antibody SRM could help with measurement challenges faced by researchers tasked with testing these drugs.

    Taking measurements to court

    In 1978, Michael Hanline was found guilty of murder in California. But Hanline always said he was innocent. Eventually, the California Innocence Project at California Western School of Law took up his case, and through DNA analysis, showed that Hanline was not the source of DNA found on key evidence.

    Hanline spent 36 years in prison. He is the longest-serving wrongfully convicted person in California history.

    When Hanline was convicted, the ability to evaluate DNA evidence didn’t yet exist. But today, it’s not uncommon to hear of cases where DNA evidence makes or breaks the case. And not just to exonerate the innocent. Far more often, DNA evidence helps law enforcement put away the right people the first time.

    NIST forensic DNA SRMs are crucial to this process. They help make sure that labs conducting forensic DNA analysis obtain accurate results. The Federal Bureau of Investigation requires that forensic DNA testing laboratories meet certain quality assurance standards. Labs must check their processes with a NIST SRM (or a reference material that traces back to NIST) every year or anytime they make substantial changes to their protocol.

    “The NIST DNA SRM we use in our lab is essential to ensure our analyses are reliable,” said Todd Bille, DNA technical leader at the Bureau of Alcohol, Tobacco, Firearms and Explosives. “With all the advances in the forensic community, NIST SRM 2391c is the only set of DNA samples that has what we need to make sure the analyses function properly in our hands. Our lab is also constantly evaluating new methods to handle DNA. Having this set of standard DNA samples allows us to be sure new methods don’t adversely affect the results.”

    Cementing quality control

    First of all, John Sieber wants you to know: There’s a difference between cement and concrete.

    “People get the two mixed up,” says Sieber, a NIST research chemist. “Cement is what you have before, and then you mix it with water and sand and gravel—aggregate, they call it—and you pour it into your sidewalk and it hardens through a chemical reaction and becomes concrete.”

    4
    NIST researcher John Sieber, cement SRM development. Credit: copyright Earl Zubkoff

    Though you may have never given it a second thought, you no doubt interact with concrete on a daily basis as you drive to work, park your car in a garage, walk across the sidewalk to your office and sit at your desk in a high-rise building.

    “The human race is trying to cover the planet in concrete,” Sieber jokes.

    To make sure their product can withstand the tests of time, wear and weather, cement makers conform to certain quality standards. During the manufacturing process, cement makers test their products hourly. NIST SRMs are crucial to letting manufacturers know the results of their tests are accurate—and that they’re creating a high-quality product.

    NIST sells 15 cement—not concrete—SRMs that help manufacturers ensure their products meet certain quality standards and help buyers know they’re getting what they paid for.

    5
    NIST researchers in CAVE 3D Visualization lab exploring the movement of concrete particles. Credit: copyright Earl Zubkoff

    Standards of excellence

    To tell the story of SRMs is to tell the story of industry in America—its breakthroughs and its setbacks. From the turn of the 20th century onward, NIST stood with American makers as they erected skyscrapers, laid railways and took to the skies in airplanes. NIST helped manufacturers overcome technical challenges they faced in bringing innovative technology to the American people.

    In 1905, NIST—then known as the National Bureau of Standards—began preparing and distributing the first SRMs, standardized samples of iron, which manufacturers used as a check on their lab analyses. From those early standard samples, the program grew.

    Today, NIST still sells versions of these original SRMs, but it has come a long way. The diverse array of SRMs currently available reflects the complexity and technological advancement of a 21st-century society—and the new challenges it faces.

    NIST constantly works to improve its existing SRMs to adapt to changing needs—for example, the arsenic measurements added to the rice flour SRM, or the blueberry SRM, to which NIST is adding measurements for anthocyanins, the flavonoid pigments that contribute to blueberries’ antioxidant properties. And NIST is always looking for opportunities to create new SRMs to drive innovation in emerging markets, like the monoclonal antibody SRM for biopharmaceutical manufacturers.

    “Good science is our carrot,” Choquette says.

    Speaking of carrots, we’ve got an SRM for that.

    To learn more about NIST’s Standard Reference Materials, visit http://www.nist.gov.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    NIST Mission, Vision, Core Competencies, and Core Values

    NIST’s mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.
    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

     
  • richardmitnick 1:23 pm on June 24, 2017 Permalink | Reply
    Tags: , , Basic Research, , From the chain to messier 64,   

    From Manu: “From the chain to Messier 64” 


    Manu Garcia, a friend from IAC.

    The universe around us.
    Astronomy, everything you wanted to know about our local universe and never dared to ask.

    1

    This colorful, broad telescopic mosaic links a chain of galaxies through the core of the Virgo cluster to the dusty spiral galaxy Messier 64.

    Galaxies are scattered across a field of view that spans around 20 full moons on a beautiful night sky.

    The cosmic frame is also filled with foreground stars from the constellations Virgo and Coma Berenices, and with faint, dusty nebulae that drift above the plane of the Milky Way. Look carefully for The Eyes: the famous pair of interacting galaxies sits near the top, not far from Messier 87, the giant elliptical galaxy of the Virgo cluster. At the bottom you can find Messier 64, also known as the Black Eye galaxy. The Virgo cluster is the cluster of galaxies closest to our own Local Group of galaxies.

    Local Group. Andrew Z. Colvin 3 March 2011

    The Virgo cluster galaxies are about 50 million light-years away, but Messier 64 is only 17 million light-years away.

    Image credit and copyright: Rogelio Bernal Andreo (DeepSkyColors)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 1:09 pm on June 24, 2017 Permalink | Reply
    Tags: , , Basic Research, , , It all started from humble beginnings, , Scientists make waves with black hole research, , What is superradiance?   

    From U Nottingham: “Scientists make waves with black hole research” 

    1

    University of Nottingham

    14 Jun 2017
    Jane Icke
    Media Relations Manager (Faculty of Science)
    jane.icke@nottingham.ac.uk
    +44 (0)115 951 5751
    University Park

    Dr Silke Weinfurtner, in the School of Mathematical Sciences
    +44 (0) 115 9513865,
    silke.weinfurtner@nottingham.ac.uk

    Lindsay Brooke
    Media Relations Manager for the Faculty of Science
    +44 (0)115 951 5751
    lindsay.brooke@nottingham.ac.uk

    1
    A groundbreaking experiment has allowed researchers to simulate the dissipation of energy around a black hole using waves generated in the lab. University of Nottingham

    Scientists at the University of Nottingham have made a significant leap forward in understanding the workings of one of the mysteries of the universe. They have successfully simulated the conditions around black holes using a specially designed water bath.

    Their findings shed new light on the physics of black holes with the first laboratory evidence of the phenomenon known as superradiance, achieved using water and a generator to create waves.

    The research – Rotational superradiant scattering in a vortex flow – has been published in Nature Physics. It was undertaken by a team in the Quantum Gravity Laboratory in the School of Physics and Astronomy.

    The work was led by Silke Weinfurtner from the School of Mathematical Sciences. In collaboration with an interdisciplinary team, she designed and built the black hole ‘bath’ and measurement system to simulate black hole conditions.

    Dr Weinfurtner said: “This research has been particularly exciting to work on as it has brought together the expertise of physicists, engineers and technicians to achieve our common aim of simulating the conditions of a black hole and proving that superradiance exists. We believe our results will motivate further research on the observation of superradiance in astrophysics.”

    What is superradiance?

    The Nottingham experiment was based on the theory that the area immediately outside the event horizon of a rotating black hole – the hole’s gravitational point of no return – is dragged around by the rotation. Any wave that enters this region, but does not stray past the event horizon, should be deflected and emerge with more energy than it carried on the way in – an effect known as superradiance.

    Superradiance – the extraction of energy from a rotating black hole – is also known as the Penrose mechanism and is a precursor of Hawking radiation, a quantum version of black-hole superradiance.
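
    In standard notation (a textbook statement of the effect, not a formula quoted from the Nature Physics paper), a wave of frequency \omega and azimuthal number m scattered by a black hole whose horizon rotates with angular velocity \Omega_H comes back amplified, with reflection coefficient

        |R|^2 > 1 \quad \text{precisely when} \quad 0 < \omega < m\,\Omega_H,

    so the wave carries away some of the hole’s rotational energy; outside this frequency window the wave is partially absorbed as usual.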

    What’s in the Black Hole Lab?

    Dr Weinfurtner said: “Some of the bizarre black hole phenomena are hard, if not impossible, to study directly. This means there are very limited experimental possibilities. So this research is quite an achievement.”

    The ‘flume’ is a specially designed bath, 3 m long, 1.5 m wide and 50 cm deep, with a hole in the centre. Water is pumped in a closed circuit to establish a rotating, draining flow. Once the water reached the desired depth, waves were generated at varied frequencies until the superradiant scattering effect was created, then recorded using a specially designed 3D air-fluid interface sensor.

    Tiny dots of white paper punched out by a specially adapted sewing machine were used to measure the flow field – the speed of the fluid flow around the analogue black hole.
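
    The reason water can stand in for a black hole at all is elementary shallow-water physics (stated here as background, not quoted from the paper): long surface waves on water of depth h travel at speed

        c = \sqrt{g h},

    so wherever the radial inflow toward the drain is faster than c, outgoing waves can no longer escape, forming an analogue event horizon, while the swirling flow around the vortex plays the role of the dragged region in which superradiant amplification can occur.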

    It all started from humble beginnings

    This research has been many years in the making. The initial idea for creating a superradiant effect with water started with a bucket and a bidet. Dr Weinfurtner said: “This research has grown from humble beginnings. I had the initial idea for a water-based experiment when I was at the International School for Advanced Studies (SISSA) in Italy, and I set up an experiment with a bucket and a bidet. However, when it caused a flood I was quickly found a lab to work in!”

    After her postdoc, Dr Weinfurtner went on to work with Bill Unruh, the Canadian-born physicist who has made seminal contributions to our understanding of gravity, black holes, cosmology, quantum fields in curved spaces, and the foundations of quantum mechanics, including the discovery of the Unruh effect.

    Her move to the University of Nottingham accelerated her research as she was able to set up her own research group with support from the machine shop in the School of Physics and Astronomy.

    This research is funded by the Engineering and Physical Sciences Research Council, the Royal Society and the University of Nottingham.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    2

    “The University of Nottingham shares many of the characteristics of the world’s great universities. However, we are distinct not only in our key strengths but in how our many strengths combine: we are financially secure, campus based and comprehensive; we are research-led and recruit top students and staff from around the world; we are committed to internationalising all our core activities so our students can have a valuable and enjoyable experience that prepares them well for the rest of their intellectual, professional and personal lives.”

     