Updates from richardmitnick

  • richardmitnick 7:59 am on August 4, 2015 Permalink | Reply
    Tags: Flash Project

    From CAASTRO: “Pilot study data prepare astronomers for future blind HI surveys” 

    CAASTRO bloc

    CAASTRO ARC Centre of Excellence for All Sky Astrophysics


    4 Aug 2015

    Next-generation radio telescopes will make it possible to conduct the first large-scale HI absorption-line surveys, which will enable us to study the evolution of neutral gas in galaxies over a large range of cosmic time. However, we don’t currently have the understanding to derive physical galaxy properties from absorption-line data alone.

    To gain this understanding, we need to start by knowing the expected detection rate of intervening HI absorption. Previous studies have suggested that the detection rate is around 50% for sightlines passing within 20 kpc of a galaxy. However, these studies have typically targeted sightlines to quasars, which provide very bright, compact radio sources against which HI absorption is easy to detect. Since only around 10% of all radio sources are quasars, it is possible that such studies have over-estimated the detection rate compared to what future blind surveys will find.

    In a new study, CAASTRO researcher Sarah Reeves (University of Sydney) and colleagues have investigated the detection rate of intervening absorption in an unbiased sample of radio sources. Importantly, they also obtained HI emission-line data, allowing them to map the distribution of HI gas in the target galaxies. This means that where they did not detect an absorption-line, they were able to pin-point the reason for the non-detection, i.e. whether the lack of absorption was due to the sightline not intersecting the HI disk of the galaxy or due to the properties of the background radio source (e.g. too dim) – or some other reason.

    This publication presents observations and results from the pilot sample (six of an eventual 16 sources). In this pilot sample, no intervening absorption-lines were detected. While observations for the full sample are required to better establish the detection rate, this preliminary result suggests that the detection rate is considerably lower than estimated by previous studies – perhaps around 5-10%. The team found that most of their sightlines did intersect the HI disk of the target galaxies, meaning that the low detection rate must be due to properties of the background sources. They found that many of the background sources resolved into multiple components at higher resolution, lowering the flux and reducing the absorption-line sensitivity. These results show that source type and structure can significantly affect the detection rate of absorption-line surveys, and help astronomers to better prepare for future large surveys, such as FLASH (‘The First Large Absorption Survey in HI’).
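
    As a rough back-of-the-envelope check (mine, not the paper’s): zero detections from six sightlines only bounds the detection rate from above under a simple binomial model, which is why completing the 16-source sample matters. A minimal sketch in Python:

        # With 0 detections in n sightlines, the 95% upper limit on the
        # true detection rate p satisfies (1 - p)**n = 0.05.
        n, alpha = 6, 0.05
        p_upper = 1 - alpha ** (1.0 / n)
        print(f"95% upper limit on detection rate: {p_upper:.2f}")  # about 0.39

    So the pilot data by themselves still allow rates up to roughly 39%; the 5-10% estimate presumably also folds in the source-structure effects described above.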

    Publication details:
    S. N. Reeves, E. M. Sadler, J. R. Allison, B. S. Koribalski, S. J. Curran, and M. B. Pracy in MNRAS (2015) HI emission and absorption in nearby, gas-rich galaxies

    FLASH

    The First Large Absorption Survey in HI (FLASH) is a wide-field ASKAP survey that will provide world-class science through the provision of new measurements of the amount and distribution of HI in distant galaxies, allowing us for the first time to investigate the relationship between HI gas supply and star formation rate in individual galaxies at z>0.5.

    SKA Pathfinder Radio Telescope
    SKA ASKAP telescopes

    ASKAP will be an array of 36 antennas each 12m in diameter, capable of high dynamic range imaging and using wide-field-of-view phased array feeds. ASKAP is intended to be a world-class telescope in its own right as well as a pathfinder instrument for the Square Kilometre Array.

    ASKAP’s large spectral bandwidth (300 MHz bandwidth over the frequency range 700-1800 MHz) and wide field of view (30 square degrees) will open up a completely new parameter space for large, blind HI absorption-line surveys using background radio continuum sources. Since the detection limit for such surveys is independent of redshift, ASKAP-FLASH will allow us to learn about the neutral gas content of galaxies in the poorly-explored redshift range 0.5 < z < 1.0, where the HI emission line is too weak to be detectable in even the deepest ASKAP surveys. The FLASH survey aims to detect and measure several hundred HI absorption lines (from both intervening and associated absorbers). This will provide a unique dataset for studies of galaxy evolution as well as a new estimate of the HI mass density at intermediate redshifts. The FLASH data will also be used for HI emission-line stacking experiments in combination with large-area optical redshift surveys like WiggleZ and GAMA.
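
    For readers wondering why the detection limit is independent of redshift: an absorption survey measures an optical depth against the background continuum source, and the inferred HI column density follows a standard relation from the HI absorption literature (my addition, not quoted in the article):

        N_{\rm HI} = 1.823 \times 10^{18} \left( \frac{T_{\rm spin}}{f} \right) \int \tau(v)\, dv \ \ {\rm cm^{-2}}

    Here T_spin is the spin temperature, f the covering factor, and the optical depth is roughly the fractional dip in the continuum, \tau \approx \Delta S / S_{\rm cont}. The smallest detectable \tau therefore depends on the brightness of the background source rather than on the absorber’s distance, unlike HI emission, whose flux falls off as the square of the distance.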

    Professor Elaine Sadler (Project Leader)

    CAASTRO Member Node
    Professor Lister Staveley-Smith University of Western Australia
    Associate Professor Martin Meyer University of Western Australia
    Dr. James Allison University of Sydney
    Dr. Stephen Curran University of Sydney
    Ms. Sarah Reeves University of Sydney
    Mr. Marcin Glowacki University of Sydney
    Associate Professor Chris Blake Swinburne University
    Professor Matthew Colless Australian National University

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Astronomy is entering a golden age, in which we seek to understand the complete evolution of the Universe and its constituents. But the key unsolved questions in astronomy demand entirely new approaches that require enormous data sets covering the entire sky.

    In the last few years, Australia has invested more than $400 million both in innovative wide-field telescopes and in the powerful computers needed to process the resulting torrents of data. Using these new tools, Australia now has the chance to establish itself at the vanguard of the upcoming information revolution centred on all-sky astrophysics.

    CAASTRO has assembled the world-class team who will now lead the flagship scientific experiments on these new wide-field facilities. We will deliver transformational new science by bringing together unique expertise in radio astronomy, optical astronomy, theoretical astrophysics and computation and by coupling all these capabilities to the powerful technology in which Australia has recently invested.

    PARTNER LINKS

    The University of Sydney
    The University of Western Australia
    The University of Melbourne
    Swinburne University of Technology
    The Australian National University
    Curtin University
    University of Queensland

     
  • richardmitnick 7:23 am on August 4, 2015 Permalink | Reply

    From phys.org: “End-of-century Manhattan climate index to resemble Oklahoma City today” 

    physdotorg
    phys.org

    August 4, 2015
    Carnegie Institution for Science

    View from Midtown Manhattan, facing south toward Lower Manhattan

    Climate change caused by greenhouse gas emissions will alter the way that Americans heat and cool their homes. By the end of this century, the number of days each year that heating and air conditioning are used will decrease in the Northern states, as winters get warmer, and increase in the Southern states, as summers get hotter, according to a new study by high school student Yana Petri, working with Carnegie’s Ken Caldeira, published in Scientific Reports.

    “Changes in outdoor temperatures have a substantial impact on energy use inside,” Caldeira explained. “So as the climate changes due to greenhouse gases in the atmosphere, the amount of energy we use to keep our homes comfortable will also change.”

    Using results from established climate models, Petri, under Caldeira’s supervision, calculated the changes in the number of days over the last 30 years when U.S. temperatures were low enough to require heating or high enough to require air conditioning in order to achieve a comfort level of 65 degrees Fahrenheit. She also calculated projections for future days when heating or air conditioning would be required to maintain the same comfort level if current trends in greenhouse gas emissions continue unchecked.

    Looking forward toward the end of this century, her calculations found that Washington state will have the smallest increase in air conditioning-required days and southern Texas will have the largest increase. Likewise, upper North Dakota, Minnesota, and Maine would have the largest decrease in heating-required days and southern Florida would have the smallest decrease.

    Petri then took this inquiry one step further and looked at the sum of heating-required and cooling-required days in different regions, both in the past and in future projections, to get a sense of changes in the overall thermal comfort of different areas.
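
    Here is a minimal sketch of that counting in Python (my illustration, not the study’s actual code; the 65 degrees Fahrenheit comfort threshold is from the article, while the temperature series is invented):

        import math

        def comfort_counts(daily_mean_temps_f, threshold=65.0):
            """Count heating-required days, cooling-required days, and their sum."""
            heating = sum(1 for t in daily_mean_temps_f if t < threshold)
            cooling = sum(1 for t in daily_mean_temps_f if t > threshold)
            return heating, cooling, heating + cooling

        # Hypothetical year of daily means: a sinusoid swinging 30 F around 65 F.
        temps = [65 + 30 * math.sin(2 * math.pi * (d - 91) / 365) for d in range(365)]
        heating, cooling, index = comfort_counts(temps)
        print(heating, cooling, index)

    The third value is the combined heating-plus-cooling count, the kind of overall thermal comfort index described above: the smaller the sum, the fewer days on which a home needs either heating or cooling.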

    “No previous study has looked at climate model projections and tried to develop an index of overall thermal comfort, which is quite an achievement,” Caldeira said.

    Today, the city with the minimum combined number of heating- and cooling-required days, in other words the place with the best outdoor comfort level, is San Diego. But the model projected that in the future time frame of 2080-2099, the climate would shift so that San Francisco would take its place as the city with the most comfortable temperatures.

    Other changes predicted by the model are that the amount of heating and cooling required in New York City in the future will be similar to that used in Oklahoma City today. By this same measure, Seattle is projected to resemble present-day San Jose, and Denver to become more like present-day Raleigh, NC.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quancast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 7:12 am on August 4, 2015 Permalink | Reply

    From Discovery: “Hopes Dim for Reversing Ocean Warming: Study” 

    Discovery News

    Aug 3, 2015
    AFP


    Technology to drain heat-trapping CO2 from the atmosphere may slow global warming, but will not reverse climate damage to the ocean on any meaningful timescale, according to research published Monday.


    At the same time, a second study reported, even the most aggressive timetable for reducing greenhouse-gas emissions will need a big boost from largely untested carbon removal schemes to cap warming to two degrees Celsius (3.6 degrees Fahrenheit) above pre-industrial levels.

    Above that threshold, say scientists, the risk of climate calamity rises sharply. Earth is currently on a 4 Celsius (7.2 Fahrenheit) trajectory.

    Both studies, coming months before 195 nations meet in Paris in a bid to forge a climate pact, conclude that deep, swift cuts in carbon dioxide (CO2) emissions are crucial.

    Planetary-scale technical fixes — sometimes called geo-engineering — have often been invoked as a fallback solution in the fight against climate change.

    But with CO2 emissions still rising, along with the global thermostat, many scientists are starting to take a hard look at which ones might be feasible.

    Research has shown that extracting massive quantities of CO2 from the atmosphere, through intensive reforestation programs or carbon-scrubbing technology, would in theory help cool the planet.

    But up to now, little was known about the long-term potential of these measures for restoring oceans rendered overly acidic after two centuries of absorbing CO2.

    Increased acidification has already ravaged coral and several kinds of micro-organisms essential to the ocean food chain, with impacts going all the way up to humans.

    Scientists led by Sabine Mathesius of the GEOMAR Helmholtz Center for Ocean Research in Kiel, Germany, used computer models to test different carbon-reduction scenarios, looking in each case at the impact on acidity, water temperatures and oxygen levels.

    If humanity waited a century before sucking massive amounts of CO2 out of the atmosphere, they concluded, it would still take centuries, maybe even a thousand years, before the ocean would catch up.

    In the meantime, the researchers say, corals will have disappeared, many marine species will have gone extinct, and the ocean will be rife with dead spots.

    “We show that in a business-as-usual scenario, even massive deployment of CO2 removal schemes cannot reverse the substantial impacts on the marine environment — at least not within many centuries,” Mathesius said.

    Even in a scenario in which large-scale carbon removal begins in 2050 — assuming such technology is available — the ocean does not fare well.

    “Immediate and ambitious action to reduce CO2 emissions is the most reliable strategy for avoiding dangerous climate change, ocean acidification, and large-scale threats to marine ecosystems,” the researchers concluded.

    Scientists commenting on the study said it should sound an alarm.

    “The threat of ocean acidification alone justifies dramatic and rapid reduction of CO2 emissions,” said Nick Riley, a research associate at the British Geological Survey (BGS).

    The second study, led by Thomas Gasser of the Institut Pierre-Simon Laplace, near Paris, uses state-of-the-art models to measure the trade-off between reducing emissions and carbon-removing technologies.

    They show that even if nations strike a deal in Paris adhering to the most aggressive CO2-slashing pathway outlined by UN scientists, it may not be enough to keep Earth on a 2 C trajectory.

    “Our results suggest that negative emissions” — the use of carbon removing technology — “are needed even in the case of very high mitigation rates.”

    To have a chance of meeting the 2 C target, 0.5 to 3.0 gigatonnes of carbon — up to a third of total annual CO2 emissions today from industry — would need to be extracted every year starting more or less immediately, they calculate.
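
    The “up to a third” figure checks out arithmetically if present-day fossil-fuel and industry emissions are taken as roughly 10 gigatonnes of carbon per year (my approximation for 2015, not a number from the study):

        \frac{3.0\ {\rm GtC\,yr^{-1}}}{\sim 10\ {\rm GtC\,yr^{-1}}} \approx 0.3

    Note the units are tonnes of carbon, not of CO2; multiplying by 44/12 \approx 3.67 converts carbon mass to CO2 mass.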

    The study exposes “an elephant in the room,” Riley said. “The target to keep warming within the 2 C rise is looking increasingly unattainable.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 6:59 am on August 4, 2015 Permalink | Reply

    From NRAO: “Neutron Stars Strike Back at Black Holes in Jet Contest” 

    NRAO Icon
    National Radio Astronomy Observatory

    NRAO Banner

    4 August 2015
    Dave Finley, Public Information Officer
    (575) 835-7302
    dfinley@nrao.edu

    Artist’s impression of material flowing from a companion star onto a neutron star. The material forms an accretion disk around the neutron star and produces a superfast jet of ejected material. The material closest to the neutron star is so hot that it glows in X-rays, while the jet is most prominent at radio wavelengths. A similar mechanism is at work with black holes. CREDIT: Bill Saxton, NRAO/AUI/NSF.

    Some neutron stars may rival black holes in their ability to accelerate powerful jets of material to nearly the speed of light, astronomers using the Karl G. Jansky Very Large Array (VLA) have discovered.

    “It’s surprising, and it tells us that something we hadn’t previously suspected must be going on in some systems that include a neutron star and a more-normal companion star,” said Adam Deller, of ASTRON, the Netherlands Institute for Radio Astronomy.

    Black holes and neutron stars are respectively the densest and second most dense forms of matter known in the Universe. In binary systems where these extreme objects orbit with a more normal companion star, gas can flow from the companion to the compact object, producing spectacular displays when some of the material is blasted out in powerful jets at close to the speed of light.

    Previously, black holes were the undisputed kings of forming powerful jets. Even when only nibbling on a small amount of material, the radio emission that traces the jet outflow from the black hole was relatively bright. In comparison, neutron stars seemed to make relatively puny jets — the radio emission from their jets was only bright enough to see when they were gobbling material from their companion at a very high rate. A neutron star sedately consuming material was therefore predicted to form only very weak jets, which would be too faint to observe.

    Recently, however, combined radio and X-ray observations of the neutron star PSR J1023+0038 completely contradicted this picture. PSR J1023+0038, which was discovered by ASTRON astronomer Anne Archibald in 2009, is the prototypical “transitional millisecond pulsar” – a neutron star which spends years at a time in a non-accreting state, only to “transition” occasionally into active accretion. When observed in 2013 and 2014, it was accreting only a trickle of material, and should have been producing only a feeble jet.

    “Unexpectedly, our radio observations with the Very Large Array showed relatively strong emission, indicating a jet that is nearly as strong as we would expect from a black hole system,” Deller said.

    NRAO VLA
    VLA

    Two other such “transitional” systems are now known, and both of these now have been shown to exhibit powerful jets that rival those of their black-hole counterparts. What makes these transitional systems special compared to their other neutron star brethren? For that, Deller and colleagues are planning additional observations of known and suspected transitional systems to refine theoretical models of the accretion process.

    Deller led a team of astronomers who reported their findings in the Astrophysical Journal.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The NRAO operates a complementary, state-of-the-art suite of radio telescope facilities for use by the scientific community, regardless of institutional or national affiliation: the Very Large Array (VLA), the Robert C. Byrd Green Bank Telescope (GBT), and the Very Long Baseline Array (VLBA)*.

    ALMA Array

    NRAO ALMA

    NRAO GBT

    NRAO VLA

    The NRAO is building two new major research facilities in partnership with the international community that will soon open new scientific frontiers: the Atacama Large Millimeter/submillimeter Array (ALMA), and the Expanded Very Large Array (EVLA). Access to ALMA observing time by the North American astronomical community will be through the North American ALMA Science Center (NAASC).
    *The Very Long Baseline Array (VLBA) comprises ten radio telescopes spanning 5,351 miles. It’s the world’s largest, sharpest, dedicated telescope array. With an eye this sharp, you could be in Los Angeles and clearly read a street sign in New York City!

    Astronomers use the continent-sized VLBA to zoom in on objects that shine brightly in radio waves, long-wavelength light that’s well below infrared on the spectrum. They observe blazars, quasars, black holes, and stars in every stage of the stellar life cycle. They plot pulsars, exoplanets, and masers, and track asteroids and planets.

     
  • richardmitnick 6:01 am on August 4, 2015 Permalink | Reply

    From JPL: “Tracking A Mysterious Group of Asteroid Outcasts” 

    JPL

    August 3, 2015
    DC Agle
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-393-9011
    agle@jpl.nasa.gov

    The asteroid Euphrosyne glides across a field of background stars in this time-lapse view from NASA’s WISE spacecraft. WISE obtained the images used to create this view over a period of about a day around May 17, 2010, during which it observed the asteroid four times.

    Because WISE (renamed NEOWISE in 2013) is an infrared telescope, it senses heat from asteroids. Euphrosyne is quite dark in visible light, but glows brightly at infrared wavelengths.

    This view is a composite of images taken at four different infrared wavelengths: 3.4 microns (color-coded blue), 4.6 microns (cyan), 12 microns (green) and 22 microns (red).

    The moving asteroid appears as a string of red dots because it is much cooler than the distant background stars. Stars have temperatures in the thousands of degrees, but the asteroid is cooler than room temperature. Thus the stars are represented by shorter wavelength (hotter) blue colors in this view, while the asteroid is shown in longer wavelength (cooler) reddish colors.
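
    The color assignments follow from Wien’s displacement law, which ties a blackbody’s temperature to the wavelength at which it glows most brightly (a standard relation, spelled out here for context rather than taken from the caption):

        \lambda_{\rm peak} \approx \frac{2898\ \mu{\rm m\,K}}{T}

    A stellar surface at, say, 5800 K peaks near 0.5 microns and is still bright in the short 3.4 micron band (coded blue), while an asteroid near 270 K peaks around 11 microns, squarely in the 12 and 22 micron bands (coded green and red).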

    The WISE spacecraft was put into hibernation in 2011 upon completing its goal of surveying the entire sky in infrared light. WISE cataloged three quarters of a billion objects, including asteroids, stars and galaxies. In August 2013, NASA decided to reinstate the spacecraft on a mission to find and characterize more asteroids.

    Fast Facts:

    › A new NASA study has traced some members of the near-Earth asteroid population back to their likely source.

    › The source may be the Euphrosyne family of dark asteroids on highly inclined (or tilted) orbits in the outer asteroid belt.

    › The study used data from NASA’s NEOWISE space telescope, which has a second life following its reactivation in 2013.

    NASA Wise Telescope
    WISE

    High above the plane of our solar system, near the asteroid-rich abyss between Mars and Jupiter, scientists have found a unique family of space rocks. These interplanetary oddballs are the Euphrosyne (pronounced you-FROH-seh-nee) asteroids, and by any measure they have been distant, dark and mysterious — until now.

    Distributed at the outer edge of the asteroid belt, the Euphrosynes have an unusual orbital path that juts well above the ecliptic, the equator of the solar system. The asteroid after which they are named, Euphrosyne — for an ancient Greek goddess of mirth — is about 156 miles (260 kilometers) across and is one of the 10 largest asteroids in the main belt. Current-day Euphrosyne is thought to be a remnant of a massive collision about 700 million years ago that formed the family of smaller asteroids bearing its name. Scientists think this event was one of the last great collisions in the solar system.

    A new study conducted by scientists at NASA’s Jet Propulsion Laboratory in Pasadena, California, used the agency’s orbiting Near-Earth Object Wide-field Infrared Survey Explorer (NEOWISE) telescope to look at these unusual asteroids to learn more about Near Earth Objects, or NEOs, and their potential threat to Earth.

    NEOs are bodies whose orbits around the sun approach the orbit of Earth; this population is short-lived on astronomical timescales and is fed by other reservoirs of bodies in our solar system. As they orbit the sun, NEOs can occasionally have close approaches to Earth. For this reason alone — the safety of our home planet — the study of such objects is important.

    As a result of their study, the JPL researchers believe the Euphrosynes may be the source of some of the dark NEOs found to be on long, highly inclined orbits. They found that, through gravitational interactions with Saturn, Euphrosyne asteroids can evolve into NEOs over timescales of millions of years.

    NEOs can originate in either the asteroid belt or the more distant outer reaches of the solar system. Those from the asteroid belt are thought to evolve toward Earth’s orbit through collisions and the gravitational influence of the planets. For the Euphrosynes, which originate well above the ecliptic and near the far edge of the asteroid belt, the forces that shape their trajectories toward Earth are far more moderate.

    “The Euphrosynes have a gentle resonance with the orbit of Saturn that slowly moves these objects, eventually turning some of them into NEOs,” said Joseph Masiero, JPL’s lead scientist on the Euphrosynes study. “This particular gravitational resonance tends to push some of the larger fragments of the Euphrosyne family into near-Earth space.”

    By studying the Euphrosyne family asteroids with NEOWISE, JPL scientists have been able to measure their sizes and the amount of solar energy they reflect. Since NEOWISE operates in the infrared portion of the spectrum, it detects heat. Therefore, it can see dark objects far better than telescopes operating at visible wavelengths, which sense reflected sunlight. Its heat-sensing capability also allows it to measure sizes more accurately.
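
    To see why heat sensing helps with sizes, note that visible light alone constrains only a degenerate combination of size and reflectivity, via the standard asteroid radiometry relation (again my addition, not from the article):

        D \approx \frac{1329\ {\rm km}}{\sqrt{p_V}} \, 10^{-H/5}

    where H is the asteroid’s absolute visible magnitude and p_V its visible albedo: a small bright object and a large dark one can share the same H. The thermal flux NEOWISE measures scales with the emitting area, roughly as D^2, so it fixes the diameter nearly independently of albedo, and combining the two measurements yields the albedo as well.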

    The 1,400 Euphrosyne asteroids studied by Masiero and his colleagues turned out to be large and dark, with highly inclined and elliptical orbits. These traits make them good candidates for the source of some of the dark NEOs the NEOWISE telescope detects and discovers, particularly those that also have highly inclined orbits.

    NEOWISE was originally launched as an astrophysics mission in 2009 as the Wide-field Infrared Survey Explorer, or WISE. It operated until 2011 and was then shut down. But the spacecraft, now dubbed NEOWISE, would get a second life. “NEOWISE is a great tool for searching for near-Earth asteroids, particularly high-inclination, dark objects,” Masiero said.

    There are over 700,000 asteroidal bodies currently known in the main belt that range in size from large boulders to about 60 percent of the diameter of Earth’s moon, with many yet to be discovered. This makes finding the specific point of origin of most NEOs extremely difficult.

    With the Euphrosynes it’s different. “Most near-Earth objects come from a number of sources in the inner region of the main belt, and they are quickly mixed around,” Masiero said. “But with objects coming from this family, in such a unique region, we are able to draw a likely path for some of the unusual, dark NEOs we find back to the collision in which they were born.”

    A better understanding of the origins and behaviors of these mysterious objects will give researchers a clearer picture of asteroids in general, and in particular the NEOs that skirt our home planet’s neighborhood. Such studies are important, and potentially critical, to the future of humanity, which is a primary reason JPL and its partners continue to relentlessly track these wanderers within our solar system. To date, U.S. assets have discovered more than 98 percent of the known NEOs.

    NASA’s Jet Propulsion Laboratory in Pasadena, California, manages the NEOWISE mission for NASA’s Science Mission Directorate in Washington. The Space Dynamics Laboratory in Logan, Utah, built the science instrument. Ball Aerospace & Technologies Corp. of Boulder, Colorado, built the spacecraft. Science operations and data processing take place at the Infrared Processing and Analysis Center at the California Institute of Technology in Pasadena. Caltech manages JPL for NASA.

    NASA’s Near-Earth Object Program at NASA Headquarters, Washington, manages and funds the search, study and monitoring of asteroids and comets whose orbits periodically bring them close to Earth. JPL manages the Near-Earth Object Office for NASA’s Science Mission Directorate in Washington.

    For more information about NEOWISE, visit:

    http://www.nasa.gov/neowise

    More information about asteroids and near-Earth objects is available at:

    http://neo.jpl.nasa.gov

    http://www.jpl.nasa.gov/asteroidwatch

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo
    jpl

     
  • richardmitnick 5:48 pm on August 3, 2015 Permalink | Reply

    From NOVA: “New Theory Could Tell Us If Life Came From an Alien Planet” 

    PBS NOVA

    NOVA

    03 Aug 2015
    Abbey Interrante

    An artist’s rendition of exoplanets

    Life is thought to have originated spontaneously on Earth about 3.5 billion years ago, but some scientists think life may have come to Earth from elsewhere in the universe. Yet if finding the origins of life on Earth has been difficult, searching for them in the sky seems nearly impossible.

    For supporters of the panspermia hypothesis, which says that life could have started on one planet and jumped to another, a new model proposed by Henry Lin and Abraham Loeb, both of Harvard University, is an exciting prospect not because it proves the theory, but because it makes it testable.

    Scientists look for evidence of panspermia by searching for biosignatures, or evidence of past or present life, on space objects. However, space objects are so numerous in the universe that checking all of them is impractical. So Lin and Loeb suggest that if panspermia were to occur, it would appear in clusters of solar systems. For example, if Earth sat at the edge of one of these clusters, half of what’s viewed in the sky from the planet could be inhabited while the other half would be uninhabited.

    According to Lin and Loeb, if 25 exoplanets on one side of the sky showed signs of biological activity, and 25 on the other side showed no biological activity, this would be a smoking gun for panspermia. However, if the Earth is in the center of a panspermiac cluster, then it would be surrounded by biosignatures. If that were the case, panspermia would be harder to confirm.

    Joshua Sokol, reporting for New Scientist, explains further:

    Future probes like NASA’s James Webb Space Telescope will scrutinise the atmospheres of planets in other solar systems for possible signs of biological activity.

    NASA Webb Telescope
    Webb

    If life spreads between planets, inhabited worlds should clump in space like colonies of bacteria on a Petri dish. Otherwise, Lin says, its signature would be seen on just a few, randomly scattered planets.

    Studies show that regions of high stellar density are more likely to have higher transfer rates of rocky material, and therefore a higher chance of spreading life. In regions of low stellar density, panspermia would be less likely to occur and would therefore leave few or even zero biosignatures. Nevertheless, at still higher stellar densities, the chance that an area is inhospitable to life also rises because of the increased number of stellar encounters.

    Some question if panspermia has occurred already, resulting in life on Earth, or if humans will be the first to generate it through colonization of other planets. As our technological prowess increases, spacecraft could eventually transport us humans successfully through space. But it’s also possible that primitive life could evolve to survive the harsh environment of space, piggyback on debris from, say, a meteor collision with Earth, and colonize a new world. The question is: who will get there first?

    [Someone tell me where this article describes any test methods.]

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 5:30 pm on August 3, 2015 Permalink | Reply

    From LBL: “Notes from the Particle Physics Underground” 

    Berkeley Logo

    Berkeley Lab

    August 3, 2015
    Kate Greene 913-634-1611

    The Black Hills region in western South Dakota is known for its rich stores of gold and silver. In fact, 41 million ounces of gold and 9 million ounces of silver were pulled from Homestake Mine in Lead, SD between the 1870s and early 2000s. During that time, 370 miles of mine tunnels were created, reaching depths of 8,000 feet. But in 2006 science took over: Sanford Underground Research Facility (Sanford Lab) is an underground particle physics research complex housed in the former mine, using the earth and rock to shield experiments from cosmic rays. The better the shielding, the more likely the scientists will detect neutrinos and suspected dark matter particles called WIMPs. Earlier this summer, Lead celebrated the ribbon-cutting of a new visitor center that highlights the history of the old mine and the current and future science at Sanford Lab.

    The U.S. Department of Energy’s Lawrence Berkeley National Lab (Berkeley Lab) is a key player in the creation of Sanford Lab and in the operation of some of its current and future experiments, including the dark matter experiment called LUX and a neutrino experiment called the MAJORANA DEMONSTRATOR. Berkeley Lab is also managing the Berkeley Low Background Facility and the forthcoming LUX-ZEPLIN (LZ) dark matter project, which builds on the accomplishments of LUX.

    LUX Dark matter
    LUX

    MAJORANA DEMONSTRATOR Experiment
    MAJORANA DEMONSTRATOR

    Lux Zeplin project
    LUX-ZEPLIN (LZ)

    As a science writer for Berkeley Lab, I was able to catch a ride on one of the mine’s elevators, called a cage, and descend 4,850 feet down to learn more about the science and the scientists who work on these projects.

    The above slideshow illustrates what it’s like to go underground. The short video below shows the last few seconds of the cage ride and our exit into the space called the Davis Campus, completed in 2012 and home to the MAJORANA DEMONSTRATOR, the LUX experiment, and other facilities.

    The cage operator communicates with an operator on the surface at the start and end of the ride. There are no lights in the cage or shaft other than headlamps.

    It takes about ten minutes to ride the cage down to the 4,850 level where LUX and the MAJORANA DEMONSTRATOR are located. This video captures the last few seconds of the cage ride and the entry into the Davis Campus.

    In addition to checking out the MAJORANA DEMONSTRATOR and LUX projects, I joined a tour given to a group of esteemed scientists (including Berkeley Lab’s Eric Linder) who were in the nearby town of Deadwood, SD for a conference on particle physics and cosmology. As part of the tour, we traveled through unlit tunnels, visited construction sites of a future experiment, and walked through the refuge chamber, a shelter equipped with water, meal bars, and canisters of breathable air in case a fire or other disaster strikes.

    I went underground at 7:30 a.m. and came back up at noon. My four and a half hours of being shielded from daylight and cosmic rays was pleasant enough, but when I stepped outside, above ground, I was glad to see a bright sun and feel the breeze on my skin.

    All photo and video credits: Kate Greene.

    The full article includes a slideshow that details the underground experiments. The short video that follows gives a sense of what it’s like to travel through the tunnels.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 5:03 pm on August 3, 2015 Permalink | Reply
    Tags: SDSSIII

    From Astronomy: “Stars in our galaxy move far from home” 

    Astronomy Magazine

    August 03, 2015
    Sloan Digital Sky Survey Press

    A single frame from an animation shows how stellar orbits in the Milky Way can change. The image shows two pairs of stars (marked as red and blue) in which each pair started in the same orbit, and then one star in the pair changed orbits. The star marked as red has completed its move into a new orbit, while the star marked in blue is still moving. Dana Berry/SkyWorks Digital, Inc.; SDSS collaboration

    When it comes to our galaxy, home is where the star is.

    Scientists with the Sloan Digital Sky Survey (SDSS) have created a new map of the Milky Way and determined that 30 percent of stars have dramatically changed their orbits.

    SDSS Telescope
    SDSS telescope at Apache Point, NM, USA

    This discovery brings a new understanding of how stars form and how they travel throughout our galaxy.

    “In our modern world, many people move far away from their birthplaces, sometimes halfway around the world,” said Michael Hayden of New Mexico State University (NMSU). “Now we’re finding the same is true of stars in our galaxy — about 30 percent of the stars in our galaxy have traveled a long way from the orbits in which they were born.”

    To build a map of the Milky Way, the scientists used the SDSS Apache Point Observatory Galactic Evolution Explorer (APOGEE) spectrograph to observe 100,000 stars during a four-year campaign.

    The key to creating and interpreting this map is measuring the elements in the atmosphere of each star. “From the chemical composition of a star, we can learn its ancestry and life history,” Hayden said.

    The chemical information comes from spectra, which are detailed measurements of how much light the star gives off at different wavelengths. Spectra show prominent lines that correspond to elements and molecules present. Reading the spectral lines of a star can tell astronomers what the star is made of.

    “Stellar spectra show us that the chemical makeup of our galaxy is constantly changing,” said Jon Holtzman from NMSU. “Stars create heavier elements in their cores, and when the stars die, those heavier elements go back into the gas from which the next stars form.”

    As a result of this process of “chemical enrichment,” each generation of stars has a higher percentage of heavier elements than the previous generation did. In some regions of the galaxy, star formation has proceeded more vigorously than in other regions, and in these more vigorous regions, more generations of stars have formed. Thus, the average amount of heavier elements in stars varies across different parts of the galaxy. Astronomers can use the amount of heavy elements in a star to determine what part of the galaxy the star was born in.
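
    Astronomers usually quote these heavy-element differences in the bracket “metallicity” notation, defined relative to the Sun (a standard convention, added here for context):

        [{\rm Fe/H}] = \log_{10} \left( \frac{N_{\rm Fe}/N_{\rm H}}{(N_{\rm Fe}/N_{\rm H})_{\odot}} \right)

    So [Fe/H] = 0 means solar iron abundance and [Fe/H] = -1 means one-tenth solar; APOGEE derives such ratios for elements like the 15 mapped in this study.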

    Hayden and colleagues used APOGEE data to map the relative amounts of 15 separate elements, including carbon, silicon, and iron, for stars all over the galaxy. What they found surprised them — up to 30 percent of stars had compositions indicating that they were formed in parts of the galaxy far from their current positions.

    “While on average the stars in the outer disk of the Milky Way have less heavy element enrichment, there is a fraction of stars in the outer disk that have heavier element abundances that are more typical of stars in the inner disk,” said Jo Bovy of the Institute for Advanced Study and the University of Toronto.

    When the team looked at the pattern of element abundances in detail, they found that much of the data could be explained by a model in which stars migrate into new orbits around the galactic center, moving nearer or farther with time. These random in-and-out motions are referred to as “migration” and are likely caused by irregularities in the galactic disk, such as the Milky Way’s famous spiral arms. Evidence of stellar migration had previously been seen in stars near the Sun, but the new study is the first clear evidence that migration occurs for stars throughout the galaxy.

    Future studies by astronomers using data from SDSS promise even more new discoveries. “These latest results take advantage of only a small fraction of the available APOGEE data,” said Steven Majewski, the principal investigator of APOGEE. “Once we unlock the full information content of APOGEE, we will understand the chemistry and shape of our galaxy much more clearly.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 4:47 pm on August 3, 2015 Permalink | Reply
    Tags: Apple Computers, Computer worms

    From WIRED: “Researchers Create First Firmware Worm That Attacks Macs” 

    Wired logo

    Wired

    08.03.15
    Kim Zetter

    Josh Valcarcel/WIRED

    The common wisdom when it comes to PCs and Apple computers is that the latter are much more secure. Particularly when it comes to firmware, people have assumed that Apple systems are locked down in ways that PCs aren’t.

    It turns out this isn’t true. Two researchers have found that several known vulnerabilities affecting the firmware of all the top PC makers can also hit the firmware of Macs. What’s more, the researchers have designed the first proof-of-concept worm that would allow a firmware attack to spread automatically from MacBook to MacBook, without the need for them to be networked.

    The attack raises the stakes considerably for system defenders since it would allow someone to remotely target machines—including air-gapped ones—in a way that wouldn’t be detected by security scanners and would give an attacker a persistent foothold on a system even through firmware and operating system updates. Firmware updates require the assistance of a machine’s existing firmware to install, so any malware in the firmware could block new updates from being installed or simply write itself to a new update as it’s installed.

    The only way to eliminate malware embedded in a computer’s main firmware would be to re-flash the chip that contains the firmware.

    “[The attack is] really hard to detect, it’s really hard to get rid of, and it’s really hard to protect against something that’s running inside the firmware,” says Xeno Kovah, one of the researchers who designed the worm. “For most users that’s really a throw-your-machine-away kind of situation. Most people and organizations don’t have the wherewithal to physically open up their machine and electrically reprogram the chip.”

    It’s the kind of attack intelligence agencies like the NSA covet. In fact, documents released by Edward Snowden, and research conducted by Kaspersky Lab, have shown that the NSA has already developed sophisticated techniques for hacking firmware.

    The Mac firmware research was conducted by Kovah, owner of LegbaCore, a firmware security consultancy, and Trammell Hudson, a security engineer with Two Sigma Investments. They’ll be discussing their findings on August 6 at the Black Hat security conference in Las Vegas.

    A computer’s core firmware—also referred to at times as the BIOS, UEFI or EFI—is the software that boots a computer and launches its operating system. It can be infected with malware because most hardware makers don’t cryptographically sign the firmware embedded in their systems, or their firmware updates, and don’t include any authentication functions that would prevent any but legitimate signed firmware from being installed.

    Firmware is a particularly valuable place to hide malware on a machine because it operates at a level below the level where antivirus and other security products operate and therefore does not generally get scanned by these products, leaving malware that infects the firmware unmolested. There’s also no easy way for users to manually examine the firmware themselves to determine if it’s been altered. And because firmware remains untouched if the operating system is wiped and re-installed, malware infecting the firmware can maintain a persistent hold on a system throughout attempts to disinfect the computer. If a victim, thinking his or her computer is infected, wipes the computer’s operating system and reinstalls it to eliminate malicious code, the malicious firmware code will remain intact.

    5 Firmware Vulnerabilities in Macs

    Last year, Kovah and his partner at LegbaCore, Corey Kallenberg, uncovered a series of firmware vulnerabilities that affected 80 percent of the PCs they examined, including ones from Dell, Lenovo, Samsung and HP. Although hardware makers implement some protections to make it difficult for someone to modify their firmware, the vulnerabilities the researchers found allowed them to bypass these and reflash the BIOS to plant malicious code in it.

    Kovah, along with Hudson, then decided to see if the same vulnerabilities applied to Apple firmware and found that untrusted code could indeed be written to the MacBook boot flash firmware. “It turns out almost all of the attacks we found on PCs are also applicable to Macs,” says Kovah.

    They looked at six vulnerabilities and found that five of them affected Mac firmware. The vulnerabilities are applicable to so many PCs and Macs because hardware makers all tend to use some of the same firmware code.

    “Most of these firmwares are built from the same reference implementations, so when someone finds a bug in one that affects Lenovo laptops, there’s a really good chance it’s going to affect the Dells and HPs,” says Kovah. “What we also found is that there is really a high likelihood that the vulnerability will also affect Macbooks. Because Apple is using a similar EFI firmware.”

    In the case of at least one vulnerability, there were specific protections that Apple could have implemented to prevent someone from updating the Mac code but didn’t.

    “People hear about attacks on PCs and they assume that Apple firmware is better,” Kovah says. “So we’re trying to make it clear that any time you hear about EFI firmware attacks, it’s pretty much all x86 [computers].”

    They notified Apple of the vulnerabilities, and the company has already fully patched one and partially patched another. But three of the vulnerabilities remain unpatched.

    Thunderstrike 2: Stealth Firmware Worm for Macs

    Using these vulnerabilities, the researchers then designed a worm they dubbed Thunderstrike 2 that can spread between MacBooks undetected. It can remain hidden because it never touches the computer’s operating system or file system. “It only ever lives in firmware, and consequently no [scanners] are actually looking at that level,” says Kovah.

    The attack infects the firmware in just seconds and can also be done remotely.

    There have been examples of firmware worms in the past—but they spread between things like home office routers and also involved infecting the Linux operating system on the routers. Thunderstrike 2, however, is designed to spread by infecting what’s known as the option ROM on peripheral devices.

    An attacker could first remotely compromise the boot flash firmware on a MacBook by delivering the attack code via a phishing email and malicious web site. That malware would then be on the lookout for any peripherals connected to the computer that contain option ROM, such as an Apple Thunderbolt Ethernet adapter, and infect the firmware on those. The worm would then spread to any other computer to which the adapter gets connected.

    When another machine is booted with this worm-infected device inserted, the machine firmware loads the option ROM from the infected device, triggering the worm to initiate a process that writes its malicious code to the boot flash firmware on the machine. If a new device is subsequently plugged into the computer and contains option ROM, the worm will write itself to that device as well and use it to spread.

    One way to randomly infect machines would be to sell infected Ethernet adapters on eBay or infect them in a factory.

    “People are unaware that these small cheap devices can actually infect their firmware,” says Kovah. “You could get a worm started all around the world that’s spreading very low and slow. If people don’t have awareness that attacks can be happening at this level then they’re going to have their guard down and an attack will be able to completely subvert their system.”

    In a demo video Kovah and Hudson showed WIRED, they used an Apple Thunderbolt to Gigabit Ethernet adapter, but an attacker could also infect the option ROM on an external SSD or on a RAID controller.

    No security products currently check the option ROM on Ethernet adapters and other devices, so attackers could move their worm between machines without fear of being caught. They plan to release some tools at their talk that will allow users to check the option ROM on their devices, but the tools aren’t able to check the boot flash firmware on machines.

    The attack scenario they demonstrated is ideal for targeting air-gapped systems that can’t be infected through network connections.

    “Let’s say you’re running a uranium refining centrifuge plant and you don’t have it connected to any networks, but people bring laptops into it and perhaps they share Ethernet adapters or external SSDs to bring data in and out,” Kovah notes. “Those SSDs have option ROMs that could potentially carry this sort of infection. Perhaps because it’s a secure environment they don’t use WiFi, so they have Ethernet adapters. Those adapters also have option ROMs that can carry this malicious firmware.”

    He likens it to how Stuxnet spread to Iran’s uranium enrichment plant at Natanz via infected USB sticks. But in that case, the attack relied on zero-day attacks against the Windows operating system to spread. As a result, it left traces in the OS where defenders might be able to find them.

    “Stuxnet sat around as a kernel driver on Windows file systems most of the time, so basically it existed in very readily available, forensically-inspectable places that everybody knows how to check. And that was its Achilles’ heel,” Kovah says. But malware embedded in firmware would be a different story since firmware inspection is a vicious circle: the firmware itself controls the ability of the OS to see what’s in the firmware, thus a firmware-level worm or malware could hide by intercepting the operating system’s attempts to look for it. Kovah and colleagues showed how firmware malware could lie like this at a talk they gave in 2012. “[The malware] could trap those requests and just serve up clean copies [of code]… or hide in system management mode where the OS isn’t even allowed to look,” he says.

    Hardware makers could guard against firmware attacks if they cryptographically signed their firmware and firmware updates and added authentication capabilities to hardware devices to verify these signatures. They could also add a write-protect switch to prevent unauthorized parties from flashing the firmware.

    Although these measures would guard against low-level hackers subverting the firmware, well-resourced nation-state attackers could still steal a hardware maker’s master key to sign their malicious code and bypass these protections.

    Therefore, an additional countermeasure would involve hardware vendors giving users the ability to easily read their machine’s firmware to determine if it has changed since installation. If vendors provided a checksum of the firmware and firmware updates they distribute, users could periodically check to see if what’s installed on their machine differs from the checksums. A checksum is a cryptographic representation of data that is created by running the data through an algorithm to produce a unique identifier composed of letters and numbers. Each checksum is supposed to be unique so that if anything changes in the dataset, it will produce a different checksum.
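
    For the curious, here is a minimal sketch of that verification step in Python, assuming a vendor published a SHA-256 digest alongside each firmware image (the file name and digest below are hypothetical):

        import hashlib

        def firmware_checksum(path, algorithm="sha256"):
            """Hash a firmware image in chunks and return its hex digest."""
            h = hashlib.new(algorithm)
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(8192), b""):
                    h.update(chunk)
            return h.hexdigest()

        # Hypothetical comparison against a vendor-published digest.
        published = "0000...0000"  # digest string copied from the vendor's site
        if firmware_checksum("firmware_dump.bin") != published:
            print("Firmware image differs from the published build!")

    The catch, as the researchers note, is that reading the firmware out for hashing currently relies on the very firmware being audited, so vendors would also need to provide a trustworthy read-out path.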

    But hardware makers aren’t implementing these changes because it would require re-architecting systems, and in the absence of users demanding more security for their firmware, hardware makers aren’t likely to make the changes on their own.

    “Some vendors like Dell and Lenovo have been very active in trying to rapidly remove vulnerabilities from their firmware,” Kovah notes. “Most other vendors, including Apple as we are showing here, have not. We use our research to help raise awareness of firmware attacks, and show customers that they need to hold their vendors accountable for better firmware security.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 2:21 pm on August 3, 2015 Permalink | Reply

    From NYU: “NYU Scientists bring order, and color, to microparticles”

    NYU BLOC

    New York University

    August 3, 2015
    No Writer Credit

    A team of New York University scientists has developed a technique that prompts microparticles to form ordered structures in a variety of materials. The advance offers a method to potentially improve the makeup and color of optical materials used in computer screens along with other consumer products. (c) iStock/dolphfyn

    A team of New York University scientists has developed a technique that prompts microparticles to form ordered structures in a variety of materials. The advance, which appears in the Journal of the American Chemical Society (JACS) as an “Editors’ Choice” article, offers a method to potentially improve the makeup and color of optical materials used in computer screens along with other consumer products.

    The work is centered on enhancing the arrangement of colloids—small particles suspended within a fluid medium. Colloidal dispersions are found in such everyday items as paint, milk, gelatin, glass, and porcelain, but their potential to create new materials remains largely untapped.

    Notably, DNA-coated colloids offer particular promise because they can be linked together, with DNA serving as the glue to form a range of new colloidal structures. However, previous attempts have produced uneven results, with these particles attaching to each other in ways that produce chaotic or inflexible configurations.

    The NYU team developed a new method to apply DNA coating to colloids so that they crystallize—or form new compounds—in an orderly manner. Specifically, it employed a synthetic strategy—click chemistry—introduced more than a decade ago that is a highly efficient way of attaching DNA. Here, scientists initiated a chemical reaction that allows molecular components to stick together in a particular fashion—a process some have compared to connecting Legos.

    In a previous paper, published earlier this year in the journal Nature Communications, the research team outlined the successful execution of this technique. However, the method, at that point, could manipulate only one type of particle. In the JACS study, the research team shows the procedure can handle five additional types of materials—and in different combinations.

    The advance, the scientists say, is akin to a builder having the capacity to construct a house using glass, metal, brick, and concrete—rather than only wood.

    “If you want to program and create structures at microscopic levels, you need to have the ability for a particle to move around and find its optimal position,” explains David Pine, a professor of physics at NYU and chair of the Chemical and Bioengineering Department at NYU Polytechnic School of Engineering. “Our research shows that this can be done with multiple materials, all resulting in several different types of compounds.”

    The work was conducted by researchers at NYU’s Molecular Design Institute and Center for Soft Matter Research and at South Korea’s Sungkyunkwan University. The paper’s other authors were: Yufeng Wang of the Center for Soft Matter Research and Molecular Design Institute; Yu Wang and Xiaolong Zheng of the Molecular Design Institute; Etienne Ducrot of the Center for Soft Matter Research; Myung-Goo Lee and Gi-Ra Yi of Sungkyunkwan University’s School of Chemical Engineering; and Marcus Weck of the Molecular Design Institute.

    The research was supported, in part, by grants from the U.S. Army Research Office (W911NF-10-1-0518), the National Research Foundation of Korea (NRF-2014S1A2A2028608), and by the National Science Foundation’s Materials Research Science and Engineering Center (MRSEC) Program (DMR-0820341).

    NYU’s center is one of 24 MRSECs in the country. These NSF-backed centers support interdisciplinary and multidisciplinary materials research to address fundamental problems in science and engineering.

    For more on the NYU MRSEC, click here.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NYU Campus

    More than 175 years ago, Albert Gallatin, the distinguished statesman who served as secretary of the treasury under Presidents Thomas Jefferson and James Madison, declared his intention to establish “in this immense and fast-growing city … a system of rational and practical education fitting for all and graciously opened to all.” Founded in 1831, New York University is now one of the largest private universities in the United States. Of the more than 3,000 colleges and universities in America, New York University is one of only 60 member institutions of the distinguished Association of American Universities.

     