Tagged: Basic Research

  • richardmitnick 3:29 pm on October 20, 2014 Permalink | Reply
    Tags: Basic Research

    From astrobio.net: “Exomoons Could Be Abundant Sources Of Habitability”

    Astrobiology Magazine

    Oct 20, 2014
    Elizabeth Howell

    With about 4,000 planet candidates from the Kepler Space Telescope data to analyze so far, astronomers are busy trying to figure out questions about habitability. What size planet could host life? How far from its star does it need to be? What would its atmosphere need to be made of?

    NASA/Kepler

    Look at our own solar system, however, and there’s a big gap in the information we need. Most of the planets have moons, so surely at least some of the Kepler finds would have them as well. Tracking down these tiny worlds, however, is a challenge.

    Europa is one of the moons in our solar system that could host life. What about beyond the solar system? Credit: NASA/JPL/Ted Stryk

    A new paper in the journal Astrobiology, called Formation, Habitability, and Detection of Extrasolar Moons, surveys this mostly unexplored field of extrasolar research. The scientists conduct an extensive review of the literature on what is hypothesized about moons beyond the Solar System, and they add intriguing new results.

    A wealth of moons exist in our own solar system that could host life. Icy Europa, which is circling Jupiter, was recently discovered to have plumes of water erupting from its surface. Titan, in orbit around Saturn, is the only known moon with an atmosphere, and could have the precursor elements to life in its hydrocarbon seas that are warmed by Saturn’s heat. Other candidates for extraterrestrial hosts include Jupiter’s moons Callisto and Ganymede, as well as Saturn’s satellite Enceladus.

    Lead author René Heller, an astrophysicist at the Origins Institute at McMaster University, in Ontario, Canada, said some exomoons could be even better candidates for life than many exoplanets.

    “Moons have separate energy sources,” he said. “While the habitability of terrestrial planets is mostly determined by stellar illumination, moons also receive reflected stellar light from the planet as well as thermal emission from the planet itself.”

    Moreover, a planet like Jupiter — which hosts most of the moons in the Solar System that could support life — provides even more potential energy sources, he added. The planet is still shrinking and thereby converts gravitational energy into heat, so that it actually emits more light than it receives from the Sun, providing yet more illumination. Besides that, moons orbiting close to a gas giant are flexed by the planet’s gravity, providing potential tidal heating as an internal, geological heat source.

    Triton’s odd, melted appearance hints that the moon was captured and altered by Neptune. Credit: NASA

    Finding the first exomoon

    The first challenge in studying exomoons outside our Solar System is to actually find one. Earlier this year, NASA-funded researchers reported the possible discovery of such a moon, but the claim was ambiguous and could never be confirmed. That’s because it appeared as a one-time event, when one star passed in front of another, acting as a sort of gravitational lens that amplified the background star. Two objects popped out in the gravitational lens in the foreground — either a planet and a star, or a planet and an extremely heavy exomoon.

    For his part, Heller is convinced that exomoons are lurking in the Kepler data, but they have not been discovered yet. Only one project right now is dedicated to searching for exomoons; it is led by David Kipping at the Harvard-Smithsonian Center for Astrophysics. His group has published several papers investigating 20 Kepler planets and candidates in total. The big restriction on their efforts is computational power, as their simulations require supercomputers.

    Another limiting factor is the number of observatories that can search for exomoons. To detect them, at least a handful of transits of the planet-moon system across their common host star would be required to absolutely make sure that the companion is a moon, Heller said. Also, the planet with the moon would have to be fairly far from its star, and decidedly not those close-in hot Jupiters that take only a few days to make an orbit. In that zone, the gravitational drag of the star would fatally perturb any moon’s orbit.

    Heller estimates that a telescope would need to stare constantly at the same patch of sky for several hundred days, minimum, to pick up an exomoon. Kepler fulfilled that obligation in spades with its four years of data gazing at the same spot in the sky, but astronomers will have to wait again for that opportunity.
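
    To make the baseline requirement concrete, here is a rough Python sketch counting how many transits of a planet-moon system each mission could catch. The 150-day orbital period and the three-transit requirement are illustrative assumptions, not figures from the article.

```python
# Rough illustration (not from the article) of why a long, uninterrupted stare
# is needed: count how many transits of a temperate planet-moon system could
# fall inside each mission's continuous observing baseline.
# Assumptions: a 150-day orbital period and a requirement of 3 transits.

def max_transits(orbital_period_days: float, baseline_days: float) -> int:
    """Maximum number of transits that can fit inside a continuous baseline."""
    return int(baseline_days // orbital_period_days) + 1  # one transit may fall near the start

missions = {
    "Kepler prime mission (~4 yr)": 4 * 365,
    "Kepler K2 campaign (~80 d)": 80,
    "TESS single field (~70 d)": 70,
}

period_days = 150   # illustrative temperate orbit, well outside the hot-Jupiter zone
needed = 3          # "a handful" of transits to pin down a moon signal

for name, baseline in missions.items():
    n = max_transits(period_days, baseline)
    print(f"{name:30s}: up to {n:2d} transits -> {'enough' if n >= needed else 'too few'}")
```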

    Because two of Kepler’s reaction wheels (pointing devices) have failed, Kepler’s new mission will use the pressure of sunlight to keep it steady. But it can now only point to the same region of the sky for about 80 days at a time, because the telescope periodically needs to be moved so as not to risk placing its optics too close to the Sun.

    NASA’s forthcoming Transiting Exoplanet Survey Satellite (TESS) is only expected to look at a given field for 70 days. Further into the future, the European Space Agency’s PLAnetary Transits and Oscillations of stars (PLATO) mission will launch in 2024 for a planned six-year mission looking at several spots in the sky.

    NASA/TESS

    ESA PLATO

    “PLATO is the next step, with a comparable accuracy to Kepler but a much larger field of view and hopefully a longer field of view coverage,” Heller said.

    Clues in our solar system

    Thousands of exoplanets and exoplanet candidates have been discovered, but astronomers are still searching for exomoons. Credit: ESA – C. Carreau

    Heller characterizes moons as an under-appreciated feature of extrasolar planetary systems. Just by looking around us in the Solar System, he says, astronomers have been able to draw crucial conclusions about how the moons must have formed and evolved together with their planets. Moons thus carry information about the substructure of planet evolution that is not accessible from planet observations alone.

    The Earth’s moon, for example, was likely formed when a Mars-sized object collided with the proto-Earth and produced a debris disk. Over time, that debris coalesced into our moon.

    While Heller says the literature mostly focuses on collision scenarios between an Earth-sized object and a Mars-sized object, he doesn’t see any reason why crashes on a bigger scale might not happen. Perhaps an Earth-sized object crashed into an object that was five times the mass of Earth, producing an extrasolar Earth-Earth binary planet system, he suggests.

    Another collision scenario likely took place at Uranus. The gas giant’s rotation is tilted about 90 degrees in its orbit around the Sun. In other words, it is rolling on its side. More intriguing, its two dozen moons follow Uranus’ rotational equator, and they do not orbit in the same plane as Uranus’ track around the Sun. This scenario suggests that Uranus was hit multiple times by huge objects instead of just once, Heller said.

    Examining mighty Jupiter’s moons gives astronomers a sense of how high temperatures were in the disk that formed the gas giant and its satellites, Heller added. Ganymede, for example, is an icy moon. Models indicate that beyond Ganymede’s orbit (at about 15 Jupiter radii) it is sufficiently cold for water to pass from the gas to the solid (ice) stage, so the regular moons in these regions are very water-rich compared to the inner, mostly rocky moons Io and Europa.

    “It sounds a bit technical, but we couldn’t have this information about planetary accretion if we did not have the moons today to observe,” Heller said.

    Some moons could also have been captured, such as Neptune’s large moon, Triton. The moon orbits in a direction opposite to other moons in Neptune‘s system (and in fact, opposite to the direction of other large moons in the Solar System.) Plus, its odd terrain suggests that it used to be a free-floating object that was captured by Neptune’s gravity. Neptune is so huge that it raised tides within the moon, reforming its surface.

    Even comparing the different types of moons around planets in the Solar System reveals different timescales of formation. Jupiter includes four moons similar in size to Earth’s moon (Europa, Callisto, Ganymede and Io), while the next largest planet in our solar system, Saturn, only has one large moon called Titan. Astronomers believe Saturn has only one large moon because the gas that formed objects in our solar system was more plentiful in Jupiter’s system to provide material for the moons to form.

    The gas abundance happened as a consequence of the huge gas giant creating a void in the material surrounding our young Sun, pulling the material in for its moons. Saturn was not quite large enough to do this, resulting in fewer large moons.

    More strange situations could exist beyond our solar system’s boundaries, but it will take a dedicated search to find exomoons. Once they are discovered, however, they will allow planet formation and evolution studies on a completely new level.

    This research was supported in part by the Natural Sciences and Engineering Research Council of Canada (NSERC), the Center for Exoplanets and Habitable Worlds, which is supported by the Pennsylvania State University, the Pennsylvania Space Grant Consortium, the National Science Foundation (NSF), and the NASA Astrobiology Institute.

    See the full article here.

  • richardmitnick 2:42 pm on October 20, 2014 Permalink | Reply
    Tags: Basic Research

    From FNAL: “New high-speed transatlantic network to benefit science collaborations across the U.S.” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Monday, Oct. 20, 2014

    Karen McNulty-Walsh, Brookhaven Media and Communications Office, kmcnulty@bnl.gov, 631-344-8350
    Kurt Riesselmann, Fermilab Office of Communication, media@fnal.gov, 630-840-3351
    Jon Bashor, Computing Sciences Communications Manager, Lawrence Berkeley National Laboratory, jbashor@lbnl.gov, 510-486-5849

    Scientists across the United States will soon have access to new, ultra-high-speed network links spanning the Atlantic Ocean thanks to a project currently under way to extend ESnet (the U.S. Department of Energy’s Energy Sciences Network) to Amsterdam, Geneva and London. Although the project is designed to benefit data-intensive science throughout the U.S. national laboratory complex, heaviest users of the new links will be particle physicists conducting research at the Large Hadron Collider (LHC), the world’s largest and most powerful particle collider. The high capacity of this new connection will provide U.S. scientists with enhanced access to data at the LHC and other European-based experiments by accelerating the exchange of data sets between institutions in the United States and computing facilities in Europe.


    DOE’s Brookhaven National Laboratory and Fermi National Accelerator Laboratory—the primary computing centers for U.S. collaborators on the LHC’s ATLAS and CMS experiments, respectively—will make immediate use of the new network infrastructure once it is rigorously tested and commissioned. Because ESnet, based at DOE’s Lawrence Berkeley National Laboratory, interconnects all national laboratories and a number of university-based projects in the United States, tens of thousands of researchers from all disciplines will benefit as well.

    LHC at CERN

    ATLAS at the LHC

    CMS at CERN

    Brookhaven Lab

    The ESnet extension will be in place before the LHC at CERN in Switzerland—currently shut down for maintenance and upgrades—is up and running again in the spring of 2015. Because the accelerator will be colliding protons at much higher energy, the data output from the detectors will expand considerably—to approximately 40 petabytes of raw data per year compared with 20 petabytes for all of the previous lower-energy collisions produced over the three years of the LHC’s first run between 2010 and 2012.
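
    For a sense of scale, the short Python sketch below converts the quoted 40 petabytes of raw data per year into an average sustained transfer rate; treating all of it as transatlantic traffic is a deliberate simplification for illustration.

```python
# Back-of-the-envelope: the sustained rate implied by 40 petabytes of raw data
# per year (the figure quoted above). Treating all of it as transatlantic
# traffic is a deliberate simplification for illustration.

PB_IN_BITS = 8 * 10**15        # 1 petabyte = 10^15 bytes = 8 * 10^15 bits
SECONDS_PER_YEAR = 365 * 24 * 3600

raw_pb_per_year = 40
average_gbps = raw_pb_per_year * PB_IN_BITS / SECONDS_PER_YEAR / 1e9

print(f"Average sustained rate: {average_gbps:.1f} Gbit/s")
# Roughly 10 Gbit/s on average; real transfers are bursty and include derived
# data sets and reprocessing, so peak demand is far higher -- hence the need
# for much higher-capacity transatlantic links.
```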

    The cross-Atlantic connectivity during the first successful run for the LHC experiments, which culminated in the discovery of the Higgs boson, was provided by the US LHCNet network, managed by the California Institute of Technology. In recent years, major research and education networks around the world—including ESnet, Internet2, California’s CENIC, and European networks such as DANTE, SURFnet and NORDUnet—have increased their backbone capacity by a factor of 10, using sophisticated new optical networking and digital signal processing technologies. Until recently, however, higher-speed links were not deployed for production purposes across the Atlantic Ocean—creating a network “impedance mismatch” that can harm large, intercontinental data flows.

    An evolving data model

    This upgrade coincides with a shift in the data model for LHC science. Previously, data moved in a more predictable and hierarchical pattern strongly influenced by geographical proximity, but network upgrades around the world have now made it possible for data to be fetched and exchanged more flexibly and dynamically. This change enables faster science outcomes and more efficient use of storage and computational power, but it requires networks around the world to perform flawlessly together.

    “Having the new infrastructure in place will meet the increased need for dealing with LHC data and provide more agile access to that data in a much more dynamic fashion than LHC collaborators have had in the past,” said physicist Michael Ernst of DOE’s Brookhaven National Laboratory, a key member of the team laying out the new and more flexible framework for exchanging data between the Worldwide LHC Computing Grid centers.

    Ernst directs a computing facility at Brookhaven Lab that was originally set up as a central hub for U.S. collaborators on the LHC’s ATLAS experiment. A similar facility at Fermi National Accelerator Laboratory has played this role for the LHC’s U.S. collaborators on the CMS experiment. These computing resources, dubbed Tier 1 centers, have direct links to the LHC at the European laboratory CERN (Tier 0). The experts who run them will continue to serve scientists under the new structure. But instead of serving as hubs for data storage and distribution only among U.S.-based collaborators at Tier 2 and 3 research centers, the dedicated facilities at Brookhaven and Fermilab will be able to serve data needs of the entire ATLAS and CMS collaborations throughout the world. And likewise, U.S. Tier 2 and Tier 3 research centers will have higher-speed access to Tier 1 and Tier 2 centers in Europe.

    “This new infrastructure will offer LHC researchers at laboratories and universities around the world faster access to important data,” said Fermilab’s Lothar Bauerdick, head of software and computing for the U.S. CMS group. “As the LHC experiments continue to produce exciting results, this important upgrade will let collaborators see and analyze those results better than ever before.”

    Ernst added, “As centralized hubs for handling LHC data, our reliability, performance and expertise have been in demand by the whole collaboration, and now we will be better able to serve the scientists’ needs.”

    An investment in science

    ESnet is funded by DOE’s Office of Science to meet networking needs of DOE labs and science projects. The transatlantic extension represents a financial collaboration, with partial support coming from DOE’s Office of High Energy Physics (HEP) for the next three years. Although LHC scientists will get a dedicated portion of the new network once it is in place, all science programs that make use of ESnet will now have access to faster network links for their data transfers.

    “We are eagerly awaiting the start of commissioning for the new infrastructure,” said Oliver Gutsche, Fermilab scientist and member of the CMS Offline and Computing Management Board. “After the Higgs discovery, the next big LHC milestones will come in 2015, and this network will be indispensable for the success of the LHC Run 2 physics program.”

    This work was supported by the DOE Office of Science.

    The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

    See the full article here.


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 2:20 pm on October 19, 2014 Permalink | Reply
    Tags: Basic Research

    From astrobio.net: “Rediscovering Venus to Find Faraway Earths”

    Astrobiology Magazine

    Oct 19, 2014
    Contact:
    Lyndsay Meyer
    The Optical Society
    +1.202.416.1435
    lmeyer@osa.org

    New optical device designed to measure gravitational pull of a planet should speed the search for Earth-like exoplanets.

    Astronomers Chih-Hao Li and David Phillips of the Harvard-Smithsonian Center for Astrophysics want to rediscover Venus—that familiar, nearby planet stargazers can see with the naked eye much of the year.

    Granted, humans first discovered Venus in ancient times. But Li and Phillips have something distinctly modern in mind. They plan to find the second planet again using a powerful new optical device installed on the Italian National Telescope that will measure Venus’ precise gravitational pull on the sun. If they succeed, their first-of-its-kind demonstration of this new technology will be used for finding Earth-like exoplanets orbiting distant stars.

    Galileo Italian National Telescope

    “We are building a telescope that will let us see the sun the way we would see other stars,” said Phillips, who is a staff scientist at the Harvard-Smithsonian Center for Astrophysics. He and Li, a research associate at the Center for Astrophysics, will describe the device in a paper to be presented at The Optical Society’s (OSA) 98th Annual Meeting, Frontiers in Optics, being held Oct. 19-23 in Tucson, Arizona, USA. Li is the lead author of the paper, which has 12 collaborators.

    Astronomers have identified more than 1,700 exoplanets, some as far as hundreds of light years away. Most were discovered by the traditional transit method, which measures the decrease in brightness when a planet orbiting a distant star transits that luminous body, moving directly between the Earth and the star. This provides information about the planet’s size, but not its mass.

    Li and Phillips are developing a new laser-based technology known as the green astro-comb for use with the “radial velocity method,” which offers complementary information about the mass of the distant planet.

    From this information, astronomers will be able to determine whether distant exoplanets they discover are rocky worlds like Earth or less dense gas giants like Jupiter. The method is precise enough to help astronomers identify Earth-like planets in the “habitable zone,” the orbital distance “sweet-spot” where water exists as a liquid.
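
    To see why having both measurements matters, the Python sketch below derives bulk density from an assumed mass and radius; the Earth-like and Jupiter-like values are illustrative, not objects discussed in the article.

```python
# Illustration of why mass (radial velocity) plus radius (transit) matters:
# together they give bulk density, which separates rocky worlds from gas
# giants. The two example planets below are illustrative, not objects from
# the article.
import math

EARTH_MASS_KG = 5.97e24
EARTH_RADIUS_M = 6.371e6

def bulk_density_g_cm3(mass_kg: float, radius_m: float) -> float:
    """Mean density in g/cm^3 for a sphere of the given mass and radius."""
    volume_m3 = (4.0 / 3.0) * math.pi * radius_m**3
    return mass_kg / volume_m3 / 1000.0   # kg/m^3 -> g/cm^3

rocky = bulk_density_g_cm3(1.0 * EARTH_MASS_KG, 1.0 * EARTH_RADIUS_M)     # Earth analogue
gaseous = bulk_density_g_cm3(318 * EARTH_MASS_KG, 11.2 * EARTH_RADIUS_M)  # Jupiter analogue

print(f"Earth-like planet:   {rocky:.1f} g/cm^3")    # ~5.5, consistent with rock and iron
print(f"Jupiter-like planet: {gaseous:.1f} g/cm^3")  # ~1.3, consistent with a gas giant
```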

    Better Precision with a Laser

    The radial velocity method works by measuring how an exoplanet’s gravity changes the light emitted by its host star. As exoplanets circle a star, their gravity tugs at the star, changing the speed with which it moves toward or away from Earth by a small amount. The star speeds up slightly as it approaches Earth, with each light wave taking a fraction of a second less time to arrive than the wave before it.

    To an observer on Earth, the crests of these waves look closer together than they should, so they appear to have a higher frequency and look bluer. As the star recedes, the crests move further apart and the frequencies seem lower and redder.

    The astro-comb calibrates the Italian National Telescope’s HARPS-N spectrograph using an observation of the asteroid Vesta. The top figure is a colorized version of the raw HARPS-N spectrum, showing the astro-comb calibration dotted lines and the sun’s spectrum reflected off Vesta as mostly solid vertical lines. The middle figure shows the raw data converted to a very precise standard one-dimensional plot of spectral intensity vs. wavelength. The very regular astro-comb calibration spectrum is shown below. Credit: David Phillips

    This motion-based frequency change is known as the Doppler shift. Astronomers measure it by capturing the spectrum of a star on the pixels of a digital camera and watching how it changes over time.

    Today’s best spectrographs are only capable of measuring Doppler shifts caused by velocity changes of 1 meter per second or more. Only large gas giants or “super-earths” close to their host stars have enough gravity to cause those changes.

    The new astro-comb Li, Phillips and their colleagues are developing, however, will be able to detect Doppler shifts as small as 10 centimeters per second—small enough to find habitable zone Earth-like planets, even from hundreds of light years away.
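
    To put that 10-centimeter-per-second goal in perspective, the sketch below applies the non-relativistic Doppler relation at an assumed reference wavelength of 550 nanometers (an illustrative choice in the green band the astro-comb targets).

```python
# How small a 10 cm/s reflex velocity is in wavelength terms, using the
# non-relativistic Doppler relation d(lambda)/lambda ~ v/c. The 550 nm
# reference wavelength is an illustrative choice in the green band.

C_M_PER_S = 299_792_458.0
wavelength_nm = 550.0

for v_m_per_s in (1.0, 0.10):   # ~1 m/s: today's best; 10 cm/s: the astro-comb goal
    shift_nm = wavelength_nm * v_m_per_s / C_M_PER_S
    print(f"v = {100 * v_m_per_s:5.1f} cm/s -> wavelength shift {shift_nm:.2e} nm "
          f"(fractional shift {v_m_per_s / C_M_PER_S:.1e})")
```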

    “The astro-comb works by injecting 8,000 lines of laser light into the spectrograph. They hit the same pixels as starlight of the same wavelength. This creates a comb-like set of lines that lets us map the spectrograph down to 1/10,000 of a pixel. So if I have light on this section of the pixel, I can tell you the precise wavelength,” Phillips explained.

    “By calibrating the spectrograph this way, we can take into account very small changes in temperature or humidity that affect the performance of the spectrograph. This way, we can compare data we take tonight with data from the same star five years from now and find those very small Doppler shifts,” he said.
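
    The calibration idea can be sketched as a simple curve fit: comb lines of exactly known wavelength land at measurable pixel positions, and a smooth fit through those pairs becomes the wavelength solution applied to the starlight. The toy model below uses synthetic numbers and a low-order polynomial; it is not the actual HARPS-N pipeline.

```python
# Toy sketch of the calibration idea (not the actual HARPS-N pipeline):
# comb lines with exactly known wavelengths land at measurable pixel positions,
# and a smooth fit through those (pixel, wavelength) pairs becomes the
# wavelength solution applied to the starlight. All numbers are synthetic.
import numpy as np

def true_map(pix):
    """Synthetic 'truth': a slightly nonlinear pixel-to-wavelength mapping (nm)."""
    return 500.0 + 0.01 * pix + 2e-7 * pix**2

comb_pixels = np.linspace(0.0, 4096.0, 200)     # measured comb-line centroids (pixels)
comb_wavelengths = true_map(comb_pixels)        # wavelengths known exactly from the comb

# Fit a low-order polynomial wavelength solution through the comb lines.
coeffs = np.polyfit(comb_pixels, comb_wavelengths, deg=3)
wavelength_solution = np.poly1d(coeffs)

# Any stellar feature's pixel position can now be converted to a wavelength.
star_line_pixel = 1234.567
print(f"Stellar line at pixel {star_line_pixel}: {wavelength_solution(star_line_pixel):.6f} nm")
```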

    Seeing Green

    Li and his co-researchers pioneered the astro-comb several years ago, but it only worked with infrared and blue light. Their new version of the astro-comb lets astronomers measure green light—which is better for finding exoplanets.

    “The stars we look at are brightest in the green visible range, and this is the range spectrographs are built to handle,” Phillips said.

    Building the green astro-comb was a challenge, since the researchers needed to convert red laser light to green frequencies. They did it by making small fibers that convert one color of light to another.

    A slowly rotating planet is not guaranteed to be habitable, as is evident when looking at the inhospitable Venus. Credit: NASA/JPL/Caltech

    “Red light goes in and green light comes out,” Phillips said. “Even though I see it every day and understand the physics, it looks like magic.”

    The researchers plan to test the green astro-comb by pointing it at our sun, analyzing its spectrum to see if they can find Venus and rediscover its characteristic period of revolution, its size, its mass and its composition.

    “We know a lot about Venus, and we can compare our answers to what we already know, so we are more confident about our answers when we point our spectrographs at distant stars,” Li said.

    The Harvard-Smithsonian team is installing this device on the High-Accuracy Radial Velocity Planet Searcher-North (HARPS-N), a new spectrograph designed to search for exoplanets using the Italian National Telescope.

    “We will look at the thousands of potential exoplanets identified by the Kepler satellite telescope by the transit method. Together, our two methods can tell us a lot about those worlds,” Li said.

    And, because he will have already discovered Venus, he will be more certain of the answers.

    See the full article here.

  • richardmitnick 10:02 pm on October 18, 2014 Permalink | Reply
    Tags: Basic Research, Methane Studies

    From astrobio.net: “Scientists discover carbonate rocks are unrecognized methane sink” 

    Astrobiology Magazine

    Oct 18, 2014
    Andrew Thurber, 541-737-4500, athurber@coas.oregonstate.edu

    Since the first undersea methane seep was discovered 30 years ago, scientists have meticulously analyzed and measured how microbes in the seafloor sediments consume the greenhouse gas methane as part of understanding how the Earth works.

    The sediment-based microbes form an important methane “sink,” preventing much of the chemical from reaching the atmosphere and contributing to greenhouse gas accumulation. As a byproduct of this process, the microbes create a type of rock known as authigenic carbonate, which while interesting to scientists was not thought to be involved in the processing of methane.

    Methane bubbles pour out between rocks at the seep site. The white material at lower right is a type of bacterial colony commonly observed at methane seeps. Image courtesy of Deepwater Canyons 2013 Expedition, NOAA-OER/BOEM/USGS

    That is no longer the case. A team of scientists has discovered that these authigenic carbonate rocks also contain vast amounts of active microbes that take up methane. The results of their study, which was funded by the National Science Foundation, were reported today in the journal Nature Communications.

    “No one had really examined these rocks as living habitats before,” noted Andrew Thurber, an Oregon State University marine ecologist and co-author on the paper. “It was just assumed that they were inactive. In previous studies, we had seen remnants of microbes in the rocks – DNA and lipids – but we thought they were relics of past activity. We didn’t know they were active.

    “This goes to show how the global methane process is still rather poorly understood,” Thurber added.

    A vast mussel community found on flat bottom as well as on rocks rising a meter or more off the seafloor. Image courtesy of Deepwater Canyons 2013 Expedition, NOAA-OER/BOEM/USGS

    Lead author Jeffrey Marlow of the California Institute of Technology and his colleagues studied samples from authigenic compounds off the coasts of the Pacific Northwest (Hydrate Ridge), northern California (Eel River Basin) and Central America (the Costa Rica margin). The rocks range in size and distribution from small pebbles to carbonate “pavement” stretching dozens of square miles.

    “Methane-derived carbonates represent a large volume within many seep systems and finding active methane-consuming archaea and bacteria in the interior of these carbonate rocks extends the known habitat for methane-consuming microorganisms beyond the relatively thin layer of sediment that may overlay a carbonate mound,” said Marlow, a geobiology graduate student in the lab of Victoria Orphan of Caltech.

    These assemblages are also found in the Gulf of Mexico as well as off Chile, New Zealand, Africa, Europe – “and pretty much every ocean basin in the world,” noted Thurber, an assistant professor (senior research) in Oregon State’s College of Earth, Ocean, and Atmospheric Sciences.

    The study is important, scientists say, because the rock-based microbes potentially may consume a huge amount of methane. The microbes were less active than those found in the sediment, but they were more abundant – and the areas they inhabit are extensive, making their potential importance enormous. Studies have found that approximately 3-6 percent of the methane in the atmosphere is from marine sources – and this number is so low because microbes in the ocean sediments consume some 60-90 percent of the methane that would otherwise escape.
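
    A rough scaling of those quoted figures, ignoring atmospheric chemistry and any other feedbacks, shows how much more marine methane would escape without the microbial sink:

```python
# Rough scaling of the figures quoted above, ignoring atmospheric chemistry and
# other feedbacks: if sediment microbes consume 60-90% of seep methane, how much
# more marine methane would escape without that sink?

for consumed in (0.60, 0.90):
    boost = 1.0 / (1.0 - consumed)   # factor by which the escaping flux would grow
    print(f"If microbes consume {consumed:.0%} of seep methane, removing the sink "
          f"would let roughly {boost:.1f}x more marine methane reach the atmosphere.")
```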

    Methane gas bubbles rise from the seafloor – this type of activity, originally noticed by the Okeanos Explorer in 2012 on a multibeam sonar survey, is what led scientists to the area. Image courtesy of Deepwater Canyons 2013 Expedition, NOAA-OER/BOEM/USGS

    Now those ratios will have to be re-examined to determine how much of the methane sink can be attributed to microbes in rocks versus those in sediments. The distinction is important, the researchers say, because it is an unrecognized sink for a potentially very important greenhouse gas.

    “We found that these carbonate rocks located in areas of active methane seeps are themselves more active,” Thurber said. “Rocks located in comparatively inactive regions had little microbial activity. However, they can quickly activate when methane becomes available.

    “In some ways, these rocks are like armies waiting in the wings to be called upon when needed to absorb methane.”

    The ocean contains vast amounts of methane, which has long been a concern to scientists. Marine reservoirs of methane are estimated to total more than 455 gigatons, and may be as much as 10,000 gigatons, of carbon in methane. A gigaton is approximately 1.1 billion tons.

    By contrast, all of the planet’s gas and oil deposits are thought to total about 200-300 gigatons of carbon.

    See the full article here.

  • richardmitnick 9:43 pm on October 18, 2014 Permalink | Reply
    Tags: Basic Research

    From SPACE.com: “Comet Siding Spring at Mars: How a Rare Celestial Event Was Discovered” 


    SPACE.com

    October 18, 2014
    Elizabeth Howell

    A comet that was born before the Earth formed is flying in from the edge of the solar system, bound for a dramatic date with Mars on Sunday (Oct. 19).

    Comet Siding Spring — unknown and undiscovered until 2013 — will zoom past the Red Planet Sunday afternoon in an encounter that could help scientists better understand how the solar system came to be.

    Siding Spring will fly 87,000 miles (139,500 kilometers) from Mars at 2:27 p.m. EDT (1827 GMT) Sunday, about one-third of the distance from the Earth to the moon. Researchers will observe the close encounter with the fleet of orbiters and rovers at the Red Planet.

    Siding Spring is the first comet from the Oort Cloud, a collection of icy bodies at the edge of the solar system, to be observed up close by spacecraft. All comets examined in the past came from closer in, around Jupiter’s orbit or the edge of the Kuiper Belt, a huge set of icy objects beyond Neptune.

    Artist’s conception of the Oort Cloud

    Known objects in the Kuiper belt, derived from data from the Minor Planet Center. Objects in the main belt are colored green, whereas scattered objects are colored orange. The four outer planets are blue. Neptune’s few known trojans are yellow, whereas Jupiter’s are pink. The scattered objects between Jupiter’s orbit and the Kuiper belt are known as centaurs. The scale is in astronomical units. The pronounced gap at the bottom is due to difficulties in detection against the background of the plane of the Milky Way.

    “We can’t get to an Oort Cloud comet with our current rockets,” Carey Lisse, a senior astrophysicist at the Johns Hopkins University Applied Physics Laboratory, said during a NASA news conference last week. “These orbits are very long and extended — and at very great velocities … It’s a free flyby, if you will, and that’s a very fantastic event for us to study.”

    A failed planet

    In a rare celestial event, a comet will pass closer to Mars than the moon is from Earth. See how the Comet Siding Spring flyby of Mars works in this Space.com infographic.
    Credit: Karl Tate, Infographics Artist

    Siding Spring was created in the first few million years of Earth’s solar system, Lisse said. It likely formed somewhere between the orbits of Jupiter and Neptune, where many similar objects coalesced into the giant planets. But a gravitational push kicked Siding Spring out into the Oort Cloud; it took another jolt from a passing star a million years ago or so to send it toward the inner solar system.

    Half of the comet is rocky, and the other half is made up of volatile ices, such as water and carbon dioxide. Its flight past Mars is the first time it will venture into the inner solar system, past Jupiter’s orbit. The comet just recently crossed the “water-ice line,” the point where water can exist as a liquid in the solar system.

    Siding Spring, which Lisse said is about the size of an Appalachian mountain, will swing by Mars in a retrograde direction, the opposite way in which the planets orbit around the sun. This means any dust that comes off the comet will be moving at about 119,000 mph (190,000 km/h) relative to Mars.

    “Anything that comes off the comet that hits either Mars or the spacecraft is going to pack a real large amount of kinetic energy — a real wallop — so that’s one of the things that we’ve been worried about,” Lisse said.
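
    To get a feel for that wallop, the sketch below computes the kinetic energy of a single dust grain at the quoted speed; the one-milligram grain mass is an illustrative assumption, not a figure from the article.

```python
# Kinetic energy of a single dust grain at the quoted speed. The 1-milligram
# grain mass is an illustrative assumption, not a figure from the article.

v_m_per_s = 190_000.0 * 1000.0 / 3600.0   # 190,000 km/h -> ~52,800 m/s
grain_mass_kg = 1e-6                      # 1 milligram (assumed)

kinetic_energy_j = 0.5 * grain_mass_kg * v_m_per_s**2
print(f"Speed: {v_m_per_s:,.0f} m/s")
print(f"Kinetic energy of a 1 mg grain: {kinetic_energy_j:,.0f} J")
# ~1,400 joules from a speck of dust -- comparable to a bullet's muzzle energy.
```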

    As a result, NASA has maneuvered its three operational Mars orbiters to be on the “safe” side of the Red Planet when dust exposure is highest.

    NASA investigations

    The comet was first discovered in January 2013 by Robert McNaught at the Siding Spring Observatory in Australia. Ever since then, scientists have been studying the celestial visitor with a variety of space- and ground-based assets, in an attempt to learn more about its history.

    Siding Spring Observatory

    To learn about the comet, scientists will have to get an up-close look at its nucleus, to see its shape, size and composition. If all goes according to plan, NASA’s Mars Reconnaissance Orbiter will take high-resolution pictures of the comet’s heart, making it the first time an Oort Cloud comet’s nucleus will be seen up close.

    NASA’s Mars Reconnaissance Orbiter

    NASA’s Hubble, Swift and Spitzer space telescopes have mapped out the comet’s dust, water molecules and carbon dioxide. For now, it looks like dust is coming off more slowly than researchers had expected. Little activity was seen with the water ice until June, when the comet got close enough to the sun for ice to sublimate.

    NASA/ESA Hubble

    NASA/Swift

    NASA/Spitzer

    Some other planned observations will come from NASA’s Chandra X-ray telescope (which will look for any material thrown into Mars’ atmosphere), and the newly arrived Mars Atmosphere and Volatile EvolutioN (MAVEN) mission, which will see how the Red Planet’s atmosphere reacts as the comet passes by.

    NASA/Chandra

    NASA MAVEN

    NASA’s Curiosity and Opportunity rovers will also participate in the campaign, attempting to take the first images of a comet from the surface of another planet.

    NASA Curiosity

    NASA Opportunity

    See the full article, with video, here.

  • richardmitnick 4:24 pm on October 17, 2014 Permalink | Reply
    Tags: Basic Research

    From Daily Galaxy: “Long-Sought Source of Massive Supernovas Detected” 

    The Daily Galaxy

    October 17, 2014
    No Writer Credit

    For years astronomers have searched for the elusive progenitors of hydrogen-deficient stellar explosions without success. However, this changed in June 2013 with the appearance of supernova iPTF13bvn and the subsequent detection of an object at the same location in archival HST images obtained before the explosion. The interpretation of the observed object is controversial. The team led by [Melina] Bersten presented a self-consistent picture using models of supernova brightness and progenitor evolution. In their picture, the more massive star in a binary system explodes after transferring mass to its companion.

    A group of researchers recently presented a model that provides the first characterization of the progenitor for a hydrogen-deficient supernova. Their model predicts that a bright hot star, which is the binary companion to an exploding object, remains after the explosion. To verify their theory, the group secured observation time with the Hubble Space Telescope (HST) to search for such a remaining star. Their findings, which are reported in the October 2014 issue of The Astronomical Journal, have important implications for the evolution of massive stars.

    NASA/ESA Hubble

    One of the challenges in astrophysics is identifying which star produces which supernova. This is particularly problematic for supernovae without hydrogen, which are called Types Ib or Ic, because the progenitors have yet to be detected directly.

    The ultimate question is: “How do progenitor stars remove their hydrogen-rich envelopes during their evolution?” Two competing mechanisms have been proposed. One hypothesizes that a strong wind produced by a very massive star blows the outer hydrogen layers, while the other suggests that a gravitationally bound binary companion star removes the outer layers. The latter case does not require a very massive star. Because these two scenarios predict vastly different progenitor stars, direct detection of the progenitor for this type of supernova can provide definitive clues about the preferred evolutionary path.

    When young Type Ib supernova iPTF13bvn was discovered in nearby spiral galaxy NGC 5806, astronomers hoped to find its progenitor. Inspecting the available HST images did indeed reveal an object, providing optimism that the first hydrogen-free supernova progenitor would at last be identified. Due to the object’s blue hue, it was initially suggested that the object was a very hot, very massive, evolved star with a compact structure, called a “Wolf-Rayet” star. (Using models of such stars, a group based in Geneva was able to reproduce the brightness and color of the pre-explosion object with a Wolf-Rayet star that was born with over 30 times the mass of the Sun and died with 11 times the solar mass.)

    NGC 5806

    Spiral galaxy NGC 5806. Left top: zoomed image of supernova iPTF13bvn just after the explosion. Left bottom: HST image taken before the explosion, in which the progenitor of iPTF13bvn was identified. (Image Credit: Iair Arcavi, Weizmann Institute of Science, PTF)

    “Based on such suggestions, we decided to check if such a massive star is consistent with the supernova brightness evolution,” says Melina Bersten of Kavli IPMU who led the research. However, the results are inconsistent with a Wolf-Rayet star; the exploding star must have been merely four times the mass of the Sun, which is much smaller than a Wolf-Rayet star. “If the mass was this low and the supernova lacked hydrogen, our immediate conclusion is that the progenitor was part of a binary system,” adds Bersten.

    Because the problem requires a more elaborate solution, the team set out to simulate the evolution of a binary system with mass transfer in order to determine a configuration that can explain all the observational evidence (a blue pre-explosion object with a relatively low mass devoid of hydrogen). “We tested several configurations and came up with a family of possible solutions,” explains Omar Benvenuto of IALP, Argentina. “Interestingly, the mass transfer process dictates the observational properties of the exploding star, so it allows suitable solutions to be derived even if the mass of the stars is varied,” adds Benvenuto. The team chose the case where two stars are born with 20 and 19 times the mass of the Sun. The mass transfer process causes the larger star to retain only four times the solar mass before exploding. Most importantly, the smaller star may trap part of the transferred mass, becoming a very bright and hot star.

    The existence of a hot star would provide strong evidence for the binary model presented by Bersten and collaborators. Fortunately, such a prediction can be directly tested once the supernova fades because the hot companion should become evident. “We have requested and obtained observation time with the HST to search for the companion star in 2015,” comments Gaston Folatelli of Kavli IPMU. “Until then, we must wait patiently to see if we can identify the progenitor of a hydrogen-free supernova for the first time,” Bersten adds.

    See the full article here.

  • richardmitnick 3:49 pm on October 17, 2014 Permalink | Reply
    Tags: Basic Research

    From ANL: “Protons hog the momentum in neutron-rich nuclei” 

    News from Argonne National Laboratory

    October 17, 2014
    Kandice Carter, Jefferson Lab Public Affairs, 757-269-7263, kcarter@jlab.org
    or Jared Sagoff, Argonne National Laboratory communications office, 630-252-5549, media@anl.gov.

    Like dancers swirling on the dance floor with bystanders looking on, protons and neutrons that have briefly paired up in the nucleus have higher-average momentum, leaving less for non-paired nucleons. Using data from nuclear physics experiments carried out at the Department of Energy’s Thomas Jefferson National Accelerator Facility, researchers have now shown for the first time that this phenomenon exists in nuclei heavier than carbon, including aluminum, iron and lead.

    Research has shown that protons and neutrons that have briefly paired up in the nucleus have higher-average momentum, which allows a greater fraction of the protons than neutrons to have high momentum in relatively neutron-rich nuclei, such as carbon, aluminum, iron and lead. This result is contrary to long-accepted theories of large nuclei and has implications for ultra-cold atomic gas systems and neutron stars.

    The phenomenon also surprisingly allows a greater fraction of the protons than neutrons to have high momentum in these relatively neutron-rich nuclei, which is contrary to long-accepted theories of the nucleus and has implications for ultra-cold atomic gas systems and neutron stars. The results were published online by the journal Science, on the Science Express website.

    The research builds on earlier work featured in Science that found that protons and neutrons in light nuclei pair up briefly in the nucleus, a phenomenon called a short-range correlation. Nucleons prefer pairing up with nucleons of a different type (protons prefer neutrons to other protons) by 20 to 1, and nucleons involved in a short-range correlation carry higher momentum than unpaired ones.

    Using data from an experiment conducted in 2004, the researchers were able to identify high-momentum nucleons involved in short-range correlations in heavier nuclei. In that experiment, led by Argonne physicist Kawtar Hafidi, the Jefferson Lab Continuous Electron Beam Accelerator Facility produced a 5.01 GeV beam of electrons to probe the nuclei of carbon, aluminum, iron and lead. The outgoing electrons and high-momentum protons were measured.

    “We found this dominance of proton-neutron pairs in the nuclei we studied. What’s striking is this pair-dominance all the way to lead,” says Doug Higinbotham, a staff scientist at Jefferson Lab and a lead coauthor on the paper.

    Then the researchers compared the momenta of protons versus neutrons in these nuclei. According to the Pauli exclusion principle, certain like particles can’t have the same momentum state. So, if you have a bunch of neutrons together, some will have low momentum, and others will have high momentum; the more neutrons you have, the more high-momentum neutrons you would see, as they fill up higher and higher momentum states.
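
    That naive expectation can be made concrete with a free Fermi-gas estimate evaluated separately for protons and neutrons in lead-208; the formula and the standard nuclear-density parameters below are textbook values used purely for illustration, not part of the reported analysis.

```python
# The naive independent-particle expectation described above, made concrete with
# a free Fermi-gas estimate, p_F = hbar * (3 * pi^2 * n)^(1/3), for lead-208 at a
# standard nuclear density (R = r0 * A^(1/3), r0 = 1.2 fm). This is only an
# illustration of that expectation; the measurement finds the opposite for the
# high-momentum tail.
import math

HBAR_C_MEV_FM = 197.327   # hbar*c in MeV*fm
R0_FM = 1.2               # nuclear radius parameter

def fermi_momentum_mev(n_particles: int, mass_number: int) -> float:
    """Fermi momentum (MeV/c) of one nucleon species in a uniform nucleus."""
    radius_fm = R0_FM * mass_number ** (1.0 / 3.0)
    volume_fm3 = (4.0 / 3.0) * math.pi * radius_fm**3
    density = n_particles / volume_fm3                    # fm^-3
    return HBAR_C_MEV_FM * (3.0 * math.pi**2 * density) ** (1.0 / 3.0)

Z, N = 82, 126   # protons and neutrons in lead-208
A = Z + N
print(f"Naive proton Fermi momentum:  {fermi_momentum_mev(Z, A):.0f} MeV/c")
print(f"Naive neutron Fermi momentum: {fermi_momentum_mev(N, A):.0f} MeV/c")
# The neutrons' Fermi sea naively reaches higher momenta -- the opposite of what
# the short-range-correlation measurement shows for the high-momentum fraction.
```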

    But according to Higinbotham, that expected picture is not what the researchers found when they measured high-momentum protons in neutron-rich nuclei.

    “What this paper is saying is the reverse, that the protons actually have the higher-average momentum. And it’s because they’ve all paired up with neutrons,” Higinbotham says. “It’s like a dance with too many girls (neutrons) and only a few boys (protons). Those boys are dancing their little hearts out, because there aren’t very many of them. So the average proton momentum is going to be higher than the average neutron momentum, because it’s mostly the neutrons that are sitting there, doing nothing, with nothing to pair up with, except themselves.”

    Higinbotham notes that the neutrons may also pair up briefly with other neutrons in short-range correlations and protons with other protons. However, these like-particle brief pairings occur once for roughly every 20 unlike-particle brief pairings.

    Now, the researchers hope to extend these new findings to other, similar systems, such as the quarks in nucleons and atoms in cold gases. According to Or Hen, a graduate student at Tel Aviv University in Israel and the paper’s lead author, he and his colleagues are already reaching out to other researchers.

    “We expect that this will also happen in ultra-cold atomic gas systems. And we’re having meetings with those researchers. If they find the same phenomenon, then we can use the flexibility of their experimental systems to go to extreme cases of very hard-to-study nuclear systems, such as the large imbalances of protons and neutrons that you can find in neutron stars,” Hen said.

    To further that goal, Misak Sargsian, a lead coauthor and professor at Florida International University, said he’s extending this work into his own theoretical calculations of neutron stars.

    “Think of a neutron star like it’s a huge nucleus, where you have ten times more neutrons than protons. The effect should be very, very profound for neutron stars. So this opens up a new direction for research,” Sargsian said.

    According to Lawrence Weinstein, a lead coauthor and eminent scholar and professor at Old Dominion University in Norfolk, Va., the scientists would also like to continue their studies of the pairs.

    “We’d like to measure a lot more aspects of how protons and neutrons pair up in nuclei. So we know not just protons prefer neutrons, but how are the pairs behaving, in detail,” he said.

    This new result was made possible by an initiative funded by a grant from the U.S. Department of Energy and led by Weinstein and Sargsian, as well as Mark Strikman, a distinguished professor at Penn State, and Sebastian Kuhn, a professor and eminent scholar at Old Dominion University. The data-mining initiative consisted of re-analyzing experimental data from completed experiments in an attempt to glean new information that previously had not been considered or was missed. A collaboration of more than 140 researchers from more than 40 institutions and nine countries contributed to the result. Researchers at two U.S. Department of Energy national labs, Jefferson Lab and Argonne National Lab, participated in the research.

    Argonne physicist Kawtar Hafidi led the experiment that first collected the data back in 2003. “That data was so unique that we’ve been able to extract all kinds of information on several different areas of nuclear physics since then,” she said. She chairs the CEBAF Large Acceptance Spectrometer collaboration’s nuclear physics working group, which oversees the review and release of scientific results from the data taken by that experiment.

    “This is excellent work that helps validate our theoretical picture of nuclear structure,” said Robert Wiringa, an Argonne physicist whose theoretical work is cited in the paper.

    The paper was published online by the journal Science, at the Science Express web site, on Thursday, 16 October, 2014. See http://www.sciencexpress.org, and also http://www.aaas.org. Science and Science Express are published by the AAAS, the science society, the world’s largest general scientific organization.

    This work was supported by the U.S. Department of Energy’s Office of Science (Office of Nuclear Physics), the U.S. National Science Foundation, Israel Science Foundation, Chilean Comisión Nacional de Investigación Científica y Technológica, French Centre National de la Recherche Scientifique and Commissariat a l’Energie Atomique, French-American Cultural Exchange, Italian Istituto Nazionale di Fisica Nucleare, National Research Foundation of Korea and the U.K.’s Science and Technology Facilities Council. CEBAF is a DOE Office of Science User Facility.

    See the full article here.

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

  • richardmitnick 3:12 pm on October 17, 2014 Permalink | Reply
    Tags: Basic Research

    From Perimeter: “The Last Gasp of a Black Hole” 

    Perimeter Institute

    October 17, 2014
    No Writer Credit

    New research from Perimeter shows that two of the strangest features of quantum mechanics, entanglement and negative energy, might be two faces of one coin.

    Quantum mechanics is, notoriously, weird. Take entanglement: when two or more particles are entangled, their states are linked together, no matter how far apart they go.

    If the idea makes your classical mind twitch, you’re in good company. At the heart of everything, according to quantum mechanics, nature has a certain amount of irreducible jitter. Even nothing – the vacuum of space – can jitter, or as physicists say, fluctuate. When it does, a particle and its anti-particle can pop into existence.

    For example, an electron and an anti-electron (these are called positrons) might pop into existence out of the vacuum. We know that they each have a spin of one half, which might be either up or down. We also know that these particles were created from nothing and so, to balance the books, the total spin must add up to zero. Finally, we know that the spin of either particle is not determined until it is measured.

    So suppose the electron and the positron fly apart a few metres or a few light years, and then a physicist comes by to measure the spin of, say, the electron. She discovers that the electron is spin up, and in that moment, the electron becomes spin up. Meanwhile, a few metres or a few light years away, the positron becomes spin down. Instantly. That is the strangeness of quantum entanglement.
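
    Written out, the spin-zero pair described above is the standard two-particle singlet state (a textbook expression, included here only as an illustration):

```latex
% The spin-zero (singlet) state of the electron-positron pair described above.
% Measuring the electron's spin along any axis immediately fixes the positron's,
% however far apart the two particles are.
\[
  \lvert \Psi \rangle
  = \frac{1}{\sqrt{2}}
    \left( \lvert \uparrow \rangle_{e^-} \lvert \downarrow \rangle_{e^+}
         - \lvert \downarrow \rangle_{e^-} \lvert \uparrow \rangle_{e^+} \right),
  \qquad S_{\text{total}} = 0 .
\]
```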

    Negative energy is less well known than entanglement, but no less weird. It begins with the idea – perhaps already implied by the positron and electron popping out of nowhere – that empty space is not empty. It is filled with quantum fields, and the energy of those fields can fluctuate a little bit.

    In fact, the energy of these fields can dip under the zero mark, albeit briefly. When that happens, a small region of space can, for a short span of time, weigh less than nothing – or at least less than the vacuum. It’s a little bit like finding dry land below sea level.

    Despite their air of strangeness, entanglement and negative energy are both well-explored topics. But now, new research, published as a Rapid Communication in Physical Review D, is hinting that these two strange phenomena may be linked in a surprising way.

    The work was done by Perimeter postdoctoral fellow Matteo Smerlak and former postdoc Eugenio Bianchi (now on the faculty at Penn State and a Visiting Fellow at Perimeter). “Negative energy and entanglement are two of the most striking features of quantum mechanics,” says Smerlak. “Now, we think they might be two sides of the same coin.”

    Perimeter Postdoctoral Researcher Matteo Smerlak

    Perimeter Visiting Fellow Eugenio Bianchi

    Specifically, the researchers proved mathematically that any external influence that changes the entanglement of a system in its vacuum state must also produce some amount of negative energy. The reverse, they say, is also true: negative energy densities can never be produced without entanglement being directly affected.

    At the moment, the result only applies to certain quantum fields in two dimensions – to light pulses travelling up and down a thin cable, for instance. And it is with light that the Perimeter researchers hope that their new idea can be directly tested.

    “Some quantum states which have negative energy are known, and one of them is called a ‘squeezed state,’ and they can be produced in the lab, by optical devices called squeezers,” says Smerlak. The squeezers manipulate light to produce an observable pattern of negative energy.

    Remember that Smerlak and Bianchi’s basic argument is that if an external influence affects vacuum entanglement, it will also release some negative energy. In a quantum optics setup, the squeezers are the external influence.

    Experimentalists should be able to look for the correlation between the entanglement patterns and the negative energy densities which this new research predicts. If these results hold up – always a big if in brand new work – and if they can make the difficult leap from two dimensions to the real world, then there will be startling implications for black holes.

    Like optical squeezers, black holes also produce changes in entanglement and energy density. They do this by separating entangled pairs of particles and preferentially selecting the ones with negative energy.

    Remember that the vacuum is full of pairs of particles and antiparticles blinking into existence. Under normal circumstances, they blink out again just as quickly, as the particle and the antiparticle annihilate each other. But just at a black hole’s event horizon, it sometimes happens that one of the particles is sucked in, while the other escapes. The small stream of escaping particles is known as Hawking radiation.

    By emitting such radiation, black holes slowly give up their energy and mass, and eventually disappear. Black hole evaporation, as the process is known, is a hot topic in physics. This new research has the potential to change the way we think about it.

    “In the late stages of the evaporation of a black hole, the energy released from the black hole will turn negative,” says Smerlak. And if a black hole releases negative energy, then its total energy goes up, not down. “It means that the black hole will shrink and shrink and shrink – for zillions of years – but in the end, it will release its negative energy in a gasp before dying. Its mass will briefly go up.”

    Call it the last gasp of a black hole.

    See the full article here.

    About Perimeter

    Perimeter Institute is a leading centre for scientific research, training and educational outreach in foundational theoretical physics. Founded in 1999 in Waterloo, Ontario, Canada, its mission is to advance our understanding of the universe at the most fundamental level, stimulating the breakthroughs that could transform our future. Perimeter also trains the next generation of physicists through innovative programs, and shares the excitement and wonder of science with students, teachers and the general public.


     
  • richardmitnick 2:26 pm on October 17, 2014 Permalink | Reply
    Tags: , , Basic Research, ,   

    From ESA: “Herschel’s view of Comet Siding Spring” 

    European Space Agency

    17 October 2014

    These three images show emission from the dust in the coma surrounding the nucleus of Comet C/2013 A1 – also known as Comet Siding Spring – as observed at three different far-infrared wavelengths with ESA’s Herschel space observatory.


    ESA Herschel
    ESA Herschel schematic
    ESA/Herschel

    Discovered on 3 January 2013, Comet Siding Spring is an Oort Cloud comet on its first journey into the inner Solar System. It will reach perihelion – its closest approach to the Sun – on 25 October 2014 at 1.4 AU (about 210,000,000 km). Having spent most of its life far from the Sun, this comet is much more pristine than periodic comets – those that orbit the Sun every two hundred years or less – and for that reason is particularly interesting to study.

    Artist's rendering of the Kuiper Belt and Oort Cloud.

    On 31 March 2013, not long after it was discovered, astronomers observed Comet Siding Spring with Herschel. This was just one month before the observatory exhausted its supply of liquid helium coolant and ceased to collect data. When Herschel observed it, the comet was about 6.5 AU from the Sun. The observations were performed following a proposal for Director’s Discretionary Time from Peter Mattisson from the Stockholm Amateur Astronomers (STAR) in Sweden.

    The three panels show the comet at wavelengths of 70 microns (shown in blue), 100 microns (shown in green) and 160 microns (shown in red). Telescopes observing at these long wavelengths see the direct thermal emission from dust in the comet’s coma.
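
    As a rough, purely illustrative aside (not part of the ESA analysis), one can estimate why dust this far from the Sun shows up in exactly these bands: a grey, fast-rotating grain at 6.5 AU settles near T ≈ 278 K / √(r/AU), roughly 110 K, and dust at that temperature emits the bulk of its thermal radiation in the far infrared. The sketch below evaluates the Planck function at the three Herschel wavelengths under those simplifying assumptions (a single dust temperature, no grain emissivity effects).

        import math

        h   = 6.626e-34   # Planck constant, J s
        c   = 2.998e8     # speed of light, m/s
        k_B = 1.381e-23   # Boltzmann constant, J/K

        def planck_lambda(wavelength_m, T):
            """Planck spectral radiance B_lambda(T), in W m^-3 sr^-1."""
            x = h * c / (wavelength_m * k_B * T)
            return 2 * h * c**2 / wavelength_m**5 / math.expm1(x)

        r_au   = 6.5                           # heliocentric distance at the Herschel epoch
        T_dust = 278.0 / math.sqrt(r_au)       # ~110 K for a grey, fast-rotating grain
        print(f"Approximate dust temperature at {r_au} AU: {T_dust:.0f} K")

        for microns in (70, 100, 160):         # the three Herschel bands shown above
            B = planck_lambda(microns * 1e-6, T_dust)
            print(f"{microns:>3} micron: B_lambda ~ {B:.2e} W m^-3 sr^-1")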

    The coma is resolved at the two shorter wavelengths (in the left and central panels). Close inspection of these two images reveals that the coma’s shape is slightly elongated towards the left – in the direction opposite the Sun. From these images, astronomers estimated that the coma extends some 50,000 km from the comet’s nucleus. The structure of the coma can hardly be resolved at the longest wavelength probed by Herschel (in the right panel).

    These observations were also used to calculate the total mass of dust in the coma, which amounts to about 300,000,000 kg. At the time of the Herschel observations, the comet appeared to be quite active – astronomers estimated that the activity had begun even before the comet's discovery, when it was about 8 AU from the Sun. Later observations with space- and ground-based telescopes showed that the comet's activity increased only slowly over the following months, which is unusual for an Oort Cloud comet. There are even some hints that its activity has declined recently.

    Astronomers have been closely monitoring the activity of Comet Siding Spring because, a few days before perihelion, the comet will make a historic close approach to Mars, passing some 140,000 km from the Red Planet on 19 October 2014. The comet's current moderate activity is good news for the fleet of spacecraft operated at Mars by various space agencies (including ESA's Mars Express), because it means a low risk of dust particles hitting their instruments.

    Since Oort Cloud comets are discovered at extremely short notice before perihelion – a few years at most – it is virtually impossible to plan a space mission to fly by one. This is what makes Comet Siding Spring's close approach to Mars truly unique: the spacecraft at Mars will have the chance to observe an Oort Cloud comet from a distance that could not be achieved otherwise.

    The analysis of the Herschel images was performed by Cs. Kiss (Konkoly Observatory, Budapest, Hungary), T.G. Müller (Max-Planck-Institut für extraterrestrische Physik, Garching, Germany), M. Kidger (ESAC, European Space Agency, Madrid, Spain), P. Mattisson (STAR, Stockholm Amateur Astronomers, Sweden), and G. Marton (Konkoly Observatory, Budapest, Hungary).

    See the full article here.

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. Its space flight program includes human spaceflight (mainly through participation in the International Space Station program), the launch and operation of unmanned exploration missions to other planets and the Moon, Earth observation, science and telecommunication missions, the design of launch vehicles, and the operation of a major spaceport, the Guiana Space Centre at Kourou, French Guiana. ESA science missions are based at ESTEC in Noordwijk, Netherlands; Earth observation missions at ESRIN in Frascati, Italy; ESA Mission Control (ESOC) is in Darmstadt, Germany; the European Astronaut Centre (EAC), which trains astronauts for future missions, is in Cologne, Germany; and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.



     
  • richardmitnick 2:05 pm on October 17, 2014 Permalink | Reply
    Tags: , Basic Research, , , , ,   

    From FNAL: “Frontier Science Result: CMS – Off the beaten path” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Friday, Oct. 17, 2014
    Jim Pivarski

    The main concern for most searches for rare phenomena is to control the backgrounds. Backgrounds are observations that resemble the one of interest, yet aren’t. For instance, fool’s gold is a background for gold prospectors. The main reason that the Higgs boson was hard to find is that most Higgs decays resemble b quark pair production, which is a million times more common. You not only have to find the one-in-a-million event picture, you have to identify some feature of it to prove that it is not an ordinary event.

    This is particularly hard to do in proton collisions because protons break apart in messy ways – the quarks from the protons that missed each other generate a spray of particles that flies off just about everywhere. Look through a billion or a trillion of these splatter events and you can find one that resembles the pattern of new physics you're looking for. Physicists have many techniques for filtering out these backgrounds – requiring missing momentum from an invisible particle, high energy perpendicular to the beam, a resonance at a single energy, or the presence of electrons and muons, to name just a few.

    Most particles produced by proton collisions originate at the point where the beams cross. Those that do not are due to intermediate particles that travel some distance before they decay.

    A less common yet powerful technique for eliminating backgrounds is to look for displaced particle trajectories, meaning trajectories that don’t intersect the collision point. Particles that are directly created by the proton collision or are created by short-lived intermediates always emerge from this point. Those that emerge from some other point in space must be due to a long-lived intermediate.

    A common example of this is the b quark, which can live as long as a trillionth of a second before decaying into visible particles. That might not sound like very long, but the quark is traveling so quickly that it covers several millimeters before it decays, which is a measurable distance.
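
    As a quick back-of-the-envelope check on that "several millimeters" (a sketch, not part of the CMS analysis): the mean lab-frame decay length is L = βγcτ, and for a B hadron cτ is roughly half a millimeter, so even modest boosts give millimeter-scale displacements. The boost factors below are arbitrary illustrative choices.

        import math

        C_TAU_B_M = 4.6e-4      # c*tau for a typical B hadron, roughly 0.46 mm

        def decay_length(c_tau_m, gamma):
            """Mean lab-frame decay length L = beta * gamma * c * tau, in meters."""
            beta = math.sqrt(1.0 - 1.0 / gamma**2)
            return beta * gamma * c_tau_m

        for gamma in (2, 5, 10):   # illustrative boost factors
            L_mm = decay_length(C_TAU_B_M, gamma) * 1e3
            print(f"gamma = {gamma:>2}: mean decay length ~ {L_mm:.1f} mm")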

    In a recent analysis, CMS scientists searched for displaced electrons and muons. Displaced tracks are rare, and electrons and muons are also rare, so displaced electrons and muons should be extremely rare. The only problem with this logic is that b quarks sometimes produce electrons and muons, so one other feature is needed to disambiguate. A b quark almost always produces a jet of particles, so this search for new physics also required that the electrons and muons were not close to jets.
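
    To make the selection concrete, here is a toy sketch of that kind of event filter. This is not the actual CMS code: the event records, variable names and cut values (a minimum transverse impact parameter d0 and a minimum angular separation from any jet) are invented for illustration, but they capture the two requirements described above – the lepton track must miss the collision point, and it must not sit inside a jet.

        import math

        def delta_r(eta1, phi1, eta2, phi2):
            """Angular separation between two directions in eta-phi space."""
            dphi = math.pi - abs(abs(phi1 - phi2) - math.pi)   # wrap phi into [0, pi]
            return math.hypot(eta1 - eta2, dphi)

        def passes_selection(lepton, jets, min_d0_mm=0.2, min_dr_jet=0.5):
            """Toy displaced-lepton selection (cut values are made up).

            Keep a lepton only if its track misses the collision point by more
            than min_d0_mm (it is 'displaced') and it is farther than min_dr_jet
            from every jet (so it is unlikely to come from a b-quark decay).
            """
            if abs(lepton["d0_mm"]) < min_d0_mm:
                return False
            return all(delta_r(lepton["eta"], lepton["phi"], j["eta"], j["phi"]) >= min_dr_jet
                       for j in jets)

        # A made-up event: a prompt muon inside a jet and a displaced, isolated electron.
        jets = [{"eta": 0.9, "phi": 1.2}]
        leptons = [
            {"kind": "muon",     "eta": 1.0,  "phi": 1.1,  "d0_mm": 0.01},
            {"kind": "electron", "eta": -0.4, "phi": -2.0, "d0_mm": 1.5},
        ]
        for lep in leptons:
            print(lep["kind"], "passes" if passes_selection(lep, jets) else "rejected")

    Running this on real reconstructed events would of course involve many more quality requirements, but the logic of the two cuts is the same.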

    CERN CMS New
    CERN CMS

    With these simple selection criteria, the experimenters found only as many events as would be expected from standard physics. The result therefore constrains any theory that predicts displaced electrons and muons. One such theory is “displaced supersymmetry,” which generalizes the usual supersymmetry scenario by allowing the longest-lived supersymmetric particle to decay on the millimeter scale that this analysis tests. Displaced supersymmetry was introduced as a way that supersymmetry might exist yet be missed by most other analyses. Experiments like this one illuminate the dark corners in which supersymmetry might be hiding.

    See the full article here.

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.


     