Tagged: Basic Research

  • richardmitnick 3:48 pm on December 19, 2014 Permalink | Reply
Tags: Basic Research, Much More

    From JPL: “Horsehead of a Different Color” 


    December 19, 2014
    No Writer Credit

    Sometimes a horse of a different color hardly seems to be a horse at all, as, for example, in this newly released image from NASA’s Spitzer Space Telescope. The famous Horsehead nebula makes a ghostly appearance on the far right side of the image, but is almost unrecognizable in this infrared view. In visible-light images, the nebula has a distinctively dark and dusty horse-shaped silhouette, but when viewed in infrared light, dust becomes transparent and the nebula appears as a wispy arc.


    NASA Spitzer Telescope
    NASA Spitzer schematic

    NASA’s Jet Propulsion Laboratory, Pasadena, California, manages the Spitzer Space Telescope mission for NASA’s Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. Spacecraft operations are based at Lockheed Martin Space Systems Company, Littleton, Colorado. Data are archived at the Infrared Science Archive housed at the Infrared Processing and Analysis Center at Caltech. Caltech manages JPL for NASA.

    For more information about Spitzer, visit http://spitzer.caltech.edu and http://www.nasa.gov/spitzer.

    See the full article here.

    Further material

    The Horsehead is only one small feature in the Orion Molecular Cloud Complex, dominated in the center of this view by the brilliant Flame nebula (NGC 2024). The smaller, glowing cavity falling between the Flame nebula and the Horsehead is called NGC 2023. These regions are about 1,200 light-years away.

Photo taken by Rogelio Bernal Andreo in October 2010 of the Orion constellation, showing the surrounding nebulae of the Orion Molecular Cloud complex. Also captured are the red supergiant Betelgeuse (top left) and the famous belt of Orion, composed of the OB stars Alnitak, Alnilam and Mintaka. To the bottom right is the star Rigel. The red crescent shape is Barnard’s Loop. The photograph appeared as the Astronomy Picture of the Day on October 23, 2010.

    Flame Nebula
Stars are often born in clusters, in giant clouds of gas and dust. Astronomers have studied two star clusters using NASA’s Chandra X-ray Observatory and infrared telescopes, and the results show that the simplest ideas for the birth of these clusters cannot work, as described in our latest press release. This composite image shows one of the clusters, NGC 2024, which is found in the center of the so-called Flame Nebula about 1,400 light years from Earth. In this image, X-rays from Chandra are seen as purple, while infrared data from NASA’s Spitzer Space Telescope are colored red, green, and blue.

A study of NGC 2024 and the Orion Nebula Cluster, another region where many stars are forming, suggests that the stars on the outskirts of these clusters are older than those in the central regions. This is different from what the simplest idea of star formation predicts, where stars are born first in the center of a collapsing cloud of gas and dust when the density is large enough.

The research team developed a two-step process to make this discovery. First, they used Chandra data on the brightness of the stars in X-rays to determine their masses. Next, they found out how bright these stars were in infrared light using data from Spitzer, the 2MASS telescope, and the United Kingdom Infrared Telescope.

    2MASS Telescope
    2MASS telescope interior

    UKIRT interior

    By combining this information with theoretical models, the ages of the stars throughout the two clusters could be estimated. According to the new results, the stars at the center of NGC 2024 were about 200,000 years old while those on the outskirts were about 1.5 million years in age. In Orion, the age spread went from 1.2 million years in the middle of the cluster to nearly 2 million years for the stars toward the edges.
Explanations for the new findings can be grouped into three broad categories. The first is that star formation is continuing to occur in the inner regions. This could have happened because the gas in the outer regions of a star-forming cloud is thinner and more diffuse than in the inner regions. Over time, if the density falls below a threshold value where it can no longer collapse to form stars, star formation will cease in the outer regions, whereas stars will continue to form in the inner regions, leading to a concentration of younger stars there.

Another suggestion is that old stars have had more time to drift away from the center of the cluster, or to be kicked outward by interactions with other stars. Finally, the observations could be explained if young stars are formed in massive filaments of gas that fall toward the center of the cluster.

The combination of X-rays from Chandra and infrared data is very powerful for studying populations of young stars in this way. With telescopes that detect visible light, many stars are obscured by dust and gas in these star-forming regions, as shown in this optical image of the region.
    NASA’s Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA’s Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory in Cambridge, Mass., controls Chandra’s science and flight operations.
    Date 8 May 2014

    NASA Chandra Telescope
    NASA Chandra schematic

    The two carved-out cavities of the Flame nebula and NGC 2023 were created by the destructive glare of recently formed massive stars within their confines. They can be seen tracing a spine of glowing dust that runs through the image.

The Flame nebula sits adjacent to the star Alnitak, the easternmost star in Orion’s belt, seen here as the bright blue dot near the top of the nebula.

    In this infrared image from Spitzer, blue represents light emitted at a wavelength of 3.6-microns, and cyan (blue-green) represents 4.5-microns, both of which come mainly from hot stars. Green represents 8-micron light and red represents 24-micron light. Relatively cooler objects, such as the dust of the nebulae, appear green and red. Some regions along the top and bottom of the image extending beyond Spitzer’s observations were filled in using data from NASA’s Wide-field Infrared Survey Explorer, or WISE, which covered similar wavelengths across the whole sky.

    NASA Wise Telescope

    The visible-light image (see inset), from the European Southern Observatory’s Very Large Telescope facility, can be found online at http://www.eso.org/public/images/eso0202a/.

    ESO VLT Interferometer
    ESO VLT Interior

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA JPL Campus

Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

  • richardmitnick 9:07 pm on December 18, 2014 Permalink | Reply
Tags: AAO, Basic Research

    From NOAO: “NOAO: Compact Galaxy Groups Reveal Details of Their Close Encounters” 

    NOAO Banner

    December 18, 2014
    Dr. David James
    Cerro Tololo Inter-American Observatory
    Casilla 603
    La Serena, CHILE
    E-mail: djj@ctio.noao.edu

    Galaxies – spirals laced with nests of recent star formation, quiescent ellipticals composed mainly of old red stars, and numerous faint dwarfs – are the basic visible building blocks of the Universe. Galaxies are rarely found in isolation, but rather in sparse groups – sort of galactic urban sprawl. But there are occasional dense concentrations, often found in the center of giant clusters, but also, intriguingly, as more isolated compact groups (and yes, called Compact Galaxy Groups or CGs). The galaxies in these Compact Groups show dramatic differences in the way they evolve and change with time compared with galaxies in more isolated surroundings. Why is this? Collisions between galaxies in these dense groups are common, leading to rapid star formation, but there seems to be more to the puzzle.

A team led by Dr. Iraklis Konstantopoulos of the Australian Astronomical Observatory (AAO) has now obtained spectacular images of some CGs with the Dark Energy Camera attached to the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory (CTIO). This camera, constructed at the U.S. Department of Energy’s Fermi National Accelerator Laboratory, is able to image large areas of the sky to unprecedented faint limits. The team aims to combine these images with spectroscopic data from the AAO that will reveal the velocities of the galaxies, leading to a much better understanding of their gravitational interactions.

    Dark Energy Camera

    CTIO Victor M Blanco 4m Telescope
    CTIO Victor M Blanco 4m Telescope interior
    Blanco 4 meter telescope

As Dr. David James (CTIO), who planned and obtained the images, said, “The new images are absolutely brilliant, and reveal faint streams of gas and stars called tidal tails, created in the mutual gravitational interaction when two galaxies suffer a close encounter.” The tails, one preceding and one trailing the galaxy, persist long after the encounter, and allow astronomers to calculate how long ago the event took place. The Dark Energy Camera, which can image a field four times the size of the full moon, is able to record these faint tidal tails, and its wide field is likely to uncover surprises.

HCG 07: Galaxies in this cluster are undergoing a burst of star formation, but show no tidal tails. How many dwarf galaxies are hidden here? (This image covers an area about a third the size of the full moon.)

HCG 31: The tidal tails are clues to recent interactions, but there is no evidence of the heated gas between the galaxies that would be expected.

    HCG 48: This group is dominated by a massive elliptical galaxy that has presumably formed by ingesting (astronomers refer to this as accreting) all of its neighbors.

    HCG 59: Two interacting giants have released a giant stellar stream in this Compact Group, which also hosts a bursting irregular galaxy.

HCG 62: The brightest Compact Group in X-rays; astronomers seek to understand how its galaxies, which share a common halo, will evolve.

HCG 79: Known as Seyfert’s Sextet, this group has four galaxies involved in an ongoing interaction. The fifth galaxy is in the background, and the sixth is actually material released in the interaction, making it the best candidate for a tidal dwarf galaxy in the local Universe.

    “The imagery reveals the assembly history of these galaxies living so close to each other via their previous interactions,” Dr Konstantopoulos said. “We look for stretched out tidal debris tails and roughly determine their ages. The time when interactions created the tidal debris and the arrangement of those ‘fossils’ tell us which galaxies interacted, and when.”

    Not all CGs are alike: in some, the gas is contained within the individual galaxies, while in other groups the gas spreads out among the galaxies. These new data will allow astronomers to untangle the physical mechanism that leads to such differences.

    Another new exploration is the census of faint dwarf galaxies. As their name implies, these are minor galaxies in comparison with giant ellipticals and spirals, but they are especially numerous, and the new data will reveal how many are lurking in these Compact Groups.

    The international team consists of astronomers at CTIO (a division of the National Optical Astronomy Observatory), the Australian Astronomical Observatory (the counterpart to the NOAO in Australia), and Monash University in Melbourne.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOAO News

NOAO is the US national research & development center for ground-based nighttime astronomy. In particular, NOAO is enabling the development of the US optical-infrared (O/IR) System, an alliance of public and private observatories committed to excellence in scientific research, education and public outreach.

Our core mission is to provide qualified professional researchers, selected by peer review, with access to forefront scientific capabilities on telescopes operated by NOAO, as well as on other telescopes throughout the O/IR System. Today, these telescopes range in aperture size from 2-m to 10-m. NOAO is participating in the development of telescopes with aperture sizes of 20-m and larger, as well as a unique 8-m telescope that will make a 10-year movie of the Southern sky.

    In support of this mission, NOAO is engaged in programs to develop the next generation of telescopes, instruments, and software tools necessary to enable exploration and investigation through the observable Universe, from planets orbiting other stars to the most distant galaxies in the Universe.

    To communicate the excitement of such world-class scientific research and technology development, NOAO has developed a nationally recognized Education and Public Outreach program. The main goals of the NOAO EPO program are to inspire young people to become explorers in science and research-based technology, and to reach out to groups and individuals who have been historically under-represented in the physics and astronomy science enterprise.

    The National Optical Astronomy Observatory is proud to be a US National Node in the International Year of Astronomy, 2009.

    About Our Observatories:
    Kitt Peak National Observatory (KPNO)

    Kitt Peak

Kitt Peak National Observatory (KPNO) has its headquarters in Tucson and operates the Mayall 4-meter, the 3.5-meter WIYN, the 2.1-meter and Coudé Feed, and the 0.9-meter telescopes on Kitt Peak Mountain, about 55 miles southwest of the city.

    Cerro Tololo Inter-American Observatory (CTIO)

NOAO Cerro Tololo

    The Cerro Tololo Inter-American Observatory (CTIO) is located in northern Chile. CTIO operates the 4-meter, 1.5-meter, 0.9-meter, and Curtis Schmidt telescopes at this site.

    The NOAO System Science Center (NSSC)

    Gemini North
    Gemini North

    Gemini South telescope
    Gemini South

The NOAO System Science Center (NSSC) at NOAO is the gateway for the U.S. astronomical community to the International Gemini Project: twin 8.1-meter telescopes in Hawaii and Chile that provide unprecedented coverage (northern and southern skies) and details of our universe.

    NOAO is managed by the Association of Universities for Research in Astronomy under a Cooperative Agreement with the National Science Foundation.

  • richardmitnick 4:45 pm on December 18, 2014 Permalink | Reply
    Tags: , , Basic Research, ,   

    From Chandra: “Chandra Weighs Most Massive Galaxy Cluster in Distant Universe” 

    NASA Chandra

The most distant massive galaxy cluster, located about 9.6 billion light years from Earth, has been found and studied. Astronomers nicknamed this object the “Gioiello” (Italian for “Jewel”) Cluster.
    Using Chandra data, researchers were able to accurately determine the mass and other properties of this cluster. Results like this help astronomers understand how galaxy clusters have evolved over time.




    Credit X-ray: NASA/CXC/INAF/P.Tozzi, et al; Optical: NAOJ/Subaru and ESO/VLT; Infrared: ESA/Herschel
    Release Date December 18, 2014

    A newly discovered galaxy cluster is the most massive one ever detected with an age of 800 million years or younger. Using data from NASA’s Chandra X-ray Observatory, astronomers have accurately determined the mass and other properties of this cluster, as described in our latest press release. This is an important step in understanding how galaxy clusters, the largest structures in the Universe held together by gravity, have evolved over time.

A composite image shows the distant and massive galaxy cluster that is officially known as XDCP J0044.0-2033. Researchers, however, have nicknamed it “Gioiello”, which is Italian for “jewel”. They chose this name because an image of the cluster contains many sparkling colors from the hot, X-ray emitting gas and various star-forming galaxies within the cluster. Also, the research team met to discuss the Chandra data for the first time at Villa il Gioiello, a 15th century villa near the Observatory of Arcetri, which was the last residence of prominent Italian astronomer Galileo Galilei. In this new image of the Gioiello Cluster, X-rays from Chandra are purple, infrared data from ESA’s Herschel Space Telescope appear as large red halos around some galaxies, and optical data from the Subaru telescope on Mauna Kea in Hawaii are red, green, and blue.

    ESA Herschel
    ESA Herschel schematic

    NAOJ Subaru Telescope
    NAOJ Subaru Telescope interior

    Astronomers first detected the Gioiello Cluster, located about 9.6 billion light years away, using ESA’s XMM-Newton observatory. They were then approved to study the cluster with Chandra in observations that were equivalent to over four days of time. This is the deepest X-ray observation yet made on a cluster beyond a distance of about 8 billion light years.

    ESA XMM Newton
ESA XMM-Newton schematic

    The long observing time allowed the researchers to gather enough X-ray data from Chandra that, when combined with scientific models, provides an accurate weight of the cluster. They determined that the Gioiello Cluster contains a whopping 400 trillion times the mass of the Sun.

    Previously, astronomers had found an enormous galaxy cluster, known as “El Gordo,” at a distance of 7 billion light years away and a few other large, distant clusters. According to the best current model for how the Universe evolved, there is a low chance of finding clusters as massive as the Gioiello Cluster and El Gordo. The new findings suggest that there might be problems with the theory, and are enticing astronomers to look for other distant and massive clusters.

El Gordo consists of two separate galaxy subclusters colliding at several million kilometres per hour.

These results are being published in The Astrophysical Journal and are available online. The first author is Paolo Tozzi, from the National Institute for Astrophysics (INAF) in Florence, Italy. The co-authors are Johana Santos, also from INAF in Florence, Italy; James Jee from the University of California in Davis; Rene Fassbender from INAF in Rome, Italy; Piero Rosati from the University of Ferrara in Ferrara, Italy; Alessandro Nastasi from the University of Paris-Sud in Orsay, France; William Forman from the Harvard-Smithsonian Center for Astrophysics (CfA) in Cambridge, MA; Barbara Sartoris and Stefano Borgani from the University of Trieste in Trieste, Italy; Hans Boehringer from the Max Planck Institute for Astrophysics in Garching, Germany; Bruno Altieri from the European Space Agency in Madrid, Spain; Gabriel Pratt from CEA Saclay in Cedex, France; Mario Nonino from the University of Trieste in Trieste, Italy; and Christine Jones from CfA.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA’s Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA’s Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra’s science and flight operations from Cambridge, Mass.

  • richardmitnick 3:43 pm on December 18, 2014 Permalink | Reply
    Tags: , , Basic Research, ,   

    From IFA at U Hawaii: “UH Astronomer, Keck Observatory Confirm First Kepler K2 Exoplanet Discovery” 

    U Hawaii

    University of Hawaii

    Institute for Astronomy

U Hawaii Institute for Astronomy Mauna Kea
IFA at Mauna Kea

    Despite a malfunction that ended its primary mission in May 2013, NASA’s Kepler spacecraft has discovered a new super-Earth using data collected during its “second life,” known as the K2 mission.

    This artist’s conception portrays the first planet discovered by the Kepler spacecraft during its K2 mission. A transit of the planet was teased out of K2’s noisier data using ingenious computer algorithms developed by a researcher at the Harvard-Smithsonian Center for Astrophysics (CfA). The newfound planet, HIP 116454b, has a diameter of 20,000 miles (two and a half times the size of Earth) and weighs 12 times as much. It orbits its star once every 9.1 days. Artwork courtesy CfA.

    University of Hawaii astronomer Christoph Baranec supplied confirming data with his Robo-AO instrument mounted on the Palomar 1.5-meter telescope, and former UH graduate student Brendan Bowler, now a Joint Center for Planetary Astronomy postdoctoral fellow at Caltech, provided additional confirming observations using the Keck II adaptive optics system on Maunakea.

    Caltech Palomar 1.5m 60in telescope
    1.5 meter telescope at Palomar

    Keck Observatory

The Kepler spacecraft detects planets by looking for those that transit, or cross in front of, their stars as seen from the vantage of Earth. During the transit, the star’s light dims slightly. The smaller the planet, the weaker the dimming, so brightness measurements must be exquisitely precise. To enable that precision, the spacecraft must maintain a steady pointing.

    Kepler’s primary mission came to an end when the second of four reaction wheels used to stabilize the spacecraft failed. Without at least three functioning reaction wheels, Kepler couldn’t be pointed accurately.

    Rather than giving up on the plucky spacecraft, a team of scientists and engineers developed an ingenious strategy to use pressure from sunlight as a virtual reaction wheel to help control the spacecraft. The resulting second mission promises to not only continue Kepler’s search for other worlds, but also introduce new opportunities to observe star clusters, active galaxies, and supernovae.

    “Like a phoenix rising from the ashes, Kepler has been reborn and is continuing to make discoveries. Even better, the planet it found is ripe for follow-up studies,” says lead author Andrew Vanderburg of the Harvard-Smithsonian Center for Astrophysics (CfA).

    Due to Kepler’s reduced pointing capabilities, extracting useful data requires sophisticated computer analysis. Vanderburg and his colleagues developed specialized software to correct for spacecraft movements, achieving about half the photometric precision of the original Kepler mission.

    Kepler’s new life began with a nine-day test in February 2014. When Vanderburg and his colleagues analyzed that data, they found that Kepler had detected a single planetary transit.

The newfound planet, HIP 116454b, has a diameter of 20,000 miles, two and a half times that of Earth, and weighs almost 12 times as much as Earth. This makes HIP 116454b a super-Earth, a class of planets that doesn’t exist in our solar system. The average density suggests that this planet is either a water world (composed of about three-fourths water and one-fourth rock) or a mini-Neptune with an extended, gaseous atmosphere.
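Those two quoted numbers are enough for a quick density check (my own sketch, not from the article): scaling from Earth’s mean density, 12 Earth masses packed into 2.5 Earth radii gives a mean density well below that of rock, consistent with the water-world or mini-Neptune interpretations:

```python
# Back-of-the-envelope check (my numbers, not from the article): mean
# density of HIP 116454b, scaling from Earth's mean density of 5.51 g/cm^3.

EARTH_DENSITY_G_CM3 = 5.51

def mean_density_g_cm3(mass_earths: float, radius_earths: float) -> float:
    """Mean density for a planet given in Earth masses and Earth radii."""
    return EARTH_DENSITY_G_CM3 * mass_earths / radius_earths ** 3

# ~12 Earth masses inside ~2.5 Earth radii:
rho = mean_density_g_cm3(12.0, 2.5)
print(f"{rho:.1f} g/cm^3")  # prints 4.2 g/cm^3 -- too low for pure rock,
# consistent with a water world or a mini-Neptune
```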

    This close-in planet circles its star once every 9.1 days at a distance of 8.4 million miles. Its host star is a type K orange dwarf slightly smaller and cooler than our sun. The system is 180 light-years from Earth in the constellation Pisces.

    During the process of verifying the discovery, Harvard astronomer and co-author John Johnson, a former postdoctoral fellow at the UH Institute for Astronomy, contacted Baranec and the Robo-AO team to obtain high-resolution imaging of HIP 116454 to determine whether it has very nearby stellar companions that could be contaminating the Kepler data, causing a misestimation of the planet’s size and other characteristics.

    “Because of the flexible nature of the Robo-AO system, it was possible to add the target to the Robo-AO intelligent queue, and several observations were carried out within days of the request,” says Baranec.

    While Robo-AO didn’t find any stellar companions, some additional follow-up measurements hinted that there might be a companion that is too close for Robo-AO to see. To be absolutely sure there were no contaminating companions, Bowler was asked to observe HIP 116454 with the Keck II adaptive optics system. He confirmed that HIP 116454 has no close-in stellar companions.

    Since the host star is relatively bright and nearby, follow-up studies will be easier to conduct than for many Kepler planets orbiting fainter, more distant stars. “HIP 116454b will be a top target for telescopes on the ground and in space,” says Johnson.

    The research paper reporting this discovery has been accepted for publication in The Astrophysical Journal.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    System Overview

    The University of Hawai‘i System includes 10 campuses and dozens of educational, training and research centers across the Hawaiian Islands. As the public system of higher education in Hawai‘i, UH offers opportunities as unique and diverse as our Island home.

    The 10 UH campuses and educational centers on six Hawaiian Islands provide unique opportunities for both learning and recreation.

    UH is the State’s leading engine for economic growth and diversification, stimulating the local economy with jobs, research and skilled workers.

  • richardmitnick 1:27 pm on December 18, 2014 Permalink | Reply
    Tags: , Basic Research, , , ,   

From FNAL: “Frontier Science Result: DZero: Measuring the strange sea with silicon” 

    FNAL Home

Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Thursday, Dec. 18, 2014
    Leo Bellantoni

    Our last DZero result began like so:

    FNAL DZero

    “The parts inside of a proton are called, in a not terribly imaginative terminology, partons. The partons that we tend to think of first and foremost are quarks — two up quarks and a down quark in each proton — but there are other kinds of partons as well.”

    This time, we start in the same place — with those unimaginatively named partons. There are three types.

The first type comprises those alluded to above: quarks. The two up quarks and one down quark that make up protons are called valence quarks. They determine the electrical charge of the proton. There are six flavors of quark, and all the different combinations of three out of the six correspond to a particle of a specific type, called a baryon. (Well, almost: top-flavored quarks decay so quickly they never form a particle.)
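The counting claim in that parenthetical can be illustrated directly (my own sketch; this counts flavor combinations only, ignoring the spin and other quantum numbers that distinguish real baryons):

```python
# Counting exercise (my addition; flavor combinations only, ignoring spin
# and other quantum numbers that distinguish real baryons).
from itertools import combinations_with_replacement

FLAVORS = ["up", "down", "strange", "charm", "bottom", "top"]

all_combos = list(combinations_with_replacement(FLAVORS, 3))
no_top = [c for c in all_combos if "top" not in c]

print(len(all_combos))  # 56 three-quark flavor combinations in principle
print(len(no_top))      # 35 once the short-lived top quark is excluded
```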

    The second type of parton is the gluon. Gluons hold the quarks inside the proton together and are the mediators of the strong nuclear force. Just as electromagnetic energy comes in point-like units called photons, so energy of the strong nuclear force comes in units of the gluon.

The third type of parton is the sea quark. A gluon can split into a quark-antiquark pair that exists for a fleetingly short time (10⁻²⁴ seconds or less) before recombining into a gluon.
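That lifetime follows from the energy-time uncertainty relation, Δt ≈ ħ/ΔE. A quick order-of-magnitude check (my own estimate; the ~1 GeV energy scale is an assumption, not from the article):

```python
# Order-of-magnitude estimate (my own, not from the article): by the
# energy-time uncertainty relation, a fluctuation that borrows energy E
# can persist for roughly t ~ hbar / E.

HBAR_MEV_S = 6.582e-22  # reduced Planck constant in MeV*s

def virtual_lifetime_s(energy_mev: float) -> float:
    """Approximate lifetime of a virtual fluctuation of energy energy_mev."""
    return HBAR_MEV_S / energy_mev

# At the ~1 GeV (1000 MeV) energy scale typical inside a proton:
print(f"{virtual_lifetime_s(1000.0):.1e} s")  # prints 6.6e-25 s
```

That is indeed "10⁻²⁴ seconds or less", matching the figure quoted above.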

    Sea quarks can be of any flavor. They very often are up or down quarks, just like the valence quarks. But they can also be strange quarks, and strange quarks do not exist as valence quarks in protons. A reaction with a strange quark in the initial state lets you measure these strange sea quarks in proton collisions.

The reaction involves the collision of a strange sea quark from one proton (or antiproton) with a gluon from an antiproton (or proton) to produce a W boson and a charm quark. The charm quark, when produced with a large momentum transverse to the direction of the initial collision, will produce a narrow spray of particles all moving in roughly the same direction. Such a particle spray is called a jet. Because the charm quark will travel a few millimeters before decaying, the fact that a charm quark produced the jet can be inferred using the silicon-based microstrip tracking detector at the very center of the DZero detector.
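The "few millimeters" figure is easy to sanity-check (my own numbers, not from the article): the lab-frame flight distance is γβ·cτ, and charm hadrons have proper decay lengths of a few hundred microns (e.g. cτ ≈ 312 μm for the D⁺ meson):

```python
# Sanity check (my numbers, not from the article): a charm hadron's mean
# flight distance in the lab frame is gamma*beta times its proper decay
# length c*tau.

C_TAU_D_PLUS_UM = 312.0  # proper decay length of the D+ meson, in microns

def flight_distance_mm(gamma_beta: float, c_tau_um: float = C_TAU_D_PLUS_UM) -> float:
    """Mean lab-frame flight distance in millimeters."""
    return gamma_beta * c_tau_um / 1000.0

# A D+ produced with a modest boost of gamma*beta ~ 10 travels a few
# millimeters -- a displacement a silicon microstrip tracker can resolve.
print(f"{flight_distance_mm(10.0):.1f} mm")  # prints 3.1 mm
```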

    Top quark and anti top quark pair decaying into jets, visible as collimated collections of particle tracks, and other fermions in the CDF detector at Tevatron.


    Tevatron map

    Silicon technology also helps identify jets produced from bottom flavored quarks. In fact, bottom quark jets are easier to find than charm quark jets. Measuring the production of bottom quark jets in events with a W boson provides important information about the nonvalence partons — specifically, gluons — of the proton.

    DZero has recently measured the production of both charm and bottom jets when a W boson is also produced. The new measurement uses more data than earlier analyses, and for the first time, we obtain information about the production (with a W) of charm and bottom jets that are produced with different momenta transverse to the collision axis. How the production varies with the transverse momentum is a valuable measurement tool to understand the various subprocesses at work. This is also the first measurement of charm-W production that relies upon the silicon microstrip tracking technology; previous measurements were based on less effective techniques.

    —Leo Bellantoni

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 10:28 am on December 18, 2014 Permalink | Reply
    Tags: Basic Research, Scientific Method,   

    From Ethan Siegel: “Does the Scientific Method need Revision?” 

    Starts with a bang
    Starts with a Bang

    Dec 18, 2014
    Ethan Siegel

    Does the prevalence of untestable theories in cosmology and quantum gravity require us to change what we mean by a scientific theory?

Theoretical physics has problems. That’s nothing new — if it weren’t so, then we’d have nothing left to do. But especially in high-energy physics and quantum gravity, progress has basically stalled since the development of the Standard Model in the mid-1970s. Yes, we’ve discovered a new particle every now and then. Yes, we’ve collected loads of data. But the fundamental constituents of our theories, quantum field theory and Riemannian geometry, haven’t changed since that time.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

Everybody has their own favorite explanation for why this is so and what can be done about it. One major factor is certainly that the low-hanging fruit has been picked, and progress slows as we have to climb farther up the tree. Today, we have to invest billions of dollars in experiments that test new ranges of parameter space, build colliders, shoot telescopes into orbit, have superclusters flip their flops. The days in which history was made by watching your bathtub spill over are gone.

    Image credit: © NEWSru.com, via http://www.newsru.com/world/07mar2006/otkrr.html.

    Another factor is arguably that the questions are getting technically harder while our brains haven’t changed all that much. Yes, now we have computers to help us, but these are, at least for now, chewing and digesting the food we feed them, not cooking their own.

    Taken together, this means that return on investment must slow down as we learn more about nature. Not so surprising.

    Still, it is a frustrating situation, and it makes you wonder whether there are other reasons for the lack of progress, reasons that we can do something about. Especially in a time when we really need a game changer: some breakthrough technology, clean energy, that warp drive, a transporter! Anything to get us off the road to Facebook, sorry, I meant self-destruction.

    Images credit: Pawel Kuczynski, via http://www.pawelkuczynski.com/Strona-g-owna/Home/index.php.

    It is our lack of understanding of space, time, matter, and their quantum behavior that prevents us from making better use of what nature has given us. And it is this frustration that has led people inside and outside the community to argue that we’re doing something wrong, that the social dynamics in the field are troubled, that we’ve lost our path, that we are not making progress because we keep working on unscientific theories.

    Is that so?

    It’s not like we haven’t tried to make headway on finding the quantum nature of space and time. The arxiv categories hep-th and gr-qc are full every day with supposedly new ideas. But so far, not a single one of the existing approaches towards quantum gravity has any evidence speaking for it.

    Image credit: Brianna T. Wedge of deviantART, via http://briannatwedge.deviantart.com/.

    To me the reason this has happened is obvious: We haven’t paid enough attention to experimentally testing quantum gravity. One cannot develop a scientific theory without experimental input. It’s never happened before and it will never happen. Without data, a theory isn’t science. Without experimental test, quantum gravity isn’t physics.

    Image credit: CERN / IOP publishing, via http://cerncourier.com/cws/article/cern/28263/1/cernphysw1_7-00.

    If you think that more attention is now being paid to quantum gravity phenomenology, you are mistaken. Yes, I’ve heard it too, the lip service paid by people who want to keep on dwelling on their fantasies. But the reality is that there is no funding for quantum gravity phenomenology, and there are no jobs either. On the rare occasions that I have seen quantum gravity phenomenology mentioned in a job posting, the position was filled by somebody working on the theory, I am tempted to say working on mathematics rather than physics.

    It is beyond me that funding agencies invest money into developing a theory of quantum gravity, but not into its experimental test. Yes, experimental tests of quantum gravity are far-fetched. But if you think that you can’t test it, you shouldn’t put money into the theory either. And yes, that’s a community problem, because funding agencies rely on experts’ opinions. And so the circle closes.

    A theory is only scientific if it is useful to describe nature. Image source: http://abstrusegoose.com/275.

    To make matters worse, philosopher Richard Dawid has recently argued that it is possible to assess the promise of a theory without any experimental test whatsoever, and that physicists should thus revise the scientific method by taking into account what he calls “non-empirical facts”. By this he seems to mean what we often loosely refer to as internal consistency: theoretical physics is math heavy and thus has a very stringent logic. This allows one to deduce a lot of, often surprising, consequences from very few assumptions. Clearly, these must be taken into account when assessing the usefulness or range-of-validity of a theory, and they are being taken into account. But the consequences are irrelevant to the use of the theory unless some aspects of them are observable, because what makes up the use of a scientific theory is its power to describe nature.

    Dawid may be confused on this matter because physicists do, in practice, use empirical facts that we do not explicitly collect data on. For example, we discard theories that have an unstable vacuum, singularities, or complex-valued observables. Not because this is an internal inconsistency — it is not. You can deal with this mathematically just fine. We discard these because we have never observed any of that. We discard them because we don’t think they’ll describe what we see. This is not a non-empirical assessment.

    A huge problem with the lack of empirical fact is that theories remain axiomatically underconstrained. In practice, physicists don’t always start with a set of axioms, but in principle this could be done. If you do not have any axioms you have no theory, so you need to select some. The whole point of physics is to select axioms to construct a theory that describes observation. This already tells you that the idea of a theory for everything will inevitably lead to what has now been called the “multiverse”. It is just a consequence of stripping away axioms until the theory becomes ambiguous.

    Image credit: Moonrunner Design, via http://news.nationalgeographic.com/news/2014/03/140318-multiverse-inflation-big-bang-science-space/.

    Somewhere along the line many physicists have come to believe that it must be possible to formulate a theory without observational input, based on pure logic and some sense of aesthetics. They must believe their brains have a mystical connection to the universe and pure power of thought will tell them the laws of nature. But the only logical requirement to choose axioms for a theory is that the axioms not be in conflict with each other. You can thus never arrive at a theory that describes our universe without taking into account observations, period. The attempt to reduce axioms too much just leads to a whole “multiverse” of predictions, most of which don’t describe anything we will ever see.

    (The only other option is to just use all of mathematics, as [Max] Tegmark argues. You might like or not like that; at least it’s logically coherent. But that’s a different story and shall be told another time.)

    Now if you have a theory that contains more than one universe, you can still try to find out how likely it is that we find ourselves in a universe just like ours. The multiverse-defenders therefore also argue for a modification of the scientific method, one that takes into account probabilistic predictions. But we have nothing to gain from that. Calculating a probability in the multiverse is just another way of adding an axiom, in this case for the probability distribution. Nothing wrong with this, but you don’t have to change the scientific method to accommodate it.

    Image credit: screenshot from Nature, via http://www.nature.com/news/scientific-method-defend-the-integrity-of-physics-1.16535.

    In a Nature comment out today, George Ellis and Joe Silk argue that the trend of physicists to pursue untestable theories is worrisome. I agree with this, though I would have said the worrisome part is that physicists do not care enough about the testability — and apparently don’t need to care because they are getting published and paid regardless.

    See, in practice the origin of the problem is senior researchers not teaching their students that physics is all about describing nature. Instead, the students are taught by example that you can publish and live from outright bizarre speculations as long as you wrap them into enough math. I cringe every time a string theorist starts talking about beauty and elegance. Whatever made them think that the human sense for beauty has any relevance for the fundamental laws of nature?

    Schematic illustration for the circle of continually testing and improving scientific hypotheses. Source: Backreaction.

    The scientific method is often quoted as a circle of formulating and testing of hypotheses, but I find this misleading. There isn’t any one scientific method. The only thing that matters is that you honestly assess the use of a theory to describe nature. If it’s useful, keep it. If not, try something else. This method doesn’t have to be changed, it has to be more consistently applied. You can’t assess the use of a scientific theory without comparing it to observation.

    A theory might have other uses than describing nature. It might be pretty, artistic even. It might be thought-provoking. Yes, it might be beautiful and elegant. It might be too good to be true, it might be forever promising. If that’s what you are looking for that’s all fine by me. I am not arguing that these theories should not be pursued. Call them mathematics, art, or philosophy, but if they don’t describe nature don’t call them science.

    See the full article here.


    Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible.

  • richardmitnick 9:54 am on December 18, 2014 Permalink | Reply
    Tags: Basic Research, , ,   

    From Ethan Siegel: “Quantum Immortality” 

    Starts with a Bang

    This article was written by Paul Halpern, the author of Einstein’s Dice and Schrödinger’s Cat: How Two Great Minds Battled Quantum Randomness to Create a Unified Theory of Physics.

    Observers are the necessary, but unliked, bouncers in the elegant nightclub of quantum physics. While no one is entirely comfortable with having doormen check IDs, they persist; otherwise everyone and everything gets in, contrary to ordinary experience.

    Image credit: AIP Emilio Segre Visual Archives, Physics Today Collection of [Paul] Dirac and [Werner] Heisenberg;

    © Los Alamos National Laboratory of [John] von Neumann.

    In the late 1920s and early 1930s, Heisenberg, Dirac, and John von Neumann codified the formalism of quantum mechanics as a two-step process. One part involves the continuous evolution of states via the deterministic Schrödinger equation.

    Image credit: Wikimedia Commons user YassineMrabet.

    Map out a system’s potential energy distribution — in the form of a well, for example — and the spectrum of possible quantum states is set. If the states are time-dependent, they evolve predictably. That could describe, for instance, a superposition of states that spreads out in position space over time, like an expanding puddle of water.
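    The spreading described above can be sketched numerically. A free Gaussian wave packet has a closed-form width, σ(t) = σ₀√(1 + (ħt/2mσ₀²)²); the snippet below is an illustrative sketch, with an electron-scale mass and an angstrom-scale initial width chosen purely for demonstration:

    ```python
    import numpy as np

    HBAR = 1.0545718e-34  # reduced Planck constant, J*s

    def packet_width(sigma0, mass, t):
        """Width of a free Gaussian wave packet after time t:
        sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0^2))^2)
        """
        return sigma0 * np.sqrt(1.0 + (HBAR * t / (2.0 * mass * sigma0**2)) ** 2)

    M_E = 9.109e-31   # electron mass, kg (illustrative choice)
    SIGMA0 = 1e-10    # initial width ~1 angstrom (illustrative choice)

    for t in (0.0, 1e-16, 1e-15):
        print(f"t = {t:.0e} s -> width = {packet_width(SIGMA0, M_E, t):.3e} m")
    ```

    At t = 0 the width is just σ₀; within a femtosecond this electron-scale packet has already spread several-fold — the “expanding puddle” in numbers.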

    Yet experiments show that if an apparatus is designed to measure a particular quantity, such as the position, momentum or spin-state of a particle, quantum measurements yield specific values of that respective physical parameter. Such specificity requires a second type of quantum operation that is instantaneous and discrete, rather than gradual and continuous: the process of collapse.

    Image credit: A Friedman, via http://blogs.scientificamerican.com/the-curious-wavefunction/2014/01/15/what-scientific-idea-is-ready-for-retirement/.

    Collapse occurs when a measurement of a certain physical parameter — position, let’s say — precipitates a sudden transformation into one of the “eigenstates” (solution states) of the operator (mathematical function) corresponding to that parameter — the position operator, in that case.

    Image credit: Nick Trefethen, via http://www.chebfun.org/examples/ode-eig/Eigenstates.html.

    Then the measured value of that quantity is the “eigenvalue” associated with that eigenstate — the specific position of the particle, for instance. Eigenstates represent the spectrum of possible states and eigenvalues the measurements associated with those states.
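    This eigenstate/eigenvalue bookkeeping is easy to make concrete. The toy sketch below (not from the article; the superposition and random seed are arbitrary choices) diagonalizes the Pauli-Z observable, applies the Born rule to a superposition, and “collapses” to the sampled eigenstate:

    ```python
    import numpy as np

    # Observable: Pauli-Z. Its eigenstates are the possible post-measurement states.
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])
    eigenvalues, eigenstates = np.linalg.eigh(Z)  # columns are eigenvectors

    # An arbitrary superposition state: sqrt(0.8)|0> + sqrt(0.2)|1>
    psi = np.array([np.sqrt(0.8), np.sqrt(0.2)])

    # Born rule: outcome probabilities are |<eigenstate|psi>|^2
    probs = np.abs(eigenstates.T.conj() @ psi) ** 2

    # "Collapse": sample one outcome and replace psi by the matching eigenstate
    rng = np.random.default_rng(0)
    outcome = rng.choice(len(eigenvalues), p=probs)
    measured = eigenvalues[outcome]      # the recorded eigenvalue (+1 or -1)
    psi_after = eigenstates[:, outcome]  # the post-measurement state
    ```

    Repeating the sampling many times reproduces the 80/20 statistics, while each individual run yields one definite eigenvalue — exactly the specificity the text describes.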

    We can imagine the situation of quantum collapse as being something like a slot machine with a mixture of dollar coins and quarters; some old enough to be valuable, others shining new.

    Image credit: © 2014 Marco Jewelers, via http://marcojewelers.net/sell-buy-silver-gold-coins.

    Its front panel has two buttons: one red and the other blue. Press the red button and the coins are instantly sorted by denomination: a number of dollar coins (a mixture of old and new) drop out. Press the blue button and the sorting is instantly done by date: a bunch of old coins (of both denominations) are released. While someone seeking quick bucks might press red, a coin collector might push blue. The machine is set up so that you are not permitted to press both buttons. Similarly, in quantum physics, according to Heisenberg’s famous uncertainty principle, certain pairs of quantities, such as position and momentum, cannot both be measured at once with arbitrary precision.
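    The two-button restriction mirrors the mathematics of non-commuting observables: operators that do not commute share no common eigenbasis, so sharp values cannot be assigned to both at once. A minimal sketch, using the Pauli matrices as stand-ins for the two “buttons”:

    ```python
    import numpy as np

    # Two observables that do not commute, standing in for the red and blue
    # buttons: the Pauli matrices sigma_x and sigma_z.
    sigma_x = np.array([[0, 1], [1, 0]])
    sigma_z = np.array([[1, 0], [0, -1]])

    # A nonzero commutator means no shared eigenbasis: a state with a sharp
    # sigma_z value is necessarily a superposition of sigma_x eigenstates.
    commutator = sigma_x @ sigma_z - sigma_z @ sigma_x
    print(commutator)   # [[ 0 -2]
                        #  [ 2  0]]
    ```

    Position and momentum obey the same algebra in infinite dimensions, with [x, p] = iħ; the Pauli pair is just the smallest example that fits in four matrix entries.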

    Over the years, a number of critics have attacked this interpretation.

    Albert Einstein
    Image credit: Oren Jack Turner, Princeton, N.J., via Wikimedia Commons user Jaakobou.

    Suggesting that quantum physics, though experimentally correct, must be incomplete, Einstein argued that random, instantaneous transitions had no place in a fundamental description of nature. Schrödinger cleverly developed his well-known feline thought experiment to demonstrate the absurdity of the observer’s role in quantum collapse. In his hypothetical scheme, he imagined a set-up in which a cat in a closed box, whose survival (or not) was tied to the random decay of a radioactive material, was in a mixed state of life and death until the box was opened and the system observed.

    Image credit: retrieved from Øystein Elgarøy at http://fritanke.no/index.php?page=vis_nyhet&NyhetID=8513.

    More recently, physicist Bryce DeWitt, who theorized how quantum mechanics might apply to gravity and the dynamics of the universe itself, argued that because there are presumably no observers outside the cosmos to view it (and trigger collapse into quantum gravity eigenstates), a complete accounting of quantum physics could not rely on observers.

    Instead, DeWitt, until his death in 2004, was an ardent advocate of an alternative to the Copenhagen (standard) interpretation of quantum mechanics that he dubbed the Many Worlds Interpretation (MWI).

    Image credit: University of Texas of Bryce DeWitt;

    Professor Jeffrey A. Barrett and UC Irvine, of Hugh Everett III.

    He based his views on the seminal work of Hugh Everett, who, as a graduate student at Princeton, developed a way of avoiding the need in quantum mechanics for an observer. Instead, each time a quantum measurement is taken, the universe, including any observers, seamlessly and simultaneously splits into branches spanning the spectrum of possible values for that measurement. For example, in the case of the measurement of the spin of an electron, in one branch it has spin up, and all observers see it that way; in the other it has spin down. Schrödinger’s cat would be happily alive in one reality, to the joy of its owner, while cruelly deceased in the other, much to the horror of the same owner (but in a different branch). Each observer in each branch would have no conscious awareness of his near-doppelgangers.

    As Everett wrote to DeWitt in explaining his theory:

    “The theory is in full accord with our experience (at least insofar as ordinary quantum mechanics is)… because it is possible to show that no observer would ever be aware of any ‘branching.’”

    If Schrödinger’s thought experiment were repeated each day, there would always be one branch of the universe in which the cat survives. Hypothetically, rather than the proverbial “nine lives,” the cat could have an indefinite number of “lives” or at least chances at life. There would always be one copy of the experimenter who is gratified, but perplexed, that his cat has beaten the odds and lived to see another day. The other copy, in mourning, would lament that the cat’s luck had finally run out.
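    The branch counting here is simple to make explicit. Assuming, purely for illustration, an independent 50/50 survival chance per run, the fraction of branches containing a living cat halves each day yet never reaches zero:

    ```python
    def surviving_fraction(n_days, p_survive=0.5):
        """Fraction of branches in which the cat is alive after n_days runs.
        The independent 50/50 survival probability per run is an assumption
        made for illustration, not a claim about any real experiment."""
        return p_survive ** n_days

    for n in (1, 10, 30):
        print(n, surviving_fraction(n))
    ```

    After a month the surviving branches are roughly one in a billion of the total, but on Everett’s account there is always at least one, which is all the “quantum immortality” argument needs.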

    Image credit: Ethan Zuckerman, from Garrett Lisi’s talk (2008), via http://www.ethanzuckerman.com/blog/2008/02/28/ted2008-garrett-lisi-looks-for-balance/.

    What about human survival? We are each a collection of particles, governed on the deepest level by quantum rules. If each time a quantum transition took place, our bodies and consciousness split, there would be copies that experienced each possible result, including those that might determine our life or death. Suppose in one case a particular set of quantum transitions resulted in faulty cell division and ultimately a fatal form of cancer. For each of the transitions, there would always be an alternative that did not lead to cancer. Therefore, there would always be branches with survivors. Add in the assumption that our conscious awareness would flow only to the living copies, and we could survive any number of potentially hazardous events related to quantum transitions.

    Everett reportedly believed in this kind of “quantum immortality.” Fourteen years after his death in 1982, his daughter Liz took her own life, explaining in her suicide note that in some branch of the universe, she hoped to reunite with her father.

    There are major issues with the prospects for quantum immortality, however. For one thing, the MWI is still a minority hypothesis. Even if it is true, how do we know that our stream of conscious thought would flow only to branches in which we survive? Are all possible modes of death escapable by an alternative array of quantum transitions? Remember that quantum events must obey conservation laws, so there could be situations in which there was no way out that follows natural rules. For example, if you fall out of a spaceship hatch into frigid space, there might be no permissible quantum events (according to energy conservation) that could lead you to stay warm enough to survive.

    Finally, suppose you do somehow manage to achieve quantum immortality — with your conscious existence following each auspicious branch. You would eventually outlive all your friends and family members — because in your web of branches you would eventually encounter copies of them that didn’t survive. Quantum immortality would be lonely indeed!

    See the full article here.


    Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible.

  • richardmitnick 8:51 pm on December 17, 2014 Permalink | Reply
    Tags: Basic Research, ,   

    From IceCube: “Designing the future of the IceCube Neutrino Observatory” 

    IceCube South Pole Neutrino Observatory

    17 Dec 2014
    Sílvia Bravo

    The IceCube Neutrino Observatory is a successful and large scientific facility located near the Amundsen-Scott South Pole station in Antarctica. This observatory hosts IceCube, a cubic-kilometer deep-ice particle detector that is, so far, the largest ever built – and on the surface, IceTop, an extended air shower array.

    Completed in 2010, IceCube has recently discovered astrophysical neutrinos, revealing their potential to explore our universe at energies at the PeV scale and above, where most of the universe is opaque to high-energy photons. But the big questions remain unsolved: where do these neutrinos come from? How does nature accelerate particles to such extreme energies?

    Prof. Olga Botner, IceCube spokesperson and a physics professor at the University of Uppsala, and Prof. Francis Halzen, IceCube principal investigator and a professor at the University of Wisconsin–Madison, tell us about the plans for an upgrade to the IceCube Neutrino Observatory. As an extension of the current detector, it can be built in a few years and within an affordable budget, thanks to expertise acquired with IceCube.

    Artistic view of the Antarctic surface around the South Pole station, showing the position of the 86 strings of sensors in IceCube and the possible grid of the next-generation detector. Image: J.Yang/IceCube Collaboration

    Q: What has IceCube accomplished so far?

    Olga Botner (O): IceCube is the world’s foremost neutrino observatory, which, after just two years of running in its final configuration, discovered neutrinos from outer space that have energies a billion times larger than those of neutrinos produced by our Sun and a thousand times larger than any produced on Earth with man-made accelerators. The discovery of this high-energy neutrino flux is a turning point for neutrino astronomy: a dream of 50 years ago on the verge of becoming reality.

    Francis Halzen (F): The high level of the observed neutrino flux implies that a significant fraction of the energy in the non-thermal universe, powered by the gravitational energy of compact objects from neutron stars to supermassive black holes, is generated in hadronic accelerators. This tells us that we are approaching exciting times when high-energy neutrinos will reveal new sources or provide new insight on the energy generation in known sources.

    But IceCube has also been a successful detector with respect to its technical development. We developed highly successful designs for transforming natural ice into a particle detector. The optimized methods for deploying and commissioning large volume detectors in ice can be used for a next-generation detector; minimal modifications will target improvements focused on modernization, efficiency, and cost savings.

    O: This is a very important point. The detector was built within the expected time frame, within budget, and with a performance at least a factor of two better than anticipated.

    Going back to physics, I should also add that IceCube has yielded many interesting results beyond neutrino astronomy. We are studying cosmic rays, looking for signatures of the annihilations of dark matter particles into neutrinos, and investigating the properties of the neutrinos themselves. We have published competitive results in all these areas.

    Q: Why do we need a next-generation IceCube detector?

    F: We all agree on the observed spectrum of neutrinos; there’s no doubt about the discovery. But independent analyses of IceCube data have produced only on the order of 100 astrophysical neutrino events in several years. These modest numbers of cosmic neutrinos limit the ability of IceCube to be an efficient tool for neutrino astronomy over the next decade. A next-generation detector will provide an unprecedented view of the high-energy universe, taking neutrino astronomy to new levels of discovery. It is likely to resolve the question of the origin of the cosmic neutrinos recently discovered.

    O: That’s right! IceCube’s discovery of extraterrestrial neutrinos has shown us that even a cubic-kilometer detector is not enough. To fully exploit the potential for neutrino astronomy, a much larger observatory is needed. We are already working on its design. The new detector has been named IceCube-Gen2.

    Q: Is it feasible and cost-effective to build an even bigger detector at the Pole?

    O: It sure is. The good news is that the successful deployment and running of IceCube demonstrates that we have mastered the technologies to construct and operate a detector in the deep ice. The drilling systems and the optical modules for the next-generation detector will closely follow the designs that have been proven to work well—with certain modifications to improve the overall performance. This makes us confident that a next-generation detector is not only feasible but can be built in a cost-effective manner, just like IceCube.

    F: We didn’t know this before IceCube, but now we have measured the extremely long photon absorption lengths in ice. This will allow the spacing between strings of light sensors to exceed 250 m in a future IceCube extension; i.e., the instrumented volume can rapidly grow without increasing the costs much. In fact, we can build a ten-cubic-kilometer IceCube-Gen2 telescope by roughly doubling the instrumentation already deployed. Thus, a tenfold increase in astrophysical neutrino detection rates could be achieved with a cost comparable to the current IceCube detector.
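    The scaling argument above can be checked with back-of-the-envelope numbers. Treating the strings as a square grid of fixed depth (the string counts and spacings below are rough figures chosen for illustration, not the actual Gen2 design), doubling both the spacing and the string count multiplies the instrumented volume by eight, close to the tenfold target:

    ```python
    def instrumented_volume_km3(n_strings, spacing_m, depth_km=1.0):
        """Rough instrumented volume for sensor strings on a square grid.

        Footprint scales as n_strings * spacing^2; depth is held fixed.
        All numbers used here are illustrative, not a real detector layout.
        """
        footprint_km2 = n_strings * (spacing_m / 1000.0) ** 2
        return footprint_km2 * depth_km

    icecube = instrumented_volume_km3(86, 125)    # ~1.34 km^3
    gen2 = instrumented_volume_km3(2 * 86, 250)   # ~10.75 km^3
    print(gen2 / icecube)                         # spacing x2, strings x2 -> volume x8
    ```

    Pushing the spacing slightly beyond 250 m, as the interview suggests the long absorption length allows, closes the remaining gap to a full 10 km³.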

    Q: And what about the time scale of this project? Will we need to wait a long time to see new results?

    O: We are aiming at an expanded array instrumenting a volume of 10 km3 for the detection of high-energy neutrinos—but also at improving the low-energy performance through deployment of a densely instrumented infill detector, PINGU, targeting neutrino mass hierarchy as its prime goal. We believe that this new IceCube-Gen2 observatory can be built within seven years of obtaining funding.

    Q: Sounds like a plan. Who is leading this next-generation IceCube?

    F: The present plan is to build IceCube-Gen2 following a management strategy that was successful in delivering IceCube on time and on budget. The collaboration is rapidly expanding, both in the US and in Europe and Canada. We expect that a larger fraction of the cost will be carried by significant contributions from our foreign collaborators.

    O: Exactly. The high-energy array and PINGU are both envisioned as parts of an IceCube-Gen2 observatory. A new collaboration, including IceCube members and additional institutions, is now being formed. This IceCube-Gen2 collaboration will work to develop proposals in the US and abroad to secure funding. We hope that IceCube-Gen2 will become a flagship scientific project for NSF as well as for funding agencies abroad.

    This image shows a simulated high-energy event of about 60 PeV in the proposed IceCube Gen2 detector. Image: IceCube Collaboration

    Q: Can other current or in-design experiments do better than IceCube-Gen2?

    F: Well, we have strong competitors. Early efforts for cubic-kilometer neutrino detectors focused on deep-water-based detectors, including DUMAND, Lake Baikal, and ANTARES. So far, there is no cubic-kilometer neutrino detector in deep water, but these experiments have paved the way toward the proposed construction of KM3NeT in the Mediterranean Sea and GVD in Lake Baikal.

    O: These new projects, GVD in Lake Baikal and KM3NeT in the Mediterranean, are presently in the prototyping or early construction phase. They will eventually provide a complementary view of the sky to that of an Antarctic observatory.

    Q: Should we expect IceCube-Gen2 to be as successful as IceCube? That may be the desire, but are there objective reasons to think so?

    O: The main one is that we have already established the existence of a flux of high-energy neutrinos. What we now need is a substantial number of events to further characterize this flux in terms of energy spectrum, a possible energy cutoff, flavor composition, and provenance. We just need a larger detector to do this in a reasonable time. The higher event rates in a larger array will also improve the chances of correlating our neutrino events with observations by the new generation of high-energy gamma-ray telescopes and gravitational wave detectors, together charting the non-thermal universe.

    F: The larger samples of high-energy neutrinos with improved angular resolution and energy measurement will give us a detailed understanding of the source distribution. This sample will reveal an unobstructed view of the universe at energies at PeV and above. Those are unexplored wavelengths where most of the universe is opaque to high-energy photons. As Olga was mentioning, the operation of IceCube-Gen2 in coincidence with other telescopes and detectors will present totally novel opportunities for multimessenger astronomy and multiwavelength follow-up campaigns to obtain a truly complete picture of astrophysical sources.

    + Info IceCube-Gen2: A Vision for the Future of Neutrino Astronomy in Antarctica, IceCube Collaboration: M.G. Aartsen et al. arxiv.org/abs/1412.5106

    This white paper presents early studies toward a next-generation IceCube detector with the aim of instrumenting a 10 km3 volume of clear glacial ice at the South Pole and delivering an order of magnitude increase in astrophysical neutrino samples of all flavors.

    Read also a short description of IceCube-Gen2 on the IceCube website.

    See the full article here.


    ICECUBE neutrino detector
    IceCube is a particle detector at the South Pole that records the interactions of a nearly massless sub-atomic particle called the neutrino. IceCube searches for neutrinos from the most violent astrophysical sources: events like exploding stars, gamma ray bursts, and cataclysmic phenomena involving black holes and neutron stars. The IceCube telescope is a powerful tool to search for dark matter, and could reveal the new physical processes associated with the enigmatic origin of the highest energy particles in nature. In addition, exploring the background of neutrinos produced in the atmosphere, IceCube studies the neutrinos themselves; their energies far exceed those produced by accelerator beams. IceCube is the world’s largest neutrino detector, encompassing a cubic kilometer of ice.

  • richardmitnick 6:28 pm on December 17, 2014 Permalink | Reply
    Tags: , , Basic Research, ,   

    From NASA Goddard: “MESSENGER Data Suggest Recurring Meteor Shower on Mercury “ 

    NASA Goddard Banner

    December 12, 2014
    Nancy Neal-Jones
    NASA’s Goddard Space Flight Center, Greenbelt, Maryland

    Elizabeth Zubritsky
    NASA’s Goddard Space Flight Center, Greenbelt, Maryland

    The closest planet to the sun appears to get hit by a periodic meteor shower, possibly associated with a comet that produces multiple events annually on Earth.

    The clues pointing to Mercury’s shower were discovered in the very thin halo of gases that make up the planet’s exosphere, which is under study by NASA’s MESSENGER (MErcury Surface, Space ENvironment, GEochemistry, and Ranging) spacecraft.

    NASA Messenger satellite

    “The possible discovery of a meteor shower at Mercury is really exciting and especially important because the plasma and dust environment around Mercury is relatively unexplored,” said Rosemary Killen, a planetary scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, and lead author of the study, available online in Icarus.

    Mercury appears to undergo a recurring meteor shower, perhaps when its orbit crosses the debris trail left by comet Encke. (Artist’s concept.)
    Image Credit: NASA’s Goddard Space Flight Center

    A meteor shower occurs when a planet passes through a swath of debris shed by a comet, or sometimes an asteroid. The smallest bits of dust, rock and ice feel the force of solar radiation, which pushes them away from the sun, creating the comet’s sometimes-dazzling tail. The larger chunks get deposited like a trail of breadcrumbs along the comet’s orbit – a field of tiny meteoroids in the making.

    Earth experiences multiple meteor showers each year, including northern summer’s Perseids, the calling card of comet Swift–Tuttle, and December’s reliable Geminids, one of the few events associated with an asteroid. Comet Encke has left several debris fields in the inner solar system, giving rise to the Southern and Northern Taurids, meteor showers that peak in October and November, and the Beta Taurids in June and July.

    The suggested hallmark of a meteor shower on Mercury is a regular surge of calcium in the exosphere. Measurements taken by MESSENGER’s Mercury Atmospheric and Surface Composition Spectrometer have revealed seasonal surges of calcium that occurred regularly over the first nine Mercury years since MESSENGER began orbiting the planet in March 2011.
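
    The "nine Mercury years" timeline is easy to sanity-check: one Mercury year (one orbit of the Sun) lasts about 88 Earth days, and MESSENGER entered orbit on March 18, 2011. The end date in this sketch is an illustrative assumption, not a date given in the article.

```python
from datetime import date

MERCURY_YEAR_DAYS = 87.97  # Mercury's sidereal orbital period, in Earth days

start = date(2011, 3, 18)  # MESSENGER's Mercury orbit insertion
end = date(2013, 5, 20)    # hypothetical data-cutoff date for illustration

mercury_years = (end - start).days / MERCURY_YEAR_DAYS
print(f"{mercury_years:.1f} Mercury years of observations")
```

    So nine Mercury years of coverage correspond to only a couple of Earth years of orbital data, which is why the seasonal calcium pattern could repeat so many times so quickly.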

    The suspected cause of these spiking calcium levels is a shower of small dust particles hitting the planet and knocking calcium-bearing molecules free from the surface. This process, called impact vaporization, continually renews the gases in Mercury’s exosphere as interplanetary dust and meteoroids rain down on the planet. However, the general background of interplanetary dust in the inner solar system cannot, by itself, account for the periodic spikes in calcium. This suggests a periodic source of additional dust, for example, a cometary debris field. Examination of the handful of comets in orbits that would permit their debris to cross Mercury’s orbit indicated that the likely source of Mercury’s recurring shower is Encke.


    “If our scenario is correct, Mercury is a giant dust collector,” said Joseph Hahn, a planetary dynamist in the Austin, Texas, office of the Space Science Institute and coauthor of the study. “The planet is under steady siege from interplanetary dust and then regularly passes through this other dust storm, which we think is from comet Encke.”

    The researchers created detailed computer simulations to test the comet Encke hypothesis. However, the calcium spikes found in the MESSENGER data were offset a bit from the expected results. This shift is probably due to changes in the comet’s orbit over time, due to the gravitational pull of Jupiter and other planets.

    “The variation of Mercury’s calcium exosphere with the planet’s position in its orbit has been known for several years from MESSENGER observations, but the proposal that the source of this variation is a meteor shower associated with a specific comet is novel,” added MESSENGER Principal Investigator Sean Solomon, of the Lamont-Doherty Earth Observatory at Columbia University in New York. “This study should provide a basis for searches for further evidence of the influence of meteor showers on the interaction of Mercury with its solar-system environment.”

    The Johns Hopkins University Applied Physics Laboratory built and operates the MESSENGER spacecraft and manages this Discovery-class mission for NASA.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA’s Goddard Space Flight Center is home to the nation’s largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

    Named for American rocketry pioneer Dr. Robert H. Goddard, the center was established in 1959 as NASA’s first space flight complex. Goddard and its several facilities are critical in carrying out NASA’s missions of space exploration and scientific discovery.


  • richardmitnick 5:35 pm on December 17, 2014 Permalink | Reply
    Tags: Basic Research   

    From ALMA: “‘Perfect Storm’ Suffocating Star Formation around a Supermassive Black Hole” 

    ESO ALMA Array

    Wednesday, 17 December 2014
    Valeria Foncea
    Education and Public Outreach Officer
    Joint ALMA Observatory
    Santiago, Chile
    Tel: +56 2 467 6258
    Cell: +56 9 75871963
    Email: vfoncea@alma.cl

    Charles E. Blue
    Public Information Officer
    National Radio Astronomy Observatory
    Charlottesville, Virginia, USA
    Tel: +1 434 296 0314
    Cell: +1 434.242.9559
    E-mail: cblue@nrao.edu

    Masaaki Hiramatsu
    Education and Public Outreach Officer, NAOJ Chile
    Observatory Tokyo, Japan
    Tel: +81 422 34 3630
    E-mail: hiramatsu.masaaki@nao.ac.jp

    Richard Hook
    Public Information Officer, ESO
    Garching bei München, Germany
    Tel: +49 89 3200 6655
    Cell: +49 151 1537 3591
    Email: rhook@eso.org

    High-energy jets powered by supermassive black holes can blast away a galaxy’s star-forming fuel — resulting in so-called “red and dead” galaxies: those brimming with ancient red stars yet little or no hydrogen gas available to create new ones.

    Now astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) have discovered that black holes don’t have to be nearly so powerful to shut down star formation. By observing the dust and gas at the center of NGC 1266, a nearby lenticular galaxy with a relatively modest central black hole, the astronomers have detected a “perfect storm” of turbulence that is squelching star formation in a region that would otherwise be an ideal star factory.

    NGC 1266

    This turbulence is stirred up by jets from the galaxy’s central black hole slamming into an incredibly dense envelope of gas. This dense region, which may be the result of a recent merger with another smaller galaxy, blocks nearly 98 percent of material propelled by the jets from escaping the galactic center.

    “Like an unstoppable force meeting an immovable object, the molecules in these jets meet so much resistance when they hit the surrounding dense gas that they are almost completely stopped in their tracks,” said Katherine Alatalo, an astronomer with the California Institute of Technology in Pasadena and lead author on a paper published in the Astrophysical Journal. This energetic collision produces powerful turbulence in the surrounding gas, disrupting the first critical stage of star formation. “So what we see is the most intense suppression of star formation ever observed,” noted Alatalo.

    Previous observations of NGC 1266 revealed a broad outflow of gas from the galactic center traveling up to 400 kilometers per second. Alatalo and her colleagues estimate that this outflow is as forceful as the simultaneous supernova explosion of 10,000 stars. The jets, though powerful enough to stir the gas, are not powerful enough to give it the velocity it needs to escape from the system.
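
    The claim that 400 km/s falls short of escape can be illustrated with the usual escape-velocity formula. The enclosed mass and radius below are illustrative assumptions (the article gives neither); they are chosen only to show how a galaxy's inner regions can hold onto gas moving at that speed.

```python
import math

# v_esc = sqrt(2 G M / r) -- standard escape velocity from an enclosed mass.
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
PC = 3.086e16     # one parsec, m

m_enclosed = 2e10 * M_SUN  # assumed mass within the inner kiloparsec
r = 1000 * PC              # 1 kpc, assumed radius

v_esc = math.sqrt(2 * G * m_enclosed / r) / 1000.0  # km/s
print(f"escape velocity ~{v_esc:.0f} km/s vs. outflow at up to 400 km/s")
```

    With these assumed numbers the escape velocity already edges past 400 km/s, and including the rest of the galaxy and its dark-matter halo only raises the bar, so the stirred-up gas stays trapped.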

    “Another way of looking at it is that the jets are injecting turbulence into the gas, preventing it from settling down, collapsing, and forming stars,” said National Radio Astronomy Observatory astronomer and co-author Mark Lacy.

    The region observed by ALMA contains about 400 million times the mass of our Sun in star-forming gas, which is 100 times more than is found in giant star-forming molecular clouds in our own Milky Way. Normally, gas this concentrated should be producing stars at a rate at least 50 times faster than the astronomers observed in this galaxy.
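
    The shortfall can be framed with a simple depletion-time estimate: expected star-formation rate is roughly gas mass divided by a depletion time. The ~1 Gyr depletion time here is an assumed typical value for normal star-forming gas, not a figure from the article.

```python
# Expected star-formation rate from a gas reservoir, as a rough sketch.
gas_mass = 4e8  # solar masses of star-forming gas (from the ALMA result)
t_dep = 1e9     # years; assumed typical molecular-gas depletion time

expected_sfr = gas_mass / t_dep       # solar masses per year
suppressed_sfr = expected_sfr / 50.0  # observed rate is at least 50x lower

print(f"expected ~{expected_sfr:.1f} Msun/yr, "
      f"observed <~{suppressed_sfr:.3f} Msun/yr")
```

    Under these assumptions the gas "should" be making a few tenths of a solar mass of stars per year, while the observed rate is closer to a hundredth of that.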

    Previously, astronomers believed that only extremely powerful quasars and radio galaxies contained black holes that were powerful enough to serve as a star-forming “on/off” switch.

    A combined Hubble Space Telescope / ALMA image of NGC 1266. The ALMA data (orange) are shown in the central region. Credit: NASA/ESA Hubble; ALMA (NRAO/ESO/NAOJ)

    NASA Hubble Telescope
    NASA Hubble schematic
    NASA/ESA Hubble

    “The usual assumption in the past has been that the jets needed to be powerful enough to eject the gas from the galaxy completely in order to be effective at stopping star formation,” said Lacy.

    To make this discovery, the astronomers first pinpointed the location of the far-infrared light being emitted by the galaxy. Normally, this light is associated with star formation and enables astronomers to detect regions where new stars are forming. In the case of NGC 1266, however, this light was coming from an extremely confined region of the galaxy. “This very small area was almost too small for the infrared light to be coming from star formation,” noted Alatalo.

    With ALMA’s exquisite sensitivity and resolution, along with observations from CARMA (the Combined Array for Research in Millimeter-wave Astronomy), the astronomers were able to trace the location of the very dense molecular gas at the galactic center. They found that the gas is surrounding this compact source of the far-infrared light.

    CARMA Array

    Under normal conditions, gas this dense would be forming stars at a very high rate. The dust embedded within this gas would then be heated by young stars and seen as a bright and extended source of infrared light. The small size and faintness of the infrared source in this galaxy suggests that NGC 1266 is instead choking on its own fuel, seemingly in defiance of the rules of star formation.

    The astronomers also speculate that there is a feedback mechanism at work in this region. Eventually, the black hole will calm down and the turbulence will subside so star formation can begin anew. With this renewed star formation, however, comes greater motion in the dense gas, which then falls in on the black hole and reestablishes the jets, shutting down star formation once again.

    NGC 1266 is located approximately 100 million light-years away in the constellation Eridanus. Lenticular galaxies are spiral galaxies, like our own Milky Way, but they have little interstellar gas available to form new stars.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Atacama Large Millimeter/submillimeter Array (ALMA), an international astronomy facility, is a partnership of Europe, North America and East Asia in cooperation with the Republic of Chile. ALMA is funded in Europe by the European Organization for Astronomical Research in the Southern Hemisphere (ESO), in North America by the U.S. National Science Foundation (NSF) in cooperation with the National Research Council of Canada (NRC) and the National Science Council of Taiwan (NSC) and in East Asia by the National Institutes of Natural Sciences (NINS) of Japan in cooperation with the Academia Sinica (AS) in Taiwan.

    ALMA construction and operations are led on behalf of Europe by ESO, on behalf of North America by the National Radio Astronomy Observatory (NRAO), which is managed by Associated Universities, Inc. (AUI) and on behalf of East Asia by the National Astronomical Observatory of Japan (NAOJ). The Joint ALMA Observatory (JAO) provides the unified leadership and management of the construction, commissioning and operation of ALMA.

    NRAO Small

    ESO 50

