Tagged: Cosmology

  • richardmitnick 11:22 am on July 26, 2015 Permalink | Reply
    Tags: Cosmology, Time Travel

    From RT: “Time-traveling photons connect general relativity to quantum mechanics” 

    23 Jun, 2014
    No Writer Credit

    Space-time structure exhibiting closed paths in space (horizontal) and time (vertical). A quantum particle travels through a wormhole back in time and returns to the same location in space and time. (Photo credit: Martin Ringbauer)

    Scientists have simulated time travel by using particles of light acting as quantum particles sent away and then brought back to their original space-time location. This is a huge step toward marrying two of the most irreconcilable theories in physics.

    Since traveling all the way to a black hole to see if an object you’re holding would bend, break or put itself back together in inexplicable ways is a bit of a trek, scientists have decided to find a point of convergence between general relativity and quantum mechanics in lab conditions, and they achieved success.

    Australian researchers from the University of Queensland’s (UQ) School of Mathematics and Physics set out to bridge the discrepancies between two of our most commonly accepted physics theories – no easy task. On the one hand, you have Einstein’s theory of general relativity, which predicts the behavior of massive objects like planets and galaxies; on the other, you have quantum mechanics, which describes our world at the molecular level and whose laws completely clash with Einstein’s. And this is where things get interesting: at that scale, we still have no concrete idea of all the principles of movement and interaction that underpin the theory.

    Natural laws of space and time simply break down there.

    The light particles used in the study are known as photons, and in this University of Queensland study, they stood in for actual quantum particles for the purpose of finding out how they behaved while moving through space and time.

    The team simulated the behavior of a single photon that travels back in time through a wormhole and meets its older self – an identical photon. “We used single photons to do this but the time-travel was simulated by using a second photon to play the part of the past incarnation of the time traveling photon,” said UQ Physics Professor Tim Ralph, as quoted by The Speaker.

    The findings were published in the journal Nature Communications and gained support from the country’s key institutions on quantum physics.

    Some of the biggest examples of why the two approaches can’t be reconciled concern the so-called space-time loop: a closed path along which a traveler could return to the starting point in both space and time. Kurt Gödel showed in 1949 that Einstein’s equations permit such loops, and they raise what is commonly known as the ‘grandparents paradox’: if you were to travel back in time and prevent your grandparents from meeting, you would prevent your own birth – yet under the classical laws of physics, you must have been born to make the trip at all.

    But Tim Ralph pointed out that, as predicted in 1991, such situations could be avoided by harnessing quantum mechanics’ flexible laws: “The properties of quantum particles are ‘fuzzy’ or uncertain to start with, so this gives them enough wiggle room to avoid inconsistent time travel situations,” he said.

    There are still regimes in which the meeting points between general relativity and quantum mechanics have never been tested – extreme conditions under which relativity’s laws are pushed to their limits, as near the event horizon of a black hole.

    But since it’s not really easy to approach one, the UQ scientists were content with testing out these points of convergence on photons.

    “Our study provides insights into where and how nature might behave differently from what our theories predict,” Professor Ralph said.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:19 pm on July 20, 2015 Permalink | Reply
    Tags: Cosmology

    From NOVA: “Black Holes Could Turn You Into a Hologram, and You Wouldn’t Even Notice” 

    01 Jul 2015
    Tim De Chant

    Black holes may not have event horizons, but fuzzy surfaces.

    Few things are as mysterious as black holes. Except, of course, what would happen to you if you fell into one.

    Physicists have been debating what might happen to anyone unfortunate enough to slip toward the singularity, and so far, they’ve come up with approximately 2.5 ways you might die, from being stretched like spaghetti to burnt to a crisp.

    The fiery hypothesis is a product of the black hole “firewall” debate, which builds on Stephen Hawking’s finding that black holes eventually evaporate, destroying everything inside. But this violates a fundamental principle of physics—that information cannot be destroyed—so other physicists, including Samir Mathur, have been searching for ways to address the problem.

    Here’s Marika Taylor, writing for The Conversation:

    The general relativity description of black holes suggests that once you go past the event horizon, the surface of a black hole, you can go deeper and deeper. As you do, space and time become warped until they reach a point called the “singularity” at which point the laws of physics cease to exist. (Although in reality, you would die pretty early on in this journey as you are pulled apart by intense tidal forces.)

    In Mathur’s universe, however, there is nothing beyond the fuzzy event horizon.

    Mathur’s take on black holes suggests that they aren’t surrounded by a point-of-no-return event horizon or a firewall that would incinerate you, but a fuzzball with small variations that maintain a record of the information that fell into it. What does touch the fuzzball is converted into a hologram. It’s not a perfect copy, but a doppelgänger of sorts.

    Perhaps more bizarrely, you wouldn’t even be aware of the transformation. Say you were to be sucked toward a black hole. At the point where you’d normally hit the event horizon, Mathur says, you’d instead touch the fuzzy surface. But you wouldn’t notice anything: the fuzzy surface would appear like any other part of space immediately around you. Everything would seem the same as it was, except that you’d be a hologram.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 4:24 pm on July 19, 2015 Permalink | Reply
    Tags: Cosmology

    From WIRED: “Chemists Invent New Letters for Nature’s Genetic Alphabet” 

    07.19.15
    Emily Singer

    Olena Shmahalo/Quanta Magazine

    DNA stores our genetic code in an elegant double helix.

    The structure of the DNA double helix. The atoms in the structure are colour-coded by element and the detailed structure of two base pairs are shown in the bottom right.

    But some argue that this elegance is overrated. “DNA as a molecule has many things wrong with it,” said Steven Benner, an organic chemist at the Foundation for Applied Molecular Evolution in Florida.

    Nearly 30 years ago, Benner sketched out better versions of both DNA and its chemical cousin RNA, adding new letters and other chemical modifications that would expand their repertoire of chemical feats.

    A hairpin loop from a pre-mRNA. Highlighted are the nucleobases (green) and the ribose-phosphate backbone (blue). Note that this is a single strand of RNA that folds back upon itself.

    He wondered why these improvements haven’t occurred in living creatures. Nature has written the entire language of life using just four chemical letters: G, C, A and T. Did our genetic code settle on these four nucleotides for a reason? Or was this system one of many possibilities, selected by simple chance? Perhaps expanding the code could make it better.

    Benner’s early attempts at synthesizing new chemical letters failed. But with each false start, his team learned more about what makes a good nucleotide and gained a better understanding of the precise molecular details that make DNA and RNA work. The researchers’ efforts progressed slowly, as they had to design new tools to manipulate the extended alphabet they were building. “We have had to re-create, for our artificially designed DNA, all of the molecular biology that evolution took 4 billion years to create for natural DNA,” Benner said.

    Now, after decades of work, Benner’s team has synthesized artificially enhanced DNA that functions much like ordinary DNA, if not better. In two papers published in the Journal of the American Chemical Society last month, the researchers have shown that two synthetic nucleotides called P and Z fit seamlessly into DNA’s helical structure, maintaining the natural shape of DNA. Moreover, DNA sequences incorporating these letters can evolve just like traditional DNA, a first for an expanded genetic alphabet.

    The new nucleotides even outperform their natural counterparts. When challenged to evolve a segment that selectively binds to cancer cells, DNA sequences using P and Z did better than those without.

    “When you compare the four-nucleotide and six-nucleotide alphabet, the six-nucleotide version seems to have won out,” said Andrew Ellington, a biochemist at the University of Texas, Austin, who was not involved in the study.

    Benner has lofty goals for his synthetic molecules. He wants to create an alternative genetic system in which proteins—intricately folded molecules that perform essential biological functions—are unnecessary. Perhaps, Benner proposes, instead of our standard three-component system of DNA, RNA and proteins, life on other planets evolved with just two.

    Better Blueprints for Life

    The primary job of DNA is to store information. Its sequence of letters contains the blueprints for building proteins. Our current four-letter alphabet encodes 20 amino acids, which are strung together to create millions of different proteins. But a six-letter alphabet could encode as many as 216 possible amino acids and many, many more possible proteins.

    Expanding the genetic alphabet dramatically expands the number of possible amino acids and proteins that cells can build, at least in theory. The existing four-letter alphabet produces 20 amino acids (small circle) while a six-letter alphabet could produce 216 possible amino acids. Olena Shmahalo/Quanta Magazine
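
    As a quick sanity check on those numbers, here is a minimal sketch in Python (assuming, as the article implies, that letters are read in standard three-letter codons):

        # Three-letter codons drawn from an n-letter nucleotide alphabet
        # give n**3 possible codons.
        def codon_count(alphabet_size, codon_length=3):
            return alphabet_size ** codon_length

        print(codon_count(4))  # 64 codons, enough to encode the 20 natural amino acids
        print(codon_count(6))  # 216 codons, i.e. up to 216 distinct amino acids in principle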

    Why nature stuck with four letters is one of biology’s fundamental questions. Computers, after all, use a binary system with just two “letters”—0s and 1s. Yet two letters probably aren’t enough to create the array of biological molecules that make up life. “If you have a two-letter code, you limit the number of combinations you get,” said Ramanarayanan Krishnamurthy, a chemist at the Scripps Research Institute in La Jolla, Calif.

    On the other hand, additional letters could make the system more error prone. DNA bases come in pairs—G pairs with C and A pairs with T. It’s this pairing that endows DNA with the ability to pass along genetic information. With a larger alphabet, each letter has a greater chance of pairing with the wrong partner, and new copies of DNA might harbor more mistakes. “If you go past four, it becomes too unwieldy,” Krishnamurthy said.

    But perhaps the advantages of a larger alphabet can outweigh the potential drawbacks. Six-letter DNA could densely pack in genetic information. And perhaps six-letter RNA could take over some of the jobs now handled by proteins, which perform most of the work in the cell.

    Proteins have a much more flexible structure than DNA and RNA and are capable of folding into an array of complex shapes. A properly folded protein can act as a molecular lock, opening a chamber only for the right key. Or it can act as a catalyst, capturing and bringing together different molecules for chemical reactions.

    Adding new letters to RNA could give it some of these abilities. “Six letters can potentially fold into more, different structures than four letters,” Ellington said.

    Back when Benner was sketching out ideas for alternative DNA and RNA, it was this potential that he had in mind. According to the most widely held theory of life’s origins, RNA once performed both the information-storage job of DNA and the catalytic job of proteins. Benner realized that there are many ways to make RNA a better catalyst.

    “With just these little insights, I was able to write down the structures that are in my notebook as alternatives that would make DNA and RNA better,” Benner said. “So the question is: Why did life not make these alternatives? One way to find out was to make them ourselves, in the laboratory, and see how they work.”

    Steven Benner’s lab notebook from 1985 outlining plans to synthesize “better” DNA and RNA by adding new chemical letters. Courtesy of Steven Benner

    It’s one thing to design new codes on paper, and quite another to make them work in real biological systems. Other researchers have created their own additions to the genetic code, in one case even incorporating new letters into living bacteria. But these other bases fit together a bit differently from natural ones, stacking on top of each other rather than linking side by side. This can distort the shape of DNA, particularly when a number of these bases cluster together. Benner’s P-Z pair, however, is designed to mimic natural bases.

    One of the new papers by Benner’s team shows that Z and P are yoked together by the same chemical bond that ties A to T and C to G. (This bond is known as Watson-Crick pairing, after the scientists who discovered DNA’s structure.) Millie Georgiadis, a chemist at Indiana University-Purdue University Indianapolis, along with Benner and other collaborators, showed that DNA strands that incorporate Z and P retain their proper helical shape if the new letters are strung together or interspersed with natural letters.

    “This is very impressive work,” said Jack Szostak, a chemist at Harvard University who studies the origin of life, and who was not involved in the study. “Finding a novel base pair that does not grossly disrupt the double-helical structure of DNA has been quite difficult.”

    The team’s second paper demonstrates how well the expanded alphabet works. Researchers started with a random library of DNA strands constructed from the expanded alphabet and then selected the strands that were able to bind to liver cancer cells but not to other cells. Of the 12 successful binders, the best had Zs and Ps in their sequences, while the weakest did not.

    “More functionality in the nucleobases has led to greater functionality in nucleic acids themselves,” Ellington said. In other words, the new additions appear to improve the alphabet, at least under these conditions.

    But additional experiments are needed to determine how broadly that’s true. “I think it will take more work, and more direct comparisons, to be sure that a six-letter version generally results in ‘better’ aptamers [short DNA strands] than four-letter DNA,” Szostak said. For example, it’s unclear whether the six-letter alphabet triumphed because it provided more sequence options or because one of the new letters is simply better at binding, Szostak said.

    Benner wants to expand his genetic alphabet even further, which could enhance its functional repertoire. He’s working on creating a 10- or 12-letter system and plans to move the new alphabet into living cells. Benner’s and others’ synthetic molecules have already proved useful in medical and biotech applications, such as diagnostic tests for HIV and other diseases. Indeed, Benner’s work helped to found the burgeoning field of synthetic biology, which seeks to build new life, in addition to forming useful tools from molecular parts.

    Why Life’s Code Is Limited

    Benner’s work and that of other researchers suggests that a larger alphabet has the capacity to enhance DNA’s function. So why didn’t nature expand its alphabet in the 4 billion years it has had to work on it? It could be because a larger repertoire has potential disadvantages. Some of the structures made possible by a larger alphabet might be of poor quality, with a greater risk of misfolding, Ellington said.

    Nature was also effectively locked into the system at hand when life began. “Once [nature] has made a decision about which molecular structures to place at the core of its molecular biology, it has relatively little opportunity to change those decisions,” Benner said. “By constructing unnatural systems, we are learning not only about the constraints at the time that life first emerged, but also about constraints that prevent life from searching broadly within the imagination of chemistry.”

    The genetic code—made up of the four letters, A, T, G and C—stores the blueprint for proteins. DNA is first transcribed into RNA and then translated into proteins, which fold into specific shapes. Olena Shmahalo/Quanta Magazine

    Benner aims to make a thorough search of that chemical space, using his discoveries to make new and improved versions of both DNA and RNA. He wants to make DNA better at storing information and RNA better at catalyzing reactions. He hasn’t shown directly that the P-Z base pairs do that. But both bases have the potential to help RNA fold into more complex structures, which in turn could make proteins better catalysts. P has a place to add a “functional group,” a molecular structure that helps folding and is typically found in proteins. And Z has a nitro group, which could aid in molecular binding.

    In modern cells, RNA acts as an intermediary between DNA and proteins. But Benner ultimately hopes to show that the three-biopolymer system—DNA, RNA and proteins—that exists throughout life on Earth isn’t essential. With better-engineered DNA and RNA, he says, perhaps proteins are unnecessary.

    Indeed, the three-biopolymer system may have drawbacks, since information flows only one way, from DNA to RNA to proteins. If a DNA mutation produces a more efficient protein, that mutation will spread slowly, as organisms without it eventually die off.

    What if the more efficient protein could spread some other way, by directly creating new DNA? DNA and RNA can transmit information in both directions. So a helpful RNA mutation could theoretically be transformed into beneficial DNA. Adaptations could thus lead directly to changes in the genetic code.

    Benner predicts that a two-biopolymer system would evolve faster than our own three-biopolymer system. If so, this could have implications for life on distant planets. “If we find life elsewhere,” he said, “it would likely have the two-biopolymer system.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 10:31 am on July 18, 2015 Permalink | Reply
    Tags: Cosmology

    From NOVA: “How Time Got Its Arrow” 

    15 Jul 2015

    Lee Smolin, Perimeter Institute for Theoretical Physics

    I believe in time.

    I haven’t always believed in it. Like many physicists and philosophers, I had once concluded from general relativity and quantum gravity that time is not a fundamental aspect of nature, but instead emerges from another, deeper description. Then, starting in the 1990s and accelerated by an eight-year collaboration with the Brazilian philosopher Roberto Mangabeira Unger, I came to believe instead that time is fundamental. (How I came to this is another story.) Now, I believe that by taking time to be fundamental, we might be able to understand how general relativity and the standard model emerge from a deeper theory, why time only goes one way, and how the universe was born.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Flickr user Robert Couse-Baker, adapted under a Creative Commons license.

    The story starts with change. Science, most broadly defined, is the systematic study of change. The world we observe and experience is constantly changing. And most of the changes we observe are irreversible. We are born, we grow, we age, we die, as do all living things. We remember the past and our actions influence the future. Spilled milk is hard to clean up; a cool drink or a hot bath tend towards room temperature. The whole world, living and non-living, is dominated by irreversible processes, as captured mathematically by the second law of thermodynamics, which holds that the entropy of a closed system usually increases and seldom decreases.

    It may come as a surprise, then, that physics regards this irreversibility as a cosmic accident. The laws of nature as we know them are all reversible when you change the direction of time. Film a process described by those laws, and then run the movie backwards: the rewound version is also allowed by the laws of physics. To be more precise, you may have to change left for right and particles for antiparticles, along with reversing the direction of time, but the standard model of particle physics predicts that the original process and its reverse are equally likely.

    The same is true of Einstein’s theory of general relativity, which describes gravity and cosmology. If the whole universe were observed to run backwards in time, so that it heated up while it collapsed, rather than cooled as it expanded, that would be equally consistent with these fundamental laws, as we currently understand them.

    This leads to a fundamental question: Why, if the laws are reversible, is the universe so dominated by irreversible processes? Why does the second law of thermodynamics hold so universally?

    Gravity is one part of the answer. The second law tells us that the entropy of a closed system, which is a measure of disorder or randomness in the motions of the atoms making up that system, will most likely increase until a state of maximum disorder is reached. This state is called equilibrium. Once it is reached, the system is as mixed as possible, so all parts have the same temperature and all the elements are equally distributed.

    But on large scales, the universe is far from equilibrium. Galaxies like ours are continually forming stars, turning nuclear potential energy into heat and light, as they drive the irreversible flows of energy and materials that characterize the galactic disks. On these large scales, gravity fights the decay to equilibrium by causing matter to clump, creating subsystems like stars and planets. This is beautifully illustrated in some recent papers by Barbour, Koslowski and Mercati.

    But this is only part of the answer to why the universe is out of equilibrium. There remains the mystery of why the universe at the big bang was not created in equilibrium to start with, for the picture of the universe given us by observations requires that the universe be created in an extremely improbable state—very far from equilibrium. Why?

    So when we say that our universe started off in a state far from equilibrium, we are saying that it started off in a state that would be very improbable, were the initial state chosen randomly from the set of all possible states. Yet we must accept this vast improbability to explain the ubiquity of irreversible processes in our world in terms of the reversible laws we know.

    In particular, the conditions present in the early universe, being far from equilibrium, are highly irreversible. Run the early universe backwards to a big crunch and they look nothing like the late universe that might be in our future.

    In 1979 Roger Penrose proposed a radical answer to the mystery of irreversibility. His proposal concerned quantum gravity, the long-searched-for unification of all the known laws, which is believed to govern the processes that created the universe in the big bang—or transformed it from whatever state it was in before the big bang.

    Penrose hypothesized that quantum gravity, as the most fundamental law, will be unlike the laws we know in that it will be irreversible. The known laws, along with their time-reversibility, emerge as approximations to quantum gravity when the universe grows large and cool and dilute, Penrose argued. But those approximate laws will act within a universe whose early conditions were set up by the more fundamental, irreversible laws. In this way the improbability of the early conditions can be explained.

    In the intervening years our knowledge of the early universe has been dramatically improved by a host of cosmological observations, but these have only deepened the mysteries we have been discussing. So a few years ago, Marina Cortes, a cosmologist from the Institute for Astronomy in Edinburgh, and I decided to revive Penrose’s suggestion in the light of all the knowledge gained since, both observationally and theoretically.

    Dr. Cortes argued that time is not only fundamental but fundamentally irreversible. She proposed that the universe is made of processes that continuously generate new events from present events. Events happen, but cannot unhappen. The reversal of an event does not erase that event, Cortes says: It is a new event, which happens after it.

    In December of 2011, Dr. Cortes began a three-month visit to Perimeter Institute, where I work, and challenged me to collaborate with her on realizing these ideas. The first result was a model we developed of a universe created by events, which we called an energetic causal set model.

    This is a version of a kind of model called a causal set model, in which the history of the universe is considered to be a discrete set of events related only by cause-and-effect. Our model was different from earlier models, though. In it, events are created by a process which maximizes their uniqueness. More precisely, the process produces a universe created by events, each of which is different from all the others. Space is not fundamental; only the events and the causal process that creates them are fundamental. But while space is not fundamental, energy is. The events each have a quantity of energy, which they gain from their predecessors and pass on to their successors. Everything else in the world emerges from these events and the energy they convey.
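
    To make the flavor of such a model concrete, here is a deliberately crude toy in Python. It is not the actual Cortes–Smolin dynamics (which involves conserved energy-momentum and a specific uniqueness-maximizing rule, neither of which is attempted here); it only illustrates the bare idea of a growing, irreversible set of energetic events, each created from present events and each distinct from all the others:

        import itertools, random

        class Event:
            ids = itertools.count()
            def __init__(self, energy, parents=()):
                self.id = next(Event.ids)  # every event is distinct; none can "unhappen"
                self.energy = energy       # energy received from predecessors
                self.parents = parents

        # Grow the causal set: each new event is caused by two present events
        # and inherits energy from them (a stand-in for the real transfer rule).
        present = [Event(1.0), Event(1.0)]
        for _ in range(10):
            a, b = random.sample(present, 2)
            present.append(Event((a.energy + b.energy) / 2, parents=(a, b)))

        print(len(present), "events; latest caused by",
              [p.id for p in present[-1].parents])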

    We studied the model universes created by these processes and found that they generally pass through two stages of evolution. In the first stage, they are dominated by the irreversible processes that create the events, each unique. The direction of time is clear. But this gives rise to a second stage in which trails of events appear to propagate, creating emergent notions of particles. Particles emerge only when the second, approximately reversible stage is reached. These emergent particles propagate and appear to interact through emergent laws which seem reversible. In fact, we found, there are many possible models in which particles and approximately reversible laws emerge after a time from a more fundamental irreversible, particle-free system.

    This might explain how general relativity and the standard model emerged from a more fundamental theory, as Penrose hypothesized. Could we, we wondered, start with general relativity and, staying within the language of that theory, modify it to describe an irreversible theory? This would give us a framework to bridge the transition between the early, irreversible stage and the later, reversible stage.

    In a recent paper, Marina Cortes, PI postdoc Henrique Gomes and I showed one way to modify general relativity in a way that introduces a preferred direction of time, and we explored the possible consequences for the cosmology of the early universe. In particular, we showed that there were analogies of dark matter and dark energy, but which introduce a preferred direction of time, so a contracting universe is no longer the time-reverse of an expanding universe.

    To do this we had to first modify general relativity to include a physically preferred notion of time. Without that there is no notion of reversing time. Fortunately, such a modification already existed. Called shape dynamics, it had been proposed in 2011 by three young people, including Gomes. Their work was inspired by Julian Barbour, who had proposed that general relativity could be reformulated so that a relativity of size substituted for a relativity of time.

    Using the language of shape dynamics, Cortes, Gomes and I found a way to gently modify general relativity so that little is changed on the scale of stars, galaxies and planets. Nor are the predictions of general relativity regarding gravitational waves affected. But on the scale of the whole universe, and for the early universe, there are deviations where one cannot escape the consequences of a fundamental direction of time.

    Very recently I found still another way to modify the laws of general relativity to make them irreversible. General relativity incorporates the effects of two fixed constants of nature: Newton’s constant, which measures the strength of the gravitational force, and the cosmological constant [usually denoted by the Greek capital letter lambda: Λ], which measures the density of energy in empty space. Usually both are fixed constants, but I found a way they could evolve in time without destroying the beautiful harmony and consistency of the Einstein equations of general relativity.
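
    For reference, these two constants enter the Einstein field equations as

        G_μν + Λ g_μν = (8πG/c^4) T_μν

    where G_μν encodes the curvature of spacetime and T_μν the density and flow of energy and momentum. Letting G and Λ evolve means promoting these two fixed numbers to dynamical quantities while preserving the consistency of the equations.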

    These developments are very recent and are far from demonstrating that the irreversibility we see around us is a reflection of a fundamental arrow of time. But they open a way to an understanding of how time got its direction that does not rely on our universe being a consequence of a cosmic accident.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 2:23 pm on March 1, 2015 Permalink | Reply
    Tags: Cosmology

    From Daily Galaxy: “Our Observed Universe is a Tiny Corner of an Enormous Cosmos –‘Ruled by Dark Energy'” 

    March 01, 2015
    No Writer Credit

    “This new concept is, potentially, as drastic an enlargement of our cosmic perspective as the shift from pre-Copernican ideas to the realization that the Earth is orbiting a typical star on the edge of the Milky Way.” Sir Martin Rees, physicist, Cambridge University, Astronomer Royal of Great Britain.

    Is our universe merely one part of an enormous cosmos containing diverse regions, each with the right amount of dark energy and each larger than the observed universe? That is the proposal of Raphael Bousso, Professor of Theoretical Physics, U of California/Berkeley, and Leonard Susskind, Felix Bloch Professor of Physics, Stanford University. The two theorize that information can leak from our causal patch into others, allowing our part of the universe to “decohere” into one state or another, resulting in the universe that we observe.

    The many worlds interpretation of quantum mechanics is the idea that all possible alternate histories of the universe actually exist. At every point in time, the universe splits into a multitude of existences in which every possible outcome of each quantum process actually happens. The reason many physicists love the many worlds idea is that it explains away all the strange paradoxes of quantum mechanics.

    Putting the many world interpretation aside for a moment, another strange idea in modern physics is the idea that our universe was born along with a large, possibly infinite, number of other universes. So our cosmos is just one tiny corner of a much larger multiverse.

    Susskind and Bousso have put forward the idea that the multiverse and the many worlds interpretation of quantum mechanics are formally equivalent – but only if both quantum mechanics and the multiverse take special forms.

    Let’s take quantum mechanics first. Susskind and Bousso propose that it is possible, in theory, to verify the predictions of quantum mechanics exactly – but only if an observer could perform an infinite number of experiments and observe the outcome of them all. Such an observer is possible only in a special kind of universe, known as the supersymmetric multiverse with vanishing cosmological constant.

    If the universe takes this form, then at each instant in time an infinite (or very large) number of experiments can take place within the causal horizon of each other. As observers, we are capable of seeing the outcome of any of these experiments, but we actually follow only one.

    Bousso and Susskind argue that since the many worlds interpretation is possible only in their supersymmetric multiverse, they must be equivalent. “We argue that the global multiverse is a representation of the many-worlds in a single geometry,” they say, calling this new idea the multiverse interpretation of quantum mechanics.

    But we have now entered the realm of what mathematical physicist Peter Woit of Columbia calls “Not Even Wrong,” because the theory lacks a testable prediction that would help physicists distinguish it experimentally from other theories of the universe. And without this crucial element, the multiverse interpretation of quantum mechanics is little more than philosophy, according to Woit.

    What this new supersymmetric multiverse interpretation does have is simplicity – it’s neat and elegant that the many worlds and the multiverse are equivalent. Ockham’s Razor is fulfilled, and no doubt many quantum physicists delight in what appears to be an exciting, plausible interpretation of ultimate, if currently untestable, reality.

    Ref: arxiv.org/abs/1105.3796: The Multiverse Interpretation of Quantum Mechanics

    The Daily Galaxy via technologyreview.com

    Image credit: hellstormde.deviantart.com

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

     
  • richardmitnick 5:19 am on February 25, 2015 Permalink | Reply
    Tags: Cosmology, Stephen Hawking

    From NOVA: “Stephen Hawking Serves Up Scrambled Black Holes” 

    04 Feb 2014
    Greg Kestin

    Out of the firewall and into the frying pan? Credit: Flickr/Pheexies, under a Creative Commons license.

    Toast or spaghetti?

    That’s the question that physicists have been trying to answer for the last year and a half. After agreeing for decades that anything—or anyone—unlucky enough to fall into a black hole would be ripped and stretched into spaghetti-like strands by the overwhelming gravity, theorists are now contending with the possibility that infalling matter is instead incinerated by a “toasty” wall of fire at the black hole’s horizon. Now, Stephen Hawking has proposed a radical solution: nixing one of the most infamous characteristics of a black hole, its event horizon, or point of no return.

    Stephen Hawking

    The original “spaghetti” scenario follows directly from [Albert] Einstein’s theory of general relativity, which describes how gravity stretches the fabric of space and time. A black hole warps that fabric into a bottomless pit; if you get too close, you reach a point of no return called the horizon, where the slope becomes so steep that you can never climb back out. Inside, the gravity gets stronger and stronger until it tears you limb from limb.

    The first hint that there was a flaw in this picture of a black hole came in 1975, when Stephen Hawking came upon a paradox. He realized that, over a very long time, a black hole will “evaporate”—that is, its mass and energy will gradually leak out as radiation, revealing nothing of what the black hole once contained. This was a shocking conclusion because it suggested that black holes destroy information, a fundamental violation of quantum mechanics, which insists that information be conserved.

    How exactly does black hole evaporation imply that information is destroyed? Let’s say you are reading the last copy of “Romeo and Juliet,” and when you get to the end, grief overcomes you (sorry for the spoiler) and you throw the book into a black hole. After the book falls past the horizon, gravity shreds its pages, and finally it is violently compressed into the central point of the black hole. Then you wait as the black hole slowly evaporates by randomly shooting off particles from its glowing edges without any concern for Romeo or Juliet. As the black hole winks out of existence, only these random subatomic particles remain, floating in space. Where did the Montagues and Capulets go? They are lost forever. You could have thrown in “The Cat in The Hat” and the particles left after evaporation would be indistinguishable from the Shakespearian remnants.

    Hawking realized that something had to give. Either quantum mechanics had to change to accommodate information loss, or Einstein’s theory of gravity was flawed.

    Over the past 40 years theorists have battled in the “black hole wars,” trying to resolve this paradox. Two decades ago, most physicists declared a truce, agreeing to consider the inside and the outside of the black hole as separate spaces. If something falls into the black hole, it has gone to another realm, so just stop thinking about it and its fate, they counseled. This argument was largely accepted until July 2012, when UC Santa Barbara physicist Joseph Polchinski and his colleagues realized the paradox was even more puzzling.

    Polchinski began with a similar thought experiment, but instead of Shakespeare, he imagined tossing entangled particles (particles that are quantum mechanically linked) toward a black hole. What happens, he asked, if one particle falls in the black hole and the other flies out into space? This creates a big problem: We can’t think of the two realms (inside and outside of the black hole) separately because they are tied together by the entangled particles.

    Polchinski proposed a new solution that ripped apart Einstein’s idea of a black hole—literally. If there were something to prevent entanglement across the horizon, he thought, then there would be no problem. So he came up with something called a firewall: a wall of radiation at the black hole’s horizon that burns up anything that hits it. This wall is a tear in space-time that nothing can go through.

    Is incineration finally the solution to the black hole information paradox? The father of the paradox, Stephen Hawking, recently put in his two cents (two pages, actually) in a very brief paper in which he argues against not just firewalls, but also event horizons as an ultimate point-of-no-return. This argument relies on quantum fluctuations in space-time that prevent a horizon from existing at a sharp boundary. He instead proposes a temporary “apparent horizon” that stores matter/energy (and information), chaotically scrambles it, and radiates it back out. This means that, as far as quantum mechanics is concerned, information is not lost; it is just extremely garbled. As Polchinski describes it, “It almost sounds like he is replacing the firewall with a chaos-wall!”

    Are you skeptical? If so, you are in good company. Polchinski, for one, is hesitant, saying “It is not clear what [Hawking’s] picture is. There are no calculations.”

    Steve Giddings, a theoretical physicist at the University of California, Santa Barbara, shares in this reluctance:

    “The big question has been how information escapes a black hole, and what that tells us about faster-than-light signaling or a more serious breakdown of spacetime; the effects Hawking describes don’t appear sufficient to address this.”

    Hawking’s new idea will need some flesh on its bones before we can truly embrace it, but if you don’t like spaghetti or toast, at least you have a third option now: scrambled black holes.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 4:32 am on February 25, 2015 Permalink | Reply
    Tags: Cosmology

    From phys.org: “How can space travel faster than the speed of light?” 

    Feb 23, 2015
    Vanessa Janek

    Light speed is often spoken of as a cosmic speed limit… but not everything plays by these rules. In fact, space itself can expand faster than a photon could ever hope to travel.

    Cosmologists are intellectual time travelers. Looking back over billions of years, these scientists are able to trace the evolution of our Universe in astonishing detail. 13.8 billion years ago, the Big Bang occurred. Fractions of a second later, the fledgling Universe expanded exponentially during an incredibly brief period of time called inflation. Over the ensuing eons, our cosmos has grown to such an enormous size that we can no longer see the other side of it.

    But how can this be? If light’s velocity marks a cosmic speed limit, how can there possibly be regions of spacetime whose photons are forever out of our reach? And even if there are, how do we know that they exist at all?

    The Expanding Universe

    Like everything else in physics, our Universe strives to exist in the lowest possible energy state. But around 10^-36 seconds after the Big Bang, inflationary cosmologists believe, the cosmos found itself resting instead at a “false vacuum energy” – a low-point that wasn’t really a low-point. Seeking the true nadir of vacuum energy, over a minute fraction of a moment, the Universe is thought to have ballooned by a factor of 10^50.

    Since that time, our Universe has continued to expand, but at a much slower pace. We see evidence of this expansion in the light from distant objects. As photons emitted by a star or galaxy propagate across the Universe, the stretching of space causes them to lose energy. Once the photons reach us, their wavelengths have been redshifted in accordance with the distance they have traveled.

    This is why cosmologists speak of redshift as a function of distance in both space and time. The light from these distant objects has been traveling for so long that, when we finally see it, we are seeing the objects as they were billions of years ago.
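
    In terms of wavelengths, this stretching is quoted as a redshift z, defined by 1 + z = (observed wavelength) / (emitted wavelength). A quick illustrative calculation in Python (the hydrogen Lyman-alpha line is used here purely as a convenient example):

        # Cosmological redshift: space stretches a photon's wavelength in flight,
        # so 1 + z = lambda_observed / lambda_emitted.
        def redshift(lambda_emitted_nm, lambda_observed_nm):
            return lambda_observed_nm / lambda_emitted_nm - 1

        # Lyman-alpha light emitted at 121.6 nm and observed at 364.8 nm
        # gives z = 2: it left its source when distances in the Universe
        # were 1/(1 + z) = 1/3 of their present size.
        print(redshift(121.6, 364.8))  # 2.0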

    The Hubble Volume

    Redshifted light allows us to see objects like galaxies as they existed in the distant past; but we cannot see all events that occurred in our Universe during its history. Because our cosmos is expanding, some objects are simply too far away for their light ever to reach us.

    The physics of that boundary rely, in part, on a chunk of surrounding spacetime called the Hubble volume. Here on Earth, we define the Hubble volume by measuring something called the Hubble parameter (H0), a value that relates the apparent recession speed of distant objects to their distance from us. It was first calculated in 1929, when Edwin Hubble discovered that faraway galaxies appeared to be moving away from us at speeds proportional to their distance, as inferred from the redshift of their light.

    Dividing the speed of light by H0, we get the radius of the Hubble volume. This spherical bubble encloses a region where all objects move away from a central observer at speeds less than the speed of light. Correspondingly, all objects outside of the Hubble volume move away from the center faster than the speed of light.
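
    Plugging in numbers makes the scale concrete. A minimal sketch, assuming a present-day Hubble parameter of about 67.8 km/s per megaparsec (the measured value has shifted slightly over the years):

        # Radius of the Hubble volume: c / H0.
        c = 299_792.458          # speed of light, km/s
        H0 = 67.8                # Hubble parameter, km/s per megaparsec (assumed value)
        mpc_per_gly = 306.6      # megaparsecs in one billion light years (approx.)

        hubble_radius_mpc = c / H0                 # ~4,420 Mpc
        print(hubble_radius_mpc / mpc_per_gly)     # ~14.4 billion light years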

    Yes, “faster than the speed of light.” How is this possible?

    Two sources of redshift: Doppler and cosmological expansion; modeled after Koupelis & Kuhn. Bottom: Detectors catch the light that is emitted by a central star. This light is stretched, or redshifted, as space expands in between. Credit: Brews Ohare

    The answer has to do with the difference between special relativity and general relativity. Special relativity requires what is called an “inertial reference frame” – more simply, a backdrop. According to this theory, the speed of light is the same when compared in all inertial reference frames. Whether an observer is sitting still on a park bench on planet Earth or zooming past Neptune in a futuristic high-velocity rocketship, the speed of light is always the same. A photon always travels away from the observer at 300,000,000 meters per second, and he or she will never catch up.

    General relativity, however, describes the fabric of spacetime itself. In this theory, there is no inertial reference frame. Spacetime is not expanding with respect to anything outside of itself, so the speed of light as a limit on its velocity doesn’t apply. Yes, galaxies outside of our Hubble sphere are receding from us faster than the speed of light. But the galaxies themselves aren’t breaking any cosmic speed limits. To an observer within one of those galaxies, nothing violates special relativity at all. It is the space in between us and those galaxies that is rapidly stretching and expanding exponentially.

    The Observable Universe

    Now for the next bombshell: The Hubble volume is not the same thing as the observable Universe.

    To understand this, consider that as the Universe gets older, distant light has more time to reach our detectors here on Earth. We can see objects that have accelerated beyond our current Hubble volume because the light we see today was emitted when they were within it.

    Strictly speaking, our observable Universe coincides with something called the particle horizon. The particle horizon marks the distance to the farthest light that we can possibly see at this moment in time – photons that have had enough time to either remain within, or catch up to, our gently expanding Hubble sphere.

    And just what is this distance? A little more than 46 billion light years in every direction – giving our observable Universe a diameter of approximately 93 billion light years, or more than 500 billion trillion miles.
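
    That last figure is straightforward unit conversion, as a quick check shows (using roughly 5.88 trillion miles per light year):

        miles_per_ly = 5.88e12             # one light year in miles (approx.)
        diameter_ly = 93e9                 # 93 billion light years
        print(diameter_ly * miles_per_ly)  # ~5.5e23 miles, i.e. ~550 billion trillion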

    (A quick note: the particle horizon is not the same thing as the cosmological event horizon. The particle horizon encompasses all the events in the past that we can currently see. The cosmological event horizon, on the other hand, defines a distance within which a future observer will be able to see the then-ancient light our little corner of spacetime is emitting today.

    In other words, the particle horizon deals with the distance to past objects whose ancient light we can see today; the cosmological event horizon deals with the distance that our present-day light will be able to travel as faraway regions of the Universe accelerate away from us.)

    Fit of redshift velocities to Hubble’s law. Credit: Brews Ohare

    Dark Energy

    Thanks to the expansion of the Universe, there are regions of the cosmos that we will never see, even if we could wait an infinite amount of time for their light to reach us. But what about those areas just beyond the reaches of our present-day Hubble volume? If that sphere is also expanding, will we ever be able to see those boundary objects?

    This depends on which region is expanding faster – the Hubble volume or the parts of the Universe just outside of it. And the answer to that question depends on two things: 1) whether H0 is increasing or decreasing, and 2) whether the Universe is accelerating or decelerating. These two rates are intimately related, but they are not the same.

    In fact, cosmologists believe that we are actually living at a time when H0 is decreasing; but because of dark energy, the velocity of the Universe’s expansion is increasing.

    That may sound counterintuitive, but as long as H0 decreases at a slower rate than that at which the Universe’s expansion velocity is increasing, the overall movement of galaxies away from us still occurs at an accelerated pace. And at this moment in time, cosmologists believe that the Universe’s expansion will outpace the more modest growth of the Hubble volume.
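
    The distinction is easier to see with a toy calculation: in a flat universe containing matter and dark energy, the Hubble parameter H falls toward a floor set by dark energy even while the expansion speed (H times the scale factor a) keeps growing. A minimal sketch, assuming round present-day density parameters of 0.3 for matter and 0.7 for dark energy:

        # Toy flat-LambdaCDM model: H(a) decreases as the scale factor a grows,
        # while the expansion speed a_dot = H(a) * a increases.
        omega_m, omega_lambda = 0.3, 0.7   # assumed density parameters
        H0 = 1.0                           # work in units where H0 = 1

        def H(a):
            return H0 * (omega_m / a**3 + omega_lambda) ** 0.5

        for a in (1.0, 2.0, 4.0):          # today and two future epochs
            print(a, H(a), H(a) * a)       # H falls; a_dot = H * a rises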

    The observable universe, more technically known as the particle horizon.

    So even though our Hubble volume is expanding, the influence of dark energy appears to provide a hard limit to the ever-increasing observable Universe.

    Our Earthly Limitations

    Cosmologists seem to have a good handle on deep questions like what our observable Universe will someday look like and how the expansion of the cosmos will change. But ultimately, scientists can only theorize the answers to questions about the future based on their present-day understanding of the Universe. Cosmological timescales are so unimaginably long that it is impossible to say much of anything concrete about how the Universe will behave in the future. Today’s models fit the current data remarkably well, but the truth is that none of us will live long enough to see whether the predictions truly match all of the outcomes.

    Disappointing? Sure. But totally worth the effort to help our puny brains consider such mind-boggling science – a reality that, as usual, is just plain stranger than fiction.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 6:47 pm on January 8, 2015 Permalink | Reply
    Tags: Cosmology

    From Caltech: “Unusual Light Signal Yields Clues About Elusive Black Hole Merger” 

    01/07/2015
    Ker Than

    The central regions of many glittering galaxies, our own Milky Way included, harbor cores of impenetrable darkness—black holes with masses equivalent to millions, or even billions, of suns. What is more, these supermassive black holes and their host galaxies appear to develop together, or “co-evolve.” Theory predicts that as galaxies collide and merge, growing ever more massive, so too do their dark hearts.

    Simulation of gravitational lensing by a black hole, which distorts the image of a galaxy in the background.


    ESOCast
    Astronomers using ESO’s Very Large Telescope have discovered a gas cloud with several times the mass of the Earth accelerating towards the black hole at the centre of the Milky Way. This is the first time ever that the approach of such a doomed cloud to a supermassive black hole has been observed. This ESOcast explains the new results and includes spectacular simulations of how the cloud will break up over the next few years.
    Credit: ESO.

    Black holes by themselves are impossible to see, but their gravity can pull in surrounding gas to form a swirling band of material called an accretion disk. The spinning particles are accelerated to tremendous speeds and release vast amounts of energy in the form of heat and powerful X-rays and gamma rays. When this process happens to a supermassive black hole, the result is a quasar—an extremely luminous object that outshines all of the stars in its host galaxy and that is visible from across the universe. “Quasars are valuable probes of the evolution of galaxies and their central black holes,” says George Djorgovski, professor of astronomy and director of the Center for Data-Driven Discovery at Caltech.

    In the January 7 issue of the journal Nature, Djorgovski and his collaborators report on an unusual repeating light signal from a distant quasar that they say is most likely the result of two supermassive black holes in the final phases of a merger—something that is predicted from theory but which has never been observed before. The discovery could help shed light on a long-standing conundrum in astrophysics called the “final parsec problem,” which refers to the failure of theoretical models to predict what the final stages of a black hole merger look like or even how long the process might take. “The end stages of the merger of these supermassive black hole systems are very poorly understood,” says the study’s first author, Matthew Graham, a senior computational scientist at Caltech. “The discovery of a system that seems to be at this late stage of its evolution means we now have an observational handle on what is going on.”

    Djorgovski and his team discovered the unusual light signal emanating from quasar PG 1302-102 after analyzing results from the Catalina Real-Time Transient Survey (CRTS), which uses three ground telescopes in the United States and Australia to continuously monitor some 500 million celestial light sources strewn across about 80 percent of the night sky. “There has never been a data set on quasar variability that approaches this scope before,” says Djorgovski, who directs the CRTS. “In the past, scientists who study the variability of quasars might only be able to follow some tens, or at most hundreds, of objects with a limited number of measurements. In this case, we looked at a quarter million quasars and were able to gather a few hundred data points for each one.”

    “Until now, the only known examples of supermassive black holes on their way to a merger have been separated by tens or hundreds of thousands of light years,” says study coauthor Daniel Stern, a scientist at NASA’s Jet Propulsion Laboratory. “At such vast distances, it would take many millions, or even billions, of years for a collision and merger to occur. In contrast, the black holes in PG 1302-102 are, at most, a few hundredths of a light year apart and could merge in about a million years or less.”

    Djorgovski and his team did not set out to find a black hole merger. Rather, they initially embarked on a systematic study of quasar brightness variability in the hopes of finding new clues about their physics. But after screening the data using a pattern-seeking algorithm that Graham developed, the team found 20 quasars that seemed to be emitting periodic optical signals. This was surprising, because the light curves of most quasars are chaotic—a reflection of the random nature by which material from the accretion disk spirals into a black hole. “You just don’t expect to see a periodic signal from a quasar,” Graham says. “When you do, it stands out.”

    Of the 20 periodic quasars that CRTS identified, PG 1302-102 was the best example. It had a strong, clean signal that appeared to repeat every five years or so. “It has a really nice smooth up-and-down signal, similar to a sine wave, and that just hasn’t been seen before in a quasar,” Graham says.

    The team was cautious about jumping to conclusions. “We approached it with skepticism but excitement as well,” says study coauthor Eilat Glikman, an assistant professor of physics at Middlebury College in Vermont. After all, it was possible that the periodicity the scientists were seeing was just a temporary ordered blip in an otherwise chaotic signal. To help rule out this possibility, the scientists pulled in data about the quasar from previous surveys to include in their analysis. After factoring in the historical observations (the scientists had nearly 20 years’ worth of data about quasar PG 1302-102), the repeating signal was, encouragingly, still there.

    The team’s confidence increased further after Glikman analyzed the quasar’s light spectrum. The black holes that scientists believe are powering quasars do not emit light, but the gases swirling around them in the accretion disks are traveling so quickly that they become heated into glowing plasma. “When you look at the emission lines in a spectrum from an object, what you’re really seeing is information about speed—whether something is moving toward you or away from you and how fast. It’s the Doppler effect,” Glikman says. “With quasars, you typically have one emission line, and that line is a symmetric curve. But with this quasar, it was necessary to add a second emission line with a slightly different speed than the first one in order to fit the data. That suggests something else, such as a second black hole, is perturbing this system.”
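
    Glikman’s Doppler argument is easy to make concrete. In the non-relativistic limit, a fractional shift in an emission line’s wavelength translates directly into a line-of-sight speed, v ≈ c · Δλ/λ. The wavelengths below are illustrative numbers near the Mg II line at 2798 angstroms, not measurements from the study:

        # Illustrative Doppler calculation: wavelength shift -> speed.
        # Example numbers only; not values from the PG 1302-102 spectrum.
        C_KM_S = 299_792.458   # speed of light, km/s

        lambda_rest = 2798.0   # rest wavelength, angstroms (Mg II, for example)
        lambda_obs = 2826.0    # hypothetical observed wavelength, angstroms

        v = C_KM_S * (lambda_obs - lambda_rest) / lambda_rest
        print(f"line-of-sight speed ~ {v:,.0f} km/s")  # ~3,000 km/s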

    Avi Loeb, who chairs the astronomy department at Harvard University, agreed with the team’s assessment that a “tight” supermassive black hole binary is the most likely explanation for the periodic signal they are seeing. “The evidence suggests that the emission originates from a very compact region around the black hole and that the speed of the emitting material in that region is at least a tenth of the speed of light,” says Loeb, who did not participate in the research. “A secondary black hole would be the simplest way to induce a periodic variation in the emission from that region, because a less dense object, such as a star cluster, would be disrupted by the strong gravity of the primary black hole.”
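
    Loeb’s “at least a tenth of the speed of light” figure itself pins down how compact the emitting region must be. For a circular orbit, v = sqrt(GM/r), so v = 0.1c implies r = 100 GM/c^2 – only about 50 Schwarzschild radii from the black hole. A minimal sketch, assuming a billion-solar-mass primary (an assumed value, not one from the article):

        # How close must material orbit to move at a tenth of light speed?
        # Circular orbit: v = sqrt(G*M/r)  =>  r = G*M / v**2.
        G = 6.674e-11      # m^3 kg^-1 s^-2
        C = 2.998e8        # m/s
        M_SUN = 1.989e30   # kg

        M = 1e9 * M_SUN    # ASSUMED primary mass: one billion solar masses
        v = 0.1 * C

        r = G * M / v**2          # orbital radius for that speed
        r_s = 2 * G * M / C**2    # Schwarzschild radius of the primary
        print(f"r ~ {r:.2e} m, or ~{r / r_s:.0f} Schwarzschild radii")  # ~50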

    In addition to providing an unprecedented glimpse into the final stages of a black hole merger, the discovery is also a testament to the power of “big data” science, where the challenge lies not only in collecting high-quality data but also in devising ways to mine it for useful information. “We’re basically moving from having a few pictures of the whole sky or repeated observations of tiny patches of the sky to having a movie of the entire sky all the time,” says Sterl Phinney, a professor of theoretical physics at Caltech, who was also not involved in the study. “Many of the objects in the movie will not be doing anything very exciting, but there will also be a lot of interesting ones that we missed before.”

    It is still unclear what physical mechanism is responsible for the quasar’s repeating light signal. One possibility, Graham says, is that the quasar is funneling material from its accretion disk into luminous twin plasma jets that are rotating like beams from a lighthouse. “If the glowing jets are sweeping around in a regular fashion, then we would only see them when they’re pointed directly at us. The end result is a regularly repeating signal,” Graham says.

    Another possibility is that the accretion disk that encircles both black holes is distorted. “If one region is thicker than the rest, then as the warped section travels around the accretion disk, it could be blocking light from the quasar at regular intervals. This would explain the periodicity of the signal that we’re seeing,” Graham says. Yet another possibility is that something is happening to the accretion disk that is causing it to dump material onto the black holes in a regular fashion, resulting in periodic bursts of energy.

    “Even though there are a number of viable physical mechanisms behind the periodicity we’re seeing—either the precessing jet, warped accretion disk or periodic dumping—these are all still fundamentally caused by a close binary system,” Graham says.

    Along with Djorgovski, Graham, Stern, and Glikman, additional authors on the paper, “A possible close supermassive black hole binary in a quasar with optical periodicity,” include Andrew Drake, a computational scientist and co-principal investigator of the CRTS sky survey at Caltech; Ashish Mahabal, a staff scientist in computational astronomy at Caltech; Ciro Donalek, a computational staff scientist at Caltech; Steve Larson, a senior staff scientist at the University of Arizona; and Eric Christensen, an associate staff scientist at the University of Arizona. Funding for the study was provided by the National Science Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
    Caltech buildings

     
  • richardmitnick 4:52 pm on January 8, 2015 Permalink | Reply
    Tags: , , , Cosmology, ,   

    From Gemini Observatory: “THE GEMINI PLANET IMAGER PRODUCES STUNNING OBSERVATIONS IN ITS FIRST YEAR” 

    NOAO

    Gemini Observatory
    Gemini Observatory

    January 6, 2015
    Media Contacts:

    Peter Michaud
    Public Information and Outreach Manager
    Gemini Observatory, Hilo, HI
    Email: pmichaud”at”gemini.edu
    Cell: (808) 936-6643
    Desk: (808) 974-2510

    Science Contacts:

    Marshall Perrin
    STScI
    Email: mperrin”at”stsci.edu
    Phone: (410) 507-5483

    James R. Graham
    University of California Berkeley
    Email: jrg”at”berkeley.edu
    Cell: (510) 926-9820

    Stunning exoplanet images and spectra from the first year of science operations with the Gemini Planet Imager (GPI) were featured today in a press conference at the 225th meeting of the American Astronomical Society (AAS) in Seattle, Washington. GPI is an advanced instrument designed to observe the environments close to bright stars, to detect and study Jupiter-like exoplanets (planets around other stars), and to see protostellar material (disks, rings) that might be lurking next to the star.

    1
    Figure 1. GPI imaging of the planetary system HR 8799 in K band, showing 3 of the 4 planets. (Planet b is outside the field of view shown here, off to the left.) These data were obtained on November 17, 2013 during the first week of operation of GPI and in relatively challenging weather conditions, but with GPI’s advanced adaptive optics system and coronagraph the planets can still be clearly seen and their spectra measured (see Figure 2). Image credit: Christian Marois (NRC Canada), Patrick Ingraham (Stanford University) and the GPI Team.

    2
    Figure 2. GPI spectroscopy of planets c and d in the HR 8799 system. While earlier work showed that the planets have similar overall brightness and colors, these newly measured spectra show surprisingly large differences. The spectrum of planet d increases smoothly from 1.9-2.2 microns while planet c’s spectrum shows a sharper kink upwards just beyond 2 microns. These new GPI results indicate that these similar-mass and equal-age planets nonetheless have significant differences in atmospheric properties, for instance more open spaces between patchy cloud cover on planet c versus uniform cloud cover on planet d, or perhaps differences in atmospheric chemistry. These data are helping refine and improve a new generation of atmospheric models to explain these effects. Image credit: Patrick Ingraham (Stanford University), Mark Marley (NASA Ames), Didier Saumon (Los Alamos National Laboratory) and the GPI Team.

    Marshall Perrin (Space Telescope Science Institute), one of the instrument’s team leaders, presented a pair of recent and promising results at the press conference. He revealed some of the most detailed images and spectra ever of the multiple planet system HR 8799. His presentation also included never-seen details in the dusty ring of the young star HR 4796A. “GPI’s advanced imaging capabilities have delivered exquisite images and data,” said Perrin. “These improved views are helping us piece together what’s going on around these stars, yet also posing many new questions.”

    The GPI spectra obtained for two of the planetary members of the HR 8799 system present a challenge for astronomers. GPI team member Patrick Ingraham (Stanford University) led the paper on HR 8799. Ingraham reports that the shapes of the spectra for the two planets differ more profoundly than expected based on their similar colors, indicating significant differences between the companions. “Current atmospheric models of exoplanets cannot fully explain the subtle differences in color that GPI has revealed. We infer that it may be differences in the coverage of the clouds or their composition.” Ingraham adds, “The fact that GPI was able to extract new knowledge from these planets on the first commissioning run in such a short amount of time, and in conditions that it was not even designed to work, is a real testament to how revolutionary GPI will be to the field of exoplanets.”

    Perrin, who is working to understand the dusty ring around the young star HR 4796A, said that the new GPI data present an unprecedented level of detail in studies of the ring’s polarized light. “GPI not only sees the disk more clearly than previous instruments, it can also measure how polarized its light appears, which has proven crucial in understanding its physical properties.” Specifically, the GPI measurements of the ring show it must be partially opaque, implying it is far denser and more tightly compressed than similar dust found in the outskirts of our own Solar System, which is more diffuse. The ring circling HR 4796A is about twice the diameter of the planetary orbits in our Solar System, and its star is about twice our Sun’s mass. “These data taken during GPI commissioning show how exquisitely well its polarization mode works for studying disks. Such observations are critical in advancing our understanding of all types and sizes of planetary systems – and ultimately how unique our own solar system might be,” said Perrin.

    3
    Figure 3. GPI imaging polarimetry of the circumstellar disk around HR 4796A, a ring of dust and planetesimals similar in some ways to a scaled up version of the solar system’s Kuiper Belt.

    Kuiper Belt
    Kuiper Belt, for illustration of the discussion

    These GPI observations reveal a complex pattern of variations in brightness and polarization around the HR 4796A disk. The western side (tilted closer to the Earth) appears brighter in polarized light, while in total intensity the eastern side appears slightly brighter, particularly just to the east of the widest apparent separation points of the disk. Reconciling this complex and apparently contradictory pattern of brighter and darker regions required a major overhaul of our understanding of this circumstellar disk. Image credit: Marshall Perrin (Space Telescope Science Institute), Gaspard Duchene (UC Berkeley), Max Millar-Blanchaer (University of Toronto), and the GPI Team.

    4
    Figure 4. Diagram depicting the GPI team’s revised model for the orientation and composition of the HR 4796A ring. To explain the observed polarization levels, the disk must consist of relatively large (> 5 µm) silicate dust particles, which scatter light most strongly and polarize it more for forward scattering. To explain the relative faintness of the east side in total intensity, the disk must be dense enough to be slightly opaque, comparable to Saturn’s optically thick rings, such that on the near side of the disk our view of its brightly illuminated inner portion is partially obscured. This revised model requires the disk to be much narrower and flatter than expected, and poses a new challenge for theories of disk dynamics to explain. GPI’s high contrast imaging and polarimetry capabilities together were essential for this new synthesis. Image credit: Marshall Perrin (Space Telescope Science Institute).

    During the commissioning phase, the GPI team observed a variety of targets, ranging from asteroids in our solar system to an old star near its death. Other teams of scientists have been using GPI as well, and astronomers around the world have already published eight papers in peer-reviewed journals using GPI data. “This might be the most productive new instrument Gemini has ever had,” said Professor James Graham of the University of California, who leads the GPI science team and who will describe the GPI exoplanet survey in a talk scheduled at the AAS meeting on Thursday, January 8th.

    The Gemini Observatory staff integrated the complex instrument into the telescope’s software and helped to characterize GPI’s performance. “Even though it’s so complicated, GPI now operates almost automatically,” said Gemini’s instrument scientist for GPI, Fredrik Rantakyro. “This allows us to start routine science operations.” The instrument is now available to astronomers, and their proposals are scheduled to start observing in early 2015. In addition, “shared risk” observations are already underway, having started in November 2014.

    The one thing GPI hasn’t done yet is discover a new planet. “For the early tests, we concentrated on known planets or disks,” said GPI PI Bruce Macintosh. Now that GPI is fully operational, the search for new planets has begun. In addition to observations by astronomers worldwide, the Gemini Planet Imager Exoplanet Survey (GPIES) will look at 600 carefully selected stars over the next few years. GPI ‘sees’ planets through the infrared light they emit when they’re young, so the GPIES team has assembled a list of the youngest and closest stars. So far the team has observed 50 stars, and analysis of the data is ongoing. Discovering a planet requires confirmation observations to distinguish a true planet orbiting the target star from a distant star that happens to sneak into GPI’s field of view – a process that could take years with previous instruments. The GPIES team found one such object in their first survey run, but GPI observations were sensitive enough to almost immediately rule it out. Macintosh said, “With GPI, we can tell almost instantly that something isn’t a planet – rather than months of uncertainty, we can get over our disappointment almost immediately. Now it’s time to find some real planets!”
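
    The confirmation step Macintosh describes usually comes down to a common-proper-motion test: the nearby target star drifts measurably across the sky, a bound companion drifts along with it, and a distant background star effectively stays put. The toy Python sketch below illustrates that logic with invented numbers – it is not GPIES code or data:

        # Toy common-proper-motion test: bound companion vs. background star.
        # All numbers invented for illustration.

        pm_star = (85.0, -42.0)   # host star's proper motion, mas/yr (E, N)

        # Candidate's offset from the star at two epochs, in milliarcseconds.
        epoch1 = (500.0, 300.0)   # first observation
        epoch2 = (500.5, 299.4)   # one year later
        dt = 1.0                  # years between epochs

        # In star-relative coordinates, a background star appears to move
        # opposite the host's proper motion; a bound companion stays put
        # (neglecting its slow orbital motion).
        expected_background = (epoch1[0] - pm_star[0] * dt,
                               epoch1[1] - pm_star[1] * dt)

        def dist(a, b):
            return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

        if dist(epoch2, epoch1) < dist(epoch2, expected_background):
            print("Moves with the star: consistent with a bound companion.")
        else:
            print("Stays fixed on the sky: likely a background star.")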

    About GPI/GPIES

    The Gemini Planet Imager (GPI) instrument was constructed by an international collaboration led by Lawrence Livermore National Laboratory under Gemini’s supervision. The GPI Exoplanet Survey (GPIES) is the core science program to be carried out with it. GPIES is led by Bruce Macintosh, now a professor at Stanford University, and James Graham, a professor at the University of California at Berkeley, and is designed to find young, Jupiter-like exoplanets. The survey will observe 600 young nearby stars in 890 hours over three years. Targets have been carefully selected by team members at Arizona State University, the University of Georgia, and UCLA. The core of the data processing architecture is led by Marshall Perrin of the Space Telescope Science Institute, with the core software originally written by the University of Montreal, data management infrastructure from UC Berkeley and Cornell University, and contributions from all the other team institutions. The SETI Institute, located in California, manages GPIES’s communications and public outreach. Several teams located at the Dunlap Institute, the University of Western Ontario, the University of Chicago, the Lowell Observatory, NASA Ames, the American Museum of Natural History, the University of Arizona, and the University of California at San Diego and at Santa Cruz also contribute to the survey. The GPI Exoplanet Survey is supported by NASA Origins Program grant NNX14AG80, the NSF AAG program, and grants from other institutions including the University of California Office of the President. Dropbox Inc. has generously provided storage space for the entire survey’s archive.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Gemini North
    Gemini North, Hawai’i

    Gemini South
    Gemini South, Chile
    AURA Icon

    The Gemini Observatory consists of twin 8.1-meter diameter optical/infrared telescopes located on two of the best observing sites on the planet. From their locations on mountains in Hawai‘i and Chile, Gemini Observatory’s telescopes can collectively access the entire sky.
    Gemini was built and is operated by a partnership of six countries including the United States, Canada, Chile, Australia, Brazil and Argentina. Any astronomer in these countries can apply for time on Gemini, which is allocated in proportion to each partner’s financial stake.

     
  • richardmitnick 4:31 am on January 8, 2015 Permalink | Reply
    Tags: , , , Cosmology, ,   

    From NASA Science: “Hubble: Pillars of Creation are also Pillars of Destruction” 

    NASA Science Science News

    Jan. 7, 2015
    Dr. Tony Phillips

    Although NASA’s Hubble Space Telescope has taken many breathtaking images of the universe, one snapshot stands out from the rest: the iconic view of the so-called “Pillars of Creation.” The jaw-dropping photo, taken in 1995, revealed never-before-seen details of three giant columns of cold gas bathed in the scorching ultraviolet light from a cluster of young, massive stars in a small region of the Eagle Nebula, or M16.

    e
    Overview of some famous sights in the Eagle Nebula

    NASA Hubble Telescope
    Hubble

    In celebration of its upcoming 25th anniversary in April, Hubble has revisited the famous pillars, providing astronomers with a sharper and wider view. Although the original image was dubbed the Pillars of Creation, the new image hints that they are also “pillars of destruction.”

    p
    Astronomers using NASA’s Hubble Space Telescope have assembled a bigger and sharper photograph of the iconic Eagle Nebula’s “Pillars of Creation”. Credit: NASA/ESA/Hubble Heritage Team (STScI/AURA)/J. Hester, P. Scowen (Arizona State U.)

    “I’m impressed by how transitory these structures are,” explains Paul Scowen of Arizona State University in Tempe. “They are actively being ablated away before our very eyes. The ghostly bluish haze around the dense edges of the pillars is material getting heated up and evaporating away into space. We have caught these pillars at a very unique and short-lived moment in their evolution.” Scowen and astronomer Jeff Hester, formerly of Arizona State University, led the original Hubble observations of the Eagle Nebula.


    HUBBLECast 82

    The original 1995 images were taken in visible light. The new image includes near-infrared light as well. The infrared view transforms the pillars into eerie, wispy silhouettes seen against a background of myriad stars. That’s because the infrared light penetrates much of the gas and dust, except for the densest regions of the pillars. Newborn stars can be seen hidden away inside the pillars.

    The infrared image shows that the very ends of the pillars are dense knots of dust and gas. They shadow the gas below them, keeping the gas cool and creating the long, column-like structures. The material in between the pillars has long since been evaporated away by the ionizing radiation from the central star cluster located above the pillars.

    At the top edge of the left-hand pillar, a gaseous fragment has been heated up and is flying away from the structure, underscoring the violent nature of star-forming regions. “These pillars represent a very dynamic, active process,” Scowen said. “The gas is not being passively heated up and gently wafting away into space. The gaseous pillars are actually getting ionized, a process by which electrons are stripped off of atoms, and heated up by radiation from the massive stars. And then they are being eroded by the stars’ strong winds and barrage of charged particles, which are literally sandblasting away the tops of these pillars.”

    When Scowen and Hester used Hubble to make the initial observations of the Eagle Nebula in 1995, astronomers had seen the pillar-like structures in ground-based images, but not in detail. They knew that the physical processes are not unique to the Eagle Nebula because star birth takes place across the universe. But at a distance of just 6,500 light-years, M16 is the most dramatic nearby example – as the team soon realized.

    As Scowen was piecing together the Hubble exposures of the Eagle, he was amazed at what he saw. “I called Jeff Hester on his phone and said, ‘You need to get here now,’” Scowen recalled. “We laid the pictures out on the table, and we were just gushing because of all the incredible detail that we were seeing for the very first time.”

    The first features that jumped out at the team in 1995 were the streamers of gas seemingly floating away from the columns. Astronomers had previously debated what effect nearby massive stars would have on the surrounding gas in stellar nurseries. “There is only one thing that can light up a neighborhood like this: massive stars kicking out enough horsepower in ultraviolet light to ionize the gas clouds and make them glow,” Scowen said. “Nebulous star-forming regions like M16 are the interstellar neon signs that say, ‘We just made a bunch of massive stars here.’ This was the first time we had directly seen observational evidence that the erosionary process, not only the radiation but the mechanical stripping away of the gas from the columns, was actually being seen.”

    o
    The original 1995 image was beautiful.

    By comparing the 1995 and 2014 pictures, astronomers also noticed a lengthening of a narrow jet-like feature that may have been ejected from a newly forming star. The jet looks like a stream of water from a garden hose. Over the intervening 19 years, this jet has stretched farther into space, across an additional 60 billion miles, at an estimated speed of about 450,000 miles per hour.
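
    Those two figures can be sanity-checked with simple arithmetic (my rough check, not a calculation from the article): 60 billion miles covered in 19 years averages out to a few hundred thousand miles per hour, the same order as the quoted estimate, with the exact value depending on rounding in the distance and the time baseline:

        # Rough average-speed check for the lengthening jet: distance / time.
        distance_miles = 60e9            # additional stretch, ~60 billion miles
        hours = 19.0 * 365.25 * 24.0     # 19 years, in hours

        print(f"average speed ~ {distance_miles / hours:,.0f} mph")  # ~360,000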

    Our sun probably formed in a similar turbulent star-forming region. There is evidence that the forming solar system was seasoned with radioactive shrapnel from a nearby supernova. That means that our sun was formed as part of a cluster that included stars massive enough to produce powerful ionizing radiation, such as is seen in the Eagle Nebula. “That’s the only way the nebula from which the sun was born could have been exposed to a supernova that quickly, in the short period of time that represents, because supernovae only come from massive stars, and those stars only live a few tens of millions of years,” Scowen explained. “What that means is when you look at the environment of the Eagle Nebula or other star-forming regions, you’re looking at exactly the kind of nascent environment that our sun formed in.”

    See the full article here.

    Pillars in Near Infrared from ESO’s VLT
    3

    The 8.2-meter VLT’s ANTU telescope imaged the famous “Pillars of Creation” region and its surroundings in near-infrared using the ISAAC instrument. This enabled astronomers to penetrate the obscuring dust in their search to detect newly formed stars. The near-infrared results showed that 11 of the Pillars’ 73 evaporating gaseous globules (or EGGs) possibly contained stars, and that the tips of the pillars contain stars and nebulosity not seen in the Hubble image.

    ESA Video showing the Pillars in a variety of wavelengths.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA’s Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI), a free-standing science center located on the campus of The Johns Hopkins University and operated by the Association of Universities for Research in Astronomy (AURA) for NASA, conducts Hubble science operations.

    NASA leads the nation on a great journey of discovery, seeking new knowledge and understanding of our planet Earth, our Sun and solar system, and the universe out to its farthest reaches and back to its earliest moments of existence. NASA’s Science Mission Directorate (SMD) and the nation’s science community use space observatories to conduct scientific studies of the Earth from space, to visit and return samples from other bodies in the solar system, and to peer out into our Galaxy and beyond. NASA’s science program seeks answers to profound questions that touch us all.

    This is NASA’s science vision: using the vantage point of space to achieve with the science community and our partners a deep scientific understanding of our planet, other planets and solar system bodies, the interplanetary environment, the Sun and its effects on the solar system, and the universe beyond. In so doing, we lay the intellectual foundation for the robotic and human expeditions of the future while meeting today’s needs for scientific information to address national concerns, such as climate change and space weather. At every step we share the journey of scientific exploration with the public and partner with others to substantially improve science, technology, engineering and mathematics (STEM) education nationwide.

    NASA

     