Recent Updates

  • richardmitnick 9:37 am on September 3, 2015 Permalink | Reply

    From NOVA: “How big a deal was Stephen Hawking’s big black hole announcement?” 

    PBS NOVA

    02 Sep 2015
    Sarah Scoles

    “Can you hear me?” Stephen Hawking asked as he was about to begin his August 25 talk at the Royal Institute of Technology’s “Hawking Radiation” conference in Stockholm.

    Stephen Hawking

    The 29-person audience, all VIP physicists, was eager to hear his big announcement and could hear him just fine. They knew, from a pre-announcement announcement the previous night, that Hawking was about to explain his solution to a 40-year-old mystery in physics: how information escapes from black holes.

    But while his idea made big headlines, the mere nine minutes of explanation felt vague and confusing to other physicists.

    The iconic image of a black hole. But, of course, we have never really seen a black hole, not even a supermassive black hole. Artist’s concept illustrates a supermassive black hole with millions to billions times the mass of our sun. Credit: NASA/JPL-Caltech

    What is Hawking’s problem?

    Hawking set out to resolve a problem called the “information paradox.” Understanding this snag requires a brush-up on black holes. If light or matter ventures past a boundary around the black hole called the “event horizon,” it is done for: The speed required to overcome the black hole’s gravity and escape is greater than the speed of light [c]. So anything that crosses the event horizon—and any information about what it was in its previous life—stays inside the black hole. Whether you, a Snickers bar, or a whole planet fall in, they all end up the same: as anonymous extra mass piled onto the black hole itself.

    Or at least that used to be the idea. Then, in 1974, Stephen Hawking showed that black holes slowly evaporate. They continuously leak radiation (later named “Hawking radiation”), dwindling away until there’s nothing left, on timescales ranging from a few billion years to much longer than the current age of the universe. But, as theoretical physicist Carlo Rovelli of Aix-Marseille University, who attended the talk, explains, “This creates a problem: Where has all the stuff gone that fell inside? Where is the information about what fell in? It cannot be anymore ‘just inside,’ because the black hole has disappeared. So, where is it? Is it really lost?”
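    For a sense of those timescales, here is a rough Python sketch of the standard evaporation-time estimate, t ≈ 5120πG²M³/(ħc⁴). This is an illustration of Hawking’s published formula, not a calculation from the article:

        # Rough sketch of the standard Hawking evaporation-time estimate,
        # t ~ 5120*pi*G^2*M^3 / (hbar*c^4). Constants are in SI units.
        import math

        G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
        HBAR = 1.055e-34   # reduced Planck constant, J*s
        C = 2.998e8        # speed of light, m/s
        YEAR = 3.156e7     # seconds per year

        def evaporation_time_years(mass_kg):
            """Approximate time for a black hole of the given mass to evaporate."""
            t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
            return t_seconds / YEAR

        print(f"{evaporation_time_years(1e11):.1e}")   # small primordial hole: a few billion years
        print(f"{evaporation_time_years(2e30):.1e}")   # one solar mass: ~1e67 years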

    Quantum mechanics says that information about the stuff can’t be lost. Information can neither be created nor destroyed. But black holes seem to destroy it. But they can’t. But they seem to. That’s the “paradox” part of the information paradox.

    Some scientists think the escaping Hawking radiation carries the information out with it, like a set-free hostage who can tell police about the room he just spent five days in.

    Ideas abound about how that radiation might spill the beans, and Hawking’s new revelation is just one contender. “The situation is not that there is a big problem, and here is the solution,” says Rovelli. “The situation is that there is a big problem, and there are a dozen solutions … none totally convincing, and now we have a new one.”

    Hawking’s big idea

    On August 25, as Hawking sat before the esteemed physicists, his voice played through the room’s speakers. “I propose that the information is stored not in the interior of the black hole, as one might expect, but on its boundary, the event horizon,” he said, “in the form of supertranslations of the horizon.”

    Translation: If you passed over the event horizon, you would leave an imprint on it. That imprint takes the form of, essentially, a hologram called a supertranslation—a two-dimensional representation of your three-dimensional parts—etched into the black hole’s exterior geometry. When Hawking radiation bubbles up, the event horizon leaves a similar imprint on that radiation. It’s like cosmic re-gifting. The Hawking radiation then streams back out into the universe, carrying the imprint, and the encoded information, with it. That code, though, is scrambled: If you fell into a black hole, we could not create your clone from it (sorry). But, cold comfort, your informational essence wouldn’t be eternally lost.

    The idea that information could be stamped onto a black hole’s event horizon was first proposed by Nobel Laureate Gerard ‘t Hooft, and supertranslations—as mathematical ideas—come from the 1960s. We don’t yet know enough about Hawking’s idea to detail how new and different it is.

    The problem with Hawking’s solution

    And that’s part of the problem: According to colleagues, the details of his “solution” feel fuzzy. “Two big questions are where the information from infalling stuff gets deposited, and how that information later gets transferred to stuff leaving the black hole,” says Steven Giddings, a physicist at the University of California, Santa Barbara.

    Those are two big questions—the biggest, most fundamental questions. It’s great that Hawking described the what of his idea, but, in science, the how is much more important. “What we need for a more detailed understanding is a more complete description of the mathematics … to see if they’ve really nailed the answer,” says Giddings. Rovelli agrees, stating, “The picture is very preliminary for the moment.”

    They weren’t the only two left scratching their heads. “In the conference, there were many world-class physicists, including Nobel Prize winners,” says Joe Polchinski, Giddings’ colleague at UC Santa Barbara. “I didn’t perceive much enthusiasm about the new idea. Everybody was interested, of course, but I couldn’t detect anybody that appeared convinced.”

    Polchinski, who has previously science-battled Hawking about black hole paradoxes, also pointed out a problem beyond the idea’s fuzziness: In Hawking’s scenario, information stays on the event horizon. But the information (using the previous example, you yourself) also falls into the black hole, meaning two copies of it would exist. “In quantum mechanics, information can’t be in two places,” Polchinski says, although he points out that Hawking may have found a way to evade this problem.

    Because Hawking’s black-hole revelations (as well as his proclamations about aliens and religion) receive public buzz and papers called “AdS/CFT without holography: A hidden dimension on the CFT side and implications for black-hole entropy” don’t, it may seem that his idea is totally novel. But it’s not. “From what we here understand, his suggestion builds on ideas that people have been tossing around recently,” says Giddings. Rovelli and Polchinski also point out its similarity to ’t Hooft’s 1990s ideas, although Hawking has added “some technical steps.”

    Hawking claims he, and co-conspirators Andrew Strominger of Harvard University and Malcolm Perry of Cambridge University, will leak more information in a paper in late September. If that paper throws around some convincing equations—what black-hole theorists require as evidence—the result could be a big deal. Until then, scientists are waiting to reserve judgment. “For the moment the theory is far too sketchy, in the manner it has been presented,” Rovelli says. “Let me put it this way: The big news is Hawking himself: his persona, his popular fame, the wonderful manner in which he communicates to the public and transmits enthusiasm to the public. This is fantastic and is his mastership. His physics is interesting, as many others’ are.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 9:07 am on September 3, 2015 Permalink | Reply

    From ASU: “ASU instruments on Mars orbiters help scientists probe ancient Mars atmosphere” 

    ASU

    September 2nd, 2015

    Robert Burnham, robert.burnham@asu.edu
    (480) 458-8207
    Mars Space Flight Facility

    Researchers estimating the amount of carbon held in the ground at the largest known carbonate-containing deposit on Mars utilized data from three different NASA Mars orbiters. Each image in this pair covers the same area about 36 miles (58 kilometers) wide in the Nili Fossae plains region of Mars’ northern hemisphere. The tally of carbon content in the rocks of this region is a key piece in solving a puzzle of how the Martian atmosphere has changed over time. Carbon dioxide from the atmosphere on early Mars reacted with surface rocks to form carbonate, thinning the atmosphere.

    The image on the left presents data from the Thermal Emission Imaging System (THEMIS) instrument on NASA’s Mars Odyssey orbiter. The color coding indicates thermal inertia — the property of how quickly a surface material heats up or cools off. Sand, for example (blue hues), cools off quicker after sundown than bedrock (red hues) does. The color coding in the image on the right presents data from the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) instrument on NASA’s Mars Reconnaissance Orbiter. From the brightness at many different wavelengths, CRISM data can indicate what minerals are present on the surface. In the color coding used here, green hues are consistent with carbonate-bearing materials, while brown or yellow hues are olivine-bearing sands and locations with purple hues are basaltic in composition. The gray scale base map is a mosaic of daytime THEMIS infrared images. Annotations point to areas with different surface compositions. The scale bar indicates 20 kilometers (12.4 miles).

    In addition to data from THEMIS and CRISM, researchers estimating the amount of carbon in rocks of the Nili Fossae plains used data from the Thermal Emission Spectrometer instrument on NASA’s Mars Global Surveyor orbiter, which operated from 1997 to 2006, and from two telescopic cameras on Mars Reconnaissance Orbiter: the Context Camera and the High Resolution Imaging Science Experiment.

    Arizona State University, Tempe, provided and operates THEMIS. The Johns Hopkins University Applied Physics Laboratory, Laurel, Maryland, provided and operates CRISM. NASA’s Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter and Mars Odyssey projects for NASA’s Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, built the orbiters and collaborates with JPL to operate them.

    This view combines information from two instruments on NASA’s Mars Reconnaissance Orbiter to map color-coded composition over the shape of the ground in a small portion of the Nili Fossae plains region of Mars’ northern hemisphere. This site is part of the largest known carbonate-rich deposit on Mars. In the color coding used for this map, green indicates a carbonate-rich composition, brown indicates olivine-rich sands, and purple indicates basaltic composition.

    Carbon dioxide from the atmosphere on early Mars reacted with surface rocks to form carbonate, thinning the atmosphere by sequestering the carbon in the rocks. An analysis of the amount of carbon contained in Nili Fossae plains estimated the total at no more than twice the amount of carbon in the modern atmosphere of Mars, which is mostly carbon dioxide. That is much more than in all other known carbonate on Mars, but far short of enough to explain how Mars could have had a thick enough atmosphere to keep surface water from freezing during a period when rivers were cutting extensive valley networks on the Red Planet. Other possible explanations for the change from an era with rivers to dry modern Mars are being investigated.

    This image covers an area approximately 1.4 miles (2.3 kilometers) wide. A scale bar indicates 500 meters (1,640 feet). The full extent of the carbonate-containing deposit in the region is at least as large as Delaware and perhaps as large as Arizona. The color coding is from data acquired by the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM), in observation FRT0000C968 made on Sept. 19, 2008. The base map showing land shapes is from the High Resolution Imaging Science Experiment (HiRISE) camera. It is one product from HiRISE observation ESP_010351_2020, made July 20, 2013. Other products from that observation are online at http://www.uahirise.org/ESP_032728_2020. The Mars Reconnaissance Orbiter has been using CRISM, HiRISE and four other instruments to investigate Mars since 2006.

    The Johns Hopkins University Applied Physics Laboratory, Laurel, Maryland, led the work to build the CRISM instrument and operates CRISM in coordination with an international team of researchers from universities, government and the private sector. HiRISE is operated by the University of Arizona, Tucson, and was built by Ball Aerospace & Technologies Corp., Boulder, Colorado. NASA’s Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter Project for NASA’s Science Mission Directorate, Washington. Lockheed Martin Space Systems, Denver, built the orbiter and collaborates with JPL to operate it.

    Mars was not always the arid Red Planet that we know today. Billions of years ago it was a world with watery environments — but how and why did it change?

    A new analysis of the largest known deposit of carbonate minerals on Mars helps limit the range of possible answers to that question.

    The Martian atmosphere currently is cold and thin — about 1 percent of Earth’s — and almost entirely carbon dioxide. Yet abundant evidence in the form of meandering valley networks suggests that long ago it had flowing rivers that would require both a warmer and denser atmosphere than today. Where did that atmosphere go?

    Carbon dioxide gas can be pulled out of the Martian air and buried in the ground by chemical reactions that form carbonate minerals. Once, many scientists expected to find large deposits of carbonates holding much of Mars’ original atmosphere. Instead, instruments on space missions over the past 20 years have detected only small amounts of carbonates spread widely plus a few localized deposits.

    The instruments searching for Martian carbonate minerals include the mineral-detecting Thermal Emission Spectrometer (TES) on NASA’s Mars Global Surveyor orbiter and the Thermal Emission Imaging System (THEMIS) on NASA’s Mars Odyssey orbiter. THEMIS’ strength lies in measuring and mapping the physical properties of the Martian surface.

    NASA Mars Global Surveyor
    NASA’s Mars Global Surveyor orbiter

    ASU TES on NASA Mars Global Surveyor
    TES

    NASA Mars Odyssey Orbiter
    NASA’s Mars Odyssey orbiter

    ASU THEMIS on NASA's Mars Odyssey orbiter
    Thermal Emission Imaging System (THEMIS)

    Both instruments were designed by Philip Christensen, Regents’ Professor of geological sciences in ASU’s School of Earth and Space Exploration. TES fell silent when NASA lost contact with Mars Global Surveyor in 2006, but THEMIS remains in operation today.

    “We designed these instruments to investigate Martian geologic history, including its atmosphere,” Christensen said. “It’s rewarding to see data from all these instruments on many spacecraft coming together to produce these results.”

    Other instruments involved in the search include the mineral-mapping Compact Reconnaissance Imaging Spectrometer for Mars and two telescopic cameras on NASA’s Mars Reconnaissance Orbiter.

    NASA Mars Reconnaissance Orbiter
    NASA’s Mars Reconnaissance Orbiter

    ASU Compact Reconnaissance Imaging Spectrometer
    Compact Reconnaissance Imaging Spectrometer for Mars

    Big, but not big enough

    By far the largest known carbonate-rich deposit on Mars covers an area at least the size of Delaware, and maybe as large as Arizona, in a location called Nili Fossae. But its quantity of carbonate minerals comes up short for what’s needed to produce a thick atmosphere, according to a new paper just published online in the journal Geology.

    The paper’s lead author is Christopher Edwards, a former graduate student of Christensen’s. He is now with the U.S. Geological Survey in Flagstaff, Arizona. Both TES and THEMIS contributed to the work, he said.

    “The Thermal Emission Spectrometer told us how much Nili has of several kinds of minerals, especially carbonates,” Edwards noted.

    And, he added, “THEMIS played an essential complementary role by showing the physical nature of the rock units at Nili. Were they impact-shattered small rocks and soil? Were they fractured and cemented rocks? Or dunes? THEMIS data let us differentiate these units by composition.”

    Bethany Ehlmann of the California Institute of Technology and NASA’s Jet Propulsion Laboratory is Edwards’ co-author. She said Nili doesn’t measure up to what’s needed. “The biggest carbonate deposit on Mars has, at most, twice as much carbon within it as the current Mars atmosphere.

    “Even if you combined all known carbon reservoirs together,” she explained, “it is still nowhere near enough to sequester the thick atmosphere that has been proposed for the time when there were rivers flowing on the Martian surface.”

    Edwards and Ehlmann estimate that Nili’s carbonate inventory, in fact, falls short by at least a factor of 35. Given the level of detail in orbital surveys, the team thinks it highly unlikely that other large deposits have been overlooked.
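    The arithmetic behind that shortfall is simple enough to sketch. Only the “at most twice” and “factor of 35” figures below come from the article; the rest is arithmetic:

        # Back-of-the-envelope sketch of the carbon-budget argument.
        # All quantities are in units of the carbon held in Mars' current,
        # thin atmosphere.
        deposit_max = 2.0    # Nili Fossae carbonate: at most ~2x today's atmosphere
        shortfall = 35.0     # Edwards & Ehlmann: short by at least this factor

        implied_ancient_atmosphere = deposit_max * shortfall
        print(f"Implied early atmosphere: >= {implied_ancient_atmosphere:.0f}x today's carbon inventory")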

    Atmosphere going, going, gone

    So where did the thick ancient atmosphere go?

    Scientists are looking at two possible explanations. One is that Mars had a much denser atmosphere during its flowing-rivers period, and then lost most of it to outer space from the top of the atmosphere, rather than into minerals and rocks. NASA’s Curiosity Mars rover mission has found evidence for ancient top-of-atmosphere loss, but uncertainty remains about just how long ago this happened. NASA’s MAVEN orbiter, examining rates of change in the outer atmosphere of Mars since late 2014, may help reduce the uncertainty.

    NASA Mars Curiosity Rover
    NASA’s Mars Curiosity Rover

    NASA Mars MAVEN
    NASA’s Mars MAVEN orbiter

    An alternative explanation, favored by Edwards and Ehlmann, is that the original Martian atmosphere had already lost most of its carbon dioxide by the era of rivers and valleys.

    “Maybe the atmosphere wasn’t so thick by the time the valley networks formed,” Edwards suggested. “Instead of Mars that was wet and warm, maybe it was cold and wet with an atmosphere that had already thinned.”

    How warm would it need to have been for the valleys to form? It wouldn’t take much, Edwards said.

    “In most locations, you could have had snow and ice instead of rain. You just have to nudge above the freezing point to get water to thaw and flow occasionally, and that doesn’t require very much atmosphere.”

    The School of Earth and Space Exploration is a unit of ASU’s College of Liberal Arts and Sciences.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ASU is the largest public university by enrollment in the United States. Founded in 1885 as the Territorial Normal School at Tempe, the school underwent a series of changes in name and curriculum. In 1945 it was placed under control of the Arizona Board of Regents and was renamed Arizona State College. A 1958 statewide ballot measure gave the university its present name.
    ASU is classified as a research university with very high research activity (RU/VH) by the Carnegie Classification of Institutions of Higher Education, one of 78 U.S. public universities with that designation. Since 2005 ASU has been ranked among the Top 50 research universities, public and private, in the U.S. based on research output, innovation, development, research expenditures, number of awarded patents and awarded research grant proposals. The Center for Measuring University Performance currently ranks ASU 31st among top U.S. public research universities.

    ASU awards bachelor’s, master’s and doctoral degrees in 16 colleges and schools at five locations: the original Tempe campus, the West campus in northwest Phoenix, the Polytechnic campus in eastern Mesa, the Downtown Phoenix campus and the Colleges at Lake Havasu City. ASU’s “Online campus” offers 41 undergraduate degrees, 37 graduate degrees and 14 graduate or undergraduate certificates, earning ASU a Top 10 rating for Best Online Programs. ASU also offers international academic program partnerships in Mexico, Europe and China. ASU is accredited as a single institution by The Higher Learning Commission.

    ASU Tempe Campus

     
  • richardmitnick 8:07 am on September 3, 2015 Permalink | Reply

    From Nautilus: “Don’t Worry, Smart Machines Will Take Us With Them” 

    Nautilus

    September 3, 2015
    Stephen Hsu


    When it comes to artificial intelligence [AI], we may all be suffering from the fallacy of availability: thinking that creating intelligence is much easier than it is, because we see examples all around us. In a recent poll, machine intelligence experts predicted that computers would gain human-level ability around the year 2050, and superhuman ability less than 30 years after.[1] But, like a tribe on a tropical island littered with World War II debris imagining that the manufacture of aluminum propellers or steel casings would be within their power, our confidence is probably inflated.

    AI can be thought of as a search problem over an effectively infinite, high-dimensional landscape of possible programs. Nature solved this search problem by brute force, effectively performing a huge computation involving trillions of evolving agents of varying information processing capability in a complex environment (the Earth). It took billions of years to go from the first tiny DNA replicators to Homo sapiens. What evolution accomplished required tremendous resources. While silicon-based technologies are increasingly capable of simulating a mammalian or even human brain, we have little idea of how to find the tiny subset of all possible programs running on this hardware that would exhibit intelligent behavior.

    But there is hope. By 2050, there will be another rapidly evolving and advancing intelligence besides that of machines: our own. The cost to sequence a human genome has fallen below $1,000, and powerful methods have been developed to unravel the genetic architecture of complex traits such as human cognitive ability. Technologies already exist which allow genomic selection of embryos during in vitro fertilization—an embryo’s DNA can be sequenced from a single extracted cell. Recent advances such as CRISPR allow highly targeted editing of genomes, and will eventually find their uses in human reproduction.


    The potential for improved human intelligence is enormous. Cognitive ability is influenced by thousands of genetic loci, each of small effect. If all were simultaneously improved, it would be possible to achieve, very roughly, about 100 standard deviations [σ] of improvement, corresponding to an IQ of over 1,000. We can’t imagine what capabilities this level of intelligence represents, but we can be sure it is far beyond our own. Cognitive engineering, via direct edits to embryonic human DNA, will eventually produce individuals who are well beyond all historical figures in cognitive ability. By 2050, this process will likely have begun.
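    Where a number like 100 standard deviations can come from is worth a sketch. In a simple additive model (an illustration with assumed parameters, not the author’s calculation), the maximum gain scales like the square root of the number of causal loci:

        # Toy additive model: N biallelic loci with allele frequency p and
        # equal small effects. The population SD of the trait is
        # sqrt(2*N*p*(1-p)) effect-units; pushing every locus to the favorable
        # state gains 2*N*(1-p) effect-units over the mean. For p = 0.5 the
        # ratio is sqrt(2N). Parameters are assumptions for illustration.
        import math

        N, p = 10_000, 0.5
        gain_in_sds = (2 * N * (1 - p)) / math.sqrt(2 * N * p * (1 - p))
        print(f"Maximum gain: ~{gain_in_sds:.0f} population standard deviations")   # ~141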

    These two threads—smarter people and smarter machines—will inevitably intersect. Just as machines will be much smarter in 2050, we can expect that the humans who design, build, and program them will also be smarter. Naively, one would expect the rate of advance of machine intelligence to outstrip that of biological intelligence. Tinkering with a machine seems easier than modifying a living species, one generation at a time. But advances in genomics—both in our ability to relate complex traits to the underlying genetic codes, and the ability to make direct edits to genomes—will allow rapid advances in biologically-based cognition. Also, once machines reach human levels of intelligence, our ability to tinker starts to be limited by ethical considerations. Rebooting an operating system is one thing, but what about a sentient being with memories and a sense of free will?

    Therefore, the answer to the question “Will AI or genetic modification have the greater impact in the year 2050?” is yes. Considering one without the other neglects an important interaction.

    A titan at teatime: John von Neumann talking to graduate students during afternoon tea. Alfred Eisenstaedt/The LIFE Picture Collection/Getty Images

    It has happened before. It is easy to forget that the computer revolution was led by a handful of geniuses: individuals with truly unusual cognitive ability. Alan Turing and John von Neumann both contributed to the realization of computers whose program is stored in memory and can be modified during execution. This idea appeared originally in the form of the Turing Machine, and was given practical realization in the so-called von Neumann architecture of the first electronic computers, such as the EDVAC.

    The Bombe code-breaking machine Turing devised at Bletchley Park during the Second World War. Photo: Getty Images

    EDVAC

    While this computing design seems natural, even obvious, to us now, it was at the time a significant conceptual leap.

    Turing and von Neumann were special, and far beyond peers of their era. Both played an essential role in the Allied victory in WWII. Turing famously broke the German Enigma codes, but not before conceptualizing the notion of “mechanized thought” in his Turing Machine, which was to become the main theoretical construct in modern computer science. Before the war, von Neumann placed the new quantum theory on a rigorous mathematical foundation. As a frequent visitor to Los Alamos he made contributions to hydrodynamics and computation that were essential to the United States’ nuclear weapons program. His close colleague, the Nobel Laureate Hans A. Bethe, established the singular nature of his abilities, and the range of possibilities for human cognition, when he said “I always thought von Neumann’s brain indicated that he was from another species, an evolution beyond man.”

    Today, we need geniuses like von Neumann and Turing more than ever before. That’s because we may already be running into the genetic limits of intelligence. In a 1983 interview, Noam Chomsky was asked whether genetic barriers to further progress have become obvious in some areas of art and science.[2] He answered:

    You could give an argument that something like this has happened in quite a few fields … I think it has happened in physics and mathematics, for example … In talking to students at MIT, I notice that many of the very brightest ones, who would have gone into physics twenty years ago, are now going into biology. I think part of the reason for this shift is that there are discoveries to be made in biology that are within the range of an intelligent human being. This may not be true in other areas.

    AI research also pushes even very bright humans to their limits. The frontier machine intelligence architecture of the moment uses deep neural nets: multilayered networks of simulated neurons inspired by their biological counterparts. Silicon brains of this kind, running on huge clusters of GPUs (graphics processing units made cheap by research and development and economies of scale in the video game industry), have recently surpassed human performance on a number of narrowly defined tasks, such as image or character recognition. We are learning how to tune deep neural nets using large samples of training data, but the resulting structures are mysterious to us. The theoretical basis for this work is still primitive, and it remains largely an empirical black art. The neural networks researcher and physicist Michael Nielsen puts it this way:[3]

    … in neural networks there are large numbers of parameters and hyper-parameters, and extremely complex interactions between them. In such extraordinarily complex systems it’s exceedingly difficult to establish reliable general statements. Understanding neural networks in their full generality is a problem that, like quantum foundations, tests the limits of the human mind.
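    For readers unfamiliar with the architecture Nielsen describes, here is a minimal sketch of a multilayered network of simulated neurons: an untrained toy forward pass in numpy, not code from the article:

        # Minimal deep-net forward pass: each layer is a linear map followed
        # by a nonlinearity. Real networks add training (backpropagation),
        # far more parameters, and GPU execution; this only shows the structure.
        import numpy as np

        rng = np.random.default_rng(0)
        layer_sizes = [784, 128, 64, 10]   # e.g. image pixels -> class scores

        # Randomly initialized weights and biases (an untrained network)
        weights = [rng.normal(0, 0.1, (m, n))
                   for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        biases = [np.zeros(n) for n in layer_sizes[1:]]

        def forward(x):
            for W, b in zip(weights[:-1], biases[:-1]):
                x = np.maximum(0, x @ W + b)        # ReLU "simulated neurons"
            return x @ weights[-1] + biases[-1]     # final layer: class scores

        scores = forward(rng.random(784))           # one fake "image"
        print(scores.shape)                         # (10,)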

    The detailed inner workings of a complex machine intelligence (or of a biological brain) may turn out to be incomprehensible to our human minds—or at least the human minds of today. While one can imagine a researcher “getting lucky” by stumbling on an architecture or design whose performance surpasses her own capability to understand it, it is hard to imagine systematic improvements without deeper comprehension.

    Minds building minds: Alan Turing (right) at work on an early computer c. 1951. SSPL/Getty Images

    But perhaps we will experience a positive feedback loop: Better human minds invent better machine learning methods, which in turn accelerate our ability to improve human DNA and create even better minds. In my own work, I use methods from machine learning (so-called compressed sensing, or convex optimization in high dimensional geometry) to extract predictive models from genomic data. Thanks to recent advances, we can predict a phase transition in the behavior of these learning algorithms, representing a sudden increase in their effectiveness. We expect this transition to happen within about a decade, when we reach a critical threshold of about 1 million human genomes worth of data. Several entities, including the U.S. government’s Precision Medicine Initiative and the private company Human Longevity Inc. (founded by Craig Venter), are pursuing plans to genotype 1 million individuals or more.
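    As a rough sketch of the flavor of computation involved, here is an L1-regularized regression recovering a sparse effect vector from synthetic data. The sizes and the use of scikit-learn’s Lasso are illustrative assumptions, not the author’s actual pipeline:

        # Compressed-sensing flavor: recover a sparse effect vector from fewer
        # samples than features using L1-regularized (Lasso) regression.
        # Synthetic data only; real genomic prediction is far larger and noisier.
        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)
        n_samples, n_features, n_causal = 500, 2000, 20

        X = rng.standard_normal((n_samples, n_features))       # "genotypes"
        true_beta = np.zeros(n_features)
        true_beta[:n_causal] = rng.standard_normal(n_causal)   # few causal loci
        y = X @ true_beta + 0.1 * rng.standard_normal(n_samples)

        model = Lasso(alpha=0.05).fit(X, y)
        recovered = np.flatnonzero(model.coef_)
        print(f"{len(recovered)} nonzero coefficients recovered")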

    The feedback loop between algorithms and genomes will result in a rich and complex world, with myriad types of intelligences at play: the ordinary human (rapidly losing the ability to comprehend what is going on around them); the enhanced human (the driver of change over the next 100 years, but perhaps eventually surpassed); and all around them vast machine intellects, some alien (evolved completely in silico) and some strangely familiar (hybrids). Rather than the standard science-fiction scenario of relatively unchanged, familiar humans interacting with ever-improving computer minds, we will experience a future with a diversity of both human and machine intelligences. For the first time, sentient beings of many different types will interact collaboratively to create ever greater advances, both through standard forms of communication and through new technologies allowing brain interfaces. We may even see human minds uploaded into cyberspace, with further hybridization to follow in the purely virtual realm. These uploaded minds could combine with artificial algorithms and structures to produce an unknowable but humanlike consciousness. Researchers have recently linked mouse and monkey brains together, allowing the animals to collaborate—via an electronic connection—to solve problems. This is just the beginning of “shared thought.”

    It may seem incredible, or even disturbing, to predict that ordinary humans will lose touch with the most consequential developments on planet Earth, developments that determine the ultimate fate of our civilization and species. Yet consider the early 20th-century development of quantum mechanics. The first physicists studying quantum mechanics in Berlin—men like Albert Einstein and Max Planck—worried that human minds might not be capable of understanding the physics of the atomic realm. Today, no more than a fraction of a percent of the population has a good understanding of quantum physics, although it underlies many of our most important technologies: Some have estimated that 10-30 percent of modern gross domestic product is based on quantum mechanics. In the same way, ordinary humans of the future will come to accept machine intelligence as everyday technological magic, like the flat screen TV or smartphone, but with no deeper understanding of how it is possible.

    New gods will arise, as mysterious and familiar as the old.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 5:17 pm on September 2, 2015 Permalink | Reply

    From AAS: “Witnessing Solar Rejuvenation” 

    AASNOVA

    American Astronomical Society

    2 September 2015
    Susanna Kohler

    The Sun as observed in a blend of three different extreme UV wavelengths by the Solar Dynamics Observatory’s AIA (Atmospheric Imaging Assembly) instrument on October 24, 2014. The sun experienced a sudden rejuvenation in its magnetic field during the second half of 2014. [NASA/SDO/AIA]

    At the end of last year, the Sun’s large-scale magnetic field suddenly strengthened, reaching its highest value in over two decades. Here, Neil Sheeley and Yi-Ming Wang (both of the Naval Research Laboratory) propose an explanation for why this happened and what it predicts for the next solar cycle.

    Magnetic Strengthening

    Until midway through 2014, solar cycle 24 — the current solar cycle — was remarkably quiet. Even at its peak, it averaged only 79 sunspots per year, compared to maximums of up to 190 in recent cycles. Thus it was rather surprising when, toward the end of 2014, the Sun’s large-scale magnetic field underwent a sudden rejuvenation, with its mean field leaping up to its highest values since 1991 and causing unprecedentedly large numbers of coronal loops to collapse inward.

    Yet in spite of the increase we observed in the Sun’s open flux (the magnetic flux leaving the Sun’s atmosphere, measured from Earth), there was not a significant increase in solar activity, as indicated by sunspot number and the rate of coronal mass ejections. This means that the number of sources of magnetic flux didn’t increase — so Sheeley and Wang conclude that flux must instead have been emerging from those sources in a more efficient way! But how?

    WSO open flux and the radial component of the interplanetary magnetic field (measures of the magnetic flux leaving the Sun’s photosphere and heliosphere, respectively), compared to sunspot number (in units of 100 sunspots). A sudden increase in flux is visible after the peak of each of the last four sunspot cycles. [Sheeley & Wang 2015]

    Aligned Activity

    The authors show that the active regions on the solar surface in late 2014 lined up in such a way that the emerging flux was enhanced, forming a strong equatorial dipole field that accounts for the sudden rejuvenation observed.

    Interestingly, this rejuvenation of the Sun’s open flux wasn’t just a one-time thing; similar bursts have occurred shortly after the peak of every sunspot cycle that we have flux measurements for. The authors find that three factors (how the active regions are distributed longitudinally, their sizes, and the contribution of the axisymmetric component of the magnetic field) determine the strength of this rejuvenation. All three of these factors happened to contribute optimally in 2014.

    As a final note, Sheeley and Wang suggest that the current strength of the axisymmetric component of the magnetic field can be used to provide an early indication of how active the next solar cycle might be. Using this method, they predict that solar cycle 25 will be similar to the current cycle in amplitude.

    Citation

    N. R. Sheeley Jr. and Y.-M. Wang 2015 ApJ 809 113. doi:10.1088/0004-637X/809/2/113

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 4:06 pm on September 2, 2015 Permalink | Reply

    From AAS: “Which Galaxies Are the Most Habitable?” 

    AASNOVA

    American Astronomical Society

    31 August 2015
    Susanna Kohler

    The galaxies Maffei 1 (top right) and 2 (bottom left). Maffei 1 is the closest giant elliptical galaxy to the Milky Way. A recent study suggests giant ellipticals may be the most likely galaxies to harbor life. [NASA/JPL-Caltech/UCLA]

    Habitable zones are a hot topic in exoplanet studies: where, around a given star, could a planet exist that supports life? But if you scale this up, you get a much less common question: which type of galaxy is most likely to host complex life in the universe? A team of researchers from the UK believes it has the answer.

    Criteria for Habitability

    Led by Pratika Dayal of the University of Durham, the authors of this study set out to estimate the habitability of a large population of galaxies. The first step in this process is to determine what elements contribute to a galaxy’s habitability. The authors note three primary factors:

    1. Total number of stars
    More stars means more planets!

    2. Metallicity of the stars
    Planets are more likely to form in stellar vicinities with higher metallicities, since planet formation requires elements heavier than hydrogen and helium.

    3. Likelihood of Type II supernovae nearby
    Planets that are located out of range of supernovae have a higher probability of being habitable, since a major dose of cosmic radiation is likely to cause mass extinctions or delay the evolution of complex life. Galaxies’ supernova rates can be estimated from their star formation rates (the two are connected via the initial mass function).

    Lower panel: the number of Earth-like habitable planets (given by the color bar, which shows the log ratio relative to the Milky Way) increases in galaxies with larger stellar mass and lower star formation rates. Upper panel: the larger stellar-mass galaxies tend to be elliptical (blue line) rather than spiral (red line). [Dayal et al. 2015]

    Hospitable Cosmic Giants

    Interestingly, these three conditions have previously been shown to be linked via something termed the “fundamental metallicity relation,” which relates the total stellar masses, metallicities, and star formation rates of galaxies. By using this relation, the authors were able to create predictions for the number of habitable planets in more than 100,000 galaxies in the local universe (cataloged by the Sloan Digital Sky Survey).
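    As a toy illustration of how the three criteria might fold into a single relative score, consider the sketch below. The functional form and constants are invented for illustration; the actual study uses the fundamental metallicity relation fit to SDSS measurements:

        # Toy habitability score: more stars and metals help; a higher star
        # formation rate (hence supernova rate) hurts. All inputs are in
        # Milky Way units and are made-up examples, not values from the paper.
        import math

        def habitability_score(stellar_mass, metallicity, sfr):
            """Relative habitable-planet count for a galaxy (illustrative)."""
            return stellar_mass * metallicity * math.exp(-sfr)

        milky_way = habitability_score(stellar_mass=1.0, metallicity=1.0, sfr=1.0)
        giant_elliptical = habitability_score(stellar_mass=2.0, metallicity=1.5, sfr=0.1)
        print(giant_elliptical / milky_way)   # the quiescent giant scores higher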

    Based on these predictions, the authors find that the galaxies likely to host the largest number of habitable planets are those that have a mass greater than twice that of the Milky Way and star formation rates less than a tenth of that of the Milky Way.

    These galaxies tend to be giant elliptical galaxies, rather than compact spirals like our own galaxy. The authors calculate that the most hospitable galaxies can host up to 10,000 times as many Earth-like planets and 1,000,000 times as many gas giants (which might have habitable moons) as the Milky Way!

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 3:35 pm on September 2, 2015 Permalink | Reply

    From NOVA: “Venom of Aggressive Brazilian Wasp Rips Holes in Cancer Cells” 

    PBS NOVA

    Polybia paulista

    Until a decade ago, Polybia paulista wasn’t well known to anyone other than entomologists and the hapless people it stung in its native Brazil. But then, a number of research groups discovered a series of remarkable qualities all concentrated in the aggressive wasp’s venom.

    One compound in particular has stood out for its antimicrobial and anti-cancer properties. Polybia-MP1, a peptide, or a string of amino acids, is different from most antibacterial peptides in that it’s only toxic to bacteria and not red blood cells. MP1 punches through bacteria’s cell membranes, causing them to die a leaky death. Scientists had also discovered that MP1 was good at inhibiting spreading bladder and prostate cancer cells and could kill leukemia cells, but they didn’t know why it was so toxic only to tumor cells.

    Well, now they think they have an idea. How MP1 kills cancer cells turns out to be very similar to how it kills bacterial cells—by causing them to leak to death. MP1 targets two lipids— phosphatidylserine, or PS, and phosphatidylethanolamine, or PE—that cancer cells display on the outside of their membranes. Here’s Kiona Smith-Strickland, writing for Discover:

    MP1’s destruction of a cancer cell, researchers say, has two stages. First, MP1 bonds to the outer surface of the cell, and then it opens holes or pores in the membrane big enough to let the cell’s contents leak out. PS is crucial for the first part: seven times more MP1 molecules bound to membranes with PS in their outer layer. And PE is crucial for the second: Once the MP1 molecules worked their way into the membrane, they opened pores twenty to thirty times larger than in membranes without PE.

    Even better, healthy cells have neither PS nor PE on the outside of their membranes. Rather, they keep them on the inside, a key difference from cancer cells that shields healthy cells from the damaging effects of MP1. In other words, MP1 could make an ideal chemotherapy.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 3:24 pm on September 2, 2015 Permalink | Reply

    From Berkeley: “CT scan of Earth links deep mantle plumes with volcanic hotspots” 

    UC Berkeley

    September 2, 2015
    Robert Sanders


    Supercomputer simulation of plumes of hot rock rising through the mantle to the surface, where they generate volcanic eruptions that form island chains. Animation by Scott French, NERSC & Berkeley Lab; video by Roxanne Makasdjian and Stephen McNally, UC Berkeley.

    University of California, Berkeley, seismologists have produced for the first time a sharp, three-dimensional scan of Earth’s interior that conclusively connects plumes of hot rock rising through the mantle with surface hotspots that generate volcanic island chains like Hawaii, Samoa and Iceland.

    Essentially a computed tomography, or CT scan, of Earth’s interior, the picture emerged from a supercomputer simulation at the Department of Energy’s National Energy Research Scientific Computing Center (NERSC) at the Lawrence Berkeley National Laboratory.

    While medical CTs employ X-rays to probe the body, the scientists mapped mantle plumes by analyzing the paths of seismic waves bouncing around Earth’s interior after 273 strong earthquakes that shook the globe over the past 20 years.

    Previous attempts to image mantle plumes have detected pockets of hot rock rising in areas where plumes have been proposed, but it was unclear whether they were connected to volcanic hotspots at the surface or the roots of the plumes at the core-mantle boundary 2,900 kilometers (1,800 miles) below the surface.

    The new, high-resolution map of the mantle — the hot rock below Earth’s crust but above the planet’s iron core — not only shows these connections for many hotspots on the planet, but reveals that below about 1,000 kilometers the plumes are between 600 and 1,000 kilometers across, up to five times wider than geophysicists thought. The plumes are likely at least 400 degrees Celsius hotter than surrounding rock.

    “No one has seen before these stark columnar objects that are contiguous all the way from the bottom of the mantle to the upper part of the mantle,” said first author Scott French, a computational scientist at NERSC who recently received his Ph.D. from UC Berkeley.

    Senior author Barbara Romanowicz, a UC Berkeley professor of earth and planetary science, noted that the connections between the lower-mantle plumes and the volcanic hotspots are not direct because the tops of the plumes spread out like the delta of a river as they merge with the less viscous upper mantle rock.

    “These columns are clearly separated in the lower mantle and they go all the way up to about 1,000 kilometers below the surface, but then they start to thin out in the upper part of the mantle, and they meander and deflect,” she said. “So while the tops of the plumes are associated with hotspot volcanoes, they are not always vertically under them.”

    Ancient anchors

    The new picture also shows that the bases of these plumes are anchored at the core-mantle boundary in two huge blobs of hot rock, each about 5,000 kilometers in diameter, that are likely denser than surrounding rock. Romanowicz estimates that those two anchors — directly opposite one another under Africa and the Pacific Ocean — have been in the same spots for 250 million years.

    The 1,800-mile-thick mantle under the Pacific Ocean contains rising plumes of hot rock that fan out at the surface to stationary hotspots, where they generate island chains as Earth’s crust moves due to plate tectonics. Scott French image.

    French and Romanowicz, who also is affiliated with the Institut de Physique du Globe and the Collège de France in Paris, will publish their findings in the Sept. 3 issue of the British journal Nature.

    The Earth is layered like an onion. An exterior crust contains the oceans and continents, while under the crust lies a thick mantle of hot but solid rock 2,900 kilometers thick. Below the mantle is the outer core, composed of liquid, molten iron and nickel, which envelops an inner core of solid iron at the center of the planet.

    Heated by the hot core, the rock in the mantle rises and falls like water gently simmering in a pan, though this convection occurs much more slowly. Seismologists proposed some 30 years ago that stationary plumes of hot rock in the mantle occasionally punched through the crust to produce volcanoes, which, as the crust moved, generated island chains such as the Galapagos, Cape Verde and Canary islands.

    The Hawaiian Islands, for example, consist of 5 million-year-old Kauai to the west but increasingly younger islands to the east, because the Pacific Plate is moving westward. The newest volcano, Loihi, is still growing underwater east of the youngest island in the chain, Hawaii.

    Until now, evidence for the plume and hotspot theory had been circumstantial, and some seismologists argued instead that hotspots are very shallow pools of hot rock feeding magma chambers under volcanoes.

    Romanowicz, who uses seismic waves to study Earth’s interior, had previously worked with French, then a graduate student, on a tomographic model of the upper 800 kilometers of the mantle, which showed periodic hot and cold regions of rock underlying hotspot volcanoes. The new study completes that picture down to the core-mantle boundary.

    Most of the known volcanic hotspots are linked to plumes of hot rock (red) rising from two spots on the boundary between the metal core and rocky mantle 1,800 miles below Earth’s surface.

    She noted that if higher temperature alone were responsible for the rising plumes, they would be only 100-200 kilometers wide, ballooning out only when they approach the surface. The fact that they appear to be five times wider in the lower mantle suggests that they also differ chemically from the surrounding cooler rock.

    This supports models where the material in the plume is a mixture of normal mantle rock and primordial rock from the dense rock anchoring the plume at the core-mantle boundary. In fact, lava emerging from hotspot volcanoes is known to differ chemically and isotopically from lava from other volcanoes, such as those erupting at subduction zones where Earth’s crust dives into the upper mantle.

    The supercomputer analysis did not detect plumes under all hotspot volcanoes, such as those in Yellowstone National Park. The plumes that feed them may be too thin to be detected given the computational limits of the global modeling technique, French said.

    Millions of hours of computer time

    To create a high-resolution CT of Earth, French used very accurate numerical simulations of how seismic waves travel through the mantle, and compared their predictions to the ground motion actually measured by detectors around the globe. Earlier attempts by other researchers often approximated the physics of wave propagation and focused mainly on the arrival times of only certain types of seismic waves, such as the P (pressure) and S (shear) waves, which travel at different speeds. French used numerical simulations to compute all components of the seismic waves, such as their scattering and diffraction, and tweaked the model repeatedly to fit recorded data using a method similar to statistical regression. The final computation required 3 million CPU hours on NERSC’s supercomputers, though parallel computing shrank this to a couple of weeks.
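    The loop described here (simulate waveforms from a model, compare with recordings, update the model) can be sketched in miniature. The toy below inverts for a single wave speed from an idealized pulse, with scipy’s bounded minimizer standing in for the iterative tweaking; the real inversion updates a full 3-D model with accurate wave physics:

        # Toy version of the fit-simulate-update loop: the "model" is one wave
        # speed and the "waveform" an idealized pulse. All values are made up.
        import numpy as np
        from scipy.optimize import minimize_scalar

        t = np.linspace(0, 10, 500)   # recording time, s
        distance = 30.0               # source-to-station distance, km (toy value)

        def simulate(speed):
            """Toy waveform: a pulse arriving at t = distance / speed."""
            arrival = distance / speed
            return np.exp(-2.0 * (t - arrival) ** 2)

        observed = simulate(5.0)      # "data" generated with a true speed of 5 km/s

        def misfit(speed):
            return np.sum((simulate(speed) - observed) ** 2)

        result = minimize_scalar(misfit, bounds=(3.0, 8.0), method="bounded")
        print(round(result.x, 2))     # recovers ~5.0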

    Romanowicz hopes eventually to obtain higher resolution supercomputer images of Earth’s interior, perhaps by zooming in on specific areas, such as that under the Pacific Ocean, or by using new data.

    “Tomography is the most powerful method to get this information, but in the future it will be combined with very sensitive gravity measurements from satellites and maybe electromagnetic sounding, where people do conductivity measurements of the interior,” she said.

    This study was supported by the National Science Foundation (EAR-1417229) and the European Research Council. NERSC is supported by the U.S. Department of Energy Office of Science (DE-AC02-05CH11231).

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Founded in the wake of the gold rush by leaders of the newly established 31st state, the University of California’s flagship campus at Berkeley has become one of the preeminent universities in the world. Its early guiding lights, charged with providing education (both “practical” and “classical”) for the state’s people, gradually established a distinguished faculty (with 22 Nobel laureates to date), a stellar research library, and more than 350 academic programs.

    UC Berkeley Seal

     
  • richardmitnick 3:02 pm on September 2, 2015 Permalink | Reply
    Tags: NASA Landsat

    From Goddard: “Avoiding Rock Bottom: How Landsat Aids Nautical Charting” 

    NASA Goddard Banner
    Goddard Space Flight Center

    Sep. 1, 2015
    Laura Rocchio, NASA’S Goddard Space Flight Center

    Landsat 8 image of Bechevin Bay, the easternmost passageway between the Gulf of Alaska and the Bering Sea. This natural color, pan-sharpened image was acquired on May 14, 2014. Credits: Image processing by Jesse Allen, NASA Earth Observatory

    On the most recent nautical chart of the Beaufort Sea, where the long narrow Tapkaluk Islands of Alaska’s North Slope separate the sea from the shallow Elson Lagoon (Nautical Chart 16081), a massive shoal is immediately noticeable just west of the entrance to the lagoon. On the chart, it looks like a massive blue thumb jutting out into the sea. The National Oceanic and Atmospheric Administration (NOAA) identified this prodigious shoal, 6 nautical miles long and 2 nautical miles wide, using Landsat satellite data.

    NASA Landsat 8
    Landsat 8

    NASA’s and the United States Geological Survey’s Landsat Program is the longest space-based continuous global record of Earth’s surface. The satellite imagery provides valuable information for agriculture, forestry, regional planning, mapping, and global change research.

    In NOAA’s Office of Coast Survey, the Marine Chart Division is responsible for updating the suite of over 1000 nautical charts that keep mariners in U.S. waters safe. Their mandate covers all U.S. territorial waters in the U.S. Exclusive Economic Zone (EEZ), a combined area of 3.4 million square nautical miles that extends 200 nautical miles offshore from the nation’s coastline—the largest EEZ of all nations.

    The field of Satellite Derived Bathymetry (SDB) has been around for nearly a half-century now, but it took the advent of free Landsat data in 2008, the 2013 launch of the more-advanced Landsat 8 satellite, and a shift in thinking about SDB products to reinvigorate the use of satellite data in NOAA’s Marine Chart Division.

    Keeping waterways safe is a massive undertaking

    “There’s been a shift in the way we think,” Lieutenant Anthony Klemm, a NOAA Corps Officer in the Office of Coast Survey’s Marine Chart Division, explains, “In the past, if a measurement wasn’t made by the Army Corps or a NOAA survey ship, we didn’t want to use it, but now we are opening up to other technologies to evaluate the health of our current chart suite.”

    Because of this sea change in thinking, and faced with the daunting job of deciding which charts were most in need of updating, NOAA hydrographers revisited SDB using freely available Landsat data as a viable tool to help them do their jobs.

    Anthony Klemm, a NOAA Corps Officer, in New York Harbor, aboard the NOAA Ship Thomas Jefferson, a hydrographic survey ship based out of Norfolk, VA.
    Credits: NOAA

    “NOAA has now been using Landsat imagery for chart adequacy assessment and mission planning,” Shachak Pe’eri, a Research Professor at the Joint Hydrographic Center at the University of New Hampshire, says.

    The Joint Hydrographic Center, a think-tank of researchers investigating technology and mapping challenges in NOAA’s Office of Coast Survey, realized that Landsat SDB could be an important reconnaissance tool. A single Landsat image is about 100 nautical miles across and affords a wide overview of a coastal area. Maps of SDB can be compared with existing nautical charts. In places where depth patterns do not match, the seafloor may have changed, so those areas are more closely examined. If an area looks shallower than what is presented in the chart and if there is a reasonable amount of vessel traffic or corroborating mariners’ reports in the area, the chart location is tagged as a higher-priority candidate for hydrographic mapping—i.e. sending out a hydrographic ship to make depth measurements using sonar (multi-beam or single-beam).
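    For readers curious what an SDB computation looks like, here is a minimal sketch of one widely used band-ratio algorithm (after Stumpf et al. 2003). The gain and offset are placeholders that would need calibration against known soundings; this is an illustration, not NOAA’s operational code:

        # Band-ratio SDB sketch: over a uniform bottom, the ratio of
        # log-scaled blue and green water reflectances varies roughly
        # linearly with depth. m1 (gain) and m0 (offset) must be calibrated
        # against known soundings; all values below are placeholders.
        import numpy as np

        def sdb_depth(blue, green, m1=60.0, m0=60.0, n=1000.0):
            """Estimate depth from atmospherically corrected water reflectances."""
            return m1 * (np.log(n * blue) / np.log(n * green)) - m0

        blue = np.array([0.012, 0.010, 0.008])    # stand-ins for Landsat 8 band 2
        green = np.array([0.010, 0.009, 0.008])   # stand-ins for Landsat 8 band 3
        print(sdb_depth(blue, green))             # toy depth estimates, in meters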

    NOAA: thinking big about SDB

    Water clarity has been a limiting factor when it comes to SDB. If waters are too turbid (full of sediments that obscure light reflectance from the seafloor), then bathymetric measurements cannot be made.

    The NOAA Ship Fairweather in the Gulf of Alaska with its namesake Mt. Fairweather. Credits: NOAA

    Pe’eri, in a collaborative study with NOAA and the U.S. Coast Guard, has pioneered turbidity mapping as a proxy for bathymetric measurements. In enclosed waterbodies with strong currents, such as bays and sounds, turbid channels show up on Landsat imagery—and these turbid channels illuminate where currents are carving deeper channels that are safe for boat passage.

    In the Arctic, where near-shore changes occur rapidly because of seasonal sedimentation and erosion, new SDB techniques like turbidity mapping are preventing maritime mishaps. Bechevin Bay, the easternmost passageway between the Gulf of Alaska and the Bering Sea, provides fishermen with a shortcut for three ice-free months a year, but the location of sand bars can shift significantly here because of melting ice and shifting sediments. With the help of Landsat SDB turbidity maps, the new locations of these sandbars can be estimated. Recently this has led to the discovery of a new, straighter, and more geologically stable channel.

    Pe’eri’s team has also developed a multi-image method to help separate clear and turbid waters using Landsat data. Techniques such as turbidity mapping will grow increasingly important for navigation planning as warming waters enable more industrial development of the Arctic and set the stage for international shipping routes.

    NOAA’s Marine Chart Division has made Landsat a prominent tool in its charting toolbox—especially Landsat 8.

    Satellite Derived Bathymetry measurements overlaid on a chart of Plymouth Bay in Massachusetts. The red indicates shallow waters. Here, the SDB indicates that the shoaling of Brown’s Bank has shifted since the chart’s creation. Credits: NOAA

    “Landsat 8 is overwhelmingly better,” Pe’eri says, citing the new satellite’s additional cirrus band, which helps him account for atmospheric noise that can undermine accurate SDB, and its improved radiometric resolution (which means more signal, less noise, and higher measurement fidelity). But it’s not just SDB that this innovative office is using. It is also watching traffic patterns through the Automatic Identification System (AIS) and even informal communication from recreational boaters, fishermen, tugboats, and larger vessels; together with bathymetry measurements, these inputs determine which charts are in the most dire need of revision.
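
    A hedged sketch of how the cirrus band helps: pixels that are bright at the cirrus wavelength (Band 9 on Landsat 8, near 1.37 µm) can be masked out before SDB is attempted. The reflectance threshold below is a made-up placeholder; operational atmospheric correction is far more involved.

        import numpy as np

        def mask_cirrus(band_stack, cirrus_reflectance, threshold=0.01):
            """Replace cirrus-contaminated pixels with NaN in every band."""
            contaminated = cirrus_reflectance > threshold
            return {name: np.where(contaminated, np.nan, data)
                    for name, data in band_stack.items()}

        # Toy one-row "scene": the second pixel has detectable cirrus.
        bands = {"blue": np.array([0.10, 0.12]), "green": np.array([0.07, 0.08])}
        cirrus = np.array([0.002, 0.030])
        print(mask_cirrus(bands, cirrus))
        # {'blue': array([0.1, nan]), 'green': array([0.07, nan])}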

    “We’re making charts safer up there,” Klemm says of the recent Beaufort Sea chart revisions, “and that’s so exciting.”

    For more information on the NASA/USGS Landsat program, click here.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NASA’s Goddard Space Flight Center is home to the nation’s largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

    Named for American rocketry pioneer Dr. Robert H. Goddard, the center was established in 1959 as NASA’s first space flight complex. Goddard and its several facilities are critical in carrying out NASA’s missions of space exploration and scientific discovery.

     
  • richardmitnick 2:42 pm on September 2, 2015 Permalink | Reply
    Tags: , , , ,   

    From Clean Energy at WCG: “Summer is a great time to focus on solar energy” 

    2 Sep 2015

    By: The Clean Energy Project team
    Harvard University

    Summary
    A busy summer has led to several advances in the Clean Energy Project: new team members, new database search functionality, new publications and (hopefully) new funding!

    Front: Wendy Woodin, Dr. Ed Pyzer-Knapp, Dipti Jasrasaria. Back: Dr. Steven Lopez.

    The Clean Energy Project (CEP) team has been working very hard this summer and has a number of successes to show for it.

    We are very happy to introduce the latest addition to our team: Dr. Steven Lopez has joined us from UCLA, where he worked for Ken Houk on computational organic chemistry. Steven’s knowledge of chemical reactivity and reaction mechanisms will be invaluable as we strive to deliver libraries of molecules that are synthetically accessible.

    We have been lucky enough to get funding for two undergraduates, Wendy and Dipti, to study with the team over the summer. Dipti has continued her work applying machine learning to crystals, and Wendy has worked on hashing functions. These hashing functions will be deployed in our new database to let users search for molecules similar to their search term; we believe this option will further enhance the utility of the database for the discovery of new organic photovoltaic materials.
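
    The team has not published the details of these hashing functions, but a MinHash-style scheme is one standard way to hash sets of molecular features so that signature matches estimate Tanimoto (Jaccard) similarity. The sketch below is purely illustrative, and the feature tokens are stand-ins for real fingerprints.

        import hashlib

        def minhash_signature(features, num_hashes=64):
            """Compress a feature set into a short signature; the fraction of
            matching positions between two signatures estimates their
            Jaccard (Tanimoto) similarity."""
            return [min(int(hashlib.md5(f"{seed}:{feat}".encode()).hexdigest(), 16)
                        for feat in features)
                    for seed in range(num_hashes)]

        def estimated_similarity(sig_a, sig_b):
            return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

        # Toy "fingerprints": substructure tokens for a query and a candidate.
        query = {"thiophene", "C=C", "aromatic_ring", "S"}
        candidate = {"thiophene", "C=C", "aromatic_ring", "N"}
        print(estimated_similarity(minhash_signature(query),
                                   minhash_signature(candidate)))
        # Approximately 0.6, the true Jaccard similarity of the two sets.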

    We are also very happy to report that Ed and Kewei have had a manuscript accepted by the journal Advanced Functional Materials, one of the most prestigious journals in this area of study, so we are very excited! We will share the details once the manuscript is published.

    Finally, we have just submitted a couple of grant proposals to keep the CEP funded in the years to come. Grant proposals are incredibly important for keeping our project running, so we will keep our fingers crossed for a successful response!

    As ever, we are very appreciative of the computing time you donate; without it, we would be unable to perform the research that goes on in the CEP. So, thank you again…and keep crunching!

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Harvard Clean Energy Project Database contains data and analyses on 2.3 million candidate compounds for organic photovoltaics. It is an open resource designed to give researchers in the field of organic electronics access to promising leads for new material developments.

    Would you like to help find new compounds for organic solar cells? By participating in the Harvard Clean Energy Project you can donate idle computer time on your PC for the discovery and design of new materials. Visit WorldCommunityGrid to get the BOINC software on which the project runs.

    CEP runs on software from BOINC, the Berkeley Open Infrastructure for Network Computing.

     
  • richardmitnick 2:18 pm on September 2, 2015 Permalink | Reply
    Tags: , , ,   

    From JPL: “At Saturn, One of These Rings is not like the Others” 

    September 2, 2015
    Preston Dyches
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-7013
    preston.dyches@jpl.nasa.gov

    Of the countless equinoxes Saturn has seen since the birth of the solar system, this one, captured here in a mosaic of light and dark, is the first witnessed up close by an emissary from Earth … none other than our faithful robotic explorer, Cassini.

    NASA Cassini Spacecraft

    Seen from our planet, the view of Saturn’s rings during equinox is extremely foreshortened and limited. But in orbit around Saturn, Cassini had no such problems. From 20 degrees above the ring plane, Cassini’s wide-angle camera shot 75 exposures in succession for this mosaic showing Saturn, its rings, and a few of its moons a day and a half after exact Saturn equinox, when the sun’s disk was exactly overhead at the planet’s equator.

    The novel illumination geometry that accompanies equinox lowers the sun’s angle to the ring plane, significantly darkens the rings, and causes out-of-plane structures to look anomalously bright and to cast shadows across the rings. These scenes are possible only during the few months before and after Saturn’s equinox, which occurs only once in about 15 Earth years. Before and after equinox, Cassini’s cameras have spotted not only the predictable shadows of some of Saturn’s moons (see PIA11657), but also the shadows of newly revealed vertical structures in the rings themselves (see PIA11665).

    Also at equinox, the shadows of the planet’s expansive rings are compressed into a single, narrow band cast onto the planet as seen in this mosaic. (For an earlier view of the rings’ wide shadows draped high on the northern hemisphere, see PIA09793.)

    The images comprising the mosaic, taken over about eight hours, were extensively processed before being joined together. First, each was re-projected into the same viewing geometry and then digitally processed to make the image “joints” seamless and to remove lens flares, radially extended bright artifacts resulting from light being scattered within the camera optics.

    At this time so close to equinox, illumination of the rings by sunlight reflected off the planet vastly dominates any meager sunlight falling on the rings. Hence, the half of the rings on the left illuminated by planetshine is, before processing, much brighter than the half of the rings on the right. On the right, it is only the vertically extended parts of the rings that catch any substantial sunlight.

    With no enhancement, the rings would be essentially invisible in this mosaic. To improve their visibility, the dark (right) half of the rings has been brightened relative to the brighter (left) half by a factor of three, and then the whole ring system has been brightened by a factor of 20 relative to the planet. So the dark half of the rings is 60 times brighter, and the bright half 20 times brighter, than they would have appeared if the entire system, planet included, could have been captured in a single image.
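
    In code, the two-step stretch amounts to multiplying the dark half by 3 and then the entire ring system by 20, so the dark-half pixels accumulate a factor of 60. This is a toy sketch with invented masks and values, not the Cassini imaging team’s actual pipeline.

        import numpy as np

        def stretch_rings(mosaic, ring_mask, dark_half_mask):
            """Apply cumulative gains: dark half 3 x 20 = 60x, bright half 20x."""
            out = mosaic.astype(float)           # work on a float copy
            out[dark_half_mask] *= 3             # dark half vs. bright half
            out[ring_mask] *= 20                 # whole ring system vs. planet
            return out

        # One planet pixel, one dark-half ring pixel, one bright-half ring pixel.
        pixels = np.array([100.0, 1.0, 3.0])
        ring = np.array([False, True, True])     # dark mask is a subset of ring
        dark = np.array([False, True, False])
        print(stretch_rings(pixels, ring, dark))  # [100.  60.  60.]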

    The moon Janus (179 kilometers, 111 miles across) is on the lower left of this image. Epimetheus (113 kilometers, 70 miles across) appears near the middle bottom. Pandora (81 kilometers, 50 miles across) orbits outside the rings on the right of the image. The small moon Atlas (30 kilometers, 19 miles across) orbits inside the thin F ring on the right of the image. The brightnesses of all the moons, relative to the planet, have been enhanced between 30 and 60 times to make them more easily visible. Other bright specks are background stars. Spokes — ghostly radial markings on the B ring — are visible on the right of the image.

    This view looks toward the northern side of the rings from about 20 degrees above the ring plane.

    The images were taken on Aug. 12, 2009, beginning about 1.25 days after exact equinox, using the red, green and blue spectral filters of the wide-angle camera and were combined to create this natural color view. The images were obtained at a distance of approximately 847,000 kilometers (526,000 miles) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 74 degrees. Image scale is 50 kilometers (31 miles) per pixel.

    The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA’s Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo.

    Fast Facts:

    › A study suggests the particles in one section of Saturn’s rings are denser than elsewhere, possibly due to solid, icy cores.

    › The findings could mean that particular ring is much younger than the rest.

    When the sun set on Saturn’s rings in August 2009, scientists on NASA’s Cassini mission were watching closely. It was the equinox — one of two times in the Saturnian year when the sun illuminates the planet’s enormous ring system edge-on. The event provided an extraordinary opportunity for the orbiting Cassini spacecraft to observe short-lived changes in the rings that reveal details about their nature.

    Like Earth, Saturn is tilted on its axis. Over the course of its 29-year-long orbit, the sun’s rays move from north to south over the planet and its rings, and back again. The changing sunlight causes the temperature of the rings — which are made of trillions of icy particles — to vary from season to season. During equinox, which lasted only a few days, unusual shadows and wavy structures appeared and, as they sat in twilight for this brief period, the rings began to cool.

    In a recent study published in the journal Icarus, a team of Cassini scientists reported that one section of the rings appears to have been running a slight fever during equinox. The higher-than-expected temperature provided a unique window into the interior structure of ring particles not usually available to scientists.

    “For the most part, we can’t learn much about what Saturn’s ring particles are like deeper than 1 millimeter below the surface. But the fact that one part of the rings didn’t cool as expected allowed us to model what they might be like on the inside,” said Ryuji Morishima of NASA’s Jet Propulsion Laboratory, Pasadena, California, who led the study.

    The researchers examined data collected by Cassini’s Composite Infrared Spectrometer during the year around equinox. The instrument essentially took the rings’ temperature as they cooled. The scientists then compared the temperature data with computer models that attempt to describe the properties of ring particles on an individual scale.

    What they found was puzzling. For most of the giant expanse of Saturn’s rings, the models correctly predicted how the rings cooled as they fell into darkness. But one large section — the outermost of the large, main rings, called the A ring — was much warmer than the models predicted. The temperature spike was especially prominent in the middle of the A ring.

    To address this curiosity, Morishima and colleagues performed a detailed investigation of how ring particles with different structures would warm up and cool down during Saturn’s seasons. Previous studies based on Cassini data have shown Saturn’s icy ring particles are fluffy on the outside, like fresh snow. This outer material, called regolith, is created over time as tiny impacts pulverize the surface of each particle. The team’s analysis suggested that the best explanation for the A ring’s equinox temperatures is a ring composed largely of particles roughly 3 feet (1 meter) wide, made mostly of solid ice with only a thin coating of regolith.
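
    To see why a solid core changes the cooling curve, consider a toy model in which temperature relaxes exponentially toward equilibrium, with the time constant standing in for thermal inertia: fluffy regolith has a small time constant and tracks the loss of sunlight quickly, while a solid ice core stays warm through the equinox twilight. All of the numbers below are invented for illustration; the study’s actual models treat radiative transfer and particle layering in far more detail.

        import math

        def cool(t_start, t_eq, tau_hours, elapsed_hours):
            """Temperature after cooling toward equilibrium for elapsed_hours."""
            return t_eq + (t_start - t_eq) * math.exp(-elapsed_hours / tau_hours)

        # Both particle types start at 80 K and relax toward a 60 K equilibrium.
        for label, tau in [("regolith-covered", 5.0), ("solid-ice core", 50.0)]:
            print(label, round(cool(80.0, 60.0, tau, elapsed_hours=24.0), 1))
        # regolith-covered 60.2   (cools almost completely in a day)
        # solid-ice core 72.4     (retains most of its excess warmth)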

    “A high concentration of dense, solid ice chunks in this one region of Saturn’s rings is unexpected,” said Morishima. “Ring particles usually spread out and become evenly distributed on a timescale of about 100 million years.”

    The accumulation of dense ring particles in one place suggests that some process either placed the particles there in the recent geologic past or the particles are somehow being confined there. The researchers suggest a couple of possibilities to explain how this aggregation came to be. A moon may have existed at that location within the past hundred million years or so and was destroyed, perhaps by a giant impact. If so, debris from the breakup might not have had time to diffuse evenly throughout the ring. Alternatively, they posit that small, rubble-pile moonlets could be transporting the dense, icy particles as they migrate within the ring. The moonlets could disperse the icy chunks in the middle A ring as they break up there under the gravitational influence of Saturn and its larger moons.

    “This particular result is fascinating because it suggests that the middle of Saturn’s A ring may be much younger than the rest of the rings,” said Linda Spilker, Cassini project scientist at JPL and a co-author of the study. “Other parts of the rings may be as old as Saturn itself.”

    During its final series of close orbits to Saturn, Cassini will directly measure the mass of the planet’s main rings for the first time, using gravity science. Scientists will use the mass of the rings to place constraints on their age.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge [1], on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.


     