Tagged: Kavli Institute

  • richardmitnick 1:56 pm on April 28, 2017 Permalink | Reply
    Tags: Kavli Institute

    From Kavli: “Delving Into the ‘Dark Universe’ with the Large Synoptic Survey Telescope” 


    The Kavli Foundation

    Two astrophysicists and a theoretical physicist discuss how the Large Synoptic Survey Telescope will probe the nature of dark matter and dark energy by taking an unprecedentedly enormous scan of the sky.

    LSST Camera, built at SLAC

    LSST telescope, currently under construction on Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    AT A MOUNTAINTOP CEREMONY IN CHILE, on April 14th, scientists and diplomats laid the first stone for the Large Synoptic Survey Telescope (LSST). This ambitious international astrophysics project is slated to start scanning the heavens in 2022. When it does, LSST should open up the “dark universe” of dark matter and dark energy—the unseen substance and force, respectively, composing 95 percent of the universe’s mass and energy—as never before.

    The “large” in LSST’s name is a bit of an understatement. The telescope will feature an 8.4-meter diameter mirror and a 3.2 gigapixel camera, the biggest digital camera ever built. The telescope will survey the entire Southern Hemisphere’s sky every few days, hauling in 30 terabytes of data nightly. After just its first month of operations, LSST’s camera will have observed more of the universe than all previous astronomical surveys combined.

    On April 2, 2015, two astrophysicists and a theoretical physicist spoke with The Kavli Foundation about how LSST’s sweeping search for dark matter and dark energy will answer fundamental questions about our universe’s make-up.

    Steven Kahn – is the Director of LSST and the Cassius Lamb Kirk Professor in the Natural Sciences in the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) at Stanford University.

    Sarah Bridle – is a professor of astrophysics in the Extragalactic Astronomy and Cosmology research group of the Jodrell Bank Centre for Astrophysics in the School of Physics and Astronomy at The University of Manchester.

    Hitoshi Murayama – is the Director of the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) at the University of Tokyo and a professor at the Berkeley Center for Theoretical Physics at the University of California, Berkeley.

    The following is an edited transcript of their roundtable discussion. The participants have been provided the opportunity to amend or edit their remarks.

    THE KAVLI FOUNDATION (TKF): Steven, when the LSST takes its first look at the universe seven years from now, why will this be so exciting to you?

    STEVEN KAHN: In terms of how much light it will collect and its field of view, LSST is about ten times bigger than any other survey telescope either planned or existing. This is important because it will allow us to survey a very large part of the sky relatively quickly and to do many repeated observations of every part of the Southern Hemisphere over ten years. By doing this, the LSST will gather information on an enormous number of galaxies. We’ll detect something like 20 billion galaxies.

    SARAH BRIDLE: That’s a hundred times as many as we’re going to get with the current generation of telescopes, so it’s a huge increase. With the data, we’re going to be able to make a three-dimensional map of the dark matter in the universe using gravitational lensing.

    Gravitational Lensing NASA/ESA

    Gravitational microlensing, S. Liebes, Physical Review 133 (1964): B835

    Then we’re going to use that to tell us about how the “clumpiness” of the universe is changing with time, which is going to tell us about dark energy.

    TKF: How does gathering information on billions of galaxies help us learn more about dark energy?

    HITOSHI MURAYAMA: Dark energy is accelerating the expansion of the universe and ripping it apart. The questions we are asking are: Where is the universe going? What is its fate? Is it getting completely ripped apart at some point? Does the universe end? Or does it go on forever? Does the universe slow down at some point? Answering these questions is like trying to understand how quickly the population of a given country is aging. You can't understand the trend of where the country is going just by looking at a small number of people. You have to do a census of the entire population. In a similar way, you need to look at a vast number of galaxies to understand the trend of where the universe is going. We are taking a cosmic census with LSST.

    A diagram explaining the phenomenon of gravitational lensing. Foreground clumps of dark matter in galaxy clusters gravitationally bend the light from background galaxies on its way to Earth. Note that the image is not to scale. Credit: NASA, ESA, L. Calçada

    This phenomenon occurs when foreground matter and dark matter contained in galaxy clusters bend the light from background galaxies—sort of like looking through the bottom of a wine glass. Measuring the amount of the distortion of the background galaxies indirectly reveals the amount of dark matter that has clumped together in the foreground object. Measuring the rate of this dark matter clumping across different eras in the universe’s history speaks to how much dark energy is stretching the universe at given times, thus revealing the mysterious, pervasive force’s strength and properties.

    TKF: The main technique the LSST will use to learn more about dark energy will be gravitational lensing. Dark energy is the mysterious, invisible force that is pushing open and shaping the universe. Can you elaborate on why this technique is important and how LSST will help realize its full potential?

    BRIDLE: It’s extremely difficult to detect the dark energy that seems to be causing our universe to accelerate.

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4-meter Telescope at Cerro Tololo, Chile, which houses DECam

    Through gravitational lensing, however, it's possible to probe it indirectly, by observing how much dark matter is being pulled together by gravity.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    And by looking at how much this dark matter clumps up early and later on in the universe, we can see how much the universe is being stretched apart at different times. With LSST, there will be a huge increase in the number of galaxies that we can detect and observe. LSST will also let us identify how far away the galaxies are. This is important. If we want to see how fast the universe is clumping together at different times, we need to know at what time and how far away we’re looking.

    KAHN: With LSST, we're trying to measure the subtle distortion of the appearance of galaxies caused by clumps of dark matter. We do this by looking for correlations in galaxies' shapes depending on their position with respect to one another. Of course, there's uncertainty associated with that kind of measurement on the relatively small scales of individual galaxies, and the dominant source of that uncertainty is that galaxies have intrinsic shapes—some are spiral-shaped, some are round, and so on, and we are seeing them at different viewing angles, too. Increasing the number of galaxies with LSST makes this a far more statistically powerful, and thus more precise, measurement of the gravitational lensing caused by dark matter and of how dark matter's clumping has changed over the universe's history.

    LSST will also help address something called cosmic variance. This happens when we’re making comparisons of what we see against a statistical prediction of what an ensemble of possible universes might look like. We only live in one universe, so there’s an inherent error associated with how good those statistical predictions are of what our universe should look like when applied to the largest scales of great fields of galaxies. The only way to try and statistically beat that cosmic variance down is to survey as much of the sky as possible, and that’s the other area where LSST is breaking new ground.

    TKF: Will the gravitational lensing observations by LSST be more accurate than anything before?

    KAHN: One of the reasons I personally got motivated to work on LSST was because of the difficulty in making the sort of weak lensing measurements that Sarah described.

    BRIDLE: Typically, telescopes distort the images of galaxies by more than the gravitational lensing effect we are trying to measure. And in order to learn about dark matter and dark energy from gravitational lensing, we've got to not just detect the gravitational lensing signal but measure it to about one-percent accuracy. So we've got to get rid of these effects from the optics in the telescope before we can do anything to learn about cosmology.

    KAHN: A lot of the initial work in this field has been plagued by issues associated with the basic telescopes and cameras used. It was hard to separate out the cosmic signals that people were looking for from spurious effects that were introduced by the instrumentation. LSST is actually the first telescope that will have ever been built with the notion of doing weak lensing in mind. We have taken great care to model in detail the whole system, from the telescope to the camera to the atmosphere that we are looking through, to understand the particular issues in the system that could compromise weak lensing measurements. That approach has been a clear driver in how we design the facility and how we calibrate it. It’s been a big motivation for me personally and for the entire LSST team.

    TKF: As LSST reveals the universe’s past, will it also help us predict the future of the universe?

    MURAYAMA: Yes, it will. Because LSST will survey the sky so quickly and repeatedly, it will show how the universe is changing over time. For example, we will be able to see how a supernova changes from one time period to another. This kind of information should prove extremely useful in deciphering the nature of dark energy, for instance.

    KAHN: This is one way LSST will observe changes in the universe and gather information on dark energy beyond gravitational lensing. In fact, the way the acceleration of the universe by dark energy was first discovered in 1998 was through the measurement of what are called Type Ia supernovae.

    Sag A* NASA Chandra X-Ray Observatory 23 July 2014, the supermassive black hole at the center of the Milky Way

    These are exploding stars where we believe we understand the typical intrinsic brightness of the explosion. Therefore, the apparent brightness of a supernova—how faint the supernova appears when we see it—is a clear measure of how far away the object is. That is because objects that are farther away are dimmer than closer objects. By measuring a population of Type Ia supernovae, we can figure out their true distances from us and how those distances have increased over time. Put those two pieces of information together, and that’s a way of determining the expansion rate of the universe.
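The standard-candle reasoning Kahn describes can be sketched with the usual distance-modulus relation, m − M = 5 log10(d / 10 pc). The numbers below are illustrative, not LSST data: a peak absolute magnitude of about −19.3, a commonly quoted value for Type Ia supernovae, and a hypothetical apparent magnitude of 24.0.

```python
def luminosity_distance_pc(apparent_mag: float, absolute_mag: float) -> float:
    """Invert the distance modulus m - M = 5 * log10(d / 10 pc) to get d in parsecs."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A fainter-than-expected supernova is farther away: with M ~ -19.3 and
# an observed m = 24.0, the implied distance is several gigaparsecs.
d_pc = luminosity_distance_pc(24.0, -19.3)
print(f"{d_pc / 1e9:.2f} Gpc")  # → 4.57 Gpc
```

Repeating this for a whole population of supernovae at different redshifts is what traces the expansion history.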

    This analysis was done for the initial discovery of the accelerating cosmic expansion with a relatively small number of supernovae—just tens. LSST will measure an enormous number of supernovae, something like 250,000 per year. Only a fraction of those will be very well characterized, but that number is still in the tens of thousands per year. That will be very useful for understanding how our universe has evolved.

    TKF: LSST will gather a prodigious amount of data. How will this information be made available to scientists and the public alike for parsing?

    KAHN: Dealing with the enormous size of the database LSST will produce is a challenge. Over its ten-year run, LSST will generate something like a couple hundred petabytes of data, where a petabyte is 10 to the 15th bytes. That's more data, by a lot, than everything that's ever been written in any language in human history.
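As a rough cross-check of these figures (my arithmetic, not the source's): 30 terabytes a night over a ten-year survey already adds up to roughly a hundred petabytes of raw imagery, before any processed data products are counted.

```python
TB = 10 ** 12
PB = 10 ** 15

nightly_bytes = 30 * TB    # nightly data rate quoted earlier in the article
nights = 365 * 10          # ten-year survey, assuming observing every night

raw_total = nightly_bytes * nights
print(raw_total / PB)      # ~110 PB of raw images alone
```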

    The data will be made public to the scientific community largely in the form of catalogs of objects and their properties. But those catalogs can be trillions of lines long. So one of the challenges is not so much how you acquire and store the data, but how you actually find anything in something that big. It's the needle-in-the-haystack problem. That's where advances are needed, because the current techniques we use to query catalogs, to say "find me such and such," don't scale well to data of this size. So a lot of new computer science ideas have to be invoked to make that work.


    “With the data, we’re going to be able to make a three-dimensional map of the dark matter in the universe using gravitational lensing. Then we’re going to use that to tell us about how the “clumpiness” of the universe is changing with time, which is going to tell us about dark energy.” –Sarah Bridle

    MURAYAMA: One thing that we at Kavli IPMU are pursuing right now is a sort of precursor project to LSST called Hyper Suprime-Cam, using the Subaru Telescope.

    NAOJ Subaru Hyper Suprime Camera

    NAOJ/Subaru Telescope at Mauna Kea, Hawaii, USA

    It’s smaller than LSST, but it’s trying to do many of the things that LSST is after, like looking for weak gravitational lensing and trying to understand dark energy. We already are facing the challenge of dealing with a large data set. One aspect we would like to pursue at Kavli IPMU, and of course LSST is already doing it, is to get a lot of people in computer science and statistics involved in this. I believe a new area of statistics will be created by the needs of handling these large data sets. That is the fusion at work: the interdisciplinary aspect of this project. It’s a large astronomy survey that will influence other areas of science.

    TKF: Are any “citizen science” projects envisioned for LSST, like Galaxy Zoo, a website where astronomy buffs classify the shapes of millions of galaxies imaged by the Sloan Digital Sky Survey?

    KAHN: Data will be made available right away. So LSST will in some sense bring the universe home to anybody with a personal computer, who can log on and look at any part of the southern hemisphere’s sky at any given time. So there’s a tremendous potential there to engage the public not only in learning about science, but actually in doing science and interacting directly with the universe.

    We have people involved in LSST who are intimately tied into Galaxy Zoo. We’re looking into how to incorporate citizens and crowdsource the science investigations of LSST. One of these investigations is strong gravitational lensing. Sarah has talked about weak gravitational lensing, which is a very subtle distortion of the appearance of the background galaxies. But it turns out that if you put a galaxy right behind a concentration of dark matter found in a massive foreground galaxy cluster, the distortions can get very significant. You can actually see multiple images of the background galaxy in a single image, bent all the way around the foreground galaxy cluster. The detection of those strong gravitational lenses and the analysis of the light patterns you see within them also yield complementary scientific information about fundamental cosmological parameters. But it requires recognizing what is in fact a strong gravitational lensing event, as well as modeling the distribution of dark matter that gives rise to the strength of that particular lensing. Colleagues of Hitoshi’s and mine have already created a tool to help with this effort, called SpaceWarps (www.spacewarps.org). The tool lets the public look for strong gravitational lenses using data from the Sloan Digital Sky Survey and play around with dark matter modeling to see if they can get something that looks like the real data.


    “Over its ten-year run, LSST will generate something like a couple hundred petabytes of data, where a petabyte is 10 to the 15th bytes. That’s more data, by a lot, than everything that’s ever been written in any language in human history.” –Steven Kahn

    MURAYAMA: This has been incredibly successful. Scientists have developed computer programs to automatically look for these strongly lensed galaxies, but even an algorithm written by the best scientists can still miss some of these strongly lensed objects. Regular citizens, however, often manage to find candidates for strongly lensed galaxies that the computer algorithm has missed. Not only is this great fun for the people who get involved, it can help the science as well, especially with a project as large as LSST.

    TKF: In the hunt for dark energy’s signature on the cosmos, LSST is just one of many current and planned efforts. Sarah, how will LSST observations tie in with the Dark Energy Survey you’re working on, and Hitoshi, how will LSST complement the Hyper Suprime-Cam?

    BRIDLE: So the Dark Energy Survey is going to image one-eighth of the whole sky and have 300 million galaxy images. About two years of data have been taken so far, with about three more years to go. We’ll be doing maps of dark matter and measurements of dark energy. The preparation for LSST that we are doing via DES will be essential.

    MURAYAMA: Hyper Suprime-Cam is similar to the Dark Energy Survey. It’s a nearly billion-pixel camera looking at nearly 10 million galaxies. Following up on the Hyper Suprime-Cam imaging surveys, we would like to measure what we call spectra from a couple million galaxies.

    KAHN: The measurement of spectra as an addition to imaging tells us not only about the structure of matter in the universe but also how much the matter is moving with respect to the overall, accelerating cosmic expansion due to dark energy. Spectra are an additional, very important piece of information in constraining cosmological models.

    MURAYAMA: We will identify spectra with an instrument called the Prime Focus Spectrograph, which is scheduled to start operations in 2017 also on the Subaru telescope.

    NAOJ Subaru Prime Focus Spectrograph

    We will do very deep exposures to get the spectra on some of these interesting objects, such as galaxies where lensing is taking place and supernovae, which will also allow us to do much more precise measurements on dark energy.

    This image from a pilot project, the Deep Lens Survey (DLS), offers an example of what the sky will look like when observed by LSST. The images from LSST will have twice DLS’ depth and resolution, while also covering 50,000 times the area of this particular image, and in six different optical colors. Credit: Deep Lens Survey / UC Davis / NOAO

    Like the Hyper Suprime-Cam, LSST can only do imaging. So I’m hoping when LSST comes online in the 2020s, we will already have the Prime Focus Spectrograph operational, and we will be able to help each other. LSST’s huge amount of data will contain many interesting objects we would like to study with this Prime Focus Spectrograph.

    KAHN: All these dark matter and dark energy telescope projects are very complementary to each other. It’s because of the scientific importance of these really fundamental pressing questions—what is the nature of dark matter and dark energy?—that the various different funding institutions around the world have been eager to invest in such an array of different complementary projects. I think that’s great, and it just shows how important this general problem is.

    TKF: Hitoshi, you mentioned earlier the interdisciplinary approach fostered by LSST and projects like it, and you’ve spoken before about how having different scientific disciplines and perspectives together leads to breakthrough thinking—a major goal of Kavli IPMU. Your primary expertise is in particle physics, but you work on many other areas of physics. Could you describe how observations of the very biggest scales of the dark universe with LSST will inform work on the very smallest, subatomic scales, and vice versa?

    MURAYAMA: It’s really incredible to think about this point. The biggest thing we can observe in the universe has to have something to do with the smallest things we can think of and all the matter we see around us.

    BRIDLE: It is amazing that you can look at the largest scales and find out about the smallest things.

    MURAYAMA: For more than a hundred years, particle physicists have been trying to understand what everything around us is made of. We made huge progress by building a theory called the standard model of particle physics in the 20th century, which is really a milestone of science. Discovering the Higgs boson at the Large Hadron Collider at CERN in 2012 really nailed that the standard model is the right theory about the origin of everything around us. But it turns out that what we see around us is actually making up only five percent of the universe.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    So there is this feeling among particle physicists of “what have we been doing for a hundred years?” We only have five percent of the universe! We still need to understand the remaining 95 percent of the universe, which is dark matter and dark energy. It’s a huge problem and we have no idea what they are really.


    “The biggest thing we can observe in the universe has to have something to do with the smallest things we can think of.” –Hitoshi Murayama

    A way I explain what dark matter is: It’s the mother from whom we got separated at birth. What I mean by this is without dark matter, there’s no structure to the universe—no galaxies, no stars—and we wouldn’t be here. Dark matter, like a mother, is the reason we exist, but we haven’t met her and have never managed to thank her. So that’s the reason why we would like to know who she is, how she came to exist and how she shaped us. That’s the connection between the science of looking for the fundamental constituents of the universe, which is namely what particle physicists are after, and this largest scale of observation done with LSST.

    TKF: Given LSST’s vast vista on the universe, it is fair to expect that the project will turn up the unexpected. Any ideas or speculations on what tracking such a huge portion of the universe might newly reveal?

    KAHN: That’s sort of like asking, “what are the unknown unknowns?” [laughter]

    TKF: Yes—good luck figuring those out!

    KAHN: Let me just say, one of the great things about astrophysics is that we have explicit theoretical predictions we’re trying to test out by taking measurements of the universe. That approach is more akin to many other areas of experimental physics, like searching for the Higgs boson with the Large Hadron Collider, as Hitoshi mentioned earlier.

    CERN/LHC Map

    CERN LHC Tunnel

    LHC at CERN

    But there’s also this wonderful history in astronomy that every time we build a bigger and better facility, we always find all kinds of new things we never envisioned.

    If you go back—unfortunately I’m old enough to remember these days—to the period before the launch of the Hubble Space Telescope, it’s interesting to see what people had thought were going to be the most exciting things to do with Hubble. Many of those things were done and they were definitely exciting. But I think what many people felt was the most exciting was the stuff we didn’t even think to ask about, like the discovery of dark energy Hubble helped make. So I think a lot of us have expectations of similar kinds of discoveries for facilities like LSST. We will make the measurement we’re intending to make, but there will be a whole bunch of other exciting stuff that we never even dreamed of that’ll come for free on top.

    BRIDLE: I’m a cosmologist and I’m very excited about what LSST is going to do for cosmology, but I’m even more excited that it’s going to be taking very, very short 15-second exposures of the sky. LSST is going to be able to discover all these changing, fleeting objects, like the supernovae Hitoshi talked about. It’s a whole new phase of discovery. It’s inevitable we’re going to discover a whole load of new stuff that we’ve never even thought of.

    MURAYAMA: I’m sure there will be surprises!

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    STEM Education Coalition
    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

  • richardmitnick 3:15 pm on November 25, 2016 Permalink | Reply
    Tags: INCITE program, Kavli Institute, Stellar mass loss

    From UC Santa Barbara’s Kavli Institute for Theoretical Physics (KITP): “Stellar Simulators” 

    UC Santa Barbara Name bloc

    UC Santa Barbara


    The Kavli Foundation

    November 22, 2016
    Julie Cohen

    It’s an intricate process through which massive stars lose their gas as they evolve. And a more complete understanding could be just calculations away, if only those calculations didn’t take several millennia to run on normal computers.

    But astrophysicists Matteo Cantiello and Yan-Fei Jiang of UC Santa Barbara’s Kavli Institute for Theoretical Physics (KITP) may find a way around that problem.

    The pair have been awarded 120 million CPU hours over two years on the supercomputer Mira — the sixth-fastest computer in the world — through the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, an initiative of the U.S. Department of Energy Office of Science.

    MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    INCITE aims to accelerate scientific discoveries and technological innovations by awarding, on a competitive basis, time on supercomputers to researchers with large-scale, computationally intensive projects that address “grand challenges” in science and engineering.

    “Access to Mira means that we will be able to run calculations that otherwise would take about 150,000 years to run on our laptops,” said Cantiello, an associate specialist at KITP.

    Cantiello and Jiang will use their supercomputer time to run 3-D simulations of stellar interiors, in particular the outer envelopes of massive stars. Such calculations are an important tool to inform and improve the one-dimensional approximations used in stellar evolution modeling. The researchers aim to unravel the complex physics involved in the interplay among gas, radiation and magnetic fields in such stars — stellar bodies that later in life can explode to form black holes and neutron stars.

    The physicists use the grid-based Athena++ code — which has been carefully extended and tested by Jiang — to solve equations for the gas flow in the presence of magnetic fields (magnetohydrodynamics) and for how photons move in such environments and interact with the gas flow (radiative transfer). The code divides the huge calculations into small pieces that are sent to many different CPUs and are solved in parallel. With a staggering number of CPUs — 786,432 to be precise — Mira speeds up the process tremendously.
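A quick back-of-the-envelope check (my own arithmetic, not from the article): spreading the 120 million CPU-hour award across Mira's 786,432 cores corresponds to roughly a week of full-machine wall time.

```python
cpu_hours = 120_000_000     # the two-year INCITE award quoted above
mira_cores = 786_432        # Mira's CPU count quoted above

# If the whole machine ran the project at once, the award would last:
full_machine_hours = cpu_hours / mira_cores
print(full_machine_hours)   # ~152.6 hours, i.e. under a week of wall time
```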

    This research addresses an increasingly important problem: understanding the structure of massive stars and the nature of the process that makes them lose mass as they evolve. This includes both relatively steady winds and dramatic episodic mass loss eruptions.

    Called stellar mass loss, this process has a decisive effect on the final fate of these objects. The type of supernova explosion that these stars undergo, as well as the type of remnants they leave behind (neutron stars, black holes or even no remnant at all), are intimately tied to their mass loss.

    The study is particularly relevant in light of the recent detection of gravitational waves from LIGO (Laser Interferometer Gravitational-Wave Observatory). The discovery demonstrated the existence of stellar mass black holes orbiting so close to each other that eventually they can merge and produce the observed gravitational waves.

    “Understanding how these black hole binary systems formed in the first place requires a better understanding of the structure and mass loss of their stellar progenitors,” explained Jiang, a postdoctoral fellow at KITP.

    The implications of the work Cantiello and Jiang will perform on Mira also extend to broader fields of stellar evolution and galaxy formation, among others.

    See the full UCSB article here.
    See the full Kavli article here.


    UC Santa Barbara Seal

    The University of California, Santa Barbara (commonly referred to as UC Santa Barbara or UCSB) is a public research university and one of the 10 general campuses of the University of California system. Founded in 1891 as an independent teachers’ college, UCSB joined the University of California system in 1944 and is the third-oldest general-education campus in the system. The university is a comprehensive doctoral university and is organized into five colleges offering 87 undergraduate degrees and 55 graduate degrees. In 2012, UCSB was ranked 41st among “National Universities” and 10th among public universities by U.S. News & World Report. UCSB houses twelve national research centers, including the renowned Kavli Institute for Theoretical Physics.

  • richardmitnick 6:18 am on July 19, 2016 Permalink | Reply
    Tags: Delft, Kavli Institute, Smallest Hard Disk To Date Writes Information Atom By Atom   

    From Kavli: “Smallest Hard Disk To Date Writes Information Atom By Atom” 


    The Kavli Foundation

    July 18, 2016

    Credit: Ottelab/TUDelft

    Every day, modern society creates more than a billion gigabytes of new data. To store all this data, it is increasingly important that each single bit occupy as little space as possible. A team of scientists at the Kavli Institute of Nanoscience at Delft University of Technology managed to push this reduction to the ultimate limit: they built a memory of 1 kilobyte (8,000 bits) in which each bit is represented by the position of one single chlorine atom. “In theory, this storage density would allow all books ever created by humans to be written on a single postage stamp,” says lead scientist Sander Otte. They reached a storage density of 500 terabits per square inch (Tbpsi), 500 times better than the best commercial hard disk currently available. His team reports on this memory in Nature Nanotechnology on Monday, July 18.
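A rough consistency check of the quoted density (my own arithmetic): dividing the memory's 8,000 bits by the 96 nm × 126 nm STM scan area reported for it gives a figure of the same order as the quoted 500 Tbpsi. The shortfall is presumably because the scan area includes the markers and margins rather than only the bit cells, though that interpretation is my assumption.

```python
bits = 8_000                       # 1 kB memory
area_m2 = 96e-9 * 126e-9           # reported scan area, including overhead
inch_m = 0.0254                    # meters per inch

bits_per_sq_inch = bits / area_m2 * inch_m ** 2
print(bits_per_sq_inch / 1e12)     # ~427 terabits per square inch
```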

    STM scan (96 nm wide, 126 nm tall) of the 1 kB memory, written to a section of Feynman’s lecture There’s Plenty of Room at the Bottom (with text markup). (Credit: Ottelab/TUDelft)


    In 1959, physicist Richard Feynman challenged his colleagues to engineer the world at the smallest possible scale. In his famous lecture There’s Plenty of Room at the Bottom, he speculated that if we had a platform allowing us to arrange individual atoms in an exact orderly pattern, it would be possible to store one piece of information per atom. To honor the visionary Feynman, Otte and his team now coded a section of Feynman’s lecture on an area 100 nanometers wide.

    Sliding puzzle

    The team used a scanning tunneling microscope (STM), in which a sharp needle probes the atoms of a surface, one by one. With this probe, scientists can not only see the atoms but also push them around. “You could compare it to a sliding puzzle,” Otte explains. “Every bit consists of two positions on a surface of copper atoms, and one chlorine atom that we can slide back and forth between these two positions. If the chlorine atom is in the top position, there is a hole beneath it — we call this a 1. If the hole is in the top position and the chlorine atom is therefore on the bottom, then the bit is a 0.” Because the chlorine atoms are surrounded by other chlorine atoms, except near the holes, they keep each other in place. That is why this method with holes is much more stable than methods with loose atoms, and more suitable for data storage.

    Explanation of the bit logic and the atomic markers. Credit: Ottelab/TUDelft


    The researchers from Delft organized their memory in blocks of 8 bytes (64 bits). Each block has a marker, made of the same type of ‘holes’ as in the raster of chlorine atoms. Inspired by the pixelated square barcodes (QR codes) often used to scan tickets for airplanes and concerts, these markers work like miniature QR codes that carry information about the precise location of the block on the copper layer. The code will also indicate if a block is damaged, for instance due to some local contaminant or an error in the surface. This allows the memory to be scaled up easily to very large sizes, even if the copper surface is not entirely perfect.
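The block-and-marker scheme described above can be sketched in software. The following toy model is purely illustrative (the names `Block` and `to_blocks` are invented here, and the real device encodes everything in chlorine-vacancy patterns, not Python objects): data is split into 8-byte blocks, each carrying its grid position and a damage flag, with each bit standing for an atom-on-top (1) or hole-on-top (0) site.

```python
# Toy sketch of the described memory layout: 64-bit blocks, each with a
# marker recording its position on the surface and whether the patch is
# usable. Illustrative only; not the team's actual encoding.
from dataclasses import dataclass

@dataclass
class Block:
    row: int        # block's position in the grid on the copper surface
    col: int
    damaged: bool   # marker can flag an unusable patch of surface
    bits: str       # '1' = chlorine atom on top position, '0' = hole on top

def to_blocks(data: bytes, cols: int = 8) -> list[Block]:
    """Chunk data into addressed 8-byte (64-bit) blocks."""
    blocks = []
    for i in range(0, len(data), 8):
        chunk = data[i:i + 8]
        bits = "".join(f"{byte:08b}" for byte in chunk)
        n = i // 8
        blocks.append(Block(row=n // cols, col=n % cols, damaged=False, bits=bits))
    return blocks

blocks = to_blocks(b"There's Plenty of Room at the Bottom, " * 2)
print(len(blocks), blocks[0].bits[:8])
```

Because every block carries its own address, a reader can skip blocks flagged as damaged and still reassemble the data, which is the scaling property the article describes.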

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.


  • richardmitnick 11:50 am on June 9, 2016 Permalink | Reply
    Tags: , Kavli Institute, , Tarantula Toxins Offer Key Insights Into Neuroscience of Pain   

    From Kavli: “Tarantula Toxins Offer Key Insights Into Neuroscience of Pain” 


    The Kavli Foundation

    Nicholas Weiler, UCSF June 06, 2016

    Toxins Extracted from Ornamental Baboon Tarantula May Be Used as Tools to Study Disorders Ranging from Irritable Bowel Syndrome to Epilepsy.

    A Heteroscodra maculata, a West African tarantula.

    When your dentist injects lidocaine into your gums, the drug blocks the pain of the oncoming drill, but it also blocks all other sensation – leaving your mouth feeling numb and swollen. What if there were a drug that could specifically block pain, but leave the rest of your sensations alone? In order to do this, you would need to find a way to control the cohort of nerve fibers that transmit the specific type of pain you would like to block.

    A research team led by UC San Francisco scientists has discovered molecules that may help researchers do just that: two toxins isolated from the venom of Heteroscodra maculata, a West African tarantula the size of your hand (commonly referred to as the “ornamental baboon” or “Togo starburst” tarantula). This spider’s massive fangs deliver a poison that causes excruciating pain in part by triggering a specific kind of sodium channel within A-delta nerve fibers, according to the new research.

    The study was led by researchers in the lab of David Julius, PhD, Chair of the Department of Physiology at UCSF, and was published June 6, 2016 in the journal Nature.

    The researchers are excited about this finding for two equally important reasons: it opens a new chapter in our understanding of pain, and the new toxins can now be used as a highly selective tool for manipulating this type of sodium channel, which has also been implicated in neurological disorders unrelated to pain, from epilepsy to autism to Alzheimer’s disease.

    “It’s a good problem to have,” said Jeremiah Osteen, PhD, the postdoctoral fellow in Julius’s group who led the research team. “We didn’t know which of the two findings we should be more excited about.”

    Poisonous Creatures Reveal Tools for Pain, Neurological Research

    Julius’s lab – which is renowned for the discovery and characterization of the so-called “wasabi receptor” – has recently been working to identify new pain pathways by screening more than a hundred different venoms from spiders, scorpions, and centipedes — sourced from the collection of co-author Glenn F. King, PhD, of the University of Queensland in Australia — all of which have evolved chemical defenses that target the biology of animals’ pain nerves.

    “There are dozens to hundreds of different active peptides in each animal’s venom,” said Julius, who is a member of the Kavli Institute for Fundamental Neuroscience at UCSF. “The deeper you look the more toxins there seem to be.”

    The Togo starburst tarantula’s venom struck them as being particularly interesting because it appeared to activate a particular type of sodium channel within sensory nerves that was not a part of known pain pathways.

    To identify which of the dozens of chemicals that made up the tarantula’s venom were specifically targeting these channels, the researchers separated the venom and applied the components one-by-one to rodent sensory neurons in a lab dish. They found two peptide molecules that specifically and powerfully activated these sensory nerves, and experiments with lab-synthesized versions of the same molecules confirmed that these chemicals could activate pain-sensing neurons on their own.

    Experiments with an array of different drugs that block candidate receptor molecules demonstrated that the two toxins specifically bind to the Nav1.1 sodium channel, and further experiments demonstrated that this particular receptor is indeed found on A-delta nerves in mice.

    The accepted notion is that A-delta fibers may convey the sharp, immediate shock of a burn or a cut, ahead of the burning throb conveyed by slower C fibers. The newly discovered tarantula peptides allowed the researchers to isolate A-delta fibers in mice, and show that they also appear to play a role in touch hypersensitivity – when normally innocuous light touch causes discomfort – a type of pain common to diseases like shingles and many chronic pain syndromes.

    Nine Subtly Different Voltage-Sensitive Sodium Channels

    Additional experiments also implicated heightened touch sensitivity of Nav1.1-expressing nerves in a mouse model of irritable bowel syndrome, suggesting these nerves, and this channel, may play a role in the chronic discomfort such patients experience.

    The pharmacological aspect of the research is also exciting for researchers because the nine subtly different voltage-sensitive sodium channels that are critical for nervous system function are extremely hard to manipulate individually. Researchers have been on a decades-long quest to find selective drugs for each subtype, so identifying two in one spider is a valuable find.

    “These channels are incredibly hard to identify drugs for because the different subtypes are closely related, making it difficult to identify drugs or other agents that act on one subtype and not another,” Julius said. “These toxins provide unique tools to start understanding exactly what this particular subtype, Nav1.1, does in terms of pain sensation.”

    The Nav1.1 subtype in particular has been implicated in the development of diseases including epilepsy, autism, and Alzheimer’s disease, and the researchers hope that in addition to helping scientists understand the biology of pain, the new discovery will lead to the development of new drugs to target these diseases.

    “These spiders had millions of years of evolution to come up with these potent and specific toxins,” Osteen said. “They’re tools one might be hard pressed to design as well in the lab.”

    Additional UCSF authors on the paper were Joshua J. Emrick, Chuchu Zhang, Xidao Wang, PhD, and Allan I. Basbaum, PhD. Please see the paper online for a full list of authors and their contributions.


  • richardmitnick 7:38 am on May 21, 2016 Permalink | Reply
    Tags: , , Gold production, Kavli Institute, ,   

    From Kavli: “Galactic ‘Gold Mine’ Explains the Origin of Nature’s Heaviest Elements” 


    The Kavli Foundation

    Adam Hadhazy, Spring 2016

    Neutron star merger depicted. Credit: NASA/Goddard

    A unique galaxy loaded with hard-to-produce, heavy elements sheds light on stellar histories and galactic evolution.

    RESEARCHERS HAVE SOLVED a 60-year-old mystery regarding the origin of the heaviest elements in nature, conveyed in the faint starlight from a distant dwarf galaxy.

    Most of the chemical elements, composing everything from planets to paramecia, are forged by the nuclear furnaces in stars like the Sun. But the cosmic wellspring for a certain set of heavy, often valuable elements like gold, silver, lead and uranium, has long evaded scientists.

    Astronomers studying a galaxy called Reticulum II have just discovered that its stars contain whopping amounts of these metals—collectively known as “r-process” elements (See “What is the R-Process?”).

    Reticulum II galaxy. Credit: Dark Energy Survey, DECam, CTIO/Blanco Telescope, Cerro Tololo, Chile

    Of the 10 dwarf galaxies that have been similarly studied so far, only Reticulum II bears such strong chemical signatures. The finding suggests some unusual event took place billions of years ago that created ample amounts of heavy elements and then strewed them throughout the galaxy’s reservoir of gas and dust. This r-process-enriched material then went on to form Reticulum II’s standout stars.

    Based on the new study*, from a team of researchers at the Kavli Institute at the Massachusetts Institute of Technology, the unusual event in Reticulum II was likely the collision of two ultra-dense objects called neutron stars. Scientists have hypothesized for decades that these collisions could serve as a primary source for r-process elements, yet the idea had lacked solid observational evidence. Now armed with this information, scientists can further hope to retrace the histories of galaxies based on the contents of their stars, in effect conducting “stellar archeology.”

    The Kavli Foundation recently spoke with three astrophysicists about how this discovery can unlock clues about galactic evolution as well as the abundances of certain elements on Earth we use for everything from jewelry-making to nuclear power generation. The participants were:

    Alexander Ji – is a graduate student in physics at the Massachusetts Institute of Technology (MIT) and a member of the MIT Kavli Institute for Astrophysics and Space Research (MKI). He is lead author of a paper in Nature describing this discovery.

    Anna Frebel – is the Silverman Family Career Development Assistant Professor in the Department of Physics at MIT and also a member of MKI. Frebel is Ji’s advisor and coauthored the Nature paper. Her work delves into the chemical and physical conditions of the early universe as conveyed by the oldest stars.

    Enrico Ramirez-Ruiz – is a Professor of Astronomy and Astrophysics at the University of California, Santa Cruz. His research explores violent events in the universe, including the mergers of neutron stars and their role in generating r-process elements.

    The following is an edited transcript of their roundtable discussion. The participants have been provided the opportunity to amend or edit their remarks.

    THE KAVLI FOUNDATION: What was your reaction to discovering an abundance of heavy elements in the stars in the galaxy called Reticulum II?

    ALEX JI: I had spent some time looking at stars in other galaxies like this, and in every one of those, the content of this type of element – which we call r-process elements – was very low. So we went into this whole project thinking we would get very low detections as well with this galaxy. When we read off the r-process content of that first star in our telescope, it just looked wrong, like it could not have come out of this galaxy! I spent a long time making sure the telescope was pointed at the right star. Then I called Anna—actually, I had to wake her up, it was 3 A.M.—and we started doing instrument checks to make sure we were looking at the right thing. It turns out we were.

    ANNA FREBEL: It was quite funny, because usually when I get a call in the middle of the night from someone at the telescope, it means something really bad has happened! [Laughter] In this case, we were all super-excited because Alex had found something in the data that was really unexpected and also was a smoking gun. We pretty quickly confirmed that at least that first star he was looking at really had all these heavy elements in rather large quantities.

    Then another star showed the same kind of signature. I was like, “Oh my god—we’ve hit the lottery . . . twice!” We would have been happy walking away with just one awesome star, and then it turned into two, then into three, and four, five and so forth. The universe had thrown us a really big bone!

    ENRICO RAMIREZ-RUIZ: I’ve been working on neutron star mergers for a while, so I was extremely excited to see Alex and Anna’s results. Their study is indeed a smoking gun that exotic neutron star mergers were occurring very early in the history of this particular dwarf galaxy, and for that matter likely in many other small galaxies. Neutron star mergers are therefore probably responsible for the bulk of the precious substances we call r-process elements throughout the universe.


    An artist’s conception of a supernova forging heavy elements. Credit: Supernova illustration: Akihiro Ikeshita/Particle CG: Naotsugu Mikami (NAOJ)

    What Is the R-Process?

    The r-process stands for “rapid neutron-capture process.” This phenomenon, first theoretically described by nuclear physicists in 1957, creates elements in nature that are heavier than iron. In the supernova explosions of massive stars and in neutron star collisions, tremendous numbers of freely moving neutrons bind with iron atoms. As more and more neutrons pile up in the atom’s nucleus, the neutrons undergo a radioactive decay, turning into protons. Accordingly, new, heavier elements are formed, because elements are differentiated by the number of protons in their nucleus. As its name implies, this process must occur rapidly in order to build up to very heavy, neutron-rich nuclei that then decay into heavy elements, such as uranium, which has 92 protons compared to iron’s 26. While a theoretical understanding of the r-process is sound, scientists have debated over the astrophysical conditions and sites where the process can actually occur.


    TKF: Why has the provenance of these elements been such a tough nut to crack?

    FREBEL: The question of the cosmic origin of all of the elements has been a longstanding problem. The precursor question was, “Why do stars shine?” Scientists tackled that in the early part of the last century and solved the mystery only around 1950. We found out that stars do nuclear fusion in their cores, generating heat and light, and as part of that process, heavier elements are created. That led to a phase where a lot of people worked on figuring out how all the elements are made.

    Understanding how heavy r-process elements are formed is one of the hardest problems in nuclear physics. The production of these really heavy elements takes so much energy that it’s nearly impossible to make them experimentally, even with current particle accelerators and apparatuses. The process for making them just doesn’t work on Earth. So we have had to use the stars and the objects in the cosmos as our lab.

    JI: As Anna just mentioned, we have been mostly stuck with astronomy, trying to measure what could have made all of these elements out in the stars. But it’s also very difficult to find stars that give you any information about the r-process.

    RAMIREZ-RUIZ: Right, it is very difficult to see these elements shine when they’re created in the universe because they are very rare. For example, gold is only one part in a billion in the Sun. So even though the necessary physical conditions needed to make these elements were clear to physicists more than 50 years ago, it was a mystery as to what sort of objects and astrophysics would provide these conditions, because we couldn’t see r-process elements being produced in explosion remnants in our own galaxy.

    Two competing theories did emerge, which are that these elements are produced by supernovae and neutron star mergers. These phenomena are very different in terms of how often they should happen and in the amount of these elements they should theoretically produce. Just to give you an example, the explosion of a star with more than eight times the mass of the Sun is thought to produce about a Moon’s mass-worth of gold. A neutron star merger, however, is thought to produce a Jupiter’s mass-worth of gold. That’s over 25,000 times more! So just one neutron star merger can provide the gold we would expect to find in about six million to 10 million stars.
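The "over 25,000 times" figure above follows directly from standard values for the masses of Jupiter and the Moon, which can be checked in a line or two:

```python
# Quick check of the "over 25,000 times more" comparison between a
# Jupiter-mass and a Moon-mass of gold, using standard mass values.
m_jupiter = 1.898e27   # kg
m_moon = 7.342e22      # kg

ratio = m_jupiter / m_moon
print(f"{ratio:,.0f}")
```

The ratio comes out just under 26,000, consistent with the statement in the interview.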

    Alex and Anna’s observations are so unbelievably useful because they really show that the phenomenon which created these elements is something rare, but that produces a lot of these elements, as a neutron star merger should.

    FREBEL: It took 60-something years of work to figure this out, and a variety of astronomers—observers as well as theorists—have all put in their share. That’s exactly what we and Enrico are continuing to do.

    TKF: Enrico, you study the ionized gas called plasma that composes stars. How is the material in neutron stars different than the plasma in run-of-the-mill stars like the Sun, and how does this provide the raw ingredients for making r-process elements?

    RAMIREZ-RUIZ: Neutron stars are only about the size of San Francisco Bay, near where I live, yet they pack in as much mass as the Sun—about 330,000 times the mass of the Earth. Neutron stars are the densest objects in the universe. A neutron star the size of a Starbucks cup would weigh as much as Mount Everest! We call them neutron stars because they are neutron-rich, and that’s a key aspect for making r-process elements, as I’ll let Alex and Anna explain.
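The coffee-cup comparison can be roughly checked with typical textbook values. The numbers below are assumptions for illustration (a 1.4-solar-mass neutron star of 10 km radius, a 350 mL cup, and a commonly quoted order-of-magnitude estimate of about 1.6 × 10¹⁴ kg for Mount Everest), not figures from the article:

```python
import math

# Rough check of the "Starbucks cup of neutron star = Mount Everest"
# comparison. All input values are assumed typical figures, not data
# from the study.
m_star = 1.4 * 1.989e30          # kg, a typical neutron star mass
r_star = 10e3                    # m, a typical neutron star radius
density = m_star / ((4 / 3) * math.pi * r_star ** 3)   # kg per cubic metre

cup_volume = 350e-6              # m^3, roughly a large coffee cup
cup_mass = density * cup_volume
print(f"{cup_mass:.1e} kg")
```

The result is a few times 10¹⁴ kg, the same order of magnitude as estimates of Everest's mass, so the analogy holds up.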

    JI: So the nuclear fusion in stars can only make the elements up to iron. That’s because iron is the most stable nucleus. If you try to fuse two things to make elements heavier than iron, it actually takes more energy than the fusion reaction itself releases. A neutron that gets close enough to this dense iron nucleus can join it thanks to one of the fundamental forces of nature, the strong force, which binds protons and neutrons together.

    You can keep increasing the size of this nucleus by adding more neutrons, but there’s a trade-off. That nucleus will undergo a radioactive decay called a beta decay. Specifically, one of those added neutrons will spontaneously release some energy and turn into a proton. The r-process is what happens when you capture neutrons faster than the beta decays happen, and in that way you can build up to heavier nuclei.

    FREBEL: This process can only happen when you have lots and lots of free neutrons outside of an atomic nucleus, and that’s actually a difficult thing to do, because neutrons only survive for about 15 minutes before they decay into a proton. In other words, almost as soon as you have free neutrons, they just disappear. So it’s really hard to find places where there are even free neutrons to undergo this neutron capture. As far back as the 1930s, neutron stars had been postulated as something that could exist, and it wasn’t until the late 1960s that we knew they were real.
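The race between neutron capture and beta decay that Ji and Frebel describe can be caricatured in a few lines of code. This is a toy model with made-up probabilities, not nuclear physics (real r-process network calculations track thousands of nuclides with measured and computed rates); it only illustrates why "rapid" capture is what lets nuclei grow heavy:

```python
import random

# Toy model of the capture-vs-beta-decay race. At each step a nucleus
# either captures a neutron (mass number +1) or beta-decays (a neutron
# turns into a proton; mass number unchanged). Probabilities are
# invented purely for illustration.
def grow_nucleus(p_capture: float, steps: int, seed: int = 0) -> int:
    rng = random.Random(seed)
    mass, protons = 56, 26          # start from iron-56, as in the article
    for _ in range(steps):
        if rng.random() < p_capture:
            mass += 1               # neutron capture wins this round
        else:
            protons += 1            # beta decay: neutron -> proton
    return mass

slow = grow_nucleus(p_capture=0.2, steps=500)   # decays usually win
fast = grow_nucleus(p_capture=0.9, steps=500)   # "rapid" capture wins
print(slow, fast)
```

When capture outpaces decay, the nucleus climbs far up the mass scale before decaying, which is exactly the regime that builds elements like gold and uranium.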

    RAMIREZ-RUIZ: As we learned more about neutron stars, we found out that about two percent of them have companion stars, and a very small fraction have another neutron star orbiting around them. If the neutron stars are close enough, they will merge within several billion years or less because they produce gravitational waves as they spin around each other. These waves simply carry off energy and angular momentum, so the stars get closer and closer, and eventually they touch each other.

    TKF: What happens to these heavy elements once two neutron stars collide?

    RAMIREZ-RUIZ: As these neutron stars come together, the stars eject some material in their tidal tails into space at very close to the speed of light. So the atoms of these elements are moving very fast when they are first formed. By the time the ambient gas and dust in the galaxy is able to slow these elements down, they have probably mixed with about a thousand Sun-masses worth of material, enriching it atom-by-atom.

    FREBEL: Everything gets nicely mixed, like dough. And from that mixed material, the next generation of stars then forms. This stellar generation contains many little, low-mass stars that have very long lifetimes. It’s these low-mass, long-lived stars that we observe today in Reticulum II for this study.

    TKF: Anna, you published a book last year called “Searching for the Oldest Stars: Ancient Relics from the Early Universe.” How do these results demonstrate what you call “stellar archeology?”


    FREBEL: Finding these elements in Reticulum II thoroughly illustrates the concept of stellar archaeology. The idea is that we can use the composition of individual stars to trace the processes that created the elements in the early universe. Because the elements that we observe in our stars today were made prior to the stars’ birth—the stars inherited these heavy elements like “cosmic genes”—we have this incredible opportunity to look back in time to study the early chemical and physical processes that ushered in star and galaxy formation soon after the Big Bang.

    Reticulum II is actually a perfect example of what we now call dwarf galaxy archaeology. It’s pretty much the same thing I just described, but now we are able to add other dimensions by not just using individual stars, but the entire dwarf galaxy and all the information that comes with it. We can use galactic environmental conditions and the star formation history to trace what happened very early on in that galaxy that provided the various elements we see today.

    It’s very nice that despite all the progress we have made in this field, there is more to come. I really think these findings have opened a new door for studying galaxy formation with individual stars and to some extent individual elements. We are seriously connecting the really small scales of stars with the really big scales of galaxies. I’m very excited to see what else we find. I don’t think we’ll find another galaxy like Reticulum II anytime soon, but hey, we’re going to keep looking!

    JI: The way I like to think about this is, imagine if you were an actual archaeologist and you wandered around on the surface of the Earth picking up artifacts whenever you found them. You’d find a collection of random artifacts from different periods and places, and you wouldn’t be sure how to associate them. In contrast, looking at galaxies like Reticulum II is like digging into a coherent, subterranean layer and finding a collection of artifacts that are all telling the same story . . .

    FREBEL: Like Pompeii!

    JI: Yeah, like Pompeii!

    TKF: Ah yes, the Roman town, and its residents, who were completely buried under volcanic ash. That was not a very nice outcome . . .

    FREBEL: Not for the people, no.

    RAMIREZ-RUIZ: But the archeological evidence did remain pristine.

    TKF: Bad for Pompeians, but good for archeologists. Shifting gears here, what tools do you need to dig even deeper, if you will, into how elements like gold and silver originate, and otherwise find more cosmic archeological clues?

    JI: There are two types of things that we need. First, we have to find dwarf galaxies and that requires very large sky surveys like the Dark Energy Survey—which discovered Reticulum II—as well as surveys conducted by the Large Synoptic Survey Telescope, which will start operations in the 2020s.

    LSST Camera, built at SLAC
    LSST Interior
    LSST telescope, currently under construction at Cerro Pachón, Chile

    The second thing is we have to look at the stars in those galaxies. The problem with galaxies is that they are far away, so we need pretty large telescopes to do that.

    FREBEL: The stars that Alex has been observing are actually really, really faint. We had to work very hard to squeeze out whatever information we could about them. It was only because these stars had such a strong signal of r-process elements that we could see those signals in their light, very little of which we’re actually able to capture with current telescopes.

    So that really shows why we need larger telescopes. Multiple telescope projects are underway and are scheduled to open in the 2020s. They will have mirrors more than twice as big as today’s best ground-based telescopes. These include the Giant Magellan Telescope, the Thirty Meter Telescope and the European Extremely Large Telescope.

    Giant Magellan Telescope, Las Campanas Observatory, some 115 km (71 mi) north-northeast of La Serena, Chile

    TMT-Thirty Meter Telescope, proposed for Mauna Kea, Hawaii, USA

    ESO/E-ELT, to be built on top of Cerro Armazones in the Atacama Desert of northern Chile

    They will collect more light per unit of time, which means we can observe fainter stars, but we can also go back to brighter stars and get insanely high-quality data. That is what we need for these r-process stars because there is so much information in their light. I think the next five to 10 years will be very exciting in this regard.

    RAMIREZ-RUIZ: I want to make a plug for the Laser Interferometer Gravitational-Wave Observatory, or LIGO.

    Caltech/MIT Advanced LIGO detector in Livingston, Louisiana

    The ultimate dream of mine would be to detect the gravitational wave signal of a neutron-neutron star merger.

    Gravitational waves. Credit: MPI for Gravitational Physics/W. Benger-Zib

    When we have multiple gravitational wave observatories in operation, such as when LIGO India is built next decade, we will be able to pinpoint the location of these rare events. We can then use our conventional, light-based telescopes to look at the transient light signals from the merged neutron star, which we actually think will be powered by the decay of these precious elements. That would be the ultimate direct evidence that these mergers are indeed producing all of these elements.

    FREBEL: Pinpointing the location of neutron star mergers might become possible for events in the nearby universe. But I don’t think we’ll go back far enough in space, and therefore time, to see a merger like in Reticulum II that went off billions of years ago. I agree with Enrico, though, it would be really great to have a nearby example that shows us, right in front of our eyes, how this really all works.

    RAMIREZ-RUIZ: Anna’s absolutely right. We won’t see the r-process enrichment events that took place at the time when a galaxy like Reticulum II was being formed, but hopefully we’ll see the newly synthesized gold closer to home! [Laughter]

    TKF: Let’s take a moment to consider that most of the gold, silver and platinum in our valuable jewelry, as well as the uranium in our nuclear reactors, was created when mind-bogglingly dense neutron stars crashed into each other at incredible speeds. As you’re doing your research, does this sort of notion ever stop you in your tracks?

    JI: It does stop you in your tracks, right? Definitely one of the things that attracts people to astronomy is understanding the origin of everything around us. The other part of it for me is that these neutron star mergers are happening on really small scales, but these events are explosive enough to affect a whole galaxy! Imagining that event, then zooming out to the whole galactic scale, then zooming back down to us on Earth—I think it’s pretty cool to be able to follow the consequences of the production of these elements from beginning to end.

    RAMIREZ-RUIZ: Something to think about is that all the gold originally here on Earth sank into the planet’s center because the early Earth was molten. So all the gold we have today on or near the surface is from asteroid impacts!

    FREBEL: As we’ve been saying, the gold wasn’t made in the asteroids, it was probably made in a neutron star merger. It then mixed into the cloud of gas and dust in which all the asteroids and planets formed. That gold was then transported to us on Earth as a special delivery. [Laughter]

    RAMIREZ-RUIZ: We have some gold atoms in our bodies, too. If we were to “talk” to one of these atoms, it would tell us a story of how it was formed at billions of degrees and how it flew through space. Because just one of these neutron star mergers produced so much gold, probably all of the gold atoms in the four of us in this roundtable discussion came from the same event. So we’re not only linked by genetics, but by these exotic phenomena that happen in the universe.

    *Science paper:
    R-process enrichment from a single event in an ancient dwarf galaxy


  • richardmitnick 2:57 pm on May 16, 2016 Permalink | Reply
    Tags: , , Kavli Institute, The Kavli Prize(s)   



    The Kavli Foundation

    $1 Million Prizes in Astrophysics, Nanoscience, and Neuroscience Recognize Pioneering Advances in Our Understanding of Existence
    France Córdova, Director of the National Science Foundation, Will Deliver Keynote Address at Event June 2 in NYC, Hosted by ABC News’ Chief Health and Medical Editor, Dr. Richard Besser

    On June 2nd, the 2016 Kavli Prizes in astrophysics, nanoscience, and neuroscience will be announced from the Norwegian Academy of Science and Letters in Oslo. The international biennial Prizes, $1 million (U.S.) cash awards in each field, recognize pioneering advances in our understanding of existence at its biggest, smallest, and most complex scales.

    In New York City, the World Science Festival will hold an invitation-only breakfast program, live-streamed for free at http://www.worldsciencefestival.com/programs/kavli-prize-announcement/. France Córdova, Director of the National Science Foundation, will deliver the keynote address at the event, which will be hosted by Dr. Richard Besser, ABC News’ Chief Health and Medical Editor, and will include a live satellite transmission of the announcement. The breakfast will conclude with a panel discussion of the Laureates’ Prize-winning work by three preeminent scientists: astrophysicist Nergis Mavalvala, nanoscientist Michal Lipson, and neuroscientist Cori Bargmann.

    The World Science Festival’s 2016 Kavli Prizes breakfast will take place Thursday, June 2, 8-10 am EST, at New York University’s Grand Hall (238 Thompson Street). The live-stream will begin at 8:15 am. Media interested in attending the breakfast should reach out to Blake Zidell at blake@blakezidell.com.

    Kavli Laureates are chosen by committees whose members are recommended by six of the world’s most renowned science societies and academies. Winners, who are not notified in advance of the announcement, go on to receive gold medals, presented this year by H.R.H. Crown Prince Haakon, during a ceremony in Oslo. The ceremony is followed by a banquet at Oslo’s famed City Hall, the venue of such historic events as the Nobel Peace Prize ceremony. Since its inaugural year, all U.S. laureates have also visited the Oval Office of the White House in recognition of the honor and the laureates’ scientific contributions.

    More about the Kavli Prizes

    The Kavli Prizes recognize seminal scientific achievements in Astrophysics, Nanoscience and Neuroscience.

    The Kavli Prize in Astrophysics recognizes outstanding achievement in advancing our knowledge and understanding of the origin, evolution, and properties of the universe, including the fields of cosmology, astrophysics, astronomy, planetary science, solar physics, space science, astrobiology, astronomical and astrophysical instrumentation, and particle astrophysics.

    The Kavli Prize in Nanoscience recognizes outstanding achievement in the science and application of the unique physical, chemical, and biological properties of atomic, molecular, macromolecular, and cellular structures and systems that are manifest in the nanometer scale, including molecular self-assembly, nanomaterials, nanoscale instrumentation, nanobiotechnology, macromolecular synthesis, molecular mechanics, and related topics.

    The Kavli Prize in Neuroscience recognizes outstanding achievement in advancing our knowledge and understanding of the brain and nervous system, including molecular neuroscience, cellular neuroscience, systems neuroscience, neurogenetics, developmental neuroscience, cognitive neuroscience, computational neuroscience, and related facets of the brain and nervous system.

    The Norwegian Academy of Science and Letters appoints the three prize committees after receiving recommendations from the following international academies and scientific organizations:

    The Chinese Academy of Sciences
    The French Academy of Sciences
    The Max Planck Society (Germany)
    The National Academy of Sciences (U.S.)
    The Norwegian Academy of Science and Letters
    The Royal Society (U.K.)

    The prize committees review the nominated candidates and submit their recommendations to the board of The Norwegian Academy of Science and Letters. The President of the Academy announces the prize winners.

    First awarded in 2008, the Kavli Prizes have honored 31 scientists from seven countries: the United States, the United Kingdom, Germany, Japan, Norway, Russia, and Sweden. In 2014, Alan H. Guth (U.S.), Andrei D. Linde (U.S.), and Alexei A. Starobinsky (Russia) shared the Kavli Prize in Astrophysics for pioneering the theory of cosmic inflation; Thomas W. Ebbesen (Norway), Stefan W. Hell (Germany), and Sir John B. Pendry (U.K.) won the Kavli Prize in Nanoscience for their transformative contributions to nano-optics; and Brenda Milner (U.K.), John O’Keefe (U.S.), and Marcus E. Raichle (U.S.) received the Kavli Prize in Neuroscience for their discovery of specialized brain networks for memory and cognition. Past awards have honored scientists for research ranging from the discovery of the Kuiper Belt to creating unprecedented methods for controlling matter on the nanoscale, to deepening our understanding of the basic neuronal mechanisms underlying perception and decision.

    The Kavli Prize is a partnership between The Norwegian Academy of Science and Letters, The Kavli Foundation (U.S.), and The Norwegian Ministry of Education and Research. It is named after Fred Kavli, a Norwegian-born U.S. philanthropist and founder of The Kavli Foundation.

    See the full article here.


  • richardmitnick 7:09 pm on February 25, 2016 Permalink | Reply
    Tags: , Astronomy's future, Astrophysicists Zero in on How Planets Form, , Kavli Institute   

    From KAVLI: “The Origin of Worlds: Astrophysicists Zero in on How Planets Form” 


    The Kavli Foundation

    Adam Hadhazy

    Powerful new telescopes and new techniques are letting scientists probe planets in the earliest stages of development

    THE SECRETS OF PLANET FORMATION are becoming harder to keep. In November, using a new observing method, scientists snapped the very first pictures of an extrasolar planet still gathering up mass from its dusty planetary nursery. Called LkCa 15 b, this immature gas giant has opened a window into the poorly understood process of how planets form.

    LkCa 15 b came on the heels of the discovery of 51 Eri b, a relatively adolescent world, about 20 million years old, that at this point in its development is like a young Jupiter. Though fully formed, 51 Eri b is still radiating heat from its tumultuous birth. This leftover heat provides tantalizing clues about the conditions in which 51 Eri b has grown up.

    The discovery of both these exoplanets was made possible by a technique called direct imaging, wherein the faint light from a planet is separated from the overwhelming glare of its host star. Compared to other methods, direct imaging lets astronomers more readily study the atmospheric composition of a newly formed or even forming world, providing key insights into these objects’ ultimate origins.

    In January, The Kavli Foundation spoke with three planetary formation experts. The discussion covered promising new ways of studying how giant planets form and whether they can explain the rise of our entire Solar System.

    The participants were:

    Bruce Macintosh – is a Professor of Physics at Stanford University and a member of the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC). Macintosh is the Principal Investigator of the Gemini Planet Imager project that discovered 51 Eri b, announced in the journal Science.
    Kate Follette – is a postdoctoral researcher also at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) at Stanford University. She is the co-lead author on a Nature paper on LkCa 15 b and co-author of the 51 Eri b study.
    Ruth Murray-Clay – is an Assistant Professor of Physics at the University of California, Santa Barbara and also a co-author of the 51 Eri b study.

    The following is an edited transcript of their roundtable discussion. The participants have been provided the opportunity to amend or edit their remarks.

    The Kavli Foundation: How are the discoveries of 51 Eri b and LkCa 15 b helping to complete a “family album” of how giant planets form and develop?

    Bruce Macintosh: In the family album, so to speak, 51 Eri b is as big as it’s going to get, like an adolescent, but still kind of angry, warm and turbulent, again like an adolescent. One could set up the analogy that objects in protoplanetary disks, like LkCa 15 b, are still in utero. But LkCa 15 b already has a solid core. It’s formed, which is sort of birth-like, and now it’s in the process of growing. It’s accreting mass and growing like a child.

    Kate Follette: The planets, such as 51 Eri b, that we image with the Gemini Planet Imager are indeed like turbulent adolescents.

    Gemini Planet Imager on Gemini South in Chile

    Gemini South telescope

    They are fully formed, but in terms of their age as a proportion of the total lifetime of their system, which is billions of years, they’re just barely out of the womb.

    Macintosh: We could say that planets age like dogs. [Laughter] They don’t age on a linear scale with people.

    Ruth Murray-Clay: We’ve seen a lot of gas giant planets in observational surveys, like from the Kepler space telescope.

    NASA Kepler Telescope

    But these giant planets tend to be very close to their stars. We have a standard picture that gas giants form at distances from their stars where Jupiter and Saturn are found in our Solar System. 51 Eri b doesn’t fit that picture. It’s a relatively cool world at a more Jupiter-like distance from its star, and we had never seen that before. It suggests that some of these giants must dynamically move closer to their stars as their solar systems evolve. We’d really like to know if that is the right way of thinking about the overall architecture of planetary systems, and 51 Eri b is the start of that.

    With regard to LkCa 15 b, from a planet formation perspective it’s incredibly exciting to be seeing a planet possibly still in the process of accumulating mass, or accretion. It’s very difficult to model that process theoretically, partly because we just don’t know how accretion happens in young disks and onto planets. So having any sort of observational constraints on those theories like we’re getting by studying LkCa 15 b will take us a long way.

    The Basics of Making a Planet

    Planets develop out of protoplanetary disks surrounding young stars. Gas and dust particles within these disks glom together into larger bodies, building up over millions of years into full-size planets. The biggest objects capture huge, gassy atmospheres, becoming gas giants. However, this “core accretion” model does not work well at creating gas giants in very tight or very distant orbits around their stars. Nor is the accretion process itself well understood. Newer theories propose that gas giants can migrate toward or away from their stars as a solar system dynamically evolves. In the process, numerous fledgling worlds are gravitationally displaced into unusual orbits or expelled into interstellar space. Overall, many questions remain about how, where and when planets arise.

    TKF: To see a baby planet still accreting material, Kate and her colleagues turned to a new method to study LkCa 15 b. Using the Magellan telescope in Chile, they detected the signature of heated, ionized hydrogen gas—called “H-alpha”—as the gas fell from a protoplanetary disk onto a world just taking shape. Kate, will you be able to use this method to study other planets still taking form?

    Follette: We certainly hope so. We have a sample of about 20 more objects that we think are good candidates for protoplanets similar to LkCa 15 b because they are in so-called transition disks. These are protoplanetary disks that have big, central cavities that are cleared of all of the gas and dust that usually surrounds a young star. The youngest protoplanetary disks are shaped like pancakes, with material reaching all the way in to the star in the middle. Transition disks, on the other hand, are like donuts, and researchers have long postulated that their central holes are cleared by the gravitational influence of planets. The LkCa 15 b image is significant because it is the first time we’ve been able to show this to be true by directly imaging a planet inside a disk gap.

    We know that the central stars in these disks are often still actively accreting material from the disk. Therefore, gas and dust must somehow be making it from the outer disk onto the star, and so you can logically conclude that this material is also falling onto intervening planets. At the same time, because the cavities are cleared of material, we can see all the way down to the mid-plane of the disk—sort of like the “midline” on a donut, if you will—where we think the planets are forming.

    Given this setup, we really think transition disks and accreting planets are a sweet spot for this particular type of imaging. The planet is glowing so brightly in H-alpha, it means that we don’t have to work as hard to isolate its light emission as we would otherwise when trying to study a dim planet near a bright star. As a result, we can work even closer to the star, all the way to the inside of these disk gaps. Just as a comparison, when Bruce, Ruth and I worked on the 51 Eri b discovery, its star was about 500,000 times brighter than 51 Eri b. The light that my colleagues and I isolated from LkCa 15 b is only a few hundred times fainter than its star.

    Macintosh: That’s like cheating. That’s too easy! [Laughter]

    Follette: Yeah, exactly!
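    The contrast figures quoted above can be converted into the magnitude differences astronomers usually report with the standard relation Δm = 2.5 log10(flux ratio). A minimal sketch; the ~300× value for LkCa 15 b is an assumption standing in for "a few hundred times fainter":

    ```python
    import math

    def contrast_to_delta_mag(flux_ratio: float) -> float:
        """Magnitude difference corresponding to a given star-to-planet flux ratio."""
        return 2.5 * math.log10(flux_ratio)

    # 51 Eri b: its star is ~500,000 times brighter than the planet
    print(round(contrast_to_delta_mag(5e5), 1))  # 14.2 magnitudes

    # LkCa 15 b in H-alpha: only a few hundred (assume ~300) times fainter
    print(round(contrast_to_delta_mag(300), 1))  # 6.2 magnitudes
    ```

    The H-alpha trick thus relaxes the required contrast by roughly eight magnitudes, a factor of over a thousand in flux, which is why Macintosh jokes that it is "like cheating."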
    “The next telescope wave will really let us see the equivalent of the inner Solar System caught in the process of forming.”—Bruce Macintosh

    TKF: Could cracking the case of how giant planets form potentially trickle down to understanding how Neptune-size and even rocky, Earth-like worlds form?

    Murray-Clay: The current dominant model of how gas giants form is that a solid, rock-and-ice core forms first. The core becomes massive enough to accrete gas from the surrounding protoplanetary disk. With that model in mind, one way of thinking about Neptunes or even Earths is as planets that failed to grow massive enough to become gas giants. For a Neptune, you can ask: if it was massive enough to accrete a Saturn-sized envelope, why didn’t it? Why did it fail? And for an Earth, you can ask why there wasn’t enough material around for its rocky core to grow big enough to become a gas giant instead.

    The theoretical answers to those questions typically depend on how much available mass there is in a protoplanetary disk. Other considerations are how disturbed the disk becomes over time, which affects how long it takes rocky cores to grow, and whether that process takes too long to form gas giants compared to the lifetime of the dissipating gas disk.

    If the basic ideas that many of us astrophysicists share about planet formation are correct, then those ideas should tell us something about the kinds of architectures that we’d expect to find in planetary systems. If you want to understand why you have an Earth in a solar system, you also need to understand why you don’t have a Jupiter.

    Macintosh: To understand planetary formation processes, a lot of people have tried “archaeological” approaches. This is where you look at the statistics of the planets discovered in mature solar systems and compare these statistics to the predictions from various planetary formation paths. The history of success of that approach is relatively weak, though. It’s been very hard to develop a model that successfully predicts the new solar systems we keep discovering.

    TKF: Do we have a limited or skewed view of planetary formation because most theories are based on the single example of our own Solar System? After all, we did not even know about another multi-planet system around a Sun-like star until 1999.

    Follette: We are discovering planets that are very unlike what we see in our Solar System, but then again, the field of exoplanetology is so young. The main techniques we have been using to discover exoplanets favor high-mass planets very close to their stars, like for instance the so-called “hot Jupiter” class of worlds. Maybe they are the outliers. They might just be the easy-to-detect, oddball planets of the universe. So the fact that we found a lot of hot Jupiters or closely packed planetary systems may not be indicative of what the norm is in the universe.

    Murray-Clay: I agree with that. It may be that we’re actually being sent down the wrong path by the exoplanet discoveries we’re making, not the other way around by starting with our Solar System. Now that can’t entirely be true, because from the number of solar systems that Kepler has seen, we can infer that something like 30 percent of stars host “super-Earths.” So super-Earths, which have anywhere from two to 10 times the mass of Earth and which we don’t have in our Solar System, are clearly a common outcome of planet formation. However, we also don’t know what the solar systems seen by Kepler look like in their entirety, because Kepler’s data cannot really tell us about any planets in more distant orbits. We really don’t know that Solar System-like architectures are not common.

    TKF: What are some promising planetary formation models or explanations, then, that would account not only for our Solar System, but those strikingly different solar systems we have glimpsed in recent years?

    Murray-Clay: It may be that there’s a range of solar system masses. Very high-mass systems may be more likely to produce gas giants. Those gas giants are more likely to be gravitationally kicked into their inner solar system and therefore be easily observable by our conventional exoplanet detection techniques, like the “radial velocity method” as well as the “transit method” seen by Kepler.

    Then you get to a little bit lower mass solar systems and you get something like our Solar System. You have some gas giants in this scenario, but they didn’t get kicked around very much to inner or outer orbits by the gravity of other planets in the system. Then, you go to even lower mass systems, and there you might end up with systems that have no gas giants at all and just a number of Earths and mini-Neptunes. Within a framework like that, the Solar System is just part of a spectrum of possible solar systems.

    Macintosh: To some extent, though, it seems like it’s getting harder to have a continuum for solar system architectures that includes our Solar System and the Kepler-discovered solar systems. There are not a lot of intermediate solar systems. You kind of end up wanting a more bimodal or trimodal process, where the possible solar system outcomes branch instead of there being a range of possibilities.

    Murray-Clay: But how would we know that yet? We’re only really seeing the very inner parts of some solar systems with Kepler.

    Macintosh: That’s certainly true, but a system like ours with orbits scaled down by 30 percent, or planet sizes scaled up by 30 percent, would be fairly detectable by Kepler.

    Murray-Clay: There’s a lot we don’t know yet, though we can say that a large fraction of stars have solar systems that are not like ours. I just think it’s still not clear that the Solar System is uncommon.

    TKF: Complicating matters further, researchers at Caltech, including Kavli Prize Laureate Mike Brown, recently presented theoretical evidence for a new, super-Earth-size world lurking way out in the Kuiper Belt, far beyond Pluto.

    Known objects in the Kuiper Belt beyond the orbit of Neptune. (Scale in AU; epoch as of January 2015.)

    Could this “Planet Nine,” if it’s real, scramble our prevailing Solar System planet formation models?

    Macintosh: Positing that Planet Nine does exist for the sake of discussion, it’s not necessarily crazy compared to what we see in other solar systems. We were just discussing that the lack of super-Earths in our Solar System is one way we differ from the majority of the solar systems discovered by the Kepler mission. Now we might have this far-out super-Earth called Planet Nine, but not correspondingly where Kepler sees these sorts of planets in other solar systems. So you need a mechanism to explain how Planet Nine got way out there.

    We know from observations of giant planets that they can get scattered from the inner parts of solar systems to the outer parts. One example is the HR 8799 system, which has four Jupiter-size worlds. Two of them are orbiting farther out than Neptune. Putting together those two ideas, A, that our Solar System could have formed a super-Earth-slash-mini-Neptune, and B, that you could move that super-Earth-slash-mini-Neptune very far out, is surprisingly not insane by astronomers’ standards.

    Murray-Clay: I completely agree with Bruce—not insane at all. Kicking things out to large orbits is a reasonably common outcome. I’ll be very excited if they find Planet Nine, but I don’t think it would necessarily change our view of the formation of the Solar System that much.

    TKF: What upcoming projects and instruments are you all looking forward to that could help further unravel the mysteries of planet formation? Bruce, you were just selected to lead an exoplanet team for NASA’s next major astrophysics observatory, the Wide Field Infrared Survey Telescope (WFIRST).


    The direct imaging of exoplanets could be a prime mission for this future telescope, slated for launch in the mid-2020s. Why should we be excited?

    Macintosh: The most interesting thing with WFIRST will be the ability to characterize the make-up of planets. Analyzing the light that comes from planets can tell you about their composition, which helps you understand how much mass they accreted in their youth versus how much mass they got through some other mechanism. For the first time with WFIRST, we’ll be able to do the imaging tricks we do nowadays with the Gemini Planet Imager and Magellan, but do it on mature planets—no longer the adolescents. Middle-aged planets are much, much more common than adolescents, so we’ll have a bigger sample to look at. We will actually see something that looks like a Jupiter and see a solar system that is analogous to our own. We’ll see things about these planets’ formation histories through their compositions that will serve as a nice complement to studies of younger planets.

    Follette: As Bruce said, we’re moving in a direction that I think is very promising. Looking even further ahead, NASA is studying four large, space-telescope mission concepts right now that might fly in the 2030s. Three of them would be very good candidates for doing direct imaging of exoplanets and/or exoplanet-forming disks. I think there’s a real push in the field for one of the next big space missions to include a direct imaging component in its design, which will help us push toward imaging ever lower mass planets and better understand how planets evolve.

    Macintosh: In the meantime, a lot of very powerful observations are going to be made by facilities like ALMA, the Atacama Large Millimeter/submillimeter Array, which is now up and running, and the James Webb Space Telescope [JWST] when it flies later this decade.

    ALMA Array

    NASA Webb telescope annotated

    And when we get the next generation of extremely large, ground-based telescopes with 25- to 40-meter diameter primary mirrors in the 2020s, then, observationally, we really will get right into where planets form at Jupiter-like distances.

    39-meter ESO E-ELT

    30-meter TMT

    21-meter Giant Magellan Telescope

    We’ll still have trouble pushing out to where Saturn and Uranus are located. But the next telescope wave will really let us see the equivalent of the inner Solar System caught in the process of forming.

    Murray-Clay: This will be very important. The Kepler space telescope brought us thousands of planets, but in many cases we’re only seeing one, easily detectable planet out of an entire solar system. By understanding the overall architecture of planetary systems in the future, we’ll have a much better sense of the overall formation scenario for different kinds of systems.

    Macintosh: A lot of us are driven by questions about Earth-like planets, of course. How common are they? How detectable might they be? Can we see them? Planet formation theorists like Ruth are trying to understand whether planets like Earth ultimately are a common thing or a rare thing, and whether our Solar System is unique or whether there’s a whole bunch of others out there like it.

    Follette: Never in the history of astronomy have we—meaning our star, our galaxy, and so on—proven to be particularly special, so it’s not likely that our Solar System and its planets are one-of-a-kind. I find this typicality encouraging, because it means there’s a chance that we’ll image another Earth-like planet during my lifetime. To me, it ultimately means that we’re not likely to be alone in the universe either.

    See the full article here.


  • richardmitnick 12:42 pm on January 18, 2016 Permalink | Reply
    Tags: , , , Kavli Institute   

    From Kavli: “Crowdsourcing the Universe: How Citizen Scientists are Driving Discovery” 


    The Kavli Foundation

    Winter 2016
    Adam Hadhazy

    Legions of volunteer amateur astronomers are turning their eyes to the sky thanks to online image portals, and they are doing extraordinary science.

    ASTRONOMERS ARE INCREASINGLY enlisting volunteer “citizen scientists” to help them examine a seemingly endless stream of images and measurements of the universe. These volunteers’ combined efforts are having a powerful impact on the study of the cosmos.

    A collage of the 29 new gravitational lensing candidates discovered by citizen scientists using Space Warps. (Credit: Space Warps, Canada-France-Hawaii Telescope Legacy Survey)

    Just last November, a citizen science project called Space Warps announced the discovery of 29 new gravitational lenses, regions in the universe where massive objects bend the paths of photons (from galaxies and other light sources) as they travel toward Earth. As cosmic phenomena go, the lenses are highly prized by scientists because they offer tantalizing glimpses of objects too distant and dim to be seen through existing telescopes, as well as key information about the lensing objects themselves.

    The Space Warps’ haul of lenses is all the more impressive because of how it was obtained. During an eight-month period, about 37,000 volunteers combed through more than 430,000 digital images in a huge, online photo library of deep space. Automated computer programs have identified most of the 500 gravitational lenses on astronomers’ books. However, computers failed to flag the 29 lenses the Space Warps volunteers spotted, speaking to the unique skills we humans possess.

    The Kavli Foundation spoke with three researchers, all co-authors of two papers published in the Monthly Notices of the Royal Astronomical Society describing the Space Warps findings. In our roundtable, the researchers discussed the findings and the critical role citizen science is playing in furthering astronomical discovery.

    The participants were:

    Anupreeta More – is a project researcher at the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) at the University of Tokyo. More is a co-principal investigator for Space Warps, a citizen project dedicated to identifying gravitational lenses.
    Aprajita Verma – is a senior researcher in the department of physics at the University of Oxford. Verma is also a co-principal investigator for Space Warps.
    Chris Lintott – is a professor of astrophysics and the citizen science lead at the University of Oxford. Lintott is a co-founder of Galaxy Zoo, a citizen science project in which volunteers classify types of galaxies, and the principal investigator for the Zooniverse citizen science web portal.

    The following is an edited transcript of the roundtable discussion. The participants have been provided the opportunity to amend or edit their remarks.

    The Kavli Foundation: Anupreeta and Aprajita, where did you get the idea — along with your co-principal investigator Phil Marshall of the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) at Stanford University — to put volunteers to work on identifying gravitational lenses starting back in 2013?

    ANUPREETA MORE: A few years ago, Chris Lintott gave a talk on citizen science at the Kavli Institute for Cosmological Physics in Chicago, where I was working at the time. It got me thinking about a lens search by citizen scientists.

    APRAJITA VERMA: For Phil Marshall and me, Space Warps grew out of Galaxy Zoo. Soon after Galaxy Zoo launched, I started to look at some of the galaxies that were being posted on the Galaxy Zoo user forum that had potential lensed features surrounding them. This was a great by-product of the core Galaxy Zoo project. However, we realized that to find these incredibly rare sources, which are often confused with other objects, we really needed a tailored interface to efficiently find lenses. This grew into Space Warps.

    TKF: Chris, Galaxy Zoo itself was inspired by Stardust@home [a UC Berkeley project], the first astronomy-based citizen science project in which people played an active role. Until then, citizen scientists were often computer owners who offered up free processing power on their devices to aid in machine-driven data analysis. Were you concerned when you started Galaxy Zoo in 2007 that it would be hard to attract volunteers?

    CHRIS LINTOTT: Since Stardust@home involved people looking at images of a comet’s dust grains brought back by NASA’s Stardust space probe, we thought “Well, if people are willing to look at dust grains, then surely they’d be happy to look at our galaxies!”

    NASA Stardust spacecraft

    But that turned out to be almost beside the point. As we’ve done many of these citizen science projects over the years, we’ve discovered it’s not the quality of the images that matters. After all, our galaxies aren’t typically beautiful. They are not the Hubble Space Telescope shots that you’d expect to find on the front page of the New York Times.

    NASA Hubble Telescope
    NASA/ESA Hubble

    Our galaxies are often fuzzy, little, enigmatic blobs. The Space Warps images are pretty, but again they’re not the kind of thing you would sell as a poster in the gift shop at the Kennedy Space Center.

    It’s actually the ideas that get people excited. I think Space Warps and Galaxy Zoo have been successful because they have done a great job of explaining to people why we need their help. We’re saying to them: “Look, if you do this simple task, it allows us to do science.” This idea is best shown by Planet Hunters, a citizen science project that searches for exoplanets in data from NASA’s Kepler spacecraft.

    NASA Kepler Telescope

    Users are looking at graphs for fun. But because the idea is the discovery of exoplanets, people will put up with looking at data.

    TKF: What sort of unique science is made possible because of Space Warps?

    VERMA: Gravitational lenses allow us to study objects, such as very distant galaxies, that are fainter, and to see them in much more detail, than the telescopes we have now would otherwise allow. It’s enabling the kind of science we’ll be routinely doing with extremely large telescopes in the future.

    MORE: That’s right. Something unique about gravitational lensing is that it acts like a natural telescope and allows us to study some really faint, distant galaxies which we wouldn’t get to study otherwise. We’re seeing these distant galaxies in the early stages of their life cycle, which helps us understand how galaxies evolve over time.

    Also, in a gravitational lens system, it’s possible for us to study the properties of the foreground galaxies or galaxy groups that are gravitationally lensing the background sources. For example, we can measure the mass of these foreground galaxies and also study how mass is distributed in them.

    Anupreeta More’s research specialty is gravitational lensing and its applications in measuring the mass distributions of matter and dark matter in galaxies, galaxy clusters and the universe as a whole. (Credit: Anupreeta More)

    TKF: Space Warps and other citizen science projects flourish because computer programs sometimes struggle to identify features in data. Why do computers have trouble spotting the characteristic arc or blobby shapes of gravitational lenses that humans can?

    MORE: The problem is that these arc-like images of distant galaxies can have very different shapes and profiles. The process of lensing magnifies these galaxies’ images and can distort them. Also, these distant galaxies emit light at different wavelengths and can appear to have different colors. Furthermore, there are structures in these galaxies that can change the shape of the arcs.

    VERMA: Also, lots of spiral galaxies have bluish spiral arms that can look like lenses. We call these objects “lens impostors” and we find many more of these false positives compared to rare, true gravitational lenses.

    MORE: All these differences make it difficult to automate the process of finding lenses. But human beings are very good at pattern recognition. The dynamic range that our eyes and brains offer is much greater than that of a computer algorithm.

    LINTOTT: Another thing to bear in mind in astronomy, particularly in Space Warps, is that we’re often looking for rare objects. A computer’s performance depends very strongly on how many examples you have to “train” it with. When you’re dealing with rare things, that’s often very difficult to do. We can’t assemble large collections of hundreds of thousands of examples of gravitational lenses because we don’t have them yet.
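    Chris’s point about training data can be made concrete with a toy sketch. The example below (the two-feature setup, class separations, and all numbers are invented for illustration, not drawn from any survey) trains a simple nearest-centroid classifier with only a handful of examples of the rare “lens” class; with so few examples, the estimated class centroid is noisy and the classifier’s performance on lenses is unreliable.

```python
import random

random.seed(42)

# Toy illustration of the rare-object problem: a nearest-centroid
# classifier trained on very few examples of the rare "lens" class.
# All data here are synthetic; no real survey features are used.

def draw(mean, n):
    """Draw n noisy 2-D feature vectors scattered around a class mean."""
    return [(random.gauss(mean[0], 1.0), random.gauss(mean[1], 1.0))
            for _ in range(n)]

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def recall_on_lenses(n_lens_train):
    # The common class ("impostor") is well sampled; the lens class is not.
    impostors = draw((0.0, 0.0), 500)
    lenses = draw((1.5, 1.5), n_lens_train)  # the classes overlap heavily
    c_imp, c_lens = centroid(impostors), centroid(lenses)

    def classify(p):
        d_imp = (p[0] - c_imp[0]) ** 2 + (p[1] - c_imp[1]) ** 2
        d_lens = (p[0] - c_lens[0]) ** 2 + (p[1] - c_lens[1]) ** 2
        return "lens" if d_lens < d_imp else "impostor"

    test_lenses = draw((1.5, 1.5), 200)
    hits = sum(classify(p) == "lens" for p in test_lenses)
    return hits / len(test_lenses)

for n in (3, 30, 300):
    print(n, "training lenses -> recall", round(recall_on_lenses(n), 2))
```

Running the loop with 3, 30, and 300 training lenses shows how unstable the rare-class recall is when examples are scarce; real lens-finding algorithms face exactly this shortage.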

    Also, people — unlike computers — check beyond what we are telling them to look for when they review images. One of the great Space Warps examples is the discovery of a “red ring” gravitational lens. All the example lenses on the Space Warps site are blue in color. But because we have human classifiers, they had no trouble noticing this red thing that looks a little like these blue things they’ve been taught to keep an eye out for. Humans have an ability to make intuitive leaps like that, and that’s very important.

    VERMA: I echo the point that it’s very difficult to program diversity and adaptability into any computer algorithm, whereas we kind of get it for free from the citizen scientists! [Laughter]

    Aprajita Verma researches galaxy formation and evolution, and is particularly interested in understanding the nature of galaxies at high redshift. She is also involved with two major next generation astronomy telescopes, the European Extremely Large Telescope (E-ELT) and the Large Synoptic Survey Telescope (LSST). (Credit: Aprajita Verma)

    ESO E-ELT Interior

    LSST Exterior
    LSST Interior
    LSST Camera
    LSST, building which will house it in Chile, and the camera, being built at SLAC

    TKF: Aprajita and Anupreeta, what’s the importance of the red ring object Chris just mentioned that the Space Warps community discovered in 2014 and has nicknamed 9io9?

    VERMA: This object was a really exciting find, and it’s a classic example of something we hadn’t seen before that citizen scientists quickly found. We think that inside the background galaxy there’s both an active black hole, which is producing radio wave emissions, as well as regions of star-formation. They’re both stretched by the lensing into these spectacular arcs. It’s just a really nice example of what lensing can do. We’re still putting in further observations to try and really understand what this object is like.

    MORE: In this particular case with 9io9, there is the usual, main lensing galaxy, but then there is also another, small, satellite galaxy, whose mass and gravity are also contributing to the lensing. The satellite galaxy produces visible effects on the lensed images and we can use this to study its mass distribution. There are no other methods besides gravitational lensing which can provide as accurate a mass estimate for galaxies at such great distances.

    TKF: Besides 9io9, citizen astrophysicists have turned up other bizarre, previously unknown phenomena. One example is Hanny’s Voorwerp, a galaxy-size gas cloud discovered in 2007 in Galaxy Zoo. More recently, in 2015, Planet Hunters spotted huge decreases in the starlight coming from a star called KIC 8462. The cause could be an eclipsing swarm of comets; another, albeit unlikely, possibility that has set off rampant speculation on the Internet is that an alien megastructure is blocking light from the star. Why does citizen science seemingly work so well at making completely unexpected discoveries?

    LINTOTT: I often talk about the human ability to be distracted as a good thing. If we’re doing a routine task and something unusual comes along, we stop to pay attention to it. That’s rather hard to develop with automated computer systems. They can look for anomalies, but in astronomy, most anomalies are boring, such as satellites crossing in front of the telescope or malfunctions in the telescope’s camera.

    However, humans are really good at spotting interesting anomalies like Hanny’s Voorwerp, which looks like either an amorphous green blob or an evil Kermit the Frog, depending on how you squint at it. [Laughter] The point is, it’s something you want to pay attention to.

    The other great thing about citizen science is that the volunteers who find these unusual things start to investigate and become advocates for them. Citizen scientists will jump up and down and tell us professional scientists we should pay attention to something. The great Zooniverse discoveries have always been from that combination of somebody who’s distracted and then asks questions about what he or she has found.

    TKF: Aprajita and Chris, you are both working on the Large Synoptic Survey Telescope (LSST). It will conduct the largest-ever scan of the sky starting in 2022 and should turn up tons of new gravitational lenses. Do you envision a Space Warps-style citizen science project for LSST?

    VERMA: Citizens will play a huge role in the LSST, which is a game-changer for lensing. We know of about 500 lenses currently. LSST will find on the order of tens to hundreds of thousands of lenses. We will potentially require the skill that citizen scientists have in looking for exotic and challenging objects.

    Also, LSST’s dataset will have a time dimension. We’re really going to make a movie of the universe, and this will turn up a number of surprises. I can see citizen scientists being instrumental in a lot of the discoveries LSST will make.

    LINTOTT: One thing that’s challenging about LSST is the sheer size of the dataset. If you were a citizen scientist, say, who had subscribed to receive text message alerts for when objects change in the sky as LSST makes its movie of the universe, then you would end up with a couple of billion text messages a night. Obviously that would not work. So that means we need to filter the data. We’ll dynamically decide whether to assign a task to a machine or to a citizen scientist, or indeed to a professional scientist.
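    A minimal sketch of the kind of triage Chris describes is below. The thresholds, score names, and categories are hypothetical, not LSST’s actual pipeline: each alert carries a machine classifier’s confidence that it is real, plus an anomaly score, and a simple rule routes it to a machine, to citizen scientists, or to professionals.

```python
# Sketch of a triage rule for a nightly alert stream. Thresholds,
# scores, and categories are illustrative, not LSST's real pipeline.

def route_alert(real_prob, anomaly_score):
    """real_prob: machine classifier's confidence the alert is a real
    astrophysical event (0..1); anomaly_score: how unlike known object
    classes the alert looks (0..1)."""
    if real_prob < 0.05:
        return "discard"        # almost certainly an instrumental artifact
    if anomaly_score > 0.9:
        return "professional"   # rare and unusual: escalate to experts
    if real_prob > 0.95:
        return "machine"        # routine and confident: handle automatically
    return "citizen"            # ambiguous: crowdsource the decision

for alert in [(0.99, 0.1), (0.7, 0.2), (0.8, 0.95), (0.01, 0.0)]:
    print(alert, "->", route_alert(*alert))
```

The point of such a filter is volume: only the small ambiguous slice of a multi-billion-alert night ever reaches a human.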

    Chris Lintott develops a range of citizen science projects, with a particular focus on galaxy formation. (Credit: Chris Lintott)

    TKF: Chris, that comment reminds me of something you said to TIME magazine in 2008: “In many parts of science, we’re not constrained by what data we can get, we’re constrained by what we can do with the data we have. Citizen science is a very powerful way of solving that problem.” In this era of big data, how important do you all see citizen science being moving forward, given that computers will surely get better at visual recognition tasks?

    LINTOTT: In astronomy, if you’re looking at things that are routine, like a spiral galaxy or a common type of supernova, I think the machines will take over. They will do so having been trained on the large datasets that citizen scientists will provide. But I think there will be citizen involvement for a long while and it will become more interesting as we use machines to do more of the routine work and filter the data. The tasks for citizen scientists will involve more varied things — more of the unusual, Hanny’s Voorwerp-type of discoveries. Plus, a lot of unusual discoveries will need to be followed up, and I’d like to see citizen scientists get further into the process of analysis. Without them, I think we’re going to end up with a pile of interesting objects which professional scientists just don’t have time to deal with.

    VERMA: We have already seen a huge commitment from citizen scientists, particularly those who’ve spent a long time on Galaxy Zoo and Space Warps. For example, on Space Warps, we have a group of people who are interested in doing gravitational lens modeling, which has long been the domain of the professional astronomer. So we know that there’s an appetite there to do further analysis with the objects they’ve found. I think in the future, the citizen science community will work hand-in-hand with professional astronomers.

    TKF: Are there new citizen astrophysicist opportunities on the horizon related to your projects?

    LINTOTT: Galaxy Zoo has a new lease on life, actually. We just added in new galaxies from a telescope in Chile. These galaxies are relatively close and their images are beautiful. It’s our first proper look at the southern sky, so we have an all-new part of the universe to explore. It gives users a chance to be the first to see galaxies — if they get over to Galaxy Zoo quickly!

    VERMA: For Space Warps, we are expecting new data and new projects to be online next year.

    MORE: Here in Japan, we are leading an imaging survey called the Hyper Suprime-Cam (HSC) survey, and it’s going to be much larger and deeper than what we have been looking at so far. We expect an increase of more than an order of magnitude in the number of lenses. Currently, we are preparing images of the candidates from the HSC survey and hope to start a new lens search with Space Warps soon.

    Arguably the most famous citizen astrophysicist discovery, Hanny’s Voorwerp—Dutch for Hanny’s Object—is seen here by the Hubble Space Telescope in 2011. The Voorwerp is a gas cloud the size of a galaxy and appears green due to glowing oxygen. A Dutch schoolteacher, Hanny van Arkel, spotted the object while volunteering for Galaxy Zoo. (Credit: NASA, ESA, W. Keel (University of Alabama), and the Galaxy Zoo Team)

    TKF: Is it the thrill of discovery that entices most citizen scientist volunteers? Some of the images in Galaxy Zoo have never been seen before because they were taken by a robotic telescope and stored away. Volunteers therefore have the chance to see something no one else ever has.

    MORE: That discovery aspect is personal. I think it’s always exciting for anyone.

    LINTOTT: When we set up Galaxy Zoo, we thought it would be a huge motivation to see something that’s yours and be the first human to lay eyes on a galaxy. Exploring space in that way is something that until Galaxy Zoo only happened on “Star Trek.” [Laughter]

    In the years since, we’ve also come to realize that citizen science is a collective endeavor. The people who’ve been through 10,000 images without finding anything have contributed to the discovery of something like the red ring galaxy just as much as the person who happens to stumble across it. You need to get rid of the empty data as well. I’ve been surprised by how much our volunteers believe that. It’s a far cry from the traditional, public view of scientific discovery in which the lone genius makes the discovery and gets all the credit.

    VERMA: We set out with Space Warps to make citizen scientists part of our collaboration, and they’ve really enabled us to produce important findings. They’ve inspired us with their dedication and productivity. We’ve learned from our analysis that basically anyone who joins Space Warps has an impact on the results. We are also especially grateful to a very dedicated, diligent group that has made most of the lens classifications. We look forward to welcoming everyone back in our future projects!
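    The collective nature of the classifications can be sketched probabilistically. The Space Warps analysis aggregates volunteer votes in roughly this spirit: each volunteer has an estimated skill, and every classification, including a “no lens” vote on an empty image, updates the probability that an image contains a lens. The skills, the prior, and the symmetric-skill assumption below are simplifications for illustration, not the project’s actual numbers.

```python
# Simplified sketch of probabilistic vote aggregation, in the spirit of
# the Space Warps analysis. Each volunteer has an estimated skill
# (probability of classifying correctly, assumed the same for both
# classes here), and every vote, including "no lens" votes, updates
# the probability that the image contains a lens. Numbers are made up.

def update(p_lens, says_lens, skill):
    """One Bayesian update of P(lens) given a single volunteer's vote."""
    if says_lens:
        num = skill * p_lens
        den = skill * p_lens + (1 - skill) * (1 - p_lens)
    else:
        num = (1 - skill) * p_lens
        den = (1 - skill) * p_lens + skill * (1 - p_lens)
    return num / den

p = 1e-4  # lenses are rare, so start from a low prior
for says_lens, skill in [(True, 0.9), (True, 0.8), (True, 0.85), (False, 0.7)]:
    p = update(p, says_lens, skill)
    print(f"after vote: P(lens) = {p:.6f}")
```

Three confident “lens” votes raise the probability far above the prior, while the dissenting vote pulls it back down; in the same way, votes on images that contain nothing still sharpen the statistics, which is why classifying empty fields contributes to every discovery.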

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    STEM Education Coalition
    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

  • richardmitnick 3:27 pm on January 6, 2016 Permalink | Reply
    Tags: , , Detecting magnetic fields inside stars, Kavli Institute   

    From Kavli: “Stellar Revelations” 


    The Kavli Foundation

    Internal magnetic fields of red giants are up to 10 million times stronger than the Earth’s. No image credit found.

    Using a recently developed technique to detect magnetic fields inside stars, a group of astronomers — including Matteo Cantiello and Lars Bildsten from UC Santa Barbara’s Kavli Institute for Theoretical Physics (KITP) — has discovered that strong magnetic fields are very common in stars. The group’s findings appear in the journal Nature.

    “We have applied a novel theoretical idea that we developed just a few months ago to thousands of stars and the results are just extraordinary,” said Cantiello, a specialist in stellar astrophysics at KITP.

    Previously, only a very small percentage of stars were known to have strong magnetic fields. Therefore, current scientific models of how stars evolve do not include magnetic fields as a fundamental component.

    “Such fields have simply been regarded as insignificant for our general understanding of stellar evolution,” said lead author Dennis Stello, an astrophysicist at the University of Sydney in Australia. “Our result clearly shows this assumption needs to be revisited because we found that up to 60 percent of stars host strong fields.”

    Jim Fuller, Matteo Cantiello and Lars Bildsten (Credit: Bill Wolf)

    Until now, astronomers have been unable to detect these magnetic fields because such fields hide deep in the stellar interior, out of sight from conventional observation methods that measure only the surface properties of stars. The research team turned to asteroseismology, a technique that probes beyond the stellar surface, to determine the presence of very strong magnetic fields near the stellar core.

    “The stellar core is the region where the star produces most of its energy through thermonuclear reactions,” Cantiello explained. “So the field is likely to have important effects on how stars evolve since it can alter the physical processes that take place in the core.”

    Most stars — like the sun — are subject to continuous oscillations. “Their interior is essentially ringing like a bell,” noted co-author Jim Fuller, a postdoctoral scholar from the California Institute of Technology in Pasadena. “And like a bell or a musical instrument, the sound produced reveals physical properties, such as size, temperature and what they are made of.”

    The researchers used very precise data from NASA’s Kepler space telescope to measure tiny brightness variations caused by the ringing sound inside thousands of stars.

    NASA Kepler Telescope

    They found that certain oscillation frequencies were missing in 60 percent of the stars due to suppression by strong magnetic fields in the stellar cores.

    “It’s like having a trumpet that doesn’t sound normal because something is hiding inside it, altering the sound it produces,” Stello said.

    This magnetic suppression effect had previously been seen in only a few dozen stars. However, the new analysis of the full data set from Kepler revealed that this effect is prevalent in stars that are only slightly more massive than the sun.
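    The measurement can be sketched in miniature (all frequencies and amplitudes below are arbitrary illustration values, not Kepler data): synthesize a light curve as a sum of two oscillation modes, then compare the Fourier power at a “dipole” frequency with that at a “radial” frequency. For a magnetically suppressed star, the dipole-to-radial power ratio comes out reduced.

```python
import math

# Toy sketch of the asteroseismic measurement. A light curve is
# synthesized as a sum of two oscillation modes, and the Fourier power
# at a "dipole" frequency is compared with that at a "radial" frequency.
# All frequencies and amplitudes are arbitrary, not Kepler data.

def power_at(signal, dt, freq):
    """Discrete Fourier power of an evenly sampled series at one frequency."""
    re = sum(s * math.cos(2 * math.pi * freq * i * dt) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i * dt) for i, s in enumerate(signal))
    return re * re + im * im

F_RADIAL, F_DIPOLE = 3.0, 3.5  # mode frequencies (arbitrary units)

def light_curve(dipole_amp, n=2000, dt=0.01):
    """Sum of a radial mode (amplitude 1) and a dipole mode."""
    return [math.sin(2 * math.pi * F_RADIAL * i * dt)
            + dipole_amp * math.sin(2 * math.pi * F_DIPOLE * i * dt)
            for i in range(n)]

for name, amp in (("normal", 1.0), ("suppressed", 0.2)):
    lc = light_curve(amp)
    ratio = power_at(lc, 0.01, F_DIPOLE) / power_at(lc, 0.01, F_RADIAL)
    print(f"{name}: dipole/radial power ratio = {ratio:.2f}")
```

In this synthetic example the “suppressed” star’s dipole-to-radial power ratio drops from about one to a few percent; a depressed ratio of this kind is the qualitative signature the team searched for in the Kepler power spectra.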

    According to Cantiello, such intermediate mass stars are hotter and more luminous, and their cores are stirred by convection. “We believe that the magnetic field is created by this ‘boiling’ sequence and stored inside the star for the remaining evolutionary phase. Astrophysicists previously have suggested this but it was very speculative; now it seems clear that this is the case,” he said.

    “This is a very important result that will enable scientists to test more directly current theories for how magnetic fields form and evolve in stellar interiors,” said co-author Bildsten, the director of KITP. “When a star dies, the presence of strong magnetic fields can have a profound impact, possibly resulting in some of the brightest explosions in the universe.”

    This research could potentially lead to a better general understanding of stellar magnetic dynamos, including the one controlling the sun’s 11-year sunspot cycle, which is known to affect communication systems and cloud cover on Earth.

    “So far, the study of stellar magnetic dynamos principally relied on computer simulations, which now can be tested using these new exciting observations,” said Fuller.


  • richardmitnick 2:43 pm on October 15, 2015 Permalink | Reply
    Tags: , , Kavli Institute   

    From Kavli: “US Neuroscientists Call for Creation of ‘Brain Observatories'” 


    The Kavli Foundation

    Originally published by Cell Press


    What is the future of the BRAIN Initiative? This national White House Grand Challenge involving more than 100 laboratories in the United States has already made progress in establishing large-scale neuroscience goals and developing shared tools. And now in an Opinion paper publishing October 15 in Neuron, leading American neuroscientists call for the next step: a coordinated national network of neurotechnology centers or “brain observatories.”

    “It is our view that the technological challenges that must be surmounted are sufficiently complex that they are beyond the reach of single-investigator efforts; we believe they can only be surmounted through highly coordinated, multi-investigator, cross-disciplinary efforts,” the authors write. “These centers could be similar to existing astronomical observatories, where large-scale technology development and deployment is carried out in a centralized fashion, and where facilities are then shared by the entire community.”

    The six authors include Rafael Yuste, co-director of the Kavli Institute for Brain Science at Columbia University, Michael Roukes of Caltech, Ralph Greenspan of the Kavli Institute for Brain and Mind, George Church of the Wyss Institute and Harvard Medical School, Miyoung Chun of The Kavli Foundation, and A. Paul Alivisatos of the University of California, Berkeley and the Kavli Energy NanoSciences Institute.

    They outline four primary areas of the BRAIN Initiative that are critically dependent on new technology unlikely to be realized quickly outside of a center-based framework. These include connectomics – the systematic reconstruction of neural circuits – neural nanoprobe systems, new resonance imaging technologies, and computational data mining. Each of these requires platforms that are expensive to acquire, implement, and maintain, and hosting them in only one lab would limit scientific reproducibility and robustness.

    “We believe that the early stage of the BRAIN initiative has laid the groundwork for the next critical stages: enabling the development of integrated neurotechnology systems and, subsequently, the broad dissemination of newly created tools,” the authors write. “There is tremendous opportunity for rapid progress in the four areas mentioned above if the BRAIN Initiative expands beyond its current portfolio of single- and few-investigator projects. These centers would unite and synergize the hundreds of individual laboratories now funded by the BRAIN Initiative.”

