Tagged: Kavli Institute

  • richardmitnick 6:21 am on February 10, 2015 Permalink | Reply
    Tags: , Kavli Institute, Optical antennae   

    From Kavli: “Rediscovering Spontaneous Light Emission” 


    The Kavli Foundation

    02/05/2015

    Media Contact

    James Cohen
    Director of Communications
    The Kavli Foundation
    (805) 278-7495
    cohen@kavlifoundation.org

    Spontaneous light emissions from LEDs can be substantially enhanced when coupled to the right optical antenna, making them comparable to the stimulated emissions from lasers. (Image from Wikipedia)

    Berkeley Lab researchers have developed a nano-sized optical antenna that can greatly enhance the spontaneous emission of light from atoms, molecules and semiconductor quantum dots. This advance opens the door to light-emitting diodes (LEDs) that can replace lasers for short-range optical communications, including optical interconnects for microchips, plus a host of other potential applications.

    “Since the invention of the laser, spontaneous light emission has been looked down upon in favor of stimulated light emission,” says Eli Yablonovitch, an electrical engineer with Berkeley Lab’s Materials Sciences Division. “However, with the proper optical antenna, spontaneous emission can actually be faster than stimulated emission.”

    Yablonovitch, who also holds a faculty appointment with the University of California (UC) Berkeley where he directs the NSF Center for Energy Efficient Electronics Science (E3S), and is a member of the Kavli Energy NanoSciences Institute at Berkeley (Kavli ENSI), led a team that used an external antenna made from gold to effectively boost the spontaneous light emission of a nanorod made from Indium Gallium Arsenide Phosphide (InGaAsP) by 115 times. This is approaching the 200-fold increase that is considered the landmark in speed difference between stimulated and spontaneous emissions. When a 200-fold increase is reached, spontaneous emission rates will exceed those of stimulated emissions.

    Eli Yablonovitch is an award-winning electrical engineer with Berkeley Lab, UC Berkeley and Kavli ENSI (photo by Roy Kaltschmidt)

    “With optical antennas, we believe that spontaneous emission rate enhancements of better than 2,500 times are possible while still maintaining light emission efficiency greater than 50 percent,” Yablonovitch says. “Replacing wires on microchips with antenna-enhanced LEDs would allow for faster interconnectivity and greater computational power.”

    The results of this study are reported in the Proceedings of the National Academy of Sciences (PNAS) in a paper titled “Optical antenna enhanced spontaneous emission.” Yablonovitch and UC Berkeley’s Ming Wu are the corresponding authors. Other authors are Michael Eggleston, Kevin Messer and Liming Zhang.

    In the world of high technology, lasers are ubiquitous, the reigning workhorse for high-speed optical communications. Lasers, however, have downsides for communications over short distances, i.e., one meter or less – they consume too much power and typically take up too much space. LEDs would be a much more efficient alternative but have been limited by their slow spontaneous emission rates.

    “Spontaneous emission from molecular-sized radiators is slowed by many orders of magnitude because molecules are too small to act as their own antennas,” Yablonovitch says. “The key to speeding up these spontaneous emissions is to couple the radiating molecule to a half-wavelength antenna. Even though we’ve had antennas in radio for 120 years, somehow we’ve overlooked antennas in optics. Sometimes the great discoveries are looking right at us and waiting.”
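
    One standard benchmark for such rate enhancements – not quoted in the article, but the usual textbook yardstick – is the Purcell factor for an emitter coupled to a resonant structure with quality factor Q and effective mode volume V:

    $$F_P \;=\; \frac{\Gamma_{\text{antenna}}}{\Gamma_{\text{free}}} \;=\; \frac{3}{4\pi^{2}}\left(\frac{\lambda}{n}\right)^{3}\frac{Q}{V},$$

    where $\lambda$ is the free-space emission wavelength and $n$ is the refractive index. An optical antenna works by squeezing the effective mode volume V far below $(\lambda/n)^{3}$, which is how enhancements of hundreds to thousands of times become plausible.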

    Optical antennae: Coupling a gold antenna to an InGaAsP nanorod, isolated by TiO2 and embedded in epoxy, greatly enhanced the spontaneous light emission of the InGaAsP

    For their optical antenna, Yablonovitch and his colleagues used an arch antenna configuration. The surface of a square-shaped InGaAsP nanorod was coated with a layer of titanium dioxide to provide isolation between the nanorod and a gold wire that was deposited perpendicularly over the nanorod to create the antenna. The InGaAsP semiconductor that served as the spontaneous light-emitting material is already in wide use for infrared laser communications and photodetectors.

    In addition to short distance communication applications, LEDs equipped with optical antennas could also find important use in photodetectors. Optical antennas could also be applied to imaging, bio-sensing and data storage applications.

    This research was supported by E3S, the U.S. Air Force Office of Scientific Research, and the U.S. Department of Energy’s Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

     
  • richardmitnick 9:25 am on January 29, 2015 Permalink | Reply
    Tags: , , Kavli Institute   

    From Kavli Foundation: “Bubbles From the Center of Our Galaxy: A Key to Understanding Dark Matter and the Milky Way’s Past?” 


    The Kavli Foundation

    Winter 2014
    Kelen Tuttle

    Three astrophysicists who discovered two enormous and unexpected structures radiating from the center of our galaxy discuss what these mysterious bubbles can tell us about the history of the Milky Way and how they could help in the search for dark matter.

    From end to end, the newly discovered gamma-ray bubbles (magenta) extend 50,000 light-years, or roughly half of the Milky Way’s diameter. (Credit: NASA’s Goddard Space Flight Center)

    COMPARED TO OTHER GALAXIES, the Milky Way is a peaceful place. But it hasn’t always been so sleepy. In 2010, a team of scientists working at the Harvard–Smithsonian Center for Astrophysics discovered a pair of “Fermi bubbles” extending tens of thousands of light-years above and below the Milky Way’s disk.

    NASA Fermi Gamma-ray Space Telescope

    These structures are enormous balloons of energetic gamma rays emanating from the center of our galaxy. They hint at a powerful event that took place millions of years ago, likely when the black hole at the center of our galaxy feasted on an enormous amount of gas and dust – perhaps several hundreds or even thousands of times the mass of the sun. But exactly how the bubbles formed, and the exact story they can tell us about the history of our galaxy, remains a mystery.

    Fresh from giving the January 6 Rossi Prize lecture at the Winter American Astronomical Society conference, three astrophysicists who discovered the Fermi bubbles spoke with The Kavli Foundation about ongoing attempts to understand the cause and implications of these unexpected and strange structures, as well as ways in which they may help in the hunt for dark matter.

    Douglas Finkbeiner (Credit: Erin Cram)
    DOUGLAS FINKBEINER is a professor of astronomy and of physics at Harvard University and a member of the Institute for Theory and Computation at the Harvard–Smithsonian Center for Astrophysics. He was part of a collaboration that first discovered a gamma ray “haze” near the center of the Milky Way.

    Tracy Slatyer (Credit: Heather Williams/MIT School of Science)
    TRACY SLATYER is an assistant professor of physics at the Massachusetts Institute of Technology and an Affiliated Faculty member at the MIT Kavli Institute for Astrophysics and Space Research. Working with Finkbeiner and Su, she showed that the gamma ray haze is in fact emission from two hot bubbles of plasma emanating from the galactic center.

    Meng Su (Credit: Yuqi Qin)
    MENG SU is a Pappalardo Fellow and an Einstein Fellow at the Massachusetts Institute of Technology and the MIT Kavli Institute for Astrophysics and Space Research. He developed the first maps that showed the exact shape of the Fermi bubbles.

    The following is an edited transcript of their roundtable discussion. The participants have been provided the opportunity to amend or edit their remarks.

    THE KAVLI FOUNDATION: When the three of you discovered Fermi bubbles in 2010, they were a complete surprise. No one anticipated the existence of such structures. What were your first thoughts when you saw these huge bubbles – which span more than half of the visible sky – emerge from the data?

    DOUGLAS FINKBEINER: How about crushing disappointment? There seems to be a popular misconception that scientists know what they’re looking for and when they find it, they know it. In reality, that’s often not how it works. In this case, we were on a quest to find dark matter, and we found something completely different. So at first I was puzzled, baffled, disappointed and confused.

    We had been looking for evidence of dark matter in the inner galaxy, which would have shown up as gamma rays. And we did find an excess of gamma rays, so for a little while we thought this might be a dark matter signal. But as we did a better analysis and added more data, we started to see the edges of this structure. It looked like a big figure 8 with a balloon above and below the plane of the galaxy. Dark matter probably wouldn’t do that. At the time, I made the tongue-in-cheek comment that we had double bubble trouble. Instead of a nice spherical halo like we would see with dark matter, we were finding these two bubbles.

    TRACY SLATYER: I called a talk on the Fermi bubbles “Double Bubble Trouble” – it has such a nice ring to it.

    FINKBEINER: It does. After my first thought – “Oh darn, it’s not dark matter” – my second thought was, “Oh, it’s still something very interesting, so now let’s go find out what it is.”

    SLATYER: At the time, Doug, you told me something along the lines of “Scientific discoveries are more often heralded by ‘Huh, that looks funny’ than by ‘Eureka!’” – though it may have been Isaac Asimov who said it first. When we first started seeing the edge of these bubbles emerge, I remember looking at the maps with Doug, who was pointing out where he thought there were edges, and not seeing them at all myself. And then more data started coming in and they became clearer and clearer.

    So my first reaction was more like “Huh, that looks really strange.” But I wouldn’t call myself disappointed. It was a puzzle that we needed to figure out.

    FINKBEINER: Maybe befuddled is a better descriptor than disappointed.

    MENG SU: I agree. We already knew of other bubble-like structures in the universe, but this was still quite a big shock. Finding these bubbles in the Milky Way wasn’t anticipated by any theories. When Doug first showed us the picture where you could start to see the bubbles, I immediately started to think about what could possibly produce this type of structure besides dark matter. I personally was less puzzled by the structure itself and more puzzled by how the Milky Way could have produced it.

    SLATYER: But of course it’s also true that the structures we see in other galaxies have never been seen in gamma rays. As far as I know, beyond the question of whether the Milky Way could make a structure like this, there had never been an expectation that we would see a bright signal in gamma rays.

    SU: That’s right. This discovery is still unique and, to me, puzzling.

    TKF: Why were such bubbles not expected in the Milky Way, if they are seen in other galaxies?

    FINKBEINER: It’s a good question. On the one hand we’re saying that these aren’t uncommon in other galaxies, while on the other hand we’re saying they were totally unexpected in the Milky Way. One of the reasons it was unexpected is that while every galaxy has a supermassive black hole at the center, in the Milky Way that black hole is about 4 million times the mass of the sun while in the galaxies in which we had previously observed bubbles, the black holes tend to be 100 or 1,000 times more massive than our black hole. And because we think it’s the black hole sucking in nearby matter that’s making most of these bubbles, you wouldn’t have expected a small black hole like the one we have in the Milky Way to be capable of this.

    SU: For that reason, no one expected to see bubbles in our galaxy. We thought the black hole at the center of the Milky Way was a boring one that just sat there quietly. But more and more evidence is suggesting that it was very active a long time ago. It now seems that, in the past, our black hole could have been tens of millions of times more active than it is currently. Before the discovery of Fermi bubbles, people were discussing that possibility, but there was no single piece of evidence showing that our black hole could be that active. The Fermi bubble discovery changed the picture.

    SLATYER: Exactly. Other galaxies that have similar looking structures are in fact quite different galactic environments. It’s not clear that bubbles we see in other galaxies with fairly similar shapes to the ones we see in the Milky Way are necessarily coming from the same physical processes. Due to the sensitivity of the instruments, we have no way to look at the gamma rays associated with these bubbles in other Milky Way-like galaxies – if they release gamma rays at all. The Fermi bubbles are really our first chance to look at anything like this close up and in gamma rays, and we just don’t know if many of the very puzzling features of the Fermi bubbles are present in other galaxies. It’s quite unclear at the moment the degree to which the Fermi bubbles are the same phenomenon as what we see in similarly shaped structures at other wavelengths in other galaxies.

    SU: I think it’s actually very lucky that our galaxy has these structures. We get to look at them very clearly and with great sensitivity, allowing us to study them in detail.

    SLATYER: Something like this could be present in other galaxies, and we would never know.

    SU: Yes – and the opposite is true, too. It’s completely possible that the Fermi bubbles are from something we’ve never seen before.

    FINKBEINER: Exactly. And, for example, the X-rays we do see coming from bubbles in other galaxies, those photons have a factor of a million times less energy than the gamma rays we see streaming from the Fermi bubbles. So we should not jump to conclusions that they come from the same physical processes.

    SU: And, here in our own galaxy, I think more people are asking questions about the implications of the Milky Way’s black hole being so active. I think the picture and the questions are different now. Discovering this structure has very important implications to many key questions about the Milky Way, galaxy formation and black hole growth.
    “More and more evidence points to the story that the supermassive black hole in the center of our Milky Way was very active a long time ago. Before the discovery of Fermi bubbles, people were discussing the possibility, but there was no single piece of evidence showing that our black hole could be that active. The Fermi bubble discovery changed the picture.” —Meng Su

    TKF: Doug and Meng, in a Scientific American article you coauthored with Dmitry Malyshev, you said that Fermi bubbles “promise to reveal deep secrets about the structure and history of our galaxy.” Will you tell us more about what type of secrets these might be?

    SU: There are at least two key questions we’re trying to answer about the supermassive black holes in the center of each galaxy: How does the black hole itself form and grow? And, as the black hole grows, what’s the interaction between the black hole and the host galaxy?

    I think that how the Milky Way fits into this big picture is still a mystery. We don’t know why the mass of the black hole in the center of the Milky Way is so small relative to other supermassive black holes, or how the interaction between this relatively small black hole and the Milky Way galaxy works. The bubbles provide a unique link for both how the black hole grew and how the energy injection from the black hole accretion process impacted the Milky Way as a whole.

    FINKBEINER: Some of our colleagues at the Harvard–Smithsonian Center for Astrophysics conduct simulations where they can see how supernova explosions and black hole accretion events heat gas and drive it out of a galaxy. You can see in some of these simulations that things are going along just fine and stars are forming and the galaxy is rotating and everything is progressing, and then the black hole reaches some critical size. Suddenly, when more matter falls into the black hole, it makes such a big flash that it basically pushes most of the gas right out of the galaxy. After that, there’s no more star formation – you’re kind of done. That feedback process is key to galaxy formation.

    SU: If the bubbles – like the ones we found – form episodically, that could help us understand how the energy outflow from the black hole changes the gas halo within the Milky Way’s dark matter halo. When this gas cools, the Milky Way forms stars. So the whole system will be changed because of the bubble story; the bubbles are closely linked to the history of our galaxy.

    TKF: What additional experimental data or simulations are needed to really understand what’s going on with these bubbles?

    SU: Right now, we’re focused on two things. First, from multi-wavelength observations, we’re looking to understand the current status of the bubbles – how fast they expand, how much energy is released through them, and how high-energy particles within the bubbles are accelerated either close to the black hole or inside the bubbles themselves. Those details we want to understand as much as possible through observations. Second, we want to understand the physics. For example, we want to understand just how the bubbles formed in the first place. Could a burst of star formation very close to the black hole help form the outflow that powers the bubbles? This can help us understand what kind of process forms these types of bubbles.

    FINKBEINER: Any type of work that can give you the amount of energy released over specific timescales is really important to figuring out what’s going on.

    SU: Truthfully, I think it’s amazing how many of the conclusions we drew from the very first observations of the bubbles still hold true today. The energy, the velocity, the age of the bubbles – all of these are consistent with today’s observations. All of the observations point to the same story, which allows us to ask more detailed questions.

    TKF: That doesn’t often happen in astrophysics, where your initial observations are so spot-on.

    FINKBEINER: This doesn’t always happen, it’s true. But we also weren’t very precise. Our paper says that the bubbles are somewhere between 1 and 10 million years old, and now we think they’re about 3 million years old, which is logarithmically right between 1 and 10 million. So, we’re pretty happy. But it’s not like we said it would be 3.76 million and were right.
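
    A quick back-of-the-envelope check (mine, not the speakers’): the geometric mean of the original bounds is

    $$\sqrt{(1\times10^{6}\,\mathrm{yr})\,(10\times10^{6}\,\mathrm{yr})} \;=\; \sqrt{10}\times10^{6}\,\mathrm{yr} \;\approx\; 3.2\times10^{6}\,\mathrm{yr},$$

    which is why 3 million years sits “logarithmically right between” 1 and 10 million.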

    TKF: What are the other remaining mysteries about these bubbles? What more do you hope to learn that we haven’t discussed already?

    FINKBEINER: We have an age. I’m done. [laughter]

    TKF: Ha! Now that does not sound like astrophysics.

    SU: No, actually, we expect to learn many new things from future observations. We’ll have additional satellites launching in the coming years that will offer better measurements of the bubbles. One surprising thing we’ve found is that the bubbles have a high-energy cutoff. Basically, the bubbles stop shining in high-energy gamma rays at a certain energy. Above that, we don’t see any gamma rays and we don’t know why. So we hope to take better measurements that can tell us why this cutoff is happening. This can be done with future gamma-ray satellites, including one called the Dark Matter Particle Explorer that will launch later this year. Although the satellite is focused on looking for signatures of dark matter, it will also be able to detect these high-energy gamma rays at energies even higher than the Fermi Gamma-ray Space Telescope can reach – the telescope we used to discover the Fermi bubbles, and the source of the structure’s name.

    Hints of the Fermi bubbles’ edges were first observed in X-rays (blue) by ROSAT, which operated in the 1990s. The gamma rays mapped by the Fermi Gamma-ray Space Telescope (magenta) extend much farther from the galaxy’s plane. (Credit: NASA’s Goddard Space Flight Center)

    NASA ROSAT satellite

    Likewise, we’re also interested in the lower energy gamma rays. There are some limitations with the Fermi satellite we’re currently using – the spatial resolution is not nearly as good for low-energy gamma rays. So we hope to launch another satellite in the future that can view the bubbles in low-energy gamma rays. I’m actually part of a team proposing to build this satellite, and I’m glad to find a good name for it: PANGU. It’s still in the early stages, but hopefully we can get the data within 10 years. From this, we hope to learn more about the processes within the bubbles that lead to the emission of gamma rays. We need more data to understand this.

    We’d also like to learn more about the bubbles in X-rays, which also hold key information. For example, X-rays could tell us how the bubbles affect the gas in the Milky Way’s halo. The bubbles presumably heat up the gas as they expand into the halo. We’d like to measure how much the energy from the bubbles is dumped into the gas halo. That’s key to understanding the black hole’s impact on star formation. A new German-Russian satellite called eRosita, planned to launch in 2016, could help with this. We hope its data will help us learn details about all the pieces of the bubble and how they interact with the gas around them.

    FINKBEINER: I completely agree with what Meng just said. That’s going to be a very important data set.

    SLATYER: Figuring out the exact origin of the bubbles is something I’m looking forward to. For example, if you make some basic assumptions, it looks like the gamma-ray signal has some very strange features. Particularly, the fact that the bubbles look so uniform all the way across is surprising. You wouldn’t expect the physics processes we think are taking place inside the bubbles to produce this uniformity. Are there multiple processes at work here? Does the radiation field within the bubbles look very different than what we expect? Is there an odd cancellation going on between the electron density and radiation field? These are just some of the questions we still have, questions that more observations – like the ones Meng was talking about – should shed light on.

    FINKBEINER: In other words, we’re still looking in detail and saying, “That looks funny.”
    “Other galaxies that have similar looking structures are in fact quite different galactic environments. It’s not clear that bubbles that we see in other galaxies that have fairly similar shapes to the ones we see in the Milky Way are necessarily coming from the same physical processes.” —Tracy Slatyer

    TKF: It sounds like there are still many more observations that need to be made before we can fully understand the Fermi bubbles. But from what we do know already, is there anything that could fire up the galactic core again, causing it to create more such bubbles?

    FINKBEINER: Well, if we’re right that the bubbles come from the black hole sucking up a lot of matter, just drop a bunch of gas on the black hole and you’ll see fireworks.

    TKF: Is there a lot of matter near our black hole that could naturally set off these fireworks?

    FINKBEINER: Oh sure! I don’t think it’ll happen in our lifetimes, but if you wait maybe 10 million years, I wouldn’t be surprised at all.

    SU: There are smaller bits of matter, like a cloud of gas called G2 that people estimate has as much mass as perhaps three Earths, that will likely be pulled into the black hole in just a few years. That will probably not produce something like the Fermi bubbles, but it will tell us something about the environment around the black hole and the physics of this process. Those observations might help us learn how much mass it would have taken to create the Fermi bubbles and what types of physics played out in that process.

    FINKBEINER: It’s true, we might learn something interesting from this G2 cloud. But this might be a bit of a red herring, since no reasonable model indicates it will produce gamma rays. It would take a gas cloud something like 100,000,000 times larger to produce a Fermi bubble.
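
    A rough consistency check (mine, not the speakers’), taking “larger” to mean more massive: G2 is estimated at about 3 Earth masses, and one solar mass is roughly $3.3\times10^{5}$ Earth masses, so

    $$3\,M_{\oplus}\times10^{8} \;\approx\; 3\times10^{8}\,M_{\oplus} \;\approx\; \frac{3\times10^{8}}{3.3\times10^{5}}\,M_{\odot} \;\approx\; 900\,M_{\odot},$$

    which lines up with the “several hundreds or even thousands of times the mass of the sun” quoted earlier for the event thought to have inflated the bubbles.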

    SU: There’s a lot of evidence that the galactic center was a very different environment several million years ago. But it’s hard to deduce the overall story of exactly how things were in the past and what’s happened in the intervening time. I think the Fermi bubbles might provide a unique, direct piece of evidence that there was once much richer surrounding gas and dust that fed the central black hole than there is today.

    TKF: The Fermi bubbles certainly remain an exciting area of research. So does dark matter, which is what you were originally looking for when you discovered the Fermi bubbles. How is that original dark matter hunt going?

    Data from the Fermi Telescope shows the bubbles (in red and yellow) against other sources of gamma rays. The plane of the galaxy (mostly black and white) stretches horizontally across the middle of the image, and the bubbles extend up and down from the center. (Credit: NASA’s Goddard Space Flight Center)

    FINKBEINER: We’ve really come full circle. If one of the most talked about types of theoretical dark matter particles, the Weakly Interacting Massive Particle, or WIMP, exists, it should give off some sort of gamma-ray signal. It’s just a question of whether that signal is at a level that we can detect. So if you ever want to see this signal in the inner galaxy, you have to understand all the other things that make gamma rays. We thought we understood them all, and then along came the Fermi bubbles. Now we really need to thoroughly understand these bubbles before we can go back to looking for WIMPs in the center of the galaxy. Once we understand them well, we can confidently subtract the Fermi bubble gamma rays from the overall gamma-ray signal and look for any excess of gamma rays remaining that might come from dark matter.
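
    As a purely illustrative sketch of that “subtract the bubbles, then look for what’s left” bookkeeping – the file names, the simple additive model and the one-parameter template fit below are hypothetical, not the collaboration’s actual analysis pipeline – the idea looks roughly like this in Python:

    import numpy as np

    # Hypothetical all-sky gamma-ray maps, one value per sky pixel.
    observed = np.load("observed_counts.npy")          # measured photon counts
    diffuse = np.load("galactic_diffuse_model.npy")    # cosmic-ray-induced emission
    isotropic = np.load("isotropic_background.npy")    # extragalactic + instrumental
    point_sources = np.load("point_source_model.npy")  # known gamma-ray sources
    bubbles = np.load("fermi_bubble_template.npy")     # flat bubble template

    # Fit a single normalization for the bubble template against what is left
    # after the other known components are removed (simple least squares).
    known = diffuse + isotropic + point_sources
    residual = observed - known
    bubble_norm = np.dot(residual, bubbles) / np.dot(bubbles, bubbles)

    # Whatever remains after every known component is subtracted is where one
    # would look for a possible dark-matter excess toward the Galactic Center.
    excess = observed - known - bubble_norm * bubbles
    print("mean residual per pixel:", excess.mean())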

    Putting together quotations from Richard Feynman and Valentine Telegdi, “Yesterday’s sensation is today’s calibration is tomorrow’s background.” The Fermi bubbles are certainly very interesting in their own right, and they’ll keep people busy for many years trying to figure out what they are. But they’re also a background or a foreground for any dark matter searches, and need to be understood for that reason too.
    “It would be a supreme irony if we found the Fermi bubbles while looking for dark matter and then while studying the Fermi bubbles we discovered dark matter.”
    —Douglas Finkbeiner

    SLATYER: This is what I’m working on in my research these days. And the first response to what Doug just said is often, “Well, why don’t you just look for evidence of dark matter somewhere other than the inner galaxy?” But in WIMP models of dark matter, we expect the signals from the galactic center to be significantly brighter than anywhere else in the sky. So just giving up on the galactic center is not generally a good option.

    Looking at the Fermi bubbles near the galactic center, we have found a promising signal that could potentially be associated with dark matter. It extends a significant distance from the galactic center, and has a lot of the properties that you would expect from a dark matter signal – including appearing outside the bubbles as well.

    This is a very concrete case where studies of the Fermi bubbles uncovered something that may be related to dark matter – which is what we were looking for in the first place. It also emphasizes the importance of understanding what exactly is going on in the bubbles, so that we can get a better understanding of this very interesting region of the sky.

    FINKBEINER: It would be a supreme irony if we found the Fermi bubbles while looking for dark matter and then while studying the Fermi bubbles we discovered dark matter.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

     
  • richardmitnick 3:02 pm on December 12, 2014 Permalink | Reply
    Tags: , , Kavli Institute   

    From Kavli: “Is an Understanding of Dark Matter around the Corner? Experimentalists Unsure” 


    The Kavli Foundation

    December 12, 2014

    Media Contact

    James Cohen
    Director of Communications
    The Kavli Foundation
    (805) 278-7495
    cohen@kavlifoundation.org

    Scientists have long known that dark matter is out there, silently orchestrating the universe’s movement and structure. But what exactly is dark matter made of? And what does a dark matter particle look like? That remains a mystery, with experiment after experiment coming up empty handed in the quest to detect these elusive particles.

    With some luck, that may be about to change. With ten times the sensitivity of previous detectors, three recently funded dark matter experiments have scientists crossing their fingers that they may finally glimpse these long-sought particles. In recent conversations with The Kavli Foundation, scientists working on these new experiments expressed hope that they would catch dark matter, but also agreed that, in the end, their success or failure is up to nature to decide.

    “Nature is being coy,” said Enectali Figueroa-Feliciano, an associate professor of physics at the MIT Kavli Institute for Astrophysics and Space Research who works on one of the three new experiments. “There’s something we just don’t understand about the internal structure of how the universe works. When theorists write down all the ways dark matter might interact with our particles, they find, for the simplest models, that we should have seen it already. So even though we haven’t found it yet, there’s a message there, one that we’re trying to decode now.”

    The first of the new experiments, called the Axion Dark Matter eXperiment (ADMX), searches for a theoretical type of dark matter particle called the axion. ADMX seeks evidence of this extremely lightweight particle converting into a photon in the experiment’s strong magnetic field. By slowly tuning the resonant frequency of its microwave cavity, the detector hunts for one axion mass at a time.

    ADMX (Axion Dark Matter eXperiment) at the University of Washington

    “We’ve demonstrated that we have the tools necessary to see axions,” said Gray Rybka, research assistant professor of physics at the University of Washington who co-leads the ADMX Gen 2 experiment. “With Gen2, we’re buying a very, very powerful refrigerator that will arrive very shortly. Once it arrives, we’ll be able to scan very, very quickly and we feel we’ll have a much better chance of finding axions – if they’re out there.”

    The two other new experiments look for a different type of theoretical dark matter called the WIMP. Short for Weakly Interacting Massive Particle, the WIMP interacts with our world very weakly and very rarely. The Large Underground Xenon, or LUX, experiment, which began in 2009, is now getting an upgrade to increase its sensitivity to heavier WIMPs. Meanwhile, the Super Cryogenic Dark Matter Search collaboration, which has looked for the signal of a lightweight WIMP barreling through its detector since 2013, is in the process of finalizing the design for a new experiment to be located in Canada.

    LUX dark matter detector

    SuperCDMS (Super Cryogenic Dark Matter Search)

    “In a way it’s like looking for gold,” said Figueroa-Feliciano, a member of the SuperCDMS experiment. “Harry has his pan and he’s looking for gold in a deep pond, and we’re looking in a slightly shallower pond, and Gray’s a little upstream, looking in his own spot. We don’t know who’s going to find gold because we don’t know where it is.”

    Rybka agreed, but added the more optimistic perspective that it’s also possible that all three experiments will find dark matter. “There’s nothing that would require dark matter to be made of just one type of particle except us hoping that it’s that simple,” he said. “Dark matter could be one-third axions, one-third heavy WIMPs and one-third light WIMPs. That would be perfectly allowable from everything we’ve seen.”

    Yet the nugget of gold for which all three experiments search is a very valuable one. And even though the search is difficult, all three scientists agreed that it’s worthwhile because glimpsing dark matter would reveal insight into a large portion of the universe.

    “We’re all looking and somewhere, maybe even now, there’s a little bit of data that will cause someone to have an ‘Ah ha!’ moment,” said Harry Nelson, professor of physics at the University of California, Santa Barbara and science lead for the LUX upgrade, called LUX-ZEPLIN. “This idea that there’s something out there that we can’t sense yet is one of those things that sends chills down my spine.”

    More about the hunt for dark matter is available at:

    New Dark Matter Experiments Prepare to Hunt the Unknown: A Conversation with Enectali Figueroa-Feliciano, Harry Nelson and Gray Rybka
    Spotlight Live: Dark Matter at Long Last? Three New Experiments Ramp Up (Transcript)

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

     
  • richardmitnick 12:00 pm on October 24, 2014 Permalink | Reply
    Tags: , Charles Munger, Kavli Institute, ,   

    From NYT: “Charles Munger, Warren Buffett’s Longtime Business Partner, Makes $65 Million Gift” 


    The New York Times

    October 24, 2014
    Michael J. de la Merced

    Charles T. Munger has been known for many things over his decades-long career, including longtime business partner of Warren E. Buffett; successful investor and lawyer; and plain-spoken commentator with a wide following.


    Now Mr. Munger, 90, can add another title to that list: deep-pocketed benefactor to the field of theoretical physics.

    He was expected to announce on Friday that he has donated $65 million to the Kavli Institute for Theoretical Physics at the University of California, Santa Barbara. The gift — the largest in the school’s history — will go toward building a 61-bed residence for visitors to the institute, which brings together physicists for weeks at a time to exchange ideas.

    “U.C.S.B. has by far the most important program for visiting physicists in the world,” Mr. Munger said in a telephone interview. “Leading physicists routinely are coming to the school to talk to one another, create new stuff, cross-fertilize ideas.”

    UC Santa Barbara campus

    The donation is the latest gift by Mr. Munger, a billionaire who has not been shy in giving away the wealth he has accumulated as vice chairman of Mr. Buffett’s Berkshire Hathaway to charitable causes.

    Though perhaps not as prominent a donor as his business partner, who cocreated the Giving Pledge campaign for the world’s richest people to commit their wealth to philanthropy, Mr. Munger has frequently donated big sums to schools like Stanford and the Harvard-Westlake School. (He has not signed on to the Giving Pledge campaign.)

    The biggest beneficiary of his largess thus far has been the University of Michigan, his alma mater. Last year alone, he gave $110 million worth of Berkshire shares — one of the biggest gifts in the university’s history — to create a new residence intended to help graduate students from different areas of study mingle and share ideas.

    That same idea of intellectual cross-pollination underpins the Kavli Institute, which over 35 years has established itself as a haven for theoretical physicists from around the world to meet and discuss potential new developments in their field.

    Funded primarily by the National Science Foundation, the institute has produced advances in the understanding of white dwarf stars, string theory and quantum computing.

    A former director of the institute, David J. Gross, shared in the 2004 Nobel Prize in Physics for work that shed new light on the fundamental force that binds together the atomic nucleus.

    “Away from day-to-day responsibilities, they are in a different mental state,” Lars Bildsten, the institute’s current director, said of the center’s visitors. “They’re more willing to wander intellectually.”

    To Mr. Munger, such interactions are crucial for the advancement of physics. He cited international conferences attended by the likes of [Albert] Einstein and Marie Curie.

    Mr. Munger himself did not study physics for very long, having taken a class at the California Institute of Technology while in the Army during World War II. But as an avid reader of scientific biography, he came to appreciate the importance of the field.

    And he praised the rise of the University of California, Santa Barbara, as a leading haven for physics, particularly given its status as a relatively young research institution.

    But while the Kavli Institute conducts various programs throughout the year for visiting scientists, it has long lacked a way for physicists to spend time outside of work hours during their stays. A permanent residence hall would allow them to mingle even more, in the hope of fostering additional eureka moments.

    “We want to make their hardest choice, ‘Which barbecue to go to?’ ” Mr. Bildsten joked.

    Though Mr. Munger has some ties to the University of California, Santa Barbara — a grandson is an alumnus — he was first introduced to the Kavli Institute through a friend who lives in Santa Barbara.

    During one of the pair’s numerous fishing trips, that friend, Glen Mitchel, asked the Berkshire vice chairman to help finance construction of a new residence. The university had already reserved a plot of land for the dormitory in case the institute raised the requisite funds.

    “It wasn’t a hard sell,” Mr. Munger said.

    “Physics is vitally important,” he added. “Everyone knows that.”

    See the full article here.


     
  • richardmitnick 7:28 pm on October 6, 2014 Permalink | Reply
    Tags: , , Kavli Institute   

    From Kavli: ” A Warm Dark Matter Search Using XMASS “ 


    The Kavli Foundation

    10/06/2014
    Yoichiro Suzuki
    Kavli Institute for the Physics and Mathematics of the Universe, The University of Tokyo
    E-mail: yoichiro.suzuki_at_ipmu.jp 

    The XMASS collaboration, led by Yoichiro Suzuki at the Kavli IPMU, has reported its latest results on the search for warm dark matter. Their results rule out the possibility that super-weakly interacting massive bosonic particles (bosonic super-WIMPs) constitute all dark matter in the universe. This result was published in the September 19th issue of the Physical Review Letters as an Editors’ Suggestion.

    XMASS detector: Construction of the XMASS-I detector (Feb. 25, 2010). (C) Kamioka Observatory, ICRR (Institute for Cosmic Ray Research), The University of Tokyo

    The universe is considered to be filled with dark matter, which cannot be observed by ordinary light. Although much evidence supports the existence of dark matter, it has yet to be directly detected and its nature is not understood.

    Various theoretical models have been proposed to explain the nature of dark matter. Some models, such as supersymmetry, extend the standard model of particle physics and suggest that weakly interacting massive particles (WIMPs) are dark matter candidates. These models have motivated most experimental research on dark matter. In discussions of the large-scale structure formation of the universe, these WIMPs fit the cold dark matter (CDM) paradigm.

    Standard Model of Supersymmetry

    On the other hand, some simulations based on the CDM scenario predict a much richer structure of the universe on galactic scales than is observed. Furthermore, high-energy collider experiments have yet to provide evidence of supersymmetric particles. These facts have increased interest in lighter and even more weakly interacting particles, such as bosonic super-WIMPs, as dark matter. Super-WIMPs with masses greater than about 3 keV (roughly 1/170 of the electron’s mass) do not conflict with the structure formation of the universe.

    “Bosonic super-WIMPs are experimentally attractive since if they are absorbed in ordinary material, they would deposit energy essentially equivalent to the super-WIMP’s rest mass,” Suzuki says. “And only ultra-low background detectors like XMASS can detect the signal.”
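
    A rough estimate (not from the press release) of why the deposit is essentially the rest mass: dark matter in the galactic halo moves at roughly $v \approx 220\ \mathrm{km/s} \approx 7\times10^{-4}c$, so its kinetic energy is only

    $$\tfrac{1}{2}mv^{2} \;\approx\; \tfrac{1}{2}\,(7\times10^{-4})^{2}\,mc^{2} \;\approx\; 2.5\times10^{-7}\,mc^{2},$$

    i.e. about 0.01 eV for a 60 keV super-WIMP, utterly negligible next to the rest-mass energy the absorption deposits.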

    The XMASS experiment was conducted to directly search for such bosonic super-WIMPs, especially in the mass range between 40 and 120 keV (roughly between a thirteenth and a quarter of the electron’s mass). XMASS is a cryogenic detector using about 1 ton of liquid xenon as the target material. In 165.9 days of data, no significant excess above the background was observed in the 41 kg fiducial mass. The absence of such a signal excludes the possibility that bosonic super-WIMPs constitute all dark matter in the universe.

    “Light super-WIMPs are a good candidate for dark matter on galactic scales,” says Professor Naoki Yoshida, a cosmologist at the School of Science, the University of Tokyo, and a Project Professor at the Kavli IPMU. “The XMASS team derived an important constraint on the possibility of such light dark matter models for a broad range of particle masses.”

    See the full article here.

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.


     
  • richardmitnick 8:30 pm on September 9, 2014 Permalink | Reply
    Tags: , , Kavli Institute,   

    From Kavli: “Tiny Graphene Drum Could Form Future Quantum Memory” 


    The Kavli Foundation

    09/09/2014
    No Writer Credit

    Scientists from TU Delft’s Kavli Institute of Nanoscience have demonstrated that they can detect extremely small changes in position and forces on very small drums of graphene. Graphene drums have great potential to be used as sensors in devices such as mobile phones. Using their unique mechanical properties, these drums could also act as memory chips in a quantum computer. The researchers present their findings in an article in the August 24th edition of Nature Nanotechnology. The research was funded by the FOM Foundation, the EU Marie-Curie program, and NWO.

    Graphene drums


    Graphene is famous for its special electrical properties, but research on this one-atom-thick layer of graphite was recently expanded to explore graphene as a mechanical object. Thanks to their extremely low mass, tiny sheets of graphene can be used in the same way as a musician’s drumhead. In the experiment, the scientists use microwave-frequency light to ‘play’ the graphene drums, to listen to their ‘nano sound’, and to explore the way the graphene in these drums moves.

    Optomechanics

    Dr. Vibhor Singh and his colleagues did this by using a 2D crystal membrane as a mirror in an ‘optomechanical cavity’. “In optomechanics you use the interference pattern of light to detect tiny changes in the position of an object. In this experiment, we shot microwave photons at a tiny graphene drum. The drum acts as a mirror: by looking at the interference of the microwave photons bouncing off of the drum, we are able to sense minute changes in the position of the graphene sheet of only 17 femtometers, nearly 1/10000th of the diameter of an atom,” Singh explains.

    Amplifier

    The microwave ‘light’ in the experiment is not only good for detecting the position of the drum, but can also push on the drum with a force. This force from light is extremely small, but the small mass of the graphene sheet and the tiny displacements the researchers can detect mean that they can use these forces to ‘beat the drum’: the scientists can shake the graphene drum with the momentum of light. Using this radiation pressure, they made an amplifier in which microwave signals, such as those in your mobile phone, are amplified by the mechanical motion of the drum.
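
    To get a feel for just how small that force is (illustrative numbers, not from the paper): a reflected microwave power of, say, one picowatt exerts

    $$F \;=\; \frac{2P}{c} \;\approx\; \frac{2\times10^{-12}\ \mathrm{W}}{3\times10^{8}\ \mathrm{m/s}} \;\approx\; 7\times10^{-21}\ \mathrm{N},$$

    yet because the graphene drum has so little mass, driving it on resonance still builds up motion large enough to detect.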

    Memory

    The scientists also show you can use these drums as ‘memory chips’ for microwave photons, converting photons into mechanical vibrations and storing them for up to 10 milliseconds. Although that is not long by human standards, it is a long time for a computer chip. “One of the long-term goals of the project is to explore 2D crystal drums to study quantum motion. If you hit a classical drum with a stick, the drumhead will start oscillating, shaking up and down. With a quantum drum, however, you can not only make the drumhead move up and then down, but also put it into a ‘quantum superposition’, in which the drumhead is both moving up and moving down at the same time,” says research group leader Dr. Gary Steele. “This ‘strange’ quantum motion is not only of scientific relevance, but could also have very practical applications in a quantum computer as a quantum ‘memory chip’.”

    In a quantum computer, the fact that quantum ‘bits’ can be in both the states 0 and 1 at the same time allows it to potentially perform computations much faster than a classical computer like those used today. Quantum graphene drums that are ‘shaking up and down at the same time’ could be used to store quantum information in the same way as RAM chips in your computer, allowing you to store your quantum computation result and retrieve it at a later time by listening to its quantum sound.

    See the full article, with video, here.

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.


     
  • richardmitnick 10:35 pm on August 19, 2014 Permalink | Reply
    Tags: , , , , Kavli Institute,   

    From Kavli: “New Survey Begins Mapping Nearby Galaxies “ 


    The Kavli Foundation

    August 18, 2014
    (Originally published by Kavli IPMU)

    A new survey called MaNGA (Mapping Nearby Galaxies at Apache Point Observatory) has been launched that will greatly expand our understanding of galaxies, including the Milky Way, by charting the internal structure and composition of an unprecedented sample of 10,000 galaxies.

    Apache Point Observatory

    MaNGA is a part of the fourth generation Sloan Digital Sky Survey (SDSS-IV) and will make maps of stars and gas in galaxies to determine how they have grown and changed over billions of years, using a novel optical fiber bundle technology that can take spectra of all parts of a galaxy at the same time.

    Sloan Digital Sky Survey Telescope

    The new survey represents a collaboration of more than 200 astronomers at more than 40 institutions on four continents. With the new technology, astronomers will gain a perspective on the building blocks of the universe with a statistical precision that has never been achieved before.

    “Because the life story of a galaxy is encoded in its internal structure—a bit like the way the life story of a tree is encoded in its rings—MaNGA would, for the first time, enable us to map the evolutionary histories of galaxies of all types and sizes, living in all kinds of environments,” said Kevin Bundy, MaNGA’s Principal Investigator from the Kavli Institute for the Physics and Mathematics of the Universe, the University of Tokyo.

    Previously, SDSS has mapped the universe across billions of light-years, focusing on the time from 7 billion years after the Big Bang to the present and the time from 2 billion years to 3 billion years after the Big Bang. SDSS-IV will focus on mapping the distribution of galaxies and quasars 3 billion years to 7 billion years after the Big Bang, a critical time when dark energy is thought to have started to affect the expansion of the Universe. Image credit: SDSS collaboration and Dana Berry / SkyWorks Digital, Inc. WMAP cosmic microwave background (Credit: NASA/WMAP Science Team)

    This new survey will provide a vast public database of observations that will significantly expand astronomers’ understanding of how tiny differences in the density of the early universe evolved over billions of years into the rich structure of galaxies today. This cosmic story includes the journey of our own Milky Way galaxy from its origins to the birth of our sun and solar system, and eventually the necessary conditions that gave rise to life on Earth.

    “MaNGA will not only teach us about what shapes the appearance of normal galaxies,” said SDSS Project Scientist Matthew Bershady from the University of Wisconsin, Madison. “It will also almost surely surprise us with new discoveries about the origin of dark matter, super-massive black holes, and perhaps even the nature of gravity itself.” This potential comes from MaNGA’s ability to paint a complete picture of each galaxy using an unprecedented amount of spectral information on the chemical composition and motions of stars and gas.

    To realize this potential, the MaNGA team has developed new technologies for bundling sets of fiber-optic cables into tightly-packed arrays that dramatically enhance the capabilities of existing instrumentation on the 2.5-meter Sloan Foundation Telescope in New Mexico. Unlike nearly all previous surveys, which combine all portions of a galaxy into a single spectrum, MaNGA will obtain as many as 127 different measurements across the full extent of every galaxy. Its new instrumentation enables a survey of more than 10,000 nearby galaxies at twenty times the rate of previous efforts, which did one galaxy at a time.

    But local galaxy studies are far from the only astronomical topic the new SDSS will explore. Another core program called APOGEE-2 will chart the compositions and motions of stars across the entire Milky Way in unprecedented detail, using a telescope in Chile along with the existing Sloan Foundation Telescope.

    The new SDSS will measure spectra at multiple points in the same galaxy, using a newly created fiber bundle technology. The left-hand side shows the Sloan Foundation Telescope and a close-up of the tip of the fiber bundle. The bottom right illustrates how each fiber will observe a different section of each galaxy. The image (from the Hubble Space Telescope) shows one of the first galaxies that the new SDSS has measured. The top right shows data gathered by two fibers observing two different parts of the galaxy, showing how the spectrum of the central regions differs dramatically from the outer regions. Image credit: David Law, SDSS collaboration, and Dana Berry / SkyWorks Digital, Inc. Hubble Space Telescope image credit: NASA, ESA, the Hubble Heritage (STScI/AURA)-ESA/Hubble Collaboration, and A. Evans (University of Virginia, Charlottesville/NRAO/Stony Brook University) (http://hubblesite.org/newscenter/archive/releases/2008/16/image/cg/)

    And the new SDSS will continue to improve our understanding of the Universe as a whole. The third core program, eBOSS, will precisely measure the expansion history of the Universe through 80% of cosmic history, back to when the Universe was less than three billion years old. These new detailed measurements will help to improve constraints on the nature of dark energy, the most mysterious experimental result in modern physics.

    “SDSS has a proud history of fostering a breadth of cosmic discoveries that connect a deep understanding of the origins of the universe with key insights on the nature of galaxies and the makeup of our own Milky Way,” said Hitoshi Murayama, Director of the Kavli IPMU. “We are delighted to be a part of this endeavor to understand the Universe in the broadest sense, and particularly happy to see our Kevin Bundy playing such a crucial role to make it all happen.”

    With new technology and surveys like MaNGA and the continuing generous support of the Alfred P. Sloan Foundation and participating institutions, the SDSS will remain one of the world’s most productive astronomical facilities. Science results from the SDSS will continue to reshape our view of the fundamental constituents of the cosmos, the universe of galaxies, and our home in the Milky Way.

    ABOUT THE SLOAN DIGITAL SKY SURVEY

    Funding for the Sloan Digital Sky Survey IV has been provided by the Alfred P. Sloan Foundation and the Participating Institutions. SDSS-IV acknowledges support and resources from the Center for High-Performance Computing at the University of Utah.

    SDSS-IV is managed by the Astrophysical Research Consortium for the Participating Institutions of the SDSS Collaboration including the Carnegie Institution for Science, Carnegie Mellon University, the Chilean Participation Group, Harvard-Smithsonian Center for Astrophysics, Instituto de Astrofisica de Canarias, The Johns Hopkins University, Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) / University of Tokyo, Lawrence Berkeley National Laboratory, Leibniz Institut fur Astrophysik Potsdam (AIP),Max-Planck-Institut fur Astrophysik (MPA Garching), Max-Planck-Institut fur Extraterrestrische Physik (MPE), Max-Planck-Institut fur Astronomie (MPIA Heidelberg), National Astronomical Observatory of China, New Mexico State University, New York University, The Ohio State University, Pennsylvania State University, Shanghai Astronomical Observatory, United Kingdom Participation Group, Universidad Nacional Autonoma de Mexico, University of Arizona, University of Colorado Boulder, University of Portsmouth, University of Utah, University of Washington, University of Wisconsin, Vanderbilt University, and Yale University.

    SDSS Website – http://www.sdss.org/

    See the full article, with video and additional material here.

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 5:49 am on June 3, 2014 Permalink | Reply
    Tags: Kavli Institute

    From The Kavli Institute at Stanford: “Solving big questions requires big computation” 

    KavliFoundation

    The Kavli Foundation

    Understanding the origins of our solar system, the future of our planet, or humanity itself requires complex calculations run on high-powered computers.

    A common thread among research efforts across Stanford’s many disciplines is the growing use of sophisticated algorithms, run by brute computing power, to solve big questions.

    In Earth sciences, computer models of climate change or carbon sequestration help drive policy decisions, and in medicine computation is helping unravel the complex relationship between our DNA and disease risk. Even in the social sciences, computation is being used to identify relationships between social networks and behaviors, work that could influence educational programs.


    “There’s really very little research that isn’t dependent on computing,” says Ann Arvin, vice provost and dean of research. Arvin helped support the recently opened Stanford Research Computing Center (SRCC) located at SLAC National Accelerator Laboratory, which expands the available research computing space at Stanford. The building’s green technology also reduces the energy used to cool the servers, lowering the environmental costs of carrying out research.

    “Everyone we’re hiring is computational, and not at a trivial level,” says Stanford Provost John Etchemendy, who provided an initial set of servers at the facility. “It is time that we have this facility to support those faculty.”

    Here are just a few examples of how Stanford faculty are putting computers to work to crack the mysteries of our origins, our planet and ourselves.

    Myths once explained our origins. Now we have algorithms.

    Our Origins

    Q: How did the universe form?

    For thousands of years, humans have looked to the night sky and created myths to explain the origins of the planets and stars. The real answer could soon come from the elegant computer simulations conducted by Tom Abel, an associate professor of physics at Stanford.

    Cosmologists face an ironic conundrum. By studying the current universe, we have gained a tremendous understanding of what occurred in the fractions of a second after the Big Bang, and how the first 400,000 years created the ingredients – gases, energy, etc. – that would eventually become the stars, planets and everything else. But we still don’t know what happened after those early years to create what we see in the night sky.

    “It’s the perfect problem for a physicist, because we know the initial conditions very well,” says Abel, who is also director of the Kavli Institute for Particle Astrophysics and Cosmology at SLAC. “If you know the laws of physics correctly, you should be able to exactly calculate what will happen next.”

    Easier said than done. Abel’s calculations must incorporate the laws of chemistry, atomic physics, gravity, how atoms and molecules radiate, gas and fluid dynamics and interactions, the forces associated with dark matter and so on. Those processes must then be simulated out over the course of hundreds of millions, and eventually billions, of years. Further complicating matters, a single galaxy holds one billion moving stars, and the simulation needs to consider their interactions in order to create an accurate prediction of how the universe came to be.

    “Any of the advances we make will come from writing smarter algorithms,” Abel says. “The key point of the new facility is it will allow for rapid turnaround, which will allow us to constantly develop and refine and validate new algorithms. And this will help us understand how the very first things were formed in the universe.” —Bjorn Carey //
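
    To give a rough sense of the kind of calculation involved, here is a minimal gravitational N-body integrator in Python. It is an illustrative sketch only (Abel's actual codes also couple gas dynamics, chemistry and radiation to gravity); the brute-force force sum scales as the square of the particle count, which is exactly why the smarter algorithms he mentions, such as tree methods and adaptive meshes, matter so much at the billion-particle scale.

        import numpy as np

        G = 1.0           # gravitational constant in code units
        SOFTENING = 1e-2  # avoids infinite forces when particles pass very close

        def accelerations(pos, mass):
            """Brute-force O(N^2) gravitational acceleration on each particle."""
            diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]   # (N, N, 3) separations
            dist2 = (diff ** 2).sum(axis=-1) + SOFTENING ** 2
            inv_d3 = dist2 ** -1.5
            np.fill_diagonal(inv_d3, 0.0)                          # no self-force
            return G * (diff * inv_d3[..., np.newaxis]
                        * mass[np.newaxis, :, np.newaxis]).sum(axis=1)

        def leapfrog(pos, vel, mass, dt, n_steps):
            """Kick-drift-kick leapfrog: the standard time integrator in N-body codes."""
            acc = accelerations(pos, mass)
            for _ in range(n_steps):
                vel += 0.5 * dt * acc
                pos += dt * vel
                acc = accelerations(pos, mass)
                vel += 0.5 * dt * acc
            return pos, vel

        # Toy run: 1,000 random particles. Real cosmological simulations follow billions,
        # plus gas, chemistry and radiation, which is why supercomputers are needed.
        rng = np.random.default_rng(0)
        pos = rng.normal(size=(1000, 3))
        vel = np.zeros((1000, 3))
        mass = np.ones(1000) / 1000
        pos, vel = leapfrog(pos, vel, mass, dt=1e-3, n_steps=10)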

    Q: How did we evolve?

    The human genome is essentially a gigantic data set. Deep within each person’s six billion data points are minute variations that tell the story of human evolution, and provide clues to how scientists can combat modern-day diseases.

    To better understand the causes and consequences of these genetic variations, Jonathan Pritchard, a professor of genetics and of biology, writes computer programs that can investigate those links. “Genetic variation affects how cells work, both in healthy variation and in response to disease,” Pritchard says. How that variation displays itself – in appearance or how cells work – and whether natural selection favors those changes within a population drives evolution.

    Consider, for example, variation in the gene that codes for lactase, an enzyme that allows mammals to digest milk. Most mammals turn off the lactase gene after they’ve been weaned from their mother’s milk. In populations that have historically revolved around dairy farming, however, Pritchard’s algorithms have helped to elucidate signals of strong selection since the advent of agriculture for variants that keep the lactase gene active throughout life, enabling people to digest milk as adults. There has been similarly strong selection on skin-pigmentation variants in non-Africans that allow better synthesis of vitamin D in regions where people are exposed to less sunlight.

    The algorithms and machine learning methods Pritchard uses have the potential to yield powerful medical insights. Studying variations in how genes are regulated within a population could reveal how and where particular proteins bind to DNA, or which genes are turned on in different cell types – information that could help design novel therapies. These inquiries can generate hundreds of thousands of data sets, and parsing them can require tens of thousands of hours of computer time.
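
    As a hedged illustration of the flavor of such analyses (synthetic data and a textbook statistic, not Pritchard's actual methods), the sketch below scans simulated genotypes from two populations for the variants whose allele frequencies differ most sharply, which is the basic signal behind many selection scans.

        import numpy as np

        def allele_frequencies(genotypes):
            """genotypes: (individuals, snps) array of 0/1/2 alternate-allele counts."""
            return genotypes.mean(axis=0) / 2.0

        def fst(p1, p2):
            """A basic per-SNP F_ST estimate between two populations (Nei-style)."""
            p_bar = (p1 + p2) / 2.0
            h_total = 2 * p_bar * (1 - p_bar)
            h_within = p1 * (1 - p1) + p2 * (1 - p2)
            with np.errstate(divide="ignore", invalid="ignore"):
                return np.where(h_total > 0, (h_total - h_within) / h_total, 0.0)

        # Toy data: two populations of 100 people, 50,000 SNPs. Real scans involve
        # millions of variants and far more careful models of drift and selection.
        rng = np.random.default_rng(1)
        pop1 = rng.binomial(2, 0.3, size=(100, 50_000))
        pop2 = rng.binomial(2, 0.3, size=(100, 50_000))
        scores = fst(allele_frequencies(pop1), allele_frequencies(pop2))
        candidates = np.argsort(scores)[-10:]   # SNPs with the strongest differentiation
        print(candidates, scores[candidates])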

    Pritchard is bracing for an even bigger explosion of data; as genome sequencing technologies become less expensive, he expects the number of individually sequenced genomes to jump by as much as a hundredfold in the next few years. “Storing and analyzing vast amounts of data is a fundamental challenge that all genomics groups are dealing with,” says Pritchard, who is a member of Stanford Bio-X.

    “Having access to SRCC will make our inquiries go easier and more quickly, and we can move on faster to making the next discovery.” —Bjorn Carey //
    7 billion people live on Earth. Computers might help us survive ourselves.

    Our Planet
    Q: How can we predict future climates?

    There is no lab large enough to conduct experiments on the global-scale interactions between air, water and land that control Earth’s climate, so Stanford’s Noah Diffenbaugh and his students use supercomputers.

    Computer simulations reveal that if human emissions of greenhouse gases continue at their current pace, global warming over the next century is likely to occur faster than any global-scale shift recorded in the past 65 million years. This will increase the likelihood and severity of droughts, heat waves, heavy downpours and other extreme weather events.

    Climate scientists must incorporate into their predictions a growing number of data streams – including direct measurements as well as remote-sensing observations from satellites, aircraft-based sensors, and ground-based arrays.

    “That takes a lot of computing power, especially as we try to figure out how to use newer unstructured forms of data, such as from mobile sensors,” says Diffenbaugh, an associate professor of environmental Earth system science and a senior fellow at the Stanford Woods Institute for the Environment.

    Diffenbaugh’s team plans to use the increased computing resources available at SRCC to simulate air circulation patterns at the kilometer-scale over multiple decades. This has rarely been attempted before, and could help scientists answer questions such as how the recurring El Niño ocean circulation pattern interacts with elevated atmospheric carbon dioxide levels to affect the occurrence of tornadoes in the United States.
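
    Some back-of-envelope arithmetic shows why this is hard. The numbers below are illustrative assumptions, not the configuration of Diffenbaugh's models, but they convey the scale of a kilometer-resolution, multi-decade run.

        # Back-of-envelope arithmetic (illustrative assumptions only) for why
        # kilometer-scale, multi-decade climate runs need a computing cluster.
        domain_km_x, domain_km_y = 5000, 3000      # assumed continental-scale domain
        resolution_km = 1                          # "kilometer-scale" grid spacing
        vertical_levels = 50                       # assumed number of model levels
        columns = (domain_km_x // resolution_km) * (domain_km_y // resolution_km)
        cells = columns * vertical_levels

        seconds_per_year = 365 * 24 * 3600
        timestep_s = 10                            # short steps are needed on km grids
        steps = 30 * seconds_per_year // timestep_s   # a 30-year simulation

        cell_updates = cells * steps
        print(f"{cells:.3e} grid cells, {steps:.3e} time steps, "
              f"{cell_updates:.3e} cell updates")   # ~7e16 updates, before any physics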

    “We plan to use the new computing cluster to run very large high-resolution simulations of climate over regions like the U.S. and India,” Diffenbaugh says. One of the most important benefits of SRCC, however, is not one that can be measured in computing power or cycles.

    “Perhaps most importantly, the new center is bringing together scholars from across campus who are using similar methodologies to figure out new solutions to existing problems, and hopefully to tackle new problems that we haven’t imagined yet.” —Ker Than //

    Q: How can we predict if climate solutions work?

    The capture and trapping of carbon dioxide gas deep underground is one of the most viable options for mitigating the effects of global warming, but only if we can understand how that stored gas interacts with the surrounding structures.

    Hamdi Tchelepi, a professor of energy resources engineering, uses supercomputers to study interactions between injected CO2 gas and the complex rock-fluid system in the subsurface.

    “Carbon sequestration is not a simple reversal of the technology that allows us to extract oil and gas. The physics involved is more complicated, ranging from the micro-scale of sand grains to extremely large geological formations that may extend hundreds of kilometers, and the timescales are on the order of centuries, not decades,” says Tchelepi, who is also the co-director of the Stanford Center for Computational Earth and Environmental Sciences (CEES).

    For example, modeling how a large plume of CO2 injected into the ground migrates and settles within the subsurface, and whether it might escape from the injection site to affect the air quality of a faraway city, can require the solving of tens of millions of equations simultaneously. SRCC will help augment the high computing power already available to Stanford Earth scientists and students through CEES, and will serve as a testing ground for custom algorithms developed by CEES researchers to simulate complex physical processes.
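
    At the core of such simulators is the repeated solution of very large sparse systems of equations. The sketch below (a toy diffusion-like problem with a single point source, not CEES code) shows the kind of kernel involved, here with only 40,000 unknowns rather than tens of millions.

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import cg

        # Assemble a 2-D diffusion-like operator on an n-by-n grid (5-point stencil).
        # Subsurface simulators assemble systems like this, coupled nonlinearly,
        # at every time step of a run spanning decades to centuries.
        n = 200                                   # 200 x 200 = 40,000 unknowns (toy size)
        main = 4.0 * np.ones(n * n)
        off1 = -1.0 * np.ones(n * n - 1)
        off1[np.arange(1, n * n) % n == 0] = 0.0  # no coupling across grid-row boundaries
        offn = -1.0 * np.ones(n * n - n)
        A = sp.diags([main, off1, off1, offn, offn], [0, -1, 1, -n, n], format="csr")

        b = np.zeros(n * n)
        b[(n // 2) * n + n // 2] = 1.0            # a point source (e.g., an injection well)

        x, info = cg(A, b)                        # conjugate gradient iterative solve
        print("converged" if info == 0 else f"cg returned {info}", x.max())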

    Tchelepi, who is also affiliated with the Precourt Institute for Energy, says people are often surprised to learn the role that supercomputing plays in modern Earth sciences, but Earth scientists use more computer resources than almost anybody except the defense industry, and their computing needs can influence the designs of next-generation hardware.

    “Earth science is about understanding the complex and ever-changing dynamics of flowing air, water, oil, gas, CO2 and heat. That’s a lot of physics, requiring extensive computing resources to model.” —Ker Than //
    Q: How can we build more efficient energy networks?

    When folks crank their air conditioners during a heat wave, you can almost hear the electric grid moan. The sudden, larger-than-average demand for electricity can stress electric plants, and energy providers scramble to redistribute the load or ask industrial users to temporarily shut down. To handle those sudden spikes in use more efficiently, Ram Rajagopal, an assistant professor of civil and environmental engineering, used supercomputers to analyze the energy usage patterns of 200,000 anonymous households and businesses in Northern California, and from that developed a model that could tune consumer demand and lead to a more flexible “smart grid.”

    Today, utility companies base forecasts on a 24-hour cycle that aggregates millions of households. Not surprisingly, power use peaks in the morning and evening, when people are at home. But when Rajagopal looked at 1.6 billion hourly data points, he found dramatic variations.

    Some households conformed to the norm and others didn’t. This forms the statistical underpinning for a new way to price and purchase power – by aggregating as few as a thousand customers into a unit with a predictable usage pattern. “If we want to thwart global warming we need to give this technology to communities,” says Rajagopal. Some consumers might want to pay whatever it costs to stay cool on hot days, others might conserve or defer demand to get price breaks. “I’m talking about neighborhood power that could be aligned to your beliefs,” says Rajagopal.
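
    A minimal sketch of the aggregation idea, on synthetic data rather than the 200,000 real meters: cluster customers by the shape of their daily load profile, then treat each cluster as a unit with a predictable usage pattern. This is illustrative only and is not Rajagopal's model.

        import numpy as np
        from sklearn.cluster import KMeans

        # Synthetic hourly load profiles for two kinds of customers.
        rng = np.random.default_rng(2)
        hours = np.arange(24)
        evening_peak = np.exp(-((hours - 19) ** 2) / 8.0)           # residential-looking
        daytime_flat = 0.6 + 0.1 * np.sin(hours / 24 * 2 * np.pi)   # business-looking
        profiles = np.vstack([
            evening_peak + 0.1 * rng.normal(size=(500, 24)),
            daytime_flat + 0.1 * rng.normal(size=(500, 24)),
        ])
        profiles /= profiles.sum(axis=1, keepdims=True)   # compare shapes, not totals

        # Group customers into units whose aggregate demand is predictable.
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
        for c in range(2):
            group = profiles[km.labels_ == c]
            print(f"cluster {c}: {len(group)} customers, "
                  f"peak hour {group.mean(axis=0).argmax()}")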

    Establishing a responsive smart grid and creative energy economies will become even more important as solar and wind energy – which face hourly supply limitations due to Mother Nature – become a larger slice of the energy pie. —Tom Abate //

    Know thyself. Let computation help.

    Ourselves

    Q: How does our DNA make us who we are?

    Our DNA is sometimes referred to as our body’s blueprint, but it’s really more of a sketch. Sure, it determines a lot of things, but so do the viruses and bacteria swarming our bodies, our encounters with environmental chemicals that lodge in our tissues and the chemical stew that ensues when our immune system responds to disease states.

    All of this taken together – our DNA, the chemicals, the antibodies coursing through our veins and so much more – determines our physical state at any point in time. And all that information makes for a lot of data if, like genetics professor Michael Snyder, you collected it 75 times over the course of four years.

    Snyder is a proponent of what he calls “personal omics profiling,” or the study of all that makes up our person, and he’s starting with himself. “What we’re collecting is a detailed molecular portrait of a person throughout time,” he says.

    So far, he’s turning out to be a pretty interesting test case. In one round of assessment he learned that he was becoming diabetic and was able to control the condition long before it would have been detected through a periodic medical exam.

    If personal omics profiling is going to go mainstream, serious computing will be required to tease out which of the myriad tests Snyder’s team currently runs give meaningful information and should be part of routine screening. Snyder’s sampling alone has already generated half a petabyte of data – roughly enough raw information to fill a dishwasher-size rack of servers.

    Right now, that data and the computer power required to understand it reside on campus, but new servers will be located at SRCC. “I think you are going to see a lot more projects like this,” says Snyder, who is also a Stanford Bio-X affiliate and a member of the Stanford Cancer Center.

    “Computing is becoming increasingly important in medicine.” —Amy Adams //

    Q: How do we learn to read?

    A love letter, with all of its associated emotions, conveys its message with the same set of squiggly letters as a newspaper, novel or an instruction manual. How our brains learn to interpret a series of lines and curves into language that carries meaning or imparts knowledge is something psychology Professor Brian Wandell has been trying to understand.

    Wandell hopes to tease out differences between the brain scans of kids learning to read normally and those who are struggling, and use that information to find the right support for kids who need help. “As we acquire information about the outcome of different reading interventions we can go back to our database to understand whether there is some particular profile in the child that works better with intervention 1, and a second profile that works better with intervention 2,” says Wandell, a Stanford Bio-X member who is also the Isaac and Madeline Stein Family Professor and professor, by courtesy, of electrical engineering.

    His team developed a way of scanning kids’ brains with magnetic resonance imaging, then knitting the million collected samples together with complex algorithms that reveal how the nerve fibers connect different parts of the brain. “If you try to do this on your laptop, it will take half a day or more for each child,” he says. Instead, he uses powerful computers to reveal specific brain changes as kids learn to read.
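
    The core idea of fiber tracking can be sketched simply. The toy tracker below is for illustration only and is not Wandell's pipeline, which works on millions of diffusion measurements with far more careful modeling: starting from a seed point, it repeatedly steps along the locally dominant diffusion direction until the track leaves the volume or the signal becomes incoherent.

        import numpy as np

        def track_streamline(directions, seed, step=0.5, max_steps=2000):
            """Follow the locally dominant fiber direction from a seed point.

            directions: (X, Y, Z, 3) array of unit vectors (the principal diffusion
            direction in each voxel); seed: starting coordinate in voxel units.
            A toy deterministic tracker, not a production tractography method.
            """
            shape = directions.shape[:3]
            point = np.asarray(seed, dtype=float)
            prev_dir = None
            path = [point.copy()]
            for _ in range(max_steps):
                voxel = tuple(np.clip(np.round(point).astype(int), 0, np.array(shape) - 1))
                d = directions[voxel]
                if np.linalg.norm(d) < 1e-6:
                    break                                  # no coherent direction here
                d = d / np.linalg.norm(d)
                if prev_dir is not None and np.dot(d, prev_dir) < 0:
                    d = -d                                 # keep stepping the same way
                point = point + step * d
                if np.any(point < 0) or np.any(point >= np.array(shape)):
                    break                                  # left the volume
                path.append(point.copy())
                prev_dir = d
            return np.array(path)

        # Synthetic field: fibers everywhere aligned with the x axis.
        field = np.zeros((50, 50, 50, 3))
        field[..., 0] = 1.0
        print(track_streamline(field, seed=(5, 25, 25)).shape)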

    Wandell is associate director of the Stanford Neurosciences Institute, where he is leading the effort to develop a computing strategy – one that involves making use of SRCC rather than including computing space in their planned new building. He says one advantage of having faculty share computing space and systems is to speed scientific progress.

    “Our hope for the new facility is that it gives us the chance to set the standards for a better environment for sharing computations and data, spreading knowledge rapidly through the community.”

    Q: How do we work effectively together?

    There comes a time in every person’s life when it becomes easy to settle for the known relationship, for better or for worse, rather than seek out new ties with those who better inspire creativity and ensure success.

    Or so finds Daniel McFarland, professor of education and, by courtesy, of organizational behavior, who has studied how academic collaborations form and persist. McFarland and his own collaborators tracked signs of academic ties such as when Stanford faculty co-authored a paper, cited the same publications or got a grant together. Armed with 15 years of collaboration output on 3,000 faculty members, they developed a computer model of how networks form and strengthen over time.

    “Social networks are large, interdependent forms of data that quickly confront limits of computing power, and especially so when we study network evolution,” says McFarland.
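
    A toy version of the bookkeeping behind such a study, using the networkx library on synthetic data (the real analysis covered roughly 3,000 faculty over 15 years with much richer statistical models): build one co-authorship snapshot per year and measure how often ties recur the following year.

        import networkx as nx
        import random

        random.seed(3)
        faculty = range(300)
        years = range(2000, 2015)

        # One co-authorship network per year (synthetic ties).
        snapshots = {}
        for year in years:
            g = nx.Graph()
            g.add_nodes_from(faculty)
            for _ in range(600):                          # 600 co-authorships per year
                a, b = random.sample(faculty, 2)
                g.add_edge(a, b)
            snapshots[year] = g

        # How often does an existing tie recur the following year?
        # (In the real data, persistence is high; these random ties rarely repeat.)
        persisted, total = 0, 0
        for year in list(years)[:-1]:
            old_edges = {tuple(sorted(e)) for e in snapshots[year].edges()}
            next_edges = {tuple(sorted(e)) for e in snapshots[year + 1].edges()}
            total += len(old_edges)
            persisted += len(old_edges & next_edges)
        print(f"fraction of ties repeated the next year: {persisted / total:.3f}")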

    Their work has shown that once academic relationships have been established, they tend to continue out of habit, regardless of whether they are the most productive fit. He argues that successful academic programs or businesses should work to bring new members into collaborations and also spark new ties to prevent more senior people from falling back on known but less effective relationships. At the same time, he comes down in favor of retreats and team-building exercises to strengthen existing good collaborations.

    McFarland’s work has implications for Stanford’s many interdisciplinary programs. He has found that collaborations across disciplines often fall apart due in part to the distant ties between researchers. “To form and sustain these ties, pairs of colleagues must interact frequently to share knowledge,” he writes. “This is perhaps why interdisciplinary centers may be useful organizational means of corralling faculty and promoting continued distant collaborations.” —Amy Adams //

    Q: What can computers tell us about how our body works?

    As you sip your morning cup of coffee, the caffeine makes its way to your cells, slots into a receptor site on the cells’ surface and triggers a series of reactions that jolt you awake. A similar process takes place when Zantac provides relief for stomach ulcers, or when chemical signals produced in the brain travel cell-to-cell through your nervous system to your heart, telling it to beat.

    In each of these instances, a drug or natural chemical is activating a cell’s G-protein coupled receptor (GPCR), the cellular target of roughly half of all known drugs, says Vijay Pande, a professor of chemistry and, by courtesy, of structural biology and of computer science at Stanford. This exchange is a complex one, though. In order for caffeine or any other molecule to influence a cell, it must fit snugly into the receptor site, which consists of 4,000 atoms and transforms between an active and inactive configuration. Current imaging technologies are unable to view that transformation, so Pande has been simulating it using his Folding@Home distributed computer network.

    So far, Pande’s group has demonstrated a few hundred microseconds of the receptor’s transformation. Although that’s an extraordinarily long chunk of time compared to similar techniques, Pande is looking forward to accessing the SRCC to investigate the basic biophysics of GPCR and other proteins. Greater computing power, he says, will allow his team to simulate larger molecules in greater detail, simulate folding sequences for longer periods of time and visualize multiple molecules as they interact. It might even lead to atom-level simulations of processes at the scale of an entire cell. All of this knowledge could be applied to computationally design novel drugs and therapies.
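
    What "simulating a few hundred microseconds" means computationally is billions of tiny time steps. The sketch below shows the standard molecular-dynamics update (velocity Verlet with a Lennard-Jones potential) on a toy system of 64 atoms; it is illustrative only and is not the Folding@Home engine, which handles thousands of atoms, solvent, and far more realistic force fields.

        import numpy as np

        EPS, SIGMA, DT = 1.0, 1.0, 0.002   # Lennard-Jones parameters and time step (reduced units)

        def lj_forces(pos):
            """Pairwise Lennard-Jones forces on each particle (O(N^2), no cutoff)."""
            diff = pos[:, np.newaxis, :] - pos[np.newaxis, :, :]
            r2 = (diff ** 2).sum(axis=-1)
            np.fill_diagonal(r2, np.inf)                      # ignore self-interaction
            inv_r2 = SIGMA ** 2 / r2
            inv_r6 = inv_r2 ** 3
            f_mag = 24 * EPS * (2 * inv_r6 ** 2 - inv_r6) / r2
            return (f_mag[..., np.newaxis] * diff).sum(axis=1)

        def velocity_verlet(pos, vel, n_steps):
            """Standard MD time stepping; long simulations take billions of such steps."""
            forces = lj_forces(pos)
            for _ in range(n_steps):
                vel += 0.5 * DT * forces
                pos += DT * vel
                forces = lj_forces(pos)
                vel += 0.5 * DT * forces
            return pos, vel

        # 64 atoms on a small cubic lattice; a GPCR system has ~4,000 atoms plus solvent.
        grid = np.linspace(0.8, 4.2, 4)
        pos = np.array(np.meshgrid(grid, grid, grid)).reshape(3, -1).T.copy()
        vel = np.zeros((64, 3))
        pos, vel = velocity_verlet(pos, vel, n_steps=100)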

    “Having more computer power can dramatically change every aspect of what we can do in my lab,” says Pande, who is also a Stanford Bio-X affiliate. “Much like having more powerful rockets could radically change NASA, access to greater computing power will let us go way beyond where we can go routinely today.” —Bjorn Carey //

    See the full article here.

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 5:06 pm on December 14, 2013 Permalink | Reply
    Tags: Kavli Institute

    From Kavli: “Swirls in Remnants of Big Bang May Hold Clues to Universe’s Infancy” 

    KavliFoundation

    The Kavli Foundation

    December 13, 2013
    No Writer Credit
    (Originally published by University of Chicago)

    South Pole Telescope scientists have detected for the first time a subtle distortion in the oldest light in the universe, which may help reveal secrets about the earliest moments in the universe’s formation.

    The scientists observed twisting patterns in the polarization of the cosmic microwave background—light that last interacted with matter very early in the history of the universe, less than 400,000 years after the Big Bang. These patterns, known as “B modes,” are caused by gravitational lensing, a phenomenon that occurs when the trajectory of light is bent by massive objects, much like a lens focuses light.

    The 10 metre South Pole Telescope
    Physics World magazine has named research results published earlier this year by the South Pole Telescope collaboration as one of the top 10 physics breakthroughs of 2013. (Photo by Daniel Luong-Van)

    A multi-institutional collaboration of researchers led by John Carlstrom, the S. Chandrasekhar Distinguished Service Professor in Astronomy & Astrophysics at the University of Chicago, made the discovery. They announced their findings in a paper published in the journal Physical Review Letters—using the first data from SPTpol, a polarization-sensitive camera installed on the telescope in January 2012.

    SPTpol: an instrument for CMB polarization measurements with the South Pole Telescope

    “The detection of B-mode polarization by South Pole Telescope is a major milestone, a technical achievement that indicates exciting physics to come,” said Carlstrom, who also is deputy director of the Kavli Institute for Cosmological Physics.

    The cosmic microwave background is a sea of photons (light particles) left over from the Big Bang that pervades all of space, at a temperature of minus 270 degrees Celsius—a mere 3 degrees above absolute zero. Measurements of this ancient light have already given physicists a wealth of knowledge about the properties of the universe. Tiny variations in temperature of the light have been painstakingly mapped across the sky by multiple experiments, and scientists are gleaning even more information from polarized light.

    Light is polarized when its electromagnetic waves are preferentially oriented in a particular direction. Light from the cosmic microwave background is polarized mainly due to the scattering of photons off of electrons in the early universe, through the same process by which light is polarized as it reflects off the surface of a lake or the hood of a car. The polarization patterns that result are of a swirl-free type, known as “E modes,” which have proven easier to detect than the fainter B modes, and were first measured a decade ago by a collaboration of researchers using the Degree Angular Scale Interferometer, another UChicago-led experiment.

    Simple scattering can’t generate B modes, which instead emerge through a more complex process—hence scientists’ interest in measuring them. Gravitational lensing, it has long been predicted, can twist E modes into B modes as photons pass by galaxies and other massive objects on their way toward Earth. This expectation has now been confirmed.

    To tease out the B modes in their data, the scientists used a previously measured map of the distribution of mass in the universe to determine where the gravitational lensing should occur. They combined their measurement of E modes with the mass distribution to provide a template of the expected twisting into B modes. The scientists are currently working with another year of data to further refine their measurement of B modes.
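
    For readers who want to see what "E modes" and "B modes" mean operationally, the sketch below performs the textbook flat-sky decomposition of Stokes Q and U polarization maps into E- and B-mode maps in Fourier space. It is a toy illustration of the terminology, not the SPTpol analysis, which works on curved-sky data with careful treatment of noise and filtering.

        import numpy as np

        def qu_to_eb(q_map, u_map, pixel_size=1.0):
            """Flat-sky decomposition of Stokes Q/U maps into E- and B-mode maps.

            E captures the swirl-free part of the polarization pattern,
            B the swirling part. Toy illustration only.
            """
            ny, nx = q_map.shape
            lx = np.fft.fftfreq(nx, d=pixel_size) * 2 * np.pi
            ly = np.fft.fftfreq(ny, d=pixel_size) * 2 * np.pi
            LX, LY = np.meshgrid(lx, ly)
            phi = np.arctan2(LY, LX)                 # angle of each Fourier mode

            q_l = np.fft.fft2(q_map)
            u_l = np.fft.fft2(u_map)
            e_l = q_l * np.cos(2 * phi) + u_l * np.sin(2 * phi)
            b_l = -q_l * np.sin(2 * phi) + u_l * np.cos(2 * phi)
            return np.fft.ifft2(e_l).real, np.fft.ifft2(b_l).real

        # A pure-E test pattern should land (almost) entirely in the E map.
        ny = nx = 128
        y, x = np.mgrid[0:ny, 0:nx]
        q = np.cos(2 * np.pi * x / 16)               # simple polarized stripe pattern
        u = np.zeros_like(q, dtype=float)
        e_map, b_map = qu_to_eb(q, u)
        print(np.abs(e_map).max(), np.abs(b_map).max())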

    The careful study of such B modes will help physicists better understand the universe. The patterns can be used to map out the distribution of mass, thereby more accurately defining cosmologically important properties like the masses of neutrinos, tiny elementary particles prevalent throughout the cosmos.

    Similar, more elusive B modes would provide dramatic evidence of inflation, the theorized turbulent period in the moments after the Big Bang when the universe expanded extremely rapidly. Inflation is a well-regarded theory among cosmologists because its predictions agree with observations, but thus far there is not a definitive confirmation of the theory. Measuring B modes generated by inflation is a possible way to alleviate lingering doubt.

    “The detection of a primordial B-mode polarization signal in the microwave background would amount to finding the first tremors of the Big Bang,” said the study’s lead author, Duncan Hanson, a postdoctoral scientist at McGill University in Canada.

    Cosmic microwave background

    B modes from inflation are caused by gravitational waves. These ripples in space-time are generated by intense gravitational turmoil, conditions that would have existed during inflation. These waves, stretching and squeezing the fabric of the universe, would give rise to the telltale twisted polarization patterns of B modes. Measuring the resulting polarization would not only confirm the theory of inflation—a huge scientific achievement in itself—but would also give scientists information about physics at very high energies—much higher than can be achieved with particle accelerators.

    The measurement of B modes from gravitational lensing is an important first step in the quest to measure inflationary B modes. In inflationary B mode searches, lensing B modes show up as noise. “The new result shows that this noise can be accounted for and subtracted off so that scientists can search for and hopefully measure the inflationary B modes underneath,” Hanson said. “The lensing signal itself can also be used by itself to learn about the distribution of mass in the universe.”

    See the full article here.

    The Kavli Institute for Cosmological Physics, based at the University of Chicago, seeks answers to some of the most profound questions about matter, energy and the universe.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 8:33 am on October 31, 2013 Permalink | Reply
    Tags: Kavli Institute

    From SLAC: “Cosmos Seeded with Heavy Elements During Violent Youth” 

    October 30, 2013
    Lori Ann White

    New evidence of heavy elements spread evenly between the galaxies of the giant Perseus cluster supports the theory that the universe underwent a turbulent and violent youth more than 10 billion years ago. That explosive period was responsible for seeding the cosmos with the heavy elements central to life itself.


    Researchers from the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC), jointly run by Stanford University and the Department of Energy’s SLAC National Accelerator Laboratory, shed light on this important era by analyzing 84 separate sets of X-ray telescope observations from the Japanese-US Suzaku satellite. Their results appear today in the journal Nature.

    “We saw that iron is spread out between the galaxies remarkably smoothly,” said Norbert Werner, lead author of the paper. “That means it had to be present in the intergalactic gas before the Perseus cluster formed.”

    The even distribution of these elements supports the idea that they were created at least 10 to 12 billion years ago. According to the paper, during this time of intense star formation, billions of exploding stars created vast quantities of heavy elements in the alchemical furnaces of their own destruction. This was also the epoch when black holes in the hearts of galaxies were at their most energetic.

    Young stars, exploding supernovae, and voraciously feeding black holes produced powerful winds 10-12 billion years ago. These winds were the spoon that lifted the iron from the galaxies and mixed it with the intergalactic gas. (Akihiro Ikeshita)

    “The combined energy of these cosmic phenomena must have been strong enough to expel most of the metals from the galaxies at early times, and to enrich and mix the intergalactic gas,” said co-author and KIPAC graduate student Ondrej Urban.

    To settle the question of whether the heavy elements created by supernovae remain mostly in their home galaxies or are spread out through intergalactic space, the researchers looked through the Perseus cluster in eight different directions. They focused on the hot, 10-million-degree gas that fills the spaces between galaxies and found the spectroscopic signature of iron reaching all the way to the cluster’s edges.

    “We estimate there’s about 50 billion solar masses of iron in the cluster,” said former KIPAC member and co-author Aurora Simionescu, who is currently with the Japanese Aerospace Exploration Agency as an International Top Young Fellow. “We think most of the iron came from a single type of supernova, called a Type Ia supernova.”

    In Type Ia supernovae the stars are destroyed and release all their material into the surrounding space. The researchers believe that at least 40 billion Type Ia supernovae must have exploded within a relatively short period on cosmological time scales in order to release that much iron and have the force to drive it out of the galaxies.
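
    A quick back-of-envelope check of those numbers (the per-supernova iron yield below is a typical literature value assumed here, not a figure from the paper) shows how the supernova count follows from the iron mass:

        # Illustrative arithmetic only; the iron yield per event is an assumed typical value.
        iron_in_cluster_msun = 5.0e10        # ~50 billion solar masses of iron (from the article)
        iron_per_sn_ia_msun = 0.7            # assumed Type Ia iron yield, in solar masses

        n_supernovae = iron_in_cluster_msun / iron_per_sn_ia_msun
        print(f"~{n_supernovae:.1e} Type Ia supernovae")   # ~7e10, i.e. "at least 40 billion"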

    The results suggest that the Perseus cluster is probably not unique, and that iron – along with other heavy elements – is evenly spread throughout all massive galaxy clusters, said Steven Allen, a KIPAC professor and head of the research team.

    “You are older than you think – or at least, some of the iron in your blood is older, formed in galaxies millions of light years away and billions of years ago,” Simionescu said.

    See the full article here.

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.


    ScienceSprings is powered by MAINGEAR computers

     