Tagged: Kavli Institute

  • richardmitnick 3:02 pm on December 12, 2014 Permalink | Reply
    Tags: Kavli Institute

    From Kavli: “Is an Understanding of Dark Matter around the Corner? Experimentalists Unsure” 


    The Kavli Foundation

    December 12, 2014

    Media Contact

    James Cohen
    Director of Communications
    The Kavli Foundation
    (805) 278-7495
    cohen@kavlifoundation.org

    Scientists have long known that dark matter is out there, silently orchestrating the universe’s movement and structure. But what exactly is dark matter made of? And what does a dark matter particle look like? That remains a mystery, with experiment after experiment coming up empty handed in the quest to detect these elusive particles.

    With some luck, that may be about to change. With ten times the sensitivity of previous detectors, three recently funded dark matter experiments have scientists crossing their fingers that they may finally glimpse these long-sought particles. In recent conversations with The Kavli Foundation, scientists working on these new experiments expressed hope that they would catch dark matter, but also agreed that, in the end, their success or failure is up to nature to decide.

    “Nature is being coy,” said Enectali Figueroa-Feliciano, an associate professor of physics at the MIT Kavli Institute for Astrophysics and Space Research who works on one of the three new experiments. “There’s something we just don’t understand about the internal structure of how the universe works. When theorists write down all the ways dark matter might interact with our particles, they find, for the simplest models, that we should have seen it already. So even though we haven’t found it yet, there’s a message there, one that we’re trying to decode now.”

    The first of the new experiments, called the Axion Dark Matter eXperiment (ADMX), searches for a theoretical type of dark matter particle called the axion. ADMX seeks evidence of this extremely lightweight particle converting into a photon in the experiment’s high magnetic field. By slowly tuning the resonant frequency of its microwave cavity, the detector hunts for one axion mass at a time.
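
    Back-of-the-envelope, the scan amounts to stepping through photon frequencies one candidate mass at a time, since a converted axion yields a photon of frequency f = mc²/h. A minimal sketch of that arithmetic (the micro-eV masses below are hypothetical scan points, not ADMX’s published range):

```python
# Map a candidate axion mass to the frequency of the photon it would
# convert into inside the detector: f = m*c^2 / h.
# Illustrative only; the masses below are hypothetical scan points.

PLANCK_EV_S = 4.135667696e-15  # Planck constant h, in eV*s

def axion_photon_frequency_hz(mass_ev: float) -> float:
    """Photon frequency (Hz) for an axion of the given rest-mass energy (eV)."""
    return mass_ev / PLANCK_EV_S

for mass_uev in (2.0, 5.0, 10.0):  # hypothetical masses in micro-eV
    f_ghz = axion_photon_frequency_hz(mass_uev * 1e-6) / 1e9
    print(f"{mass_uev:5.1f} ueV -> {f_ghz:5.2f} GHz")
```

    Micro-eV masses land in the GHz microwave band, which is why a haloscope like ADMX is built around a tunable microwave cavity.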

    ADMX Axion Dark Matter Experiment
    ADMX at U Washington

    “We’ve demonstrated that we have the tools necessary to see axions,” said Gray Rybka, research assistant professor of physics at the University of Washington who co-leads the ADMX Gen 2 experiment. “With Gen2, we’re buying a very, very powerful refrigerator that will arrive very shortly. Once it arrives, we’ll be able to scan very, very quickly and we feel we’ll have a much better chance of finding axions – if they’re out there.”

    The two other new experiments look for a different type of theoretical dark matter called the WIMP. Short for Weakly Interacting Massive Particle, the WIMP interacts with our world very weakly and very rarely. The Large Underground Xenon, or LUX, experiment, which began in 2009, is now getting an upgrade to increase its sensitivity to heavier WIMPs. Meanwhile, the Super Cryogenic Dark Matter Search collaboration, which has looked for the signal of a lightweight WIMP barreling through its detector since 2013, is in the process of finalizing the design for a new experiment to be located in Canada.

    LUX Dark matter
    LUX

    LBL SuperCDMS
    Super Cryogenic Dark Matter Search

    “In a way it’s like looking for gold,” said Figueroa-Feliciano, a member of the SuperCDMS experiment. “Harry has his pan and he’s looking for gold in a deep pond, and we’re looking in a slightly shallower pond, and Gray’s a little upstream, looking in his own spot. We don’t know who’s going to find gold because we don’t know where it is.”

    Rybka agreed, but added the more optimistic perspective that it’s also possible that all three experiments will find dark matter. “There’s nothing that would require dark matter to be made of just one type of particle except us hoping that it’s that simple,” he said. “Dark matter could be one-third axions, one-third heavy WIMPs and one-third light WIMPs. That would be perfectly allowable from everything we’ve seen.”

    Yet the nugget of gold for which all three experiments search is a very valuable one. And even though the search is difficult, all three scientists agreed that it’s worthwhile because glimpsing dark matter would reveal insight into a large portion of the universe.

    “We’re all looking and somewhere, maybe even now, there’s a little bit of data that will cause someone to have an ‘Ah ha!’ moment,” said Harry Nelson, professor of physics at the University of California, Santa Barbara and science lead for the LUX upgrade, called LUX-ZEPLIN. “This idea that there’s something out there that we can’t sense yet is one of those things that sends chills down my spine.”

    More about the hunt for dark matter is available at:

    New Dark Matter Experiments Prepare to Hunt the Unknown: A Conversation with Enectali Figueroa-Feliciano, Harry Nelson and Gray Rybka
    Spotlight Live: Dark Matter at Long Last? Three New Experiments Ramp Up (Transcript)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.

     
  • richardmitnick 12:00 pm on October 24, 2014 Permalink | Reply
    Tags: Charles Munger, Kavli Institute

    From NYT: “Charles Munger, Warren Buffett’s Longtime Business Partner, Makes $65 Million Gift” 


    The New York Times

    October 24, 2014
    Michael J. de la Merced

    Charles T. Munger has been known for many things over his decades-long career, including longtime business partner of Warren E. Buffett; successful investor and lawyer; and plain-spoken commentator with a wide following.


    Now Mr. Munger, 90, can add another title to that list: deep-pocketed benefactor to the field of theoretical physics.

    He was expected to announce on Friday that he has donated $65 million to the Kavli Institute for Theoretical Physics at the University of California, Santa Barbara. The gift — the largest in the school’s history — will go toward building a 61-bed residence for visitors to the institute, which brings together physicists for weeks at a time to exchange ideas.

    “U.C.S.B. has by far the most important program for visiting physicists in the world,” Mr. Munger said in a telephone interview. “Leading physicists routinely are coming to the school to talk to one another, create new stuff, cross-fertilize ideas.”

    UC Santa Barbara Campus

    The donation is the latest gift by Mr. Munger, a billionaire who has not been shy in giving away the wealth he has accumulated as vice chairman of Mr. Buffett’s Berkshire Hathaway to charitable causes.

    Though perhaps not as prominent a donor as his business partner, who cocreated the Giving Pledge campaign for the world’s richest people to commit their wealth to philanthropy, Mr. Munger has frequently donated big sums to schools like Stanford and the Harvard-Westlake School. (He has not signed on to the Giving Pledge campaign.)

    The biggest beneficiary of his largess thus far has been the University of Michigan, his alma mater. Last year alone, he gave $110 million worth of Berkshire shares — one of the biggest gifts in the university’s history — to create a new residence intended to help graduate students from different areas of study mingle and share ideas.

    That same idea of intellectual cross-pollination underpins the Kavli Institute, which over 35 years has established itself as a haven for theoretical physicists from around the world to meet and discuss potential new developments in their field.

    Funded primarily by the National Science Foundation, the institute has produced advances in the understanding of white dwarf stars, string theory and quantum computing.

    A former director of the institute, David J. Gross, shared in the 2004 Nobel Prize in Physics for work that shed new light on the fundamental force that binds together the atomic nucleus.

    “Away from day-to-day responsibilities, they are in a different mental state,” Lars Bildsten, the institute’s current director, said of the center’s visitors. “They’re more willing to wander intellectually.”

    To Mr. Munger, such interactions are crucial for the advancement of physics. He cited international conferences attended by the likes of Albert Einstein and Marie Curie.

    Mr. Munger himself did not study physics for very long, having taken a class at the California Institute of Technology while in the Army during World War II. But as an avid reader of scientific biography, he came to appreciate the importance of the field.

    And he praised the rise of the University of California, Santa Barbara, as a leading haven for physics, particularly given its status as a relatively young research institution.

    But while the Kavli Institute conducts various programs throughout the year for visiting scientists, it has long lacked a place where physicists could spend time together outside of work hours during their stays. A permanent residence hall would allow them to mingle even more, in the hope of fostering additional eureka moments.

    “We want to make their hardest choice, ‘Which barbecue to go to?’ ” Mr. Bildsten joked.

    Though Mr. Munger has some ties to the University of California, Santa Barbara — a grandson is an alumnus — he was first introduced to the Kavli Institute through a friend who lives in Santa Barbara.

    During one of the pair’s numerous fishing trips, that friend, Glen Mitchel, asked the Berkshire vice chairman to help finance construction of a new residence. The university had already reserved a plot of land for the dormitory in case the institute raised the requisite funds.

    “It wasn’t a hard sell,” Mr. Munger said.

    “Physics is vitally important,” he added. “Everyone knows that.”

    See the full article here.

    ScienceSprings relies on technology from

    MAINGEAR computers

    Lenovo

    Dell

     
  • richardmitnick 7:28 pm on October 6, 2014 Permalink | Reply
    Tags: Kavli Institute

    From Kavli: “A Warm Dark Matter Search Using XMASS”


    The Kavli Foundation

    10/06/2014
    Yoichiro Suzuki
    Kavli Institute for the Physics and Mathematics of the Universe, The University of Tokyo
    E-mail: yoichiro.suzuki_at_ipmu.jp 

    The XMASS collaboration, led by Yoichiro Suzuki at the Kavli IPMU, has reported its latest results on the search for warm dark matter. The results rule out the possibility that super-weakly interacting massive bosonic particles (bosonic super-WIMPs) constitute all of the dark matter in the universe. The result was published in the September 19 issue of Physical Review Letters as an Editors’ Suggestion.

    XMASS detector: Construction of the XMASS-I detector (Feb. 25, 2010) (C) Kamioka Observatory, ICRR (Institute for Cosmic Ray Research), The University of Tokyo

    The universe is considered to be filled with dark matter, which cannot be observed by ordinary light. Although much evidence supports the existence of dark matter, it has yet to be directly detected and its nature is not understood.

    Various theoretical models have been proposed to explain the nature of dark matter. Some models, such as supersymmetric extensions of the standard model of particle physics, suggest that weakly interacting massive particles (WIMPs) are dark matter candidates. These models have motivated most experimental searches for dark matter. In discussions of the large-scale structure formation of the universe, these WIMPs fit the cold dark matter (CDM) paradigm.

    Supersymmetry standard model
    Standard Model of Supersymmetry

    On the other hand, some simulations based on the CDM scenario predict a much richer structure of the universe on galactic scales than is observed. Furthermore, high-energy collider experiments have yet to provide evidence of supersymmetric particles. These facts have increased interest in lighter and even more weakly interacting particles, such as bosonic super-WIMPs, as dark matter candidates. Super-WIMPs with masses greater than about 3 keV do not conflict with the structure formation of the universe.

    “Bosonic super-WIMPs are experimentally attractive since if they are absorbed in ordinary material, they would deposit energy essentially equivalent to the super-WIMP’s rest mass,” Suzuki says. “And only ultra-low background detectors like XMASS can detect the signal.”

    The XMASS experiment directly searched for such bosonic super-WIMPs, especially in the mass range between 40 and 120 keV. XMASS is a cryogenic detector using about one ton of liquid xenon as the target material. In 165.9 days of data, no significant excess above background was observed in the 41 kg fiducial mass. The absence of such a signal excludes the possibility that bosonic super-WIMPs constitute all of the dark matter in the universe.
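
    Since an absorbed super-WIMP would deposit essentially its rest-mass energy, the expected signature is a mono-energetic peak at the particle’s mass. A small sanity-check sketch (not part of the XMASS analysis) expressing the search window’s endpoints as fractions of the electron’s 511 keV rest-mass energy:

```python
# An absorbed bosonic super-WIMP deposits roughly its rest-mass energy,
# so the search looks for a peak at the particle's mass. Express the
# 40-120 keV window as fractions of the electron rest mass (sanity check).

ELECTRON_MASS_KEV = 511.0  # electron rest-mass energy, keV

def fraction_of_electron(mass_kev: float) -> float:
    """Candidate mass (keV) as a fraction of the electron rest mass."""
    return mass_kev / ELECTRON_MASS_KEV

for m_kev in (40.0, 120.0):  # endpoints of the XMASS search window
    print(f"{m_kev:6.1f} keV ~ 1/{1 / fraction_of_electron(m_kev):.0f} of the electron mass")
```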

    “Light super-WIMPs are a good candidate for dark matter on galactic scales,” says Professor Naoki Yoshida, a cosmologist at the School of Science, the University of Tokyo, and a Project Professor at the Kavli IPMU. “The XMASS team derived an important constraint on the possibility of such light dark matter models for a broad range of particle masses.”

    See the full article here.



     
  • richardmitnick 8:30 pm on September 9, 2014 Permalink | Reply
    Tags: Kavli Institute

    From Kavli: “Tiny Graphene Drum Could Form Future Quantum Memory” 


    The Kavli Foundation

    09/09/2014
    No Writer Credit

    Scientists from TU Delft’s Kavli Institute of Nanoscience have demonstrated that they can detect extremely small changes in position and forces on very small drums of graphene. Graphene drums have great potential to be used as sensors in devices such as mobile phones. Using their unique mechanical properties, these drums could also act as memory chips in a quantum computer. The researchers present their findings in an article in the August 24th edition of Nature Nanotechnology. The research was funded by the FOM Foundation, the EU Marie-Curie program, and NWO.

    Graphene drums

    Graphene Drum

    Graphene is famous for its special electrical properties, but research on this one-atom-thick layer of graphite has recently expanded to explore graphene as a mechanical object. Thanks to their extremely low mass, tiny sheets of graphene can be used the same way a musician uses a drumhead. In the experiment, scientists use microwave-frequency light to ‘play’ the graphene drums, to listen to their ‘nano sound’, and to explore the way the graphene in these drums moves.

    Optomechanics

    Dr. Vibhor Singh and his colleagues did this by using a 2D crystal membrane as a mirror in an ‘optomechanical cavity’. “In optomechanics you use the interference pattern of light to detect tiny changes in the position of an object. In this experiment, we shot microwave photons at a tiny graphene drum. The drum acts as a mirror: by looking at the interference of the microwave photons bouncing off the drum, we are able to sense minute changes in the position of the graphene sheet of only 17 femtometers, nearly 1/10,000th the diameter of an atom,” Singh explains.

    Amplifier

    The microwave ‘light’ in the experiment is not only good for detecting the position of the drum; it can also push on the drum with a force. This force from light is extremely small, but the tiny mass of the graphene sheet and the minute displacements the scientists can detect mean that they can use these forces to ‘beat the drum’: they can shake the graphene drum with the momentum of light. Using this radiation pressure, they made an amplifier in which microwave signals, such as those in your mobile phone, are amplified by the mechanical motion of the drum.

    Memory

    The scientists also showed that these drums can be used as ‘memory chips’ for microwave photons, converting photons into mechanical vibrations and storing them for up to 10 milliseconds. Although that is not long by human standards, it is a long time for a computer chip. “One of the long-term goals of the project is to explore 2D crystal drums to study quantum motion. If you hit a classical drum with a stick, the drumhead will start oscillating, shaking up and down. With a quantum drum, however, you can not only make the drumhead move up and then down, but also make it into a ‘quantum superposition’, in which the drumhead is both moving up and moving down at the same time,” says research group leader Dr. Gary Steele. “This ‘strange’ quantum motion is not only of scientific relevance but could also have very practical applications in a quantum computer, as a quantum ‘memory chip’.”
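
    For scale, a stored vibration that survives ~10 milliseconds on a megahertz-scale drum implies a very high mechanical quality factor, Q ≈ 2πfτ. A rough order-of-magnitude sketch (the 10 MHz drum frequency is an assumed, typical value, not taken from the article):

```python
import math

# Order-of-magnitude estimate of the mechanical quality factor implied
# by a ~10 ms storage time: Q ~ 2*pi*f*tau. The drum frequency below is
# an assumed, typical value for such resonators, not from the article.

def quality_factor(freq_hz: float, decay_s: float) -> float:
    """Q of a resonator at freq_hz whose stored vibration decays over decay_s."""
    return 2 * math.pi * freq_hz * decay_s

q = quality_factor(10e6, 10e-3)  # assumed 10 MHz drum, 10 ms storage
print(f"Q ~ {q:.1e}")
```

    The larger this Q, the longer the drum ‘rings’ and the longer a converted photon can be stored.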

    In a quantum computer, the fact that quantum ‘bits’ can be in states 0 and 1 at the same time allows it to potentially perform computations much faster than a classical computer like those used today. Quantum graphene drums that are ‘shaking up and down at the same time’ could be used to store quantum information in the same way as RAM chips in your computer, allowing you to store a quantum computation’s result and retrieve it at a later time by listening to its quantum sound.

    See the full article, with video, here.



     
  • richardmitnick 10:35 pm on August 19, 2014 Permalink | Reply
    Tags: Kavli Institute

    From Kavli: “New Survey Begins Mapping Nearby Galaxies”


    The Kavli Foundation

    August 18, 2014
    (Originally published by Kavli IPMU)

    A new survey called MaNGA (Mapping Nearby Galaxies at Apache Point Observatory) has been launched that will greatly expand our understanding of galaxies, including the Milky Way, by charting the internal structure and composition of an unprecedented sample of 10,000 galaxies.

    Apache Point Observatory

    MaNGA is a part of the fourth generation Sloan Digital Sky Survey (SDSS-IV) and will make maps of stars and gas in galaxies to determine how they have grown and changed over billions of years, using a novel optical fiber bundle technology that can take spectra of all parts of a galaxy at the same time.

    Sloan Digital Sky Survey Telescope

    The new survey represents a collaboration of more than 200 astronomers at more than 40 institutions on four continents. With the new technology, astronomers will gain a perspective on the building blocks of the universe with a statistical precision that has never been achieved before.

    “Because the life story of a galaxy is encoded in its internal structure—a bit like the way the life story of a tree is encoded in its rings—MaNGA would, for the first time, enable us to map the evolutionary histories of galaxies of all types and sizes, living in all kinds of environments,” said Kevin Bundy, MaNGA’s Principal Investigator from the Kavli Institute for the Physics and Mathematics of the Universe, the University of Tokyo.

    Previously, SDSS has mapped the universe across billions of light-years, focusing on the time from 7 billion years after the Big Bang to the present and the time from 2 billion years to 3 billion years after the Big Bang. SDSS-IV will focus on mapping the distribution of galaxies and quasars 3 billion years to 7 billion years after the Big Bang, a critical time when dark energy is thought to have started to affect the expansion of the Universe. Image credit: SDSS collaboration and Dana Berry / SkyWorks Digital, Inc. WMAP cosmic microwave background (Credit: NASA/WMAP Science Team)

    This new survey will provide a vast public database of observations that will significantly expand astronomers’ understanding of how tiny differences in the density of the early universe evolved over billions of years into the rich structure of galaxies today. This cosmic story includes the journey of our own Milky Way galaxy from its origins to the birth of our sun and solar system, and eventually to the conditions that gave rise to life on Earth.

    “MaNGA will not only teach us about what shapes the appearance of normal galaxies,” said SDSS Project Scientist Matthew Bershady from the University of Wisconsin, Madison. “It will also almost surely surprise us with new discoveries about the origin of dark matter, super-massive black holes, and perhaps even the nature of gravity itself.” This potential comes from MaNGA’s ability to paint a complete picture of each galaxy using an unprecedented amount of spectral information on the chemical composition and motions of stars and gas.

    To realize this potential, the MaNGA team has developed new technologies for bundling sets of fiber-optic cables into tightly-packed arrays that dramatically enhance the capabilities of existing instrumentation on the 2.5-meter Sloan Foundation Telescope in New Mexico. Unlike nearly all previous surveys, which combine all portions of a galaxy into a single spectrum, MaNGA will obtain as many as 127 different measurements across the full extent of every galaxy. Its new instrumentation enables a survey of more than 10,000 nearby galaxies at twenty times the rate of previous efforts, which did one galaxy at a time.

    But local galaxy studies are far from the only astronomical topic the new SDSS will explore. Another core program called APOGEE-2 will chart the compositions and motions of stars across the entire Milky Way in unprecedented detail, using a telescope in Chile along with the existing Sloan Foundation Telescope.

    The new SDSS will measure spectra at multiple points in the same galaxy, using a newly created fiber bundle technology. The left-hand side shows the Sloan Foundation Telescope and a close-up of the tip of the fiber bundle. The bottom right illustrates how each fiber will observe a different section of each galaxy. The image (from the Hubble Space Telescope) shows one of the first galaxies that the new SDSS has measured. The top right shows data gathered by two fibers observing two different parts of the galaxy, showing how the spectrum of the central regions differs dramatically from that of the outer regions. Image credit: David Law, SDSS collaboration, and Dana Berry / SkyWorks Digital, Inc. Hubble Space Telescope (Credit (http://hubblesite.org/newscenter/archive/releases/2008/16/image/cg/): NASA, ESA, the Hubble Heritage (STScI/AURA)-ESA/Hubble Collaboration, and A. Evans (University of Virginia, Charlottesville/NRAO/Stony Brook University))

    And the new SDSS will continue to improve our understanding of the Universe as a whole. The third core program, eBOSS, will precisely measure the expansion history of the Universe through 80% of cosmic history, back to when the Universe was less than three billion years old. These new detailed measurements will help to improve constraints on the nature of dark energy, the most mysterious experimental result in modern physics.
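
    A quick arithmetic check of the ‘80% of cosmic history’ figure (the ~13.8-billion-year age of the universe is the standard value, assumed here rather than quoted from the article):

```python
# Fraction of cosmic history covered when a survey reaches back to a
# given epoch after the Big Bang. The 13.8 Gyr age of the universe is
# the standard value, assumed here rather than taken from the article.

AGE_UNIVERSE_GYR = 13.8

def fraction_of_history(start_gyr_after_bang: float) -> float:
    """Fraction of cosmic history between the given epoch and today."""
    return (AGE_UNIVERSE_GYR - start_gyr_after_bang) / AGE_UNIVERSE_GYR

# Reaching back to ~2.8 Gyr after the Big Bang covers roughly 80%.
print(f"{fraction_of_history(2.8):.0%}")
```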

    “SDSS has a proud history of fostering a breadth of cosmic discoveries that connect a deep understanding of the origins of the universe with key insights on the nature of galaxies and the makeup of our own Milky Way,” said Hitoshi Murayama, Director of the Kavli IPMU. “We are delighted to be a part of this endeavor to understand the Universe in the broadest sense, and particularly happy to see our Kevin Bundy playing such a crucial role to make it all happen.”

    With new technology and surveys like MaNGA and the continuing generous support of the Alfred P. Sloan Foundation and participating institutions, the SDSS will remain one of the world’s most productive astronomical facilities. Science results from the SDSS will continue to reshape our view of the fundamental constituents of the cosmos, the universe of galaxies, and our home in the Milky Way.

    ABOUT THE SLOAN DIGITAL SKY SURVEY

    Funding for the Sloan Digital Sky Survey IV has been provided by the Alfred P. Sloan Foundation and the Participating Institutions. SDSS-IV acknowledges support and resources from the Center for High-Performance Computing at the University of Utah.

    SDSS-IV is managed by the Astrophysical Research Consortium for the Participating Institutions of the SDSS Collaboration including the Carnegie Institution for Science, Carnegie Mellon University, the Chilean Participation Group, Harvard-Smithsonian Center for Astrophysics, Instituto de Astrofisica de Canarias, The Johns Hopkins University, Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) / University of Tokyo, Lawrence Berkeley National Laboratory, Leibniz Institut fur Astrophysik Potsdam (AIP), Max-Planck-Institut fur Astrophysik (MPA Garching), Max-Planck-Institut fur Extraterrestrische Physik (MPE), Max-Planck-Institut fur Astronomie (MPIA Heidelberg), National Astronomical Observatory of China, New Mexico State University, New York University, The Ohio State University, Pennsylvania State University, Shanghai Astronomical Observatory, United Kingdom Participation Group, Universidad Nacional Autonoma de Mexico, University of Arizona, University of Colorado Boulder, University of Portsmouth, University of Utah, University of Washington, University of Wisconsin, Vanderbilt University, and Yale University.

    SDSS Website – http://www.sdss.org/

    See the full article, with video and additional material here.



     
  • richardmitnick 5:49 am on June 3, 2014 Permalink | Reply
    Tags: Kavli Institute

    From The Kavli Institute at Stanford: “Solving big questions requires big computation” 


    The Kavli Foundation

    Understanding the origins of our solar system, the future of our planet, or the fate of humanity requires complex calculations run on high-powered computers.

    A common thread among research efforts across Stanford’s many disciplines is the growing use of sophisticated algorithms, run by brute computing power, to solve big questions.

    In Earth sciences, computer models of climate change or carbon sequestration help drive policy decisions, and in medicine computation is helping unravel the complex relationship between our DNA and disease risk. Even in the social sciences, computation is being used to identify relationships between social networks and behaviors, work that could influence educational programs.


    “There’s really very little research that isn’t dependent on computing,” says Ann Arvin, vice provost and dean of research. Arvin helped support the recently opened Stanford Research Computing Center (SRCC) located at SLAC National Accelerator Laboratory, which expands the available research computing space at Stanford. The building’s green technology also reduces the energy used to cool the servers, lowering the environmental costs of carrying out research.

    “Everyone we’re hiring is computational, and not at a trivial level,” says Stanford Provost John Etchemendy, who provided an initial set of servers at the facility. “It is time that we have this facility to support those faculty.”

    Here are just a few examples of how Stanford faculty are putting computers to work to crack the mysteries of our origins, our planet and ourselves.

    Myths once explained our origins. Now we have algorithms.

    Our Origins

    Q: How did the universe form?

    For thousands of years, humans have looked to the night sky and created myths to explain the origins of the planets and stars. The real answer could soon come from the elegant computer simulations conducted by Tom Abel, an associate professor of physics at Stanford.

    Cosmologists face an ironic conundrum. By studying the current universe, we have gained a tremendous understanding of what occurred in the fractions of a second after the Big Bang, and how the first 400,000 years created the ingredients – gases, energy, etc. – that would eventually become the stars, planets and everything else. But we still don’t know what happened after those early years to create what we see in the night sky.

    “It’s the perfect problem for a physicist, because we know the initial conditions very well,” says Abel, who is also director of the Kavli Institute for Particle Astrophysics and Cosmology at SLAC. “If you know the laws of physics correctly, you should be able to exactly calculate what will happen next.”

    Easier said than done. Abel’s calculations must incorporate the laws of chemistry, atomic physics, gravity, how atoms and molecules radiate, gas and fluid dynamics and interactions, the forces associated with dark matter and so on. Those processes must then be simulated over the course of hundreds of millions, and eventually billions, of years. Further complicating matters, a single galaxy holds one billion moving stars, and the simulation needs to consider their interactions in order to create an accurate prediction of how the universe came to be.
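
    The cost of considering every interaction is easy to see in the simplest possible gravity solver: direct summation over every pair of bodies scales as O(N²), which is hopeless for a billion stars and is exactly why smarter algorithms matter. A minimal sketch (illustrative only, not Abel’s code):

```python
import math

# Minimal direct-summation gravity sketch: acceleration on each body
# from every other body, in code units with G = 1. The O(N^2) pair loop
# is why a billion-star galaxy cannot be brute-forced.

G = 1.0  # gravitational constant in code units

def accelerations(positions, masses, softening=1e-3):
    """Pairwise gravitational accelerations in 3D, with a small
    softening length to avoid singularities at zero separation."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + softening ** 2
            inv_r3 = 1.0 / (r2 * math.sqrt(r2))
            for k in range(3):
                acc[i][k] += G * masses[j] * dx[k] * inv_r3
    return acc

# Two equal unit masses one unit apart pull on each other equally.
a = accelerations([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]], [1.0, 1.0])
print(a[0][0], a[1][0])
```

    Tree codes and adaptive-mesh methods cut this pairwise cost to roughly O(N log N), which is what makes galaxy-scale simulations feasible.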

    “Any of the advances we make will come from writing smarter algorithms,” Abel says. “The key point of the new facility is it will allow for rapid turnaround, which will allow us to constantly develop and refine and validate new algorithms. And this will help us understand how the very first things were formed in the universe.” —Bjorn Carey //

    Q: How did we evolve?

    The human genome is essentially a gigantic data set. Deep within each person’s six billion data points are minute variations that tell the story of human evolution, and provide clues to how scientists can combat modern-day diseases.

    To better understand the causes and consequences of these genetic variations, Jonathan Pritchard, a professor of genetics and of biology, writes computer programs that can investigate those links. “Genetic variation affects how cells work, both in healthy variation and in response to disease,” Pritchard says. How that variation displays itself – in appearance or how cells work – and whether natural selection favors those changes within a population drives evolution.

    Consider, for example, variation in the gene that codes for lactase, an enzyme that allows mammals to digest milk. Most mammals turn off the lactase gene after they’ve been weaned from their mother’s milk. In populations that have historically revolved around dairy farming, however, Pritchard’s algorithms have helped to elucidate signals of strong selection since the advent of agriculture for variants that keep the lactase gene active throughout life, enabling people to digest milk as adults. There has been similarly strong selection on skin pigmentation variants in non-Africans that allow better synthesis of vitamin D in regions where people are exposed to less sunlight.
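    The logic of such a selection signal can be illustrated with a toy Wright–Fisher simulation, the standard textbook model of an allele spreading under selection and genetic drift. This is a generic sketch, not Pritchard's actual method; the population size, selection coefficient and starting frequency are arbitrary illustrative choices.

```python
import random

def wright_fisher(pop_size=1000, p0=0.05, s=0.05, generations=300, seed=1):
    """Track the frequency of an advantaged allele under selection and drift."""
    random.seed(seed)
    p = p0
    history = [p]
    for _ in range(generations):
        # Selection: carriers of the allele contribute (1 + s) times as many gametes.
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
        # Drift: the next generation's 2N gene copies are a random sample.
        copies = sum(1 for _ in range(2 * pop_size) if random.random() < p_sel)
        p = copies / (2 * pop_size)
        history.append(p)
    return history

trajectory = wright_fisher()
```

    Under these settings the allele sweeps to near-fixation within a few hundred generations; selection scans effectively run this logic in reverse, asking which observed patterns of variation are unlikely without such a sweep.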

    The algorithms and machine learning methods Pritchard uses have the potential to yield powerful medical insights. Studying variations in how genes are regulated within a population could reveal how and where particular proteins bind to DNA, or which genes are turned on in different cell types – information that could help design novel therapies. These inquiries can generate hundreds of thousands of data sets, and parsing them can require tens of thousands of hours of computing time.

    Pritchard is bracing for an even bigger explosion of data; as genome sequencing technologies become less expensive, he expects the number of individually sequenced genomes to jump by as much as a hundredfold in the next few years. “Storing and analyzing vast amounts of data is a fundamental challenge that all genomics groups are dealing with,” says Pritchard, who is a member of Stanford Bio-X.

    “Having access to SRCC will make our inquiries go more easily and quickly, and we can move on faster to making the next discovery.” —Bjorn Carey //
    Seven billion people live on Earth. Computers might help us survive ourselves.

    Our Planet
    Q: How can we predict future climates?

    There is no lab large enough to conduct experiments on the global-scale interactions between air, water and land that control Earth’s climate, so Stanford’s Noah Diffenbaugh and his students use supercomputers.

    Computer simulations reveal that if human emissions of greenhouse gases continue at their current pace, global warming over the next century is likely to occur faster than any global-scale shift recorded in the past 65 million years. This will increase the likelihood and severity of droughts, heat waves, heavy downpours and other extreme weather events.

    Climate scientists must incorporate into their predictions a growing number of data streams – including direct measurements as well as remote-sensing observations from satellites, aircraft-based sensors, and ground-based arrays.

    “That takes a lot of computing power, especially as we try to figure out how to use newer unstructured forms of data, such as from mobile sensors,” says Diffenbaugh, an associate professor of environmental Earth system science and a senior fellow at the Stanford Woods Institute for the Environment.

    Diffenbaugh’s team plans to use the increased computing resources available at SRCC to simulate air circulation patterns at the kilometer-scale over multiple decades. This has rarely been attempted before, and could help scientists answer questions such as how the recurring El Niño ocean circulation pattern interacts with elevated atmospheric carbon dioxide levels to affect the occurrence of tornadoes in the United States.

    “We plan to use the new computing cluster to run very large high-resolution simulations of climate over regions like the U.S. and India,” Diffenbaugh says. One of the most important benefits of SRCC, however, is not one that can be measured in computing power or cycles.

    “Perhaps most importantly, the new center is bringing together scholars from across campus who are using similar methodologies to figure out new solutions to existing problems, and hopefully to tackle new problems that we haven’t imagined yet.” —Ker Than //

    Q: How can we predict if climate solutions work?

    Capturing and trapping carbon dioxide gas deep underground is one of the most viable options for mitigating the effects of global warming, but only if we can understand how that stored gas interacts with the surrounding rock and fluids.

    Hamdi Tchelepi, a professor of energy resources engineering, uses supercomputers to study interactions between injected CO2 gas and the complex rock-fluid system in the subsurface.

    “Carbon sequestration is not a simple reversal of the technology that allows us to extract oil and gas. The physics involved is more complicated, ranging from the micro-scale of sand grains to extremely large geological formations that may extend hundreds of kilometers, and the timescales are on the order of centuries, not decades,” says Tchelepi, who is also the co-director of the Stanford Center for Computational Earth and Environmental Sciences (CEES).

    For example, modeling how a large plume of CO2 injected into the ground migrates and settles within the subsurface, and whether it might escape from the injection site to affect the air quality of a faraway city, can require the solving of tens of millions of equations simultaneously. SRCC will help augment the high computing power already available to Stanford Earth scientists and students through CEES, and will serve as a testing ground for custom algorithms developed by CEES researchers to simulate complex physical processes.
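    Systems of that size are never solved by direct matrix inversion; simulators lean on iterative methods that touch the matrix only through matrix-vector products. The sketch below shows the idea with a plain conjugate-gradient solver on a tiny symmetric positive-definite system standing in for a discretized flow problem; it is a generic numerical-methods illustration, not CEES code.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Iteratively solve A x = b for symmetric positive-definite A.

    Only matrix-vector products are needed, which is what makes the
    method practical when A is sparse and has millions of rows.
    """
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Toy stand-in for a discretized flow problem: a 1-D diffusion
# operator (tridiagonal Laplacian) with 100 unknowns.
n = 100
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
```

    With preconditioning and a matrix-free sparse representation, the same algorithm scales toward the tens of millions of unknowns Tchelepi describes, because each iteration needs only one multiplication by the system matrix.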

    Tchelepi, who is also affiliated with the Precourt Institute for Energy, says people are often surprised to learn the role that supercomputing plays in the modern Earth sciences. In fact, Earth scientists use more computing resources than almost any field except the defense industry, and their computing needs can influence the designs of next-generation hardware.

    “Earth science is about understanding the complex and ever-changing dynamics of flowing air, water, oil, gas, CO2 and heat. That’s a lot of physics, requiring extensive computing resources to model.” —Ker Than //
    Q: How can we build more efficient energy networks?

    When folks crank their air conditioners during a heat wave, you can almost hear the electric grid moan. The sudden, larger-than-average demand for electricity can stress electric plants, and energy providers scramble to redistribute the load, or ask industrial users to temporarily shut down. To handle those sudden spikes in use more efficiently, Ram Rajagopal, an assistant professor of civil and environmental engineering, used supercomputers to analyze the energy usage patterns of 200,000 anonymous households and businesses in Northern California. From those patterns he developed a model that could tune consumer demand and lead to a more flexible “smart grid.”

    Today, utility companies base forecasts on a 24-hour cycle that aggregates millions of households. Not surprisingly, power use peaks in the morning and evening, when people are at home. But when Rajagopal looked at 1.6 billion hourly data points, he found dramatic variations.

    Some households conformed to the norm and others didn’t. This forms the statistical underpinning for a new way to price and purchase power – by aggregating as few as a thousand customers into a unit with a predictable usage pattern. “If we want to thwart global warming we need to give this technology to communities,” says Rajagopal. Some consumers might want to pay whatever it costs to stay cool on hot days, others might conserve or defer demand to get price breaks. “I’m talking about neighborhood power that could be aligned to your beliefs,” says Rajagopal.
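    The aggregation idea (grouping customers whose daily load shapes look alike until the group's combined demand becomes predictable) can be illustrated with a toy clustering of synthetic profiles. The two household archetypes, the noise level and the plain k-means routine below are illustrative assumptions, not Rajagopal's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly load shapes for 300 customers over one day:
# "commuter" homes peak in the evening, "daytime" homes peak around noon.
hours = np.arange(24)
evening_peak = np.exp(-0.5 * ((hours - 19) / 2.0) ** 2)
midday_peak = np.exp(-0.5 * ((hours - 13) / 3.0) ** 2)
profiles = np.vstack([
    evening_peak + 0.1 * rng.standard_normal((150, 24)),
    midday_peak + 0.1 * rng.standard_normal((150, 24)),
])

def cluster_profiles(X, k=2, iters=25):
    """Plain k-means, seeded with the two most dissimilar profiles."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    i, j = np.unravel_index(np.argmax(d2), d2.shape)
    centers = X[[i, j]] if k == 2 else X[:k].copy()
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        centers = np.array([X[labels == c].mean(axis=0) for c in range(k)])
    return labels, centers

labels, centers = cluster_profiles(profiles)
```

    Once customers are grouped this way, each cluster's total load is far smoother and more predictable than any single household's, which is the statistical basis for pricing power at the neighborhood scale.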

    Establishing a responsive smart grid and creative energy economies will become even more important as solar and wind energy – which face hourly supply limitations due to Mother Nature – become a larger slice of the energy pie. —Tom Abate //

    Know thyself. Let computation help.

    Ourselves

    Q: How does our DNA make us who we are?

    Our DNA is sometimes referred to as our body’s blueprint, but it’s really more of a sketch. Sure, it determines a lot of things, but so do the viruses and bacteria swarming our bodies, our encounters with environmental chemicals that lodge in our tissues and the chemical stew that ensues when our immune system responds to disease states.

    All of this taken together – our DNA, the chemicals, the antibodies coursing through our veins and so much more – determines our physical state at any point in time. And all that information makes for a lot of data if, like genetics professor Michael Snyder, you collected it 75 times over the course of four years.

    Snyder is a proponent of what he calls “personal omics profiling,” or the study of all that makes up our person, and he’s starting with himself. “What we’re collecting is a detailed molecular portrait of a person throughout time,” he says.

    So far, he’s turning out to be a pretty interesting test case. In one round of assessment he learned that he was becoming diabetic and was able to control the condition long before it would have been detected through a periodic medical exam.

    If personal omics profiling is going to go mainstream, serious computing will be required to tease out which of the myriad tests Snyder’s team currently runs give meaningful information and should be part of routine screening. Snyder’s sampling alone has already generated half a petabyte of data – roughly enough raw information to fill a dishwasher-size rack of servers.

    Right now, that data and the computer power required to understand it reside on campus, but new servers will be located at SRCC. “I think you are going to see a lot more projects like this,” says Snyder, who is also a Stanford Bio-X affiliate and a member of the Stanford Cancer Center.

    “Computing is becoming increasingly important in medicine.” —Amy Adams //

    Q: How do we learn to read?

    A love letter, with all of its associated emotions, conveys its message with the same set of squiggly letters as a newspaper, novel or an instruction manual. How our brains learn to interpret a series of lines and curves as language that carries meaning or imparts knowledge is something psychology Professor Brian Wandell has been trying to understand.

    Wandell hopes to tease out differences between the brain scans of kids learning to read normally and those who are struggling, and use that information to find the right support for kids who need help. “As we acquire information about the outcome of different reading interventions we can go back to our database to understand whether there is some particular profile in the child that works better with intervention 1, and a second profile that works better with intervention 2,” says Wandell, a Stanford Bio-X member who is also the Isaac and Madeline Stein Family Professor and professor, by courtesy, of electrical engineering.

    His team developed a way of scanning kids’ brains with magnetic resonance imaging, then knitting the million collected samples together with complex algorithms that reveal how the nerve fibers connect different parts of the brain. “If you try to do this on your laptop, it will take half a day or more for each child,” he says. Instead, he uses powerful computers to reveal specific brain changes as kids learn to read.

    Wandell is associate director of the Stanford Neurosciences Institute, where he is leading the effort to develop a computing strategy – one that involves making use of SRCC rather than including computing space in their planned new building. He says one advantage of having faculty share computing space and systems is to speed scientific progress.

    “Our hope for the new facility is that it gives us the chance to set the standards for a better environment for sharing computations and data, spreading knowledge rapidly through the community.”

    Q: How do we work effectively together?

    There comes a time in every person’s life when it becomes easy to settle for the known relationship, for better or for worse, rather than seek out new ties with those who better inspire creativity and ensure success.

    Or so finds Daniel McFarland, professor of education and, by courtesy, of organizational behavior, who has studied how academic collaborations form and persist. McFarland and his own collaborators tracked signs of academic ties such as when Stanford faculty co-authored a paper, cited the same publications or got a grant together. Armed with 15 years of collaboration output on 3,000 faculty members, they developed a computer model of how networks form and strengthen over time.

    “Social networks are large, interdependent forms of data that quickly confront limits of computing power, and especially so when we study network evolution,” says McFarland.

    Their work has shown that once academic relationships have been established, they tend to continue out of habit, regardless of whether they are the most productive fit. He argues that successful academic programs or businesses should work to bring new members into collaborations and also spark new ties to prevent more senior people from falling back on known but less effective relationships. At the same time, he comes down in favor of retreats and team building exercises to strengthen existing good collaborations.

    McFarland’s work has implications for Stanford’s many interdisciplinary programs. He has found that collaborations across disciplines often fall apart due in part to the distant ties between researchers. “To form and sustain these ties, pairs of colleagues must interact frequently to share knowledge,” he writes. “This is perhaps why interdisciplinary centers may be useful organizational means of corralling faculty and promoting continued distant collaborations.” —Amy Adams //

    Q: What can computers tell us about how our body works?

    As you sip your morning cup of coffee, the caffeine makes its way to your cells, slots into a receptor site on the cells’ surface and triggers a series of reactions that jolt you awake. A similar process takes place when Zantac provides relief for stomach ulcers, or when chemical signals produced in the brain travel cell-to-cell through your nervous system to your heart, telling it to beat.

    In each of these instances, a drug or natural chemical is activating a cell’s G-protein coupled receptor (GPCR), the cellular target of roughly half of all known drugs, says Vijay Pande, a professor of chemistry and, by courtesy, of structural biology and of computer science at Stanford. This exchange is a complex one, though. In order for caffeine or any other molecule to influence a cell, it must fit snugly into the receptor site, which consists of 4,000 atoms and shifts between active and inactive configurations. Current imaging technologies are unable to view that transformation, so Pande has been simulating it using his Folding@Home distributed computer network.

    So far, Pande’s group has simulated a few hundred microseconds of the receptor’s transformation. Although that’s an extraordinarily long stretch of time by the standards of such simulations, Pande is looking forward to accessing the SRCC to investigate the basic biophysics of GPCR and other proteins. Greater computing power, he says, will allow his team to simulate larger molecules in greater detail, simulate folding sequences for longer periods of time and visualize multiple molecules as they interact. It might even lead to atom-level simulations of processes at the scale of an entire cell. All of this knowledge could be applied to computationally design novel drugs and therapies.

    “Having more computer power can dramatically change every aspect of what we can do in my lab,” says Pande, who is also a Stanford Bio-X affiliate. “Much like having more powerful rockets could radically change NASA, access to greater computing power will let us go way beyond where we can go routinely today.” —Bjorn Carey //

    See the full article here.

    The Kavli Foundation, based in Oxnard, California, is dedicated to the goals of advancing science for the benefit of humanity and promoting increased public understanding and support for scientists and their work.

    The Foundation’s mission is implemented through an international program of research institutes, professorships, and symposia in the fields of astrophysics, nanoscience, neuroscience, and theoretical physics as well as prizes in the fields of astrophysics, nanoscience, and neuroscience.


    ScienceSprings is powered by MAINGEAR computers

     
  • richardmitnick 5:06 pm on December 14, 2013 Permalink | Reply
    Tags: , , , , Kavli Institute,   

    From Kavli: “Swirls in Remnants of Big Bang May Hold Clues to Universe’s Infancy” 


    The Kavli Foundation

    December 13, 2013
    No Writer Credit
    (Originally published by University of Chicago)

    South Pole Telescope scientists have detected for the first time a subtle distortion in the oldest light in the universe, which may help reveal secrets about the earliest moments in the universe’s formation.

    The scientists observed twisting patterns in the polarization of the cosmic microwave background—light that last interacted with matter very early in the history of the universe, less than 400,000 years after the Big Bang. These patterns, known as “B modes,” are caused by gravitational lensing, a phenomenon that occurs when the trajectory of light is bent by massive objects, much like a lens focuses light.

    The 10 metre South Pole Telescope. Physics World magazine named research results published earlier this year by the South Pole Telescope collaboration one of the top 10 physics breakthroughs of 2013. (Photo by Daniel Luong-Van)

    A multi-institutional collaboration of researchers led by John Carlstrom, the S. Chandrasekhar Distinguished Service Professor in Astronomy & Astrophysics at the University of Chicago, made the discovery. They announced their findings in a paper published in the journal Physical Review Letters, using the first data from SPTpol, a polarization-sensitive camera installed on the telescope in January 2012.

    SPTpol: an instrument for CMB polarization measurements with the South Pole Telescope

    “The detection of B-mode polarization by South Pole Telescope is a major milestone, a technical achievement that indicates exciting physics to come,” said Carlstrom, who also is deputy director of the Kavli Institute for Cosmological Physics.

    The cosmic microwave background is a sea of photons (light particles) left over from the Big Bang that pervades all of space, at a temperature of minus 270 degrees Celsius—a mere 3 degrees above absolute zero. Measurements of this ancient light have already given physicists a wealth of knowledge about the properties of the universe. Tiny variations in the temperature of the light have been painstakingly mapped across the sky by multiple experiments, and scientists are gleaning even more information from polarized light.

    Light is polarized when its electromagnetic waves are preferentially oriented in a particular direction. Light from the cosmic microwave background is polarized mainly due to the scattering of photons off of electrons in the early universe, through the same process by which light is polarized as it reflects off the surface of a lake or the hood of a car. The polarization patterns that result are of a swirl-free type, known as “E modes,” which have proven easier to detect than the fainter B modes, and were first measured a decade ago by a collaboration of researchers using the Degree Angular Scale Interferometer, another UChicago-led experiment.

    Simple scattering can’t generate B modes, which instead emerge through a more complex process—hence scientists’ interest in measuring them. Gravitational lensing, it has long been predicted, can twist E modes into B modes as photons pass by galaxies and other massive objects on their way toward Earth. This expectation has now been confirmed.

    To tease out the B modes in their data, the scientists used a previously measured map of the distribution of mass in the universe to determine where the gravitational lensing should occur. They combined their measurement of E modes with the mass distribution to provide a template of the expected twisting into B modes. The scientists are currently working with another year of data to further refine their measurement of B modes.
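    The template construction has a compact mathematical core. In the standard flat-sky, first-order treatment of CMB lensing (a textbook result; the collaboration's actual estimator is more elaborate, and sign conventions vary), the polarization Stokes fields decompose into E and B modes in Fourier space, and the lensing potential couples them:

```latex
% E/B decomposition of the Stokes fields Q and U:
(Q \pm iU)(\boldsymbol{\ell})
  = \left[E(\boldsymbol{\ell}) \pm iB(\boldsymbol{\ell})\right]
    e^{\pm 2i\psi_{\boldsymbol{\ell}}}

% To first order in the lensing potential \phi, lensing turns E into B:
\tilde{B}(\boldsymbol{\ell})
  = \int \frac{d^2\boldsymbol{\ell}'}{(2\pi)^2}
    \left[\boldsymbol{\ell}' \cdot (\boldsymbol{\ell} - \boldsymbol{\ell}')\right]
    \phi(\boldsymbol{\ell} - \boldsymbol{\ell}')\,
    E(\boldsymbol{\ell}')\,
    \sin 2(\psi_{\boldsymbol{\ell}'} - \psi_{\boldsymbol{\ell}})
```

    Here ψ is the azimuthal angle of each wavevector. Inserting the measured E modes and a lensing potential inferred from the mass distribution yields exactly the kind of lensed B-mode template described above.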

    The careful study of such B modes will help physicists better understand the universe. The patterns can be used to map out the distribution of mass, thereby more accurately defining cosmologically important properties like the masses of neutrinos, tiny elementary particles prevalent throughout the cosmos.

    Similar, more elusive B modes would provide dramatic evidence of inflation, the theorized turbulent period in the moments after the Big Bang when the universe expanded extremely rapidly. Inflation is a well-regarded theory among cosmologists because its predictions agree with observations, but there is as yet no definitive confirmation of it. Measuring B modes generated by inflation is a possible way to dispel lingering doubt.

    “The detection of a primordial B-mode polarization signal in the microwave background would amount to finding the first tremors of the Big Bang,” said the study’s lead author, Duncan Hanson, a postdoctoral scientist at McGill University in Canada.

    Cosmic microwave background

    B modes from inflation are caused by gravitational waves. These ripples in space-time are generated by intense gravitational turmoil, conditions that would have existed during inflation. These waves, stretching and squeezing the fabric of the universe, would give rise to the telltale twisted polarization patterns of B modes. Measuring the resulting polarization would not only confirm the theory of inflation—a huge scientific achievement in itself—but would also give scientists information about physics at very high energies—much higher than can be achieved with particle accelerators.

    The measurement of B modes from gravitational lensing is an important first step in the quest to measure inflationary B modes. In inflationary B mode searches, lensing B modes show up as noise. “The new result shows that this noise can be accounted for and subtracted off so that scientists can search for and hopefully measure the inflationary B modes underneath,” Hanson said. “The lensing signal can also be used by itself to learn about the distribution of mass in the universe.”

    See the full article here.

    The Kavli Institute for Cosmological Physics, based at the University of Chicago, seeks answers to some of the most profound questions about matter, energy and the universe.




     
  • richardmitnick 8:33 am on October 31, 2013 Permalink | Reply
    Tags: , , , , Kavli Institute,   

    From SLAC: “Cosmos Seeded with Heavy Elements During Violent Youth” 

    October 30, 2013
    Lori Ann White

    New evidence of heavy elements spread evenly between the galaxies of the giant Perseus cluster supports the theory that the universe underwent a turbulent and violent youth more than 10 billion years ago. That explosive period was responsible for seeding the cosmos with the heavy elements central to life itself.


    Researchers from the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC), jointly run by Stanford University and the Department of Energy’s SLAC National Accelerator Laboratory, shed light on this important era by analyzing 84 separate sets of X-ray telescope observations from the Japanese-US Suzaku satellite. Their results appear today in the journal Nature.

    “We saw that iron is spread out between the galaxies remarkably smoothly,” said Norbert Werner, lead author of the paper. “That means it had to be present in the intergalactic gas before the Perseus cluster formed.”

    The even distribution of these elements supports the idea that they were created at least 10 to 12 billion years ago. According to the paper, during this time of intense star formation, billions of exploding stars created vast quantities of heavy elements in the alchemical furnaces of their own destruction. This was also the epoch when the black holes in the hearts of galaxies were at their most energetic. Together, young stars, exploding supernovae and voraciously feeding black holes produced powerful winds; those winds were the spoon that lifted the iron from the galaxies and mixed it with the intergalactic gas.

    “The combined energy of these cosmic phenomena must have been strong enough to expel most of the metals from the galaxies at early times, and to enrich and mix the intergalactic gas,” said co-author and KIPAC graduate student Ondrej Urban.

    To settle the question of whether the heavy elements created by supernovae remain mostly in their home galaxies or are spread out through intergalactic space, the researchers looked through the Perseus cluster in eight different directions. They focused on the hot, 10-million-degree gas that fills the spaces between galaxies and found the spectroscopic signature of iron reaching all the way to the cluster’s edges.

    “We estimate there’s about 50 billion solar masses of iron in the cluster,” said former KIPAC member and co-author Aurora Simionescu, who is currently with the Japanese Aerospace Exploration Agency as an International Top Young Fellow. “We think most of the iron came from a single type of supernova, called a Type Ia supernova.”

    In Type Ia supernovae the stars are destroyed and release all their material into the surrounding space. The researchers believe that at least 40 billion Type Ia supernovae must have exploded within a relatively short period on cosmological time scales in order to release that much iron and have the force to drive it out of the galaxies.
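    That count is consistent with a one-line back-of-envelope check. Assuming a typical iron yield of roughly 0.7 solar masses per Type Ia explosion (a representative textbook value, not a figure quoted in the paper):

```python
iron_in_cluster = 50e9   # solar masses of iron estimated in the Perseus cluster
iron_per_sn_ia = 0.7     # assumed iron yield of one Type Ia supernova, in solar masses

n_supernovae = iron_in_cluster / iron_per_sn_ia
print(f"~{n_supernovae:.0e} Type Ia supernovae needed")
```

    Fifty billion solar masses of iron at 0.7 solar masses apiece implies on the order of 70 billion explosions, comfortably above the paper's lower bound of 40 billion.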

    The results suggest that the Perseus cluster is probably not unique, and that iron – along with other heavy elements – is evenly spread throughout all massive galaxy clusters, said Steven Allen, a KIPAC professor and head of the research team.

    “You are older than you think – or at least, some of the iron in your blood is older, formed in galaxies millions of light years away and billions of years ago,” Simionescu said.

    See the full article here.

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.



     
  • richardmitnick 11:52 am on July 17, 2013 Permalink | Reply
    Tags: , , Kavli Institute   

    From KAVLI: “Imperfect graphene renders ‘electrical highways’” 


    The Kavli Foundation

    07/16/2013
    (Originally published by Cornell University)

    Media Contact
    James Cohen
    Director of Communications
    The Kavli Foundation
    (805) 278-7495
    cohen@kavlifoundation.org

    “Just an atom thick, 200 times stronger than steel and a near-perfect conductor, graphene’s future in electronics is all but certain. But to make this carbon supermaterial useful, it needs to be a semiconductor – a material that can switch between insulating and conducting states, which forms the basis for all electronics today.

    Graphene is an atomic-scale honeycomb lattice made of carbon atoms.


    Combining experiment and theory, Cornell researchers have moved a step closer to making graphene a useful, controllable material. They showed that when grown in stacked layers, graphene produces some specific defects that influence its conductivity.

    Three dark field-transmission electron microscopy images of bilayer graphene are overlaid with colors to show diffraction angles. The lines are soliton boundaries. (Muller Lab)

    On the experiment side, a research group has imaged and analyzed the structure and behavior of graphene sheets stacked one on top of the other, called bilayer graphene. The group, publishing online June 24 in Proceedings of the National Academy of Sciences, includes Paul McEuen, the Goldwin Smith Professor of Physics and director of the Kavli Institute at Cornell for Nanoscale Science; David Muller, professor of applied and engineering physics and Kavli co-director; and Jiwoong Park, associate professor of chemistry and chemical biology and Kavli member."

    See the full article here.




     
  • richardmitnick 11:51 am on July 9, 2013 Permalink | Reply
    Tags: , , , , , Kavli Institute   

    From The Kavli Foundation: “Capturing the Dark Expansion of the Universe” 


    The Kavli Foundation

    Astronomers first exposed dark energy 15 years ago. Now with the help of an enormous camera, they hope to really begin to understand what it is and why the universe is flying apart at increasing speed.

    Summer, 2013
    No Writer Credit

    “PERCHED ATOP A RUGGED PEAK in the north Chilean Andes, one of the world’s largest cameras is taking portraits of deep space. Light traveling for billions of years tickles the camera’s gigantic eye every night, yielding crisp images of ancient clusters of galaxies. But the explorers who built the camera seek imprints of something dark and rather disturbing within the pretty pictures: a pervasive, invisible force pushing the universe apart.

    DECam, built at Fermilab, resides at the Cerro Tololo Inter-American Observatory in Chile

    Dark energy — the name given to the obscure driver of cosmic acceleration — is just as enigmatic today. ‘When you say dark energy, what you really mean is something you don’t know about,’ says cosmologist David Burke of the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) at the SLAC National Accelerator Laboratory and Stanford University.

    Is dark energy an enduring force that has always been with us, and will be forevermore? Does it fit neatly into Albert Einstein’s theories of relativity, or will its presence force us to rethink all we know about gravity? What is it?

    A global team of astronomers, physicists, engineers and dreamers intends to find out. Through the wide lens of the Dark Energy Camera (DECam), the group will search for subtle changes in the body language of the universe. Within the colors of distant supernovae, the clumpiness of galaxy clusters, and the bending of primordial light lie clues about the origin of our universe and, perhaps, its future.”

    See the full article here.




     