Tagged: UC Berkeley

  • richardmitnick 7:32 am on September 23, 2016
    Tags: Juan de Fuca plate, Seismic 'CT scans' reveal deep earth dynamics, Seismic tomography, UC Berkeley

    From Berkeley via phys.org: “Seismic ‘CT scans’ reveal deep earth dynamics” 

    UC Berkeley



    September 23, 2016
    Wallace Ravven

    A new look 100 miles beneath a massive tectonic plate as it dives under North America has helped clarify the subduction process that generates earthquakes, volcanoes and the rise of the Cascade Range in the Pacific Northwest.

    The largest array of seismometers ever deployed on the seafloor, coupled with hundreds of others operating in the continental U.S., has enabled UC Berkeley researchers to essentially create CT scans of the Juan de Fuca plate and part of the earth’s mantle directly below it.

    The plate, about the size of the state of Michigan, is grinding under the continent along an 800-mile swath that runs from Northern California to Vancouver Island, known as the Cascadia subduction zone.

    The 3-D imaging process, known as seismic tomography, has revealed with unprecedented clarity a huge, buoyant, sausage-shaped region of the upper mantle, or asthenosphere, pressing up on the oceanic plate.

    The imaging casts new light on the competing hypotheses about the drivers of plate tectonics, a dynamic earth process that has been studied for more than 50 years but is still poorly understood.

    Different evidence has led to three different plate-movement scenarios: either the plates are pushed from mid-ocean ridges; or they are pulled by their subducting slabs; or their movement is driven by the drag of the viscous mantle material that lies directly below.

    The new research suggests that the third scenario does not apply to the Cascadia subduction zone. Rather, it reveals that a distinct, thin—and difficult to observe—layer separates the plate from the mantle beneath, at least in the Cascadia subduction zone. The layer acts as a kind of berm that the plate rolls over before descending beneath the continent, says UC Berkeley seismologist Richard Allen, leader of the research and co-author of a paper appearing in the Sept. 23 edition of the journal Science.

    “What we observe is an accumulation of low-viscosity material between the plate and the mantle. Its composition acts as a lubricant, and decouples the plate’s movement from the mantle below it,” explains Allen, who is director of the Berkeley Seismological Laboratory and professor and chair of Earth and Planetary Science at Berkeley. The plates may move independently of the mantle below, he adds.

    The finding, he says, will help refine models of plate tectonic dynamics, aiding the long-range effort to understand the connection between tectonics and earthquakes.

    “It is the motion of the plates that causes earthquakes,” Allen says. “Models like this help us understand that linkage so we can be better informed of the coastal hazards.

    “First though, we need to learn if what we find here is typical of subduction zones across the planet, or if it is unique for some reason.”

    Japan has recently deployed a massive seafloor seismic network to study subduction and earthquakes. Allen hopes to next apply the tomography strategy there. Alaska also beckons.

    Lead author on the Science paper is William Hawley, a graduate student in Allen’s lab.

    “Plate tectonics is the most fundamental concept explaining the formation of features we see on the earth’s surface,” Hawley says, “but despite the fact that the concept is simple, we still do not know exactly why or how it operates.

    “If the asthenosphere acts as a lubricant for tectonic plate movement throughout the planet, it will really change our long-term models of the process”—dynamic changes that occur over 100 million years.

    “Modelers will have to take this lubricating layer into account because it changes the way the mantle and the plates talk to each other.”

    Seismic tomography generates 3-D images of the earth’s interior by measuring how differences in shape, density, rock type and temperature affect the path, speed and amplitude of seismic waves traveling through the planet from an earthquake.

    Much as in CT scans, computers process differences in energy measured at the receiving end to infer interior 3-D detail. CT scans use X-rays as the energy source, while seismic tomography measures energy from seismic waves.
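As a toy illustration of that inference (not the researchers' actual pipeline), travel-time tomography can be posed as a linear least-squares problem: each observed delay is a path integral of slowness perturbations over grid cells. Every number below — the grid, the ray lengths, the perturbations — is hypothetical:

```python
import numpy as np

# Hypothetical toy setup: 4 grid cells crossed by 6 ray paths.
# G[i, j] = length (km) of ray i inside cell j.
G = np.array([
    [10.0, 0.0, 5.0, 0.0],
    [0.0, 8.0, 0.0, 7.0],
    [6.0, 6.0, 0.0, 0.0],
    [0.0, 0.0, 9.0, 4.0],
    [5.0, 0.0, 0.0, 10.0],
    [0.0, 7.0, 6.0, 0.0],
])

# "True" slowness perturbations (s/km) the inversion should recover.
true_ds = np.array([0.01, -0.02, 0.005, 0.015])

# Observed travel-time delays: each is a path integral (here, a dot
# product) of slowness over the cells the ray crosses.
t_obs = G @ true_ds

# Invert by least squares -- the same idea real tomography applies to
# thousands of quakes and stations at vastly larger scale.
ds_est, *_ = np.linalg.lstsq(G, t_obs, rcond=None)
print(np.round(ds_est, 4))  # recovers the true perturbations
```

With good ray coverage (more independent paths than cells), the inversion pins down every cell; sparse coverage is what limits resolution, which is why the dense seafloor array mattered.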

    A dense array of seismometers directly over the region of interest yields the best images and provides the highest resolution of the structures, which can then inform models of the process.

    This study used the data from the largest scale ocean-floor deployment to complement the onshore data already available. Together, they generated the best images of the region to date.

    The four-year seafloor research effort was made possible by the National Science Foundation’s ambitious $20 million Cascadia Initiative. The NSF aimed to spur greater understanding of plate structure, subduction processes, earthquakes and volcanism by deploying seismometers at 120 sites on the ocean floor, arrayed throughout the 95,000-square-mile Juan de Fuca plate.

    Over the four years, the offshore and onshore seismometer array measured thousands of earthquakes throughout the planet, ranging from magnitudes of 5 to about 9 on the Richter scale. The study examined a subset of 321 quakes with magnitudes between about 6 and 7.5.
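For scale, each unit of magnitude corresponds to roughly 32 times more radiated seismic energy (a factor of 10^1.5), so the catalog above spans an enormous energy range. A quick back-of-the-envelope sketch:

```python
# Radiated seismic energy scales roughly as 10**(1.5 * M), so the ratio
# between two quakes depends only on their magnitude difference.
def energy_ratio(m_big, m_small):
    return 10 ** (1.5 * (m_big - m_small))

# One magnitude unit is ~31.6x the energy; two units is ~1000x.
print(round(energy_ratio(6.0, 5.0), 1))   # → 31.6
# A magnitude-7.5 quake releases ~5,600 times the energy of a magnitude-5.
print(round(energy_ratio(7.5, 5.0)))      # → 5623
```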

    Grad students and faculty scientists participated in 24 research cruises to deploy the instruments and move them between two swaths of the Juan de Fuca plate. Several of the seismic tomography cruises invited undergraduate students on the two-week trips. On one Berkeley-led cruise aboard the R/V Thomas Thompson, the undergrads dubbed the trip the “Tom Cruise,” and sent daily video blogs.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

    Founded in the wake of the gold rush by leaders of the newly established 31st state, the University of California’s flagship campus at Berkeley has become one of the preeminent universities in the world. Its early guiding lights, charged with providing education (both “practical” and “classical”) for the state’s people, gradually established a distinguished faculty (with 22 Nobel laureates to date), a stellar research library, and more than 350 academic programs.

    UC Berkeley Seal

  • richardmitnick 1:53 pm on September 22, 2016
    Tags: Chan Zuckerberg Biohub, UC Berkeley   

    From Berkeley: “UC Berkeley to partner in $600M Chan Zuckerberg science ‘Biohub’” 

    UC Berkeley


    September 21, 2016
    Yasmin Anwar

    UC Berkeley, UC San Francisco and Stanford University will join forces in a new medical science research center funded by a $600 million commitment from Facebook CEO and founder Mark Zuckerberg and pediatrician Priscilla Chan.

    Announced today, the San Francisco-based Chan Zuckerberg Biohub, an independent collaboration between the Bay Area’s three premier research universities, is the first philanthropic science investment made by the Chan Zuckerberg Initiative, which is dedicated to “advancing human potential and promoting equality.”

    “We are excited to see such a generous and timely investment in fundamental scientific work across the Bay Area,” said Jennifer Doudna, UC Berkeley professor of molecular and cell biology and chemistry, Li Ka Shing Chancellor’s Chair in Biomedical and Health Sciences, Howard Hughes Medical Institute investigator and a member of the Biohub’s Science Advisory Group.

    “The Biohub will allow researchers at leading institutions to collaborate and accelerate the development of breakthrough scientific and medical advancements, applications and therapeutics,” added Doudna, who is best known for her pioneering work on CRISPR-Cas9, a gene-editing technology that has the potential to revolutionize genetics, molecular biology and medicine.

    Headquartered next to UCSF’s Mission Bay campus, with a satellite site at Stanford, the Biohub will provide basic researchers and clinical scientists with flexible laboratory space, the latest technological tools and funding for ambitious research projects.

    “In bringing together three world-class research universities in UC Berkeley, UCSF, and Stanford, the Biohub represents the type of cross-institutional collaborative environment that will be critical for addressing the most pressing life sciences challenges of our time,” said UC Berkeley Chancellor Nicholas Dirks.

    “Each of these institutions brings its own set of perspectives, questions, ideas, and expertise to this venture, and, given this unprecedented kind of exchange, I am confident that the Biohub will be the catalyst for major and even transformational research breakthroughs,” he added.

    ‘New avenues to treat and cure disease’

    Paul Alivisatos, vice chancellor for research at UC Berkeley and a pioneer of nanoscience, said “this forward-looking gift will empower scientists at the leading edges of their fields to work across disciplines in new ways and to be nimbler and to pursue new ideas.”

    “The research community at UC Berkeley is thrilled to have this new opportunity to collaborate with researchers at UCSF and Stanford to expand our knowledge of human health and lay the groundwork to create new avenues to treat and cure disease,” he said.

    In 2015, Chan and Zuckerberg pledged in an open letter to their newborn daughter to donate 99 percent of their Facebook shares during their lives for charitable purposes. “Partnering with experts,” they wrote, “is more effective for the mission than trying to lead efforts ourselves.”

    They are now making good on their promise, said Robert Tjian, a UC Berkeley professor of biochemistry, biophysics and structural biology who as president of the Howard Hughes Medical Institute — a post he left earlier this month — served as an adviser in the creation of the Chan Zuckerberg Biohub. He will be a member of the President’s Advisory Board at the Biohub and will serve on the Scientific Advisory Board at the larger Chan Zuckerberg Science Initiative.

    “It’s a game changer, not only for the Bay Area and the three respective campuses, but for the life sciences in general,” Tjian said.

    The Biohub will immediately initiate two potentially transformative research projects to be conducted over the next five years: the Cell Atlas and the Infectious Disease Initiative.

    The Cell Atlas will be a map made available to researchers around the world that reveals the many different types of cells that control the body’s major organs, such as the brain, heart, breast and lungs. The Cell Atlas will also depict the internal machinery of cells in unprecedented detail, allowing scientists to search for the basic breakdowns that occur within cells when disease strikes.

    The Infectious Disease Initiative will explore new ways to create drugs, diagnostic tests and vaccines against the many infectious diseases that still threaten much of the world, like HIV, Ebola and Zika. The initiative will include a Rapid Response Team that can immediately devote world-class scientists and advanced research technology to develop new ways to fight a sudden outbreak.

    The Biohub’s open-access model will allow researchers at its three member universities and elsewhere to use its technology and collaborate with scientists at the Biohub, which will provide support for both established and early-career scientists.

    Nurturing young scientists

    Moreover, the Biohub will fund Chan Zuckerberg Investigators to support high-impact projects that may be too exploratory to receive government support. The competition for these slots will open to faculty at the three universities in October. Investigators are expected to be selected by an independent panel of scientists by the end of the year.

    “We have three great research powerhouses in the San Francisco Bay Area, and the Biohub will serve as a completely new nexus of collaboration by providing exceptional resources and opportunities for UCSF, Stanford and Berkeley scientists to create highly productive partnerships,” said Biohub co-director Joseph DeRisi, professor and chair of biochemistry and biophysics at UCSF.

    “The Biohub will be the sinew that ties together these three institutions in the Bay Area like never before,” said Stephen Quake, Stanford professor of bioengineering and of applied physics, who will co-lead the center with DeRisi.

    DeRisi is renowned for his use of genomic technologies for the study of malaria and viruses, and the diagnosis of unknown infections; Quake developed a platform called microfluidics, which can sequence minuscule amounts of DNA or analyze molecules within drops of liquid, technology that has accelerated research into the genetic basis of disease.

    “This exciting new venture by the Chan Zuckerberg Initiative brings together private philanthropy with some of the best minds in the world,” said UC President Janet Napolitano. “Collaboration in the name of science and the public good among the Bay Area’s three leading research universities will surely speed the development of new treatments and cures for diseases once deemed intractable.”

    Advancing the Biohub’s overarching projects will require technologies such as the CRISPR genome-editing technology, advanced cryo-EM, single-cell sequencing platforms as well as single-molecule imaging technologies and the computational infrastructure needed to analyze giant datasets, Tjian said.

    A unique collaboration

    “The three research universities bring great scientific strengths, cross-cutting expertise and a spirit of collaboration,” Tjian said. “For example, Berkeley has both depth and breadth in computational biology and is spearheading the application of next-generation super- resolution live-cell imaging.”

    “There are also great people working on infectious diseases complemented by experts in cell biology to help develop the cell atlas,” he added. “These foundational discovery efforts will be critical to uncover the underlying basis of disease that will inform the development of novel diagnosis platforms and therapies.”

    In addition to the Biohub, the Chan Zuckerberg Initiative has also announced plans for a broader focus on science, its second major initiative, alongside work to improve education for all students.

    The Chan Zuckerberg Initiative’s goal is to cure, prevent or manage all diseases by the end of the century by accelerating basic science research. The initiative seeks to support new ways of enabling scientists and engineers to work together to build new tools that will empower the whole scientific community and advance progress.

    See the full article here.



  • richardmitnick 1:32 pm on September 16, 2016
    Tags: HERA collaboration, UC Berkeley

    From UC Berkeley and SKA: “Funding boost for SKA Precursor HERA telescope – What happened after the lights came on in the universe?” 

    UC Berkeley


    SKA Square Kilometer Array


    From SKA:
    Friday 21 September 2016, SKA Global Headquarters, UK – The Hydrogen Epoch of Reionisation Array (HERA) has been awarded a $9.5 million investment to expand its capabilities, as announced on Wednesday 14th September by the US National Science Foundation.

    Image of the [beginnings of] HERA telescope at the Losberg Site in the Karoo desert. Credit: Danny Jacobs

    HERA, which was recently granted the status of SKA precursor telescope by the SKA Organisation, currently has 19 14-metre radio dishes at the SKA South Africa Losberg site near Carnarvon. This fresh injection of $9.5 million will allow the array to expand to 220 radio dishes by 2018.

    HERA is an experiment focused on one science goal – detecting the Epoch of Reionization signal – and is not a general facility. As part of this effort, HERA is developing techniques, algorithms, calibration and processing pipelines and hardware optimised towards the detection of the power spectrum of the EOR, all of which will benefit SKA in designing and eventually operating the SKA-low telescope to be based in Australia.

    From UC Berkeley:

    September 14, 2016
    Robert Sanders

    An experiment to explore the aftermath of cosmic dawn, when stars and galaxies first lit up the universe, has received nearly $10 million in funding from the National Science Foundation to expand its detector array in South Africa.

    The HERA collaboration expects eventually to expand to 330 radio dishes in the core array, each pointed straight up to measure radiation originally emitted some 13 billion years ago. Twenty outrigger dishes (not shown) are also planned, bringing the array up to 350 dishes total.

    The experiment, an international collaboration called the Hydrogen Epoch of Reionization Array, or HERA, currently has 19 14-meter (46-foot) radio dishes aimed at the southern sky near Carnarvon, South Africa, and will soon increase that number to 37. The $9.5 million in new funding will allow the array to expand to 240 radio dishes by 2018.

    Led by UC Berkeley, HERA will explore the billion-year period after hydrogen gas collapsed into the first stars, perhaps 100 million years after the Big Bang, through the ignition of stars and galaxies throughout the universe. These first brilliant objects flooded the universe with ultraviolet light that split or ionized all the hydrogen atoms between galaxies into protons and electrons to create the universe we see today.

    “The first galaxies lit up and started ionizing bubbles of gas around them, and soon these bubbles started percolating and intersecting and making bigger and bigger bubbles,” said Aaron Parsons, a UC Berkeley associate professor of astronomy and principal investigator for HERA. “Eventually, they all intersected and you got this über bubble, leaving the universe as we observe it today: Between galaxies the gas is essentially all ionized.”

    That’s the theory, anyway. HERA hopes for the first time to observe this key cosmic milestone and then map the evolution of reionization to about 1 billion years after the Big Bang.

    “We have learned a ton about the cosmology of our universe from studies of the cosmic microwave background, but those experiments are observing just the thin shell of light that was emitted from a bunch of protons and electrons that finally combined into neutral hydrogen 380,000 years after the Big Bang,” he said. “We know from these experiments that the universe started out neutral, and we know that it ended ionized, and we are trying to map out how it transitioned between those two.”

    “Before the cosmic dawn, the universe glowed from the cosmic microwave background radiation, but there weren’t stars lighting up the universe,” said David DeBoer, a research astronomer in UC Berkeley’s Radio Astronomy Laboratory. “At some point the neutral hydrogen seeded the stars and black holes and galaxies that relit the universe and led to the epoch of reionization.”

    A 13.8-billion-year cosmic timeline indicates the era shortly after the Big Bang observed by the Planck satellite, the era of the first stars and galaxies observed by HERA and the era of galaxy evolution to be observed by NASA’s future James Webb Space Telescope. (HERA image)

    The HERA array, which could eventually expand to 350 telescopes, consists of radio dishes staring fixedly upwards, measuring radiation originally emitted at a wavelength of 21 centimeters – the hyperfine transition in the hydrogen atom – that has been red-shifted by a factor of 10 or more since it was emitted some 13 billion years ago. The researchers hope to detect the boundaries between bubbles of ionized hydrogen – invisible to HERA – and the surrounding neutral or atomic hydrogen.

    By tuning the receiver to different wavelengths, they can map the bubble boundaries at different distances or redshifts to visualize the evolution of the bubbles over time.
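That wavelength-to-redshift mapping is simple arithmetic on the 21 cm rest wavelength: λ_obs = λ_rest × (1 + z). A small sketch (the 100-200 MHz tuning range used below is illustrative, not a stated HERA specification):

```python
REST_WAVELENGTH_CM = 21.106   # 21 cm hyperfine line of neutral hydrogen
REST_FREQ_MHZ = 1420.405751   # the same line expressed as a frequency

def observed_wavelength_cm(z):
    """Wavelength at which 21 cm radiation emitted at redshift z arrives."""
    return REST_WAVELENGTH_CM * (1 + z)

def redshift_at_frequency(f_obs_mhz):
    """Redshift probed when the receiver is tuned to f_obs_mhz."""
    return REST_FREQ_MHZ / f_obs_mhz - 1

# Redshift ~10 ("red-shifted by a factor of 10 or more") arrives near 2.3 m:
print(round(observed_wavelength_cm(10), 1))     # → 232.2
# Sweeping a receiver from 100 to 200 MHz scans redshifts ~13 down to ~6:
print(round(redshift_at_frequency(100.0), 1))   # → 13.2
print(round(redshift_at_frequency(200.0), 1))   # → 6.1
```

Each receiver channel therefore corresponds to a distinct slice of cosmic time, which is how a frequency sweep becomes a 3-D map.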

    “HERA can also tell us a lot about how galaxies form,” Parsons said. “Galaxies are very complex organisms that feed back on themselves, regulating their own star formation and the gas that falls into them, and we don’t really understand how they live, especially at this early time when flowing hydrogen gas ends up as complex structures with spiral arms and black holes in the middle. The epoch of reionization is a bridge between the cosmology that we can theoretically calculate from first principles and the astrophysics we observe today and try to understand.”

    UC Berkeley’s partners in HERA are the University of Washington, UCLA, Arizona State University, the National Radio Astronomical Observatory, the University of Pennsylvania, the Massachusetts Institute of Technology, Brown University, the University of Cambridge in the UK, the Square Kilometer Array in South Africa and the Scuola Normale Superiore in Pisa, Italy.

    Other collaborators are the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, the University of KwaZulu Natal, the University of Western Cape and Rhodes University, all in South Africa, and California State Polytechnic University in Pomona.

    “Astronomers want to know what happened to the universe after it emerged from its so-called ‘dark ages’,” said Rich Barvainis, director of the National Science Foundation program that funds HERA. “HERA will help us answer that question, not by studying the primordial stars and galaxies themselves, but rather by studying how these objects changed the nature of intergalactic space.”

    Searching for a firefly on a searchlight

    The key to detecting these percolating bubbles of ionized gas from the epoch of reionization is a receiver that can detect radio signals from neutral hydrogen a million times fainter than nearby radio noise.

    “The foreground noise, mostly synchrotron emission from electrons spiraling in magnetic fields in our own galaxy, is about a million times stronger than the signal,” DeBoer said. “This is a real problem, because it’s like looking for a firefly in front of an incredibly powerful searchlight. We are trying to see the firefly and filter out the searchlight.”

    Previous experiments, such as the UC Berkeley-led Precision Array Probing the Epoch of Reionization (PAPER) in South Africa and the Murchison Widefield Array (MWA) in Australia, have not been sensitive enough to detect this signal, but with larger dishes and better signal processing, HERA should do the trick.

    “HERA is a unique, next-generation instrument building on the heritage of PAPER,” said Parsons, who helped build PAPER a decade ago when he was a graduate student working with the late UC Berkeley astronomer Donald Backer. “It is on the same site as PAPER, we are using a lot of the same equipment, but importantly we have brought together a lot more collaborators, including a lot of the U.S. team that has been working with MWA.”

    The strategy is to build a hexagonal array of radio dishes that minimizes the noise, such as radio reflections in the dishes and wires, that would obscure the signal. A supercomputer’s worth of field-programmable gate arrays will cross-correlate the signals from the antennas to finely map a 10-degree swath of southern sky centered at a declination of minus-30 degrees. Using a technique adopted from PAPER, they will employ this computer processing power to eliminate the slowly varying noise across the wavelength spectrum – 150-350 centimeters – to reveal the rapidly varying signal from neutral hydrogen as they tune across the radio spectrum.
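The separation being exploited — foregrounds that vary smoothly with frequency versus a 21 cm signal that varies rapidly — can be illustrated with a fit-and-subtract toy model. This is a stand-in for, not a reproduction of, the PAPER/HERA processing; the band, brightness ratio, and injected "signal" are all invented for the demonstration:

```python
import numpy as np

freq = np.linspace(100.0, 200.0, 256)        # MHz, illustrative band

# Foreground: a smooth synchrotron-like power law, about a million
# times brighter than the signal, as the article describes.
foreground = 1e6 * (freq / 150.0) ** -2.5
signal = np.sin(2 * np.pi * freq / 3.0)      # faint, rapidly varying stand-in
measured = foreground + signal

# A power law is a straight line in log-log space; fit that line and
# subtract it, leaving only the rapidly varying component.
coeffs = np.polyfit(np.log10(freq), np.log10(measured), deg=1)
smooth_model = 10 ** np.polyval(coeffs, np.log10(freq))
recovered = measured - smooth_model

# The faint "firefly" survives removal of the "searchlight".
print(np.corrcoef(recovered, signal)[0, 1])  # close to 1
```

The real pipeline is far more elaborate (calibration, instrument response, delay-domain filtering), but the principle is the same: anything smooth across the band is treated as foreground and removed.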

    Astronomers have already discovered hints of reionization, Parsons said. Measurements of the polarization of the cosmic microwave background radiation show that some of the photons emitted at that early time in the universe have been scattered by intervening electrons possibly created by the first stars and galaxies. And galaxy surveys have turned up some very distant galaxies that show attenuation by intervening intergalactic neutral hydrogen, perhaps the last bit remaining before reionization was complete.

    “We have an indication that reionization should have happened, and we are getting hints of when it might have ended, but we don’t have anything telling us what is going on during it,” Parsons added. “That is what we hope to learn with HERA, the actual step-by-step process of how reionization happened.”

    Once astronomers know the reionization process, they can calculate the scattering of radiation from the era of recombination – the cosmic microwave background, or CMB – and remove some of the error that makes it hard to detect the gravitational waves produced by inflation shortly after the Big Bang.

    “There is a lot of cosmology you can do with HERA,” Parsons said. “We have learned so much from the thin shell of the CMB, but here we will be looking at a full three-dimensional space. Something like 80 percent of the observable universe can be mapped using the 21-centimeter line, so this opens up the next generation of cosmology.”

    Parsons and DeBoer compare HERA to the first experiment to detect the cosmic microwave background radiation, the Cosmic Background Explorer, which achieved its goal in 1992 and won for its leaders – George Smoot of UC Berkeley and Lawrence Berkeley National Laboratory, and John Mather of NASA – the 2006 Nobel Prize in Physics.

    “Ultimately, the goal is to get to the point where we are actually making images, just like the CMB images we have seen,” DeBoer said. “But that is really, really hard, and we need to learn a fair bit about what we are looking for and the instruments we need to get there. We hope that what we develop will allow the Square Kilometre Array or another big project to actually make these images and get much more science from this pivotal epoch in our cosmic history.”

    See the full SKA article here.
    See the UC Berkeley press release here.


  • richardmitnick 3:44 pm on June 17, 2016
    Tags: UC Berkeley

    From UC Berkeley: “Breakout: How black hole jets punch out of their galaxies” 

    UC Berkeley


    June 16, 2016
    Robert Sanders

    A simulation of the powerful jets generated by supermassive black holes at the centers of the largest galaxies explains why some burst forth as bright beacons visible across the universe, while others fall apart and never pierce the halo of the galaxy.

    Access mp4 video here.
    New simulations of the jets produced by rotating supermassive black holes in the cores of galaxies show how, with enough power, the corkscrewing fields (white squiggles) can force their way through surrounding gas and drill out of the galaxy, channeling hot gas into the interstellar medium (top). Less powerful jets get stalled inside the galaxy, however, their magnetic fields breaking and dumping hot gas inside and heating up the galaxy. (Simulations by Alexander Tchekhovskoy, UC Berkeley, and Omer Bromberg, Hebrew University)

    This computer-simulated image shows a supermassive black hole at the core of a galaxy. The black region in the center represents the black hole’s event horizon, where no light can escape the massive object’s gravitational grip. The black hole’s powerful gravity distorts space around it like a funhouse mirror. Light from background stars is stretched and smeared as the stars skim by the black hole. Credit: NASA, ESA, and D. Coe, J. Anderson, and R. van der Marel (STScI)

    About 10 percent of all galaxies with active nuclei — all presumed to have supermassive black holes within the central bulge — are observed to have jets of gas spurting in opposite directions from the core. The hot ionized gas is propelled by the twisting magnetic fields of the rotating black hole, which can be as large as several billion suns.

    A 40-year-old puzzle was why some jets are hefty and punch out of the galaxy into intergalactic space, while others are narrow and often fizzle out before reaching the edge of the galaxy. The answer could shed light on how galaxies and their central black holes evolve, since aborted jets are thought to roil the galaxy and slow star formation, while also slowing the infall of gas that has been feeding the voracious black hole. The model could also help astronomers understand other types of jets, such as those produced by individual stars, which we see as gamma-ray bursts or pulsars.

    “Whereas it was rather easy to reproduce the stable jets in simulations, it turned out to be an extreme challenge to explain what causes the jets to fall apart,” said University of California, Berkeley theoretical astrophysicist Alexander Tchekhovskoy, a NASA Einstein postdoctoral fellow, who led the project. “To explain why some jets are unstable, researchers had to resort to explanations such as red giant stars in the jets’ path loading the jets with too much gas and making them heavy and unstable so that the jets fall apart.”

    This false-color image of the radio jet and lobes in the very bright radio galaxy Cygnus A is an example of the powerful jets that can be produced by supermassive black holes at the cores of large galaxies. (Image by R. Perley, C. Carilli & J. Dreher)

    By taking into account the magnetic fields that generate these jets, Tchekhovskoy and colleague Omer Bromberg, a former Lyman Spitzer Jr. postdoctoral fellow at Princeton University, discovered that magnetic instabilities in the jet determine their fate. If the jet is not powerful enough to penetrate the surrounding gas, the jet becomes narrow or collimated, a shape prone to kinking and breaking. When this happens, the hot ionized gas funneled through the magnetic field spews into the galaxy, inflating a hot bubble of gas that generally heats up the galaxy.

    Powerful jets, however, are broader and able to punch through the surrounding gas into the intergalactic medium. The determining factors are the power of the jet and how quickly the gas density drops off with distance, typically dependent on the mass and radius of the galaxy core.

    The simulation, which agrees well with observations, explains what has become known as the Fanaroff-Riley morphological dichotomy of jets, first pointed out by Bernie Fanaroff of South Africa and Julia Riley of the U.K. in 1974.

    “We have shown that a jet can fall apart without any external perturbation, just because of the physics of the jet,” Tchekhovskoy said. He and Bromberg, who is currently at the Hebrew University of Jerusalem in Israel, will publish their simulations on June 17 in the journal Monthly Notices of the Royal Astronomical Society, a publication of Oxford University Press.

    Bendable drills

    The supermassive black hole in the bulging center of these massive galaxies is like a pitted olive spinning around an axle through the hole, Tchekhovskoy said. If you thread a strand of spaghetti through the hole, representing a magnetic field, then the spinning olive will coil the spaghetti like a spring. The spinning, coiled magnetic fields act like a flexible drill trying to penetrate the surrounding gas.

    The black hole at the center of the galaxy M87 produced a weak jet that could not break out of the galaxy, as seen in this radio image from 1989. As in the new computer simulation, stalled jets dump hot gas into giant bubble-like structures that heat up the galaxy. These stalled jets may be part of the black hole feedback mechanism that periodically halts the inflow of gas that feeds the black hole. (VLA/NRAO/NSF image)

    The simulation, based solely on magnetic field interactions with ionized gas particles, shows that if the jet is not powerful enough to punch a hole through the surrounding gas, the magnetic drill bends and, due to the magnetic kink instability, breaks. An example of this type of jet can be seen in the galaxy M87, which hosts one of the closest such jets to Earth, about 50 million light-years away, and a central black hole equal to about 6 billion suns.

    “If I were to jump on top of a jet and fly with it, I would see the jet start to wiggle around because of a kink instability in the magnetic field,” Tchekhovskoy said. “If this wiggling grows faster than the time it takes the gas to reach the tip, then the jet will fall apart. If the instability grows more slowly than the time it takes for gas to go from the base to the tip of the jet, then the jet will stay stable.”
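    The stability criterion Tchekhovskoy describes can be sketched as a comparison of two timescales. This is a toy illustration only, not the paper's magnetohydrodynamic calculation; the jet lengths, speeds and growth times below are invented placeholder values:

```python
# Toy comparison of the two timescales that decide a jet's fate:
# if the kink instability grows faster than gas can travel from the
# base to the tip, the jet falls apart before it gets there.
# All numbers are illustrative placeholders, not values from the paper.

def jet_is_stable(jet_length_ly, gas_speed_ly_per_yr, kink_growth_time_yr):
    """Return True if gas reaches the jet tip before the kink instability grows."""
    travel_time_yr = jet_length_ly / gas_speed_ly_per_yr
    return travel_time_yr < kink_growth_time_yr

# A fast jet outruns the slowly growing instability and stays intact ...
print(jet_is_stable(jet_length_ly=1e5, gas_speed_ly_per_yr=0.9,
                    kink_growth_time_yr=5e5))   # True
# ... while the same jet with a fast-growing kink breaks before reaching its tip.
print(jet_is_stable(jet_length_ly=1e5, gas_speed_ly_per_yr=0.9,
                    kink_growth_time_yr=5e4))   # False
```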

    The jet in the galaxy Cygnus A, located about 600 million light-years from Earth, is an example of powerful jets punching through into intergalactic space.

    Tchekhovskoy argues that the unstable jets contribute to what is called black hole feedback, that is, a reaction from the material around the black hole that tends to slow its intake of gas and thus its growth. Unstable jets deposit a lot of energy within the galaxy that heats up the gas and prevents it from falling into the black hole. Jets and other processes effectively keep the sizes of supermassive black holes below about 10 billion solar masses, though UC Berkeley astronomers recently found black holes with masses near 21 billion solar masses.

    Presumably these jets start and stop, lasting perhaps 10-100 million years, as suggested by images of some galaxies showing more than one jet, one of them old and tattered. Evidently, black holes go through bingeing cycles, interrupted in part by the occasional unstable jet that essentially takes away their food.

    The simulations were run on the Savio computer at UC Berkeley, Darter at the National Institute for Computational Sciences at the University of Tennessee, Knoxville, and Stampede, Maverick and Ranch computers at the Texas Advanced Computing Center at the University of Texas at Austin. The entire simulation took about 500 hours on 2,000 computer cores, the equivalent of 1 million hours on a standard laptop.
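    The quoted compute budget checks out: wall-clock hours times cores gives core-hours, which is what the "one million hours on a standard laptop" comparison assumes (treating a laptop as roughly a single core):

```python
# Sanity check on the quoted compute budget: 500 wall-clock hours
# across 2,000 cores is 1,000,000 core-hours.
wall_clock_hours = 500
cores = 2000
core_hours = wall_clock_hours * cores
print(core_hours)                 # 1000000
print(core_hours / 24 / 365)      # ~114 years on a single-core machine
```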

    The researchers are improving their simulation to incorporate the smaller effects of gravity, buoyancy and the thermal pressure of the interstellar and intergalactic media.

    The work was supported by NASA through Einstein Postdoctoral Fellowship grant number PF3-140115 awarded by the Chandra X-ray Center, operated by the Smithsonian Astrophysical Observatory for NASA under contract NAS8-03060, and the National Science Foundation through an XSEDE computational time allocation TG-AST100040.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Founded in the wake of the gold rush by leaders of the newly established 31st state, the University of California’s flagship campus at Berkeley has become one of the preeminent universities in the world. Its early guiding lights, charged with providing education (both “practical” and “classical”) for the state’s people, gradually established a distinguished faculty (with 22 Nobel laureates to date), a stellar research library, and more than 350 academic programs.

    UC Berkeley Seal

  • richardmitnick 8:57 am on February 13, 2016 Permalink | Reply
    Tags: , , , MyShake, UC Berkeley   

    From livescience: “‘MyShake’ App Turns Your Smartphone into Earthquake Detector” 


    February 12, 2016
    Mindy Weisberger


    Seismologists and app developers are shaking things up with a new app that transforms smartphones into personal earthquake detectors.

    By tapping into a smartphone’s accelerometer — the motion-detection instrument — the free Android app, called MyShake, can pick up and interpret nearby quake activity, estimating the earthquake’s location and magnitude in real-time, and then relaying the information to a central database for seismologists to analyze.

    In time, an established network of users could enable MyShake to be used as an early-warning system, the researchers said.

    UC Berkeley MyShake
    MyShake network

    Crowdsourcing quakes

    Seismic networks worldwide detect earthquakes and convey quake data to scientists around the clock, providing a global picture of the tremors that are part of Earth’s ongoing dynamic processes. But there are areas where the network is thin, which means researchers are missing pieces in the seismic puzzle. However, “citizen-scientists” with smartphones could fill those gaps, according to Richard Allen, leader of the MyShake project and director of the Berkeley Seismological Laboratory in California.

    “As smartphones became more popular and it became easier to write software that would run on smartphones, we realized that we had the potential to use the accelerometer that runs in every smartphone to record earthquakes,” Allen told Live Science.

    How it works

    Accelerometers measure forces related to acceleration: vibration, tilt and movement, and also the static force of gravity’s pull. In smartphones, accelerometers detect changes in the device’s orientation, allowing the phone to know exactly which end is up and to adjust visual displays to correspond to the direction it’s facing.

    Fitness apps for smartphones use accelerometers to pinpoint specific changes in motion in order to calculate the number of steps you take, for example. And the MyShake app is designed to recognize when a smartphone’s accelerometer picks up the signature shaking of an earthquake, Allen said, which is different from other types of vibrating motion, or “everyday shaking.”

    In fact, the earthquake-detection engine in MyShake is designed to recognize an earthquake’s vibration profile much like a fitness app recognizes steps, according to Allen.

    “It’s about looking at the amplitude and the frequency content of the earthquake,” Allen said, “and it’s quite different from the amplitude and frequency content of most everyday shakes. It’s very low-frequency energy and the amplitude is not as big as the amplitude for most everyday activities.”

    In other words, the difference between the highs and lows of the motion generated by an earthquake are smaller than the range you’d find in other types of daily movement, he said.
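    The distinction Allen describes, more low-frequency energy and a smaller amplitude range for quakes than for everyday motion, can be sketched as a toy spectral check. MyShake's real classifier is far more sophisticated; the signals, sample rate and thresholds below are invented for illustration:

```python
import numpy as np

# Toy illustration of the idea Allen describes: earthquake shaking
# concentrates energy at low frequencies, unlike everyday motion such
# as walking. Thresholds are invented and are NOT MyShake's classifier.
FS = 50  # assumed sample rate in Hz for a phone accelerometer

def looks_like_quake(accel, low_cut_hz=5.0, low_energy_frac=0.7):
    """True if most of the signal's spectral energy sits below low_cut_hz."""
    spectrum = np.abs(np.fft.rfft(accel - accel.mean())) ** 2
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / FS)
    low_energy = spectrum[freqs < low_cut_hz].sum()
    return low_energy / spectrum.sum() > low_energy_frac

t = np.arange(0, 10, 1.0 / FS)
quake = 0.3 * np.sin(2 * np.pi * 1.5 * t)   # slow 1.5 Hz ground rocking
steps = 1.0 * np.sin(2 * np.pi * 12.0 * t)  # brisk 12 Hz footfalls
print(looks_like_quake(quake))  # True
print(looks_like_quake(steps))  # False
```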

    Quake, rattle and roll

    When a smartphone’s MyShake app detects an earthquake, it instantly sends an alert to a central processing site. A network detection algorithm is activated by incoming data from multiple phones in the same area, to “declare” an earthquake, identify its location and estimate its magnitude, Allen said.
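    The network-level "declare" step can be sketched as a simple quorum rule: an event is declared only when enough phones in the same area trigger within a short window. This is an assumed simplification of the real detection algorithm, which also weighs location and waveform consistency; region names and thresholds are invented:

```python
from collections import defaultdict

# Simplified quorum-style declaration: a quake is declared for a region
# only when several phones there trigger within a short time window.
# The real MyShake network algorithm is more elaborate; this is a sketch.

def declare_events(triggers, min_phones=4, window_s=10.0):
    """triggers: list of (timestamp_s, region_id). Returns declared regions."""
    by_region = defaultdict(list)
    for ts, region in triggers:
        by_region[region].append(ts)
    declared = []
    for region, times in by_region.items():
        times.sort()
        # slide a window over the sorted trigger times
        for i in range(len(times)):
            j = i
            while j < len(times) and times[j] - times[i] <= window_s:
                j += 1
            if j - i >= min_phones:
                declared.append(region)
                break
    return declared

triggers = [(0.0, "berkeley"), (1.2, "berkeley"), (2.5, "berkeley"),
            (3.1, "berkeley"), (50.0, "oakland")]  # one lone phone in oakland
print(declare_events(triggers))  # ['berkeley']
```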

    For now, the app will only collect and transmit data to the central processor. But the end goal, Allen said, is for future versions of the app to send warnings back to individual users.

    An iPhone version of the app is also planned for MyShake, according to Allen.

    For seismologists, the more data they can gather about earthquakes, the better, Allen said. A bigger data pool means an improved understanding of quake behavior, which could help experts design better early-warning systems and safety protocols, things that are especially critical in urban areas prone to frequent quake activity. With 2.6 billion smartphones currently in circulation worldwide and an anticipated 6 billion by 2020, according to an Ericsson Mobility Report released in 2015, a global network of handheld seismic detectors could go a long way toward keeping people safe by improving quake preparation and response.

    The findings were published online today (Feb. 12) in the journal Science Advances, and the MyShake app is available for download at myshake.berkeley.edu.


  • richardmitnick 5:32 pm on February 4, 2016 Permalink | Reply
    Tags: , Exoskeletal help for permanently injured people, UC Berkeley   

    From Berkeley: “A new-generation exoskeleton helps the paralyzed to walk” 

    UC Berkeley

    February 3, 2016
    No writer credit found

    Until recently, being paralyzed from the waist down meant using a wheelchair to get around. And although daily life is more accessible to wheelchair users, they still face physical and social limitations. But UC Berkeley’s Robotics and Human Engineering Laboratory has been working to change that.

    The robotics lab, a team of graduate students led by mechanical engineering professor Homayoon Kazerooni, has been working for more than a decade to create robotic exoskeletons that allow those with limited mobility to walk again.

    New exoskeleton
    Steven Sanchez, who was paralyzed from the waist down after a BMX accident, wears SuitX’s Phoenix. “If I had this it would change a lot of things,” he says. (Photo courtesy of SuitX)

    This week, a new, lighter and more agile exoskeleton, for which the Kazerooni lab developed the original technology, was unveiled: the Phoenix, by SuitX, a company spun off from the robotics lab. Kazerooni is its founder and CEO.

    The Phoenix is lightweight, has two motors at the hips and electrically controlled tension settings that tighten when the wearer is standing and swing freely when they’re walking. Users can control the movement of each leg and walk up to 1.1 miles per hour by pushing buttons integrated into a pair of crutches. It’s powered for up to eight hours by a battery pack worn in a backpack.

    “We can’t really fix their disease,” says Kazerooni. “We can’t fix their injury. But what it would do is postpone the secondary injuries due to sitting. It gives a better quality of life.”

    Kazerooni and his team have developed a series of exoskeletons over the years. Their work in the field began in 2000 with a project funded by the Defense Advanced Research Projects Agency to create a device, now called the Berkeley Lower Extremity Exoskeleton (BLEEX), that could help people carry heavier loads for longer. At that time, Kazerooni also realized the potential use for exoskeletons in the medical field, particularly as an alternative to wheelchairs.

    The team began developing new devices to restore mobility for people who had become paraplegic.

    In 2011, they made the exoskeleton that helped Berkeley senior Austin Whitney, paralyzed from the waist down in a 2007 car accident, make an epic walk across the graduation stage to receive his diploma. Soon after, the Austin Project was created in honor of Whitney, with a goal of finding new technologies to create reliable, inexpensive exoskeleton systems for everyday personal use.

    Today, the Phoenix is one of the lightest and most accessible exoskeletons to hit the market. It can be adjusted to fit varied weights, heights and leg sizes and can be used for a range of mobility hindrances. And, although far from inexpensive at $40,000, it’s about half the cost of other exoskeletons that help restore mobility.

    Read more about SuitX’s Phoenix suit in the MIT Technology Review.


  • richardmitnick 3:16 pm on January 20, 2016 Permalink | Reply
    Tags: , , , UC Berkeley   

    From Berkeley: “Advance improves cutting and pasting with CRISPR-Cas9 gene editing” 

    UC Berkeley

    January 20, 2016
    Robert Sanders

    A view of the Cas9 protein (red and blue) bound to a double strand of DNA (purple and grey). After both strands are cut, one DNA strand (purple dots) is free and able to bind with a piece of DNA to be inserted at the break. This behavior can be utilized to significantly boost the efficiency of gene editing. Image by Christopher Richardson, UC Berkeley, based on structure solved by Martin Jinek’s lab.

    UC Berkeley researchers have made a major improvement in CRISPR-Cas9 technology that achieves an unprecedented success rate of 60 percent when replacing a short stretch of DNA with another.

    The improved technique is especially useful when trying to repair genetic mutations that cause hereditary diseases, such as sickle cell disease or severe combined immune deficiency. The technique allows researchers to patch an abnormal section of DNA with the normal sequence, potentially correcting the defect, and is already working in cell culture to improve ongoing efforts to repair defective genes.

    “The exciting thing about CRISPR-Cas9 is the promise of fixing genes in place in our genome, but the efficiency for that can be very low,” said Jacob Corn, scientific director of the Innovative Genomics Initiative at UC Berkeley, a group that focuses on next-generation genome editing and gene regulation for lab and clinical application. “If you think of gene editing as a word processor, we know how to cut, but we need a more efficient way to paste and glue a new piece of DNA where we make the cut.”

    “In cases where you want to change very small regions of DNA, up to 30 base pairs, this technique would be extremely effective,” said first author Christopher Richardson, an IGI postdoc.

    Problems in short sections of DNA, including single base-pair mutations, are typical of many genetic diseases. Base pairs are the individual building blocks of DNA, strung end-to-end in a strand that coils around a complementary strand to make the well-known helical, double-stranded DNA molecule.

    Richardson, Corn and their IGI colleagues describe the new technique in the Jan. 21 issue of the journal Nature Biotechnology.

    Grabbing onto a loose strand

    Richardson invented the new approach after finding that the Cas9 protein, which does the actual DNA cutting, remains attached to the chromosome for up to six hours, long after it has sliced through the double-stranded DNA. Richardson looked closely at the Cas9 protein bound to the two strands of DNA and discovered that while the protein hangs onto three of the cut ends, one of the ends remains free.

    Jennifer Doudna explains how CRISPR-Cas9 edits genes. Video by Roxanne Makasdjian and Stephen McNally, UC Berkeley.
    Watch/download the mp4 video here.

    When Cas9 cuts DNA, repair systems in the cell can grab a piece of complementary DNA, called a template, to repair the cut. Researchers can add templates containing changes that alter existing sequences in the genome — for example, correcting a disease-causing mutation.

    Richardson reasoned that bringing the substitute template directly to the site of the cut would improve the patching efficiency, and constructed a piece of DNA that matches the free DNA end and carries the genetic sequence to be inserted at the other end. The technique worked extremely well, allowing successful repair of a mutation with up to 60 percent efficiency.
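    Richardson's template design, one end complementary to the strand Cas9 leaves free, the other end carrying the new sequence, can be illustrated with a toy sequence utility. The sequences below are invented examples; real templates are designed against the actual genomic locus and cut geometry:

```python
# Toy illustration of the repair-template design: one arm of the donor
# DNA anneals to the free strand left by Cas9, and the rest carries the
# sequence to be pasted in. Sequences here are invented examples.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq):
    """Complement each base, then reverse, giving the annealing arm."""
    return seq.translate(COMPLEMENT)[::-1]

def design_template(free_strand_end, insert_seq):
    """Arm matching the free DNA end, followed by the edit to insert."""
    homology_arm = reverse_complement(free_strand_end)
    return homology_arm + insert_seq

free_end = "ATGCCGTA"  # hypothetical strand Cas9 leaves unbound
fix = "GGGTTT"         # hypothetical corrected sequence to paste in
print(design_template(free_end, fix))  # TACGGCATGGGTTT
```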

    “Our data indicate that Cas9 breaks could be different at a molecular level from breaks generated by other targeted nucleases, such as TALENs and zinc-finger nucleases, which suggests that strategies like the ones we are using can give you more efficient repair of Cas9 breaks,” Richardson said.

    The researchers also showed that variants of the Cas9 protein that bind DNA but do not cut also can successfully paste a new DNA sequence at the binding site, possibly by forming a “bubble” structure on the target DNA that also acts to attract the repair template. Gene editing using Cas9 without genome cutting could be safer than typical gene editing by removing the danger of off-target cutting in the genome, Corn said.

    Co-authors with Richardson and Corn are IGI researchers Jordan Ray, Mark DeWitt and Gemma Curie. The work was funded by the Li Ka Shing Foundation.


  • richardmitnick 4:40 pm on December 23, 2015 Permalink | Reply
    Tags: , , UC Berkeley   

    From Berkeley: “Engineers demo first processor that uses light for ultrafast communications” 

    UC Berkeley

    December 23, 2015
    Sarah Yang

    The electronic-photonic processor chip communicates to the outside world directly using light, illustrated here. The photo shows the packaged microchip under illumination, revealing the chip’s primary features. (Image by Glenn J. Asakawa, University of Colorado, Glenn.Asakawa@colorado.edu)

    Engineers have successfully married electrons and photons within a single-chip microprocessor, a landmark development that opens the door to ultrafast, low-power data crunching.

    The researchers packed two processor cores with more than 70 million transistors and 850 photonic components onto a 3-by-6-millimeter chip. They fabricated the microprocessor in a foundry that mass-produces high-performance computer chips, proving that their design can be easily and quickly scaled up for commercial production.

    The new chip, described in a paper to be published Dec. 24 in the print issue of the journal Nature, marks the next step in the evolution of fiber optic communication technology by integrating into a microprocessor the photonic interconnects, or inputs and outputs (I/O), needed to talk to other chips.

    “This is a milestone. It’s the first processor that can use light to communicate with the external world,” said Vladimir Stojanović, an associate professor of electrical engineering and computer sciences at the University of California, Berkeley, who led the development of the chip. “No other processor has the photonic I/O in the chip.”

    Stojanović and fellow UC Berkeley professor Krste Asanović teamed up with Rajeev Ram at the Massachusetts Institute of Technology and Miloš Popović at the University of Colorado Boulder to develop the new microprocessor.

    “This is the first time we’ve put a system together at such scale, and have it actually do something useful, like run a program,” said Asanović, who helped develop the free and open architecture called RISC-V (reduced instruction set computer), used by the processor.

    Greater bandwidth with less power

    Compared with electrical wires, fiber optics support greater bandwidth, carrying more data at higher speeds over greater distances with less energy. While advances in optical communication technology have dramatically improved data transfers between computers, bringing photonics into the computer chips themselves had been difficult.

    The electronic-photonic processor chip naturally illuminated by red and green bands of light. (Image by Glenn J. Asakawa, University of Colorado, Glenn.Asakawa@colorado.edu)

    That’s because no one until now had figured out how to integrate photonic devices into the same complex and expensive fabrication processes used to produce computer chips without changing the process itself. Keeping the process unchanged is key, since modifications would increase manufacturing costs or risk failure of the fabricated transistors.

    The researchers verified the functionality of the chip with the photonic interconnects by using it to run various computer programs, requiring it to send and receive instructions and data to and from memory. They showed that the chip had a bandwidth density of 300 gigabits per second per square millimeter, about 10 to 50 times greater than packaged electrical-only microprocessors currently on the market.

    The photonic I/O on the chip is also energy-efficient, using only 1.3 picojoules per bit, equivalent to consuming 1.3 watts of power to transmit a terabit of data per second. In the experiments, the data was sent to a receiver 10 meters away and back.

    “The advantage with optical is that with the same amount of power, you can go a few centimeters, a few meters or a few kilometers,” said study co-lead author Chen Sun, a recent UC Berkeley Ph.D. graduate from Stojanović’s lab at the Berkeley Wireless Research Center. “For high-speed electrical links, 1 meter is about the limit before you need repeaters to regenerate the electrical signal, and that quickly increases the amount of power needed. For an electrical signal to travel 1 kilometer, you’d need thousands of picojoules for each bit.”
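    The energy figures above are straightforward to check: power equals energy per bit times bit rate, so 1.3 picojoules per bit at a terabit per second is 1.3 watts. The 5,000 pJ/bit value below is an assumed round number standing in for Sun's "thousands of picojoules" for a 1-kilometer electrical link:

```python
# Power = (energy per bit) x (bit rate); checks the article's figures.
def link_power_watts(picojoules_per_bit, gigabits_per_s):
    return picojoules_per_bit * 1e-12 * gigabits_per_s * 1e9

optical = link_power_watts(1.3, 1000)          # photonic I/O at 1 Tb/s
electrical_1km = link_power_watts(5000, 1000)  # assumed "thousands of pJ/bit"
print(optical)         # 1.3
print(electrical_1km)  # 5000.0  (kilowatts for the same data rate)
```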

    The achievement opens the door to a new era of bandwidth-hungry applications. One near-term application for this technology is to make data centers more green. According to the Natural Resources Defense Council, data centers consumed about 91 billion kilowatt-hours of electricity in 2013, about 2 percent of the total electricity consumed in the United States, and the appetite for power is growing exponentially.

    This research has already spun off two startups this year with applications in data centers in mind. SiFive is commercializing the RISC-V processors, while Ayar Labs is focusing on photonic interconnects. Earlier this year, Ayar Labs – under its previous company name of OptiBit – was awarded the MIT Clean Energy Prize. Ayar Labs is getting further traction through the CITRIS Foundry startup incubator at UC Berkeley.

    The advance is timely, coming as world leaders emerge from the COP21 United Nations climate talks with new pledges to limit global warming.

    Further down the road, this research could be used in applications such as LIDAR, the light radar technology used to guide self-driving vehicles and the eyes of a robot; brain ultrasound imaging; and new environmental biosensors.

    ‘Fiat lux’ on a chip

    The researchers came up with a number of key innovations to harness the power of light within the chip.

    The illumination and camera create a rainbow-colored pattern across the electronic-photonic processor chip. (Image by Milos Popović, University of Colorado, milos.popovic@colorado.edu)

    Each of the key photonic I/O components – such as a ring modulator, photodetector and a vertical grating coupler – serves to control and guide the light waves on the chip, but the design had to conform to the constraints of a process originally thought to be hostile to photonic components. To enable light to move through the chip with minimal loss, for instance, the researchers used the silicon body of the transistor as a waveguide for the light. They did this by using available masks in the fabrication process to manipulate doping, the process used to form different parts of transistors.

    After getting the light onto the chip, the researchers needed to find a way to control it so that it can carry bits of data. They designed a silicon ring with p-n doped junction spokes next to the silicon waveguide to enable fast and low-energy modulation of light.

    Using the silicon-germanium parts of a modern transistor – an existing part of the semiconductor manufacturing process – to build a photodetector took advantage of germanium’s ability to absorb light and convert it into electricity.

    A vertical grating coupler that leverages existing poly-silicon and silicon layers in innovative ways was used to connect the chip to the external world, directing the light in the waveguide up and off the chip. The researchers integrated electronic components tightly with these photonic devices to enable stable operation in a hostile chip environment.

    The authors emphasized that these adaptations all worked within the parameters of existing microprocessor manufacturing systems, and that it will not be difficult to optimize the components to further improve their chip’s performance.

    Other co-lead authors on this paper are Mark Wade, Ph.D. student at the University of Colorado, Boulder; Yunsup Lee, a Ph.D. candidate at UC Berkeley; and Jason Orcutt, an MIT graduate who now works at the IBM Research Center in New York.

    The Defense Advanced Research Projects Agency (DARPA) helped support this work.


  • richardmitnick 12:07 pm on December 20, 2015 Permalink | Reply
    Tags: , , UC Berkeley   

    From UC Berkeley: “Earth’s magnetic field could flip within a human lifetime” 2014 but very informative 

    UC Berkeley

    October 14, 2014
    Robert Sanders

    Imagine the world waking up one morning to discover that all compasses pointed south instead of north.

    It’s not as bizarre as it sounds. Earth’s magnetic field has flipped – though not overnight – many times throughout the planet’s history. Its dipole magnetic field, like that of a bar magnet, remains about the same intensity for thousands to millions of years, but for incompletely known reasons it occasionally weakens and, presumably over a few thousand years, reverses direction.

    Left to right, Biaggio Giaccio, Gianluca Sotilli, Courtney Sprain and Sebastien Nomade sitting next to an outcrop in the Sulmona basin of the Apennine Mountains that contains the Matuyama-Brunhes magnetic reversal. A layer of volcanic ash interbedded with the lake sediments can be seen above their heads. Sotilli and Sprain are pointing to the sediment layer in which the magnetic reversal occurred. (Photo by Paul Renne)

    Now, a new study by a team of scientists from Italy, France, Columbia University and the University of California, Berkeley, demonstrates that the last magnetic reversal 786,000 years ago actually happened very quickly, in less than 100 years – roughly a human lifetime.

    “It’s amazing how rapidly we see that reversal,” said UC Berkeley graduate student Courtney Sprain. “The paleomagnetic data are very well done. This is one of the best records we have so far of what happens during a reversal and how quickly these reversals can happen.”

    Sprain and Paul Renne, director of the Berkeley Geochronology Center and a UC Berkeley professor-in-residence of earth and planetary science, are coauthors of the study, which will be published in the November issue of Geophysical Journal International and is now available online.

    Flip could affect electrical grid, cancer rates

    The discovery comes as new evidence indicates that the intensity of Earth’s magnetic field is decreasing 10 times faster than normal, leading some geophysicists to predict a reversal within a few thousand years.

    Though a magnetic reversal is a major planet-wide event driven by convection in Earth’s iron core, there are no documented catastrophes associated with past reversals, despite much searching in the geologic and biologic record. Today, however, such a reversal could potentially wreak havoc with our electrical grid, generating currents that might take it down.

    And since Earth’s magnetic field protects life from energetic particles from the sun and cosmic rays, both of which can cause genetic mutations, a weakening or temporary loss of the field before a permanent reversal could increase cancer rates. The danger to life would be even greater if flips were preceded by long periods of unstable magnetic behavior.

    “We should be thinking more about what the biologic effects would be,” Renne said.

    Dating ash deposits from windward volcanoes

    The new finding is based on measurements of the magnetic field alignment in layers of ancient lake sediments now exposed in the Sulmona basin of the Apennine Mountains east of Rome, Italy. The lake sediments are interbedded with ash layers erupted from the Roman volcanic province, a large area of volcanoes upwind of the former lake that includes periodically erupting volcanoes near Sabatini, Vesuvius and the Alban Hills.

    Leonardo Sagnotti, standing, and coauthor Giancarlo Scardia collecting a sample for paleomagnetic analysis.

    Italian researchers led by Leonardo Sagnotti of Rome’s National Institute of Geophysics and Volcanology measured the magnetic field directions frozen into the sediments as they accumulated at the bottom of the ancient lake.

    Sprain and Renne used argon-argon dating, a method widely used to determine the ages of rocks, whether they’re thousands or billions of years old, to determine the age of ash layers above and below the sediment layer recording the last reversal. These dates were confirmed by their colleague and former UC Berkeley postdoctoral fellow Sebastien Nomade of the Laboratory of Environmental and Climate Sciences in Gif-Sur-Yvette, France.

    Because the lake sediments were deposited at a high and steady rate over a 10,000-year period, the team was able to interpolate the date of the layer showing the magnetic reversal, called the Matuyama-Brunhes transition, at approximately 786,000 years ago. This date is far more precise than that from previous studies, which placed the reversal between 770,000 and 795,000 years ago.
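    The interpolation step can be sketched as a linear age-depth model: with a steady sedimentation rate, a layer's age scales with its depth between the two dated ash beds. The depths and bracketing ages below are invented placeholders, not the study's measurements:

```python
# Linear age-depth interpolation: at a steady sedimentation rate, the
# age of a layer between two dated ash beds scales with its depth.
# Depths and ages are illustrative placeholders, not the study's data.

def interpolate_age(depth, depth_upper, age_upper, depth_lower, age_lower):
    """Ash beds bracket the reversal layer; ages in years, depths in meters."""
    frac = (depth - depth_upper) / (depth_lower - depth_upper)
    return age_upper + frac * (age_lower - age_upper)

# A reversal layer midway between ash beds dated 781,000 and 791,000 years
# would interpolate to 786,000 years:
print(interpolate_age(depth=10.5, depth_upper=10.0, age_upper=781_000,
                      depth_lower=11.0, age_lower=791_000))  # 786000.0
```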

    “What’s incredible is that you go from reverse polarity to a field that is normal with essentially nothing in between, which means it had to have happened very quickly, probably in less than 100 years,” said Renne. “We don’t know whether the next reversal will occur as suddenly as this one did, but we also don’t know that it won’t.”

    Unstable magnetic field preceded 180-degree flip

    Whether or not the new finding spells trouble for modern civilization, it likely will help researchers understand how and why Earth’s magnetic field episodically reverses polarity, Renne said.

    The ‘north pole’ — that is, the direction of magnetic north — was reversed until about 786,000 years ago. This map shows how, starting about 789,000 years ago, the north pole wandered around Antarctica for several thousand years before flipping 786,000 years ago to the orientation we know today, with the pole somewhere in the Arctic.

    The magnetic record the Italian-led team obtained shows that the sudden 180-degree flip of the field was preceded by a period of instability that spanned more than 6,000 years. The instability included two intervals of low magnetic field strength that lasted about 2,000 years each. Rapid changes in field orientations may have occurred within the first interval of low strength. The full magnetic polarity reversal – that is, the final and very rapid flip to what the field is today – happened toward the end of the most recent interval of low field strength.

    Renne is continuing his collaboration with the Italian-French team to correlate the lake record with past climate change.

    Renne and Sprain’s work at the Berkeley Geochronology Center was supported by the Ann and Gordon Getty Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Founded in the wake of the gold rush by leaders of the newly established 31st state, the University of California’s flagship campus at Berkeley has become one of the preeminent universities in the world. Its early guiding lights, charged with providing education (both “practical” and “classical”) for the state’s people, gradually established a distinguished faculty (with 22 Nobel laureates to date), a stellar research library, and more than 350 academic programs.

    UC Berkeley Seal

  • richardmitnick 6:55 pm on December 17, 2015 Permalink | Reply
    Tags: , , UC Berkeley   

    From UC Berkeley: “Seeing Through the Big Data Fog” 

    UC Berkeley

    December 14, 2015
    Wallace Ravven

    A neuroscientist studies how stress affects the brain’s ability to form new memories. Across the campus, another researcher looks for telltale signs of distant planets in a sliver of sky. What each of them seeks may lie hidden in an avalanche of data.

    Joe Hellerstein and his students developed a new programming model for distributed computing, which MIT Technology Review named one of the 10 technologies “most likely to change our world.”

    The same is true in industry, where data must be diced, sliced and analyzed to identify changes in customer behavior or the promise of new fabrication techniques.

    Working the data so that it can yield to analysis regularly runs into a bottleneck — a human bottleneck, says Berkeley computer science professor Joe Hellerstein.

    In 2011, Sean Kandel, a grad student working with Hellerstein and Stanford computer scientist Jeffrey Heer, interviewed three dozen analysts at 25 companies in different industries to ask how they spent their time and what their “pain points” were, as Hellerstein puts it.

    “It became very clear that the task of wrangling data takes up the lion’s share of their time,” Hellerstein says. “People come at data differently. They name data differently, or it may be incomplete. You have to sort this out. You find oddball data, and you don’t know if it was input incorrectly or if it’s a meaningful outlier. All this precedes analysis. It’s very tedious.”
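Two of the chores Hellerstein describes — reconciling inconsistently named fields and flagging oddball values for a human to judge — can be sketched in a few lines. The column aliases and data below are invented for illustration; production tools like Trifacta do far more:

```python
from statistics import mean, stdev

# Hypothetical aliases mapping inconsistent column names to one schema.
ALIASES = {"rev": "revenue", "Revenue": "revenue", "cust": "customer"}

def normalize(record):
    """Rename each field to its canonical name, if an alias is known."""
    return {ALIASES.get(k, k): v for k, v in record.items()}

def flag_outliers(values, z=3.0):
    """Mark values more than z sample standard deviations from the mean,
    so an analyst can decide: input error, or meaningful outlier?"""
    m, s = mean(values), stdev(values)
    return [abs(v - m) > z * s for v in values]

# Three records that all mean "revenue" but spell it differently:
rows = [normalize(r) for r in [{"rev": 10}, {"Revenue": 12}, {"revenue": 11}]]
```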

    Hellerstein, Heer and Kandel devised a software program to refine and speed the process. They called it, reasonably enough, Data Wrangler, and made it freely available online. Data Wrangler became the core of Trifacta, a startup they founded in 2012.

    Trifacta provides a platform to efficiently convert raw data into more structured formats for analysis. Its flagship product for data wrangling enables data analysts to easily transform data from messy traces of the real world into structured tables and charts that can reveal unsuspected patterns, or suggest new directions for analysis.

    Trifacta was quickly adopted by dozens of companies, from Linkedin to Lockheed Martin, and typically provides a major productivity gain.

    “What used to take weeks suddenly takes minutes,” Hellerstein says. “So you can experiment a great deal more with the data. This was far and away the most useful piece of research that I have been involved in.”

    In 2014, CRN, a high-profile communications technology magazine, placed Trifacta on its short list of The 10 Coolest Big Data Products.

    Joe Hellerstein and his postdoc Eugene Wu worked on designing a high-level language for crafting interactive visualizations of data. Wu is now a professor at Columbia University. Photo: Peg Skorpinski

    GoPro, the company that makes wearable video recorders, was an early Trifacta client. On YouTube, GoPro videos run the gamut from a sky diver’s death-defying leap to Kama, the surfing pig. (He prefers three- to four-foot waves.)

    After sales of its recorders took off, GoPro moved into developing media software and other online services for customers. The company was soon inundated with coveted consumer data from devices, retail sales, social media and other sources.

    GoPro built a data science team, which brought in Trifacta to clean up the data and present it in an intuitive and accessible format, so the less techy business people could use it to tailor services to customers and offer new products.

    Hellerstein’s research also targets software developers who build Big Data systems — systems that may harness hundreds or thousands of computers to do their work. These “distributed computing” platforms, which also form the foundation of Cloud Computing, create major new hurdles for software engineering.

    Code for a single computer is an ordered list of instructions, and most programming languages were designed for simple, orderly computing on a single machine.

    With a distributed system, Hellerstein says, “If you force order, the machines spend all their time coordinating, and progress is limited by the slowest machine. Working around this with a traditional programming language is incredibly hard, and typically leads to all kinds of tricky bugs and design flaws.”

    With his students, he launched the BOOM (Berkeley Orders of Magnitude) project to develop a new programming model for distributed computers that helps programmers avoid specifying the steps of a computation in a particular order. Instead, it focuses on the information that the program must manage, and the way that information flows through machines and tasks.

    “The main result of the BOOM project is a ‘disorderly’ programming language called Bloom, which has enabled us to write complex distributed programs in simple, intuitive ways — with tens or hundreds of times less code than traditional languages,” Hellerstein says.
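Bloom itself is its own language, but the core idea — that a program built from merges that are commutative, associative, and idempotent (like set union) reaches the same final state no matter what order messages arrive in, so no coordination is needed — can be illustrated with a toy Python sketch (this is not Bloom code):

```python
import itertools
import functools

def merge(state, delivery):
    """Set union: commutative, associative, and idempotent, so the
    result is independent of delivery order."""
    return state | delivery

# Facts contributed by three hypothetical machines, possibly delivered
# in any order:
deliveries = [{"a"}, {"b"}, {"a", "c"}]

# Fold every possible delivery order into a final state and collect
# the distinct outcomes -- there should be exactly one.
results = {
    frozenset(functools.reduce(merge, perm, set()))
    for perm in itertools.permutations(deliveries)
}
```

Contrast this with an order-sensitive operation like “keep the last value written,” where different arrival orders yield different states and machines must coordinate.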

    In 2010, Bloom was recognized by MIT Technology Review as one of the 10 technologies “most likely to change our world.”

    Hellerstein has since used it in his courses on “Programming the Cloud” at Berkeley. It has been adopted by a number of research groups and forms the basis of a startup company in the Bay Area called Eve that Hellerstein advises.

    As he describes his work to ease data wrangling and speed cloud programming, Hellerstein turns a small metal hammer in his hands. “It’s true,” he says. “I do like to tinker.” Of course, he does way more than tinker. He’s developing better tools for the trade.

    See the full article here.

