Tagged: LBL

  • richardmitnick 8:38 am on August 24, 2016
    Tags: LBL, Oceanus   

    From LBL: “Aboard the Oceanus: Motoring back home”

    Berkeley Logo

    Berkeley Lab

    August 22, 2016
    No writer credit


    A group of us took turns hand-spooling 700 meters of cable back onto this reel as part of the packing up process. (Photo: Sarah Yang)

    We are nearing the end of this 10-day research trip, and as I’m writing, people are bustling around me in the process of deconstructing the labs they set up and packing up their gear. Researchers have made backup copies of the data collected from the various tests conducted, the Carbon Flux Explorers are tucked away in their crates, 700 meters of cable were unspooled from the ship’s winch back onto a reel by hand, and filtering and processing stations – including the “Bubble” – have been taken down.

    We’re moving along at about 10.5 to 11 knots, which should get us into San Francisco by 6:30 a.m. tomorrow (Tuesday, Aug. 23). We’ll have to offload quickly since the next research team is set to arrive the following day.

    Over the course of this trip, the researchers have profiled more than 150 kilometers of the ocean’s water column through 26 CTD deployments. The Carbon Flux Explorers collectively gathered more than eight days of data.

    Of this expedition’s 10-day allotment, there were 6.5 days when experiments were run, equipment and robots were deployed and recovered, and samples were processed. The rest of the time was spent in transit to and from stations. Altogether we will have traveled about 750 nautical miles by the time we dock in San Francisco.
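    Those figures are easy to sanity-check: a knot is one nautical mile per hour, so transit time is just distance divided by speed. A minimal sketch (the helper name and sample values are illustrative only):

```python
def transit_hours(distance_nm: float, speed_knots: float) -> float:
    """Hours to cover a distance at constant speed (1 knot = 1 nautical mile per hour)."""
    return distance_nm / speed_knots

# At the quoted 10.5-11 knots, the trip's ~750 nautical miles
# corresponds to roughly 68-71 hours of motoring.
fast = transit_hours(750, 11.0)   # ~68.2 hours
slow = transit_hours(750, 10.5)   # ~71.4 hours
print(f"{fast:.1f}-{slow:.1f} hours")
```

    With 6.5 of the 10 days spent on station, 3.5 days (84 hours) remained for transit, which is consistent with these figures.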

    Ship time on the Oceanus runs $25,000 per day. With all of the work that went into collecting the particles from the ocean, including the around-the-clock hours from numerous students and engineers before departure, I am confident that the particles and plankton collected are worth more than their weight in gold.

    “We’ve accomplished what we set out to do,” said Jim Bishop. “We had an equal number of deployments and recoveries, and I’m thankful for that. We also wanted to torture test the Carbon Flux Explorers and the various sensors used to profile the water. We collected samples from near-shore and oligotrophic waters at various depths. And we gave students invaluable field experience that they will be able to carry with them to wherever their careers take them.”

    As is tradition, the cook on the Oceanus treated the crew and research team to a festive meal to celebrate the last full day of the voyage. For lunch today, he served us steak and lobster, and he threw in some crab legs for good measure.

    Now that we’re heading back, we should soon reappear on the vessel tracking map, which temporarily lost our signal when we ventured far offshore.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

  • richardmitnick 4:09 pm on August 18, 2016
    Tags: A New Way to Display the 3-D Structure of Molecules, LBL

    From LBNL: “A New Way to Display the 3-D Structure of Molecules” 

    August 18, 2016
    Glenn Roberts Jr.

    Mirror, mirror: This rendering shows opposite configurations in the molecular structure of a plant hormone called jasmonic acid (gray and red) that are bound to nanostructures (gold and blue) called MOFs, or metal-organic frameworks. (Credit: S. Lee, E. Kapustin, O. Yaghi/Berkeley Lab and UC Berkeley)

    Researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley have created a sort of nanoscale display case that enables new atomic-scale views of hard-to-study chemical and biological samples.

    Their work, published online Aug. 18 in the journal Science, could help to reveal new structural details for a range of challenging molecules—including complex chemical compounds and potentially new drugs—by stabilizing them inside sturdy structures known as metal-organic frameworks (MOFs).

    The researchers introduced a series of different molecules that were chemically bound inside these porous MOFs, each measuring about 100 millionths of a meter across, and then used X-ray techniques to determine the precise molecular structure of the samples inside the MOFs.

    The samples ranged from a simple alcohol to a complex plant hormone, and the new method, dubbed “CAL” for covalent alignment (the molecules form a type of chemical bond known as a covalent bond in the MOFs), enables researchers to determine the complete structure of a molecule from a single MOF crystal that contains the sample molecules in its pores.

    The MOFs in the study, which are identical and are easy to manufacture in large numbers, provided a sort of backbone for the sample molecules that held them still for the X-ray studies—the molecules otherwise can be wobbly and difficult to stabilize. The researchers prepared the samples by dipping the MOFs into solutions containing different molecular mixes and then heating them until they crystallized.

    “We wanted to demonstrate that any of these molecules, no matter how complex, can be incorporated and their structure determined inside the MOFs,” said Omar Yaghi, a materials scientist at Berkeley Lab and chemistry professor at UC Berkeley who led the research.

    This illustration shows the structure of a nanostructure known as a metal-organic framework or MOF. The structure possesses a handedness (like a right-handed vs. left-handed person), known as “chirality,” that enables researchers to identify the same kind of handedness in molecules that bind within it. (Credit: S. Lee, E. Kapustin, O. Yaghi/Berkeley Lab and UC Berkeley)

    The MOFs also possess a particular handedness known as “chirality”—like a left-handed person vs. a right-handed person—that selectively binds with molecular samples that also possess this handedness. The difference in a molecule’s handedness is particularly important for pharmaceuticals, as it can mean the difference between a medicine and a poison.

    “This is one of the holy grails: how to crystallize complex molecules, and to determine their chirality,” Yaghi said.

    Seungkyu Lee and Eugene A. Kapustin, Berkeley Lab researchers and UC Berkeley graduate students who participated in the latest work, said hard-to-study proteins, such as those important for drug development, are high-priority targets for the new technique.

    “We are aiming for those molecules that have never been crystallized before,” Kapustin said. “That’s our next step. So we can not only show the arrangement of atoms, but also the handedness of molecules, in which pharmaceutical companies are interested.”

    One of the best methods for studying any molecule’s 3-D structure in atomic detail is to form it into a crystal. Then, researchers point intense X-ray light at the crystal, which produces a pattern of spots—like light off of a disco ball. Such patterns serve as a fingerprint for fully mapping the molecule’s 3-D structure.
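    The geometry behind that “fingerprint” is Bragg’s law, nλ = 2d·sin θ, which ties the angle of each diffraction spot to a repeat spacing d in the crystal. A minimal sketch (the wavelength and angle below are typical illustrative values, not ones from this study):

```python
import math

def d_spacing(wavelength_angstrom: float, theta_deg: float, order: int = 1) -> float:
    """Bragg's law, n*lambda = 2*d*sin(theta): solve for the lattice spacing d."""
    return order * wavelength_angstrom / (2.0 * math.sin(math.radians(theta_deg)))

# A first-order reflection at theta = 30 degrees with a 1.54 angstrom beam
# (a common crystallography wavelength) implies a 1.54 angstrom spacing.
print(f"{d_spacing(1.54, 30.0):.2f} angstroms")
```

    Mapping every spot this way, over many crystal orientations, is what lets researchers reconstruct the full 3-D arrangement of atoms.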

    Some molecules are difficult to form into crystals, though, and the process of crystallizing a single molecule can in some cases involve years of effort and expense.

    “To crystallize a molecule typically involves a trial-and-error method,” Yaghi said. “Every chemist and biologist has to submit to this process. But in this MOF material you don’t need all that—it traps the molecule and orders it. It’s a way to bypass that trial-and-error approach to crystallography.”

    Different types of MOFs, with different pore sizes, could be tested to find out which ones work best with different types of samples, Lee said.

    Importantly, the MOFs in the latest study did not appear to distort the natural, intact structure of the molecules. Researchers say it’s possible to determine the complete 3-D structure of a molecule even if the samples only fill about 30 percent of a MOF’s pores.

    Researchers determined the atomic structure of the MOFs and the bound molecules with X-rays at Berkeley Lab’s Advanced Light Source (ALS), and they also studied the MOFs using a technique called nuclear magnetic resonance (NMR) at Berkeley Lab’s Molecular Foundry.

    LBL ALS interior

    In all, the researchers studied 16 different molecules bound inside the MOF pores, including a plant hormone called jasmonic acid whose chiral structure had never been directly determined before, other plant hormones known as gibberellins, methanol, and other acids and alcohols.

    This illustration shows the structure of 16 molecules that were studied while bound to metal-organic frameworks (MOFs) that exhibit handedness. The frameworks stabilized the molecules for study with X-rays. (Credit: S. Lee, E. Kapustin, O. Yaghi/Berkeley Lab and UC Berkeley)

    The metals in the MOF framework itself can actually serve to enhance the quality of the X-ray images, Kapustin said, adding that in one case the technique allowed researchers to distinguish between two nearly identical plant hormones based on the difference in a single atomic bond.

    Researchers could see structural details down to hundredths of a nanometer—less than the diameter of some atoms. “You can see with such precision whether it is a double bond or a single bond, or if this is a carbon atom or some other atom,” Lee said. “Once you bind a molecule in the MOF, you can learn the absolute structure very precisely since the chirality of the MOF serves as a reference during the structure refinement.”

    This work was supported by BASF SE in Germany and the King Abdulaziz City for Science and Technology Center of Excellence for Nanomaterials and Clean Energy Applications.

    The Advanced Light Source and Molecular Foundry are both DOE Office of Science User Facilities.

    For more information about Omar Yaghi’s research, visit http://yaghi.berkeley.edu/.

    See the full article here.

  • richardmitnick 4:32 pm on July 29, 2016
    Tags: Bacterial Pathogenicity, LBL

    From LBL: “Study Finds Molecular Switch That Triggers Bacterial Pathogenicity” 

    July 29, 2016
    Sarah Yang
    (510) 486-4575

    The top two rows show crystal and solution structures of bacterial HU proteins bound to DNA, determined by X-ray crystallography and small-angle X-ray scattering, respectively. DNA strands are yellow and HU proteins are shades of blue. Soft X-ray tomography was used to visualize bacterial chromatin (in yellow) in wild-type and invasive E. coli cells, shown in the bottom row. (Credit: Michal Hammel/Berkeley Lab)

    Scientists have revealed for the first time the molecular steps that turn on bacteria’s pathogenic genes. Using an array of high-powered X-ray imaging techniques, the researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) showed that histone-like proteins that bind to DNA are related to the physical twisting of the genetic strand, and that the supercoiling of the chromosome can trigger the expression of genes that make a microbe invasive.

    The study, published Friday, July 29, in the journal Science Advances, could open up new avenues in the development of drugs to prevent or treat bacterial infection, the study authors said.

    The researchers looked at how the long strands of DNA are wound tight, a necessity if they are to fit into compact spaces. For eukaryotes, the strands wrap around histone proteins to fit inside a nucleus. For single-celled prokaryotes, which include bacteria, HU proteins serve as the histones, and the chromosomes bunch up in the nucleoid, which lacks a membrane.

    When the normal twists and turns of DNA compaction turn into supercoiling, trouble can begin.

    “It has been known that DNA supercoiling leads to pathogenicity in bacteria, but exactly how the bacterial chromosome is condensed, organized, and ultimately segregated has been a puzzle for over half a century,” said study lead author Michal Hammel, research scientist in Berkeley Lab’s Molecular Biophysics and Integrated Bioimaging Division. “What we did for the first time was to visualize in E. coli how this packing is done, and we also discovered that the way HU proteins pack the chromosomes can trigger gene expression. That is new.”

    Capturing HU in action

    Elucidating these molecular mechanisms entailed imaging HU proteins at different resolutions and stages using two beamlines at Berkeley Lab’s Advanced Light Source (ALS), a DOE Office of Science User Facility.

    The Structurally Integrated Biology for Life Sciences (SIBYLS) beamline, directed by senior scientist John Tainer, combines X-ray crystallography and small angle X-ray scattering (SAXS) capabilities. The crystallography provided atomic-level details of how the HU proteins interacted with the bacterial DNA, while SAXS was able to show how the HU proteins assembled and affected the longer strands of DNA in a solution.

    Berkeley Lab scientists Michal Hammel and Carolyn Larabell in front of the SIBYLS Beamline at the Advanced Light Source. (Credit: Paul Mueller/Berkeley Lab)

    To get a clear sense of how that twisting and packing manifests at the cellular level, Hammel teamed up with Berkeley Lab faculty scientist Carolyn Larabell, director of the National Center for X-ray Tomography (NCXT), which is also based at the Advanced Light Source.

    “We needed the interaction of these different techniques to get the overall picture of how the HU interactions with DNA were affecting the bacteria,” added Larabell, who is also a professor of anatomy at UC San Francisco. “With X-ray tomography, we’re able to see the natural contrast in organic material in as close to a living state as possible, and we can provide quantitative comparisons of how compacted the chromosomes were in pathogenic and normal strains of E. coli.”

    Larabell calculated that the genetic material in the pathogenic E. coli is so tightly packed that it occupies less than half the volume of its non-mutant counterpart.

    A target to control pathogenesis

    Before this paper, it had been believed that the enzyme topoisomerase was the primary driver of DNA coiling in bacteria. This new study shows that, independent of topoisomerase, changing the assembly of HU proteins was enough to induce changes in DNA coiling at different stages of bacterial growth.

    “What is notable about HU proteins as a trigger for gene expression is that it’s quick,” said Hammel. “This makes sense as a survival mechanism for bacteria, which need to adapt quickly to different environments.”

    The study results also raise the question: If pathogenicity can be switched on, could it also be switched off?

    “We certainly expect to answer that question in future studies,” said Hammel. “These HU interactions could be an attractive target for drugs that control pathogenesis, not only of bacteria, but of other microbes with comparable genetic structures.”

    Other study co-authors include researchers from the National Cancer Institute’s Center for Cancer Research and the University of Texas.

    The National Institutes of Health and the DOE Office of Science supported this research.

    See the full article here.

  • richardmitnick 1:38 pm on July 14, 2016
    Tags: Dark Energy Measured with Record-Breaking Map of 1.2 Million Galaxies, LBL

    From LBL: “Dark Energy Measured with Record-Breaking Map of 1.2 Million Galaxies” 

    July 14, 2016
    Jon Weiner

    David Schlegel, Lawrence Berkeley National Laboratory

    Shirley Ho, Lawrence Berkeley National Laboratory and Carnegie Mellon University

    A team of hundreds of physicists and astronomers has announced results from the largest-ever three-dimensional map of distant galaxies. The team constructed this map to make one of the most precise measurements yet of the dark energy currently driving the accelerated expansion of the Universe.

    “We have spent five years collecting measurements of 1.2 million galaxies over one quarter of the sky to map out the structure of the Universe over a volume of 650 billion cubic light-years,” says Jeremy Tinker of New York University, a co-leader of the scientific team carrying out this effort. “This map has allowed us to make the best measurements yet of the effects of dark energy in the expansion of the Universe. We are making our results and map available to the world.”

    This is one slice through the map of the large-scale structure of the Universe from the Sloan Digital Sky Survey and its Baryon Oscillation Spectroscopic Survey. Each dot in this picture indicates the position of a galaxy 6 billion years into the past. The image covers about 1/20th of the sky, a slice of the Universe 6 billion light-years wide, 4.5 billion light-years high, and 500 million light-years thick. Color indicates distance from Earth, ranging from yellow on the near side of the slice to purple on the far side. Galaxies are highly clustered, revealing superclusters and voids whose presence is seeded in the first fraction of a second after the Big Bang. This image contains 48,741 galaxies, about 3% of the full survey dataset. Grey patches are small regions without survey data. Credit: Daniel Eisenstein and SDSS-III

    These new measurements were carried out by the Baryon Oscillation Spectroscopic Survey (BOSS) program of the Sloan Digital Sky Survey-III. Shaped by a continuous tug-of-war between dark matter and dark energy, the map revealed by BOSS allows scientists to measure the expansion rate of the Universe and thus determine the amount of matter and dark energy that make up the present-day Universe. A collection of papers describing these results was submitted this week to the Monthly Notices of the Royal Astronomical Society.

    BOSS measures the expansion rate of the Universe by determining the size of the baryonic acoustic oscillations (BAO) in the three-dimensional distribution of galaxies. The original BAO size is determined by pressure waves that travelled through the young Universe up to when it was only 400,000 years old (the Universe is presently 13.8 billion years old), at which point they became frozen in the matter distribution of the Universe. The end result is that galaxies have a slight preference to be separated by a characteristic distance that astronomers call the acoustic scale. The size of the acoustic scale at 13.4 billion years ago has been exquisitely determined from observations of the cosmic microwave background from the light emitted when the pressure waves became frozen. Measuring the distribution of galaxies since that time allows astronomers to measure how dark matter and dark energy have competed to govern the rate of expansion of the Universe.
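    The “slight preference” for a characteristic separation can be illustrated with a toy catalog: inject pairs of points separated by the acoustic scale into random positions, histogram all pairwise separations, and look for the excess. This is only a sketch (real BOSS analyses use correlation-function estimators with survey-geometry corrections; the ~150 Mpc scale and the box size here are round, illustrative numbers):

```python
import numpy as np

rng = np.random.default_rng(0)
ACOUSTIC_SCALE = 150.0  # Mpc, roughly the acoustic scale described above
n = 400

# Toy catalog: n random "galaxies" in a 2000 Mpc box, each given a
# partner offset by the acoustic scale in a random direction.
base = rng.uniform(0.0, 2000.0, size=(n, 3))
direction = rng.normal(size=(n, 3))
direction /= np.linalg.norm(direction, axis=1, keepdims=True)
points = np.vstack([base, base + ACOUSTIC_SCALE * direction])

# Histogram of all pairwise separations, in 10 Mpc bins centered on multiples of 10.
diff = points[:, None, :] - points[None, :, :]
seps = np.linalg.norm(diff, axis=-1)[np.triu_indices(len(points), k=1)]
counts, edges = np.histogram(seps, bins=np.arange(5.0, 400.0, 10.0))

# The injected pairs stand out as an excess over the smooth background,
# estimated here by averaging each bin's two neighbors.
excess = counts[1:-1] - (counts[:-2] + counts[2:]) / 2.0
peak_bin = edges[1:-2][np.argmax(excess)]
print(peak_bin)  # left edge of the winning bin: the one centered on 150 Mpc
```

    The excess sits in the bin containing the injected 150 Mpc separation, which is the toy analog of the BAO bump astronomers measure in the galaxy distribution.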

    “We’ve made the largest map for studying the 95% of the universe that is dark,” noted David Schlegel, an astrophysicist at Lawrence Berkeley National Laboratory (Berkeley Lab) and principal investigator for BOSS. “In this map, we can see galaxies being gravitationally pulled towards other galaxies by dark matter. And on much larger scales, we see the effect of dark energy ripping the universe apart.”

    The Sloan Digital Sky Survey and its Baryon Oscillation Spectroscopic Survey have transformed a two-dimensional image of the sky (left panel) into a three-dimensional map spanning distances of billions of light years, shown here from two perspectives (middle and right panels). This map includes 120,000 galaxies over 10% of the survey area. The brighter regions correspond to the regions of the Universe with more galaxies and therefore more dark matter. Image credit: Jeremy Tinker and SDSS-III.

    Shirley Ho, an astrophysicist at Berkeley Lab and Carnegie Mellon University (CMU), co-led two of the companion papers and adds, “We can now measure how much the galaxies and stars cluster together as a function of time to such an accuracy we can test General Relativity at cosmological scales.”

    Ariel Sanchez of the Max-Planck Institute of Extraterrestrial Physics led the effort to estimate the exact amount of dark matter and dark energy based on the BOSS data and explains: “Measuring the acoustic scale across cosmic history gives a direct ruler with which to measure the Universe’s expansion rate. With BOSS, we have traced the BAO’s subtle imprint on the distribution of galaxies spanning a range of time from 2 to 7 billion years ago.”

    To measure the size of these ancient giant waves to such sharp precision, BOSS had to make an unprecedented and ambitious galaxy map, many times larger than previous surveys. At the time the BOSS program was planned, dark energy had been previously determined to significantly influence the expansion of the Universe starting about 5 billion years ago. BOSS was thus designed to measure the BAO feature from before this point (7 billion years ago) out to near the present day (2 billion years ago).

    Jose Vazquez of Brookhaven National Laboratory combined the BOSS results with other surveys and searched for any evidence of unexplained physical phenomena in the results. “Our latest results tie into a clean cosmological picture, giving strength to the standard cosmological model that has emerged over the last eighteen years.”

    Rita Tojeiro of the University of St. Andrews is the other co-leader of the BOSS galaxy clustering working group along with Tinker. “We see a dramatic connection between the sound wave imprints seen in the cosmic microwave background 400,000 years after the Big Bang to the clustering of galaxies 7-12 billion years later. The ability to observe a single well-modeled physical effect from recombination until today is a great boon for cosmology.”

    The map also reveals the distinctive signature of the coherent movement of galaxies toward regions of the Universe with more matter, due to the attractive force of gravity. Crucially, the observed amount of infall is explained well by the predictions of general relativity.

    “The results from BOSS provide a solid foundation for even more precise future BAO measurements, such as those we expect from the Dark Energy Spectroscopic Instrument (DESI),” says Natalie Roe, Physics Division director at Berkeley Lab. “DESI will construct a more detailed 3-dimensional map in a volume of space ten times larger to precisely characterize dark energy — and ultimately the future of our universe.”

    Funding for SDSS-III has been provided by the Alfred P. Sloan Foundation, the Participating Institutions, the National Science Foundation, and the U.S. Department of Energy Office of Science. For more on SDSS-III, visit http://www.sdss3.org/.

    To read the SDSS news release, go here.

    SDSS Telescope at Apache Point, NM, USA

    See the full article here.

  • richardmitnick 3:10 pm on June 6, 2016
    Tags: Copper is Key in Burning Fat, LBL

    From LBL: “Copper is Key in Burning Fat” 

    Chris Chang and UC Berkeley graduate student Sumin Lee carry out experiments to find proteins that bind to copper and potentially influence the storage and burning of fat. (Credit: Peg Skorpinski/UC Berkeley)

    Long prized as a malleable, conductive metal used in cookware, electronics, jewelry and plumbing, copper has been gaining increasing attention over the past decade for its role in certain biological functions. It has been known that copper is needed to form red blood cells, absorb iron, develop connective tissue and support the immune system.

    The new findings, to appear in the July print issue of Nature Chemical Biology but published online today, establish for the first time copper’s role in fat metabolism.

    The team of researchers was led by Chris Chang, a faculty scientist at Berkeley Lab’s Chemical Sciences Division, a UC Berkeley professor of chemistry and a Howard Hughes Medical Institute investigator. Co-lead authors of the study are Lakshmi Krishnamoorthy and Joseph Cotruvo Jr, both UC Berkeley postdoctoral researchers in chemistry with affiliations at Berkeley Lab.

    “We find that copper is essential for breaking down fat cells so that they can be used for energy,” said Chang. “It acts as a regulator. The more copper there is, the more the fat is broken down. We think it would be worthwhile to study whether a deficiency in this nutrient could be linked to obesity and obesity-related diseases.”

    Dietary copper

    Chang said that copper could potentially play a role in restoring a natural way to burn fat. The nutrient is plentiful in foods such as oysters and other shellfish, leafy greens, mushrooms, seeds, nuts and beans.

    According to the Food and Nutrition Board of the Institute of Medicine, an adult’s estimated average dietary requirement for copper is about 700 micrograms per day. The Food and Nutrition Board also found that only 25 percent of the U.S. population gets enough copper daily.

    “Copper is not something the body can make, so we need to get it through our diet,” said Chang. “The typical American diet, however, doesn’t include many green leafy vegetables. Asian diets, for example, have more foods rich in copper.”

    But Chang cautions against ingesting copper supplements as a result of these study results. Too much copper can lead to imbalances with other essential minerals, including zinc.

    Copper as a ‘brake on a brake’

    The researchers made the copper-fat link using mice with a genetic mutation that causes the accumulation of copper in the liver. Notably, these mice have larger than average deposits of fat compared with normal mice.

    A fluorescent probe creates a heat map of copper in white fat cells. Higher levels of copper are shown in yellow and red. The left panel shows normal levels of copper from fat cells of control mice, and the right panel shows cells deficient in copper. (Credit: Lakshmi Krishnamoorthy and Joseph Cotruvo Jr./UC Berkeley)

    The inherited condition, known as Wilson’s disease, also occurs in humans and is potentially fatal if left untreated.

    Analysis of the mice with Wilson’s disease revealed that the abnormal buildup of copper was accompanied by lower than normal lipid levels in the liver compared with control groups of mice. The researchers also found that the white adipose tissue, or white fat, of the mice with Wilson’s disease had lower levels of copper compared with the control mice and correspondingly higher levels of fat deposits.

    They then treated the Wilson’s disease mice with isoproterenol, a beta agonist known to induce lipolysis, the breakdown of fat into fatty acids, through the cyclic adenosine monophosphate (cAMP) signaling pathway. They noted that the mice with Wilson’s disease exhibited less fat-breakdown activity compared with control mice.

    The results prompted the researchers to conduct cell culture analyses to clarify the mechanism by which copper influences lipolysis. The researchers used inductively coupled plasma mass spectrometry (ICP-MS) equipment at Berkeley Lab to measure levels of copper in fat tissue.

    They found that copper binds to phosphodiesterase 3, or PDE3, an enzyme that binds to cAMP, halting cAMP’s ability to facilitate the breakdown of fat.

    “When copper binds phosphodiesterase, it’s like a brake on a brake,” said Chang. “That’s why copper has a positive correlation with lipolysis.”

    Hints from cows and copper

    The connection between copper and fat metabolism is not altogether surprising. The researchers actually found hints of the link in the field of animal husbandry.

    “It had been noted in cattle that levels of copper in the feed would affect how fatty the meat was,” said Chang. “This effect on fat deposits in animals was in the agricultural literature, but it hadn’t been clear what the biochemical mechanisms were linking copper and fat.”

    The new work builds upon prior research from Chang’s lab on the roles of copper and other metals in neuroscience. In support of President Barack Obama’s BRAIN Initiative, Berkeley Lab provided Chang seed funding in 2013 through the Laboratory Directed Research and Development program. Chang’s work continued through the BRAIN Tri-Institutional Partnership, an alliance with Berkeley Lab, UC Berkeley and UC San Francisco.

    Of the copper in human bodies, there are particularly high concentrations found in the brain. Recent studies, including those led by Chang, have found that copper helps brain cells communicate with each other by acting as a brake when it is time for neural signals to stop.

    While Chang’s initial focus was on the role of copper in neural communications, he branched out to investigations of metals in fat metabolism and other biological pathways. This latest work was primarily funded by the National Institutes of Health.

    Previous stories on Chris Chang’s research on copper include:

    Copper on the Brain at Rest
    Of Metal Heads and Imaging

    A profile of Chris Chang is online through the UC Berkeley Office of the Vice Chancellor for Research.

    See the full article here.

  • richardmitnick 12:25 pm on May 27, 2016
    Tags: Berkeley built new RFQ successfully takes first beam at FNAL, Fermilab Accelerator Division, LBL

    From FNAL: “Upgraded PIP-II RFQ successfully takes first beam” 

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    May 25, 2016
    Rashmi Shivni

    This photo of the RFQ for the Fermilab PIP-II accelerator was taken during the assembly phase at Lawrence Berkeley National Laboratory. Photo courtesy of Andrew Lambert, Berkeley Lab

    In March, the Fermilab Accelerator Division successfully sent beam through a newly commissioned linear accelerator. The brand new radio-frequency quadrupole (RFQ) linac, designed and built by a team of engineers and physicists at Lawrence Berkeley National Laboratory, will form the front end of Fermilab’s proposed 800-MeV superconducting linear accelerator.

    “The RFQ is one of the biggest challenges faced by our group,” said Derun Li, lead scientist on the RFQ development team and deputy head of the Center for Beam Physics at Berkeley Lab. “And seeing it take nearly 100 percent of the source beam on its first try is great!”

    The new, front-end accelerator is one of several upgrade projects conducted under PIP-II, a plan to overhaul the Fermilab accelerator complex to produce high-intensity proton beams for the lab’s multiple experiments. PIP-II is supported by the DOE Office of Science.

    Currently located at the Cryomodule Test Facility, approximately 1.5 miles northeast of Wilson Hall, the RFQ took first beam – 100-microsecond pulses at 10 hertz – during its first testing phase. Since its first run in March, the team has been working on various commissioning activities, including running the pulsed beam through the RFQ and its transport lines. These activities are expected to continue until June.

    The goal of these tests is to provide intense, focused beams to the entire accelerator complex. The lab’s current RFQ, which sits at the beginning of the laboratory’s accelerator chain, accelerates a negative hydrogen ion beam to 0.75 million electronvolts, or MeV. The new RFQ, which is longer, accelerates a beam to 2.1 MeV, nearly three times the energy. Transported beam current, and therefore power, is the key improvement with the new RFQ. The current RFQ delivers 54-watt beam power; the new RFQ delivers beam at 21 kilowatts – an increase by a factor of nearly 400.
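
    The ratios quoted above are easy to verify. A quick arithmetic check (not part of the original article):

    ```python
    # Figures quoted in the article for the old and new RFQs.
    old_energy_mev, new_energy_mev = 0.75, 2.1   # beam kinetic energy
    old_power_w, new_power_w = 54.0, 21_000.0    # transported beam power

    print(round(new_energy_mev / old_energy_mev, 2))  # 2.8 -> "nearly three times"
    print(round(new_power_w / old_power_w))           # 389 -> "a factor of nearly 400"
    ```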

    RFQs are widely used for accelerating low-energy ion beams, and the energy of the beams they produce typically caps off at about 5 MeV, said Paul Derwent, PIP-II Department head. These low-energy protons will then undergo further acceleration by other components of Fermilab’s accelerator complex, some to 8 GeV and others to 120 GeV.

    The new RFQ is 4.5 meters long and made of four parallel copper vanes, as opposed to the four rods used on the current RFQ. As viewed from one end, the vanes form a symmetrical cross. At the center of the cross is a tiny aperture, or tunnel, through which the beam travels.

    The RFQ is undergoing tests at the Cryomodule Test Facility at Fermilab. Photo: Reidar Hahn

    If you were to remove one vane and peer inside the RFQ from the side, you would see an intricate pattern of peaks and valleys, similar to a waveform, along the inner edge of each vane. Like puzzle pieces, the vanes fit together to form the small tunnel with the rippling walls of that inner waveform shape. The farther down the tunnel you go, and therefore the higher the beam energy, the longer the spacing of the peaks and valleys. This means that the time the beam needs to go from peak to valley and back remains constant, necessary for proper acceleration.
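
    The constant transit time described above is the standard cell-length rule for RFQs: each peak-to-valley-to-peak modulation period spans βλ/2, the distance the beam covers in half an RF period. A minimal sketch of that scaling (the 162.5 MHz RF frequency and the ion rest energy are assumptions, not stated in the article):

    ```python
    import math

    C = 299_792_458.0    # speed of light, m/s
    F_RF = 162.5e6       # assumed PIP-II RFQ operating frequency, Hz
    REST_MEV = 938.8     # approximate H- ion rest energy, MeV

    def cell_length_m(kinetic_mev):
        """Length of one modulation cell, beta * lambda / 2: the beam must
        cross a cell in half an RF period for the kicks to stay in phase."""
        gamma = 1.0 + kinetic_mev / REST_MEV
        beta = math.sqrt(1.0 - 1.0 / gamma ** 2)
        return beta * (C / F_RF) / 2.0

    # Cells stretch as the beam gains energy along the 4.5-meter structure:
    print(f"{cell_length_m(0.75):.3f} m")  # at the old RFQ's 0.75 MeV output energy
    print(f"{cell_length_m(2.1):.3f} m")   # at the new RFQ's 2.1 MeV exit
    ```

    A real vane modulation table also encodes bunching and focusing terms, so actual cell lengths differ in detail; the sketch captures only the lengthening described above.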

    Jim Steimel, the electrical engineering coordinator for PIP-II and a Fermilab liaison for Berkeley’s RFQ development team, said this shape is a special trait of RFQs, one that creates an electromagnetic quadrupole field that focuses low-velocity particles.

    “As the beam travels through the RFQ tunnel, longitudinal electrical fields generated by the vane peaks and valleys accelerate particle energy,” Steimel said. “This helps focus the beam and keeps the particles accelerating.”

    The Berkeley team successfully designed the accelerator to bring beams to a higher intensity than Fermilab’s previous RFQ technology could achieve – energy that matches PIP-II’s front-end requirements.

    “Our challenge was to come up with a design that uses minimum radio-frequency power and delivers the required beam quality and intensity, and to engineer a mechanical design that can withstand continuous operation at high average power,” Li said.

    Li’s team took into account potential problems that may occur at a power of 100 kilowatts or more, which is needed to maintain the electromagnetic quadrupole field inside the RFQ.

    For example, at higher powers temperatures can rapidly increase, causing thermal stress on the RFQ components. Large water flow rates and durable materials are needed to withstand heat and prevent deformations, which is a significant mechanical engineering feat.

    “The Berkeley team is proud to have been a key contributor to the first phase of the PIP-II upgrade,” said Wim Leemans, director of Berkeley Lab’s Accelerator Technology and Applied Physics Division. “Berkeley physicists and engineers have been building RFQs for a number of users and purposes for 30 years, and this is a great example of getting the most leverage out of the agency investment.”

    Now that the Berkeley and Fermilab teams have demonstrated that the RFQ can generate intense beams in pulses, the next step is to create a continuous high-intensity beam for PIP-II. The team expects to achieve a continuous beam in the summer.

    “Fermilab and Berkeley have a long history of collaboration,” Derwent said. “This was just another one where it has worked very well, and their expertise helped us achieve one of our goals.”

    See the full article here .


    Fermilab Campus

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a U.S. Department of Energy national laboratory specializing in high-energy particle physics, and America’s premier laboratory for particle physics and accelerator research. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

  • richardmitnick 2:35 pm on May 24, 2016 Permalink | Reply
    Tags: , , LBL,   

    From LBL: “Hunting for Dark Matter’s ‘Hidden Valley’ ” Women in Science 

    Berkeley Logo

    Berkeley Lab

    May 24, 2016
    Glenn Roberts Jr.

    Kathryn Zurek (Credit: Roy Kaltschmidt/Berkeley Lab)

    Kathryn Zurek realized a decade ago that we may be searching in the wrong places for clues to one of the universe’s greatest unsolved mysteries: dark matter. Though dark matter makes up an estimated 85 percent of the total mass of the universe, we haven’t yet figured out what it’s made of.

    Now, Zurek, a theoretical physicist at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), says that thanks to extraordinary improvements in experimental sensitivity, “We increasingly know where not to look.” In 2006, during grad school, Zurek began to explore a new “Hidden Valley” model of physics that could hold the answers to the dark matter puzzle.

    “I noticed, from a model-builder’s point of view, that dark matter was extraordinarily undeveloped,” she said. It seemed as though scientists were figuratively hunting in the dark for answers. “People were focused on models of just two classes of dark matter candidates, rather than a much broader array of possibilities.”

    Physicist and author Alan Lightman has described dark matter as an “invisible elephant in the room”—you know it’s there because of the dent it’s making in the floorboards but you can’t see or touch it. Likewise, physicists can infer that dark matter exists in huge supply compared to normal matter because of its gravitational effects, which far exceed those expected from the matter we can see in space.

    Since 1933, when physicist Fritz Zwicky measured a major discrepancy in the gravitational mass of a galaxy cluster and concluded it was due to dark matter, the search for what dark matter is really made of has taken many forms: deep-underground detectors, space- and ground-based observatories, balloon-borne missions, and powerful particle accelerator experiments.

    While there have been some candidate signals and hints, and numerous experiments have narrowed the range of energies and masses at which we are now looking for dark matter particles, the scientific community hasn’t yet embraced a dark matter discovery.

    3 Knowns and 3 Unknowns about Dark Matter

    What’s known
    1. We can observe its effects.

    2. It is abundant.
    It makes up about 85 percent of the total mass of the universe, and about 27 percent of the universe’s total mass and energy.

    3. We know more about what dark matter is not.

    Increasingly sensitive detectors are lowering the possible rate at which dark matter particles can interact with normal matter.
    This chart shows the sensitivity limits (solid-line curves) of various experiments searching for signs of theoretical dark matter particles known as WIMPs (weakly interacting massive particles). The shaded closed contours show hints of WIMP signals. The thin dashed and dotted curves show projections for future U.S.-led dark matter direct-detection experiments expected in the next decade, and the thick dashed curve (orange) shows a so-called “neutrino floor” where neutrino-related signals can obscure the direct detection of dark matter particles. (Credit: Snowmass report, 2013.)

    What’s unknown

    1. Is it made up of one particle or many particles?
    (Credit: Pixabay/CreativeMagic)

    Could dark matter be composed of an entire family of particles, such as a theorized “hidden valley” or “dark sector”?
    2. Are there “dark forces” acting on dark matter?

    Are there forces beyond gravity and other known forces that act on dark matter but not on ordinary matter, and can dark matter interact with itself?
    This image from the NASA/ESA Hubble Space Telescope shows the galaxy cluster Abell 3827. The blue structures surrounding the central galaxies are views of a more distant galaxy behind the cluster that has been distorted by an effect known as gravitational lensing. Observations of the central four merging galaxies in this image have provided hints that the dark matter around one of the galaxies is not moving with the galaxy itself, possibly indicating the occurrence of an unknown type of dark matter interaction. (Credit: ESO)

    3. Is there dark antimatter?
    A computerized visualization showing the possible large-scale structure of dark matter in the universe. (Credit: Amit Chourasia and Steve Cutchin/NPACI Visualization Services; Enzo)

    In 2006, while Zurek was a graduate student at the University of Washington, she and her collaborator Matthew J. Strassler, a faculty member there, published a paper*, “Echoes of a Hidden Valley at Hadron Colliders,” that considered the possibility of new physics, such as the existence of a new group of light (low-mass), long-lived particles that might be revealed at CERN’s Large Hadron Collider, the machine that would later enable the Nobel Prize-winning discovery of the Higgs boson in 2012.

    CERN/LHC Map
    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN

    Some of the scientifically popular hypothetical particle candidates for dark matter are WIMPs (weakly interacting massive particles) and axions (very-low-mass particles). But the possibility of a rich and overlooked mix of light particles was compelling for Zurek, who began to construct models to test out the theory.

    “If you had a low-mass hidden sector, you could ‘stuff’ all kinds of things inside of it,” she said. “That really set me up to start thinking about complex dark sectors, which I did as a postdoc.”

    Looking back to 2008, Zurek said she felt like someone carrying around a sandwich board proclaiming that dark matter could be a stranger, manifold thing than most had imagined. “I was like that little guy with the sign.”

    By coincidence, the so-called “PAMELA anomaly” was revealed that same year: data from the PAMELA space mission had found an unexpected excess of positrons, the antimatter counterpart to electrons, at certain energies. The measurement excited physicists as a possible particle signature from the decay of dark matter; the excess defied standard dark matter theories and opened the door to new ones.

    Now that the concept of “hidden valleys” or “dark sectors” with myriad particles making up dark matter is gaining steam among scientists—Zurek spoke in late April at a three-day “Workshop on Dark Sectors”—she said she feels gratified to have worked on some of the early theoretical models.

    “It’s great in one sense because these ideas really got traction,” Zurek said. “The fact that there were these experimental anomalies, that was sort of a coincidence. As a second- or third-year postdoc, this was like ‘my program’—this was the thing I was pushing. It suddenly got very popular.”

    On an afternoon in late April, Zurek and her student Katelin Schutz sat together waiting to press the button to submit a new paper on a proposal to tease out a signal for light dark matter particles using an exotic, supercooled liquid known as superfluid helium. In the paper, they explain how this form of helium can probe for signals of “super light dark matter,” with an energy signature well below the reach of today’s experiments.

    They are also working with Dan McKinsey, a Berkeley Lab scientist and UC Berkeley physics professor who is a superfluid helium expert, on possible designs for an experiment.

    The most popular theories of WIMPs, for example, suggest a mass around 100 times that of a proton, the particle found at an atom’s core, but a superfluid helium detector could be sensitive to masses many orders of magnitude smaller, she said.
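
    The reason a lower energy threshold translates into sensitivity to lighter particles follows from simple kinematics: a halo dark matter particle moving at typical galactic speeds carries kinetic energy proportional to its mass, which caps how much energy it can deposit in a detector. A rough illustration (the halo speed and proton rest energy are textbook values, not from the article):

    ```python
    PROTON_EV = 0.938e9        # proton rest energy m*c^2, in eV
    V_OVER_C = 230e3 / 3.0e8   # typical galactic halo speed as a fraction of c

    def halo_kinetic_energy_ev(mass_ev):
        """Non-relativistic kinetic energy (1/2) m v^2 of a halo particle, in eV."""
        return 0.5 * mass_ev * V_OVER_C ** 2

    # A WIMP of ~100 proton masses carries tens of keV -- within reach of
    # conventional detectors. An MeV-scale particle carries well under 1 eV,
    # which is why "super light dark matter" demands far lower thresholds.
    print(halo_kinetic_energy_ev(100 * PROTON_EV))  # ~2.8e4 eV
    print(halo_kinetic_energy_ev(1e6))              # ~0.29 eV
    ```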

    Are we any closer to finding dark matter?

    Zurek said she is surprised we haven’t yet made a discovery, but she is encouraged by the increasing sensitivity of experiments, and she said Berkeley Lab has particular expertise in high-precision detectors that will hopefully ensure its role in future experiments.

    “There is a cross-fertilization from different fields of physics that has really blossomed in the last several years,” Zurek also said. She joined Berkeley Lab in 2014 after serving as an associate professor at the University of Michigan, and has also spent time at the Institute for Advanced Study in Princeton, N.J., and at Fermi National Accelerator Laboratory’s Particle Astrophysics Center.

    Besides dark matter research, Zurek works on problems related to possible new physics at play in the infant universe and in the evolution of the universe’s structure. Her work is often at the intersection of particle physics experiments and astrophysics observations.

    Hard problems like the dark matter mystery are what drew her to physics at an early age, when she enrolled in college at the age of 15.

    “I wanted to understand how the universe worked. Plus, physics was hard and I liked that. I thought it was the hardest thing you could do, which I found very appealing. I decided at 15 that I wanted to make it a career, and I just never looked back,” she said.

    She knew, too, that she didn’t want to work directly on big science experiments. “I had always been fascinated about ideas: Ideas in philosophy, and the interplay between music and philosophy and physics.”

    She is a classical pianist with the ability to improvise melodies—she refers to this as a “tremendous intuition in how to make sounds”—and she still turns to music when confronting a physics problem. “When you’re really stuck on a problem you never stop thinking about it. Sometimes playing the piano helps.”

    When outdoors, Zurek enjoys sailing, hiking and alpine-style climbing—complete with ice axe and crampons—atop peaks such as Mount Rainier and Mount Shasta.

    As for the trail ahead in the dark matter hunt, Zurek said it’s important to be nimble and to expect the unexpected.

    “You don’t want to put yourself at a dead-end where you’re not exploring other possibilities,” she said.

    “The thing we don’t want to forget is: We don’t know what dark matter is. You have to have room for exploratory experiments, and you probably need a lot of them.”

    Learn more about Kathryn Zurek’s research: https://www.kzurek.theory.lbl.gov/.

    *Science paper:
    Echoes of a Hidden Valley at Hadron Colliders

    See the full article here .


  • richardmitnick 3:55 pm on May 23, 2016 Permalink | Reply
    Tags: , LBL, , Water-Energy Nexus New Focus of Berkeley Lab Research   

    From LBL: “Water-Energy Nexus New Focus of Berkeley Lab Research” 

    Berkeley Logo

    Berkeley Lab

    May 23, 2016
    Julie Chao
    (510) 486-6491

    Water banking, desalination, and high-resolution climate models are all part of the new Berkeley Lab Water Resilience Initiative. (California snowpack photo credit: Dan Pisut/ NASA)

    Billions of gallons of water are used each day in the United States for energy production—for hydroelectric power generation, thermoelectric plant cooling, and countless other industrial processes, including oil and gas mining. And huge amounts of energy are required to pump, treat, heat, and deliver water.

    This interdependence of water and energy is the focus of a major new research effort at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab). With the twin challenges of population growth and climate change adding to the resource pressures, Berkeley Lab’s Water Resilience Initiative aims to use science and technology to optimize coupled water-energy systems and guide investments in such systems.

    “Considering water and energy as separate issues is passé,” said Berkeley Lab scientist Robert Kostecki, one of the three leads of the initiative. “Now the two are becoming critically interdependent. And both the energy and water sectors are expected to experience serious stresses from extreme climate events. However the problem on each side is dealt with, there needs to be an understanding of possible implications on the other side.”

    The Initiative has three main goals: hydroclimate and ecosystem predictions, revolutionary concepts for efficient and sustainable groundwater systems, and science and technology breakthroughs in desalination. The goals can be viewed as analogous to energy distribution, storage, and generation, says Susan Hubbard, Berkeley Lab’s Associate Lab Director for Earth and Environmental Sciences.

    “We consider improved hydroclimate predictions as necessary for understanding future water distribution,” Hubbard said. “We are exploring water banking as a subsurface strategy to store water that is delivered by snowmelt or extreme precipitation. To remain water resilient in other locations, and to take advantage of sources ranging from seawater to brackish inland produced waters, Berkeley Lab is performing fundamental investigations to explore new approaches to desalinate water, ideally leading to cost- and energy-efficient approaches to generate water.”

    Climate: the Source of All Water

    The climate, ultimately, is the source of all water, and in places like California, where the snowpack plays an important role, climate change will have a big impact on how much water there will be and when it will come. The goal of the climate focus of the Initiative, led by Berkeley Lab climate scientist Andrew Jones, is to develop approaches to predict hydroclimate at scales that can be used to guide water-energy strategies.

    “Historically we’ve developed climate models that are global models, developed to answer global science questions,” Jones said. “But there’s an increasing demand for information at much finer spatial scales to support climate adaptation planning.”

    Ten years ago, Berkeley Lab scientists helped develop global climate models with a resolution of 200 kilometers. By 2012, the most advanced models had 25 km resolution. Now a project is underway to develop a regional climate model of the San Francisco Bay Area with resolution of 1 km, or the neighborhood level.
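
    That progression is steeper than the numbers alone suggest: over a fixed domain, the number of horizontal grid columns grows with the square of the refinement factor (and finer grids also force shorter time steps, adding further cost). A rough count, not from the article:

    ```python
    def relative_columns(coarse_km, fine_km):
        """How many times more horizontal grid columns a finer grid needs
        over the same model domain (compute cost scales at least this fast)."""
        return (coarse_km / fine_km) ** 2

    print(relative_columns(200, 25))  # 64.0 -> the 2012-era global models
    print(relative_columns(200, 1))   # 40000.0 -> the 1 km Bay Area model
    ```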

    “We’ll be looking at the risk of extreme heat events and how that interacts with the microclimates of the Bay Area, and additionally, how change in the urban built environment can exacerbate or ameliorate those heat events,” Jones said. “Then we want to understand the implications of those heat events for water and energy demands.”

    The eventual goal is to transfer this model for use in other urban areas to be able to predict extreme precipitation events as well as drought and flood risk.

    Subsurface: Storage, Quality, and Movement of Water Underground

    Another Initiative focus, led by Peter Nico, head of Berkeley Lab’s Geochemistry Department, studies what’s happening underground. “We have a lot of expertise in understanding the subsurface—using various geophysical imaging techniques, measuring chemical changes, using different types of hydrologic and reactive transport models to simulate what’s happening,” he said. “So our expertise matches up very well with groundwater movement and management and groundwater quality.”

    Groundwater issues have become more important with the drought of the last four years. “California has been ‘overdrafting’ water for a long time, especially in the San Joaquin Valley, where we’re pulling more water out than is naturally infiltrating back in,” Nico said. “With the drought the use of groundwater has gone up even more. That’s causing a lot of problems, like land subsidence.”

    While there is already a lot of activity associated with groundwater management in California, Nico added, “we still can’t confidently store and retrieve clean water in the subsurface when and where we need it. We think there’s a place to contribute a more scientific chemistry- and physics-based understanding to efficient groundwater storage in California.”

    For example, Berkeley Lab scientists have expertise in using geophysical imaging, which allows them to “see” underground without drilling a well. “We have very sophisticated hydrologic and geochemical computer codes we think we can couple with imaging to predict where water will go and how its chemistry may change through storage or retrieval,” he said.

    Berkeley Lab researchers are helping test “water banking” on almond orchards. (Courtesy of Almond Board of California)

    They have a new project with the Almond Board of California to determine the ability to recharge over-drafted groundwater aquifers in the San Joaquin Valley by applying peak flood flows to active orchards, known as “water banking.” The project is part of the Almond Board’s larger Accelerated Innovation Management (AIM) program, which includes an emphasis on creating sustainable water resources. Berkeley Lab scientists will work with existing partners, Sustainable Conservation and UC Davis, who are currently conducting field trials and experiments, and contribute their expertise on the deeper subsurface, below the root zone of the almond trees, to determine what happens to banked water as it moves through the subsurface.

    Another project, led by Berkeley Lab researcher Larry Dale, is developing a model of the energy use and cost of groundwater pumping statewide in order to improve the reliability of California’s electric and water systems, especially in cases of drought and increase in electricity demand. The project has been awarded a $625,000 grant by the California Energy Commission.

    Desalination: Aiming for Pipe Parity

    Reverse osmosis is the state-of-the-art desalination technology and has been around since the 1950s. Unfortunately, there have been few breakthroughs in the field of desalination since then, and it remains prohibitively expensive. “The challenge is to lower the cost of desalination of sea water by a factor of five to achieve ‘pipe parity,’ or cost parity with water from natural sources,” said Kostecki, who is leading the project. “This is a formidable endeavor and it cannot be done with incremental improvements of existing technologies.”

    To reach this goal, Kostecki and other Berkeley Lab researchers are working on several different approaches for more efficient desalination. Some are new twists on existing technologies—such as forward osmosis using heat from geothermal sources, graphene-based membranes, and capacitive deionization—while others are forging entirely new paradigms, such as using the quantum effects in nanoconfined spaces and new nano-engineered materials architectures.

    “The reality is that if one is shooting for a 5X reduction in the cost of desalination of water, then this requires a completely new way of thinking, new science, new technology—this is what we are shooting for,” said Ramamoorthy Ramesh, Associate Lab Director for Energy Technologies.

    Some of these projects are part of the U.S./China Clean Energy Research Center for Water Energy Technologies (CERC-WET), a new $64-million collaboration between China and the United States to tackle water conservation in energy production and use. It is a cross-California collaboration led on the U.S. side by Berkeley Lab researcher Ashok Gadgil and funded primarily by the Department of Energy.

    “Berkeley Lab is ideally suited to take on the water-energy challenge,” said Ramesh. “As a multidisciplinary lab with deep expertise in energy technologies, computational sciences, energy sciences as well as earth and climate sciences, we have the opportunity to develop and integrate fundamental insights through systems-level approaches. Relevant to California, we are focusing on developing scalable water systems that are resilient in an energy-constrained and uncertain water future.”

    See the full article here .


  • richardmitnick 3:40 pm on May 13, 2016 Permalink | Reply
    Tags: , LBL, , New National Microbiome Initiative   

    From LBL: “Berkeley Lab Participates in New National Microbiome Initiative” 

    Berkeley Logo

    Berkeley Lab

    May 13, 2016
    Dan Krotz

    The Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) will participate in a new National Microbiome Initiative launched today by the White House Office of Science and Technology Policy.

    The initiative, announced at an event in Washington, D.C., will advance the understanding of microbiome behavior and enable the protection of healthy microbiomes, which are communities of microorganisms that live on and in people, plants, soil, oceans, and the atmosphere. Microbiomes maintain the healthy function of diverse ecosystems, and they influence human health, climate change, and food security.

    The National Microbiome Initiative brings together scientists from more than 100 universities, companies, research institutions, and federal agencies. The goal is to investigate fundamental principles that govern microbiomes across ecosystems, and develop new tools to study microbiomes.

    Berkeley Lab is well positioned to contribute to the national effort thanks to Microbes to Biomes, a Lab-wide initiative designed to understand, predict, and harness critical microbiomes for energy, food, environment, and health. The initiative involves scientists across Berkeley Lab in biology, environmental sciences, genomics, systems biology, computation, advanced imaging, material sciences, and engineering.

    “It’s exciting to see this coordinated National Microbiome Initiative launched. It is very much in line with our interdisciplinary vision for Microbes-to-Biomes and our goals of building a functional understanding of Earth’s microbiomes,” says Eoin Brodie, deputy director of Berkeley Lab’s Climate and Ecosystem Sciences Division.

    In addition, Brodie is the corresponding author of an editorial published* today in the journal mBio that calls for a predictive understanding of Earth’s microbiomes to address some of the most significant challenges of the 21st century. These challenges include maintaining our food, energy, and water supplies while improving the health of our population and Earth’s ecosystems. Trent Northen, director of Berkeley Lab’s Environmental Genomics and Systems Biology Division, and Mary Maxon, Biosciences Area principal deputy, are co-authors of the editorial.

    More about Berkeley Lab’s Microbes to Biomes Initiative

    Berkeley Lab’s Microbes to Biomes initiative is designed to reveal, decode and harness microbes.

    Microbes to Biomes brings together teams of Berkeley Lab scientists to discover causal mechanisms governing microbiomes and accurately predict responses. The goal is to harness beneficial microbiomes in natural and managed environments for a range of applications, including terrestrial carbon sequestration, sustainable growth of bioenergy and food crops, and environmental remediation.

    The initiative, which aims to bridge the gap from microbe-scale to biome-scale science, takes advantage of Berkeley Lab’s capabilities, ranging from biology, environmental sciences, genomics, systems biology, computation, advanced imaging, materials sciences, and engineering.

    Berkeley Lab scientists are developing new approaches to monitor, simulate, and manipulate microbe-through-biome interactions and feedbacks. They’re also creating controlled laboratory “ecosystems,” which will ultimately be virtually linked to ecosystem field observatories. The initial goal is to build a mechanistic and predictive understanding of the soil-microbe-plant biome.

    More about the mBio editorial

    The potential impact of a unified Microbiome initiative to understand and responsibly harness the activities of microbial communities. (Credit: Diana Swantek, Berkeley Lab)

    The mBio paper makes the case that given the extensive influence of microorganisms across our biosphere—they’ve shaped our planet and its inhabitants for over 3.5 billion years—and new scientific capabilities, the time is ripe for a cross-disciplinary effort to understand, predict, and harness microbiome function to help address the big challenges of today.

    This effort could draw on rapidly improving advances in gene function testing as well as precision manipulation of genes, communities, and model ecosystems. Recently developed analytical and simulation approaches could also be utilized.

    The goal is to improve prediction of ecosystem response and enable the development of new, responsible, microbiome-based solutions to significant energy, health, and environmental problems.

    The mBio editorial was authored by eleven scientists from several institutions. The Berkeley Lab co-authors were supported by the Department of Energy’s Office of Science.

    Science editorial:
    Toward a Predictive Understanding of Earth’s Microbiomes to Address 21st Century Challenges

    See the full article here .


  • richardmitnick 2:30 pm on May 12, 2016 Permalink | Reply
    Tags: , , , LBL,   

    From LBL and Princeton: “$40M to Establish New Observatory Probing Early Universe” 

    Berkeley Logo

    Berkeley Lab

    May 12, 2016
    News Release

    The Simons Array in the Atacama Desert in Chile, with the Atacama Cosmology Telescope
    The Simons Array will be located in Chile’s High Atacama Desert, at an elevation of about 17,000 feet. The site currently hosts the Atacama Cosmology Telescope (bowl-shaped structure at upper right) and the Simons Array (the three telescopes at bottom left, center and right). The Simons Observatory will merge these two experiments, add several new telescopes and set the stage for a next-generation experiment. (Credit: University of Pennsylvania)

    The Simons Foundation has given $38.4 million to establish a new astronomy facility in Chile’s Atacama Desert, adding new telescopes and detectors alongside existing instruments in order to boost ongoing studies of the evolution of the universe, from its earliest moments to today. The Heising-Simons Foundation is providing an additional $1.7 million for the project.

    The Simons Observatory is a collaboration among the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab); UC Berkeley; Princeton University; the University of California at San Diego; and University of Pennsylvania, all of which are also providing financial support.

    The project manager for the Simons Observatory will be located at Princeton, and Princeton faculty also will oversee the development, design, testing and manufacture of many of the observatory’s camera components.

    The observatory will probe the subtle properties of the universe’s first light, known as cosmic microwave background (CMB) radiation.

    A critical element in wringing new cosmological information from the CMB — which is the glow of heat left over from the Big Bang — is the use of densely packed, very sensitive cryogenic detectors. Princeton’s expertise with the detector development for the Atacama Cosmology Telescope in Chile and other observatories will complement the collaborative effort of the Simons Observatory, said Suzanne Staggs, Princeton’s project lead for the observatory and the Henry DeWolf Smyth Professor of Physics.

    Of particular importance is the University’s large dilution refrigerator-based camera testing facility located in the Department of Physics. The CMB has a temperature of about 3 kelvins (-454.27 degrees Fahrenheit), and CMB detectors become more sensitive the colder they are. The Princeton facility will test the Simons Observatory equipment at a frosty 80 millikelvin, or eighty one-thousandths of a degree above absolute zero.
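As a quick sanity check on the temperatures quoted above, the kelvin-to-Fahrenheit conversion is F = (K − 273.15) × 9/5 + 32. A short Python sketch (the function name is ours, not from the article):

```python
def kelvin_to_fahrenheit(k):
    """Convert a temperature in kelvins to degrees Fahrenheit."""
    return (k - 273.15) * 9 / 5 + 32

# The ~3 K cosmic microwave background:
print(round(kelvin_to_fahrenheit(3.0), 2))    # -454.27, matching the figure above

# The 80 millikelvin (0.080 K) test stand at Princeton:
print(round(kelvin_to_fahrenheit(0.080), 2))  # -459.53
```

The second result shows why millikelvin-scale testing is described as "frosty": 80 mK sits less than half a degree Fahrenheit above absolute zero (-459.67 °F).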

    Cosmic Microwave Background per ESA/Planck


    The observatory will pay particular attention to the polarization, or directional information, in the CMB light to better understand what took place a fraction of a second after the Big Bang. While these events are hidden from view behind the glare of the microwave radiation, the disturbances they caused in the fabric of space-time affected the microwave’s polarization, and scientists hope to work backwards from these measurements to test theories about how the universe came into existence.

    “The Simons Observatory will allow us to peer behind the dust in our galaxy and search for a true signal from the Big Bang,” said Adrian Lee, a physicist at Berkeley Lab, a UC Berkeley physics professor and one of the lead investigators at the observatory.

    A key goal of the project is to detect gravitational waves generated by cosmic inflation, an extraordinarily rapid expansion of space that, according to the most popular cosmological theory, took place in an instant after the Big Bang. These primordial gravitational waves induced a very small but characteristic polarization pattern, called B-mode polarization, in the microwave background radiation that can be detected by telescopes and cameras like those planned for the Simons Observatory.

    “While patterns that we see in the microwave sky are a picture of the structure of the universe 380,000 years after the Big Bang, we believe that some of these patterns were generated much earlier, by gravitational waves produced in the first moments of the universe’s expansion,” said project spokesperson Mark Devlin, a cosmologist at the University of Pennsylvania who leads the university’s team in the collaboration. “By measuring how the gravitational waves affect electrons and matter 380,000 years after the Big Bang we are observing fossils from the very, very early universe.”

    Lee added, “Once we see the signal of inflation, it will be the beginning of a whole new era of cosmology. We will then be looking at a time when the energy scale in the universe was a trillion times higher than the energy accessible in any particle accelerator on Earth.”

    By measuring how radiation from the early universe changed as it traveled through space to Earth, the observatory also will teach us about the nature of dark energy and dark matter, the properties of neutrinos and how large-scale structure formed as the universe expanded and evolved.

    Two existing instruments at the site—the Atacama Cosmology Telescope and the Simons Array—are currently measuring this polarization. The foundation funds will merge these two experiments, expand the search and develop new technology for a fourth-stage, next-generation project—dubbed CMB-Stage 4 or CMB-S4—that could conceivably mine all the cosmological information in the cosmic microwave background fluctuations possible from a ground-based observatory.

    “We are still in the planning stage for CMB-S4, and this is a wonderful opportunity for the foundations to create a seed for the ultimate experiment,” said Akito Kusaka, a Berkeley Lab physicist and one of the lead investigators. “This gets us off to a quick start.”

    The Simons Observatory is designed to be a first step toward CMB-S4. This next-generation experiment builds on years of support from the National Science Foundation (NSF), and the Department of Energy (DOE) Office of Science has announced its intent to participate in CMB-S4, following the recommendation by its particle physics project prioritization panel [FNAL P5]. Such a project is envisioned to have telescopes at multiple sites and draw together a broad community of experts from the U.S. and abroad. The Atacama site in Chile has already been identified as an excellent location for CMB-S4, and the Simons Foundation funding will help develop it for that role.

    “We are hopeful that CMB-S4 would shed light not only on inflation, but also on the dark elements of the universe: neutrinos and so-called dark energy and dark matter,” Kusaka said. “The nature of these invisible elements is among the biggest questions in particle physics as well.”

    Beyond POLARBEAR

    Experiments at the Chilean site have already paved the way for CMB-S4. POLARBEAR, a UC Berkeley-led experiment begun in 2012 with participation by Berkeley Lab researchers, used a 3.5-meter telescope at the site to measure the gravitational-lensing-generated B-mode polarization of the cosmic microwave background radiation.

    POLARBEAR McGill Telescope located in the Atacama Desert of northern Chile in the Antofagasta Region. The POLARBEAR experiment is mounted on the Huan Tran Telescope (HTT) at the James Ax Observatory in the Chajnantor Science Reserve.

    Team scientists confirmed in 2014 that the signal was strong enough to allow them eventually to measure the neutrino mass and the evolution of dark energy.

    The recent addition of two more telescopes upgrades POLARBEAR to the Simons Array, which will speed up the mapping of the CMB and improve sky and frequency coverage. The $40 million in new funding will make possible the successor to the Simons Array and the nearby Atacama Cosmology Telescope.

    Current stage-3 experiments use detectors for these short-wavelength microwaves that must be chilled to three-tenths of a kelvin above absolute zero; they have about 10,000 pixels, Lee said.

    “We need to make a leap in our technology to pave the way for the 500,000 detectors required for the ultimate experiment,” he said. “We’ll be generating the blueprint for a much more capable telescope.”

    “The generosity of this award is unprecedented in our field, and will enable a major leap in scientific capability,” said Brian Keating, leader of the UC San Diego contingent and current project director. “People are used to thinking about mega- or gigapixel detectors in optical telescopes, but for signals in the microwave range 10,000 pixels is a lot. What we’re trying to do—the real revolution here—is to pave the way to increase our pixel count by more than an order of magnitude.”
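The jump Keating describes is easy to check with simple arithmetic: going from roughly 10,000 detectors in today’s stage-3 cameras to the 500,000 envisioned for CMB-S4 is a factor of 50, or about 1.7 orders of magnitude. A minimal sketch (variable names are ours):

```python
import math

current_pixels = 10_000   # today's stage-3 microwave cameras, per the article
cmb_s4_pixels = 500_000   # detector count envisioned for CMB-S4

factor = cmb_s4_pixels / current_pixels
print(factor)                        # 50.0
print(round(math.log10(factor), 2))  # 1.7 orders of magnitude
```

This is consistent with the quote: 1.7 orders of magnitude is indeed “more than an order of magnitude.”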

    Berkeley Lab and UC Berkeley will contribute $1.25 million in matching funds to the project over the next five years. The $1.7 million contributed by the Heising-Simons Foundation will be devoted to supporting research at Berkeley to improve the microwave detectors and to develop fabrication methods that are more efficient and cheaper, with the goal of boosting the number of detectors in CMB experiments by more than a factor of 10.

    The site in Chile is located in the Parque Astronómico, which is administered by the Comisión Nacional de Investigación Científica y Tecnológica (CONICYT). Since 1998, U.S. investigators and the NSF have worked with Chilean scientists, the University of Chile, and CONICYT to locate multiple projects at this high, dry site to study the CMB.

    See the full LBL article here .

    See the full Princeton article here .

