Tagged: Lawrence Berkeley National Laboratory

  • richardmitnick 4:54 pm on December 8, 2014
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “World Record for Compact Particle Accelerator” 

    December 8, 2014
    Kate Greene 510-486-4404

    Using one of the most powerful lasers in the world, researchers have accelerated subatomic particles to the highest energies ever recorded from a compact accelerator.

    A 9-cm-long capillary discharge waveguide used in BELLA experiments to generate multi-GeV electron beams. The plasma plume has been made more prominent with the use of HDR photography. Credit: Roy Kaltschmidt

    The team, from the U.S. Department of Energy’s Lawrence Berkeley National Lab (Berkeley Lab), used a specialized petawatt laser and an ionized gas of charged particles, called a plasma, to get the particles up to speed. The setup is known as a laser-plasma accelerator, an emerging class of particle accelerators that physicists believe can shrink traditional, miles-long accelerators to machines that fit on a table.

    The researchers sped up the particles—electrons in this case—inside a nine-centimeter-long tube of plasma, reaching an energy of 4.25 giga-electron volts. Acceleration over such a short distance corresponds to an energy gradient 1,000 times greater than that of traditional particle accelerators and marks a world-record energy for laser-plasma accelerators.
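    A quick back-of-the-envelope check (my own arithmetic, not the article's) shows where that gradient figure comes from:

```python
# Back-of-the-envelope check of the accelerating gradient. The 4.25 GeV
# energy, 9 cm length, and ~100 MeV/m breakdown limit are from the article;
# everything derived below is my own arithmetic.
energy_gain_GeV = 4.25       # electron energy gained in the plasma channel
channel_length_m = 0.09      # 9 cm capillary

gradient_GV_per_m = energy_gain_GeV / channel_length_m
print(f"Laser-plasma gradient: ~{gradient_GV_per_m:.0f} GV/m")   # ~47 GV/m

# Conventional metal cavities break down near 100 MeV/m (0.1 GV/m); typical
# operating gradients are lower still (tens of MeV/m), which is presumably
# the baseline behind the article's factor-of-1,000 comparison.
print(f"vs. 0.1 GV/m breakdown ceiling: {gradient_GV_per_m / 0.1:.0f}x")
```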

    “This result requires exquisite control over the laser and the plasma,” says Dr. Wim Leemans, director of the Accelerator Technology and Applied Physics Division at Berkeley Lab and lead author on the paper. The results appear in the most recent issue of Physical Review Letters.

    Traditional particle accelerators, like the Large Hadron Collider at CERN, which is 17 miles in circumference, speed up particles by modulating electric fields inside a metal cavity. It’s a technique that has a limit of about 100 mega-electron volts per meter before the metal breaks down.

    Laser-plasma accelerators take a completely different approach. In the case of this experiment, a pulse of laser light is injected into a short and thin straw-like tube that contains plasma. The laser creates a channel through the plasma as well as waves that trap free electrons and accelerate them to high energies. It’s similar to the way that a surfer gains speed when skimming down the face of a wave.

    The record-breaking energies were achieved with the help of BELLA (Berkeley Lab Laser Accelerator), one of the most powerful lasers in the world. BELLA, which produces a quadrillion watts of power (a petawatt), began operation just last year.

    BELLA at LBL

    “It is an extraordinary achievement for Dr. Leemans and his team to produce this record-breaking result in their first operational campaign with BELLA,” says Dr. James Symons, associate laboratory director for Physical Sciences at Berkeley Lab.

    In addition to packing a high-powered punch, BELLA is renowned for its precision and control. “We’re forcing this laser beam into a 500 micron hole about 14 meters away,” Leemans says. “The BELLA laser beam has sufficiently high pointing stability to allow us to use it.” Moreover, Leemans says, the laser pulse, which fires once a second, is stable to within a fraction of a percent. “With a lot of lasers, this never could have happened,” he adds.

    Computer simulation of the plasma wakefield as it evolves over the length of the 9-cm-long channel. Credit: Berkeley Lab

    At such high energies, the researchers needed to see how various parameters would affect the outcome. So they used computer simulations at the National Energy Research Scientific Computing Center (NERSC) to test the setup before ever turning on a laser. “Small changes in the setup give you big perturbations,” says Eric Esarey, senior science advisor for the Accelerator Technology and Applied Physics Division at Berkeley Lab, who leads the theory effort. “We’re homing in on the regions of operation and the best ways to control the accelerator.”

    In order to accelerate electrons to even higher energies—Leemans’ near-term goal is 10 giga-electron volts—the researchers will need to more precisely control the density of the plasma channel through which the laser light flows. In essence, the researchers need to create a tunnel for the light pulse that’s just the right shape to handle more-energetic electrons. Leemans says future work will demonstrate a new technique for plasma-channel shaping.

    Multi-GeV electron beams from capillary-discharge-guided subpetawatt laser pulses in the self-trapping regime, by W. P. Leemans, A. J. Gonsalves, H.-S. Mao, et al., was published in Physical Review Letters on December 8, 2014.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 1:48 pm on December 3, 2014
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “High-Intensity Proton Accelerator Successfully Demonstrated” 

    December 2, 2014
    No Writer Credit

    Last month, a team of engineers and physicists from Berkeley Lab and collaborators from the Institute of Modern Physics (IMP), Chinese Academy of Sciences in Lanzhou, China, demonstrated the successful operation of a high-intensity continuous-wave (CW) linear particle accelerator. This linac, called a Radio Frequency Quadrupole (RFQ), reached a 10 mA beam current at 2.1 MeV with an operating frequency of 162.5 MHz. This is currently the highest-intensity proton RFQ in operation. IMP plans to use the RFQ as an injector for the Chinese Accelerator Driven System project, a research-and-development effort to shorten the half-life of nuclear waste by bombarding it with high-energy protons.

    The RFQ accelerator achieved 10 mA at 162.5 MHz CW operation last month in Lanzhou, China. Photo credit: Courtesy of IMP
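    As a quick sanity check (my own arithmetic, not a figure from the article), the quoted current and energy imply a substantial continuous beam power:

```python
# Average power of a continuous-wave proton beam: P = I * (E/q).
# The 10 mA current and 2.1 MeV energy are from the article; for protons
# (charge 1e), kinetic energy in MeV equals effective voltage in MV.
beam_current_A = 0.010          # 10 mA
effective_voltage_MV = 2.1      # 2.1 MeV per proton

beam_power_kW = beam_current_A * effective_voltage_MV * 1e6 / 1e3
print(f"CW beam power: {beam_power_kW:.0f} kW")   # ~21 kW, delivered continuously
```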

    “We are very proud of this collaboration,” says Division Director of the Accelerator Technology and Applied Physics Division Wim Leemans. “Judging by the feedback we received from the Chinese Academy of Sciences at Lanzhou, it is clear that the team at Berkeley Lab successfully delivered a key part of the new IMP facility, exceeding performance expectations in the process.”

    (l-r) Steve Virostek, Allan DeMello, Derun Li, Tianhuan Luo, John Staples, Matt Hoff, Andrew Lambert

    The RFQ accelerator was designed and engineered at Berkeley Lab and was fabricated and assembled at IMP in China. The LBNL team is composed of physicists Derun Li (PI), John Staples and Tianhuan Luo of the Accelerator Technology and Applied Physics Division (ATAP), and engineers Steve Virostek (lead), Matt Hoff, Andrew Lambert and Allan DeMello from the Engineering Division. Berkeley Lab has made numerous RFQ accelerators over the past 35 years, more than any other institution in the US. Current efforts include a collaboration with Fermilab to build and deliver a high-intensity RFQ by May 2015 for the US Long Baseline Neutrino Facility.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 7:02 pm on November 24, 2014
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “For Important Tumor-Suppressing Protein, Context is Key” 

    November 21, 2014
    Dan Krotz 510-486-4019

    Scientists from the US Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have learned new details about how an important tumor-suppressing protein, called p53, binds to the human genome. As with many things in life, they found that context makes a big difference.

    PDB rendering based on 1TUP: p53 complexed with DNA

    The researchers mapped the places where p53 binds to the genome in a human cancer cell line. They compared this map to a previously obtained map of p53 binding sites in a normal human cell line. These binding patterns indicate how the protein mobilizes a network of genes that quell tumor growth.

    They found that p53 occupies various types of DNA sequences, among them sequences that occur in many copies and at multiple places in the genome. These sequences, called repeats, make up about half of our genome, but their function is much less well understood than that of the non-repeated parts of the genome that code for genes.

    It’s been known for some time that p53 binds to repeats, but the Berkeley Lab scientists discovered something new: The protein is much more enriched at repeats in cancer cells than in normal cells. The binding patterns in these cell lines are very different, despite the same experimental conditions. This is evidence, they conclude, that in response to the same stress signal, p53 binds to the human genome in a way that is selective and dependent on cell context—an idea that has been an open question for years.

    Illustration of p53 binding to major categories of repeats in the human genome, such as LTR, SINE and LINE.
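    The enrichment comparison the study describes can be summarized with a simple fold-enrichment statistic. The sketch below illustrates only the calculation; the peak counts are hypothetical placeholders, not data from the paper.

```python
# Fold enrichment of p53 binding sites at repeats: the observed fraction of
# binding sites falling in repeats, divided by the fraction expected if sites
# were placed at random. Peak counts here are hypothetical, for illustration.
def fold_enrichment(peaks_in_repeats, total_peaks, repeat_fraction_of_genome):
    observed = peaks_in_repeats / total_peaks
    return observed / repeat_fraction_of_genome

REPEAT_FRACTION = 0.5   # repeats make up about half the genome (see above)

print(fold_enrichment(700, 1000, REPEAT_FRACTION))   # e.g. 1.4x ("cancer" line)
print(fold_enrichment(450, 1000, REPEAT_FRACTION))   # e.g. 0.9x ("normal" line)
```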

    The research is published online Nov. 21 in the journal PLOS ONE.

    “It is well established that p53 regulates specific sets of genes, depending on the cell type and the DNA damage type. But how that specificity is achieved, and whether p53 binds to the genome in a selective manner, has been a matter of debate. We show that p53 binding is indeed selective and dependent on cell context,” says Krassimira Botcheva of Berkeley Lab’s Life Sciences Division. She conducted the research with Sean McCorkle of Brookhaven National Laboratory.

    What exactly does cell context mean in this case? The DNA that makes up the genome is organized into chromatin, which is further packed into chromosomes. Different cell types differ by their chromatin state. Cancer can change chromatin in a way that doesn’t affect DNA sequences, a type of change that is called epigenetic. The new research indicates that epigenetic changes to chromatin may have a big impact on how p53 does its job.

    “To understand p53 tumor suppression functions that depend on DNA binding, we have to examine these functions in the context of the dynamic, cancer-associated epigenetic changes,” says Botcheva.

    Their finding is the latest insight into p53, one of the most studied human proteins. For the past 35 years, scientists have explored how the protein fights cancer. After DNA damage, p53 can initiate cell cycle arrest to allow time for DNA repair. The protein can promote senescence, which stops a cell from proliferating. It can also trigger cell death if the DNA damage is severe.

    Much of this research has focused on how p53 binds to the non-repeated part of the genome, where the genes are located. This latest research suggests that repeats deserve a lot of attention too.

    “Our research indicates that p53 binding at repeats could be essential for maintaining the genomic stability,” says Botcheva. “Repeats could have a significant impact on the way the entire p53 network is mobilized to ensure tumor suppression.”

    The research was supported by the U.S. Department of Energy’s Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 4:27 pm on November 17, 2014
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “As Temperatures Rise, Soil Will Relinquish Less Carbon to the Atmosphere Than Currently Predicted” 

    November 17, 2014
    Dan Krotz 510-486-4019

    New Berkeley Lab model quantifies interactions between soil microbes and their surroundings

    Here’s another reason to pay close attention to microbes: Current climate models probably overestimate the amount of carbon that will be released from soil into the atmosphere as global temperatures rise, according to research from the US Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab).

    The findings are from a new computer model that explores the feedbacks between soil carbon and climate change. It’s the first such model to include several physiologically realistic representations of how soil microbes break down organic matter, a process that annually unleashes about ten times as much carbon into the atmosphere as fossil fuel emissions. In contrast, today’s models include a simplistic representation of microbial behavior.

    The research is published Nov. 17 on the website of the journal Nature Climate Change.

    Based on their results, the Berkeley Lab scientists recommend that future Earth system models include a more nuanced and dynamic depiction of how soil microbes go about the business of degrading organic matter and freeing up carbon.

    This approach could help scientists more accurately predict what will happen to soil carbon as Earth’s climate changes. These predictions are especially important in vulnerable regions like the Arctic, which is expected to warm considerably this century, and which holds a vast amount of carbon in the tundra.

    “We know that microbes are the agents of change when it comes to decomposing organic matter. But the question is: How important is it to explicitly quantify complex microbial interactions in climate models?” says Jinyun Tang, a scientist in Berkeley Lab’s Earth Sciences Division who conducted the research with fellow Berkeley Lab scientist William Riley.

    “We found that it makes a big difference,” Tang says. “We showed that warming temperatures would return less soil carbon to the atmosphere than current models predict.”

    The complex and dynamic livelihood of soil microbes is captured in this schematic. For the first time, these processes are represented in a computer model that predicts the fate of soil carbon as temperatures rise. (Credit: Berkeley Lab)

    Terrestrial ecosystems, such as the Arctic tundra and Amazon rainforest, contain a huge amount of carbon in organic matter such as decaying plant material. Thanks to soil microbes that break down organic matter, these ecosystems also contribute a huge amount of carbon to the atmosphere.

    The soil above the Arctic Circle near Barrow, Alaska, contains a tremendous amount of carbon. New research may help scientists better predict how much of this carbon will be released as the climate warms.

    Because soil is such a major player in the carbon cycle, even a small change in the amount of carbon it releases can have a big effect on atmospheric carbon concentrations. This dynamic implies that climate models should represent soil-carbon processes as accurately as possible.

    But here’s the problem: Numerous empirical experiments have shown that the ways in which soil microbes decompose organic matter, and respond to changes in temperature, vary over time and from place to place. This variability is not captured in today’s ecosystem models, however. Microbes are depicted statically. They respond instantaneously when they’re perturbed, and then revert as if nothing happened.

    To better portray the variability of the microbial world, Tang and Riley developed a numerical model that quantifies the costs incurred by microbes to respire, grow, and consume energy. Their model accounts for internal physiology, such as the production of enzymes that help microbes break down organic matter. It includes external processes, such as the competition for these enzymes once they’re outside the microbe. Some enzymes adsorb onto mineral surfaces, which means they are not available to chew through organic matter. The model also includes competition between different microbial populations.

    Together, these interactions—from enzymes to minerals to populations—represent microbial networks as ever-changing systems, much like what’s observed in experiments.
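    To make those ingredients concrete, here is a deliberately minimal toy model in the spirit of the description above: enzymes adsorbed onto minerals are unavailable, and only the free remainder drives Michaelis–Menten decomposition. It is a sketch of the general idea, not the Tang and Riley model.

```python
# Toy soil-carbon model: mineral adsorption sequesters part of the enzyme
# pool, and only free enzymes decompose organic carbon (Michaelis-Menten).
# Parameter values are arbitrary illustrations, not calibrated numbers.
dt, n_steps = 0.1, 1000      # timestep (days) and number of Euler steps
carbon = 100.0               # soil organic carbon pool (g C per m^2)
enzyme_total = 1.0           # total enzyme pool (arbitrary units)
k_adsorption = 4.0           # strength of enzyme binding to mineral surfaces
v_max, k_m = 0.5, 50.0       # Michaelis-Menten rate constants

for _ in range(n_steps):
    enzyme_free = enzyme_total / (1.0 + k_adsorption)   # adsorbed = unavailable
    decomposition = v_max * enzyme_free * carbon / (k_m + carbon)
    carbon -= decomposition * dt

print(f"Carbon remaining after {dt * n_steps:.0f} days: {carbon:.1f} g C/m^2")
```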

    The result? When the model was subjected to a 4-degree-Celsius change, it predicted more variable but weaker soil carbon–climate feedbacks than current approaches.

    “There’s less carbon flux to the atmosphere in response to warming,” says Riley. “Our representation is more complex, which has benefits in that it’s likely more accurate. But it also has costs, in that the parameters used in the model need to be further studied and quantified.”

    Tang and Riley recommend more research be conducted on these microbial and mineral interactions. They also recommend that these features ultimately be included in next-generation Earth system models, such as the Department of Energy’s Accelerated Climate Modeling for Energy, or ACME.

    The research was supported by the Department of Energy’s Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 5:39 pm on November 12, 2014
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “Latest Supercomputers Enable High-Resolution Climate Models, Truer Simulation of Extreme Weather” 

    November 12, 2014
    Julie Chao (510) 486-6491

    Not long ago, it would have taken several years to run a high-resolution simulation on a global climate model. But using some of the most powerful supercomputers now available, Lawrence Berkeley National Laboratory (Berkeley Lab) climate scientist Michael Wehner was able to complete a run in just three months.

    What he found was that not only were the simulations much closer to actual observations, but the high-resolution models were far better at reproducing intense storms, such as hurricanes and cyclones. The study, “The effect of horizontal resolution on simulation quality in the Community Atmospheric Model, CAM5.1,” has been published online in the Journal of Advances in Modeling Earth Systems.

    “I’ve been calling this a golden age for high-resolution climate modeling because these supercomputers are enabling us to do gee-whiz science in a way we haven’t been able to do before,” said Wehner, who was also a lead author for the recent Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC). “These kinds of calculations have gone from basically intractable to heroic to now doable.”

    Michael Wehner, Berkeley Lab climate scientist

    Using version 5.1 of the Community Atmospheric Model, developed by the Department of Energy (DOE) and the National Science Foundation (NSF) for use by the scientific community, Wehner and his co-authors conducted an analysis for the period 1979 to 2005 at three spatial resolutions: 25 km, 100 km, and 200 km. They then compared those results to each other and to observations.
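    A rough scaling argument (mine, not the paper's) shows why resolution is so expensive: halving the grid spacing quadruples the number of grid columns and, through the timestep stability condition, roughly doubles the number of steps, so cost grows approximately as the cube of the resolution ratio.

```python
# Approximate cost scaling of a global atmospheric model with horizontal
# resolution. The grid spacings are the paper's three configurations; the
# cubic cost scaling is a standard rule of thumb, assumed here.
EARTH_SURFACE_KM2 = 510e6

for dx_km in (200, 100, 25):
    columns = EARTH_SURFACE_KM2 / dx_km**2          # grid columns needed
    relative_cost = (200 / dx_km) ** 3              # vs. the 200 km run
    print(f"{dx_km:>3} km: ~{columns:>9,.0f} columns, ~{relative_cost:>4.0f}x cost")
```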

    One simulation generated 100 terabytes of data, or 100,000 gigabytes. The computing was performed at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility. “I’ve literally waited my entire career to be able to do these simulations,” Wehner said.

    The higher resolution was particularly helpful in mountainous areas, since the models average the altitude over each grid cell (cells 25 km across in the high-resolution runs versus 200 km across in the low-resolution runs). With more accurate representation of mountainous terrain, the higher-resolution model is better able to simulate snow and rain in those regions.

    “High resolution gives us the ability to look at intense weather, like hurricanes,” said Kevin Reed, a researcher at the National Center for Atmospheric Research (NCAR) and a co-author on the paper. “It also gives us the ability to look at things locally at a lot higher fidelity. Simulations are much more realistic at any given place, especially if that place has a lot of topography.”

    The high-resolution model produced stronger storms and more of them, which was closer to the actual observations for most seasons. “In the low-resolution models, hurricanes were far too infrequent,” Wehner said.

    The IPCC chapter on long-term climate change projections that Wehner was a lead author on concluded that a warming world will cause some areas to be drier and others to see more rainfall, snow, and storms. Extremely heavy precipitation was projected to become even more extreme in a warmer world. “I have no doubt that is true,” Wehner said. “However, knowing it will increase is one thing, but having a confident statement about how much and where as a function of location requires the models do a better job of replicating observations than they have.”

    Wehner says the high-resolution models will help scientists to better understand how climate change will affect extreme storms. His next project is to run the model for a future-case scenario. Further down the line, Wehner says scientists will be running climate models with 1 km resolution. To do that, they will have to have a better understanding of how clouds behave.

    “A cloud system-resolved model can reduce one of the greatest uncertainties in climate models, by improving the way we treat clouds,” Wehner said. “That will be a paradigm shift in climate modeling. We’re at a shift now, but that is the next one coming.”

    The paper’s other co-authors include Fuyu Li, Prabhat, and William Collins of Berkeley Lab; and Julio Bacmeister, Cheng-Ta Chen, Christopher Paciorek, Peter Gleckler, Kenneth Sperber, Andrew Gettelman, and Christiane Jablonowski from other institutions. The research was supported by the Biological and Environmental Division of the Department of Energy’s Office of Science.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 3:42 pm on November 8, 2014
    Tags: Lawrence Berkeley National Laboratory, Synthetic Biology

    From LBL: “Synthetic Biology for Space Exploration” 

    November 5, 2014
    Lynn Yarris (510) 486-5375

    Does synthetic biology hold the key to manned space exploration of the Moon and Mars? Berkeley Lab researchers have used synthetic biology to produce an inexpensive and reliable microbial-based alternative to the world’s most effective anti-malaria drug, and to develop clean, green and sustainable alternatives to gasoline, diesel and jet fuels. In the future, synthetic biology could also be used to make manned space missions more practical.

    “Not only does synthetic biology promise to make the travel to extraterrestrial locations more practical and bearable, it could also be transformative once explorers arrive at their destination,” says Adam Arkin, director of Berkeley Lab’s Physical Biosciences Division (PBD) and a leading authority on synthetic and systems biology.

    “During flight, the ability to augment fuel and other energy needs, to provide small amounts of needed materials, plus renewable, nutritional and taste-engineered food, and drugs-on-demand can save costs and increase astronaut health and welfare,” Arkin says. “At an extraterrestrial base, synthetic biology could make even more effective use of the catalytic activities of diverse organisms.”

    Adam Arkin is a leading authority on synthetic and systems biology.

    Arkin is the senior author of a paper in the Journal of the Royal Society Interface that reports on a techno-economic analysis demonstrating “the significant utility of deploying non-traditional biological techniques to harness available volatiles and waste resources on manned long-duration space missions.” The paper is titled Towards Synthetic Biological Approaches to Resource Utilization on Space Missions. The lead and corresponding author is Amor Menezes, a postdoctoral scholar in Arkin’s research group at the University of California (UC) Berkeley. Other co-authors are John Cumbers and John Hogan with the NASA Ames Research Center.

    One of the biggest challenges to manned space missions is the expense. The NASA rule of thumb is that every unit mass of payload launched requires the support of an additional 99 units of mass, with “support” encompassing everything from fuel and oxygen to food and medicine for the astronauts. Most of the technologies now deployed or under development for providing this support are abiotic, meaning non-biological. Arkin, Menezes and their collaborators have shown that providing this support with technologies based on existing biological processes is a more than viable alternative.

    Amor Menezes

    “Because synthetic biology allows us to engineer biological processes to our advantage, we found in our analysis that technologies, when using common space metrics such as mass, power and volume, have the potential to provide substantial cost savings, especially in mass,” Menezes says.

    In their study, the authors looked at four target areas: fuel generation, food production, biopolymer synthesis, and pharmaceutical manufacture. They showed that for a 916-day manned mission to Mars, the use of microbial biomanufacturing capabilities could reduce the mass of fuel manufacturing by 56 percent, the mass of food shipments by 38 percent, and the shipped mass to 3D-print a habitat for six by a whopping 85 percent. In addition, microbes could also completely replenish expired or irradiated stocks of pharmaceuticals, which would provide independence from unmanned re-supply spacecraft that take up to 210 days to arrive.
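    The article's 99-to-1 rule of thumb makes it easy to see why those percentages matter: every kilogram trimmed from the shipped payload avoids roughly a hundred kilograms of launch mass. In the sketch below, only the support ratio and percentage reductions come from the article; the baseline payload masses are hypothetical placeholders.

```python
# Launch-mass savings implied by the study's reductions, using the NASA
# rule of thumb that 1 kg of payload requires ~99 kg of supporting mass.
# Baseline payload masses are hypothetical placeholders.
SUPPORT_RATIO = 99

def launch_mass_kg(payload_kg):
    return payload_kg * (1 + SUPPORT_RATIO)

scenarios = [("fuel manufacturing", 1000, 0.56),
             ("food shipments",     1000, 0.38),
             ("habitat 3D-printing", 1000, 0.85)]

for name, baseline_kg, reduction in scenarios:
    saved = launch_mass_kg(baseline_kg * reduction)
    print(f"{name}: ~{saved:,.0f} kg of launch mass saved "
          f"per {baseline_kg:,} kg baseline payload")
```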

    “Space has always provided a wonderful test of whether technology can meet strict engineering standards for both effect and safety,” Arkin says. “NASA has worked decades to ensure that the specifications that new technologies must meet are rigorous and realistic, which allowed us to perform up-front techno-economic analysis.”

    Microbial-based biomanufacturing could be transformative once explorers arrive at an extraterrestrial site. (Image courtesy of the Journal of the Royal Society Interface)

    The big advantage biological manufacturing holds over abiotic manufacturing is the remarkable ability of natural and engineered microbes to transform very simple starting substrates, such as carbon dioxide, water, biomass or minerals, into materials that astronauts on long-term missions will need. This capability should prove especially useful for future extraterrestrial settlements.

    “The mineral and carbon composition of other celestial bodies is different from the bulk of Earth, but the Earth is diverse with many extreme environments that have some relationship to those that might be found at possible bases on the Moon or Mars,” Arkin says. “Microbes could be used to greatly augment the materials available at a landing site, enable the biomanufacturing of food and pharmaceuticals, and possibly even modify and enrich local soils for agriculture in controlled environments.”

    The authors acknowledge that much of their analysis is speculative and that their calculations show a number of significant challenges to making biomanufacturing a feasible augmentation and replacement for abiotic technologies. However, they argue that the investment to overcome these barriers offers dramatic potential payoff for future space programs.

    “We’ve got a long way to go since experimental proof-of-concept work in synthetic biology for space applications is just beginning, but long-duration manned missions are also a ways off,” says Menezes. “Abiotic technologies were developed for many, many decades before they were successfully utilized in space, so of course biological technologies have some catching-up to do. However, this catching-up may not be that much, and in some cases, the biological technologies may already be superior to their abiotic counterparts.”

    This research was supported by the National Aeronautics and Space Administration (NASA) and the University of California, Santa Cruz.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 2:58 pm on October 30, 2014
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “Lord of the Microrings” 

    October 30, 2014
    Lynn Yarris (510) 486-5375

    A significant breakthrough in laser technology has been reported by the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) Berkeley. Scientists led by Xiang Zhang, a physicist with joint appointments at Berkeley Lab and UC Berkeley, have developed a unique microring laser cavity that can produce single-mode lasing even from a conventional multi-mode laser cavity. This ability to provide single-mode lasing on demand holds ramifications for a wide range of applications including optical metrology and interferometry, optical data storage, high-resolution spectroscopy and optical communications.

    “Losses are typically undesirable in optics but, by deliberately exploiting the interplay between optical loss and gain based on the concept of parity-time symmetry, we have designed a microring laser cavity that exhibits intrinsic single-mode lasing regardless of the gain spectral bandwidth,” says Zhang, who directs Berkeley Lab’s Materials Sciences Division and is UC Berkeley’s Ernest S. Kuh Endowed Chair Professor. “This approach also provides an experimental platform to study parity-time symmetry and phase transition phenomena that originated from quantum field theory yet have been inaccessible so far in experiments. It can fundamentally broaden optical science at both semi-classical and quantum levels.”

    Xiang Zhang, director of Berkeley Lab’s Materials Sciences Division. (Photo by Roy Kaltschmidt)

    Zhang, who also directs the National Science Foundation’s Nano-scale Science and Engineering Center, and is a member of the Kavli Energy NanoSciences Institute at Berkeley, is the corresponding author of a paper in Science that describes this work. The paper is titled Single-Mode Laser by Parity-time Symmetry Breaking. Co-authors are Liang Feng, Zi Jing Wong, Ren-Min Ma and Yuan Wang.

    A laser cavity or resonator is the mirrored component of a laser in which light reflected multiple times yields a standing wave at certain resonance frequencies called modes. Laser cavities typically support multiple modes because their dimensions are much larger than optical wavelengths. Competition between modes limits the optical gain in amplitude and results in random fluctuations and instabilities in the emitted laser beams.

    “For many applications, single-mode lasing is desirable for its stable operation, better beam quality, and easier manipulation,” Zhang says. “Light emission from a single-mode laser is monochromatic with low phase and intensity noises, but creating sufficiently modulated optical gain and loss to obtain single-mode lasing has been a challenge.”

    Scanning electron microscope image of the fabricated PT symmetry microring laser cavity.

    While mode manipulation and selection strategies have been developed to achieve single-mode lasing, each of these strategies has only been applicable to specific configurations. The microring laser cavity developed by Zhang’s group is the first successful concept for a general design. The key to their success is using the concept of the breaking of parity-time (PT) symmetry. The law of parity-time symmetry dictates that the properties of a system, like a beam of light, remain the same even if the system’s spatial configuration is reversed, like a mirror image, or the direction of time runs backward. Zhang and his group discovered a phenomenon called “thresholdless parity-time symmetry breaking” that provides them with unprecedented control over the resonant modes of their microring laser cavity, a critical requirement for emission control in laser physics and applications.

    Liang Feng

    “Thresholdless PT symmetry breaking means that our light beam undergoes symmetry breaking once the gain/loss contrast is introduced no matter how large this contrast is,” says Liang Feng, lead author of the Science paper, a recent postdoc in Zhang’s group and now an assistant professor with the University at Buffalo. “In other words, the threshold for PT symmetry breaking is zero gain/loss contrast.”

    Zhang, Feng and the other members of the team were able to exploit the phenomenon of thresholdless PT symmetry breaking through the fabrication of a unique microring laser cavity. This cavity consists of bilayered structures of chromium/germanium arranged periodically in the azimuthal direction on top of a microring resonator made from an indium-gallium-arsenide-phosphide compound on a substrate of indium phosphide. The diameter of the microring is 9 micrometers.
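    To see why such a cavity is naturally multi-mode, consider the ring resonance condition m·λ = n_eff·π·d. The sketch below uses only the 9-micrometer diameter from the article; the effective index of ~3.4 for the InGaAsP system is my assumption.

```python
# Resonant modes of a microring: an integer number m of wavelengths must fit
# in the optical round trip, m * lambda = n_eff * pi * d. The 9 um diameter
# is from the article; n_eff ~ 3.4 is my assumption for InGaAsP.
import math

diameter_m = 9e-6
n_eff = 3.4
round_trip_m = n_eff * math.pi * diameter_m

lam = 1.5e-6                         # look near 1.5 um, a typical gain region
m0 = round(round_trip_m / lam)       # nearest azimuthal mode number
for m in (m0 - 1, m0, m0 + 1):
    print(f"m = {m}: lambda = {round_trip_m / m * 1e9:.0f} nm")

fsr_nm = lam**2 / round_trip_m * 1e9   # spacing between neighboring modes
print(f"mode spacing ~{fsr_nm:.0f} nm: several modes fall inside a gain")
print("bandwidth of tens of nanometres, hence the need for mode selection")
```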

    “The introduced rotational symmetry in our microring resonator is continuous, mimicking an infinite system,” says Feng. “The counterintuitive discovery we made is that PT symmetry does not hold even at an infinitesimal gain/loss modulation when a system is rotationally symmetric. This was not observed in previous one-dimensional PT modulation systems because those finite systems did not support any continuous symmetry operations.”

    Using the continuous rotational symmetry of their microring laser cavity to facilitate thresholdless PT symmetry breaking, Zhang, Feng and their collaborators are able to delicately manipulate optical gain and loss in such a manner as to ultimately yield single-mode lasing.

    “PT symmetry breaking means an optical mode can be gain-dominant for lasing, whereas PT symmetry means all the modes remain passive,” says Zi-Jing Wong, co-lead author and a graduate student in Zhang’s group. “With our microring laser cavity, we facilitate a desired mode in PT symmetry breaking, while keeping all other modes PT symmetric. Although PT symmetry breaking by itself cannot guarantee single-mode lasing, when acting together with PT symmetry for all other modes, it facilitates single-mode lasing.”

    In their Science paper, the researchers suggest that single-mode lasing through PT-symmetry breaking could pave the way to next-generation optoelectronic devices for communications and computing, as it enables the independent manipulation of multiple laser beams without the “crosstalk” problems that plague today’s systems. Their microring laser cavity concept might also be used to engineer optical modes in a typical multi-mode laser cavity to create a desired lasing mode and emission pattern.

    “Our microring laser cavities could also replace the large laser boxes that are routinely used in labs and industry today,” Feng says. “Moreover, the demonstrated single-mode operation regardless of gain spectral bandwidth may create a laser chip carrying trillions of informational signals at different frequencies. This would make it possible to shrink a huge datacenter onto a tiny photonic chip.”

    This research was supported by the Office of Naval Research MURI program.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 3:20 pm on October 29, 2014
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “New Lab Startup Afingen Uses Precision Method to Enhance Plants” 

    October 29, 2014
    Julie Chao (510) 486-6491

    Imagine being able to precisely control specific tissues of a plant to enhance desired traits without affecting the plant’s overall function. Thus a rubber tree could be manipulated to produce more natural latex. Trees grown for wood could be made with higher lignin content, making for stronger yet lighter-weight lumber. Crops could be altered so that only the leaves and certain other tissues had more wax, thus enhancing the plant’s drought tolerance, while its roots and other functions were unaffected.

    By manipulating a plant’s metabolic pathways, two scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), Henrik Scheller and Dominique Loqué, have figured out a way to genetically rewire plants to allow for an exceptionally high level of control over the spatial pattern of gene expression, while at the same time boosting expression to very high levels. Now they have launched a startup company called Afingen to apply this technology for developing low-cost biofuels that could be cost-competitive with gasoline and corn ethanol.

    Henrik Scheller (left) and Dominique Loqué hold a tray of Arabidopsis thaliana plants, which they used in their research. (Berkeley Lab photo)

    “With this tool we seem to have found a way to control very specifically what tissue or cell type expresses whatever we want to express,” said Scheller. “It’s a new way that people haven’t thought about to increase metabolic pathways. It could be for making more cell wall, for increasing the stress tolerance response in a specific tissue. We think there are many different applications.”

    Cost-competitive biofuels

    Afingen was awarded a Small Business Innovation Research (SBIR) grant earlier this year for $1.72 million to engineer switchgrass plants that will contain 20 percent more fermentable sugar and 40 percent less lignin in selected structures. The grant was provided under a new SBIR program at DOE that combines an SBIR grant with an option to license a specific technology produced at a national laboratory or university through DOE-supported research.

    “Techno-economic modeling done at (the Joint BioEnergy Institute, or JBEI) has shown that you would get a 23 percent reduction in the price of the biofuel with just a 20 percent reduction in lignin,” said Loqué. “If we could also increase the sugar content and make it easier to extract, that would reduce the price even further. But of course it also depends on the downstream efficiency.”

    Scheller and Loqué are plant biologists with the Department of Energy’s Joint BioEnergy Institute (JBEI), a Berkeley Lab-led research center established in 2007 to pursue breakthroughs in the production of cellulosic biofuels. Scheller heads the Feedstocks Division and Loqué leads the cell wall engineering group.

    The problem with too much lignin in biofuel feedstocks is that it is difficult and expensive to break down; reducing lignin content would allow the carbohydrates to be released and converted into fuels much more cost-effectively. Although low-lignin plants have been engineered, they grow poorly because important tissues lack the strength and structural integrity provided by the lignin. With Afingen’s technique, the plant can be manipulated to retain high lignin levels only in its water-carrying vascular cells, where cell-wall strength is needed for survival, but low levels throughout the rest of the plant.

    The centerpiece of Afingen’s technology is an “artificial positive feedback loop,” or APFL. The concept targets master transcription factors, which are molecules that regulate the expression of genes involved in certain biosynthetic processes, that is, whether certain genes are turned “on” or “off.” The APFL technology is a breakthrough in plant biotechnology, and Loqué and Scheller recently received an R&D 100 Award for the invention.

    An APFL is a segment of artificially produced DNA coded with instructions to make additional copies of a master transcription factor; when it is inserted at the start of a chosen biosynthetic pathway—such as the pathway that produces cellulose in fiber tissues—the plant cell will synthesize the cellulose and also make a copy of the master transcription factor that launched the cycle in the first place. Thus the cycle starts all over again, boosting cellulose production.
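    A toy rate equation captures the amplification logic (a conceptual sketch only, not a model of the actual construct): the master transcription factor is produced at a basal rate, re-synthesized in proportion to its own abundance, and degraded.

```python
# Toy positive-feedback dynamics: dT/dt = k_basal + k_fb*T - k_deg*T.
# As long as degradation outpaces feedback (k_deg > k_fb), expression
# settles at k_basal / (k_deg - k_fb) -- well above the no-feedback level.
# All rates are arbitrary illustrative values.
k_basal, k_fb, k_deg = 1.0, 0.8, 1.0
dt, tf = 0.01, 0.0                      # timestep, transcription factor level

for _ in range(5000):                   # integrate 50 time units
    tf += (k_basal + k_fb * tf - k_deg * tf) * dt

print(f"with feedback:    T -> {tf:.1f}")                  # ~5.0
print(f"without feedback: T -> {k_basal / k_deg:.1f}")     # 1.0
```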

    The process differs from classical genetic engineering. “Some people distinguish between ‘transgenic’ and ‘cisgenic.’ We’re using only pieces of DNA that are already in that plant and just rearranging them in a new way,” said Scheller. “We’re not bringing in foreign DNA.”

    Other licensees and applications

    This breakthrough technique can also be used in fungi and for a wide variety of uses in plants, for example, to increase food crop yields or to boost production of highly specialized molecules used by the pharmaceutical and chemical industries. “It could also increase the quality of forage crops, such as hay fed to cows, by increasing the sugar content or improving the digestibility,” Loqué said.

    Another intriguing application is for biomanufacturing. By engineering plants to grow entirely new pharmaceuticals, specialty chemicals, or polymer materials, the plant essentially becomes a “factory.” “We’re interested in using the plant itself as a host for production,” Scheller said. “Just like you can upregulate pathways in plants that make cell walls or oil, you can also upregulate pathways that make other compounds or properties of interest.”

    Separately, two other companies are using the APFL technology. Tire manufacturer Bridgestone has a cooperative research and development agreement (CRADA) with JBEI to develop more productive rubber-producing plants. FuturaGene, a Brazilian paper and biomass company, has licensed the technology for exclusive use with eucalyptus trees and several other crops; APFL can enhance or develop traits to optimize wood quality for pulping and bioenergy applications.

    “The inventors/founders of Afingen made the decision to not compete for a license in fields of use that were of interest to other companies that had approached JBEI. This allowed JBEI to move the technology forward more quickly on several fronts,” said Robin Johnston, Berkeley Lab’s Acting Deputy Chief Technology Transfer Officer. “APFL is a very insightful platform technology, and I think only a fraction of the applications have even been considered yet.”

    Afingen currently has one employee—Ai Oikawa, a former postdoctoral researcher and now the director of plant engineering—and will be hiring three more in November. It is the third startup company to spin out of JBEI. The first two were Lygos, which uses synthetic biology tools to produce chemical compounds, and TeselaGen, which makes tools for DNA synthesis and cloning.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 4:05 pm on October 28, 2014
    Tags: CUORE collaboration, Lawrence Berkeley National Laboratory

    From LBL: “Creating the Coldest Cubic Meter in the Universe” 

    October 28, 2014
    Kate Greene 510-486-4404

    In an underground laboratory in Italy, an international team of scientists has created the coldest cubic meter in the universe. The cooled chamber—roughly the size of a vending machine—was chilled to 6 millikelvin, or −273.144 degrees Celsius, in preparation for a forthcoming experiment that will study neutrinos, ghostlike particles that could hold the key to the existence of matter around us.

    Scientists inspect the cryostat of the Cryogenic Underground Observatory for Rare Events. Credit: CUORE collaboration

    The collaboration responsible for the record-setting refrigeration is called the Cryogenic Underground Observatory for Rare Events (CUORE), supported jointly by the Istituto Nazionale di Fisica Nucleare (INFN) in Italy, and the Department of Energy’s Office of Science and National Science Foundation in the US. Lawrence Berkeley National Lab (Berkeley Lab) manages the CUORE project in the US. The CUORE collaboration is made up of 157 scientists from the U.S., Italy, China, Spain, and France, and is based in the underground Italian facility called Laboratori Nazionali del Gran Sasso (LNGS) of the INFN.

    “We’ve been building this experiment for almost ten years,” says Yury Kolomensky, senior faculty scientist in the Physics Division of Berkeley Lab, professor of physics at UC Berkeley, and U.S. spokesperson for the CUORE collaboration. “This is a tremendous feat of cryogenics. We’ve exceeded our goal of 10 milliKelvin. Nothing in the universe this large has ever been as cold.”

    The chamber, technically called a cryostat, was designed and built in Italy, and maintained the ultra-cold temperature for more than two weeks. An international team of physicists, including students and postdoctoral scholars from Italy and the US, worked for over two years to assemble the cryostat, iron out the kinks, and demonstrate its record-breaking performance. The claim that no other object of similar size and temperature – either natural or man-made – exists in the universe was detailed in a recent paper by Jonathan Ouellet, a member of Berkeley Lab’s Nuclear Science Division and a UC Berkeley graduate student.

    To achieve such a low temperature, the team used a multi-chamber design that looks something like Russian nesting dolls: six chambers in total, each becoming progressively smaller and colder.

    An illustration of the cross-section of the cryostat with a human figure for scale. Credit: CUORE collaboration

    The chambers are evacuated, isolating the insides from room temperature, as in a thermos. The outer chambers are cooled to the temperature of liquid helium with mechanical coolers called pulse tubes, which do not require expensive cryogenic liquids. The innermost chamber is cooled using a process similar to traditional refrigeration, in which a fluid evaporates and takes heat along with it. The only fluid that operates at such cold temperatures, however, is liquid helium. The researchers use a mixture of helium-3 and helium-4 that continuously circulates in a specialized cryogenic unit called a dilution refrigerator, removing any remnant heat energy from the smallest chamber. The CUORE dilution refrigerator, built by Leiden Cryogenics in the Netherlands, is one of the most powerful in the world. “It’s a Mack truck of dilution refrigerators,” Kolomensky says.

    The ultimate purpose for the coldest cubic meter in the universe is to house a new ultra-sensitive detector. The goal of CUORE is to observe a hypothesized rare process called neutrinoless double-beta decay. Detection of this process would allow researchers to demonstrate, for the first time, that neutrinos are their own antiparticles, thereby offering a possible explanation for the abundance of matter over antimatter in our universe—in other words, why the galaxies, stars, and ultimately people exist in the universe at all.

    To detect neutrinoless double-beta decay, the team is using a detector made of 19 independent towers of tellurium dioxide (TeO2) crystals. Fifty-two crystals, each a little smaller than a Rubik’s cube, make up each tower. The team expects that they would be able to see evidence of the rare radioactive process within these cube-shaped crystals because the phenomenon would produce a barely detectable temperature rise, picked up by highly sensitive temperature sensors.
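    The scale of the array follows directly from those counts. In the sketch below, the tower and crystal numbers come from the article; the ~750-gram per-crystal mass is an outside figure and should be treated as my assumption.

```python
# Size of the CUORE detector array. Tower count and crystals-per-tower are
# from the article; the 0.75 kg per-crystal mass is assumed (the commonly
# quoted figure for CUORE's roughly Rubik's-cube-sized TeO2 crystals).
towers = 19
crystals_per_tower = 52
crystal_mass_kg = 0.75

n_crystals = towers * crystals_per_tower
print(f"{n_crystals} crystals, ~{n_crystals * crystal_mass_kg:.0f} kg of TeO2")
# -> 988 crystals, roughly 740 kg of detector held at millikelvin temperatures
```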

    Berkeley Lab, with Lawrence Livermore National Lab, has supplied roughly half the crystals for the CUORE project. In addition, Berkeley Lab designed and fabricated the highly sensitive temperature sensors – Neutron Transmutation Doped thermistors invented by Eugene Haller, UC Berkeley faculty member and senior faculty scientist in the Materials Science Division.

    UC postdocs Tom Banks and Tommy O’Donnell, who also have joint appointments with the Nuclear Science Division at Berkeley Lab, led the international team of physicists, engineers, and technicians that assembled over ten thousand parts into towers in nitrogen-filled glove boxes, including bonding almost 8,000 25-micron gold wires to 100-micron-sized pads on the temperature sensors and on copper pads connected to the detector wiring.

    The last of the 19 towers has recently been completed; all towers are now safely stored underground at LNGS, waiting to occupy the record-breaking vessel. The coldest cubic meter in the known universe is not just a feat of engineering; it will become a premier science instrument next year.

    The US CUORE team was led by the late Prof. Stuart Freedman until his untimely passing in 2012. Other current and former Berkeley Lab members of the CUORE collaboration not previously mentioned include US Contractor Project Manager Sergio Zimmermann (Engineering Division), former US Contractor Project Manager Richard Kadel (Physics Division, retired), staff scientists Jeffrey Beeman (Materials Science Division), Brian Fujikawa (Nuclear Science Division), Sarah Morgan (Engineering), Alan Smith (EH&S), postdocs Raul Hennings-Yeomans (UCB and NSD), Ke Han (NSD, now Yale), and Yuan Mei (NSD), graduate students Alexey Drobizhev and Sachi Wagaarachchi (UCB and NSD), and engineers David Biare, Lucio di Paolo (NSD and LNGS), and Joseph Wallig (Engineering).

    For more information: CUORE collaboration news release here.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California

  • richardmitnick 8:43 pm on October 3, 2014
    Tags: Lawrence Berkeley National Laboratory

    From LBL: “RCas9: A Programmable RNA Editing Tool” 

    October 3, 2014
    Lynn Yarris (510) 486-5375

    A powerful scientific tool for editing the DNA instructions in a genome can now also be applied to RNA, the molecule that translates DNA’s genetic instructions into the production of proteins. A team of researchers with Berkeley Lab and the University of California (UC) Berkeley has demonstrated a means by which the CRISPR/Cas9 protein complex can be programmed to recognize and cleave RNA at sequence-specific target sites. This finding has the potential to transform the study of RNA function by paving the way for direct RNA transcript detection, analysis and manipulation.

    Schematic shows how RNA-guided Cas9 working with PAMmer can target ssRNA for programmable, sequence-specific cleavage.

    Led by Jennifer Doudna, a biochemist and leading authority on the CRISPR/Cas9 complex, the Berkeley team showed how the Cas9 enzyme can work with short DNA sequences known as “PAM,” for protospacer adjacent motif, to identify and bind with specific sites of single-stranded RNA (ssRNA). The team is designating this RNA-targeting CRISPR/Cas9 complex as RCas9.

    “Using specially designed PAM-presenting oligonucleotides, or PAMmers, RCas9 can be specifically directed to bind or cut RNA targets while avoiding corresponding DNA sequences, or it can be used to isolate specific endogenous messenger RNA from cells,” says Doudna, who holds joint appointments with Berkeley Lab’s Physical Biosciences Division and UC Berkeley’s Department of Molecular and Cell Biology and Department of Chemistry, and is also an investigator with the Howard Hughes Medical Institute (HHMI). “Our results reveal a fundamental connection between PAM binding and substrate selection by RCas9, and highlight the utility of RCas9 for programmable RNA transcript recognition without the need for genetically introduced tags.”
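    The targeting logic rests on simple base-pairing: a DNA oligo is designed to be complementary to the chosen stretch of single-stranded RNA. The helper below illustrates only that base-pairing idea with a hypothetical target; it does not reproduce the published PAMmer design rules.

```python
# Conceptual illustration of the RNA:DNA base-pairing behind PAMmer design.
# This toy ignores the real design constraints (PAM placement, guide RNA
# overlap, chemistry) and just builds the complementary DNA strand.
RNA_TO_DNA = {"A": "T", "U": "A", "G": "C", "C": "G"}

def complementary_dna(ssrna: str) -> str:
    """DNA oligo that base-pairs with ssrna, reported 5'->3'."""
    paired = "".join(RNA_TO_DNA[base] for base in ssrna)
    return paired[::-1]            # strands are antiparallel, so reverse

target_rna = "GGAUCCGAUACGU"       # hypothetical ssRNA target
print(complementary_dna(target_rna))   # -> ACGTATCGGATCC
```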

    Biochemist Jennifer Doudna is a leading authority on the CRISPR/Cas9 complex. (Photo by Roy Kaltschmidt)

    From safer, more effective medicines and clean, green, renewable fuels, to the clean-up and restoration of our air, water and land, the potential is there for genetically engineered bacteria and other microbes to produce valuable goods and perform critical services. To exploit the vast potential of microbes, scientists must be able to precisely edit their genetic information.

    In recent years, the CRISPR/Cas complex has emerged as one of the most effective tools for doing this. CRISPR, which stands for Clustered Regularly Interspaced Short Palindromic Repeats, is a central part of the bacterial immune system and handles sequence recognition. Cas9 – Cas stands for CRISPR-associated – is an RNA-guided enzyme that handles the snipping of DNA strands at the specified sequence site.

    Together, CRISPR and Cas9 can be used to precisely edit the DNA instructions in a targeted genome for making desired types of proteins. The DNA is cut at a specific location so that old DNA instructions can be removed and/or new instructions inserted.

    Until now, it was thought that Cas9 could not be used on the RNA molecules that transcribe those DNA instructions into the desired proteins.

    “Just as Cas9 can be used to cut or bind DNA in a sequence-specific manner, RCas9 can cut or bind RNA in a sequence-specific manner,” says Mitchell O’Connell, a member of Doudna’s research group and the lead author of a paper in Nature that describes this research titled Programmable RNA recognition and cleavage by CRISPR/Cas9. Doudna is the corresponding author. Other co-authors are Benjamin Oakes, Samuel Sternberg, Alexandra East Seletsky and Matias Kaplan.

    Benjamin Oakes and Mitch O’Connell are part of the collaboration led by Jennifer Doudna that showed how the CRISPR/Cas9 complex can serve as a programmable RNA editor. (Photo by Roy Kaltschmidt)

    In an earlier study, Doudna and her group showed that the genome editing ability of Cas9 is made possible by the presence of PAM, which marks where cutting is to commence and activates the enzyme’s strand-cleaving activity. In this latest study, Doudna, O’Connell and their collaborators show that PAMmers, in a similar manner, can also stimulate site-specific endonucleolytic cleavage of ssRNA targets. They used Cas9 enzymes from the bacterium Streptococcus pyogenes to perform a variety of in vitro cleavage experiments using a panel of RNA and DNA targets.

    “While RNA interference has proven useful for manipulating gene regulation in certain organisms, there has been a strong motivation to develop an orthogonal nucleic-acid-based RNA-recognition system such as RCas9,” Doudna says. “The molecular basis for RNA recognition by RCas9 is now clear and requires only the design and synthesis of a matching guide RNA and complementary PAMmer.”

    The researchers envision a wide range of potential applications for RCas9. For example, an RCas9 tethered to a protein translation initiation factor and targeted to a specific mRNA could essentially act as a designer translation factor to “up-” or “down-” regulate protein synthesis from that mRNA.

    “Tethering RCas9 to beads could be used to isolate RNA or native RNA–protein complexes of interest from cells for downstream analysis or assays,” O’Connell says. “RCas9 fused to select protein domains could promote or exclude specific introns or exons, and RCas9 tethered to a fluorescent protein could be used to observe RNA localization and transport in living cells.”

    This research was primarily supported by the NIH-funded Center for RNA Systems Biology.

    See the full article here.

    A U.S. Department of Energy National Laboratory Operated by the University of California
