Tagged: Applied Research & Technology

  • richardmitnick 5:11 pm on October 17, 2016
    Tags: Applied Research & Technology, San Joaquin Expanding Your Horizons Conference

    From LLNL: “Girls explore STEM careers at conference” 

    Lawrence Livermore National Laboratory

    Oct. 17, 2016
    Carenda L Martin

    “She Believed She Could So She Did STEM” was the theme for the recent San Joaquin Expanding Your Horizons Conference, held at the University of the Pacific.

    “She Believed She Could So She Did STEM” was the theme for the 24th annual San Joaquin Expanding Your Horizons (SJEYH) conference, where nearly 500 young women flocked to the University of the Pacific campus in Stockton, excited to learn more about science, technology, engineering and mathematics (STEM).

    The conference, which is co-sponsored by Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories, and the University of the Pacific School of Engineering and Computer Science, sparks girls’ interest in STEM careers in a fun environment. Participants, spanning grades 6-12, came from across San Joaquin and Stanislaus counties, including Stockton, Lodi, Manteca, Modesto and other rural communities, to attend the daylong event.

    Monique Warren, a Stockton native and environmental engineer at LLNL, served as the keynote speaker, kicking off the event with an enthusiastic and inspirational talk exploring the SJEYH theme.

    As a past attendee, Warren was delighted to come full circle as the keynote speaker and credited SJEYH and programs like it for helping her get to where she is today. “When I first heard the theme for this year’s conference, I thought to myself, ‘Wow, what a great idea and what a great thing to teach,'” said Warren. “But the more I thought about this theme, the more I realized that wasn’t how my story began.”

    Warren didn’t always have a clear picture of what she wanted to do in life. “There have been many people in my life who have influenced, taught and helped to shape who I am for the better,” said Warren. “However, there are four special people in particular, that without them, I may not have become an environmental engineer. These four people are a huge part of the reason that I believed ‘I could.'”

    Warren shared that her primary inspiration came from her parents, along with mentors Andrea Hodge, an LLNL scientist, and Darin Gray, her teacher when she attended the USC Discover Engineering program.

    “My dad, a Laboratory employee, opened my eyes to the possibility of science through his determination to connect me with a mentor,” said Warren. “Through his network at LLNL, he introduced me to Andrea, who shared with me first-hand what her job entailed. Darin Gray showed me that engineers solve real world problems and, by introducing fun hands-on projects, he gave me a feel for what engineering was like. Finally, it was my mom who encouraged me to the point where I believed I could do it.”

    “Our goal today is to provide you with the opportunity to see the endless possibilities in science, technology, engineering and mathematics and to remind you that there is so much you can be and do,” said Warren. “If you want to live life like you intend to win, you need to put in the ‘EFFORT’ (enthusiasm, faith, flexibility, originality, rise [to the challenge] and teachable).”

    Each participant attended three out of 24 hands-on workshops that were offered, including titles such as: Fun With Science, Bristle Bots, DNA Cheek Cell Extraction Experiment, Cyber Defense, Ubiquitous Electronics, Water Treatment in Action, Engineer a Microscope, Computer Repair and Networking, Chemistry Potions and many more.

    After lunch and the final workshop, event organizers showed a slideshow of photos from the day and distributed raffle prizes to participants, including a laptop (grand prize). Many of those present had attended SJEYH before. Sierra Carpenter (Millennium High School), Diana Aguilera (Stockton Early College Academy), Emma Navarra and Hanna Navarra (both from Connecting Waters Charter School) received recognition for having attended the conference for all seven years.

    Jeene Villanueva, a computer scientist at LLNL, served as SJEYH conference chair for the second year in a row. “It is exciting to see the impact this conference has on students,” she said. “Past attendees are now professional women scientists and come back as volunteers to run workshops and chaperone groups. We feel the excitement continue not only in new attendees, but in workshop presenters and volunteers as well.”

    The annual conference is coordinated by a core committee of volunteers with the help of 200 additional volunteers who work at LLNL, Sandia National Laboratories and the University of the Pacific, along with other members of the community. More than 40 LLNL employees were involved in making SJEYH a successful event.

    “This conference runs smoothly due to the hard work of my awesome team that includes Deb Burdick, Martha Campiotti, Marleen Emig, Cary Gellner, Carolyn Hall, Joan Houston, Sharon Langman, Carrie Martin, Kathleen Shoga, Lindsey Whitehurst, Pearline Williams and Teri York,” said Villanueva. “I am always impressed by their selfless dedication to ensuring a successful event each year.”

    Special guests in attendance included: Jenny Kenoyer, City of Modesto council member; Maria Mendez, Stockton Unified School District Board of Education; Chiakis Ornelas, representing Congressman Jerry McNerney, 9th District Office; and Steven Howell, dean of the School of Engineering and Computer Science at the University of the Pacific.

    Various sponsors that contributed giveaways, services and donations included the American Association of University Women (AAUW); Junior League of San Joaquin County; Lawrence Livermore National Laboratory Women’s Association; Matthew Simpson (LLNL); NASCO, Modesto; Sandia Women’s Connection; SaveMart S.H.A.R.E.S. Program; Sandia/Lockheed Martin Foundation Gifts and Grants; Simplot J R Company, Lathrop; Society of Women Engineers/UOP; Soroptimist International, Manteca, Tracy; Stockton AAUW and Watermark.

    For more information, see the SJEYH website.

    To view more photos of the event, see the San Joaquin EYH 2016 photo gallery.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    LLNL Campus

    Operated by Lawrence Livermore National Security, LLC, for the Department of Energy’s National Nuclear Security Administration.
    DOE Seal

  • richardmitnick 10:42 am on October 14, 2016
    Tags: Applied Research & Technology, Cuprates, X-ray photon correlation spectroscopy

    From BNL: “Scientists Find Static “Stripes” of Electrical Charge in Copper-Oxide Superconductor” 

    Brookhaven Lab

    October 14, 2016
    Ariana Tantillo
    (631) 344-2347
    Peter Genzer
    (631) 344-3174

    Fixed arrangement of charges coexists with material’s ability to conduct electricity without resistance

    Members of the Brookhaven Lab research team—(clockwise from left) Stuart Wilkins, Xiaoqian Chen, Mark Dean, Vivek Thampy, and Andi Barbour—at the National Synchrotron Light Source II’s Coherent Soft X-ray Scattering beamline, where they studied the electronic order of “charge stripes” in a copper-oxide superconductor. No image credit.

    Cuprates, or compounds made of copper and oxygen, can conduct electricity without resistance by being “doped” with other chemical elements and cooled to temperatures below minus 210 degrees Fahrenheit. Despite extensive research on this phenomenon—called high-temperature superconductivity—scientists still aren’t sure how it works. Previous experiments have established that ordered arrangements of electrical charges known as “charge stripes” coexist with superconductivity in many forms of cuprates. However, the exact nature of these stripes—specifically, whether they fluctuate over time—and their relationship to superconductivity—whether they work together with or against the electrons that pair up and flow without energy loss—have remained a mystery.

    Now, scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have demonstrated that static, as opposed to fluctuating, charge stripes coexist with superconductivity in a cuprate when lanthanum and barium are added in certain amounts. Their research, described in a paper published on October 11 in Physical Review Letters, suggests that this static ordering of electrical charges may cooperate rather than compete with superconductivity. If this is the case, then the electrons that periodically bunch together to form the static charge stripes may be separated in space from the free-moving electron pairs required for superconductivity.

    “Understanding the detailed physics of how these compounds work helps us validate or rule out existing theories and should point the way toward a recipe for how to raise the superconducting temperature,” said paper co-author Mark Dean, a physicist in the X-Ray Scattering Group of the Condensed Matter Physics and Materials Science Department at Brookhaven Lab. “Raising this temperature is crucial for the application of superconductivity to lossless power transmission.”

    Charge stripes put to the test of time

    To see whether the charge stripes were static or fluctuating in their compound, the scientists used a technique called x-ray photon correlation spectroscopy. In this technique, a beam of coherent x-rays is fired at a sample, causing the x-ray photons, or light particles, to scatter off the sample’s electrons. These photons fall onto a specialized, high-speed x-ray camera, where they generate electrical signals that are converted to a digital image of the scattering pattern. Based on how the light interacts with the electrons in the sample, the pattern contains grainy dark and bright spots called speckles. By studying this “speckle pattern” over time, scientists can tell if and how the charge stripes change.
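
    To make the idea concrete, here is a minimal sketch of the kind of speckle analysis this implies, written in Python. The intensity autocorrelation g2(tau) it computes is the standard XPCS observable, but the array shapes and synthetic data are purely illustrative; this is not the team's actual analysis code.

```python
import numpy as np

def g2(frames, lag_steps):
    """Intensity autocorrelation g2(tau) for a stack of speckle frames.

    frames: (n_times, n_pixels) speckle intensities in a region of
    interest, one row per camera exposure. If the underlying charge
    order is static, g2 stays flat at all lags; if the stripes
    fluctuate, g2 decays toward 1 on the fluctuation timescale.
    """
    n = frames.shape[0]
    mean_i = frames.mean(axis=0)                  # time-averaged intensity per pixel
    values = []
    for lag in lag_steps:
        corr = (frames[: n - lag] * frames[lag:]).mean(axis=0)
        values.append((corr / mean_i**2).mean())  # normalize, average over pixels
    return np.array(values)

# Illustrative use with a synthetic "frozen" speckle pattern plus noise:
rng = np.random.default_rng(0)
pattern = rng.exponential(1.0, size=256)                # static speckle
frames = pattern + 0.05 * rng.normal(size=(1000, 256))  # shot-to-shot noise
print(g2(frames, lag_steps=[1, 10, 100]))               # ~constant => static order
```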

    In this study, the source of the x-rays was the Coherent Soft X-ray Scattering (CSX-1) beamline at the National Synchrotron Light Source II (NSLS-II), a DOE Office of Science User Facility at Brookhaven.

    BNL NSLS-II Interior

    “It would be very difficult to do this experiment anywhere else in the world,” said co-author Stuart Wilkins, manager of the soft x-ray scattering and spectroscopy program at NSLS-II and lead scientist for the CSX-1 beamline. “Only a small fraction of the total electrons in the cuprate participate in the charge stripe order, so the intensity of the scattered x-rays from this cuprate is extremely small. As a result, we need a very intense, highly coherent x-ray beam to see the speckles. NSLS-II’s unprecedented brightness and coherent photon flux allowed us to achieve this beam. Without it, we wouldn’t be able to discern the very subtle electronic order of the charge stripes.”

    The team’s speckle pattern was consistent throughout a nearly three-hour measurement period, suggesting that the compound has a highly static charge stripe order. Previous studies had only been able to confirm this static order up to a timescale of microseconds, so scientists were unsure if any fluctuations would emerge beyond that point.

    X-ray photon correlation spectroscopy is one of the few techniques that scientists can use to test for these fluctuations on very long timescales. The team of Brookhaven scientists—representing a close collaboration between one of Brookhaven’s core departments and one of its user facilities—is the first to apply the technique to study the charge ordering in this particular cuprate. “Combining our expertise in high-temperature superconductivity and x-ray scattering with the capabilities at NSLS-II is a great way to approach these kinds of studies,” said Wilkins.

    To make accurate measurements over such a long time, the team had to ensure the experimental setup was incredibly stable. “Maintaining the same x-ray intensity and sample position with respect to the x-ray beam are crucial, but these parameters become more difficult to control as time goes on and eventually impossible,” said Dean. “When the temperature of the building changes or there are vibrations from cars or other experiments, things can move. NSLS-II has been carefully engineered to counteract these factors, but not indefinitely.”

    “The x-ray beam at CSX-1 is stable within a very small fraction of the 10-micron beam size over our almost three-hour practical limit,” added Xiaoqian Chen, co-first author and a postdoc in the X-Ray Scattering Group at Brookhaven. CSX-1’s performance exceeds that of any other soft x-ray beamline currently operational in the United States.

    In part of the experiment, the scientists heated up the compound to test whether thermal energy might cause the charge stripes to fluctuate. They observed no fluctuations, even up to the temperature at which the compound is known to stop behaving as a superconductor.

    “We were surprised that the charge stripes were so remarkably static over such long timescales and temperature ranges,” said co-first author and postdoc Vivek Thampy of the X-Ray Scattering Group. “We thought we may see some fluctuations near the transition temperature where the charge stripe order disappears, but we didn’t.”

    In a final check, the team theoretically calculated the speckle patterns, which were consistent with their experimental data.

    Going forward, the team plans to use this technique to probe the nature of charges in cuprates with different chemical compositions.

    X-ray scattering measurements were supported by the Center for Emergent Superconductivity, an Energy Frontier Research Center funded by DOE’s Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world.

    Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 7:37 pm on October 13, 2016
    Tags: Applied Research & Technology

    From Smithsonian: “Predicting Chaos: New Sensors Sniff Out Volcanic Eruptions Before They Happen” 


    October 13, 2016
    Laura Poppick

    Mount Etna, Italy, erupts at night. (Alessandro Aiuppa, University of Palermo, Italy)

    Volcanoes have blindsided humans for millennia, leaving entire cities at the whim of their devastating eruptions. But compared to other forms of natural disaster, volcanoes actually offer a variety of quiet clues leading up to their destruction. Now, new developments in volcano monitoring systems allow scientists to sniff out, forecast and plan for eruptions with more precision than ever before.

    “We are now able to put really precise instruments on volcanoes to monitor the types of gases that are emitted, and that gives us a clue as to where magma is in the system,” says Marie Edmonds, a volcanologist at the University of Cambridge who has been working amongst fuming volcanoes for about 15 years. “We can see trends in the data relating to eruptions that are just about to happen.”

    Edmonds is part of an international group called the Deep Carbon Observatory that is working to place newly developed gas sensors on 15 of the 150 most active volcanoes on Earth by 2019, to improve their capacity to forecast different types of eruptions worldwide. Last week the Deep Carbon Observatory released an interactive visualization, supported in part by the Smithsonian Institution’s Global Volcanism Program, that allows the public to watch visualizations of historic volcanic data evolve through time.

    The visualization also lets viewers follow along as new sensors are deployed. These sensors continuously measure carbon dioxide, sulfur dioxide and water vapor fuming out of volcanoes, and are placed within large boxes and buried underground with antennae on the surface. In recent years, advancements in electronics have made them more precise and affordable, allowing scientists to deploy them more widely around the world.

    Yet placing these sensors on top of active volcanoes isn’t without risk. Researchers must wear reflective suits to protect their skin from excess heat, and gas masks to protect their lungs from corrosive gases—sometimes after hiking long distances through remote regions to reach a site. But Edmonds says the potential good such work can do for populations at risk makes the more dangerous parts of the job worthwhile.

    “It’s brilliant to know that you are doing something to actually help people,” says Edmonds. “You do think about what you’re doing because it is sometimes dangerous, but I really do enjoy it.”

    Volcanologist Tobias Fischer of the University of New Mexico hikes down the steep crater wall of the vigorously degassing Gareloi volcano in the Western Aleutian Islands to collect a volcanic gas sample. (Taryn Lopez, University of Alaska Fairbanks)

    In the past month, researchers from Edmonds’ team attached one of their sensors to a drone and measured emissions from a remote volcano in Papua New Guinea over a short period of time, demonstrating another recently developed technique used to collect snapshots of volcanic activity. When collected over a range of different types of volcanoes, these snapshots help scientists better understand the complexities of the activity leading up to an eruption. (What drones can’t do, however, is take long-term measurements.)

    Gas sensors help forecast eruptions because, as magma rises up, the resulting release of pressure overhead uncorks gases dissolved within the magma. Carbon dioxide billows out relatively early on and, as magma slithers higher up, sulfur dioxide begins to fume out. Researchers use the ratio of these two gases to determine how close the magma is getting to the earth’s surface, and how imminent an eruption may be.
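
    As a toy illustration of that reasoning, the sketch below tracks a CO2/SO2 ratio over time and flags when it falls below a threshold. The readings and the alert threshold are invented for the example; real alert levels are calibrated per volcano.

```python
# Hypothetical CO2/SO2 ratio monitor. CO2 escapes relatively early,
# while SO2 outgasses as magma nears the surface, so a falling ratio
# hints that an eruption may be getting closer.
readings = [
    # (day, CO2 ppm, SO2 ppm) -- invented numbers for illustration
    (1, 400.0, 8.0),
    (5, 420.0, 12.0),
    (9, 410.0, 25.0),
    (13, 430.0, 60.0),
]

ALERT_RATIO = 10.0  # hypothetical threshold; real ones are volcano-specific

for day, co2, so2 in readings:
    ratio = co2 / so2
    status = "ALERT: magma likely shallow" if ratio < ALERT_RATIO else "ok"
    print(f"day {day:2d}: CO2/SO2 = {ratio:5.1f}  {status}")
```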

    As magma rises, it also pushes through rock in the crust and causes tiny earthquakes not usually felt by humans above, but detectable with sensitive seismic equipment. Edmonds’ team often pairs gas sensors with seismic stations and uses the data in tandem to study volcanoes.

    Robin Matoza, a researcher at the University of California at Santa Barbara who is not involved in Edmonds’ research, agrees that technological advancements in recent years have drastically improved researchers’ ability to understand the inner workings of volcanoes and the behaviors leading up to eruptions. In places where his team once had just a few seismic stations, they have now installed 10 or more thanks to the smaller size and increasing affordability of the technology. The ability to process the collected data has also improved in recent years, Matoza says.

    “Now we are easily able to store years’ worth of seismic data just on a small flash drive,” says Matoza, who studies seismic signals released by volcanoes prior to eruptions. “So we can easily query that large data set and learn more about the processes contained in it.”
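
    Matoza’s flash-drive remark is easy to sanity-check with back-of-the-envelope arithmetic. The station parameters below are typical assumed values for a modern broadband station, not figures from his network.

```python
# Rough size of one year of continuous data from one seismic station.
sample_rate_hz = 100          # common broadband sampling rate (assumed)
channels = 3                  # vertical plus two horizontal components
bytes_per_sample = 4          # 32-bit samples, before compression
seconds_per_year = 365 * 24 * 3600

bytes_per_year = sample_rate_hz * channels * bytes_per_sample * seconds_per_year
print(f"{bytes_per_year / 1e9:.1f} GB per station-year (uncompressed)")
# ~37.8 GB -- so several station-years fit on a single flash drive,
# and standard miniSEED compression shrinks this further.
```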

    To supplement gas and seismic information on a broader scale, researchers use satellites to study eruptions from above. Volcanologists at the Alaska Volcano Observatory in Anchorage and Fairbanks collect this suite of gas, seismic and satellite data on a regular basis, monitoring roughly 25 volcanoes across the state and offering early warnings to residents.

    For example, they released a series of warnings in the months leading up to the 2009 eruption of Mount Redoubt, about 110 miles (180 km) southwest of Anchorage. They also work closely with the Federal Aviation Administration to help detect aviation hazards during eruptions.

    The researchers agree that, over time, satellites will become increasingly useful in collecting data over large regions. But at the moment, satellites are less precise and not as reliable as the other tools, in part because they don’t collect data as rapidly and don’t function well during cloudy weather.

    “You can have a satellite pass over a volcano and it can be obscured by clouds,” says Matt Haney, a volcanologist at the Alaska Volcano Observatory. “I imagine in the future there will be new satellites that are launched that will be even more powerful.”

    Despite the challenges of this work, Edmonds says it can be easier to forecast volcanic eruptions than some other hazards because of the array of warning signs preceding eruptions, compared with certain earthquakes and other abrupt disasters. And while the researchers cannot yet forecast the exact day or hour an eruption will occur, rapidly advancing technology is moving them in that direction.

    “The more instruments and the more sensors just contribute to our toolbox,” says Edmonds. “We are one step closer.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Smithsonian magazine and Smithsonian.com place a Smithsonian lens on the world, looking at the topics and subject matters researched, studied and exhibited by the Smithsonian Institution — science, history, art, popular culture and innovation — and chronicling them every day for our diverse readership.

  • richardmitnick 10:26 am on October 13, 2016
    Tags: Applied Research & Technology, Next Century Will Bring Deep Water to New York City

    From Rutgers: “Next Century Will Bring Deep Water to New York City” 

    Rutgers University

    October 10, 2016
    Ken Branson

    An increase in frequency of floods like those caused by Sandy expected by 2100

    The Brooklyn-Battery Tunnel (officially known as the Hugh L. Carey Tunnel) in Manhattan, as it appeared in the immediate aftermath of Hurricane Sandy in 2012. Photo: Metropolitan Transportation Authority of the State of New York

    New York City can expect 9-foot floods as intense as the one produced by 2012’s Superstorm Sandy at least three times more frequently over the next century – and possibly as much as 17 times more frequently, according to a paper published today by scientists at Rutgers University, Princeton University and the Woods Hole Oceanographic Institution.

    The paper was published in the Proceedings of the National Academy of Sciences.

    The study is based on a combination of historical data and computer model projections performed by Ning Lin of Princeton University, Benjamin Horton and Robert Kopp of Rutgers University, and Jeff Donnelly of the Woods Hole Oceanographic Institution. The historical data consist of tidal gauge records taken from New York City, going back to 1856, and geological records from the same area going back two millennia. The model projections consist of Kopp’s work on future sea-level models; Lin’s work on future storm intensity; and the work of Horton, Kopp and Donnelly on historical sea levels and storm surges.

    The scientists ask the question: How frequent will floods like that produced by Sandy be in the future? Earlier research led by Andra Reed, now a postdoctoral scholar at Rutgers, had shown a 20-fold increase in the frequency of extreme floods, primarily as a result of sea-level rise, between the historic period from 850 to 1850 and the late 20th century.

    The historic sea-level rise was largely due to natural effects, like the slow sinking of the land in the mid-Atlantic region in response to the end of the last ice age; but over the late 20th century, human-caused climate change came to dominate sea-level rise.

    In the paper published today, the authors report that floods as intense as Sandy’s would have occurred about once every 400 years on average under the sea-level rise conditions of the year 2000, but that such floods are expected to become about four times more probable over the 21st century due to an acceleration in the rate of sea-level rise.

    “The grand answer is that things are going to get worse by 2100,” says Horton, who is professor of Marine and Coastal Sciences in the School of Environmental and Biological Sciences. “If nothing changes with hurricanes, sea-level rise alone will increase the frequency of Sandy-like events by 2100.”

    But the size, intensity, and tracks of hurricanes may change. In the paper, Princeton’s Lin combined historical climate data and modeling of future climate conditions and storm surges. She found that these changes may lead to a more modest three-fold increase in flood probability, but may also break badly against New York City, making Sandy’s flood 17 times more probable.
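
    Those multipliers translate into cumulative odds through standard return-period arithmetic. The sketch below works the numbers, taking the roughly 1-in-400-years baseline quoted above at face value and treating years as independent, a simplification made only for illustration.

```python
# Chance of at least one Sandy-scale (~9-foot) flood over a planning
# horizon, for the baseline rate and the 3x/4x/17x multipliers above.
baseline_annual_p = 1.0 / 400.0   # ~once per 400 years under year-2000 seas
horizon_years = 84                # e.g. 2016 through 2100

for multiplier in (1, 3, 4, 17):
    p_year = baseline_annual_p * multiplier
    p_by_2100 = 1.0 - (1.0 - p_year) ** horizon_years
    print(f"{multiplier:2d}x rate: {p_by_2100:5.1%} chance of at least one by 2100")
```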

    “As we refine climate and hurricane dynamic models, we will have more accurate predictions that will allow planners to better design flood mitigation strategies,” said Lin, the study’s lead author.

    The study built upon past work by Kopp, professor of earth and planetary sciences in the School of Arts and Sciences, estimating sea levels over the 21st century. “We ask, ‘What is likely?’ and ‘What are the extremes?’” Kopp said. “We take into account factors that cause local sea level to vary from global sea level. And we’ve shown, through geological investigations, that our projections are consistent with the assumption that temperature and sea level will be related in the future as they have been over the past two thousand years.”

    Projections are not predictions and, Horton says, the spread between what is “likely” and what is “extreme” is an indication of the complexity of projecting the future. “Things are only going to get worse by 2100,” Horton says. “It’s just a question of how much worse it will get. There is no happy scenario.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    Rutgers smaller

  • richardmitnick 10:32 am on October 12, 2016
    Tags: Applied Research & Technology, Low-bandgap polymers

    From DESY: “X-ray vision reveals how polymer solar cells wear out” 



    Scientists from the Technical University of Munich have used the accurate X-ray vision provided by DESY’s radiation source PETRA III to observe the degradation of plastic solar cells.

    DESY Petra III interior
    DESY Petra III

    Their study suggests an approach for improving the manufacturing process to increase the long-term stability of such organic solar cells. The team of Prof. Peter Müller-Buschbaum has presented its findings in the latest issue of the scientific journal Advanced Energy Materials (Vol. 6, No. 19, published online in advance).

    Unlike conventional solar cells, which are made of silicon, organic solar cells produce electricity in an active blended layer between two carbon-based materials. When one of these is a polymer, the resulting cell is often referred to as a plastic solar cell. These are particularly promising because they can be manufactured simply and cheaply. They can be used to make extremely lightweight, flexible and even semi-transparent solar cells using printing techniques on flexible polymer materials, opening up completely new fields of application. In general, however, organic solar cells are less efficient than silicon-based ones, and they sometimes also have a shorter lifetime.

    The inner structure of the active layer without solvent additive (left), with solvent additive (centre) and after loss of solvent additive (right). Losing the solvent leads to an inner structure comparable to production without solvent. Credit: Christoph Schaffer / TU München

    The internal structure of the active layer is crucial in organic solar cells. When manufacturing them, the two materials that form the active layer have to separate out of a common solution, much like droplets of oil forming in water. “It is important that the polymer domains formed in the process are a few tens of nanometres apart,” points out Christoph Schaffer, a PhD student in Müller-Buschbaum’s group, who is the paper’s first author. “Only then can positive and negative charge carriers be efficiently produced in the active layer and separated from each other. If the structure is too coarse or too fine, this process no longer happens, and the efficiency of the solar cell will decrease.” A nanometre is one millionth of a millimetre.

    Modern polymer solar cells often consist of so-called low-bandgap polymers, which absorb particularly large amounts of light. In many cases, these require the use of a solvent additive during the manufacturing process in order to achieve high efficiencies. However, this additive is controversial because it might further decrease the lifetime of the solar cells.

    The scientists used DESY’s X-ray source PETRA III to study the degradation of such low-bandgap polymer solar cells with solvent additives in more detail. To this end, a solar cell of this type was exposed to simulated sunlight in a chamber, while its key parameters were continuously monitored. At regular intervals, the scientists shone a narrowly collimated X-ray beam from PETRA III at the solar cell, providing a picture of the internal structure of the active layer on a nanometre scale every few minutes. “These measurements can be used to relate the structure to the performance of the solar cell and track it over time,” explains co-author Prof. Stephan Roth, who is in charge of DESY’s P03 beamline, where the experiments were conducted.
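
    The structural readout behind those scattering images is simple in principle: a scattering peak at wavevector q corresponds to a real-space domain spacing of roughly d = 2*pi/q. The sketch below applies that relation to a made-up series of peak positions to show how shrinking domains would appear in the data; the numbers are illustrative, not the Munich team’s.

```python
import math

# Peak position q (inverse nanometres) from successive scattering
# images -- invented values mimicking a structure that becomes finer.
peak_positions = [
    # (minutes of operation, q in 1/nm)
    (0, 0.209),
    (60, 0.215),
    (120, 0.224),
]

for minutes, q in peak_positions:
    d = 2 * math.pi / q  # characteristic domain spacing in nm
    print(f"t = {minutes:3d} min: q = {q:.3f} 1/nm -> spacing ~ {d:.1f} nm")
# A rising q means a shrinking spacing -- the drift away from the
# optimal few-tens-of-nanometres structure described above.
```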

    “The data reveals that domains that are on the scale of a few tens of nanometres shrink substantially during operation and that their geometric boundaries with other components disappear,” says Schaffer. At the same time, the measurements suggest that the amount of residual solvent additive decreases. The scientists attribute the measured drop in the efficiency of the solar cell to the observed decrease.

    “Since there is evidence to suggest that the residual amount of solvent additive decreases, we have to assume that this process can limit the lifetime of the solar cells,” explains Müller-Buschbaum. “Therefore it is essential to come up with strategies for stabilising the structure. This could be achieved through chemical bonding between the polymer chains, or using customised encapsulating substances.”

    In an earlier study, the Munich researchers observed the degradation of a different type of polymer solar cell. In that case, the efficiency was found to drop as a result of the active centres gradually growing in size during their operation. This suggested that it is in fact better to manufacture such solar cells with a suboptimal structure, i.e. one that is too fine, so that it can then grow to the optimum size during the first hours of operation.

    The current study picks up the story where the previous one left off. “Our first study showed us that the efficiency dropped when the structure became coarser,” says Schaffer. “Exactly the opposite happens in the present study. This behaviour is precisely what we expected, because the composition of the active layer is different. The materials in the first study tend to demix to a high degree. Here, the opposite is true, and we need the solvent additive in order to achieve the demixing of the materials that is needed to obtain high efficiencies. When the solvent additive disappears during operation, the structure becomes finer and therefore moves away from its optimum.”

    Both these studies offer important approaches to optimising the manufacture of organic solar cells, as co-author Roth points out: “The way these two studies fit together provides a wonderful example of how investigations with synchrotron radiation on the atomic scale yield crucial results, especially in applied research such as in the field of renewable energies.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    DESY is one of the world’s leading accelerator centres. Researchers use the large-scale facilities at DESY to explore the microcosm in all its variety – from the interactions of tiny elementary particles and the behaviour of new types of nanomaterials to biomolecular processes that are essential to life. The accelerators and detectors that DESY develops and builds are unique research tools. The facilities generate the world’s most intense X-ray light, accelerate particles to record energies and open completely new windows onto the universe. 
That makes DESY not only a magnet for more than 3000 guest researchers from over 40 countries every year, but also a coveted partner for national and international cooperations. Committed young researchers find an exciting interdisciplinary setting at DESY. The research centre offers specialized training for a large number of professions. DESY cooperates with industry and business to promote new technologies that will benefit society and encourage innovations. This also benefits the metropolitan regions of the two DESY locations, Hamburg and Zeuthen near Berlin.

  • richardmitnick 7:40 am on October 12, 2016
    Tags: Applied Research & Technology

    From SLAC: “X-rays Reveal New Path In Battle Against Mosquito-borne Illness” 

    SLAC Lab

    The mosquito larvicide BinAB is composed of two proteins, BinA (yellow) and BinB (blue). Inside bacterial cells, BinAB naturally forms nanocrystals. Using these crystals and the intense X-ray pulses produced by SLAC’s Linac Coherent Light Source, scientists shed light on the three-dimensional structure of BinAB and its mode of action. (SLAC National Accelerator Laboratory)

    September 28, 2016

    SLAC’s X-ray Laser Provides Clues to Engineering a New Protein to Kill Mosquitos Carrying Dengue, Zika

    Structural biology research conducted at the U.S. Department of Energy’s SLAC National Accelerator Laboratory has uncovered how small insecticidal protein crystals that are naturally produced by bacteria might be tailored to combat dengue fever and the Zika virus.

    SLAC’s X-ray free-electron laser – the Linac Coherent Light Source (LCLS), a DOE Office of Science User Facility – offered unprecedented views of the toxin BinAB, used as a larvicide in public health efforts against mosquito-borne diseases such as malaria, West Nile virus and viral encephalitis.


    The larvicide is currently ineffective against the Aedes mosquitos that transmit Zika and dengue fever, and is therefore not used to combat these species. The new information provides clues to how scientists could design a composite toxin that would work against a broader range of mosquito species, including Aedes.

    Today, Nature published the study.

    “A more detailed look at the proteins’ structure provides information fundamental to understanding how the crystals kill mosquito larvae,” said Jacques-Philippe Colletier, a scientist at the Institut de Biologie Structurale in Grenoble, France and lead author on the paper. “This is a prerequisite for modifying the toxin to adapt it to our needs.”

    Selective Mosquito Control, Courtesy of Bacteria

    The BinAB crystals are produced by Lysinibacillus sphaericus bacteria, which release the crystals along with spores at the end of their life cycle. Mosquito larvae eat the crystals along with the spores, and then die.

    BinAB is inactive in the crystalline state and does not work on contact. For the crystals to dissolve, they must be exposed to alkaline conditions, such as those in a mosquito larva’s gut. The binary protein is then activated, recognized by a specific receptor at the surface of cells and internalized.

    Because Aedes larvae can evade one of these steps of intoxication, they are resistant to BinAB. These larvae do not express the correct receptors at the surface of their intestinal cells. Many other insect species, small crustaceans and humans also lack these receptors, as well as alkaline digestive systems.

    “Part of the appeal is that the larvicide’s safe because it’s so specific, but that’s also part of its limitation,” said Michael Sawaya, a scientist at the University of California, Los Angeles-DOE Molecular Biology Institute and co-author on the paper.

    For public health officials who want to prevent mosquito-borne disease, BinAB could also offer an alternative for controlling certain species of mosquitos that have begun to show resistance to other forms of chemical control.

    Creating a Tailored Insecticide

    The research team already knew the larvicide is composed of a pair of proteins, BinA and BinB, that pair together in crystals and are later activated by larval digestive enzymes.

    In the LCLS experiments, they learned the molecular basis for how the two proteins paired with each other – each performing an important, unique function. Previous research had determined that BinA is the toxic part of the complex, while BinB is responsible for binding the toxin to the mosquito’s intestine. BinB ushers BinA into the cells; once inside, BinA kills the cell.

    The scientists also identified four “hot spots” on the proteins that are activated by the alkaline conditions in the larval gut. Together, they trigger a change from a nontoxic form of the protein to a version that is lethal to mosquito larvae.

    Using the information gathered during the crystallography study, the research team has already begun to engineer a form of the BinAB proteins that will work against more species of mosquitos. This is ongoing work at Institut de Biologie Structurale, UCLA, University of California, Riverside and SLAC.

    Solving the Structure

    Only coarse details were known about the unique three-dimensional structure and biological behavior of BinAB prior to the experiment at LCLS.

    “We chose to look at the BinAB larvicide because it is so widely used, yet the structural details were a mystery,” said Brian Federici, professor of entomology at UC Riverside.

    The small size of the crystals made them difficult to study at conventional X-ray sources. So the research team used genetic engineering techniques to increase the size of the crystals, and the bright, fast pulses of light at LCLS allowed the scientists to collect detailed structural data from the tiny crystals before X-rays damaged their samples.

    The researchers used a crystallography technique called de novo phasing. This involves tagging the crystals with heavy metal markers, collecting tens of thousands of X-ray diffraction patterns, and combining the information collected to obtain a three-dimensional map of the electron density of the protein.
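
    The “combining tens of thousands of patterns” step is, at heart, Monte Carlo merging: each snapshot contributes partial intensities for whichever reflections it happens to capture, and averaging many snapshots per reflection converges on reliable values. A bare-bones sketch of that averaging with hypothetical data follows; the real pipeline, and the heavy-atom phasing built on top of it, involve far more.

```python
from collections import defaultdict

# Each snapshot yields (hkl reflection index, measured intensity) pairs.
# Three invented snapshots stand in for tens of thousands of patterns.
snapshots = [
    [((1, 0, 0), 120.0), ((1, 1, 0), 40.0)],
    [((1, 0, 0), 95.0), ((2, 1, 0), 210.0)],
    [((1, 1, 0), 55.0), ((1, 0, 0), 110.0)],
]

totals = defaultdict(float)
counts = defaultdict(int)
for snapshot in snapshots:
    for hkl, intensity in snapshot:
        totals[hkl] += intensity
        counts[hkl] += 1

for hkl in sorted(totals):
    print(hkl, f"merged intensity ~ {totals[hkl] / counts[hkl]:.1f}")
# The merged intensities, together with the heavy-atom signal, feed the
# phasing step that produces the electron-density map.
```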

    “This is the first time we’ve used de novo phasing on a crystal of great interest at an X-ray free-electron laser,” said Sebastien Boutet, SLAC scientist.

    Until now, the technique had only been used on test samples whose structures were already known, in order to prove that it would work.

    “The most immediate need is to now expand the spectrum of action of the BinAB toxin to counter the progression of Zika, in particular,” said Colletier. “BinAB is already effective against Culex [carrier of West Nile encephalitis] and Anopheles [carrier of malaria] mosquitos. With the results of the study, we now feel more confident that we can design the protein to target Aedes mosquitos.”

    Additional contributors to the research include scientists from the Howard Hughes Medical Institute at UCLA, Lawrence Berkeley National Laboratory, and Stanford University. The Institut de Biologie Structurale is a research center for integrated structural biology funded by the Commissariat à l’Énergie Atomique, the Centre National de la Recherche Scientifique and the Université Grenoble Alpes. The Collaborative Innovation Award program of the Howard Hughes Medical Institute (HCIA-HHMI), the W.M. Keck Foundation, National Institutes of Health, National Science Foundation, France Alzheimer Foundation, Agence Nationale de la Recherche, and DOE Office of Science supported the research.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 1:15 pm on October 10, 2016
    Tags: Applied Research & Technology, Intertropical Convergence Zone, Paleoceanography, Paleography

    From Eos: “Simulating the Climate 145 Million Years Ago” 

    Eos news bloc


    Shannon Hall

    A new model shows that the Intertropical Convergence Zone wasn’t always a single band around the equator, which had drastic effects on climate.

    Upper Jurassic (145- to 160-million-year-old) finely laminated organic carbon-rich shale interspersed with homogeneous, low-carbon mudrock of the Kimmeridge Clay Formation in Kimmeridge Bay, England. Variation in rock type reflects the ocean response to a monsoon-like climate at 30°N during the Late Jurassic. Credit: Howard Armstrong

    The United Kingdom was once a lush oasis. That can be read from sediments within the Kimmeridge Clay Formation, which were deposited around 160 to 145 million years ago on Dorset’s “Jurassic Coast.” A favorite stomping ground for fossil hunters and the source rock for North Sea oil, the formation is rich in organic matter, which suggests that it likely formed when global greenhouse gas levels were at least 4 times higher than they are today.

    Normally, organic matter disappears rapidly after an organism dies, as the nutrients are consumed by other life forms and the carbon decays. However, when the seas are starved of oxygen, which occurs when plankton numbers swell owing to increasing levels of carbon dioxide, then organic matter is preserved. An abundance of so-called black shales, or organic-rich muds, within the Kimmeridge Clay Formation points to this past.

    Here Armstrong et al. used those black shales to build new climate simulations that better approximate the climate toward the end of the Jurassic period. The model simulated 1422 years and suggested a radically different Intertropical Convergence Zone—the region where the Northern and Southern Hemisphere trade winds meet—from the one today. The convergence of these trade winds produces a global belt of clouds near the equator and is responsible for most of the precipitation on Earth.

    This figure shows the path (in red) of the Intertropical Convergence Zone as it forks, where the Pacific Ocean met the western coast of the American continents. Credit: Armstrong et al. [2016]

    Today the Intertropical Convergence Zone in the Atlantic strays, at most, 12° away from the equator. However, 145 million years ago, when the continents were still much closer together, the model showed that the zone split, like a fork in the road, where the Pacific Ocean met the western coast of the American continents. The zone was driven apart by the proto-Appalachian mountain range to the north and the North African mountains to the south. The northern fork, which was much stronger than the southern one, extended as far as about 30° north, passing over the United Kingdom and the location of the Kimmeridge Clay Formation.

    Not only were the researchers able to verify that the United Kingdom was once a tropical oasis, but they were also able to simulate and map the climate 145 million years ago—research that will help scientists better understand how Earth will react to anthropogenic warming today and in the future. (Paleoceanography, doi:10.1002/2015PA002911, 2016)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 9:49 am on October 9, 2016
    Tags: Applied Research & Technology, Global Volcanism Program

    From Smithsonian: “How Earthquakes and Volcanoes Reveal the Beating Heart of the Planet” 


    October 6, 2016
    Rachel E. Gross

    Your face looks fine. Trust me. But if you zoom in and take a time-lapse, you’ll see a landscape in motion: zits erupting, pore-craters forming, ridges of skin stretching apart and squashing together as you smile and frown. Similarly, the Earth outside your window might appear quiet. But that’s because you’re looking at a tiny slice in time and space. Expand your view and you’ll see plates shift, earthquakes ripple and volcanoes erupt along tectonic boundaries. The world snaps, crackles and tears asunder. Nothing stays the same.

    To illustrate these dynamic patterns, the Smithsonian Institution’s Global Volcanism Program, hosted within the National Museum of Natural History, has created a time-lapse animation of the world’s earthquakes, eruptions and emissions since 1960. Drawing from the first compiled database of sulfur emissions dating to 1978, the animations show how the seemingly random activity of volcanoes and earthquakes form consistent global patterns over time. Understanding those patterns gives researchers insight into how these dramatic events are entwined with the inner workings of our planet.

    Earthquakes and volcanoes can conjure up images of widespread destruction. But for those who study Earth’s deepest reaches, like Elizabeth Cottrell, a research geologist at the Smithsonian’s National Museum of Natural History and director of the Global Volcanism Program, volcanoes are also “windows to the interior.” Their activity and emissions provide a taste of what’s inside, helping researchers to untangle the composition and history of the planet’s core. That’s crucial, because we still don’t know exactly what the inside of our planet is made of. We need to understand the interior if we are to disentangle the global carbon cycle, the chemical flux that influences our planet’s past and future.

    We know a lot about carbon, the element that forms the chemical backbone of life, in our crust and oceans. We know far less about it in Earth’s core and mantle. It has so far proved challenging to sample the Earth’s mantle, which extends up to 1,800 miles below the surface. This means that Earth’s interior plays a huge—and mysterious—role in the global carbon cycle. The interior contains perhaps 90 percent of our planet’s carbon, bound up in pure forms like graphite or diamonds. Gleaning the movements of this elusive deep-earth carbon has been called “one of the most vexing problems” in our quest to understand the global carbon cycle.

    Fortunately, we have volcanoes. As a planetary geologist, Cottrell thinks of these magma-makers as a “sample delivery system” that gives us a peek into the planet’s core. “Earthquakes and eruptions are the heartbeat of the planet,” she says. The emissions from these events, which have influenced global climate, are the planet’s respiration. (Worldwide, volcanoes release about 180 to 440 million tons of carbon dioxide each year.) By studying the chemistry of lava and the makeup of volcanic gases, Cottrell and others can get an idea of what lies within—like studying human burps to figure out what’s in your stomach.

    Volcanoes belch out mostly water vapor in the form of steam, along with carbon dioxide and some sulfur (by contrast, humans breathe out about 16 percent oxygen, 4 percent CO2 and 79 percent nitrogen). Understanding the “normal” levels of these volcano emissions would help scientists determine what the baseline is—and thus, how drastically human activity is impacting it. Yet pinning down those emissions is a tricky business. Collecting volcanic gas is downright dangerous, requiring researchers to get up close and personal to hot, pressurized emissions. When it erupts from the mantle, molten lava is a searing 1000 to 1300 degrees Celsius.

    No wonder scientists would rather read gas signatures in the atmosphere using satellites from space. Unfortunately, that technique also has its problems. In the past three centuries, anthropogenic emissions from sources like factory farming and burning fossil fuels have drastically overtaken the emissions from volcanoes—meaning that volcanic CO2 gets lost in the background noise. As a workaround, scientists use sulfur, which is easier to measure from space, as a proxy for carbon. In the past decade, technological advancements have also made it possible to tease apart some of these emissions.
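
    The scale mismatch is worth a quick back-of-the-envelope check. Using the volcanic estimate quoted earlier and a round outside figure of roughly 36 billion tons for annual human CO2 emissions in the mid-2010s (an approximate value, not from the article), anthropogenic output swamps volcanic output by about two orders of magnitude.

```python
# Volcanic CO2 (range from the article) vs anthropogenic CO2
# (approximate mid-2010s global figure -- an outside estimate).
volcanic_low, volcanic_high = 0.18e9, 0.44e9  # tons per year
anthropogenic = 36e9                          # tons per year, approximate

print(f"human/volcanic: {anthropogenic / volcanic_high:.0f}x "
      f"to {anthropogenic / volcanic_low:.0f}x")
# Roughly 80x to 200x -- which is why volcanic CO2 is lost in the noise
# when viewed from space, and why sulfur dioxide serves as the proxy.
```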

    “Global satellite monitoring of volcanoes will transform our understanding of gas fluxes from Earth’s interior to exterior in the coming decade,” says Cottrell, who has been working along with Michigan Tech researcher Simon Carn and data manager Ed Venzke to incorporate volcanic emissions into the Smithsonian database since 2012.

    In the visualization above, you can see earthquakes and volcanic eruptions not just as individual events, but as indicators of those regions of frenzied activity in Earth’s crust where plates push up against each other and are torn asunder. The key is timescale. By zooming out to the past 50 years, you can see that volcanoes aren’t merely catastrophic blips, but a steady pattern: the living heartbeat of a dynamic planet. “When we look on a long timescale, we see the constant pulse of the planet,” says Cottrell, who recommends watching the animation with the sound on to get the full effect. It is a “constant unrelenting beat punctuated by periods of high and low activity.”

    Zoom in again, and you can see how volcanoes link us all on a very personal level. Every time you breathe, you inhale volcanic gas, which rapidly mixes with the atmosphere and diffuses. By knowing when and where recent volcanic eruptions have occurred, you can even pinpoint the volcano that flavored your last inhalation. Now that’s intimate.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Smithsonian magazine and Smithsonian.com place a Smithsonian lens on the world, looking at the topics and subject matters researched, studied and exhibited by the Smithsonian Institution — science, history, art, popular culture and innovation — and chronicling them every day for our diverse readership.

  • richardmitnick 9:32 am on October 9, 2016
    Tags: Alien life could feed on cosmic rays, Applied Research & Technology

    From Science: “Alien life could feed on cosmic rays” 



    Oct. 7, 2016
    Jessica Boddy

    High-energy particles called galactic cosmic rays could be an energy source for life on other planets. Tinieder/iStockphoto

    A bizarre microbe found deep in a gold mine in South Africa could provide a model for how life might survive in seemingly uninhabitable environments through the cosmos. Known as Desulforudis audaxviator, the rod-shaped bacterium thrives 2.8 kilometers underground in a habitat devoid of the things that power the vast majority of life on Earth—light, oxygen, and carbon. Instead, this “gold mine bug” gets energy from radioactive uranium in the depths of the mine. Now, scientists predict that life elsewhere in the universe might also feed off of radiation, especially radiation raining down from space.

    “It really grabbed my attention because it’s completely powered by radioactive substances,” says Dimitra Atri, an astrobiologist and computational physicist who works for the Blue Marble Space Institute of Science in Seattle, Washington. “Who’s to say life on other worlds doesn’t do the same thing?”

    Essentially all life on Earth’s surface takes in the energy it needs through one of two processes. Plants, some bacteria, and certain other organisms collect energy from sunlight through a process called photosynthesis. In it, they use the energy from light to convert water and carbon dioxide into more complex and energetic molecules called hydrocarbons, thus storing the energy so that it can be recovered later by breaking down the molecules through a process called oxidation. Alternatively, animals and other organisms simply feed off of plants, one another, etc., to steal the energy already stored in living things.

    D. audaxviator takes a third path: It draws its energy from the radioactivity of uranium in the rock in the mine. The radiation from decaying uranium nuclei breaks apart sulfur and water molecules in the stone, producing molecular fragments such as sulfate and hydrogen peroxide that are excited with internal energy. The microbe then takes in these molecules, siphons off their energy, and spits them back out. Most of the energy produced from this process powers the bacterium’s reproduction and internal processes, but a portion of it also goes to repairing damage from the radiation.

    Atri thinks an extraterrestrial life form could easily make use of a similar system. The radiation might not come from radioactive materials on the planet itself, but rather from galactic cosmic rays (GCRs)—high-energy particles that careen through the universe after being flung out of a supernova. They’re everywhere, even on Earth, but our planet’s magnetic field and atmosphere shield us from most GCRs.

    Desulforudis audaxviator thrive using radiation from uranium as an energy source deep in the gold mine they call home. NASA

    The surfaces of other planets like Mars are much more susceptible to cosmic rays because of their thin atmospheres and, in the case of Mars, its lack of a magnetic field. Atri argues GCRs could reach the Red Planet’s surface with enough energy left to power a tiny organism. This could also be the case on any world with a negligible atmosphere: Pluto, Earth’s moon, Jupiter’s moon Europa, Saturn’s moon Enceladus, and, theoretically, countless more outside our solar system. He does note, though, that because GCRs don’t deliver nearly as much energy as the sun, GCR-powered life would be very small and simple, just like D. audaxviator.

    To figure out how this might work, Atri ran simulations using existing data about GCRs to see how much energy they’d provide on some of these other worlds. The numbers were clear: The small, steady shower of cosmic rays would supply enough energy to power a simple organism on all of the planets he simulated except Earth, Atri reports this week in the Journal of the Royal Society Interface. “It can’t be ruled out that life like this could exist,” he says.
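
    The paper’s own numbers are not reproduced here, but the flavor of the comparison can be sketched: take an assumed GCR energy flux at a thinly shielded surface, an assumed radiolysis efficiency, and compare the usable power against a rough per-cell maintenance power. All three inputs below are placeholder values chosen only to show the shape of the calculation.

```python
# Order-of-magnitude sketch: could GCR-driven radiolysis feed microbes?
# ALL numbers are illustrative placeholders, not values from Atri's paper.
gcr_energy_flux = 2e-6        # W/m^2 deposited at the surface (assumed)
radiolysis_efficiency = 0.01  # fraction turned into usable chemistry (assumed)
power_per_cell = 1e-15        # W to sustain one microbe (assumed)

usable_power = gcr_energy_flux * radiolysis_efficiency         # W/m^2
cells_per_m2 = usable_power / power_per_cell
print(f"~{cells_per_m2:.0e} cells supported per square metre")  # ~2e7 here
# Meagre by sunlight standards, but nonzero -- in line with the paper's
# point that only small, simple organisms could live this way.
```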

    Atri thinks Mars is the best candidate to host GCR-powered life. The planet’s composition is rocky like Earth’s with plenty of minerals, and it might even have some water tucked away. Both would offer excellent mediums to be broken down by cosmic rays and gobbled up by a life form. The most essential part of the equation, though, is the thin atmosphere. “It’s funny,” Atri says, “because when we look for planets that contain life currently, we look for a very thick atmosphere. With these life forms, we’re looking for the opposite.”

    Duncan Forgan, an astrobiologist at the University of St Andrews in the United Kingdom who was not involved with the work, agrees that Mars might be harboring D. audaxviator-like life because its stable temperatures and physical makeup are similar to those of the South African gold mine. He does worry that on other planets that don’t receive light energy from a sun but still get bombarded with GCRs—such as free-floating rogue planets not tied to any solar system—temperatures would dip too low and freeze life in its tracks. He also cautions that too many cosmic rays could wipe life out altogether: “Life forms like this want a steady flux of energy from cosmic rays, but not so much that it’s damaging,” he says. “They might not be able to cope with a huge bout of radiation that pops in.”

    In the future, Atri wants to bring the gold mine bug into the lab and see how it responds to cosmic radiation levels equivalent to those on Mars, Europa, and other worlds. That data would give him more clues about whether this kind of organism could survive beyond Earth. “Desulforudis audaxviator is proof that life can thrive using almost any energy source available,” he says. “I always think of Jeff Goldblum in Jurassic Park—life finds a way.”

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

  • richardmitnick 5:10 pm on October 7, 2016 Permalink | Reply
    Tags: Applied Research & Technology

    From Caltech via phys.org: “California earthquakes discovered much deeper than originally believed” 

    Caltech Logo



    October 7, 2016
    Rong-Gong Lin II, Los Angeles Times

    Seismogram being recorded by a seismograph at the Weston Observatory in Massachusetts, USA. Credit: Wikipedia

    Scientists in California have found that earthquakes can occur much deeper below the Earth’s surface than originally believed, a discovery that alters their understanding of seismic behavior and potential risks.

    Seismologists have long believed that earthquakes occur less than 12 to 15 miles underground. But the new research found evidence of quakes deeper than 15 miles, below the Earth’s crust and in the mantle.

    Three scientists at the California Institute of Technology in Pasadena studied data from state-of-the-art sensors installed in Long Beach atop the Newport-Inglewood fault, one of the most dangerous faults in the Los Angeles Basin and the source of the magnitude 6.4 Long Beach earthquake of 1933.

    After analyzing the data collected over six months by 5,000 sensors, the scientists found quakes occurring deep in the upper mantle, where the rock is so hot that it is no longer brittle as it is near the surface but instead creeps, flowing slowly like extremely stiff honey.

    It appeared that the Newport-Inglewood fault extended even into the mantle – past the crust, the uppermost layer of the Earth, where earthquakes have long been observed. Until now, researchers didn’t think earthquakes were possible there, said Caltech seismology professor Jean-Paul Ampuero, one of the three authors of the study, published Thursday in the journal Science.

    Ampuero said the research raises the possibility that the Newport-Inglewood and other faults, like the San Andreas, could produce even more powerful earthquakes than expected. The earthquakes he and his colleagues studied were so deep and so small that conventional seismic sensors at the surface could not detect them.

    The new study [Science] indicates that a quake beginning much closer to the surface could travel much deeper into the Earth, producing a stronger, more damaging rupture than was previously believed possible.

    “That got us thinking – that if earthquakes want to get big, one way of achieving that is by penetrating deep,” Ampuero said. “The big question is: If the next, larger earthquake happens, if it manages to penetrate deeper than we think, it may be bigger than we expect.”

    The idea was first raised in 2012 by Ampuero and several colleagues, also in the journal Science, after a magnitude 8.6 earthquake struck the Indian Ocean.

    That was the largest quake of its kind “that has ever happened,” Ampuero said. It occurred on a “strike-slip” fault, the same kind as the Newport-Inglewood and California’s mighty San Andreas, the state’s longest fault.

    But that Indian Ocean earthquake was so large, it was impossible to explain how it happened with existing science.

    So answering the question of how an 8.6 earthquake occurred required a new explanation – that perhaps the quake centered on a fault that not only ruptured the crust, but went deeper into the mantle.

    If deep earthquakes can occur on the Newport-Inglewood fault, then it’s possible Southern Californians could see earthquakes along this fault at an even greater magnitude than what is projected. According to Caltech, the probable magnitude of a large quake on the Newport-Inglewood fault ranges from 6.0 to 7.4.

    But there’s a lot more study that needs to be done.

    The deep quakes Caltech scientists detected were only microquakes, topping out at about magnitude 2.

    Therefore, one alternative – and more comforting – possibility is that these deep earthquakes stay small and don’t help a large earthquake become stronger. Under this theory, earthquakes in the deep zone occur in small pockets far away from each other and don’t link up in a way that forces a big earthquake to grow.

    “This could be good news, in a way, because if they never break together, that means they can break in tiny earthquakes, but they cannot break in large ones,” Ampuero said. “So several questions are still open. I wouldn’t say that this is cause for alarm at this point. These are very interesting questions that we need to pursue.”

    Another thing to consider: The deep earthquakes were found in a 9-square-mile area underneath Long Beach, recorded over six months. When researchers looked farther northwest – over a shorter time period, only four weeks – they did not find deep earthquakes there.

    So it’s possible that deep earthquakes don’t exist everywhere on the Newport-Inglewood fault. But it’s also possible that scientists didn’t record any, and could catch some if they continue monitoring the area for a longer period.

    There’s a possibility that Long Beach is simply peculiar, and that what’s found there isn’t found elsewhere. In Long Beach, scientists found evidence of fluids flowing from the mantle up toward the surface – an observation that was not made at another location on the Newport-Inglewood fault.

    The scientists obtained the data from a group that had installed the sensors to better understand the area’s oil fields. They then had to design a program to process the massive amounts of data and reveal what was going on miles underground, invisible to conventional seismic sensing equipment.

    In addition to Ampuero, the other authors of the study are Asaf Inbal and Robert Clayton.

    See the full article here.

    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative to build the world’s largest low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide a better understanding of earthquakes and give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer-hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that run on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).


    BOINC WallPaper

    The volunteer computers monitor vibrational sensors called MEMS accelerometers and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals and determine which ones represent earthquakes and which represent cultural noise (like doors slamming or trucks driving by).
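
    The post doesn’t describe QCN’s trigger algorithm in detail. One standard technique for this kind of detection, shown below purely as a hypothetical sketch in Python, is a short-term/long-term average (STA/LTA) ratio that flags a sudden jump in shaking relative to the recent background:

        # Hypothetical sketch of an STA/LTA trigger, a standard seismic
        # detection technique; this is NOT QCN's actual implementation.

        def sta_lta_trigger(samples, sta_len=50, lta_len=1000, threshold=4.0):
            # Return indices where the short-term average amplitude jumps
            # well above the long-term background average.
            triggers = []
            for i in range(lta_len, len(samples)):
                sta = sum(abs(s) for s in samples[i - sta_len:i]) / sta_len
                lta = sum(abs(s) for s in samples[i - lta_len:i]) / lta_len
                if lta > 0 and sta / lta > threshold:
                    triggers.append(i)  # candidate quake (or a slammed door)
            return triggers

    A real deployment would tune the window lengths and threshold to the sensor’s sample rate and noise floor.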

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) Mounted to the floor, they measure shaking more reliably than mobile devices. 2) They typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically separate from the game, phone, or laptop, so human interaction with the device doesn’t degrade the sensor’s performance. 5) USB sensors can be aligned to north, so we know which directions the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, more properly the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake-Catcher Network map

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service covering a full range of topics: physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments worldwide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to personalized features such as social networking and a personal home page.

    Caltech campus

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
