Updates from richardmitnick

  • richardmitnick 5:32 pm on May 23, 2016

    From Goddard: “NASA: Solar Storms May Have Been Key to Life on Earth” 

    NASA Goddard Banner

    NASA Goddard Space Flight Center

    May 23, 2016
    Karen C. Fox
    NASA’s Goddard Space Flight Center, Greenbelt, Md.
    karen.c.fox@nasa.gov

    Solar eruption in 2012, captured by NASA’s Solar Dynamics Observatory (SDO)

    Our sun’s adolescence was stormy—and new evidence shows that these tempests may have been key to seeding life as we know it.

    Some 4 billion years ago, the sun shone with only about three-quarters the brightness we see today, but its surface roiled with giant eruptions spewing enormous amounts of solar material and radiation out into space. These powerful solar explosions may have provided the crucial energy needed to warm Earth, despite the sun’s faintness. The eruptions also may have furnished the energy needed to turn simple molecules into the complex molecules such as RNA and DNA that were necessary for life. The research was published* in Nature Geoscience on May 23, 2016, by a team of scientists from NASA.


    Access the mp4 video here.

    Understanding what conditions were necessary for life on our planet helps us both trace the origins of life on Earth and guide the search for life on other planets. Until now, however, fully mapping Earth’s evolution has been hindered by the simple fact that the young sun wasn’t luminous enough to warm Earth.

    “Back then, Earth received only about 70 percent of the energy from the sun that it does today,” said Vladimir Airapetian, lead author of the paper and a solar scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “That means Earth should have been an icy ball. Instead, geological evidence says it was a warm globe with liquid water. We call this the Faint Young Sun Paradox. Our new research shows that solar storms could have been central to warming Earth.”

    Scientists are able to piece together the history of the sun by searching for similar stars in our galaxy. By placing these sun-like stars in order according to their age, the stars appear as a functional timeline of how our own sun evolved. It is from this kind of data that scientists know the sun was fainter 4 billion years ago. Such studies also show that young stars frequently produce powerful flares – giant bursts of light and radiation — similar to the flares we see on our own sun today. Such flares are often accompanied by huge clouds of solar material, called coronal mass ejections, or CMEs, which erupt out into space.

    NASA’s Kepler mission found stars that resemble our sun as it was a few million years after its birth.

    NASA/Kepler Telescope

    The Kepler data showed many examples of what are called “superflares” – enormous explosions so rare on our sun today that we experience them only once every 100 years or so. Yet the same data show these young stars producing as many as ten superflares a day.

    While our sun still produces flares and CMEs, they are not so frequent or intense.

    What’s more, Earth today has a strong magnetic field that helps keep the bulk of the energy from such space weather from reaching Earth.

    Magnetosphere of Earth. Original bitmap from NASA; SVG rendering by Aaron Kaase

    Space weather can, however, significantly disturb the magnetic bubble around our planet, the magnetosphere, in events known as geomagnetic storms, which can affect radio communications and satellites in space. It also creates auroras, most often in a narrow region near the poles where Earth’s magnetic field lines bow down to touch the planet.

    Our young Earth, however, had a weaker magnetic field, with a much wider footprint near the poles.

    “Our calculations show that you would have regularly seen auroras all the way down in South Carolina,” says Airapetian. “And as the particles from the space weather traveled down the magnetic field lines, they would have slammed into abundant nitrogen molecules in the atmosphere. Changing the atmosphere’s chemistry turns out to have made all the difference for life on Earth.”

    The atmosphere of early Earth was also different from what it is now: molecular nitrogen – that is, two nitrogen atoms bound together into a molecule – made up 90 percent of the atmosphere, compared to only 78 percent today. As energetic particles slammed into these nitrogen molecules, the impacts broke them apart into individual nitrogen atoms. These atoms, in turn, collided with carbon dioxide molecules, splitting them into carbon monoxide and oxygen.

    The free-floating nitrogen and oxygen combined into nitrous oxide, which is a powerful greenhouse gas. When it comes to warming the atmosphere, nitrous oxide is some 300 times more powerful than carbon dioxide. The team’s calculations show that if the early atmosphere housed less than one percent as much nitrous oxide as it did carbon dioxide, it would have warmed the planet enough for liquid water to exist.
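The arithmetic behind that threshold is easy to sanity-check. The sketch below is purely illustrative: the 300-times figure and the one-percent ratio are the only numbers taken from the article, and real radiative-forcing calculations are far more involved than a simple multiplication.

```python
# Back-of-envelope check of the nitrous oxide claim above. Treat the early
# atmosphere's CO2 greenhouse contribution as one reference unit and ask
# what an N2O abundance of 1 percent of the CO2 amount would add, given
# that N2O is roughly 300 times more powerful per unit than CO2.
co2_warming = 1.0        # CO2 greenhouse contribution, taken as the reference
n2o_fraction = 0.01      # "one percent as much nitrous oxide as carbon dioxide"
n2o_potency = 300        # "some 300 times more powerful than carbon dioxide"

n2o_warming = co2_warming * n2o_fraction * n2o_potency
print(n2o_warming)       # 3.0 -> N2O alone would rival ~3x the CO2 warming
```

The point is simply that a trace gas with 300 times the per-unit potency needs only a tiny abundance to rival the warming effect of the dominant greenhouse gas.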

    This newly discovered constant influx of solar particles to early Earth may have done more than just warm the atmosphere; it may also have provided the energy needed to make complex chemicals. On a planet scattered evenly with simple molecules, it takes a huge amount of incoming energy to create complex molecules such as the RNA and DNA that eventually seeded life.

    While enough energy appears to be hugely important for a growing planet, too much would also be an issue — a constant chain of solar eruptions producing showers of particle radiation can be quite detrimental. Such an onslaught of magnetic clouds can rip off a planet’s atmosphere if the magnetosphere is too weak. Understanding these kinds of balances helps scientists determine what kinds of stars and what kinds of planets could be hospitable for life.

    “We want to gather all this information together, how close a planet is to the star, how energetic the star is, how strong the planet’s magnetosphere is in order to help search for habitable planets around stars near our own and throughout the galaxy,” said William Danchi, principal investigator of the project at Goddard and a co-author on the paper. “This work includes scientists from many fields — those who study the sun, the stars, the planets, chemistry and biology. Working together we can create a robust description of what the early days of our home planet looked like – and where life might exist elsewhere.”

    For more information about the Kepler mission, visit:

    http://www.nasa.gov/kepler

    *Science paper:
    Prebiotic chemistry and atmospheric warming of early Earth by an active young Sun

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NASA’s Goddard Space Flight Center is home to the nation’s largest combined organization of scientists, engineers and technologists who build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

    Named for American rocketry pioneer Dr. Robert H. Goddard, the center was established in 1959 as NASA’s first space flight complex. Goddard and its several facilities are critical in carrying out NASA’s missions of space exploration and scientific discovery.

    NASA Goddard Campus
    NASA

     
  • richardmitnick 4:56 pm on May 23, 2016

    From U Texas at Austin: “Making Virus Sensors Cheap and Simple: New Method Detects Single Viruses” 

    U Texas Austin bloc

    University of Texas at Austin

    23 May 2016
    Marc G Airhart

    Scientists at The University of Texas at Austin have developed a new method to rapidly detect a single virus in urine, as reported* this week in the journal Proceedings of the National Academy of Sciences.

    Researchers at The University of Texas at Austin demonstrated the ability to detect single viruses in a solution containing murine cytomegalovirus (MCMV). The single virus in this image is a human cytomegalovirus, a cousin of MCMV. It was obtained by chilling a sample down with liquid nitrogen and exposing it to high-energy electrons. Image courtesy of Jean-Yves Sgro, U. of Wisconsin-Madison (EMD-5696 data Dai, XH et al., 2013)

    Although the technique presently works on just one virus, scientists say it could be adapted to detect a range of viruses that plague humans, including Ebola, Zika and HIV.

    “The ultimate goal is to build a cheap, easy-to-use device to take into the field and measure the presence of a virus like Ebola in people on the spot,” says Jeffrey Dick, a chemistry graduate student and co-lead author of the study. “While we are still pretty far from this, this work is a leap in the right direction.”

    The other co-lead author is Adam Hilterbrand, a microbiology graduate student.

    The new method is highly selective, meaning it is sensitive to only one type of virus, filtering out possible false positives caused by other viruses or contaminants.

    There are two other commonly used methods for detecting viruses in biological samples, but they have drawbacks. One requires a much higher concentration of viruses, and the other requires samples to be purified to remove contaminants. The new method, however, can be used with urine straight from a person or animal.

    The other co-authors are Lauren Strawsine, a postdoctoral fellow in chemistry; Jason Upton, an assistant professor of molecular biosciences; and Allen Bard, professor of chemistry and director of the Center for Electrochemistry.

    The researchers demonstrated their new technique on a virus that belongs to the same family as the herpes virus, called murine cytomegalovirus (MCMV). To detect individual viruses, the team places an electrode — a wire that conducts electricity, in this case, one that is thinner than a human cell — in a sample of mouse urine. They then add to the urine some special molecules made up of enzymes and antibodies that naturally stick to the virus of interest. When all three stick together and then bump into the electrode, there’s a spike in electric current that can be easily detected.
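The detection step described above amounts to picking discrete spikes out of a noisy current trace. As a rough illustration only (the function name, toy numbers, and simple threshold rule are invented here, not taken from the UT Austin paper), a minimal spike counter might look like:

```python
# Hypothetical sketch of counting current spikes from single
# virus-electrode collisions. Each contiguous run of samples above
# baseline + threshold is treated as one collision event.
def count_spikes(current_trace, baseline, threshold):
    """Count excursions above baseline + threshold, merging adjacent samples."""
    spikes = 0
    in_spike = False
    for sample in current_trace:
        if sample > baseline + threshold:
            if not in_spike:       # rising edge: a new collision event
                spikes += 1
                in_spike = True
        else:
            in_spike = False       # back near baseline; event is over
    return spikes

# A toy trace: a flat ~10 pA baseline with two collision spikes.
trace = [10.1, 9.9, 10.0, 14.2, 15.0, 10.2, 9.8, 13.9, 10.0]
print(count_spikes(trace, baseline=10.0, threshold=2.0))  # 2
```

In a real instrument the baseline drifts and spike shapes carry information, so published analyses use far more careful statistics; this sketch only conveys the idea of counting discrete collision events in the current.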

    The researchers say their new method still needs refinement. For example, the electrodes become less sensitive over time because a host of other naturally occurring compounds stick to them, leaving less surface area for viruses to interact with them. To be practical, the process will also need to be engineered into a compact and rugged device that can operate in a range of real-world environments.

    Support for this research was provided by the National Science Foundation, the Welch Foundation and the Cancer Prevention & Research Institute of Texas.

    *Science paper:
    Enzymatically enhanced collisions on ultramicroelectrodes for specific and rapid detection of individual viruses

    See the full article here.


    U Texas Austin Campus

    In 1839, the Congress of the Republic of Texas ordered that a site be set aside to meet the state’s higher education needs. After a series of delays over the next several decades, the state legislature reinvigorated the project in 1876, calling for the establishment of a “university of the first class.” Austin was selected as the site for the new university in 1881, and construction began on the original Main Building in November 1882. Less than one year later, on Sept. 15, 1883, The University of Texas at Austin opened with one building, eight professors, one proctor, and 221 students — and a mission to change the world. Today, UT Austin is a world-renowned higher education, research, and public service institution serving more than 51,000 students annually through 18 top-ranked colleges and schools.

     
  • richardmitnick 4:39 pm on May 23, 2016

    From Rockefeller: “New method gives scientists a better look at how HIV infects and takes over its host cells” 

    Rockefeller U bloc

    Rockefeller University

    May 23, 2016
    No writer credit found

    Cell to cell: When HIV infects a cell, it programs the cell to express the viral protein Env, which the virus uses to spread to neighboring cells. Above, Env (red) produced by one infected cell has recruited other, uninfected cells, causing them to fuse, and their nuclei (blue) to cluster.

    Viruses attack cells and commandeer their machinery in a complex and carefully orchestrated invasion. Scientists have long probed this process for insights into biology and disease, but essential details still remain out of reach.

    A new approach, developed by a team of researchers led by The Rockefeller University and the Aaron Diamond AIDS Research Center (ADARC), offers an unprecedented view of how a virus infects and appropriates a host cell, step by step. In research published* May 23 in Nature Microbiology, they applied their method to HIV, a virus whose genome is less than one hundred-thousandth the size of ours.

    “HIV is truly an expert at living large on a small budget,” says first author Yang Luo, a postdoc at ADARC and a former graduate student at Rockefeller University. “We asked the question, how does such a compact virus manipulate the host cell to gain entry and replicate itself, all while escaping the immune system?”

    Mapping HIV’s ‘interactome’

    The study focused on two viral proteins known to bring about HIV’s infection of human white blood cells. The first one, called envelope or Env, sits on the surface of the virus and, by binding to receptors on the host cell, helps the membrane that encapsulates the virus fuse with the cell’s outer membrane. A second protein, Vif, destroys an enzyme that host cells produce to defend themselves against the virus.

    In an effort to better understand how these two proteins function, the team wanted to map their interactome—meaning all the proteins with which they associate within a host cell. To accomplish this, the researchers needed to devise a way to isolate clusters of interacting proteins from cells during different stages of infection. Such experiments can be done by introducing a genetic sequence into the viral genome—a “tag” that acts like a piece of molecular Velcro, allowing one viral protein to be yanked out along with all the other proteins associated with it.


    It sounds simple, but making it work took a decade.

    “Inserting a tag sequence into small viruses is a challenge to begin with,” says corresponding author Mark Muesing, a principal investigator at ADARC. “If you disrupt their nucleic acid and protein sequences, you can easily compromise the virus’s ability to replicate. And HIV represents a particular challenge because it can quickly revert back to its original sequence.”

    “We developed a technique to find places in the HIV genome where we can insert stable tags without affecting the virus’s capacity to proliferate. In effect, this allowed us to expand cultures of the infected cells along with the tagged viral protein,” he added.

    The host’s contribution

    Next, the researchers infected human cells with viruses carrying the tagged protein sequences, and were able to pull out and identify a large number of host proteins directly during the infectious process. This provided the first evidence that many previously underappreciated host proteins interact with the viral machinery during replication.

    “Imagine you have a factory assembly line where only one component of, say, the stamping machine, actually touches the product,” says co-author Michael Rout, professor and head of Rockefeller’s Laboratory of Cellular and Structural Biology. “Other parts support and power the stamp. Likewise, within an infected cell, we can identify the components of a particular cellular machine, not just the piece that comes in contact with the viral protein.”

    “Every host protein we pull out generates new questions,” adds co-first author Erica Jacobs, a research associate in Brian Chait’s lab. “Does it help the virus invade and coopt the host to replicate itself? Or does it harm it? The answers will not only help us understand the virus, but also shed light on our cells’ ability to defend themselves.”

    One important discovery has already emerged from the lists of proteins. Viruses, including HIV, often attack as so-called virions, which are individual packets of protein and genetic code. But they can also pass directly from an infected to an uninfected cell, a more effective mode of transmission. To do so, the virus appears to use host proteins to construct a platform, a close junctional surface, between cells.

    From the list of proteins that interact with Env, the researchers have identified cellular proteins predicted to contribute to this platform between cells. Because this route of transmission protects the virus in a sequestered environment, away from host defenses, the new findings may aid in the development of future anti-HIV therapies.

    A live infection, step by step

    According to co-author Brian Chait, Camille and Henry Dreyfus Professor and head of the Laboratory of Mass Spectrometry and Gaseous Ion Chemistry, the new approach offers a rare glimpse into the process by which HIV invades a cell and reproduces itself within it.

    “Often, studies of this sort are done with viral proteins in the absence of a true viral infection,” he says. “However, because viral infections are exquisitely orchestrated events, you are likely to miss all kinds of important details if you study the action of these proteins out of their proper context.”

    “Deciphering the intricacies of virus-host protein interactions in space and time during the progression of an infection is remarkably powerful,” says co-author Ileana Cristea, an associate professor of molecular biology at Princeton University. “The challenge is to discover which precise interactions are the critical ones.”

    Todd Greco, a co-first author, and an associate research scholar and lecturer in molecular biology in Cristea’s lab, says that “even for host proteins within the same family, their relative stability within HIV-1 protein complexes can be very different. More broadly, by understanding these mechanisms we will better understand the coordinated responses of cells.”

    *Science paper:
    HIV–host interactome revealed directly from infected cells

    See the full article here.


    Rockefeller U Campus

    The Rockefeller University is a world-renowned center for research and graduate education in the biomedical sciences, chemistry, bioinformatics and physics. The university’s 76 laboratories conduct both clinical and basic research and study a diverse range of biological and biomedical problems with the mission of improving the understanding of life for the benefit of humanity.

    Founded in 1901 by John D. Rockefeller, the Rockefeller Institute for Medical Research was the country’s first institution devoted exclusively to biomedical research. The Rockefeller University Hospital was founded in 1910 as the first hospital devoted exclusively to clinical research. In the 1950s, the institute expanded its mission to include graduate education and began training new generations of scientists to become research leaders around the world. In 1965, it was renamed The Rockefeller University.

     
  • richardmitnick 3:55 pm on May 23, 2016
    Tags: Water-Energy Nexus New Focus of Berkeley Lab Research

    From LBL: “Water-Energy Nexus New Focus of Berkeley Lab Research” 

    Berkeley Logo

    Berkeley Lab

    May 23, 2016
    Julie Chao
    (510) 486-6491
    JHChao@lbl.gov

    Water banking, desalination, and high-resolution climate models are all part of the new Berkeley Lab Water Resilience Initiative. (California snowpack photo credit: Dan Pisut/ NASA)

    Billions of gallons of water are used each day in the United States for energy production—for hydroelectric power generation, thermoelectric plant cooling, and countless other industrial processes, including oil and gas mining. And huge amounts of energy are required to pump, treat, heat, and deliver water.

    This interdependence of water and energy is the focus of a major new research effort at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab). With the twin challenges of population growth and climate change adding to the resource pressures, Berkeley Lab’s Water Resilience Initiative aims to use science and technology to optimize coupled water-energy systems and guide investments in such systems.

    “Considering water and energy as separate issues is passé,” said Berkeley Lab scientist Robert Kostecki, one of the three leads of the initiative. “Now the two are becoming critically interdependent. And both the energy and water sectors are expected to experience serious stresses from extreme climate events. However the problem on each side is dealt with, there needs to be an understanding of possible implications on the other side.”

    The Initiative has three main goals: hydroclimate and ecosystem predictions, revolutionary concepts for efficient and sustainable groundwater systems, and science and technology breakthroughs in desalination. The goals can be viewed as analogous to energy distribution, storage, and generation, says Susan Hubbard, Berkeley Lab’s Associate Lab Director for Earth and Environmental Sciences.

    “We consider improved hydroclimate predictions as necessary for understanding future water distribution,” Hubbard said. “We are exploring water banking as a subsurface strategy to store water that is delivered by snowmelt or extreme precipitation. To remain water resilient in other locations and to take advantage of seawater through brackish inland produced waters, Berkeley Lab is performing fundamental investigations to explore new approaches to desalinate water, ideally leading to cost and energy efficient approaches to generate water.”

    Climate: the Source of All Water

    The climate, ultimately, is the source of all water, and in places like California, where the snowpack plays an important role, climate change will have a big impact on how much water there will be and when it will come. The goal of the climate focus of the Initiative, led by Berkeley Lab climate scientist Andrew Jones, is to develop approaches to predict hydroclimate at scales that can be used to guide water-energy strategies.

    “Historically we’ve developed climate models that are global models, developed to answer global science questions,” Jones said. “But there’s an increasing demand for information at much finer spatial scales to support climate adaptation planning.”

    Ten years ago, Berkeley Lab scientists helped develop global climate models with a resolution of 200 kilometers. By 2012, the most advanced models had 25 km resolution. Now a project is underway to develop a regional climate model of the San Francisco Bay Area with resolution of 1 km, or the neighborhood level.
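The jump from 200-kilometer to 1-kilometer grids is easy to underestimate. A quick sketch of the horizontal scaling makes the cost visible (assuming, for illustration, a fixed domain; it also ignores the shorter time steps and extra vertical levels that fine grids usually require, so real costs grow even faster):

```python
# Relative number of horizontal grid cells when a climate model's
# resolution is refined: halving the cell width quadruples the cell
# count, since the grid is two-dimensional in the horizontal.
def relative_cells(old_res_km, new_res_km):
    """How many times more horizontal cells the finer grid needs."""
    return (old_res_km / new_res_km) ** 2

print(relative_cells(200, 25))   # 64.0  (200 km -> 25 km global models)
print(relative_cells(25, 1))     # 625.0 (25 km -> 1 km regional model)
```

So the step from the most advanced 25 km global models to a 1 km Bay Area model is, per unit area, a far bigger leap than the previous decade's progress, which is why it is attempted regionally rather than globally.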

    “We’ll be looking at the risk of extreme heat events and how that interacts with the microclimates of the Bay Area, and additionally, how change in the urban built environment can exacerbate or ameliorate those heat events,” Jones said. “Then we want to understand the implications of those heat events for water and energy demands.”

    The eventual goal is to transfer this model for use in other urban areas to be able to predict extreme precipitation events as well as drought and flood risk.

    Subsurface: Storage, Quality, and Movement of Water Underground

    Another Initiative focus, led by Peter Nico, head of Berkeley Lab’s Geochemistry Department, studies what’s happening underground. “We have a lot of expertise in understanding the subsurface—using various geophysical imaging techniques, measuring chemical changes, using different types of hydrologic and reactive transport models to simulate what’s happening,” he said. “So our expertise matches up very well with groundwater movement and management and groundwater quality.”

    Groundwater issues have become more important with the drought of the last four years. “California has been ‘overdrafting’ water for a long time, especially in the San Joaquin Valley, where we’re pulling more water out than is naturally infiltrating back in,” Nico said. “With the drought the use of groundwater has gone up even more. That’s causing a lot of problems, like land subsidence.”

    While there is already a lot of activity associated with groundwater management in California, Nico added, “we still can’t confidently store and retrieve clean water in the subsurface when and where we need it. We think there’s a place to contribute a more scientific chemistry- and physics-based understanding to efficient groundwater storage in California.”

    For example, Berkeley Lab scientists have expertise in using geophysical imaging, which allows them to “see” underground without drilling a well. “We have very sophisticated hydrologic and geochemical computer codes we think we can couple with imaging to predict where water will go and how its chemistry may change through storage or retrieval,” he said.

    Berkeley Lab researchers are helping test “water banking” on almond orchards. (Courtesy of Almond Board of California)

    They have a new project with the Almond Board of California to determine the ability to recharge over-drafted groundwater aquifers in the San Joaquin Valley by applying peak flood flows to active orchards, known as “water banking.” The project is part of the Almond Board’s larger Accelerated Innovation Management (AIM) program, which includes an emphasis on creating sustainable water resources. Berkeley Lab scientists will work with existing partners, Sustainable Conservation and UC Davis, who are currently conducting field trials and experiments, and contribute their expertise on the deeper subsurface, below the root zone of the almond trees, to determine what happens to banked water as it moves through the subsurface.

    Another project, led by Berkeley Lab researcher Larry Dale, is developing a model of the energy use and cost of groundwater pumping statewide in order to improve the reliability of California’s electric and water systems, especially in cases of drought and increase in electricity demand. The project has been awarded a $625,000 grant by the California Energy Commission.

    Desalination: Aiming for Pipe Parity

    Reverse osmosis is the state-of-the-art desalination technology and has been around since the 1950s. Unfortunately, there have been few breakthroughs in the field of desalination since then, and it remains prohibitively expensive. “The challenge is to lower the cost of desalination of sea water by a factor of five to achieve ‘pipe parity,’ or cost parity with water from natural sources,” said Kostecki, who is leading the project. “This is a formidable endeavor and it cannot be done with incremental improvements of existing technologies.”

    To reach this goal, Kostecki and other Berkeley Lab researchers are working on several different approaches for more efficient desalination. Some are new twists on existing technologies—such as forward osmosis using heat from geothermal sources, graphene-based membranes, and capacitive deionization—while others are forging entirely new paradigms, such as using the quantum effects in nanoconfined spaces and new nano-engineered materials architectures.

    “The reality is that if one is shooting for a 5X reduction in the cost of desalination of water, then this requires a completely new way of thinking, new science, new technology—this is what we are shooting for,” said Ramamoorthy Ramesh, Associate Lab Director for Energy Technologies.

    Some of these projects are part of the U.S./China Clean Energy Research Center for Water Energy Technologies (CERC-WET), a new $64-million collaboration between China and the United States to tackle water conservation in energy production and use. It is a cross-California collaboration led on the U.S. side by Berkeley Lab researcher Ashok Gadgil and funded primarily by the Department of Energy.

    “Berkeley Lab is ideally suited to take on the water-energy challenge,” said Ramesh. “As a multidisciplinary lab with deep expertise in energy technologies, computational sciences, energy sciences as well as earth and climate sciences, we have the opportunity to develop and integrate fundamental insights through systems-level approaches. Relevant to California, we are focusing on developing scalable water systems that are resilient in an energy-constrained and uncertain water future.”

    See the full article here.


    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 12:36 pm on May 23, 2016

    From SLAC: “Caught on Camera: First Movies of Droplets Getting Blown Up by X-ray Laser” 


    SLAC Lab

    May 23, 2016

    Details Revealed in SLAC Footage Will Give Researchers More Control in X-ray Laser Experiments

    Researchers have made the first microscopic movies of liquids getting vaporized by the world’s brightest X-ray laser at the Department of Energy’s SLAC National Accelerator Laboratory. The new data could lead to better and novel experiments at X-ray lasers, whose extremely bright, fast flashes of light take atomic-level snapshots of some of nature’s speediest processes.

    “Understanding the dynamics of these explosions will allow us to avoid their unwanted effects on samples,” says Claudiu Stan of Stanford PULSE Institute, a joint institute of Stanford University and SLAC. “It could also help us find new ways of using explosions caused by X-rays to trigger changes in samples and study matter under extreme conditions. These studies could help us better understand a wide range of phenomena in X-ray science and other applications.”


    Researchers have recorded the first movies of liquids getting vaporized by SLAC’s Linac Coherent Light Source (LCLS), the world’s brightest X-ray laser. The movies reveal new details that could lead to better and novel experiments at X-ray lasers. (SLAC National Accelerator Laboratory)
    Access the mp4 video here.

    Liquids are a common way of bringing samples into the path of the X-ray beam for analysis at SLAC’s Linac Coherent Light Source (LCLS), a DOE Office of Science User Facility, and other X-ray lasers. At full power, ultrabright X-rays can blow up samples within a tiny fraction of a second. Fortunately, in most cases researchers can take the data they need before the damage sets in.


    Access the mp4 video here.

    The new study, published* today in Nature Physics, shows in microscopic detail how the explosive interaction unfolds and provides clues as to how it could affect X-ray laser experiments.

    Stan and his team looked at two ways of injecting liquid into the path of the X-ray laser: as a series of individual drops or as a continuous jet. For each X-ray pulse hitting the liquid, the team took one image, timed from five billionths of a second to one ten-thousandth of a second after the pulse. They strung hundreds of these snapshots together into movies.
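One way to cover a delay window that spans more than four orders of magnitude, as the five-nanosecond-to-one-hundred-microsecond range quoted here does, is to space the snapshots logarithmically so early, fast dynamics and late, slow dynamics are both resolved. The helper below is an illustrative sketch; the actual frame timing used in the experiment is not described in this article:

```python
# Sketch of a pump-probe delay schedule spanning 5 ns to 100 us, with
# delays spaced by a constant multiplicative ratio (logarithmic spacing).
def delay_schedule(t_min, t_max, n_frames):
    """Return n_frames delays spaced logarithmically between t_min and t_max."""
    ratio = (t_max / t_min) ** (1 / (n_frames - 1))
    return [t_min * ratio ** k for k in range(n_frames)]

delays = delay_schedule(5e-9, 1e-4, 5)
# first delay is exactly 5e-9 s; last is ~1e-4 s, each frame ~11.9x later
print(delays[0], delays[-1])
```

With one image taken per X-ray pulse at each delay, stringing the frames together in delay order yields a movie of the explosion, which is how the strobe-illuminated snapshots described above become footage.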

    “Thanks to a special imaging system developed for this purpose, we were able to record these movies for the first time,” says co-author Sébastien Boutet from LCLS. “We used an ultrafast optical laser like a strobe light to illuminate the explosion, and made images with a high-resolution microscope that is suitable for use in the vacuum chamber where the X-rays hit the samples.”

    The footage shows how an X-ray pulse rips a drop of liquid apart. This generates a cloud of smaller particles and vapor that expands toward neighboring drops and damages them. These damaged drops then start moving toward the next-nearest drops and merge with them.


    This movie shows how a drop of liquid explodes after being struck by a powerful X-ray pulse from LCLS. The vertical white line at the center shows the position of the X-ray beam. The movie captures the first 9 millionths of a second after the explosion. (SLAC National Accelerator Laboratory)
    Access mp4 video here.

    In the case of jets, the movies show how the X-ray pulse initially punches a hole into the stream of liquid. This gap continues to grow, with the ends of the jet on either side of the gap beginning to form a thin liquid film. The film develops an umbrella-like shape, which eventually folds back and merges with the jet.


    Researchers studied the explosive interaction of X-ray pulses from LCLS with liquid jets, as shown in this movie of the first 9 millionths of a second after the explosion. (SLAC National Accelerator Laboratory)
    Access mp4 video here.

    Based on their data, the researchers were able to develop mathematical models that accurately describe the explosive behavior for a number of factors that researchers vary from one LCLS experiment to another, including pulse energy, drop size and jet diameter.
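    The article does not give the team's equations, but a common way to build such models is to fit power-law scalings to the measured quantities; here is a hypothetical sketch (the function name, the synthetic data, and the power-law form are assumptions, not the paper's actual model):

```python
import numpy as np

# Fit a power law y = A * x**k (e.g. gap size vs. pulse energy) by
# linear regression in log-log space.
def fit_power_law(x, y):
    k, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), k  # prefactor A and exponent k

# Synthetic data following y = 2 * x**0.5, to show the fit recovers it
energy = np.array([1.0, 2.0, 4.0, 8.0])
gap = 2.0 * energy**0.5
a, k = fit_power_law(energy, gap)
```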

    They were also able to predict how gap formation in jets could pose a challenge in experiments at the future light sources European XFEL in Germany and LCLS-II, under construction at SLAC. Both are next-generation X-ray lasers that will fire thousands of times faster than current facilities.

    European XFEL Test module

    SLAC LCLS-II line

    “The jets in our study took up to several millionths of a second to recover from each explosion, so if X-ray pulses come in faster than that, we may not be able to make use of every single pulse for an experiment,” Stan says. “Fortunately, our data show that we can already tune the most commonly used jets in a way that they recover quickly, and there are ways to make them recover even faster. This will allow us to make use of LCLS-II’s full potential.”
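    Stan's point about recovery time can be made concrete with a small back-of-the-envelope sketch (illustrative numbers only; actual recovery times vary by jet):

```python
import math

# If a jet needs recovery_s seconds to recover and pulses arrive every
# 1/rate seconds, pulses that hit a still-recovering jet are wasted.
def usable_pulse_fraction(rep_rate_hz, recovery_s):
    spacing = 1.0 / rep_rate_hz
    if spacing >= recovery_s:
        return 1.0  # jet fully recovers between pulses
    # otherwise only every ceil(recovery/spacing)-th pulse finds an intact jet
    return 1.0 / math.ceil(recovery_s / spacing)

# LCLS today: 120 Hz means ~8 ms between pulses, far longer than a
# microsecond-scale recovery, so every pulse is usable.
today = usable_pulse_fraction(120, 3e-6)
# A megahertz-class machine: 1 us between pulses against a 2.5 us
# recovery means only every third pulse finds a recovered jet.
future = usable_pulse_fraction(1e6, 2.5e-6)
```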

    The movies also show for the first time how an X-ray blast creates shock waves that rapidly travel through the liquid jet. The team is hopeful that these data could benefit novel experiments, in which shock waves from one X-ray pulse trigger changes in a sample that are probed by a subsequent X-ray pulse. This would open up new avenues for studies of changes in matter that occur at time scales shorter than currently accessible.

    Other institutions involved in the study were Max Planck Institute for Medical Research, Germany; Princeton University; and Paul Scherrer Institute, Switzerland. Funding was received from the DOE Office of Science; Max Planck Society; Human Frontiers Science Project; and SLAC’s Laboratory Directed Research & Development program.

    *Science paper:
    Liquid explosions induced by X-ray laser pulses

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 12:14 pm on May 23, 2016 Permalink | Reply
    Tags: , , , UW Medical Center ready to deploy tiniest pacemaker ever   

    From U Washington: “UW Medical Center ready to deploy tiniest pacemaker ever” 

    U Washington

    University of Washington

    05.20.2016
    Brian Donohue

    The old and the new: a conventional pacemaker, left, and the Medtronic Micra are displayed by UW Medicine electrophysiologists Jordan Prutkin and Kristen Patton.

    The world’s smallest pacemaker will debut soon at UW Medical Center – one of two Washington state hospitals that will offer the device for the next several months.

    Drs. Jordan Prutkin and Kristen Patton, cardiac electrophysiologists with the UW Medicine Regional Heart Center, received final training this week from representatives of Medtronic, the manufacturer of the device, named Micra.

    About as tall and wide as a AAA battery, the device is threaded up through the femoral vein to the heart, where it is attached to the right ventricle to deliver impulses when a patient’s heartbeat is too slow. The unit’s direct placement takes advantage of another advance: Its battery is inside, so there are no wires connected to a separate power source.

    For decades, pacemakers have comprised a generator, usually implanted under the skin in the patient’s left chest, and leads, which carry impulses from the generator into the heart. The wires are these devices’ main vulnerability, wearing out over time and heightening risk of infections. Removing broken leads years after implant can be problematic because they often have become enmeshed within the tissue of blood vessels.

    “That’s why this miniature technology is so important and transformative – because it really does reduce risks associated with these devices,” Patton said.

    On April 6, the U.S. Food and Drug Administration approved the Micra for patients with slow or irregular heart rhythms. The FDA based its decision on a clinical trial of 719 patients implanted with the device. In the study, 98 percent of patients experienced adequate heart pacing and fewer than 7 percent had complications such as blood clots, heart injury and device dislocation.
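    For a quick sanity check, the quoted percentages translate into approximate patient counts (simple arithmetic on the figures above):

```python
# Converting the trial's quoted rates into approximate patient counts.
patients = 719
paced_ok = round(0.98 * patients)    # adequate pacing: ~705 patients
complications_max = 0.07 * patients  # "fewer than 7 percent": under ~50 patients
```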

    The risk of dislodgement is low, Patton said. “Its tiny hooks deploy straight into the muscle and grab and it is very hard to detach.”

    The Micra will have limited applicability, at least initially, because it paces only one chamber. About 75 percent of conventional pacemakers pace at least two of the heart’s chambers.

    “This is good for people who only need pacing in the ventricle because they have atrial fibrillation in the top chamber, and for people who only need pacing a small percentage of the time,” Prutkin said.

    Illustration of the Micra being deployed into a right ventricle. Medtronic

    Similar to other single-chamber devices on the market, the Micra’s battery is projected to last 10 to 14 years, depending on how much pacing a patient requires.

    At the Micra training, Patton said, she heard something that she hadn’t expected.

    “The two physicians leading the session have a lot of experience with this device, and they said it makes a difference psychologically to patients; it removes the visible bump under the skin of the generator and that persistent reminder that ‘something is wrong with my heart.’

    “We hear from patients all the time, wondering whether they should move less to protect against lead fracture. Patients ask, ‘What if I wear a backpack? Can I still do pushups or play golf?’ This device seems to be a positive step in that way,” Patton said.

    Prutkin sees this device as the beginning of the next generation of pacemakers.

    “Right now this can only go in the ventricle, but in time this will be available for both the atria and ventricles, and multiple devices in one person will be able to talk to one another to regulate a heartbeat. That’s down the road, but that’s where this technology is heading.”

    The device also will be available at Sacred Heart Medical Center in Spokane.

    See the full article here.


    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world. We make an impact on the world every day.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 12:06 pm on May 23, 2016 Permalink | Reply
    Tags: , , , ESA Hi-Gal   

    From ESA: “The Little Fox and the Giant Stars” 

    ESA Space For Europe Banner

    European Space Agency

    23/05/2016
    No writer credit found


    New stars are the lifeblood of our Galaxy, and there is enough material revealed by this Herschel infrared image to build stars for millions of years to come.

    ESA/Herschel

    Situated 8000 light-years away in the constellation Vulpecula – Latin for little fox – the region in the image is known as Vulpecula OB1. It is a ‘stellar association’ in which a batch of truly giant ‘OB’ stars is being born.

    The vast quantities of ultraviolet and other radiation emitted by these stars are compressing the surrounding cloud, causing nearby regions of dust and gas to begin collapsing into more new stars. In time, this process will ‘eat’ its way through the cloud, transforming some of the raw material into shining new stars.

    The image was obtained as part of Herschel’s Hi-GAL key-project. This used the infrared space observatory’s instruments to image the entire galactic plane in five different infrared wavelengths.

    This 70–170–350 μm composite image shows the Galactic Plane at a longitude of 59° in the Vulpecula region. The most remarkable features are shock fronts from HII regions, bubbles, interstellar medium structured at all scales, and filamentary structures with ongoing star formation.

    These wavelengths reveal cold material, most of it between -220ºC and -260ºC. None of it can be seen at ordinary optical wavelengths, but this infrared view shows astronomers a surprising amount of structure in the cloud’s interior.
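    The connection between these temperatures and Herschel’s far-infrared bands follows from Wien’s displacement law (standard physics, not stated in the article):

```python
# Wien's displacement law: lambda_peak = b / T, with b = 2.898e-3 m*K.
# Dust at -260 C to -220 C (about 13 K to 53 K) therefore glows most
# brightly in the far infrared.
B_WIEN = 2.898e-3  # Wien displacement constant, m*K

def peak_wavelength_um(temp_c):
    t_kelvin = temp_c + 273.15
    return B_WIEN / t_kelvin * 1e6  # micrometres

warm = peak_wavelength_um(-220.0)  # ~55 um
cold = peak_wavelength_um(-260.0)  # ~220 um, inside Herschel's 70-350 um bands
```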

    The surprise is that the Hi-GAL survey has revealed a spider’s web of filaments that stretches across the star-forming regions of our Galaxy. Part of this vast network can be seen in this image as a filigree of red and orange threads.

    At visual wavelengths, the OB association is linked to a star cluster catalogued as NGC 6823. It was discovered by William Herschel in 1785 and contains 50–100 stars. A nebula emitting visible light, catalogued as NGC 6820, is also part of this multi-faceted star-forming region.

    The giant stars at the heart of Vulpecula OB1 are some of the biggest in the Galaxy. Containing dozens of times the mass of the Sun, they have short lives, astronomically speaking, because they burn their fuel so quickly.

    At an estimated age of two million years, they are already well through their lifespans. When their fuel runs out, they will collapse and explode as supernovas. The shock this will send through the surrounding cloud will trigger the birth of even more stars, and the cycle will begin again.
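    The quoted age squares with a standard back-of-the-envelope lifetime scaling (a textbook astrophysics estimate, not from the article):

```python
# Rough main-sequence lifetime scaling: t ~ 10 Gyr * (M / M_sun)**-2.5.
# Massive stars burn their fuel disproportionately fast.
def lifetime_myr(mass_solar):
    return 10e3 * mass_solar ** -2.5  # lifetime in millions of years

# A star of a few dozen solar masses lives only a couple of million
# years, consistent with the ~2-million-year age quoted above.
t_30 = lifetime_myr(30.0)
```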

    See the full article here.


    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA50 Logo large

     
  • richardmitnick 10:11 am on May 23, 2016 Permalink | Reply
    Tags: , , , ,   

    From Daily Galaxy: “”Attempt No Journey There” –Swarm of 10,000 Black Holes and Neutron Stars Orbit Milky Way’s Supermassive Black Hole” 

    Daily Galaxy
    The Daily Galaxy


    May 22, 2016

    “The giant black holes in the cores of galaxies, a million to 20 billion times heavier than the Sun, therefore, cannot have been born in the death of a star. They must have formed in some other way, perhaps by the agglomeration of many smaller black holes; perhaps by the collapse of massive clouds of gas.” ― Kip S. Thorne, The Science of Interstellar.

    “The Center of our Milky Way Galaxy is a place of extremes,” says Mark Morris, an expert on The Galactic Center at UCLA. “For every star in our nighttime sky, for example, there would be a million for someone looking up from a planet near the Galactic center.”

    Thinking about a far-future visit to our galaxy’s central zone brings to mind Arthur C. Clarke’s admonition about Jupiter’s ocean moon, Europa: “All these worlds are yours except Europa. Attempt no landing there.” In addition to the extreme star density, a swarm of 10,000 or more black holes may be orbiting the Milky Way’s supermassive black hole, according to observations from NASA’s Chandra X-ray Observatory in 2015.

    Sgr A*, the supermassive black hole at the center of the Milky Way. NASA Chandra X-Ray Observatory, 23 July 2014

    This would represent the highest concentration of black holes anywhere in the Galaxy. These relatively small, stellar-mass black holes, along with neutron stars, appear to have migrated into the Galactic Center over the course of several billion years. Could this migration be the prelude to feeding our supermassive black hole suggested by Caltech’s Kip Thorne?

    The discovery was made as part of Chandra’s ongoing program of monitoring the region around Sagittarius A* (Sgr A*), the supermassive black hole at the center of the Milky Way, reported by Michael Muno of the University of California, Los Angeles (UCLA) at a 2015 meeting of the American Astronomical Society.

    Among the thousands of X-ray sources detected within 70 light years of Sgr A*, Muno and his colleagues searched for those most likely to be active black holes and neutron stars by selecting only the brightest sources that also exhibited large variations in their X-ray output. These characteristics identify black holes and neutron stars that are in binary star systems and are pulling matter from nearby companion stars. Of the seven sources that met these criteria, four are within three light years of Sgr A*.

    “Although the region around Sgr A* is crowded with stars, we expected that there was only a 20 percent chance that we would find even one X-ray binary within a three-light-year radius,” said Muno. “The observed high concentration of these sources implies that a huge number of black holes and neutron stars have gathered in the center of the Galaxy.”

    Mark Morris, also of UCLA and a coauthor on the present work, had predicted a decade ago that a process called dynamical friction would cause stellar black holes to sink toward the center of the Galaxy. Black holes are formed as remnants of the explosions of massive stars and have masses of about 10 suns. As black holes orbit the center of the Galaxy at a distance of several light years, they pull on surrounding stars, which pull back on the black holes.



    The images above are part of a Chandra program that monitors a region around the Milky Way’s supermassive black hole, Sagittarius A* (Sgr A*). Four bright, variable X-ray sources (circles) were discovered within 3 light years of Sgr A* (the bright source just above Source C). The lower panel illustrates the strong variability of one of these sources. This variability, which is present in all the sources, is indicative of an X-ray binary system where a black hole or neutron star is pulling matter from a nearby companion star.

    “Stars are packed quite close together in the center zone,” says Morris. “Then, there’s that supermassive black hole that is sitting in there, relatively quiet for now, but occasionally producing a dramatic outpouring of energy. The UCLA Galactic Center group has been using the Keck Telescopes in Hawaii to follow its activity for the last 17 years, watching not only the fluctuating emission from the black hole, but also the stars around it as they rapidly orbit the black hole.”

    The net effect of this dynamical friction is that black holes spiral inward, and the low-mass stars move out. From the estimated number of stars and black holes in the Galactic Center region, dynamical friction is expected to produce a dense swarm of 20,000 black holes within three light years of Sgr A*. A similar effect is at work for neutron stars, but to a lesser extent because they have a lower mass.

    Once black holes are concentrated near Sgr A*, they will have numerous close encounters with normal stars there, some of which are in binary star systems. The intense gravity of a black hole can induce an ordinary star to “change partners” and pair up with the black hole while ejecting its companion. This process and a similar one for neutron stars are expected to produce several hundreds of black hole and neutron star binary systems.

    The black holes and neutron stars in the cluster are expected to gradually be swallowed by the supermassive black hole, Sgr A*, at a rate of about one every million years. At that rate, about 10,000 black holes and neutron stars would have been captured over roughly ten billion years, adding about 3 percent to the mass of the central supermassive black hole, which is currently estimated to contain the mass of 3.7 million suns.
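    The capture arithmetic can be checked in a few lines (simple order-of-magnitude bookkeeping on the figures in the text, not from the article itself):

```python
# One ~10-solar-mass object swallowed per million years, accumulated
# over roughly the Galaxy's ten-billion-year lifetime, compared with a
# central black hole of ~3.7 million solar masses.
rate_per_year = 1e-6      # captures per year (one per million years)
elapsed_years = 1e10      # order of the Galaxy's age
captured = rate_per_year * elapsed_years  # ~10,000 objects
mass_added = captured * 10                # solar masses added
fraction = mass_added / 3.7e6             # ~3 percent of Sgr A*'s mass
```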

    In the meantime, the acceleration of low-mass stars by black holes will eject low-mass stars from the central region. This expulsion will reduce the likelihood that normal stars will be captured by the central supermassive black hole. This may explain why the central regions of some galaxies, including the Milky Way, are fairly quiet even though they contain a supermassive black hole.

    See the full article here.


     
  • richardmitnick 4:25 pm on May 22, 2016 Permalink | Reply
    Tags: , , U Penn   

    From Penn: “Breaking Down Cancer Cell Defenses” 

    U Penn bloc

    University of Pennsylvania

    May 20, 2016
    No writer credit found

    Inhibiting Membrane Enzyme May Make Some Cancer Cells More Vulnerable to Chemotherapy, Finds Penn Study

    The mistaken activation of certain cell-surface receptors contributes to a variety of human cancers. A better understanding of the activation process has allowed researchers to make cancer cells more vulnerable to an existing first-line treatment for cancers (mainly lung) driven by a receptor called EGFR. The team, led by Eric Witze, PhD, an assistant professor of Cancer Biology in the Perelman School of Medicine at the University of Pennsylvania, published* their findings this month in Molecular Cell.

    “We found that inhibiting an enzyme that adds the fatty acid palmitate onto proteins creates dependence by cancer cells on EGFR signaling for survival,” Witze said. By using a small molecule called 2-bromo-palmitate (2BP) that inhibits these palmitate-adding enzymes, the researchers surmise that cancer patients might be able to one day make their cells more sensitive to cancer-fighting EGFR inhibitors.

    Palmitate is the most common fatty acid found in animals, plants, and microbes, although it is not well studied. Proteins that have palmitate bound to them are usually associated with the cell membrane. Palmitate allows these proteins to transfer chemical signals from outside the cell to inside via the cell membrane.

    EGFR itself is a transmembrane protein associated with palmitate, and by blocking palmitate, EGFR becomes hyperactivated. “We thought that this finding would be ‘good’ for the cancer, but ‘bad’ for a cancer patient,” Witze said. In cancers not related to EGFR signaling, this relationship is correct; however, in cancers related to EGFR, if the palmitate-adding enzyme is inhibited, EGFR is activated, but cancer cells grow more slowly.

    In addition, if gefitinib, an EGFR inhibitor already on the market for lung cancer, is added to the cells, they die. This finding is somewhat counterintuitive with regard to cell growth, since EGFR activation functions as a positive growth signal, the researchers note; the fact that cells die when EGFR is inhibited, however, shows the cells are now addicted to the EGFR signal.

    “It’s as if a switch is stuck on,” Witze said. “The cell loses control of the growth signal.” If no palmitate is associated with EGFR, the cell loses control of this signal, and if the EGFR inhibitor is added, the cells die.

    The research shows that the reversible modification of EGFR with palmitate “pins” the tail of EGFR to the cell, impeding EGFR activation. The researchers think when the tail is no longer able to be pinned to the membrane the switch is stuck in the “on” position.

    Currently, the experimental 2BP compound inhibits any enzyme that uses palmitate as a substrate, making it toxic to most cells. “We need to find a compound specific for the palmitate-adding enzyme, or modify 2BP to make it more specific, to decrease unwanted side effects,” Witze said.

    Kristin B. Runkle, Akriti Kharbanda, Ewa Stypulkowski, Xing-Jun Cao, Wei Wang, and Benjamin A. Garcia, all from Penn are co-authors.

    This work was funded by the National Institutes of Health (R01CA181633, T32-CA-557726-07), the American Cancer Society (RSG-15-027-01, IRG-78-002-34) and the Department of Defense (BC123187P1).

    *Science paper:
    Inhibition of DHHC20-Mediated EGFR Palmitoylation Creates a Dependence on EGFR Signaling

    See the full article here.


    U Penn campus

    Academic life at Penn is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

     
  • richardmitnick 4:01 pm on May 22, 2016 Permalink | Reply
    Tags: , ,   

    From SA: “Nanosized Materials Help Electronics Compute Like Real Brains” 

    Scientific American

    Scientific American

    May 20, 2016
    Michael Torrice, Chemical & Engineering News

    Credit: Eyewire/Getty Images (MARS)

    Although processors have gotten smaller and faster over time, few computers can compete with the speed and computing power of the human brain. And none comes close to the organ’s energy efficiency. So some engineers want to develop electronics that mimic how the brain computes to build more powerful and efficient devices.

    A team at IBM Research, Zurich, now reports that nanosized devices made from phase-change materials can mimic how neurons fire to perform certain calculations (Nat. Nanotechnol. 2016, DOI: 10.1038/nnano.2016.70).

    This report “shows quite concretely that we can make simple but effective hardware mimics of neurons, which could be made really small and therefore have low operating powers,” says C. David Wright, an electrical engineer at the University of Exeter who wrote a commentary accompanying the new article.

    The IBM team’s device imitates how an individual neuron integrates incoming signals from other neurons to determine when it should fire. These input signals change the electrical potential across the neuron’s membrane—some increase it, others decrease it. Once that potential passes a certain threshold, the neuron fires.

    Previously, engineers have mimicked this process using combinations of capacitors and silicon transistors, which can be complex and difficult to scale down, Wright explains in his commentary.

    In the new work, IBM’s Evangelos Eleftheriou and colleagues demonstrate a potentially simpler system that uses a phase-change material to play the part of a neuron’s membrane potential. The doped chalcogenide Ge2Sb2Te5, which has been tested in conventional memory devices, can exist in two phases: a glassy amorphous state and a crystalline one. Electrical pulses slowly convert the material from amorphous to crystalline, which, in turn, changes its conductance. At a certain level of phase change, the material’s conductance suddenly jumps, and the device fires like a neuron.
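    The integrate-and-fire behavior described above can be illustrated with a minimal software sketch (a conceptual toy, not IBM's device model; the threshold and pulse amplitudes are made up):

```python
# Each input pulse nudges an internal state -- standing in for the
# degree of crystallization, and hence the conductance, of the
# phase-change material. Once the state crosses a threshold the
# "neuron" fires and the device is reset (re-amorphized).
class PhaseChangeNeuron:
    def __init__(self, threshold=1.0):
        self.state = 0.0
        self.threshold = threshold

    def pulse(self, amplitude):
        """Apply one electrical pulse; return True if the neuron fires."""
        self.state += amplitude
        if self.state >= self.threshold:
            self.state = 0.0  # reset after firing
            return True
        return False

neuron = PhaseChangeNeuron(threshold=1.0)
fires = [neuron.pulse(0.3) for _ in range(5)]  # fires on the 4th pulse
```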

    The IBM team tested a mushroom-shaped device consisting of a 100-nm-thick layer of the chalcogenide sandwiched between two electrodes. In one demonstration, they used the neuronlike device to detect correlations in 1,000 streams of binary data. Such a calculation could spot trends in social media chatter or even in stock market transactions, Wright says.
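    A software stand-in gives a feel for the kind of task described here: picking out mutually correlated streams from many binary data streams. This is not IBM's algorithm (their device does it with pulse timing in hardware), and all parameters below are made up:

```python
import random

random.seed(0)
n_streams, n_steps = 100, 2000
common = [random.random() < 0.1 for _ in range(n_steps)]  # shared event train

streams = []
for i in range(n_streams):
    if i < 10:  # the first 10 streams carry the common signal plus noise
        streams.append([c or (random.random() < 0.05) for c in common])
    else:       # the rest are independent noise
        streams.append([random.random() < 0.15 for _ in range(n_steps)])

# Score each stream by the average population activity at the moments it
# fires: correlated streams fire when many others do, so they score high.
population = [sum(col) for col in zip(*streams)]
score = [
    sum(p for bit, p in zip(s, population) if bit) / sum(s)
    for s in streams
]
ranked = sorted(range(n_streams), key=lambda i: score[i], reverse=True)
top10 = set(ranked[:10])  # should be dominated by the correlated streams
```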

    He also points out that the devices fire faster than actual neurons, on a nanosecond timescale compared with a millisecond one. The neuron mimics, Wright says, are another step toward hardware that can process information as the brain does but at speeds orders of magnitude faster than the organ. “That could do some remarkable things.”

    See the full article here.


    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     