Updates from richardmitnick

  • richardmitnick 7:50 pm on September 30, 2014

    From Astronomy: “New molecule found in space connotes life origins” 


    September 29, 2014
    No Writer Credit
    Cornell University, Ithaca, New York

    Hunting from a distance of 27,000 light-years, astronomers have discovered an unusual carbon-based molecule — one with a branched structure — contained within a giant gas cloud in interstellar space. Like finding a molecular needle in a cosmic haystack, astronomers have detected radio waves emitted by isopropyl cyanide. The discovery suggests that the complex molecules needed for life may have their origins in interstellar space.

    Dust and molecules in the central region of our galaxy: The background image shows the dust emission in a combination of data obtained with the APEX telescope and the Planck space observatory at a wavelength around 860 micrometers. The organic molecule iso-propyl cyanide with a branched carbon backbone (i-C3H7CN, left) as well as its straight-chain isomer normal-propyl cyanide (n-C3H7CN, right) were both detected with the Atacama Large Millimeter/submillimeter Array in the star-forming region Sgr B2, about 300 light years away from the galactic center Sgr A*.
    MPIfR/A. Weiß (background image); University of Cologne/M. Koerber (molecular models); MPIfR/A. Belloche (montage)

    Using the Atacama Large Millimeter/submillimeter Array (ALMA), researchers studied the gaseous star-forming region Sagittarius B2.


    Organic molecules usually found in these star-forming regions consist of a single “backbone” of carbon atoms arranged in a straight chain. But the carbon structure of isopropyl cyanide branches off, making it the first interstellar detection of such a molecule, said Rob Garrod from Cornell University in Ithaca, New York.
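
    To make the branching concrete, here is a minimal sketch (not part of the original article) comparing the two isomers as SMILES strings with the RDKit cheminformatics toolkit; it assumes RDKit is installed:

    # A small sketch (assumes the rdkit package is available) showing that the
    # branched and straight-chain isomers share a formula but differ in structure.
    from rdkit import Chem
    from rdkit.Chem.rdMolDescriptors import CalcMolFormula

    iso = Chem.MolFromSmiles("CC(C)C#N")   # isopropyl cyanide: branched backbone
    normal = Chem.MolFromSmiles("CCCC#N")  # normal-propyl cyanide: straight chain

    print(CalcMolFormula(iso))     # C4H7N
    print(CalcMolFormula(normal))  # C4H7N -- same atoms, different connectivity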

    This detection opens a new frontier in the complexity of molecules that can be formed in interstellar space and that might ultimately find their way to the surfaces of planets, said Garrod. The branched carbon structure of isopropyl cyanide is a common feature in molecules that are needed for life — such as amino acids, which are the building blocks of proteins. This new discovery lends weight to the idea that biologically crucial molecules, like amino acids that are commonly found in meteorites, are produced early in the process of star formation — even before planets such as Earth are formed.

    Garrod, along with Arnaud Belloche and Karl Menten, both of the Max Planck Institute for Radio Astronomy, and Holger Müller of the University of Cologne, sought to examine the chemical makeup of Sagittarius B2, a region close to the Milky Way’s galactic center and an area rich in complex interstellar organic molecules.

    With ALMA, the group conducted a full spectral survey looking for fingerprints of new interstellar molecules — with sensitivity and resolution 10 times greater than previous surveys.

    The purpose of the ALMA Observatory is to search for cosmic origins through an array of 66 sensitive radio antennas from the high elevation and dry air of northern Chile’s Atacama Desert. The array of radio telescopes works together to form a gigantic “eye” peering into the cosmos.

    “Understanding the production of organic material at the early stages of star formation is critical to piecing together the gradual progression from simple molecules to potentially life-bearing chemistry,” said Belloche.

    About 50 individual features for isopropyl cyanide and 120 for normal-propyl cyanide — its straight-chain sister molecule — were identified in the ALMA spectrum of the Sagittarius B2 region. The two molecules — isopropyl cyanide and normal-propyl cyanide — are also the largest molecules yet detected in any star-forming region.

    See the full article here.

  • richardmitnick 5:41 pm on September 30, 2014

    From SPACE.com: “Search for Alien Life Should Target Water, Oxygen and Chlorophyll” 


    September 30, 2014
    Mike Wall

    The next generation of space telescopes hunting for signs of extraterrestrial life should focus on water, then oxygen and then alien versions of the plant chemical chlorophyll, a new study suggests.

    In the past 20 years or so, astronomers have confirmed the existence of nearly 2,000 worlds outside Earth’s solar system. Many of these exoplanets lie in the habitable zones of stars, areas potentially warm enough for the worlds to harbor liquid water on their surfaces. Astrobiologists hope that life may someday be spotted on such alien planets, since there is life pretty much everywhere water exists on Earth.

    One strategy for discovering signs of such alien life involves looking for ways that organisms might change a world’s appearance. For example, chemicals in a planet’s atmosphere or on its surface shape the planet’s spectrum by adding or removing light at particular wavelengths. Alien-hunting telescopes could look for spectra that reveal chemicals associated with life. In other words, these searches would focus on biosignatures — chemicals or combinations of chemicals that life could produce but that other processes could not, or would be unlikely to, create.

    Astrophysicists Timothy Brandt and David Spiegel at the Institute for Advanced Study in Princeton, New Jersey, sought to see how challenging it might be to conclusively identify signatures of water, oxygen and chlorophyll — the green pigment that plants use to convert sunlight to energy — on a distant twin of Earth using a future off-Earth instrument such as NASA’s proposed Advanced Technology Large-Aperture Space Telescope (ATLAST).

    Two conceptual schemes for ATLAST: an 8-meter monolithic mirror telescope (credit: MSFC Advanced Concepts Office) and a 16-meter segmented mirror telescope (credit: Northrop Grumman Aerospace Systems & NASA/STScI).

    The scientists found that water would be the easiest to detect.

    “Water is a very common molecule, and I think a mission to take spectra of exoplanets should certainly look for water,” said Brandt, the lead study author. “Indeed, we have found water in a few gas giants more massive than Jupiter orbiting other stars.”

    In comparison, oxygen is more difficult to detect than previously thought, requiring scientific instruments approximately twice as sensitive as those needed to detect water and significantly better at discriminating between similar colors of light.

    “Oxygen, however, has only been a large part of Earth’s atmosphere for a few hundred million years,” Brandt said. “If we see it in an exoplanet, it probably points to life, but not finding oxygen certainly does not mean that the planet is sterile.”

    Although a well-designed space telescope could detect water and oxygen on a nearby Earth twin, the astrophysicists found the instrument would need to be significantly more sensitive, or very lucky, to see chlorophyll. Identifying this chemical typically requires scientific instruments about six times more sensitive than those needed for oxygen. Chlorophyll becomes as detectable as oxygen only when an exoplanet has a lot of vegetation and/or little in the way of cloud cover, researchers said.

    Chlorophyll slightly reddens the light from Earth. If extraterrestrial life does convert sunlight to energy as plants do, scientists expect that the alien process might use a different pigment than chlorophyll. But alien photosynthesis could also slightly redden planets, just as chlorophyll does.

    “Light comes in packets called photons, and only photons with at least a certain amount of energy are useful for photosynthesis,” Brandt said. Chlorophyll reflects photons that are too red and low in energy to be used for photosynthesis, and it may be reasonable to assume that extraterrestrial pigments would do the same thing, Brandt noted.
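
    A rough back-of-the-envelope check (our illustration, not from the article): the energy of a photon near chlorophyll’s “red edge”, taken here as roughly 700 nanometers, follows from E = hc/λ.

    # Photon energy at an assumed ~700 nm red edge; constants are textbook values.
    h = 6.626e-34        # Planck constant, J*s
    c = 2.998e8          # speed of light, m/s
    wavelength = 700e-9  # meters (assumed red-edge wavelength, for illustration)

    energy_ev = h * c / wavelength / 1.602e-19
    print(f"{energy_ev:.2f} eV")  # ~1.77 eV; redder photons carry less energy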

    The researchers suggest a strategy for discovering Earthlike alien life that first looks for water, then oxygen on the more favorable planets and finally chlorophyll on only the most exceptionally promising worlds.

    “The goal of a future space telescope will be primarily to detect water and oxygen on a planet around a nearby star,” Brandt said. “The construction and launch of such a telescope will probably cost at least $10 billion and won’t happen for at least 20 years — a lot of technology development needs to happen first — but it could be the most exciting mission of my lifetime.”

    Brandt and Spiegel detailed their findings online Sept. 1 in the journal Proceedings of the National Academy of Sciences.

    See the full article here.

  • richardmitnick 5:17 pm on September 30, 2014
    Tags: STEM

    From FNAL: “High school students advance particle physics and their own science education at Fermilab” 


    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Tuesday, Sept. 30, 2014
    Leah Hesla

    As an eighth grader, Paul Nebres took part in a 2012 field trip to Fermilab. He learned about the laboratory’s exciting scientific experiments, said hello to a few bison and went home inspired.

    Illinois Mathematics and Science Academy students Nerione Agrawal (left) and Paul Nebres (right) work on the Muon g-2 experiment through the Student Inquiry and Research program. Muon g-2 scientist Brendan Kiburg (center) co-mentors the students. Photo: Fermilab

    Now a junior at the Illinois Mathematics and Science Academy (IMSA) in Aurora, Nebres is back at Fermilab, this time actively contributing to its scientific program. He’s been working on the Muon g-2 project since the summer, writing software that will help shape the magnetic field that guides muons around a 150-foot-circumference muon storage ring.

    Nebres is one of 13 IMSA students at Fermilab. The high school students are part of the academy’s Student Inquiry and Research program, or SIR. Every Wednesday over the course of a school year, the students use these weekly Inquiry Days to work at the laboratory, putting their skills to work and learning new ones that advance their understanding in the STEM fields.

    The program is a win for both the laboratory and the students, who work on DZero, MicroBooNE, MINERvA and electrical engineering projects, in addition to Muon g-2.

    “You can throw challenging problems at these students, problems you really want solved, and then they contribute to an important part of the experiment,” said Muon g-2 scientist Brendan Kiburg, who co-mentors a group of four SIR students with scientists Brendan Casey and Tammy Walton. “Students can build on various aspects of the projects over time toward a science result and accumulate quite a nice portfolio.”

    This year roughly 250 IMSA students are in the broader SIR program, conducting independent research projects at Argonne National Laboratory, the University of Chicago and other Chicago-area institutions.

    IMSA junior Nerione Agrawal, who started in the SIR program this month, uses her background in computing and engineering to simulate the potential materials that will be used to build Muon g-2 detectors.

    “I’d been to Fermilab a couple of times before attending IMSA, and when I found out that you could do an SIR at Fermilab, I decided I wanted to do it,” she said. “I’ve really enjoyed it so far. I’ve learned so much in three weeks alone.”

    The opportunities for students at the laboratory extend beyond their particular projects.

    “We had the summer undergraduate lecture series, so apart from doing background for the experiment, I learned what else is going on around Fermilab, too,” Nebres said. “I didn’t expect the amount of collaboration that goes on around here to be at the level that it is.”

    In April, every SIR student will create a poster on his or her project and give a short talk at the annual IMSAloquium.

    Kiburg encourages other researchers at the lab to advance their projects while nurturing young talent through SIR.

    “This is an opportunity to let a creative person take the reins of a project, steward it to completion or to a point that you could pick up where they leave off and finish it,” he said. “There’s a real deliverable outcome. It’s inspiring.”

    See the full article here.


    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.

  • richardmitnick 4:55 pm on September 30, 2014

    From NASA/SWIFT: “NASA’s Swift Mission Observes Mega Flares from a Mini Star” 


    September 30, 2014
    Francis Reddy
    NASA’s Goddard Space Flight Center, Greenbelt, Maryland

    On April 23, NASA’s Swift satellite detected the strongest, hottest, and longest-lasting sequence of stellar flares ever seen from a nearby red dwarf star. The initial blast from this record-setting series of explosions was as much as 10,000 times more powerful than the largest solar flare ever recorded.

    “We used to think major flaring episodes from red dwarfs lasted no more than a day, but Swift detected at least seven powerful eruptions over a period of about two weeks,” said Stephen Drake, an astrophysicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, who gave a presentation on the “superflare” at the August meeting of the American Astronomical Society’s High Energy Astrophysics Division. “This was a very complex event.”

    At its peak, the flare reached temperatures of 360 million degrees Fahrenheit (200 million Celsius), more than 12 times hotter than the center of the sun.
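
    A quick unit check on those figures (our arithmetic, not the article’s):

    # Convert the quoted flare temperature and compare it with the Sun's core,
    # taken here as ~15.7 million kelvin (a standard solar-model value).
    t_f = 360e6
    t_c = (t_f - 32) * 5 / 9   # ~2.0e8 degrees Celsius, matching the article
    sun_core_k = 15.7e6        # at these magnitudes the C/K offset is negligible
    print(t_c / sun_core_k)    # ~12.7 -- "more than 12 times hotter"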


    In April 2014, NASA’s Swift mission detected a massive superflare from a red dwarf star in the binary system DG CVn, located about 60 light-years away. Astronomers Rachel Osten of the Space Telescope Science Institute and Stephen Drake of NASA Goddard discuss this remarkable event.
    Image Credit: NASA’s Goddard Space Flight Center/S. Wiessinger

    The “superflare” came from one of the stars in a close binary system known as DG Canum Venaticorum, or DG CVn for short, located about 60 light-years away. Both stars are dim red dwarfs with masses and sizes about one-third of our sun’s. They orbit each other at about three times Earth’s average distance from the sun, which is too close for Swift to determine which star erupted.

    “This system is poorly studied because it wasn’t on our watch list of stars capable of producing large flares,” said Rachel Osten, an astronomer at the Space Telescope Science Institute in Baltimore and a deputy project scientist for NASA’s James Webb Space Telescope, now under construction. “We had no idea DG CVn had this in it.”

    Most of the stars lying within about 100 light-years of the solar system are, like the sun, middle-aged. But a thousand or so young red dwarfs born elsewhere drift through this region, and these stars give astronomers their best opportunity for detailed study of the high-energy activity that typically accompanies stellar youth. Astronomers estimate DG CVn was born about 30 million years ago, which makes it less than 0.7 percent the age of the solar system.

    Stars erupt with flares for the same reason the sun does. Around active regions of the star’s atmosphere, magnetic fields become twisted and distorted. Much like winding up a rubber band, this twisting allows the fields to accumulate energy. Eventually a process called magnetic reconnection destabilizes the fields, resulting in the explosive release of the stored energy we see as a flare. The outburst emits radiation across the electromagnetic spectrum, from radio waves to visible, ultraviolet and X-ray light.

    At 5:07 p.m. EDT on April 23, the rising tide of X-rays from DG CVn’s superflare triggered Swift’s Burst Alert Telescope (BAT). Within several seconds of detecting a strong burst of radiation, the BAT calculates an initial position, decides whether the activity merits investigation by other instruments and, if so, sends the position to the spacecraft. In this case, Swift turned to observe the source in greater detail, and, at the same time, notified astronomers around the globe that a powerful outburst was in progress.
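
    The paragraph above describes an onboard decision chain: detect, localize, decide, slew, notify. The sketch below is purely illustrative pseudologic; every name and threshold in it is hypothetical, not Swift flight software:

    # Hypothetical sketch of a burst-alert decision chain (illustration only).
    def localize_source():
        return (123.4, -56.7)  # placeholder sky position (RA, Dec in degrees)

    def merits_followup(position):
        return True            # stand-in for the onboard figure-of-merit test

    def on_bat_trigger(counts, background, threshold=8.0):
        significance = (counts - background) / background ** 0.5
        if significance < threshold:
            return None        # consistent with noise; take no action
        position = localize_source()
        if merits_followup(position):
            print("slewing to", position, "and alerting ground observers")
        return position

    on_bat_trigger(counts=50_000.0, background=20_000.0)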

    “For about three minutes after the BAT trigger, the superflare’s X-ray brightness was greater than the combined luminosity of both stars at all wavelengths under normal conditions,” noted Goddard’s Adam Kowalski, who is leading a detailed study on the event. “Flares this large from red dwarfs are exceedingly rare.”

    The star’s brightness in visible and ultraviolet light, measured both by ground-based observatories and Swift’s Optical/Ultraviolet Telescope, rose by 10 and 100 times, respectively. The initial flare’s X-ray output, as measured by Swift’s X-Ray Telescope, puts even the most intense solar activity recorded to shame.

    The largest solar explosions are classified as extraordinary, or X class, solar flares based on their X-ray emission. “The biggest flare we’ve ever seen from the sun occurred in November 2003 and is rated as X 45,” explained Drake. “The flare on DG CVn, if viewed from a planet the same distance as Earth is from the sun, would have been roughly 10,000 times greater than this, with a rating of about X 100,000.”
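
    For context, the X classification is pegged to peak soft X-ray flux as measured by the GOES satellites: a class Xn flare peaks at n × 10^-4 watts per square metre. A small illustration of the scale (our sketch, not from the article):

    # GOES soft X-ray classes: an Xn flare has a peak flux of n * 1e-4 W/m^2.
    def x_class_flux(n):
        return n * 1e-4           # watts per square metre at Earth

    print(x_class_flux(45))       # 4.5e-3 W/m^2 -- the record 2003 solar flare
    print(x_class_flux(100_000))  # 10 W/m^2 -- DG CVn's flare at Earth's distance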

    But it wasn’t over yet. Three hours after the initial outburst, with X-rays on the downswing, the system exploded with another flare nearly as intense as the first. These first two explosions may be an example of “sympathetic” flaring often seen on the sun, where an outburst in one active region triggers a blast in another.

    Over the next 11 days, Swift detected a series of successively weaker blasts. Osten compares the dwindling series of flares to the cascade of aftershocks following a major earthquake. All told, the star took a total of 20 days to settle back to its normal level of X-ray emission.

    How can a star just a third the size of the sun produce such a giant eruption? The key factor is its rapid spin, a crucial ingredient for amplifying magnetic fields. The flaring star in DG CVn rotates in under a day, about 30 or more times faster than our sun. The sun also rotated much faster in its youth and may well have produced superflares of its own, but, fortunately for us, it no longer appears capable of doing so.

    Astronomers are now analyzing data from the DG CVn flares to better understand the event in particular and young stars in general. They suspect the system likely unleashes numerous smaller but more frequent flares and plan to keep tabs on its future eruptions with the help of NASA’s Swift.

    See the full article, with video, here.

    The Swift Gamma-Ray Burst Mission consists of a robotic spacecraft called Swift, which was launched into orbit on November 20, 2004, at 17:16:00 UTC on a Delta II 7320-10C expendable launch vehicle. Swift is managed by the NASA Goddard Space Flight Center, and was developed by an international consortium from the United States, United Kingdom, and Italy. It is part of NASA’s Medium Explorer Program (MIDEX).


  • richardmitnick 4:34 pm on September 30, 2014
    Tags: Triumf ARIEL LINAC

    From TRIUMF: “ARIEL E-linac Meets Mega-Volt Milestones!” 

    30 September 2014
    Prepared by Shane Koscielniak, Lia Merminga, and Bob Laxdal.

    The campaign to demonstrate a 10 MV/m accelerating gradient in the superconducting radio-frequency (SRF) cavities of the E-linac Injector (EINJ) and Accelerator (EACA) cryomodules achieved two critically important milestones last week: on September 23rd, the EINJ cavity reached 12 MV/m in continuous-wave (c.w.) operation, exceeding the design specification, and on September 24th, the EACA cavity reached 10 MV/m in c.w. operation, meeting the design specification!

    ARIEL SUPERCONDUCTING ELECTRON LINAC

    Reaching a high gradient means being able to impart a large energy gain to a particle beam over a short distance; in this case, 10 million electron volts over 1 metre of length! To achieve this, high power from the klystrons, on the order of 20 kW, is fed to the cavity through input couplers.
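
    The arithmetic behind that statement is direct, since energy gain equals gradient times length; a trivial check, added here for illustration:

    # Energy gain of a singly charged particle crossing an accelerating cavity:
    # delta_E [MeV] = gradient [MV/m] * effective length [m]
    for gradient_mv_per_m in (10.0, 12.0):
        print(gradient_mv_per_m * 1.0, "MeV per metre")  # 10 MeV spec; 12 MeV reached by EINJ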

    The “10MV/m SRF squad”: R. Nagimov, P. Kolb, R. Laxdal, Y.Y. Ma, W. Rawnsley, Z. Yao, V. Zvyagintsev.

    On Sept 23rd, the SRF squad produced initial results on the EINJ cavity. “We have reached 12 MV/m in both pulsed (2% duty factor) and c.w. mode. The cavity looks clean – the present limitation is only in the coupler conditioning,” reported Bob Laxdal, the SRF Group Leader.

    And on Sept 24th, the SRF squad pushed the EACA performance to 10 MV/m in c.w. operation. This builds on previous tests conducted Sept 10-13, where the EACA cavity attained 7 MV/m in c.w. and 10 MV/m in pulsed operation. “We had no doubt that the input coupler conditioning in the intervening period would make a difference, and it paid off,” says SRF Engineer Vladimir Zvyagintsev. “We will continue to condition in the coming days to further improve the performance.”

    The significance of the two results is that the cavities have reached or exceeded the specified performance for acceleration and rf power and the team believes they can go higher. “This paves the way to accelerate the electron beam up to 25 MeV later this week”, says E-linac Project Leader Shane Koscielniak. “The SRF results are pivotal to the project, and I am excited that we will be commissioning the linac with electron beam this week,” he added.

    The campaign leading to the SRF 10 Mega-Volt Milestones was five years in the making. The SRF team collaborated with the Canadian company PAVAC Industries to produce the first multi-cell elliptical cavities built in Canada, which were then processed and individually characterized in TRIUMF test cryostats, developed specifically for the ARIEL project. Several tests were required before the cavities were sufficiently optimized for assembly into the cryomodules. The cryomodules are the life support systems for the cavities – a super-thermos that allows control of the cavity while keeping it thermally isolated. Their complex design was done completely in house by the SRF team over a three year period. The two cryomodules (EINJ and EACA) have been assembled. Critical for the performance is assembly in a Class 10 clean room to keep the rf surfaces free from dust that would reduce performance.

    The cryomodules were installed in the Electron Hall during the spring and summer. A cryogenic distribution system connects the two cryomodules to the cryogenic service, supplying 77-kelvin liquid nitrogen (LN2) for thermal shielding and 4-kelvin liquid helium (LHe) to cool the cavities. The final cooling of the RF cavity to 2 kelvin is accomplished by sub-atmospheric pumping.

    Two new 300 kW klystrons were installed, commissioned and connected to the cryomodules with waveguides, while two sophisticated TRIUMF-designed control systems were installed and commissioned to regulate the voltage produced by the cavity. The multitude of control and diagnostic cables was interfaced to the e-linac EPICS control system, and the thermal isolation volume and the cavity/beamline volume were evacuated and leak tested. The cooling of the cryomodules proceeded very smoothly, and the cryogenic performance of the modules was measured and matched the design specification. Bob Laxdal confirmed, “The cryo-engineering of the modules is very solid – all aspects of the performance reflect the design requirements.”

    Thanks to heroic efforts by many TRIUMF groups, including cryogenics, controls, vacuum, high-power and low-level RF, preparations were complete for the SRF group to begin characterizing the cavity performance. The first step in applying radio-frequency power (to each cryomodule) was to condition the two input couplers; RF waves are applied to their interior metallic surfaces to eliminate electron emission. A key step was to tune the coupler and cavity so that RF power goes into the cavity, rather than being reflected away from it. “Achieving ‘lock’ of the RF components all to the same frequency is a big moment that unites hardware, electronics of the low level RF system, and controls,” Shane pointed out.

    The power from the klystron was increased slowly from a few hundred watts to the 20 kW level, first in pulsed mode and then in continuous mode. Interlock checks were done to avoid the potential for rapid heating of the helium due to an errant rf event. In the end, all aspects of the power delivery, cryogenics, cryo-engineering, low-level rf and cavity performance were successful. Vladimir Zvyagintsev, SRF Engineer, was part of the team that ‘energized’ the cavities. “It is very unusual to bring up a high power cold cavity system in such a short time, but to bring up two systems in two consecutive nights is remarkable and speaks to the quality of the installation.”

    “The rf milestones this week represent an enormous effort by a talented team. We are proud of what they have accomplished,” stated Bob Laxdal.

    “Cavity successfully energized”. The team: Chang Wei (visitor from IMP), P. Kolb, K. Fong, V. Zvyagintsev, Z. Yao, M. Laverty, Liu Yang.

    All the hard work has paid off. The ARIEL e-linac met its Mega-Volt Milestones. The ARIEL-1 project is on target!

    See the full article here.

    World Class Science at TRIUMF Lab, British Columbia, Canada
    Canada’s national laboratory for particle and nuclear physics
    Member Universities:
    University of Alberta, University of British Columbia, Carleton University, University of Guelph, University of Manitoba, Université de Montréal, Simon Fraser University, Queen’s University, University of Toronto, University of Victoria, York University. Not too shabby, eh?

    Associate Members:
    University of Calgary, McMaster University, University of Northern British Columbia, University of Regina, Saint Mary’s University, University of Winnipeg. How bad is that!

  • richardmitnick 4:10 pm on September 30, 2014
    Tags: NASA NISAR

    From NASA: “U.S., India to Collaborate on Mars Exploration, Earth-Observing Mission” 


    September 30, 2014
    Steve Cole
    Headquarters, Washington
    202-358-0918
    stephen.e.cole@nasa.gov

    In a meeting Tuesday in Toronto, NASA Administrator Charles Bolden and K. Radhakrishnan, chairman of the Indian Space Research Organisation (ISRO), signed two documents to launch a NASA-ISRO satellite mission to observe Earth and establish a pathway for future joint missions to explore Mars.

    NASA Administrator Charles Bolden (left) and Chairman K. Radhakrishnan of the Indian Space Research Organisation signing documents in Toronto on Sept. 30, 2014 to launch a joint Earth-observing satellite mission and establish a pathway for future joint missions to explore Mars. Image Credit: NASA

    While attending the International Astronautical Congress, the two space agency leaders met to discuss and sign a charter that establishes a NASA-ISRO Mars Working Group to investigate enhanced cooperation between the two countries in Mars exploration. They also signed an international agreement that defines how the two agencies will work together on the NASA-ISRO Synthetic Aperture Radar (NISAR) mission, targeted to launch in 2020.

    An artist’s concept of the planned NASA-ISRO Synthetic Aperture Radar, or NISAR, satellite in orbit, showing the large deployable mesh antenna, solar panels and radar electronics attached to the spacecraft. The mission is a partnership between NASA and the Indian Space Research Organization. Image credit: NASA/JPL-Caltech

    “The signing of these two documents reflects the strong commitment NASA and ISRO have to advancing science and improving life on Earth,” said NASA Administrator Charles Bolden. “This partnership will yield tangible benefits to both our countries and the world.”

    The joint Mars Working Group will seek to identify and implement scientific, programmatic and technological goals that NASA and ISRO have in common regarding Mars exploration. The group will meet once a year to plan cooperative activities, including potential NASA-ISRO cooperation on future missions to Mars.

    Both agencies have newly arrived spacecraft in Mars orbit. NASA’s Mars Atmosphere and Volatile EvolutioN (MAVEN) spacecraft arrived at Mars Sept. 21. MAVEN is the first spacecraft dedicated to exploring the tenuous upper atmosphere of Mars. ISRO’s Mars Orbiter Mission (MOM), India’s first spacecraft launched to Mars, arrived Sept. 23 to study the Martian surface and atmosphere and demonstrate technologies needed for interplanetary missions.


    One of the working group’s objectives will be to explore potential coordinated observations and science analysis between MAVEN and MOM, as well as other current and future Mars missions.

    “NASA and Indian scientists have a long history of collaboration in space science,” said John Grunsfeld, NASA associate administrator for science. “These new agreements between NASA and ISRO in Earth science and Mars exploration will significantly strengthen our ties and the science that we will be able to produce as a result.”

    The joint NISAR Earth-observing mission will make global measurements of the causes and consequences of land surface changes. Potential areas of research include ecosystem disturbances, ice sheet collapse and natural hazards. The NISAR mission is optimized to measure subtle changes of the Earth’s surface associated with motions of the crust and ice surfaces. NISAR will improve our understanding of key impacts of climate change and advance our knowledge of natural hazards.

    NISAR will be the first satellite mission to use two different radar frequencies (L-band and S-band) to measure changes in our planet’s surface less than a centimeter across. This allows the mission to observe a wide range of changes, from the flow rates of glaciers and ice sheets to the dynamics of earthquakes and volcanoes.
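
    The reason radar interferometry resolves such tiny motions comes down to phase: a line-of-sight displacement d shifts the round-trip phase by Δφ = 4πd/λ, so one full fringe corresponds to half a wavelength of motion. The sketch below uses representative L- and S-band wavelengths, which are assumptions for illustration rather than official NISAR specifications:

    # Representative (assumed) radar wavelengths; one 2*pi fringe = lambda/2 of motion.
    def cm_per_fringe(wavelength_m):
        return wavelength_m / 2 * 100

    for band, wl in (("L-band", 0.24), ("S-band", 0.094)):
        print(band, cm_per_fringe(wl), "cm per fringe")
    # Measuring phase to a small fraction of a fringe yields sub-centimetre sensitivity.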

    Under the terms of the new agreement, NASA will provide the mission’s L-band synthetic aperture radar (SAR), a high-rate communication subsystem for science data, GPS receivers, a solid state recorder, and a payload data subsystem. ISRO will provide the spacecraft bus, an S-band SAR, and the launch vehicle and associated launch services.

    NASA had been studying concepts for a SAR mission in response to the National Academy of Sciences’ 2007 decadal survey of the agency’s Earth science program. The agency developed a partnership with ISRO that led to this joint mission. The partnership with India has been key to enabling many of the mission’s science objectives.

    NASA’s contribution to NISAR is being managed and implemented by the agency’s Jet Propulsion Laboratory (JPL) in Pasadena, California.

    NASA and ISRO have been cooperating under the terms of a framework agreement signed in 2008. This cooperation includes a variety of activities in space sciences such as two NASA payloads — the Mini-Synthetic Aperture Radar (Mini-SAR) and the Moon Mineralogy Mapper — on ISRO’s Chandrayaan-1 mission to the moon in 2008. During the operational phase of this mission, the Mini-SAR instrument detected ice deposits near the moon’s northern pole.

    For more information on NASA’s Mars exploration program, visit:

    http://www.nasa.gov/mars

    For more information on the NISAR mission, visit:

    http://nisar.jpl.nasa.gov

    See the full article here.

    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories [Hubble, Chandra, Spitzer] and associated programs. NASA also shares data with various national and international organizations, including data from the Greenhouse Gases Observing Satellite.

  • richardmitnick 3:30 pm on September 30, 2014

    From Symmetry: “Accelerating the fight against cancer” 


    September 30, 2014
    Glennda Chui

    As charged-particle therapies grow in popularity, physicists are working with other experts to make them smaller, cheaper and more effective—and more available to cancer patients in the United States.

    Once physicists started accelerating particles to high energies in the 1930s, it didn’t take them long to think of a killer app for this new technology: zapping tumors.

    Standard radiation treatments, which had already been around for decades, send X-rays straight through the tumor and out the other side of the body, damaging healthy tissue both coming and going. But protons and ions—atoms stripped of electrons—slow when they hit the body and come to a stop, depositing most of their destructive energy at their stopping point. If you tune a beam of protons or ions so they stop inside a tumor, you can deliver the maximum dose of radiation while sparing healthy tissue and minimizing side effects. This makes it ideal for treating children, whose developing bodies are particularly sensitive to radiation damage, and for cancers very close to vital tissues such as the optic nerves or spinal cord.


    Today, nearly 70 years after American particle physicist Robert Wilson came up with the idea, proton therapy has been gaining traction worldwide and in the United States, where 14 centers are treating patients and nine more are under construction. Ions such as carbon, helium and oxygen are being used to treat patients in Germany, Italy, China and Japan. More than 120,000 patients had been treated with various forms of charged-particle therapy by the end of 2013, according to the Particle Therapy Co-Operative Group.

    New initiatives from the CERN research center in Europe and from the Department of Energy and the National Cancer Institute in the United States are aimed at moving the technology along, assessing its strengths and limitations and making it more affordable.

    And physicists are still deeply involved. No one knows more about building and operating particle accelerators and detectors. But there’s a lot more to know. So they’ve been joining forces with physicians, engineers, biologists, computer scientists and other experts to make the equipment smaller, lighter, cheaper and more efficient and to improve the way treatments are done.


    “As you get closer to the patient, you leave the world accelerator physicists live in and get closer to the land of people who have PhDs in medical physics,” says Stephen Peggs, an accelerator physicist at Brookhaven National Laboratory.

    “It’s alignment, robots and patient ergonomics, which require just the right skill sets, which is why it’s fun, of course, and one reason why it’s interesting—designing with patients in mind.”

    Knowing where to stop

    The collaborations that make charged-particle therapy work go back a long way. The first experimental treatments took place in 1954 at what is now Lawrence Berkeley National Laboratory. Later, scientists at Fermi National Accelerator Laboratory designed and built the circular accelerator at the heart of the first hospital-based proton therapy center in the United States, which opened in 1990 at California’s Loma Linda University Medical Center.

    A number of private companies have jumped into the field, opening treatment centers, selling equipment and developing more compact and efficient treatment systems that are designed to cut costs. ProTom International, for instance, recently received US Food and Drug Administration approval for a system that’s small enough and light enough to ship on a plane and move in through a door, so it will no longer be necessary to build the treatment center around it. Other players include ProCure, Mevion, IBA, Varian Medical Systems, ProNova, Hitachi, Sumitomo and Mitsubishi.

    The goal of any treatment scheme is to get the beam to stop in exactly the right spot; the most advanced systems scan a beam back and forth to “paint” the 3-D volume of the tumor with great precision. Aiming it is not easy, though. Not only is every patient’s body different—a unique conglomeration of organs and tissues of varying densities—but every patient breathes, so the target is in constant motion.
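
    The stopping depth is set by the beam energy. A common empirical shortcut for protons in water is the Bragg–Kleeman rule, R ≈ αE^p; the fit constants below are approximate literature values used for illustration, not clinical planning numbers:

    # Bragg-Kleeman range-energy rule for protons in water: R = alpha * E**p.
    ALPHA = 0.0022   # cm / MeV**p (approximate fit value)
    P = 1.77         # approximate fit exponent

    def proton_range_cm(energy_mev):
        return ALPHA * energy_mev ** P

    for e in (70, 150, 230):
        print(f"{e} MeV -> ~{proton_range_cm(e):.1f} cm in water")
    # ~150 MeV protons stop near 16 cm depth; tuning the energy moves the Bragg peak.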

    Doctors use X-ray CT scans—the CT stands for “computed tomography”—to make a 3-D image of the tumor and its surroundings so they can calculate the ideal stopping point for the proton beam. But since protons don’t travel through the body exactly the same way X-rays do—their paths are shifted by tiny, rapid changes in the tissues they encounter along the way—their end points can differ slightly from the predicted ones.

    Physicists are trying to reduce that margin of error with a technology called proton CT.

    There are 49 charged-particle treatment centers operating worldwide, including 14 in the United States, and 27 more under construction. This map shows the number of patients treated through the end of 2013 in centers that are now in operation. Source: Particle Therapy Co-Operative Group.
    Artwork by: Sandbox Studio, Chicago with Shawna X.

    Reconnoitering with proton CT

    The idea is simple: Use protons rather than X-rays to make the images. The protons are tuned to high enough energies that they go through the body without stopping, depositing about one-tenth as much radiation along their path as X-rays do.

    Detectors in front of and behind the body pinpoint where each proton beam enters and leaves, and a separate detector measures how much energy the protons lose as they pass through tissues. By directing proton beams through the patient from different angles, doctors can create a 3-D image that tells them, much more accurately than X-rays, how to tune the proton beam so it stops inside the tumor.

    Two teams are now in friendly competition, testing rival ways to perform proton CT on “phantom” human heads made of plastic. Both approaches are based on detectors that are staples in particle physics.

    One team is made up of researchers from Northern Illinois University, Fermilab, Argonne National Laboratory and the University of Delhi in India and funded by the US Army Medical Research Acquisition Center in Maryland. They use a pair of fiber trackers on each side of the phantom head to pinpoint where the proton beams enter and exit. Each tracker contains thousands of thin plastic fibers. When a proton hits a fiber, it gives off a flash of light that is picked up by another physics standby—a silicon photomultiplier—and conveyed to a detector.

    The team is testing this system, which includes computers and software for turning the data into images, at the CDH Proton Center in Warrenville, Illinois.

    “The point is to demonstrate you can get the image quality you need to target the treatment more accurately with a lower radiation dose level than with X-ray CT,” says Peter Wilson, principal investigator for the Fermilab part of the project.

    The second project, a collaboration between researchers at Loma Linda, the University of California, Santa Cruz, and Baylor University, is financed by a $2 million grant from the National Institutes of Health. Their proton CT system is based on silicon strip detectors the Santa Cruz group developed for the Fermi Gamma-ray Space Telescope and the ATLAS experiment at CERN, among others. It’s being tested at Loma Linda.


    “We know how to detect charged particles with silicon detectors. Charged particles for us are duck soup,” says UCSC particle physicist Hartmut Sadrozinski, who has been working with these detectors for more than 30 years. Since a single scan requires tracking about a billion protons, the researchers also introduced software packages developed for high-energy physics to analyze the high volume of data coming into the detector.

    Proton CT will have to get a lot faster before it’s ready for the treatment room. In experiments with the phantom head, the system can detect a million protons per second, completing a scan in about 10 minutes, Sadrozinski says; the goal is to bring that down to 2 to 3 minutes, reducing the time the patient has to hold still and ensuring accurate images and dose delivery.

    Trimming the size and cost of ion therapy

    The first ion therapy center opened in Japan in 1994; by the end of 2013 centers in Japan, China, Germany and Italy had treated nearly 13,000 patients.

    There’s reason to think ions could be more effective than protons or X-rays for treating certain types of cancer, according to a recent review of the field published in Radiation Oncology by researchers from the National Cancer Institute and Walter Reed National Military Medical Center. Ions deliver a more powerful punch than protons, causing more damage to a tumor’s DNA, and patient treatments have shown promise.

    But the high cost of building and operating treatment centers has held the technology back, the researchers wrote; and long-term research on possible side effects, including the possibility of triggering secondary cancers, is lacking.

    The cost of building ion treatment centers is higher in part because the ions are so much heavier than protons. You need bigger magnets to steer them around an accelerator, and heavier equipment to deliver them to the patient.

    Two projects at Brookhaven National Laboratory aim to bring the size and cost of the equipment down.

    One team, led by accelerator physicist Dejan Trbojevic, has developed and patented a simpler, less expensive gantry that rotates around a stationary patient to aim an ion beam at a tumor from various angles. Gantries for ion therapy can be huge—the one in use at the Heidelberg Ion-Beam Therapy Center in Germany weighs 670 tons and is as tall as a jetliner. The new design shrinks the size of the gantry by making a single set of simpler, smaller magnets do double duty, both bending and focusing the particle beam.

    In the second project, Brookhaven scientists are working with a Virginia company, Best Medical International, to design a system for treating patients with protons, carbon ions and other ion beams. Called the ion Rapidly Cycling Medical Synchrotron (iRCMS), it is designed to deliver ions to patients in smaller, more rapid pulses. With smaller pulses, the diameter of the beam also shrinks, along with the size of the magnets used to steer it. Brookhaven is building one of the system’s three magnet girders, radio-frequency acceleration cavities and a power supply for a prototype system. The end product must be simple and reliable enough for trained hospital technicians to operate for years.

    “A particle accelerator for cancer treatment has to be industrial, robust—not the high-tech, high-performance, typical machine we’re used to,” says Brookhaven’s Peggs, one of the lead scientists on the project. “It’s more like a Nissan than a Ferrari.”


    Launching a CERN initiative for cancer treatment

    CERN, the international particle physics center in Geneva, is best known to many as the place where the Higgs boson was discovered in 2012. In 1996 it began collaborating on a study called PIMMS that designed a system for delivering both proton and ion treatments. That system evolved into the equipment at the heart of two ion therapy centers: CNAO, the National Center for Oncological Treatment in Pavia, Italy, which treated its first patient in 2011, and MedAustron, scheduled to open in Austria in 2015.

    Now scientists at CERN want to spearhead an international collaboration to design a new, more compact treatment system that will incorporate the latest particle physics technologies. It’s part of a larger CERN initiative launched late last year with a goal of contributing to a global system for treating cancer with charged-particle beams.

    Part of an existing CERN accelerator, the Low Energy Ion Ring, will be converted into a facility to provide various types of charged-particle beams for research into how they affect healthy and cancerous tissue. The lab will also consider developing detectors for making medical images and controlling the treatment beam, investigating ways to control the dose the patient receives and adapting large-scale computing for medical applications.


    CERN will provide seed funding and seek out other funding from foundations, philanthropists and other sources, such as the European Union.

    “Part of CERN’s mission is knowledge transfer,” says Steve Myers, director of the medical initiative, who spent the past five years running the Large Hadron Collider as director of accelerators and technology for CERN.


    “We would like to make the technologies we have developed for particle physics available to other fields of research simply because we think it’s a nice thing to do,” he says. “All the things we do are related to the same goal, which is treating cancer tumors in the most effective and efficient way possible.”

    Expanding the options in the US

    In the US, the biggest barrier to setting up ion treatment centers is financial: Treatment centers cost hundreds of millions of dollars. Unlike in Europe and Asia, no government funding is available, so these projects have to attract private investors. But without rigorous studies showing that ion therapy is worth the added cost in terms of eradicating cancer, slowing its spread or improving patients’ lives, investors are reluctant to pony up money and insurance companies are reluctant to pay for treatments.

    Studies that rigorously compare the results of proton or ion treatment with standard radiation therapy are just starting, says James Deye, program director for medical physics at the National Cancer Institute’s radiation research program.

    The need for more research on ion therapy has caught the attention of the Department of Energy, whose Office of High Energy Physics oversees fundamental, long-term accelerator research in the US. A 2010 report, “Accelerators for America’s Future,” identified ion therapy as one of a number of areas where accelerator research and development could make important contributions to society.

    In January 2013, more than 60 experts from the US, Japan and Europe met at a workshop sponsored by the DOE and NCI to identify areas where more research is needed on both the hardware and medical sides to develop the ion therapy systems of the future. Ideally, the participants concluded, future facilities should offer treatment with multiple types of charged particles—from protons to lithium, helium, boron and carbon ions—to allow researchers to compare their effectiveness and individual patients to get more than one type of treatment.

    In June, the DOE’s Accelerator Stewardship program asked researchers to submit proposals for seed funding to improve accelerator and beam delivery systems for ion therapy.

    “If there are accelerator technologies that can better enable this type of treatment, our job is to apply our R&D and technical skills to try to improve their ability to do so,” says Michael Zisman, an accelerator physicist from Lawrence Berkeley National Laboratory who is temporarily detailed to the DOE Office of High Energy Physics.

    “Ideally we hope there will be partnerships between labs, industry, universities and medical facilities,” he says. “We don’t want good technology ideas in search of a problem. We rather want to make sure our customers are identifying real problems that we believe the application of improved accelerator technology can actually solve.”

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.


  • richardmitnick 2:35 pm on September 30, 2014

    From physicsworld: “Japan seeks to splurge on big-science facilities” 


    Sep 26, 2014
    Dennis Normile

    Physics in Japan is set for a major boost after the education ministry asked for a massive 18% increase for its 2015 science and technology budget to take it to $11.1bn. Support for major facilities – including the SPring-8 synchrotron and the SACLA X-ray free-electron laser, both in Hyōgo Prefecture, and the Japan Proton Accelerator Research Complex (J-PARC) in Tokaimura – would rise 15.6% to $960m. The finance ministry, however, is likely to squeeze the requested amounts before the budget, which takes effect from next April, goes before the legislature in December.


    The money for SACLA and SPring-8 would mean the facilities could run for an additional 750 and 1000 hours, respectively, and also fund an upgrade at SACLA. At J-PARC, the cash would go on overall operations plus maintenance and safety upgrades. The ministry’s request also includes $11m to finish the Large-Scale Cryogenic Gravitational Wave Telescope (also known as KAGRA).


    Built inside Ikenoyama Mountain in Kamioka, KAGRA features two 3 km-long arms forming an “L” for the detector, plus two access tunnels. Some 7.7 km of tunnels that will be used for the experiment were completed earlier this year. “[The budget allocation] would allow us to complete equipment development and installation,” says KAGRA project director Takaaki Kajita, who is based at the University of Tokyo’s Institute for Cosmic Ray Research in Kashiwa. The facility is expected to be complete by the end of next year and start operations in 2017.

    For ongoing international projects, the ministry is seeking $54m for Japan’s contribution to the Thirty Meter Telescope being built on Mauna Kea in Hawaii, as well as $260m for ITER, the experimental fusion reactor currently under construction in Cadarache, France.


    The ministry also aims to spend $1m to continue studies for the proposed International Linear Collider (ILC), which Japan has expressed an interest in hosting. This year the government set up a committee to investigate the scientific case for the facility, with sub-committees looking at technical issues and cost. Satoru Yamashita, a physicist at the University of Tokyo who chairs Japan’s ILC Strategy Council, says the country took a step towards international support for the $10bn project with initial political-level discussions with the US in July. “There is still a lot to do,” adds Yamashita.


    See the full article here.

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

  • richardmitnick 7:27 am on September 30, 2014

    From physicsworld: “Quantum data are compressed for the first time” 


    Sep 29, 2014
    Jon Cartwright

    A quantum analogue of data compression has been demonstrated for the first time in the lab. Physicists working in Canada and Japan have squeezed quantum information contained in three quantum bits (qubits) into two qubits. The technique could pave the way for a more effective use of quantum memories and offers a new method of testing quantum logic devices.

    Three for two: physicists have compressed quantum data

    Compression of classical data is a simple procedure that allows a string of information to take up less space in a computer’s memory. Given an unadulterated string of, for example, 1000 binary values, a computer could simply record the frequency of the 1s and 0s, which might require just a dozen or so binary values. Recording the information about the order of those 1s and 0s would require a slightly longer string, but it would probably still be shorter than the original sequence.
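
    In code, the counting idea from the paragraph above looks like this (a toy sketch, not a production compressor):

    import random

    # Toy "compression" of a classical bit string: record only the counts.
    bits = [random.randint(0, 1) for _ in range(1000)]
    ones = sum(bits)
    print((ones, len(bits) - ones))  # two small numbers instead of 1000 raw bits
    # Recording the order as well costs more, but for a biased string it is
    # still typically shorter than the raw sequence.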

    Quantum data are rather different, and it is not possible to simply determine the frequencies of 1s and 0s in a string of quantum information. The problem comes down to the peculiar nature of qubits, which, unlike classical bits, can be a 1, a 0 or some “superposition” of both values. A user can indeed perform a measurement to record the “one-ness” of a qubit, but such a measurement would destroy any information about that qubit’s “zero-ness”. What is more, if a user then measures a second qubit prepared in an identical way, he or she might find a different value for its “one-ness” – because qubits do not specify unique values but only the probability of measurement outcomes. This latter trait would seem to preclude the possibility of compressing even identical qubits, because there is no way of predicting what classical values they will ultimately manifest as.

    A way forward

    In 2010 physicists Martin Plesch and Vladimír Bužek of the Slovak Academy of Sciences in Bratislava realized that, while it is not possible to compress quantum data to the same extent as classical data, some compression can be achieved. As long as the quantum nature of a string of identically prepared qubits is preserved, they said, it should be possible to feed them through a circuit that records only their probabilistic natures. Such a recording would require exponentially fewer qubits, and would allow a user to easily store the quantum information in a quantum memory, which is currently a limited resource. Then at some later time, the user could decide what type of measurement to perform on the data.
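
    The “exponentially fewer qubits” claim follows from dimension counting: n identically prepared qubits live in the symmetric subspace, whose dimension is n + 1, so ceil(log2(n + 1)) qubits suffice to hold them. A quick check of that arithmetic (ours, but consistent with the three-into-two result reported here):

    import math

    # n identically prepared qubits fit in the (n+1)-dimensional symmetric
    # subspace, hence ceil(log2(n+1)) qubits are enough to store them.
    def compressed_qubits(n):
        return math.ceil(math.log2(n + 1))

    print(compressed_qubits(3))     # 2  -- three qubits squeezed into two
    print(compressed_qubits(1000))  # 10 -- an exponential saving for large n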

    “This way you can store the qubits until you know what question you’re interested in,” says Aephraim Steinberg of the University of Toronto. “Then you can measure x if you want to know x; and if you want to know z, you can measure z – whereas if you don’t store the qubits, you have to choose which measurements you want to do right now.”
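
    That deferred choice can be mimicked in a toy simulation (again our own sketch, not the group's code): the same stored state can be asked the X question or the Z question, and each measurement answers only the question actually posed.

        import numpy as np

        rng = np.random.default_rng(7)

        # State |+> = (|0> + |1>)/sqrt(2): definite in the X basis, random in Z.
        state = np.array([1.0, 1.0]) / np.sqrt(2.0)

        def measure(state, basis):
            # rows of `basis` are the measurement basis vectors
            probs = np.abs(basis @ state) ** 2
            return rng.choice(len(probs), p=probs)

        Z = np.eye(2)                                   # |0>, |1>
        X = np.array([[1, 1], [1, -1]]) / np.sqrt(2.0)  # |+>, |->

        print([measure(state, Z) for _ in range(5)])    # random mix of 0s and 1s
        print([measure(state, X) for _ in range(5)])    # always 0, i.e. |+>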

    Now, Steinberg and his colleagues have demonstrated working quantum compression for the first time with photon qubits. Because photon qubits are currently very difficult to process in quantum logic gates, Steinberg’s group resorted to a technique known as measurement-based quantum computing, in which the outcomes of a logic gate are “built in” to qubits that are prepared and entangled at the same source. The details are complex, but the researchers managed to transfer the probabilistic nature of three qubits into two qubits.

    A nice trick

    Plesch says that this is the first time that compression of quantum data has been realized, and believes Steinberg and colleagues have come up with a “nice trick” to make it work. “This approach is, however, hard to scale to a larger number of qubits,” Plesch adds. “Having said that, I consider the presented work as a very nice proof-of-concept for the future.”

    Steinberg thinks that larger-scale quantum compression might be possible with different types of qubits, such as trapped ions, which have so far proved easier to manage in large ensembles. A practical use for the process would be in testing quantum devices using a process known as quantum tomography, in which many identically prepared qubits are sent through a quantum device to check that it is functioning properly. With quantum compression, says Steinberg, one could perform the tomography experiment and then decide later which aspect of the device to test.

    But in the meantime, says Steinberg, the demonstration provides another perspective on the strangeness of the quantum world. “If you had a book filled just with ones, you could simply tell your friend that it’s a book filled with ones,” he says. “But quantum mechanically, that’s already not true. Even if I gave you a billion identically prepared photons, you could get different information from each one. To describe their states completely would require infinite classical information.”

    The research will be described in Physical Review Letters.

    See the full article here.

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

    ScienceSprings relies on technology from MAINGEAR computers, Lenovo, and Dell.

     
  • richardmitnick 7:04 pm on September 29, 2014 Permalink | Reply
    Tags: , ,   

    From MIT: “Modeling shockwaves through the brain” 


    MIT News

    September 29, 2014
    Jennifer Chu | MIT News Office

    New scaling law helps estimate humans’ risk of blast-induced traumatic brain injury.

    Since the start of the military conflicts in Iraq and Afghanistan, more than 300,000 soldiers have returned to the United States with traumatic brain injury (TBI) caused by exposure to bomb blasts — and in particular, exposure to improvised explosive devices, or IEDs. Symptoms of traumatic brain injury can range from the mild, such as lingering headaches and nausea, to more severe impairments in memory and cognition.

    Illustration: Jose-Luis Olivares/MIT

    Since 2007, the U.S. Department of Defense has recognized the critical importance and complexity of this problem, and has made significant investments in traumatic brain injury research. Nevertheless, there remain many gaps in scientists’ understanding of the effects of blasts on the human brain; most new knowledge has come from experiments with animals.

    MIT researchers have developed a model of the human head for use in simulations to predict the risk for blast-induced traumatic brain injury. Relevant tissue structures include the skull (green), brain (red), and flesh (blue). Courtesy of the researchers

    Now MIT researchers have developed a scaling law that predicts a human’s risk of brain injury, based on previous studies of blasts’ effects on animal brains. The method may help the military develop more protective helmets, as well as aid clinicians in diagnosing traumatic brain injury — often referred to as the “invisible wounds” of battle.

    “We’re really focusing on mild traumatic brain injury, where we know the least, but the problem is the largest,” says Raul Radovitzky, a professor of aeronautics and astronautics and associate director of the MIT Institute for Soldier Nanotechnologies (ISN). “It often remains undetected. And there’s wide consensus that this is clearly a big issue.”

    While previous scaling laws predicted that humans’ brains would be more resilient to blasts than animals’, Radovitzky’s team found the opposite: that in fact, humans are much more vulnerable, as they have thinner skulls to protect much larger brains.

    A group of ISN researchers led by Aurélie Jean, a postdoc in Radovitzky’s group, developed simulations of human, pig, and rat heads, and exposed each to blasts of different intensities. Their simulations predicted the effects of the blasts’ shockwaves as they propagated through the skulls and brains of each species. Based on the resulting differences in intracranial pressure, the team developed an equation, or scaling law, to estimate the risk of brain injury for each species.

    “The great thing about doing this on the computer is that it allows you to reduce and possibly eventually eliminate animal experiments,” Radovitzky says.

    The MIT team and co-author James Q. Zheng, chief scientist at the U.S. Army’s soldier protection and individual equipment program, detail their results this week in the Proceedings of the National Academy of Sciences.

    Air (through the) head

    A blast wave is the shockwave, or wall of compressed air, that rushes outward from the epicenter of an explosion. Aside from the physical fallout of shrapnel and other flying debris, the blast wave alone can cause severe injuries to the lungs and brain. In the brain, a shockwave can slam through soft tissue, with potentially devastating effects.

    In 2010, Radovitzky’s group, working in concert with the Defense and Veterans Brain Injury Center, a part of the U.S. military health system, developed a highly sophisticated, image-based computational model of the human head that illustrates the ways in which pressurized air moves through its soft tissues. With this model, the researchers showed how the energy from a blast wave can easily reach the brain through openings such as the eyes and sinuses — and also how covering the face with a mask can prevent such injuries. Since then, the team has developed similar models for pigs and rats, capturing the mechanical response of brain tissue to shockwaves.

    In their current work, the researchers calculated the vulnerability of each species to brain injury by establishing a mathematical relationship between properties of the skull, brain, and surrounding flesh, and the propagation of incoming shockwaves. The group considered each brain structure’s volume, density, and celerity — how fast stress waves propagate through a tissue. They then simulated the brain’s response to blasts of different intensities.
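
    The scaling law itself is not reproduced in the article, but one textbook ingredient behind this kind of relationship is acoustic impedance, the product of a tissue's density and wave speed, which sets how much of a stress wave crosses each interface on the way in. The Python sketch below is purely illustrative: the tissue values are rough literature figures, and the simple flesh-to-skull-to-brain chain stands in for the researchers' far more detailed head models.

        tissues = {                 # (density kg/m^3, wave speed m/s) -- rough values
            "flesh": (1100.0, 1540.0),
            "skull": (1900.0, 2800.0),
            "brain": (1040.0, 1550.0),
        }

        def impedance(name: str) -> float:
            rho, c = tissues[name]
            return rho * c          # acoustic impedance Z = rho * c

        def transmission(z1: float, z2: float) -> float:
            # pressure transmission coefficient across an interface
            return 2.0 * z2 / (z1 + z2)

        # Follow an incoming wave through flesh -> skull -> brain:
        t = 1.0
        for a, b in [("flesh", "skull"), ("skull", "brain")]:
            t *= transmission(impedance(a), impedance(b))
        print(f"fraction of incident pressure reaching the brain: {t:.2f}")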

    “What the simulation allows you to do is take what happens outside, which is the same across species, and look at how strong was the effect of the blast inside the brain,” Jean says.

    In general, they found that an animal’s skull and other fleshy structures act as a shield, blunting the effects of a blast wave: The thicker these structures are, the less vulnerable an animal is to injury. Compared with the more prominent skulls of rats and pigs, a human’s thinner skull increases the risk for traumatic brain injury.

    Shifting the problem

    This finding runs counter to previous theories, which held that an animal’s vulnerability to blasts depends on its overall mass, but which ignored the role of protective physical structures. According to these theories, humans, being more massive than pigs or rats, would be better protected against blast waves.

    Radovitzky says this reasoning stems from studies of “blast lung” — blast-induced injuries such as tearing, hemorrhaging, and swelling of the lungs, where it was found that mass matters: The larger an animal is, the more resilient it may be to lung damage. Informed by such studies, the military has since developed bulletproof vests that have dramatically decreased the number of blast-induced lung injuries in recent years.

    “There have essentially been no reported cases of blast lung in the last 10 years in Iraq or Afghanistan,” Radovitzky notes. “Now we’ve shifted that problem to traumatic brain injury.”

    In collaboration with Army colleagues, Radovitzky and his group are performing basic research to help the Army develop helmets that better protect soldiers. To this end, the team is extending the simulation approach they used for blast to other types of threats.

    His group is also collaborating with audiologists at Massachusetts General Hospital, where victims of the Boston Marathon bombing are being treated for ruptured eardrums.

    “They have an exact map of where each victim was, relative to the blast,” Radovitzky says. “In principle, we could simulate the event, find out the level of exposure of each of those victims, put it in our scaling law, and we could estimate their risk of developing a traumatic brain injury that may not be detected in an MRI.”

    Joe Rosen, a professor of surgery at Dartmouth Medical School, sees the group’s scaling law as a promising window into identifying a long-sought mechanism for blast-induced traumatic brain injury.

    “Eighty percent of the injuries coming off the battlefield are blast-induced, and mild TBIs may not have any evidence of injury, but [the injured] end up impaired for the rest of their lives,” says Rosen, who was not involved in the research. “Maybe we can realize they’re getting doses of these blasts, and that a cumulative dose is what causes [TBI], and before that point, we can pull them off the field. I think this work will be important, because it puts a stake in the ground so we can start making some progress.”

    This work was supported by the U.S. Army through ISN.

    See the full article here.

    ScienceSprings relies on technology from MAINGEAR computers, Lenovo, and Dell.

     