Tagged: Science 2.0

  • richardmitnick 10:27 am on March 22, 2015 Permalink | Reply
    Tags: , , , Science 2.0   

    From Science 2.0: “Early Kidney Cancer Detection With Urine Test” 


    March 22nd 2015
    News Staff

    About 80 percent of patients survive when kidney cancer is detected early, but finding it early has been among the disease’s greatest challenges.

    Kidney cancer is the seventh most common cancer in men and the 10th most common in women, affecting about 65,000 people each year in the United States. About 14,000 patients die of the disease annually. Like most cancers, kidney tumors are easier to treat when diagnosed early. But symptoms of the disease, such as blood in the urine and abdominal pain, often don’t develop until later, making early diagnosis difficult.

    Now, researchers have developed a noninvasive method to screen for kidney cancer that involves measuring the presence of proteins in the urine. They found that the protein biomarkers were more than 95 percent accurate in identifying early-stage kidney cancers. In addition, there were no false positives caused by non-cancerous kidney disease.

    Evan D. Kharasch, M.D., Ph.D., (left) and Jeremiah J. Morrissey, Ph.D. Credit: Elizabethe Holland Durando

    “These biomarkers are very sensitive and specific to kidney cancer,” said senior author Evan D. Kharasch, MD, PhD, of Washington University School of Medicine in St. Louis. “The most common way that we find kidney cancer is as an incidental, fortuitous finding when someone has a CT or MRI scan. It’s not affordable to use such scans as a screening method, so our goal has been to develop a urine test to identify kidney cancer early.”

    When kidney cancer isn’t discovered until after it has spread, more than 80 percent of patients die within five years.

    With researchers from the Siteman Cancer Center, the Mallinckrodt Institute of Radiology and the Division of Urologic Surgery, Kharasch and principal investigator Jeremiah J. Morrissey, PhD, professor of anesthesiology, analyzed urine samples from 720 patients at Barnes-Jewish Hospital who were about to undergo abdominal CT scans for reasons unrelated to a suspicion of kidney cancer. Results of the scans let the investigators determine whether or not patients had kidney cancer. As a comparison, they also analyzed samples from 80 healthy people and 19 patients previously diagnosed with kidney cancer.

    The researchers measured levels of two proteins in the urine — aquaporin-1 (AQP1) and perilipin-2 (PLIN2). None of the healthy people had elevated levels of either protein, but patients with kidney cancer had elevated levels of both proteins.

    In addition, three of the 720 patients who had abdominal CT scans also had elevated levels of both proteins. Two of those patients were diagnosed subsequently with kidney cancer, and the third patient died from other causes before a diagnosis could be made.

    “Each protein, or biomarker, individually pointed to patients who were likely to have kidney cancer, but the two together were more sensitive and specific than either by itself,” said Morrissey. “When we put the two biomarkers together, we correctly identified the patients with kidney cancer and did not have any false positives.”
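    As a rough illustration of the combined rule Morrissey describes (a sketch only: the thresholds below are hypothetical placeholders, not the study’s assay cutoffs), a screen can be coded to flag a sample only when both urinary proteins are elevated.

    ```python
    # Hypothetical sketch of a two-biomarker screening rule: flag a sample only
    # when BOTH urinary proteins are elevated. Threshold values are placeholders,
    # not the study's actual cutoffs.

    AQP1_THRESHOLD = 1.0   # hypothetical cutoff for aquaporin-1 (arbitrary units)
    PLIN2_THRESHOLD = 1.0  # hypothetical cutoff for perilipin-2 (arbitrary units)

    def screen_sample(aqp1_level: float, plin2_level: float) -> bool:
        """Return True if the sample is flagged as suspicious for kidney cancer.

        Requiring both markers to be elevated trades a little sensitivity for
        higher specificity, which is the combination effect described above.
        """
        return aqp1_level > AQP1_THRESHOLD and plin2_level > PLIN2_THRESHOLD

    # Elevated AQP1 alone is not enough to trigger a positive screen.
    print(screen_sample(aqp1_level=2.3, plin2_level=0.4))  # False
    print(screen_sample(aqp1_level=2.3, plin2_level=1.8))  # True
    ```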

    Even when patients had other types of non-cancerous kidney disease, levels of the two proteins in the urine were not elevated and did not suggest the presence of cancer.

    “Patients with other kinds of cancer or other kidney diseases don’t have elevations in these biomarkers,” Kharasch said. “So in addition to being able to detect kidney cancer early, another advantage of using these biomarkers may be to show who doesn’t have the disease.”

    Not all kidney masses found by CT scans turn out to be cancerous, he said. In fact, about 15 percent are not malignant.

    “But a CT scan can only tell you whether there is a mass in the kidney, not whether it’s cancer,” Kharasch said. “Currently, the only way to know for sure is to have surgery, and unfortunately, 10 to 15 percent of kidneys removed surgically turn out to be normal.”

    Kharasch and Morrissey are working to develop an easy-to-use screening test for kidney cancer, much like mammograms, colonoscopies or other tests designed to identify cancer at early, more treatable stages before patients have symptoms.

    “By and large, patients don’t know they have kidney cancer until they get symptoms, such as blood in the urine, a lump or pain in the side or the abdomen, swelling in the ankles or extreme fatigue,” Morrissey said. “And by then, it’s often too late for a cure. Metastatic kidney cancer is extremely difficult to treat, and if the disease is discovered after patients have developed symptoms, they almost always have metastases. So we’re hoping to use the findings to quickly get a test developed that will identify patients at a time when their cancer can be more easily treated.”

    The work was funded by the Barnes-Jewish Hospital Frontier Fund and the Department of Anesthesiology at Washington University School of Medicine in St. Louis, with additional support from the Bear Cub Fund of Washington University, the Barnes-Jewish Hospital Foundation and the Washington University Institute of Clinical and Translational Science, and with additional funding from the National Cancer Institute (NCI) of the National Institutes of Health (NIH), grant numbers R01CA141521 and UL1 TR000448.

    Morrissey JJ, Mellnick VM, Luo J, Siegel MJ, Figenshau RS, Bhayani S, Kharasch ED. Evaluation of urine aquaporin 1 and perilipin 2 concentrations as biomarkers to screen for renal cell carcinoma. JAMA Oncology, published online March 19, 2015.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 6:13 am on March 21, 2015 Permalink | Reply
    Tags: , Neanderthals, Science 2.0,   

    From Science 2.0: “Not Cro-Magnon, Volcanoes May Have Doomed Neanderthals” 


    March 20th 2015
    News Staff

    Annual average temperature anomalies in excess of 3°C for the first year after the Campanian Ignimbrite (CI) eruption. Credit: B.A. Black et al. and the journal Geology

    A new paper notes that the Campanian Ignimbrite eruption in Italy 40,000 years ago, one of the largest volcanic cataclysms in Europe and one that injected a significant amount of sulfur dioxide (SO2) into the stratosphere, coincided with the final decline of Neanderthals as well as with dramatic territorial and cultural advances among modern humans.

    Scientists have long debated whether this eruption, with its volcanic sulfur cooling and acid deposition, contributed more to the final extinction of the Neanderthals than climate change or hominin competition did.

    A new paper tests this hypothesis using a climate model.

    However, the decline of Neanderthals in Europe began well before the Campanian Ignimbrite eruption: “Radiocarbon dating has shown that at the time of the CI eruption, anatomically modern humans had already arrived in Europe, and the range of Neanderthals had steadily diminished. Work at five sites in the Mediterranean indicates that anatomically modern humans were established in these locations by then as well.”

    “While the precise implications of the Campanian Ignimbrite eruption for cultures and livelihoods are best understood in the context of archaeological data sets,” write Black and colleagues, the results of their study quantitatively describe the magnitude and distribution of the volcanic cooling and acid deposition that ancient hominin communities experienced coincident with the final decline of the Neanderthals.

    In their climate simulations, Black and colleagues found that the largest temperature decreases after the eruption occurred in Eastern Europe and Asia and sidestepped the areas where the final Neanderthal populations were living (Western Europe). Therefore, the authors conclude that the eruption was probably insufficient to trigger Neanderthal extinction.

    However, the abrupt cold spell that followed the eruption would still have significantly impacted day-to-day life for Neanderthals and early humans in Europe. Black and colleagues point out that temperatures in Western Europe would have decreased by an average of 2 to 4 degrees Celsius during the year following the eruption. These unusual conditions, they write, may have directly influenced survival and day-to-day life for Neanderthals and anatomically modern humans alike, and emphasize the resilience of anatomically modern humans in the face of abrupt and adverse changes in the environment.

    Citation: Benjamin A. Black et al., ‘Campanian Ignimbrite volcanism, climate, and the final decline of the Neanderthals’, Geology 19 March 2015, DOI: 10.1130/G36514.1.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 11:19 am on March 15, 2015 Permalink | Reply
    Tags: , , , Science 2.0   

    From Science 2.0: “Fractal Patterns May Uncover New Line Of Attack On Cancer” 


    March 15th 2015
    News Staff

    The appearance of fractal patterns on the surface of cancer cells. Credit: M. Dokukin and I. Sokolov

    Studying the intricate fractal patterns on the surface of cells could give researchers a new insight into the physical nature of cancer, and provide new ways of preventing the disease from developing.

    This is according to scientists in the US who have, for the first time, shown how physical fractal patterns emerge on the surface of human cancer cells at a specific point of progression towards cancer.

    Publishing their results today, 11 March, in the Institute of Physics and German Physical Society’s New Journal of Physics, they found that the distinctive repeating fractal patterns develop at the precise point at which precancerous cells transform into cancer cells, and that fractal patterns are not present either before or after this point.

    The researchers hope the new findings can inspire biologists to search for specific “weak” points in the pathways that lead to the alteration of precancerous cells at this specific moment. By targeting these weak points, the researchers believe they could influence the process and thus prevent cancer from developing.

    Lead author of the study Professor Igor Sokolov said: “Despite many decades of fighting cancer, the war is far from being victorious. A sharp increase in the complexity and variability of genetic signatures has slowed the advancement based on finding specific cancer genes in patients.

    “Thus, more than ever, there is a need for new conceptual paradigms about the nature of cancer, and what we have found adds towards the development of such paradigm.”

    The term “fractal” defines a pattern that, when you take a small part of it, looks similar, although perhaps not identical, to its full structure. For example, the leaf of a fern tree resembles the full plant and a river’s tributary resembles the shape of the river itself.

    Nature is full of fractal patterns; they can be seen in clouds, lightning bolts, crystals, snowflakes, mountains, and blood vessels. Fractal patterns develop in conditions that are far from equilibrium, or chaotic: systems that are not in a state of rest or balance.
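    One standard way to put a number on how fractal a surface is (a generic illustration, not necessarily the analysis the authors used) is the box-counting dimension: cover the pattern with boxes of shrinking size and track how the number of occupied boxes grows. A minimal sketch, using a random binary image as a stand-in for an AFM surface map:

    ```python
    # Minimal box-counting fractal-dimension estimate for a 2D binary image.
    # Generic illustration only, not the authors' published analysis pipeline.
    import numpy as np

    def box_counting_dimension(image: np.ndarray, box_sizes=(2, 4, 8, 16, 32)) -> float:
        """Estimate the box-counting (fractal) dimension of a binary image."""
        counts = []
        for size in box_sizes:
            # Trim the image so it tiles exactly into size x size boxes.
            h, w = (image.shape[0] // size) * size, (image.shape[1] // size) * size
            boxes = image[:h, :w].reshape(h // size, size, w // size, size)
            # Count boxes that contain at least one "occupied" pixel.
            counts.append(boxes.any(axis=(1, 3)).sum())
        # Slope of log(count) versus log(1/size) estimates the dimension.
        slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
        return slope

    # Hypothetical stand-in data; a real analysis would use a thresholded AFM map.
    rng = np.random.default_rng(0)
    surface = rng.random((256, 256)) > 0.5
    print(round(box_counting_dimension(surface), 2))  # close to 2.0 for a dense random fill
    ```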

    In their study, the researchers, consisting of Dr Dokukin and Prof Sokolov from Tufts University and Ms Guz and Prof Woodworth from Clarkson University, used atomic force microscopy to produce extremely high-resolution images of the surfaces of human cervical epithelial cells.

    The cells were studied in vitro as they progressed from normal cells to immortal (premalignant) cells to malignant cells.

    “Despite previous expectations that fractal patterns are associated with cancer cells, we found that fractal geometry only occurs at a limited period of development when immortal cells become cancerous,” Professor Sokolov continued.

    “We also found that cells deviate more from fractal when they further progress towards cancer, and confirmed that normal cells do not have fractal patterns.”

    Whilst it is still unclear if the presence of fractal patterns plays any part in the development or progression of cancer itself, the researchers state that it is definitely plausible to expect that they have some importance given the role that the cell surface plays in metastasis.

    Professor Sokolov said: “When cancer metastasizes and spreads through the body, cancer cells have to physically crawl through multiple tissues, overcoming friction and resistance through interactions on the cell’s surface.

    “Moving forward, we need to further our understanding as to how important the cell surface is in the development of cancer.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 12:33 pm on February 12, 2015 Permalink | Reply
    Tags: , , Science 2.0   

    From Science 2.0: “How The Brain’s Internal Maps Are Linked To The External World” 


    February 12th 2015
    News Staff

    The brain’s GPS wouldn’t be of much value if its maps of our surroundings were not calibrated to the real world – grounded in reality.

    But they are, and a new study shows how this is done.

    The way that the brain’s internal maps are linked and anchored to the external world has been a mystery for a decade, ever since 2014 Nobel Laureates May-Britt and Edvard Moser discovered grid cells, the key reference system of our brain’s spatial navigation system. Now, researchers at the Mosers’ Kavli Institute for Systems Neuroscience believe they have solved this mystery.

    Think of regular maps and how they relate to your surroundings. When we go hiking and orient ourselves with a map and compass, we match the compass’s north arrow to the longitude lines on the map, aligning the map with the terrain to make sure we find our way (unless we have a GPS that does the work for us).

    ‘Right frame of mind’. Conceptual interpretation of results. The internal grid map of space must be anchored to the external geometry of the world. Forces act on the pattern (orange lines) to produce a final grid geometry that is most asymmetric in relation to the environment. Husserl would be proud.

    We know our brains contain a number of internal maps, all mapped onto the surroundings, ready to be pulled up to guide us in the right direction. These grid maps come in different sizes and resolutions, but until now they have offered few clues as to how they are anchored to the surroundings.

    The findings published in Nature this week explain the surprising twist the brain uses to align its internal maps.

    Stitched to the wall?

    “We recorded the activity of hundreds of grid cells,” Tor Stensola, a researcher at the Kavli Institute for Systems Neuroscience says. “Looking at the information from more than 800 grid cells, we noticed that grid patterns typically were oriented in the same few directions. This was true for all the animals studied, which suggested the grid map aligns to its surroundings in a systematic way. Grid cells all seemed to be anchored to one of the local walls, but always with a specific offset of a few degrees. So we decided to investigate this.”

    The researchers recorded the activity of grid cells in rats as the animals foraged for cookie crumbs in a 1.5 square metre box. Now, if a map, represented by the activity of grid cells, were to represent the box it would need to look the same every time the rat ran in the box. And it would need to be aligned the same way too, independently of which way the rat ran, and where the rat was put into the box. The recorded maps were indeed consistent. Each grid pattern was anchored to one of the walls. But none of the grids’ axes were perfectly aligned with either wall; they were consistently askew.

    “The map was rotated,” Stensola says, with the angle always approximately 7.5 degrees off the anchored wall. “We did some calculations and realized that there might be a very good reason for that particular angle. At 0- and 15-degree angles, the map would be symmetric. In other words, if it were perfectly aligned with the wall, it would mirror either the cardinal axis or the diagonal of the square box respectively, making it likely that places would get mixed up. The 7.5-degree angle of rotation is the one that minimizes symmetry with axes in the environment, thereby minimizing these kinds of potential errors.”
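    Put as a one-line calculation (simply restating the reasoning in the quote above): if the error-prone, symmetric orientations relative to a wall sit at 0° and 15°, the orientation farthest from both is their midpoint, θ* = argmax over 0° ≤ θ ≤ 15° of min(θ, 15° − θ) = 7.5°, which is the offset seen in the recordings.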

    Perfectly asymmetric

    The researchers found more asymmetry in the grids. Grid patterns were not perfectly hexagonal, but instead seemed distorted such that they became asymmetric. Although this was known from before, nobody had any idea why.

    “The inner six fields of the grid, the centre of the map, would lie on a perfect circle if the pattern was perfectly hexagonal,” Stensola says. “The asymmetry caused these points to form an ellipse instead.”

    The researchers looked into the connection between the rotation and anchoring of the grid and the distortion of the pattern, and found an almost perfect correspondence between the direction the ellipse was pointing and the direction the grid was anchored towards. By skewing the pattern along one of the walls – a process known as shearing – the researchers could replicate both the ellipse and the 7.5-degree rotation of the grid.

    Shearing forces displace parts of an object differently depending on the location of the part. To visualize how it works, think of when you push a deck of cards along the table and the deck slides. The bottom card will stay put, the top card will move the most, and the cards in between will each move a distance that reflects their position in the deck. This kind of deformation has been seen in nature, where it is known to happen in trilobite fossils when external forces in the ground squeeze them into a new shape.

    The researchers believe that once an animal experiences a novel environment it will form a map of the environment where one of the axes is perfectly aligned with a nearby wall. Gradually, shearing forces will distort and skew the map 7.5 degrees away from the anchoring wall, forming a stable and robust map with low chance of error.
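    A minimal numerical sketch of that idea (a generic shear applied to an ideal hexagonal ring, with a shear strength chosen purely for illustration rather than fitted to data) shows how shearing along one wall both turns the inner ring of six fields into an ellipse and rotates the grid axes that are not parallel to that wall:

    ```python
    # Illustrative sketch: apply a generic shear to an ideal hexagonal grid and
    # observe (a) the inner ring of six fields becoming an ellipse and (b) the
    # grid axes not parallel to the shear direction rotating by a few degrees.
    # The shear strength is an assumed value, not the one fitted in the study.
    import numpy as np

    shear_strength = 0.2                       # hypothetical shear parameter
    shear = np.array([[1.0, shear_strength],   # shear parallel to the x-axis (the "wall")
                      [0.0, 1.0]])

    # Inner ring of six fields of an ideal hexagonal pattern (unit circle).
    angles = np.deg2rad(np.arange(0, 360, 60))
    ring = np.stack([np.cos(angles), np.sin(angles)])   # shape (2, 6)
    sheared = shear @ ring                              # points now lie on an ellipse

    # Ellipse axis ratio, from the singular values of the shear transform.
    sv = np.linalg.svd(shear, compute_uv=False)
    print(f"ellipse axis ratio: {sv[0] / sv[1]:.2f}")

    # Rotation of the grid axis that started at 60 degrees to the wall.
    before = np.degrees(np.arctan2(ring[1, 1], ring[0, 1]))      # 60.0 degrees
    after = np.degrees(np.arctan2(sheared[1, 1], sheared[0, 1]))
    print(f"that axis rotated by {after - before:.1f} degrees")
    ```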

    Cornfields and the brain

    But what happens to the internal maps when an environment is very large, like a cornfield? If local landmarks are very far apart, it may be advantageous for the brain to create several local and accurate maps rather than one inaccurate overall map.

    The researchers tested this by recording grids in a much bigger square box (2.2m squared). The map had the same asymmetries here as in the smaller box, but showed one crucial difference.

    “As the box got bigger, some maps looked different,” Stensola explains. “The maps broke in two, and became two separate local maps for the same box, anchored to opposite walls. The grids were just as elliptical as in the smaller box. When the maps broke apart, we had to reevaluate. We knew that maps could anchor locally in complex environments. We now believe that the shearing forces can operate on not only the entire environment, but may apply locally as well.”

    This distortion towards asymmetry is likely what our brains use to reduce the frequency of error in self-localization, the researchers believe.

    Physics enters the brain

    Professor Edvard Moser, director of the Kavli Institute for Systems Neuroscience, believes these findings are important in connecting different fields of science.

    “It is always a great feeling when we find the solution to a mystery that has puzzled us for a decade,” he says. “The findings give us more clues as to how our internal maps interact with the surroundings. Now we’ll have to figure out in detail how the information about the orientation of walls and boundaries in the surroundings reach the grid maps, and how this information is processed. Perhaps border cells will prove to hold the answer to this, we do not know this for sure yet.”

    Moser notes that they had to turn to physics to solve the puzzle of the sheared maps. “How the laws of physics apply in the nervous system is an area full of unsolved questions. We’ll just have to keep up the work to find out.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 2:35 pm on February 6, 2015 Permalink | Reply
    Tags: , , , Science 2.0   

    From Science 2.0: “Methane Seepage Has Been Occurring For Millions Of Years” 


    There have been some recent environmental claims about methane seepage, such as flaming tap water, but the instances that were not staged have been due to nature. It’s a tale almost as old as Earth.

    But outside environmental circles, scientists have always thought about methane as much as CO2, because it has a warming impact roughly 23 times greater than that of CO2. Fortunately it is short-lived, so trace seepage of methane from natural gas is nowhere near as devastating as CO2 from other forms of energy generation. Natural gas is why emissions from energy in America are back at early 1990s levels and emissions from coal, the dirtiest polluter, are back at early 1980s levels. Even though the contribution from natural gas is modest, 60 percent of the methane in the atmosphere comes from human activities (such as cow burps), but that is nothing compared to the gigatons of it trapped under the ocean floor of the Arctic.

    And it’s leaking.

    But fear not, it always has.

    “Our planet is leaking methane gas all the time. If you go snorkeling in the Caribbean you can see bubbles rising from the ocean floor at 25 meters depth. We studied this type of release, only in a much deeper, colder and darker environment. And we found out that it has been going on, periodically, for as far back as 2.7 million years,” says Andreia Plaza Faverola, researcher at the Centre for Arctic Gas Hydrate, Environment and Climate, and the primary author behind a new paper in Geophysical Research Letters.

    Andreia Plaza Faverola is researcher at Centre for Arctic Gas Hydrate, Environment and Climate at UiT The Arctic University of Norway. Credit: Maja Sojtaric/CAGE

    Faverola is talking about Vestnesa Ridge in the Fram Strait, a thousand meters under the Arctic Ocean surface offshore West-Svalbard. Here, 800-meter-high gas flares rise from the seabed today – the height of the tallest manmade structure in the world, the Burj Khalifa in Dubai.

    “Half of Vestnesa Ridge is showing very active seepage of methane. The other half is not. But there are obvious pockmarks on the inactive half, cavities and dents in the ocean floor, that we recognized as old seepage features. So we were wondering what activates, or deactivates, the seepage in this area,” says Faverola.

    Why 2.7 million years?

    The team of marine geophysicists from CAGE used P-Cable technology to figure it out. The P-Cable is a seismic instrument towed behind a research vessel; it recorded the sediments beneath these pockmarks, rendering images that look like the layers of a cake and enabling scientists to visualize the deep sediments in 3D.

    The Arctic Ocean floor offshore West-Svalbard. Credit: Andreia Plaza Faverola/CAGE

    “We know from other studies in the region that the sediments we are looking at in our seismic data are at least 2.7 million years old. This is the period of increase of glaciations in the Northern Hemisphere, which influences the sediment. The P-Cable enabled us to see features in this sediment, associated with gas release in the past.

    “These features can be buried pinnacles or cavities that form what we call gas chimneys in the seismic data. Gas chimneys appear like vertical disturbances in the layers of our sedimentary cake. This enables us to reconstruct the evolution of gas expulsion from this area for at least 2.7 million years.”

    The seismic signal penetrated into 400 to 500 meters of sediment to map this timescale.

    How is the methane released?

    By using this method, scientists were able to identify two major events of gas emission throughout this time period: one 1.8 million years ago, the other 200,000 years ago.

    This means that there is something that activated and deactivated the emissions several times. The authors have a plausible explanation: It is the movement of the tectonic plates that influences the gas release.

    Vestnesa is not like California, though, which is riddled with earthquakes because of the moving plates. The ridge is on a so-called passive margin. But as it turns out, it doesn’t take a huge tectonic shift to release the methane stored under the ocean floor.

    “Even though Vestnesa Ridge is on a passive margin, it is between two oceanic ridges that are slowly spreading. These spreading ridges resulted in separation of Svalbard from Greenland and opening of the Fram Strait. The spreading influences the passive margin of West-Svalbard, and even small mechanical collapse in the sediment can trigger seepage,” says Faverola.

    Where does the methane come from?

    The methane is stored as gas hydrates, chunks of frozen gas and water, up to hundreds of meters under the ocean floor. Vestnesa hosts a large gas hydrate system. There is some concern that global warming of the oceans may melt this icy gas and release it into the atmosphere. That is not very likely in this area, according to Faverola.

    “This is a deep water gas hydrate system, which means that it is in permanently cold waters and under a lot of pressure. This pressure keeps the hydrates stable and the whole system is not vulnerable to global temperature changes. But under the stable hydrates there is gas that is not frozen. The amount of this gas may increase if hydrates melt at the base of this stability zone, or if gas from deeper in the sediments arrives into the system. This could increase the pressure in this part of the system, and the free gas may escape the seafloor through chimneys. Hydrates would still remain stable in this scenario.”

    Historical methane peaks coincide with increase in temperature

    Throughout Earth’s history there have been several short periods of significant temperature increase, and these periods often coincide with peaks of methane in the atmosphere, as recorded in ice cores. Scientists such as Plaza Faverola are still debating the cause of this methane release in the past.

    “One hypothesis is that massive gas release from geological sources, such as volcanoes or ocean sediments, may have influenced global climate. What we know is that there is a lot of methane released at present from the ocean floor. What we need to find out is whether it reaches the atmosphere, or whether it ever did.”

    Historical events of methane release, such as the ones in the Vestnesa Ridge, provide crucial information that can be used in future climate modeling. Knowing if these events repeat, and identifying what makes them happen, may help us to better predict the potential influence of methane from the oceans on future climate.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:20 am on January 30, 2015 Permalink | Reply
    Tags: , , , Science 2.0,   

    From Science 2.0: “Calculating The Future Of Solar-fuel Refineries” 


    January 30th 2015
    News Staff

    The process of converting the sun’s energy into liquid fuels requires a sophisticated, interrelated series of choices, but a solar refinery is especially tricky to map out because its designs involve newly developed or experimental technologies. This makes it difficult to develop realistic plans that are economically viable and energy efficient.

    In a paper recently published in the journal Energy & Environmental Science, a team led by University of Wisconsin-Madison chemical and biological engineering Professors Christos Maravelias and George Huber outlined a tool to help engineers better gauge the overall yield, efficiency and costs associated with scaling solar-fuel production processes up into large-scale refineries.


    That’s where the new UW-Madison tool comes in. It’s a framework that focuses on accounting for general variables and big-picture milestones associated with scaling up energy technologies to the refinery level. This means it’s specifically designed to remain relevant even as solar-fuel producers and researchers experiment with new technologies and ideas for technologies that don’t yet exist.
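    The spirit of such a framework can be conveyed with a deliberately simplified sketch (the stage names, efficiencies and costs below are hypothetical placeholders, not values from the paper): overall solar-to-fuel efficiency is the product of the stage efficiencies, and a rough cost per unit of fuel energy follows from annual costs divided by annual fuel output.

    ```python
    # Deliberately simplified refinery-level screening calculation.
    # Every name and number here is a hypothetical placeholder, not a value from the paper.
    from dataclasses import dataclass

    @dataclass
    class Stage:
        name: str
        efficiency: float   # fraction of incoming energy passed to the next stage
        annual_cost: float  # $ per year (amortized capital plus operating cost)

    def overall_efficiency(stages):
        """Solar-to-fuel efficiency is the product of the individual stage efficiencies."""
        eff = 1.0
        for stage in stages:
            eff *= stage.efficiency
        return eff

    def cost_per_gj(stages, solar_input_gj_per_year):
        """Rough cost per GJ of fuel: total annual cost divided by annual fuel energy output."""
        total_cost = sum(stage.annual_cost for stage in stages)
        fuel_out_gj = solar_input_gj_per_year * overall_efficiency(stages)
        return total_cost / fuel_out_gj

    # Hypothetical three-stage refinery.
    stages = [
        Stage("solar capture", efficiency=0.20, annual_cost=6.0e6),
        Stage("syngas generation", efficiency=0.60, annual_cost=3.8e6),
        Stage("fuel synthesis", efficiency=0.75, annual_cost=2.5e6),
    ]
    print(f"overall efficiency: {overall_efficiency(stages):.1%}")
    print(f"cost: ${cost_per_gj(stages, solar_input_gj_per_year=2.0e6):,.0f} per GJ of fuel")
    ```

    Crude as it is, a calculation of this shape makes the paper’s point that a change at any one stage propagates through the economics of the whole refinery design.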

    Renewable-energy researchers at UW-Madison have long emphasized the importance of considering energy production as a holistic process, and Maravelias says the new framework could be used by a wide range of solar energy stakeholders, from basic science researchers to business decision-makers. The tool could also play a role in wider debates about which renewable-energy technologies are most appropriate for society to pursue on a large scale.

    “The nice thing about it being general is that if a researcher develops a different technology – and there are many different ways to generate solar fuels – our framework would still be applicable, and if someone wants a little more detail, our framework can be adjusted accordingly,” Maravelias says.

    In addition to bringing clarity to the solar refinery conversation, the framework could also be adapted to help analyze and plan any number of other energy-related processes, says Jeff Herron, a postdoc in Maravelias’ group and the paper’s lead author.

    “People tend to be narrowly focused on their particular role within a bigger picture,” Herron says. “I think bringing all that together is unique to our work, and I think that’s going to be one of the biggest impacts.”

    Ph.D. student Aniruddha Upadhye and postdoc Jiyong Kim also contributed to the project. The research was funded by the U.S. Department of Energy.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 7:49 pm on January 25, 2015 Permalink | Reply
    Tags: , Peat Fires, Science 2.0, Wild Fires   

    From Science 2.0: “Asian Peat Fire Impact On Climate Change” 


    January 25th 2015
    News Staff

    Wildfires send hot flames and smoke high into the air, including black carbon emissions associated with climate change and risk to human health. Unless the United States adopts sensible forest management policies, which means fewer instances of the Department of the Interior and environmental lobbyists conspiring to manipulate science reports, carbon emissions from wildfires in the contiguous U.S. could increase by 50 percent by 2050 and double by 2100.

    But high-profile wildfires should not be the only concern. Peat — an organic mixture of decayed and compacted leaves — fuels what are actually the largest fires on Earth. These high-moisture-containing fuels make up nearly three-fourths of Earth’s land mass and are believed to be the largest emitter of carbon from wildfires to the atmosphere. In 1997, a period of extreme haze caused by the spread of smoldering peat fires in Indonesia led to a rapid increase in respiratory illnesses and disrupted shipping and aviation routes, creating an estimated 0.81 to 2.57 billion tons of carbon gases in that region.

    Peat fires are difficult to extinguish, burn at a lower temperature, produce more smoke, generate particles that can be irritating to the eyes and respiratory system, and are among the least understood of fires, according to the U.S. Environmental Protection Agency. Unlike flaming fires, peatlands smolder underground for weeks and months and emit a whitish smoke.

    This natural-color satellite image was collected by the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua satellite on Sept. 28, 2014. Actively burning areas, detected by MODIS’s thermal bands, are outlined in red, and significant smoke is rising from these areas. Credit: NASA image courtesy Jeff Schmaltz, MODIS Rapid Response Team. NASA/Goddard

    NASA Aqua satellite and its MODIS instrument

    To study the climatic effects of carbon-containing aerosols emitted from peat fires, Rajan Chakrabarty, PhD, assistant professor of environmental engineering at Washington University in St. Louis’ School of Engineering & Applied Science, has received a grant from the National Aeronautics and Space Administration (NASA). Brent Williams, PhD, assistant professor of environmental engineering, is a co-investigator on the project, as is Wei Min Hao, PhD, an atmospheric chemist at the USDA Forest Service Fire Sciences Laboratory in Montana.

    “This project is going to provide the much-needed information on peat smoke aerosol properties for integration in satellite retrieval algorithms and climate models,” Chakrabarty said. “Based on my initial findings, I hypothesize the peat smoke is made up of brown carbon and not black carbon. Brown carbon is a class of organic carbon aerosol which, unlike black carbon, strongly absorbs incoming solar radiation in the shorter wavelengths, or near ultraviolet.”

    With the new grant, Chakrabarty and his collaborators will study the origin of brown carbon particles, their chemical makeup, how many particulates are being emitted and determine their optical properties, or how they behave in light, using a photoacoustic technique, which uses non-invasive imaging based on sound and light. When scientists and engineers collect the smoke particles on a filter, the residue is yellow to brown in color, giving clues about its properties, Chakrabarty said.

    “When you see the yellowish or brown color, it implies that the brown carbon is reflecting all light but the blue violet wavelengths, and that’s what it’s absorbing,” Chakrabarty said. “That has significant implications because we know that black carbon is much less in terms of mass emissions compared to organic carbon. And if organic carbon has absorption right near visible or near ultraviolet light when the solar spectrum starts to peak, it will add further warming on top of black carbon, and we don’t know how much.”
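    A common way to express that wavelength dependence (a standard parameterization, not necessarily the one the team will adopt) is the Absorption Ångström Exponent (AAE), with absorption scaling as wavelength to the power of minus AAE. Black carbon sits near an AAE of 1, while brown carbon’s larger exponent makes its absorption climb steeply toward the near-ultraviolet; the AAE values in this sketch are illustrative.

    ```python
    # Sketch of the Absorption Angstrom Exponent (AAE) parameterization:
    # absorption ~ wavelength**(-AAE), referenced here to 550 nm. The AAE values
    # are illustrative; black carbon is typically near 1, brown carbon is larger.

    def relative_absorption(wavelength_nm: float, aae: float, ref_nm: float = 550.0) -> float:
        """Absorption at wavelength_nm relative to the reference wavelength."""
        return (wavelength_nm / ref_nm) ** (-aae)

    for wavelength in (400, 450, 550, 650):
        bc = relative_absorption(wavelength, aae=1.0)   # black-carbon-like
        brc = relative_absorption(wavelength, aae=5.0)  # brown-carbon-like (illustrative)
        print(f"{wavelength} nm  black-carbon-like: {bc:4.2f}   brown-carbon-like: {brc:5.2f}")
    ```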

    Chakrabarty and Williams will study laboratory-scale fires from peat fuel types found worldwide at the USDA Fire Science Laboratory combustion chamber using their real-time optical and chemical instruments, which they will transport to Montana in the summer of 2016. They will compile the results into a reference for other scientists, called a look-up table, which will provide researchers and climate modelers with properties for the different fuel types. Since NASA has the largest inventory of fire images from satellite, its scientists will integrate the look-up table into several of its satellite algorithms and climate models, Chakrabarty said.

    Chakrabarty first studied brown carbon particles as a graduate student at the University of Nevada, Reno, in a pilot study at the USDA Fire Science Laboratory. He observed large-scale brown-carbon aerosol being freshly emitted as primary particles from smoldering combustion of two kinds of duff, or the surface organic layer containing litter, lichen, live and dead moss and organic soil. The combustion of duff is the largest contributor to smoldering smoke production in forest regions in the Northern Hemisphere and contributes between 46 percent and 72 percent of all wildland fire carbon emissions annually.

    Source: Washington University in St. Louis

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 5:34 am on January 22, 2015 Permalink | Reply
    Tags: , , Science 2.0   

    From Science 2.0: “Hidden Magnetic Messages From Space Discovered” 


    January 21st 2015

    Geologists have discovered hidden magnetic messages from the early solar system in meteorites, measured using the PEEM-Beamline at the BESSY II synchrotron at the Helmholtz-Zentrum Berlin für Materialien und Energie (HZB).

    BESSY II synchrotron

    The information captures the dying moments of the magnetic field during core solidification on a meteorite parent body, providing a sneak preview of the fate of Earth’s own magnetic field as its core continues to freeze.

    Meteorites were previously thought to have poor magnetic memories, with the magnetic signals they carry having been written and rewritten many times during their long journey to Earth. [Dr. Richard] Harrison, however, identified specific regions filled with nanoparticles that were magnetically extremely stable. These “tiny space magnets” retain a faithful record of the magnetic fields generated by the meteorite’s parent body. Harrison and his colleagues could map these tiny magnetic signals using circular polarized X-ray synchrotron radiation at BESSY II.

    The Pallasite meteorite, studied by Harrison, contains information about the early solar system. Image copyright Natural History Museum, London. Sample used from the Natural History Museum Meteorite Collection.

    Meteorites have witnessed a long and violent history; they are fragments of asteroids which formed in the early solar system, 4.5 billion years ago. Shortly after their formation, some asteroids were heated up by radioactive decay, causing them to melt and segregate into a liquid metal core surrounded by a solid rocky mantle. Convection of the liquid metal created magnetic fields, just as the liquid outer core of the Earth generates a magnetic field today. From time to time asteroids crash together and tiny fragments fall to Earth as meteorites, giving scientists the opportunity to study the properties of the magnetic fields that were generated billions of years ago. “They are like natural hard discs”, Dr. Richard Harrison believes. The geologist from the Department of Earth Sciences, University of Cambridge, UK, is searching for methods to decipher the information stored deep inside the space rocks. Now his new approach has yielded its first results.

    Until now it was not clear whether ancient magnetic signals could be retained by stony-iron meteorites at all. Large and highly mobile magnetic domains are found within the iron metal: these domains create huge magnetic signals but are easily overwritten by new events. The probability that these regions might contain useful information about early magnetic fields in the solar system is extremely low.

    But Harrison took a much closer look. At the PEEM-Beamline of BESSY II, Harrison and PhD student James Bryson found dramatic variation in magnetic properties as they went through the meteorite. They saw not only regions containing large, mobile magnetic domains, but also identified an unusual region called the cloudy zone containing thousands of tiny particles of tetrataenite, a super hard magnetic material. “These tiny particles, just 50 to 100 nanometers in diameter, hold on to their magnetic signal and don’t change. So it is only these very small regions of chaotic looking magnetization that contain the information we want”, Bryson concludes.

    The PEEM-Beamline offers X-rays with the specific energy and polarization needed to make sense of these magnetic signals. Since the absorption of the X-rays depends on the magnetization, the scientists could map the magnetic signals on the sample surface in ultrahigh resolution without changing them by the procedure. “The new technique we have developed is a way of analyzing these images to extract real information. So we can do for the first time paleomagnetic measurements of very small regions of these rocks, regions which are less than one micrometer in size. These are the highest resolution paleomagnetic measurements ever made”, Harrison points out.
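    A generic sketch of how such polarization-dependent absorption images can be turned into a magnetic contrast map is the standard XMCD asymmetry ratio shown below; the team’s exact processing pipeline is not described here, and the input arrays are random stand-ins for real PEEM images.

    ```python
    # Generic XMCD contrast mapping: the pixel-wise asymmetry between images taken
    # with opposite circular polarizations reflects the local magnetization component
    # along the X-ray beam. Not necessarily the team's exact pipeline.
    import numpy as np

    def xmcd_asymmetry(image_plus: np.ndarray, image_minus: np.ndarray) -> np.ndarray:
        """Pixel-wise XMCD asymmetry: (I+ - I-) / (I+ + I-)."""
        return (image_plus - image_minus) / (image_plus + image_minus)

    # Hypothetical stand-in images (real data would be PEEM absorption images
    # recorded at the two circular polarizations).
    rng = np.random.default_rng(1)
    i_plus = rng.uniform(0.9, 1.1, size=(64, 64))
    i_minus = rng.uniform(0.9, 1.1, size=(64, 64))
    contrast = xmcd_asymmetry(i_plus, i_minus)
    print(contrast.shape, float(contrast.mean()))
    ```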

    By spatially resolving the variations in magnetic signal across the cloudy zone, the team were able to reconstruct the history of magnetic activity on the meteorite parent body, and were even able to capture the moment when the core finished solidifying and the magnetic field shut down. These new measurements answer many open questions regarding the longevity and stability of magnetic activity on small bodies. Their observations, supported by computer simulations, demonstrate that the magnetic field was created by compositional, rather than thermal, convection – a result that changes our perspective on the way magnetic fields were generated during the early solar system and even provides a sneak preview of the fate of Earth’s own magnetic field as its core continues to freeze.

    Article: Long-lived magnetism from solidification-driven convection on the pallasite parent body, Nature, 22 January 2015. Source: Helmholtz-Zentrum Berlin für Materialien und Energie

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     