From U Chicago: “New program teaches data science for energy and environment research”

University of Chicago

August 11, 2017
Rob Mitchum

Innovative curriculum will arm graduate students with critical tools for modern science.

An interdisciplinary program will train graduate students in data science to tackle issues related to food, energy and water.

For the future of the planet, there are few research subjects more important than the global supplies of food, water and energy. To comprehensively study, understand and inform policy around these complex systems, the next generation of researchers in the physical, social and biological sciences will need fluency with data analysis methods that traverse traditional academic boundaries.

A new interdisciplinary curriculum will train graduate students from geosciences, economics, computer science, public policy and other programs in computational and data science techniques critical for modern science. With a $3 million award from the National Science Foundation, the new research traineeship grant will combine expertise from across UChicago and Argonne National Laboratory in computing, statistics, social science, climate and agriculture.

“This program will equip graduate students with the tools needed to advance the study of issues related to food, energy and water,” said Elisabeth Moyer, associate professor of atmospheric science in the Department of the Geophysical Sciences. “Our vision is to produce students who have the computational skills and breadth of knowledge, from social to physical sciences, needed to tackle these critical research subjects in all their complexity.”

As Earth’s population rises in the coming years, demand for food, energy and water is expected to soar. The global scale and interdependence of these sectors—where increased agriculture decreases freshwater supplies, while both are affected by the environmental impact of accelerating energy production—necessitates research collaboration across fields. Improved data collection and modeling on these topics creates promising opportunities for understanding their complexities, but only if analyzed with the right computational methods.

The program will produce students with a foundation in a discipline such as geosciences, economics or public policy, as well as the computational skills and breadth of multidisciplinary knowledge to tackle complex questions in food, energy and water. A three-year curriculum including boot camps, retreats, new courses and practicum projects will give each student experience working with data science methods and scientists from fields other than their own.

Additional training in scientific communication and professional development, as well as opportunities for international research experience with the Potsdam Institute for Climate Impact Research and the African Institute for Mathematical Sciences, will further prepare students for research careers in this area.

“We want to extend education outside the boundaries of traditional siloed disciplinary programs,” said Ryan Kellogg, professor at the Harris School of Public Policy. “This program will provide students with the computational skills needed to exploit the growing torrent of relevant data, and give them experience both interacting across disciplines and translating results to non-academic audiences.”

Campus as a laboratory

As part of the curriculum, students will receive introductions to computing in the social sciences and geosciences, spatial statistics and imagery analysis, geographic information systems (GIS), data science fundamentals, time series analysis and environmental economics. A general course on the food/energy/water system, drawn from existing courses for interdisciplinary audiences taught by Moyer and Cristina Negri, environmental engineer at Argonne and fellow of the Institute for Molecular Engineering, will be followed by a data analysis practicum where groups of students work with real data and organizations in government and industry.

The program will build upon successful UChicago training initiatives such as the Executive Program in Applied Data Analytics, the Computational Analysis and Public Policy curriculum at the Harris School of Public Policy and the Data Science for Social Good Summer Fellowship.

Instruction and mentorship will be provided by several UChicago research groups, including the Center for Robust Decision-Making on Climate and Energy Policy (climate and agricultural modeling), Knowledge Lab (text mining), the Energy Policy Institute at UChicago (environmental and energy economics), the Center for Data Science and Public Policy (data analytics and project management) and the Center for Spatial Data Science (spatial analysis). High-performance computing resources and tutorials will be provided by the Research Computing Center.

“All across the University of Chicago campus, we have researchers applying innovative data science techniques to important questions in energy and the environment,” said Michael Franklin, the Liew Family Chair of Computer Science. “With this new program, we can extend that strength to enrich our graduate education, making our campus a laboratory for training a new generation of interdisciplinary, computational-minded scientists.”

The award is one of 17 given out by the National Science Foundation this month as part of its NSF Research Traineeship program. The foundation awarded a total of $51 million to “develop and implement bold, new, potentially transformative models for graduate education in science, technology, engineering and mathematics fields.”

“Integration of research and education through interdisciplinary training will prepare a workforce that undertakes scientific challenges in innovative ways,” said Dean Evasius, director of the NSF Division of Graduate Education in a news release. “The NSF Research Traineeship awards will ensure that today’s graduate students are prepared to pursue cutting-edge research and solve the complex problems of tomorrow.”

In addition to Moyer, Kellogg and Franklin, investigators on the award include Joshua Elliott, Computation Institute fellow and research scientist, and Ian Foster, the Arthur Holly Compton Distinguished Service Professor of Computer Science and Argonne Distinguished Fellow.

The program will start in the 2018-19 academic year. Information for prospective applicants will be available in the coming year.

See the full article here.

Please help promote STEM in your local schools.


STEM Education Coalition

U Chicago Campus

An intellectual destination

One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

#applied-research-technology, #earth-observation, #environmental-studies, #new-program-teaches-data-science-for-energy-and-environment-research, #u-chicago

From CMU: “Researchers Show Forest Fragmentation from Shale Development Could Be Reduced by Placing Natural Gas Lines Along Roadways”

Carnegie Mellon University

March 20, 2015
Tara Moore / 412-268-9673 /

Forest fragmentation occurs when the key infrastructure related to Marcellus shale natural gas extraction — specifically the well pads themselves, as well as gathering lines and access roads — cuts through the forest, dividing it into smaller sections.

A team of researchers in Carnegie Mellon University’s College of Engineering found that forest fragmentation from natural gas development in Pennsylvania is caused by gathering lines, the smaller pipelines that carry extracted natural gas to the main distribution pipes.

In a paper in the journal Ecological Indicators, the scientists report that redirecting the lines so they follow the routes of existing roadways would greatly reduce fragmentation.

The research group includes Leslie Abrahams, a doctoral student in engineering and public policy (EPP) and civil and environmental engineering (CEE), and co-authors W. Michael Griffin, an EPP associate research professor, and CEE Professor H. Scott Matthews.

While it may seem at first glance that the well pads, which can require anywhere from three to nine acres of cleared forest land, would be the biggest culprits of fragmentation, the team discovered the main cause to be the gathering lines. Although gathering lines are buried underground, the surfaces above them, called rights-of-way, are cleared of all trees, causing almost 19 acres of forest loss per well pad.

“If something cuts a cleared path through the forest, it could be dividing a species’ habitat in half,” Abrahams explained. “Flying squirrels, for example. The natural gas infrastructure can create openings in the forest that are too wide for them to glide across and suddenly their habitat is greatly decreased.”

The gathering line rights-of-way also create pathways that allow invasive species to access inner parts of the forest. Suddenly, indigenous animals are exposed to predators they have never learned to avoid, making survival much more difficult.

The research team used computer-modeling software to develop strategies to greatly reduce forest fragmentation.

The team’s major suggestion for future infrastructure development is to route new gathering lines, the major culprits of fragmentation, along the same routes as existing roads. This way, no additional corridors are cut through the forest, keeping future fragmentation to a minimum.
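The scale of that suggestion can be illustrated with a back-of-envelope comparison using the acreages quoted above. This is a hypothetical sketch, not a calculation from the paper: in particular, the assumption that road-following gathering lines clear no additional forest is an idealization for illustration.

```python
# Rough forest-loss comparison per well pad, using the acreages quoted in
# the article. The zero-new-clearing assumption for road-following gathering
# lines is an illustrative idealization, not a number from the paper.

PAD_ACRES = 6.0          # midpoint of the 3-9 acre range cleared per pad
GATHERING_ACRES = 19.0   # forest lost per pad to gathering-line rights-of-way

def forest_loss(n_pads, lines_follow_roads=False):
    """Total cleared forest (acres) for n_pads under each routing scenario."""
    line_loss = 0.0 if lines_follow_roads else GATHERING_ACRES
    return n_pads * (PAD_ACRES + line_loss)

# New corridors cut through the forest vs. lines routed along existing roads:
print(forest_loss(100))                           # 2500.0 acres
print(forest_loss(100, lines_follow_roads=True))  # 600.0 acres
```

Even in this crude sketch, routing the lines along roads cuts the cleared acreage per pad by roughly three-quarters, which is why the gathering lines dominate the fragmentation picture.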

Additionally, requiring natural gas companies to collaborate on infrastructure development would help eliminate unnecessary fragmentation by forcing multiple companies to use the same pipelines.

“Eliminating the need for multiple pipelines that go to the same place would save developers money, while helping to protect core forest ecosystems,” Griffin said.

Another strategy is to reduce the number of necessary well pads by simply drilling more wells at each pad.

“One of the benefits of unconventional natural gas development is that you can develop multiple wells per pad because instead of drilling straight down, you go down and then out horizontally so you can have six or 12 different wells per pad,” Abrahams said.

“By simply drilling more wells per pad, and drilling those wells farther, you can still drill the same number of wells without clearing as much forested land. However, while it does reduce the level of fragmentation over the business as usual case, this strategy does not stop additional future fragmentation from occurring altogether because it does not address the placement of the gathering lines,” Abrahams said.

This work was funded in part by the National Science Foundation Graduate Research Fellowship Program, the Center for Climate and Energy Decision Making, and by the Department of Engineering and Public Policy.

Read the full paper, titled “Assessment of policies to reduce core forest fragmentation from Marcellus shale development in Pennsylvania,” at

See the full article here.


Carnegie Mellon Campus

Carnegie Mellon University (CMU) is a global research university with more than 12,000 students, 95,000 alumni, and 5,000 faculty and staff.
CMU has been a birthplace of innovation since its founding in 1900.
Today, we are a global leader bringing groundbreaking ideas to market and creating successful startup businesses.
Our award-winning faculty members are renowned for working closely with students to solve major scientific, technological and societal challenges. We put a strong emphasis on creating things—from art to robots. Our students are recruited by some of the world’s most innovative companies.
We have campuses in Pittsburgh, Qatar and Silicon Valley, and degree-granting programs around the world, including Africa, Asia, Australia, Europe and Latin America.

#applied-research-technology, #carnegie-mellon-university, #environmental-studies

From LBL: “First Direct Observation of Carbon Dioxide’s Increasing Greenhouse Effect at the Earth’s Surface”

Berkeley Lab

February 25, 2015
Dan Krotz

Scientists have observed an increase in carbon dioxide’s greenhouse effect at the Earth’s surface for the first time. The researchers, led by scientists from the US Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), measured atmospheric carbon dioxide’s increasing capacity to absorb thermal radiation emitted from the Earth’s surface over an eleven-year period at two locations in North America. They attributed this upward trend to rising CO2 levels from fossil fuel emissions.

The influence of atmospheric CO2 on the balance between incoming energy from the Sun and outgoing heat from the Earth (also called the planet’s energy balance) is well established. But this effect has not been experimentally confirmed outside the laboratory until now. The research is reported Wednesday, Feb. 25, in the advance online publication of the journal Nature.

The results agree with theoretical predictions of the greenhouse effect due to human activity. The research also provides further confirmation that the calculations used in today’s climate models are on track when it comes to representing the impact of CO2.

These graphs show carbon dioxide’s increasing greenhouse effect at two locations on the Earth’s surface. The first graph shows CO2 radiative forcing measurements obtained at a research facility in Oklahoma. As the atmospheric concentration of CO2 (blue) increased from 2000 to the end of 2010, so did surface radiative forcing due to CO2 (orange), and both quantities have upward trends. This means the Earth absorbed more energy from solar radiation than it emitted as heat back to space. The seasonal fluctuations are caused by plant-based photosynthetic activity. The second graph shows similar upward trends at a research facility on the North Slope of Alaska. (Credit: Berkeley Lab)

The scientists measured atmospheric carbon dioxide’s contribution to radiative forcing at two sites, one in Oklahoma and one on the North Slope of Alaska, from 2000 to the end of 2010. Radiative forcing is a measure of how much the planet’s energy balance is perturbed by atmospheric changes. Positive radiative forcing occurs when the Earth absorbs more energy from solar radiation than it emits as thermal radiation back to space. It can be measured at the Earth’s surface or high in the atmosphere. In this research, the scientists focused on the surface.

They found that CO2 was responsible for a significant uptick in radiative forcing at both locations, about two-tenths of a watt per square meter per decade. They linked this trend to the 22 parts-per-million increase in atmospheric CO2 between 2000 and 2010. Much of this CO2 is from the burning of fossil fuels, according to a modeling system that tracks CO2 sources around the world.
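As a sanity check on those numbers, the widely used simplified expression for CO2 radiative forcing, dF = 5.35 ln(C/C0) (Myhre et al., 1998), gives the same order of magnitude. The formula and the roughly 370 ppm baseline for the year 2000 are outside assumptions brought in here, not values from the Berkeley Lab study; only the 22 ppm increase comes from the article.

```python
import math

# Simplified CO2 radiative-forcing expression dF = 5.35 * ln(C / C0), in W/m^2
# (Myhre et al., 1998). The ~370 ppm baseline for the year 2000 is an outside
# assumption; the 22 ppm rise over the decade is the figure from the article.

def co2_forcing_change(c_ppm, c0_ppm):
    """Approximate change in radiative forcing (W/m^2) between two CO2 levels."""
    return 5.35 * math.log(c_ppm / c0_ppm)

baseline_ppm = 370.0
delta_f = co2_forcing_change(baseline_ppm + 22.0, baseline_ppm)
print(round(delta_f, 2))  # ~0.31 W/m^2 for the 2000-2010 decade
```

The simplified expression estimates forcing at the top of the atmosphere, so it need not match the surface measurement exactly; that it lands in the same ballpark as the measured 0.2 watts per square meter per decade is the point of the check.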

“We see, for the first time in the field, the amplification of the greenhouse effect because there’s more CO2 in the atmosphere to absorb what the Earth emits in response to incoming solar radiation,” says Daniel Feldman, a scientist in Berkeley Lab’s Earth Sciences Division and lead author of the Nature paper.

“Numerous studies show rising atmospheric CO2 concentrations, but our study provides the critical link between those concentrations and the addition of energy to the system, or the greenhouse effect,” Feldman adds.

The scientists used spectroscopic instruments operated by the Department of Energy’s Atmospheric Radiation Measurement (ARM) Climate Research Facility. This research site is on the North Slope of Alaska near the town of Barrow. They also collected data from a site in Oklahoma. (Credit: Jonathan Gero)

He conducted the research with fellow Berkeley Lab scientists Bill Collins and Margaret Torn, as well as Jonathan Gero of the University of Wisconsin-Madison, Timothy Shippert of Pacific Northwest National Laboratory, and Eli Mlawer of Atmospheric and Environmental Research.

The scientists used incredibly precise spectroscopic instruments operated by the Atmospheric Radiation Measurement (ARM) Climate Research Facility, a DOE Office of Science User Facility. These instruments, located at ARM research sites in Oklahoma and Alaska, measure thermal infrared energy that travels down through the atmosphere to the surface. They can detect the unique spectral signature of infrared energy from CO2.

Other instruments at the two locations detect the unique signatures of phenomena that can also emit infrared energy, such as clouds and water vapor. The combination of these measurements enabled the scientists to isolate the signals attributed solely to CO2.

“We measured radiation in the form of infrared energy. Then we controlled for other factors that would impact our measurements, such as a weather system moving through the area,” says Feldman.

The result is two time series from two very different locations. Each series spans from 2000 to the end of 2010 and includes 3,300 measurements from Alaska and 8,300 measurements from Oklahoma, obtained on a near-daily basis.

Both series showed the same trend: atmospheric CO2 emitted an increasing amount of infrared energy, to the tune of 0.2 watts per square meter per decade. This increase is about ten percent of the trend from all sources of infrared energy, such as clouds and water vapor.

Based on an analysis of data from the National Oceanic and Atmospheric Administration’s CarbonTracker system, the scientists linked this upswing in CO2-attributed radiative forcing to fossil fuel emissions and fires.

The measurements also enabled the scientists to detect, for the first time, the influence of photosynthesis on the balance of energy at the surface. They found that CO2-attributed radiative forcing dipped in the spring as flourishing photosynthetic activity pulled more of the greenhouse gas from the air.

The scientists used the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility located at Berkeley Lab, to conduct some of the research.

The research was supported by the Department of Energy’s Office of Science.

See the full article here.


A U.S. Department of Energy National Laboratory Operated by the University of California

#applied-research-technology, #environmental-studies, #green-house-gasses, #lbnl

From NSF: “Good news and bad news for coral reefs”

National Science Foundation

February 9, 2015
Lily Whiteman, National Science Foundation (703) 292-8310

Researcher Paul Sikkel of Arkansas State University discovered a mechanism that may be damaging coral reef ecosystems in the Caribbean. Sikkel identified this potential mechanism through his National Science Foundation-funded research on the relationships between parasites and their host fishes in Caribbean reefs. Credit: David Burdick

Some good news for coral reefs: In 2014, President Obama expanded the Pacific Remote Islands Marine National Monument in the central Pacific from about 87,000 square miles to 308,000 square miles. The Monument “is the largest marine protected area in the world and an important part of the most widespread collection of marine life on the planet under a single country’s jurisdiction,” according to the National Oceanic and Atmospheric Administration (NOAA).

This area sustains a diversity of species, including some of the most pristine coral reefs in the world, as well as a diversity of fish species, shellfish, marine mammals, seabirds, land birds, insects and vegetation not found anywhere else.

Fishing, energy exploration and other activities are prohibited in the Monument. Among the Monument’s protected corals are expansive shallow coral reefs and deep coral forests, including some corals that are 5,000 years old.

The expansion of the monument is promising in light of benefits that may be provided by marine protected areas (MPAs). An MPA is a coastal or offshore marine area that is managed to protect natural and/or cultural resources.

In the accompanying video, Paul Sikkel, of Arkansas State University, discusses some of the possible successes of the MPA system in the Philippines.

This system was developed back in the early 1970s, when reef fisheries were left virtually unmanaged, and destructive fishing practices, often organized by large commercial fishing companies, ran rampant throughout the Philippines–a cluster of 7,107 islands that harbors more than 1,700 reef species and about 9 percent of global coral reef area.

To help protect its marine resources, the Philippines established at least 985 MPAs covering almost 5 percent of coastal municipal waters. To a large degree, the Philippine MPAs are now co-managed by local communities and local governments along with the national government. This partial de-centralization of authority helps give responsibility for MPA management to those who depend on their ecological health the most: coastal communities.

The Philippine MPAs still fall short of the national goal for coverage area, and conservation enforcement problems remain. Nevertheless, some evidence suggests that the Philippine community-based management system may have generated some conservation victories.

For example, a study published in 2010 showed that species richness of large predatory reef fish increased fourfold over a 14-year period in one Philippine MPA and 11-fold over a 15-year period in another. The study also showed that as species richness and complexity increased within one of the MPAs, similar increases occurred in neighboring fished areas, evidently because of a spillover effect from the MPA.

But even while MPA status may provide protection from local threats, such as pollution or anchor damage, MPAs may remain vulnerable to global threats, such as climate change, which cannot be controlled at local levels.

In the accompanying video, Sikkel also discusses a new mechanism that may be damaging coral reef ecosystems in the Caribbean. Sikkel identified this potential mechanism through his National Science Foundation-funded research on the relationships between parasites and their host fishes in Caribbean reefs.

See the full article here.


The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense.” NSF is the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields, such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.


#applied-research-technology, #environmental-studies, #nsf, #preserving-coral-reefs

From BBC: “Mammals on brink of ‘extinction calamity’ in Australia”


10 February 2015
Helen Briggs

The endangered northern quoll, a mammal species native to Australia

Australia has lost one in ten of its native mammals over the last 200 years in what conservationists describe as an “extinction calamity”.

No other nation has had such a high rate of loss of land mammals over this time period, according to scientists at Charles Darwin University, Australia.

The decline is mainly due to predation by the feral cat and the red fox, which were introduced from Europe, they say.

Large scale fires to manage land are also having an impact.

Australia is an affluent nation with a small population, so its wildlife should be relatively secure from threats such as habitat loss.

But a new survey of Australia’s native mammals, published in the journal Proceedings of the National Academy of Sciences, suggests the scale of the problem is more serious than anticipated.

Since 1788, 11% of 273 native mammals living on land have died out, 21% are threatened and 15% are near threatened, the study found. Marine mammals are faring better.

Shy species

“No other country has had such a high rate and number of mammal extinctions over this period, and the number we report for Australia is substantially higher than previous estimates,” said conservation biologist John Woinarski, who led the research.

“A further 56 Australian land mammals are now threatened, indicating that this extremely high rate of biodiversity loss is likely to continue unless substantial changes are made.

“The extent of the problem has been largely unappreciated until recently because much of the loss involves small, nocturnal, shy species with [little] public profile – few Australians know of these species, let alone have seen them, so their loss has been largely unappreciated by the community.”

The brush-tailed rabbit-rat, a mammal species native to Australia that is listed as a near-threatened species by the International Union for Conservation of Nature

In time, iconic species such as the koala will also decline, said the researchers, from Charles Darwin University, Southern Cross University and the Department of Parks and Wildlife in Wanneroo.

The prospects for Australia’s wildlife can be improved, but doing so is “a very formidable challenge”, they added.

It is estimated there are between 15 and 23 million wild cats living on the continent.

Practical measures to protect native species include boosting biosecurity on islands off the mainland, which have fewer feral cats and foxes.

The islands could also act as arks for endangered species, while more careful use of fire and control measures to wipe out foxes and feral cats are also being considered.

But the researchers warn that Australians may ultimately need to consider the way they live on the land to stem the loss of natural assets.

See the full article here.


#applied-research-technology, #bbc, #environmental-studies, #extinctions

From Science 2.0: “Methane Seepage Has Been Occurring For Millions Of Years”

Science 2.0

There have been recent environmental claims about methane seepage, flaming tap water among them, but the instances that were not staged have been due to nature. It’s a tale almost as old as Earth.

But outside environmental circles, scientists have always thought about methane as much as CO2, because molecule for molecule it has roughly 23 times the warming impact of CO2. Fortunately, methane is short-lived, so trace seepage from natural gas is nowhere near as damaging as CO2 from other forms of energy generation. Natural gas is why emissions from energy in America are back at early-1990s levels, and emissions from coal, the dirtiest polluter, are back at early-1980s levels. Although the natural gas contribution is modest, about 60 percent of the methane in the atmosphere comes from human activities (cow burps among them), but that is nothing compared to the gigatons of it trapped under the ocean floor of the Arctic.

And it’s leaking.

But fear not, it always has.

“Our planet is leaking methane gas all the time. If you go snorkeling in the Caribbean you can see bubbles rising from the ocean floor at 25 meters depth. We studied this type of release, only in a much deeper, colder and darker environment, and found out that it has been going on, periodically, for as far back as 2.7 million years,” says Andreia Plaza Faverola, researcher at the Centre for Arctic Gas Hydrate, Environment and Climate, and the primary author of a new paper in Geophysical Research Letters.

Andreia Plaza Faverola is researcher at Centre for Arctic Gas Hydrate, Environment and Climate at UiT The Arctic University of Norway. Credit: Maja Sojtaric/CAGE

Faverola is talking about Vestnesa Ridge in the Fram Strait, a thousand meters under the Arctic Ocean surface offshore West Svalbard. Here, 800-meter-high gas flares rise from the seabed today. That is roughly the height of the tallest man-made structure in the world, the Burj Khalifa in Dubai.

“Half of Vestnesa Ridge is showing very active seepage of methane. The other half is not. But there are obvious pockmarks on the inactive half, cavities and dents in the ocean floor, that we recognized as old seepage features. So we were wondering what activates, or deactivates, the seepage in this area,” says Faverola.

Why 2.7 million years?

The team of marine geophysicists from CAGE used P-Cable technology to figure it out. The P-Cable is a seismic instrument towed behind a research vessel; here it recorded the sediments beneath the pockmarks. It renders images that look like the layers of a cake and enables scientists to visualize deep sediments in 3D.

The Arctic Ocean floor offshore West-Svalbard. Credit: Andreia Plaza Faverola/CAGE

“We know from other studies in the region that the sediments we are looking at in our seismic data are at least 2.7 million years old. This is the period of increase of glaciations in the Northern Hemisphere, which influences the sediment. The P-Cable enabled us to see features in this sediment, associated with gas release in the past.

“These features can be buried pinnacles or cavities that form what we call gas chimneys in the seismic data. Gas chimneys appear like vertical disturbances in the layers of our sedimentary cake. This enables us to reconstruct the evolution of gas expulsion from this area for at least 2.7 million years.”

The seismic signal penetrated into 400 to 500 meters of sediment to map this timescale.

How is the methane released?

By using this method, scientists were able to identify two major events of gas emission throughout this time period: one 1.8 million years ago, the other 200,000 years ago.

This means that there is something that activated and deactivated the emissions several times. The authors have a plausible explanation: It is the movement of the tectonic plates that influences the gas release.

Unlike California, Vestnesa is not riddled with earthquakes from moving plates; the ridge is on a so-called passive margin. But as it turns out, it doesn’t take a huge tectonic shift to release the methane stored under the ocean floor.

“Even though Vestnesa Ridge is on a passive margin, it is between two oceanic ridges that are slowly spreading. These spreading ridges resulted in separation of Svalbard from Greenland and opening of the Fram Strait. The spreading influences the passive margin of West-Svalbard, and even small mechanical collapse in the sediment can trigger seepage,” says Faverola.

Where does the methane come from?

The methane is stored as gas hydrates, chunks of frozen gas and water, up to hundreds of meters under the ocean floor. Vestnesa hosts a large gas hydrate system. There is some concern that global warming of the oceans may melt this icy gas and release it into the atmosphere. That is not very likely in this area, according to Faverola.

“This is a deep water gas hydrate system, which means that it is in permanently cold waters and under a lot of pressure. This pressure keeps the hydrates stable and the whole system is not vulnerable to global temperature changes. But under the stable hydrates there is gas that is not frozen. The amount of this gas may increase if hydrates melt at the base of this stability zone, or if gas from deeper in the sediments arrives into the system. This could increase the pressure in this part of the system, and the free gas may escape the seafloor through chimneys. Hydrates would still remain stable in this scenario.”

Historical methane peaks coincide with increases in temperature

Throughout Earth's history there have been several short periods of significant temperature increase. These periods often coincide with peaks of methane in the atmosphere, as recorded in ice cores. Scientists such as Plaza Faverola are still debating the cause of this past methane release.

“One hypothesis is that massive gas release from geological sources, such as volcanoes or ocean sediments, may have influenced global climate. What we know is that a lot of methane is released from the ocean floor at present. What we need to find out is whether it reaches the atmosphere, or whether it ever did.”

Historical events of methane release, such as the ones in the Vestnesa Ridge, provide crucial information that can be used in future climate modeling. Knowing if these events repeat, and identifying what makes them happen, may help us to better predict the potential influence of methane from the oceans on future climate.

See the full article here.

Please help promote STEM in your local schools.


Stem Education Coalition

#applied-research-technology, #environmental-studies, #methane-studies, #science-2-0

From Scientific American: “Humans Cross Another Danger Line for the Planet”

Scientific American

January 15, 2015
Mark Fischetti

Five years ago an impressive, international group of scientists unveiled nine biological and environmental “boundaries” that humankind should not cross in order to keep the earth a livable place. To its peril, the world had already crossed three of those safe limits: too much carbon dioxide in the atmosphere, too rapid a rate of species loss and too much pouring of nitrogen into rivers and oceans—primarily in the form of fertilizer runoff.

Now we have succeeded in transgressing a fourth limit: the amount of forestland being bulldozed or burned out of existence (see map below). Less and less forest reduces the planet’s ability to absorb some of that carbon dioxide and to produce water vapor, crucial to plant life. And the ongoing loss alters how much of the sun’s energy is absorbed or reflected across wide regions, which itself can modify climate.

Details about the fourth transgression, and updates on how well the planet is faring on all nine boundaries, are being published today online in Science. Another international team, with some of the same members from the original group, decided to reassess the boundaries given five more years of data, and they plan to keep doing so into the future. “Science moves on,” says the paper’s lead author, Will Steffen, a professor at the Australian National University and at the Stockholm Resilience Center at Stockholm University. “And so do the best ways to formulate the boundaries and apply them to policy.”

Disappearing Forests: Green areas are sustainable for now; yellow and red are past the safe limit.

Indeed, the group chose two “core boundaries,” each of which could “drive the earth system into a new state” if persistently surpassed. Of course the implication is that a “new” state is not a state we want to be in. The two core boundaries are “climate change”—primarily the amount of CO2 in the atmosphere—and “biosphere integrity,” which is how much we are killing off species and tearing up natural habitat. “Looking back into geologic history, we see that climate change and mass extinctions have occurred when major changes took place on earth,” Steffen says. Too much CO2 will simply cook the planet, and too much disruption of ecosystems will destroy natural resources (think food) on which people depend.

The designation of two core boundaries may in part be a response to frustration from policymakers when the nine boundaries were first revealed. They were simply too much for legislators and political leaders to take on. Steffen hopes that a focus on the core boundaries will guide the finalization of U.N. Sustainable Development Goals, happening this year, which are meant to guide nations in crafting policies that help sustain the planet into the future.

The researchers have put a surprisingly visual point on which nations bear the greatest responsibility for change by including detailed maps, like the forest map above, that show which regions are exceeding the boundaries the most. One set of maps, for example, shows which agricultural regions are sending the most nitrogen and phosphorus into waterways and oceans. When asked if the group might be criticized for pointing fingers, Steffen says, “That’s the reality of what’s happening. We’ve come to a point where we can’t avoid the equity issue any more,” meaning it is time to step up and indicate which countries are hurting the planet the most.

The realization that it is time to talk tough comes from a second new paper appearing today online in Anthropocene Review, also led by Steffen. It updates a striking set of 24 graphs (below) that show that almost all the damage to earth by humans has occurred since 1950, in lock step with rapid economic growth worldwide. This “great acceleration” of social, economic and environmental drivers basically says that although growing population adds stress to the earth’s systems, greater consumption through rising living standards is responsible for even more of the burden.

The “Great Acceleration” of social, economic and environmental drivers.

The knee-jerk reaction—that developing and poor countries cannot attain the income and quality of life levels enjoyed by fully developed nations without harming the earth—is wrong, however, Steffen says. For example, he notes, “We are not denying people food by saying the world must live within certain limits for nitrogen and phosphorus. Agronomists say new practices such as precision farming can allow us to grow enough food to feed 9 billion people and stay within the safe boundaries. We can be clever. We have tools. We just need to agree as a global society that we have to do things smarter.”

Part of scientists’ role in a smarter future will be to determine how the nine boundaries work together. Until now the limits have been assessed in isolation. Steffen says the next step, perhaps for a new report in another five years, is to unravel “the interactions between the boundaries. They are not independent of one another.” In the meantime, he adds, “we have to work with the policy community starting right now.”

See the full article here.


Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

#applied-research-technology, #environmental-studies, #scientific-american