Tagged: NOVA

  • richardmitnick 2:35 pm on November 11, 2015
    Tags: NOVA, Water management

    From NOVA: “How Water Is Reshaping the West” 



    11 Nov 2015
    Hillary Rosner

    On an unseasonably warm September day with a cloudless sky, I stand on a Colorado hillside, in a meadow that’s painted the ochre of early fall. Ponderosa pines stand sentry above the grass, which sways in a barely perceptible breeze. Down below, cottonwoods tower above an old streambed. All around me, birds chirp and flit among the shrubs; recent visitors spotted fresh mountain lion tracks and a black bear. It’s an almost perfect vista. Except I’m looking at a landscape that will soon disappear.

    A grassy meadow sways in the breeze in Chimney Hollow.

    “Be careful of rattlesnakes,” Brian Werner says as we walk near what will, a few years out, become the south end of Chimney Hollow Reservoir. I try to imagine what will happen to the snakes—and the bears and birds and burrowing animals—when these 1,600 acres become a lakebed. I’d been conducting an animated interview with Werner for more than an hour as we toured the region’s waterworks—reservoirs, pipelines, diversion ditches, pumps—but now, standing here, I’m speechless. Perhaps sensing my mood, Werner tries to be upbeat. He gestures to the west, where, as part of the reservoir land-acquisition deal, another 1,800 acres will be permanently protected. But it’s hard to stand beneath those ponderosas and not feel a kind of heartbreak.

    Werner works for Northern Water, a public utility that delivers water to parts of eight northeastern Colorado counties and about 880,000 people. In conjunction with the U.S. Bureau of Reclamation, Northern Water administers the Colorado-Big Thompson Project, a sprawling collection of reservoirs and pipes built to send Colorado River water from the western part of the state across the Rockies (through a tunnel beneath Rocky Mountain National Park) to the more populous—and growing—northeastern towns. Werner’s job title is public information officer, but after 34 years with the utility, he’s also its de facto historian, with an insider’s deep knowledge of the entire state’s water past and present, including the intricacies of water rights. (Western water law is an unfathomably complex beast predicated on a first-come-first-served system, which is why newer cities, late to the game, are struggling for rights to water that often flows right past them.)

    Up and down Colorado’s Front Range—the string of cities perched along the Rocky Mountains’ eastern flanks—it’s a boom time. Fort Collins, the northernmost city, has doubled its population since the 1980s, with no sign of stopping. Farther to the east, in formerly rural communities like Frederick, Dacono, and Evans, pavement is spreading like weeds, and subdivisions are sprouting in place of corn. The reservoir soon to drown the spectacular landscape under my feet that afternoon would deliver water to these bustling communities.

    Nearby, another proposed reservoir would submerge a highway to store water from the Poudre River, which flows through downtown Fort Collins; this project will serve those same growing towns. “Some people think if we don’t build those projects, people just won’t come,” Werner says. “I wish that were the case. But it’s not gonna happen. People are going to keep moving here, because it’s a great place to live.”

    Across much of the West, the story is similar. As cities and states grapple with urban growth alongside the impacts of global warming—crippling drought, a shifted timeline of snowmelt and stream flows, uncertainty about future water supplies—nothing is off the table when it comes to securing access to water. These days, the stories that make national news are more likely to be about old dams coming down than about new ones rising. That’s partly because dams coming down are still a rarity. But across the West, the local news is far more likely to be about smaller dams going up. The era of water mega-projects may be behind us, but engineers are still transforming landscapes to deliver water—an increasingly elusive and valuable commodity.

    In Colorado, planning for the next phase in the perpetual quest for water is nearing completion. The statewide water plan, mandated by Governor John Hickenlooper, is a massive document, two and a half years in the making, that details how the state will provide water to its expected population of 10 million people by midcentury and make up an anticipated 163-billion-gallon-a-year water deficit. A final version of the plan is due to the governor by December 10; officials are scrambling to respond to new demands by some Front Range cities, including Denver, Colorado Springs, and the towns that Northern Water supplies, for new reservoir capacity. The Denver Post reported last month that “46 staffers are scrambling to fix the plan and include a massive new commitment for new reservoir storage of 130 billion gallons.”

    Smaller Projects

    “The Reclamation era”—roughly the 1930s to the 1970s—“was big monster projects, massive dams that totally reshaped the watershed, rivers, and ecology,” says Reagan Waskom, director of the Colorado Water Institute at Colorado State University in Fort Collins. Today’s projects, Waskom says, are a series of “expansions and enlargements,” smaller-scale efforts meant to complement or shore up existing systems. About an hour south of Chimney Hollow, in the mountains west of Boulder, Denver’s water utility is planning to enlarge a reservoir built in the 1940s, raising the dam by 100 feet, doubling the lake’s surface area and tripling its storage capacity.

    The dams being built today, Werner agrees, are “not the Hoovers and the Meads. We’re not doing that anymore.” Still, he adds, “I don’t want to rule them out totally.” As recently as the 1990s, he notes, California built Diamond Valley, the biggest reservoir constructed in the western U.S. in 50 years. But in his three-plus decades with Northern Water, he’s worked on the construction of just one reservoir.

    Certainly, Chimney Hollow is no Diamond Valley or Lake Mead. The country’s largest reservoir, Mead can store roughly 26 million acre-feet, or 8.47 trillion gallons, of water. (Water storage is often measured in acre-feet, or the amount of water needed to flood one acre of land one foot deep.) The Hoover Dam, which stops the Colorado River in its tracks to create the lake, is 726 feet high. Chimney Hollow will hold 90,000 acre-feet, or about 29.3 billion gallons, with a 300-foot-high dam.
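
    For readers who want to check the arithmetic, here is a quick sketch using the standard conversion factor of 325,851 gallons per acre-foot (the acre-feet figures are the article’s; the code is purely illustrative):

    ```python
    # Illustrative arithmetic only: converting the article's acre-feet figures to gallons.
    GALLONS_PER_ACRE_FOOT = 325_851  # one acre flooded one foot deep

    def acre_feet_to_gallons(acre_feet):
        return acre_feet * GALLONS_PER_ACRE_FOOT

    print(f"Lake Mead:      {acre_feet_to_gallons(26_000_000):.3g} gallons")  # ~8.47e12
    print(f"Chimney Hollow: {acre_feet_to_gallons(90_000):.3g} gallons")      # ~2.93e10
    ```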

    Five earthen impoundments hold back 156,735 acre-feet, or 51 billion gallons, of water stored in Horsetooth Reservoir, which is just west of Fort Collins, Colorado.

    A subsidiary of Northern Water, called the Municipal Subdistrict, runs the Windy Gap project, which was built in the early 1980s to provide water for Boulder, Fort Collins, and four other Front Range cities. The system pulls water from the Colorado River and stores it in the Windy Gap reservoir on the west side of the Rockies, then delivers it to Lake Granby, where it is pumped through the Big Thompson system to the eastern side. But in wet years, Lake Granby, the main reservoir for that Big Thompson system, is already full—leaving no room to store the Windy Gap water. That means in dry years, when the customers really need it, the water isn’t there.

    Chimney Hollow is the solution, a way to stabilize the Windy Gap water supply. Water managers call it “firming.” Imagine that you are technically entitled to ten units of water out of a reservoir that stores 100 units. But in a dry year, the reservoir might only contain 30 units, and there are other customers besides you. In such a system, you couldn’t really depend on the reservoir for your water. The amount you can count on even in that worst-case scenario is what water people call “firm yield.”

    On the Windy Gap system, the firm yield is currently zero. “In the dry years, there’s no water available,” explains Werner, “and in the wet years, there’s nowhere to put it. You can’t rely on a project with zero firm yield.” Chimney Hollow, the utility contends, will give customers—the city of Erie, say—guaranteed annual delivery of their legally allotted water.
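
    To make “firming” concrete, here is a minimal sketch, with entirely hypothetical numbers, of how adding storage raises a supply’s firm yield: the largest constant annual draw that never fails over an inflow record. (This illustrates the concept only; it is not Northern Water’s actual model.)

    ```python
    # A minimal sketch of "firming." Firm yield here is the largest constant annual
    # draw that survives every year of an inflow record. All numbers are hypothetical.

    def is_feasible(draw, inflows, capacity):
        """Can `draw` units be delivered every year without a shortfall?"""
        storage = 0.0
        for inflow in inflows:
            storage = min(storage + inflow, capacity)  # wet-year surplus spills once full
            if storage < draw:
                return False                           # a dry year breaks the guarantee
            storage -= draw
        return True

    def firm_yield(inflows, capacity, step=0.1):
        draw = 0.0
        while is_feasible(draw + step, inflows, capacity):
            draw += step
        return draw

    inflows = [12, 0, 9, 1, 15, 0, 8, 0]     # boom-and-bust annual inflows
    print(firm_yield(inflows, capacity=0))   # 0.0  -- no storage: dry years yield nothing
    print(firm_yield(inflows, capacity=20))  # ~5.5 -- storage carries wet-year water into dry years
    ```

    With zero storage the firm yield is zero, exactly Werner’s description of Windy Gap; with a reservoir, the same erratic inflows support a dependable annual delivery.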

    “Even with climate change, we know that there will be high flow years,” Waskom says. “When those come along, you’ve either got a place to store that water or you don’t.”

    Storage offers a safety measure, a way to even out the inconsistencies and help ensure a more stable water supply. That’s no small thing when you’re coping with climate change. Floods that scientists used to consider 100-year events—with a 1% chance of occurring in any given year—are now being called 20-year events—with a 5% chance of occurring in any given year. Droughts are different than they used to be, too: today’s droughts, which coincide with warmer temperatures, bring more wildfires, earlier snowmelt, and greater rates of evaporation—transferring water from soil, plants, lakes, and reservoirs to the air.
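
    The shift from “100-year” to “20-year” events is easier to feel as cumulative odds. A quick back-of-the-envelope calculation (standard probability arithmetic, not figures from the article):

    ```python
    # Chance of at least one flood of a given annual probability over a span of years.
    def chance_of_at_least_one(annual_prob, years):
        return 1 - (1 - annual_prob) ** years

    print(f"{chance_of_at_least_one(0.01, 30):.0%}")  # 1%-a-year flood over 30 years: ~26%
    print(f"{chance_of_at_least_one(0.05, 30):.0%}")  # 5%-a-year flood over 30 years: ~79%
    ```
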
    Zombie Projects

    But not everyone thinks that means we should keep building reservoirs.

    “One of the things you see all over the West is that people think they need more water to serve more people,” says John Fleck, an adjunct professor and writer-in-residence at the University of New Mexico’s Water Resources Program. “But when push comes to shove, communities are really successful at using a lot less water. There’s this notion that if our population goes up, our water use has to go up. That’s really not the case.” Albuquerque, notes Fleck, who is writing a book about the future of the Colorado River, uses less water today than it did in the late 1980s even as it grew by 45%.

    Fleck believes some new reservoir and pipeline construction falls into a category he calls “zombie projects”: yet-to-be-built portions of long-ago-conceived storage systems. Decades ago, he says, “we got these projects in our minds, the idea that they were going to be built. People continue to think they want to build them, without recognizing the changing water-use realities.”

    Fleck points to a federal project called the Ute Lake Pipeline in eastern New Mexico as the “classic zombie”: A dam and reservoir were built back in the early 1960s, but a pipeline to deliver the water to the Clovis area—150 miles away—remains, well, a pipe dream. “The backers want to build it, homeowners around the lake and environmentalists want to stop it. The water just sits in the lake.” Fleck doubts it will ever be built. “But it limps along, because everybody’s incentives are well fed.”

    Flatiron Reservoir stores 760 acre-feet, or 247 million gallons, of water southwest of Loveland, Colorado.

    There’s also the issue of whether there will continue to be enough water in the rivers to make these efforts worthwhile. “Whether you have a big reservoir or just a straw where you’re sucking water out of the river and sending it somewhere else, the question is, will the water be there?” says Jeff Lukas, a researcher with the Western Water Assessment, a think tank based at the University of Colorado. “Just because you’ve done the modeling and your scheme would’ve worked under the hydrology of the last 50 years doesn’t mean it’ll work in the next 50 years.”

    A new reservoir for the swelling Denver suburb of Parker, for instance, completed in 2011, has managed to fill to only a third of capacity, thanks to dry spells, evaporation, and junior water rights that allow Parker to draw water only after more senior rights holders are done. The project took 30 years to build and currently provides only a year’s worth of water for 75 families.

    New water storage projects may also amount to a zero-sum game. “If you take more water out of the Colorado River basin,” Fleck says, “you’re taking water out of an overstressed system that can’t afford to lose water without someone else having less.”

    Or, as Douglas Kenney puts it, “It’s not like there’s some new water we can capture and the region is better off.” Kenney directs the Western Water Policy Program at the University of Colorado School of Law. When I visited him in his office, he picked up a manila envelope from a pile of mail on his desk. Sent by someone he’s never met, it contained a proposal for a “veterans memorial pipeline” that would carry water from the Mississippi River to the West. Kenney says these things land in his mailbox on a regular basis. Nutty as it sounds, it’s an idea that repeatedly surfaces, most notably pushed by former Nevada water czar Patricia Mulroy. Before she retired last year, Mulroy was the head of the Southern Nevada Water Authority, which supplies water to Las Vegas; during her 25-year tenure, the city slashed per capita water use by a third. Yet Mulroy frequently said that no option was too outrageous to consider. In addition to suggesting the West siphon water from the Mississippi, she applied for federal permits to tap vast quantities of groundwater from beneath 20,000 square miles of eastern Nevada sagebrush country. (The project is bogged down in lawsuits.)

    Some critics believe that new water projects are proceeding from a faulty premise: the idea that you develop as much water as you possibly can while you can, and then only conserve water when you absolutely have to. But it’s hard to tell that to cities that may simply not have enough water for their booming populations. “You look at the big cities of the West, and they’re pretty much all using the same or less water than 30 years ago,” Kenney says. “But they can do that because they had a big enough base to start from. It’s pretty easy to go to a city of one million and say you can conserve your way to the next 50 years of growth. It’s hard to go to a city of 500 people that’s growing rapidly, and say that conservation is the way you’re going to grow for the next couple of decades.”


    At some level, decisions about how to plan for the future of Western water supplies come down to both values and inertia. As Werner says, it’s not feasible to stop people from moving to Colorado’s Front Range and other booming parts of the Western U.S. While environmental conditions—unbearably hot summers or persistent extreme drought—might ultimately make both the Front Range and the entire West far less attractive, for the moment, they’re still desirable places to live. It’s hard to stop progress.

    “I see all of these Front Range projects as tradeoffs for society, just like building another freeway or power plant,” Waskom tells me when I call him on his cell phone one morning. “They are the costs of growth to live the way we live. I think there will be an effort to do the best we can for the environment, but everything has an impact, and I think we all have to take some personal responsibility for that.”

    A view of Chimney Hollow with Carter Lake Reservoir in the background

    At Northern Water’s headquarters in Berthoud—another rapidly growing Front Range town—a series of experimental garden plots occupy several acres behind the building. There, the utility is growing scores of different kinds of grasses and perennials using varying amounts of water and types of irrigation systems and soil amendments. New signs detail the watering needs of various landscapes and link, via QR codes, to web pages filled with water-conserving information for homeowners and professional landscapers. “I’m a huge believer that you better be able to convince people you’re doing everything you can do to save every last drop of water before you go out and build that next water project,” Werner says.

    Still, this is the same utility that, back in the mid-1980s, immediately after the Cache La Poudre became the state’s first river to receive an official Wild and Scenic designation, proposed building a 300-foot-high dam in the Poudre Canyon, west of Fort Collins. (The designation placed 90 percent of the river above the canyon mouth off-limits to future development but left a seven-mile stretch through the canyon open.) “From a strict engineering sense, it made the most sense,” Werner says. “Run-of-the-river reservoirs are the best way to do it, in canyons. Part of what I’m saying about the new world is, you don’t do things from a strictly engineering standpoint.”

    The new world is nothing if not complex. It’s a world of tradeoffs, a world without easy answers. Still, standing on the hillside at Chimney Hollow, I’m sure of one thing: I wish there was some way to spare this spectacular place.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 12:13 pm on November 9, 2015
    Tags: NOVA

    From NOVA: “Pushing the Limits of Life” 



    04 Nov 2015
    Carrie Arnold

    Everywhere scientists have looked on Earth, they have found signs of life. They’ve looked in the deepest oceans and the driest deserts, and in every case, life—in some form or another—was flourishing. But Kelly Wrighton and Mike Wilkins aren’t satisfied that the search is over, so they’re looking for life in a place more extreme than ever before.

    Which is why the married couple, both assistant professors of microbiology at Ohio State University, are at a new fracking well being drilled just outside Morgantown, West Virginia. Before Northeast Natural Energy can send down fluid to fracture the Marcellus Shales, buried more than 1.5 miles below the surface for 400 million years, Wrighton, Wilkins, and a team of scientists will be collecting rock samples hauled up from the deep.

    A closeup of the drill at the Morgantown site.

    Unlike previous samples, which were collected after wells had already been fracked and were thus contaminated, these samples will be pristine. That will give the microbiologists their best shot at finding signs of microbial life.

    Wrighton and Wilkins have spent their burgeoning careers studying the microbes that live dozens, even thousands, of feet beneath the surface of the Earth. Such deep subsurface microbes have to contend with high temperatures, in some areas well above the boiling point of water. They also have to manage extremely high pressures and high concentrations of salt. Perhaps the most difficult task is finding energy. Cut off from solar energy, subsurface bacteria have to rely on chemical reactions or on sinks of oil and natural gas to make their living.

    Signing on to the Morgantown project almost two years ago was a huge gamble, since no one knows whether life can survive in such an extreme environment. Wrighton and Wilkins used the expertise they had gathered studying subsurface microbes as grad students and postdocs, and then spent more than a year working on the project full-time before the first samples could even be collected. Whatever they find, they hope to shed light on one of science’s big questions: Just how extreme can life get? Their answers could reveal the limits of life, the conditions beyond which living things just couldn’t hack it. They could also tell us more about how life might first have evolved and where else in the universe it might be found.

    “There’s an enormous reservoir of undiscovered life that’s really hard to get to,” Wilkins says.

    Earliest Extremes

    The study of microbes living in extreme environments—so-called extremophiles—is relatively new. In 1969, Indiana University bacteriologist Thomas Brock and his student Hudson Freeze traveled to Yellowstone National Park to search for bacteria living in the park’s hot springs. To many, the expedition seemed like little more than tilting at windmills. Any bacteria living in the hot springs would have to survive at temperatures greater than 158° Fahrenheit, a point at which most living things would be cooked. But when they sampled some pink muck from Mushroom Spring, just a few miles north of Old Faithful, Brock and Freeze found it teeming with life. Among scientists, their discovery of Thermus aquaticus is now more famous for its facilitation of the polymerase chain reaction, used in labs around the world for amplifying DNA. But in the 1960s and 1970s, Brock’s discovery showed the scientific community that bacteria could survive in environments far more extreme than anyone thought.

    Extremophiles give Grand Prismatic Spring in Yellowstone National Park its vivid colors.

    “Bacteria are able to grow…at any temperature at which there is liquid water, even in pools which are above the boiling point,” Brock wrote in a 1967 Science paper.

    The discovery that microbes could live in environments far more extreme than anyone suspected opened a wide range of habitats to microbial exploration. While some scientists explored the frigid, windswept deserts of Antarctica, others, like Bo Barker Jørgensen and Karsten Pedersen, geomicrobiologists at Aarhus University in Denmark and Chalmers University of Technology in Sweden, respectively, began taking advantage of burgeoning surveys of marine life. Part of these surveys included sampling sediments at the bottom of the ocean or deep underground, which Jørgensen, Pedersen, and others found teeming with life. “It took a decade to accept that life was actually that deep,” Pedersen says.

    These first studies, in the mid-1980s, showed that deep subsurface life can exist. Still, despite decades of work on the subject, there’s no formal definition of what “deep” really means, says Tori Hoehler, an astrobiologist at NASA’s Ames Research Center.

    “From the NASA perspective, the deep subsurface means that you’ve gone deep enough to escape the influence of the surface biosphere,” Hoehler says. “I’m not sure there’s a strict dividing line, but once you’re a few meters deep or so, it’s certainly a different world than on the surface.”

    Without the large-scale drilling projects used to study deep subsurface marine life, microbiologists like Tullis Onstott of Princeton University had to access the deep via existing digs. In 1996, Onstott began focusing on gold mines in South Africa, then as now some of the deepest mines in the world, with some reaching nearly 2.5 miles below the surface. They are hot, dark, filthy places, and the working conditions are often deadly for miners. But for some microbes, this miners’ hell is pretty close to heaven.

    The first microbe Onstott found in his gold mine expeditions was related to the Firmicutes, a group of microbes typically found in boiling hot springs, like those in Yellowstone. This initial success enabled more trips back to the hellish environment of the South African mines. In a 2006 Science paper, Onstott and colleagues published the discovery of a bacterial community living in a gold mine water reservoir 1.74 miles below the ground. The basalt surrounding the reservoir also contained large amounts of uranium, which made the rocks highly radioactive. The radioactivity split water molecules into oxygen and, more importantly, the hydrogen gas that the microbes used for food. “They gobble hydrogen up like potato chips,” Onstott says.

    Onstott has no idea how long the microbes have been down there, though he doubts they have been present since the rock was buried several billion years ago. The geological processes that have molded and shaped the rock since its formation would have created temperatures and pressures high enough to kill any microbes native to those depths. Even so, Onstott believes that the microbes were able to survive on radioactivity for potentially millions of years.

    “It’s incredible how far down you can go and still detect life,” Wilkins says.

    Surviving for any length of time in such a stressful environment isn’t easy. The extreme heat of many of these locales can cause proteins to misfold, turning what resemble beautiful pieces of origami into useless crumpled heaps of scrap paper. DNA also requires more maintenance down deep, as does the cellular membrane. “A bacterium has to be very metabolically active to keep its sh– together, or rather, keep its sh– inside the cell,” Onstott says. Keeping the cell in proper working order under stress requires lots of energy, which can be hard to find even in the best of circumstances—and an underground vat of sizzling radioactive water most definitely is not one.

    Mike Wilkins uses pressure chambers to encourage bacteria from deep underground to grow.

    In 2003, scientists discovered what remains the known upper temperature limit of life. A team from the University of Massachusetts, Amherst found strain 121, a microbe from a deep-sea vent off Puget Sound that divides, albeit slowly, at 121° Celsius, or 250° F.

    Lab experiments to pin down these limits have been difficult: Thermus aquaticus routinely grows at temperatures above 90° Celsius in Yellowstone, but it hasn’t been coaxed to grow in the lab above around 80° Celsius. Moreover, to get life to grow at its hottest extreme, all other conditions have to be optimal. For life deep below ground, conditions are typically far from ideal, which means the upper temperature limit there is likely much lower, though no one currently knows what it might be.

    One result of this high-stress, low-energy environment is that many of the microbes found far below ground divide much less frequently. The famous bacterium Escherichia coli can divide in less than 20 minutes in nutrient-rich growth media. Onstott believes that some of the microbes he has found in the South African gold mines might have doubling times measured in decades, centuries, or even millennia. As a result, their numbers are likely to be much lower than those of microbes found on the surface, simply because they can’t reproduce as quickly.
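
    The arithmetic behind that contrast is simple exponential growth, N(t) = N₀ × 2^(t / t_double); the doubling times below are hypothetical stand-ins:

    ```python
    # Hypothetical doubling-time arithmetic: N(t) = N0 * 2 ** (t / t_double).
    def population(n0, doubling_time_hours, elapsed_hours):
        return n0 * 2 ** (elapsed_hours / doubling_time_hours)

    DAY = 24.0
    CENTURY = 100 * 365 * 24.0  # hours

    print(f"{population(1, 20 / 60, DAY):.2g}")  # E. coli, 20-minute doublings: ~4.7e21 cells in a day
    print(f"{population(1, CENTURY, DAY):.6f}")  # a once-a-century doubler: ~1.000019 cells in a day
    ```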

    Give them the right food, however, and all of that might change. Wrighton began her study of subsurface life by sampling the microbes found in a well that had already been fracked. In the process of fracking, energy companies typically flush the wells with hydrocarbon-rich liquids, both to get the natural gas out and to prevent microbes from corroding the pipes. But no fracking well can be sterilized completely. Microbes from the surface often make their way below ground, and frequently in very large numbers. For some bacteria, fracking fluids are an all-you-can-eat buffet.

    Wrighton wanted to know how these different communities of microbes lived together and how any deep subsurface life might affect microbial contaminants and vice versa. She was also curious about how the microbes made a living and what enzymes and genes were necessary to carry out basic functions. This, in turn, could provide a lot of information about how microbes interacted with each other to create rich, diverse communities.

    Microbiologists pulverized the shale samples to extract any chemicals that might suggest there’s life trapped inside.

    Her first glimpse at the microbial life in fracked wells led her to an even more fundamental question: Was anything down there in the first place? Wrighton’s instincts as a microbiologist told her yes, especially since scientists had found signs of life essentially everywhere else on Earth that they had looked. She began focusing her attention on the Marcellus Shale in Appalachia where an NSF-funded study would drill deep into the rock to obtain samples that Wrighton, Wilkins, and other researchers would have the opportunity to study. Since the shale was rich with seams of natural gas and other hydrocarbons, the microbes should have plenty to eat, she suspected. And the shale’s depth, at more than a mile, meant that it was deep enough to be completely isolated from the surface world but still accessible by drilling.

    Even more importantly, the site was pristine. It had never been previously drilled or fracked. This meant that if Wrighton could somehow account for any introduced contaminants, whatever other microbes she found were almost certainly native to the deep shales. To look for them, she teamed up with Wilkins and geologist Shikha Sharma from West Virginia University. The team would take a three-pronged approach: They would look for the chemical signs of life, seek out any microbial genetic material, and try to directly culture any microbes found.

    “We want to try and identify the chemical, physical, and biological factors that constrain life and try to formulate a recipe for what makes life possible,” Wrighton says.

    Together, they knew that if life could be found in the Marcellus shales, they would find it. The trio spent more than a year running mock experiments and honing their techniques to prepare for the arrival of their samples. After a series of delays, they finally got word that their samples would be drilled in early September.

    Drilling Deep

    You could hear the drill site long before you saw it. Nestled into a hillside outside of Morgantown and overlooking an old World War II munitions factory, a heavily rutted dirt road led up to a drill rig, a large blue metallic cylinder that rose for more than three stories from the rock below. The drill bit was more than a mile and a half below the surface, a distance long enough to hold nearly six Empire State Buildings placed end to end. Rebecca Daly, Wrighton’s lab manager, had gotten used to the din after spending several days at the site in preparation for the drilling that would yield her samples. Daly might have gotten used to the racket, but the oppressive late summer heat was something else entirely. Sweat streamed down her neck, soaking her long, blond ponytail.

    It didn’t dampen her enthusiasm, though. “This is such an incredible opportunity,” she says. “We’ve never been able to get pristine samples this deep before.”

    The drilling rig bores deep into the Earth to retrieve shale samples for microbiologists to study.

    Daly and Sharma had been up most of the night before, poring over geological data to identify the sites that would be most likely to hold signs of life. Formed more than 400 million years ago in the Devonian period, before dinosaurs roamed the Earth, the Marcellus shales are a tough place to survive. The shale is hot and salty, and it has relatively small amounts of two of the things microbes need most to eke out a living: water and living space.

    But work by other geologists had shown that other shales had microfractures just big enough to provide a nice home to some bacteria. The team was also interested in the interface between the Marcellus shales and the rocks immediately above and below. “We think these will be the hot spots for life. There’s enough space for the microbes to grow, and they have the hydrocarbons that have diffused out of the shale that they can use for energy,” Wilkins says.

    Sharma’s analysis of the geology beneath Morgantown identified more than 50 different areas likely to yield good results, including places that contained water and organic carbon that the microbes could use for food. Before Northeast Natural Energy started drilling, the researchers dropped small fluorescent beads down the drill shaft that would allow them to identify areas on the samples potentially contaminated by surface material. Since these bacteria were apt to be more numerous and swamp any signals from native deep subsurface life, Wrighton and Wilkins needed to remove as much contamination as humanly possible.

    Drilling for their samples began on Friday morning and lasted for nearly 36 hours. As the sun climbed in the sky on Saturday morning, Daly’s heart began hammering. The last time these rocks had felt the warmth of the sun or the stirring air of a cool breeze, the first trees had just emerged, as had the first four-legged animals.

    After seemingly endless hours of waiting, Daly’s samples were ready. Wearing latex gloves, Daly handled the dark gray tiles, each roughly the size of a large palm, gently cleaning them in a bath of salt water before placing them in a special storage container that removed all oxygen (although oxygen is necessary for us, it’s toxic to many subsurface microbes). Stacked in the trunk of her car, the boxes made it look like Daly was transporting tiles for a kitchen or bathroom remodel, their slightly rough surfaces giving them a rustic feel. Then she raced back to Columbus along I-70.

    Wrighton, Wilkins, and Sharma had their samples, but it was anyone’s guess as to what they might find.

    Microbial Detritus

    When European explorers first set foot on new ground, they planted a flag to show that they had been there. Microbes do something similar, if you know what to look for, Sharma says. Microbes signal not with flags but with subtle chemical changes.

    Sharma began her career as a geologist while a university student in her native India. She was initially drawn not to rocks, but to the fact that the geologists “had the best field trips,” she says. Over the years, her scientific interests transitioned to include the chemical signatures of life that can be found in rocks. Just as humans leave signs of our presence by the strands of hair on the bathroom floor and the coat draped over the handrail, microbes, too, have their own way of saying “Kilroy was here.” These types of microbial graffiti are found in the chemical signatures on the rocks.

    “The kind of signatures that we see can tell us if the microbes have been doing their thing for a very, very long time,” Sharma says. “These things can’t happen in a day.”

    Shale samples in Shikha Sharma’s lab at West Virginia University

    Key to those signatures is carbon, the building block of life on this planet. All carbon has six protons in its nucleus, and most carbon is carbon-12, with six protons and six neutrons, but tiny amounts of carbon-13 and -14, with seven and eight neutrons, respectively, also exist. For reasons that aren’t entirely clear, living organisms prefer carbon-12 to carbon-13 and carbon-14. This preference means that rocks that ever contained life have relatively more carbon-13 and carbon-14 than rocks that didn’t, since microbes would have consumed more of the carbon-12.
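
    Geochemists typically quantify such shifts with the standard delta notation, which compares a sample’s isotope ratio to that of a fixed reference standard (a textbook formula, not one specific to this project):

    ```latex
    \delta^{13}\mathrm{C} = \left( \frac{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{sample}}}{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\mathrm{standard}}} - 1 \right) \times 1000\text{‰}
    ```

    Because organisms preferentially take up the lighter carbon-12, biologically processed carbon carries a δ13C value measurably offset from the reference.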

    Although Sharma hasn’t ever looked for signatures of life in as extreme an environment as the current project, she has found signs in deep underground reservoirs, around 1.4 miles beneath the surface. DNA fingerprinting revealed a type of microbe known as a methanogen, which gives off methane as a by-product of metabolism much as humans exhale carbon dioxide. The reservoirs weren’t pristine, and there was no way to know whether these microbes were surface contaminants, but her results nonetheless revealed something important. “Even if they are contaminants, they are still active more than 7,800 feet below,” Sharma says.

    Sharma’s isotope fingerprinting can also determine whether these microbes are currently active or whether they’re merely a signature of life that was once there. Dead microbes, for example, leave traces of diglyceride fatty acids (DGFAs) as the fats in their cell wall decompose.

    These signatures can provide clues about which microbes are present, but not with the kind of detail that scientists need. This is where Wrighton comes in. Her expertise is in sequencing microbes that are present at very low levels. DNA degrades rapidly, which can make it harder to find than Sharma’s chemical signatures, but it also provides much more information about the microbial communities within the rock. “We can create a metabolic blueprint about what these microbes are doing and how they’re living without having to touch a single Petri dish,” Wrighton says.

    To get at the DNA, Wrighton and her team of graduate students are grinding up the rock samples by hand, using a large, strong mortar and pestle. “It sounds like a giant construction zone. I’m pretty sure we’re the least popular lab in the building right now,” Wrighton says, laughing. They will then soak the rock in chemicals to extract any DNA. Her lab’s expertise combined with improved genetic sequencing technology should tease out the sequence of even a single bacterium.

    Wilkins, for his part, will be trying to coax these hard-to-grow microbes to grow in the lab. Any microbes he finds will be fed a diet of ground-up rock and grown in the equivalent of a pressure cooker, to make them feel right at home. Raising them in large numbers can tell Wilkins more about how they live and what they are like.

    Although they’ve had their samples for more than a month, even the most preliminary data answering the biggest question of all—is there any life down there?—is still several months out. Wrighton remains optimistic that the shales or the layers immediately above and below will yield signs of life, “but we’re preparing for not finding anything,” she says.

    Whatever they find, the answers still matter. Knowing life’s potential limits also provides valuable information that could guide researchers looking for life not just here on Earth but also elsewhere in the universe. “So far, we’ve focused almost all our attention on the tiny skin of Earth that we live on,” Daly says. “But there’s just so much more out there.”

    See the full article here.

  • richardmitnick 12:23 pm on October 22, 2015
    Tags: NOVA

    From NOVA: “Are the Laws of Physics Really Universal?” 



    21 Oct 2015
    Kate Becker

    Can the laws of physics change over time and space?

    As far as physicists can tell, the cosmos has been playing by the same rulebook since the time of the Big Bang. But could the laws have been different in the past, and could they change in the future? Might different laws prevail in some distant corner of the cosmos?

    “It’s not a completely crazy possibility,” says Sean Carroll, a theoretical physicist at Caltech, who points out that, when we ask if the laws of physics are mutable, we’re actually asking two separate questions: First, do the equations of quantum mechanics and gravity change over time and space? And second, do the numerical constants that populate those equations vary?

    An artist’s impression of the quasar 3C 279. Astrophysicists use light from quasars to look for variations in the fundamental constants. Credit: ESO/M. Kornmesser, adapted under a Creative Commons license.

    To see the distinction, imagine the whole universe as one big game of basketball. You can tweak certain parameters without changing the game: Raise the hoop a little higher, make the court a little bigger, change the way you score, and it’s still basketball. But if you tell the players to start running bases or kicking field goals, then you’re playing a different game.

    Most of the current research into the changeability of physical laws has focused on the numerical constants. Why? It’s the easier question to answer. Physicists can make solid, testable predictions about how variations in numerical constants should affect the results of their experiments. Plus, says Carroll, it wouldn’t necessarily blow physics wide open if it turns out that constants do change over time. In fact, some constants have changed: The mass of an electron, for instance, was zero until the Higgs field turned on a tiny sliver of a second after the Big Bang. “We have lots of theories that can accommodate changing constants,” says Carroll. “All you have to do to account for time-dependent constants is to add some scalar field to the theory that moves very slowly.”

    A scalar field, Carroll explains, is any quantity that has a unique value at every point in space-time. The celebrity-du-jour scalar field is the Higgs, but you can also think of less exotic quantities, like temperature, as scalar fields, too. A yet-undiscovered scalar field that changes very slowly could continue to evolve even billions of years after the Big Bang—and with it, the so-called constants of nature could evolve, too.

    Luckily, the cosmos has gifted us with some handy windows through which we can peer at the constants as they were in the deep past. One such window is located in the rich uranium deposits of the Oklo region of Gabon, in Central Africa, where, in 1972, workers serendipitously discovered a group of natural nuclear reactors—rock formations in which fission reactions spontaneously ignited and sustained themselves for hundreds of thousands of years. The result: “A radioactive fossil of what the rules of nature looked like” two billion years ago, says Carroll. (For perspective, the Earth is about 4.5 billion years old, and the universe is edging toward 14 billion.)

    The characteristics of that fossil depend on the value of a special number called the fine structure constant, which bundles up a handful of other constants—the speed of light [in a vacuum], the charge on an electron, the electric constant, and Planck’s constant—into a single number, about 1/137. It’s what physicists call a “dimensionless” constant, meaning that it’s really just a number: it’s not 1/137 inches, seconds, or coulombs, it’s just plain 1/137. That makes it an ideal place to look for changes in the constants embedded within it, says Steve Lamoreaux, a physicist at Yale University. “If the constants changed in such a way that the electron mass and the electrostatic interaction energies changed in a different way, it would show up in the 1/137 unambiguously, independent of measurement system.”
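
    Written out, the standard definition bundles exactly those constants (here e is the electron charge, ε₀ the electric constant, ħ the reduced Planck constant, and c the speed of light):

    ```latex
    \alpha = \frac{e^{2}}{4\pi\varepsilon_{0}\hbar c} \approx \frac{1}{137.036}
    ```

    Because all the units cancel, a genuine drift in α would show up the same way in any system of measurement, which is Lamoreaux’s point.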

    But interpreting that fossil isn’t easy, and over the years researchers studying Oklo have come to apparently conflicting conclusions. For decades, studies of Oklo seemed to show that the fine structure constant was absolutely steady. Then came a study suggesting that it had gotten bigger, and another that it had gotten smaller. In 2006, Lamoreaux (then at Los Alamos National Laboratory) and his colleagues published a fresh analysis that was, they wrote, “consistent with no shift.” But, they pointed out, it was still “model dependent”—that is, they had to make certain assumptions about how the fine structure constant could change.

    Using atomic clocks, physicists can search for even tinier changes in the fine structure constant, but they’re limited to looking at present-day variations that happen over just a year or so. Researchers at the National Institute of Standards and Technology in Boulder, Colorado, compared time kept by atomic clocks running on aluminum and mercury to put extremely tight limits on the present-day change in the fine structure constant. Though they can’t say for certain that the fine structure constant isn’t changing, if it is, the variation is tiny: just quadrillionths of a single percent each year.
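
    To unpack that phrase (my arithmetic, not NIST’s exact published bound): a quadrillionth of one percent is 10⁻¹⁵ × 10⁻², or about one part in 10¹⁷ per year.

    ```python
    # Unpacking "quadrillionths of a single percent each year" (illustrative arithmetic,
    # not NIST's exact published limit).
    quadrillionth = 1e-15
    one_percent = 1e-2
    annual_fractional_limit = quadrillionth * one_percent
    print(f"{annual_fractional_limit:.0e} per year")  # 1e-17 per year
    ```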

    Today, the best limits on how the constants could be varying over the life of the universe come from observations of distant objects in the sky. That’s because the farther into space you look, the farther back in time you can see. The Oklo “time machine” stops two billion years ago, but, using light from distant quasars, astronomers have dialed the cosmic time machine 11 billion years back.

    Quasars are extremely bright, ancient objects that astronomers believe are powered by supermassive black holes. As light from these quasars travels to us, some of it gets absorbed by the gas it travels through along the way. But it doesn’t get absorbed evenly: only very particular wavelengths, or colors, get plucked out. The specific colors that are “deleted” from the spectrum depend on how photons from the quasar light interact with atoms in the gas, and those interactions depend on the fine structure constant. So, by looking at the spectrum of light from distant quasars, astrophysicists can search for changes to the fine structure constant over many billions of years.

    “By the time that light has reached us here on Earth, it has collected information regarding several galaxies going back billions of years,” says Tyler Evans, who led some of the most rigorous quasar measurements to date while he was a PhD student at Swinburne University of Technology in Australia. “It is analogous to taking a core sample of ice or the Earth in order to tell how climate was behaving in previous epochs.”

    Despite some tantalizing hints, the latest studies all show that changes to the fine structure constant are “consistent with zero.” That doesn’t mean that the fine structure constant absolutely isn’t changing. But if it is, it’s doing so more subtly than these experiments can detect, and that seems unlikely, says Carroll. “It’s hard to squeeze a theory into the little daylight between not changing at all, and not changing enough that we can see it.”

    Astrophysicists are also looking for changes to G, the gravitational constant, which dials in the strength of gravity. In 1937, Paul Dirac, one of the pioneers of quantum mechanics, offered up the hypothesis that gravity gets weaker as the universe ages. Though the idea didn’t stick, physicists kept looking for changes in G, and today some exotic alternative theories of gravity embrace a shifting gravitational constant. While lab experiments here on Earth have returned confusing results, studies off Earth suggest that G isn’t changing much, if at all. Most recently, radio astronomers scoured 21 years of precise timing data from an unusually bright, stable pulsar to see if they could trace any changes in its regular “heartbeat” of radio emission to changes in the gravitational constant. The result—nothing.

    But back to the second, tougher half of our original question: Could the laws of physics themselves, and not just the constants sewn into them, be changing? “That’s much harder to say,” says Carroll, who points out that there are different degrees of disruption to consider. If the rules of some “sub-theory” of quantum mechanics, like quantum electrodynamics, turned out to be fluid, maybe existing theory could accommodate that. But if the laws of quantum mechanics itself are in flux, says Carroll, “That would be very bizarre.” No theory predicts how or why such a change might happen; there is simply no framework from which to investigate the question.

    As far as we can tell, the universe seems to be playing fair. But physicists will keep scouring the rulebook, looking for clues that the rules of the game could be changing at a level we haven’t yet perceived.

    Go Deeper
    Picks for further reading

    Discover: Is the Search for Immutable Laws of Nature a Wild-Goose Chase?
    Astrophysicist Adam Frank profiles four theorists who challenge the notion that there is one set of unchanging laws that perfectly describes the universe.

    Michael Murphy: Are Nature’s Laws Really Universal?
    Murphy, an astrophysicist at Swinburne University of Technology, provides a general-audience overview of the search for changes in the fundamental constants, with links to related articles and video.

    Natural History Magazine: On Earth As In The Heavens
    In this essay, astrophysicist Neil deGrasse Tyson explains why physicists think that the same laws that apply on Earth apply throughout the cosmos, and how we may one day use this knowledge to communicate with alien civilizations.

    See the full article here.

  • richardmitnick 7:33 am on October 16, 2015
    Tags: NOVA

    From NOVA: “Particle Physics – Bowling for Dark Matter” 



    15 Oct 2015
    Maggie McKee

    Flickr user Kuba Bożanowski, adapted under a Creative Commons license.

    Could 1,000-ton “bowling balls” of dark matter be rolling around the cosmos? That’s the idea behind macro dark matter, a proposal that dark matter might actually be a new, outsize species of ordinary matter rather than a completely alien kingdom of particles. That, some argue, could explain why the mysterious stuff has so far eluded capture in exotic dark-matter traps around the world and failed to materialize in the Large Hadron Collider in Switzerland.

    But others say the proposal is no less speculative than those that conjure up novel hypothetical particles and argue that if it were correct, the new species’ giant footprints would already have shown up on Earth.

    Since the 1930s, researchers have suggested that an unseen form of matter, dubbed dark matter, is the gravitational glue that keeps big cosmic structures like galaxies and galaxy clusters from flying apart. This cosmic adhesive seems to outweigh the familiar stuff we see around us by a factor of 5.5, but its identity remains a stubborn mystery.

    The leading suspect is a class of particles called WIMPs, or weakly interacting massive particles. Lying beyond physicists’ best theory of particles, the standard model, WIMPs are beloved because they may solve other problems besides dark matter.

    The Standard Model of elementary particles, with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Most notably, some of them arise in versions of a popular but unproven theory known as supersymmetry, which can explain the mass of the Higgs boson, a value many physicists consider puzzlingly low.

    Supersymmetry standard model

    Unfortunately, attempts to detect WIMPs have so far come up empty-handed. Unequivocal signs of the particles have not turned up in underground detectors designed to catch their occasional collisions with ordinary matter. And no supersymmetric particles have yet been produced at the world’s most powerful particle smasher, the Large Hadron Collider, even though some versions of the theory predicted they should have been. “The LHC is making us realize that maybe supersymmetry may not be correct,” says Peter Graham, a dark matter theorist at Stanford University in California. “That’s led to a lot of interesting work thinking about new, different kinds of dark matter candidates.”

    One of these posits that dark matter may be made of hordes of subatomic particles called quarks, the lightest types of which form the garden-variety building blocks of protons and neutrons. The idea, which was first put forward by physicist Edward Witten in 1984, is “appealing because it could mean that dark matter is a standard model phenomenon,” says David Jacobs of the University of Cape Town in South Africa. Perhaps, he and his collaborators argue, quarks glommed together in huge numbers in the early universe, rather than only in the cliques of three most commonly seen in matter today. Or perhaps quarks still formed triplets, known as baryons, but those baryons then stuck together in enormous numbers—swarms of 10 trillion trillion or more. (The largest grouping of baryons known today hosts just 294.) Either way, this type of matter may have included so many quarks that it could be seen with the naked eye, forming chunks weighing at least as much as half a stick of butter. Witten called these lumps “quark nuggets.”

    In light of the so-far fruitless searches for WIMPs, Jacobs and his collaborators have revisited Witten’s notion that these nuggets could form dark matter. The nuggets are one possibility for dark matter made of hefty lumps that the team calls macros.

    “I like that they’re [thinking] out of the box,” says dark matter theorist Jonathan Feng of the University of California, Irvine. That said, he believes it’s too soon to count WIMPs out just because neither they nor their supersymmetric brethren have been detected in the first three years of operations at the LHC. “Rumors of the death of WIMPs are greatly exaggerated,” Feng quips, adding that the particles’ health would be of greater concern if they were still no-shows after another decade of observations at the LHC.

    “It’s long past time we consider things that are a little bit more complicated than just the simple WIMP,” agrees Graham. But he disagrees with the team’s notion that dark matter made of known quarks would be more theoretically attractive than that made of as-yet-undetected WIMPs, since it is far from clear whether quarks could form huge agglomerations.

    Macro researchers acknowledge the point. Calculating the forces between quarks is notoriously difficult, and it is currently impossible to work out whether quarks could join up in such large throngs. And even if they did, it’s not clear whether they would have been able to live through the searing heat of the early universe to tell their tale today, or whether they may have come together only briefly and then dispersed, like a flash mob. “We don’t know whether this stuff … will be stable,” says macro proponent Glenn Starkman of Case Western Reserve University in Cleveland, Ohio.

    Still, the possibility can’t be ruled out, and Starkman and others say researchers should go look for signs of macros. The team calculates that macros weighing between about 50 grams and 100 million billion (10^17) grams could account for dark matter. If a macro’s constituent particles were packed as tightly as those inside an atomic nucleus, one weighing a billion grams, or 1,000 tonnes, would measure about 30 centimeters across. “It would look like a rock or a bowling ball—until you tried to pick it up,” says Starkman.

    Celestial surveys would have missed macros because they are relatively few and far between and, compared with other astronomical bodies, puny. But they should occasionally cross paths with Earth and could leave telltale signatures when they hit the atmosphere or surface, Starkman says. Ground-based detectors that search the atmosphere for signs of impacting space particles called cosmic rays, for example, might be able to catch macros hurtling along too, he suggested earlier this month. But after a subsequent meeting with members of the largest cosmic-ray observatory in the world, an array of detectors in Argentina called the Pierre Auger Observatory, that possibility looks less promising.

    Part of the Pierre Auger Observatory

    That’s because Auger plucks out only the fastest-moving signals from the stream of incoming data. “[Macros] move very much slower (about 1,000 times) than a typical cosmic ray, and at present Auger appears not able to see such things,” says Jacobs.

    Traveling at that speed—estimated at 300 kilometers per second—macros should have stuck out like sore thumbs in meteor observations if they were actually pummeling the Earth, says Peter Jenniskens, a meteor researcher at the SETI Institute in Mountain View, California. The fastest meteors, the Leonids, slice through the atmosphere at about 70 kilometers per second, so “anything streaking into the atmosphere at 300 kilometers per second would certainly have been noticed” by naked-eye observers and in photographs, agrees Jay Melosh, a planetary scientist at Purdue University in West Lafayette, Indiana.

    Starkman estimates that a bowling-ball-sized macro should hit the Earth once a year. But such an object would deposit into the atmosphere about half the energy of the atomic bomb that leveled Hiroshima, says Melosh. “Such a large energy release would certainly have been noted any time after 1960 when [a worldwide network] was put in place to detect just such events,” he says. Though the network has registered several events above that energy, Melosh does not think any were unusual enough to represent an “exotic bowling ball.” Moreover, he says, such an object “would have had the momentum to punch entirely through the Earth and emerge at nearly the same velocity on the other side,” releasing as much energy as a magnitude 10 earthquake. “Again, hardly something to be missed,” he says.

    Whether or not the macro dark matter hypothesis, in some form, can survive such criticisms is unclear, but Feng says it’s worth studying. “It does have the advantage of being a testable idea with real-world consequences,” he says. “It’s going to spur a lot of interesting discussions.”

    Go Deeper

    Editor’s picks for further reading

    Backreaction: Macro Dark Matter
    Theoretical physicist and blogger Sabine Hossenfelder explains why she “hates” the idea of macro dark matter—but thinks the idea is still important to pursue.

    The Nature of Reality: Journey Into the Dark Realm
    What if dark matter isn’t just one particle, but a diverse realm of dark matter particles that experience forces that don’t affect ordinary matter? Don Lincoln explores the “dark realm.”

    NOVA scienceNOW: Dark Matter (Video)
    Host Neil deGrasse Tyson reports from a half mile underground in an abandoned mine, where scientists are using special detectors to look for evidence of a ghostly substance that they believe makes up most of the matter of the universe—a hypothetical entity called dark matter.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 4:18 pm on October 7, 2015 Permalink | Reply
    Tags: , , , , NOVA   

    From Don Lincoln (FNAL) for NOVA: “Neutrino Physicists Win Nobel, but Neutrino Mysteries Remain”



    07 Oct 2015
    FNAL Don Lincoln
    Don Lincoln

    Neutrinos are the most enigmatic of the subatomic fundamental particles. Ghosts of the quantum world, neutrinos interact so weakly with ordinary matter that it would take a wall of solid lead five light-years deep to stop the neutrinos generated by the sun. In awarding this year’s Nobel Prize in physics to Takaaki Kajita (Super-Kamiokande Collaboration/University of Tokyo) and Arthur McDonald (Sudbury Neutrino Observatory Collaboration/Queen’s University, Canada) for their neutrino research, the Nobel committee affirmed just how much these “ghost particles” can teach us about fundamental physics. And we still have much more to learn about neutrinos.

    Super-Kamiokande experiment Japan

    Sudbury Neutrino Observatory

    View from the bottom of the SNO acrylic vessel and photomultiplier tube array with a fish-eye lens. This photo was taken immediately before the final, bottom-most panel of photomultiplier tubes was installed. Photo courtesy of Ernest Orlando Lawrence Berkeley National Laboratory.

    Neutrinos are quantum chameleons, able to change their identity between the three known species (called electron-, muon-, and tau-neutrinos). It’s as if a duck could change itself into a goose and then a swan and back into a duck again. Takaaki Kajita and Arthur B. McDonald received the Nobel for finding the first conclusive proof of this identity-bending behavior.

    In 1970, chemist Ray Davis built a large experiment designed to detect neutrinos from the sun. This detector was made up of a 100,000-gallon tank filled with a chlorine-containing compound. When a neutrino hit a chlorine nucleus, it would convert it into argon. In spite of a flux of about 100,000 trillion solar neutrinos per second, neutrinos interact so rarely that he expected to see only a couple dozen argon atoms after a week of running.

    But the experiment found even fewer argon atoms than predicted, and Davis concluded that the flux of electron-type neutrinos hitting his detector was only about a third of that emitted by the sun. This was an incredible scientific achievement and, for it, Davis was awarded a part of the 2002 Nobel Prize in physics.

    Explaining how these neutrinos got “lost” in their journey to Earth would take nearly three decades. The correct answer was put forth by the Italian-born physicist Bruno Pontecorvo, who hypothesized that the electron-type neutrinos emitted by the sun were morphing, or “oscillating,” into muon-type neutrinos. (Note that the tau-type neutrino was postulated in 1975 and observed in 2000; Pontecorvo was unaware of its existence.) This also meant that neutrinos must have mass—a surprise, since even in the Standard Model of particle physics, our most modern theory of the behavior of subatomic particles, neutrinos are treated as massless.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    So, if neutrinos could really oscillate, we would know that our current theory is wrong, at least in part.

    In 1998, a team of physicists led by Takaaki Kajita was using the Super Kamiokande (SuperK) experiment in Japan to study neutrinos created when cosmic rays from space hit the Earth’s atmosphere. SuperK was an enormous cavern, filled with 50,000 tons of water and surrounded by 11,000 light-detecting devices called phototubes. When a neutrino collided with a water molecule, the resulting debris from the interaction would fly off in the direction that the incident neutrino was traveling. This debris would emit a form of light called Cherenkov radiation, which let scientists determine the direction the neutrino was traveling.

    Cherenkov radiation glowing in the core of the Advanced Test Reactor [Idaho National Laboratory].

    By comparing the neutrinos created overhead, about 12 miles from the detector, to those created on the other side of the Earth, about 8,000 miles away, the researchers were able to demonstrate that muon-type neutrinos created in the atmosphere were disappearing, and that the rate of disappearance was related to the distance that the neutrinos traveled before being detected. This was clear evidence for neutrino oscillations.
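
    For readers who want the quantitative flavor, the disappearance SuperK measured is usually described with the textbook two-flavor oscillation formula, P = sin^2(2θ) · sin^2(1.27 Δm² L / E). The sketch below uses illustrative parameter values (my choices, not the experiments’ fitted numbers) and is an approximation to the full three-flavor analysis the experiments actually use.

    ```python
    # Two-flavor neutrino oscillation probability (textbook approximation;
    # dm2 and mixing values below are illustrative assumptions).
    import math

    def oscillation_probability(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=1.0):
        """Chance a muon neutrino has changed flavor after traveling L km."""
        return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

    # A ~1 GeV atmospheric neutrino made overhead vs. on the far side of the Earth:
    print(oscillation_probability(20, 1.0))      # ~0.004: barely any oscillation yet
    print(oscillation_probability(13000, 1.0))   # ~0.2 here; averaged over energies, ~0.5
    ```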

    Just a few years later, in 2001, the Sudbury Neutrino Observatory (SNO) experiment, led by Arthur B. McDonald, was looking at neutrinos originating in the sun. Unlike previous experiments, the SNO could identify all three neutrino species, thanks to its giant tank of heavy water (i.e., D2O, in which each oxygen atom is bound to two deuterium atoms). SNO first used ordinary water to measure the flux of electron-type neutrinos and then heavy water to observe all three types. The SNO team was able to demonstrate that the combined flux of all three neutrino types agreed with the sun’s predicted output, but that the electron-type flux alone was lower than a no-oscillation scenario would predict. This experiment was a definitive demonstration of the oscillation of solar neutrinos.

    With the achievements of both the SuperK and SNO experiments, it is entirely fitting that Kajita and McDonald share the 2015 Nobel Prize in physics. They demonstrated that neutrinos oscillate and, therefore, that neutrinos have mass. This is a clear crack in the impressive façade of the Standard Model of particle physics and may well lead to a better and more complete theory.

    The neutrino story didn’t end there, though. To understand the phenomenon in greater detail, physicists are now generating beams of neutrinos at sites around the world, including Fermilab, Brookhaven, CERN, and the KEK laboratory in Japan. Combined with studies of neutrinos emitted by nuclear reactors, significant progress has been made in understanding the nature of neutrino oscillation.

    Real mysteries remain. Our measurements have shown that the masses of the neutrino species differ from one another. That’s how we know that some must have mass: if the masses differ, they can’t all be zero. However, we don’t know the absolute mass of any neutrino species—just the mass differences. We don’t even know which species is the heaviest and which is the lightest.

    The biggest question in neutrino oscillation physics, though, is whether neutrinos and antimatter neutrinos oscillate the same way. If they don’t, this could explain why our universe is composed solely of matter even while we believe that matter and antimatter existed in equal quantities right after the Big Bang.

    Accordingly, Fermilab, America’s premier particle physics laboratory, has launched a multi-decade effort to build the world’s most intense beam of neutrinos, aimed at a distant detector located 800 miles away in South Dakota.

    Sanford Underground Research Facility Interior
    Sanford Underground Research Facility

    Named the Deep Underground Neutrino Experiment (DUNE), it will dominate the neutrino frontier for the foreseeable future.
    FNAL Dune & LBNF

    This year’s Nobel Prize acknowledged a great step forward in our understanding of these ghostly, subatomic chameleons, but their entire story hasn’t been told. The next few decades will be a very interesting time.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 2:02 pm on October 6, 2015 Permalink | Reply
    Tags: , , , , NOVA, Sterile neutrinos   

    From NOVA: “Sterile Neutrinos: The Ghost Particle’s Ghost” July 2014. Old, but Worth It for the Details 



    11 Jul 2014

    FNAL Don Lincoln
    Don Lincoln, FNAL

    What do you call the ghost of a ghost?

    If you’re a particle physicist, you might call it a sterile neutrino. Neutrinos, known more colorfully as “ghost particles,” can pass through (almost) anything. If you surrounded the Sun with five light years’ worth of solid lead, a full half of the Sun’s neutrinos would slip right on through. Neutrinos have this amazing penetrating capability because they do not interact by the electromagnetic force, nor do they feel the strong nuclear force. The only forces they feel are the weak nuclear force and the even feebler tug of gravity.
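
    That lead-wall factoid can be turned around into a quick consistency check. Inverting the claim (my round-number arithmetic, not from the article) gives the interaction cross-section it implies, which lands in a plausible ballpark for MeV-scale neutrinos.

    ```python
    # Invert "five light-years of lead stops half the neutrinos" to find the
    # implied cross-section per lead nucleus. Rough, round-number arithmetic.
    import math

    LY_CM = 9.461e17                    # one light-year in cm
    half_thickness = 5 * LY_CM          # depth that stops half the neutrinos

    # Number density of lead nuclei: density * Avogadro / atomic mass
    n_lead = 11.34 * 6.022e23 / 207.2   # ~3.3e22 nuclei per cm^3

    # Half the flux survives when n * sigma * depth = ln(2)
    sigma = math.log(2) / (n_lead * half_thickness)
    print(f"implied cross-section ~ {sigma:.1e} cm^2")   # ~4e-42 cm^2 per nucleus
    ```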

    The Perseus galaxy cluster, one of 73 clusters from which mysterious x-rays, possibly produced by sterile neutrinos, were observed. Credit: Chandra: NASA/CXC/SAO/E.Bulbul, et al.; XMM-Newton: ESA

    NASA Chandra Telescope

    ESA XMM Newton

    When Wolfgang Pauli first postulated neutrinos in 1930, he thought that his proposed particles could never be detected. In fact, it took more than 25 years for physicists to confirm that neutrinos—Italian for “little neutral ones”—were real. Now, physicists are hunting for something even harder to spot: a hypothetical ghostlier breed of neutrinos called sterile neutrinos.

    Today, we know of three different “flavors” of neutrinos: electron neutrinos, muon neutrinos and tau neutrinos (and their antimatter equivalents). In the late 1960s, studies of the electron-type neutrinos emitted by the Sun led scientists to suspect that they were somehow disappearing or morphing into other forms. Measurements made in 1998 by the Super Kamiokande experiment strongly supported this hypothesis, and in 2001, the Sudbury Neutrino Observatory clinched it.

    Super-Kamiokande Detector

    Sudbury Neutrino Observatory

    One of the limitations of studying neutrinos from the Sun and other cosmic sources is that experimenters don’t have control over them. However, scientists can make beams of neutrinos in particle accelerators and also study neutrinos emitted by man-made nuclear reactors. When physicists studied neutrinos from these sources, a mystery presented itself. It looked like there weren’t three kinds of neutrinos, but rather four or perhaps more.

    Ordinarily, this wouldn’t be cause for alarm, as the history of particle physics is full of the discovery of new particles. However, in 1990, researchers using the LEP accelerator demonstrated convincingly that there were exactly three kinds of ordinary neutrinos. Physicists were faced with a serious puzzle.

    LEP at CERN

    There were some caveats to the LEP measurement. It was only capable of finding neutrinos if they were low mass and interacted via the weak nuclear force. This led scientists to hypothesize that perhaps the fourth (and fifth and…) forms of neutrinos were sterile, a word coined by the Italian-born physicist Bruno Pontecorvo to describe a form of neutrino that didn’t feel the weak nuclear force.

    Searching for sterile neutrinos is a vibrant experimental program and a confusing one. Researchers pursuing some experiments, such as LSND and MiniBooNE, have published measurements consistent with the existence of these hypothetical particles, while others, like the Fermilab MINOS team, have ruled out sterile neutrinos with the same properties. Inconsistencies abound in the experimental world, leading to great consternation among scientists.

    LSND Experiment
    LANL/LSND Experiment

    FNAL MiniBoone

    FNAL Minos Far Detector

    In addition, theoretical physicists have been busy. There are many different ways to imagine a particle that doesn’t experience the strong, weak, or electromagnetic forces (and is therefore very difficult to make and detect); proposals for a variety of different kinds of sterile neutrinos have proliferated wildly, and sterile neutrinos are even a credible candidate for dark matter.

    Perhaps the only general statement we can make about sterile neutrinos is that they are spin ½ fermions, just like neutrinos, but unlike “regular” neutrinos, they don’t experience the weak nuclear force. Beyond that, the various theoretical ideas diverge. Some predict that sterile neutrinos have right-handed spin, in contrast to ordinary neutrinos, which have only left-handed spin. Some theories predict that sterile neutrinos will be very light, while others have them quite massive. If they are massive, that could explain why ordinary neutrinos have such a small mass: perhaps the mathematical product of the masses of these two species of neutrinos equals a constant, say proponents of what scientists call the “see-saw mechanism”; as one mass goes up, the other must go down, resulting in low-mass ordinary neutrinos and high-mass sterile ones.
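
    The see-saw arithmetic is easy to sketch. With textbook scales (assumed here for illustration; the article quotes no numbers), a Dirac mass near the electroweak scale paired with a very heavy sterile partner yields a sub-eV ordinary neutrino:

    ```python
    # See-saw mechanism, illustrative numbers: m_light ~ mD^2 / M.
    # Both scales below are standard textbook assumptions, not measured values.
    mD = 100e9   # Dirac mass, eV (~100 GeV, the electroweak scale)
    M = 1e23     # heavy sterile mass, eV (~10^14 GeV)

    m_light = mD**2 / M
    print(f"light neutrino mass ~ {m_light:.2f} eV")   # ~0.1 eV, tiny as observed
    ```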

    Now, some astronomers have proposed sterile neutrinos could be the source of a mysterious excess of x-rays coming from certain clusters of galaxies. Both NASA’s Chandra satellite and the European Space Agency’s XMM-Newton have spotted an excess of x-ray emission at 3.5 keV. It is brighter than could immediately be accounted for by known x-ray sources, but it could be explained by sterile neutrinos decaying into photons and regular neutrinos. However, one should be cautious. There are tons of atomic emission lines in this part of the x-ray spectrum. One such line, an argon emission line, happens to be at 3.62 keV. In fact, if the authors allow a little more of this line than predicted, the possible sterile neutrino becomes far less convincing.

    Thus the signal is a bit sketchy and could easily disappear with a better understanding of more prosaic sources of x-ray emission. This is not a criticism of the teams who have made the announcement, but an acknowledgement of the difficulty of the measurement. Many familiar elements emit x-rays in the 3.5 keV energy range, and though the researchers attempted to remove those expected signals, they may find that a fuller accounting negates the “neutrino” signal. Still, the excess was seen by more than one facility and in more than one cluster of galaxies, and the people involved are smart and competent, so it must be regarded as a possible discovery.

    It is an incredibly long shot that the 3.5 keV x-ray excess from galaxy clusters comes from sterile neutrinos but, if it does, it will be a really big deal. The first order of business is a more detailed understanding of more ordinary emission lines. Unfortunately, only time will tell if we’ve truly seen a ghost.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 7:41 am on October 6, 2015 Permalink | Reply
    Tags: , , NOVA,   

    From NOVA: “Are Space and Time Discrete or Continuous?” 



    01 Oct 2015
    Sabine Hossenfelder

    Split a mile in half, you get half a mile. Split the half mile, you get a quarter, and on and on, until you’ve carved out a length far smaller than the diameter of an atom. Can this slicing continue indefinitely, or will you eventually reach a limit: a smallest hatch mark on the universal ruler?

    The success of some contemporary theories of quantum gravity may hinge on the answer to this question. But the puzzle goes back at least 2500 years, to the paradoxes thought up by the Greek philosopher Zeno of Elea, which remained mysterious from the 5th century BC until the early 1800s. Though the paradoxes have now been solved, the question they posed—is there a smallest unit of length, beyond which you can’t divide any further?—persists.

    Credit: Flickr user Ian Muttoo, adapted under a Creative Commons license.

    The most famous of Zeno’s paradoxes is that of Achilles and the Tortoise in a race. The tortoise gets a head start on the faster-running Achilles. Achilles should quickly catch up—at least that’s what would happen in a real-world footrace. But Zeno argued that Achilles will never overtake the tortoise, because in the time it takes for Achilles to reach the tortoise’s starting point, the tortoise too will have moved forward. While Achilles pursues the tortoise to cover this additional distance, the tortoise moves yet another bit. Try as he might, Achilles only ever reaches the tortoise’s position after the animal has already left it, and he never catches up.

    Obviously, in real life, Achilles wins the race. So, Zeno argued, the assumptions underlying the scenario must be wrong. Specifically, Zeno believed that space is not indefinitely divisible but has a smallest possible unit of length. This allows Achilles to make a final step surpassing the distance to the tortoise, thereby resolving the paradox.

    It took more than two thousand years to develop the necessary mathematics, but today we know that Zeno’s argument was plainly wrong. After mathematicians understood how to sum an infinite number of progressively smaller steps, they calculated the exact moment Achilles surpasses the tortoise, proving that it does not take forever, even if space is indefinitely divisible.
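
    To see how the sum works, give the runners concrete (made-up) numbers: Achilles at 10 meters per second, the tortoise at 1 meter per second with a 10 meter head start. Each Zeno step takes a tenth as long as the last, and the infinite series converges to the same finite catch-up time the closed-form formula gives.

    ```python
    # Zeno's steps as a geometric series, with made-up illustrative numbers.
    v_achilles, v_tortoise, head_start = 10.0, 1.0, 10.0

    t, gap = 0.0, head_start
    for _ in range(30):                  # 30 Zeno steps are plenty to converge
        step = gap / v_achilles          # time to reach the tortoise's last position
        t += step
        gap = v_tortoise * step          # the tortoise's new lead in that time

    closed_form = head_start / (v_achilles - v_tortoise)
    print(t, closed_form)                # both ~1.111... seconds
    ```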

    Zeno’s paradox is solved, but the question of whether there is a smallest unit of length hasn’t gone away. Today, some physicists think that the existence of an absolute minimum length could help avoid another kind of logical nonsense: the infinities that arise when physicists attempt a quantum version of [Albert] Einstein’s General Relativity, that is, a theory of “quantum gravity.” When physicists attempted to calculate probabilities in the new theory, the integrals just returned infinity, a result that couldn’t be more useless. In this case, the infinities were not mistakes but demonstrably a consequence of applying the rules of quantum theory to gravity. But by positing a smallest unit of length, just like Zeno did, theorists can reduce the infinities to manageable finite numbers. And one way to get a finite length is to chop up space and time into chunks, thereby making it discrete: Zeno would be pleased.

    He would also be confused. While almost all approaches to quantum gravity bring in a minimal length one way or the other, not all approaches do so by means of “discretization”—that is, by “chunking” space and time. In some theories of quantum gravity, the minimal length emerges from a “resolution limit,” without the need of discreteness. Think of studying samples with a microscope, for example. Magnify too much, and you encounter a resolution-limit beyond which images remain blurry. And if you zoom into a digital photo, you eventually see single pixels: further zooming will not reveal any more detail. In both cases there is a limit to resolution, but only in the latter case is it due to discretization.

    In these examples the limits could be overcome with better imaging technology; they are not fundamental. But a resolution-limit due to quantum behavior of space-time would be fundamental. It could not be overcome with better technology.

    So, a resolution-limit seems necessary to avoid the problem with infinities in the development of quantum gravity. But does space-time remain smooth and continuous even on the shortest distance scales, or does it become coarse and grainy? Researchers cannot agree.

    Artist’s concept of Gravity Probe B orbiting the Earth to measure space-time, a four-dimensional description of the universe including height, width, length, and time. Credit: NASA, http://www.nasa.gov/mission_pages/gpb/gpb_012.html

    In string theory, for example, resolution is limited by the extension of the strings (roughly speaking, the size of the ball that you could fit the string inside), not because there is anything discrete. In a competing theory called loop quantum gravity, on the other hand, space and time are broken into discrete blocks, which gives rise to a smallest possible length (expressed in units of the Planck length, about 10^-35 meters), area and volume of space-time—the fundamental building blocks of our universe. Another approach to quantum gravity, “asymptotically safe gravity,” has a resolution-limit but no discretization. Yet another approach, “causal sets,” explicitly relies on discretization.
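
    The Planck length itself comes straight from the fundamental constants, independent of any particular approach to quantum gravity; the quick calculation below just evaluates the standard definition.

    ```python
    # Planck length from fundamental constants: l_P = sqrt(hbar * G / c^3)
    import math

    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
    c = 2.99792458e8         # speed of light, m/s

    l_planck = math.sqrt(hbar * G / c**3)
    print(f"{l_planck:.2e} m")   # ~1.62e-35 m
    ```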

    And that’s not all. Einstein taught us that space and time are joined in one entity: space-time. Most physicists honor Einstein’s insight, and so most approaches to quantum gravity take space and time to either both be continuous or both be discrete. But some dissidents argue that only space or only time should be discrete.

    So how can physicists find out whether space-time is discrete or continuous? Directly measuring the discrete structure is impossible because it is too tiny. But according to some models, the discreteness should affect how particles move through space. It is a minuscule effect, but it adds up for particles that travel over very long distances. If true, this would distort images from far-away stellar objects, either by smearing out the image or by tearing apart the arrival times of particles that were emitted simultaneously and would otherwise arrive on Earth simultaneously. Astrophysicists have looked for both of these signals, but they haven’t found the slightest evidence for graininess.

    Even if the direct effects on particle motion are unmeasurable, defects in the discrete structure could still be observable. Think of space-time like a diamond. Even rare imperfections in atomic lattices spoil a crystal’s ability to transport light in an orderly way, which will ruin a diamond’s clarity. And if the price tags at your jewelry store tell you one thing, it’s that perfection is exceedingly rare. It’s the same with space-time. If space-time is discrete, there should be imperfections. And even if rare, these imperfections will affect the passage of light through space. No one has looked for this yet, and I’m planning to start such a search in the coming months.

    Besides guiding the development of a theory of quantum gravity, finding evidence for space-time discreteness—or ruling it out!—would also be a big step towards solving a modern-day paradox: the black hole information loss problem, posed by Stephen Hawking in 1974. We know that black holes can only store so much information, which is another indication of a resolution-limit. But we do not know exactly how black holes encode the information of what fell inside. A discrete structure would provide us with elementary storage units.

    Black hole information loss is a vexing paradox that Zeno would have appreciated. Let us hope we will not have to wait 2000 years for a solution.

    Editor and author’s picks for further reading

    arXiv: Minimal Length Scale Scenarios for Quantum Gravity

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 10:56 am on September 29, 2015 Permalink | Reply
    Tags: , , , NOVA   

    From NOVA: “$20 Million Xprize Wants to Eliminate Waste Carbon Dioxide” 



    29 Sep 2015
    Tim De Chant

    Five out of five climatologists agree—we’re probably going to emit more CO2 than we should if we want to prevent the worst effects of climate change.

    Fortunately, there’s a solution—capturing that CO2 and doing something with it. Unfortunately, the “somethings” that we know of are both costly and not that profitable. A new Xprize announced this morning aims to change that. Funded by energy company NRG and COSIA, an industry group representing Canadian oil sands companies, the prize will fund the teams that develop the most valuable ways to turn the most CO2 into something useful.

    A smokestack vents emissions to the atmosphere.

    “It’s the second largest prize we’ve ever launched,” Paul Bunje, senior scientist of energy and environment at Xprize, told NOVA Next. “It’s a recognition of a couple of things: One is the scale of the challenge at hand—dealing with carbon dioxide emissions is obviously an epic challenge for the entire planet. Secondly, it also recognizes just how difficult, technologically, this challenge is.”

    Starting today, teams have nine months to register, and by late 2016, they’ll need to submit technical documentation in support of their plans. A panel of judges will then pick the best 15 in each “track”—one which captures emissions from a coal-fired power plant, the other from a natural gas-fired plant.

    The 30 semifinalists will then have to develop laboratory-scale versions of their plans. The best five from each track will receive a $500,000 grant to help fund the next stage, where teams will have to build demonstration-scale facilities attached to working power plants. Four and a half years from now, a winner from each track will be chosen and awarded $7.5 million.

    Bunje, who is leading this Xprize, hopes the prize will show that “CO2 doesn’t just have to be a waste product that drives climate change—rather, that you can make money off of the products from converted CO2,” he said. “That kind of a perception shift will be pretty remarkable.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 8:18 pm on September 28, 2015 Permalink | Reply
    Tags: , , , NOVA   

    From NOVA: “Could the Universe Be Lopsided?” 



    28 Sep 2015
    Paul Halpern

    One hundred years ago, [Albert] Einstein re-envisioned space and time as a rippling, twisting, flexible fabric called spacetime. His theory of general relativity showed how matter and energy change the shape of this fabric. One might expect, therefore, that the fabric of the universe, strewn with stars, galaxies, and clouds of particles, would be like a college student’s dorm room: a mess of rumpled, crumpled garments.

    Indeed, if you look at the universe on the scale of stars, galaxies, and even galaxy clusters, you’ll find it puckered and furrowed by the gravity of massive objects. But take the wider view—the cosmologists’ view, which encompasses the entire visible universe—and the fabric of the universe is remarkably smooth and even, no matter which direction you turn. Look up, down, left, or right and count up the galaxies you see: you’ll find it’s roughly the same from every angle. The cosmic microwave background [CMB], the cooled-down relic of radiation from the early universe, demonstrates the same remarkable evenness on the very largest scale.

    Cosmic Background Radiation Planck
    CMB per ESA/Planck

    ESA Planck
    ESA/Planck satellite

    A computer simulation of the ‘cosmic web’ reveals the great filaments, made largely of dark matter, located in the space between galaxies. By NASA, ESA, and E. Hallman (University of Colorado, Boulder), via Wikimedia Commons

    Physicists call a universe that appears roughly similar in all directions isotropic. Because the geometry of spacetime is shaped by the distribution of matter and energy, an isotropic universe must possess a geometric structure that looks the same in all directions as well. The only three such possibilities for three-dimensional spaces are positively curved (the surface of a hypersphere, like a beach ball but in a higher dimension), negatively curved (the surface of a hyperboloid, shaped like a saddle or potato chip), or flat. Russian physicist [Alexander] Friedmann, Belgian cleric and mathematician Georges Lemaître and others incorporated these three geometries into some of the first cosmological solutions of Einstein’s equations. (By solutions, we mean mathematical descriptions of how the three spatial dimensions of the universe behave over time, given the type of geometry and the distribution of matter and energy.) Supplemented by the work of American physicist Howard Robertson and British mathematician Arthur Walker, this class of isotropic solutions has become the standard for descriptions of the universe in the Big Bang theory.

    However, in 1921 Edward Kasner—best known for his coining of the term “Googol” for the number 1 followed by 100 zeroes—demonstrated that there was another class of solutions to Einstein’s equations: anisotropic, or “lopsided,” solutions.

    Known as the Kasner solutions, these cosmic models describe a universe that expands in two directions while contracting in the third. That is clearly not the case with the actual universe, which has grown over time in all three directions. But the Kasner solutions become more intriguing when you apply them to a kind of theory called a Kaluza-Klein model, in which there are unseen extra dimensions beyond space and time. Thus space could theoretically have three expanding dimensions and a fourth, hidden, contracting dimension. Physicists Alan Chodos and Steven Detweiler explored this concept in their paper Where has the fifth dimension gone?
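
    In their standard vacuum form, the Kasner solutions scale the three spatial directions as t^p1, t^p2, t^p3, with exponents obeying p1 + p2 + p3 = 1 and p1^2 + p2^2 + p3^2 = 1; outside trivial cases one exponent must be negative, which is why two directions expand while the third contracts. A quick check of a classic exponent set:

    ```python
    # Check that a classic Kasner exponent set satisfies both constraints:
    # sum(p) = 1 and sum(p^2) = 1 (standard vacuum Kasner conditions).
    p = (-1/3, 2/3, 2/3)

    print(sum(p))                    # 1.0 (up to floating-point rounding)
    print(sum(x**2 for x in p))      # 1.0
    ```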

    Kasner’s is far from the only anisotropic model of the universe. In 1951, physicist Abraham Taub applied the shape-shifting mathematics of Italian mathematician Luigi Bianchi to general relativity and revealed even more baroque classes of anisotropic solutions that expand, contract or pulsate differently in various directions. The most complex of these, categorized as Bianchi type-IX, turned out to have chaotic properties and was dubbed by physicist Charles Misner the “Mixmaster Universe” for its resemblance to the whirling, twirling kitchen appliance.

    Like a cake rising in a tray, while bubbling and quivering on the sides, the Mixmaster Universe expands and contracts, first in one dimension and then in another, while a third dimension just keeps expanding. Each oscillation is called a Kasner epoch. But then, after a certain number of pulses, the direction of pure expansion abruptly switches. The formerly uniformly expanding dimension starts pulsating, and one of those formerly pulsating starts uniformly expanding. It is as if the rising cake were suddenly turned on its side and another direction started rising instead, while the other directions, including the one that was previously rising, just bubbled.

    One of the weird things about the Mixmaster Universe is that if you tabulate the number of Kasner epochs in each era, before the behavior switches, it appears as random as a dice roll. For example, the universe might oscillate in two directions five times, switch, oscillate in two other directions 17 times, switch again, pulsate another way twice, and so forth—without a clear pattern. While the solution stems from deterministic general relativity, it seems unpredictable. This is called deterministic chaos.

    Could the early moments of the universe have been chaotic, and then somehow regularized over time, like a smoothed-out pudding? Misner initially thought so, until he realized that the Mixmaster Universe couldn’t smooth out on its own. However, it could have started out “lopsided,” then been stretched out during an era of ultra-rapid expansion called inflation until its irregularities were lost from sight.

    As cosmologists have collected data from instruments such as the Hubble Space Telescope, Planck Satellite, and WMAP satellite (now retired), the bulk of the evidence supports the idea that our universe is indeed isotropic.

    NASA Hubble Telescope
    NASA/ESA Hubble


    But a minority of researchers have used measurements of the velocities of galaxies and other observations, such as an odd alignment of temperature fluctuations in the cosmic microwave background dubbed the “Axis of Evil,” to assert that the universe could be slightly irregular after all.

    For example, starting in 2008, Alexander Kashlinsky, a researcher at NASA’s Goddard Space Flight Center, and his colleagues have statistically analyzed cosmic microwave background data gathered first by the WMAP satellite and then by the Planck satellite to show that, in addition to their motion due to cosmic expansion, many galaxy clusters seem to be heading toward a particular direction on the sky. He dubbed this phenomenon “dark flow,” and suggested that it is evidence of a previously unseen cosmic anisotropy known as a “tilt.” Although the mainstream astronomical community has disputed Kashlinsky’s conclusion, he has continued to gather statistical evidence for dark flow and the idea of a tilted universe.

    Whether or not the universe really is “lopsided,” it is intriguing to study the rich range of solutions of Einstein’s general theory of relativity. Even if the preponderance of evidence today points to cosmic regularity, who knows when a new discovery might call that into question, and compel cosmologists to dust off alternative ideas. Such is the extraordinary flexibility of Einstein’s masterful theory: a century after its publication, physicists are still exploring its possibilities.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 9:33 am on September 23, 2015 Permalink | Reply
    Tags: , , , NOVA   

    From NOVA: “Why Doesn’t Everyone Believe Humans Are Causing Climate Change?” 2014 But Important 



    19 Nov 2014
    Brad Balukjian

    Last week during his tour of Asia, President Barack Obama struck a new global warming deal with China. It was a landmark agreement that many expect could break the logjam that has kept the world’s two largest emitters largely on the sidelines of talks to curb greenhouse gas emissions. Both countries agreed to reduce carbon dioxide emissions, with the U.S. ramping up reductions starting in 2020 and China beginning cuts in 2030.

    Yet back home, President Obama still faces an electorate largely unconvinced that climate change is caused by humans. Only 40% of Americans attribute global warming to human activity, according to a recent Pew Research Center poll. This, despite decades of scientific evidence and the fact that Americans generally trust climate scientists.

    Despite decades of evidence, most Americans don’t believe that humans are causing climate change.

    That apparent cognitive dissonance has vexed two scientists in particular: Michael Ranney, a professor of education at the University of California, Berkeley, and Dan Kahan, a professor of law at Yale University. According to both, we haven’t been asking the right questions. But they disagree on what, exactly, those questions should be. If one or both of them are right, the shift in tone could transform our society’s debate over climate change.

    The Wisdom Deficit

    In the 1990s, Michael Ranney started informally asking people what they perceived to be the world’s biggest problem. He hadn’t set out to tackle environmental issues—he was first trained in applied physics and materials science before turning to cognitive psychology. But time and again, he heard “climate change” as an answer.

    Ranney had also noticed that while the scientific community had converged on a consensus, the general public had not, at least not in the U.S. The Climategate controversy in late 2009 over leaked e-mails between climate scientists and Oklahoma Senator James Inhofe’s insistence that anthropogenic global warming is a hoax are just two examples of the widespread conflict among the American public over what is causing the planet to warm.

    Ranney and his team say that a “wisdom deficit” is driving the wedge. Specifically, it’s a lack of understanding of the mechanism of global warming that’s been retarding progress on the issue. “For many Americans, they’re caught between a radio talk show host—of the sort that Rush Limbaugh is—and maybe a professor who just gave them a lecture on global warming. And if you don’t understand the mechanism, then you just have competing authorities, kind of like the Pope and Galileo,” he says. “Mechanism turns out to be a tie-breaker when there’s a contentious issue.”

    Despite the fact that the general public has been inundated with scientific facts related to global warming, Ranney says that our climate literacy is still not very high. In other words, though we may hear a lot about climate change, we don’t really understand it. It’s similar to how lots of people follow the ups and downs of the Dow Jones Industrial Average but don’t understand how those fluctuations relate to macroeconomic trends.

    Climate illiteracy isn’t just limited to the general public, either. Ranney recalls a presentation at a recent conference reporting that many university professors who teach global warming barely understood its mechanism better than the undergraduates they were teaching. “Even one of the most highly-cited climate change communicators in the world didn’t know the mechanism over dinner,” he says.

    One of the most common misconceptions, according to Ranney, is that light energy “bounces” off the surface of the Earth and then is trapped or “bounced back” by greenhouse gases. The correct mechanism is subtly different. Ranney’s research group has boiled it down to 35 words: “Earth transforms sunlight’s visible light energy into infrared light energy, which leaves Earth slowly because it is absorbed by greenhouse gases. When people produce greenhouse gases, energy leaves Earth even more slowly—raising Earth’s temperature.”
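
    A toy one-layer energy-balance model (my illustration, not Ranney’s wording) puts that 35-word mechanism into numbers: the surface must radiate harder to push the same energy out through an infrared-absorbing atmosphere, so it ends up warmer.

    ```python
    # Toy greenhouse model: bare planet vs. one fully absorbing infrared layer.
    # Standard textbook values; illustrative only.
    S = 1361.0        # solar constant, W/m^2
    albedo = 0.3      # fraction of sunlight reflected
    sigma = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

    absorbed = (1 - albedo) * S / 4            # average absorbed sunlight per m^2

    T_bare = (absorbed / sigma) ** 0.25        # surface radiates straight to space
    T_greenhouse = (2 * absorbed / sigma) ** 0.25   # layer halves the escape rate

    print(round(T_bare), round(T_greenhouse))  # ~255 K vs ~303 K
    ```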

    When Ranney surveyed 270 visitors to a San Diego park on how global warming works, he found that exactly zero could provide the proper mechanism. In a second experiment, 79 psychology undergraduates at UC Berkeley scored an average of 3.8 out of 9 possible points when tested on mechanistic knowledge of climate change. In a third study, 41 people recruited through Amazon’s Mechanical Turk, an online marketplace for freelance labor, scored an average of 1.9 out of 9. (Study participants in Japan and Germany had a similarly poor showing, meaning it’s not just an American problem.) With every new experiment, Ranney found consistently low levels of knowledge.

    At least, he did at first. In his experiments, after the first round of questions, Ranney included a brief lecture or a written explanation on the correct mechanism behind global warming. He then polled the same people to see whether they understood it better and whether they accepted that humans are causing climate change. In the UC Berkeley study, acceptance rose by 5.4%; in the Mechanical Turk study, it increased by 4.7%. Perhaps most notably, acceptance increased among both conservatives and liberals. There was no evidence for political polarization.

    That doesn’t mean polarization doesn’t exist. It’s certainly true that liberals are more likely to accept anthropogenic global warming than conservatives. Myriad studies and surveys have found that. But political affiliation doesn’t always overwhelm knowledge when it becomes available—Ranney found no evidence for a difference between conservatives’ and liberals’ change in willingness to accept climate change after his “knowledge intervention.”

    Convinced that the key to acceptance is understanding the mechanism, Ranney created a series of no-frills videos of varying lengths in multiple languages explaining just that. More than 130,000 page views later, Ranney is not shy about his aims: “Our goal is to garner 7 billion visitors,” he says.

    Depolarizing Language

    Meanwhile, Dan Kahan says that it’s not a wisdom gap that’s preventing acceptance of human’s role in climate change, but the cultural politicization of the topic. People don’t need a sophisticated understanding of climate change, he says. “They only need to be able to recognize what the best available scientific evidence signifies as a practical matter: that human-caused global warming is initiating a series of very significant dynamics—melting ice, rising sea levels, flooding, heightened risk of serious diseases, more intense hurricanes and other extreme weather events—that put us in danger.”

    According to Kahan, the problem lies in the discourse around the issue. When people are asked about their acceptance of anthropogenic global warming, he says the questions tend to confound what people know with who they are and the cultural groups they identify with. In those circumstances, declaring a position on the issue becomes more a statement of cultural identity than one of scientific understanding.

    Kahan’s ideas are based on his own surveys of the American public. In one recent study of 1,769 participants recruited through the public opinion firm YouGov, he assessed people’s “ordinary climate science intelligence” with a series of climate change knowledge questions. He also collected demographic data, including political orientation. Kahan found no correlation between one’s understanding of climate science and his or her acceptance of human-caused climate change. Some people who knew quite a bit on the topic still didn’t accept the premise of anthropogenic climate change, and vice versa. He also found that, as expected, conservatives are less likely to accept that humans are changing the climate.

    Unlike Ranney, Kahan did find strong evidence for polarization. The more knowledgeable a conservative is, for example, the more likely they are to reject human-caused global warming. Kahan suggests that these people use their significant analytical skills to seek evidence that aligns with their political orientation.

    Still, despite many people’s strong reluctance to accept anthropogenic global warming, cities and counties in places like southeast Florida have gone ahead and supported practices to deal with global warming anyway. Kahan relates one anecdote in which state and local officials in Florida have argued for building a nuclear power generator higher than planned because of sea-level rise and storm surge projections. But if you ask these same people if they believe in climate change, they’ll say, “no, that’s something entirely different!” Kahan says.

    Kahan’s not exactly sure why some people act in ways that directly contradict their own beliefs—he laughs and verbally shrugs when asked—but he has some ideas. The leading one is the notion of dualism, when someone mentally separates two apparently conflicting ideas and yet feels no need to reconcile them. This happens on occasion with religious medical doctors, he says, who reject evolution but openly admit to using the principles of evolution in their work life.

    Whatever the cause, Kahan thinks the case of southeast Florida is worth studying. There, the community has been able to examine the scientific evidence for climate change and take action despite widespread disagreement on whether humans are actually driving climate change. The key, Kahan says, is that they have kept politics out of the room.

    Two Sides of the Same Coin

    Ranney and Kahan, much like the skeptics and supporters of human-caused climate change, question each other’s conclusions. Kahan is skeptical that Ranney’s approach can be very effective on a large scale. “I don’t think it makes sense to believe that if you tell people in five-minute lectures about climate science, that it’s going to solve the problem,” he says. He also questions the applicability of Ranney’s experiments, which have mostly included students and Mechanical Turk respondents. “The people who are disagreeing in the world are not college students,” he says. “You’re also not in a position to give every single person a lecture. But if you did, do you think you’d be giving that lecture to them with Rush Limbaugh standing right next to them pointing out that they’re full of shit? Because in the world, that’s what happens.”

    Hundreds of millions of in-person lectures would certainly be impossible, but Ranney has high hopes for his online videos. Plus, Ranney points out that Kahan’s studies are correlative, while his are controlled experiments where causation can be more strongly inferred. In addition, most of the measures of climate science knowledge that Kahan uses in his research focus on factual knowledge rather than mechanism. (For example, the multiple choice question, “What gas do most scientists believe causes temperatures in the atmosphere to rise?”). Ranney’s work, on the other hand, is all about mechanism.

    Despite their apparent disagreement, Ranney thinks the debate is a bit of a false dichotomy. “It’s certainly the case that one’s culture has a significant relationship to whether or not you accept [anthropogenic global warming], but that doesn’t mean your global warming knowledge isn’t also related to it. And it doesn’t mean you can’t overcome a cultural predilection with more information,” Ranney says. “There were a lot of things that were culturally predicted, like thinking we were in a geocentric universe or that smoking was fine for you or that the Earth was flat—all manner of things that eventually science overcame.”

    Perhaps Ranney and Kahan are on the same team after all—they would probably agree that, at the end of the day, both knowledge and culture matter, and that we’d be well-served to focus our energy on how to operationally increase acceptance of anthropogenic global warming. “Whatever we can do now will be heroic for our great-grandchildren, and whatever we do not do will be infamous,” Ranney says.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.
