Tagged: Energy

  • richardmitnick 9:39 am on November 5, 2015 Permalink | Reply
    Tags: Energy

    From Duke: “Low-Energy, High-Impact Physics” 

    Duke Bloc
    Duke Crest

    Duke University

    Nov 2, 2015
    Mary-Russell Roberson

    Triangle Universities Nuclear Lab Celebrates 50 Years

    In this undated photo, (L-R) Russell Roberson, Ed Bilpuch (both of whom directed TUNL at one time) and Al Lovette worked in the underground room where the massive Van de Graaff accelerator is operated.

    The identity of this region of North Carolina as a “Research Triangle” was still more of a concept than a reality in 1965 when the U.S. Atomic Energy Commission gave the three universities $2.5 million to build a cutting-edge laboratory to explore the Nuclear Age.

    Borrowing some of its identity from the newly minted Research Triangle Park just a few miles away on Highway 54, the launch of the Triangle Universities Nuclear Laboratory was front page news throughout the region.

    Duke professor Henry Newson had succeeded — on his third try — in securing funding for a 15-MeV tandem Van de Graaff accelerator and a 15-MeV cyclotron. His creative twist was the idea of using the cyclotron to inject a particle beam into the Van de Graaff to cost-effectively double the beam energy. Scientists at TUNL called the combination the cyclo-graaff.
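    Newson's energy-doubling claim can be checked with back-of-envelope arithmetic. As a rough sketch, treating the combined energy as the simple sum of the two machines is an assumption for illustration; the actual gain depends on the particle's charge state and the machines' details.

    ```python
    # Rough sketch of the "cyclo-graaff" idea: injecting the cyclotron's beam
    # into the tandem Van de Graaff roughly adds the two machines' energies.
    # Simple additivity is an illustrative assumption, not an accelerator model.
    cyclotron_mev = 15
    tandem_mev = 15

    combined_mev = cyclotron_mev + tandem_mev
    print(combined_mev)  # 30 MeV, roughly double either machine alone
    ```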

    The other magic of the third funding attempt was the idea to include UNC and NC State in the proposal, said Eugen Merzbacher, a professor emeritus at UNC who died in 2013. “Henry had this brilliant idea to combine the three universities.”

    Merzbacher helped write the successful proposal, as did Worth Seagondollar, who was chair of the physics department at N.C. State. Each university would supply faculty members and graduate students to conduct research using the equipment.

    Fifty years later, the agreement stands.

    The cyclotron is gone and the lab has had to change its goals with the times, but it still brings more than $7 million of research funding into the Triangle each year. It outlived the AEC, which became the Department of Energy. DOE’s Office of Nuclear Physics is still the major funder, but there is support from other agencies as well, including the National Nuclear Security Administration, the National Science Foundation, and the Domestic Nuclear Detection Office of the Department of Homeland Security.

    The TUNL lab has produced 286 Ph.D.s from all three schools, some of whom are returning the weekend of Nov. 6-8 to celebrate and get caught up.


    Construction of the cyclo-graaff lab, located behind the Duke physics building on West Campus, was partly supported by a grant from the North Carolina Board of Science and Technology.

    Russell Roberson, professor emeritus and a former TUNL director, arrived at Duke in 1963. At the time, the Duke Physics department had two small Van de Graaff accelerators — one rated at an energy of 4 MeV and another rated at 3 MeV — but Newson wanted a bigger accelerator for bigger experiments.

    “Because of his work on the Manhattan Project, Newson understood how many people could effectively use a big facility like the tandem Van de Graaff,” Roberson said. “He knew Duke couldn’t provide that many people. But by dividing it up among the three universities, we were able to establish a very significant faculty presence with a large number of graduate students and make it one of the top accelerator and nuclear facilities in the country.”

    Jim Koster (NCSU), Scott Wilburn and Paul Huffman in the tandem Van de Graaff control room, circa early 1990s.

    Originally, the focus of TUNL was nuclear structure. Newson, who directed the lab from 1968 until his death in 1978, used a high-resolution neutron beam to study the atomic nucleus. Later, Duke professor Edward Bilpuch modified the equipment to produce a proton beam, which he and colleagues used in a series of well-known experiments to study isobaric analogue states of the nucleus with ultrafine energy resolution.

    Parts of the massive Van de Graaff that arrived in 1966 are still being used by TUNL physicists, but over the years the lab has broadened its focus, said current director Calvin Howell, a Duke professor who did his graduate work at TUNL in the 1980s.

    The lab’s evolution often followed the interests and technical innovations of faculty members. For example, when UNC professor Tom Clegg built a polarized ion beam at TUNL in 1986, other faculty members and students caught his enthusiasm and used it for their own experiments.

    A later innovation, the HIGS (High-Intensity Gamma-ray Source), driven by a free-electron laser housed in a separate building behind the original TUNL, is now the world’s most intense polarized gamma-ray beam.


    “That’s been the history of TUNL—new people come in with new ideas and new technology and techniques, and they don’t just hoard those things for themselves,” Howell said. “The collaboration and the synergy between faculty members works beautifully. We don’t have institutional boundaries.”

    Today, TUNL physicists are pushing scientific frontiers in several areas: studying strong-interaction physics to better understand the structure of nuclei and nucleons (protons and neutrons); modeling nuclear reactions in stars; and delving into the fundamental nature of neutrinos to discover whether these chargeless particles serve as their own antiparticles and how they may have played a role in the processes that generated the visible matter in the universe.

    But TUNL leaders all agree that one of the lab’s most important contributions has been educating the next generation of scientists. (Learn more about Duke’s TUNL alumni.)

    “We’ve continued to be one of the more significant laboratories in the country in terms of producing students,” Roberson said. “Many of our graduate students go into industry, the national labs and universities. At one time, there were 35 graduates from TUNL working at Los Alamos National Lab.”

    “The record speaks for itself in the outstanding scientists we have produced at the Ph.D. level,” Howell said. “In the last 15 years, we’ve also put considerable effort into creating opportunities for undergraduates.”

    The NSF-funded Research Experience for Undergraduates (REU) program supports 10-12 undergraduates from around the country each summer to work and learn at TUNL. The lab also collaborates now with Duke’s high-energy program to allow REU students to spend the summer at the Large Hadron Collider at CERN in Switzerland.

    There are other examples of universities that tried to create shared physics laboratories but were not able to work together as a team to make it happen, according to Steve Shafroth, who came to UNC and TUNL in 1967.

    “TUNL is such a unique thing, with the three universities collaborating like that and staying friends,” Shafroth added with a laugh. “You know, with the basketball rivalry and all so strong.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    Duke Campus

    Younger than most other prestigious U.S. research universities, Duke University consistently ranks among the very best. Duke’s graduate and professional schools — in business, divinity, engineering, the environment, law, medicine, nursing and public policy — are among the leaders in their fields. Duke’s home campus is situated on nearly 9,000 acres in Durham, N.C., a city of more than 200,000 people. Duke also is active internationally through the Duke-NUS Graduate Medical School in Singapore, Duke Kunshan University in China and numerous research and education programs across the globe. More than 75 percent of Duke students pursue service-learning opportunities in Durham and around the world through DukeEngage and other programs that advance the university’s mission of “knowledge in service to society.”

  • richardmitnick 7:36 am on October 26, 2015 Permalink | Reply
    Tags: Energy

    From EPFL: “An innovative response to the challenge of storing renewable energy” 

    EPFL bloc

    École Polytechnique Fédérale de Lausanne

    Emmanuel Barraud

    Inside the Container. Alain Herzog/EPFL

    A system for managing and storing energy, developed by EPFL’s Distributed Electrical Systems Laboratory, has been inaugurated on the school’s campus. The system, which received extensive co-financing from the Canton of Vaud, is built around an industrial-capacity battery developed by Vaud-based company Leclanché. It is now connected to the Romande Energie-EPFL solar park and will be used to conduct real-world tests on the behavior of a power grid that is fed electricity from solar panels.

    The experimental storage system was inaugurated today and is now connected to the Romande Energie-EPFL solar park, one of the largest in French-speaking Switzerland. Researchers will use it to study new, industrial-scale solutions for using renewable energies (especially solar energy) and feeding them into the power distribution grid, as part of the ‘EPFL Smart Grid’ project.

    Useful life far above average

    The system, which is the size of a shipping container, is unique for its underlying technology: it is based on high-performance lithium-ion titanate cells manufactured by Vaud-based company Leclanché. These cells last around 15,000 charge-discharge cycles, compared with a more typical 3,000. In addition, the cells come with ceramic separators, patented by Leclanché, that are meant to maximize safety. It is a fully integrated solution comprising storage and energy-conversion modules, as well as software that lets the battery communicate with the EPFL engineers.

    Real-world testing

    The system will be used to test the research being carried out by Professor Mario Paolone, head of EPFL’s Distributed Electrical Systems Laboratory. It will be able to hold up to 500 kWh, the equivalent of the average energy consumed by fifty Swiss households over the course of one day, while managing variations in power as a function of available sunshine. “The ability to connect reliable energy storage solutions to the grid is key for incorporating renewable energy sources in our energy mix,” said Dr. Paolone. “Because of the system’s high capacity, we will be able for the first time to carry out real-world tests on the new control methods offered by the smart grids developed at EPFL.”
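    The household equivalence quoted above follows from quick arithmetic; the implied ~10 kWh/day per household is consistent with commonly cited averages for Swiss homes, but the per-household figure here is derived from the article's numbers, not stated in it.

    ```python
    # Checking the article's equivalence: a 500 kWh battery versus the daily
    # consumption of fifty households. The per-household figure is derived
    # from the article's own numbers for illustration.
    battery_capacity_kwh = 500
    households = 50

    kwh_per_household_per_day = battery_capacity_kwh / households
    print(kwh_per_household_per_day)  # 10.0 kWh per household per day
    ```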

    A project co-financed by Canton of Vaud

    This project received extensive financial support from the Canton of Vaud. As part of its “100 million for renewable energies and energy efficiency” program, the Canton allocated some two million francs to Dr. Paolone’s team. These funds are from the R&D component of that program, which, in addition to EPFL, is providing support to the School of Business and Engineering in Yverdon-les-Bains and the University of Lausanne. “This project represents an important milestone in the implementation of our energy policy, one of the objectives of which is to develop renewable energy resources at the local level,” said Jacqueline de Quattro, State Councilor and Head of the Department of Territorial Planning and the Environment.

    The research involving the new system is set to last 23 months and will optimize the functioning of its various components, its management, and its interoperability with an integrated electricity production and distribution network (i.e., a smart grid).

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    EPFL is Europe’s most cosmopolitan technical university, welcoming students, professors and staff of more than 120 nationalities. With both a Swiss and an international vocation, it is guided by a constant drive for openness; its missions of teaching, research and partnership reach many circles: universities and engineering schools, developing and emerging countries, secondary schools and gymnasiums, industry and the economy, political circles and the general public.

  • richardmitnick 4:33 pm on October 13, 2015 Permalink | Reply
    Tags: Energy

    From Rutgers: “Rutgers, Brookhaven National Laboratory Get $12M for Advanced Materials Effort” 

    Rutgers University

    October 12, 2015
    Carl Blesch

    Advanced materials research could lead to more efficient batteries and other energy-saving technologies.
    Photo: Shutterstock

    The U.S. Department of Energy has awarded a four-year, $12 million grant to establish a new research center, led by a Rutgers professor, to accelerate the development of materials that improve energy efficiency and boost energy production.

    The center will be hosted by the U.S. Department of Energy’s Brookhaven National Laboratory in Upton, N.Y., and led by Gabriel Kotliar, Board of Governors Professor in the Department of Physics and Astronomy, School of Arts and Sciences, at Rutgers University. Kotliar also holds a part-time position at Brookhaven Lab.

    Gabriel Kotliar

    “This is a huge new initiative by the Department of Energy,” said Robert Bartynski, chair of Rutgers’ Department of Physics and Astronomy. “Rutgers is among an elite group of universities and labs to contribute to this effort, and the department’s award formalizes a strong and growing collaboration between Rutgers and Brookhaven National Laboratory.”

    The team’s research will focus on developing advanced materials for high-temperature superconductors and other energy initiatives, including technologies that convert heat to electricity to increase energy resources and reduce reliance on fossil fuels.

    “Developing tools to increase our understanding of these most interesting substances could result in the development of important new technologies, such as better thermoelectric materials for conversion of heat to electricity and more efficient batteries for cars and electronic devices,” said Kotliar.

    The new endeavor, called the Center for Computational Design of Functional Strongly Correlated Materials and Theoretical Spectroscopy, will develop software and databases that catalog the essential physics and chemistry of these materials to help other researchers and industrial scientists develop useful new materials more quickly. Brookhaven Lab will also use its experimental facilities to validate the researchers’ theoretical predictions.

    Another Rutgers physics professor, Kristjan Haule, will lead a Rutgers-based lab that supports the center’s research, including development of simulation tools to predict properties of materials, and a database of such simulations for useful materials such as thermoelectrics.

    Kristjan Haule

    The center is one of three funded by the Department of Energy at national laboratories and universities nationwide in support of the U.S. Government’s Materials Genome Initiative (MGI), a multi-agency effort to reduce the time from discovery to deployment of new advanced materials, with the goal of revitalizing American manufacturing. The department’s total funding for all three centers will be $32 million over four years.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    Rutgers Seal

  • richardmitnick 10:23 am on August 15, 2015 Permalink | Reply
    Tags: Energy

    From U Chicago: “Copper clusters capture and convert carbon dioxide to make fuel” 

    U Chicago bloc

    University of Chicago

    August 14, 2015
    Payal Marathe

    A copper tetramer catalyst created by researchers at Argonne National Laboratory may help capture and convert carbon dioxide in a way that ultimately saves energy. It consists of small clusters of four copper atoms each, supported on a thin film of aluminum oxide. These catalysts work by binding to carbon dioxide molecules, orienting them in a way that is ideal for chemical reactions.
    Courtesy of Larry Curtiss

    Capture and convert: This is the motto of carbon dioxide reduction—a process that stops the greenhouse gas before it escapes from chimneys and power plants into the atmosphere and instead turns it into a useful product.

    One possible end product is methanol, a liquid fuel and the focus of a recent study conducted at Argonne National Laboratory. The chemical reactions that make methanol from carbon dioxide rely on a catalyst to speed up the conversion, and Argonne scientists identified a new material that could fill this role. With its unique structure, this catalyst can capture and convert carbon dioxide in a way that ultimately saves energy.

    They call it a copper tetramer.

    It consists of small clusters of four copper atoms each, supported on a thin film of aluminum oxide. These catalysts work by binding to carbon dioxide molecules, orienting them in a way that is ideal for chemical reactions. The structure of the copper tetramer is such that most of its binding sites are open, which means it can attach more strongly to carbon dioxide and can better accelerate the conversion.

    The current industrial process to reduce carbon dioxide to methanol uses a catalyst of copper, zinc oxide and aluminum oxide. A number of its binding sites are occupied merely in holding the compound together, which limits how many atoms can catch and hold carbon dioxide.

    “With our catalyst, there is no inside,” said paper co-author Stefan Vajda, a senior chemist at Argonne and a fellow of the University of Chicago’s Institute for Molecular Engineering. “All four copper atoms are participating because with only a few of them in the cluster, they are all exposed and able to bind.”

    To compensate for a catalyst with fewer binding sites, the current method of reduction creates high-pressure conditions to facilitate stronger bonds with carbon dioxide molecules. But compressing gas into a high-pressure mixture takes a lot of energy.

    The benefit of enhanced binding is that the new catalyst requires lower pressure and less energy to produce the same amount of methanol.

    Carbon dioxide emissions are an ongoing environmental problem, and according to the authors, it’s important that research identifies optimal ways to deal with the waste.

    “We’re interested in finding new catalytic reactions that will be more efficient than the current catalysts, especially in terms of saving energy,” said Larry Curtiss, an Argonne distinguished fellow who co-authored this paper.

    Copper tetramers could allow for the capture and conversion of carbon dioxide on a larger scale—reducing an environmental threat and creating a useful product like methanol, which can be transported and burned for fuel.

    The catalyst still has a long journey ahead from the lab to industry. Potential obstacles include instability and figuring out how to manufacture mass quantities. There’s a chance that copper tetramers may decompose when put to use in an industrial setting, so ensuring long-term durability is a critical step for future research, Curtiss said. And while the scientists needed only nanograms of the material for this study, that number would have to be multiplied dramatically for industrial purposes.

    Meanwhile, researchers are interested in searching for other catalysts that might even outperform their copper tetramer. These catalysts can be varied in size, composition and support material, resulting in a list of more than 2,000 potential combinations, Vajda said.

    But the scientists don’t have to run thousands of different experiments, said Peter Zapol, an Argonne physicist and co-author of this paper. Instead, they will use advanced calculations to make predictions [simulations], and then test the catalysts that seem most promising.

    “We haven’t yet found a catalyst better than the copper tetramer, but we hope to,” Vajda said. “With global warming becoming a bigger burden, it’s pressing that we keep trying to turn carbon dioxide emissions back into something useful.”

    For this research, the team used the Center for Nanoscale Materials as well as beamline 12-ID-C of the Advanced Photon Source [APS].

    ANL APS interior
    APS at ANL

    Curtiss said the Advanced Photon Source allowed the scientists to observe ultra-low loadings of their small clusters, down to a few nanograms, which was a critical piece of this investigation.

    The study, Carbon dioxide conversion to methanol over size-selected Cu4 clusters at low pressures, was published in the Journal of the American Chemical Society and was funded by the DOE’s Office of Basic Energy Sciences. Co-authors also included researchers from the University of Freiburg and Yale University.

    This article first appeared on the Argonne National Laboratory website.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

  • richardmitnick 7:34 am on April 23, 2015 Permalink | Reply
    Tags: Energy

    From ANU: “Australia can cut emissions and grow its economy” 

    ANU Australian National University Bloc

    Australian National University

    22 April 2015
    No Writer Credit


    Australia can make deep cuts to its carbon emissions and move to full renewable energy for its electricity supply at a relatively low cost, an ANU report has found.

    The report, written by Associate Professor Frank Jotzo and PhD scholar Luke Kemp, reviews the evidence from major studies over the past eight years.

    It finds that the cost estimates for Australia reaching ambitious emissions reduction goals came down in every successive major report.

    “Deep cuts to Australia’s emissions can be achieved, at a low cost,” said Associate Professor Jotzo, director of the ANU Centre for Climate Economics and Policy at the Crawford School of Public Policy.

    Australia has committed to cut greenhouse gas emissions to five per cent below year 2000 levels by 2020, and is due in coming months to decide on emissions reduction targets for after 2020.

    Australia has among the world’s highest per-capita carbon emissions, due to a heavy reliance on coal for electricity generation.

    Associate Professor Jotzo’s report, commissioned by WWF Australia (World Wildlife Fund), found the cost of moving to renewable energy was becoming cheaper, and strong climate action could be achieved while maintaining economic growth.

    “At the heart of a low-carbon strategy for Australia is a carbon-free power system,” he said.

    “Australia has among the best prerequisites in the world for moving to a fully renewable energy electricity supply.”

    He said the costs of carbon-free technology, such as wind and solar power, have fallen faster than expected.

    “For example, large-scale solar panel power stations are already only half the cost that the Treasury’s 2008 and 2011 modelling studies estimated they would be in the year 2030,” he said.

    The report is available at the WWF Australia website.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ANU Campus

    ANU is a world-leading university in Australia’s capital city, Canberra. Our location points to our unique history, ties to the Australian Government and special standing as a resource for the Australian people.

    Our focus on research as an asset, and an approach to education, ensures our graduates are in demand the world over for their ability to understand complex contemporary challenges and to apply vision and creativity to addressing them.

  • richardmitnick 9:06 am on April 22, 2015 Permalink | Reply
    Tags: Energy

    From NOVA: “The EPA’s Natural Gas Problem” 



    11 Feb 2015
    Phil McKenna

    When U.S. President Barack Obama recently announced plans to rein in greenhouse gas emissions from oil and gas production, the opposing drumbeats from industry and environmental groups were as fast as they were relentless. The industry group America’s Natural Gas Alliance bombarded Twitter with paid advertisements stating how little their industry actually emits. Press releases from leading environmental organizations deploring the plan’s reliance on largely voluntary actions flooded email inboxes.

    Opposition to any new regulation by industry, however, isn’t as lockstep as its lobbying groups would have us believe. At the same time, environmentalists’ focus on voluntary versus mandatory measures misses a much graver concern.

    The White House and EPA are seeking to regulate methane emissions from the oil and gas industry.

    The joint White House and U.S. Environmental Protection Agency proposal would reduce emissions of methane, the primary component of natural gas, by 40–45% from 2012 levels in the coming decade. It’s a laudable goal. While natural gas is relatively clean burning—emitting roughly half the amount of carbon dioxide per unit of energy as coal—it is an incredibly potent greenhouse gas if it escapes into the atmosphere unburned.

    Methane emissions from the oil and gas sector are estimated to be equivalent to the pollution from 180 coal-fired power plants, according to studies done by the Environmental Defense Fund (EDF), an environmental organization. Yet there is a problem: despite that estimate, no one, including EDF, knows for certain how much methane the oil and gas industry actually emits.

    The EPA publishes an annual inventory of U.S. Greenhouse Gas emissions, which it describes as “the most comprehensive accounting of total greenhouse gas emissions for all man-made sources in the United States.” But their estimates for the natural gas industry are, by their own admission, outdated, based on limited data, and likely significantly lower than actual emissions.

    The Baseline

    Getting the number right is extremely important as it will serve as the baseline for any future reductions. “The smaller the number they start with, the smaller the amount they have to reduce in coming years by regulation,” says Anthony Ingraffea, a professor of engineering at Cornell University in Ithaca, New York. “A 45% reduction on a rate that is too low will be a very small reduction. From a scientific perspective, this doesn’t amount to a hill of beans.”

    Ingraffea says methane emissions are likely several times higher than what the EPA estimates. (Currently, the EPA says that up to 1.8% of the natural gas distributed and produced in the U.S. escapes to the atmosphere.) Even if Ingraffea is right, it’s still a small percentage, but methane’s potency as a greenhouse gas makes even a small release incredibly significant. Over 100 years, methane traps 34 times more heat in the atmosphere than carbon dioxide. If you are only looking 20 years into the future, a time frame given equal weight by the United Nations’ Intergovernmental Panel on Climate Change, methane is 86 times more potent than carbon dioxide.
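    The two potency figures translate directly into CO2-equivalent emissions. A minimal sketch, using the article's global warming potential (GWP) values and an illustrative one-tonne release (not a figure from the article):

    ```python
    # CO2-equivalent of a methane release under the two time horizons quoted
    # in the text: GWP-100 = 34, GWP-20 = 86. The 1-tonne release is an
    # illustrative assumption, not a measured value.
    GWP = {"100-year": 34, "20-year": 86}

    methane_tonnes = 1.0  # hypothetical release
    for horizon, gwp in GWP.items():
        co2e = methane_tonnes * gwp
        print(f"{horizon}: {co2e:.0f} tonnes CO2-equivalent")
    ```

    The spread between the two horizons is why the choice of time frame matters so much in debates over natural gas: the same leak looks two and a half times worse when judged over 20 years instead of 100.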

    After being damaged during Hurricane Ike in September 2008, a natural gas tank spews methane near Sabine Pass, Texas.

    If Ingraffea is right, the amount of methane released into the atmosphere from oil and gas wells, pipelines, and processing and storage facilities has a warming effect approaching that of the country’s 557 coal-fired power plants. Reducing such a high rate of emissions by 40–45% would certainly help stall climate change. It would also likely be much more difficult to achieve than the cuts industry and environmental groups are currently debating.

    Ingraffea first called attention to what he and others believe are EPA underestimates in 2011 when he published a highly controversial paper along with fellow Cornell professor Robert Howarth. Their research suggested the amount of methane emitted by the natural gas industry was so great that relying on natural gas was actually worse for the climate than burning coal.

    Following the recent White House and EPA announcement, industry group America’s Natural Gas Alliance (ANGA) stated that they have reduced emissions by 17% since 1990 while increasing production by 37%. “We question why the administration would single out our sector for regulation, given our demonstrated reductions,” the organization wrote in a press release following the White House’s proposed policies. ANGA bases its emissions reduction on the EPA’s own figures and stands by the data. “We like to have independent third party verification, and we use the EPA’s figures for that,” says ANGA spokesman Daniel Whitten.

    Shifting Estimates

    But are the EPA estimates correct, and are they sufficiently independent? To come up with its annual estimate, the EPA doesn’t make direct measurements of methane emissions each year. Rather, it multiplies emission factors (the volume of gas thought to be emitted by a particular source, like a mile of pipeline or a belching cow) by the number of such sources in a given area. For the natural gas sector, emission factors are based on a limited number of measurements conducted in the early 1990s in industry-funded studies.
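    The bottom-up method described above can be sketched in a few lines. The source categories, factors, and counts below are made-up illustrative numbers, not actual EPA values; the point is only the structure of the calculation, and why any error in a factor scales across every source it is applied to.

    ```python
    # Bottom-up inventory sketch: total = sum over source types of
    # (emission factor x activity count). All numbers are illustrative.
    emission_factors = {      # gas emitted per unit of activity (hypothetical)
        "pipeline_mile": 0.5,
        "wellhead": 2.0,
    }
    activity_counts = {       # number of each source type (hypothetical)
        "pipeline_mile": 1000,
        "wellhead": 300,
    }

    total = sum(emission_factors[src] * activity_counts[src]
                for src in emission_factors)
    print(total)  # 0.5*1000 + 2.0*300 = 1100.0
    ```

    Because the per-source factor multiplies the full activity count, an emission factor that is even modestly too low shifts the whole inventory downward, which is the crux of the dispute over the EPA's 2013 revision.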

    In 2010 the EPA increased its emissions factors for methane from the oil and natural gas sector, citing “outdated and potentially understated” emissions. The end result was a more than doubling of its annual emissions estimate from the prior year. In 2013, however, the EPA reversed course, lowering estimates for key emissions factors for methane at wells and processing facilities by 25–30%. When reached for comment, the EPA pointed me to their existing reports.

    The change was not driven by better scientific understanding but by political pressure, Howarth says. “The EPA got huge pushback from industry and decreased their emissions again, and not by collecting new data.” The EPA states that the reduction in emissions factors was based on “a significant amount of new information” that the agency received about the natural gas industry.

    However, a 2013 study published in the journal Geophysical Research Letters concludes that “the main driver for the 2013 reduction in production emissions was a report prepared by the oil and gas industry.” The report was a non-peer reviewed survey of oil and gas companies conducted by ANGA and the American Petroleum Institute.

    The EPA’s own inspector general released a report that same year that was highly critical of the agency’s estimates of methane and other harmful gases. “Many of EPA’s existing oil and gas production emission factors are of questionable quality because they are based on limited and/or low quality data.” The report concluded that the agency likely underestimates emissions, which “hampers [the] EPA’s ability to accurately assess risks and air quality impacts from oil and gas production activities.”


    Soon after the EPA lowered its emissions estimates, a number of independent studies based on direct measurements found higher methane emissions. In November 2013, a study based on direct measurements of atmospheric methane concentrations across the United States concluded actual emissions from the oil and gas sector were 1.5 times higher than EPA estimates. The study authors noted, “the US EPA recently decreased its methane emission factors for fossil fuel extraction and processing by 25–30% but we find that [methane] data from across North America instead indicate the need for a larger adjustment of the opposite sign.”

    In February 2014, a study published in the journal Science reviewed 20 years of technical literature on natural gas emissions in the U.S. and Canada and concluded that “official inventories consistently underestimate actual CH4 emissions.”

    “When you actually go out and measure methane emissions directly, you tend to come back with measurements that are higher than the official inventory,” says Adam Brandt, lead author of the study and an assistant professor of energy resources engineering at Stanford University. Brandt and his colleagues did not attempt to make an estimate of their own, but stated that in a worst-case scenario total methane emissions from the oil and gas sector could be three times higher than the EPA’s estimate.

    On January 22, eight days after the White House’s announcement, another study found similarly high emissions from a sector of the natural gas industry that is often overlooked. The study made direct measurements of methane emissions from natural gas pipelines and storage facilities in and around Boston, Massachusetts, and found that they were 3.9 times higher than the EPA’s estimate for the “downstream” sector, the parts of the system that transmit, distribute, and store natural gas.

    Most natural gas leaks are small, but large ones can have catastrophic consequences. The wreckage above was caused by a leak in San Bruno, California, in 2010.

    Boston’s aging, leak-prone, cast-iron pipelines likely make the city more leaky than most, but the high volume of emissions—losses around the city total roughly $1 billion worth of natural gas per decade—is nonetheless surprising. The majority of methane emissions were previously believed to occur “upstream” at wells and processing facilities. Efforts to curb emissions, including the recent goals set by the White House, have overlooked the smaller pipelines that deliver gas to end users.

    “Emissions from end users have been only a very small part of conversation on emissions from natural gas,” says lead author Kathryn McKain, an atmospheric scientist at Harvard University. “Our findings suggest that we don’t understand the underlying emission processes, which is essential for creating effective policy for reducing emissions.”

    The Boston study was one of 16 recent or ongoing studies coordinated by EDF to try to determine just how much methane is actually being emitted from the industry as a whole. Seven studies, focusing on different aspects of oil and gas industry infrastructure, have been published thus far. Two of the studies, including the recent Boston study, have found significantly higher emission rates. One study, conducted in close collaboration with industry, found lower emissions. EDF says it hopes to have all studies completed by the end of 2015. The EPA told me it will take the studies into account for possible changes in its current methane emission factors.

    Fraction of a Percent

    EDF is simultaneously working with industry to try to reduce methane emissions. A recent study commissioned by the environmental organization concluded the US oil and gas industry could cut methane emissions by 40% from projected 2018 levels at a cost of less than one cent per thousand cubic feet of natural gas, which today sells for about $5. The reductions could be achieved with existing emissions-control technologies and policies.

    “We are talking about one third or one fourth of a percent of the price of gas to meet these goals,” says Steven Hamburg, chief scientist for EDF. The 40–45% reduction goal recently announced by the White House is nearly identical to the level of cuts analyzed by EDF. To achieve the reduction, the White House proposes mandatory changes in new oil and gas infrastructure as well as voluntary measures for existing infrastructure.
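
    Hamburg’s “fraction of a percent” follows from simple division, using the article’s figures: less than one cent of abatement cost per thousand cubic feet, against a sale price of roughly $5.

```python
# Abatement cost as a fraction of the natural gas sale price.
# Figures from the article; both are per thousand cubic feet (Mcf).
abatement_cost_per_mcf = 0.01  # dollars, upper bound from the EDF study
gas_price_per_mcf = 5.00       # dollars, the price quoted in the article

fraction = abatement_cost_per_mcf / gas_price_per_mcf
print(f"Abatement cost is {fraction:.1%} of the gas price")
```

    At the $5 price this upper bound works out to 0.2% of the gas price; at the lower prices gas often fetches, the same cent approaches the quarter- to third-of-a-percent range Hamburg cites.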

    Thomas Pyle, president of the Institute for Energy Research, an industry organization, says industry is already reducing its methane emissions and doesn’t need additional rules. “It’s like regulating ice cream producers not to spill any ice cream during the ice cream making process,” he says. “It is self-evident for producers to want to capture this product with little or no emissions and make money from it.”

    Unlike making ice cream, however, natural gas producers often vent their product intentionally as part of the production process. One of the biggest sources of methane emissions in natural gas production is gas that is purposely vented from pneumatic devices, which use pressurized methane to open and close valves and operate pumps. They typically release, or “bleed,” small amounts of gas during their operation.

    Such equipment is widely used throughout the natural gas extraction, processing, and transmission process. A recent study by the Natural Resources Defense Council (NRDC) estimates that natural gas-driven pneumatic equipment vents 1.6–1.9 million metric tons of methane each year. The figure accounts for nearly one-third of all methane lost by the natural gas industry, as estimated by the EPA.

    A natural gas distribution facility

    “Low-bleed” or “zero-bleed” controllers are available, though they are more expensive. The latter use compressed air or electricity to operate instead of pressurized natural gas, or they capture methane that would otherwise be vented and reuse it. “Time and time again we see that we can operate this equipment without emissions or with very low emissions,” Hamburg says. Increased monitoring and repair of unintended leaks at natural gas facilities could cut the industry’s methane emissions by an additional third, according to the NRDC study.

    Environmental organizations have come out in strong opposition to the lack of mandatory regulations for existing infrastructure, which will account for nearly 90% of methane emissions in 2018, according to a recent EDF report.

    While industry groups oppose mandatory regulations on new infrastructure, at least one industry leader isn’t concerned. “I don’t believe the new regulations will hurt us at all,” says Mark Boling, an executive vice president at Houston-based Southwestern Energy Company, the nation’s fourth largest producer of natural gas.

    Boling says leak monitoring and repair programs his company initiated in late 2013 will pay for themselves in 12 to 18 months through reduced methane losses. He says the company has also replaced a number of pneumatic devices with zero-bleed, solar-powered electric pumps. Southwestern Energy is now testing air compressors powered by fuel cells to replace additional methane-bleeding equipment, Boling says. In November, Southwestern Energy launched ONE Future, a coalition of companies from across the natural gas industry whose goal is to lower the industry’s methane emissions rate below one percent.

    Based on the EPA emissions rate of 1.8% and fixes identified by EDF and NRDC, their goal seems attainable. But what if the actual emissions rate is significantly higher, as Howarth and Ingraffea have long argued and recent studies seem to suggest? “We can sit here and debate whose numbers are right, ‘Is it 4%? 8%? Whatever,’ ” Boling says. “But there are cost effective opportunities out there to reduce emissions, and we need to step up and do it.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

  • richardmitnick 3:52 pm on March 9, 2015 Permalink | Reply
    Tags: , , , , Energy   

    From Caltech: “One Step Closer to Artificial Photosynthesis and ‘Solar Fuels’” 


    Ker Than

    Ke Sun, a Caltech postdoc in the lab of George L. Argyros Professor and Professor of Chemistry Nate Lewis, peers into a sample of a new, protective film that he has helped develop to aid in the process of harnessing sunlight to generate fuels.
    Credit: Lance Hayashida/Caltech Marcomm

    Caltech scientists, inspired by a chemical process found in leaves, have developed an electrically conductive film that could help pave the way for devices capable of harnessing sunlight to split water into hydrogen fuel.

    When applied to semiconducting materials such as silicon, the nickel oxide film prevents rust buildup and facilitates an important chemical process in the solar-driven production of fuels such as methane or hydrogen.

    “We have developed a new type of protective coating that enables a key process in the solar-driven production of fuels to be performed with record efficiency, stability, and effectiveness, and in a system that is intrinsically safe and does not produce explosive mixtures of hydrogen and oxygen,” says Nate Lewis, the George L. Argyros Professor and professor of chemistry at Caltech and a coauthor of a new study describing the film, published the week of March 9 in the online edition of the journal Proceedings of the National Academy of Sciences.

    The development could help lead to safe, efficient artificial photosynthetic systems—also called solar-fuel generators or “artificial leaves”—that replicate the natural process of photosynthesis that plants use to convert sunlight, water, and carbon dioxide into oxygen and fuel in the form of carbohydrates, or sugars.

    The artificial leaf that Lewis’ team is developing in part at Caltech’s Joint Center for Artificial Photosynthesis (JCAP) consists of three main components: two electrodes—a photoanode and a photocathode—and a membrane. The photoanode uses sunlight to oxidize water molecules to generate oxygen gas, protons, and electrons, while the photocathode recombines the protons and electrons to form hydrogen gas. The membrane, which is typically made of plastic, keeps the two gases separate in order to eliminate any possibility of an explosion, and lets the gas be collected under pressure to safely push it into a pipeline.

    Scientists have tried building the electrodes out of common semiconductors such as silicon or gallium arsenide—which absorb light and are also used in solar panels—but a major problem is that these materials develop an oxide layer (that is, rust) when exposed to water.

    Lewis and other scientists have experimented with creating protective coatings for the electrodes, but all previous attempts have failed for various reasons. “You want the coating to be many things: chemically compatible with the semiconductor it’s trying to protect, impermeable to water, electrically conductive, highly transparent to incoming light, and highly catalytic for the reaction to make oxygen and fuels,” says Lewis, who is also JCAP’s scientific director. “Creating a protective layer that displayed any one of these attributes would be a significant leap forward, but what we’ve now discovered is a material that can do all of these things at once.”

    The team has shown that its nickel oxide film is compatible with many different kinds of semiconductor materials, including silicon, indium phosphide, and cadmium telluride. When applied to photoanodes, the nickel oxide film far exceeded the performance of other similar films—including one that Lewis’s group created just last year. That film was more complicated—it consisted of two layers versus one and used as its main ingredient titanium dioxide (TiO2, also known as titania), a naturally occurring compound that is also used to make sunscreens, toothpastes, and white paint.

    “After watching the photoanodes run at record performance without any noticeable degradation for 24 hours, and then 100 hours, and then 500 hours, I knew we had done what scientists had failed to do before,” says Ke Sun, a postdoc in Lewis’s lab and the first author of the new study.

    Lewis’s team developed a technique for creating the nickel oxide film that involves smashing atoms of argon into a pellet of nickel atoms at high speeds, in an oxygen-rich environment. “The nickel fragments that sputter off of the pellet react with the oxygen atoms to produce an oxidized form of nickel that gets deposited onto the semiconductor,” Lewis says.

    Crucially, the team’s nickel oxide film works well in conjunction with the membrane that separates the photoanode from the photocathode and staggers the production of hydrogen and oxygen gases.

    “Without a membrane, the photoanode and photocathode are close enough to each other to conduct electricity, and if you also have bubbles of highly reactive hydrogen and oxygen gases being produced in the same place at the same time, that is a recipe for disaster,” Lewis says. “With our film, you can build a safe device that will not explode, and that lasts and is efficient, all at once.”

    Lewis cautions that scientists are still a long way off from developing a commercial product that can convert sunlight into fuel. Other components of the system, such as the photocathode, will also need to be perfected.

    “Our team is also working on a photocathode,” Lewis says. “What we have to do is combine both of these elements together and show that the entire system works. That will not be easy, but we now have one of the missing key pieces that has eluded the field for the past half-century.”

    Along with Lewis and Sun, additional authors on the paper, “Stable solar-driven oxidation of water by semiconducting photoanodes protected by transparent catalytic nickel oxide films,” include Caltech graduate students Fadl Saadi, Michael Lichterman, Xinghao Zhou, Noah Plymale, and Stefan Omelchenko; William Hale, from the University of Southampton; Hsin-Ping Wang and Jr-Hau He, from King Abdullah University in Saudi Arabia; Kimberly Papadantonakis, a scientific research manager at Caltech; and Bruce Brunschwig, the director of the Molecular Materials Research Center at Caltech. Funding was provided by the Office of Science at the U.S. Department of Energy, the National Science Foundation, the Beckman Institute, and the Gordon and Betty Moore Foundation.


    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
    Caltech buildings

  • richardmitnick 10:00 am on February 26, 2015 Permalink | Reply
    Tags: , Energy,   

    From NYT: “Bill Gates and Other Business Leaders Urge U.S. to Increase Energy Research” 

    The New York Times

    FEB. 23, 2015

    This Duke Energy battery project in Texas, supported by federal research dollars, stores power from wind turbines for later use. A new report calls on the government to increase its spending on energy research.

    The government is spending far too little money on energy research, putting at risk the long-term goals of reducing carbon emissions and alleviating energy poverty, some of the country’s top business leaders argue in a new report.

    The American Energy Innovation Council, a group of six executives that includes the Microsoft co-founder Bill Gates and the General Electric chief Jeffrey R. Immelt, urged Congress and the White House to make expanded energy research a strategic national priority.

    The leaders pointed out that the United States had fallen behind a slew of other countries in the percentage of economic output being spent on energy research, among them China, Japan, France and South Korea. Their report urged leaders of both political parties to start increasing funds to ultimately triple today’s level of research spending, about $5 billion a year.

    “Growing and consistent appropriations for energy innovation should be a top U.S. priority over the next decade,” the business leaders recommended in their report. “The budget numbers over the last five years are a major failure in U.S. energy policy.”

    At stake, Mr. Gates said in an interview, are not just long-term goals like reducing emissions of greenhouse gases, but also American leadership in industries of the future, including advanced nuclear reactors and coal-burning power plants that could capture and bury their emissions.

    “Our universities, our national labs are the best in the world,” Mr. Gates said, but he added that a chronic funding shortfall was holding back the pace of their work.

    The report did credit the Obama administration and Congress with some gains, including a one-time injection of funds in the economic stimulus bill of 2009. But subsequent budgets have essentially dropped back to prior levels, and spending on American energy research remains far below the high point it reached just after the energy crises of the 1970s.

    In the past, the report found, investments in energy innovation have paid major dividends. Mr. Gates cited the example of hydraulic fracturing to unlock gas and oil in shale deposits, a technique developed in part with federal research money that has led to a newfound abundance of oil and gas, lowering prices for consumers.

    Similar innovation is needed in low-emission sources of energy, the report found, if the goal of limiting global warming is to be met while making energy more available to poor people around the world. Experts involved in writing the report said the needed breakthroughs included safer types of nuclear reactors, cheaper methods of capturing carbon dioxide emissions at power plants and improved batteries that can store large amounts of energy.

    The new report is an update on similar recommendations the same business leaders made five years ago. While the report found that the picture remained generally bleak, it did cite some progress.

    For instance, Congress established the Advanced Research Projects Agency-Energy, or ARPA-E, modeled on the Pentagon research agency that helped create the Internet. And the Energy Department has funded a string of energy innovation hubs across the country.

    “There’s some very promising things that are in these centers, but the pace is absolutely limited by the modest funding level,” Mr. Gates said. “Those should be funded at a much higher level.”

    The report pointed out that funding for ARPA-E was less than $300 million per year, and urged that it be raised closer to $1 billion. The entire federal appropriation for energy research is less than Americans spend every year buying potato and tortilla chips, the report noted.

    The recommendations in the report are similar to those made by other groups in recent years. But with the federal budget under pressure, the idea of a major push on energy research has gained little traction in Washington.

    The business leaders hope to change that as the 2016 presidential race gets under way, urging both parties to embrace ambitious research plans.

    Aside from Mr. Gates and Mr. Immelt, the American Energy Innovation Council comprises Norman R. Augustine, a former chairman and chief executive of Lockheed Martin; John Doerr, the Silicon Valley venture capitalist; Chad Holliday, a former chairman and chief executive of DuPont who soon will become chairman of Shell; and Tom Linebarger, chairman and chief executive of Cummins.

    In pushing their case in Washington, the leaders are likely to encounter reluctance on the right to increase government spending, as well as some philosophical objections to expanding the government’s role in the energy market. On the left, they may encounter wariness from environmentalists who, while not opposing new research, do not want that push to detract from rapid deployment of current clean-energy technologies, like wind and solar power.

    “I am 100 percent for more research, since who could possibly oppose that?” said Joseph J. Romm, who helped manage federal energy research in the Bill Clinton administration and later founded a widely read blog on climate change. “But it is only a small part of the answer, and certainly not the most important.”

    He added that aggressive deployment of existing technologies and a price on emissions of carbon dioxide would go a long way to reduce emissions, and that the latter would help unlock more private innovation.


  • richardmitnick 7:24 am on February 17, 2015 Permalink | Reply
    Tags: , , Energy, , ,   

    From physicsworld: “Smaller fusion reactors could deliver big gains” 


    Feb 16, 2015
    Michael Banks

    Hot topic: size may not be everything in tokamak design

    Researchers from the UK firm Tokamak Energy say that future fusion reactors could be made much smaller than previously envisaged – yet still deliver the same energy output. That claim is based on calculations showing that the fusion power gain – a measure of the ratio of the power from a fusion reactor to the power required to maintain the plasma in steady state – does not depend strongly on the size of the reactor. The company’s finding goes against conventional thinking, which says that a large power output is only possible by building bigger fusion reactors.

    The largest fusion reactor currently under construction is the €16bn ITER facility in Cadarache, France.

    ITER Tokamak

    This will weigh about 23,000 tonnes when completed in the coming decade and consist of a deuterium–tritium plasma held in a 60 m-tall, doughnut-shaped “tokamak”. ITER aims to produce a fusion power gain (Q) of 10, meaning that, in theory, the reactor will put out 10 times the power it takes in, producing 500 MW of fusion power from 50 MW of input power. While ITER has a “major” plasma radius of 6.21 m, it is thought that an actual future fusion power plant delivering power to the grid would need a 9 m radius to generate 1 GW.
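
    The quoted numbers are consistent with the definition of the gain, Q = fusion power out divided by heating power in. A minimal check (the function name is mine; the figures are from the article):

```python
def fusion_gain(p_fusion_mw, p_input_mw):
    """Fusion power gain Q = fusion power out / heating power in."""
    return p_fusion_mw / p_input_mw

# ITER design target: 500 MW of fusion power from 50 MW of input power.
print(fusion_gain(500, 50))  # -> 10.0
```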

    Low power brings high performance

    The new study, led by Alan Costley from Tokamak Energy, which builds compact tokamaks, shows that smaller, lower-power, and therefore lower-cost reactors could still deliver a value of Q similar to ITER. The work focused on a key parameter in determining plasma performance called the plasma “beta”, which is the ratio of the plasma pressure to the magnetic pressure. By using scaling expressions consistent with existing experiments, the researchers show that the power needed for high fusion performance can be three or four times lower than previously thought.
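
    The beta parameter can be written as β = p / (B²/2μ₀), where p is the plasma pressure and B²/2μ₀ is the magnetic pressure. A sketch of the calculation (the field and pressure values below are illustrative, not taken from the paper):

```python
import math

MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def plasma_beta(plasma_pressure_pa, b_field_t):
    """beta = plasma pressure / magnetic pressure, B^2 / (2 * mu_0)."""
    magnetic_pressure = b_field_t**2 / (2 * MU_0)
    return plasma_pressure_pa / magnetic_pressure

# Illustrative only: ~1e5 Pa of plasma pressure confined by a 3 T field
# gives a beta of a few percent, typical of conventional tokamaks.
print(f"beta = {plasma_beta(1e5, 3.0):.3f}")
```

    Spherical tokamaks of the kind Tokamak Energy builds can sustain considerably higher beta than conventional designs, which is what allows the required power, and hence the machine size, to drop.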

    Combined with the finding on the size-dependence of Q, these results imply the possibility of building lower-power, smaller and cheaper pilot plants and reactors. “The consequence of beta-independent scaling is that tokamaks could be much smaller, but still have a high power gain,” David Kingham, Tokamak Energy chief executive, told Physics World.

    The researchers propose that a reactor with a radius of just 1.35 m would be able to generate 180 MW, with a Q of 5. This would result in a reactor just 1/20th of the size of ITER. “Although there are still engineering challenges to overcome, this result is underpinned by good science,” says Kingham. “We hope that this work will attract further investment in fusion energy.”

    Many challenges remain

    Howard Wilson, director of the York Plasma Institute at the University of York in the UK, points out, however, that the result relies on being able to achieve a very high magnetic field. “We have long been aware that a high magnetic field enables compact fusion devices – the breakthrough would be in discovering how to create such high magnetic fields in the tokamak,” he says. “A compact fusion device may indeed be possible, provided one can achieve high confinement of the fuel, demonstrate efficient current drive in the plasma, exhaust the heat and particles effectively without damaging material surfaces, and create the necessary high magnetic fields.”

    The work by Tokamak Energy follows an announcement late last year that the US firm Lockheed Martin plans to build a “truck-sized” compact fusion reactor by 2019 that would be capable of delivering 100 MW. However, the latest results from Tokamak Energy might not be such bad news for ITER. Kingham adds that his firm’s work means that, in principle, ITER is actually being built much larger than necessary – and so should outperform its Q target of 10.

    The research is published in Nuclear Fusion.


    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

  • richardmitnick 5:28 pm on February 9, 2015 Permalink | Reply
    Tags: , Energy, , ,   

    From LBL: “New Design Tool for Metamaterials” 


    Berkeley Lab

    February 9, 2015
    Lynn Yarris (510) 486-5375

    Confocal microscopy confirmed that the nonlinear optical properties of metamaterials can be predicted using a theory about light passing through nanostructures.

    Metamaterials – artificial nanostructures engineered with electromagnetic properties not found in nature – offer tantalizing future prospects such as high-resolution optical microscopes and superfast optical computers. To realize the vast potential of metamaterials, however, scientists will need to hone their understanding of the fundamental physics behind them. This will require accurately predicting their nonlinear optical properties – the ways in which intense light alters a material’s response so that, for example, light emerges from the material with a different frequency than when it entered. Help has arrived.

    Scientists with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) Berkeley have shown, using a recent theory for nonlinear light scattering when light passes through nanostructures, that it is possible to predict the nonlinear optical properties of metamaterials.

    “The key question has been whether one can determine the nonlinear behavior of metamaterials from their exotic linear behavior,” says Xiang Zhang, director of Berkeley Lab’s Materials Sciences Division and an international authority on metamaterial engineering who led this study. “We’ve shown that the relative nonlinear susceptibility of large classes of metamaterials can be predicted using a comprehensive nonlinear scattering theory. This will allow us to efficiently design metamaterials with strong nonlinearity for important applications such as coherent Raman sensing, entangled photon generation and frequency conversion.”

    Xiang Zhang, Haim Suchowski and Kevin O’Brien were part of the team that discovered a way to predict the nonlinear optical properties of metamaterials. (Photo by Roy Kaltschmidt)

    Zhang, who holds the Ernest S. Kuh Endowed Chair at UC Berkeley and is a member of the Kavli Energy NanoSciences Institute at Berkeley (Kavli ENSI), is the corresponding author of a paper describing this research in the journal Nature Materials. The paper is titled Predicting nonlinear properties of metamaterials from the linear response. The other authors are Kevin O’Brien, Haim Suchowski, Junsuk Rho, Alessandro Salandrino, Boubacar Kante and Xiaobo Yin.

    The unique electromagnetic properties of metamaterials stem from their physical structure rather than their chemical composition. This structure, for example, provides certain metamaterials with a negative refractive index, an optical property in which the phase front of light moving through a material propagates backward towards the source. The phase front of light moving through natural materials always propagates forward, away from its source.

    Zhang and his group have already exploited the linear optical properties of metamaterials to create the world’s first optical invisibility cloak and mimic black holes. Most recently they used a nonlinear metamaterial with a refractive index of zero to generate “phase mismatch–free nonlinear light,” meaning light waves moved through the material gaining strength in all directions. However, engineering nonlinear metamaterials remains in its infancy, with no general conclusion on the relationship between linear and nonlinear properties.

    Metamaterial arrays whose geometry varied gradually from a symmetric bar to an asymmetric U-shape were used to compare the predictive abilities of Miller’s rule and a nonlinear light scattering theory.

    For the past several decades, scientists have estimated the nonlinear optical properties in natural crystals using a formulation known as “Miller’s rule,” named for the physicist Robert Miller, who authored it. In this new study, Zhang and his group found that Miller’s rule doesn’t work for a number of metamaterials. That’s the bad news. The good news is that a nonlinear light scattering theory, developed for nanostructures by Dutch scientist Sylvie Roke, does.

    “From the linear properties, one calculates the nonlinear polarization and the mode of the nanostructure at the second harmonic,” says Kevin O’Brien, co-lead author of the Nature Materials paper and a member of Zhang’s research group. “We found the nonlinear emission is proportional to the overlap integral between these, not simply determined by their linear response.”

    Zhang, O’Brien, Suchowski, and the other contributors to this study evaluated Miller’s rule and the nonlinear light scattering theory by comparing their predictions to experimental results obtained using a nonlinear stage-scanning confocal microscope.

    “Nonlinear stage-scanning confocal microscopy is critical because it allows us to rapidly measure the nonlinear emission from thousands of different nanostructures while minimizing the potential systematic errors, such as intensity or beam pointing variations, often associated with tuning the wavelength of an ultrafast laser,” O’Brien says.

    The researchers used confocal microscopy to observe the second harmonic generation from metamaterial arrays whose geometry was gradually shifted from a symmetric bar-shape to an asymmetric U-shape. Second harmonic light is a nonlinear optical property in which photons with the same frequency interact with a nonlinear material to produce new photons at twice the energy and half the wavelength of the originals. It was the discovery of optical second harmonic generation in 1961 that started modern nonlinear optics.
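
    The photon arithmetic of second harmonic generation is simple to verify: two photons at the fundamental wavelength combine into one photon at half the wavelength, carrying twice the energy. A sketch (the 1064 nm wavelength is just a common laboratory example, not specific to this experiment):

```python
H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light in vacuum, m/s

def second_harmonic(wavelength_m):
    """Return (wavelength, photon energy) of the second harmonic."""
    shg_wavelength = wavelength_m / 2    # half the original wavelength
    shg_energy = H * C / shg_wavelength  # twice the original photon energy
    return shg_wavelength, shg_energy

wl, energy = second_harmonic(1064e-9)  # a 1064 nm laser doubles to 532 nm
print(f"{wl * 1e9:.0f} nm, {energy:.3e} J per photon")
```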

    “Our results show that nonlinear scattering theory can be a valuable tool in the design of nonlinear metamaterials not only for second-order but also higher order nonlinear optical responses over a broad range of wavelengths,” O’Brien says. “We’re now using these experimental and theoretical techniques to explore other nonlinear processes in metamaterials, such as parametric amplification and entangled photon generation.”

    This research was supported by the DOE Office of Science.


    A U.S. Department of Energy National Laboratory Operated by the University of California

