Tagged: Applied Research & Technology

  • richardmitnick 10:56 am on April 21, 2017 Permalink | Reply
    Tags: Applied Research & Technology, HOUR - Hopkins Office for Undergraduate Research

    From Hopkins: “A new home for undergraduate research at Johns Hopkins” 

    Johns Hopkins University

    4.19.17
    Katie Pearce

    Excerpt from time-lapse photography project by PURA awardee Ambrose Tang. Image credit: Ambrose Tang

    There is no tidy definition for the word “research” when it comes to the varied pursuits taking place across Johns Hopkins University.

    “What it calls to mind for many people is bench science, and obviously Hopkins has a lot of that, but we take a broader view,” says Feilim Mac Gabhann, director of the new Hopkins Office for Undergraduate Research [HOUR]. He says research can take the form of “any creative endeavor that brings new knowledge or new insights that weren’t there before.”

    That expansiveness will be on full display at Friday’s DREAMS showcase, the new office’s inaugural event, which will highlight undergraduate research at the Homewood campus.

    The 250-plus presentations will include, for example, a photography project from Peabody Institute junior Ambrose Tang, who won a grant to explore time-lapse techniques in capturing scenes of his home city of Hong Kong. The end result will present like a film, though he recorded no video—it’s a textured quilt of still photos, with 24 images making up each second of footage.
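
    (At that rate, a single minute of finished footage is stitched together from 1,440 separate photographs; the arithmetic is ours, offered for scale.)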

    Also presenting at the showcase is Danait Yemane, a public health studies senior whose research took the form of cultural investigation. She traveled to Eritrea, where her parents were born, to learn firsthand about the impacts of the country’s compulsory military service. While conscription is meant to be finite, Yemane found it had become a years-long occupation for many Eritreans who shared their stories with her.

    The DREAMS event—which takes place April 21 from 1 to 4 p.m. in Goldfarb Gym—covers disciplines across Johns Hopkins, including engineering, science, medicine, and the arts and humanities. Presentation topics include cancer research, medieval artwork, urban gardening, robotics, and immigration, and presenters include recent awardees of the PURA program, which provides $2,500 research stipends. Both Tang and Yemane were recipients last year.

    The Hopkins Office for Undergraduate Research—HOUR, for short—launched in March under Vice Provost for Research Denis Wirtz with the mission to create a broad, centralized support structure for undergraduates to pursue their research.

    “I believe in the university being a real living thing that is larger than one lab, or one department, or one school,” says Mac Gabhann, who is an associate professor of biomedical engineering and part of JHU’s Institute for Computational Medicine.

    To help connect students to research opportunities both inside and outside of Hopkins, HOUR is building out a new database listing available grants and summer research programs.

    One summer program they’re highlighting is CIRCUIT, which allows undergraduates of any background to take part in brain-mapping research at the Johns Hopkins Applied Physics Laboratory. The program provides a $5,000 stipend for living expenses.

    Such stipends, HOUR recognizes, are critical for reducing the barriers facing fledgling researchers. That’s the principle behind another new offering, the STAR program, which provides $4,000 stipends for summer research of the student’s choice within any Hopkins partnership, in Baltimore or beyond.

    HOUR also intends to serve an educational role of its own. A range of training materials, such as videos and tutorials, will help students learn about every step of the research process, from applying for grants and preparing budgets to making public presentations.

    To stay in the loop with new opportunities from HOUR, follow them on Twitter and Facebook. Registration for the DREAMS event is open through April 19, and though deadlines have passed for the CIRCUIT and STAR programs, new application cycles will open next winter.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Johns Hopkins Campus

    The Johns Hopkins University opened in 1876, with the inauguration of its first president, Daniel Coit Gilman. “What are we aiming at?” Gilman asked in his installation address. “The encouragement of research … and the advancement of individual scholars, who by their excellence will advance the sciences they pursue, and the society where they dwell.”

    The mission laid out by Gilman remains the university’s mission today, summed up in a simple but powerful restatement of Gilman’s own words: “Knowledge for the world.”

    What Gilman created was a research university, dedicated to advancing both students’ knowledge and the state of human knowledge through research and scholarship. Gilman believed that teaching and research are interdependent, that success in one depends on success in the other. A modern university, he believed, must do both well. The realization of Gilman’s philosophy at Johns Hopkins, and at other institutions that later attracted Johns Hopkins-trained scholars, revolutionized higher education in America, leading to the research university system as it exists today.

     
  • richardmitnick 9:23 am on April 21, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Autism studies

    From HMS: “A strengths-based approach to autism” 

    Harvard University

    Harvard Medical School

    April 20, 2017
    Monique Tello, MD, MPH

    At our son’s 18-month checkup five years ago, our pediatrician expressed concern. Gio wasn’t using any words, and would become so frustrated he would bang his head on the ground. Still, my husband and I were in denial. We dragged our feet. Meanwhile, our son grunted and screamed; people said things. Finally we started therapy with early intervention services.

    A few months later, after hundreds of pages of behavior questionnaires for us and hours of testing for Gio, we heard the words: “Your son meets criteria for a diagnosis of autism spectrum disorder…”

    Our journey has taken us through several behavioral approaches with many different providers. Today, Gio is doing very well, in an integrated first grade in public school. He can speak, read, write, and play. His speech and syntax can be hard to understand, but we are thrilled that we can communicate with him.

    The difference between typical and functional

    Longtime autism researcher Laurent Mottron wrote a recent scientific editorial in which he points out that the current approach to treating a child with autism is based on changing them, making them conform, suppressing repetitive behaviors, intervening with any “obsessive” interests. Our family experienced this firsthand. Some of our early behavioral therapists would see Gio lie on the ground to play, his face level with the cars and trucks he was rolling into long rows, and they would tell us, “Make him sit up. No lying down. Let’s rearrange the cars. Tell him, they don’t always have to be in a straight line, Gio!”

    To me, this approach seemed rigid. We don’t all have to act in the exact same way. These kids need to function, not robotically imitate “normal.”

    Why not leverage difference rather than extinguish it?

    We naturally gravitated towards Stanley Greenspan’s “DIR/Floortime” approach, in which therapists and parents follow the child’s lead, using the child’s interests to engage them, and then helping the child to progress and develop.

    Mottron’s research supports Greenspan’s approach: study the child to identify his or her areas of interest. The more intense the interest the better, because that’s what the child will find stimulating. Let them fully explore that object or theme (shiny things? purple things? wheels?) because these interests help the developing brain to figure out the world.

    Then, use that interest as a means to engage with the child, and help them make more connections. Mottron suggests that parents and teachers get on the same level with the child and engage in a similar activity — be it rolling cars and trucks, or lining them up. When the child is comfortable, add in something more. Maybe, make the cars and trucks talk to each other.

    But, don’t pressure the child to join the conversation. Let them be exposed to words, conversations, and songs, without forced social interaction. This is how early language skills can be taught in a non-stressful way, acknowledging and aligning with the autistic brain. The ongoing relationship and engagement will foster communication.

    Basically, both Greenspan and Mottron are advocating methods of teaching autistic children to relate, adapt, and function in the world, without “forcing the autism out of them.”

    The concept of accepting autistic kids as they are, and incorporating the natural ways they think into educational and therapeutic techniques, feels right to me. Gio is different from most kids, and really, he’s not interested in most kids. Our attempts to push him to participate in “fun” group activities like soccer, Easter egg hunts, and birthday parties have all been spectacular failures. Maybe the real failure was ours: by pushing him to “fit in,” we deny his true nature. Yes, the way he thinks is sometimes mysterious to us, but he clearly has great strengths: a remarkable ability to focus and persevere, to experiment with his ideas, and to follow his vision.

    World-renowned autism expert and animal rights activist Temple Grandin (who is herself autistic, and very open about her preference for animal rather than human companionship!) sums up Mottron’s approach perfectly: “The focus should be on teaching people with autism to adapt to the social world around them while still retaining the essence of who they are, including their autism.”

    Sources

    generallymedicine blog post: Screaming Frustration: Our Two-Year-Old Won’t Talk

    generallymedicine blog post: They Dropped the A-Bomb On Us

    generallymedicine blog post: So Our Son Is Autistic… and It’s Going to Be Okay

    generallymedicine blog post: Should we let our kid hang out by himself most of the time?

    generallymedicine blog post: Autism Awareness… and Coolness

    Greenspan, Stanley (2006). Engaging Autism. Da Capo Press. http://www.stanleygreenspan.com

    Mottron, L. Should we change targets and methods of early intervention in autism, in favor of a strengths-based education? European Child & Adolescent Psychiatry, February 2017, e-pub ahead of print.

    Related Information: The Autism Revolution

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    HMS campus

    Established in 1782, Harvard Medical School began with a handful of students and a faculty of three. The first classes were held in Harvard Hall in Cambridge, long before the school’s iconic quadrangle was built in Boston. With each passing decade, the school’s faculty and trainees amassed knowledge and influence, shaping medicine in the United States and beyond. Some community members—and their accomplishments—have assumed the status of legend. We invite you to access the following resources to explore Harvard Medical School’s rich history.

    Harvard University campus

    Harvard is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

     
  • richardmitnick 8:59 am on April 21, 2017 Permalink | Reply
    Tags: Applied Research & Technology, OpenMP (for Multi-Processing) Architecture Review Board (ARB)

    From BNL: “Brookhaven Lab Joins the OpenMP Architecture Review Board” 

    Brookhaven Lab

    April 20, 2017
    Ariana Tantillo
    atantillo@bnl.gov

    Lab to help evolve the standard for OpenMP, the most widely used shared-memory parallel programming model.

    (Left to right) Lingda Li, Abid Malik, and Verinder Rana of Brookhaven Lab’s Computational Science Initiative (CSI) will collaborate with members of the OpenMP Architecture Review Board to help shape the OpenMP programming standard for high-performance computing. Not pictured: Kerstin Kleese van Dam, CSI director, and Barbara Chapman, director of CSI’s Computer Science and Mathematics research team, who led the Brookhaven initiative to join the OpenMP ARB.

    The U.S. Department of Energy’s (DOE) Brookhaven National Laboratory has joined the OpenMP (for Multi-Processing) Architecture Review Board (ARB). This nonprofit technology consortium manages the OpenMP application programming interface specification for parallel programming on shared-memory machines, in which any processor can access data stored in any part of the memory.

    As part of this consortium of leading hardware and software vendors and research organizations, Brookhaven Lab will help shape one of the most widely used programming standards for high-performance computing—the combination of computing power from multiple processors working simultaneously to solve large and complex problems. Brookhaven’s participation in the OpenMP ARB is critical to ensuring the OpenMP standard supports scientific requirements for data analysis, modeling and simulation, and visualization.
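
    For readers unfamiliar with the model, here is a minimal sketch of what OpenMP’s shared-memory parallelism looks like in practice. It is an illustrative example of the standard’s directives, not code from Brookhaven: a single compiler directive splits a loop across threads, all of which read and write the same memory.

    #include <stdio.h>
    #include <omp.h>

    int main(void) {
        enum { N = 1000000 };
        static double data[N];
        double sum = 0.0;

        /* One directive parallelizes the loop: the runtime divides the
           iterations among the threads, and the reduction clause combines
           each thread's partial sum without a data race. */
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < N; i++) {
            data[i] = 0.5 * i;   /* "data" is shared: any thread may touch any element */
            sum += data[i];
        }

        printf("sum = %.1f (up to %d threads)\n", sum, omp_get_max_threads());
        return 0;
    }

    Built with OpenMP enabled (for example, gcc’s -fopenmp flag), the loop runs on however many threads the runtime provides; the source itself stays the same, which is much of the standard’s appeal.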

    “Advancing the frontiers of high-performance and data-intensive computing is central to Brookhaven’s mission in scientific discovery. Our membership in the OpenMP ARB recognizes the importance we place upon OpenMP for our science portfolio, both now and in the future,” said Robert Harrison, chief scientist of the Computational Science Initiative (CSI) at Brookhaven Lab and director of the Institute for Advanced Computational Science at Stony Brook University, which joined OpenMP ARB at the end of 2016.

    Barbara Chapman, director of CSI’s Computer Science and Mathematics research team at Brookhaven and a professor of applied mathematics and statistics and of computer science at Stony Brook, led the initiative to join the OpenMP ARB. Chapman, whose research focuses on programming models for large-scale computing, has been involved with the evolution of OpenMP since 2001.

    Abid Malik, a senior technology engineer on Chapman’s team, and research assistant Verinder Rana will represent Brookhaven during monthly meetings with the ARB. They plan to join several of the subgroups that focus on evolving specific aspects of the OpenMP programming model, including those for computational accelerators (such as graphics processing units, or GPUs), the C++ programming language, and memory management.
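
    As a rough illustration of what the accelerator subgroup works on (a generic sketch of OpenMP 4.x-style device offload, not Brookhaven code), the target directives let the same directive-based style drive a GPU, with map clauses describing data movement between host and device memory:

    /* Scale a vector on an attached accelerator if one is present;
       the OpenMP runtime falls back to the host when no device exists. */
    void scale(float *x, int n, float a) {
        #pragma omp target teams distribute parallel for map(tofrom: x[0:n])
        for (int i = 0; i < n; i++)
            x[i] *= a;
    }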

    Each ARB member organization makes suggestions on how OpenMP should be evolved to meet their specific requirements. In turn, the vendors decide which suggestions to implement, depending on how relevant they are to a wide range of applications.

    According to Malik, OpenMP will benefit from Brookhaven’s expertise in tackling big data challenges, especially those posed by its DOE Office of Science User Facilities—the Center for Functional Nanomaterials, National Synchrotron Light Source II, and Relativistic Heavy Ion Collider.

    BNL Center for Functional Nanomaterials

    BNL NSLS-II

    BNL RHIC Star detector

    BNL RHIC Campus

    Using this expertise, Brookhaven will help advance the OpenMP standard for next-generation supercomputers, which will help scientists tackle increasingly complex problems by performing calculations at unprecedented speed and accuracy.

    “The OpenMP language subgroup is actively working with the scientific community to prepare OpenMP for exascale computing,” said Malik. “Brookhaven’s big data experience will help expand OpenMP to include features useful for porting big data programs on multicore CPUs [central processing units] and GPUs.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 8:40 am on April 21, 2017 Permalink | Reply
    Tags: Applied Research & Technology, OEWaves, Optical micro-oscillator, Optical pendulum, stable and super-accurate clock component, UCLA-led team develops tiny

    From UCLA: “UCLA-led team develops tiny, stable and super-accurate clock component” 

    UCLA

    April 20, 2017
    Matthew Chin

    The micro-oscillator functions analogously to the gears of a clock pendulum. Nicoletta Barolini

    A team of engineering researchers from UCLA and OEwaves has developed an optical micro-oscillator, a key time-keeping component that could vastly improve the accuracy of clocks used in spacecraft, automobile sensing and satellite communications.

    An optical oscillator is similar to a pendulum in a grandfather clock, only instead of a swinging motion to keep time, its “tick” is the laser’s very high frequency, or cycles per second. This “optical pendulum” is a laser light confined in a very quiet resonator that allows for the light to bounce back and forth without losing its energy. This class of optical oscillators is extremely accurate. However, they are large stand-alone devices, about the size of a home kitchen oven, and must be kept in completely stable laboratory conditions.

    The new oscillator has laboratory-like stability, and is small and lightweight enough to be potentially incorporated into satellites, into cars for super-accurate navigation, into ultra-high-precision measurement tools, or even into an everyday device like a smartphone. It is orders of magnitude better than the best oscillators currently available outside a lab: the quartz crystal oscillators found in luxury wristwatches, computers and smartphones. The new device also takes advantage of a phenomenon discovered in St. Paul’s Cathedral in London.

    The researchers suggest this could be used in miniaturized atomic clocks for spacecraft and satellites, for which precise timing is important to navigation. It could be used for precision distance and rotation sensing for cars and other vehicles and in high-resolution optical spectroscopy, which is used to image molecular and atomic structures.

    “Any fluctuations in temperature or pressure can change the size of the oscillators, and therefore changes how far the laser light travels, and thus, the accuracy of the oscillation,” said Chee Wei Wong, professor of electrical engineering at the UCLA Henry Samueli School of Engineering and Applied Science and the principal investigator on the research.

    Think of how a doorframe expands or contracts because of changes in temperature. At the tiny scales of optical oscillators, even the smallest change in size can affect accuracy.

    The research team’s new oscillator is both accurate and stable: its light oscillation frequency doesn’t change by more than 0.1 parts per billion. At the same time, the researchers shrank the oscillator down to just 1 cubic centimeter in volume.
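
    To put that figure in perspective (back-of-the-envelope arithmetic ours, assuming a near-infrared laser oscillating at roughly 200 terahertz): a fractional stability of 0.1 parts per billion corresponds to the frequency wandering by only about 20 kilohertz out of 200 trillion cycles per second.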

    “The miniature stabilized laser demonstrated in this work is a key step in reducing the size, weight and power of optical clocks, and to make possible their availability outside the laboratory and for field applications,” said Lute Maleki, CEO of OEwaves.

    The research team’s optical oscillator is three to five times more stable than existing devices when subjected to extreme changes in temperature and pressure. Based on experimental results, the researchers also suggest its stability could be as much as 60 times better.

    “Usually, even tiny variations of the atmospheric temperature or pressure introduce measurement uncertainty by an order of magnitude larger than the observed effects,” said Jinkang Lim, a UCLA postdoctoral researcher in the Mesoscopic Optics and Quantum Electronics Laboratory and the lead author on the study. “We carefully designed our resonator and isolated it from the ambient fluctuations. Then we observed the minute changes and saw it remained stable, even with environmental changes.

    “This tiny oscillator could lead to measurement and navigation devices in the field, where temperature and pressure are not controlled and change dramatically,” Lim added. “This new micro-oscillator could retain its accuracy, even with unfriendly environmental conditions.”

    The optical micro-oscillator works at this level of accuracy because it confines the laser light inside itself using what’s known as “whispering gallery-mode” resonance, so named because a whisper against the wall of the dome of London’s St. Paul’s Cathedral, where the phenomenon was first reported, can be heard clearly on the opposite side. The same effect can be heard in New York City’s Grand Central Station. In this case, the laser light wave propagates along the specially designed interior of the micro-resonator. Additionally, the frequency remains stable as the micro-resonator resists changes from temperature and pressure. Finally, the light oscillations themselves are very distinct, rather than “fuzzy.”

    The research, which was published in Nature Communications, was supported by the Air Force Research Laboratory.

    Other authors on the paper include Anatoliy Savchenkov, Elijah Dale, Wei Liang, Danny Eliyahu, Vladimir Ilchenko, and Andrey Matsko, all from OEwaves.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UCLA Campus

    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

     
  • richardmitnick 8:27 am on April 21, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Polarization states

    From UCLA: “UCLA-led team develops technique to control laser polarization” 

    UCLA

    April 19, 2017
    Matthew Chin

    Artist’s depiction of the laser polarization metasurface that can tune the laser’s polarization state purely electronically, without any moving parts. Nicoletta Barolini/UCLA

    A research team led by UCLA electrical engineers has developed a new technique to control the polarization state of a laser that could lead to a new class of powerful, high-quality lasers for use in medical imaging, chemical sensing and detection, or fundamental science research.

    Think of polarized sunglasses, which help people see more clearly in intense light. Polarized lenses work by filtering visible light, allowing only waves whose electric field points in one specific direction to pass through, which reduces brightness and glare.
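
    The effect follows a standard optics relation known as Malus’s law (supplied here for context; the article itself doesn’t state it): an ideal polarizer transmits the fraction cos²θ of the incident intensity, where θ is the angle between the light’s electric field and the polarizer’s axis. Aligned, essentially all the light passes; at 90 degrees, ideally none does.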

    Like brightness and color, polarization is a fundamental property of the light that emerges from a laser. The traditional way to control a laser’s polarization has been to use a separate component such as a polarizer or a waveplate. To change the polarization, that component must be physically rotated, a slow process that also results in a physically larger laser system.

    The team from the UCLA Henry Samueli School of Engineering and Applied Science developed a specialized artificial material, a type of “metasurface,” that can tune the laser’s polarization state purely electronically, without any moving parts. The research was published in Optica. The breakthrough advance was applied to a class of lasers in the terahertz range of frequencies on the electromagnetic spectrum, which lies between microwaves and infrared waves.

    “While there are a few ways to quickly switch polarization in the visible spectrum, in the terahertz range there is currently a lack of good options,” said Benjamin Williams, associate professor of electrical engineering and the principal investigator of the research. “In our approach, the polarization control is built right into the laser itself. This allows a more compact and integrated setup, as well as the possibility for very fast electronic switching of the polarization. Also, our laser efficiently generates the light into the desired polarization state — no laser power is wasted generating light in the wrong polarization.”

    Terahertz radiation penetrates many materials, such as dielectric coatings, paints, foams, plastics, packaging materials, and more without damaging them, Williams said.

    “So some applications include non-destructive evaluation in industrial settings, or revealing hidden features in the study of art and antiquities,” said Williams, who directs the Terahertz Devices and Intersubband Nanostructures Laboratory. “For example, our laser could be used for terahertz imaging, where the addition of polarization contrast may help to uncover additional information in artwork, such as improved edge detection for hidden defects or structures.”

    The work is based on the group’s recent development of the world’s first vertical-external-cavity surface-emitting laser, or VECSEL, that operates in the terahertz range.

    Their new metasurface covers an area of 2 square millimeters and has a distinct zigzag pattern of wire antennas running across its surface. An electric current runs through the wires, selectively energizing particular segments of the laser material, which allows a user to change and customize the polarization state as needed.

    The lead authors of the research are electrical engineering graduate student Luyao Xu and electrical engineering undergraduate student Daguan Chen. Other authors include electrical engineering graduate student Christopher Curwen; Mohammad Memarian, a postdoctoral scholar in UCLA’s microwave electronics lab; John Reno of Sandia National Laboratories; and UCLA electrical engineering professor Tatsuo Itoh, who holds the Northrop Grumman Chair in Engineering.

    The research was supported by the National Science Foundation and NASA.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UCLA Campus

    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

     
  • richardmitnick 8:09 am on April 21, 2017 Permalink | Reply
    Tags: Applied Research & Technology

    From Rutgers: “In Celebration of Science at Rutgers” 

    Rutgers University

    From President Barchi:

    Members of the Rutgers Community:

    With the upcoming March for Science focusing national attention on science and the role it plays in American life, we at Rutgers have much to celebrate about the contributions we have made, and continue to make, in scientific research.

    I’m proud to say that Rutgers is pursuing life-changing scientific discoveries in so many areas, including precision medicine, transportation infrastructure, energy research, drug development, wireless technology, disease diagnostics, nutrition and food science, proteomics, genetics, sustainable materials, computational biology, brain health, autism research, and much more.

    Rutgers scientists gave the world the antibiotic streptomycin, proved the case against smoking, and identified the first AIDS cases. Today our scientists are changing our understanding of oceans, developing world-renowned turfgrass, and revolutionizing the way we test for tuberculosis. Our faculty conducted more than $650 million in research last year alone, more than half of that funded by federal grants.

    In our classrooms and labs, our professors are training students who will become the scientists, physicians, engineers, and inventors who will add to our stores of knowledge and further improve human health, explore and protect the natural environment, and advance economic development.

    True to our service mission, we are also applying our scientific expertise to help our state’s communities through the programs of the New Jersey Agricultural Experiment Station (NJAES). For instance, the Water Resources Program has helped towns better manage their storm-water runoff through green infrastructure practices such as porous pavement and rain gardens, and our Haskins Shellfish Research Laboratory has helped revive the oyster industry in southern New Jersey. We support New Jersey with science in other ways, too; for example, we bring our medical knowledge to New Jersey residents through our clinical practices, clinical trials, and community health programs.

    Science matters to everyone. It affects all of us, and for a research university like Rutgers, it is at the heart of what we do. It is important to remind ourselves, and all those we serve as a public research university, of our ongoing commitment to excellence in science and scientific research. In that pursuit, our Office of Research and Development provides essential guidance while our Federal Relations team advocates for research funding on Capitol Hill.

    I share the sentiments of those from Rutgers who intend to participate in demonstrations in support of science this weekend, including the March for Science in Washington. I also respect the rights of others who have dissenting viewpoints and wish to express them.

    Please take a moment to view a new video with a number of Rutgers voices expressing why science matters and see a Q&A with the Office of Federal Relations. I also invite you to read my op-ed on science, published this morning by USA Today.

    Sincerely,

    Robert Barchi

    Received via email.

    Follow Rutgers Research here.

    Please help promote STEM in your local schools.

    Rutgers Campus

    STEM Icon

    Stem Education Coalition

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    Rutgers smaller
    Please give us back our original beautiful seal which the University stole away from us.

     
  • richardmitnick 4:26 pm on April 20, 2017 Permalink | Reply
    Tags: Applied Research & Technology

    From BNL: “Q&A with CFN User Davood Shahrjerdi” 

    Brookhaven Lab

    April 18, 2017
    Ariana Tantillo
    atantillo@bnl.gov

    Combining the unique properties of emerging nanomaterials with advanced silicon-based electronics, NYU’s Shahrjerdi engineers nano-bioelectronics

    Davood Shahrjerdi in the scanning electron microscope facility at Brookhaven Lab’s Center for Functional Nanomaterials (CFN). The image on the screen is a Hall bar structure for measuring carrier transport in a semiconductor wire.

    Davood Shahrjerdi is an assistant professor of electrical and computer engineering at New York University (NYU) and a principal investigator at the NYU Laboratory for Nano-Engineered Hybrid Integrated Systems. Shahrjerdi, who holds a doctorate in solid-state electronics from The University of Texas at Austin, engineers nanodevices for sensing and life science applications through integrating the unique properties of emerging nanomaterials with advanced silicon-based electronics. For the past two years, he has been using facilities at the Center for Functional Nanomaterials (CFN)—a U.S. Department of Energy (DOE) Office of Science User Facility at Brookhaven National Laboratory—to fabricate and characterize these nanodevices.

    What is the mission of the NYU Laboratory for Nano-Engineered Hybrid Integrated Systems?

    My lab’s mission is to create new electronic devices for sensing and life science applications. To achieve this goal, we combine the benefits of emerging nanomaterials—such as two-dimensional (2D) materials like graphene—and advanced silicon integrated circuits. These nano-engineered bioelectronic systems offer new functionalities that exist in neither nanomaterials nor silicon electronics alone. At the moment, we are leveraging our expertise to engineer new tools for neuroscience applications.

    We are also doing research for realizing high-performance flexible electronics for bioelectronics applications. Our approach is two pronged: (1) flexible electronics using technologically mature materials, such as silicon, that are conventionally mechanically rigid, and (2) flexible electronics using atomically thin 2D nanomaterials that are inherently flexible.

    Given the resources of NYU and the plethora of nanotechnology research centers in the surrounding New York City area, why bring your research to CFN?

    Before I joined academia, I was a research staff member at the IBM Thomas J. Watson Research Center, where I had easy access to advanced fabrication and characterization facilities. When I joined NYU in September 2014, I began to look for research facilities to pursue my research projects. In my search, I discovered CFN and reached out to its scientists, who were very helpful in explaining the research proposal process and the available facilities for my research. In the past two years, my research projects have evolved tremendously, and access to CFN laboratories has been instrumental to this evolution. Because research-active scientists maintain CFN labs, I can conduct my research without major hiccups—a rare occurrence in academia, where equipment downtime and process changes could set back experiments.

    It is not only the state-of-the-art facilities but also the interactions with scientists that have made CFN invaluable to my research. I could use other fabrication facilities in Manhattan, but I prefer to come to CFN. At IBM, I could walk out of my office and knock on any door, gaining access to the expertise of chemists, physicists, and device engineers. This multidisciplinary environment similarly exists at CFN, and it is conducive to driving science forward. Bringing my research to the CFN also means that my doctoral students and postdocs have the opportunity to use state-of-the-art facilities and interact with world-class scientists.

    What tools do you use at CFN to conduct your research, and what are some of the projects you are currently working on?

    We synthesize the 2D nanomaterials at my NYU lab, with subsequent device fabrication and some advanced material characterization at CFN. After device fabrication, we perform electrical characterization at my NYU lab.

    In addition to using the materials processing capabilities in CFN’s clean room, we use advanced material characterization capabilities to glean information about the properties of our materials and devices at the nanoscale. These capabilities include transmission electron microscopy (TEM) to study the structure of the materials, X-ray photoelectron spectroscopy to examine their chemical state, and nano-Auger electron spectroscopy to probe their elemental composition.

    The 5,000-square-foot clean room at CFN is dedicated to state-of-the-art processing of thin-film materials and devices. Capabilities include high-resolution patterning by electron-beam and nanoimprint lithography methods, plasma-based dry etch processes, and material deposition.

    One of our projects is the large-area synthesis of 2D transition metal dichalcogenide semiconductors, which are materials that have a transition metal atom (such as molybdenum or tungsten) sandwiched between two chalcogen atoms (sulfur, selenium, or tellurium). Using a modified version of chemical vapor deposition (referring to the deposition of gaseous reactants onto a substrate to form a solid), my team synthesized a monolayer of tungsten disulfide that has the highest carrier mobility reported for this material. I am now working with CFN scientists to understand the origin of this high electrical performance through low-energy electron microscopy (LEEM). Our understanding could lead to the development of next-generation flexible biomedical devices.

    The single-atom-thick tungsten disulfide (illustration, left) can absorb and emit light, making it attractive for applications in optoelectronics, sensing, and flexible electronics. The photoemission image of the NYU logo (right) shows the monolayer material emitting light.

    Recently, our team together with CFN scientists published a paper on studying the defects in another 2D transition metal dichalcogenide, monolayer molybdenum disulfide. We treated the material with a superacid and used the nano-Auger technique to determine which structural defects were “healed” by the superacid. Our electrical measurements revealed the superacid treatment improves the material’s performance.

    Shahrjerdi and his team fabricated top-gated field-effect transistors (FETs)—devices that utilize a small voltage to control current—on as-grown and superacid-treated molybdenum disulfide films. A schematic of the device is shown in (a). As seen in the graph (b), the chemical treatment (TFSI, red line) improves the electronic properties of the device. From Applied Physics Letters 110, 033503 (2017).

    Another ongoing project in my NYU lab involves a collaborative effort with the NYU Center for Neural Science to develop next-generation neuroprobes for understanding not only the electrical signaling in the brain but also the chemical signaling. This problem is challenging to solve, and we are excited about the prospects of nanotechnology for realizing an innovative solution to it.

    In fabricating nanoelectronic materials and components, what are some of the challenges you face?

    Nanomaterials are usually difficult to handle—they are often very thin and are highly sensitive to defects or misprocessing. As a result, reproducibility could be a challenge. To understand the physical origin of a particular observed behavior, we have to fabricate many samples and try to reproduce the same result.

    Also, it often happens that you expect to observe a certain behavior but end up observing an anomalous one that could lead to new discoveries. For example, I accidentally stumbled on the epitaxial growth of silicon on silicon at 120 degrees Celsius while playing around with hydrogen dilution during the deposition of amorphous silicon. This temperature is much lower than the usual temperature required by the traditional approach. My IBM collaborators and I published the work, and it actually led to a best paper award from the Journal of Electronic Materials!

    What is the most exciting thing on the horizon for nanoelectronics? What do you personally hope to achieve?

    Over the next 5 to 10 years, the field of nanoelectronics has great potential to transform our lives—especially in the areas of bioelectronics and bio-inspired electronics, with the marriage between nanomaterials and conventional electronics leading to new discoveries in the life sciences.

    Biosensing is the area that I am most passionate about. The research community still has a limited understanding of how the brain functions, hindering the progress for developing treatments and drugs for neurological disorders such as Parkinson’s. Developing next-generation sensors that advance our understanding of the brain will have tremendous economic and societal impact. I am very excited about our neuroprobe project.

    Also, better understanding of the brain could lead to new discoveries for realizing next-generation computing systems that are inspired by the brain. For example, nanoscale memory devices that could mimic the synapses of the brain would open new horizons for brain-inspired computing. I am engaged in a collaborative effort with The University of Texas at Austin to explore the prospects of nanoscale memristors (short for memory resistor, a new class of electrical circuits with memories that retain information even after the power is shut off) for such an application.

    NYU is home to the second-highest number of international students in the United States, representing more than 130 different countries, and CFN employs staff and hosts users from around the world. How has being in these multicultural environments impacted your research?

    I believe science has no boundaries because it is shared by people who are driven by their curiosity to discover unknowns and have the desire to better humanity. These sentiments are at the core of scientific communities. Though we may have different backgrounds, our common ground is working on problems that have not yet been solved or discovering the undiscovered.

    How did you become interested in science in general and specifically neuroscience?

    As a kid, I was fascinated with science, particularly physics, and building things. By high school, I had also developed an interest in biology and particularly the brain. When I completed high school in Iran, I had to make the decision of whether I wanted to pursue an undergraduate degree or attend medical school. In Iran, there are no pre-med programs—you start medical school directly after high school, and you cannot enroll in medical school after you have taken the undergraduate route.

    My passion at the time was electrical engineering, so I went for the undergraduate degree. This passion evolved into device physics, my PhD field. After a few years at IBM as a device physicist, my love of bioelectronics was rekindled. I started studying neuroscience and even contemplated attending medical school in the United States. Finally, I decided to join academia and apply my knowledge of physics and electronics to the area of bioelectronics. I feel fortunate to have found a career in which I can combine my expertise and interests.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 4:05 pm on April 20, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Focused Ion Beam Milling

    From Oxford: “Widely used engineering technique has unintended consequences, new research reveals” 

    Oxford University


    20 Apr 2017

    A technique that revolutionised scientists’ ability to manipulate and study materials at the nano-scale may have dramatic unintended consequences, new Oxford University research reveals.

    Focused Ion Beam Milling (FIB) uses a tiny beam of highly energetic particles to cut and analyse materials smaller than one thousandth of the width of a human hair.

    This remarkable capability has transformed scientific fields ranging from materials science and engineering to biology and earth sciences. FIB is now an essential tool for a number of applications, including research into high-performance alloys for aerospace, nuclear and automotive applications, and prototyping in micro-electronics and micro-fluidics.

    FIB was previously understood to cause structural damage within a thin surface layer (tens of atoms thick) of the material being cut. Until now it was assumed that the effects of FIB would not extend beyond this thin damaged layer. Ground-breaking new results from the University of Oxford demonstrate that this is not the case, and that FIB can in fact dramatically alter the material’s structural identity. This work was carried out in collaboration with colleagues from Argonne National Laboratory, USA, LaTrobe University, Australia, and the Culham Centre for Fusion Energy, UK.

    In research newly published in the journal Scientific Reports, the team studied the damage caused by FIB using a technique called coherent synchrotron X-ray diffraction. This relies on ultra-bright high energy X-rays, available only at central facilities such as the Advanced Photon Source at Argonne National Lab, USA. These X-rays can probe the 3D structure of materials at the nano-scale. The results show that even very low FIB doses, previously thought negligible, have a dramatic effect.

    Felix Hofmann, Associate Professor in Oxford’s Department of Engineering Science and lead author on the study, said, ‘Our research shows that FIB beams have much further-reaching consequences than first thought, and that the structural damage caused is considerable. It affects the entire sample, fundamentally changing the material. Given the role FIB has come to play in science and technology, there is an urgent need to develop new strategies to properly understand the effects of FIB damage and how it might be controlled.’

    Prior to the development of FIB, sample preparation techniques were limited, only allowing sections to be prepared from the material bulk, but not from specific features. FIB transformed this field by making it possible to cut out tiny coupons from specific sites in a material. This progression enabled scientists to examine specific material features using high-resolution electron microscopes. Furthermore, it has made mechanical testing of tiny material specimens possible, a necessity for the study of dangerous or extremely precious materials.

    Although keen for his peers to heed the serious consequence of FIB, Professor Hofmann said, ‘The scientific community has been aware of this issue for a while now, but no one (myself included) realised the scale of the problem. There is no way we could have known that FIB had such invasive side effects. The technique is integral to our work and has transformed our approach to prototyping and microscopy, completely changing the way we do science. It has become a central part of modern life.’

    Moving forward, the team is keen to develop awareness of FIB damage. Furthermore, they will build on their current work to gain a better understanding of the damage formed and how it might be removed. Professor Hofmann said, ‘We’re learning how to get better. We have gone from using the technique blindly, to working out how we can actually see the distortions caused by FIB. Next we can consider approaches to mitigate FIB damage. Importantly the new X-ray techniques that we have developed will allow us to assess how effective these approaches are. From this information we can then start to formulate strategies for actively managing FIB damage.’

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Oxford campus

    Oxford is a collegiate university, consisting of the central University and colleges. The central University is composed of academic departments and research centres, administrative departments, libraries and museums. The 38 colleges are self-governing and financially independent institutions, which are related to the central University in a federal system. There are also six permanent private halls, which were founded by different Christian denominations and which still retain their Christian character.

    The different roles of the colleges and the University have evolved over time.

     
  • richardmitnick 3:45 pm on April 20, 2017 Permalink | Reply
    Tags: Applied Research & Technology, How Graphene Could Cool Smartphone Computer and Other Electronics Chips

    From Rutgers: “How Graphene Could Cool Smartphone, Computer and Other Electronics Chips” 

    Rutgers University

    March 27, 2017 [Nothing like being timely.]
    Todd B. Bates

    Rutgers scientists lead research that discovers potential advance for the electronics industry.

    Graphene, a one-atom-thick layer of graphite, consists of carbon atoms arranged in a honeycomb lattice. Photo: OliveTree/Shutterstock

    With graphene, Rutgers researchers have discovered a powerful way to cool tiny chips – key components of electronic devices with billions of transistors apiece.

    “You can fit graphene, a very thin, two-dimensional material that can be miniaturized, to cool a hot spot that creates heating problems in your chip,” said Eva Y. Andrei, Board of Governors professor of physics in the Department of Physics and Astronomy. “This solution doesn’t have moving parts and it’s quite efficient for cooling.”

    The shrinking of electronic components and the excessive heat generated by their increasing power has heightened the need for chip-cooling solutions, according to a Rutgers-led study published recently in Proceedings of the National Academy of Sciences. Using graphene combined with a boron nitride crystal substrate, the researchers demonstrated a more powerful and efficient cooling mechanism.

    “We’ve achieved a power factor that is about two times higher than in previous thermoelectric coolers,” said Andrei, who works in the School of Arts and Sciences.

    The power factor refers to the effectiveness of active cooling. That’s when an electrical current carries heat away, as shown in this study, while passive cooling is when heat diffuses naturally.
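
    For context, in thermoelectrics the power factor is conventionally defined as S²σ, the square of the Seebeck coefficient S multiplied by the electrical conductivity σ (a textbook definition supplied here for reference; the article does not spell the formula out).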

    Graphene has major upsides. It’s a one-atom-thick layer of graphite, which is the flaky stuff inside a pencil. The thinnest flakes, graphene, consist of carbon atoms arranged in a honeycomb lattice that looks like chicken wire, Andrei said. It conducts electricity better than copper, is 100 times stronger than steel and quickly diffuses heat.

    The graphene is placed on devices made of boron nitride, which is extremely flat and as smooth as a skating rink, she said. Silicon dioxide – the traditional base for chips – hinders performance because it scatters electrons that can carry heat away.

    See the full article here.

    Follow Rutgers Research here.

    Please help promote STEM in your local schools.

    Rutgers Campus

    STEM Icon

    Stem Education Coalition

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    Rutgers smaller
    Please give us back our original beautiful seal which the University stole away from us.

     
  • richardmitnick 10:45 am on April 20, 2017 Permalink | Reply
    Tags: Applied Research & Technology

    From NYT: “Is It O.K. to Tinker With the Environment to Fight Climate Change?” 

    The New York Times

    APRIL 18, 2017
    JON GERTNER

    Scientists are investigating whether releasing tons of particulates into the atmosphere might be good for the planet. Not everyone thinks this is a good idea.


    For the past few years, the Harvard professor David Keith has been sketching this vision: Ten Gulfstream jets, outfitted with special engines that allow them to fly safely around the stratosphere at an altitude of 70,000 feet, take off from a runway near the Equator. Their cargo includes thousands of pounds of a chemical compound — liquid sulfur, let’s suppose — that can be sprayed as a gas from the aircraft. It is not a one-time event; the flights take place throughout the year, dispersing a load that amounts to 25,000 tons. If things go right, the gas converts to an aerosol of particles that remain aloft and scatter sunlight for two years. The payoff? A slowing of the earth’s warming — for as long as the Gulfstream flights continue.

    Keith argues that such a project, usually known as solar geoengineering, is technologically feasible and — with a back-of-the-envelope cost of under $1 billion annually — ought to be fairly cheap from a cost-benefit perspective, considering the economic damages potentially forestalled: It might do good for a world unable to cut carbon-dioxide emissions enough to prevent further temperature increases later this century.

    What surprised me, then, as Keith paced around his Harvard office one morning in early March, was his listing all the reasons humans might not want to hack the environment. “Actually, I’m writing a paper on this right now,” he said. Most of his thoughts were related to the possible dangers of trying to engineer our way out of a climate problem of nearly unimaginable scientific, political and moral complexity. Solar geoengineering might lead to what some economists call “lock-in,” referring to the momentum that a new technology, even one with serious flaws, can assume after it gains a foothold in the market. The qwerty keyboard is one commonly cited example; the internal combustion engine is another. Once we start putting sulfate particles in the atmosphere, he mused, would we really be able to stop?

    Another concern, he said, is “just the ethics about messing with nature.” Tall, wiry and kinetic, with thinning hair and a thick beard that gives him the look of the backcountry skier he is, Keith proudly showed me the framed badge that his father, a biologist, wore when he attended the landmark United Nations Conference on the Human Environment in Stockholm in 1972. Now 53, Keith has taken more wilderness trips — hiking, rock climbing, canoeing — than he can properly recall, and for their recent honeymoon, he and his wife were dropped off by helicopter 60 miles from the nearest road in northern British Columbia. “It was quite rainy,” he told me, “and that ended up making it even better.” So the prospect of intentionally changing the climate, he confessed, is not just unpleasant — “it initially struck me as nuts.”

    It still strikes him as a moral hazard, to use a term he borrows from economics. A planet cooled by an umbrella of aerosol particles — an umbrella that works by reflecting back into space, say, 1 percent of the sun’s incoming energy — might give societies less incentive to adopt greener technologies and radically cut carbon emissions. That would be disastrous, Keith said. The whole point of geoengineering is not to give us license to forget about the buildup of CO₂. It’s to lessen the ill effects of the buildup and give us time to transition to cleaner energy.
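    To put that 1 percent figure in context, a quick sketch may help. Only the 1 percent comes from the article; the solar constant, the divide-by-four spherical averaging, and the roughly 3.7 W/m² benchmark for doubled CO₂ are standard textbook values, not details of Keith’s proposal.

```python
# Scale of the forcing offset from reflecting 1 percent of incoming sunlight.
# The 1 percent figure is from the article; the solar constant, the
# divide-by-four spherical averaging, and the ~3.7 W/m^2 CO2-doubling
# benchmark are standard textbook values.
solar_constant = 1361.0                 # W/m^2 at the top of the atmosphere
mean_insolation = solar_constant / 4    # ~340 W/m^2 averaged over the globe
reflected = 0.01 * mean_insolation

print(f"Offset: ~{reflected:.1f} W/m^2")          # ~3.4 W/m^2
print("Compare: doubling CO2 adds ~3.7 W/m^2")    # a similar magnitude
```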

    Beyond these conceivable dangers, though, a more fundamental problem lurks: Solar geoengineering simply might not work. It has been a subject of intense debate among climate scientists for roughly a decade. But most of what we know about its potential effects derives from either computer simulations or studies of volcanic eruptions like that of Mount Pinatubo in 1991, which generated millions of tons of sunlight-scattering particulates and might have cooled the planet by as much as 0.5 degrees Celsius, or nearly 1 degree Fahrenheit. This shortage of direct evidence for solar geoengineering’s efficacy informs Keith’s thinking about what we should do next. Actively tinkering with our environment — fueling up the Gulfstream jets and trying to cool things down — is not something he intends to try anytime soon, if ever. But conducting research is another matter.

    A decade ago, when Keith was among the few American scientists to advocate starting a geoengineering research program, he was often treated at science conferences as an outlier. “People would sort of inch away or, really, tell me I shouldn’t be doing this,” he said. Geoengineering was seen as a scientific taboo and Keith its dark visionary. “The preconception was that I was some kind of Dr. Strangelove figure,” he told me — “which I didn’t like.”

    Attitudes appear to have changed over the past few years, at least in part because of the continuing academic debates and computer-modeling studies. The National Academy of Sciences endorsed the pursuit of solar geoengineering research in 2015, a stance also taken in a later report by the Obama administration. A few influential environmental groups, like the Natural Resources Defense Council and the Environmental Defense Fund, now favor research.

    In the meantime, Keith’s own work at Harvard has progressed. This month, he is helping to start Harvard’s Solar Geoengineering Research Program, a broad endeavor that begins with $7 million in funding and intends to reach $20 million over seven years. One backer is the Hewlett Foundation; another is Bill Gates, whom Keith regularly advises on climate change. Keith is planning to conduct a field experiment early next year by putting particles into the stratosphere over Tucson.

    The new Harvard program is not merely intent on getting its concepts out of the lab and into the field, though; a large share of its money will also be directed to physical and social scientists at the university, who will evaluate solar geoengineering’s environmental dangers — and be willing to challenge its ethics and practicality. Keith told me, “It’s really important that we have a big chunk of the research go to groups whose job will be to find all the ways that it won’t work.” In other words, the technology that Keith has long believed could help us ease our predicament — “the nuclear option” for climate, as one opponent described it to me, to be considered only when all else has failed — will finally be investigated to see whether it is a reasonable idea. At the same time, it will be examined under the premise that it may in fact be a very, very bad one.

    Climate change already presents a demoralizing array of challenges — melting ice sheets and species extinctions — but the ultimate severity of its impacts depends greatly on how drastically technology and societies can change over the next few decades. The growth of solar and wind power in recent years, along with an apparent decrease in coal use, suggests that the global community may yet succeed in curtailing CO₂ emissions. Still, that may not happen nearly fast enough to avert some dangerous consequences. As Keith likes to point out, simply reducing emissions doesn’t reverse global warming. In fact, even if annual global CO₂ emissions decrease somewhat, the total atmospheric CO₂ may continue to increase, because the gas is so slow to dissipate. We may still be living with damaging amounts of atmospheric carbon dioxide a half-century from now, with calamitous repercussions. The last time atmospheric CO₂ levels were as elevated as they are today, three million years ago, sea levels were most likely 45 feet higher, and giant camels roamed above the Arctic Circle.
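    The stock-versus-flow point is easy to miss, so here is a toy illustration. Every number is invented for the example, and this is not a carbon-cycle model; it simply shows that a stock keeps rising as long as the flow into it stays positive, even if that flow shrinks every year.

```python
# Toy stock-vs-flow illustration: annual emissions (the flow) shrink 3 percent
# a year, yet the atmospheric total (the stock) keeps rising, because the
# stock integrates the flow. All numbers are invented for illustration; this
# is not a carbon-cycle model and it ignores natural sinks entirely.
stock_ppm = 410.0   # hypothetical starting concentration
flow_ppm = 2.5      # hypothetical annual addition

for year in range(2020, 2061):
    if year % 10 == 0:
        print(f"{year}: +{flow_ppm:.2f} ppm/yr, total {stock_ppm:.0f} ppm")
    stock_ppm += flow_ppm
    flow_ppm *= 0.97  # emissions decline every year
```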

    Recently, I met with Daniel Schrag, who is the head of the Harvard University Center for the Environment, an interdisciplinary teaching and research department. Schrag, who helped recruit Keith to Harvard, painted a bleak picture of our odds of keeping global temperatures from rising beyond levels considered safe by many climate scientists. When you evaluate the time scales involved in actually switching our energy systems to cleaner fuels, Schrag told me, “the really depressing thing is you start to understand why any of these kinds of projections — for 2030 or 2050 — are absurd.” He went on: “Are they impossible? No. I want to give people hope, too. I’d love to make this happen. And we have made a lot of progress on some things, on solar, on wind. But the reality is we haven’t even started doing the hard stuff.”

    Schrag described any kind of geoengineering as “at best an imperfect solution that is operationally extremely challenging.” Yet to Schrag and Keith, the political and technical difficulties associated with a rapid transition to a zero-carbon-emissions world make it sensible to look into geoengineering research. There are a number of different plans for how to actually do it, however — including the fantastical (pumping seawater onto Antarctica to combat sea-level rise) and the impractical (fertilizing oceans with iron to foster the growth of algae, which would absorb more CO₂). Some proposals involve taking carbon out of the air, using either immense plant farms or absorption machines. (Keith is involved with such sequestration technology, which faces significant hurdles in terms of cost and feasibility.) Another possible approach would inject salt crystals into clouds over the ocean to brighten them and cool targeted areas, like the dying Great Barrier Reef. Still, the feeling among Keith and his colleagues is that aerosols sprayed into the atmosphere might be the most economically and technologically viable approach of all — and might yield the most powerful global effect.

    It is not a new idea. In 2000, Keith published a long academic paper on the history of weather and climate modification, noting that an Institute of Rainmaking was established in Leningrad in 1932 and that American engineers began a cloud-seeding campaign in Vietnam a few decades later. A report issued in 1965 by President Lyndon B. Johnson’s administration called attention to the dangers of increasing concentrations of CO₂ and, anticipating Keith’s research, speculated that a logical response might be to change the albedo, or reflectivity, of the earth. To Keith’s knowledge, though, there have been only two actual field experiments so far. One, by a Russian scientist in 2009, released aerosols into the lower atmosphere via helicopter and appears to have generated no useful data. “It was a stunt,” Keith says. Another was a modest attempt at cloud brightening a few years ago by a team at the Scripps Institution of Oceanography at the University of California, San Diego.

    Downstairs from Keith’s Harvard office, there is a lab cluttered with students fiddling with pipettes and arcane scientific instruments. When I visited in early March, Zhen Dai, a graduate student who works with Keith, was engaged with a tabletop apparatus, a maze of tubes and pumps and sensors, meant to study how chemical compounds interact with the stratosphere. For the moment, Keith’s group is leaning toward beginning its field experiments with ice crystals and calcium carbonate — limestone — that has been milled to particles a half-micron in diameter, or less than 1/100th the width of a human hair. They may eventually try a sulfur compound too. The experiment is called Scopex, which stands for Stratospheric Controlled Perturbation Experiment. An instrument that can disperse an aerosol of particles — say, several ounces of limestone dust — will be housed in a gondola that hangs beneath a balloon that ascends to 70,000 feet. The whole custom-built contraption, whose two small propellers will be steered from the ground, will also include a variety of sensors to collect data on any aerosol plume. Keith’s group will measure the sunlight-scattering properties of the plume and evaluate how its particles interact with atmospheric gases, especially ozone. The resulting data will be used by computer models to try to predict larger-scale effects.
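    One way to see why half-micron particles are a sensible size for scattering sunlight is the standard Mie size parameter. Only the 0.5-micron diameter comes from the article; the wavelength and the interpretation are generic optics, not details of the Scopex design.

```python
# Why half-micron particles scatter visible light efficiently: the Mie size
# parameter x = 2*pi*r/lambda is of order 1, the regime where scattering per
# unit particle mass is near its maximum. Only the 0.5-micron diameter comes
# from the article; the rest is generic optics, not a Scopex design detail.
import math

radius_m = 0.5e-6 / 2     # half-micron diameter, milled limestone
wavelength_m = 550e-9     # middle of the visible spectrum

x = 2 * math.pi * radius_m / wavelength_m
print(f"Mie size parameter x = {x:.1f}")  # ~2.9: strong visible scattering
```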

    But whether a scientist should be deliberately putting foreign substances into the atmosphere, even for a small experiment like this, is a delicate question. There is also the difficulty of deciding how big the atmospheric plumes should get. When does an experiment become an actual trial run? Ultimately, how will the scientists know whether geoengineering really works without scaling it up all the way?

    Keith cites precedents for his thinking: a company that scatters cremation ashes from a high-altitude balloon, and jet engines, whose exhaust contains sulfates. But the crux of the problem that Harvard’s Solar Geoengineering Research Program wrestles with is intentionality. Frank Keutsch, a professor of atmospheric sciences at Harvard who is designing and running the Scopex experiments with Keith, told me: “This effort with David is very different from all my other work, because for those other field experiments, we’ve tried to measure the atmosphere and look at processes that are already there. You’re not actually changing nature.” But in this case, Keutsch agrees, they will be.

    During one of our conversations, Keith suggested that I try to flip my thinking for a moment. “What if humanity had never gotten into fossil fuels,” he posed, “and the world had gone directly to generating energy from solar or wind power?” But then, he added, what if in this imaginary cleaner world there was a big natural seep of a heat-trapping gas from within the earth? Such events have happened before. “It would have all the same consequences that we’re worried about now, except that it’s not us doing the CO₂ emissions,” Keith said. In that case, the reaction to using geoengineering to cool the planet might be one of relief and enthusiasm.

    In other words, decoupling mankind’s actions — the “sin,” as Keith put it, of burning fossil fuels — from our present dilemma can demonstrate the value of climate intervention. “No matter what, if we emit CO₂, we are hurting future generations,” Keith said. “And it may or may not be true that doing some solar geo would over all be a wise thing to do, but we don’t know yet. That’s the reason to do research.”

    There are risks, undeniably — some small, others potentially large and terrifying. David Santillo, a senior scientist at Greenpeace, told me that some modeling studies suggest that putting aerosols in the atmosphere, which might alter local climates and rain patterns and would certainly affect the amount of sunlight hitting the earth, could have a significant impact on biodiversity. “There’s a lot more we can do in theoretical terms and in modeling terms,” Santillo said of the Harvard experiments, “before anyone should go out and do this kind of proof-of-concept work.” Alan Robock, a professor of atmospheric sciences at Rutgers, has compiled an exhaustive list of possible dangers. He thinks that small-scale projects like the Scopex experiment could be useful, but that we don’t know the impacts of large-scale geoengineering on agriculture or whether it might deplete the ozone layer (as volcanic eruptions do). Robock’s list goes on from there: Solar geoengineering would probably reduce solar-electricity generation. It would do nothing to reduce the increasing acidification of the oceans, caused by seawater absorbing carbon dioxide. A real prospect exists, too, that if solar geoengineering efforts were to stop abruptly for any reason, the world could face a rapid warming even more dangerous than what’s happening now — perhaps too fast for any ecological adaptation.
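    The last risk on that list, an abrupt stop, can be sketched with a toy one-box energy-balance model. Every parameter below is an invented, textbook-scale value, not a result from Robock’s work or any study named here.

```python
# Toy "termination shock" sketch: greenhouse forcing grows while an aerosol
# offset holds temperature down; when the offset ends abruptly, the unmasked
# forcing drives much faster warming. Illustrative one-box energy balance
# only; every parameter here is invented and merely textbook-scale.
heat_capacity = 8.0   # W*yr/m^2 per K (ocean mixed layer, illustrative)
feedback = 1.2        # W/m^2 per K (climate feedback, illustrative)

temp = 0.0
for year in range(100):
    ghg = 2.0 + 0.03 * year                # slowly rising greenhouse forcing
    aerosol = -2.0 if year < 60 else 0.0   # offset stops abruptly at year 60
    rate = (ghg + aerosol - feedback * temp) / heat_capacity
    temp += rate
    if year in (59, 60, 61):
        print(f"year {year}: {rate:+.3f} K/yr, T = {temp:.2f} K")
```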

    Keith is well aware of Robock’s concerns. He also makes the distinction that advocating research is not the same as advocating geoengineering. But the line can blur. Keith struck me as having a fair measure of optimism that his research can yield insights into materials and processes that can reduce the impacts of global warming while averting huge risks. For instance, he is already encouraged by computer models that suggest the Arctic ice cap, which has shrunk this year to the smallest size observed during the satellite era, could regrow under cooler conditions brought on by light-scattering aerosols. He also believes that the most common accusation directed against geoengineering — that it might disrupt precipitation patterns and lead to widespread droughts — will prove largely unfounded.

    But Keith is not trained as an atmospheric scientist; he’s a hands-on physicist-engineer who likes to take machinery apart. There are deep unknowns here. Keutsch, for one, seems uncertain about what he will discover when the group actually tries spraying particulates high above the earth. The reduction of sunlight could adversely affect the earth’s water cycle, for example. “It really is unclear to me if this approach is feasible,” he says, “and at this point we know far too little about the risks. But if we want to know whether it works, we have to find out.”

    Finally, what if something goes wrong either in research or in deployment? David Battisti, an atmospheric scientist at the University of Washington, told me, “It’s not obvious to me that we can reduce the uncertainty to anywhere near a tolerable level — that is, to the level that there won’t be unintended consequences that are really serious.” While Battisti thought Keith’s small Scopex experiment posed little danger — “The atmosphere will restore itself,” he said — he noted that the whole point of the Harvard researchers’ work is to determine whether solar geoengineering could be done “forever,” on a large-scale, round-the-clock basis. When I asked Battisti if he had issues with going deeper into geoengineering research, as opposed to geoengineering itself, he said: “Name a technology humans have developed that they haven’t used. I can’t think of any. So we can work on this for sure. But we are in this dilemma: Once we do develop this technology, it will be tempting to use it.”

    Suppose Keith’s research shows that solar geoengineering works. What then? The world would need to agree where to set the global thermostat. If there is no consensus, could developed nations impose a geoengineering regimen on poorer nations? Then again, if this technology works, it would arguably be unethical not to use it, because the world’s poorest populations, facing drought and rising seas, may suffer the worst effects of a changing climate.

    In recent months, a group under the auspices of the Carnegie Council in New York, led by Janos Pasztor, a former United Nations climate official, has begun to work through the thorny international issues of governance and ethics. Pasztor told me that this effort will most likely take four years. And it is not lost on him — or anyone I spoke with in Keith’s Harvard group — that the idea of engineering our environment is taking hold as we are contemplating the engineering of ourselves through novel gene-editing technologies. “They both have an effect on shaping the pathway where human beings are now and where they will be,” says Sheila Jasanoff, a professor of science and technology studies at Harvard who sometimes collaborates with Keith. Jasanoff also points out that each technology potentially enables rogue agents to act without societal consent.

    This is a widespread concern. We might reach a point at which some countries pursue geoengineering, and nothing — neither costs nor treaties nor current technologies — can stop them. Pasztor sketched out another possibility to me: “You could even have a nightmare scenario, where a country decides to do geoengineering and another country decides to do counter-geoengineering.” Such a countermeasure could take the form of an intentional release of a heat-trapping gas far more potent than CO₂, like a hydrochlorofluorocarbon. One of Schrag’s main concerns, in fact, is that geoengineering a lower global temperature might preserve ecosystems and limit sea-level rise while producing irreconcilable geopolitical frictions. “One thing I can’t figure out,” he told me, “is how do you protect the Greenland ice sheet and still have Russia have access to its northern ports, which they really like?” Either Greenland and Siberia will melt, or perhaps both can stay frozen. You probably can’t split the difference.

    For the moment, and perhaps for 10 or 20 years more, these are mere hypotheticals. But the impacts of climate change were once hypotheticals, too. Now they’ve become possibilities and probabilities. And yet, as Tom Ackerman, an atmospheric scientist at the University of Washington, said at a recent discussion among policy makers that I attended in Washington: “We are doing an experiment now that we don’t understand.” He was not talking about geoengineering; he was observing that the uncertainty about the potential risks of geoengineering can obscure the fact that there is uncertainty, too, about the escalating disasters that may soon result from climate change.

    His comment reminded me of a claim made more than a half-century ago, long before the buildup of CO₂ in the atmosphere had become the central environmental and economic problem of our time. Two scientists, Roger Revelle and Hans Suess, wrote in a scientific paper, “Human beings are now carrying out a large-scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future.”

    If anything could sway a fence-sitter to consider whether geoengineering research makes sense, perhaps it is this. The fact is, we are living through a test already.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

     