Tagged: Women in STEM

  • richardmitnick 9:53 am on July 21, 2019 Permalink | Reply
    Tags: Cecilia Payne, Payne discovered that hydrogen and helium are the dominant elements of the stars - 1925 Ph.D. thesis, Women in STEM

    From COSMOS Magazine: Women in STEM- “This week in science history: The woman who found hydrogen in the stars is born” Cecilia Payne 



    Meet the Woman Who Discovered the Composition of the Stars, Cecilia Payne. Mental Floss, Caitlin Schneider August 26, 2015

    Cecilia Payne is today recognised as an equal to Newton and Einstein, but it wasn’t always so.

    10 May 2018
    Jeff Glorfeld

    Cecilia Payne, photographed in 1951. Bettmann / Contributor / Getty Images

    Cecilia Payne, born on May 10, 1900, in Wendover, England, began her scientific career in 1919 with a scholarship to Cambridge University, where she studied physics. But in 1923 she received a fellowship to move to the United States and study astronomy at Harvard. Her 1925 thesis, Stellar Atmospheres, was described at the time by renowned Russian-American astronomer Otto Struve as “the most brilliant PhD thesis ever written in astronomy”.

    In January 2015, Richard Williams of the American Physical Society wrote: “By calculating the abundance of chemical elements from stellar spectra, her work began a revolution in astrophysics.”

    In 1925 Payne received the first PhD in astronomy from Radcliffe, Harvard’s college for women, because Harvard itself did not grant doctoral degrees to women.

    In the early 1930s she met Sergey Gaposchkin, a Russian astronomer who could not return to the Soviet Union because of his politics. Payne was able to find a position at Harvard for him. They married in 1934.

    Finally, in 1956, she achieved two Harvard firsts: she became its first female professor, and the first woman to become department chair.

    In a 2016 article about Payne for New York magazine, writer Dava Sobel reports that when she arrived at Harvard, Payne found the school had a collection of several hundred thousand glass photographs of the night sky, taken over a period of 40 years. Many of these images stretched starlight into strips, or spectra, marked by naturally occurring lines that revealed the constituent elements.

    As she painstakingly examined these plates, Payne reached her controversial – and groundbreaking – conclusion: that unlike on Earth, hydrogen and helium are the dominant elements of the stars.

    At the time, most scientists believed that because stars contained familiar elements such as silicon, aluminium and iron, similar to Earth’s make-up, they would be present in the same proportions, with only small amounts of hydrogen.

    Although the presence of hydrogen in stars had been known since the 1860s, when chemical analysis at a distance first became possible, no one expected the great abundance claimed by Payne.

    Richard Williams, writing for the American Physical Society in 2015, said: “The giants – Copernicus, Newton, and Einstein – each in his turn, brought a new view of the universe. Payne’s discovery of the cosmic abundance of the elements did no less.”

    However, at the time of her thesis publication the foremost authority on stellar composition, Henry Norris Russell of Princeton University, convinced Payne that her conclusions had to be wrong, encouraging her to write that her percentages of hydrogen and helium were “improbably high” and therefore “almost certainly not real”.

    But in a brilliant vindication, Russell devoted the next four years to studying Payne’s findings, and in the Astrophysical Journal he agreed with her, citing her 1925 study and concluding for the record that the great abundance of hydrogen “can hardly be doubted”.

    Cecilia Payne-Gaposchkin died on December 7, 1979.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 11:33 am on July 19, 2019 Permalink | Reply
    Tags: Dr. Jennifer Andrews, The Seismo Lab at Caltech, Women in STEM

    From Caltech: Women in STEM “What is it Like to be a Caltech Seismologist During a Big Quake?” Dr. Jennifer Andrews 


    July 18, 2019
    Robert Perkins
    (626) 395‑1862

    When an earthquake strikes, seismologists at Caltech’s Seismological Laboratory spring into action.


    Dr. Jennifer Andrews

    An arm of Caltech’s Division of Geological and Planetary Sciences (GPS), the Seismo Lab is home to dozens of seismologists who collaborate with the United States Geological Survey (USGS) to operate one of the largest seismic networks in the nation. Together, they analyze data to provide the public with information about where a quake occurred and how big it was. That information not only helps first responders but also feeds into the scientific understanding of earthquakes and of when and where the next big quakes are likely to strike.

    After the two largest Ridgecrest earthquakes on July 4 and 5 (Magnitude 6.4 and 7.1, respectively), Caltech staff seismologist Jen Andrews was part of the Seismo Lab team that rushed to respond. Recently, she described that experience.

    Where were you when the earthquakes hit?

    For Thursday’s quake, I was at home in my shower. I didn’t even realize at the time that it was a quake. But when I got out and looked at my computer, I saw the report. Then the phone rang, and it was Egill [Hauksson, research professor of geophysics at Caltech], saying it was time to go to work. It was all hands on deck.

    For Friday’s quake, I was at the ballet at the Dorothy Chandler Pavilion in Downtown Los Angeles. They’d just finished act 1 and were in intermission, so fortunately no dancers were on stage to be knocked off their feet. I was in the balcony, so the movement I felt was probably amplified by the height (and also the soft sediment beneath Downtown). The chandeliers were swaying, but no one panicked. As soon as I felt it shake, I started counting. We felt it as a roll, so I knew the epicenter wasn’t right beneath us. Once I reached 20 seconds, I knew this was a big earthquake, even bigger than the first one. I immediately got in a taxi and headed straight to campus.
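    The counting trick Andrews describes exploits the lag between the fast-moving P wave and the slower S wave: each second of lag corresponds to several kilometers of distance. Here is a minimal sketch of that rule of thumb, assuming typical crustal wave speeds (the velocities are illustrative values, not figures from the interview):

```python
# Rule-of-thumb epicentral distance from the lag between P- and S-wave arrivals.
# Assumed typical crustal wave speeds (illustrative, not from the article):
VP = 6.0   # P-wave speed, km/s
VS = 3.5   # S-wave speed, km/s

def distance_from_sp_lag(lag_seconds: float) -> float:
    """Distance (km) implied by a given S-minus-P arrival lag."""
    # lag = d/VS - d/VP  =>  d = lag * VP*VS / (VP - VS)
    return lag_seconds * (VP * VS) / (VP - VS)

# With these speeds each second of lag adds about 8 km, so a 20-second count
# puts the source well over 100 km away:
print(round(distance_from_sp_lag(20.0)))
```

    With these assumed speeds a 20-second count implies a source on the order of 170 km away, consistent with feeling a distant quake as a long roll rather than a sharp jolt.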

    What did you do next?

    Here at the Seismo Lab, it’s our responsibility to verify that all of the info we’re putting out about earthquakes—the locations and magnitudes, for example—are correct. We’re responsible for getting info about the origin out within two minutes of the shaking, so we have fully automated systems that send updates to the National Earthquake Information Center right away. All of that happens without anyone touching anything, before we can even get to our desks. But once we get there, we look at the waveforms and make sure that we’re correctly identifying the P and S waves. [During an earthquake, several types of seismic waves radiate out from the quake’s epicenter, including compressional waves (or P-waves), transverse waves (or S-waves), and surface waves.] We also know the speed at which seismic waves should travel, so we can use that to make sure that we’re correctly identifying where the quake originated. It turns out that the automatic systems did a brilliant job of getting most of the information correct.
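    The location check Andrews describes, using known wave speeds to confirm where a quake originated, can be sketched as a toy grid search: try candidate epicenters and keep the one whose predicted travel times best reconcile the arrival picks at several stations. The station coordinates and picks below are synthetic, for illustration only; real network location codes are far more sophisticated.

```python
import itertools
import math

VP = 6.0  # assumed P-wave speed, km/s

# Hypothetical station coordinates (km) and a hidden "answer" used to
# synthesize arrival-time picks for the demonstration:
stations = {"A": (0.0, 0.0), "B": (100.0, 0.0), "C": (0.0, 80.0)}
true_src, true_t0 = (40.0, 30.0), 5.0

def travel(src, sta):
    """Travel time (s) from a source to a station at constant P speed."""
    return math.dist(src, sta) / VP

picks = {s: true_t0 + travel(true_src, xy) for s, xy in stations.items()}

best, best_misfit = None, float("inf")
for x, y in itertools.product(range(101), range(101)):
    # For a trial epicenter, each pick implies an origin time (pick minus
    # travel time); a good location makes all stations agree on it.
    t0s = [picks[s] - travel((x, y), xy) for s, xy in stations.items()]
    misfit = max(t0s) - min(t0s)
    if misfit < best_misfit:
        best, best_misfit = (x, y), misfit

print(best)  # recovers (40, 30) on this 1-km grid
```

    The same consistency idea, checking that one origin time and location explain every station's picks, is what lets analysts spot a misidentified P or S arrival.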

    What is it like to be in the Seismo Lab after a big earthquake?

    It’s very busy. There’s a lot of people: seismologists, news reporters, even curious students and people who are on campus who just want to know what’s going on. Meanwhile, we have a lot of issues to deal with: we have seismologists on the phone with state representatives and others speaking to members of the press, while still others are trying to process data coming in from seismometers. Within a few hours of a quake, the USGS tries to figure out who’s going out to the location of the earthquake, and what equipment they’ll be taking. For the Ridgecrest quakes, they did flyovers in a helicopter looking for ruptures, and then sent people on the ground to measure the rupture. They then deployed additional seismometers so that we could get an even clearer picture of any aftershocks.

    How long after the earthquake will things stay busy for you?

    The media attention relaxes after a few hours or days, but I’m going to be looking at the data we gathered from these quakes for a long time. I was here every day over the holiday weekend and the following week working on it. It could take months or even years for our group to process all the data.

    Do you learn more from big earthquakes like these than you do from little ones?

    You learn different things. The data will be incorporated into earthquake hazard models, though likely will not make big changes. But these quakes in particular were interesting, as two perpendicular faults were involved. We can study the rupture dynamics, which you can’t resolve in smaller quakes. Also, having two strong quakes caused variations in fault slip and ground motion that will be important to study and understand.

    See the full article here.

    Earthquake Alert



    Earthquake Network project

    Earthquake Network is a research project which aims at developing and maintaining a crowdsourced smartphone-based earthquake warning system at a global level. Smartphones made available by the population are used to detect earthquake waves using the on-board accelerometers. When an earthquake is detected, an earthquake warning is issued in order to alert the population not yet reached by the damaging waves of the earthquake.

    The project started on January 1, 2013 with the release of the homonymous Android application Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015

    Meet The Quake-Catcher Network


    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide a better understanding of earthquakes and give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
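    A common way to implement the kind of trigger described here, flagging “strong new motions” against background noise, is a short-term-average over long-term-average (STA/LTA) detector. The sketch below is a generic illustration of that idea with made-up parameters, not QCN’s actual algorithm:

```python
def sta_lta_trigger(samples, sta_len=10, lta_len=100, threshold=4.0):
    """Return sample indices where the short-term average amplitude jumps
    well above the long-term background, a standard seismic trigger heuristic."""
    triggers = []
    for i in range(lta_len, len(samples)):
        sta = sum(abs(s) for s in samples[i - sta_len:i]) / sta_len
        lta = sum(abs(s) for s in samples[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# Quiet background noise with a sudden strong arrival at sample 150:
trace = [0.01] * 150 + [1.0] * 50
print(sta_lta_trigger(trace)[0])  # first trigger fires shortly after onset
```

    Because the threshold is a ratio rather than an absolute level, the same detector works on a noisy laptop shelf and on a quiet floor-mounted sensor; the server-side step of separating earthquakes from slamming doors then comes from requiring many nearby hosts to trigger at nearly the same time.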

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) Mounted to the floor, they measure shaking more reliably than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensor’s performance. 5) USB sensors can be aligned to north, so we know which directions the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.
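    Those warning times fall straight out of the gap between the S-wave travel time and the alert latency. A back-of-the-envelope sketch, with an assumed S-wave speed and a made-up five-second detection-and-alert delay (neither figure comes from the ShakeAlert documentation above):

```python
VS = 3.5           # assumed S-wave speed, km/s
ALERT_DELAY = 5.0  # assumed seconds to detect, locate, and issue the alert

def warning_time(distance_km: float) -> float:
    """Seconds of warning at a site `distance_km` from the epicenter,
    assuming the alert itself travels effectively instantaneously."""
    return distance_km / VS - ALERT_DELAY

for d in (20, 50, 100, 200):
    print(f"{d:>3} km: {warning_time(d):5.1f} s of warning")
```

    Sites closer than roughly VS times the alert delay get no warning at all (the so-called blind zone), which is why realistic estimates range from a few seconds near the source to a few tens of seconds farther out.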

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California, joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.


    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

    Please help promote STEM in your local schools.

    STEM Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

    Caltech campus

  • richardmitnick 11:07 am on July 18, 2019 Permalink | Reply
    Tags: Birth of the Moon, Replicating the forces that generate new planets, Sarah T. Stewart, Women in STEM

    From Nautilus: Women in STEM-“She Rewrote the Moon’s Origin Story” Sarah T. Stewart 


    From Nautilus

    July 18, 2019
    Brian Gallagher


    Fire When Ready: In her lab, Sarah T. Stewart (above) tries to replicate the forces that generate new planets. She employs “light gas guns, essentially cannons,” she says, to fire disks—at eight kilometers per second—toward minerals, vaporizing them, to generate the pressures and temperatures needed for planet formation. John D. & Catherine T. MacArthur Foundation.

    Fifty years ago, in the Oval Office, Richard Nixon made what he called the “most historic phone call ever.” Houston had put him through to the men on the moon. “It’s a great honor and privilege for us to be here,” Neil Armstrong said, “representing not only the United States but men of peace of all nations, and with interest and a curiosity and a vision for the future.” The Apollo missions—a daring feat of passion and reason—weren’t just for show. In reaching the moon in 1969, fulfilling John F. Kennedy’s promise seven years earlier to go there not because it would be easy, but hard, humanity tested its limits—as well as the lunar soil.

    The samples the astronauts brought back to Earth have revolutionized our understanding of the moon’s origins, leading scientists to imagine new models of how our planet, and its companion, emerged. One of those scientists is Sarah T. Stewart, a planetary physicist at the University of California, Davis. Last year she won a MacArthur Foundation Fellowship, unofficially known as the “genius grant,” for her work on the origin of Earth’s moon. Her theory upends one held for decades.

    Stewart’s bold vision grows out of a love for science planted in high school in O’Fallon, Illinois. “I had phenomenal math and physics teachers,” she said. “So when I went to college, I wanted to be a physics major.” At Harvard, where she studied astronomy and physics, “I met amazing scientists, and that sparked a whole career.” She earned her Ph.D. at Caltech.

    Nautilus spoke to Stewart last year about the scientific significance of the Apollo lunar landings, as well as how her laboratory experiments, which replicate the pressures and temperatures of planetary collisions, informed her model of the moon’s birth.

    How significant were the Apollo moon landings to science?

    This July marks the 50th anniversary of the Apollo moon landing. The rock samples that the Apollo missions brought back basically threw out every previous idea for the origin of the moon. Before the Apollo results were in, a Russian astronomer named Viktor Safronov had been developing models of how planets grow. He found that they grow into these sub- or proto-planet-size bodies that would then collide. A couple of different groups then independently proposed that a giant impact made a disc around the Earth that the moon accreted from. Over the past 50 years, that model became quantitative, predictive. Simulations showed that the moon should be made primarily out of the object that struck the proto-Earth. But the Apollo mission found that the moon is practically a twin of the Earth, particularly its mantle, in major elements and in isotopic ratios: The different weight elements are like fingerprints, present in the same abundances. Every single small asteroid and planet in the solar system has a different fingerprint, except the Earth and the moon. So the giant impact hypothesis was wrong. It’s a lesson in how science works—the giant impact hypothesis hung on for so long because there was no alternative model that hadn’t already been disproven.

    How is your proposal for the moon’s birth different?

    We changed the giant impact. And by changing it we specifically removed one of the original constraints. The original giant impact was proposed to set the length of day of the Earth, because angular momentum—the rotational equivalent of linear momentum—is a physical quantity that is conserved: If we go backwards in time, the moon comes closer to the Earth. At the time the moon grew, the Earth would have been spinning with a five-hour day. So all of the giant impact models were tuned to give us a five-hour day for the Earth right after the giant impact. What we did was say, “Well, what if there were a way to change the angular momentum after the moon formed?” That would have to be through a dynamical interaction with the sun. What that means is that we could start the Earth spinning much faster—we were exploring models where the Earth had a two- to three-hour day after the giant impact.
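    The five-hour figure Stewart mentions can be sanity-checked with a rough conservation-of-angular-momentum estimate: return the moon’s present orbital angular momentum to Earth’s spin and see how fast the planet would turn. The constants are standard values; the 0.33 moment-of-inertia factor and the neglect of solar tides and lunar spin make this only a ballpark sketch, not her group’s calculation:

```python
import math

# Standard constants:
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # kg
R_EARTH = 6.371e6    # m
M_MOON = 7.346e22    # kg
A_MOON = 3.844e8     # present Earth-moon distance, m
DAY = 86164.0        # sidereal day, s

I_earth = 0.33 * M_EARTH * R_EARTH**2               # approximate moment of inertia
L_spin = I_earth * (2 * math.pi / DAY)              # today's spin angular momentum
L_orbit = M_MOON * math.sqrt(G * M_EARTH * A_MOON)  # moon's orbital angular momentum

# Put the total into Earth's spin, as if the moon were brought back to the planet:
omega = (L_spin + L_orbit) / I_earth
period_hours = 2 * math.pi / omega / 3600
print(round(period_hours, 1))  # a few hours, in the ballpark of the five-hour day
```

    Most of today’s angular momentum is in the moon’s orbit rather than Earth’s spin, which is why winding the clock back compresses the day so dramatically.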

    What did a faster-spinning Earth do to your models?

    The surprising new thing is that when the Earth is hot, vaporized, and spinning quickly, it isn’t a planet anymore. There’s a boundary beyond which all of the Earth material cannot physically stay in an object that rotates altogether—we call that the co-rotation limit. A body that exceeds the co-rotation limit forms a new object that we named a synestia, a Greek-derived word that is meant to represent a connected structure. A synestia is a different object than a planet plus a disc. It has different internal dynamics. In this hot vaporized state, the hot gas in the disc can’t fall onto the planet, because the planet has an atmosphere that’s pushing that gas out. What ends up happening is that the rock vapor that forms a synestia cools by radiating to space, forms magma rain in the outer parts of the synestia, and that magma rain accretes to form the moon within the rock vapor that later cools to become the Earth.
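    The co-rotation limit Stewart describes can be sketched with a simple Newtonian balance: material forced to co-rotate at angular speed omega stays bound only inside the radius where gravity beats the centrifugal term. Treating the Earth as a point mass (a real synestia’s extended, vaporized mass distribution makes this only a rough illustration):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # Earth mass, kg (point-mass approximation)

def corotation_limit_km(spin_period_hours: float) -> float:
    """Radius (km) beyond which material co-rotating at this spin period
    is flung outward: gravity G*M/r**2 loses to centrifugal omega**2 * r."""
    omega = 2 * math.pi / (spin_period_hours * 3600)
    return (G * M_EARTH / omega**2) ** (1 / 3) / 1000

# A fast post-impact spin pulls the limit down close to the planet itself,
# while today's 24-hour spin puts it far out, near geosynchronous distance:
print(round(corotation_limit_km(2.5)), round(corotation_limit_km(24.0)))
```

    With a two- to three-hour day the limit sits only a few thousand kilometers above today’s surface, so a hot, puffed-up, rapidly spinning Earth easily overflows it, which is the condition for forming a synestia.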

    How did the idea of a synestia come about?

    In 2012, Matija Ćuk and I published a paper that was a high-spin model for the origin of the moon. We changed the impact event, but we didn’t realize that after the impact, things were completely different. It just wasn’t anything we ever extracted from the simulations. It wasn’t until two years later, when my student Simon Lock and I were looking at different plots, plots we had never made before out of the same simulations, that we realized that we had been interpreting what happened next incorrectly. There was a bona fide eureka moment where we were sitting together talking about how the disc would evolve around the Earth after the impact, and realizing that it wasn’t a standard disc. These synestias have probably been sitting in people’s computer systems for quite some time without anyone ever actually identifying them as something different.

    Was the size of the synestia beyond the moon’s current orbit?

    It could have been bigger. Exactly how big it was depends on the energy of the event and how fast it was spinning. We don’t have precise constraints on that to make the moon because a range of synestias could make the moon.

    How long was the Earth in a synestia state?

    The synestia was very large, but it didn’t last very long. Because rock vapor is very hot, and where we are in the solar system is far enough away from the sun that our mean temperature is cooler than rock vapor, the synestia cooled very quickly. So it could last a thousand years or so before looking like a normal planet again. Exactly how long it lasts depends on what else is happening in the solar system around the Earth. In order to be a long lived object it would need to be very close to the star.

    What was the size of the object that struck proto-Earth?

    We can’t tell, because a variety of mass ratios, impact angles, impact velocities can make a synestia that has enough mass and angular momentum in it to make our moon. I don’t know that we will ever know for sure exactly what hit us. There may be ways for us to constrain the possibilities. One way to do that is to look deep in the Earth for clues about how large the event could have been. There are chemical tracers from the deep mantle that indicate that the Earth wasn’t completely melted and mixed, even by the moon-forming event. Those reach the surface through what are called ocean island basalts, sometimes called mantle plumes, from near the core-mantle boundary, up through the whole mantle to the surface. It could be that that could be used as a constraint on being too big. Because the Earth and the moon are very similar in the mantles of the two bodies, that can be used to determine what is too small of an event. That would give us a range that can probably be satisfied by a number of different impact configurations.

    How much energy does it take to form a synestia?

    Giant impacts are tremendously energetic events. The energy of the event, in terms of the kinetic energy of the impact, is released over hours. The power involved is similar to the power, or luminosity, of the sun. We really cannot think of the Earth as looking anything like the Earth when you’ve just dumped the energy of the sun into this planet.

    How common are synestias?

    We actually think that synestias should happen quite frequently during rocky planet formation. We haven’t looked at the gas giant planets. There are some different physics that happen with those. But for growing rocky bodies like the Earth, we attempted to estimate the statistics of how often there should be synestias. And for Earth-mass bodies anywhere in the universe probably, the body is a synestia at least once while it’s growing. The likelihood of making a synestia goes up as the bodies become larger. Super-Earths also should have been a synestia at some point.

    You say that all of the pressures and temperatures reached during planet formation are now accessible in the laboratory. First, give us a sense of the magnitude of those pressures and temperatures, and then tell us how accessing them in labs is possible.

    The center of the Earth is at several thousand degrees, and has hundreds of gigapascals of pressure, about three million times more pressure than the surface. Jupiter’s center is even hotter. The center-of-Jupiter pressures can be reached temporarily during a giant impact, as the bodies are colliding together. A giant impact and the center of Jupiter are about the limits of the pressures and temperatures reached during planet formation: so tens of thousands of degrees, and a million times the pressure of the Earth. To replicate that, we need to dump energy into our rock or mineral very quickly in order to generate a shockwave that reaches these amplitudes in pressure and temperature. We use major minerals in the Earth, or rocky planets, so we’ve studied iron, quartz, forsterite, enstatite, and different alloy compositions of those. Other people have studied the hydrogen-helium mixture for Jupiter, and ices for Uranus and Neptune. In my lab we have light gas guns, essentially cannons. And, using compressed hydrogen, we can launch a metal flyer plate, literally a thin disk, to almost eight kilometers per second. We can reach the core pressures in the Earth, but I can’t reach the range of giant impacts or the center of Jupiter in my lab. But the Sandia Z machine, which is a big capacitor that launches metal plates using a magnetic force, can reach 40 kilometers per second.

    Sandia Z machine

    And with the National Ignition Facility laser at Lawrence Livermore National Lab, we can reach the pressures at the center of Jupiter.

    National Ignition Facility at LLNL
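
    The velocities and pressures quoted above can be sanity-checked with the standard impedance-matching relations used in shock physics. A minimal sketch, assuming a symmetric iron-on-iron impact and textbook linear Hugoniot fit constants (the function and parameter values are the editor's illustration, not numbers from the interview):

    ```python
    # Impedance-matching estimate of shock pressure for a symmetric planar
    # impact, using the linear Hugoniot relation Us = c0 + s * up.
    # Parameter defaults are textbook values for iron; all numbers here
    # are illustrative, chosen by the editor, not taken from the interview.

    def shock_pressure_gpa(impact_speed_km_s, rho0=7874.0, c0=3.57, s=1.92):
        """Shock pressure (GPa) for a symmetric impact.

        rho0: initial density (kg/m^3); c0 (km/s), s: Hugoniot fit constants.
        In a symmetric impact the particle velocity is half the impact speed.
        """
        up = impact_speed_km_s / 2.0   # particle velocity, km/s
        us = c0 + s * up               # shock-wave velocity, km/s
        # Momentum conservation across the shock front: P = rho0 * Us * up
        return rho0 * (us * 1e3) * (up * 1e3) / 1e9

    # A light gas gun at ~8 km/s lands near Earth's-core pressures
    # (hundreds of GPa); a ~40 km/s flyer plate reaches thousands of GPa.
    print(f"{shock_pressure_gpa(8.0):.0f} GPa at 8 km/s")
    print(f"{shock_pressure_gpa(40.0):.0f} GPa at 40 km/s")
    ```

    With these iron parameters, the 8 km/s case comes out around 350 GPa, consistent with the "core pressures in the Earth" mentioned above.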

    What happens to the flyer plates when they’re shot?

    The target simply gets turned to dust after being vaporized and then cooling again. They’re very destructive experiments. You have to make real-time measurements—of the wave itself and how fast it’s traveling—within tens of nanoseconds, which we can then translate to pressure. My group has spent a lot of time developing ways to measure temperature, and to find phase boundaries. The work that led to the moon-origin paper was specifically studying what it takes to vaporize Earth materials, and to determine the boiling points of rocks. We needed to know when material would be vaporized in order to calculate when something would become a synestia.

    How do you use your experimental results?

    What runs in our code is a simplified version of a planet. With our experiments we can simulate that simplified planet and then infer the more complicated chemical system. Once we’ve determined the pressure and temperature of the average system, you can ask more detailed questions about the multi-component chemistry of a real planet. In the moon paper that was published last year, there are two big sections: one does the simplified modeling of the giant impact—it gives us the pressure-temperature range in the synestia—and another looks at the chemistry of a system that starts at these high pressures and temperatures and cools, but now using a more realistic model for the Earth.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

  • richardmitnick 12:24 pm on July 12, 2019 Permalink | Reply
    Tags: EMIC-electromagnetic ion cyclotron waves, Eun-Hwa Kim, , Plasma particles, , Women in STEM   

    From PPPL: Women in STEM-“Scientists deepen understanding of the magnetic fields that surround the Earth and other planets” Eun-Hwa Kim 

    From PPPL

    July 12, 2019
    Raphael Rosen

    PPPL physicist Eun-Hwa Kim (Photo by Elle Starkman)

    Vast rings of electrically charged particles encircle the Earth and other planets. Now, a team of scientists has completed research into waves that travel through this magnetic, electrically charged environment, known as the magnetosphere, deepening understanding of the region and its interaction with our own planet, and opening up new ways to study other planets across the galaxy.

    Magnetosphere of Earth, original bitmap from NASA. SVG rendering by Aaron Kaase

    The scientists, led by Eun-Hwa Kim, physicist at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL), examined a type of wave that travels through the magnetosphere. These waves, called electromagnetic ion cyclotron (EMIC) waves, reveal the temperature and the density of the plasma particles within the magnetosphere, among other qualities.

    “Waves are a kind of signal from the plasma,” said Kim, lead author of a paper that reported the findings in JGR Space Physics. “Therefore, the EMIC waves can be used as diagnostic tools to reveal some of the plasma’s characteristics.”

    Kim and researchers from Andrews University in Michigan and Kyung Hee University in South Korea focused their research on mode conversion, the way in which some EMIC waves form. During this process, compressional waves arriving from outer space (waves that oscillate along the direction they travel) collide with Earth’s magnetosphere and trigger the formation of EMIC waves, which then zoom off at a particular angle and polarization — the direction in which the waves vibrate.

    Using PPPL computers, the scientists performed simulations showing that these mode-converted EMIC waves can propagate through the magnetosphere along magnetic field lines at normal angles of less than 90 degrees relative to the boundary between the magnetosphere and space. Knowing such characteristics enables physicists to identify EMIC waves and gather information about the magnetosphere with limited initial information.

    A better understanding of the magnetosphere could provide detailed information about how Earth and other planets interact with their space environment. For instance, the waves could allow scientists to determine the density of elements like helium and oxygen in the magnetosphere, as well as learn more about the flow of charged particles from the sun that produces the aurora borealis.
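
    One reason EMIC waves can fingerprint composition: each ion species has a cyclotron frequency set by its charge-to-mass ratio and the local magnetic field, and EMIC wave bands sit just below those frequencies. A minimal sketch, assuming singly charged ions and an illustrative field strength of 100 nT (the field value and function names are the editor's assumptions, not numbers from the article):

    ```python
    import math

    # Ion cyclotron frequency: f = q * B / (2 * pi * m). EMIC wave bands
    # fall just below the cyclotron frequencies of the ion species that
    # are present, so distinct bands hint at distinct species.
    # The 100 nT field (roughly geostationary-orbit strength) is an
    # editor's illustrative value, not a number from the article.

    Q_E = 1.602e-19   # elementary charge, C
    AMU = 1.661e-27   # atomic mass unit, kg

    def cyclotron_freq_hz(b_tesla, mass_amu, charge_state=1):
        return charge_state * Q_E * b_tesla / (2 * math.pi * mass_amu * AMU)

    B = 100e-9  # tesla
    for species, amu in [("H+", 1.008), ("He+", 4.003), ("O+", 15.999)]:
        print(f"{species}: {cyclotron_freq_hz(B, amu):.3f} Hz")
    ```

    Because the frequencies scale inversely with ion mass, hydrogen-, helium-, and oxygen-band EMIC waves are well separated, which is what makes them usable as a diagnostic.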

    Moreover, engineers employ waves similar to EMIC waves to aid the heating of plasma in doughnut-shaped magnetic fusion devices known as tokamaks. So, studying the behavior of the waves in the magnetosphere could deepen insight into the creation of fusion energy, which takes place when plasma particles collide to form heavier particles. Scientists around the world seek to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

    Knowledge of EMIC waves could thus provide wide-ranging benefits. “We are really eager to understand the magnetosphere and how it mediates the effect that space weather has on our planet,” said Kim. “Being able to use EMIC waves as diagnostics would be very helpful.”

    This research was supported by the DOE’s Office of Science (Fusion Energy Sciences), the National Science Foundation, and the National Aeronautics and Space Administration.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    PPPL campus

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

  • richardmitnick 11:56 am on July 12, 2019 Permalink | Reply
    Tags: "Enriching solid-state batteries", , , Jennifer Rupp, , , , , , Women in STEM   

    From MIT News: Women in STEM-“Enriching solid-state batteries” Jennifer Rupp 

    MIT News

    From MIT News

    July 11, 2019
    Denis Paiste | Materials Research Laboratory

    MIT Associate Professor Jennifer Rupp stands in front of a pulsed laser deposition chamber, in which her team developed a new lithium garnet electrolyte material with the fastest reported ionic conductivity of its type. The technique produces a thin film about 330 nanometers thick. “Having the lithium electrolyte as a solid-state very fast conductor allows you to dream out loud of anything else you can do with fast lithium motion,” Rupp says. Photo: Denis Paiste/Materials Research Laboratory

    Researchers at MIT have come up with a new pulsed laser deposition technique to make thinner lithium electrolytes using less heat, promising faster charging and potentially higher-voltage solid-state lithium ion batteries.

    Key to the new technique for processing the solid-state battery electrolyte is alternating layers of the active electrolyte lithium garnet component (chemical formula, Li6.25Al0.25La3Zr2O12, or LLZO) with layers of lithium nitride (chemical formula Li3N). First, these layers are built up like a wafer cookie using a pulsed laser deposition process at about 300 degrees Celsius (572 degrees Fahrenheit). Then they are heated to 660 C and slowly cooled, a process known as annealing.

    During the annealing process, nearly all of the nitrogen atoms burn off into the atmosphere and the lithium atoms from the original nitride layers fuse into the lithium garnet, forming a single lithium-rich, ceramic thin film. The extra lithium content in the garnet film allows the material to retain the cubic structure needed for positively charged lithium ions (cations) to move quickly through the electrolyte. The findings were reported in a Nature Energy paper published online recently by MIT Associate Professor Jennifer L. M. Rupp and her students Reto Pfenninger, Michal M. Struzik, Inigo Garbayo, and collaborator Evelyn Stilp.

    “The really cool new thing is that we found a way to bring the lithium into the film at deposition by using lithium nitride as an internal lithiation source,” Rupp, the work’s senior author, says. Rupp holds joint MIT appointments in the departments of Materials Science and Engineering and Electrical Engineering and Computer Science.

    “The second trick to the story is that we use lithium nitride, which is close in bandgap to the laser that we use in the deposition, whereby we have a very fast transfer of the material, which is another key factor to not lose lithium to evaporation during a pulsed laser deposition,” Rupp explains.

    Safer technology

    Lithium batteries with commonly used electrolytes made by combining a liquid and a polymer can pose a fire risk when the liquid is exposed to air. Solid-state batteries are desirable because they replace the commonly used liquid polymer electrolytes in consumer lithium batteries with a solid material that is safer. “So we can kick that out, bring something safer in the battery, and decrease the electrolyte component in size by a factor of 100 by going from the polymer to the ceramic system,” Rupp explains.

    Although other methods to produce lithium-rich ceramic materials on larger pellets or tapes, heated using a process called sintering, can yield a dense microstructure that retains a high lithium concentration, they require higher heat and result in bulkier material. The new technique pioneered by Rupp and her students produces a thin film that is about 330 nanometers thick (less than 1.5 hundred-thousandths of an inch). “Having a thin film structure instead of a thick ceramic is attractive for battery electrolyte in general because it allows you to have more volume in the electrodes, where you want to have the active storage capacity. So the holy grail is be thin and be fast,” she says.

    Compared to the classic ceramic coffee mug, which under high magnification shows metal oxide particles with a grain size of tens to hundreds of microns, the lithium (garnet) oxide thin films processed using Rupp’s methods show nanometer scale grain structures that are one-thousandth to one-ten-thousandth the size. That means Rupp can engineer thinner electrolytes for batteries. “There is no need in a solid-state battery to have a large electrolyte,” she says.

    Faster ionic conduction

    Instead, what is needed is an electrolyte with faster conductivity. Ionic conductivity is measured in siemens per centimeter. The new multilayer deposition technique produces a lithium garnet (LLZO) material that shows the fastest ionic conductivity yet for a lithium-based electrolyte compound, about 2.9 × 10⁻⁵ siemens per centimeter (0.000029 S/cm). This ionic conductivity is competitive with solid-state lithium battery thin-film electrolytes based on LIPON (lithium phosphorus oxynitride) and adds a new film electrolyte material to the landscape.
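
    To see why "thin and fast" matters, note that the area-specific resistance of an electrolyte layer is simply its thickness divided by its ionic conductivity. A back-of-the-envelope comparison using the film conductivity and thickness reported in the article (the 500-micron pellet thickness is the editor's assumed comparison point, not a figure from the article):

    ```python
    # Area-specific resistance (ASR) of an electrolyte layer: thickness
    # divided by ionic conductivity. Film numbers come from the article;
    # the 500-micron sintered pellet is an assumed comparison case.

    def asr_ohm_cm2(thickness_cm, conductivity_s_per_cm):
        return thickness_cm / conductivity_s_per_cm

    sigma = 2.9e-5                     # S/cm, reported LLZO film conductivity
    film = asr_ohm_cm2(330e-7, sigma)  # 330 nm film (1 nm = 1e-7 cm)
    pellet = asr_ohm_cm2(0.05, sigma)  # assumed 500-micron pellet

    print(f"330 nm film:       {film:.2f} ohm*cm^2")
    print(f"500 micron pellet: {pellet:.0f} ohm*cm^2")
    ```

    The thousand-fold thickness reduction translates directly into a thousand-fold lower resistance across the electrolyte for the same conductivity.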

    “Having the lithium electrolyte as a solid-state very fast conductor allows you to dream out loud of anything else you can do with fast lithium motion,” Rupp says.

    A battery’s negatively charged electrode stores power. The work points the way toward higher-voltage batteries based on lithium garnet electrolytes, both because its lower processing temperature opens the door to using materials for higher voltage cathodes that would be unstable at higher processing temperatures, and its smaller electrolyte size allows physically larger cathode volume in the same battery size.

    Co-authors Michal Struzik and Reto Pfenninger carried out processing and Raman spectroscopy measurements on the lithium garnet material. These measurements were key to showing the material’s fast conduction at room temperature, as well as understanding the evolution of its different structural phases.

    “One of the main challenges in understanding the development of the crystal structure in LLZO was to develop appropriate methodology. We have proposed a series of experiments to observe development of the crystal structure in the [LLZO] thin film from disordered or ‘amorphous’ phase to fully crystalline, highly conductive phase utilizing Raman spectroscopy upon thermal annealing under controlled atmospheric conditions,” says co-author Struzik, who was a postdoc working at ETH Zurich and MIT with Rupp’s group, and is now a professor at Warsaw University of Technology in Poland. “That allowed us to observe and understand how the crystal phases are developed and, as a consequence, the ionic conductivity improved,” he explains.

    Their work shows that during the annealing process, lithium garnet evolves from the amorphous phase in the initial multilayer processed at 300 C through progressively higher temperatures to a low conducting tetragonal phase in a temperature range from about 585 C to 630 C, and to the desired highly conducting cubic phase after annealing at 660 C. Notably, this temperature of 660 C to achieve the highly conducting phase in the multilayer approach is nearly 400 C lower than the 1,050 C needed to achieve it with prior sintering methods using pellets or tapes.

    “One of the greatest challenges facing the realization of solid-state batteries lies in the ability to fabricate such devices. It is tough to bring the manufacturing costs down to meet commercial targets that are competitive with today’s liquid-electrolyte-based lithium-ion batteries, and one of the main reasons is the need to use high temperatures to process the ceramic solid electrolytes,” says Professor Peter Bruce, the Wolfson Chair of the Department of Materials at Oxford University, who was not involved in this research.

    “This important paper reports a novel and imaginative approach to addressing this problem by reducing the processing temperature of garnet-based solid-state batteries by more than half — that is, by hundreds of degrees,” Bruce adds. “Normally, high temperatures are required to achieve sufficient solid-state diffusion to intermix the constituent atoms of ceramic electrolyte. By interleaving lithium layers in an elegant nanostructure the authors have overcome this barrier.”

    After demonstrating the novel processing and high conductivity of the lithium garnet electrode, the next step will be to test the material in an actual battery to explore how the material reacts with a battery cathode and how stable it is. “There is still a lot to come,” Rupp predicts.

    Understanding aluminum dopant sites

    A small fraction of aluminum is added to the lithium garnet formulation because aluminum is known to stabilize the highly conductive cubic phase in this high-temperature ceramic. The researchers complemented their Raman spectroscopy analysis with another technique, known as negative-ion time-of-flight secondary ion mass spectrometry (TOF-SIMS), which shows that the aluminum retains its position at what were originally the interfaces between the lithium nitride and lithium garnet layers before the heating step expelled the nitrogen and fused the material.

    “When you look at large-scale processing of pellets by sintering, then everywhere where you have a grain boundary, you will find close to it a higher concentration of aluminum. So we see a replica of that in our new processing, but on a smaller scale at the original interfaces,” Rupp says. “These little things are what adds up, also, not only to my excitement in engineering but my excitement as a scientist to understand phase formations, where that goes and what that does,” Rupp says.

    “Negative TOF-SIMS was indeed challenging to measure since it is more common in the field to perform this experiment with focus on positively charged ions,” explains Pfenninger, who worked at ETH Zurich and MIT with Rupp’s group. “However, for the case of the negatively charged nitrogen atoms we could only track it in this peculiar setup. The phase transformations in thin films of LLZO have so far not been investigated in temperature-dependent Raman spectroscopy — another insight towards the understanding thereof.”

    The paper’s other authors are Inigo Garbayo, who is now at CIC EnergiGUNE in Minano, Spain, and Evelyn Stilp, who was then with Empa, Swiss Federal Laboratories for Materials Science and Technology, in Dubendorf, Switzerland.

    Rupp began this research while serving as a professor of electrochemical materials at ETH Zurich (the Swiss Federal Institute of Technology) before she joined the MIT faculty in February 2017. MIT and ETH have jointly filed for two patents on the multi-layer lithium garnet/lithium nitride processing. This new processing method, which allows precise control of lithium concentration in the material, can also be applied to other lithium oxide films such as lithium titanate and lithium cobaltate that are used in battery electrodes. “That is something we invented. That’s new in ceramic processing,” Rupp says.

    “It is a smart idea to use Li3N as a lithium source during preparation of the garnet layers, as lithium loss is a critical issue during thin film preparation otherwise,” comments University Professor Jürgen Janek at Justus Liebig University Giessen in Germany. Janek, who was not involved in this research, adds that “the quality of the data and the analysis is convincing.”

    “This work is an exciting first step in preparing one of the best oxide-based solid electrolytes in an intermediate temperature range,” Janek says. “It will be interesting to see whether the intermediate temperature of about 600 degrees C is sufficient to avoid side reactions with the electrode materials.”

    Oxford Professor Bruce notes the novelty of the approach, adding “I’m not aware of similar nanostructured approaches to reduce diffusion lengths in solid-state synthesis.”

    “Although the paper describes specific application of the approach to the formation of lithium-rich and therefore highly conducting garnet solid electrolytes, the methodology has more general applicability, and therefore significant potential beyond the specific examples provided in the paper,” Bruce says. The approach may need to be demonstrated at larger scale before commercialization, he suggests.

    While the immediate impact of this work is likely to be on batteries, Rupp predicts another decade of exciting advances based on applications of her processing techniques to devices for neuromorphic computing, artificial intelligence, and fast gas sensors. “The moment the lithium is in a small solid-state film, you can use the fast motion to trigger other electrochemistry,” she says.

    Several companies have already expressed interest in using the new electrolyte approach. “It’s good for me to work with strong players in the field so they can push out the technology faster than anything I can do,” Rupp says.

    This work was funded by the MIT Lincoln Laboratory, the Thomas Lord Foundation, Competence Center Energy and Mobility, and Swiss Electrics.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

  • richardmitnick 8:49 am on July 11, 2019 Permalink | Reply
    Tags: , Ruby Mendenhall, , Women in STEM   

    From Science Node: Women in STEM-“The citizen scientists of hidden America” Ruby Mendenhall 

    Science Node bloc
    From Science Node

    03 July, 2019
    Alisa Alering

    This health study in Chicago recruits subjects to also be the scientists.

    When you read the words ‘citizen scientist’, what do you picture? Maybe backyard astronomers helping to classify distant galaxies, or fifth graders recording soil temperatures to track climate change.

    But Ruby Mendenhall, assistant dean for diversity and democratization of health innovation at the Carle Illinois College of Medicine, has a different idea of what citizen science can do—and who can participate.

    Mendenhall used a 2017-2018 NCSA Faculty Fellowship to examine how exposure to nearby gun crimes impacts African-American mothers living in Englewood, Chicago. Home to about 30,000 people, Englewood has a reputation as one of the most violent neighborhoods in the city.

    Beyond the physical effects of stress, Mendenhall wanted to investigate the long-term consequences experienced by women living in communities like Englewood. For example, what happens to a parent when the sound of gunshots is common during the day—and especially at night?

    Here’s where the citizen science comes in. The women of Englewood aren’t just subjects in this research, they’re active participants.

    “We wanted to put more agency in their hands,” says Mendenhall. “We asked them, ‘What would you like to see solved? What’s an issue that you have? How can we study this?’”

    From subjects to scientists

    Mendenhall sees citizen science as a way to address health disparities and social inequality. Though many citizen science projects focus on topics like backyard biology, it’s an existing framework that can be applied to community-based participatory research in health and medicine.

    “These are citizen scientists who can take knowledge of their own lived experience and create new knowledge about Black women and families,” says Mendenhall. “We hope they can help us make medical advances around depression, PTSD, and how the body responds to stress.”

    Mendenhall wanted to put more agency in the hands of the women, transforming them from study subjects into participating scientists. The researchers asked what the women wanted to see solved, what issues they were concerned about, and how it might be studied.

    Mendenhall then teamed up with computer scientist Kiel Gilleade to design a mobile health study that documented the women’s experience via wearable biosensors, phone GPS, and diary-keeping.

    Given historical problems with mistrust of the medical community—and with good reason—Mendenhall was concerned that the participants wouldn’t agree to let researchers take samples of their blood (for a separate study) to see how stress affected the genes that regulate the immune system.

    But, somewhat to her surprise, the women agreed. One of the reasons the women gave for their willingness to participate was that they recognized the impact stress was having on their bodies.

    “They talked about having headaches, backaches, stomachaches, many things,” says Mendenhall. “They were interested in what was going on with their bodies, what was the connection.”

    Asking the right questions

    Mendenhall presented her keynote address, Using Advanced Computing to Recover Black Women’s Lost History, at the PEARC18 conference in Pittsburgh in July 2018.

    Mendenhall hasn’t always engaged with computation to further her research. She started her academic career in African-American studies and sociology. But when faculty from NCSA visited her department, Mendenhall became intrigued by the possibilities of big data.

    “I didn’t change the research I was interested in, I didn’t change my focus on Black women and their agency and their lived experiences on the margins of society,” says Mendenhall. “What I did was expand my toolkit and my ability to answer questions—and even to ask different questions.”

    Some of the questions she’s asking are: Whose voice may not be represented? Whose lived experience isn’t represented? If they were, how would what we see be different? Mendenhall believes that scholars of all types can benefit from putting more time and energy into asking questions like these.

    “I think it’s important to understand that big data is not neutral, it is not objective,” says Mendenhall. “Data is situated within a historical and political context.”

    Despite biases in existing collections of data, Mendenhall believes data can also be applied to help equalize the historical record.

    “I think big data has great potential if more voices are brought in,” says Mendenhall. “If everyone’s voice can be heard and seen and studied and digitized. And if Black women can also study it themselves and develop ideas about what that data is representing.”

    The study about Black women in Englewood followed only twelve women, but the next step will be to expand the pool of citizen scientists to 600 or more.

    “Ideally, I’m thinking about 100,000 citizen scientists or all the women in Chicago. If they could all be citizen scientists—then what would we see?”

    Mendenhall is currently at work on a funding proposal to create a Communiversity Think-and-Do Tank where researchers and citizen scientists will work together to address grand challenges such as gun violence, Black infant and maternal mortality, mental health, and diverse histories in the digital archives. She hopes this will be one avenue to get her closer to her goal of 100,000 citizen scientists.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

  • richardmitnick 12:14 pm on July 3, 2019 Permalink | Reply
    Tags: "Seeing Farther and Deeper: An Interview with Katie Bouman", , , , Computer vision and imaging, Women in STEM   

    From Caltech: Women in STEM- “Seeing Farther and Deeper: An Interview with Katie Bouman” 

    Caltech Logo

    From Caltech

    July 02, 2019

    Robert Perkins
    (626) 395‑1862


    New Caltech faculty member Katie Bouman creates images from nonideal sensor data and mines for information from images using techniques that can be applied to everything from medical imaging to studying the universe.

    An assistant professor of computing and mathematical sciences in the Division of Engineering and Applied Science, Bouman joined Caltech’s faculty at the beginning of June. She earned her bachelor’s degree at the University of Michigan in Ann Arbor, followed by a master’s and PhD from MIT. After completing her graduate studies, she worked as a postdoctoral researcher at the Harvard-Smithsonian Center for Astrophysics. Bouman was one of about 200 scientists and engineers from across the globe who worked on the Event Horizon Telescope project, which made headlines in April for capturing the first-ever image of a black hole.

    Recently, Bouman answered a few questions about her life and work.

    How would you describe your research?

    I like thinking about how we can use imaging to help push forward the boundaries of other fields. I did my PhD in a computer vision group—a group that tries to analyze images and understand images. A lot of people in the computer vision field work on object detection and action recognition. Those are really interesting problems, and researchers have been working for decades to build machines that mimic human intelligence in order to solve them. But there is another world of interesting problems that cameras and images can help us solve that humans are not even capable of doing on their own.

    I like to search for information hidden in images, imperceptible to humans, that we can use to learn about the environment around us. This requires an understanding of the complete sensing system: how light interacts with the world and is then captured by our camera sensor into individual pixels. This line of research, where we work on merging sensors and algorithms to achieve something not possible with just one or the other, is often described as computational imaging or computational photography.

    What kind of applications do you see for this work?

    There are multiple sides to the research I enjoy: one side is coming up with new ways to reconstruct images invisible to traditional sensors, and another side is using images or videos to extract hidden information from a scene. For instance, I’ve used each pixel in a video like a very noisy sensor to recover the location of people moving behind a wall from imperceptible changes in shadows that appear on the ground. I’ve also used data to create an image. For example, in the black hole imaging work, we had really noisy, sparse data. We had to figure out how to create an image to learn something from what we were seeing.
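
    Recovering an image from noisy, sparse data is usually posed as regularized inversion: find the image that both fits the measurements and satisfies a prior. A toy sketch of that general idea (this is the editor's illustration of regularized least squares on made-up data, not the actual Event Horizon Telescope pipeline, which uses far more sophisticated priors and calibration terms):

    ```python
    import random

    # Toy reconstruction of an 8-pixel "image" from only 5 noisy linear
    # measurements: minimize ||A x - y||^2 + lam * ||x||^2 by gradient
    # descent. The scene, measurement matrix, and all parameters are
    # invented for illustration.

    random.seed(0)
    n, m = 8, 5                       # 8 unknown pixels, 5 measurements
    truth = [0, 0, 1, 2, 3, 2, 1, 0]  # the scene we hope to recover

    # Random sparse measurement matrix and noisy data y = A @ truth + noise
    A = [[random.gauss(0, 1) for _ in range(n)] for _ in range(m)]
    y = [sum(a * t for a, t in zip(row, truth)) + random.gauss(0, 0.01)
         for row in A]

    x = [0.0] * n
    lam, eta = 0.01, 0.01
    for _ in range(20000):
        # residual r = A x - y, then gradient of the regularized objective
        r = [sum(a * xi for a, xi in zip(row, x)) - yi
             for row, yi in zip(A, y)]
        grad = [sum(A[i][j] * r[i] for i in range(m)) + lam * x[j]
                for j in range(n)]
        x = [xj - eta * gj for xj, gj in zip(x, grad)]

    # The problem is underdetermined, so x is a regularized estimate of
    # the truth, not an exact recovery.
    print([round(v, 1) for v in x])
    ```

    The key point mirrors the interview: with fewer measurements than unknowns, the data alone do not pin down one image, so the choice of regularizer (here a simple norm penalty) decides which consistent image is reported.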

    Here at Caltech, I’m excited to start connecting with people across campus and help them use imaging to push the boundaries of their disciplines. I’ve already had the opportunity to speak with Zach Ross [assistant professor of geophysics] about how new techniques could help in more precisely localizing the origin of collections of earthquakes. This work, perhaps surprisingly, contains many similarities to the work I’ve done in black hole imaging.

    I also will be having Aviad Levis join me as a postdoc next year. Aviad has been working with JPL on studying cloud tomography: reconstructing the 3D structure and the particle distribution of clouds from 2D images taken by planes or satellites. Similar to imaging black holes, these clouds evolve as the measurements are being taken, so every measurement captures a different sample of the cloud structure. We are excited about exploring some ideas for solving both of these messy, time-evolving problems. By intelligently connecting the information from time-variable measurements, I’m confident we can design algorithms to solve for a more accurate cloud structure or a video of a black hole evolving over time.

    Each problem, each application, has its own intricacies; understanding the structure of a problem is exciting, and by encoding that structure into our algorithms, we can learn more.

    I understand that you’re not the only Bouman on campus right now.

    That’s right. My sister, Amanda, is a graduate student in mechanical engineering, and my brother, Alexander, is an undergraduate also in mechanical engineering. It’s definitely been nice having them around to show me the ropes and help me get settled on campus.

    After living on the East Coast, how do you like being in California?

    My husband and I are really enjoying it, but we’re still getting used to it. It seems like every day I tell him that we need to eat outside because it’s so beautiful, so we end up grilling outside every day. In Boston, we had to take advantage of every nice day. Eventually, we’re going to have to stop eating just hamburgers and hot dogs.

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

    Caltech campus

  • richardmitnick 12:36 pm on June 26, 2019 Permalink | Reply
Tags: Catherine Drennan, Drennan seized on X-ray crystallography as a way to visualize molecular structures., Women in STEM

    From MIT News: Women in STEM- “For Catherine Drennan, teaching and research are complementary passions” 

    MIT News

    From MIT News

    June 26, 2019
    Leda Zimmerman

    “Really the most exciting thing for me is watching my students ask good questions, problem-solve, and then do something spectacular with what they’ve learned,” says Professor Catherine Drennan. Photo: James Kegley

    Professor of biology and chemistry is catalyzing new approaches in research and education to meet the climate challenge.

    Catherine Drennan says nothing in her job thrills her more than the process of discovery. But Drennan, a professor of biology and chemistry, is not referring to her landmark research on protein structures that could play a major role in reducing the world’s waste carbons.

    “Really the most exciting thing for me is watching my students ask good questions, problem-solve, and then do something spectacular with what they’ve learned,” she says.

    For Drennan, research and teaching are complementary passions, both flowing from a deep sense of “moral responsibility.” Everyone, she says, “should do something, based on their skill set, to make some kind of contribution.”

    Drennan’s own research portfolio attests to this sense of mission. Since her arrival at MIT 20 years ago, she has focused on characterizing and harnessing metal-containing enzymes that catalyze complex chemical reactions, including those that break down carbon compounds.

    She got her start in the field as a graduate student at the University of Michigan, where she became captivated by vitamin B12. This very large vitamin contains cobalt and is vital for amino acid metabolism, the proper formation of the spinal cord, and prevention of certain kinds of anemia. Bound to proteins in food, B12 is released during digestion.

    “Back then, people were suggesting how B12-dependent enzymatic reactions worked, and I wondered how they could be right if they didn’t know what B12-dependent enzymes looked like,” she recalls. “I realized I needed to figure out how B12 is bound to protein to really understand what was going on.”

    Drennan seized on X-ray crystallography as a way to visualize molecular structures. Using this technique, which involves bouncing X-ray beams off a crystallized sample of a protein of interest, she figured out how vitamin B12 is bound to a protein molecule.
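The geometry behind that technique is captured by Bragg's law, which relates the X-ray wavelength and the angle of constructive interference to the spacing between atomic planes in the crystal. The snippet below is a generic illustration of that relation, not code from Drennan's work; the wavelength is that of the common Cu K-alpha laboratory X-ray source, and the diffraction angle is an arbitrary example value.

```python
import math

def plane_spacing(wavelength_nm, theta_deg, order=1):
    """Bragg's law: n * lambda = 2 * d * sin(theta).
    Given the X-ray wavelength and an observed diffraction angle,
    recover the spacing d between atomic planes in the crystal."""
    return order * wavelength_nm / (2.0 * math.sin(math.radians(theta_deg)))

# Example: Cu K-alpha radiation (0.154 nm) diffracting at 22.5 degrees.
d = plane_spacing(0.154, 22.5)
print(f"plane spacing: {d:.3f} nm")
```

From many such reflections, measured over the whole crystal, crystallographers fit the full three-dimensional electron density of the protein.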

    “No one had previously been successful using this method to obtain a B12-bound protein structure, which turned out to be gorgeous, with a protein fold surrounding a novel configuration of the cofactor,” says Drennan.

    Carbon-loving microbes show the way

    These studies of B12 led directly to Drennan’s one-carbon work. “Metallocofactors such as B12 are important not just medically, but in environmental processes,” she says. “Many microbes that live on carbon monoxide, carbon dioxide, or methane — eating carbon waste or transforming carbon — use metal-containing enzymes in their metabolic pathways, and it seemed like a natural extension to investigate them.”

    Some of Drennan’s earliest work in this area, dating from the early 2000s, revealed a cluster of iron, nickel, and sulfur atoms at the center of the enzyme carbon monoxide dehydrogenase (CODH). This so-called C-cluster serves hungry microbes, allowing them to “eat” carbon monoxide and carbon dioxide.

    Recent experiments by Drennan analyzing the structure of the C-cluster-containing enzyme CODH showed that in response to oxygen, it can change configurations, with sulfur, iron, and nickel atoms cartwheeling into different positions. Scientists looking for new avenues to reduce greenhouse gases took note of this discovery. CODH, suggested Drennan, might prove an effective tool for converting waste carbon dioxide into a less environmentally destructive compound, such as acetate, which might also be used for industrial purposes.

    Drennan has also been investigating the biochemical pathways by which microbes break down hydrocarbon byproducts of crude oil production, such as toluene, an environmental pollutant.

    “It’s really hard chemistry, but we’d like to put together a family of enzymes to work on all kinds of hydrocarbons, which would give us a lot of potential for cleaning up a range of oil spills,” she says.

    The threat of climate change has increasingly galvanized Drennan’s research, propelling her toward new targets. A 2017 study she co-authored in Science detailed a previously unknown enzyme pathway in ocean microbes that leads to the production of methane, a formidable greenhouse gas: “I’m worried the ocean will make a lot more methane as the world warms,” she says.

    Drennan hopes her work may soon help to reduce the planet’s greenhouse gas burden. Commercial firms have begun using the enzyme pathways that she studies, in one instance employing a proprietary microbe to capture carbon dioxide produced during steel production — before it is released into the atmosphere — and convert it into ethanol.

    “Reengineering microbes so that enzymes take not just a little, but a lot of carbon dioxide out of the environment — this is an area I’m very excited about,” says Drennan.

    Creating a meaningful life in the sciences

    At MIT, she has found an increasingly warm welcome for her efforts to address the climate challenge.

    “There’s been a shift in the past decade or so, with more students focused on research that allows us to fuel the planet without destroying it,” she says.

In Drennan’s lab, a postdoc, Mary Andorfer, and a rising junior, Phoebe Li, are currently working to inhibit an enzyme present in an oil-consuming microbe whose unfortunate residence in refinery pipes leads to corrosion and spills. “They are really excited about this research from the environmental perspective and even made a video about their microorganism,” says Drennan.

    Drennan delights in this kind of enthusiasm for science. In high school, she thought chemistry was dry and dull, with no relevance to real-world problems. It wasn’t until college that she “saw chemistry as cool.”

    The deeper she delved into the properties and processes of biological organisms, the more possibilities she found. X-ray crystallography offered a perfect platform for exploration. “Oh, what fun to tell the story about a three-dimensional structure — why it is interesting, what it does based on its form,” says Drennan.

    The elements that excite Drennan about research in structural biology — capturing stunning images, discerning connections among biological systems, and telling stories — come into play in her teaching. In 2006, she received a $1 million grant from the Howard Hughes Medical Institute (HHMI) for her educational initiatives that use inventive visual tools to engage undergraduates in chemistry and biology. She is both an HHMI investigator and an HHMI professor, recognition of her parallel accomplishments in research and teaching, as well as a 2015 MacVicar Faculty Fellow for her sustained contribution to the education of undergraduates at MIT.

    Drennan attempts to reach MIT students early. She taught introductory chemistry classes from 1999 to 2014, and in fall 2018 taught her first introductory biology class.

    “I see a lot of undergraduates majoring in computer science, and I want to convince them of the value of these disciplines,” she says. “I tell them they will need chemistry and biology fundamentals to solve important problems someday.”

    Drennan happily migrates among many disciplines, learning as she goes. It’s a lesson she hopes her students will absorb. “I want them to visualize the world of science and show what they can do,” she says. “Research takes you in different directions, and we need to bring the way we teach more in line with our research.”

    She has high expectations for her students. “They’ll go out in the world as great teachers and researchers,” Drennan says. “But it’s most important that they be good human beings, taking care of other people, asking what they can do to make the world a better place.”

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

  • richardmitnick 8:32 am on June 17, 2019 Permalink | Reply
Tags: "This Incredible Orbit Map of Our Solar System Makes Our Brains Ache", Eleanor Lutz, Women in STEM

    From University of Washington via Science Alert: Women in STEM- “This Incredible Orbit Map of Our Solar System Makes Our Brains Ache” Eleanor Lutz 

    U Washington

    From University of Washington



    Science Alert

    (Eleanor Lutz)

    17 JUN 2019

    If you want to know what a talent for scientific visualizations looks like, check out Eleanor Lutz. She’s a PhD student in biology at the University of Washington, and at her website Tabletop Whale, you can see her amazing work on full display.

    Her latest piece is a map showing all the orbits of over 18,000 asteroids in the Solar System. It includes 10,000 asteroids that are over 10 km in diameter, and about 8,000 objects of unknown size.

    As the tagline at her website says, she produces “Charts, infographics, and animations about any and all things science.”

    This includes things like a “Visual Compendium of Glowing Creatures,” “All the Stars You Can See From Earth,” and a beautiful topographic map of Mercury.


    But it’s her newest project that is garnering her a lot of attention in the space community. Lutz is working on an Atlas of Space, and has been for the last year and a half. It’s a collection of ten visualizations including planets, moons, and outer space.

    As she says on her website, “I’ve made an animated map of the seasons on Earth, a map of Mars geology, and a map of everything in the solar system bigger than 10 km.”

    It’s that map of objects larger than 10 km that is generating buzz.

    (Eleanor Lutz)

All of the data for Lutz’s Atlas of Space is public data, freely available. She gets it from sources like NASA and the US Geological Survey.

Part of what drives her is that even though the data is public and freely available, it’s raw. And taking that raw data and turning it into a helpful, and even beautiful, visualization takes a lot of work.

    In an interview with Wired, Lutz said, “I really like that all this data is accessible, but it’s very difficult to visualize. It’s really awesome science, and I wanted everyone to be able to see it in a way that makes sense.”
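As a toy illustration of the kind of processing such a map involves (a sketch only, not Lutz's published code), an asteroid's path can be traced from just two published orbital elements, semi-major axis and eccentricity, using the conic-section equation for a Keplerian orbit. The element values below are approximate figures for three well-known bodies; real catalogs such as NASA JPL's Small-Body Database supply them for hundreds of thousands of objects.

```python
import numpy as np

def orbit_xy(a_au, ecc, n_points=360):
    """Trace one Keplerian orbit in its own plane from two elements,
    semi-major axis a (in AU) and eccentricity e, via the conic equation
    r = a * (1 - e^2) / (1 + e * cos(theta))."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_points)
    r = a_au * (1.0 - ecc**2) / (1.0 + ecc * np.cos(theta))
    return r * np.cos(theta), r * np.sin(theta)

# Approximate elements for three familiar bodies:
elements = {"Ceres": (2.77, 0.076), "Vesta": (2.36, 0.089), "Eros": (1.46, 0.223)}

for name, (a, e) in elements.items():
    x, y = orbit_xy(a, e)
    r = np.hypot(x, y)
    print(f"{name}: perihelion {r.min():.2f} AU, aphelion {r.max():.2f} AU")
```

Repeating this for 18,000 objects and layering the curves is, in essence, the raw ingredient of the map; the design work of color, weight, and labeling is where the artistry comes in.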

(Eleanor Lutz)

    (Eleanor Lutz)

    Lutz’s work is really more than data visualizations. She has a designer’s eye, and some of her work is very artful.

    But being a scientist, she’s inspired to share the data and the methods she used to create her work. She plans to publish the open source code for each of her pieces, and also tutorials for how to create them yourself.

    It’s difficult to understand our world, or anything in nature really, without engaging with science. Without science, all we have is anecdote and opinion.

    But science is all about data, and dense data is not everyone’s cup of tea. It’s taxing and time-consuming to understand.

    Lutz’s work is making it easier. In an interview with Wired, she said, “There’s a knowledge barrier to accessing some of the interesting, awesome things about science. There are so many facts and equations, and I want those cool ideas to be accessible.”

    To access some of those cool ideas she’s talking about, visit her website, tabletopwhale.com, where you can explore her work and her methods. You can also purchase prints there.

    This article was originally published by Universe Today. Read the original article.

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world.
So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

  • richardmitnick 11:39 am on May 31, 2019 Permalink | Reply
Tags: "Scientists discover ancient seawater preserved from the last Ice Age", Asst. Prof. Clara Blättler - U Chicago, Geophysical sciences, Women in STEM

    From University of Chicago: Women in STEM- “Scientists discover ancient seawater preserved from the last Ice Age” Asst. Prof. Clara Blättler, U Chicago 

    U Chicago bloc

    From University of Chicago

    May 23, 2019
    Louise Lerner

    Asst. Prof. Clara Blättler with a vial of seawater dating to the last Ice Age—about 20,000 years ago. Photo by Jean Lachat.

    Drops locked inside rock offer clues to modeling Earth’s climate and ocean circulation.

    Twenty thousand years ago, in the thick of an Ice Age, Earth looked very different. Because water was locked up in glaciers hundreds of feet thick, which stretched down over Chicago and New York City, the ocean was smaller—shorelines extended hundreds of miles farther out, and the remaining water was saltier and colder.

    A University of Chicago scientist led a study [Geochimica et Cosmochimica Acta] that recently announced the discovery of the first-ever direct remnants of that ocean: pockets of seawater dating to the Ice Age, tucked inside rock formations in the middle of the Indian Ocean.

    “Previously, all we had to go on to reconstruct seawater from the last Ice Age were indirect clues, like fossil corals and chemical signatures from sediments on the seafloor,” said Clara Blättler, an assistant professor of geophysical sciences at the University of Chicago, who studies Earth history using isotope geochemistry. “But from all indications, it looks pretty clear we now have an actual piece of this 20,000-year-old ocean.”

    Blättler and the team made the discovery on a months-long scientific mission exploring the limestone deposits that form the Maldives, a set of tiny islands in the middle of the Indian Ocean. The ship, the JOIDES Resolution, is specifically built for ocean science and is equipped with a drill that can extract cores of rock over a mile long from up to three miles beneath the seafloor. Then scientists either vacuum out the water or use a hydraulic press to squeeze the water out of the sediments.

Scientists carry a core of rock extracted by drill. Photo by Carlos Alvarez-Zarikian.

    The scientists were actually studying those rocks to determine how sediments are formed in the area, which is influenced by the yearly Asian monsoon cycle. But when they extracted the water, they noticed their preliminary tests were coming back salty—much saltier than normal seawater. “That was the first indication we had something unusual on our hands,” Blättler said.

    The scientists took the vials of water back to their labs and ran a rigorous battery of tests on the chemical elements and isotopes that made up the seawater. All of their data pointed to the same thing: The water was not from today’s ocean, but the last remnants of a previous era that had migrated slowly through the rock.

    Scientists are interested in reconstructing the last Ice Age because the patterns that drove its circulation, climate and weather were very different from today’s—and understanding these patterns could shed light on how the planet’s climate will react in the future. “Any model you build of the climate has to be able to accurately predict the past,” Blättler said.

    For example, she said, ocean circulation is a primary player in climate, and scientists have a lot of questions about how that looked during an Ice Age. “Since so much fresh water was pulled into glaciers, the oceans would have been significantly saltier—which is what we saw,” Blättler said. “The properties of the seawater we found in the Maldives suggests that salinity in the Southern Ocean may have been more important in driving circulation than it is today.

    On Asst. Prof. Clara Blättler’s desk is a pencil holder made from a drill bit used to extract cores of rock from the seafloor, as well as vials of the 20,000-year-old ocean. Photo by Jean Lachat.

    “It’s kind of a nice connection,” she said, “since Cesare Emiliani, who is widely regarded as the father of paleoceanography—reconstructing the ancient ocean—actually wrote his seminal paper on the subject here at the University of Chicago in 1955.”

    Their readings from the water align with predictions based on other evidence—a nice confirmation, Blättler said. The findings may also suggest places to search for other such pockets of ancient water.

    Other co-authors on the paper were from Princeton University and the University of Miami.

    Funding: International Ocean Drilling Program (National Science Foundation, Japan Ministry of Education, Culture, Sports, Science and Technology, European Consortium for Ocean Research Drilling), Simons Foundation.

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

    The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world.

    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.
