Tagged: Applied Research & Technology

  • richardmitnick 8:51 am on August 22, 2019 Permalink | Reply
Tags: Applied Research & Technology

Woods Hole Oceanographic Institution via COSMOS: “Geology creates chemical energy” 

From Woods Hole Oceanographic Institution

    22 August 2019

    Origin of a massive methane reservoir discovered.

    The manipulator arm of the remotely operated vehicle Jason samples a stream of fluid from a hydrothermal vent.
    Chris German/WHOI/NSF, NASA/ROV Jason 2012 / Woods Hole Oceanographic Institution

    Scientists know methane is released from deep-sea vents, but its source has long been a mystery.

    Now a team from Woods Hole Oceanographic Institution, US, may have the answer. Analysis of 160 rock samples from across the world’s oceans provides evidence, they say, of the formation and abundance of abiotic methane – methane formed by chemical reactions that don’t involve organic matter.

Nearly every sample contained an assemblage of minerals and gases that form when seawater, moving through the deep oceanic crust, is trapped in magma-hot olivine, a rock-forming mineral, the researchers write in a paper published in Proceedings of the National Academy of Sciences.

    As the mineral cools, the water trapped inside undergoes a chemical reaction, a process called serpentinisation, which forms hydrogen and methane.
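In rough textbook form (a simplification for illustration, not the specific reaction pathways reported in the paper), serpentinisation can be written as iron-bearing olivine reducing water to hydrogen, with the hydrogen in turn reducing dissolved carbon dioxide to methane:

```latex
% Iron end-member of olivine (fayalite) + water -> magnetite + silica + hydrogen
3\,\mathrm{Fe_2SiO_4} + 2\,\mathrm{H_2O} \;\longrightarrow\; 2\,\mathrm{Fe_3O_4} + 3\,\mathrm{SiO_2} + 2\,\mathrm{H_2}

% The hydrogen then reduces dissolved CO2 abiotically to methane
\mathrm{CO_2} + 4\,\mathrm{H_2} \;\longrightarrow\; \mathrm{CH_4} + 2\,\mathrm{H_2O}
```

No organic matter appears anywhere in these reactions, which is what makes the resulting methane “abiotic.”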

    “Here’s a source of chemical energy that’s being created by geology,” says co-author Jeffrey Seewald.

    On Earth, deep-sea methane might have played a critical role for the evolution of primitive organisms living at hydrothermal vents on the seafloor, Seewald adds. And elsewhere in the solar system, methane produced through the same process could provide an energy source for basic life forms.

See the full article here.


    Please help promote STEM in your local schools.

STEM Education Coalition

Woods Hole Oceanographic Institution

    Vision & Mission

    The ocean is a defining feature of our planet and crucial to life on Earth, yet it remains one of the planet’s last unexplored frontiers. For this reason, WHOI scientists and engineers are committed to understanding all facets of the ocean as well as its complex connections with Earth’s atmosphere, land, ice, seafloor, and life—including humanity. This is essential not only to advance knowledge about our planet, but also to ensure society’s long-term welfare and to help guide human stewardship of the environment. WHOI researchers are also dedicated to training future generations of ocean science leaders, to providing unbiased information that informs public policy and decision-making, and to expanding public awareness about the importance of the global ocean and its resources.
    Mission Statement

    The Woods Hole Oceanographic Institution is dedicated to advancing knowledge of the ocean and its connection with the Earth system through a sustained commitment to excellence in science, engineering, and education, and to the application of this knowledge to problems facing society.

  • richardmitnick 9:42 am on August 21, 2019 Permalink | Reply
Tags: "Technique could make better membranes for next-generation filtration", Applied Research & Technology, T-FLO

    From UCLA Newsroom: “Technique could make better membranes for next-generation filtration” 

    From UCLA Newsroom

    August 20, 2019
    Wayne Lewis

    Media Contact
    Nikki Lin

    UCLA scientists’ method will allow more advanced materials to be used for desalination and other processes.

    UCLA postdoctoral scholar Brian McVerry and doctoral student Mackenzie Anderson examine an ultra-thin membrane film on a glass plate used in the T-FLO process. Marc Roseboro/UCLA

    Deriving drinkable water from seawater, treating wastewater and conducting kidney dialysis are just a few important processes that use a technology called membrane filtration.

    The key to the process is the membrane filter — a thin, semi-porous film that allows certain substances such as water to pass through while separating out other, unwanted substances. But in the past 30 years, there have been no significant improvements in the materials that make up the key layers of commercially produced membrane filters.

    Now, UCLA researchers have developed a new technique called thin-film liftoff, or T-FLO, for creating membrane filters. The approach could offer a way for manufacturers to produce more effective and energy-efficient membranes using high-performance plastics, metal-organic frameworks and carbon materials. To date, limitations in how filters are fabricated have prevented those materials from being viable in industrial production.

    A study describing the work is published in the journal Nano Letters.

    “There are a lot of materials out there that in the lab can do nice separations, but they’re not scalable,” said Richard Kaner, UCLA’s Dr. Myung Ki Hong Professor of Materials Innovation and the study’s senior author. “With this technique, we can take these materials, make thin films that are scalable, and make them useful.”

In addition to their potential for improving types of filtration that are performed using current technology, membranes produced using T-FLO could make possible an array of new forms of filtration, said Kaner, who also is a distinguished professor of chemistry and biochemistry, and of materials science and engineering, and a member of the California NanoSystems Institute at UCLA. For example, the technique might one day make it feasible to pull carbon dioxide out of industrial emissions — which would enable the carbon to be converted to fuel or put to other uses while also reducing pollution.

    Filters like the ones used for desalination are called asymmetric membranes because of their two layers: a thin but dense “active” layer that rejects particles larger than a specific size, and a porous “support” layer that gives the membrane structure and allows it to resist the high pressures used in reverse osmosis and other filtering processes. The first asymmetric membrane for desalination was devised by UCLA engineers in the 1960s.

    Today’s asymmetric membranes are made by casting the active layer onto the support layer, or casting both concurrently. But to manufacture an active layer using more advanced materials, engineers have to use solvents or high heat — both of which damage the support layer or prevent the active layer from adhering.

    In the T-FLO technique, the active layer is cast as a liquid on a sheet of glass or metal and cured to make the active layer solid. Next, a support layer made of epoxy reinforced with fabric is added and the membrane is heated to solidify the epoxy.

    The use of epoxy in the support layer is the innovation that distinguishes the T-FLO technique — it enables the active layer to be created first so that it can be treated with chemicals or high heat without damaging the support layer.

    The membrane then is submerged in water to wash out the chemicals that induce pores in the epoxy and to loosen the membrane from the glass or metal sheet.

Finally, the membrane is peeled off the plate with a blade — the “liftoff” that gives the method its name.

    “Researchers around the world have demonstrated many new exciting materials that can separate salts, gases and organic materials more effectively than is done industrially,” said Brian McVerry, a UCLA postdoctoral scholar who invented the T-FLO process and is the study’s co-first author. “However, these materials are often made in relatively thick films that perform the separations too slowly or in small samples that are difficult to scale industrially.

    “We have demonstrated a platform that we believe will enable researchers to use their new materials in a large, thin, asymmetric membrane configuration, testable in real-world applications.”

    The researchers tested a membrane produced using T-FLO for removing salt from water, and it showed promise for solving one of the common problems in desalination, which is that microbes and other organic material can clog the membranes. Although adding chlorine to the water can kill the microbes, the chemical also causes most membranes to break down. In the study, the T-FLO membrane both rejected the salt and resisted the chlorine.

    In other experiments, the new membrane was also able to remove organic materials from solvent waste and to separate greenhouse gases.

    Mackenzie Anderson, a UCLA doctoral student, is co-first author of the study.

    The research was supported by the U.S./China Clean Energy Research Center for Water-Energy Technologies and the National Science Foundation. The project is aligned with UCLA’s Sustainable LA Grand Challenge.

See the full article here.

    Please help promote STEM in your local schools.

STEM Education Coalition

UCLA Campus

    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

  • richardmitnick 11:31 am on August 20, 2019 Permalink | Reply
Tags: "With open data scientists share their work", Applied Research & Technology, Gran Sasso

    From Symmetry: “With open data, scientists share their work” 

    From Symmetry

    Meredith Fore

    Illustration by Sandbox Studio, Chicago

    There are barriers to making scientific data open, but doing so has already contributed to scientific progress.

It could be said that astronomy, one of the oldest sciences, was one of the first fields to have open data. The open records of Chinese astronomers from 1054 A.D. allowed astronomer Carl Otto Lampland to identify the Crab Nebula as the remnant of a supernova in 1921.

    Supernova remnant Crab nebula. NASA/ESA Hubble

In 1705 Edmond Halley used the previous observations of Johannes Kepler and Petrus Apianus—who did their work before Halley was old enough to use a telescope—to deduce the orbit of his eponymous comet.

    Comet 1P/Halley as taken March 8, 1986 by W. Liller, Easter Island, part of the International Halley Watch (IHW) Large Scale Phenomena Network.
    NASA/W. Liller

    In science, making data open means making available, free of charge, the observations or other information collected in a scientific study for the purpose of allowing other researchers to examine it for themselves, either to verify it or to conduct new analyses.

Scientists continue to use open data to make new discoveries today. In 2010, a team of scientists led by Professor Doug Finkbeiner at Harvard University found vast gamma-ray bubbles above and below the Milky Way. The accomplishment was compared to the discovery of a new continent on Earth. The scientists didn’t find the bubbles by making their own observations; they did it by analyzing publicly available data from the Fermi Gamma-ray Space Telescope.

    NASA/Fermi LAT

    NASA/Fermi Gamma Ray Space Telescope

    “Open data often can be used to answer other kinds of questions that the people who collected the data either weren’t interested in asking, or they just never thought to ask,” says Kyle Cranmer, a professor at New York University. By making scientific data available, “you’re enabling a lot of new science by the community to go forward in a more efficient and powerful way.”

    Cranmer is a member of ATLAS, one of the two general-purpose experiments that, among other things, co-discovered the Higgs boson at the Large Hadron Collider at CERN.

    CERN ATLAS Image Claudia Marcelloni

    CERN ATLAS Higgs Event

He and other CERN researchers recently published a letter in Nature Physics titled “Open is not enough,” which shares lessons learned about providing open data in high-energy physics. The CERN Open Data Portal, which facilitates public access to datasets from CERN experiments, now contains more than two petabytes of information.

    Computing at CERN

    The fields of both particle physics and astrophysics have seen rapid developments in the use and spread of open data, says Ulisses Barres, an astrophysicist at the Brazilian Center for Research in Physics. “Astronomy is going to, in the next decade, increase the amount of data that it produces by a factor of hundreds,” he says. “As the amount of data grows, there is more pressure for increasing our capacity to convert information into knowledge.”

The Square Kilometre Array—built in Australia and South Africa and set to turn on in the 2020s—is expected to produce about 600 terabytes of data per year.

    SKA Square Kilometer Array

    SKA South Africa

    Raw data from studies conducted during the site selection process are already available on the SKA website, with a warning that “these files are very large indeed, and before you download them you should check whether your local file system will be able to handle them.”

    Barres sees the growth in open data as an opportunity for developing nations to participate in the global science community in new ways. He and a group of fellow astrophysicists helped develop something called the Open Universe Initiative “with the objective of stimulating a dramatic increase in the availability and usability of space science data, extending the potential of scientific discovery to new participants in all parts of the world and empowering global educational services.”

    The initiative, proposed by the government of Italy, is currently in the “implementation” phase within the United Nations Office for Outer Space Affairs.

    “I think that data is this proper entry point for science development in places that don’t have much science developed yet,” Barres says. “Because it’s there, it’s available, there is much more data than we can properly analyze.”

There are barriers to implementing open data. One is the concept of ownership—a lab might not want to release data that it could use for another project, or it might worry about proper credit and attribution. Another is the natural human fear of being accused of being wrong or having your data used irresponsibly.

    But one of the biggest barriers, according to physics professor Jesse Thaler of MIT, is making the data understandable. “From the user perspective, every single aspect of using public data is challenging,” Thaler says.

    Think of a high school student’s chemistry lab notebook. A student might mark certain measurements in her data table with a star, to remind herself that she used a different instrument to take those measurements. Or she may use acronyms to name different samples. Unless she writes these schemes down, another student wouldn’t know the star’s significance and wouldn’t be able to know what the samples were.

    This has been a challenge for the CERN Open Data Portal, Cranmer says. “It’s very well curated, but it’s hard to use, because the data has got a lot of structure to it. It’s very complicated. You have to put additional effort to make it more usable.”

And for a lot of scientists already working to manage gigantic projects, doing extra work to make their data usable to outside groups—well, “that’s just not mission critical,” he says. But Thaler adds that the CMS experiment has been very responsive to the needs of outside users.

    CERN CMS Higgs Event

    “Figuring out how to release data is challenging because you want to provide as much relevant information to outside users as possible,” Thaler says. “But it’s often not obvious, until outside users actually get their hands on the data, what information is relevant.”

Still, there are many examples of open data benefiting astrophysics and particle physics. Members of the wider scientific community have discovered exoplanets through public data from the Kepler Space Telescope. When the Gaia spacecraft mapped the positions of 1.7 billion stars and released them as open data, scientists flocked to hackathons hosted by the Flatiron Institute to interpret it, producing about 20 papers’ worth of research.

    Open data policies have allowed for more accountability. The physics community was able to thoroughly check data from the first black hole collisions detected by LIGO and question a proposed dark-matter signal from the DAMA/LIBRA experiment.

    DAMA-LIBRA at Gran Sasso

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    Open data has also allowed for new collaborations and has nourished existing ones. Thaler, who is a theorist, says the dialogue between experimentalists and theorists has always been strong, but “open data is an opportunity to accelerate that conversation,” he says.

    For Cari Cesarotti, a graduate student who uses CMS Open Data for research in particle physics theory at Harvard, one of the most important benefits of open data is how it maximizes the scientific value of data experimentalists have to work very hard to obtain.

    “Colliders are really expensive and quite laborious to build and test,” she says. “So the more that we can squeeze out utility using the tools that we already have—to me, that’s the right thing to do, to try to get as much mileage as we possibly can out of the data set.”

See the full article here.


    Please help promote STEM in your local schools.

STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 8:36 am on August 20, 2019 Permalink | Reply
Tags: A heat shield just 10 atoms thick, Applied Research & Technology

    From Stanford University: “Stanford researchers build a heat shield just 10 atoms thick to protect electronic devices” 

    From Stanford University

    August 16, 2019
    Tom Abate

    This greatly magnified image shows four layers of atomically thin materials that form a heat-shield just two to three nanometers thick, or roughly 50,000 times thinner than a sheet of paper. (Image credit: National Institute of Standards and Technology)

    Excess heat given off by smartphones, laptops and other electronic devices can be annoying, but beyond that it contributes to malfunctions and, in extreme cases, can even cause lithium batteries to explode.

    To guard against such ills, engineers often insert glass, plastic or even layers of air as insulation to prevent heat-generating components like microprocessors from causing damage or discomforting users.

    Now, Stanford researchers have shown that a few layers of atomically thin materials, stacked like sheets of paper atop hot spots, can provide the same insulation as a sheet of glass 100 times thicker. In the near term, thinner heat shields will enable engineers to make electronic devices even more compact than those we have today, said Eric Pop, professor of electrical engineering and senior author of a paper published Aug. 16 in Science Advances.

    “We’re looking at the heat in electronic devices in an entirely new way,” Pop said.

    Detecting sound as heat

    The heat we feel from smartphones or laptops is actually an inaudible form of high-frequency sound. If that seems crazy, consider the underlying physics. Electricity flows through wires as a stream of electrons. As these electrons move, they collide with the atoms of the materials through which they pass. With each such collision an electron causes an atom to vibrate, and the more current flows, the more collisions occur, until electrons are beating on atoms like so many hammers on so many bells – except that this cacophony of vibrations moves through the solid material at frequencies far above the threshold of hearing, generating energy that we feel as heat.

    Thinking about heat as a form of sound inspired the Stanford researchers to borrow some principles from the physical world. From his days as a radio DJ at Stanford’s KZSU 90.1 FM, Pop knew that music recording studios are quiet thanks to thick glass windows that block the exterior sound. A similar principle applies to the heat shields in today’s electronics. If better insulation were their only concern, the researchers could simply borrow the music studio principle and thicken their heat barriers. But that would frustrate efforts to make electronics thinner. Their solution was to borrow a trick from homeowners, who install multi-paned windows – usually, layers of air between sheets of glass with varying thickness – to make interiors warmer and quieter.

    “We adapted that idea by creating an insulator that used several layers of atomically thin materials instead of a thick mass of glass,” said postdoctoral scholar Sam Vaziri, the lead author on the paper.

    Atomically thin materials are a relatively recent discovery. It was only 15 years ago that scientists were able to isolate some materials into such thin layers. The first example discovered was graphene, which is a single layer of carbon atoms and, ever since it was found, scientists have been looking for, and experimenting with, other sheet-like materials. The Stanford team used a layer of graphene and three other sheet-like materials – each three atoms thick – to create a four-layered insulator just 10 atoms deep. Despite its thinness, the insulator is effective because the atomic heat vibrations are dampened and lose much of their energy as they pass through each layer.
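The multi-pane window analogy can be sketched numerically: heat crossing a layered stack meets the bulk resistance of each layer in series with a thermal boundary resistance at every interface, and for atomically thin layers the interfaces dominate. All numerical values below (layer thickness and conductivity, boundary resistance, glass conductivity) are illustrative assumptions for the sketch, not parameters from the paper.

```python
# Sketch: why a few atomically thin layers can insulate like much thicker glass.
# Heat crossing a stack meets two kinds of series resistance:
#   - each layer's bulk resistance, t / k  (thickness / thermal conductivity)
#   - a thermal boundary resistance (TBR) at every interface, where atomic
#     vibrations are damped and partially reflected, as described above.

def stack_resistance(layers, tbr_per_interface):
    """Series thermal resistance (m^2*K/W) of a layered stack.

    layers: list of (thickness_m, conductivity_W_per_mK) tuples.
    tbr_per_interface: boundary resistance at each of the len(layers)+1
    interfaces (substrate/layer, layer/layer, ..., layer/top).
    """
    bulk = sum(t / k for t, k in layers)
    interfaces = (len(layers) + 1) * tbr_per_interface
    return bulk + interfaces

nm = 1e-9
# Four sheet-like layers, each a few atoms (~0.3 nm) thick -- assumed values.
stack = [(0.3 * nm, 5.0)] * 4
tbr = 3e-8  # assumed TBR per interface, m^2*K/W

r_stack = stack_resistance(stack, tbr)

# Equivalent glass slab with the same resistance: t = R * k_glass,
# taking k_glass ~ 1 W/(m*K) for ordinary glass.
k_glass = 1.0
equivalent_glass_thickness = r_stack * k_glass

print(f"stack resistance: {r_stack:.2e} m^2*K/W")
print(f"equivalent glass thickness: {equivalent_glass_thickness / nm:.0f} nm")
```

With these assumed numbers, a 1.2 nm stack matches a glass slab roughly a hundred times thicker — the interfaces, not the layers themselves, do almost all the insulating.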

    To make nanoscale heat shields practical, the researchers will have to find some mass production technique to spray or otherwise deposit atom-thin layers of materials onto electronic components during manufacturing. But behind the immediate goal of developing thinner insulators looms a larger ambition: Scientists hope to one day control the vibrational energy inside materials the way they now control electricity and light. As they come to understand the heat in solid objects as a form of sound, a new field of phononics is emerging, a name taken from the Greek root word behind telephone, phonograph and phonetics.

    “As engineers, we know quite a lot about how to control electricity, and we’re getting better with light, but we’re just starting to understand how to manipulate the high-frequency sound that manifests itself as heat at the atomic scale,” Pop said.

    Eric Pop is an affiliate of the Precourt Institute for Energy. Stanford authors include former postdoctoral scholars Eilam Yalon and Miguel Muñoz Rojo, and graduate students Connor McClellan, Connor Bailey, Kirby Smithe, Alexander Gabourie, Victoria Chen, Sanchit Deshmukh and Saurabh Suryavanshi. Other authors are from Theiss Research and the National Institute of Standards and Technology.

    This research was supported by the Stanford Nanofabrication Facility, the Stanford Nano Shared Facilities, the National Science Foundation, the Semiconductor Research Corporation, the Defense Advanced Research Projects Agency, the Air Force Office of Scientific Research, the Stanford SystemX Alliance, the Knut and Alice Wallenberg Foundation, the Stanford Graduate Fellowship program and the National Institute of Standards and Technology.

See the full article here.

    Please help promote STEM in your local schools.

STEM Education Coalition

    Stanford University campus. No image credit

    Stanford University

Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

  • richardmitnick 12:22 pm on August 19, 2019 Permalink | Reply
Tags: Applied Research & Technology

    From Imperial College London: “Lab-based dark energy experiment narrows search options for elusive force” 

    From Imperial College London

    19 August 2019
    Hayley Dunning

    No image caption or credit.

    An experiment to test a popular theory of dark energy has found no evidence of new forces, placing strong constraints on related theories.

    Dark energy is the name given to an unknown force that is causing the universe to expand at an accelerating rate.

    Some physicists propose dark energy is a ‘fifth’ force that acts on matter, beyond the four already known – gravitational, electromagnetic, and the strong and weak nuclear interactions.

    However, researchers think this fifth force may be ‘screened’ or ‘hidden’ for large objects like planets or weights on Earth, making it difficult to detect.

    Now, researchers at Imperial College London and the University of Nottingham have tested the possibility that this fifth force is acting on single atoms, and found no evidence for it in their most recent experiment.

    This could rule out popular theories of dark energy that modify the theory of gravity, and leaves fewer places to search for the elusive fifth force.

    Finding the fifth force

    The experiment, performed at Imperial College London and analysed by theorists at the University of Nottingham, is reported today in Physical Review Letters.

    Professor Ed Copeland, from the Centre for Astronomy & Particle Physics at the University of Nottingham, said: “This experiment, connecting atomic physics and cosmology, has allowed us to rule out a wide class of models that have been proposed to explain the nature of dark energy, and will enable us to constrain many more dark energy models.”

    The experiment tested theories of dark energy that propose the fifth force is comparatively weaker when there is more matter around – the opposite of how gravity behaves.

This would mean the force is strong in a vacuum like space but weak when there is lots of matter around. In experiments using two large weights, therefore, the force becomes too weak to measure.

    Experiment with a single atom

The researchers instead paired a large weight with an incredibly small one – a single atom – where the force should have been observed if it exists.

    The team used an atom interferometer to test whether there were any extra forces that could be the fifth force acting on an atom. A marble-sized sphere of metal was placed in a vacuum chamber and atoms were allowed to free-fall inside the chamber.

The theory is that if a fifth force acts between the sphere and the atom, the atom’s path will deviate slightly as it passes by the sphere. However, no such deviation was found.
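The sensitivity of this kind of measurement can be sketched with the standard atom-interferometer phase formula, phi = k_eff * a * T^2, where k_eff is the effective two-photon wavevector of the laser pulses and T the time between pulses. The laser wavelength, pulse separation, and test acceleration below are generic textbook assumptions, not the Imperial experiment’s actual parameters.

```python
import math

# Mach-Zehnder atom interferometer: a uniform acceleration a imprints a
# phase shift phi = k_eff * a * T**2 between the two arms.
wavelength = 780e-9                      # rubidium D2 line, a common choice (assumed)
k_eff = 2 * (2 * math.pi / wavelength)   # two-photon kick: twice the photon wavevector
T = 0.1                                  # 100 ms between pulses (assumed)

def phase_shift(a):
    """Interferometer phase (radians) from a uniform acceleration a (m/s^2)."""
    return k_eff * a * T**2

# A hypothetical anomalous acceleration a million times weaker than gravity:
a_fifth = 9.8e-6
print(f"phase shift: {phase_shift(a_fifth):.2f} rad")
```

Even an acceleration a million times weaker than gravity produces a phase shift of order one radian with these parameters, which is why free-falling single atoms make such sensitive probes of a putative fifth force.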

    Professor Ed Hinds, from the Department of Physics at Imperial, said: “It is very exciting to be able to discover something about the evolution of the universe using a table-top experiment in a London basement.”

See the full article here.


    Please help promote STEM in your local schools.

STEM Education Coalition

    Imperial College London

    Imperial College London is a science-based university with an international reputation for excellence in teaching and research. Consistently rated amongst the world’s best universities, Imperial is committed to developing the next generation of researchers, scientists and academics through collaboration across disciplines. Located in the heart of London, Imperial is a multidisciplinary space for education, research, translation and commercialisation, harnessing science and innovation to tackle global challenges.

  • richardmitnick 10:09 am on August 19, 2019 Permalink | Reply
Tags: "Ocean warming has fisheries on the move helping some but hurting more", Applied Research & Technology

    From The Conversation: “Ocean warming has fisheries on the move, helping some but hurting more” 

    From The Conversation

    August 19, 2019
    Chris Free, UCSB

    Atlantic Cod on Ice. Alamy. Cod fisheries in the North Sea and Irish Sea are declining due to overfishing and climate change.

For the past 100 years, climate change has been steadily warming the ocean, which absorbs most of the heat trapped by greenhouse gases in the atmosphere. This warming is altering marine ecosystems and having a direct impact on fish populations. About half of the world’s population relies on fish as a vital source of protein, and the fishing industry employs more than 56 million people worldwide.

    My recent study [Science] with colleagues from Rutgers University and the U.S. National Oceanic and Atmospheric Administration found that ocean warming has already impacted global fish populations. We found that some populations benefited from warming, but more of them suffered.


    Overall, ocean warming reduced catch potential – the greatest amount of fish that can be caught year after year – by a net 4% over the past 80 years. In some regions, the effects of warming have been much larger. The North Sea, which has large commercial fisheries, and the seas of East Asia, which support some of the fastest-growing human populations, experienced losses of 15% to 35%.

    The reddish and brown circles represent fish populations whose maximum sustainable yields have dropped as the ocean has warmed. The darkest tones represent extremes of 35 percent. Blueish colors represent fish yields that increased in warmer waters. Chris Free, CC BY-ND

    Although ocean warming has already challenged the ability of ocean fisheries to provide food and income, swift reductions in greenhouse gas emissions and reforms to fisheries management could lessen many of the negative impacts of continued warming.

    How and why does ocean warming affect fish?

    My collaborators and I like to say that fish are like Goldilocks: They don’t want their water too hot or too cold, but just right.

    Put another way, most fish species have evolved narrow temperature tolerances. Supporting the cellular machinery necessary to tolerate wider temperatures demands a lot of energy. This evolutionary strategy saves energy when temperatures are “just right,” but it becomes a problem when fish find themselves in warming water. As their bodies begin to fail, they must divert energy from searching for food or avoiding predators to maintaining basic bodily functions and searching for cooler waters.

    Thus, as the oceans warm, fish move to track their preferred temperatures. Most fish are moving poleward or into deeper waters. For some species, warming expands their ranges. In other cases it contracts their ranges by reducing the amount of ocean they can thermally tolerate. These shifts change where fish go, their abundance and their catch potential.

    Warming can also modify the availability of key prey species. For example, if warming causes zooplankton – small invertebrates at the bottom of the ocean food web – to bloom early, they may not be available when juvenile fish need them most. Alternatively, warming can sometimes enhance the strength of zooplankton blooms, thereby increasing the productivity of juvenile fish.

    Understanding how the complex impacts of warming on fish populations balance out is crucial for projecting how climate change could affect the ocean’s potential to provide food and income for people.


    Impacts of historical warming on marine fisheries

    Sustainable fisheries are like healthy bank accounts. If people live off the interest and don’t overly deplete the principal, both people and the bank thrive. If a fish population is overfished, the population’s “principal” shrinks too much to generate high long-term yields.

    Similarly, stresses on fish populations from environmental change can reduce population growth rates, much as an interest rate reduction reduces the growth rate of savings in a bank account.

    In our study we combined maps of historical ocean temperatures with estimates of historical fish abundance and exploitation. This allowed us to assess how warming has affected those interest rates and returns from the global fisheries bank account.
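The actual study fits far more detailed population models to decades of survey and catch data, but the bank-account analogy can be sketched with a toy surplus-production (Schaefer) model, in which biomass B grows as dB/dt = rB(1 − B/K) and the maximum sustainable yield is MSY = rK/4. The growth rate r, carrying capacity K, and the warming-induced drop in r below are all hypothetical numbers, chosen only to show the mechanism.

```python
# Toy Schaefer surplus-production model: sustainable catch as "interest"
# on the biomass "principal". All parameter values are hypothetical.

def msy(r, K):
    """Maximum sustainable yield of dB/dt = r*B*(1 - B/K), taken at B = K/2."""
    return r * K / 4

K = 100_000                      # carrying capacity (metric tons), hypothetical
baseline = msy(r=0.30, K=K)      # growth rate in a cooler ocean
warmed = msy(r=0.27, K=K)        # a 10% drop in growth rate under warming

loss_pct = round((1 - warmed / baseline) * 100)
print(round(baseline), round(warmed), loss_pct)   # 7500 6750 10
```

In this simple model a proportional cut in the population’s growth rate translates directly into the same proportional cut in sustainable catch, which is why warming-driven stress on the “interest rate” matters even when the “principal” is never overfished.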

    Losers outweigh winners

    We found that warming has damaged some fisheries and benefited others. The losers outweighed the winners, resulting in a net 4% decline in sustainable catch potential over the last 80 years. This represents a cumulative loss of 1.4 million metric tons previously available for food and income.

    Some regions have been hit especially hard. The North Sea, with large commercial fisheries for species like Atlantic cod, haddock and herring, has experienced a 35% loss in sustainable catch potential since 1930. The waters of East Asia, bordered by some of the fastest-growing human populations in the world, saw losses of 8% to 35% across three seas.

    Other species and regions benefited from warming. Black sea bass, a popular species among recreational anglers on the U.S. East Coast, expanded its range and catch potential as waters previously too cool for it warmed. In the Baltic Sea, juvenile herring and sprat – another small herring-like fish – have more food available to them in warm years than in cool years, and have also benefited from warming. However, these climate winners can tolerate only so much warming, and may see declines as temperatures continue to rise.

    Shucking scallops in Maine, where fishery management has kept scallop numbers sustainable. Robert F. Bukaty/AP

    Management boosts fishes’ resilience

    Our work suggests three encouraging pieces of news for fish populations.

    First, well-managed fisheries, such as Atlantic scallops on the U.S. East Coast, were among the most resilient to warming. Others with a history of overfishing, such as Atlantic cod in the Irish and North seas, were among the most vulnerable. These findings suggest that preventing overfishing and rebuilding overfished populations will enhance resilience and maximize long-term food and income potential.

    Second, new research suggests that swift climate-adaptive management reforms can make it possible for fish to feed humans and generate income into the future. This will require scientific agencies to work with the fishing industry on new methods for assessing fish populations’ health, to set catch limits that account for the effects of climate change, and to establish new international institutions that ensure management remains strong as fish migrate poleward from one nation’s waters into another’s. These institutions would be similar to the multinational organizations that manage tuna, swordfish and marlin today.

    Finally, nations will have to aggressively curb greenhouse gas emissions. Even the best fishery management reforms will be unable to compensate for the 4 degree Celsius ocean temperature increase that scientists project will occur by the end of this century if greenhouse gas emissions are not reduced.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Conversation launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

  • richardmitnick 9:15 am on August 19, 2019 Permalink | Reply
    Tags: "Brookhaven Completes LSST's Digital Sensor Array", Applied Research & Technology, , , , , , ,   

    From Brookhaven National Lab: “Brookhaven Completes LSST’s Digital Sensor Array” 

    From Brookhaven National Lab

    August 19, 2019

    Stephanie Kossman
    (631) 344-8671

    Peter Genzer,
    (631) 344-3174

    Brookhaven National Lab has finished constructing the 3.2 gigapixel “digital film” for the world’s largest camera for cosmology, physics, and astronomy.

    SLAC National Accelerator Laboratory installs the first of Brookhaven’s 21 rafts that make up LSST’s digital sensor array. Photo courtesy SLAC National Accelerator Laboratory.

    After 16 years of dedicated planning and engineering, scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have completed a 3.2 gigapixel sensor array for the camera that will be used in the Large Synoptic Survey Telescope (LSST), a massive telescope that will observe the universe like never before.


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    “This is the biggest charge-coupled device (CCD) array that has ever been built,” said Paul O’Connor, senior scientist at Brookhaven Lab’s instrumentation division. “It’s three billion pixels. No telescope has ever put this many sensors into one camera.”

    The digital sensor array is composed of about 200 16-megapixel sensors, divided into 21 modules called “rafts.” Each raft can function on its own, but when combined, they will view an area of sky that can fit more than 40 full moons in a single image. Researchers will stitch these images together to create a time-lapse movie of the complete visible universe accessible from Chile.
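As a back-of-envelope check on those numbers: assuming the widely cited focal-plane layout of 9 CCDs per science raft with 4096 × 4096 pixels per CCD (details not stated in the article), the raft count and per-sensor resolution reproduce the advertised 3.2 gigapixels.

```python
# Sanity-check the LSST focal-plane pixel count from the article's figures.
# Layout assumptions (9 CCDs per raft, 4k x 4k pixels per CCD) are the
# commonly cited design values, not taken from the article itself.

rafts = 21
ccds_per_raft = 9                  # gives 189 sensors, i.e. "about 200"
pixels_per_ccd = 4096 * 4096       # ~16.8 million: the "16-megapixel" sensors

total_pixels = rafts * ccds_per_raft * pixels_per_ccd
print(f"{total_pixels:,} pixels")              # 3,170,893,824 pixels
print(f"~{total_pixels / 1e9:.1f} gigapixels") # ~3.2 gigapixels
```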

    Currently under construction on a mountaintop in Chile, LSST is designed to capture the most complete images of our universe that have ever been achieved. The project to build the telescope facility and camera is a collaborative effort among more than 30 institutions from around the world, and it is primarily funded by DOE’s Office of Science and the National Science Foundation. DOE’s SLAC National Accelerator Laboratory is leading the overall effort to construct the camera—the world’s largest camera for astronomy—while Brookhaven led the design, construction, and qualification of the digital sensor array—the “digital film” for the camera.

    “It’s the heart of the camera,” said Bill Wahl, science raft subsystem manager of the LSST project at Brookhaven Lab. “What we’ve done here at Brookhaven represents years of great work by many talented scientists, engineers, and technicians. Their work will lead to a collection of images that has never been seen before by anyone. It’s an exciting time for the project and for the Lab.”

    Members of the LSST project team at Brookhaven Lab are shown with a prototype raft cryostat. In addition to the rafts, Brookhaven scientists designed and built the cryostats that hold and cool the rafts to -100° Celsius.

    Brookhaven began its LSST research and development program in 2003, with construction of the digital sensor array starting in 2014. In the time leading up to construction, Brookhaven designed and fabricated the assembly and test equipment for the science rafts used both at Brookhaven and SLAC. The Laboratory also created an entire automated production facility and cleanroom, along with production and tracking software.

    “We made sure to automate as much of the production facility as possible,” O’Connor said. “Testing a single raft could take up to three days. We were working on a tight schedule, so we had our automated facility running 24/7. Of course, out of a concern for safety, we always had someone monitoring the facility throughout the day and night.”

    Constructing the complex sensor array, which operates in a vacuum and must be cooled to -100° Celsius, is a challenge on its own. But the Brookhaven team was also tasked with testing each fully assembled raft, as well as individual sensors and electronics. Once each raft was complete, it needed to be carefully packaged in a protective environment to be safely shipped across the country to SLAC.

    The LSST team at Brookhaven completed the first raft in 2017. But soon after, they were presented with a new challenge.

    “We later discovered that design features inadvertently led to the possibility that electrical wires in the rafts could get shorted out,” O’Connor said. “The rate at which this effect was impacting the rafts was only on the order of 0.2%, but to avoid any possibility of degradation, we went through the trouble of refitting almost every raft.”

    Now, just two years after the start of raft production, the team has successfully built and shipped the final raft to SLAC for integration into the camera. This marks the end of a 16-year project at Brookhaven, which will be followed by many years of astronomical observation.

    Many of the talented team members recruited to Brookhaven for the LSST project were young engineers and technicians hired right out of graduate school. Now, they’ve all been assigned to ongoing physics projects at the Lab, such as upgrading the PHENIX detector at the Relativistic Heavy Ion Collider—a DOE Office of Science User Facility for nuclear physics research—to sPHENIX, as well as ongoing work with the ATLAS detector at CERN’s Large Hadron Collider. Brookhaven is the U.S. host laboratory for the ATLAS collaboration.

    CERN ATLAS Image Claudia Marcelloni

    “Brookhaven’s role in the LSST camera project afforded new and exciting opportunities for engineers, technicians, and scientists in electro-optics, where very demanding specifications must be met,” Wahl said. “The multi-disciplined team we assembled did an excellent job achieving design objectives and I am proud of our time together. Watching junior engineers and scientists grow into very capable team members was extremely rewarding.”

    Brookhaven Lab will continue to play a strong role in LSST going forward. As the telescope undergoes its commissioning phase, Brookhaven scientists will serve as experts on the digital sensor array in the camera. They will also provide support during LSST’s operations, which are projected to begin in 2022.


    “The commissioning of such a complex camera will be an exciting and challenging endeavor,” said Brookhaven physicist Andrei Nomerotski, who is leading Brookhaven’s contributions to the commissioning and operation phases of the LSST project. “After years of using artificial signal sources for the sensor characterization, we are looking forward to seeing real stars and galaxies in the LSST CCDs.”

    Once operational in the Andes Mountains, LSST will serve nearly every subset of the astrophysics community. Perhaps most importantly, LSST will enable scientists to investigate dark energy and dark matter—two puzzles that have baffled physicists for decades. It is also estimated that LSST will find millions of asteroids in our solar system, in addition to offering new information about the creation of our galaxy. The images captured by LSST will be made available to physicists and astronomers in the U.S. and Chile immediately, making LSST one of the most advanced and accessible cosmology experiments ever created. Over time, the data will be made available to the public worldwide.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition



    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 8:56 am on August 19, 2019 Permalink | Reply
    Tags: Applied Research & Technology, , Geomorphology, Maya Stokes, , Stokes focuses on specific pathways — freshwater environments — and the interplay of biology and streams has some dynamic features., Stokes investigates how related fish are to one another in the United States., The intersection of geology and evolutionary biology., Ultimate frisbee coach,   

    From MIT News: Women in STEM – “From streams to teams,” Maya Stokes 


    From MIT News

    August 18, 2019
    Laura Carter

    Geomorphology graduate student Maya Stokes performed fieldwork in the Chilean Altiplano in 2016. She assisted fellow PhD student Christine Y. Chen with her thesis work studying the history of lakes and the paleoclimate of South America. Photo courtesy of Christine Y. Chen.

    Sampling fish to learn about their response to riverine evolution in the Pigeon River, graduate student Maya Stokes (left) and her research advisor, Taylor Perron (right), are accompanied by biologists from the Tennessee Valley Authority. Photo courtesy of Sean Gallen/Colorado State University.

    As coach of the MIT women’s ultimate frisbee team, Maya Stokes (right) instructs player Saroja Erabelli at a tournament in Texas. Photo courtesy of Yang Zhong.

    “I love the intellectual freedom that’s been awarded to me [at MIT],” says Maya Stokes. “I think that the culture of intellectual independence is strong at MIT, and it’s very motivating to be around.” Courtesy of the MIT Martin Fellows.

    Graduate student Maya Stokes, a geomorphology expert and ultimate frisbee coach, shows her passion for teaching in the field and on the field.

    If you’ve ever looked out the window of an airplane, you might have seen beautiful meandering and braided river systems cutting their way through the Earth. Fly over that same area again a few years later, and you’ll witness a different landscape. Geomorphology, the study of how the Earth’s surface is shaped and evolves, concerns some of the most rapid processes on geologic timescales.

    “You can observe changes in the paths that rivers take or landslides that dramatically alter hillslopes in a human lifetime. Many geologic processes don’t allow you that opportunity,” says Maya Stokes, a fourth-year graduate student in the Department of Earth, Atmospheric and Planetary Sciences (EAPS) who researches rivers.

    Stokes wasn’t always interested in geomorphology, although her love for the outdoors stems from a childhood in Colorado. She entered Rice University in Houston with an interest in science and spent some time as an undergraduate trying out different fields. Fascinated by the history of the Earth and life on it, she narrowed her search down to Earth science and ecology and evolutionary biology. A class on geomorphology won her over. Being able to pursue a career that allowed her to work outside was also an enticing perk.

    At MIT, Stokes now conducts research with Taylor Perron, associate department head of EAPS and associate professor of geology at MIT, who is an expert in riverine erosion in mountains. She also collaborates with Tom Near, an evolutionary biologist at Yale University, enabling her to combine her two areas of interest. Her research focus lies at the intersection of geology and evolutionary biology. While exploring how rivers evolve over time, she simultaneously investigates how the ecosystems within those rivers evolve in response.

    You can think of it like two carloads of people on a road trip. One car crosses a bridge toward a major metropolis, but shortly after, construction closes the bridge and forms a detour sending the second car traveling through a rural farmland. Those two carloads of people will have different experiences, different meals and lodging, that are unique to their car’s particular pathway.

    Stokes focuses on specific pathways — freshwater environments — where the interplay of biology and streams produces some notably dynamic features. “As shown by the recent UN report, understanding and maintaining biodiversity is a high priority goal for building a sustainable future on Earth,” she says in reference to the 2019 global assessment report conducted by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services.

    To get more hands-on, Stokes investigates how closely related fish populations across the United States are to one another. She collects both genetic and geologic datasets, processed with the help of a University of Massachusetts at Amherst geochemistry lab run by Isaac Larsen. She has been on three trips to collect data, mostly in the Appalachians, a location of which she’s grown fond, because, she explains, “The topography is rugged, the streams are clear and beautiful, and the landscape is saturated with life.”

    Narrowing in on the Tennessee River, Stokes and her collaborators are observing how several populations of the greenfin darter (Nothonotus chlorobranchius) have been separated, possibly as a result of knickpoints, or sharp changes in channel slope. Last year, she published a paper in Geophysical Research Letters that predicts a rerouting of the upper Rio Orinoco into the Rio Negro in the Amazon River basin, which is summarized in a blog post on the website of the American Geophysical Union.

    “Stokes’ ambitious project requires a blend of versatility, creativity, determination and intellectual fearlessness. I think she has that rare combination of talents,” says Perron. In order to explore the scope of her research fully, Stokes expanded her resources beyond MIT, successfully applying for funding to take short courses and field courses to achieve her research goals.

    “I love the intellectual freedom that’s been awarded to me [at MIT]. It’s made my PhD feel authentic, exciting, and very much mine. I think that the culture of intellectual independence is strong at MIT, and it’s very motivating to be around,” says Stokes. She’s grateful to have received research support from MIT’s Office of Graduate Education as a Hugh Hampton Young Fellow and through a fellowship from the MIT Martin Family Society of Fellows for Sustainability.

    Hoping to continue to investigate these questions long after her PhD, Stokes plans to become a professor of the history of the Earth and how it influences the evolution of life. MIT has provided Stokes the opportunity to build her teaching skills as a teaching assistant for incoming undergraduates at Yellowstone National Park on four occasions. Explaining the volcanic and natural history of the area, she reveled in the chance to entice new students to delve into the study of the wonderful and constantly evolving Earth. Stokes was recognized with an Award for Excellence in Teaching in EAPS earlier this year.

    Stokes’s leadership skills also led her to serve as president for the EAPS Student Advisory Council (ESAC), and to help start an initiative for a universal first-year course for all EAPS graduate students. She also worked on an initiative started by her fellow EAPS graduate student Eva Golos to allow students to provide input on faculty searches. Recently, she was honored at the MIT Office of Graduate Education’s 2019 celebration of Graduate Women of Excellence, nominated by her peers and one of three in EAPS selected based on “their exemplary leadership through example and action, service to the Institute, their dedication to mentoring and their drive to make changes to improve the student experience.” When not on trips to muddy waters, Stokes regularly joins EAPS post-work gatherings with trips to the Muddy Charles, MIT’s on-campus bar, forging deep friendships.

    Even outside the realm of Earth science, Stokes still manages to spend much of her time outdoors and teaching. She coaches the women’s ultimate frisbee team at MIT and plays on regionally competitive teams in the Boston area. “It’s also allowed me to interact with undergraduate students at MIT through coaching which helps me feel more tapped into the MIT community at large. I’ve learned a lot about teamwork, leadership, and teaching from the sport,” she says.

    Stokes’ advisor speculates that she will continue to stand out after she graduates with her doctorate from MIT. “She has demonstrated strong commitments to teaching undergraduates and communicating science to the public,” says Perron. “I expect that she will be a leading researcher in science working at the intersection of the physical environment and biological diversity.”

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

  • richardmitnick 8:32 am on August 19, 2019 Permalink | Reply
    Tags: 2019 RoboCup Millennium Challenge, According to IDC the global robotics market was worth $151 billion in 2018 and that’s expected to double to $315.5 billion by 2021., Applied Research & Technology, ,   

    From CSIROscope- “Cashing in: Australia’s role in $1trn robotic revolution” 

    CSIRO bloc

    From CSIROscope

    19 August 2019
    Adrian Turner

    Fifteen international teams from Australia, Brazil, China, Germany, Iran, Japan and Portugal recently descended on Sydney for the 2019 RoboCup Millennium Challenge. Eleven fully autonomous virtual robots known as “agents” played as part of each team, without remote-control assistance and in compliance with FIFA rules. The nail-biting final came down to the wire, with an Australian team emerging victorious over the 2018 world champions with seconds to spare.

    But this was more than a game: it highlighted Australia’s strengths in robotics and the speed with which the field is evolving.

    Robots of the team NomadZ (ETH Zurich, Switzerland), first and second from left, and the Australian Runswift team (University of New South Wales), right, challenge for the ball during a soccer match.

    According to IDC the global robotics market was worth $151 billion in 2018, and that’s expected to double to $315.5 billion by 2021. Robots are used today in wide-ranging fields such as precision agriculture, mining, medical procedures, construction, biosecurity, transportation and even for companionship.

    Advancements in robotics have been accompanied by a fear that robots and automation will take our jobs along the way. While there are short-term risks with forecasts of 40 per cent of jobs potentially being displaced, it’s not clear that there will be an overall reduction in the number of jobs over time. The World Economic Forum suggests that the opposite will occur. In their Future of Jobs 2018 report, the authors concluded that while automation technologies including artificial intelligence could see 75 million jobs displaced globally, 133 million new roles may emerge as companies shake up their division of labour between humans and machines, translating to an additional 58 million new jobs created by 2022.

    A recent report by AlphaBeta estimates that automation can boost Australia’s productivity and national income by up to $2.2 trillion by 2030 and result in improved health and safety, the development of new products and services, new types of jobs and new business models. In that same report, AlphaBeta concluded that by 2025 automation in manufacturing could increase by 6 per cent, along with an 11 per cent reduction in injuries, while wages for non-automatable tasks will rise 20 per cent.

    The key to unlocking economic and societal benefit from robotics will be to have them do things that were not possible or economic before. Take caring for an ageing population that is forecast to live longer, but with a smaller workforce to support it. The math doesn’t add up without new methods of care that keep people out of hospitals and in their homes longer. Or supporting children with autism to develop social interaction and communication skills with Kaspar, a social robot being trialled by researchers at the University of New South Wales and CSIRO. Robots can help with dangerous jobs too. CSIRO’s Data61 spinout Emesent develops drones capable of travelling in GPS-denied environments utilising 3D LiDAR technology. They travel down mineshafts to safely inspect hard-to-access areas of underground mines, so people don’t have to.

    On the other side of the world, a Harvard University group has spent the last 12 years creating a robotic bee capable of partially untethered flight, powered by artificial muscles beating its wings 120 times a second. The ultimate objective of the program is to create a robobee swarm for use in natural disasters and artificial pollination, given the devastating effects of colony collapse disorder on bee populations and, consequently, food pollination. The US Department of Agriculture estimates that of the 1,400 crops grown for food, 80 per cent depend on pollination, and globally, pollination services are likely worth more than $3 trillion.

    Robotic advancements

    Advancement in robotics is accelerating. Robots will increasingly evolve from isolated machines into systems seamlessly integrated with our environments and with each other. When one robot encounters an obstacle or a new context and learns from it, the entire network of robots can learn instantaneously.

    Other advancements include the use of more tactile skins with embedded pressure sensors, and more flexible sensors. A team of engineers from the University of Delaware has created flexible carbon nanotube coatings on fibres, including cotton and wool, resulting in shape-forming, flexible and pressure-sensitive skins. Just as with the robobee, there are also advancements in collaborative robots, or cobots, that can be used for resilient search-and-rescue operations, among other things.

    We are also witnessing improvements in dexterity. The California-based Intuitive Surgical has developed a robot allowing a surgeon to control three fully articulated instruments to treat deep-seated damaged or diseased tissues or organs. Robots that can unfold are also being developed, along with soft robotics that will be important for applications involving contact with people. The challenge until recently has been a lack of actuators, or artificial muscles, that can replicate the versatility of real muscles. Advancements are being made here too, with one design built from inexpensive materials reportedly able to lift 200 times its own weight. Another compelling advancement is in augmenting our own muscles via wearable robots, or exoskeletons. Applications today range from helping prevent workplace injury to helping people function more fully after spinal cord damage or strokes.

    Australia can benefit substantially from robotics in areas like managing environmental threats, maintaining vital urban infrastructure, maximising crop yields in drought-affected regions, improving transportation and supporting law enforcement. Australia was the first country to automate its ports and mine sites, and we have strong university capabilities at QUT and Sydney University, among others. Today there are about 1,100 robotics companies in the country, and CSIRO’s Data61 recently opened the largest robotic motion-capture facility in the southern hemisphere.

    The question of how Australia can capitalise on the trillion-dollar artificial intelligence and robotics revolution will be the focal point of the upcoming D61+LIVE conference in Sydney this October. Like all other industry creation opportunities in front of us right now, this one is perishable, and the way to maximise the benefit as a country is to be a global leader in select areas. The Australian RoboCup team has shown us how it’s done. Game on.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in Mid West region of Western Australia

    So what can we expect these new radio projects to discover? We have no idea, but history tells us that they are almost certain to deliver some major surprises.

    Making these new discoveries may not be so simple. Gone are the days when astronomers could just notice something odd as they browse their tables and graphs.

    Nowadays, astronomers are more likely to be distilling their answers from carefully-posed queries to databases containing petabytes of data. Human brains are just not up to the job of making unexpected discoveries in these circumstances, and instead we will need to develop “learning machines” to help us discover the unexpected.

    With the right tools and careful insight, who knows what we might find.

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

  • richardmitnick 3:24 pm on August 18, 2019 Permalink | Reply
    Tags: "Supercomputing Galactic Winds with Cholla", Applied Research & Technology, Cholla, ,   

    From insideHPC: “Supercomputing Galactic Winds with Cholla” 

    From insideHPC

    August 18, 2019
    Elizabeth Rosenthal at ORNL

    In this video, a galactic wind simulation depicts interstellar gas and stars (red) and the outflows (blue) captured using the Cholla astrophysics code.


    “Using the Titan supercomputer at Oak Ridge National Laboratory, a team of astrophysicists created a set of galactic wind simulations of the highest resolution ever performed. The simulations will allow researchers to gather and interpret more accurate, detailed data that elucidates how galactic winds affect the formation and evolution of galaxies.”

    ORNL Cray XK7 Titan Supercomputer, once the fastest in the world, to be decommissioned

    Brant Robertson of the University of California, Santa Cruz, and Evan Schneider of Princeton University developed the simulation suite to better understand galactic winds—outflows of gas released by supernova explosions—which could help explain variations in their density and temperature distributions.

    The improved set of galactic wind simulations will be incorporated into larger cosmological simulations.

    “We now have a much clearer idea of how the high speed, high temperature gas produced by clusters of supernovae is ejected after mixing with the cooler, denser gas in the disk of the galaxy,” Schneider said.

    As Schneider describes it: “Cholla is a GPU-based hydrodynamics code I developed as part of my thesis work at the University of Arizona. It was designed to be massively parallel and extremely efficient, and has been run on some of the largest supercomputers in the world. I am committed to keeping Cholla free and open-source. The most recent public release of the code can be found on GitHub.”
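Cholla itself is a far more sophisticated GPU code solving the full equations of hydrodynamics, but the core finite-volume idea behind such grid-based codes can be illustrated with a first-order upwind scheme for 1D linear advection. This is a hypothetical sketch for intuition only, not Cholla’s algorithm.

```python
# First-order upwind finite-volume scheme for du/dt + a*du/dx = 0,
# with periodic boundaries and a > 0. Illustrates the per-cell update
# pattern that grid-based hydro codes apply in far more elaborate form.

def advect(u, a, dx, dt, steps):
    u = list(u)
    c = a * dt / dx                     # Courant number; stability needs c <= 1
    assert 0 < c <= 1, "CFL condition violated"
    for _ in range(steps):
        # each cell loses flux to the right and gains from its left neighbor
        u = [u[i] - c * (u[i] - u[i - 1]) for i in range(len(u))]
    return u

# A square pulse drifts to the right; total "mass" is conserved.
u0 = [1.0 if 10 <= i < 20 else 0.0 for i in range(100)]
u1 = advect(u0, a=1.0, dx=1.0, dt=0.5, steps=40)
print(abs(sum(u1) - sum(u0)) < 1e-9)    # True: the scheme is conservative
```

The scheme is conservative by construction, since each flux leaves one cell and enters its neighbor; production codes build on the same conservation property with higher-order reconstructions and more elaborate solvers.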

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048
