Updates from richardmitnick

  • richardmitnick 9:15 pm on September 16, 2019 Permalink | Reply
    Tags: 21st century alchemy, Plasmons

    From Niels Bohr Institute: “Quantum Alchemy: Researchers use laser light to transform metal into magnet” 

    University of Copenhagen


    From Niels Bohr Institute

    16 September 2019

    Mark Spencer Rudner
    Associate Professor
    Condensed Matter Physics
    Niels Bohr Institutet
    rudner@nbi.ku.dk

    Maria Hornbek
    Journalist
    The Faculty of Science
    maho@science.ku.dk
    +45 22 95 42 83

    CONDENSED MATTER PHYSICS: Pioneering physicists from the University of Copenhagen and Nanyang Technological University in Singapore have discovered a way to get non-magnetic materials to make themselves magnetic by way of laser light. The phenomenon may also be used to endow many other materials with new properties.

    Mark Rudner, Niels Bohr Institute, University of Copenhagen

    Asst. Prof. Justin Song Chien Wen

    The intrinsic properties of materials arise from their chemistry — from the types of atoms that are present and the way that they are arranged. These factors determine, for example, how well a material may conduct electricity or whether or not it is magnetic. Therefore, the traditional route for changing or achieving new material properties has been through chemistry.

    Now, a pair of researchers from the University of Copenhagen and Nanyang Technological University in Singapore have discovered a new physical route to the transformation of material properties: when stimulated by laser light, a metal can transform itself from within and suddenly acquire new properties.


    “For several years, we have been looking into how to transform the properties of matter by irradiating it with certain types of light. What’s new is that not only can we change the properties using light, we can trigger the material to change itself, from the inside out, and emerge into a new phase with completely new properties. For instance, a non-magnetic metal can suddenly transform into a magnet,” explains Associate Professor Mark Rudner, a researcher at the University of Copenhagen’s Niels Bohr Institute.

    He and colleague Justin Song of Nanyang Technological University in Singapore made the discovery that is now published in Nature Physics. The idea of using light to transform the properties of a material is not novel in itself. But up to now, researchers have only been capable of manipulating the properties already found in a material. Giving a metal its own ‘separate life’, allowing it to generate its own new properties, has never been seen before.

    By way of theoretical analysis, the researchers have succeeded in proving that when a non-magnetic metallic disk is irradiated with linearly polarized light, circulating electric currents and hence magnetism can spontaneously emerge in the disk.

    Researchers use so-called plasmons (a type of electron wave) found in the material to change its intrinsic properties. When the material is irradiated with laser light, plasmons in the metal disk begin to rotate in either a clockwise or counterclockwise direction. These rotating plasmons alter the quantum electronic structure of the material, which in turn alters the plasmons’ own behavior, creating a feedback loop. Feedback from the plasmons’ internal electric fields eventually breaks the intrinsic symmetry of the material and triggers an instability toward self-rotation that causes the metal to become magnetic.
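
    The published work treats the full plasmon dynamics; purely as an illustration of the feedback-driven instability described above, here is a minimal toy sketch (not the authors’ model) of an order parameter whose symmetric state becomes unstable once a feedback gain crosses a threshold, after which it settles spontaneously into one of two equivalent “rotating” states:

```python
import numpy as np

def order_parameter(alpha, beta=1.0, noise=1e-3, dt=1e-3, steps=50_000, seed=0):
    """Toy order parameter m (think: amplitude of a circulating current).
    dm/dt = alpha*m - beta*m**3 plus a tiny noise kick.
    For alpha <= 0 the symmetric state m = 0 is stable; for alpha > 0 the
    feedback term makes m = 0 unstable and m settles near +/- sqrt(alpha/beta)."""
    rng = np.random.default_rng(seed)
    m = 0.0
    for _ in range(steps):
        m += (alpha * m - beta * m**3) * dt + noise * np.sqrt(dt) * rng.standard_normal()
    return m

for alpha in (-1.0, 1.0):   # below vs. above the (laser-driven) instability threshold
    print(alpha, round(order_parameter(alpha), 3))
```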

    Technique can produce properties ‘on demand’

    According to Mark Rudner, the new theory opens up an entirely new way of thinking and, most likely, a wide range of applications:

    “It is an example of how the interaction between light and material can be used to produce certain properties in a material ‘on demand’. It also paves the way for a multitude of uses, because the principle is quite general and can work on many types of materials. We have demonstrated that we can transform a material into a magnet. We might also be able to change it into a superconductor or something entirely different,” says Rudner. He adds:

    “You could call it 21st century alchemy. In the Middle Ages, people were fascinated by the prospect of transforming lead into gold. Today, we aim to get one material to behave like another by stimulating it with a laser.”

    Among the possibilities, Rudner suggests that the principle could be useful in situations where one needs a material to alternate between behaving magnetically and not. It could also prove useful in optoelectronics, where light and electronics are combined, for example in fiber-optic internet and sensor development.

    The researchers’ next steps are to expand the catalog of properties that can be altered in analogous ways, and to help stimulate their experimental investigation and utilization.

    See the full article here.




    Stem Education Coalition

    Niels Bohr Institute Campus

    Niels Bohr Institute (Danish: Niels Bohr Institutet) is a research institute of the University of Copenhagen. The research of the institute spans astronomy, geophysics, nanotechnology, particle physics, quantum mechanics and biophysics.

    The Institute was founded in 1921, as the Institute for Theoretical Physics of the University of Copenhagen, by the Danish theoretical physicist Niels Bohr, who had been on the staff of the University of Copenhagen since 1914, and who had been lobbying for its creation since his appointment as professor in 1916. On the 80th anniversary of Niels Bohr’s birth – October 7, 1965 – the Institute officially became The Niels Bohr Institute.[1] Much of its original funding came from the charitable foundation of the Carlsberg brewery, and later from the Rockefeller Foundation.[2]

    During the 1920s, and 1930s, the Institute was the center of the developing disciplines of atomic physics and quantum physics. Physicists from across Europe (and sometimes further abroad) often visited the Institute to confer with Bohr on new theories and discoveries. The Copenhagen interpretation of quantum mechanics is named after work done at the Institute during this time.

    On January 1, 1993 the institute was fused with the Astronomic Observatory, the Ørsted Laboratory and the Geophysical Institute. The new resulting institute retained the name Niels Bohr Institute.

    The University of Copenhagen (UCPH) (Danish: Københavns Universitet) is the oldest university and research institution in Denmark. Founded in 1479 as a studium generale, it is the second oldest institution for higher education in Scandinavia after Uppsala University (1477). The university has 23,473 undergraduate students, 17,398 postgraduate students, 2,968 doctoral students and over 9,000 employees. The university has four campuses located in and around Copenhagen, with the headquarters located in central Copenhagen. Most courses are taught in Danish; however, many courses are also offered in English and a few in German. The university has several thousands of foreign students, about half of whom come from Nordic countries.

    The university is a member of the International Alliance of Research Universities (IARU), along with University of Cambridge, Yale University, The Australian National University, and UC Berkeley, amongst others. The 2016 Academic Ranking of World Universities ranks the University of Copenhagen as the best university in Scandinavia and 30th in the world, the 2016-2017 Times Higher Education World University Rankings as 120th in the world, and the 2016-2017 QS World University Rankings as 68th in the world. The university has had 9 alumni become Nobel laureates and has produced one Turing Award recipient.

     
  • richardmitnick 8:53 pm on September 16, 2019 Permalink | Reply
    Tags: the most massive neutron star yet, J0740+6620

    From PBS NOVA: “Astronomers may have just detected the most massive neutron star yet” 

    From PBS NOVA

    September 16, 2019
    Katherine J. Wu

    An artist’s impression of the pulse from a neutron star being delayed by a white dwarf passing between the neutron star and Earth. Image Credit: B. Saxton, NRAO/AUI/NSF


    The sun at the center of our solar system is a big-bodied behemoth, clocking in at more than 4 nonillion pounds (in the U.S., that’s 4 followed by 30 zeros).

    Now, multiply that mass by 2.14, and cram it down into a ball just 15 miles across. That’s an absurdly dense object, one almost too dense to exist. But the key word here is “almost”—because a team of astronomers has just found one such star.
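
    To get a feel for those numbers, here is a back-of-the-envelope density estimate using only the figures quoted above (2.14 solar masses in a ball roughly 15 miles across; the precise radius is not given in the article):

```python
import math

M_SUN_KG = 1.989e30            # mass of the sun
mass = 2.14 * M_SUN_KG         # J0740+6620's estimated mass
radius_m = (15 / 2) * 1609.34  # "15 miles across" -> radius in meters

volume = 4 / 3 * math.pi * radius_m**3
density = mass / volume
print(f"mean density ~ {density:.2e} kg/m^3")   # on the order of 1e17-1e18 kg/m^3
```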

    The newly discovered cosmic improbability, reported today in the journal Nature Astronomy, is a neutron star called J0740+6620 that lurks 4,600 light-years from Earth. It’s the most massive neutron star ever detected, and is likely to remain a top contender for that title for some time: Much denser, researchers theorize, and it would collapse into a black hole.

    Both neutron stars and black holes are stellar corpses—the leftover cores of stars that die in cataclysmic explosions called supernovae. The density of these remnants dictates their fate: The more mass that’s stuffed into a small space, the more likely a black hole will form.

    Neutron stars are still ultra-dense, though, and astronomers don’t have a clear-cut understanding of how matter behaves within them. Extremely massive neutron stars like this one, which exist tantalizingly close to the black hole tipping point, could yield some answers, study author Thankful Cromartie, an astronomer at the University of Virginia, told Ryan F. Mandelbaum at Gizmodo.

    Cromartie and her colleagues first detected J0740+6620, which is a type of rapidly rotating neutron star called a millisecond pulsar, with the Green Bank Telescope in West Virginia. The name arises from the way the spinning star’s poles emit radio waves, generating a pulsing pattern that mimics the sweeping motion of a lighthouse beam.

    During their observations, the researchers noted that J0740+6620 is locked into a tight dance with a white dwarf—another kind of dense stellar remnant. The two bodies orbit each other, forming what’s called a binary. When the white dwarf passes in front of the pulsar from our point of view, it forces light from J0740+6620 to take a slightly longer path to Earth, because the white dwarf’s gravity slightly warps the space around it. The team used the delay in J0740+6620’s pulses to calculate the mass of both objects.
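
    The extra light-travel time described here is the relativistic Shapiro delay. As a rough illustration only (the article does not give the system’s measured parameters), a simplified circular-orbit form of the delay for a hypothetical ~0.25-solar-mass companion in a nearly edge-on orbit gives delays on the order of ten microseconds near superior conjunction:

```python
import numpy as np

G, c = 6.674e-11, 2.998e8
M_SUN = 1.989e30

def shapiro_delay(m_companion_msun, sin_i, phase):
    """Shapiro delay for a circular orbit (simplified standard form):
    dt = -2 * (G*m_c/c^3) * ln(1 - sin(i) * sin(phase)),
    with phase measured from the ascending node; maximal near superior conjunction."""
    r = G * m_companion_msun * M_SUN / c**3   # "range" parameter, ~4.9 us per solar mass
    return -2 * r * np.log(1 - sin_i * np.sin(phase))

phases = np.linspace(0, 2 * np.pi, 9)
# illustrative values only -- not the measured parameters of J0740+6620
print(shapiro_delay(0.25, sin_i=0.999, phase=phases) * 1e6)  # microseconds
```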

    Previous measurements from the Laser Interferometer Gravitational-Wave Observatory (LIGO) suggest that the upper limit for a neutron star’s mass is about 2.17 times that of the sun—a figure that’s just a smidge above J0740+6620’s estimated heft. But with future observations, that number could still change.

    MIT/Caltech Advanced LIGO

    Harshal Gupta, NSF program director for the Green Bank Observatory, called the new paper “a very solid effort in terms of astronomy and the physics of compact objects,” Mandelbaum reports.

    “Each ‘most massive’ neutron star we find brings us closer to identifying that tipping point [when they must collapse],” study author Scott Ransom, an astronomer at the National Radio Astronomy Observatory, said in a statement. “The orientation of this binary star system created a fantastic cosmic laboratory.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.

     
  • richardmitnick 8:25 pm on September 16, 2019 Permalink | Reply

    From UC Santa Barbara: “A Quantum Leap” 

    From UC Santa Barbara

    September 16, 2019
    James Badham

    $25M grant makes UC Santa Barbara home to the nation’s first NSF-funded Quantum Foundry, a center for development of materials and devices for quantum information-based technologies.

    Professors Stephen Wilson and Ania Bleszynski Jayich will co-direct the campus’s new Quantum Foundry.

    We hear a lot these days about the coming quantum revolution. Efforts to understand, develop, and characterize quantum materials — defined broadly as those displaying characteristics that can be explained only by quantum mechanics and not by classical physics — are intensifying.

    Researchers around the world are racing to understand these materials and harness their unique qualities to develop revolutionary technologies for quantum computing, communications, sensing, simulation and other applications not yet imaginable.

    This week, UC Santa Barbara stepped to the front of that worldwide research race by being named the site of the nation’s first Quantum Foundry.

    Funded by an initial six-year, $25-million grant from the National Science Foundation (NSF), the project, known officially as the UC Santa Barbara NSF Quantum Foundry, will involve 20 faculty members from the campus’s materials, physics, chemistry, mechanical engineering and computer science departments, plus myriad collaborating partners. The new center will be anchored within the California Nanosystems Institute (CNSI) in Elings Hall.

    California Nanosystems Institute

    The grant provides substantial funding to build equipment and develop tools necessary to the effort. It also supports a multi-front research mission comprising collaborative interdisciplinary projects within a network of university, industry, and national-laboratory partners to create, process, and characterize materials for quantum information science. The Foundry will also develop outreach and educational programs aimed at familiarizing students at all levels with quantum science, creating a new paradigm for training students in the rapidly evolving field of quantum information science and engaging with industrial partners to accelerate development of the coming quantum workforce.

    “We are extremely proud that the National Science Foundation has chosen UC Santa Barbara as home to the nation’s first NSF-funded Quantum Foundry,” said Chancellor Henry T. Yang. “The award is a testament to the strength of our University’s interdisciplinary science, particularly in materials, physics and chemistry, which lie at the core of quantum endeavors. It also recognizes our proven track record of working closely with industry to bring technologies to practical application, our state-of-the-art facilities and our educational and outreach programs that are mutually complementary with our research.

    “Under the direction of physics professor Ania Bleszynski Jayich and materials professor Stephen Wilson the foundry will provide a collaborative environment for researchers to continue exploring quantum phenomena, designing quantum materials and building instruments and computers based on the basic principles of quantum mechanics,” Yang added.

    Said Joseph Incandela, the campus’s vice chancellor for research, “UC Santa Barbara is a natural choice for the NSF quantum materials Foundry. We have outstanding faculty, researchers, and facilities, and a great tradition of multidisciplinary collaboration. Together with our excellent students and close industry partnerships, they have created a dynamic environment where research gets translated into important technologies.”

    “Being selected to build and host the nation’s first Quantum Foundry is tremendously exciting and extremely important,” said Rod Alferness, dean of the College of Engineering. “It recognizes the vision and the decades of work that have made UC Santa Barbara a truly world-leading institution worthy of assuming a leadership role in a mission as important as advancing quantum science and the transformative technologies it promises to enable.”

    “Advances in quantum science require a highly integrated interdisciplinary approach, because there are many hard challenges that need to be solved on many fronts,” said Bleszynski Jayich. “One of the big ideas behind the Foundry is to take these early theoretical ideas that are just beginning to be experimentally viable and use quantum mechanics to produce technologies that can outperform classical technologies.”

    Doing so, however, will require new materials.

    “Quantum technologies are fundamentally materials-limited, and there needs to be some sort of leap or evolution of the types of materials we can harness,” noted Wilson. “The Foundry is where we will try to identify and create those materials.”

    Research Areas and Infrastructure

    Quantum Foundry research will be pursued in three main areas, or “thrusts”:

    • Natively Entangled Materials, which relates to identifying and characterizing materials that intrinsically host anyon excitations and long-range entangled states with topological, or structural, protection against decoherence. These include new intrinsic topological superconductors and quantum spin liquids, as well as materials that enable topological quantum computing.

    • Interfaced Topological States, in which researchers will seek to create and control protected quantum states in hybrid materials.

    • Coherent Quantum Interfaces, where the focus will be on engineering materials having localized quantum states that can be interfaced with various other quantum degrees of freedom (e.g. photons or phonons) for distributing quantum information while retaining robust coherence.

    Developing these new materials and assessing their potential for hosting the needed coherent quantum states requires specialized equipment, much of which does not yet exist. A significant portion of the NSF grant is designated to develop such infrastructure: to purchase required tools and equipment, and to fabricate new tools needed to grow the materials and to characterize the quantum states in them, Wilson said.

    UC Santa Barbara’s deep well of shared materials growth and characterization infrastructure was also a factor in securing the grant. The Foundry will leverage existing facilities, such as the large suite of instrumentation shared via the Materials Research Lab and the California Nanosystems Institute, multiple molecular beam epitaxy (MBE) growth chambers (the university has the largest number of MBE apparatuses in academia), unique optical facilities such as the Terahertz Facility, state-of-the-art clean rooms, and others among the more than 300 shared instruments on campus.

    Data Science

    NSF is keenly interested in both generating and sharing data from materials experiments. “We are going to capture Foundry data and harness it to facilitate discovery,” said Wilson. “The idea is to curate and share data to accelerate discovery at this new frontier of quantum information science.”

    Industrial Partners

    Industry collaborations are an important part of the Foundry project. UC Santa Barbara’s well-established history of industrial collaboration — it leads all U.S. universities in industrial research dollars per capita — and the application focus that allows it to transition ideas into materials and materials into technologies were important in receiving the Foundry grant.

    Another value of industrial collaboration, Wilson explained, is that often, faculty might be looking at something interesting without being able to visualize how it might be useful in a scaled-up commercial application. “If you have an array of directions you could go, it is essential to have partners to help you visualize those having near-term potential,” he said.

    “This is a unique case where industry is highly interested while we are still at the basic-science level,” said Bleszynski Jayich. “There’s a huge industry partnership component to this.”

    Among the 10 inaugural industrial partners are Microsoft, Google, IBM, Hewlett Packard Enterprise, HRL, Northrop Grumman, Bruker, SomaLogic, NVision, and Anstrom Science. Microsoft and Google already have substantial campus presences; Microsoft’s Station Q quantum computing lab is here, and UC Santa Barbara professor and Google chief scientist John Martinis and a team of his Ph.D. student researchers are working with Google at its Santa Barbara office, adjacent to campus, to develop Google’s quantum computer.

    Undergraduate Education

    In addition, with approximately 700 students, UC Santa Barbara’s undergraduate physics program is the largest in the U.S. “Many of these students, as well as many undergraduate engineering and chemistry students, are hungry for an education in quantum science, because it’s a fascinating subject that defies our classical intuition, and on top of that, it offers career opportunities. It can’t get much better than that,” Bleszynski Jayich said.

    Graduate Education Program

    Another major goal of the Foundry project is to integrate quantum science into education and to develop the quantum workforce. The traditional approach to quantum education at the university level has been for students to take physics classes, which are focused on the foundational theory of quantum mechanics.

    “But there is an emerging interdisciplinary component of quantum information that people are not being exposed to in that approach,” Wilson explained. “Having input from many overlapping disciplines in both hard science and engineering is required, as are experimental touchstones for trying to understand these phenomena. Student involvement in industry internships and collaborative research with partner companies is important in addressing that.”

    “We want to introduce a more practical quantum education,” Bleszynski Jayich added. “Normally you learn quantum mechanics by learning about hydrogen atoms and harmonic oscillators, and it’s all theoretical. That training is still absolutely critical, but now we want to supplement it, leveraging our abilities gained in the past 20 to 30 years to control a quantum system on the single-atom, single-quantum-system level. Students will take lab classes where they can manipulate quantum systems and observe the highly counterintuitive phenomena that don’t make sense in our classical world. And, importantly, they will learn various cutting-edge techniques for maintaining quantum coherence.

    “That’s particularly important,” she continued, “because quantum technologies rely on the success of the beautiful, elegant theory of quantum mechanics, but in practice we need unprecedented control over our experimental systems in order to observe and utilize their delicate quantum behavior.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    UC Santa Barbara Seal
    The University of California, Santa Barbara (commonly referred to as UC Santa Barbara or UCSB) is a public research university and one of the 10 general campuses of the University of California system. Founded in 1891 as an independent teachers’ college, UCSB joined the University of California system in 1944 and is the third-oldest general-education campus in the system. The university is a comprehensive doctoral university and is organized into five colleges offering 87 undergraduate degrees and 55 graduate degrees. In 2012, UCSB was ranked 41st among “National Universities” and 10th among public universities by U.S. News & World Report. UCSB houses twelve national research centers, including the renowned Kavli Institute for Theoretical Physics.

     
  • richardmitnick 7:55 pm on September 16, 2019 Permalink | Reply
    Tags: "Where the Rivers Meet the Sea", Estuaries are the borderlands between salt- and freshwater environments and they are incredibly diverse both biologically and physically., Oceanus Magazine,   

    From Oceanus Magazine via Woods Hole Oceanographic Institution: “Where the Rivers Meet the Sea” 

    From Woods Hole Oceanographic Institution


    September 16, 2019
    W. Rockwell Geyer

    The transition from salt to fresh water is turbulent, vulnerable, and incredibly bountiful.


    The sea lions stop bellowing and slip, one by one, off the jetty into the mocha-brown water of the Fraser River, near Vancouver, British Columbia. The surface of the water is smooth, except for a line of ripples moving slowly upriver. The sea lions seem to know that the calm surface belies turmoil beneath.

    The tide has just turned, and a tongue of salt water is first creeping, then galloping, back into the Fraser just a few hours after being expelled by a strong outflow during the previous ebb. Although the surface appears calm, the underwater intersection of fresh and salt water roils with turbulent eddies as strong as any in the ocean. The confusion of swirling water and suspended sediments disorients homeward-bound salmon, providing an easy feast for the sea lions.

    Not all rivers end as dramatically as the Fraser. But the mixing of freshwater streams and rivers with salty ocean tides in a partly enclosed body of water—natural scientists call it an estuary—fuels some of the most productive ecosystems on Earth, and also some of the most vulnerable.

    Long before the advent of civilization, early humans recognized the bounty of the estuary and made these regions a focal point for human habitation. Unfortunately, overdevelopment, poor land use, and centuries of industrial contamination have taken a toll on most estuaries. Boston Harbor, San Francisco Bay, and the Hudson River are poster children for environmental degradation.

    Yet there is hope. Estuaries are the borderlands between salt- and freshwater environments, and they are incredibly diverse both biologically and physically. The diversity and the high energy of the ecosystem make estuaries remarkably resilient. With a better understanding of these systems, we can reverse their decline and restore the ecological richness of these valuable, albeit muddy, environments.

    How does an estuary work?

    From a physicist’s point of view, the density difference between fresh and salt water makes estuaries interesting. When river water meets sea water, the lighter fresh water rises up and over the denser salt water. Sea water noses into the estuary beneath the outflowing river water, pushing its way upstream along the bottom.

    Often, as in the Fraser River, this occurs at an abrupt salt front. Across such a front, the salt content (salinity) and density may change from oceanic to fresh in just a few tens of meters horizontally and as little as a meter vertically.

    Accompanying these strong salinity and density gradients are large vertical changes in current direction and strength. You can’t see these swirling waters from the surface, but a fisherman may find that his net takes on a life of its own when he lowers it into seemingly placid water.
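
    A rough sense of scale for that layering, using typical textbook values rather than figures from the article: seawater is only a few percent denser than river water, yet that small difference is enough to drive the layered exchange flow and to support internal waves on the interface between the two layers:

```python
# Rough scale of the two-layer estuarine exchange, using typical values
# (not measurements from the Fraser River article).
g = 9.81                              # m/s^2
rho_fresh, rho_sea = 1000.0, 1025.0   # kg/m^3, typical river vs. ocean water
depth = 10.0                          # m, a representative channel depth

g_prime = g * (rho_sea - rho_fresh) / rho_sea   # "reduced gravity"
c_internal = (g_prime * depth) ** 0.5           # internal (interfacial) wave speed

print(f"reduced gravity ~ {g_prime:.3f} m/s^2")
print(f"internal wave speed ~ {c_internal:.2f} m/s")  # order 1 m/s, vs ~10 m/s for surface waves
```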

    Pliny the Elder, the noted Roman naturalist, senator, and commander of the Imperial Fleet in the 1st century A.D., observed this peculiar behavior of fishermen’s nets in the Strait of Bosphorus, near Istanbul. Pliny deduced that surface and bottom currents were flowing in opposite directions, and he provided the first written documentation of what we now call the “estuarine circulation.”

    Saltwater intrusion

    The opposing fresh and saltwater streams sometimes flow smoothly, one above the other. But when the velocity difference reaches a certain threshold, vigorous turbulence results, and the salt and fresh water are mixed. Tidal currents, which act independently of estuarine circulation, also add to the turbulence, mixing the salt and fresh waters to produce brackish water in the estuary.
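
    The “certain threshold” is conventionally expressed through the gradient Richardson number, which compares the stabilizing stratification to the destabilizing velocity shear; values below roughly 0.25 permit the vigorous mixing described here. A minimal sketch with illustrative numbers (a textbook criterion, not a calculation from the article):

```python
def gradient_richardson(drho_dz, du_dz, rho0=1020.0, g=9.81):
    """Ri = N^2 / (du/dz)^2, with N^2 = -(g/rho0) * drho/dz.
    Ri < ~0.25 indicates shear strong enough to overturn the stratification."""
    N2 = -(g / rho0) * drho_dz
    return N2 / du_dz**2

# Example: a 10 kg/m^3 salinity-driven density jump over 2 m, and a 1 m/s
# velocity difference over the same 2 m of depth.
ri = gradient_richardson(drho_dz=-10 / 2, du_dz=1 / 2)
print(f"Ri = {ri:.2f}", "-> turbulent mixing likely" if ri < 0.25 else "-> stays layered")
```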

    In the Fraser River, this circulation is confined to a very short and energetic frontal zone near the mouth, sometimes only several hundred meters long. In other estuaries, such as San Francisco Bay, the Chesapeake Bay, or the Hudson River, the salt front and accompanying estuarine circulation extend inland for many miles.

    The landward intrusion of salt is carefully monitored by engineers because of the potential consequences to water supplies if the salt intrusion extends too far. For instance, the city of Poughkeepsie, N.Y., 60 miles north of the mouth of the Hudson River, depends on the river for its drinking water. Roughly once per decade, drought conditions cause the salt intrusion to approach the Poughkeepsie freshwater intake. The last time this happened, in 1995, extra water had to be spilled from dams upstream to keep the salt front from becoming a public health hazard.

    The lifeblood of estuaries

    Estuarine circulation serves a valuable, ecological function. The continual bottom flow provides an effective ventilation system, drawing in new oceanic water and expelling brackish water. If it weren’t for this natural “flushing” process, the waters of the estuary would become stagnant, pollution would accumulate, and oxygen would be depleted.

    This circulation system leads to incredible ecological productivity. Nutrients and dissolved oxygen are continually resupplied from the ocean, and wastes are expelled in the surface waters. This pumping action leads to some of the highest growth rates of microscopic plants (researchers call it “primary production”) in any marine environment. This teeming population of plankton provides a base for diverse and valuable food webs, fueling the growth of some of our most prized fish, birds, and mammals—salmon, striped bass, great blue heron, bald eagles, seals, and otters, to name a few.

    The vigor of the circulation depends in part on the supply of river water to push the salt water back. The San Francisco Bay area has become a center of controversy in recent years because there are many interests competing for the fresh water flowing into the Bay—principally agriculture and urban water supplies extending to Southern California. Environmentalists are determined that San Francisco Bay should get “its share” of the fresh water coming from the Sacramento-San Joaquin Delta because the vast freshwater habitats in the region are particularly vulnerable to salt intrusion.

    Estuarine circulation is also affected by the tides; stronger tides generally enhance the exchange and improve the ecological function of the system. The Hudson estuary, for example, is tidal for 153 miles inland to Troy, N.Y. The Algonquin Indians called the river Mohicanituk, “the river that flows both ways.”

    Mucking up the system

    Estuaries have their problems. Some are self-inflicted; some are caused by the abuses of human habitation.

    An estuary, with all of its dynamic stirrings, has one attribute that promotes its own destruction: It traps sediment. When suspended mud and solids from a river enter the estuary, they encounter the salt front. Unlike fresh water, which rides up and over the saline layer, the sediment falls out of the surface layer into the denser, saltier layer of water moving into the estuary. As it drops, it gets trapped and accumulates on the bottom. Slowly, the estuary grows muddier and muddier, shallower and shallower.

    Occasionally a major flood will push the salt right out of the estuary, carrying the muddy sediment along with it. Sediment cores in the Hudson River indicate that sediment may accumulate for 10, 20, or even 50 years, laying down layers every year like tree rings. But then a hurricane or big snowmelt floods the river, wipes out the layers of sediment, and sends the mud out to sea.

    The “episodic” behavior of sediment deposition is good news and bad news. It is good because a big storm can keep an estuary from getting too shallow too fast. In fact, it appears that over the last 6,000 years, the natural dredging by large storms has maintained nearly constant water depth in the Hudson estuary.

    The bad news is that the sediment retains a “memory” of all of the contaminants that have passed through it over the years. Environmental regulations are far stricter now than they were 50 years ago, and we have stopped using many chemicals that play havoc with the environment. For instance, polychlorinated biphenyls (PCBs) were banned in the 1970s because they were shown to be toxic to fish and wildlife, and to the humans who consume them. Yet we still have a contamination problem in the Hudson and other rivers because PCBs are slow to decay and each new flood remobilizes these “legacy” contaminants and prolongs our exposure.

    Trickle-down effects

    Billions of dollars are now being spent to clean up American estuaries contaminated by industrial pollution. In Boston, for instance, the new sewage system created to save Boston Harbor cost taxpayers about $5 billion. The Superfund program of the U.S. Environmental Protection Agency collects and spends billions of dollars more to remediate estuaries.

    Often the remediation strategies are complex and controversial. In the case of the Hudson River, there is a heated debate about whether PCB-contaminated sediments should be removed—dredged with high-tech methods that theoretically minimize environmental harm—or left undisturbed. That debate pivots on the episodic storm phenomenon: Are the contaminated sediments there to stay, or could they get stirred up when the next hurricane washes through the Hudson Valley?

    Aside from cleanup initiatives, parts of the Hudson need to be dredged for navigational purposes. Dredging is not that costly or difficult, but finding a place to put contaminated sediments is a problem. The Port of New York has been filling up abandoned Pennsylvania coal mines with its contaminated mud, but that is not a long-term solution.

    While the problems of American estuaries are complicated and expensive, they pale in comparison to those of Asian estuaries. The entire nation of Bangladesh lies within the estuary and lower floodplain of the Ganges-Brahmaputra River. Other Asian rivers such as the Mekong, Chang Jiang (or Yangtze), and Huang Ho (or Yellow River) are crowded and strained by concentrated human settlements. Global sea-level rise is causing a loss of land, increased flooding, and increased salt intrusion in these estuaries.

    The demand for water upstream for irrigation and domestic use significantly reduces freshwater flow through these systems. The Indus River and Huang Ho estuaries have suffered from drastic reductions of freshwater flow over the past several decades, and the impact of these human alterations is just now being recognized. New policies about land use, water diversion, and even global carbon dioxide production (which affects global warming and sea level rise) will be needed to protect these vulnerable estuarine environments and their human inhabitants.

    Stirring up new ideas

    One of the challenges of estuarine research is that most of the significant problems are interdisciplinary, involving physics, biology, chemistry, geology, and often public policy and economics. Estuaries are also incredibly diverse, coming in all shapes and sizes. Yet scientists are continually challenged by public policymakers to generalize our results from studies of one estuary and apply them to the rest of the world’s estuaries.

    As scientists, one of our roles is to predict changes in the environment, given different natural and human-induced influences. To foresee the health of estuaries in the future, we have some fundamental questions to answer about the present and the past. How far will salt intrude if river flow is cut in half? Do changes in river flow increase or decrease the rate at which sediments shoal the estuary? What effect do such changes have on the fish that spawn in fresh water?

    What we learn will be critical for a human population that increasingly values coastal waters. We need sound public policy to reduce vulnerability to coastal flooding and to protect drinking water, food supplies, and some of the world’s most important habitats. We will develop better policies only if we can ground them in better science.

    Oceanus, the oceanography magazine produced by WHOI, now has an online version at http://oceanusmag.whoi.edu. Initial articles feature deep ocean exploration, such as the evolutionary puzzle of seafloor life, life beneath the sea floor, and undersea earthquakes. Articles on current research in the coastal ocean, including the debate over wind farms, are being added regularly, and future articles will focus on ocean life, from marine mammals to genetics. The online version includes an email update function, which emails links to new articles when they are posted, printer-friendly versions of each article, and an “e-mail this to a friend” function. For visually impaired viewers, there is a button to enlarge the screen display. Oceanus will cover the work of the Institution’s Ocean Institutes this year, then branch out to cover WHOI science more broadly. Print issues of the magazine will also be available later this year.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Woods Hole Oceanographic Institution

    Vision & Mission

    The ocean is a defining feature of our planet and crucial to life on Earth, yet it remains one of the planet’s last unexplored frontiers. For this reason, WHOI scientists and engineers are committed to understanding all facets of the ocean as well as its complex connections with Earth’s atmosphere, land, ice, seafloor, and life—including humanity. This is essential not only to advance knowledge about our planet, but also to ensure society’s long-term welfare and to help guide human stewardship of the environment. WHOI researchers are also dedicated to training future generations of ocean science leaders, to providing unbiased information that informs public policy and decision-making, and to expanding public awareness about the importance of the global ocean and its resources.
    Mission Statement

    The Woods Hole Oceanographic Institution is dedicated to advancing knowledge of the ocean and its connection with the Earth system through a sustained commitment to excellence in science, engineering, and education, and to the application of this knowledge to problems facing society.

     
  • richardmitnick 5:08 pm on September 16, 2019 Permalink | Reply
    Tags: New Results for the Mass of Neutrinos

    From Karlsruhe Institute of Technology: “New Results for the Mass of Neutrinos” 


    From Karlsruhe Institute of Technology

    16 September 2019

    Dr. Joachim Hoffmann
    Editor/Press Officer
    Tel.: +49 721 608-21151
    joachim hoffmann∂kit edu

    Fig. 1: The layout and major features of the KATRIN experimental facility at the Karlsruhe Institute of Technology. Overview of the 70 m long KATRIN setup with its major components: a) windowless gaseous tritium source, b) pumping section, and c) electrostatic spectrometers and focal plane detector. (Fig.: Michaela Meloni, KIT)

    Karlsruhe Tritium Neutrino Experiment KATRIN limits Neutrino Masses to less than 1 eV.

    Neutrinos and their small non-zero masses play a key role in cosmology and particle physics. The allowed range of the mass scale has now been narrowed down by the initial results of the international Karlsruhe Tritium Neutrino Experiment (KATRIN). The analysis of a first four-week measurement run in spring 2019 limits neutrino masses to less than approximately 1 eV, which is smaller by a factor of 2 compared to previous laboratory results based on multi-year campaigns. This demonstrates the huge potential of KATRIN in elucidating novel properties of neutrinos over the coming years.

    Apart from photons, the fundamental quanta of light, neutrinos are the most abundant elementary particles in the universe. The observation of neutrino oscillations two decades ago proved that they possess a small non-zero mass, contrary to earlier expectations. Accordingly, the “lightweights of the universe” play a prominent role in the evolution of large-scale structures in the cosmos as well as in the world of elementary particles, where their small mass scale points to new physics beyond known theories. Over the coming years, the international KATRIN experiment, located at the Karlsruhe Institute of Technology (KIT) and the most precise “neutrino scale” in the world, is set to measure the mass of these fascinating particles with unprecedented precision.

    In the past years, the KATRIN collaboration, formed by 20 institutions from 7 countries, successfully mastered many technological challenges in the commissioning of the 70 m long experimental setup (see Fig. 1). In mid-2018, KATRIN reached an important milestone with the official inauguration of the beamline. In spring this year, the big moment finally arrived: the 150-strong team (see Fig. 2) was able to “put neutrinos on the ultra-precise scale of KATRIN” for the first time. To that end, high-purity tritium gas was circulated over weeks through the source cryostat, and high-statistics energy spectra of electrons were collected. Following this, the international analysis team went to work on extracting the first neutrino mass result from the spring 2019 measurement campaign.

    KATRIN’s current result builds upon years of effort, which established a data-processing framework, identified and constrained key backgrounds and sources of uncertainty, and constructed a comprehensive model of the instrumental response. Through simulations and test measurements, an international team of analysts gained a deep understanding of the experiment and its detailed behavior. In spring 2019, both hardware and analysis groups were ready for taking neutrino mass data. Thierry Lasserre (CEA, France, and the Max Planck Institute for Physics, Munich), analysis coordinator for this first measurement campaign, described what happened as the data came in: “Our three international analysis teams deliberately worked separately from each other to guarantee truly independent results. In doing so, special emphasis was put on securing that no team member was able to prematurely deduce the neutrino mass result before completion of the final analysis step.”

    As is customary in today’s precision experiments, vital additional information required to complete the analysis was veiled, a process known to specialists as “blinding.” To coordinate their final steps, the analysts met for a one-week workshop at KIT in mid-July. By late evening on July 18, the uncertainties were finalized and the spectral models were unblinded. The analysis programs then performed overnight fits to search for the tell-tale signature of a massive neutrino. The following morning, all three groups announced identical results, which limit the absolute mass of neutrinos to a value of less than 1 electron-volt (eV) at 90% confidence. Thus, even half a million neutrinos together weigh less than a single electron, the second-lightest elementary particle.
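
    A quick arithmetic check of that comparison, using the standard electron rest mass (the half-million figure follows directly from the 1 eV limit):

```python
# Check of the "half a million neutrinos weigh less than one electron" comparison.
electron_mass_eV = 510_998.95   # electron rest-mass energy, ~0.511 MeV
neutrino_limit_eV = 1.0         # KATRIN's reported upper limit (90% confidence)

n = 500_000
print(n * neutrino_limit_eV < electron_mass_eV)   # True
print(f"ratio: {electron_mass_eV / neutrino_limit_eV:,.0f} neutrino-mass limits per electron mass")
```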

    The two long-term co-spokespersons of the experiment, Guido Drexlin from KIT and Christian Weinheimer from Münster University, comment on this very first result with great joy: “The fact that it took KATRIN only a few weeks to provide a world-leading sensitivity and to improve on the multi-year campaigns of the predecessor experiments by a factor of 2 demonstrates the extraordinary high potential of our project”. The KIT Vice-President for Research, Oliver Kraft, congratulates the collaboration “on this fantastic achievement which builds on the many technological breakthroughs reached over the past years. These world-leading benchmarks would not have been possible without the close cooperation of all partners bundling their unique expertise.”

    Kathrin Valerius, leader of a Helmholtz Young Investigators Group, is coordinating KATRIN analysis activities at KIT. During the commissioning phase, her team worked in particular on precision modeling of the tritium source as well as on dedicated calibration and test measurements leading up to neutrino mass data taking: “We are delighted that the intense preparations are now bearing fruit, and proud to be able to analyze the first neutrino mass data with this highly motivated team.”

    Fig. 3: Electron energy spectrum from a tritium scan, together with the fitted model from which the neutrino mass is derived. (Graphic: Lisa Schlüter, TU München)

    The analyses, which were presented at a recent scientific symposium in Toyama, Japan, and simultaneously have been submitted to a renowned science journal for publication, make use of a fundamental principle known for a long time in direct kinematic studies of neutrino mass: in the beta decay process of tritium, the electron and its neutral, undetected partner, the (electron) neutrino, statistically share the available decay energy of 18.6 keV. In extremely rare cases, the electron effectively obtains the entire decay energy, while the neutrino is left with almost no energy, the minimum amount being – following Einstein – its rest-mass energy E = mc². It is this tiny spectral distortion due to the non-zero neutrino mass that the KATRIN team was looking for in an ensemble of more than 2 million electrons collected over a narrow energy interval of a few tens of eV close to the kinematic endpoint (see Fig. 3).
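
    The distortion being hunted is governed by the phase-space factor of the beta spectrum near its endpoint, written here in a simplified form (the full KATRIN analysis also folds in molecular final states, the experimental response, and backgrounds):

```latex
% Simplified differential beta spectrum near the tritium endpoint E_0 ~ 18.6 keV:
% a non-zero neutrino mass m_nu shifts and distorts only the last few eV.
\frac{\mathrm{d}\Gamma}{\mathrm{d}E} \;\propto\; F(Z,E)\, p\,\bigl(E + m_e c^2\bigr)
\bigl(E_0 - E\bigr)\,\sqrt{\bigl(E_0 - E\bigr)^2 - m_\nu^2 c^4}\;
\Theta\!\bigl(E_0 - E - m_\nu c^2\bigr)
```

    Here E is the electron’s kinetic energy, p its momentum, and F(Z,E) the Fermi function describing the Coulomb interaction with the daughter nucleus.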

    This is only a tiny fraction of the total number of 25 billion electrons generated per second in the gaseous molecular tritium source of KATRIN. To maintain this huge number of decays, a closed tritium cycle at high throughput is mandatory. Operation of this unprecedented high-luminosity source requires the entire infrastructure of the Karlsruhe Tritium Laboratory, where the source cryostats are located. The adjacent huge electrostatic main spectrometer, 24 m long and 10 m in diameter, then acts as a precision filter to transmit only the extremely tiny fraction of highest-energy electrons carrying information about the neutrino mass. Variation of the ultra-precise (on the ppm scale) retarding potential over tens of volts then gives unprecedented precision in the spectroscopy of electrons from tritium decay.

    With the now established world-leading upper limit on the neutrino mass, KATRIN has taken its first successful step in elucidating unknown properties of neutrinos; many more steps will follow in the coming years. The two co-spokespersons look forward to further significant improvements of the neutrino mass sensitivity and to the search for novel effects beyond the Standard Model of Particle Physics. On behalf of the entire collaboration, they would also like to thank the awarding authorities for their long-term support in the realization and operation of the experiment: “KATRIN is not only a shining beacon of fundamental research and an outstandingly reliable high-tech instrument, but also a motor of international cooperation which provides first-class training of young researchers.”

    Being “The Research University in the Helmholtz Association,” KIT creates and imparts knowledge for society and the environment. Its objective is to make significant contributions to the global challenges in the fields of energy, mobility and information. For this, about 9,300 employees cooperate in a broad range of disciplines in natural sciences, engineering sciences, economics, and the humanities and social sciences. KIT prepares its 25,100 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Mission Statement of KIT

    Preamble

    The Karlsruhe Institute of Technology, briefly referred to as KIT, was established by the merger of the Forschungszentrum Karlsruhe GmbH and the Universität Karlsruhe (TH) on October 01, 2009. KIT combines the tasks of a university of the state of Baden-Württemberg with those of a research center of the Helmholtz Association in the areas of research, teaching, and innovation.

    The KIT merger represents the consistent continuation of a long-standing close cooperation of two research and education institutions rich in tradition. The University of Karlsruhe was founded in 1825 as a Polytechnical School and has developed to a modern location of research and education in natural sciences, engineering, economics, social sciences, and the humanities, which is organized in eleven departments. The Karlsruhe Research Center was founded in 1956 as the Nuclear Reactor Construction and Operation Company and has turned into a multidisciplinary large-scale research center of the Helmholtz Association, which conducts research under eleven scientific and engineering programs.

    In 2014/15, the KIT concentrated on an overarching strategy process to further develop its corporate strategy. This mission statement as the result of a participative process was the first element to be incorporated in the strategy process.

    Mission Statement of KIT

    KIT combines the traditions of a renowned technical university and a major large-scale research institution in a unique way. In research and education, KIT assumes responsibility for contributing to the sustainable solution of the grand challenges that face the society, industry, and the environment. For this purpose, KIT uses its financial and human resources with maximum efficiency. The scientists of KIT communicate the contents and results of their work to society.

    Engineering sciences, natural sciences, the humanities, and social sciences make up the scope of subjects covered by KIT. In high interdisciplinary interaction, scientists of these disciplines study topics extending from the fundamentals to application and from the development of new technologies to the reflection of the relationship between man and technology. For this to be accomplished in the best possible way, KIT’s research covers the complete range from fundamental research to close-to-industry, applied research and from small research partnerships to long-term large-scale research projects. Scientific sincerity and the striving for excellence are the basic principles of our activities.

    Worldwide exchange of knowledge, large-scale international research projects, numerous global cooperative ventures, and cultural diversity characterize and enrich the life and work at KIT. Academic education at KIT is guided by the principle of research-oriented teaching. Early integration into interdisciplinary research projects and international teams and the possibility of using unique research facilities open up exceptional development perspectives for our students.

    The development of viable technologies and their use in industry and the society are the cornerstones of KIT’s activities. KIT supports innovativeness and entrepreneurial culture in various ways. Moreover, KIT supports a culture of creativity, in which employees and students have time and space to develop new ideas.

    Cooperation of KIT employees, students, and members is characterized by mutual respect and trust. Achievements of every individual are highly appreciated. Employees and students of KIT are offered equal opportunities irrespective of the person. Family-friendliness is a major objective of KIT as an employer. KIT supports the compatibility of job and family. As a consequence, the leadership culture of KIT is also characterized by respect and cooperation. Personal responsibility and self-motivation of KIT employees and members are fostered by transparent and participative decisions, open communication, and various options for life-long learning.

    The structure of KIT is tailored to its objectives in research, education, and innovation. It supports flexible, synergy-based cooperation beyond disciplines, organizations, and hierarchies. Efficient services are rendered to support KIT employees and members in their work.

    Young people are our future. Reliable offers and career options excellently support KIT’s young scientists and professionals in their professional and personal development.

     
  • richardmitnick 12:40 pm on September 16, 2019 Permalink | Reply
    Tags: "Saving baby turtles one nest at a time", , , , Predation on turtle nests   

    From CSIROscope: “Saving baby turtles one nest at a time” 


    From CSIROscope

    16 September 2019
    Louise Jeckells

    Every year thousands of sea turtles come to the beaches of the western Cape to nest. Photo by Gina Zimny

    When baby sea turtles, or hatchlings, break free from their eggs they have to make a long and difficult journey to the ocean. These tiny newborns face a number of threats just trying to make it to the water’s edge. This running of the gauntlet is critical for their survival and the continuation of the species.

    But before they even leave the nest they’re already under threat. Predators taking eggs from the nest is one of the most significant threats to marine turtles. Feral pigs, goannas and dingoes are disturbing turtle nests in parts of Queensland’s western Cape York Peninsula. Before our research scientist Dr Justin Perry and Indigenous rangers from Aak Puul Ngangtam (APN Cape York) started working in the area, there was 100 per cent predation on turtle nests. No baby turtles were reaching the ocean.

    We’ve been working with the local community since 2008 to understand the impacts of feral animals on the ecological and economic values of northern Australia. Justin and his team focussed on a biocultural assessment to understand the impacts on turtles and to collaboratively design a solution with local people.

    “This was a big problem and the management actions being applied weren’t working,” Justin said.

    “The science and monitoring was separated from the management, and management was separated from the community.”

    Turtle survey team on a Cape York beach. Photo by Gina Zimny

    Forging a pig plan together

    “We brought together the regional bodies that were responsible for managing pigs and turtles to create the Northern Nest Project. This was when we started to look at the turtle problem in a holistic way,” Justin said.

    The working group decided to tackle the problem using a targeted control method. They trialed a baiting system to target the specific pigs that were coming onto the beach and eating the eggs.

    This control method was not popular with Traditional Owners as the effects on other species such as dingoes and birds were unknown. The scientists ran a very small-scale project, monitoring every bait station with cameras and providing regular reports. The efforts were rewarded. The following year there was a 100 per cent success rate for baby turtles hatching and reaching the ocean.

    Turtle measurements are essential to improving our understanding. The data is collected at night. Photo by Gina Zimny

    Specific predation plans

    After this success, we (through the Northern Australia Environmental Resources Hub project) and APN funded a full-time researcher to patrol the beach during the turtle nesting season. This provided a complete overview of the predation events and turtle nesting habits across the year. Once the team started measuring the pig impacts, they could see the impact of other predators in the area.

    Another Indigenous group had been using cages to stop feral pigs. But the aluminum cages were heavy and hard to manage. So Justin and his team decided to test the effectiveness of inexpensive and easy to carry garden mesh for protecting nests from predators. APN’s resident scientist, Gina Zimny, meshed hundreds of nests across the season. When the team tallied up the impact, it was clear that this method was only stopping a handful of predators. The mesh protected nests from hungry dingoes but only stopped around 10 per cent of goannas. And they were helpless against the destructive power of feral pigs.

    4
    Mesh protects turtle hatchlings from predation but still enables them to head for the ocean. Photo by Gina Zimny

    Protecting the nests of marine turtles from raids by pigs, dingoes and goannas requires species-specific management strategies. To tackle this challenge, we designed an interactive dashboard that combined all the data that had been collected for the past four years. This way rangers could see how effective their management efforts had been.

    On target

    Justin said the most efficient way of controlling predation was linking the monitoring data with management.

    “Everyone agreed that the value was getting more baby turtles into the ocean. And turtle experts had set a target of 70 per cent of nests hatching to maintain a healthy population.

    “To hit this metric of success, we knew we had to apply an adaptive management process to the problem,” Justin said.

    5
    Cape York. Photo by Gina Zimny

    “We’re working towards providing an immediate feedback loop on predation. The idea is to link an iPad application with a cloud server. Then when the rangers return every night, a summary dashboard updates and provides all the data required to react to the situation.

    “It’s such a vast landscape so the only way to win is to tackle one small task at a time. Trying to control the entire pig population is not feasible. But targeting the egg-eating individuals can be done and we have shown that it works,” Justin said.

    This year we are working on automating the data analysis and summaries so that rangers get to see what’s happening on their beaches in real-time. Having real-time data linked with planned management responses will close the adaptive management loop. And it will give the baby turtles the best chance of making it from nest to ocean.
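
    The nightly-summary idea is straightforward to prototype. The sketch below is purely illustrative: the field names and numbers are invented for the example and are not taken from the APN or Northern Nest Project systems. It simply shows the kind of aggregation a ranger-facing dashboard needs, predation rate per beach and which predator is currently responsible for the losses, so a targeted response can be planned each night.

```python
import pandas as pd

# Hypothetical patrol records only: column names and values are made up for
# illustration, not taken from the APN / Northern Nest Project dashboard.
patrol_records = pd.DataFrame([
    {"date": "2019-09-14", "beach": "north", "nests_checked": 12, "predated": 3, "predator": "pig"},
    {"date": "2019-09-14", "beach": "south", "nests_checked": 8,  "predated": 1, "predator": "goanna"},
    {"date": "2019-09-15", "beach": "north", "nests_checked": 10, "predated": 0, "predator": None},
])

# Nightly summary a dashboard could display: predation rate per beach and
# which predator is currently doing the damage.
summary = (patrol_records
           .groupby("beach")
           .agg(nests=("nests_checked", "sum"),
                predated=("predated", "sum"))
           .assign(predation_rate=lambda d: d["predated"] / d["nests"]))
print(summary)

losses_by_predator = (patrol_records.dropna(subset=["predator"])
                      .groupby("predator")["predated"].sum())
print("Losses by predator:\n", losses_by_predator)
```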

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in Mid West region of Western Australia

    So what can we expect these new radio projects to discover? We have no idea, but history tells us that they are almost certain to deliver some major surprises.

    Making these new discoveries may not be so simple. Gone are the days when astronomers could just notice something odd as they browsed their tables and graphs.

    Nowadays, astronomers are more likely to be distilling their answers from carefully-posed queries to databases containing petabytes of data. Human brains are just not up to the job of making unexpected discoveries in these circumstances, and instead we will need to develop “learning machines” to help us discover the unexpected.

    With the right tools and careful insight, who knows what we might find.

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 12:11 pm on September 16, 2019 Permalink | Reply
    Tags: , , First-year Research Immersion, SUNY Binghamton, , Undergraduate research, University of Maryland,   

    From The Conversation: “At these colleges, students begin serious research their first year” 

    Conversation
    From The Conversation

    September 16, 2019
    Nancy Stamp

    Rat brains to understand Parkinson’s disease. Drones to detect plastic landmines. Social media to predict acts of terrorism.

    These are just a few potentially lifesaving research projects that students have undertaken in recent years at universities in New York and Maryland. While each project is interesting by itself, there’s something different about these particular research projects – all three were carried out by undergraduates during their earliest years of college.

    That’s noteworthy because students usually have to wait until later in their college experience – even graduate school – to start doing serious research. While about one out of every five undergraduates gets some kind of research experience, the rest tend to get just “cookbook” labs that typically do not challenge students to think but merely require them to follow directions to achieve the “correct” answer.

    That’s beginning to change through First-year Research Immersion, an academic model that is part of an emergent trend meant to provide undergraduates with meaningful research experience.

    The First-year Research Immersion is a sequence of three course-based research experiences at three universities: University of Texas at Austin, University of Maryland and Binghamton University, where I teach science.

    As a scientist, researcher and professor, I see undergraduate research experience as a crucial part of college. And as the former director of my university’s First-year Research Immersion program for aspiring science or engineering majors, I also believe these research experiences better equip students to apply what they learn in different situations.

    There is evidence to support this view. For instance, a 2018 study found that undergraduate exposure to a rigorous research program “leads to success in a research STEM career.” The same study found that undergraduates who get a research experience are “more likely to pursue a Ph.D. program and generate significantly more valued products” compared to other students.

    A closer look

    Just what do these undergraduate research experiences look like?

    At the University of Texas, it involved having students identify a new way to manage and repair DNA, the stuff that makes up our genes. This in turn provides insights into preventing genetic disorders.

    At the University of Maryland, student teams investigated how social media promotes terrorism and found that it is possible to identify when conflicts on social media can escalate into physical violence.

    4
    Binghamton student William Frazer tests a drone with a sensor to detect plastic landmines. Jonathan Cohen/Binghamton University

    Essential elements

    The First-year Research Immersion began as an experiment at the University of Texas at Austin in 2005. The University of Maryland at College Park and Binghamton University – SUNY adapted the model to their institutions in 2014.

    The program makes research experiences an essential part of a college course. These course-based research experiences have five elements. Specifically, they:

    Engage students in scientific practices, such as how and why to take accurate measurements.
    Emphasize teamwork.
    Examine broadly relevant topics, such as the spread of Lyme disease.
    Explore questions with unknown answers to expose students to the process of scientific discovery.
    Repeat measurements or experiments to verify results.

    The model consists of three courses. In the first course, students identify an interesting problem, determine what’s known and unknown and collaborate to draft a preliminary project proposal.

    In the second course, students develop laboratory research skills, begin their team project and use the results to write a full research proposal.

    In the third course, during sophomore year, students execute their proposed research, produce a report and a research poster.

    This sequence of courses is meant to give all students – regardless of their prior academic experience – the time and support they need to be successful.

    Does it work?

    The First-year Research Immersion program is showing promising results. For instance, at Binghamton, where 300 students who plan to major in engineering and science participate in the program, a survey indicated that participants got 14% more research experience than students in traditional laboratory courses.

    At the University of Maryland, where 600 freshmen participate in the program, students reported that they made substantial gains in communication, time management, collaboration and problem-solving.

    At the University of Texas at Austin, where about 900 freshmen participate in the First-Year Research Immersion program in the natural sciences, educators found that program participants showed a 23% higher graduation rate than similar students who were not in the program. And this outcome took place irrespective of students’ gender, race or ethnicity, or whether they were the first in their family to attend college.

    All three programs have significantly higher numbers of students from minority groups than the campuses overall. For instance, at Binghamton University, there are 22% more students from underrepresented minority groups than the campus overall, university officials reported. This has significant implications for diversity because research shows that longer, more in-depth research experiences – ones that involve faculty – help students from minority groups and low-income students stick with college.

    5
    Akibo Watson, a neuroscience major at Binghamton University, conducts an analysis of brain tissue. Jonathan Cohen/Binghamton University

    Undergraduates who get research experience also enjoy professional and personal growth. Via online surveys and written comments, students routinely say that they improved dramatically in their self-confidence and career-building skills, such as communication, project management skills and teamwork.

    Students also report that their undergraduate research experience has helped them obtain internships or get into graduate school.

    Making research experiences available more broadly

    The challenge remains in making the opportunity of more undergraduate research experiences available to more students.

    The First-year Research Immersion program is not the only course-based research program that is connected to faculty research.

    However, to the best of my knowledge, the First-year Research Immersion programs at my university, and in Texas and Maryland, are the only such programs for first-year students that are overseen by a research scientist and involve taking three courses in a row. This three-course sequence allows student teams to delve deeply into real problems.

    More colleges could easily follow suit. For instance, traditional introductory lab courses could be transformed into research-based courses at no additional cost. And advanced lab courses could be converted to research experiences that build on those research-based courses. In that way, students could take the research projects they started during their first and second years of college even further.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Conversation launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     
  • richardmitnick 11:38 am on September 16, 2019 Permalink | Reply
    Tags: , Gaussian noise, , non-Gaussian noise, ,   

    From MIT News and Dartmouth College: “Uncovering the hidden “noise” that can kill qubits” 

    MIT News

    From MIT News

    September 16, 2019
    Rob Matheson

    1
    MIT and Dartmouth College researchers developed a tool that detects new characteristics of non-Gaussian “noise” that can destroy the fragile quantum superposition state of qubits, the fundamental components of quantum computers. Image courtesy of the researchers.

    New detection tool could be used to make quantum computers robust against unwanted environmental disturbances.

    MIT and Dartmouth College researchers have demonstrated, for the first time, a tool that detects new characteristics of environmental “noise” that can destroy the fragile quantum state of qubits, the fundamental components of quantum computers.

    The advance may provide insights into microscopic noise mechanisms to help engineer new ways of protecting qubits.

    Qubits can represent the two states corresponding to the classic binary bits, a 0 or 1. But, they can also maintain a “quantum superposition” of both states simultaneously, enabling quantum computers to solve complex problems that are practically impossible for classical computers.

    But a qubit’s quantum “coherence” — meaning its ability to maintain the superposition state — can fall apart due to noise coming from the environment around the qubit. Noise can arise from control electronics, heat, or impurities in the qubit material itself, and can also cause serious computing errors that may be difficult to correct.

    Researchers have developed statistics-based models to estimate the impact of unwanted noise sources surrounding qubits to create new ways to protect them, and to gain insights into the noise mechanisms themselves. But, those tools generally capture simplistic “Gaussian noise,” essentially the collection of random disruptions from a large number of sources. In short, it’s like white noise coming from the murmuring of a large crowd, where there’s no specific disruptive pattern that stands out, so the qubit isn’t particularly affected by any one particular source. In this type of model, the probability distribution of the noise would form a standard symmetrical bell curve, regardless of the statistical significance of individual contributors.

    In a paper published today in the journal Nature Communications, the researchers describe a new tool that, for the first time, measures “non-Gaussian noise” affecting a qubit. This noise features distinctive patterns that generally stem from a few particularly strong noise sources.

    The researchers designed techniques to separate that noise from the background Gaussian noise, and then used signal-processing techniques to reconstruct highly detailed information about those noise signals. Those reconstructions can help researchers build more realistic noise models, which may enable more robust methods to protect qubits from specific noise types. There is now a need for such tools, the researchers say: Qubits are being fabricated with fewer and fewer defects, which could increase the presence of non-Gaussian noise.

    “It’s like being in a crowded room. If everyone speaks with the same volume, there is a lot of background noise, but I can still maintain my own conversation. However, if a few people are talking particularly loudly, I can’t help but lock on to their conversation. It can be very distracting,” says William Oliver, an associate professor of electrical engineering and computer science, professor of the practice of physics, MIT Lincoln Laboratory Fellow, and associate director of the Research Laboratory for Electronics (RLE). “For qubits with many defects, there is noise that decoheres, but we generally know how to handle that type of aggregate, usually Gaussian noise. However, as qubits improve and there are fewer defects, the individuals start to stand out, and the noise may no longer be simply of a Gaussian nature. We can find ways to handle that, too, but we first need to know the specific type of non-Gaussian noise and its statistics.”

    “It is not common for theoretical physicists to be able to conceive of an idea and also find an experimental platform and experimental colleagues willing to invest in seeing it through,” says co-author Lorenza Viola, a professor of physics at Dartmouth. “It was great to be able to come to such an important result with the MIT team.”

    Joining Oliver and Viola on the paper are: first author Youngkyu Sung, Fei Yan, Jack Y. Qiu, Uwe von Lüpke, Terry P. Orlando, and Simon Gustavsson, all of RLE; David K. Kim and Jonilyn L. Yoder of the Lincoln Laboratory; and Félix Beaudoin and Leigh M. Norris of Dartmouth.

    Pulse filters

    For their work, the researchers leveraged the fact that superconducting qubits are good sensors for detecting their own noise. Specifically, they used a “flux” qubit, which consists of a superconducting loop capable of detecting a particular type of disruptive noise, magnetic flux noise, from its surrounding environment.

    In the experiments, they induced non-Gaussian “dephasing” noise by injecting engineered flux noise that disturbs the qubit and makes it lose coherence, which in turn is then used as a measuring tool. “Usually, we want to avoid decoherence, but in this case, how the qubit decoheres tells us something about the noise in its environment,” Oliver says.

    Specifically, they shot 110 “pi-pulses” — which are used to flip the states of qubits — in specific sequences over tens of microseconds. Each pulse sequence effectively created a narrow frequency “filter” that masked out much of the noise except in a particular frequency band. By measuring the response of the qubit sensor to the bandpass-filtered noise, they extracted the noise power in that frequency band.
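
    To make the movable-filter idea concrete, here is a small numerical sketch, not the authors’ code, of the standard filter-function picture: an ideal sequence of N pi-pulses applied over a total time T acts like a bandpass filter centered near N/(2T), so changing the number or spacing of pulses moves the frequency at which the qubit samples its noise. The pulse count of 110 and the microsecond timescale simply echo the numbers quoted above; everything else is an assumption for illustration.

```python
import numpy as np

def cpmg_filter_function(freqs_hz, n_pulses, total_time_s):
    """|F(f)|^2 for an ideal CPMG-style sequence of n_pulses pi-pulses over total_time_s.
    Computed from the piecewise +/-1 toggling function that flips sign at each pulse."""
    # Pulse times for CPMG: t_k = (k - 0.5) * T / N
    t_pulses = (np.arange(1, n_pulses + 1) - 0.5) * total_time_s / n_pulses
    edges = np.concatenate(([0.0], t_pulses, [total_time_s]))
    signs = (-1.0) ** np.arange(len(edges) - 1)   # toggling function alternates +1 / -1
    w = 2 * np.pi * freqs_hz
    F = np.zeros_like(freqs_hz, dtype=complex)
    for s, t0, t1 in zip(signs, edges[:-1], edges[1:]):
        # integral of s * exp(i w t) over [t0, t1]
        F += s * (np.exp(1j * w * t1) - np.exp(1j * w * t0)) / (1j * w)
    return np.abs(F) ** 2

freqs = np.linspace(1e3, 5e6, 20000)   # 1 kHz to 5 MHz
T = 20e-6                              # assume 20 microseconds of free evolution
for n in (10, 50, 110):
    ff = cpmg_filter_function(freqs, n, T)
    f_peak = freqs[np.argmax(ff)]
    print(f"N = {n:3d} pulses -> passband centered near {f_peak / 1e6:.2f} MHz "
          f"(expected ~ N/(2T) = {n / (2 * T) / 1e6:.2f} MHz)")
```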

    By modifying the pulse sequences, they could move filters up and down to sample the noise at different frequencies. Notably, in doing so, they tracked how the non-Gaussian noise distinctly causes the qubit to decohere, which provided a high-dimensional spectrum of the non-Gaussian noise.

    Error suppression and correction

    The key innovation behind the work is carefully engineering the pulses to act as specific filters that extract properties of the “bispectrum,” a two-dimensional representation that gives information about distinctive time correlations of non-Gaussian noise.

    Essentially, by reconstructing the bispectrum, they could find properties of non-Gaussian noise signals impinging on the qubit over time — ones that don’t exist in Gaussian noise signals. The general idea is that, for Gaussian noise, correlations exist only between pairs of points in time, which is referred to as a “second-order time correlation.” But for non-Gaussian noise, the properties at one point in time will directly correlate with properties at multiple future points. Such “higher-order” correlations are the hallmark of non-Gaussian noise. In this work, the authors were able to extract noise with correlations between three points in time.
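
    A minimal numerical illustration of that statement, and not the reconstruction procedure used in the paper: for zero-mean Gaussian noise the third-order correlator <x(t) x(t+tau1) x(t+tau2)> vanishes, while a non-Gaussian signal can leave a clear third-order signature. The bispectrum is the two-dimensional Fourier transform of that correlator over the two time lags.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def third_order_corr(x, tau1, tau2):
    """Estimate C3(tau1, tau2) = <x(t) x(t+tau1) x(t+tau2)> for a zero-mean series.
    The bispectrum is the 2-D Fourier transform of C3 over (tau1, tau2); for Gaussian
    noise C3 vanishes, so any structure here signals non-Gaussian noise."""
    m = len(x) - max(tau1, tau2)
    return np.mean(x[:m] * x[tau1:tau1 + m] * x[tau2:tau2 + m])

gaussian = rng.normal(size=n)            # symmetric bell-curve noise
skewed = rng.normal(size=n) ** 2 - 1.0   # zero mean, but clearly non-Gaussian (skewed)

for name, x in (("Gaussian", gaussian), ("non-Gaussian (skewed)", skewed)):
    print(f"{name:>22s}: C3(0,0) ~ {third_order_corr(x, 0, 0):+.3f}")
```

    Running this, the Gaussian series gives a third-order correlator indistinguishable from zero, while the skewed series gives a large, stable value, which is exactly the kind of higher-order statistic the bispectrum exposes.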

    This information can help programmers validate and tailor dynamical error suppression and error-correcting codes for qubits, which fixes noise-induced errors and ensures accurate computation.

    Such protocols use information from the noise model to make implementations that are more efficient for practical quantum computers. But, because the details of noise aren’t yet well-understood, today’s error-correcting codes are designed with that standard bell curve in mind. With the researchers’ tool, programmers can either gauge how their code will work effectively in realistic scenarios or start to zero in on non-Gaussian noise.

    Keeping with the crowded-room analogy, Oliver says: “If you know there’s only one loud person in the room, then you’ll design a code that effectively muffles that one person, rather than trying to address every possible scenario.”

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 11:01 am on September 16, 2019 Permalink | Reply
    Tags: , , , Metal oxide sensors, Methanol is sometimes referred to as ethanol’s deadly twin   

    From ETH Zürich: “Measuring ethanol’s deadly twin” 

    ETH Zurich bloc

    From ETH Zürich

    16.09.2019
    Fabio Bergamin

    ETH researchers have developed an inexpensive, handheld measuring device that can distinguish between methanol and potable alcohol. It offers a simple, quick method of detecting adulterated or contaminated alcoholic beverages and is able to diagnose methanol poisoning in exhaled breath.

    1
    Women in India sell home-brewed alcohol, which may contain toxic amounts of methanol. (Photograph: Shutterstock / Steve Estvanik)

    Methanol is sometimes referred to as ethanol’s deadly twin. While ethanol is the intoxicating ingredient in wine, beer and schnapps, methanol is a chemical that becomes highly toxic when metabolised by the human body. Even a relatively small amount of methanol can cause blindness or may prove fatal if left untreated.

    Cases of poisoning from the consumption of alcoholic beverages tainted with methanol occur time and again, particularly in developing and emerging countries, because alcoholic fermentation also produces small quantities of methanol. Whenever alcohol is distilled unprofessionally in backyard operations, significant amounts of methanol may end up in the liquor. Beverages that have been adulterated with windscreen washer fluid or other liquids containing methanol are another potential cause of poisoning.

    Beverage analyses and the breath test

    Until now, methanol could be distinguished from ethanol only in a chemical analysis laboratory. Even hospitals require relatively large, expensive equipment in order to diagnose methanol poisoning. “These appliances are rarely available in emerging and developing countries, where outbreaks of methanol poisoning are most prevalent,” says Andreas Güntner, a research group leader at the Particle Technology Laboratory of ETH Professor Sotiris Pratsinis and a researcher at the University Hospital Zürich.

    He and his colleagues have now developed an affordable handheld device based on a small metal oxide sensor. It is able to detect adulterated alcohol within two minutes by “sniffing out” methanol and ethanol vapours from a beverage. Moreover, the tool can also be used to diagnose methanol poisoning by analysing a patient’s exhaled breath. In an emergency, this helps ensure the appropriate measures are taken without delay.

    Separating methanol from ethanol

    There’s nothing new about using metal oxide sensors to measure alcoholic vapours. However, this method was unable to distinguish between different alcohols, such as ethanol and methanol. “Even the breathalyser tests used by the police measure only ethanol, although some devices also erroneously identify methanol as ethanol,” explains Jan van den Broek, a doctoral student at ETH and the lead author of the study.

    First, the ETH scientists developed a highly sensitive alcohol sensor using nanoparticles of tin oxide doped with palladium. Next, they used a trick to differentiate between methanol and ethanol. Instead of analysing the sample directly with the sensor, the two types of alcohol are first separated in an attached tube filled with a porous polymer, through which the sample air is sucked by a small pump. As its molecules are smaller, methanol passes through the polymer tube more quickly than ethanol.
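
    In effect, the polymer tube works like a miniature chromatography column: a chemical difference (molecule size) is converted into a difference in arrival time at a single, non-selective sensor. The sketch below is an illustrative model of that principle only; the retention times, peak widths and threshold are invented for the example and are not measurements from the ETH device.

```python
import numpy as np

# Illustrative, assumed values only (not from the ETH device): methanol, being
# smaller, is assumed to elute well before ethanol.
T_METHANOL_S = 20.0    # assumed methanol arrival time at the sensor
T_ETHANOL_S = 60.0     # assumed ethanol arrival time
PEAK_WIDTH_S = 5.0     # assumed peak width

def sensor_trace(t, methanol_frac, total_alcohol=1.0):
    """Simulated sensor signal vs time for a sample with a given methanol fraction."""
    def peak(t0, amplitude):
        return amplitude * np.exp(-0.5 * ((t - t0) / PEAK_WIDTH_S) ** 2)
    return (peak(T_METHANOL_S, methanol_frac * total_alcohol)
            + peak(T_ETHANOL_S, (1 - methanol_frac) * total_alcohol))

t = np.arange(0, 120, 0.5)
for frac in (0.0, 0.02, 0.20):
    y = sensor_trace(t, frac)
    early = y[(t > 10) & (t < 30)].max()   # window where only methanol should appear
    verdict = "FLAG methanol" if early > 0.01 else "ok"
    print(f"methanol fraction {frac:4.0%}: early-window peak = {early:.3f} -> {verdict}")
```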

    2
    The millimetre-sized black dot in the centre of the gold section is the alcohol sensor.

    3
    In this image, the sensor is inside the white casing. To its right is the polymer tube in which methanol is separated from ethanol. (Photographs: Van den Broek J et al. Nature Communications 2019)

    The measuring device proved to be exceptionally sensitive. In laboratory tests, it detected even trace amounts of methanol contamination selectively in alcoholic beverages, down to the low legal limits. Furthermore, the scientists analysed breath samples from a person who had previously drunk rum. For test purposes, the researchers subsequently added a small quantity of methanol to the breath sample.

    Patent pending

    The researchers have filed a patent application for the measuring method. They are now working to integrate the technology into a device that can be put to practical use. “This technology is low cost, making it suitable for use in developing countries as well. Moreover, it’s simple to use and can be operated even without laboratory training, for example by authorities or tourists,” Güntner says. It is also ideal for quality control in distilleries.

    Methanol is more than just a nuisance in conjunction with alcoholic beverages; it is also an important industrial chemical – and one that might come to play an even more important role: methanol is being considered as a potential future fuel, since vehicles can be powered with methanol fuel cells. So a further application for the new technology could be as an alarm sensor to detect leaks in tanks.

    The study was part of the University Medicine Zürich – Zürich Exhalomics flagship project.

    Science paper:
    Highly selective detection of methanol over ethanol by a handheld gas sensor
    Nature Communications

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
  • richardmitnick 10:26 am on September 16, 2019 Permalink | Reply
    Tags: "Why carbon dioxide has such outsized influence on Earth’s climate", , , ,   

    From University of North Carolina via EarthSky: “Why carbon dioxide has such outsized influence on Earth’s climate” 

    From University of North Carolina

    via

    1

    EarthSky

    September 16, 2019
    Jason West

    Carbon dioxide, CO2, makes up less than one-twentieth of 1% of Earth’s atmosphere. How does this relatively scarce gas control Earth’s thermostat?

    I am often asked how carbon dioxide can have an important effect on global climate when its concentration is so small – just 0.041% of Earth’s atmosphere. And human activities are responsible for just 32% of that amount.

    1
    The ‘Keeling Curve,’ named for scientist Charles David Keeling, tracks the accumulation of carbon dioxide in Earth’s atmosphere, measured in parts per million. Image via Scripps Institution of Oceanography.

    NASA Orbiting Carbon Observatory 2, NASA JPL-Caltech

    Early greenhouse science

    The scientists who first identified carbon dioxide’s importance for climate in the 1850s were also surprised by its influence. Working separately, John Tyndall in England and Eunice Foote in the United States found that carbon dioxide, water vapor and methane all absorbed heat, while more abundant gases did not.

    Scientists had already calculated that the Earth was about 59 degrees Fahrenheit (33 degrees Celsius) warmer than it should be, given the amount of sunlight reaching its surface. The best explanation for that discrepancy was that the atmosphere retained heat to warm the planet.

    Tyndall and Foote showed that nitrogen and oxygen, which together account for 99% of the atmosphere, had essentially no influence on Earth’s temperature because they did not absorb heat. Rather, they found that gases present in much smaller concentrations were entirely responsible for maintaining temperatures that made the Earth habitable, by trapping heat to create a natural greenhouse effect.
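
    That 33-degree discrepancy can be reproduced with a simple energy balance: set the sunlight Earth absorbs equal to the blackbody radiation it emits and solve for temperature. The sketch below uses standard round numbers (a solar constant of about 1361 W/m² and an albedo of about 0.3) rather than figures from the article.

```python
# Minimal energy-balance estimate of the "should be" temperature quoted above:
# balance absorbed sunlight against blackbody emission, ignoring the atmosphere.
SOLAR_CONSTANT = 1361.0   # W/m^2 at Earth's distance (standard value)
ALBEDO = 0.30             # fraction of sunlight reflected (standard value)
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4       # averaged over the whole sphere
t_no_greenhouse = (absorbed / SIGMA) ** 0.25       # Stefan-Boltzmann law
t_observed = 288.0                                 # ~15 C global mean surface temperature

print(f"Without greenhouse gases: {t_no_greenhouse:.0f} K ({t_no_greenhouse - 273.15:.0f} C)")
print(f"Observed surface average: {t_observed:.0f} K; difference ~ "
      f"{t_observed - t_no_greenhouse:.0f} C (~59 F)")
```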

    A blanket in the atmosphere

    Earth constantly receives energy from the sun and radiates it back into space. For the planet’s temperature to remain constant, the net heat it receives from the sun must be balanced by outgoing heat that it gives off.

    Since the sun is hot, it gives off energy in the form of shortwave radiation at mainly ultraviolet and visible wavelengths. Earth is much cooler, so it emits heat as infrared radiation, which has longer wavelengths.
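
    Wien’s displacement law makes this shortwave/longwave split quantitative: the peak emission wavelength scales as 1/T, so the hot sun peaks in the visible while the much cooler Earth peaks deep in the infrared. The temperatures in the quick check below are standard textbook values, not figures from the article.

```python
# Back-of-the-envelope check using Wien's displacement law: lambda_peak = b / T.
WIEN_B = 2.898e-3   # Wien displacement constant, m*K

for name, temp_k in (("Sun (photosphere)", 5772.0), ("Earth (surface)", 288.0)):
    lam_peak_m = WIEN_B / temp_k
    print(f"{name:18s} T = {temp_k:6.0f} K -> peak emission near "
          f"{lam_peak_m * 1e6:5.2f} micrometres")
```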

    3
    The electromagnetic spectrum is the range of all types of EM radiation – energy that travels and spreads out as it goes. The sun is much hotter than the Earth, so it emits radiation at a higher energy level, which has a shorter wavelength. Image via NASA.

    Carbon dioxide and other heat-trapping gases have molecular structures that enable them to absorb infrared radiation. The bonds between atoms in a molecule can vibrate in particular ways, like the pitch of a piano string. When the energy of a photon corresponds to the frequency of the molecule, it is absorbed and its energy transfers to the molecule.

    Carbon dioxide and other heat-trapping gases have three or more atoms and frequencies that correspond to infrared radiation emitted by Earth. Oxygen and nitrogen, with just two atoms in their molecules, do not absorb infrared radiation.

    Most incoming shortwave radiation from the sun passes through the atmosphere without being absorbed. But most outgoing infrared radiation is absorbed by heat-trapping gases in the atmosphere. Then they can release, or re-radiate, that heat. Some returns to Earth’s surface, keeping it warmer than it would be otherwise.

    4
    Earth receives solar energy from the sun (yellow), and returns energy back to space by reflecting some incoming light and radiating heat (red). Greenhouse gases trap some of that heat and return it to the planet’s surface. Image via NASA/Wikimedia.

    Research on heat transmission

    During the Cold War, the absorption of infrared radiation by many different gases was studied extensively. The work was led by the U.S. Air Force, which was developing heat-seeking missiles and needed to understand how to detect heat passing through air.

    This research enabled scientists to understand the climate and atmospheric composition of all planets in the solar system by observing their infrared signatures. For example, Venus is about 870 F (470 C) because its thick atmosphere is 96.5% carbon dioxide.

    It also informed weather forecast and climate models, allowing them to quantify how much infrared radiation is retained in the atmosphere and returned to Earth’s surface.

    People sometimes ask me why carbon dioxide is important for climate, given that water vapor absorbs more infrared radiation and the two gases absorb at several of the same wavelengths. The reason is that Earth’s upper atmosphere controls the radiation that escapes to space. The upper atmosphere is much less dense and contains much less water vapor than near the ground, which means that adding more carbon dioxide significantly influences how much infrared radiation escapes to space.

    Observing the greenhouse effect

    Have you ever noticed that deserts are often colder at night than forests, even if their average temperatures are the same? Without much water vapor in the atmosphere over deserts, the radiation they give off escapes readily to space. In more humid regions radiation from the surface is trapped by water vapor in the air. Similarly, cloudy nights tend to be warmer than clear nights because more water vapor is present.

    The influence of carbon dioxide can be seen in past changes in climate. Ice cores spanning the past million years show that carbon dioxide concentrations were high during warm periods – about 0.028%. During ice ages, when the Earth was roughly 7 to 13 F (4-7 C) cooler than in the 20th century, carbon dioxide made up only about 0.018% of the atmosphere.

    Even though water vapor is more important for the natural greenhouse effect, changes in carbon dioxide have driven past temperature changes. In contrast, water vapor levels in the atmosphere respond to temperature. As Earth becomes warmer, its atmosphere can hold more water vapor, which amplifies the initial warming in a process called the “water vapor feedback.” Variations in carbon dioxide have therefore been the controlling influence on past climate changes.

    Small change, big effects

    It shouldn’t be surprising that a small amount of carbon dioxide in the atmosphere can have a big effect. We take pills that are a tiny fraction of our body mass and expect them to affect us.

    Today the level of carbon dioxide is higher than at any time in human history. Scientists widely agree that Earth’s average surface temperature has already increased by about 2 F (1 C) since the 1880s, and that human-caused increases in carbon dioxide and other heat-trapping gases are extremely likely to be responsible.

    Without action to control emissions, carbon dioxide might reach 0.1% of the atmosphere by 2100, more than triple the level before the Industrial Revolution. This would be a faster change than transitions in Earth’s past that had huge consequences. Without action, this little sliver of the atmosphere will cause big problems.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UNC-University of North Carolina
    Carolina’s vibrant people and programs attest to the University’s long-standing place among leaders in higher education since it was chartered in 1789 and opened its doors for students in 1795 as the nation’s first public university. Situated in the beautiful college town of Chapel Hill, N.C., UNC has earned a reputation as one of the best universities in the world. Carolina prides itself on a strong, diverse student body, academic opportunities not found anywhere else, and a value unmatched by any public university in the nation.

     