Tagged: Applied Research & Technology

  • richardmitnick 10:19 am on February 19, 2018 Permalink | Reply
    Tags: Applied Research & Technology, C1 complex, CryoEM-cryo electron microscopy, CryoET-Cryo electron tomography, U Utrecht

    From U Utrecht: “Unexpected immune activation illustrated in the cold” 

    Utrecht University

    15 February 2018

    Monica van der Garde
    Public Information Officer
    m.vandergarde@uu.nl
    +31 (0)6 13 66 14 38

    Press Office Leiden University Medical Center
    pers@lumc.nl
    +31 6 11 37 11 46
    +31 71 526 8005

    Combining CryoEM and CryoET lets researchers see the C1 complex in 3D (coloured model) bound to antibodies in a native state (background).

    Researchers at Utrecht University and Leiden University Medical Center, the Netherlands, have for the first time made a picture of an important on-switch of our immune system. Their novel technical approach has already led to the discovery of not one, but two ways in which the immune system can be activated. New insights of this kind are important for designing better therapies against infections or cancer, according to team leaders Piet Gros and Thom Sharp. Their findings were published on February 16, 2018 in the journal Science.

    When invading microbes, viruses and tumours are detected in our bodies, our antibodies engage in an immediate defence strategy. They quickly raise warning signs on these aberrant surfaces that alert our body’s immune system of a security breach. This is the entry cue of several molecules, together called the C1 complex, that stick to the surface of the rogue cell and eliminate it from our body. Until recently, it was unknown how exactly invaders were recognized, and how this C1 complex was activated.

    Challenging

    Studying the C1 complex has been challenging since its components often clump together when taken out of their natural environment into a lab setting. Together with the international biotech company Genmab A/S, researchers from Utrecht University and Leiden University Medical Center have now developed a unique technical approach to studying it in a more natural environment – and discovered more than expected.

    Life-like detailed picture

    In order to capture the binding and interaction of the complex, Piet Gros of Utrecht University and Thom Sharp of Leiden University Medical Center combined two imaging techniques, cryo electron microscopy (CryoEM) and cryo electron tomography (CryoET). “These technologies are exploding in the field,” says Thom Sharp, “and each method gives us different but complementary information on the same complex.” When combined, these methods provide a more life-like, detailed picture of the system.

    Reconstruction into a 3D representation

    For CryoEM, think of taking thousands of copies of the same convoluted complex and scattering them onto the sticky side of a piece of tape. The camera is in a fixed position and takes pictures of these particles, which may have landed right-side-up, on their sides, or on a point. CryoET, on the other hand, can image the complex in a more natural environment, as it is bound to the cell surface. It takes images of the complex from different angles, similar to a CT scan, where the particle rotates within the instrument. For both techniques, the images are then reconstructed into a 3D representation of the complex.

    Very different mechanisms identified

    The researchers were surprised to find not one, but two ways in which the immune system can be activated: by physical distortion and by cross-activation. In some cases, the configuration of danger signals on a cell’s surface is sparse, and when antibodies bind, the entire complex must physically adjust or distort itself to properly fit. This adjustment of a single complex can set off an immune response. In other situations, where the danger signals are dense, multiple C1 complexes can help activate each other, like a neighbourhood watch system.

    First report

    This is the first report of two independent ways by which our immune system can be activated. In addition, the combination of CryoEM and CryoET enabled the visualization of details of these interactions that may enable researchers to create more specific therapeutics that can activate, slow down or stop the cascade of signals within our immune system.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Utrecht University (UU; Dutch: Universiteit Utrecht, formerly Rijksuniversiteit Utrecht) is a university in Utrecht, the Netherlands. It is one of the oldest universities in the Netherlands. Established March 26, 1636, it had an enrollment of 29,425 students in 2016, and employed 5,568 faculty and staff. In 2011, 485 PhD degrees were awarded and 7,773 scientific articles were published. The 2013 budget of the university was €765 million.

    The university is rated as the best university in the Netherlands by the Shanghai Ranking of World Universities 2013, and ranked as the 13th best university in Europe and the 52nd best university of the world.

    The university’s motto is “Sol Iustitiae Illustra Nos,” which means “Sun of Justice, shine upon us.” This motto was gleaned from a literal Latin Bible translation of Malachi 4:2. (Rutgers University, having a historical connection with Utrecht University, uses a modified version of this motto.) Utrecht University is led by the University Board, consisting of prof. dr. Bert van der Zwaan (Rector Magnificus) and Hans Amman.

     
  • richardmitnick 9:11 am on February 19, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Electric Eels, Electrocytes

    From The Atlantic: “A New Kind of Soft Battery, Inspired by the Electric Eel” 

    The Atlantic Magazine

    Dec 13, 2017
    Ed Yong

    Thomas Schroeder / Anirvan Guha

    In 1799, the Italian scientist Alessandro Volta fashioned an arm-long stack of zinc and copper discs, separated by salt-soaked cardboard. This “voltaic pile” was the world’s first synthetic battery, but Volta based its design on something far older—the body of the electric eel.

    This infamous fish makes its own electricity using an electric organ that makes up 80 percent of its two-meter length. The organ contains thousands of specialized muscle cells called electrocytes. Each produces only a small voltage, but together, they can generate up to 600 volts—enough to stun a human, or even a horse. They also provided Volta with ideas for his battery, turning him into a 19th-century celebrity.

    Two centuries on, and batteries are everyday objects. But even now, the electric eel isn’t done inspiring scientists. A team of researchers led by Michael Mayer at the University of Fribourg have now created a new kind of power source [Nature] that ingeniously mimics the eel’s electric organ. It consists of blobs of multicolored gels, arranged in long rows much like the eel’s electrocytes. To turn this battery on, all you need to do is to press the gels together.

    Unlike conventional batteries, the team’s design is soft and flexible, and might be useful for powering the next generation of soft-bodied robots. And since it can be made from materials that are compatible with our bodies, it could potentially drive the next generation of pacemakers, prosthetics, and medical implants. Imagine contact lenses that generate electric power, or pacemakers that run on the fluids and salts within our own bodies—all inspired by a shocking fish.

    To create their unorthodox battery, the team members Tom Schroeder and Anirvan Guha began by reading up on how the eel’s electrocytes work. These cells are stacked in long rows with fluid-filled spaces between them. Picture a very tall tower of syrup-smothered pancakes, turned on its side, and you’ll get the idea.

    When the eel’s at rest, each electrocyte pumps positively charged ions out of both its front-facing and back-facing sides. This creates two opposing voltages that cancel each other out. But at the eel’s command, the back side of each electrocyte flips, and starts pumping positive ions in the opposite direction, creating a small voltage across the entire cell. And crucially, every electrocyte performs this flip at the same time, so their tiny voltages add up to something far more powerful. It’s as if the eel has thousands of small batteries in its tail; half are pointing in the wrong direction but it can flip them at a whim, so that all of them align. “It’s insanely specialized,” says Schroeder.
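    The "thousands of small batteries" picture above is just series addition of per-cell voltages, and can be sketched in a few lines of Python. The per-cell figure of 150 mV is an illustrative assumption, chosen so that the article's 600-volt total falls out of a plausible cell count; it is not a number from the article.

    ```python
    # Toy model of the eel's series-stacked electrocytes. The ~150 mV per-cell
    # voltage is an illustrative assumption, not a figure from the article.
    PER_CELL_VOLTAGE = 0.15  # volts

    def stack_voltage(n_cells, fraction_aligned):
        """Net voltage of n_cells in series; misaligned cells cancel aligned ones."""
        aligned = n_cells * fraction_aligned
        misaligned = n_cells - aligned
        return (aligned - misaligned) * PER_CELL_VOLTAGE

    # At rest, half the "batteries" point each way, so the voltages cancel.
    print(stack_voltage(4000, fraction_aligned=0.5))  # 0.0
    # When the eel fires, every cell flips into alignment at once.
    print(stack_voltage(4000, fraction_aligned=1.0))  # 600.0
    ```

    The point of the model is the all-at-once flip: the same cells produce either zero or hundreds of volts depending only on alignment.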

    How an electric eel’s electrocytes work (Schroeder et al. / Nature).

    He and his colleagues first thought about re-creating the entire electric organ in a lab, but soon realized that it’s far too complicated. Next, they considered setting up a massive series of membranes to mimic the stacks of electrocytes—but these are delicate materials that are hard to engineer in the thousands. If one broke, the whole series would shut down. “You’d run into the string-of-Christmas-lights problem,” says Schroeder.

    In the end, he and Guha opted for a much simpler setup, involving lumps of gel that are arranged on two separate sheets. Look at the image below, and focus on the bottom sheet. The red gels contain saltwater, while blue ones contain freshwater. Ions would flow from the former to the latter, but they can’t because the gels are separated. That changes when the green and yellow gels on the other sheet bridge the gaps between the blue and red ones, providing channels through which ions can travel.

    Here’s the clever bit: The green gel lumps only allow positive ions to flow through them, while the yellow ones only let negative ions pass. This means (as the inset in the image shows) that positive ions flow into the blue gels from only one side, while negative ions flow in from the other. This creates a voltage across the blue gel, exactly as if it were an electrocyte. And just as with the electrocytes, each gel produces only a tiny voltage, but thousands of them, arranged in a row, can produce up to 110 volts.

    Schroeder et al. / Nature.

    The eel’s electrocytes fire when they receive a signal from the animal’s neurons. But in Schroeder’s gels, the trigger is far simpler—all he needs to do is to press the gels together.

    It would be cumbersome to have incredibly large sheets of these gels. But Max Shtein, an engineer at the University of Michigan, suggested a clever solution—origami. Using a special folding pattern that’s also used to pack solar panels into satellites, he devised a way of folding a flat sheet of gels so the right colors come into contact in the right order. That allowed the team to generate the same amount of power in a much smaller space—in something like a contact lens, which might one day be realistically worn.

    For now, such batteries would have to be actively recharged. Once activated, they produce power for up to a few hours, until the levels of ions equalize across the various gels, and the battery goes flat. You then need to apply a current to reset the gels back to alternating rows of high-salt and low-salt. But Schroeder notes that our bodies constantly replenish reservoirs of fluid with varying levels of ions. He imagines that it might one day be possible to harness these reservoirs to create batteries.

    Essentially, that would turn humans into something closer to an electric eel. It’s unlikely that we’d ever be able to stun people, but we could conceivably use the ion gradients in our own bodies to power small implants. Of course, Schroeder says, that’s still more a flight of fancy than a goal he has an actual road map for. “Plenty of things don’t work for all sorts of reasons, so I don’t want to get too far ahead of myself,” he says.

    It’s not unreasonable to speculate, though, says Ken Catania from Vanderbilt University, who has spent years studying the biology of the eels. “Volta’s battery was not exactly something you could fit in a cellphone, but over time we have all come to depend on it,” he says. “Maybe history will repeat itself.”

    “I’m amazed at how much electric eels have contributed to science,” he adds. “It’s a good lesson in the value of basic science.” Schroeder, meanwhile, has only ever seen electric eels in zoos, and he’d like to encounter one in person. “I’ve never been shocked by one, but I feel like I should at some point,” he says.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 8:51 am on February 19, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Blood and urine tests developed to indicate autism in children

    From U Warwick: “Blood and urine tests developed to indicate autism in children” 

    University of Warwick

    19 February 2018

    Credit: CC0 Public Domain

    New tests which can indicate autism in children have been developed by researchers at the University of Warwick.

    The academic team who conducted the international research believe that their new blood and urine tests which search for damage to proteins are the first of their kind.

    The tests could lead to earlier detection of autism spectrum disorders (ASD), and consequently children with autism could be given appropriate treatment much earlier in their lives. ASDs are defined as developmental disorders mainly affecting social interaction, and they can include a wide spectrum of behavioural problems, including speech disturbances, repetitive and/or compulsive behaviour, hyperactivity, anxiety, and difficulty adapting to new environments, with or without cognitive impairment. Since there is a wide range of ASD symptoms, diagnosis can be difficult and uncertain, particularly at the early stages of development.

    The paper, “Advanced glycation endproducts, dityrosine, and arginine transporter dysfunction in autism—a source of biomarkers for clinical diagnosis,” has been published in Molecular Autism. The team was led by Dr Naila Rabbani, Reader of Experimental Systems Biology at the University of Warwick, who said: “Our discovery could lead to earlier diagnosis and intervention.”

    “We hope the tests will also reveal new causative factors. With further testing we may reveal specific plasma and urinary profiles or “fingerprints” of compounds with damaging modifications. This may help us improve the diagnosis of ASD and point the way to new causes of ASD.”

    The team, which is based at the University’s Warwick Medical School, involves academics at the University of Warwick’s Warwick Systems Biology group, the University of Birmingham, the University of Bologna, the Institute of Neurological Sciences, Bologna, and the Don Carlo Gnocchi Foundation ONLUS. They found a link between ASD and damage to proteins in blood plasma by oxidation and glycation – processes where reactive oxygen species (ROS) and sugar molecules spontaneously modify proteins. The most reliable of the tests they developed examined proteins in blood plasma: children with ASD were found to have higher levels of the oxidation marker dityrosine (DT) and of certain sugar-modified compounds called “advanced glycation endproducts” (AGEs). Genetic causes have been found in 30–35% of cases of ASD, and the remaining 65–70% of cases are thought to be caused by a combination of environmental factors, multiple mutations, and rare genetic variants. However, the research team also believes that the new tests could reveal as-yet-unidentified causes of ASD.

    The team’s research also confirmed the previously held belief that mutations of amino acid transporters are a genetic variant associated with ASD. The Warwick team worked with collaborators at the University of Bologna, Italy, who locally recruited 38 children diagnosed with ASD (29 boys and nine girls) and a control group of 31 children (23 boys and eight girls) between the ages of five and 12. Blood and urine samples were taken from the children for analysis.

    The University of Warwick team discovered that there were chemical differences between the two groups. Working with a further collaborator at the University of Birmingham, the changes in multiple compounds were combined using artificial intelligence techniques to develop a mathematical equation, or algorithm, to distinguish between ASD and controls. The outcome was a diagnostic test better than any method currently available.
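    The combining step described above can be sketched with a toy logistic-regression classifier: synthetic "biomarker" values for two groups are fitted by gradient descent into a single score. Both the data and the choice of model are assumptions for illustration only; the article does not specify the team's actual algorithm.

    ```python
    # Toy sketch: combine two biomarker measurements into one score separating
    # two groups. Synthetic data; logistic regression is an assumed stand-in
    # for the (unspecified) algorithm described in the article.
    import random
    from math import exp

    random.seed(0)

    def sample(shift, label, n=100):
        """Draw n (marker1, marker2) pairs from a group with the given mean shift."""
        return [((random.gauss(shift, 1.0), random.gauss(shift, 1.0)), label)
                for _ in range(n)]

    # Group 1 has elevated marker levels, as the article reports for the ASD group.
    data = sample(0.0, 0) + sample(1.5, 1)

    def sigmoid(z):
        return 1.0 / (1.0 + exp(-z))

    # Plain stochastic gradient descent, no external libraries.
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(500):
        for (x1, x2), y in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y
            w[0] -= 0.1 * err * x1
            w[1] -= 0.1 * err * x2
            b -= 0.1 * err

    accuracy = sum((sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5) == (y == 1)
                   for (x1, x2), y in data) / len(data)
    print(f"training accuracy: {accuracy:.2f}")
    ```

    Because the two synthetic distributions overlap, the fitted score separates the groups well but not perfectly, mirroring why such tests report diagnostic performance rather than certainty.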

    The next steps are to repeat the study with further groups of children to confirm the good diagnostic performance, and to assess whether the test can identify ASD at very early stages and whether treatments are working.

    Authors:

    Attia Anwar, Warwick Medical School, University of Warwick
    Provvidenza Maria Abruzzo, Department of Experimental, Diagnostic and Specialty Medicine, School of Medicine, University of Bologna; Don Carlo Gnocchi Foundation ONLUS, IRCCS “S. Maria Nascente”, Milan
    Sabah Pasha, Warwick Medical School, University of Warwick
    Kashif Rajpoot, Department of Computer Science, University of Birmingham
    Alessandra Bolotta, Department of Experimental, Diagnostic and Specialty Medicine, School of Medicine, University of Bologna; Don Carlo Gnocchi Foundation ONLUS, IRCCS “S. Maria Nascente”, Milan; Warwick Systems Biology, University of Warwick, Clinical Sciences Research Laboratories, University Hospital, Coventry
    Alessandro Ghezzo, Department of Experimental, Diagnostic and Specialty Medicine, School of Medicine, University of Bologna
    Marina Marini, Department of Experimental, Diagnostic and Specialty Medicine, School of Medicine, University of Bologna; Don Carlo Gnocchi Foundation ONLUS, IRCCS “S. Maria Nascente”, Milan
    Annio Posar, Child Neurology and Psychiatry Unit, IRCCS Institute of Neurological Sciences, Bologna; Department of Biomedical and Neuromotor Sciences, University of Bologna
    Paola Visconti, Child Neurology and Psychiatry Unit, IRCCS Institute of Neurological Sciences, Bologna
    Paul J. Thornalley, Warwick Medical School, University of Warwick; Warwick Systems Biology, University of Warwick, Clinical Sciences Research Laboratories, University Hospital, Coventry
    Naila Rabbani, Warwick Medical School, University of Warwick; Warwick Systems Biology, University of Warwick, Clinical Sciences Research Laboratories, University Hospital, Coventry; Research Technology Platform–Proteomics, University of Warwick; AGEomics and Systems Biology Research Group, Warwick Systems Biology, University of Warwick, University Hospital, Coventry

    This work received the following funding: Naila Rabbani – Warwick Impact Fund; Marina Marini – Fondazione del Monte di Bologna e Ravenna, Italy; Fondazione Nando Peretti, Rome, Italy

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Warwick Campus

    We’re a world-leading university with the highest academic and research standards. But we’re not letting the story end there.

    That’s because we’re a place of possibility. We’re always looking for new ways to make things happen. Whether you’re a dedicated student, an innovative lecturer or an ambitious company, Warwick provides a tireless yet supportive environment in which you can make an impact.

    And our students, alumni and staff are consistently making an impact – the kind that changes lives, whether close to home or on a global scale.

    It’s the achievements of our people that help explain why our levels of research excellence and scholarship are recognised internationally.

    It’s a prime attraction for some of the biggest names in worldwide business and industry.

    It’s why we’re ranked highly in the lists of great UK and world universities.

    All of this contributes to a compelling story, one that’s little more than 50 years old. But who said youth should hold you back from changing the world?

     
  • richardmitnick 8:27 am on February 19, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Meteotsunami

    From COSMOS Magazine: “Prevalence and danger of little known tsunami type revealed” 

    COSMOS Magazine

    19 February 2018
    Richard A Lovett

    Credit: huffpost.

    On 4 July 2003, beachgoers at Warren Dunes State Park, in the US state of Michigan, were enjoying America’s Independence Day holiday when a fast-moving line of thunderstorms blew in from Lake Michigan. They scurried for shelter, but the event passed so quickly it didn’t appear that their holiday was ruined.

    “In 15 minutes it was gone,” says civil engineer Alvaro Linares of the University of Wisconsin, Madison.

    But when swimmers re-entered the water, rip currents appeared seemingly from nowhere, pulling eight people out into the lake, where seven drowned.

    What these people had encountered, Linares says, was a meteotsunami — an aquatic hazard of which few people, including scientists, were aware until recently.

    Few scientists have researched the phenomenon. Many of those who have gathered recently at the annual American Geophysical Union Ocean Sciences meeting, held in Portland, Oregon, US, to compare notes.

    Conventional tsunamis are caused by underwater processes such as earthquakes and submarine landslides. Meteotsunamis, as the name indicates, are caused by weather. But while the catalysts are different, the effects are not.

    “The wave characteristics are very similar,” says Eric Anderson of the Great Lakes Environmental Research Laboratory of the National Oceanic and Atmospheric Administration (NOAA) in Ann Arbor, Michigan.

    To create a meteotsunami, what’s required is a combination of a strong, fast-moving storm and relatively shallow water. The sudden increase in winds along the storm front, possibly combined with changes in air pressure, starts the process by kicking up a tsunami-style wave that runs ahead of it. But the process would quickly fizzle out if the water was too deep, because in deep water, such waves propagate very quickly and would soon outrun the storm.

    What’s needed to produce a meteotsunami is a water depth at which the storm’s speed and the wave’s speed match, allowing the wave to build as it and the storm move in tandem. “The storm puts all its energy into that wave,” Anderson says.

    Furthermore, the wave can magnify even more when it hits shallower water or shoals. “That is when these become destructive,” Anderson says.

    In 2004, for example, a storm front 300 kilometres wide sped across the East China Sea at 31 metres per second (112 kilometres per hour), says Katsutoshi Fukuzawa of the University of Tokyo.

    Water there is shallow, he adds, with depths mostly under 100 metres. This limits wave speed to about 30 metres per second — a near-perfect match to the storm’s. As a result, parts of the island of Kyushu were hit with a tsunami as big as 1.6 metres.
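    The depth-speed matching described above follows from the standard shallow-water wave speed formula, c = sqrt(g * h). The formula itself is textbook physics rather than something stated in the article, but plugging in the quoted East China Sea depth reproduces the near-match with the storm's speed:

    ```python
    # Shallow-water gravity waves travel at roughly c = sqrt(g * h).
    # Checking the East China Sea numbers quoted in the article.
    from math import sqrt

    g = 9.81            # gravitational acceleration, m/s^2
    depth = 100.0       # quoted depth, metres
    storm_speed = 31.0  # quoted storm-front speed, m/s

    wave_speed = sqrt(g * depth)
    print(f"wave speed at {depth:.0f} m depth: {wave_speed:.1f} m/s")  # 31.3 m/s
    print(f"mismatch with storm speed: {abs(wave_speed - storm_speed):.1f} m/s")
    ```

    With the wave and storm speeds within a fraction of a metre per second of each other, the storm can feed energy into the wave continuously, which is exactly the resonance condition the researchers describe.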

    Not that meteotsunamis have to be that big to be dangerous. The one at Warren Dunes was probably no more than 30 centimetres, says Linares — small enough not even to be visible in the lake’s normal chop.

    But unlike normal surf, meteotsunamis produce a sustained slosh that lasts several minutes between run-up and retreat. That means that even low-height waves carry a lot of water, creating the potential for strong rip currents when they withdraw. According to Linares’ models [Journal of Geophysical Research], these currents would have persisted for about an hour — plenty long enough to drag unwary swimmers far out into the lake, long after the storm had passed.

    It’s also possible for meteotsunamis to become “detached” from the storm front that created them, striking shores far away. Researchers reviewing records in the Great Lakes have concluded that that is what happened when such a wave hit Chicago in 1954, killing 10 people.

    “The wave came out of nowhere,” Anderson says. “It was a calm, sunny day.”

    It’s not just Japan and America’s Great Lakes that have seen such events. In May 2017, a storm raced up the English Channel, kicking up a metre-high wave that swept beaches in The Netherlands as bystanders looked on with awe, says Ap van Dongeren of the Deltares research institute in Delft, The Netherlands.

    Quirks of topography can magnify the effects of such tsunamis. On 13 June 2013, a group of spearfishermen in New Jersey were stunned when a surge of water threw them across a breakwater into the open ocean [nj.com]. A few minutes later, another surge threw them back where they’d come from. And that came from a meteotsunami that measured well under a metre on local tide gauges, says Gregory Dusek, a NOAA oceanographer at Camp Springs, Maryland.

    Meteotsunamis have occurred on all inhabited continents, including one that hit the port of Fremantle, near the Australian city of Perth, in 2014, causing a ship to break free from its moorings and crash into a railroad bridge, Sarath Wijeratne of the University of Western Australia reported in a conference abstract. In fact, Wijeratne concluded, a look back at historical water level records indicates that Western Australia may have seen more than 15 such events each year between 2008 and 2016.

    Other researchers are also finding these events to be surprisingly frequent. By studying tide gauge records back to 1996, Dusek has concluded that they occur on America’s eastern seaboard at a rate of 23 per year — though most are small enough that nobody would ever notice. In the Netherlands, Van Dongeren says that a quick check of historical tide gauge records revealed at least three such events in the past decade that had gone unnoticed because they happened at low tide. “They’re not that rare,” he says.

    Fukuzawa says that Japan saw 37 meteotsunamis exceeding one metre from 1961 to 2005.

    Furthermore, bigger ones are possible. In June 2014, Croatia was hit by a two-to-three metre tsunami sweeping in from the Adriatic Sea, says Clea Denamiel, of the Croatian Institute of Oceanography and Fisheries.

    But the mother of all meteotsunamis came in 1978, when Vela Luka, at the southern end of Croatia’s scenic Dalmatian coast, was smashed by a meteotsunami measuring a full six metres, with giant waves surging and retreating about every 17 minutes, just as might have occurred in the aftermath of a large offshore earthquake.

    As of now, scientists don’t know enough about meteotsunamis to be able to predict them, though efforts are under way to create models that can do just that. But as they dig back through old records, they are increasingly realising that meteotsunamis might have been with us for a long time.

    Or as Linares puts it with typical scientific understatement, “meteotsunamis are a beach hazard that has been overlooked”.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 3:02 pm on February 16, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Improving quantum information processing

    From ORNL: “Researchers demonstrate promising method for improving quantum information processing” 

    Oak Ridge National Laboratory

    February 16, 2018
    Scott Jones, Communications
    jonesg@ornl.gov
    865.241.6491

    Joseph Lukens, Pavel Lougovski and Nicholas Peters (from left), researchers with ORNL’s Quantum Information Science Group, are examining methods for encoding photons with quantum information that are compatible with the existing telecommunications infrastructure and that incorporate off-the-shelf components. Credit: Genevieve Martin/Oak Ridge National Laboratory, U.S. Department of Energy.

    A team of researchers led by the Department of Energy’s Oak Ridge National Laboratory has demonstrated a new method for splitting light beams into their frequency modes. The scientists can then choose the frequencies they want to work with and encode photons with quantum information. Their work could spur advancements in quantum information processing and distributed quantum computing.

    The team’s findings were published in Physical Review Letters.

    The frequency of light determines its color. When the frequencies are separated, as in a rainbow, each color of photon can be encoded with quantum information, delivered in units known as qubits. Qubits are analogous to classical bits, which take a value of either 0 or 1, but differ in that a qubit can be encoded with values of both 0 and 1 at the same time.

    The researchers liken quantum information processing to stepping into a hallway and being able to go both ways, whereas in classical computing just one path is possible.

    The team’s novel approach—featuring the first demonstration of a frequency tritter, an instrument that splits light into three frequencies—returned experimental results that matched their predictions and showed that many quantum information processing operations can be run simultaneously without increasing error. The quantum system performed as expected under increasingly complex conditions without degrading the encoded information.

    “Under our experimental conditions, we got a factor 10 better than typical error rates,” said Nicholas Peters, Quantum Communications team lead for ORNL’s Quantum Information Science Group. “This establishes our method as a frontrunner for high-dimensional frequency-based quantum information processing.”

    Photons can carry quantum information in superpositions—where photons simultaneously have multiple bit values—and the presence of two quantum systems in superposition can lead to entanglement, a key resource in quantum computing.

    Entanglement boosts the number of calculations a quantum computer could run, and the team’s focus on creating more complex frequency states aims to make quantum simulations more powerful and efficient. The researchers’ method is also notable because it demonstrates the Hadamard gate, one of the elemental circuits required for universal quantum computing.
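    The Hadamard gate mentioned above is the operation that turns a definite bit value into an equal superposition, the "going both ways down the hallway" of the earlier analogy. A minimal state-vector sketch shows its action on a single qubit; this is ordinary matrix arithmetic, not the team's frequency-bin implementation (their tritter generalizes the same idea to three frequency modes).

    ```python
    # The 2x2 Hadamard gate: a definite bit value becomes an equal superposition.
    from math import sqrt

    H = [[1 / sqrt(2),  1 / sqrt(2)],
         [1 / sqrt(2), -1 / sqrt(2)]]

    def apply(gate, state):
        """Multiply a gate matrix by a state vector."""
        return [sum(gate[i][j] * state[j] for j in range(len(state)))
                for i in range(len(gate))]

    qubit = [1.0, 0.0]            # the definite state |0>
    superposed = apply(H, qubit)  # (|0> + |1>) / sqrt(2)
    probs = [a * a for a in superposed]
    print(superposed)  # both amplitudes ~0.707
    print(probs)       # measuring gives 0 or 1 with ~equal probability
    ```

    Applying the gate twice returns the qubit to its starting state, one reason the Hadamard is among the elemental circuits required for universal quantum computing.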

    “We were able to demonstrate extremely high-fidelity results right off the bat, which is very impressive for the optics approach,” said Pavel Lougovski, the project’s principal investigator. “We are carving out a subfield here at ORNL with our frequency-based encoding work.”

    The method leverages widely available telecommunications technology with off-the-shelf components while yielding high-fidelity results. Efforts to develop quantum repeaters, which extend the distance quantum information can be transmitted between physically separated computers, will benefit from this work.

    “The fact that our method is telecom network-compatible is a big advantage,” Lougovski said. “We could perform quantum operations on telecom networks if needed.”

    Peters added that their project demonstrates that unused fiber-optic bandwidth could be harnessed to reduce computational time by running operations in parallel.

    “Our work uses frequency’s main advantage—stability—to get very high fidelity and then do controlled frequency jumping when we want it,” said Wigner Fellow Joseph Lukens, who led the ORNL experiment. The researchers have experimentally shown that quantum systems can be transformed to yield desired outputs.

    The researchers suggest their method could be paired with existing beam-splitting technology, taking advantage of the strengths of both and bringing the scientific community closer to full use of frequency-based photonic quantum information processing.

    Peters, Lougovski and Lukens, all physicists with ORNL’s Quantum Information Science Group, collaborated with graduate student Hsuan-Hao Lu, professor Andrew Weiner, and colleagues at Purdue University. The team published the theory for their experiments in Optica in January 2017.

    This research is supported by ORNL’s Laboratory Directed Research and Development program and the National Science Foundation.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 2:17 pm on February 16, 2018 Permalink | Reply
    Tags: Applied Research & Technology, PRIMA

    From MIT: “Integrated simulations answer 20-year-old question in fusion research” 

    MIT News


    February 16, 2018
    Leda Zimmerman

    To make fusion energy a reality, scientists must harness fusion plasma, a fiery gaseous maelstrom in which radioactive particles react to generate heat for electricity. But the turbulence of fusion plasma can confront researchers with unruly behaviors that confound attempts to make predictions and develop models. In experiments over the past two decades, an especially vexing problem has emerged: In response to deliberate cooling at its edges, fusion plasma inexplicably undergoes abrupt increases in central temperature.

    These counterintuitive temperature spikes, which fly against the physics of heat transport models, have not found an explanation — until now.

    A team led by Anne White, the Cecil and Ida Green Associate Professor in the Department of Nuclear Science and Engineering, and Pablo Rodriguez Fernandez, a graduate student in the department, has conducted studies that offer a new take on the complex physics of plasma heat transport and point toward more robust models of fusion plasma behavior. The results of their work appear this week in the journal Physical Review Letters. Rodriguez Fernandez is first author on the paper.

    In experiments using MIT’s Alcator C-Mod tokamak (a torus-shaped device that deploys a magnetic field to contain the star-furnace heat of plasma), the White team focused on the problem of turbulence and its impact on heating and cooling.

    Alcator C-Mod tokamak at MIT, no longer in operation

    In tokamaks, heat transport is typically dominated by turbulent movement of plasma, driven by gradients in plasma pressure.

    Hot and cold

    Scientists have a good grasp of turbulent transport of heat when the plasma is held at steady-state conditions. But when the plasma is intentionally perturbed, standard models of heat transport simply cannot capture plasma’s dynamic response.

    In one such case, the cold-pulse experiment, researchers perturb the plasma near its edge by injecting an impurity, which results in a rapid cooling of the edge.

    “Now, if I told you we cooled the edge of hot plasma, and I asked you what will happen at the center of the plasma, you would probably say that the center should cool down too,” says White. “But when scientists first did this experiment 20 years ago, they saw that edge cooling led to core heating in low-density plasmas, with the temperature in the core rising, and much faster than any standard transport model would predict.” Further mystifying researchers was the fact that at higher densities, the plasma core would cool down.

    Replicated many times, these cold-pulse experiments with their unlikely results defy what is called the standard local model for the turbulent transport of heat and particles in fusion devices. They also represent a major barrier to predictive modeling in high-performance fusion experiments such as ITER, the international nuclear fusion project, and MIT’s own proposed smaller-scale fusion reactor, ARC.

    MIT ARC Fusion Reactor

    ITER Tokamak in Saint-Paul-lès-Durance, which is in southern France

    To achieve a new perspective on heat transport during cold-pulse experiments, White’s team developed a unique twist.

    “We knew that the plasma rotation, that is, how fast the plasma was spinning in the toroidal direction, would change during these cold-pulse experiments, which complicates the analysis quite a bit,” White notes. “This is because the coupling between momentum transport and heat transport in fusion plasmas is still not fully understood,” she explains. “We needed to unambiguously isolate one effect from the other.”

    As a first step, the team developed a new experiment that conclusively demonstrated how the cold-pulse phenomena associated with heat transport would occur irrespective of the plasma rotation state. With Rodriguez Fernandez as first author, White’s group reported this key result in the journal Nuclear Fusion in 2017.

    A new integrated simulation

    From there, a tour de force of modeling was needed to recreate the cold-pulse dynamics seen in the experiments. To tackle the problem, Rodriguez Fernandez built a new framework, called PRIMA, which allowed him to introduce cold-pulses in time-dependent simulations. Using special software that factored in the turbulence, radiation and heat transport physics inside a tokamak, PRIMA could model cold-pulse phenomena consistent with experimental measurements.

    “I spent a long time simulating the propagation of cold pulses by only using an increase in radiated power, which is the most intuitive effect of a cold-pulse injection,” Rodriguez Fernandez says.

    Because experimental data showed that the electron density increased with every cold pulse injection, Rodriguez Fernandez implemented an analogous effect in his simulations. He observed a very good match in amplitude and time-scales of the core temperature behavior. “That was an ‘aha!’ moment,” he recalls.

    Using PRIMA, Rodriguez Fernandez discovered that a competition between types of turbulent modes in the plasma could explain the cold-pulse experiments. These different modes, explains White, compete to become the dominant cause of the heat transport. “Whichever one wins will determine the temperature profile response, and determine whether the center heats up or cools down after the edge cooling,” she says.

    By determining the factors behind the center-heating phenomenon (the so-called nonlocal response) in cold-pulse experiments, White’s team has removed a central concern about limitations in the standard, predictive (local) model of plasma behavior. This means, says White, that “we are more confident that the local model can be used to predict plasma behavior in future high performance fusion plasma experiments — and eventually, in reactors.”

    “This work is of great significance for validating fundamental assumptions underpinning the standard model of core tokamak turbulence,” says Jonathan Citrin, Integrated Modelling and Transport Group leader at the Dutch Institute for Fundamental Energy Research (DIFFER), who was not involved in the research. “The work also validated the use of reduced models, which can be run without the need for supercomputers, allowing to predict plasma evolution over longer timescales compared to full-physics simulations,” says Citrin. “This was key to deciphering the challenging experimental observations discussed in the paper.”

    The work isn’t over for the team. As part of a separate collaboration between MIT and General Atomics, Plasma Science and Fusion Center scientists are installing a new laser ablation system to facilitate cold-pulse experiments at the DIII-D tokamak in San Diego, California, with first data expected soon. Rodriguez Fernandez has used the integrated simulation tool PRIMA to predict the cold-pulse behavior at DIII-D, and he will perform an experimental test of the predictions later this year to complete his PhD research.

    The research team included Brian Grierson and Xingqiu Yuan, research scientists at Princeton Plasma Physics Laboratory; Gary Staebler, research scientist at General Atomics; Martin Greenwald, Nathan Howard, Amanda Hubbard, Jerry Hughes, Jim Irby and John Rice, research scientists from the MIT Plasma Science and Fusion Center; and MIT grad students Norman Cao, Alex Creely, and Francesco Sciortino. The work was supported by the US DOE Fusion Energy Sciences.

    See the full article here.


    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 11:19 am on February 16, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Sea water filtration

    From CSIRO via Science Alert: “This New Graphene Invention Makes Filthy Seawater Drinkable in One Simple Step” 


    Commonwealth Scientific and Industrial Research Organisation

    Science Alert

    16 FEB 2018
    MICHELLE STARR


    2.1 billion people still don’t have safe drinking water.

    Using a type of graphene called Graphair, scientists from Australia have created a water filter that can make highly polluted seawater drinkable after just one pass.

    The technology could be used to cheaply provide safe drinking water to regions of the world without access to it.

    “Almost a third of the world’s population, some 2.1 billion people, don’t have clean and safe drinking water,” said lead author Dong Han Seo.

    “As a result, millions – mostly children – die from diseases associated with inadequate water supply, sanitation and hygiene every year. In Graphair we’ve found a perfect filter for water purification.

    “It can replace the complex, time consuming and multi-stage processes currently needed with a single step.”

    Developed by researchers at the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Graphair is a form of graphene made out of soybean oil.

    Graphene – a one-atom-thick, ultrastrong carbon material – might be touted as a supermaterial, but it’s been relatively expensive to produce, which has been limiting its use in broader applications.

    Graphair is cheaper and simpler to produce than more traditional graphene manufacturing methods, while retaining the properties of graphene.

    One of those properties is hydrophobicity – graphene repels water.

    To turn it into a filter, the researchers developed a graphene film with microscopic nanochannels; these allow water through but block contaminants with larger molecules.
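The size-exclusion principle behind the nanochannels can be sketched in a few lines. All of the sizes below are illustrative placeholders, not measured values from the CSIRO study, and the channel width is a hypothetical figure chosen only for the demonstration:

```python
# Toy size-exclusion sketch: channels pass anything smaller than their
# effective width and reject anything larger.
CHANNEL_WIDTH_NM = 1.0  # hypothetical effective channel width

molecule_sizes_nm = {
    "water": 0.28,        # approximate kinetic diameter
    "organic dye": 2.0,   # illustrative contaminant sizes
    "oil droplet": 50.0,
    "bacterium": 1000.0,
}

passed = sorted(m for m, s in molecule_sizes_nm.items() if s < CHANNEL_WIDTH_NM)
rejected = sorted(m for m, s in molecule_sizes_nm.items() if s >= CHANNEL_WIDTH_NM)

print(passed)    # ['water']
print(rejected)  # ['bacterium', 'oil droplet', 'organic dye']
```

Real membrane behavior also depends on pressure, channel chemistry, and fouling, which this sketch deliberately ignores.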

    Then the team overlaid their new film on a typical, commercial-grade water filtration membrane to do some tests.

    When used by itself, a water filtration membrane becomes coated with contaminants, which block the pores that let water through. In tests with highly polluted Sydney Harbour water, the researchers found that the filtration rate of a conventional membrane without the graphene film dropped by half.

    Then the Graphair was added to the filter. The team found that the combination filter screened out more contaminants – 99 percent of them – faster than the conventional filter. And it continued to work even when coated with pollutants, the researchers said.

    This eliminates a step from other filtration methods – removing the contaminants from the water before passing it through the membrane to prevent them from coating it.

    This echoes a result from last year, in which minuscule pores in a graphene filter blocked the salt in seawater while letting water through faster.

    “This technology can create clean drinking water, regardless of how dirty it is, in a single step,” Seo said.

    “All that’s needed is heat, our graphene, a membrane filter, and a small water pump. We’re hoping to commence field trials in a developing world community next year.”

    Eventually, they believe that the technology could be used for household and even town-based water filtration, as well as seawater and industrial wastewater treatment.

    See the full article here.


    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 10:13 am on February 16, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Atmosphere science, NASA PACE

    From Eos: “A Novel Approach to a Satellite Mission’s Science Team” 


    Eos

    12 February 2018
    Emmanuel Boss
    Lorraine A. Remer

    NASA Plankton, Aerosol, Cloud, Ocean Ecosystem (PACE) mission satellite.

    The NASA Plankton, Aerosol, Cloud, Ocean Ecosystem (PACE) mission, with a target launch within the next 5 years, aims to make measurements that will advance ocean and atmospheric science and facilitate interdisciplinary studies involving the interaction of the atmosphere with ocean biological systems. Unique to this Earth science satellite project was the formation of a science team charged with a dual role: performing principal investigator (PI)-led peer-reviewed science relevant to specific aspects of PACE, as well as supporting the mission’s overall formulation as a unified team.

    This science team is serving a limited term of 3 years, and recompetition for membership is expected later this year. Overall, the cooperative, consensus-building approach of the first PACE Science Team has been a constructive and scientifically productive contribution for the new satellite mission. This approach can serve as a model for all future satellite missions.

    PACE

    The PACE satellite, as envisioned, would carry multiple sensors into space as early as 2022. These instruments include a radiometer that will span the ultraviolet to the near infrared (NIR) with high spectral resolution (<5 nanometers). This radiometer will also scan individual bands from the NIR to the shortwave infrared. In addition, the instrument suite would include two different CubeSat polarimeters. These devices are radiometers that separate different polarization states of light over several viewing angles and spectral bands.

    Measurements from these sensors would be used to derive properties of atmospheric aerosols, clouds, and oceanic constituents. Derived products could lead to better understanding of the processes involved in determining sources, distributions, sinks, and interactions of these variables with critical applications including Earth’s radiative balance, ocean carbon uptake, sustainable fisheries, and more.

    The PACE Science Team

    To help map out the scope of the PACE mission, NASA first established a science definition team that provided a report on the desired characteristics of PACE in 2012. Following that report and just before the decision to fund PACE was made, in 2014 NASA published a call for proposals for participants in the first PACE Science Team.

    The scientists funded under this call and selected for the science team were partitioned into two subject areas: One focused on atmospheric correction and atmospheric products, and the other addressed the retrieval of inherent optical properties of the ocean. The team was enhanced with NASA personnel with specific portfolios in two areas: data processing and applications for societal relevance.

    NASA’s solicitation specified “the ultimate goal for each of the two measurement suite teams is to achieve consensus and develop community-endorsed paths forward for the PACE sensor(s) for the full spectrum of components within the measurement suite. The goal is to replace individual ST [science team] member recommendations for measurement, algorithm, and retrieval approaches (historically based on the individual expertise and interests of ST members) with consensus recommendations toward common goals.”

    This new framework differed from past NASA science teams in that PIs not only proposed their own science objectives and coordinated their own research but were also expected to contribute to common goals as well.

    Science Team Activities

    Soon after forming, the science team identified several issues or subject areas of common concern and formed subgroups to address these individual concerns. These areas included construction of novel data sets for algorithm development (both in situ and synthetic data sets), cross comparison and benchmarking of coupled ocean-atmosphere radiative transfer codes, and cross comparison of instruments in the field to assess and constrain uncertainties in the measurements of oceanic particle absorption.

    The science team was also asked by NASA to assess the designs of the PACE radiometer and polarimeter and to determine the value of adding a high spatial resolution coastal camera. An ad hoc subgroup was formed to produce a stand-alone report on the advantages and requirements for polarimetry for atmospheric correction, aerosol characterization, and oceanic retrievals. The team contributed to both the design and content of the PACE website.

    The PACE science team also developed an alternative style for their last two annual meetings that emphasized discussion and interaction. To improve the efficiency of the PACE science team’s workshops, a “flipped meeting” format was adopted in which team members prerecorded their individual presentations in advance and posted these recordings to an internal site. Science team members were able to view and listen to the recordings at their leisure and arrived at the meeting itself readied with questions and discussion points for the presenters. This meeting strategy was successful and led to invigorating two-way discussions.

    Enhanced Collaborations

    The PACE science team is in the last phase of the 3-year term. Several consensus reports are being finalized to provide NASA with input and recommendations about the most likely paths forward for PACE atmospheric correction, atmospheric products, and oceanic optical properties [e.g., Werdell et al., 2018].

    PACE has set itself up to be a model for interdisciplinary collaboration. Early fruits of this can be seen in the multiple collaborations that have sprouted up between ocean and atmospheric scientists, whose vocabulary and culture were initially vastly different. Collaborative products range from published papers that build realistic radiative transfer models from within the ocean to the top of the atmosphere to the assembly of novel databases that contain ocean and atmospheric measurements useful to develop novel algorithms.

    We hope these collaborations will result in increased cooperation in PACE’s future and on future missions. In particular, we’re hopeful that collaborations will lead to enhanced study of processes at the air-sea interface, a complex domain that is relatively unknown, where a holistic and interdisciplinary approach will lead to better understanding of the functioning of our planet.

    PACE’s future is currently uncertain (it is in Congress’s continuing resolutions but was one of the missions the current administration did not support). Although we hope that the mission keeps its funding, we note that the cooperative, consensus-building approach of the first PACE science team was a constructive and scientifically productive contribution to the path forward for a new satellite mission. We expect that this framework for supporting mission activities will be adopted in future NASA missions to maximize their utility across disciplines.

    Science paper:
    An overview of approaches and challenges for retrieving marine inherent optical properties from ocean color remote sensing, Progress in Oceanography

    See the full article here.


    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 9:39 am on February 16, 2018 Permalink | Reply
    Tags: Applied Research & Technology

    From BNL: “Bringing a Hidden Superconducting State to Light” 

    Brookhaven Lab

    February 16, 2018
    Ariana Tantillo,
    atantillo@bnl.gov
    (631) 344-2347

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    High-power light reveals the existence of superconductivity associated with charge “stripes” in the copper-oxygen planes of a layered material above the temperature at which it begins to transmit electricity without resistance.

    Physicist Genda Gu holds a single-crystal rod of LBCO—a compound made of lanthanum, barium, copper, and oxygen—in Brookhaven’s state-of-the-art crystal growth lab. The infrared image furnace he used to synthesize these high-quality crystals is pictured in the background.

    A team of scientists has detected a hidden state of electronic order in a layered material containing lanthanum, barium, copper, and oxygen (LBCO). When cooled to a certain temperature and with certain concentrations of barium, LBCO is known to conduct electricity without resistance, but now there is evidence that a superconducting state actually occurs above this temperature too. It was just a matter of using the right tool—in this case, high-intensity pulses of infrared light—to be able to see it.

    Reported in a paper published in the Feb. 2 issue of Science, the team’s finding provides further insight into the decades-long mystery of superconductivity in LBCO and similar compounds containing copper and oxygen layers sandwiched between other elements. These “cuprates” become superconducting at relatively higher temperatures than traditional superconductors, which must be frozen to near absolute zero (minus 459 degrees Fahrenheit) before their electrons can flow through them at 100-percent efficiency. Understanding why cuprates behave the way they do could help scientists design better high-temperature superconductors, eliminating the cost of expensive cooling systems and improving the efficiency of power generation, transmission, and distribution. Imagine computers that never heat up and power grids that never lose energy.

    “The ultimate goal is to achieve superconductivity at room temperature,” said John Tranquada, a physicist and leader of the Neutron Scatter Group in the Condensed Matter Physics and Materials Science Department at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, where he has been studying cuprates since the 1980s. “If we want to do that by design, we have to figure out which features are essential for superconductivity. Teasing out those features in such complicated materials as the cuprates is no easy task.”

    The copper-oxygen planes of LBCO contain “stripes” of electrical charge separated by a type of magnetism in which the electron spins alternate in opposite directions. In order for LBCO to become superconducting, the individual electrons in these stripes need to be able to pair up and move in unison throughout the material.

    Previous experiments showed that, above the temperature at which LBCO becomes superconducting, resistance occurs when the electrical transport is perpendicular to the planes but is zero when the transport is parallel. Theorists proposed that this phenomenon might be the consequence of an unusual spatial modulation of the superconductivity, with the amplitude of the superconducting state oscillating from positive to negative on moving from one charge stripe to the next. The stripe pattern rotates by 90 degrees from layer to layer, and they thought that this relative orientation was blocking the superconducting electron pairs from moving coherently between the layers.

    “This idea is similar to passing light through a pair of optical polarizers, such as the lenses of certain sunglasses,” said Tranquada. “When the polarizers have the same orientation, they pass light, but when their relative orientation is rotated to 90 degrees, they block all light.”
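Tranquada's sunglasses analogy can be made quantitative with textbook Jones calculus, where each polarizer is a 2×2 projection matrix. This is a sketch of the optical analogy only, not a model of the superconducting layers themselves:

```python
import numpy as np

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer at angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s],
                     [c * s, s * s]])

light = np.array([1.0, 0.0])  # horizontally polarized input beam

# Parallel polarizers transmit the light; crossed (90-degree) polarizers
# block it, just as 90-degree-rotated stripe layers were thought to block
# coherent superconducting transport between planes.
parallel = polarizer(0) @ polarizer(0) @ light
crossed = polarizer(np.pi / 2) @ polarizer(0) @ light

print(np.sum(np.abs(parallel) ** 2))  # 1.0 -> fully transmitted
print(np.sum(np.abs(crossed) ** 2))   # ~0  -> fully blocked
```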

    However, a direct experimental test of this picture had been lacking—until now.

    One of the challenges is synthesizing the large, high-quality single crystals of LBCO needed to conduct experiments. “It takes two months to grow one crystal, and the process requires precise control over temperature, atmosphere, chemical composition, and other conditions,” said co-author Genda Gu, a physicist in Tranquada’s group. Gu used an infrared image furnace—a machine with two bright lamps that focus infrared light onto a cylindrical rod containing the starting material, heating it to nearly 2500 degrees Fahrenheit and causing it to melt—in his crystal growth lab to grow the LBCO crystals.

    Collaborators at the Max Planck Institute for the Structure and Dynamics of Matter and the University of Oxford then directed infrared light, generated from high-intensity laser pulses, at the crystals (with the light polarization in a direction perpendicular to the planes) and measured the intensity of light reflected back from the sample. Besides the usual response—the crystals reflected the same frequency of light that was sent in—the scientists detected a signal three times higher than the frequency of that incident light.

    “For samples with three-dimensional superconductivity, the superconducting signature can be seen at both the fundamental frequency and at the third harmonic,” said Tranquada. “For a sample in which charge stripes block the superconducting current between layers, there is no optical signature at the fundamental frequency. However, by driving the system out of equilibrium with the intense infrared light, the scientists induced a net coupling between the layers, and the superconducting signature shows up in the third harmonic. We had suspected that the electron pairing was present—it just required a stronger tool to bring this superconductivity to light.”

    University of Hamburg theorists supported this experimental observation with analysis and numerical simulations of the reflectivity.

    This research provides a new technique to probe different types of electronic orders in high-temperature superconductors, and the new understanding may be helpful in explaining other strange behaviors in the cuprates.

    The work performed at Brookhaven was supported by DOE’s Office of Science.

    See the full article here.

    BNL Campus


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 9:17 am on February 16, 2018 Permalink | Reply
    Tags: Applied Research & Technology

    From ESA: “Swarm details energetic coupling” 


    European Space Agency

    15 February 2018

    ESA/Swarm

    The Sun bathes our planet in the light and heat it needs to sustain life, but it also bombards us with dangerous charged particles in the solar wind. Our magnetic field largely shields us from this onslaught, but, like many a relationship, it’s somewhat complicated. Thanks to ESA’s Swarm mission, the nature of this Earth–Sun coupling has been revealed in more detail than ever before.

    Earth’s magnetic field is like a huge bubble, protecting us from cosmic radiation and charged particles carried by powerful winds that escape the Sun’s gravitational pull and sweep across the Solar System.

    Magnetosphere of Earth, original bitmap from NASA. SVG rendering by Aaron Kaase

    The trio of Swarm satellites were launched in 2013 to improve our understanding of how the field is generated and how it protects us from this barrage of charged particles.

    Since our magnetic field is generated mainly by an ocean of liquid iron that makes up the planet’s outer core, it resembles a bar magnet with field lines emerging from near the poles.
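The bar-magnet picture corresponds to the standard dipole approximation of Earth's field, which can be sketched numerically. The equatorial surface strength used below is an approximate textbook value, not a Swarm measurement:

```python
import numpy as np

# Dipole field components in spherical coordinates (theta = colatitude):
#   B_r     = 2 * B0 * (R/r)^3 * cos(theta)
#   B_theta =     B0 * (R/r)^3 * sin(theta)
B0 = 30e-6  # tesla; approximate equatorial surface field strength

def dipole_field(r_over_R, theta):
    """Magnitude of the dipole field at radius r (in Earth radii R)."""
    br = 2 * B0 * r_over_R ** -3 * np.cos(theta)
    btheta = B0 * r_over_R ** -3 * np.sin(theta)
    return np.hypot(br, btheta)

# The field at the poles is twice as strong as at the equator,
# which is why field lines bunch together near the poles.
print(dipole_field(1.0, 0.0))        # 6e-05 T at the pole
print(dipole_field(1.0, np.pi / 2))  # 3e-05 T at the equator
```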

    The plasma surrounding Earth is highly conductive, and its charged particles flow along these field lines, giving rise to field-aligned currents.

    Carrying up to 1 TW of electrical power – about six times the amount of energy produced every year by wind turbines in Europe – these currents are the dominant form of energy transfer between the magnetosphere and ionosphere.

    The shimmering green and purple light displays of the auroras in the skies above the polar regions are a visible manifestation of energy and particles travelling along magnetic field lines.

    Aurora borealis
    Released 21/04/2017
    Copyright Sherwin Calaluan
    The aurora borealis is a visible display of electrically charged atomic particles from the Sun interacting with Earth’s magnetic field.

    The theory about the exchange and momentum between solar wind and our magnetic field actually goes back more than 100 years, and more recently the Active Magnetosphere and Planetary Electrodynamics Response Experiment satellite network has allowed scientists to study large-scale field-aligned currents.

    However, the Swarm mission is leading to an exciting new wave of discoveries. A new paper [Journal of Geophysical Research] explores the dynamics of this energetic coupling across different spatial scales – and finds that it’s all in the detail.

    Ryan McGranaghan from NASA’s Jet Propulsion Laboratory said, “We have a good understanding of how these currents exchange energy between the ionosphere and the magnetosphere at large scales so we assumed that smaller-scale currents behaved in the same way, but carried proportionally less energy.”

    “Swarm has allowed us to effectively zoom in on these smaller currents and we see that, under certain conditions, this is not the case.

    ______________________________________________________________________________________________
    Solar corona viewed by Proba-2
    Released 16/03/2015
    Copyright ESA/ROB
    This snapshot of our constantly changing Sun catches looping filaments and energetic eruptions on their outward journey from our star’s turbulent surface.

    The disc of our star is a rippling mass of bright, hot active areas, interspersed with dark, cool snaking filaments that wrap around the star. Surrounding the tumultuous solar surface is the chaotic corona, a rarified atmosphere of super-heated plasma that blankets the Sun and extends out into space for millions of kilometres.

    This coronal plasma reaches temperatures of several million degrees in some regions – significantly hotter than the surface of the Sun, which reaches comparatively paltry temperatures of around 6000ºC – and glows in ultraviolet and extreme-ultraviolet light owing to its extremely high temperature. By picking one particular wavelength, ESA’s Proba-2 SWAP (Sun Watcher with APS detector and Image Processing) camera is able to single out structures with temperatures of around a million degrees.

    ESA Proba 2

    As seen in the above image, taken on 25 July 2014, the hot plasma forms large loops and fan-shaped structures, both of which are kept in check by the Sun’s intense magnetic field. While some of these loops stay close to the surface of the Sun, some can stretch far out into space, eventually being swept up into the solar wind – an outpouring of energetic particles that constantly streams out into the Solar System and flows past the planets, including Earth.

    Even loops that initially appear to be quite docile can become tightly wrapped and tangled over time, storing energy until they eventually snap and throw off intense flares and eruptions known as coronal mass ejections. These eruptions, made up of massive amounts of gas embedded in magnetic field lines, can be dangerous to satellites, interfere with communication equipment and damage vital infrastructure on Earth.

    Despite the Sun being the most important star in our sky, much is still unknown about its behaviour. Studying its corona in detail could help us to understand the internal workings of the Sun, the erratic motions of its outer layers, and the highly energetic bursts of material that it throws off into space.

    Two new ESA missions will soon contribute to this field of study: Solar Orbiter is designed to study the solar wind and the region of space dominated by the Sun, and to observe the star’s polar regions closely, while the Proba-3 mission will study the Sun’s faint corona closer to the solar rim than ever before.

    NASA/ESA Solar Orbiter

    ESA Proba 3

    ______________________________________________________________________________________________

    “Our findings show that these smaller currents carry significant energy and that their relationship with the larger currents is very complex. Moreover, large and small currents affect the magnetosphere–ionosphere system differently.”

    Colin Forsyth from University College London noted, “Since electric currents around Earth can interfere with navigation and telecommunication systems, this is an important discovery.

    “It also gives us a greater understanding of how the Sun and Earth are linked and how this coupling can ultimately add energy to our atmosphere.

    “This new knowledge can be used to improve models so that we can better understand, and therefore, ultimately, prepare for the potential consequences of solar storms.”

    ESA’s Swarm mission manager, Rune Floberghagen, added, “Since the beginning of the mission we have carried out projects to address the energy exchange between the magnetosphere, ionosphere and the thermosphere.

    “But what we are witnessing now is nothing short of a complete overhaul of the understanding of how Earth responds to and interacts with output from the Sun.

    “In fact, this scientific investigation is becoming a fundamental pillar for the extended Swarm mission, precisely because it is breaking new ground and at the same time has strong societal relevance. We now wish to explore this potential of Swarm to the fullest.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight programme includes human spaceflight (mainly through participation in the International Space Station programme); the launch and operation of unmanned exploration missions to other planets and the Moon; Earth observation, science and telecommunication; maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana; and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, the Netherlands; Earth observation missions at ESRIN in Frascati, Italy; ESA Mission Control (ESOC) is in Darmstadt, Germany; the European Astronaut Centre (EAC), which trains astronauts for future missions, is situated in Cologne, Germany; and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA50 Logo large

     