Tagged: Applied Research & Technology

  • richardmitnick 2:16 pm on November 17, 2017 Permalink | Reply
    Tags: A new window into electron behavior, Applied Research & Technology, ARPES cannot be used to visualize electron behavior in insulators — materials within which electric current does not flow freely, ARPES also does not work in a magnetic field which can greatly alter electronic properties, Band structure determines a material’s electrical and optical properties, MERTS-The team’s technique is called momentum and energy resolved tunneling spectroscopy and is based on quantum mechanical tunneling, Scientists may be able to identify better faster semiconductor materials

    From MIT: “A new window into electron behavior” 

    MIT News

    November 16, 2017
    Jennifer Chu

    Scientists at MIT have found a way to visualize electron behavior beneath a material’s surface. The team’s technique is based on quantum mechanical tunneling, a process by which electrons can traverse energetic barriers by simply appearing on the other side. In this image, researchers show the measured tunneling spectra at various densities, with high measurements in red. Courtesy of the researchers.

    The team set up a two-dimensional electron system known as a quantum well. The system consists of two layers of gallium arsenide, separated by a thin barrier made from another material, aluminum gallium arsenide. The researchers then applied electrical pulses to eject electrons from the first layer of gallium arsenide and into the second layer. They reasoned that those electrons that were able to tunnel through to the second layer of gallium arsenide did so because their momenta and energies coincided with those of electronic states in that layer. Courtesy of the researchers.

    The researchers also found that, under certain magnetic field strengths, the ordinary parabola resembled two stacked donuts. They realized that the abnormal distribution was a result of electrons interacting with vibrating ions within the material. Courtesy of the researchers.

    For the first time, physicists have developed a technique that can peer deep beneath the surface of a material to identify the energies and momenta of electrons there.

    The energy and momentum of these electrons, known as a material’s “band structure,” are key properties that describe how electrons move through a material. Ultimately, the band structure determines a material’s electrical and optical properties.

    The team, at MIT and Princeton University, has used the technique to probe a semiconducting sheet of gallium arsenide, and has mapped out the energy and momentum of electrons throughout the material. The results are published today in the journal Science.

    By visualizing the band structure, not just at the surface but throughout a material, scientists may be able to identify better, faster semiconductor materials. They may also be able to observe the strange electron interactions that can give rise to superconductivity within certain exotic materials.

    “Electrons are constantly zipping around in a material, and they have a certain momentum and energy,” says Raymond Ashoori, professor of physics at MIT and a co-author on the paper. “These are fundamental properties which can tell us what kind of electrical devices we can make. A lot of the important electronics in the world exist under the surface, in these systems that we haven’t been able to probe deeply until now. So we’re very excited — the possibilities here are pretty vast.”

    Ashoori’s co-authors are postdoc Joonho Jang and graduate student Heun Mo Yoo, along with Loren Pfeiffer, Ken West, and Kirk Baldwin, of Princeton University.

    Pictures beneath the surface

    To date, scientists have only been able to measure the energy and momentum of electrons at a material’s surface. To do so, they have used angle-resolved photoemission spectroscopy, or ARPES, a standard technique that employs light to excite electrons and make them jump out from a material’s surface. The ejected electrons are captured, and their energy and momentum are measured in a detector. Scientists can then use these measurements to calculate the energy and momentum of electrons within the rest of the material.

    “[ARPES] is wonderful and has worked great for surfaces,” Ashoori says. “The problem is, there is no direct way of seeing these band structures within materials.”

    In addition, ARPES cannot be used to visualize electron behavior in insulators — materials within which electric current does not flow freely. ARPES also does not work in a magnetic field, which can greatly alter electronic properties inside a material.

    The technique developed by Ashoori’s team takes up where ARPES leaves off and enables scientists to observe electron energies and momenta beneath the surfaces of materials, including in insulators and under a magnetic field.

    “These electronic systems by their nature exist underneath the surface, and we really want to understand them,” Ashoori says. “Now we are able to get these pictures which have never been created before.”

    Tunneling through

    The team’s technique is called momentum and energy resolved tunneling spectroscopy, or MERTS, and is based on quantum mechanical tunneling, a process by which electrons can traverse energetic barriers by simply appearing on the other side — a phenomenon that never occurs in the macroscopic, classical world which we inhabit. However, at the quantum scale of individual atoms and electrons, bizarre effects such as tunneling can occasionally take place.

    “It would be like you’re on a bike in a valley, and if you can’t pedal, you’d just roll back and forth. You would never get over the hill to the next valley,” Ashoori says. “But with quantum mechanics, maybe once out of every few thousand or million times, you would just appear on the other side. That doesn’t happen classically.”
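
    For a feel for the numbers, the textbook estimate for tunneling through a rectangular barrier is T ≈ exp(−2κd) with κ = √(2m(V−E))/ħ. The short sketch below evaluates it for an electron; the barrier height and width are illustrative values of my own choosing, loosely inspired by a thin AlGaAs layer, not figures from the paper.

    ```python
    # Textbook rectangular-barrier tunneling estimate, T ~ exp(-2*kappa*d).
    # Barrier parameters below are illustrative assumptions, not from the paper.
    import math

    HBAR = 1.054571817e-34  # J*s
    M_E = 9.1093837015e-31  # electron mass, kg
    EV = 1.602176634e-19    # joules per electronvolt

    def transmission(barrier_ev, energy_ev, width_m):
        """WKB-style tunneling probability for a rectangular barrier."""
        dE = (barrier_ev - energy_ev) * EV
        if dE <= 0:
            return 1.0  # classically allowed; no tunneling needed
        kappa = math.sqrt(2 * M_E * dE) / HBAR  # decay constant in the barrier
        return math.exp(-2 * kappa * width_m)

    # An electron 0.2 eV below a ~2 nm barrier still gets through roughly
    # once in ten thousand attempts -- rare, but nothing like "never".
    print(f"T = {transmission(0.3, 0.1, 2e-9):.1e}")
    ```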

    Ashoori and his colleagues employed tunneling to probe a two-dimensional sheet of gallium arsenide. Instead of shining light to release electrons out of a material, as scientists do with ARPES, the team decided to use tunneling to send electrons in.

    The team set up a two-dimensional electron system known as a quantum well. The system consists of two layers of gallium arsenide, separated by a thin barrier made from another material, aluminum gallium arsenide. Ordinarily in such a system, electrons in gallium arsenide are repelled by aluminum gallium arsenide, and would not go through the barrier layer.

    “However, in quantum mechanics, every once in a while, an electron just pops through,” Jang says.

    The researchers applied electrical pulses to eject electrons from the first layer of gallium arsenide and into the second layer. Each time a packet of electrons tunneled through the barrier, the team was able to measure a current using remote electrodes. They also tuned the electrons’ momentum and energy by applying a magnetic field perpendicular to the tunneling direction. They reasoned that those electrons that were able to tunnel through to the second layer of gallium arsenide did so because their momenta and energies coincided with those of electronic states in that layer. In other words, the momentum and energy of the electrons tunneling into gallium arsenide were the same as those of the electrons residing within the material.

    By tuning electron pulses and recording those electrons that went through to the other side, the researchers were able to map the energy and momentum of electrons within the material. Despite existing in a solid and being surrounded by atoms, these electrons can sometimes behave just like free electrons, albeit with an “effective mass” that may be different from the free electron mass. This is the case for electrons in gallium arsenide, and the resulting distribution has the shape of a parabola. Measuring this parabola gives a direct measure of the electron’s effective mass in the material.
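
    Since E(k) = ħ²k²/(2m*) for such quasi-free electrons, the effective mass falls out of a quadratic fit to the measured parabola. Below is a minimal sketch using synthetic data generated with the textbook GaAs value of about 0.067 electron masses; it is an illustration of the idea, not the authors’ analysis code.

    ```python
    # Extract an effective mass m* from a parabolic dispersion E = hbar^2 k^2 / (2 m*).
    # The "measured" points are synthetic stand-ins for MERTS data.
    import numpy as np

    HBAR = 1.054571817e-34  # J*s
    M_E = 9.1093837015e-31  # kg

    k = np.linspace(-5e8, 5e8, 41)             # wavevector, 1/m
    m_star_true = 0.067 * M_E                  # textbook GaAs effective mass
    E = HBAR**2 * k**2 / (2 * m_star_true)     # ideal parabola, joules
    E += np.random.default_rng(0).normal(scale=0.02 * E.max(), size=E.shape)

    a = np.polyfit(k, E, 2)[0]                 # coefficient of the k^2 term
    m_star = HBAR**2 / (2 * a)                 # invert a = hbar^2 / (2 m*)
    print(f"fitted m* = {m_star / M_E:.3f} m_e")  # should come out near 0.067
    ```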

    Exotic, unseen phenomena

    The researchers used their technique to visualize electron behavior in gallium arsenide under various conditions. In several experimental runs, they observed “kinks” in the resulting parabola, which they interpreted as vibrations within the material.

    “Gallium and arsenic atoms like to vibrate at certain frequencies or energies in this material,” Ashoori says. “When we have electrons at around those energies, they can excite those vibrations. And we could see that for the first time, in the little kinks that appeared in the spectrum.”

    They also ran the experiments under a second, perpendicular magnetic field and were able to observe changes in electron behavior at given field strengths.

    “In a perpendicular field, the parabolas or energies become discrete jumps, as a magnetic field makes electrons go around in circles inside this sheet,” Ashoori says.

    “This has never been seen before.”

    The researchers also found that, under certain magnetic field strengths, the ordinary parabola resembled two stacked donuts.

    “It was really a shock to us,” Ashoori says.

    They realized that the abnormal distribution was a result of electrons interacting with vibrating ions within the material.

    “In certain conditions, we found we can make electrons and ions interact so strongly, with the same energy, that they look like some sort of composite particles: a particle plus a vibration together,” Jang says.

    Further elaborating, Ashoori explains that “it’s like a plane, traveling along at a certain speed, then hitting the sonic barrier. Now there’s this composite thing of the plane and the sonic boom. And we can see this sort of sonic boom — we’re hitting this vibrational frequency, and there’s some jolt happening there.”

    The team hopes to use its technique to explore even more exotic, unseen phenomena below the material surface.

    “Electrons are predicted to do funny things like cluster into little bubbles or stripes,” Ashoori says. “These are things we hope to see with our tunneling technique. And I think we have the power to do that.”

    This research was supported, in part, by the Gordon and Betty Moore Foundation and the BES program of the Office of Science of the U.S. Department of Energy.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 7:56 am on November 17, 2017 Permalink | Reply
    Tags: Applied Research & Technology, food insecurity, Jordan joins Mediterranean research effort to tackle water, UNC Endeavors   

    From UNC Endeavors: “Jordan joins Mediterranean research effort to tackle water, food insecurity” 

    UNC Endeavors

    13 November 2017
    Kevin Casey

    A 10-year research partnership between Mediterranean countries will tackle water scarcity and food insecurity in the region. Image credit – CC0

    Jordan has become the latest country to sign up to an international research effort to tackle water scarcity and food insecurity in the Mediterranean region.

    The agreement was signed in a ceremony at the World Science Forum in Jordan on 10 November. Once it is ratified by Jordan and the EU, the country will join a list of partners including Israel, Algeria, Tunisia and Turkey, who have agreed to work together to develop ways to meet the growing challenges of climate change, population growth and urbanisation in the region.

    Carlos Moedas, EU Commissioner for Research, Science and Innovation, who attended the EU-Jordan signing ceremony, called the initiative ‘the most ambitious joint research and innovation programme ever to be undertaken by countries across the Mediterranean.’

    The 10-year partnership for research and innovation in the Mediterranean area, known as PRIMA and due to start in 2018, will develop scientific research into water and sustainable food production, topics of pressing concern to the countries of the Mediterranean region.

    It will be financed with funding of EUR 274 million from the participant countries, backed by EUR 220 million from the EU’s Horizon 2020 research funding programme.

    Diplomacy

    In a keynote speech on the final day of the World Science Forum, the theme of which was science for peace, Commissioner Moedas said that initiatives like PRIMA show that sometimes science can be the best tool for diplomacy.

    While the broader Middle East region is rife with conflicting political viewpoints, he pointed out that there is common ground in scientific research, and nations that open up to science and innovation can progress their own wellbeing.

    ‘This message of international cooperation is powerful,’ he said.

    Commissioner Moedas said the extent to which scientific cooperation can overcome political tensions is illustrated by an iconic photograph from 1975, depicting Soviet and US astronauts greeting each other in space despite the severe Cold War tensions between the two countries.

    American and Soviet astronauts met in space in 1975, despite the tensions of the Cold War. Image credit – NASA

    ‘The two men are floating in zero gravity, reaching across a hatch from an American spaceship to a Russian one, grasping their hands and turning their faces to smile at the camera,’ he said.

    Overcoming the technical challenges of a rendezvous in space between two incompatible spacecraft required a great deal of cooperation between the scientists and engineers of both countries.

    Even at the lowest point of their political relationships, Soviet and American scientists found grounds to work together, and this is because science is the universal language, said the Commissioner.

    ‘It does not care about capitalism, or communism. Or religious creed. Science does not take sides. But it can improve the lives of many, no matter what they believe,’ he said.

    Cooperation

    The Middle East already has an example of scientific cooperation helping open channels of communication between political rivals in SESAME, a synchrotron light source research centre hosted in Jordan with partners including Israel, Pakistan, Iran and Turkey.

    SESAME Particle Accelerator Jordan interior


    SESAME Particle Accelerator, Jordan campus, an independent laboratory located in Allan in the Balqa governorate of Jordan

    SESAME, which is funded partly by the EU, creates benefits beyond scientific achievements, according to Commissioner Moedas, who said of the endeavour: ‘That generates mutual respect and admiration. That moves people’s hearts, as well as their minds.’

    ________________________________________________________________
    ‘Science does not take sides. But it can improve the lives of many, no matter what they believe.’
    Carlos Moedas, EU Commissioner for Research, Science and Innovation
    ________________________________________________________________

    However, he made the point that while scientific research keeps the door open to positive dialogue, developing an open research system also means the science gets better.

    Recently published research has concluded that scientists have most impact when they’re free to move. Another study found that international mobility can bolster innovation.

    ‘I would go further than this,’ said Commissioner Moedas. ‘International science is also the best thing for our world.’

    On 27 October, the European Commission announced that they would spend more than EUR 1 billion over the next three years on 30 flagship initiatives that promote international cooperation in areas of mutual benefit.

    These will include working with Canada on personalised medicine, Africa on sustainable agriculture, and Japan, Korea, China and Taiwan on 5G technology.

    There are also plans to cooperate with Russia on research infrastructures, a development that comes despite political tensions. ‘Russia is still a welcome partner in Horizon 2020,’ said Commissioner Moedas, as joint research on areas of mutual concern continues to enable a ‘precious link through the common language and ideals of science.’

    He said that the successor funding programme to Horizon 2020 should also support open science by enabling mobility for scientists, collaborating with non-EU partners and doing more to address global challenges.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UNC bloc

    UNC campus

    Carolina’s vibrant people and programs attest to the University’s long-standing place among leaders in higher education since it was chartered in 1789 and opened its doors for students in 1795 as the nation’s first public university. Situated in the beautiful college town of Chapel Hill, N.C., UNC has earned a reputation as one of the best universities in the world. Carolina prides itself on a strong, diverse student body, academic opportunities not found anywhere else, and a value unmatched by any public university in the nation.

     
  • richardmitnick 8:33 pm on November 16, 2017 Permalink | Reply
    Tags: Applied Research & Technology

    From phys.org: “Machine learning used to predict earthquakes in a lab setting” 

    physdotorg
    phys.org

    October 23, 2017

    Aerial photo of the San Andreas Fault in the Carrizo Plain, northwest of Los Angeles. Credit: Wikipedia.

    A group of researchers from the UK and the US have used machine learning techniques to successfully predict earthquakes. Although their work was performed in a laboratory setting, the experiment closely mimics real-life conditions, and the results could be used to predict the timing of a real earthquake.

    The team, from the University of Cambridge, Los Alamos National Laboratory and Boston University, identified a hidden signal leading up to earthquakes, and used this ‘fingerprint’ to train a machine learning algorithm to predict future earthquakes. Their results, which could also be applied to avalanches, landslides and more, are reported in the journal Geophysical Research Letters.

    For geoscientists, predicting the timing and magnitude of an earthquake is a fundamental goal. Generally speaking, pinpointing where an earthquake will occur is fairly straightforward: if an earthquake has struck a particular place before, the chances are it will strike there again. The questions that have challenged scientists for decades are how to pinpoint when an earthquake will occur, and how severe it will be. Over the past 15 years, advances in instrument precision have been made, but a reliable earthquake prediction technique has not yet been developed.

    As part of a project searching for ways to use machine learning to make gallium nitride (GaN) LEDs more efficient, the study’s first author, Bertrand Rouet-Leduc, then a PhD student at Cambridge, moved to Los Alamos National Laboratory in New Mexico to start a collaboration on machine learning in materials science between the two institutions. From there, the team began helping the Los Alamos geophysics group with machine learning questions.

    The team at Los Alamos, led by Paul Johnson, studies the interactions among earthquakes, precursor quakes (often very small earth movements) and faults, with the hope of developing a method to predict earthquakes. Using a lab-based system that mimics real earthquakes, the researchers used machine learning techniques to analyse the acoustic signals coming from the ‘fault’ as it moved and search for patterns.

    The laboratory apparatus uses steel blocks to closely mimic the physical forces at work in a real earthquake, and also records the seismic signals and sounds that are emitted. Machine learning is then used to find the relationship between the acoustic signal coming from the fault and how close it is to failing.

    The machine learning algorithm was able to identify a particular pattern in the sound, previously thought to be nothing more than noise, which occurs long before an earthquake. The characteristics of this sound pattern can be used to give a precise estimate (within a few percent) of the stress on the fault (that is, how much force it is under) and to estimate the time remaining before failure, which gets more and more precise as failure approaches. The team now thinks that this sound pattern is a direct measure of the elastic energy that is in the system at a given time.
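
    The article includes no code, but the general recipe it describes — summary statistics of the acoustic signal over sliding windows, fed to a regressor that predicts time-to-failure — can be sketched as below. The synthetic signal and the particular features are illustrative stand-ins, not the authors’ data or feature set.

    ```python
    # Sketch of windowed acoustic features + regression forest for predicting
    # time-to-failure in a lab "earthquake". All data here are synthetic.
    import numpy as np
    from scipy.stats import kurtosis
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    def make_cycle(n=20000):
        """Fake stick-slip cycle: acoustic variance grows as failure nears."""
        t_left = np.linspace(10.0, 0.0, n)  # seconds remaining to failure
        signal = rng.normal(scale=1.0 + 5.0 / (t_left + 0.5), size=n)
        return signal, t_left

    def window_features(signal, t_left, width=500):
        X, y = [], []
        for start in range(0, len(signal) - width, width):
            w = signal[start:start + width]
            X.append([w.var(), np.abs(w).max(), kurtosis(w)])  # simple stats
            y.append(t_left[start + width - 1])
        return np.array(X), np.array(y)

    X_train, y_train = window_features(*make_cycle())
    X_test, _ = window_features(*make_cycle())

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    print("predicted seconds to failure:", model.predict(X_test[:3]).round(2))
    ```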

    “This is the first time that machine learning has been used to analyse acoustic data to predict when an earthquake will occur, long before it does, so that plenty of warning time can be given – it’s incredible what machine learning can do,” said co-author Professor Sir Colin Humphreys of Cambridge’s Department of Materials Science & Metallurgy, whose main area of research is energy-efficient and cost-effective LEDs. Humphreys was Rouet-Leduc’s supervisor when he was a PhD student at Cambridge.

    “Machine learning enables the analysis of datasets too large to handle manually and looks at data in an unbiased way that enables discoveries to be made,” said Rouet-Leduc.

    Although the researchers caution that there are multiple differences between a lab-based experiment and a real earthquake, they hope to progressively scale up their approach by applying it to real systems which most resemble their lab system. One such site is in California along the San Andreas Fault, where characteristic small repeating earthquakes are similar to those in the lab-based earthquake simulator. Progress is also being made on the Cascadia fault in the Pacific Northwest of the United States and British Columbia, Canada, where repeating slow earthquakes that occur over weeks or months are also very similar to laboratory earthquakes.

    “We’re at a point where huge advances in instrumentation, machine learning, faster computers and our ability to handle massive data sets could bring about huge advances in earthquake science,” said Rouet-Leduc.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 8:20 pm on November 16, 2017 Permalink | Reply
    Tags: Applied Research & Technology, The focus of the study were blood samples from Ebola patients that were obtained during the outbreak in Sierra Leone in 2014, The scientists found that levels of two biomarkers known as L-threonine (an amino acid) and vitamin-D-binding-protein may accurately predict which patients live and which die, The team found that survivors had higher levels of some immune-related molecules and lower levels of others compared to those who died, The team looked at activity levels of genes and proteins as well as the amounts of lipids and byproducts of metabolism

    From PNNL: “Unlocking the secrets of Ebola” 

    PNNL BLOC
    PNNL Lab

    November 16, 2017
    Tom Rickey
    tom.rickey@pnnl.gov
    (509) 375-3732

    PNNL scientists and their collaborators have identified molecules in the blood that indicate which patients with Ebola virus are most likely to have a poor outcome. (Credit: Photo courtesy of PNNL)

    Scientists have identified a set of biomarkers that indicate which patients infected with the Ebola virus are most at risk of dying from the disease.

    The results come from scientists at the Department of Energy’s Pacific Northwest National Laboratory and their colleagues at the University of Wisconsin-Madison, Icahn School of Medicine at Mount Sinai, the University of Tokyo and the University of Sierra Leone. The results were published online Nov. 16 in the journal Cell Host & Microbe.

    The findings could allow clinicians to prioritize the scarce treatment resources available and provide them to the sickest patients, said the senior author of the study, Yoshihiro Kawaoka, a virology professor at the UW-Madison School of Veterinary Medicine.

    The focus of the study was blood samples from Ebola patients that were obtained during the outbreak in Sierra Leone in 2014. The Wisconsin team obtained 29 blood samples from 11 patients who ultimately survived and nine blood samples from nine patients who died from the virus. The Wisconsin team inactivated the virus according to approved protocols, developed in part at PNNL, and then shipped the samples to PNNL and other institutions for analysis.

    The team looked at activity levels of genes and proteins as well as the amounts of lipids and byproducts of metabolism. The team found 11 biomarkers that distinguish fatal infections from non-fatal ones and two that, when screened for early upon symptom onset, accurately predict which patients are likely to die.

    “Our team studied thousands of molecular clues in each of these samples, sifting through extensive data on the activity of genes, proteins, and other molecules to identify those of most interest,” said Katrina Waters, the leader of the PNNL team and a corresponding author of the paper. “This may be the most thorough analysis yet of blood samples of patients infected with the Ebola virus.”

    The team found that survivors had higher levels of some immune-related molecules and lower levels of others compared to those who died. Plasma cytokines, which are involved in immunity and stress response, were higher in the blood of people who perished. Fatal cases had unique metabolic responses compared to survivors, higher levels of virus, changes to plasma lipids involved in processes like blood coagulation, and more pronounced activation of some types of immune cells.

    Pancreatic enzymes also leaked into the blood of patients who died, suggesting that these enzymes contribute to the tissue damage characteristic of fatal Ebola virus disease.

    The scientists found that levels of two biomarkers, known as L-threonine (an amino acid) and vitamin-D-binding-protein, may accurately predict which patients live and which die. Both were present at lower levels at the time of admission in the patients who ultimately perished.
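
    As a purely illustrative sketch of how two admission-time biomarkers could be combined into a prognostic score, the code below fits a logistic regression to simulated values that only mimic the paper’s qualitative finding (lower levels in fatal cases). It is not the study’s analysis, and the numbers carry no clinical meaning.

    ```python
    # Toy two-biomarker outcome classifier on simulated data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 200

    # Columns: [L-threonine, vitamin-D-binding protein], arbitrary units.
    survived = rng.normal(loc=[1.0, 1.0], scale=0.2, size=(n, 2))
    fatal = rng.normal(loc=[0.6, 0.7], scale=0.2, size=(n, 2))  # lower levels

    X = np.vstack([survived, fatal])
    y = np.array([0] * n + [1] * n)  # 1 = fatal outcome

    clf = LogisticRegression().fit(X, y)
    p = clf.predict_proba([[0.55, 0.65]])[0, 1]  # a low-biomarker patient
    print(f"P(fatal) = {p:.2f}")
    ```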

    The team found that many of the molecular signals present in the blood of sick, infected patients overlap with sepsis, a condition in which the body – in response to infection by bacteria or other pathogens – mounts a damaging inflammatory reaction.

    Fifteen PNNL scientists contributed to the study. Among the corresponding authors of the study are three PNNL scientists: Waters, Thomas Metz and Richard D. Smith. Three additional PNNL scientists – Jason P. Wendler, Jennifer E. Kyle and Kristin E. Burnum-Johnson – are among six scientists who share “first author” honors.

    Other PNNL authors include Jon Jacobs, Young-Mo Kim, Cameron Casey, Kelly Stratton, Bobbie-Jo Webb-Robertson, Marina Gritsenko, Matthew Monroe, Karl Weitz, and Anil Shukla.

    Analyses of proteins, lipids and metabolites in the blood samples were performed at EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science User Facility at PNNL.

    The study was funded by a Japanese Health and Labor Sciences Research Grant; by grants for Scientific Research on Innovative Areas from the Ministry of Education, Cultures, Sports, Science and Technology of Japan; by Emerging/Re-emerging Infectious Diseases Project of Japan; and by the National Institute of Allergy and Infectious Diseases, part of the National Institutes of Health. Support was also provided by the Department of Scientific Computing at the Icahn School of Medicine at Mount Sinai and by a grant from the National Institute of General Medicine.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.


     
  • richardmitnick 12:09 pm on November 15, 2017 Permalink | Reply
    Tags: A Speed Gun for Photosynthesis, A type of optical sensor that if the science bears out will be able to estimate the rate of photosynthesis, Applied Research & Technology, SIF - Solar Induced Fluorescence, Specially designed sap flow sensors, Such a device would revolutionize agriculture forestry and the study of Earth’s climate and ecosystems

    From NIST: “A Speed Gun for Photosynthesis” 


    NIST

    The NIST forest in Gaithersburg, Maryland. Credit: R. Press/NIST

    November 03, 2017 [NIST is not always quick to social media]
    Rich Press

    On a recent sunny afternoon, David Allen was standing by a third-floor window in a research building at the National Institute of Standards and Technology (NIST), holding in his hands a device that looked like a cross between a video camera and a telescope. The NIST campus is in suburban Gaithersburg, Maryland, but looking out the window, Allen could see 24 hectares (60 acres) of tulip tree, oak, hickory and red maple—a remnant of the northeastern hardwood forest that once dominated this landscape.

    Allen mounted the device on a tripod and pointed it out the window at the patch of forest below. The device wasn’t a camera, but a type of optical sensor that, if the science bears out, will be able to estimate the rate of photosynthesis—the chemical reaction that enables plants to convert water, carbon dioxide (CO2) and sunlight into food and fiber—from a distance.

    That measurement is possible because when plants are photosynthesizing, their leaves emit a very faint glow of infrared light. That glow is called Solar Induced Fluorescence, or SIF, and in recent years, optical sensors for measuring it have advanced dramatically. The sensor that Allen had just mounted on a tripod was one of them.

    “If SIF sensors end up working well,” Allen said, “I can imagine an instrument that stares at crops or a forest and has a digital readout on it that says how fast the plant is growing in real time.”

    Such a device would revolutionize agriculture, forestry and the study of Earth’s climate and ecosystems.

    NIST scientist David Allen and Boston University Ph.D. student Julia Marrs aim a SIF sensor at a specific tree in the NIST forest. Credit: R. Press/NIST

    Allen is a NIST chemist whose research involves remote sensing—the technology that’s used to observe Earth from outer space. Remote sensing allows scientists to track hurricanes, map terrain, monitor population growth and produce daily weather reports. The technology is so deeply embedded in our everyday lives that it’s easy to take for granted. But each type of remote sensing had to be developed from the ground up, and the SIF project at NIST shows how that’s done.

    Some satellites are already collecting SIF data, but standards are needed to ensure that those measurements can be properly interpreted. NIST has a long history of developing standards for satellite-based measurements, and Allen’s research is aimed at developing standards for measuring SIF. Doing that requires a better understanding of the biological processes that underlie SIF, and for that, Allen teamed up with outside scientists.

    At the same time that Allen was aiming a SIF sensor through that third-floor window, a team of biologists from Boston University and Bowdoin College was in the NIST forest measuring photosynthesis up close. A pair of them spent the day climbing into the canopy on an aluminum orchard ladder. Once there, they would use a portable gas exchange analyzer to measure photosynthesis directly based on how much CO2 the leaf pulled out of the air. They also measured SIF at close range.

    Boston University ecologist Lucy Hutyra (left) works at the forest edge alongside plant physiological ecologist Barry Logan (center) and ecologist Jaret Reblin, both of Bowdoin College in Brunswick, Maine. They measured photosynthesis directly, as well as temperature, humidity, and other environmental variables. Credit: R. Press/NIST

    Other scientists checked on specially designed sap flow sensors they had installed on the trunks of trees to measure the movement of water toward the leaves for photosynthesis.

    “We’re measuring the vital signs of the trees,” said Lucy Hutyra, the Boston University ecologist who led the team of scientists on the ground. The idea was to use those ground measurements to make sense of the SIF data collected from a distance.

    “If we measure an increase in photosynthesis at the leaf, we should see a corresponding change in the optical signal,” Hutyra said.

    After directly measuring photosynthesis in an individual leaf using a field portable gas exchange analyzer, scientists preserved a small sample of leaf tissue in liquid nitrogen. They would later analyze that tissue in the lab to measure levels of chlorophyll and other pigments. Credit: R. Press/NIST

    The research was also taking place at still a higher level. That afternoon, Bruce Cook and Larry Corp, scientists with NASA’s G-LiHT project, flew over the NIST forest in a twin-turboprop plane that carried multiple sensors, including a SIF sensor and Light Detection and Ranging (LiDAR) sensors that mapped the internal structure of the forest canopy. The aircraft made six parallel passes over the forest at about 340 meters (1,100 feet, slightly above the minimum safe altitude allowed by FAA regulations), the instruments peering out from a port cut into the belly of the aircraft.

    That gave the scientists three simultaneous measurements to work with: from the ground, from the window above the forest and from the air. They’ll spend months correlating the data.

    “It’s tricky, because when you go from the leaf level to the forest level, you often get different results,” Allen said. For instance, at the forest level, the SIF signal is affected by the variations in the canopy, including its contours and density. “We’re still studying those effects.”

    At the airport in Gaithersburg, Maryland, NASA earth scientist Bruce Cook (left), leader of the Goddard LiDAR, Hyperspectral, and Thermal (G-LiHT) project, shows David Allen and Julia Marrs the sensor array in the bottom of the aircraft. Credit: R. Press/NIST

    Currently, there is no reliable way to measure photosynthesis in real time over a wide area. Instead, scientists measure how green an area is to gauge how much chlorophyll is present—that’s the molecule that supports photosynthesis and gives leaves their color. But if a plant lacks water or nutrients, it may be green even if the photosynthetic machinery is switched off.

    SIF may be a much better indicator of active photosynthesis. When plants are photosynthesizing, most of the light energy absorbed by the chlorophyll molecule goes into growing the plant, but about two to five percent of that energy leaks away as SIF. The amount of leakage is not always proportional to photosynthesis, however. Environmental variables also come into play.
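
    To see what that two-to-five percent leakage implies, a back-of-the-envelope inversion looks like the sketch below; the measured value is an illustrative assumption, and a real retrieval must also correct for canopy structure and environmental variables.

    ```python
    # If SIF is ~2-5% of the energy absorbed by chlorophyll, a measured SIF
    # value brackets the absorbed flux. Units and magnitude are illustrative.
    measured_sif = 1.2  # e.g. mW m^-2 nm^-1 sr^-1 (assumed)

    for f_yield in (0.02, 0.05):  # the 2-5% leakage range quoted above
        absorbed = measured_sif / f_yield
        print(f"yield {f_yield:.0%}: implied absorbed flux ~ {absorbed:.0f} (same units)")
    ```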

    The NIST forest is a test bed for understanding how all those variables interrelate. In addition to SIF data and the vital signs of trees, the scientists are collecting environmental data such as temperature, relative humidity and solar irradiance. They’re also figuring out the best ways to configure and calibrate the SIF instruments.

    “We’d like to see robust, repeatable results that make sense,” Allen said. “That will allow us to scale up from the leaf level, to the forest level, to the ecosystem level, and to estimate photosynthesis from measurements made at any of those scales.”

    Making SIF scalable is a key part of the measurement standard that Allen is working to create, and it will go from the ground level to measurements made from outer space.

    A corner of the NIST forest shot by NASA scientists, and the plane that carried them and their G-LiHT airborne imaging system. Credit: Bruce Cook, Larry Corp/NASA (left); David Allen/NIST

    Using SIF to measure photosynthesis in real time would allow farmers to use only as much irrigation and fertilizer as their crops need, and only when they need it. Forest managers would be able to know how fast their timber is growing without having to tromp through the woods with a tape measure. Environmental managers would be able to monitor the recovery of damaged or deforested habitats after a drought or forest fire.

    And scientists would have a powerful new tool for studying how plants help regulate the amount of CO2 in the atmosphere.

    Humans add CO2 to the atmosphere when they burn fossil fuels, and land-based plants remove roughly a quarter of that CO2 through photosynthesis. But the environmental factors that affect that process are not well understood, mainly because scientists haven’t had a good way to measure the uptake of CO2 at the ecosystem level. SIF measurements, and the standards for interpreting them accurately, might help solve that problem.

    “CO2 exchange by plants is one of the most important biological processes on the planet,” Allen said, “and SIF will give us a new way to see that process in action.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    NIST Mission, Vision, Core Competencies, and Core Values

    NIST’s mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.
    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

     
  • richardmitnick 8:24 am on November 15, 2017 Permalink | Reply
    Tags: AMAZE, Applied Research & Technology

    From ESA: “3D printed metal mutants arise from Europe’s AMAZE programme” 

    ESA Space For Europe Banner

    European Space Agency

    14 November 2017

    3D printed support
    Released 13/11/2017
    Copyright AMAZE – Thales Alenia Space/Renishaw
    Sun sensor and antenna support bracket produced through the AMAZE metal 3D printing programme, courtesy of Thales Alenia Space and Renishaw.

    Europe’s lead in metal 3D printing has been strengthened by the four-year AMAZE programme, producing lighter, cheaper, organically shaped parts. ESA was among 26 academic and industrial partners developing novel processes and products for high-performance sectors.

    Launched in 2013, AMAZE – short for Additive Manufacturing Aiming Towards Zero Waste and Efficient Production of High-Tech Metal Products – was the largest R&D programme for 3D printing ever run.

    ESA helped to initiate the programme, which was funded by the European Commission’s Seventh Framework Programme and coordinated by the UK’s Manufacturing Technology Centre (MTC).

    The Agency joined manufacturers Airbus and Thales Alenia Space in assessing prototype products for space use, while comparable end-users did the same for the automotive, aeronautics and nuclear fusion sectors.

    Laser-based 3D printing
    Released 13/11/2017
    Copyright AMAZE – Fraunhofer ILT/Airbus
    High-rate laser melting of a pylon bracket in Inconel 718 metal, courtesy of Fraunhofer ILT and Airbus

    “The work of AMAZE ranges right across the process chain,” explains David Wimpenny, Chief Technologist for the National Additive Manufacturing Centre, based at the MTC.

    “It includes new approaches to part design, along with the challenge of reliably finishing and inspecting the resulting parts, introducing novel materials, improving production throughput and developing common industrial standards.”

    Tommaso Ghidini, heading ESA’s Structures, Mechanisms and Materials Division, comments: “The Agency’s participation in AMAZE was an opportunity to create synergies and cross-fertilising benefits with our existing Advanced Manufacturing Cross-Cutting Initiative, harnessing game changing manufacturing technologies for space.”

    To draw maximum benefit from the process, parts need to be designed specially. With 3D printing it is only the volume of material being fused together that is paid for, with no waste to be cut away, so the lighter the weight of the part the cheaper it ends up.

    David Wimpenny adds: “It’s really opened the eyes of designers: through 3D printing, complex, performance-optimised, lightweight parts actually end up costing less than traditional alternatives.

    “During AMAZE we’ve been literally growing parts to bear the loads required; the result has been these organically shaped metal parts weighing less than half as much as the original component, manufactured all in one – removing joints, which represent potential points of weakness.

    3-m-wide titanium cylinder
    Released 13/11/2017
    Copyright AMAZE – ESA/IREPA
    Among the largest items produced by AMAZE, this 3-m diameter structural cylinder was printed in titanium alloy Ti64 using ‘directed energy deposition’ melting powder with a laser, courtesy of ESA and IREPA.

    “This complexity means that file sizes can be huge – several orders of magnitude larger than a normal CAD file – and it can take a long while to process all that data. But another AMAZE development has been new software tools to radically reduce the time involved.”

    New materials were developed to meet specific industrial needs, including the first 3D printing of InVar, an alloy of nickel and iron that is highly prized by the space sector for its ability to withstand orbital temperature extremes without expansion or contraction.

    3D printing of vanadium and tungsten was also demonstrated. These high-melting point metals are suited for use within nuclear fusion reactors as well as rocket engines.

    Temperature-resistant InVar-printed component

    Across the range of 3D printing techniques assessed, the parts produced varied hugely, from millimetre-scale samples to metre-scale structural items.

    “Just as important was increasing the speed and productivity of the process, from a few hundred grams to kilograms per hour, without compromising quality,” explains David Wimpenny.

    “We achieved this in various ways, including increasing the number and power of the lasers used for material melting.”

    Pylon brackets

    “We’ve also worked to ensure the powder feedstock is optimised for the process. The powder particles have to have the correct size and shape to provide the right flow properties to give consistently high-quality, defect-free layers.”

    Another challenge was the post processing, finish and inspection of the parts, including standardised non-destructive test procedures. Medical-style three-dimensional CT scanning is one solution that was explored, with AMAZE findings going towards an ongoing effort to develop a common ISO standard for the field.

    “The industrial partners in the project are now commercialising the results of the project and the AMAZE experience has helped to forge a research community which will continue to increase the knowledge and improve the capability of additive manufacturing processes going forwards”, says David Wimpenny.

    For instance, Norsk Titanium – supported by developments made during AMAZE – has become the first company to manufacture structural aircraft components using metal 3D printing.

    Tommaso explains that the ESA–RAL Advanced Manufacturing Lab at Harwell in the UK has played an important role in assessing the performance of AMAZE’s aeronautical and space outputs: “It has helped to define verification strategies used for critical applications, putting them on a fast track for adoption by projects.”

    David Wimpenny concludes that AMAZE has helped maintain Europe’s pre-eminence in the field of metal 3D printing, “But we shouldn’t be complacent. Global competition is fierce and it’s critically important Europe maintains its lead.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 22 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA50 Logo large

     
  • richardmitnick 5:10 am on November 15, 2017 Permalink | Reply
    Tags: Applied Research & Technology

    From COSMOS: “Need a better microscope? Add mirrors” 

    Cosmos Magazine bloc

    COSMOS Magazine

    15 November 2017
    Andrew Masterson

    Antonie van Leeuwenhoek’s first microscope, from the seventeenth century, looks nothing like a modern SPIM microscope, but both are products of a quest to improve optics. Stegerphoto.

    From pre-classical times onwards, it could be argued, lens-makers have been the unsung heroes of science.

    As early as 750 BCE the Assyrians were shaping lenses from quartz. From there, the history of optics both underpins and enables discovery in the macro and micro worlds.

    Where would science be today had it not been for the patient work of myriad lens grinders and optics theorists, including Francis Bacon, Galileo, van Leeuwenhoek, right up to Roberts and Young – inventors in 1951 of photon scanning microscopy – and beyond?

    Even today, the quest for better, clearer, more detailed images from lenses continues apace, with the latest advance, reported in the journal Nature Communications, coming from the US National Institutes of Health and the University of Chicago.

    The images obtained by the combination of the new coverslip and computer algorithms show clearer views of small structures. Credit: Yicong Wu, National Institute of Biomedical Imaging and Bioengineering

    In this diagram, you can see how the mirrored coverslip allows for four simultaneous views. Credit: Yicong Wu, National Institute of Biomedical Imaging and Bioengineering

    A team of researchers, led by Hari Shroff, head of the National Institute of Biomedical Imaging and Bioengineering’s lab section on High Resolution Optical Imaging (HROI), report the solution to a mechanical problem in microscope optics that was, in a way, of their own making.

    Several years ago, Shroff and colleagues developed a new type of microscope that performed “selective plane illumination microscopy” or SPIM. These microscopes use light sheets to illuminate only sections of specimens being examined, thereby doing less damage and better preserving the sample.

    In 2013, Shroff’s team created a SPIM microscope that used two lenses instead of one, which improved image quality and depth perception. In 2016, a third lens was added, allowing improved resolution and 3D imagery.

    A fourth lens would have boosted matters even more, but at this point van Leeuwenhoek’s twenty-first century heirs hit a snag.

    “Once we incorporated three lenses, we found it became increasingly difficult to add more,” says Shroff. “Not because we reached the limit of our computational abilities, but because we ran out of physical space.”

    Proximity was a real issue. Not only were the three lenses crowded together, but all had to be positioned extremely close to the sample being examined to allow the imaging goal – detailed views of structures within a single cell, say – to be achieved.

    In their new paper, Shroff and his colleagues reveal a solution to the problem that is nothing if not elegant. Rather than try to cram an extra lens in, they have put mirrors on the coverslip – the thin piece of glass that sits on top of the sample.

    The result – especially when coupled with new algorithms in the computerised back-end of a SPIM microscope – is better speed, efficiency and resolution.

    “It’s a lot like looking into a mirror,” Shroff explains. “If you look at a scene in a mirror, you can view perspectives that are otherwise hidden. We used this same principle with the microscope.

    “We can see the sample conventionally using the usual views enabled by the lenses themselves, while at the same time recording the reflected images of the sample provided by the mirror.”

    The addition of the tiny mirrors was not without its own problems. Every raw microscope image contains unwanted signal from the illumination source used to light up the sample. With three lenses, there are three sources of this interference; with the mirrors added, these too are multiplied.

    Shroff, however, took this problem to computational imaging researcher Patrick La Riviere at the University of Chicago, who, with his team, was able to modify the processing software to eliminate the extra noise and further improve the signal.
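
    The general family of algorithms for fusing several views of one sample is joint multi-view deconvolution. The toy sketch below alternates Richardson-Lucy updates between two synthetically blurred views of the same scene; it is an illustration of the principle with an assumed Gaussian point-spread function, not the NIH/Chicago software, and real SPIM processing must also register views and model view-specific noise.

    ```python
    # Toy joint multi-view Richardson-Lucy deconvolution on synthetic 2-D data.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def multiview_rl(views, sigmas, n_iter=50):
        """Alternate RL updates across views that share one common estimate."""
        est = np.full_like(views[0], views[0].mean())
        for _ in range(n_iter):
            for v, s in zip(views, sigmas):
                ratio = v / (gaussian_filter(est, s) + 1e-9)
                est = est * gaussian_filter(ratio, s)  # Gaussian PSF is symmetric
        return est

    # Ground truth: a point source; two views blurred along different axes,
    # mimicking the complementary perspectives a mirrored coverslip provides.
    truth = np.zeros((64, 64))
    truth[32, 32] = 1.0
    sigmas = [(3, 1), (1, 3)]
    views = [gaussian_filter(truth, s) for s in sigmas]

    fused = multiview_rl(views, sigmas)
    print(f"peak value: single view {views[0].max():.3f}, fused {fused.max():.3f}")
    ```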

    Francis Bacon, one thinks, would have approved.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 4:21 am on November 15, 2017 Permalink | Reply
    Tags: Applied Research & Technology, Hydro-next-gen: shifting hydrogen research to the fast lane

    From CSIROscope: “Hydro-next-gen: shifting hydrogen research to the fast lane”

    CSIRO bloc

    CSIROscope

    15 November 2017
    Claire Ginn

    Australia is known for its vast energy resources. We’ve got ample sun, wind, biomass, natural gas and coal, and while these resources can be used effectively on their own, we can also use them to produce other energy sources, like hydrogen.

    Why use one energy source to create another? Our future energy mix will consist of many different sources to ensure we have energy when and where it’s needed, in the cleanest form. Through hydrogen, we can effectively create low-emissions energy for our own use, store it for later, or transport it overseas where it’s in demand. It’s a way to use what we already have, but tailored for a particular purpose.

    Published on Nov 8, 2017
    Hydrogen is the most abundant element in the universe and a clean-cut burning fuel. We’re looking into how it can be integrated into power generation, transport, manufacturing and energy storage, both in Australia and overseas.

    Water, water, everywhere (let’s turn it into hydrogen)

    Hydrogen is all around us, just not quite in the form that we need. It’s the H2 in water (H2O), the H4 in methane (CH4), the H22 in our table sugar (C12H22O11) … the list goes on. To isolate pure hydrogen, we need to force a reaction, and there are a few ways to do this:

    Reforming: currently the most common way of producing pure hydrogen whereby heat is applied to natural gas, causing hydrogen atoms to be separated from the carbon atoms. Technologies such as carbon capture and storage (CCS) can be applied to this process to limit emissions.
    Electrolysis: an electric current can be passed through water to separate hydrogen molecules from oxygen. Renewable sources could potentially be used to drive this reaction, making the whole process very clean.
    Biomass gasification: literally turning our trash into (gaseous) treasure, whereby a process involving heat, steam and oxygen can be performed to convert biomass (and waste products) into hydrogen.

    While the majority of hydrogen is currently produced from fossil fuel sources (around 96% as of 2014), there is a real opportunity for Australia to produce low or zero emission hydrogen. We live on the continent with the most solar energy coverage in the world, so it makes sense to capitalise!
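
    For a sense of scale on the electrolysis route: thermodynamics sets a floor of about 39.4 kWh of electricity per kilogram of hydrogen (the fuel’s higher heating value). The sketch below applies an assumed round-number electrolyser efficiency; it is my arithmetic, not a CSIRO figure.

    ```python
    # Rough electricity cost of electrolytic hydrogen.
    HHV_KWH_PER_KG = 39.4  # thermodynamic minimum, kWh per kg of H2
    EFFICIENCY = 0.70      # assumed electrolyser efficiency (illustrative)

    kg_h2 = 100.0
    electricity_kwh = kg_h2 * HHV_KWH_PER_KG / EFFICIENCY
    print(f"{kg_h2:.0f} kg of H2 needs ~{electricity_kwh:,.0f} kWh of electricity")
    # ~5,600 kWh -- which is why abundant, cheap solar matters for clean hydrogen.
    ```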

    A very useful fuel

    How can we use hydrogen? Let us count the ways. It can support power generation, transport, food production, agriculture, and more.

    One of the most common uses is in fuel cells, combining hydrogen and oxygen to produce electricity, heat, and water. A fuel cell will produce electricity as long as fuel (hydrogen) is supplied. That means no recharging, and no harmful emissions. They are an option for creating heat and electricity for buildings, and electrical power for vehicles. Right now, we’re looking into the feasibility of hydrogen and fuel cells in powering major facilities like the MCG.

    Carried away

    One of the bigger challenges with hydrogen is that its low density means that it’s difficult to transport. To get around this, ‘carrier fuels’, like ammonia, can be used to take the hydrogen to where it’s needed. Almost counter-intuitively, ammonia (NH3) has a higher density of hydrogen than pure hydrogen (H2), so it’s a very efficient way to transport the fuel. And we already have much of the infrastructure needed to achieve this, even as an export fuel.
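
    The arithmetic behind that counter-intuitive claim is short; the densities below are standard handbook values, and the comparison is only illustrative.

    ```python
    # How much hydrogen does a cubic metre of liquid ammonia carry versus
    # a cubic metre of liquid hydrogen? Handbook values, rounded.
    M_NH3 = 17.03               # g/mol
    M_H = 1.008                 # g/mol
    LIQ_NH3_DENSITY = 682.0     # kg/m^3 at about -33 C
    LIQ_H2_DENSITY = 70.8       # kg/m^3 at about -253 C

    h_fraction = 3 * M_H / M_NH3                  # hydrogen mass fraction of NH3
    h_per_m3_nh3 = LIQ_NH3_DENSITY * h_fraction   # kg of hydrogen per m^3 of NH3
    print(f"H mass fraction in NH3: {h_fraction:.1%}")                 # ~17.8%
    print(f"kg H per m^3: NH3 {h_per_m3_nh3:.0f} vs liquid H2 {LIQ_H2_DENSITY:.0f}")
    ```

    Ammonia also liquefies at a far milder temperature than hydrogen, which is part of why existing infrastructure can handle it.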

    Hydrogen can also carry energy to be used ‘on demand’, perhaps using our existing gas grid as a delivery method. The concept of ‘power-to-gas’ means that hydrogen can be injected into the gas grid where it is effectively ‘stored energy’. This could potentially stabilise the fluctuations we encounter when the sun isn’t shining and the wind isn’t blowing.

    In the lab

    We’re working on a number of technologies to increase the efficiency of hydrogen production, storage, and conversion to a suitable form for export and domestic use. These include inexpensive electrolysers to catalyse renewable production, membranes to extract hydrogen from carrier fuels at the point of use, and ways to use hydrogen to create synthetic fuels that can replace diesel and gasoline. We also have a laboratory dedicated to testing ‘hybrid energy systems’, whereby two or more energy technologies are combined for overall benefit – hydrogen lends itself particularly well to these systems.

    1
    Dr Michael Dolan in our hydrogen laboratory. No image credit.

    The time is right

    Australia is known as a leading exporter of energy resources, but the time is right to take a leadership position in what could be the ‘next big thing’. We can use our rich energy resources to produce hydrogen, either as an export or to be used domestically in transport, power generation and to offset more carbon-intensive processes.

    We’re already seeing momentum. The South Australian Energy Plan released earlier in 2017 mentions hydrogen as an area for further investigation. Victoria announced funding for a commercial-scale hydrogen refuelling station for garbage trucks. And further afield, Japan has made clear its intentions for hydrogen to play a major role in powering its Tokyo Olympics in 2020.

    While technologies are emerging and converging, what’s needed is a central coordination point, and we’re seeking to provide just that through a new Future Science Platform (FSP) focused on Hydrogen Energy Systems. This platform will help us develop technologies that allow us to export solar energy, as well as provide low-emissions energy solutions for Australians.

    From what we’re doing in the lab, to pilot and demonstration scale testing, there’s plenty of activity planned. We’ve also got a strong network of partners and collaborators to support current, practical research and technology initiatives across the hydrogen energy value chain.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 4:58 pm on November 14, 2017 Permalink | Reply
    Tags: Applied Research & Technology, , , , , Quantum Circuits Company, , , , Robert Schoelkopf is at the forefront of a worldwide effort to build the world’s first quantum computer,   

    From NYT: “Yale Professors Race Google and IBM to the First Quantum Computer” 

    The New York Times

    NOV. 13, 2017
    CADE METZ

    1
    Prof. Robert Schoelkopf inside a lab at Yale University. Quantum Circuits, the start-up he has created with two of his fellow professors, is located just down the road. Credit Roger Kisby for The New York Times

    Robert Schoelkopf is at the forefront of a worldwide effort to build the world’s first quantum computer. Such a machine, if it can be built, would use the seemingly magical principles of quantum mechanics to solve problems today’s computers never could.

    Three giants of the tech world — Google, IBM, and Intel — are using a method pioneered by Mr. Schoelkopf, a Yale University professor, and a handful of other physicists as they race to build a machine that could significantly accelerate everything from drug discovery to artificial intelligence. So is a Silicon Valley start-up called Rigetti Computing. And those four quantum projects have another notable competitor, one that has remained under the radar until now: Robert Schoelkopf.

    After their research helped fuel the work of so many others, Mr. Schoelkopf and two other Yale professors have started their own quantum computing company, Quantum Circuits.

    Based just down the road from Yale in New Haven, Conn., and backed by $18 million in funding from the venture capital firm Sequoia Capital and others, the start-up is another sign that quantum computing — for decades a distant dream of the world’s computer scientists — is edging closer to reality.

    “In the last few years, it has become apparent to us and others around the world that we know enough about this that we can build a working system,” Mr. Schoelkopf said. “This is a technology that we can begin to commercialize.”

    Quantum computing systems are difficult to understand because they do not behave like the everyday world we live in. But this counterintuitive behavior is what allows them to perform calculations at a rate that would not be possible on a typical computer.

    Today’s computers store information as “bits,” with each transistor holding either a 1 or a 0. But thanks to something called the superposition principle — behavior exhibited by subatomic particles like electrons and photons, the fundamental particles of light — a quantum bit, or “qubit,” can store a 1 and a 0 at the same time. This means two qubits can hold four values at once. As you expand the number of qubits, the machine becomes exponentially more powerful.
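    A tiny NumPy sketch makes that scaling concrete: an n-qubit state is a vector of 2^n complex amplitudes, so the description doubles with every qubit added. This illustrates the arithmetic only; it is not a model of any company’s hardware.

```python
import numpy as np

# An n-qubit quantum state is a vector of 2**n complex amplitudes.
n = 2
# Equal superposition over all 2**n basis states (|00>, |01>, |10>, |11>):
state = np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)

# Measurement probability of each basis state is |amplitude|**2:
for basis, amp in enumerate(state):
    print(f"|{basis:0{n}b}>: {abs(amp)**2:.2f}")   # 0.25 each

# The exponential growth that makes these machines powerful (and hard
# to simulate classically):
for qubits in (10, 30, 50):
    print(f"{qubits} qubits -> {2**qubits:,} amplitudes")
```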

    Todd Holmdahl, who oversees the quantum project at Microsoft, said he envisioned a quantum computer as something that could instantly find its way through a maze. “A typical computer will try one path and get blocked and then try another and another and another,” he said. “A quantum computer can try all paths at the same time.”

    The trouble is that storing information in a quantum system for more than a short amount of time is very difficult, and this short “coherence time” leads to errors in calculations. But over the past two decades, Mr. Schoelkopf and other physicists have worked to solve this problem using what are called superconducting circuits. They have built qubits from materials that exhibit quantum properties when cooled to extremely low temperatures.

    With this technique, they have shown that, every three years or so, they can improve coherence times by a factor of 10. This is known as Schoelkopf’s Law, a playful ode to Moore’s Law, the rule that says the number of transistors on computer chips will double every two years.
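    Taking that 10x-every-three-years rule at face value, a short sketch can extrapolate coherence times. The 100-microsecond starting point is hypothetical, chosen only to show the shape of the curve.

```python
# 'Schoelkopf's Law' as stated above: coherence improves ~10x every 3 years.
# The starting value is hypothetical; this is an extrapolation, not data.

def projected_coherence_us(t0_us: float, years: float) -> float:
    """Coherence time after `years`, starting from t0_us microseconds."""
    return t0_us * 10 ** (years / 3)

for yrs in (3, 6, 9):
    print(f"after {yrs} years: {projected_coherence_us(100, yrs):,.0f} us")
```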

    2
    Professor Schoelkopf, left, and Prof. Michel Devoret working on a device that can reach extremely low temperatures to allow a quantum computing device to function. Credit Roger Kisby for The New York Times

    “Schoelkopf’s Law started as a joke, but now we use it in many of our research papers,” said Isaac Chuang, a professor at the Massachusetts Institute of Technology. “No one expected this would be possible, but the improvement has been exponential.”

    These superconducting circuits have become the primary area of quantum computing research across the industry. One of Mr. Schoelkopf’s former students now leads the quantum computing program at IBM. The founder of Rigetti Computing studied with Michel Devoret, one of the other Yale professors behind Quantum Circuits.

    In recent months, after grabbing a team of top researchers from the University of California, Santa Barbara, Google indicated it is on the verge of using this method to build a machine that can achieve “quantum supremacy” — when a quantum machine performs a task that would be impossible on your laptop or any other machine that obeys the laws of classical physics.

    There are other areas of research that show promise. Microsoft, for example, is betting on particles known as anyons. But superconducting circuits appear likely to be the first systems that will bear real fruit.

    The belief is that quantum machines will eventually analyze the interactions between physical molecules with a precision that is not possible today, something that could radically accelerate the development of new medications. Google and others also believe that these systems can significantly accelerate machine learning, the field of teaching computers to learn tasks on their own by analyzing data.

    A quantum computer could also break the encryption algorithms that guard the world’s most sensitive corporate and government data. With so much at stake, it is no surprise that so many companies are betting on this technology, including start-ups like Quantum Circuits.

    The deck is stacked against the smaller players, because the big-name companies have so much more money to throw at the problem. But start-ups have their own advantages, even in such a complex and expensive area of research.

    “Small teams of exceptional people can do exceptional things,” said Bill Coughran, who helped oversee the creation of Google’s vast internet infrastructure and is now investing in Mr. Schoelkopf’s company as a partner at Sequoia. “I have yet to see large teams inside big companies doing anything tremendously innovative.”

    Though Quantum Circuits is using the same quantum method as its bigger competitors, Mr. Schoelkopf argued that his company has an edge because it is tackling the problem differently. Rather than building one large quantum machine, it is constructing a series of tiny machines that can be networked together. He said this will make it easier to correct errors in quantum calculations — one of the main difficulties in building one of these complex machines.

    But each of the big companies insists that it holds an advantage — and each is loudly trumpeting its progress, even if a working machine is still years away.

    Mr. Coughran said that he and Sequoia envision Quantum Circuits evolving into a company that can deliver quantum computing to any business or researcher that needs it. Another investor, Canaan’s Brendan Dickinson, said that if a company like this develops a viable quantum machine, it will become a prime acquisition target.

    “The promise of a large quantum computer is incredibly powerful,” Mr. Dickinson said. “It will solve problems we can’t even imagine right now.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 1:30 pm on November 14, 2017 Permalink | Reply
    Tags: Applied Research & Technology, “Lamp-plaque” method, , FELs, Hyperspectral cameras are used for a wide range of monitoring applications including biomedical defense and ground-based air-based and space-based environmental sensing, Lights Camera Calibrate! Improving Space Cameras with a Better Model for Ultra-Bright Lamps, , , There’s an emerging market for hyperspectral sensors in general   

    From NIST: “Lights, Camera, Calibrate! Improving Space Cameras with a Better Model for Ultra-Bright Lamps” 


    NIST

    November 14, 2017
    Jennifer Lauren Lee

    1
    A standard FEL lamp, such as the one pictured here, is about the size of a person’s thumb. Credit: David Allen/NIST

    Studio photographers may be familiar with the 1,000-watt quartz halogen lamps known as “FELs.” Scientists use them too—specially calibrated ones, at least—to test the performance of light sensors that monitor Earth’s weather, plant life and oceans, often from space.

    A researcher at the National Institute of Standards and Technology (NIST) has recently made an improved mathematical model of the light output of FEL lamps. The new model, developed by NIST theorist Eric Shirley, will make the lamps more useful research tools, the scientists say, particularly for calibrating a relatively new class of cameras called hyperspectral imagers.

    Rainbow Vision

    Hyperspectral cameras are used for a wide range of monitoring applications, including biomedical, defense, and ground-based, air-based and space-based environmental sensing. While ordinary cameras only capture light in three bands of wavelengths—red, green and blue—hyperspectral imagers can be designed to see all the colors of the rainbow and beyond, including ultraviolet and infrared. Their increased range allows these cameras to reveal the distinctive signatures of processes that are invisible to the naked eye.
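    In data terms the difference is just the depth of the spectral axis. A minimal sketch, with made-up dimensions (the 200-band count is an assumption for illustration, not a spec from the article):

```python
import numpy as np

# An ordinary camera records 3 numbers per pixel; a hyperspectral imager
# records a whole spectrum. Image dimensions and band count are made up.

height, width = 480, 640
rgb_image = np.zeros((height, width, 3))             # red, green, blue
hyperspectral_cube = np.zeros((height, width, 200))  # e.g. 200 narrow bands

# Each pixel now carries a full spectrum rather than three values:
pixel_spectrum = hyperspectral_cube[0, 0, :]
print(pixel_spectrum.shape)  # (200,)
```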

    Some of these effects are subtle, however—such as when researchers are trying to tease out changes in ocean color, or to monitor plant growth, which helps them predict crop productivity.

    “These are both examples where you’re looking at an extremely small signal of just a couple percent total,” said David Allen of NIST’s Physical Measurement Laboratory (PML). In cases like this, achieving low uncertainties in the calibration of their detectors is essential.

    Of particular interest to Allen and his colleagues was a calibration technique called the “lamp-plaque” method, popular with scientists because it is relatively inexpensive and portable. For this calibration procedure, researchers use a standard FEL lamp. Incidentally, FEL is the name designated by the American National Standards Institute (ANSI) for these lamps. It is not an acronym.

    First, the lamp light shines onto a white, rectangular board called a reflectance plaque, made of a material that scatters more than 99 percent of the visible, ultraviolet and near-infrared light that hits it. Then, after bouncing off the plaque, the scattered light hits the camera being calibrated.

    The method has been used for decades to calibrate other kinds of sensors, which only need to see one point of light. Hyperspectral imagers, on the other hand, can distinguish shapes.

    “They have some field of view, like a camera,” Allen said. “That means that to calibrate them, you need something that illuminates a larger area.” And the trouble with the otherwise convenient lamp-plaque system is that the light bouncing off the plaque isn’t uniform: It’s brightest in the center and less intense toward the edges.

    The researchers could easily calculate the intensity of the light in the brightest spot, but they didn’t know exactly how that light falls off in brightness toward the plaque’s edges.
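    As a first-order intuition for that falloff, treating the lamp as an ideal point source makes the irradiance at each point on the plaque proportional to the cosine of the incidence angle over the squared distance to that point. To be clear, this toy model is not the NIST model described below, which accounts for the filament’s geometry; the distances here are hypothetical.

```python
# Toy model of the lamp-plaque falloff, treating the lamp as an ideal
# point source: irradiance at lateral offset r goes as d / (d^2 + r^2)**1.5.
# This is NOT the NIST filament model -- just a first-order illustration.

def relative_irradiance(r_cm: float, d_cm: float = 50.0) -> float:
    """Irradiance at offset r on the plaque, relative to the centre."""
    centre = 1.0 / d_cm**2
    off_axis = d_cm / (d_cm**2 + r_cm**2) ** 1.5
    return off_axis / centre

for r in (0, 5, 10, 15):   # offsets across a hypothetical plaque, in cm
    print(f"r = {r:2d} cm: {relative_irradiance(r):.3f}")
```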

    To lower the calibration uncertainties, researchers needed a better theoretical model of the lamp-plaque system.

    Counting Coils

    Shirley, the NIST theorist who took on this task, had to consider several parameters. One major contributor to the variations in intensity is the orientation of the lamp with respect to the plaque. FEL lamps have a filament that consists of a coiled coil—the shape that an old-fashioned telephone cord would make if wrapped around a finger. All that coiling means that light produced by one part of the filament can be physically blocked by other parts of the filament. Setting the lamp at an angle with respect to the plaque exacerbates this effect.

    2
    Close-up of an FEL lamp revealing its “coiled coil” filament. Behind the lamp is a white reflectance plaque like the ones used in calibrations. Credit: Jennifer Lauren Lee/NIST

    To model the system, Shirley took into account the diameter of the wire and both coils, the amount of space between each curve of the coils and the distance between the lamp and the plaque.

    “These are all things that were obvious,” Shirley said, “but they were not as appreciated before.”

    NIST scientists tested the actual output of some FEL lamp-plaque systems against what the model predicted and found good agreement. They say the uncertainties on light intensity across the entire plaque could now be as low as a fraction of a percent, down from about 10 to 15 percent.

    Moving forward, NIST will incorporate the new knowledge into its calibration service for hyperspectral imagers. The researchers are also preparing to publish their results and hope scientists will use the new model when doing their own calibrations. The work could also serve as a foundation for creating better detector specifications, potentially useful for U.S. manufacturers who build and sell the cameras.

    “There’s an emerging market for hyperspectral sensors in general,” Allen said. “They’re becoming more sophisticated, and this is a component to help them be a more robust product in an increasingly competitive market.”


    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    NIST Mission, Vision, Core Competencies, and Core Values

    NIST’s mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.
    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

     