Tagged: NIST

  • richardmitnick 11:59 am on August 14, 2019
    Tags: NIST, Particle number concentrations

    From NIST: “Solving the Big Problem of Measuring Tiny Nanoparticles” 


    From NIST

    August 14, 2019

    Alison Gillespie
    alison.gillespie@nist.gov
    (301) 975-2316

    Scientists have long debated the most effective way to measure nanoparticles so that results can be shared across labs. NIST researchers have found that one approach — particle number concentrations — may be better than others for most applications. Credit: N. Hanacek/NIST

    Tiny nanoparticles play a gargantuan role in modern life, even if most consumers are unaware of their presence. They provide essential ingredients in sunscreen lotions, prevent athlete’s foot fungus in socks, and fight microbes on bandages. They enhance the colors of popular candies and keep the powdered sugar on doughnuts powdery. They are even used in advanced drugs that target specific types of cells in cancer treatments.

    When chemists analyze a sample, however, it is challenging to measure the sizes and quantities of these particles — which are often 100,000 times smaller than the thickness of a piece of paper. Technology offers many options for assessing nanoparticles, but experts have not reached a consensus on which technique is best.

    In a new paper from the National Institute of Standards and Technology (NIST) and collaborating institutions, researchers have concluded that measuring the range of sizes in nanoparticles — instead of just the average particle size — is optimal for most applications.

    “It seems like a simple choice,” said NIST’s Elijah Petersen, the lead author of the paper, which was published today in Environmental Science: Nano. “But it can have a big impact on the outcome of your assessment.”

    As with many measurement questions, precision is key. Exposure to a certain amount of some nanoparticles could have adverse effects. Pharmaceutical researchers often need exactitude to maximize a drug’s efficacy. And environmental scientists need to know, for example, how many nanoparticles of gold, silver or titanium could potentially cause a risk to organisms in soil or water.

    Using more nanoparticles than needed in a product because of inconsistent measurements could also waste money for manufacturers.

    Although they might sound ultramodern, nanoparticles are neither new nor based solely on high-tech manufacturing processes. A nanoparticle is really just a submicroscopic particle that measures less than 100 nanometers on at least one of its dimensions. It would be possible to place hundreds of thousands of them onto the head of a pin. They are exciting to researchers because many materials act differently at the nanometer scale than they do at larger scales, and nanoparticles can be made to do lots of useful things.

    Nanoparticles have been in use since the days of ancient Mesopotamia, when ceramic artists used extremely small bits of metal to decorate vases and other vessels. In fourth-century Rome, glass artisans ground metal into tiny particles to change the color of their wares under different lighting. These techniques were forgotten for a time, then rediscovered by resourceful glassmakers in the 1600s. In the 1850s, the scientist Michael Faraday extensively researched how various wash mixes changed the behavior of gold particles.

    Modern nanoparticle research advanced quickly in the mid-20th century due to technological innovations in optics. Being able to see the individual particles and study their behavior expanded the possibilities for experimentation. The largest advances came, however, after experimental nanotechnology took off in the 1990s. Suddenly, the behavior of single particles of gold and many other substances could be closely examined and manipulated. Discoveries about the ways that small amounts of a substance would reflect light, absorb light, or change in behavior were numerous, leading to the incorporation of nanoparticles into many more products.

    Debates have since followed about their measurement. When assessing the response of cells or organisms to nanoparticles, some researchers prefer measuring particle number concentrations (sometimes called PNCs by scientists). Many find PNCs challenging since extra formulas must be employed when determining the final measurement. Others prefer measuring mass or surface area concentrations.

    PNCs are often used for characterizing metals in chemistry. The situation for nanoparticles is inherently more complex, however, than it is for dissolved organic or inorganic substances because unlike dissolved chemicals, nanoparticles can come in a wide variety of sizes and sometimes stick together when added to testing materials.

    “If you have a dissolved chemical, it’s always going to have the same molecular formula, by definition,” Petersen says. “Nanoparticles don’t just have a certain number of atoms, however. Some will be 9 nanometers, some will be 11, some might be 18, and some might be 3.”

    The problem is that each of those particles may be fulfilling an important role. While a simple estimate of particle number is perfectly fine for some industrial applications, therapeutic applications require much more robust measurement. In the case of cancer therapies, for example, each particle, no matter how big or small, may be delivering a needed antidote. And just as with any other kind of dosage, nanoparticle dosage must be exact in order to be safe and effective.

    Using the full range of particle sizes to calculate the PNC will be the most helpful approach in most cases, said Petersen. Rather than relying on a single mean or average, the size distribution records the complete spread of particle sizes, so formulas can then be used to work out how many particles are in a sample.
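
    As a rough illustration of how such a calculation works (a generic sketch, not the method from the paper), the snippet below converts a hypothetical mass concentration and a measured size distribution of spherical gold nanoparticles into a particle number concentration. All of the numbers are made-up example values.

        # Illustrative sketch (not the paper's method): converting a measured mass
        # concentration and a particle-size distribution into a particle number
        # concentration (PNC), assuming spherical gold nanoparticles.
        # The diameters, fractions and mass concentration are made-up example values.
        import numpy as np

        rho_gold = 19300.0                 # density of gold, kg/m^3
        mass_conc = 50e-6                  # example mass concentration, kg/m^3 (50 ug/L)

        diameters_nm = np.array([3, 9, 11, 18])          # measured size bins (nm)
        fractions    = np.array([0.1, 0.4, 0.35, 0.15])  # number fraction in each bin

        radii_m = diameters_nm * 1e-9 / 2
        mean_particle_mass = np.sum(fractions * rho_gold * (4/3) * np.pi * radii_m**3)

        pnc = mass_conc / mean_particle_mass             # particles per cubic meter
        print(f"Estimated PNC: {pnc:.2e} particles/m^3")

        # Using only the mean diameter instead of the full distribution gives a
        # different answer, because particle mass scales with the diameter cubed.
        mean_d = np.sum(fractions * diameters_nm) * 1e-9
        pnc_mean_only = mass_conc / (rho_gold * (4/3) * np.pi * (mean_d/2)**3)
        print(f"PNC from mean size only: {pnc_mean_only:.2e} particles/m^3")

    Because a particle's mass scales with the cube of its diameter, collapsing the distribution to a single average size can noticeably skew the count, which is the point Petersen makes above.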

    But no matter which approach is used, researchers need to make note of it in their papers, for the sake of comparability with other studies. “Don’t assume that different approaches will give you the same result,” he said.

    Petersen adds that he and his colleagues were surprised by how much the coatings on nanoparticles could impact measurement. Some coatings, he noted, can have a positive electrical charge, causing clumping.

    Petersen worked in collaboration with researchers from federal laboratories in Switzerland, and with scientists from 3M who have previously made many nanoparticle measurements for use in industrial settings. Researchers from Switzerland, like those in much of the rest of Europe, are keen to learn more about measuring nanoparticles because PNCs are required in many regulatory situations. There hasn’t been much information on which techniques are best or more likely to yield the most precise results across many applications.

    “Until now we didn’t even know if we could find agreement among labs about particle number concentrations,” Petersen says. “They are complex. But now we are beginning to see it can be done.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    NIST Mission, Vision, Core Competencies, and Core Values

    NIST’s mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.
    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

     
  • richardmitnick 3:32 pm on May 2, 2019
    Tags: "Sculpting Super-Fast Light Pulses: NIST Nanopillars Shape Light Precisely for Practical Applications", Essential for sending information through high-speed optical circuits and in probing atoms and molecules that vibrate thousands of trillions of times a second, NIST

    From NIST: “Sculpting Super-Fast Light Pulses: NIST Nanopillars Shape Light Precisely for Practical Applications” 


    From NIST

    May 02, 2019

    Jennifer Huergo
    jennifer.huergo@nist.gov
    (301) 975-6343

    Imagine being able to shape a pulse of light in any conceivable manner—compressing it, stretching it, splitting it in two, changing its intensity or altering the direction of its electric field.

    Controlling the properties of ultrafast light pulses is essential for sending information through high-speed optical circuits and in probing atoms and molecules that vibrate thousands of trillions of times a second. But the standard method of pulse shaping—using devices known as spatial light modulators—is costly, bulky and lacks the fine control scientists increasingly need. In addition, these devices are typically based on liquid crystals that can be damaged by the very same pulses of high intensity laser light they were designed to shape.

    Schematic shows a novel technique to reshape the properties of an ultrafast light pulse. An incoming pulse of light (left) is dispersed into its various constituent frequencies, or colors, and directed into a metasurface composed of millions of tiny silicon pillars and an integrated polarizer. The nanopillars are specifically designed to simultaneously and independently shape such properties of each frequency component as its amplitude, phase or polarization. The transmitted beam is then recombined to achieve a new shape-modified pulse (right). Credit: S. Kelley/NIST

    Now researchers at the National Institute of Standards and Technology (NIST) and the University of Maryland’s NanoCenter in College Park have developed a novel and compact method of sculpting light. They first deposited a layer of ultrathin silicon on glass, just a few hundred nanometers (billionths of a meter) thick, and then covered an array of millions of tiny squares of the silicon with a protective material. By etching away the silicon surrounding each square, the team created millions of tiny pillars, which played a key role in the light sculpting technique.

    The flat, ultrathin device is an example of a metasurface, which is used to change the properties of a light wave traveling through it. By carefully designing the shape, size, density and distribution of the nanopillars, multiple properties of each light pulse can now be tailored simultaneously and independently with nanoscale precision. These properties include the amplitude, phase and polarization of the wave.

    A light wave, a set of oscillating electric and magnetic fields oriented at right angles to each other, has peaks and troughs similar to an ocean wave. If you’re standing in the ocean, the frequency of the wave is how often the peaks or troughs travel past you, the amplitude is the height of the waves (trough to peak), and the phase is where you are relative to the peaks and troughs.

    “We figured out how to independently and simultaneously manipulate the phase and amplitude of each frequency component of an ultrafast laser pulse,” said Amit Agrawal, of NIST and the NanoCenter. “To achieve this, we used carefully designed sets of silicon nanopillars, one for each constituent color in the pulse, and an integrated polarizer fabricated on the back of the device.”

    When a light wave travels through a set of the silicon nanopillars, the wave slows down compared with its speed in air and its phase is delayed—the moment when the wave reaches its next peak is slightly later than the time at which the wave would have reached its next peak in air. The size of the nanopillars determines the amount by which the phase changes, whereas the orientation of the nanopillars changes the light wave’s polarization. When a device known as a polarizer is attached to the back of the silicon, the change in polarization can be translated to a corresponding change in amplitude.
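
    A back-of-the-envelope sketch of those two effects, using invented numbers rather than the actual pillar geometries: the extra phase picked up in a pillar of a given height and effective refractive index, and the amplitude that survives a polarizer after the polarization has been rotated (Malus's law).

        # Sketch of the two effects described above, with made-up numbers
        # (the actual NIST pillar geometries are not reproduced here).
        import numpy as np

        wavelength = 800e-9        # light wavelength in vacuum, m
        pillar_height = 600e-9     # hypothetical nanopillar height, m
        n_eff = 2.5                # hypothetical effective refractive index of the pillar
        n_air = 1.0

        # Extra phase accumulated relative to traveling the same distance in air:
        delta_phase = 2 * np.pi * (n_eff - n_air) * pillar_height / wavelength
        print(f"Phase delay: {np.degrees(delta_phase):.1f} degrees")

        # If the pillar also rotates the polarization by an angle theta, a polarizer
        # on the back face passes only the projected component (Malus's law, amplitude):
        theta = np.radians(30)
        relative_amplitude = np.cos(theta)
        print(f"Transmitted amplitude fraction: {relative_amplitude:.2f}")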

    A more detailed schematic of the pulse shaping setup. An incoming pulse of light (left) diffracts off a grating, which disperses the pulse into its various frequencies, or colors. A parabolic mirror then redirects the dispersed light into a silicon surface etched with millions of tiny pillars. The nanopillars are specifically designed to simultaneously and independently shape such properties of each frequency component as its amplitude, phase or polarization. A second parabolic mirror and diffraction grating then recombine the separated components into a newly formed pulse (right). Credit: T. Xu/Nanjing University

    Altering the phase, amplitude or polarization of a light wave in a highly controlled manner can be used to encode information. The rapid, finely tuned changes can also be used to study and change the outcome of chemical or biological processes. For instance, alterations in an incoming light pulse could increase or decrease the product of a chemical reaction. In these ways, the nanopillar method promises to open new vistas in the study of ultrafast phenomena and high-speed communication.

    Agrawal, along with Henri Lezec of NIST and their collaborators, describe the findings online today in the journal Science.

    “We wanted to extend the impact of metasurfaces beyond their typical application—changing the shape of an optical wavefront spatially—and use them instead to change how the light pulse varies in time,” said Lezec.

    A typical ultrafast laser light pulse lasts for only a few femtoseconds (a femtosecond is one thousandth of a trillionth of a second), too short for any device to shape the light at one particular instant. Instead, Agrawal, Lezec and their colleagues devised a strategy to shape the individual frequency components or colors that make up the pulse by first separating the light into those components with an optical device called a diffraction grating.

    Each color has a different intensity or amplitude—similar to the way a musical chord is composed of individual notes played at different volumes. When directed into the nanopillar-etched silicon surface, different frequency components struck different sets of nanopillars. Each set of nanopillars was tailored to alter the phase, intensity or electric field orientation (polarization) of components in a particular way. A second diffraction grating then recombined all the components to create the newly shaped pulse.
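
    The frequency-domain recipe described above can be sketched numerically: transform a short pulse into its spectrum, multiply each frequency component by an amplitude and phase mask (the job done physically by the nanopillar sets), and transform back. The mask below is an arbitrary example chosen to stretch the pulse, not the actual metasurface design.

        # Minimal numerical sketch of frequency-domain pulse shaping: disperse the
        # pulse into its frequency components, apply an amplitude/phase mask to each
        # component, and recombine.  The mask is an arbitrary example.
        import numpy as np

        t = np.linspace(-1e-12, 1e-12, 8192)              # time axis, seconds
        pulse = np.exp(-t**2 / (2 * (10e-15)**2))         # ~10 fs Gaussian envelope

        spectrum = np.fft.fft(pulse)                      # first grating: to frequency domain
        freqs = np.fft.fftfreq(t.size, d=t[1] - t[0])

        # Example mask: gently attenuate the amplitude and add a quadratic spectral
        # phase, which chirps (stretches) the pulse in time.
        amplitude_mask = np.exp(-(freqs / 60e12)**2)
        phase_mask = np.exp(1j * 2e-28 * (2 * np.pi * freqs)**2)
        shaped_spectrum = spectrum * amplitude_mask * phase_mask

        shaped_pulse = np.fft.ifft(shaped_spectrum)       # second grating: recombine
        print("Peak intensity ratio (shaped/original):",
              round(np.max(np.abs(shaped_pulse))**2 / np.max(np.abs(pulse))**2, 3))

    Here the quadratic spectral phase spreads the pulse out in time and lowers its peak intensity, one simple instance of the kind of controllable reshaping described in the article.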

    The researchers designed their nanopillar system to work with ultrafast light pulses (10 femtoseconds or less, equivalent to one hundredth of a trillionth of a second) composed of a broad range of frequency components that span wavelengths from 700 nanometers (visible red light) to 900 nanometers (near-infrared). By simultaneously and independently altering the amplitude and phase of these frequency components, the scientists demonstrated that their method could compress, split and distort pulses in a controllable manner.
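
    A quick check of the numbers quoted above (a back-of-the-envelope sketch, not a calculation from the paper): the optical bandwidth between 700 nm and 900 nm, and the shortest Gaussian pulse that bandwidth can support using the standard time-bandwidth product of about 0.44, which lands comfortably under the 10-femtosecond figure.

        # Back-of-envelope check of the quoted numbers: bandwidth of a 700-900 nm
        # pulse and the corresponding transform-limited Gaussian duration.
        c = 2.998e8                                   # speed of light, m/s
        f_high = c / 700e-9                           # ~428 THz
        f_low = c / 900e-9                            # ~333 THz
        bandwidth = f_high - f_low                    # ~95 THz
        print(f"Bandwidth: {bandwidth/1e12:.0f} THz")
        print(f"Transform-limited duration: {0.441 / bandwidth * 1e15:.1f} fs")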

    Further refinements in the device will give scientists additional control over the time evolution of light pulses and may enable researchers to shape in exquisite detail individual lines in a frequency comb, a precise tool for measuring the frequencies of light used in such devices as atomic clocks and for identifying planets around distant stars.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA


     
  • richardmitnick 11:42 am on March 22, 2019
    Tags: NIST, Ultraviolet light-emitting diodes

    From NIST: “NIST Researchers Boost Intensity of Nanowire LEDs” 


    From NIST

    March 21, 2019

    Laura Ost
    laura.ost@nist.gov
    (303) 497-4880

    Model of nanowire-based light-emitting diode showing that adding a bit of aluminum to the shell layer (black) directs all recombination of electrons and holes (spaces for electrons) into the nanowire core (multicolored region), producing intense light. Credit: NIST

    Nanowire gurus at the National Institute of Standards and Technology (NIST) have made ultraviolet light-emitting diodes (LEDs) that, thanks to a special type of shell, produce five times higher light intensity than do comparable LEDs based on a simpler shell design.

    Ultraviolet LEDs are used in a growing number of applications such as polymer curing, water purification and medical disinfection. Micro-LEDs are also of interest for visual displays. NIST staff are experimenting with nanowire-based LEDs for scanning-probe tips intended for electronics and biology applications.

    The new, brighter LEDs are an outcome of NIST’s expertise in making high-quality gallium nitride (GaN) nanowires. Lately, researchers have been experimenting with nanowire cores made of silicon-doped GaN, which has extra electrons, surrounded by shells made of magnesium-doped GaN, which has a surplus of “holes” for missing electrons. When an electron and a hole combine, energy is released as light, a process known as electroluminescence.
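
    The connection between electroluminescence and ultraviolet output comes down to GaN's bandgap: the photon released when an electron and hole recombine carries roughly the bandgap energy. The quick calculation below uses the textbook room-temperature value of about 3.4 eV for GaN, a figure not stated in the article.

        # Why GaN nanowire LEDs emit ultraviolet light: the photon from electron-hole
        # recombination carries roughly the bandgap energy.  GaN's ~3.4 eV bandgap is
        # a textbook value, not taken from the article.
        h = 6.626e-34          # Planck constant, J*s
        c = 2.998e8            # speed of light, m/s
        eV = 1.602e-19         # joules per electronvolt

        bandgap = 3.4 * eV
        wavelength = h * c / bandgap
        print(f"Emission wavelength: {wavelength*1e9:.0f} nm")   # ~365 nm, in the ultraviolet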

    The NIST group previously demonstrated GaN LEDs that produced light attributed to electrons injected into the shell layer to recombine with holes. The new LEDs have a tiny bit of aluminum added to the shell layer, which reduces losses from electron overflow and light reabsorption.

    As described in the journal Nanotechnology, the brighter LEDs are fabricated from nanowires with a so-called “p-i-n” structure, a tri-layer design that injects electrons and holes into the nanowire. The addition of aluminum to the shell helps confine electrons to the nanowire core, boosting the electroluminescence fivefold.

    “The role of the aluminum is to introduce an asymmetry in the electrical current that prevents electrons from flowing into the shell layer, which would reduce efficiency, and instead confines electrons and holes to the nanowire core,” first author Matt Brubaker said.

    The nanowire test structures were about 440 nanometers (nm) long with a shell thickness of about 40 nm. The final LEDs, including the shells, were almost 10 times larger. Researchers found that the amount of aluminum incorporated into fabricated structures depends on nanowire diameter.

    Group leader Kris Bertness said at least two companies are developing micro-LEDs based on nanowires, and NIST has a Cooperative Research and Development Agreement with one of them to develop dopant and structural characterization methods. The researchers have had preliminary discussions with scanning-probe companies about using NIST LEDs in their probe tips, and NIST plans to demonstrate prototype LED tools soon.

    The NIST team holds U.S. Patent 8,484,756 on an instrument that combines microwave scanning probe microscopy with an LED for nondestructive, contactless testing of material quality for important semiconductor nanostructures such as transistor channels and individual grains in solar cells. The probe could also be used for biological research on protein unfolding and cell structure.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA


     
  • richardmitnick 12:44 pm on November 29, 2018
    Tags: NIST

    From NIST: “NIST Atomic Clocks Now Keep Time Well Enough to Improve Models of Earth” 


    From NIST

    November 28, 2018

    Experimental atomic clocks at the National Institute of Standards and Technology (NIST) have achieved three new performance records, now ticking precisely enough to not only improve timekeeping and navigation, but also detect faint signals from gravity, the early universe and perhaps even dark matter.

    The clocks each trap a thousand ytterbium atoms in optical lattices, grids made of laser beams. The atoms tick by vibrating or switching between two energy levels. By comparing two independent clocks, NIST physicists achieved record performance in three important measures: systematic uncertainty, stability and reproducibility.

    NIST physicist Andrew Ludlow and colleagues achieved new atomic clock performance records in a comparison of two ytterbium optical lattice clocks. Laser systems used in both clocks are visible in the foreground, and the main apparatus for one of the clocks is located behind Ludlow.
    Credit: Burrus/NIST

    Published online today in the journal Nature, the new NIST clock records are:

    Systematic uncertainty: How well the clock represents the natural vibrations, or frequency, of the atoms. NIST researchers found that each clock ticked at a rate matching the natural frequency to within a possible error of just 1.4 parts in 10^18—about one billionth of a billionth.
    Stability: How much the clock’s frequency changes over a specified time interval, measured to a level of 3.2 parts in 10^19 (or 0.00000000000000000032) over a day.
    Reproducibility: How closely the two clocks tick at the same frequency, shown by 10 comparisons of the clock pair, yielding a frequency difference below the 10^-18 level (again, less than one billionth of a billionth).

    “Systematic uncertainty, stability, and reproducibility can be considered the ‘royal flush’ of performance for these clocks,” project leader Andrew Ludlow said. “The agreement of the two clocks at this unprecedented level, which we call reproducibility, is perhaps the single most important result, because it essentially requires and substantiates the other two results.”

    “This is especially true because the demonstrated reproducibility shows that the clocks’ total error drops below our general ability to account for gravity’s effect on time here on Earth. Hence, as we envision clocks like these being used around the country or world, their relative performance would be, for the first time, limited by Earth’s gravitational effects.”

    Einstein’s theory of relativity predicts that an atomic clock’s ticking, that is, the frequency of the atoms’ vibrations, is reduced—shifted toward the red end of the electromagnetic spectrum—when observed in stronger gravity. That is, time passes more slowly at lower elevations.

    While these so-called redshifts degrade a clock’s timekeeping, this same sensitivity can be turned on its head to exquisitely measure gravity. Super-sensitive clocks can map the gravitational distortion of space-time more precisely than ever. Applications include relativistic geodesy, which measures the Earth’s gravitational shape, and detecting signals from the early universe such as gravitational waves and perhaps even as-yet-unexplained dark matter.

    NIST’s ytterbium clocks now exceed the conventional capability to measure the geoid, or the shape of the Earth based on tidal gauge surveys of sea level. Comparisons of such clocks located far apart, such as on different continents, could resolve geodetic measurements to within 1 centimeter, better than the current state of the art of several centimeters.
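
    Two standard back-of-the-envelope numbers help put those figures in context (these are generic relativity and unit conversions, not calculations from the paper): the fractional frequency shift produced by one centimeter of elevation near Earth's surface, and how far a clock with a 1.4-parts-in-10^18 systematic uncertainty could drift over the age of the universe.

        # Two standard back-of-envelope numbers that put the quoted performance in context.
        g = 9.81               # m/s^2, surface gravity
        c = 2.998e8            # m/s, speed of light

        # Gravitational redshift per centimeter of elevation: delta_f / f = g*h / c^2
        shift_per_cm = g * 0.01 / c**2
        print(f"Fractional shift per cm of height: {shift_per_cm:.2e}")   # ~1.1e-18

        # Drift implied by a 1.4e-18 systematic uncertainty over the age of the universe:
        age_universe_s = 13.8e9 * 3.156e7       # ~13.8 billion years in seconds
        print(f"Accumulated error: {1.4e-18 * age_universe_s:.2f} s")     # well under a second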

    In the past decade of new clock performance records announced by NIST and other labs around the world, this latest paper showcases reproducibility at a high level, the researchers say. Furthermore, the comparison of two clocks is the traditional method of evaluating performance.

    Among the improvements in NIST’s latest ytterbium clocks was the inclusion of thermal and electric shielding, which surround the atoms to protect them from stray electric fields and enable researchers to better characterize and correct for frequency shifts caused by heat radiation.

    The ytterbium atom is among potential candidates for the future redefinition of the second—the international unit of time—in terms of optical frequencies. NIST’s new clock records meet one of the international redefinition roadmap’s requirements, a 100-fold improvement in validated accuracy over the best clocks based on the current standard, the cesium atom, which vibrates at lower microwave frequencies.
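
    The gap between microwave and optical clocks is easiest to see as a frequency ratio. The cesium hyperfine frequency is exact by definition of the SI second; the ytterbium clock-transition frequency of roughly 518 THz is an approximate literature value, not taken from the article.

        # Why optical clocks can outperform cesium: the "pendulum" swings far faster.
        # The cesium frequency is exact by definition of the SI second; the ytterbium
        # clock-transition frequency (~518 THz) is an approximate literature value.
        f_cesium = 9_192_631_770        # Hz, microwave
        f_ytterbium = 518e12            # Hz, optical (approximate)
        print(f"Frequency ratio: {f_ytterbium / f_cesium:.0f}")   # tens of thousands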

    NIST is building a portable ytterbium lattice clock with state-of-the-art performance that could be transported to other labs around the world for clock comparisons and to other locations to explore relativistic geodesy techniques.

    The work is supported by NIST, the National Aeronautics and Space Administration and the Defense Advanced Research Projects Agency.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA


     
  • richardmitnick 3:55 pm on August 22, 2018
    Tags: NIST

    From NIST: “Connecting the (Nano) Dots: NIST Says Big-Picture Thinking Can Advance Nanoparticle Manufacturing” 


    From NIST

    August 22, 2018
    Michael E. Newman
    michael.newman@nist.gov
    (301) 975-3025

    Electron micrograph showing gallium arsenide nanoparticles of varying shapes and sizes. Such heterogeneity can increase costs and limit profits when making nanoparticles into products. A new NIST study recommends that researchers, manufacturers and administrators work together to solve this and other common problems in nanoparticle manufacturing.
    Credit: A. Demotiere, E. Shevchenko/Argonne National Laboratory

    Nanoparticle manufacturing, the production of material units less than 100 nanometers in size (100,000 times smaller than a marble), is proving the adage that “good things come in small packages.” Today’s engineered nanoparticles are integral components of everything from the quantum dot nanocrystals coloring the brilliant displays of state-of-the-art televisions to the miniscule bits of silver helping bandages protect against infection. However, commercial ventures seeking to profit from these tiny building blocks face quality control issues that, if unaddressed, can reduce efficiency, increase production costs and limit commercial impact of the products that incorporate them.

    To help overcome these obstacles, the National Institute of Standards and Technology (NIST) and the nonprofit World Technology Evaluation Center (WTEC) advocate that nanoparticle researchers, manufacturers and administrators “connect the dots” by considering their shared challenges broadly and tackling them collectively rather than individually. This includes transferring knowledge across disciplines, coordinating actions between organizations and sharing resources to facilitate solutions.

    The recommendations are presented in a new paper in the journal ACS Applied Nano Materials.

    “We looked at the big picture of nanoparticle manufacturing to identify problems that are common for different materials, processes and applications,” said NIST physical scientist Samuel Stavis, lead author of the paper. “Solving these problems could advance the entire enterprise.”

    The new paper provides a framework to better understand these issues. It is the culmination of a study initiated by a workshop organized by NIST that focused on the fundamental challenge of reducing or mitigating heterogeneity, the inadvertent variations in nanoparticle size, shape and other characteristics that occur during their manufacture.

    “Heterogeneity can have significant consequences in nanoparticle manufacturing,” said NIST chemical engineer and co-author Jeffrey Fagan.

    In their paper, the authors noted that the most profitable innovations in nanoparticle manufacturing minimize heterogeneity during the early stages of the operation, reducing the need for subsequent processing. This decreases waste, simplifies characterization and improves the integration of nanoparticles into products, all of which save money.

    The authors illustrated the point by comparing the production of gold nanoparticles and carbon nanotubes. For gold, they stated, the initial synthesis costs can be high, but the similarity of the nanoparticles produced requires less purification and characterization. Therefore, they can be made into a variety of products, such as sensors, at relatively low costs.

    In contrast, the more heterogeneous carbon nanotubes are less expensive to synthesize but require more processing to yield those with desired properties. The added costs during manufacturing currently make nanotubes only practical for high-value applications such as digital logic devices.

    “Although these nanoparticles and their end products are very different, the stakeholders in their manufacture can learn much from each other’s best practices,” said NIST materials scientist and co-author J. Alexander Liddle. “By sharing knowledge, they might be able to improve both seemingly disparate operations.”

    Finding ways like this to connect the dots, the authors said, is critically important for new ventures seeking to transfer nanoparticle technologies from laboratory to market.

    “Nanoparticle manufacturing can become so costly that funding expires before the end product can be commercialized,” said WTEC nanotechnology consultant and co-author Michael Stopa. “In our paper, we outlined several opportunities for improving the odds that new ventures will survive their journeys through this technology transfer ‘valley of death.’”

    Finally, the authors considered how manufacturing challenges and innovations are affecting the ever-growing number of applications for nanoparticles, including those in the areas of electronics, energy, health care and materials.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA


     
  • richardmitnick 10:17 am on January 10, 2018
    Tags: EIT spectrometer, New NIST Spectrometer Measures Single Photons with Great Precision, NIST

    From NIST: “New NIST Spectrometer Measures Single Photons with Great Precision” 


    NIST

    December 18, 2017 [Missed the first time around social media.]

    Chad Boutin
    boutin@nist.gov
    (301) 975-4261

    A laser strikes a gas of atoms, which vibrate and block much of the light. A second laser is tuned to nearly the same wavelength as the first. The interference between the beams creates a narrow “hole” of transparency through which photons of a specific wavelength can pass. Making fine adjustments to the second laser’s wavelength moves the hole, enabling highly precise measurements of the photon’s wavelength. Credit: Irvine/NIST


    Future communications networks that are less vulnerable to hacking could be closer to reality with an invention that measures the properties of single-photon sources with high accuracy.

    Built by scientists at the National Institute of Standards and Technology (NIST), the device could help bring about “quantum communications” networks, which would use individual particles of light to send bits of information. Because each bit of information can be embedded in the quantum properties of a single photon, the laws of quantum mechanics make it difficult, if not impossible, for an enemy to intercept the message undetected.

    Both the telecommunications and computer industries would like such networks to keep information secure. The NIST method may help overcome one of the technical barriers standing in their way by measuring photons’ spectral properties—essentially their color—10,000 times better than conventional spectrometers.

    Individual photons have a limitation: They cannot travel through fiber-optic cables for more than about 100 kilometers (about 60 miles) without likely being absorbed. A quantum network able to handle worldwide communications would need periodic way stations that could catch photons and retransmit their information without loss. The NIST team’s invention could help such a “quantum repeater” interact effectively with photons.

    Key to the operation of the quantum repeater would be a memory component that uses an ensemble of atoms to store the photon’s information briefly and retransmit it at the right moment. Its operation would involve an atom’s energy structure: As an atom catches the photon, the atom’s energy level rises to a higher state. At the desired moment, the atom returns to its original state and emits the energy as another photon.

    Not just any photon can readily interact with this atom, though. It needs to be exactly the right color, or wavelength, to be absorbed by the atom. To make usable repeaters, engineers need to measure photons’ wavelengths far more precisely than conventional spectrometers can.

    The NIST team goes past convention with a technique called electromagnetically induced transparency (EIT), which starts out by using atoms’ ability to block light of a specific wavelength.

    Astronomers can tell what gases form the atmosphere of a far-off world because light passing through it makes the gas molecules vibrate at frequencies that block out light of particular colors, creating telltale dark lines in the light’s spectrum. EIT essentially creates a single dark line by beaming a laser at atoms whose vibrations block much of its light. A second laser, tuned to nearly the same wavelength as the first, is directed at the same atoms, and the interference between these two nearly identical beams alters the darkness. Instead of a simple dark line, it creates a line with a narrow transparent hole through which only photons of an extremely specific wavelength can pass.

    By making fine adjustments to the second laser’s wavelength, the team found it could move the hole back and forth across the dark line’s width, giving them a way to make highly precise measurements of a passing photon’s wavelength.
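
    The measurement idea can be sketched with a toy model (an illustration of the concept only, not the NIST instrument): a broad absorption line containing a narrow transparency window whose center follows the control laser. Sweeping the window and recording where a photon gets through pins down its frequency. The linewidths and detunings below are invented.

        # Toy model of the scanning-transparency-window idea described above
        # (conceptual only; linewidths and detunings are invented).
        import numpy as np

        def transmission(detuning_hz, window_center_hz,
                         broad_width=1e9, window_width=1e5):
            """Toy spectrum: broad Lorentzian absorption with a narrow transparency hole."""
            absorption = 1.0 / (1.0 + (detuning_hz / broad_width)**2)
            hole = 1.0 / (1.0 + ((detuning_hz - window_center_hz) / window_width)**2)
            return 1.0 - absorption * (1.0 - hole)

        photon_detuning = 2.37e5        # unknown photon frequency offset to be found, Hz
        scan = np.linspace(-1e6, 1e6, 2001)           # sweep the control-laser setting
        signal = transmission(photon_detuning, scan)  # transmitted fraction at each setting
        estimate = scan[np.argmax(signal)]
        print(f"Estimated photon detuning: {estimate:.3e} Hz")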

    To give a sense of how precise their spectrometer is, the team gave the example of a common laser pointer that shines in a single narrow color range, creating a pure-colored point on a screen. The typical spectrum width of a laser pointer is right around 1 terahertz (THz). The NIST invention can measure the color of a single-photon-level signal that has a spectrum 10 million times narrower than the laser pointer, resulting in a performance 10,000 times better than typical conventional spectrometers.

    “Additionally, we can extend our EIT spectrometer’s performance to any other wavelength range using other processes developed by our group without sacrificing its spectral resolution, high wavelength accuracy and high detection sensitivity,” said Lijun Ma, an optical engineer on the NIST team. “We think this will give the industry the tool it needs to build effective quantum repeaters.”

    Science paper:
    L. Ma, O. Slattery and X. Tang. Spectral characterization of single photon sources with ultra-high resolution, accuracy and sensitivity, Optics Express

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA


     
  • richardmitnick 10:56 am on January 8, 2018
    Tags: Boosting Communications with Magnetic Radio, Low-frequency “magnetic radio” to enable better wireless communications underground underwater and in other environments that are challenging for conventional RF communications, NIST, OPM-optically pumped magnetometer

    From Optics & Photonics: “Boosting Communications with Magnetic Radio” 

    Optics & Photonics

    07 January 2018
    Stewart Wills

    Dave Howe of the U.S. National Institute of Standards and Technology (NIST) leads an effort to use optically pumped magnetometers (OPMs) as receivers for low-frequency “magnetic radio,” to enable better wireless communications underground, underwater and in other environments that are challenging for conventional RF communications. [Image: Burrus/NIST]

    Researchers at the U.S. National Institute of Standards and Technology (NIST) are working on ways to overcome a vexing problem of a wireless world: how to get usable signals in crowded built environments, underground and even underwater (Rev. Sci. Instr., doi: 10.1063/1.5003821).

    At the heart of the team’s approach is an optically pumped magnetometer (OPM)—a highly sensitive, room-temperature quantum detector that can do double duty as a kind of magnetic-radio receiver. In the NIST setup, the OPM is used to pull in modulated signals encoded in very low frequency (VLF) magnetic fields, which are less prone to attenuation by the surrounding environment than a typical cell or GPS signal. And the team has coupled that sensitive detection technology with a modulation scheme that can pack more information into the otherwise limited bandwidth of VLF fields.

    The skin-depth conundrum

    Anyone who’s struggled to get a decent cell signal in a building basement knows the penetration limits of the high-frequency RF bands used by mobile phones. But the problem goes much further than simple inconvenience. For example, high-frequency-signal attenuation prevents military submarines and underground surveying operations from taking advantage of GPS location, and can cause GPS to cut out in the dense, skyscraper-lined canyons of urban downtowns. And it can block wireless communication among first responders picking their way through debris or rubble-cluttered disaster scenes.

    Technically, those limitations relate to a quantity known as the skin depth—the depth in a material at which an AC electromagnetic field becomes attenuated to 1/e of its original strength. The skin depth is inversely proportional to the square root of the signal frequency (as well as the conductivity and relative permeability of the material the electromagnetic field is trying to penetrate). That means, for a given material, that the penetration depth for a signal in the gigahertz (GHz) band typical of modern wireless communications can be three orders of magnitude smaller than for a VLF channel in the kilohertz (kHz) range.
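
    The standard good-conductor skin-depth formula makes the comparison concrete. The sketch below evaluates it for seawater as an example medium (conductivity of roughly 4 S/m, relative permeability of about 1); the specific frequencies are illustrative choices, not values from the article.

        # The skin-depth relationship described above, evaluated for seawater.
        # delta = 1 / sqrt(pi * f * mu * sigma) is the standard good-conductor form.
        import math

        mu0 = 4 * math.pi * 1e-7     # permeability of free space, H/m
        sigma_seawater = 4.0         # S/m, approximate

        def skin_depth(freq_hz, sigma=sigma_seawater, mu_r=1.0):
            return 1.0 / math.sqrt(math.pi * freq_hz * mu_r * mu0 * sigma)

        print(f"At 2.4 GHz (Wi-Fi/cell band): {skin_depth(2.4e9)*1000:.1f} mm")
        print(f"At 2.4 kHz (VLF band):        {skin_depth(2.4e3):.1f} m")
        # The kHz signal penetrates roughly 1,000x deeper -- the square root of the
        # million-fold frequency ratio -- matching the three-orders-of-magnitude
        # difference described above.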

    An obvious way to address the skin-depth issue is to communicate at very low frequencies. But those frequencies have their own problems. The biggest is extremely limited bandwidth, which rules out data-intensive applications like position detection or video. Another problem is the large antennas required to pull in the faint VLF signals using conventional receiving equipment. As a result, while the VLF band is now used for communications with submarines underwater, the data exchange amounts to little more than text messages—and, to receive them, the sub must spool out a long antenna and rise to periscope depth.

    A quantum solution

    The NIST team, led by researcher Dave Howe, has proposed a solution to VLF’s sensitivity problems: Encode the modulated signal on low-frequency magnetic fields—and then reconfigure the new generation of ultrasensitive magnetometers that’s emerging from quantum technology as magnetic-radio receivers. This would push the receiver’s ability to pick up VLF communications far beyond that of conventional RF receivers. And, the team suggests, picking the right modulation scheme for encoding the signal could hammer down ambient noise, allowing the best possible use of the available bandwidth in these low-frequency channels.

    The specific quantum sensor used by Howe’s team is an optically pumped magnetometer (OPM). These devices, typically employed to measure faint natural magnetic fields, are a bit more robust than alternatives such as superconducting quantum interference devices (SQUID), as OPMs can work at room temperature and have low size, power and cost requirements.

    The optically pumped magnetometer (OPM) setup in the work by Howe’s NIST group. PD, photodiode; PBS, polarizing beamsplitter; L, lock-in amplifier. DC, ZF and SO refer to the magnetometer’s three operating modes: direct current, zero field, and self-oscillating. Photo at right shows the size of the rubidium atom vapor cell. [Image: Reprinted from V. Gerginov et al., Rev. Sci. Instr. 88, 125005 (2017), with the permission of AIP Publishing]

    The instrument used by the NIST scientists works by firing pump and probe lasers into a vapor cell containing isotopically pure 87Rb atoms. Changes in the quantum spin of the atom ensemble due to an external, signal-carrying modulated DC magnetic field result in changes in the probe light’s polarization as it passes through the ensemble. Those changes, in turn, are read out at a balanced polarimeter at the end of the chain, and converted into AC signals to decode the magnetic-radio signal.

    Meanwhile, on the transmission side, the team found that it could significantly reduce the impact of environmental noise—and, thus, boost the low-frequency signal’s carrying capacity—by adopting a digital binary phase-shift keying (BPSK) modulation scheme for the low-frequency magnetic-field signal. In particular, the BPSK modulation designed by the team had the effect of suppressing noise sources such as Earth’s natural magnetic background and the 50/60-Hz hum (plus harmonics) from the electrical power grid.
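
    A minimal sketch of BPSK itself: each data bit sets the carrier's phase to 0 or 180 degrees. The carrier frequency and bit rate below are illustrative, not the parameters used in the NIST experiment.

        # Minimal sketch of binary phase-shift keying (BPSK): each data bit flips the
        # carrier's phase between 0 and 180 degrees.  The 1 kHz carrier and 100 bit/s
        # rate are illustrative choices, not the NIST experiment's parameters.
        import numpy as np

        carrier_freq = 1e3            # Hz, low-frequency "magnetic radio" carrier
        bit_rate = 100                # bits per second
        sample_rate = 100e3           # samples per second
        bits = np.array([1, 0, 1, 1, 0])

        samples_per_bit = int(sample_rate / bit_rate)
        symbols = np.repeat(2 * bits - 1, samples_per_bit)         # map {0,1} -> {-1,+1}
        t = np.arange(symbols.size) / sample_rate
        waveform = symbols * np.cos(2 * np.pi * carrier_freq * t)  # +/-1 flips the phase 180 deg

        print(waveform[:5])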

    Picotesla sensitivity

    In a proof of principle, the NIST researchers created a simple, single-channel digitally encoded DC magnetic signal, and found that the OPM setup could detect the faint, sub-kHz-frequency signal at picotesla field strengths, far below the ambient magnetic background noise, across a distance spanning tens of meters in the magnetically noisy indoor NIST setting. The team believes that range could be extended further, to the hundreds of meters, in less noisy environments, and through continued improvements both in sensor technology and signal modulation.

    On that head, Howe’s group is working on a new custom quantum magnetometer that can provide still greater sensitivity, and on other techniques to reduce noise and expand bandwidth. In a press release, Howe likened the combining of quantum sensor technology and low-frequency magnetic radio to “inventing an entirely new field.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Optics & Photonics News (OPN) is The Optical Society’s monthly news magazine. It provides in-depth coverage of recent developments in the field of optics and offers busy professionals the tools they need to succeed in the optics industry, as well as informative pieces on a variety of topics such as science and society, education, technology and business. OPN strives to make the various facets of this diverse field accessible to researchers, engineers, businesspeople and students. Contributors include scientists and journalists who specialize in the field of optics. We welcome your submissions.

     
  • richardmitnick 12:09 pm on November 15, 2017
    Tags: A Speed Gun for Photosynthesis, A type of optical sensor that if the science bears out will be able to estimate the rate of photosynthesis, NIST, SIF - Solar Induced Fluorescence, Specially designed sap flow sensors, Such a device would revolutionize agriculture forestry and the study of Earth’s climate and ecosystems

    From NIST: “A Speed Gun for Photosynthesis” 


    NIST

    The NIST forest in Gaithersburg, Maryland. Credit: R. Press/NIST

    November 03, 2017 [NIST is not always quick to social media]
    Rich Press

    On a recent sunny afternoon, David Allen was standing by a third-floor window in a research building at the National Institute of Standards and Technology (NIST), holding in his hands a device that looked like a cross between a video camera and a telescope. The NIST campus is in suburban Gaithersburg, Maryland, but looking out the window, Allen could see 24 hectares (60 acres) of tulip tree, oak, hickory and red maple—a remnant of the northeastern hardwood forest that once dominated this landscape.

    Allen mounted the device on a tripod and pointed it out the window at the patch of forest below. The device wasn’t a camera, but a type of optical sensor that, if the science bears out, will be able to estimate the rate of photosynthesis—the chemical reaction that enables plants to convert water, carbon dioxide (CO2) and sunlight into food and fiber—from a distance.

    That measurement is possible because when plants are photosynthesizing, their leaves emit a very faint glow of infrared light. That glow is called Solar Induced Fluorescence, or SIF, and in recent years, optical sensors for measuring it have advanced dramatically. The sensor that Allen had just mounted on a tripod was one of them.

    “If SIF sensors end up working well,” Allen said, “I can imagine an instrument that stares at crops or a forest and has a digital readout on it that says how fast the plant is growing in real time.”

    Such a device would revolutionize agriculture, forestry and the study of Earth’s climate and ecosystems.

    NIST scientist David Allen and Boston University Ph.D. student Julia Marrs aim a SIF sensor at a specific tree in the NIST forest.
    Credit: R. Press/NIST

    Allen is a NIST chemist whose research involves remote sensing—the technology that’s used to observe Earth from outer space. Remote sensing allows scientists to track hurricanes, map terrain, monitor population growth and produce daily weather reports. The technology is so deeply embedded in our everyday lives that it’s easy to take for granted. But each type of remote sensing had to be developed from the ground up, and the SIF project at NIST shows how that’s done.

    Some satellites are already collecting SIF data, but standards are needed to ensure that those measurements can be properly interpreted. NIST has a long history of developing standards for satellite-based measurements, and Allen’s research is aimed at developing standards for measuring SIF. Doing that requires a better understanding of the biological processes that underlie SIF, and for that, Allen teamed up with outside scientists.

    At the same time that Allen was aiming a SIF sensor through that third-floor window, a team of biologists from Boston University and Bowdoin College was in the NIST forest measuring photosynthesis up close. A pair of them spent the day climbing into the canopy on an aluminum orchard ladder. Once there, they would use a portable gas exchange analyzer to measure photosynthesis directly based on how much CO2 the leaf pulled out of the air. They also measured SIF at close range.

    Boston University ecologist Lucy Hutyra (left) works at the forest edge alongside plant physiological ecologist Barry Logan (center) and ecologist Jaret Reblin, both of Bowdoin College in Brunswick, Maine. They measured photosynthesis directly, as well as temperature, humidity, and other environmental variables. Credit: R. Press/NIST

    Other scientists checked on specially designed sap flow sensors they had installed on the trunks of trees to measure the movement of water toward the leaves for photosynthesis.

    “We’re measuring the vital signs of the trees,” said Lucy Hutyra, the Boston University ecologist who led the team of scientists on the ground. The idea was to use those ground measurements to make sense of the SIF data collected from a distance.

    “If we measure an increase in photosynthesis at the leaf, we should see a corresponding change in the optical signal,” Hutyra said.

    After directly measuring photosynthesis in an individual leaf using a field portable gas exchange analyzer, scientists preserved a small sample of leaf tissue in liquid nitrogen. They would later analyze that tissue in the lab to measure levels of chlorophyll and other pigments. Credit: R. Press/NIST

    The research was also taking place at still a higher level. That afternoon, Bruce Cook and Larry Corp, scientists with NASA’s G-LiHT project, flew over the NIST forest in a twin-turboprop plane that carried multiple sensors, including a SIF sensor and Light Detection and Ranging (LiDAR) sensors that mapped the internal structure of the forest canopy. The aircraft made six parallel passes over the forest at about 340 meters (1,100 feet, slightly above the minimum safe altitude allowed by FAA regulations), the instruments peering out from a port cut into the belly of the aircraft.

    That gave the scientists three simultaneous measurements to work with: from the ground, from the window above the forest and from the air. They’ll spend months correlating the data.

    “It’s tricky, because when you go from the leaf level to the forest level, you often get different results,” Allen said. For instance, at the forest level, the SIF signal is affected by the variations in the canopy, including its contours and density. “We’re still studying those effects.”
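
    In its simplest form, that correlation work amounts to checking how tightly the optical signal tracks the ground truth. The sketch below is only an illustration with invented numbers, not the team's analysis pipeline: it fits a line between hypothetical leaf-level photosynthesis values and SIF readings taken at the same times and reports the correlation; the real analysis also has to contend with canopy structure, viewing geometry and timing mismatches among the three platforms.

        # Illustration only: pair hypothetical leaf-level photosynthesis values with
        # SIF readings from the same times and see how well one predicts the other.
        import numpy as np

        photosynthesis = np.array([4.1, 7.8, 11.9, 15.2, 18.6, 21.0])  # µmol CO2 per m² per s (invented)
        sif = np.array([0.31, 0.58, 0.84, 1.10, 1.32, 1.49])           # relative SIF signal (invented)

        slope, intercept = np.polyfit(photosynthesis, sif, 1)          # least-squares straight line
        r = np.corrcoef(photosynthesis, sif)[0, 1]                     # Pearson correlation coefficient

        print(f"SIF ≈ {slope:.3f} × A + {intercept:.3f}, r = {r:.3f}")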

    5
    At the airport in Gaithersburg, Maryland, NASA earth scientist Bruce Cook (left), leader of the Goddard LiDAR, Hyperspectral, and Thermal (G-LiHT) project, shows David Allen and Julia Marrs the sensor array in the bottom of the aircraft. Credit: R. Press/NIST

    Currently, there is no reliable way to measure photosynthesis in real time over a wide area. Instead, scientists measure how green an area is to gauge how much chlorophyll is present—that’s the molecule that supports photosynthesis and gives leaves their color. But if a plant lacks water or nutrients, it may be green even if the photosynthetic machinery is switched off.

    SIF may be a much better indicator of active photosynthesis. When plants are photosynthesizing, most of the light energy absorbed by the chlorophyll molecule goes into growing the plant, but about two to five percent of that energy leaks away as SIF. The amount of leakage is not always proportional to photosynthesis, however. Environmental variables also come into play.
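
    That small percentage is what makes the measurement hard. A back-of-the-envelope sketch, using an invented value for the light a canopy absorbs, shows how faint the signal is compared with the energy driving photosynthesis:

        # Back-of-the-envelope only: the absorbed-light value below is invented.
        absorbed_light_w_per_m2 = 400.0        # hypothetical absorbed photosynthetically active light
        for leak_fraction in (0.02, 0.05):     # the 2-to-5-percent range quoted above
            sif_w_per_m2 = leak_fraction * absorbed_light_w_per_m2
            print(f"{leak_fraction:.0%} leakage -> about {sif_w_per_m2:.0f} W/m² re-emitted as SIF")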

    The NIST forest is a test bed for understanding how all those variables interrelate. In addition to SIF data and the vital signs of trees, the scientists are collecting environmental data such as temperature, relative humidity and solar irradiance. They’re also figuring out the best ways to configure and calibrate the SIF instruments.

    “We’d like to see robust, repeatable results that make sense,” Allen said. “That will allow us to scale up from the leaf level, to the forest level, to the ecosystem level, and to estimate photosynthesis from measurements made at any of those scales.”

    Making SIF scalable is a key part of the measurement standard that Allen is working to create, one that will span everything from ground-level readings to measurements made from outer space.

    6
    A corner of the NIST forest shot by NASA scientists, and the plane that carried them and their G-LiHT airborne imaging system.
    Credit: Bruce Cook, Larry Corp/NASA (left); David Allen/NIST

    Using SIF to measure photosynthesis in real time would allow farmers to use only as much irrigation and fertilizer as their crops need, and only when they need it. Forest managers would be able to know how fast their timber is growing without having to tromp through the woods with a tape measure. Environmental managers would be able to monitor the recovery of damaged or deforested habitats after a drought or forest fire.

    And scientists would have a powerful new tool for studying how plants help regulate the amount of CO2 in the atmosphere.

    Humans add CO2 to the atmosphere when they burn fossil fuels, and land-based plants remove roughly a quarter of that CO2 through photosynthesis. But the environmental factors that affect that process are not well understood, mainly because scientists haven’t had a good way to measure the uptake of CO2 at the ecosystem level. SIF measurements, and the standards for interpreting them accurately, might help solve that problem.

    “CO2 exchange by plants is one of the most important biological processes on the planet,” Allen said, “and SIF will give us a new way to see that process in action.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD, USA

    NIST Mission, Vision, Core Competencies, and Core Values

    NIST’s mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.
    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

     
  • richardmitnick 1:30 pm on November 14, 2017 Permalink | Reply
    Tags: , “Lamp-plaque” method, , FELs, Hyperspectral cameras are used for a wide range of monitoring applications including biomedical defense and ground-based air-based and space-based environmental sensing, Lights Camera Calibrate! Improving Space Cameras with a Better Model for Ultra-Bright Lamps, NIST, , There’s an emerging market for hyperspectral sensors in general   

    From NIST: “Lights, Camera, Calibrate! Improving Space Cameras with a Better Model for Ultra-Bright Lamps” 


    NIST

    November 14, 2017
    Jennifer Lauren Lee

    1
    A standard FEL lamp, such as the one pictured here, is about the size of a person’s thumb. Credit: David Allen/NIST

    Studio photographers may be familiar with the 1,000-watt quartz halogen lamps known as “FELs.” Scientists use them too—specially calibrated ones, at least—to test the performance of light sensors that monitor Earth’s weather, plant life and oceans, often from space.

    A researcher at the National Institute of Standards and Technology (NIST) has recently made an improved mathematical model of the light output of FEL lamps. The new model, developed by NIST theorist Eric Shirley, will make the lamps more useful research tools, the scientists say, particularly for calibrating a relatively new class of cameras called hyperspectral imagers.

    Rainbow Vision

    Hyperspectral cameras are used for a wide range of monitoring applications, including biomedical, defense, and ground-based, air-based and space-based environmental sensing. While ordinary cameras only capture light in three bands of wavelengths—red, green and blue—hyperspectral imagers can be designed to see all the colors of the rainbow and beyond, including ultraviolet and infrared. Their increased range allows these cameras to reveal the distinctive signatures of processes that are invisible to the naked eye.

    Some of these effects are subtle, however—such as when researchers are trying to tease out changes in ocean color, or to monitor plant growth, which helps them predict crop productivity.

    “These are both examples where you’re looking at an extremely small signal of just a couple percent total,” said David Allen of NIST’s Physical Measurement Laboratory (PML). In cases like this, achieving low uncertainties in the calibration of their detectors is essential.

    Of particular interest to Allen and his colleagues was a calibration technique called the “lamp-plaque” method, popular with scientists because it is relatively inexpensive and portable. For this calibration procedure, researchers use a standard FEL lamp. Incidentally, FEL is the name designated by the American National Standards Institute (ANSI) for these lamps. It is not an acronym.

    First, the lamp light shines onto a white, rectangular board called a reflectance plaque, made of a material that scatters more than 99 percent of the visible, ultraviolet and near-infrared light that hits it. Then, after bouncing off the plaque, the scattered light hits the camera being calibrated.

    The method has been used for decades to calibrate other kinds of sensors, which only need to see one point of light. Hyperspectral imagers, on the other hand, can distinguish shapes.

    “They have some field of view, like a camera,” Allen said. “That means that to calibrate them, you need something that illuminates a larger area.” And the trouble with the otherwise convenient lamp-plaque system is that the light bouncing off the plaque isn’t uniform: It’s brightest in the center and less intense toward the edges.

    The researchers could easily calculate the intensity of the light in the brightest spot, but they didn’t know exactly how that light falls off in brightness toward the plaque’s edges.

    To lower the calibration uncertainties, researchers needed a better theoretical model of the lamp-plaque system.

    Counting Coils

    Shirley, the NIST theorist who took on this task, had to consider several parameters. One major contributor to the variations in intensity is the orientation of the lamp with respect to the plaque. FEL lamps have a filament that consists of a coiled coil—the shape that an old-fashioned telephone cord would make if wrapped around a finger. All that coiling means that light produced by one part of the filament can be physically blocked by other parts of the filament. Setting the lamp at an angle with respect to the plaque exacerbates this effect.

    2
    Close-up of an FEL lamp revealing its “coiled coil” filament. Behind the lamp is a white reflectance plaque like the ones used in calibrations. Credit: Jennifer Lauren Lee/NIST

    To model the system, Shirley took into account the diameter of the wire and both coils, the amount of space between each curve of the coils and the distance between the lamp and the plaque.
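
    A toy calculation makes the geometry concrete. The sketch below is not Shirley's model and uses made-up dimensions: it treats the coiled-coil filament as a cloud of point emitters and sums their inverse-square, cosine-weighted contributions at points across the plaque, which is enough to show the brightness dropping off away from the center. The filament self-blocking that the improved model captures is deliberately left out.

        # Toy lamp-plaque falloff sketch (not the NIST model; all dimensions invented).
        # The filament is discretized into point emitters; self-occlusion is ignored.
        import numpy as np

        def coiled_coil_points(n=500, r_minor=0.0006, r_major=0.003, length=0.02,
                               turns_major=8, turns_minor=80):
            """Approximate a coiled-coil filament as n points in 3D (meters)."""
            t = np.linspace(0.0, 1.0, n)
            phi = 2 * np.pi * turns_major * t            # winding of the primary helix
            psi = 2 * np.pi * turns_minor * t            # winding of the secondary coil
            x = (r_major + r_minor * np.cos(psi)) * np.cos(phi)
            y = (r_major + r_minor * np.cos(psi)) * np.sin(phi)
            z = length * (t - 0.5) + r_minor * np.sin(psi)
            return np.stack([x, y, z], axis=1)

        def plaque_irradiance(filament, distance=0.5, half_width=0.15, n_grid=41):
            """Relative irradiance on a square plaque facing the lamp at `distance` meters."""
            coords = np.linspace(-half_width, half_width, n_grid)
            gx, gy = np.meshgrid(coords, coords)
            plaque = np.stack([gx.ravel(), gy.ravel(), np.full(gx.size, distance)], axis=1)
            d = plaque[None, :, :] - filament[:, None, :]    # emitter-to-plaque vectors
            r2 = np.sum(d ** 2, axis=2)
            cos_theta = d[:, :, 2] / np.sqrt(r2)             # obliquity at the plaque surface
            e = np.sum(cos_theta / r2, axis=0)               # add up every emitter's contribution
            return (e / e.max()).reshape(n_grid, n_grid)

        irr = plaque_irradiance(coiled_coil_points())
        center = irr[irr.shape[0] // 2, irr.shape[1] // 2]
        print(f"corner-to-center brightness ratio: {irr[0, 0] / center:.2f}")

    Even this crude version reproduces the qualitative problem: the plaque is noticeably dimmer toward its corners, so calibrating a camera that sees the whole plaque requires knowing the falloff, not just the peak value.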

    “These are all things that were obvious,” Shirley said, “but they were not as appreciated before.”

    NIST scientists tested the actual output of some FEL lamp-plaque systems against what the model predicted and found good agreement. They say the uncertainties on light intensity across the entire plaque could now be as low as a fraction of a percent, down from about 10 to 15 percent.

    Moving forward, NIST will incorporate the new knowledge into its calibration service for hyperspectral imagers. The researchers are also preparing to publish their results, in the hope that other scientists will use the new model when doing their own calibrations. The work could serve as a foundation for creating better detector specifications as well, potentially useful for U.S. manufacturers who build and sell the cameras.

    “There’s an emerging market for hyperspectral sensors in general,” Allen said. “They’re becoming more sophisticated, and this is a component to help them be a more robust product in an increasingly competitive market.”

    Sensors, Modeling & simulation research, Optical / photometry / laser metrology, Physics and Standards

    See the full article here.


     