Tagged: The National Institute of Standards and Technology

  • richardmitnick 10:29 am on April 12, 2023 Permalink | Reply
    Tags: "From GPS to Laser Pointers Quantum Science Is All Around Us", Entanglement and superposition are resources for quantum computing; these are what make quantum computing powerful, NIST has been a leader in quantum mechanics since its earliest days because of the precision measurement involved, Quantum is not just a theory; rather it is just the way nature is, The National Institute of Standards and Technology, There are quantum applications being pursued in chemistry and biology and health care and finance and transport and manufacturing and so forth

    From The National Institute of Standards and Technology: “From GPS to Laser Pointers Quantum Science Is All Around Us” 

    From The National Institute of Standards and Technology

    4.12.23
    Andrew Wilson

    NIST researcher Andrew Wilson holds a surface-electrode ion trap used for quantum information processing. The computer screen behind Wilson shows three white dots, a live microscope image of three single atoms. They are held in a triangle pattern by an ion trap like the one Wilson is holding. Credit: R. Jacobson/NIST.

    If you’ve gotten around with GPS, had an MRI, or tormented your cat with a laser pointer, quantum science is a part of your life.

    Ahead of “World Quantum Day” this week, we asked Andrew Wilson, who leads NIST’s Quantum Physics Division, to explain just what exactly quantum science is and why it matters.

    We’re celebrating World Quantum Day. Let’s start simply. When we say “quantum,” what do we mean by that?

    Well, it means different things to different people. But it essentially means using fundamental quantum properties to do great things. When people talk about using quantum, it generally comes down to two things:

    Quantum superposition is when something that has two possible forms is in both of those forms, to some extent, at the same time.
    Entanglement means you’ve got at least two things that are always connected together; they have no independent existence anymore. Something that happens to one always affects the other. It’s kind of romantic!

    Entanglement and superposition are resources for quantum computing. These are what make quantum computing powerful.
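The two ideas above can be made concrete with a few lines of state-vector arithmetic. This is a toy numerical sketch in Python/NumPy for illustration only (it is not NIST's tooling): measurement probabilities are the squared amplitudes of the state vector.

```python
import numpy as np

# Single-qubit basis states |0> and |1>
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Superposition: an equal mix of |0> and |1>
plus = (zero + one) / np.sqrt(2)
probs = plus ** 2  # measurement probabilities: 50/50
print(probs)  # [0.5 0.5]

# Entanglement: the two-qubit Bell state (|00> + |11>) / sqrt(2)
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Outcome probabilities over |00>, |01>, |10>, |11>:
# only the perfectly correlated outcomes 00 and 11 ever occur.
print(bell ** 2)  # [0.5 0.  0.  0.5]
```

The Bell state has no product form: neither qubit has an independent state of its own, which is exactly the "no independent existence" Wilson describes.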

    Why are scientists so interested in all things quantum?

    I think in the early days of quantum physics, there were ideas like the laser. Quantum physics underpins the laser, and the laser turns out to be rather important. It supports the internet. Quantum also comes into things like MRI scans and semiconductor chips. So, we rely on quantum behavior to understand how these things work. That’s quantum physics. This early version of quantum physics is called semiclassical physics. And a lot of technology based on this uses superposition. Today, this is widely referred to as Quantum 1.0.

    But as we physicists kept working on quantum systems, and getting better at making and controlling these, we started thinking, OK, maybe we can do useful things with entanglement. So, we added entanglement to the toolbox. That’s Quantum 2.0. Quantum 2.0 is about trying to capture the advantages and the practical applications of both superposition and entanglement. We’re really trying to see how we can make entanglement practical. There will have to be many scientific breakthroughs, including fundamental science, for this kind of technology to be ubiquitous in our economy and society.

    At the same time that progress was being made in labs, some clever people realized that this toolbox could be used for information processing. Quantum computing emerged from the “coming together” of clever ideas and advancements in labs, a mix of quantum physics and information science.

    We can develop quantum computers, but what else can we do?

    We can also use superposition and entanglement for improved sensors and communications. We can make quantum sensors that measure things more precisely than classical physics allows. We can communicate information in quantum form that is resistant to eavesdropping. The challenge with these Quantum 2.0 things is making them practical. There is much work to do, and it’s very exciting to see the progress being made.
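The claim that quantum sensors can beat classical precision limits can be illustrated with textbook metrology scalings (a schematic sketch, not a NIST result): with N independent, unentangled probes, uncertainty shrinks as 1/sqrt(N) (the standard quantum limit), while suitably entangled probes can in principle reach the 1/N Heisenberg limit.

```python
import math

def standard_quantum_limit(n):
    """Uncertainty scaling for n independent (unentangled) probes."""
    return 1.0 / math.sqrt(n)

def heisenberg_limit(n):
    """Best scaling quantum mechanics allows, reachable with entangled probes."""
    return 1.0 / n

for n in (1, 100, 10_000):
    print(n, standard_quantum_limit(n), heisenberg_limit(n))
# With 10,000 probes, entanglement offers up to a 100x precision gain
# over the unentangled strategy.
```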

    Another thing that makes quantum interesting is that there are potential applications in many areas, far beyond physics. There are applications being pursued in chemistry, biology, health care, finance, transport, manufacturing and so forth. It can be a very interdisciplinary field. That makes it hard because each one of us only has a certain amount of expertise. On the other hand, the cool part is you get to collaborate with people who are experts in other things and learn from them.

    When you hear the word “quantum,” it sounds so abstract. Where does it show up in our everyday lives?

    People do tend to think of quantum as sort of a weird and abstract thing, because most of the stuff we deal with in the real world — pens, cars, coffee cups, etc. — doesn’t behave quantum mechanically in our everyday experience. Because quantum mechanics is not an everyday experience for most people, it can seem very strange.

    But quantum is not just a theory; it’s just the way nature is. For those of us who work with this every day, it’s not mysterious or abstract. It’s as practical as anything else that we deal with during the day, including pens and coffee.

    As I said, there are lots of practical applications of quantum. There are parts of electronics that rely heavily on quantum mechanics. Health care, communication, lots of technology relies on it.

    One of the most common practical applications is timekeeping. The only reason you’re able to have GPS on your phone or in your car is that you’ve got atomic clocks in satellites. You may not know it, but you’re using quantum superposition in those clocks, making sure you can figure out where you’re going. So if I’m supposed to be meeting my wife at a restaurant, and I don’t know where it is, I’m relying on quantum mechanics to get me there. This is an everyday use of quantum mechanics, looking at our phones and figuring out where we’re going.
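The link between clocks and position is simple arithmetic: a GPS receiver turns time differences into ranges by multiplying by the speed of light, so every nanosecond of clock error becomes roughly 30 cm of ranging error. A back-of-the-envelope sketch (illustrative numbers only):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s (exact by definition)

def range_error_m(clock_error_s):
    """Ranging error caused by a clock offset: distance = c * time."""
    return C * clock_error_s

print(range_error_m(1e-9))  # ~0.3 m per nanosecond of clock error
print(range_error_m(1e-6))  # ~300 m per microsecond: useless for navigation
```

This is why satellite navigation needs atomic clocks: an ordinary quartz clock drifts by far more than a nanosecond per day.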

    Why is it important to continue to research quantum science?

    Studying quantum may lead us to the next big thing, or a bunch of things, whatever the next laser or GPS may be. There are a lot of ideas out there for how we can use quantum, and people are frantically trying to figure out:

    What can quantum technology be used for?
    How can we advance quantum technology so that it can be used for practical problems?

    Economies are affected by Quantum 1.0, and there’s a high probability that Quantum 2.0 will have another transformational impact. There are so many ideas floating around that people are excited about; that’s why we’re doing this.

    NIST specifically is doing this because we do measurement science to help spur innovation and competitiveness. People come to NIST with measurement problems, and often, we can overcome classical barriers to these measurement problems using quantum mechanics. That’s why NIST has been a leader in quantum mechanics since its earliest days: because of the precision measurement involved.

    The more you can measure something very precisely, the more you can make improvements to that technology. So there’s a lovely cycle of measuring more precisely, improving the technology, and measuring more. But at some point, we hit the limit of the measurement scheme we’re using, and we have to develop a new approach. Measurement science is key to advancing technology. That’s how I think about it.

    Where did your fascination with this area of research come from?

    When I was a kid, I liked building and fixing things. My bike would break, and it was the way I got around, so I was highly motivated to figure out how to fix it. So, I pulled things apart and put them back together again. I tinkered with things. I had some people around me who had knowledge of electronics, and I started building little simple circuits or simple gadgets with little motors or lights.

    I wanted to understand how things work. Why is it doing this thing? And I was curious and got drawn into things. It helps to have a high tolerance for being confused. I want to say that physicists are perpetually confused about the latest thing they’re thinking about, and that is the way we learn, right? You’re confused today. You figure something out, and you’re very happy about this, but you’ll be confused by something new tomorrow!

    When I got into the lab, I found I was pretty good at fixing things, making things work, and understanding why things don’t work and fixing those things. So, when you have that kind of inclination, you wind up as an experimentalist.

    NIST researcher Andrew Wilson points to an ion trap inside a glass vacuum envelope. This trap is used for quantum computing. It can confine more than one atomic element (beryllium and magnesium) at the same time. NIST pioneered this capability, and it’s now being used by companies working on quantum computing. Credit: R. Jacobson/NIST.

    And as for quantum, it’s just cool, right? For example, I do a lot of work with lasers. There’s almost nothing cooler than lasers. If you’ve only seen the little red dots of a laser pointer, come into some of these labs, and you’ll see the most incredible colors in nature. It’s basically a rainbow on steroids. They’re so beautiful and just wonderful to be around. There’s also a profound sense of joy from seeing something that no one has ever seen before, sometimes a discovery that scientists have been seeking for decades.

    The lab feels like a playground to me, albeit with a challenging scientific mission, hard work, long hours, occasional setbacks, and serious safety protocols that must be followed carefully.

    A lab is like Disneyland to an experimental physicist like me. When you’re in the lab and you see on your screen a signal, an image, a trace of something, after all that hard work, it’s just a reminder of how incredible nature really is. It’s better than any fiction book that’s ever been written in my humble opinion. This work just draws you in.

    And of course, we’re not just tinkering around here, we’re mission-driven. We push very hard; it’s also a very competitive field. Many of us like to compete.

    What is on the horizon for quantum science? What excites you about the future?

    There’s a ton of really great science being done and quantum technologies being developed. We now do things in the lab routinely that even just a few years ago we only dreamed about being able to do and didn’t know how. We can implement important algorithms for quantum computing. We can build sensing-type devices with quantum performance far beyond what anyone has had before. We can communicate quantum information over greater distances and with better fidelity than ever before.

    There are different sorts of quantum computers that many companies are now building. NIST is developing ideas and technologies that these companies will need in the future as they try to extend the capabilities of their machines.

    Many things about how quantum technologies might evolve remain unclear, but we as scientists are just very patient, slowly chipping away at problems. When you’re chasing after something really important, that can be massively transformative, you have to have a lot of resilience and grit.

    Scientists hammer things out — improve things by factors of two — year after year. It’s like running a marathon. We have our 100-meter races, too, but quantum is really a sustained effort. NIST has had a sustained quantum effort for decades now.

    As we begin to work on potential applications of quantum, we’re learning so much about things beyond quantum physics. It’s exciting to support companies that are part of the emerging quantum industry and to see the creative ways they are advancing technologies. Perhaps we will be able to look back at this moment in time as when quantum revolutionized technology, in the same way that silicon chips and integrated circuits did in the 1960s and ’70s. I hope so. We shall see.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for U.S. measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington, D.C., and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units and for measurement of light. In 1905, a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC, the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer (SWAC) was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST‑F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.
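The connection between the cesium resonance and the second is definitional: one SI second is exactly 9,192,631,770 cycles of the cesium-133 hyperfine transition, which is what a cesium clock such as NIST-F1 counts. A quick illustration of the arithmetic:

```python
# SI definition of the second: cycles of the Cs-133 hyperfine transition
CS_FREQ_HZ = 9_192_631_770

# Cycles a cesium clock counts in one day:
cycles_per_day = CS_FREQ_HZ * 86_400
print(cycles_per_day)  # 794243384928000

# Miscounting by a single cycle each second would make the clock
# wrong by about 1.1e-10 s per second:
print(1 / CS_FREQ_HZ)
```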

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The book partially fulfills NIST’s statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the primary use of SI (metric) measurements, as recommended by the Omnibus Trade and Competitiveness Act of 1988.

     
  • richardmitnick 10:30 am on March 16, 2023 Permalink | Reply
    Tags: "NRC Authorizes Restart of NIST Research Reactor", NIST will soon begin a careful restart process and low-power testing., NIST's Center for Neutron Research, The National Institute of Standards and Technology   

    From The National Institute of Standards and Technology: “NRC Authorizes Restart of NIST Research Reactor” 

    From The National Institute of Standards and Technology

    3.10.23
    Jennifer Huergo
    jennifer.huergo@nist.gov
    (301) 975-6343

    NIST’s Center for Neutron Research, in Gaithersburg, Maryland. Credit: NIST.

    NIST will soon begin a careful restart process and low-power testing.

    The U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) has received authorization from the Nuclear Regulatory Commission (NRC) to restart its research reactor. The NIST Center for Neutron Research reactor has been shut down since Feb. 3, 2021, when a single fuel element overheated and was damaged because it was not securely latched into place.

    The NIST reactor is used for a broad range of research and operates at far lower power, temperature and pressure conditions than utility reactors that generate electricity. The NRC’s extensive review of this incident showed that the public was safe at all times during the event.

    “We are extremely pleased to have reached this milestone and to begin our return to normal operations,” said Under Secretary of Commerce for Standards and Technology and NIST Director Laurie E. Locascio. “We are committed to ensuring the safe operation of this vital national resource so that it can once again support important advances in medicine, material science, technology and more.”

    Over the past two years, NIST has reviewed and updated its training, operations, procedures, communications and attitudes toward safety. The NRC confirmed there were no impacts on the reactor’s structures, systems and components that would preclude restart. The commission also evaluated NIST’s revised procedures and practices to ensure that they “provide reasonable assurance that the reactor will be operated consistent with its license and the NRC’s regulations.”

    With its authorization to restart the research reactor, the NRC released a technical evaluation report that outlines how it reached its decision and documents its review of the corrective actions NIST has taken per an August 2022 Confirmatory Order.

    “I am proud of the tremendous progress our NCNR and other NIST staff members have made to ensure we could bring the NCNR back into service,” said Locascio. “We look forward to continuing to work with the NRC to ensure sustained safe operations.”

    The NIST Reactor Safety Evaluation Committee will complete a review to confirm that all conditions for restart have been met before low-power testing of the reactor begins. This testing is expected to last several weeks before the reactor can be returned to full operational status. The NCNR will keep its research community informed of progress toward a return to scientific operations.

    All of NIST’s updates and reports on the 2021 incident and recovery, along with a Q&A and links to related NRC reports, can be found on the NIST website.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    five-ways-keep-your-child-safe-school-shootings

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaitherberg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901 in response to a bill proposed by Congressman James H. Southard (R- Ohio) the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use; but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production even operating its own facility to produce optical glass when European supplies were cut off. Between the wars Harry Diamond of the Bureau developed a blind approach radio aircraft landing system. During World War II military research and development was carried out including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC, the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the National Bureau of Standards became the National Institute of Standards and Technology in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST‑F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins, Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha, Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation that has been in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content and are used as calibration standards for measuring equipment and procedures, as quality control benchmarks for industrial processes, and as experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through the cooperation of the NCWM’s Committee on Specifications and Tolerances and NIST’s Weights and Measures Division (WMD). The book partially fulfills the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 9:32 am on March 15, 2023 Permalink | Reply
    Tags: "Brain-Inspired Computing Can Help Us Create Faster and More Energy-Efficient Devices — If We Win the Race", , , , , , , Soon conventional architectures will be incapable of keeping pace with the demands of future data processing., , The Brain as a Time Computer, The National Institute of Standards and Technology, Today there is a race to build computing systems that look like regular grids of low-precision processing elements., We must revisit the brain to understand its tremendous energy efficiency and use that understanding to build computing systems inspired by it.   

    From The National Institute of Standards and Technology: “Brain-Inspired Computing Can Help Us Create Faster and More Energy-Efficient Devices — If We Win the Race” 

    From The National Institute of Standards and Technology

    3.15.23
    Advait Madhavan

    1
    One way race logic strives to save energy is by addressing the shortest-path problem. In one scenario, cars set off in multiple directions trying to find the fastest route. When the first car arrives, all the other cars stop, saving energy.
    Credit: NIST.

    “The human brain is an amazingly energy-efficient device. In computing terms, it can perform the equivalent of an exaflop — a billion-billion (1 followed by 18 zeros) mathematical operations per second — with just 20 watts of power.

    In comparison, one of the most powerful supercomputers in the world, the Oak Ridge Frontier, has recently demonstrated exaflop computing. But it needs a million times more power — 20 megawatts — to pull off this feat.

    My colleagues and I are looking to the brain as a guide in developing a powerful yet energy-efficient computer circuit design. You see, energy efficiency has emerged as the predominant factor keeping us from creating even more powerful computer chips. While ever-smaller electronic components have exponentially increased the computing power of our devices, those gains are slowing down.

    Interestingly, our view of how the brain works has been a source of constant inspiration to the computing world. To understand how we arrived at our approach, we need to take a short tour of computing history.

    How a 19th Century Mathematician Launched the Computing Revolution

    Mathematician George Boole’s impact on the modern age is incalculable. His 1847 invention, now known as Boolean algebra, assigns 1 and 0 values to logical propositions (true or false, respectively). Boolean algebra describes a way to perform precise mathematical calculations with them.

    Boole’s inspiration came from how he understood the brain to work. He imagined the laws of thought as being logical propositions that could take on true (1) or false (0) values. He expressed these laws mathematically, so they could perform arithmetic in a precise, systematic fashion. Many years later, with the research of information scientist Claude Shannon, Boolean algebra was used with electrical switches that we now know as transistors, the building blocks of today’s computers.
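
    The leap from Boole’s logic to Shannon’s switching circuits can be made concrete in a few lines of code. This illustrative Python sketch (my own, not from the article) builds a half adder, the simplest arithmetic circuit, out of nothing but Boolean operations, exactly the kind of arithmetic-from-logic trick that transistors perform billions of times per second:

```python
# Boolean algebra in action: two one-bit propositions (1 = true, 0 = false)
# combined with XOR and AND form a "half adder", the simplest arithmetic
# circuit built from switch-like logic gates.

def half_adder(a, b):
    """Add two one-bit values (0 or 1) using only Boolean operations."""
    return a ^ b, a & b  # XOR gives the sum bit, AND gives the carry bit

# Enumerate the full truth table.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chaining half adders (plus an OR gate for carries) yields adders of any width, which is how Boolean switches came to do precise arithmetic.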

    Today, tiny, nanometer-sized transistors operating as switches inside microchips are the most manufactured device ever. More than 13 sextillion (one followed by 21 zeros) devices have been fabricated as of 2018. Millions of such transistors are arranged into centralized, flexible architectures for processing data.

    These micro-scale processing machines, or microprocessors, are used in nearly anything you use that has a computer — a cellphone, a washing machine, a car or a smartwatch. Their versatility, scalability and robustness have been responsible for their “Swiss Army knife” reputation. They are able to do many things that humans are not well suited to do, such as calculating trajectories of rockets or rapidly crunching numbers in large financial spreadsheets.

    Since the early days of computing, transistors have become a thousand times smaller and a hundred thousand times faster, and they use a billion times less energy. But even these incredible improvements are not enough anymore. The amount of data produced by human activity has increased exponentially, and the centralized Swiss Army-knife approach cannot keep up with the data deluge of the modern age.

    On the other hand, our biological evolution over billions of years solved the problem of handling lots of data by using lots of processing elements.

    While neurons, or nerve cells, were discovered in the late 1800s, their impact on computing would occur only 100 years later. Scientists studying the computational behavior of brains began building decentralized processing models that relied on large amounts of data. This allowed computer engineers to revisit the organization of the brain as a guiding light for rearranging the billions of transistors at their disposal.

    The Brain as a Sea of Low-Precision Neurons

    The neuron doctrine, as it is known today, envisions the brain as made up of a vast sea of interconnected nerve cells — known as neurons — that communicate with each other through electrical and chemical interactions across junctions called synapses. This view, popularized by early neuroscientists, is very different from Boole’s more abstract logical view of brain function.

    But what are they doing computationally? Enter Walter Pitts and Warren McCulloch.

    In 1943, they proposed the first mathematical model of a neuron. Their model showed that nerve cells in the brain have enormous computational power. It described how a nerve cell accumulates electrical activity from its neighboring nerve cells based on their “importance” and outputs electrical activity based on its aggregated input. This electrical activity, in the form of spikes, enables the brain to do everything from transmitting information to storing memory to responding to visual stimuli, such as a family picture or a beautiful sunset. As neurons can have thousands of neighbors, such an approach was well suited to dealing with applications with a lot of input data.
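
    The model described above is easy to sketch in code: a neuron sums its neighbors’ activity, weighted by each neighbor’s “importance”, and fires only if the total crosses a threshold. The weights and threshold below are illustrative values of my own choosing, not numbers from the 1943 paper:

```python
# A McCulloch-Pitts-style neuron: aggregate weighted binary inputs from
# neighboring neurons and output a spike (1) only if a threshold is crossed.

def mp_neuron(inputs, weights, threshold):
    """Return 1 if the weighted sum of binary inputs reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Three neighboring neurons spike or stay silent; the second matters most.
spikes = [1, 1, 0]
weights = [0.3, 0.9, 0.5]
print(mp_neuron(spikes, weights, threshold=1.0))  # 0.3 + 0.9 = 1.2 >= 1.0, so it fires
```

Because a real neuron can have thousands of such inputs, this simple aggregate-and-threshold rule scales naturally to problems with a lot of input data.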

    2
    The human brain is an amazingly energy-efficient device. NIST researchers are using the brain as inspiration to develop more energy-efficient computer circuit designs. Credit: MattLphotography/Shutterstock.

    Today, 80 years later, the McCulloch and Pitts model is widely regarded as the parent of modern neural network models and the basis of the recent explosion in artificial intelligence. Especially useful in situations where the precision of data is limited, modern AI and machine learning algorithms have performed miraculous feats in a variety of areas such as search, analytics, forecasting, optimization, gaming and natural language processing. They perform with human-level accuracy on image and speech recognition tasks, predict weather, outclass chess grandmasters, and as shown recently with ChatGPT, parse and respond to natural language.

    Historically, the power of our computing systems was rooted in being able to do very precise calculations. Small errors in the initial conditions of a rocket trajectory can lead to huge errors later in flight. Though many applications still have such requirements, the computational power of modern deep-learning networks arises from their large size and interconnectivity.

    A single neuron in a modern network can have up to a couple of thousand other neurons connected to it. Though each neuron may not be especially precise, its behavior is determined by the aggregated information of many of its neighbors. When the network is trained on an input dataset, the interconnection strengths between each pair of neurons are adjusted over time so that the overall network makes correct decisions. Essentially, the neurons are all working together as a team, and the network as a whole can make up for the reduced precision of each of its atomic elements. Quantity has a quality all its own.
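
    A toy Python experiment (illustrative numbers only, not from the article) shows how aggregation compensates for low precision: each simulated element reports a noisy reading rounded to a single decimal digit, yet the average of a few thousand of them pins down the underlying value far more precisely than any one reading could:

```python
# "Quantity has a quality all its own": many low-precision elements can
# jointly produce a high-precision answer. Each element reports a noisy
# value rounded to one decimal digit; the aggregate recovers the truth.
import random

random.seed(0)                      # deterministic run for reproducibility
true_value = 0.7312                 # the quantity the "network" should recover
readings = [round(true_value + random.uniform(-0.5, 0.5), 1)
            for _ in range(5000)]   # 5,000 crude, one-digit-precision readings
estimate = sum(readings) / len(readings)
print(f"single-reading precision: 0.1, aggregate estimate: {estimate:.4f}")
```

No individual element can resolve better than 0.1, but the ensemble estimate lands within about 0.01 of the true value, mirroring how a large network of imprecise neurons can still make precise decisions.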

    To make such models more practical, the underlying hardware must reflect the network. Running large, low-precision networks on a small number of high-precision computing elements ends up being tremendously inefficient.

    Today there is a race to build computing systems that look like regular grids of low-precision processing elements, each filled with neural network functionality. From smartphones to data centers, low-precision AI chips are becoming more and more common.

    As the application space for AI and machine learning algorithms continues to grow, this trend is only going to increase. Soon conventional architectures will be incapable of keeping pace with the demands of future data processing.

    Even though modern AI hardware systems can perform tremendous feats of cognitive intelligence, such as beating the best human player of Go, a complex strategy game, such systems take tens of thousands of watts of power to run. The human Go grandmaster’s brain, on the other hand, consumes only 20 watts. We must revisit the brain to understand its tremendous energy efficiency and use that understanding to build computing systems inspired by it.

    One clue comes from recent neuroscience research in which the timing of the brain’s electrical spikes has been found to be important. We are beginning to believe that this timing may be the key to making computers more energy efficient.

    The Brain as a Time Computer

    In the early 1990s, French neuroscientists performed a series of experiments to test the speed of the human visual system. They were surprised to find that the visual system was much faster than they had previously thought, responding to a visual stimulus in as little as 100 milliseconds. This went against the prevailing notion of how spikes encode information.

    The conventional picture suggests that a neuron must receive a long train of spikes from its neighbors, aggregate all of them, and respond. But the time it would take to aggregate a long spike train and respond would be much larger than what the experiments found. This meant that some neurons were not aggregating long spike trains from their neighbors; instead, they were working with just a couple of spikes from their neighbors when producing an output.

    This radically changes the way we think information is encoded. If the brain is using only a few spikes to make decisions, then it must be making decisions based on the timing differences between spikes. A similar process occurs in the auditory systems of humans and other animals. For example, researchers have verified that barn owls use the difference in the arrival times of a sound at each ear to locate their prey.
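
    The owl’s trick can be expressed as a small calculation. This sketch assumes the standard interaural-time-difference relation, delta_t = d·sin(theta)/c, with illustrative human-scale values for ear spacing; none of the numbers come from the article:

```python
# How timing alone encodes information: a sound from one side reaches the
# nearer ear slightly earlier. From that delay, the direction follows from
# delta_t = d * sin(theta) / c, i.e., theta = asin(c * delta_t / d).
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
EAR_SPACING = 0.21      # m, an approximate human ear-to-ear distance

def source_angle(delta_t_seconds):
    """Angle from straight ahead (radians) implied by an interaural delay."""
    return math.asin(SPEED_OF_SOUND * delta_t_seconds / EAR_SPACING)

# A delay of 0.3 milliseconds implies the sound is about 29 degrees off-center.
print(math.degrees(source_angle(0.0003)))
```

A sub-millisecond timing difference, with no amplitude information at all, pins down a direction, which is exactly the kind of encoding race logic borrows.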

    These observations suggest that we might make more efficient computers by mimicking this aspect of the brain and using the timing of signals to represent information.

    Inspired by this idea, my NIST colleagues and I are aiming to develop a new type of computer circuit that uses something we call “race logic” to solve problems. In race logic, signals race against each other, and the timing between them matters. The winner of the race tells us something about the solution of the problem.

    To understand race logic, let’s first go back briefly to conventional computing. Conventional digital computers solve problems by sending Boolean bits of information — 0s and 1s — on wires through a circuit. During circuit operation, bits regularly flip their values, from 0 to 1 and vice versa. Each bit flip consumes energy, and circuits with lots of bit flipping are said to have high activity and consume a lot of energy. Trying to reduce energy consumption suggests reducing activity, hence reducing the number of bit flips to perform a given task.

    Race logic reduces activity by encoding information in the timing of those bit flips on a wire. This approach allows a single bit flip on a wire to encode values larger than 0 or 1, making it an efficient encoding.
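
    A quick illustrative comparison of the two encodings (my own sketch, not NIST code): conventional logic represents the value 13 as the bit pattern 1101, costing a flip per changed bit, while race logic represents it as a single rising edge that arrives 13 time steps after a reference signal:

```python
# Temporal encoding in a nutshell: count the bit flips each scheme needs
# to communicate a value.

def binary_flips(value, previous=0):
    """Bit flips needed to put `value` on a bus that previously held `previous`."""
    return bin(value ^ previous).count("1")  # each differing bit must flip

def race_flips(value):
    """Race logic: one rising edge whose *timing* (t = value) carries the value."""
    return 1  # a single flip, no matter how large the value is

for v in (5, 13, 255):
    print(f"value {v}: binary flips = {binary_flips(v)}, race-logic flips = {race_flips(v)}")
```

The bigger the values, the more the single-edge encoding saves, which is where race logic’s low activity (and hence low energy) comes from.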

    The circuit could be configured to look like a map between your home and workplace and used to solve problems such as finding the most efficient route. Electrical signals travel through various pathways in the circuit. The first one to reach the end of the circuit wins the race, revealing the most efficient route in the process. Since only a single bit flip passes through many elements of the circuit, it has low activity and high efficiency.

    An additional advantage of race logic is that signals that lose the race by moving through slower routes are stopped, further saving energy. Imagine a marathon where you asked the runners not to run the same route, but for each runner to find the most efficient route to the finish line. Once the winner crosses that finish line, all the other runners stop, saving their own energy. If you apply this to a computer, lots of energy is saved over time.

    One important application that race logic is particularly good at solving is the shortest-path problem in networks, such as finding the quickest route from one place to another or determining the lowest number of connections required to link two people on social media. It also forms the basis for complicated network analytics that answer more complex questions, such as finding the highest-traffic nodes in a network, planning paths through a busy city’s streets, modeling the spread of a disease through a population, or finding the fastest way to route information on the internet.
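
    In software, the race can be mimicked with an event-driven simulation in which a priority queue plays the role of time: signals fan out from the start, the first one to reach the target wins, and its arrival time is the shortest-path distance. The graph, node names, and delays below are invented for illustration:

```python
# Shortest path as a race: signals propagate along edges with given delays;
# the first signal to arrive at the target wins, and slower signals that
# reach an already-visited node are stopped (mirroring race logic's
# energy-saving cutoff).
import heapq

def race_shortest_path(graph, start, target):
    """Return the earliest arrival time of any signal at `target`, or None."""
    arrived = set()
    events = [(0, start)]                 # (arrival_time, node), ordered by time
    while events:
        t, node = heapq.heappop(events)
        if node in arrived:
            continue                      # a faster signal already got here; stop this one
        arrived.add(node)
        if node == target:
            return t                      # first arrival wins the race
        for neighbor, delay in graph.get(node, []):
            heapq.heappush(events, (t + delay, neighbor))
    return None                           # no signal ever reaches the target

# Home to work: the 4 + 2 route via the highway beats the direct 7.
city = {
    "home": [("highway", 4), ("work", 7)],
    "highway": [("work", 2)],
}
print(race_shortest_path(city, "home", "work"))  # -> 6
```

In hardware, no queue is needed: physical signal delays enforce the ordering for free, which is precisely race logic’s efficiency advantage.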

    How did I think to connect brain science to a new type of computer circuit?

    The concept of race logic was originally conceived and first practically demonstrated during my Ph.D. thesis work at the University of California-Santa Barbara, guided by the electrical engineering expertise of Professor Dmitri Strukov and the computer science expertise of Professor Tim Sherwood. I am currently working toward exploring mathematical techniques and practical technologies to make this concept even more efficient.

    Using Race Logic to Make the Next Generation of Energy-Efficient Computers

    Next-generation computers are going to look very different from the computers of yesterday. As the quantity and nature of our data gathering changes, the demands from our computing systems must change as well. Hardware that powers tomorrow’s computing applications must keep energy impacts minimal and be good for the planet.

    By being in touch with the latest developments in brain science, next-generation computers can benefit from the recently uncovered secrets of biology and meet the ever-increasing demand for energy-efficient computing hardware.

    Our team at NIST is running a race of our own — to help computing power reach its full potential and protect our planet at the same time.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

     
  • richardmitnick 10:50 am on February 7, 2023 Permalink | Reply
    Tags: "NIST Selects ‘Lightweight Cryptography’ Algorithms to Protect Small Devices", , “Ascon”, The National Institute of Standards and Technology   

    From The National Institute of Standards and Technology: “NIST Selects ‘Lightweight Cryptography’ Algorithms to Protect Small Devices” 

    From The National Institute of Standards and Technology

    2.7.23

    Chad Boutin
    charles.boutin@nist.gov
    (301) 975-4261

    1
    Lightweight cryptography is designed to protect information created and transmitted by the Internet of Things, as well as for other miniature technologies. Credit: N. Hanacek/NIST.

    Lightweight electronics, meet the heavyweight champion for protecting your information: Security experts at the National Institute of Standards and Technology (NIST) have announced a victor in their program to find a worthy defender of data generated by small devices. The winner, a group of cryptographic algorithms called “Ascon”, will be published as NIST’s lightweight cryptography standard later in 2023.

    The chosen algorithms are designed to protect information created and transmitted by the Internet of Things (IoT), including its myriad tiny sensors and actuators. They are also designed for other miniature technologies such as implanted medical devices, stress detectors inside roads and bridges, and keyless entry fobs for vehicles. Devices like these need “lightweight cryptography” — protection that uses the limited amount of electronic resources they possess. According to NIST computer scientist Kerry McKay, the newly selected algorithms should be appropriate for most forms of tiny tech.

    “The world is moving toward using small devices for lots of tasks ranging from sensing to identification to machine control, and because these small devices have limited resources, they need security that has a compact implementation,” she said. “These algorithms should cover most devices that have these sorts of resource constraints.”

    To determine the strongest and most efficient lightweight algorithms, NIST held a development program that took several years, first communicating with industry and other organizations to understand their needs and then requesting potential solutions from the world’s cryptography community in 2018. After receiving 57 submissions, McKay and mathematician Meltem Sönmez Turan managed a multi-round public review process in which cryptographers examined and attempted to find weaknesses in the candidates, eventually whittling them down to 10 finalists before selecting the winner.
    “We considered a number of criteria to be important,” McKay said. “The ability to provide security was paramount, but we also had to consider factors such as a candidate algorithm’s performance and flexibility in terms of speed, size and energy use. In the end we made a selection that was a good all-around choice.”

    Ascon was developed in 2014 by a team of cryptographers from Graz University of Technology, Infineon Technologies, Lamarr Security Research and Radboud University. It was selected in 2019 as the primary choice for lightweight authenticated encryption in the final portfolio of the CAESAR competition, a sign that Ascon had withstood years of examination by cryptographers — a characteristic the NIST team also valued, McKay said.

    There are currently seven members of the Ascon family, some or all of which may become part of NIST’s published lightweight cryptography standard. As a family, the variants give a range of functionality that will offer designers options for different tasks. Two of these tasks, McKay said, are among the most important in lightweight cryptography: authenticated encryption with associated data (AEAD) and hashing.

    AEAD protects the confidentiality of a message, but it also allows extra information — such as the header of a message, or a device’s IP address — to be included without being encrypted. The algorithm ensures that all of the protected data is authentic and has not changed in transit. AEAD can be used in vehicle-to-vehicle communications, and it also can help prevent counterfeiting of messages exchanged with the radio frequency identification (RFID) tags that often help track packages in warehouses.
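
    The AEAD pattern described above can be sketched with Python's standard library alone. This toy encrypt-then-MAC construction is illustrative only — it is not Ascon, not a NIST-approved mode, and not fit for production use — but it shows the key property: the authentication tag covers both the ciphertext and the unencrypted associated data, so tampering with either is detected.

```python
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustrative only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def aead_encrypt(key: bytes, nonce: bytes, plaintext: bytes, associated_data: bytes):
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    # The tag authenticates the associated data AND the ciphertext.
    tag = hmac.new(key, nonce + associated_data + ciphertext, hashlib.sha256).digest()
    return ciphertext, tag

def aead_decrypt(key: bytes, nonce: bytes, ciphertext: bytes, associated_data: bytes, tag: bytes):
    expected = hmac.new(key, nonce + associated_data + ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed: data or header was tampered with")
    return bytes(c ^ k for c, k in zip(ciphertext, _keystream(key, nonce, len(ciphertext))))

key, nonce = os.urandom(32), os.urandom(16)
header = b"device-id:42"  # sent in the clear, but still authenticated
ct, tag = aead_encrypt(key, nonce, b"sensor reading: 21.5 C", header)
print(aead_decrypt(key, nonce, ct, header, tag))  # b'sensor reading: 21.5 C'
```

    In practice, separate keys would be used for encryption and authentication; real AEAD designs such as AES-GCM or Ascon provide both protections in a single, carefully analyzed primitive.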

    Hashing creates a short digital fingerprint of a message that allows a recipient to determine whether the message has changed. In lightweight cryptography, hashing might be used to check whether a software update is appropriate or has downloaded correctly.
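
    The software-update check just described can be shown with the standard library's SHA-256; a constrained device would substitute a lightweight hash such as Ascon's, but the verification logic is the same. The firmware bytes and vendor digest here are invented for illustration.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Short digital fingerprint of a message (SHA-256 here; a constrained
    # device would use a lightweight hash instead).
    return hashlib.sha256(data).hexdigest()

firmware = b"\x7fELF...update-v2.1"       # bytes of a downloaded update (toy data)
published_digest = fingerprint(firmware)  # digest the vendor publishes alongside it

# On the device: re-hash the download and compare before installing.
assert fingerprint(firmware) == published_digest          # intact download
corrupted = firmware[:-1] + b"\x00"
assert fingerprint(corrupted) != published_digest         # corruption detected
print("update verified")
```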

    Currently, the most efficient NIST-approved technique for AEAD is the Advanced Encryption Standard (defined in FIPS 197) used with the Galois/Counter Mode (SP 800-38D), and for hashing, SHA-256 (defined in FIPS 180-4) is widely used. McKay said that these standards remain in effect for general use.

    “The goal of this project is not to replace AES or our hash standards,” she said. “NIST still recommends their use on devices that don’t have the resource constraints that these new algorithms address. There are native instructions in many processors, which support fast, high-throughput implementations. In addition, these algorithms are included in many protocols and should continue to be supported for interoperability purposes.”

    Neither are the new algorithms intended to be used for post-quantum encryption, another current concern of the cryptography community that NIST is working to address using a similar public review process for potential algorithms.

    “One of the Ascon variants offers a measure of resistance to the sort of attack a powerful quantum computer might mount. However, that’s not the main goal here,” McKay said. “Post-quantum encryption is primarily important for long-term secrets that need to be protected for years. Generally, lightweight cryptography is important for more ephemeral secrets.”

    The specification of Ascon includes multiple variants, and the finalized standard may not include all of them. The NIST team plans to work with Ascon’s designers and the cryptography community to finalize the details of standardization. Additional information may be found on NIST’s project website.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology’s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington, D.C., and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterwards, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC, the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the National Bureau of Standards became the National Institute of Standards and Technology in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for housing NIST‑F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments, which they use in many research fields (materials science, fuel cells, biotechnology, etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through the cooperation of the NCWM’s Committee on Specifications and Tolerances and NIST’s Weights and Measures Division (WMD). The book partially fulfills the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the primary use of SI (metric) measurements, as recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 11:30 am on January 26, 2023 Permalink | Reply
    Tags: "NIST Risk Management Framework Aims to Improve Trustworthiness of Artificial Intelligence", AI RMF Playbook, Artificial Intelligence Risk Management Framework (AI RMF 1.0), Compared with traditional software AI poses a number of different risks., Enhancing AI trustworthiness while managing risks based on our democratic values, NIST plans to launch a Trustworthy and Responsible AI Resource Center to help organizations put the AI RMF 1.0 into practice., The AI RMF follows a direction from Congress for NIST to develop the framework and was produced in close collaboration with the private and public sectors., The framework equips organizations to think about AI and risk differently., The framework is part of NIST’s larger effort to cultivate trust in AI technologies., The National Institute of Standards and Technology   

    From The National Institute of Standards and Technology: “NIST Risk Management Framework Aims to Improve Trustworthiness of Artificial Intelligence” 

    From The National Institute of Standards and Technology

    1.26.23

    Credit: N. Hanacek/NIST.

    The U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) has released its Artificial Intelligence Risk Management Framework (AI RMF 1.0), a guidance document for voluntary use by organizations designing, developing, deploying or using AI systems to help manage the many risks of AI technologies.

    The AI RMF follows a direction from Congress for NIST to develop the framework and was produced in close collaboration with the private and public sectors. It is intended to adapt to the AI landscape as technologies continue to develop, and to be used by organizations in varying degrees and capacities so that society can benefit from AI technologies while also being protected from their potential harms.

    “This voluntary framework will help develop and deploy AI technologies in ways that enable the United States, other nations and organizations to enhance AI trustworthiness while managing risks based on our democratic values,” said Deputy Commerce Secretary Don Graves. “It should accelerate AI innovation and growth while advancing — rather than restricting or damaging — civil rights, civil liberties and equity for all.”

    Compared with traditional software, AI poses a number of different risks. AI systems are trained on data that can change over time, sometimes significantly and unexpectedly, affecting the systems in ways that can be difficult to understand. These systems are also “socio-technical” in nature, meaning they are influenced by societal dynamics and human behavior. AI risks can emerge from the complex interplay of these technical and societal factors, affecting people’s lives in situations ranging from their experiences with online chatbots to the results of job and loan applications.

    The framework equips organizations to think about AI and risk differently. It promotes a change in institutional culture, encouraging organizations to approach AI with a new perspective — including how to think about, communicate, measure and monitor AI risks and its potential positive and negative impacts.

    The AI RMF provides a flexible, structured and measurable process that will enable organizations to address AI risks. Following this process for managing AI risks can maximize the benefits of AI technologies while reducing the likelihood of negative impacts to individuals, groups, communities, organizations and society.

    The framework is part of NIST’s larger effort to cultivate trust in AI technologies — necessary if the technology is to be accepted widely by society, according to Under Secretary for Standards and Technology and NIST Director Laurie E. Locascio.

    “The AI Risk Management Framework can help companies and other organizations in any sector and any size to jump-start or enhance their AI risk management approaches,” Locascio said. “It offers a new way to integrate responsible practices and actionable guidance to operationalize trustworthy and responsible AI. We expect the AI RMF to help drive development of best practices and standards.”

    The AI RMF is divided into two parts. The first part discusses how organizations can frame the risks related to AI and outlines the characteristics of trustworthy AI systems. The second part, the core of the framework, describes four specific functions — govern, map, measure and manage — to help organizations address the risks of AI systems in practice. These functions can be applied in context-specific use cases and at any stages of the AI life cycle.
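
    As an informal illustration only (not an official NIST artifact), the four core functions could organize a simple risk log. The function names — govern, map, measure, manage — come from the framework; the fields and example entries below are invented for this sketch.

```python
# Hypothetical sketch: a minimal risk log organized by the AI RMF's four
# core functions. Only the function names come from the framework.
AI_RMF_FUNCTIONS = ("govern", "map", "measure", "manage")

risk_log = {fn: [] for fn in AI_RMF_FUNCTIONS}

def record(function: str, risk: str, action: str) -> None:
    # Reject anything outside the framework's four functions.
    if function not in AI_RMF_FUNCTIONS:
        raise ValueError(f"unknown AI RMF function: {function!r}")
    risk_log[function].append({"risk": risk, "action": action})

record("map", "training data drifts over time",
       "document data provenance and a refresh cadence")
record("measure", "chatbot outputs may be biased",
       "run periodic fairness evaluations")
record("manage", "loan-decision errors harm applicants",
       "define an appeal and rollback process")

for fn in AI_RMF_FUNCTIONS:
    print(fn, len(risk_log[fn]))
```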

    Working closely with the private and public sectors, NIST has been developing the AI RMF for 18 months. The document reflects about 400 sets of formal comments NIST received from more than 240 different organizations on draft versions of the framework. NIST today released statements from some of the organizations that have already committed to use or promote the framework.

    The agency also today released a companion voluntary AI RMF Playbook, which suggests ways to navigate and use the framework.

    NIST plans to work with the AI community to update the framework periodically and welcomes suggestions for additions and improvements to the playbook at any time. Comments received by the end of February 2023 will be included in an updated version of the playbook to be released in spring 2023.

    In addition, NIST plans to launch a Trustworthy and Responsible AI Resource Center to help organizations put the AI RMF 1.0 into practice. The agency encourages organizations to develop and share profiles of how they would put it to use in their specific contexts. Submissions may be sent to AIFramework@nist.gov.

    NIST is committed to continuing its work with companies, civil society, government agencies, universities and others to develop additional guidance. The agency today issued a roadmap for that work.

    The framework is part of NIST’s broad and growing portfolio of AI-related work that includes fundamental and applied research along with a focus on measurement and evaluation, technical standards, and contributions to AI policy.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    five-ways-keep-your-child-safe-school-shootings

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaitherberg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901 in response to a bill proposed by Congressman James H. Southard (R- Ohio) the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use; but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production even operating its own facility to produce optical glass when European supplies were cut off. Between the wars Harry Diamond of the Bureau developed a blind approach radio aircraft landing system. During World War II military research and development was carried out including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time the Standards Western Automatic Computer, was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version- DYSEAC- was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “ The National Institute of Standards and Technology” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST‑F1 which houses an atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).
    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation between the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The book partially fulfills the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 8:20 pm on January 18, 2023 Permalink | Reply
    Tags: "NIST Improves Its Flagship Device for Measuring Mass", "QHARS": quantum Hall array resistance standard, , NIST-4 Kibble balance, The National Institute of Standards and Technology   

    From The National Institute of Standards and Technology: “NIST Improves Its Flagship Device for Measuring Mass” 

    From The National Institute of Standards and Technology

    1.18.23
    Reported and written by Jennifer Lauren Lee

    Media Contact

    Rich Press
    richard.press@nist.gov
    (301) 975-0501

    Technical Contact

    Darine El Haddad
    darine.elhaddad@nist.gov
    (301) 975-6552

    1
    Side view of the NIST-4 Kibble balance.

    2
    The NIST-4 Kibble balance is an electromechanical machine that measures the mass of objects roughly 1 kg. Here, you can see the top of the balance, which includes a wheel that rotates back and forth as the two sides of the balance move. Just visible on the left side of the image are a set of thin electrical wires that connect the electromagnetic coil (not pictured) to other key parts of the balance. Incidentally, the reason the thin wires are coiled like springs instead of being pulled straight is so that as the wheel moves back and forth, the wires stretch without touching each other.
    Credit: Jennifer Lauren Lee/NIST.

    NIST-4, the institute’s flagship Kibble balance, can already measure the mass of objects of roughly 1 kilogram—about as heavy as a quart of milk—as accurately as any device in the world. But now, NIST researchers have further improved its performance by adding a custom-built device that provides an exact definition of electrical resistance. The device, called the quantum Hall array resistance standard (QHARS), consists of a set of smaller devices that use a quirk of quantum physics to generate extremely precise amounts of electrical resistance. The researchers describe their work in Communications Physics [below].

    The improvement should help scientists use their balances to measure masses smaller than 1 kilogram with high accuracy, something no other Kibble balance has done before.

    NIST-4 measurements were used to help scientists redefine the kilogram, the fundamental unit of mass in the International System of Units (SI), in 2019. Everything that must be weighed, from market produce to the ingredients in your cold medicine, relies on this new definition of mass [below].

    The new custom-built QHARS device is an example of a measurement standard — an object or instrument that has some predefined relationship to a physical quantity such as length or time or brightness. The standard in this case is an electrical device that uses quantum principles to generate a precise amount of electrical resistance. This generated resistance then serves as a reference during the Kibble balance’s operation.

    ‘Current’ Dilemma

    The NIST-4 Kibble balance machine works by comparing mechanical force to electromagnetic force. In a nutshell, a mass sits on the balance, and gravity pulls it down. Researchers then pump current through a coil of wire sitting in a magnetic field, and that electrical current pushes the mass upward, effectively levitating it in midair. Scientists measure the amount of current that’s needed to float the object, balancing it exactly. If you can measure the current, you can work out the object’s mass.

    But for this to work, measurement scientists need to know exactly how much current flows through the coil with a high degree of accuracy. They do this by measuring two other easier-to-measure values: the voltage and the resistance.
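    The chain of measurements described above—voltage and resistance to current, current to mass—can be sketched numerically. All the numbers below are illustrative stand-ins, not NIST's measured values; the velocity-mode step that calibrates the field–length product B·L is standard Kibble-balance practice and is not detailed in this article.

```python
# Illustrative sketch of how a Kibble balance turns electrical
# readings into a mass value. All numbers below are invented for
# the example; they are not NIST's measured values.

g = 9.80101   # local gravitational acceleration, m/s^2 (assumed)

# "Moving" mode: the coil is swept through the magnetic field at a
# known velocity v and the induced voltage U is recorded. This
# calibrates the product B*L (field strength x effective wire
# length) without measuring either factor directly.
v = 1.0e-3                  # coil velocity, m/s
U = 1.4                     # induced voltage, V
BL = U / v                  # effective B*L, T*m

# "Weighing" mode: a current I through the coil balances the weight
# of the mass, so m*g = B*L*I. The current itself comes from Ohm's
# law, I = V_R / R, using the quantum voltage and resistance
# standards described in the article.
V_R = 0.69497               # voltage across the reference resistor, V
R = 992.8                   # reference resistance, ohms
I = V_R / R                 # balancing current, A (about 700 microamperes)

m = BL * I / g              # inferred mass, kg (about 100 g)
print(f"I = {I * 1e6:.0f} uA, m = {m * 1e3:.1f} g")
```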

    A quantum voltage standard is already integrated into the device. But the quantum resistance standard could not be used directly because the traditional device, made of gallium arsenide (GaAs), cannot function correctly with the relatively large amounts of current needed to levitate a macro-scale object like a 50- or 100- or even 1,000-gram mass. So instead, the GaAs device was used separately to calibrate a conventional resistor, which was then inserted into NIST-4 and used in the actual measurement.

    3
    A close-up of the array of units that make up a QHARS device. The electronic chip that sits on the gold platform is about 7 mm long and 3.5 mm wide, roughly the size of a pencil eraser. Credit: Alireza R. Panna/NIST.

    New QHARS to the Rescue

    To address this problem, NIST has been designing and testing a new type of quantum resistance device: the QHARS. Instead of GaAs, this instrument is made of graphene—a sheet of carbon a single atom thick that has for many years been a hot topic for its promise in a variety of uses, including faster and more flexible electronics.

    The new graphene QHARS developed at NIST passes current through an array of 13 smaller elements in parallel. These elements work based on the quantum Hall effect, in which electrical resistance is “quantized”—that is, it can take on only a few very specific and predictable values. That makes the device a resistance standard that is accurate on a quantum level.

    Using 13 quantum Hall resistor units together further increases the amount of current the new QHARS can handle.
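    The quantized resistances involved follow directly from fundamental constants, which is what makes the array a standard rather than just a resistor. The sketch below assumes the commonly used ν = 2 quantum Hall plateau and a simple 13-in-parallel topology; the actual array design is specified in the paper.

```python
# Why a quantum Hall array is a natural resistance standard: on a
# Hall plateau the resistance depends only on fundamental constants.
# The nu = 2 plateau and the simple 13-in-parallel topology are
# assumptions for illustration; the paper specifies the real design.

h = 6.62607015e-34    # Planck constant, J*s (exact SI value)
e = 1.602176634e-19   # elementary charge, C (exact SI value)

R_K = h / e**2        # von Klitzing constant, ~25812.807 ohms

nu = 2                        # assumed filling factor (plateau index)
R_element = R_K / nu          # one array element, ~12906.4 ohms

n = 13                        # elements wired in parallel
R_array = R_element / n       # same quantized accuracy, 13x the current capacity

print(f"R_K = {R_K:.3f} ohm, array = {R_array:.3f} ohm")
```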

    “We need about 700 microamperes [millionths of an ampere] flowing in the coil to levitate a 100-gram mass,” said NIST’s Darine El Haddad. “In the gallium arsenide resistance standard, you can’t do that.”

    To prove this new quantum resistance standard could work in NIST-4, El Haddad and her team used multiple QHARS devices, one at a time, and compared their results indirectly to the GaAs quantum resistance standard. The results for the 50-gram mass measurements all closely agreed with one another—“it’s as good as it gets,” El Haddad said.

    Future models of the new resistance standard might see further improvements. To work, both the traditional GaAs device and the graphene QHARS must be cooled to just a few degrees above absolute zero and placed in a strong magnetic field. Someday, a QHARS-style device could be developed to work at room temperature and zero magnetic field, which would make the whole system much more compact.

    Also, unlike the old resistance standard, a next-generation QHARS could be programmable, meaning the instrument would be more versatile: Scientists could use one device to generate different amounts of resistance depending on what they needed for a particular experiment.

    “A quantum resistance standard that is programmable and that works at room temperature with a low magnetic field: this is what the physicists are trying to push for,” El Haddad said.

    Science paper:
    Communications Physics
    See the science paper for instructive material with images.
    Science article:
    new definition of mass
    See the science article for instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901 in response to a bill proposed by Congressman James H. Southard (R- Ohio) the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use; but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production even operating its own facility to produce optical glass when European supplies were cut off. Between the wars Harry Diamond of the Bureau developed a blind approach radio aircraft landing system. During World War II military research and development was carried out including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time the Standards Western Automatic Computer, was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version- DYSEAC- was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “ The National Institute of Standards and Technology” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.


     
  • richardmitnick 3:42 pm on January 3, 2023 Permalink | Reply
    Tags: "Chip circuit for light could be applied to quantum computations", , , Microchip quantum dots—artificial atoms that generate photons on-demand when illuminated by a laser—with miniature circuits guiding the light without significant loss of intensity., Optical communications, , , , , The circuit allows for significant time delays because it employs waveguides of various lengths that can store photons for relatively long periods of time., The longer delay times with the new circuit are important for operations in which photons from one or more quantum dots need to arrive at a specific location at equally spaced time intervals., The National Institute of Standards and Technology, The prototype circuits have a loss of intensity equal to only one percent of similar circuits., The researchers fabricated silicon-nitride waveguides-the channels through which the photons traveled—and buried them in silicon dioxide., These photon qubits—all of which travel at the same speed and are indistinguishable from each other—must simultaneously arrive at specific processing nodes in the circuit., Using light rather than electric charges to store and carry information.   

    From The National Institute of Standards and Technology Via “phys.org” : “Chip circuit for light could be applied to quantum computations” 

    From The National Institute of Standards and Technology

    Via

    “phys.org”

    1.3.23

    1
    Future versions of the new photonic circuits will feature low-loss waveguides—the channels through which the single photons travel–some 3 meters long but tightly coiled to fit on a chip. The long waveguides will allow researchers to more precisely choose the time intervals (Δt) when photons exit different channels to rendezvous at a particular location. Credit: NIST.

    The ability to transmit and manipulate, with minimal loss, the smallest unit of light—the photon—plays a pivotal role in optical communications as well as designs for quantum computers that would use light rather than electric charges to store and carry information.

    Now, researchers at the National Institute of Standards and Technology (NIST) and their colleagues have connected, on a single microchip, quantum dots—artificial atoms that generate individual photons rapidly and on demand when illuminated by a laser—with miniature circuits that can guide the light without significant loss of intensity.

    To create the ultra-low-loss circuits, the researchers fabricated silicon-nitride waveguides—the channels through which the photons traveled—and buried them in silicon dioxide. The channels were wide but shallow, a geometry that reduced the likelihood that photons would scatter out of the waveguides. Encapsulating the waveguides in silicon dioxide also helped to reduce scattering.

    The scientists reported that their prototype circuits lose only one percent as much intensity as similar circuits—also using quantum dots—fabricated by other teams.

    Ultimately, devices that incorporate this new chip technology could take advantage of the strange properties of quantum mechanics to perform complex computations that classical (non-quantum) circuits may not be capable of doing.

    2
    Illustration shows some of the steps in creating the new ultra-low-loss photonic circuit on a chip. A microprobe lifts a gallium arsenide device containing a quantum dot—an artificial atom that generates single photons—from one chip. Then the probe places the quantum-dot device atop a low-loss silicon-nitride waveguide built on another chip. Credit: S. Kelley/NIST.

    For instance, according to the laws of quantum mechanics, a single photon has some probability of residing in two different places—such as two different waveguides—at the same time. Those probabilities can be used to store information: an individual photon can act as a quantum bit, or qubit, which can hold a superposition of values, unlike the binary bit of a classical computer, which is limited to 0 or 1.
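    This “dual-rail” picture—one photon shared between two waveguides—can be sketched with two amplitudes. The 50/50 beamsplitter below is a textbook element chosen for illustration, not a component from the reported circuit.

```python
import math

# Toy "dual-rail" qubit: one photon shared between two waveguides,
# tracked as a pair of amplitudes. The 50/50 beamsplitter is a
# textbook element, not a component from the reported circuit.

# |state> = alpha * |photon in guide 0> + beta * |photon in guide 1>
alpha, beta = 1.0, 0.0          # photon launched into waveguide 0

# A 50/50 beamsplitter mixes the two guides (a Hadamard-like map):
s = 1 / math.sqrt(2)
alpha, beta = s * (alpha + beta), s * (alpha - beta)

p0, p1 = alpha**2, beta**2      # detection probabilities per guide
print(p0, p1)                   # both ~0.5: the photon spans both guides
```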

    To perform the operations necessary to solve computational problems, these photon qubits—all of which travel at the same speed and are indistinguishable from one another—must arrive simultaneously at specific processing nodes in the circuit. That poses a challenge because photons originating from different locations in the circuit—and traveling along different waveguides—may lie at significantly different distances from the processing points. To ensure simultaneous arrival, photons emitted closer to the designated destination must be delayed, giving those in more distant waveguides a head start.

    The circuit devised by NIST researchers including Ashish Chanana and Marcelo Davanco, along with an international team of colleagues, allows for significant time delays because it employs waveguides of various lengths that can store photons for relatively long periods of time. For instance, the researchers calculate that a 3-meter-long waveguide (tightly coiled so its diameter on a chip is only a few millimeters) would have a 50 percent probability of transmitting a photon with a time delay of 20 nanoseconds (billionths of a second). By comparison, previous devices, developed by other teams and operating under similar conditions, were limited to inducing time delays only one one-hundredth as long.
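    These figures can be sanity-checked with a back-of-envelope calculation, assuming a group index of about 2 for the silicon-nitride waveguide (an assumption for illustration; the paper gives the actual value). The 50 percent transmission over 3 meters quoted above corresponds to roughly 1 dB of loss per meter.

```python
import math

# Back-of-envelope check of the quoted figures. The group index of
# the silicon-nitride waveguide is an assumption (~2.0); the 50%
# transmission over 3 meters is taken from the text.

c = 299_792_458     # speed of light in vacuum, m/s
n_g = 2.0           # assumed group index of the SiN waveguide
L = 3.0             # coiled waveguide length, m

delay = n_g * L / c                         # ~20 ns photon storage time
loss_db_per_m = -10 * math.log10(0.5) / L   # ~1 dB/m for 50% transmission

print(f"delay = {delay * 1e9:.1f} ns, loss = {loss_db_per_m:.2f} dB/m")
```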

    The longer delay times achieved with the new circuit are also important for operations in which photons from one or more quantum dots need to arrive at a specific location at equally spaced time intervals. In addition, the low-loss quantum-dot circuit could dramatically increase the number of single photons available for carrying quantum information on a chip, enabling larger, speedier, and more reliable computational and information-processing systems.

    The scientists, who include researchers from The University of California-Santa Barbara, the Massachusetts Institute of Technology, the Korea Institute of Science and Technology and the University of São Paulo in Brazil, reported their findings December 11 in Nature Communications [below].

    The hybrid circuit consists of two components, each initially built on a separate chip. One, a gallium arsenide semiconductor device designed and fabricated at NIST, hosts the quantum dots and directly funnels the single photons they generate into a second device—a low-loss silicon nitride waveguide developed at The University of California-Santa Barbara.

    To marry the two components, researchers at MIT first used the fine metal tip of a pick-and-place microprobe, acting like a miniature crowbar, to pry the gallium arsenide device from the chip built at NIST. They then placed it atop the silicon nitride circuit on the other chip.

    The researchers face several challenges before the hybrid circuit can be routinely employed in a photonic device. At present, only about 6 percent of the individual photons generated by the quantum dots can be funneled into the circuit. However, simulations suggest that if the team changes the angle at which the photons are funneled, in tandem with improvements in the positioning and orientation of the quantum dots, the rate could rise above 80 percent.

    Another issue is that the quantum dots do not always emit single photons at exactly the same wavelength, a requirement for creating the indistinguishable photons necessary for the quantum computational operations. The team is exploring several strategies, including applying a constant electric field to the dots, that may alleviate that problem.

    Science paper:
    Nature Communications
    See the science paper for instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    five-ways-keep-your-child-safe-school-shootings

    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaitherberg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901 in response to a bill proposed by Congressman James H. Southard (R- Ohio) the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterward, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC, the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer (SWAC) was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the National Bureau of Standards became the National Institute of Standards and Technology in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST‑F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.
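    As a back-of-the-envelope sketch (not NIST code), the cesium definition of the second turns timekeeping into a counting problem:

```python
# The SI second is defined as exactly 9,192,631,770 cycles of the
# cesium-133 hyperfine transition -- the resonance NIST-F1 measures.
CS_HYPERFINE_HZ = 9_192_631_770  # exact, by definition of the second

def elapsed_seconds(cycles_counted: int) -> float:
    """Convert a count of cesium resonance cycles to elapsed seconds."""
    return cycles_counted / CS_HYPERFINE_HZ

# Counting exactly 9,192,631,770 cycles marks one second.
print(elapsed_seconds(9_192_631_770))  # -> 1.0
```

    In practice the fountain clock steers an electronic oscillator to this resonance; the conversion above is the definitional core of the measurement.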

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The book partially fulfills the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 8:48 pm on December 8, 2022 Permalink | Reply
    Tags: "Unveiling the Universe - In Four New Studies NIST Explores Novel Ways to Hunt Dark Matter", , , , , , The National Institute of Standards and Technology   

    From The National Institute of Standards and Technology: “Unveiling the Universe – In Four New Studies NIST Explores Novel Ways to Hunt Dark Matter” 

    From The National Institute of Standards and Technology

    12.8.22

    Media Contact
    Rich Press
    richard.press@nist.gov
    (301) 975-0501

    Technical Contact
    Jacob Taylor
    jacob.taylor@nist.gov
    (301) 975-8586

    Marianna Safronova
    marianna.safronova@nist.gov

    For decades, astronomers and physicists have been trying to solve one of the deepest mysteries about the cosmos: An estimated 85% of its mass is missing. Numerous astronomical observations indicate that the visible mass in the universe is not nearly enough to hold galaxies together and account for how matter clumps. Some kind of invisible, unknown type of subatomic particle, dubbed dark matter, must provide the extra gravitational glue.

    In underground laboratories and at particle accelerators, scientists have been searching for this dark matter with no success for more than 30 years.
    __________________________________
    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did most of the work on dark matter some 30 years later.

    Fritz Zwicky.

    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.

    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as the regions near their centers, whereas they should rotate more slowly, the way distant planets orbit the Sun more slowly than nearby ones. The only way to explain this is if the whole visible galaxy is embedded in some much larger structure, as if it were only the label on an LP, so to speak, which would account for the consistent rotation speed from center to edge.
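    A toy calculation (with an assumed visible mass, not a value from the article) of the rotation curve expected from visible matter alone, which falls with distance instead of staying flat as Rubin observed:

```python
import math

# Keplerian orbital speed around the *visible* mass of a galaxy.
# M_VISIBLE is an illustrative assumption (~1e11 solar masses).
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 2e41       # assumed visible galactic mass, kg
KPC = 3.086e19         # meters per kiloparsec

def keplerian_speed(radius_m: float) -> float:
    """Orbital speed if only the visible central mass were present."""
    return math.sqrt(G * M_VISIBLE / radius_m)

# Expected speeds fall as 1/sqrt(r); observed curves stay roughly flat.
for r_kpc in (5, 10, 20, 40):
    v_km_s = keplerian_speed(r_kpc * KPC) / 1000.0
    print(f"{r_kpc:>3} kpc: {v_km_s:6.1f} km/s")
```

    The printed speeds halve each time the radius quadruples; the roughly constant speeds Rubin measured imply extra unseen mass at large radii.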

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra, worked on Dark Matter(Emilio Segre Visual Archives AIP SPL).

    Dark Matter Research

    Super Cryogenic Dark Matter Search from DOE’s SLAC National Accelerator Laboratory at Stanford University at SNOLAB (Vale Inco Mine, Sudbury, Canada).

    LBNL LZ Dark Matter Experiment xenon detector at Sanford Underground Research Facility Credit: Matt Kapust.


    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment Dark Matter project at SURF, Lead, SD.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment U Washington. Credit: Mark Stone U. of Washington. Axion Dark Matter Experiment.

    The University of Western Australia ORGAN Experiment’s main detector. A small copper cylinder called a “resonant cavity” traps photons generated during dark matter conversion. The cylinder is bolted to a “dilution refrigerator” which cools the experiment to very low temperatures.
    __________________________________

    Researchers at NIST are now exploring new ways to search for the invisible particles. In one study, a prototype for a much larger experiment, researchers have used state-of-the-art superconducting detectors to hunt for dark matter. The study has already placed new limits on the possible mass of one type of hypothesized dark matter. Another NIST team has proposed that trapped electrons, commonly used to measure properties of ordinary particles, could also serve as highly sensitive detectors of hypothetical dark matter particles if they carry charge.

    In the superconducting detector study, NIST scientists Jeff Chiles and Sae Woo Nam and their collaborators used tungsten silicide superconducting nanowires only one-thousandth the width of a human hair as dark-matter detectors.

    “Superconducting” refers to a property that some materials, such as tungsten silicide, have at ultralow temperatures: zero resistance to the flow of electrical current. Systems of such wires, formally known as superconducting nanowire single-photon detectors (SNSPDs), are exquisitely sensitive to extremely small amounts of energy imparted by photons (particles of light) and perhaps dark matter particles when they collide with the detectors.
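    For scale (illustrative only; the article does not give the detector’s energy threshold), a quick conversion shows why single-photon sensitivity matters for a sub-eV dark-photon search:

```python
# Photon energy at a wavelength SNSPDs routinely detect, expressed in eV,
# the same units used for hypothesized dark-photon masses.
H = 6.62607015e-34    # Planck constant, J*s (exact)
C = 2.99792458e8      # speed of light, m/s (exact)
EV = 1.602176634e-19  # joules per electronvolt (exact)

def photon_energy_ev(wavelength_m: float) -> float:
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / wavelength_m / EV

# A 1550 nm telecom photon carries ~0.8 eV -- the same order as the
# sub-eV dark-photon masses such searches target.
print(round(photon_energy_ev(1550e-9), 3))
```

    A detector that registers individual ~1 eV photons can thus plausibly register the comparable energy deposited by an absorbed sub-eV dark photon.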

    Although the experiment would have to be performed on a larger scale with many more detectors to provide an expanded dataset, it is still the most sensitive search for dark photons performed to date in this mass range, Nam said. The researchers, including collaborators from the Massachusetts Institute of Technology, Stanford University, University of Washington, New York University and the Flatiron Institute, reported their results in an article in Physical Review Letters [below] posted on June 10.

    In a second report, some of the same NIST researchers and their collaborators analyzed data from the first study in a different way. The scientists ignored potential effects of the stack of insulating material and focused only on whether any kind of dark matter particle would be capable of interacting with individual electrons in the nanowire detector itself, either by scattering off an electron or being absorbed by it. Although small, this study has placed the strongest limits of any experiment to date (excluding astrophysical searches and studies of the sun) on the strength of interactions between electrons and dark matter in the sub-million-eV mass range. That makes it likely that a scaled-up version of the SNSPD setup could make a significant contribution to the search for dark matter, said Chiles. He and his colleagues from the Hebrew University of Jerusalem, the University of California-Santa Cruz and its Santa Cruz Institute for Particle Physics, and MIT reported this analysis in an article in the Dec. 8 edition of Physical Review D [below].

    In a third study, a NIST physicist and his colleagues proposed that single electrons, electromagnetically confined to a small region of space, could be sensitive detectors of charged dark matter particles. For more than three decades, scientists have used a much heavier population of positively charged beryllium ions to probe the electric and magnetic properties of ordinary (non-dark) charged particles. Electrons, however, would make ideal detectors for sensing dark matter particles if those particles have even the slightest electric charge. That’s because electrons have the lowest mass of any charged particle known and therefore are easily pushed or pulled by the merest electrical disturbance, such as a particle with a small electric charge passing nearby.

    Only a few single trapped electrons would be needed to detect charged dark matter particles with only one-hundredth the charge of an electron, said NIST physicist Jake Taylor, a fellow of the Joint Quantum Institute and the Joint Center for Quantum Information and Computer Science, research partnerships between NIST and the University of Maryland. The electromagnetically trapped electrons would be cooled to a fraction of a degree above absolute zero in order to limit the particles’ inherent jitter.

    Taylor, along with Daniel Carney of the DOE’s Lawrence Berkeley National Laboratory in California, Hartmut Haffner of the University of California-Berkeley, and David C. Moore of Yale University, described their proposed experiment in a Physical Review Letters [below] article posted online last August. By configuring the trap so that the strength of the electron’s confinement is different along each dimension (length, width and height), the trap could potentially also provide information about the direction from which the dark matter particle arrived.

    However, scientists must grapple with a technological challenge before they can employ electron trapping to search for dark matter. Photons are used to cool, manipulate and sense the motion of trapped ions and electrons. For beryllium ions, those photons, generated by a laser, fall in the range of visible light, and the technology that enables visible-light photons to manipulate trapped beryllium ions is well established. In contrast, the photons required to sense the motion of single electrons have microwave energies, and the necessary detection technology has yet to be perfected. Still, if interest in the project is strong enough, scientists might develop an electron trap capable of detecting dark matter in less than five years, Carney estimated.
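    A rough classical sketch, not taken from the paper, of why a trapped electron is such a sensitive antenna: the impulse approximation for a passing millicharged particle. All numbers below (charge fraction, impact parameter, speed, trap frequency) are illustrative assumptions:

```python
import math

# Impulse-approximation estimate of the kick a trapped electron receives
# from a passing particle carrying a small fraction of the electron charge.
K = 8.9875517923e9       # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602176634e-19  # elementary charge, C
M_E = 9.1093837015e-31   # electron mass, kg

def coulomb_kick(charge_fraction, impact_param_m, speed_m_s):
    """Momentum transferred to the electron by a particle of charge
    charge_fraction * e passing at the given impact parameter and speed
    (classical impulse approximation: dp ~ 2*k*q1*q2 / (b*v))."""
    q = charge_fraction * E_CHARGE
    return 2 * K * q * E_CHARGE / (impact_param_m * speed_m_s)

# A particle with 1/100 the electron's charge passing 1 micron away at
# a galactic virial speed of ~230 km/s:
dp = coulomb_kick(0.01, 1e-6, 2.3e5)
# Oscillation amplitude induced in an assumed 1 GHz harmonic trap:
dx = dp / (M_E * 2 * math.pi * 1e9)
print(f"kick {dp:.2e} kg*m/s -> amplitude {dx:.2e} m")
```

    Even this crude estimate gives a nanometer-scale displacement, far larger than the quantum ground-state motion of a cold trapped electron, which is why only a few electrons could suffice as detectors.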

    In the fourth study, a NIST researcher and an international group of colleagues are looking beyond Earth to hunt for dark matter. A team that includes Marianna Safronova of the University of Delaware and the Joint Quantum Institute has proposed that a new generation of atomic clocks, installed on a spacecraft that would fly closer to the Sun than Mercury’s orbit, could search for signs of ultralight dark matter. This hypothetical type of dark matter, bound to a halo surrounding the Sun, would cause tiny variations in the fundamental constants of nature, including the mass of the electron and the fine structure constant.

    Changes in these constants would alter the frequency at which atomic clocks vibrate, the rate at which they “tick.” Among the large variety of atomic clocks, researchers would carefully choose two that have different sensitivities to changes in the fundamental constants driven by ultralight dark matter. By measuring the ratio of the two varying frequencies, scientists could reveal the presence of the dark matter, the researchers calculated. They describe their analysis in an article posted online Dec. 5 in Nature Astronomy [below].
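    A hedged toy model of the proposed measurement: two hypothetical clocks with different (assumed) sensitivity coefficients to an oscillating fine-structure constant. Shifts common to both clocks drop out of the frequency ratio, leaving only the difference in sensitivities:

```python
import math

# Toy model: assumed sensitivity coefficients and oscillation parameters,
# not values from the Nature Astronomy paper.
K_A, K_B = 2.0, 0.1            # hypothetical sensitivities of clocks A, B
AMPLITUDE = 1e-16              # assumed fractional oscillation of alpha
OMEGA = 2 * math.pi / 3600.0   # assumed 1-hour oscillation period, rad/s

def frequency_ratio(t: float) -> float:
    """Fractional shift of the A/B clock-frequency ratio at time t.
    If each clock shifts as df/f = K * (dalpha/alpha), the ratio
    shifts by (K_A - K_B) * (dalpha/alpha)."""
    dalpha = AMPLITUDE * math.cos(OMEGA * t)
    return (K_A - K_B) * dalpha

# The ratio oscillates with peak fractional amplitude (K_A - K_B)*AMPLITUDE.
peak = max(abs(frequency_ratio(t)) for t in range(0, 3600, 10))
print(f"{peak:.2e}")
```

    The design choice is the same one the researchers describe: comparing two clocks with different sensitivities isolates the dark-matter signal from drifts that affect both clocks equally.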

    Science papers:
    Physical Review Letters
    Physical Review D
    Physical Review Letters 2021
    Nature Astronomy

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

     
  • richardmitnick 11:58 am on November 12, 2022 Permalink | Reply
    Tags: "Collaboration achieves record level of radio frequency signal synthesis with quantum-based accuracy", , RF Josephson arbitrary waveform synthesizer, , The National Institute of Standards and Technology   

    From The National Institute of Standards and Technology And The University of Colorado-Boulder Via “TechXplore” at “Science X”: “Collaboration achieves record level of radio frequency signal synthesis with quantum-based accuracy” 

    From The National Institute of Standards and Technology

    And

    U Colorado

    The University of Colorado-Boulder

    Via

    “TechXplore” at “Science X”

    11.11.22

    Credit: National Institute of Standards and Technology.

    NIST, in collaboration with University of Colorado-Boulder faculty, published a paper titled “RF Josephson Arbitrary Waveform Synthesizer with Integrated Superconducting Diplexers”, demonstrating results that mark a significant step toward a broadband, integrated, quantum-based microwave voltage source with useful power above -30 dBm.

    This milestone creates new opportunities for improving measurements of high-accuracy RF voltage and power for modern high-speed communications components and instruments.

    NIST’s goal is to advance quantum-based standards for RF communications to eliminate costs and overhead in calibration and traceability chain measurements by providing self-calibrated, quantum-based standards and automated measurement capability to communication and instrument manufacturers.

    The team is developing a quantum-defined superconducting programmable voltage source for generating microwave-frequency waveforms. The voltage source is an RF Josephson arbitrary waveform synthesizer (RF-JAWS) that utilizes a superconducting integrated circuit that is cooled to 4 K and is composed of an array of 4,500 Josephson junctions.

    The researchers incorporated on-chip superconducting diplexers into the RF-JAWS circuit to achieve an open-circuit signal of 22 mV rms at 1.005 GHz. Thanks to a broader passband and lower loss, the integrated filtering enables microwave amplitudes 25% larger than the previous state of the art.

    Measurements of the new circuit showed that it correctly synthesized the RF waveform with a signal amplitude that was based on quantum effects.
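The “quantum-based” amplitude comes from the Josephson effect: each current pulse drives every junction in the series array through a phase slip that transfers exactly one magnetic flux quantum, so the synthesized voltage depends only on the pulse rate, the junction count, and fundamental constants. As a rough, hedged illustration (assuming one flux quantum per junction per pulse, a simplification not spelled out in the article), the reported 22 mV rms amplitude implies a mean pulse rate of a few gigapulses per second:

```python
# Back-of-the-envelope sketch of JAWS voltage quantization. Only the junction
# count (4,500) and output amplitude (22 mV rms) are from the article; the
# one-flux-quantum-per-junction-per-pulse model is a simplification.
import math

# Magnetic flux quantum h/2e in webers, from the SI defined values of h and e
PHI_0 = 6.62607015e-34 / (2 * 1.602176634e-19)

def jaws_voltage(n_junctions: int, pulse_rate_hz: float) -> float:
    """Time-averaged voltage of a series JAWS array (volts)."""
    return n_junctions * PHI_0 * pulse_rate_hz

n_junctions = 4500                       # array size reported in the article
target_rms = 22e-3                       # 22 mV rms at 1.005 GHz
target_peak = target_rms * math.sqrt(2)  # peak of the sine envelope

# Mean pulse rate needed to reach the reported peak amplitude
rate = target_peak / (n_junctions * PHI_0)
print(f"flux quantum: {PHI_0:.4e} Wb")
print(f"required mean pulse rate: {rate / 1e9:.2f} Gpulses/s")
```

Because the flux quantum is built from defined constants, the output voltage is traceable to the SI by construction, which is the sense in which the synthesizer is “self-calibrated.”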

    The paper is published in IEEE Transactions on Applied Superconductivity.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Colorado Campus

    The University of Colorado-Boulder, the flagship university of the state of Colorado, was founded in 1876, five months before Colorado became a state. It is a dynamic community of scholars and learners situated on one of the most spectacular college campuses in the country, and is classified as an R1 university, meaning that it engages in a very high level of research activity. As one of 34 U.S. public institutions belonging to the prestigious Association of American Universities, a selective group of major research universities in North America (and the only member in the Rocky Mountain region), we have a proud tradition of academic excellence, with five Nobel laureates and more than 50 members of prestigious academic academies.

    University of Colorado-Boulder has blossomed in size and quality since we opened our doors in 1877 – attracting superb faculty, staff, and students and building strong programs in the sciences, engineering, business, law, arts, humanities, education, music, and many other disciplines.

    Today, with our sights set on becoming the standard for the great comprehensive public research universities of the new century, we strive to serve the people of Colorado and to engage with the world through excellence in our teaching, research, creative work, and service.

    In 2015, the university comprised nine colleges and schools and offered over 150 academic programs and enrolled almost 17,000 students. Five Nobel Laureates, nine MacArthur Fellows, and 20 astronauts have been affiliated with CU Boulder as students; researchers; or faculty members in its history. In 2010, the university received nearly $454 million in sponsored research to fund programs like the Laboratory for Atmospheric and Space Physics and JILA. CU Boulder has been called a Public Ivy, a group of publicly funded universities considered as providing a quality of education comparable to those of the Ivy League.

    The Colorado Buffaloes compete in 17 varsity sports and are members of the NCAA Division I Pac-12 Conference. The Buffaloes have won 28 national championships: 20 in skiing, seven total in men’s and women’s cross country, and one in football. The university has produced a total of ten Olympic medalists. Approximately 900 students participate in 34 intercollegiate club sports annually as well.

    On March 14, 1876, the Colorado territorial legislature passed an amendment to the state constitution that provided money for the establishment of the University of Colorado in Boulder, the Colorado School of Mines in Golden, and the Colorado State University – College of Agricultural Sciences in Fort Collins.

    Two cities competed for the site of the University of Colorado: Boulder and Cañon City. The consolation prize for the losing city was to be home of the new Colorado State Prison. Cañon City was at a disadvantage as it was already the home of the Colorado Territorial Prison. (There are now six prisons in the Cañon City area.)

    The cornerstone of the building that became Old Main was laid on September 20, 1875. The doors of the university opened on September 5, 1877. At the time, there were few high schools in the state that could adequately prepare students for university work, so in addition to the University, a preparatory school was formed on campus. In the fall of 1877, the student body consisted of 15 students in the college proper and 50 students in the preparatory school. There were 38 men and 27 women, and their ages ranged from 12–23 years.

    During World War II, Colorado was one of 131 colleges and universities nationally that took part in the V-12 Navy College Training Program which offered students a path to a navy commission.

    University of Colorado-Boulder hired its first female professor, Mary Rippon, in 1878. It hired its first African-American professor, Charles H. Nilon, in 1956, and its first African-American librarian, Mildred Nilon, in 1962. Its first African American female graduate, Lucile Berkeley Buchanan, received her degree in 1918.

    Research institutes

    University of Colorado-Boulder’s research mission is supported by eleven research institutes within the university. Each research institute supports faculty from multiple academic departments, allowing institutes to conduct truly multidisciplinary research.

    The Institute for Behavioral Genetics (IBG) is a research institute within the Graduate School dedicated to conducting and facilitating research on the genetic and environmental bases of individual differences in behavior. After its founding in 1967 IBG led the resurging interest in genetic influences on behavior. IBG was the first post-World War II research institute dedicated to research in behavioral genetics. IBG remains one of the top research facilities for research in behavioral genetics, including human behavioral genetics, psychiatric genetics, quantitative genetics, statistical genetics, and animal behavioral genetics.

    The Institute of Cognitive Science (ICS) at CU Boulder promotes interdisciplinary research and training in cognitive science. ICS is highly interdisciplinary; its research focuses on education, language processing, emotion, and higher level cognition using experimental methods. It is home to a state-of-the-art fMRI system used to collect neuroimaging data.

    ATLAS Institute is a center for interdisciplinary research and academic study, where engineering, computer science and robotics are blended with design-oriented topics. Part of CU Boulder’s College of Engineering and Applied Science, the institute offers academic programs at the undergraduate, master’s and doctoral levels, and administers research labs, hacker and makerspaces, and a black box experimental performance studio. At the beginning of the 2018–2019 academic year, approximately 1,200 students were enrolled in ATLAS academic programs and the institute sponsored six research labs.

    In addition to IBG, ICS and ATLAS, the university’s other institutes include Biofrontiers Institute, Cooperative Institute for Research in Environmental Sciences, Institute of Arctic & Alpine Research (INSTAAR), Institute of Behavioral Science (IBS), JILA, Laboratory for Atmospheric & Space Physics (LASP), Renewable & Sustainable Energy Institute (RASEI), and the University of Colorado Museum of Natural History.

    NIST Campus, Gaithersburg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high-performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington, D.C., and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and, shortly afterwards, the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the National Bureau of Standards became the National Institute of Standards and Technology in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for housing NIST‑F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility has been a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, and are used as calibration standards for measuring equipment and procedures, as quality control benchmarks for industrial processes, and as experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The book partially fulfills the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 1:52 pm on November 4, 2022 Permalink | Reply
    Tags: "Entering a New Phase - NIST Technique Simultaneously Locates Multiple Defects on Microchip Circuits", A break or defect in a wire will show up as an abrupt change in the vibration of the tip., An "AFM" features an ultrasharp tip attached to a tiny cantilever that vibrates like a diving board., Defective computer chips are the bane of the semiconductor industry., Researchers at the National Institute of Standards and Technology have developed a method that can simultaneously locate individual electrical flaws in multiple microcircuits on the same chip., The National Institute of Standards and Technology, The technique relies on a relatively inexpensive and common imaging tool-an atomic force microscope., Using a test chip featuring four pairs of wires buried 4 micrometers beneath the surface the scientists demonstrated that their technique produced clear and accurate images of defects.   

    From The National Institute of Standards and Technology: “Entering a New Phase – NIST Technique Simultaneously Locates Multiple Defects on Microchip Circuits” 

    From The National Institute of Standards and Technology

    11.3.22

    Media Contact
    Ben P. Stein
    benjamin.stein@nist.gov
    (301) 975-2763

    Technical Contacts
    Joseph J. Kopanski
    joseph.kopanski@nist.gov
    (301) 975-2089

    Evgheni Strelcov
    evgheni.strelcov@nist.gov
    (301) 975-3573

    Defective computer chips are the bane of the semiconductor industry. Even a seemingly minor flaw in a chip packed with billions of electrical connections might cause a critical operation in a computer or other sensitive electronic device to fail.

    By modifying an existing technique for identifying defects, researchers at the National Institute of Standards and Technology (NIST) have developed a method that can simultaneously locate individual electrical flaws in multiple microcircuits on the same chip. Because the technique relies on a relatively inexpensive and common imaging tool, an atomic force microscope (AFM), it may provide a new way to test the interconnected wiring of computer chips in the factory.

    An AFM features an ultrasharp tip attached to a tiny cantilever that vibrates like a diving board. In the standard mode of operation, scientists apply an AC (alternating current) voltage to the tip as it scans across individual wires buried in parallel several micrometers (millionths of a meter) below the surface of a silicon chip. The voltage difference between the tip and each wire generates an electric force revealed as changes in the frequency or amplitude (height) of the vibrating tip. A break or defect in a wire will show up as an abrupt change in the vibration of the tip.

    1
    NIST researchers have developed a technique that can simultaneously locate individual electrical flaws in multiple microcircuits on the same chip. Credit: S. Kelley/NIST.

    However, that method of searching for defects with an AFM, known as electrostatic force microscopy (EFM), has a drawback. The vibration of the tip is affected not only by the static electric field from the wire under study but also by the voltages from all the neighboring wires. Those extraneous signals interfere with the ability to clearly image defects in the wire undergoing scanning.

    NIST scientists Joseph Kopanski, Evgheni Strelcov and Lin You solved the problem by applying specific AC voltages, supplied by an external generator, to individual neighboring wires instead of to the tip. An AC voltage alternates between positive and negative values; traced over time the voltage resembles a wave with peaks and valleys. In a single cycle, the voltage reaches its maximum positive voltage (the peak) and then falls to its lowest negative voltage (the valley).

    Taking advantage of this cyclic nature, the researchers applied the same AC voltage to neighboring wires as they did to the wire undergoing scanning, with one important difference: The voltages to the neighbors were exactly out of phase. Whenever the voltage to the wire of interest reached its highest value, the voltages to the neighboring wires were at their lowest.

    2
    Out-of-phase AC voltages (indicated by plus and minus) are applied to neighboring wires. A defect shows up as a clear change in the vibration of the tip as it is moved along the wire. Credit: S. Kelley/NIST.

    The out-of-phase voltages exerted electrostatic forces on the AFM tip that opposed the force exerted by the scanned wire. Those oppositely directed forces translated into regions of high contrast on an AFM image, making it easier to distinguish the signal from the wire of interest.

    Using a test chip featuring four pairs of wires buried 4 micrometers beneath the surface, the scientists demonstrated that their technique produced clear and accurate images of defects. And by tailoring the AC voltages applied to each wire so that each has a different frequency, the researchers showed that they could image defects in several adjacent wires at the same time.
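The phase and frequency tricks above can be sketched with a toy model: treat the tip signal as a linear superposition of the wire voltages (a simplification; the real electrostatic force is nonlinear in voltage) and recover each wire’s contribution with lock-in-style demodulation. Driving the neighbors out of phase makes their contributions subtract from, rather than add to, the tip signal, so the coupling to the wire of interest stands out. All coupling strengths and frequencies below are made-up illustration values, not from the NIST experiment.

```python
# Toy model of remote-bias EFM: the tip senses a weighted sum of wire
# voltages; lock-in demodulation recovers each wire's coupling strength.
import numpy as np

fs = 1_000_000                      # sample rate of the simulated signal, Hz
t = np.arange(0, 0.01, 1 / fs)      # 10 ms window (integer periods of all tones)

def tip_signal(couplings, voltages):
    """Linear superposition of wire voltages at the tip (toy model)."""
    return sum(c * v for c, v in zip(couplings, voltages))

def lock_in(sig, freq):
    """Demodulate: multiply by the reference tone and average (x2 for amplitude)."""
    return 2 * np.mean(sig * np.sin(2 * np.pi * freq * t))

# Scenario 1: neighbors driven in phase vs. out of phase at one frequency
f = 10_000.0
v_wire = np.sin(2 * np.pi * f * t)  # wire of interest
couplings = [1.0, 0.4, 0.4]         # tip couples to the wire and two neighbors

in_phase = tip_signal(couplings, [v_wire, v_wire, v_wire])
out_phase = tip_signal(couplings, [v_wire, -v_wire, -v_wire])
print(f"in-phase amplitude at tip:  {lock_in(in_phase, f):.2f}")   # neighbors add
print(f"out-of-phase amplitude:     {lock_in(out_phase, f):.2f}")  # neighbors subtract

# Scenario 2: a distinct frequency per wire lets several wires be imaged at once
freqs = [10_000.0, 13_000.0, 17_000.0]
mixed = tip_signal(couplings, [np.sin(2 * np.pi * fi * t) for fi in freqs])
recovered = [lock_in(mixed, fi) for fi in freqs]
print("recovered couplings:", [round(r, 2) for r in recovered])
```

In the out-of-phase case the demodulated amplitude drops from 1.8 to 0.2 in this toy, and the three frequency-tagged couplings come back cleanly separated, mirroring the simultaneous multi-wire imaging described above.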

    Because the technique depends on AC voltages applied remotely, to the wires rather than to the AFM tip, the researchers have dubbed it remote bias-induced electrostatic force microscopy.

    “Applying a voltage to the wires instead of the AFM tip may seem like a small innovation, but it makes a big difference,” Kopanski said. “The method does not require a new instrument and could be easily implemented by the semiconductor industry,” he added.

    Other techniques used to spot defects, which include X-rays or magnetic fields, are also highly accurate but require more costly equipment, Strelcov noted.

    The researchers presented their work on Nov. 3 at the 48th International Symposium for Testing and Failure Analysis in Pasadena, California.

    See the full article here.


    NIST Campus, Gaitherberg, MD.

    The National Institute of Standards and Technology‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901 in response to a bill proposed by Congressman James H. Southard (R- Ohio) the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use; but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production even operating its own facility to produce optical glass when European supplies were cut off. Between the wars Harry Diamond of the Bureau developed a blind approach radio aircraft landing system. During World War II military research and development was carried out including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time the Standards Western Automatic Computer, was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version- DYSEAC- was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “ The National Institute of Standards and Technology” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST‑F1, a cesium fountain atomic clock.

    NIST‑F1 serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium, which defines the second, NIST broadcasts time signals via longwave radio station WWVB near Fort Collins, Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha, Hawai’i, respectively.
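    The link between the cesium resonance and the second can be made concrete with a small sketch. By definition, the SI second is the duration of exactly 9,192,631,770 periods of the radiation from the cesium-133 ground-state hyperfine transition; the function below (illustrative only, not NIST software) simply converts a count of those oscillation cycles into elapsed seconds.

    ```python
    # The SI second is defined as exactly 9_192_631_770 periods of the
    # cesium-133 hyperfine transition radiation, so this constant is exact.
    CS133_HYPERFINE_HZ = 9_192_631_770

    def seconds_from_cycles(cycles: int) -> float:
        """Convert a count of cesium hyperfine oscillation cycles to seconds."""
        return cycles / CS133_HYPERFINE_HZ

    # Counting exactly one definition's worth of cycles yields one second.
    print(seconds_from_cycles(9_192_631_770))   # → 1.0
    # Half that many cycles corresponds to half a second.
    print(seconds_from_cycles(4_596_315_885))   # → 0.5
    ```

    A real fountain clock does not literally count cycles this way, but the arithmetic shows why an exact frequency standard yields an exact time standard.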

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR).

    The NCNR provides scientists access to a variety of neutron scattering instruments, which they use in many research fields (materials science, fuel cells, biotechnology, etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961.

    SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility.

    This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation between the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of NIST. The book partially fulfills the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.
