Tagged: Applied Research & Technology

  • richardmitnick 12:50 pm on September 11, 2021 Permalink | Reply
    Tags: "Reconfigurable Metasurfaces Provide Nanoscale Light Control", Applied Research & Technology

    From The Optical Society: “Reconfigurable Metasurfaces Provide Nanoscale Light Control”

    9 September 2021

    Researchers have designed electromechanically reconfigurable ultrathin optical elements that can be controlled and programmed on a pixel-by-pixel level. These versatile metasurfaces could offer a new chip-based way to achieve nanoscale control of light, which could lead to better optical displays, information encoding and digital light processing.

    “Metasurfaces are ultrathin and compact optical elements that can be used to manipulate the amplitude, phase and polarization of light,” said research team leader Jiafang Li from The Beijing Institute of Technology[北京理工大学](CN) in China. “Although most metasurfaces are static and passive, we created metasurfaces that mechanically deform in response to electrostatic forces.”

    In The Optical Society (OSA) journal Optics Express, the researchers describe how they created the new metasurfaces using nanoscale techniques inspired by kirigami, a variation of origami that includes cutting as well as folding. This allowed them to create tiny units that transform from 2D designs into 3D structures when a voltage is applied.

    “We were able to create a dynamic holographic display using our reconfigurable metasurface,” said Li. “These optical elements could lead to new types of devices with optical multitasking and rewritable functionalities. They might also be used in real-time 3D displays and high-resolution projectors, for example.”

    Researchers designed reconfigurable metasurfaces with 2D spirals that deform when a voltage is applied. Each spiral unit acts as a pixel and can be independently manipulated. The researchers demonstrated the metasurface by using it to create a holographic display. Credit: Jiafang Li, Beijing Institute of Technology.

    Spiral patterns that transform from 2D to 3D

    To create the new metasurfaces, the researchers designed a repeating 2D pattern of two combined spirals that are etched into a gold nanofilm and suspended above silicon dioxide pillars. The units are arranged in a square lattice with just two microns of space between each one. When a voltage is applied, the spirals deform due to electrostatic forces. This transformation, which is reversible and repeatable, can be used to dynamically modulate the optical properties of the metasurface.
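
    The voltage-controlled deformation can be pictured with a simple electromechanical balance. As a rough sketch (not the paper's actual kirigami mechanics), the snippet below treats one unit as a parallel-plate electrostatic actuator with made-up stiffness, electrode area and gap, and finds the deflection where the electrostatic pull balances the elastic restoring force:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def deformation(V, k=0.5, A=4e-12, d=2e-6, tol=1e-15):
    """Equilibrium deflection x (m) of one unit at bias V (volts).

    k (spring constant, N/m), A (electrode area, m^2) and d (gap, m)
    are hypothetical parameters, not values from the study.
    Stable equilibria exist only for x < d/3 (the pull-in limit).
    """
    # net restoring force: elastic pull-back minus electrostatic attraction
    f = lambda x: k * x - EPS0 * A * V**2 / (2 * (d - x) ** 2)
    lo, hi = 0.0, d / 3
    if f(hi) <= 0:
        return d / 3  # bias beyond pull-in: the unit snaps down
    while hi - lo > tol:  # bisection, maintaining f(lo) < 0 < f(hi)
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(deformation(5.0))   # small deflection at low bias
print(deformation(20.0))  # larger deflection at higher bias
```

    Below the pull-in limit the equilibrium is stable, consistent with the reversible, repeatable actuation described above.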

    The researchers used their new approach to make two types of metasurfaces for controlling light on a pixel-by-pixel basis. One metasurface used the same voltage to deform each unit but featured spirals with structural patterns that varied to create different deformation heights. The second metasurface used different voltages applied to each unit to achieve different deformation heights for units with identical structural patterns.
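
    Both schemes ultimately set a per-pixel deformation height, which translates into an optical phase shift. A minimal sketch, assuming a simple reflective phase model phi = 4*pi*h/lambda, with invented heights and wavelength rather than values from the study:

```python
import math

def phase_map(heights_nm, wavelength_nm=633.0):
    """Reflection phase (rad, modulo 2*pi) added by each pixel's height.

    A pixel raised by h adds roughly 2h of optical path on reflection,
    i.e. a phase of 4*pi*h/lambda.  Heights and wavelength here are
    invented examples, not values from the paper.
    """
    return [(4 * math.pi * h / wavelength_nm) % (2 * math.pi)
            for h in heights_nm]

# scheme 1: one voltage, varied spiral structures -> varied heights
# scheme 2: varied voltages, identical structures -> the same height map
print(phase_map([0, 40, 80, 120]))
```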

    As a proof-of-concept demonstration, the researchers used these metasurfaces to demonstrate beam control and to make a holographic display. “We were able to reconstruct images from the metasurface by merely controlling the voltage bias, proving the feasibility of our scheme for effective light modulation,” said Li.

    The researchers plan to explore strategies that can be used to achieve pixelated voltage control, such as the multi-line addressing method used to drive several rows simultaneously in commercial OLED displays. To make the technology more practical, they are also working to improve the signal-to-noise ratio and modulation quality of the reconfiguration system.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Optical Society (OSA) is a professional association of individuals and companies with an interest in optics and photonics. It publishes journals, and organizes conferences and exhibitions. In 2019 it had about 22,000 members in more than 100 different countries, including some 300 companies.

     
  • richardmitnick 10:02 am on September 11, 2021 Permalink | Reply
    Tags: "Cassini’s wake – how might a spacecraft disturb its own measurements?", Applied Research & Technology

    From European Space Agency [Agence spatiale européenne] [Europäische Weltraumorganisation] (EU): “Cassini’s wake – how might a spacecraft disturb its own measurements?”

    10/09/2021

    Illustration of Cassini diving towards Saturn as part of the mission’s Grand Finale (12/09/2017). The spacecraft will burn up in Saturn’s atmosphere on 15 September 2017, satisfying planetary protection requirements to avoid possible contamination of any moons of Saturn that could have conditions suitable for life. Credit: NASA/JPL-Caltech (US).

    Simply by moving through the heavens, spacecraft change the space about them. Such interactions are invisible to the naked eye, but can endanger mission performance and safety. A new ESA Research Fellow study simulated the Cassini spacecraft in the vicinity of Saturn, checking the findings against actual space measurements. It reveals that Cassini cast an ‘ion wake’ up to 6 m behind it: a void of plasma particles, like the trail behind a boat.
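
    The scale of such a wake can be estimated with a back-of-envelope ‘mesothermal’ argument: the spacecraft moves much faster than the ions’ thermal motion, and the void behind it refills at roughly the ion acoustic speed. The sketch below uses invented example values, not Cassini’s measured parameters or the SPIS result:

```python
import math

K_EV = 1.602176634e-19   # joules per eV
M_P = 1.67262192e-27     # proton mass, kg

def wake_length(v_sc, T_e_eV, ion_mass_amu, size):
    """Rough wake length (m): (body size) x (ion acoustic Mach number).

    The void behind the body refills at about the ion acoustic speed
    c_s = sqrt(kB*Te / m_i); while it refills, the body travels
    v_sc / c_s body-lengths downstream.  All inputs below are
    hypothetical examples, not Cassini measurements.
    """
    c_s = math.sqrt(T_e_eV * K_EV / (ion_mass_amu * M_P))
    return size * v_sc / c_s

# a few-metre body at tens of km/s in cold water-group plasma
print(wake_length(v_sc=2.0e4, T_e_eV=2.0, ion_mass_amu=18, size=3.0))
```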

    Space might be a vacuum but it is far from empty, awash with charged particles and electromagnetic fields. This study, published in the Journal of Geophysical Research: Space Physics, employed ESA-funded software called the Spacecraft Plasma Interaction System (SPIS), used to model the interaction between spacecraft and these surrounding environments.

    “This study marks the first time that these simulations have been compared to and confirmed with actual spacecraft measurements from a planet beyond Earth,” explains ESA Research Fellow Mika Holmberg, who spent three years at ESA’s Space Environments and Effects section at the ESTEC technical centre in the Netherlands.

    Saturn bow shock (15/02/2013).

    The international Cassini spacecraft exploring the magnetic environment of Saturn. The image is not to scale. Saturn’s magnetosphere is depicted in grey, while the complex bow shock region – the shock wave in the solar wind that surrounds the magnetosphere – is shown in blue.
    While crossing the bow shock on 3 February 2007, Cassini recorded a particularly strong shock (an Alfvén Mach number of approximately 100) under a ‘quasi-parallel’ magnetic field configuration, during which significant particle acceleration was detected for the first time. The findings provide insight into particle acceleration at the shocks surrounding the remnants of supernova explosions. Credit: ESA.
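
    For reference, the Alfvén Mach number quoted in the caption compares the flow speed with the Alfvén speed v_A = B / sqrt(mu0 * rho). A hedged sketch, with illustrative (not measured) values chosen to give M_A of order 100:

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability, H/m
M_P = 1.67262192e-27   # proton mass, kg

def alfven_mach(v_flow, B, n_per_cc, ion_mass_amu=1.0):
    """M_A = v / v_A with v_A = B / sqrt(mu0 * rho).

    A weak field and low density make v_A small, so ordinary
    solar-wind speeds give very high M_A far from the Sun.
    """
    rho = n_per_cc * 1e6 * ion_mass_amu * M_P  # mass density, kg/m^3
    v_alfven = B / math.sqrt(MU0 * rho)
    return v_flow / v_alfven

# illustrative values (not the measured ones): ~400 km/s wind,
# ~0.1 nT field, ~0.3 protons/cm^3 near Saturn's orbit
print(alfven_mach(v_flow=4.0e5, B=1.0e-10, n_per_cc=0.3))
```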

    The study focused on the NASA-ESA-ASI Cassini-Huygens spacecraft, which left Earth in 1997 for a nearly two-decade odyssey to explore Saturn and its major moons. The gas giant’s magnetic field is the second largest of any planet’s – populated by charged particles originating from both Saturn itself and its 82 moons.

    Mika comments: “Cassini’s suite of instruments included a Langmuir probe, an electrode extending out from the spacecraft body. Think of it as a ‘space weather station’, to measure the density, temperature and velocity of the charged particles surrounding the spacecraft. This instrument provided the solid data to confirm the accuracy of our SPIS simulation.”
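
    How a Langmuir probe yields the electron temperature can be illustrated with the standard retardation-region relation I = I0 * exp(e(V − Vp) / kB·Te): the slope of ln(I) versus V equals 1/Te (in eV). The sketch below fits a synthetic sweep; it is an idealized textbook treatment, not Cassini’s actual calibration pipeline:

```python
import math

def fit_te_ev(voltages, currents):
    """Electron temperature (eV) from a retardation-region sweep.

    Least-squares slope of ln(I) versus V; the slope equals 1/Te[eV]
    for I = I0 * exp((V - Vp) / Te).  Idealized: noise, ion current
    and sheath effects are ignored.
    """
    n = len(voltages)
    ys = [math.log(i) for i in currents]
    vbar = sum(voltages) / n
    ybar = sum(ys) / n
    slope = (sum((v - vbar) * (y - ybar) for v, y in zip(voltages, ys))
             / sum((v - vbar) ** 2 for v in voltages))
    return 1.0 / slope

# synthetic sweep for a hypothetical 2 eV plasma
Te_true = 2.0
vs = [-5 + 0.5 * k for k in range(10)]
cs = [1e-9 * math.exp(v / Te_true) for v in vs]
print(fit_te_ev(vs, cs))  # recovers ~2.0
```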

    These kinds of simulations are useful in principle for any spacecraft or instrumentation placed in space, but especially for scientific missions focused on studying the space environments of the planets, including Earth.

    Mika adds: “They are important for accurate analyses of particle and field measurements from planetary missions, including the direct characterisation of space environments such as magnetospheres, the solar wind, the ionospheres of planets and moons – even possible plumes arising from them. Cassini gave us an exciting example of the latter when it passed through a plume originating from the icy moon Enceladus, revealing evidence of liquid water beneath its frozen surface.

    “But, crucially, results from in-situ instruments may also be interpreted wrongly if local interactions are not properly accounted for, such as the wakes formed by the spacecraft.”

    SPIS is also commonly employed to model the occurrence of surface charging across various spacecraft surfaces, which can give rise to ‘electrostatic discharge’ – essentially a kind of space lightning that risks severe damage to subsystems or may even threaten mission loss. This charging of the spacecraft is driven in turn by the particles and radiation surrounding it.
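
    A first-order picture of why surfaces charge: light, fast electrons strike a surface more often than heavy ions, so an unlit surface drifts negative until the two currents balance. The classic planar estimate below (no photoemission, hypothetical plasma temperature) is far simpler than what SPIS models, but shows the scale:

```python
import math

M_E = 9.1093837e-31    # electron mass, kg
M_P = 1.67262192e-27   # proton mass, kg

def floating_potential(T_e_eV, ion_mass_amu=1.0):
    """Floating potential (V) of an unlit planar surface.

    Classic estimate V_f = -(Te/e) * ln(sqrt(m_i / (2*pi*m_e))):
    the surface settles a few electron temperatures negative.
    Photoemission (which SPIS does model) is ignored here.
    """
    return -T_e_eV * math.log(math.sqrt(ion_mass_amu * M_P /
                                        (2 * math.pi * M_E)))

print(floating_potential(2.0))  # a few volts negative for a ~2 eV plasma
```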

    Modelling the ion density around Cassini.

    Even sustained sunlight liberates electrons from spacecraft surfaces, a factor which needs accounting for within the modelling.

    Mika notes: “These insights are important for future planetary missions as well, such as NASA’s Europa Clipper and ESA’s Jupiter mission Juice.

    NASA Europa Clipper depiction.

    European Space Agency [Agence spatiale européenne] [Europäische Weltraumorganisation] (EU) JUICE schematic.

    European Space Agency [Agence spatiale européenne](EU) Juice spacecraft depiction.

    We ran a large number of simulations for Juice which actually resulted in the changing of some surface materials, since the simulations showed the mission might be in danger with the original selection.”

    SPIS is an open source software initiated back in 2001 by ESA with the support of French space agency CNES in collaboration with the French aerospace laboratory ONERA and the Artenum company.

    “Having a chance to work at ESA with the experts who were actually involved in developing the software was a golden opportunity,” adds Mika.

    ESA space environment and effects specialist Fabrice Cipriani oversaw Mika’s work at ESTEC: “The complexity and sensitivity of scientific instruments for planetary exploration continue to grow. So simulation tools of this kind are essential both to identify potential issues during early development phases, and to ensure the accurate interpretation of results once an instrument is flying – if, as in Cassini’s case, the spacecraft’s interaction with its environment is significant.

    “And in addition to her work on Cassini, Mika also performed challenging modelling work to quantify surface charging levels of the Juice spacecraft during its exploration of Jupiter’s Galilean moons. We now have a full model that will be very useful for later assessment and then, once at Jupiter, actual mission data exploitation.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The European Space Agency [Agence spatiale européenne] [Europäische Weltraumorganisation] (EU), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 22 member states. Headquartered in Paris, ESA has a staff of more than 2,000.

    ESA’s space flight programme includes human spaceflight (mainly through participation in the International Space Station program); the launch and operation of uncrewed exploration missions to other planets and the Moon; Earth observation, science and telecommunication; designing launch vehicles; and maintaining a major spaceport, the Guiana Space Centre [Centre Spatial Guyanais] (CSG, also called Europe’s Spaceport) at Kourou, French Guiana. The main European launch vehicle Ariane 5 is operated through Arianespace, with ESA sharing in the costs of launching and further developing this launch vehicle. The agency is also working with NASA to manufacture the Orion spacecraft service module that will fly on the Space Launch System.

    The agency’s facilities are distributed among the following centres:

    ESA European Space Research and Technology Centre (ESTEC) (NL) in Noordwijk, Netherlands;
    ESA Centre for Earth Observation [ESRIN] (IT) in Frascati, Italy;
    ESA European Space Operations Centre [ESOC] (DE), ESA Mission Control, in Darmstadt, Germany;
    ESA European Astronaut Centre [EAC] (DE), which trains astronauts for future missions, in Cologne, Germany;
    European Centre for Space Applications and Telecommunications (ECSAT) (UK), a research institute created in 2009, in Harwell, England;
    ESA European Space Astronomy Centre [ESAC] (ES) in Villanueva de la Cañada, Madrid, Spain.

    The European Space Agency Science Programme is a long-term programme of space science and space exploration missions.

    Foundation

    After World War II, many European scientists left Western Europe in order to work with the United States. Although the 1950s boom made it possible for Western European countries to invest in research and specifically in space-related activities, Western European scientists realized solely national projects would not be able to compete with the two main superpowers. In 1958, only months after the Sputnik shock, Edoardo Amaldi (Italy) and Pierre Auger (France), two prominent members of the Western European scientific community, met to discuss the foundation of a common Western European space agency. The meeting was attended by scientific representatives from eight countries, including Harrie Massey (United Kingdom).

    The Western European nations decided to have two agencies: one concerned with developing a launch system, ELDO (European Launcher Development Organisation), and the other the precursor of the European Space Agency, ESRO (European Space Research Organisation). The latter was established on 20 March 1964 by an agreement signed on 14 June 1962. From 1968 to 1972, ESRO launched seven research satellites.

    ESA in its current form was founded with the ESA Convention in 1975, when ESRO was merged with ELDO. ESA had ten founding member states: Belgium, Denmark, France, West Germany, Italy, the Netherlands, Spain, Sweden, Switzerland, and the United Kingdom. These signed the ESA Convention in 1975 and deposited the instruments of ratification by 1980, when the convention came into force. During this interval the agency functioned in a de facto fashion. ESA launched its first major scientific mission in 1975, Cos-B, a space probe monitoring gamma-ray emissions in the universe, which was first worked on by ESRO.

    ESA50 Logo large

    Later activities

    ESA collaborated with the National Aeronautics and Space Administration (NASA) on the International Ultraviolet Explorer (IUE), the world’s first high-orbit telescope, which was launched in 1978 and operated successfully for 18 years.

    ESA Infrared Space Observatory.

    A number of successful Earth-orbit projects followed, and in 1986 ESA began Giotto, its first deep-space mission, to study the comets Halley and Grigg–Skjellerup. Hipparcos, a star-mapping mission, was launched in 1989 and in the 1990s SOHO, Ulysses and the Hubble Space Telescope were all jointly carried out with NASA. Later scientific missions in cooperation with NASA include the Cassini–Huygens space probe, to which ESA contributed by building the Titan landing module Huygens.

    As the successor of ELDO, ESA has also constructed rockets for scientific and commercial payloads. Ariane 1, launched in 1979, carried mostly commercial payloads into orbit from 1984 onward. The next two versions of the Ariane rocket were intermediate stages in the development of a more advanced launch system, the Ariane 4, which operated between 1988 and 2003 and established ESA as the world leader in commercial space launches in the 1990s. Although the succeeding Ariane 5 experienced a failure on its first flight, it has since firmly established itself within the heavily competitive commercial space launch market with 82 successful launches until 2018. The successor launch vehicle of Ariane 5, the Ariane 6, is under development and is envisioned to enter service in the 2020s.

    The beginning of the new millennium saw ESA become, along with agencies like the National Aeronautics and Space Administration (US), the Japan Aerospace Exploration Agency, the Indian Space Research Organisation, the Canadian Space Agency (CA) and Roscosmos (RU), one of the major participants in scientific space research. Although ESA had relied on co-operation with NASA in previous decades, especially the 1990s, changed circumstances (such as tough legal restrictions on information sharing by the United States military) led to decisions to rely more on itself and on co-operation with Russia. A 2011 press issue thus stated:

    “Russia is ESA’s first partner in its efforts to ensure long-term access to space. There is a framework agreement between ESA and the government of the Russian Federation on cooperation and partnership in the exploration and use of outer space for peaceful purposes, and cooperation is already underway in two different areas of launcher activity that will bring benefits to both partners.”

    Notable ESA programmes include SMART-1, a probe testing cutting-edge space propulsion technology, the Mars Express and Venus Express missions, as well as the development of the Ariane 5 rocket and its role in the ISS partnership. ESA maintains its scientific and research projects mainly for astronomy-space missions such as Corot, launched on 27 December 2006, a milestone in the search for exoplanets.

    On 21 January 2019, ArianeGroup and Arianespace announced a one-year contract with ESA to study and prepare for a mission to mine the Moon for lunar regolith.

    Mission

    The treaty establishing the European Space Agency reads:

    The purpose of the Agency shall be to provide for and to promote, for exclusively peaceful purposes, cooperation among European States in space research and technology and their space applications, with a view to their being used for scientific purposes and for operational space applications systems…

    ESA is responsible for setting a unified space and related industrial policy, recommending space objectives to the member states, and integrating national programmes, such as satellite development, into the European programme as much as possible.

    Jean-Jacques Dordain – ESA’s Director General (2003–2015) – outlined the European Space Agency’s mission in a 2003 interview:

    “Today space activities have pursued the benefit of citizens, and citizens are asking for a better quality of life on Earth. They want greater security and economic wealth, but they also want to pursue their dreams, to increase their knowledge, and they want younger people to be attracted to the pursuit of science and technology. I think that space can do all of this: it can produce a higher quality of life, better security, more economic wealth, and also fulfill our citizens’ dreams and thirst for knowledge, and attract the young generation. This is the reason space exploration is an integral part of overall space activities. It has always been so, and it will be even more important in the future.”

    Activities

    According to the ESA website, the activities are:

    Observing the Earth
    Human Spaceflight
    Launchers
    Navigation
    Space Science
    Space Engineering & Technology
    Operations
    Telecommunications & Integrated Applications
    Preparing for the Future
    Space for Climate

    Programmes

    Copernicus Programme
    Cosmic Vision
    ExoMars
    FAST20XX
    Galileo
    Horizon 2000
    Living Planet Programme

    Mandatory

    Every member country must contribute to these programmes:

    Technology Development Element Programme
    Science Core Technology Programme
    General Study Programme
    European Component Initiative

    Optional

    Depending on their individual choices, countries can contribute to the following programmes:

    Launchers
    Earth Observation
    Human Spaceflight and Exploration
    Telecommunications
    Navigation
    Space Situational Awareness
    Technology

    ESA_LAB@

    ESA has formed partnerships with universities. ESA_LAB@ refers to research laboratories hosted at universities. Current ESA_LAB@ locations:

    Technische Universität Darmstadt
    École des hautes études commerciales de Paris (HEC Paris)
    Université de recherche Paris Sciences et Lettres
    University of Central Lancashire

    Membership and contribution to ESA

    By 2015, ESA was an intergovernmental organisation of 22 member states. Member states participate to varying degrees in the mandatory (25% of total expenditures in 2008) and optional space programmes (75% of total expenditures in 2008). The 2008 budget amounted to €3.0 billion whilst the 2009 budget amounted to €3.6 billion. The total budget amounted to about €3.7 billion in 2010, €3.99 billion in 2011, €4.02 billion in 2012, €4.28 billion in 2013, €4.10 billion in 2014 and €4.33 billion in 2015. English is the main language within ESA. Additionally, official documents are also provided in German and documents regarding the Spacelab are also provided in Italian. If found appropriate, the agency may conduct its correspondence in any language of a member state.

    Non-full member states
    Slovenia
    Since 2016, Slovenia has been an associated member of ESA.

    Latvia
    Latvia became the second current associated member on 30 June 2020, when the Association Agreement was signed by ESA Director Jan Wörner and the Minister of Education and Science of Latvia, Ilga Šuplinska in Riga. The Saeima ratified it on July 27. Previously associated members were Austria, Norway and Finland, all of which later joined ESA as full members.

    Canada
    Since 1 January 1979, Canada has had the special status of a Cooperating State within ESA. By virtue of this accord, the Canadian Space Agency takes part in ESA’s deliberative bodies and decision-making and also in ESA’s programmes and activities. Canadian firms can bid for and receive contracts to work on programmes. The accord has a provision ensuring a fair industrial return to Canada. The most recent Cooperation Agreement was signed on 15 December 2010 with a term extending to 2020. For 2014, Canada’s annual assessed contribution to the ESA general budget was €6,059,449 (CAD$8,559,050). For 2017, Canada has increased its annual contribution to €21,600,000 (CAD$30,000,000).

    Enlargement

    After the decision of the ESA Council of 21/22 March 2001, the procedure for accession of the European states was detailed in a document titled The Plan for European Co-operating States (PECS). Nations that want to become a full member of ESA do so in three stages. First, a Cooperation Agreement is signed between the country and ESA. In this stage, the country has very limited financial responsibilities. If a country wants to co-operate more fully with ESA, it signs a European Cooperating State (ECS) Agreement. The ECS Agreement makes companies based in the country eligible for participation in ESA procurements. The country can also participate in all ESA programmes, except for the Basic Technology Research Programme. While the financial contribution of the country concerned increases, it is still much lower than that of a full member state. The agreement is normally followed by a Plan for European Cooperating State (or PECS Charter). This is a 5-year programme of basic research and development activities aimed at improving the nation’s space industry capacity. At the end of the 5-year period, the country can either begin negotiations to become a full member state or an associated state, or sign a new PECS Charter.

    During the Ministerial Meeting in December 2014, ESA ministers approved a resolution calling for discussions to begin with Israel, Australia and South Africa on future association agreements. The ministers noted that “concrete cooperation is at an advanced stage” with these nations and that “prospects for mutual benefits are existing”.

    A separate space exploration strategy resolution calls for further co-operation with the United States, Russia and China on “LEO exploration, including a continuation of ISS cooperation and the development of a robust plan for the coordinated use of space transportation vehicles and systems for exploration purposes, participation in robotic missions for the exploration of the Moon, the robotic exploration of Mars, leading to a broad Mars Sample Return mission in which Europe should be involved as a full partner, and human missions beyond LEO in the longer term.”

    Relationship with the European Union

    The political perspective of the European Union (EU) was to make ESA an agency of the EU by 2014, although this date was not met. The EU member states provide most of ESA’s funding, and they are all either full ESA members or observers.

    History

    At the time ESA was formed, its main goals did not encompass human space flight; rather it considered itself primarily a scientific research organisation for uncrewed space exploration, in contrast to its American and Soviet counterparts. It is therefore not surprising that the first non-Soviet European in space was not an ESA astronaut on a European spacecraft: it was the Czechoslovak Vladimír Remek, who in 1978 became the first person in space from a country other than the Soviet Union or the United States, flying on a Soviet Soyuz spacecraft. He was followed the same year by the Pole Mirosław Hermaszewski and the East German Sigmund Jähn. This Soviet co-operation programme, known as Intercosmos, primarily involved the participation of Eastern Bloc countries. In 1982, however, Jean-Loup Chrétien became the first non-Communist Bloc astronaut on a flight to the Soviet Salyut 7 space station.

    Because Chrétien did not officially fly into space as an ESA astronaut, but rather as a member of the French CNES astronaut corps, the German Ulf Merbold is considered the first ESA astronaut to fly into space. He participated in the STS-9 Space Shuttle mission that included the first use of the European-built Spacelab in 1983. STS-9 marked the beginning of an extensive ESA/NASA joint partnership that included dozens of space flights of ESA astronauts in the following years. Some of these missions with Spacelab were fully funded and organizationally and scientifically controlled by ESA (such as two missions by Germany and one by Japan) with European astronauts as full crew members rather than guests on board. Beside paying for Spacelab flights and seats on the shuttles, ESA continued its human space flight co-operation with the Soviet Union and later Russia, including numerous visits to Mir.

    During the latter half of the 1980s, European human space flights changed from being the exception to routine and therefore, in 1990, the European Astronaut Centre in Cologne, Germany was established. It selects and trains prospective astronauts and is responsible for the co-ordination with international partners, especially with regard to the International Space Station. As of 2006, the ESA astronaut corps officially included twelve members, including nationals from most large European countries except the United Kingdom.

    In the summer of 2008, ESA started to recruit new astronauts so that final selection would be due in spring 2009. Almost 10,000 people registered as astronaut candidates before registration ended in June 2008. 8,413 fulfilled the initial application criteria. Of the applicants, 918 were chosen to take part in the first stage of psychological testing, which narrowed down the field to 192. After two-stage psychological tests and medical evaluation in early 2009, as well as formal interviews, six new members of the European Astronaut Corps were selected – five men and one woman.

    Cooperation with other countries and organisations

    ESA has signed co-operation agreements with the following states that currently neither plan to integrate as tightly with ESA institutions as Canada, nor envision future membership of ESA: Argentina, Brazil, China, India (for the Chandrayaan mission), Russia and Turkey.

    Additionally, ESA has joint projects with the European Union, NASA of the United States and is participating in the International Space Station together with the United States (NASA), Russia and Japan (JAXA).

    European Union

    ESA is not an agency or body of the European Union (EU), and has non-EU countries (Norway, Switzerland, and the United Kingdom) as members. There are however ties between the two, with various agreements in place and being worked on, to define the legal status of ESA with regard to the EU.

    There are common goals between ESA and the EU. ESA has an EU liaison office in Brussels. On certain projects, the EU and ESA co-operate, such as the upcoming Galileo satellite navigation system. Space policy has since December 2009 been an area for voting in the European Council. Under the European Space Policy of 2007, the EU, ESA and its Member States committed themselves to increasing co-ordination of their activities and programmes and to organising their respective roles relating to space.

    The Lisbon Treaty of 2009 reinforces the case for space in Europe and strengthens the role of ESA as an R&D space agency. Article 189 of the Treaty gives the EU a mandate to elaborate a European space policy and take related measures, and provides that the EU should establish appropriate relations with ESA.

    Former Italian astronaut Umberto Guidoni, during his tenure as a Member of the European Parliament from 2004 to 2009, stressed the importance of the European Union as a driving force for space exploration, “…since other players are coming up such as India and China it is becoming ever more important that Europeans can have an independent access to space. We have to invest more into space research and technology in order to have an industry capable of competing with other international players.”

    The first EU-ESA International Conference on Human Space Exploration took place in Prague on 22 and 23 October 2009. A road map which would lead to a common vision and strategic planning in the area of space exploration was discussed. Ministers from all 29 EU and ESA members as well as members of parliament were in attendance.

    National space organisations of member states:

    The Centre National d’Études Spatiales (FR) (CNES) (National Centre for Space Studies) is the French government space agency (administratively, a “public establishment of industrial and commercial character”). Its headquarters are in central Paris. CNES is the main participant in the Ariane project. Indeed, CNES designed and tested all Ariane family rockets (mainly from its centre in Évry near Paris).
    The UK Space Agency is a partnership of the UK government departments which are active in space. Through the UK Space Agency, the partners provide delegates to represent the UK on the various ESA governing bodies. Each partner funds its own programme.
    The Italian Space Agency A.S.I. – Agenzia Spaziale Italiana was founded in 1988 to promote, co-ordinate and conduct space activities in Italy. Operating under the Ministry of the Universities and of Scientific and Technological Research, the agency cooperates with numerous entities active in space technology and with the president of the Council of Ministers. Internationally, the ASI provides Italy’s delegation to the Council of the European Space Agency and to its subordinate bodies.
    The German Aerospace Center (DLR)[Deutsches Zentrum für Luft- und Raumfahrt e. V.] is the national research centre for aviation and space flight of the Federal Republic of Germany and of other member states in the Helmholtz Association. Its extensive research and development projects are included in national and international cooperative programmes. In addition to its research projects, the centre is Germany’s assigned space agency, serving as the headquarters of German space flight activities and its associated programmes.
    The Instituto Nacional de Técnica Aeroespacial (INTA)(ES) (National Institute for Aerospace Technique) is a Public Research Organization specialised in aerospace research and technology development in Spain. Among other functions, it serves as a platform for space research and acts as a significant testing facility for the aeronautic and space sector in the country.

    National Aeronautics and Space Administration (NASA) (US)

    ESA has a long history of collaboration with NASA. Since ESA’s astronaut corps was formed, the Space Shuttle has been the primary launch vehicle used by ESA’s astronauts to get into space through partnership programmes with NASA. In the 1980s and 1990s, the Spacelab programme was a joint ESA-NASA research programme in which ESA developed and manufactured orbital labs for the Space Shuttle; ESA astronauts participated in experiments on several of its flights.

    In robotic science and exploration missions, NASA has been ESA’s main partner. Cassini–Huygens was a joint NASA-ESA mission, along with the Infrared Space Observatory, INTEGRAL, SOHO, and others.

    Also, the Hubble Space Telescope is a joint project of NASA and ESA.

    Future ESA-NASA joint projects include the James Webb Space Telescope and the proposed Laser Interferometer Space Antenna.

    NASA has committed to provide support to ESA’s proposed MarcoPolo-R mission to return an asteroid sample to Earth for further analysis. NASA and ESA will also likely join together for a Mars Sample Return Mission. In October 2020 the ESA entered into a memorandum of understanding (MOU) with NASA to work together on the Artemis program, which will provide an orbiting lunar gateway and accomplish the first crewed lunar landing in more than 50 years, with a team that will include the first woman on the Moon.


    Cooperation with other space agencies

    Since China started investing more money into space activities, the Chinese Space Agency(CN) has sought international partnerships. ESA is, beside the Russian Space Agency, one of its most important partners. The two space agencies cooperated in the development of the Double Star Mission. In 2017, ESA sent two astronauts to China for two weeks of sea survival training with Chinese astronauts in Yantai, Shandong.

    ESA entered into a major joint venture with Russia in the form of the CSTS, the preparation of French Guiana spaceport for launches of Soyuz-2 rockets and other projects. With India, ESA agreed to send instruments into space aboard the ISRO’s Chandrayaan-1 in 2008. ESA is also co-operating with Japan; the most notable current joint project with JAXA is the BepiColombo mission to Mercury.

    Speaking to reporters at an air show near Moscow in August 2011, ESA head Jean-Jacques Dordain said ESA and Russia’s Roskosmos space agency would “carry out the first flight to Mars together.”

     
  • richardmitnick 11:21 am on September 10, 2021 Permalink | Reply
    Tags: "After 20 years of trying scientists succeed in doping a 1D chain of cuprates", Applied Research & Technology, Chemically controlled chains reveal an ultrastrong attraction between electrons that may help cuprate superconductors carry electrical current with no loss at relatively high temperatures., ,   

    From DOE’s SLAC National Accelerator Laboratory (US) : “After 20 years of trying scientists succeed in doping a 1D chain of cuprates” 

    From DOE’s SLAC National Accelerator Laboratory (US)

    September 9, 2021
    Glennda Chui

    The chemically controlled chains reveal an ultrastrong attraction between electrons that may help cuprate superconductors carry electrical current with no loss at relatively high temperatures.

    An illustration of 1D copper oxide, or cuprate, chains that have been “doped” to free up some of their electrons in a study led by researchers at SLAC National Accelerator Laboratory and Stanford University (US) and Clemson University (US). Copper atoms are black and oxygen atoms purple. The red springs represent natural vibrations that jiggle the atomic lattice, which may help produce an unexpectedly strong attraction (not shown) between neighboring electrons in the lattice. This “nearest-neighbor” attraction may play a role in unconventional superconductivity – the ability to conduct electric current with no loss at relatively high temperatures. (Greg Stewart/SLAC National Accelerator Laboratory.)

    When scientists study unconventional superconductors – complex materials that conduct electricity with zero loss at relatively high temperatures – they often rely on simplified models to get an understanding of what’s going on.

    Researchers know these quantum materials get their abilities from electrons that join forces to form a sort of electron soup. But modeling this process in all its complexity would take far more time and computing power than anyone can imagine having today. So for understanding one key class of unconventional superconductors – copper oxides, or cuprates – researchers created, for simplicity, a theoretical model in which the material exists in just one dimension, as a string of atoms. They made these one-dimensional cuprates in the lab and found that their behavior agreed with the theory pretty well.

    Unfortunately, these 1D atomic chains lacked one thing: They could not be doped, a process where some atoms are replaced by others to change the number of electrons that are free to move around. Doping is one of several factors scientists can adjust to tweak the behavior of materials like these, and it’s a critical part of getting them to superconduct.

    Now a study led by scientists at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford and Clemson universities has synthesized the first 1D cuprate material that can be doped. Their analysis of the doped material suggests that the most prominent proposed model of how cuprates achieve superconductivity is missing a key ingredient: an unexpectedly strong attraction between neighboring electrons in the material’s atomic structure, or lattice. That attraction, they said, may be the result of interactions with natural lattice vibrations.

    The team reported their findings today in Science.

    An illustration depicts an unexpectedly strong attraction between electrons in neighboring lattice sites within a 1D chain of copper oxide, or cuprate – a material that conducts electrical current with no loss at relatively high temperatures. A study led by Stanford, SLAC and Clemson discovered this unusually strong “nearest-neighbor” attraction in a 1D cuprate chain that had been “doped” to increase the density of its free electrons. They said the unexpected strength of the attractions may result from interactions with natural vibrations in the material’s atomic lattice, which may play a role in cuprate superconductivity. (SCI-HUA)

    “The inability to controllably dope one-dimensional cuprate systems has been a significant barrier to understanding these materials for more than two decades,” said Zhi-Xun Shen, a Stanford professor and investigator with the Stanford Institute for Materials and Energy Sciences (SIMES) at SLAC.

    “Now that we’ve done it,” he said, “our experiments show that our current model misses a very important phenomenon that’s present in the real material.”

    Zhuoyu Chen, a postdoctoral researcher in Shen’s lab who led the experimental part of the study, said the research was made possible by a system the team developed for making 1D chains embedded in a 3D material and moving them directly into a chamber at SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL) for analysis with a powerful X-ray beam.

    “It’s a unique setup,” he said, “and indispensable for achieving the high-quality data we needed to see these very subtle effects.”

    From grids to chains, in theory

    The predominant model used to simulate these complex materials is known as the Hubbard model. In its 2D version, it is based on a flat, evenly spaced grid of the simplest possible atoms.

    But this basic 2D grid is already too complicated for today’s computers and algorithms to handle, said Thomas Devereaux, a SLAC and Stanford professor and SIMES investigator who supervised the theoretical part of this work. There’s no well-accepted way to make sure the model’s calculations for the material’s physical properties are correct, so if they don’t match experimental results it’s impossible to tell whether the calculations or the theoretical model went wrong.

    To solve that problem, scientists have applied the Hubbard model to 1D chains of the simplest possible cuprate lattice – a string of copper and oxygen atoms. This 1D version of the model can accurately calculate and capture the collective behavior of electrons in materials made of undoped 1D chains. But until now, there hasn’t been a way to test the accuracy of its predictions for the doped versions of the chains because no one was able to make them in the lab, despite more than two decades of trying.
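    For readers who want the notation, the single-band Hubbard model referred to here is conventionally written (standard textbook form, not taken from the paper itself) as:

    ```latex
    H = -t \sum_{\langle i,j \rangle,\sigma}
          \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
        + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
    ```

    where t is the hopping amplitude between neighboring sites ⟨i,j⟩, U is the on-site Coulomb repulsion, c† and c are electron creation and annihilation operators, and n is the number operator. In the 1D version the sum over sites runs along a single chain.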

    “Our major achievement was in synthesizing these doped chains,” Chen said. “We were able to dope them over a very wide range and get systematic data to pin down what we were observing.”

    One atomic layer at a time

    To make the doped 1D chains, Chen and his colleagues sprayed a film of a cuprate material known as barium strontium copper oxide (BSCO), just a few atomic layers thick, onto a supportive surface inside a sealed chamber at the specially designed SSRL beamline. The shape of the lattices in the film and on the surface lined up in a way that created 1D chains of copper and oxygen embedded in the 3D BSCO material.

    They doped the chains by exposing them to ozone and heat, which added oxygen atoms to their atomic lattices, Chen said. Each oxygen atom pulled an electron out of the chain, and those freed-up electrons became more mobile. When millions of such free-flowing electrons come together, they can create the collective state that is the basis of superconductivity.

    Researchers at SLAC, Stanford and Clemson used a technique called angle-resolved photoemission spectroscopy (ARPES), shown here, to eject electrons from doped 1D copper oxide chains and measure their direction and energy. This gave them a detailed and sensitive picture of how the electrons in the material behave. The work was done at a specially designed beamline at SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL). (Zhuoyu Chen/Stanford University.)

    Next the researchers shuttled their chains into another part of the beamline for analysis with angle-resolved photoemission spectroscopy, or ARPES. This technique ejected electrons from the chains and measured their direction and energy, giving scientists a detailed and sensitive picture of how the electrons in the material behave.
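    As an illustration of what ARPES measures: the standard free-electron final-state relation converts a photoelectron’s kinetic energy and emission angle into its in-plane crystal momentum. The sketch below uses that generic textbook formula (it is not the team’s actual analysis code):

    ```python
    import math

    M_E_C2_EV = 0.5109989461e6   # electron rest-mass energy m*c^2, in eV
    HBAR_C_EV_A = 1973.269804    # hbar*c, in eV*Angstrom

    def in_plane_momentum(kinetic_energy_ev: float, angle_deg: float) -> float:
        """In-plane momentum k_par (in 1/Angstrom) of a photoelectron detected
        at emission angle theta with kinetic energy E_kin, via
        k_par = sqrt(2 m E_kin) / hbar * sin(theta)."""
        k_total = math.sqrt(2.0 * M_E_C2_EV * kinetic_energy_ev) / HBAR_C_EV_A
        return k_total * math.sin(math.radians(angle_deg))

    # A 100 eV photoelectron emitted at 90 degrees carries about 5.12 1/Angstrom
    # of in-plane momentum (the familiar 0.5123 * sqrt(E_kin) rule of thumb).
    ```

    Mapping out this momentum as a function of energy over many emission angles is what gives the detailed picture of electron behavior described above.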

    Surprisingly strong attractions

    Their analysis showed that in the doped 1D material, the electrons’ attraction to their counterparts in neighboring lattice sites is 10 times stronger than the Hubbard model predicts, said Yao Wang, an assistant professor at Clemson University who worked on the theory side of the study.
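    In model terms, such a nearest-neighbor interaction is usually added to the Hubbard Hamiltonian as an extra density-density term, giving the so-called extended Hubbard model (standard form, with V < 0 for attraction):

    ```latex
    H_{\mathrm{ext}} = H_{\mathrm{Hubbard}}
      + V \sum_{\langle i,j \rangle} n_{i}\, n_{j},
    \qquad V < 0 \ \text{(attractive)}
    ```

    The finding reported here is that the magnitude of this nearest-neighbor coupling needed to match the data is roughly ten times larger than what the bare Hubbard model predicts.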

    The research team suggested that this high level of “nearest-neighbor” attraction may stem from interactions with phonons – natural vibrations that jiggle the atomic latticework. Phonons are known to play a role in conventional superconductivity, and there are indications that they could also be involved in a different way in unconventional superconductivity that occurs at much warmer temperatures in materials like the cuprates, although that has not been definitively proven.

    The scientists said it’s likely that this strong nearest-neighbor attraction between electrons exists in all the cuprates and could help in understanding superconductivity in the 2D versions of the Hubbard model and its kin, giving scientists a more complete picture of these puzzling materials.

    Researchers from DOE’s Oak Ridge National Laboratory contributed to this work, which was funded by the DOE Office of Science. SSRL is an Office of Science user facility.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC National Accelerator Laboratory (US) originally named Stanford Linear Accelerator Center, is a Department of Energy (US) National Laboratory operated by Stanford University (US) under the programmatic direction of the Department of Energy (US) Office of Science and located in Menlo Park, California. It is the site of the Stanford Linear Accelerator, a 3.2 kilometer (2-mile) linear accelerator constructed in 1966 and shut down in the 2000s, which could accelerate electrons to energies of 50 GeV.
    Today SLAC research centers on a broad program in atomic and solid-state physics, chemistry, biology, and medicine using X-rays from synchrotron radiation and a free-electron laser as well as experimental and theoretical research in elementary particle physics, astroparticle physics, and cosmology.

    Founded in 1962 as the Stanford Linear Accelerator Center, the facility is located on 172 hectares (426 acres) of Stanford University-owned land on Sand Hill Road in Menlo Park, California—just west of the University’s main campus. The main accelerator is 3.2 kilometers (2 mi) long—the longest linear accelerator in the world—and has been operational since 1966.

    Research at SLAC has produced three Nobel Prizes in Physics

    1976: The charm quark—see J/ψ meson
    1990: Quark structure inside protons and neutrons
    1995: The tau lepton

    SLAC’s meeting facilities also provided a venue for the Homebrew Computer Club and other pioneers of the home computer revolution of the late 1970s and early 1980s.

    In 1984 the laboratory was named an ASME National Historic Engineering Landmark and an IEEE Milestone.

    SLAC developed and, in December 1991, began hosting the first World Wide Web server outside of Europe.

    In the early-to-mid 1990s, the Stanford Linear Collider (SLC) investigated the properties of the Z boson using the Stanford Large Detector.

    As of 2005, SLAC employed over 1,000 people, some 150 of whom were physicists with doctorate degrees, and served over 3,000 visiting researchers yearly. It operated particle accelerators for high-energy physics and the Stanford Synchrotron Radiation Laboratory (SSRL) for synchrotron light radiation research, which was “indispensable” in the research leading to the 2006 Nobel Prize in Chemistry awarded to Stanford Professor Roger D. Kornberg.

    In October 2008, the Department of Energy announced that the center’s name would be changed to SLAC National Accelerator Laboratory. The reasons given include a better representation of the new direction of the lab and the ability to trademark the laboratory’s name. Stanford University had legally opposed the Department of Energy’s attempt to trademark “Stanford Linear Accelerator Center”.

    In March 2009, it was announced that the SLAC National Accelerator Laboratory was to receive $68.3 million in Recovery Act Funding to be disbursed by Department of Energy’s Office of Science.

    In October 2016, Bits and Watts launched as a collaboration between SLAC and Stanford University to design “better, greener electric grids”. SLAC later pulled out over concerns about an industry partner, the state-owned Chinese electric utility.

    Accelerator

    The main accelerator was an RF linear accelerator that accelerated electrons and positrons up to 50 GeV. At 3.2 km (2.0 mi) long, the accelerator was the longest linear accelerator in the world, and was claimed to be “the world’s most straight object” until 2017, when the European X-ray free-electron laser opened. The main accelerator is buried 9 m (30 ft) below ground and passes underneath Interstate Highway 280. The above-ground klystron gallery atop the beamline was the longest building in the United States until the LIGO project’s twin interferometers were completed in 1999. It is easily distinguishable from the air and is marked as a visual waypoint on aeronautical charts.

    A portion of the original linear accelerator is now part of the Linac Coherent Light Source [below].

    Stanford Linear Collider

    The Stanford Linear Collider was a linear accelerator that collided electrons and positrons at SLAC. The center of mass energy was about 90 GeV, equal to the mass of the Z boson, which the accelerator was designed to study. Grad student Barrett D. Milliken discovered the first Z event on 12 April 1989 while poring over the previous day’s computer data from the Mark II detector. The bulk of the data was collected by the SLAC Large Detector, which came online in 1991. Although largely overshadowed by the Large Electron–Positron Collider at CERN, which began running in 1989, the highly polarized electron beam at SLC (close to 80%) made certain unique measurements possible, such as parity violation in Z Boson-b quark coupling.

    At present no beam enters the south and north arcs of the machine, which lead to the Final Focus; this section is mothballed, and beam is instead run from the beam switchyard into the PEP-II section.

    The SLAC Large Detector (SLD) was the main detector for the Stanford Linear Collider. It was designed primarily to detect Z bosons produced by the accelerator’s electron-positron collisions. Built in 1991, the SLD operated from 1992 to 1998.

    SLAC National Accelerator Laboratory (US) Large Detector

    PEP

    PEP (Positron-Electron Project) began operation in 1980, with center-of-mass energies up to 29 GeV. At its apex, PEP had five large particle detectors in operation, as well as a sixth smaller detector. About 300 researchers made use of PEP. PEP stopped operating in 1990, and PEP-II began construction in 1994.

    PEP-II

    From 1999 to 2008, the main purpose of the linear accelerator was to inject electrons and positrons into the PEP-II accelerator, an electron-positron collider with a pair of storage rings 2.2 km (1.4 mi) in circumference. PEP-II was host to the BaBar experiment, one of the so-called B-Factory experiments studying charge-parity symmetry.

    SLAC National Accelerator Laboratory(US) BaBar

    Fermi Gamma-ray Space Telescope

    SLAC plays a primary role in the mission and operation of the Fermi Gamma-ray Space Telescope, launched in August 2008. The principal scientific objectives of this mission are:

    To understand the mechanisms of particle acceleration in AGNs, pulsars, and SNRs.
    To resolve the gamma-ray sky: unidentified sources and diffuse emission.
    To determine the high-energy behavior of gamma-ray bursts and transients.
    To probe dark matter and fundamental physics.


    KIPAC

    The Stanford PULSE Institute (PULSE) is a Stanford Independent Laboratory located in the Central Laboratory at SLAC. PULSE was created by Stanford in 2005 to help Stanford faculty and SLAC scientists develop ultrafast x-ray research at LCLS.

    The Linac Coherent Light Source (LCLS)[below] is a free electron laser facility located at SLAC. The LCLS is partially a reconstruction of the last 1/3 of the original linear accelerator at SLAC, and can deliver extremely intense x-ray radiation for research in a number of areas. It achieved first lasing in April 2009.

    The laser produces hard X-rays, 10^9 times the relative brightness of traditional synchrotron sources, and is the most powerful X-ray source in the world. LCLS enables a variety of new experiments and provides enhancements for existing experimental methods. Often, X-rays are used to take “snapshots” of objects at the atomic level before obliterating samples. The laser’s wavelength, ranging from 6.2 to 0.13 nm (200 to 9500 electron volts (eV)), is similar to the width of an atom, providing extremely detailed information that was previously unattainable. Additionally, the laser is capable of capturing images with a “shutter speed” measured in femtoseconds, or million-billionths of a second, necessary because the intensity of the beam is often high enough that the sample explodes on the femtosecond timescale.
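    The quoted wavelength range follows from the photon relation E = hc/λ; a quick conversion (generic physics, not LCLS software) confirms that 6.2 nm and 0.13 nm correspond to roughly 200 eV and 9500 eV:

    ```python
    HC_EV_NM = 1239.841984  # h*c in eV*nm

    def photon_energy_ev(wavelength_nm: float) -> float:
        """Photon energy in eV for a wavelength in nm (E = h*c / lambda)."""
        return HC_EV_NM / wavelength_nm

    def photon_wavelength_nm(energy_ev: float) -> float:
        """Photon wavelength in nm for an energy in eV (lambda = h*c / E)."""
        return HC_EV_NM / energy_ev

    # photon_energy_ev(6.2) is just under 200 eV; photon_energy_ev(0.13)
    # is about 9540 eV, matching the quoted hard X-ray range.
    ```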

    The LCLS-II [below] project will provide a major upgrade to LCLS by adding two new X-ray laser beams. The new system will utilize the 500 m (1,600 ft) of existing tunnel to add a new superconducting accelerator at 4 GeV and two new sets of undulators that will increase the available energy range of LCLS. Discoveries enabled by these new capabilities may include new drugs, next-generation computers, and new materials.

    FACET

    In 2012, the first two-thirds (~2 km) of the original SLAC LINAC were recommissioned for a new user facility, the Facility for Advanced Accelerator Experimental Tests (FACET). This facility was capable of delivering 20 GeV, 3 nC electron (and positron) beams with short bunch lengths and small spot sizes, ideal for beam-driven plasma acceleration studies. The facility ended operations in 2016 for the construction of LCLS-II, which will occupy the first third of the SLAC LINAC. The FACET-II project will re-establish electron and positron beams in the middle third of the LINAC for the continuation of beam-driven plasma acceleration studies in 2019.

    The Next Linear Collider Test Accelerator (NLCTA) is a 60-120 MeV high-brightness electron beam linear accelerator used for experiments on advanced beam manipulation and acceleration techniques. It is located at SLAC’s End Station B.

    SSRL and LCLS are DOE Office of Science user facilities.

    Stanford University (US)

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University, officially Leland Stanford Junior University, is a private research university located in Stanford, California. Stanford was founded in 1885 by Leland and Jane Stanford in memory of their only child, Leland Stanford Jr., who had died of typhoid fever at age 15 the previous year. Stanford is consistently ranked as among the most prestigious and top universities in the world by major education publications. It is also one of the top fundraising institutions in the country, becoming the first school to raise more than a billion dollars in a year.

    Leland Stanford was a U.S. senator and former governor of California who made his fortune as a railroad tycoon. The school admitted its first students on October 1, 1891, as a coeducational and non-denominational institution. Stanford University struggled financially after the death of Leland Stanford in 1893 and again after much of the campus was damaged by the 1906 San Francisco earthquake. Following World War II, provost Frederick Terman supported faculty and graduates’ entrepreneurialism to build self-sufficient local industry in what would later be known as Silicon Valley.

    The university is organized around seven schools: three schools consisting of 40 academic departments at the undergraduate level as well as four professional schools that focus on graduate programs in law, medicine, education, and business. All schools are on the same campus. Students compete in 36 varsity sports, and the university is one of two private institutions in the Division I FBS Pac-12 Conference. It has gained 126 NCAA team championships, and Stanford has won the NACDA Directors’ Cup for 24 consecutive years, beginning in 1994–1995. In addition, Stanford students and alumni have won 270 Olympic medals including 139 gold medals.

    As of October 2020, 84 Nobel laureates, 28 Turing Award laureates, and eight Fields Medalists have been affiliated with Stanford as students, alumni, faculty, or staff. In addition, Stanford is particularly noted for its entrepreneurship and is one of the most successful universities in attracting funding for start-ups. Stanford alumni have founded numerous companies, which combined produce more than $2.7 trillion in annual revenue, roughly equivalent to the 7th largest economy in the world (as of 2020). Stanford is the alma mater of one president of the United States (Herbert Hoover), 74 living billionaires, and 17 astronauts. It is also one of the leading producers of Fulbright Scholars, Marshall Scholars, Rhodes Scholars, and members of the United States Congress.

    Stanford University was founded in 1885 by Leland and Jane Stanford, dedicated to Leland Stanford Jr, their only child. The institution opened in 1891 on Stanford’s previous Palo Alto farm.

    Jane and Leland Stanford modeled their university after the great eastern universities, most specifically Cornell University. When it opened in 1891, Stanford was called the “Cornell of the West” because many of its faculty were former Cornell affiliates (professors, alumni, or both), including its first president, David Starr Jordan, and second president, John Casper Branner. Both Cornell and Stanford were among the first to make higher education accessible, nonsectarian, and open to women as well as to men. Cornell is credited as one of the first American universities to adopt this radical departure from traditional education, and Stanford became an early adopter as well.

    Despite being impacted by earthquakes in both 1906 and 1989, the campus was rebuilt each time. In 1919, The Hoover Institution on War, Revolution and Peace was started by Herbert Hoover to preserve artifacts related to World War I. The Stanford Medical Center, completed in 1959, is a teaching hospital with over 800 beds. The DOE’s SLAC National Accelerator Laboratory(US)(originally named the Stanford Linear Accelerator Center), established in 1962, performs research in particle physics.

    Land

    Most of Stanford is on an 8,180-acre (12.8 sq mi; 33.1 km^2) campus, one of the largest in the United States. It is located on the San Francisco Peninsula, in the northwest part of the Santa Clara Valley (Silicon Valley) approximately 37 miles (60 km) southeast of San Francisco and approximately 20 miles (30 km) northwest of San Jose. In 2008, 60% of this land remained undeveloped.

    Stanford’s main campus includes a census-designated place within unincorporated Santa Clara County, although some of the university land (such as the Stanford Shopping Center and the Stanford Research Park) is within the city limits of Palo Alto. The campus also includes much land in unincorporated San Mateo County (including the SLAC National Accelerator Laboratory and the Jasper Ridge Biological Preserve), as well as in the city limits of Menlo Park (Stanford Hills neighborhood), Woodside, and Portola Valley.

    Non-central campus

    Stanford currently operates in various locations outside of its central campus.

    On the founding grant:

    Jasper Ridge Biological Preserve is a 1,200-acre (490 ha) natural reserve south of the central campus owned by the university and used by wildlife biologists for research.
    SLAC National Accelerator Laboratory is a facility west of the central campus operated by the university for the Department of Energy. It contains the longest linear particle accelerator in the world, 2 miles (3.2 km) on 426 acres (172 ha) of land.
    Golf course and a seasonal lake: The university also has its own golf course and a seasonal lake (Lake Lagunita, actually an irrigation reservoir), both home to the vulnerable California tiger salamander. As of 2012 Lake Lagunita was often dry and the university had no plans to artificially fill it.

    Off the founding grant:

    Hopkins Marine Station, in Pacific Grove, California, is a marine biology research center owned by the university since 1892.
    Study abroad locations: unlike typical study abroad programs, Stanford itself operates in several locations around the world; thus, each location has Stanford faculty-in-residence and staff in addition to students, creating a “mini-Stanford”.

    Redwood City campus for many of the university’s administrative offices located in Redwood City, California, a few miles north of the main campus. In 2005, the university purchased a small, 35-acre (14 ha) campus in Midpoint Technology Park intended for staff offices; development was delayed by The Great Recession. In 2015 the university announced a development plan and the Redwood City campus opened in March 2019.

    The Bass Center in Washington, DC provides a base, including housing, for the Stanford in Washington program for undergraduates. It includes a small art gallery open to the public.

    China: Stanford Center at Peking University, housed in the Lee Jung Sen Building, is a small center for researchers and students in collaboration with Peking University [北京大学](CN) and the Kavli Institute for Astronomy and Astrophysics at Peking University (KIAA-PKU).

    Administration and organization

    Stanford is a private, non-profit university that is administered as a corporate trust governed by a privately appointed board of trustees with a maximum membership of 38. Trustees serve five-year terms (not more than two consecutive terms) and meet five times annually. A new trustee is chosen by the current trustees by ballot. The Stanford trustees also oversee the Stanford Research Park, the Stanford Shopping Center, the Cantor Center for Visual Arts, Stanford University Medical Center, and many associated medical facilities (including the Lucile Packard Children’s Hospital).

    The board appoints a president to serve as the chief executive officer of the university, to prescribe the duties of professors and course of study, to manage financial and business affairs, and to appoint nine vice presidents. The provost is the chief academic and budget officer, to whom the deans of each of the seven schools report. Persis Drell became the 13th provost in February 2017.

    As of 2018, the university was organized into seven academic schools. The schools of Humanities and Sciences (27 departments), Engineering (nine departments), and Earth, Energy & Environmental Sciences (four departments) have both graduate and undergraduate programs while the Schools of Law, Medicine, Education and Business have graduate programs only. The powers and authority of the faculty are vested in the Academic Council, which is made up of tenure and non-tenure line faculty, research faculty, senior fellows in some policy centers and institutes, the president of the university, and some other academic administrators, but most matters are handled by the Faculty Senate, made up of 55 elected representatives of the faculty.

    The Associated Students of Stanford University (ASSU) is the student government for Stanford and all registered students are members. Its elected leadership consists of the Undergraduate Senate elected by the undergraduate students, the Graduate Student Council elected by the graduate students, and the President and Vice President elected as a ticket by the entire student body.

    Stanford is the beneficiary of a special clause in the California Constitution, which explicitly exempts Stanford property from taxation so long as the property is used for educational purposes.

    Endowment and donations

    The university’s endowment, managed by the Stanford Management Company, was valued at $27.7 billion as of August 31, 2019. Payouts from the Stanford endowment covered approximately 21.8% of university expenses in the 2019 fiscal year. In the 2018 NACUBO-TIAA survey of colleges and universities in the United States and Canada, only Harvard University(US), the University of Texas System(US), and Yale University(US) had larger endowments than Stanford.

    In 2006, President John L. Hennessy launched a five-year campaign called the Stanford Challenge, which reached its $4.3 billion fundraising goal in 2009, two years ahead of time, but continued fundraising for the duration of the campaign. It concluded on December 31, 2011, having raised a total of $6.23 billion and breaking the previous campaign fundraising record of $3.88 billion held by Yale. Specifically, the campaign raised $253.7 million for undergraduate financial aid, as well as $2.33 billion for its initiative in “Seeking Solutions” to global problems, $1.61 billion for “Educating Leaders” by improving K-12 education, and $2.11 billion for “Foundation of Excellence” aimed at providing academic support for Stanford students and faculty. Funds supported 366 new fellowships for graduate students, 139 new endowed chairs for faculty, and 38 new or renovated buildings. The new funding also enabled the construction of a facility for stem cell research; a new campus for the business school; an expansion of the law school; a new Engineering Quad; a new art and art history building; an on-campus concert hall; a new art museum; and a planned expansion of the medical school, among other things. In 2012, the university raised $1.035 billion, becoming the first school to raise more than a billion dollars in a year.

    Research centers and institutes

    DOE’s SLAC National Accelerator Laboratory(US)
    Stanford Research Institute, a center of innovation to support economic development in the region.
    Hoover Institution, a conservative American public policy institution and research institution that promotes personal and economic liberty, free enterprise, and limited government.
    Hasso Plattner Institute of Design, a multidisciplinary design school in cooperation with the Hasso Plattner Institute of University of Potsdam [Universität Potsdam](DE) that integrates product design, engineering, and business management education.
    Martin Luther King Jr. Research and Education Institute, which grew out of and still contains the Martin Luther King Jr. Papers Project.
    John S. Knight Fellowship for Professional Journalists
    Center for Ocean Solutions
    Together with UC Berkeley(US) and UC San Francisco(US), Stanford is part of the Biohub, a new medical science research center founded in 2016 by a $600 million commitment from Facebook CEO and founder Mark Zuckerberg and pediatrician Priscilla Chan.

    Discoveries and innovation

    Natural sciences

    Biological synthesis of deoxyribonucleic acid (DNA) – Arthur Kornberg synthesized DNA material and won the Nobel Prize in Physiology or Medicine 1959 for his work at Stanford.
    First Transgenic organism – Stanley Cohen and Herbert Boyer were the first scientists to transplant genes from one living organism to another, a fundamental discovery for genetic engineering. Thousands of products have been developed on the basis of their work, including human growth hormone and hepatitis B vaccine.
    Laser – Arthur Leonard Schawlow shared the 1981 Nobel Prize in Physics with Nicolaas Bloembergen and Kai Siegbahn for his work on lasers.
    Nuclear magnetic resonance – Felix Bloch developed new methods for nuclear magnetic precision measurements, which are the underlying principles of the MRI.

    Computer and applied sciences

    ARPANET – Stanford Research Institute, formerly part of Stanford but on a separate campus, was the site of one of the four original ARPANET nodes.

    Internet—Stanford was the site where the original design of the Internet was undertaken. Vint Cerf led a research group to elaborate the design of the Transmission Control Protocol (TCP/IP) that he originally co-created with Robert E. Kahn (Bob Kahn) in 1973 and which formed the basis for the architecture of the Internet.

    Frequency modulation synthesis – John Chowning of the Music department invented the FM music synthesis algorithm in 1967, and Stanford later licensed it to Yamaha Corporation.

    Google – Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford. They were working on the Stanford Digital Library Project (SDLP). The SDLP’s goal was “to develop the enabling technologies for a single, integrated and universal digital library” and it was funded through the National Science Foundation, among other federal agencies.

    Klystron tube – invented by the brothers Russell and Sigurd Varian at Stanford. Their prototype was completed and demonstrated successfully on August 30, 1937. Upon publication in 1939, news of the klystron immediately influenced the work of U.S. and UK researchers working on radar equipment.

    RISC – ARPA-funded VLSI project of microprocessor design. Stanford and UC Berkeley are most associated with the popularization of this concept. The Stanford MIPS would go on to be commercialized as the successful MIPS architecture, while Berkeley RISC gave its name to the entire concept, commercialized as the SPARC. Another success from this era was IBM's effort that eventually led to the IBM POWER instruction set architecture, PowerPC, and Power ISA. As these projects matured, a wide variety of similar designs flourished in the late 1980s and especially the early 1990s, representing a major force in the Unix workstation market as well as embedded processors in laser printers, routers and similar products.
    SUN workstation – Andy Bechtolsheim designed the SUN workstation for the Stanford University Network communications project as a personal CAD workstation, which led to Sun Microsystems.
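
    The frequency-modulation synthesis entry above can be illustrated with a short sketch. This is a generic, minimal rendering of Chowning's two-operator idea (one sine wave modulating the phase of another), not Stanford's or Yamaha's licensed implementation; all parameter names and values here are illustrative:

    ```python
    import math

    def fm_sample(t, carrier_hz=440.0, mod_hz=110.0, index=2.0):
        """One sample of two-operator FM synthesis: a modulator sine wave
        varies the instantaneous phase of a carrier sine wave."""
        return math.sin(2 * math.pi * carrier_hz * t
                        + index * math.sin(2 * math.pi * mod_hz * t))

    # One second of audio at an 8 kHz sample rate.
    rate = 8000
    samples = [fm_sample(n / rate) for n in range(rate)]
    ```

    Raising the modulation index spreads energy into sidebands around the carrier, which is what gives FM synthesis its rich, bell-like timbres from only two oscillators.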

    Businesses and entrepreneurship

    Stanford is one of the most successful universities in creating companies and licensing its inventions to existing companies; it is often held up as a model for technology transfer. Stanford’s Office of Technology Licensing is responsible for commercializing university research, intellectual property, and university-developed projects.

    The university is described as having a strong venture culture in which students are encouraged, and often funded, to launch their own companies.

    Companies founded by Stanford alumni generate more than $2.7 trillion in annual revenue, equivalent to the 10th-largest economy in the world.

    Some companies closely associated with Stanford and their connections include:

    Hewlett-Packard, 1939, co-founders William R. Hewlett (B.S., Ph.D.) and David Packard (M.S.).
    Silicon Graphics, 1981, co-founders James H. Clark (Associate Professor) and several of his grad students.
    Sun Microsystems, 1982, co-founders Vinod Khosla (M.B.A.), Andy Bechtolsheim (Ph.D.) and Scott McNealy (M.B.A.).
    Cisco, 1984, founders Leonard Bosack (M.S.) and Sandy Lerner (M.S.), who were in charge of the Stanford Computer Science and Graduate School of Business computer operations groups respectively when the hardware was developed.[163]
    Yahoo!, 1994, co-founders Jerry Yang (B.S., M.S.) and David Filo (M.S.).
    Google, 1998, co-founders Larry Page (M.S.) and Sergey Brin (M.S.).
    LinkedIn, 2002, co-founders Reid Hoffman (B.S.), Konstantin Guericke (B.S., M.S.), Eric Lee (B.S.), and Alan Liu (B.S.).
    Instagram, 2010, co-founders Kevin Systrom (B.S.) and Mike Krieger (B.S.).
    Snapchat, 2011, co-founders Evan Spiegel and Bobby Murphy (B.S.).
    Coursera, 2012, co-founders Andrew Ng (Associate Professor) and Daphne Koller (Professor, Ph.D.).

    Student body

    Stanford enrolled 6,996 undergraduate and 10,253 graduate students as of the 2019–2020 school year. Women comprised 50.4% of undergraduates and 41.5% of graduate students. In the same academic year, the freshman retention rate was 99%.

    Stanford awarded 1,819 undergraduate degrees, 2,393 master’s degrees, 770 doctoral degrees, and 3,270 professional degrees in the 2018–2019 school year. The four-year graduation rate for the class of 2017 cohort was 72.9%, and the six-year rate was 94.4%. The relatively low four-year graduation rate is a function of the university’s coterminal degree (or “coterm”) program, which allows students to earn a master’s degree as a 1-to-2-year extension of their undergraduate program.

    As of 2010, fifteen percent of undergraduates were first-generation students.

    Athletics

    As of 2016, Stanford had 16 male varsity sports and 20 female varsity sports, 19 club sports and about 27 intramural sports. In 1930, following a unanimous vote by the Executive Committee for the Associated Students, the athletic department adopted the mascot “Indian.” The Indian symbol and name were dropped by President Richard Lyman in 1972, after objections from Native American students and a vote by the student senate. The sports teams are now officially referred to as the “Stanford Cardinal,” referring to the deep red color, not the cardinal bird. Stanford is a member of the Pac-12 Conference in most sports, the Mountain Pacific Sports Federation in several other sports, and the America East Conference in field hockey, and competes in the NCAA’s Division I (FBS for football).

    Its traditional sports rival is the University of California, Berkeley, the neighbor to the north in the East Bay. The winner of the annual “Big Game” between the Cal and Cardinal football teams gains custody of the Stanford Axe.

    Stanford has had at least one NCAA team champion every year since the 1976–77 school year and has earned 126 NCAA national team titles since its establishment, the most among universities; Stanford athletes have also won 522 individual national championships, the most by any university. Stanford has won the NACDA Directors’ Cup, awarded to the top-ranked Division I athletic program and formerly known as the Sears Cup, for twenty-four straight years. Stanford athletes have won medals in every Olympic Games since 1912, winning 270 Olympic medals in total, 139 of them gold. In the 2008 and 2016 Summer Olympics, Stanford won more Olympic medals than any other university in the United States. Stanford athletes won 16 medals at the 2012 Summer Olympics (12 gold, two silver and two bronze) and 27 medals at the 2016 Summer Olympics.

    Traditions

    The unofficial motto of Stanford, selected by President Jordan, is Die Luft der Freiheit weht. Translated from the German language, this quotation from Ulrich von Hutten means, “The wind of freedom blows.” The motto was controversial during World War I, when anything in German was suspect; at that time the university disavowed that this motto was official.
    Hail, Stanford, Hail! is the Stanford Hymn sometimes sung at ceremonies or adapted by the various University singing groups. It was written in 1892 by mechanical engineering professor Albert W. Smith and his wife, Mary Roberts Smith (in 1896 she earned the first Stanford doctorate in Economics and later became associate professor of Sociology), but was not officially adopted until after a performance on campus in March 1902 by the Mormon Tabernacle Choir.
    “Uncommon Man/Uncommon Woman”: Stanford does not award honorary degrees, but in 1953 the degree of “Uncommon Man/Uncommon Woman” was created to recognize individuals who give rare and extraordinary service to the University. Technically, this degree is awarded by the Stanford Associates, a voluntary group that is part of the university’s alumni association. As Stanford’s highest honor, it is not conferred at prescribed intervals, but only when appropriate to recognize extraordinary service. Recipients include Herbert Hoover, Bill Hewlett, Dave Packard, Lucile Packard, and John Gardner.
    Big Game events: The events in the week leading up to the Big Game vs. UC Berkeley, including Gaieties (a musical written, composed, produced, and performed by the students of Ram’s Head Theatrical Society).
    “Viennese Ball”: a formal ball with waltzes, started in the 1970s by students returning from the now-closed Stanford in Vienna overseas program. It is now open to all students.
    “Full Moon on the Quad”: An annual event at Main Quad, where students gather to kiss one another starting at midnight. Typically organized by the Junior class cabinet, the festivities include live entertainment, such as music and dance performances.
    “Band Run”: An annual festivity at the beginning of the school year, where the band picks up freshmen from dorms across campus while stopping to perform at each location, culminating in a finale performance at Main Quad.
    “Mausoleum Party”: An annual Halloween Party at the Stanford Mausoleum, the final resting place of Leland Stanford Jr. and his parents. A 20-year tradition, the “Mausoleum Party” was on hiatus from 2002 to 2005 due to a lack of funding, but was revived in 2006. In 2008, it was hosted in Old Union rather than at the actual Mausoleum, because rain prohibited generators from being rented. In 2009, after fundraising efforts by the Junior Class Presidents and the ASSU Executive, the event was able to return to the Mausoleum despite facing budget cuts earlier in the year.
    Former campus traditions include the “Big Game bonfire” on Lake Lagunita (a seasonal lake usually dry in the fall), which was formally ended in 1997 because of the presence of endangered salamanders in the lake bed.

    Award laureates and scholars

    Stanford’s current community of scholars includes:

    19 Nobel Prize laureates (as of October 2020, 85 affiliates in total)
    171 members of the National Academy of Sciences
    109 members of the National Academy of Engineering
    76 members of the National Academy of Medicine
    288 members of the American Academy of Arts and Sciences
    19 recipients of the National Medal of Science
    1 recipient of the National Medal of Technology
    4 recipients of the National Humanities Medal
    49 members of the American Philosophical Society
    56 fellows of the American Physical Society (since 1995)
    4 Pulitzer Prize winners
    31 MacArthur Fellows
    4 Wolf Foundation Prize winners
    2 ACL Lifetime Achievement Award winners
    14 AAAI fellows
    2 Presidential Medal of Freedom winners

    Stanford University Seal

     
  • richardmitnick 10:49 am on September 10, 2021 Permalink | Reply
    Tags: "Nano ‘camera’ made using molecular glue allows real-time monitoring of chemical reactions", Applied Research & Technology, , Cucurbituril: a molecular glue which interacts strongly with both semiconductor quantum dots and gold nanoparticles., The platform could be used to study a wide range of molecules for a variety of potential applications such as the improvement of photocatalysis and photovoltaics for renewable energy.,   

    From University of Cambridge (UK) : “Nano ‘camera’ made using molecular glue allows real-time monitoring of chemical reactions” 

    U Cambridge bloc

    From University of Cambridge (UK)

    02 Sep 2021
    Sarah Collins

    Researchers have made a tiny camera held together with ‘molecular glue’ that allows them to observe chemical reactions in real time.

    The device, made by a team from the University of Cambridge, combines tiny semiconductor nanocrystals called quantum dots and gold nanoparticles using molecular glue called cucurbituril (CB). When added to water with the molecule to be studied, the components self-assemble in seconds into a stable, powerful tool that allows the real-time monitoring of chemical reactions.

    The camera harvests light within the semiconductors, inducing electron transfer processes like those that occur in photosynthesis, which can be monitored using incorporated gold nanoparticle sensors and spectroscopic techniques. They were able to use the camera to observe chemical species which had been previously theorised but not directly observed.

    The platform could be used to study a wide range of molecules for a variety of potential applications such as the improvement of photocatalysis and photovoltaics for renewable energy. The results are reported in the journal Nature Nanotechnology.

    Nature controls the assemblies of complex structures at the molecular scale through self-limiting processes. However, mimicking these processes in the lab is usually time-consuming, expensive and reliant on complex procedures.

    “In order to develop new materials with superior properties, we often combine different chemical species together to come up with a hybrid material that has the properties we want,” said Professor Oren Scherman from Cambridge’s Yusuf Hamied Department of Chemistry, who led the research. “But making these hybrid nanostructures is difficult, and you often end up with uncontrolled growth or materials that are unstable.”

    The new method that Scherman and his colleagues from Cambridge’s Cavendish Laboratory and University College London (UK) developed uses cucurbituril – a molecular glue which interacts strongly with both semiconductor quantum dots and gold nanoparticles. The researchers used small semiconductor nanocrystals to control the assembly of larger nanoparticles through a process they coined interfacial self-limiting aggregation. The process leads to permeable and stable hybrid materials that interact with light. The camera was used to observe photocatalysis and track light-induced electron transfer.

    “We were surprised how powerful this new tool is, considering how straightforward it is to assemble,” said first author Dr Kamil Sokołowski, also from the Department of Chemistry.

    To make their nano camera, the team added the individual components, along with the molecule they wanted to observe, to water at room temperature. Previously, when gold nanoparticles were mixed with the molecular glue in the absence of quantum dots, the components underwent unlimited aggregation and fell out of solution. However, with the strategy developed by the researchers, quantum dots mediate the assembly of these nanostructures so that the semiconductor-metal hybrids control and limit their own size and shape. In addition, these structures stay stable for weeks.

    “This self-limiting property was surprising, it wasn’t anything we expected to see,” said co-author Dr Jade McCune, also from the Department of Chemistry. “We found that the aggregation of one nanoparticulate component could be controlled through the addition of another nanoparticle component.”

    When the researchers mixed the components together, the team used spectroscopy to observe chemical reactions in real time. Using the camera, they were able to observe the formation of radical species – a molecule with an unpaired electron – and products of their assembly such as sigma dimeric viologen species, where two radicals form a reversible carbon-carbon bond. The latter species had been theorised but never observed.

    “People have spent their whole careers getting pieces of matter to come together in a controlled way,” said Scherman, who is also Director of the Melville Laboratory. “This platform will unlock a wide range of processes, including many materials and chemistries that are important for sustainable technologies. The full potential of semiconductor and plasmonic nanocrystals can now be explored, providing an opportunity to simultaneously induce and observe photochemical reactions.”

    “This platform is a really big toolbox considering the number of metal and semiconductor building blocks that can be now coupled together using this chemistry – it opens up lots of new possibilities for imaging chemical reactions and sensing through taking snapshots of monitored chemical systems,” said Sokołowski. “The simplicity of the setup means that researchers no longer need complex, expensive methods to get the same results.”

    Researchers from the Scherman lab are currently working to further develop these hybrids towards artificial photosynthetic systems and (photo)catalysis where electron-transfer processes can be observed directly in real time. The team is also looking at mechanisms of carbon-carbon bond formation as well as electrode interfaces for battery applications.

    The research was carried out in collaboration with Professor Jeremy Baumberg at Cambridge’s Cavendish Laboratory and Dr Edina Rosta at University College London. It was funded in part by the Engineering and Physical Sciences Research Council (EPSRC).

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Cambridge Campus

    The University of Cambridge (UK) [legally The Chancellor, Masters, and Scholars of the University of Cambridge] is a collegiate public research university in Cambridge, England. Founded in 1209, Cambridge is the second-oldest university in the English-speaking world and the world’s fourth-oldest surviving university. It grew out of an association of scholars who left the University of Oxford(UK) after a dispute with townsfolk. The two ancient universities share many common features and are often jointly referred to as “Oxbridge”.

    Cambridge is formed from a variety of institutions which include 31 semi-autonomous constituent colleges and over 150 academic departments, faculties and other institutions organised into six schools. All the colleges are self-governing institutions within the university, each controlling its own membership and with its own internal structure and activities. All students are members of a college. Cambridge does not have a main campus and its colleges and central facilities are scattered throughout the city. Undergraduate teaching at Cambridge is organised around weekly small-group supervisions in the colleges – a feature unique to the Oxbridge system. These are complemented by classes, lectures, seminars, laboratory work and occasionally further supervisions provided by the central university faculties and departments. Postgraduate teaching is provided predominantly centrally.

    Cambridge University Press, a department of the university, is the oldest university press in the world and currently the second-largest university press in the world. Cambridge Assessment, also a department of the university, is one of the world’s leading examining bodies and provides assessment to over eight million learners globally every year. The university also operates eight cultural and scientific museums, including the Fitzwilliam Museum, as well as a botanic garden. Cambridge’s libraries – of which there are 116 – hold a total of around 16 million books, around nine million of which are in Cambridge University Library, a legal deposit library. The university is home to – but independent of – the Cambridge Union – the world’s oldest debating society. The university is closely linked to the development of the high-tech business cluster known as “Silicon Fen”. It is the central member of Cambridge University Health Partners, an academic health science centre based around the Cambridge Biomedical Campus.

    By both endowment size and consolidated assets Cambridge is the wealthiest university in the United Kingdom. In the fiscal year ending 31 July 2019, the central university – excluding colleges – had a total income of £2.192 billion of which £592.4 million was from research grants and contracts. At the end of the same financial year the central university and colleges together possessed a combined endowment of over £7.1 billion and overall consolidated net assets (excluding “immaterial” historical assets) of over £12.5 billion. It is a member of numerous associations and forms part of the ‘golden triangle’ of English universities.

    Cambridge has educated many notable alumni, including eminent mathematicians, scientists, politicians, lawyers, philosophers, writers, actors, monarchs and other heads of state. As of October 2020, 121 Nobel laureates, 11 Fields Medalists, 7 Turing Award winners and 14 British prime ministers have been affiliated with Cambridge as students, alumni, faculty or research staff. University alumni have won 194 Olympic medals.

    History

    By the late 12th century the Cambridge area already had a scholarly and ecclesiastical reputation, due to monks from the nearby bishopric church of Ely. However, it was an incident at Oxford which is most likely to have led to the establishment of the university: three Oxford scholars were hanged by the town authorities for the death of a woman without consulting the ecclesiastical authorities, who would normally take precedence (and pardon the scholars) in such a case but were at that time in conflict with King John. Fearing more violence from the townsfolk, scholars from the University of Oxford started to move away to cities such as Paris, Reading and Cambridge. Subsequently, enough scholars remained in Cambridge to form the nucleus of a new university when it had become safe enough for academia to resume at Oxford. In order to claim precedence, it is common for Cambridge to trace its founding to the 1231 charter from Henry III granting it the right to discipline its own members (ius non trahi extra) and an exemption from some taxes; Oxford was not granted similar rights until 1248.

    A bull in 1233 from Pope Gregory IX gave graduates from Cambridge the right to teach “everywhere in Christendom”. After Cambridge was described as a studium generale in a letter from Pope Nicholas IV in 1290 and confirmed as such in a bull by Pope John XXII in 1318 it became common for researchers from other European medieval universities to visit Cambridge to study or to give lecture courses.

    Foundation of the colleges

    The colleges at the University of Cambridge were originally an incidental feature of the system. No college is as old as the university itself. The colleges were endowed fellowships of scholars. There were also institutions without endowments called hostels. The hostels were gradually absorbed by the colleges over the centuries; but they have left some traces, such as the name of Garret Hostel Lane.

    Hugh Balsham, Bishop of Ely, founded Peterhouse, Cambridge’s first college, in 1284. Many colleges were founded during the 14th and 15th centuries, but colleges continued to be established until modern times. There was a gap of 204 years between the founding of Sidney Sussex in 1596 and that of Downing in 1800. The most recently established college is Robinson, built in the late 1970s. However, Homerton College only achieved full university college status in March 2010, making it the newest full college (it was previously an “Approved Society” affiliated with the university).

    In medieval times many colleges were founded so that their members would pray for the souls of the founders and were often associated with chapels or abbeys. The colleges’ focus changed in 1536 with the Dissolution of the Monasteries. Henry VIII ordered the university to disband its Faculty of Canon Law and to stop teaching “scholastic philosophy”. In response, colleges changed their curricula away from canon law and towards the classics, the Bible and mathematics.

    Nearly a century later the university was at the centre of a Protestant schism. Many nobles, intellectuals and even commoners saw the ways of the Church of England as too similar to the Catholic Church and felt that it was used by the Crown to usurp the rightful powers of the counties. East Anglia was the centre of what became the Puritan movement. In Cambridge the movement was particularly strong at Emmanuel, St Catharine’s Hall, Sidney Sussex and Christ’s College. They produced many “non-conformist” graduates who, greatly influenced by social position or preaching, left for New England and especially the Massachusetts Bay Colony during the Great Migration decade of the 1630s. Oliver Cromwell, Parliamentary commander during the English Civil War and head of the English Commonwealth (1649–1660), attended Sidney Sussex.

    Modern period

    After the Cambridge University Act 1856 formalised the organisational structure of the university, the study of many new subjects was introduced, e.g. theology, history and modern languages. Resources necessary for new courses in the arts, architecture and archaeology were donated by Viscount Fitzwilliam of Trinity College, who also founded the Fitzwilliam Museum. In 1847 Prince Albert was elected Chancellor of the University of Cambridge after a close contest with the Earl of Powis. Albert used his position as Chancellor to campaign successfully for reformed and more modern university curricula, expanding the subjects taught beyond the traditional mathematics and classics to include modern history and the natural sciences. Between 1896 and 1902 Downing College sold part of its land to build the Downing Site, with new scientific laboratories for anatomy, genetics, and Earth sciences. During the same period the New Museums Site was erected, including the Cavendish Laboratory, which has since moved to the West Cambridge Site, and other departments for chemistry and medicine.

    The University of Cambridge began to award PhD degrees in the first third of the 20th century. The first Cambridge PhD in mathematics was awarded in 1924.

    In the First World War 13,878 members of the university served and 2,470 were killed. Teaching and the fees it earned came almost to a stop and severe financial difficulties followed. As a consequence the university first received systematic state support in 1919 and a Royal Commission appointed in 1920 recommended that the university (but not the colleges) should receive an annual grant. Following the Second World War the university saw a rapid expansion of student numbers and available places; this was partly due to the success and popularity gained by many Cambridge scientists.

     
  • richardmitnick 10:27 am on September 10, 2021 Permalink | Reply
    Tags: "An insulator made of two conductors", Applied Research & Technology, In graphene layers twisted relative to each other two electrical conductors team up to form an insulator., ,   

    From Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH): “An insulator made of two conductors” 

    From Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH)

    09.09.2021
    Oliver Morsch

    At ETH Zürich researchers have observed a new state of matter: in graphene layers twisted relative to each other, two electrical conductors team up to form an insulator.

    In two graphene double layers twisted relative to each other (red and blue), insulating states consisting of electron-hole pairs (‘-‘ and ‘+’) can form. (Visualisations: Peter Rickhaus / ETH Zürich)

    Ohm’s law is well-known from physics class. It states that the resistance of a conductor and the voltage applied to it determine how much current will flow through the conductor. The electrons in the material – the negatively charged carriers – move in a disordered fashion and largely independently of each other. Physicists find it far more interesting, however, when the charge carriers influence one another strongly enough for that simple picture not to be correct anymore.
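
    As a back-of-envelope illustration of the textbook picture the article starts from (not part of the ETH study itself), Ohm’s law I = V / R can be written out directly; the function name below is my own:

    ```python
    def current_amperes(voltage_v, resistance_ohm):
        """Ohm's law: the current through a conductor equals the applied
        voltage divided by the conductor's resistance (I = V / R)."""
        return voltage_v / resistance_ohm

    # A 12 V supply across a 4-ohm resistor drives a 3 A current.
    print(current_amperes(12.0, 4.0))
    ```

    The point of the article is precisely that strongly correlated materials like twisted bilayer graphene break this simple linear relation.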

    This is the case, for instance, in “Twisted Bilayer Graphene”, which was discovered a few years ago. That material is made from two wafer-thin graphene layers, each consisting of a single layer of carbon atoms. If two neighbouring layers are slightly twisted with respect to each other, the electrons can be influenced in such a way that they interact strongly with one another. As a consequence, the material can, for instance, become superconducting and hence conduct current without any losses.

    A team of researchers led by Klaus Ensslin and Thomas Ihn at the Laboratory for Solid State Physics at ETH Zürich, together with colleagues at The University of Texas-Austin (US), has now observed a novel state in twisted double layers of graphene. In that state, negatively charged electrons and positively charged so-called holes, which are missing electrons in the material, are correlated so strongly with each other that the material no longer conducts electric current.

    Twisted graphene layers

    “In conventional experiments, in which graphene layers are twisted by about one degree with respect to each other, the mobility of the electrons is influenced by quantum mechanical tunnelling between the layers”, explains Peter Rickhaus, a post-doc and lead author of the study recently published in the journal Science. “In our new experiment, by contrast, we twist two double layers of graphene by more than two degrees relative to each other, so that electrons can essentially no longer tunnel between the double layers.”

    Increased resistance through coupling

    As a result, applying an electric field can create electrons in one of the double layers and holes in the other. Since both electrons and holes conduct electric current, one would expect the two graphene double layers together to form an even better conductor with a lower resistance.

    Under certain circumstances, however, the exact opposite can happen, as Folkert de Vries, a post-doc in Ensslin’s team, explains: “If we adjust the electric field in such a way as to have the same number of electrons and holes in the double layers, the resistance suddenly increases sharply.” For several weeks Ensslin and his collaborators were unable to make sense of that surprising result, but eventually their theory colleague Allan H. MacDonald from Austin gave them a decisive hint: according to MacDonald, they had observed a new kind of density wave.

    So-called charge density waves usually arise in one-dimensional conductors when the electrons in the material collectively conduct electric current and also spatially arrange themselves into waves. In the experiment performed by the ETH researchers, it is now the electrons and holes that pair with each other by electrostatic attraction and thus form a collective density wave. That density wave, however, now consists of electrically neutral electron-hole pairs, so that the two double layers taken together can no longer conduct electric current.

    Twisted graphene (left) is sandwiched between two-dimensional insulators and attached to contacts in order to measure electric current (centre). An electron-hole state is then created by applying a large voltage to the gate electrodes (right). (Visualisations: Peter Rickhaus / ETH Zürich).

    New correlated state

    “That’s a completely new correlated state of electrons and holes which has no overall charge”, says Ensslin. “This neutral state can, nevertheless, transmit information or conduct heat. Moreover, what’s special about it is that we can completely control it through the twisting angle and the applied voltage.” Similar states have been observed in other materials in which electron-hole pairs (also known as excitons) are created through excitation using laser light. In the experiment at ETH, however, the electrons and holes are in their ground state, or state of lowest energy, which means that their lifetime is not limited by spontaneous decay.

    Possible application in quantum technologies

    Ensslin, who specializes in the investigation of the electronic properties of small quantum systems, is already speculating about possible practical applications for the new correlated state. However, this will require a fair amount of preparatory work. One could trap the electron-hole pairs, for instance in a (Fabry-Pérot) resonator. That is very demanding, as neutral particles cannot be directly controlled, for example using electric fields. The fact that the state is electrically neutral might, on the other hand, turn out to be an advantage: it could be exploited to make quantum memories less susceptible to electric field noise.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH) is a public research university in the city of Zürich, Switzerland. Founded by the Swiss Federal Government in 1854 with the stated mission to educate engineers and scientists, the school focuses exclusively on science, technology, engineering and mathematics. Like its sister institution Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne](CH), it is part of the Swiss Federal Institutes of Technology Domain (ETH Domain), part of the Swiss Federal Department of Economic Affairs, Education and Research [EAER][Eidgenössisches Departement für Wirtschaft, Bildung und Forschung] [Département fédéral de l’économie, de la formation et de la recherche] (CH).

    The university is an attractive destination for international students thanks to low tuition fees of 809 CHF per semester, PhD and graduate salaries that are amongst the world’s highest, and a world-class reputation in academia and industry. There are currently 22,200 students from over 120 countries, of which 4,180 are pursuing doctoral degrees. In the 2021 edition of the QS World University Rankings, ETH Zürich is ranked 6th in the world and 8th by the Times Higher Education World Rankings 2020. In the 2020 QS World University Rankings by subject it is ranked 4th in the world for engineering and technology (2nd in Europe) and 1st for earth & marine science.

    As of November 2019, 21 Nobel laureates, 2 Fields Medalists, 2 Pritzker Prize winners, and 1 Turing Award winner have been affiliated with the Institute, including Albert Einstein. Other notable alumni include John von Neumann and Santiago Calatrava. It is a founding member of the IDEA League and the International Alliance of Research Universities (IARU) and a member of the CESAER network.

    ETH Zürich was founded on 7 February 1854 by the Swiss Confederation and began giving its first lectures on 16 October 1855 as a polytechnic institute (eidgenössische polytechnische Schule) at various sites throughout the city of Zurich. It was initially composed of six faculties: architecture, civil engineering, mechanical engineering, chemistry, forestry, and an integrated department for the fields of mathematics, natural sciences, literature, and social and political sciences.

    It is locally still known as Polytechnikum, or simply as Poly, derived from the original name eidgenössische polytechnische Schule, which translates to “federal polytechnic school”.

    ETH Zürich is a federal institute (i.e., under direct administration by the Swiss government), whereas the University of Zürich [Universität Zürich ] (CH) is a cantonal institution. The decision for a new federal university was heavily disputed at the time; the liberals pressed for a “federal university”, while the conservative forces wanted all universities to remain under cantonal control, worried that the liberals would gain more political power than they already had. In the beginning, both universities were co-located in the buildings of the University of Zürich.

    From 1905 to 1908, under the presidency of Jérôme Franel, the course program of ETH Zürich was restructured to that of a real university and ETH Zürich was granted the right to award doctorates. In 1909 the first doctorates were awarded. In 1911, it was given its current name, Eidgenössische Technische Hochschule. In 1924, another reorganization structured the university in 12 departments. However, it now has 16 departments.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form the Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    Reputation and ranking

    ETH Zürich is ranked among the top universities in the world. Typically, popular rankings place the institution as the best university in continental Europe, and it is consistently ranked among the top 1–5 universities in Europe and the top 3–10 in the world.

    Historically, ETH Zürich has achieved its reputation particularly in the fields of chemistry, mathematics and physics. There are 32 Nobel laureates who are associated with ETH Zürich, the most recent of whom is Richard F. Heck, awarded the Nobel Prize in chemistry in 2010. Albert Einstein is perhaps its most famous alumnus.

    In 2018, the QS World University Rankings placed ETH Zürich at 7th overall in the world. In 2015, ETH Zürich was ranked 5th in the world in Engineering, Science and Technology, just behind the Massachusetts Institute of Technology(US), Stanford University(US) and University of Cambridge(UK). In 2015, ETH Zürich also ranked 6th in the world in Natural Sciences, and in 2016 ranked 1st in the world for Earth & Marine Sciences for the second consecutive year.

    In 2016, Times Higher Education World University Rankings ranked ETH Zürich 9th overall in the world and 8th in the world in the field of Engineering & Technology, just behind the Massachusetts Institute of Technology(US), Stanford University(US), California Institute of Technology(US), Princeton University(US), University of Cambridge(UK), Imperial College London(UK) and University of Oxford(UK).

    In a comparison of Swiss universities by swissUP Ranking and in rankings published by CHE comparing the universities of German-speaking countries, ETH Zürich traditionally is ranked first in natural sciences, computer science and engineering sciences.

    In the survey CHE ExcellenceRanking on the quality of Western European graduate school programs in the fields of biology, chemistry, physics and mathematics, ETH Zürich was assessed as one of the three institutions to have excellent programs in all the considered fields, the other two being Imperial College London(UK) and the University of Cambridge(UK).

     
  • richardmitnick 10:02 am on September 10, 2021 Permalink | Reply
    Tags: "Anticipating Climate Impacts of Major Volcanic Eruptions", Applied Research & Technology, , , , , In the event of future eruptions on par with or larger than those at El Chichón and Pinatubo, Major volcanic eruptions inject large amounts of gases; aerosols; and particulates into the atmosphere., Meanwhile NASA’s Aerosol Robotic Network (AERONET); Micro-Pulse Lidar Network (MPLNET); and Southern Hemisphere Additional Ozonesondes (SHADOZ) would provide real-time observations from the ground., NASA recently developed a volcanic eruption response plan to maximize the quantity and quality of observations it makes following eruptions., National Aeronautics Space Agency (US)’s rapid response plan for gathering atmospheric data amid major volcanic eruptions paired with efforts to improve eruption simulations will offer better views , Rapid mobilization of NASA’s observational and research assets will permit scientists to make early initial estimates of potential impacts., Rapid responses to major volcanic eruptions enable scientists to make timely initial estimates of potential climate impacts to assist responders in implementing mitigation efforts., The threshold amount of volcanic SO2 emissions required to produce measurable climate impacts is not known exactly.,   

    From Eos: “Anticipating Climate Impacts of Major Volcanic Eruptions” 

    From AGU
    Eos news bloc

    From Eos

    31 August 2021

    Simon A. Carn
    scarn@mtu.edu
    Paul A. Newman
    Valentina Aquila
    Helge Gonnermann
    Josef Dufek

    National Aeronautics Space Agency (US)’s rapid response plan for gathering atmospheric data amid major volcanic eruptions, paired with efforts to improve eruption simulations, will offer better views of these events’ global effects.

    A thick cloud of volcanic ash and aerosols rises into the atmosphere above the north Pacific Ocean on 22 June 2019. An astronaut aboard the International Space Station captured this image of the plume during an eruption of Raikoke Volcano in the Kuril Islands. Credit: ISS Crew Earth Observations Facility and the Earth Science and Remote Sensing Unit, Johnson Space Center.

    This year marks the 30th anniversary of the most recent volcanic eruption that had a measurable effect on global climate. In addition to devastating much of the surrounding landscape and driving thousands of people to flee the area, the June 1991 eruption at Mount Pinatubo in the Philippines sent towering plumes of gas, ash, and particulates high into the atmosphere—materials that ultimately reduced average global surface temperatures by up to about 0.5°C in 1991–1993. It has also been more than 40 years since the last major explosive eruption in the conterminous United States, at Mount St. Helens in Washington in May 1980. As the institutional memory of these infrequent, but high-impact, events fades in this country and new generations of scientists assume responsibility for volcanic eruption responses, the geophysical community must remain prepared for coming eruptions, regardless of these events’ locations.

    Rapid responses to major volcanic eruptions enable scientists to make timely initial estimates of potential climate impacts (i.e., long-term effects) to assist responders in implementing mitigation efforts, including preparing for weather and climate effects in the few years following an eruption. These events also present critical opportunities to advance volcano science [National Academies of Sciences, Engineering, and Medicine (NASEM), 2017], and observations of large events with the potential to affect climate and life globally are particularly valuable.

    Recognizing this value, NASA recently developed a volcanic eruption response plan to maximize the quantity and quality of observations it makes following eruptions [NASA, 2018*], and it is facilitating continuing research into the drivers and behaviors of volcanic eruptions to further improve scientific eruption response efforts.

    *See References below

    How Volcanic Eruptions Affect Climate

    Major volcanic eruptions inject large amounts of gases, aerosols, and particulates into the atmosphere. Timely quantification of these emissions, shortly after an eruption and as they disperse, is needed to assess their potential climate effects. Scientists have a reasonable understanding of the fundamentals of how explosive volcanic eruptions influence climate and stratospheric ozone. This understanding is based on a few well-studied events in the satellite remote sensing era (e.g., Pinatubo) and on proxy records of older eruptions such as the 1815 eruption of Tambora in Indonesia [Robock, 2000]. However, the specific effects of eruptions depend on their magnitude, location, and the particular mix of materials ejected.

    To affect global climate, an eruption must inject large quantities of sulfur dioxide (SO2) or other sulfur species (e.g., hydrogen sulfide, H2S) into the stratosphere, where they are converted to sulfuric acid (or sulfate) aerosols over weeks to months (Figure 1). The sulfate aerosols linger in the stratosphere for a few years, reflecting some incoming solar radiation and thus reducing global average surface temperatures by as much as about 0.5°C for 1–3 years, after which temperatures recover to preeruption levels.

    Fig. 1. In the top plot, the black curve represents monthly global mean stratospheric aerosol optical depth (AOD; background is 0.004 or below) for green light (525 nanometers) from 1979 to 2018 from the Global Space-based Stratospheric Aerosol Climatology (GloSSAC) [Kovilakam et al., 2020; Thomason et al., 2018]. AOD is a measure of aerosol abundance in the atmosphere. Red dots represent annual sulfur dioxide (SO2) emissions in teragrams (Tg) from explosive volcanic eruptions as determined from satellite measurements [Carn, 2021]. The dashed horizontal line indicates the 5-Tg SO2 emission threshold for a NASA eruption response. Vertical gray bars indicate notable volcanic eruptions and their SO2 emissions. From left to right, He = 1980 Mount St. Helens (United States), Ul = 1980 Ulawun (Papua New Guinea (PNG)), Pa = 1981 Pagan (Commonwealth of the Northern Mariana Islands), El = 1982 El Chichón (Mexico), Co = 1983 Colo (Indonesia), Ne = 1985 Nevado del Ruiz (Colombia), Ba = 1988 Banda Api (Indonesia), Ke = 1990 Kelut (Indonesia), Pi = 1991 Mount Pinatubo (Philippines), Ce = 1991 Cerro Hudson (Chile), Ra = 1994 Rabaul (PNG), Ru = 2001 Ulawun, 2002 Ruang (Indonesia), Re = 2002 Reventador (Ecuador), Ma = 2005 Manam (PNG), So = 2006 Soufriere Hills (Montserrat), Ra = 2006 Rabaul (PNG), Ka = 2008 Kasatochi (USA), Sa = 2009 Sarychev Peak (Russia), Me = 2010 Merapi (Indonesia), Na = 2011 Nabro (Eritrea), Ke = 2014 Kelut (Indonesia), Ca = 2015 Calbuco (Chile), Am = 2018 Ambae (Vanuatu). In the bottom plot, circles indicate satellite-measured SO2 emissions (symbol size denotes SO2 mass) and estimated plume altitudes (symbol color denotes altitude) for volcanic eruptions since October 1978 [Carn, 2021].

    Although this direct radiative effect cools the surface, the aerosol particles also promote warming in the stratosphere by absorbing outgoing longwave radiation emitted from Earth’s surface as well as some solar radiation, which affects atmospheric temperature gradients and thus circulation (an indirect advective effect). This absorption of longwave radiation also promotes chemical reactions on the aerosol particles that drive stratospheric ozone depletion [Kremser et al., 2016], which reduces absorption of ultraviolet (UV) radiation and further influences atmospheric circulation. The interplay of aerosol radiative and advective effects, which both influence surface temperatures, leads to regional and seasonal variations in surface cooling and warming. For example, because advective effects tend to dominate in winter in the northern midlatitudes, winter warming of Northern Hemisphere continents—lasting about 2 years—is expected after major tropical eruptions [Shindell et al., 2004].

    Eruptions from tropical volcanoes like Pinatubo typically generate more extensive stratospheric aerosol veils because material injected into the tropical stratosphere can spread into both hemispheres. However, major high-latitude eruptions can also have significant climate impacts depending on their season and the altitude that their eruption plumes reach [Toohey et al., 2019].

    The effects of volcanic ash particles are usually neglected in climate models because the particles have shorter atmospheric lifetimes than sulfate aerosols, although recent work has suggested that persistent fine ash may influence stratospheric sulfur chemistry [Zhu et al., 2020]. This finding provides further motivation for timely sampling of volcanic eruption clouds.

    The threshold amount of volcanic SO2 emissions required to produce measurable climate impacts is not known exactly. On the basis of prior eruptions, NASA considers that an injection of roughly 5 teragrams (5 million metric tons) of SO2 or more into the stratosphere has sufficient potential for climate forcing of –1 Watt per square meter (that is, 1 Watt per square meter less energy is put into Earth’s climate system as a result of the stratospheric aerosols produced from the SO2) and warrants application of substantial observational assets.
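    To make the benchmark concrete, here is a toy calculation anchored to the 5-teragram, –1 watt per square meter figure above. It assumes forcing scales linearly with injected SO2 mass, which is an illustrative simplification only; actual forcing depends on aerosol size distribution, injection altitude, and dispersion:

```python
def estimated_forcing_w_m2(so2_tg):
    """Toy estimate of climate forcing (W/m^2) from stratospheric SO2 mass (Tg),
    scaled linearly from the ~5 Tg -> ~ -1 W/m^2 benchmark. Illustrative only."""
    THRESHOLD_TG = 5.0
    FORCING_AT_THRESHOLD_W_M2 = -1.0
    return FORCING_AT_THRESHOLD_W_M2 * (so2_tg / THRESHOLD_TG)

# Pinatubo is often cited as injecting roughly 15-20 Tg of stratospheric SO2:
print(estimated_forcing_w_m2(18.0))  # about -3.6 W/m^2 under this crude rule
```

    Even this crude rule lands in the right range of a few watts per square meter of peak forcing for a Pinatubo-scale event, but any real assessment requires the aerosol observations and modeling that the response plan is designed to provide.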

    Since the dawn of the satellite era for eruption observations in 1978, this threshold has been surpassed by only two eruptions: at El Chichón (Mexico) in 1982 and Pinatubo in 1991 (Figure 1), which reached 5 and 6, respectively, on the volcanic explosivity index (VEI; a logarithmic scale of eruption size from 0 to 8). Since Pinatubo, the observational tools that NASA employs have greatly improved.
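    The VEI’s logarithmic character can be sketched numerically. The helper below assumes the conventional tephra-volume thresholds (VEI 4 at 0.1 cubic kilometers, VEI 5 at 1, VEI 6 at 10, each step a tenfold increase); the real index also weighs factors such as plume height and eruption duration, so this is a rough guide only, most meaningful for large explosive eruptions:

```python
import math

def rough_vei(tephra_volume_km3):
    """Rough VEI from bulk tephra volume (km^3), using the conventional
    tenfold volume thresholds; ignores plume height and duration."""
    vei = math.floor(math.log10(tephra_volume_km3)) + 5
    return max(0, min(8, vei))  # clamp to the index's 0-8 range

print(rough_vei(1.1))   # El Chichon-scale volume (~1 km^3) gives VEI 5
print(rough_vei(10.5))  # Pinatubo-scale volume (~10 km^3) gives VEI 6
```

    Because each VEI step is a factor of ten in erupted volume, a VEI 6 event like Pinatubo ejects roughly ten times more material than a VEI 5 event like El Chichón.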

    In the event of future eruptions on par with or larger than those at El Chichón and Pinatubo, rapid mobilization of NASA’s observational and research assets, including satellites, balloons, ground-based instruments, aircraft, and modeling capabilities, will permit scientists to make early initial estimates of potential impacts. Capturing the transient effects of volcanic aerosols on climate would also provide critical data to inform proposed solar geoengineering strategies that involve introducing aerosols into the atmosphere to mitigate global warming [NASEM, 2021].

    NASA’s Eruption Response Plan

    In the United States, NASA has traditionally led investigations of eruptions involving stratospheric injection because of the agency’s global satellite-based observation capabilities for measuring atmospheric composition and chemistry and its unique suborbital assets for measuring the evolution of volcanic clouds in the stratosphere.

    Under its current plan, NASA’s eruption response procedures will be triggered in the event an eruption emits at least 5 teragrams of SO2 into the stratosphere, as estimated using NASA’s or other satellite assets [e.g., Carn et al., 2016]. The first phase of the response plan involves a review of near-real-time satellite data by a combined panel of NASA Headquarters (HQ) science program managers and NASA research scientists in parallel with initial modeling of the eruption plume’s potential atmospheric evolution and impacts.

    The HQ review identifies relevant measurement and modeling capabilities at the various NASA centers and among existing NASA-funded activities. HQ personnel would establish and task science leads and teams comprising relevant experts from inside and outside NASA to take responsibility for observations from the ground, from balloons, and from aircraft. The efforts of these three groups would be supplemented by satellite observations and modeling to develop key questions, priority observations, and sampling and deployment plans.

    Implementing the plan developed in this phase would likely result in major diversions and re-tasking of assets, such as NASA aircraft involved in meteorological monitoring, from ongoing NASA research activities and field deployments. Ensuring that these diversions are warranted necessitates that this review process be thorough and that tasking assignments be carefully considered.

    The second phase of NASA’s volcanic response plan—starting between 1 week and 1 month after the eruption—involves the application of its satellite platforms, ground observations from operational networks, and eruption cloud modeling. Satellites would track volcanic clouds to observe levels of SO2 and other aerosols and materials. Gathering early information on volcanic aerosol properties like density, particle composition, and particle size distribution would provide key information for assessing in greater detail the potential evolution and effects of the volcanic aerosols. Such assessments could provide valuable information on the amount of expected surface cooling attributable to these aerosols, as well as the lifetime of stratospheric aerosol particles—two factors that depend strongly on the aerosols’ size distribution and temporal evolution.

    Meanwhile, NASA’s Aerosol Robotic Network (AERONET), Micro-Pulse Lidar Network (MPLNET), and Southern Hemisphere Additional Ozonesondes (SHADOZ) would provide real-time observations from the ground. Eruption cloud modeling would be used to calculate cloud trajectories and dispersion to optimize selection of ground stations for balloon launches and re-tasking of airborne assets.

    The third phase of the response plan—starting 1–3 months after an eruption—would see the deployment of rapid response balloons and aircraft (e.g., from NASA’s Airborne Science Program). The NASA P-3 Orion, Gulfstream V, and DC-8 aircraft have ranges of more than 7,000 kilometers and can carry heavy instrumentation payloads of more than 2,500 kilograms to sample the middle to upper troposphere. A mix of in situ and remote sensing instruments would be employed to collect detailed observations of eruption plume structure, evolution, and optical properties.

    NASA’s high-altitude aircraft (ER-2 and WB-57f) provide coverage into the stratosphere (above about 18 kilometers) with payloads of more than 2,200 kilograms. These high-altitude planes would carry payloads for measuring the evolving aerosol distributions along with trace gas measurements in situ to further understand the response of stratospheric ozone and climate forcing to the eruption. In particular, the high-altitude observations would include data on the particle composition and size distribution of aerosols, as well as on ozone, SO2, nitrous oxide and other stratospheric tracers, water vapor, and free radical species. Instrumented balloons capable of reaching the stratosphere could also be rapidly deployed to remote locations to supplement these data in areas not reached by the aircraft.

    The third phase would be staged as several 2- to 6-week deployments over a 1- to 2-year period that would document the seasonal evolution, latitudinal dispersion, and multiyear dissipation of the plume from the stratosphere. These longer-term observations would help to constrain model simulations of the eruption’s impacts on the global atmosphere and climate.

    Enhancing Eruption Response

    An effective eruption response is contingent on timely recognition of the hallmarks of a major volcanic eruption, namely, stratospheric injection and substantial emissions of SO2 (and H2S) amounting to more than 5 teragrams, using satellite data. However, it may take several hours to a day after an event for satellites to confirm that emissions have reached this level. By then, time has been lost to position instruments and personnel to effectively sample the earliest stages of an eruption, and it is already too late to observe the onset of the eruption.

    Hence, a key element in efforts to strengthen eruption responses is improving our recognition of distinctive geophysical or geochemical eruption precursors that may herald a high-magnitude event. Observations of large, sulfur-rich eruptions such as Pinatubo have led to scientific consensus that such eruptions emit “excess” volatiles—gas emissions (especially sulfur species, but also other gases such as water vapor and carbon dioxide) exceeding those that could be derived from the erupted magma alone. Excess volatiles, in the form of gas bubbles derived from within or below a magma reservoir that then accumulate near the top of the reservoir, may exacerbate climate impacts of eruptions and influence magmatic processes like magma differentiation, eruption triggering and magnitude, and hydrothermal ore deposition [e.g., Edmonds and Woods, 2018]. They may also produce detectable eruption precursors and influence eruption and plume dynamics, although how remains largely unknown.

    With support from NASA’s Interdisciplinary Research in Earth Science program, we (the authors) have begun an integrated investigation of eruption dynamics focused on understanding the fate of excess volatiles from their origins in a magma reservoir, through underground conduits and into a volcanic plume, and, subsequently, as they are dispersed in the atmosphere. The satellite observations we use are the same or similar to those required for rapid assessment and response to future high-magnitude events (with a VEI of 6 or greater).

    Our investigation is using data from previous moderate-scale eruptions (VEI of 3–5) with excellent satellite observational records that captured instances in which gases and aerosols displayed disparate atmospheric dispersion patterns. Among the main questions we are examining is whether excess volatile accumulation in magma reservoirs can drive large eruptions and produce enhanced aerosol-related climate impacts resulting from these eruptions. Using numerical model simulations of eruptions involving variable quantities of excess volatiles, we will endeavor to reproduce the specific atmospheric distributions of gases and aerosols observed by satellites after these events and thus elucidate how volatile accumulation might influence plume dispersion and climate impacts.

    We are currently developing a framework to simulate a future eruption with a VEI of 6+. Over the coming year, we hope to produce benchmark simulations that track the fate of volcanic gases as they travel from a subsurface magmatic system into the atmosphere to be distributed globally. This simulation framework will comprise a coupled suite of subsystem-scale numerical models, including models of magma withdrawal from the magma reservoir, magma ascent within the volcanic conduit, stratospheric injection within the volcanic plume, and atmospheric dispersion and effects on climate.

    With these tools, NASA will have gained important capabilities in simulating volcanic eruptions and understanding their potential precursors. These capabilities will complement NASA’s satellite and suborbital observations of volcanic eruptions as they unfold—an important advance for volcano science and a powerful means to assess the climate impacts of future large explosive eruptions.

    References:

    Carn, S. A. (2021), Multi-Satellite Volcanic Sulfur Dioxide L4 Long-Term Global Database V4, Goddard Earth Sci. Data and Inf. Serv. Cent., Greenbelt, Md., USA, https://doi.org/10.5067/MEASURES/SO2/DATA405.

    Carn, S. A., L. Clarisse, and A. J. Prata (2016), Multi-decadal satellite measurements of global volcanic degassing, J. Volcanol. Geotherm. Res., 311, 99–134, https://doi.org/10.1016/j.jvolgeores.2016.01.002.

    Edmonds, M., and A. W. Woods (2018), Exsolved volatiles in magma reservoirs, J. Volcanol. Geotherm. Res., 368, 13–30, https://doi.org/10.1016/j.jvolgeores.2018.10.018.

    Kovilakam, M., et al. (2020), The Global Space-based Stratospheric Aerosol Climatology (version 2.0): 1979–2018, Earth Syst. Sci. Data, 12(4), 2,607–2,634, https://doi.org/10.5194/essd-12-2607-2020.

    Kremser, S., et al. (2016), Stratospheric aerosol—Observations, processes, and impact on climate, Rev. Geophys., 54(2), 278–335, https://doi.org/10.1002/2015RG000511.

    NASA (2018), NASA Major Volcanic Eruption Response Plan, version 11, Greenbelt, Md., acd-ext.gsfc.nasa.gov/Documents/NASA_reports/Docs/VolcanoWorkshopReport_v12.pdf.

    National Academies of Sciences, Engineering, and Medicine (NASEM) (2017), Volcanic Eruptions and Their Repose, Unrest, Precursors, and Timing, Natl. Acad. Press, Washington, D.C., https://doi.org/10.17226/24650.

    National Academies of Sciences, Engineering, and Medicine (NASEM) (2021), Reflecting Sunlight: Recommendations for Solar Geoengineering Research and Research Governance, Natl. Acad. Press, Washington, D.C., https://doi.org/10.17226/25762.

    Robock, A. (2000), Volcanic eruptions and climate, Rev. Geophys., 38(2), 191–219, https://doi.org/10.1029/1998RG000054.

    Shindell, D. T., et al. (2004), Dynamic winter climate response to large tropical volcanic eruptions since 1600, J. Geophys. Res., 109, D05104, https://doi.org/10.1029/2003JD004151.

    Thomason, L. W., et al. (2018), A global space-based stratospheric aerosol climatology: 1979–2016, Earth Syst. Sci. Data, 10(1), 469–492, https://doi.org/10.5194/essd-10-469-2018.

    Toohey, M., et al. (2019), Disproportionately strong climate forcing from extratropical explosive volcanic eruptions, Nat. Geosci., 12(2), 100–107, https://doi.org/10.1038/s41561-018-0286-2.

    Zhu, Y., et al. (2020), Persisting volcanic ash particles impact stratospheric SO2 lifetime and aerosol optical properties, Nat. Commun., 11, 4526, https://doi.org/10.1038/s41467-020-18352-5.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 9:18 am on September 10, 2021 Permalink | Reply
    Tags: "New Imaging Reveals Hidden Ice Age Landscapes Buried Deep in The North Sea", "Tunnel valleys", Applied Research & Technology, , Enormous gouges carved by subglacial rivers buried hundreds of meters below the floor of the North Sea., , Important markers of "deglaciation"., In the way that we can leave footprints in the sand glaciers leave an imprint on the land upon which they flow., Reflection seismology,   

    From Science Alert (US) : “New Imaging Reveals Hidden Ice Age Landscapes Buried Deep in The North Sea” 


    10 SEPTEMBER 2021
    MICHELLE STARR

    One of the tunnel valleys revealed by the seismic data. (James Kirkham).

    The hidden scars left on the landscape during ice ages thousands to millions of years ago have now been imaged in spectacular detail.

    Using a technique called reflection seismology, a team of scientists has imaged enormous gouges carved by subglacial rivers buried hundreds of meters below the floor of the North Sea. Called “tunnel valleys”, these features can help us understand how frozen landscapes change in response to a warming climate.

    “The origin of these channels was unresolved for over a century. This discovery will help us better understand the ongoing retreat of present-day glaciers in Antarctica and Greenland,” said geophysicist James Kirkham of the British Antarctic Survey.

    “In the way that we can leave footprints in the sand, glaciers leave an imprint on the land upon which they flow. Our new cutting-edge data gives us important markers of deglaciation.”

    A map reveals the location of channels buried beneath the North Sea with an overlay showing the ice sheet limits 21,000 years ago. (James Kirkham).

    Reflection seismology, as the name suggests, relies on vibrations propagating underground to build a density profile down to significant depths. It’s a little like how we can use earthquakes to map the density of the interior of our entire planet, but targeted and on smaller scales.

    In this case, air-gun clusters were towed over a section of the North Sea. As the sound waves from these clusters propagated, hydrophones picked up the reflections bouncing off structures of different densities below the seafloor.

    Researchers then cleaned up and analyzed the high-resolution 3D data to build a layered map of the ancient landscape.
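The depth conversion at the heart of this processing can be sketched simply: if the average seismic velocity of the sediment is known, the two-way travel time of a reflection gives the reflector's depth. (The velocity and travel time below are illustrative assumptions, not values from the study.)

```python
def reflector_depth(two_way_time_s: float, velocity_m_s: float) -> float:
    """Depth of a reflecting interface from two-way travel time.

    The wave travels down to the reflector and back up,
    so depth = velocity * time / 2.
    """
    return velocity_m_s * two_way_time_s / 2.0

# Illustrative numbers: ~1,800 m/s for water-saturated sediment and
# 0.3 s of two-way travel time put a reflector roughly 270 m down,
# comparable to the burial depths quoted for the tunnel valleys.
depth = reflector_depth(0.3, 1800.0)
print(f"reflector depth: {depth:.0f} m")  # prints: reflector depth: 270 m
```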

    Even buried beneath up to 300 meters (984 feet) of sediment, the equipment can capture features as small as 4 meters. This makes the data the most detailed obtained to date on the tunnel valleys below the North Sea.

    The data revealed 19 cross-cutting channels between 300 and 3,000 meters wide, with undulating thalwegs (the lines tracing the deepest points along their floors). Based on the morphology of these channels, the researchers interpreted them as tunnel valleys formed by meltwater draining away underneath ancient ice sheets.


    Ancient sub-ocean landscapes give clues to future ice sheet change – James Kirkham.

    Because of the high level of detail, these images reveal how the ice sheets interacted with the channels as they formed. Since the ice sheets found at Earth’s poles today are currently melting in response to a warming climate, a better understanding of this process can help us figure out what is going to happen to Greenland and Antarctica in the future.

    “Although we have known about the huge glacial channels in the North Sea for some time, this is the first time we have imaged fine-scale landforms within them,” said geophysicist Kelly Hogan of the British Antarctic Survey.

    A comparison of the tunnel valleys with current-day glacial features. (James Kirkham).

    “These delicate features tell us about how water moved through the channels (beneath the ice) and even how ice simply stagnated and melted away. It is very difficult to observe what goes on underneath our large ice sheets today, particularly how moving water and sediment is affecting ice flow and we know that these are important controls on ice behaviour,” Hogan added.

    “As a result, using these ancient channels to understand how ice will respond to changing conditions in a warming climate is extremely relevant and timely.”

    Future research, the team said, should involve shallow drilling to place better chronological constraints on the tunnel valleys, as well as the collection of a broader swath of seismic data.

    This more granular detail will enable us to better model the hydrological systems of ancient ice sheets, and apply that knowledge to our current situation.

    The research has been published in Geology.

    See the full article here.




     
  • richardmitnick 3:16 pm on September 9, 2021 Permalink | Reply
    Tags: "Groundbreaking Technique Yields Important New Details on Silicon; Subatomic Particles; and Possible ‘Fifth Force’", Applied Research & Technology, , Bragg planes, Finding of “Pendellösung” oscillations, , The scientists uncovered new information about an important subatomic particle and a long-theorized fifth force of nature.   

    From National Institute of Standards and Technology (US) : “Groundbreaking Technique Yields Important New Details on Silicon; Subatomic Particles; and Possible ‘Fifth Force’” 

    September 09, 2021

    Media Contact:

    Ben P. Stein
    benjamin.stein@nist.gov
    (301) 975-2763

    Technical Contacts:

    Benjamin Heacock
    benjamin.heacock@nist.gov

    (301) 975-6218
    Michael G. Huber
    michael.huber@nist.gov
    (301) 975-5641

    As neutrons pass through a crystal, they create two different standing waves – one along atomic planes and one between them. The interaction of these waves affects the path of the neutron, revealing aspects of the crystal structure. Credit: NIST.

    Using a groundbreaking new technique at the National Institute of Standards and Technology (NIST), an international collaboration led by NIST researchers has revealed previously unrecognized properties of technologically crucial silicon crystals and uncovered new information about an important subatomic particle and a long-theorized fifth force of nature.

    By aiming subatomic particles known as neutrons at silicon crystals and monitoring the outcome with exquisite sensitivity, the NIST scientists were able to obtain three extraordinary results: the first measurement of a key neutron property in 20 years using a unique method; the highest-precision measurements of the effects of heat-related vibrations in a silicon crystal; and limits on the strength of a possible “fifth force” beyond standard physics theories.

    The researchers report their findings in the journal Science.

    In a regular crystal such as silicon, there are many parallel sheets of atoms, each of which forms a plane. Probing different planes with neutrons reveals different aspects of the crystal. Credit: NIST.

    To obtain information about crystalline materials at the atomic scale, scientists typically aim a beam of particles (such as X-rays, electrons or neutrons) at the crystal and detect the beam’s angles, intensities and patterns as it passes through or ricochets off planes in the crystal’s lattice-like atomic geometry.
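The angles involved follow Bragg's law, nλ = 2d sin θ: reflections from successive atomic planes reinforce when their path difference is a whole number of wavelengths. A quick sketch (the wavelength is a generic thermal-neutron value and the plane spacing is the standard silicon (220) figure, not numbers taken from this experiment):

```python
import math

def bragg_angle_deg(wavelength_nm: float, d_spacing_nm: float, order: int = 1) -> float:
    """Bragg angle theta satisfying n*lambda = 2*d*sin(theta)."""
    s = order * wavelength_nm / (2.0 * d_spacing_nm)
    if s > 1.0:
        raise ValueError("no diffraction: n*lambda exceeds 2d")
    return math.degrees(math.asin(s))

# Illustrative: a ~0.2 nm thermal-neutron wavelength against the
# silicon (220) plane spacing of ~0.192 nm.
theta = bragg_angle_deg(0.2, 0.192)
print(f"Bragg angle: {theta:.1f} degrees")  # prints: Bragg angle: 31.4 degrees
```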

    That information is critically important for characterizing the electronic, mechanical and magnetic properties of microchip components and various novel nanomaterials for next-generation applications including quantum computing. A great deal is known already, but continued progress requires increasingly detailed knowledge.

    “A vastly improved understanding of the crystal structure of silicon, the ‘universal’ substrate or foundation material on which everything is built, will be crucial in understanding the nature of components operating near the point at which the accuracy of measurements is limited by quantum effects,” said NIST senior project scientist Michael Huber.

    Neutrons, Atoms and Angles

    Like all quantum objects, neutrons have both point-like particle and wave properties. As a neutron travels through the crystal, it forms standing waves (like a plucked guitar string) both in between and on top of rows or sheets of atoms called Bragg planes. When waves from each of the two routes combine, or “interfere” in the parlance of physics, they create faint patterns called pendellösung oscillations that provide insights into the forces that neutrons experience inside the crystal.

    “Imagine two identical guitars,” said Huber. “Pluck them the same way, and as the strings vibrate, drive one down a road with speed bumps — that is, along the planes of atoms in the lattice — and drive the other down a road of the same length without the speed bumps — analogous to moving between the lattice planes. Comparing the sounds from both guitars tells us something about the speed bumps: how big they are, how smooth, and do they have interesting shapes?”
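In the two-beam picture behind the guitar analogy, intensity sloshes back and forth between the forward and diffracted beams as the neutron crosses the crystal, with a characteristic period called the pendellösung length. A minimal sketch of that oscillation, using an arbitrary period rather than the measured one:

```python
import math

def diffracted_fraction(thickness: float, pendellosung_length: float) -> float:
    """Fraction of intensity in the diffracted beam after a crystal of the
    given thickness (idealized symmetric two-beam case)."""
    return math.sin(math.pi * thickness / pendellosung_length) ** 2

# Intensity transfers fully between the two beams and back again
# as the traversed thickness grows (thickness in units of the period):
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    frac = diffracted_fraction(t, 1.0)
    print(f"thickness {t:.2f} -> diffracted fraction {frac:.2f}")
```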

    The latest work, which was conducted at the NIST Center for Neutron Research (NCNR) in Gaithersburg, Maryland, in collaboration with researchers from Japan, the U.S. and Canada, resulted in a fourfold improvement in precision measurement of the silicon crystal structure.

    Each neutron in an atomic nucleus is made up of three elementary particles called quarks. The three quarks’ electrical charges sum to zero, making the neutron electrically neutral. But the distribution of those charges is such that positive charges are more likely to be found in the center of the neutron, and negative charges toward the outside. Credit: NIST.

    Not-Quite-Neutral Neutrons

    In one striking result, the scientists measured the electrical “charge radius” of the neutron in a new way with an uncertainty in the radius value competitive with the most-precise prior results using other methods. Neutrons are electrically neutral, as their name suggests. But they are composite objects made up of three elementary charged particles called quarks with different electrical properties that are not exactly uniformly distributed.

    As a result, predominantly negative charge from one kind of quark tends to be located toward the outer part of the neutron, whereas net positive charge is located toward the center. The distance between those two concentrations is the “charge radius.” That dimension, important to fundamental physics, has been measured by similar types of experiments whose results differ significantly. The new pendellösung data is unaffected by the factors thought to lead to these discrepancies.

    Measuring the pendellösung oscillations in an electrically charged environment provides a unique way to gauge the charge radius. “When the neutron is in the crystal, it is well within the atomic electric cloud,” said NIST’s Benjamin Heacock, the first author on the Science paper.

    “In there, because the distances between charges are so small, the interatomic electric fields are enormous, on the order of a hundred million volts per centimeter. Because of that very, very large field, our technique is sensitive to the fact that the neutron behaves like a spherical composite particle with a slightly positive core and a slightly negative surrounding shell.”

    Vibrations and Uncertainty

    A valuable alternative to neutrons is X-ray scattering. But its accuracy has been limited by atomic motion caused by heat. Thermal vibration causes the distances between crystal planes to keep changing, and thus changes the interference patterns being measured.

    The scientists employed neutron pendellösung oscillation measurements to test the values predicted by X-ray scattering models and found that some significantly underestimate the magnitude of the vibration.

    The results provide valuable complementary information for both X-ray and neutron scattering. “Neutrons interact almost entirely with the protons and neutrons at the centers, or nuclei, of the atoms,” Huber said, “and X-rays reveal how the electrons are arranged between the nuclei. This complementary knowledge deepens our understanding.

    “One reason our measurements are so sensitive is that neutrons penetrate much deeper into the crystal than X-rays – a centimeter or more – and thus measure a much larger assembly of nuclei. We have found evidence that the nuclei and electrons may not vibrate rigidly, as is commonly assumed. That shifts our understanding of how silicon atoms interact with one another inside a crystal lattice.”

    Force Five

    The Standard Model is the current, widely accepted theory of how particles and forces interact at the smallest scales. But it’s an incomplete explanation of how nature works, and scientists suspect there is more to the universe than the theory describes.

    The Standard Model describes three fundamental forces in nature: electromagnetic, strong and weak. Each force operates through the action of “carrier particles.” For example, the photon is the force carrier for the electromagnetic force. But the Standard Model has yet to incorporate gravity in its description of nature. Furthermore, some experiments and theories suggest the possible presence of a fifth force.

    “Generally, if there’s a force carrier, the length scale over which it acts is inversely proportional to its mass,” meaning it can only influence other particles over a limited range, Heacock said. But the photon, which has no mass, can act over an unlimited range. “So, if we can bracket the range over which it might act, we can limit its strength.” The scientists’ results improve constraints on the strength of a potential fifth force by tenfold over a length scale between 0.02 nanometers (nm, billionths of a meter) and 10 nm, giving fifth-force hunters a narrowed range over which to look.
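The inverse relation Heacock describes is the reduced Compton wavelength, λ = ħ/(mc): heavier carriers mean shorter-ranged forces. A back-of-the-envelope sketch of how the quoted 0.02 nm to 10 nm window maps onto carrier masses (an illustration of the scaling, not a calculation from the paper):

```python
# Reduced Compton wavelength lambda = hbar / (m * c): the heuristic
# range of a Yukawa-type force mediated by a carrier of mass m.
HBAR_C_EV_NM = 197.3269804  # hbar*c in eV*nm (CODATA value)

def carrier_range_nm(mass_ev: float) -> float:
    """Range (nm) of a force whose carrier has mass in eV/c^2."""
    return HBAR_C_EV_NM / mass_ev

def carrier_mass_ev(range_nm: float) -> float:
    """Carrier mass (eV/c^2) corresponding to a given range in nm."""
    return HBAR_C_EV_NM / range_nm

# The 0.02 nm - 10 nm length scales correspond to carrier masses of
# roughly 20 eV/c^2 up to about 10 keV/c^2:
print(f"10 nm   -> {carrier_mass_ev(10.0):.0f} eV/c^2")
print(f"0.02 nm -> {carrier_mass_ev(0.02):.0f} eV/c^2")
```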

    The researchers are already planning more expansive pendellösung measurements using both silicon and germanium. They expect a possible factor of five reduction in their measurement uncertainties, which could produce the most precise measurement of the neutron charge radius to date and further constrain — or discover — a fifth force. They also plan to perform a cryogenic version of the experiment, which would lend insight into how the crystal atoms behave in their so-called “quantum ground state,” which accounts for the fact that quantum objects are never perfectly still, even at temperatures approaching absolute zero.

    See the full article here.



    NIST Campus, Gaithersburg, MD, USA

    National Institute of Standards and Technology (US)‘s Mission, Vision, Core Competencies, and Core Values

    Mission

    To promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

    NIST’s vision

    NIST will be the world’s leader in creating critical measurement solutions and promoting equitable standards. Our efforts stimulate innovation, foster industrial competitiveness, and improve the quality of life.

    NIST’s core competencies

    Measurement science
    Rigorous traceability
    Development and use of standards

    NIST’s core values

    NIST is an organization with strong values, reflected both in our history and our current work. NIST leadership and staff will uphold these values to ensure a high performing environment that is safe and respectful of all.

    Perseverance: We take the long view, planning the future with scientific knowledge and imagination to ensure continued impact and relevance for our stakeholders.
    Integrity: We are ethical, honest, independent, and provide an objective perspective.
    Inclusivity: We work collaboratively to harness the diversity of people and ideas, both inside and outside of NIST, to attain the best solutions to multidisciplinary challenges.
    Excellence: We apply rigor and critical thinking to achieve world-class results and continuous improvement in everything we do.

    Background

    The Articles of Confederation, ratified by the colonies in 1781, contained the clause, “The United States in Congress assembled shall also have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States”. Article 1, section 8, of the Constitution of the United States (1789), transferred this power to Congress; “The Congress shall have power…To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures”.

    In January 1790, President George Washington, in his first annual message to Congress, stated that, “Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to”, and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress, “A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience”, but it was not until 1838 that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared “Weights and measures may be ranked among the necessities of life to every individual of human society”.

    From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, which was part of the U.S. Coast and Geodetic Survey in the Department of the Treasury.

    Bureau of Standards

    In 1901, in response to a bill proposed by Congressman James H. Southard (R-Ohio), the National Bureau of Standards was founded with the mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. (Southard had previously sponsored a bill for metric conversion of the United States.)

    President Theodore Roosevelt appointed Samuel W. Stratton as the first director. The budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures, and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington DC (US) and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures the Bureau developed instruments for electrical units and for measurement of light. In 1905 a meeting was called that would be the first National Conference on Weights and Measures.

    Initially conceived as purely a metrology agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use; but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing; automobile brake systems and headlamps; antifreeze; and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods; the proximity fuze and the standardized airframe used originally for Project Pigeon; and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.

    In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC: the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. About the same time, the Standards Western Automatic Computer was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954.

    Due to a changing mission, the “National Bureau of Standards” became the “National Institute of Standards and Technology (US)” in 1988.

    Following September 11, 2001, NIST conducted the official investigation into the collapse of the World Trade Center buildings.

    Organization

    NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado, which was dedicated by President Eisenhower in 1954. NIST’s activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST Laboratories include:

    Communications Technology Laboratory (CTL)
    Engineering Laboratory (EL)
    Information Technology Laboratory (ITL)
    Center for Neutron Research (NCNR)
    Material Measurement Laboratory (MML)
    Physical Measurement Laboratory (PML)

    Extramural programs include:

    Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers to assist small and mid-sized manufacturers to create and retain jobs, improve efficiencies, and minimize waste through process improvements and to increase market penetration with innovation and growth strategies;
    Technology Innovation Program (TIP), a grant program where NIST and industry partners cost share the early-stage development of innovative but high-risk technologies;
    Baldrige Performance Excellence Program, which administers the Malcolm Baldrige National Quality Award, the nation’s highest award for performance and business excellence.

    NIST’s Boulder laboratories are best known for NIST‑F1, a cesium fountain atomic clock that serves as the source of the nation’s official time. From its measurement of the natural resonance frequency of cesium—which defines the second—NIST broadcasts time signals via longwave radio station WWVB near Fort Collins in Colorado, and shortwave radio stations WWV and WWVH, located near Fort Collins and Kekaha in Hawai’i, respectively.

    NIST also operates a neutron science user facility: the NIST Center for Neutron Research (NCNR). The NCNR provides scientists access to a variety of neutron scattering instruments which they use in many research fields (materials science; fuel cells; biotechnology etc.).

    The SURF III Synchrotron Ultraviolet Radiation Facility is a source of synchrotron radiation in continuous operation since 1961. SURF III now serves as the US national standard for source-based radiometry throughout the generalized optical spectrum. All NASA-borne extreme-ultraviolet observation instruments have been calibrated at SURF since the 1970s, and SURF is used for measurement and characterization of systems for extreme ultraviolet lithography.

    The Center for Nanoscale Science and Technology (CNST) performs research in nanotechnology, both through internal research efforts and by running a user-accessible cleanroom nanomanufacturing facility. This “NanoFab” is equipped with tools for lithographic patterning and imaging (e.g., electron microscopes and atomic force microscopes).

    Committees

    NIST has seven standing committees:

    Technical Guidelines Development Committee (TGDC)
    Advisory Committee on Earthquake Hazards Reduction (ACEHR)
    National Construction Safety Team Advisory Committee (NCST Advisory Committee)
    Information Security and Privacy Advisory Board (ISPAB)
    Visiting Committee on Advanced Technology (VCAT)
    Board of Overseers for the Malcolm Baldrige National Quality Award (MBNQA Board of Overseers)
    Manufacturing Extension Partnership National Advisory Board (MEPNAB)

    Measurements and standards

    As part of its mission, NIST supplies industry, academia, government, and other users with over 1,300 Standard Reference Materials (SRMs). These artifacts are certified as having specific characteristics or component content, used as calibration standards for measuring equipment and procedures, quality control benchmarks for industrial processes, and experimental control samples.

    Handbook 44

    NIST publishes the Handbook 44 each year after the annual meeting of the National Conference on Weights and Measures (NCWM). Each edition is developed through cooperation of the Committee on Specifications and Tolerances of the NCWM and the Weights and Measures Division (WMD) of the NIST. The purpose of the book is a partial fulfillment of the statutory responsibility for “cooperation with the states in securing uniformity of weights and measures laws and methods of inspection”.

    NIST has been publishing various forms of what is now the Handbook 44 since 1918 and began publication under the current name in 1949. The 2010 edition conforms to the concept of the primary use of the SI (metric) measurements recommended by the Omnibus Foreign Trade and Competitiveness Act of 1988.

     
  • richardmitnick 2:05 pm on September 9, 2021 Permalink | Reply
    Tags: "A universal system for decoding any type of data sent across a network", "Noise encountered by data along the way such as electromagnetic interference from a microwave or Bluetooth device., A new silicon chip can decode any error-correcting code through the use of a novel algorithm known as Guessing Random Additive Noise Decoding (GRAND)., Applied Research & Technology, GRAND works by guessing the noise that affected the message and uses the noise pattern to deduce the original information., , The GRAND chip uses a three-tiered structure. Each stage operates independently which increases the throughput of the system and saves power., The researchers tested the GRAND chip and found it could effectively decode any moderate redundancy code up to 128 bits in length with only about a microsecond of latency., While the offending noise appears random in nature it has a probabilistic structure that allows the algorithm to guess what it might be.   

    From Massachusetts Institute of Technology (US) : “A universal system for decoding any type of data sent across a network” 


    September 9, 2021
    Adam Zewe

    A new silicon chip can decode any error-correcting code through the use of a novel algorithm known as Guessing Random Additive Noise Decoding (GRAND). Credits: Jose-Luis Olivares, MIT, with chip courtesy of the researchers.

    Every piece of data that travels over the internet — from paragraphs in an email to 3D graphics in a virtual reality environment — can be altered by the noise it encounters along the way, such as electromagnetic interference from a microwave or Bluetooth device. The data are coded so that when they arrive at their destination, a decoding algorithm can undo the negative effects of that noise and retrieve the original data.

    Since the 1950s, most error-correcting codes and decoding algorithms have been designed together. Each code had a structure that corresponded with a particular, highly complex decoding algorithm, which often required the use of dedicated hardware.

    Researchers at MIT, Boston University (US), and Maynooth National University of Ireland [Ollscoil na hÉireann Mhá Nuad](IE) have now created the first silicon chip that is able to decode any code, regardless of its structure, with maximum accuracy, using a universal decoding algorithm called Guessing Random Additive Noise Decoding (GRAND). By eliminating the need for multiple, computationally complex decoders, GRAND enables increased efficiency that could have applications in augmented and virtual reality, gaming, 5G networks, and connected devices that rely on processing a high volume of data with minimal delay.

    The research at MIT is led by Muriel Médard, the Cecil H. and Ida Green Professor in the Department of Electrical Engineering and Computer Science, and was co-authored by Amit Solomon and Wei Ann, both graduate students at MIT; Rabia Tugce Yazicigil, assistant professor of electrical and computer engineering at Boston University; Arslan Riaz and Vaibhav Bansal, both graduate students at Boston University; Ken R. Duffy, director of the Hamilton Institute at the National University of Ireland at Maynooth; and Kevin Galligan, a Maynooth graduate student. The research will be presented at the European Solid-State Device Research and Circuits Conference next week.

    Focus on noise

    One way to think of these codes is as redundant hashes (in this case, a series of 1s and 0s) added to the end of the original data. The rules for the creation of that hash are stored in a specific codebook.

    As the encoded data travel over a network, they are affected by noise, or energy that disrupts the signal, which is often generated by other electronic devices. When that coded data and the noise that affected them arrive at their destination, the decoding algorithm consults its codebook and uses the structure of the hash to guess what the stored information is.

    Instead, GRAND works by guessing the noise that affected the message and uses the noise pattern to deduce the original information. GRAND generates a series of noise sequences in the order they are likely to occur, subtracts them from the received data, and checks to see if the resulting codeword is in a codebook.

    While the noise appears random in nature, it has a probabilistic structure that allows the algorithm to guess what it might be.

    “In a way, it is similar to troubleshooting. If someone brings their car into the shop, the mechanic doesn’t start by mapping the entire car to blueprints. Instead, they start by asking, ‘What is the most likely thing to go wrong?’ Maybe it just needs gas. If that doesn’t work, what’s next? Maybe the battery is dead?” Médard says.
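Médard's troubleshooting analogy can be sketched as a toy decoder: try noise patterns lightest-first (the most likely first, for a channel that flips each bit independently with small probability), undo each guess, and stop at the first result that appears in the codebook. This is an illustrative sketch with a made-up two-word codebook, not the chip's implementation:

```python
from itertools import combinations

def grand_decode(received: int, codebook: set, nbits: int):
    """Guess noise patterns lightest-first; return the first codeword found."""
    for weight in range(nbits + 1):              # 0 flips, then 1, then 2, ...
        for flipped in combinations(range(nbits), weight):
            noise = 0
            for bit in flipped:
                noise |= 1 << bit
            candidate = received ^ noise         # undo the guessed noise
            if candidate in codebook:
                return candidate
    return None

# Toy 4-bit codebook (a repetition-style code): 0000 and 1111.
codebook = {0b0000, 0b1111}
# Transmit 1111; the channel flips one bit, so 1101 is received.
print(bin(grand_decode(0b1101, codebook, 4)))  # prints: 0b1111
```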

    Novel hardware

    The GRAND chip uses a three-tiered structure, starting with the simplest possible solutions in the first stage and working up to longer and more complex noise patterns in the two subsequent stages. Each stage operates independently, which increases the throughput of the system and saves power.

    The device is also designed to switch seamlessly between two codebooks. It contains two static random-access memory chips: one cracks codewords while the other loads a new codebook; the device then switches to decoding with the new codebook without any downtime.

    The researchers tested the GRAND chip and found it could effectively decode any moderate redundancy code up to 128 bits in length with only about a microsecond of latency.

    Médard and her collaborators had previously demonstrated the success of the algorithm, but this new work showcases the effectiveness and efficiency of GRAND in hardware for the first time.

    Developing hardware for the novel decoding algorithm required the researchers to first toss aside their preconceived notions, Médard says.

    “We couldn’t go out and reuse things that had already been done. This was like a complete whiteboard. We had to really think about every single component from scratch. It was a journey of reconsideration. And I think when we do our next chip, there will be things with this first chip that we’ll realize we did out of habit or assumption that we can do better,” she says.

    A chip for the future

    Since GRAND only uses codebooks for verification, the chip not only works with legacy codes but could also be used with codes that haven’t even been introduced yet.

    In the lead-up to 5G implementation, regulators and communications companies struggled to find consensus as to which codes should be used in the new network. Regulators ultimately chose to use two types of traditional codes for 5G infrastructure in different situations. Using GRAND could eliminate the need for that rigid standardization in the future, Médard says.

    The GRAND chip could even open the field of coding to a wave of innovation.

    “For reasons I’m not quite sure of, people approach coding with awe, like it is black magic. The process is mathematically nasty, so people just use codes that already exist. I’m hoping this will recast the discussion so it is not so standards-oriented, enabling people to use codes that already exist and create new codes,” she says.

    Moving forward, Médard and her collaborators plan to tackle the problem of soft detection with a retooled version of the GRAND chip. In soft detection, the received data are less precise.

    They also plan to test the ability of GRAND to crack longer, more complex codes and adjust the structure of the silicon chip to improve its energy efficiency.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    Massachusetts Institute of Technology (US) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory (US), the MIT Bates Research and Engineering Center (US), and the Haystack Observatory (US), as well as affiliated laboratories such as the Broad Institute of MIT and Harvard (US) and Whitehead Institute (US).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology (US) adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with Massachusetts Institute of Technology (US). The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology (US) is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia (US), wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after Massachusetts Institute of Technology (US) was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst (US). In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    Massachusetts Institute of Technology (US) was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology (US) faculty and alumni rebuffed Harvard University (US) president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the Massachusetts Institute of Technology (US) administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology (US) catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities (US) in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at Massachusetts Institute of Technology (US) that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    Massachusetts Institute of Technology (US)‘s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology (US)’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, Massachusetts Institute of Technology (US) became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected Massachusetts Institute of Technology (US) profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of Massachusetts Institute of Technology (US) between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, Massachusetts Institute of Technology (US) no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and Massachusetts Institute of Technology (US)’s defense research. In this period Massachusetts Institute of Technology (US)’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. Massachusetts Institute of Technology (US) ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT (US) Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology (US) students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at Massachusetts Institute of Technology (US) over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, Massachusetts Institute of Technology (US)’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    Massachusetts Institute of Technology (US) has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology (US) classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    Massachusetts Institute of Technology (US) was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, Massachusetts Institute of Technology (US) launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, Massachusetts Institute of Technology (US) announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology (US) faculty adopted an open-access policy to make its scholarship publicly accessible online.

    Massachusetts Institute of Technology (US) has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology (US) community with thousands of police officers from the New England region and Canada. On November 25, 2013, Massachusetts Institute of Technology (US) announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the Massachusetts Institute of Technology (US) community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO (US) was designed and constructed by a team of scientists from California Institute of Technology (US), Massachusetts Institute of Technology (US), and industrial contractors, and funded by the National Science Foundation (US).

    MIT/Caltech Advanced aLIGO.

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology (US) physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology (US) graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of Massachusetts Institute of Technology (US) is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the Massachusetts Institute of Technology (US) community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 10:46 am on September 9, 2021 Permalink | Reply
    Tags: "Unprecedented Plasma Lensing for High-Intensity Lasers", Applied Research & Technology, Berkeley Lab Laser Accelerator (BELLA) Center, , , , Using thin hollow structures-or “capillaries”-containing a plasma to transport the pulses of light.   

    From DOE’s Lawrence Berkeley National Laboratory (US): “Unprecedented Plasma Lensing for High-Intensity Lasers” 

    From DOE’s Lawrence Berkeley National Laboratory (US)

    September 9, 2021
    Media Relations
    media@lbl.gov
    (510) 486-5183

    By Joe Chew

    A 20-centimeter-long capillary discharge waveguide, used at the BELLA Center to guide high-intensity laser pulses; such waveguides set the center’s record thus far for accelerating electrons: 8 billion electron volts (8 GeV). Credit: Thor Swift/Berkeley Lab.

    High-power laser pulses focused to small spots to reach incredible intensities enable a variety of applications, ranging from scientific research to industry and medicine. At the Berkeley Lab Laser Accelerator (BELLA) Center, for instance, intensity is key to building particle accelerators thousands of times shorter than conventional ones that reach the same energy. However, laser-plasma accelerators (LPAs) require sustained intensity over many centimeters, not just a spot focus that rapidly expands because of diffraction.

    To achieve sustained intensity, the BELLA Center, at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), uses thin hollow structures, or “capillaries,” containing a plasma to transport the pulses of light. BELLA Center scientists have been pushing toward longer and longer capillaries as they strive for higher beam energies with their LPAs.

    Their latest work shows, with higher precision than ever before, that these plasma waveguides are extremely stable and of reproducibly high quality, and that these characteristics can be maintained over distances as long as 40 centimeters. It confirms that this key technology for LPAs can be scaled up as the BELLA Center pushes toward higher energies, benefitting potential applications that range from biomedical research and treatment to free-electron-laser light sources for research facilities.

    The work – led by postdoctoral scholar Marlene Turner, working with staff scientist Anthony Gonsalves – is described in a study published in the journal High Power Laser Science and Engineering.

    “This work shows that capillaries can produce extremely stable plasma targets for acceleration and that observed variations in accelerator performance are primarily laser fluctuation driven, which indicates the need for active laser feedback control,” said Cameron Geddes, director of the Accelerator Technology and Applied Physics Division, parent organization of the BELLA Center.

    Plasma channels give consistent guidance to powerful pulses

    Fiber optics can transport laser beam pulses over thousands of kilometers, a principle familiar in modern computer networks. However, with the high laser intensities used at BELLA Center (20 orders of magnitude more intense than the sunlight on the Earth’s surface), electrons would be near-instantaneously removed from their parent atoms by the laser field, destroying solid materials such as glass fibers. The solution is to use plasma, a state of matter in which electrons have already been removed from their atoms, as a “fiber.”

    Marlene Turner (right) collaborating under COVID precautions with Anthony Gonsalves. Credit: Thor Swift/Berkeley Lab.

    The BELLA Center has used plasmas to guide laser pulses over distances as long as 20 centimeters to achieve the highest laser-driven particle energies to date. The plasma is created by an electrical discharge inside the capillary. Inside this plasma, electrons “surf” a wave of ultrahigh electric field set up by the laser pulse; the longer the focus is sustained, the faster they are going at the end of the ride.

    However, the gas breakdown in an electrical discharge is a violent and largely uncontrolled event (imagine a tiny, confined lightning strike). Charting a path forward to ever higher energies and precision control at the BELLA Center, researchers needed to know how reproducible the wave-guiding characteristics are from one laser pulse to another, and how well each laser pulse can be guided.

    In order to give wave-guiding results analogous to a fiber optic, the plasma density should be lowest in the center, with a profile mathematically described as parabolic. “We showed, with unprecedented precision, that the plasma profiles are indeed very parabolic over the laser pulse spot size they are designed to guide,” said Gonsalves. “This allows for pulse propagation in the waveguide without quality degradation.”
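    For readers who want numbers, the guiding condition for such a parabolic channel can be sketched with the standard matched-spot formula from laser-plasma-accelerator theory. This is a textbook relation, not a result of this study, and the channel radius and depth below are purely illustrative assumptions:

    ```python
    import math

    R_E = 2.8179403262e-15  # classical electron radius, meters

    def matched_spot(delta_n, r_ch):
        """Matched spot size w_m for a parabolic plasma channel
        n_e(r) = n0 + delta_n * (r / r_ch)**2: a Gaussian pulse with
        this radius propagates at constant spot size,
        w_m = (r_ch**2 / (pi * r_e * delta_n))**(1/4)."""
        return (r_ch**2 / (math.pi * R_E * delta_n)) ** 0.25

    # Illustrative channel: radius 100 um, depth 1e23 electrons/m^3.
    w_m = matched_spot(delta_n=1e23, r_ch=100e-6)
    print(f"matched spot ~ {w_m * 1e6:.0f} um")  # tens of micrometers
    ```

    A pulse focused wider or narrower than this matched value oscillates in size as it propagates, which is why the measured parabolicity of the density profile matters for guiding without quality degradation.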

    Other types of plasma waveguides (there are several ways to create them) can also be measured with high precision using these methods.

    The measurement precision was also ideal for investigating how much the density profile changes from one laser shot to another, since although the capillary is durable, the wave-guiding plasma within it is formed anew each time. The team found outstanding stability and reproducibility.

    “These results, along with our ongoing work on active feedback aided by machine learning techniques, are a big step to improving the stability and usability of laser-plasma accelerators,” said Eric Esarey, director of the BELLA Center. (Active feedback to stabilize laser fluctuations is also the subject of research and development at the BELLA Center.)

    Guided laser pulses illuminate a path toward progress

    Laser-plasma acceleration technology could reduce the size and cost of particle accelerators – increasing their availability for hospitals and universities, for instance, and ultimately bringing these benefits to a next-generation particle collider for high-energy physics. One of the keys to increasing their particle-beam energy beyond the present record of 8 billion electron volts (GeV) is the use of longer accelerating channels; another is “staging,” or the use of the output of one acceleration module as the input to another. Verifying the quality of the plasma channel where the acceleration takes place – and the consistency and reproducibility of that quality – gives a vote of confidence in the technology basis of these plans.

    Marlene Turner inspects a 40-centimeter-long capillary. Credit: Thor Swift/Berkeley Lab.

    Aside from showing that this capillary-based waveguide is of high and consistent quality, this work involved waveguides twice as long as the one used for achieving record-breaking energy. “The precision 40-centimeter-long waveguides we have now developed could push those energies even higher,” said Turner.

    The work was supported by the DOE Office of Science, Office of High Energy Physics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) (US) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of The National Academy of Sciences (US), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to The National Academy of Engineering (US), and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (US) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the University of California-Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley (US) physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.


    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team put together during this period included two other young scientists who went on to establish large laboratories of their own: J. Robert Oppenheimer founded DOE’s Los Alamos National Laboratory (US), and Robert Wilson founded Fermi National Accelerator Laboratory (US).

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab contributed to what have been judged the three most valuable technology developments of the war: the atomic bomb, the proximity fuse, and radar. The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy (US). The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory (US)) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1,700 memos are available online, hosted by the Laboratory.

    The lab remains owned by the Department of Energy (US), with management from the University of California (US). Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity; conduct basic research for a secure energy future; understand living systems to improve the environment, health, and energy supply; understand matter and energy in the universe; build and safely operate leading scientific facilities for the nation; and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science (US):

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    LBNL/ALS


    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute (US) supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory (US), DOE’s Oak Ridge National Laboratory (US) (ORNL), DOE’s Pacific Northwest National Laboratory (US) (PNNL), and the HudsonAlpha Institute for Biotechnology (US). The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry (US) [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center (US) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center(US) at Lawrence Berkeley National Laboratory

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Science Network (US) is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint BioEnergy Institute (US) (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratories (US), the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science (US), and DOE’s Lawrence Livermore National Laboratory (US) (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology (US) and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory (US) leads JCESR and Berkeley Lab is a major partner.

     