
  • richardmitnick 5:26 pm on May 20, 2022
    Tags: "The missing piece to faster and cheaper and more accurate 3D mapping", Switzerland is currently mapping its entire landscape using airborne laser scanners – the first time since 2000., The Geodetic Engineering Laboratory (Topo) within EPFL's School of Architecture; Civil and Environmental Engineering (ENAC)., The work could be done five times faster based upon the work of Jan Skaloud, Davide Cucci and Aurélien Brun., Three-dimensional (3D) mapping is a very useful tool.

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH) : “The missing piece to faster and cheaper and more accurate 3D mapping” 


    5.20.22
    Célia Zwahlen

    Engineers at EPFL and The University of Geneva [Université de Genève] (CH) believe they hold the key to automated drone mapping. By combining artificial intelligence with a new algorithm, their method promises to considerably reduce the time and resources needed to accurately scan complex landscapes.

    Three-dimensional (3D) mapping is a very useful tool for monitoring construction sites, tracking the effects of climate change on ecosystems and verifying the safety of roads and bridges. However, the technology currently used to automate the mapping process is limited, making it a long and costly endeavor.

    “Switzerland is currently mapping its entire landscape using airborne laser scanners – the first time since 2000. But the process will take four to five years since the scanners have to fly at an altitude below one kilometer if they are to collect data with sufficient detail and accuracy,” says Jan Skaloud, a senior scientist at the Geodetic Engineering Laboratory (Topo) within EPFL’s School of Architecture, Civil and Environmental Engineering (ENAC). “With our method, surveyors can send laser scanners as high as five kilometers and still maintain accuracy. Our lasers are more sensitive and can beam light over a much wider area, making the process five times faster.”

    The method is described in a paper published in the ISPRS Journal of Photogrammetry and Remote Sensing by Davide Cucci, a senior research associate at the Research Center for Statistics of the Geneva School of Economics and Management of the University of Geneva, who works regularly with Topo; Jan Skaloud; and lead author Aurélien Brun, a recent Master’s graduate from EPFL and winner of an award from the Western Switzerland Association of Surveyor Engineers (IGSO).

    Missing the point

    LiDAR laser scanners beam millions of pulses of light on surfaces to create high-resolution digital twins – computer-based replicas of objects or landscapes – that can be used in architecture, road systems and manufacturing, for example. Lasers are particularly effective at collecting spatial data since they don’t depend on ambient light, can collect accurate data at large distances and can essentially “see through” vegetation. But lasers’ accuracy is often lost when they’re mounted on drones or other moving vehicles, especially in areas with numerous obstacles like dense cities, underground infrastructure sites, and places where GPS signals are interrupted. This results in gaps and misalignments in the datapoints used to generate 3D maps (also known as laser-point clouds), and can lead to double vision of scanned objects. These errors must be corrected manually before a map can be used.


    In Leysin, a LiDAR mounted on a drone maps the landscape, 28 March 2022. © Topo/EPFL

    “For now, there’s no way to generate perfectly aligned 3D maps without a manual data-correction step,” says Cucci. “A lot of semi-automatic methods are being explored to overcome this problem, but ours has the advantage of resolving the issue directly at the scanner level, where measurements are taken, eliminating the need to subsequently make corrections. It’s also fully software-driven, meaning it can be implemented quickly and seamlessly by end users.”

    On the road to automation

    The Topo method leverages recent advancements in artificial intelligence to detect when a given object has been scanned several times from different angles. The method involves selecting correspondences and inserting them into what’s called a Dynamic Network, in order to correct gaps and misalignments in the laser-point cloud.
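    As a rough illustration of the correction step (a generic rigid re-alignment via the Kabsch algorithm, not the Topo implementation — the point arrays and the 2° drift below are made up), once two views of the same object have been matched, the rotation and translation that re-align them can be recovered by least squares:

```python
import numpy as np

def kabsch_align(src, dst):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||
    over corresponding points (N x 3 arrays), via SVD (Kabsch algorithm)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Two scans of the same patch, the second offset by a small heading drift
rng = np.random.default_rng(0)
scan_a = rng.uniform(0, 10, size=(50, 3))
theta = np.deg2rad(2.0)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0,              0,             1]])
scan_b = scan_a @ Rz.T + np.array([0.3, -0.1, 0.05])

R, t = kabsch_align(scan_a, scan_b)
residual = np.abs(scan_a @ R.T + t - scan_b).max()
print(residual)   # essentially zero: the duplicated scan is re-aligned
```

    In the paper's setting the correspondences feed a Dynamic Network that adjusts the whole trajectory rather than a single scan pair, but the least-squares re-alignment idea is the same.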

    “We’re bringing more automation to 3D mapping technology, which will go a long way towards improving its efficiency and productivity and allow for a much wider range of applications,” says Skaloud.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition


    The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is The Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich] (CH). Associated with several specialized research institutes, the two universities form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles Polytechniques Fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École Polytechnique Fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École Spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR), and John Gay, the then professor and rector of the Académie de Lausanne. At its inception it had only 11 students and the offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganized and acquired the status of a university in 1890, the technical faculty changed its name to École d’Ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich (CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) started to develop into the field of life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.

    Organization

    EPFL is organized into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences
    Institute of Mathematics
    Institute of Chemical Sciences and Engineering
    Institute of Physics
    European Centre of Atomic and Molecular Computations
    Bernoulli Center
    Biomedical Imaging Research Center
    Interdisciplinary Center for Electron Microscopy
    MPG-EPFL Centre for Molecular Nanosciences and Technology
    Swiss Plasma Center
    Laboratory of Astrophysics

    School of Engineering

    Institute of Electrical Engineering
    Institute of Mechanical Engineering
    Institute of Materials
    Institute of Microengineering
    Institute of Bioengineering

    School of Architecture, Civil and Environmental Engineering

    Institute of Architecture
    Civil Engineering Institute
    Institute of Urban and Regional Sciences
    Environmental Engineering Institute

    School of Computer and Communication Sciences

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Networking
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing
    Systems

    School of Life Sciences

    Bachelor-Master Teaching Section in Life Sciences and Technologies
    Brain Mind Institute
    Institute of Bioengineering
    Swiss Institute for Experimental Cancer Research
    Global Health Institute
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics
    NCCR Synaptic Bases of Mental Diseases

    College of Management of Technology

    Swiss Finance Institute at EPFL
    Section of Management of Technology and Entrepreneurship
    Institute of Technology and Public Policy
    Institute of Management of Technology and Entrepreneurship
    Section of Financial Engineering

    College of Humanities

    Human and social sciences teaching program

    EPFL Middle East

    Section of Energy Management and Sustainability

    In addition to the eight schools there are seven closely related institutions:

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École Cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

     
  • richardmitnick 3:49 pm on May 20, 2022
    Tags: "PeVatrons": cosmic particle accelerators, "Seeing Inside a Cosmic Superaccelerator", Only recently has data begun to shed light on these energetic particles., Peta-electron-volt particles, The universe has no shortage of exceedingly energetic particles. They slam into Earth’s atmosphere all the time.

    From “Sky & Telescope”: “Seeing Inside a Cosmic Superaccelerator” 


    May 19, 2022
    Monica Young

    Astronomers are exploring a celestial particle accelerator in the Eel Nebula that surrounds a distant pulsar.

    This artist’s representation shows a pulsar wind nebula around another pulsar, named Geminga. Pulsar wind nebulae may be the cosmic sites of particle accelerators. Credit: Nahks TrEhnl.

    Take a coin out of your pocket and flip it. That coin-flip carries a peta-electron-volt (PeV) of energy. Now imagine a particle a million billion times smaller than your coin, far beyond the range of even the most powerful microscope — and it’s flitting by with that same amount of energy. That one particle surpasses by a thousandfold the energy that humanity’s most sophisticated particle accelerators can generate.
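    The coin analogy checks out numerically (taking a 5-gram coin tossed at about 25 cm/s as a rough assumption):

```python
eV = 1.602e-19                  # joules per electron volt
pev = 1e15 * eV                 # one peta-electron-volt in joules
print(f"{pev:.1e} J")           # 1.6e-04 J

# kinetic energy of a 5 g coin flipped upward at ~0.25 m/s (rough estimate)
coin = 0.5 * 0.005 * 0.25**2
print(f"{coin:.1e} J")          # 1.6e-04 J -- same order of magnitude
```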

    Yet the universe has no shortage of such exceedingly energetic particles. They slam into Earth’s atmosphere all the time. But while astronomers have long known these potent particles exist, they’ve struggled to understand how they come to be. Only recently has data begun to shed light on this phenomenon.

    A Twisted Path

    The trouble is, PeV particles are generally charged, whether they be protons or electrons. As such, they’re susceptible to the manipulations of magnetic fields, their paths bending this way and that as they pass through the galaxy. Tracing a single particle back to its source is nigh impossible.
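    A back-of-the-envelope calculation shows why: the gyroradius of a PeV proton in the galaxy's magnetic field (taking an assumed typical strength of about 3 microgauss) is a small fraction of a parsec, so over source distances of thousands of light-years the particle's path is thoroughly scrambled:

```python
# Gyroradius of an ultra-relativistic proton: r = E / (q * B * c)
E = 1e15 * 1.602e-19      # 1 PeV in joules
q = 1.602e-19             # proton charge, coulombs
B = 3e-10                 # ~3 microgauss galactic field in tesla (assumed)
c = 2.998e8               # speed of light, m/s
r_m = E / (q * B * c)     # gyroradius in metres
pc = 3.086e16             # metres per parsec
print(r_m / pc)           # ~0.36 pc -- tiny next to kiloparsec source distances
```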

    But the processes that make energetic particles also make gamma rays. And gamma rays, being chargeless photons, are not so easily led astray by the galaxy’s swirling magnetic field. These photons are thus the messengers that can tell astronomers where particles are being accelerated – and how.

    Two facilities have come online in recent years to give astronomers access to the highest-energy gamma rays: the Large High Altitude Air Shower Observatory (LHAASO) in Tibet and the High-Altitude Water Cherenkov Observatory (HAWC) in Mexico. Their data has enabled astronomers to identify roughly a dozen possible cosmic particle accelerators, known as PeVatrons.

    LHAASO, the cosmic ray observatory in Yangbajing, southwest China’s Tibet Autonomous Region. Credit: Institute of High Energy Physics of the Chinese Academy of Sciences

    The Eel Nebula

    One of these PeVatron candidates is the Eel Nebula, 11,400 light-years away in the constellation Scutum. In this nebula, a cloud of charged particles surrounds a pulsar as it speeds through space, giving the nebula its distinct snakelike shape.

    Using observations not just of gamma rays but also X-rays and radio waves to describe the particle cloud, Daniel Burgess (Columbia Astrophysics Laboratory) and team put together a computer model that describes the current state of the pulsar, the plasma around it, and their evolution over time. In a study to appear in The Astrophysical Journal, they show that this particular Pevatron is accelerating electrons to PeV energies.

    This animation shows first the X-ray photons detected from the Eel Nebula, then the gamma-ray emission. Gamma rays have higher energy and lower resolution than X-rays, so the image appears much blurrier in gamma rays. Credit: Daniel Burgess et al.

    “This is one of the first unambiguously identified [electron-accelerating] PeVatron candidates,” says Henrike Fleischhack (Catholic University of America). “The follow-up observations and detailed modeling presented here . . . can serve as a blueprint for the study and identification of other PeVatron candidates.”

    Indeed, team member Kaya Mori (also at Columbia Astrophysics Laboratory) confirms that the team is working on applying the same technique to multiple other pulsar clouds, including two nebulae evocatively named Dragonfly and Boomerang. Other teams are investigating alternative PeVatrons, such as the shocked plasma bubbles cast out by supernova explosions.

    While the Eel Nebula is a clear candidate source of PeV electrons, Fleischhack points out that the energetic particles observed at Earth include not only electrons but also protons. And so far, most of the other candidate PeVatrons have been found to accelerate only electrons.

    “The question remains,” Fleischhack says: “Where are the [proton-accelerating] PeVatrons that we know must be out there?”

    See the full article here.


    Sky & Telescope, founded in 1941 by Charles A. Federer Jr. and Helen Spence Federer, has the largest, most experienced staff of any astronomy magazine in the world. Its editors are virtually all amateur or professional astronomers, and every one has built a telescope, written a book, done original research, developed a new product, or otherwise distinguished him or herself.

    Sky & Telescope magazine, now in its eighth decade, came about because of some happy accidents. Its earliest known ancestor was a four-page bulletin called The Amateur Astronomer, which was begun in 1929 by the Amateur Astronomers Association in New York City. Then, in 1935, the American Museum of Natural History opened its Hayden Planetarium and began to issue a monthly bulletin that became a full-size magazine called The Sky within a year. Under the editorship of Hans Christian Adamson, The Sky featured large illustrations and articles from astronomers all over the globe. It immediately absorbed The Amateur Astronomer.

    Despite initial success, by 1939 the planetarium found itself unable to continue financial support of The Sky. Charles A. Federer, who would become the dominant force behind Sky & Telescope, was then working as a lecturer at the planetarium. He was asked to take over publishing The Sky. Federer agreed and started an independent publishing corporation in New York.

    “Our first issue came out in January 1940,” he noted. “We dropped from 32 to 24 pages, used cheaper quality paper…but editorially we further defined the departments and tried to squeeze as much information as possible between the covers.” Federer was The Sky’s editor, and his wife, Helen, served as managing editor. In that January 1940 issue, they stated their goal: “We shall try to make the magazine meet the needs of amateur astronomy, so that amateur astronomers will come to regard it as essential to their pursuit, and professionals to consider it a worthwhile medium in which to bring their work before the public.”

     
  • richardmitnick 3:12 pm on May 20, 2022
    Tags: "AFMs": atomic force microscopes, "Cubit": ancient measurement which was approximately the length of a forearm, "Metrology": the science of measurement, "Nanometre scale": a nanometre is a billionth of a metre., "The small things make a big difference in the science of measurement", In the interest of greater accuracy in the 1790s the French government commission standardized the metre as the basic unit of distance., Once the realm of research scientists nanoscales are increasingly important in industry., Since 2018 some key definitions of measurement units have been redefined., The kilo; the ampere; the kelvin and the mole are now based on fundamental constants in nature instead of physical models., The Romans used fingers and feet in their measurement systems while the story goes that Henry I of England (c 1068 - 1135) tried to standardize a yard as the distance from his nose to his thumb.

    From “Horizon” The EU Research and Innovation Magazine : “The small things make a big difference in the science of measurement” 


    19 May 2022
    Anthony King

    As technology shrinks to the nanoscale, measuring the things we can barely see becomes ever more important. © Rito Succeed, Shutterstock.

    Scientists must make ever more sophisticated measurements as technology shrinks to the nanoscale and we face global challenges from the effects of climate change.

    As industry works more and more on the nanometre scale (a nanometre is a billionth of a metre), there is a need to measure more reliably and accurately things we can barely see. This requires metrology, the science of measurement.

    Nano-scale metrology is useful in everyday life, for example to measure doses of medication or in the development of computer chips for our digital devices.

    ‘Metrology is needed everywhere that you make measurements or if you want to compare measurements,’ said Virpi Korpelainen, senior scientist at the Technical Research Centre of Finland and National Metrology Institute in Espoo, Finland.

    Since the earliest civilizations, standardized and consistent measurements have always been crucial to the smooth functioning of society. In ancient times, physical quantities such as body measurements were used as units.

    One of the earliest known units was the cubit, which was approximately the length of a forearm. The Romans used fingers and feet in their measurement systems while the story goes that Henry I of England (c 1068 – 1135) tried to standardise a yard as the distance from his nose to his thumb.

    Standard units

    Standardization demands precise definitions and consistent measurements. In the interest of greater accuracy, in the 1790s a French government commission standardized the metre as the basic unit of distance. This set Europe on the path to the standardized international system of base units (SI), which has been evolving ever since.

    Since 2018, some key definitions of measurement units have been redefined. The kilo, the ampere, the kelvin and the mole are now based on fundamental constants of nature instead of physical artefacts. This is because physical artefacts change over time, as happened with the prototype of the kilo, which lost a tiny amount of mass over the 100 years after it was created. With this new approach, which was adopted after years of careful science, the definitions will not change.

    This evolution is often driven by incredibly sophisticated science, familiar only to metrologists, such as the speed of light in a vacuum (metre), the caesium atom’s transition frequency (second) or the Planck constant (kilogram), all of which are used to calibrate key units of measurement under the SI.
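    The metre, for example, has been defined since 1983 by fixing the speed of light, so a length calibration reduces to a time measurement (the round-trip pulse timing below is hypothetical):

```python
c = 299_792_458            # speed of light in m/s -- exact by definition of the metre
t_round_trip = 6.67128e-9  # measured round-trip time of a laser pulse, s (hypothetical)
distance = c * t_round_trip / 2   # one-way distance to the target
print(round(distance, 4))         # ~1.0 m
```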

    ‘When you buy a measuring instrument, people typically don’t think of where the scale comes from,’ said Korpelainen. This goes for scientists and engineers too.

    Once the realm of research scientists, nanoscales are increasingly important in industry. Nanotechnology, computer chips and medications typically rely on very accurate measurements at very small scales.

    Even the most advanced microscopes need to be calibrated, meaning that steps must be taken to standardise their measurements of the very small. Korpelainen and colleagues around Europe are developing improved atomic force microscopes (AFMs) in an ongoing project called MetExSPM.

    AFM is a type of microscope that gets so close to a sample, it can almost reveal its individual atoms. ‘In industry, people need traceable measurements for quality control and for buying components from subcontractors,’ said Korpelainen.

    The project will allow the AFM microscopes to take reliable measurements at nanoscale resolution by using high-speed scanning, even on relatively large samples.

    ‘Industry needs AFM resolution if they want to measure distances between really small structures,’ Korpelainen said. Research on AFMs has revealed that measurement errors are easily introduced at this scale and can be as high as 30%.

    The demand for small, sophisticated, high-performing devices means the nanoscale is growing in importance. Korpelainen used an AFM and lasers to calibrate precision scales for other microscopes.

    She also coordinated another project, 3DNano, in order to measure nanoscale 3D objects that are not always perfectly symmetrical. Precise measurements of such objects support the development of new technology in medicine, energy storage and space exploration.

    Radon flux

    Dr Annette Röttger, a nuclear physicist at PTB, the national metrology institute in Germany, is interested in measuring radon, a radioactive gas with no colour, smell or taste.

    Radon is naturally occurring. It originates from decaying uranium below ground. Generally, the gas leaks into the atmosphere and is harmless, but it can reach dangerous levels when it builds up in dwellings, potentially causing illness to residents.

    But there is another reason Röttger is interested in measuring radon. She believes it can improve the measurement of important greenhouse gases (GHG).

    ‘For methane and carbon dioxide, you can measure the amounts in the atmosphere very precisely, but you cannot measure the flux of these gases coming out of the ground, representatively,’ said Röttger.

    ‘Flux’ is the rate of seepage of a gas. It is a helpful measurement to trace the quantities of other GHG such as methane that also seep out of the ground. Measurements of methane coming out of the ground are variable, so that one spot will differ from another a few steps away. The flow of radon gas out of the ground closely tracks the flow of methane, a damaging GHG with both natural and human origins.

    When radon gas emissions from the ground increase, so do carbon dioxide and methane levels. ‘Radon is more homogenous,’ said Röttger, ‘and there is a close correlation between radon and these greenhouse gases.’ The research project to study it is called traceRadon.
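    The radon tracer idea can be sketched numerically: if the radon flux out of the soil is known, a GHG flux can be inferred from how the two concentrations rise together in the same air mass (the numbers below are illustrative, not project data):

```python
# Radon tracer method (illustrative, hypothetical numbers):
# F_ch4 / F_rn ≈ dC_ch4 / dC_rn for gases accumulating in the same air column
f_rn = 50.0    # radon flux from soil, Bq m^-2 h^-1 (assumed known from calibration)
d_rn = 4.0     # overnight rise in radon concentration, Bq m^-3
d_ch4 = 0.012  # simultaneous rise in methane concentration, mg m^-3
f_ch4 = f_rn * d_ch4 / d_rn
print(f_ch4)   # inferred methane flux, ≈ 0.15 mg m^-2 h^-1
```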

    Radon is measured via its radioactivity but because of its low concentrations it is very challenging to measure. ‘Several devices will not work at all, so you will get a zero-reading value because you are below the detection limit,’ said Röttger.

    Wetland rewetting

    Measuring the escape of radon enables scientists to model the rate of emissions over a landscape. This can be useful to measure the effects of climate mitigation measures. For example, research indicates that the rapid rewetting of drained peatland stores greenhouse gas and mitigates climate change.

    But if you go to the trouble of rewetting a large marshland, ‘You will want to know if this worked,’ said Röttger. ‘If it works for these GHG, then we should see less radon coming out too. If we don’t, then it didn’t work.’

    With more precise calibration, the project will improve radon measurements over large geographical areas. This may also be used to improve radiological early warning systems in a European monitoring network called the European Radiological Data Exchange Platform (EURDEP).

    ‘We have lots of false alarms (due to radon) and we might even miss an alarm because of this,’ said Röttger. ‘We can make this network better, which is increasingly important for radiological emergency management support by metrology.’

    Given the intensity of the climate crisis, it is crucial to present reliable data to policy makers, added Röttger. This will assist greatly in addressing climate change, arguably the biggest threat mankind faces.

    See the full article here.



     
  • richardmitnick 1:10 pm on May 20, 2022
    Tags: "Q&A: 'People have to be at the centre of the energy transformation'", Nebojsa Nakicenovic: Vice-Chair, Group of Chief Scientific Advisors (GCSA)

    From “Horizon” The EU Research and Innovation Magazine : “Q&A: ‘People have to be at the centre of the energy transformation’”


    17 May 2022
    Kevin Casey

    Nebojsa Nakicenovic, Vice-Chair, Group of Chief Scientific Advisors (GCSA)

    In June 2021, the EU’s Group of Chief Scientific Advisors (GCSA) published the Scientific Opinion entitled “A systemic approach to the energy transition in Europe”, arguing that the clean energy transition in the European Green Deal must keep people at its centre. In light of tomorrow’s EU announcement that is critical to the future of energy supply in Europe, we invite GCSA Vice-Chair Nebojsa Nakicenovic to comment on the centrality of a just transition and the importance of staying focused on a clean energy future even at times of intensifying pressure.

    Tell us why the European Commission even needs a scientific opinion at all. Does not the evidence speak for itself?

    This publication (A systemic approach to the energy transition in Europe) is part of the Science Advice Mechanism (SAM) of the European Commission. From my perspective, this is a very unique way of providing scientific advice to the decision makers. Many governments have chief scientific advisors with that function. What is unique about SAM in the European Commission is that it has three independent parts.

    First, there is the Group of Chief Scientific Advisors who provide the scientific opinion. There are very clear process rules about how that happens. The other independent part is the so-called SAPEA (Scientific Advice for Policy of the European Academies). This is a consortium of over 100 European academies. They provide a scientific evidence review, similar to the climate change assessment of the IPCC (Intergovernmental Panel on Climate Change).

    The assessment is a scientific analysis of what we know about a particular topic. They (SAPEA) do not provide a scientific opinion or scientific advice, importantly they look into the possible options. We, the group of seven chief scientific advisors, based on this evidence review — evidence, so factual scientific knowledge — provide a scientific opinion to the European Commission.

    There is also a unit in the Commission that catalyses this process. The three groups work closely together but we are independent. That explains the context. Why would we provide a scientific opinion? It is because the topic is considered really crucial and central to the multiple crises facing Europe and the world.

    Does a just transition require a transformation of the economic model of energy services? People own the problem, should they not own the solution too?

    That is precisely what we have tried to address in our scientific opinion – based on the scientific evidence. We didn’t go beyond the scientific evidence.

    Energy cannot be seen as a silo. We – people – have to be at the centre. That means it has to be an inclusive process involving everybody and, importantly, not leaving anyone behind. Because there is a great danger that any transformation, unfortunately, leads to winners and hopefully there will be many, many winners but also – I wouldn’t say “losers” – but there are people who fall through the cracks who might be left behind and do not have an escape hatch. This is what was a high priority – to identify how to do that.

    The EU’s Group of Chief Scientific Advisors argue that the clean energy transition in the European Green Deal must keep people at its centre. ©Alexanderstock23, Shutterstock.

    In our scientific opinion – and in fact we say explicitly, it is essential that sustainable energy, lifestyles, and behaviours become the preferred choice for the people – become a natural choice. For that, we have to create an environment that allows that. This is clearly very, very complex, I don’t think anybody has a silver bullet on that question.

    The world has changed since the paper was published in June 2021 – in particular, war, inflation and recent dire warnings from the IPCC about rising temperatures. How does that affect your opinion on a just transition?

    I have to be very careful to distinguish what is in our scientific opinion based on the evidence and what is my personal view. It’s important not to mix the two or I would not be reflecting the scientific advice mechanism which I think is very unique – I just want to make that clear. Here is my private opinion based on our scientific opinion but not in it.

    Geopolitics are changing. There is no doubt that we are in a crucial moment in history. And this is why we argued before – again, my view – that we shouldn’t lose sight of the long-term objectives.

    We are likely to exceed 1.5 degrees – it is almost certain that by 2040 we will be above (the limit prescribed), perhaps even earlier. From the scientific point of view, this is not new.

    From the policy point of view and behavioural point of view, this is something one needs to somehow internalise. We will exceed that goal and we will bear the dangerous consequences. But, we should not lose the perspective of doing our utmost to reach 1.5 degrees in the future – and for that we need to act now.

    This is another dimension of justice – intergenerational justice. We have to make sure that we leave the planet to future generations in (hopefully) better condition than it will be in over the next decade or two.

    Is it even possible for the EGD to achieve ‘a clean, circular economy, a modern, resource-efficient and competitive economy’ by 2050?

    Again, we are in the realm of opinion. Nobody can tell what the future will be like.

    I was very enthusiastic when in 2015 all of the world adopted the UN’s Sustainable Development Goals (SDGs) and when there was the Paris Agreement on climate change. I think those were the two really important visionary steps towards this aspirational transformation that we were talking about.

    I would also argue that the European Green Deal, Fit for 55 and New European Bauhaus initiatives are even more actionable in some sense. They provide a clearer agenda for how the world and life might and should look in 2050.

    I don’t want to sound too pessimistic – and again, let me add that this is my personal perspective – but 30 years is a long enough time to achieve this transformation.

    We have done that before. The most recent example is mobile phones. It all started in 1990, and today everybody in the world has a phone. Even the poorest people have one, because it enables new economic activities and benefits many (despite the nuisance of always being reachable!).

    Another example, just to show that in principle this is doable, is the replacement of horses by motor vehicles. That also took 30 years in most countries. We have 30 years to replace our vehicle fleet with hydrogen and electric vehicles. We have just enough time for the transformation if we act immediately.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 12:37 pm on May 20, 2022 Permalink | Reply
    Tags: "A mountain of learning", A group of UW students hiked up to one of Mount Baker’s most prominent glaciers – Easton Glacier – and learned how to gather highly precise data that can be used to track glacier change., If one took a picture every year one would struggle to put a number on how much ice has melted away., It’s important to monitor glacier sites to understand the impacts of climate change., There are things you experience when you are standing next to a glacier that you just can’t learn in a classroom., There’s an ideal way to learn about retreating glaciers: visit them for the day., To go through the calculations and match up the elevation models and see clearly where the glacier has thinned was highly educational.

    From The University of Washington (US) Civil & Environmental Engineering: “A mountain of learning” 

    From The University of Washington (US) Civil & Environmental Engineering

    In The University of Washington College of Engineering

    At The University of Washington

    5.20.22

    Brooke Fisher
    Photos: Mark Stone/University of Washington

    Students practice drone surveying techniques at Mount Baker.

    There’s an ideal way to learn about retreating glaciers: visit them for the day.

    That’s exactly what a group of UW students did in September 2021, when they hiked up to one of Mount Baker’s most prominent glaciers, Easton Glacier, and learned how to gather highly precise data that can be used to track glacier change.

    “It’s a great educational opportunity — students just need a pair of hiking boots,” says CEE Assistant Professor David Shean. “There are things you experience when you are standing next to a glacier that you just can’t learn in a classroom. Students feel the wind and hear the roar of the waterfall as the glacier melts. They realize there are daily variations in these things, and you don’t get that from a textbook or PowerPoint slides.”

    About twice per year, Shean takes students out to the glacier. Many have already taken or are planning to take Shean’s Advanced Surveying class, which covers modern surveying techniques for scientific and engineering applications.

    “To go through the calculations and match up the elevation models and see clearly where the glacier has thinned was interesting,” says CEE Ph.D. student Seth Vanderwilt. “Just from standing on the hiking trail, if you took a picture every year you would struggle to put a number on how much ice has melted away.”

    The outings are a mix of teaching opportunity and research for Shean, who has been studying the glacier, located in the North Cascades, since 2014. During his Ph.D. studies, Shean started using satellite data to track glaciers in Washington and continues to monitor glacial change. In the past seven years, Shean has observed hundreds of meters of retreat and up to 20 meters of thinning in places.

    “It’s important to monitor sites like Easton to understand the impacts of regional climate change, but coupled with that are changes in the snowpack, vegetation and surrounding landscape, such as bedrock that was covered with ice for thousands of years,” Shean says. “We are building a record that can be used to study this interconnected system in detail.”

    1
    2
    3

    Three images:

    CEE Assistant Professor David Shean explains how to perform a Global Navigation Satellite System survey, where receivers on board the drones communicated with multiple satellite networks, including the well-known Global Positioning System, to pinpoint the precise location where images were captured.

    Shean holds a fixed-wing drone, which can map large areas in a back-and-forth “lawnmower” pattern.

    Ph.D. student Danny Hogan sets up a Global Navigation Satellite System receiver.

    A few students ventured into the valley below to place ground control point targets that would help lock in the precise locations of the drones and surface of the glacier, while other students helped launch the two drones: a quadcopter and a small fixed-wing drone with a 3-foot wingspan, for mapping larger areas. Equipped with high-resolution cameras, the drones captured a variety of photos from different locations and angles.

    The students also gained valuable experience with satellite navigation and positioning, which would be important for their later modeling efforts. Survey-grade Global Navigation Satellite System receivers on board the drones communicated with multiple satellite networks to pinpoint the precise location where images were captured.

    “Even with the best-available satellite images, the resolution and geolocation accuracy of our measurements is around a few feet. Using the drones, we can get down to a few centimeters, which enables all sorts of new science questions to be answered,” Shean says. “We can measure subtle changes as well as capture the rate of change, which shows how the glaciers are evolving over time.”

    4

    Created by CEE Ph.D. student Seth Vanderwilt, the above animations show the change in surface elevation of the Easton Glacier from 2014-2021. The visualizations were created using a technique called Structure from Motion, which builds 3D reconstructions from overlapping two-dimensional images. At left is a shaded relief map, which shows the topography in detail. At right is a color orthomosaic, which depicts a geometrically accurate view of the landscape.
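    Once two Structure-from-Motion elevation models are co-registered on a common grid, the elevation-change map behind such animations reduces to an array subtraction. A minimal sketch with invented numbers (a real pipeline would load GeoTIFF DEMs with a geospatial library such as rasterio):

    ```python
    import numpy as np

    # Two co-registered DEMs of the same area, in metres (toy values; a real
    # workflow would read rasters produced by Structure from Motion).
    dem_2014 = np.array([[120.0, 121.5],
                         [119.0, 118.5]])
    dem_2021 = np.array([[ 98.0, 100.5],
                         [ 97.5,  96.0]])

    # Cells that fall on the glacier (True = ice; the off-ice cell is excluded).
    glacier_mask = np.array([[True, True],
                             [True, False]])

    # Elevation change per cell: negative values indicate thinning.
    dh = dem_2021 - dem_2014

    mean_change = dh[glacier_mask].mean()   # mean thinning over the glacier
    annual_rate = mean_change / 7.0         # 2014 to 2021 is seven years
    print(f"mean change {mean_change:.1f} m, rate {annual_rate:.1f} m/yr")
    # → mean change -21.5 m, rate -3.1 m/yr
    ```

    The masked mean is what turns a per-pixel difference map into the single thinning numbers quoted in the article.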

    Creating 3D models

    Gathering highly precise data is just the first step. In Shean’s Advanced Surveying course, students learn to use software to stitch the drone images together and create 3D models and topographic maps.

    5
    CEE Ph.D. students Seth Vanderwilt and Hannah Besso discuss the results of the Global Navigation Satellite System survey.

    For the final class deliverable, students apply what they’ve learned to a project of their choice. Teaming up with classmates, Vanderwilt processed the images gathered at Easton Glacier in September, along with all of the data going back to 2014. The students created a time series analysis with a combined 6,500 drone images, which revealed approximately 3-4 meters of thinning over the lower glacier each year.
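    A per-year thinning rate like the one the students reported can be estimated by fitting a line to mean glacier-surface elevations over time. A minimal sketch, with hypothetical elevations chosen only to fall in the reported 3-4 m/yr range:

    ```python
    import numpy as np

    # Hypothetical mean surface elevations (m) of the lower glacier, one per
    # survey year; the values are invented for illustration.
    years = np.array([2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021], dtype=float)
    elevations = np.array([1900.0, 1896.5, 1893.2, 1889.4, 1886.1, 1882.0, 1878.8, 1875.3])

    # Least-squares linear fit; the slope is the thinning rate in m/yr.
    slope, intercept = np.polyfit(years, elevations, 1)
    print(f"thinning rate: {slope:.2f} m/yr")
    ```

    A linear fit across all years is less sensitive to any single noisy survey than differencing only the first and last models.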

    For Besso’s final project, she worked with classmates to collect new drone imagery at the site of the 2014 Oso landslide, which they used to create data visualizations. Comparing their 3D models to post-landslide data gathered by the United States Geological Survey, the students found that in the aftermath of the landslide, the banks of the North Fork of the Stillaguamish River were eroding and the channel was widening.

    “We took the project from the idea phase to going out to the field site to fly drones on a weekend,” Besso says. “It was something that was within our ability level after taking the class, but it took some training and planning because there were tall trees and challenging terrain and we didn’t want to crash the drones.”

    Sought-after skillset

    Hands-on experience gathering, processing and analyzing high-resolution topographic data gives students an advantage when applying for jobs, says Shean. Environmental consulting firms now rely on drones for inspections and mapping, but drone surveying is not taught in traditional college courses.

    “It gives our students a leg up,” Shean says. “They understand the data acquisition, software and how to deliver a final product. I get emails from students who took this class in previous years saying they did a drone survey at their new job and it worked beautifully. It’s one of the most rewarding aspects of teaching courses like this at UW.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Washington Civil & Environmental Engineering

    Take a moment to look around you. Buildings, bridges, running water and transit systems are the work of civil and environmental engineers.
    2
    Civil and environmental engineers design, construct and manage the essential facilities, systems and structures around us. Their work plays a crucial role in enabling livable, sustainable cities, healthy environments and strong economies.

    At the University of Washington, Civil & Environmental Engineering students and faculty are taking on the challenges presented by our aging national infrastructure, while developing new approaches to address the needs of urban systems and communities around the globe. UW CEE is dedicated to providing students with leading-edge technical skill development and opportunities for hands-on practice to enable them to tackle complex engineering problems in response to changing technological and societal needs.

    Housed in an outstanding university, UW CEE offers one of the world’s premier programs in the field. The UW College of Engineering undergraduate program is ranked #18 and CEE’s graduate programs are ranked #16 for civil engineering and #27 for environmental engineering for 2020, according to U.S. News & World Report.

    The University of Washington College of Engineering

    Mission, Facts, and Stats

    Our mission is to develop outstanding engineers and ideas that change the world.

    Faculty:
    275 faculty (25.2% women)

    Achievements:

    128 NSF Young Investigator/Early Career Awards since 1984
    32 Sloan Foundation Research Awards
    2 MacArthur Foundation Fellows (2007 and 2011)

    A national leader in educating engineers, each year the College turns out new discoveries, inventions and top-flight graduates, all contributing to the strength of our economy and the vitality of our community.

    Engineering innovation

    Engineers drive the innovation economy and are vital to solving society’s most challenging problems. The College of Engineering is a key part of a world-class research university in a thriving hub of aerospace, biotechnology, global health and information technology innovation. Over 50% of UW startups in FY18 came from the College of Engineering.

    Commitment to diversity and access

    The College of Engineering is committed to developing and supporting a diverse student body and faculty that reflect and elevate the populations we serve. We are a national leader in women in engineering; 25.5% of our faculty are women compared to 17.4% nationally. We offer a robust set of diversity programs for students and faculty.

    Research and commercialization

    The University of Washington is an engine of economic growth, today ranked third in the nation for the number of startups launched each year, with 65 companies having been started in the last five years alone by UW students and faculty, or with technology developed here. The College of Engineering is a key contributor to these innovations, and engineering faculty, students or technology are behind half of all UW startups. In FY19, UW received $1.58 billion in total research awards from federal and nonfederal sources.

    u-washington-campus

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world.
    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 11:42 am on May 20, 2022 Permalink | Reply
    Tags: "Is it topological? A new materials database has the answer", Electron band structure, Researchers at MIT and elsewhere have discovered that, in fact, topological materials are everywhere if you know how to look for them., Scientists found that 90 percent of all known crystalline structures contain at least one topological property., The new study was motivated by a desire to speed up the traditional search for topological materials., Topological quantum chemistry, Topology stems from a branch of mathematics that studies shapes that can be manipulated or deformed without losing certain core properties., What will it take to make our electronics smarter, faster and more resilient? One idea is to build them from materials that are topological.

    From The Massachusetts Institute of Technology: “Is it topological? A new materials database has the answer” 

    From The Massachusetts Institute of Technology

    May 19, 2022
    Jennifer Chu

    1
    Searchable tool reveals more than 90,000 known materials with electronic properties that remain unperturbed in the face of disruption. Image: Christine Daniloff, MIT.

    What will it take to make our electronics smarter, faster, and more resilient? One idea is to build them from materials that are topological.

    Topology stems from a branch of mathematics that studies shapes that can be manipulated or deformed without losing certain core properties. A donut is a common example: If it were made of rubber, a donut could be twisted and squeezed into a completely new shape, such as a coffee mug, while retaining a key trait — namely, its center hole, which takes the form of the cup’s handle. The hole, in this case, is a topological trait, robust against certain deformations.
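    The donut picture has a precise mathematical core: the preserved quantity is the surface's genus (its number of holes), an integer that no continuous deformation can change. As a sketch of the standard statement, via the Gauss-Bonnet theorem:

    ```latex
    % For a closed orientable surface S, the Gauss--Bonnet theorem fixes the
    % total curvature by the genus g (the number of holes) alone:
    \int_S K \, dA \;=\; 2\pi\,\chi(S) \;=\; 2\pi\,(2 - 2g)
    % Donut and mug both have g = 1 (so \chi = 0); since g is an integer,
    % it cannot change under continuous deformation, so the hole persists.
    ```

    In the materials setting, the analogous role is played by integer invariants of the electronic band structure rather than of a physical shape.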

    In recent years, scientists have applied concepts of topology to the discovery of materials with similarly robust electronic properties. In 2007, researchers predicted the first electronic topological insulators — materials in which electrons behave in ways that are “topologically protected,” or persistent in the face of certain disruptions.

    Since then, scientists have searched for more topological materials with the aim of building better, more robust electronic devices. Until recently, only a handful of such materials had been identified, and topological materials were therefore assumed to be a rarity.

    Now researchers at MIT and elsewhere have discovered that, in fact, topological materials are everywhere, if you know how to look for them.

    In a paper published today in Science, the team, led by Nicolas Regnault of Princeton University and the École Normale Supérieure Paris, reports harnessing the power of multiple supercomputers to map the electronic structure of more than 96,000 natural and synthetic crystalline materials. They applied sophisticated filters to determine whether and what kind of topological traits exist in each structure.

    Overall, they found that 90 percent of all known crystalline structures contain at least one topological property, and more than 50 percent of all naturally occurring materials exhibit some sort of topological behavior.

    “We found there’s a ubiquity — topology is everywhere,” says Benjamin Wieder, the study’s co-lead, and a postdoc in MIT’s Department of Physics.

    The team has compiled the newly identified materials into a new, freely accessible Topological Materials Database resembling a periodic table of topology. With this new library, scientists can quickly search materials of interest for any topological properties they might hold, and harness them to build ultra-low-power transistors, new magnetic memory storage, and other devices with robust electronic properties.

    The paper’s co-authors include co-lead author Maia Vergniory of the Donostia International Physics Center; Luis Elcoro of the University of the Basque Country; Stuart Parkin and Claudia Felser of the Max Planck Institute; and Andrei Bernevig of Princeton University.

    Beyond intuition

    The new study was motivated by a desire to speed up the traditional search for topological materials.

    “The way the original materials were found was through chemical intuition,” Wieder says. “That approach had a lot of early successes. But as we theoretically predicted more kinds of topological phases, it seemed intuition wasn’t getting us very far.”

    Wieder and his colleagues instead utilized an efficient and systematic method to root out signs of topology, or robust electronic behavior, in all known crystalline structures, also known as inorganic solid-state materials.

    For their study, the researchers looked to the Inorganic Crystal Structure Database, or ICSD, a repository into which researchers enter the atomic and chemical structures of crystalline materials that they have studied. The database includes materials found in nature, as well as those that have been synthesized and manipulated in the lab. The ICSD is currently the largest materials database in the world, containing over 193,000 crystals whose structures have been mapped and characterized.

    The team downloaded the entire ICSD, and after performing some data cleaning to weed out structures with corrupted files or incomplete data, the researchers were left with just over 96,000 processable structures. For each of these structures, they performed a set of calculations based on fundamental knowledge of the relation between chemical constituents, to produce a map of the material’s electronic structure, also known as the electron band structure.

    The team used multiple supercomputers to efficiently carry out the complicated calculations for each structure, then performed a second set of operations to screen each crystalline material for various known topological phases, or persistent electrical behavior.
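    The two-stage pipeline (compute each material's band structure, then screen it with topological filters) can be sketched in miniature. Everything below is hypothetical: `band_structure_of`, `classify_topology` and the toy phase table merely stand in for the first-principles codes and topological-quantum-chemistry machinery the team actually used:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical stand-in for the screening results; the real study derived
    # these labels from computed electron band structures.
    TOY_PHASES = {"NaCl": "trivial",
                  "Bi2Se3": "topological insulator",
                  "TaAs": "Weyl semimetal"}

    def band_structure_of(formula):
        """Placeholder for the expensive supercomputer calculation (stage 1)."""
        return formula

    def classify_topology(bands):
        """Placeholder for the topological screening filters (stage 2)."""
        return TOY_PHASES.get(bands, "unclassified")

    structures = ["NaCl", "Bi2Se3", "TaAs", "Si"]

    # Stage 1 runs in parallel across structures; stage 2 screens each result.
    with ThreadPoolExecutor() as pool:
        bands = list(pool.map(band_structure_of, structures))
    phases = {s: classify_topology(b) for s, b in zip(structures, bands)}

    topological = [s for s, p in phases.items() if p not in ("trivial", "unclassified")]
    print(topological)  # → ['Bi2Se3', 'TaAs']
    ```

    The point of the sketch is the shape of the workflow: an embarrassingly parallel per-structure calculation followed by a cheap filtering pass, which is what makes a 96,000-structure survey tractable.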

    “We’re looking for signatures in the electronic structure in which certain robust phenomena should occur in this material,” explains Wieder, whose previous work involved refining and expanding the screening technique, known as topological quantum chemistry.

    From their high-throughput analysis, the team quickly discovered a surprisingly large number of materials that are naturally topological, without any experimental manipulation, as well as materials that can be manipulated, for instance with light or chemical doping, to exhibit some sort of robust electronic behavior. They also discovered a handful of materials that contained more than one topological state when exposed to certain conditions.

    “Topological phases of matter in 3D solid-state materials have been proposed as venues for observing and manipulating exotic effects, including the interconversion of electrical current and electron spin, the tabletop simulation of exotic theories from high-energy physics, and even, under the right conditions, the storage and manipulation of quantum information,” Wieder notes.

    For experimentalists who are studying such effects, Wieder says the team’s new database now reveals a menagerie of new materials to explore.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and Whitehead Institute.

    Massachusetts Institute of Technology – Haystack Observatory, Westford, Massachusetts, USA. Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia , wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but a combination with elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period the Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. President Howard Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.


    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 10:47 am on May 20, 2022 Permalink | Reply
    Tags: "Puzzling Quantum Scenario Appears Not to Conserve Energy", , , Physicists are divided as to whether the new paradox exposes a genuine violation of the conservation of energy., , , , Superoscillation, Surprising as superoscillation is it doesn’t contradict any laws of physics., The mathematician Emmy Noether proved in 1915 that conserved quantities like energy and momentum spring from symmetries of nature.   

    From Quanta Magazine: “Puzzling Quantum Scenario Appears Not to Conserve Energy” 

    From Quanta Magazine

    May 16, 2022
    Katie McCormick

    The quantum paradox is akin to red light turning green. Credit: Kristina Armitage/Quanta Magazine.

    The quantum physicists Sandu Popescu, Yakir Aharonov and Daniel Rohrlich have been troubled by the same scenario for three decades.

    It started when they wrote about a surprising wave phenomenon called superoscillation in 1990. “We were never able to really tell what exactly was bothering us,” said Popescu, a professor at the University of Bristol (UK). “Since then, every year we come back and we see it from a different angle.”

    Finally, in December 2020, the trio published a paper in PNAS explaining what the problem is: In quantum systems, superoscillation appears to violate the law of conservation of energy. This law, which states that the energy of an isolated system never changes, is more than a bedrock physical principle. It’s now understood to be an expression of the fundamental symmetries of the universe — a “very important part of the edifice of physics,” said Chiara Marletto, a physicist at the University of Oxford (UK).

    Physicists are divided as to whether the new paradox exposes a genuine violation of the conservation of energy. Their attitudes toward the problem depend in part on whether individual experimental outcomes in quantum mechanics should be considered seriously, no matter how improbable they may be. The hope is that by putting in the effort to resolve the puzzle, researchers will be able to clarify some of the most subtle and strange aspects of quantum theory.

    Mirror Trick

    Aharonov has described the scenario in question as akin to opening a box full of red light — low-energy electromagnetic waves — and seeing a high-energy gamma ray shoot out. How can this happen?

    The key ingredient is superoscillation, an effect that seems to contradict what every physics student learns about waves.

    Any wave, no matter how complicated, can be represented as a sum of sine waves of different frequencies. Students learn that a wave can oscillate only as fast as its highest-frequency sine wave component. So combine a bunch of red light, and it should stay red.

    But around 1990, Aharonov and Popescu found that special combinations of sine waves produce regions of the collective wave that wiggle faster than any of the constituents. Their colleague Michael Berry illustrated the power of superoscillation by showing that it’s possible (though impractical) to play Beethoven’s Ninth Symphony by combining only sound waves below 1 hertz — frequencies so low that, individually, they would be imperceptible to the human ear. This rediscovery of superoscillation, which was already known to some signal processing experts, inspired physicists to invent an array of applications, from high-resolution imaging to new radio designs.
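    The effect is easy to reproduce numerically. A minimal sketch, assuming the standard construction f(x) = (cos(x/N) + i·a·sin(x/N))^N associated with Aharonov and Berry (the parameters N and a below are illustrative): every Fourier component of f has frequency at most 1, yet near x = 0 the phase of f advances like a·x, a local frequency of a.

```python
import numpy as np

# f(x) = (cos(x/N) + i*a*sin(x/N))^N expands into sinusoids whose
# frequencies all lie in [-1, 1], yet near x = 0 its phase advances
# like a*x -- a local frequency of a, here 4x the fastest component.
N, a = 20, 4.0

def f(x):
    return (np.cos(x / N) + 1j * a * np.sin(x / N)) ** N

# Estimate the local frequency at x = 0 from the phase derivative.
h = 1e-6
local_freq = (np.angle(f(h)) - np.angle(f(-h))) / (2 * h)
print(round(local_freq, 3))  # 4.0: the wave superoscillates
```

    Larger N widens the superoscillatory window, at the price of an exponentially small amplitude there, which is exactly why the paradoxical outcome in the thought experiment below is so rare.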

    Surprising as superoscillation is, it doesn’t contradict any laws of physics. But when Aharonov, Popescu and Rohrlich applied the concept to quantum mechanics, they encountered a situation that’s downright paradoxical.

    In quantum mechanics, a particle is described by a wave function, a kind of wave whose varying amplitude conveys the probability of finding the particle in different locations. Wave functions can be expressed as sums of sine waves, just as other waves can.

    A wave’s energy is proportional to its frequency. This means that, when a wave function is a combination of multiple sine waves, the particle is in a “superposition” of energies. When its energy is measured, the wave function seems to mysteriously “collapse” to one of the energies in the superposition.
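    The proportionality is E = hf, which puts numbers on the red-light-versus-gamma-ray picture; a back-of-the-envelope sketch (the two frequencies are illustrative values, not from the paper):

```python
# Photon energy E = h*f: a red photon carries a couple of eV;
# a gamma-ray photon carries tens of thousands of times more.
h = 6.626e-34          # Planck constant, J*s
eV = 1.602e-19         # joules per electronvolt

f_red = 4.3e14         # ~430 THz: red light
f_gamma = 2.4e19       # illustrative ~100 keV-scale gamma ray

E_red = h * f_red / eV
E_gamma = h * f_gamma / eV
print(round(E_red, 2))           # ~1.78 eV
print(round(E_gamma / E_red))    # ratio ~56,000: same as the frequency ratio
```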

    Popescu, Aharonov and Rohrlich exposed the paradox using a thought experiment. Suppose you have a photon trapped inside a box, and this photon’s wave function has a superoscillatory region. Quickly put a mirror in the photon’s path right where the wave function superoscillates, keeping the mirror there for a short time. If the photon happens to be close enough to the mirror during that time, the mirror will bounce the photon out of the box.

    Remember we’re dealing with the photon’s wave function here. Since the bounce doesn’t constitute a measurement, the wave function doesn’t collapse. Instead, it splits in two: Most of the wave function remains in the box, but the small, rapidly oscillating piece near where the mirror was inserted leaves the box and heads toward the detector.

    Because this superoscillatory piece has been plucked from the rest of the wave function, it is now identical to a photon of much higher energy. When this piece hits the detector, the entire wave function collapses. When it does, there’s a small but real chance that the detector will register a high-energy photon. It’s like the gamma ray emerging from a box of red light. “This is shocking,” said Popescu.

    The clever measurement scheme somehow imparts more energy to the photon than any of its wave function’s components would have allowed. Where did the energy come from?

    Legal Ambiguities

    The mathematician Emmy Noether proved in 1915 that conserved quantities like energy and momentum spring from symmetries of nature. Energy is conserved because of “time-translation symmetry”: the rule that the equations governing particles stay the same from moment to moment. (Energy is the stable quantity that represents this sameness.) Notably, energy isn’t conserved in situations where gravity warps the fabric of space-time, since this warping changes the physics in different places and times, nor is it conserved on cosmological scales, where the expansion of space introduces time-dependence. But for something like light in a box, physicists agree: Time-translation symmetry (and thus energy conservation) should hold.

    Applying Noether’s theorem to the equations of quantum mechanics gets complicated, though.

    In classical mechanics, you can always check the initial energy of a system, let it evolve, then check the final energy, and you’ll find that the energy stays constant. But measuring the energy of a quantum system necessarily disturbs it by collapsing its wave function, preventing it from evolving as it otherwise would have. So the only way to check that energy is conserved as a quantum system evolves is to do so statistically: Run an experiment many times, checking the initial energy half the time and the final energy the other half. The statistical distribution of energies before and after the system’s evolution should match.
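    This statistical statement is easy to check in a toy model; a sketch (the two-level spectrum and initial state below are hypothetical) showing that unitary evolution under H leaves the energy distribution untouched:

```python
import numpy as np

# Statistical energy conservation in miniature: a state evolving under
# U = exp(-iHt) keeps the same probability for each energy outcome.
H_energies = np.array([1.0, 3.0])     # toy two-level spectrum (hbar = 1)
psi0 = np.array([0.6, 0.8])           # superposition of both energies

t = 2.5
U = np.diag(np.exp(-1j * H_energies * t))   # exp(-iHt) in the energy basis
psi_t = U @ psi0

p_before = np.abs(psi0) ** 2          # P(E=1), P(E=3) if measured initially
p_after = np.abs(psi_t) ** 2          # ... or measured after the evolution
print(p_before, np.allclose(p_before, p_after))  # [0.36 0.64] True
```

    The evolution only rotates the phases of the energy amplitudes, so the Born-rule probabilities are unchanged, which is why the paradox only bites when individual runs are taken seriously.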

    Popescu says the thought experiment, while perplexing, is compatible with this version of conservation of energy. Because the superoscillatory region is such a small part of the photon’s wave function, the photon has a very low probability of being found there — only on rare occasions will the “shocking” photon emerge from the box. Over the course of many runs, the energy budget will stay balanced. “We do not claim that energy conservation in the … statistical version is incorrect,” he said. “But all we claim is that that is not the end of the story.”

    The problem is, the thought experiment suggests that energy conservation can be violated in individual instances — something many physicists object to. David Griffiths, a professor emeritus at Reed College in Oregon and author of standard textbooks on quantum mechanics, maintains that energy must be conserved in each individual experimental run (even if this is normally hard to check).

    Marletto agrees. In her opinion, if it looks as if your experiment is violating this conservation law, you’re not looking hard enough. The excess energy must come from somewhere. “There are a number of ways in which this alleged violation of the energy conservation could come about,” she said, “one of which is not fully taking into account the environment.”

    Popescu and his colleagues think they have accounted for the environment; they suspected that the photon gains its extra energy from the mirror, but they calculated that the mirror’s energy does not change.

    The search continues for a resolution to the apparent paradox, and with it, a better understanding of quantum theory. Such puzzles have been fruitful for physicists in the past. As John Wheeler once said, “No progress without a paradox!”

    “If you ignore such questions,” Popescu said, “you’re never really going to … understand what quantum mechanics is.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 10:04 am on May 20, 2022 Permalink | Reply
    Tags: "Brookhaven Lab Launches New Quantum Network Facility", A rooftop construction effort is in progress and the optical system is in active development., , , Building these long-distance entanglement-on-demand capabilities will have an enormous impact on the scientific community., Enhanced optical interferometry, , , Quantum networking is only in its infancy., Quantum networking offers a significantly more precise rapid and secure form of communication that will touch several key industries and applications that affect our day-to-day lives., , The facility is equipped with the state-of-the-art quantum networking equipment necessary to build these long-distance entanglement distribution networks., The Facility’s fiber network infrastructure is currently being expanded to cover more nodes across Long Island and the New York City metropolitan area., The new facility already possesses one of the most advanced regional quantum networks in the U.S.   

    From The DOE’s Brookhaven National Laboratory: “Brookhaven Lab Launches New Quantum Network Facility” 

    From The DOE’s Brookhaven National Laboratory

    May 18, 2022
    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    New user facility will provide infrastructure and capabilities for benchmarking performance, validating concepts, and expediting the development of the quantum internet ecosystem.

    Scientists testing equipment that will be used in the Quantum Free Space Link project, which will transmit entangled photons between a building on the Brookhaven Lab site and another more than 20 kilometers away on the Stony Brook University campus.

    The U.S. Department of Energy’s Brookhaven National Laboratory has launched a new Quantum Network Facility that will serve scientists from across the country and around the world working to advance the exciting new field of quantum communication networks.

    “Building a wide-spread, quantum-based, global communication network—the Quantum Internet—has the potential to be among the most important technological frontiers of the 21st century,” said Gabriella Carini, director of Brookhaven Lab’s Instrumentation Division. “This new facility will provide the tools and capabilities researchers need to make large-scale quantum entanglement distribution networks a reality.”

    In quantum mechanics, the physical properties of entangled particles (typically photons) remain associated, even when separated by vast distances. As a result, when measurements are performed on one side, they also affect the other. Scientists can take advantage of these properties to create a secure, long-distance quantum information network. The new facility already possesses one of the most advanced regional quantum networks in the U.S., with a team at Brookhaven Lab and Stony Brook University-SUNY recently completing the nation’s longest quantum network, spanning 98 miles and connecting the institutions’ two campuses.
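    The correlations such a network distributes can be sketched with a Bell pair, the workhorse entangled state; a minimal simulation (the state and measurement basis are illustrative, not Brookhaven’s protocol) in which the two distant outcomes always agree:

```python
import numpy as np

# A Bell pair (|00> + |11>)/sqrt(2): sampling joint measurements in the
# shared basis, the two distant parties' bits agree on every run.
rng = np.random.default_rng(7)
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)   # basis |00>,|01>,|10>,|11>

probs = np.abs(phi_plus) ** 2                    # Born-rule probabilities
outcomes = rng.choice(4, size=10_000, p=probs)   # joint measurement results
a, b = outcomes // 2, outcomes % 2               # the two parties' bits
print((a == b).mean())  # 1.0 -- perfectly correlated outcomes
```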

    Building these long-distance entanglement-on-demand capabilities will have an enormous impact on the scientific community. They will enable a whole new range of applications, such as enhanced optical interferometry, large line-of-sight arrays of entangled sensors, quantum networks of atomic clocks, and distributed quantum computing. The facility is equipped with the state-of-the-art quantum networking equipment necessary to build these long-distance entanglement distribution networks.

    Quantum networking is only in its infancy. Many challenges remain before the full potential of large quantum communication systems can be realized. Brookhaven’s Quantum Network Facility was formed to address all of these challenges and more. As an experimental facility, it is open to the worldwide user community. Experimental opportunities, expanding from these research efforts, will be focused on the development of foundational quantum devices, including entanglement generation and detection, and characterization of portable quantum memories with a focus on scalability through room-temperature operation.

    “One of the key aspects of this facility is that it provides the ability to integrate these quantum hardware building blocks with existing real-life inter-city fibers and characterize their performance once they are integrated with the current internet,” said Julian Martinez-Rincon, scientist in the Instrumentation Division’s Quantum Systems group. “The goal of these efforts is to perform long-distance entanglement experiments targeted at implementing new scientific applications such as distributed quantum sensing and computing, as well as to develop algorithms and protocols to remotely control a regional quantum internet testbed.”

    The facility is supported entirely by its users, including resident research programs and external users through research partnership agreements.

    The long-term vision for the facility is for it to become one of the first instances of a quantum-repeater-assisted, entanglement distribution network that will be capable of heralding and maintaining entanglement at all of its quantum nodes. The Facility’s fiber network infrastructure is currently being expanded to cover more nodes across Long Island and the New York City metropolitan area. This includes multi-purpose quantum nodes at Brookhaven Lab, Stony Brook University, and in New York City, assisted by entanglement generation and swapping nodes located in Commack and Westbury.

    At the same time, facility researchers are working to create a “free space optical link,” which will provide a direct, site-to-site entanglement distribution channel between Brookhaven Lab and Stony Brook University through open air, independent of underground infrastructure. A rooftop construction effort is in progress and the optical system is in active development, with a smaller-scale system already demonstrating a connection through free space and back with excellent efficiency.

    “Quantum networking offers a significantly more precise, rapid, and secure form of communication that will touch several key industries and applications that affect our day-to-day lives,” said Carini. “Brookhaven Lab and our partners at Stony Brook University are excited to be at the forefront of this revolution in technology.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Brookhaven Campus

    One of ten national laboratories overseen and primarily funded by the DOE Office of Science, The DOE’s Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Nanomaterials
    Energy research
    Nonproliferation
    Structural biology
    Accelerator physics

    Operation

    Brookhaven National Lab was originally owned by the Atomic Energy Commission (US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University and Battelle Memorial Institute. From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.

    Foundations

    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology to have a facility near Boston, Massachusetts. Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University, Cornell University, Harvard University, Johns Hopkins University, Massachusetts Institute of Technology, Princeton University, University of Pennsylvania, University of Rochester, and Yale University.

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    BNL Cosmotron 1952-1966.

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    BNL Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in 3 Nobel prizes, including the discovery of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two intersecting proton storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II [below].

    BNL National Synchrotron Light Source.

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC) [below] was put forward. Construction was funded in 1991 and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is as of 2010 the second-highest-energy collider after the Large Hadron Collider (CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, it was announced by Paul Dabbar, undersecretary of the US Department of Energy Office of Science, that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] (US) as the future Electron-Ion Collider (EIC) in the United States.

    Brookhaven Lab Electron-Ion Collider (EIC) to be built inside the tunnel that currently houses the RHIC.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 approval from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions ranging from light to heavy, including polarized protons, with a polarized electron facility to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.

    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.

    BNL National Synchrotron Light Source II, Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.

    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.

    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University-SUNY.

    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four detectors located at The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung] (CH) [CERN] Large Hadron Collider (LHC).

    The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH)[CERN] map.

    Iconic view of the European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear] [Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung](CH) [CERN] ATLAS detector.

    The ATLAS detector is currently operating at The European Organization for Nuclear Research [La Organización Europea para la Investigación Nuclear][Organisation européenne pour la recherche nucléaire] [Europäische Organisation für Kernforschung] (CH) [CERN] near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the Spallation Neutron Source at DOE’s Oak Ridge National Laboratory, Tennessee.

    DOE’s Oak Ridge National Laboratory Spallation Neutron Source annotated.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN), located at a nuclear power plant approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.

    Daya Bay Neutrino Experiment (CN) nuclear power plant, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.


    BNL Center for Functional Nanomaterials.

    BNL National Synchrotron Light Source II.

    BNL NSLS II.

    BNL Relativistic Heavy Ion Collider Campus.

    BNL/RHIC Star Detector.

    BNL/RHIC PHENIX detector.

     
  • richardmitnick 9:57 pm on May 19, 2022 Permalink | Reply
    Tags: , "Puzzling features deep in Earth's interior illuminated by high resolution imaging", , Hotspot volcanoes, Scientists use seismic waves from earthquakes to see beneath Earth's surface—the echoes and shadows of these waves revealing radar-like images of deep interior topography., The new research could also help scientists understand what sits beneath and gives rise to volcanic chains like the Hawaiian Islands., The researchers used the latest numerical modeling methods to reveal kilometer-scale structures at the core-mantle boundary., The team's observations add to a growing body of evidence that Earth's deep interior is just as variable as it's surface., , Until recently images of the structures at the core-mantle boundary-an area of key interest for studying our planet's internal heat flow-have been grainy and difficult to interpret.   

    From The University of Cambridge (UK) via phys.org : “Puzzling features deep in Earth’s interior illuminated by high resolution imaging” 

    U Cambridge bloc

    From The University of Cambridge (UK)

    via

    phys.org

    May 19, 2022

    1
    Credit: CC0 Public Domain.

    New research led by the University of Cambridge is the first to take a detailed image of an unusual pocket of rock at the boundary layer with Earth’s core, some three thousand kilometers beneath the surface.

    The enigmatic area of rock, which is located almost directly beneath the Hawaiian Islands, is one of several ultra-low velocity zones—so-called because earthquake waves slow to a crawl as they pass through them.

    The research, published today in Nature Communications, is the first to reveal the complex internal variability of one of these pockets in detail, shedding light on the landscape of Earth’s deep interior and the processes operating within it.

    “Of all Earth’s deep interior features, these are the most fascinating and complex. We’ve now got the first solid evidence to show their internal structure—it’s a real milestone in deep earth seismology,” said lead author Zhi Li, Ph.D. student at Cambridge’s Department of Earth Sciences.

    Earth’s interior is layered like an onion: at the center sits the iron-nickel core, surrounded by a thick layer known as the mantle, and on top of that a thin outer shell—the crust we live on. Although the mantle is solid rock, it is hot enough to flow extremely slowly. These internal convection currents feed heat to the surface, driving the movement of tectonic plates and fuelling volcanic eruptions.

    Scientists use seismic waves from earthquakes to see beneath Earth’s surface—the echoes and shadows of these waves revealing radar-like images of deep interior topography. But until recently, images of the structures at the core-mantle boundary, an area of key interest for studying our planet’s internal heat flow, have been grainy and difficult to interpret.

    The researchers used the latest numerical modeling methods to reveal kilometer-scale structures at the core-mantle boundary. According to co-author Dr. Kuangdai Leng, who developed the methods while at the University of Oxford, “We are really pushing the limits of modern high-performance computing for elastodynamic simulations, taking advantage of wave symmetries unnoticed or unused before.” Leng, who is currently based at the Science and Technology Facilities Council, said that this means they can improve the resolution of the images by an order of magnitude compared to previous work.

    They observed a 40% reduction in the speed of seismic waves traveling at the base of the ultra-low velocity zone beneath Hawaii. According to the authors, this supports existing proposals that the zone contains much more iron than the surrounding rocks—meaning it is denser and more sluggish. “It’s possible that this iron-rich material is a remnant of ancient rocks from Earth’s early history or even that iron might be leaking from the core by an unknown means,” said project lead, Dr. Sanne Cottaar from Cambridge Earth Sciences.
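
    As a rough back-of-the-envelope illustration (the layer thickness and reference wave speed below are assumed values for the sketch, not figures from the study), a 40% reduction in wave speed translates directly into extra travel time for waves crossing the zone:

```python
# Hypothetical numbers for illustration only: the 40% speed reduction is from
# the article, but the thickness and reference speed here are assumptions.
def travel_time(thickness_km, speed_km_s):
    """Seconds needed for a wave to cross a layer at constant speed."""
    return thickness_km / speed_km_s

reference_speed = 7.0                      # km/s, assumed speed outside the zone
slow_speed = reference_speed * (1 - 0.40)  # 40% slower inside the zone
thickness = 20.0                           # km, assumed zone thickness

t_ref = travel_time(thickness, reference_speed)   # ~2.86 s
t_slow = travel_time(thickness, slow_speed)       # ~4.76 s
print(f"extra delay crossing the zone: {t_slow - t_ref:.2f} s")  # ~1.90 s
```

    Even a fairly thin slow layer therefore accumulates a delay of seconds, which is part of why such zones stand out in seismic travel-time observations.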

    The new research could also help scientists understand what sits beneath and gives rise to volcanic chains like the Hawaiian Islands. Scientists have started to notice a correlation between the location of the descriptively-named hotspot volcanoes, which include Hawaii and Iceland, and the ultra-low velocity zones at the base of the mantle. The origin of hotspot volcanoes has been widely debated, but the most popular theory suggests that plume-like structures bring hot mantle material all the way from the core-mantle boundary to the surface.

    With images of the ultra-low velocity zone beneath Hawaii now in hand, the team can also gather rare physical evidence from what is likely the root of the plume feeding Hawaii. Their observation of dense, iron-rich rock beneath Hawaii would support surface observations. “Basalts erupting from Hawaii have anomalous isotope signatures which could point to either an early-Earth origin or core leaking; it means some of this dense material piled up at the base must be dragged to the surface,” said Cottaar.

    More of the core-mantle boundary now needs to be imaged to understand if all surface hotspots have a pocket of dense material at the base. Where and how the core-mantle boundary can be targeted does depend on where earthquakes occur, and where seismometers are installed to record the waves.

    The team’s observations add to a growing body of evidence that Earth’s deep interior is just as variable as its surface. “These low velocity zones are one of the most intricate features we see at extreme depths—if we expand our search we are likely to see ever-increasing levels of complexity, both structural and chemical, at the core-mantle boundary,” said Li.

    They now plan to apply their techniques to enhance the resolution of imaging of other pockets at the core-mantle boundary, as well as mapping new zones. Eventually they hope to map the geological landscape across the core-mantle boundary and understand its relationship with the dynamics and evolutionary history of our planet.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Cambridge Campus

    The University of Cambridge (UK) [legally The Chancellor, Masters, and Scholars of the University of Cambridge] is a collegiate public research university in Cambridge, England. Founded in 1209 Cambridge is the second-oldest university in the English-speaking world and the world’s fourth-oldest surviving university. It grew out of an association of scholars who left the University of Oxford (UK) after a dispute with townsfolk. The two ancient universities share many common features and are often jointly referred to as “Oxbridge”.

    Cambridge is formed from a variety of institutions which include 31 semi-autonomous constituent colleges and over 150 academic departments, faculties and other institutions organised into six schools. All the colleges are self-governing institutions within the university, each controlling its own membership and with its own internal structure and activities. All students are members of a college. Cambridge does not have a main campus and its colleges and central facilities are scattered throughout the city. Undergraduate teaching at Cambridge is organised around weekly small-group supervisions in the colleges – a feature unique to the Oxbridge system. These are complemented by classes, lectures, seminars, laboratory work and occasionally further supervisions provided by the central university faculties and departments. Postgraduate teaching is provided predominantly centrally.

    Cambridge University Press, a department of the university, is the oldest university press in the world and currently the second largest university press in the world. Cambridge Assessment, also a department of the university, is one of the world’s leading examining bodies and provides assessment to over eight million learners globally every year. The university also operates eight cultural and scientific museums, including the Fitzwilliam Museum, as well as a botanic garden. Cambridge’s libraries – of which there are 116 – hold a total of around 16 million books, around nine million of which are in Cambridge University Library, a legal deposit library. The university is home to – but independent of – the Cambridge Union, the world’s oldest debating society. The university is closely linked to the development of the high-tech business cluster known as “Silicon Fen”. It is the central member of Cambridge University Health Partners, an academic health science centre based around the Cambridge Biomedical Campus.

    By both endowment size and consolidated assets Cambridge is the wealthiest university in the United Kingdom. In the fiscal year ending 31 July 2019, the central university – excluding colleges – had a total income of £2.192 billion of which £592.4 million was from research grants and contracts. At the end of the same financial year the central university and colleges together possessed a combined endowment of over £7.1 billion and overall consolidated net assets (excluding “immaterial” historical assets) of over £12.5 billion. It is a member of numerous associations and forms part of the ‘golden triangle’ of English universities.

    Cambridge has educated many notable alumni including eminent mathematicians; scientists; politicians; lawyers; philosophers; writers; actors; monarchs and other heads of state. As of October 2020 121 Nobel laureates; 11 Fields Medalists; 7 Turing Award winners; and 14 British prime ministers have been affiliated with Cambridge as students; alumni; faculty or research staff. University alumni have won 194 Olympic medals.

    History

    By the late 12th century the Cambridge area already had a scholarly and ecclesiastical reputation due to monks from the nearby bishopric church of Ely. However, it was an incident at Oxford which most likely led to the establishment of the university: three Oxford scholars were hanged by the town authorities for the death of a woman without consulting the ecclesiastical authorities, who would normally take precedence (and pardon the scholars) in such a case but were at that time in conflict with King John. Fearing more violence from the townsfolk, scholars from the University of Oxford started to move away to cities such as Paris; Reading; and Cambridge. Subsequently enough scholars remained in Cambridge to form the nucleus of a new university when it had become safe enough for academia to resume at Oxford. In order to claim precedence, it is common for Cambridge to trace its founding to the 1231 charter from Henry III granting it the right to discipline its own members (ius non-trahi extra) and an exemption from some taxes; Oxford was not granted similar rights until 1248.

    A bull in 1233 from Pope Gregory IX gave graduates from Cambridge the right to teach “everywhere in Christendom”. After Cambridge was described as a studium generale in a letter from Pope Nicholas IV in 1290 and confirmed as such in a bull by Pope John XXII in 1318 it became common for researchers from other European medieval universities to visit Cambridge to study or to give lecture courses.

    Foundation of the colleges

    The colleges at the University of Cambridge were originally an incidental feature of the system. No college is as old as the university itself. The colleges were endowed fellowships of scholars. There were also institutions without endowments called hostels. The hostels were gradually absorbed by the colleges over the centuries; but they have left some traces, such as the name of Garret Hostel Lane.

    Hugh Balsham, Bishop of Ely, founded Peterhouse, Cambridge’s first college, in 1284. Many colleges were founded during the 14th and 15th centuries, but colleges continued to be established until modern times. There was a gap of 204 years between the founding of Sidney Sussex in 1596 and that of Downing in 1800. The most recently established college is Robinson, built in the late 1970s. However, Homerton College only achieved full university college status in March 2010, making it the newest full college (it was previously an “Approved Society” affiliated with the university).

    In medieval times many colleges were founded so that their members would pray for the souls of the founders and were often associated with chapels or abbeys. The colleges’ focus changed in 1536 with the Dissolution of the Monasteries. Henry VIII ordered the university to disband its Faculty of Canon Law and to stop teaching “scholastic philosophy”. In response, colleges changed their curricula away from canon law and towards the classics; the Bible; and mathematics.

    Nearly a century later the university was at the centre of a Protestant schism. Many nobles, intellectuals and even commoners saw the ways of the Church of England as too similar to the Catholic Church and felt that it was used by the Crown to usurp the rightful powers of the counties. East Anglia was the centre of what became the Puritan movement. In Cambridge the movement was particularly strong at Emmanuel; St Catharine’s Hall; Sidney Sussex; and Christ’s College. They produced many “non-conformist” graduates who, greatly influenced by social position or preaching left for New England and especially the Massachusetts Bay Colony during the Great Migration decade of the 1630s. Oliver Cromwell, Parliamentary commander during the English Civil War and head of the English Commonwealth (1649–1660), attended Sidney Sussex.

    Modern period

    After the Cambridge University Act 1856 formalised the organisational structure of the university, the study of many new subjects was introduced, e.g. theology, history and modern languages. Resources necessary for new courses in the arts, architecture and archaeology were donated by Viscount Fitzwilliam of Trinity College, who also founded the Fitzwilliam Museum. In 1847 Prince Albert was elected Chancellor of the University of Cambridge after a close contest with the Earl of Powis. Albert used his position as Chancellor to campaign successfully for reformed and more modern university curricula, expanding the subjects taught beyond the traditional mathematics and classics to include modern history and the natural sciences. Between 1896 and 1902 Downing College sold part of its land to build the Downing Site, with new scientific laboratories for anatomy, genetics, and Earth sciences. During the same period the New Museums Site was erected, including the Cavendish Laboratory, which has since moved to the West Cambridge Site, and other departments for chemistry and medicine.

    The University of Cambridge began to award PhD degrees in the first third of the 20th century. The first Cambridge PhD in mathematics was awarded in 1924.

    In the First World War 13,878 members of the university served and 2,470 were killed. Teaching and the fees it earned came almost to a stop and severe financial difficulties followed. As a consequence the university first received systematic state support in 1919 and a Royal Commission appointed in 1920 recommended that the university (but not the colleges) should receive an annual grant. Following the Second World War the university saw a rapid expansion of student numbers and available places; this was partly due to the success and popularity gained by many Cambridge scientists.

     
  • richardmitnick 9:28 pm on May 19, 2022 Permalink | Reply
    Tags: "Light-Controlled Reactions at the Nanoscale", , , MPG Institute for Quantum Optics [MPG Institut für Quantenoptik] (DE), , Reaction nanoscopy, The events in the nanocosmos can be controlled with the help of electromagnetic fields., The researchers used strong femtosecond-laser pulses to generate localized fields on the surfaces of isolated nanoparticles., There is hustle and bustle on the surface of nanoparticles. Molecules dock; dissolve and change their location. All this drives chemical reactions and changes matter and even gives rise to new materia   

    From MPG Institute for Quantum Optics [MPG Institut für Quantenoptik] (DE) : “Light-Controlled Reactions at the Nanoscale” 

    Max Planck Institut für Quantenoptik (DE)

    From MPG Institute for Quantum Optics [MPG Institut für Quantenoptik] (DE)

    May 19, 2022

    Dr. Boris Bergues
    Head of the Strong-Field Dynamics Group Laboratory for Attosecond Physics
    +49 89 32905-330
    boris.bergues@mpq.mpg.de
    LMU Munich and Max Planck Institute for Quantum Optics

    Thorsten Naeser
    Attosecond Physics
    Tel +49 89 32905-124
    Fax +49 89 32905-649
    thorsten.naeser@mpq.mpg.de

    Physicists at the Max Planck Institute of Quantum Optics and Ludwig Maximilian University of Munich [Ludwig-Maximilians-Universität München] (DE), in collaboration with Stanford University, have for the first time used laser light to control the location of light-induced reactions on the surface of nanoparticles.

    Controlling strong electromagnetic fields on nanoparticles is the key to triggering targeted molecular reactions on their surfaces. Such control over strong fields is achieved via laser light. Although laser-induced formation and breaking of molecular bonds on nanoparticle surfaces have been observed in the past, nanoscopic optical control of surface reactions has not yet been achieved. An international team led by Dr. Boris Bergues and Prof. Matthias Kling at Ludwig-Maximilians-Universität (LMU) and the Max Planck Institute of Quantum Optics (MPQ) in collaboration with Stanford University has now closed this gap. The physicists determined for the first time the location of light-induced molecular reactions on the surface of isolated silicon dioxide nanoparticles using ultrashort laser pulses.

    1
    A nanoparticle in the field of a femtosecond laser pulse with tailored waveform and polarization. The controlled enhancement of the field in specific nanoscopic regions of the nanoparticle (yellow spots) induces site-selective photochemical reactions of the molecules adsorbed on the surface. Imaging of the molecular fragments emitted from these regions enables all-optical control of the reaction sites with nanometer resolution. Illustration: RMT.Bergues.

    There is hustle and bustle on the surface of nanoparticles. Molecules dock, dissolve and change their location. All this drives chemical reactions, changes matter and even gives rise to new materials. The events in the nanocosmos can be controlled with the help of electromagnetic fields. This has now been demonstrated by a team led by Dr. Boris Bergues and Prof. Matthias Kling from the Ultrafast Electronics and Nanophotonics Group. To this end, the researchers used strong femtosecond laser pulses to generate localized fields on the surfaces of isolated nanoparticles. A femtosecond is one millionth of a billionth of a second.

    Using so-called reaction nanoscopy, a new technique recently developed in the same group, the physicists were able to image the reaction site and birthplace of molecular fragments on the surface of silica nanoparticles – at a resolution better than 20 nanometers. The scientists achieved this nanoscopic spatial control, which can reach even higher resolution, by superimposing the fields of two laser pulses of different color with controlled waveform and polarization. To do so, they had to set the time delay between the two pulses with attosecond accuracy. An attosecond is a thousand times shorter still than a femtosecond. When interacting with this tailored light, the surface of the nanoparticles and the molecules adsorbed there were ionized at targeted sites, leading to the dissociation of the molecules into different fragments.
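
    The role of the relative phase between the two colors can be pictured with a toy model (a minimal sketch; the amplitudes, frequency and phase values below are illustrative assumptions, not the experimental parameters): adding a second-harmonic field to a fundamental breaks the up/down symmetry of the combined waveform, and shifting the two-color phase flips which direction the strongest field points.

```python
import numpy as np

# Toy two-color field E(t) = E1*cos(w*t) + E2*cos(2*w*t + phi).
# All parameters are illustrative, not experimental values.
def two_color_field(t, e1=1.0, e2=0.5, omega=1.0, phi=0.0):
    return e1 * np.cos(omega * t) + e2 * np.cos(2 * omega * t + phi)

t = np.linspace(-np.pi, np.pi, 20001)  # one fundamental period

# Sum of the strongest positive and negative excursions: zero for a
# symmetric waveform, nonzero when one direction dominates.
def asymmetry(phi):
    field = two_color_field(t, phi=phi)
    return field.max() + field.min()

a0 = asymmetry(0.0)      # positive: field peaks more strongly in one direction
api = asymmetry(np.pi)   # phase shifted by pi: the asymmetry reverses sign
```

    Loosely speaking, steering this kind of waveform asymmetry with attosecond-level timing between the pulses is what makes it possible to choose where the field becomes strong enough to ionize molecules on the particle surface.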

    “Molecular surface reactions on nanoparticles play a fundamental role in nanocatalysis. They could be a key to clean energy production, in particular via photocatalytic water splitting,” explains Matthias Kling. “Our results also pave the way for tracking photocatalytic reactions on nanoparticles not only with nanometer spatial resolution, but also with femtosecond temporal resolution. This will provide detailed insights into the surface processes on the natural spatial and temporal scales of their dynamics,” adds Boris Bergues.

    The scientists anticipate that this promising new approach can be applied to numerous complex isolated nanostructured materials.

    Science paper:
    Optica

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Research at the MPG Institute for Quantum Optics [Max Planck Institut für Quantenoptik ] (DE)
    Light can behave as an electromagnetic wave or a shower of particles that have no mass, called photons, depending on the conditions under which it is studied or used. Matter, on the other hand, is composed of particles, but it can actually exhibit wave-like properties, giving rise to many astonishing phenomena in the microcosm.

    At our institute we explore the interaction of light and quantum systems, exploiting the two extreme regimes of the wave-particle duality of light and matter. On the one hand we handle light at the single photon level where wave-interference phenomena differ from those of intense light beams. On the other hand, when cooling ensembles of massive particles down to extremely low temperatures we suddenly observe phenomena that go back to their wave-like nature. Furthermore, when dealing with ultrashort and highly intense light pulses comprising trillions of photons we can completely neglect the particle properties of light. We take advantage of the large force that the rapidly oscillating electromagnetic field exerts on electrons to steer their motion within molecules or accelerate them to relativistic energies.

    MPG Society for the Advancement of Science [MPG Gesellschaft zur Förderung der Wissenschaften e. V.] is a formally independent non-governmental and non-profit association of German research institutes founded in 1911 as the Kaiser Wilhelm Society and renamed the Max Planck Society in 1948 in honor of its former president, theoretical physicist Max Planck. The society is funded by the federal and state governments of Germany as well as other sources.

    According to its primary goal, the MPG Society supports fundamental research in the natural, life and social sciences, the arts and humanities in its 83 (as of January 2014) MPG Institutes. The society has a total staff of approximately 17,000 permanent employees, including 5,470 scientists, plus around 4,600 non-tenured scientists and guests. Society budget for 2015 was about €1.7 billion.

    The MPG Institutes focus on excellence in research. The MPG Society has a world-leading reputation as a science and technology research organization, with 33 Nobel Prizes awarded to their scientists, and is generally regarded as the foremost basic research organization in Europe and the world. In 2013, the Nature Publishing Index placed the MPG institutes fifth worldwide in terms of research published in Nature journals (after Harvard University, The Massachusetts Institute of Technology, Stanford University and The National Institutes of Health). In terms of total research volume (unweighted by citations or impact), the Max Planck Society is only outranked by The Chinese Academy of Sciences [中国科学院](CN), The Russian Academy of Sciences [Росси́йская акаде́мия нау́к](RU) and Harvard University. The Thomson Reuters-Science Watch website placed the MPG Society as the second leading research organization worldwide following Harvard University, in terms of the impact of the produced research over science fields.

    The MPG Society and its predecessor Kaiser Wilhelm Society hosted several renowned scientists in their fields, including Otto Hahn, Werner Heisenberg, and Albert Einstein.

    History

    The organization was established in 1911 as the Kaiser Wilhelm Society, or Kaiser-Wilhelm-Gesellschaft (KWG), a non-governmental research organization named for the then German emperor. The KWG was one of the world’s leading research organizations; its board of directors included scientists like Walther Bothe, Peter Debye, Albert Einstein, and Fritz Haber. In 1946, Otto Hahn assumed the position of President of KWG, and in 1948, the society was renamed the Max Planck Society (MPG) after its former President (1930–37) Max Planck, who died in 1947.

    The MPG Society has a world-leading reputation as a science and technology research organization. In 2006, the Times Higher Education Supplement rankings of non-university research institutions (based on international peer review by academics) placed the MPG Society as No.1 in the world for science research, and No.3 in technology research (behind AT&T Corporation and The DOE’s Argonne National Laboratory).

    The domain mpg.de attracted at least 1.7 million visitors annually by 2008 according to a Compete.com study.

    MPG Institutes and research groups

    The MPG Society consists of over 80 research institutes. In addition, the society funds a number of Max Planck Research Groups (MPRG) and International Max Planck Research Schools (IMPRS). The purpose of establishing independent research groups at various universities is to strengthen the required networking between universities and institutes of the Max Planck Society.
    The research units are primarily located across Europe with a few in South Korea and the U.S. In 2007, the Society established its first non-European centre, with an institute on the Jupiter campus of Florida Atlantic University (US) focusing on neuroscience.
    The MPG Institutes operate independently from, though in close cooperation with, the universities, and focus on innovative research which does not fit into the university structure due to their interdisciplinary or transdisciplinary nature or which require resources that cannot be met by the state universities.

    Internally, MPG Institutes are organized into research departments headed by directors such that each MPI has several directors, a position roughly comparable to anything from full professor to department head at a university. Other core members include Junior and Senior Research Fellows.

    In addition, there are several associated institutes:

    International Max Planck Research Schools


    Together with the Association of Universities and other Education Institutions in Germany, the Max Planck Society established numerous International Max Planck Research Schools (IMPRS) to promote junior scientists:

    • Cologne Graduate School of Ageing Research, Cologne
    • International Max Planck Research School for Intelligent Systems, at the Max Planck Institute for Intelligent Systems located in Tübingen and Stuttgart
    • International Max Planck Research School on Adapting Behavior in a Fundamentally Uncertain World (Uncertainty School), at the Max Planck Institutes for Economics, for Human Development, and/or Research on Collective Goods
    • International Max Planck Research School for Analysis, Design and Optimization in Chemical and Biochemical Process Engineering, Magdeburg
    • International Max Planck Research School for Astronomy and Cosmic Physics, Heidelberg at the MPI for Astronomy
    • International Max Planck Research School for Astrophysics, Garching at the MPI for Astrophysics
    • International Max Planck Research School for Complex Surfaces in Material Sciences, Berlin
    • International Max Planck Research School for Computer Science, Saarbrücken
    • International Max Planck Research School for Earth System Modeling, Hamburg
    • International Max Planck Research School for Elementary Particle Physics, Munich, at the MPI for Physics
    • International Max Planck Research School for Environmental, Cellular and Molecular Microbiology, Marburg at the Max Planck Institute for Terrestrial Microbiology
    • International Max Planck Research School for Evolutionary Biology, Plön at the Max Planck Institute for Evolutionary Biology
    • International Max Planck Research School “From Molecules to Organisms”, Tübingen at the Max Planck Institute for Developmental Biology
    • International Max Planck Research School for Global Biogeochemical Cycles, Jena at the Max Planck Institute for Biogeochemistry
    • International Max Planck Research School on Gravitational Wave Astronomy, Hannover and Potsdam MPI for Gravitational Physics
    • International Max Planck Research School for Heart and Lung Research, Bad Nauheim at the Max Planck Institute for Heart and Lung Research
    • International Max Planck Research School for Infectious Diseases and Immunity, Berlin at the Max Planck Institute for Infection Biology
    • International Max Planck Research School for Language Sciences, Nijmegen
    • International Max Planck Research School for Neurosciences, Göttingen
    • International Max Planck Research School for Cognitive and Systems Neuroscience, Tübingen
    • International Max Planck Research School for Marine Microbiology (MarMic), joint program of the Max Planck Institute for Marine Microbiology in Bremen, the University of Bremen, the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven, and the Jacobs University Bremen
    • International Max Planck Research School for Maritime Affairs, Hamburg
    • International Max Planck Research School for Molecular and Cellular Biology, Freiburg
    • International Max Planck Research School for Molecular and Cellular Life Sciences, Munich
    • International Max Planck Research School for Molecular Biology, Göttingen
    • International Max Planck Research School for Molecular Cell Biology and Bioengineering, Dresden
    • International Max Planck Research School Molecular Biomedicine, program combined with the ‘Graduate Programm Cell Dynamics And Disease’ at the University of Münster and the Max Planck Institute for Molecular Biomedicine
    • International Max Planck Research School on Multiscale Bio-Systems, Potsdam
    • International Max Planck Research School for Organismal Biology, at the University of Konstanz and the Max Planck Institute for Ornithology
    • International Max Planck Research School on Reactive Structure Analysis for Chemical Reactions (IMPRS RECHARGE), Mülheim an der Ruhr, at the Max Planck Institute for Chemical Energy Conversion
    • International Max Planck Research School for Science and Technology of Nano-Systems, Halle at Max Planck Institute of Microstructure Physics
    • International Max Planck Research School for Solar System Science at the University of Göttingen hosted by MPI for Solar System Research
    • International Max Planck Research School for Astronomy and Astrophysics, Bonn, at the MPI for Radio Astronomy (formerly the International Max Planck Research School for Radio and Infrared Astronomy)
    • International Max Planck Research School for the Social and Political Constitution of the Economy, Cologne
    • International Max Planck Research School for Surface and Interface Engineering in Advanced Materials, Düsseldorf at Max Planck Institute for Iron Research GmbH
    • International Max Planck Research School for Ultrafast Imaging and Structural Dynamics, Hamburg

    Max Planck Schools

    • Max Planck School of Cognition
    • Max Planck School Matter to Life
    • Max Planck School of Photonics

    Max Planck Center

    • The Max Planck Centre for Attosecond Science (MPC-AS), POSTECH Pohang
    • The Max Planck POSTECH Center for Complex Phase Materials, POSTECH Pohang

    Max Planck Institutes

    Among others:
    • Max Planck Institute for Neurobiology of Behavior – caesar, Bonn
    • Max Planck Institute for Aeronomics in Katlenburg-Lindau was renamed to Max Planck Institute for Solar System Research in 2004;
    • Max Planck Institute for Biology in Tübingen was closed in 2005;
    • Max Planck Institute for Cell Biology in Ladenburg b. Heidelberg was closed in 2003;
    • Max Planck Institute for Economics in Jena was renamed to the Max Planck Institute for the Science of Human History in 2014;
    • Max Planck Institute for Ionospheric Research in Katlenburg-Lindau was renamed to Max Planck Institute for Aeronomics in 1958;
    • Max Planck Institute for Metals Research, Stuttgart
    • Max Planck Institute of Oceanic Biology in Wilhelmshaven was renamed to Max Planck Institute of Cell Biology in 1968 and moved to Ladenburg 1977;
    • Max Planck Institute for Psychological Research in Munich merged into the Max Planck Institute for Human Cognitive and Brain Sciences in 2004;
    • Max Planck Institute for Protein and Leather Research in Regensburg moved to Munich 1957 and was united with the Max Planck Institute for Biochemistry in 1977;
    • Max Planck Institute for Virus Research in Tübingen was renamed as Max Planck Institute for Developmental Biology in 1985;
    • Max Planck Institute for the Study of the Scientific-Technical World in Starnberg (1970–1981, closed), directed by Carl Friedrich von Weizsäcker and Jürgen Habermas.
    • Max Planck Institute for Behavioral Physiology
    • Max Planck Institute of Experimental Endocrinology
    • Max Planck Institute for Foreign and International Social Law
    • Max Planck Institute for Physics and Astrophysics
    • Max Planck Research Unit for Enzymology of Protein Folding
