Updates from August, 2018

  • richardmitnick 8:45 am on August 16, 2018
    Tags: Microwaves from Solar Flares

    From AAS NOVA: “Microwaves from Solar Flares” 


    From AAS NOVA

    15 August 2018
    Susanna Kohler

    A solar flare erupting on the limb of the Sun (look just below center on the right edge of the Sun in this extreme-ultraviolet image from the Solar Dynamics Observatory) in September 2017 makes for a perfect test case to see what new information we can learn from microwave observations. [NASA/Solar Dynamics Observatory]

    NASA/SDO

    The Sun is a rather well-studied star, so it’s always exciting when we get the opportunity to observe it in a new way. One such opportunity is upcoming, via the Parker Solar Probe that just launched last week.

    NASA Parker Solar Probe Plus


    But while we wait for that new view of the Sun, we have another one to examine: the Sun in microwaves.

    The spectrum over time for the first ~1 hr of the solar flare SOL2017-09-10, shown at different wavelengths. [Gary et al. 2018]

    Peering into Flares

    In our efforts to better understand solar flares — sudden eruptions that occur when magnetic energy is abruptly released from the Sun, sending a burst of particles and radiation into space — we’ve observed these phenomena across a wide range of wavelengths. One wavelength regime known to be valuable for understanding the physics of solar flares is that of microwaves, which are emitted by high-energy electrons that are accelerated as energy is released in the flare.

    Until now, the vast majority of microwave studies of solar flares have relied on data from the Nobeyama Radioheliograph in Japan, which observes the Sun at just two fixed frequencies that lie well above the peak of the microwave spectrum.

    Nobeyama Millimeter Array Radioheliograph, located near Minamimaki, Nagano at an elevation of 1350m

    Nobeyama Radio Telescope, located in the Nobeyama highlands in Nagano, Japan

    This spectral regime explores only regions of high magnetic field strength.
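    As a rough guide (the relation and numbers below are standard textbook values, not taken from the paper), the microwaves come out as gyrosynchrotron emission at harmonics of the electron gyrofrequency, which scales directly with the magnetic field strength:

    $$\nu_{ce} = \frac{eB}{2\pi m_e} \approx 2.8\ \mathrm{MHz}\times\left(\frac{B}{1\ \mathrm{G}}\right),$$

    so fixed high-frequency channels like Nobeyama's 17 and 34 GHz preferentially sample strong-field regions, while lower microwave frequencies probe more weakly magnetized plasma.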

    What could we learn about solar flares from the lower-frequency microwaves emitted from more weakly magnetized regions? A newly upgraded array, the Expanded Owens Valley Solar Array (EOVSA) in California, is now helping us to answer this question.

    NJIT's Expanded Owens Valley Solar Array (EOVSA), now comprising 15 antennas, near Big Pine, California

    One of the antennas in the Owens Valley Solar Array. [Dale E. Gary]

    An Upgraded Array

    After its recent upgrade, which concluded in April 2017, EOVSA now consists of 15 antennas — which produce imaging and spectroscopy data that span the microwave spectrum, including lower microwave frequencies. In a recent study led by Dale Gary (New Jersey Institute of Technology), a team of scientists has presented the first example of microwave imaging spectroscopy from EOVSA, demonstrating the powerful new observations this technology makes possible.

    As a target to test the array’s capabilities, Gary and collaborators selected a solar flare that occurred on the limb of the Sun — i.e., the edge of its disk, as seen from Earth — in September of 2017: SOL2017-09-10.

    EOVSA microwave data plotted in color over a 5’ x 5’ AIA image of the Sun during SOL2017-09-10. RHESSI hard X-ray data is shown in contours. The EOVSA data reveals the presence of high-energy electrons in multiple locations: in small reconnecting loops, well above these bright loops, and, at the north and south, associated with the legs of a much larger loop. [Gary et al. 2018]

    High-Energy Electrons Everywhere!

    High-frequency microwave observations of flares like SOL2017-09-10 had already demonstrated the presence of high-energy electrons in regions of high magnetic fields, like the small closed magnetic loops anchored in the Sun’s surface. But EOVSA’s view of the whole microwave spectrum has revealed that the spatial extent of high-energy electrons is much larger than we thought — these energetic electrons also lie well above the small reconnected loops, in the space between the loops and an erupting rope of magnetic flux associated with the flare.

    This discovery indicates the necessity of some amendments to our standard model for the physics of solar flares. Though these early results from EOVSA may be preliminary, they clearly demonstrate the powerful capabilities of this new technology. We can look forward to more new observations of the Sun in the future, continuing to advance our understanding of how energy is released from our nearest star.

    Citation

    Dale E. Gary et al 2018 ApJ 863 83. http://iopscience.iop.org/article/10.3847/1538-4357/aad0ef/meta

    Related journal articles
    _________________________________________________
    See the full article for further references with links.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    AAS Mission and Vision Statement

    The mission of the American Astronomical Society is to enhance and share humanity’s scientific understanding of the Universe.

    The Society, through its publications, disseminates and archives the results of astronomical research. The Society also communicates and explains our understanding of the universe to the public.
    The Society facilitates and strengthens the interactions among members through professional meetings and other means. The Society supports member divisions representing specialized research and astronomical interests.
    The Society represents the goals of its community of members to the nation and the world. The Society also works with other scientific and educational societies to promote the advancement of science.
    The Society, through its members, trains, mentors and supports the next generation of astronomers. The Society supports and promotes increased participation of historically underrepresented groups in astronomy.
    The Society assists its members to develop their skills in the fields of education and public outreach at all levels. The Society promotes broad interest in astronomy, which enhances science literacy and leads many to careers in science and engineering.

    Adopted June 7, 2009

     
  • richardmitnick 8:10 am on August 16, 2018
    Tags: Samy Movassaghi

    From CSIROscope: Women in STEM – “The key to a STEM career? Curiosity, persistence and a knack for problem solving!” Samy Movassaghi


    From CSIROscope

    16 August 2018
    Ali Green

    On top of Samy’s work as a researcher, she is often sought after as a spokesperson for inspiring young people to take up a career in tech.

    It’s National Science Week and we’ve been taking a closer look at science, technology, engineering and maths (STEM) careers and pathways – answering burning questions and debunking myths like: what kinds of opportunities can be found in a STEM career path? What current and future jobs rely on STEM skills? What kinds of people pursue STEM careers? Can I only become a physicist if I study physics?

    To answer some of these questions, we’re getting up close and personal with Telecommunications Engineer, 2017 Google Research Fellowship recipient and ICT Student of the Year, Samy Movassaghi to hear about some of the cool things she’s doing in her job, what sparked her interest in STEM, and the pathway that led to her becoming a STEM professional. Samy even has some tips for eager young STEM enthusiasts!

    Samy developed an algorithm inspired by fireflies to help solve a communications network challenge.

    Tell us a bit about what you’re working on at the moment and how you got there.

    Samy: I work on wearable biomarker sensors, or “insideables” that can track our health. Specifically, communications between a network of intelligent, low-power, micro and nano-technology sensors which can be placed on or in the body (including in the blood stream) to monitor vitals and provide timely data for medical diagnoses and action. One potential advantage of this technology is early detection of medical conditions, resulting in major improvements to quality of life. These networks can be expanded beyond healthcare for use in sport, entertainment and many other areas with their main characteristic being to improve the user’s quality of life.

    Apparently some of this work was inspired by fireflies?

    Yeah, that’s right. I designed a self-organisation algorithm inspired by the way fireflies stimulate each other to communicate (flash their lights) which allows the coexisting networks to autonomously configure themselves when communicating. The difficulty is, these sensors, which are all battery powered, are placed on and in the body, making constant recharging and replacement impractical. A better solution would be to extend their battery life as much as possible. So, like a swarm of fireflies, my protocol allows the sensors to communicate with each other and power up and adapt their transmissions when needed, minimising the drain on their batteries.
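    Samy's actual protocol isn't spelled out in this interview, but the firefly idea itself is easy to sketch. Below is a minimal, illustrative pulse-coupled-oscillator simulation in Python (in the spirit of the classic Mirollo-Strogatz firefly model, with invented parameters, not her algorithm): each "sensor" runs its own clock, flashes when the clock wraps around, and every flash it hears nudges its neighbours' clocks forward, so the whole network drifts into flashing, and therefore transmitting, in unison.

```python
import math
import random

# Illustrative firefly-style synchronisation sketch (the classic Mirollo-Strogatz
# pulse-coupled oscillator model), NOT the actual protocol from the interview.
# Each node's phase runs from 0 to 1; at 1 it "flashes" and resets to 0. Hearing a
# flash nudges a node forward through a concave "state" function, so nodes absorb
# one another and eventually flash (and could transmit) in unison.

N = 10          # number of sensor nodes
B = 3.0         # curvature of the state function
EPS = 0.1       # coupling strength of a single received flash
DT = 0.005      # time step, as a fraction of one oscillation period
PERIODS = 100   # how many oscillation periods to simulate

def f(phi):      # phase -> state (concave, increasing)
    return math.log(1.0 + (math.exp(B) - 1.0) * phi) / B

def f_inv(x):    # state -> phase (inverse of f)
    return (math.exp(B * x) - 1.0) / (math.exp(B) - 1.0)

def nudge(phi):  # effect on a node of hearing one flash
    return f_inv(min(1.0, f(phi) + EPS))

random.seed(1)
phases = [random.random() for _ in range(N)]

for _ in range(int(PERIODS / DT)):
    phases = [p + DT for p in phases]
    to_fire = [i for i, p in enumerate(phases) if p >= 1.0]
    fired = set()
    while to_fire:                        # one flash can trigger a cascade
        for i in to_fire:
            phases[i] = 0.0
            fired.add(i)
        for _ in to_fire:                 # everyone else hears each flash
            phases = [p if i in fired else nudge(p) for i, p in enumerate(phases)]
        to_fire = [i for i, p in enumerate(phases) if p >= 1.0 and i not in fired]

# After enough cycles the phases collapse toward a single common value:
print("final phase spread:", round(max(phases) - min(phases), 4))
```

    Synchronising this way is one route to the battery savings described above: radios only need to wake up around the common flash times instead of listening continuously.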

    And what was your pathway to this job?

    Well, I did a PhD in telecommunications engineering. During this time I did a couple of internships and won a few awards like the ICT student of the year award from the Australian Computer Society (ACS) at the Digital Disruptor Awards, a Google Fellowship award that is funding me to go to Mountain View at the end of this month, being featured as part of the CSIROSeven campaign promoting STEM careers, Business Innovation in IT award from Nasscom Australia and some others!

    Wow that’s impressive! What were all these awards for?

    So they were mainly for my proposals and research work during my PhD studies, showcased across various competitions, and also the work that I had accomplished by participating in solving challenges at a number of hackathons.

    What type of personality traits or interests do you think lend themselves to a career in computers and tech?

    So this work is mainly about persistence and problem solving. For me, it’s just like wanting to solve a brain teaser or find my way through a maze – I like the challenge of finding a way to solve a problem.

    What’s the earliest step you remember taking on your education path towards a career in information technology (IT)?

    As a child I was quite lucky that my parents were very open to us exploring what we wanted to do. They would constantly buy me all these electronic starter kits, and I would put them together and then watch them work, progressing to more complex projects – and that’s how it all started. I was constantly inspired by remote controls, or anything electronic. I would pull them apart trying to understand what those circuits and components were all about and how they led to certain functionalities. I decided electronic engineering was my natural calling and so I pursued a bachelor degree to understand more around that. Later on I decided to look into the communication between circuits, which led to further research in telecommunications engineering through my Masters and PhD Studies.

    And for any young people looking to pursue a career similar to yours, what are your recommendations?

    Nowadays, even at the very early ages in primary school, I can see there are a lot of coding challenges and different competitions that really encourage students to pursue a career in STEM and get exposed to coding or building new applications for certain challenges within a specific area of demand.

    ____________________
    Over the next month there are a number of different events encouraging young people to get into ICT, one of which is the international Bebras computational thinking challenge. The Bebras challenge is designed to enhance students’ problem solving skills and prepare them for the jobs of the future. It’s a free classroom resource for teachers and runs 3-14 September. Visit the link below to take the Bebras Challenge.
    ____________________

    How urgently should we be encouraging people to take up careers in STEM or ICT?

    With the recent advancements in the Internet of Things, machine learning, data science, and big data, a career in ICT is very promising for one’s future. As humans collect more and more data, having an IT background helps you to better understand the science behind the data and how it can be used to make decisions and improve one’s quality of life. There are so many opportunities to marry IT knowledge with all sorts of other STEM disciplines – medical, environment and design for example.

    Do you have any final words of advice for someone thinking about pursuing a STEM career?

    In my case, I’m really happy that I chose a career in STEM because it has given me the opportunity to explore my world in another dimension. With all the advancements happening around us, my STEM background gives me a better understanding of our changing world, and makes me feel like I can make a contribution. That is very motivating and quite exciting.

    I’d recommend that students interested in a STEM career investigate the different competitions and challenges available to them. It’s a great way to sharpen and test your STEM skills set while having fun.

    ____________________

    How will your computational thinking skills prepare you for the jobs of the future?
    Take the Bebras Challenge

    ____________________

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    SKA/ASKAP radio telescope at the Murchison Radio-astronomy Observatory (MRO) in Mid West region of Western Australia

    So what can we expect these new radio projects to discover? We have no idea, but history tells us that they are almost certain to deliver some major surprises.

    Making these new discoveries may not be so simple. Gone are the days when astronomers could just notice something odd as they browsed their tables and graphs.

    Nowadays, astronomers are more likely to be distilling their answers from carefully-posed queries to databases containing petabytes of data. Human brains are just not up to the job of making unexpected discoveries in these circumstances, and instead we will need to develop “learning machines” to help us discover the unexpected.

    With the right tools and careful insight, who knows what we might find.

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 7:46 am on August 16, 2018
    Tags: ASTERIA cubesat, Planet transit. NASA/Ames

    From MIT News and NASA/JPL-Caltech : “Tiny ASTERIA satellite achieves a first for CubeSats” 

    MIT News


    From MIT News and NASA/JPL-Caltech

    August 15, 2018
    Lauren Hinkel
    Mary Knapp

    Members of the ASTERIA team prepare the petite satellite for its journey to space. Photo courtesy of NASA/JPL-Caltech

    MIT/NASA JPL/Caltech ASTERIA cubesat

    This plot shows the transit lightcurve of 55 Cancri e observed by ASTERIA. Image courtesy of NASA/JPL-Caltech

    Measurement of an exoplanet transit demonstrates proof of concept that small spacecraft can perform high-precision photometry.

    Planet transit. NASA/Ames

    A miniature satellite called ASTERIA (Arcsecond Space Telescope Enabling Research in Astrophysics) has measured the transit of a previously discovered super-Earth exoplanet, 55 Cancri e. This finding shows that miniature satellites, like ASTERIA, are capable of making sensitive detections of exoplanets via the transit method.

    While observing 55 Cancri e, which is known to transit, ASTERIA measured a minuscule change in brightness, about 0.04 percent, when the super-Earth crossed in front of its star. This transit measurement is the first of its kind for CubeSats (the class of satellites to which ASTERIA belongs), which are about the size of a briefcase and hitch a ride to space as secondary payloads on rockets used for larger spacecraft.
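    As a rough sanity check (the radii below are typical published values for this system, not numbers from the article), the transit depth is just the ratio of the projected areas of planet and star:

    $$\frac{\Delta F}{F} = \left(\frac{R_p}{R_\star}\right)^2 \approx \left(\frac{1.9\,R_\oplus}{0.95\,R_\odot}\right)^2 \approx 3\times10^{-4},$$

    i.e. a dip of a few hundredths of a percent, in line with the ~0.04 percent that ASTERIA measured.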

    The ASTERIA team presented updates and lessons learned about the mission at the Small Satellite Conference in Logan, Utah, last week.

    The ASTERIA project is a collaboration between MIT and NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, California, funded through JPL’s Phaeton Program. The project started in 2010 as an undergraduate class project in 16.83/12.43 (Space Systems Engineering), involving a technology demonstration of astrophysical measurements using a Cubesat, with a primary goal of training early-career engineers.

    The ASTERIA mission — of which Department of Earth, Atmospheric and Planetary Sciences Class of 1941 Professor of Planetary Sciences Sara Seager is the Principal Investigator — was designed to demonstrate key technologies, including very stable pointing and thermal control for making extremely precise measurements of stellar brightness in a tiny satellite. Earlier this year, ASTERIA achieved pointing stability of 0.5 arcseconds and thermal stability of 0.01 degrees Celsius. These technologies are important for precision photometry, i.e., the measurement of stellar brightness over time.

    Precision photometry, in turn, provides a way to study stellar activity, transiting exoplanets, and other astrophysical phenomena. Several MIT alumni have been involved in ASTERIA’s development from the beginning including Matthew W. Smith PhD ’14, Christopher Pong ScD ’14, Alessandra Babuscia PhD ’12, and Mary Knapp PhD ’18. Brice-Olivier Demory, a professor at the University of Bern and a former EAPS postdoc who is also a member of the ASTERIA science team, performed the data reduction that revealed the transit.

    ASTERIA’s success demonstrates that CubeSats can perform big science in a small package. This finding has earned ASTERIA the honor of “Mission of the Year,” which was awarded at the SmallSat conference. The honor is presented annually to the mission that has demonstrated a significant improvement in the capability of small satellites, which weigh less than 150 kilograms. Eligible missions have launched, established communication, and acquired results from on-orbit after Jan. 1, 2017.

    Now that ASTERIA has proven that it can measure exoplanet transits, it will continue observing two bright, nearby stars to search for previously unknown transiting exoplanets. Additional funding for ASTERIA operations was provided by the Heising-Simons Foundation.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 4:26 pm on August 15, 2018
    Tags: Alaskan M=6.4 earthquake

    From temblor: “Large earthquake in the Arctic National Wildlife Refuge raises questions about new oil drilling leases” 


    From temblor

    August 14, 2018
    David Jacobson, M.Sc.
    Ross Stein, Ph.D

    This photo shows a herd of porcupine caribou within the Arctic National Wildlife Refuge. (Photo from: U.S. Fish and Wildlife Service)

    Large quake strikes Northern Alaska

    Over the weekend, Alaska’s North Slope was struck by the largest earthquake ever recorded in the region. The M=6.4 shock occurred within the Arctic National Wildlife Refuge, an area with protected status currently under threat from companies that seek to drill for oil and gas in the region. While few people inhabit this northern region, oil production workers around Prudhoe Bay to the north felt the quake. Fortunately there were no reports of damage to any structures or oil pipelines.

    This map from the Alaska Earthquake Center shows the location of Sunday’s mainshock, as well as recorded aftershocks.

    Potential for larger quakes

    Sunday’s quake struck underneath the Sadlerochit Mountains north of the Brooks Range. These mountains are an upwarped fold, likely caused by an underlying ‘blind’ thrust fault that has steadily uplifted the fold above the coastal plain. However, while a thrust fault is likely responsible for the formation of these mountains, Sunday’s quake was strike-slip in nature. According to the Alaska Earthquake Center operated by the University of Alaska Fairbanks and the USGS, strike-slip events are common in the Brooks Range.

    Active uplift of the Sadlerochit Mountains is evident from the ‘Wind Gap’ on the eastern fold, where a stream that formerly crossed the fold was defeated by the fold uplift, and now carries no water north to the coast. The ‘Water Gap’ on the western fold has been able to incise into the fold as rapidly as the fold has uplifted, and so it still carries water northward. The major rivers to the west and east of the fold axis are both deflected by the fold. Folds (‘anticlines’) of these kinds are almost always produced by blind thrust faults. Since folds trap oil deposits, they are often the target of oil and gas drilling. (Geological interpretation by Temblor)

    We suspect that the M=6.4 quake and its principal aftershocks struck on a ‘tear fault’ along the thrust. Based on the fold length, the thrust itself would have a dimension of 50 x 40 km, and so is capable of a M~7.3 quake, much larger than the 12 August event. Earthquakes on blind thrust faults caused the M=6.7 Coalinga, CA, quake in 1983, and the M=7.3 El Asnam, Algeria, quake in 1980.

    A surprising earthquake

    While a strike-slip fault rupturing in this region is not considered surprising, the magnitude of Sunday’s quake is. Prior to the M=6.4, the largest quake ever recorded in the region was a M=5.2 in 1995. So, this quake was over 50 times greater than the previous largest quake. Because of this, state seismologist Mike West said that, “it’s safe to say this earthquake will cause a re-evaluation of the seismic potential of that area.”
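    That “over 50 times” figure follows from the standard magnitude-energy scaling (a textbook relation, not a calculation quoted from the sources): radiated seismic energy grows by a factor of about 10^1.5 ≈ 32 per magnitude unit, so

    $$\frac{E_{M6.4}}{E_{M5.2}} \approx 10^{\,1.5\,(6.4-5.2)} = 10^{1.8} \approx 63.$$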

    This map shows the location of oil fields in Alaska’s North Slope region. The approximate location of Sunday’s M=6.4 earthquake is also shown. Given the distance between the oil fields and the earthquake, the M=6.4 was not induced.

    The Global Earthquake Activity Rate (GEAR) model, which is available in Temblor, further supports the inference that Sunday’s quake was unexpectedly large. This model uses global strain rates and the last 40 years of seismicity to forecast the likely earthquake magnitude in your lifetime anywhere on earth. From the model, which can be viewed here, one can see that the model doesn’t pick up a large earthquake risk in the region. So, quakes on the blind thrust faults are uncommon, but possible.

    Arctic National Wildlife Refuge under siege

    Any changes to the seismic hazard of the North Slope could potentially impact drilling operations currently in place, as well as aspirations to drill in the Wildlife Refuge. Only last month, the Interior Department expedited an environmental review of the impacts that leasing part of the Arctic National Wildlife Refuge for oil drilling could have. The administration is rapidly moving forward as it seeks to open the coastal plain to energy exploration. Environmentalists are concerned that drilling could significantly impact the polar bears, caribou, and waterfowl in the refuge, which has enjoyed government protection for decades. This quake underscores another drilling risk: induced earthquakes that could increase the shaking in the region, as has happened in Oklahoma since extensive drilling began there in about 2003. According to the Washington Post, should the Arctic Refuge leases be approved, two plots of land, each of 400,000 acres, would be open to drilling by 2024.

    800,000 acres within the Arctic National Wildlife Refuge are at risk of being exposed to oil and gas drilling. (Photo by: Florian Schulz)

    References (no links)
    USGS
    Alaska Earthquake Center
    Yahoo News
    Washington Post

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Earthquake Alert

    Earthquake Network project

    Earthquake Network is a research project which aims at developing and maintaining a crowdsourced smartphone-based earthquake warning system at a global level. Smartphones made available by the population are used to detect the earthquake waves using the on-board accelerometers. When an earthquake is detected, an earthquake warning is issued in order to alert the population not yet reached by the damaging waves of the earthquake.

    The project started on January 1, 2013 with the release of the Android application of the same name, Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015

    Meet The Quake-Catcher Network


    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that run on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
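    QCN's exact trigger criteria aren't described here; the sketch below is a generic short-term/long-term average (STA/LTA) detector of the kind commonly used for this job, with made-up window lengths and threshold, just to illustrate what flagging "strong new motions" on a stream of accelerometer samples can look like.

```python
import random
from collections import deque

# Generic STA/LTA ("short-term average over long-term average") trigger sketch.
# Illustrative only; QCN's actual trigger logic may differ, and the window lengths
# and threshold below are made up. The idea: compare recent shaking to the
# background level, and flag "strong new motion" when the ratio jumps.

STA_LEN = 10       # short-term window, in samples (e.g. 0.2 s at 50 Hz)
LTA_LEN = 500      # long-term window, in samples (e.g. 10 s at 50 Hz)
THRESHOLD = 4.0    # send a trigger when STA/LTA exceeds this

def sta_lta_triggers(samples):
    """Yield the indices of samples at which a trigger would be sent."""
    sta = deque(maxlen=STA_LEN)
    lta = deque(maxlen=LTA_LEN)
    for i, accel in enumerate(samples):
        sta.append(abs(accel))
        lta.append(abs(accel))
        if len(lta) < LTA_LEN:                 # wait for a full background estimate
            continue
        ratio = (sum(sta) / len(sta)) / max(sum(lta) / len(lta), 1e-9)
        if ratio > THRESHOLD:
            yield i

# Tiny synthetic test: quiet sensor noise, then a burst of much stronger shaking.
random.seed(0)
signal = [random.gauss(0.0, 0.01) for _ in range(1000)]     # background noise
signal += [random.gauss(0.0, 0.2) for _ in range(100)]      # "earthquake" arrives
hits = list(sta_lta_triggers(signal))
print("first trigger at sample:", hits[0] if hits else None)
```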

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC is a leader in the fields of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing, developed at UC Berkeley.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. The Quake-Catcher Network (QCN) links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U. S. Geological Survey (USGS) along with a coalition of State and university partners is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.
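    Those warning times follow from the speed gap between the two wave types (typical crustal velocities assumed here, not values quoted by ShakeAlert): with P-waves travelling at roughly 6 km/s and S-waves at roughly 3.5 km/s, a site a distance d from the source gets a head start of about

    $$t_{\mathrm{warn}} \approx d\left(\frac{1}{v_S} - \frac{1}{v_P}\right) \approx 100\ \mathrm{km}\times\left(\frac{1}{3.5\ \mathrm{km/s}} - \frac{1}{6\ \mathrm{km/s}}\right) \approx 12\ \mathrm{s}$$

    at 100 km, minus the few seconds the system needs to detect the quake and issue the alert.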

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California, joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if a connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.

    Authorities

    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach
    rdegroot@usgs.gov
    626-583-7225

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

     
  • richardmitnick 4:02 pm on August 15, 2018
    Tags: What Was It Like When We First Made Protons And Neutrons?

    From Ethan Siegel: “What Was It Like When We First Made Protons And Neutrons?” 

    From Ethan Siegel
    Aug 15, 2018

    In the earliest stages of the Universe, before there were protons or neutrons, we had a quark-gluon plasma.

    Quark gluon plasma. Duke University

    The internal structure of a proton, with quarks, gluons, and quark spin shown. The nuclear force acts like a spring, with negligible force when unstretched but large, attractive forces when stretched to large distances. (BROOKHAVEN NATIONAL LABORATORY)

    The story of our cosmic history is one of an expanding and cooling Universe. As we progressed from a hot, dense, uniform state to a cold, sparse, clumpy one, a number of momentous events happened throughout our cosmic history. At the moment of the hot Big Bang, the Universe was filled with all sorts of ultra-high energy particles, antiparticles, and quanta of radiation, moving at or close to the speed of light.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    On the other hand, today, we have a Universe filled with stars, galaxies, gas, dust, and many other phenomena that are too low in energy to have existed in the early Universe. Once things cooled enough so that the Higgs gave mass to the Universe, you might think that protons and neutrons would immediately form. But they couldn’t exist right away. Here’s the story of how they came to be.

    At very high temperatures and densities, we have a free, unbound, quark-gluon plasma. At lower temperatures and densities, we have much more stable hadrons: protons and neutrons. (BNL/RHIC)

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    In the heat of the early Universe, but after the fundamental particles have obtained a rest mass, we have every particle-antiparticle combination that’s energetically possible popping in-and-out of existence. There are:

    quarks and antiquarks,
    leptons and antileptons,
    neutrinos and antineutrinos,
    as well as the gauge bosons,

    all of which exist so long as there’s enough energy (E) to create these particles of given masses (m) via Einstein’s E = mc². Particles get mass just 100 picoseconds (10^-10 s) after the hot Big Bang begins, but there are no protons or neutrons yet.
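    As a rough illustration of that threshold (the numbers are standard particle-physics values, not Siegel's): a thermal bath can keep pair-producing a species of mass m only while k_B T ≳ mc², so for the bottom quark, with mc² ≈ 4.2 GeV,

    $$T \gtrsim \frac{mc^2}{k_B} \approx \frac{4.2\times10^{9}\ \mathrm{eV}}{8.6\times10^{-5}\ \mathrm{eV/K}} \approx 5\times10^{13}\ \mathrm{K},$$

    which is why the heaviest quarks and leptons are the first to vanish as the Universe cools, as described below.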

    The early Universe was full of matter and radiation, and was so hot and dense that it prevented all composite particles, like protons and neutrons from stably forming for the first fraction-of-a-second. (RHIC COLLABORATION, BROOKHAVEN)

    Instead, the Universe is so hot and dense that what we have is known as a quark-gluon plasma. The reason for this is counterintuitive, if the only forces you’re familiar with are gravity and electromagnetism. In those cases, the forces get stronger in magnitude the closer you bring two particles. Halve the distance between two electric charges and the force quadruples between them; halve the distance between two masses and the force might even more-than-quadruple, as General Relativity dictates.

    But take two quarks, antiquarks, or a quark-antiquark combination, for example, and halve the distance between them, and the strength of the strong nuclear force that binds them together does something very different. It doesn’t quadruple. It doesn’t even double. Instead, the force between them drops.

    At high energies (small distances), the strong force’s interaction strength drops to zero. At large distances, it increases rapidly. This is the idea of asymptotic freedom, which has been experimentally confirmed to great precision. (S. BETHKE; PROG.PART.NUCL.PHYS.58:351–386,2007 https://arxiv.org/abs/hep-ex/0606035)

    This is weird, but this is how atomic nuclei and the strong nuclear force actually work. Below a certain distance, the force between any two particles with a color-charge (quarks and gluons) actually drops to zero, only increasing as they get farther apart. At the high temperatures and densities present at these very early times, the nuclear force is too weak to bind anything together. As a result, particles simply zip around, colliding with each other, creating new ones and annihilating away.
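    The figure caption above cites the measured running of the coupling; the standard one-loop QCD expression (a textbook result, not derived in this post) shows the same trend explicitly, with the coupling shrinking as the momentum transfer Q grows (that is, as distances shrink):

    $$\alpha_s(Q^2) \approx \frac{12\pi}{(33 - 2n_f)\,\ln\!\left(Q^2/\Lambda^2\right)},$$

    where n_f is the number of active quark flavours and Λ ≈ 0.2 GeV marks the scale at which the coupling blows up and quarks become confined.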

    But as the Universe expands, it both cools and gets less dense. And as time goes on, it becomes harder to make the more massive particles.

    The production of matter/antimatter pairs (left) from pure energy is a completely reversible reaction (right), with matter/antimatter annihilating back to pure energy. This creation-and-annihilation process, which obeys E = mc², is the only known way to create and destroy matter or antimatter. At low energies, particle-antiparticle creation is suppressed. (DMITRI POGOSYAN / UNIVERSITY OF ALBERTA)

    In addition, with the exception of the lightest quarks (up and down, plus anti-up and anti-down) and the lightest charged lepton (the electron, plus the positron), all the other particles are unstable to radioactive decay. As the picoseconds turn into nanoseconds, and the nanoseconds pile up into microseconds, the heavier particles stop being created and disappear from our Universe. Bottom/anti-bottom quarks disappear first, followed by the tau and anti-tau leptons. Then the charm/anti-charm quarks go, followed by the strange/anti-strange quarks.

    The rest masses of the fundamental particles in the Universe determine when and under what conditions they can be created. The more massive a particle is, the less time it can spontaneously be created for in the early Universe. (FIG. 15–04A FROM UNIVERSE-REVIEW.CA)

    As we lose more and more particle/antiparticle combinations, they create greater numbers of the lighter particle/antiparticle pairs that can still exist, but also greater numbers of photons. Every time we produce two photons from particle/antiparticle annihilation, it slows down the cooling of the Universe a little bit. The Universe is getting cooler and sparser, but it’s also changing what’s in it. In the early stages, only a small-but-substantial percentage of the particles around are photons, neutrinos, and antineutrinos. But as these particles start to disappear, these fractions rise higher and higher.

    In the early Universe, the full suite of particles and their antimatter particles were extraordinarily abundant, but as the Universe cooled, the majority annihilated away. All the conventional matter we have left over today is from the quarks and leptons, while everything that annihilated away created more photons, neutrinos, and antineutrinos. (E. SIEGEL / BEYOND THE GALAXY)

    And as the Universe cools even farther, the muons and anti-muons start to decay away, at the same time that the up-and-down quarks (plus the anti-up and anti-down quarks) start to separate away to substantial (femtometer: 10^-15 m) distances. About 10-to-20 microseconds after the Big Bang, we hit a critical temperature/density combination. We’ve now cooled down to a temperature of around 2 trillion K (2 × 10¹² K), and now the quarks and antiquarks are far enough apart that the strong force starts to get substantial.
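    That 2 trillion K is just the QCD confinement scale expressed as a temperature (the conversion uses standard constants; the ~150–170 MeV energy scale is the commonly quoted lattice-QCD value, not a number from this post):

    $$T \approx \frac{E}{k_B} \approx \frac{1.7\times10^{8}\ \mathrm{eV}}{8.6\times10^{-5}\ \mathrm{eV/K}} \approx 2\times10^{12}\ \mathrm{K}.$$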

    Just like an unstretched spring doesn’t exert a force but a stretched spring does, the quarks don’t feel a confining force until they reach a certain distance. But once they do, they become bound.

    The three valence quarks of a proton contribute to its spin, but so do the gluons, sea quarks and antiquarks, and orbital angular momentum as well. The electrostatic repulsion and the attractive strong nuclear force, in tandem, are what give the proton its size. (APS/ALAN STONEBRAKER)

    Gradually, we make the transition: from free up, down, anti-up and anti-down quarks to bound protons, neutrons, anti-protons and anti-neutrons. The Universe is still hot enough to make new particle-antiparticle combinations, and was making lots of up/anti-up and down/anti-down quark combinations when things were dense enough.

    But now that they’re not dense enough, and we have protons and neutrons (and anti-protons and anti-neutrons) instead, the Universe isn’t hot enough to spontaneously create new proton/anti-proton or neutron/anti-neutron pairs. What this means is that when protons and anti-protons (or neutrons and anti-neutrons) find each other, they annihilate away, and we cannot make new ones.

    Whenever you collide a particle with its antiparticle, it can annihilate away into pure energy. This means if you collide any two particles at all with enough energy, you can create a matter-antimatter pair. But if the Universe is below a certain energy threshold, you can only annihilate, not create. (ANDREW DENISZCZYC, 2017)

    What happens, then, as the Universe cools through this critical stage is the following:

    the remaining free quarks begin to experience confinement, becoming protons, neutrons, anti-protons, anti-neutrons, and pions (unstable particles known as mesons),
    the mesons decay away, while the anti-protons and anti-neutrons annihilate with the protons and neutrons,
    and this leaves us with protons and neutrons alone, only because at some earlier stage, the Universe created more matter than antimatter.

    As the Universe expands and cools, unstable particles and antiparticles decay, while matter-antimatter pairs annihilate and photons can no longer collide at high enough energies to create new particles. But there will always be leftover particles that can no longer find their antiparticle counterparts. Either they’re stable or they’ll decay, but both have consequences for our Universe. (E. SIEGEL)

    At last, the Universe starts to resemble something we’d recognize today. Sure, it’s hot and dense. Sure, there are no atoms or even any atomic nuclei. Sure, it’s still filled with a bunch of positrons (the antimatter counterpart of electrons) and electrons, and is still creating-and-annihilating them spontaneously. But most of what exists now, perhaps 25 microseconds after the start of the hot Big Bang, still exists in some form today. The protons and neutrons will become the building blocks of atoms; the neutrinos and antineutrinos and photons will become part of the cosmic background; the leftover electrons that will exist when the electron/positron pairs annihilate away will combine with the atomic nuclei to make atoms, molecules, and complex biochemical reactions possible.

    Each s orbital (red), each of the p orbitals (yellow), the d orbitals (blue) and the f orbitals (green) can contain only two electrons apiece: one spin up and one spin down in each one. The number of filled orbitals is determined by the number of protons in an atom’s nucleus. Without the protons created in the early Universe, none of what we have in our Universe today would be possible. (LIBRETEXTS LIBRARY / NSF / UC DAVIS)

    But at this stage, the biggest new thing that occurs is that particles are no longer individual-and-free on all scales. Instead, for the first time, the Universe has created a stable, bound state of multiple particles. A proton is two up and one down quark, bound by gluons, while a neutron is one up and two down quarks, bound by gluons. Only because we created more matter than antimatter do we have a Universe that has protons and neutrons left over; only because the Higgs gave rest mass to the fundamental particles do we get these bound, atomic nuclei.

    The strong force, operating as it does because of the existence of ‘color charge’ and the exchange of gluons, is responsible for the force that holds atomic nuclei together. (WIKIMEDIA COMMONS USER QASHQAIILOVE)

    Owing to the nature of the strong force, and the tremendous binding energy that occurs in these stretched-spring-like interactions between the quarks, the masses of the proton and neutron are some 100 times heavier than the quarks that make them up. The Higgs gave mass to the Universe, but confinement is what gives us 99% of our mass. Without protons and neutrons, our Universe would never be the same.
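    A quick tally shows how little of that mass comes from the quarks themselves (the quark masses are standard values, not quoted in the post): a proton is two up quarks and one down quark, so

    $$2m_u + m_d \approx (2.2 + 2.2 + 4.7)\ \mathrm{MeV}/c^2 \approx 9\ \mathrm{MeV}/c^2, \qquad m_p \approx 938\ \mathrm{MeV}/c^2,$$

    leaving roughly 99% of the proton’s mass to the binding and kinetic energy of the quark-gluon interactions.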

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 2:10 pm on August 15, 2018
    Tags: New phased array system coming by 2022

    From Astrobiology Magazine: “Arecibo Observatory to Get $5.8 Million Upgrade to Expand View” 

    Astrobiology Magazine

    From Astrobiology Magazine

    This is great news for this grand old lady.


    NAIC Arecibo Observatory operated by University of Central Florida, Yang Enterprises and UMET, Altitude 497 m (1,631 ft)

    The National Science Foundation has awarded a team of scientists $5.8 million to design and mount a supersensitive antenna at the focal point of the Arecibo Observatory’s 1,000-foot-diameter dish, which is managed by the University of Central Florida. The antenna, called a phased-array feed, will increase the telescope’s observation capabilities by 500 percent.

    The team, led by Brigham Young University engineering professors Brian Jeffs and Karl Warnick, includes collaborators at UCF and Cornell University. UCF and its partners have managed the facility since April when the team won a bid from the NSF to run the site.

    “We already have one of the most powerful telescopes on the planet, and with this award we will be able to do even more,” said Francisco Cordova, Arecibo site director and an engineer. “We are very excited with the award to fund the new ALPACA (Advanced Cryogenic L-Band Phased Array Camera for Arecibo) receiver at the Arecibo Observatory. This receiver, which is the next generation of our most-used receiver, will be able to increase the survey speed by a factor of five. The receiver will accelerate research in gravitational waves, Fast Radio Bursts, dark matter and pulsar surveys, ensuring that AO continues to be at the forefront of radio astronomy for years to come.”

    Cordova has been working with BYU for months to prepare the NSF proposal. Jeffs and Warnick are considered the world’s foremost experts in phased-array feeds and are familiar with Arecibo. Nine years ago, they installed a gold-plated array of many small antennas at Arecibo that increased the surveying ability of the telescope from one beam of radio waves to seven beams. The new NSF-sponsored phased-array feed will have 166 antennas and will increase the field of view of the telescope to 40 beams, providing much smoother and continuous coverage of the sky than conventional receivers. The new array is scheduled to be installed by 2022.
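    The quoted factor-of-five roughly tracks the jump in simultaneous beams (a back-of-the-envelope reading, not a calculation from the article): for beams of comparable sensitivity, survey speed scales with the sky area covered at once, so

    $$\frac{40\ \mathrm{beams}}{7\ \mathrm{beams}} \approx 5.7,$$

    of the same order as the 500 percent / factor-of-five speed-up quoted above.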

    “We’re taking the most sensitive radio telescope in the world and opening it up so that it can view a larger part of the sky at one time,” Warnick said. “There’s a lot of things in space you can see with an optical camera, but you can see even more with a radio telescope.”

    One scientific objective of the new feed will be tracking new pulsars — especially millisecond pulsars that help signal the presence of gravitational waves. Albert Einstein predicted the existence of gravitational waves in 1916, but scientists just detected them a few years ago. Gravitational waves are produced by catastrophic events, such as two colliding black holes, and they cause ripples in the fabric of space-time.

    The phased-array feed will also search for extra-terrestrial intelligence, detect Fast Radio Bursts and conduct surveys to help unravel the mystery of dark matter in the universe.

    “Every galaxy in the universe has an invisible cloud of dark matter around it that we don’t yet understand,” Warnick said. “This will help solve one of the mysteries of the universe.”

    Arecibo, which was built in the 1960s, has been making headlines recently for its contributions to major space science news. It helped confirm gravitational waves and FRBs. Just last month, it was used by a team from Switzerland to confirm some of Einstein’s theories. Scientists from around the world use the facility’s powerful instruments to study everything from pulsars and dark matter to solar weather, and NASA uses it to study asteroids. The upgrade will benefit all the work being done at Arecibo and help scientists understand the cosmos.

    UCF is leading the consortium that includes Universidad Metropolitana in Puerto Rico and Yang Enterprises Inc. based in Oviedo, Florida, in managing the NSF facility. The facility, which was damaged when Hurricane Maria hit Puerto Rico last year, reopened quickly after the storm. Emergency repairs that needed immediate attention, such as patching roofs and repairing electrical feeds, have been underway since May after the site received hurricane-relief funding. Additional repairs that will require more time and expertise will be completed as soon as possible.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 12:49 pm on August 15, 2018
    Tags: ENGIN-X, ISIS Neutron and Muon Source, Neutrons: an unexpected engineering tool

    From Science and Technology Facilities Council: “Neutrons: an unexpected engineering tool”


    From Science and Technology Facilities Council

    Neutrons: an unexpected engineering tool

    What do you associate the word “engineering” with? Construction? Electronics? Maths? The list could go on and on, but in all likelihood it would be some time before you’d arrive at “neutrons”.

    However, neutrons and engineering have more in common than you might expect. We take a closer look at ENGIN-X, ISIS Neutron and Muon Source’s dedicated engineering beamline, to find out how neutrons have been solving engineering problems for over 20 years.


    STFC ISIS Neutron and Muon source

    Measuring stresses and strains at the atomic level

    ISIS Neutron and Muon Source produces high energy beams of neutrons that are fired into materials allowing us to study them at the atomic scale. As neutrons don’t carry an electrical charge they are largely unaffected by the atoms around them, letting us see deep into materials.

    ENGIN-X uses neutrons to non-destructively measure the stresses and strains hidden within engineering components under realistic conditions. Samples can be subjected to stress loads up to 100kN and temperatures up to 1000°C to recreate the challenging conditions engineering components face every day.
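    The underlying measurement principle (standard neutron strain scanning, summarised here rather than taken from the article) is that the crystal lattice itself acts as the strain gauge: Bragg diffraction gives the lattice spacing d inside the stressed component, and comparing it with the spacing d₀ of an unstressed reference yields the elastic strain,

    $$\lambda = 2d\sin\theta, \qquad \varepsilon = \frac{d - d_0}{d_0},$$

    from which the residual stresses hidden in the component can be reconstructed.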

    Solving engineering problems for more than 20 years

    Despite being best known for its science, ISIS Neutron and Muon Source has had a longstanding involvement with engineering. “Not only are a substantial proportion of our workforce engineers, but we have a dedicated instrument that allows us to study engineering components in ways that no other technique can,” explains ISIS Neutron and Muon Source director Prof Robert McGreevy.

    The use of neutrons to study residual stress was pioneered at the Harwell reactors in the 1970s. ISIS Neutron and Muon Source entered the international stage for engineering stress measurement more than 20 years ago when it built ENGIN – one of the first dedicated neutron stress beamlines in the world. In response to growing demand from the engineering community, ENGIN was superseded by ENGIN-X in 2003, funded by the Engineering and Physical Science Research Council (EPSRC).

    Installation of the ENGIN-X guide, which became operational in 2003, by the team from Swiss Neutronics. Image credit: ISIS Annual Review 2002.

    Prof Robert McGreevy, director of ISIS Neutron and Muon Source, adds: “It’s great to see such a wide variety of research taking place on ENGIN-X. Not many people realise the important role that neutrons play in engineering, or that engineering plays in neutron research. The social and economic benefits are easy to explain and will continue for many years to come.”

    Take a look at some of the research that’s taken place on this extraordinary instrument below.

    Keeping us safe on the move

    Neutrons can delve deep into engineering components used in transport such as aircraft wings, jet engine casings and train wheels.

    Airbus was able to discover areas of potential stress and weakness in its aircraft wings by testing them using ENGIN-X.

    This assures the quality of engineering components before the manufacturing process begins, keeping us safe in the skies.

    Surveying stresses and strains in ancient artefacts

    Neutrons can be used to non-destructively study ancient artefacts. The facility has previously studied a 3,000-year-old vase, Bronze Age swords, medieval armour, and copper bolts from Napoleonic War-era ships.

    In 2015, ENGIN-X was used to look at a broken ancient tie rod used to support one of the biggest cathedrals in the world – Milan Cathedral.

    The experimental data from ENGIN-X revealed the residual stresses in the inner part of the iron rod without damaging the artefact.

    Stress and strain measurements power progress in the energy sector

    The energy industry can use neutrons to probe deep into objects of interest. For example, the graphite used in UK nuclear reactors and the materials used in solid oxide fuel cells have both been examined using ENGIN-X.

    The instrument was also used to study pipes for the offshore oil and gas industry. The pipe installation process and welding procedure induce strains in the pipeline which can, in principle, lead to failure.

    It is important that pipelines used to carry fluids are robust and able to withstand production and installation. Experimental data from ENGIN-X gave insight into the structural integrity of the pipe during the installation process.

    Monitoring the quality of hip implants

    Neutrons can also benefit our health as they can be used to study samples such as orthopaedic implants in great detail.

    Orthopaedic implants are often coated in a thin layer of hydroxyapatite. The coating process introduces stress within the implant, which could cause it to fail. ENGIN-X was used to collect stress measurements to determine how the coating process relates to implant failure.

    Researchers used neutrons rather than X-rays to study the samples because neutrons penetrate deep into the materials more effectively. Their results were used to develop a computer model, now used by the implant industry to monitor the quality of hydroxyapatite coatings.

    Stress test sheds light on solar system formation

    Neutrons can also probe samples from the natural world – including meteorites.

    ENGIN-X has been used to look at the stresses within samples of meteorites. The type of stress indicates the impacts and cooling history a sample has been through.

    These findings could help reveal conditions during the early formation of the solar system.

    Solving engineering issues for industry

    Over the past 30 years, various industries have directly benefitted from the value that neutron science can bring to their business.

    ENGIN-X has been used by many companies including EDF Energy, TWI, and Boeing.

    Researchers from Rolls-Royce used ENGIN-X to identify a mechanism they believed led to the formation of surface defects in turbine blades.

    The team were able to conclusively identify the mechanism that caused the defects. As a result, they’ve implemented a new manufacturing process that prevents surface defects from occurring.

    Further information

    ENGIN-X infographic
    How does ISIS Neutron and Muon Source work
    Further information on ENGIN-X
    More case studies on ENGIN-X

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    STFC Hartree Centre

    Helping build a globally competitive, knowledge-based UK economy

    We are a world-leading multi-disciplinary science organisation, and our goal is to deliver economic, societal, scientific and international benefits to the UK and its people – and more broadly to the world. Our strength comes from our distinct but interrelated functions:

    Universities: we support university-based research, innovation and skills development in astronomy, particle physics, nuclear physics, and space science
    Scientific Facilities: we provide access to world-leading, large-scale facilities across a range of physical and life sciences, enabling research, innovation and skills training in these areas
    National Campuses: we work with partners to build National Science and Innovation Campuses based around our National Laboratories to promote academic and industrial collaboration and translation of our research to market through direct interaction with industry
    Inspiring and Involving: we help ensure a future pipeline of skilled and enthusiastic young people by using the excitement of our sciences to encourage wider take-up of STEM subjects in school and future life (science, technology, engineering and mathematics)

    We support an academic community of around 1,700 in particle physics, nuclear physics, and astronomy including space science, who work at more than 50 universities and research institutes in the UK, Europe, Japan and the United States, including a rolling cohort of more than 900 PhD students.

    STFC-funded universities produce physics postgraduates with outstanding high-end scientific, analytic and technical skills who on graduation enjoy almost full employment. Roughly half of our PhD students continue in research, sustaining national capability and creating the bedrock of the UK’s scientific excellence. The remainder – much valued for their numerical, problem solving and project management skills – choose equally important industrial, commercial or government careers.

    Our large-scale scientific facilities in the UK and Europe are used by more than 3,500 users each year, carrying out more than 2,000 experiments and generating around 900 publications. The facilities provide a range of research techniques using neutrons, muons, lasers and x-rays, and high performance computing and complex analysis of large data sets.

    They are used by scientists across a huge variety of science disciplines ranging from the physical and heritage sciences to medicine, biosciences, the environment, energy, and more. These facilities provide a massive productivity boost for UK science, as well as unique capabilities for UK industry.

    Our two Campuses are based around our Rutherford Appleton Laboratory at Harwell in Oxfordshire, and our Daresbury Laboratory in Cheshire – each of which offers a different cluster of technological expertise that underpins and ties together diverse research fields.

    The combination of access to world-class research facilities and scientists, office and laboratory space, business support, and an environment which encourages innovation has proven a compelling combination, attracting start-ups, SMEs and large blue chips such as IBM and Unilever.

    We think our science is awesome – and we know students, teachers and parents think so too. That’s why we run an extensive Public Engagement and science communication programme, ranging from loans to schools of Moon Rocks, funding support for academics to inspire more young people, embedding public engagement in our funded grant programme, and running a series of lectures, travelling exhibitions and visits to our sites across the year.

    Ninety per cent of physics undergraduates say that they were attracted to the course by our sciences, and applications for physics courses are up – despite an overall decline in university enrolment.

     
  • richardmitnick 11:36 am on August 15, 2018 Permalink | Reply
    Tags: aUCNPs-alloyed upconverting nanoparticles, , , Light-Emitting Nanoparticles Could Provide a Safer Way to Image Living Cells,   

    From Lawrence Berkeley National Lab: “Light-Emitting Nanoparticles Could Provide a Safer Way to Image Living Cells” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    August 15, 2018
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Berkeley Lab scientists show how tiny, metal-rich particles can be excited with a low-power laser for deep-tissue imaging.

    1
    Light emitted by nanoparticles injected into the mammary fat pads of a live mouse is imaged through several millimeters of tissue. This sequence shows how the light emitted by these laser-excited particles can be imaged through deep tissue two hours after injection (left), four hours after injection (center), and six hours after injection (right). (Credit: UC San Francisco)

    A research team has demonstrated how light-emitting nanoparticles, developed at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), can be used to see deep in living tissue.

    The specially designed nanoparticles can be excited by ultralow-power laser light at near-infrared wavelengths considered safe for the human body. They absorb this light and then emit visible light that can be measured by standard imaging equipment.
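
    The “upconverting” in the name refers to this trick of pooling several low-energy photons to emit one higher-energy photon. A quick energy-budget check makes the point; the wavelengths below (980 nm near-infrared excitation, 550 nm green emission) are typical values for lanthanide upconverters chosen for illustration, not figures taken from this study.

        # Photon-energy arithmetic behind "upconversion": several low-energy
        # near-infrared photons are pooled to emit one higher-energy visible photon.
        H = 6.626e-34   # Planck constant, J*s
        C = 2.998e8     # speed of light, m/s
        EV = 1.602e-19  # joules per electronvolt

        def photon_energy_ev(wavelength_nm):
            """Energy of a single photon of the given wavelength, in eV."""
            return H * C / (wavelength_nm * 1e-9) / EV

        # Assumed wavelengths, typical of lanthanide upconverters (not from the paper):
        excitation_nm = 980.0  # near-infrared laser
        emission_nm = 550.0    # green visible emission

        e_in = photon_energy_ev(excitation_nm)   # ~1.27 eV per absorbed photon
        e_out = photon_energy_ev(emission_nm)    # ~2.25 eV per emitted photon

        print(f"one {excitation_nm:.0f} nm photon: {e_in:.2f} eV")
        print(f"one {emission_nm:.0f} nm photon: {e_out:.2f} eV")
        print(f"absorbed photons needed per emitted photon: at least {e_out / e_in:.1f}, i.e. two or more")

    A single near-infrared photon carries too little energy to produce a green photon, so at least two must be absorbed and combined – the opposite of ordinary fluorescence, where the emitted light has lower energy than the light absorbed.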

    The development and biological imaging application of these nanoparticles is detailed in a study published online Aug. 6 in Nature Communications.

    Researchers hope to further develop these so-called alloyed upconverting nanoparticles, or aUCNPs, so that they can attach to specific components of cells and serve in an advanced imaging system that lights up even single cancer cells. Such a system may ultimately guide high-precision surgeries and radiation treatments, and help to erase even very tiny traces of cancer.

    “With a laser even weaker than a standard green laser pointer, we can image deep into tissue,” said Bruce Cohen, who is part of a science team at Berkeley Lab’s Molecular Foundry that is working with UC San Francisco researchers to adapt the nanoparticles for medical uses. The Molecular Foundry is a DOE Office of Science User Facility specializing in nanoscience research – it is accessible to visiting scientists from around the nation and the world.

    Cohen noted that some existing imaging systems use higher-power laser light that runs the risk of damaging cells.

    “The challenge is: How do we image living systems at high sensitivity without damaging them? This combination of low-energy light and low laser powers is what everyone in the field has been working toward for a while,” he said. The laser power needed for the aUCNPs is millions of times lower than the power needed for conventional near-infrared-imaging probes.

    In this latest study, researchers demonstrated that the aUCNPs can be imaged through several millimeters of live mouse tissue, using lasers weak enough not to cause any damage.

    Researchers injected nanoparticles into the mammary fat pads of mice and recorded images of the light emitted by the particles, which did not appear to pose any toxicity to the cells.

    More testing will be required to know whether the aUCNPs produced at Berkeley Lab can be safely injected into humans, and to test coatings Berkeley Lab scientists are designing to specifically bind to cancerous cells.

    Dr. Mekhail Anwar, a radiation oncologist and an assistant professor at UC San Francisco who participated in the latest study, noted that there are numerous medical scanning techniques to locate cancers – from mammograms to MRIs and PET-CT scans – but these techniques can lack precise details at very small scales.

    “We really need to know exactly where each cancer cell is,” said Anwar, a Foundry user who collaborates with Molecular Foundry scientists in his research. “Usually we say you’re lucky when we catch it early and the cancer is only about a centimeter – that’s about 1 billion cells. But where are the smaller groups of cells hiding?”

    Future work at the Molecular Foundry will hopefully lead to improved techniques for imaging cancer with the aUCNPs, he said. Researchers are also developing an imaging sensor, to be used with the nanoparticles, that could be attached to surgical equipment and even surgical gloves to pinpoint cancer hot spots during surgery.

    2
    At left is a high-resolution transmission electron microscope image of a nanoparticle measuring 8 nanometers in diameter, with a 4-nanometer-thick shell. The scale bar is 5 nanometers. At right is a scanning transmission electron microscope image showing a collection of 8-nanometer nanoparticles with 8-nanometer shells (scale bar is 25 nanometers). (Credit: Berkeley Lab)

    A breakthrough in the Lab’s development of UCNPs was in finding ways to boost their efficiency in emitting the absorbed light at higher energies, said Emory Chan, a staff scientist at the Molecular Foundry who also participated in the latest study.

    For decades, the research community had believed that the best way to produce these so-called upconverting materials was to implant them or “dope” them with a low concentration of metals known as lanthanides. Researchers had assumed that adding too many of these metals would cause the emitted light to become dimmer.

    But experiments led by Molecular Foundry researchers Bining “Bella” Tian and Angel Fernandez-Bravo, who made lanthanide-rich UCNPs and measured their properties, upended this prevailing understanding.

    Studies of individual UCNPs proved especially valuable in showing that erbium, a lanthanide previously thought to play a role only in light emission, can also directly absorb light and free up another lanthanide, ytterbium, to absorb more light. Chan described erbium’s newly discovered multitasking role in the UCNPs as a “triple threat.”

    The UCNPs used in the latest study measure about 12-15 nanometers (billionths of a meter) across – small enough to allow them to penetrate into tissue. “Their shells are grown like an onion, a layer at a time,” Chan said.

    Jim Schuck, a study participant and former Berkeley Lab scientist now at Columbia University, noted that the latest study builds on a decade-long effort at the Molecular Foundry to understand, redesign, and find new applications for UCNPs.

    “This new paradigm in UCNP design, which leads to much brighter particles, is a real game-changer for all single-UCNP imaging applications,” he said.

    Researchers at the Molecular Foundry will be working on ways to automate the fabrication of the nanoparticles with robots, and to coat them with markers that selectively bind to cancerous cells.

    Cohen said that the collaborative work with UCSF has opened new avenues of exploration for UCNPs, and he expects the research effort to grow.

    “We never would have thought of using these for imaging during surgeries,” he said. “Working with researchers like Mekhail opens up this wonderful cross-pollination of different fields and different ideas.”

    Anwar said, “We’re really grateful to have access to the knowledge and wide array of instrumentation” at the Lab’s Molecular Foundry.

    Other researchers at Berkeley Lab’s Molecular Foundry and at UC Berkeley, UC San Francisco, and Columbia University also participated in this study.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 5:03 am on August 15, 2018 Permalink | Reply
    Tags: , , , ,   

    From Nature via U Wisconsin IceCube: “Special relativity validated by neutrinos” 

    U Wisconsin ICECUBE neutrino detector at the South Pole

    IceCube employs more than 5,000 detectors lowered on 86 strings into almost 100 holes in the Antarctic ice. (Credit: NSF/B. Gudbjartsson, IceCube Collaboration)

    Lunar Icecube

    IceCube DeepCore annotated

    IceCube PINGU annotated


    DM-Ice II at IceCube annotated

    Nature Mag
    From Nature

    13 August 2018
    Matthew Mewes

    Neutrinos are tiny, ghost-like particles that habitually change identity. A measurement of the rate of change in high-energy neutrinos racing through Earth provides a record-breaking test of Einstein’s special theory of relativity.

    The existence of extremely light, electrically neutral particles called neutrinos was first postulated in 1930 to explain an apparent violation of energy conservation in the decays of certain unstable atomic nuclei. Writing in Nature Physics, the IceCube Collaboration [1] now uses neutrinos seen in the world’s largest particle detector to scrutinize another cornerstone of physics: Lorentz invariance. This principle states that the laws of physics are independent of the speed and orientation of the experimenter’s frame of reference, and serves as the mathematical foundation for Albert Einstein’s special theory of relativity. Scouring their data for signs of broken Lorentz invariance, the authors carry out one of the most stringent tests of special relativity so far, and demonstrate how the peculiarities of neutrinos can be used to probe the foundations of modern physics.

    Physicists generally assume that Lorentz invariance holds exactly. However, in the late 1990s, the principle began to be systematically challenged [2], largely because of the possibility that it was broken slightly in proposed theories of fundamental physics, such as string theory [3]. Over the past two decades, researchers have tested Lorentz invariance in objects ranging from photons to the Moon [4].

    The IceCube Collaboration instead tested the principle using neutrinos. Neutrinos interact with matter through the weak force — one of the four fundamental forces of nature. The influence of the weak force is limited to minute distances. As a result, interactions between neutrinos and matter are extremely improbable, and a neutrino can easily traverse the entire Earth unimpeded. This poses a challenge for physicists trying to study these elusive particles, because almost every neutrino will simply pass through any detector completely unnoticed.

    The IceCube Neutrino Observatory, located at the South Pole, remedies this problem by monitoring an immense target volume to glimpse the exceedingly rare interactions. At the heart of the detector are more than 5,000 light sensors, which are focused on 1 cubic kilometre (1 billion tonnes) of ice. The sensors constantly look for the telltale flashes of light that are produced when a neutrino collides with a particle in the ice.

    The main goal of the IceCube Neutrino Observatory is to observe comparatively scarce neutrinos that are produced during some of the Universe’s most violent astrophysical events. However, in its test of Lorentz invariance, the collaboration studied more-abundant neutrinos that are generated when fast-moving charged particles from space collide with atoms in Earth’s atmosphere. There are three known types of neutrino: electron, muon and tau. Most of the neutrinos produced in the atmosphere are muon neutrinos.

    Atmospheric neutrinos generated around the globe travel freely to the South Pole, but can change type along the way. Such changes stem from the fact that electron, muon and tau neutrinos are not particles in the usual sense. They are actually quantum combinations of three ‘real’ particles — ν1, ν2 and ν3 — that have tiny but different masses.

    In a simple approximation relevant to the IceCube experiment, the birth of a muon neutrino in the atmosphere can be thought of as the simultaneous production of two quantum-mechanical waves: one for ν2 and one for ν3 (Fig. 1). These waves are observed as a muon neutrino only because they are in phase, which means the peaks of the two waves are seen at the same time. By contrast, a tau neutrino results from out-of-phase waves, whereby the peak of one wave arrives with the valley of the other.

    1
    Figure 1 | Propagation of neutrinos through Earth. There are three known types of neutrino: electron, muon and tau. a, A muon neutrino produced in Earth’s atmosphere can be thought of as the combination of two quantum-mechanical waves (red and blue) that are in phase — the peaks of the waves are observed at the same time. If a principle known as Lorentz invariance were violated, these waves could travel at different speeds through Earth’s interior and be detected in the out-of-phase tau-neutrino state. b, The IceCube Collaboration [1] reports no evidence of such conversion, constraining the extent to which Lorentz invariance could be violated.

    If neutrinos were massless and Lorentz invariance held exactly, the two waves would simply travel in unison, always maintaining the in-phase muon-neutrino state. However, small differences in the masses of ν2 and ν3 or broken Lorentz invariance could cause the waves to travel at slightly different speeds, leading to a gradual shift from the muon-neutrino state to the out-of-phase tau-neutrino state. Such transitions are known as neutrino oscillations and enable the IceCube detector to pick out potential violations of Lorentz invariance. Oscillations resulting from mass differences are expected to be negligible at the neutrino energies considered in the authors’ analysis, so the observation of an oscillation would signal a possible breakdown of special relativity.
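
    To see why mass-driven oscillations drop out at IceCube’s energies, it helps to plug numbers into the standard two-flavour oscillation formula, P(νμ→ντ) = sin²(2θ) sin²(1.27 Δm² L / E), with L in kilometres, E in GeV and Δm² in eV². The sketch below (not the collaboration’s analysis code) uses typical atmospheric-oscillation parameters for an Earth-crossing baseline: the probability is large at tens of GeV but falls roughly as 1/E², becoming negligible in the TeV range, so any oscillation seen there would point to something new.

        import math

        def p_mu_to_tau(energy_gev, baseline_km=12700.0,
                        delta_m2_ev2=2.5e-3, sin2_2theta=1.0):
            """Standard two-flavour probability:
            P = sin^2(2*theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV])."""
            phase = 1.27 * delta_m2_ev2 * baseline_km / energy_gev
            return sin2_2theta * math.sin(phase) ** 2

        # Atmospheric neutrinos crossing the full Earth (~12,700 km baseline),
        # with typical atmospheric mixing parameters assumed above.
        for energy in (25, 100, 1000, 10000):  # GeV
            print(f"E = {energy:>5} GeV -> P(mu -> tau) ~ {p_mu_to_tau(energy):.4f}")

    With these assumed parameters the conversion probability is essentially maximal at 25 GeV but below 0.2% at 1 TeV, which is why an oscillation signal surviving at IceCube’s energies would be so telling.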

    The IceCube Collaboration is not the first group to seek Lorentz-invariance violation in neutrino oscillations [5–10]. However, two key factors allowed the authors to carry out the most precise search so far. First, atmospheric neutrinos that are produced on the opposite side of Earth to the detector travel a large distance (almost 13,000 km) before being observed, maximizing the probability that a potential oscillation will occur. Second, the large size of the detector allows neutrinos to be observed that have much higher energies than those that can be seen in other experiments.

    Such high energies imply that the quantum-mechanical waves have tiny wavelengths, down to less than one-billionth of the width of an atom. The IceCube Collaboration saw no sign of oscillations, and therefore inferred that the peaks of the waves associated with ν2 and ν3 are shifted by no more than this distance after travelling the diameter of Earth. Consequently, the speeds of the waves differ by no more than a few parts per 10^28 — a result that is one of the most precise speed comparisons in history.

    The authors’ analysis provides support for special relativity and places tight constraints on a number of different classes of Lorentz-invariance violation, many for the first time. Although already impressive, the IceCube experiment has yet to reach its full potential. Because of limited data, the authors restricted their attention to violations that are independent of the direction of neutrino propagation, neglecting possible direction-dependent violations that could arise more generally.

    With a greater number of neutrino detections, the experiment, or a larger future version [11], could search for direction-dependent violations. Eventually, similar studies involving more-energetic astrophysical neutrinos propagating over astronomical distances could test the foundations of physics at unprecedented levels.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

    IceCube is a particle detector at the South Pole that records the interactions of a nearly massless sub-atomic particle called the neutrino. IceCube searches for neutrinos from the most violent astrophysical sources: events like exploding stars, gamma ray bursts, and cataclysmic phenomena involving black holes and neutron stars. The IceCube telescope is a powerful tool to search for dark matter, and could reveal the new physical processes associated with the enigmatic origin of the highest energy particles in nature. In addition, exploring the background of neutrinos produced in the atmosphere, IceCube studies the neutrinos themselves; their energies far exceed those produced by accelerator beams. IceCube is the world’s largest neutrino detector, encompassing a cubic kilometer of ice.

     
  • richardmitnick 4:23 am on August 15, 2018 Permalink | Reply
    Tags: , , , , ,   

    From Science and Technology Facilities Council: “The next big thing in astronomy: ESO’s Extremely Large Telescope” 


    From Science and Technology Facilities Council


    The Extremely Large Telescope (ELT) will be the world’s largest visible and infrared telescope – when completed, this new eye on the sky will open up new windows onto the universe and see things we can’t yet imagine. (Credit: STFC)

    On the top of a mountain in Chile, construction of the largest visible and infrared telescope ever built – the Extremely Large Telescope, or ELT – is underway. When the telescope is operational, its 39-metre primary mirror will gather 217 times more light than the Hubble Telescope.

    The ELT’s scale makes it a feat of engineering, and its ambition makes it a feat of imagination.

    The science applications of the ELT are vast. It will allow us to image planets outside our solar system, tell us what they are made of and if they can support life, and maybe even help us understand more about the mysteries of dark matter. And that’s just the start.

    As the ELT and its instruments evolve, it will generate discoveries that we can’t yet imagine.

    Delivering this awe-inspiring project is beyond the reach of a single nation, but is within our grasp thanks to multinational collaboration and science-led innovation.

    The UK is playing a key role in the ELT’s innovation.

    At the Science and Technology Facilities Council (STFC), we fund the UK ELT Project Office, led by Dr Chris Evans. It co-ordinates activities across the UK’s principal partners for research and development for the project – the University of Cambridge, Durham University, University of Oxford, and STFC’s UK Astronomy Technology Centre and RAL Space – in close collaboration with the European Southern Observatory (ESO). As Dr Evans says: “It’s exciting to be part of one of the biggest global science collaborations in history – and to see the UK helping to shape the project and drive it forwards.”

    2
    The ELT will be taller than a football stadium. (Credit: ESO)

    Let’s find out more about the ELT

    The ELT will be the world’s largest optical telescope, meaning it will use mirrors to gather light in the visible and the infrared spectrum. The telescope is being built by the ESO and its 15 member states, of which the UK is a major partner.

    When complete, the ELT will be a state-of-the-art facility, with capabilities far beyond any other ground-based optical telescope. It will have a footprint of 115 metres (to make room for the telescope, the top of the mountain has been levelled), and its dome will be 80 metres tall, making it taller than a football stadium.

    This size is only possible because the ELT has driven an ‘industrial revolution’ in telescope construction. This has changed the way the telescope is made, and means that new types of businesses can be involved in its production, with greater emphasis on production speed, quality and logistics.

    The ELT will be made up of lots of smaller components that fit together (like Lego), rather than a few one-off items.

    The telescope’s primary mirror is a great example of this – it will be 39 metres across, and made up of 798 hexagonal segments. It’s the size of this primary mirror that determines how much light it can capture – and how much of the universe it will be able to see.

    For astronomers, physicists and stargazers everywhere, developing a telescope this size with these capabilities is a major priority – it’s the one they have been waiting for.

    Very large vs extremely large

    If you want to be precise, the difference between an extremely large telescope and a very large telescope is about 30.80 metres…that’s the difference in size between the ELT and the Very Large Telescope (VLT), currently the most advanced optical observatory in the world.

    ESO’s VLT at Cerro Paranal in the Atacama Desert, elevation 2,635 m (8,645 ft), seen from above. Its four Unit Telescopes are ANTU (UT1; “The Sun”), KUEYEN (UT2; “The Moon”), MELIPAL (UT3; “The Southern Cross”) and YEPUN (UT4; “Venus”, as evening star). Credit: J.L. Dauvergne & G. Hüdepohl, atacama photo.

    It’s this huge jump in class between the 8-10 metre telescopes and the ELT that’s getting astronomers so excited.

    The ELT will be much more powerful than any other telescope currently in existence. If the telescope were placed at Land’s End, it could see a bumblebee at John O’Groats.
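
    That bumblebee figure is consistent with a simple diffraction-limit estimate, θ ≈ 1.22 λ/D. The sketch below runs the numbers for an assumed observing wavelength of 550 nm and a Land’s End-to-John O’Groats separation of roughly 970 km (both my assumptions, not figures from the article); it ignores the atmosphere entirely, which is where the adaptive optics described further down come in.

        import math

        # Diffraction-limited angular resolution: theta ~ 1.22 * lambda / D (radians).
        wavelength_m = 550e-9     # visible light (assumed)
        mirror_diameter_m = 39.0  # ELT primary mirror
        distance_m = 970e3        # Land's End to John O'Groats, roughly (assumed)

        theta_rad = 1.22 * wavelength_m / mirror_diameter_m
        smallest_feature_m = theta_rad * distance_m

        print(f"angular resolution: {math.degrees(theta_rad) * 3600 * 1000:.2f} milliarcseconds")
        print(f"smallest resolvable feature at {distance_m / 1e3:.0f} km: {smallest_feature_m * 100:.1f} cm")

    With these assumptions the answer comes out at roughly 3.5 milliarcseconds, or about 1.7 cm at that distance, which is indeed bumblebee-sized.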

    Right now, 8-10 metre telescopes are the best on the planet. With them, astronomers and physicists have made amazing discoveries, like producing the first image of a planet outside our solar system and tracking stars as they move around the black hole in the centre of our galaxy. The ELT won’t replace these telescopes; they will continue to power scientific discovery for years to come.

    But they have also opened the door to new mysteries about our universe. To address the new questions raised by existing telescopes and make new discoveries, astronomers need a new class of telescope to complement them – one in the 30-60 metre diameter range.

    3
    ELT deploying lasers to create artificial stars.
    (Credit: ESO/L. Calçada/N. Risinger (skysurvey.org))

    Glistening against the awesome backdrop of the night sky above ESO’s Paranal Observatory, four laser beams project out into the darkness from Unit Telescope 4 (UT4) of the VLT.

    Size isn’t the only thing that matters

    It’s not just the size of the telescope that makes the ELT so impressive. The engineers and scientists designing the telescope and its instruments are bringing expertise honed for other facilities to bear on every aspect of the ELT.

    One of the most sophisticated pieces of technology underpinning the ELT’s operation is its adaptive optics system. Adaptive optics allow astronomers to take really clear images of the stars by stopping them from twinkling.

    They can do this because the twinkling isn’t caused by the stars themselves – it’s caused by distortions in Earth’s atmosphere. The system measures the effect of the atmosphere on bright reference stars in the nearby sky (or on artificial laser guide stars), and thousands of tiny pistons (called actuators) under the surface of one of the mirrors then gently push it to change its shape, correcting for the distortions and creating crisp images of the cosmos.
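
    Conceptually, this is a feedback loop running hundreds of times a second: sense the residual wavefront error on the guide star, compute actuator commands, nudge the deformable mirror, and repeat. The toy loop below sketches the idea with a handful of actuators and a simple integrator-style controller; it is purely conceptual and bears no relation to the ELT’s actual real-time control software.

        import random

        # Toy closed-loop adaptive optics: a deformable mirror with a few actuators
        # chases a slowly drifting atmospheric distortion. Purely illustrative.
        N_ACTUATORS = 8
        GAIN = 0.5  # integrator gain: how strongly each measurement nudges the mirror

        mirror_shape = [0.0] * N_ACTUATORS
        atmosphere = [random.uniform(-1, 1) for _ in range(N_ACTUATORS)]

        for step in range(20):
            # 1. The atmosphere drifts a little between measurements.
            atmosphere = [a + random.gauss(0, 0.05) for a in atmosphere]

            # 2. Wavefront sensor: residual error seen on the guide star
            #    (distortion minus the correction the mirror already applies).
            residual = [a - m for a, m in zip(atmosphere, mirror_shape)]

            # 3. Controller: push each actuator part of the way toward
            #    cancelling its measured residual.
            mirror_shape = [m + GAIN * r for m, r in zip(mirror_shape, residual)]

            rms = (sum(r * r for r in residual) / N_ACTUATORS) ** 0.5
            print(f"step {step:2d}: residual wavefront RMS = {rms:.3f}")

    The residual error shrinks over successive steps even though the atmosphere keeps drifting, which is the essence of why adaptive optics can hold a sharp image in real time.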

    This technology also has applications for much smaller environments, like the environment within the human body. Biological and medical researchers have been working together with imaging experts to study the murky environment within cells.

    4
    Exquisite instruments

    Being able to capture light from the far reaches of the universe is one thing – but it’s the ELT’s instruments that will transform that light into scientific discoveries.

    The ELT will have three key instruments in place at ‘first light’ or following soon after – MICADO (a camera), HARMONI (a spectrograph), and METIS (a mid-infrared spectrograph and imager).

    HARMONI is the ELT’s ‘workhorse’ spectrograph. It will detect light in the visible and near-infrared parts of the spectrum, and produce 3D images of the sky with unparalleled sharpness and clarity. Because the ELT will have adaptive optics built in, the design of the telescope and HARMONI must stay closely coupled. This is an exciting challenge for the UK team leading the design of this critical instrument.

    Leading the design is Professor Niranjan Thatte from the University of Oxford, working in collaboration with STFC’s UK Astronomy Technology Centre and Rutherford Appleton Laboratory, and with experts at Durham University.

    The group – with contributions from international partners in Lyon, Marseille, Tenerife, and Madrid – will also work to ensure that HARMONI’s subsystems operate seamlessly.

    Read our interview with Professor Niranjan Thatte from the University of Oxford and lead investigator on the HARMONI instrument.

    While HARMONI will be the first instrument to tackle the big questions ELT was built for, other instruments will be added to the telescope after first light. These include HIRES (a high-resolution spectrograph) and MOSAIC (a multi-object spectrograph).

    ESO E-ELT HIRES in development

    ESO E-ELT MOSAIC

    MOSAIC will allow astronomers to observe large numbers of the most distant galaxies simultaneously, and build on scientific discoveries expected from the James Webb Space Telescope.

    As part of an international consortium of 11 countries, UK science and engineering teams are leading aspects of the instrument design.

    Find out why MOSAIC is the instrument Professor Simon Morris, ESO Council Member and Professor of Physics at Durham University, is most excited about.

    The future of astronomy

    The ELT will change the way we view the universe and open new avenues of exploration.

    There are certain scientific questions about the universe we know ELT will be able to answer: it will let us observe atmospheres of planets inside and outside our own solar system (possibly detecting ‘bio-markers’ indicating that they could support life), look back in time at the most distant galaxies so we can understand their formation and evolution, and make direct measurements of the expanding Universe, which could tell us more about dark matter and how it is distributed.

    But perhaps the most exciting questions the ELT will help us to answer are the ones we haven’t yet thought to ask, and the serendipitous discoveries that will take us by surprise.

    It’s worth remembering that the ELT is not just being designed for today’s researchers.

    This incredible telescope will inspire a new generation of astronomers to look to the sky, and fuel their discoveries for decades as they work to understand our place in the universe.
    Find out more about the ELT on the ESO website
    Discover more about big telescopes

    Big telescopes infographic

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    STFC Hartree Centre

    Helping build a globally competitive, knowledge-based UK economy

    We are a world-leading multi-disciplinary science organisation, and our goal is to deliver economic, societal, scientific and international benefits to the UK and its people – and more broadly to the world. Our strength comes from our distinct but interrelated functions:

    Universities: we support university-based research, innovation and skills development in astronomy, particle physics, nuclear physics, and space science
    Scientific Facilities: we provide access to world-leading, large-scale facilities across a range of physical and life sciences, enabling research, innovation and skills training in these areas
    National Campuses: we work with partners to build National Science and Innovation Campuses based around our National Laboratories to promote academic and industrial collaboration and translation of our research to market through direct interaction with industry
    Inspiring and Involving: we help ensure a future pipeline of skilled and enthusiastic young people by using the excitement of our sciences to encourage wider take-up of STEM subjects in school and future life (science, technology, engineering and mathematics)

    We support an academic community of around 1,700 in particle physics, nuclear physics, and astronomy including space science, who work at more than 50 universities and research institutes in the UK, Europe, Japan and the United States, including a rolling cohort of more than 900 PhD students.

    STFC-funded universities produce physics postgraduates with outstanding high-end scientific, analytic and technical skills who on graduation enjoy almost full employment. Roughly half of our PhD students continue in research, sustaining national capability and creating the bedrock of the UK’s scientific excellence. The remainder – much valued for their numerical, problem solving and project management skills – choose equally important industrial, commercial or government careers.

    Our large-scale scientific facilities in the UK and Europe are used by more than 3,500 users each year, carrying out more than 2,000 experiments and generating around 900 publications. The facilities provide a range of research techniques using neutrons, muons, lasers and x-rays, and high performance computing and complex analysis of large data sets.

    They are used by scientists across a huge variety of science disciplines ranging from the physical and heritage sciences to medicine, biosciences, the environment, energy, and more. These facilities provide a massive productivity boost for UK science, as well as unique capabilities for UK industry.

    Our two Campuses are based around our Rutherford Appleton Laboratory at Harwell in Oxfordshire, and our Daresbury Laboratory in Cheshire – each of which offers a different cluster of technological expertise that underpins and ties together diverse research fields.

    The combination of access to world-class research facilities and scientists, office and laboratory space, business support, and an environment which encourages innovation has proven a compelling combination, attracting start-ups, SMEs and large blue chips such as IBM and Unilever.

    We think our science is awesome – and we know students, teachers and parents think so too. That’s why we run an extensive Public Engagement and science communication programme, ranging from loans to schools of Moon Rocks, funding support for academics to inspire more young people, embedding public engagement in our funded grant programme, and running a series of lectures, travelling exhibitions and visits to our sites across the year.

    Ninety per cent of physics undergraduates say that they were attracted to the course by our sciences, and applications for physics courses are up – despite an overall decline in university enrolment.

     