
  • richardmitnick 5:40 pm on November 18, 2018
    Tags: "New Arecibo Observatory Message Challenge Announced", Arecibo message 1974

    From Astrobiology Magazine: “New Arecibo Observatory Message Challenge Announced” 


    Nov 18, 2018

    In 1974, the Arecibo Observatory made history by beaming into deep space the most powerful radio message ever sent.

    This radio message was transmitted toward the globular cluster M13 using the Arecibo telescope in 1974. Image Credit Arne Nordmann (norro) Wikipedia

    The famous Arecibo Message was designed by the observatory's 1974 staff, led by Frank Drake, with the help of the astronomer and famed science communicator Carl Sagan. It contained information about the human race and was intended to be our interstellar calling card.

    Frank Drake with his Drake Equation. Credit Frank Drake

    Carl Sagan NASA/JPL

    “Our society and our technology have changed a lot since 1974,” says Francisco Cordova, the director of the NSF-funded Arecibo Observatory. “So, if we were assembling our message today, what would it say? What would it look like? What one would need to learn to be able to design the right updated message from the earthlings? Those are the questions we are posing to young people around the world through the New Arecibo Message – the global challenge.”

    NAIC Arecibo Observatory operated by University of Central Florida, Yang Enterprises and UMET, Altitude 497 m (1,631 ft).

    The NSF-funded facility, which is home to the largest fully operational radar telescope on the planet, will launch its online competition later today on the 44th anniversary of the original Arecibo message. Check out the observatory’s website after 1 p.m. for details and today’s Google doodle for more information about the first message.

    But this will be no simple task. To get started, teams of up to 10 students, in grades from kindergarten through college, must decode various clues that will be released online. Like a Chinese puzzle box, teams must learn about space sciences, break coded messages and solve brain-puzzles to qualify, get instructions, register and then submit their entries. Arecibo will post its first puzzle on its website and social media channels this afternoon (Nov. 16).

    This challenge gives teams nine months to complete their designs. A winner will be announced during the Arecibo Observatory Week activities planned for 2019, which include a special celebration of the 45th anniversary of the original Arecibo Message.

    “We have quite a few surprises in store for participants and we will be sharing more details as the competition progresses,” Cordova says. “We can’t wait to see what our young people across the globe come up with.”

    The Arecibo Observatory is operated by the University of Central Florida (UCF) in partnership with Sistema Ana G. Mendez Universidad Metropolitana and Yang Enterprises Inc., under a cooperative agreement with the National Science Foundation (NSF). The planetary radar program is supported by NASA's Near-Earth Object Observations Program.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 10:35 am on November 18, 2018

    Holiday recess 

    There will be very little if any activity on this blog due to family obligations for Thanksgiving week.

  • richardmitnick 6:35 pm on November 17, 2018
    Tags: CEPC-Circular Electron Positron Collider plans in China, ILC-International Linear Collider plans in Japan, Studying the Higgs

    From Science Magazine: “China unveils design for $5 billion particle smasher” 


    China’s Circular Electron Positron Collider would be built underground in a 100-kilometer-circumference tunnel at an as-yet-undetermined site.

    Nov. 16, 2018
    Dennis Normile

    BEIJING—The center of gravity in high energy physics could move to Asia if either of two grand plans is realized. At a workshop here last week, Chinese scientists unveiled the full conceptual design for the proposed Circular Electron Positron Collider (CEPC), a $5 billion machine to tackle the next big challenge in particle physics: studying the Higgs boson. (Part of the design was published in the summer.) Now, they’re ready to develop detailed plans, start construction in 2022, and launch operations around 2030—if the Chinese government agrees to fund it.

    Meanwhile, Japan’s government is due to decide by the end of December whether to host an equally costly machine to study the Higgs, the International Linear Collider (ILC).

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    How Japan's decision might affect China's, which is a few years away, is unclear. But it seems increasingly likely that most of the future action around the Higgs will be in Asia. Proposed "Higgs factories" in Europe are decades away, and the United States has no serious plans [remember the Superconducting Super Collider intended for Texas and killed by our idiot Congress in 1993 for having "no immediate economic value"?].

    The Higgs boson, key to explaining how other particles gain mass, was discovered at CERN, the European particle physics laboratory near Geneva, Switzerland, in 2012—more than 40 years after being theoretically predicted.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event


    CERN map

    CERN LHC Tunnel

    CERN LHC particles

    Now, scientists want to confirm the particle’s properties, how it interacts with other particles, and whether it contributes to dark matter. Having only mass but no spin and no charge, the Higgs is really a “new kind of elementary particle” that is both “a special part of the standard model” and a “harbinger of some profound new principles,” says Nima Arkani-Hamed, a theorist at the Institute for Advanced Study in Princeton, New Jersey.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Standard Model of Particle Physics from Symmetry Magazine

    Answering the most important questions in particle physics today “involves studying the Higgs to death,” he says.

    "Physicists want at least one machine," says Joao Guimaraes da Costa, a physicist at the Chinese Academy of Sciences' Institute of High Energy Physics (IHEP) here, which put together the Chinese proposal. "Ideally, both should be built," because each has its scientific merits, adds Hitoshi Murayama, a theoretical physicist at the University of California, Berkeley, and the University of Tokyo's Kavli Institute for the Physics and Mathematics of the Universe in Kashiwa, Japan.

    The CERN discovery relied on the Large Hadron Collider, a 27-kilometer ring [map is above] in which high-energy protons traveling in opposite directions are steered into head-on collisions. This produces showers of many types of particles, forcing physicists to sift through billions of events to spot the telltale signal of a Higgs. It’s a bit like smashing together cherry pies, Murayama says: “A lot of goo flies out when what you are really looking for is the little clinks between pits.”

    Smashing electrons into their antimatter counterparts, positrons, results in cleaner collisions that typically produce one Z particle and one Higgs boson at a time, says Bill Murray of the University of Warwick in Coventry, U.K. How Z particles decay is well understood, so other signals can be attributed to the Higgs "and we can watch what it does," Murray says.

    Japan’s plan to build an electron-positron collider grew from international investigations in the 1990s. Physicists favored a linear arrangement [see schematic above], in which the particles are sent down two straight opposing raceways, colliding like bullets in rifles put muzzle to muzzle. That design promises higher energies, because it avoids the losses that result when charged particles are sent in a circle, causing them to shed energy in the form of x-rays. Its disadvantage is that particles that don’t collide are lost; in a circular design they continue around the ring for another chance at colliding.
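The energy-loss trade-off described above has a simple scaling: for relativistic electrons in a ring, the energy radiated per turn grows as the fourth power of the beam energy divided by the bending radius. A minimal Python sketch, using illustrative (assumed) machine parameters rather than any published design values:

```python
# Synchrotron radiation loss per turn for an electron beam, using the
# standard approximation U0 [GeV] ~ 8.85e-5 * E**4 / rho
# (E = beam energy in GeV, rho = bending radius in meters).
# This steep E**4 scaling is why circular e+e- machines get expensive at
# high energy, while a linear collider avoids the loss entirely.

def loss_per_turn_gev(beam_energy_gev: float, bending_radius_m: float) -> float:
    """Approximate energy radiated per turn by an electron beam."""
    return 8.85e-5 * beam_energy_gev**4 / bending_radius_m

# Illustrative (assumed) parameters: a ~100 km ring with an effective
# bending radius of roughly 11 km, colliding 120 GeV beams (240 GeV total).
u0 = loss_per_turn_gev(120.0, 11_000.0)
print(f"Radiated per turn: {u0:.2f} GeV")  # a bit under 2 GeV every turn
```

Doubling the beam energy raises the per-turn loss sixteen-fold, which is the trade-off between a circular machine like the CEPC and a linear one like the ILC.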

    Along the way, Japan signaled it might host the machine and shoulder the lion’s share of the cost, with other countries contributing detectors, other components, and expertise. A 2013 basic design envisioned a 500-giga-electronvolt (GeV) linear collider in a 31-kilometer tunnel costing almost $8 billion, not counting labor. But by then, the CERN team had already pegged the Higgs mass at 125 GeV, making the ILC design “overkill,” Murayama says. The group has since revised the plan, aiming for a 250-GeV accelerator housed in a 20-kilometer-long tunnel and costing $5 billion, says Murayama, who is also deputy director of the Linear Collider Collaboration, which coordinates global R&D work on several future colliders.

    IHEP scientists made their own proposal just 2 months after the Higgs was announced. They recognized the energy required for a Higgs factory “is still in a range where circular is better,” Murray says. With its beamlines buried in a 100-kilometer-circumference tunnel at a site yet to be chosen, the CEPC would collide electrons and positrons at up to 240 GeV.

    Both approaches have their advantages. The CEPC would produce Higgs bosons at roughly five times the rate of the ILC, allowing research to move faster. But Murayama notes that the ILC could easily be upgraded to higher energies by extending the tunnel by another couple of kilometers. Most physicists don't want to choose. The two colliders "are quite complementary," Murray says.

    Whether politicians and funding agencies agree remains to be seen. Construction of the CEPC hinges on funding under China’s next 5-year plan, which starts in 2021, says IHEP Director Wang Yifang. IHEP would then also seek international contributors. Murayama says Japan needs to say yes to the ILC in time to negotiate support from the European Union under a particle physics strategy to be hammered out in 2019. Missing that opportunity could mean delaying the collider by 20 years, he says—and perhaps ceding the field to China.

    See the full article here.



  • richardmitnick 2:40 pm on November 17, 2018
    Tags: Airbus Zephyr S, HAPS – missions to the edge of space to watch over Earth, High Altitude Pseudo Satellite, Potential High Altitude Pseudo Satellite Design

    From European Space Agency: “HAPS – missions to the edge of space to watch over Earth” 


    15 November 2018

    Lighter-than-air Stratobus, Thales Alenia Space/Briot
    Thales Alenia Space's Stratobus airship can carry up to 250 kg of payload, using its electric engines to fly against the breeze and hold position, and relying on fuel cells at night. Its first flight is projected for 2021.

    Is it a bird? Is it a plane? No, it’s a High-Altitude Pseudo-Satellite (HAPS) — an uncrewed airship, plane or balloon watching over Earth from the stratosphere. Operating like satellites but from closer to Earth, HAPS are the ‘missing link’ between drones flying close to Earth’s surface and satellites orbiting in space.

    They float or fly high above conventional aircraft and offer continuous day-and-night coverage of the territory below. Target applications include search and rescue missions, disaster relief, environmental monitoring and agriculture.

    ESA’s Directorates of Telecommunication, Earth Observation and Navigation are working together to establish a HAPS Programme. The Agency will hold its second HAPS4ESA workshop on 12–14 February 2019, and the future of HAPS was discussed at yesterday’s Φ-week session at ESA’s Earth observation centre in Frascati, Italy.


    Airbus Zephyr S, Airbus
    The Airbus Zephyr S is a High Altitude Pseudo Satellite (HAPS) that flew for more than 25 days on its maiden flight in 2018.

    Additionally, ESA is performing other HAPS studies through its Discovery and Preparation Programme, identifying how HAPS could bring value to satellite communications and Earth observation in terms of performance or cost, highlighting gaps in current HAPS technologies, and planning moves towards operational services.

    "HAPS could give us prolonged high-resolution coverage of specific regions of Earth," explains Juan Lizarraga Cubillos, leading both studies from ESA. "They could also help provide tactical and emergency communications and broadband internet services."

    By combining the expertise of telecommunications company HISPASAT and aircraft maker Airbus, the TELEO – High-Altitude Pseudo-Satellites for Telecommunication and Complementary Space Applications – team found that aerodynamic HAPS, taking the form of aircraft, could complement traditional satellite networks.

    HAPS could also improve security for major events – for example the Olympic Games or G7 meetings – and emergency situations, by providing secure communication bubbles over areas of interest.

    High Altitude Pseudo Satellite, ESA
    In the future, High Altitude Pseudo Satellites could be used as relays between satellites and ground stations to improve data transfer.

    Juan Carlos Martin Quirós from HISPASAT explains, “Because HAPS can be rapidly deployed compared to satellites, in addition to being low-cost and flexible, they could be extremely useful in telecommunications services.”

    The TELEO team also looked at disaster management and maritime traffic safety and security.

    “At Airbus we have demonstrated that aerodynamic HAPS are a practical reality – a Zephyr S was flown this year carrying prototypes of passive Earth observation payloads,” explains Steffen Kuntz from Airbus.

    Potential High Altitude Pseudo Satellite Design, ESA
    Design for a High Altitude Pseudo Satellite (HAPS) arising from a Discovery and Preparation study. The study, HAPPIEST, investigated the role of HAPS in future telecommunications networks, looking at where they could complement and fill gaps in existing satellite networks and applications.

    The HAPPIEST – High-Altitude Pseudo-Satellites: Proposal of Initiatives to Enhance Satellite Communication – team from the University of León, Thales Alenia Space, Elecnor Deimos and Airobotics mainly focused on the potential of ‘aerostatic’ HAPS in the form of stratospheric balloons – able to carry more payload and generate more power than aerodynamic HAPS.

    HAPPIEST investigated the role of HAPS in future telecommunications networks, to complement and fill gaps in existing satellite networks and applications.

    HAPS look promising – both economically and technically – in responding to natural disasters or in supporting field activities in areas lacking infrastructure, such as remote regions or the deep sea. Additionally, HAPS could be useful as an intermediate relay between a satellite and a ground station, easing the transfer of data and reducing the ground and satellite infrastructure required.

    High-altitude pseudo-satellites, ESA Earth Observation Graphics Bureau

    High Altitude Pseudo-Satellites, or HAPS, are platforms that float or fly at high altitude like conventional aircraft but operate more like satellites – except that rather than working from space they can remain in position inside the atmosphere for weeks or even months, enabling precise monitoring and surveillance, high-bandwidth communications or backup to existing satellite navigation services.

    “We found that HAPS don’t really compete with terrestrial networks in highly developed areas, or with satellite networks where the areas of interest are large”, explains Jesus Gonzalo, leading the project from the University of León. “But HAPS efficiently complement the networks in between, where the target area is limited and changing and where ground infrastructure is inexistent or unavailable.”

    Based on their research, the HAPPIEST team designed a HAPS measuring 181 metres long, with a take-off mass of 16 metric tons for an operational payload of 250 kg, envisaged for the 2025 timeframe.

    Looking ahead, ESA is already running five more studies with the objective of developing business cases or innovative new applications and services to be enabled by HAPS. Several further studies are planned for the near future, especially in using HAPS as intermediaries between satellites and ground stations.

    See the full article here.


    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA's spaceflight programme includes human spaceflight, mainly through participation in the International Space Station programme; the launch and operation of uncrewed exploration missions to other planets and the Moon; Earth observation, science and telecommunication; maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana; and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands; Earth observation missions at ESRIN in Frascati, Italy; ESA Mission Control (ESOC) is in Darmstadt, Germany; the European Astronaut Centre (EAC), which trains astronauts for future missions, is in Cologne, Germany; and the European Space Astronomy Centre is in Villanueva de la Cañada, Spain.


  • richardmitnick 2:06 pm on November 17, 2018
    Tags: Earthquakes and Potential for Levee Failure, Interview with Dr. Stewart, Water Management: Levees and Subsidence, When the Levee Breaks: Cascading failures in the Sacramento-San Joaquin River Delta California

    From temblor: “When the Levee Breaks: Cascading failures in the Sacramento-San Joaquin River Delta, California” 



    November 16, 2018
    Jason R. Patton, Ph.D., Ross Stein, Ph.D., Volkan Sevilgen, M.Sc.

    Water Management: Levees and Subsidence

    The state of California provides a substantial proportion of the food for our country and other nations. Because of this agribusiness history, water rights have been a recurrent topic in the politics of the state and the western US.

    Competing interests for the limited water resources have resulted in dramatic debate about the management of the Sacramento – San Joaquin River Delta. Islands and agricultural land in this region were protected from flooding by the installation of levees. Levees are artificial stream banks, designed to prevent flooding of the islands and the land adjacent to the river. The islands are composed of peat, and once the levees prevented seasonal flooding, the peat began to outgas and compact. So today the islands lie more than 15 ft below the river level, which is at sea level. In fact, the islands are like empty bowls with their rims just above the river level. If the levees were breached in even a modest earthquake, the river water would rush in to flood the islands.

    But the loss of the islands is just the tip of the iceberg. The real problem is that so much water would be displaced from the Sacramento River that sea water from San Pablo and Suisun Bays would be sucked in, rendering the Delta-Mendota Canal useless for irrigating the verdant crops of the Great Valley.

    The figure below shows the spatial extent of this land subsidence. The color represents the magnitude of this subsidence relative to feet below sea level (Windham-Meyers et al., 2018). The levees are currently the only physical barrier that prevents these regions from flooding with salty water.

    Sacramento – San Joaquin Delta wetlands are shown with color representing the magnitude of subsidence below sea level (modified from Galloway et al., 1999). The red box represents the area that was studied by Windham-Meyers et al. (2018) for the potential for greenhouse gas emissions from peat in these wetlands.

    Earthquakes and Potential for Levee Failure

    The levees in the Delta are highly susceptible to failure due to earthquakes because they were built out of mud between the Gold Rush and the 1906 Earthquake. The 2014 South Napa earthquake is a reminder that even small quakes on faults with low slip rates can cause strong (0.5 g) shaking. The Delta is just east of some major faults, and straddles others.

    Below is a summary map showing the relative potential for levee collapse due to earthquakes in the Delta (Mount and Twiss, 2005). According to these authors, there is a 2 in 3 chance that either floods or an earthquake will cause catastrophic flooding in the Delta by 2050. These authors combined estimates of earthquake ground shaking and knowledge about the conditions of the levees in the Delta to prepare this map that shows relative damage potential for different zones.

    Zones of potential damage from earthquake induced liquefaction and levee collapse (modified from Torres et al., 2000).

    Millions of people in the United States and elsewhere are exposed to flood hazards, and anyone exposed to these hazards is at risk. Learn more about your exposure to flood hazards with the Temblor app here.

    Below is a map showing flood hazards and earthquake faults in the Delta area. Blue represents the chance of flooding in 10 years. Darker blue represents higher flood hazard. The flood hazard is based on the FEMA flood zones. The major USGS active faults shown in red are labeled.

    Flood hazards shown using the Temblor app. Earthquake faults and recent earthquakes are also shown.

    Below is an animation visualizing what would likely happen if levees in the Delta collapse. First we see where the islands would be flooded following an earthquake. Then we see a simulation of the changes in salinity for the areas that are flooded (the islands). Animation provided courtesy of the Metropolitan Water District of Southern California.

    Animation of earthquake-induced failures of Delta levees, which protect deeply subsided islands, or "holes". A sudden collapse can pull over one million acre-feet of sea water into these large voids below sea level, significantly impacting the water supply for millions of Californians and millions of acres of farmland. Seismic risks from the Delta Risk Management Strategy, California Department of Water Resources, 2009.

    Interview with Dr. Stewart

    Jonathan Stewart, Ph.D. is a professor of geotechnical engineering, earthquake engineering, and engineering seismology and the Chair of the Department of Civil and Environmental Engineering at the University of California, Los Angeles. Dr. Stewart has been studying the interactions between structures (like the levees) and earthquakes for over 2 decades.

    Temblor asked Dr. Stewart some questions about a recent talk he gave to scientists and engineers.

    Temblor: This subject matter is complicated from a political perspective, a geotechnical perspective, and a natural hazards perspective. Some of the natural hazards in the Delta have been exacerbated by our management of the natural resources there. What do you view as the most important message that the scientific facts tell us about the enhanced risk of flooding in the Delta due to our management of the natural resources in this area?

    Dr. Stewart: The Delta region is indeed complex, but the seismic risk is easy to understand. The levees, when viewed as individual earth structures, are highly vulnerable to the effects of earthquakes. Many factors contribute to this, including the lack of engineering in their original construction in many cases, subsidence of the interior islands, which effectively heightens the levees, and the extraordinarily soft peaty organic soils upon which they are founded. Furthermore, the levees constantly impound water, meaning that portions of the levee fill are constantly saturated and susceptible to liquefaction. If a significant earthquake occurs on the active faults near the west end of the Delta, we expect multiple breaches. It is also important to recognize that failure of any one levee segment would inundate the interior island, and thus would represent a failure of the levee system.

    Because of the way we have managed the Delta, when these breaches occur, saline water from San Francisco Bay will be drawn into the Delta as the below sea level islands fill. This will be a disaster for the water distribution system in California and the regional ecology.

    Temblor: There exist a wide range of contributing factors for flood hazards in the Delta. Some factors are beyond our control. Please tell us about the most important factors that are beyond our control.

    Dr. Stewart: The occurrence of earthquakes on the regional faults is, of course, beyond our control. Levee stability is also affected by sea level rise and high water events related to extreme upstream precipitation. Such events are under our control to some extent in that they are influenced by climate change, but the 'our' becomes very large as this is a national and global problem.

    Temblor: California has just elected Gavin Newsom as Governor, who has demonstrated an awareness for earthquake hazards in the past when he served as the Mayor of San Francisco and as the Lieutenant Governor for the state of California. So, is this the moment to seize the day and get the federal and state governments to act now to prevent this predictable disaster from occurring? What should they do?

    Dr. Stewart: Absolutely. If we suffer a catastrophic failure of our water distribution system due to an earthquake in the Delta before we act, history will not judge our political and engineering leaders kindly. The threat is real, and the science refutes those who would deny it. We must act. We know how to address this problem through solutions like the California Water Fix program, we just need the political will to see it through in a timely way.

    Learn More

    Learn more about the history of the Delta from this report prepared for the San Francisco Estuary Institute and Aquatic Science Center (Whipple et al., 2012). The report is available in PDF format and the GIS data are also posted online.

    Dr. Stewart and his colleagues prepared a report about the factors and processes that contribute to the stability of levees in the Delta (Deverel et al., 2016). Read this report here.


    Deverel, S.J., Bachand, S., Brandenberg, S.J., Jones, C.E., Stewart, J.P., and Zimmaro, P., 2016. Factors and Processes Affecting Delta Levee System Vulnerability in San Francisco Estuary and Watershed Science, v. 14, no. 4.

    Galloway, D.L., Jones, D.R., and Ingebritsen, S.E., 1999. Land subsidence in the United States. USGS Circular 1182.

    Shouse, M.K., and Cox, D.A., 2013. USGS Science at Work in the San Francisco Bay and Sacramento-San Joaquin Delta Estuary: U.S. Geological Survey Fact Sheet 2013–3037, 6 p.

    Torres, R.A., et al., 2000. Seismic vulnerability of the Sacramento-San Joaquin Delta levees. Report of levees and channels technical team, seismic vulnerability sub-team to CALFED Bay-Delta Program. 30 p.

    Whipple, A., Grossinger, R.M., Rankin, D., Stanford, B., and Askevold, R.A., 2012. Sacramento-San Joaquin Delta Historical Ecology Investigation: Exploring Pattern and Process. SFEI Contribution No. 672. SFEI: Richmond.

    Windham-Meyers, L., Bergamaschi, B., Anderson, F., Knox, S., Miller, R., and Fujii, R., 2018. Potential for negative emissions of greenhouse gases (CO2, CH4 and N2O) through coastal peatland re-establishment: Novel insights from high frequency flux data at meter and kilometer scales, in Env. Res. Letters, v. 13.

    See the full article here.



    Earthquake Alert



    Earthquake Network project

    Earthquake Network is a research project that aims to develop and maintain a crowdsourced, smartphone-based earthquake warning system at a global level. Smartphones made available by the population are used to detect earthquake waves using the on-board accelerometers. When an earthquake is detected, a warning is issued to alert the population not yet reached by the damaging waves.

    The project started on January 1, 2013 with the release of the homonymous Android application Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015

    Meet The Quake-Catcher Network


    The Quake-Catcher Network is a collaborative initiative for developing the world's largest low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide a better understanding of earthquakes and give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer-hosted computers into a real-time motion-sensing network. QCN is one of many scientific computing projects that run on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers and digitally transmit "triggers" to QCN's servers whenever strong new motions are observed. QCN's servers sift through these signals and determine which ones represent earthquakes and which ones represent cultural noise (like doors slamming or trucks driving by).
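Trigger-sifting of this kind is commonly implemented with a short-term-average / long-term-average (STA/LTA) ratio: a sudden jump in recent signal energy relative to the background suggests a real event. QCN's actual algorithm is not described here, so the sketch below is a generic illustration with made-up window sizes and threshold:

```python
# Generic STA/LTA trigger sketch: flag samples where short-term signal
# energy jumps well above the long-term background level. Window lengths
# and the threshold are illustrative assumptions, not QCN's real settings.

def sta_lta_trigger(samples, sta_len=5, lta_len=50, threshold=4.0):
    """Return indices where short-term energy exceeds threshold * long-term energy."""
    energy = [s * s for s in samples]
    triggers = []
    for i in range(lta_len, len(energy)):
        sta = sum(energy[i - sta_len:i]) / sta_len   # recent shaking
        lta = sum(energy[i - lta_len:i]) / lta_len   # background level
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# Quiet background with a sudden burst starting at sample 80:
signal = [0.01] * 80 + [1.0] * 10 + [0.01] * 10
print(sta_lta_trigger(signal))  # indices at and just after the burst
```

A server aggregating triggers from many hosts can then keep only events reported by multiple nearby sensors at once, which is how a door slam at one volunteer's house gets filtered out.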

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors:

    1) Mounted to the floor, they measure shaking more reliably than mobile devices.
    2) They typically have lower noise and better resolution of 3D motion.
    3) Desktops are often left on and do not move.
    4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn't reduce the sensor's performance.
    5) USB sensors can be aligned to North, so we know which directions the horizontal "X" and "Y" axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, more properly the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes quickly enough that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the affected region is estimated and a warning is provided to local populations. The method can provide warning before the arrival of the S wave, which brings the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.
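The shape of those warning times follows from simple arithmetic: the damaging S wave travels at only a few kilometres per second, while the alert, issued once the P wave has been detected and processed, travels effectively instantly. This sketch is not ShakeAlert’s actual model; the wave speed and processing delay are assumed, typical values.

```python
# Rough warning-time model: alert goes out after a fixed detection and
# processing delay; the S wave then still has to travel to each site.
# Both constants below are assumptions for illustration only.

VS_KM_S = 3.5        # approximate crustal S-wave speed (assumed)
ALERT_DELAY_S = 5.0  # assumed time to detect the P wave, characterize the
                     # quake, and push the alert out

def warning_time_s(distance_km):
    """Seconds between alert receipt and S-wave arrival at `distance_km`."""
    return max(0.0, distance_km / VS_KM_S - ALERT_DELAY_S)

for d_km in (20, 60, 100):
    print(f"{d_km:>3} km from the epicenter: ~{warning_time_s(d_km):.0f} s of warning")
```

Sites very close to the epicenter fall in a “blind zone” where the S wave outruns the alert, which is why warning times range from a few seconds up to a few tens of seconds farther out.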

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California, joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers and allows for automatic fail-over if a connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.


    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

  • richardmitnick 12:41 pm on November 16, 2018 Permalink | Reply
    Tags: Can cause nerve and respiratory damage from breathing in the neurotoxin brevetoxin, Characterized by explosions of single-celled algae called dinoflagellates, HABs-harmful algal blooms. Red tide is just one type of HAB, Harmful algae known as "red tide", Iron-rich dust from the Sahara scatters into the Atlantic Ocean and fertilizes the water creating ideal conditions for dinoflagellates to thrive, The troublesome blooms are no longer seasonal, University of Rochester   

    From University of Rochester: “Red Tide: A Looming ‘Planetary Problem’ “ 

    U Rochester bloc

    From University of Rochester

    Lindsey Valich

    What does a persistent bloom of algae indicate about the health of the planet?

    DEADLY DUST: The cycles of algae blooms known as red tides that are plaguing the Gulf of Mexico have their origins half a world away, rising out of the Sahara in Africa. While the blooms have existed for millennia, the cycles have been happening with more frequency and intensity, say Rochester alumni who study red tides and their impact.

    While the harmful algae known as red tide have historically been common in warm waters like those of the Gulf of Mexico, the troublesome blooms are no longer seasonal. The algae kill marine animals and make life miserable for beachgoers.

    A particularly robust cycle that began last fall prompted Florida Governor Rick Scott to declare a state of emergency this past summer for seven counties in southern Florida.

    Michael Parsons ’90 and Michael Savarese ’81, ’84 (MS) are leading an effort to study red tide and determine what can be done to mitigate its effects. As researchers at Florida Gulf Coast University, they analyze the blooms and environmental changes in coastal settings, particularly in response to human development, sea-level rise, and global warming.

    “Regions of harmful algal blooms across the globe have increased in size, number and frequency,” Savarese says. “This isn’t just a Florida problem or a Gulf of Mexico problem, this is a planetary problem.”

    What is red tide?

    When algae grow out of control and produce toxins harmful to humans and wildlife ecosystems, they are called harmful algal blooms (HABs). Red tide is just one type of HAB, common in the Gulf of Mexico and characterized by explosions of single-celled algae called dinoflagellates. Each cell is about the size of a grain of salt, but when concentrations become greater than 100,000 cells per liter of water, the harmful algae can severely lower oxygen levels and give water a reddish or brownish color.

    What are the effects on humans?

    Red tide is harmful to humans if ingested, either by eating tainted shellfish—which can cause nerve and respiratory damage—or breathing in the neurotoxin brevetoxin, which the algae produce. Breathing the toxin can cause people to sneeze or cough, and red tide may exacerbate symptoms of asthma or other pre-existing respiratory ailments. Most of the respiratory irritations are easily fixed, though: “You just leave the beach,” Parsons says. “But when you leave the beach, you’re disrupting the tourism economy.”

    Florida is hit especially hard economically by red tide because of the state’s reliance on tourism. Fort Myers Beach, for example, announced this year that the area has been losing $2.6 million per day because of red tide, Parsons says. “The economic impacts are huge.”

    What causes red tide?

    Poor water quality does not directly lead to red tide algal blooms, Parsons says. “Everyone assumes the cause of red tide is agricultural nutrients coming off the landscape, but it’s not that simple. Red tides have existed for millennia.”

    Poor water quality can exacerbate the problem, but red tide algal blooms actually form far offshore, triggered by a natural cycle. Iron-rich dust from the Sahara scatters into the Atlantic Ocean and fertilizes the water, creating ideal conditions for dinoflagellates to thrive. The Florida coast is fairly shallow until about 100 to 200 miles out, where the gulf drops into extremely deep water. When those deep waters rise up toward the surface, they can bring in new nutrients that further feed the red tide.

    Why was red tide so bad this year?

    While scientists are still studying the connection between climate change and red tide, there “is clearly some sort of relationship,” Savarese says. “For algae to bloom and thrive, warmer waters are important. The current Gulf of Mexico temperatures are unprecedented in recent history.”

    Warmer waters are just one of a “perfect storm” of factors contributing to the intensity of the current red tide, Parsons says. Other factors include more persistent winds blowing offshore blooms inland and “legacy” nutrients—litter, fertilizers, and wastewater runoff—from Hurricane Irma, which hit Florida in September 2017.

    Red tide used to be more common in the winter, but even that’s changing: the current red tide has been a continuous presence in Florida since October 2017. “I don’t know when red tide season is anymore, and the reason I don’t know is scary: there seem to be red tides year-round now,” Savarese says.

    With this bleak picture, is there any hope that the red tide may go away any time soon?

    Yes, surmises Parsons. “It’s hard to predict, but we are seeing the system change. Nutrients are moving into different pathways, which should basically starve the red tide. But things could change back at any time.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Rochester Campus

    The University of Rochester is one of the country’s top-tier research universities. Our 158 buildings house more than 200 academic majors, more than 2,000 faculty and instructional staff, and some 10,500 students—approximately half of whom are women.

    Learning at the University of Rochester is also on a very personal scale. Rochester remains one of the smallest and most collegiate among top research universities, with smaller classes, a low 10:1 student to teacher ratio, and increased interactions with faculty.

  • richardmitnick 12:10 pm on November 16, 2018 Permalink | Reply
    Tags: , Integrated Bay Observatory, Narragansett Bay, , RI C-AIM-Rhode Island Consortium for Coastal Ecology Assessment Innovation and Modeling, Start of the 3D modeling process by examining the buoys and creating technical drawings,   

    From University of Rhode Island: “Bringing the Bay Observatory to 3D life” 

    From University of Rhode Island

    RISD graduate student and C-AIM researcher Stewart Copeland in his Providence studio developing new 3D models of the Bay Observatory’s equipment.


    Shaun Kirby,
    RI C-AIM Communications & Outreach Coordinator

    Stewart Copeland has been a webmaster, documentary filmmaker, and even a touring musician over the past 10 years. Now, the Tennessee native is developing 3D models of the sensor buoys that make up the Integrated Bay Observatory, a new array of equipment to monitor the ecological changes of Narragansett Bay.

    “I grew up an hour south of Nashville, and I’m not a water person,” admits Copeland, a graduate student at the Rhode Island School of Design’s Edna Lawrence Nature Lab. “But I’m learning a lot about the ocean.”

    C-AIM researchers and students run a test launch of a sensor buoy this past spring. (Photo by Timo Kuester)

    The observatory, which is being deployed by the Rhode Island Consortium for Coastal Ecology Assessment, Innovation and Modeling (RI C-AIM), encompasses multiple marine research tools that will gather new data about Narragansett Bay’s ecosystems, from nutrient concentrations and phytoplankton populations to water circulation patterns.

    But Copeland, alongside Neal Overstrom, a co-principal investigator for the consortium and the Nature Lab’s director, is using 3D modeling to visualize not the data collected from the observatory, but the tools that make that research possible.

    Copeland starts his 3D modeling process by examining the buoys and creating technical drawings.

    “We get way too used to aerial views, dots on a map showing a buoy’s placement,” the RISD student explains. “But passing by it on a boat, you see this yellow thing with solar panels on it. It has all this technology extending from its bottom, and then life grows on it.”

    “That’s really exciting, and the challenge is showing more about the place itself from where all this data is coming.”

    The buoys will be moored at specific locations in Narragansett Bay this coming spring. Overstrom likened the buoys to a Mars rover, a vehicle that often draws more interest as a sojourning machine than for the data it collects.

    “These sensor buoys are entities in and of themselves, out there on Narragansett Bay day and night, through all kinds of weather,” he asserts. “The question for us is, how do virtual representations further inform what these buoys are doing above and beyond being critical platforms for data collection?”

    Copeland is also working closely with Dr. Harold ‘Bud’ Vincent, lead researcher for RI C-AIM coordinating the installation of the Bay Observatory’s equipment.

    “3D models allow ocean engineers to do things such as assess the buoyancy and stability of a buoy prior to assembly and deployment into the water, and also visualize placement of the many component parts inside,” explains Vincent, associate professor of ocean engineering at the University of Rhode Island. “3D modeling offers a source of permanent documentation for future engineering changes.”

    After creating technical drawings, Copeland takes a multitude of photos of the sensor buoy equipment, which he will utilize in a 3D visualizing computer program.
    [Animated in the full article and in this blog’s RSS feed.]

    “We can share with the public what is happening “under the hood” of the buoys with the 3D models as well, which is a great opportunity for outreach.”

    For Copeland, the challenge is using current modeling technology to develop the most detailed 3D representations possible.

    “When you start to rebuild an object digitally, you learn what 3D tools can and can’t do,” he says. “While I am trying to think about how the project can grow, I also want to generate 3D assets that are useful to all of the consortium.”

    Funded by a $19 million grant from the NSF through EPSCoR, along with a $3.8 million state match, the consortium is a collaboration of engineers, scientists, designers and communicators from eight higher education institutions—University of Rhode Island (lead), Brown University, Bryant University, Providence College, Rhode Island College, Rhode Island School of Design, Roger Williams University, and Salve Regina University—developing a new research infrastructure across the state to assess, predict and respond to the effects of climate variability on coastal ecosystems.

    Working together with businesses and area communities, the consortium seeks to position Rhode Island as a center of excellence for researchers on Narragansett Bay and beyond.
    For more information about the consortium and its researchers at institutions across the state, including URI, visit

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Rhode Island is a diverse and dynamic community whose members are connected by a common quest for knowledge.

    As a major research university defined by innovation and big thinking, URI offers its undergraduate, graduate, and professional students distinctive educational opportunities designed to meet the global challenges of today’s world and the rapidly evolving needs of tomorrow. That’s why we’re here.

    The University of Rhode Island, commonly referred to as URI, is the flagship public research as well as the land grant and sea grant university for the state of Rhode Island. Its main campus is located in the village of Kingston in southern Rhode Island. Additionally, smaller campuses include the Feinstein Campus in Providence, the Rhode Island Nursing Education Center in Providence, the Narragansett Bay Campus in Narragansett, and the W. Alton Jones Campus in West Greenwich.

    The university offers bachelor’s degrees, master’s degrees, and doctoral degrees in 80 undergraduate and 49 graduate areas of study through eight academic colleges. These colleges include Arts and Sciences, Business Administration, Education and Professional Studies, Engineering, Health Sciences, Environment and Life Sciences, Nursing and Pharmacy. Another college, University College for Academic Success, serves primarily as an advising college for all incoming undergraduates and follows them through their first two years of enrollment at URI.

    The University enrolled about 13,600 undergraduate and 3,000 graduate students in Fall 2015. U.S. News & World Report classifies URI as a tier 1 national university, ranking it tied for 161st in the U.S.

  • richardmitnick 11:30 am on November 16, 2018 Permalink | Reply
    Tags: , , , Shedding new light on photosynthesis, , University of Michigan researchers have developed a powerful microscope that can map how light energy migrates in photosynthetic bacteria on timescales of one-quadrillionth of a second.   

    From University of Michigan: “Shedding new light on photosynthesis” 

    U Michigan bloc

    From University of Michigan

    October 11, 2018
    Morgan Sherburne

    Employing a series of ultrashort laser pulses, a new microscope reveals intricate details that govern photosynthetic processes in purple bacteria. Image credit: Vivek Tiwari, Yassel Acosta and Jennifer Ogilvie

    University of Michigan researchers have developed a powerful microscope that can map how light energy migrates in photosynthetic bacteria on timescales of one-quadrillionth of a second.

    The microscope could help researchers develop more efficient organic photovoltaic materials, a type of solar cell that could provide cheaper energy than silicon-based solar cells.

    In photosynthetic plants and bacteria, light hits the leaf or bacterium, and a system of tiny light-harvesting antennae shuttles the energy along through proteins to what’s called a reaction center. Here, light is “trapped” and turned into metabolic energy for the organism.

    Jennifer Ogilvie, U-M professor of physics and biophysics, and her team want to capture the movement of this light energy through proteins in a cell, and the team has taken one step toward that goal in developing this microscope. Their study has been published in Nature Communications.

    Ogilvie, graduate student Yassel Acosta and postdoctoral fellow Vivek Tiwari worked together to develop the microscope, which uses a method called two-dimensional electronic spectroscopy to generate images of energy migration within proteins during photosynthesis. The microscope images an area the size of one-fifth of a human blood cell and can capture events that take a period of one-quadrillionth of a second.

    Two-dimensional spectroscopy works by reading the energy levels within a system in two ways. First, it reads the wavelength of light that’s absorbed in a photosynthetic system. Then, it reads the wavelength of light detected within the system, allowing energy to be tracked as it flows through the organism.

    The instrument combines this method with a microscope to measure a signal from volumes nearly a million times smaller than before. Previous measurements averaged samples over sections a million times larger, which obscures the different ways energy might be moving within the same system.

    “We’ve now combined both of those techniques so we can get at really fast processes as well as really detailed information about how these molecules are interacting,” Ogilvie said. “If I look at one nanoscopic region of my sample versus another, the spectroscopy can look very different. Previously, I didn’t know that, because I only got the average measurement. I couldn’t learn about the differences, which can be important for understanding how the system works.”
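The logic of a 2D spectrum can be sketched with a toy example: one axis is the wavelength absorbed (excitation), the other the wavelength later detected. Diagonal peaks mark energy absorbed and detected at the same band, while off-diagonal “cross peaks” reveal energy that migrated between bands. The wavelengths and amplitudes below are invented purely for illustration.

```python
# Toy 2D-spectroscopy map: (excitation_nm, detection_nm) -> signal amplitude.
# Real 2D electronic spectroscopy builds this map from sequences of
# femtosecond laser pulses; here we just hand-construct an example.

bands_nm = [800, 850]      # two hypothetical absorption bands
spectrum = {}

for band in bands_nm:
    spectrum[(band, band)] = 1.0   # diagonal peaks: absorb and emit at same band

# Energy transfer from the 800 nm band down to the 850 nm band shows up as an
# off-diagonal cross peak at (excite=800, detect=850).
spectrum[(800, 850)] = 0.4

cross_peaks = [k for k in spectrum if k[0] != k[1]]
print(cross_peaks)  # [(800, 850)] -> evidence of downhill energy migration
```

Tracking how such cross peaks grow over femtosecond delays is what lets researchers watch energy flow through the light-harvesting machinery.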

    In developing the microscope, Ogilvie and her team studied colonies of photosynthetic purple bacterial cells. Previously, scientists have mainly looked at purified parts of these types of cells. By looking at an intact cell system, Ogilvie and her team were able to observe how a complete system’s different components interacted.

    The team also studied bacteria that had been grown in high light conditions, low light conditions and a mixture of both. By tracking light emitted from the bacteria, the microscope enabled them to view how the energy level structure and flow of energy through the system changed depending on the bacteria’s light conditions.

    Similarly, this microscope can help scientists understand how organic photovoltaic materials work, Ogilvie says. Instead of the light-harvesting antennae complexes found in plants and bacteria, organic photovoltaic materials have what are called “donor” molecules and “acceptor” molecules. When light travels through these materials, the donor molecule sends electrons to acceptor molecules, generating electricity.

    “We might find there are regions where the excitation doesn’t produce a charge that can be harvested, and then we might find regions where it works really well,” Ogilvie said. “If we look at the interactions between these components, we might be able to correlate the material’s morphology with what’s working well and what isn’t.”

    In organisms, these zones occur because one area of the organism might not be receiving as much light as another area, and therefore is packed with light-harvesting antennae and few reaction centers. Other areas might be flooded with light, and bacteria may have fewer antennae—but more reaction centers. In photovoltaic material, the distribution of donor and receptor molecules may change depending on the material’s morphology. This could affect the material’s efficiency in converting light into electricity.

    “All of these materials have to have different components that do different things—components that will absorb the light, and components that will take the energy from the light and convert it to something that can be used, like electricity,” Ogilvie said. “It’s a holy grail to be able to map in space and time the exact flow of energy through these systems.”

    See the full article here.



    Stem Education Coalition

    U MIchigan Campus

    The University of Michigan (U-M, UM, UMich, or U of M), frequently referred to simply as Michigan, is a public research university located in Ann Arbor, Michigan, United States. Originally founded in 1817 in Detroit as the Catholepistemiad, or University of Michigania, 20 years before the Michigan Territory officially became a state, the University of Michigan is the state’s oldest university. The university moved to Ann Arbor in 1837 onto 40 acres (16 ha) of what is now known as Central Campus. Since its establishment in Ann Arbor, the university campus has expanded to include more than 584 major buildings with a combined area of more than 34 million gross square feet (781 acres or 3.16 km²), and has two satellite campuses located in Flint and Dearborn. The University was one of the founding members of the Association of American Universities.

    Considered one of the foremost research universities in the United States, the university has very high research activity and its comprehensive graduate program offers doctoral degrees in the humanities, social sciences, and STEM fields (Science, Technology, Engineering and Mathematics) as well as professional degrees in business, medicine, law, pharmacy, nursing, social work and dentistry. Michigan’s body of living alumni (as of 2012) comprises more than 500,000. Besides academic life, Michigan’s athletic teams compete in Division I of the NCAA and are collectively known as the Wolverines. They are members of the Big Ten Conference.

  • richardmitnick 10:41 am on November 16, 2018 Permalink | Reply
    Tags: , , , , Texas Petawatt Laser,   

    From University of Texas at Austin: “UT Austin Selected for New Nationwide High-Intensity Laser Network” 

    U Texas Austin bloc

    From University of Texas at Austin

    30 October 2018
    Marc G Airhart

    The Texas Petawatt Laser, among the most powerful in the U.S., will be part of a new national network funded by the Dept. of Energy, named LaserNetUS. Credit: University of Texas at Austin.

    The University of Texas at Austin will be a key player in LaserNetUS, a new national network of institutions operating high-intensity, ultrafast lasers. The overall project, funded over two years with $6.8 million from the U.S. Department of Energy’s Office of Fusion Energy Sciences, aims to help boost the country’s global competitiveness in high-intensity laser research.

    UT Austin is home to one of the most powerful lasers in the country, the Texas Petawatt Laser. The university will receive $1.2 million to fund its part of the network.

    “UT Austin has become one of the international leaders in research with ultra-intense lasers, having operated one of the highest-power lasers in the world for the past 10 years,” said Todd Ditmire, director of UT Austin’s Center for High Energy Density Science, which houses the Texas Petawatt Laser. “We can play a major role in the new LaserNetUS network with our established record of leadership in this exciting field of science.”

    High-intensity lasers have a broad range of applications in basic research, manufacturing and medicine. For example, they can be used to re-create some of the most extreme conditions in the universe, such as those found in supernova explosions and near black holes. They can generate particles for high-energy physics research or intense X-ray pulses to probe matter as it evolves on ultrafast time scales. They are also promising in many potential technological areas such as generating intense neutron bursts to evaluate aging aircraft components, precisely cutting materials or potentially delivering tightly focused radiation therapy to cancer tumors.

    LaserNetUS includes the most powerful lasers in the United States, some of which have powers approaching or exceeding a petawatt. Petawatt lasers generate light with at least a million billion watts of power, or nearly 100 times the output of all the world’s power plants — but only in the briefest of bursts. Using the technology pioneered by two of the winners of this year’s Nobel Prize in physics, called chirped pulse amplification, these lasers fire off ultrafast bursts of light shorter than a tenth of a trillionth of a second.
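The petawatt figures are easy to sanity-check: a petawatt is 10¹⁵ watts, but delivered over an ultrashort chirped-pulse burst the energy per pulse stays modest. The 100-femtosecond duration below is an assumed illustrative value consistent with the article’s “shorter than a tenth of a trillionth of a second.”

```python
# Order-of-magnitude check: peak power times pulse duration gives pulse energy.
# The duration is an assumed value for illustration, not a quoted spec.

peak_power_w = 1e15          # one petawatt: a million billion watts
pulse_duration_s = 100e-15   # 100 femtoseconds (assumed)

pulse_energy_j = peak_power_w * pulse_duration_s
print(round(pulse_energy_j))  # ~100 joules per pulse, despite the colossal peak power
```

That contrast is the point of chirped pulse amplification: enormous instantaneous power packed into so little time that the total energy remains manageable.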

    “I am particularly excited to lead the Texas Petawatt science effort into the next phase of research under this new, LaserNetUS funding,” said Ditmire. “This funding will enable us to collaborate with some of the leading optical and plasma physics scientists from around the U.S.”

    LaserNetUS will provide U.S. scientists increased access to the unique high-intensity laser facilities at nine institutions: UT Austin, The Ohio State University, Colorado State University, the University of Michigan, University of Nebraska-Lincoln, University of Rochester, SLAC National Accelerator Laboratory, Lawrence Berkeley National Laboratory and Lawrence Livermore National Laboratory.

    The U.S. was the dominant innovator and user of high-intensity laser technology in the 1990s, but now Europe and Asia have taken the lead, according to a recent report from the National Academies of Sciences, Engineering and Medicine titled “Opportunities in Intense Ultrafast Lasers: Reaching for the Brightest Light.” Currently, 80 to 90 percent of the world’s high-intensity ultrafast laser systems are overseas, and all of the highest-power research lasers currently in construction or already built are also overseas. The report’s authors recommended establishing a national network of laser facilities to emulate successful efforts in Europe. LaserNetUS was established for exactly that purpose.

    The Office of Fusion Energy Sciences is a part of the Department of Energy’s Office of Science.

    LaserNetUS will hold a nationwide call for proposals for access to the network’s facilities. The proposals will be peer reviewed by an independent panel. This call will allow any researcher in the U.S. to get time on one of the high-intensity lasers at the LaserNetUS host institutions.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Texas Austin campus

    U Texas at Austin

    In 1839, the Congress of the Republic of Texas ordered that a site be set aside to meet the state’s higher education needs. After a series of delays over the next several decades, the state legislature reinvigorated the project in 1876, calling for the establishment of a “university of the first class.” Austin was selected as the site for the new university in 1881, and construction began on the original Main Building in November 1882. Less than one year later, on Sept. 15, 1883, The University of Texas at Austin opened with one building, eight professors, one proctor, and 221 students — and a mission to change the world. Today, UT Austin is a world-renowned higher education, research, and public service institution serving more than 51,000 students annually through 18 top-ranked colleges and schools.

  • richardmitnick 9:58 am on November 16, 2018 Permalink | Reply
    Tags: ACC- Antarctic Circumpolar Current, , , ,   

    From CSIROscope: “Explainer: how the Antarctic Circumpolar Current helps keep Antarctica frozen” 

    CSIRO bloc

    From CSIROscope

    16 November 2018
    Helen Phillips
    Benoit Legresy
    Nathan Bindoff

    The Antarctic Circumpolar Current, or ACC, is the strongest ocean current on our planet. It extends from the sea surface to the bottom of the ocean, and encircles Antarctica.

    It is vital for Earth’s health because it keeps Antarctica cool and frozen. It is also changing as the world’s climate warms. Scientists like us are studying the current to find out how it might affect the future of Antarctica’s ice sheets, and the world’s sea levels.

    The ACC carries an estimated 165 million to 182 million cubic metres of water every second (165 to 182 Sverdrups, where one Sverdrup is one million cubic metres per second) from west to east, more than 100 times the flow of all the rivers on Earth. It provides the main connection between the Indian, Pacific and Atlantic Oceans.
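    The transport figures above can be checked with a few lines of arithmetic. This sketch converts the ACC's transport into Sverdrups and compares it with an assumed estimate of total global river discharge (the ~1.2 million cubic metres per second value is an illustrative assumption, not a figure from the article):

    ```python
    # Express the ACC's transport in Sverdrups (1 Sv = 1e6 m^3/s) and
    # compare it with a rough estimate of total global river discharge.
    SVERDRUP = 1e6  # cubic metres per second

    acc_low, acc_high = 165e6, 182e6  # ACC transport range from the article (m^3/s)
    rivers = 1.2e6                    # assumed global river discharge (~1.2 Sv)

    print(f"ACC transport: {acc_low / SVERDRUP:.0f}-{acc_high / SVERDRUP:.0f} Sv")
    print(f"Ratio to all rivers: about {acc_low / rivers:.0f}x or more")
    ```

    Even at the low end of the range, the ratio comfortably exceeds the "more than 100 times" quoted in the article.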

    The tightest geographical constriction through which the current flows is Drake Passage, where only 800 km separates South America from Antarctica. While elsewhere the ACC appears to have a broad domain, it must also navigate steep undersea mountains that constrain its path and steer it north and south across the Southern Ocean.

    Scientists deploying a vertical microstructure profiler (VMP-2000), which measures temperature, salinity, pressure and turbulence, from RV Investigator in the Antarctic Circumpolar Current, November 2018. Photo credit: Nathan Bindoff.

    What is the Antarctic Circumpolar Current?

    A satellite view over Antarctica reveals a frozen continent surrounded by icy waters. Moving northward, away from Antarctica, the water temperatures rise slowly at first and then rapidly across a sharp gradient. It is the ACC that maintains this boundary.

    Map of the ocean surface temperature as measured by satellites and analysed by the European Copernicus Marine Services. The sea ice extent around the Antarctic continent for this day appears in light blue. The two black lines indicate the long-term positions of the southern and northern fronts of the Antarctic Circumpolar Current.

    The ACC is created by the combined effects of strong westerly winds across the Southern Ocean, and the big change in surface temperatures between the Equator and the poles.

    Ocean density increases as water gets colder and saltier. The warm, salty surface waters of the subtropics are much lighter than the cold, fresher waters close to Antarctica. Surfaces of constant density can therefore be pictured as sloping upward towards Antarctica.

    The westerly winds make this slope steeper, and the ACC rides eastward along it: faster where the slope is steeper, and slower where it is flatter.
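    The density contrast driving that slope can be illustrated with a linear equation of state, a standard textbook simplification. All coefficients and water properties below are illustrative assumptions, not values from the article:

    ```python
    # Approximate seawater density with a linear equation of state:
    #   rho = rho0 * (1 - alpha*(T - T0) + beta*(S - S0))
    # Coefficients and reference values are rough, illustrative choices.
    rho0, T0, S0 = 1027.0, 10.0, 35.0  # reference density (kg/m^3), temperature (C), salinity (psu)
    alpha = 2.0e-4                     # thermal expansion coefficient (1/K)
    beta = 7.6e-4                      # haline contraction coefficient (1/psu)

    def density(T, S):
        """Approximate seawater density (kg/m^3) via a linear equation of state."""
        return rho0 * (1 - alpha * (T - T0) + beta * (S - S0))

    rho_subtropical = density(20.0, 35.5)  # warm, salty subtropical surface water
    rho_antarctic = density(0.0, 34.0)     # cold, fresher water near Antarctica

    print(f"subtropical: {rho_subtropical:.2f} kg/m^3, Antarctic: {rho_antarctic:.2f} kg/m^3")
    ```

    Even though the Antarctic water is fresher, its coldness wins out and it comes out denser, which is why density surfaces shoal towards the continent.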

    Fronts and bottom water

    In the ACC there are sharp changes in water density known as fronts. The Subantarctic Front to the north and Polar Front further south are the two main fronts of the ACC (the black lines in the images). Both are known to split into two or three branches in some parts of the Southern Ocean, and merge together in other parts.

    Scientists can infer the density and speed of the current by measuring the height of the sea surface with satellite altimeters. Denser waters sit lower and lighter waters stand taller, and differences in sea-surface height give the speed of the current.
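    The link between sea-surface height and current speed is the geostrophic balance, u = -(g / f) * d(eta)/dy. This sketch evaluates it for an assumed ACC-like front (the 1 m height rise over 100 km and the 55°S latitude are illustrative numbers, not measurements from the article):

    ```python
    import math

    # Geostrophic speed from a sea-surface-height (SSH) gradient:
    #   u = -(g / f) * d(eta)/dy
    g = 9.81                         # gravitational acceleration (m/s^2)
    omega = 7.292e-5                 # Earth's rotation rate (rad/s)
    lat = -55.0                      # assumed latitude within the ACC (degrees)
    f = 2 * omega * math.sin(math.radians(lat))  # Coriolis parameter (negative in the south)

    deta_dy = 1.0 / 100_000          # SSH rising 1 m over 100 km towards the north
    u = -(g / f) * deta_dy           # eastward geostrophic speed (m/s)

    print(f"f = {f:.2e} 1/s, u = {u:.2f} m/s eastward")
    ```

    With these numbers the speed comes out a bit under 1 m/s eastward, in line with the strong jets observed at the ACC's fronts. Note that because f is negative in the Southern Hemisphere, a surface that is higher to the north drives flow to the east.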

    Map of how fast the waters around Antarctica are moving in an easterly direction. It is produced using 23 years of satellite altimetry (ocean height) observations as provided by the European Copernicus Marine Services. Author provided.

    The path of the ACC is a meandering one, because of the steering effect of the sea floor, and also because of instabilities in the current.

    The ACC also plays a part in the meridional (or global) overturning circulation, which brings deep waters formed in the North Atlantic southward into the Southern Ocean. Once there it becomes known as Circumpolar Deep Water, and is carried around Antarctica by the ACC. It slowly rises toward the surface south of the Polar Front.

    Once it surfaces, some of the water flows northward again and sinks north of the Subantarctic Front. The rest flows toward Antarctica, where it is transformed into the densest water in the ocean, sinking to the sea floor and flowing northward in the abyss as Antarctic Bottom Water. These pathways are the main way the oceans absorb heat and carbon dioxide and sequester them in the deep ocean.

    Changing current

    The ACC is not immune to climate change. The Southern Ocean has warmed and freshened in the upper 2,000 m. Rapid warming and freshening has also been found in the Antarctic Bottom Water, the deepest layer of the ocean.

    Waters south of the Polar Front are becoming fresher due to increased rainfall there, and waters to the north of the Polar Front are becoming saltier due to increased evaporation. These changes are caused by human activity, primarily through adding greenhouse gases to the atmosphere, and depletion of the ozone layer. The ozone hole is now recovering but greenhouse gases continue to rise globally.

    Winds have strengthened by about 40% over the Southern Ocean over the past 40 years. Surprisingly, this has not translated into an increase in the strength of the ACC. Instead there has been an increase in eddies that move heat towards the pole, particularly in hotspots such as Drake Passage, Kerguelen Plateau, and between Tasmania and New Zealand.

    We have observed much change already. The question now is how this increased transfer of heat across the ACC will impact the stability of the Antarctic ice sheet, and consequently the rate of global sea-level rise.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition


    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.
