Tagged: Horizon – The EU Research and Innovation Magazine

  • richardmitnick 1:25 pm on March 29, 2019 Permalink | Reply
    Tags: "Fleets of autonomous satellites to coordinate tasks among themselves", , , For the cost of a multi-tonne satellite it is possibile to use groups of small satellites – even hundreds of them – to set up a sensor network., Horizon - The EU Research and Innovation Magazine, MiRAGE software, NetSat project Würzburg University and Centre for Telematics, Researchers are using automation and artificial intelligence to make smaller autonomous satellites smarter and more effective.   

    From Horizon The EU Research and Innovation Magazine: “Fleets of autonomous satellites to coordinate tasks among themselves” 


    From Horizon The EU Research and Innovation Magazine

    27 March 2019
    Rex Merrifield

    Smaller, autonomous satellites could help analyse the internal structure of clouds to give a more detailed view of Earth’s changing climate. Image credit – Earth Science and Remote Sensing Unit, NASA Johnson Space Center


    NASA Johnson Space Center, Houston TX, USA campus

    Space missions have long benefited from some autonomous operations being carried out aboard spacecraft, but with a sharp increase expected in the number of satellites being launched in the next few years, researchers are using automation and artificial intelligence to make them smarter and more effective.

    Technology firms and researchers see scope for giving satellites more onboard control, to circumvent difficulties in communicating with Earth and reduce the need for continuous hands-on supervision and intervention from afar. That will reduce operating costs and potentially allow them to do more sophisticated tasks independently of their Earth-bound supervisors.

    Smaller, autonomous spacecraft could close the gaps in coverage between much larger, more expensive telecommunications satellites, or be used in formations to monitor space weather or observe Earth from different perspectives simultaneously – such as three-dimensional real-time analysis of clouds or monitoring volcanic plumes.

    In doing so, they would be able to correct and maintain their trajectory, avoid collisions and supervise their on-board systems all on their own – all at a substantially lower operating cost.

    Professor Klaus Schilling, chair of robotics and telematics at Würzburg University in Germany, has been working on the technology for groups of small, autonomous satellites to fly in formation, communicating directly with each other to organise and coordinate tasks. Success would mark a world first.

    Network

    For the cost of a multi-tonne satellite, he sees the possibility to use groups of small satellites – even hundreds of them – to set up a sensor network. The fleet would need more advanced coordination and control, but would be able to provide better temporal and spatial resolution than one giant craft.

    While miniaturisation can present difficulties for satellites, such as susceptibility to noise in electronic circuits, sophisticated software can detect and correct these problems and cooperation between small spacecraft can also enhance their capabilities, Prof. Schilling says.

    “This is even the case with a single satellite, but it becomes critical at the multi-satellite level, in the context of the formation,” said Prof. Schilling, who also heads the German research firm the Centre for Telematics.

    His NetSat project aims to launch four small satellites at the end of this year, to orbit the Earth and test formations with varying degrees of autonomy, with light-touch supervision from ground control.

    The satellites will be around 3 kilograms each – a mere fraction of the size of the biggest satellites – and will be placed in a low Earth orbit, about 600 kilometres above the surface.

    To date, Prof. Schilling and his team have used satellites already in orbit to develop and demonstrate systems for communication, positioning and orientation, and they are currently testing an electrical propulsion system for NetSat.

    The technology also incorporates two decades of learning from research into controlling formations of mobile robots, extending into three dimensions the swarm-like behaviour used to coordinate terrestrial rovers.

Klaus Schilling with the first German pico-satellite (a satellite with a mass of 1 kg), designed and realised by his team in 2005. Image credit – University of Würzburg

    Coordinate

    The NetSat spacecraft will be able to coordinate with each other over distances from about 100 kilometres down to 10 metres, as well as change their formation depending on the tasks they need to perform.
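To give a feel for the relative-motion problem behind such formation flying, the sketch below propagates the classic Clohessy-Wiltshire (Hill) equations for a deputy satellite moving relative to a chief in a circular orbit at roughly 600 km altitude, and uses a simple proportional-derivative controller to steer it to a desired along-track station. This is only an illustration of the kind of dynamics involved; the gains, thrust levels and target geometry are invented for the example and are not NetSat's actual guidance parameters.

```python
import numpy as np

# Toy formation-keeping sketch using the Clohessy-Wiltshire (Hill) equations
# for motion relative to a chief satellite in a circular ~600 km orbit.
# Gains, thrust levels and the target offset are invented for illustration
# (and imply far more thrust than a 3 kg CubeSat could really produce).

MU = 3.986004418e14            # Earth's gravitational parameter [m^3 s^-2]
A = (6378.0 + 600.0) * 1e3     # chief orbital radius [m]
N = np.sqrt(MU / A**3)         # mean motion [rad/s]

def hcw_derivative(state, u):
    """State derivative in the Hill frame (x radial, y along-track, z cross-track)."""
    x, y, z, vx, vy, vz = state
    ax = 3 * N**2 * x + 2 * N * vy + u[0]
    ay = -2 * N * vx + u[1]
    az = -(N**2) * z + u[2]
    return np.array([vx, vy, vz, ax, ay, az])

def pd_control(state, target, kp=1e-4, kd=2e-2):
    """Proportional-derivative acceleration command toward a desired relative position."""
    return -kp * (state[:3] - target) - kd * state[3:]

state = np.array([0.0, -1000.0, 0.0, 0.0, 0.0, 0.0])   # start 1 km behind the chief
target = np.array([0.0, -100.0, 0.0])                   # desired station: 100 m behind

dt = 1.0                                                # [s]; forward Euler is fine for a sketch
for _ in range(5000):
    u = pd_control(state, target)
    state = state + dt * hcw_derivative(state, u)

print("relative position after ~83 min [m]:", np.round(state[:3], 1))
```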

    ‘For us it will be like having a laboratory in space, where we can do a lot of operation tests, a lot of control tests and a lot of sensor tests, which will help us for future missions,’ Prof. Schilling said.

    NetSat works by distributing computing power between satellites in a formation, but another approach under exploration is to use artificial intelligence (AI) to increase satellite autonomy.

    AI can make a satellite aware of its surroundings and decide autonomously when and how to carry out operational tasks, such as gathering images, analysing and processing them, and then selecting only the essential data for downloading to the Earth station.

    The aim could be to identify specific targets that can be monitored or tracked, perhaps a building or a ship or a vehicle on the surface of the Earth, or filtering out clouds to improve image quality.
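As a rough illustration of that selection step, the sketch below filters a batch of captured images on board and keeps only the mostly cloud-free ones containing detected targets, within a fixed downlink budget. The cloud-fraction and target-count fields stand in for the output of on-board models; the article does not describe MiRAGE's actual pipeline or thresholds, so everything here is an assumption.

```python
# Minimal sketch of on-board data selection: score each captured image and
# downlink only the ones worth the bandwidth. The cloud-cover estimate and
# target count are stand-in values, not output from AIKO's real models.
from dataclasses import dataclass
from typing import List

@dataclass
class Capture:
    image_id: str
    cloud_fraction: float    # 0.0 = clear, 1.0 = fully overcast (assumed pre-computed on board)
    targets_detected: int    # e.g. ships or vehicles found by an on-board detector

def select_for_downlink(captures: List[Capture],
                        max_cloud: float = 0.3,
                        budget: int = 2) -> List[Capture]:
    """Keep mostly clear images that contain targets, best ones first, within a downlink budget."""
    useful = [c for c in captures if c.cloud_fraction <= max_cloud and c.targets_detected > 0]
    useful.sort(key=lambda c: (-c.targets_detected, c.cloud_fraction))
    return useful[:budget]

captures = [
    Capture("img_001", cloud_fraction=0.90, targets_detected=3),  # overcast: rejected
    Capture("img_002", cloud_fraction=0.10, targets_detected=0),  # clear but empty: rejected
    Capture("img_003", cloud_fraction=0.20, targets_detected=2),  # kept
    Capture("img_004", cloud_fraction=0.05, targets_detected=1),  # kept
]
print([c.image_id for c in select_for_downlink(captures)])  # ['img_003', 'img_004']
```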

    Such a satellite could also recognise new events that need monitoring or anomalies demanding action, says Dr Lorenzo Feruglio, founder and chief executive of Italian space-technology start-up AIKO, based in Turin.

    ‘In a sense you need to detect the conditions and what is happening and then you react to those conditions autonomously, using AI rather than traditional algorithms,’ Dr Feruglio said.

    He leads a project called MiRAGE, which is using AI tools such as deep learning to automate satellite operations.

    Lower cost

    Such smart, AI-based on-board systems ensure the spacecraft can complete its tasks without the delays involved in awaiting new instructions or decisions from ground control, which can then focus on critical issues rather than routine tasks – with sharply reduced staffing levels and at much lower cost.

    The MiRAGE software, some of which has its roots in the functionality demanded by drones or autonomous cars, will be launched as an on-board experiment on a small satellite in the last quarter of this year, with a view to being rolled out on larger spacecraft in future. One of the aims is to demonstrate the adaptability of AI to different tasks and mission objectives – including the possibility of deep space exploration.

‘In general, AI and deep learning are proving their worth in many different industries and the benefits (for space missions) are far from being fully explored yet,’ Dr Feruglio added.

See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 10:05 am on February 22, 2019 Permalink | Reply
    Tags: "Fuel buildup", Erratic firestorms known as pyroCbs, Horizon - The EU Research and Innovation Magazine   

    From Horizon The EU Research and Innovation Magazine: “‘It eats everything’ – the new breed of wildfire that’s impossible to predict” 


    From Horizon The EU Research and Innovation Magazine

    21 February 2019
    Annette Ekin

    The 2017 Chilean wildfires, along with those in Portugal, were confirmation that the new type of fire was here to stay. Image credit – Pablo Trincado, licensed under CC BY 2.0

    We’re fighting a different kind of wildfire whose behaviour experts are struggling to predict.

    Climate change and negligent forest management are causing higher-intensity, faster-moving fires that can generate enough energy to evolve into erratic firestorms, known as pyroCbs, in the face of which first responders can do little.

    “Traditionally we could predict the fire behaviour and the direction of the fire but under those conditions and those moments it’s not possible,” said Marc Castellnou, president of the Spanish independent wildfire prevention group Pau Costa Foundation.

    As a wildland fire analyst with the Catalan fire services, Castellnou reconstructs wildfires using simulations, satellite, on-the-ground and other data.

This kind of wildfire behaves differently from those of the past, he says. ‘It eats everything.’

While these fires are rare, when one strikes it can release energy at a rate of 100,000 kilowatts per metre of fire front. In firefighting terms, this is 10 times what a firefighter can handle, and even at 4,000 kilowatts per metre, firefighters cannot go near the flames and require aerial support. “The old way of fighting fires by sending firefighters – that’s gone,” Castellnou said.
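To put those kilowatt figures in context, fire scientists usually quote Byram's fireline intensity, I = H × w × r, the heat released per metre of fire front. A quick calculation with round, illustrative numbers (not measurements from the fires discussed here) shows how easily a fast-moving fire in heavy fuel reaches the intensities Castellnou describes.

```python
# Rough sanity check of the intensities quoted above, using Byram's
# fireline-intensity formula I = H * w * r (kW per metre of fire front).
# The heat yield, fuel load and spread rate below are illustrative values.

H = 18000.0   # heat yield of forest fuels [kJ/kg], a commonly used round figure
w = 2.5       # fuel consumed in the flaming front [kg/m^2]
r = 2.2       # rate of spread [m/s], roughly 8 km/h for a fast-moving fire

intensity = H * w * r     # kW per metre of fire front
print(f"fireline intensity = {intensity:,.0f} kW/m")   # about 99,000 kW/m

# For comparison, direct attack by ground crews is generally considered
# impossible above roughly 4,000 kW/m, which is why fires of this intensity
# can only be worked around, not fought head-on.
```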

    New normal

    There have been signs of trouble since the 1990s, according to Castellnou.

    “This change has been cooking for a long time, but the first time we realised something wrong was happening were the years 2009 and 2012,” he said, referring to the Black Saturday bushfires in the Australian state of Victoria that killed 173 people and wildfires in Spain, Portugal, Chile and California, US. Many in the fire community initially thought these were just abnormal events, he says.

    But then wildfires in Chile and Portugal in 2017 indicated that those weren’t simply extreme years. “That was the new normal arriving. 2018 has confirmed that,” he said, referring to the deadly wildfires in Greece and in California.

On October 15, 2017, Castellnou was in central Portugal to conduct analysis and then support the local services as the wildfires became firestorms.

    “What I saw was the pace of the fires … You think: ‘Well that cannot be real.’ When you go there (and see the damage) you understand that that is the reality,” he said.

Castellnou, who spoke about the future of fighting wildfires at the EU’s security research event in December 2018, first joined the Catalan fire and rescue services as a seasonal firefighter when he was a teenager. In the past, he says, a fire that destroyed 25,000 hectares a day was considered extreme. According to his figures, the October fires in Portugal consumed 220,000 hectares of forest, an area 22 times the size of Lisbon, and killed more than 40 people. Castellnou says that at their peak, the wildfires burned at a rate of 10,000 hectares per hour over seven hours.

    “This is something that blew my mind and I cannot use technology to simulate that because models can’t predict it,” he said. The challenge is now predicting how they will behave, he says. “We’re still not there. We’re struggling.”

    Flammable

    Wildfire experts say that climate change, causing a long-term rise in temperature and less rainfall, is creating unprecedented flammable conditions that are making forests burn with more intensity. Wildfires now occur in the wintertime and affect regions in latitudes beyond the fire season-prone countries of Spain, Greece, Italy, Portugal and France. Castellnou says that wildfires are expected to affect highly populated areas like central Europe.

    “Last summer, it was the first time in history we were having wildfires in (nearly) every single country in Europe,” he said.

    “It’s not that climate change will create these new scenarios. No, no. The new scenario is already here, and it has come a lot faster than expected.”

    According to experts, urbanisation and poor forest management for reducing fuel – the grasses and shrubs that fires feed on – are also to blame.

    David Caballero, who also spoke at the security research event, assesses the wildfire risks in populated areas, focusing on the wildland-urban interface, where infrastructure and urban development intermingle with forests and other wildlands. He is contributing to a project called Clarity that is working to join up different IT systems to protect cities and infrastructures from the effects of climate change.

    He says we’re seeing more fast-growing, high-energy fires affecting populated areas.

    “We have to be prepared. Whenever we have forest in Europe, we eventually will have forest fires,” he said.

He travelled to the seaside village of Mati, Greece, in the immediate aftermath of Europe’s deadliest wildfires last year, which killed 99 people in the region of Attica. Speaking to firefighters and survivors, he learnt that many people did not expect the fires to cross the highway that runs parallel to the coast. In the past the fires had halted at this point, but this time they leapt across, burning through Mati.

    “There was an enormous amount of fuel due to the lack of management for 40 years,” he said. The fires tore through the village and reached the coast in just 20 minutes.

    Caballero says that all along the Mediterranean coast, unregulated construction with little regard for safety and evacuation routes and lax vegetation management mean that more places are at risk. He says local and regional authorities can no longer afford to be negligent. ‘We are living surrounded by fuel,’ he said.

    Culture of risk

    Pau Costa Foundation, established to speed up the sharing of information and know-how between fire services and society, works on a number of prevention campaigns. For a project called Heimdall, set up to contribute to an EU-wide information system about fires and other emergencies, the foundation is ensuring that the general public has a voice in shaping it.

One of the foundation’s aims is to change the social perception of wildfires. A tendency to fight every fire, small or large, has allowed vegetation to build up in the landscape artificially, Castellnou says. “Not all fire is bad,” he said. By clearing old trees, fires can make way for the growth of new forests that are adapted to climate change.

    Smaller fires, through activities such as prescribed burning, also have a role to play in creating scars in the land which break up a bigger fire’s path. “A mosaic of landscape of different ages and low-intensity fires is the best protection against the big fires,” he said.

    Oriol Vilalta, director of the foundation and a volunteer firefighter, says with wildfires killing more people in Europe, causing more than 200 deaths in the past three years, it’s time we learnt how to coexist with them.

    “We need to create a culture of risk. The Japanese know very well what to do in case of an earthquake, but we don’t know what to do in Europe with fires,” Vilalta said.

    In the past, the tendency was to evacuate people, but the general public must become part of the solution through self-protection, he says. “(That’s) what to do and what not to do, where to stay and where not to stay in case of a fire.”

    The research projects in this article are funded by the EU. If you liked this article, please consider sharing it on social media.

    What can be done? The expert view

Create an EU-wide programme to teach people how to live with wildfires and manage landscapes, similar to FireSmart in Canada or Firewise in the US [Don’t look to the USA for expertise; we do not have it. One of the biggest problems we have is the lack of removal of “fuel buildup”, the dead trees. I am a hiker and I see this on every trail I hike.].

    Establish long-term or permanent research structures to understand our future with fire.

    Help municipalities, firefighters and communities work together to raise awareness and knowledge about wildfire risks and how fires behave.

    Reduce the vegetation that makes forests flammable [especially dead “fuel buildup”], so fire and rescue services have the capacity to fight fires.

    Create an incentive to clear vegetation, such as constructing buildings from wood or using biomass to heat public buildings and hospitals.

    In the case of a fire, provide clear information for residents about when to evacuate and when to stay in their homes.

See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 10:16 am on September 14, 2018 Permalink | Reply
    Tags: , , , , , , Dark matter clusters could reveal nature of dark energy, Horizon - The EU Research and Innovation Magazine   

    From Horizon The EU Research and Innovation Magazine: “Dark matter clusters could reveal nature of dark energy” 


    From Horizon The EU Research and Innovation Magazine

    10 September 2018
    Jon Cartwright

Gravitational lensing in galaxy clusters such as Abell 370 is helping scientists to measure the dark matter distribution. Image credit – NASA, ESA, the Hubble SM4 ERO Team and ST-ECF

    Scientists are hoping to understand one of the most enduring mysteries in cosmology by simulating its effect on the clustering of galaxies.

    That mystery is dark energy – the phenomenon that scientists hypothesise is causing the universe to expand at an ever-faster rate. No-one knows anything about dark energy, except that it could be, somehow, blowing pretty much everything apart.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Meanwhile, dark energy has an equally shady cousin – dark matter.

    Dark Matter Research

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al

    Dark Matter Particle Explorer China

DEAP-3600 Dark Matter detector, suspended in SNOLAB, deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

ADMX Axion Dark Matter Experiment, U Washington

    This invisible substance appears to have been clustering around galaxies, and preventing them from spinning themselves apart, by lending them an extra gravitational pull.

    Such a clustering effect is in competition with dark energy’s accelerating expansion. Yet studying the precise nature of this competition might shed some light on dark energy.

    ‘Many dark energy models are already ruled out with current data,’ said Dr Alexander Mead, a cosmologist at the University of British Columbia in Vancouver, Canada, who is working on a project called Halo modelling. ‘Hopefully in future we can rule more out.’

    Gravitational lensing

    Currently, the only way dark matter can be observed is by looking for the effects of its gravitational pull on other matter and light. The intense gravitational field it produces can cause light to distort and bend over large distances – an effect known as gravitational lensing.

    By mapping the dark matter ​in distant parts of the cosmos, scientists can work out how much dark matter clustering there is – and in principle how that clustering is being affected by dark energy.
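As a toy version of 'working out how much clustering there is', the sketch below measures the two-point correlation function of a mock three-dimensional point set with the simple natural estimator xi = DD/RR - 1, comparing pair counts in the data against an unclustered random set. Real analyses start from lensing maps rather than point catalogues and use survey masks and more robust estimators such as Landy-Szalay; the mock data and binning here are invented.

```python
import numpy as np

# Toy clustering measurement: estimate xi(r) = DD/RR - 1 for a mock point set
# by comparing its pair counts (DD) against an unclustered random set (RR).
rng = np.random.default_rng(42)

def pair_counts(points, bins):
    """Histogram of pairwise separations between all unique pairs."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d = d[np.triu_indices_from(d, k=1)]           # unique pairs only
    counts, _ = np.histogram(d, bins=bins)
    return counts

n, box = 400, 100.0                                # points in a 100-unit box
clumps = rng.uniform(0, box, size=(20, 3))         # 20 cluster centres
data = (clumps[rng.integers(0, 20, n)] + rng.normal(0, 2.0, (n, 3))) % box
random = rng.uniform(0, box, size=(n, 3))          # unclustered comparison set

bins = np.linspace(1, 30, 15)
dd, rr = pair_counts(data, bins), pair_counts(random, bins)
xi = dd / np.maximum(rr, 1) - 1.0                  # large positive xi on small scales = clustered
for lo, hi, x in zip(bins[:-1], bins[1:], xi):
    print(f"r = {lo:4.1f}-{hi:4.1f}: xi = {x:+.2f}")
```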

    The link between gravitational lensing and dark matter clustering is not straightforward, however. To interpret the data from telescopes, scientists must refer to detailed cosmological models – mathematical representations of complex systems.

    Dr Mead is developing a clustering model that he hopes will have enough accuracy to distinguish between different dark-energy hypotheses.

    ‘An analogy I like a lot is with turbulence. In turbulent fluid flow you can talk about currents and eddies, which are nice words, but the reality of how fluid in a pipe goes from flowing calmly to flowing in a turbulent fashion is extremely complicated.’

    _________________________________________

    ‘If dark energy turns out to be a dynamical phenomenon this will have a profound implication not only on cosmology, but on our understanding of fundamental physics.’

    Dr Pier Stefano Corasaniti, Paris Observatory, France
    _________________________________________

    Fifth force

    One of the more exotic theories is that dark energy is the result of a hitherto undetected fifth force, in addition to nature’s four known forces – gravity, electromagnetism, and the strong and weak nuclear forces inside atoms.

    A more common hypothesis for dark energy, however, is known as the cosmological constant, which was put forward by Albert Einstein as part of his general theory of relativity. It is often believed to describe an all-pervading sea of virtual particles that are continually popping into and out of existence throughout the universe.

    One way to rule out the cosmological constant hypothesis, of course, is to prove that dark energy is not constant at all. This is the goal of Dr Pier Stefano Corasaniti of the Paris Observatory in France, who – in a project called EDECS – is approaching dark-matter clustering from a different direction.

    Instead of attempting to model clustering from gravitational lensing data, he is beginning specifically with a dynamical – that is, not constant – hypothesis of dark energy, and trying to predict how dark matter would cluster if this was the case.
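One concrete way to make 'dynamical dark energy' precise is to let its equation of state vary with the scale factor, for example through the widely used CPL form w(a) = w0 + wa(1 - a), and then see how the expansion rate departs from the cosmological-constant case. The sketch below does exactly that for a flat universe; the parameter values are illustrative and are not the models simulated in EDECS.

```python
import numpy as np

# Sketch: how a time-varying ("dynamical") dark energy changes the expansion
# rate compared with a cosmological constant, using the common CPL
# parameterisation w(a) = w0 + wa * (1 - a). Parameter values are illustrative.

OMEGA_M = 0.3
OMEGA_DE = 0.7   # flat universe, radiation neglected

def dark_energy_density(a, w0, wa):
    """rho_DE(a) / rho_DE(today) for w(a) = w0 + wa * (1 - a)."""
    return a ** (-3.0 * (1.0 + w0 + wa)) * np.exp(-3.0 * wa * (1.0 - a))

def E(a, w0=-1.0, wa=0.0):
    """H(a)/H0 for a flat universe containing matter and dark energy."""
    return np.sqrt(OMEGA_M * a**-3 + OMEGA_DE * dark_energy_density(a, w0, wa))

for a in (0.25, 0.5, 1.0):          # scale factor: a = 1 is today
    z = 1.0 / a - 1.0
    print(f"z = {z:3.0f}:  E_LCDM = {E(a):5.2f}   "
          f"E_dynamical(w0=-0.9, wa=-0.5) = {E(a, -0.9, -0.5):5.2f}")
```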

    Pushing the limits

    There are, in principle, infinite ways dark energy can vary in space and time, although many theories have already been ruled out by existing observations. Dr Corasaniti is focussing his simulations on types of dynamical dark energy that push at the edges of these observational limits, paving the way for tests with future experiments.

The simulations, which trace the evolution of vast numbers of ‘N-body’ dark matter particles, require supercomputers running for long periods of time and processing several petabytes of data (a petabyte is one thousand million million bytes).
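For readers who have never seen one, an N-body code in its most stripped-down form is just pairwise gravity plus a time integrator. The sketch below evolves a few dozen softened particles with a kick-drift-kick leapfrog; production cosmological codes differ enormously (tree or particle-mesh gravity, periodic boxes, an expanding background, billions of particles), so this only shows the basic loop.

```python
import numpy as np

# Minimal direct-summation N-body sketch: softened pairwise gravity evolved
# with a kick-drift-kick leapfrog. Units and particle numbers are arbitrary.
rng = np.random.default_rng(0)

G, SOFT, DT, STEPS = 1.0, 0.05, 0.01, 500
n = 64
pos = rng.uniform(-1, 1, (n, 3))
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)

def accel(pos):
    """Softened gravitational acceleration on every particle from every other."""
    dx = pos[None, :, :] - pos[:, None, :]                    # r_j - r_i
    r2 = (dx**2).sum(-1) + SOFT**2
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                             # no self-force
    return G * (dx * inv_r3[..., None] * mass[None, :, None]).sum(axis=1)

acc = accel(pos)
for _ in range(STEPS):                                        # kick-drift-kick leapfrog
    vel += 0.5 * DT * acc
    pos += DT * vel
    acc = accel(pos)
    vel += 0.5 * DT * acc

print("rms spread about the centre of mass:",
      round(float(np.std(pos - pos.mean(axis=0))), 3))
```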

    ‘We have run among the largest cosmological N-body simulations ever realised,’ Dr Corasaniti said.

    Dr Corasaniti’s simulations predict that the way dark energy evolves over time ought to affect dark matter clustering. This, in turn, alters the efficiency with which galaxies form in ways that would not be the case with constant dark energy.

The predictions his models are making could be tested with the help of forthcoming telescopes such as the Large Synoptic Survey Telescope in Chile and the Square Kilometre Array in Australia and South Africa, as well as by satellite missions such as ESA’s Euclid and NASA’s WFIRST (Wide Field Infrared Survey Telescope).

    LSST


    LSST Camera, built at SLAC



LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón, a 2,682-metre-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    SKA Square Kilometer Array


    SKA South Africa

    ESA/Euclid spacecraft

    NASA/WFIRST

    ‘If dark energy turns out to be a dynamical phenomenon this will have a profound implication not only on cosmology, but on our understanding of fundamental physics,’ said Dr Corasaniti.

See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 9:54 am on September 14, 2018 Permalink | Reply
    Tags: , , , Earthquake research, EPOS-European Plate Observing System, Horizon - The EU Research and Innovation Magazine, ,   

From Horizon The EU Research and Innovation Magazine: “Plate tectonics observatory to create seismic shift in earthquake research”


    From Horizon The EU Research and Innovation Magazine

    13 September 2018
    Gareth Willmer

    A 6.2-magnitude earthquake in Amatrice, Italy, in August 2016 killed nearly 300 people. Image credit – Amatrice Corso by Mario1952 is licensed under Creative Commons CC-BY-SA-2.5 and 2016 Amatrice earthquake by Leggi il Firenzepost is licensed under CC BY 3.0

    We may never be able to entirely predict earthquakes such as those that hit central Italy in 2016, but we could better assess how they’re going to play out by joining up data from different scientific fields in a new Europe-wide observatory, say scientists.

    In 2016 and early 2017, a series of major earthquakes rocked central Italy. In the hill town of Amatrice, one magnitude-6.2 earthquake devastated the town and claimed the lives of nearly 300 people, with hundreds more injured.

    Richard Walters, an assistant professor in the Department of Earth Sciences at Durham University, UK, has been studying a variety of datasets to understand how these quakes played out.

    Durham U bloc

    From Durham University

He and his colleagues found that a network of underground faults meant there was a series of seismic events rather than one major earthquake – a finding that could help scientists predict how future seismic events unfold.

    ‘We were only able to achieve this by analysing a huge variety of datasets,’ said Dr Walters. These included catalogues of thousands of tiny aftershocks, maps of earthquake ruptures measured by geologists clambering over Italian hillslopes, GPS-based ground-motion measurements, data collected by a satellite hundreds of kilometres up, and seismological data from a global network of instruments.

    ‘Many of these datasets or processed products were generously shared by other scientists for free, and were fundamental to our results,’ he said. ‘This is how we make big advances.’

At the moment, this type of research can rely on having a strong network of contacts and can disadvantage those without one. That’s where a new initiative called the European Plate Observing System (EPOS), set to launch in 2020, comes in.

    The aim is to create an online tool that brings together data products and knowledge into a central hub across the solid Earth science disciplines.

    ‘The idea is that a scientist can go to the EPOS portal, where they can find a repository with all the earthquake rupture models, historical earthquake data and strain maps, and use this data to make an interpretative model,’ said Professor Massimo Cocco, the project’s coordinator.

    ‘A scientist studying an earthquake, a volcano, a tsunami, and so on, needs to be able to access very different data generated by different communities.’

    __________________________________________________

    ‘While in Europe’s current climate politicians may be putting up borders, scientists in those same countries are trying even harder to break down national barriers.’

    Dr Richard Walters, Durham University, UK
    __________________________________________________

    Mosaic

    At the moment, findings on solid Earth science at a European scale are scattered among a mosaic of hundreds of research organisations. The challenge is to incorporate a variety of accessible information from many different scientific fields, using a combination of real-time, historical and interpretative data.

    EPOS will integrate data from 10 areas of Earth science, including seismology, geodesy, geological data, volcano observations, satellite data products and anthropogenic – or human-influenced – hazards.

    It will help build on the type of data integration that happened after the Amatrice quake, in which the lead organisation behind EPOS – Italy’s National Institute of Geophysics and Volcanology (INGV) – was involved in coordinating and fostering data sharing.

    This included real-time data from temporary sensor deployments, as well as seismic hazard maps, satellite data products and geophysical data – leading to a first model of the quake’s causative source within 48 hours to aid emergency planning.

    So far, a prototype of the portal has been developed and it will now be tested by users over the coming year to make sure it meets needs.

    Dr Walters said that EPOS is right on time. ‘Projects like EPOS are especially timely and valuable right now, as many of the subdisciplines that make up solid Earth geoscience are entering the era of big data,’ he said.

    Eyjafjallajökull

    The eruption of Icelandic volcano Eyjafjallajökull in 2010 highlights another issue that EPOS is hoping to improve – the challenge of coordination across borders. Though this event did not cost human lives, it had a much wider impact in Europe, leading to flights being grounded throughout the region and costing airlines an estimated €1.3 billion.

    In such cases, said Prof. Cocco, it helps to know factors such as the ash’s composition, something that affects how a plume travels but is not necessarily included in the models of meteorologists. That knowledge could be gained through access to volcanology data, and also used by aviation authorities and airlines, potentially to design systems to protect engines.

    Prof. Cocco said the idea is that EPOS could also be used by people outside the research community to ‘increase the resilience of society to geohazards’. An engineer or organisation could use data on ground shaking or earthquake occurrence to aid safe exploitation of resources or evaluate risks in building a nuclear power plant, for example.

    In addition, the aim is to make it easier for students or young scientists to interpret data through tools, software, tutorials and discovery services, rather than having access to just raw data. ‘Otherwise, you are providing only usability to skilled scientists,’ said Prof. Cocco. ‘This, to me, is the only way to achieve open science.’

    At present, the EPOS community comprises about 50 partners across 25 European countries, with hundreds of research infrastructures, institutes and organisations providing data. The organisation has, meanwhile, submitted a final application to become a legal entity known as a European Research Infrastructure Consortium (ERIC), with a decision establishing the ERIC expected within the next two months. This official status will aid integration with other national and European organisations, and have benefits in the allocation of funding, said Prof. Cocco.

    Professor Giulio Di Toro, a structural geologist at the University of Padova in Italy, said it is great to have this type of hub to bring information together and improve access, but also important to ensure that it doesn’t lead to an increase in bureaucracy. If institutions come up against funding issues, it could also pose a challenge to their ability to share data, he added: ‘If for some years you don’t get grants, you will not produce data to share.’

    Meanwhile, Dr Walters sees a positive spirit reflected in these types of initiative. ‘While in Europe’s current climate politicians may be putting up borders,’ he said, ‘scientists in those same countries are trying even harder to break down national barriers, and working together to build something better for everyone.’

    The implementation phase of EPOS is being part-funded by the EU. If you liked this article, please consider sharing it on social media.

    Earthquake Alert


    Earthquake Alert

Earthquake Network is a research project which aims to develop and maintain a crowdsourced, smartphone-based earthquake warning system at a global level. Smartphones made available by the population are used to detect earthquake waves using their on-board accelerometers. When an earthquake is detected, a warning is issued to alert the population not yet reached by the damaging waves of the earthquake.
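A classic way to turn a noisy accelerometer stream into a "shaking has started" decision is the short-term-average over long-term-average (STA/LTA) trigger sketched below. Earthquake Network's actual on-phone detector is not documented here, so treat the window lengths and threshold as placeholders rather than the project's real parameters.

```python
import numpy as np

# Generic STA/LTA trigger of the kind commonly used to detect a sudden onset
# of shaking in an accelerometer stream. Windows and threshold are placeholders.

def sta_lta_ratio(signal, fs, sta_win=1.0, lta_win=20.0):
    """Causal short-term / long-term average ratio of the signal energy."""
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = np.asarray(signal, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n    # mean over the most recent sta_n samples
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n    # mean over the most recent lta_n samples
    sta = sta[lta_n - sta_n:]                       # align both series on the same end sample
    return sta / (lta + 1e-12)

fs = 50                                             # 50 Hz samples from the phone's accelerometer
t = np.arange(0, 60, 1.0 / fs)
rng = np.random.default_rng(1)
trace = 0.02 * rng.standard_normal(t.size)          # background jitter
trace += 0.5 * np.sin(2 * np.pi * 3.0 * t) * (t > 40)   # shaking begins at t = 40 s

ratio = sta_lta_ratio(trace, fs)
hits = np.flatnonzero(ratio > 4.0)
if hits.size:
    first_sample = int(20.0 * fs) - 1 + hits[0]     # offset back to the raw-sample index
    print(f"trigger at t = {t[first_sample]:.1f} s")
```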

The project started on January 1, 2013 with the release of the Android application of the same name, Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015

    Meet The Quake-Catcher Network

    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at CalTech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
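The "sifting" step can be illustrated with a simple space-time coincidence test: one trigger might be a slammed door, but several distinct computers triggering within a few seconds and a few tens of kilometres of each other are much more likely to have felt an earthquake. QCN's real association logic is more sophisticated; the thresholds below are invented for the example.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt
from typing import List

# Sketch of server-side trigger association: many triggers from different hosts,
# close together in space and time, are treated as a likely earthquake.
# Window, radius and host-count thresholds are invented, not QCN's values.

@dataclass
class Trigger:
    host_id: int
    time: float     # seconds since some epoch
    lat: float
    lon: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def looks_like_quake(triggers: List[Trigger], window_s=10.0, radius_km=100.0, min_hosts=5):
    """True if enough distinct hosts triggered close together in space and time."""
    for seed in triggers:
        nearby = {t.host_id for t in triggers
                  if abs(t.time - seed.time) <= window_s
                  and haversine_km(seed.lat, seed.lon, t.lat, t.lon) <= radius_km}
        if len(nearby) >= min_hosts:
            return True
    return False

triggers = [Trigger(i, 100.0 + 0.4 * i, 34.05 + 0.01 * i, -118.25) for i in range(6)]
print(looks_like_quake(triggers))   # True: six hosts within seconds and a few km of each other
```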

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

BOINC, more properly the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing and citizen cyberscience.

Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in the hope of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map
    QCN Quake Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

The U.S. Geological Survey (USGS), along with a coalition of State and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

Today, the technology exists to detect earthquakes quickly enough that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.
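The physics behind those warning times is just the speed difference between the harmless P wave and the damaging S wave. With typical crustal wave speeds and an assumed processing delay (round illustrative numbers, not ShakeAlert's actual parameters), the available warning grows with distance from the epicentre, as the short calculation below shows.

```python
# Back-of-envelope warning times: the P wave outruns the damaging S wave, so
# the farther a site is from the epicentre, the more seconds of warning are
# possible. Wave speeds and the processing delay are typical assumed values.

VP = 6.0                 # P-wave speed in the crust [km/s]
VS = 3.5                 # S-wave speed [km/s]
PROCESSING_DELAY = 5.0   # seconds to detect, locate and publish an alert (assumed)

for distance_km in (20, 50, 100, 200):
    p_arrival = distance_km / VP
    s_arrival = distance_km / VS
    warning = max(0.0, s_arrival - p_arrival - PROCESSING_DELAY)
    print(f"{distance_km:4d} km from the epicentre: ~{warning:4.1f} s of warning")
```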

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

In February of 2016 the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California, joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.

    Authorities

    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach
    rdegroot@usgs.gov
    626-583-7225

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 10:59 am on July 20, 2018 Permalink | Reply
    Tags: Antimatter plasma reveals secrets of deep space signals, , , , Computer model called OSIRIS, , Horizon - The EU Research and Innovation Magazine, , The exact conditions necessary to produce a plasma containing positrons remain unclear   

    From Horizon The EU Research and Innovation Magazine: “Antimatter plasma reveals secrets of deep space signals” 


    From Horizon The EU Research and Innovation Magazine

    16 July 2018
    Jude Gonzalez

Mysterious radiation emitted from pulsars – like this one shown leaving a long tail of debris as it races through the Milky Way – has puzzled astronomers for decades. Image credit – NASA

    Mysterious radiation emitted from distant corners of the galaxy could finally be explained with efforts to recreate a unique state of matter that blinked into existence in the first moments after the Big Bang.

    For 50 years, astronomers have puzzled over strange radio waves and gamma rays thrown out from the spinning remnants of dead stars called pulsars.

    Researchers believe that these enigmatic, highly-energetic pulses of radiation are produced by bursts of electrons and their antimatter twins, positrons. The universe was briefly filled with these superheated, electrically charged particles in the seconds that followed the Big Bang before all antimatter vanished, taking the positrons with it. But astrophysicists think the conditions needed to forge positrons may still exist in the powerful electric and magnetic fields generated around pulsars.

    ‘These fields are so strong, and they twist and reconnect so violently, that they essentially apply Einstein’s equation of E = mc^2 and create matter and antimatter out of energy,’ said Professor Luis Silva at the Instituto Superior Técnico in Lisbon, Portugal. Together, the electrons and positrons are thought to form a super-heated form of matter known as a plasma around a pulsar.
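The bookkeeping in that sentence is easy to make concrete: creating one electron-positron pair from field energy costs at least twice the electron rest energy. The few lines below simply evaluate E = mc^2 for the pair.

```python
# Quick illustration of the E = mc^2 bookkeeping: the minimum energy needed
# to create one electron-positron pair out of field energy.
m_e = 9.109_383_7e-31     # electron mass [kg]
c = 2.997_924_58e8        # speed of light [m/s]
eV = 1.602_176_634e-19    # joules per electronvolt

pair_energy_J = 2 * m_e * c**2            # one electron plus one positron
print(f"{pair_energy_J:.2e} J  =  {pair_energy_J / eV / 1e6:.3f} MeV")
# About 1.6e-13 J, i.e. roughly 1.022 MeV per pair: tiny per particle, but
# pulsar magnetospheres create these pairs in astronomical numbers.
```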

    But the exact conditions necessary to produce a plasma containing positrons remain unclear. Scientists also still do not understand why the radio waves emitted by the plasma around pulsars have properties similar to light in a laser beam – a wave structure known as coherence.

    To find out, researchers are now turning to powerful computer simulations to model what might be going on. In the past, such simulations have struggled to mimic the staggering number of particles generated around pulsars. But Prof. Silva and his team, together with researchers at the University of California, Los Angeles in the United States, have adapted a computer model called OSIRIS so that it can run on supercomputers, allowing it to follow billions of particles simultaneously.

    The updated model, which forms part of the InPairs project, has identified the astrophysical conditions necessary for pulsars to generate electrons and positrons when magnetic fields are torn apart and reattached to their neighbours in a process known as magnetic reconnection.

    OSIRIS also predicted that the gamma rays released by electrons and positrons as they race across a magnetic field will shine in discontinuous spurts rather than smooth beams.

    The findings have added weight to theories that the enigmatic signals coming from pulsars are produced by the destruction of electrons as they recombine with positrons in the magnetic fields around these dead stars.

    Prof. Silva is now using the data from these simulations to search for similar burst signatures in past astronomical observations. The tell-tale patterns would reveal details on how magnetic fields evolve around pulsars, offering fresh clues about what is going on inside them. It will also help confirm the validity of the OSIRIS model for researchers trying to create antimatter in the laboratory.

    The OSIRIS computer model predicts how powerful magnetic fields around pulsars evolve, helping scientists understand where matter and antimatter can be created out of the vacuum of space. Image credit – Fabio Cruz

    Blasting lasers

Insights gained from the simulations are already being used to help design experiments that will use high-powered lasers to mimic the huge amounts of energy released by pulsars. The Extreme Light Infrastructure will blast targets no wider than a human hair with petawatts of laser power. Under this project, lasers are under construction at three facilities around Europe – in Măgurele in Romania, Szeged in Hungary, and Prague in the Czech Republic. If successful, the experiments could create billions of electron-positron pairs.

    ‘OSIRIS is helping researchers optimise laser properties to create matter and antimatter like pulsars do,’ said Prof. Silva. ‘The model offers a road map for future experiments.’

    But there are some who are attempting to wield matter-antimatter plasmas in even more controlled ways so they can study them.

    Professor Thomas Sunn Pedersen, an applied physicist at the Max Planck Institute for Plasma Physics in Garching, Germany, is using charged metal plates to confine positrons alongside electrons as a first step towards creating a matter-antimatter plasma on a table top.

    Although Prof. Sunn Pedersen works with the most intense beam of low-energy positrons in the world, concentrating enough particles to ignite a matter-antimatter plasma remains challenging. Researchers use electro-magnetic ‘cages’ generated under vacuum to confine antimatter, but these require openings for the particles to be injected inside. These same openings allow particles to leak back out, however, making it difficult to build up enough particles for a plasma to form.

    Prof. Sunn Pedersen has invented an electro-magnetic field with a ‘trap door’ that can let positrons in before closing behind them. Last year, the new design was able to boost the time the antimatter particles remained confined in the field by a factor of 20, holding them in place for over a second.

    ‘No one has ever achieved that in a fully magnetic trap,’ said Prof. Sunn Pedersen. ‘We have proven that the idea works.’

    But holding these elusive antimatter particles in place is only one milestone towards creating a matter-antimatter plasma in the laboratory. As part of the PAIRPLASMA project, Prof. Sunn Pedersen is now increasing the quality of the vacuum and generating the field with a levitating ring to confine positrons for over a minute. Studying the properties of plasmas ignited under these conditions will offer valuable insights to neighbouring fields.

    In June, for example, Prof. Sunn Pedersen used a variation of this magnetic trap to set a new world record in nuclear fusion reactions ignited in conventional-matter plasmas.

    ‘Collective phenomena like turbulence currently complicate control over big fusion plasmas,’ said Prof. Sunn Pedersen. ‘A lot of that is driven by the fact that the ions are much heavier than the electrons in them.’

    He hopes that by producing electron-positron plasmas like those created by the Big Bang, it may be possible to sidestep this complication because electrons and positrons have the exact same mass. If they can be controlled, such plasmas could help to validate complex models and recreate the conditions around pulsars so they can be studied up close in the laboratory for the first time.

    If successful it may finally give astronomers the answers they have puzzled over for so long.

See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 10:16 am on June 1, 2018 Permalink | Reply
    Tags: , Horizon - The EU Research and Innovation Magazine, ICEYE, ICEYE X1 – PSLV C40, , One of ICEYE’s key initial focuses has been ice surveillance for companies involved in Arctic operations   

From Horizon The EU Research and Innovation Magazine: “Microsatellite swarms could paint clearer picture of our planet”


    From Horizon The EU Research and Innovation Magazine

    28 May 2018
    Gareth Willmer

    Microsatellites such as those developed by ICEYE not only reduce the size of the satellite but also cut costs significantly. Image credit – ICEYE

    Space is not just a hostile place for life, but also for business. Building and launching a traditional bus-sized satellite tens of thousands of kilometres above Earth can cost hundreds of millions of euros, but thanks to miniature satellites, the economics are changing.

    Among the start-ups seeking new ways to tap into space’s potential is microsatellite manufacturer ICEYE.

    ICEYE X1 – PSLV C40

It aims to cut satellite prices to less than one-hundredth of those of traditional satellites, using a series of microsatellites partly built with off-the-shelf mobile electronics.

    In January, the company sent what it described as the world’s first microsatellite based on synthetic-aperture radar – technology that allows satellites to see through clouds and into the dark – into a low-Earth orbit of about 500 kilometres.

    Suitcase-sized and weighing just 70 kilograms, ICEYE-X1 is the first of three satellites that the company plans to launch this year, with a goal of having 18 in the sky by the end of 2020.

    ICEYE says that gloomy conditions can make imagery using optical-based systems unavailable up to 75 % of the time, a problem their technology avoids.

    ‘That means you can image in any place in the world at any time,’ said Pekka Laurila, CFO and co-founder of ICEYE.

    At present, requests from companies for data can take satellite providers days to process, and are often updated only once every 12 hours. ICEYE believes it can get this down to two hours once it gets six microsatellites into the sky, and even further with more launches.
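The arithmetic behind that claim is simple constellation scaling: spreading more satellites around similar orbits divides the refresh interval roughly in proportion. The snippet below is an idealised back-of-envelope version (real revisit times depend on orbital planes, latitude and tasking), starting from the 12-hour figure quoted above.

```python
# Idealised constellation scaling: if one provider refreshes a given area
# roughly every 12 hours, N evenly spread satellites cut the gap to ~12/N hours.
# Real revisit analysis depends on orbit planes and latitude; this is a sketch.

baseline_refresh_h = 12.0
for n_sats in (1, 3, 6, 18):
    print(f"{n_sats:2d} satellites -> refresh roughly every {baseline_refresh_h / n_sats:4.1f} h")
```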

    ‘If you’re able to do monitoring on a scale of a few hours, you are actually catching a set of completely new phenomena that has never been monitored from space before,’ said Laurila. ‘It gives you access to understanding these phenomena on a human timescale.’

    Ice surveillance

    There are all sorts of areas in which this could be applied, from agricultural production to tracking climate change, but one of ICEYE’s key initial focuses has been ice surveillance for companies involved in Arctic operations – where vessels moving at several knots need rapid updates on ice-field movements.

    ‘That’s an area where continuous coverage is extremely important,’ said Laurila.

    This revolutionary approach has arrived at a time when unprecedented amounts of data are being generated by satellites.

    The surge in data is driven by a range of factors, including more detailed Earth observation services. One way to process this increasing flow of information is to find better ways of getting satellite data back down to ground.

    At the moment, a lot of satellite data gets lost in transit to and from Earth, or ‘thrown overboard’, according to John Mackey, CEO of mBryonics, a technology development company based in Galway, Ireland.

    He coordinates a project called RAVEN, which is working to improve signal transmission. To do so, mBryonics is harnessing a technology called adaptive optics, which is used in telescopes to give astronomers clear images of stars by reducing the twinkle when viewing them through the distortion of Earth’s atmosphere.

    Adapting this technique to beam data up and down from satellites helps create a much stronger signal and a higher data rate by lessening such atmospheric interference, said Mackey.

    Moving this data faster could also help with a challenge facing future low-orbit satellites – seeing less of the Earth than those satellites higher up. Low-orbit satellites have a more limited line of sight to ground stations and therefore a smaller window to beam data down when they pass by – maybe just 10 to 15 minutes, said Mackey. Speeding up the data rate means they can transfer more during this period.
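A rough link-budget calculation shows why those 10-to-15-minute windows bite. With an assumed downlink rate and pass schedule (illustrative numbers, not mBryonics or ICEYE specifications), a single ground station only sees a few tens of gigabytes per pass.

```python
# Rough numbers for the downlink bottleneck: how much data can one low-orbit
# satellite push to a single ground station per pass and per day?
# Link rate, pass length and pass count are illustrative assumptions.

link_rate_mbps = 500.0          # downlink rate [megabits per second]
pass_minutes = 12.0             # usable contact time per overflight
passes_per_day = 4              # passes over a given ground station per day

per_pass_gb = link_rate_mbps * pass_minutes * 60 / 8 / 1000   # gigabytes per pass
per_day_gb = per_pass_gb * passes_per_day
print(f"~{per_pass_gb:.0f} GB per pass, ~{per_day_gb:.0f} GB per day to one station")
# About 45 GB per pass; a radar constellation can generate far more than this,
# which is why higher link rates and satellite-to-satellite relaying matter.
```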

    Additionally, mBryonics is seeking to use its technology to create links between satellites, which could help create constellations to intelligently route data in the most efficient way possible. ‘Then, if I send my data up to the satellite, it can fire it across the satellite constellation and get me to my destination much faster,’ said Mackey.

    And not only can that cut the number of ground stations needed, but it could also help move the data faster and thus avoid big delays in providing costly satellite-related services. mBryonics is aiming to demo a full commercial system of its satellite technology within the next two years.

See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     