Tagged: Horizon – The EU Research and Innovation Magazine

  • richardmitnick 10:28 am on June 24, 2022 Permalink | Reply
    Tags: "Hydrogen heads home to challenge oil and gas as local energy supply", A Hydrogen Valley is a medium-sized area where clean hydrogen is produced locally and consumed by homes; vehicles and industry., Central to this is an electrolysis plant that produces hydrogen from energy supplied by two newly built solar-power plants., Gas production is winding down., Horizon - The EU Research and Innovation Magazine, Hydrogen is carving out its place in the world of renewable energy., Most usually stored as a gas this zero-emission energy carrier is used to fuel everyday applications such as in transport; heating and industry., Renewable diversification, Shifts in the soil from drilling for gas are causing minor earthquakes., The European Union has an eventual target of 100 hydrogen valleys., The idea behind hydrogen valleys is to create a self-sufficient hydrogen ecosystem from start to finish., The Northern Netherlands region is setting out to become a so-called "Hydrogen Valley"., The strategy is to provide a regional economic impetus while also fighting the main driver of climate change: the burning of fossil fuels.

    From “Horizon” The EU Research and Innovation Magazine : “Hydrogen heads home to challenge oil and gas as local energy supply” 


    22 June 2022
    Tom Cassauwers

    Hydrogen is now finding its place in the world of renewable energy. © Juan Roballo, Shutterstock.

    Hydrogen is carving out its place in the world of renewable energy. Regional developments like hydrogen valleys and hydrogen islands are serving as blueprints for larger ecosystems to produce and consume this versatile fuel locally.

    The Northern Netherlands region used to be prime gas country. One of the largest gas fields in the world was found underfoot in Groningen province. Gas extraction from the territory helped bankroll the Netherlands for decades. But times are changing.

    “Gas production is winding down,” said Jochem Durenkamp, hydrogen project manager at New Energy Coalition. “That would mean the north would lose many jobs. Hydrogen turned out to be a perfect replacement.”

    With gas extraction and related jobs coming to an end, these northern regions are seeking alternatives. Furthermore, shifts in the soil from drilling for gas are causing minor earthquakes, with 72 registered in 2021 alone. This has significant economic repercussions, particularly when it damages houses in the area. As much as €1.2 billion has been paid out in compensation for earthquake damage since 1991.

    The Northern Netherlands region is setting out to become a so-called “Hydrogen Valley”. The HEAVENN project, coordinated by the New Energy Coalition, is at the helm. The region is tapping European support to develop the infrastructure necessary to adopt green hydrogen as a locally produced energy supply.

    The European Union has an eventual target of 100 of these hydrogen valleys. Currently there are 23 in Europe at various stages of development, with the ambition to double this total by 2025. Dozens of projects have commenced all over Europe and in 20 countries worldwide, in a rapidly evolving clean energy investment trend worth billions.

    The strategy is to provide a regional economic impetus while also fighting the main driver of climate change: the burning of fossil fuels. Eventually, when enough regions emerge, they will join up to create a wide-scale hydrogen-based economy founded on a clean, secure energy supply.

    Green hydrogen

    The Northern Netherlands is in an ideal position to take advantage of the hydrogen opportunity. Located close to the rapidly expanding offshore wind farms of the North Sea, it has a direct line of renewable energy to manufacture green hydrogen. On top of that, the previous gas exploitation in the region has created a body of knowledge and skills that easily transfers to the production, distribution, storage and consumption of hydrogen in the local economy.

    The idea behind hydrogen valleys is to create a self-sufficient hydrogen ecosystem from start to finish. In the case of HEAVENN, that begins by identifying sites where the electrolysis process can be used to separate water into hydrogen and oxygen by use of electricity.
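    The chemistry behind this step is conventional water electrolysis: an electric current splits water into hydrogen and oxygen, in a two-to-one ratio by volume.

    ```latex
    2\,\mathrm{H_2O} \;\xrightarrow{\text{electricity}}\; 2\,\mathrm{H_2} + \mathrm{O_2}
    ```

    When that current comes from renewables, as with the offshore wind feeding HEAVENN, the resulting hydrogen is classed as green.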

    A Hydrogen Valley is a medium-sized area where clean hydrogen is produced locally and consumed by homes, vehicles and industry. The goal is to initiate a hydrogen economy at the community level. Eventually the regional hydrogen valleys will join up to create wider economic zones powered by hydrogen.

    When this electricity is derived from renewable sources, like offshore wind in the case of HEAVENN, the hydrogen is considered to be a green energy source. Usually stored as a gas, this zero-emission energy carrier is used to fuel everyday applications in transport, heating and industry.

    HEAVENN, for example, invests in projects for hydrogen-based mobility, with a number of hydrogen filling points for every kind of hydrogen-powered vehicle, from cars to trucks and buses. Hydrogen will also be used to power a data centre and to heat residential neighbourhoods.

    Building energy ecosystems is not easy. ‘The project includes thirty partners,’ said Durenkamp. ‘It’s a big challenge to coordinate what they do, but building this ecosystem is key for hydrogen.’

    Beyond the partners, the local community is also an important player. ‘It’s very important that inhabitants are consulted,’ said Durenkamp. ‘Where before, energy was extracted from underground, it’s now very visible in the landscape with wind turbines, solar panels and large electrolysis facilities. Whenever something is done in the project, it’s done together with the local inhabitants.’

    Clean energy islands

    Another region unlocking hydrogen’s potential is the Spanish island of Mallorca, which styles itself as a “Hydrogen Island”.

    ‘The idea of the project came when CEMEX, a cement manufacturer, announced it would close its plant on Mallorca,’ said María Jaén Caparrós. She acts as coordinator of hydrogen innovation at Enagás, the Transmission System Operator of the national gas grid in Spain. ‘With hydrogen, we want to re-industrialise the island and decarbonise the Balearic region.’

    Known as GREEN HYSLAND, the project will create an ecosystem of hydrogen producers and users across the Mediterranean island. Achieving this will cut down on expensive energy imports and eliminate harmful emissions.

    Central to this is an electrolysis plant that produces hydrogen from energy supplied by two newly built solar-power plants. This hydrogen is then used in a range of different applications in the locality. For example, the public transport company of the city of Palma de Mallorca is rolling out hydrogen-powered buses. Another use-case is to power the island’s vital ferry port and even to provide energy for a hotel. But community energy needs community support.

    Renewable diversification

    “It’s key to have the support of society,” said Jaén Caparrós. “Hydrogen is something new for the Balearic Islands. This project will not only promote reindustrialisation based on renewables, but will also provide knowledge, research and innovation. It is a milestone that the Balearic Islands must take advantage of, in order to promote the diversification of the production model with new, stable and quality jobs.”

    The second related objective of GREEN HYSLAND is to reduce the emissions from the use of natural gas. They will inject part of the hydrogen into the gas grid, according to Jaén Caparrós, since the two are compatible sources of energy. “We will build a hydrogen pipeline to transport it to the injection point,” she said, “which we will use to partly decarbonise the natural gas grid.” They plan to commence this phase by the end of 2022.

    In this way, hydrogen can be mixed into the existing gas infrastructure used to heat homes, hotels and industry or generate electricity. The resulting blend of gas and green hydrogen has a lower emissions footprint than using gas by itself, a step toward complete decarbonisation.
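    A rough illustration of why a blend only partly decarbonises the grid (not a figure from the project): hydrogen carries roughly a third of the energy of natural gas per unit volume, so the emissions saving is smaller than the blend percentage. The energy densities below are approximate textbook heating values, not project data.

    ```python
    # Rough sketch: CO2 saving from blending hydrogen into natural gas,
    # relative to delivering the same energy with pure natural gas.
    # Energy densities are approximate higher heating values (MJ per m^3).
    E_H2 = 12.7    # hydrogen
    E_CH4 = 39.8   # natural gas (taken here as pure methane)

    def co2_reduction(h2_vol_frac):
        """Fraction of delivered energy that becomes carbon-free.

        CO2 comes only from the methane share, so the saving equals the
        hydrogen share of the delivered *energy*, not of the volume.
        """
        e_h2 = h2_vol_frac * E_H2
        e_ch4 = (1.0 - h2_vol_frac) * E_CH4
        return e_h2 / (e_h2 + e_ch4)

    saving = co2_reduction(0.20)  # a 20% blend by volume
    ```

    With these values a 20% hydrogen blend by volume cuts CO2 by only about 7%, which is why blending is described as a step toward, rather than the end of, decarbonisation.
    
    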

    Hydrogen blueprints

    GREEN HYSLAND even joined up with parties from outside of Europe. ‘We are 30 partners from 11 countries including Morocco and Chile,’ said Jaén Caparrós. ‘They also want to develop green hydrogen ecosystems, and hydrogen valleys have an added value if we can connect with regions inside and outside of Europe,’ she said.

    ‘Hydrogen valleys create new jobs, re-industrialise and create new economic activities,’ said Jaén Caparrós. ‘And on top of that they decarbonise. It serves the entire society.’

    Once this infrastructure-building and experimentation phase is complete, the lessons learned will also need to scale up. Both HEAVENN and GREEN HYSLAND want to share what they learn. ‘We want to be a blueprint for other regions across the world,’ concluded Durenkamp. ‘If this project is a success, we want to share it.’
    Research in this article was funded by the EU.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 9:19 am on June 3, 2022 Permalink | Reply
    Tags: "Robot dogs take a walk on the wired side", Horizon - The EU Research and Innovation Magazine, Legged robots are already becoming integrated into our economy and society today., Legged robots are already used for industrial inspections and other observation tasks., Robots are learning to walk and work., Robots could be very helpful to humanity., Robots with improved locomotive abilities can help in search and rescue operations and space exploration., The LeMo project’s dog-like robot is one of the first to have learned to walk through reinforcement learning.

    From “Horizon” The EU Research and Innovation Magazine : “Robot dogs take a walk on the wired side” 


    01 June 2022
    Tom Cassauwers

    The LeMo project’s dog-like robot is one of the first to have learned to walk through reinforcement learning. © ETH Zürich.

    Robots are learning to walk and work. While robot dogs are not yet man’s best friend, real autonomy and reasoning will make them useful companions in industry, search & rescue and even space exploration. But you must walk before you can run, and machines are learning lessons from biology to build better walking robots.

    The first chords of the 1960s Motown song ‘Do You Love Me’ by The Contours sound from the speakers as the robots start to dance. Several models, including a bipedal humanoid version and a four-legged dog-like contraption, are seen dancing with each other. They shuffle, do pirouettes and swing.

    Released by the US robotics company Boston Dynamics, the viral video of legged robots dancing created a stir at the end of 2020. Reactions ranged from people suggesting it was made using CGI to fear that the robots were going to take over the world. Yet for all the impressive engineering, the video also showed the limitations that legged robots face. Whereas for humans dancing is quite easy, for robots it is incredibly hard: every movement in the three-minute video had to be manually scripted in detail.

    ‘Today robots are still relatively stupid,’ said Marco Hutter, professor at ETH Zürich and expert in robotics. ‘A lot of the Boston Dynamics videos are hand-crafted movements for specific environments. They need human supervision. In terms of real autonomy and reasoning, we’re still far away from humans, animals or what we expect from science-fiction.’

    Yet these sorts of robots could be very helpful to humanity. They could help us when disasters strike, they could improve industrial operations and logistics and they could even help us explore outer space. But for that to happen we need to make legged robots better at basic tasks like walking and teach them how to do so without supervision.

    Virtual learning

    The ERC-project LeMo is one of the investigations launched by European researchers to make robots move more autonomously. Their core premise is that legged locomotion isn’t what it could be, and that machine learning techniques could improve it. LeMo is specifically focused on so-called reinforcement learning.

    ‘Reinforcement learning uses a simulation to generate massive data for training a neural network control policy,’ explained Hutter, who is also the project leader of LeMo. ‘The better the robot walks in the simulation, the higher reward it gets. If the robot falls over, or slips, it gets punished.’

    The robot they use in the project is a 50 kilogram, dog-like, four-legged robot. On top of it are several sensors and cameras that allow it to detect its environment. This part has become pretty standard for legged robots; the advance LeMo produces lies in the software. Instead of using a model-based approach, where the researchers program rules into the system, like ‘when there’s a rock on the ground, lift up your feet higher’, they ‘train’ an AI system in a simulation.

    Here the robot’s system walks over and over through a virtual terrain simulation, and every time it performs well it receives a reward. Every time it fails it receives a punishment. By repeating this process millions of times, the robot learns how to walk through trial-and-error.
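    This trial-and-error loop can be sketched in a few lines. The toy below is an illustration of the principle only, not the project’s controller (which trains a neural-network policy on a far richer physics simulation): a tabular Q-learning agent is rewarded for forward progress, punished for falling, and learns to lift its ‘foot’ only where the terrain demands it.

    ```python
    import random

    # Toy terrain: 0 = flat ground, 1 = a rock the robot must step over.
    TERRAIN = [0, 0, 1, 0, 1, 0]
    LOW, HIGH = 0, 1  # actions: a cheap low step vs. a costlier lifted step

    def step(pos, action):
        """Advance one cell; a low step onto a rock means a fall (punished)."""
        if TERRAIN[pos] == 1 and action == LOW:
            return pos, -5.0, True                       # fell over: episode ends
        reward = 1.0 - (0.1 if action == HIGH else 0.0)  # progress minus effort
        pos += 1
        return pos, reward, pos >= len(TERRAIN)          # done at end of terrain

    random.seed(0)
    Q = [[0.0, 0.0] for _ in TERRAIN]  # learned value of each action per cell
    alpha, gamma, eps = 0.5, 0.9, 0.1
    for _ in range(2000):              # real projects use millions of trials
        pos, done = 0, False
        while not done:
            if random.random() < eps:  # occasionally explore at random
                a = random.randrange(2)
            else:                      # otherwise take the best-known action
                a = max((LOW, HIGH), key=lambda x: Q[pos][x])
            nxt, r, done = step(pos, a)
            target = r if done else r + gamma * max(Q[nxt])
            Q[pos][a] += alpha * (target - Q[pos][a])
            pos = nxt

    # The greedy policy after training, one action per terrain cell.
    policy = [max((LOW, HIGH), key=lambda a: Q[p][a]) for p in range(len(TERRAIN))]
    ```

    After training, the policy picks the costlier high step exactly on the rock cells and the cheap low step everywhere else, purely from reward and punishment.
    
    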

    Robots with improved locomotive abilities can help in search and rescue operations and space exploration. © ETH Zürich.

    ‘LeMo is one of the first times reinforcement learning has been used on legged robots,’ said Hutter. ‘Because of this, the robot can now walk across challenging terrain, like slippery ground and inclined steps. We practically never fall anymore.’

    Using this technology, the ETH Zürich team recently won a $2 million Defense Advanced Research Projects Agency (DARPA) contest in which teams were challenged to deploy a fleet of robots to explore challenging underground areas by themselves.

    ‘Legged robots are already used for industrial inspections and other observation tasks,’ said Hutter. ‘But there are also applications like search & rescue and even space exploration, where we need better locomotion. Using techniques like reinforcement learning we can accomplish this.’

    Natural inspiration

    Another ERC-project, called M-Runners, is working on how to build legged robots that work in outer space. Today when we launch robots to places like the moon or Mars, they are generally wheeled robots. These need to land, and ride on, relatively flat pieces of terrain.

    ‘But the interesting things for geologists aren’t generally located in the flatlands,’ said professor Alin Albu-Schäffer, of the TU Munich and the German Aerospace Center. ‘They are found in places like canyons, where rovers cannot easily go.’

    This is why there is strong interest in sending legged robots into space. But before we can do that, more research is needed to make them work better. Here, M-Runner takes inspiration from nature.

    ‘Our hypothesis is that biology is more energy efficient,’ said Albu-Schäffer. ‘Our muscles and tendons have some elasticity. Animals, like a horse galloping, use this elasticity to store and release energy. Traditional robots on the other hand are rigid, and don’t do that.’

    This means that legged robots are not as efficient as they could be. But really understanding these processes, and transferring them to robots, is quite a challenge. It requires a deep understanding of biology, but also of the mathematics behind how movements are made and repeated.

    The limb is a complex system, with many interdependent parts like muscles, tendons and bones working together very closely to repeat movements like walking or running. ‘Modelling this mathematically is a scientifically unsolved question,’ said Albu-Schäffer.

    This is what the M-Runner project is trying to solve and transfer to robots, a quest that is heavily interdisciplinary. ‘We work on biomechanics and biological systems,’ said Albu-Schäffer. ‘But also neuroscience, mathematics and physics. In turn we build tools that apply this to the actual robots.’

    So far the project has already built a prototype robot, a dog-sized variant, on which the researchers are testing different types of running and gaits. The eventual goal is to apply this theoretical research to a role such as space exploration. ‘We also think about low gravity in simulations,’ said Albu-Schäffer. ‘The robot here can do more spectacular jumps and stride farther.’

    Beyond this research, legged robots are already becoming integrated into our economy and society today. ‘These machines are already in use,’ said Hutter. ‘It’s not a household item yet. But in industrial contexts it’s getting more popular, and in China even household use-cases are being investigated.’

    But their mass market appeal relies on these robots becoming better at walking and acting in the real world. Which is why more research is needed. ‘Legged robots aren’t just about Boston Dynamics,’ said Albu-Schäffer. ‘In Europe cutting-edge research is also being done, and we’re seeing real advances in the technology.’


  • richardmitnick 8:39 am on June 3, 2022 Permalink | Reply
    Tags: "Deforestation cuts through community as well as biodiversity", Deforestation contributes to climate change by releasing significant amounts of carbon into the atmosphere., Deforestation in the Gran Chaco forest in South America can be the end of the world for indigenous people who call it home., Horizon - The EU Research and Innovation Magazine, Indigenous people that live in the Gran Chaco rely on the forest for food and materials. It is also fundamental to their culture., The Gran Chaco is semi-arid or dry., The second largest forest in South America after the Amazon rainforest is the Gran Chaco.

    From “Horizon” The EU Research and Innovation Magazine : “Deforestation cuts through community as well as biodiversity” 


    30 May 2022
    Sandrine Ceurstemont

    Deforestation in South America is devastating to local communities. © Xico Putini, Shutterstock.

    Deforestation in the Gran Chaco forest in South America can be the end of the world for indigenous people who call it home. Researchers have been investigating different aspects of its human impact.

    The second largest forest in South America after the Amazon rainforest is the Gran Chaco. Home to 9 million people and thousands of species, it is under intense pressure from deforestation.

    Stretching across Argentina, Paraguay, Bolivia and Brazil, it has one of the highest rates of deforestation in the world. Since 1985, more than 140 000 square kilometres, about one fifth of the entire forest, has been cut down.

    Occupying a vast region to the east of the Andes, the Gran Chaco, unlike the Amazon rainforest, is semi-arid or dry. The population living there includes about 35 different groups of indigenous people. Hunter-gatherers by tradition, these communities have livelihoods closely entwined with the forest’s future.

    Dr Valentina Bonifacio has been working as a researcher in the Gran Chaco forest for the past 15 years, and has experienced its rapid deforestation first-hand. Densely-wooded areas have been cleared and turned into agricultural land to grow highly-profitable soybean crops, and expanses of pastureland have given way to cattle raised for beef production.

    ‘I really saw the Chaco disappearing and it’s very scary to see how quickly a territory can change,’ said Dr Bonifacio, associate professor at the Ca’ Foscari University of Venice in Italy. ‘If it continues to be deforested at the same rate, very soon the Chaco is not going to be a forest anymore.’

    Deforestation contributes to climate change by releasing significant amounts of carbon into the atmosphere, and threatens plant and animal survival. Several species in the Chaco, such as the South American jaguar and the screaming hairy armadillo, are disappearing. Deforestation also impacts the local communities that call the forest home.

    The Gran Chaco’s dry thorn forests, cactus stands, palm savannas and hundreds of native species of animals are threatened by rampant deforestation. © Michele Graziano Ceddia 2017.

    Human impact

    In the lanloss project, Dr Bonifacio is supervising Dr Tamar Blickstein who is investigating what the loss of the forest means to people living in the region.

    Small-scale farmers often experience feelings of grief as large commercial farms take over, while droughts and extreme rainfall caused by clearing forested land make it harder for them to carry on growing crops. Indigenous people are also impacted by the threat deforestation poses to their kinship networks embedded in the forest.

    During fieldwork later this year, Dr Blickstein plans to use satellite maps of deforestation as a form of storytelling. One of her goals is to show these maps to people from different communities, such as indigenous people, small-hold farmers and settlers who are experiencing deforestation, to see how they react to it and to gather their opinions.

    “I really saw the Chaco disappearing and it’s very scary to see how quickly a territory can change. If it continues to be deforested at the same rate, very soon the Chaco is not going to be a forest anymore.”
    Dr Valentina Bonifacio, lanloss

    She might also use satellite images on a website to illustrate people’s stories related to deforestation. ‘I think it would be an interesting outcome to have people’s subjective voices meshed in a storytelling process with these satellite data visuals to illuminate data that is quite abstract and quantitative,’ she said. ‘It would give it a human face and a human voice.’

    Previous research has typically focused on specific populations rather than examining different social groups together. Dr Blickstein hopes that her work will contribute to increasing awareness about deforestation in the Chaco and even help empower locals.

    ‘Interpreting (satellite) data with communities in the field means that they learn how to use these maps and this kind of data,’ said Dr Bonifacio. ‘It might turn out to be useful to them.’

    Power struggles

    Indigenous people who live in the Gran Chaco rely on the forest for food and materials. It is also fundamental to their culture.

    ‘To them the loss of the forest is nothing less than the end of the world,’ said Dr Graziano Ceddia, assistant professor at the Centre for Development and Environment of the University of Bern in Switzerland.

    Agricultural expansion drives deforestation, but the attitudes and aspirations of the different people involved are less clear. In the INCLUDE project, Dr Ceddia and his colleagues focused on better understanding the governance structures that underpin deforestation in the Chaco Salteño, a part of the forest located in the province of Salta in north-western Argentina.

    Bringing to light the perspectives of indigenous people and small-scale farmers living in the region affected by capital-intensive agriculturalisation was equally important. Their views and needs are often ignored when land-use decisions are made and they typically miss out on economic gains. ‘We wanted to give a voice to both of these marginalised groups who are paying most of the consequences of deforestation,’ said Dr Ceddia.

    Over the course of three years, Dr Ceddia and his colleagues talked to many different people in the region who have an interest in deforestation, such as academics, public administration and non-governmental organisation (NGO) employees, farmers and indigenous people.

    “To (indigenous people), the loss of the forest is nothing less than the end of the world.”
    Dr Graziano Ceddia, INCLUDE

    Overall, they found that large-scale producers were in a better position to influence government policies related to deforestation compared to other groups. They also found that deforestation was perceived differently by different groups of people. Large-scale producers, for example, typically associated forested areas with poverty and agricultural expansion with development. On the other hand, farmers and indigenous people referred to forests as their homes and their lives.

    Furthermore, Dr Ceddia and his colleagues found that land-use scenarios based on the views of indigenous people and farmers were more sustainable and environmentally just. Local farmers’ organisations, for example, have helped develop a switch to modes of production that are less damaging to remaining forests.

    Conversely, while looking more generally at tropical areas from Latin America to Southeast Asia, Dr Ceddia showed how cropland expansion, which contributes significantly to carbon emissions and biodiversity loss, is driven by investors. They choose to grow flex-crops such as oil palm, soy and sugar cane, since they have multiple uses, for example as food, fuel and animal feed. This means they are more likely to generate a profit compared to crops with a single use, often at the expense of the local people and the environment.

    ‘Agriculture is not necessarily oriented to the production of food but simply as a branch of investment which has to generate a certain return on invested capital,’ said Dr Ceddia.

    Enabling change

    Although research can provide information about the impact of deforestation, Dr Ceddia thinks that social activism is important to bring about change. He and his team found that laws to protect the forest were implemented more stringently in provinces of the Chaco in Argentina where indigenous people and small-scale farmers organised protests against deforestation.

    At the same time, in provinces where large-scale producers were better organised to protect their interests, deforestation laws were less strictly implemented. ‘I think what is important for change is grassroot movements and people acting on the ground,’ said Dr Ceddia. ‘It brings hope to see that there are some scientists who are also taking action and becoming activists.’


  • richardmitnick 3:12 pm on May 20, 2022 Permalink | Reply
    Tags: "AFMs": atomic force microscopes, "Cubit": Ancient measurement which was approximately the length of a forearm, "Metrology": the science of measurement, "Nanometre scale": A nanometre is a billionth of a metre., "The small things make a big difference in the science of measurement", Horizon - The EU Research and Innovation Magazine, In the interest of greater accuracy in the 1790s the French government commission standardized the metre as the basic unit of distance., Once the realm of research scientists nanoscales are increasingly important in industry., Since 2018 some key definitions of measurement units have been redefined., The kilo; the ampere; the kelvin and the mole are now based on fundamental constants in nature instead of physical models., The Romans used fingers and feet in their measurement systems while the story goes that Henry I of England (c 1068 - 1135) tried to standardize a yard as the distance from his nose to his thumb.

    From “Horizon” The EU Research and Innovation Magazine : “The small things make a big difference in the science of measurement” 


    19 May 2022
    Anthony King

    As technology shrinks to the nanoscale, measuring the things we can barely see becomes ever more important. © Rito Succeed, Shutterstock.

    Scientists must make ever more sophisticated measurements as technology shrinks to the nanoscale and we face global challenges from the effects of climate change.

    As industry works more and more on the nanometre scale (a nanometre is a billionth of a metre), there is a need to measure more reliably and accurately things we can barely see. This requires metrology, the science of measurement.

    Nano-scale metrology is useful in everyday life, for example to measure doses of medication or in the development of computer chips for our digital devices.

    ‘Metrology is needed everywhere that you make measurements or if you want to compare measurements,’ said Virpi Korpelainen, senior scientist at the Technical Research Centre of Finland and National Metrology Institute in Espoo, Finland.

    Since the earliest civilizations, standardized and consistent measurements have always been crucial to the smooth functioning of society. In ancient times, physical quantities such as a body measurement were used.

    One of the earliest known units was the cubit, which was approximately the length of a forearm. The Romans used fingers and feet in their measurement systems while the story goes that Henry I of England (c 1068 – 1135) tried to standardise a yard as the distance from his nose to his thumb.

    Standard units

    Standardization demands precise definitions and consistent measurements. In the interest of greater accuracy, in the 1790s a French government commission standardized the metre as the basic unit of distance. This set Europe on a path to the standardized international system of base units (SI), which has been evolving since.

    Since 2018, some key definitions of measurement units have been redefined. The kilo, the ampere, the kelvin and the mole are now based on fundamental constants in nature instead of physical artefacts. This is because physical objects change over time, as happened with the prototype of the kilo, which lost a tiny amount of mass in the 100 years after it was created. With this new approach, which was adopted after years of careful science, the definitions will not change.

    This evolution is often driven by incredibly sophisticated science, familiar only to metrologists, such as the speed of light in a vacuum (metre), the rate of radioactive decay (time) or the Planck constant (kilogram), all of which are used to calibrate key units of measurement under the SI.
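    A small illustration of what defining units through fixed constants buys in practice: because the speed of light and the caesium frequency now have exact defined values, any laboratory that can measure time precisely can realise the metre. The constants below are the exact SI values; the nanosecond example is just for intuition.

    ```python
    # Two of the exact defining constants of the revised SI:
    C = 299_792_458              # speed of light in vacuum, m/s (exact)
    DELTA_NU_CS = 9_192_631_770  # caesium-133 hyperfine frequency, Hz (exact)

    # The second is defined as 9 192 631 770 periods of the caesium
    # transition, and the metre as the distance light travels in 1/C of a
    # second, so length calibration reduces to time measurement. For
    # instance, light covers roughly 30 cm in one nanosecond:
    light_travel_ns = C * 1e-9   # metres travelled in 1 ns
    ```

    Because these numbers are fixed by definition, they can never drift the way a physical prototype can.
    
    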

    ‘When you buy a measuring instrument, people typically don’t think of where the scale comes from,’ said Korpelainen. This goes for scientists and engineers too.

    Once the realm of research scientists, nanoscales are increasingly important in industry. Nanotechnology, computer chips and medications typically rely on very accurate measurements at very small scales.

    Even the most advanced microscopes need to be calibrated, meaning that steps must be taken to standardise their measurements of the very small. Korpelainen and colleagues around Europe are developing improved atomic force microscopes (AFMs) in an ongoing project called MetExSPM.

    AFM is a type of microscope that gets so close to a sample, it can almost reveal its individual atoms. ‘In industry, people need traceable measurements for quality control and for buying components from subcontractors,’ said Korpelainen.

    The project will allow the AFM microscopes to take reliable measurements at nanoscale resolution by using high-speed scanning, even on relatively large samples.

    ‘Industry needs AFM resolution if they want to measure distances between really small structures,’ Korpelainen said. Research on AFMs has revealed that measurement errors are easily introduced at this scale and can be as high as 30%.
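    In practice, such errors are corrected by calibrating the scanner against a reference of traceably known dimensions. The sketch below is a hypothetical illustration of that idea, not the project’s procedure; the grating pitch and the raw reading are invented numbers.

    ```python
    # Hypothetical sketch: correcting an AFM's lateral scale against a
    # calibration grating whose pitch is traceably known.
    NOMINAL_PITCH_NM = 300.0    # certified pitch of the reference grating
    measured_pitch_nm = 291.3   # what the uncalibrated scanner reports

    scale = NOMINAL_PITCH_NM / measured_pitch_nm  # correction factor (~1.03)

    def calibrate(raw_nm):
        """Convert a raw AFM distance reading into a traceable length in nm."""
        return raw_nm * scale
    ```

    Every subsequent reading on an unknown sample is multiplied by the same factor, tying the measurement back to the traceable reference.
    
    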

    The demand for small, sophisticated, high-performing devices means the nanoscale is growing in importance. Korpelainen used an AFM microscope and lasers to calibrate precision scales for other microscopes.

    She also coordinated another project, 3DNano, in order to measure nanoscale 3D objects that are not always perfectly symmetrical. Precise measurements of such objects support the development of new technology in medicine, energy storage and space exploration.

    Radon flux

    Dr Annette Röttger, a nuclear physicist at PTB, the national metrology institute in Germany, is interested in measuring radon, a radioactive gas with no colour, smell or taste.

    Radon is naturally occurring. It originates from decaying uranium below ground. Generally, the gas leaks into the atmosphere and is harmless, but it can reach dangerous levels when it builds up in dwellings, potentially causing illness to residents.

    But there is another reason Röttger is interested in measuring radon. She believes it can improve the measurement of important greenhouse gases (GHG).

    ‘For methane and carbon dioxide, you can measure the amounts in the atmosphere very precisely, but you cannot measure the flux of these gases coming out of the ground, representatively,’ said Röttger.

    ‘Flux’ is the rate of seepage of a gas. It is a helpful measurement for tracing the quantities of other GHG, such as methane, that also seep out of the ground. Measurements of methane coming out of the ground are so variable that one spot will differ from another just a few steps away. The flow of radon gas out of the ground closely tracks the flow of methane, a damaging GHG with both natural and human origins.

    When radon gas emissions from the ground increase, so do carbon dioxide and methane levels. ‘Radon is more homogenous,’ said Röttger, ‘and there is a close correlation between radon and these greenhouse gases.’ The research project to study it is called traceRadon.
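    The proxy idea can be sketched as an ordinary least-squares fit of methane flux against radon flux; the numbers below are invented purely to illustrate the approach and are not traceRadon data.

```python
# Illustrative sketch: estimating methane flux from an easier radon flux
# measurement via a linear fit. All values are made up for the example.
radon_flux = [10.0, 20.0, 30.0, 40.0]    # Bq per m^2 per hour
methane_flux = [1.1, 2.0, 3.1, 3.9]      # mg per m^2 per hour

n = len(radon_flux)
mean_r = sum(radon_flux) / n
mean_m = sum(methane_flux) / n
slope = sum((r - mean_r) * (m - mean_m)
            for r, m in zip(radon_flux, methane_flux)) \
        / sum((r - mean_r) ** 2 for r in radon_flux)
intercept = mean_m - slope * mean_r

def methane_from_radon(r: float) -> float:
    """Predict methane flux from a radon flux reading."""
    return slope * r + intercept

print(methane_from_radon(25.0))
```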

    Radon is measured via its radioactivity but because of its low concentrations it is very challenging to measure. ‘Several devices will not work at all, so you will get a zero-reading value because you are below the detection limit,’ said Röttger.

    Wetland rewetting

    Measuring the escape of radon enables scientists to model the rate of emissions over a landscape. This can be useful to measure the effects of climate mitigation measures. For example, research indicates that the rapid rewetting of drained peatland stores greenhouse gas and mitigates climate change.

    But if you go to the trouble of rewetting a large marshland, ‘You will want to know if this worked,’ said Röttger. ‘If it works for these GHG, then we should see less radon coming out too. If we don’t, then it didn’t work.’

    With more precise calibration, the project will improve radon measurements over large geographical areas. This may also be used to improve radiological early warning systems in a European monitoring network called the European Radiological Data Exchange Platform (EURDEP).

    ‘We have lots of false alarms (due to radon) and we might even miss an alarm because of this,’ said Röttger. ‘We can make this network better which is increasingly important for radiological emergency management support by metrology.’

    Given the intensity of the climate crisis, it is crucial to present reliable data for policy makers, added Röttger. This will assist greatly in addressing climate change, arguably the biggest threat mankind faces.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 1:10 pm on May 20, 2022 Permalink | Reply
    Tags: "Q&A- 'People have to be at the centre of the energy transformation'", Horizon - The EU Research and Innovation Magazine, Nebojsa Nakicenovic: Vice-Chair- Group of Chief Scientific Advisors (GCSA)

    From “Horizon” The EU Research and Innovation Magazine : “Q&A- ‘People have to be at the centre of the energy transformation’”

    From “Horizon” The EU Research and Innovation Magazine

    17 May 2022
    Kevin Casey

    Nebojsa Nakicenovic, Vice-Chair, Group of Chief Scientific Advisors (GCSA)

    In June 2021, the EU’s Group of Chief Scientific Advisors (GCSA) published the Scientific Opinion entitled “A systemic approach to the energy transition in Europe”, arguing that the clean energy transition in the European Green Deal must keep people at its centre. In light of tomorrow’s EU announcement that is critical to the future of energy supply in Europe, we invite GCSA Vice-Chair Nebojsa Nakicenovic to comment on the centrality of a just transition and the importance of staying focused on a clean energy future even at times of intensifying pressure.

    Tell us why the European Commission even needs a scientific opinion at all. Does not the evidence speak for itself?

    This publication (A Systemic Approach to Energy Transition in Europe) is part of the Science Advice Mechanism (SAM) of the European Commission. From my perspective, this is a unique way of providing scientific advice to decision makers. Many governments have chief scientific advisors with that function. What is unique about SAM in the European Commission is that it has three independent parts.

    First, there is the Group of Chief Scientific Advisors, who provide the scientific opinion. There are very clear process rules about how that happens. The other independent part is the so-called SAPEA (Science Advice for Policy by European Academies). This is a consortium of over 100 European academies. They provide a scientific evidence review, similar to the climate change assessment of the IPCC (Intergovernmental Panel on Climate Change).

    The assessment is a scientific analysis of what we know about a particular topic. They (SAPEA) do not provide a scientific opinion or scientific advice, importantly they look into the possible options. We, the group of seven chief scientific advisors, based on this evidence review — evidence, so factual scientific knowledge — provide a scientific opinion to the European Commission.

    There is also a unit in the Commission that catalyses this process. The three groups work closely together but we are independent. That explains the context. Why would we provide a scientific opinion? It is because the topic is considered really crucial and central to the multiple crises facing Europe and the world.

    Does a just transition require a transformation of the economic model of energy services? People own the problem, should they not own the solution too?

    That is precisely what we have tried to address in our scientific opinion – based on the scientific evidence. We didn’t go beyond the scientific evidence.

    Energy cannot be seen as a silo. We – people – have to be at the centre. That means it has to be an inclusive process involving everybody and, importantly, not leaving anyone behind. There is a great danger that any transformation leads to winners – and hopefully there will be many, many winners – but also to people who fall through the cracks, who might be left behind and do not have an escape hatch. Identifying how to prevent that was a high priority.

    The EU’s Group of Chief Scientific Advisors argue that the clean energy transition in the European Green Deal must keep people at its centre. ©Alexanderstock23, Shutterstock.

    In our scientific opinion – and in fact we say this explicitly – it is essential that sustainable energy, lifestyles and behaviours become the preferred choice for people – a natural choice. For that, we have to create an environment that allows it. This is clearly very, very complex; I don’t think anybody has a silver bullet for that question.

    The world has changed since the paper was published in June 2021. In particular war, inflation and recent dire warnings from the IPCC about rising temperatures. How does that affect your opinion on a just transition?

    I have to be very careful to distinguish what is in our scientific opinion based on the evidence and what is my personal view. It’s important not to mix the two, or I would not be reflecting the scientific advice mechanism, which I think is unique – I just want to make that clear. What follows is my private opinion, based on our scientific opinion but not in it.

    Geopolitics are changing. There is no doubt that we are at a crucial moment in history. And this is why we argued before – again, my view – that we shouldn’t lose sight of the long-term objectives.

    We are likely to exceed 1.5 degrees – it is almost certain that by 2040 we will be above (the limit prescribed), perhaps even earlier. From the scientific point of view, this is not new.

    From the policy point of view and behavioural point of view, this is something one needs to somehow internalise. We will exceed that goal and we will bear the dangerous consequences. But, we should not lose the perspective of doing our utmost to reach 1.5 degrees in the future – and for that we need to act now.

    This is another dimension of justice – intergenerational justice. We have to make sure that we leave the planet to the future generations (hopefully) in better condition than what will occur over the next decade or two.

    Is it even possible for the European Green Deal to achieve ‘a clean, circular economy, a modern, resource-efficient and competitive economy’ by 2050?

    Again, we are in the realm of opinion. Nobody can tell what the future will be like.

    I was very enthusiastic when in 2015 all of the world adopted the UN’s Sustainable Development Goals (SDGs) and when there was the Paris Agreement on climate change. I think those were the two really important visionary steps towards this aspirational transformation that we were talking about.

    I would also argue that the European Green Deal, Fit for 55 and New European Bauhaus initiatives are even more actionable in some sense. They provide a clearer agenda for how the world and life might and should look in 2050.

    I don’t want to sound too pessimistic and again let me add, this is my personal perspective – you know, 30 years is a long enough time to achieve this transformation.

    We have done that before. The most recent example is mobile phones. It all started in 1990 and today, everybody in the world has a phone. Even the poorest people have a phone because it has enabled new economic activities, because it’s beneficial for many (despite the nuisance of always being reachable!)

    Another example, just to show in principle this is doable, is the replacement of horses by motor vehicles. That also took 30 years in most countries. We have 30 years to replace our vehicle fleet with hydrogen and electric vehicles. We have just enough time for the transformation if we act immediately.


  • richardmitnick 11:14 am on May 13, 2022 Permalink | Reply
    Tags: "Taking pray out of spray and Spotify for blossoms", Digital solutions will help citrus farmers reduce use of pesticides., Horizon - The EU Research and Innovation Magazine

    From “Horizon” The EU Research and Innovation Magazine : “Taking pray out of spray and Spotify for blossoms” 

    From “Horizon” The EU Research and Innovation Magazine

    09 May 2022
    Andrew Dunne

    Digital solutions will help citrus farmers reduce use of pesticides. © VitiGroup, 2022.

    Each day, millions of us load fresh fruit and flowers into our shopping baskets. The global trade in cut flowers and citrus fruits together is worth around €30 billion. When it comes to embracing technology, however, these big businesses tend to remain stuck in the past.

    In 2017, serendipity brought Martina Drobná into contact with biologist and entomologist, Dr Bruno Gábel.

    ‘It all started with my mum,’ said Drobná, a Slovakian-based graphic designer and coordinator of the CITRUS-PORT project.

    ‘She told me about her new neighbour in Modra (Slovakia), a brilliant scientist, who was working on ways to help fruit growers predict the disease and pest risk for their crops,’ she said.

    When the two met, Gábel explained how his hand-written calculations had helped local grape growers by forecasting likely disease risk at different times. This meant producers could target spraying when treating their crops and they didn’t need to spray continuously as they had in the past. Drobná was sold.

    Together with IT entrepreneur Roman Korbačka they created a website and app called VitiPort.

    ‘Our aim was to create a decision support system – which could advise growers on whether or when to treat crops,’ she explained. Calling themselves VitiGroup, the trio have since commercialised VitiPort and have also developed a sister platform, Genimen-port, for apples and pears.

    Lemon squeezy

    Now they are using the same technology to help citrus growers of, for example, lemons, grapefruit and oranges, many of whom are based in Mediterranean countries. Every year, citrus farmers’ crops are threatened by disease and pests such as grey mould, brown rot and black spot fungal disease, as well as orange tortrix and the citrus leafminer.

    ‘In many growing areas we find it’s common to spray every week but that soon becomes a vicious cycle of over-spraying,’ said Drobná. ‘The more you spray, the more the pathogen builds up resistance and the more you then need to spray to control it; it just gets worse and worse.’

    With CitrusPort – which is soon to become operational – the team hopes to challenge conventional wisdom about the importance of spraying by providing timely, user-friendly information about when and how best to treat. Subscribers download an app, find their farm via GPS and from then on, at the start of each week receive an accurate indication of disease risk and advice about treatment.
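    A decision-support rule of this kind can be sketched as a simple threshold function. The risk score, thresholds and advice strings below are hypothetical and are not CitrusPort’s actual model.

```python
# Hypothetical sketch of a weekly spray recommendation from a disease-risk
# forecast; thresholds and wording are invented for illustration.
def spray_advice(disease_risk: float, days_since_last_spray: int) -> str:
    """Map a 0-1 disease-risk forecast to a weekly recommendation."""
    if disease_risk >= 0.7:
        return "treat now"
    if disease_risk >= 0.4 and days_since_last_spray > 14:
        return "treat this week"
    return "no treatment needed"

print(spray_advice(0.2, 30))  # "no treatment needed"
```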

    Given time, Drobná hopes that CitrusPort can replicate some of the impressive results the team already achieved for grapes and apples. In Champagne, where VitiPort has operated since 2018, targeted information about when to spray has helped decrease pesticide use by 58%.

    They see global potential for CitrusPort with interest already coming in from Australia, the US and Tunisia.

    Spotify for flowers

    Entrepreneur Eric Egberts has a passion for flowers and a 30-year history in the business. One thing that increasingly troubles him is restricted markets and limited consumer choice.

    ‘I used to see all kinds of flowers on my travels and would bring different types home for friends,’ he said. ‘Everyone would always ask, “Where can I get these?” I realised that in most florists, consumer choice was restricted to just the best-sellers such as red roses, pink daisies, white and yellow chrysanthemums,’ said Egberts.

    He likens the problem in the flower industry to the music industry pre-streaming services. ‘Like with music back when we just listened to the radio, with flowers you only ever get the options that the DJ plays,’ he said. ‘But there are so many more options that could be available.’

    His concept is to unlock the hidden gems of the flower world through digital innovation. With his EU-funded FLOURISH project – which comprises growers, wholesalers, and a team of computer scientists – he hopes to do for the flower industry what Spotify has done for music.

    The plan is to increase consumer choice while offering a new route to market for growers.

    Empowered flowers

    His system enables florists to pool their inventories, Egberts explained. Instead of relying on florists to present ready-made bouquets, Egberts’ innovation puts consumers in charge of mixing and matching flowers to suit their moods and desires, all via an app.

    ‘Our system can create automatic bouquets based on a variety of consumer preferences, as well as their previous choices, potentially introducing thousands of new varieties to customers,’ he said.

    ‘For the consumer it’s all about tailoring bouquets to suit particular tastes and personalisation. For the growers, it’s about opening access to new markets, but also offering them better insight into where their flowers are going and where they are popular,’ he said.

    From the environmental perspective, the system allows you to choose to have flowers that are produced with a lower carbon footprint or from local growers only. ‘Currently, as a consumer, you have very limited knowledge about any of these factors,’ he said.

    Admittedly, consumer costs would be slightly higher and delivery times slightly longer as different flowers get sourced from different locations. However, Egberts is convinced demand is there for his idea.

    He has already spun out part of the work through the BloomyPro website, and the team now wants a big commercial partner to come on board to provide scale.

    Egberts is passionate about flowers and has big ambitions for the work. ‘Flowers are key to so many important milestones in our lives,’ he said. ‘I want to bring flowers to more people, and I want to be in the cell phone of every person in the world to help do this.’

    Each of these projects was supported by the European Innovation Council (EIC), which helps game-changing innovations to scale up. Follow the link to learn more about the EIC.


  • richardmitnick 8:02 am on May 6, 2022 Permalink | Reply
    Tags: "Finding the missing links of black hole astronomy", Horizon - The EU Research and Innovation Magazine

    From Horizon The EU Research and Innovation Magazine : “Finding the missing links of black hole astronomy” 

    From Horizon The EU Research and Innovation Magazine

    05 May 2022
    Anthony King

    An accreting supermassive black hole (SMBH) in a fairly local galaxy with very large and extended radio jets. © R. Timmerman; LOFAR & Hubble Space Telescope.

    A deeper understanding of black holes could revolutionise our understanding of physics, but their mysterious nature makes them difficult to observe.

    The weirdness exhibited by black holes boggles the mind. Formed when a star burns all its nuclear fuel and collapses under its own gravitation, black holes are such oddities that at one time, even Einstein didn’t think they were possible.

    They are regions in space with such intense gravitation that not even light escapes their pull. Once magnificent shining stars burn out and shrink to a relatively tiny husk, all their mass is concentrated in a small space. Imagine our Sun with its diameter of roughly 1.4 million kilometres shrinking to a black hole the size of a small city just six kilometres across. This compactness gives black holes immense gravitational pull.
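    The “small city” figure can be checked with the Schwarzschild radius, r_s = 2GM/c^2, which gives the size a mass must shrink to before light can no longer escape. A quick back-of-the-envelope in Python:

```python
# Schwarzschild radius of one solar mass: r_s = 2 * G * M / c^2.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # mass of the Sun, kg

r_s = 2 * G * M_SUN / C**2   # about 2,950 metres
diameter_km = 2 * r_s / 1000
print(round(diameter_km, 1))  # about 5.9 km, matching the "six kilometres"
```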

    Not only do they trap light, black holes can shred any stars they encounter and even merge with each other. Events like this release bursts of energy that are detectable from billions of light years away.

    The Nobel Prize in Physics 2020 was shared by scientists who discovered an invisible object at the heart of the Milky Way that pulls stars towards it. This is a supermassive black hole, or SMBH, and it has a mass that is millions of times that of our sun.

    “At the heart of every massive galaxy, we think there is a supermassive black hole,” said astrophysicist Dr Kenneth Duncan at the Royal Observatory in Edinburgh, UK. “We also think they play a really important role in how galaxies form, including the Milky Way.”

    Galactic monsters

    Supermassive black holes are gravitating monsters of the Universe. ‘Black holes at the centre of galaxies can be between a million and a few billion times the mass of our Sun,’ said Professor Phillip Best, astrophysicist at The University of Edinburgh (SCT).

    They pull in gas and dust from their surroundings, even objects as large as stars. Just before this material falls in towards the black hole’s event horizon or point of no return, it moves quickly and heats up, emitting energy as energetic flashes. Powerful jets of material that emit radio waves may also spew out from this ingestion process.

    These can be detected on Earth using radio telescopes such as Europe’s LOFAR, which has detectors in the UK, Ireland, France, the Netherlands, Germany, Sweden, Poland and Latvia.

    Duncan is tapping LOFAR observations to identify the massive black holes in a project called HIZRAD. ‘We can detect growing black holes further back in time,’ said Duncan, ‘with the goal being to find the very first and some of the most extreme black holes in the Universe.’

    LOFAR can pinpoint even obscured black holes. Duncan has used artificial intelligence techniques to combine data from LOFAR and telescope surveys to identify objects of interest.

    Better instruments

    Better instruments will soon assist in this task. An upgrade to the William Herschel Telescope on La Palma, Spain, will allow it to observe thousands of galaxies at the same time. A spectroscope called WEAVE has the potential to detect supermassive black holes and to observe star and galaxy formation.

    Radio signals indicate that supermassive black holes existed as early as the first 5-10% of the Universe’s history. Some of these hold a billion solar masses, explained Best, who is the research supervisor.

    The surprising part is that these giants existed at the early stages of the Universe. “You’ve got to get all this mass into a very small volume and do it extremely quickly, in terms of the Universe’s history,” said Best.

    We know that following the Big Bang, the Universe began as an expanding cloud of primordial matter. Studies of the cosmic background radiation indicate that eventually clumps of matter came together to form stars. However, ‘The process where you form a black hole as large as a billion solar masses is not fully understood,’ said Best.

    Intermediate black holes

    While studies of SMBHs are ongoing, Dr Peter Jonker, astronomer at Radboud University [Radboud Universiteit](NL), is intrigued by the formation of black holes of intermediate scale.

    He is studying the possible existence of intermediate black holes (IMBH) with the imbh project [CORDIS]. He notes that supermassive black holes have been observed from when the Universe was only 600 million years old. Scientists estimate the overall age of the universe to be around 13.8 billion years.

    “The Universe started out like a homogenous soup of material, so how do you get clumps that weigh a billion times the mass of the sun in a very short time?” said Jonker.

    While supermassive black holes would consume compact stellar remnants known as white dwarfs in their entirety, IMBHs should be powerful enough only to shred them, emitting a revealing flash of energy.

    ‘When a compact star, a white dwarf, is ripped apart, it can be ripped only by intermediate mass black holes,’ said Jonker. ‘Supermassive black holes eat them whole.’ There are strong indications that intermediate black holes are out there, but there’s no proof yet.

    He is searching for flashes of intense X-ray energy that indicate the presence of an intermediate black hole. The problem is that when signals are detected, the intense flashes last just a few hours. This means the data arrives too late to be able to turn optical telescopes towards the source for observations.

    “This happens once in 10,000 years per galaxy, so we haven’t seen one yet in our Milky Way,” said Jonker.

    Jonker also seeks to observe the expected outcome of two black holes spinning and merging, then emitting a gravitational wave that bumps nearby stars. However, to discern these stars being jolted necessitates powerful space-based telescopes.

    X-ray flashes

    The Gaia satellite, launched in 2013, is providing some assistance, but a planned mission called Euclid will take higher resolution images and may help Jonker prove IMBHs exist.

    This satellite was due to be launched on a Russian rocket; it will now be launched, with a slight delay, on a European Ariane 6 rocket.

    Nonetheless, a small satellite – the Chinese Einstein Probe – is scheduled for launch in 2023 and will look out for flashes of X-ray energy that could signify intermediate black holes.

    Duncan in Edinburgh says that the search for intermediate black holes ties in with his own quest. ‘It can potentially help us solve the question of where the supermassive ones came from,’ he said.

    Right now, physicists rely on quantum theory and Einstein’s equations to describe how the Universe works. These cannot be the final say, however, because they do not fit well together.

    “The theory of gravity breaks down near a black hole, and if we observe them closely enough,” said Jonker, “our expectation is that we will find deviations from the theory and important advances in understanding how physics works.”

    The research in this article was funded by the EU.


  • richardmitnick 3:13 pm on December 10, 2021 Permalink | Reply
    Tags: "Is Europe entering a golden age of astronomy?", Horizon - The EU Research and Innovation Magazine

    From Horizon The EU Research and Innovation Magazine : “Is Europe entering a golden age of astronomy?” 

    From Horizon The EU Research and Innovation Magazine

    08 December 2021
    Gareth Willmer

    Europe’s largest astronomy network brings together around 20 telescopes and telescope arrays. © vchal, Shutterstock.

    Groundbreaking discoveries about gravitational waves, black holes, cosmic rays, neutrinos and other areas of cutting-edge astronomy may soon become more frequent due to the convergence of two major communities of astronomers in a fresh project.

    For the past couple of decades, Europe has had two major collaborative networks for ground-based astronomy, known as OPTICON and RadioNet. These focused on observing astronomical phenomena in separate wavelength ranges of the electromagnetic spectrum – the former at optical wavelengths, in a portion of the spectrum that includes visible light; and the latter at longer, radio wavelengths.

    Now, these two domains of astronomy are uniting in a project called the OPTICON RadioNet Pilot (ORP)(EU), a consortium of astronomers from 37 institutions and 15 European countries, plus Australia and South Africa.

    Referring to itself as Europe’s largest astronomy network, the initiative was set up in light of the increasing need for astronomers to have a range of skills in different domains and use complementary techniques to understand phenomena. It also brings together around 20 telescopes and telescope arrays owned by members of the consortium, with the aim of harmonising methods and tools between the two domains, and opening up physical and virtual access to facilities.

    ‘There are some people who are experts in both domains, but these are different communities,’ said Dr Jean-Gabriel Cuby at The National Centre for Scientific Research [Centre national de la recherche scientifique [CNRS](FR) and The Aix-Marseille University [Aix-Marseille Université](FR), and coordinator for the ORP project. ‘I was trained as an optical astronomer, and other people were trained as radio astronomers. Now, we need also to train wavelength-neutral astronomers.’

    He explained that the more you can observe about phenomena at different wavelengths, the more of a picture you can build. ‘Multi-wavelength astronomy is about observing across the whole spectral domain to have as much information as possible,’ he said. ‘The light we receive in optical and radio wavelengths comes from different physical processes; so the more we observe in terms of wavelength coverage, the more we learn about the physical processes.’

    Dr Cuby said the aim is to facilitate and speed up the process of getting telescope time for projects that require different facilities – which can be a long-winded process – making it easier for people to do more ambitious projects that previously required vast management efforts.

    The telescope facilities include the likes of LOFAR, a trans-European low-frequency radio telescope network based in the Netherlands, and EVN, a network of radio telescopes located mainly in Europe and Asia, with additional antennas in South Africa and Puerto Rico.

    ASTRON Institute for Radio Astronomy(NL) LOFAR Radio Antenna Bank(NL)

    ASTRON (NL) LOFAR European Map.

    IPTA-International Pulsar Timing Array-Clockwise from upper left: Green Bank Radio Telescope (US), Arecibo Radio Telescope (US) [no longer in service], Nancay Radio Telescope (FR), Lovell Radio Telescope Cheshire (UK), Parkes Radio Telescope (AU), LOFAR Radio Telescope Exloo (NL), GMRT Pune India, Westerbork Radio Telescope (NL), Effelsberg Radio Telescope (DE)


    European Very Long Baseline Interferometry Network

    GMVA The Global VLBI Array

    Multi-messenger age

    Dr Cuby elaborated on how the need is growing to foster harmonisation between domains in the current age of so-called multi-messenger astronomy. This involves the observation of various “messenger” particles – such as gravitational waves, neutrinos and cosmic rays – that can reveal different information about the same sources, potentially giving unprecedented insight into the universe and its origins.

    Harmonisation is also key for time-domain astronomy, which explores how astronomical events vary over time. Events now being explored are frequently transient, with many, like fast radio bursts, lasting mere milliseconds. Capturing multiple aspects of them thus requires rapid deployment of telescopes and facilities, which can again be aided by collaboration. ‘This time-domain astronomy is going to explode in the coming years,’ said Dr Cuby. ‘This is really the golden age of astronomy.’

    Professor Gerry Gilmore, a cosmologist at The University of Cambridge (UK) who is involved in ORP as scientific coordinator for OPTICON, elaborated further. “That’s the sort of science we now do, where you discover something that’s usually highly variable and very often it’s transient,” he said. “It’s all over very quickly and you don’t get another chance. You want then to be able to bring the whole array of potential capabilities into looking at that particular place in the sky now.”

    Previously, said Prof. Gilmore, capturing a transient event relied on a huge amount of luck in looking in the right place at the right time, but ORP provides a chance to “plan to be lucky” through more targeted efforts between different researchers and opens up the “discovery space” in astronomy.

    “As soon as the technology became available to start looking for shorter and shorter timescale events, hey presto, we discovered they’re all there – the universe is full of stuff. And it’s the most extreme things that happen fastest.”

    Gravitational waves

    Much of this multi-messenger and time-domain astronomy is in its infancy, but is being opened up by advances in technologies and new deployments of cutting-edge observatories around the world.

    One emerging area that ORP hopes will be spurred by collaboration is that of gravitational waves. First detected in 2015, these are ripples in space-time formed by some of the universe’s most cataclysmic events, such as pairs of black holes colliding.

    This November, an international team of astronomers announced the detection of a record number of gravitational waves, adding 35 new observations over the course of roughly six months to bring the total to 90 so far.

    LIGO Virgo Kagra Masses in the Stellar Graveyard. Credit: Frank Elavsky and Aaron Geller at Northwestern University(US).

    The findings, they believe, will help further our understanding of the evolution of the universe, and topics such as the life and death of stars.

    With the related study listing more than 1,600 authors from all corners of the world, and harnessing around 100 ground- and space-based instruments – including visible, infrared and radio telescopes, neutrino and gamma-ray observatories, and X-ray instruments – this reflects the hugely extensive collaboration taking place in modern astronomy.

    One of the authors, Dr Sarp Akcay, a theoretical physicist at The University College Dublin (IE) who is not involved in ORP, said the ORP initiative looks promising for inspiring more rapid discoveries.

    “This type of large-scale collaboration will be extremely helpful for gravitational-wave astronomy, and even more so for so-called multi-messenger astronomy,” he said. “With more telescopes joining a global network, follow-up observations can be made quicker in the future, adding to our knowledge of these events.”

    Prof. Gilmore said, meanwhile, that although the main focus of ORP is on inspiring collaboration rather than carrying out specific investigations itself, a test case for the project is combining the search for black holes in the optical and radio wavelengths to find out more about their nature, exactly how common these objects are, and whether theories about them are correct.

    And with the Milky Way alone thought to harbour millions of black holes, which are often formed by the death of massive stars, there’s a vast amount to find out. “There’s a handful of them that have been observed in very special circumstances,” said Prof. Gilmore. “So we’ve seen the tip of the iceberg, but we predict that there are huge numbers of them.”
    Long-term view

    Though it’s early days for ORP, which launched this March, and the exact way it develops is yet to be seen, Dr Cuby and his team hope that the pilot can later transition into a sustainable long-term project beyond its current scheduled duration until early 2025. The aim is also to enable open access to those around the world, broadening the scope for involvement of previously under-represented researchers and countries.

    Prof. Gilmore said, meanwhile, that the separate communities have been increasingly converging in recent years, while the OPTICON and RadioNet projects have already established strong collaborative networks in their individual domains over many years. “The community has been changing steadily over the last few decades,” he said. “People have been forming teams and using a range of facilities for a given scientific topic. Multi-wavelength astronomy is the reality of the way we actually do it these days.”

    With the ORP project, he said: “Now, it should be possible for a group of young, enthusiastic scientists just to choose their leader, she writes the proposal, and ping – off the team goes”.

    Professor Anton Zensus, scientific coordinator for RadioNet in the ORP project, believes the initiative is a “crucial step” in furthering the field of astronomy that will allow a much richer picture of the universe.

    “Multifrequency use allows us to better understand the secrets of the universe,” he said. “ORP will allow a fast reaction to unexpected and transient astronomical phenomena in the sky, such as gamma ray bursts. We aim on getting a full image illuminating all aspects of phenomena.”

    Dr Zensus added that bringing the radio and optical communities together to harmonise astronomy is a “crucial step to make it attractive for users from all astronomical communities” and help open up this area of science to non-specialist users too. “A multi-messenger approach will deepen our understanding of astronomy phenomena, and at the same time create new questions and approaches,” he said.

    [This writer is forced to ask: Will Europe eclipse the U.S.A. in astronomy, especially as it did in High Energy Physics? When the U.S. Congress foolishly cancelled the Superconducting Super Collider [SSC] in Texas, the U.S. allowed the hegemony in High Energy Physics to move to the Large Hadron Collider [LHC] at CERN on the Swiss-French border. The LHC powered up to 14 TeV and found the Higgs Boson after Fermilab’s Tevatron could not develop the energy to do the job. The Tevatron never actually powered up to even 2 TeV, whereas the SSC was much larger and would have quickly developed 20 TeV in each direction.

    Now the NSF seems to be abandoning Radio Astronomy, having defunded the Arecibo Radio Telescope in Puerto Rico and reduced funding for the Green Bank Radio Telescope in West Virginia, while the European Union has committed €20 billion to Radio Astronomy.

    So will the U.S.A. again be eclipsed by Europe?]

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 9:19 am on November 5, 2021 Permalink | Reply
    Tags: "Keeping one step ahead of earthquakes", Horizon - The EU Research and Innovation Magazine

    From Horizon The EU Research and Innovation Magazine : “Keeping one step ahead of earthquakes” 

    From Horizon The EU Research and Innovation Magazine

    03 November 2021
    Nick Klenske

    As technologies continue to improve, earthquake-prone cities will be better prepared. © Marco Iacobucci Epp, Shutterstock.

    While accurately predicting earthquakes is in the realm of science fiction, early warning systems are very much a reality. As advances in research and technology make these systems increasingly effective, they’re vital to reducing an earthquake’s human, social and economic toll.

    Damaging earthquakes can strike at any time. While we can’t prevent them from occurring, we can make sure casualties, economic loss and disruption of essential services are kept to a minimum.

    Building more resilient cities is key to withstanding earthquake disasters. If we had a better idea of when earthquakes would strike, authorities could initiate local emergency, evacuation and shelter plans. But unfortunately, this is not the case.

    ‘Because earthquakes occur on faults, we know where they will occur. The problem is that we don’t know how to predict when an earthquake will strike,’ explained Quentin Bletery, from the Research Institute for Development (IRD) in France. He is a researcher at the Géoazur laboratory at The University of Côte d’Azur [Université Côte d’Azur](FR).

    ‘Successful earthquake prediction must provide the location, time and magnitude of a future event with high accuracy, [something] which as of now, can’t be done,’ added Johannes Schweitzer, Principal Research Geophysicist at NORSAR, an independent research foundation specialised in seismology and seismic monitoring.

    Potential of AI to improve the accuracy and speed of early warning systems

    Earthquake early warning (EEW) systems are evolving rapidly thanks to advances in computer power and network communication.

    EEW systems work by identifying the first signals generated by an earthquake rupture before the strongest shaking and any tsunami reach populated areas. These signals emanate from the earthquake’s origin and can be recorded seconds before the seismic waves arrive.

    A promising, recently identified early signal is the prompt elasto-gravity signal (PEGS), which travels at the speed of light but is a million times smaller than seismic waves, and therefore, often goes undetected.
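The appeal of PEGS for early warning comes down to timing: a signal moving at the speed of light arrives essentially instantly, while seismic waves take tens of seconds to cover the same distance. A back-of-envelope sketch makes this concrete; the wave speeds below are assumed typical crustal values for illustration, not EARLI project parameters.

```python
# Hedged sketch: lead time gained by detecting the prompt elasto-gravity
# signal (PEGS) rather than waiting for seismic waves. Wave speeds are
# illustrative assumptions, not EARLI project values.

P_WAVE_SPEED_KM_S = 7.0  # typical crustal P-wave speed (assumed)
S_WAVE_SPEED_KM_S = 4.0  # typical crustal S-wave speed (assumed)
SPEED_OF_LIGHT_KM_S = 299_792.458

def warning_lead_time(distance_km: float) -> dict:
    """Arrival times (seconds) at a sensor `distance_km` from the epicentre.

    On these scales PEGS travel time is effectively zero, so the lead
    time over the P-wave is simply the P-wave travel time.
    """
    p_arrival = distance_km / P_WAVE_SPEED_KM_S
    s_arrival = distance_km / S_WAVE_SPEED_KM_S
    return {
        "pegs_s": distance_km / SPEED_OF_LIGHT_KM_S,  # ~0
        "p_wave_s": p_arrival,
        "s_wave_s": s_arrival,
        "lead_over_p_s": p_arrival,
    }

# At 200 km, the fastest seismic waves need almost 30 seconds,
# while PEGS arrives in under a millisecond.
times = warning_lead_time(200.0)
```

The practical difficulty, as the article notes, is that the signal is a million times smaller than seismic waves, which is why detecting it is a machine-learning problem rather than a timing problem.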

    According to Bletery, artificial intelligence (AI) could play a key role in identifying this signal. With the support of the EARLI project, he is leading an effort to develop an AI algorithm capable of doing exactly that.

    “Our AI system aims to increase the accuracy and speed of early warning systems by enabling them to pick up an extremely weak signal that precedes even the fastest seismic waves,” said Bletery.

    Although the project is still in its very early stages, Bletery says that if it succeeds, public authorities will have access to nearly instantaneous information about an earthquake’s magnitude and location. “This would allow them to take such immediate mitigation efforts as, for example, shutting down infrastructure like trains and nuclear power plants and moving people to earthquake- and tsunami-safe zones,” he noted.

    Statistical technique to enhance seismic resilience

    Another approach to improving seismic resilience and reducing human losses is operational earthquake forecasting (OEF). TURNkey, led by NORSAR, aims to improve the effectiveness of this statistical technique, which is used to study seismic sequences and provide timely warnings.

    “OEF can inform us about changing seismic hazards over time, enabling emergency managers and public authorities to prepare for a potentially damaging earthquake,” explained Ivan Van Bever, TURNkey project manager. “What OEF can’t do, is provide warnings with a high level of accuracy.”

    In addition to improving existing methods, TURNkey is developing the “Forecasting – Early Warning – Consequence Prediction – Response” (FWCR) platform to increase the accuracy of earthquake warnings and ensure that all warning-related information is sent to end-users in a format that is both understandable and useful.

    “The platform will forecast and issue warnings for aftershocks and will improve the ability for users to estimate both direct and indirect losses,” said Van Bever.

    Better prepared than ever

    The platform is currently being tested at six locations across Europe: Bucharest (Romania), the Pyrenees mountain range (France), the towns of Hveragerdi and Husavik (Iceland), the cities of Patras and Aigio (Greece), and the port of Gioia Tauro (Southern Italy). It is also being tested in Groningen province (Netherlands), which is affected by induced seismicity – minor earthquakes and tremors caused by human activity that alters the stresses and strains on the Earth’s crust.

    Johannes Schweitzer, who is the project coordinator, is confident the multi-sensor-based earthquake information system will prove capable of enabling early warning and rapid response. “The TURNkey platform will close the gap between theoretical systems and their practical application in Europe,” remarked Schweitzer. “In doing so, it will improve a city’s seismic resilience before, during and after a damaging earthquake.”

    “As these technologies and systems continue to improve, they could reduce an earthquake’s human, social and economic toll,” added Bletery.

    Earthquake-prone cities will be better prepared than ever before. At the very least these new systems will give people a heads up to drop, cover and hold on during an earthquake.

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition


    Earthquake Alert



    Earthquake Network project is a research project which aims at developing and maintaining a crowdsourced smartphone-based earthquake warning system at a global level. Smartphones made available by the population are used to detect the earthquake waves using the on-board accelerometers. When an earthquake is detected, an earthquake warning is issued in order to alert the population not yet reached by the damaging waves of the earthquake.
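Detecting an earthquake in a noisy accelerometer stream is commonly done with a short-term/long-term average (STA/LTA) trigger: a sudden burst of energy makes the short-term average jump well above the long-term background. The sketch below illustrates that general technique; it is an assumption for illustration, not the Earthquake Network app’s actual algorithm.

```python
# Hedged sketch of an STA/LTA trigger for accelerometer samples.
# Illustrates the standard approach to flagging sudden shaking;
# not the Earthquake Network app's production algorithm.

from collections import deque

def sta_lta_trigger(samples, sta_len=10, lta_len=100, threshold=4.0):
    """Yield sample indices where the STA/LTA energy ratio exceeds
    `threshold`. `samples` is an iterable of acceleration values."""
    sta = deque(maxlen=sta_len)   # short-term energy window
    lta = deque(maxlen=lta_len)   # long-term (background) energy window
    for i, x in enumerate(samples):
        energy = x * x
        sta.append(energy)
        lta.append(energy)
        if len(lta) == lta_len:   # wait until the background is established
            lta_avg = sum(lta) / lta_len
            if lta_avg > 0:
                ratio = (sum(sta) / len(sta)) / lta_avg
                if ratio > threshold:
                    yield i

# Quiet background followed by a burst of strong shaking:
hits = list(sta_lta_trigger([0.01] * 200 + [1.0] * 20))
```

A phone-based system would then send such triggers to a server, which decides whether many devices triggered together, as the next section describes.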

    The project started on January 1, 2013 with the release of the homonymous Android application Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015
    Meet The Quake-Catcher Network

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.
    After almost eight years at Stanford University (US), and a year at California Institute of Technology (US), the QCN project is moving to the University of Southern California (US) Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
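The server-side sifting described above often boils down to coincidence filtering: one trigger may be a slammed door, but many sensors triggering close together in space and time are far more likely to be an earthquake. The sketch below illustrates that general idea with assumed thresholds; it is not QCN’s production logic.

```python
# Hedged sketch of spatio-temporal coincidence filtering of triggers.
# Thresholds (window, radius, sensor count) are illustrative assumptions.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def is_likely_quake(triggers, window_s=10.0, radius_km=50.0, min_sensors=5):
    """triggers: list of (timestamp_s, lat, lon) tuples.

    Returns True when at least `min_sensors` triggers fall within one
    time window and one spatial radius of some anchor trigger.
    """
    for t0, la0, lo0 in triggers:
        nearby = sum(
            1 for t, la, lo in triggers
            if abs(t - t0) <= window_s
            and haversine_km(la0, lo0, la, lo) <= radius_km
        )
        if nearby >= min_sensors:
            return True
    return False
```

With real networks the decision is more involved (waveform shape, station quality, known noise sources), but the coincidence test is the first and cheapest filter.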

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC is a leader in the fields of distributed computing, grid computing and citizen cyberscience. BOINC, more properly the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley.
    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. The Quake-Catcher Network links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map



    About Early Warning Labs, LLC

    Early Warning Labs, LLC (EWL) is an Earthquake Early Warning technology developer and integrator located in Santa Monica, CA. EWL is partnered with industry leading GIS provider ESRI, Inc. and is collaborating with the US Government and university partners.

    EWL is investing millions of dollars over the next 36 months to complete the final integration and delivery of Earthquake Early Warning to individual consumers, government entities, and commercial users.

    EWL’s mission is to improve, expand, and lower the costs of the existing earthquake early warning systems.

    EWL is developing a robust cloud server environment to handle low-cost mass distribution of these warnings. In addition, Early Warning Labs is researching and developing automated response standards and systems that allow public and private users to take pre-defined automated actions to protect lives and assets.

    EWL has an existing beta R&D test system installed at one of the largest studios in Southern California. The goal of this system is to stress test EWL’s hardware, software, and alert signals while improving latency and reliability.

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U. S. Geological Survey (USGS) along with a coalition of State and university partners is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long term funding must be secured before the system can begin sending general public notifications, however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.
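The P-wave/S-wave timing argument can be sketched numerically: warning time is roughly the S-wave travel time to a site, minus the time needed for the P-wave to reach a nearby station and for the system to characterise the event. The speeds, station distance and latency below are assumed illustrative values, not actual ShakeAlert parameters.

```python
# Hedged back-of-envelope: warning time a ShakeAlert-style EEW system
# could offer a site. All parameter values are illustrative assumptions.

def warning_time_s(site_km, station_km=20.0, latency_s=3.0,
                   vp=6.5, vs=3.7):
    """Seconds of warning at a site `site_km` from the epicentre.

    `station_km`: distance from epicentre to the nearest detecting
    station; `latency_s`: assumed detection/processing delay;
    `vp`, `vs`: assumed P- and S-wave speeds in km/s.
    """
    detect_t = station_km / vp + latency_s  # P-wave reaches station, plus processing
    shaking_t = site_km / vs                # damaging S-wave reaches the site
    return max(0.0, shaking_t - detect_t)   # no negative warning times

# A site 100 km away gets on the order of 20 seconds of warning;
# a site very close to the epicentre may get none at all.
w_far = warning_time_s(100.0)
w_near = warning_time_s(10.0)
```

This matches the qualitative statement in the studies cited below: warning times range from a few seconds (near the epicentre, inside the "blind zone") to a few tens of seconds farther away.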

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California, joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if a connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.


    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

    Earthquake Early Warning Introduction

    The United States Geological Survey (USGS), in collaboration with state agencies, university partners, and private industry, is developing an earthquake early warning system (EEW) for the West Coast of the United States called ShakeAlert. The USGS Earthquake Hazards Program aims to mitigate earthquake losses in the United States. Citizens, first responders, and engineers rely on the USGS for accurate and timely information about where earthquakes occur, the ground shaking intensity in different locations, and the likelihood of future significant ground shaking.

    The ShakeAlert Earthquake Early Warning System recently entered its first phase of operations. The USGS working in partnership with the California Governor’s Office of Emergency Services (Cal OES) is now allowing for the testing of public alerting via apps, Wireless Emergency Alerts, and by other means throughout California.

    ShakeAlert partners in Oregon and Washington are working with the USGS to test public alerting in those states sometime in 2020.

    ShakeAlert has demonstrated the feasibility of earthquake early warning, from event detection to producing USGS-issued ShakeAlerts®, and will continue to undergo testing and improve over time. In particular, robust and reliable alert delivery pathways for automated actions are currently being developed and implemented by private industry partners for use in California, Oregon, and Washington.

    Earthquake Early Warning Background

    The objective of an earthquake early warning system is to rapidly detect the initiation of an earthquake, estimate the level of ground shaking intensity to be expected, and issue a warning before significant ground shaking starts. A network of seismic sensors detects the first energy to radiate from an earthquake, the P-wave energy, and the location and the magnitude of the earthquake is rapidly determined. Then, the anticipated ground shaking across the region to be affected is estimated. The system can provide warning before the S-wave arrives, which brings the strong shaking that usually causes most of the damage. Warnings will be distributed to local and state public emergency response officials, critical infrastructure, private businesses, and the public. EEW systems have been successfully implemented in Japan, Taiwan, Mexico, and other nations with varying degrees of sophistication and coverage.

    Earthquake early warning can provide enough time to:
    Instruct students and employees to take a protective action such as Drop, Cover, and Hold On
    Initiate mass notification procedures
    Open fire-house doors and notify local first responders
    Slow and stop trains and taxiing planes
    Install measures to prevent/limit additional cars from going on bridges, entering tunnels, and being on freeway overpasses before the shaking starts
    Move people away from dangerous machines or chemicals in work environments
    Shut down gas lines, water treatment plants, or nuclear reactors
    Automatically shut down and isolate industrial systems

    However, earthquake warning notifications must be transmitted without human review, and response actions must be automated, because total warning times are short and depend on the geographic distance and the soil conditions between the epicenter and the user.

    GNSS-Global Navigational Satellite System

    GNSS station | Pacific Northwest Geodetic Array, Central Washington University (US)

  • richardmitnick 11:24 am on October 29, 2021 Permalink | Reply
    Tags: "Bigger better blades for wind turbines", , As Europe’s wind turbines grow in size with individual blades-soon longer than a professional football pitch- the biggest challenge will be delivering more power with less wear., , Europe is full of wind–and making good use of it. Wind energy is set to make the largest contribution to EU renewable energy targets., Horizon - The EU Research and Innovation Magazine   

    From Horizon The EU Research and Innovation Magazine : “Bigger better blades for wind turbines” 

    From Horizon The EU Research and Innovation Magazine

    One of the biggest challenges is the repair of wind turbine blades to withstand the forces of nature © Adwo, Shutterstock.

    As Europe’s wind turbines grow in size, with individual blades soon longer than a professional football pitch, the biggest challenge will be delivering more power with less wear.

    Europe is full of wind–and making good use of it. Wind energy is set to make the largest contribution to EU renewable energy targets.

    This makes it a key component in Europe becoming climate-neutral, an objective the EU wants to reach by 2050. Home-grown technologies and tools will help Europe meet its climate goals while enhancing the competitiveness of the EU wind ecosystem on the global stage and creating new green jobs.

    The winds of change

    In 2020, wind energy met about 16% of Europe’s electricity demand, with the majority of installations on land and a fraction offshore, both floating and fixed.

    Europe has plans to significantly up the ante, with projections to increase total wind-based power generation by about 50% over the next 5 years. Increased power performance will be achieved not only by more installations but also by wind turbines that can generate more power than their predecessors and are out of commission less often for maintenance and repairs.

    Wind turbines are huge, fast (considering their size and weight), and subjected to very harsh working conditions. Imagine a football pitch spinning around in the air at about 15 to 20 revolutions per minute in some of the gustiest places on Earth.

    From 2000 to 2018, the average length of wind turbine blades more than doubled. Newer models are expected to reach lengths exceeding 85 metres by 2025. Some offshore turbines could be sweeping the sky in the near future with blades 110 metres long – a rotational diameter of two football pitches end to end.

    The larger the blades, the faster the tips move – and the greater the erosion on their leading edges. The industry has made tremendous technological progress in materials, design and manufacturing. Still, putting up bigger blades that deliver more power with less wear is a tremendous challenge.
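The link between blade length and tip erosion is easy to quantify: at a fixed rotation rate, tip speed grows linearly with blade length. A quick sketch using the figures quoted above:

```python
# Tip speed of a wind turbine blade: v = 2 * pi * R * rpm / 60.
# Blade lengths and rotation rate are the figures quoted in the article.

from math import pi

def tip_speed_ms(blade_m: float, rpm: float) -> float:
    """Linear speed of the blade tip in metres per second."""
    return 2 * pi * blade_m * rpm / 60

# An 85 m blade at 15 rpm versus a 110 m blade at the same rate:
v85 = tip_speed_ms(85, 15)    # roughly 130-135 m/s
v110 = tip_speed_ms(110, 15)  # roughly 170-175 m/s
```

At those speeds, every raindrop striking the leading edge is a high-velocity impact, which is why erosion worsens so quickly as blades lengthen.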

    Fortunately, the EU has a plan that includes improving resilience to degradation – which will only increase with larger blades and more and more extreme weather events – and better non-destructive monitoring to catch defects early, even during manufacture.

    A coat of armour that ‘gives’

    To withstand the forces of nature and the huge forces the rotation itself generates, blades are manufactured with a multilayer ‘coat of armour’. Typically, the outer layer erodes during operation and the inner layers can become detached.

    According to Asta Šakalytė, Director of Research and Development at Aerox Advanced Polymers SL, although the lifespan of a turbine is theoretically 25 years, current medium-sized systems typically require extensive maintenance at about 10 years due to blade deterioration. Newer ones with larger rotational diameters show severe erosion by the second year of service.

    To address the problem, Aerox developed AROLEP®, a pioneering proprietary leading edge protection system that is now market-ready thanks to work done by the LEP4BLADES project.

    Unlike conventional coatings you might find on pipes, Aerox’s coating is viscoelastic, meaning that it gives or, more precisely, deforms under stress and bounces back. As Šakalytė explained, ‘this is achieved with a combination of two polymers with different complementary properties. The AROLEP® coating can absorb high-speed and high-frequency impacts caused by raindrops and other particles hitting the leading edge of the blade. Tailor-made modification of polymer properties ensures the coating and blade materials work together so the impact effects are dissipated throughout the structure of the blade.’

    Independent performance tests showed AROLEP® protects the integrity of the blades better than any other available solution – and it can be used for new blades as well as those already in service.

    Market uptake should have significant ripple effects back to consumers: significant savings in maintenance, repair and downtime translating to lower energy costs. In the meantime, Aerox is continuing to improve the formulation while targeting novel coatings and adhesives for future blades that could help make wind turbine manufacture a zero-waste business.

    And an angel to watch over them

    Coatings are designed to minimise damage, but they cannot completely prevent it. Improved structural health monitoring technologies could catch defects early before the scales tip and repair or replacement creates financial and practical problems as large as the turbines themselves.

    Blade failures are a significant issue for the wind turbine industry. Approximately a third of the billions of euros spent annually on operation and maintenance (O&M) of wind turbines goes towards inspection and repair of blade coatings.

    Until now, it had been impossible to identify internal defects in blade coatings. Visual inspection is the method of choice during manufacture and maintenance, but it misses defects lurking under the surface.

    Even technologically advanced methods of inspection like inductive and ultrasound technologies fall short when it comes to the coatings on wind turbine blades. They require contact that can damage blades and coatings, particularly if wet, and they cannot analyse individual layers, only total thickness.

    One way to see inside multilayer coatings may lie in the terahertz (THz) region of the electromagnetic spectrum – between microwave and infrared frequencies. It can ‘see’ through things and identify what is inside – and its chemical composition and electrical properties – in a non-destructive, non-invasive and non-ionising way.
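To put the THz band in context, a quick sanity check on frequencies and free-space wavelengths (using c = f·λ; the 0.1–10 THz band edges below follow a common convention, not a NOTUS specification):

```python
# Free-space wavelengths across the terahertz band, which sits between
# microwave and infrared frequencies. Band edges of roughly 0.1-10 THz
# are a common convention.

C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_mm(freq_thz: float) -> float:
    """Free-space wavelength in millimetres for a frequency given in THz."""
    return C / (freq_thz * 1e12) * 1e3

for f in (0.1, 1.0, 10.0):
    print(f"{f:>5} THz -> {wavelength_mm(f):.3f} mm")
```

Wavelengths in this band range from about 3 mm down to 30 µm, i.e. comparable to the thickness of coating layers, which helps explain why THz waves can resolve individual layers in a multilayer stack where longer-wavelength methods cannot.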

    Until a few decades ago its potential was difficult to tap in part due to our inability to efficiently generate and detect the waves. But that is changing now with proprietary THz technology developed specifically for industrial use by das-Nano and introduced to the market in the context of the NOTUS project.

    According to Eduardo Azanza, Chief Executive Officer of das-Nano and NOTUS coordinator, ‘NOTUS is the first contactless tool for non-destructive material inspection specifically designed for wind turbine inspection. It can perform deep characterisation of individual layers of any coating structure and any blade, independent of materials, enabling quantification of interlayer adherence.’

    NOTUS is available in three versions for applications along the life cycle of blades, supporting development, manufacturing, operation and even inspection by receiving personnel or insurance companies. By Azanza’s estimates, it could save windfarm operators approximately 10% of O&M costs.

    And windfarms are not the only beneficiaries. NOTUS works with all sorts of multilayer substrates, including metal, composite and plastic. It accommodates flat and curved surfaces and dry, wet and cured paints.

    The THz technology also enables electrical characterisation of advanced materials such as graphene, 2D materials, thin films and bulk materials.

    Azanza said: ‘das-Nano has brought to market NOTUS, a harmless technology for fast and non-destructive inspection of every single product in a manufacturing line, identifying defective pieces at the earliest possible time.’

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition
