Tagged: ETH Zürich

• richardmitnick 1:29 pm on February 22, 2020
    Tags: "Time-resolved measurement in a memory device", , Data can be stored in magnetic tunnel junctions virtually without any error and in less than a nanosecond., ETH Zürich, , , , The researchers replaced the isolated metal dot by a magnetic tunnel junction., Tomorrow’s memory devices   

    From ETH Zürich: “Time-resolved measurement in a memory device” 


    From ETH Zürich

    19.02.2020
    Oliver Morsch

    Researchers at ETH have measured the timing of single writing events in a novel magnetic memory device with a resolution of less than 100 picoseconds. Their results are relevant for the next generation of main memories based on magnetism.

    The chip produced by IMEC for the experiments at ETH. The tunnel junctions used to measure the timing of the magnetisation reversal are located at the centre (Image courtesy of IMEC).

At the Department of Materials of ETH Zürich, Pietro Gambardella and his collaborators investigate tomorrow’s memory devices. These devices should be fast, retain data reliably for a long time and also be cheap. So-called magnetic “random access memories” (MRAM) achieve this squaring of the circle by combining fast switching via electric currents with durable data storage in magnetic materials. A few years ago, researchers showed that a certain physical effect – the spin-orbit torque – makes particularly fast data storage possible. Now Gambardella’s group, together with the R&D centre IMEC in Belgium, has managed to temporally resolve the exact dynamics of a single such storage event – and to use a few tricks to make it even faster.

    Magnetising with single spins

To store data magnetically, one has to invert the direction of magnetisation of a ferromagnetic (that is, permanently magnetic) material in order to represent the information as a logic value, 0 or 1. In older technologies, such as magnetic tapes or hard drives, this is achieved through magnetic fields produced inside current-carrying coils. Modern MRAM memories, by contrast, directly use the spins of electrons, which are magnetic, much like small compass needles, and which flow through a magnetic layer as an electric current. In Gambardella’s experiments, electrons with opposite spin directions are spatially separated by the spin-orbit interaction. This, in turn, creates an effective magnetic field that can be used to invert the direction of magnetisation of a tiny metal dot.

“We know from earlier experiments, in which we stroboscopically scanned a single magnetic metal dot with X-rays, that the magnetisation reversal happens very fast, in about a nanosecond”, says Eva Grimaldi, a postdoc in Gambardella’s group. “However, those were mean values averaged over many reversal events. Now we wanted to know how exactly a single such event takes place and to show that it can work on an industry-compatible magnetic memory device.”

    Time resolution through a tunnel junction

    Electron microscope image of the magnetic tunnel junction (MTJ, at the centre) and of the electrodes for controlling and measuring the reversal process. (Image: P. Gambardella / ETH Zürich)

To do so, the researchers replaced the isolated metal dot with a magnetic tunnel junction. Such a junction contains two magnetic layers separated by an insulating layer that is only one nanometre thick. Depending on the spin direction – along the magnetisation of the magnetic layers, or opposite to it – the electrons can tunnel through that insulating layer more or less easily. This results in an electrical resistance that depends on the alignment of the magnetisation in one layer with respect to the other, and thus represents “0” and “1”. From the time dependence of that resistance during a reversal event, the researchers could reconstruct the exact dynamics of the process. In particular, they found that the magnetisation reversal happens in two stages: an incubation stage, during which the magnetisation stays constant, and the actual reversal stage, which lasts less than a nanosecond.
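How such a resistance trace is turned into timing numbers can be illustrated with a short sketch: given a digitised single-shot trace R(t), the incubation time and the reversal duration can be estimated by thresholding the normalised resistance swing. This is a minimal illustration, not the authors’ analysis code; the sampling step, the 10%/90% thresholds and the synthetic sigmoidal trace are all assumptions.

```python
import numpy as np

def reversal_timing(t, r, lo=0.1, hi=0.9):
    """Estimate incubation time and reversal duration from a single-shot
    resistance trace r(t) of a magnetic tunnel junction (illustrative)."""
    swing = (r - r[0]) / (r[-1] - r[0])   # 0 = initial state, 1 = reversed
    t_start = t[np.argmax(swing > lo)]    # end of the incubation stage
    t_end = t[np.argmax(swing > hi)]      # end of the reversal stage
    return t_start, t_end - t_start

# Synthetic single-shot trace: ~1 ns incubation, then a fast reversal
t = np.linspace(0, 2e-9, 2000)                        # 1 ps steps (assumed)
r = 1.0 + 0.5 / (1 + np.exp(-(t - 1.15e-9) / 5e-11))  # sigmoidal R(t)
incubation, duration = reversal_timing(t, r)
print(f"incubation ≈ {incubation * 1e9:.2f} ns, reversal ≈ {duration * 1e9:.2f} ns")
```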

    The magnetic tunnel junction (yellow and red disks) in which the magnetisation of the red disk is inverted by electron spins (blue and yellow arrows). The reversal process is measured through the tunnel resistance (vertical blue arrows).

    Small fluctuations

“For a fast and reliable memory device it is essential that the time fluctuations between the individual reversal events are minimized”, explains Gambardella’s PhD student Viola Krizakova. Based on their data, the scientists therefore developed a strategy to make those fluctuations as small as possible. To that end, they changed the current pulses used to control the magnetisation reversal in such a way as to introduce two additional physical phenomena. The so-called spin-transfer torque, together with a short voltage pulse during the reversal stage, reduced the total time for the reversal event to less than 0.3 nanoseconds, with temporal fluctuations of less than 0.2 nanoseconds.

    Application-​ready technology

“Putting all of this together, we have found a method whereby data can be stored in magnetic tunnel junctions virtually without any error and in less than a nanosecond”, says Gambardella. Moreover, the collaboration with the research centre IMEC made it possible to test the new technology directly on an industry-compatible wafer. Kevin Garello, a former post-doc from Gambardella’s lab, produced the chips containing the tunnel junctions for the experiments at ETH and optimized the materials for them. In principle, the technology would therefore be immediately ready for use in a new generation of MRAM.

Gambardella stresses that MRAM memories are particularly interesting because, unlike conventional main memories such as SRAM or DRAM, they don’t lose their information when the computer is switched off, yet are just as fast. He concedes, though, that the market for MRAM currently does not demand such high writing speeds, since other technical bottlenecks – such as power losses caused by large switching currents – limit the access times. In the meantime, he and his co-workers are already planning further improvements: they want to shrink the tunnel junctions and use different materials that use current more efficiently.

    Science paper:
“Grimaldi E, et al. Single-shot dynamics of spin–orbit torque and spin transfer torque switching in three-terminal magnetic tunnel junctions.”
    Nature Nanotechnology

See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
• richardmitnick 3:00 pm on February 14, 2020
    Tags: "Underestimated chemical diversity", , , China alone accounts for 37 percent of turnover., ETH Zürich, Some 350000 different substances are produced and traded around the world., This comprehensive list of 350000 different substancescannot provide information about which chemicals are hazardous to health or the environment.   

    From ETH Zürich: “Underestimated chemical diversity” 


    From ETH Zürich

    14.02.2020
    Ori Schipper

    An international team of researchers has conducted a global review of all registered industrial chemicals: some 350,000 different substances are produced and traded around the world – well in excess of the 100,000 reached in previous estimates. For about a third of these substances, there is a lack of publicly accessible information.

There are an ever-greater number of industrial chemicals on the world market, but many lack publicly available information on aspects such as their chemical identities and hazard potential. (Photograph: fotohunter/iStock)

    The last time a list was compiled of all the chemicals available on the market and in circulation worldwide, it ran to 100,000 entries. Drawn up shortly after the turn of the millennium, the list focused on markets in the US, Canada and western Europe, which made sense because 20 years ago, these countries accounted for more than two thirds of worldwide chemical sales.

    Global market

Things have changed dramatically since then. First, turnover has more than doubled, reaching EUR 3.4 trillion in 2017; second, the global west now accounts for just a third of the worldwide chemical trade, whereas China alone accounts for 37 percent of turnover. “We broadened our scope to take in the global market – and we’re now presenting a first comprehensive overview of all chemicals available worldwide,” says Zhanyun Wang, Senior Scientist at the Department of Civil, Environmental and Geomatic Engineering at ETH Zürich.

Working with a team of international experts, Wang brought together data from 22 registers covering 19 countries and regions (including the EU). The new list contains 350,000 entries. “The chemical diversity we know now is three times greater than 20 years ago,” says Wang. This, he says, is primarily because a larger number of registers are now taken into account: “As a result, our new list includes many chemicals that are registered in developing and transition countries, which often have limited oversight.”
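In practice, compiling such an overview is largely a de-duplication exercise: normalise each register’s identifiers and take the union across registers. A minimal sketch of the idea in Python (the register contents and the use of CAS numbers as the shared key are illustrative assumptions):

```python
# Minimal sketch: merge national chemical registers into one inventory.
# Register contents and the CAS-number keys are illustrative assumptions.
registers = {
    "EU":    {"50-00-0", "64-17-5", "67-56-1"},
    "China": {"64-17-5", "7732-18-5", "110-54-3"},
    "US":    {"50-00-0", "110-54-3", "75-09-2"},
}

# The union across registers is the de-duplicated global inventory;
# substances listed in several countries are counted only once.
merged = set().union(*registers.values())
print(f"{len(merged)} unique substances across {len(registers)} registers")
```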

    Confidential business information

    On its own, this comprehensive list cannot provide information about which chemicals are hazardous to health or the environment, for example. “Our inventory is only the first step in the substances’ characterisation,” says Wang, adding that previous work suggested that some 3 percent of all chemicals may give cause for concern. If you apply this figure to the new multitude of chemicals, 6,000 new potentially problematic substances could be expected, he says.

Far more astonishing to Wang was the fact that a good third of all chemicals have inadequate descriptions in the various registers. About 70,000 entries are for mixtures and polymers (such as petroleum resin), with no details provided about the individual components. Another 50,000 entries relate to chemicals whose identities are considered confidential business information and are therefore not publicly accessible. “Only the manufacturers know what they are and how dangerous or toxic they are,” says Wang. “That leaves you with an uneasy feeling – like a meal where you’re told that it’s well cooked, but not what it contains.”

    An urgent call for international collaboration

Globalisation and worldwide trade mean that chemicals – unlike national registers – do not stop at national borders. As Wang and his colleagues note in their article in the journal Environmental Science & Technology, the various registers therefore need to be merged if we want to keep track of all the chemicals that are produced and traded anywhere in the world. “Only by joining forces, across different countries and disciplines, will we be able to cope with this ever-expanding chemical diversity,” says Wang.

See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
• richardmitnick 11:02 am on December 19, 2019
    Tags: "Laser-linked satellites could deliver ‘internet from space’", ETH Zürich   

    From ETH Zürich: “Laser-linked satellites could deliver ‘internet from space’” 


    From ETH Zürich

    (Credit: Getty Images)

    December 18th, 2019
    ETH Zürich

A new generation of low-flying satellites promises an “Internet from space” that will be able to cover even remote regions around the world. Computer scientists at ETH Zürich are proposing a novel network design that could double the network capacity of such systems.


    How the ETH computer scientists Debopam Bhattacherjee and Ankit Singla are improving the “Internet from space”. (Video: ETH Zürich / Bhattacherjee, D & Singla, A)


Satellites do not yet play a major role in the world’s Internet infrastructure. However, this may soon be set to change. Within the next decade, a new generation of satellites could lay the foundations for an “Internet from space”, believes Ankit Singla, professor at ETH Zürich’s Network Design & Architecture Lab. His team is investigating how to improve the performance of large-scale computer networks, including the Internet.

Exploiting advances in cost-cutting technologies in the space sector, the new satellite systems would use thousands of satellites instead of the tens used in past systems. These satellites could then be linked to each other via laser light to form a network. The coverage provided by such a constellation would reach remote regions that currently have no or very limited access to the Internet, because they are poorly connected, if at all, to the intercontinental fibre-optic cables that power today’s Internet.

    The race for the Internet of the sky

The capabilities of these low-Earth-orbit (LEO) satellites have triggered a new, contested “space race”, with heavyweights such as Elon Musk’s SpaceX and Jeff Bezos’ Amazon throwing their hats into the ring.

    LEO satellites. https://www.bsonetwork.com

These companies are developing large-scale constellations with thousands to tens of thousands of satellites. These would orbit the Earth at speeds of around 27,000 km/h at a height of roughly 500 km (geostationary satellites, by comparison, orbit at an altitude of 35,786 km).

SpaceX, for example, has already launched its first 120 satellites, and is planning to offer a satellite-based broadband Internet service from 2020. In addition to global coverage, the technology used in the “Internet from space” promises high data transfer rates without major delays in data transmission – the latency, as computer scientists call these delays, is significantly lower than that of traditional, geostationary satellites, and even that of underground fibre-optic lines for long-distance communication.
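The latency claim can be checked with rough numbers: light travels at c in vacuum but only at about two thirds of c in glass fibre, so a path through low-orbiting satellites can undercut a long fibre route. A back-of-the-envelope sketch, in which the straight-line path lengths and the 30% fibre detour factor are assumptions:

```python
C = 299_792_458        # speed of light in vacuum, m/s
C_FIBRE = C * 2 / 3    # light in glass fibre travels roughly 1/3 slower

def one_way_ms(distance_m, speed_m_per_s):
    return distance_m / speed_m_per_s * 1e3

ground = 9_000e3       # a long intercontinental route, m (assumed)
altitude = 500e3       # LEO constellation height, m

fibre = one_way_ms(ground * 1.3, C_FIBRE)    # cables rarely run straight
leo = one_way_ms(2 * altitude + ground, C)   # up, across at c, back down
print(f"fibre ≈ {fibre:.0f} ms, LEO ≈ {leo:.0f} ms one-way")
```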

“If these plans succeed, it would be a huge leap forward in the world’s Internet infrastructure,” says Debopam Bhattacherjee. The doctoral candidate, who works with Singla, is investigating the optimal design of networks for satellite-based broadband Internet in order to guarantee a high-bandwidth, low-delay data flow. He will present his results in Florida today at ACM CoNEXT 2019, the International Conference on emerging Networking EXperiments and Technologies.

    New design for dynamic networks

The new research challenges of the “Internet from space”, compared to the “Internet at ground level”, arise because the satellites are in motion. The satellites are the nodes through which the data travels, and as they constantly change their position in relation to one another, they form a highly dynamic network. By contrast, the transit nodes of the “Internet at ground level” do not change their location or position. The largely static infrastructure of the ground-based Internet therefore does not face the same requirements as the “Internet from space”.

“To implement satellite-based broadband Internet, we have to rethink virtually all aspects of the way in which the Internet is currently designed to function,” says Singla. He explains that as the satellites fly very fast and in dense swarms, more efficient approaches to network design are required for the satellite Internet. Even the design concepts used for mobile networks on high-speed trains, drones and aircraft cannot be transferred easily to satellites.

    Bhattacherjee and Singla have now developed a mathematical model that demonstrates how one might fundamentally improve the network design in space. They have tested their design approach using the example of SpaceX and Amazon, but it can be applied independently of the technology of a particular company.

    Patterns ensure smooth data traffic

The design concept devised by Bhattacherjee and Singla is built around the high temporal dynamics of the low-Earth-orbit satellites. The key question they first asked was: how can thousands of satellites be linked together to achieve the best possible network performance? The answer is far from obvious, as each satellite can maintain no more than four connections to other satellites.

Intuitively, one might think that each satellite should connect only to its nearest neighbours. According to Bhattacherjee, however, this assumption is too restrictive: the satellites can also connect to satellites that are farther away. In fact, it is more efficient for the data to travel over longer links but cross fewer nodes (satellites), because passing through a node also consumes resources, reducing what is available for other connections.

However, reducing the number of on-path nodes to increase efficiency must not come at the cost of much longer end-to-end paths, which would worsen latency. Furthermore, it is important that inter-satellite connections do not change too often, as establishing a new connection can take tens of seconds, during which data cannot be exchanged.

    The novel idea behind Bhattacherjee’s and Singla’s approach is that the connections between the satellites would be built based on specialised, repetitive patterns. The most suitable pattern depends on the satellite constellation’s geometry and the network’s input traffic. A key point is that the connection pattern repeats on every satellite in the network, with all satellites connected in exactly the same way, and with the connections remaining stable over time.
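To make the idea of a repetitive pattern concrete, here is a minimal sketch in which every satellite, indexed by its orbital plane and its slot within that plane, applies the same local rule: link to the two in-plane neighbours plus one fixed-offset partner in each adjacent plane, giving exactly four inter-satellite links per node. The constellation size and the offset are assumptions, not the optimised pattern from the paper.

```python
from itertools import product

PLANES, SLOTS = 24, 66   # constellation geometry (assumed)
OFFSET = 1               # cross-plane slot offset that defines the pattern

def links(plane, slot):
    """The same four-link rule, applied identically at every satellite."""
    return [
        (plane, (slot + 1) % SLOTS),                      # ahead, same plane
        (plane, (slot - 1) % SLOTS),                      # behind, same plane
        ((plane + 1) % PLANES, (slot + OFFSET) % SLOTS),  # next plane
        ((plane - 1) % PLANES, (slot - OFFSET) % SLOTS),  # previous plane
    ]

# Because the rule is identical everywhere, every link is mirrored by its
# partner, and relative neighbours never change as the planes move.
topology = {(p, s): links(p, s) for p, s in product(range(PLANES), range(SLOTS))}
print(f"{len(topology)} satellites, {sum(map(len, topology.values())) // 2} links")
```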

In the case of SpaceX, the new design concept increases network efficiency by 54 percent in comparison with the current approach; for Kuiper (Amazon), the efficiency increase is 45 percent. “Our approach could double the efficiency of satellite-based Internet,” says Bhattacherjee in conclusion.

See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
• richardmitnick 1:43 pm on November 11, 2019
    Tags: "Magnets for the second dimension", , “We’re particularly interested in applications in the field of soft robotics” says Hongri Gu, ETH Zürich, , Scientists at ETH Zürich have managed to create magnetic building blocks in the shape of cubes that – for the first time ever – can be joined together to form two-​dimensional shapes., The new building blocks which the scientists call modules are not dipolar but quadrupolar which means they each have two north poles and two south poles.   

    From ETH Zürich: “Magnets for the second dimension” 


    From ETH Zürich

    11.11.2019
    Fabio Bergamin

ETH scientists have developed cube-shaped magnetic building blocks that can be assembled into two-dimensional shapes and controlled by an external magnetic field. They can be used for soft robotics applications.

Quadrupole modules can be assembled into two-dimensional shapes, including pixel art emojis like these. (Photograph: ETH Zürich/Hongri Gu)

    If you’ve ever tried to put several really strong, small cube magnets right next to each other on a magnetic board, you’ll know that you just can’t do it. What happens is that the magnets always arrange themselves in a column sticking out vertically from the magnetic board. Moreover, it’s almost impossible to join several rows of these magnets together to form a flat surface. That’s because magnets are dipolar. Equal poles repel each other, with the north pole of one magnet always attaching itself to the south pole of another and vice versa. This explains why they form a column with all the magnets aligned the same way.

Now, scientists at ETH Zürich have managed to create magnetic building blocks in the shape of cubes that – for the first time ever – can be joined together to form two-dimensional shapes. The new building blocks, which the scientists call modules, are not dipolar but quadrupolar, which means they each have two north poles and two south poles. Inside each of the modules, which are 3D printed in plastic, there are two small conventional dipole magnets with their equal poles facing each other (see picture). The building blocks can be assembled like little chessboards to form any two-dimensional shape. It works like this: because south and north poles attract each other, a quadrupole building block with its two south poles facing left and right will attract, on each of its four sides, a building block that is rotated by 90 degrees so that its north poles face left and right.
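The chessboard rule can be checked with a point-dipole model: treat each module as two small dipoles whose moments point toward the module centre (equal poles facing each other) and sum the standard dipole-dipole interaction energies over all pairs. In this minimal sketch the moments, spacings and distances are assumed values; it reproduces the qualitative behaviour, with a 90-degree-rotated neighbour attracted and an identically oriented one repelled.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability

def dipole_energy(m1, m2, r):
    """Interaction energy of two point dipoles separated by vector r."""
    d = np.linalg.norm(r)
    rhat = r / d
    return MU0 / (4 * np.pi * d**3) * (m1 @ m2 - 3 * (m1 @ rhat) * (m2 @ rhat))

def module(centre, angle, m=1e-3, d=1e-3):
    """Quadrupole module: two dipoles with equal poles facing each other,
    i.e. both moments point toward the module centre (sizes assumed)."""
    axis = np.array([np.cos(angle), np.sin(angle), 0.0])
    return [(centre - d / 2 * axis, m * axis),    # moment points inward
            (centre + d / 2 * axis, -m * axis)]   # moment points inward

def interaction(mod_a, mod_b):
    return sum(dipole_energy(ma, mb, pb - pa)
               for pa, ma in mod_a for pb, mb in mod_b)

a = module(np.zeros(3), 0.0)
for deg in (0, 90):
    b = module(np.array([2.2e-3, 0.0, 0.0]), np.radians(deg))
    print(f"{deg:>2}° neighbour: U = {interaction(a, b):+.2e} J")
# Positive (repulsive) energy at 0°, negative (attractive) at 90°.
```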

    Dipole magnet and quadrupole module in diagram form. (Source: Gu H et al. Science Robotics 2019)

    Quadrupole modules have an edge length of just over two millimetres (Photograph: ETH Zürich / Hongri Gu)

    Building on this principle, the scientists made coloured modules with an edge length of just over two millimetres. They assembled them into pixel art emojis to demonstrate what the modules can do. However, possible use cases go way beyond such gimmicks. “We’re particularly interested in applications in the field of soft robotics,” says Hongri Gu, a doctoral student in Professor Bradley Nelson’s group at ETH and lead author of the paper that the scientists recently published in Science Robotics.

    Quadrupole and dipole in the same building block

    The quadrupole dominates the magnetic properties of the modules. It is a little more complicated than that, though, because in addition to the strong quadrupole, the scientists also built a weak dipole into the building blocks. They achieved this by arranging the little magnets in the module at a slight angle to each other rather than parallel (see picture).

    “This causes the modules to align themselves with an external magnetic field, like a compass needle does,” Gu explains. “With a variable magnetic field, we can then move the shapes we have built out of the modules. Add in some flexible connectors and it’s even possible to build robots that can be controlled by a magnetic field.”

    An external magnetic field (centre and right images) can be used to control the orientation of the modules. Shown here is a combination of magnetic modules and flexible connectors. (Source: Gu H et al. Science Robotics 2019)

Gu says that their work was initially about developing the new principle. It is size-independent, he says, meaning that there is no reason why much smaller quadrupole modules couldn’t be developed. The scientists are also studying how the modules could be used to assemble a linear structure into a multidimensional object with the help of a magnetic field. This could be of use in medicine in the future: it is conceivable that objects such as stents could be formed from a thread consisting of such modules. The thread could be inserted into the body in a relatively simple, minimally invasive procedure through a tiny opening, and a magnetic field then applied to assemble it into the final multidimensional structure inside the body.

See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
• richardmitnick 3:33 pm on September 20, 2019
    Tags: "A Simulation Booster for Nanoelectronics", , ETH Zürich, ,   

    From insideHPC: “A Simulation Booster for Nanoelectronics” 

    From insideHPC

    September 20, 2019
    Simone Ulmer


    ETH Zürich

Self-heating in a so-called fin field-effect transistor (FinFET) at high current densities. Each constituent silicon atom is colored according to its temperature. (Image: Jean Favre, CSCS)

Two research groups from ETH Zürich have developed a method that can simulate nanoelectronic devices and their properties realistically, quickly and efficiently. This offers a ray of hope for the industry and for data centre operators alike, both of which are struggling with the (over)heating that comes with increasingly small and powerful transistors – and with the resulting high electricity costs for cooling.

    Chip manufacturers are already assembling transistors that measure just a few nanometres across. They are much smaller than a human hair, whose diameter is approximately 20,000 nanometres in the case of finer strands. Now, demand for increasingly powerful supercomputers is driving the industry to develop components that are even smaller and yet more powerful at the same time.

    One of the 2019 Gordon Bell Prize Finalists

However, in addition to the physical laws that make it harder to build ultra-scaled transistors, the problem of ever-increasing heat dissipation is putting manufacturers in a tricky situation – partly due to steep rises in cooling requirements and the resulting demand for energy. Cooling the computers already accounts for up to 40 percent of power consumption in some data centres, as the research groups led by ETH professors Torsten Hoefler and Mathieu Luisier report in their latest study, which they hope will enable a better approach. With this study, the researchers have been nominated for the ACM Gordon Bell Prize, the most prestigious prize in the area of supercomputing, which is awarded annually at the SC supercomputing conference in the United States.

    To make today’s nanotransistors more efficient, the research group led by Luisier from the Integrated Systems Laboratory (IIS) at ETH Zürich simulates transistors using software named OMEN, which is a so-called quantum transport simulator. OMEN runs its calculations based on what is known as density functional theory (DFT), allowing a realistic simulation of transistors in atomic resolution and at the quantum mechanical level. This simulation visualises how electrical current flows through the nanotransistor and how the electrons interact with crystal vibrations, thus enabling researchers to precisely identify locations where heat is produced. In turn, OMEN also provides useful clues as to where there is room for improvement.

    Improving transistors using optimised simulations

Until now, conventional programming methods and supercomputers only permitted researchers to simulate heat dissipation in transistors consisting of around 1,000 atoms, as the data communication between processors and the memory requirements made it impossible to produce a realistic simulation of larger objects. Most computer programs do not spend the bulk of their time performing computing operations, but rather moving data between processors, main memory and external interfaces. According to the scientists, OMEN also suffered from a pronounced bottleneck in communication, which curtailed performance. “The software is already used in the semiconductor industry, but there is considerable room for improvement in terms of its numerical algorithms and parallelisation,” says Luisier.

Until now, the parallelisation of OMEN was designed according to the physics of the electro-thermal problem, as Luisier explains. Now, Ph.D. student Alexandros Ziogas and postdoc Tal Ben-Nun – working under Hoefler, head of the Scalable Parallel Computing Laboratory at ETH Zürich – have looked not at the physics but at the dependencies between the data. They reorganised the computing operations according to these dependencies, effectively without considering the underlying physics. In optimising the code, they had the help of two of the most powerful supercomputers in the world – “Piz Daint” at the Swiss National Supercomputing Centre (CSCS) and “Summit” at Oak Ridge National Laboratory in the US, the latter being the fastest supercomputer in the world.

    Cray Piz Daint Cray XC50/XC40 supercomputer of the Swiss National Supercomputing Center (CSCS)

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    According to the researchers, the resulting code – dubbed DaCe OMEN – produced simulation results that were just as precise as those from the original OMEN software.

For the first time, DaCe OMEN has reportedly made it possible for researchers to produce a realistic simulation of transistors ten times the size, made up of 10,000 atoms, on the same number of processors – and up to 14 times faster than the original method took for 1,000 atoms. Overall, DaCe OMEN is more efficient than OMEN by two orders of magnitude: on Summit, it was possible to simulate, among other things, a realistic transistor up to 140 times faster, with a sustained performance of 85.45 petaflops – and indeed to do so in double precision on 4,560 compute nodes. This extreme boost in computing speed has earned the researchers a nomination for the Gordon Bell Prize.

    Data-centric programming

The scientists achieved this optimisation by applying the principles of data-centric parallel programming (DAPP), which was developed by Hoefler’s research group. Here, the aim is to minimise data transport and therefore communication between the processors. “This type of programming allows us to very accurately determine not only where this communication can be improved on various levels of the program, but also how we can tune specific computing-intensive sections, known as computational kernels, within the calculation for a single state,” says Ben-Nun. This multilevel approach makes it possible to optimise an application without having to rewrite it every time. Data movements are also optimised without modifying the original calculation – and for any desired computer architecture. “When we optimise the code for the target architecture, we’re now only changing it from the perspective of the performance engineer, and not that of the programmer – that is, the researcher who translates the scientific problem into code,” says Hoefler. This, he says, leads to the establishment of a very simple interface between computer scientists and interdisciplinary programmers.
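The name DaCe OMEN points to the group’s open-source DaCe framework, in which a program is written once in high-level code and its data movement is then tuned separately on the resulting dataflow representation. A minimal, generic sketch of that style (this toy kernel is our own illustration and has nothing to do with OMEN’s physics; the API shown is the publicly documented DaCe interface):

```python
import dace
import numpy as np

N = dace.symbol("N")  # symbolic size: compiled once, runs for any N

@dace.program
def saxpy(a: dace.float64, x: dace.float64[N], y: dace.float64[N]):
    # The framework captures this as a dataflow graph; a performance
    # engineer can then optimise data movement without touching this code.
    y[:] = a * x + y

x = np.random.rand(1024)
y = np.random.rand(1024)
saxpy(2.0, x, y)  # compiles and runs the generated, optimised kernel
```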

    The application of DaCe OMEN has shown that the most heat is generated near the end of the nanotransistor channel and revealed how it spreads from there and affects the whole system. The scientists are convinced that the new process for simulating electronic components of this kind has a variety of potential applications. One example is in the production of lithium batteries, which can lead to some unpleasant surprises when they overheat.

    Data-centric programming is an approach that ETH Professor Torsten Hoefler has been pursuing for a number of years with a goal of putting the power of supercomputers to more efficient use. In 2015, Hoefler received an ERC Starting Grant for his project, Data Centric Parallel Programming (DAPP).

See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    insideHPC
    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

     
• richardmitnick 12:48 pm on September 18, 2019
Tags: A new approach to the problems of dark matter and dark energy, Deep artificial neural networks, ETH Zürich, Facial recognition for cosmology

    From ETH Zürich: “Artificial intelligence probes dark matter in the universe” 


    From ETH Zürich

    18.09.2019
    Oliver Morsch

    A team of physicists and computer scientists at ETH Zürich has developed a new approach to the problem of dark matter and dark energy in the universe. Using machine learning tools, they programmed computers to teach themselves how to extract the relevant information from maps of the universe.

    Excerpt from a typical computer-generated dark matter map used by the researchers to train the neural network. (Source: ETH Zürich)

Understanding how our universe came to be what it is today, and what its final destiny will be, is one of the biggest challenges in science. The awe-inspiring display of countless stars on a clear night gives us some idea of the magnitude of the problem, and yet that is only part of the story. The deeper riddle lies in what we cannot see, at least not directly: dark matter and dark energy. With dark matter pulling the universe together and dark energy causing it to expand faster, cosmologists need to know exactly how much of those two is out there in order to refine their models.

    At ETH Zürich, scientists from the Department of Physics and the Department of Computer Science have now joined forces to improve on standard methods for estimating the dark matter content of the universe through artificial intelligence. They used cutting-edge machine learning algorithms for cosmological data analysis that have a lot in common with those used for facial recognition by Facebook and other social media. Their results have recently been published in the scientific journal Physical Review D.

    Facial recognition for cosmology

    While there are no faces to be recognized in pictures taken of the night sky, cosmologists still look for something rather similar, as Tomasz Kacprzak, a researcher in the group of Alexandre Refregier at the Institute of Particle Physics and Astrophysics, explains: “Facebook uses its algorithms to find eyes, mouths or ears in images; we use ours to look for the tell-tale signs of dark matter and dark energy.” As dark matter cannot be seen directly in telescope images, physicists rely on the fact that all matter – including the dark variety – slightly bends the path of light rays arriving at the Earth from distant galaxies. This effect, known as “weak gravitational lensing”, distorts the images of those galaxies very subtly, much like far-away objects appear blurred on a hot day as light passes through layers of air at different temperatures.

    Weak gravitational lensing NASA/ESA Hubble

    Cosmologists can use that distortion to work backwards and create mass maps of the sky showing where dark matter is located. Next, they compare those dark matter maps to theoretical predictions in order to find which cosmological model most closely matches the data. Traditionally, this is done using human-designed statistics such as so-called correlation functions that describe how different parts of the maps are related to each other. Such statistics, however, are limited as to how well they can find complex patterns in the matter maps.

    Neural networks teach themselves

    “In our recent work, we have used a completely new methodology”, says Alexandre Refregier. “Instead of inventing the appropriate statistical analysis ourselves, we let computers do the job.” This is where Aurelien Lucchi and his colleagues from the Data Analytics Lab at the Department of Computer Science come in. Together with Janis Fluri, a PhD student in Refregier’s group and lead author of the study, they used machine learning algorithms called deep artificial neural networks and taught them to extract the largest possible amount of information from the dark matter maps.

    Once the neural network has been trained, it can be used to extract cosmological parameters from actual images of the night sky. (Visualisations: ETH Zürich)

    In a first step, the scientists trained the neural networks by feeding them computer-generated data that simulates the universe. That way, they knew what the correct answer for a given cosmological parameter – for instance, the ratio between the total amount of dark matter and dark energy – should be for each simulated dark matter map. By repeatedly analysing the dark matter maps, the neural network taught itself to look for the right kind of features in them and to extract more and more of the desired information. In the Facebook analogy, it got better at distinguishing random oval shapes from eyes or mouths.
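As an illustration of this training loop, the sketch below regresses a single cosmological parameter from maps with a small convolutional network in PyTorch. The architecture, the map size and the random stand-in data are all assumptions; the actual study used proper simulation suites and a far more carefully designed network.

```python
import torch
import torch.nn as nn

# Small CNN mapping a 64x64 mass map to one cosmological parameter.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
    nn.Flatten(), nn.Linear(32 * 16 * 16, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):
    maps = torch.randn(32, 1, 64, 64)   # stand-in for simulated maps
    truth = torch.rand(32, 1)           # known parameter for each map
    loss = loss_fn(model(maps), truth)  # compare prediction with truth
    opt.zero_grad(); loss.backward(); opt.step()
```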

    More accurate than human-made analysis

The results of that training were encouraging: the neural networks came up with values that were 30% more accurate than those obtained by traditional methods based on human-made statistical analysis. For cosmologists, that is a huge improvement: since statistical errors shrink only with the square root of the number of images, reaching the same accuracy by increasing the number of telescope images would require roughly twice as much observation time – which is expensive.

    Finally, the scientists used their fully trained neural network to analyse actual dark matter maps from the KiDS-450 dataset. “This is the first time such machine learning tools have been used in this context,” says Fluri, “and we found that the deep artificial neural network enables us to extract more information from the data than previous approaches. We believe that this usage of machine learning in cosmology will have many future applications.”

    As a next step, he and his colleagues are planning to apply their method to bigger image sets such as the Dark Energy Survey.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5,000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    Also, more cosmological parameters and refinements such as details about the nature of dark energy will be fed to the neural networks.

See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
• richardmitnick 11:01 am on September 16, 2019
Tags: ETH Zürich, Metal oxide sensors, Methanol is sometimes referred to as ethanol’s deadly twin

    From ETH Zürich: “Measuring ethanol’s deadly twin” 


    From ETH Zürich

    16.09.2019
    Fabio Bergamin

    ETH researchers have developed an inexpensive, handheld measuring device that can distinguish between methanol and potable alcohol. It offers a simple, quick method of detecting adulterated or contaminated alcoholic beverages and is able to diagnose methanol poisoning in exhaled breath.

    1
    Women in India sell home-brewed alcohol, which may contain toxic amounts of methanol. (Photograph: Shutterstock / Steve Estvanik)

    Methanol is sometimes referred to as ethanol’s deadly twin. While the latter is the intoxicating ingredient in wine, beer and schnapps, the former is a chemical that becomes highly toxic when metabolised by the human body. Even a relatively small amount of methanol can cause blindness or even prove fatal if left untreated.

Cases of poisoning from the consumption of alcoholic beverages tainted with methanol occur time and again, particularly in developing and emerging countries, because alcoholic fermentation also produces small quantities of methanol. Whenever alcohol is unprofessionally distilled in backyard operations, significant amounts of methanol may end up in the liquor. Beverages that have been adulterated with windscreen washer fluid or other liquids containing methanol are another potential cause of poisoning.

    Beverage analyses and the breath test

    Until now, methanol could be distinguished from ethanol only in a chemical analysis laboratory. Even hospitals require relatively large, expensive equipment in order to diagnose methanol poisoning. “These appliances are rarely available in emerging and developing countries, where outbreaks of methanol poisoning are most prevalent,” says Andreas Güntner, a research group leader at the Particle Technology Laboratory of ETH Professor Sotiris Pratsinis and a researcher at the University Hospital Zürich.

    He and his colleagues have now developed an affordable handheld device based on a small metal oxide sensor. It is able to detect adulterated alcohol within two minutes by “sniffing out” methanol and ethanol vapours from a beverage. Moreover, the tool can also be used to diagnose methanol poisoning by analysing a patient’s exhaled breath. In an emergency, this helps ensure the appropriate measures are taken without delay.

    Separating methanol from ethanol

    There’s nothing new about using metal oxide sensors to measure alcoholic vapours. However, this method was unable to distinguish between different alcohols, such as ethanol and methanol. “Even the breathalyser tests used by the police measure only ethanol, although some devices also erroneously identify methanol as ethanol,” explains Jan van den Broek, a doctoral student at ETH and the lead author of the study.

    First, the ETH scientists developed a highly sensitive alcohol sensor using nanoparticles of tin oxide doped with palladium. Next, they used a trick to differentiate between methanol and ethanol. Instead of analysing the sample directly with the sensor, the two types of alcohol are first separated in an attached tube filled with a porous polymer, through which the sample air is sucked by a small pump. As its molecules are smaller, methanol passes through the polymer tube more quickly than ethanol.
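Conceptually, the read-out then reduces to timing: the earlier the sensor signal peaks after the separation tube, the smaller the molecule. A minimal sketch of that logic, in which the retention times, the peak shape and the cutoff are assumed values rather than the device’s calibration:

```python
import numpy as np

T_METHANOL, T_ETHANOL = 20.0, 60.0   # assumed retention times, seconds

def classify(t, signal, cutoff=40.0):
    """Identify the alcohol from when the sensor signal peaks."""
    t_peak = t[np.argmax(signal)]
    return "methanol" if t_peak < cutoff else "ethanol"

t = np.linspace(0, 120, 1200)
signal = np.exp(-0.5 * ((t - T_METHANOL) / 4.0) ** 2)  # early Gaussian peak
print(classify(t, signal))  # -> methanol
```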

    The millimetre-sized black dot in the centre of the gold section is the alcohol sensor.

    In this image, the sensor is inside the white casing. To its right is the polymer tube in which methanol is separated from ethanol. (Photographs: Van den Broek J et al. Nature Communications 2019)

    The measuring device proved to be exceptionally sensitive. In laboratory tests, it detected even trace amounts of methanol contamination selectively in alcoholic beverages, down to the low legal limits. Furthermore, the scientists analysed breath samples from a person who had previously drunk rum. For test purposes, the researchers subsequently added a small quantity of methanol to the breath sample.

    Patent pending

    The researchers have filed a patent application for the measuring method. They are now working to integrate the technology into a device that can be put to practical use. “This technology is low cost, making it suitable for use in developing countries as well. Moreover, it’s simple to use and can be operated even without laboratory training, for example by authorities or tourists,” Güntner says. It is also ideal for quality control in distilleries.

Methanol is more than just a nuisance in conjunction with alcoholic beverages; it is also an important industrial chemical – and one that might come to play an even more important role: methanol is being considered as a potential future fuel, since vehicles can be powered by methanol fuel cells. A further application for the new technology could therefore be as an alarm sensor to detect leaks in tanks.

    The study was part of the University Medicine Zürich – Zürich Exhalomics flagship project.

    Science paper:
    Highly selective detection of methanol over ethanol by a handheld gas sensor
    Nature Communications

See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment; to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
  • richardmitnick 11:21 am on September 12, 2019 Permalink | Reply
    Tags: , Architected metamaterials, , ETH Zürich, , ,   

    From Caltech: “New Metamaterial Morphs Into New Shapes, Taking on New Properties” 

    Caltech Logo

    From Caltech

    September 11, 2019

    Robert Perkins
    (626) 395‑1862
    rperkins@caltech.edu

    1

    A newly developed type of architected metamaterial has the ability to change shape in a tunable fashion.

While most reconfigurable materials can only toggle between two distinct states, much as a switch flips on or off, the new material’s shape can be finely tuned, adjusting its physical properties as desired. The material, which has potential applications in next-generation energy storage and bio-implantable micro-devices, was developed by a joint Caltech-Georgia Tech-ETH Zürich team in the lab of Julia R. Greer.

    Greer, the Ruben F. and Donna Mettler Professor of Materials Science, Mechanics and Medical Engineering in Caltech’s Division of Engineering and Applied Science, creates materials out of micro- and nanoscale building blocks that are arranged into sophisticated architectures that can be periodic, like a lattice, or non-periodic in a tailor-made fashion, giving them unusual physical properties.

    Most materials that are designed to change shape require a persistent external stimulus to change from one shape to another and stay that way: for example, they may be one shape when wet and a different shape when dry—like a sponge that swells as it absorbs water.

    By contrast, the new nanomaterial deforms through an electrochemically driven silicon-lithium alloying reaction, meaning that it can be finely controlled to attain any “in-between” states, remain in these configurations even upon the removal of the stimulus, and be easily reversed. Apply a little current, and a resulting chemical reaction changes the shape by a controlled, small degree. Apply a lot of current, and the shape changes substantially. Remove the electrical control, and the configuration is retained—just like tying off a balloon. A description of the new type of material was published online by the journal Nature on September 11.
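    As a rough intuition for this charge-controlled behaviour, here is a toy model with entirely invented parameters; it stands in for the silicon-lithium alloying reaction rather than reproducing the paper’s physics. The bend angle simply integrates the applied current, so the state persists at zero current and reverses when the current reverses.

```python
# Toy model (invented parameters, standing in for the silicon-lithium
# alloying physics): the bend angle integrates the applied current,
# so the state persists at zero current and reverses when the current
# reverses, like charge accumulating on a capacitor.
GAIN = 2.0       # degrees of bend per unit charge (assumed)
MAX_BEND = 40.0  # saturation angle, mimicking full lithiation (assumed)

def step(bend, current, duration):
    """Advance the bend angle by the charge passed in this interval."""
    bend += GAIN * current * duration
    return max(0.0, min(MAX_BEND, bend))  # clamp to the physical range

bend = 0.0
schedule = [(1.0, 5.0), (0.0, 5.0), (3.0, 5.0), (0.0, 5.0), (-2.0, 10.0)]
for current, duration in schedule:
    bend = step(bend, current, duration)
    print(f"I = {current:+.1f} for {duration:4.1f} s -> bend = {bend:4.1f} deg")
```

    The zero-current steps in the schedule are the point: the angle holds wherever it was left, matching the balloon-tying picture above.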

    Defects and imperfections exist in all materials, and can often determine a material’s properties. In this case, the team chose to take advantage of that fact and build in defects to imbue the material with the properties they wanted.

    “The most intriguing part of this work to me is the critical role of defects in such dynamically responsive architected materials,” says Xiaoxing Xia, a graduate student at Caltech and lead author of the Nature paper.

    For the Nature paper, the team designed a silicon-coated lattice with microscale straight beams that bend into curves under electrochemical stimulation, taking on unique mechanical and vibrational properties. Greer’s team created these materials using an ultra-high-resolution 3D printing process called two-photon lithography. Using this novel fabrication method, they were able to build in defects in the architected material system, based on a pre-arranged design. In a test of the system, the team fabricated a sheet of the material that, under electrical control, reveals a Caltech icon.

    3

“This just further shows that materials are just like people: it’s the imperfections that make them interesting. I have always had a particular liking for defects, and this time Xiaoxing managed to first uncover the effect of different types of defects on these metamaterials and then use them to program a particular pattern that would emerge in response to electrochemical stimulus,” says Greer.

A material with such a finely controllable ability to change shape has potential in future energy storage systems, Greer says, because it provides a pathway to adaptive systems that would enable batteries, for example, to be significantly lighter and safer and to have substantially longer lives. Some battery materials expand when storing energy, creating mechanical degradation due to stress from the repeated expansion and contraction. Architected materials like this one can be designed to handle such structural transformations.

    “Electrochemically active metamaterials provide a novel pathway for development of next generation smart batteries with both increased capacity and novel functionalities. At Georgia Tech, we are developing the computational tools to predict this complex coupled electro-chemo-mechanical behavior,” says Claudio V. Di Leo, assistant professor of aerospace engineering at the Georgia Institute of Technology.

See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

    Caltech campus

     
  • richardmitnick 3:29 pm on April 12, 2019 Permalink | Reply
    Tags: , ETH Zürich, Laser technology to understand the quantum nature of a vacuum setting a landmark in our attempts to measure absolute nothingness,   

    From ETH Zürich via Science Alert: “For The First Time, Physicists Have Managed to Measure Precisely Absolutely Nothing” 

    ETH Zurich bloc

    From ETH Zürich

    via

    ScienceAlert

    Science Alert

    12 APR 2019
    MIKE MCRAE

    1
    (koto_feja/iStock)

For some physicists, measuring the spectrum of tiny waves making up empty space has been a goal for decades, but until now no one had found a good way to achieve it.

    Now physicists from ETH Zürich have cleverly used laser pulses to understand the quantum nature of a vacuum, setting a landmark in our attempts to measure absolute nothingness.

    Our Universe is fundamentally bumpy. Like a fresh canvas yet to be painted, there’s a texture to bare reality which we can only just detect.

What we take for the complete absence of matter and radiation is an infinite field of possibility from which particles emerge. In fact, there is a field for every elementary particle, just waiting for sufficient energy to define key features of its existence.

    Those particles are all constrained by a strange rule – as some possibilities increase, others have to shrink. A particle can be in a precise location, for example, but it will have a vague momentum. Or vice versa.
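    Quantitatively, this is Heisenberg’s uncertainty relation: the spreads in position and momentum always satisfy Δx · Δp ≥ ħ/2, where ħ is the reduced Planck constant, so squeezing one spread toward zero forces the other to grow.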

    This uncertainty principle doesn’t just apply to particles. It applies to the vacant field itself.

    Standing back, that artist’s canvas looks remarkably smooth. Likewise, over an extended period of time, the amount of energy in a volume of empty space averages out to zero.

    But as we focus in, for any single moment we become less certain about how much energy we’ll find, resulting in a spectrum of probabilities.
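    That statement is easy to check numerically. In the sketch below, the Gaussian model and its noise scale are assumptions chosen purely for illustration, not a real vacuum calculation: repeated samples of a zero-mean field average to zero, while their spread stays finite.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model the field in one small region as zero-mean Gaussian noise.
# The scale is arbitrary here; in a real vacuum it is set by hbar.
samples = rng.normal(loc=0.0, scale=1.0, size=1_000_000)

print(f"time-averaged field: {samples.mean():+.4f}  (tends to zero)")
print(f"fluctuation spread:  {samples.std():.4f}   (stays finite)")
```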

    We typically think of this weave as random. But there are correlations which could tell us a thing or two about the nature of this rippling.

“The vacuum fluctuations of the electromagnetic field have clearly visible consequences, and among other things, are responsible for the fact that an atom can spontaneously emit light,” says physicist Ileana-Cristina Benea-Chelmus from the Institute for Quantum Electronics at ETH Zurich.

    To measure most things, you need to establish a starting point. Unfortunately for something already in its lowest energy state, it’s a little like measuring the force of a punch from a non-moving fist.

“Traditional detectors for light such as photodiodes are based on the principle that light particles – and hence energy – are absorbed by the detector,” says Benea-Chelmus.

    “However, from the vacuum, which represents the lowest energy state of a physical system, no further energy can be extracted.”

    Rather than measure the transfer of energy from an empty field, the team devised a way to look for the signature of its subtle probability shifts in the polarisation of photons.

    By comparing two laser pulses just a trillionth of a second in length, sent through a super-cold crystal at different times and locations, the team could work out how the empty space between the crystal’s atoms affected the light.

    “Still, the measured signal is absolutely tiny, and we really had to max out our experimental capabilities of measuring very small fields,” says physicist Jérôme Faist.

    Tiny is an understatement. That quantum ‘wiggle’ was so small, they needed up to a trillion observations for each comparison just to be sure the measurements were legitimate.
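    A scaled-down numerical sketch shows why so many repetitions are needed (the numbers here are toys, not the experiment’s actual statistics): a correlated signal far weaker than the per-shot noise only emerges from the average once the noise has been beaten down by roughly the square root of the number of shots.

```python
import numpy as np

rng = np.random.default_rng(1)

TRUE_CORR = 1e-2  # tiny shared fluctuation between the two pulses (assumed)
NOISE = 1.0       # per-shot detection noise, dwarfing the signal (assumed)

def estimate_correlation(n_shots):
    """Average the product of two noisy readouts that share a weak signal."""
    shared = rng.normal(0.0, np.sqrt(TRUE_CORR), n_shots)
    a = shared + rng.normal(0.0, NOISE, n_shots)  # readout of pulse 1
    b = shared + rng.normal(0.0, NOISE, n_shots)  # readout of pulse 2
    return np.mean(a * b)  # expectation value is TRUE_CORR

for n in (1_000, 100_000, 2_000_000):
    print(f"N = {n:>9,}: estimated correlation = {estimate_correlation(n):+.2e}")
```

    Only at the largest N does the estimate settle near the true value; with a trillion shots, correlations orders of magnitude smaller become measurable.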

As minuscule as the final results happened to be, the measurements allowed them to determine the fine spectrum of an electromagnetic field in its ground state.

    Getting a grip on what is effectively empty space is becoming a big deal in quantum physics.

    Only recently, another team of physicists attempted to put limits on the noise of a vacuum at room temperature in order to improve the functionality of the gravitational wave detector LIGO.

    Virtual particles – the brief ghosts of possible particles that barely exist as uncertainties in a field – are also key to understanding how black holes slowly evaporate away over time through Hawking radiation.

    In the future, we’ll need even more tricks like these if we’re to understand the fabric the Universe is painted on.

    This research was published in Nature.

See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment; to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     