Tagged: WIRED

  • richardmitnick 12:50 pm on March 18, 2019 Permalink | Reply
    Tags: "AI Algorithms Are Now Shockingly Good at Doing Science", , WIRED   

    From Quanta via WIRED: “AI Algorithms Are Now Shockingly Good at Doing Science” 

    Quanta Magazine

    via

    Wired logo

    From WIRED

    3.17.19
    Dan Falk

    Whether probing the evolution of galaxies or discovering new chemical compounds, algorithms are detecting patterns no humans could have spotted. Rachel Suggs/Quanta Magazine

    No human, or team of humans, could possibly keep up with the avalanche of information produced by many of today’s physics and astronomy experiments. Some of them record terabytes of data every day—and the torrent is only increasing. The Square Kilometre Array, a radio telescope slated to switch on in the mid-2020s, will generate about as much data traffic each year as the entire internet.

    SKA Square Kilometer Array

    The deluge has many scientists turning to artificial intelligence for help. With minimal human input, AI systems such as artificial neural networks—computer-simulated networks of neurons that mimic the function of brains—can plow through mountains of data, highlighting anomalies and detecting patterns that humans could never have spotted.

    Of course, the use of computers to aid in scientific research goes back about 75 years, and the method of manually poring over data in search of meaningful patterns originated millennia earlier. But some scientists are arguing that the latest techniques in machine learning and AI represent a fundamentally new way of doing science. One such approach, known as generative modeling, can help identify the most plausible theory among competing explanations for observational data, based solely on the data, and, importantly, without any preprogrammed knowledge of what physical processes might be at work in the system under study. Proponents of generative modeling see it as novel enough to be considered a potential “third way” of learning about the universe.

    Traditionally, we’ve learned about nature through observation. Think of Johannes Kepler poring over Tycho Brahe’s tables of planetary positions and trying to discern the underlying pattern. (He eventually deduced that planets move in elliptical orbits.) Science has also advanced through simulation. An astronomer might model the movement of the Milky Way and its neighboring galaxy, Andromeda, and predict that they’ll collide in a few billion years. Both observation and simulation help scientists generate hypotheses that can then be tested with further observations. Generative modeling differs from both of these approaches.

    Milkdromeda: Andromeda (on the left) in Earth’s night sky, 3.75 billion years from now. NASA

    “It’s basically a third approach, between observation and simulation,” says Kevin Schawinski, an astrophysicist and one of generative modeling’s most enthusiastic proponents, who worked until recently at the Swiss Federal Institute of Technology in Zurich (ETH Zurich). “It’s a different way to attack a problem.”

    Some scientists see generative modeling and other new techniques simply as power tools for doing traditional science. But most agree that AI is having an enormous impact, and that its role in science will only grow. Brian Nord, an astrophysicist at Fermi National Accelerator Laboratory who uses artificial neural networks to study the cosmos, is among those who fear there’s nothing a human scientist does that will be impossible to automate. “It’s a bit of a chilling thought,” he said.


    Discovery by Generation

    Ever since graduate school, Schawinski has been making a name for himself in data-driven science. While working on his doctorate, he faced the task of classifying thousands of galaxies based on their appearance. Because no readily available software existed for the job, he decided to crowdsource it—and so the Galaxy Zoo citizen science project was born.

    Galaxy Zoo via Astrobites

    Beginning in 2007, ordinary computer users helped astronomers by logging their best guesses as to which galaxy belonged in which category, with majority rule typically leading to correct classifications. The project was a success, but, as Schawinski notes, AI has made it obsolete: “Today, a talented scientist with a background in machine learning and access to cloud computing could do the whole thing in an afternoon.”

    Schawinski turned to the powerful new tool of generative modeling in 2016. Essentially, generative modeling asks how likely it is, given condition X, that you’ll observe outcome Y. The approach has proved incredibly potent and versatile. As an example, suppose you feed a generative model a set of images of human faces, with each face labeled with the person’s age. As the computer program combs through these “training data,” it begins to draw a connection between older faces and an increased likelihood of wrinkles. Eventually it can “age” any face that it’s given—that is, it can predict what physical changes a given face of any age is likely to undergo.

    None of these faces is real. The faces in the top row (A) and left-hand column (B) were constructed by a generative adversarial network (GAN) using building-block elements of real faces. The GAN then combined basic features of the faces in A, including their gender, age and face shape, with finer features of faces in B, such as hair color and eye color, to create all the faces in the rest of the grid. NVIDIA

    The best-known generative modeling systems are “generative adversarial networks” (GANs). After adequate exposure to training data, a GAN can repair images that have damaged or missing pixels, or it can sharpen blurry photographs. It learns to infer the missing information by means of a competition (hence the term “adversarial”): One part of the network, known as the generator, generates fake data, while a second part, the discriminator, tries to distinguish fake data from real data. As the program runs, both halves get progressively better. You may have seen some of the hyper-realistic, GAN-produced “faces” that have circulated recently — images of “freakishly realistic people who don’t actually exist,” as one headline put it.
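
    The generator-versus-discriminator competition can be sketched in a few dozen lines of code. Below is a minimal toy example in PyTorch that learns to mimic a one-dimensional Gaussian rather than images; the network sizes, data, and hyperparameters are invented for illustration and have no connection to the systems described in the article.

```python
# Minimal GAN sketch on toy 1-D data (illustrative only).
import torch
import torch.nn as nn

real_data = torch.randn(256, 1) * 0.5 + 3.0   # "real" samples: a Gaussian centered at 3.0
noise_dim = 8

generator = nn.Sequential(nn.Linear(noise_dim, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    # Train the discriminator: label real samples 1, generated samples 0.
    fake = generator(torch.randn(256, noise_dim)).detach()
    d_loss = loss_fn(discriminator(real_data), torch.ones(256, 1)) + \
             loss_fn(discriminator(fake), torch.zeros(256, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Train the generator: try to make the discriminator output 1 for fakes.
    fake = generator(torch.randn(256, noise_dim))
    g_loss = loss_fn(discriminator(fake), torch.ones(256, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# After training, generated samples should cluster near 3.0, like the real data.
print("mean of generated samples:", generator(torch.randn(1000, noise_dim)).mean().item())
```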

    More broadly, generative modeling takes sets of data (typically images, but not always) and breaks each of them down into a set of basic, abstract building blocks — scientists refer to this as the data’s “latent space.” The algorithm manipulates elements of the latent space to see how this affects the original data, and this helps uncover physical processes that are at work in the system.

    The idea of a latent space is abstract and hard to visualize, but as a rough analogy, think of what your brain might be doing when you try to determine the gender of a human face. Perhaps you notice hairstyle, nose shape, and so on, as well as patterns you can’t easily put into words. The computer program is similarly looking for salient features among data: Though it has no idea what a mustache is or what gender is, if it’s been trained on data sets in which some images are tagged “man” or “woman,” and in which some have a “mustache” tag, it will quickly deduce a connection.

    In a paper published in December in Astronomy & Astrophysics, Schawinski and his ETH Zurich colleagues Dennis Turp and Ce Zhang used generative modeling to investigate the physical changes that galaxies undergo as they evolve. (The software they used treats the latent space somewhat differently from the way a generative adversarial network treats it, so it is not technically a GAN, though similar.) Their model created artificial data sets as a way of testing hypotheses about physical processes. They asked, for instance, how the “quenching” of star formation—a sharp reduction in formation rates—is related to the increasing density of a galaxy’s environment.

    For Schawinski, the key question is how much information about stellar and galactic processes could be teased out of the data alone. “Let’s erase everything we know about astrophysics,” he said. “To what degree could we rediscover that knowledge, just using the data itself?”

    First, the galaxy images were reduced to their latent space; Schawinski could then tweak one element of that space in a way that corresponded to a particular change in the galaxy’s environment—the density of its surroundings, for example—and regenerate the galaxy to see what differences turned up. “So now I have a hypothesis-generation machine,” he explained. “I can take a whole bunch of galaxies that are originally in a low-density environment and make them look like they’re in a high-density environment, by this process.” Schawinski, Turp and Zhang saw that, as galaxies go from low- to high-density environments, they become redder in color, and their stars become more centrally concentrated. This matches existing observations about galaxies, Schawinski said. The question is why this is so.
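
    In pseudocode, the workflow he describes looks roughly like the sketch below. The encode and decode functions here are crude linear stand-ins for a trained generative model, and the idea that one particular latent index tracks environmental density is an assumption made purely for illustration; none of this is the actual model from the Astronomy & Astrophysics paper.

```python
import numpy as np

# Hypothetical trained model: encode() maps an image to a latent vector,
# decode() maps a latent vector back to an image. Linear stand-ins, illustration only.
rng = np.random.default_rng(0)
W = rng.normal(size=(64, 4096))           # fake "encoder" weights
def encode(image):                        # image -> latent vector (length 64)
    return W @ image.ravel()
def decode(latent):                       # latent vector -> image (64 x 64)
    return (W.T @ latent).reshape(64, 64)

# Assumed for illustration: index 5 of the latent space tracks environmental density.
DENSITY_DIM = 5

galaxy_image = rng.normal(size=(64, 64))  # placeholder for a real galaxy cutout
z = encode(galaxy_image)

z_dense = z.copy()
z_dense[DENSITY_DIM] += 3.0               # "move" the galaxy to a denser environment

low_density_version = decode(z)
high_density_version = decode(z_dense)
difference = high_density_version - low_density_version
print("mean pixel change after the latent tweak:", difference.mean())
```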

    The next step, Schawinski says, has not yet been automated: “I have to come in as a human, and say, ‘OK, what kind of physics could explain this effect?’” For the process in question, there are two plausible explanations: Perhaps galaxies become redder in high-density environments because they contain more dust, or perhaps they become redder because of a decline in star formation (in other words, their stars tend to be older). With a generative model, both ideas can be put to the test: Elements in the latent space related to dustiness and star formation rates are changed to see how this affects galaxies’ color. “And the answer is clear,” Schawinski said. Redder galaxies are “where the star formation had dropped, not the ones where the dust changed. So we should favor that explanation.”

    Using generative modeling, astrophysicists could investigate how galaxies change when they go from low-density regions of the cosmos to high-density regions, and what physical processes are responsible for these changes. K. Schawinski et al.; doi: 10.1051/0004-6361/201833800

    The approach is related to traditional simulation, but with critical differences. A simulation is “essentially assumption-driven,” Schawinski said. “The approach is to say, ‘I think I know what the underlying physical laws are that give rise to everything that I see in the system.’ So I have a recipe for star formation, I have a recipe for how dark matter behaves, and so on. I put all of my hypotheses in there, and I let the simulation run. And then I ask: Does that look like reality?” What he’s done with generative modeling, he said, is “in some sense, exactly the opposite of a simulation. We don’t know anything; we don’t want to assume anything. We want the data itself to tell us what might be going on.”

    The apparent success of generative modeling in a study like this obviously doesn’t mean that astronomers and graduate students have been made redundant—but it appears to represent a shift in the degree to which learning about astrophysical objects and processes can be achieved by an artificial system that has little more at its electronic fingertips than a vast pool of data. “It’s not fully automated science—but it demonstrates that we’re capable of at least in part building the tools that make the process of science automatic,” Schawinski said.

    Generative modeling is clearly powerful, but whether it truly represents a new approach to science is open to debate. For David Hogg, a cosmologist at New York University and the Flatiron Institute (which, like Quanta, is funded by the Simons Foundation), the technique is impressive but ultimately just a very sophisticated way of extracting patterns from data—which is what astronomers have been doing for centuries.


    In other words, it’s an advanced form of observation plus analysis. Hogg’s own work, like Schawinski’s, leans heavily on AI; he’s been using neural networks to classify stars according to their spectra and to infer other physical attributes of stars using data-driven models. But he sees his work, as well as Schawinski’s, as tried-and-true science. “I don’t think it’s a third way,” he said recently. “I just think we as a community are becoming far more sophisticated about how we use the data. In particular, we are getting much better at comparing data to data. But in my view, my work is still squarely in the observational mode.”

    Hardworking Assistants

    Whether they’re conceptually novel or not, it’s clear that AI and neural networks have come to play a critical role in contemporary astronomy and physics research. At the Heidelberg Institute for Theoretical Studies, the physicist Kai Polsterer heads the astroinformatics group — a team of researchers focused on new, data-centered methods of doing astrophysics. Recently, they’ve been using a machine-learning algorithm to extract redshift information from galaxy data sets, a previously arduous task.
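
    As a rough illustration of this kind of data-driven redshift estimation, the sketch below trains a regressor on synthetic “photometry” with scikit-learn. The fake catalog and the simple relation between colors and redshift are invented for the example; this is not the Heidelberg group’s actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Fake catalog: 5 photometric magnitudes per galaxy, with a made-up
# dependence of redshift on two colors plus noise.
n = 5000
mags = rng.normal(loc=20.0, scale=1.5, size=(n, 5))
z_true = 0.3 * (mags[:, 0] - mags[:, 1]) + 0.1 * (mags[:, 2] - mags[:, 3]) \
         + rng.normal(scale=0.02, size=n)
z_true = np.clip(z_true, 0.0, None)

X_train, X_test, z_train, z_test = train_test_split(mags, z_true, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, z_train)

z_pred = model.predict(X_test)
print("RMS redshift error:", np.sqrt(np.mean((z_pred - z_test) ** 2)))
```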

    Polsterer sees these new AI-based systems as “hardworking assistants” that can comb through data for hours on end without getting bored or complaining about the working conditions. These systems can do all the tedious grunt work, he said, leaving you “to do the cool, interesting science on your own.”

    But they’re not perfect. In particular, Polsterer cautions, the algorithms can only do what they’ve been trained to do. The system is “agnostic” regarding the input. Give it a galaxy, and the software can estimate its redshift and its age — but feed that same system a selfie, or a picture of a rotting fish, and it will output a (very wrong) age for that, too. In the end, oversight by a human scientist remains essential, he said. “It comes back to you, the researcher. You’re the one in charge of doing the interpretation.”

    For his part, Nord, at Fermilab, cautions that it’s crucial that neural networks deliver not only results, but also error bars to go along with them, as every undergraduate is trained to do. In science, if you make a measurement and don’t report an estimate of the associated error, no one will take the results seriously, he said.
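
    One common way to attach error bars to a machine-learned prediction (not necessarily the approach Nord has in mind) is to train an ensemble on bootstrap resamples of the data and report the spread of the members’ predictions. A minimal sketch, with small scikit-learn networks standing in for research-scale models and entirely synthetic data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=500)   # toy "measurements"

# Bootstrap ensemble: each member sees a resampled version of the training set.
ensemble = []
for seed in range(10):
    idx = rng.integers(0, len(X), size=len(X))
    member = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=seed)
    member.fit(X[idx], y[idx])
    ensemble.append(member)

# The scatter across ensemble members gives a rough error bar on the prediction.
x_new = np.array([[1.5]])
predictions = np.array([m.predict(x_new)[0] for m in ensemble])
print(f"estimate = {predictions.mean():.3f} +/- {predictions.std():.3f}")
```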

    Like many AI researchers, Nord is also concerned about the impenetrability of results produced by neural networks; often, a system delivers an answer without offering a clear picture of how that result was obtained.

    Yet not everyone feels that a lack of transparency is necessarily a problem. Lenka Zdeborová, a researcher at the Institute of Theoretical Physics at CEA Saclay in France, points out that human intuitions are often equally impenetrable. You look at a photograph and instantly recognize a cat—“but you don’t know how you know,” she said. “Your own brain is in some sense a black box.”

    It’s not only astrophysicists and cosmologists who are migrating toward AI-fueled, data-driven science. Quantum physicists like Roger Melko of the Perimeter Institute for Theoretical Physics and the University of Waterloo in Ontario have used neural networks to solve some of the toughest and most important problems in that field, such as how to represent the mathematical “wave function” describing a many-particle system.

    Perimeter Institute in Waterloo, Canada


    AI is essential because of what Melko calls “the exponential curse of dimensionality.” That is, the possibilities for the form of a wave function grow exponentially with the number of particles in the system it describes. The difficulty is similar to trying to work out the best move in a game like chess or Go: You try to peer ahead to the next move, imagining what your opponent will play, and then choose the best response, but with each move, the number of possibilities proliferates.
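
    The “exponential curse” is easy to make concrete: a general quantum state of n two-level particles requires 2^n complex amplitudes. A quick back-of-the-envelope script, assuming 16 bytes per double-precision complex amplitude:

```python
# Memory needed to store a full wave function of n two-level particles,
# assuming one complex128 amplitude (16 bytes) per basis state.
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2**30
    print(f"{n:2d} particles: {amplitudes:,} amplitudes ~ {gib:,.1f} GiB")
```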

    Of course, AI systems have mastered both of these games—chess, decades ago, and Go in 2016, when an AI system called AlphaGo defeated a top human player. They are similarly suited to problems in quantum physics, Melko says.

    The Mind of the Machine

    Whether Schawinski is right in claiming that he’s found a “third way” of doing science, or whether, as Hogg says, it’s merely traditional observation and data analysis “on steroids,” it’s clear AI is changing the flavor of scientific discovery, and it’s certainly accelerating it. How far will the AI revolution go in science?

    Occasionally, grand claims are made regarding the achievements of a “robo-scientist.” A decade ago, an AI robot chemist named Adam investigated the genome of baker’s yeast and worked out which genes are responsible for making certain amino acids. (Adam did this by observing strains of yeast that had certain genes missing, and comparing the results to the behavior of strains that had the genes.) Wired’s headline read, “Robot Makes Scientific Discovery All by Itself.”

    More recently, Lee Cronin, a chemist at the University of Glasgow, has been using a robot to randomly mix chemicals, to see what sorts of new compounds are formed.

    Monitoring the reactions in real time with a mass spectrometer, a nuclear magnetic resonance machine, and an infrared spectrometer, the system eventually learned to predict which combinations would be the most reactive. Even if it doesn’t lead to further discoveries, Cronin has said, the robotic system could allow chemists to speed up their research by about 90 percent.

    Last year, another team of scientists at ETH Zurich used neural networks to deduce physical laws from sets of data. Their system, a sort of robo-Kepler, rediscovered the heliocentric model of the solar system from records of the position of the sun and Mars in the sky, as seen from Earth, and figured out the law of conservation of momentum by observing colliding balls. Since physical laws can often be expressed in more than one way, the researchers wonder if the system might offer new ways—perhaps simpler ways—of thinking about known laws.

    These are all examples of AI kick-starting the process of scientific discovery, though in every case, we can debate just how revolutionary the new approach is. Perhaps most controversial is the question of how much information can be gleaned from data alone—a pressing question in the age of stupendously large (and growing) piles of it. In The Book of Why (2018), the computer scientist Judea Pearl and the science writer Dana Mackenzie assert that data are “profoundly dumb.” Questions about causality “can never be answered from data alone,” they write. “Anytime you see a paper or a study that analyzes the data in a model-free way, you can be certain that the output of the study will merely summarize, and perhaps transform, but not interpret the data.” Schawinski sympathizes with Pearl’s position, but he described the idea of working with “data alone” as “a bit of a straw man.” He’s never claimed to deduce cause and effect that way, he said. “I’m merely saying we can do more with data than we often conventionally do.”

    Another oft-heard argument is that science requires creativity, and that—at least so far—we have no idea how to program that into a machine. (Simply trying everything, like Cronin’s robo-chemist, doesn’t seem especially creative.) “Coming up with a theory, with reasoning, I think demands creativity,” Polsterer said. “Every time you need creativity, you will need a human.” And where does creativity come from? Polsterer suspects it is related to boredom—something that, he says, a machine cannot experience. “To be creative, you have to dislike being bored. And I don’t think a computer will ever feel bored.” On the other hand, words like “creative” and “inspired” have often been used to describe programs like Deep Blue and AlphaGo. And the struggle to describe what goes on inside the “mind” of a machine is mirrored by the difficulty we have in probing our own thought processes.

    Schawinski recently left academia for the private sector; he now runs a startup called Modulos, which employs a number of ETH scientists and, according to its website, works “in the eye of the storm of developments in AI and machine learning.” Whatever obstacles may lie between current AI technology and full-fledged artificial minds, he and other experts feel that machines are poised to do more and more of the work of human scientists. Whether there is a limit remains to be seen.

    “Will it be possible, in the foreseeable future, to build a machine that can discover physics or mathematics that the brightest humans alive are not able to do on their own, using biological hardware?” Schawinski wonders. “Will the future of science eventually necessarily be driven by machines that operate on a level that we can never reach? I don’t know. It’s a good question.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:10 pm on March 10, 2019 Permalink | Reply
    Tags: A quantum computer would greatly speed up analysis of the collisions hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on., And they’ve been waiting for decades. Google is in the race as are IBM Microsoft Intel and a clutch of startups academic groups and the Chinese government., At the moment researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC trying to find exotic heavy sister-particles to all our known particles of matter., “This is a marathon” says David Reilly who leads Microsoft’s quantum lab at the University of Sydney Australia. “And it's only 10 minutes into the marathon.”, CERN-Future Circular Collider, For CERN the quantum promise could for instance help its scientists find evidence of supersymmetry or SUSY which so far has proven elusive., HL-LHC-High-Luminosity LHC, IBM has steadily been boosting the number of qubits on its quantum computers starting with a meagre 5-qubit computer then 16- and 20-qubit machines and just recently showing off its 50-qubit processor, In a bid to make sense of the impending data deluge some at CERN are turning to the emerging field of quantum computing., In a quantum computer each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work., In theory a quantum computer would process all the states a qubit can have at once and with every qubit added to its memory size its computational power should increase exponentially., Last year physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson found at the LHC in 2012, None of the competing teams have come close to reaching even the first milestone., The quest has now lasted decades and a number of physicists are questioning if the theory behind SUSY is really valid., Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data., Venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone., WIRED

    From WIRED: “Inside the High-Stakes Race to Make Quantum Computers Work” 

    Wired logo

    From WIRED

    03.08.19
    Katia Moskvitch

    View Pictures/Getty Images

    Deep beneath the Franco-Swiss border, the Large Hadron Collider is sleeping.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    But it won’t be quiet for long. Over the coming years, the world’s largest particle accelerator will be supercharged, increasing the number of proton collisions per second by a factor of two and a half.

    Once the work is complete in 2026, researchers hope to unlock some of the most fundamental questions in the universe. But with the increased power will come a deluge of data the likes of which high-energy physics has never seen before. And, right now, humanity has no way of knowing what the collider might find.

    To understand the scale of the problem, consider this: When it shut down in December 2018, the LHC generated about 300 gigabytes of data every second, adding up to 25 petabytes (PB) annually. For comparison, you’d have to spend 50,000 years listening to music to go through 25 PB of MP3 songs, while the human brain can store memories equivalent to just 2.5 PB of binary data. To make sense of all that information, the LHC data was pumped out to 170 computing centers in 42 countries [http://greybook.cern.ch/]. It was this global collaboration that helped discover the elusive Higgs boson, part of the Higgs field believed to give mass to elementary particles of matter.
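
    The 50,000-years comparison is straightforward to sanity-check if you assume roughly one megabyte per minute of MP3 audio (about 128 kbit/s):

```python
# Rough sanity check of the "50,000 years of MP3s" comparison.
# Assumes ~1 MB per minute of music (about 128 kbit/s encoding).
petabyte = 1e15                      # bytes
mp3_bytes_per_minute = 1e6           # ~1 MB per minute of audio
minutes = 25 * petabyte / mp3_bytes_per_minute
years = minutes / (60 * 24 * 365)
print(f"{years:,.0f} years of continuous listening")   # roughly 48,000 years
```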

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    To process the looming data torrent, scientists at the European Organization for Nuclear Research, or CERN, will need 50 to 100 times more computing power than they have at their disposal today. A proposed Future Circular Collider, four times the size of the LHC and 10 times as powerful, would create an impossibly large quantity of data, at least twice as much as the LHC.

    CERN FCC Future Circular Collider map

    In a bid to make sense of the impending data deluge, some at CERN are turning to the emerging field of quantum computing. Powered by the very laws of nature the LHC is probing, such a machine could potentially crunch the expected volume of data in no time at all. What’s more, it would speak the same language as the LHC. While numerous labs around the world are trying to harness the power of quantum computing, the planned work at CERN makes this research particularly exciting. There’s just one problem: Right now, there are only prototypes; nobody knows whether it’s actually possible to build a reliable quantum device.

    Traditional computers—be it an Apple Watch or the most powerful supercomputer—rely on tiny silicon transistors that work like on-off switches to encode bits of data.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    Each circuit can have one of two values—either one (on) or zero (off) in binary code; the computer turns the voltage in a circuit on or off to make it work.

    A quantum computer is not limited to this “either/or” way of thinking. Its memory is made up of quantum bits, or qubits—tiny particles of matter like atoms or electrons. And qubits can do “both/and,” meaning that they can be in a superposition of all possible combinations of zeros and ones; they can be all of those states simultaneously.
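
    Superposition is easy to play with in a state-vector simulation. The sketch below, written in plain NumPy rather than any vendor’s quantum SDK, puts a single simulated qubit into an equal superposition with a Hadamard gate and samples measurement outcomes:

```python
import numpy as np

# A qubit state is a length-2 complex vector; start in |0>.
state = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

probabilities = np.abs(state) ** 2          # Born rule: |amplitude|^2
print("P(0), P(1) =", probabilities)        # [0.5, 0.5]

# Measuring collapses the superposition; repeat to see the 50/50 statistics.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10, p=probabilities)
print("ten measurements:", samples)
```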

    For CERN, the quantum promise could, for instance, help its scientists find evidence of supersymmetry, or SUSY, which so far has proven elusive.

    Standard Model of Supersymmetry via DESY

    At the moment, researchers spend weeks and months sifting through the debris from proton-proton collisions in the LHC, trying to find exotic, heavy sister-particles to all our known particles of matter. The quest has now lasted decades, and a number of physicists are questioning if the theory behind SUSY is really valid. A quantum computer would greatly speed up analysis of the collisions, hopefully finding evidence of supersymmetry much sooner—or at least allowing us to ditch the theory and move on.

    A quantum device might also help scientists understand the evolution of the early universe, the first few minutes after the Big Bang. Physicists are pretty confident that back then, our universe was nothing but a strange soup of subatomic particles called quarks and gluons. To understand how this quark-gluon plasma has evolved into the universe we have today, researchers simulate the conditions of the infant universe and then test their models at the LHC, with multiple collisions. Performing a simulation on a quantum computer, governed by the same laws that govern the very particles that the LHC is smashing together, could lead to a much more accurate model to test.

    Beyond pure science, banks, pharmaceutical companies, and governments are also waiting to get their hands on computing power that could be tens or even hundreds of times greater than that of any traditional computer.

    And they’ve been waiting for decades. Google is in the race, as are IBM, Microsoft, Intel and a clutch of startups, academic groups, and the Chinese government. The stakes are incredibly high. Last October, the European Union pledged to give $1 billion to over 5,000 European quantum technology researchers over the next decade, while venture capitalists invested some $250 million in various companies researching quantum computing in 2018 alone. “This is a marathon,” says David Reilly, who leads Microsoft’s quantum lab at the University of Sydney, Australia. “And it’s only 10 minutes into the marathon.”

    Despite the hype surrounding quantum computing and the media frenzy triggered by every announcement of a new qubit record, none of the competing teams have come close to reaching even the first milestone, fancily called quantum supremacy—the moment when a quantum computer performs at least one specific task better than a standard computer. Any kind of task, even if it is totally artificial and pointless. There are plenty of rumors in the quantum community that Google may be close, although if true, it would give the company bragging rights at best, says Michael Biercuk, a physicist at the University of Sydney and founder of quantum startup Q-CTRL. “It would be a bit of a gimmick—an artificial goal,” says Reilly. “It’s like concocting some mathematical problem that really doesn’t have an obvious impact on the world just to say that a quantum computer can solve it.”

    That’s because the first real checkpoint in this race is much further away. Called quantum advantage, it would see a quantum computer outperform normal computers on a truly useful task. (Some researchers use the terms quantum supremacy and quantum advantage interchangeably.) And then there is the finish line, the creation of a universal quantum computer. The hope is that it would deliver a computational nirvana with the ability to perform a broad range of incredibly complex tasks. At stake is the design of new molecules for life-saving drugs, helping banks to adjust the riskiness of their investment portfolios, a way to break all current cryptography and develop new, stronger systems, and for scientists at CERN, a way to glimpse the universe as it was just moments after the Big Bang.

    Slowly but surely, work is already underway. Federico Carminati, a physicist at CERN, admits that today’s quantum computers wouldn’t give researchers anything more than classical machines, but, undeterred, he’s started tinkering with IBM’s prototype quantum device via the cloud while waiting for the technology to mature. It’s the latest baby step in the quantum marathon. The deal between CERN and IBM was struck in November last year at an industry workshop organized by the research organization.

    Set up to exchange ideas and discuss potential collaborations, the event had CERN’s spacious auditorium packed to the brim with researchers from Google, IBM, Intel, D-Wave, Rigetti, and Microsoft. Google detailed its tests of Bristlecone, a 72-qubit machine. Rigetti was touting its work on a 128-qubit system. Intel showed that it was in close pursuit with 49 qubits. For IBM, physicist Ivano Tavernelli took to the stage to explain the company’s progress.

    IBM has steadily been boosting the number of qubits on its quantum computers, starting with a meagre 5-qubit computer, then 16- and 20-qubit machines, and just recently showing off its 50-qubit processor.

    IBM iconic image of Quantum computer

    Carminati listened to Tavernelli, intrigued, and during a much needed coffee break approached him for a chat. A few minutes later, CERN had added a quantum computer to its impressive technology arsenal. CERN researchers are now starting to develop entirely new algorithms and computing models, aiming to grow together with the device. “A fundamental part of this process is to build a solid relationship with the technology providers,” says Carminati. “These are our first steps in quantum computing, but even if we are coming relatively late into the game, we are bringing unique expertise in many fields. We are experts in quantum mechanics, which is at the base of quantum computing.”

    The attraction of quantum devices is obvious. Take standard computers. The prediction by former Intel CEO Gordon Moore in 1965 that the number of components in an integrated circuit would double roughly every two years has held true for more than half a century. But many believe that Moore’s law is about to hit the limits of physics. Since the 1980s, however, researchers have been pondering an alternative. The idea was popularized by Richard Feynman, an American physicist at Caltech in Pasadena. During a lecture in 1981, he lamented that computers could not really simulate what was happening at a subatomic level, with tricky particles like electrons and photons that behave like waves but also dare to exist in two states at once, a phenomenon known as quantum superposition.

    Feynman proposed to build a machine that could. “I’m not happy with all the analyses that go with just the classical theory, because nature isn’t classical, dammit,” he told the audience back in 1981. “And if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.”

    And so the quantum race began. Qubits can be made in different ways, but the rule is that two qubits can be both in state A, both in state B, one in state A and one in state B, or vice versa, so there are four possible combinations in total. And you won’t know what state a qubit is in until you measure it and the qubit is yanked out of its quantum world of probabilities into our mundane physical reality.

    In theory, a quantum computer would process all the states a qubit can have at once, and with every qubit added to its memory size, its computational power should increase exponentially. So, for three qubits, there are eight states to work with simultaneously, for four, 16; for 10, 1,024; and for 20, a whopping 1,048,576 states. You don’t need a lot of qubits to quickly surpass the memory banks of the world’s most powerful modern supercomputers—meaning that for specific tasks, a quantum computer could find a solution much faster than any regular computer ever would. Add to this another crucial concept of quantum mechanics: entanglement. It means that qubits can be linked into a single quantum system, where operating on one affects the rest of the system. This way, the computer can harness the processing power of both simultaneously, massively increasing its computational ability.
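
    Both points in this paragraph, the doubling of the state space with every qubit and the linking of qubits through entanglement, can be checked with a tiny state-vector simulation (again plain NumPy, purely illustrative):

```python
import numpy as np

# 1) The state space doubles with each qubit.
for n in (3, 4, 10, 20):
    print(f"{n:2d} qubits -> {2**n:,} simultaneous basis states")

# 2) Entangling two qubits: Hadamard on qubit 0, then CNOT (control 0, target 1).
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.kron(H @ ket0, ket0)      # apply H to qubit 0, qubit 1 stays |0>
state = CNOT @ state                 # entangle: (|00> + |11>) / sqrt(2)

probs = np.abs(state) ** 2
for label, p in zip(["00", "01", "10", "11"], probs):
    print(f"P({label}) = {p:.2f}")   # 0.50, 0.00, 0.00, 0.50 -- outcomes are correlated
```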

    While a number of companies and labs are competing in the quantum marathon, many are running their own races, taking different approaches. One device has even been used by a team of researchers to analyze CERN data, albeit not at CERN. Last year, physicists from the California Institute of Technology in Pasadena and the University of Southern California managed to replicate the discovery of the Higgs boson, found at the LHC in 2012, by sifting through the collider’s troves of data using a quantum computer manufactured by D-Wave, a Canadian firm based in Burnaby, British Columbia. The findings didn’t arrive any quicker than on a traditional computer, but, crucially, the research showed a quantum machine could do the work.

    One of the oldest runners in the quantum race, D-Wave announced back in 2007 that it had built a fully functioning, commercially available 16-qubit quantum computer prototype—a claim that’s controversial to this day. D-Wave focuses on a technology called quantum annealing, based on the natural tendency of real-world quantum systems to find low-energy states (a bit like a spinning top that inevitably will fall over). A D-Wave quantum computer imagines the possible solutions of a problem as a landscape of peaks and valleys; each coordinate represents a possible solution and its elevation represents its energy. Annealing allows you to set up the problem, and then let the system fall into the answer—in about 20 milliseconds. As it does so, it can tunnel through the peaks as it searches for the lowest valleys. It finds the lowest point in the vast landscape of solutions, which corresponds to the best possible outcome—although it does not attempt to fully correct for any errors, inevitable in quantum computation. D-Wave is now working on a prototype of a universal annealing quantum computer, says Alan Baratz, the company’s chief product officer.
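
    Quantum annealing itself cannot be reproduced in a few lines, but its classical cousin, simulated annealing, captures the “settle into the lowest valley” picture and is easy to sketch. The toy below is a classical analogy only, not what a D-Wave machine actually does:

```python
import math
import random

random.seed(0)

# A bumpy 1-D "energy landscape": many local valleys, global minimum near x = -0.5.
def energy(x):
    return 0.1 * x * x + math.sin(3 * x)

x = 8.0                 # start far from the best valley
temperature = 5.0

# Classical simulated annealing: take random hops, always accept downhill moves,
# sometimes accept uphill ones while the temperature is high, then cool down.
for step in range(20000):
    candidate = x + random.uniform(-0.5, 0.5)
    delta = energy(candidate) - energy(x)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature *= 0.9995   # gradual cooling

print(f"settled at x = {x:.2f}, energy = {energy(x):.2f}")
```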

    Apart from D-Wave’s quantum annealing, there are three other main approaches to try and bend the quantum world to our whim: integrated circuits, topological qubits and ions trapped with lasers. CERN is placing high hopes on the first method but is closely watching other efforts too.

    IBM, whose computer Carminati has just started using, as well as Google and Intel, all make quantum chips with integrated circuits—quantum gates—that are superconducting, a state in which certain metals conduct electricity with zero resistance. Each quantum gate holds a pair of very fragile qubits. Any noise will disrupt them and introduce errors—and in the quantum world, noise is anything from temperature fluctuations to electromagnetic and sound waves to physical vibrations.

    To isolate the chip from the outside world as much as possible and get the circuits to exhibit quantum mechanical effects, it needs to be supercooled to extremely low temperatures. At the IBM quantum lab in Zurich, the chip is housed in a white tank—a cryostat—suspended from the ceiling. The temperature inside the tank is a steady 10 millikelvin or –273 degrees Celsius, a fraction above absolute zero and colder than outer space. But even this isn’t enough.

    Just working with the quantum chip, when scientists manipulate the qubits, causes noise. “The outside world is continually interacting with our quantum hardware, damaging the information we are trying to process,” says physicist John Preskill at the California Institute of Technology, who in 2012 coined the term quantum supremacy. It’s impossible to get rid of the noise completely, so researchers are trying to suppress it as much as possible, hence the ultracold temperatures to achieve at least some stability and allow more time for quantum computations.

    “My job is to extend the lifetime of qubits, and we’ve got four of them to play with,” says Matthias Mergenthaler, an Oxford University postdoctoral researcher working at IBM’s Zurich lab. That doesn’t sound like a lot, but, he explains, it’s not so much the number of qubits that counts but their quality, meaning qubits with as low a noise level as possible, to ensure they last as long as possible in superposition and allow the machine to compute. And it’s here, in the fiddly world of noise reduction, that quantum computing hits up against one of its biggest challenges. Right now, the device you’re reading this on probably performs at a level similar to that of a quantum computer with 30 noisy qubits. But if you can reduce the noise, then the quantum computer is many times more powerful.

    Once the noise is reduced, researchers try to correct any remaining errors with the help of special error-correcting algorithms, run on a classical computer. The problem is, such error correction works qubit by qubit, so the more qubits there are, the more errors the system has to cope with. Say a computer makes an error once every 1,000 computational steps; it doesn’t sound like much, but after 1,000 or so operations, the program will output incorrect results. To be able to achieve meaningful computations and surpass standard computers, a quantum machine has to have about 1,000 qubits that are relatively low noise and with error rates as corrected as possible. When you put them all together, these 1,000 qubits will make up what researchers call a logical qubit. None yet exist—so far, the best that prototype quantum devices have achieved is error correction for up to 10 qubits. That’s why these prototypes are called noisy intermediate-scale quantum computers (NISQ), a term also coined by Preskill in 2017.
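
    The arithmetic behind “an error once every 1,000 steps” is simple: if each step independently succeeds with probability 1 − p, the chance of a completely error-free run of N steps is (1 − p)^N, which decays fast. A quick check with an assumed p of 1/1,000:

```python
# Probability that a run of N steps contains no error,
# assuming an independent error rate of p = 1/1000 per step.
p = 1e-3
for steps in (100, 1000, 5000, 10000):
    p_clean = (1 - p) ** steps
    print(f"{steps:6d} steps: {p_clean:.1%} chance of a fully error-free run")
```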

    For Carminati, it’s clear the technology isn’t ready yet. But that isn’t really an issue. At CERN the challenge is to be ready to unlock the power of quantum computers when and if the hardware becomes available. “One exciting possibility will be to perform very, very accurate simulations of quantum systems with a quantum computer—which in itself is a quantum system,” he says. “Other groundbreaking opportunities will come from the blend of quantum computing and artificial intelligence to analyze big data, a very ambitious proposition at the moment, but central to our needs.”

    But some physicists think NISQ machines will stay just that—noisy—forever. Gil Kalai, a professor at Yale University, says that error correcting and noise suppression will never be good enough to allow any kind of useful quantum computation. And it’s not even due to technology, he says, but to the fundamentals of quantum mechanics. Interacting systems have a tendency for errors to be connected, or correlated, he says, meaning errors will affect many qubits simultaneously. Because of that, it simply won’t be possible to create error-correcting codes that keep noise levels low enough for a quantum computer with the required large number of qubits.

    “My analysis shows that noisy quantum computers with a few dozen qubits deliver such primitive computational power that it will simply not be possible to use them as the building blocks we need to build quantum computers on a wider scale,” he says. Among scientists, such skepticism is hotly debated. The blogs of Kalai and fellow quantum skeptics are forums for lively discussion, as was a recent much-shared article titled “The Case Against Quantum Computing”—followed by its rebuttal, “The Case Against the Case Against Quantum Computing.”

    For now, the quantum critics are in a minority. “Provided the qubits we can already correct keep their form and size as we scale, we should be okay,” says Ray Laflamme, a physicist at the University of Waterloo in Ontario, Canada. The crucial thing to watch out for right now is not whether scientists can reach 50, 72, or 128 qubits, but whether scaling quantum computers to this size significantly increases the overall rate of error.

    The Quantum Nano Centre in Canada is one of numerous big-budget research and development labs focussed on quantum computing. James Brittain/Getty Images

    Others believe that the best way to suppress noise and create logical qubits is by making qubits in a different way. At Microsoft, researchers are developing topological qubits—although its array of quantum labs around the world has yet to create a single one. If it succeeds, these qubits would be much more stable than those made with integrated circuits. Microsoft’s idea is to split a particle—for example an electron—in two, creating Majorana fermion quasi-particles. They were theorized back in 1937, and in 2012 researchers at Delft University of Technology in the Netherlands, working at Microsoft’s condensed matter physics lab, obtained the first experimental evidence of their existence.

    “You will only need one of our qubits for every 1,000 of the other qubits on the market today,” says Chetan Nayak, general manager of quantum hardware at Microsoft. In other words, every single topological qubit would be a logical one from the start. Reilly believes that researching these elusive qubits is worth the effort, despite years with little progress, because if one is created, scaling such a device to thousands of logical qubits would be much easier than with a NISQ machine. “It will be extremely important for us to try out our code and algorithms on different quantum simulators and hardware solutions,” says Carminati. “Sure, no machine is ready for prime time quantum production, but neither are we.”

    Another company Carminati is watching closely is IonQ, a US startup that spun out of the University of Maryland. It uses the third main approach to quantum computing: trapping ions. They are naturally quantum, having superposition effects right from the start and at room temperature, meaning that they don’t have to be supercooled like the integrated circuits of NISQ machines. Each ion is a singular qubit, and researchers trap them with special tiny silicon ion traps and then use lasers to run algorithms by varying the times and intensities at which each tiny laser beam hits the qubits. The beams encode data to the ions and read it out from them by getting each ion to change its electronic states.

    In December, IonQ unveiled its commercial device, capable of hosting 160 ion qubits and performing simple quantum operations on a string of 79 qubits. Still, right now, ion qubits are just as noisy as those made by Google, IBM, and Intel, and neither IonQ nor any other labs around the world experimenting with ions have achieved quantum supremacy.

    As the noise and hype surrounding quantum computers rumbles on, at CERN, the clock is ticking. The collider will wake up in just five years, ever mightier, and all that data will have to be analyzed. A non-noisy, error-corrected quantum computer will then come in quite handy.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:54 pm on February 7, 2019 Permalink | Reply
    Tags: Now You Can Join the Search for Killer Asteroids, WIRED

    From WIRED: “Now You Can Join the Search for Killer Asteroids” 

    Wired logo

    From WIRED

    02.07.19
    Sarah Scoles

    A Hawaii observatory just put the largest astronomical data trove ever online, making it free and accessible so anyone can hunt for new cosmic phenomena. R. White/STScI/PS1 Science Consortium

    If you want to watch sunrise from the national park at the top of Mount Haleakala, the volcano that makes up around 75 percent of the island of Maui, you have to make a reservation. At 10,023 feet, the summit provides a spectacular—and very popular, ticket-controlled—view.

    Looking into the Haleakalā crater

    Just about a mile down the road from the visitors’ center sits “Science City,” where civilian and military telescopes curl around the road, their domes bubbling up toward the sky. Like the park’s visitors, they’re looking out beyond Earth’s atmosphere—toward the Sun, satellites, asteroids, or distant galaxies. And one of them, called the Panoramic Survey Telescope and Rapid Response System, or Pan-STARRS, just released the biggest digital astro-dataset ever, amounting to 1.6 petabytes, the equivalent of around 500,000 HD movies.

    Pan-STARRS1 Telescope, U Hawaii, situated at Haleakala Observatories near the summit of Haleakala in Hawaii, USA, altitude 3,052 m (10,013 ft)

    From its start in 2010, Pan-STARRS has been watching the 75 percent of the sky it can see from its perch and recording cosmic states and changes on its 1.4-billion-pixel camera. It even discovered the strange ‘Oumuamua, the interstellar object that a Harvard astronomer has suggested could be an alien spaceship.

    An artist’s rendering of the first recorded visitor to the solar system, ‘Oumuamua.
    Aunt_Spray/Getty Images

    Big surveys like this one, which watch swaths of sky agnostically rather than homing in on specific stuff, represent a big chunk of modern astronomy. They are an efficient, pseudo-egalitarian way to collect data, uncover the unexpected, and allow for discovery long after the lens cap closes. With better computing power, astronomers can see the universe not just as it was and is but also as it’s changing, by comparing, say, how a given part of the sky looks on Tuesday to how it looks on Wednesday. Pan-STARRS’s latest data dump, in particular, gives everyone access to the in-process cosmos, opening up the “time domain” to all earthlings with a good internet connection.
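
    At its simplest, the “Tuesday versus Wednesday” comparison is difference imaging: subtract two aligned exposures of the same field and flag pixels that changed by more than the noise. The NumPy toy below plants a synthetic transient and recovers it; real pipelines also align, resample, and match image quality before subtracting.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two fake, already-aligned exposures of the same patch of sky (100 x 100 pixels).
tuesday = rng.normal(loc=100.0, scale=5.0, size=(100, 100))
wednesday = tuesday + rng.normal(scale=5.0, size=(100, 100))
wednesday[40, 60] += 80.0          # a transient brightens one pixel overnight

difference = wednesday - tuesday
noise = difference.std()

# Flag anything that changed by more than 5 sigma.
candidates = np.argwhere(np.abs(difference) > 5 * noise)
for row, col in candidates:
    print(f"possible transient at pixel ({row}, {col}), change = {difference[row, col]:.1f}")
```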

    Pan-STARRS, like all projects, was once just an idea. It started around the turn of this century, when astronomers Nick Kaiser, John Tonry, and Gerry Luppino, from Hawaii’s Institute for Astronomy, suggested that relatively “modest” telescopes—hooked to huge cameras—were the best way to image large skyfields.

    Today, that idea has morphed into Pan-STARRS, a many-pixeled instrument attached to a 1.8-meter telescope (big optical telescopes may measure around 10 meters). It takes multiple images of each part of the sky to show how it’s changing. Over the course of four years, Pan-STARRS imaged the heavens above 12 times, using five different filters. These pictures may show supernovae flaring up and dimming back down, active galaxies whose centers glare as their black holes digest material, and strange bursts from cataclysmic events. “When you visit the same piece of sky again and again, you can recognize, ‘Oh, this galaxy has a new star in it that was not there when we were there a year or three months ago,’” says Rick White, an astronomer at the Space Telescope Science Institute, which hosts Pan-STARRS’s archive. In this way, Pan-STARRS is a forerunner of the massive Large Synoptic Survey Telescope, or LSST, which will snap 800 panoramic images every evening, with a 3.2-billion-pixel camera, capturing the whole sky twice a week.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    Plus, by comparing bright dots that move between images, astronomers can uncover closer-by objects, like rocks whose path might sweep uncomfortably close to Earth.

    That latter part is not just interesting to scientists, but to the military too. “It’s considered a defense function to find asteroids that might cause us to go extinct,” says White. That’s (at least part of) why the Air Force, which also operates a satellite-tracking system on Haleakala, pushed $60 million into Pan-STARRS’s development. NASA, the state of Hawaii, a consortium of scientists, and some private donations ponied up the rest.

    But when the telescope first got to work, its operations hit some snags. Its initial images were about half as sharp as they should have been, because the system that adjusted the telescope’s mirror to make up for distortions wasn’t working right.

    Also, the Air Force redacted parts of the sky. It used software called “Magic” to detect streaks of light that might be satellites (including the US government’s own). Magic masked those streaks, essentially placing a dead-pixel black bar across that section of sky, “to prevent the determination of any orbital element of the artificial satellite before the images left the [Institute for Astronomy] servers,” according to a recent paper by the Pan-STARRS group. In December 2011, the Air Force “dropped the requirement,” says the article. The magic was gone, and the scientists reprocessed the original raw data, removing the black boxes.

    The first tranche of data, from the world’s most substantial digital sky survey, came in December 2016. It was full of stars, galaxies, space rocks, and strangeness. The telescope and its associated scientists have already found an eponymous comet, crafted a 3D model of the Milky Way’s dust, unearthed way-old active galaxies, and spotted everyone’s favorite probably-not-an-alien-spaceship, ’Oumuamua.

    The real deal, though, entered the world late last month, when astronomers publicly released and put online all the individual snapshots, including auto-generated catalogs of some 800 million objects. With that dataset, astronomers and regular people everywhere (once they’ve read a fair number of help-me files) can check out a patch of sky and see how it evolved as time marched on. The curious can do more of the “time domain” science Pan-STARRS was made for: catching explosions, watching rocks, and squinting at unexplained bursts.
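
    For readers who want to try this themselves, the Pan-STARRS catalogs are served through MAST and can be queried programmatically. The sketch below assumes the astroquery package and its documented MAST catalog interface; the coordinates are arbitrary, and argument names may differ between astroquery versions, so treat it as a starting point rather than a recipe.

```python
# Hedged sketch: pull a small Pan-STARRS catalog patch from MAST,
# assuming astroquery (pip install astroquery) exposes the Pan-STARRS
# catalog as in its current documentation.
from astroquery.mast import Catalogs

# Query a 0.02-degree patch around an arbitrary sky position (RA, Dec in degrees).
results = Catalogs.query_region("83.633 22.0145", radius="0.02 deg",
                                catalog="Panstarrs", data_release="dr2", table="mean")

print(len(results), "catalog objects in this patch")
print(results[:5])   # first few rows: positions, mean magnitudes, etc.
```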

    Pan-STARRS might never have gotten its observations online if NASA hadn’t seen its own future in the observatory’s massive data pileup. That 1.6-petabyte archive is now housed at the Space Telescope Science Institute, in Maryland, in a repository called the Mikulski Archive for Space Telescopes. The Institute is also the home of bytes from Hubble, Kepler, GALEX, and 15 other missions, mostly belonging to NASA. “At the beginning they didn’t have any commitment to release the data publicly,” says White. “It’s such a large quantity they didn’t think they could manage to do it.” The Institute, though, welcomed this outsider data in part so it could learn how to deal with such huge quantities.

    The hope is that Pan-STARRS’s freely available data will make a big contribution to astronomy. Just look at the discoveries people publish using Hubble data, says White. “The majority of papers being published are from archival data, by scientists that have no connection to the original observations,” he says. That, he believes, will hold true for Pan-STARRS too.

    But surveys are beautiful not just because they can be shared online. They’re also A+ because their observations aren’t narrow. In much of astronomy, scientists look at specific objects in specific ways at specific times. Maybe they zoom in on the magnetic field of pulsar J1745–2900, or the hydrogen gas in the farthest reaches of the Milky Way’s Perseus arm, or that one alien spaceship rock. Those observations are perfect for that individual astronomer to learn about that field, arm, or ship—but they’re not as great for anything or anyone else. Surveys, on the other hand, serve everyone.

    “The Sloan Digital Sky Survey set the standard for these huge survey projects,” says White. Sloan, which started operations in 2000, is on its fourth iteration, collecting light with telescopes at Apache Point Observatory in New Mexico and Las Campanas Observatory in Northern Chile.

    SDSS 2.5 meter Telescope at Apache Point Observatory, near Sunspot NM, USA, Altitude 2,788 meters (9,147 ft)

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Carnegie Las Campanas Observatory in the southern Atacama Desert of Chile in the Atacama Region approximately 100 kilometres (62 mi) northeast of the city of La Serena,near the southern end and over 2,500 m (8,200 ft) high

    From the early universe to the modern state of the Milky Way’s union, Sloan data has painted a full-on portrait of the universe that, like those creepy Renaissance portraits, will stick around for years to come.

    Over in a different part of New Mexico, on the high Plains of San Agustin, radio astronomers recently set the Very Large Array’s sights on a new survey. Having started in 2017, the Very Large Array Sky Survey is still at the beginning of its seven years of operation.

    NRAO/Karl V Jansky Expanded Very Large Array, on the Plains of San Agustin fifty miles west of Socorro, NM, USA, at an elevation of 6970 ft (2124 m)

    But astronomers don’t have to wait for it to finish its observations, as happened with the first Pan-STARRS survey. “Within several days of the data coming off the telescope, the images are available to everybody,” says Brian Kent, who, since 2012, has worked on the software that processes the data. Which is no small task: For every four hours of skywatching, the telescope spits out 300 gigabytes, which the software then has to make useful and usable. “You have to put the collective smarts of the astronomers into the software,” he says.

    Kent is excited about the same kinds of time-domain discoveries as White is: about seeing the universe at work rather than as a set of static images. Including the chronological dimension is hot in astronomy right now, from these surveys to future instruments like the LSST and the massive Square Kilometre Array, a radio telescope that will spread across two continents.

    SKA Square Kilometer Array

    SKA Murchison Widefield Array, Boolardy station in outback Western Australia, at the Murchison Radio-astronomy Observatory (MRO)


    Australian Square Kilometre Array Pathfinder (ASKAP) is a radio telescope array located at Murchison Radio-astronomy Observatory (MRO) in the Australian Mid West. ASKAP consists of 36 identical parabolic antennas, each 12 metres in diameter, working together as a single instrument with a total collecting area of approximately 4,000 square metres.

    SKA LOFAR core (“superterp”) near Exloo, Netherlands

    SKA South Africa


    SKA Meerkat telescope, 90 km outside the small Northern Cape town of Carnarvon, SA



    SKA Meerkat telescope, South African design

    Now, as of late January, anyone can access all of those observations, containing phenomena astronomers don’t yet know about and that—hey, who knows—you could beat them to discovering.
    Big surveys like this one, which watch swaths of sky agnostically rather than homing in on specific stuff, represent a big chunk of modern astronomy. They are an efficient, pseudo-egalitarian way to collect data, uncover the unexpected, and allow for discovery long after the lens cap closes. With better computing power, astronomers can see the universe not just as it was and is but also as it’s changing, by comparing, say, how a given part of the sky looks on Tuesday to how it looks on Wednesday. Pan-STARRS’s latest data dump, in particular, gives everyone access to the in-process cosmos, opening up the “time domain” to all earthlings with a good internet connection.


    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 12:38 pm on February 1, 2019 Permalink | Reply
    Tags: A project in eastern Tennessee quietly exceeded the scale of any corporate AI lab. It was run by the US government., , , Nvidia powerful graphics processors, ORNL SUMMIT supercomputer unveiled-world's most powerful in 2018, Summit has a hybrid architecture and each node contains multiple IBM POWER9 CPUs and NVIDIA Volta GPUs all connected together with NVIDIA’s high-speed NVLink, , TensorFlow machine-learning software, The World’s Fastest Supercomputer Breaks an AI Record, WIRED   

    From Oak Ridge National Laboratory via WIRED: “The World’s Fastest Supercomputer Breaks an AI Record”

    i1

    From Oak Ridge National Laboratory

    via

    Wired logo

    WIRED

    ORNL IBM AC922 SUMMIT supercomputer. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    1
    Oak Ridge National Lab’s Summit supercomputer became the world’s most powerful in 2018, reclaiming that title from China for the first time in five years.
    Carlos Jones/Oak Ridge National Lab

    Along America’s west coast, the world’s most valuable companies are racing to make artificial intelligence smarter. Google and Facebook have boasted of experiments using billions of photos and thousands of high-powered processors. But late last year, a project in eastern Tennessee quietly exceeded the scale of any corporate AI lab. It was run by the US government.

    The record-setting project involved the world’s most powerful supercomputer, Summit, at Oak Ridge National Lab. The machine captured that crown in June last year, reclaiming the title for the US after five years of China topping the list. As part of a climate research project, the giant computer booted up a machine-learning experiment that ran faster than any before.

    Summit, which occupies an area equivalent to two tennis courts, used more than 27,000 powerful graphics processors in the project. It tapped their power to train deep-learning algorithms, the technology driving AI’s frontier, chewing through the exercise at a rate of a billion billion operations per second, a pace known in supercomputing circles as an exaflop.

    “Deep learning has never been scaled to such levels of performance before,” says Prabhat, who leads a research group at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Lab. (He goes by one name.) His group collaborated with researchers at Summit’s home base, Oak Ridge National Lab.

    Fittingly, the world’s most powerful computer’s AI workout was focused on one of the world’s largest problems: climate change. Tech companies train algorithms to recognize faces or road signs; the government scientists trained theirs to detect weather patterns like cyclones in the copious output from climate simulations that spool out a century’s worth of three-hour forecasts for Earth’s atmosphere. (It’s unclear how much power the project used or how much carbon that spewed into the air.)
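
    As a concrete (and heavily simplified) picture of that task: the training data are patches of simulated atmospheric fields, and the label is whether a patch contains a storm. The real system worked on full atmospheric fields with segmentation-style networks that outline weather features pixel by pixel; the toy patch classifier below, with made-up channel names and random placeholder data, is only meant to show the shape of the supervised-learning problem.

```python
import numpy as np
import tensorflow as tf

# Placeholder for patches cut from climate-simulation output. Each patch has
# a few channels (e.g., water vapor, surface pressure, wind speed); real data
# would come from the simulation archive rather than a random generator.
n_patches, patch, channels = 1024, 64, 3
x = np.random.rand(n_patches, patch, patch, channels).astype("float32")
y = np.random.randint(0, 2, size=n_patches)  # 1 = cyclone in patch, 0 = background

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu",
                           input_shape=(patch, patch, channels)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # P(patch contains a cyclone)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=2, batch_size=64)
```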

    The Summit experiment has implications for the future of both AI and climate science. The project demonstrates the scientific potential of adapting deep learning to supercomputers, which traditionally simulate physical and chemical processes such as nuclear explosions, black holes, or new materials. It also shows that machine learning can benefit from more computing power—if you can find it—boding well for future breakthroughs.

    “We didn’t know until we did it that it could be done at this scale,” says Rajat Monga, an engineering director at Google. He and other Googlers helped the project by adapting the company’s open-source TensorFlow machine-learning software to Summit’s giant scale.

    Most work on scaling up deep learning has taken place inside the data centers of internet companies, where servers work together on problems by splitting them up, because they are connected relatively loosely, not bound into one giant computer. Supercomputers like Summit have a different architecture, with specialized high-speed connections linking their thousands of processors into a single system that can work as a whole. Until recently, there has been relatively little work on adapting machine learning to work on that kind of hardware.
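
    Either way, the workhorse pattern is synchronous data parallelism: each worker computes gradients on its own shard of the training data, the gradients are averaged across all workers, and everyone applies the same update. The fragment below is a minimal, framework-agnostic sketch of that averaging step using MPI (the message-passing layer supercomputers provide); the gradient array is a random placeholder, and this illustrates the pattern rather than the Summit code.

```python
# A toy all-reduce of gradients across workers.
# Run with, e.g.:  mpirun -np 4 python allreduce_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Stand-in for the gradients this worker computed on its shard of the data.
local_grad = np.random.rand(1_000_000)

# All-reduce: every worker receives the element-wise sum of all workers'
# gradients, exchanged over the machine's high-speed interconnect.
summed = np.empty_like(local_grad)
comm.Allreduce(local_grad, summed, op=MPI.SUM)
averaged = summed / size  # identical on every worker, so updates stay in sync

if rank == 0:
    print("averaged-gradient norm:", np.linalg.norm(averaged))
```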

    Monga says working to adapt TensorFlow to Summit’s scale will also inform Google’s efforts to expand its internal AI systems. Engineers from Nvidia also helped out on the project, by making sure the machine’s tens of thousands of Nvidia graphics processors worked together smoothly.

    Finding ways to put more computing power behind deep-learning algorithms has played a major part in the technology’s recent ascent. The technology that Siri uses to recognize your voice and Waymo vehicles use to read road signs burst into usefulness in 2012 after researchers adapted it to run on Nvidia graphics processors.

    In an analysis published last May, researchers from OpenAI, a San Francisco research institute cofounded by Elon Musk, calculated that the amount of computing power in the largest publicly disclosed machine-learning experiments has doubled roughly every 3.43 months since 2012; that would mean an 11-fold increase each year. That progression has helped bots from Google parent Alphabet defeat champions at tough board games and videogames, and fueled a big jump in the accuracy of Google’s translation service.
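
    (The 11-fold figure follows directly from the doubling time: a year holds 12 / 3.43 ≈ 3.5 doubling periods, and 2^3.5 ≈ 11.)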

    Google and other companies are now creating new kinds of chips customized for AI to continue that trend. Google has said that “pods” tightly integrating 1,000 of its AI chips—dubbed tensor processing units, or TPUs—can provide 100 petaflops of computing power, one-tenth the rate Summit achieved on its AI experiment.

    The Summit project’s contribution to climate science is to show how giant-scale AI could improve our understanding of future weather patterns. When researchers generate century-long climate predictions, reading the resulting forecast is a challenge. “Imagine you have a YouTube movie that runs for 100 years. There’s no way to find all the cats and dogs in it by hand,” says Prabhat of Lawrence Berkeley. The software typically used to automate the process is imperfect, he says. Summit’s results showed that machine learning can do it better, which should help predict storm impacts such as flooding or physical damage. The Summit results won Oak Ridge, Lawrence Berkeley, and Nvidia researchers the Gordon Bell Prize for boundary-pushing work in supercomputing.

    Running deep learning on supercomputers is a new idea that’s come along at a good moment for climate researchers, says Michael Pritchard, a professor at the University of California, Irvine. The slowing pace of improvements to conventional processors had led engineers to stuff supercomputers with growing numbers of graphics chips, where performance has grown more reliably. “There came a point where you couldn’t keep growing computing power in the normal way,” Pritchard says.

    That shift posed some challenges to conventional simulations, which had to be adapted. It also opened the door to embracing the power of deep learning, which is a natural fit for graphics chips. That could give us a clearer view of our climate’s future. Pritchard’s group showed last year that deep learning can generate more realistic simulations of clouds inside climate forecasts, which could improve forecasts of changing rainfall patterns.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

    i2

     
  • richardmitnick 1:27 pm on December 19, 2018 Permalink | Reply
    Tags: , , , , , WIRED   

    From WIRED: “Dark Matter Hunters Pivot After Years of Failed Searches” 

    Wired logo

    From WIRED

    12.19.18
    Sophia Chen

    1
    NASA Goddard

    Physicists are remarkably frank: they don’t know what dark matter is made of.

    “We’re all scratching our heads,” says physicist Reina Maruyama of Yale University.

    “The gut feeling is that 80 percent of it is one thing, and 20 percent of it is something else,” says physicist Gray Rybka of the University of Washington. Why does he think this? It’s not because of science. “It’s a folk wisdom,” he says.

    Peering through telescopes, researchers have found a deluge of evidence for dark matter. Galaxies, they’ve observed, rotate far faster than their visible mass allows. The established equations of gravity dictate that those galaxies should fall apart, like pieces of cake batter flinging off a spinning hand mixer. The prevailing thought is that some invisible material—dark matter—must be holding those galaxies together. Observations suggest that dark matter consists of diffuse material “sort of like a cotton ball,” says Maruyama, who co-leads a dark matter research collaboration called COSINE-100.
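
    The rotation argument can be written down in one line. For a star on a circular orbit of radius r, gravity from the mass enclosed within the orbit supplies the centripetal force (treating that mass as roughly spherical, a textbook simplification):

\[
\frac{v^2}{r} = \frac{G\,M(<r)}{r^2}
\quad\Longrightarrow\quad
v(r) = \sqrt{\frac{G\,M(<r)}{r}}
\]

    If the visible stars and gas were all there is, v should fall off roughly as 1/√r beyond the luminous disk; measured rotation curves instead stay flat, which forces M(<r) to keep growing with radius: the unseen extra mass attributed to dark matter.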

    2
    Jay Hyun Jo/DM-Ice/KIMS

    Here on Earth, though, clues are scant. Given the speed that galaxies rotate, dark matter should make up 85 percent of the matter in the universe, including on our provincial little home planet. But only one experiment, a detector in Italy named DAMA, has ever registered compelling evidence of the stuff on Earth.

    DAMA-LIBRA at Gran Sasso


    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    “There have been hints in other experiments, but DAMA is the only one with robust signals,” says Maruyama, who is unaffiliated with the experiment. For two decades, DAMA has consistently measured a varying signal that peaks in June and dips in December. The signal suggests that dark matter hits Earth at different rates corresponding to its location in its orbit, which matches theoretical predictions.

    But the search has yielded few other promising signals. This year, several detectors reported null findings. XENON1T, a collaboration whose detector is located in the same Italian lab as DAMA, announced they hadn’t found anything this May.

    XENON1T at Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    Panda-X, a China-based experiment, published in July that they also hadn’t found anything.

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China

    Even DAMA’s results have been called into question: In December, Maruyama’s team published that their detector, a South-Korea based DAMA replica made of some 200 pounds of sodium iodide crystal, failed to reproduce its Italian predecessor’s results.

    These experiments are all designed to search for a specific dark matter candidate, a theorized class of particles known as Weakly Interacting Massive Particles, or WIMPs, that should be about a million times heavier than an electron. WIMPs have dominated dark matter research for years, and Miguel Zumalacárregui is tired of them. About a decade ago, when Zumalacárregui was still a PhD student, WIMP researchers were already promising an imminent discovery. “They’re just coming back empty-handed,” says Zumalacárregui, now an astrophysicist at the University of California, Berkeley.

    He’s not the only one with WIMP fatigue. “In some ways, I grew tired of WIMPs long ago,” says Rybka. Rybka is co-leading an experiment that is pursuing another dark matter candidate: a dainty particle called an axion, roughly a billion times lighter than an electron and much lighter than the WIMP. In April, the Axion Dark Matter Experiment collaboration announced that they’d finally tweaked their detector to be sensitive enough to detect axions.

    Inside the ADMX experiment hall at the University of Washington Credit Mark Stone U. of Washington

    The detector acts sort of like an AM radio, says Rybka. A strong magnet inside the machine would convert incoming axions into radio waves, which the detector would then pick up. “Given that we don’t know the exact mass of the axion, we don’t know which frequency to tune to,” says Rybka. “So we slowly turn the knob while listening, and mostly we hear noise. But someday, hopefully, we’ll tune to the right frequency, and we’ll hear that pure tone.”
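
    In software terms, the “knob turning” is a scan: step the cavity to a candidate frequency, record a power spectrum, and flag any narrow bin that sits well above the noise floor. The sketch below runs that loop on simulated data; it is an illustration of the scan logic, not ADMX’s analysis pipeline, and the frequencies, injected tone, and threshold are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def measure_power_spectrum(center_hz, n_bins=256):
    """Stand-in for one digitized power spectrum around the cavity's tuned
    frequency. Real spectra come from the receiver; here it's exponential
    (thermal-like) noise with a faint tone injected near 650 MHz so the
    scan has something to find."""
    spectrum = rng.exponential(scale=1.0, size=n_bins)
    if abs(center_hz - 650e6) < 5e3:          # pretend the axion lives here
        spectrum[n_bins // 2] += 8.0
    return spectrum

step_hz = 10e3       # how far the "knob" moves between measurements
threshold = 6.0      # flag bins this many times the mean noise power
for center in np.arange(640e6, 660e6, step_hz):
    spec = measure_power_spectrum(center)
    excess = spec / spec.mean()
    if excess.max() > threshold:
        print(f"candidate near {center / 1e6:.3f} MHz "
              f"({excess.max():.1f}x the noise) -- rescan to confirm")
```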

    He is betting on axions because they would also resolve a piece of another long-standing puzzle in physics: exactly how quarks bind together to form atomic nuclei. “It seems too good to just be a coincidence, that this theory from nuclear physics happens to make the right amount of dark matter,” says Rybka.

    As Rybka’s team sifts through earthly data for signs of axions, astrophysicists look to the skies for leads. In a paper published in October, Zumalacárregui and a colleague ruled out an old idea that dark matter was mostly made of black holes. They reached this conclusion by looking through two decades of supernova observations. When a supernova passes behind a black hole, the black hole’s gravity bends the supernova’s light to make it appear brighter. The brighter the light, the more massive the black hole. So by tabulating the brightness of hundreds of supernovae, they calculated that black holes that are at least one-hundredth the mass of the sun can account for up to 40 percent of dark matter, and no more.

    “We’re at a point where our best theories seem to be breaking,” says astrophysicist Jamie Farnes of Oxford University. “We clearly need some kind of new idea. There’s something key we’re missing about how the universe is working.”

    Farnes is trying to fill that void. In a paper published in December [Astronomy and Astrophysics], he proposed that dark matter could be a weird fluid that moves toward you if you try to push it away. He created a simplistic simulation of the universe containing this fluid and found that it could potentially also explain why the universe is expanding, another long-standing mystery in physics. He is careful to point out that his ideas are speculative, and it is still unclear whether they are consistent with prior telescope observations and dark matter experiments.

    WIMPs could still be dark matter as well, despite enthusiasm for new approaches. Maruyama’s Korean experiment has ruled out “the canonical, vanilla WIMP that most people talk about,” she says, but lesser-known WIMP cousins are still on the table.

    It’s important to remember, as physicists clutch onto their favorite theories—regardless of how refreshing they are—that they need corroborating data. “The universe doesn’t care what is beautiful or elegant,” says Farnes. Nor does it care about what’s trendy. Guys, the universe might be really uncool.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 9:01 am on October 3, 2018 Permalink | Reply
    Tags: , , Transmission electron microscopes, WIRED   

    From WIRED: “New Microscope Shows the Quantum World in Crazy Detail” 

    Wired logo

    From WIRED

    9.21.18
    Sophia Chen

    1
    Scientists at Lawrence Berkeley National Lab use the microscope to painstakingly map every single atom in a nanoparticle. Here, they surveyed a tiny iron platinum cluster under the microscope and virtually picked it apart. Colin Ophus

    The transmission electron microscope was designed to break records. Using its beam of electrons, scientists have glimpsed many types of viruses for the first time. They’ve used it to study parts of biological cells like ribosomes and mitochondria. You can see individual atoms with it.

    Scanning transmission electron microscope Wikipedia Materialscientist

    Custom-designed scanning transmission electron microscope at Cornell University by David Muller/Cornell University

    But experts have recently unlocked new potential for the machine. “It’s been a very dramatic and sudden shift,” says physicist David Muller of Cornell University. “It was a little bit like everyone was flying biplanes, and all of a sudden, here’s a jetliner.”

    For one thing, Muller’s team has set a new record. Publishing in Nature this July, they used their scope to take the highest-resolution images to date. To do this, they had to create special lenses to better focus the electrons, sort of like “glasses” for the microscope, he says. They also developed a super-sensitive camera, capable of quickly registering single electrons. Their new images show a razor-thin layer, just two atoms thick, of molybdenum and sulfur atoms bonded together. Not only could they distinguish between individual atoms, they could even see them when they were only about 0.4 angstroms apart, half the length of a chemical bond. They could even spot a gap where a sulfur atom was missing in the material’s otherwise repeating pattern. “They could do this primarily because their electron camera is so good,” says physicist Colin Ophus of Lawrence Berkeley National Lab, who was not involved with the work.

    2
    Each dot in this image is a single molybdenum or sulfur atom from two overlapping but twisted atom-thick sheets. Cornell University’s transmission electron microscope, which took this image, broke the record for highest-resolution microscope this July. David Muller/Cornell University

    Now the rest of the field is clamoring to outfit their scopes with similar cameras, says Muller. “You can see all sorts of things you couldn’t before,” he says. In particular, Muller is studying thin materials, one to two atoms thick, that exhibit unusual properties. For example, physicists recently discovered that one type of thin material, when layered in a certain way, becomes superconducting. Muller thinks that the microscope could help reveal the underlying mechanisms behind such properties.

    When it comes to magnifying the minuscule, electrons are fundamentally better than visible light. That’s because electrons, which have wavelike properties due to quantum mechanics, have wavelengths thousands of times shorter. Shorter wavelengths produce higher resolution, much like finer thread can create more intricate embroidery. “Electron microscopes are pretty much the only game in town if you want to look at things on the atomic scale,” says physicist Ben McMorran of the University of Oregon. Pelting a material with electrons and detecting the ones that have traveled through produces a detailed image of that material.
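
    The wavelength in question is easy to estimate from the microscope’s accelerating voltage, using the de Broglie relation with a relativistic correction (electrons in a TEM move at an appreciable fraction of light speed). The snippet below is a back-of-the-envelope check rather than a spec for any particular instrument; 300 kV is a common operating voltage.

```python
from scipy.constants import h, m_e, e, c

def electron_wavelength(accel_voltage_v):
    """Relativistic de Broglie wavelength (in meters) of an electron
    accelerated through accel_voltage_v volts."""
    energy = e * accel_voltage_v
    momentum = (2 * m_e * energy * (1 + energy / (2 * m_e * c**2))) ** 0.5
    return h / momentum

lam = electron_wavelength(300e3)                      # 300 kV, typical for a TEM
print(f"electron wavelength: {lam * 1e12:.2f} pm")    # ~1.97 picometers
print(f"green light (~500 nm) is {500e-9 / lam:,.0f}x longer")
```

    In practice it is lens aberrations, not the wavelength, that limit the resolution, which is why the aberration-correcting “glasses” and fast electron cameras described above matter so much.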

    But high resolution isn’t the machine’s only trick. In a paper recently accepted to Nano Letters [not recovered], a team led by McMorran has developed a new type of image you can take with the microscope. This method can image materials normally transparent to electrons, such as lightweight atoms like lithium. It should allow scientists to study and improve lithium-based batteries with atomic detail.

    There’s more. By measuring a property of the electron called its phase, they can actually map the electric and magnetic fields inside the material, says Fehmi Yasin, a physics graduate student at the University of Oregon. “This technique can tease more information out of the electrons,” he says.

    These new capabilities can help scientists like Mary Scott, a physicist at the University of California, Berkeley, who studies nanoparticles smaller than a bacterium. Scott has spent long hours photographing these tiny inanimate clumps under an electron microscope. Using a special rig, she carefully tilts the sample to get as many angles as possible. Then, from those images, she creates an extremely precise 3-D model, accurate down to the atom. In 2017, she and her team mapped the exact locations of 23,000 atoms in a single silver and platinum nanoparticle. The point of such painstaking models is to study how individual atoms contribute to a property of the material—how strong or conductive it is, for example. The new techniques could help Scott examine those material properties more easily.

    But the ultimate goal of such experiments isn’t merely to study the materials. Eventually, scientists like Scott want to turn atoms into Legos: to assemble them, brick by brick, into brand new materials. But even tiny changes in a material’s atomic composition or structure can alter its function, says Scott, and no one fully understands why. The microscope images can teach them how and why atoms lock together.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 5:08 pm on September 16, 2018 Permalink | Reply
    Tags: Astronomers Have Found the Universe's Missing Matter, , , , , WIRED   

    From WIRED: “Astronomers Have Found the Universe’s Missing Matter” 

    Wired logo

    From WIRED

    1
    A computer simulation of the hot gas between galaxies hinted at the location of the universe’s missing matter. Princeton University/Renyue Cen.

    09.16.18
    Katia Moskvitch

    For decades, some of the atomic matter in the universe had not been located. Recent papers reveal where it’s been hiding. [No papers cited]

    Astronomers have finally found the last of the missing universe. It’s been hiding since the mid-1990s, when researchers decided to inventory all the “ordinary” matter in the cosmos—stars and planets and gas, anything made out of atomic parts. (This isn’t “dark matter,” which remains a wholly separate enigma.) They had a pretty good idea of how much should be out there, based on theoretical studies of how matter was created during the Big Bang. Studies of the cosmic microwave background (CMB)—the leftover light from the Big Bang—would confirm these initial estimates.

    So they added up all the matter they could see—stars and gas clouds and the like, all the so-called baryons. They were able to account for only about 10 percent of what there should be. And when they considered that ordinary matter makes up only 15 percent of all matter in the universe—dark matter makes up the rest—they had only inventoried a mere 1.5 percent of all matter in the universe.

    Now, in a series of three recent papers, astronomers have identified the final chunks of all the ordinary matter in the universe. (They are still deeply perplexed as to what makes up dark matter.) And despite the fact that it took so long to identify it all, researchers spotted it right where they had expected it to be all along: in extensive tendrils of hot gas that span the otherwise empty chasms between galaxies, more properly known as the warm-hot intergalactic medium, or WHIM.

    Early indications that there might be extensive spans of effectively invisible gas between galaxies came from computer simulations done in 1998. “We wanted to see what was happening to all the gas in the universe,” said Jeremiah Ostriker, a cosmologist at Princeton University who constructed one of those simulations along with his colleague Renyue Cen. The two ran simulations of gas movements in the universe acted on by gravity, light, supernova explosions and all the forces that move matter in space. “We concluded that the gas will accumulate in filaments that should be detectable,” he said.

    Except they weren’t — not yet.

    “It was clear from the early days of cosmological simulations that many of the baryons would be in a hot, diffuse form — not in galaxies,” said Ian McCarthy, an astrophysicist at Liverpool John Moores University. Astronomers expected these hot baryons to conform to a cosmic superstructure, one made of invisible dark matter, that spanned the immense voids between galaxies. The gravitational force of the dark matter would pull gas toward it and heat the gas up to millions of degrees. Unfortunately, hot, diffuse gas is extremely difficult to find.

    To spot the hidden filaments, two independent teams of researchers searched for precise distortions in the CMB, the afterglow of the Big Bang. As that light from the early universe streams across the cosmos, it can be affected by the regions that it’s passing through. In particular, the electrons in hot, ionized gas (such as the WHIM) should interact with photons from the CMB in a way that imparts some additional energy to those photons. The CMB’s spectrum should get distorted.

    Unfortunately the best maps of the CMB (provided by the Planck satellite) showed no such distortions. Either the gas wasn’t there, or the effect was too subtle to show up.

    CMB per ESA/Planck


    ESA/Planck 2009 to 2013

    But the two teams of researchers were determined to make them visible. From increasingly detailed computer simulations of the universe, they knew that gas should stretch between massive galaxies like cobwebs across a windowsill. Planck wasn’t able to see the gas between any single pair of galaxies. So the researchers figured out a way to multiply the faint signal by a million.

    First, the scientists looked through catalogs of known galaxies to find appropriate galaxy pairs — galaxies that were sufficiently massive, and that were at the right distance apart, to produce a relatively thick cobweb of gas between them. Then the astrophysicists went back to the Planck data, identified where each pair of galaxies was located, and then essentially cut out that region of the sky using digital scissors. With over a million clippings in hand (in the case of the study led by Anna de Graaff, a Ph.D. student at the University of Edinburgh), they rotated each one and zoomed it in or out so that all the pairs of galaxies appeared to be in the same position. They then stacked a million galaxy pairs on top of one another. (A group led by Hideki Tanimura at the Institute of Space Astrophysics in Orsay, France, combined 260,000 pairs of galaxies.) At last, the individual threads — ghostly filaments of diffuse hot gas — suddenly became visible.
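
    The stacking itself is plain image arithmetic: cut a patch of the map around each pair’s midpoint, rotate it so the two galaxies lie along a common axis, rescale it so the pair separation is the same in every cutout, and average. Here is a schematic sketch of that loop (it assumes you already have a 2-D map array and pixel coordinates for the pairs, skips edge handling and the subtraction of the galaxies’ own signal, and is not the teams’ actual code):

```python
import numpy as np
from scipy.ndimage import rotate, zoom

def stack_pairs(cmb_map, pairs, out_size=64, target_sep_pix=20):
    """Average map cutouts so every galaxy pair is aligned and scaled alike.

    cmb_map : 2-D array of map values (e.g., CMB temperature)
    pairs   : list of ((y1, x1), (y2, x2)) pixel coordinates of galaxy pairs
    """
    stack = np.zeros((out_size, out_size))
    half = out_size // 2
    for (y1, x1), (y2, x2) in pairs:
        cy, cx = (y1 + y2) / 2.0, (x1 + x2) / 2.0          # pair midpoint
        angle = np.degrees(np.arctan2(y2 - y1, x2 - x1))   # pair orientation
        sep = np.hypot(y2 - y1, x2 - x1)                   # pair separation

        # Cut a generous patch around the midpoint, rotate so the pair lies
        # along the x-axis, then zoom so all separations match.
        # (Edge handling omitted for brevity.)
        r = int(4 * sep)
        patch = cmb_map[int(cy) - r:int(cy) + r, int(cx) - r:int(cx) + r]
        patch = rotate(patch, angle, reshape=False)
        patch = zoom(patch, target_sep_pix / sep)

        py, px = patch.shape[0] // 2, patch.shape[1] // 2
        stack += patch[py - half:py + half, px - half:px + half]
    return stack / len(pairs)
```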

    2
    (A) Images of one million galaxy pairs were aligned and added together. (B) Astronomers mapped all the gas within the actual galaxies. (C) By subtracting the galaxies (B) from the initial image (A), researchers revealed filamentary gas hiding in intergalactic space. Adapted by Quanta Magazine

    The technique has its pitfalls. The interpretation of the results, said Michael Shull, an astronomer at the University of Colorado at Boulder, requires assumptions about the temperature and spatial distribution of the hot gas. And because of the stacking of signals, “one always worries about ‘weak signals’ that are the result of combining large numbers of data,” he said. “As is sometimes found in opinion polls, one can get erroneous results when one has outliers or biases in the distribution that skew the statistics.”

    In part because of these concerns, the cosmological community didn’t consider the case settled. What was needed was an independent way of measuring the hot gas. This summer, one arrived.

    Lighthouse Effect

    While the first two teams of researchers were stacking signals together, a third team followed a different approach. They observed a distant quasar — a bright beacon from billions of light-years away — and used it to detect gas in the seemingly empty intergalactic spaces through which the light traveled. It was like examining the beam of a faraway lighthouse in order to study the fog around it.

    Usually when astronomers do this, they try to look for light that has been absorbed by atomic hydrogen, since it is the most abundant element in the universe. Unfortunately, this option was out. The WHIM is so hot that it ionizes hydrogen, stripping its single electron away. The result is a plasma of free protons and electrons that don’t absorb any light.

    So the group decided to look for another element instead: oxygen. While there’s not nearly as much oxygen as hydrogen in the WHIM, atomic oxygen has eight electrons, as opposed to hydrogen’s one. The heat from the WHIM strips most of those electrons away, but not all. The team, led by Fabrizio Nicastro of the National Institute for Astrophysics in Rome, tracked the light that was absorbed by oxygen that had lost all but two of its electrons. They found two pockets of hot intergalactic gas. The oxygen “provides a tracer of the much larger reservoir of hydrogen and helium gas,” said Shull, who is a member of Nicastro’s team. The researchers then extrapolated the amount of gas they found between Earth and this particular quasar to the universe as a whole. The result suggested that they had located the missing 30 percent.

    The number also agrees nicely with the findings from the CMB studies. “The groups are looking at different pieces of the same puzzle and are coming up with the same answer, which is reassuring, given the differences in their methods,” said Mike Boylan-Kolchin, an astronomer at the University of Texas, Austin.

    The next step, said Shull, is to observe more quasars with next-generation X-ray and ultraviolet telescopes with greater sensitivity. “The quasar we observed was the best and brightest lighthouse that we could find. Other ones will be fainter, and the observations will take longer,” he said. But for now, the takeaway is clear. “We conclude that the missing baryons have been found,” their team wrote.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 6:57 am on September 3, 2018 Permalink | Reply
    Tags: , , , , , , , , , WIRED   

    From The Atlantic via WIRED: “China Built the World’s Largest Telescope. Then Came the Tourists” 

    Atlantic Magazine

    The Atlantic Magazine

    via

    Wired logo

    WIRED

    08.26.18
    Sarah Scoles

    Thousands of people moved[?*] to let China build and protect the world’s largest telescope. And then the government drew in orders of magnitude more tourists, potentially undercutting its own science in an attempt to promote it.

    FAST radio telescope, with phase arrays from Australia [https://sciencesprings.wordpress.com/2017/12/18/from-csiroscope-our-top-telescope-tech-travels-fast/] located in the Dawodang depression in Pingtang County, Guizhou Province, south China

    “I hope we go inside this golf ball,” Sabrina Stierwalt joked as she and a group of other radio astronomers approached what did, in fact, appear to be a giant golf ball in the middle of China’s new Pingtang Astronomy Town.

    Stierwalt was a little drunk, a lot full, even more tired. The nighttime scene felt surreal. But then again, even a sober, well-rested person might struggle to make sense of this cosmos-themed, touristy confection of a metropolis.

    On the group’s walk around town that night, they seemed to traverse the ever-expanding universe. Light from a Saturn-shaped lamp crested and receded, its rings locked into support pillars that appeared to make it levitate. Stierwalt stepped onto a sidewalk, and its panels lit up beneath her feet, leaving a trail of lights behind her like the tail of a meteor. Someone had even brought constellations down to Earth, linking together lights in the ground to match the patterns in the sky.

    1
    The tourist town, about 10 miles from the telescope, lights up at night. Credit Intentionally Withheld

    The day before, Stierwalt had traveled from Southern California to Pingtang Astronomy Town for a conference hosted by scientists from the world’s largest telescope. It was a new designation: China’s Five-Hundred-Meter Aperture Spherical Radio Telescope, or FAST, had been completed just a year before, in September 2016. Wandering, tipsy, around this shrine to the stars, the 40 or so other foreign astronomers had come to China to collaborate on the superlative-snatching instrument.

    For now, though, they wouldn’t get to see the telescope itself, nestled in a natural enclosure called a karst depression about 10 miles away. First things first: the golf ball.

    As the group got closer, they saw a red carpet unrolled into the entrance of the giant white orb, guarded by iridescent dragons on an inflatable arch. Inside, they buckled up in rows of molded yellow plastic chairs. The lights dimmed. It was an IMAX movie—a cartoon, with an animated narrator. Not the likeness of a person but … what was it? A soup bowl?

    No, Stierwalt realized. It was a clip-art version of the gargantuan telescope itself. Small cartoon FAST flew around big cartoon FAST, describing the monumental feat of engineering just over yonder: a giant geodesic dome shaped out of 4,450 triangular panels, above which receivers collect radio waves from astronomical objects.

    FAST’s dish, nestled into a depression, is made of thousands of triangular panels. Located in the Dawodang depression in Pingtang County, Guizhou Province, south China. VCG/Getty Images

    China spent $180 million to create the telescope, which officials have repeatedly said will make the country the global leader in radio astronomy. But the local government also spent several times that on this nearby Astronomy Town—hotels, housing, a vineyard, a museum, a playground, classy restaurants, all those themed light fixtures. The government hopes that promoting their scope in this way will encourage tourists and new residents to gravitate to the historically poor Guizhou province.

    It is, in some sense, an experiment into whether this type of science and economic development can coexist. Which is strange, because normally, they purposefully don’t.

    The point of radio telescopes is to sense radio waves from space—gas clouds, galaxies, quasars. By the time those celestial objects’ emissions reach Earth, they’ve dimmed to near-nothingness, so astronomers build these gigantic dishes to pick up the faint signals. But their size makes them particularly sensitive to all radio waves, including those from cell phones, satellites, radar systems, spark plugs, microwaves, Wi-Fi, short circuits, and basically anything else that uses electricity or communicates. Protection against radio-frequency interference, or RFI, is why scientists put their radio telescopes in remote locations: the mountains of West Virginia, the deserts of Chile, the way-outback of Australia.

    FAST’s site used to be remote like that. The country even forcibly relocated thousands of villagers who lived nearby, so their modern trappings wouldn’t interfere with the new prized instrument.

    But then, paradoxically, the government built—just a few miles from the displaced villagers’ demolished houses—this astronomy town. It also plans to increase the permanent population by hundreds of thousands. That’s a lot of cell phones, each of which persistently emits radio waves with around 1 watt of power.

    By the time certain deep-space emissions reach Earth, their power often comes with 24+ zeroes in front: 0.0000000000000000000000001 watts.
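
    A rough comparison makes the mismatch concrete. Assume, purely for illustration, a 1-watt handset 10 kilometers from the dish radiating in all directions, and take FAST’s roughly 300-meter illuminated aperture; ignore terrain shielding and antenna patterns, which in reality help a great deal.

```python
import math

phone_power_w = 1.0        # typical handset transmit power
distance_m = 10e3          # assumed distance from phone to dish
aperture_radius_m = 150.0  # FAST illuminates roughly a 300 m circle

flux = phone_power_w / (4 * math.pi * distance_m**2)      # W per square meter
collected = flux * math.pi * aperture_radius_m**2         # W entering the dish
print(f"power from the phone: {collected:.1e} W")         # ~6e-5 W
print(f"ratio to a 1e-25 W cosmic signal: {collected / 1e-25:.1e}")  # ~6e20
```

    That twenty-orders-of-magnitude gap is why the quiet zone and the phone lockers exist.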

    FAST has been in the making for a long time. In the early 2000s, China angled to host the Square Kilometre Array, a collection of coordinated radio antennas whose dishes would be scattered over thousands of miles. But in 2006, the international SKA committee dismissed China, and then chose to set up its distributed mondo-telescope in South Africa and Australia instead.

    Undeterred, Chinese astronomers set out to build their own powerful instrument.

    In 2007, China’s National Development and Reform Commission allocated $90 million for the project, with $90 million more streaming in from other agencies. Four years later, construction began in one of China’s poorest regions, in the karst hills of the southwestern part of the country. They do things fast in China: The team finished the telescope in just five years. In September 2016, FAST received its “first light,” from a pulsar 1,351 light-years away, during its official opening.

    A year later, Stierwalt and the other visiting scientists arrived in Pingtang, and after an evening of touring Astronomy Town, they got down to business.

    See, FAST’s opening had been more ceremony than science (the commissioning phase is officially scheduled to end by September 2019). It was still far from fully operational—engineers are still trying to perfect, for instance, the motors that push and pull its surface into shape, allowing it to point and focus correctly. And the relatively new crop of radio astronomers running the telescope were hungry for advice about how to run such a massive research instrument.

    The visiting astronomers had worked with telescopes that have contributed to understanding of hydrogen emissions, pulsars, powerful bursts, and distant galaxies. But they weren’t just subject experts: Many were logistical wizards, having worked on multiple instruments and large surveys, and with substantial and dispersed teams. Stierwalt studies interacting dwarf galaxies, and while she’s a staff scientist at Caltech/IPAC, she uses telescopes all over. “Each gives a different piece of the puzzle,” she says. Optical telescopes show the stars. Infrared instruments reveal dust and older stars. X-ray observatories pick out black holes. And single-dish radio telescopes like FAST see the bigger picture: They can map out the gas inside of and surrounding galaxies.

    So at the Radio Astronomy Conference, Stierwalt and the other visitors shared how FAST could benefit from their instruments, and vice versa, and talked about how to run big projects. That work had begun even before the participants arrived. “Prior to the meeting, I traveled extensively all over the world to personally meet with the leaders of previous large surveys,” says Marko Krčo, a research fellow who’s been working for the Chinese Academy of Sciences since the summer of 2016.

    He asked the meeting’s speakers, some of those same leaders, to talk about what had gone wrong in their own surveys, and how the interpersonal end had functioned. “How did you organize yourselves?” he says. “How did you work together? How did you communicate?”

    That kind of feedback would be especially important for FAST to accomplish one of its first, appropriately lofty goals: helping astronomers collect signals from many sides of the universe, all at once. They’d call it the Commensal Radio Astronomy FAST Survey, or CRAFTS.

    3
    Above the dish, engineers have suspended instruments that collect cosmic radio waves. Feature China/Barcroft Media/Getty Images

    Most radio astronomical surveys have a single job: Map gas. Find pulsars. Discover galaxies. They do that by collecting signals in a receiver suspended over the dish of a radio telescope, engineered to capture a certain range of frequencies from the cosmos. Normally, the different astronomer factions don’t use that receiver at the same time, because they each take their data differently. But CRAFTS aims to be the first survey that simultaneously collects data for such a broad spectrum of scientists—without having to pause to reconfigure its single receiver.

    CRAFTS has a receiver that looks for signals from 1.04 gigahertz to 1.45 gigahertz, about 10 times higher than your FM radio. Within that range, as part of CRAFTS, scientists could simultaneously look for gas inside and beyond the galaxy, scan for pulsars, watch for mysterious “fast radio bursts,” make detailed maps, and maybe even search for ET. “That sounds straightforward,” says Stierwalt. “Point the telescope. Collect the data. Mine the data.”

    4
    Engineers from FAST and the Australian science agency install the telescope’s CRAFTS receiver. Marko Krčo

    But it’s not easy. Pulsar astronomers want quick time samples at a wide range of frequencies; hydrogen studiers, meanwhile, don’t need data chunks as often, but they care deeply about the granular frequency details. On top of that, each group adjusts the observations, calibrating them, kind of like you’d make sure your speedometer reads 45 mph when you’re going 45. And they use different kinds of adjustments.
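
    One way to picture the commensal trick: the same digitized stream gets channelized two different ways, with short transforms dumped often for the pulsar hunters and long transforms averaged over time for the hydrogen mappers. The sketch below does that to fake voltage data; it illustrates the idea rather than FAST’s actual signal chain, and the sample rate and transform lengths are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
sample_rate = 1_000_000                      # samples per second (illustrative)
voltages = rng.standard_normal(sample_rate)  # one second of fake receiver output

def channelize(samples, fft_len):
    """Split a voltage stream into consecutive power spectra of fft_len channels."""
    n_spectra = len(samples) // fft_len
    blocks = samples[:n_spectra * fft_len].reshape(n_spectra, fft_len)
    return np.abs(np.fft.rfft(blocks, axis=1)) ** 2   # power per channel

# Pulsar backend: coarse channels but a fresh spectrum every 256 samples,
# keeping the millisecond-scale time structure pulsar searches need.
pulsar_stream = channelize(voltages, fft_len=256)

# Hydrogen backend: the very same samples, finely channelized and averaged
# in time, trading time resolution for spectral detail.
hi_spectrum = channelize(voltages, fft_len=65536).mean(axis=0)

print("pulsar stream:", pulsar_stream.shape)  # (3906, 129)  spectra x channels
print("HI spectrum:  ", hi_spectrum.shape)    # (32769,)     channels
```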

    When we spoke, Krčo had just returned from a trip to Green Bank, where he was testing whether they could set everyone’s speedometer correctly. “I think it will be one of the big sort of legacies of FAST,” says Krčo. And it’s especially important since the National Science Foundation has recently cratered funding to both Arecibo and Green Bank observatories, the United States’ most significant single-dish radio telescopes.


    NAIC Arecibo Observatory, previously the largest radio telescope in the world, operated by the University of Central Florida, Yang Enterprises and UMET; altitude 497 m (1,631 ft)

    Green Bank does have financial support, $2 million per year for five years from Yuri Milner’s Breakthrough Listen Project.

    Breakthrough Listen Project

    1

    Lick Automated Planet Finder telescope, Mount Hamilton, CA, USA



    GBO radio telescope, Green Bank, West Virginia, USA


    CSIRO/Parkes Observatory, located 20 kilometres north of the town of Parkes, New South Wales, Australia

    While they remain open, they have to seek private project money, meaning chunks of time are no longer available for astronomers’ proposals. Adding hours, on a different continent, helps everybody.

    At the end of the conference in Pingtang County, Krčo and his colleagues presented a concrete plan for CRAFTS, giving all the visitors a chance to approve the proposed design. “Each group could raise any red flags, if necessary, regarding their individual science goals or suggest modifications,” says Krčo.

    In addition to the CRAFTS receiver, Krčo says they’ll add six more, sensitive to different frequencies. Together, they will detect radio waves from 70 megahertz to 3 gigahertz. He says they’ll find thousands of new pulsars (as of July 2018, they had already found more than 40), and do detailed studies of hydrogen inside the galaxy and in the wider universe, among numerous other worthy scientific goals.

    “There’s just a hell of a lot of work to do to get there,” says Krčo. “But we’re doing it.”

    For FAST to fulfill its potential, though, Krčo and his colleagues won’t just have to solve engineering problems: They’ll also have to deal with the problems that engineering created.

    During the four-day Radio Astronomy Forum, Stierwalt and the other astronomers did, finally, get to see the actual telescope, taking a bus up a tight, tortuous road through the karst between town and telescope.

    As soon as they arrived on site, they were instructed to shut down their phones to protect the instrument from the radio frequency interference. But not even these astronomers, who want pristine FAST data for themselves, could resist pressing that capture button. “Our sweet, sweet tour guide continually reminded us to please turn off our phones,” says Stierwalt, “but we all kept taking pictures and sneaking them out because no one really seemed to care.” Come on: It’s the world’s largest telescope.

    Maybe their minder stayed lax because a burst here or there wouldn’t make much of a difference in those early days. The number of regular tourists allowed at the site all day is capped at 3,000, to limit RFI, and they have to put their phones in lockers before they go see the dish. Krčo says the site bumps up against the visitor limit most days.

    But tourism and development are complicated for a sensitive scientific instrument. Within three miles of the telescope, the government passed legislation establishing a “radio-quiet zone,” where RFI-emitting devices are severely restricted. No one (not cellular providers or radio broadcasters) can get a transmitting license, and people entering the facility itself will have their electronics confiscated. “No one lives inside the zone, and the area is not open to the general public,” says Krčo, although some with commercial interests, like local farmers, can enter the zone with special permission. The government relocated villagers who lived within that protected area with promises of repayment in cash, housing, and jobs in tourism and FAST support services. (Though a 2016 report in Agence France-Presse revealed that up to 500 relocated families were suing the Pingtang government, alleging “land grabs without compensation, forced demolitions and unlawful detentions.”)

    The country’s Civil Aviation Administration has also adjusted air travel, setting up two restricted flight zones near the scope, canceling two routes, and adding or adjusting three others. “We can still see some RFI from aircraft navigational beacons,” says Krčo. “It’s much less, though, compared to what it’d look like without the adjusted air routes. It’d be impossible to fully clear a large enough air space to create a completely quiet sky.”

    None of the invisible boundaries, after all, function like force fields. RFI that originates from beyond can pass right on through. At least at the five-star tourist hotel, around 10 miles away, there’s Wi-Fi. The tour center, says an American pulsar astronomer, has a direct line of sight to the telescope.

    When Krčo first arrived on the job, he stayed in the astronomy town. “Every morning, we were counting all the new buildings springing up overnight,” Krčo says. “It would be half a dozen.”

    One day, he woke up to a new five-story structure out his window. Couldn’t be, he thought. But he checked a picture he’d taken the day before, and, sure enough, there had been no building in that spot.

    The corn close to town was covered in construction dust. “I’ve never seen anything like that in my whole life,” says Krčo. Today, though, the corn is gone, covered instead in hotels, museums, and shopping centers.

    5
    Before FAST, few large structures existed in this part of China. Feature China/Barcroft Media/Getty Images

    6
    Now, they abound. Liu Xu/Xinhua/Getty Images

    At a press conference in March 2017, Guizhou’s governor declared that the province would build 10,000 kilometers of new highway by 2020, in addition to completing 17 airports and 4,000 kilometers of high-speed train lines. That’s partly to accommodate the hundreds of thousands of people the province expects to relocate here permanently, as well as the tourists. While just those 3,000 people per day will get to visit the telescope itself, there’s no cap on how many can sojourn in Astronomy Town; the deputy director of Guizhou’s reform and development commission, according to China Daily, said it would be “a main astronomical tourism zone worldwide.” “The town has grown incredibly over the last couple of years due to tourism development,” says Krčo. “This has impacted our RFI environment, but not yet to a point where it is unmanageable.”

    Krčo says that geography protects FAST against much of that human interference. “There are a great many mountains between the telescope and the town,” says Krčo. The land blocks the waves, which you’ve seen yourself if you’ve ever tried to pick up NPR in a canyon. But even though the waves can’t go directly into the telescope, Krčo says the team still sees their echoes, reflections beamed down from the atmosphere.

    “People at the visitors’ center have been using cameras and whatnot, and we can see the RFI from that,” he said last November (enforcement seems to have ramped up since then). “During the daytime,” he adds, “our RFI is much worse than nighttime,” largely due to engineers working onsite (that should improve once commissioning is over). But the tourist traps aren’t run and weren’t developed by FAST staff but by various governmental arms—so FAST, really, has no control over what they do.

    The global radio astronomy community has concerns. “I’m absolutely sure that if people are going to bring their toys, then there’s going to be RFI,” says Carla Beaudet, an RFI engineer at Green Bank Observatory, who spends her career trying to help humans see the radio sky despite themselves. Green Bank itself sits in the middle of a strict radio protection zone with a radius of 10 miles, in which there’s no Wi-Fi or even microwaves.

    There are other ways of dealing with RFI—and Krčo says FAST has a permanent team of engineers dedicated to dealing with interference. One solution, which can pick up the strongest contamination, is a small antenna mounted to one of FAST’s support towers. “The idea is that it will observe the same RFI as the big dish,” says Krčo. “Then, in principle, we can remove the RFI from the data in real time.”
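
    A bare-bones version of that subtraction treats the reference antenna’s spectrum as a template, fits a single coupling coefficient by least squares, and removes the scaled copy from the dish data. Real systems fit per frequency channel and per time block and do it continuously; the sketch below, on toy data, only shows the principle and is not FAST’s implementation.

```python
import numpy as np

def subtract_reference(dish, ref):
    """Remove RFI seen by a reference antenna from the main dish's spectrum.

    Finds the coupling g that minimizes |dish - g * ref|^2, then subtracts
    g * ref. One global coefficient here, for clarity."""
    g = np.dot(dish, ref) / np.dot(ref, ref)
    return dish - g * ref

# Toy demonstration: a smooth astronomical band plus one strong narrow RFI spike.
rng = np.random.default_rng(2)
freq = np.linspace(0.0, 1.0, 512)
astro = np.exp(-((freq - 0.5) / 0.2) ** 2)         # signal we want to keep
rfi = np.zeros_like(freq)
rfi[100] = 50.0                                    # what the phones and towers add
ref = 0.8 * rfi + 0.01 * rng.standard_normal(512)  # reference sees mostly the RFI

cleaned = subtract_reference(astro + rfi, ref)
print("RFI channel before:", (astro + rfi)[100], "after:", cleaned[100])
```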

    At other telescopes, astronomers are developing machine-learning algorithms that could identify, extract, and compensate for dirty data. All telescopes, after all, have human contamination, even the ones without malls next door. You can’t stop a communications satellite from passing overhead, or a radar beam from bouncing the wrong way across the mountains. And while you can decide not to build a tourist town in the first place, you probably can’t stop a tidal wave of construction once it’s crested.

    In their free evenings at the Radio Astronomy Forum, Stierwalt and the other astronomers wandered through the development. Across from their luxury hotel, workers were constructing a huge mall. It was just scaffolding then, but sparks flew from tools every night. “So the joke was, ‘I wonder if we’ll be able to go shopping at the mall by the end of our trip,’” says Stierwalt.

    At the end of the conference, Stierwalt rode a bus back to the airport, awed by what she’d seen. The karst hills, dipping and rising out the window, looked like those in Puerto Rico, where she had used the 300-meter Arecibo telescope for weeks at a time during her graduate research.

    When she tried to check in for her flight, she didn’t know where to go, what to do. An agent wrote her passport number down wrong.

    A young Chinese man, an astronomer, saw her struggle and approached her. “I’m on your flight,” he said, “and I’ll make sure you get on it.”

    In line after line, they started talking about other things—life, science. “I was describing the astronomy landscape for me,” she says. Never enough jobs, never enough research money, necessary competition with your friends. “For him, it’s very different.”

    He lives in a country that wants to accrete a community of radio astronomers, not winnow one down. A country that wants to support (and promote) ambitious telescopes, rather than defund the ones it has. China isn’t just trying to build a tourist economy around its telescope—it’s also trying to build a scientific culture around radio astronomy.

    That latter part seems like a safe bet. But the first is still uncertain. So is how the tourist economy will affect—for better or worse—FAST’s scientific payoff. “Much like their CRAFTS survey is trying to make everyone happy—all the different kinds of radio astronomers—this will be a true test of ‘Can you make everyone happy?’” says Stierwalt. “Can you make a prosperous astronomy town right next to a telescope that doesn’t want you to be using your phone or your microwave?”

    Right now, nobody knows. But if the speed of everything else in Guizhou is any indication, we’ll all find out fast.

    [* I had previously read, which I cannot any longer back up, that FAST was built in a fortunately found an empty natural bowl in the land. If anyone can correct me, please do]

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:33 pm on August 11, 2018 Permalink | Reply
    Tags: , , , , , Quasars are now known to be supermassive black holes feeding on surrounding gas not stars., , WIRED, Zwicky Transient Facility at California’s Palomar Observatory

    From Wired: “Star-Swallowing Black Holes Reveal Secrets in Exotic Light Shows” 

    Wired logo

    From Wired

    08.11.18
    Joshua Sokol

    Black holes, befitting their name and general vibe, are hard to find and harder to study. You can eavesdrop on small ones from the gravitational waves that echo through space when they collide—but that technique is new, and still rare. You can produce laborious maps of stars flitting around the black hole at the center of the Milky Way or nearby galaxies. Or you can watch them gulp down gas clouds, which emit radiation as they fall.

    Now researchers have a new option. They’ve begun corralling ultrabright flashes called tidal disruption events (TDEs), which occur when a large black hole seizes a passing star, shreds it in two and devours much of it with the appetite of a bear snagging a salmon. “To me, it’s sort of like science fiction,” said Enrico Ramirez-Ruiz, an astrophysicist at University of California, Santa Cruz, and the Niels Bohr Institute.

    During the past few years, though, the study of TDEs has transformed from science fiction to a sleepy cottage industry, and now into something more like a bustling tech startup.

    Automated wide-field telescopes that can pan across thousands of galaxies each night have uncovered about two dozen TDEs. Included in these discoveries are some bizarre and long-sought members of the TDE zoo. In June, a study in the journal Nature described an outburst of X-ray light in a cluster of faraway stars that astronomers interpreted as a midsized black hole swallowing a star. That same month, another group announced in Science that they had discovered what may be the brightest TDE yet seen, one that illuminated faint gas at the heart of a pair of merging galaxies.

    These discoveries have taken place as our understanding of what’s really happening during a TDE comes into sharper focus. At the end of May, a group of astrophysicists proposed [The Astrophysical Journal Letters] a new theoretical model for how TDEs work. The model can explain why different TDEs can appear to behave differently, even though the underlying physics is presumably the same.

    Astronomers hope that decoding these exotic light shows will let them conduct a black hole census. Tidal disruptions expose the masses, spins and sheer numbers of black holes in the universe, the vast majority of which would be otherwise invisible. Theorists are hungry, for example, to see if TDEs might unveil any intermediate-mass black holes with weights between the two known black hole classes: star-size black holes that weigh a few times more than the sun, and the million- and billion-solar-mass behemoths that haunt the cores of galaxies. The Nature paper claims they may already have.


    A numerical simulation of the core of a star as it’s being consumed by a black hole. Video by Guillochon and Ramirez-Ruiz

    Researchers have also started to use TDEs to probe the fundamental physics of black holes. They can be used to test whether black holes always have event horizons—curtains beyond which nothing can return—as Einstein’s theory of general relativity predicts.

    Meanwhile, many more observations are on the way. The rate of new TDEs, now about one or two per year, could jump by an order of magnitude [Stellar Tidal Disruption Events in General Relativity] even by the end of this year because of the Zwicky Transient Facility, which started scanning the sky over California’s Palomar Observatory in March.

    Zwicky Transient Facility at California’s Palomar Observatory schematic

    Zwicky Transient Facility at California’s Palomar Observatory

    And with the addition of planned observatories, the rate may increase by perhaps another order of magnitude in the years to come.

    “The field has really blossomed,” said Suvi Gezari at the University of Maryland, one of the few stubborn pioneers who staked their careers on TDEs during leaner years. She now leads the Zwicky Transient Facility’s TDE-hunting team, which has already snagged unpublished candidates in its opening months, she said. “Now people are really digging in.”

    Searching for Star-Taffy

    In 1975, the British physicist Jack Hills first dreamed up a black-hole-eats-star scenario as a way to explain what powers quasars—superbright points of light from the distant universe. (Quasars are now known to be supermassive black holes feeding on surrounding gas, not stars.) But in 1988, the British cosmologist Martin Rees realized [Nature] that a black hole snacking on a star would exhibit a sharp flare, not a steady glow. Looking for such flares could let astronomers find and study the black holes themselves, he argued.

    Nothing that fit the bill turned up until the late 1990s. That’s when Stefanie Komossa, at the time a graduate student at the Max Planck Institute for Extraterrestrial Physics in Garching, Germany, found massive X-ray flares [Discovery of a giant and luminous X-ray outburst from the optically inactive galaxy pair RXJ1242.6-1119] from the centers of distant galaxies that brightened and dimmed according to Rees’s predictions.
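    The "sharp flare" Rees predicted has a characteristic shape: after peak, the rate at which stellar debris falls back onto the black hole, and with it roughly the luminosity, declines as t^(-5/3). The short sketch below is only a toy illustration of that decay law; the peak luminosity and the fallback timescale are arbitrary assumptions, not values fitted to Komossa's flares.

```python
# Toy illustration (not a fit to any real flare): the canonical prediction for
# a tidal disruption flare is that the debris fallback rate, and roughly the
# luminosity, declines as t^(-5/3) after peak. Peak luminosity and peak time
# below are arbitrary assumed values.
L_PEAK = 1e44   # erg/s, assumed peak luminosity
T_PEAK = 40.0   # days after disruption at which the flare peaks (assumed)

def tde_luminosity(t_days):
    """Toy light curve: dark before peak, t^(-5/3) decline afterwards."""
    if t_days < T_PEAK:
        return 0.0
    return L_PEAK * (t_days / T_PEAK) ** (-5.0 / 3.0)

for t in (40, 80, 160, 320, 640):
    print(f"t = {t:4d} days   L ~ {tde_luminosity(t):.2e} erg/s")
```

    Each doubling of the elapsed time after peak dims the toy flare by roughly a factor of three, the kind of steep, months-long fading that separates a disruption flare from a steadily shining quasar.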

    The astronomical community responded to these discoveries—based on just a few data points—with caution. Then in the mid-2000s, Gezari, then beginning a postdoc at the California Institute of Technology, searched for and discovered her own handful of TDE candidates. She looked for flashes of ultraviolet light, not X-rays as Komossa had. “In the old days,” Gezari said, “I was just trying to convince people that any of our discoveries were actually due to a tidal disruption.”

    Soon, though, she had something to sway even the doubters. In 2010, Gezari discovered an especially clear flare, rising and falling as modelers predicted. She published it in Nature in 2012, catching other astronomers’ attention. In the years since, large surveys in optical light, sifting through the sky for changes in brightness, have taken over the hunt. And like Komossa’s and Gezari’s TDEs, which had both been fished out of missions designed to look for other things, the newest batch showed up as bycatch. “It was, oh, why didn’t we think about looking for these?” said Christopher Kochanek, an astrophysicist at Ohio State University who works on a project designed to search for supernovas [ASAS-SN OSU All-Sky Automated Survey for Supernovae].

    Now, with a growing number of TDEs in hand, astrophysicists are within arm’s reach of Rees’s original goal: pinpointing and studying gargantuan black holes. But they still need to learn to interpret these events, divining their basic physics. Unexpectedly, the known TDEs fall into separate classes [A unified model for tidal disruption events]. Some seem to emit mostly ultraviolet and optical light, as if from gas heated to tens of thousands of degrees. Others glow fiercely with X-rays, suggesting temperatures an order of magnitude higher. Yet presumably they all have the same basic physical root.
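    As a rough illustration of why those temperature estimates map onto different bands, Wien's displacement law relates a blackbody's temperature to the wavelength at which it emits most strongly. The snippet below uses round-number temperatures chosen for illustration, not measured TDE values.

```python
# Rough illustration of the temperature claim above, using Wien's displacement
# law (peak wavelength = b / T). The two temperatures are round numbers chosen
# for illustration, not measured TDE values.
WIEN_B = 2.898e-3  # m*K, Wien displacement constant

def peak_wavelength_nm(temperature_k):
    """Wavelength (in nanometres) at which a blackbody at temperature T peaks."""
    return WIEN_B / temperature_k * 1e9

for label, temp_k in [("ultraviolet/optical TDE", 3e4), ("X-ray TDE", 3e5)]:
    print(f"{label}: T ~ {temp_k:.0e} K, peak emission ~ "
          f"{peak_wavelength_nm(temp_k):.0f} nm")
```

    At a few tens of thousands of kelvin the peak sits in the far ultraviolet; ten times hotter pushes it down to roughly 10 nanometres, at the edge of the soft X-ray band.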

    To be disrupted, an unlucky star must venture close enough to a black hole that gravitational tides exceed the internal gravity that binds the star together. In other words, the difference in the black hole’s gravitational pull on the near and far sides of the star, along with the inertial pull as the star swings around the black hole, stretches the star out into a stream. “Basically it spaghettifies,” said James Guillochon, an astrophysicist at the Harvard-Smithsonian Center for Astrophysics.

    The outer half of the star escapes into space. But the inner half—that dense stream of star-taffy—swirls into the black hole, heating up and releasing huge sums of energy that radiate across the universe.
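    A back-of-the-envelope version of that disruption condition is the tidal radius, r_t ≈ R_star (M_bh / M_star)^(1/3), roughly the distance inside which the black hole's tides overwhelm the star's self-gravity. The sketch below is a rough illustration rather than anything from the article; the comparison with the Schwarzschild radius is added to show why the most massive black holes are expected to swallow sun-like stars whole rather than produce a visible flare.

```python
# Rough sketch of the disruption condition described above (not from the
# article): tides win once the star passes inside the tidal radius,
# r_t ~ R_star * (M_bh / M_star)^(1/3). Comparing r_t with the Schwarzschild
# radius shows when a sun-like star would be swallowed whole instead of
# producing a flare. Solar values are rounded.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # kg
R_SUN = 6.957e8    # m

def tidal_radius(m_bh, m_star=M_SUN, r_star=R_SUN):
    """Approximate tidal disruption radius in metres."""
    return r_star * (m_bh / m_star) ** (1.0 / 3.0)

def schwarzschild_radius(m_bh):
    """Event-horizon radius of a non-spinning black hole in metres."""
    return 2.0 * G * m_bh / C**2

for m_bh in (1e6 * M_SUN, 1e9 * M_SUN):
    r_t, r_s = tidal_radius(m_bh), schwarzschild_radius(m_bh)
    verdict = "disruption flare possible" if r_t > r_s else "star swallowed whole"
    print(f"M_bh = {m_bh / M_SUN:.0e} M_sun: r_t = {r_t:.1e} m, "
          f"r_s = {r_s:.1e} m -> {verdict}")
```

    For a million-solar-mass black hole the tidal radius sits well outside the horizon, the regime the surveys above are finding; for a billion-solar-mass one it sits inside, so a sun-like star simply disappears without a flare.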

    Even with this general mechanism understood, researchers have had trouble explaining why individual TDEs can look so distinct. One longstanding idea appeals to different phases of the star-eating process. As the star flesh gets initially torn away and stretched into a stream, it might ricochet around the black hole and slam into its own tail. This process might heat the tail up to ultraviolet-producing temperatures—but not hotter. Then later—after a few months or a year—the stellar debris would settle into an accretion disk, a fat bagel of spinning gas that theories predict should be hot enough to emit X-rays.

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 2:41 pm on July 22, 2018 Permalink | Reply
    Tags: , , , , , , , , Sau Lan Wu, WIRED,   

    From LHC at CERN and University of Wisconsin Madison via WIRED and Quanta: Women in STEM “Meet the Woman Who Rocked Particle Physics—Three Times” Sau Lan Wu 

    Cern New Bloc

    Cern New Particle Event

    CERN New Masthead

    CERN

    U Wisconsin

    via

    Wired logo

    WIRED

    originated at

    Quanta Magazine
    Quanta Magazine

    7.22.18
    Joshua Roebke

    1
    Sau Lan Wu at CERN, the laboratory near Geneva that houses the Large Hadron Collider. The mural depicts the detector she and her collaborators used to discover the Higgs boson. Thi My Lien Nguyen/Quanta Magazine

    In 1963, Maria Goeppert Mayer won the Nobel Prize in physics for describing the layered, shell-like structures of atomic nuclei. No woman has won since.

    One of the many women who, in a different world, might have won the physics prize in the intervening 55 years is Sau Lan Wu. Wu is the Enrico Fermi Distinguished Professor of Physics at the University of Wisconsin, Madison, and an experimentalist at CERN, the laboratory near Geneva that houses the Large Hadron Collider.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    Wu’s name appears on more than 1,000 papers in high-energy physics, and she has contributed to a half-dozen of the most important experiments in her field over the past 50 years. She has even realized the improbable goal she set for herself as a young researcher: to make at least three major discoveries.

    Wu was an integral member of one of the two groups that observed the J/psi particle, which heralded the existence of a fourth kind of quark, now called the charm. The discovery, in 1974, was known as the November Revolution, a coup that led to the establishment of the Standard Model of particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    Later in the 1970s, Wu did much of the math and analysis to discern the three “jets” of energy flying away from particle collisions that signaled the existence of gluons—particles that mediate the strong force holding protons and neutrons together. This was the first observation of particles that communicate a force since scientists recognized photons of light as the carriers of electromagnetism. Wu later became one of the group leaders for the ATLAS experiment, one of the two collaborations at the Large Hadron Collider that discovered the Higgs boson in 2012, filling in the final piece of the Standard Model.

    CERN ATLAS Higgs Event


    CERN/ATLAS detector

    She continues to search for new particles that would transcend the Standard Model and push physics forward.

    Sau Lan Wu was born in occupied Hong Kong during World War II. Her mother was the sixth concubine to a wealthy businessman who abandoned Wu, her mother and her younger brother when Wu was a child. She grew up in abject poverty, sleeping alone in a space behind a rice shop. Her mother was illiterate, but she urged her daughter to pursue an education and become independent of volatile men.

    Wu graduated from a government school in Hong Kong and applied to 50 universities in the United States. She received a scholarship to attend Vassar College and arrived with $40 to her name.

    Although she originally intended to become an artist, she was inspired to study physics after reading a biography of Marie Curie. She worked on experiments during consecutive summers at Brookhaven National Laboratory on Long Island, and she attended graduate school at Harvard University. She was the only woman in her cohort and was barred from entering the male dormitories to join the study groups that met there. She has labored since then to make a space for everyone in physics, mentoring more than 60 men and women through their doctorates.

    Quanta Magazine joined Sau Lan Wu on a gray couch in sunny Cleveland in early June. She had just delivered an invited lecture about the discovery of gluons at a symposium to honor the 50th birthday of the Standard Model. The interview has been condensed and edited for clarity.

    2
    3
    Wu’s office at CERN is decorated with mementos and photos, including one of her and her husband, Tai Tsun Wu, a professor of theoretical physics at Harvard.
    Thi My Lien Nguyen/Quanta Magazine

    You work on the largest experiments in the world, mentor dozens of students, and travel back and forth between Madison and Geneva. What is a normal day like for you?

    Very tiring! In principle, I am full-time at CERN, but I do go to Madison fairly often. So I do travel a lot.

    How do you manage it all?

    Well, I think the key is that I am totally devoted. My husband, Tai Tsun Wu, is also a professor, in theoretical physics at Harvard. Right now, he’s working even harder than me, which is hard to imagine. He’s doing a calculation about the Higgs boson decay that is very difficult. But I encourage him to work hard, because it’s good for your mental state when you are older. That’s why I work so hard, too.

    Of all the discoveries you were involved in, do you have a favorite?

    Discovering the gluon was a fantastic time. I was just a second- or third-year assistant professor. And I was so happy. That’s because I was the baby, the youngest of all the key members of the collaboration.

    The gluon was the first force-carrying particle discovered since the photon. The W and Z bosons, which carry the weak force, were discovered a few years later, and the researchers who found them won a Nobel Prize. Why was no prize awarded for the discovery of the gluon?

    Well, you are going to have to ask the Nobel committee that. [Laughs.] I can tell you what I think, though. Only three people can win a Nobel Prize. And there were three other physicists on the experiment with me who were more senior than I was. They treated me very well. But I pushed the idea of searching for the gluon right away, and I did the calculations. I didn’t even talk to theorists. Although I married a theorist, I never really paid attention to what the theorists told me to do.

    How did you wind up being the one to do those calculations?

    If you want to be successful, you have to be fast. But you also have to be first. So I did the calculations to make sure that, as soon as a new collider at DESY [the German Electron Synchrotron] turned on in Hamburg, we could see the gluon and recognize its signal of three jets of particles.

    DESY (Helmholtz Centres & Networks): DESY’s synchrotron radiation source, the PETRA III storage ring (in orange), with the three experimental halls (in blue) in 2015.

    We were not so sure in those days that the signal for the gluon would be clear-cut, because the concept of jets had only been introduced a couple of years earlier, but this seemed to be the only way to discover gluons.
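    Wu's actual calculations are not reproduced in the article, but one standard textbook way to tell a planar three-jet event from a pencil-like two-jet event is an event-shape variable built from the final-state momenta, such as sphericity. The sketch below is a hedged illustration of that idea using idealized toy events, not her analysis.

```python
import numpy as np

# Hedged illustration (not Wu's analysis): the sphericity tensor
# S_ab = sum_i p_ia * p_ib / sum_i |p_i|^2 summarizes how the final-state
# momenta are distributed. Back-to-back two-jet events give sphericity near 0;
# planar, symmetric three-jet events give sphericity near 0.75 with
# aplanarity near 0.
def event_shape(momenta):
    """Return (sphericity, aplanarity) for an (N, 3) array of 3-momenta."""
    p = np.asarray(momenta, dtype=float)
    tensor = p.T @ p / np.sum(p * p)
    lam = np.sort(np.linalg.eigvalsh(tensor))[::-1]  # lam1 >= lam2 >= lam3
    return 1.5 * (lam[1] + lam[2]), 1.5 * lam[2]

# Idealized toy events with unit momenta (hypothetical, for illustration only).
two_jet = [[1, 0, 0], [-1, 0, 0]]                             # back to back
three_jet = [[1, 0, 0], [-0.5, 0.866, 0], [-0.5, -0.866, 0]]  # planar "Mercedes"

for name, event in [("two-jet", two_jet), ("three-jet", three_jet)]:
    s, a = event_shape(event)
    print(f"{name}: sphericity = {s:.2f}, aplanarity = {a:.2f}")
```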

    You were also involved in discovering the Higgs boson, the particle in the Standard Model that gives many other particles their masses. How was that experiment different from the others that you were part of?

    I worked a lot more and a lot longer to discover the Higgs than I have on anything else. I worked for over 30 years, doing one experiment after another. I think I contributed a lot to that discovery. But the ATLAS collaboration at CERN is so large that you can’t even talk about your individual contribution. There are 3,000 people who built and worked on our experiment [including 600 scientists at Brookhaven National Lab, NY, USA]. How can anyone claim anything? In the old days, life was easier.

    Has it gotten any easier to be a woman in physics than when you started?

    Not for me. But for younger women, yes. There is a trend among funding agencies and institutions to encourage younger women, which I think is great. But for someone like me it is harder. I went through a very difficult time. And now that I am established others say: Why should we treat you any differently?

    Who were some of your mentors when you were a young researcher?

    Bjørn Wiik really helped me when I was looking for the gluon at DESY.

    How so?

    Well, when I started at the University of Wisconsin, I was looking for a new project. I was interested in doing electron-positron collisions, which could give the clearest indication of a gluon. So I went to talk to another professor at Wisconsin who did these kinds of experiments at SLAC, the lab at Stanford. But he was not interested in working with me.

    So I tried to join a project at the new electron-positron collider at DESY. I wanted to join the JADE experiment [abbreviated from the nations that developed the detector: Japan, Germany (Deutschland) and England]. I had some friends working there, so I went to Germany and I was all set to join them. But then I heard that no one had told a big professor in the group about me, so I called him up. He said, “I am not sure if I can take you, and I am going on vacation for a month. I’ll phone you when I get back.” I was really sad because I was already in Germany at DESY.

    But then I ran into Bjørn Wiik, who led a different experiment called TASSO, and he said, “What are you doing here?” I said, “I tried to join JADE, but they turned me down.” He said, “Come and talk to me.” He accepted me the very next day.

    4
    TASSO detector at PETRA at DESY

    And the thing is, JADE later broke their chamber, and they could not have observed the three-jet signal for gluons when we observed it first at TASSO. So I have learned that if something does not work out for you in life, something else will.

    5
    Wu and Bjørn Wiik in 1978, in the electronic control room of the TASSO experiment at the German Electron Synchrotron in Hamburg, Germany. Dr. Ulrich Kötz

    You certainly turned that negative into a positive.

    Yes. The same thing happened when I left Hong Kong to attend college in the US. I applied to 50 universities after I went through a catalog at the American consulate. I wrote in every application, “I need a full scholarship and room and board,” because I had no money. Four universities replied. Three of them turned me down. Vassar was the only American college that accepted me. And it turns out, it was the best college of all the ones I applied to.

    If you persist, something good is bound to happen. My philosophy is that you have to work hard and have good judgment. But you also have to have luck.

    I know this is an unfair question, because no one ever asks men, even though we should, but how can society inspire more women to study physics or consider it as a career?

    Well, I can only say something about my field, experimental high-energy physics. I think my field is very hard for women. I think partially it’s the problem of family.

    My husband and I did not live together for 10 years, except during the summers. And I gave up having children. When I was considering having children, it was around the time when I was up for tenure and a grant. I feared I would lose both if I got pregnant. I was less worried about actually having children than I was about walking into my department or a meeting while pregnant. So it’s very, very hard for families.

    I think it still can be.

    Yeah, but for the younger generation it’s different. Nowadays, a department looks good if it supports women. I don’t mean that departments are deliberately doing that only to look better, but they no longer actively fight against women. It’s still hard, though. Especially in experimental high-energy physics. I think there is so much traveling that it makes having a family or a life difficult. Theory is much easier.

    You have done so much to help establish the Standard Model of particle physics. What do you like about it? What do you not like?

    It’s just amazing that the Standard Model works as well as it does. I like that every time we try to search for something that is not accounted for in the Standard Model, we do not find it, because the Standard Model says we shouldn’t.

    But back in my day, there was so much that we had yet to discover and establish. The problem now is that everything fits together so beautifully and the Model is so well confirmed. That’s why I miss the time of the J/psi discovery. Nobody expected that, and nobody really had a clue what it was.

    But maybe those days of surprise aren’t over.

    We know that the Standard Model is an incomplete description of nature. It doesn’t account for gravity, the masses of neutrinos, or dark matter—the invisible substance that seems to make up six-sevenths of the universe’s mass. Do you have a favorite idea for what lies beyond the Standard Model?

    Well, right now I am searching for the particles that make up dark matter. The only thing is, I am committed to working at the Large Hadron Collider at CERN. But a collider may or may not be the best place to look for dark matter. It’s out there in the galaxies, but we don’t see it here on Earth.

    Still, I am going to try. If dark matter has any interactions with the known particles, it can be produced via collisions at the LHC. But weakly interacting dark matter would not leave a visible signature in our detector at ATLAS, so we have to intuit its existence from what we actually see. Right now, I am concentrating on finding hints of dark matter in the form of missing energy and momentum in a collision that produces a single Higgs boson.
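    In practice, "what we actually see" gets summarized in the missing transverse momentum: the negative vector sum of the transverse momenta of every visible reconstructed object in the collision. A large imbalance recoiling against a single Higgs candidate is the kind of signature Wu describes. The snippet below is a toy sketch with invented numbers, not ATLAS software.

```python
import math

# Toy illustration (not ATLAS reconstruction code): missing transverse
# momentum is inferred as the negative vector sum of the transverse momenta
# of all visible reconstructed objects in the event. All numbers are made up.
def missing_transverse_momentum(visible):
    """visible: iterable of (pt_in_GeV, phi_in_radians); returns (MET, phi)."""
    px = -sum(pt * math.cos(phi) for pt, phi in visible)
    py = -sum(pt * math.sin(phi) for pt, phi in visible)
    return math.hypot(px, py), math.atan2(py, px)

# Hypothetical event: two b-jets from a Higgs candidate plus one soft jet,
# with nothing visible balancing them.
event = [(80.0, 0.3), (65.0, -0.4), (20.0, 2.8)]
met, met_phi = missing_transverse_momentum(event)
print(f"missing transverse momentum ~ {met:.1f} GeV at phi = {met_phi:.2f} rad")
```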

    What else have you been working on?

    Our most important task is to understand the properties of the Higgs boson, which is a completely new kind of particle. The Higgs is more symmetric than any other particle we know about; it’s the first particle that we have discovered without any spin. My group and I were major contributors to the very recent measurement of Higgs bosons interacting with top quarks. That observation was extremely challenging. We examined five years of collision data, and my team worked intensively on advanced machine-learning techniques and statistics.

    In addition to studying the Higgs and searching for dark matter, my group and I also contributed to the silicon pixel detector, to the trigger system [that identifies potentially interesting collisions], and to the computing system in the ATLAS detector. We are now improving these during the shutdown and upgrade of the LHC. We are also very excited about the near future, because we plan to start using quantum computing to do our data analysis.

    6
    Wu at CERN. Thi My Lien Nguyen/Quanta Magazine

    Do you have any advice for young physicists just starting their careers?

    Some of the young experimentalists today are a bit too conservative. In other words, they are afraid to do something that is not in the mainstream. They fear doing something risky and not getting a result. I don’t blame them. It’s the way the culture is. My advice to them is to figure out what the most important experiments are and then be persistent. Good experiments always take time.

    But not everyone gets to take that time.

    Right. Young students don’t always have the freedom to be very innovative, unless they can do it in a very short amount of time and be successful. They don’t always get to be patient and just explore. They need to be recognized by their collaborators. They need people to write them letters of recommendation.

    The only thing that you can do is work hard. But I also tell my students, “Communicate. Don’t close yourselves off. Try to come up with good ideas on your own but also in groups. Try to innovate. Nothing will be easy. But it is all worth it to discover something new.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    In achievement and prestige, the University of Wisconsin–Madison has long been recognized as one of America’s great universities. A public, land-grant institution, UW–Madison offers a complete spectrum of liberal arts studies, professional programs and student activities. Spanning 936 acres along the southern shore of Lake Mendota, the campus is located in the city of Madison.

     