Tagged: WIRED

  • richardmitnick 12:21 pm on September 11, 2017 Permalink | Reply
    Tags: ORNL Cray Titan XK7 Supercomputer, ORNL IBM Summit Supercomputer, WIRED   

    From WIRED: “The Astonishing Engineering Behind America’s Latest, Greatest Supercomputer” 

    Wired logo



    If you want to do big, serious science, you’ll need a serious machine. You know, like a giant water-cooled computer that’s 200,000 times more powerful than a top-of-the-line laptop and that sucks up enough energy to power 12,000 homes.

    You’ll need Summit, a supercomputer nearing completion at the Oak Ridge National Laboratory in Tennessee.

    ORNL IBM Summit Supercomputer

    When it opens for business next year, it’ll be the United States’ most powerful supercomputer and perhaps the most powerful in the world. Because as science gets bigger, so too must its machines, requiring ever more awesome engineering, both for the computer itself and the building that has to house it without melting. Modeling the astounding number of variables that affect climate change, for instance, is no task for desktop computers in labs. Same goes for genomics work and drug discovery and materials science. If it’s wildly complex, it’ll soon course through Summit’s circuits.

    Summit will be five to 10 times more powerful than its predecessor, Oak Ridge’s Titan supercomputer, which will continue running its science for about a year after Summit comes online.

    ORNL Cray Titan XK7 Supercomputer

    (Not that there’s anything wrong with Titan. It’s just that at 5 years old, the machine is getting on in years by supercomputer standards.) But it’ll be pieced together in much the same way: cabinet after cabinet of so-called nodes. Each of Titan’s 18,688 nodes consists of one CPU and one GPU; each of Summit’s will pair two CPUs with six GPUs.

    Diagram showing how chilled water is delivered to the building.
    Heery International

    Think of the GPU as a turbocharger for the CPU in this relationship. While not all supercomputers use this setup, known as a heterogeneous architecture, those that do get a boost―each of the 4,600 nodes in Summit can manage 40 teraflops. So at peak performance, Summit will hit 200 petaflops, a petaflop being one million billion operations a second. “So we envision research teams using all of those GPUs on every single node when they run, that’s sort of our mission as a facility,” says Stephen McNally, operations manager.
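The throughput claim reduces to quick arithmetic, sketched below with the figures quoted above (the small gap from the 200-petaflop number reflects rounding in the quoted per-node figure):

```python
# Summit's aggregate peak, from the figures quoted above:
# 4,600 nodes, each managing about 40 teraflops.
nodes = 4_600
teraflops_per_node = 40

peak_petaflops = nodes * teraflops_per_node / 1_000  # 1 petaflop = 1,000 teraflops
print(f"Aggregate peak: ~{peak_petaflops:.0f} petaflops")  # prints ~184 petaflops

# A petaflop is 10**15 operations per second ("one million billion"),
# so 200 petaflops is 2e17 operations every second.
ops_per_second = 200 * 10**15
```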

    Performing all those operations sucks up a lot of power and generates a ton of heat. That poses a daunting challenge for Heery, the company charged with preventing Summit from overheating and powering the building that houses it. Heery’s piping in 20 megawatts of electricity (the supercomputer itself will run on 15 megawatts), enough juice to power a decent-sized city. “12,000 Southern homes with their air conditioners cranking would be roughly 20 megawatts of power,” says George Wellborn, senior associate at Heery. Luckily, Oak Ridge is hooked up to the Tennessee Valley Authority, which in Tennessee alone has a generating capacity of nearly 20,000 megawatts from 19 hydroelectric dams, two nuclear power plants, and too many other sources to get into here.

    Another engineering pickle: Each of the supercomputer’s 4,600 nodes needs to be cooled individually. Summit will use water. (Titan uses a refrigerant. You could also cool your electronics in a bath of mineral oil, if you were so inclined.) “Every one of those nodes is using a cold plate technology, where we’re putting water through a cold plate that’s directly on top,” says Jim Rogers, director for computing and facilities. “So 70 percent of the heat that’s generated by this thing can be absorbed by that cold plate.”

    Overhead view of Summit. Heery International

    Curiously, this isn’t super-chilled water―it’s a comfortable 70 degrees Fahrenheit. Why? Because if you drop the temperature too much, you’ll form dew, which is a great way to ruin a supercomputer. “You have to have higher flow rates to carry the heat away,” Rogers says (we’re talking a max flow of nearly 8,000 gallons per minute), “but that tradeoff is good in terms of energy efficiency and operating cost.”
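A rough heat-balance sketch shows why the warm water works. The flow rate and heat fraction come from the article; the water properties are standard values, and applying the 70 percent figure to Summit’s quoted 15 megawatts is an assumption for illustration:

```python
# Heat carried by a water loop: Q = m_dot * c * delta_T.
# At ~8,000 gallons per minute, only a small temperature rise
# is needed to absorb ~70% of Summit's 15 MW.
LITERS_PER_GALLON = 3.785
flow_gpm = 8_000                 # max flow quoted above, gallons per minute
heat_watts = 0.70 * 15e6         # cold plates absorb ~70% of the 15 MW load

m_dot = flow_gpm * LITERS_PER_GALLON / 60    # kg/s (1 liter of water ~ 1 kg)
c_water = 4186                               # J/(kg*K), specific heat of water
delta_t = heat_watts / (m_dot * c_water)     # ~5 C rise across the loop
print(f"Water warms by only ~{delta_t:.1f} degrees C")
```

A roughly 5 °C rise is easy to reject to the outside air, which is why 70 °F supply water plus a high flow rate beats chilled water on operating cost.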

    Summit still has to … summit some final steps before it can start crunching heavy-duty science. Its cabinets should all be installed by late October, then it will undergo a year of testing and debugging. But soon enough, one of the most impressive devices humankind has ever assembled will go node to node with the best supercomputers in the world.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 3:02 pm on June 6, 2017 Permalink | Reply
    Tags: An IBM Breakthrough Ensures Silicon Will Keep Shrinking, WIRED

    From WIRED: “An IBM Breakthrough Ensures Silicon Will Keep Shrinking” 



    IBM scientists at the SUNY Polytechnic Institute Colleges of Nanoscale Science and Engineering’s NanoTech Complex in Albany, NY, prepare test wafers with 5nm silicon nanosheet transistors, loaded into front opening unified pods, or FOUPs, to test an industry-first process of building 5nm transistors using silicon nanosheets. Connie Zhou

    The limits of silicon have not been reached quite yet.

    Today, an IBM-led group of researchers detailed a breakthrough transistor design, one that will enable processors to continue their Moore’s Law march toward smaller, more affordable iterations. Better still? They achieved it not with carbon nanotubes or some other theoretical solution, but with an inventive new process that actually works, and should scale up to the demands of mass manufacturing within several years.

    That should also, conveniently enough, be just in time to power the self-driving cars, on-board artificial intelligence, and 5G sensors that comprise the ambitions of nearly every major tech player today—which was no sure thing.

    5nm Or Bust

    For decades, the semiconductor industry has obsessed over smallness, and for good reason. The more transistors you can squeeze into a chip, the more speed and power efficiency gains you reap, at lower cost. The famed Moore’s Law is simply the observation made by Intel co-founder Gordon Moore, in 1965, that the number of transistors on a chip had doubled every year. In 1975, Moore revised that estimate to every two years. While the industry has fallen off that pace, it still regularly finds ways to shrink.

    Doing so has required no shortage of inventiveness. The last major breakthrough came in 2009, when researchers detailed a new type of transistor design called FinFET. The first manufacturing of a FinFET transistor design in 2012 gave the industry a much-needed boost, enabling processors made on a 22-nanometer process. FinFET was a revolutionary step in its own right, and the first major shift in transistor structure in decades. Its key insight was to use a 3-D structure to control electric current, rather than the 2-D “planar” system of years past.

    “Fundamentally, FinFET structure is a single rectangle, with the three sides of the structure covered in gates,” says Mukesh Khare, vice president of semiconductor research for IBM Research. Think of the transistor as a switch; applying different voltages to the gate turns the transistor “on” or “off.” Having three sides surrounded by gates maximizes the amount of current flowing in the “on” state, for performance gains, and minimizes the amount of leakage in the “off” state, which improves efficiency.

    But just five years later, those gains already threaten to run dry. “The problem with FinFET is it’s running out of steam,” says Dan Hutcheson, CEO of VLSI Research, which focuses on semiconductor manufacturing. While FinFET underpins today’s bleeding-edge 10nm process chips, and should be sufficient for 7nm as well, the fun stops there. “Around 5nm, in order to keep the scaling and transistor working, we need to move to a different structure,” Hutcheson says.

    Enter IBM.

    Rather than FinFET’s vertical fin structure, the company—along with research partners GlobalFoundries and Samsung—has gone horizontal, layering silicon nanosheets in a way that effectively results in a fourth gate.

    A scan of IBM Research Alliance’s 5nm transistor, built using an industry-first process to stack silicon nanosheets as the device structure. IBM

    “You can imagine that FinFET is now turned sideways, and stacked on top of each other,” says Khare. For a sense of scale, in this architecture electrical signals pass through a switch that’s the width of two or three strands of DNA.

    “It’s a big development,” says Hutcheson. “If I can make the transistor smaller, I get more transistors in the same area, which means I get more compute power in the same area.” In this case, that number leaps from 20 billion transistors on a fingernail-sized 7nm chip to 30 billion on a 5nm chip of the same size. IBM pegs the gains at either 40 percent better performance at the same power, or a 75 percent reduction in power at the same performance.
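The density claim is simple arithmetic, sketched here with the numbers quoted above (variable names are illustrative):

```python
# Same fingernail-sized die, more transistors: the quoted jump
# from a 7nm process to a 5nm process.
transistors_7nm = 20e9
transistors_5nm = 30e9

density_gain = transistors_5nm / transistors_7nm - 1
print(f"~{density_gain:.0%} more transistors in the same area")  # prints ~50%

# IBM's quoted trade-off, relative to 7nm:
perf_at_same_power = 1.40        # 40 percent better performance
power_at_same_perf = 1 - 0.75    # i.e. one quarter of the power
```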

    Just in Time

    The timing couldn’t be better.

    Actual processors built off of this new structure aren’t expected to hit the market until 2019 at the earliest. But that roughly lines up with industry estimates for broader adoption of everything from self-driving cars to 5G, innovations that can’t scale without a functional 5nm process in place.

    IBM Research scientist Nicolas Loubet holds a wafer of chips with 5nm silicon nanosheet transistors manufactured using an industry-first process that can deliver 40 percent performance enhancement at fixed power, or 75 percent power savings at matched performance. Connie Zhou

    “The world’s sitting on this stuff, artificial intelligence, self-driving cars. They’re all highly dependent on more efficient computing power. That only comes from this type of technology,” says Hutcheson. “Without this, we stop.”

    Take self-driving cars as a specific example. They may work well enough today, but they also require tens of thousands of dollars worth of chips to function, an impractical added cost for a mainstream product. A 5nm process drives those expenses way down. Think, too, of always-on IoT sensors that will collect constant streams of data in a 5G world. Or more practically, think of smartphones that can last two or three days on a charge rather than one, with roughly the same-sized battery. And that’s before you hit the categories that no one’s even thought of yet.

    “The economic value that Moore’s Law generates is unquestionable. That’s where innovations such as this one come into play, to extend scaling not by traditional ways but coming up with innovative structures,” says Khare.

    Widespread adoption of many of those technologies is still years away. And success in all of them will require a confluence of both technological and regulatory progress. At least when they get there, though, the tiny chips that make it all work will be right there waiting for them.

    See the full article here.


  • richardmitnick 5:22 pm on June 4, 2017 Permalink | Reply
    Tags: WIRED

    From WIRED: “Cosmic Discoveries Fuel a Fight Over the Universe’s Beginnings” 


    Ashley Yeager

    Light from the first galaxies clears the universe. ESO/L. Calçada

    Not long after the Big Bang, all went dark. The hydrogen gas that pervaded the early universe would have snuffed out the light of the universe’s first stars and galaxies. For hundreds of millions of years, even a galaxy’s worth of stars—or unthinkably bright beacons such as those created by supermassive black holes—would have been rendered all but invisible.

    Eventually this fog burned off as high-energy ultraviolet light broke the atoms apart in a process called reionization.

    Reionization era and first stars, Caltech

    But the questions of exactly how this happened—which celestial objects powered the process and how many of them were needed—have consumed astronomers for decades.

    Now, in a series of studies, researchers have looked further into the early universe than ever before. They’ve used galaxies and dark matter as a giant cosmic lens to see some of the earliest galaxies known, illuminating how these galaxies could have dissipated the cosmic fog. In addition, an international team of astronomers has found dozens of supermassive black holes—each with the mass of millions of suns—lighting up the early universe. Another team has found evidence that supermassive black holes existed hundreds of millions of years before anyone thought possible. The new discoveries should make clear just how much black holes contributed to the reionization of the universe, even as they’ve opened up questions as to how such supermassive black holes were able to form so early in the universe’s history.

    First Light

    In the first years after the Big Bang, the universe was too hot to allow atoms to form. Protons and electrons flew about, scattering any light. Then, after about 380,000 years, they cooled enough to form hydrogen atoms, which coalesced into stars and galaxies over the next few hundred million years.

    Starlight from these galaxies would have been bright and energetic, with lots of it falling in the ultraviolet part of the spectrum. As this light flew out into the universe, it ran into more hydrogen gas. These photons would ionize the hydrogen, contributing to reionization, but as they did so, the gas snuffed out the light.

    Lucy Reading-Ikkanda/Quanta Magazine

    To find these stars, astronomers have to look for the non-ultraviolet part of their light and extrapolate from there. But this non-ultraviolet light is relatively dim and hard to see without help.

    A team led by Rachael Livermore, an astrophysicist at the University of Texas at Austin, found just the help needed in the form of a giant cosmic lens.

    Gravitational Lensing NASA/ESA

    These so-called gravitational lenses form when a galaxy cluster, filled with massive dark matter, bends space-time to focus and magnify any object on the other side of it. Livermore used this technique with images from the Hubble Space Telescope to spot extremely faint galaxies from as far back as 600 million years after the Big Bang—right in the thick of reionization.

    NASA/ESA Hubble Telescope

    In a recent paper that appeared in The Astrophysical Journal, Livermore and colleagues also calculated that if you add galaxies like these to the previously known galaxies, then stars should be able to generate enough intense ultraviolet light to reionize the universe.

    Yet there’s a catch. Astronomers doing this work have to estimate how much of a star’s ultraviolet light escaped its home galaxy (which is full of light-blocking hydrogen gas) to go out into the wider universe and contribute to reionization writ large. That estimate—called the escape fraction—creates a huge uncertainty that Livermore is quick to acknowledge.

    In addition, not everyone believes Livermore’s results. Rychard Bouwens, an astrophysicist at Leiden University in the Netherlands, argues in a paper submitted to The Astrophysical Journal that Livermore didn’t properly subtract the light from the galaxy clusters that make up the gravitational lens.


    As a result, he said, the distant galaxies aren’t as faint as Livermore and colleagues claim, and astronomers have not found enough galaxies to conclude that stars ionized the universe.

    If stars couldn’t get the job done, perhaps supermassive black holes could. Beastly in size, up to a billion times the mass of the sun, supermassive black holes devour matter. They tug it toward them and heat it up, a process that emits lots of light and creates luminous objects that we call quasars. Because quasars emit way more ionizing radiation than stars do, they could in theory reionize the universe.

    The trick is finding enough quasars to do it. In a paper posted to the scientific preprint site arxiv.org last month, astronomers working with the Subaru Telescope announced the discovery of 33 quasars that are about a 10th as bright as ones identified before.

    NAOJ/Subaru Telescope at Mauna Kea Hawaii, USA

    With such faint quasars, the astronomers should be able to calculate just how much ultraviolet light these supermassive black holes emit, said Michael Strauss, an astrophysicist at Princeton University and a member of the team.

    The researchers haven’t done the analysis yet, but they expect to publish the results in the coming months.

    The oldest of these quasars dates back to around a billion years after the Big Bang, which seems about how long it would take ordinary black holes to devour enough matter to bulk up to supermassive status.

    This is why another recent discovery [ApJ] is so puzzling. A team of researchers led by Richard Ellis, an astronomer at the European Southern Observatory, was observing a bright, star-forming galaxy seen as it was just 600 million years after the Big Bang.

    The galaxy’s spectrum—a catalog of light by wavelength—appeared to contain a signature of ionized nitrogen. It’s hard to ionize ordinary hydrogen, and even harder to ionize nitrogen: it requires higher-energy ultraviolet light than stars emit. So another strong source of ionizing radiation, possibly a supermassive black hole, had to exist at this time, Ellis said.

    One supermassive black hole at the center of an early star-forming galaxy might be an outlier. It doesn’t mean there were enough of them around to reionize the universe. So Ellis has started to look at other early galaxies. His team now has tentative evidence that supermassive black holes sat at the centers of other massive, star-forming galaxies in the early universe. Studying these objects could help clarify what reionized the universe and illuminate how supermassive black holes formed at all. “That is a very exciting possibility,” Ellis said.

    All this work is beginning to converge on a relatively straightforward explanation for what reionized the universe. The first population of young, hot stars probably started the process, then drove it forward for hundreds of millions of years. Over time, these stars died; the stars that replaced them weren’t quite so bright and hot. But by this point in cosmic history, supermassive black holes had enough time to grow and could start to take over. Researchers such as Steve Finkelstein, an astrophysicist at the University of Texas at Austin, are using the latest observational data and simulations of early galactic activity to test out the details of this scenario, such as how much stars and black holes contribute to the process at different times.

    His work—and all work involving the universe’s first billion years—will get a boost in the coming years after the 2018 launch of the James Webb Space Telescope, Hubble’s successor, which has been explicitly designed to find the first objects in the universe.

    NASA/ESA/CSA Webb Telescope annotated

    Its findings will probably provoke many more questions, too.

    See the full article here.


  • richardmitnick 3:26 pm on May 19, 2017 Permalink | Reply
    Tags: Chemists Are One Step Closer to Manipulating All Matter, Controlling a single molecule’s behavior, David Wineland, WIRED

    From WIRED: “Chemists Are One Step Closer to Manipulating All Matter” 



    Date of Publication: 05.11.17.
    Nick Stockton

    Getty Images

    For all their periodic tables, styrofoam ball-and-pencil models, and mouth-garbling vocabulary, chemists really don’t know jack about molecules.

    Part of the problem is they can’t really control what molecules do. Molecules spin, vibrate, and trade electrons, all of which affect the way they react with other molecules. Of course, scientists know enough about those scaled-up reactions to do things like make concrete, refine gasoline, and brew beer. But if you’re trying to use individual molecules as tools, or manipulate them so precisely that you can snap them together like Lego pieces, you need better control. Scientists aren’t all the way there yet, but recently scientists at the National Institute of Standards and Technology solved an early challenge: controlling a single molecule’s behavior.

    At the very basic level, controlling a molecule would let scientists learn more about it. “This is a long-standing problem,” says Dietrich Leibfried, a physicist with NIST’s Ion Storage Group in Boulder, Colorado. “Everything around us is made out of molecules, but it’s hard to precisely find out about them.” And that would have practical applications. For instance, NIST keeps tables of molecular properties that astrophysicists consult when they’re reading the spectral signatures of faraway stars and exoplanets. Filling in those blanks would support predictions of whether some exoplanet can support life. With enough control, scientists won’t just get a better look at molecules—they’ll manipulate matter.

    But for now, they are still experimenting. Scientists know how to control atoms using cold vacuum and lasers—so at NIST, scientists’ limited molecular control builds on that knowledge. Their research, published yesterday in Nature, describes their experiment: They begin with a vacuum chamber, a 3-inch box containing a tiny electrode, which itself holds a single positively charged calcium atomic ion. Then come the molecules: Ionized hydrogen gas, which the scientists leak into the vacuum chamber until a single H2 reacts with the calcium atom.

    Now the ionized atom and the ionized molecule are trapped together. But they’re repelled by their positive charges, and the force of the repulsion sends them vibrating—like two magnets when you bring them close. They’re also spinning, like a lopsided barbell hurled into the air.

    So the scientists set out to freeze the pair in place, again calling on their skills of atomic control. First they fire a low-energy laser at the calcium atom, cooling it and stopping its motion—and because it’s coupled to the hydrogen molecule, the hydrogen stops vibrating as well. That’s the easy part. The calcium hydride is still rotating. “That rotation, the spinning along the horizontal or vertical plane, is the hardest thing to control,” says Leibfried. Imagine trying to stick Legos together if they were spinning independently. Leibfried and his group do know how to stop, and even alter, the spinning. They figured that out last year using lasers tuned to specific frequencies.

    All that rigamarole is worthless if you don’t know which way the molecule is pointing, though. And if you want to check in on the molecule—by firing another laser—you set it into random motion once again. So instead the NIST scientists fire a teeny tiny laser at the calcium atom, causing it to wiggle. Because it is connected to the hydrogen molecule, it picks up on the molecule’s state. And Leibfried and his team can “read” that state by examining the way the laser’s light scatters when it encounters the calcium atom. The whole intricate choreography between them lasts about a millisecond, and at the end they can see if the molecule behaved as it was directed.

    So what’s the point of all that? If you can control with certainty the orientation of a molecule, it’s one step closer to sticking them together exactly how you want—no more tossing compounds in a beaker and praying for the right kind of bubbles. Or, to return to the Lego analogy, you can understand—and manipulate—how molecules stick together.

    This discovery builds on work done by Leibfried’s mentor, Nobel winner David Wineland, who did the foundational atomic-control work behind atomic clocks based on single trapped ions. But unlike atomic clocks—which changed the scale at which scientists could measure time, and led to breakthroughs like GPS—this process isn’t ready to revolutionize chemistry just yet. Scientists need to fine-tune their control, and have yet to prove the concept on molecules besides hydrogen. Having just one molecule would be like trying to build a city from Legos using only 2×4 bricks.

    See the full article here.


  • richardmitnick 2:48 pm on April 30, 2017 Permalink | Reply
    Tags: Angela Olinto, EUSO-SPB-Extreme Universe Space Observatory Super Pressure Balloon, Ultrahigh-energy cosmic rays, WIRED

    From WIRED: Women in STEM - “A Cosmic-Ray Hunter Closes in on Super-Energetic Particles” - Angela Olinto

    Wired logo


    Angela Olinto in Wanaka, New Zealand, in March. Alpine Images for Quanta Magazine

    On April 25, at 10:50 am local time, a white helium balloon ascended from Wanaka, New Zealand, and lifted Angela Olinto’s hopes into the stratosphere. The football stadium-size NASA balloon, now floating 20 miles above the Earth, carries a one-ton detector that Olinto helped design and see off the ground. Every moonless night for the next few months, it will peer out at the dark curve of the Earth, hunting for the fluorescent streaks of mystery particles called “ultrahigh-energy cosmic rays” crashing into the sky. The Extreme Universe Space Observatory Super Pressure Balloon (EUSO-SPB) experiment will be the first ever to record the ultraviolet light from these rare events by looking down at the atmosphere instead of up. The wider field of view will allow it to detect the streaks at a faster rate than previous, ground-based experiments, which Olinto hopes will be the key to finally figuring out the particles’ origin.

    Olinto, the leader of the seven-country EUSO-SPB experiment, is a professor of astrophysics at the University of Chicago. She grew up in Brazil and recalls that during her “beach days in Rio” she often wondered about nature. Over the 40 years since she was 16, Olinto said, she has remained captivated by the combined power of mathematics and experiments to explain the universe. “Many people think of physics as hard; I find it so elegant, and so simple compared to literature, which is really amazing, but it’s so varied that it’s infinite,” she said. “We have four forces of nature, and everything can be done mathematically. Nobody’s opinions matter, which I like very much!”

    Olinto has spent the last 22 years theorizing about ultrahigh-energy cosmic rays. Composed of single protons or heavier atomic nuclei, they pack within quantum proportions as much energy as baseballs or bowling balls, and hurtle through space many millions of times more energetically than particles at the Large Hadron Collider, the world’s most powerful accelerator. “They’re so energetic that theorists like me have a hard time coming up with something in nature that could reach those energies,” Olinto said. “If we didn’t observe these cosmic rays, we wouldn’t believe they actually would be produced.”
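Those comparisons can be made concrete with a quick unit conversion. This sketch assumes the canonical ~10^20 eV upper end for these particles and 6.5 TeV per LHC proton; neither specific figure appears in the article itself:

```python
import math

# Energy of a single ultrahigh-energy cosmic ray, in everyday units.
EV_TO_JOULES = 1.602e-19
cosmic_ray_ev = 1e20          # assumed upper end for these particles
lhc_proton_ev = 6.5e12        # assumed LHC per-proton energy (2017 run)

energy_joules = cosmic_ray_ev * EV_TO_JOULES   # ~16 J packed into one nucleus
lhc_ratio = cosmic_ray_ev / lhc_proton_ev      # ~15 million times an LHC proton

# 16 J is roughly a gently thrown baseball: v = sqrt(2*E/m)
baseball_kg = 0.145
speed_ms = math.sqrt(2 * energy_joules / baseball_kg)  # ~15 m/s
```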

    Olinto and her collaborators have proposed that ultrahigh-energy cosmic rays could be emitted by newly born, rapidly rotating neutron stars, called “pulsars.” She calls these “the little guys,” since their main competitors are “the big guys”: the supermassive black holes that churn at the centers of active galaxies. But no one knows which theory is right, or if it’s something else entirely. Ultrahigh-energy cosmic rays pepper Earth so sparsely and haphazardly—their paths skewed by the galaxy’s magnetic field—that they leave few clues about their origin. In recent years, a hazy “hot spot” of the particles coming from a region in the Northern sky seems to be showing up in data collected by the Telescope Array in Utah.

    Cosmic Ray Telescope Array Project at Delta, Utah by Roger J. Wendell – 08

    But this potential clue has only compounded the puzzle: Somehow, the alleged hot spot doesn’t spill over at all into the field of view of the much larger and more powerful Pierre Auger Observatory in Argentina.

    Pierre Auger Observatory in the western Mendoza Province, Argentina, near the Andes

    To find out the origin of ultrahigh-energy cosmic rays, Olinto and her colleagues need enough data to produce a map of where in the sky the particles come from—a map that can be compared with the locations of known cosmological objects. “In the cosmic ray world, the big dream is to point,” she said during an interview at a January meeting of the American Physical Society in Washington, DC.

    She sees the current balloon flight as a necessary next step. If successful, it will serve as a proof of principle for future space-based ultrahigh-energy cosmic-ray experiments, such as her proposed satellite detector, Poemma (Probe of Extreme Multi-Messenger Astrophysics).


    While in New Zealand in late March preparing for the balloon launch, Olinto received the good news from NASA that Poemma had been selected for further study.

    Olinto wants answers, and she has an ambitious timeline for getting them. An edited and condensed version of our conversations in Washington and on a phone call to New Zealand follows.

    QUANTA MAGAZINE: What was your path to astrophysics and ultrahigh-energy cosmic rays?

    ANGELA OLINTO: I was really interested in the basic workings of nature: Why three families of quarks? What is the unified theory of everything? But I realized how many easier questions we have in astrophysics: that you could actually take a lifetime and go answer them. Graduate school at MIT showed me the way to astrophysics — how it can be an amazing route to many questions, including how the universe looks, how it functions, and even particle physics questions. I didn’t plan to study ultrahigh-energy cosmic rays; but every step it was, “OK, it looks promising.”

    Extreme Universe Space Observatory Super Pressure Balloon (EUSO-SPB)

    How long have you been trying to answer this particular question?

    In 1995, we had a study group at Fermilab for ultrahigh-energy cosmic rays, because the AGASA (Akeno Giant Air Shower Array) experiment was seeing these amazing events that were so energetic that the particles broke a predicted energy limit known as the “GZK cutoff.” I was studying magnetic fields at the time, and so Jim Cronin, who just passed away last year in August—he was a brilliant man, charismatic, full of energy, lovely man—he asked that I explain what we know about cosmic magnetic fields. At that time the answer was not very much, but I gave him what we did know. And because he invited me I got to learn what he was up to. And I thought, wow, this is pretty interesting.

    Later you helped plan and run Pierre Auger, an array of detectors spread across 3,000 square kilometers of Argentinian grassland. Did you actually go around and persuade farmers to let you put detectors on their land?

    Not me; it was the Argentinian team who did the amazing job of talking to everybody. The American team helped build a planetarium and a school in that area, so we did interact with them, but not directly on negotiations over land. In Argentina it was like this: You get a big fraction of folks who are very excited and part of it from the beginning. Gradually you got through the big landowners. But eventually we had a couple who were really not interested. So we had two regions in the middle of the array that were empty of the detectors for quite some time, and then we finally closed it.

    Space is much easier in that sense; it’s one instrument and no one owns the atmosphere. On the other hand, the nice thing about having all the farmers involved is that Malargüe, the city in Argentina that has had the detectors deployed, has changed completely. The students are much more connected to the world and speak English. Some are coming to the US for undergraduate and even graduate school eventually. It’s been a major transformation for a small town where nobody went to college before. So that was pretty amazing. It took a huge outreach effort and a lot of time, but this was very important, because we needed them to let us in.

    Why is space the next step?

    To go the next step on the ground—to get 30,000 square kilometers instrumented—is something I tried to do, but it’s really difficult. It’s hard enough with 3,000; it was crazy to begin with, but we did it. To get to the next order of magnitude seems really difficult. On the other hand, going to space you can see 100 times more volume of air in the same minute. And then we can increase by orders of magnitude the ability to see ultrahigh-energy cosmic rays, see where they are coming from, how they are produced, what objects can reach these kinds of energies.

    What will we learn from EUSO-SPB?

We will not have enough data to revolutionize our understanding at this point, but we will show how it can be done from space. The work we do with the balloon is really in preparation for something like Poemma, our proposed satellite experiment. We plan to have two telescopes free-flying and communicating with each other, and by recording cosmic-ray events with both of them we should also be able to reproduce the direction and composition very precisely.

    Speaking of Poemma, do you still teach a class called Cosmology for Poets?

    We don’t call it that anymore, but yes. What it entails is teaching nonscience majors what we know about the history of the universe: what we’ve learned and why we think it is the way it is, how we measure things and how our scientific understanding of the history of the universe is now pretty interesting. First, we have a story that works brilliantly, and second, we have all kinds of puzzles like dark matter and dark energy that are yet to be understood. So it gives the sense of the huge progress since I started looking at this. It’s unbelievable; in my lifetime it’s changed completely, and mostly due to amazing detections and observations.

    One thing I try to do in this course is to mix in some art. I tell them to go to a museum and choose an object or art piece that tells you something about the universe—that connects to what we talked about in class. And here my goal is to just make them dream a bit free from all the boundaries of science. In science there’s right and wrong, but in art there are no easy right and wrong answers. I want them to see if they can have a personal attachment to the story I told them. And I think art helps me do that.

    You’ve said that when you left Brazil for MIT at 21, you were suffering from a serious muscle disease called polymyositis, which also recurred in 2006. Did those experiences contribute to your drive to push the field forward?

    I think this helps me not get worked up about small stuff. There are always many reasons to give up when working on high-risk research. I see some colleagues who get worked up about things that I’m like, whatever, let’s just keep going. And I think that attitude to minimize things that are not that big has to do with being close to death. Being that close, it’s like, well, everything is positive. I’m very much a positive person and most of the time say, let’s keep pushing. I think having a question that is not answered that is well posed is a very good incentive to keep moving.

    Between the “big guys” and the “little guys”—black holes versus pulsating neutron stars—what’s your bet for which ones produce ultrahigh-energy cosmic rays?

    I think it’s 50-50 at this point—both can do it and there’s no showstopper on either side—but I root always for the underdog. It looks like ultrahigh-energy cosmic rays have a heavier composition, which helps the neutron star case, since we had heavy elements in our neutron star models from the beginning. However, it’s possible that supermassive black holes do the job, too, and basically folks just imagine that the bigger the better, so the supermassive black holes are usually a little bit ahead. It could be somewhere in the middle: intermediate-mass black holes. Or ultrahigh-energy cosmic rays could be related to other interesting phenomena, like fast radio bursts, or something that we don’t know anything about.

    When do you think we’ll know for sure?

    You know how when you climb the mountain—I rarely look at where I’m going. I look at the next two steps. I know I’m going to the top but I don’t look at the top, because it’s difficult to do small steps when the road is really long. So I don’t try to predict exactly. But I would imagine—we have a decadal survey process, so that takes quite some time, and then we have another decade—so let’s say, in the 2030s we should know the answer.

See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 11:29 pm on March 1, 2017 Permalink | Reply
    Tags: , , , WIRED   

    From Wired: “Italy’s Etna Volcano Throws Lava Bombs in Its First Big Eruption of 2017” 

    Wired logo


Erik Klemetti

    Europe’s biggest and most powerful volcano, Mount Etna, erupts sending an ash cloud across the holiday isle of Sicily.

After one of its quietest years in decades, Etna has decided to make 2017 a little more exciting. Early this week, the volcano had a moderate strombolian eruption, what the folks who monitor Etna call a “paroxysm,” that produced a lava fountain over the summit of the volcano. Strombolian eruptions (named after nearby Stromboli) are caused by gas-rich magma reaching the surface and erupting explosively. They also tend to produce lava flows at the same time, but they are less intense explosions than a Plinian eruption (like what happened at Pinatubo or St. Helens).

    Some of the images of the eruption show a stream of lava coming from the New Southeast Crater while strombolian explosions threw lava bombs hundreds of meters from the vent. The ash from this eruption did not disrupt the air traffic in or out of the airport at nearby Catania—however, past stronger eruptions have caused it to shut down.

Of course, there was a torrent of hyperbole published about this eruption. But as dramatic as this eruption looked, it was relatively benign, mainly impacting the summit area of Etna. Always be skeptical of news articles that sell any volcanic eruption as a portent of doom or massive destruction.

    Very few actually are as hazardous as breathless media outlets would suggest. Eruptions at Etna may pose a hazard to air traffic through ash emissions, and slow-moving lava flows could endanger some of the villages and homes on the lower slopes of the volcano. This has happened before, and attempts were made to divert the lava flows (with moderate success). But the lava flow jeopardizes property much more than life; the flows move so slowly that you can likely out-walk them. Etna does have some history of explosive eruptions, but in its most recent activity over the last decade, these events have been very rare.

Remember, there are a lot of webcams pointed at Etna. You can see a lot of different views (including an IR thermal camera) on the INGV webcams, while Radio7 has a variety of views and EtnaGuide has some near the summit. The next time Etna rumbles, be sure to check it out live.


  • richardmitnick 2:36 pm on January 17, 2017 Permalink | Reply
    Tags: , , It's a bad time to be a physicist, Physicists run to Silicon Valley, , WIRED   

    From WIRED: “Move Over, Coders—Physicists Will Soon Rule Silicon Valley” 

    Wired logo


Oscar Boykin. Ariel Zambelich/WIRED

    It’s a bad time to be a physicist.

At least, that’s what Oscar Boykin says. He majored in physics at the Georgia Institute of Technology and in 2002 he finished a physics PhD at UCLA. But four years ago, physicists at the Large Hadron Collider in Switzerland discovered the Higgs boson, a subatomic particle first predicted in the 1960s. As Boykin points out, everyone expected it. The Higgs didn’t mess with the theoretical models of the universe. It didn’t change anything or give physicists anything new to strive for. “Physicists are excited when there’s something wrong with physics, and we’re in a situation now where there’s not a lot that’s wrong,” he says. “It’s a disheartening place for a physicist to be in.” Plus, the pay isn’t too good.

    Boykin is no longer a physicist. He’s a Silicon Valley software engineer. And it’s a very good time to be one of those.

Boykin works at Stripe, a $9-billion startup that helps businesses accept payments online. He helps build and operate software systems that collect data from across the company’s services, and he works to predict the future of these services, including when, where, and how fraudulent transactions will come. As a physicist, he’s ideally suited to the job, which requires both extreme math and abstract thought. And yet, unlike a physicist, he’s working in a field that now offers endless challenges and possibilities. Plus, the pay is great.

If physics and software engineering were subatomic particles, Silicon Valley has turned into the place where the fields collide. Boykin works with three other physicists at Stripe. In December, when General Electric acquired the machine learning startup Wise.io, CEO Jeff Immelt boasted that he had just grabbed a company packed with physicists, most notably UC Berkeley astrophysicist Joshua Bloom. The open source machine learning software H2O, used by 70,000 data scientists across the globe, was built with help from Swiss physicist Arno Candel, who once worked at the SLAC National Accelerator Laboratory. Vijay Narayanan, Microsoft’s head of data science, is an astrophysicist, and several other physicists work under him.

    It’s not on purpose, exactly. “We didn’t go into the physics kindergarten and steal a basket of children,” says Stripe president and co-founder John Collison. “It just happened.” And it’s happening across Silicon Valley. Because structurally and technologically, the things that just about every internet company needs to do are more and more suited to the skill set of a physicist.

    The Naturals

    Of course, physicists have played a role in computer technology since its earliest days, just as they’ve played a role in so many other fields. John Mauchly, who helped design the ENIAC, one of the earliest computers, was a physicist. Dennis Ritchie, the father of the C programming language, was too.

    But this is a particularly ripe moment for physicists in computer tech, thanks to the rise of machine learning, where machines learn tasks by analyzing vast amounts of data. This new wave of data science and AI is something that suits physicists right down to their socks.

    Among other things, the industry has embraced neural networks, software that aims to mimic the structure of the human brain. But these neural networks are really just math on an enormous scale, mostly linear algebra and probability theory. Computer scientists aren’t necessarily trained in these areas, but physicists are. “The only thing that is really new to physicists is learning how to optimize these neural networks, training them, but that’s relatively straightforward,” Boykin says. “One technique is called ‘Newton’s method.’ Newton the physicist, not some other Newton.”
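The optimization Boykin mentions can be made concrete with a toy sketch. This is a minimal, hypothetical example of Newton's method applied to a one-dimensional minimization, not anything from Stripe's actual systems:

```python
# Newton's method for minimizing a smooth 1-D function.
# Each step uses the first and second derivatives to jump
# toward a point where the gradient vanishes.

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Iterate x <- x - grad(x) / hess(x) until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = grad(x) / hess(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = (x - 3)^2 + 1, with gradient 2(x - 3) and Hessian 2.
x_min = newton_minimize(lambda x: 2 * (x - 3), lambda x: 2.0, x0=0.0)
print(x_min)  # 3.0, found in a single step for a quadratic
```

For real neural networks the Hessian is far too large to compute directly, which is why practitioners rely on first-order variants, but the underlying calculus is the same.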

    Chris Bishop, who heads Microsoft’s Cambridge research lab, felt the same way thirty years ago, when deep neural networks first started to show promise in the academic world. That’s what led him from physics into machine learning. “There is something very natural about a physicist going into machine learning,” he says, “more natural than a computer scientist.”

    The Challenge Space

Ten years ago, Boykin says, so many of his old physics pals were moving into the financial world. That same flavor of mathematics was also enormously useful on Wall Street as a way of predicting where the markets would go. One key method was the Black-Scholes equation, a means of determining the value of a financial derivative. But Black-Scholes helped foment the great crash of 2008, and now, Boykin and other physicists say that far more of their colleagues are moving into data science and other kinds of computer tech.
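For readers curious what that Wall Street math looks like, the closed-form Black-Scholes price of a European call option fits in a few lines. This is a standard textbook sketch with illustrative parameter values:

```python
import math

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.

    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: annualized volatility.
    """
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    # Standard normal CDF, written via the error function
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

# A stock at 100, strike 100, one year out, 5% rates, 20% volatility:
print(round(black_scholes_call(100, 100, 1.0, 0.05, 0.2), 2))  # 10.45
```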

Earlier this decade, physicists arrived at the top tech companies to help build so-called Big Data software, systems that juggle data across hundreds or even thousands of machines. At Twitter, Boykin helped build one called Summingbird, and three guys who met in the physics department at MIT built similar software at a startup called Cloudant. Physicists know how to handle data—at MIT, Cloudant’s founders handled massive datasets from the Large Hadron Collider—and building these enormously complex systems requires its own breed of abstract thought. Then, once these systems were built, many of those physicists helped put the data they harnessed to use.

In the early days of Google, one of the key people building the massively distributed systems in the company’s engine room was Yonatan Zunger, who has a PhD in string theory from Stanford. And when Kevin Scott joined Google’s ads team, charged with grabbing data from across Google and using it to predict which ads were most likely to get the most clicks, he hired countless physicists. Unlike many computer scientists, they were suited to the very experimental nature of machine learning. “It was almost like lab science,” says Scott, now chief technology officer at LinkedIn.

Now that Big Data software is commonplace—Stripe uses an open source version of what Boykin helped build at Twitter—it’s helping machine learning models drive predictions inside so many other companies. That provides physicists with an even wider avenue into Silicon Valley. At Stripe, Boykin’s team also includes Roban Kramer (physics PhD, Columbia), Christian Anderson (physics master’s, Harvard), and Kelley Rivoire (physics bachelor’s, MIT). They come because they’re suited to the work. And they come because of the money. As Boykin says: “The salaries in tech are arguably absurd.” But they also come because there are so many hard problems to solve.

    Anderson left Harvard before getting his PhD because he came to view the field much as Boykin does—as an intellectual pursuit of diminishing returns. But that’s not the case on the internet. “Implicit in ‘the internet’ is the scope, the coverage of it,” Anderson says. “It makes opportunities much greater, but it also enriches the challenge space, the problem space. There is intellectual upside.”

    The Future

Today, physicists are moving into Silicon Valley companies. But in the years to come, a similar phenomenon will spread much further. Machine learning will change not only how the world analyzes data but how it builds software. Neural networks are already reinventing image recognition, speech recognition, machine translation, and the very nature of software interfaces. As Microsoft’s Chris Bishop says, software engineering is moving from handcrafted code based on logic to machine learning models based on probability and uncertainty. Companies like Google and Facebook are beginning to retrain their engineers in this new way of thinking. Eventually, the rest of the computing world will follow suit.

    In other words, all the physicists pushing into the realm of the Silicon Valley engineer is a sign of a much bigger change to come. Soon, all the Silicon Valley engineers will push into the realm of the physicist.


  • richardmitnick 3:00 pm on January 2, 2017 Permalink | Reply
    Tags: , Deep Within a Mountain Physicists Race to Unearth Dark Matter, , WIRED, ,   

    From WIRED: Women in STEM- “Deep Within a Mountain, Physicists Race to Unearth Dark Matter” Elena Aprile 

    Wired logo


    Joshua Sokol

Elena Aprile in her lab at Columbia University. Ben Sklar for Quanta Magazine

    In a lab buried under the Apennine Mountains of Italy, Elena Aprile, a professor of physics at Columbia University, is racing to unearth what would be one of the biggest discoveries in physics.

    There is five times more dark matter in the Universe than “normal” matter, the atoms and molecules that make up all we know. Yet, it is still unknown what this dominant dark component actually is.

    Today, an international collaboration of scientists inaugurated the new XENON1T instrument designed to search for dark matter with unprecedented sensitivity, at the Gran Sasso Underground Laboratory of INFN in Italy.

Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    She has not yet succeeded, even after more than a decade of work. Then again, nobody else has, either.

    Aprile leads the XENON dark matter experiment, one of several competing efforts to detect a particle responsible for the astrophysical peculiarities that are collectively attributed to dark matter.


    These include stars that rotate around the cores of galaxies as if pulled by invisible mass, excessive warping of space around large galaxy clusters, and the leopard-print pattern of hot and cold spots in the early universe.

    For decades, the most popular explanation for such phenomena was that dark matter is made of as-yet undiscovered weakly interacting massive particles, known as WIMPs. These WIMPs would only rarely leave an imprint on the more familiar everyday matter.

    That paradigm has recently been under fire. The Large Hadron Collider located at the CERN laboratory near Geneva has not yet found anything to support the existence of WIMPs. Other particles, less studied, could also do the trick. Dark matter’s astrophysical effects might even be caused by modifications of gravity, with no need for the missing stuff at all.

The most stringent WIMP searches have been done using Aprile’s strategy: Pour plenty of liquid xenon—a noble element like helium or neon, but heavier—into a vat. Shield it from cosmic rays, which would inundate the detector with spurious signals. Then wait for a passing WIMP to bang into a xenon atom’s nucleus. Once it does, capture the tiny flash of light that should result.

    These experiments use progressively larger tanks of liquid xenon that the researchers believe should be able to catch the occasional passing WIMP. Each successive search without a discovery shows that WIMPs, if they exist, must be lighter or less prone to leave a mark on normal matter than had been assumed.

    In recent years, Aprile’s team has vied with two close competitors for the title of Most-thorough WIMP Search: LUX, the Large Underground Xenon experiment, a U.S.-based group that split from her team in 2007, and PandaX, the Particle and Astrophysical Xenon experiment, a Chinese group that broke away in 2009. Both collaborators-turned-rivals also use liquid-xenon detectors and similar technology. Soon, though, Aprile expects her team to be firmly on top: The third-generation XENON experiment—larger than before, with three and a half metric tons of xenon to catch passing WIMPs—has been running since the spring, and is now taking data. A final upgrade is planned for the early 2020s.

    The game can’t go on forever, though. The scientists will eventually hit astrophysical bedrock: The experiments will become sensitive enough to pick up neutrinos from space, flooding the particle detectors with noise. If WIMPs haven’t been detected by that point, Aprile plans to stop and rethink where else to look.

    Aprile splits her time between her native Italy and New York City, where in 1986 she became the first female professor of physics at Columbia University. Quanta caught up with her on a Saturday morning in her Brooklyn high-rise apartment that faces toward the Statue of Liberty. An edited and condensed version of the interview follows.

    QUANTA MAGAZINE: How closely do you follow the theoretical back and forth about the nature of dark matter?

    ELENA APRILE: For me, driving the technology, driving the detector, making it the best detector is what makes it exciting. The point right now is that in a couple of years, maybe four or five in total, we will definitely say there is no WIMP or we will discover something.

    I don’t care much about what the theorists say. I go on with my experiment. The idea of the WIMP is clearly today still quite ideal. Nobody could tell you “No, you’re crazy looking for a WIMP.”

    What do you imagine will happen over the next few years in this search?

    If we find a signal, we have to go even faster and build a larger scale detector which we are planning already—in order to have a chance to see more of them, and have a chance to build up the statistics. If we see nothing after a year or two, the same story.

    The plan for the collaboration, for me and how I drive these 130 people, is very clear for the next four or five years. But beyond that, we will go almost to the level that we start really to see neutrinos. If we end up being lucky—if a supernova goes off next to us and we see neutrinos—we will not have found dark matter, but still detect something very exciting.

    How did you get started with this xenon detector technology?

    I started my career as a summer student at CERN. Carlo Rubbia was a professor at Harvard and also a physicist at CERN. He proposed a liquid-argon TPC—time projection chamber. This was hugely exciting as a detector because you can measure precisely the energy of a particle, and you can measure the location of the interaction, and you can do tracking. So, that was my first experience, to build the first liquid-argon ‘baby’ detector—1977, yes, that’s when it started. And then I went to Harvard, and I did my early work with Rubbia on liquid argon. That was the seed that led eventually to the monstrous, huge liquid-argon detector called ICARUS.

    Later, I left Rubbia and I accepted the position of assistant professor here at Columbia. I got interested in continuing with liquid-argon detectors, but for neutrino detection from submarines. I got my first grant from DARPA [the Defense Advanced Research Projects Agency]. They didn’t give a damn about supernova neutrinos, but they wanted to see neutrinos from the [nuclear] Russian submarines. And then we had Supernova 1987A, and I made a proposal to fly a liquid-argon telescope on a high-altitude balloon to detect the gamma rays from this supernova.

    I studied a lot—the properties of argon, krypton, xenon—and then it became clear that xenon is a much more promising material for gamma-ray detection. So I turned my attention to liquid xenon for gamma-ray astrophysics.

    How did that swerve into a search for dark matter?

    I had this idea that this detector I built for gamma-ray astrophysics could have been, in another version, ideal to look for dark matter. I said to myself: “Maybe it’s worth going into this field. The question is hot, and maybe we have the right tool to finally make some progress.”

It’s atypical that the NSF [National Science Foundation], for someone new like me, will fund the proposal right away. It was the strength of what I had done all those years with a liquid-xenon TPC for gamma-ray astrophysics. They realized that this woman can do it. Not because I’m very bold and I proposed a very aggressive program—which of course is typical of me—but I think it was the work that we did for another purpose which gave the strength to the XENON program, which I proposed in 2001 to the NSF.

    What was it like to go from launching high-altitude balloons to working underground?

    We had quite a few balloon campaigns. It’s something that I would do again, and I didn’t appreciate it then. You get your detector ready, you sit it on this gondola. At some point you are ready, but you can’t do anything because every morning you go and you wait for the weather guy to tell you if it’s the right moment to fly. In that scenario you are a slave to something bigger than you, which you can’t do anything about. You go on the launch pad, you look at the guy measuring, checking everything, and he says “No.”

    Underground, I guess, there is no such major thing holding you from operating your detector. But there are still, in the back of your mind, thoughts about the seismic resilience of what you designed and what you built.

    In a 2011 interview with The New York Times about women at the top of their scientific fields, you described the life of a scientist as tough, competitive and constantly exposed. You suggested that if one of your daughters aspired to be a scientist you would want her to be made of titanium. What did you mean by that?

    Maybe I shouldn’t demand this of every woman in science or physics. It’s true that it might not be fair to ask that everyone is made of titanium. But we must face it—in building or running this new experiment—there is going to be a lot of pressure sometimes. It’s on every student, every postdoc, every one of us: Try to go fast and get the results, and work day and night if you want to get there. You can go on medical leave or disability, but the WIMP is not waiting for you. Somebody else is going to get it, right? This is what I mean when I say you have to be strong.

    Going after something like this, it’s not a 9-to-5 job. I wouldn’t discourage anyone at the beginning to try. But then once you start, you cannot just pretend that this is just a normal job. This is not a normal job. It’s not a job. It’s a quest.

    In another interview, with the Italian newspaper La Repubblica, you discussed having a brilliant but demanding mentor in Carlo Rubbia, who won the Nobel Prize for Physics in 1984. What was that relationship like?

    It made me of titanium, probably. You have to imagine this 23-year-old young woman from Italy ending up at CERN as a summer student in the group of this guy. Even today, I would still be scared if I were that person. Carlo exudes confidence. I was just intimidated.

    He would keep pushing you beyond the state that is even possible: “It’s all about the science; it’s all about the goal. How the hell you get there I don’t care: If you’re not sleeping, if you’re not eating, if you don’t have time to sleep with your husband for a month, who cares? You have a baby to feed? Find some way.” Since I survived that period I knew that I was made a bit of titanium, let’s put it that way. I did learn to contain my tears. This is a person you don’t want to show weakness to.

    Now, 30 years after going off to start your own lab, how does the experience of having worked with him inform the scientist you are today, the leader of XENON?

    For a long time, he was still involved in his liquid-argon effort. He would still tell me, “What are you doing with xenon; you have to turn to argon.” It has taken me many years to get over this Rubbia fear, for many reasons, probably—even if I don’t admit it. But now I feel very strong. I can face him and say: “Hey, your liquid-argon detector isn’t working. Mine is working.”

    I decided I want to be a more practical person. Most guys are naive. All these guys are naive. A lot of things he did and does are exceptional, yes, but building a successful experiment is not something you do alone. This is a team effort and you must be able to work well with your team. Alone, I wouldn’t get anywhere. Everybody counts. It doesn’t matter that we build a beautiful machine: I don’t believe in machines. We are going to get this damn thing out of it. We’re going to get the most out of the thing that we built with our brains, with the brains of our students and postdocs who really look at this data. We want to respect each one of them.


  • richardmitnick 12:02 pm on December 14, 2016 Permalink | Reply
    Tags: , Mount St. Helens, WIRED   

    From WIRED: “A Swarm of Earthquakes Shakes Mount St. Helens” 

    Wired logo


    Erik Klemetti

An aerial view of Mount St. Helens in Washington State from May 2009. Getty Images

    Mount St. Helens is keeping up its unsettled 2016, this time with another small earthquake swarm. The USGS detected over 120 earthquakes over the last few days, all occurring 2-4 kilometers (1-2 miles) beneath the volcano and all very small (less than M1). These earthquakes, like the ones that happened earlier this year, are likely caused by magma moving or faults adjusting as pressure changes within the magmatic system underneath Mount St. Helens. It doesn’t change the status of the volcano: It’s active, taking what will likely be a brief rest before its next eruption. That could still be years from now.


  • richardmitnick 8:25 am on October 3, 2016 Permalink | Reply
    Tags: A New Generation of Astronomers Is on the Hunt for the Next Earth, , EXPRES, WIRED   

    From WIRED: “A New Generation of Astronomers Is on the Hunt for the Next Earth” 

    Wired logo


    Sarah Scoles

An artist’s rendering of the planet Proxima b orbiting the red dwarf star Proxima Centauri. M. Kornmesser/ESO

About a month ago, astronomers announced they had found a new exoplanet—this one, orbiting in the habitable zone of the nearest star to Earth. Proxima b is exciting because it’s nearby, and someday someone might send a space probe to it. Plus, it has a mass close to Earth’s—making it more likely to be livable.

    But Proxima b is also notable because scientists know its mass at all. Many of the 44 potentially habitable exoplanets have been found by Kepler-style transit searches, which watch and wait for planets to pass in front of stars and eclipse some of their light.

Planet transit. NASA/Ames

    But that only gives you a measure of a planet’s radius. Proxima b popped out of a different technique, one that, for the most part, hasn’t been sensitive enough to see planets like Earth. And advances in that technique—including a new instrument called EXPRES—could improve detection enough for scientists to find and weigh lots of other Earth-mass planets.

    That exoplanet-finding method, called a radial velocity search, works by detecting the seesaw of a star pulled around by its planet’s gravity. The back-and-forth movement in each orbit takes the star ever so slightly toward us, and then ever so slightly away from us. Rinse and repeat, at regular intervals. When the star moves in our direction, its light waves appear a little squished—bluer. When it moves away, the waves appear stretched—redder.

    Transits tell us a planet’s radius. Radial velocities tell us a planet’s mass. But a planet’s overall density, and so its composition, only comes from their powers combined. If scientists only knew how wide you were and not how much you weighed, they wouldn’t know if you were gassy or rocky. The same is true of planets. So being able to detect Earth-sized planets with the radial velocity method will help scientists to figure out if they actually are Earth-like. That’s an express goal of EXPRES, and the Yale lab led by astronomer Debra Fischer that is developing it.
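The "powers combined" step is simple arithmetic: bulk density is the mass divided by the volume the radius implies. A minimal sketch, using Earth's values as a sanity check (the numbers are illustrative only):

```python
import math

def bulk_density(mass_kg, radius_m):
    """Mean density of a planet from its mass and radius, in kg/m^3."""
    volume = (4.0 / 3.0) * math.pi * radius_m ** 3
    return mass_kg / volume

# Earth: ~5.97e24 kg (the kind of mass radial velocities give you)
# and ~6.371e6 m radius (the kind of radius transits give you)
rho = bulk_density(5.97e24, 6.371e6)
print(round(rho))  # roughly 5500 kg/m^3: rocky, not gassy
```

A gas-dominated world would come out closer to 1000 kg/m^3, which is why having both measurements, not just one, settles the rocky-or-gassy question.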


    The Hardware

EXPRES is a spectrograph, a device that can measure the regular, repeating shifts from radial velocity changes. Spectrographs split incoming light up by wavelength—like a prism, if prisms provided lots of data. The previous state-of-the-astronomical-art, an instrument called HARPS, was installed on a 3.6-meter telescope in Chile by the University of Geneva in Switzerland in 2003. But HARPS doesn’t comb the star’s spectrum finely enough to see the littlest, lice-like planets. And it doesn’t adequately account for stars’ natural noise.


    ESO 3.6m telescope & HARPS at La Silla, Chile

    “We’re used to the idea that exoplanet research is this booming industry,” Fischer says. But she begs to differ. For the past five years, she says, “our ability to detect planets has absolutely flattened out.” Proxima b, a planet 1.3 times the mass of Earth, is the smallest-mass planet yet found with the radial velocity technique. But scientists only found it because Proxima is so close (cosmically speaking) to us, as well as very close to its low-mass star. Because of those proximities, the back-and-forth showed up stronger than it would have in a different system.

    Fischer (and every other exoplanet scientist) wants to find more worlds like that. But current instruments, including HARPS, can only pick up velocities of about 1 meter per second—10 times higher than Earth’s 10 centimeter per second pull on the Sun. “The precision of our instruments isn’t good enough,” says Fischer. “That’s what EXPRES is setting out to try to change.”
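    The "10 centimeter per second" figure above falls out of Kepler's laws: for a circular, edge-on orbit, the star's reflex velocity depends only on the orbital period and the two masses. A back-of-the-envelope sketch (standard constants, not an EXPRES calculation):

    ```python
    # Star's radial velocity semi-amplitude for a circular, edge-on orbit:
    #   K = (2*pi*G / P)**(1/3) * m_planet / (M_star + m_planet)**(2/3)
    import math

    G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
    YEAR = 3.156e7      # Earth's orbital period, s
    M_SUN = 1.989e30    # kg
    M_EARTH = 5.972e24  # kg

    def reflex_amplitude(period_s, m_planet_kg, m_star_kg):
        """Stellar radial velocity semi-amplitude in m/s."""
        return ((2 * math.pi * G / period_s) ** (1 / 3)
                * m_planet_kg / (m_star_kg + m_planet_kg) ** (2 / 3))

    k_earth = reflex_amplitude(YEAR, M_EARTH, M_SUN)
    print(f"Earth tugs the Sun at about {k_earth * 100:.0f} cm/s")  # ~9 cm/s
    ```

    That roughly 9 cm/s tug is an order of magnitude below what HARPS-class instruments can resolve, which is exactly the gap EXPRES aims to close.
    
    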

    Algorithmic advances

    The team has changed the physical technology—the hardware—to make EXPRES more precise. But the EXPRES team also innovates on the software side, using simulations to inform hardware development and subtracting out intractable noise.

    Here’s the problem: The red- and blueshifts in starlight don’t come only from gentle planetary prods.

    Stars are not calm beasts. Their atmospheres roil, boil, and send out huge plumes of particles and radiation. “It’s sort of like a fountain going off in every direction,” says Fischer. That movement—along with the stars’ wholesale motion—shifts the light. And scientists have to disentangle which shifts come from the fountain and which from the planetary tug-of-war.

    Planets shift the whole spectrum—from low-energy light to high-energy—at once, by the same amount. Atmospheric activity, on the other hand, causes different shifts in different parts of a star’s spectrum.

    Think about ultraviolet images of the Sun compared to infrared ones, says Fischer. In UV light, huge loops swing out from the solar atmosphere—they’re moving fast and shine much brighter than the Sun’s disk. Their velocity shifts show up strong. But switch to an IR picture of that same time period, and where the loops popped out, you’ll see only little surface spots. Because of their smallness and their slowness, they don’t really reveal themselves.

    That difference has let the team sift the global from the local. “As soon as you tell me there’s a difference, you’ve opened the door a crack,” says Fischer. “Now, we get to kick in the door.”

    EXPRES will ultra-finely split the light according to wavelength. And then, before the team uses wavelength to calculate velocity, they find the “stellar jitter” and sift it out. And by “they,” that mostly means astrostatistician Jessi Cisewski, who developed the sort-and-separate algorithms.
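    The article only sketches the idea behind those algorithms: a planet shifts every chunk of the spectrum by the same velocity, while activity shifts chunks unevenly. A toy separation in that spirit (not Cisewski's actual method, and with invented measurements) estimates the common shift across chunks and treats the per-chunk residuals as jitter:

    ```python
    # Toy sort-and-separate: the median velocity across wavelength chunks
    # stands in for the achromatic (planetary) shift; chunk-by-chunk
    # deviations from it stand in for wavelength-dependent stellar jitter.
    from statistics import median

    def split_shift(chunk_velocities_ms):
        """Return (common_shift, per-chunk jitter) from per-chunk velocities."""
        common = median(chunk_velocities_ms)
        jitter = [v - common for v in chunk_velocities_ms]
        return common, jitter

    # Hypothetical measurements: a ~0.1 m/s planetary pull everywhere,
    # plus activity that inflates a few chunks.
    chunks = [0.10, 0.11, 0.10, 0.45, 0.09, 0.38]
    common, jitter = split_shift(chunks)
    ```

    The median is used here because it ignores the activity-inflated outlier chunks; the real pipeline has to model the jitter far more carefully than this.
    
    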

    Road tripping

    The EXPRES team hopes to deliver the instrument to its permanent home—the Discovery Channel Telescope at Lowell Observatory in Flagstaff—in May of next year, and begin work by that September. They will drive it to Arizona in a U-Haul truck, hopefully with some thematic playlists blaring in the background.

    Discovery Channel Telescope at Lowell Observatory, Happy Jack AZ, USA

    The Discovery Channel Telescope is a good fit for EXPRES because it has active optics that autocorrect for distortions. It’s also a marsupial: It has a pouch for five different instruments and can switch between them in just 60 seconds. That way, even if there’s just a little bit of open time on the telescope, operators can switch to the EXPRES pouch at the end of someone else’s project, and only waste a minute. EXPRES will get data on the regular, which is key to seeing the periodic signals from planets.

    They’ll use that time to do the 100 Earths Project, looking at a few hundred well-studied stars all over the sky to find 100 Earth-ish-mass planets, creating a catalog big enough to actually draw conclusions.

    The EXPRES team is, of course, not the only group hoping to make it big with small-planet measurements. The closest comparable instrument will come from the same Geneva scientists who developed HARPS. This project—called ESPRESSO—will arrive at the Very Large Telescope in Chile around the same time EXPRES gets to Arizona.

    ESO/ESPRESSO on the VLT

    “The planet Proxima Centauri b, which was recently discovered with HARPS, is a foretaste of what will be possible with ESPRESSO,” says Francesco Pepe, who leads ESPRESSO’s development. He says the hard- and software will also let them look into the atmospheres of other worlds from the surface of our own, exposing “the ‘inner nature’ of exoplanets.”

    In a 2013 presentation, Pepe recalled a time 10 years before, when HARPS first put eyes on the sky and scientists thought they had butted up against technological and stellar limits. Radial-velocity searches had gotten as good as they could get, people thought (silly people). “Today, we know that reality is different, fortunately,” he said.

    Now, they will just have to see who makes that reality reality first. Regardless, the instrument name will begin with an E.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
