Tagged: SA

  • richardmitnick 3:51 pm on July 12, 2017 Permalink | Reply
    Tags: Many deep underground experiments, SA

    From SA: “Physicists Go Deep in Search of Dark Matter” 

    Scientific American

    July 11, 2017
    Sarah Scoles

    A laboratory buried nearly a mile beneath South Dakota is at the forefront of a global push for subterranean science.

    A worker gazes into the darkness of the Sanford Underground Research Facility’s “4850 level,” a cavern nearly a mile deep in the Homestake mine that houses state-of-the-art physics experiments. Credit: Sarah Scoles

    The elevator that lowers them 4,850 feet down a mine shaft to a subterranean physics lab isn’t called an elevator, the physicists tell me. It’s called The Cage. It descends at precisely 7:30 A.M.—the same time it leaves the surface every day—and doesn’t wait around for stragglers.

    I show up on time, and prepare to board with a group of scientists. We look identical: in coveralls blinged out with reflective tape, steel-toed boots, an emergency breathing mask and a lamp that clips to the belt and loops over the shoulder.

    An operator opens the big yellow door, directs us inside and then closes The Cage. Soon it begins bumping down at 500 feet per minute. The operator’s headlamp provides the only light, tracing along the timber that lines the shaft. We descend for 10 minutes, silently imagining the weight of the world above us increasing. Water trickling down the shaft’s walls provides an unsettling sound track.

    This place—the Sanford Underground Research Facility (SURF) in Lead, S.D.—hosts experiments that can only be conducted deep under Earth’s surface. Entombed beneath the Black Hills by thousands of feet of solid rock, these experiments are shielded from much of the background radiation that bathes the planet’s surface. Here scientists can more easily detect various elusive cosmic messengers that would otherwise be swamped by the sound and fury at the surface—neutrinos that stream from our sun and from distant exploding stars or other hypothetical particles thought to make up the mysterious dark matter that acts as a hidden hand guiding the growth of galaxies. Such particles are so dim that they’re drowned out aboveground: Looking for them there is a bit like looking for a spotlight shining from the sun’s surface. But these are the very particles scientists must study to understand how our universe came to be. And so, from the depths of Earth where even the very closest star does not shine, they are glimpsing some of the most ancient, distant and cataclysmic aspects of the cosmos.

    This place was not always science-centric: For more than 100 years its labyrinth of deep chambers and drippy, dirt-floored tunnels was a gold mine called Homestake. Today, stripped of much of its precious ore, the facility has become a figurative gold mine for researchers as the U.S.’s premier subterranean lab. This fall SURF will debut a new experiment at the frontiers of physics: CASPAR, which mimics the conditions at the cores of stars where atoms of hydrogen and other light elements fuse to release energy, forming as a by-product the more substantial elements required for building asteroids, planets, mines and mammals. This year physicists are also starting to build equipment for an experiment called LUX–ZEPLIN (LZ), which will try to detect particles of dark matter as soon as 2020.

    LUX-ZEPLIN project at SURF

    It is all part of a trend unfolding around (as well as within) the globe, as scientists construct or repurpose buried infrastructure in places like Minnesota, Japan, Italy, China and Finland to peer deep into the cosmos from deep underground, seeking to learn why the universe is the way it is—and maybe how humans got here at all.

    Inside The Cage, the riders have leaned their heads back against the walls, eyes closed for a quiet moment before work. They look up as the elevator lurches to a stop and the door opens onto a rounded, rocky hallway, covered in netting to protect against rock slides and cave-ins. The light is yellow, with a spectrum not unlike the sun’s.

    “Just another day in paradise,” one of the passengers says as the operator releases us into this alien environment. We walk away from The Cage, our only conduit to the surface, and toward the strange science that—like extreme subterranean organisms that survive without sunlight—can only happen here.

    Cosmic Messengers in a Mine

    En route to our first destination, the LZ dark-matter experiment, we walk through a section of the mine called the Davis Lab.

    Its name comes from the late physicist Ray Davis, who visited the town of Lead in the 1960s with a science experiment in mind. Back then Lead and next-door Deadwood looked much like they look now, with one-floor casinos and a bar bearing a sign that reads “Historic Site Saloon No. 10 Where Wild Bill Was Shot.” Davis had asked the owners of the Homestake Mine if he could use a small slice of that vast space to search for solar neutrinos.

    Neutrinos are nearly massless particles with no electrical charge. They move almost as fast as light itself. They are barely subject to the effects of gravity and are immune to electromagnetism. In fact, they hardly interact with anything at all—a neutrino might just zip straight through the atoms of any corporeal object in the universe in the way a motorcycle can split lanes straight through traffic. Physicists and astronomers love neutrinos because their cosmic shyness keeps them pristine. Each carries imprints, like birthmarks, from the explosions and radioactive decays that unleashed them on the cosmos. By studying them, scientists can learn about the inner workings of supernovae, the first moments after the big bang, and the seething hearts of stars—including our sun, which is what Davis wanted to investigate. By the 1960s theorists had predicted how many neutrinos the sun’s fusion should produce, but no one had yet detected those solar neutrinos in the physical world.

    The mining company decided to let Davis try to become the first person to do so.

    Toiling away on Homestake’s “4850 level”—the “floor” 4,850 feet below the surface—Davis built a neutrino detector that became operational in 1967.

    Sanford Underground levels

    Over the course of the next quarter century he extracted what he came for: actual neutrinos from the sun, not just theoretical ones on paper. As the first person to directly detect solar neutrinos—and so confirm that the sun shines by nuclear fusion—Davis shared the 2002 Nobel Prize in Physics. He was one of the first to show that, sometimes, to best connect with deep space, humans have to travel farther from it, deep inside the planet itself.

    During the initial decades of the Davis experiment, the Homestake Mine continued sending a steady stream of gold to the surface, ultimately producing nearly three million pounds of the precious metal during its lifetime—the most of any mine in the Western Hemisphere. But in 2002 when the price of an ounce dropped too low for the mine to turn a profit, Barrick Gold Corp. shut it down and later donated the facility to the State of South Dakota.

    The state—with funding from billionaire T. Denny Sanford and the U.S. Department of Energy—expanded on Davis’s legacy and turned the whole operation into a physics lab: today’s SURF, with the original Davis Campus at its core.

    Setting Up Shop

    As we enter the Davis Campus, we snap elastic-ankle booties over our shoes and are gifted a sticker. “It’s always sunny on the 4850,” it says. The evidence does not support this conclusion.

    Our guide, Mark Hanhardt, doesn’t have such a sticker, but he does have a Ghostbusters patch on the upper arm of his coveralls. He later refers to the dark matter that LZ will look for as “ghost particles.” He is, then, the buster to which his patch refers. He’s a jolly guy, with a smile—the eyes-and-mouth kind—always in between his beard and short haircut. An experiment-support scientist, he is also the son of a former Homestake miner named Jim Hanhardt. Jim was laid off when Homestake stopped mining but got a new belowground job when SURF took over, becoming a technical support lead in 2008. For a few years, before his father’s recent death, the two toiled together in this subterranean space—a common story around Lead. Everyone in town seems to know or share blood with someone who works in the lab, because SURF hired back many miners and contracted with local companies for blasting and construction work. Hanhardt’s daily work, then, is carrying on dual legacies—one familial, one scientific. “There’s already been one Nobel from down here,” Hanhardt says, gesturing for us to follow him down the hallway. “Maybe there will be more.”

    Hanhardt walks along the platform toward the high-ceilinged room that SURF employees are currently preparing for LUX-ZEPLIN. Most of the space belongs to an immense and empty water tank—three and a half times as tall as me, and across whose diameter four and a half of me could lie down.

    SURF LUX water tank was transported in pieces and welded together in the Davis Cavern

    Hanhardt calls it the “giant science bucket.” It once held 72,000 gallons of water and shielded an experiment called LUX, which operated from October 2014 to May 2016. At the time, LUX was the world’s most sensitive seeker of dark matter—more attuned to the universe’s most mysterious particles than any other experiment on the planet.

    Decades of observations with telescopes have hinted the universe is full of invisible matter that neither emits nor reflects light but outweighs all the visible stars, gas and galaxies combined. This dark matter has apparently shaped some of those galaxies into spirals, and may even be what made their matter glom together into galaxies in the first place. No one knows exactly what the dark matter is made of, but most physicists agree it is likely composed of at least one kind of undiscovered subatomic particle. But just as no one can say for sure what Sasquatch looks like until someone spies one on a remote camera or ensnares one in a trap, scientists can’t say what dark matter is until they capture some.

    LUX tried to do just that. During its nearly yearlong run, a 350-kilogram canister of liquid xenon sat nested like a matryoshka doll inside the giant water tank, which isolated the xenon from the intrepid background of run-of-the-mill cosmic rays that manage to penetrate even this far underground. The xenon, denser than solid aluminum, waited hopefully for hypothetical dark matter particles to tunnel through thousands of feet of earth, ending up in South Dakota after their interstellar—or even intergalactic—journeys. If a particle of dark matter struck an atom of xenon, the collision would produce a flash of light. Electrons would then spin out of the collision, making a second flash. Detectors lining the tank’s interior would pick those up and send a signal back to scientists, who could rewind the reaction to study the particles that first sparked the fireworks.
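    The two-flash signature described above is what lets xenon experiments separate candidate dark matter events from ordinary radioactivity. Here is a minimal sketch of that logic, assuming a hypothetical discrimination threshold on the S2/S1 light ratio (the real LUX analysis is far more sophisticated, and these numbers are invented for illustration):

```python
# Toy sketch of the two-flash logic: a prompt flash (S1) from the initial
# collision and a delayed flash (S2) from the freed electrons. The
# threshold below is invented for the example, not taken from LUX.
import math

def classify_event(s1_photons: float, s2_photons: float,
                   nr_band_max: float = 1.5) -> str:
    """Classify a xenon-detector event by its two flashes of light.

    A nuclear recoil (a dark matter candidate) yields relatively less S2
    light than an electron recoil from ordinary background radioactivity.
    """
    if s1_photons <= 0 or s2_photons <= 0:
        return "no valid event"
    log_ratio = math.log10(s2_photons / s1_photons)
    return ("nuclear-recoil candidate" if log_ratio < nr_band_max
            else "electron recoil")

print(classify_event(s1_photons=20, s2_photons=200))    # low S2/S1 ratio
print(classify_event(s1_photons=20, s2_photons=20000))  # high S2/S1 ratio
```

    Rewinding the reaction, as the text puts it, amounts to asking which side of that ratio an event falls on.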

    In October 2016 SURF scientists began dismantling LUX and carting its xenon, like miners, to the surface. The setup had seen nothing. Dark matter had stayed true to its name.

    To tenacious physicists, that just meant they needed a bigger, better bucket in which to collect dark matter: LUX-ZEPLIN. When it debuts in 2020, this follow-on experiment will still be the best in the world: 70 times as sensitive as its predecessor, thanks in large part to its 10 metric tons of liquid xenon—as compared with LUX the First’s puny third of a metric ton. The scientific collaboration, which involves 250 scientists from the U.S., the U.K., Portugal, Russia and South Korea, launched construction in February.
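    A quick back-of-envelope check shows how the quoted numbers fit together. Assuming (as is typical for rare-event searches, though not stated in the article) that sensitivity scales roughly with target mass times live time, the xenon mass alone accounts for most—but not all—of the 70-fold improvement:

```python
# Rough consistency check on the LZ sensitivity claim. The scaling
# assumption (reach ~ target mass x live time) is a generic rule of
# thumb for rare-event searches, not a figure from the article.
lux_xenon_tonnes = 0.35   # "a third of a metric ton"
lz_xenon_tonnes = 10.0

mass_ratio = lz_xenon_tonnes / lux_xenon_tonnes
print(f"Mass ratio alone: {mass_ratio:.0f}x")  # ~29x
print(f"Quoted gain is 70x, so the remaining factor of "
      f"{70 / mass_ratio:.1f} must come from lower backgrounds "
      f"and longer running")
```

    In other words, a bigger bucket helps, but a cleaner and more patient bucket helps too.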

    Hanhardt sticks his head inside the silvery cylinder of the empty water tank and whispers “Helloooo.” The tiny sound seems to echo almost endlessly, bouncing on the tank walls and throwing itself back at us as evidence of his existence.

    Deep Physics

    SURF occupies one of the world’s deepest scientific spaces, more than twice as far down as the Soudan Underground Laboratory in Minnesota, which is in a former iron mine.

    Soudan Underground Laboratory in Minnesota. Credit: Alamy

    The Super-Kamiokande lab, which focuses on neutrinos like Davis did, occupies the Mozumi zinc mine in Japan, 3,300 feet underground.

    Super-Kamiokande experiment, located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan

    The deepest physics facility in the world, though, is China’s Jinping lab in Sichuan, which takes advantage of the tunnels beneath a hydroelectric dam.


    It hosts a dark matter experiment called PandaX as well as neutrino research.


    Using existing infrastructure, as these labs do, means scientists can focus on building their experiments instead of blasting rock. And it means they can rely on local workers who already know how to maintain the snaking caverns that might otherwise flood, collapse or fill with poisonous gases. Italy was the first country to complete a belowground lab built for the express purpose of doing research: Gran Sasso. It took 30 years.

    Laboratori Nazionali del Gran Sasso, located in the Abruzzo region of central Italy

    Each of these far-flung facilities is racing to be the first to make breakthrough discoveries about elusive dark matter and ghostly neutrinos. But for the best science to emerge, the facilities need one another—and one another’s data—to be better, faster and stronger than any could manage alone. Together, they form an ecosystem that supports science that can’t be done on the surface.

    A Pint-Size Star Is Born

    SURF, since its genesis, has been expanding beyond the Davis Campus to other parts of the mine—of which there are plenty. The new “campus” is so far away that to visit it we take a railway cart, rumbling down darkened tracks through cavernous spaces like pickax-wielders of old. Cool air still blows past us, somehow flowing into this nether realm fresh from the surface world almost a mile above. Hallway lights pass at intervals, glowing then receding in slow, strobelike procession until we reach what is called the Ross Campus and the CASPAR experiment.

    CASPAR’s accelerator at SURF

    CASPAR is a particle accelerator—but one that fits in a regular-size room. A series of tubes, the air sucked from them by vacuum pumps, snakes across tables that run the length of the room, then bends back into a farther open space. From one end a beam of particles streams through the tubes, its path bent by magnets. At the other end sits a target. When the beam bull’s-eyes it, the collision triggers the fusion processes that happen inside stars, where small atoms join to build larger ones. These processes occur deep inside stellar cores all across the universe, and have created essentially all the elements heavier than helium (elements astronomers call “metals,” even when they are not down in mines).

    All those “metals” make up you, me, these tubes, this cavity, SURF, the ecosystem of underground labs, Earth and everything you may (or may not) care about. But scientists do not actually understand the details of how stars fuse elements. And because they cannot fly into the center of a star, they have instead traveled toward the center of the planet. Here, shielded from stray radiation and particles that bombard Earth’s surface, they can much more clearly see the particles and radiation from their own experiment, rather than from the sun or space.

    When we arrive, a batch of graduate students and three professors are huddled over several computers, trying to get that beam as just-right as it can be. The mini accelerator itself is on the other side of a door next to them. It looks like a kid’s chemistry set, minus the colorful liquids.

    Physicist Michael Wiescher, from the University of Notre Dame, steps away from his colleagues to tell me what they are doing. He speaks quietly, perhaps trying not to disturb them. He needn’t worry, though: Their attention is as focused as the experiment’s beam.

    That’s because it’s a big day down here: Wiescher and the others, from Notre Dame and the South Dakota School of Mines, are just starting to launch the beam toward their target. Soon they will make their own pint-size stars, farther from outer space than most people ever go. Their first experiments will examine the details of a process called “helium burning.” In its first stage, an important interaction occurs when three helium nuclei alchemize into a single carbon atom—the element that by definition makes molecules “organic.” In actual stars this happens only with age: after stars like the sun have burned through most of the hydrogen fuel at their cores and have evolved into red giants, they begin to fuse helium instead. But here in SURF, in a bathroom-size setup, CASPAR can learn about burning helium any day the scientists see fit, and so learn how to create again and again the elements that became us—fast-forwarding the sun’s clock while rewinding our own. “It’s not just physics,” says Hanhardt, who stands watch as the team works. “It’s philosophy.” It deals, in other words, in the big questions: How, literally, did we get here? Why, cosmically? These queries have scientific answers but existential implications, the science having moved into territory previously occupied only by religion.
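    The fusion of three helium nuclei into one carbon releases energy because a little mass disappears in the process. A quick mass-energy check, using standard reference atomic masses (these values are textbook constants, not figures from the article):

```python
# Energy released when three helium-4 nuclei fuse into carbon-12,
# computed from the mass defect. Masses are standard reference values
# in unified atomic mass units (u).
HE4_MASS_U = 4.002602      # helium-4 atomic mass
C12_MASS_U = 12.0          # carbon-12 (exact, by definition of the u)
U_TO_MEV = 931.494         # energy equivalent of 1 u

mass_defect = 3 * HE4_MASS_U - C12_MASS_U   # mass lost in the fusion
q_value_mev = mass_defect * U_TO_MEV
print(f"Energy released per 3 He -> C reaction: {q_value_mev:.2f} MeV")
```

    The result, about 7.27 MeV per reaction, is the energy budget of the very process CASPAR recreates in miniature.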

    European researchers, Wiescher tells me, are two years behind in their work on a similar project called LUNA–MV at Gran Sasso.

    6
    LUNA–MV at Gran Sasso

    China is building its own—JUNA. But CASPAR will (any day now) start cooking first. After the CASPAR team gets a few results on their own, they plan to merge data with some of these other teams, and will let scientists come down to this cave to do their own experiments with the CASPAR equipment. Someday soon—when CASPAR opens up for collaborators, when LZ begins its search—SURF will be robust and bustling in the way of the gold mine’s heyday, back when a single neutrino experiment squatted in a corner.

    One of the computer-focused scientists says, “We have 100 percent beam transmission!” and then a smiling grad student—Thomas Kadlecek, from the South Dakota School of Mines—turns to me and Wiescher. He likes it down here, he says. His grandfather was a miner back when it was Homestake. With that, he quickly turns away again and goes back to his work, leaning on a rack of electronics.

    I later find out his grandfather died in Homestake. Just as one generation of stars fuels the next—South Dakota’s previous underground generations inspire the ones that follow. “They identify with the mine,” Wiescher explains. “It’s incredible.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 10:13 am on July 10, 2017 Permalink | Reply
    Tags: Personalized cancer vaccines, Personalized Cancer Vaccines Vanquish Melanoma in Small Study, SA

    From SA: “Personalized Cancer Vaccines Vanquish Melanoma in Small Study” 

    Scientific American

    July 6, 2017
    Sharon Begley

    The therapy trains the immune system to attack tumors.

    Metastatic melanoma cells. Credit: NIH Wikimedia

    A small pilot study raises hopes that personalized cancer vaccines might prove safer and more effective than immune-based therapies already in use or further along in development. In a paper published online in Nature on Wednesday, scientists reported that all six melanoma patients who received an experimental, custom-made vaccine seemed to benefit: their tumors did not return after treatment.

    Researchers not involved in the study praised its results, but with caveats. The scientists “did a beautiful job,” said MD Anderson Cancer Center’s Greg Lizee, an expert in tumor immunology, who called the results “very encouraging.” But because the study did not include a comparison group of patients who received standard treatment and not the vaccine, he cautioned, “it’s not completely proved yet that the lack of [cancer] recurrence was due to the vaccine.”

    See the full article here.


     
  • richardmitnick 11:23 am on July 1, 2017 Permalink | Reply
    Tags: Entanglement and quantum interference, IBM Q experience, Now an interface based on the popular programming language Python, SA, Supercomputers still rule

    From SA: “Quantum Computing Becomes More Accessible” 

    Scientific American

    June 26, 2017
    Dario Gil

    Credit: World Economic Forum

    Quantum computing has captured imaginations for almost 50 years. The reason is simple: it offers a path to solving problems that could never be answered with classical machines. Examples include simulating chemistry exactly to develop new molecules and materials and solving complex optimization problems, which seek the best solution from among many possible alternatives. Every industry has a need for optimization, which is one reason this technology has so much disruptive potential.

    Until recently, access to nascent quantum computers was restricted to specialists in a few labs around the world. But progress over the past several years has enabled the construction of the world’s first prototype systems that can finally test out ideas, algorithms and other techniques that until now were strictly theoretical.

    Quantum computers tackle problems by harnessing the power of quantum mechanics. Rather than considering each possible solution one at a time, as a classical machine would, they behave in ways that cannot be explained with classical analogies. They start out in a quantum superposition of all possible solutions, and then they use entanglement and quantum interference to home in on the correct answer—processes that we do not observe in our everyday lives. The promise they offer, however, comes at the cost of them being difficult to build. A popular design requires superconducting materials (kept 100 times colder than outer space), exquisite control over delicate quantum states and shielding for the processor to keep out even a single stray ray of light.
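    The superposition-and-entanglement recipe described above can be sketched with a two-qubit state vector. The following is a minimal illustration in plain NumPy (not the IBM Q experience toolkit): a Hadamard gate puts one qubit in superposition, and a CNOT gate entangles it with a second, producing the perfectly correlated outcomes no classical coin flip can mimic.

```python
# State-vector sketch of superposition and entanglement with two qubits.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # entangles the two qubits

state = np.array([1.0, 0.0, 0.0, 0.0])        # start in |00>
state = np.kron(H, I2) @ state                # superpose the first qubit
state = CNOT @ state                          # Bell state (|00> + |11>)/sqrt(2)

probs = state ** 2                            # measurement probabilities
print(dict(zip(["00", "01", "10", "11"], probs.round(3))))
# Only "00" and "11" appear, each with probability 0.5: measuring one
# qubit instantly fixes the outcome of the other.
```

    Real hardware adds noise and decoherence on top of this ideal picture, which is why the coherence-time improvements mentioned below matter so much.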

    Existing machines are still too small to fully solve problems more complex than supercomputers can handle today. Nevertheless, tremendous progress has been made. Algorithms have been developed that will run faster on a quantum machine. Techniques now exist that prolong coherence (the lifetime of quantum information) in superconducting quantum bits by a factor of more than 100 compared with 10 years ago. We can now measure the most important kinds of quantum errors. And in 2016 IBM provided the public access to the first quantum computer in the cloud—the IBM Q experience—with a graphical interface for programming it and now an interface based on the popular programming language Python. Opening this system to the world has fueled innovations that are vital for this technology to progress, and to date more than 20 academic papers have been published using this tool. The field is expanding dramatically. Academic research groups and more than 50 start-ups and large corporations worldwide are focused on making quantum computing a reality.

    With these technological advancements and a machine at anyone’s fingertips, now is the time for getting “quantum ready.” People can begin to figure out what they would do if machines existed today that could solve new problems. And many quantum computing guides are available online to help them get started.

    There are still many obstacles. Coherence times must improve, quantum error rates must decrease, and eventually, we must mitigate or correct the errors that do occur. Researchers will continue to drive innovations in both the hardware and software. Investigators disagree, however, over which criteria should determine when quantum computing has achieved technological maturity. Some have proposed a standard defined by the ability to perform a scientific measurement so obscure that it is not easily explained to a general audience. I and others disagree, arguing that quantum computing will not have emerged as a technology until it can solve problems that have commercial, intellectual and societal importance. The good news is, that day is finally within our sights.

    See the full article here.


     
  • richardmitnick 4:15 pm on June 29, 2017 Permalink | Reply
    Tags: SA, The Case for Cosmic Modesty

    From SA: “The Case for Cosmic Modesty” 

    Scientific American

    Scientific American

    June 28, 2017
    Abraham Loeb

    The Parkes radio telescope in Australia has been used to search for extraterrestrial intelligence. Credit: Ian Sutton Flickr (CC BY-SA 3.0)

    “There are many reasons to be modest,” my mother used to say when I was a kid. But after three decades as an astronomer, I can add one more reason: the richness of the universe around us.

    Universe map. Credit: Sloan Digital Sky Survey (SDSS) and the 2dF Galaxy Redshift Survey

    Prior to the development of modern astronomy, humans tended to think the physical world centered on us. The sun and the stars were thought to revolve around Earth. Although naive in retrospect, this is a natural starting point. When my daughters were infants, they tended to think the world centered on them. Their development portrayed an accelerated miniature of human history. As they grew up, they matured and acquired a more balanced perspective.

    Similarly, observing the sky makes us aware of the big picture and teaches us modesty. We now know we are not at the center of the physical universe, because Earth orbits the sun, which circles around the center of the Milky Way Galaxy, which itself drifts with a peculiar velocity of ~0.001c (c is the speed of light) relative to the cosmic rest frame.

    Milky Way. Credit: NASA/JPL-Caltech/ESO/R. Hurt

    Many people, however, still believe we might be at the center of the biological universe; namely, that life is rare or unique to Earth. In contrast, my working hypothesis, drawn from the above example of the physical universe, is that we are not special in general, not only in terms of our physical coordinates but also as a form of life. Adopting this perspective implies we are not alone. There should be life out there in both primitive and intelligent forms. This conclusion, implied by the principle of “cosmic modesty,” has implications. If life is likely to exist elsewhere, we should search for it in all of its possible forms.

    Breakthrough Listen Project

    Lick Automated Planet Finder telescope, Mount Hamilton, CA, USA

    GBO radio telescope, Green Bank, West Virginia, USA

    CSIRO/Parkes Observatory, located 20 kilometres north of the town of Parkes, New South Wales, Australia

    Breakthrough Starshot Initiative

    ESO 3.6 m telescope & HARPS at La Silla, 600 km north of Santiago de Chile at an altitude of 2,400 metres

    SPACEOBS, the San Pedro de Atacama Celestial Explorations Observatory, located at 2,450 m above sea level north of the Atacama Desert in Chile, near the village of San Pedro de Atacama and close to the border with Bolivia and Argentina

    SNO Sierra Nevada Observatory, a high-elevation observatory 2,900 m above sea level in the Sierra Nevada mountain range in Granada, Spain, operated, maintained and supplied by the IAC

    Teide Observatory in Tenerife, Spain, home of two 40 cm LCO telescopes

    Observatori Astronòmic del Montsec (OAdM), located in the town of Sant Esteve de la Sarga (Pallars Jussà), 1,570 meters above sea level

    Bayfordbury Observatory, approximately 6 miles from the main campus of the University of Hertfordshire

    Our civilization has reached an important milestone. We now have access to unprecedented technologies in our search for extraterrestrial life, be it primitive or intelligent. The search for primitive life is currently underway and well funded, but the search for intelligence sits outside the mainstream of federal funding agencies. This should not be the case, given that Earth—the only planet known to host life—shows both primitive and intelligent forms of it.

    Our first radio signals have by now leaked out to a distance of more than 100 light-years, and we might soon hear back a response. Rather than being guided by Fermi’s paradox—“Where is everybody?”—or by philosophical arguments about the rarity of intelligence, we should invest funds in building better observatories and searching for a wide variety of artificial signals in the sky. Civilizations at our technological level might produce mostly weak signals. For example, a nuclear war on the nearest planet outside the solar system would not be visible even with our largest telescopes.

    But very advanced civilizations could potentially be detectable out to the edge of the observable universe through their most powerful beacons. The evidence for an alien civilization might not be in the traditional form of radio communication signals. Rather, it could involve detecting artifacts on planets via the spectral edge from solar cells, industrial pollution of atmospheres, artificial lights or bursts of radiation from artificial beams sweeping across the sky.

    Finding the answer to the important question: “Are we alone?” will change our perspective on our place in the universe and will open new interdisciplinary fields of research, such as astrolinguistics (how to communicate with aliens), astropolitics (how to negotiate with them for information), astrosociology (how to interpret their collective behavior), astroeconomics (how to trade space-based resources) and so on. We could shortcut our own progress by learning from civilizations that benefited from a head start of billions of years.

    There is no doubt that noticing the big picture taught my young daughters modesty. Similarly, the Kepler space telescope survey of nearby stars allowed astronomers to infer there are probably more habitable Earth-mass planets in the observable volume of the universe than there are grains of sand on all beaches on Earth. Emperors or kings who boasted after conquering a piece of land on Earth resemble an ant that hugs with great pride a single grain of sand on the landscape of a huge beach.
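    The planets-versus-sand-grains comparison above can be checked with an order-of-magnitude estimate. Every input below is a rough illustrative assumption (round numbers commonly used in such back-of-envelope arguments), not a figure from the article:

```python
# Order-of-magnitude estimate: habitable planets vs. sand grains.
# All inputs are rough illustrative assumptions.
import math

galaxies = 1e11              # observable-universe galaxy count (assumed)
stars_per_galaxy = 1e11      # typical star count per galaxy (assumed)
habitable_per_star = 0.1     # rough Kepler-based occurrence rate (assumed)

sand_grains = 1e19           # common estimate for all Earth beaches (assumed)

habitable_planets = galaxies * stars_per_galaxy * habitable_per_star
print(f"Habitable planets ~ 10^{math.log10(habitable_planets):.0f}")
print("Exceeds the sand-grain estimate?", habitable_planets > sand_grains)
```

    Even with conservative inputs, the planets win by a couple of orders of magnitude, which is the point of the emperor-and-ant image above.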

    Just over the past year, astronomers discovered a potentially habitable planet, Proxima b, around the nearest star, Proxima Centauri, as well as three potentially habitable planets out of seven around another nearby star, TRAPPIST-1.

    ESO Pale Red Dot project

    ESO Red Dots Campaign

    Alpha, Beta and Proxima Centauri, 27 February 2012. Credit: Skatebiker

    The TRAPPIST-1 star, an ultracool dwarf, is orbited by seven Earth-size planets (NASA).

    The Belgian robotic TRAPPIST-South telescope at ESO’s La Silla Observatory, Chile: interior

    The Belgian robotic TRAPPIST-South telescope at ESO’s La Silla Observatory, Chile

    (And if life formed on one of the three, it was likely transferred to the others.) These dwarf stars, whose masses are 12 percent and 8 percent of the sun’s, respectively, will live for up to 10 trillion years, about a thousand times longer than the sun. Hence, they provide excellent prospects for life in the distant future, long after the sun dies and turns into a cool white dwarf.

    I therefore advise my wealthy friends to buy real estate on Proxima b, because its value will likely go up dramatically in the future. But this also raises an important scientific question: “Is life most likely to emerge at the present cosmic time near a star like the sun?” By surveying the habitability of the universe throughout cosmic history from the birth of the first stars 30 million years after the big bang to the death of the last stars in 10 trillion years, one reaches the conclusion that unless habitability around low-mass stars is suppressed, life is most likely to exist near red dwarf stars like Proxima Centauri or TRAPPIST-1 trillions of years from now.

    The chemistry of “life as we know it” requires liquid water, but being at the right distance from the host star for achieving a comfortable temperature on the planet’s surface is not a sufficient condition for life. The planet also needs to have an atmosphere. In the absence of an external atmospheric pressure, warming by starlight would transform water ice directly into gas rather than a liquid phase.

    The warning sign can be found next door: Mars has a tenth of Earth’s mass and lost its atmosphere. Does Proxima b have an atmosphere? If so, the atmosphere and any surface ocean it sustains will moderate the temperature contrast between its permanent day and night sides. The James Webb Space Telescope, scheduled for launch in October 2018, will be able to distinguish between the temperature contrast expected if Proxima b is bare rock compared with the case where its climate is moderated by an atmosphere, possibly along with an ocean.

    A cosmic perspective about our origins would also contribute to a balanced worldview. The heavy elements that assembled to make Earth were produced in the heart of a nearby massive star that exploded. A speck of this material takes the form of our body during our life but then returns to Earth (with one exception, namely the ashes of Clyde Tombaugh, the discoverer of Pluto, which were put on the New Horizons spacecraft and are making their way back to space).

    What are we then, if not just a transient shape that a speck of material takes for a brief moment in cosmic history on the surface of one planet out of so many? Despite all of this, life is still the most precious phenomenon we treasure on Earth. It would be amazing if we found evidence for “life as we know it” on the surface of another planet, and even more remarkable if our telescopes traced evidence of an advanced technology on an alien spacecraft roaming through interstellar space.

    References

    Lingam, M. & Loeb, A. 2017, ApJ 837, L23-L28.

    Lingam, M. & Loeb, A. 2017, MNRAS (in the press); preprint available at https://arxiv.org/abs/1702.05500

    Lin, H., Gonzalez, G. A. & Loeb, A., 2014, ApJ 792, L7-L11.

    Loeb, A. & Turner, E. L. 2012, Astrobiology 12, 290-290.

    Guillochon, J. & Loeb, A. 2015, ApJ 811, L20-L26.

    Anglada-Escudé, G. et al. 2016, Nature 536, 437-440.

    Gillon, M. et al. 2016, Nature 542, 456-460.

    Lingam, M. & Loeb, A. 2017, PNAS (in the press); preprint available at https://arxiv.org/abs/1703.00878

    Loeb, A., Batista, R. A., & Sloan, D. 2016, JCAP 8, 40-52.

    Kreidberg, L. & Loeb, A. 2016, ApJ, 832, L12-L18.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 11:37 am on June 29, 2017 Permalink | Reply
    Tags: For Apple and every other phone company software became much more important than hardware, Inventing mobile apps, iPhone, SA, The iPhone a full-fledged hand-held computer that could also make calls and browse the internet, The iPhone transformed the mobile phone business the internet economy and in many ways society as a whole, Top-line Nokia phones had more memory better cameras and faster mobile connectivity   

    From SA: “Understanding the Real Innovation behind the iPhone” 

    Scientific American

    Scientific American

    June 29, 2017
    Kalle Lyytinen

    1
    The first iPhone was more a hand-held computer than anything else. Credit: Paul J. Richards Getty Images

    When the iPhone emerged in 2007, it came with all the promise and pomp of a major Steve Jobs announcement, highlighting its user interface and slick design as key selling points. We know now that the iPhone transformed the mobile phone business, the internet economy and, in many ways, society as a whole. But technically speaking, the iPhone was not very innovative.

    Its software and the interface idea were based on the iPod, which was already reinventing the digital music industry. Touchscreens had appeared on earlier phone and tablet models, including Apple’s own Newton. And top-line Nokia phones had more memory, better cameras and faster mobile connectivity. What made the iPhone transformative was the shift in concept underpinning the entire iPhone project: Its designers did not create a telephone with some extra features, but rather a full-fledged hand-held computer that could also make calls and browse the internet.

    As a scholar of management, design and innovation, I find it hard to predict what the next truly revolutionary technological development will be. In the 10 years since the launch of the iPhone, so much about modern life, commerce and culture has changed. In part that’s because the iPhone, and the smartphone boom it spurred, created a portable personal technology infrastructure that’s almost infinitely expandable. The iPhone changed the game not because of its initial technology and cool user interface but rather as a result of its creators’ imagination and courage.

    Inventing mobile apps

    As the iPhone took shape, its designers found themselves torn between making a phone or a computer. Engineers and marketing executives alike worried the new device would kill the iPod market that had driven Apple’s corporate resurgence for five years. Nokia, the biggest player in the cellphone market at the time, had similar technologies and prototypes, but likewise feared cannibalizing its own successful mobile phone product lines, which used a simpler, more old-fashioned software platform than the one on which the iPhone was built.

    Apple took the leap, however, by installing a fully capable computer operating system on the iPhone, along with a few small application programs. Some were phone-related, including a program that handled making and receiving calls, as well as a new way to display voicemail messages, and a system that kept different contacts’ text messages separate. Others were more computer-like, including an email app and a web browser. Of course, the music-playing features from the iPod were included too, linking the phone with the emerging Apple music ecosystem.

    Initially, that was about it for apps. But skilled computer engineers and hackers knew they were holding a palm-sized computer, and set to work writing their own software and getting it running on their iPhones. That was the dawn of the now-ubiquitous app. Within a year, these apps were so popular, and their potential so significant, that Apple’s second version of the iPhone operating system made it easy (and legal) for users to install apps on their phones.

    Shifting priorities

    The prospect of making a fully functional hand-held computer changed how users and manufacturers alike thought about mobile phones. For Apple and every other phone company, software became much more important than hardware. What apps a phone could run, and how quickly, mattered much more than whether it had a slightly better camera or could hold a few more photos; whether it flipped open, slid open or was a bar style; or whether it had a large keyboard or a small one. The iPhone’s keyboard was on-screen and software-generated, turning a function that had required dedicated hardware into one running on generic hardware and dedicated software.

    At the time of the iPhone launch, Nokia offered about 200 different phone styles to meet all the different needs of its hundreds of millions of customers. There was just one iPhone model at the start, and in the ensuing decade there have been only 14 major styles – though today they come in different colors, not just white and black as the original did. This is the power of software functionality and related simplicity.

    The heightened importance of software on a mobile phone shifted the industry’s economy as well. The money now came not just from selling devices and phone services, but also from marketing and selling apps and in-app advertisements. App developers must share revenue with the companies that control smartphones’ operating systems, giving those companies serious earning power: Apple holds about 15 percent of the mobile phone market but reaps 80 percent of global smartphone profits.

    Whatever the next tech industry game-changer is, and whenever it arrives, it will likely have some connection to the smartphone and related infrastructure. Even today, exploring virtual reality requires only installing an app and connecting just a bit of additional hardware to an existing phone. Similarly, smartphone interfaces and cameras already monitor and control intelligent and automated homes. Even as devices are developed to operate all around us, and even in our clothes, many of them will be able to point to the iPhone as a conceptual ancestor and inspiration.

    See the full article here.


     
  • richardmitnick 6:04 pm on June 20, 2017 Permalink | Reply
    Tags: , , Cyber security, SA   

    From SA: “World’s Most Powerful Particle Collider Taps AI to Expose Hack Attacks” 

    Scientific American

    Scientific American

    June 19, 2017
    Jesse Emspak

    1
    A general view of the CERN Computer / Data Center and server farm. Credit: Dean Mouhtaropoulos Getty Images

    Thousands of scientists worldwide tap into CERN’s computer networks each day in their quest to better understand the fundamental structure of the universe. Unfortunately, they are not the only ones who want a piece of this vast pool of computing power, which serves the world’s largest particle physics laboratory. The hundreds of thousands of computers in CERN’s grid are also a prime target for hackers who want to hijack those resources to make money or attack other computer systems. But rather than engaging in a perpetual game of hide-and-seek with these cyber intruders via conventional security systems, CERN scientists are turning to artificial intelligence to help them outsmart their online opponents.

    Current detection systems typically spot attacks on networks by scanning incoming data for known viruses and other types of malicious code. But these systems are relatively useless against new and unfamiliar threats. Given how quickly malware changes these days, CERN is developing new systems that use machine learning to recognize and report abnormal network traffic to an administrator. For example, a system might learn to flag traffic that requires an uncharacteristically large amount of bandwidth, uses the incorrect procedure when it tries to enter the network (much like using the wrong secret knock on a door) or seeks network access via an unauthorized port (essentially trying to get in through a door that is off-limits).
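    The core idea of flagging traffic that deviates from learned “normal” behavior can be sketched in a few lines of Python. This is a toy illustration, not CERN’s actual system: the baseline bandwidth figures and the three-standard-deviation threshold are invented for the example.

    ```python
    import statistics

    def flag_anomalies(baseline, observed, threshold=3.0):
        """Flag observations deviating from the baseline by more than
        `threshold` standard deviations, a crude stand-in for a learned
        model of 'normal' network traffic."""
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        return [x for x in observed if abs(x - mean) > threshold * stdev]

    # Baseline: typical per-connection bandwidth (MB/s) seen during training.
    normal_traffic = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.7, 5.3]

    # New traffic: one connection demands far more bandwidth than usual.
    suspicious = flag_anomalies(normal_traffic, [5.0, 4.9, 312.0])
    print(suspicious)  # only the 312.0 MB/s connection is flagged
    ```

    A real deployment would learn many features at once (bandwidth, ports, protocols) with a trained classifier rather than a single threshold, but the reporting logic is the same: anything outside the learned envelope goes to an administrator.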

    CERN’s cybersecurity department is training its AI software to learn the difference between normal and dubious behavior on the network, and to then alert staff via phone text, e-mail or computer message of any potential threat. The system could even be automated to shut down suspicious activity on its own, says Andres Gomez, lead author of a paper [Intrusion Prevention and Detection in Grid Computing – The ALICE Case] describing the new cybersecurity framework.

    CERN’s Jewel

    CERN—the French acronym for the European Organization for Nuclear Research lab, which sits on the Franco-Swiss border—is opting for this new approach to protect a computer grid used by more than 8,000 physicists to quickly access and analyze large volumes of data produced by the Large Hadron Collider (LHC).

    The LHC: CERN/LHC map, tunnel and particle collisions

    The LHC’s main job is to collide atomic particles at high speed so that scientists can study how particles interact. Particle detectors and other scientific instruments within the LHC gather information about these collisions, and CERN makes it available to laboratories and universities worldwide for use in their own research projects.

    The LHC is expected to generate a total of about 50 petabytes of data (equal to 15 million high-definition movies) in 2017 alone, and demands more computing power and data storage than CERN itself can provide. In anticipation of that type of growth the laboratory in 2002 created its Worldwide LHC Computing Grid, which connects computers from more than 170 research facilities across more than 40 countries. CERN’s computer network functions somewhat like an electrical grid, which relies on a network of generating stations that create and deliver electricity as needed to a particular community of homes and businesses. In CERN’s case the community consists of research labs that require varying amounts of computing resources, based on the type of work they are doing at any given time.

    Grid Guardians

    One of the biggest challenges to defending a computer grid is the fact that security cannot interfere with the sharing of processing power and data storage. Scientists from labs in different parts of the world might end up accessing the same computers to do their research if demand on the grid is high or if their projects are similar. CERN also has to worry about whether the scientists’ computers connecting into the grid are free of viruses and other malicious software that could enter and spread quickly due to all the sharing. A virus might, for example, allow hackers to take over parts of the grid and use those computers either to generate the digital currency bitcoin or to launch cyber attacks against other computers. “In normal situations, antivirus programs try to keep intrusions out of a single machine,” Gomez says. “In the grid we have to protect hundreds of thousands of machines that already allow” researchers outside CERN to use a variety of software programs they need for their different experiments. “The magnitude of the data you can collect and the very distributed environment make intrusion detection on [a] grid far more complex,” he says.

    Jarno Niemelä, a senior security researcher at F-Secure, a company that designs antivirus and computer security systems, says CERN’s use of machine learning to train its network defenses will give the lab much-needed flexibility in protecting its grid, especially when searching for new threats. Still, artificially intelligent intrusion detection is not without risks—and one of the biggest is whether Gomez and his team can develop machine-learning algorithms that can tell the difference between normal and harmful activity on the network without raising a lot of false alarms, Niemelä says.

    CERN’s AI cybersecurity upgrades are still in the early stages and will be rolled out over time. The first test will be protecting the portion of the grid used by ALICE (A Large Ion Collider Experiment)—a key LHC project to study the collisions of lead nuclei. If tests on ALICE are successful, CERN’s machine learning–based security could then be used to defend parts of the grid used by the institution’s six other detector experiments.

    See the full article here.


     
  • richardmitnick 1:27 pm on June 18, 2017 Permalink | Reply
    Tags: , China has taken the leadership in quantum communication, China Shatters 'Spooky Action at a Distance' Record, For now the system remains mostly a proof of concept, Global quantum communication is possible and will be achieved in the near future, , Preps for Quantum Internet, , , SA   

    From SA: “China Shatters ‘Spooky Action at a Distance’ Record, Preps for Quantum Internet” 

    Scientific American

    Scientific American

    June 15, 2017
    Lee Billings

    1
    Credit: Alfred Pasieka Getty Images

    In a landmark study, a team of Chinese scientists using an experimental satellite has tested quantum entanglement over unprecedented distances, beaming entangled pairs of photons to three ground stations across China—each separated by more than 1,200 kilometers. The test verifies a mysterious and long-held tenet of quantum theory, and firmly establishes China as the front-runner in a burgeoning “quantum space race” to create a secure, quantum-based global communications network—that is, a potentially unhackable “quantum internet” that would be of immense geopolitical importance. The findings were published Thursday in Science.

    “China has taken the leadership in quantum communication,” says Nicolas Gisin, a physicist at the University of Geneva who was not involved in the study. “This demonstrates that global quantum communication is possible and will be achieved in the near future.”

    The concept of quantum communications is considered the gold standard for security, in part because any compromising surveillance leaves its imprint on the transmission. Conventional encrypted messages require secret keys to decrypt, but those keys are vulnerable to eavesdropping as they are sent out into the ether. In quantum communications, however, these keys can be encoded in various quantum states of entangled photons—such as their polarization—and these states will be unavoidably altered if a message is intercepted by eavesdroppers. Ground-based quantum communications typically send entangled photon pairs via fiber-optic cables or open air. But collisions with ordinary atoms along the way disrupt the photons’ delicate quantum states, limiting transmission distances to a few hundred kilometers. Sophisticated devices called “quantum repeaters”—equipped with “quantum memory” modules—could in principle be daisy-chained together to receive, store and retransmit the quantum keys across longer distances, but this task is so complex and difficult that such systems remain largely theoretical.
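    The security principle, that measuring a photon in the wrong basis unavoidably disturbs its state, can be illustrated with a toy Python simulation of a prepare-and-measure key exchange. This is a deliberate simplification: Micius distributes entangled pairs rather than single prepared photons, and the “+”/“x” basis labels here are purely illustrative.

    ```python
    import random

    def measure(bit, prep_basis, meas_basis):
        """Measuring in the preparation basis returns the encoded bit;
        measuring in the wrong basis yields a random result."""
        return bit if prep_basis == meas_basis else random.randint(0, 1)

    def qkd_error_rate(n=100_000, eavesdrop=False):
        errors = matched = 0
        for _ in range(n):
            bit, a_basis = random.randint(0, 1), random.choice("+x")
            if eavesdrop:
                # Eve measures in a random basis and resends what she saw,
                # disturbing the photon's state whenever she guesses wrong.
                e_basis = random.choice("+x")
                bit_out, prep = measure(bit, a_basis, e_basis), e_basis
            else:
                bit_out, prep = bit, a_basis
            b_basis = random.choice("+x")
            if b_basis == a_basis:  # keep only matching-basis rounds
                matched += 1
                errors += (measure(bit_out, prep, b_basis) != bit)
        return errors / matched

    print(f"no eavesdropper:        {qkd_error_rate():.3f}")                # 0.000
    print(f"with intercept-resend:  {qkd_error_rate(eavesdrop=True):.3f}")  # ~0.250
    ```

    Without an eavesdropper, the matching-basis bits agree perfectly; an intercept-and-resend attack pushes the error rate up to about 25 percent, which is exactly the “imprint” that alerts the communicating parties.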

    “A quantum repeater has to receive photons from two different places, then store them in quantum memory, then interfere them directly with each other” before sending further signals along a network, says Paul Kwiat, a physicist at the University of Illinois in Urbana–Champaign who is unaffiliated with the Chinese team. “But in order to do all that, you have to know you’ve stored them without actually measuring them.” The situation, Kwiat says, is a bit like knowing what you have received in the mail without looking in your mailbox or opening the package inside. “You can shake the package—but that’s difficult to do if what you’re receiving is just photons. You want to make sure you’ve received them but you don’t want to absorb them. In principle it’s possible—no question—but it’s very hard to do.”

    To form a globe-girdling secure quantum communications network, then, the only available solution is to beam quantum keys through the vacuum of space, then distribute them across tens to hundreds of kilometers using ground-based nodes. Launched into low Earth orbit in 2016 and named after an ancient Chinese philosopher, the 600-kilogram “Micius” satellite is China’s premier effort to do just that, and is only the first of a fleet the nation plans as part of its $100-million Quantum Experiments at Space Scale (QUESS) program.

    Micius carries in its heart an assemblage of crystals and lasers that generates entangled photon pairs then splits and transmits them on separate beams to ground stations in its line-of-sight on Earth. For the latest test, the three receiving stations were located in the cities of Delingha and Ürümqi—both on the Tibetan Plateau—as well as in the city of Lijiang in China’s far southwest. At 1,203 kilometers, the geographical distance between Delingha and Lijiang is the record-setting stretch over which the entangled photon pairs were transmitted.

    For now the system remains mostly a proof of concept, because the current reported data transmission rate between Micius and its receiving stations is too low to sustain practical quantum communications. Of the roughly six million entangled pairs that Micius’s crystalline core produced during each second of transmission, only about one pair per second reached the ground-based detectors after the beams weakened as they passed through Earth’s atmosphere and each receiving station’s light-gathering telescopes. Team leader Jian-Wei Pan—a physicist at the University of Science and Technology of China in Hefei who has pushed and planned for the experiment since 2003—compares the feat with detecting a single photon from a lone match struck by someone standing on the moon. Even so, he says, Micius’s transmission of entangled photon pairs is “a trillion times more efficient than using the best telecommunication fibers. … We have done something that was absolutely impossible without the satellite.” Within the next five years, Pan says, QUESS will launch more practical quantum communications satellites.

    Although Pan and his team plan for Micius and its nascent network of sister satellites to eventually distribute quantum keys, their initial demonstration instead aimed to achieve a simpler task: proving Einstein wrong.

    Einstein famously derided as “spooky action at a distance” one of the most bizarre elements of quantum theory—the way that measuring one member of an entangled pair of particles seems to instantaneously change the state of its counterpart, even if that counterpart particle is on the other side of the galaxy. This was abhorrent to Einstein, because it suggests information might be transmitted between the particles faster than light, breaking the universal speed limit set by his theory of special relativity. Instead, he and others posited, perhaps the entangled particles somehow shared “hidden variables” that are inaccessible to experiment but would determine the particles’ subsequent behavior when measured. In 1964 the physicist John Bell devised a way to test Einstein’s idea, calculating a limit that physicists could statistically measure for how much hidden variables could possibly correlate with the behavior of entangled particles. If experiments showed this limit to be exceeded, then Einstein’s idea of hidden variables would be incorrect.
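    In its modern “CHSH” form, Bell’s limit caps a particular combination of correlations at 2 for any hidden-variable theory, whereas quantum mechanics predicts up to 2√2 for entangled photons. The short calculation below uses the textbook quantum correlation cos 2(a − b) and the standard optimal analyzer angles; these are generic illustrations, not the Micius team’s specific measurement settings.

    ```python
    import math

    def E(a, b):
        """Quantum-mechanical correlation for entangled photon polarizations
        measured at analyzer angles a and b (radians): E = cos(2(a - b))."""
        return math.cos(2 * (a - b))

    # Analyzer settings (degrees) that maximize the CHSH combination.
    a, a2, b, b2 = (math.radians(x) for x in (0, 45, 22.5, 67.5))

    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(f"S = {S:.3f}")  # 2.828, i.e. 2*sqrt(2)
    print(f"hidden variables require |S| <= 2; violated: {abs(S) > 2}")
    ```

    An experiment that measures S significantly above 2, as the Bell tests described here do, rules out Einstein’s locally hidden variables.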

    Ever since the 1970s “Bell tests” by physicists across ever-larger swaths of spacetime have shown that Einstein was indeed mistaken, and that entangled particles do in fact surpass Bell’s strict limits. The most definitive test arguably occurred in the Netherlands in 2015, when a team at Delft University of Technology closed several potential “loopholes” that had plagued past experiments and offered slim-but-significant opportunities for the influence of hidden variables to slip through. That test, though, involved separating entangled particles by scarcely more than a kilometer. With Micius’s transmission of entangled photons between widely separated ground stations, Pan’s team has now performed a Bell test at distances a thousand times greater. Just as before, their results confirm that Einstein was wrong. The quantum realm remains a spooky place—although no one yet understands why.

    “Of course, no one who accepts quantum mechanics could possibly doubt that entanglement can be created over that distance—or over any distance—but it’s still nice to see it made concrete,” says Scott Aaronson, a physicist at The University of Texas at Austin. “Nothing we knew suggested this goal was unachievable. The significance of this news is not that it was unexpected or that it overturns anything previously believed, but simply that it’s a satisfying culmination of years of hard work.”

    That work largely began in the 1990s when Pan, leader of the Chinese team, was a graduate student in the lab of the physicist Anton Zeilinger at the University of Innsbruck in Austria. Zeilinger was Pan’s PhD adviser, and they collaborated closely to test and further develop ideas for quantum communication. Pan returned to China to start his own lab in 2001, and Zeilinger started one as well at the Austrian Academy of Sciences in Vienna. For the next seven years they would compete fiercely to break records for transmitting entangled photon pairs across ever-wider gaps, and in ever-more extreme conditions, in ground-based experiments. All the while each man lobbied his respective nation’s space agency to green-light a satellite that could be used to test the technique from space. But Zeilinger’s proposals perished in a bureaucratic swamp at the European Space Agency whereas Pan’s were quickly embraced by the China National Space Administration. Ultimately, Zeilinger chose to collaborate again with his old pupil rather than compete against him; today the Austrian Academy of Sciences is a partner in QUESS, and the project has plans to use Micius to perform an intercontinental quantum key distribution experiment between ground stations in Vienna and Beijing.

    “I am happy that the Micius works so well,” Zeilinger says. “But one has to realize that it is a missed opportunity for Europe and others, too.”

    For years now, other researchers and institutions have been scrambling to catch up, pushing governments for more funding for further experiments on the ground and in space—and many of them see Micius’s success as the catalytic event they have been waiting for. “This is a major milestone, because if we are ever to have a quantum internet in the future, we will need to send entanglement over these sorts of long distances,” says Thomas Jennewein, a physicist at the University of Waterloo in Canada who was not involved with the study. “This research is groundbreaking for all of us in the community—everyone can point to it and say, ‘see, it does work!’”

    Jennewein and his collaborators are pursuing a space-based approach from the ground up, partnering with the Canadian Space Agency to plan a smaller, simpler satellite that could launch as soon as five years from now to act as a “universal receiver” and redistribute entangled photons beamed up from ground stations. At the National University of Singapore, an international collaboration led by the physicist Alexander Ling has already launched cheap shoe box–size CubeSats to create, study and perhaps even transmit photon pairs that are “correlated”—a situation just shy of full entanglement. And in the U.S., Kwiat at the University of Illinois is using NASA funding to develop a device that could someday test quantum communications using “hyperentanglement” (the simultaneous entanglement of photon pairs in multiple ways) onboard the International Space Station.

    Perhaps most significantly, a team led by Gerd Leuchs and Christoph Marquardt at the Max Planck Institute for the Science of Light in Germany is developing quantum communications protocols for commercially available laser systems already in space onboard the European Copernicus and SpaceDataHighway satellites. Using one of these systems, the team successfully encoded and sent simple quantum states to ground stations using photons beamed from a satellite in geostationary orbit, some 38,000 kilometers above Earth. This approach, Marquardt explains, does not rely on entanglement and is very different from that of QUESS—but it could, with minimal upgrades, nonetheless be used to distribute quantum keys for secure communications in as little as five years. Their results appear in Optica.

    “Our purpose is really to find a shortcut into making things like quantum key distribution with satellites economically viable and employable, pretty fast and soon,” Marquardt says. “[Engineers] invested 20 years of hard work making these systems, so it’s easier to upgrade them than to design everything from scratch. … It is a very good advantage if you can rely on something that is already qualified in space, because space qualification is very complicated. It usually takes five to 10 years just to develop that.”

    Marquardt and others suspect, however, that this field could be much further advanced than has been publicly acknowledged, with developments possibly hidden behind veils of official secrecy in the U.S. and elsewhere. It may be that the era of quantum communication is already upon us. “Some colleague of mine made the joke, ‘the silence of the U.S. is very loud,’” Marquardt says. “They had some very good groups concerning free-space satellites and quantum key distribution at Los Alamos [National Laboratory] and other places, and suddenly they stopped publishing. So we always say there are two reasons that they stopped publishing: either it didn’t work, or it worked really well!”

    See the full article here.


     
  • richardmitnick 2:39 pm on May 13, 2017 Permalink | Reply
    Tags: , , , , SA,   

    From SA: “Is a Popular Theory of Cosmic Creation Pseudoscience?” 

    Scientific American

    Scientific American

    May 12, 2017
    John Horgan

    Physicists battle over whether the theory of inflation is untestable and hence not really scientific.

    1
    An article in the February Scientific American, “Pop Goes the Universe,” criticized the theory of cosmic inflation, arguing that it “cannot be evaluated using the scientific method.” Scientific American has now published a letter by 33 scientists, including Stephen Hawking, strongly objecting to the February article. Credit: Scientific American, February 2017.

    A brouhaha has erupted over the theory of cosmic creation known as inflation. The theory holds that in the first instant of the big bang, the universe underwent a tremendous, exponential growth spurt before settling down to the slower rate of expansion observed today.

    First conceived in the early 1980s, inflation quickly became popular, because it seemed to account for puzzling features of the observable universe. Inflation explains, supposedly, why the universe looks quite similar in all directions and yet isn’t entirely uniform, since it contains galaxies and other clumps of matter.

    By the early 1990s, some cosmologists were beginning to doubt inflation. “I like inflation,” David Schramm, a prominent contributor to the big bang theory, told me in 1993. But he worried that inflation does not offer any unique, definitive predictions, results that could not be explained in any other way.

    “You won’t see that for inflation,” Schramm said, “whereas for the big bang itself you do see that. The beautiful cosmic microwave background and the light-element abundances tell you, ‘This is it.’”

    CMB per ESA/Planck

    In other words, inflation cannot be falsified. According to philosopher Karl Popper, a theory that doesn’t offer predictions specific and precise enough to be proven false isn’t really scientific.

    In my 1996 book The End of Science I derided inflation as “ironic science,” which can never be proven true or false and hence isn’t really science at all. I have continued whacking inflation since then, because as with string theory, another example of ironic science, the problems of inflation have only worsened over time.

    There are many different versions of string theory and inflation, which offer many different predictions. Both theories imply, moreover, that our cosmos is just one of many universes, none of which can be observed. (For more criticism of strings and multiverses, see my recent Q&A with string critic Peter Woit.)

    I was thus gratified when physicists Anna Ijjas, Paul Steinhardt and Abraham Loeb presented a stinging critique of inflation in Scientific American in February and urged cosmologists to “consider new ideas about how the universe began.”

    Steinhardt’s authorship is especially significant, since he is credited with inventing inflation together with Alan Guth and Andrei Linde. Steinhardt has been voicing qualms about inflation for years. See for example my 2014 Q&A with him on this blog, in which Steinhardt says: “Scientific ideas should be simple, explanatory, predictive. The inflationary multiverse as currently understood appears to have none of those properties.” Ijjas et al. expand on Steinhardt’s long-standing concerns. The authors assert that

    …inflationary cosmology, as we currently understand it, cannot be evaluated using the scientific method. As we have discussed, the expected outcome of inflation can easily change if we vary the initial conditions, change the shape of the inflationary energy density curve, or simply note that it leads to eternal inflation and a multimess. Individually and collectively, these features make inflation so flexible that no experiment can ever disprove it.

    I love the term “multimess.” Now a group of 33 scientists has pushed back hard against the critique of Ijjas, Steinhardt and Loeb. The group includes inflation pioneers Alan Guth and Andrei Linde as well as Steven Weinberg, Edward Witten and Stephen Hawking. In a letter published in Scientific American, they insist that inflation is testable and hence scientific. They conclude:

    “During the more than 35 years of its existence, inflationary theory has gradually become the main cosmological paradigm describing the early stages of the evolution of the universe and the formation of its large-scale structure. No one claims that inflation has become certain; scientific theories don’t get proved the way mathematical theorems do, but as time passes, the successful ones become better and better established by improved experimental tests and theoretical advances. This has happened with inflation. Progress continues, supported by the enthusiastic efforts of many scientists who have chosen to participate in this vibrant branch of cosmology. Empirical science is alive and well!”

    That last sentence strikes me as whistling past the graveyard, but read the letter and judge for yourself. In their response, Ijjas, Steinhardt and Loeb stand firm, especially on their argument that inflation is not empirically testable. They note that

    …if inflation produces a multiverse in which, to quote a previous statement from one of the responding authors (Guth), “anything that can happen will happen”—it makes no sense whatsoever to talk about predictions… any inflationary model gives an infinite diversity of outcomes with none preferred over any other. This makes inflation immune from any observational test.

    Almost 40 years after their inception, inflation and string theory are in worse shape than ever. The persistence of these unfalsifiable and hence unscientific theories is an embarrassment that risks damaging science’s reputation at a time when science can ill afford it. Isn’t it time to pull the plug?

    Further Reading:

    Why I Still Doubt Inflation, in Spite of Gravitational Wave Findings.

    Why String Theory Is Still Not Even Wrong.

    See also my Q&As with physicists Edward Witten, Steven Weinberg, George Ellis, Carlo Rovelli, Scott Aaronson, Stephen Wolfram, Sabine Hossenfelder, Priyamvada Natarajan, Garrett Lisi, Paul Steinhardt and Lee Smolin.

    Meta-Post: Horgan Posts on Physics, Cosmology

    See the full article here.


     
  • richardmitnick 5:11 pm on February 12, 2017 Permalink | Reply
    Tags: Arctic 2.0: What Happens after All the Ice Goes?, SA

    From SA: “Arctic 2.0: What Happens after All the Ice Goes?” 

    Scientific American

    Scientific American

    February 9, 2017
    Julia Rosen

    Credit: Global Panorama Flickr (CC BY-SA 2.0)

    As the Arctic slipped into the half-darkness of autumn last year, it seemed to enter the Twilight Zone. In the span of a few months, all manner of strange things happened.

    The cap of sea ice covering the Arctic Ocean started to shrink when it should have been growing. Temperatures at the North Pole soared more than 20 °C above normal at times. And polar bears prowling the shorelines of Hudson Bay had a record number of run-ins with people while waiting for the water to freeze over.

    It was a stark illustration of just how quickly climate change is reshaping the far north. And if last autumn was bizarre, it’s the summers that have really got scientists worried. As early as 2030, researchers say, the Arctic Ocean could lose essentially all of its ice during the warmest months of the year—a radical transformation that would upend Arctic ecosystems and disrupt many northern communities.

    Change will spill beyond the region, too. An increasingly blue Arctic Ocean could amplify warming trends and even scramble weather patterns around the globe. “It’s not just that we’re talking about polar bears or seals,” says Julienne Stroeve, a sea-ice researcher at University College London. “We all are ice-dependent species.”

    With the prospect of ice-free Arctic summers on the horizon, scientists are striving to understand how residents of the north will fare, which animals face the biggest risks and whether nations could save them by protecting small icy refuges.

    But as some researchers look even further into the future, they see reasons to preserve hope. If society ever manages to reverse the surge in greenhouse-gas concentrations—as some suspect it ultimately will—then the same physics that makes it easy for Arctic sea ice to melt rapidly may also allow it to regrow, says Stephanie Pfirman, a sea-ice researcher at Barnard College in New York City.

    She and other scientists say that it’s time to look beyond the Arctic’s decline and start thinking about what it would take to restore sea ice. That raises controversial questions about how quickly summer ice could return and whether it could regrow fast enough to spare Arctic species. Could nations even cool the climate quickly through geoengineering, to reverse the most drastic changes up north?

    Pfirman and her colleagues published a paper last year designed to kick-start a broader conversation about how countries might plan for the regrowth of ice, and whether they would welcome it. Only by considering all the possibilities for the far future can the world stay one step ahead of the ever-changing Arctic, say scientists. “We’ve committed to the Arctic of the next generation,” Pfirman says. “What comes next?”

    Blue period

    Pfirman remembers the first time she realized just how fast the Arctic was unravelling. It was September 2007, and she was preparing to give a talk. She went online to download the latest sea-ice maps and discovered something disturbing: the extent of Arctic ice had shrunk past the record minimum and was still dropping. “Oh, no! It’s happening,” she thought.

    Although Pfirman and others knew that Arctic sea ice was shrinking, they hadn’t expected to see such extreme ice losses until the middle of the twenty-first century. “It was a wake-up call that we had basically run out of time,” she says.

    In theory, there’s still a chance that the world could prevent the total loss of summer sea ice. Global climate models suggest that about 3 million square kilometres—roughly half of the minimum summer coverage in recent decades—could survive if countries fulfil their commitments to the newly ratified Paris climate agreement, which aims to limit global warming to 2 °C above pre-industrial temperatures.

    But sea-ice researchers aren’t counting on that. Models have consistently underestimated ice losses in the past, causing scientists to worry that the declines in the next few decades will outpace projections. And given the limited commitments that countries have made so far to address climate change, many researchers suspect the world will overshoot the 2 °C target, all but guaranteeing essentially ice-free summers (winter ice is projected to persist for much longer).

    In the best-case scenario, the Arctic is in for a 4–5 °C temperature rise, thanks to processes that amplify warming at high latitudes, says James Overland, an oceanographer at the US National Oceanic and Atmospheric Administration in Seattle, Washington. “We really don’t have any clue about how disruptive that’s going to be.”

    The Arctic’s 4 million residents—including 400,000 indigenous people—will feel the most direct effects of ice loss. Entire coastal communities, such as many in Alaska, will be forced to relocate as permafrost melts and shorelines crumble without sea ice to buffer them from violent storms, according to a 2013 report by the Brookings Institution in Washington DC. Residents in Greenland will find it hard to travel on sea ice, and reindeer herders in Siberia could struggle to feed their animals. At the same time, new economic opportunities will beckon as open water allows greater access to fishing grounds, oil and gas deposits, and other sources of revenue.

    People living at mid-latitudes may not be immune, either. Emerging research suggests that open water in the Arctic might have helped to amplify weather events, such as cold snaps in the United States, Europe and Asia in recent winters.

    Indeed, the impacts could reach around the globe. That’s because sea ice helps to cool the planet by reflecting sunlight and preventing the Arctic Ocean from absorbing heat. Keeping local air and water temperatures low, in turn, limits melting of the Greenland ice sheet and permafrost. With summer ice gone, Greenland’s glaciers could contribute more to sea-level rise, and permafrost could release its stores of greenhouse gases such as methane. Such is the vast influence of Arctic ice.
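    The reflectivity point is simple arithmetic. A back-of-envelope sketch (all numbers below are assumed, illustrative values, not figures from the article):

```python
# Back-of-envelope albedo comparison (illustrative, assumed numbers):
# fresh sea ice reflects most incoming sunlight, open ocean very little.
solar_flux = 200.0                      # W/m^2, rough Arctic summer insolation (assumed)
albedo_ice, albedo_ocean = 0.6, 0.06    # typical textbook albedo values (assumed)

absorbed_ice = solar_flux * (1 - albedo_ice)      # W/m^2 absorbed by an icy surface
absorbed_ocean = solar_flux * (1 - albedo_ocean)  # W/m^2 absorbed by open water
print(absorbed_ice, absorbed_ocean)
```

    With these figures, open water absorbs well over twice the energy per square metre that ice does, which is the feedback the paragraph describes: less ice means a warmer ocean, which in turn melts more ice.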

    “It is really the tail that wags the dog of global climate,” says Brenda Ekwurzel, director of climate science at the Union of Concerned Scientists in Cambridge, Massachusetts.

    But Arctic ecosystems will take the biggest hit. In 2007, for example, biologists in Alaska noticed something odd: vast numbers of walruses had clambered ashore on the coast of the Chukchi Sea. From above, it looked like the Woodstock music festival—with tusks—as thousands of plump pinnipeds crowded swathes of ice-free shoreline.

    Normally, walruses rest atop sea ice while foraging on the shallow sea floor. But that year, and almost every year since, sea-ice retreat made that impossible by late summer. Pacific walruses have adapted by hauling out on land, but scientists with the US Fish and Wildlife Service worry that their numbers will continue to decline. Here and across the region, the effects of Arctic thawing will ripple through ecosystems.

    In the ocean, photosynthetic plankton that thrive in open water will replace algae that grow on ice. Some models suggest that biological productivity in a seasonally ice-free Arctic could increase by up to 70% by 2100, which could boost revenue from Arctic fisheries even more. (To prevent a seafood gold rush, five Arctic nations have agreed to refrain from unregulated fishing in international waters for now.) Many whales already seem to be benefiting from the bounty of food, says Sue Moore, an Arctic mammal specialist at the Pacific Marine Environmental Laboratory.

    But the changing Arctic will pose a challenge for species whose life cycles are intimately linked to sea ice, such as walruses and Arctic seals—as well as polar bears, which don’t have much to eat on land. Research suggests that many will starve if the ice-free season gets too long in much of the Arctic. “Basically, you can write off most of the southern populations,” says Andrew Derocher, a biologist at the University of Alberta in Edmonton, Canada. Such findings spurred the US Fish and Wildlife Service to list polar bears as threatened in 2008.

    The last of the ice

    Ice-dependent ecosystems may survive for longest along the rugged north shores of Greenland and Canada, where models suggest that about half a million square kilometres of summer sea ice will linger after the rest of the Arctic opens up. Wind patterns cause ice to pile up there, and the thickness of the ice—along with the high latitude—helps prevent it from melting. “The Siberian coastlines are the ice factory, and the Canadian Arctic Archipelago is the ice graveyard,” says Robert Newton, an oceanographer at Columbia University’s Lamont–Doherty Earth Observatory in Palisades, New York.

    Groups such as the wildlife charity WWF have proposed protecting this ‘last ice area’ as a World Heritage Site in the hope that it will serve as a life preserver for many Arctic species. Last December, Canada announced that it would at least consider setting the area aside for conservation, and indigenous groups have expressed interest in helping to manage it. (Before he left office, then-US president Barack Obama joined Canadian Prime Minister Justin Trudeau in pledging to protect 17% of the countries’ Arctic lands and 10% of marine areas by 2020.)

    But the last ice area has limitations as an Arctic Noah’s ark. Some species don’t live in the region, and those that do are there in only small numbers. Derocher estimates that there are fewer than 2,000 polar bears in that last ice area today—a fraction of the total Arctic population of roughly 25,000. How many bears will live there in the future depends on how the ecosystem evolves with warming.

    The area may also be more vulnerable than global climate models suggest. Bruno Tremblay, a sea-ice researcher at McGill University in Montreal, Canada, and David Huard, an independent climate consultant based in Quebec, Canada, studied the fate of the refuge with a high-resolution sea-ice and ocean model that better represented the narrow channels between the islands of the Canadian archipelago.

    In a report commissioned by the WWF, they found that ice might actually be able to sneak between the islands and flow south to latitudes where it would melt. According to the model, Tremblay says, “even the last ice area gets flushed out much more efficiently”.

    If the future of the Arctic seems dire, there is one source of optimism: summer sea ice will return whenever the planet cools down again. “It’s not this irreversible process,” Stroeve says. “You could bring it back even if you lose it all.”

    Unlike land-based ice sheets, which wax and wane over millennia and lag behind climate changes by similar spans, sea ice will regrow as soon as summer temperatures get cold enough. But identifying the exact threshold at which sea ice will return is tricky, says Dirk Notz, a sea-ice researcher at the Max Planck Institute for Meteorology in Hamburg, Germany. On the basis of model projections, researchers suggest that the threshold hovers around 450 parts per million (p.p.m.)—some 50 p.p.m. higher than today. But greenhouse-gas concentrations are not the only factor that affects ice regrowth; it also depends on how long the region has been ice-free in summer, which determines how much heat can build up in the Arctic Ocean.

    Notz and his colleagues studied the interplay between greenhouse gases and ocean temperature with a global climate model. They increased CO2 from pre-industrial concentrations of 280 p.p.m. to 1,100 p.p.m.—a bit more than the 1,000 p.p.m. projected by 2100 if no major action is taken to curtail greenhouse-gas emissions. Then they left it at those levels for millennia.

    This obliterated both winter and summer sea ice, and allowed the ocean to warm up. The researchers then reduced CO2 concentrations to levels at which summer ice should have returned, but it did not regrow until the ocean had a chance to cool off, which took centuries.

    By contrast, if the Arctic experiences ice-free summers for a relatively short time before greenhouse gases drop, then models suggest ice would regrow much sooner. That could theoretically start to happen by the end of the century, assuming that nations take very aggressive steps to reduce carbon dioxide levels, according to Newton, Pfirman and their colleagues. So even if society cannot forestall the loss of summer sea ice in coming decades, taking action to keep CO2 concentrations under control could still make it easier to regrow the ice cover later, Notz says.

    Global cooling

    Given the stakes, some researchers have proposed global-scale geoengineering to cool the planet and, by extension, preserve or restore ice. Others argue that it might be possible to chill just the north, for instance by artificially whitening the Arctic Ocean with light-coloured floating particles to reflect sunlight. A study this year suggested installing wind-powered pumps to bring water to the surface in winter, where it would freeze, forming thicker ice.

    But many researchers hesitate to embrace geoengineering. And most agree that regional efforts would take tremendous effort and have limited benefits, given that Earth’s circulation systems could just bring more heat north to compensate. “It’s kind of like walking against a conveyor the wrong way,” Pfirman says. She and others agree that managing greenhouse gases—and local pollutants such as black carbon from shipping—is the only long-term solution.

    Returning to a world with summer sea ice could have big perks, such as restoring some of the climate services that the Arctic provides to the globe and stabilizing weather patterns. And in the region itself, restoring a white Arctic could offer relief to polar bears and other ice-dependent species, says Pfirman. These creatures might be able to weather a relatively short ice-free window, hunkered down in either the last ice area or other places set aside to preserve biodiversity. When the ice returned, they could spread out again to repopulate the Arctic.

    That has almost certainly happened during past climate changes. For instance, researchers think the Arctic may have experienced nearly ice-free summers during the last interglacial period, 130,000 years ago.

    But, one thing is certain: getting back to a world with Arctic summer sea ice won’t be simple, politically or technically. Not everyone will embrace a return to an ice-covered Arctic, especially if it’s been blue for several generations. Companies and countries are already eyeing the opportunities for oil and gas exploration, mining, shipping, tourism and fishing in a region hungry for economic development. “In many communities, people are split,” Pfirman says.

    Some researchers also say that the idea of regrowing sea ice seems like wishful thinking, because it would require efforts well beyond what nations must do to meet the Paris agreement. Limiting warming to 2 °C will probably entail converting huge swathes of land into forest and using still-nascent technologies to suck billions of tonnes of CO2 out of the air. Lowering greenhouse-gas concentrations enough to regrow ice would demand even more.

    And if summer sea ice ever does come back, it’s hard to know how a remade Arctic would work, Derocher says. “There will be an ecosystem. It will function. It just may not look like the one we currently have.”

    See the full article here.


     
  • richardmitnick 3:45 pm on February 1, 2017 Permalink | Reply
    Tags: SA

    From SA: “IceCube Closes in on Mysterious Nature of Neutrinos” 

    Scientific American

    Scientific American

    February 1, 2017
    Calla Cofield

    The Antarctica-based observatory has found hints of strange patterns in the ghostly particles’ masses

    IceCube neutrino detector interior
    U Wisconsin IceCube Neutrino detector

    Buried under the Antarctic ice, the IceCube experiment was designed primarily to capture particles called neutrinos that are produced by powerful cosmic events, but it is also helping scientists learn about the fundamental nature of these ghostly particles.

    At a meeting of the American Physical Society (APS) in Washington, D.C., this week, scientists with the IceCube collaboration presented new results that contribute to an ongoing mystery about the nature of neutrinos. These particles pour down on Earth from the sun, but they mostly pass unimpeded, like ghosts, through regular matter.

    The new results support evidence of a strange symmetry in measurements of one neutrino mass. In particle physics, symmetries often indicate underlying physics that scientists haven’t yet unearthed.

    Mystery of the neutrino mass

    Neutrinos are fundamental particles of nature. They aren’t one of the particles that make up atoms. (Those are electrons, protons and neutrons.) Neutrinos very, very rarely interact with regular matter, so they don’t really influence human beings at all (unless, of course, you happen to be a particle physicist who studies them). The sun generates neutrinos in droves, but for the most part, those particles pour through the Earth, like phantoms.

    The [U Wisconsin] IceCube Neutrino Observatory is a neutrino detector buried under 0.9 miles (1.45 kilometers) of ice in Antarctica. The ice provides a shield from other types of radiation and particles that would otherwise overwhelm the rare instances when neutrinos do interact with the detector and create a signal for scientists to study.

    Neutrinos come in three “flavors”: the tau neutrino, the muon neutrino and the electron neutrino. For a long time, scientists debated whether neutrinos had mass or if they were similar to photons (particles of light), which are considered massless. Eventually, scientists showed that neutrinos do have mass, and the 2015 Nobel Prize was awarded for work on neutrinos, including investigations into neutrino masses.

    But saying that neutrinos have mass is not the same as saying that a rock or an apple has mass. Neutrinos are particles that exist in the quantum world, and the quantum world is weird—light can be both a wave and a particle; cats can be both alive and dead. So it’s not that each neutrino flavor has its own mass, but rather that the neutrino flavors combine into what are called “mass eigenstates,” and those are what scientists measure. (For the purpose of simplicity, a Michigan State University statement describing the new findings calls the mass eigenstates “neutrino species.”)

    “One of the outstanding questions is whether there is a pattern to the fractions that go into each neutrino species,” Tyce DeYoung, an associate professor of physics and astronomy at Michigan State University and one of the IceCube collaborators working on the new finding, told Space.com.

    One neutrino species appears to be made up of mostly electron neutrinos, with some muon and tau neutrinos; the second neutrino species seems to be an almost equal mix of all three; and the third is still a bit of a mystery, but one previous study suggested that it might be an even split between muon and tau, with just a few electron neutrinos thrown in.
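    A rough sketch of how these fractions arise uses the standard three-angle mixing parametrization. The angle values below are illustrative assumptions (typical of published global fits, not numbers from the article); setting the second angle to 45 degrees is the even mu/tau split described here:

```python
import numpy as np

# Standard three-angle (PMNS-style) mixing matrix, CP phase set to zero.
# Angle values are illustrative assumptions, not numbers from the article.
th12, th23, th13 = np.radians([33.4, 45.0, 8.6])
s12, c12 = np.sin(th12), np.cos(th12)
s23, c23 = np.sin(th23), np.cos(th23)
s13, c13 = np.sin(th13), np.cos(th13)

U = np.array([
    [c12 * c13,                     s12 * c13,                    s13],
    [-s12 * c23 - c12 * s23 * s13,  c12 * c23 - s12 * s23 * s13,  s23 * c13],
    [s12 * s23 - c12 * c23 * s13,   -c12 * s23 - s12 * c23 * s13, c23 * c13],
])

# fractions[flavor, i] = |U|^2 = share of flavor (e, mu, tau) in mass state i.
fractions = U**2
for i in range(3):
    e, mu, tau = fractions[:, i]
    print(f"mass state {i + 1}: e={e:.2f} mu={mu:.2f} tau={tau:.2f}")
```

    With these inputs, the first mass state comes out mostly electron-flavored, the second a near-even blend, and the third an equal mu/tau mix with only a small electron admixture, matching the pattern the paragraph describes.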

    At the APS meeting, Joshua Hignight, a postdoctoral researcher at Michigan State University working with DeYoung, presented preliminary results from IceCube that support the equal split of muon and tau neutrinos in that third mass species.

    “This question of whether the third type is exactly equal parts muon and tau is called the maximal mixing question,” he said. “Since we don’t know any reason that this neutrino species should be exactly half and half, that would either be a really astonishing coincidence or possibly telling us about some physical principle that we haven’t discovered yet.”

    Generally speaking, any given feature of the universe can be explained either by a random process or by some rule that governs how things behave. If the number of muon and tau neutrinos in the third neutrino species were determined randomly, there would be much higher odds that those numbers would not be equal.
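    That coincidence-versus-rule reasoning can be made concrete with a quick simulation (entirely illustrative, not from the article): draw flavor compositions at random, uniformly over all possible three-way splits, and count how often the mu and tau shares happen to land within one percent of each other.

```python
import numpy as np

rng = np.random.default_rng(0)
# Each row is a random (e, mu, tau) composition summing to 1,
# drawn uniformly over the space of all three-way splits.
samples = rng.dirichlet(np.ones(3), size=100_000)
near_equal = np.abs(samples[:, 1] - samples[:, 2]) < 0.01
print(f"chance of a near-equal mu/tau split: {near_equal.mean():.1%}")
```

    Only a few percent of random compositions come out that close, which is the sense in which an exactly even split looks less like luck and more like an undiscovered rule.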

    “To me, this is very interesting, because it implies a fundamental symmetry,” DeYoung said.

    To better understand why the equal number of muon and tau neutrinos in the mass species implies nonrandomness, DeYoung gave the example of scientists discovering that protons and neutrons (the two particles that make up the nucleus of an atom) have very similar masses. The scientists who first discovered those masses might have wondered if that similarity was a mere coincidence or the product of some underlying similarity.

    It turns out, it’s the latter: Neutrons and protons are both made of three elementary particles called quarks (though a different combination of two quark varieties). In that case, a similarity on the surface indicated something hidden below, the scientists said.

    The new results from IceCube are “generally consistent” with recent results from the T2K neutrino experiment in Japan, which is dedicated to answering questions about the fundamental nature of neutrinos.

    T2K Experiment
    T2K map

    But the Nova experiment, based at Fermi National Accelerator Laboratory [FNAL] outside Chicago, did not “prefer the exact symmetry” between the muon and tau neutrinos in the third mass species, according to DeYoung.

    FNAL/NOvA experiment
    FNAL/NOvA experiment map
    FNAL NOvA Near Detector

    “That’s a tension; that’s not a direct contradiction at this point,” he said. “It’s the sort of not-quite-agreement that we’re going to be looking into over the next couple of years.”

    IceCube was designed to detect somewhat-high-energy neutrinos from distant cosmic sources, but most neutrino experiments on Earth detect lower-energy neutrinos from the sun or from nuclear reactors. Both T2K and Nova detect neutrinos at about an order of magnitude lower energy than IceCube. The consistency between the measurements made by IceCube and T2K is a test of “the robustness of the measurement” and “a success for our standard theory” of neutrino physics, DeYoung said.

    Neutrinos don’t affect most people’s day-to-day lives, but physicists hope that by studying these particles, they can find clues about some of the biggest mysteries in the cosmos. One of those cosmic mysteries could include an explanation for dark matter, the mysterious stuff that is five times more common in the universe than the “regular” matter that makes up planets, stars and all of the visible objects in the cosmos. Dark matter has a gravitational pull on regular matter, and it has shaped the cosmic landscape throughout the history of the universe. Some theorists think dark matter could be a new type of neutrino.

    The IceCube results are still preliminary, according to DeYoung. The scientists plan to submit the final results for publication after they’ve finished running the complete statistical analysis of the data.

    See the full article here.


     