Tagged: SA

• richardmitnick 6:04 pm on June 20, 2017
Tags: Cyber security, SA

    From SA: “World’s Most Powerful Particle Collider Taps AI to Expose Hack Attacks” 

Scientific American

    June 19, 2017
    Jesse Emspak

    A general view of the CERN Computer / Data Center and server farm. Credit: Dean Mouhtaropoulos Getty Images

    Thousands of scientists worldwide tap into CERN’s computer networks each day in their quest to better understand the fundamental structure of the universe. Unfortunately, they are not the only ones who want a piece of this vast pool of computing power, which serves the world’s largest particle physics laboratory. The hundreds of thousands of computers in CERN’s grid are also a prime target for hackers who want to hijack those resources to make money or attack other computer systems. But rather than engaging in a perpetual game of hide-and-seek with these cyber intruders via conventional security systems, CERN scientists are turning to artificial intelligence to help them outsmart their online opponents.

    Current detection systems typically spot attacks on networks by scanning incoming data for known viruses and other types of malicious code. But these systems are relatively useless against new and unfamiliar threats. Given how quickly malware changes these days, CERN is developing new systems that use machine learning to recognize and report abnormal network traffic to an administrator. For example, a system might learn to flag traffic that requires an uncharacteristically large amount of bandwidth, uses the incorrect procedure when it tries to enter the network (much like using the wrong secret knock on a door) or seeks network access via an unauthorized port (essentially trying to get in through a door that is off-limits).
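For readers curious what this kind of anomaly detection looks like in practice, here is a minimal sketch: train an unsupervised model on features of normal traffic, then flag connections that deviate. This is a generic illustration using scikit-learn's IsolationForest, not CERN's actual system; the features and numbers below are hypothetical.

```python
# Minimal sketch of ML-based network anomaly detection (illustrative only).
# Hypothetical per-connection features: [bandwidth in Mbps, destination port, handshake errors]
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Simulated "normal" traffic: modest bandwidth, authorized ports, rare protocol errors.
normal_traffic = np.column_stack([
    rng.normal(50, 10, 1000),          # typical bandwidth (Mbps)
    rng.choice([22, 80, 443], 1000),   # authorized ports
    rng.poisson(0.1, 1000),            # occasional handshake errors
])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_traffic)

# A connection hogging bandwidth on an unauthorized port with many bad handshakes:
suspicious = np.array([[900.0, 31337, 5]])
print(model.predict(suspicious))   # -1 means "anomaly": alert an administrator
```

In a real deployment the alert step (not shown) would send the text message or e-mail described below, or automatically quarantine the offending host.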

CERN’s cybersecurity department is training its AI software to learn the difference between normal and dubious behavior on the network, and to then alert staff via phone text, e-mail or computer message of any potential threat. The system could even be automated to shut down suspicious activity on its own, says Andres Gomez, lead author of a paper [Intrusion Prevention and Detection in Grid Computing – The ALICE Case] describing the new cybersecurity framework.

    CERN’s Jewel

    CERN—the French acronym for the European Organization for Nuclear Research lab, which sits on the Franco-Swiss border—is opting for this new approach to protect a computer grid used by more than 8,000 physicists to quickly access and analyze large volumes of data produced by the Large Hadron Collider (LHC).

[Images: CERN/LHC map; LHC tunnel; LHC particle collisions]

The LHC’s main job is to collide atomic particles at high speed so that scientists can study how particles interact. Particle detectors and other scientific instruments within the LHC gather information about these collisions, and CERN makes it available to laboratories and universities worldwide for use in their own research projects.

The LHC is expected to generate a total of about 50 petabytes of data (equal to 15 million high-definition movies) in 2017 alone, and demands more computing power and data storage than CERN itself can provide. In anticipation of that type of growth, the laboratory in 2002 created its Worldwide LHC Computing Grid, which connects computers from more than 170 research facilities across more than 40 countries. CERN’s computer network functions somewhat like an electrical grid, which relies on a network of generating stations that create and deliver electricity as needed to a particular community of homes and businesses. In CERN’s case the community consists of research labs that require varying amounts of computing resources, based on the type of work they are doing at any given time.

    Grid Guardians

    One of the biggest challenges to defending a computer grid is the fact that security cannot interfere with the sharing of processing power and data storage. Scientists from labs in different parts of the world might end up accessing the same computers to do their research if demand on the grid is high or if their projects are similar. CERN also has to worry about whether the computers of the scientists’ connecting into the grid are free of viruses and other malicious software that could enter and spread quickly due to all the sharing. A virus might, for example, allow hackers to take over parts of the grid and use those computers either to generate digital currency known as bitcoins or to launch cyber attacks against other computers. “In normal situations, antivirus programs try to keep intrusions out of a single machine,” Gomez says. “In the grid we have to protect hundreds of thousands of machines that already allow” researchers outside CERN to use a variety of software programs they need for their different experiments. “The magnitude of the data you can collect and the very distributed environment make intrusion detection on [a] grid far more complex,” he says.

    Jarno Niemelä, a senior security researcher at F-Secure, a company that designs antivirus and computer security systems, says CERN’s use of machine learning to train its network defenses will give the lab much-needed flexibility in protecting its grid, especially when searching for new threats. Still, artificially intelligent intrusion detection is not without risks—and one of the biggest is whether Gomez and his team can develop machine-learning algorithms that can tell the difference between normal and harmful activity on the network without raising a lot of false alarms, Niemelä says.

    CERN’s AI cybersecurity upgrades are still in the early stages and will be rolled out over time. The first test will be protecting the portion of the grid used by ALICE (A Large Ion Collider Experiment)—a key LHC project to study the collisions of lead nuclei. If tests on ALICE are successful, CERN’s machine learning–based security could then be used to defend parts of the grid used by the institution’s six other detector experiments.

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 1:27 pm on June 18, 2017
Tags: China has taken the leadership in quantum communication, China Shatters 'Spooky Action at a Distance' Record, For now the system remains mostly a proof of concept, Global quantum communication is possible and will be achieved in the near future, Preps for Quantum Internet, SA

    From SA: “China Shatters ‘Spooky Action at a Distance’ Record, Preps for Quantum Internet” 

Scientific American

    June 15, 2017
    Lee Billings

    Credit: Alfred Pasieka Getty Images

    In a landmark study, a team of Chinese scientists using an experimental satellite has tested quantum entanglement over unprecedented distances, beaming entangled pairs of photons to three ground stations across China—each separated by more than 1,200 kilometers. The test verifies a mysterious and long-held tenet of quantum theory, and firmly establishes China as the front-runner in a burgeoning “quantum space race” to create a secure, quantum-based global communications network—that is, a potentially unhackable “quantum internet” that would be of immense geopolitical importance. The findings were published Thursday in Science.

    “China has taken the leadership in quantum communication,” says Nicolas Gisin, a physicist at the University of Geneva who was not involved in the study. “This demonstrates that global quantum communication is possible and will be achieved in the near future.”

    The concept of quantum communications is considered the gold standard for security, in part because any compromising surveillance leaves its imprint on the transmission. Conventional encrypted messages require secret keys to decrypt, but those keys are vulnerable to eavesdropping as they are sent out into the ether. In quantum communications, however, these keys can be encoded in various quantum states of entangled photons—such as their polarization—and these states will be unavoidably altered if a message is intercepted by eavesdroppers. Ground-based quantum communications typically send entangled photon pairs via fiber-optic cables or open air. But collisions with ordinary atoms along the way disrupt the photons’ delicate quantum states, limiting transmission distances to a few hundred kilometers. Sophisticated devices called “quantum repeaters”—equipped with “quantum memory” modules—could in principle be daisy-chained together to receive, store and retransmit the quantum keys across longer distances, but this task is so complex and difficult that such systems remain largely theoretical.
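For concreteness, a polarization-entangled photon pair of the kind described above can be written in standard textbook notation (this is a generic Bell state, not a description of Micius's specific source):

```latex
% One of the four Bell states: photons A and B entangled in polarization,
% with H = horizontal and V = vertical.
|\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\left( |H\rangle_A |H\rangle_B + |V\rangle_A |V\rangle_B \right)
```

Measuring photon A yields H or V at random, but photon B's outcome is then perfectly correlated with it; an eavesdropper who intercepts and measures either photon unavoidably disturbs these correlations, which is what betrays the intrusion.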

    “A quantum repeater has to receive photons from two different places, then store them in quantum memory, then interfere them directly with each other” before sending further signals along a network, says Paul Kwiat, a physicist at the University of Illinois in Urbana–Champaign who is unaffiliated with the Chinese team. “But in order to do all that, you have to know you’ve stored them without actually measuring them.” The situation, Kwiat says, is a bit like knowing what you have received in the mail without looking in your mailbox or opening the package inside. “You can shake the package—but that’s difficult to do if what you’re receiving is just photons. You want to make sure you’ve received them but you don’t want to absorb them. In principle it’s possible—no question—but it’s very hard to do.”

To form a globe-girdling secure quantum communications network, then, the only available solution is to beam quantum keys through the vacuum of space, then distribute them across tens to hundreds of kilometers using ground-based nodes. Launched into low Earth orbit in 2016 and named after an ancient Chinese philosopher, the 600-kilogram “Micius” satellite is China’s premier effort to do just that, and is only the first of a fleet the nation plans as part of its $100-million Quantum Experiments at Space Scale (QUESS) program.

Micius carries in its heart an assemblage of crystals and lasers that generates entangled photon pairs, then splits and transmits them on separate beams to ground stations in its line of sight on Earth. For the latest test, the three receiving stations were located in the cities of Delingha and Ürümqi—both on the Tibetan Plateau—as well as in the city of Lijiang in China’s far southwest. At 1,203 kilometers, the geographical distance between Delingha and Lijiang is the record-setting stretch over which the entangled photon pairs were transmitted.

    For now the system remains mostly a proof of concept, because the current reported data transmission rate between Micius and its receiving stations is too low to sustain practical quantum communications. Of the roughly six million entangled pairs that Micius’s crystalline core produced during each second of transmission, only about one pair per second reached the ground-based detectors after the beams weakened as they passed through Earth’s atmosphere and each receiving station’s light-gathering telescopes. Team leader Jian-Wei Pan—a physicist at the University of Science and Technology of China in Hefei who has pushed and planned for the experiment since 2003—compares the feat with detecting a single photon from a lone match struck by someone standing on the moon. Even so, he says, Micius’s transmission of entangled photon pairs is “a trillion times more efficient than using the best telecommunication fibers. … We have done something that was absolutely impossible without the satellite.” Within the next five years, Pan says, QUESS will launch more practical quantum communications satellites.
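Taking the figures quoted above at face value, the end-to-end efficiency of the two downlinks combined works out to roughly (a back-of-the-envelope estimate, not a number from the paper):

```latex
\eta \approx \frac{1 \text{ pair/s detected}}{6 \times 10^{6} \text{ pairs/s produced}}
     \approx 1.7 \times 10^{-7},
\qquad
10 \log_{10} \eta \approx -68 \text{ dB}
```

For comparison, the best telecom fibers attenuate light by about 0.2 dB per kilometer, so a 1,200-kilometer fiber link would lose roughly 240 dB—consistent with Pan's point that the satellite route is vastly more efficient than fiber over such distances.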

    Although Pan and his team plan for Micius and its nascent network of sister satellites to eventually distribute quantum keys, their initial demonstration instead aimed to achieve a simpler task: proving Einstein wrong.

    Einstein famously derided as “spooky action at a distance” one of the most bizarre elements of quantum theory—the way that measuring one member of an entangled pair of particles seems to instantaneously change the state of its counterpart, even if that counterpart particle is on the other side of the galaxy. This was abhorrent to Einstein, because it suggests information might be transmitted between the particles faster than light, breaking the universal speed limit set by his theory of special relativity. Instead, he and others posited, perhaps the entangled particles somehow shared “hidden variables” that are inaccessible to experiment but would determine the particles’ subsequent behavior when measured. In 1964 the physicist John Bell devised a way to test Einstein’s idea, calculating a limit that physicists could statistically measure for how much hidden variables could possibly correlate with the behavior of entangled particles. If experiments showed this limit to be exceeded, then Einstein’s idea of hidden variables would be incorrect.
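In its most commonly tested form, the CHSH version, Bell's limit looks like this (a standard result, included here for reference; the article does not specify which variant each experiment used):

```latex
% E(a,b) is the measured correlation between outcomes at detector settings a and b.
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
\qquad |S| \le 2 \quad \text{(any local hidden-variable theory)}
```

Quantum mechanics predicts that entangled particles measured at suitable angles can reach |S| = 2√2 ≈ 2.83, so any experiment that cleanly measures S > 2 rules out Einstein's hidden variables.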

    Ever since the 1970s “Bell tests” by physicists across ever-larger swaths of spacetime have shown that Einstein was indeed mistaken, and that entangled particles do in fact surpass Bell’s strict limits. The most definitive test arguably occurred in the Netherlands in 2015, when a team at Delft University of Technology closed several potential “loopholes” that had plagued past experiments and offered slim-but-significant opportunities for the influence of hidden variables to slip through. That test, though, involved separating entangled particles by scarcely more than a kilometer. With Micius’s transmission of entangled photons between widely separated ground stations, Pan’s team has now performed a Bell test at distances a thousand times greater. Just as before, their results confirm that Einstein was wrong. The quantum realm remains a spooky place—although no one yet understands why.

    “Of course, no one who accepts quantum mechanics could possibly doubt that entanglement can be created over that distance—or over any distance—but it’s still nice to see it made concrete,” says Scott Aaronson, a physicist at The University of Texas at Austin. “Nothing we knew suggested this goal was unachievable. The significance of this news is not that it was unexpected or that it overturns anything previously believed, but simply that it’s a satisfying culmination of years of hard work.”

    That work largely began in the 1990s when Pan, leader of the Chinese team, was a graduate student in the lab of the physicist Anton Zeilinger at the University of Innsbruck in Austria. Zeilinger was Pan’s PhD adviser, and they collaborated closely to test and further develop ideas for quantum communication. Pan returned to China to start his own lab in 2001, and Zeilinger started one as well at the Austrian Academy of Sciences in Vienna. For the next seven years they would compete fiercely to break records for transmitting entangled photon pairs across ever-wider gaps, and in ever-more extreme conditions, in ground-based experiments. All the while each man lobbied his respective nation’s space agency to green-light a satellite that could be used to test the technique from space. But Zeilinger’s proposals perished in a bureaucratic swamp at the European Space Agency whereas Pan’s were quickly embraced by the China National Space Administration. Ultimately, Zeilinger chose to collaborate again with his old pupil rather than compete against him; today the Austrian Academy of Sciences is a partner in QUESS, and the project has plans to use Micius to perform an intercontinental quantum key distribution experiment between ground stations in Vienna and Beijing.

    “I am happy that the Micius works so well,” Zeilinger says. “But one has to realize that it is a missed opportunity for Europe and others, too.”

    For years now, other researchers and institutions have been scrambling to catch up, pushing governments for more funding for further experiments on the ground and in space—and many of them see Micius’s success as the catalytic event they have been waiting for. “This is a major milestone, because if we are ever to have a quantum internet in the future, we will need to send entanglement over these sorts of long distances,” says Thomas Jennewein, a physicist at the University of Waterloo in Canada who was not involved with the study. “This research is groundbreaking for all of us in the community—everyone can point to it and say, ‘see, it does work!’”

    Jennewein and his collaborators are pursuing a space-based approach from the ground up, partnering with the Canadian Space Agency to plan a smaller, simpler satellite that could launch as soon as five years from now to act as a “universal receiver” and redistribute entangled photons beamed up from ground stations. At the National University of Singapore, an international collaboration led by the physicist Alexander Ling has already launched cheap shoe box–size CubeSats to create, study and perhaps even transmit photon pairs that are “correlated”—a situation just shy of full entanglement. And in the U.S., Kwiat at the University of Illinois is using NASA funding to develop a device that could someday test quantum communications using “hyperentanglement” (the simultaneous entanglement of photon pairs in multiple ways) onboard the International Space Station.

    Perhaps most significantly, a team led by Gerd Leuchs and Christoph Marquardt at the Max Planck Institute for the Science of Light in Germany is developing quantum communications protocols for commercially available laser systems already in space onboard the European Copernicus and SpaceDataHighway satellites. Using one of these systems, the team successfully encoded and sent simple quantum states to ground stations using photons beamed from a satellite in geostationary orbit, some 38,000 kilometers above Earth. This approach, Marquardt explains, does not rely on entanglement and is very different from that of QUESS—but it could, with minimal upgrades, nonetheless be used to distribute quantum keys for secure communications in as little as five years. Their results appear in Optica.

    “Our purpose is really to find a shortcut into making things like quantum key distribution with satellites economically viable and employable, pretty fast and soon,” Marquardt says. “[Engineers] invested 20 years of hard work making these systems, so it’s easier to upgrade them than to design everything from scratch. … It is a very good advantage if you can rely on something that is already qualified in space, because space qualification is very complicated. It usually takes five to 10 years just to develop that.”

    Marquardt and others suspect, however, that this field could be much further advanced than has been publicly acknowledged, with developments possibly hidden behind veils of official secrecy in the U.S. and elsewhere. It may be that the era of quantum communication is already upon us. “Some colleague of mine made the joke, ‘the silence of the U.S. is very loud,’” Marquardt says. “They had some very good groups concerning free-space satellites and quantum key distribution at Los Alamos [National Laboratory] and other places, and suddenly they stopped publishing. So we always say there are two reasons that they stopped publishing: either it didn’t work, or it worked really well!”

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 2:39 pm on May 13, 2017
Tags: SA

    From SA: “Is a Popular Theory of Cosmic Creation Pseudoscience?” 

Scientific American

    May 12, 2017
    John Horgan

    Physicists battle over whether the theory of inflation is untestable and hence not really scientific.

    An article in the February Scientific American, “Pop Goes the Universe,” criticized the theory of cosmic inflation, arguing that it “cannot be evaluated using the scientific method.” Scientific American has now published a letter by 33 scientists, including Stephen Hawking, strongly objecting to the February article. Credit: Scientific American, February 2017.

    A brouhaha has erupted over the theory of cosmic creation known as inflation. The theory holds that in the first instant of the big bang, the universe underwent a tremendous, exponential growth spurt before settling down to the slower rate of expansion observed today.

    First conceived in the early 1980s, inflation quickly became popular, because it seemed to account for puzzling features of the observable universe. Inflation explains, supposedly, why the universe looks quite similar in all directions and yet isn’t entirely uniform, since it contains galaxies and other clumps of matter.

By the early 1990s, some cosmologists were beginning to doubt inflation. “I like inflation,” David Schramm, a prominent contributor to the big bang theory, told me in 1993. But he worried that inflation does not offer any unique, definitive predictions—observations that could not be explained in any other way.

“You won’t see that for inflation,” Schramm said, “whereas for the big bang itself you do see that. The beautiful cosmic microwave background and the light-element abundances tell you, ‘This is it.’”

[Image: the cosmic microwave background, per ESA/Planck]

    In other words, inflation cannot be falsified. According to philosopher Karl Popper, a theory that doesn’t offer predictions specific and precise enough to be proven false isn’t really scientific.

    In my 1996 book The End of Science I derided inflation as “ironic science,” which can never be proven true or false and hence isn’t really science at all. I have continued whacking inflation since then, because as with string theory, another example of ironic science, the problems of inflation have only worsened over time.

    There are many different versions of string theory and inflation, which offer many different predictions. Both theories imply, moreover, that our cosmos is just one of many universes, none of which can be observed. (For more criticism of strings and multiverses, see my recent Q&A with string critic Peter Woit.)

I was thus gratified when physicists Anna Ijjas, Paul Steinhardt and Abraham Loeb presented a stinging critique of inflation in Scientific American in February and urged cosmologists to “consider new ideas about how the universe began.”

    Steinhardt’s authorship is especially significant, since he is credited with inventing inflation together with Alan Guth and Andrei Linde. Steinhardt has been voicing qualms about inflation for years. See for example my 2014 Q&A with him on this blog, in which Steinhardt says: “Scientific ideas should be simple, explanatory, predictive. The inflationary multiverse as currently understood appears to have none of those properties.” Ijjas et al. expand on Steinhardt’s long-standing concerns. The authors assert that

    …inflationary cosmology, as we currently understand it, cannot be evaluated using the scientific method. As we have discussed, the expected outcome of inflation can easily change if we vary the initial conditions, change the shape of the inflationary energy density curve, or simply note that it leads to eternal inflation and a multimess. Individually and collectively, these features make inflation so flexible that no experiment can ever disprove it.

    I love the term “multimess.” Now a group of 33 scientists has pushed back hard against the critique of Ijjas, Steinhardt and Loeb. The group includes inflation pioneers Alan Guth and Andrei Linde as well as Steven Weinberg, Edward Witten and Stephen Hawking. In a letter published in Scientific American, they insist that inflation is testable and hence scientific. They conclude:

    “During the more than 35 years of its existence, inflationary theory has gradually become the main cosmological paradigm describing the early stages of the evolution of the universe and the formation of its large-scale structure. No one claims that inflation has become certain; scientific theories don’t get proved the way mathematical theorems do, but as time passes, the successful ones become better and better established by improved experimental tests and theoretical advances. This has happened with inflation. Progress continues, supported by the enthusiastic efforts of many scientists who have chosen to participate in this vibrant branch of cosmology. Empirical science is alive and well!”

    That last sentence strikes me as whistling past the graveyard, but read the letter and judge for yourself. In their response, Ijjas, Steinhardt and Loeb stand firm, especially on their argument that inflation is not empirically testable. They note that

    …if inflation produces a multiverse in which, to quote a previous statement from one of the responding authors (Guth), “anything that can happen will happen”—it makes no sense whatsoever to talk about predictions… any inflationary model gives an infinite diversity of outcomes with none preferred over any other. This makes inflation immune from any observational test.

    Almost 40 years after their inception, inflation and string theory are in worse shape than ever. The persistence of these unfalsifiable and hence unscientific theories is an embarrassment that risks damaging science’s reputation at a time when science can ill afford it. Isn’t it time to pull the plug?

    Further Reading:

    Why I Still Doubt Inflation, in Spite of Gravitational Wave Findings.

    Why String Theory Is Still Not Even Wrong.

    See also my Q&As with physicists Edward Witten, Steven Weinberg, George Ellis, Carlo Rovelli, Scott Aaronson, Stephen Wolfram, Sabine Hossenfelder, Priyamvada Natarajan, Garrett Lisi, Paul Steinhardt and Lee Smolin.

    Meta-Post: Horgan Posts on Physics, Cosmology

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 5:11 pm on February 12, 2017
Tags: Arctic 2.0: What Happens after All the Ice Goes?, SA

    From SA: “Arctic 2.0: What Happens after All the Ice Goes?” 

Scientific American

    February 9, 2017
    Julia Rosen

    Credit: Global Panorama Flickr (CC BY-SA 2.0)

    As the Arctic slipped into the half-darkness of autumn last year, it seemed to enter the Twilight Zone. In the span of a few months, all manner of strange things happened.

    The cap of sea ice covering the Arctic Ocean started to shrink when it should have been growing. Temperatures at the North Pole soared more than 20 °C above normal at times. And polar bears prowling the shorelines of Hudson Bay had a record number of run-ins with people while waiting for the water to freeze over.

    It was a stark illustration of just how quickly climate change is reshaping the far north. And if last autumn was bizarre, it’s the summers that have really got scientists worried. As early as 2030, researchers say, the Arctic Ocean could lose essentially all of its ice during the warmest months of the year—a radical transformation that would upend Arctic ecosystems and disrupt many northern communities.

    Change will spill beyond the region, too. An increasingly blue Arctic Ocean could amplify warming trends and even scramble weather patterns around the globe. “It’s not just that we’re talking about polar bears or seals,” says Julienne Stroeve, a sea-ice researcher at University College London. “We all are ice-dependent species.”

    With the prospect of ice-free Arctic summers on the horizon, scientists are striving to understand how residents of the north will fare, which animals face the biggest risks and whether nations could save them by protecting small icy refuges.

    But as some researchers look even further into the future, they see reasons to preserve hope. If society ever manages to reverse the surge in greenhouse-gas concentrations—as some suspect it ultimately will—then the same physics that makes it easy for Arctic sea ice to melt rapidly may also allow it to regrow, says Stephanie Pfirman, a sea-ice researcher at Barnard College in New York City.

    She and other scientists say that it’s time to look beyond the Arctic’s decline and start thinking about what it would take to restore sea ice. That raises controversial questions about how quickly summer ice could return and whether it could regrow fast enough to spare Arctic species. Could nations even cool the climate quickly through geoengineering, to reverse the most drastic changes up north?

    Pfirman and her colleagues published a paper last year designed to kick-start a broader conversation about how countries might plan for the regrowth of ice, and whether they would welcome it. Only by considering all the possibilities for the far future can the world stay one step ahead of the ever-changing Arctic, say scientists. “We’ve committed to the Arctic of the next generation,” Pfirman says. “What comes next?”

    Blue period

    Pfirman remembers the first time she realized just how fast the Arctic was unravelling. It was September 2007, and she was preparing to give a talk. She went online to download the latest sea-ice maps and discovered something disturbing: the extent of Arctic ice had shrunk past the record minimum and was still dropping. “Oh, no! It’s happening,” she thought.

    Although Pfirman and others knew that Arctic sea ice was shrinking, they hadn’t expected to see such extreme ice losses until the middle of the twenty-first century. “It was a wake-up call that we had basically run out of time,” she says.

    In theory, there’s still a chance that the world could prevent the total loss of summer sea ice. Global climate models suggest that about 3 million square kilometres—roughly half of the minimum summer coverage in recent decades—could survive if countries fulfil their commitments to the newly ratified Paris climate agreement, which limits global warming to 2 °C above pre-industrial temperatures.

    But sea-ice researchers aren’t counting on that. Models have consistently underestimated ice losses in the past, causing scientists to worry that the declines in the next few decades will outpace projections. And given the limited commitments that countries have made so far to address climate change, many researchers suspect the world will overshoot the 2 °C target, all but guaranteeing essentially ice-free summers (winter ice is projected to persist for much longer).

    In the best-case scenario, the Arctic is in for a 4–5 °C temperature rise, thanks to processes that amplify warming at high latitudes, says James Overland, an oceanographer at the US National Oceanic and Atmospheric Administration in Seattle, Washington. “We really don’t have any clue about how disruptive that’s going to be.”

    The Arctic’s 4 million residents—including 400,000 indigenous people—will feel the most direct effects of ice loss. Entire coastal communities, such as many in Alaska, will be forced to relocate as permafrost melts and shorelines crumble without sea ice to buffer them from violent storms, according to a 2013 report by the Brookings Institution in Washington DC. Residents in Greenland will find it hard to travel on sea ice, and reindeer herders in Siberia could struggle to feed their animals. At the same time, new economic opportunities will beckon as open water allows greater access to fishing grounds, oil and gas deposits, and other sources of revenue.

    People living at mid-latitudes may not be immune, either. Emerging research suggests that open water in the Arctic might have helped to amplify weather events, such as cold snaps in the United States, Europe and Asia in recent winters.

    Indeed, the impacts could reach around the globe. That’s because sea ice helps to cool the planet by reflecting sunlight and preventing the Arctic Ocean from absorbing heat. Keeping local air and water temperatures low, in turn, limits melting of the Greenland ice sheet and permafrost. With summer ice gone, Greenland’s glaciers could contribute more to sea-level rise, and permafrost could release its stores of greenhouse gases such as methane. Such is the vast influence of Arctic ice.

    “It is really the tail that wags the dog of global climate,” says Brenda Ekwurzel, director of climate science at the Union of Concerned Scientists in Cambridge, Massachusetts.

    But Arctic ecosystems will take the biggest hit. In 2007, for example, biologists in Alaska noticed something odd: vast numbers of walruses had clambered ashore on the coast of the Chukchi Sea. From above, it looked like the Woodstock music festival—with tusks—as thousands of plump pinnipeds crowded swathes of ice-free shoreline.

    Normally, walruses rest atop sea ice while foraging on the shallow sea floor. But that year, and almost every year since, sea-ice retreat made that impossible by late summer. Pacific walruses have adapted by hauling out on land, but scientists with the US Fish and Wildlife Service worry that their numbers will continue to decline. Here and across the region, the effects of Arctic thawing will ripple through ecosystems.

    In the ocean, photosynthetic plankton that thrive in open water will replace algae that grow on ice. Some models suggest that biological productivity in a seasonally ice-free Arctic could increase by up to 70% by 2100, which could boost revenue from Arctic fisheries even more. (To prevent a seafood gold rush, five Arctic nations have agreed to refrain from unregulated fishing in international waters for now.) Many whales already seem to be benefiting from the bounty of food, says Sue Moore, an Arctic mammal specialist at the Pacific Marine Environmental Laboratory.

    But the changing Arctic will pose a challenge for species whose life cycles are intimately linked to sea ice, such as walruses and Arctic seals—as well as polar bears, which don’t have much to eat on land. Research suggests that many will starve if the ice-free season gets too long in much of the Arctic. “Basically, you can write off most of the southern populations,” says Andrew Derocher, a biologist at the University of Alberta in Edmonton, Canada. Such findings spurred the US Fish and Wildlife Service to list polar bears as threatened in 2008.

    The last of the ice

    Ice-dependent ecosystems may survive for longest along the rugged north shores of Greenland and Canada, where models suggest that about half a million square kilometres of summer sea ice will linger after the rest of the Arctic opens up. Wind patterns cause ice to pile up there, and the thickness of the ice—along with the high latitude—helps prevent it from melting. “The Siberian coastlines are the ice factory, and the Canadian Arctic Archipelago is the ice graveyard,” says Robert Newton, an oceanographer at Columbia University’s Lamont–Doherty Earth Observatory in Palisades, New York.

    Groups such as the wildlife charity WWF have proposed protecting this ‘last ice area’ as a World Heritage Site in the hope that it will serve as a life preserver for many Arctic species. Last December, Canada announced that it would at least consider setting the area aside for conservation, and indigenous groups have expressed interest in helping to manage it. (Before he left office, then-US president Barack Obama joined Canadian Prime Minister Justin Trudeau in pledging to protect 17% of the countries’ Arctic lands and 10% of marine areas by 2020.)

But the last ice area has limitations as an Arctic Noah’s ark. Some species don’t live in the region, and those that do are there in only small numbers. Derocher estimates that there are fewer than 2,000 polar bears in that last ice area today—a fraction of the total Arctic population of roughly 25,000. How many bears will live there in the future depends on how the ecosystem evolves with warming.

    The area may also be more vulnerable than global climate models suggest. Bruno Tremblay, a sea-ice researcher at McGill University in Montreal, Canada, and David Huard, an independent climate consultant based in Quebec, Canada, studied the fate of the refuge with a high-resolution sea-ice and ocean model that better represented the narrow channels between the islands of the Canadian archipelago.

    In a report commissioned by the WWF, they found that ice might actually be able to sneak between the islands and flow south to latitudes where it would melt. According to the model, Tremblay says, “even the last ice area gets flushed out much more efficiently”.

    If the future of the Arctic seems dire, there is one source of optimism: summer sea ice will return whenever the planet cools down again. “It’s not this irreversible process,” Stroeve says. “You could bring it back even if you lose it all.”

    Unlike land-based ice sheets, which wax and wane over millennia and lag behind climate changes by similar spans, sea ice will regrow as soon as summer temperatures get cold enough. But identifying the exact threshold at which sea ice will return is tricky, says Dirk Notz, a sea-ice researcher at the Max Planck Institute for Meteorology in Hamburg, Germany. On the basis of model projections, researchers suggest that the threshold hovers around 450 parts per million (p.p.m.)—some 50 p.p.m. higher than today. But greenhouse-gas concentrations are not the only factor that affects ice regrowth; it also depends on how long the region has been ice-free in summer, which determines how much heat can build up in the Arctic Ocean.

    Notz and his colleagues studied the interplay between greenhouse gases and ocean temperature with a global climate model. They increased CO2 from pre-industrial concentrations of 280 p.p.m. to 1,100 p.p.m.—a bit more than the 1,000 p.p.m. projected by 2100 if no major action is taken to curtail greenhouse-gas emissions. Then they left it at those levels for millennia.

    This obliterated both winter and summer sea ice, and allowed the ocean to warm up. The researchers then reduced CO2 concentrations to levels at which summer ice should have returned, but it did not regrow until the ocean had a chance to cool off, which took centuries.

    By contrast, if the Arctic experiences ice-free summers for a relatively short time before greenhouse gases drop, then models suggest ice would regrow much sooner. That could theoretically start to happen by the end of the century, assuming that nations take very aggressive steps to reduce carbon dioxide levels, according to Newton, Pfirman and their colleagues. So even if society cannot forestall the loss of summer sea ice in coming decades, taking action to keep CO2 concentrations under control could still make it easier to regrow the ice cover later, Notz says.

    Global cooling

    Given the stakes, some researchers have proposed global-scale geoengineering to cool the planet and, by extension, preserve or restore ice. Others argue that it might be possible to chill just the north, for instance by artificially whitening the Arctic Ocean with light-coloured floating particles to reflect sunlight. A study this year suggested installing wind-powered pumps to bring water to the surface in winter, where it would freeze, forming thicker ice.

    But many researchers hesitate to embrace geoengineering. And most agree that regional efforts would take tremendous effort and have limited benefits, given that Earth’s circulation systems could just bring more heat north to compensate. “It’s kind of like walking against a conveyor the wrong way,” Pfirman says. She and others agree that managing greenhouse gases—and local pollutants such as black carbon from shipping—is the only long-term solution.

    Returning to a world with summer sea ice could have big perks, such as restoring some of the climate services that the Arctic provides to the globe and stabilizing weather patterns. And in the region itself, restoring a white Arctic could offer relief to polar bears and other ice-dependent species, says Pfirman. These creatures might be able to weather a relatively short ice-free window, hunkered down in either the last ice area or other places set aside to preserve biodiversity. When the ice returned, they could spread out again to repopulate the Arctic.

    That has almost certainly happened during past climate changes. For instance, researchers think the Arctic may have experienced nearly ice-free summers during the last interglacial period, 130,000 years ago.

    But, one thing is certain: getting back to a world with Arctic summer sea ice won’t be simple, politically or technically. Not everyone will embrace a return to an ice-covered Arctic, especially if it’s been blue for several generations. Companies and countries are already eyeing the opportunities for oil and gas exploration, mining, shipping, tourism and fishing in a region hungry for economic development. “In many communities, people are split,” Pfirman says.

    Some researchers also say that the idea of regrowing sea ice seems like wishful thinking, because it would require efforts well beyond what nations must do to meet the Paris agreement. Limiting warming to 2 °C will probably entail converting huge swathes of land into forest and using still-nascent technologies to suck billions of tonnes of CO2 out of the air. Lowering greenhouse-gas concentrations enough to regrow ice would demand even more.

    And if summer sea ice ever does come back, it’s hard to know how a remade Arctic would work, Derocher says. “There will be an ecosystem. It will function. It just may not look like the one we currently have.”

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 3:45 pm on February 1, 2017
Tags: SA

    From SA: “IceCube Closes in on Mysterious Nature of Neutrinos” 

Scientific American

    February 1, 2017
    Calla Cofield

    The Antarctica-based observatory has found hints of strange patterns in the ghostly particles’ masses

[Images: IceCube neutrino detector interior; U Wisconsin IceCube neutrino detector]

    Buried under the Antarctic ice, the IceCube experiment was designed primarily to capture particles called neutrinos that are produced by powerful cosmic events, but it is also helping scientists learn about the fundamental nature of these ghostly particles.

    At a meeting of the American Physical Society (APS) in Washington, D.C., this week, scientists with the IceCube collaboration presented new results that contribute to an ongoing mystery about the nature of neutrinos. These particles pour down on Earth from the sun, but they mostly pass unimpeded, like ghosts, through regular matter.

The new results support evidence of a strange symmetry in measurements of one neutrino mass. In particle physics, symmetries often indicate underlying physics that scientists haven’t yet unearthed.

    Mystery of the neutrino mass

    Neutrinos are fundamental particles of nature. They aren’t one of the particles that make up atoms. (Those are electrons, protons and neutrons.) Neutrinos very, very rarely interact with regular matter, so they don’t really influence human beings at all (unless, of course, you happen to be a particle physicist who studies them). The sun generates neutrinos in droves, but for the most part, those particles pour through the Earth, like phantoms.

    The [U Wisconsin] IceCube Neutrino Observatory is a neutrino detector buried under 0.9 miles (1.45 kilometers) of ice in Antarctica. The ice provides a shield from other types of radiation and particles that would otherwise overwhelm the rare instances when neutrinos do interact with the detector and create a signal for scientists to study.

    Neutrinos come in three “flavors”: the tau neutrino, the muon neutrino and the electron neutrino. For a long time, scientists debated whether neutrinos had mass or if they were similar to photons (particles of light), which are considered massless. Eventually, scientists showed that neutrinos do have mass, and the 2015 Nobel Prize was awarded for work on neutrinos, including investigations into neutrino masses.

    But saying that neutrinos have mass is not the same as saying that a rock or an apple has mass. Neutrinos are particles that exist in the quantum world, and the quantum world is weird—light can be both a wave and a particle; cats can be both alive and dead. So it’s not that each neutrino flavor has its own mass, but rather that the neutrino flavors combine into what are called “mass eigenstates,” and those are what scientists measure. (For the purpose of simplicity, a Michigan State University statement describing the new findings calls the mass eigenstates “neutrino species.”)
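In the standard notation the article is paraphrasing, each flavor state is a quantum superposition of the three mass eigenstates, weighted by entries of the PMNS mixing matrix U (a textbook formulation, added here for reference):

```latex
% Flavor states (alpha = e, mu, tau) as superpositions of mass eigenstates (i = 1, 2, 3).
|\nu_\alpha\rangle = \sum_{i=1}^{3} U^{*}_{\alpha i} \, |\nu_i\rangle
```

The "fractions" of each flavor that go into mass eigenstate i, discussed next, are the squared magnitudes |U_{αi}|².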

    “One of the outstanding questions is whether there is a pattern to the fractions that go into each neutrino species,” Tyce DeYoung, an associate professor of physics and astronomy at Michigan State University and one of the IceCube collaborators working on the new finding, told Space.com.

    One neutrino species appears to be made up of mostly electron neutrinos, with some muon and tau neutrinos; the second neutrino species seems to be an almost equal mix of all three; and the third is still a bit of a mystery, but one previous study suggested that it might be an even split between muon and tau, with just a few electron neutrinos thrown in.

    At the APS meeting, Joshua Hignight, a postdoctoral researcher at Michigan State University working with DeYoung, presented preliminary results from IceCube that support the equal split of muon and tau neutrinos in that third mass species.

    “This question of whether the third type is exactly equal parts muon and tau is called the maximal mixing question,” he said. “Since we don’t know any reason that this neutrino species should be exactly half and half, that would either be a really astonishing coincidence or possibly telling us about some physical principle that we haven’t discovered yet.”
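In the notation above, maximal mixing is the statement that the third mass eigenstate contains equal muon and tau fractions, which (neglecting the small contribution of the mixing angle θ13) corresponds to a mixing angle of exactly 45 degrees:

```latex
|U_{\mu 3}|^{2} = |U_{\tau 3}|^{2}
\quad \Longleftrightarrow \quad
\theta_{23} = 45^{\circ}
```

No known principle forces θ23 to sit exactly at 45°, which is why an exact equality would hint at an undiscovered symmetry.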

    Generally speaking, any given feature of the universe can be explained either by a random process or by some rule that governs how things behave. If the number of muon and tau neutrinos in the third neutrino species were determined randomly, there would be much higher odds that those numbers would not be equal.

    “To me, this is very interesting, because it implies a fundamental symmetry,” DeYoung said.

    To better understand why the equal number of muon and tau neutrinos in the mass species implies nonrandomness, DeYoung gave the example of scientists discovering that protons and neutrons (the two particles that make up the nucleus of an atom) have very similar masses. The scientists who first discovered those masses might have wondered if that similarity was a mere coincidence or the product of some underlying similarity.

It turns out, it’s the latter: Neutrons and protons are both made of three elementary particles called quarks (though in different combinations of the same two quark varieties). In that case, a similarity on the surface indicated something hidden below, the scientists said.

    The new results from IceCube are “generally consistent” with recent results from the T2K neutrino experiment in Japan, which is dedicated to answering questions about the fundamental nature of neutrinos.

[Images: the T2K experiment and a map of its site]

But the NOvA experiment, based at Fermi National Accelerator Laboratory [FNAL] outside Chicago, did not “prefer the exact symmetry” between the muon and tau neutrinos in the third mass species, according to DeYoung.

[Images: the FNAL NOvA experiment, a map of its site, and the NOvA Near Detector]

    “That’s a tension; that’s not a direct contradiction at this point,” he said. “It’s the sort of not-quite-agreement that we’re going to be looking into over the next couple of years.”

IceCube was designed to detect somewhat-high-energy neutrinos from distant cosmic sources, but most neutrino experiments on Earth detect lower-energy neutrinos from the sun or from nuclear reactors. Both T2K and NOvA detect neutrinos at about an order of magnitude lower energy than IceCube. The consistency between the measurements made by IceCube and T2K is a test of “the robustness of the measurement” and “a success for our standard theory” of neutrino physics, DeYoung said.

    Neutrinos don’t affect most people’s day-to-day lives, but physicists hope that by studying these particles, they can find clues about some of the biggest mysteries in the cosmos. One of those cosmic mysteries could include an explanation for dark matter, the mysterious stuff that is five times more common in the universe than the “regular” matter that makes up planets, stars and all of the visible objects in the cosmos. Dark matter has a gravitational pull on regular matter, and it has shaped the cosmic landscape throughout the history of the universe. Some theorists think dark matter could be a new type of neutrino.

    The IceCube results are still preliminary, according to DeYoung. The scientists plan to submit the final results for publication after they’ve finished running the complete statistical analysis of the data.

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 3:30 pm on January 24, 2017
Tags: Hidden Figures, Human computers, NACA, SA

From SA: “The Story of NASA’s Real ‘Hidden Figures’”

Scientific American

    January 24, 2017
    Elizabeth Howell

    Mary Jackson was one of the “human computers” portrayed in the film “Hidden Figures.” Credit: NASA

    In the 1960s, Mercury astronauts Alan Shepard, Gus Grissom, John Glenn and others absorbed the accolades of being the first men in space. Behind the scenes, they were supported by hundreds of unheralded NASA workers, including “human computers” who did the calculations for their orbital trajectories. Hidden Figures, a 2016 book by Margot Lee Shetterly and a movie based on the book, celebrates the contributions of some of those workers.

    Beginning in 1935, the National Advisory Committee for Aeronautics (NACA), a precursor of NASA, hired hundreds of women as computers. The job title designated someone who performed mathematical equations and calculations by hand, according to a NASA history. The computers worked at the Langley Memorial Aeronautical Laboratory in Virginia.

Human computers were not a new concept. In the late 19th and early 20th century, female “computers” at Harvard University analyzed star photos to learn more about the stars’ basic properties.

Edward Charles Pickering (left), director of the Harvard College Observatory, hired women to analyze the images. Credit: Harvard-Smithsonian Center for Astrophysics, via Space.com.

These women made discoveries still fundamental to astronomy today. For example: Williamina Fleming is best known for classifying stars based on their temperature, and Annie Jump Cannon developed a stellar classification system still used today (from hottest to coolest stars: O, B, A, F, G, K, M).

    During World War II, the computer pool was expanded. Langley began recruiting African-American women with college degrees to work as computers, according to NASA. However, segregation policies required that these women work in a separate section, called the West Area Computers—although computing sections became more integrated after the first several years.

    As the years passed and the center evolved, the West Computers became engineers, (electronic) computer programmers, the first black managers at Langley and trajectory whizzes whose work propelled the first American, John Glenn, into orbit in 1962.

    “Hidden Figures” focuses on three computers, Mary Jackson, Katherine Johnson and Dorothy Vaughan. Here are brief biographies of these women:

    Mary Jackson (1921-2005)

Jackson hailed from Hampton, Virginia. She graduated with high marks from high school and received a bachelor of science degree in mathematics and physical science from the Hampton Institute, according to a biography posted on NASA’s website. She began her career as a schoolteacher, and took on several other jobs before joining NACA.

    As a computer with the all-black West Area Computing section, she was involved with wind tunnels and flight experiments. Her job was to extract the relevant data from experiments and flight tests. She also tried to help other women advance in their career, according to the biography, by advising them on what educational opportunities to pursue.

    “She discovered that occasionally it was something as simple as a lack of a couple of courses, or perhaps the location of the individual, or perhaps the assignments given them, and of course, the ever present glass ceiling that most women seemed to encounter,” stated the biography.

    After 30 years with NACA and NASA (at which point she was an engineer), Jackson decided to become an equal opportunity specialist to help women and minorities. Although described as a behind-the-scenes sort of worker, she helped many people get promoted or become supervisors. She retired from NASA in 1985.

    Katherine Johnson (born 1918)

    Johnson showed early brilliance in West Virginia schools by being promoted several years ahead of her age, according to NASA. She attended a high school on the campus of West Virginia State College by age 13, and began attending the college at age 18. After graduating with highest honors, she started work as a schoolteacher in 1937.

    Two years later, when the college chose to integrate its graduate schools, Johnson and two male students were offered spots. She quickly enrolled, but left to have children. In 1953, when she was back in the workforce, Johnson joined the West Area Computing section at Langley.

She began her career working with data from flight tests, but her life quickly changed after the Soviet Union launched the first satellite in 1957. For example, some of her math equations were used in a lecture series compendium called Notes on Space Technology. These lectures were given by engineers who later formed the Space Task Group, NACA’s section on space travel.

For the Mercury missions, Johnson did trajectory analysis for Shepard’s Freedom 7 mission in 1961, and (at John Glenn’s request) did the same job for his orbital mission in 1962. Although Glenn’s trajectory had been planned by electronic computers, he reportedly wanted Johnson herself to run through the equations to make sure they were safe.

    “When asked to name her greatest contribution to space exploration, Katherine Johnson talks about the calculations that helped synch Project Apollo’s Lunar Lander with the moon-orbiting Command and Service Module,” NASA wrote. “She also worked on the space shuttle and the Earth Resources Satellite, and authored or coauthored 26 research reports.”

Johnson retired from NASA in 1986. At age 97, in 2015, she received the Presidential Medal of Freedom, the highest civilian honor in the United States.

    Dorothy Vaughan (1910-2008)

    Vaughan joined the Langley Memorial Aeronautical Laboratory in 1943 after beginning her career as a math teacher in Farmville, Virginia. Her job during World War II was a temporary position, but (in part thanks to a new executive order prohibiting discrimination in the defense industry) she was hired on permanently because the laboratory had a wealth of data to process.

    Still, the law required that she and her black colleagues work separately from the white female computers, and the section’s first supervisors were white. Vaughan became the first black NACA supervisor in 1949 and made sure that her employees received promotions or pay raises if merited.

    Segregation at the laboratory ended in 1958, when NACA became NASA and created an analysis and computation division. Vaughan was an expert programmer in FORTRAN, a prominent computer language of the day, and also contributed to a satellite-launching rocket called Scout (Solid Controlled Orbital Utility Test). She retired from NASA in 1971.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 4:55 pm on January 22, 2017 Permalink | Reply
    Tags: How Trump Could Unravel Obama’s Science Legacy, SA

    From SA: “How Trump Could Unravel Obama’s Science Legacy” 

    Scientific American

    Scientific American

    January 20, 2017
    Lauren Morello

    Land in the Bears Ears region of Utah is among the areas designated as national monuments by Barack Obama. Credit: Bureau of Land Management Flickr (CC BY 2.0).

    Barack Obama used his presidential powers to make changes that affect science. Once Donald Trump is inaugurated as president on 20 January, he will be able to do the same. These charts illustrate the government that Trump inherits as it relates to science and research, and explore how the new president might seek to take things in a different direction.

    Appointing leaders and freezing new hires

    As does every new president, Trump gets to fill out the ranks of federal science agencies with political appointees, from the agency chiefs who require Senate confirmation to lower-level bureaucrats. These jobs range from two spots at the US Geological Survey—the director and an assistant—to 358 positions at the Department of Energy. Trump has already nominated a handful of people to fill these slots, including former Governor of Texas Rick Perry, who has questioned the science underlying climate change, as energy secretary.

    Credit: Nature, January 19, 2017, doi:10.1038/nature.2017.21327

    The much larger ranks of non-political ‘career’ employees, meanwhile, could shrink under Trump, who has pledged to freeze federal hiring within his first 100 days in office. Staffing levels at science agencies—which stayed relatively flat under Obama, despite his enthusiasm for research—could eventually dwindle through attrition.

    Balancing basic and applied science

    Funding science involves a delicate balance between basic and applied research. Science in the Obama years tilted towards the applied—from the launch of the ambitious Precision Medicine Initiative to sequence the genomes of one million people, to the creation of a string of institutes to foster robotics and other innovative manufacturing technologies in partnership with private industry.

    Credit: Nature, January 19, 2017, doi:10.1038/nature.2017.21327

    It is not clear which flavour of research Trump will favour, in part because he has said little publicly about science before or after the election. In September, Trump wrote that “scientific advances do require long term investment”, in response to questions from the advocacy group ScienceDebate.org. But the president-elect’s pick to lead the White House Office of Management and Budget, Representative Mick Mulvaney (Republican, South Carolina), has pushed for sharp cuts in government spending in recent years.

    Undoing Obama’s conservation triumphs

    More than any other president, Obama has used the Antiquities Act—a law that dates back to 1906—to protect public lands from development. He has declared 29 new national monuments, such as the Bears Ears buttes in Utah, and enlarged 5 others, preserving a total of around 553 million acres of land and water.

    Credit: Nature, January 19, 2017, doi:10.1038/nature.2017.21327

    Some Republican politicians have suggested that Trump should remove protections from some or all of these areas, but most legal scholars say that only an act of Congress can reverse a monument designation. That might not stop the Trump administration from trying. The president-elect’s nominee to lead the Interior Department, Representative Ryan Zinke (Republican, Montana), told a Senate committee on 17 January that Trump could “amend”, if not fully rescind, the monuments that Obama created.

    Reversing stem cell and climate change policies?

    Faced with an often-hostile Congress, Obama enacted many of his signature policies by executive order—from reversing restrictions on research with human embryonic stem cells to helping communities prepare for climate change. That strategy now seems poised to backfire: Trump has vowed to reverse “every unconstitutional executive action, memorandum and order issued by President Obama” beginning on his first day in office, January 20.

    Credit: Nature, January 19, 2017, doi:10.1038/nature.2017.21327

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 9:41 am on January 16, 2017 Permalink | Reply
    Tags: Autism Risk May Arise From Sex-Specific Traits, SA, SNP - single nucleotide polymorphism

    From SA: “Autism Risk May Arise From Sex-Specific Traits” 

    Scientific American

    Scientific American

    January 16, 2017
    Ann Griswold

    Genetic sequences that code for physical features that differ between boys and girls also seem to contribute to risk for the disorder.

    Alena Baranova, EyeEm, Getty Images

    Basic biology: Different genetic variants contribute to autism risk in boys versus girls. Alfred Pasieka / Science Photo Library

    Genetic variants that shape physical features that vary with sex, such as waist-to-hip ratio, may also affect autism risk, according to a new study.

    Many of the genes involved in these features are not linked to autism or even the brain. Instead, they help establish basic physical differences between the sexes, says lead investigator Lauren Weiss, associate professor of psychiatry at the University of California, San Francisco.

    “Whatever general biological sex differences cause a [variant] to have a different effect on things like height in males and females, those same mechanisms seem to be contributing to autism risk,” she says. The work appeared in November in PLOS Genetics.

    The results bolster the notion that mutations in some genes contribute to autism’s skewed sex ratio: The condition is diagnosed in about five boys for every girl. That may be because girls require a bigger genetic hit to show features of the condition, because sex hormones in the womb boost the risk in boys, or because autism is easier to detect in boys than in girls.

    The new study is the first to look at sex differences in common genetic variants called single nucleotide polymorphisms (SNPs). It shows that the sexes differ in which autism-linked SNPs they have, but not in the overall number of such SNPs.

    Separate sets:

    Weiss and her team analyzed published genetic data from four databases and unpublished data from five others. Altogether, they reviewed information from 8,646 individuals with autism, including 1,468 girls and women. They also analyzed data from 15,028 controls, some of whom are related to people in the autism group.

    The researchers first identified SNPs that differ between males with autism and their unaffected family members and unrelated controls. They then repeated the procedure for girls and women with autism.

    These two analyses revealed distinct sets of SNPs associated with autism: a set of five SNPs in boys and men and a separate set of three SNPs in girls and women. None of the variants have previously been associated with autism.

    The researchers then compared males who have autism with females who have the condition. They found similar levels of genetic variation in the two groups, with equal numbers of autism risk genes affected. This result suggests that common variants do not contribute to a stronger genetic hit in girls with autism.
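
    To make the method concrete, a case-control association test at its simplest compares allele counts for one SNP between groups. The sketch below is a minimal Python illustration with invented counts; it is not the study’s actual pipeline, which ran such comparisons genome-wide and separately by sex:

```python
from scipy.stats import chi2_contingency

# Hypothetical allele counts for a single SNP (invented for illustration).
# Rows: cases (autism) vs. controls; columns: allele A vs. allele B.
table = [
    [1200, 800],   # cases
    [1500, 1300],  # controls
]

# Chi-square test of independence: does allele frequency differ between groups?
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3g}")
```

    In a genome-wide scan this comparison is repeated for every SNP, and the resulting p-values must be corrected for the enormous number of tests before any variant can be declared associated.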

    Body of data:

    When the researchers compared people who have autism with controls, they did not find any differences in SNPs in genes that respond to sex hormones.

    The team then looked at 11 SNPs known to influence height, weight, body mass index, hip and waist measurements in women, and 15 variants that influence these physical traits in men. They found more of these sex-specific SNPs in people with autism than in controls. None of these SNPs have previously been associated with autism.

    The findings suggest that different SNPs contribute to autism risk in boys and girls.

    The fact that some of these SNPs also shape physical traits in a sex-specific way is particularly interesting, says Meng-Chuan Lai, assistant professor in psychiatry at the University of Toronto, who was not involved in the study. Scientists should examine whether sex differences in brain structure in people with autism track with the sex-specific SNPs, he says.

    Weiss says she hopes the findings will spur researchers to pay more attention to the influences of sex when sifting through genomic data. Outfitting genetic repositories with the option to sort data by sex would be the next step for that approach.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 1:11 pm on January 11, 2017 Permalink | Reply
    Tags: SA, We Must Learn How to Talk about Science--Fast

    From SA: “We Must Learn How to Talk about Science–Fast” 

    Scientific American

    Scientific American

    January 10, 2017
    Paul A. Hanle

    Global warming is just one area where public ignorance about science is extremely dangerous. Credit: NASA Scientific Visualization Studio, Goddard Space Flight Center, Wikimedia

    Today in Washington, the National Academies of Sciences, Engineering, and Medicine are convening a public discussion of their December report on communicating science effectively. It could not come at a more relevant moment: the day confirmation hearings begin for the President-Elect’s cabinet choices. Arguably it should have happened long before, as we find astonishing disdain for evidence-based thinking among many of the leaders and advisors who are now taking the reins of government.

    The report identifies “cross-cutting themes” common to the range of issues that were addressed, from climate change to genetically modified organisms. One major finding is about the “deficit model”—the idea that non-scientists, if only informed of the facts of science, will think and act more in line with scientific evidence—which the authors say is widespread among scientists and science communicators. As those of us whose mission is to reach wide and diverse audiences know, and the Academies state unequivocally, this deficit model is wrong. Not always wrong, but mostly wrong, especially where the science communication bears on contentious issues like climate change. In such a context, people rely on their own values and beliefs, knowledge and skills, goals and needs—and on those in their communities and peer groups—more than on expert opinion. Not surprising, really, but quite clear and useful.

    In climate change, that finding translates to this: there is no use in just beating those who doubt climate change—the vast majority of whom are conservative in politics—over the head with the facts.

    But it does not translate, either, to the scientific community doing nothing to convey those facts and what they imply for action to address the climate problem. On the contrary, confronting falsehoods and lies about climate change is critically important in this moment when misrepresentations threaten to recur like cancer after years of remission. As more than one leading climate scientist has noted in the wake of the election, it falls to the expert scientific community—with virtual unanimity in accepting the reality, human cause, and urgency of addressing the climate problem—to communicate these facts to the people about to take power and to the public who are their constituency.

    The question is, how do we do that in the face of the disdain for evidence and the attacks on evidence-based thinking that have permeated so much of recent politics? The report contains gems for scientists—indeed for anyone—practicing science communication, and a call for better and more work to understand the huge amount we still don’t know about how to do this. And we need to do this right now, given real threats, grounded in already evident falsehoods, to dismantle and discard the edifice of federal science funding at such agencies as NASA, NOAA, DOE and the NSF, which has been a foundation of U.S. greatness in science.

    One of these gems is the Academy’s reiteration (in this newly charged context) of the conclusion of many researchers that “science as an institution possesses norms and practices that restrain scientists and offer means for policing and sanctioning those who violate its standards,” while “those who are not bound by scientific norms have at times intentionally mischaracterized scientific information to serve their financial or political interests.” It’s an asymmetrical game we must play. Science in contention needs social and behavioral science to help it determine how authoritative voices from science can be heard when authority is important and in question.

    Not everyone is qualified to judge scientific truth, but everyone must know how to grasp what’s needed to make informed judgments about science that affects their lives in issues like climate change. Alas, explaining how exactly to make that happen is not in the purview of the report because it is a research agenda, and no doubt also aims to stay above the fray of politics. This would be too bad, if it weren’t for the commitment of researchers and organizations that communicate about climate to undertake the research that the report recommends. The Academy calls for a pragmatic, systems approach, developing explanatory models with predictive value at the outset—with practitioners and researchers working in partnership across multiple disciplines.

    Extraordinary times call for extraordinary measures. Science communicators need to act now and learn quickly, with proper deliberation but real urgency. One idea is to do practical research about what works in science communication with different audiences, building on the existing body of knowledge, in real time. There are at least three strong reasons to take this “build the airplane while flying” approach. First, as President Kennedy said, because it is hard. In this case, because it lends real-world tempering that is measurable in effectiveness.

    Second, a key measure of the robustness of a scientific explanation is its capacity to predict—essentially the same requirement as that which guides practice, and which bears fruit as soon as it is recognized. With no lag time required to translate an academic insight into a point of practice, we will benefit immediately from research on communication while communicating the scientific truth. Third, and most important, science communicators must use these tools as soon as possible in the face of what appears to be an historic turn against science in key places. Apparently, we have failed to impress a massive swath of the American public, and this failure threatens the very foundations of science through denial of evidence, outright falsehoods and the elevation of ideological thinking above facts. This is the wolf at the door…and if science doesn’t figure out how to counter it quickly, we might just as well throw the door open.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 11:48 am on January 11, 2017 Permalink | Reply
    Tags: Pancreatic Cancer, SA, Steve Jobs   

    From SA: “The Puzzle of Pancreatic Cancer: How Steve Jobs Did Not Beat the Odds—but Nobel Winner Ralph Steinman Did” 

    Scientific American

    Scientific American

    October 7, 2011 [Just brought forward again.]
    Katherine Harmon

    Apple CEO Steve Jobs holds up the new iPhone that was introduced at Macworld on January 9, 2007 in San Francisco, California. Credit: David Paul Morris Stringer, Getty Images

    Editor’s note (1/10/17): Ten years ago, on January 9, 2007, Steve Jobs introduced the world to the iPhone. In honor of the smart phone’s game-changing impact on personal electronics and communications, we are republishing the following story about Jobs’ battle with cancer, published shortly after his death in 2011.

    Steve Jobs was a rare case, right down to his death. Announced Wednesday, Jobs’s death from “complications of pancreatic cancer” only hints at the vast complexity of the disease to which he succumbed at the age of 56.

    Jobs joined recently announced Nobel Prize winner Ralph Steinman, actor Patrick Swayze and football great Gene Upshaw as the latest bold-faced name to die from this aggressive disease—one that even he, with his vast fortune, and Steinman, with his use of experimental immunological treatments, could not forestall indefinitely.

    Most pancreatic cancers (53 percent) are diagnosed after they have spread—and those have an exceedingly low survival rate, with just 1.8 percent of patients living for more than five years after diagnosis. (For all types of the cancer, the average five-year survival rate when diagnosed is only slightly higher at 3.3 percent.) So how did Jobs, who was diagnosed in the fall of 2003—and who revealed it publicly in 2004—manage to survive for eight years?

    Jobs had a rare form of the cancer, known as neuroendocrine cancer, which grows more slowly and is easier to treat, explains Leonard Saltz, acting chief of the gastrointestinal oncology service at Memorial Sloan-Kettering Cancer Center. “Survival for many years or even decades with endocrine cancer is not surprising.” For that type, the sort that Jobs had, “survival is measured in years, as opposed to pancreatic cancer, which is measured in months.”

    “When you have a pancreatic neuroendocrine tumor, that is substantially different from pancreatic cancer,” Saltz says.

    Steinman, on the other hand, did have the type that is usually fatal within a year after diagnosis. “Ralph had the garden variety, poorly differentiated pancreatic cancer,” says Sarah Schlesinger, an associate professor of immunology and cell physiology at The Rockefeller University, where she worked with Steinman.

    Given the grim prognosis for both these forms of cancer, researchers are hard at work trying to develop better treatments and diagnostics, and to figure out just why one patient might live for eight years—and another for eight months.

    Two different kinds

    Pancreatic cancer is a rare disease, with about 44,000 new cases diagnosed in the U.S. each year and a lifetime risk of about 1.4 percent. The vast majority of those cancers—some 95 percent—are known as adenocarcinomas, the sort that Steinman had. Jobs’s form, known as a pancreatic neuroendocrine tumor (pNET), accounts for the small remaining fraction of cases.

    The pancreas itself is essentially two different organs, which means two distinct kinds of tissue—and two very different types of cancer, Saltz points out. The most common kind of pancreatic cancer, the adenocarcinomas, originate in what is known as the exocrine portion of the pancreas. This is the main mass of the organ, which makes digestive enzymes that get shuttled to the gastrointestinal tract via specialized ducts.

    “Scattered in that larger organ are thousands of tiny islands,” Saltz explains. “These are islands of endocrine tissue,” which makes hormones that are secreted into the blood. It was a cancer of these islet cells that Jobs had.

    Difficult to diagnose

    Pancreatic cancer is so deadly in large part because it is often caught at a very late stage. Unlike lung or colon cancer, it does not create a lot of early symptoms. Saltz said he was hesitant to even list the manifestations (which include upper abdominal pain, weight loss, appetite loss and blood clots) because they are such common complaints that, he noted, everyone would go home and decide by this evening that they had pancreatic cancer.

    Most cases are discovered only after symptoms persist or more severe indications, such as jaundice, appear.

    Some groups are looking for a better way to screen for pancreatic cancer, in hopes of catching it earlier. “There’s a big push for developing a blood test,” says Philip Arlen, president and CEO of Neogenix Oncology, Inc., a company that is looking into both diagnostics and treatment for pancreatic cancer. They have found a couple of genetic markers that are present in pancreatic cancer but not in normal tissue. The goal, says Arlen, who previously worked as a researcher at the National Cancer Institute, is to develop something akin to a PSA (prostate-specific antigen) test for prostate cancer.

    There are clues, for example, that pancreatic cancer is not as much a sudden-onset disease as it often seems. After studying the accumulation of genetic mutations in pancreatic cancer tumors, researchers concluded that the disease takes an average of seven years to form a substantive tumor and closer to a decade to start moving to other organs, according to research published last October. Armed with that knowledge, and with the related finding of pre-malignant lesions, Arlen is hopeful that a non-invasive screening method will eventually be developed.

    Widespread screening for more common cancers, such as breast, colon and prostate, has come under fire lately for leading to too many false positives and excessive follow-up treatment. With even rarer diseases, screening is much trickier, Saltz points out, and would demand an exceedingly low false-positive rate. “Pancreatic cancer, although it’s a terrifying disease, is rare,” he says.

    Trying new treatments

    When pancreatic cancer is caught early, doctors will usually try to remove it surgically. As Saltz points out, however, the chances that it will come back in the next year or two are still relatively high. And the surgery itself is risky: the pancreas is lodged deep within the abdomen, surrounded by—and connected to—other major organs. “It’s considered the magnum opus of a surgeon’s repertoire,” Schlesinger says of partial pancreas removal, which is known as the Whipple procedure.

    If the cancer has already spread, as it had in Steinman’s case, the most common approach is chemotherapy, which “for regular pancreatic cancer, is not very effective,” Saltz says. The mainstay is the chemo drug gemcitabine (Gemzar), which is one of the treatments Steinman received. In trials, some patients saw no benefit, but for a minority, it extended life by as long as a few years, suggesting that an essential molecular difference exists in their tumors.

    Despite initial positive signs from chemo, and even when Steinman was doing better, “he felt like he was living with Damocles’ sword over his neck—he never knew when it was going to come back,” Schlesinger says. So he turned to what he knew: the immune system. “Ralph felt deeply that the key to a cure is getting the immune system revved up enough to fight off the tumor,” Schlesinger says. “That wasn’t such a simple thing to do.”

    Enlisting the immune system to fight off a cancer has long been a goal of researchers. The only immunotherapy currently approved for general use as cancer treatment is a drug for metastatic melanoma (ipilimumab, or Yervoy, approved in March). Saltz calls that approval good “evidence that it’s an important avenue to explore” for other forms of cancer.

    Scientist as test subject

    When word spread that Steinman had pancreatic cancer, Schlesinger says, there was an outpouring of offers from fellow immunologists to try treatments they were working on—many of them made possible by Steinman’s own discoveries about the immune system’s dendritic cells. Not all the experimental drugs were meant to tackle pancreatic tumors; some were for skin or prostate cancer.

    In all, Steinman tried eight different experimental therapies, Schlesinger says. But they were not under-the-table, backroom needle jabs, she is quick to point out. Each drug was already being tested on other patients in phase I clinical trials, and Schlesinger and Steinman took great pains—and spent many hours—to ensure all of the proper institutional and government approvals were granted before he got the therapies.

    The first treatment he got was a vaccine called GVAX, under development to treat prostate cancer. He also received a novel therapy that worked on a developmental pathway (the hedgehog signaling pathway) and two that were based on dendritic cells: one in which dendritic cells were created from his own blood cells that were then “pulsed with RNA that had been isolated from his tumor,” Schlesinger explains; and another in which the dendritic cells were filled with “peptides that were from his own tumor.” The hope was that the RNA and proteins from his tumor would help his dendritic cells stimulate his immune system to attack the cancer.

    Arlen’s group is testing, in a phase I trial, a monoclonal antibody to treat patients with the more common form of pancreatic cancer. Preliminary data show that the antibody finds its target in some 50 to 60 percent of patients with adenocarcinoma, he says. But that does not mean that it will leave them disease-free. And he hopes that a combination of the new approaches and the more standard drugs will yield even better results—a trial that the group plans to start next year.

    “I think it’s far too early to say they have a treatment for any of these diseases,” Saltz concludes.

    Treating Jobs’s cancer

    Endocrine cancer, the variety Jobs had, is treated with a different set of chemotherapy drugs. Two new drugs for this type were approved by the U.S. Food and Drug Administration (FDA) earlier this year: everolimus (sold as Afinitor), approved in May, blocks the mTOR kinase to alter cellular signaling, while sunitinib (sold as Sutent) blocks vascular endothelial growth factor receptors. “Neither is a cure—neither is a wonder drug for the disease,” Saltz says. “Each provides some modest benefit.”

    One form of treatment that is not recommended for most pancreatic cancer is a liver transplant. Media observers surmised that the transplant Jobs received in 2009 had been necessary because the cancer had spread to his liver. And although liver failure is a common cause of death for pancreatic cancer patients, because the liver is close to the pancreas and often gets invaded by the spreading cancer, getting a new one “is not an accepted standard form of treatment,” he says, citing a lack of evidence to show that it works.

    Even if the new liver staved off organ failure, the immunosuppressants necessary to avoid organ rejection “can reduce the body’s ability to fight off any cancer cells that remain,” Saltz says. And factoring the many other variables of real life, it’s ultimately not possible to conclude whether the liver transplant “made him live longer, the same or shorter—we don’t know,” Saltz remarks.

    Keys for a cure

    Steinman, however, is a much different case. With his collection of therapies, he did manage to beat the average odds for his type of pancreatic cancer—by years. But “which thing made the difference, we will still never know,” Schlesinger says. “My personal belief is it is a combination of therapies.” Steinman, for his part, “had so much faith in dendritic cells,” Schlesinger says. “He believed that his dendritic cells played an important part.” She notes that even though they did apply and get special, individual treatment protocols for Steinman to receive each of the experimental therapies, she never doubted what they were doing; “I only felt inadequate,” she says, having a background in dendritic cells and HIV rather than cancer research.

    To truly be able to hack into the inner workings of pancreatic cancer, “there needs to be more basic science work in humans,” Schlesinger says. Saltz points to the current efforts to better grasp the molecular and genetic differences of each tumor, in hopes of finding patterns in growth rate and treatment response, which might turn into better therapeutic targets. But much of what determines why one patient might live for seven years and another for seven months seems to depend on the biology of these cancers. Which, Saltz says, “is a nice elegant way of saying that we truly don’t understand.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     