Tagged: NYT

  • richardmitnick 1:41 pm on March 6, 2019 Permalink | Reply
    Tags: "Another Obstacle for Women in Science: Men Get More Federal Grant Money", Among the top 50 institutions funded by the N.I.H. the researchers found that women received median awards of $94000 compared with $135000 for men, At the Big Ten schools including Penn State the University of Michigan and Northwestern female principal investigators received a median grant of $66000 compared with $148000 for men, “It means women are working harder with less money to get to the same level as men” said Dr. Woodruff a researcher at the Northwestern University Feinberg School of Medicine, “That first grant is monumentally important and determines your trajectory” said Carolina Abdala a head and neck specialist at the University of Southern California who won her first N.I.H. grant , But when it comes to the size of those awards men are often rewarded with bigger grants than women according to a study published Tuesday in JAMA, For ambitious young scientists trying to start their own research labs winning a prestigious grant from the National Institutes of Health can be career making, Having less money put women at a disadvantage making it harder to hire graduate students and buy lab equipment, Identifying the problem is a step toward solving the problem, NYT, Only one in five applicants for an N.I.H. grant lands one, Over all the median N.I.H. award for female researchers at universities was roughly $126600 compared with $167700 for men., The disparity was even greater at the nation’s top universities, The N.I.H. did not dispute the study’s findings and said it was working to address the funding disparities and more broadly the gender inequities that bedevil women in the fields, The study analyzed 54000 grants awarded from 2006 to 2017 and used key benchmarks to ensure recipients were at similar points in their careers, The study by researchers at Northwestern University confirms longstanding disparities between men and women in the fields of science, There was one exception to the pattern- the study found that women who were applying for individual research grants received nearly $16000 more than male applicants 11% of grants,   

    From The New York Times: Women in STEM-“Another Obstacle for Women in Science: Men Get More Federal Grant Money” 

    New York Times

    From The New York Times

    March 5, 2019
    Andrew Jacobs

    A scientist working with radioactive material in the isotope laboratory of the National Institutes of Health, circa 1950. Credit National Institutes of Health.

    For ambitious young scientists trying to start their own research labs, winning a prestigious grant from the National Institutes of Health can be career making.

    But when it comes to the size of those awards, men are often rewarded with bigger grants than women, according to a study published Tuesday in JAMA, which found that men who were the principal investigators on research projects received $41,000 more than women.

    The disparity was even greater at the nation’s top universities. At Yale, women received $68,800 less than men, and at Brown, the median disparity was $76,500. Over all, the median N.I.H. award for female researchers was roughly $126,600, compared with $167,700 for men.
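
    For scale, the $41,000 figure is simply the difference of those two medians (a quick check, my arithmetic): $167,700 − $126,600 = $41,100, which rounds to the gap reported above.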

    The study, by researchers at Northwestern University, confirms longstanding disparities between men and women in the field of science. In recent years, a cavalcade of studies has documented biases that favor male researchers in hiring, pay, prize money, speaking invitations and even the effusiveness displayed in letters of recommendation.

    “It’s disappointing, but identifying the problem is a step toward solving the problem,” said Cori Bargmann, a neuroscientist who runs the $3 billion science arm of the Chan Zuckerberg Initiative, a philanthropic organization, and who was not involved in the study.

    In a statement, the N.I.H. did not dispute the study’s findings and said it was working to address the funding disparities and, more broadly, the gender inequities that bedevil women in the field.

    “We have and continue to support efforts to understand the barriers and factors faced by women scientists and to implement interventions to overcome them,” it said.

    Only one in five applicants for an N.I.H. grant lands one, an achievement that can be crucial in determining whether a young researcher succeeds or drops out of the field.

    “That first grant is monumentally important and determines your trajectory,” said Carolina Abdala, a head and neck specialist at the University of Southern California, who won her first N.I.H. grant in 1998. “It can help get you on the tenure track and it gets you into that club of successful scientists who can procure their own funding, which makes it easier to change jobs.”

    But the size of the grant can also be important in determining the scale and ambition of a junior researcher’s first lab. Teresa K. Woodruff, a co-author of the JAMA study, said that having less money put women at a disadvantage, making it harder to hire graduate students and buy lab equipment.

    “It means women are working harder with less money to get to the same level as men,” said Dr. Woodruff, a researcher at the Northwestern University Feinberg School of Medicine. “If we had the same footing, the engine of science would move a little faster toward the promise of basic science and medical cures.”

    The study analyzed 54,000 grants awarded from 2006 to 2017 and used key benchmarks to ensure recipients were at similar points in their careers. Among the top 50 institutions funded by the N.I.H., the researchers found that women received median awards of $94,000 compared with $135,000 for men. At the Big Ten schools, including Penn State, the University of Michigan and Northwestern, female principal investigators received a median grant of $66,000 compared with $148,000 for men.

    There was one exception to the pattern; in a curious twist, the study found that women who were applying for individual research grants received nearly $16,000 more than male applicants. Dr. Woodruff noted that such grants made up only 11 percent of N.I.H. grant money, but said more research was needed into funding disparities.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 7:01 pm on February 25, 2019 Permalink | Reply
    Tags: "A disturbance in the Force", "Axions? Phantom energy? Astrophysicists scramble to patch a hole in the universe, rewriting cosmic history in the process", NYT

    From The New York Times: “Have Dark Forces Been Messing With the Cosmos?” 

    New York Times

    From The New York Times

    Feb. 25, 2019
    Dennis Overbye

    Brian Stauffer

    Axions? Phantom energy? Astrophysicists scramble to patch a hole in the universe, rewriting cosmic history in the process.

    There was, you might say, a disturbance in the Force.

    Long, long ago, when the universe was only about 100,000 years old — a buzzing, expanding mass of particles and radiation — a strange new energy field switched on. That energy suffused space with a kind of cosmic antigravity, delivering a not-so-gentle boost to the expansion of the universe.

    Then, after another 100,000 years or so, the new field simply winked off, leaving no trace other than a speeded-up universe.

    So goes the strange-sounding story being promulgated by a handful of astronomers from Johns Hopkins University. In a bold and speculative leap into the past, the team has posited the existence of this field to explain an astronomical puzzle: the universe seems to be expanding faster than it should be.

    The cosmos is expanding only about 9 percent more quickly than theory prescribes. But this slight-sounding discrepancy has intrigued astronomers, who think it might be revealing something new about the universe.

    And so, for the last couple of years, they have been gathering in workshops and conferences to search for a mistake or loophole in their previous measurements and calculations, so far to no avail.

    “If we’re going to be serious about cosmology, this is the kind of thing we have to be able to take seriously,” said Lisa Randall, a Harvard theorist who has been pondering the problem.

    At a recent meeting in Chicago, Josh Frieman, a theorist at the Fermi National Accelerator Laboratory in Batavia, Ill., asked: “At what point do we claim the discovery of new physics?”

    Now ideas are popping up. Some researchers say the problem could be solved by inferring the existence of previously unknown subatomic particles. Others, such as the Johns Hopkins group, are invoking new kinds of energy fields.

    Adding to the confusion, there already is a force field — called dark energy — making the universe expand faster. And a new, controversial report suggests that this dark energy might be getting stronger and denser, leading to a future in which atoms are ripped apart and time ends.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope at Cerro Tololo, Chile, which houses the DECam at an altitude of 7,200 feet

    Thus far, there is no evidence for most of these ideas. If any turn out to be right, scientists may have to rewrite the story of the origin, history and, perhaps, fate of the universe.

    Or it could all be a mistake. Astronomers have rigorous methods to estimate the effects of statistical noise and other random errors on their results; not so for the unexamined biases called systematic errors.

    As Wendy L. Freedman, of the University of Chicago, said at the Chicago meeting, “The unknown systematic is what gets you in the end.”

    Edwin Hubble in 1949, two decades after he discovered that the universe is expanding. Credit Boyer/Roger Viollet, via Getty Images

    Hubble trouble

    Generations of great astronomers have come to grief trying to measure the universe. At issue is a number called the Hubble constant, named after Edwin Hubble, the Mount Wilson astronomer who in 1929 discovered that the universe is expanding.

    Edwin Hubble looking through a 100-inch Hooker telescope at Mount Wilson in Southern California

    Mt Wilson 100-inch Hooker Telescope, perched atop the San Gabriel Mountains outside Los Angeles, California; altitude 1,742 m (5,715 ft)

    As space expands, it carries galaxies away from each other like the raisins in a rising cake. The farther apart two galaxies are, the faster they will fly away from each other. The Hubble constant simply says by how much.

    But to calibrate the Hubble constant, astronomers depend on so-called standard candles: objects, such as supernova explosions and certain variable stars, whose distances can be estimated by luminosity or some other feature. This is where the arguing begins.

    Standard candles used to measure the age and distance of the universe. Credit NASA

    Until a few decades ago, astronomers could not agree on the value of the Hubble constant within a factor of two: either 50 or 100 kilometers per second per megaparsec. (A megaparsec is 3.26 million light years.)

    But in 2001, a team using the Hubble Space Telescope, and led by Dr. Freedman, reported a value of 72. For every megaparsec farther away from us that a galaxy is, it is moving 72 kilometers per second faster.

    NASA/ESA Hubble Telescope

    More recent efforts by Adam G. Riess, of Johns Hopkins and the Space Telescope Science Institute, and others have obtained similar numbers, and astronomers now say they have narrowed the uncertainty in the Hubble constant to just 2.4 percent.

    But new precision has brought new trouble. These results are so good that they now disagree with results from the European Planck spacecraft, which predict a Hubble constant of 67.

    ESA/Planck 2009 to 2013

    Workers with the European Planck spacecraft at the European Space Agency spaceport in Kourou, French Guiana, in 2009. Credit ESA – S. Corvaja

    The discrepancy — 9 percent — sounds fatal but may not be, astronomers contend, because Planck and human astronomers do very different kinds of observations.
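
    To make the numbers concrete, here is a minimal sketch (mine, not the article’s) of Hubble’s law, v = H0 × d, using Planck’s 67 and a stand-in local value of 73, consistent with the 9 percent discrepancy described above:

        # Hubble's law: recession velocity (km/s) = H0 * distance (Mpc).
        H0_LOCAL = 73.0   # km/s/Mpc; stand-in for the recent local measurements
        H0_PLANCK = 67.0  # km/s/Mpc; the value derived from Planck's CMB fit

        def recession_velocity(distance_mpc, h0):
            """Velocity, in km/s, of a galaxy at distance_mpc megaparsecs."""
            return h0 * distance_mpc

        print(recession_velocity(100, H0_LOCAL))   # 7300.0 km/s
        print(recession_velocity(100, H0_PLANCK))  # 6700.0 km/s

        # The fractional tension between the two measurements:
        print((H0_LOCAL - H0_PLANCK) / H0_PLANCK)  # ~0.09, i.e. about 9 percent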

    Planck is considered the gold standard of cosmology. It spent four years studying the cosmic bath of microwaves [CMB] left over from the end of the Big Bang, when the universe was just 380,000 years old.

    CMB per ESA/Planck

    But it did not measure the Hubble constant directly. Rather, the Planck group derived the value of the constant, and other cosmic parameters, from a mathematical model largely based on those microwaves.

    In short, Planck’s Hubble constant is based on a cosmic baby picture. In contrast, the classical astronomical value is derived from what cosmologists modestly call “local measurements,” a few billion light-years deep into a middle-aged universe.

    What if that baby picture left out or obscured some important feature of the universe?

    ‘Cosmological Whac-a-Mole’

    And so cosmologists are off to the game that Lloyd Knox, an astrophysicist from the University of California, Davis, called “cosmological Whac-a-Mole” at the recent Chicago meeting: attempting to fix the model of the early universe, to make it expand a little faster without breaking what the model already does well.

    One approach, some astrophysicists suggest, is to add more species of lightweight subatomic particles, such as the ghostlike neutrinos, to the early universe. (Physicists already recognize three kinds of neutrinos, and argue whether there is evidence for a fourth variety.) These would give the universe more room to stash energy, in the same way that more drawers in your dresser allow you to own more pairs of socks. Thus invigorated, the universe would expand faster, according to the Big Bang math, and hopefully not mess up the microwave baby picture.

    A more drastic approach, from the Johns Hopkins group, invokes fields of exotic anti-gravitational energy. The idea exploits an aspect of string theory, the putative but unproven “theory of everything” that posits that the elementary constituents of reality are very tiny, wriggling strings.

    String theory suggests that space could be laced with exotic energy fields associated with lightweight particles or forces yet undiscovered. Those fields, collectively called quintessence, could act in opposition to gravity, and could change over time — popping up, decaying or altering their effect, switching from repulsive to attractive.

    The team focused in particular on the effects of fields associated with hypothetical particles called axions. Had one such field arisen when the universe was about 100,000 years old, it could have produced just the right amount of energy to fix the Hubble discrepancy, the team reported in a paper late last year. They refer to this theoretical force as “early dark energy.”

    “I was surprised how it came out,” said Marc Kamionkowski, a Johns Hopkins cosmologist who was part of the study. “This works.”

    The jury is still out. Dr. Riess said that the idea seems to work, which is not to say that he agrees with it, or that it is right. Nature, manifest in future observations, will have the final say.

    Dr. Knox called the Johns Hopkins paper “an existence proof” that the Hubble problem could be solved. “I think that’s new,” he said.

    Dr. Randall, however, has taken issue with aspects of the Johns Hopkins calculations. She and a trio of Harvard postdocs are working on a similar idea that she says works as well and is mathematically consistent. “It’s novel and very cool,” Dr. Randall said.

    So far, the smart money is still on cosmic confusion. Michael Turner, a veteran cosmologist at the University of Chicago and the organizer of a recent airing of the Hubble tensions, said, “Indeed, all of this is going over all of our heads. We are confused and hoping that the confusion will lead to something good!”

    Doomsday? Nah, never mind

    Early dark energy appeals to some cosmologists because it hints at a link to, or between, two mysterious episodes in the history of the universe. As Dr. Riess said, “This is not the first time the universe has been expanding too fast.”

    The first episode occurred when the universe was less than a trillionth of a trillionth of a second old. At that moment, cosmologists surmise, a violent ballooning propelled the Big Bang; in a fraction of a trillionth of a second, this event — named “inflation” by the cosmologist Alan Guth, of M.I.T. — smoothed and flattened the initial chaos into the more orderly universe observed today. Nobody knows what drove inflation.

    The second episode is unfolding today: cosmic expansion is speeding up. But why? The issue came to light in 1998, when two competing teams of astronomers asked whether the collective gravity of the galaxies might be slowing the expansion enough to one day drag everything together into a Big Crunch.

    To great surprise, they discovered the opposite: the expansion was accelerating under the influence of an anti-gravitational force later called dark energy. The two teams won a Nobel Prize.

    Studies of Universe’s Expansion Win Physics Nobel

    By DENNIS OVERBYE OCT. 4, 2011

    From left, Adam Riess [High-Z Supernova Search Team], Saul Perlmutter [Supernova Cosmology Project] and Brian Schmidt [High-Z Supernova Search Team] shared the Nobel Prize in Physics awarded Tuesday. Credit Johns Hopkins University; University of California at Berkeley; Australian National University

    Dark energy comprises 70 percent of the mass-energy of the universe. And, spookily, it behaves very much like a fudge factor known as the cosmological constant, a cosmic repulsive force that Einstein inserted in his equations a century ago thinking it would keep the universe from collapsing under its own weight. He later abandoned the idea, perhaps too soon.

    Under the influence of dark energy, the cosmos is now doubling in size every 10 billion years — to what end, nobody knows.
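
    As a rough consistency check (my arithmetic, not the article’s): if today’s expansion were purely exponential, the doubling time would be T = ln 2 / H0. Taking H0 ≈ 70 kilometers per second per megaparsec, which works out to about 1/(14 billion years), gives T ≈ 0.693 × 14 ≈ 10 billion years, the figure quoted above.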

    Early dark energy, the force invoked by the Johns Hopkins group, might represent a third episode of antigravity taking over the universe and speeding it up. Perhaps all three episodes are different manifestations of the same underlying tendency of the universe to go rogue and speed up occasionally. In an email, Dr. Riess said, “Maybe the universe does this from time-to-time?”

    If so, it would mean that the current manifestation of dark energy is not Einstein’s constant after all. It might wink off one day. That would relieve astronomers, and everybody else, of an existential nightmare regarding the future of the universe. If dark energy remains constant, everything outside our galaxy eventually will be moving away from us faster than the speed of light, and will no longer be visible. The universe will become lifeless and utterly dark.

    But if dark energy is temporary — if one day it switches off — cosmologists and metaphysicians can all go back to contemplating a sensible tomorrow.

    “An appealing feature of this is that there might be a future for humanity,” said Scott Dodelson, a theorist at Carnegie Mellon who has explored similar scenarios [Physical Review D].

    The phantom cosmos

    But the future is still up for grabs.

    Far from switching off, the dark energy currently in the universe actually has increased over cosmic time, according to a recent report in Nature Astronomy. If this keeps up, the universe could end one day in what astronomers call the Big Rip, with atoms and elementary particles torn asunder — perhaps the ultimate cosmic catastrophe.

    This dire scenario emerges from the work of Guido Risaliti, of the University of Florence in Italy, and Elisabeta Lusso, of Durham University in England. For the last four years, they have plumbed the deep history of the universe, using violent, faraway cataclysms called quasars as distance markers.

    Quasars arise from supermassive black holes at the centers of galaxies; they are the brightest objects in nature, and can be seen clear across the universe. As standard candles, quasars aren’t ideal because their masses vary widely. Nevertheless, the researchers identified some regularities in the emissions from quasars, allowing the history of the cosmos to be traced back nearly 12 billion years. The team found that the rate of cosmic expansion deviated from expectations over that time span.

    One interpretation of the results is that dark energy is not constant after all, but is changing, growing denser and thus stronger over cosmic time. It so happens that this increase in dark energy also would be just enough to resolve the discrepancy in measurements of the Hubble constant.

    The bad news is that, if this model is right, dark energy may be in a particularly virulent and — most physicists say — implausible form called phantom energy. Its existence would imply that things can lose energy by speeding up, for instance. Robert Caldwell, a Dartmouth physicist, has referred to it as “bad news stuff.”

    As the universe expands, the push from phantom energy would grow without bounds, eventually overcoming gravity and tearing apart first Earth, then atoms.

    The Hubble-constant community responded to the new report with caution. “If it holds up, this is a very interesting result,” said Dr. Freedman.

    Astronomers have been trying to take the measure of this dark energy for two decades. Two space missions — the European Space Agency’s Euclid and NASA’s WFIRST — have been designed to study dark energy and hopefully deliver definitive answers in the coming decade. The fate of the universe is at stake.

    ESA/Euclid spacecraft

    NASA/WFIRST

    In the meantime, everything, including phantom energy, is up for consideration, according to Dr. Riess.

    “In a list of possible solutions to the tension via new physics, mentioning weird dark energy like this would seem appropriate,” he wrote in an email. “Heck, at least their dark energy goes in the right direction to solve the tension. It could have gone the other way and made it worse!”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 2:16 pm on February 22, 2019 Permalink | Reply
    Tags: "DNA Gets a New — and Bigger — Genetic Alphabet", DNA is spelled out with four letters or bases. Researchers have now built a system with eight. It may hold clues to the potential for life elsewhere in the universe and could also expand our capacity , Natural DNA is spelled out with four different letters known as bases — A C G and T. Dr. Benner and his colleagues have built DNA with eight bases — four natural and four unnatural. They named the, NYT   

    From The New York Times: “DNA Gets a New — and Bigger — Genetic Alphabet” 

    New York Times

    From The New York Times

    Feb. 21, 2019
    Carl Zimmer

    DNA is spelled out with four letters, or bases. Researchers have now built a system with eight. It may hold clues to the potential for life elsewhere in the universe and could also expand our capacity to store digital data on Earth.

    Animation by Millie Georgiadis/Indiana University School of Medicine

    In 1985, the chemist Steven A. Benner sat down with some colleagues and a notebook and sketched out a way to expand the alphabet of DNA. He has been trying to make those sketches real ever since.

    On Thursday, Dr. Benner and a team of scientists reported success: in a paper published in Science, they said they had in effect doubled the genetic alphabet.

    Natural DNA is spelled out with four different letters known as bases — A, C, G and T. Dr. Benner and his colleagues have built DNA with eight bases — four natural, and four unnatural. They named their new system Hachimoji DNA (hachi is Japanese for eight, moji for letter).

    Crafting the four new bases that don’t exist in nature was a chemical tour-de-force. They fit neatly into DNA’s double helix, and enzymes can read them as easily as natural bases, in order to make molecules.

    “We can do everything here that is necessary for life,” said Dr. Benner, now a distinguished fellow at the Foundation for Applied Molecular Evolution in Florida.

    Hachimoji DNA could have many applications, including a far more durable way to store digital data that could last for centuries. “This could be huge that way,” said Dr. Nicholas V. Hud, a biochemist at the Georgia Institute of Technology who was not involved in the research.

    It also raises a profound question about the nature of life elsewhere in the universe, offering the possibility that the four-base DNA we are familiar with may not be the only chemistry that could support life.

    The four natural bases of DNA are all anchored to molecular backbones. A pair of backbones can join into a double helix because their bases are attracted to each other, linking up through hydrogen bonds.

    But bases don’t stick together at random. C can only bond to G, and A can only bond to T. These strict rules help ensure that DNA strands don’t clump together into a jumble. No matter what sequence of bases is contained in natural DNA, it still keeps its shape.
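
    Those pairing rules translate directly into code. Below is a minimal sketch (my illustration; the four unnatural base names P, Z, B and S, pairing P with Z and B with S, come from the Science paper rather than this article) of a complement table covering all eight Hachimoji letters:

        # Pairing rules: the two natural pairs plus the two unnatural pairs
        # reported for Hachimoji DNA (P-Z and B-S; names from the Science paper).
        COMPLEMENT = {
            "A": "T", "T": "A",  # natural pair
            "C": "G", "G": "C",  # natural pair
            "P": "Z", "Z": "P",  # unnatural pair
            "B": "S", "S": "B",  # unnatural pair
        }

        def complement_strand(strand):
            """Return the complementary strand, read in the opposite direction."""
            return "".join(COMPLEMENT[base] for base in reversed(strand))

        print(complement_strand("ACGTPZBS"))  # BSPZACGT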

    But those four bases are not the only compounds that can attach to DNA’s backbone and link to another base — at least on paper. Dr. Benner and his colleagues thought up a dozen alternatives.

    Working at the Swiss university ETH Zurich at the time, Dr. Benner tried to make some of those imaginary bases real.

    “Of course, the first thing you discover is your design theory is not terribly good,” said Dr. Benner.

    Once Dr. Benner and his colleagues combined real atoms, according to his designs, the artificial bases didn’t work as he had hoped.

    Nevertheless, Dr. Benner’s initial forays impressed other chemists. “His work was a real inspiration for me,” said Floyd E. Romesberg, now of the Scripps Research Institute in San Diego. Reading about Dr. Benner’s early experiments, Dr. Romesberg decided to try to create his own bases.

    Dr. Romesberg chose not to make bases that linked together with hydrogen bonds; instead, he fashioned a pair of oily compounds that repelled water. That chemistry brought his unnatural pair of bases together. “Oil doesn’t like to mix with water, but it does like to mix with oil,” said Dr. Romesberg.

    In the years that followed, Dr. Romesberg and his colleagues fashioned enzymes that could copy DNA made from both natural bases and unnatural, oily ones. In 2014, the scientists engineered bacteria that could make new copies of these hybrid genes.

    In recent years, Dr. Romesberg’s team has begun making unnatural proteins from these unnatural genes. He founded a company, Synthorx, to develop some of these proteins as cancer drugs.

    At the same time, Dr. Benner continued with his own experiments. He and his colleagues succeeded in creating one pair of new bases.

    Like Dr. Romesberg, they found an application for their unnatural DNA. Their six-base DNA became the basis of a new, sensitive test for viruses in blood samples.

    They then went on to create a second pair of new bases. Now with eight bases to play with, the researchers started building DNA molecules with a variety of different sequences. The researchers found that no matter which sequence they created, the molecules still formed the standard double helix.

    Because Hachimoji DNA held onto this shape, it could act like regular DNA: it could store information, and that information could be read to make a molecule.

    For a cell, the first step in making a molecule is to read a gene using special enzymes. They make a copy of the gene in a single-stranded version of DNA, called RNA.

    Depending on the gene, the cell will then do one of two things with that RNA. In some cases, it will use the RNA as a guide to build a protein. But in other cases, the RNA molecule floats off to do a job of its own.

    Dr. Benner and his colleagues created a Hachimoji gene for an RNA molecule. They predicted that the RNA molecule would be able to grab a molecule called a fluorophore. Cradled by the RNA molecule, the fluorophore would absorb light and release it as a green flash.

    Andrew Ellington, an evolutionary engineer at the University of Texas, led the effort to find an enzyme that could read Hachimoji DNA. He and his colleagues found a promising one made by a virus, and they tinkered with it until the enzyme could easily read all eight bases.

    They mixed the enzyme in test tubes with the Hachimoji gene. As they had hoped, their test tubes began glowing green.

    “Here you have it from start to finish,” said Dr. Benner. “We can store information, we can transfer it to another molecule and that other molecule has a function — and here it is, glowing.”

    In the future, Hachimoji DNA may store information of a radically different sort. It might someday encode a movie or a spreadsheet.

    Today, movies, spreadsheets and other digital files are typically stored on silicon chips or magnetic tapes. But those kinds of storage have serious shortcomings. For one thing, they can deteriorate in just years.

    DNA, by contrast, can remain intact for centuries. Last year, researchers at Microsoft and the University of Washington managed to encode 35 songs, videos, documents, and other files, totaling 200 megabytes, in a batch of DNA molecules.

    With eight bases instead of four, Hachimoji DNA could potentially encode far more information. “DNA capable of twice as much storage? That’s pretty amazing in my view,” said Dr. Ellington.
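
    The gain is easy to quantify with a little information theory (a back-of-envelope sketch, assuming any base can occupy any position): each position in a strand carries log2(alphabet size) bits, so doubling the alphabet from four letters to eight adds one bit per base.

        import math

        # Information capacity per base position, in bits.
        natural = math.log2(4)    # 2.0 bits per base (A, C, G, T)
        hachimoji = math.log2(8)  # 3.0 bits per base (eight letters)

        print(natural, hachimoji, hachimoji / natural)  # 2.0 3.0 1.5

        # Equivalently, a strand of length n can spell 8**n distinct sequences
        # instead of 4**n, a factor of 2**n more for the same physical length.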

    Beyond our current need for storage, Hachimoji DNA also offers some clues about life itself. Scientists have long wondered if our DNA evolved only four bases because they’re the only ones that can work in genes. Could life have taken a different path?

    “Steve’s work goes a long way to say that it could have — it just didn’t,” said Dr. Romesberg.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 9:05 pm on February 17, 2019 Permalink | Reply
    Tags: "The Secret History of Women in Coding", , , NYT   

    From The New York Times: Women In STEM-“The Secret History of Women in Coding” 

    New York Times

    From The New York Times

    Feb. 13, 2019
    Clive Thompson

    Computer programming once had much better gender balance than it does today. What went wrong?

    Mary Allen Wilkes with a LINC at M.I.T., where she was a programmer. Credit Joseph C. Towler, Jr.

    As a teenager in Maryland in the 1950s, Mary Allen Wilkes had no plans to become a software pioneer — she dreamed of being a litigator. One day in junior high in 1950, though, her geography teacher surprised her with a comment: “Mary Allen, when you grow up, you should be a computer programmer!” Wilkes had no idea what a programmer was; she wasn’t even sure what a computer was. Relatively few Americans were. The first digital computers had been built barely a decade earlier at universities and in government labs.

    By the time she was graduating from Wellesley College in 1959, she knew her legal ambitions were out of reach. Her mentors all told her the same thing: Don’t even bother applying to law school. “They said: ‘Don’t do it. You may not get in. Or if you get in, you may not get out. And if you get out, you won’t get a job,’ ” she recalls. If she lucked out and got hired, it wouldn’t be to argue cases in front of a judge. More likely, she would be a law librarian, a legal secretary, someone processing trusts and estates.

    But Wilkes remembered her junior high school teacher’s suggestion. In college, she heard that computers were supposed to be the key to the future. She knew that the Massachusetts Institute of Technology had a few of them.


    So on the day of her graduation, she had her parents drive her over to M.I.T. and marched into the school’s employment office. “Do you have any jobs for computer programmers?” she asked. They did, and they hired her.

    It might seem strange now that they were happy to take on a random applicant with absolutely no experience in computer programming. But in those days, almost nobody had any experience writing code. The discipline did not yet really exist; there were vanishingly few college courses in it, and no majors. (Stanford, for example, didn’t create a computer-science department until 1965.) So instead, institutions that needed programmers just used aptitude tests to evaluate applicants’ ability to think logically. Wilkes happened to have some intellectual preparation: As a philosophy major, she had studied symbolic logic, which can involve creating arguments and inferences by stringing together and/or statements in a way that resembles coding.
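
    The overlap Wilkes benefited from is easy to see. A tiny illustration (my example, not her coursework) of an inference built by stringing together and/or statements, written the way a programmer would:

        # A symbolic-logic style inference expressed as boolean code.
        passed_aptitude_test = True
        studied_symbolic_logic = True
        has_programming_experience = False

        # "Hire anyone who passed the test and has either logical training
        # or prior experience."
        hire = passed_aptitude_test and (studied_symbolic_logic or has_programming_experience)
        print(hire)  # True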

    Wilkes quickly became a programming whiz. She first worked on the IBM 704, which required her to write in an abstruse “assembly language.”

    An IBM 704 computer, with IBM 727 tape drives and IBM 780 CRT display. (Image courtesy of LLNL.)

    (A typical command might be something like “LXA A, K,” telling the computer to take the number in Location A of its memory and load it into the “Index Register” K.) Even getting the program into the IBM 704 was a laborious affair. There were no keyboards or screens; Wilkes had to write a program on paper and give it to a typist, who translated each command into holes on a punch card. She would carry boxes of commands to an “operator,” who then fed a stack of such cards into a reader. The computer executed the program and produced results, typed out on a printer.

    Often enough, Wilkes’s code didn’t produce the result she wanted. So she had to pore over her lines of code, trying to deduce her mistake, stepping through each line in her head and envisioning how the machine would execute it — turning her mind, as it were, into the computer. Then she would rewrite the program. The capacity of most computers at the time was quite limited; the IBM 704 could handle only about 4,000 “words” of code in its memory. A good programmer was concise and elegant and never wasted a word. They were poets of bits. “It was like working logic puzzles — big, complicated logic puzzles,” Wilkes says. “I still have a very picky, precise mind, to a fault. I notice pictures that are crooked on the wall.”

    What sort of person possesses that kind of mentality? Back then, it was assumed to be women. They had already played a foundational role in the prehistory of computing: During World War II, women operated some of the first computational machines used for code-breaking at Bletchley Park in Britain.

    A Colossus Mark 2 computer being operated by Wrens. The slanted control panel on the left was used to set the “pin” (or “cam”) patterns of the Lorenz. The “bedstead” paper tape transport is on the right.

    Developer: Tommy Flowers, assisted by Sidney Broadhurst, William Chandler and, for the Mark 2 machines, Allen Coombs
    Manufacturer: Post Office Research Station
    Type: Special-purpose electronic digital programmable computer
    Generation: First-generation computer
    Release date: Mk 1: December 1943; Mk 2: 1 June 1944
    Discontinued: 1960

    The Lorenz SZ machines had 12 wheels, each with a different number of cams (or “pins”):

    Wheel number:    1   2   3   4   5   6   7   8   9   10  11  12
    BP wheel name:   ψ1  ψ2  ψ3  ψ4  ψ5  μ37 μ61 χ1  χ2  χ3  χ4  χ5
    Number of cams:  43  47  51  53  59  37  61  41  31  29  26  23

    Colossus was a set of computers developed by British codebreakers in the years 1943–1945 to help in the cryptanalysis of the Lorenz cipher. Colossus used thermionic valves (vacuum tubes) to perform Boolean and counting operations. Colossus is thus regarded as the world’s first programmable, electronic, digital computer, although it was programmed by switches and plugs and not by a stored program.

    Colossus was designed by research telephone engineer Tommy Flowers to solve a problem posed by mathematician Max Newman at the Government Code and Cypher School (GC&CS) at Bletchley Park. Alan Turing’s use of probability in cryptanalysis (see Banburismus) contributed to its design. It has sometimes been erroneously stated that Turing designed Colossus to aid the cryptanalysis of the Enigma. Turing’s machine that helped decode Enigma was the electromechanical Bombe, not Colossus.

    In the United States, by 1960, according to government statistics, more than one in four programmers were women. At M.I.T.’s Lincoln Labs in the 1960s, where Wilkes worked, she recalls that most of those the government categorized as “career programmers” were female. It wasn’t high-status work — yet.

    In 1961, Wilkes was assigned to a prominent new project, the creation of the LINC.

    LINC from MIT Lincoln Lab


    Wesley Clark in 1962 at a demonstration of the first Laboratory Instrument Computer, or LINC. Credit MIT Lincoln Laboratory

    As one of the world’s first interactive personal computers, it would be a breakthrough device that could fit in a single office or lab. It would even have its own keyboard and screen, so it could be programmed more quickly, without awkward punch cards or printouts. The designers, who knew they could make the hardware, needed Wilkes to help write the software that would let a user control the computer in real time.

    For two and a half years, she and a team toiled away at flow charts, pondering how the circuitry functioned, how to let people communicate with it. “We worked all these crazy hours; we ate all kinds of terrible food,” she says. There was sexism, yes, especially in the disparity between how men and women were paid and promoted, but Wilkes enjoyed the relative comity that existed among the men and women at Lincoln Labs, the sense of being among intellectual peers. “We were a bunch of nerds,” Wilkes says dryly. “We were a bunch of geeks. We dressed like geeks. I was completely accepted by the men in my group.” When they got an early prototype of the LINC working, it solved a fiendish data-processing problem for a biologist, who was so excited that he danced a happy jig around the machine.

    In late 1964, after Wilkes returned from traveling around the world for a year, she was asked to finish writing the LINC’s operating system. But the lab had been relocated to St. Louis, and she had no desire to move there. Instead, a LINC was shipped to her parents’ house in Baltimore. Looming in the front hall near the foot of the stairs, a tall cabinet of whirring magnetic tapes across from a refrigerator-size box full of circuitry, it was an early glimpse of a sci-fi future: Wilkes was one of the first people on the planet to have a personal computer in her home. (Her father, an Episcopal clergyman, was thrilled. “He bragged about it,” she says. “He would tell anybody who would listen, ‘I bet you don’t have a computer in your living room.’ ”) Before long, LINC users around the world were using her code to program medical analyses and even create a chatbot that interviewed patients about their symptoms.

    But even as Wilkes established herself as a programmer, she still craved a life as a lawyer. “I also really finally got to the point where I said, ‘I don’t think I want to do this for the rest of my life,’ ” she says. Computers were intellectually stimulating but socially isolating. In 1972, she applied and got into Harvard Law School, and after graduating, she spent the next four decades as a lawyer. “I absolutely loved it,” she says.

    Today Wilkes is retired and lives in Cambridge, Mass. White-haired at 81, she still has the precise mannerisms and the ready, beaming smile that can be seen in photos from the ’60s, when she posed, grinning, beside the LINC. She told me that she occasionally gives talks to young students studying computer science. But the industry they’re heading into is, astonishingly, less populated with women — and by many accounts less welcoming to them — than it was in Wilkes’s day. In 1960, when she started working at M.I.T., the proportion of women in computing and mathematical professions (which are grouped together in federal government data) was 27 percent. It reached 35 percent in 1990. But, in the government’s published figures, that was the peak. The numbers fell after that, and by 2013, women were down to 26 percent — below their share in 1960.

    When Wilkes talks to today’s young coders, they are often shocked to learn that women were among the field’s earliest, towering innovators and once a common sight in corporate America. “Their mouths are agape,” Wilkes says. “They have absolutely no idea.”

    Almost 200 years ago, the first person to be what we would now call a coder was, in fact, a woman: Lady Ada Lovelace.

    Ada Lovelace (Augusta Ada Byron) in a rare daguerreotype by Antoine Claudet, 1843 or 1850, probably taken in his studio near Regents Park in London. Reproduction courtesy of Geoffrey Bond. Source: https://blogs.bodleian.ox.ac.uk/adalovelace/2015/10/14/only-known-photographs-of-ada-lovelace-in-bodleian-display/
    Augusta Ada King, Countess of Lovelace (née Byron; 10 December 1815 – 27 November 1852) was an English mathematician and writer, chiefly known for her work on Charles Babbage’s proposed mechanical general-purpose computer, the Analytical Engine [below]. She was the first to recognise that the machine had applications beyond pure calculation, and published the first algorithm intended to be carried out by such a machine. As a result, she is sometimes regarded as the first to recognise the full potential of a “computing machine” and the first computer programmer.

    The Analytical Engine was a proposed mechanical general-purpose computer designed by the English mathematician and computer pioneer Charles Babbage. It was first described in 1837 as the successor to Babbage’s difference engine.

    As a young mathematician in England in 1833, she met Charles Babbage, an inventor who was struggling to design what he called the Analytical Engine, which would be made of metal gears and able to execute if/then commands and store information in memory. Enthralled, Lovelace grasped the enormous potential of a device like this. A computer that could modify its own instructions and memory could be far more than a rote calculator, she realized. To prove it, Lovelace wrote what is often regarded as the first computer program in history, an algorithm with which the Analytical Engine would calculate the Bernoulli sequence of numbers. (She wasn’t shy about her accomplishments: “That brain of mine is something more than merely mortal; as time will show,” she once wrote.) But Babbage never managed to build his computer, and Lovelace, who died of cancer at 36, never saw her code executed.


    When digital computers finally became a practical reality in the 1940s, women were again pioneers in writing software for the machines. At the time, men in the computing industry regarded writing code as a secondary, less interesting task. The real glory lay in making the hardware. Software? “That term hadn’t yet been invented,” says Jennifer S. Light, a professor at M.I.T. who studies the history of science and technology.

    This dynamic was at work in the development of the first programmable digital computer in the United States, the Electronic Numerical Integrator and Computer, or Eniac, during the 1940s.

    Computer operators with an Eniac — the world’s first programmable general-purpose computer. Credit Corbis/Getty Images

    ENIAC programming. Columbia University

    Funded by the military, the thing was a behemoth, weighing more than 30 tons and including 17,468 vacuum tubes. Merely getting it to work was seen as the heroic, manly engineering feat. In contrast, programming it seemed menial, even secretarial. Women had long been employed in the scut work of doing calculations. In the years leading up to the Eniac, many companies bought huge electronic tabulating machines — quite useful for tallying up payroll, say — from companies like IBM; women frequently worked as the punch-card operators for these overgrown calculators. When the time came to hire technicians to write instructions for the Eniac, it made sense, to the men in charge, to pick an all-female team: Kathleen McNulty, Jean Jennings, Betty Snyder, Marlyn Wescoff, Frances Bilas and Ruth Lichterman. The men would figure out what they wanted Eniac to do; the women “programmed” it to execute the instructions.

    “We could diagnose troubles almost down to the individual vacuum tube,” Jennings later told an interviewer for the IEEE Annals of the History of Computing. Jennings, who grew up as the tomboy daughter of low-income parents near a Missouri community of 104 people, studied math at college. “Since we knew both the application and the machine, we learned to diagnose troubles as well as, if not better than, the engineer.”

    The Eniac women were among the first coders to discover that software never works right the first time — and that a programmer’s main work, really, is to find and fix the bugs. Their innovations included some of software’s core concepts. Betty Snyder realized that if you wanted to debug a program that wasn’t running correctly, it would help to have a “break point,” a moment when you could stop a program midway through its run. To this day, break points are a key part of the debugging process.
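
    Snyder’s idea survives essentially unchanged in every modern debugger. A minimal sketch (using Python’s built-in debugger as a present-day stand-in; the Eniac had nothing of the sort) of stopping a program midway through its run:

        # Pause a running program at a chosen point to inspect its state.
        def count_down(start):
            value = start
            while value > 0:
                if value == 3:
                    breakpoint()  # drops into the pdb debugger right here
                value -= 1
            return value

        count_down(5)
        # At the (Pdb) prompt you can print `value`, step line by line, or
        # continue: the program is stopped midway through its run, which is
        # exactly the break-point concept described above.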

    In 1946, Eniac’s creators wanted to show off the computer to a group of leaders in science, technology and the military. They asked Jennings and Snyder to write a program that calculated missile trajectories. After weeks of intense effort, they and their team had a working program, except for one glitch: It was supposed to stop when the missile landed, but for some reason it kept running. The night before the demo, Snyder suddenly intuited the problem. She went to work early the next day, flipped a single switch inside the Eniac and eliminated the bug. “Betty could do more logical reasoning while she was asleep than most people can do awake,” Jennings later said. Nonetheless, the women got little credit for their work. At that first official demonstration to show off Eniac, the male project managers didn’t mention, much less introduce, the women.

    After the war, as coding jobs spread from the military into the private sector, women remained in the coding vanguard, doing some of the highest-profile work.

    Rear Admiral Grace M. Hopper, 1984

    Grace Brewster Murray Hopper (née Murray; December 9, 1906 – January 1, 1992) was an American computer scientist and United States Navy rear admiral. One of the first programmers of the Harvard Mark I computer, she was a pioneer of computer programming who invented one of the first compiler related tools. She popularized the idea of machine-independent programming languages, which led to the development of COBOL, an early high-level programming language still in use today.

    The pioneering programmer Grace Hopper is frequently credited with creating the first “compiler,” a program that lets users create programming languages that more closely resemble regular written words: A coder could thus write the English-like code, and the compiler would do the hard work of turning it into ones and zeros for the computer. Hopper also developed the “Flowmatic” language for nontechnical businesspeople. Later, she advised the team that created the Cobol language, which became widely used by corporations. Another programmer from the team, Jean E. Sammet, continued to be influential in the language’s development for decades. Fran Allen was so expert in optimizing Fortran, a popular language for performing scientific calculations, that she became the first female IBM fellow.
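
    A toy sketch of the compiler idea (illustrative only; it bears no relation to the internals of Hopper’s actual compilers or of Flowmatic): mechanically translate an English-like statement into low-level instructions.

        # Translate one English-like statement into pseudo-machine instructions.
        def compile_statement(line):
            words = line.split()
            if len(words) == 4 and words[0] == "ADD" and words[2] == "TO":
                src, dst = words[1], words[3]
                return [("LOAD", dst), ("ADD", src), ("STORE", dst)]
            raise SyntaxError("cannot compile: " + line)

        print(compile_statement("ADD PAY TO TOTAL"))
        # [('LOAD', 'TOTAL'), ('ADD', 'PAY'), ('STORE', 'TOTAL')]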

    NERSC Hopper Cray XE6 supercomputer

    When the number of coding jobs exploded in the ’50s and ’60s as companies began relying on software to process payrolls and crunch data, men had no special advantage in being hired. As Wilkes had discovered, employers simply looked for candidates who were logical, good at math and meticulous. And in this respect, gender stereotypes worked in women’s favor: Some executives argued that women’s traditional expertise at painstaking activities like knitting and weaving manifested precisely this mind-set. (The 1968 book Your Career in Computers stated that people who like “cooking from a cookbook” make good programmers.)

    The field rewarded aptitude: Applicants were often given a test (typically one involving pattern recognition), hired if they passed it and trained on the job, a process that made the field especially receptive to neophytes. “Know Nothing About Computers? Then We’ll Teach You (and Pay You While Doing So),” one British ad promised in 1965. In a 1957 recruiting pitch in the United States, IBM’s brochure titled My Fair Ladies specifically encouraged women to apply for coding jobs.

    Such was the hunger for programming talent that a young black woman named Arlene Gwendolyn Lee [no photo available] could become one of the early female programmers in Canada, despite the open discrimination of the time. Lee was half of a biracial couple to whom no one would rent, so she needed money to buy a house. According to her son, who has described his mother’s experience in a blog post, Lee showed up at a firm after seeing its ad for data processing and systems analytics jobs in a Toronto newspaper sometime in the early 1960s. Lee persuaded the employers, who were all white, to let her take the coding aptitude test. When she placed in the 99th percentile, the supervisors grilled her with questions before hiring her. “I had it easy,” she later told her son. “The computer didn’t care that I was a woman or that I was black. Most women had it much harder.”

    Elsie Shutt learned to code during her college summers while working for the military at the Aberdeen Proving Ground, an Army facility in Maryland.

    Elsie Shutt founded one of the first software businesses in the U.S. in 1958

    In 1953, while taking time off from graduate school, she was hired to code for Raytheon, where the programmer work force “was about 50 percent men and 50 percent women,” she told Janet Abbate, a Virginia Tech historian and author of the 2012 book Recoding Gender. “And it really amazed me that these men were programmers, because I thought it was women’s work!”

    When Shutt had a child in 1957, state law required her to leave her job; the ’50s and ’60s may have been welcoming to full-time female coders, but firms were unwilling to offer part-time work, even to superb coders. So Shutt founded Computations Inc., a consultancy that produced code for corporations. She hired stay-at-home mothers as part-time employees; if they didn’t already know how to code, she trained them. They cared for their kids during the day, then coded at night, renting time on local computers. “What it turned into was a feeling of mission,” Shutt told Abbate, “in providing work for women who were talented and did good work and couldn’t get part-time jobs.” Business Week called the Computations work force the “pregnant programmers” in a 1963 article illustrated with a picture of a baby in a bassinet in a home hallway, with the mother in the background, hard at work writing software. (The article’s title: Mixing Math and Motherhood.)

    By 1967, there were so many female programmers that Cosmopolitan magazine published an article about The Computer Girls, accompanied by pictures of beehived women at work on computers that evoked the control deck of the U.S.S. Enterprise. The story noted that women could make $20,000 a year doing this work (or more than $150,000 in today’s money). It was the rare white-collar occupation in which women could thrive. Nearly every other highly trained professional field admitted few women; even women with math degrees had limited options: teaching high school math or doing rote calculations at insurance firms.

    “Women back then would basically go, ‘Well, if I don’t do programming, what else will I do?’ ” Janet Abbate says. “The situation was very grim for women’s opportunities.”

    If we want to pinpoint a moment when women began to be forced out of programming, we can look at one year: 1984. A decade earlier, a study revealed that the numbers of men and women who expressed an interest in coding as a career were equal. Men were more likely to enroll in computer-science programs, but women’s participation rose steadily and rapidly through the late ’70s until, by the 1983-84 academic year, 37.1 percent of all students graduating with degrees in computer and information sciences were women. In only one decade, their participation rate more than doubled.

    But then things went into reverse. From 1984 onward, the percentage dropped; by the time 2010 rolled around, it had been cut in half. Only 17.6 percent of the students graduating from computer-science and information-science programs were women.

    One reason for this vertiginous decline has to do with a change in how and when kids learned to program. The advent of personal computers in the late ’70s and early ’80s remade the pool of students who pursued computer-science degrees. Before then, pretty much every student who showed up at college had never touched a computer or even been in the room with one. Computers were rare and expensive devices, available for the most part only in research labs or corporate settings. Nearly all students were on equal footing, in other words, and new to programming.

    Once the first generation of personal computers, like the Commodore 64 or the TRS-80, found their way into homes, teenagers were able to play around with them, slowly learning the major concepts of programming in their spare time.

    Commodore 64

    Radio Shack Tandy TRS-80

    By the mid-’80s, some college freshmen were showing up for their first class already proficient as programmers. They were remarkably well prepared for and perhaps even a little jaded about what Computer Science 101 might bring. As it turned out, these students were mostly men, as two academics discovered when they looked into the reasons women’s enrollment was so low.

    Keypunch operators at IBM in Stockholm in the 1930s. Credit IBM

    One researcher was Allan Fisher, then the associate dean of the computer-science school at Carnegie Mellon University. The school established an undergraduate program in computer science in 1988, and after a few years of operation, Fisher noticed that the proportion of women in the major was consistently below 10 percent. In 1994, he hired Jane Margolis, a social scientist who is now a senior researcher in the U.C.L.A. School of Education and Information Studies, to figure out why. Over four years, from 1995 to 1999, she and her colleagues interviewed and tracked roughly 100 undergraduates, male and female, in Carnegie Mellon’s computer-science department; she and Fisher later published the findings in their 2002 book “Unlocking the Clubhouse: Women in Computing.”

    What Margolis discovered was that the first-year students arriving at Carnegie Mellon with substantial experience were almost all male. They had received much more exposure to computers than girls had; for example, boys were more than twice as likely to have been given one as a gift by their parents. And if parents bought a computer for the family, they most often put it in a son’s room, not a daughter’s. Sons also tended to have what amounted to an “internship” relationship with fathers, working through Basic-language manuals with them, receiving encouragement from them; the same wasn’t true for daughters. “That was a very important part of our findings,” Margolis says. Nearly every female student in computer science at Carnegie Mellon told Margolis that her father had worked with her brother — “and they had to fight their way through to get some attention.”

    Their mothers were typically less engaged with computers in the home, they told her. Girls, even the nerdy ones, picked up these cues and seemed to dial back their enthusiasm accordingly. These were pretty familiar roles for boys and girls, historically: Boys were cheered on for playing with construction sets and electronics kits, while girls were steered toward dolls and toy kitchens. It wasn’t terribly surprising to Margolis that a new technology would follow the same pattern as it became widely accepted.

    At school, girls got much the same message: Computers were for boys. Geeky boys who formed computer clubs, at least in part to escape the torments of jock culture, often wound up, whether intentionally or not, reproducing the same exclusionary behavior. (These groups snubbed not only girls but also black and Latino boys.) Such male cliques created “a kind of peer support network,” in Fisher’s words.

    This helped explain why Carnegie Mellon’s first-year classes were starkly divided between the sizable number of men who were already confident in basic programming concepts and the women who were frequently complete neophytes. A cultural schism had emerged. The women started doubting their ability. How would they ever catch up?

    What Margolis heard from students — and from faculty members, too — was that there was a sense in the classroom that if you hadn’t already been coding obsessively for years, you didn’t belong. The “real programmer” was the one who “had a computer-screen tan from being in front of the monitor all the time,” as Margolis puts it. “The idea was, you just have to love being with a computer all the time, and if you don’t do it 24/7, you’re not a ‘real’ programmer.” The truth is, many of the men themselves didn’t fit this monomaniacal stereotype. But there was a double standard: While it was O.K. for the men to want to engage in various other pursuits, women who expressed the same wish felt judged for not being “hard core” enough. By the second year, many of these women, besieged by doubts, began dropping out of the program. (The same was true for the few black and Latino students who also arrived on campus without teenage programming experience.)

    A similar pattern took hold at many other campuses. Patricia Ordóñez, a first-year student at Johns Hopkins University in 1985, enrolled in an Introduction to Minicomputers course. She had been a math whiz in high school but had little experience in coding; when she raised her hand in class to ask a question, the professor — and the many students who had spent their teenage years programming — made her feel singled out. “I remember one day he looked at me and said, ‘You should already know this by now,’ ” she told me. “I thought, I’m never going to succeed.” She switched majors as a result.

    Yet a student’s decision to stick with or quit the subject did not seem to be correlated with coding talent. Many of the women who dropped out were getting perfectly good grades, Margolis learned. Indeed, some who left had been top students. And the women who did persist and made it to the third year of their program had by then generally caught up to the teenage obsessives. The degree’s coursework was, in other words, a leveling force. Learning Basic as a teenage hobby might lead to lots of fun and useful skills, but the pace of learning at college was so much more intense that by the end of the degree, everyone eventually wound up graduating at roughly the same levels of programming mastery.

    An E.R.A./Univac 1103 computer in the 1950s. Credit Hum Images/Alamy

    “It turned out that having prior experience is not a great predictor, even of academic success,” Fisher says. Ordóñez’s later experience illustrates exactly this: After changing majors at Johns Hopkins, she later took night classes in coding and eventually got a Ph.D. in computer science in her 30s; today, she’s a professor at the University of Puerto Rico Río Piedras, specializing in data science.

    By the ’80s, the early pioneering work done by female programmers had mostly been forgotten. In contrast, Hollywood was putting out precisely the opposite image: Computers were a male domain. In hit movies like “Revenge of the Nerds,” “Weird Science,” “Tron,” “WarGames” and others, the computer nerds were nearly always young white men. Video games, a significant gateway activity that led to an interest in computers, were pitched far more often at boys, as research in 1985 by Sara Kiesler [Psychology of Women Quarterly], a professor at Carnegie Mellon, found. “In the culture, it became something that guys do and are good at,” says Kiesler, who is also a program manager at the National Science Foundation. “There were all kinds of things signaling that if you don’t have the right genes, you’re not welcome.”

    A 1983 study involving M.I.T. students produced equally bleak accounts. Women who raised their hands in class were often ignored by professors and talked over by other students. They would be told they weren’t aggressive enough; if they challenged other students or contradicted them, they heard comments like “You sure are bitchy today — must be your period.” Behavior in some research groups “sometimes approximates that of the locker room,” the report concluded, with men openly rating how “cute” their female students were. (“Gee, I don’t think it’s fair that the only two girls in the group are in the same office,” one said. “We should share.”) Male students mused about women’s mediocrity: “I really don’t think the woman students around here are as good as the men,” one said.

    By then, as programming enjoyed its first burst of cultural attention, so many students were racing to enroll in computer science that universities ran into a supply problem: They didn’t have enough professors to teach everyone. Some added hurdles, courses that students had to pass before they could be accepted into the computer-science major. Punishing workloads and classes that covered the material at a lightning pace weeded out those who didn’t get it immediately. All this fostered an environment in which the students most likely to get through were those who had already been exposed to coding — young men, mostly. “Every time the field has instituted these filters on the front end, that’s had the effect of reducing the participation of women in particular,” says Eric S. Roberts, a longtime professor of computer science, now at Reed College, who first studied this problem and called it the “capacity crisis.”

    When computer-science programs began to expand again in the mid-’90s, coding’s culture was set. Most of the incoming students were men. The interest among women never recovered to the levels reached in the late ’70s and early ’80s. And the women who did show up were often isolated. In a room of 20 students, perhaps five or even fewer might be women.

    In 1991, Ellen Spertus, now a computer scientist at Mills College, published a report on women’s experiences in programming classes. She cataloged a landscape populated by men who snickered about the presumed inferiority of women and by professors who told female students that they were “far too pretty” to be studying electrical engineering; when some men at Carnegie Mellon were asked to stop using pictures of naked women as desktop wallpaper on their computers, they angrily complained that it was censorship of the sort practiced by “the Nazis or the Ayatollah Khomeini.”

    As programming was shutting its doors to women in academia, a similar transformation was taking place in corporate America. The emergence of what would be called “culture fit” was changing the who, and the why, of the hiring process. Managers began picking coders less on the basis of aptitude and more on how well they fit a personality type: the acerbic, aloof male nerd.

    The shift actually began far earlier, back in the late ’60s, when managers recognized that male coders shared a growing tendency to be antisocial isolates, lording their arcane technical expertise over that of their bosses. Programmers were “often egocentric, slightly neurotic,” as Richard Brandon, a well-known computer-industry analyst, put it in an address at a 1968 conference, adding that “the incidence of beards, sandals and other symptoms of rugged individualism or nonconformity are notably greater among this demographic.”

    In addition to testing for logical thinking, as in Mary Allen Wilkes’s day, companies began using personality tests to select specifically for these sorts of caustic loner qualities. “These became very powerful narratives,” says Nathan Ensmenger, a professor of informatics at Indiana University, who has studied [Gender and Computing] this transition. The hunt for that personality type cut women out. Managers might shrug and accept a man who was unkempt, unshaven and surly, but they wouldn’t tolerate a woman who behaved the same way. Coding increasingly required late nights, but managers claimed that it was too unsafe to have women working into the wee hours, so they forbade them to stay late with the men.

    At the same time, the old hierarchy of hardware and software became inverted. Software was becoming a critical, and lucrative, sector of corporate America. Employers increasingly hired programmers whom they could envision one day ascending to key managerial roles in programming. And few companies were willing to put a woman in charge of men. “They wanted people who were more aligned with management,” says Marie Hicks, a historian at the Illinois Institute of Technology. “One of the big takeaways is that technical skill does not equate to success.”

    By the 1990s and 2000s, the pursuit of “culture fit” was in full force, particularly at start-ups, which involve a relatively small number of people typically confined to tight quarters for long hours. Founders looked to hire people who were socially and culturally similar to them.

    “It’s all this loosey-goosey ‘culture’ thing,” says Sue Gardner, former head of the Wikimedia Foundation, the nonprofit that hosts Wikipedia and other sites. After her stint there, Gardner decided to study why so few women were employed as coders. In 2014, she surveyed more than 1,400 women in the field and conducted sit-down interviews with scores more. It became clear to her that the occupation’s takeover by men in the ’90s had turned into a self-perpetuating cycle. Because almost everyone in charge was a white or Asian man, that was the model for whom to hire; managers recognized talent only when it walked and talked as they did. For example, many companies have relied on whiteboard challenges when hiring a coder — a prospective employee is asked to write code, often a sorting algorithm, on a whiteboard while the employers watch. This sort of thing bears almost no resemblance to the work coders actually do in their jobs. But whiteboard questions resemble classroom work at Ivy League institutions. It feels familiar to the men doing the hiring, many of whom are only a few years out of college. “What I came to realize,” Gardner says, “is that it’s not that women are excluded. It’s that practically everyone is excluded if you’re not a young white or Asian man who’s single.”
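
    To make the ritual concrete, here is a minimal sketch, in Python, of the sort of exercise the whiteboard format favors — writing a textbook sorting algorithm from memory. The function and the choice of insertion sort are illustrative only, not any particular company’s actual question:

        def insertion_sort(items):
            # Sort a list in place, as a candidate might write it on a whiteboard.
            for i in range(1, len(items)):
                key = items[i]
                j = i - 1
                while j >= 0 and items[j] > key:
                    items[j + 1] = items[j]  # shift larger elements one slot right
                    j -= 1
                items[j + 1] = key
            return items

        print(insertion_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]

    Fluency at this kind of drill, as the article notes, says more about recent classroom practice than about the day-to-day work of building software.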

    One coder, Stephanie Hurlburt, was a stereotypical math nerd who had deep experience working on graphics software. “I love C++, the low-level stuff,” she told me, referring to a complex language known for allowing programmers to write very fast-running code, useful in graphics. Hurlburt worked for a series of firms this decade, including Unity (which makes popular software for designing games), and then for Facebook on its Oculus Rift VR headset, grinding away for long hours in the run-up to the release of its first demo. Hurlburt became accustomed to shrugging off negative attention and crude sexism. She heard, including from many authority figures she admired, that women weren’t wired for math. While working as a coder, if she expressed ignorance of any concept, no matter how trivial, male colleagues would disparage her. “I thought you were at a higher math level,” one sniffed.

    In 2016, Hurlburt and a friend, Rich Geldreich, founded a start-up called Binomial, where they created software that helps compress the size of “textures” in graphics-heavy software. Being self-employed, she figured, would mean not having to deal with belittling bosses. But when she and Geldreich went to sell their product, some customers assumed that she was just the marketing person. “I don’t know how you got this product off the ground when you only have one programmer!” she recalls one client telling Geldreich.

    In 2014, an informal analysis by a tech entrepreneur and former academic named Kieran Snyder of 248 corporate performance reviews for tech engineers determined that women were considerably more likely than men to receive reviews with negative feedback; men were far more likely to get reviews that had only constructive feedback, with no negative material. In a 2016 experiment conducted by the tech recruiting firm Speak With a Geek, 5,000 résumés with identical information were submitted to firms. When identifying details were removed from the résumés, 54 percent of the women received interview offers; when gendered names and other biographical information were given, only 5 percent of them did.

    Lurking beneath some of this sexist atmosphere is the phantasm of sociobiology. As this line of thinking goes, women are less suited to coding than men because biology better endows men with the qualities necessary to excel at programming. Many women who work in software face this line of reasoning all the time. Cate Huston, a software engineer at Google from 2011 to 2014, heard it from colleagues there when they pondered why such a low percentage of the company’s programmers were women. Peers would argue that Google hired only the best — that if women weren’t being hired, it was because they didn’t have enough innate logic or grit, she recalls.

    In the summer of 2017, a Google employee named James Damore suggested in an internal email that several qualities more commonly found in women — including higher rates of anxiety — explained why they weren’t thriving in a competitive world of coding; he cited the cognitive neuroscientist Simon Baron-Cohen, who theorizes that the male brain is more likely to be “systemizing,” compared with women’s “empathizing” brains. Google fired Damore, saying it could not employ someone who would argue that his female colleagues were inherently unsuited to the job. But on Google’s internal boards, other male employees backed up Damore, agreeing with his analysis. The assumption that the makeup of the coding work force reflects a pure meritocracy runs deep among many Silicon Valley men; for them, sociobiology offers a way to explain things, particularly for the type who prefers to believe that sexism in the workplace is not a big deal, or even doubts it really exists.

    But if biology were the reason so few women are in coding, it would be impossible to explain why women were so prominent in the early years of American programming, when the work could be, if anything, far harder than today’s programming. It was an uncharted new field, in which you had to do math in binary and hexadecimal formats, and there were no helpful internet forums, no Google to query, for assistance with your bug. It was just your brain in a jar, solving hellish problems.
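
    For readers who have never had to do it, “math in binary and hexadecimal” meant juggling the same quantity across number bases by hand. A small Python snippet shows the conversions early programmers worked out with pencil and paper; the specific values here are arbitrary examples:

        n = 0b11010110          # a byte written in binary: 214 in decimal
        print(n, hex(n))        # 214 0xd6

        addr = 0x3F0 + 0x10     # adding two hexadecimal memory addresses
        print(hex(addr))        # 0x400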

    If biology limited women’s ability to code, then the ratio of women to men in programming ought to be similar in other countries. It isn’t. In India, roughly 40 percent of the students studying computer science and related fields are women. This is despite even greater barriers to becoming a female coder there; India has such rigid gender roles that female college students often have an 8 p.m. curfew, meaning they can’t work late in the computer lab, as the social scientist Roli Varma learned when she studied them in 2015. The Indian women had one big cultural advantage over their American peers, though: They were far more likely to be encouraged by their parents to go into the field, Varma says. What’s more, the women regarded coding as a safer job because it kept them indoors, lessening their exposure to street-level sexual harassment. It was, in other words, considered normal in India that women would code. The picture has been similar in Malaysia, where in 2001 — precisely when the share of American women in computer science had slid into a trough — women represented 52 percent of the undergraduate computer-science majors and 39 percent of the Ph.D. candidates at the University of Malaya in Kuala Lumpur.

    Today, when midcareer women decide that Silicon Valley’s culture is unlikely to change, many simply leave the industry. When Sue Gardner surveyed those 1,400 women in 2014, they told her the same story: In the early years, as junior coders, they looked past the ambient sexism they encountered. They loved programming and were ambitious and excited by their jobs. But over time, Gardner says, “they get ground down.” As they rose in the ranks, they found few, if any, mentors. Nearly two-thirds either experienced or witnessed harassment, she read in “The Athena Factor” (a 2008 study of women in tech); in Gardner’s survey, one-third reported that their managers were more friendly toward and gave more support to their male co-workers. It’s often assumed that having children is the moment when women are sidelined in tech careers, as in many others, but Gardner discovered that wasn’t often the breaking point for these women. They grew discouraged seeing men with no better or even lesser qualifications get superior opportunities and treatment.

    “What surprised me was that they felt, ‘I did all that work!’ They were angry,” Gardner says. “It wasn’t like they needed a helping hand or needed a little extra coaching. They were mad. They were not leaving because they couldn’t hack it. They were leaving because they were skilled professionals who had skills that were broadly in demand in the marketplace, and they had other options. So they’re like, ‘[expletive] it — I’ll go somewhere where I’m seen as valuable.’ ”

    The result is an industry that is drastically more male than it was decades ago, and far more so than the workplace at large. In 2018, according to data from the Bureau of Labor Statistics, about 26 percent of the workers in “computer and mathematical occupations” were women. The percentages for people of color are similarly low: Black employees were 8.4 percent, Latinos 7.5 percent. (The Census Bureau’s American Community Survey put black coders at only 4.7 percent in 2016.) In the more rarefied world of the top Silicon Valley tech firms, the numbers are even more austere: A 2017 analysis by Recode, a news site that covers the technology industry, revealed that 20 percent of Google’s technical employees were women, while only 1 percent were black and 3 percent were Hispanic. Facebook was nearly identical; the numbers at Twitter were 15 percent, 2 percent and 4 percent, respectively.

    The reversal has been profound. In the early days of coding, women flocked to programming because it offered more opportunity and reward for merit than fields like law did. Now it is software that has closed its doors.

    In the late 1990s, Allan Fisher decided that Carnegie Mellon would try to address the male-female imbalance in its computer-science program. Prompted by Jane Margolis’s findings, Fisher and his colleagues instituted several changes. One was the creation of classes that grouped students by experience: The kids who had been coding since youth would start on one track; the newcomers to coding would have a slightly different curriculum, allowing them more time to catch up. Carnegie Mellon also offered extra tutoring to all students, which was particularly useful for the novice coders. If Fisher could get them to stay through the first and second years, he knew, they would catch up to their peers.

    Components from four of the earliest electronic computers, held by Patsy Boyce Simmers, Gail Taylor, Millie Beck and Norma Stec, employees at the United States Army’s Ballistics Research Laboratory. Credit Science Source

    They also modified the courses to show how code affects the real world, so that a new student’s view of programming wouldn’t be an endless vista of algorithms disconnected from any practical use. Fisher wanted students to glimpse, earlier on, what it was like to make software that works its way into people’s lives. Back in the ’90s, before social media and even before the internet had gone mainstream, the influence that code could have on daily life wasn’t so easy to see.

    Faculty members, too, adopted a different perspective. For years some had tacitly endorsed the idea that the students who came in already knowing code were born to it. Carnegie Mellon “rewarded the obsessive hacker,” Fisher told me. But the faculty now knew that their assumptions weren’t true; they had been confusing previous experience with raw aptitude. They still wanted to encourage those obsessive teenage coders, but they had come to understand that the neophytes were just as likely to bloom rapidly into remarkable talents and deserved as much support. “We had to broaden how faculty sees what a successful student looks like,” he says. The admissions process was adjusted, too; it no longer gave as much preference to students who had been teenage coders.

    No single policy changed things. “There’s really a virtuous cycle,” Fisher says. “If you make the program accommodate people with less experience, then people with less experience come in.” Faculty members became more used to seeing how green coders evolve into accomplished ones, and they learned how to teach that type.

    Carnegie Mellon’s efforts were remarkably successful. Only a few years after these changes, the percentage of women entering its computer-science program boomed, rising to 42 percent from 7 percent; graduation rates for women rose to nearly match those of the men. The school vaulted over the national average. Other schools concerned about the low number of female students began using approaches similar to Fisher’s. In 2006, Harvey Mudd College tinkered with its Introduction to Computer Science course, creating a track specifically for novices, and rebranded it as Creative Problem Solving in Science and Engineering Using Computational Approaches — which, the institution’s president, Maria Klawe, told me, “is actually a better description of what you’re actually doing when you’re coding.” By 2018, 54 percent of Harvey Mudd’s graduates who majored in computer science were women.

    A broader cultural shift has accompanied the schools’ efforts. In the last few years, women’s interest in coding has begun rapidly rising throughout the United States. In 2012, the percentage of female undergraduates who plan to major in computer science began to rise at rates not seen for 35 years [Computing Research News], since the decline in the mid-’80s, according to research by Linda Sax, an education professor at U.C.L.A. There has also been a boomlet of groups and organizations training and encouraging underrepresented cohorts to enter the field, like Black Girls Code and Code Newbie. Coding has come to be seen, in purely economic terms, as a bastion of well-paying and engaging work.

    In an age when Instagram and Snapchat and iPhones are part of the warp and weft of life’s daily fabric, potential coders worry less that the job will be isolated, antisocial and distant from reality. “Women who see themselves as creative or artistic are more likely to pursue computer science today than in the past,” says Sax, who has pored over decades of demographic data about the students in STEM fields. They’re still less likely to go into coding than other fields, but programming is increasingly on their horizon. This shift is abetted by the fact that it’s much easier to learn programming without getting a full degree, through free online coding schools, relatively cheaper “boot camps” or even meetup groups for newcomers — opportunities that have emerged only in the last decade.

    Changing the culture at schools is one thing. Most female veterans of code I’ve spoken to say that what is harder is shifting the culture of the industry at large, particularly the reflexive sexism and racism still deeply ingrained in Silicon Valley. Some, like Sue Gardner, wonder whether it’s even ethical to encourage young women to go into tech. She fears they’ll pour out of computer-science programs in increasing numbers, arrive at their first coding jobs excited, thrive early on, but then gradually get beaten down by the industry. “The truth is, we can attract more and different people into the field, but they’re just going to hit that wall in midcareer, unless we change how things happen higher up,” she says.

    On a spring weekend in 2017, more than 700 coders and designers were given 24 hours to dream up and create a new product at a hackathon in New York hosted by TechCrunch, a news site devoted to technology and Silicon Valley. At lunchtime on Sunday, the teams presented their creations to a panel of industry judges, in a blizzard of frantic elevator pitches. There was Instagrammie, a robot system that would automatically recognize the mood of an elderly relative or a person with limited mobility; there was Waste Not, an app to reduce food waste. Most of the contestants were coders who worked at local high-tech firms or computer-science students at nearby universities.

    Despite women’s historical role in the vanguard of computer programming, some female veterans of code wonder if it’s even ethical to encourage young women to go into tech because of the reflexive sexism in the current culture of Silicon Valley. Credit Apic/Getty Images

    The winning team, though, was a trio of high school girls from New Jersey: Sowmya Patapati, Akshaya Dinesh and Amulya Balakrishnan. In only 24 hours, they created reVIVE, a virtual-reality app that tests children for signs of A.D.H.D. After the students were handed their winnings onstage — a trophy-size check for $5,000 — they flopped into chairs in a nearby room to recuperate. They had been coding almost nonstop since noon the day before and were bleary with exhaustion.

    “Lots of caffeine,” Balakrishnan, 17, said, laughing. She wore a blue T-shirt that read WHO HACK THE WORLD? GIRLS. The girls told me that they had impressed even themselves by how much they accomplished in 24 hours. “Our app really does streamline the process of detecting A.D.H.D.,” said Dinesh, who was also 17. “It usually takes six to nine months to diagnose, and thousands of dollars! We could do it digitally in a much faster way!”

    They all became interested in coding in high school, each of them with strong encouragement from immigrant parents. Balakrishnan’s parents worked in software and medicine; Dinesh’s parents came to the United States from India in 2000 and worked in information technology. Patapati immigrated from India as an infant with her young mother, who never went to college, and her father, an information-tech worker who was the first in his rural family to go to college.

    The young hackers got used to being the lone girl nerds at their schools, as Dinesh told me.

    “I tried so hard to get other girls interested in computer science, and it was like, the interest levels were just so low,” she says. “When I walked into my first hackathon, it was the most intimidating thing ever. I looked at a room of 80 kids: Five were girls, and I was probably the youngest person there.” But she kept on competing in 25 more hackathons, and her confidence grew. To break the isolation and meet more girls in coding, she attended events by organizations like #BuiltByGirls, which is where, a few days previously, she had met Patapati and Balakrishnan and where they decided to team up. To attend TechCrunch, Patapati, who was 16, and Balakrishnan skipped a junior prom and a friend’s birthday party. “Who needs a party when you can go to a hackathon?” Patapati said.

    Winning TechCrunch as a group of young women of color brought extra attention, not all of it positive. “I’ve gotten a lot of comments like: ‘Oh, you won the hackathon because you’re a girl! You’re a diversity pick,’ ” Balakrishnan said. After the prize was announced online, she recalled later, “there were quite a few engineers who commented, ‘Oh, it was a girl pick; obviously that’s why they won.’ ”

    Nearly two years later, Balakrishnan was taking a gap year to create a heart-monitoring product she invented, and she was in the running for $100,000 to develop it. She was applying to college to study computer science and, in her spare time, competing in a beauty pageant, inspired by Miss USA 2017, Kara McCullough, who was a nuclear scientist. “I realized that I could use pageantry as a platform to show more girls that they could embrace their femininity and be involved in a very technical, male-dominated field,” she says. Dinesh, in her final year at high school, had started an all-female hackathon that now takes place annually in New York. (“The vibe was definitely very different,” she says, more focused on training newcomers.)

    Patapati and Dinesh enrolled at Stanford last fall to study computer science; both are deeply interested in A.I. They’ve noticed the subtle tensions for women in the coding classes. Patapati, who founded a Women in A.I. group with an Apple tech lead, has watched as male colleagues ignore her raised hand in group discussions or repeat something she just said as if it were their idea. “I think sometimes it’s just a bias that people don’t even recognize that they have,” she says. “That’s been really upsetting.”

    Dinesh says “there’s absolutely a difference in confidence levels” between the male and female newcomers. The Stanford curriculum is so intense that even the relative veterans like her are scrambling: When we spoke recently, she had just spent “three all-nighters in a row” on a single project, for which students had to engineer a “print” command from scratch. At 18, she has few illusions about the road ahead. When she went to a blockchain conference, it was a sea of “middle-aged white and Asian men,” she says. “I’m never going to one again,” she adds with a laugh.

    “My dream is to work on autonomous driving at Tesla or Waymo or some company like that. Or if I see that there’s something missing, maybe I’ll start my own company.” She has begun moving in that direction already, having met one venture capitalist via #BuiltByGirls. “So now I know I can start reaching out to her, and I can start reaching out to other people that she might know,” she says.

    Will she look around, 20 years from now, to see that software has returned to its roots, with women everywhere? “I’m not really sure what will happen,” she admits. “But I do think it is absolutely on the upward climb.”

    Correction: Feb. 14, 2019
    An earlier version of this article misidentified the institution Ellen Spertus was affiliated with when she published a 1991 report on women’s experiences in programming classes. Spertus was at M.I.T. when she published the report, not Mills College, where she is currently a professor.

    Correction: Feb. 14, 2019
    An earlier version of this article misstated Akshaya Dinesh’s current age. She is 18, not 19.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:15 pm on January 25, 2019 Permalink | Reply
    Tags: NYT, Wish list of particle colliders

    From The New York Times: “Opinion: The Uncertain Future of Particle Physics”

    New York Times

    From The New York Times

    Jan. 23, 2019
    Sabine Hossenfelder

    Ten years in, the Large Hadron Collider has failed to deliver the exciting discoveries that scientists promised.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE

    CERN/ALICE Detector


    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    The Large Hadron Collider is the world’s largest particle accelerator. It’s a 16-mile-long underground ring, located at CERN in Geneva, in which protons collide at almost the speed of light. With a $5 billion price tag and a $1 billion annual operation cost, the L.H.C. is the most expensive instrument ever built — and that’s even though it reuses the tunnel of an earlier collider.

    CERN Large Electron Positron Collider

    The L.H.C. has collected data since September 2008. Last month, the second experimental run came to an end, and the collider will be shut down for the next two years for scheduled upgrades. With the L.H.C. on hiatus, particle physicists are already making plans to build an even larger collider. Last week, CERN unveiled plans to build an accelerator that is larger and far more powerful than the L.H.C. — and would cost over $10 billion.

    CERN FCC Future Circular Collider map

    I used to be a particle physicist. For my Ph.D. thesis, I did L.H.C. predictions, and while I have stopped working in the field, I still believe that slamming particles into one another is the most promising route to understanding what matter is made of and how it holds together. But $10 billion is a hefty price tag. And I’m not sure it’s worth it.

    In 2012, experiments at the L.H.C. confirmed the discovery of the Higgs boson — a prediction that dates back to the 1960s — and it remains the only discovery made at the L.H.C.

    Peter Higgs

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    Particle physicists are quick to emphasize that they have learned other things: For example, they now have better knowledge about the structure of the proton, and they’ve seen new (albeit unstable) composite particles. But let’s be honest: It’s disappointing.

    Before the L.H.C. started operation, particle physicists had more exciting predictions than that. They thought that other new particles would also appear near the energy at which the Higgs boson could be produced. They also thought that the L.H.C. would see evidence for new dimensions of space. They further hoped that this mammoth collider would deliver clues about the nature of dark matter (which astrophysicists think constitutes 85 percent of the matter in the universe) or about a unified force.

    The stories about new particles, dark matter and additional dimensions were repeated in countless media outlets from before the launch of the L.H.C. until a few years ago. What happened to those predictions? The simple answer is this: Those predictions were wrong — that much is now clear.

    The trouble is, a “prediction” in particle physics is today little more than guesswork. (In case you were wondering, yes, that’s exactly why I left the field.) In the past 30 years, particle physicists have produced thousands of theories whose mathematics they can design to “predict” pretty much anything. For example, in 2015 when a statistical fluctuation in the L.H.C. data looked like it might be a new particle, physicists produced more than 500 papers in eight months to explain what later turned out to be merely noise. The same has happened many other times for similar fluctuations, demonstrating how worthless those predictions are.

    To date, particle physicists have no reliable prediction that there should be anything new to find until about 15 orders of magnitude above the currently accessible energies. And the only reliable prediction they had for the L.H.C. was that of the Higgs boson. Unfortunately, particle physicists have not been very forthcoming with this information. Last year, Nigel Lockyer, the director of Fermilab, told the BBC, “From a simple calculation of the Higgs’ mass, there has to be new science.” This “simple calculation” is what predicted that the L.H.C. should already have seen new science.

    I recently came across a promotional video for the Future Circular Collider that physicists have proposed to build at CERN. This video, which is hosted on the CERN website, advertises the planned machine as a test for dark matter and as a probe for the origin of the universe. It is extremely misleading: Yes, it is possible that a new collider finds a particle that makes up dark matter, but there is no particular reason to think it will. And such a machine will not tell us anything about the origin of the universe. Paola Catapano, head of audiovisual productions at CERN, informed me that this video “is obviously addressed to politicians and not fellow physicists and uses the same arguments as those used to promote the L.H.C. in the ’90s.”

    But big science experiments are investments in our future. Decisions about what to fund should be based on facts, not on shiny advertising. For this, we need to know when a prediction is just a guess. And if particle physicists have only guesses, maybe we should wait until they have better reasons for why a larger collider might find something new.

    It is correct that some technological developments, like strong magnets, benefit from these particle colliders and that particle physics positively contributes to scientific education in general. These are worthy investments, but if that’s what you want to spend money on, you don’t also need to dig a tunnel.

    And there are other avenues to pursue. For example, the astrophysical observations pointing toward dark matter should be explored further; better understanding those observations would help us make more reliable predictions about whether a larger collider can produce the dark matter particle — if it even is a particle.

    There are also medium-scale experiments that tend to fall off the table because giant projects eat up money. One important medium-scale opportunity is the interface between the quantum realm and gravity, which is now becoming accessible to experimental testing. Another place where discoveries could be waiting is in the foundations of quantum mechanics. These could have major technological impacts.

    Now that the L.H.C. is being upgraded and particle physics experiments at the detector are taking a break, it’s time for particle physicists to step back and reflect on the state of the field. It’s time for them to ask why none of the exciting predictions they promised have resulted in discoveries. Money will not solve this problem. And neither will a larger particle collider.

    See the full article here.

    See also From Science News: “Physicists aim to outdo the LHC with this wish list of particle colliders”

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 9:48 am on December 31, 2018 Permalink | Reply
    Tags: New Horizons, NYT

    From The New York Times: “NASA’s New Horizons Will Visit Ultima Thule on New Year’s Day” 

    New York Times

    From The New York Times

    Dec. 31, 2018
    Kenneth Chang

    The probe that visited Pluto will study a mysterious icy world just after midnight. Ultima Thule will be the most distant object ever visited by a spacecraft.

    We should get a clearer look at the Kuiper Belt object, Ultima Thule, when the New Horizons spacecraft, which took this composite image between August and mid-December, flies by on Jan. 1. Credit NASA/Johns Hopkins Applied Physics Laboratory/Southwest Research Institute

    NASA’s New Horizons spacecraft, which flew past Pluto in 2015, will zip past another icy world nicknamed Ultima Thule on New Year’s Day, gathering information on what is believed to be a pristine fragment from the earliest days of the solar system.

    NASA New Horizons spacecraft

    It will be the most distant object ever visited by a spacecraft.

    At 12:33 a.m. Eastern time, New Horizons will pass within about 2,200 miles of Ultima Thule, speeding at 31,500 m.p.h.

    How do I watch the flyby?

    Though it is a NASA spacecraft, the New Horizons mission is operated by the Johns Hopkins Applied Physics Laboratory in Maryland. Coverage of the flyby will be broadcast on the lab’s website and YouTube channel as well as NASA TV. On Twitter, updates will appear on @NewHorizons2015, the account maintained by S. Alan Stern, the principal investigator for the mission, and on NASA’s @NASANewHorizons account.

    While the scientists will celebrate the moment of flyby as if it were New Year’s, they will have no idea how the mission is actually going at that point. The spacecraft, busy making its science observations, will not turn to send a message back to Earth until a few hours later. Then it will take six hours for that radio signal, traveling at the speed of light, to reach Earth.
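
    That six-hour delay is pure geometry; here is a quick back-of-the-envelope check in Python (my arithmetic, not the Times’s), using the roughly four billion miles quoted below:

        miles = 4.0e9               # approximate distance from Earth, per the article
        meters = miles * 1609.344   # miles to meters
        c = 2.998e8                 # speed of light in meters per second
        print(meters / c / 3600)    # about 6 hours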

    Tell me about this small frozen world

    Based on suggestions from the public, the New Horizons team chose a nickname for the world: Ultima Thule, which means “distant places beyond the known world.” Officially, it is 2014 MU69, a catalog designation assigned by the International Astronomical Union’s Minor Planet Center. The “2014” refers to the year it was discovered, the result of a careful scan of the night sky by the Hubble Space Telescope for targets that New Horizons might be able to fly by after its Pluto encounter.

    No telescope on Earth has been able to clearly spot MU69. Even sharp-eyed Hubble can make out only a dot of light. Scientists estimate that it is 12 to 22 miles wide, and that it is dark, reflecting about 10 percent of the light that hits it.

    Four billion miles from the sun, MU69 is a billion miles farther out than Pluto, part of the ring of icy worlds beyond Neptune known as the Kuiper belt. Its orbit, nearly circular, suggests that it has been undisturbed since the birth of the solar system 4.5 billion years ago.

    Why do planetary scientists care about this small thing 4 billion miles from the sun?

    Every time a spacecraft visits an asteroid or a comet, planetary scientists talk about how it is a precious time capsule from the solar system’s baby days when the planets were forming. That is true, but especially true for Ultima Thule.

    Asteroids around the solar system have collided with each other and broken apart. Comets partially vaporize each time they pass close to the sun. But Ultima Thule may have instead been in a deep freeze the whole time, perhaps essentially pristine since it formed 4.5 billion years ago.

    Will there be pictures of Ultima Thule?

    New Horizons has been taking pictures for months, but for most of that time Ultima Thule has been little more than a dot in any of these images.

    At a news conference on Tuesday morning after the flyby, the scientists expect to release a picture taken before the flyby. Ultima Thule is expected to be a mere six pixels wide in that picture — enough to get a rough idea of its shape but not much more.

    The first set of images captured by New Horizons during the flyby should be back on Earth by Tuesday evening, and those are to be shown at news conferences describing the science results on Wednesday and Thursday.

    But when the pictures come, they could be striking — in case you forgot what kind of pictures New Horizons took when it flew past Pluto, here are some highlights of its findings.

    Isn’t NASA closed?

    Yes, NASA is one of the agencies affected by the partial federal government shutdown, and most NASA employees are currently furloughed. However, missions in space, including New Horizons, are considered essential activities. (It would be a shame if NASA had to throw away spacecraft costing hundreds of millions of dollars.)

    NASA will not be issuing news releases, but the Johns Hopkins Applied Physics Laboratory public affairs staff will get the news out, and on Friday, NASA Administrator Jim Bridenstine indicated that the agency would continue providing information on New Horizons as well as Osiris-Rex, a mission that is exploring a near-Earth asteroid, Bennu.

    NASA OSIRIS-REx Spacecraft

    What happens after the flyby?

    Because New Horizons is so far away, its radio signal is weak, and the data will trickle back over the next 20 months. At the same time, it will make observations of other objects in the Kuiper belt to compare with Ultima Thule.

    The spacecraft has enough propellant left to possibly head to a third target, but that depends on whether there is anything close enough along its path. Astronomers, busy with Ultima Thule, have yet to start that new search.

    Beyond that, New Horizons will continue heading out of the solar system. Powered by a plutonium power source, it should be able to take data and communicate with Earth for perhaps another 20 years. However, it is not moving quite as fast as the Voyager 1 and Voyager 2 spacecraft, which have both now entered interstellar space, so it is unclear whether New Horizons will make a similar crossing before its power runs out.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 11:06 am on December 25, 2018 Permalink | Reply
    Tags: NYT

    From The New York Times: “It’s Intermission for the Large Hadron Collider” 

    New York Times

    From The New York Times

    This is a special augmented-reality production of The New York Times. Please view the original full article to take advantage of the 360-degree images inside the LHC.

    Dec. 21, 2018
    Dennis Overbye

    The largest machine ever built is shutting down for two years of upgrades. Take an immersive tour of the collider and study the remnants of a Higgs particle in augmented reality.

    CERN Control Center

    MEYRIN, Switzerland — There is silence on the subatomic firing range.

    A quarter-century ago, the physicists of CERN, the European Center for Nuclear Research, bet their careers and their political capital on the biggest and most expensive science experiment ever built, the Large Hadron Collider.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE

    CERN/ALICE Detector

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    The collider is a kind of microscope that works by flinging subatomic particles around a 17-mile electromagnetic racetrack beneath the French-Swiss countryside, smashing them together 600 million times a second and sifting through the debris for new particles and forces of nature. The instrument is also a time machine, providing a glimpse of the physics that prevailed in the early moments of the universe and laid the foundation for the cosmos as we see it today.

    The reward came in 2012 with the discovery of the Higgs boson, a long-sought particle that helps explain why there is mass, diversity and life in the cosmos.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The discovery was celebrated with champagne and a Nobel prize.

    The collider will continue smashing particles and expectations for another 20 years. But first, an intermission. On Dec. 3, the particle beams stopped humming. The giant magnets that guide the whizzing protons sighed and released their grip. The underground detectors that ring the tunnel stood down from their watch.

    Over the next two years, during the first of what will be a series of shutdowns, engineers will upgrade the collider to make its beams more intense and its instruments more sensitive and discerning. And theoretical physicists will pause to make sense of the tantalizing, bewildering mysteries that the Large Hadron Collider has generated so far.

    When protons collide

    The collider gets its mojo from Einstein’s dictum that mass and energy are the same. The more energy that the collider can produce, the more massive are the particles created by the collisions. With every increase in the energy of their collider, CERN physicists are able to edge farther and farther back in time, closer to the physics of the Big Bang, when the universe was much hotter than today.

    Inside CERN’s subterranean ring, some 10,000 superconducting electromagnets, powered by a small city’s worth of electricity, guide two beams of protons in opposite directions around the tunnel at 99.99999 percent of the speed of light, or an energy of 7 trillion electron volts. Those protons make the 17-mile circuit 11,000 times a second. (In physics, mass and energy are both expressed in terms of units called electron volts. A single proton, the building block of ordinary atoms, weighs about a billion electron volts.)
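
    Those two figures are consistent with each other, as a quick check shows — taking the 17-mile ring to be roughly 27 kilometers around (my arithmetic, not the article’s):

        circumference_km = 27             # the ring’s roughly 17-mile circumference
        c_km_s = 299792.458               # speed of light in kilometers per second
        print(c_km_s / circumference_km)  # about 11,000 laps per second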

    The protons enter the collider as atoms in a puff of hydrogen gas squirted from a bottle. As the atoms travel, electrical fields strip them of electrons, leaving bare, positively charged protons. These are sped up by a series of increasingly larger and more energetic electromagnets, until they are ready to enter the main ring of the collider.

    When protons finally enter the main ring, they have been boosted into flying bombs of primordial energy, primed to smash apart — and recombine — when they strike their opposite numbers head-on, coming from the other direction.

    The protons circulate inside vacuum pipes – one running clockwise, the other counterclockwise – and these are surrounded by superconducting electromagnets strung together around the tunnel like sausages. To generate enough force to bend the speeding protons, the magnets must be uncommonly strong: 8.3 Tesla, or more than a hundred thousand times stronger than Earth’s magnetic field — and more than strong enough to wreck a fancy Swiss watch.
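
    The “hundred thousand times” comparison checks out, assuming a typical value of about 50 microteslas for Earth’s surface field — an assumption of mine, since the article does not give one:

        dipole_field = 8.3                # tesla, as quoted above
        earth_field = 5.0e-5              # tesla; assumed typical surface value
        print(dipole_field / earth_field) # about 166,000 times stronger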

    Such a field in turn requires an electrical current of 12,000 amperes. That’s only feasible if the magnets are superconducting, meaning that electricity flows without expensive resistance. For that to happen, the magnets must be supercold; they are bathed in 150 tons of superfluid helium at a temperature of 1.9 Kelvin, making the Large Hadron Collider literally one of the coldest places in the universe.

    If things go wrong down here, they can go very wrong. In 2008, as the collider was still being tuned up, the link between a pair of magnets exploded, delaying operations for almost two years.

    The energy stored in the magnetic fields is equivalent to a fully loaded jumbo jet going 500 miles per hour; if a magnet loses its cool and heats up, all that energy must go someplace. And the proton beam itself can cut through many feet of steel.
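
    The jumbo-jet comparison can be sanity-checked with the ordinary kinetic-energy formula; the 400-metric-ton loaded weight is my assumption, not a figure from the article:

        mass_kg = 4.0e5                  # assumed loaded jumbo jet, ~400 metric tons
        speed_m_s = 500 * 0.44704        # 500 m.p.h. in meters per second
        energy_j = 0.5 * mass_kg * speed_m_s ** 2
        print(f"{energy_j:.1e} joules")  # about 1.0e10 J, i.e. ~10 gigajoules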

    A tale of four detectors

    The beams cross at four points around the racetrack.

    At each juncture, gigantic detectors — underground mountains of electronics, cables, computers, pipes, magnets and even more magnets — have been erected. The two biggest and most expensive experiments, CMS (the Compact Muon Solenoid) and Atlas (A Toroidal L.H.C. Apparatus) sit, respectively, at the noon and 6 o’clock positions of the circular track.

    Wrapped around them, like the layers of an onion, are instruments designed to measure every last spark of energy or matter that might spew from the collision. Silicon detectors track the paths of lightweight, charged particles such as electrons. Scintillation crystals capture the energies of gamma rays. Chambers of electrified gas track more far-flung particles. And powerful magnets bend the paths of these particles so that their charges and masses can be determined.

    The proton beams cross 40 million times per second in each of the four detectors, resulting in about a billion actual collisions every second.
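
    Dividing one quoted rate by the other gives the average number of simultaneous proton collisions the detectors must untangle at each crossing — what physicists call pileup; a one-line sketch:

        crossings_per_s = 4.0e7      # beam crossings per second, as quoted
        collisions_per_s = 1.0e9     # proton collisions per second, as quoted
        print(collisions_per_s / crossings_per_s)  # about 25 collisions per crossing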

    What’s the antimatter?

    Why is there something instead of nothing in the universe?

    Answering that question is the mission of the detector known as LHCb, which sits at about 4 o’clock on the collider dial. The “b” stands for beauty — and for the B meson, a subatomic particle that is crucial to the experiment.

    When matter is created — in a collider, in the Big Bang — equal amounts of matter and its opposite, antimatter, should be formed, according to the laws of physics As We Know Them. When matter and antimatter meet, they annihilate each other, producing energy.

    By that logic, when matter and antimatter formed in the Big Bang, they should have canceled each other out, leaving behind an empty universe. But it’s not empty: We are here, and our antimatter is not.

    Why not? Physicists suspect that some subtle imbalance between matter and antimatter is responsible. The LHCb experiment looks for that imbalance in the behavior of B mesons, which are often sprayed from the proton collisions.

    B mesons have an exotic property: They flicker back and forth between being matter and antimatter. Sensors record their passage through the LHCb room, seeking differences between the particles and their antimatter twins. Any discrepancy between the two could be a clue to why matter flourished billions of years ago and antimatter perished.

    Turning back the cosmic clock

    At about 8 o’clock on the collider dial is Alice, another detector with a special purpose. It, too, is fixed on the distant past: the brief moment a couple of microseconds after the Big Bang, before the first protons and neutrons congealed out of a “primordial soup” of quarks and gluons.

    Alice’s job is to study tiny droplets of that distant past that are created when the collider bangs together lead ions instead of protons. Researchers expected this material, known in the lingo as a quark-gluon plasma, to behave like a gas, but it turns out to behave more like a liquid.

    Sifting the data

    The collider’s enormous detectors are like 100 megapixel cameras that take 40 million pictures a second. Most of the data from that deluge is immediately thrown away. Triggers, programmed to pick out events that physicists thought might be interesting, save only about a thousand collision events per second. Even still, an enormous pool of data winds up in the CERN computer banks.

    CERN DATA Center
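
    The winnowing is severe: keeping about a thousand of 40 million snapshots per second means discarding all but one event in 40,000. A quick check of the quoted figures:

        pictures_per_s = 4.0e7       # snapshots per second, as quoted
        kept_per_s = 1.0e3           # events the triggers save per second
        print(pictures_per_s / kept_per_s)  # one event kept per 40,000 taken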

    According to the casino rules of modern quantum physics, anything that can happen will happen eventually. Before a single proton is fired through the collider, computers have calculated all the possible outcomes of a collision according to known physics. Any unexpected bump in the real data at some energy could be a signal of unknown physics, a new particle.

    That was how the Higgs was discovered, emerging from the statistical noise in the autumn of 2011. Only one of every 10 billion collisions creates a Higgs boson. The Higgs vanishes instantly and can’t be observed directly, but it decays into fragments that can be measured and identified.
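
    Combining that rarity with the billion collisions per second quoted earlier implies the machine produces a Higgs boson only about once every ten seconds of running — a sketch of the arithmetic:

        collisions_per_s = 1.0e9     # from the detector section above
        higgs_odds = 1.0 / 1.0e10    # one Higgs per 10 billion collisions
        rate = collisions_per_s * higgs_odds
        print(1 / rate)              # one Higgs roughly every 10 seconds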

    What eventually stood out from the data was evidence for a particle that weighs all by itself as much as an iodine atom: a flake of an invisible force field that permeates space like molasses, impeding motion and assigning mass to objects that pass through it.

    And so in 2012, after half a century and billions of dollars, thousands of physicists toasted over champagne. Peter Higgs, for whom the elusive boson was named, shared the Nobel prize with François Englert, who had independently predicted the particle’s existence.

    Peter Higgs

    François Englert

    An intermission underground

    The current shutdown is the first of a pair of billion-dollar upgrades intended to boost the productivity of the Large Hadron Collider tenfold by the end of the decade.

    The first shutdown will last for two years, until 2021; during that time, engineers will improve the series of smaller racetracks that speed up protons and inject them into the main collider. The collider then will run for two years and shut down again, in 2024, for two more years, so that engineers can install new magnets to intensify the proton beams and collisions.

    Reincarnated in 2026 as the High Luminosity L.H.C., the collider is scheduled to run for another decade, until 2035 or so, which means its career probing the edge of human knowledge is still beginning.

    Judging by the collider’s productivity, measured in terms of trillions of subatomic smashups, more than 95 percent of its scientific potential lies ahead.

    Both the Atlas and CMS experiments will receive major upgrades during the next two shutdowns, including new silicon trackers to replace the old ones burned out by radiation.

    To keep up with the increased collision rate, both Atlas and CMS have had to upgrade the finicky trigger systems that decide which collision events to keep and study. Currently, of a billion events per second, they can keep 1,500; the upgrade will raise that figure to 10,000.

    And what a flow of collisions it will be. Physicists measure the productivity, or luminosity, of their colliders in terms of collisions. It took about 3,000 trillion collisions to confirm the Higgs boson. As of the December shutdown, the collider had logged about 20,000 trillion collisions. But those were, and are, early days.

    By 2037, the Large Hadron Collider should have produced roughly 4 million trillion primordial fireballs, bristling with who knows what. The whole universe is still up for grabs.
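    Taking the article’s figures at face value (one Higgs per roughly 10 billion collisions, plus the collision totals quoted above), a naive multiplication shows how the statistics grow with luminosity. Treat this Python sketch as arithmetic only; real yields depend on beam energy and analysis details.

```python
# Naive Higgs yields implied by the figures quoted in this article.
higgs_per_collision = 1 / 10_000_000_000     # "one of every 10 billion collisions"

collision_totals = {
    "needed to confirm the Higgs": 3_000e12,       # 3,000 trillion collisions
    "logged as of the 2018 shutdown": 20_000e12,   # 20,000 trillion collisions
    "projected by 2037": 4_000_000e12,             # 4 million trillion collisions
}

for label, n_collisions in collision_totals.items():
    print(f"{label}: ~{n_collisions * higgs_per_collision:,.0f} Higgs bosons")
```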

    After the Higgs

    Discovering the Higgs was an auspicious start. But the champagne came with a mystery.

    Over the last century, physicists have learned to explain some of the grandest and subtlest phenomena in nature — the arc of a rainbow, the scent of a gardenia, the twitch of a cat’s whiskers — as a handful of elementary particles interacting through four basic forces, playing a game of catch with force-carrying particles called bosons according to a set of equations called the Standard Model.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    But why these particles and these forces? Why is the universe made of matter but not antimatter? What happens at the center of a black hole, or happened at the first instant of the Big Bang? If the Higgs boson determines the masses of particles, what determines the mass of the Higgs?

    Who, in other words, watches the watchman?

    The Standard Model, for all its brilliance and elegance, does not say. Particles that might answer these questions have not shown up yet in the collider. Fabiola Gianotti, the director-general of CERN, expressed surprise. “I would have expected new physics to manifest itself at the energy scale of the Large Hadron Collider,” she said.

    Some physicists have responded by speculating about multiple universes and other exotic phenomena. Some clues, Dr. Gianotti said, might come from studying the new particle on the block, the Higgs.

    “We physicists are happy when we understand things, but we are even happier when we don’t understand,” she said. “And today we know that we don’t understand everything. We know that we are missing something important and fundamental. And this is very exciting.”

    Colliders of tomorrow

    Humans soon must decide which machines, if any, will be built to augment or replace the Large Hadron Collider. That collider had a “killer app” of sorts: it was designed to achieve an energy at which, according to the prediction of the Standard Model, the Higgs or something like it would become evident and provide an explanation for particle masses.

    But the Standard Model doesn’t predict a new keystone particle in the next higher energy range. Luckily, nobody believes the Standard Model is the last word about the universe, but as the machines increase in energy, particle physicists will be shooting in the dark.

    For a long time, the leading candidate for Next Big Physics Machine has been the International Linear Collider, which would fire electrons and their antimatter opposites, positrons, at each other.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    The collisions would produce showers of Higgs bosons. The experiment would be built in Japan, if it is built at all, but Japan has yet to commit to hosting the project, which would require it to pay about half of the $5.5 billion cost; see https://sciencesprings.wordpress.com/2018/12/21/from-nature-via-ilc-plans-for-worlds-next-major-particle-collider-dealt-big-blow.

    In the meantime, Europe has convened meetings and workshops to decide on a plan for the future of particle physics there. “If there is no word from Japan by the end of the year, then the I.L.C. will not figure in the next five-year plan for Europe,” Lyn Evans, a CERN physicist who was in charge of building the Large Hadron Collider, said in an email.

    CERN has proposed its own version of a linear collider, the Compact Linear Collider, that could be scaled up gradually from Higgs bosons to higher energies. Also being considered is a humongous collider, 100 kilometers around, that would lie under Lake Geneva and would reach energies of 100 trillion electron volts — seven times the power of the Large Hadron Collider.

    CERN Compact Linear Collider

    CLIC map

    CLIC TWO-BEAM ACCELERATION TEST STAND

    And in November the Chinese Academy of Sciences released the design for a next-generation collider of similar size, called the Circular Electron Positron Collider.

    China Circular Electron Positron Collider (CEPC) map

    China Circular Electron-Positron collider depiction

    The machine could be the precursor for a still more powerful machine that has been dubbed the Great Collider. Politics and economics, as well as physics, will decide which, if any, of these machines will see a shovel.

    “If we want a new machine, nothing is possible before 2035,” Frederick Bordry, CERN’s director of accelerators, said of European plans. Building such a machine is a true human adventure, he said: “Twenty-five years to build and another 25 to operate.”

    Noting that he himself is 64, he added, “I’m working for the young people.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 11:52 am on December 18, 2018 Permalink | Reply
    Tags: Continental drift, Hard evidence of tectonic origins was destroyed long ago, NYT, The link between plate tectonics and the evolution of complex life, What caused the shell to crack apart in the first place, With subduction established water, like oceanic crust, would cycle between Earth’s surface and mantle, You need plate tectonics to sustain life

    From The New York Times: “The Earth’s Shell Has Cracked, and We’re Drifting on the Pieces” 

    New York Times

    From The New York Times

    Dec. 18, 2018
    Natalie Angier

    Plate tectonics helped make our planet stable and habitable. But the slow shifting of continents is still a mysterious process.

    1
    The San Andreas fault in the Carrizo Plain in California. The fault line forms the boundary between the Pacific and the North American plates. Credit Peter Menzel/Science Source

    The theory of plate tectonics is one of the great scientific advances of our age, right up there with Darwin’s theory of evolution and Einstein’s theory of relativity.

    The idea that Earth’s outer shell is broken up into giant puzzle pieces, or plates, all gliding atop a kind of conveyor belt of hot, weak rock — here rising up from the underlying mantle, there plunging back into it — explains much about the structure and behavior of our home planet: the mountains and ocean canyons, the earthquakes and volcanoes, the very composition of the air we breathe.

    Yet success is no guarantee against a midlife crisis, and so it is that half a century after the basic mechanisms of plate tectonics were first elucidated, geologists are confronting surprising gaps in their understanding of a concept that is truly the bedrock of their profession.

    They are sparring over when, exactly, the whole movable plate system began. Is it nearly as ancient as the planet itself — that is, roughly 4.5 billion years old — or a youthful one billion years, or somewhere in between?

    They are asking what caused the shell to crack apart in the first place, and how the industrious recycling of Earth’s crust began.

    They are comparing Earth with its sister planet, Venus. The two worlds are roughly the same size and built of similar rocky material, yet Earth has plate tectonics and Venus does not. Scientists want to know why.

    “In the 1960s and 70s, when people came up with the notion of plate tectonics, they didn’t think about what it was like in the distant past,” said Jun Korenaga, a geophysicist at Yale University.

    “People were so busy trying to prove plate tectonics by looking at the present situation, or were caught up applying the concept to problems in their own field. The origin issue is a much more recent debate.”

    Researchers also are exploring the link between plate tectonics and the evolution of complex life. Fortuitously timed continental collisions and mountain smackdowns may well have supplied crucial nutrients at key moments of biological inventiveness, like the legendary Cambrian explosion of 500 million years ago, when the ancestors of modern life-forms appeared.

    “The connection between deep Earth processes and Earth surface biology hasn’t been thought about too clearly in the past, but that’s changing fast,” said Aubrey Zerkle, a geochemist at the University of St. Andrews in Scotland.

    It’s increasingly obvious that “you need plate tectonics to sustain life,” Dr. Zerkle added. “If there wasn’t a way of recycling material between mantle and crust, all these elements that are crucial to life, like carbon, nitrogen, phosphorus and oxygen, would get tied up in rocks and stay there.”

    The origin and implications of plate tectonics were the subject of a recent meeting and themed issue of Philosophical Transactions of the Royal Society.

    Researchers said that pinning down when and how Earth’s vivid geological machinations arose will do more than flesh out our understanding of our home base. The answers could well guide our search for life and habitable planets beyond the solar system.

    Robert Stern, a geoscientist at the University of Texas at Dallas, argues that if we’re looking for another planet to colonize, we want to avoid ones with signs of plate tectonic activity. Those are the places where life is likely to have evolved beyond the “single cell or worm stage, and we don’t want to fight another technological civilization for their planet.”

    “A relatively benign way for the Earth to lose heat”

    2
    Mount Sinabung erupting in Indonesia in October 2014. Plate tectonics “allows Earth to maintain a stabler and more benign environment overall,” explained one scientist. Credit Dedy Sahputra/European Pressphoto Agency

    The idea that continents are not fixed but rather peregrinate around the globe dates back several centuries, when mapmakers began noticing the complementarity of various land masses — for example, the way the northeast bulge of South America looks as though it could fit snugly in the cupped palm of the southwest coast of Africa.

    But it wasn’t until the mid-twentieth century that the generic notion of “continental drift” was transformed into a full-bodied theory, complete with evidence of a subterranean engine driving these continental odysseys.

    Geologists determined that Earth’s outer layer is broken into eight or nine large segments and five or six smaller ones, a mix of relatively thin, dense oceanic plates riding low and thicker, lighter continental plates bobbing high.

    At large fissures on the ocean floor, melting rock from the underlying mantle rises up, adding to the oceanic plates. At other fracture points in the crust, oceanic plates are diving back inside, or subducting, their mass devoured in the mantle’s hot belly.

    The high-riding continental plates are likewise jostled by the magmatic activity below, skating around at an average pace of one or two inches a year, sometimes crashing together to form, say, the Himalayan mountain chain, or pulling apart at Africa’s Great Rift Valley.
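    Those rates are tiny on a human scale, but geological time multiplies them enormously. The Python sketch below is a deliberate oversimplification, a straight linear extrapolation of the quoted one-to-two-inch pace, since real plate speeds and directions change over time.

```python
# Naive linear extrapolation of the quoted drift rates (1-2 inches per year).
INCHES_PER_MILE = 63_360

for rate_in_per_year in (1, 2):
    for years in (1_000_000, 100_000_000):
        miles = rate_in_per_year * years / INCHES_PER_MILE
        print(f"{rate_in_per_year} in/yr over {years:>11,} years = about {miles:,.0f} miles")
```

    At the upper rate, 100 million years of drift covers roughly 3,000 miles, about the width of the Atlantic.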

    All this convective bubbling up and recycling between crust and mantle, this creative destruction and reconstruction of parts — “tectonic” comes from the Greek word for build — is Earth’s way of following the second law of thermodynamics. The movement shakes off into the frigidity of space the vast internal heat that the planet has stored since its violent formation.

    And while shifting, crumbling plates may seem inherently unreliable, a poor foundation on which to raise a family, the end result is a surprising degree of stability. “Plate tectonics is a relatively benign way for Earth to lose heat,” said Peter Cawood, an Earth scientist at Monash University in Australia.

    “You get what are catastrophic events in localized areas, in earthquakes and tsunamis,” he added. “But the mechanism allows Earth to maintain a stabler and more benign environment overall.”

    4
    Sulfuric gas in the Afar Triple Junction in Ethiopia, at the top of the Great Rift Valley. Three tectonic plates meet at this spot: the Arabian plate and two African plates, Nubian and Somali. Credit Massimo Rumi/Barcroft Media, via Getty Images

    Unfortunately for geologists, the very nature of plate tectonics obscures its biography. Oceanic crust, where the telltale mantle exchange zones are located, is recycled through the upwelling and subducting pipeline every 200 million years or so, which means hard evidence of tectonic origins was destroyed long ago.

    Continental crust is older, and rocks dating back more than 4 billion years have been identified in places like Jack Hills, Australia. But continental plates float above the subductive fray, revealing little of the system’s origins.

    Nevertheless, geoscientists are doing their best with extant rocks, models and laboratory experiments to sketch out possible tectonic timelines. Dr. Korenaga and his colleagues have proposed that plate tectonics began very early, right after Earth’s crust solidified from its initial magmatic state.

    “That is when the conditions would have been easiest for plate tectonics to get started,” he said. At that point, he said, most of the water on Earth — delivered by comets — would still be on the surface, with little of it having found its way into the mantle. The heat convecting up through the mantle would exert a stronger force on dry rocks than on rocks that were lubricated.

    At the same time, the surface water would make it easier for the hot, twisting rocks beneath to crack the surface lid apart, rather as a sprinkling of water from the faucet eases the task of popping ice cubes from a tray. The cracking open of the surface lid, Dr. Korenaga said, is key to getting the almighty subduction engine started. With subduction established, water, like oceanic crust, would cycle between Earth’s surface and mantle.

    Water is constantly recycled between the mantle and crust

    5
    A map of tectonic plates in the Indian Ocean based on data showing seafloor gravity anomalies. The red areas show areas where gravity is stronger, largely aligning with underwater ridges, seamounts and plate edges. Credit Joshua Stevens, Sandwell, D. et al., NASA

    On the opposite end of the origins debate is Dr. Stern, who argues that plate tectonics is a mere billion years old or less, and that Earth spent its first 3.5 billion years with a simple “single lid” as its outer shell: a crust riddled with volcanoes and other means of heat ventilation, but no moving plates, no subduction, no recycling between inside and out.

    As evidence of the youthfulness of the plate regimen, Dr. Stern points to two classes of rocks: ophiolites and blueschist.

    Ophiolites are pieces of oceanic crust atop bits of underlying mantle that have made their way onto land and thus have escaped the relentless recycling of oceanic crust. Recent research has shown that ophiolites are not just any slice of oceanic crust, Dr. Stern said, but rather were formed by the forces of subduction.

    Similarly, blueschists are rocks that are fashioned under very high pressure but low temperatures, and “the only place you can do that is in a subduction zone,” Dr. Stern said.

    Nearly all ophiolites are less than a billion years old, he added, while the most ancient blueschists, found in China, are just 800 million years old. No ophiolites, no blueschists, no evidence of subduction or plate tectonics.

    Most geologists opt for a middle ground. “Science is a democratic process,” said Michael Brown, a geologist at the University of Maryland and an editor of the themed issue, “and the prevailing view is that Earth started to exhibit behaviors that look like plate tectonics 2.5 to 3 billion years ago.”

    Significantly, that chronology decouples plate tectonics from the origin of life on Earth: evidence of the earliest single-celled organisms dates back more than 3.6 billion years. Nevertheless, scientists view plate tectonics as vital to the sustained evolution of that primordial life.

    6
    In Iceland, a visible fault between the North American and Eurasian plates, which are pulling away from each other at a rate of about an inch a year. Credit Universal History Archive/UIG, via Getty Images

    Plate tectonic activity did not just help to stabilize Earth’s heat management system. The movement kept a steady supply of water shuttling between mantle and crust, rather than gradually evaporating from the surface.

    It blocked the dangerous buildup of greenhouse gases in the atmosphere by sucking excess carbon from the ocean and subducting it underground. It shook up mountains and pulverized rocks, freeing up essential minerals and nutrients like phosphorus, oxygen and nitrogen for use in the growing carnival of life.

    Dr. Zerkle discerns a link between geological and biological high drama: “It’s been suggested that time periods of supercontinental cycles — when small continents smash together to make large supercontinents, and those supercontinents then rip apart into smaller continents again — could have put large pulses of nutrients into the biosphere and allowed organisms to really take off.”

    Plate tectonics also built the right playing fields for Darwinian games.

    “Think about what drives evolution,” Dr. Stern said. “It’s isolation and competition. You need to break continents and continental shelves apart, and separate one ocean from another, for speciation to occur.”

    Life is always falling apart, on the rocks — and a good thing, too.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 5:29 pm on December 8, 2018 Permalink | Reply
    Tags: China’s Chang’e-4 Launches on Mission to the Moon’s Far Side, NYT   

    From The New York Times: “China’s Chang’e-4 Launches on Mission to the Moon’s Far Side” 

    New York Times

    From The New York Times

    Dec. 7, 2018
    Kenneth Chang

    1
    China hopes to send its Chang’e-4 lunar lander to the far side of the moon, shown here illuminated by the sun in an image captured by NASA’s Deep Space Climate Observatory satellite. Credit NASA Goddard

    China is aiming to go where no one has gone before: the far side of the moon.

    A rocket carrying the Chang’e-4 lunar lander blasted off at about 2:23 a.m. local time on Saturday from Xichang Satellite Launch Center in southern China. (In the United States, it was still midday Friday). Chinese authorities did not broadcast the launch, but an unofficial live stream recorded near the site showed the rocket rise from the launchpad until its flames looked like a bright star in the area’s dark skies.

    Nearly one hour later, Xinhua, China’s state-run news agency, reported that Chang’e-4 had successfully launched.

    2
    Agence France-Presse-Getty Images

    Exactly when it will set down at its destination has not yet been announced — possibly in early January — but Chang’e-4 will provide the first close-up look at a part of the moon that is eternally out of view from Earth.

    What is Chang’e-4?

    5

    Chang’e-4 includes two main parts: the main lander weighing about 2,400 pounds and a 300-pound rover. By comparison, NASA’s Opportunity rover on Mars weighs about 400 pounds, and the Curiosity rover there is much bigger, at 2,000 pounds.

    The spacecraft is largely a clone of Chang’e-3, which landed on the moon in 2013. Indeed, Chang’e-4 was built as the backup in case the first attempt failed. With the success — the first soft landing of any spacecraft on the moon since 1976 — the Chinese outfitted Chang’e-4 with a different set of instruments and decided to send it to a different location.

    Where is Chang’e-4 going?

    The rover will land in the 110-mile-wide Von Kármán crater. It is on the far side of the moon, which always faces away from Earth. (The moon is what planetary scientists call “tidally locked” to Earth: its period of rotation — its day — is the same as the time it takes to make one orbit around Earth, so the same hemisphere always points our way.)
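    That parenthetical is easy to check numerically. In the minimal sketch below, the spin and orbit angles advance at the same rate, so the angle between the moon’s Earth-facing meridian and the Earth-moon line never changes; the 27.3-day period is rounded, and the model ignores the slight wobbles known as libration.

```python
import math

ORBITAL_PERIOD_DAYS = 27.3  # one trip around Earth (rounded)
SPIN_PERIOD_DAYS = 27.3     # one rotation; equal because the moon is tidally locked

for day in (0, 7, 14, 21, 27.3):
    orbit_angle = 2 * math.pi * day / ORBITAL_PERIOD_DAYS  # where the moon is on its orbit
    spin_angle = 2 * math.pi * day / SPIN_PERIOD_DAYS      # how far the moon has turned
    offset_deg = math.degrees(spin_angle - orbit_angle) % 360.0
    print(f"day {day:>5}: face offset from the Earth-line = {offset_deg:.1f} degrees")
```

    The offset prints as 0.0 degrees at every step; make the two periods differ and the far side immediately rotates into view.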

    The crater is within an area known as the South Pole-Aitken basin, a gigantic, 1,600-mile-wide crater at the bottom of the moon, which has a mineralogy distinct from other locations. That may reflect materials from the inside of the moon that were brought up by the impact that created the basin.

    The far side is also considerably more mountainous than the near side for reasons not yet understood.

    What will Chang’e-4 study?

    The suite of instruments on the rover and the lander includes cameras, ground-penetrating radar and spectrometers to help identify the composition of rocks and dirt in the area. And China’s space agency has collaborated with other countries. One instrument was developed at Kiel University in Germany; another was provided by the Swedish Institute of Space Physics.

    The instruments will probe the structure of the rocks beneath the spacecraft and study the effects of the solar wind striking the lunar surface. Chang’e-4 will also test the feasibility of making radio astronomy observations from the far side of the moon, free of the noise and interference from Earth.

    According to the Xinhua news agency, Chang’e-4 is also carrying an intriguing biology experiment to see if plant seeds will germinate and silkworm eggs will hatch in the moon’s low gravity.

    6
    China launched a relay satellite named Queqiao, which will beam messages between Earth and the Chang’e-4 lander, in May. Credit Cai Yang/XinHua, via Associated Press

    How will the spacecraft communicate with Earth?

    Because the moon blocks radio signals from our planet, the Chinese launched a satellite, called Queqiao, in May.

    6
    Flying high: The Queqiao satellite will communicate between Earth and a lander that will be placed on the far side of the Moon later this year (Courtesy: China Aerospace Science and Technology Corporation)

    It is circling high over the far side of the moon, and will relay messages between Earth and the Chang’e-4 lander.

    When will Chang’e-4 land on the moon?

    China’s space agency has not announced a landing date, though some expect that will be the first week of January, when the sun will be shining over the far side of the moon, an important consideration because Chang’e-4 is solar-powered.

    Zhang Xiaoping, an associate professor at the Space Science Institute and Lunar and Planetary Science Laboratory of the Macau University of Science and Technology, said that the spacecraft would follow Chang’e-3’s trajectory. That means it would arrive in three to five days and then orbit the moon for several days (13 in the case of Chang’e-3) while preparing for the landing, he said.

    Wait, I thought the far side of the moon was dark.

    The far side is not dark all of the time.

    The first new moon of 2019 is Jan. 6. That’s when you cannot see the moon, because the dark side — the side in shadow, facing away from the sun — is turned toward Earth. And when the near side of the moon is dark, the far side is awash in bright sunshine.
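    The complementarity can be written down directly. In the simplified model below (which ignores libration and eclipses), the sunlit fractions of the near and far sides always sum to one full hemisphere’s worth of daylight.

```python
import math

# Simplified illumination model: phase_angle = 0 at new moon, pi at full moon.
for name, phase_angle in [("new moon", 0.0),
                          ("first quarter", math.pi / 2),
                          ("full moon", math.pi)]:
    near_lit = (1 - math.cos(phase_angle)) / 2  # fraction of the near side in sunlight
    far_lit = 1 - near_lit                      # the far side gets the complement
    print(f"{name:>13}: near side {near_lit:.0%} lit, far side {far_lit:.0%} lit")
```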

    Why is China so secretive about all of this?

    Chinese officials have talked about Chang’e-4 in public, but their interactions with journalists resemble the carefully managed strategy of the Soviet program during the Cold War more than the more open publicity of NASA and many other space agencies. That way, the Chinese, like the Soviets, could boast about the successes and downplay any failures.

    What does Chang’e mean?

    In Chinese mythology, Chang’e is the goddess of the moon. Other missions have been named after her, too.

    Chang’e-1 and 2 went into orbit around the moon but did not land. Chang’e-1 was launched in 2007. Chang’e-2 followed in 2010.

    The next step in China’s moon program is for the Chang’e-5 robotic spacecraft to land on the moon and then bring rock samples back to Earth for additional study.

    Chang’e-5 was supposed to head to the moon before Chang’e-4, but a launch failure of the large Chinese rocket needed to carry it to space delayed the mission until at least 2019.

    Who else is planning to go to the moon?

    Next year, the Indian government is planning to launch a mission, Chandrayaan-2, that includes an orbiter, a lander and a rover.

    7
    The Indian Space Research Organisation (ISRO) is set to make yet another breakthrough in its series of space missions with the launch of Chandrayaan-2, 10 years after its first lunar mission in November 2008.

    SpaceIL, an Israeli team that was a finalist in the Google Lunar X Prize, is also still aiming to send a robotic lander to the moon early next year, even though the $20 million prize has expired.

    NASA announced last week that nine companies will compete for robotic missions to carry science experiments to the moon. The space agency said the first of those could go as early as 2019, but most of the companies said they would not be ready until 2021.

    Jim Bridenstine, the NASA administrator, has praised the Chang’e-4 mission as exciting, and at the International Astronautical Congress in Bremen, Germany, in October, he talked of possible collaboration with the Chinese space agency. Federal laws, however, limit any NASA interaction with the Chinese.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 8:30 am on November 27, 2018 Permalink | Reply
    Tags: NYT, Yellowstone’s Eternal Scenes are Changing Before Our Eyes

    From The New York Times: “Your Children’s Yellowstone Will Be Radically Different” 

    New York Times

    From The New York Times

    NOV. 15, 2018
    Marguerite Holloway
    Photographs and time-lapse video by Josh Haner

    On a recent fall afternoon in the Lamar Valley, visitors watched a wolf pack lope along a thinly forested riverbank, ten or so black and gray figures shadowy against the snow. A little farther along the road, a herd of bison swung their great heads as they rooted for food in the sagebrush steppe, their deep rumbles clear in the quiet, cold air.

    In the United States, Yellowstone National Park is the only place where bison and wolves can be seen in great numbers. Because of the park, these animals survive. Yellowstone was crucial to bringing back bison, reintroducing gray wolves, and restoring trumpeter swans, elk and grizzly bears — all five species, driven toward extinction, found refuge here.

    1
    Bison in Yellowstone.

    But the Yellowstone of charismatic megafauna and of stunning geysers that four million visitors a year travel to see is changing before the eyes of those who know it best. Researchers who have spent years studying, managing, and exploring its roughly 3,400 square miles say that soon the landscape may look dramatically different.

    Over the next few decades of climate change, the country’s first national park will quite likely see increased fire, less forest, expanding grasslands, shallower, warmer waterways, and more invasive plants — all of which may alter how, and how many, animals move through the landscape. Ecosystems are always in flux, but climate change is transforming habitats so quickly that many plants and animals may not be able to adapt well or at all.

    Yellowstone National Park, established in 1872, is one of the Unesco World Heritage sites threatened by climate change. It is home to some of the country’s oldest weather stations, including one at Mammoth Hot Springs. Data from the park and surrounding area has helped scientists understand and track climate change in the Western United States.

    Since 1948, the average annual temperature in the Greater Yellowstone Ecosystem — an area of 34,375 square miles that includes the park, national forests, and Grand Teton National Park — has risen about 2 degrees Fahrenheit. Researchers report that winter is, on balance, 10 days shorter and less cold.
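    Averaged crudely over that record (and assuming, for illustration, a linear trend measured from the article’s 2018 vantage point), the quoted rise works out to a small but steady rate per decade.

```python
# Crude linear average of the quoted warming: about 2 degrees F since 1948.
temp_rise_f = 2.0
years = 2018 - 1948  # assumes the article's 2018 vantage point

rate_f_per_decade = temp_rise_f / years * 10
rate_c_per_decade = rate_f_per_decade * 5 / 9
print(f"about {rate_f_per_decade:.2f} F ({rate_c_per_decade:.2f} C) per decade, "
      "if the trend were linear")
```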

    2
    The Grand Prismatic Spring from above

    “For the Northern Rockies, snowpack has fallen to its lowest level in eight centuries,” said Patrick Gonzalez, a forest and climate change scientist at the University of California, Berkeley.

    Because snow is a cornerstone of the park’s ecology, the decline is alarming to some ecologists.

    3
    The Grand Prismatic Spring.

    Summers in the park have become warmer, drier and increasingly prone to fire. Even if rainfall increases in the future, it will evaporate more quickly, said Michael Tercek, an ecologist who has worked in Yellowstone for 28 years.

    “By the time my daughter is an old woman, the climate will be as different for her as the last ice age seems to us,” Dr. Tercek said.

    Yellowstone’s unusual landscape — of snow and steam, of cold streams and hot springs — is volcanic. Magma gives rise to boiling water and multihued thermophiles, bacteria that thrive at high temperatures.

    In 1883, The New York Times described the park as an “almost mystical wonderland.”

    For many visitors, Yellowstone represents American wilderness: a place with big, open skies where antelope and bison still roam.

    “You run into visitors and they thank you for the place,” said Ann Rodman, a park scientist. “They are seeing elk and antelope for the first time in their lives.”

    3
    An elk at Mammoth Hot Springs

    Ms. Rodman, who has been working in Yellowstone for 30 years, has pored over temperature and weather data. The trends surprised her, as did their urgency.

    “When I first started doing it, I really thought climate change was something that was going to happen to us in the future,” she said. “But it is one of those things where the more you study it, the more you realize how much is changing and how fast.”

    “Then you begin to go through this stage, I don’t know if it is like the stages of grief,” Ms. Rodman said. “All of a sudden it hits you that this is a really, really big deal and we aren’t really talking about it and we aren’t really thinking about it.”

    4
    Ann Rodman, a scientist at Yellowstone.

    Ms. Rodman has seen vast changes near the town of Gardiner, Mont., at the north entrance to Yellowstone. Some non-nutritious invasive plants like cheatgrass and desert madwort have replaced nutritious native plants. Those changes worry Ms. Rodman and others: Give invasives an inch and they take miles.

    Cheatgrass has already spread into the Lamar Valley. “This is what we don’t want — to turn into what it looks like in Gardiner,” Ms. Rodman said. “The seeds come in on people’s cars and on people’s boots.”

    5
    Pronghorn antelope, with cheatgrass in the background.

    Cheatgrass can thrive in disturbed soils and can ignite “like tissue paper,” she said. It takes hold after fires, preventing native plants from regrowing.

    If cheatgrass and its ilk spread, bison and elk could be affected. Cheatgrass, for instance, grows quickly in the spring. “It can suck the moisture out of the ground early,” Ms. Rodman said. “Then it is gone, so it doesn’t sustain animals throughout the summer the way native grasses would.”

    In recent years, elk have lost forage when drier, hotter summers have shortened what ecologists call the green wave, in which plants become green at different times at different elevations, said Andrew J. Hansen of Montana State University.

    Some elk now stay in valleys outside the park, nibbling lawns and alfalfa fields, Dr. Hansen said. And where they go, wolves follow. “It is a very interesting mix of land-use change and climate change, possibly leading to quite dramatic shifts in migration and to thousands of elk on private land,” he said.

    Drier summers also mean that fires are a greater threat. The conditions that gave rise to the fires of 1988, when a third of the park burned, could become common.

    By the end of the century, “the weather like the summer of ’88 will likely be there all the time rather than being the very rare exception,” said Monica G. Turner of the University of Wisconsin-Madison. “As the climate is warming, we are getting fires that are happening more often. We are starting to have the young forests burn again before they have had a chance to recover.”

    6
    Evergreen forest damaged by bark beetles.

    7
    Trees, and new growth, after a forest fire.

    In 2016, a wildfire swept through trees in a section near the Madison River that had burned in 1988. Because young trees don’t have many cones on them, Dr. Turner said, they don’t have as many seeds to release to form new forest. The cones they do have are close to the ground, which means they are less likely to survive the heat.

    Repeated fires could lead to more grassland. “The structure of the forests is going to change,” Dr. Turner said. “They might become sparse or not recover if we keep doing a double and triple whammy.”

    Forests shade waterways, and those too are experiencing climate-related changes. “We can very definitely see warming trends during the summer and fall,” said Daniel J. Isaak of the United States Forest Service. “Stream and river flows are declining as snowpack declines.” As fish become concentrated in smaller areas, Dr. Isaak said, disease can increase in a population because transmission is easier.

    8
    Sour Creek

    In 2016, the Yellowstone River — famous for its fly fishing and its cutthroat trout, which thrive in colder waters — was closed to anglers for 183 miles downstream from the park after an outbreak of kidney disease killed thousands of fish. “The feeling was that this was a canary in the coal mine,” said Dan Vermillion of Sweetwater Travel Company, a fly-fishing operation in Livingston, Mont.

    Lower flows and warmer water are one consequence of spring arriving earlier. Quickly melting snow unleashing torrents is another. Flooding has affected the nesting of water birds like common loons, American white pelicans, and double-crested cormorants. “All their nesting is on lakes and ponds, and water levels are fluctuating wildly, as it does with climate change,” said Douglas W. Smith, a park biologist.

    And Yellowstone’s trumpeter swans are declining. By the early 20th century, hunters had wiped out most of the enormous birds in the continental United States, killing them for food and fashionable feathers. But 70 or so swans remained in the Yellowstone region, some of them safe inside the park. Those birds helped restore trumpeters nationwide. Now only two trumpeter pairs live in the park, and they have not bred successfully for several years.

    Part of the reason, said Dr. Smith and a colleague, Lauren E. Walker, may be the loss of nests and nesting sites during spring floods. A pair on Swan Lake, just south of Mammoth Hot Springs, has spurned the floating nest that the Park Service installed to help the birds.

    “Heritage-wise this is a really important population,” Dr. Walker said. “If this is no longer a reliable spot, what does that mean for the places that may have more human disturbance?”

    On the shores of Yellowstone Lake, dozens of late-season visitors watched two grizzly bears eating a carcass, while a coyote and some ravens circled, just a hundred or so yards from the road. “If they run this way,” the ranger called out, “get in your cars.”

    Grizzlies are omnivores, eating whatever is available, including the fat- and protein-packed nuts of the whitebark pine. That pine is perhaps the species most visibly affected by climate change in Yellowstone and throughout the West. Warmer temperatures have allowed a native pest, the mountain pine beetle, to survive winter better, move into high elevations and reproduce over a longer season. In the last 30 years, an estimated 80 percent of the whitebark pines in the park have died from fire, beetles or fungal infection.

    For want of the whitebark pine, a great deal could be lost. The trees are a foundation species, meaning they play a central role in the structure of the ecosystem. They colonize exposed mountain sites, allowing other plants to get a root-hold. Their wide canopies protect snowpack from the sun. They are also a keystone species. They provide food for birds like the Clark’s nutcracker, which, in turn, create whitebark pine nurseries by caching nuts. And they are an important food source for squirrels, foxes, and grizzlies.

    When pine nuts are not plentiful, bears consume other foods, including the elk or deer innards left by hunters outside the park. And that can bring the Yellowstone-area grizzlies, relisted as threatened this September, into conflict with people.

    The loss of the pines “has far-reaching implications for the entire ecosystem,” said Jesse A. Logan, a retired Forest Service researcher.

    “The rest of the landscape, even in the mountainous West, has been so altered that Yellowstone becomes even more important,” Dr. Logan said.

    Yellowstone provides a refuge for people seeking and delighting in a sense of wilderness. It offers a landscape unlike any other: a largely intact ecosystem rich in wildlife and rich in geothermal features. Yellowstone’s unusual beauty was forged by volcanic heat; heat from humanity could be its undoing.

    8
    Map showing snow-telemetry (SNOTEL) weather stations in and near Yellowstone National Park. Left: background shows the average number of days per water year (October–September) with snow water equivalent (SWE) greater than 0 cm. Right: background shows the average annual peak (greatest) SWE, in centimeters. Data source: SNODAS. Both panels are averaged over water years ending 2005–2014, the length of record available for this data source. Gray areas are lakes.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     