Tagged: Scientific American

  • richardmitnick 10:43 am on July 7, 2019
    Tags: "Oregon Is About to Get a Lot More Hazardous", , , Landslides and debris flows, Scientific American, ,   

    From Scientific American: “Oregon Is About to Get a Lot More Hazardous” 

    Scientific American

    From Scientific American

    June 29, 2019
    Dana Hunter

    State leadership is failing its citizens—and there will be a body count.

    Credit: Dale Simonson (CC BY-SA 2.0)

    When you live in an area at as much geologic risk as Oregon, you would expect that government officials would maybe, possibly, take those risks seriously. But the people who currently govern Oregon seem quite determined to ignore hazards and let the state languish unprepared.

    It’s bad enough that legislators voted this month to allow “new schools, hospitals, jails, and police and fire stations” to be built in areas that will most certainly be inundated in the event of a tsunami. Both parties think it’s a good idea now; I doubt they’ll still be feeling great about locating schools right in the path of rampaging seawater when the big one hits. But short-term economic gain outweighs long-term planning, so here we are. What else can we expect from a statehouse full of lawmakers who would rather flee the state than be forced to deal with climate change?

    People say they’re willing to accept the risks. However, the state government is now planning to make it far harder for residents to even know what those risks are, because Oregon’s Department of Geology and Mineral Industries (DOGAMI for short) is severely underfunded and will now lose three critically-needed experts on staff as a punishment for going over budget. As if that weren’t bad enough, the governor’s office is considering whether the agency should even continue to exist:

    “In a note on the preliminary budget proposal for the agency, the Joint Ways and Means Committee said the Governor’s office would be “evaluating if the Department should continue to exist as an independent or recommendations to abolish and move the individual programs to other entities.”

    “That drastic of a move could come with big consequences,” Avy said.

    “It would be incredibly disruptive to staff and it is likely that some on-going studies would be discontinued,” he said. “Oregon would lose a valued agency and may lose talented staff in our Geological Survey and Services Program, which provides a focus on geologic and mineral mapping and natural hazard identification.”

    Can we be real for a minute, here? Oregon is a geologically young state in an active subduction zone, located on an ocean that has subduction zones on both sides, which generate ocean-spanning tsunamis on a regular basis. The local subduction zone, plus Basin and Range crustal stretching and faulting, also produces active volcanoes. Many, many volcanoes. And all of this folding and faulting and uplifting and volcanism leaves the state terribly landslide-prone. This is not a place where you can safely starve your local geological survey of funds, and then shut it down when it needs extra money to identify and quantify the hazards you face.

    So if you live in Oregon, or even if you just visit, I’d strongly encourage writing a polite but serious missive to Governor Kate Brown, letting her know that it would perhaps be a good idea to look further into the possible repercussions of signing that deplorable tsunami bill (I mean, at least take the schools out of the mix!), and to fully fund DOGAMI rather than further crippling it and then stripping it for parts.

    Let’s take a brief tour of the Oregon geohazards DOGAMI helps protect us from, shall we?

    Tsunamis

    The Oregon coast is extremely susceptible to tsunamis, generated both by Cascadia and by other subduction zones around the Pacific Ocean. You can see evidence of them everywhere.

    Cascadia subduction zone, the site of recurring megathrust earthquakes at average intervals of about 500 years, including the Cascadia earthquake of 1700.

    One of the starkest reminders in recent times was the dock that was ripped from the shoreline in Misawa, Japan, in the brutal 2011 Tōhoku Earthquake. The tsunami that sheared it loose and set it afloat also washed ashore in California and Oregon, causing millions of dollars in damage; loss of life in the United States was only avoided due to ample warnings.

    Ocean energy distribution forecast map for the 2011 Sendai earthquake from the U.S. NOAA. Note the location of Australia for scale.

    Just over a year later, the dock washed up on Agate Beach, Oregon.

    At Agate Beach, homes and businesses are built right in the path of the next Cascadia tsunami. I can’t describe to you the eerie sensation you feel turning away from that dock to see vulnerable structures that will be piles of flooded rubble after the next tsunami hits.

    Residences and businesses on Agate Beach. Even a modest tsunami will cause untold damage to these structures. Credit: Dana Hunter

    The people here will have minutes to find high ground after the shaking stops, if that long. There is some high ground nearby, but not much, and perhaps not near enough. Roads will probably be destroyed or blocked in the quake. This is the sort of location where the legislature has decided it would be fine to site schools.

    Earthquakes

    The stump of a drowned spruce at Sunset Bay, Shore Acres, OR. Lockwood DeWitt for scale. Credit: Dana Hunter

    Sunset Bay is the site of one of Oregon’s many ghost forests. Here, a Cascadia earthquake dropped the shoreline about 1,200 years ago, suddenly drowning huge, healthy trees in salt water. At least seven spectacular earthquakes have hit the Oregon coast in the past 3,500 years. It may not sound like much, or often… but look to Japan for the reason why we should take the threat extremely seriously. And Oregon doesn’t just have to worry about Cascadia quakes: the state is full of faults, stretching from north to south and from coast to interior.

    Volcanoes

    Huge swathes of Oregon are volcanic. As in, recently volcanic. As in, will definitely erupt again quite soon.

    Mount Hood, a sibling to Mount St. Helens, is right outside of Portland and last erupted in the mid-1800s. It is hazardous as heck.

    Mount Hood reflected in Trillium Lake, Oregon, United States

    But Hood is very, very far from the only young volcano in the state, and evidence of recent eruptions is everywhere. Belknap shield volcano and its associated volcanoes on McKenzie Pass ceased erupting only 1,500 years ago, and the forces that created it are still active today.

    Belknap Crater, Oregon. Cascades Volcano Observatory

    Another volcanic center like it could emerge in the near future. And you see here just a tiny swath of the destruction such a volcanic center causes.

    You know what you really don’t want to be caught unawares by? A volcano. And even once they’ve stopped erupting, the buggers can be dangerous. Sector collapses, lahars, and other woes plague old volcanoes. You need people who can keep a sharp eye on them. And I’m sorry, but the USGS can’t be everywhere at once. Local volcano monitoring is important!

    Landslides and debris flows

    If you’re an Oregon resident, you’ll probably remember how bloody long it took to finish the Eddyville Bypass due to the massive landslide that got reactivated during construction. Steep terrain plus plenty of rain equals lots of rock and soil going where we’d prefer it didn’t.

    Debris flows and landslides regularly take out Oregon roads, including this stretch on a drainage by Mount Hood.

    Construction equipment copes with damage caused by massive debris flows coming down from Mount Hood. Credit: Dana Hunter

    We know from the Oso mudslide just how deadly these mass movements can be. Having experts out there who understand how to map an area’s geology and identify problem spots is critically important, especially in places where a lot of people want to live, work, and play.

    If you don’t think a budget shortfall is worth letting it torpedo the agency that should be doing the most to identify these hazards and help us mitigate them, contact the governor’s office and let her know.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 3:05 pm on June 17, 2019
    Tags: Scientific American

    From Scientific American: “Which Should Come First in Physics: Theory or Experiment?” 

    Scientific American

    From Scientific American

    June 17, 2019
    Grigoris Panoutsopoulos
    Frank Zimmermann

    Plans for giant particle accelerators of the future focus attention on how scientific discoveries are really made.

    The discovery of the Higgs particle at the Large Hadron Collider (LHC) over half a decade ago marked a milestone in the long journey towards understanding the deeper structure of matter. Today, particle physics strives to push forward a diverse range of experimental approaches from which we may glean new answers to fundamental questions regarding the creation of the universe and the nature of the mysterious and elusive dark matter.

    Such an endeavor requires a post-LHC particle collider with an energy capability significantly greater than that of previous colliders. This is how the idea for the Future Circular Collider (FCC) at CERN came to be—a machine that could put the exploration of new physics in high gear.

    CERN FCC Future Circular Collider map

    To understand the validity of this proposal, we should, however, start at the beginning and once more ask ourselves: “How does physics progress?”

    Many believe that grand revolutions are driven exclusively by new theories, whereas experiments play the parts of movie extras. The played-out story goes a little something like this: theorists form conjectures, and experiments are used solely for the purposes of testing them. After all, most of us proclaim our admiration for Einstein’s relativity or for quantum mechanics, but seldom do we pause and consider whether these awe-inspiring theories could have been attained without the contributions of the Michelson-Morley, Stern-Gerlach or black-body–radiation experiments.

    This simplistic picture, despite being far removed from the creative, and often surprising, ways in which physics has developed over time, remains quite widespread even among scientists. Its pernicious influence can be seen in the discussion of future facilities like the proposed FCC at CERN.

    In the wake of the discovery of the Higgs boson in 2012, we finally have all of the pieces of the puzzle of the Standard Model (SM) of particle physics in place. Nevertheless, the unknowns regarding dark matter, neutrino masses and the observed imbalance between matter and antimatter are among numerous indications that the SM is not the ultimate theory of elementary particles and their interactions.

    Quite a number of theories have been developed to overcome the problems surrounding the SM, but so far none has been experimentally verified. This fact has left the world of physics brimming with anticipation. In the end, science has shown time and again that it can find new, creative ways to surmount any obstacles placed along its path. And one such way is for experiment to assume the leading role, so that it can help get the stuck wagon of particle physics moving and out of the mire.

    In this regard, the FCC study was launched by CERN in 2013 as a global effort to explore different scenarios for particle colliders that could inaugurate the post-LHC era and to advance key technologies. A staged approach, it entails the construction of an electron-positron collider followed by a proton collider, which would represent an eightfold energy leap compared to the LHC and thus grant us direct access to a previously unexplored regime. Both colliders would be housed in a new 100-kilometer-circumference tunnel. The FCC study complements previous design studies for linear colliders in Europe (CLIC) and Japan (ILC), while China also has similar plans for a large-scale circular collider (CEPC).
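
    For rough orientation on that eightfold figure (the energies below are widely quoted design targets, not numbers taken from this article): the LHC was designed for proton-proton collisions at 14 TeV and has run at 13 TeV, while the proposed FCC hadron collider targets collision energies of roughly 100 TeV, so

    \frac{E_{\mathrm{FCC}}}{E_{\mathrm{LHC}}} \approx \frac{100~\mathrm{TeV}}{13~\mathrm{TeV}} \approx 8,

    in line with the eightfold energy leap described above.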

    CERN/CLIC

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    China Circular Electron Positron Collider (CEPC) map

    Future colliders could offer a deep understanding of the Higgs properties, but even more importantly, they represent an opportunity for exploring uncharted territory in an unprecedented energy scale. As Gian Giudice, head of CERN’s Theoretical Physics Department, argues: “High-energy colliders remain an indispensable and irreplaceable tool to continue our exploration of the inner workings of the universe.”

    Nevertheless, the FCC is seen by some as a questionable scientific investment in the absence of clear theoretical guidance about where the elusive new physics may lie. The history of physics, however, offers evidence in support of a different view: that experiments often play a leading and exploratory role in the progress of science.

    As the eminent historian of physics Peter Galison puts it, we have to “step down from the aristocratic view of physics that treats the discipline as if all interesting questions are structured by high theory.” Besides, quite a few experiments have been realized without being guided by a well-established theory but were instead undertaken for the purposes of exploring new domains. Let us examine some illuminating examples.

    In the 16th century, King Frederick II of Denmark financed Uraniborg, an early research center, where Tycho Brahe constructed large astronomical instruments, like a huge mural quadrant (unfortunately, the telescope was invented a few years later) and carried out many detailed observations that had not previously been possible. The realization of an enormous experimental structure, at a hitherto unprecedented scale, transformed our view of the world. Tycho Brahe’s precise astronomical measurements enabled Johannes Kepler to develop his laws of planetary motion and to make a significant contribution to the scientific revolution.

    The development of electromagnetism serves as another apt example: many electrical phenomena were discovered by physicists, such as Charles Dufay, André-Marie Ampère and Michael Faraday, in the 18th and 19th centuries, through experiments that had not been guided by any developed theory of electricity.

    Moving closer to the present day, we see that the entire history of particle physics is indeed full of similar cases. In the aftermath of World War II, a constant and laborious experimental effort characterized the field of particle physics, and it was what allowed the Standard Model to emerge from a “zoo” of newly discovered particles. As a prominent example, quarks, the fundamental constituents of the proton and neutron, were discovered through a number of exploratory experiments during the late 1960s at the Stanford Linear Accelerator Center (SLAC).

    The majority of practicing physicists recognize the exceptional importance of experiment as an exploratory process. For instance, Victor “Viki” Weisskopf, the former director-general of CERN and an icon of modern physics, grasped clearly the dynamics of the experimental process in the context of particle physics:

    “There are three kinds of physicists, namely the machine builders, the experimental physicists, and the theoretical physicists. If we compare those three classes, we find that the machine builders are the most important ones, because if they were not there, we would not get into this small-scale region of space. If we compare this with the discovery of America, the machine builders correspond to captains and ship builders who truly developed the techniques at that time. The experimentalists were those fellows on the ships who sailed to the other side of the world and then jumped upon the new islands and wrote down what they saw. The theoretical physicists are those fellows who stayed behind in Madrid and told Columbus that he was going to land in India.” (Weisskopf 1977)

    Despite being a theoretical physicist himself, he was able to recognize the exploratory character of experimentation in particle physics. Thus, his words eerily foreshadow the present era. As one of the most respected theoretical physicists of our time, Nima Arkani-Hamed, claimed in a recent interview, “when theorists are more confused, it’s the time for more, not less experiments.”

    The FCC, at present, strives to keep alive the exploratory spirit of the previous fabled colliders. It is not intended to be used as a verification tool for a specific theory but as a means of paving multiple experimental paths for the future. The experimental process should be allowed to develop its own momentum. This does not mean that experimentation and instrumentation should not maintain a close relationship with the theoretical community; at the end of the day, there is but one physics, and it must ensure its unity.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 4:47 pm on April 30, 2019
    Tags: "Cosmology Has Some Big Problems", New measurements of the Hubble constant- the rate of universal expansion suggested major differences between two independent methods of calculation., Scientific American   

    From Scientific American: “Cosmology Has Some Big Problems” 

    Scientific American

    From Scientific American

    April 30, 2019
    Bjørn Ekeberg

    The field relies on a conceptual framework that has trouble accounting for new observations.

    Credit: Thanapol Sisrang, Getty Images

    What do we really know about our universe?

    Born out of a cosmic explosion 13.8 billion years ago, the universe rapidly inflated and then cooled; it is still expanding at an increasing rate and is mostly made up of unknown dark matter and dark energy … right?

    This well-known story is usually taken as a self-evident scientific fact, despite the relative lack of empirical evidence—and despite a steady crop of discrepancies arising with observations of the distant universe.

    In recent months, new measurements of the Hubble constant, the rate of universal expansion, suggested major differences between two independent methods of calculation. Discrepancies on the expansion rate have huge implications not simply for calculation but for the validity of cosmology’s current standard model at the extreme scales of the cosmos.
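
    To put rough numbers on that tension (these are standard published estimates from around this time, not figures given in the article): fits to the cosmic microwave background yield a Hubble constant of roughly 67 km/s/Mpc, while the local distance-ladder method yields roughly 73 to 74 km/s/Mpc, so the two methods disagree by about

    \frac{73 - 67}{67} \approx 9\%,

    which is far larger than the stated uncertainty of either measurement.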

    Another recent probe found galaxies inconsistent with the theory of dark matter, which posits this hypothetical substance to be everywhere. But according to the latest measurements, it is not, suggesting the theory needs to be reexamined.

    It’s perhaps worth stopping to ask why astrophysicists hypothesize dark matter to be everywhere in the universe. The answer lies in a peculiar feature of cosmological physics that is not often remarked upon. A crucial function of theories such as dark matter, dark energy and inflation, each of which is in its own way tied to the big bang paradigm, is not to describe known empirical phenomena but rather to maintain the mathematical coherence of the framework itself while accounting for discrepant observations. Fundamentally, they are names for something that must exist insofar as the framework is assumed to be universally valid.

    Each new discrepancy between observation and theory can of course in and of itself be considered an exciting promise of more research, a progressive refinement toward the truth. But when it adds up, it could also suggest a more confounding problem that is not resolved by tweaking parameters or adding new variables.

    Consider the context of the problem and its history. As a mathematically driven science, cosmological physics is usually thought to be extremely precise. But the cosmos is unlike any scientific subject matter on earth. A theory of the entire universe, based on our own tiny neighborhood as the only known sample of it, requires a lot of simplifying assumptions. When these assumptions are multiplied and stretched across vast distances, the potential for error increases, and this is further compounded by our very limited means of testing.

    Historically, Newton’s physical laws made up a theoretical framework that worked for our own solar system with remarkable precision. Neptune, for example, was discovered through predictions based on Newton’s model, from anomalies in the orbit of Uranus. But as the scales grew larger, its validity proved limited. Einstein’s general relativity framework extended that reach, with greater precision, far beyond our own galaxy. But just how far could it go?

    The big bang paradigm that emerged in the mid-20th century effectively stretches the model’s validity to a kind of infinity, defined either as the boundary of the radius of the universe (calculated at 46 billion light-years) or in terms of the beginning of time. This giant stretch is based on a few concrete discoveries, such as Edwin Hubble’s observation that the universe appears to be expanding (in 1929) and the detection of the microwave background radiation (in 1964).

    The 15 meter Holmdel horn antenna at Bell Telephone Laboratories in Holmdel, New Jersey was built in 1959 for pioneering work in communication satellites for the NASA ECHO I. The antenna was 50 feet in length and the entire structure weighed about 18 tons. It was composed of aluminum with a steel base. It was used to detect radio waves that bounced off Project ECHO balloon satellites. The horn was later modified to work with the Telstar Communication Satellite frequencies as a receiver for broadcast signals from the satellite. In 1964, radio astronomers Robert Wilson and Arno Penzias discovered the cosmic microwave background radiation with it, for which they were awarded the 1978 Nobel prize in physics. In 1990 the horn was dedicated to the National Park Service as a National Historic Landmark.

    But considering the scale involved, these limited observations have had an outsized influence on cosmological theory.

    Edwin Hubble at the 100-inch Hooker telescope at Mount Wilson in Southern California, where in 1929 he discovered that the universe is expanding.

    NASA Cosmic Background Explorer (COBE), 1989 to 1993.

    Cosmic microwave background, NASA/WMAP.

    NASA/WMAP, 2001 to 2010.

    Cosmic microwave background, ESA/Planck.

    ESA/Planck, 2009 to 2013.

    It is of course entirely plausible that the validity of general relativity breaks down much closer to our own home than at the edge of the hypothetical end of the universe. And if that were the case, today’s multilayered theoretical edifice of the big bang paradigm would turn out to be a confusing mix of fictional beasts invented to uphold the model along with empirically valid variables, mutually reliant on each other to the point of making it impossible to sort science from fiction.

    Compounding this problem, most observations of the universe occur experimentally and indirectly. Today’s space telescopes provide no direct view of anything—they produce measurements through an interplay of theoretical predictions and pliable parameters, in which the model is involved every step of the way. The framework literally frames the problem; it determines where and how to observe. And so, despite the advanced technologies and methods involved, the profound limitations to the endeavor also increase the risk of being led astray by the kind of assumptions that cannot be calculated.

    After spending many years researching the foundations of cosmological physics from a philosophy of science perspective, I have not been surprised to hear some scientists openly talking about a crisis in cosmology. In the big “inflation debate” in Scientific American a few years ago, a key piece of the big bang paradigm was criticized by one of the theory’s original proponents for having become indefensible as a scientific theory.

    Why? Because inflation theory relies on ad hoc contrivances to accommodate almost any data, and because its proposed physical field is not based on anything with empirical justification. This is probably because a crucial function of inflation is to bridge the transition from an unknowable big bang to a physics we can recognize today. So, is it science or a convenient invention?

    Inflation

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation

    HPHS Owls

    Lambda-Cold Dark Matter, accelerated expansion of the universe, Big Bang-Inflation (timeline of the universe), 2010. Credit: Alex Mittelmann, Coldcreation

    Alan Guth’s notes:

    A few astrophysicists, such as Michael J. Disney, have criticized the big bang paradigm for its lack of demonstrated certainties. In his analysis, the theoretical framework has far fewer certain observations than free parameters to tweak them—a so-called “negative significance” that would be an alarming sign for any science. As Disney writes in American Scientist: “A skeptic is entitled to feel that a negative significance, after so much time, effort and trimming, is nothing more than one would expect of a folktale constantly re-edited to fit inconvenient new observations.”

    As I discuss in my new book, Metaphysical Experiments, there is a deeper history behind the current problems. The big bang hypothesis itself originally emerged as an indirect consequence of general relativity undergoing remodeling. Einstein had made a fundamental assumption about the universe, that it was static in both space and time, and to make his equations add up, he added a “cosmological constant,” for which he freely admitted there was no physical justification.

    But when Hubble observed that the universe was expanding and Einstein’s solution no longer seemed to make sense, some mathematical physicists changed a fundamental assumption of the model: the universe would still be taken as the same in all spatial directions, but it would be allowed to vary in time. Not insignificantly, this theory came with a very promising upside: a possible merger between cosmology and nuclear physics. Could the brave new model of the atom also explain our universe?

    From the outset, the theory only spoke to the immediate aftermath of an explicitly hypothetical event, whose principal function was as a limit condition, the point at which the theory breaks down. Big bang theory says nothing about the big bang; it is rather a possible hypothetical premise for resolving general relativity.

    On top of this undemonstrable but very productive hypothesis, floor upon floor has been added intact, with vastly extended scales and new discrepancies. To explain observations of galaxies inconsistent with general relativity, the existence of dark matter was posited as an unknown and invisible form of matter calculated to make up more than a quarter of all mass-energy content in the universe—assuming, of course, the framework is universally valid.

    Fritz Zwicky discovered dark matter while observing the motions of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did most of the later work on dark matter.

    Fritz Zwicky. Credit: palomarskies.blogspot.com

    Astronomer Vera Rubin, who worked on dark matter, at the Lowell Observatory in 1965. Credit: The Carnegie Institution for Science

    Coma cluster via NASA/ESA Hubble

    In 1998, when a set of supernova measurements of accelerating galaxies seemed at odds with the framework, a new theory of a mysterious force called dark energy emerged, calculated to fill circa 70 percent of the mass-energy of the universe.

    [The Supernova Cosmology Project is one of two research teams that determined the likelihood of an accelerating universe and therefore a positive cosmological constant, using data from the redshift of Type Ia supernovae. The project was headed by Saul Perlmutter at Lawrence Berkeley National Laboratory, with members from Australia, Chile, France, Portugal, Spain, Sweden, the United Kingdom, and the United States.

    This discovery was named “Breakthrough of the Year for 1998” by Science magazine and, along with the High-z Supernova Search Team, the project team won the 2007 Gruber Prize in Cosmology and the 2015 Breakthrough Prize in Fundamental Physics. In 2011, Perlmutter was awarded the Nobel Prize in Physics for this work, alongside Adam Riess and Brian P. Schmidt from the High-z team.]

    The crux of today’s cosmological paradigm is that in order to maintain a mathematically unified theory valid for the entire universe, we must accept that 95 percent of our cosmos is furnished by completely unknown elements and forces for which we have no empirical evidence whatsoever. For a scientist to be confident of this picture requires an exceptional faith in the power of mathematical unification.

    In the end, the conundrum for cosmology is its reliance on the framework as a necessary presupposition for conducting research. For lack of a clear alternative, as astrophysicist Disney also notes, it is in a sense stuck with the paradigm. It seems more pragmatic to add new theoretical floors than to rethink the fundamentals.

    Contrary to the scientific ideal of getting progressively closer to the truth, it looks rather like cosmology, to borrow a term from technology studies, has become path-dependent: overdetermined by the implications of its past inventions.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 5:44 pm on April 17, 2019
    Tags: Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum, Scientific American

    From Scientific American: “Cosmologist Lee Smolin says that at certain key points, the scientific worldview is based on fallacious reasoning” 

    Scientific American

    From Scientific American

    April 17, 2019
    Jim Daley

    Lee Smolin, author of six books about the philosophical issues raised by contemporary physics, says every time he writes a new one, the experience completely changes the direction his own research is taking. In his latest book, Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum, Smolin, a cosmologist and quantum theorist at the Perimeter Institute for Theoretical Physics in Ontario, tackles what he sees as the limitations in quantum theory.

    Credit: Perimeter Institute

    “I want to say the scientific worldview is based on fallacious reasoning at certain key points,” Smolin says. In Einstein’s Unfinished Revolution, he argues one of those key points was the assumption that quantum physics is a complete theory. This incompleteness, Smolin argues, is the reason quantum physics has not been able to solve certain questions about the universe.

    “Most of what we do [in science] is take the laws that have been discovered by experiments to apply to parts of the universe, and just assume that they can be scaled up to apply to the whole universe,” Smolin says. “I’m going to be suggesting that’s wrong.”

    Join Smolin at the Perimeter Institute as he discusses his book and takes the audience on a journey through the basics of quantum physics and the experiments and scientists who have changed our understanding of the universe. The discussion, “Einstein’s Unfinished Revolution,” is part of Perimeter’s public lecture series and will take place on Wednesday, April 17, at 7 P.M. Eastern time. Online viewers can participate in the discussion by tweeting to @Perimeter using the #piLIVE hashtag.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 8:17 am on April 5, 2019
    Tags: In string theory a “solution” implies a vacuum of spacetime that is governed by Einstein’s theory of gravity coupled to a quantum field theory, In the past two decades a new branch of string theory called F-theory has allowed physicists to work with strongly interacting or strongly coupled strings, Scientific American, String theorists can use algebraic geometry to analyze the various ways of compactifying extra dimensions in F-theory and to find solutions

    From Scientific American: “Found: A Quadrillion Ways for String Theory to Make Our Universe” 

    Scientific American

    From Scientific American

    Mar 29, 2019
    Anil Ananthaswamy

    Stemming from the “F-theory” branch of string theory, each solution replicates key features of the standard model of particle physics.

    Photo: dianaarturovna/Getty Images

    Physicists who have been roaming the “landscape” of string theory — the space of zillions and zillions of mathematical solutions of the theory, where each solution provides the kinds of equations physicists need to describe reality — have stumbled upon a subset of such equations that have the same set of matter particles as exists in our universe.

    String theory depiction: cross section of the quintic Calabi–Yau manifold. Credit: Jbourjai (using Mathematica output)

    Standard Model of Supersymmetry via DESY

    But this is no small subset: there are at least a quadrillion such solutions, making it the largest such set ever found in string theory.

    According to string theory, all particles and fundamental forces arise from the vibrational states of tiny strings. For mathematical consistency, these strings vibrate in 10-dimensional spacetime. And for consistency with our familiar everyday experience of the universe, with three spatial dimensions and the dimension of time, the additional six dimensions are “compactified” so as to be undetectable.

    Different compactifications lead to different solutions. In string theory, a “solution” implies a vacuum of spacetime that is governed by Einstein’s theory of gravity coupled to a quantum field theory. Each solution describes a unique universe, with its own set of particles, fundamental forces and other such defining properties.

    Some string theorists have focused their efforts on trying to find ways to connect string theory to properties of our known, observable universe — particularly the standard model of particle physics, which describes all known particles and all their mutual forces except gravity.

    Much of this effort has involved a version of string theory in which the strings interact weakly. However, in the past two decades, a new branch of string theory called F-theory has allowed physicists to work with strongly interacting, or strongly coupled, strings.

    ____________________________________________________
    F-theory is a branch of string theory developed by Cumrun Vafa. The new vacua described by F-theory were discovered by Vafa and allowed string theorists to construct new realistic vacua — in the form of F-theory compactified on elliptically fibered Calabi–Yau four-folds. The letter “F” supposedly stands for “Father”.

    F-theory is formally a 12-dimensional theory, but the only way to obtain an acceptable background is to compactify this theory on a two-torus. By doing so, one obtains type IIB superstring theory in 10 dimensions. The SL(2,Z) S-duality symmetry of the resulting type IIB string theory is manifest because it arises as the group of large diffeomorphisms of the two-dimensional torus.

    More generally, one can compactify F-theory on an elliptically fibered manifold (elliptic fibration), i.e. a fiber bundle whose fiber is a two-dimensional torus (also called an elliptic curve). For example, a subclass of the K3 manifolds is elliptically fibered, and F-theory on a K3 manifold is dual to heterotic string theory on a two-torus. Also, the moduli spaces of those theories should be isomorphic.

    The large number of semirealistic solutions to string theory referred to as the string theory landscape, with roughly 10^272,000 elements, is dominated by F-theory compactifications on Calabi–Yau four-folds. There are about 10^15 of those solutions consistent with the Standard Model of particle physics.

    -Wikipedia

    ____________________________________________________

    “An intriguing, surprising result is that when the coupling is large, we can start describing the theory very geometrically,” says Mirjam Cvetic of the University of Pennsylvania in Philadelphia.

    This means that string theorists can use algebraic geometry — which uses algebraic techniques to tackle geometric problems — to analyze the various ways of compactifying extra dimensions in F-theory and to find solutions. Mathematicians have been independently studying some of the geometric forms that appear in F-theory. “They provide us physicists a vast toolkit”, says Ling Lin, also of the University of Pennsylvania. “The geometry is really the key… it is the ‘language’ that makes F-theory such a powerful framework.”

    Now, Cvetic, Lin, James Halverson of Northeastern University in Boston, and their colleagues have used such techniques to identify a class of solutions with string vibrational modes that lead to a similar spectrum of fermions (or, particles of matter) as is described by the standard model — including the property that all fermions come in three generations (for example, the electron, muon and tau are the three generations of one type of fermion).

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    The F-theory solutions found by Cvetic and colleagues have particles that also exhibit the handedness, or chirality, of the standard model particles. In particle physics lingo, the solutions reproduce the exact “chiral spectrum” of standard model particles. For example, the quarks and leptons in these solutions come in left and right-handed versions, as they do in our universe.

    The new work shows that there are at least a quadrillion solutions in which particles have the same chiral spectrum as the standard model, which is 10 orders of magnitude more solutions than had been found within string theory until now. “This is by far the largest domain of standard model solutions,” Cvetic says. “It’s somehow surprising and actually also rewarding that it turns out to be in the strongly coupled string theory regime, where geometry helped us.”

    A quadrillion — while it’s much, much smaller than the size of the landscape of solutions in F-theory (which at last count was shown to be of the order of 10^272,000) — is a tremendously large number. “And because it’s a tremendously large number, and it gets something nontrivial in real world particle physics correct, we should take it seriously and study it further,” Halverson says.

    Further study would involve uncovering stronger connections with the particle physics of the real world. The researchers still have to work out the couplings or interactions between particles in the F-theory solutions — which again depend on the geometric details of the compactifications of the extra dimensions.

    It could be that within the space of the quadrillion solutions, there are some with couplings that could cause the proton to decay within observable timescales. This would clearly be at odds with the real world, as experiments have yet to see any sign of protons decaying. Alternatively, physicists could search for solutions that realize the spectrum of standard model particles that preserve a mathematical symmetry called R-parity. “This symmetry forbids certain proton decay processes and would be very attractive from a particle physics point of view, but is missing in our current models,” Lin says.

    Also, the work assumes supersymmetry, which means that all the standard model particles have partner particles. String theory needs this symmetry in order to ensure the mathematical consistency of solutions.

    But in order for any supersymmetric theory to tally with the observable universe, the symmetry has to be broken (much like how a diner’s selection of cutlery and drinking glass on her left or right side will “break” the symmetry of the table setting at a round dinner table). Else, the partner particles would have the same mass as standard model particles — and that is clearly not the case, since we don’t observe any such partner particles in our experiments.

    Crucially, experiments at the Large Hadron Collider (LHC) have yet to find any supersymmetric particles, which implies that if supersymmetry is the correct description of nature, it must be broken at energy scales beyond those the LHC has probed so far.

    String theorists think that supersymmetry might be broken only at extremely high energies that are not within experimental reach anytime soon. “The expectation in string theory is that high-scale [supersymmetry] breaking, which is fully consistent with LHC data, is completely possible,” Halverson says. “It requires further analysis to determine whether or not it happens in our case.”

    Despite these caveats, other string theorists are approving of the new work. “This is definitely a step forward in demonstrating that string theory gives rise to many solutions with features of the standard model,” says string theorist Washington Taylor of MIT.

    “It’s very nice work,” says Cumrun Vafa, one of the developers of F-theory, at Harvard University. “The fact you can arrange the geometry and topology to fit with not only Einstein’s equations, but also with the [particle] spectrum that we want, is not trivial. It works out nicely here.”

    But Vafa and Taylor both caution that these solutions are far from matching perfectly with the standard model. Getting solutions to match exactly with the particle physics of our world is one of the ultimate goals of string theory. Vafa is among those who think that, despite the immensity of the landscape of solutions, there exists a unique solution that matches our universe. “I bet there is exactly one,” he says. But, “to pinpoint this is not going to be easy.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 1:28 pm on February 8, 2019
    Tags: ESA Galileo navigation system, ESA's answer to USA GPS, Scientific American

    From Scientific American: “Wayward Satellites Test Einstein’s Theory of General Relativity” 

    Scientific American

    From Scientific American

    February 8, 2019
    Megan Gannon

    The botched launch of two Galileo navigation probes made for an unexpected experiment.

    Galileo satellite. Credit: P. Carril and ESA

    ESA Galileo’s navigation constellation

    In August 2014 a rocket launched the fifth and sixth satellites of the Galileo global navigation system, the European Union’s $11-billion answer to the U.S.’s GPS. But celebration turned to disappointment when it became clear that the satellites had been dropped off at the wrong cosmic “bus stops.” Instead of being placed in circular orbits at stable altitudes, they were stranded in elliptical orbits useless for navigation.

    The mishap, however, offered a rare opportunity for a fundamental physics experiment. Two independent research teams—one led by Pacôme Delva of the Paris Observatory in France, the other by Sven Herrmann of the University of Bremen in Germany—monitored the wayward satellites to look for holes in Einstein’s general theory of relativity.

    “General relativity continues to be the most accurate description of gravity, and so far it has withstood a huge number of experimental and observational tests,” says Eric Poisson, a physicist at the University of Guelph in Ontario, who was not involved in the new research. Nevertheless, physicists have not been able to merge general relativity with the laws of quantum mechanics, which explain the behavior of energy and matter at a very small scale. “That’s one reason to suspect that gravity is not what Einstein gave us,” Poisson says. “It’s probably a good approximation, but there’s more to the story.”

    Einstein’s theory predicts time will pass more slowly close to a massive object, which means that a clock on Earth’s surface should tick at a more sluggish rate relative to one on a satellite in orbit. This time dilation is known as gravitational redshift. Any subtle deviation from this pattern might give physicists clues for a new theory that unifies gravity and quantum physics.

    Even after the Galileo satellites were nudged closer to circular orbits, they were still climbing and falling about 8,500 kilometers twice a day. Over the course of three years Delva’s and Herrmann’s teams watched how the resulting shifts in gravity altered the frequency of the satellites’ superaccurate atomic clocks. In a previous gravitational redshift test, conducted in 1976 when the Gravity Probe A suborbital rocket carried an atomic clock into space, researchers verified general relativity’s prediction of the clock’s frequency shift to within an uncertainty of 1.4 × 10^-4.

    The new studies, published last December in Physical Review Letters, again verified Einstein’s prediction—and increased that precision by a factor of 5.6. So, for now, the century-old theory still reigns.
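
    As a rough sketch of the effect being measured (this is the standard weak-field gravitational redshift relation; the orbital radii are left symbolic rather than taken from the article): for a clock on a satellite in an eccentric orbit around Earth, the gravitational contribution to the fractional frequency shift between perigee r_min and apogee r_max is approximately

    \frac{\Delta f}{f} \approx \frac{G M_\oplus}{c^{2}} \left( \frac{1}{r_{\min}} - \frac{1}{r_{\max}} \right),

    so the clock rate is modulated once per orbit as the satellite climbs and falls, which is the periodic signature the two teams tracked. Dividing the 1976 uncertainty by the quoted improvement factor gives the new level of precision: 1.4 × 10^-4 / 5.6 ≈ 2.5 × 10^-5.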

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 10:49 am on January 30, 2019
    Tags: Finding Alien Life May Require Giant Telescopes Built in Orbit, How big such a telescope must be to offer a reasonable chance of success in that interstellar quest depends on life’s still-unknown cosmic prevalence, iSAT ("in-Space Assembled Telescope") study, NASA’s still-in-development Space Launch System (SLS), Scientific American, The forces demanding supersize space telescopes are straightforward: The larger a scope’s light-collecting mirror is the deeper and finer its cosmic gaze, Two of NASA’s pinnacle projects—the International Space Station (ISS) and the Hubble Space Telescope—owe their existence to orbital construction work

    From Scientific American: “Finding Alien Life May Require Giant Telescopes Built in Orbit” 

    Scientific American

    From Scientific American

    December 12, 2018 [Just presented in social media.]
    Lee Billings

    Scientific American reports on new efforts from NASA and other federal agencies seeking to service and assemble large structures—such as life-finding telescopes—in space.

    Astronauts repair and upgrade the Hubble Space Telescope during the first servicing mission to that orbital observatory, in 1993. NASA is now studying how telescopes far larger than Hubble might someday be assembled and serviced in space by astronauts or robots. Credit: NASA.

    After snapping the final piece into place with a satisfying “click” she feels through her spacesuit gloves, the astronaut pauses to appreciate the view. Her reflection swims before her in a silvery disk the size of three tennis courts; for a moment she feels like a bug floating on a darkened pond. Composed of hundreds of interlocking metallic hexagons like the one she has just installed, the disk is a colossal mirror 30 meters wide, the starlight-gathering eye of the largest space telescope ever built. From her perch on the robotic arm of a small space station, Earth is a tiny blue and white orb she could cover with an outstretched thumb, dwarfed by the bright and silent moon spinning thousands of kilometers below her feet.

    Although this scene remains the stuff of science fiction, an ad hoc assemblage of scientists, engineers and technocrats now say it is well on its way to becoming reality. Under the auspices of a modest NASA-sponsored initiative, this diverse group is gauging how the space agency might build bigger, better space telescopes than previously thought possible—by constructing and servicing them in space. The effort, formally known as the “in-Space Assembled Telescope” study (iSAT), is part of a long trend in which science advances by piggybacking on technologies created for more practical concerns.

    For example, the development of surveillance satellites and warhead-carrying rockets during the 20th-century cold war also catalyzed the creation of robotic interplanetary probes and even NASA’s crewed Apollo lunar missions. Similarly, in the 21st century a soaring military and industrial demand for building and servicing satellites in orbit could lead to dramatically enhanced space telescopes capable of definitively answering some of science’s biggest questions—such as whether or not we are alone. “The iSAT is a program that can be NASA’s next Apollo,” says study member Matt Greenhouse, an astrophysicist at the space agency’s Goddard Space Flight Center. “And the science enabled by the iSAT would likely include discovery of extraterrestrial life—an achievement that would eclipse Apollo in terms of impact on humanity.”


    NASA Goddard Space Flight Center campus


    Ready for Prime Time

    In some respects, building and repairing spacecraft in space is a revolution that has already arrived, merely kept under the radar by a near-flawless track record that makes it seem deceptively routine. Two of NASA’s pinnacle projects—the International Space Station (ISS) and the Hubble Space Telescope—owe their existence to orbital construction work.

    ISS

    NASA/ESA Hubble Telescope

    Assembled and resupplied in orbit over two decades, the ISS is now roughly as big as a football field and has more living space than a standard six-bedroom house. And only space-based repairs allowed Hubble to become the world’s most iconic and successful telescope, after a space shuttle crew on a first-of-its-kind servicing mission in 1993 fixed a crippling defect in the observatory’s primary mirror.

    NASA COSTAR

    NASA COSTAR installation

    Astronauts have since conducted four more Hubble servicing missions, replacing equipment and upgrading instruments to leave behind an observatory reborn.

    COSTAR was removed from HST in 2009 during the fifth servicing mission and replaced by the Cosmic Origins Spectrograph. It is now on exhibit in the Smithsonian’s National Air and Space Museum.

    NASA Hubble Cosmic Origins Spectrograph

    An artist’s rendition of the upcoming Dragonfly mission, a collaboration between NASA and Space Systems Loral to demonstrate technologies required for orbital construction. Dragonfly’s robotic arm (inset) will assemble and deploy reflectors to create a large radio antenna when the mission launches sometime in the 2020s. Credit: NASA and SSL.

    Today multiple projects are carrying the momentum forward from those pioneering efforts, cultivating powerful new capabilities. Already NASA and the Pentagon’s Defense Advanced Research Projects Agency (DARPA) as well as private-sector companies such as Northrop Grumman and Space Systems Loral (SSL) are building robotic spacecraft for launch in the next few years on lengthy missions to refuel, repair, re-position and upgrade governmental and commercial satellites. Those spacecraft—or at least the technologies they demonstrate—could also be used to assemble telescopes and other large structures in space such as those associated with NASA’s perennial planning for human missions to the moon and Mars. Last year—under the auspices of a “partnership forum” between NASA, the U.S. Air Force and National Reconnaissance Office—the space agency took the lead on crafting a national strategy for further public and private development of in-space assembly in the 2020s and beyond.

    These trends could end what some experts see as a “dark age” in space science and exploration. “Imagine a world where once your car runs low on fuel, instead of driving to the gas station you take it to the junkyard and abandon it. Imagine a world where once you’ve moved into your house for the first time you have no way of ever getting more groceries inside, having a plumber come to fix a leaky pipe or any way to bring in and install a new TV. Imagine a world where we all live in tents that we can carry on our backs and no one thinks to build anything larger or more permanent. That seems crazy, doesn’t it?” says iSAT study member Joe Parrish, a program manager for DARPA’s Tactical Technology Office who helms its Robotic Servicing of Geosynchronous Satellites (RSGS) mission. “But that’s exactly the world we live in right now with our $1-billion–class assets in space. … I think we will look back on the era before on-orbit servicing and assembly the way we now look back on the era when leeches were used to treat diseases.”

    Bigger Is Better

    The fundamental reality behind the push for in-space assembly is easy to understand: Anything going to space must fit within the rocket taking it there. Even the very biggest—the mammoth 10-meter rocket fairing of NASA’s still-in-development Space Launch System (SLS)—would be unable to hold something like the ISS or even the space agency’s smaller “Gateway,” a moon-orbiting space station proposed for the 2020s.

    NASA Space Launch System depiction

    Launching such megaprojects piece by piece, for orbital assembly by astronauts or robots, is literally the only way to get them off the ground. And coincidentally, even though massive “heavy lift” rockets such as the SLS remain ruinously expensive, the midsize rockets that could support orbital assembly with multiple launches are getting cheaper all the time.

    The forces demanding supersize space telescopes are straightforward, too: The larger a scope’s light-collecting mirror is, the deeper and finer its cosmic gaze. Simply put, bigger is better when it comes to telescopes—especially ones with transformative objectives such as tracking the coalescence of galaxies, stars and planets throughout the universe’s 13.8-billion-year history, learning the nature of dark matter and dark energy, and seeking out signs of life on habitable worlds orbiting other stars. Most of today’s designs for space telescopes pursuing such alluring quarry cap out with mirrors as wide as 15 meters—but only because that is the approximate limit of what could be folded to fit within a heavy-lift rocket like the SLS.
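
    The “bigger is better” scaling can be made concrete with the standard diffraction limit (the numbers below are a generic illustration, not values from the article): a telescope’s sharpest achievable angular resolution is roughly

    \theta \approx 1.22\,\frac{\lambda}{D},

    so at a visible wavelength of about 550 nanometers a 15-meter mirror can separate features roughly 9 milliarcseconds apart, while a 30-meter mirror reaches roughly 4.6 milliarcseconds; doubling the diameter also quadruples the light-gathering area, which scales as D^2.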

    Astronomers have long fantasized about building space observatories even bigger, with mirrors 30 meters wide or more—rivaling the sizes of ground-based telescopes already under construction for the 2020s. Assembled far above our planet’s starlight-scattering atmosphere, these behemoths could perform feats the likes of which ground-based observers can only dream, such as taking pictures of potentially Earth-like worlds around a huge sample of other stars to determine whether those worlds are actually habitable—or even inhabited. If our own Earth is any example to go by, life is a planetary phenomenon that can transform the atmosphere and surface of its home world in clearly recognizable ways; provided, that is, one has a telescope big enough to see such details across interstellar distances.

    A recent “Exoplanet Science Strategy” report from the National Academies of Sciences, Engineering and Medicine said NASA should take the lead on a major new space telescope that begins to approach that grand vision—something capable of surveying hundreds (or at least dozens) of nearby stars for snapshots of potential exo-Earths. That recommendation (itself an echo from several previous prestigious studies) is reinforced by the core conclusion of another new Academies report which calls for the agency to make the search for alien life a more fundamental part of its future space exploration activities. These reports build on the growing consensus that our galaxy likely holds billions of potentially habitable worlds, courtesy of statistics from NASA’s recently deceased Kepler space telescope and the space agency’s newly launched Transiting Exoplanet Survey Satellite.

    NASA/Kepler Telescope

    NASA/MIT TESS

    Whether viewed through the lens of scientific progress, technological capability or public interest, the case for building a life-finding space telescope is stronger than ever before—and steadily strengthening. Sooner or later it seems NASA will find itself tasked with making this longed-for giant leap in the search for life among the stars.

    How big such a telescope must be to offer a reasonable chance of success in that interstellar quest depends on life’s still-unknown cosmic prevalence. With a bit of luck, one with a four-meter mirror might suffice to hit the jackpot, locating an inhabited exo-Earth around one of our sun’s nearest neighboring stars. But if the cosmos is less kind and the closest life-bearing worlds are much farther away, something in excess of the 15-meter limit imposed by near-future rockets could be necessary to sniff out any living planets within our solar system’s corner of the galaxy. In short, in-space assembly may offer the only viable path to completing the millennia-long effort to end humanity’s cosmic loneliness.
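
    A rough worked example shows where those mirror sizes come from (the 550-nanometer wavelength and the factor-of-three inner working angle below are generic illustrative assumptions, not figures from the article). An Earth–Sun analog separated by one astronomical unit subtends an angle of about 1/d arcseconds when its star lies d parsecs away, while a starlight-suppressing telescope of diameter D typically cannot probe closer to the star than a few diffraction widths:

    \[ \theta_{\text{planet}} \approx \frac{1''}{d\,[\mathrm{pc}]}, \qquad \theta_{\mathrm{IWA}} \sim 3\,\frac{\lambda}{D} \]

    At λ ≈ 550 nanometers, a 4-meter mirror gives 3λ/D ≈ 0.09 arcsecond, enough to reach an Earth twin around stars out to roughly 10 parsecs; pushing the same search several times deeper into the galaxy drives D toward 15 meters and beyond, matching the trade-off described above.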

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 11:37 am on December 10, 2018 Permalink | Reply
    Tags: Advances like those made by Hubble are possible only through sustained publicly-funded research, Arthur “Art” Code, Lyman Spitzer, OAO-2, Scientific American, Space Astronomy Laboratory at UW–Madison

    From Scientific American: “The World’s First Space Telescope” 

    Scientific American

    From Scientific American

    December 7, 2018
    James Lattis

    50 years ago, astronomers launched the Orbiting Astronomical Observatory, whose descendants include the Hubble, Spitzer and James Webb telescopes.

    In July 1958, an astronomer at the University of Wisconsin–Madison named Arthur “Art” Code received a telegram from the fledgling Space Science Board of the National Academy of Sciences. The board wanted to know what he and his colleagues would do if given the opportunity to launch into Earth’s orbit an instrument weighing up to 100 pounds.

    Code, newly-minted director of the University’s Washburn Observatory, had something in mind. His department was already well known for pioneering a technique for measuring the light emitted by celestial objects, called photoelectric photometry, and Code had joined the university with the intent of adapting it to the burgeoning field of space astronomy.

    He founded the Space Astronomy Laboratory at UW–Madison and, with his colleagues, proposed to launch a small telescope equipped with a photoelectric photometer, designed to measure the ultraviolet (UV) energy output of stars—a task impossible from Earth’s surface. Fifty years ago, on December 7, 1968, that idea culminated in NASA’s launch of the first successful space-based observatory: the Orbiting Astronomical Observatory, or OAO-2.

    NASA U Wisconsin Orbiting Astronomical Observatory OAO-2

    With it was born the era of America’s Great Observatories, bearing the Hubble, Spitzer, Chandra and Compton space telescopes, a time during which our understanding of the universe repeatedly deepened and transformed.

    NASA/ESA Hubble Telescope

    NASA/Spitzer Infrared Telescope

    NASA/Chandra X-ray Telescope

    NASA Compton Gamma Ray Observatory

    Today, dwindling political appetite and lean funding threaten our progress. Contemporary projects like the James Webb Space Telescope flounder, and federal budgets omit promising projects like the Wide Field Infrared Survey Telescope (WFIRST).

    NASA/ESA/CSA Webb Telescope annotated

    NASA WFIRST

    In celebrating the half century since OAO-2’s launch, we are reminded that major scientific achievements like it become part of the public trust, and to make good on the public trust, we must repay our debt to history by investing in our future. Advances like those made by Hubble are possible only through sustained, publicly-funded research.

    These first investments originated in the late 1950s, during the space race between the U.S. and the USSR. They led to economic gains in the private sector, technological and scientific innovations, and the birth of new fields of exploration.

    Astronomer Lyman Spitzer, considered the father of the Hubble Space Telescope, first seriously posited the idea of space-based observing in a 1946 RAND Corporation study. By leaving Earth’s atmosphere, he argued, astronomers could point telescopes at and follow nearly anything in the sky, from comets to galaxy clusters, and measure light in a broader range of the electromagnetic spectrum.

    When Code pitched Wisconsin’s idea to the Space Board, the result was NASA funding to create part of the scientific payload for OAO. The agency went to work planning a spacecraft that could support these astronomical instruments. The Cook Electric Company in Chicago and Grumman Aircraft Engineering Corporation in New York won contracts to help pull it off.

    The payload, named the Wisconsin Experiment Package (WEP), bundled five telescopes equipped with photoelectric photometers and two scanning spectrophotometers, all with UV capabilities. The Massachusetts Institute of Technology created a package of X-ray and gamma-ray detectors.

    Scientists and engineers had to make the instruments on OAO both programmable and capable of operating autonomously between ground contacts. Because repairs were impossible once in orbit, they designed redundant systems and operating modes. Scientists also had to devise systems for handling complex observations, transmitting data to Earth digitally (still a novelty in those days), and processing the data before they landed in the hands of astronomers.

    The first effort, OAO-1, suffered a fatal power failure after launch in 1966, and the scientific instruments were never turned on. But NASA reinvested, and OAO-2 launched with a new WEP from Wisconsin, this time joined by a complementary instrument from the Smithsonian Astrophysical Observatory, called Celescope, which used television camera technology to produce images of celestial objects emitting UV light. Expected to operate just one year, OAO-2 continued to make observations for four years.

    Numerous “guest” astronomers received access to the instruments during the extended mission. Such collaborations ultimately led to the creation of the Space Telescope Science Institute, which Code helped organize as acting director in 1981.

    And the data yielded many scientific firsts, including a modern understanding of stellar physics, surprise insights into stellar explosions called novae, and exploration of a comet that had far-reaching implications for theories of planet formation and evolution.

    To be responsible beneficiaries of such insights, we must remember that just as we are yesterday’s future, the firsts of tomorrow depend on today. We honor that public trust only by continuing to fund James Webb, WFIRST, and other projects not yet conceived.

    In the foreword of a 1971 volume publishing OAO-2’s scientific results, NASA’s Chief of Astronomy Nancy G. Roman wrote: “The performance of this satellite has completely vindicated the early planners and has rewarded … the entire astronomical community with many exciting new discoveries and much important data to aid in the unravelling of the secrets of the stars.”

    Let’s keep unraveling these stellar secrets.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 10:52 am on October 15, 2018 Permalink | Reply
    Tags: NASA Viking 2 Lander, Scientific American, Search for Alien Life Should Be a Fundamental Part of NASA New Report Urges, The Viking missions to Mars were the last time the space agency performed a direct explicit search for life on another world

    From Scientific American: “Search for Alien Life Should Be a Fundamental Part of NASA, New Report Urges” 

    Scientific American

    From Scientific American

    October 15, 2018
    Adam Mann

    An image taken by the Viking 2 lander from Utopia Planitia on the surface of Mars in 1976. The Viking missions to Mars were the last time the space agency performed a direct, explicit search for life on another world. Credit: NASA

    NASA Viking 2 Lander

    For decades many researchers have tended to view astrobiology as the underdog of space science. The field—which focuses on the investigation of life beyond Earth—has often been criticized as more philosophical than scientific, because it lacks tangible samples to study.

    Now that is all changing. Whereas astronomers once knew of no planets outside our solar system, today they have thousands of examples. And although organisms were previously thought to need the relatively mild surface conditions of our world to survive, new findings about life’s ability to persist in the face of extreme darkness, heat, salinity and cold have expanded researchers’ acceptance that it might be found anywhere from Martian deserts to the ice-covered oceans of Saturn’s moon Enceladus.

    Highlighting astrobiology’s increasing maturity and clout, a new Congressionally mandated report from the National Academy of Sciences (NAS) [National Academies Press] urges NASA to make the search for life on other worlds an integral, central part of its exploration efforts. The field is now well set to be a major motivator for the agency’s future portfolio of missions, which could one day let humanity know whether or not we are alone in the universe. “The opportunity to really address this question is at a critically important juncture,” says Barbara Sherwood Lollar, a geologist at the University of Toronto and chair of the committee that wrote the report.

    The astronomy and planetary science communities are currently gearing up to each perform their decadal surveys—once-every-10-year efforts that identify a field’s most significant open questions—and present a wish list of projects to help answer them. Congress and government agencies such as NASA look to the decadal surveys to plan research strategies; the decadals, in turn, look to documents such as the new NAS report for authoritative recommendations on which to base their findings. Astrobiology’s reception of such full-throated encouragement now may boost its odds of becoming a decadal priority.

    Another NAS study released last month could be considered a second vote in astrobiology’s favor. This “Exoplanet Science Strategy” report recommended NASA lead the effort on a new space telescope that could directly gather light from Earth-like planets around other stars. Two concepts, the Large Ultraviolet/Optical/Infrared (LUVOIR) telescope and the Habitable Exoplanet Observatory (HabEx), are current contenders for a multibillion-dollar NASA flagship mission that would fly as early as the 2030s.

    NASA Large UV Optical Infrared Surveyor (LUVOIR)

    NASA Habitable Exoplanet Imaging Mission (HabEx) The Planet Hunter

    Either observatory could use a coronagraph or a “starshade”—devices that selectively block starlight but allow planetary light through—to search for signs of habitability and of life in distant atmospheres.

    NASA JPL Starshade

    NASA/WFIRST


    JPL-Caltech is developing coronagraph technology to enable direct imaging and spectroscopy of exoplanets using the Astrophysics Focused Telescope Assets (AFTA) on the NASA Wide-Field Infrared Survey Telescope (WFIRST).
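
    A back-of-the-envelope contrast estimate shows why such starlight-blocking hardware is essential (the 0.3 albedo below is an illustrative assumption, not a number from the report). An Earth-size planet of radius R_p and geometric albedo A_g orbiting at distance a reflects only a tiny fraction of its star's light toward us:

    \[ \frac{F_{\text{planet}}}{F_{\text{star}}} \sim A_g \left(\frac{R_p}{a}\right)^{2} \approx 0.3 \times \left(\frac{6.4\times10^{3}\,\mathrm{km}}{1.5\times10^{8}\,\mathrm{km}}\right)^{2} \approx 5\times10^{-10} \]

    The starlight must therefore be suppressed by roughly ten orders of magnitude before the planet's feeble reflected glow, and any biosignature gases imprinted on it, can be measured.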

    But either would need massive and sustained support from outside astrobiology to succeed in the decadal process and beyond.

    There have been previous efforts to back large, astrobiologically focused missions such as NASA’s Terrestrial Planet Finder concepts—ambitious space telescope proposals in the mid-2000s that would have spotted Earth-size exoplanets and characterized their atmospheres (if these projects had ever made it off the drawing board). Instead, they suffered ignominious cancellations that taught astrobiologists several hard lessons. There was still too little information at the time about the number of planets around other stars, says Caleb Scharf, an astrobiologist at Columbia University, meaning advocates could not properly estimate such a mission’s odds of success. His community had yet to realize that in order to do large projects it needed to band together and show how its goals aligned with those of astronomers less professionally interested in finding alien life, he adds. “If we want big toys,” he says, “we need to play better with others.”

    There has also been tension in the past between the astrobiological goals of solar system exploration and the more geophysics-steeped goals that traditionally underpin such efforts, says Jonathan Lunine, a planetary scientist at Cornell University. Missions to other planets or moons have limited capacity for instruments, and those specialized for different tasks often end up in ferocious competitions for a slot onboard. Historically, because the search for life was so open-ended and difficult to define, associated instrumentation lost out to hardware with clearer, more constrained geophysical research priorities. Now, Lunine says, a growing understanding of all the ways biological and geologic evolution are interlinked is helping to show that such objectives do not have to be at odds. “I hope that astrobiology will be embedded as a part of the overall scientific exploration of the solar system,” he says. “Not as an add-on, but as one of the essential disciplines.”

    Above and beyond the recent NAS reports, NASA is arguably already demonstrating more interest in looking for life in our cosmic backyard than it has for decades. This year the agency released a request for experiments that could be carried to another world in our solar system to directly hunt for evidence of living organisms—the first such solicitation since the 1976 Viking missions that looked for life on Mars. “The Ladder of Life Detection,” a paper written by NASA scientists and published in Astrobiology in June, outlined ways to clearly determine if a sample contains extraterrestrial creatures—a goal mentioned in the NAS report. The document also suggests NASA partner with other agencies and organizations working on astrobiological projects, as the space agency did last month when it hosted a workshop with the nonprofit SETI Institute on the search for “techno-signatures,” potential indicators of intelligent aliens.



    “I think astrobiology has gone from being something that seemed fringy or distracting to something that seems to be embraced at NASA as a major touchstone for why we’re doing space exploration and why the public cares,” says Ariel Anbar, a geochemist at Arizona State University in Tempe.

    All of this means astrobiology’s growing influence is helping to bring what were once considered outlandish ideas into reality. Anbar recalls attending a conference in the early 1990s, when then–NASA Administrator Dan Goldin displayed an Apollo-era image of Earth from space and suggested the agency try to do the same thing for a planet around another star.

    “That was pretty out there 25 years ago,” he says. “Now it’s not out there at all.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 11:11 am on August 17, 2018 Permalink | Reply
    Tags: Is Gravity Quantum?, Scientific American

    From Scientific American: “Is Gravity Quantum?” 

    Scientific American

    From Scientific American

    August 14, 2018
    Charles Q. Choi

    Artist’s rendition of gravitational waves generated by merging neutron stars. The primordial universe is another source of gravitational waves, which, if detected, could help physicists devise a quantum theory of gravity. Credit: R. Hurt, Caltech-JPL

    All the fundamental forces of the universe are known to follow the laws of quantum mechanics, save one: gravity. Finding a way to fit gravity into quantum mechanics would bring scientists a giant leap closer to a “theory of everything” that could entirely explain the workings of the cosmos from first principles. A crucial first step in this quest to know whether gravity is quantum is to detect the long-postulated elementary particle of gravity, the graviton. In search of the graviton, physicists are now turning to experiments involving microscopic superconductors, free-falling crystals and the afterglow of the big bang.

    Quantum mechanics suggests everything is made of quanta, or packets of energy, that can behave like both a particle and a wave—for instance, quanta of light are called photons. Detecting gravitons, the hypothetical quanta of gravity, would prove gravity is quantum. The problem is that gravity is extraordinarily weak. To directly observe the minuscule effects a graviton would have on matter, physicist Freeman Dyson famously noted, a graviton detector would have to be so massive that it collapses on itself to form a black hole.

    “One of the issues with theories of quantum gravity is that their predictions are usually nearly impossible to experimentally test,” says quantum physicist Richard Norte of Delft University of Technology in the Netherlands. “This is the main reason why there exist so many competing theories and why we haven’t been successful in understanding how it actually works.”

    In 2015 [Physical Review Letters], however, theoretical physicist James Quach, now at the University of Adelaide in Australia, suggested a way to detect gravitons by taking advantage of their quantum nature. Quantum mechanics suggests the universe is inherently fuzzy—for instance, one can never absolutely know a particle’s position and momentum at the same time. One consequence of this uncertainty is that a vacuum is never completely empty, but instead buzzes with a “quantum foam” of so-called virtual particles that constantly pop in and out of existence. These ghostly entities may be any kind of quanta, including gravitons.

    Decades ago, scientists found that virtual particles can generate detectable forces. For example, the Casimir effect is the attraction or repulsion seen between two mirrors placed close together in vacuum. These reflective surfaces move due to the force generated by virtual photons winking in and out of existence. Previous research suggested that superconductors might reflect gravitons more strongly than normal matter, so Quach calculated that looking for interactions between two thin superconducting sheets in vacuum could reveal a gravitational Casimir effect. The resulting force could be roughly 10 times stronger than that expected from the standard virtual-photon-based Casimir effect.
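
    For a sense of scale, the textbook result for two ideal parallel mirrors of area A separated by a distance d in vacuum is an attractive Casimir force

    \[ F = \frac{\pi^{2}\hbar c}{240\, d^{4}}\, A \]

    which works out to a pressure of roughly a millipascal for plates one micrometer apart. Quach's proposal, as described above, amounts to asking whether superconducting plates feel an additional graviton-mediated contribution on top of this already minuscule photon-driven force, one his calculation suggests could be about ten times larger.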

    Recently, Norte and his colleagues developed a microchip to perform this experiment. This chip held two microscopic aluminum-coated plates that were cooled almost to absolute zero so that they became superconducting. One plate was attached to a movable mirror, and a laser was fired at that mirror. If the plates moved because of a gravitational Casimir effect, the frequency of light reflecting off the mirror would measurably shift. As detailed online July 20 in Physical Review Letters, the scientists failed to see any gravitational Casimir effect. This null result does not necessarily rule out the existence of gravitons—and thus gravity’s quantum nature. Rather, it may simply mean that gravitons do not interact with superconductors as strongly as prior work estimated, says quantum physicist and Nobel laureate Frank Wilczek of the Massachusetts Institute of Technology, who did not participate in this study and was unsurprised by its null results. Even so, Quach says, this “was a courageous attempt to detect gravitons.”

    Although Norte’s microchip did not discover whether gravity is quantum, other scientists are pursuing a variety of approaches to find gravitational quantum effects. For example, in 2017 two independent studies suggested that if gravity is quantum it could generate a link known as “entanglement” between particles, so that one particle instantaneously influences another no matter where either is located in the cosmos. A tabletop experiment using laser beams and microscopic diamonds might help search for such gravity-based entanglement. The crystals would be kept in a vacuum to avoid collisions with atoms, so they would interact with one another through gravity alone. Scientists would let these diamonds fall at the same time, and if gravity is quantum the gravitational pull each crystal exerts on the other could entangle them together.

    The researchers would seek out entanglement by shining lasers into each diamond’s heart after the drop. If particles in the crystals’ centers spin one way, they would fluoresce, but they would not if they spin the other way. If the spins in both crystals are in sync more often than chance would predict, this would suggest entanglement. “Experimentalists all over the world are curious to take the challenge up,” says quantum gravity researcher Anupam Mazumdar of the University of Groningen in the Netherlands, co-author of one of the entanglement studies.

    Another strategy to find evidence for quantum gravity is to look at the cosmic microwave background [CMB] radiation, the faint afterglow of the big bang, says cosmologist Alan Guth of M.I.T.

    Cosmic Background Radiation per ESA/Planck

    ESA/Planck 2009 to 2013

    Quanta such as gravitons fluctuate like waves, and the shortest wavelengths would have the most intense fluctuations. When the cosmos expanded staggeringly in size within a sliver of a second after the big bang, according to Guth’s widely supported cosmological model known as inflation, these short wavelengths would have stretched to longer scales across the universe.

    Inflation

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation

    HPHS Owls

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe), 2010. Credit: Alex Mittelmann, Coldcreation

    Alan Guth’s notes:

    This evidence of quantum gravity could be visible as swirls in the polarization, or alignment, of photons from the cosmic microwave background radiation.

    However, the intensity of these patterns of swirls, known as B-modes, depends very much on the exact energy and timing of inflation. “Some versions of inflation predict that these B-modes should be found soon, while other versions predict that the B-modes are so weak that there will never be any hope of detecting them,” Guth says. “But if they are found, and the properties match the expectations from inflation, it would be very strong evidence that gravity is quantized.”
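
    The dependence Guth describes is usually expressed through the tensor-to-scalar ratio r, which links the strength of the B-mode swirls to the energy scale of inflation; the standard single-field relation, quoted here for orientation rather than taken from the article, is

    \[ V^{1/4} \approx 1.0\times10^{16}\,\mathrm{GeV}\,\left(\frac{r}{0.01}\right)^{1/4} \]

    The larger r is, the brighter the swirls and the higher the energy scale they would reveal; the smaller r is, the fainter the signal, which is exactly the divide between versions of inflation that predict a detection soon and versions that place it forever out of reach.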

    One more way to find out whether gravity is quantum is to look directly for quantum fluctuations in gravitational waves, which are thought to be made up of gravitons that were generated shortly after the big bang. The Laser Interferometer Gravitational-Wave Observatory (LIGO) first detected gravitational waves in 2016, but it is not sensitive enough to detect the fluctuating gravitational waves in the early universe that inflation stretched to cosmic scales, Guth says.


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    ESA/eLISA the future of gravitational wave research

    Skymap showing how adding Virgo to LIGO helps in reducing the size of the source-likely region in the sky. Credit: Giuseppe Greco, Virgo Urbino group

    A gravitational-wave observatory in space, such as the Laser Interferometer Space Antenna (eLISA, just above), could potentially detect these waves, Wilczek adds.

    In a paper recently accepted by the journal Classical and Quantum Gravity, however, astrophysicist Richard Lieu of the University of Alabama, Huntsville, argues that LIGO should already have detected gravitons if they carry as much energy as some current models of particle physics suggest. It might be that the graviton just packs less energy than expected, but Lieu suggests it might also mean the graviton does not exist. “If the graviton does not exist at all, it will be good news to most physicists, since we have been having such a horrid time in developing a theory of quantum gravity,” Lieu says.

    Still, devising theories that eliminate the graviton may be no easier than devising theories that keep it. “From a theoretical point of view, it is very hard to imagine how gravity could avoid being quantized,” Guth says. “I am not aware of any sensible theory of how classical gravity could interact with quantum matter, and I can’t imagine how such a theory might work.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     