Tagged: Scientific American

  • richardmitnick 1:54 pm on April 27, 2020 Permalink | Reply
    Tags: “Space Telescope Director Says Best Is Yet to Come for Hubble”, Scientific American

    From Scientific American: “Space Telescope Director Says Best Is Yet to Come for Hubble” 

    From Scientific American

    April 27, 2020
    Lee Billings

    Three decades into the life of the world’s most revered orbital observatory, Ken Sembach, director of the Space Telescope Science Institute, reflects on its future.

    Astronauts maneuver over the exterior of the Hubble Space Telescope during a servicing mission to the observatory in 1999. Credit: NASA

    Thirty years ago a team of NASA astronauts tipped the Hubble Space Telescope out of a space shuttle’s cargo bay and into low-Earth orbit. High above our planet’s starlight-smearing atmosphere, Hubble could study phenomena across the cosmos that ground-based observatories could never hope to see. It was not the first space telescope, but it is by far the longest-lived and most productive—thanks in large part to an innovative design that allowed Hubble to be visited, repaired and upgraded. Today it has irreversibly transformed astronomy, leading not only to profound new discoveries about the universe but also to plans for even more ambitious space telescopes.

    Although Hubble’s eyes are more than 500 kilometers above Earth, its heart is arguably in Baltimore: in the halls, offices and conference rooms of the Space Telescope Science Institute, where the observatory’s science operations take place. To help commemorate Hubble’s three decades of discovery, Scientific American spoke with the institute’s director Ken Sembach about the telescope’s most revolutionary discoveries, its operations during the coronavirus pandemic and how much longer it might last.

    Ken Sembach, director of the Space Telescope Science Institute.

    What is your relationship to Hubble as the director of the Space Telescope Science Institute?

    I’m responsible for the science operations of Hubble, as well as the other work that we do there like the science and the flight operations of the upcoming James Webb Space Telescope [JWST] and the running of the Mikulski Archive for Space Telescopes.

    NASA/ESA/CSA Webb Telescope annotated

    Mikulski Archive For Space Telescopes

    So Hubble is one component of the work the Institute does. I have a great team of people, led by a mission office that is responsible for the day-to-day Hubble operations we do there.

    What is the most fun part of your job when it comes to Hubble?

    I get a lot of satisfaction out of seeing the whole Hubble team working together to make a great science idea become something spectacular. That’s fun. But so is something else—a perk that comes along with my position. It’s called “director’s discretionary time.” And this is something that gives me up to 10 percent of the telescope’s time to use as I choose—usually for particularly important observations that may be too time-sensitive or too ambitious to get through the usual channels for allocating the telescope’s time. Sometimes these are just things that everyone recognizes we need, but they aren’t right at the cutting edge, they aren’t brand-new and shiny—fundamental matters of basic science that have to be done to build up to the tip-top of the peak that everyone wants to get to.

    Some of Hubble’s greatest successes that have really moved the field forward over the years resulted from director’s discretionary time. The best examples of that, I think, are the Deep Fields.

    NASA Hubble Deep Field

    Hubble Ultra Deep Field NASA/ESA Hubble

    The first Hubble Deep Field, the Ultra Deep Field that followed, the Frontier Fields that followed that—they all basically came out of the director at the time saying, “This is important enough and revolutionary enough to do. And even though others may disagree with me, we’re going to go forward and do it on behalf of the community.”

    Frontier Fields

    So, for instance, I’ve used my director’s time to just start something called ULLYSES—the Ultraviolet Legacy Library of Young Stars as Essential Standards. This will be the largest Hubble program ever executed. And across 1,000 of the telescope’s orbits around Earth, it will bring the observatory’s unique ultraviolet imaging and spectroscopy to bear on the question of how stars form. Paired with other observations from current and upcoming facilities such as Gaia, ALMA [Atacama Large Millimeter/submillimeter Array] and [JWST], this could let us finally capture and unravel the details of star formation that we haven’t been able to access before.

    ESA/GAIA satellite

    ESO/NRAO/NAOJ ALMA Array on the Chajnantor plateau in the Atacama Desert of Chile, at 5,000 metres

    If we don’t understand star formation in a decade—with this program, with all these amazing facilities coming along—we’re probably never going to understand it.

    And none of those ultraviolet observations could be done with something besides Hubble?

    That’s right. You wouldn’t be able to do it. You can’t get ultraviolet from ground-based observatories, because the atmosphere of Earth blocks that light. So you need a space telescope to do it. And right now Hubble is the only one that’s capable of providing that kind of information. [JWST] and its planned follow-on, an observatory called WFIRST [Wide Field Infrared Survey Telescope], are both infrared observatories, so they can’t do it. Right now only Hubble sees this kind of light, barring a few minor exceptions that come nowhere close to Hubble’s capabilities. So when Hubble goes, we could be blind to the ultraviolet universe. Right now, even though we are 30 years into its life, I really think, for Hubble, the best is yet to come.

    Really? What’s the “best” that might be coming? And how much longer could Hubble last anyway?

    One lesson of the past three decades is that Hubble always surprises us with new and interesting things wherever it looks. It is a key contributor to the tremendous rate of increase in our knowledge about the universe that we have seen in recent years. And the more we learn, the more we learn that we need to learn more—which is, in part, why we still have such incredible demand. We still receive more than 1,000 proposals every year from researchers around the world hoping to use Hubble to study everything from solar system objects to things at the edge of the visible universe.

    Being realistic, I think Hubble’s got a good five years left. And we’re operating the observatory in a way meant to keep it scientifically productive out to 2025. Does this mean we’ll get to 2025? No, something could go wrong tomorrow—this is the space business, after all. But, then again, maybe we could get to 2030. Hubble has a lot of built-in redundancy. And it has been visited, repaired and upgraded by astronauts five times throughout its life. Each of those servicing missions rejuvenated the observatory and gave it new capabilities—better electronics, better mechanical components, better detectors, things like that. The fact that most of those new components haven’t failed means they’re past their infant-mortality phase. They could go another 10 or 15 years. Most of Hubble is quite healthy. What worries me are certain things original to the observatory, such as the fine-guidance sensor electronics. They’ve been bathing in cosmic radiation every day for 30 years. Eventually, that takes its toll.

    As for the transformative things Hubble could still do: For one thing, Hubble could have a big impact on multimessenger astronomy—where you’re using gravitational-wave observatories to detect things such as merging black holes and neutron stars and then studying those things with other, more traditional facilities. This is a research area that is opening up an entirely new window on the most massive and energetic events that occur in the universe, the things that ripple the very fabric of spacetime. Hubble can help immensely to tell us what went bump in the night, what actually collided or coalesced to cause those ripples.

    Also, there is still one of the original problems that Hubble was designed to help solve, which is determining how fast the universe is expanding—something called the Hubble constant. Right now there is a growing tension between measured values of the Hubble constant, between those based on the cosmic microwave background and those based on observing the relatively nearby universe using supernovae. The observatory has helped to drill down on the value of the Hubble constant in the nearby universe to 10 percent precision, as was originally promised. Then we got it to 3 percent. Now we’re working to get it to 1 percent. We may soon get to the point where the tension between these two sets of estimates is such that it really requires entirely new types of physics to describe what’s going on. Maybe there’s another flavor of neutrinos out there. Discovering something like that would be huge.
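    To put rough numbers on that tension: the sketch below compares representative published estimates of the Hubble constant, a Planck-style cosmic microwave background fit against a Cepheid-calibrated supernova measurement. The specific values and uncertainties are assumptions of this sketch, not figures quoted by Sembach, but they illustrate the roughly 10 percent gap, and the naive combined significance of about 4 sigma, that the tension refers to.

    h0_cmb, err_cmb = 67.4, 0.5   # km/s/Mpc, early-universe (CMB-based) estimate, assumed
    h0_sne, err_sne = 74.0, 1.4   # km/s/Mpc, local supernova-ladder estimate, assumed

    difference = h0_sne - h0_cmb
    percent = 100.0 * difference / h0_cmb
    sigma = difference / (err_cmb**2 + err_sne**2) ** 0.5   # naive quadrature combination

    print(f"gap: {difference:.1f} km/s/Mpc ({percent:.1f}%), roughly {sigma:.1f} sigma")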

    And with [JWST] coming along, let’s remember that everything people will look at with [JWST], they’ll want to look at with Hubble, too, to get a more complete picture while the two observatories are both still operating. Studies of star formation, the first galaxies, exoplanets—all will benefit from these two observatories working together.

    So all these things lead me to believe that Hubble’s best years are yet to come.

    Do you think NASA should consider another servicing mission?

    It’s certainly something worth looking at. There is no obvious successor for Hubble’s capabilities in ultraviolet and really blue optical light in the near future—for the 2020s and perhaps the 2030s. If Hubble really does end in 2025, we might have a gap of 10, 15, maybe even 20 years before another big telescope can come online with those kinds of capabilities. And how damaging would that be to the field? You’re going to want Hubble or something like it to synergize with the observations of so many other planned future missions.

    You know, I would never bet against Hubble. But there are many facets to consider in terms of the potential cost-benefit associated with any potential Hubble refurbishment. Trades would have to be made. I would say it would be short-sighted to slam that door shut. But it would also be cavalier to say “Let’s go and do it” without carefully thinking it through.

    The coronavirus pandemic is obviously posing challenges for every aspect of society. Is it impacting Hubble operations?

    Well, it’s hard to put Hubble in the context of something so enormous that is happening to everybody around the globe. But we are fortunate to have been fairly well prepared for this, because for years, we’ve been downsizing, streamlining and automating operations as a way of saving money and increasing efficiency. So we can ensure the science keeps flowing and Hubble’s operations can continue while most of our team works from home. We do have some personnel on-site to upload commands to the telescope. Whereas for ground-based observatories, where people must be on-site, doing real hands-on things, it’s much more difficult to keep them operational right now. So Hubble is filling a bit of a void with observations for the community during this whole pandemic period. And we’re proud and happy to be able to do that. We’re delighted to be able to deliver inspirational scientific results that give people a little bit of a bright spot in what may be otherwise dark times.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 11:09 am on December 27, 2019 Permalink | Reply
    Tags: “Mapping the Remains of Supernovae”, Scientific American

    From UNSW via Scientific American: “Mapping the Remains of Supernovae” 

    UNSW bloc

    From University of New South Wales

    via

    Scientific American


    Scientific American January 2020 Issue
    Rachel Berkowitz

    A new tool provides a detailed, 3-D chemical view of exploded star systems.

    Light emitted by two supernova remnants. Green indicates charged iron. Credit: I. R. SEITENZAHL ET AL.

    When a dense stellar core called a white dwarf acquires enough material from a companion star orbiting nearby, it burns up in the nuclear fusion blast of a Type Ia supernova. This ejects freshly synthesized elements that mix with interstellar gas and eventually form stars and galaxies. But astrophysicists still don’t know the specific conditions that ignite these explosions.

    Ivo Seitenzahl, an astrophysicist at the University of New South Wales Canberra, and his colleagues used the upgraded Very Large Telescope (VLT) in Chile to build unprecedented 3-D chemical maps of the debris left behind by these supernovae.

    ESO VLT at Cerro Paranal in the Atacama Desert, elevation 2,635 m (8,645 ft), seen from above. The four unit telescopes are ANTU (UT1; The Sun), KUEYEN (UT2; The Moon), MELIPAL (UT3; The Southern Cross) and YEPUN (UT4; Venus, as evening star). Credit: J.L. Dauvergne & G. Hüdepohl, atacama photo

    These maps can help scientists work backward to “constrain the fundamental properties of these explosions, including the amount of kinetic energy and the mass of the exploding star,” says Carles Badenes, an astrophysicist at the University of Pittsburgh, who was not involved in the study.

    During a supernova event, heavy elements shoot from the white dwarf’s core at supersonic speeds. This drives a shock wave outward through the surrounding interstellar gas and dust, and another shock wave bounces backward into the explosion debris, eventually heating the ejected matter to x-ray-emitting temperatures. Scientists can learn about a supernova remnant’s composition from these x-rays—but current x-ray instruments lack the resolution to measure the movement of ejected material.

    Seitenzahl’s group used visible-light data from the VLT to analyze supernova remnants in a new way, described in July in Physical Review Letters. Basic models suggest that Type Ia supernovae produce most of the universe’s iron. That iron should hold a stronger electrical charge the farther it is behind the supernova’s shock wave and emit distinctive visible wavelengths of light; however, those emissions were too faint to detect before the VLT’s recent instrument upgrade.

    With the upgrade, the researchers detected concentric layers of charged iron within supernova remnants in the Large Magellanic Cloud, a nearby satellite galaxy of our Milky Way. From distortion patterns in light released by the charged iron, they determined the inward shock wave’s velocity in Type Ia supernova remnants for the first time. “This is exciting science that’s been enabled by new technology, used on precisely the type of [supernova] that needs it,” says Dan Milisavljevic, an astronomer at Purdue University, who was also not involved in the work.
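    The velocity measurement rests on the Doppler shift of an optical emission line, v ≈ c·Δλ/λ. The minimal sketch below shows only that relation; the rest wavelength (a coronal iron line near 5303 angstroms) and the example shift are illustrative assumptions rather than values from the paper, whose actual analysis fits the full three-dimensional velocity structure of the remnant.

    C_KM_S = 299_792.458       # speed of light, km/s
    REST_WAVELENGTH = 5303.0   # angstroms; a coronal iron line, assumed here as an example

    def doppler_velocity(observed_wavelength):
        """Non-relativistic line-of-sight velocity, in km/s, from a shifted emission line."""
        return C_KM_S * (observed_wavelength - REST_WAVELENGTH) / REST_WAVELENGTH

    # A line observed 30 angstroms redward of its rest wavelength corresponds to ~1,700 km/s.
    print(f"{doppler_velocity(5333.0):.0f} km/s")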

    Seitenzahl’s group also found that one particular supernova originated from a white dwarf whose mass was thought to be too small to trigger such an explosion, suggesting there is still more to learn about this process. Further work could reveal more details about the chemicals produced in Type Ia supernovae, whether an explosion initiates on the surface or interior of the star and what conditions trigger the blast.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 12:03 pm on December 26, 2019 Permalink | Reply
    Tags: “‘Qutrit’ Experiments Are a First in Quantum Teleportation”, Quantum communications, Scientific American, To create their qutrits both teams used the triple-branching path of a photon

    From Scientific American: “‘Qutrit’ Experiments Are a First in Quantum Teleportation” 


    From Scientific American

    August 6, 2019
    Daniel Garisto

    The proof-of-concept demonstrations herald a major step forward in quantum communications.

    (aleksandarnakovski/iStock)

    Credit: Getty Images

    For the first time, researchers have teleported a qutrit, a tripartite unit of quantum information. The independent results from two teams are an important advance for the field of quantum teleportation, which has long been limited to qubits—units of quantum information akin to the binary “bits” used in classical computing.

    These proof-of-concept experiments demonstrate that qutrits, which can carry more information and have greater resistance to noise than qubits, may be used in future quantum networks.

    Chinese physicist Guang-Can Guo and his colleagues at the University of Science and Technology of China (USTC) reported their results in a preprint paper on April 28, although that work remains to be published in a peer-reviewed journal. On June 24 the other team, an international collaboration headed by Anton Zeilinger of the Austrian Academy of Sciences and Jian-Wei Pan of USTC, reported its results in a paper in Physical Review Letters. That close timing—as well as the significance of the result—has each team vying for credit and making critiques of the other’s work.

    “Each of these [experiments] is an important advance in the technology of teleportation,” says William Wootters, a physicist at Williams College, who was not involved with either study.

    The name quantum teleportation brings to mind a technology out of Star Trek, where “transporters” can “beam” macroscale objects—even living humans—between far-distant points in space. Reality is less glamorous. In quantum teleportation, the states of two entangled particles are what is transported—for instance, the spin of an electron. Even when far apart, entangled particles share a mysterious connection; in the case of two entangled electrons, whatever happens to one’s spin influences that of the other, instantaneously.

    “Teleportation” also conjures visions of faster-than-light communication, but that picture is wrong, too. If Alice wants to send Bob a message via quantum teleportation, she has to accompany it with classical information transported via photons—at the speed of light but no faster. So what good is it?

    Oddly enough, quantum teleportation may also have important utility for secure communications in the future, and much of the research is funded with cybersecurity applications in mind. In 2017 Pan, Zeilinger and their colleagues used China’s Micius satellite to perform the world’s longest communication experiment, across 7,600 kilometers.

    Illustration of the three cooperating ground stations (Graz, Nanshan, and Xinglong).

    Two photons—each acting as a qubit—were beamed to Vienna and China. By taking information about the state of the photons, the researchers in each location were able to effectively construct an unhackable password, which they used to conduct a secure video call. The technique acts like a wax seal on a letter: any eavesdropping would interfere and leave a detectable mark.

    Researchers have attempted to teleport more complicated states of particles with some success. In a study published in 2015 [Physics World] Pan and his colleagues managed to teleport two states of a photon: its spin and orbital angular momentum. Still, each of these states was binary—the system was still using qubits. Until now, scientists had never teleported any more complicated state.

    A classical bit can be a 0 or 1. Its quantum counterpart, a qubit, is often said to be 0 and 1—the superposition of both states. Consider, for instance, a photon, which can exhibit either horizontal or vertical polarization. Such qubits are breezily easy for researchers to construct.

    A classical trit can be a 0, 1 or 2—meaning a qutrit must embody the superposition of all three states. This makes qutrits considerably more difficult to make than qubits.
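    In standard textbook notation (not drawn from either team’s paper), the difference is a single extra term in the superposition, with the amplitudes still normalized:

    |\psi_{\mathrm{qubit}}\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

    |\psi_{\mathrm{qutrit}}\rangle = \alpha\,|0\rangle + \beta\,|1\rangle + \gamma\,|2\rangle, \qquad |\alpha|^2 + |\beta|^2 + |\gamma|^2 = 1

    In the photon-path encoding described in the next paragraph, the basis states |0⟩, |1⟩ and |2⟩ correspond to the three branches a photon can take.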

    To create their qutrits, both teams used the triple-branching path of a photon, expressed in carefully orchestrated optical systems of lasers, beam splitters and barium borate crystals. One way to think about this arcane arrangement is the famous double-slit experiment, says physicist Chao-Yang Lu, a co-author of the new paper by Pan and Zeilinger’s team. In that classic experiment, a photon goes through two slits at the same time, creating a wavelike interference pattern. Each slit is a state of 0 and 1, because a photon goes through both. Add a third slit for a photon to traverse, and the result is a qutrit—a quantum system defined by the superposition of three states in which a photon’s path effectively encodes information.

    Creating a qutrit from a photon was only the opening skirmish in a greater battle. Both teams also had to entangle two qutrits together—no mean feat, because light rarely interacts with itself.

    Crucially, they had to confirm the qutrits’ entanglement, also known as the Bell state. Bell states, named after John Stewart Bell, a pioneer of quantum information theory, are the conditions in which particles are maximally entangled. Determining which Bell state qutrits are in is necessary to extract information from them and to prove that they conveyed that information with high fidelity.

    What constitutes “fidelity” in this case? Imagine a pair of weighted dice, Wootters says: If Alice has a die that always lands on 3, but after she sends it to Bob, it only lands on 3 half of the time, the fidelity of the system is low—the odds are high it will corrupt the information it transmits. Accurately transmitting a message is important, whether the communication is quantum or not. Here, the teams are in dispute about the fidelity. Guo and his colleagues believe that their Bell state measurement, taken over 10 states, is sufficient for a proof-of-concept experiment. But Zeilinger and Pan’s group contends that Guo’s team failed to measure a sufficient number of Bell states to definitively prove that it has high enough fidelity.
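    Wootters’ dice analogy can be made concrete with the classical (Bhattacharyya) fidelity between two probability distributions, F(p, q) = (Σᵢ √(pᵢ qᵢ))². The quantum-state fidelity the teams actually report is the density-matrix analogue of this; the sketch below is only meant to show why the weighted-dice example counts as low fidelity. The particular spread assigned to Bob’s die is an assumption, and it does not affect the result, because Alice’s distribution is zero on those faces.

    from math import sqrt

    def fidelity(p, q):
        """Classical fidelity between two probability distributions."""
        return sum(sqrt(pi * qi) for pi, qi in zip(p, q)) ** 2

    alice = [0, 0, 1.0, 0, 0, 0]              # Alice's die always lands on 3
    bob   = [0.1, 0.1, 0.5, 0.1, 0.1, 0.1]    # Bob's copy lands on 3 only half the time

    print(fidelity(alice, bob))   # 0.5 -- low fidelity, as in the analogy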

    Despite mild sniping, the rivalry between the groups remains relatively friendly, even though provenance for the first quantum teleportation of a qutrit hangs in the balance. Both teams agree that each has teleported a qutrit, and they both have plans to go beyond qutrits: to four-level systems—ququarts—or even higher.

    Some researchers are less convinced, though. Akira Furusawa, a physicist at the University of Tokyo, says that the method used by the two teams is ill-suited for practical applications because it is slow and inefficient. The researchers acknowledge the criticism but defend their results as a work in progress.

    “Science is step by step. First, you make the impossible thing possible,” Lu says. “Then you work to make it more perfect.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 12:28 pm on October 26, 2019 Permalink | Reply
    Tags: I think it’s perfectly reasonable to examine how communications across interstellar space might play out should they exist., It seems that there is a built-in inevitability for life to cause and participate in information flow., Scientific American, The bottom line is that we have not yet done enough to tell whether the cosmos is devoid of communicative species or crammed with them., We have to assume that really long-distance communications are actually possible at all., We would also have to assume that technologically inclined species can arise and survive for long enough to expend time and energy on any of these things.

    From Scientific American: “Interstellar Conversations” 


    From Scientific American

    Could there be information networks across the galaxy?

    October 19, 2019
    Caleb A. Scharf

    Credit: C. Scharf 2019

    Let’s start by clearing something up. Whatever the ins and outs of the search for extraterrestrial intelligence over the years (which I’ll label as SETI), the bottom line is that we have not yet done enough to tell whether the cosmos is devoid of communicative species or crammed with them. Nowhere has this been articulated better than in the work by Jason Wright, Shubham Kanodia, and Emily Lubar of Penn State and their ‘Haystack equation’ [The Astronomical Journal]. This shows, unequivocally, that to date we’ve searched about as much as if we’d stared into a modest hot-tub’s worth of water from all of Earth’s oceans.

    Consequently, to say that ‘there’s clearly nothing out there’ is like looking in that hot tub, not finding a dolphin, and concluding that dolphins therefore do not exist anywhere on the planet.
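    For a sense of the numbers behind that comparison: Earth’s oceans hold roughly 1.3 billion cubic kilometres of water, while a modest hot tub holds a couple of cubic metres (the hot-tub volume is my own assumption, not a figure from Wright and colleagues). The searched fraction of the “haystack” then comes out at order one part in 10^18.

    OCEAN_VOLUME_M3 = 1.3e9 * 1e9   # ~1.3 billion cubic kilometres, in cubic metres
    HOT_TUB_M3 = 2.0                # an assumed, modest hot tub

    fraction = HOT_TUB_M3 / OCEAN_VOLUME_M3
    print(f"fraction of the haystack searched so far: ~{fraction:.0e}")   # ~2e-18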

    Given that fact, I think it’s perfectly reasonable to examine how communications across interstellar space might play out, should they exist. This does, of course, require a whole bunch of prior assumptions.

    We have to assume that really long-distance communications, whether by radio, laser, beams of neutrinos, massive engineering of weird stellar transit signals, or other barely imagined options, are actually possible at all. We have to assume, or at least posit, that information might flow across interstellar space either as inadvertent side effects of a busy species (noisily broadcasting or carelessly pointing lasers, among other things) or as deliberate signals – seeking replies, establishing communications, or tracking a species’ own kind.

    We would also have to assume that technologically inclined species can arise and survive for long enough to expend time and energy on any of these things. That’s part of the depressing, although potentially realistic, Anthropocene mindset. But equally, simply shrugging our shoulders and saying that it’s all hopeless shuts down a discussion that could be very important.

    That importance could stem from the relevance of information itself. At all levels, information appears to be not just an integral part of the phenomenon of life on Earth [Chaos: An Interdisciplinary Journal of Nonlinear Science], but the flow of information may represent a critical piece of what makes something alive versus not alive (that flow and informational influence might even be [Journal of the Royal Society Interface] of what life is).

    One small facet of this is very evident in how social animals deploy the flow of information. Imagine, for example, that humans didn’t communicate with each other in any way. It’s next to impossible to imagine that, right? We’re communicating even when we’re not speaking or touching. If I merely watch you walk down the street I’m accumulating information, adding that to my internal stash, analyzing, and incorporating it into my model of the world.

    There’s a much bigger discussion to be had there, but let’s come back to SETI. It seems that there is a built-in inevitability for life to cause and participate in information flow, and we should assume that extends across interstellar distances too. We ourselves have taken baby steps towards this – from our transmissions to our SETI efforts, to the fact that we maintain communications with our most distant robotic spacecraft, the Voyagers.

    As we’ve seen with studying the ideas of the so-called Fermi Paradox, in principle it’s pretty ‘easy’ for interstellar explorers to spread across the galaxy given a few million years. It therefore should be even easier for an information-bearing network to spread across the galaxy too. Signals can move at up to the speed of light, so the bottlenecks come from issues like the fading of signal strength with distance, the timescale of development of the infrastructure to receive and transmit, and the choices made on directionality (perhaps).

    The beautiful thing is that we can model hypotheses for this galactic information flow – even if we don’t know all the possible ifs, buts, and maybes. We can, in principle, test hypotheticals about the structure of information-bearing interstellar networks, which will also relate to the known physical distribution and dynamics of star systems and planets in our galaxy.

    Perhaps somewhere in there are clues about where we stand in relation to conversations that could be skittering by us all the time. Perhaps, too, there are clues about what those conversations would entail, what the most valuable interstellar informational currencies really are.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 11:51 am on August 23, 2019 Permalink | Reply
    Tags: Scientific American

    From Scientific American: Women in STEM- “In Support of the Vera C. Rubin Observatory” 


    From Scientific American

    August 23, 2019
    Megan Donahue

    The House of Representatives has taken the first step toward honoring a pioneering woman in astronomy.

    LSST the Vera C. Rubin Observatory

    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.


    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

    On July 23, the U.S. House of Representatives approved H.R. 3196, the Vera C. Rubin Observatory Designation Act, which was introduced by Representative Eddie Bernice Johnson of Texas and Representative Jenniffer González-Colón of Puerto Rico (at large). If the Senate agrees, it will name the facility housing the Large Synoptic Survey Telescope the Vera C. Rubin Observatory in honor of Carnegie Institution for Science researcher Vera Cooper Rubin, who died in 2016.

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on Dark Matter.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    As a woman astronomer working in the field of cosmology and galaxy studies, Rubin has always been a personal hero of mine. I can’t think of a more appropriate tribute to her memory and her incredible contributions to science, astronomy and future astronomers than this honor.

    The text of the bill itself celebrates the milestones of Rubin’s scientific career. As a student and young professor, she studied how galaxies cluster and move inside such clusters. In 1970 she and astronomer W. Kent Ford, Jr., published measurements of the line-of-sight velocities and locations of individual ionized clouds of gas inside the nearby Andromeda galaxy (M31), showing that they were moving too fast to be gravitationally bound to the galaxy if the only matter binding it was the matter we can see (in the form of stars).

    We call these kinds of observations “rotation curves,” because inside spiral galaxies such as Andromeda or our own Milky Way, the orbits of stars and gas circle the center of the galaxy inside a volume of space shaped like a disk. A typical rotation curve plots the velocities of gas clouds or stars toward or away from us as a function of distance from the center of the disk. These curves can be fit to models of where the matter is inside those orbits to work out how much matter is inside the galaxy and where it sits.
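    A minimal sketch of why the flatness of those curves mattered: for a circular orbit, v(r) = sqrt(G·M(<r)/r), so if essentially all of a galaxy’s mass sat in its visible inner regions, orbital speeds should fall off as 1/sqrt(r) at large radii. Rubin and Ford’s curves instead stayed roughly flat, which requires the enclosed mass to keep growing with radius. The mass and radii below are round illustrative numbers, not Andromeda data.

    G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30          # kg
    KPC = 3.086e19            # m
    M_VISIBLE = 1e11 * M_SUN  # assume ~1e11 solar masses of visible stars and gas

    def v_keplerian(r_kpc):
        """Circular speed in km/s if only the visible mass were present."""
        return (G * M_VISIBLE / (r_kpc * KPC)) ** 0.5 / 1e3

    for r_kpc in (5, 10, 20, 40):
        print(f"r = {r_kpc:>2} kpc: Keplerian v ~ {v_keplerian(r_kpc):.0f} km/s")
    # Prints ~293, 207, 147 and 104 km/s -- a falling curve, whereas measured spiral-galaxy
    # curves stay roughly flat at a few hundred km/s out to large radii.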

    In Rubin and Ford’s paper, they did not make much of a fuss about the interpretation. By 1980, however, Rubin, Ford and the late Norbert Thonnard presented long-slit spectroscopy of a sample of 21 galaxies. They derived the rotation curves from these data, and in this work, their most cited paper and the most cited work of that period of Rubin’s career, they boldly posited that gravity caused by something other than stars and gas must be binding the galaxies together. These observations provided some of the first direct evidence of the existence of dark matter inside of galaxies.

    Later observations of clusters of galaxies and of the cosmic microwave background confirm that dark matter exists in even larger structures, and it appears to outweigh the stars and gas in the universe by a factor of about seven. Rubin investigated questions related to the nature of spiral galaxies and dark matter for most of her life. We still don’t know exactly what dark matter is made out of, but her discoveries transformed our thinking about the universe and its contents.

    Although many of us astronomers thought Rubin should have won a Nobel Prize in Physics for her work in finding dark matter in galaxies, it’s not as if she went unrecognized during her life. She was a very highly regarded scientist, and she was recognized by her fellow researchers. In 1993, she was awarded the National Medal of Science, which is based on nomination by one’s peers, submitted to the National Science Foundation, and subsequent selection by 12 presidentially appointed scientists.

    This award was set up by John F. Kennedy in 1962. In the category of physical sciences, it was first given to a woman—Margaret Burbidge—20 years later, after more than 60 men had received that prize. After another 10 years and more than 30 male prizewinners, Rubin won it. (If you’re wondering: yes, an additional 14 years passed and 27 more men won the prize in the physical sciences category before any other women did so.)

    In 1996 Rubin was the second woman ever to receive the Gold Medal of the Royal Astronomical Society. The first woman so honored was Caroline Herschel, nearly 170 years prior. As did many women of her generation (or any of them), Rubin faced many barriers in her career simply because she was a woman. For example, as a scientific staff member of the Carnegie Institution in the 1960s, she had institutional access to the world-class Palomar Observatory in California. But she was denied access to the observatory, with the excuse that there were limited bathroom facilities.

    Caltech Palomar Observatory, located in San Diego County, California, US, at 1,712 m (5,617 ft)

    Nevertheless, she persisted, and in 1965 she was finally allowed to observe at Palomar. She was the first woman to be officially allowed to do so. (Burbidge had gained access under the name of her husband Geoffrey.) Rubin carried on as an advocate for the equal treatment of women in science and helped many other women in their careers as astronomers. The Large Synoptic Survey Telescope, funded primarily by the NSF and the Department of Energy, will carry on her legacy and her work to study the nature of dark energy and dark matter and map out the structure of the universe as traced by billions of galaxies.

    We have come a long way from the days when women weren’t allowed in the same buildings as men. But we still have a long way to travel, because it is still too easy, even in science and with our desire to avoid bias, for a man to cast doubt on the worth of a woman’s work. We also apparently have much to learn about the nature of dark matter—which may be a dark sector of dark matter particle species, for all we know so far. Because of Rubin’s pioneering work, we are all further along these journeys than we would be without her. By hearing her name and her story, along with the wonderful discoveries we all anticipate from the Vera C. Rubin Observatory, little girls everywhere can learn that they, too, can contribute to our understanding of the universe.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 10:43 am on July 7, 2019 Permalink | Reply
    Tags: “Oregon Is About to Get a Lot More Hazardous”, Landslides and debris flows, Scientific American

    From Scientific American: “Oregon Is About to Get a Lot More Hazardous” 


    From Scientific American

    June 29, 2019
    Dana Hunter

    State leadership is failing its citizens—and there will be a body count.

    Credit: Dale Simonson (CC BY-SA 2.0)

    When you live in an area at as much geologic risk as Oregon, you would expect that government officials would maybe, possibly, take those risks seriously. But the people who currently govern Oregon seem quite determined to ignore hazards and let the state languish unprepared.

    It’s bad enough that legislators voted this month to allow “new schools, hospitals, jails, and police and fire stations” to be built in areas that will most certainly be inundated in the event of a tsunami. Both parties think it’s a good idea now; I doubt they’ll still be feeling great about locating schools right in the path of rampaging seawater when the big one hits. But short-term economic gain outweighs long-term planning, so here we are. What else can we expect from a statehouse where lawmakers would rather flee the state than be forced to deal with climate change?

    People say they’re willing to accept the risks. However, the state government is now planning to make it far harder for residents to even know what those risks are, because Oregon’s Department of Geology and Mineral Industries (DOGAMI for short) is severely underfunded and will now lose three critically needed experts on staff as a punishment for going over budget. As if that weren’t bad enough, the governor’s office is considering whether the agency should even continue to exist:

    In a note on the preliminary budget proposal for the agency, the Joint Ways and Means Committee said the Governor’s office would be “evaluating if the Department should continue to exist as an independent or recommendations to abolish and move the individual programs to other entities.”

    That drastic of a move could come with big consequences, Avy said.

    “It would be incredibly disruptive to staff and it is likely that some on-going studies would be discontinued,” he said. “Oregon would lose a valued agency and may lose talented staff in our Geological Survey and Services Program which provides a focus on geologic and mineral mapping and natural hazard identification.”

    Can we be real for a minute, here? Oregon is a geologically young state in an active subduction zone, located on an ocean that has subduction zones on both sides, which generate ocean-spanning tsunamis on a regular basis. The local subduction zone, plus Basin and Range crustal stretching and faulting, also produces active volcanoes. Many, many volcanoes. Also, too, all of this folding and faulting and uplifting and volcanoing leaves the state terribly landslide prone. This is not a place where you can safely starve your local geological survey of funds, and then shut it down when it needs extra money to identify and quantify the hazards you face.

    So if you live in Oregon, or even if you just visit, I’d strongly consider writing a polite but serious missive to Governor Kate Brown, letting her know that it would perhaps be a good idea to look further into the possible repercussions of signing that deplorable tsunami bill (I mean, at least take the schools out of the mix!), and also fully fund DOGAMI rather than further crippling it and then stripping it for parts.

    Let’s have a brief tour of Oregon’s geohazards which DOGAMI helps protect us from, then, shall we?

    Tsunamis

    The Oregon coast is extremely susceptible to tsunamis, both generated from Cascadia and from other subduction zones along the Pacific Ocean. You can see evidence of them everywhere.

    Cascadia subduction zone. This is the site of recurring megathrust earthquakes at average intervals of about 500 years, including the Cascadia earthquake of 1700.

    One of the starkest reminders in recent times was the dock that was ripped from the shoreline in Misawa, Japan, in the brutal 2011 Tōhoku Earthquake. The tsunami that sheared it loose and set it afloat also washed ashore in California and Oregon, causing millions of dollars in damage; loss of life in the United States was only avoided due to ample warnings.

    Ocean energy distribution forecast map for the 2011 Sendai earthquake from the U.S. NOAA. Note the location of Australia for scale.

    Just over a year later, the dock washed up on Agate Beach, Oregon.

    At Agate Beach, homes and businesses are built right in the path of the next Cascadia tsunami. I can’t describe to you the eerie sensation you feel turning away from that dock to see vulnerable structures that will be piles of flooded rubble after the next tsunami hits.

    Residences and businesses on Agate Beach. Even a modest tsunami will cause untold damage to these structures. Credit: Dana Hunter

    The people here will have minutes to find high ground after the shaking stops, if that long. There is some high ground nearby, but not much, and perhaps not near enough. Roads will probably be destroyed or blocked in the quake. This is the sort of location where the legislature has decided it would be fine to site schools.

    Earthquakes

    The stump of a drowned spruce at Sunset Bay, Shore Acres, OR. Lockwood DeWitt for scale. Credit: Dana Hunter

    Sunset Bay is the site of one of Oregon’s many ghost forests. Here, a Cascadia earthquake dropped the shoreline about 1,200 years ago, suddenly drowning huge, healthy trees in salt water. At least seven spectacular earthquakes have hit the Oregon coast in the past 3,500 years. It may not sound like much, or often… but look to Japan for the reason why we should take the threat extremely seriously. And Oregon doesn’t just have to worry about Cascadia quakes: the state is full of faults, stretching from north to south and from coast to interior.

    Volcanoes

    Huge swathes of Oregon are volcanic. As in, recently volcanic. As in, will definitely erupt again quite soon.

    Mount Hood, a sibling to Mount St. Helens, is right outside of Portland and last erupted in the mid-1800s. It is hazardous as heck.

    Mount Hood reflected in Trillium Lake, Oregon, United States

    But Hood is very, very far from the only young volcano in the state, and evidence of recent eruptions is everywhere. Belknap shield volcano and its associated volcanoes on McKenzie Pass ceased erupting only 1,500 years ago, and the forces that created it are still active today.

    Belknap Crater, Oregon. Cascades Volcano Observatory

    Another volcanic center like it could emerge in the near future. And you see here just a tiny swath of the destruction such a volcanic center causes.

    You know what you really don’t want to be caught unawares by? A volcano. And even once they’ve stopped erupting, the buggers can be dangerous. Sector collapses, lahars, and other woes plague old volcanoes. You need people who can keep a sharp eye on them. And I’m sorry, but the USGS can’t be everywhere at once. Local volcano monitoring is important!

    Landslides and debris flows

    If you’re an Oregon resident, you’ll probably remember how bloody long it took to finish the Eddyville Bypass due to the massive landslide that got reactivated during construction. Steep terrain plus plenty of rain equals lots of rock and soil going where we’d prefer it didn’t.

    Debris flows and landslides regularly take out Oregon roads, including this stretch on a drainage by Mount Hood.

    Construction equipment copes with damage caused by massive debris flows coming down from Mount Hood. Credit: Dana Hunter

    We know from the Oso mudslide just how deadly these mass movements can be. Having experts out there who understand how to map the geology of an area and identify problem areas is critically important, especially in places where a lot of people want to live, work, and play.

    Contact the governor’s office and let her know if you don’t think it’s worth letting a budget shortfall torpedo the agency that should be doing the most to identify these hazards and help us mitigate them.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 3:05 pm on June 17, 2019 Permalink | Reply
    Tags: Scientific American

    From Scientific American: “Which Should Come First in Physics: Theory or Experiment?” 


    From Scientific American

    June 17, 2019
    Grigoris Panoutsopoulos
    Frank Zimmermann

    Plans for giant particle accelerators of the future focus attention on how scientific discoveries are really made.

    The discovery of the Higgs particle at the Large Hadron Collider (LHC) over half a decade ago marked a milestone in the long journey towards understanding the deeper structure of matter. Today, particle physics strives to push forward a diverse range of experimental approaches from which we may glean new answers to fundamental questions regarding the creation of the universe and the nature of the mysterious and elusive dark matter.

    Such an endeavor requires a post-LHC particle collider with an energy capability significantly greater than that of previous colliders. This is how the idea for the Future Circular Collider (FCC) at CERN came to be—a machine that could put the exploration of new physics in high gear.

    CERN FCC Future Circular Collider map

    To understand the validity of this proposal, we should, however, start at the beginning and once more ask ourselves: “How does physics progress?”

    Many believe that grand revolutions are driven exclusively by new theories, whereas experiments play the parts of movie extras. The played-out story goes a little something like this: theorists form conjectures, and experiments are used solely for the purposes of testing them. After all, most of us proclaim our admiration for Einstein’s relativity or for quantum mechanics, but seldom do we pause and consider whether these awe-inspiring theories could have been attained without the contributions of the Michelson-Morley, Stern-Gerlach or black-body–radiation experiments.

    This simplistic picture, despite being far removed from the creative, and often surprising, ways in which physics has developed over time, remains quite widespread even among scientists. Its pernicious influence can be seen in the discussion of future facilities like the proposed FCC at CERN.

    In the wake of the discovery of the Higgs boson in 2012, we finally have all of the pieces of the puzzle of the Standard Model (SM) of physics in place. Nevertheless, the unknowns regarding dark matter, neutrino masses and the observed imbalance between matter and antimatter are among numerous indications that the SM is not the ultimate theory of elementary particles and their interactions.

    Quite a number of theories have been developed to overcome the problems surrounding the SM, but so far none has been experimentally verified. This fact has left the world of physics brimming with anticipation. In the end, science has shown time and again that it can find new, creative ways to surmount any obstacles placed along its path. And one such way is for experiment to assume the leading role, so that it can help get the stuck wagon of particle physics moving and out of the mire.

    In this regard, the FCC study was launched by CERN in 2013 as a global effort to explore different scenarios for particle colliders that could inaugurate the post-LHC era and to advance key technologies. A staged approach, it entails the construction of an electron-positron collider followed by a proton collider, which would present an eightfold energy leap compared to the LHC and thus grant us direct access to a previously unexplored regime. Both colliders will be housed in a new 100-kilometer-circumference tunnel. The FCC study complements previous design studies for linear colliders in Europe (CLIC) and Japan (ILC), while China also has similar plans for a large-scale circular collider (CEPC).

    CERN/CLIC

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

    China Circular Electron Positron Collider (CEPC) map

    Future colliders could offer a deep understanding of the Higgs properties, but even more importantly, they represent an opportunity for exploring uncharted territory in an unprecedented energy scale. As Gian Giudice, head of CERN’s Theoretical Physics Department, argues: “High-energy colliders remain an indispensable and irreplaceable tool to continue our exploration of the inner workings of the universe.”

    Nevertheless, the FCC is seen by some as a questionable scientific investment in the absence of clear theoretical guidance about where the elusive new physics may lie. The history of physics, however, offers evidence in support of a different view: that experiments often play a leading and exploratory role in the progress of science.

    As the eminent historian of physics Peter Galison puts it, we have to “step down from the aristocratic view of physics that treats the discipline as if all interesting questions are structured by high theory.” Besides, quite a few experiments have been realized without being guided by a well-established theory but were instead undertaken for the purposes of exploring new domains. Let us examine some illuminating examples.

    In the 16th century, King Frederick II of Denmark financed Uraniborg, an early research center, where Tycho Brahe constructed large astronomical instruments, like a huge mural quadrant (unfortunately, the telescope was invented a few years later) and carried out many detailed observations that had not previously been possible. The realization of an enormous experimental structure, at a hitherto unprecedented scale, transformed our view of the world. Tycho Brahe’s precise astronomical measurements enabled Johannes Kepler to develop his laws of planetary motion and to make a significant contribution to the scientific revolution.

    The development of electromagnetism serves as another apt example: many electrical phenomena were discovered by physicists, such as Charles Dufay, André-Marie Ampère and Michael Faraday, in the 18th and 19th centuries, through experiments that had not been guided by any developed theory of electricity.

    Moving closer to the present day, we see that the entire history of particle physics is indeed full of similar cases. In the aftermath of World War II, a constant and laborious experimental effort characterized the field of particle physics, and it was what allowed the Standard Model to emerge through a “zoo” of newly discovered particles. As a prominent example, quarks, the fundamental constituents of the proton and neutron, were discovered through a number of exploratory experiments during the late 1960s at the Stanford Linear Accelerator (SLAC).

    The majority of practicing physicists recognize the exceptional importance of experiment as an exploratory process. For instance, Victor “Viki” Weisskopf, the former director-general of CERN and an icon of modern physics, grasped clearly the dynamics of the experimental process in the context of particle physics:

    “There are three kinds of physicists, namely the machine builders, the experimental physicists, and the theoretical physicists. If we compare those three classes, we find that the machine builders are the most important ones, because if they were not there, we would not get into this small-scale region of space. If we compare this with the discovery of America, the machine builders correspond to captains and ship builders who truly developed the techniques at that time. The experimentalists were those fellows on the ships who sailed to the other side of the world and then jumped upon the new islands and wrote down what they saw. The theoretical physicists are those fellows who stayed behind in Madrid and told Columbus that he was going to land in India.” (Weisskopf 1977)

    Despite being a theoretical physicist himself, he was able to recognize the exploratory character of experimentation in particle physics. Thus, his words eerily foreshadow the present era. As one of the most respected theoretical physicists of our time, Nima Arkani-Hamed, claimed in a recent interview, “when theorists are more confused, it’s the time for more, not less experiments.”

    The FCC, at present, strives to keep alive the exploratory spirit of the previous fabled colliders. It is not intended to be used as a verification tool for a specific theory but as a means of paving multiple experimental paths for the future. The experimental process should be allowed to develop its own momentum. This does not mean that experimentation and instrumentation should not maintain a close relationship with the theoretical community; at the end of the day, there is but one physics, and it must ensure its unity.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 4:47 pm on April 30, 2019 Permalink | Reply
    Tags: “Cosmology Has Some Big Problems”, New measurements of the Hubble constant (the rate of universal expansion) suggested major differences between two independent methods of calculation., Scientific American

    From Scientific American: “Cosmology Has Some Big Problems” 


    From Scientific American

    April 30, 2019
    Bjørn Ekeberg

    The field relies on a conceptual framework that has trouble accounting for new observations.

    Credit: Thanapol Sisrang/Getty Images

    What do we really know about our universe?

    Born out of a cosmic explosion 13.8 billion years ago, the universe rapidly inflated and then cooled; it is still expanding at an increasing rate and is mostly made up of unknown dark matter and dark energy … right?

    This well-known story is usually taken as a self-evident scientific fact, despite the relative lack of empirical evidence—and despite a steady crop of discrepancies arising with observations of the distant universe.

    In recent months, new measurements of the Hubble constant, the rate of universal expansion, have suggested major differences between two independent methods of calculation. Discrepancies in the expansion rate have huge implications not simply for calculation but for the validity of cosmology’s current standard model at the extreme scales of the cosmos.

    Another recent probe found galaxies inconsistent with the theory of dark matter, which posits this hypothetical substance to be everywhere. But according to the latest measurements, it is not, suggesting the theory needs to be reexamined.

    It’s perhaps worth stopping to ask why astrophysicists hypothesize dark matter to be everywhere in the universe. The answer lies in a peculiar feature of cosmological physics that is not often remarked upon. A crucial function of theories such as dark matter, dark energy and inflation, each of which is tied in its own way to the big bang paradigm, is not to describe known empirical phenomena but rather to maintain the mathematical coherence of the framework itself while accounting for discrepant observations. Fundamentally, they are names for something that must exist insofar as the framework is assumed to be universally valid.

    Each new discrepancy between observation and theory can of course be considered, in and of itself, an exciting promise of more research, a progressive refinement toward the truth. But as the discrepancies add up, they could also point to a more confounding problem, one that is not resolved by tweaking parameters or adding new variables.

    Consider the context of the problem and its history. As a mathematically driven science, cosmological physics is usually thought to be extremely precise. But the cosmos is unlike any scientific subject matter on earth. A theory of the entire universe, based on our own tiny neighborhood as the only known sample of it, requires a lot of simplifying assumptions. When these assumptions are multiplied and stretched across vast distances, the potential for error increases, and this is further compounded by our very limited means of testing.

    Historically, Newton’s physical laws made up a theoretical framework that described our own solar system with remarkable precision. Neptune, for example, was discovered through predictions based on Newtonian analysis of irregularities in Uranus’s orbit. But as the scales grew larger, the model’s validity proved limited. Einstein’s general relativity framework extended that precise reach well beyond our own galaxy. But just how far could it go?

    The big bang paradigm that emerged in the mid-20th century effectively stretches the model’s validity to a kind of infinity, defined either as the edge of the observable universe (a radius calculated at some 46 billion light-years) or as the beginning of time. This giant stretch is based on a few concrete discoveries, such as Edwin Hubble’s 1929 observation that the universe appears to be expanding and the 1964 detection of the cosmic microwave background radiation.
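    For readers curious where a figure like 46 billion light-years comes from, the following is a minimal sketch (not part of the original article) that numerically integrates a flat Lambda-CDM expansion history to estimate the comoving radius of the observable universe. The density parameters and Hubble constant are illustrative, roughly Planck-like values, and the small radiation contribution is ignored for simplicity.

        import numpy as np
        from scipy.integrate import quad

        H0 = 67.7                            # Hubble constant in km/s/Mpc (illustrative)
        omega_m, omega_lambda = 0.31, 0.69   # matter and dark-energy fractions (illustrative)
        c = 299792.458                       # speed of light in km/s

        def inverse_E(z):
            # Dimensionless inverse expansion rate 1/E(z) for flat Lambda-CDM,
            # neglecting radiation.
            return 1.0 / np.sqrt(omega_m * (1.0 + z)**3 + omega_lambda)

        # Comoving particle horizon: D = (c / H0) * integral of dz / E(z) from 0 to infinity
        integral, _ = quad(inverse_E, 0.0, np.inf)
        distance_mpc = (c / H0) * integral
        distance_gly = distance_mpc * 3.2616e6 / 1e9   # megaparsecs -> billions of light-years

        print(f"Comoving radius: ~{distance_gly:.0f} billion light-years")  # roughly 46-47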

    2
    The 15-meter Holmdel horn antenna at Bell Telephone Laboratories in Holmdel, New Jersey, was built in 1959 for pioneering work on communication satellites for NASA’s Project Echo. The antenna is 50 feet long, and the entire structure weighs about 18 tons; it is made of aluminum on a steel base. It was used to detect radio waves bounced off Project Echo balloon satellites and was later modified to receive broadcast signals from the Telstar communications satellite. In 1964, radio astronomers Robert Wilson and Arno Penzias discovered the cosmic microwave background radiation with it, for which they were awarded the 1978 Nobel Prize in Physics. In 1990 the horn was designated a National Historic Landmark.

    But considering the scale involved, these limited observations have had an outsized influence on cosmological theory.

    Edwin Hubble at the 100-inch Hooker Telescope on Mount Wilson in Southern California, where in 1929 he discovered that the universe is expanding.

    NASA Cosmic Background Explorer (COBE), 1989–1993.

    Cosmic microwave background as mapped by NASA/WMAP (2001–2010).

    Cosmic microwave background as mapped by ESA/Planck (2009–2013).

    It is of course entirely plausible that the validity of general relativity breaks down much closer to home than at the hypothetical edge of the universe. And if that were the case, today’s multilayered theoretical edifice of the big bang paradigm would turn out to be a confusing mix of fictional beasts invented to uphold the model and empirically valid variables, mutually reliant on each other to the point of making it impossible to sort science from fiction.

    Compounding this problem, most observations of the universe occur experimentally and indirectly. Today’s space telescopes provide no direct view of anything—they produce measurements through an interplay of theoretical predictions and pliable parameters, in which the model is involved every step of the way. The framework literally frames the problem; it determines where and how to observe. And so, despite the advanced technologies and methods involved, the profound limitations to the endeavor also increase the risk of being led astray by the kind of assumptions that cannot be calculated.

    After spending many years researching the foundations of cosmological physics from a philosophy of science perspective, I have not been surprised to hear some scientists openly talking about a crisis in cosmology. In the big “inflation debate” in Scientific American a few years ago, a key piece of the big bang paradigm was criticized by one of the theory’s original proponents for having become indefensible as a scientific theory.

    Why? Because inflation theory relies on ad hoc contrivances to accommodate almost any data, and because its proposed physical field is not based on anything with empirical justification. This is probably because a crucial function of inflation is to bridge the transition from an unknowable big bang to a physics we can recognize today. So, is it science or a convenient invention?

    Inflation

    4
    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation. Credit: HPHS Owls.

    Lambda-Cold Dark Matter, accelerated expansion of the universe, big bang and inflation (timeline of the universe), 2010. Credit: Alex Mittelmann, Coldcreation.

    Alan Guth’s notes:
    5

    A few astrophysicists, such as Michael J. Disney, have criticized the big bang paradigm for its lack of demonstrated certainties. In his analysis, the theoretical framework rests on far fewer certain observations than the free parameters used to fit them—a so-called “negative significance” that would be an alarming sign for any science. As Disney writes in American Scientist: “A skeptic is entitled to feel that a negative significance, after so much time, effort and trimming, is nothing more than one would expect of a folktale constantly re-edited to fit inconvenient new observations.”

    As I discuss in my new book, Metaphysical Experiments, there is a deeper history behind the current problems. The big bang hypothesis itself originally emerged as an indirect consequence of general relativity undergoing remodeling. Einstein had made a fundamental assumption about the universe, that it was static in both space and time, and to make his equations add up, he added a “cosmological constant,” for which he freely admitted there was no physical justification.

    But when Hubble observed that the universe was expanding and Einstein’s solution no longer seemed to make sense, some mathematical physicists proposed changing a fundamental assumption of the model: the universe, they suggested, was the same in all spatial directions but varied in time. Not insignificantly, this theory came with a very promising upside: a possible merger between cosmology and nuclear physics. Could the brave new model of the atom also explain our universe?

    From the outset, the theory only spoke to the immediate aftermath of an explicitly hypothetical event, whose principal function was as a limit condition, the point at which the theory breaks down. Big bang theory says nothing about the big bang; it is rather a possible hypothetical premise for resolving general relativity.

    On top of this undemonstrable but very productive hypothesis, floor upon floor has been added intact, with vastly extended scales and new discrepancies. To explain observations of galaxies inconsistent with general relativity, the existence of dark matter was posited as an unknown and invisible form of matter calculated to make up more than a quarter of all mass-energy content in the universe—assuming, of course, the framework is universally valid.

    Fritz Zwicky discovered dark matter when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did most of the work on dark matter.

    Fritz Zwicky, from http://palomarskies.blogspot.com

    Astronomer Vera Rubin, who worked on dark matter, at the Lowell Observatory in 1965. (The Carnegie Institution for Science)

    Coma cluster via NASA/ESA Hubble

    In 1998, when a set of supernova measurements indicated that the universe’s expansion is accelerating, seemingly at odds with the framework, a new theory emerged of a mysterious force called dark energy, calculated to fill circa 70 percent of the mass-energy of the universe.

    [The Supernova Cosmology Project is one of two research teams that determined the likelihood of an accelerating universe and therefore a positive cosmological constant, using data from the redshift of Type Ia supernovae. The project was headed by Saul Perlmutter at Lawrence Berkeley National Laboratory, with members from Australia, Chile, France, Portugal, Spain, Sweden, the United Kingdom, and the United States.

    This discovery was named “Breakthrough of the Year for 1998” by Science magazine and, along with the High-z Supernova Search Team, the project team won the 2007 Gruber Prize in Cosmology and the 2015 Breakthrough Prize in Fundamental Physics. In 2011, Perlmutter was awarded the Nobel Prize in Physics for this work, alongside Adam Riess and Brian P. Schmidt of the High-z team.]

    The crux of today’s cosmological paradigm is that in order to maintain a mathematically unified theory valid for the entire universe, we must accept that 95 percent of our cosmos is furnished by completely unknown elements and forces for which we have no empirical evidence whatsoever. For a scientist to be confident of this picture requires an exceptional faith in the power of mathematical unification.
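    As a rough tally of where that 95 percent figure comes from, here is a minimal sketch (not from the article) using approximate, Planck-like density fractions; the exact numbers depend on the analysis being cited.

        # Approximate present-day energy budget of the universe (illustrative values).
        budget = {
            "ordinary (baryonic) matter": 0.05,
            "dark matter": 0.27,
            "dark energy": 0.68,
        }

        dark_fraction = budget["dark matter"] + budget["dark energy"]
        print(f"Total accounted for: {sum(budget.values()):.0%}")                     # ~100%
        print(f"Empirically identified: {budget['ordinary (baryonic) matter']:.0%}")  # ~5%
        print(f"Inferred, never directly detected: {dark_fraction:.0%}")              # ~95%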

    In the end, the conundrum for cosmology is its reliance on the framework as a necessary presupposition for conducting research. For lack of a clear alternative, as astrophysicist Disney also notes, it is in a sense stuck with the paradigm. It seems more pragmatic to add new theoretical floors than to rethink the fundamentals.

    Contrary to the scientific ideal of getting progressively closer to the truth, it looks rather like cosmology, to borrow a term from technology studies, has become path-dependent: overdetermined by the implications of its past inventions.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 5:44 pm on April 17, 2019 Permalink | Reply
    Tags: , , Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum, , , , Scientific American   

    From Scientific American: “Cosmologist Lee Smolin says that at certain key points, the scientific worldview is based on fallacious reasoning” 

    Scientific American

    From Scientific American

    April 17, 2019
    Jim Daley

    Lee Smolin, author of six books about the philosophical issues raised by contemporary physics, says every time he writes a new one, the experience completely changes the direction his own research is taking. In his latest book, Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum, Smolin, a cosmologist and quantum theorist at the Perimeter Institute for Theoretical Physics in Ontario, tackles what he sees as the limitations in quantum theory.

    1
    Credit: Perimeter Institute

    “I want to say the scientific worldview is based on fallacious reasoning at certain key points,” Smolin says. In Einstein’s Unfinished Revolution, he argues one of those key points was the assumption that quantum physics is a complete theory. This incompleteness, Smolin argues, is the reason quantum physics has not been able to solve certain questions about the universe.

    “Most of what we do [in science] is take the laws that have been discovered by experiments to apply to parts of the universe, and just assume that they can be scaled up to apply to the whole universe,” Smolin says. “I’m going to be suggesting that’s wrong.”

    Join Smolin at the Perimeter Institute as he discusses his book and takes the audience on a journey through the basics of quantum physics and the experiments and scientists who have changed our understanding of the universe. The discussion, “Einstein’s Unfinished Revolution,” is part of Perimeter’s public lecture series and will take place on Wednesday, April 17, at 7 P.M. Eastern time. Online viewers can participate in the discussion by tweeting to @Perimeter using the #piLIVE hashtag.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 8:17 am on April 5, 2019 Permalink | Reply
    Tags: , In string theory a “solution” implies a vacuum of spacetime that is governed by Einstein’s theory of gravity coupled to a quantum field theory., In the past two decades a new branch of string theory called F-theory has allowed physicists to work with strongly interacting or strongly coupled strings, , Scientific American, String theorists can use algebraic geometry to analyze the various ways of compactifying extra dimensions in F-theory and to find solutions., ,   

    From Scientific American: “Found: A Quadrillion Ways for String Theory to Make Our Universe” 

    Scientific American

    From Scientific American

    Mar 29, 2019
    Anil Ananthaswamy

    Stemming from the “F-theory” branch of string theory, each solution replicates key features of the standard model of particle physics.

    1
    Photo: dianaarturovna/Getty Images

    Physicists who have been roaming the “landscape” of string theory — the space of zillions and zillions of mathematical solutions of the theory, where each solution provides the kinds of equations physicists need to describe reality — have stumbled upon a subset of such equations that have the same set of matter particles as exists in our universe.

    String theory depiction: cross section of the quintic Calabi–Yau manifold. Credit: Jbourjai (using Mathematica output).

    Standard Model of Supersymmetry via DESY

    But this is no small subset: there are at least a quadrillion such solutions, making it the largest such set ever found in string theory.

    According to string theory, all particles and fundamental forces arise from the vibrational states of tiny strings. For mathematical consistency, these strings vibrate in 10-dimensional spacetime. And for consistency with our familiar everyday experience of the universe, with three spatial dimensions and the dimension of time, the additional six dimensions are “compactified” so as to be undetectable.

    Different compactifications lead to different solutions. In string theory, a “solution” implies a vacuum of spacetime that is governed by Einstein’s theory of gravity coupled to a quantum field theory. Each solution describes a unique universe, with its own set of particles, fundamental forces and other such defining properties.

    Some string theorists have focused their efforts on trying to find ways to connect string theory to properties of our known, observable universe — particularly the standard model of particle physics, which describes all known particles and all their mutual forces except gravity.

    Much of this effort has involved a version of string theory in which the strings interact weakly. However, in the past two decades, a new branch of string theory called F-theory has allowed physicists to work with strongly interacting, or strongly coupled, strings.

    ____________________________________________________
    F-theory is a branch of string theory developed by Cumrun Vafa. The new vacua described by F-theory were discovered by Vafa and allowed string theorists to construct new realistic vacua — in the form of F-theory compactified on elliptically fibered Calabi–Yau four-folds. The letter “F” supposedly stands for “Father”.

    F-theory is formally a 12-dimensional theory, but the only way to obtain an acceptable background is to compactify this theory on a two-torus. By doing so, one obtains type IIB superstring theory in 10 dimensions. The SL(2,Z) S-duality symmetry of the resulting type IIB string theory is manifest because it arises as the group of large diffeomorphisms of the two-dimensional torus.

    More generally, one can compactify F-theory on an elliptically fibered manifold (elliptic fibration), i.e. a fiber bundle whose fiber is a two-dimensional torus (also called an elliptic curve). For example, a subclass of the K3 manifolds is elliptically fibered, and F-theory on a K3 manifold is dual to heterotic string theory on a two-torus. Also, the moduli spaces of those theories should be isomorphic.

    The large number of semirealistic solutions to string theory, referred to as the string theory landscape, with roughly 10^272,000 elements, is dominated by F-theory compactifications on Calabi–Yau four-folds. About 10^15 of those solutions are consistent with the Standard Model of particle physics.

    -Wikipedia

    ____________________________________________________

    “An intriguing, surprising result is that when the coupling is large, we can start describing the theory very geometrically,” says Mirjam Cvetic of the University of Pennsylvania in Philadelphia.

    This means that string theorists can use algebraic geometry — which uses algebraic techniques to tackle geometric problems — to analyze the various ways of compactifying extra dimensions in F-theory and to find solutions. Mathematicians have been independently studying some of the geometric forms that appear in F-theory. “They provide us physicists a vast toolkit”, says Ling Lin, also of the University of Pennsylvania. “The geometry is really the key… it is the ‘language’ that makes F-theory such a powerful framework.”

    Now, Cvetic, Lin, James Halverson of Northeastern University in Boston, and their colleagues have used such techniques to identify a class of solutions with string vibrational modes that lead to a similar spectrum of fermions (or, particles of matter) as is described by the standard model — including the property that all fermions come in three generations (for example, the electron, muon and tau are the three generations of one type of fermion).

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)
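    As a concrete reminder of what “three generations” means here, the small illustrative data structure below (not taken from the paper or the article) lists the standard model’s matter particles generation by generation.

        # The three generations of standard-model fermions (matter particles).
        # Each generation contains two quarks and two leptons; the later
        # generations are heavier copies of the first.
        generations = [
            {"quarks": ("up", "down"), "leptons": ("electron", "electron neutrino")},
            {"quarks": ("charm", "strange"), "leptons": ("muon", "muon neutrino")},
            {"quarks": ("top", "bottom"), "leptons": ("tau", "tau neutrino")},
        ]

        for i, gen in enumerate(generations, start=1):
            print(f"Generation {i}: quarks {gen['quarks']}, leptons {gen['leptons']}")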

    The F-theory solutions found by Cvetic and colleagues have particles that also exhibit the handedness, or chirality, of the standard model particles. In particle physics lingo, the solutions reproduce the exact “chiral spectrum” of standard model particles. For example, the quarks and leptons in these solutions come in left and right-handed versions, as they do in our universe.

    The new work shows that there are at least a quadrillion solutions in which particles have the same chiral spectrum as the standard model, which is 10 orders of magnitude more solutions than had been found within string theory until now. “This is by far the largest domain of standard model solutions,” Cvetic says. “It’s somehow surprising and actually also rewarding that it turns out to be in the strongly coupled string theory regime, where geometry helped us.”

    A quadrillion — while it’s much, much smaller than the size of the landscape of solutions in F-theory (which at last count was shown to be of the order of 10^272,000) — is a tremendously large number. “And because it’s a tremendously large number, and it gets something nontrivial in real world particle physics correct, we should take it seriously and study it further,” Halverson says.
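    To get a feel for the relative sizes involved, here is a trivial back-of-the-envelope sketch (not from the article) comparing the quadrillion standard-model-like solutions with the estimated size of the full F-theory landscape.

        from math import log10

        n_standard_model_like = 10**15   # "a quadrillion" solutions reported in this work
        landscape_exponent = 272_000     # landscape size estimated at roughly 10^272,000

        # The fraction these solutions represent, expressed as an exponent
        # (the number itself is far too small to store as an ordinary float).
        fraction_exponent = log10(n_standard_model_like) - landscape_exponent
        print(f"Fraction of the landscape: about 10^{fraction_exponent:.0f}")  # ~10^-271,985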

    Further study would involve uncovering stronger connections with the particle physics of the real world. The researchers still have to work out the couplings or interactions between particles in the F-theory solutions — which again depend on the geometric details of the compactifications of the extra dimensions.

    It could be that within the space of the quadrillion solutions, there are some with couplings that could cause the proton to decay within observable timescales. This would clearly be at odds with the real world, as experiments have yet to see any sign of protons decaying. Alternatively, physicists could search for solutions that realize the spectrum of standard model particles that preserve a mathematical symmetry called R-parity. “This symmetry forbids certain proton decay processes and would be very attractive from a particle physics point of view, but is missing in our current models,” Lin says.

    Also, the work assumes supersymmetry, which means that all the standard model particles have partner particles. String theory needs this symmetry in order to ensure the mathematical consistency of solutions.

    But in order for any supersymmetric theory to tally with the observable universe, the symmetry has to be broken (much like how a diner’s selection of cutlery and drinking glass on her left or right side will “break” the symmetry of the table setting at a round dinner table). Else, the partner particles would have the same mass as standard model particles — and that is clearly not the case, since we don’t observe any such partner particles in our experiments.

    Crucially, experiments at the Large Hadron Collider (LHC) have also shown that supersymmetry — if it is the correct description of nature — must be broken at energy scales beyond those probed by the LHC, given that the LHC has yet to find any supersymmetric particles.

    String theorists think that supersymmetry might be broken only at extremely high energies that are not within experimental reach anytime soon. “The expectation in string theory is that high-scale [supersymmetry] breaking, which is fully consistent with LHC data, is completely possible,” Halverson says. “It requires further analysis to determine whether or not it happens in our case.”

    Despite these caveats, other string theorists are approving of the new work. “This is definitely a step forward in demonstrating that string theory gives rise to many solutions with features of the standard model,” says string theorist Washington Taylor of MIT.

    “It’s very nice work,” says Cumrun Vafa, one of the developers of F-theory, at Harvard University. “The fact you can arrange the geometry and topology to fit with not only Einstein’s equations, but also with the [particle] spectrum that we want, is not trivial. It works out nicely here.”

    But Vafa and Taylor both caution that these solutions are far from matching perfectly with the standard model. Getting solutions to match exactly with the particle physics of our world is one of the ultimate goals of string theory. Vafa is among those who think that, despite the immensity of the landscape of solutions, there exists a unique solution that matches our universe. “I bet there is exactly one,” he says. But, “to pinpoint this is not going to be easy.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     