Tagged: Nautilus

  • richardmitnick 11:00 am on February 7, 2019 Permalink | Reply
    Tags: Abraham (Avi) Loeb, Black Hole Initiative, Black Hole Institute, Infrared results beautifully complemented by observations at radio wavelengths, Nautilus, S-02, The development of high-resolution infrared cameras revealed a dense cluster of stars at the center of the Milky Way

    From Nautilus: “How Supermassive Black Holes Were Discovered” 

    February 7, 2019
    Mark J. Reid, CfA SAO

    Astronomers turned a fantastic concept into reality.

    An Introduction to the Black Hole Institute

    Fittingly, the Black Hole Initiative (BHI) was founded 100 years after Karl Schwarzschild solved Einstein’s equations for general relativity—a solution that described a black hole decades before the first astronomical evidence that they exist. As exotic structures of spacetime, black holes continue to fascinate astronomers, physicists, mathematicians, philosophers, and the general public, following on a century of research into their mysterious nature.

    Pictor A Blast from Black Hole in a Galaxy Far, Far Away

    A computer-simulated image of a supermassive black hole at the core of a galaxy. Credit: NASA, ESA, and D. Coe, J. Anderson

    The mission of the BHI is interdisciplinary and, to that end, we sponsor many events that create the environment to support interaction between researchers of different disciplines. Philosophers speak with mathematicians, physicists, and astronomers; theorists speak with observers; and a series of scheduled events creates the venue for people to come together regularly.

    As an example, for a problem we care about, consider the singularities at the centers of black holes, which mark the breakdown of Einstein’s theory of gravity. What would a singularity look like in the quantum mechanical context? Most likely, it would appear as an extreme concentration of a huge mass (more than a few solar masses for astrophysical black holes) within a tiny volume. The size of the reservoir that drains all matter that fell into an astrophysical black hole is unknown and constitutes one of the unsolved problems on which BHI scholars work.

    We are delighted to present a collection of essays carefully selected by our senior faculty from the many entries to the first essay competition of the BHI. The winning essays will be published here on Nautilus over the next five weeks, beginning with the fifth-place finisher and working up to the first-place finisher. We hope that you will enjoy them as much as we did.

    —Abraham (Avi) Loeb
    Frank B. Baird, Jr. Professor of Science, Harvard University
    Chair, Harvard Astronomy Department
    Founding Director, Black Hole Initiative (BHI)

    In the 1700s, John Michell in England and Pierre-Simon Laplace in France independently thought “way out of the box” and imagined what would happen if a huge mass were placed in an incredibly small volume. Pushing this thought experiment to the limit, they conjectured that gravitational forces might not allow anything, even light, to escape. Michell and Laplace were imagining what we now call a black hole.

    Astronomers are now convinced that when massive stars burn through their nuclear fuel, they collapse to near nothingness and form a black hole. While the concept of a star collapsing to a black hole is astounding, the possibility that material from millions and even billions of stars can condense into a single supermassive black hole is even more fantastic.

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Yet astronomers are now confident that supermassive black holes exist and are found in the centers of most of the 100 billion galaxies in the universe.

    How did we come to this astonishing conclusion? The story begins in the mid-1900s when astronomers expanded their horizons beyond the very narrow range of wavelengths to which our eyes are sensitive. Very strong sources of radio waves were discovered and, when accurate positions were determined, many were found to be centered on distant galaxies. Shortly thereafter, radio antennas were linked together to greatly improve angular resolution.

    NRAO/Karl V Jansky Expanded Very Large Array, on the Plains of San Agustin fifty miles west of Socorro, NM, USA, at an elevation of 6970 ft (2124 m)

    ESO/NRAO/NAOJ ALMA Array in Chile in the Atacama at Chajnantor plateau, at 5,000 metres

    CfA Submillimeter Array, Mauna Kea, Hawaii, USA, 4,207 m (13,802 ft) above sea level

    These new “interferometers” revealed a totally unexpected picture of the radio emission from galaxies—the radio waves did not appear to come from the galaxy itself, but from two huge “lobes” symmetrically placed about the galaxy. Figure One shows an example of such a “radio galaxy,” named Cygnus A. Radio lobes can be among the largest structures in the universe, upward of a hundred times the size of the galaxy itself.

    Figure One: Radio image of the galaxy Cygnus A. Dominating the image are two huge “lobes” of radio-emitting plasma. An optical image of the host galaxy would be smaller than the gap between the lobes. The minimum energy needed to power some radio lobes can be equivalent to the total conversion of 10 million stars to energy! Note the thin trails of radio emission that connect the lobes with the bright spot at the center, where all of the energy originates. Credit: NRAO/AUI

    How are immense radio lobes energized? Their symmetrical placement about a galaxy clearly suggested a close relationship. In the 1960s, sensitive radio interferometers confirmed the circumstantial case for a relationship by discovering faint trails, or “jets,” tracing radio emission from the lobes back to a very compact source at the precise center of the galaxy. These findings motivated radio astronomers to increase the sizes of their interferometers in order to better resolve these emissions. Ultimately this led to the technique of Very Long Baseline Interferometry (VLBI), in which radio signals from antennas across the Earth are combined to obtain the angular resolution of a telescope the size of our planet!

    GMVA The Global VLBI Array

    Radio images made from VLBI observations soon revealed that the sources at the centers of radio galaxies are “microscopic” by galaxy standards, even smaller than the distance between the sun and our nearest star.
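
    A quick back-of-the-envelope diffraction estimate shows why linking antennas across the planet buys that kind of sharpness. This is a minimal sketch, not from the article; the 1.3-centimeter observing wavelength is an assumed, commonly used VLBI band:

```python
import math

# Diffraction-limited resolution of an interferometer: theta ~ lambda / D.
wavelength_m = 0.013      # 1.3 cm observing band (assumed, typical for VLBI)
baseline_m = 1.2742e7     # Earth's diameter, the longest ground baseline

theta_rad = wavelength_m / baseline_m
theta_mas = math.degrees(theta_rad) * 3600 * 1e3   # radians -> milliarcseconds

print(f"Resolution ~ {theta_mas:.2f} milliarcseconds")
# ~0.21 mas: at the galactic center's distance this corresponds to roughly
# an astronomical unit, far finer than the sun-to-nearest-star scale.
```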

    When astronomers calculated the energy needed to power radio lobes they were astounded. It required 10 million stars to be “vaporized,” totally converting their mass to energy using Einstein’s famous equation E = mc²! Nuclear reactions, which power stars, cannot even convert 1 percent of a star’s mass to energy. So trying to explain the energy in radio lobes with nuclear power would require more than 1 billion stars, and these stars would have to live within the “microscopic” volume indicated by the VLBI observations. Because of these findings, astronomers began considering alternative energy sources: supermassive black holes.
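
    To make the energy bookkeeping concrete, here is a minimal sketch with rounded constants; only the 10-million-star figure and the sub-1-percent fusion efficiency come from the text (the 0.7 percent used below is the standard value for hydrogen burning):

```python
M_SUN = 2.0e30   # kg, solar mass (rounded)
C = 3.0e8        # m/s, speed of light (rounded)

# E = mc^2 for the total conversion of 10 million stars:
lobe_energy = 1e7 * M_SUN * C**2
print(f"Energy budget: {lobe_energy:.1e} J")        # ~1.8e54 J

# Hydrogen fusion converts only ~0.7% of a star's mass to energy,
# so powering the lobes by nuclear burning alone would need:
stars_needed = 1e7 / 0.007
print(f"Stars needed at 0.7% efficiency: {stars_needed:.1e}")   # >1 billion
```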

    Given that the centers of galaxies might harbor supermassive black holes, it was natural to check the center of our Milky Way galaxy for such a monster. In 1974, a very compact radio source, smaller than 1 second of arc (1/3600 of a degree), was discovered there. The compact source was named Sagittarius A*, or Sgr A* for short, and is shown at the center of the right panel of Figure Two. Early VLBI observations established that Sgr A* was far more compact than the size of our solar system. However, no obvious optical, infrared, or even X-ray emitting source could be confidently identified with it, and its nature remained mysterious.

    Figure Two: Images of the central region of the Milky Way. The left panel shows an infrared image. The orbital track of star S2 is overlaid, magnified by a factor of 100. The orbit has a period of 16 years, requires an unseen mass of 4 million times that of the sun, and the gravitational center is indicated by the arrow. The right panel shows a radio image. The point-like radio source Sgr A* (just below the middle of the image) is precisely at the gravitational center of the orbiting stars. Sgr A* is intrinsically motionless at the galactic center and, therefore, must be extremely massive. Left panel: R. Genzel; Right panel: J.-H. Zhao

    Star S0-2 Andrea Ghez Keck/UCLA Galactic Center Group

    Andrea’s favorite star, S0-2

    Andrea Ghez, astrophysicist and professor at the University of California, Los Angeles, who leads a team of scientists observing S2 for evidence of a supermassive black hole. UCLA Galactic Center Group

    SGR A and SGR A* from Penn State and NASA/Chandra

    Sgr A*, the supermassive black hole at the center of the Milky Way. NASA’s Chandra X-ray Observatory

    Meanwhile, the development of high-resolution infrared cameras revealed a dense cluster of stars at the center of the Milky Way. These stars cannot be seen at optical wavelengths, because visible light is totally absorbed by intervening dust. However, at infrared wavelengths 10 percent of their starlight makes its way to our telescopes, and astronomers have been measuring the positions of these stars for more than two decades. These observations culminated with the important discovery that stars are moving along elliptical paths, which are a unique characteristic of gravitational orbits. One of these stars has now been traced over a complete orbit, as shown in the left panel of Figure Two.

    Many stars have been followed along partial orbits, and all are consistent with orbits about a single object. Two stars have been observed to approach the center to within the size of our solar system, which by galaxy standards is very small. At this point, gravity is so strong that stars are orbiting at nearly 10,000 kilometers per second—fast enough to cross the Earth in one second! These measurements leave no doubt that the stars are responding to an unseen mass of 4 million times that of the sun. Combining this mass with the (astronomically) small volume indicated by the stellar orbits implies an extraordinarily high density. At this density it is hard to imagine how any type of matter would not collapse to form a black hole.
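
    The enclosed mass follows directly from Kepler’s third law. A minimal sketch: the 16-year period is from the article, while the semi-major axis of roughly 1,000 AU is an assumed value for the traced orbit, not a number given in the text:

```python
# Kepler's third law in convenient units: M [solar masses] = a^3 [AU] / P^2 [yr].
period_yr = 16.0              # orbital period of the traced star (from the text)
semi_major_axis_au = 1.0e3    # ~1,000 AU semi-major axis (assumed here)

mass_solar = semi_major_axis_au**3 / period_yr**2
print(f"Enclosed mass ~ {mass_solar:.1e} solar masses")
# ~3.9e6: consistent with the 4 million suns quoted for the unseen mass.
```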

    The infrared results just described are beautifully complemented by observations at radio wavelengths. In order to identify an infrared counterpart for Sgr A*, the position of the radio source needed to be precisely transferred to infrared images. An ingenious method to do this uses sources visible at both radio and infrared wavelengths to tie the reference frames together. Ideal sources are giant red stars, which are bright in the infrared and have strong emission at radio wavelengths from molecules surrounding them. By matching the positions of these stars at the two wavebands, the radio position of Sgr A* can be transferred to infrared images with an accuracy of 0.001 seconds of arc. This technique placed Sgr A* precisely at the position of the gravitational center of the orbiting stars.

    How much of the dark mass within the stellar orbits can be directly associated with the radio source Sgr A*? Were Sgr A* a star, it would be moving at over 10,000 kilometers per second in the strong gravitational field as other stars are observed to do. Only if Sgr A* is extremely massive would it move slowly. The position of Sgr A* has been monitored with VLBI techniques for over two decades, revealing that it is essentially stationary at the dynamical center of the Milky Way. Specifically, the component of Sgr A*’s intrinsic motion perpendicular to the plane of the Milky Way is less than one kilometer per second. By comparison, this is 30 times slower than the Earth orbits the sun. The discovery that Sgr A* is essentially stationary and anchors the galactic center requires that Sgr A* contains over 400,000 times the mass of the sun.

    Recent VLBI observations have shown that the size of the radio emission of Sgr A* is less than that contained within the orbit of Mercury. Combining this volume available to Sgr A* with the lower limit to its mass yields a staggeringly high density. This density is within a factor of less than 10 of the ultimate limit for a black hole. At such an extreme density, the evidence is overwhelming that Sgr A* is a supermassive black hole.

    These discoveries are elegant for their directness and simplicity. Orbits of stars provide an absolutely clear and unequivocal proof of a great unseen mass concentration. Finding that the compact radio source Sgr A* is at the precise location of the unseen mass and is motionless provides even more compelling evidence for a supermassive black hole. Together they form a simple, unique demonstration that the fantastic concept of a supermassive black hole is indeed a reality. John Michell and Pierre-Simon Laplace would be astounded to learn that their conjectures about black holes not only turned out to be correct, but were far grander than they ever could have imagined.

    Mark J. Reid is a senior astronomer at the Center for Astrophysics, Harvard & Smithsonian. He uses radio telescopes across the globe simultaneously to obtain the highest resolution images of newborn and dying stars, as well as black holes.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

  • richardmitnick 10:01 am on November 8, 2018 Permalink | Reply
    Tags: Nautilus, Sarah Stewart, Synestia, The Woman Who Reinvented the Moon

    From Nautilus: Women in STEM- “The Woman Who Reinvented the Moon” Sarah Stewart 

    November 8, 2018
    Brian Gallagher

    Sarah Stewart is living her ideal life—and it just got sweeter. The University of California, Davis planetary physicist recently won a MacArthur Foundation Fellowship, famously and unofficially known as the “genius grant,” for her work on the origin of Earth’s moon, upending a decades-old theory. She’s been awarded $625,000.

    “It’s an amazing concept to just say, ‘We’re going to give you the opportunity to do something, and we’re not going to tell you anything about what to do.’ That’s very unusual and freeing,” she told Nautilus, referring to the grant program. She was particularly thrilled by the recognition the award represents. The foundation speaks to several dozen of a candidate’s peers as a part of its vetting process. “What I really feel is appreciation for my colleagues,” she said. “That really touches me.”

    Nautilus spoke to Stewart during World Space Week, the theme of which, this year, is “Space Unites the World.” It compelled her to pen a poem, using the theme as a title. Nautilus asked Stewart about that, as well as how her laboratory experiments, which replicate the pressures and temperatures of planetary collisions, informed her model of the moon’s birth.

    Sarah Stewart. John D. & Catherine T. MacArthur Foundation

    How can space bring us together?

    This World Space Week is happening at a time when the world seems to be highlighting divisions. And so I wrote what I wrote as a response to that. Space exploration and discovery of things that are surprising and new is a way to bring everyone together, and enjoy the profound beauty of nature. And I would like us to spend more time talking about the things that bring us together.

    Like the moon. Give us a brief history of its origin theories.

    Next year, 2019, is the 50th anniversary of the Apollo moon landing. The rock samples that the Apollo missions brought back basically threw out every previous idea for the origin of the moon. Before the Apollo results were in, a Russian astronomer named Viktor Safronov had been developing models of how planets grow. He found that they grow into these sub- or proto-planet-size bodies that would then collide. A couple of different groups then independently proposed that a giant impact made a disc around the Earth that the moon accreted from. Over the past 50 years, that model became quantitative, predictive. Simulations showed that the moon should be made primarily out of the object that struck the proto-Earth. But the Apollo mission found that the moon is practically a twin of the Earth, particularly its mantle, in major elements and in isotopic ratios: The different weight elements are like fingerprints, present in the same abundances. Every single small asteroid and planet in the solar system has a different fingerprint, except the Earth and the moon. So the giant impact hypothesis was wrong. It’s a lesson in how science works—the giant impact hypothesis hung on for so long because there was no alternative model that hadn’t already been disproven.

    How is your proposal for the moon’s birth different?

    We changed the giant impact. And by changing it we specifically removed one of the original constraints. The original giant impact was proposed to set the length of day of the Earth, because angular momentum—the rotational equivalent of linear momentum—is a physical quantity that is conserved: If we go backward in time, the moon comes closer to the Earth. At the time the moon grew, the Earth would have been spinning with a five-hour day. So all of the giant impact models were tuned to give us a five-hour day for the Earth right after the giant impact. What we did was say, “Well, what if there were a way to change the angular momentum after the moon formed?” That would have to be through a dynamical interaction with the sun. What that means is that we could start the Earth spinning much faster—we were exploring models where the Earth had a two- to three-hour day after the giant impact.
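
    The five-hour figure can be roughed out from angular momentum conservation using today’s Earth-moon system. A minimal sketch with rounded textbook constants; it ignores the newborn moon’s own share of the momentum, so it lands a little under five hours:

```python
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
M_EARTH = 5.97e24      # kg
M_MOON = 7.35e22       # kg
A_MOON = 3.84e8        # m, moon's current orbital radius
I_EARTH = 0.33 * M_EARTH * 6.371e6**2   # Earth's moment of inertia, ~0.33 M R^2

# Angular momentum today: moon's orbit plus Earth's spin (24-hour day).
L_orbit = M_MOON * math.sqrt(G * M_EARTH * A_MOON)
L_spin = I_EARTH * 2 * math.pi / 86400

# Run the clock backward: put all of it into Earth's spin.
omega_early = (L_orbit + L_spin) / I_EARTH
print(f"Early day length ~ {2 * math.pi / omega_early / 3600:.1f} hours")  # ~4 h
```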

    What did a faster-spinning Earth do to your models?

    The surprising new thing is that when the Earth is hot, vaporized, and spinning quickly, it isn’t a planet anymore. There’s a boundary beyond which all of the Earth material cannot physically stay in an object that rotates altogether—we call that the co-rotation limit. A body that exceeds the co-rotation limit forms a new object that we named a synestia, a Greek-derived word that is meant to represent a connected structure. A synestia is a different object than a planet plus a disc. It has different internal dynamics. In this hot vaporized state, the hot gas in the disc can’t fall onto the planet, because the planet has an atmosphere that’s pushing that gas out. What ends up happening is that the rock vapor that forms a synestia cools by radiating to space, forms magma rain in the outer parts of the synestia, and that magma rain accretes to form the moon within the rock vapor that later cools to become the Earth.
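
    A toy version of the co-rotation limit helps fix the idea: for a rigidly spinning body, material at radius r can only keep up if gravity supplies the needed centripetal pull, omega^2 * r = G*M/r^2 at the limit. The sketch below assumes a 2.5-hour day, within the post-impact range mentioned above, and a point-mass balance rather than a real pressure-supported structure:

```python
import math

G = 6.674e-11
M_EARTH = 5.97e24            # kg
R_EARTH = 6.371e6            # m, the cold Earth's radius, for comparison
spin_period_s = 2.5 * 3600   # ~2.5-hour post-impact day (from the range above)

omega = 2 * math.pi / spin_period_s
r_corotation = (G * M_EARTH / omega**2) ** (1 / 3)   # omega^2 r = G M / r^2

print(f"Co-rotation radius ~ {r_corotation / R_EARTH:.1f} Earth radii")
# ~1.5 R_Earth: a hot, vapor-swollen Earth spinning this fast overflows
# the limit, which is the regime where a synestia forms.
```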

    How did the idea of a synestia come about?

    In 2012, Matija Ćuk and I published a paper that was a high-spin model for the origin of the moon. We changed the impact event, but we didn’t realize that after the impact, things were completely different. It just wasn’t anything we ever extracted from the simulations. It wasn’t until two years later when my student Simon Lock and I were looking at different plots, plots we had never made before out of the same simulations, that we realized that we had been interpreting what happened next incorrectly. There was a bona fide eureka moment where we’re sitting together talking about how the disc would evolve around the Earth after the impact, and realizing that it wasn’t a standard disc. These synestias have probably been sitting in people’s computer systems for quite some time without anyone ever actually identifying them as something different.

    Was the size of the synestia beyond the moon’s current orbit?

    It could have been bigger. Exactly how big it was depends on the energy of the event and how fast it was spinning. We don’t have precise constraints on that to make the moon because a range of synestias could make the moon.

    How long was the Earth in a synestia state?

    The synestia was very large, but it didn’t last very long. Because rock vapor is very hot, and where we are in the solar system is far enough away from the sun that our mean temperature is cooler than rock vapor, the synestia cooled very quickly. So it could last 1,000 years or so before looking like a normal planet again. Exactly how long it lasts depends on what else is happening in the solar system around the Earth. In order to be a long-lived object it would need to be very close to the star.

    What was the size of the object that struck proto-Earth?

    We can’t tell, because a variety of mass ratios, impact angles, and impact velocities can make a synestia that has enough mass and angular momentum in it to make our moon. I don’t know that we will ever know for sure exactly what hit us. There may be ways for us to constrain the possibilities. One way to do that is to look deep in the Earth for clues about how large the event could have been. There are chemical tracers from the deep mantle that indicate that the Earth wasn’t completely melted and mixed, even by the moon-forming event. Those reach the surface through what are called ocean island basalts, sometimes called mantle plumes, from near the core-mantle boundary, up through the whole mantle to the surface. That could serve as a constraint on how large the event could have been. Because the Earth and the moon are very similar in the mantles of the two bodies, that can be used to determine what is too small of an event. That would give us a range that can probably be satisfied by a number of different impact configurations.

    How much energy does it take to form a synestia?

    Giant impacts are tremendously energetic events. The energy of the event, in terms of the kinetic energy of the impact, is released over hours. The power involved is similar to the power, or luminosity, of the sun. We really cannot think of the Earth as looking anything like the Earth when you’ve just dumped the energy of the sun into this planet.
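
    A back-of-the-envelope check on the solar comparison; the impactor mass, speed, and deposition time below are assumed round values, not numbers from the interview:

```python
M_IMPACTOR = 6.4e23    # kg, roughly a Mars-mass body (assumed)
V_IMPACT = 1.0e4       # m/s, ~10 km/s impact speed (assumed)
T_RELEASE = 10 * 3600  # s, energy released over ~10 hours (assumed)
L_SUN = 3.8e26         # W, solar luminosity

kinetic_energy = 0.5 * M_IMPACTOR * V_IMPACT**2
power = kinetic_energy / T_RELEASE
print(f"Impact power ~ {power:.1e} W, ~{power / L_SUN:.1f} x the sun's luminosity")
```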

    How common are synestias?

    We actually think that synestias should happen quite frequently during rocky planet formation. We haven’t looked at the gas giant planets. There are some different physics that happen with those. But for growing rocky bodies like the Earth, we attempted to estimate the statistics of how often there should be synestias. An Earth-mass body anywhere in the universe is probably a synestia at least once while it’s growing. The likelihood of making a synestia goes up as the bodies become larger. Super-Earths also should have been a synestia at some point.

    You say that all of the pressures and temperatures reached during planet formation are now accessible in the laboratory. First, give us a sense of the magnitude of those pressures and temperatures, and then tell us how accessing them in labs is possible.

    The center of the Earth is at several thousand degrees, and has hundreds of gigapascals of pressure—about 3 million times more pressure than at the surface. Jupiter’s center is even hotter. The center-of-Jupiter pressures can be reached temporarily during a giant impact, as the bodies are colliding together. A giant impact and the center of Jupiter are about the limits of the pressures and temperatures reached during planet formation: so tens of thousands of degrees, and a million times the pressure at Earth’s surface. To replicate that, we need to dump energy into our rock or mineral very quickly in order to generate a shockwave that reaches these amplitudes in pressure and temperature. We use major minerals in the Earth, or rocky planets—so we’ve studied iron, quartz, forsterite, enstatite, and different alloy compositions of those. Other people have studied the hydrogen-helium mixture for Jupiter, and ices for Uranus and Neptune. In my lab we have light-gas guns, essentially cannons. And, using compressed hydrogen, we can launch a metal flyer plate—literally a thin disk—to almost 8 kilometers per second. We can reach the core pressures in the Earth, but I can’t reach the range of giant impacts or the center of Jupiter in my lab. But the Sandia Z machine, which is a big capacitor that launches metal plates using a magnetic force, can reach 40 kilometers per second. And with the National Ignition Facility laser at Lawrence Livermore National Lab, we can reach the pressures at the center of Jupiter.

    Sandia Z machine

    National Ignition Facility at LLNL
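
    For a feel of the pressures involved, here is the standard impedance-match estimate for a symmetric impact, where the particle velocity is half the flyer speed and the shock speed follows a linear Hugoniot, Us = c0 + s*up. The density and Hugoniot parameters below are illustrative placeholders, not measured values from Stewart’s lab:

```python
rho0 = 3200.0   # kg/m^3, forsterite-like density (illustrative assumption)
c0 = 6.0e3      # m/s, Hugoniot intercept (illustrative assumption)
s = 1.0         # Hugoniot slope (illustrative assumption)

v_flyer = 8.0e3          # m/s, gas-gun flyer speed quoted in the interview
up = v_flyer / 2         # particle velocity for a symmetric impact
us = c0 + s * up         # shock velocity from the linear Hugoniot
pressure_gpa = rho0 * us * up / 1e9   # Rankine-Hugoniot: P = rho0 * Us * up

print(f"Shock pressure ~ {pressure_gpa:.0f} GPa")
# ~130 GPa, i.e. deep-Earth pressures, near the core-mantle boundary.
```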

    What happens to the flyer plates when they’re shot?

    The target simply gets turned to dust after being vaporized and then cooling again. They’re very destructive experiments. You have to make real time measurements—of the wave itself and how fast it’s traveling—within tens of nanoseconds. That we can translate to pressure. My group has spent a lot of time developing ways to measure temperature, and to find phase boundaries. The work that led to the origin of the moon was specifically studying what it takes to vaporize Earth materials, and to determine the boiling points of rocks. We needed to know when it would be vaporized in order to calculate when something would become a synestia.

    How do you use your experimental results?

    What runs in our code is a simplified version of a planet. With our experiments we can simulate a simplified planet to infer the more complicated chemical system. Once we’ve determined the pressure-temperature of the average system, you can ask more detailed questions about the multi-component chemistry of a real planet. In the moon paper that was published this year, there are two big sections. One does the simplified modeling of the giant impact—it gives us the pressure-temperature range in the synestia. The other looks at the chemistry of the system that starts at these high pressures and temperatures and cools, but now using a more realistic model for the Earth.

    What was it like to get a call from the MacArthur Foundation?

    It did come out of the blue. They called me in my office, and I answered the phone. There were three people on the other end, and they said they were from the MacArthur Foundation. I knew what it was, and I stopped listening, because it was such a nice surprise. To me it probably is just unreal at the moment, meaning it will probably take some time to really sink in.

    How did you come to study planetary physics?

    I had enjoyed science fiction, not thinking I was going to be a scientist. But while I was in high school I had phenomenal math and physics teachers. That really grabbed my interest, so when I went to college I wanted to be a physics major. I quickly learned that the astronomers very much welcomed undergraduate researchers because the work was very accessible to someone with undergraduate skills. I met amazing scientists, and that sparked a whole career.

    What would you be doing if you weren’t a scientist?

    That’s hard. Because it has been my ideal for a very long time. In college I did a lot of theater. More theater than homework. The best theatrical experience I had was directing Sweeney Todd. It was absolutely amazing. So I did watch with some envy as some of my friends pursued a theatrical life. That is something that you can be wistful about, except that that would have been a hard path.

    NASA is celebrating its 60th anniversary. What does that mean to you as a scientist studying space?

    It feels like we’ve learned so much over 60 years, because we’ve had our first visits to everything in the solar system now. But at the same time, we’re completely surprised every time we arrive at a new object. So in some ways we’re still in the youthful period in planetary science, where we’re trying to work out basic knowledge. That’s a very exciting time. We’re still on a very big growth curve.

    See the full article here.

  • richardmitnick 2:23 pm on October 21, 2018 Permalink | Reply
    Tags: Are Black Holes Actually Dark Energy Stars?, Nautilus

    From Nautilus: “Are Black Holes Actually Dark Energy Stars?” 

    October 15, 2018
    Jesse Stone

    George Chapline believes that the Event Horizon Telescope will offer evidence that black holes are really dark energy stars. NASA.

    What does the supermassive black hole at the center of the Milky Way look like? Early next year, we might find out. The Event Horizon Telescope—really a virtual telescope with an effective diameter of the Earth—has been pointing at Sagittarius A* for the last several years.

    Event Horizon Telescope Array:

    Arizona Radio Observatory/Submillimeter-wave Astronomy (ARO/SMT)
    ESO/APEX Atacama Pathfinder EXperiment
    Combined Array for Research in Millimeter-wave Astronomy (CARMA; no longer in service)
    Atacama Submillimeter Telescope Experiment (ASTE)
    Caltech Submillimeter Observatory (CSO)
    Institut de Radioastronomie Millimetrique (IRAM) 30m
    IRAM NOEMA interferometer
    James Clerk Maxwell Telescope, Mauna Kea, Hawaii, USA
    Large Millimeter Telescope Alfonso Serrano
    CfA Submillimeter Array, Hawaii, SAO
    ESO/NRAO/NAOJ ALMA Array, Chile
    South Pole Telescope (SPTPOL)
    NSF/CfA Greenland Telescope

    Future arrays/telescopes:

    Plateau de Bure interferometer

    Most researchers in the astrophysics community expect that its images, taken from telescopes all over the Earth, will show the telltale signs of a black hole: a bright swirl of light, produced by a disc of gases trapped in the black hole’s orbit, surrounding a black shadow at the center—the event horizon. This encloses the region of space where the black-hole singularity’s gravitational pull is too strong for light to escape.
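
    A rough comparison of the instrument’s sharpness with the expected size of the shadow shows why the observation is feasible at all. A minimal sketch with rounded constants; the 4-million-solar-mass figure and the ~8-kiloparsec distance to the galactic center are standard assumed values, not numbers from this article:

```python
import math

G, C = 6.674e-11, 3.0e8
M_SGR_A = 4.0e6 * 2.0e30       # kg, ~4 million solar masses (assumed)
D_SGR_A = 8.0e3 * 3.086e16     # m, ~8 kpc to the galactic center (assumed)
RAD_TO_UAS = math.degrees(1) * 3600 * 1e6   # radians -> microarcseconds

theta_beam = 1.3e-3 / 1.2742e7            # lambda/D at 1.3 mm, Earth-size baseline
r_s = 2 * G * M_SGR_A / C**2              # Schwarzschild radius
theta_shadow = 2.6 * 2 * r_s / D_SGR_A    # shadow ~2.6x the horizon's diameter

print(f"EHT beam ~ {theta_beam * RAD_TO_UAS:.0f} microarcseconds")    # ~21
print(f"Shadow   ~ {theta_shadow * RAD_TO_UAS:.0f} microarcseconds")  # ~50
```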

    But George Chapline, a physicist at the Lawrence Livermore National Laboratory, doesn’t expect to see a black hole. He doesn’t believe they’re real. In 2005, he told Nature that “it’s a near certainty that black holes don’t exist” and—building on previous work he’d done with physics Nobel laureate Robert Laughlin—introduced an alternative model that he dubbed “dark energy stars.” Dark energy is a term physicists use to describe a peculiar kind of energy that appears to permeate the entire universe.

    Dark energy depiction. Image: Volker Springel/Max Planck Institute for Astrophysics

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    It expands the fabric of spacetime itself, even as gravity attempts to bring objects closer together. Chapline believes that the immense energies in a collapsing star cause its protons and neutrons to decay into a gas of photons and other elementary particles, along with what he refers to as “droplets of vacuum energy.” These form a “condensed” phase of spacetime—much like a gas under enough pressure transitions to liquid—that has a much higher density of dark energy than the spacetime surrounding the star. This provides the pressure necessary to hold gravity at bay and prevent a singularity from forming. Without a singularity in spacetime, there is no black hole.

    The idea has found no support in the astrophysical community—over the last decade, Chapline’s papers on this topic have garnered only single-digit citations. His most popular paper in particle physics, by contrast, has been cited over 600 times. But Chapline suspects his days of wandering in the scientific wilderness may soon be over. He believes that the Event Horizon Telescope will offer evidence that dark energy stars are real.

    The idea goes back to a 2000 paper [International Journal of Modern Physics A], with Evan Hohlfeld and David Santiago, in which Chapline and Laughlin modeled spacetime as a Bose-Einstein condensate—a state of matter that arises when taking an extremely low-density gas to extremely low temperatures, near absolute zero. Chapline and Laughlin’s model is quantum mechanical in nature: General relativity emerges as a consequence of the way that the spacetime condensate behaves on large scales. Spacetime in this model also undergoes phase transformations when it gains or loses energy. Other scientists find this to be a promising path, too. A 2009 paper [Physical Review A] by a group of Japanese physicists stated that “[Bose-Einstein Condensates] are one of the most promising quantum fluids for” analogizing curved spacetime.

    Chapline and Laughlin argue that they can describe the collapsed stars that most scientists take to be black holes as regions where spacetime has undergone a phase transition. They find that the laws of general relativity are valid everywhere in the vicinity of the collapsed star, except at the event horizon, which marks the boundary between two different phases of spacetime.

    In the condensate model the event horizon surrounding a collapsed star is no longer a point of no return but instead a traversable, physical surface. This feature, along with the lack of a singularity that is the signature feature of black holes, means that paradoxes associated with black holes, like the destruction of information, don’t arise. Laughlin has been reticent to conjecture too far beyond his and Chapline’s initial ideas. He believes Chapline is onto something with dark energy stars, “but where we part company is in the amount of speculating we are willing to do about what ‘phase’ of the vacuum might be inside” what most scientists call black holes, Laughlin said. He’s holding off until experimental data reveals more about the interior phase. “I will then write my second paper on the subject,” he said.

    In recent years Chapline has continued to refine his dark energy star model in collaboration with several other authors, including Pawel Mazur of the University of South Carolina and Piotr Marecki of Leipzig University. He’s concluded that dark energy stars aren’t spherical or oblate, like black holes. Instead, they have the shape of a torus, or donut. In a rotating compact object, like a dark energy star, Chapline believes quantum effects in the spacetime condensate generate a large vortex along the object’s axis of rotation. Because the region inside the vortex is empty—think of the depression that forms at the center of a whirlpool—the center of the dark energy star is hollow, like an apple without its core. A similar effect is observed when quantum mechanics is used to model rotating drops of superfluid. There too, a central vortex can form at the center of a rotating drop and, surprisingly, change its shape from a sphere to a torus.

    For Chapline, this strange toroidal geometry isn’t a bug of dark energy stars, but a feature, as it helps explain the origin and shape of astrophysical jets—the highly energetic beams of ionized matter that are generated along the axis of rotation of a compact object like a black hole. Chapline believes he’s identified a mechanism in dark energy stars that explains observations of astrophysical jets better than mainstream models, which posit that energy is extracted from the accretion disk outside of a black hole and focused into a narrow beam along the black hole’s axis of rotation. To Chapline, matter and energy falling toward a dark energy star would make its way to the inner throat (the “donut hole”), where electrons orbiting the throat would, as in a Biermann battery, generate magnetic fields powerful enough to drive the jets.

    Chapline points to recent experimental work where scientists, at the OMEGA Laser Facility at the University of Rochester, created magnetized jets using lasers to form a ring-like excitation on a flat surface.

    U Rochester Omega Laser facility

    Though the experiments were not conducted with dark energy stars in mind, Chapline believes they provide support for his theory, since the ring-like excitation—Chapline calls it a “ring of fire”—is exactly what he would expect to happen along the throat of a dark energy star. He believes the ring could be the key to supporting the existence of dark energy stars. “This ought to eventually show up clearly” in the Event Horizon Telescope images, Chapline said, referring to the ring.

    Black hole vs dark energy star: When viewed from the top down, a dark energy star has a central opening, the donut hole. Chapline believes that matter and energy rotating around the central opening (forming the “ring of fire”) are the source of the astrophysical jets observed by astronomers in the vicinity of what most believe to be black holes. No image credit.

    Chapline also points out that dark energy stars will not be completely opaque to light, as matter and light can pass into, but also out of, a dark energy star. A dark energy star won’t have a completely black interior—instead it will show a distorted image of any stars behind it. Other physicists, though, are skeptical that these kinds of deviations from conventional black hole models would show up in the Event Horizon Telescope data. Raul Carballo-Rubio, a physicist at the International School for Advanced Studies, in Trieste, Italy, has developed his own alternative model to black holes known as semi-classical relativistic stars. Speaking more generally about alternative black hole models, Carballo-Rubio said, “The differences [with black holes] that would arise in these models are too minute to be detected” by the Event Horizon Telescope.

    Chapline plans to discuss his dark energy star predictions in December, at the Kavli Institute for Theoretical Physics in Santa Barbara. But even if his predictions are confirmed, he said he doesn’t expect the scientific community to become convinced overnight. “I expect that for the next few years the [Event Horizon Telescope] people will be confused by what they see.”

    See the full article here.

  • richardmitnick 1:39 pm on September 30, 2018 Permalink | Reply
    Tags: Nautilus, Top 10 Design Flaws in the Human Body

    From Nautilus: “Top 10 Design Flaws in the Human Body” 

    May 14, 2015 [Just now in social media]
    By Chip Rowe; illustrations by Len Small

    From our knees to our eyeballs, our bodies are full of hack solutions [Did G-d do a bad job?].

    The Greeks were obsessed with the mathematically perfect body. But unfortunately for anyone chasing that ideal, we were designed not by Pygmalion, the mythical sculptor who carved a flawless woman, but by MacGyver. Evolution constructed our bodies with the biological equivalent of duct tape and lumber scraps. And the only way to refine the form (short of an asteroid strike or nuclear detonation to wipe clean the slate) is to jerry-rig the current model. “Evolution doesn’t produce perfection,” explains Alan Mann, a physical anthropologist at Princeton University. “It produces function.”

    With that in mind, I surveyed anatomists and biologists to compile a punch list for the human body, just as you’d do before buying a house. Get out your checkbook. This one’s a fixer-upper.

    1. An unsound spine

    Problem: Our spines are a mess. It’s a wonder we can even walk, says Bruce Latimer, director of the Center for Human Origins at Case Western Reserve University, in Cleveland. When our ancestors walked on all fours, their spines arched, like a bow, to withstand the weight of the organs suspended below. But then we stood up. That threw the system out of whack by 90 degrees, and the spine was forced to become a column. Next, to allow for bipedalism, it curved forward at the lower back. And to keep the head in balance—so that we didn’t all walk around as if doing the limbo—the upper spine curved in the opposite direction. This change put tremendous pressure on the lower vertebrae, sticking about 80 percent of adults, according to one estimate, with lower back pain.
    Fix: Go back to the arch. “Think of your dog,” Latimer says. “From the sacrum to the neck, it’s a single bow curve. That’s a great system.” Simple. Strong. Pain-free. There’s only one catch: To keep the weight of our heads from pitching us forward, we’d need to return to all fours.

    2. An inflexible knee

    Problem: As Latimer says, “You take the most complex joint in the body and put it between two huge levers—the femur and the tibia—and you’re looking for trouble.” The upshot is your knee only rotates in two directions: forward and back. “That’s why every major sport, except maybe rugby, makes it illegal to clip, or hit an opponent’s knee from the side.”

    Fix: Replace this hinge with a ball and socket, like in your shoulders and hips. We never developed this type of joint at the knee “because we didn’t need it,” Latimer says. “We didn’t know about football.”

    3. A too-narrow pelvis

    Problem: Childbirth hurts. And to add insult to injury, the width of a woman’s pelvis hasn’t changed for some 200,000 years, keeping our brains from growing larger.
    Fix: Sure, you could stretch out the pelvis, Latimer says, but technologists may already be onto a better solution. “I would bet that in 10,000 years, or even in 1,000 years, no woman in the developed world will deliver naturally. A clinic will combine the sperm and egg, and you’ll come by and pick up the kid.”

    4. Exposed testicles

    Problem: A man’s life-giving organs hang vulnerably outside the body.
    Fix: Moving the testicles indoors would save men the pain of getting hit in the nuts. To accomplish this, first you’d need to tweak the sperm, says Gordon Gallup, an evolutionary psychologist at the State University of New York at Albany. Apparently the testicles (unlike the ovaries) get thrown out in the cold because sperm must be kept at 2.5 to 3 degrees Fahrenheit below the body’s internal temperature. Gallup hypothesizes that these lower temperatures keep sperm relatively inactive until they enter the warm confines of a vagina, at which point they go racing off to fertilize the egg.1 This evolutionary hack prevents sperm from wearing themselves out too early. So change the algorithm, Gallup says. Keep the sperm at body temperature and make the vagina hotter. (And, by the way, there’s no need to draw up new blueprints: Elephants offer a pretty good prototype.)

    5. Crowded teeth

    Problem: Humans typically have three molars on each side of the upper and lower jaws near the back of the mouth. When our brain drastically expanded in size, the jaw grew wider and shorter, leaving no room for the third, farthest back molars. These cusped grinders may have been useful before we learned to cook and process food. But now the “wisdom teeth” mostly just get painfully impacted in the gums.
    Fix: Get rid of them. At one point, they appeared to be on their way out—about 25 percent of people today (most commonly Eskimos) are born without some or all of their third molars. In the meantime, we’ve figured out how to safely extract these teeth with dental tools, which, Mann notes, we probably wouldn’t have invented without the bigger brains. So you could call it a wash.

    6. Meandering arteries

    Problem: Blood flows into each of your arms and legs via one main artery, which enters the limb on the front side of the body, by the biceps or hip flexors. To supply blood to tissues at a limb’s back side, such as the triceps and hamstrings, the artery branches out, taking circuitous routes around bones and bundling itself with nerves. This roundabout plumbing can make for some rather annoying glitches. At the elbow, for instance, an artery branch meets up with the ulnar nerve, which animates your little finger, just under the skin. That’s why your arm goes numb when the lower tip of your upper arm bone, called the humerus or “funny bone,” takes a sharp blow.
    Fix: Feed a second artery into the back side of each arm and leg, by the shoulder blades or buttock, says Rui Diogo, an assistant professor of anatomy at Howard University, in Washington, DC, who studies the evolution of primate muscles. This extra pipe would provide a more direct route from the shoulder to the back of the hand, preventing vessels and nerves from wandering too close to the skin.

    7. A backward retina

    Problem: The photoreceptor cells in the retina of the eye are like microphones facing backward, writes Nathan Lents, an associate professor of molecular biology at the City University of New York. This design forces light to travel the length of each cell, as well as through blood and tissue, to reach the equivalent of a receiver on the cell’s backside. The setup may encourage the retina to detach from its supporting tissue—a leading cause of blindness. It also creates a blind spot where cell fibers, akin to microphone cables, converge at the optic nerve—forcing the brain to fill in the resulting hole in our field of vision.
    Fix: Poach the obvious solution from the octopus or the squid: Just flip the retina.

    8. A misrouted nerve

    Problem: The recurrent laryngeal nerve (RLN) plays a vital role in our ability to speak and swallow. It feeds instructions from the brain to the muscles of the voice box, or larynx, below the vocal cords. Theoretically, the trip should be a quick one. But during fetal development, the RLN gets entwined in a tiny lump of tissue in the neck, which descends to become blood vessels near the heart. That drop causes the nerve to loop around the aorta before traveling back up to the larynx. Having this nerve in your chest makes it vulnerable during surgery—or a fist fight.
    Fix: “This one’s easy,” says Rebecca Z. German, a professor of anatomy and neurobiology at Northeast Ohio Medical University, in Rootstown. While a baby is in utero, develop the RLN after sending that irksome neck lump of vessel tissue to the chest. That way, the nerve won’t get dragged down with it.

    9. A misplaced voice box

    Problem: The trachea (windpipe) and esophagus (food pipe) open into the same space, the pharynx, which extends from the nose and mouth to the larynx (voice box). To keep food out of the trachea, a leaf-shaped flap called the epiglottis reflexively covers the opening to the larynx whenever you swallow. But sometimes, the epiglottis isn’t fast enough. If you’re talking and laughing while eating, food may slip down and get lodged in your airway, causing you to choke.
    Fix: Take a cue from whales, whose larynx is located in their blowholes. If we moved the larynx into our nose, says German, we could have two independent tubes. Sure, we’d lose the ability to talk. But we could still communicate in song, as whales do, through vibrations in our nostrils.

    10. A klugey brain

    Problem: The human brain evolved in stages. As new additions were being built, older parts had to remain online to keep us up and running, explains psychologist Gary Marcus in his book Kluge: The Haphazard Construction of the Human Mind.2 And that live-in construction project led to slapdash workarounds. It’s as if the brain were a dysfunctional workplace, where young employees (the forebrain) handled newfangled technologies like language while the old guard (the midbrain and hindbrain) oversaw the institutional memory—and the fuse box in the basement. A few outcomes: depression, madness, unreliable memories, and confirmation bias.
    Fix: We’re screwed.

    References

    1. Gallup, G.G., Finn, M.M., & Sammis, B. On the origin of descended scrotal testicles: The activation hypothesis. Evolutionary Psychology 7, 517-526 (2009).

    2. Marcus, G. Kluge: The Haphazard Construction of the Human Mind. Houghton Mifflin, Boston, MA (2008).

    See the full article here.

  • richardmitnick 1:08 pm on September 30, 2018 Permalink | Reply
    Tags: Nautilus, Planck time, Time, Time can now be sliced into slivers as thin as one ten-trillionth of a second

    From Nautilus: “Is It Time to Get Rid of Time?” 

    September 20, 2018
    Marcia Bartusiak

    The crisis inside the physics of time.

    Poets often think of time as a river, a free-flowing stream that carries us from the radiant morning of birth to the golden twilight of old age. It is the span that separates the delicate bud of spring from the lush flower of summer.

    Physicists think of time in somewhat more practical terms. For them, time is a means of measuring change—an endless series of instants that, strung together like beads, turn an uncertain future into the present and the present into a definite past.

    The very concept of time allows researchers to calculate when a comet will round the sun or how a signal traverses a silicon chip. Each step in time provides a peek at the evolution of nature’s myriad phenomena.

    In other words, time is a tool. In fact, it was the first scientific tool. Time can now be sliced into slivers as thin as one ten-trillionth of a second.

    Planck Time. Universe Today

    From Quarks to Quasars

    But what is being sliced? Unlike mass and distance, time cannot be perceived by our physical senses. We don’t see, hear, smell, touch, or taste time. And yet we somehow measure it. As a cadre of theorists attempt to extend and refine the general theory of relativity, Einstein’s momentous law of gravitation, they have a problem with time. A big problem.

    Slicing it thin: A hydrogen maser clock keeps time by exploiting the so-called hyperfine transition. Wikimedia Commons

    “It’s a crisis,” says mathematician John Baez, of the University of California at Riverside, “and the solution may take physics in a new direction.” Not the physics of our everyday world. Stopwatches, pendulums, and hydrogen maser clocks will continue to keep track of nature quite nicely here in our low-energy earthly environs. The crisis arises when physicists attempt to merge the macrocosm—the universe on its grandest scale—with the microcosm of subatomic particles.

    Under Newton, time was special. Every moment was tallied by a universal clock that stood separate and apart from the phenomenon under study. In general relativity, this is no longer true. Einstein declared that time is not absolute—no particular clock is special—and his equations describing how the gravitational force works take this into account. His law of gravity looks the same no matter what timepiece you happen to be using as your gauge. “In general relativity time is completely arbitrary,” explains theoretical physicist Christopher Isham of Imperial College in London. “The actual physical predictions that come out of general relativity don’t depend on your choice of a clock.” The predictions will be the same whether you are using a clock traveling near the speed of light or one sitting quietly at home on a shelf.

    The choice of clock is still crucial, however, in other areas of physics, particularly quantum mechanics. It plays a central role in Erwin Schrödinger’s celebrated wave equation of 1926. The equation shows how a subatomic particle, whether traveling alone or circling an atom, can be thought of as a collection of waves, a wave packet that moves from point to point in space and from moment to moment in time.

    According to the vision of quantum mechanics, energy and matter are cut up into discrete bits, called quanta, whose motions are jumpy and blurry. They fluctuate madly. The behavior of these particles cannot be worked out exactly, the way a rocket’s trajectory can. Using Schrödinger’s wave equation, you can only calculate the probability that a particle—a wave packet—will attain a certain position or velocity. This is a picture so different from the world of classical physics that even Einstein railed against its indeterminacy. He declared that he could never believe that God would play dice with the world.

    You might say that quantum mechanics introduced a fuzziness into physics: You can pinpoint the precise position of a particle, but at a trade-off; its velocity cannot then be measured very well. Conversely, if you know how fast a particle is going, you won’t be able to know exactly where it is. Werner Heisenberg best summarized this strange and exotic situation with his famous uncertainty principle. But all this action, uncertain as it is, occurs on a fixed stage of space and time, a steadfast arena. A reliable clock is always around—is always needed, really—to keep track of the goings-on and thus enable physicists to describe how the system is changing. At least, that’s the way the equations of quantum mechanics are now set up.
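
    To put a number on that fuzziness, here is a minimal sketch of the position-velocity trade-off for an electron confined to roughly an atom’s width (standard constants; the confinement scale is an illustrative choice):

```python
HBAR = 1.055e-34        # J*s, reduced Planck constant
M_ELECTRON = 9.11e-31   # kg

delta_x = 1e-10   # m, ~one atomic diameter (illustrative confinement)
# Heisenberg: delta_x * delta_p >= hbar / 2, so
delta_v = HBAR / (2 * M_ELECTRON * delta_x)

print(f"Velocity uncertainty >= {delta_v:.1e} m/s")
# ~5.8e5 m/s: pin the particle down and its motion becomes wildly indeterminate.
```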

    And that is the crux of the problem. How are physicists expected to merge one law of physics—namely gravity—that requires no special clock to arrive at its predictions, with the subatomic rules of quantum mechanics, which continue to work within a universal, Newtonian time frame? In a way, each theory is marching to the beat of a different drummer (or the ticking of a different clock).

    That’s why things begin to go a little crazy when you attempt to blend these two areas of physics. Although the scale on which quantum gravity comes into play is so small that current technology cannot possibly measure these effects directly, physicists can imagine them. Place quantum particles on the springy, pliable mat of spacetime, and it will bend and fold like so much rubber. And that flexibility will greatly affect the operation of any clock keeping track of the particles. A timepiece caught in that tiny submicroscopic realm would probably resemble a pendulum clock laboring amid the quivers and shudders of an earthquake. “Here the very arena is being subjected to quantum effects, and one is left with nothing to stand on,” explains Isham. “You can end up in a situation where you have no notion of time whatsoever.” But quantum calculations depend on an assured sense of time.

    For Karel Kucha, a general relativist and professor emeritus at the University of Utah, the key to measuring quantum time is to devise, using clever math, an appropriate clock—something he has been attempting, off and on, for several decades. Conservative by nature, Kucha believes it is best to stick with what you know before moving on to more radical solutions. So he has been seeking what might be called the submicroscopic version of a Newtonian clock, a quantum timekeeper that can be used to describe the physics going on in the extraordinary realm ruled by quantum gravity, such as the innards of a black hole or the first instant of creation.

    Unlike the clocks used in everyday physics, Kucha’s hypothetical clock would not stand off in a corner, unaffected by what is going on around it. It would be set within the tiny, dense system where quantum gravity rules and would be part and parcel of it. This insider status has its pitfalls: The clock would change as the system changed—so to keep track of time, you would have to figure out how to monitor those variations. In a way, it would be like having to pry open your wristwatch and check its workings every time you wanted to refer to it.

    The most common candidates for this special type of clock are simply “matter clocks.” “This, of course, is the type of clock we’ve been used to since time immemorial. All the clocks we have around us are made up of matter,” Kucha points out. Conventional timekeeping, after all, means choosing some material medium, such as a set of particles or a fluid, and marking its changes. But with pen and paper, Kucha mathematically takes matter clocks into the domain of quantum gravity, where the gravitational field is extremely strong and those probabilistic quantum-mechanical effects begin to arise. He takes time where no clock has gone before.

    But as you venture into this domain, says Kucha, “matter becomes denser and denser.” And that’s the Achilles heel for any form of matter chosen to be a clock under these extreme conditions; it eventually gets squashed. That may seem obvious from the start, but Kucha needs to examine precisely how the clock breaks down so he can better understand the process and devise new mathematical strategies for constructing his ideal clock.

    More promising as a quantum clock is the geometry of space itself: monitoring spacetime’s changing curvature as the infant universe expands or a black hole forms. Kucha surmises that such a property might still be measurable in the extreme conditions of quantum gravity. The expanding cosmos offers the simplest example of this scheme. Imagine the tiny infant universe as an inflating balloon. Initially, its surface bends sharply around. But as the balloon blows up, the curvature of its surface grows shallower and shallower. “The changing geometry,” explains Kucha, “allows you to see that you are at one instant of time rather than another.” In other words, it can function as a clock.

    Unfortunately, each type of clock that Kucha has investigated so far leads to a different quantum description, different predictions of the system’s behavior. “You can formulate your quantum mechanics with respect to one clock that you place in spacetime and get one answer,” explains Kucha. “But if you choose another type of clock, perhaps one based on an electric field, you get a completely different result. It is difficult to say which of these descriptions, if any, is correct.”

    More than that, the clock that is chosen must not eventually crumble. Quantum theory suggests there is a limit to how finely you can cut up space. The smallest quantum grain of space imaginable is 10^-33 centimeter wide, the Planck length, named after Max Planck, inventor of the quantum. On that infinitesimal scale, the spacetime canvas turns choppy and jumbled, like the whitecaps on an angry sea. Space and time become unglued and start to wink in and out of existence in a probabilistic froth. Time and space, as we know them, are no longer easily defined. This is the point at which the physics becomes unknown and theorists start walking on shaky ground. As physicist Paul Davies points out in his book About Time, “You must imagine all possible geometries—all possible spacetimes, space warps and time warps—mixed together in a sort of cocktail, or ‘foam.’ ”
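
    For orientation, that Planck length follows from combining the constants that govern quantum theory (ħ), gravity (G), and relativity (c); the arithmetic is standard:

```latex
\ell_P \;=\; \sqrt{\frac{\hbar G}{c^{3}}} \;\approx\; 1.6\times10^{-35}\ \text{m} \;=\; 1.6\times10^{-33}\ \text{cm}
```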

    Only a fully developed theory of quantum gravity will show what’s really happening at this unimaginably small level of spacetime. Kucha conjectures that some property of general relativity (as yet unknown) will not undergo quantum fluctuations at this point. Something might hold on and not come unglued. If that’s true, such a property could serve as the reliable clock that Kucha has been seeking for so long. And with that hope, Kucha continues to explore, one by one, the varied possibilities.

    Kucha has been trying to mold general relativity into the style of quantum mechanics, to find a special clock for it. But some other physicists trying to understand quantum gravity believe that the revision should happen the other way around—that quantum gravity should be made over in the likeness of general relativity, where time is pushed into the background. Carlo Rovelli is a champion of this view.

    “Forget time,” Rovelli declares emphatically. “Time is simply an experimental fact.” Rovelli, a physicist at the Center of Theoretical Physics in France, has been working on an approach to quantum gravity that is essentially timeless. To simplify the calculations, he and his collaborators, physicists Abhay Ashtekar and Lee Smolin, set up a theoretical space without a clock. In this way, they were able to rewrite Einstein’s general theory of relativity, using a new set of variables so that it could more easily be interpreted and adapted for use on the quantum level.

    Their formulation has allowed physicists to explore how gravity behaves on the subatomic scale in a new way. But is that really possible without any reference to time at all? “First with special relativity and then with general relativity, our classical notion of time has only gotten weaker and weaker,” answers Rovelli. “We think in terms of time. We need it. But the fact that we need time to carry out our thinking does not mean it is reality.”

    Rovelli believes if physicists ever find a unified law that links all the forces of nature under one banner, it will be written without any reference to time. “Then, in certain situations,” says Rovelli, “as when the gravitational field is not dramatically strong, reality organizes itself so that we perceive a flow that we call time.”

    Getting rid of time in the most fundamental physical laws, says Rovelli, will probably require a grand conceptual leap, the same kind of adjustment that 16th-century scientists had to make when Copernicus placed the sun, and not the Earth, at the center of the universe. In so doing, the Polish cleric effectively kicked the Earth into motion, even though back then it was difficult to imagine how the Earth could zoom along in orbit about the sun without its occupants being flung off the surface. “In the 1500s, people thought a moving earth was impossible,” notes Rovelli.

    But maybe the true rules are timeless, including those applied to the subatomic world. Indeed, a movement has been under way to rewrite the laws of quantum mechanics, a renovation that was spurred partly by the problem of time, among other quantum conundrums. As part of that program, theorists have been rephrasing quantum mechanics’ most basic equations to remove any direct reference to time.

    The roots of this approach can be traced to a procedure introduced by the physicist Richard Feynman in the 1940s, a method that has been extended and broadened by others, including James Hartle of the University of California at Santa Barbara and physics Nobel laureate Murray Gell-Mann.

    Basically, it’s a new way to look at Schrödinger’s equation. As originally set up, this equation allows physicists to compute the probability of a particle moving directly from point A to point B over specified slices of time. The alternate approach introduced by Feynman instead considers the infinite number of paths the particle could conceivably take to get from A to B, no matter how slim the chance. Time is removed as a factor; only the potential pathways are significant. Summing up these potentials (some paths are more likely than others, depending on the initial conditions), a specific path emerges in the end.

    The process is sometimes compared to interference between waves. When two waves in the ocean combine, they may reinforce one another (leading to a new and bigger wave) or cancel each other out entirely. Likewise, you might think of these many potential paths as interacting with one another—some getting enhanced, others destroyed—to produce the final path. More important, the variable of time no longer enters into the calculations.
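
    Schematically, the Feynman prescription assigns each conceivable path x(t) a phase set by its classical action S[x] and adds them all up; in quantum cosmology that sum over particle paths is generalized to a sum over entire spacetime geometries, which is where the explicit time label drops out. The first formula below is the standard textbook kernel; the second is only a schematic of the Hartle-style generalization, not a quotation of his equations:

```latex
K(b,a) \;=\; \int \mathcal{D}[x(t)]\; e^{\,i S[x]/\hbar},
\qquad
\Psi[h] \;\sim\; \sum_{\text{4-geometries}\ g} e^{\,i S[g]/\hbar}
```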

    Hartle has been adapting this technique to his pursuits in quantum cosmology, an endeavor in which the laws of quantum mechanics are applied to the young universe to discern its evolution. Instead of dealing with individual particles, though, he works with all the configurations that could possibly describe an evolving cosmos, an infinite array of potential universes. When he sums up these varied configurations—some enhancing one another, others canceling each other out—a particular spacetime ultimately emerges. In this way, Hartle hopes to obtain clues to the universe’s behavior during the era of quantum gravity. Conveniently, he doesn’t have to choose a special clock to carry out the physics: Time disappears as an essential variable.

    Of course, as Isham points out, “having gotten rid of time, we’re then obliged to explain how we get back to the ordinary world, where time surrounds us.” Quantum gravity theorists have their hunches. Like Rovelli, many are coming to suspect that time is not fundamental at all. This theme resounds again and again in the various approaches aimed at solving the problem of time. Time, they say, may more resemble a physical property such as temperature or pressure. Pressure has no meaning when you talk about one particle or one atom; the concept of pressure arises only when we consider trillions of atoms. The notion of time could very well share this statistical feature. If so, reality would then resemble a pointillist painting. On the smallest of scales—the Planck length—time would have no meaning, just as a pointillist painting, built up from dabs of paint, cannot be fathomed close up.

    Quantum gravity theorists like to compare themselves to archeologists. Each investigator is digging away at a different site, finding a separate artifact of some vast subterranean city. The full extent of the find is not yet realized. What theorists desperately need are data, experimental evidence that could help them decide between the different approaches.

    It seems an impossible task, one that would appear to require recreating the hellish conditions of the Big Bang. But not necessarily. For instance, future generations of “gravity-wave telescopes,” instruments that detect ripples in the rubberlike mat of spacetime, might someday sense the Big Bang’s reverberating thunder, relics from the instant of creation when the force of gravity first emerged. Such waves could provide vital clues to the nature of space and time.

    “We wouldn’t have believed just [decades] ago that it would be possible to say what happened in the first 10 minutes of the Big Bang,” points out Kucha. “But we can now do that by looking at the abundances of the elements. Perhaps if we understand physics on the Planck scale well enough, we’ll be able to search for certain consequences—remnants—that are observable today.” If found, such evidence would bring us the closest ever to our origins and possibly allow us to perceive at last how space and time came to well up out of nothingness some 14 billion years ago.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

     
  • richardmitnick 6:37 am on August 30, 2018 Permalink | Reply
    Tags: , , , Black Hole Firewalls Could Be Too Tepid to Burn, , , Nautilus, ,   

    From Nautilus: “Black Hole Firewalls Could Be Too Tepid to Burn” 

    Nautilus

    From Nautilus

    Aug 29, 2018
    Charlie Wood

    Artist’s conception of two merging black holes similar to those detected by LIGO. Credit: LIGO-Caltech/MIT/Sonoma State / Aurore Simonnet

    String theorists elide a paradox about black holes by extinguishing the walls of fire feared to surround them. NASA

    Despite its ability to bend both minds and space, an Einsteinian black hole looks so simple a child could draw it. There’s a point in the center, a perfectly spherical boundary a bit farther out, and that’s it.

    The point is the singularity, an infinitely dense, unimaginably small dot contorting space so radically that anything nearby falls straight in, leaving behind a vacuum. The spherical boundary marks the event horizon, the point of no return between the vacuum and the rest of the universe. But according to Einstein’s theory of gravity, the event horizon isn’t anything that an unlucky astronaut would immediately notice if she were to cross it. “It’s like the horizon outside your window,” said Samir Mathur, a physicist at Ohio State University. “If you actually walked over there, there’s nothing.”

    In 2012, however, this placid picture went up in flames. A team of four physicists took a puzzle first put forward by Stephen Hawking about what happens to all the information that falls into the black hole, and turned it on its head. Rather than insisting that an astronaut (often named Alice) pass smoothly over the event horizon, they prioritized a key postulate of quantum mechanics: Information, like matter and energy, must never be destroyed. That change ended up promoting the event horizon from mathematical boundary to physical object, one they colorfully named the wall of fire.

    “It can’t be empty, and it turns out it has to be full of a lot of stuff, a lot of hot stuff,” said Donald Marolf, a physicist at the University of California, Santa Barbara, and one of the four co-authors [no cited paper]. The argument caused an uproar in the theoretical physics community, much as if cartographers suggested that instead of an imaginary line on their maps, Earth’s equator was actually a wall of bright red bricks.

    The news of a structure at the boundary didn’t shock Mathur, however. For more than a decade he had been arguing that black holes are really balls of strings (from string theory) with hot, fuzzy surfaces. “As you come closer and closer it gets hotter and hotter, and that’s what causes the burning,” he explained.

    In recent years, Mathur has been refining his “fuzzball” description, and his most recent calculations bring marginally good news for Alice. While she wouldn’t live a long and healthy life, the horizon’s heat might not be what does her in.

    Fuzzballs are what you get when you apply string theory, a description of nature that replaces particles with strings, to extremely dense objects. Energize a particle and it can only speed up, but strings stretch and swell as well. That ability to expand, combined with additional flexibility from postulated extra dimensions, makes strings fluff up when enough of them are packed into a small space. They form a fuzzy ball that looks from afar like an ordinary black hole—it has the same size (for a given mass) and emits the same kind of “Hawking radiation” that all black holes emit. As a bonus, the slightly bumpy surface changes the way it emits particles and declaws Hawking’s information puzzle, according to Mathur. “It’s more like a planet,” he said, “and it radiates from that surface just like anything else.”
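
    For a sense of how “tepid” such an object can be from far away, the standard Hawking temperature formula (a textbook result for ordinary black holes, which fuzzballs mimic at a distance) gives:

```latex
T_H \;=\; \frac{\hbar c^{3}}{8\pi G M k_B} \;\approx\; 6\times10^{-8}\ \text{K}\times\frac{M_\odot}{M}
```

    A solar-mass object radiates at roughly 60 nanokelvin, and a supermassive one is colder still; the heat at issue in the firewall debate is what an infalling observer encounters close up, not this faint far-field glow.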


    His new work extends arguments from 2014, which asked what would happen to Alice if she were to fall onto a supermassive fuzzball akin to the one at the heart of our galaxy—one with the mass of millions of suns. In such situations, the force of gravity dominates all others. Assuming this constraint, Mathur and his collaborator found that an incoming Alice particle had almost no chance of smashing into an outgoing particle of Hawking radiation. The surface might be hot, he said, but the way the fuzzball expands to swallow new material prevents anything from getting close enough to burn, so Alice should make it to the surface.


    In response, Marolf suggested that a medium-size fuzzball might still be able to barbecue Alice in other ways. It wouldn’t drag her in as fast, and in a collision at lower energies, forces other than gravity could singe her, too.

    Mathur’s team recently took a more detailed look at Alice’s experience with new calculations published in the Journal of High Energy Physics. They concluded that for a modest fuzzball—one as massive as our sun—the overall chance of an Alice particle hitting a radiation particle was slightly higher than they had found before, but still very close to zero. Their work suggested that you’d have to shrink a fuzzball down to a thousand times smaller than the nanoscale before burning would become likely.

    By allowing Alice to reach the surface more or less intact (she would still undergo an uncontroversial and likely fatal stretching), the theory might even end up restoring the Einsteinian picture of smooth passage across the boundary, albeit in a twisted form. There might be a scenario in which Alice went splat on the surface while simultaneously feeling as if she were falling through open space, whatever that might mean.

    “If you jump onto [fuzzballs] in one description, you break up into little strings. That’s the splat picture,” Mathur said. We typically assume that once her particles start breaking up, Alice ceases to be Alice. A bizarre duality in string theory, however, allows her strings to spread out across the fuzzball in an orderly way that preserves their connections, and, perhaps, her sense of self. “If you look carefully at what [the strings] are doing,” Mathur continued, “they’re actually spreading in a very coherent ball.”

    The details of Mathur’s picture remain rough. And the model rests entirely on the machinery of string theory, a mathematical framework with no experimental evidence. What’s more, not even string theory can handle the messiness of realistic fuzzballs. Instead, physicists focus on contrived examples such as highly organized, extra-frigid bodies with extreme features, said Marika Taylor, a string theorist at the University of Southampton in the U.K.

    Mathur’s calculations are exploratory, she said, approximate generalizations from the common features of the simple models. The next step is a theory that can describe the fuzzball’s surface at the quantum level, from the point of view of the string. Nevertheless, she agreed that the hot firewall idea has always smelled fishy from a string-theory perspective. “You suddenly transition from ‘I’m falling perfectly happily’ to ‘Oh my God, I’m completely destroyed’? That’s unsatisfactory,” she said.

    Marolf refrained from commenting on the latest results until he finished discussing them with Mathur, but said that he was interested in learning more about how the other forces had been accounted for and how the fuzzball surface would react to Alice’s visit. He also pointed out that Mathur’s black hole model was just one of many tactics for resolving Hawking’s puzzle, and there was no guarantee that anyone had hit on the right one. “Maybe the real world is crazier than even the things we’ve thought of yet,” he said, “and we’re just not being clever enough.”

    See the full article here .


     
  • richardmitnick 11:47 am on August 2, 2018 Permalink | Reply
    Tags: , , Nautilus, ,   

    From Quanta Magazine via Nautilus: “How Artificial Intelligence Can Supercharge the Search for New Particles” 

    Nautilus

    Nautilus

    Quanta Magazine
    From Quanta Magazine

    Jul 25, 2018
    Charlie Wood

    In the hunt for new fundamental particles, physicists have always had to make assumptions about how the particles will behave. New machine learning algorithms don’t.
    Image by ATLAS Experiment © 2018 CERN

    The Large Hadron Collider (LHC) smashes a billion pairs of protons together each second.

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    Occasionally the machine may rattle reality enough to have a few of those collisions generate something that’s never been seen before. But because these events are by their nature a surprise, physicists don’t know exactly what to look for. They worry that in the process of winnowing their data from those billions of collisions to a more manageable number, they may be inadvertently deleting evidence for new physics. “We’re always afraid we’re throwing the baby away with the bathwater,” said Kyle Cranmer, a particle physicist at New York University who works with the ATLAS experiment at CERN.

    CERN ATLAS

    Faced with the challenge of intelligent data reduction, some physicists are trying to use a machine learning technique called a “deep neural network” to dredge the sea of familiar events for new physics phenomena.

    In the prototypical use case, a deep neural network learns to tell cats from dogs by studying a stack of photos labeled “cat” and a stack labeled “dog.” But that approach won’t work when hunting for new particles, since physicists can’t feed the machine pictures of something they’ve never seen. So they turn to “weakly supervised learning,” where machines start with known particles and then look for rare events using less granular information, such as how often they might take place overall.

    In a paper posted on the scientific preprint site arxiv.org in May, three researchers proposed applying a related strategy to extend “bump hunting,” the classic particle-hunting technique that found the Higgs boson. The general idea, according to one of the authors, Ben Nachman, a researcher at the Lawrence Berkeley National Laboratory, is to train the machine to seek out rare variations in a data set.

    Consider, as a toy example in the spirit of cats and dogs, the problem of trying to discover a new species of animal in a data set filled with observations of forests across North America. Assuming that any new animals would tend to cluster in certain geographical areas (a notion that corresponds to a new particle clustering around a certain mass), the algorithm should be able to pick them out by systematically comparing neighboring regions. If British Columbia happens to contain 113 caribou to Washington state’s 19 (even against a background of millions of squirrels), the program will learn to sort caribou from squirrels, all without ever studying caribou directly. “It’s not magic but it feels like magic,” said Tim Cohen, a theoretical particle physicist at the University of Oregon who also studies weak supervision.
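
    Here is a minimal sketch of that classification-without-labels idea, using toy one-dimensional Gaussian data in place of collider events; the “signal” shape, the mixture fractions, and all names are illustrative assumptions, not code from the paper:

```python
# A minimal sketch of the "classification without labels" (CWoLa) idea,
# illustrated with toy 1-D data rather than real collider events.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

def mixed_sample(n, signal_fraction):
    """Background is a broad Gaussian; 'signal' clusters near x = 3."""
    n_sig = int(n * signal_fraction)
    background = rng.normal(0.0, 1.0, size=(n - n_sig, 1))
    signal = rng.normal(3.0, 0.3, size=(n_sig, 1))
    return np.vstack([background, signal])

# Two mixed samples with different (unknown to the classifier) signal
# fractions, standing in for neighboring regions in a bump hunt.
sample_1 = mixed_sample(20_000, signal_fraction=0.10)  # "British Columbia"
sample_2 = mixed_sample(20_000, signal_fraction=0.01)  # "Washington state"

# Train on sample labels only -- never on per-event signal/background truth.
X = np.vstack([sample_1, sample_2])
y = np.concatenate([np.ones(len(sample_1)), np.zeros(len(sample_2))])
clf = GradientBoostingClassifier().fit(X, y)

# The optimal sample-1-vs-sample-2 classifier is monotonically related to a
# signal-vs-background classifier, so high scores flag signal-like events.
scores = clf.predict_proba(np.array([[0.0], [3.0]]))[:, 1]
print(f"background-like event score: {scores[0]:.2f}")
print(f"signal-like event score:     {scores[1]:.2f}")
```

    The point the sketch illustrates: the classifier only ever sees which mixed sample an event came from, yet the score it learns ends up ranking individual events by how signal-like they are.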

    By contrast, traditional searches in particle physics usually require researchers to make an assumption about what the new phenomena will look like. They create a model of how the new particles will behave—for example, a new particle might tend to decay into particular constellations of known particles. Only after they define what they’re looking for can they engineer a custom search strategy. It’s a task that generally takes a Ph.D. student at least a year, and one that Nachman thinks could be done much faster, and more thoroughly.

    The proposed CWoLa algorithm, which stands for Classification Without Labels, can search existing data for any unknown particle that decays into either two lighter unknown particles of the same type, or two known particles of the same or different type. Using ordinary search methods, it would take the LHC collaborations at least 20 years to scour the possibilities for the latter, and no searches currently exist for the former. Nachman, who works on the ATLAS project, says CWoLa could do them all in one go.

    Other experimental particle physicists agree it could be a worthwhile project. “We’ve looked in a lot of the predictable pockets, so starting to fill in the corners we haven’t looked in is an important direction for us to go in next,” said Kate Pachal, a physicist who searches for new particle bumps with the ATLAS project. She batted around the idea of trying to design flexible software that could deal with a range of particle masses last year with some colleagues, but no one knew enough about machine learning. “Now I think it might be the time to try this,” she said.

    The hope is that neural networks could pick up on subtle correlations in the data that resist current modeling efforts. Other machine learning techniques have successfully boosted the efficiency of certain tasks at the LHC, such as identifying “jets” made by bottom-quark particles. The work has left no doubt that some signals are escaping physicists’ notice. “They’re leaving information on the table, and when you spend $10 billion on a machine, you don’t want to leave information on the table,” said Daniel Whiteson, a particle physicist at the University of California, Irvine.

    Yet machine learning is rife with cautionary tales of programs that confused arms with dumbbells (or worse). At the LHC, some worry that the shortcuts will end up reflecting gremlins in the machine itself, which experimental physicists take great pains to intentionally overlook. “Once you find an anomaly, is it new physics or is it something funny that went on with the detector?” asked Till Eifert, a physicist on ATLAS.

    See the full article here .



    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 12:38 pm on April 8, 2018 Permalink | Reply
    Tags: , , , , How ‘Oumuamua Got Shredded, Nautilus,   

    From Nautilus: “How ‘Oumuamua Got Shredded” 

    Nautilus

    Nautilus

    Apr 01, 2018
    Sean Raymond

    ‘Oumuamua may be a piece of a torn-apart comet, gravitationally launched into interstellar space, that roamed the galaxy before dropping on our doorstep. ESO / M. Kornmesser / Wikicommons.

    Our solar system’s first houseguest—at least, the first one we have seen in our midst—is a strange one. Scientists have taken to calling it ‘Oumuamua (pronounced “Oh-MOO-ah-MOO-ah”), after it was seen, last October, as a faint streak against a backdrop of stars, by the Pan-STARRS (Panoramic Survey Telescope and Rapid Response System) telescope, in Hawaii.

    Pan-STARRS telescope, U Hawaii, located at Haleakala Observatory, Hawaii, USA, Altitude 3,052 m (10,013 ft)

    In Hawaiian, ‘Oumuamua means “a messenger from afar arriving first.”

    How do we know it’s “from afar”? ‘Oumuamua is fast. Minus the sun’s gravitational tug, it’s clocking 16 miles per second. A massive planet like Jupiter can gravitationally kick an object hard enough to reach that speed, but get this: ‘Oumuamua entered the solar system from above the plane of the planets! There is nothing in the solar system (including Planet Nine if it exists) that can explain its speed. That is why scientists are confident that it came from beyond.

    ‘Oumuamua’s trajectory through the solar system. It was only found after its closest passage to Earth. Brooks Bays / SOEST Publication Services / UH Institute for Astronomy.

    ‘Oumuamua also looks like nothing we’ve seen before. Its brightness in the sky oscillates by about a factor of 10 every seven hours or so, and models show that this may be due to the spinning of a cigar-shaped body (as in the simulation pictured below). A pair of potato-shaped bodies with different reflectivities could also account for the oscillation. The brightness pattern does not perfectly repeat, indicating that ‘Oumuamua is spinning chaotically—“tumbling” might be the right way to put it.

    A simulation of ‘Oumuamua’s rotation (left) and the variations in observed brightness that this produces. nagualdesign / Wikicommons.
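
    A rough numerical illustration of why a factor-of-10 brightness swing suggests a roughly 10-to-1 elongation: treat the body as a uniform prolate ellipsoid seen edge-on, with brightness proportional to projected area. The axis ratio, period, and viewing geometry here are assumptions for the sketch, not fitted values:

```python
# Toy lightcurve of a rotating cigar-shaped body with uniform reflectivity.
# Brightness is taken proportional to the projected area of a prolate
# ellipsoid (semi-axes a > b = c) spinning about a short axis, viewed edge-on.
import numpy as np

a, b = 10.0, 1.0      # assumed 10:1 axis ratio
period_hours = 7.3    # assumed brightness-cycle period ("every seven hours or so")

t = np.linspace(0.0, 2 * period_hours, 500)
# The projected outline repeats twice per full rotation, so one brightness
# cycle corresponds to half a turn of the body.
angle = np.pi * t / period_hours
projected_area = np.pi * b * np.sqrt((a * np.cos(angle))**2 + (b * np.sin(angle))**2)

print(f"max/min brightness: {projected_area.max() / projected_area.min():.1f}")
# -> 10.0 for a 10:1 ellipsoid, matching the observed factor-of-10 swing
```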

    Another peculiar thing: ‘Oumuamua looks like a water-rich object, but it has no surface water. Measurements of its colors at different wavelengths show an object similar to volatile-rich bodies in the solar system—think comets and water-rich asteroids. But ‘Oumuamua passed closer than Mercury’s orbit, and showed no signs of activity: No gases escaping to form a coma, no jets, no tails. So, even though it looks like a comet, it does not behave like a comet, at least not like the flamboyant, bright comets that we know and love.

    So what is it? The simplest explanation is that ‘Oumuamua is a planetary leftover called a planetesimal, born in a planet-forming disk around another star but left out of the finished product. Instead it was kicked out into interstellar space by a giant planet similar to Neptune, or maybe Jupiter. ‘Oumuamua wandered for millions to billions of years, then happened to pass close to the sun.

    That can explain how ‘Oumuamua came to roam outside of a planetary system—but it falls short: It doesn’t explain ‘Oumuamua’s weird shape and spin, nor why it looks like a comet with no surface water.

    This is where computer simulations come in. Almost a decade ago, I ran thousands of simulations of how giant planets interact with disks of planetesimals. My goal was to study how planets behave, but what I’ve found is that these same simulations can be used to understand ‘Oumuamua’s origins. Before they are kicked out into interstellar space, some planetesimals pass super close to a giant planet, so close that they should be shredded to pieces, due to gravity: The pull on the planet-facing side of the planetesimal is much larger than on the opposite side. The strong stretching force might play a role in explaining ‘Oumuamua’s unusual shape and tumbling spin, although it has not been carefully modeled yet. This is still speculation.
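
    The “stronger pull on the near side” argument can be made quantitative with a standard Roche-type estimate (a textbook order-of-magnitude bound, not the careful modeling the preceding paragraph says is still lacking). For a planetesimal of mass m and radius r flying past a planet of mass M_p at distance d, disruption sets in roughly when the differential pull across the body beats its self-gravity:

```latex
\frac{2 G M_p\, r}{d^{3}} \;\gtrsim\; \frac{G m}{r^{2}}
\quad\Longrightarrow\quad
d \;\lesssim\; r\left(\frac{2 M_p}{m}\right)^{1/3}
```

    For comet-like densities this works out to a few planetary radii, which is why such shredding passages are rare but not negligible.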

    This sort of shredding event isn’t just based on calculations, though. We have actually seen it in action. In 1992, comet Shoemaker-Levy 9 passed too close to Jupiter and was torn into a string of fragments. They fell back onto Jupiter in 1994 (some on my 17th birthday).

    Fragments of comet Shoemaker-Levy 9 observed with the Hubble Space Telescope in 1994. NASA / ESA / H. Weaver and E. Smith (STScI).

    My simulations find that about 1 percent of planetesimals are torn apart by coming too close to a giant planet, like comet Shoemaker–Levy 9. Instead of bashing into the planet, most of the pieces are eventually thrown out of their planetary systems. If ‘Oumuamua is a planetesimal fragment, it must have gotten shredded very violently.

    This still doesn’t explain ‘Oumuamua’s comet-like appearance. In this way, ‘Oumuamua is like another class of objects in the solar system called the Damocloids: They have comet-like orbits and surfaces but don’t give off any gases when they are heated. We think they are extinct comets that, after a certain number of trips too close to the sun, burned off all of their surface ices. Over the past few decades, researchers have figured out how quickly extinction must happen.

    Back to my simulations. Remember: about 1 percent of planetesimals should have been torn apart before being kicked out into interstellar space. And guess what? About two-thirds of them passed close to their stars a bunch of times before getting kicked out—enough times that they should have become extinct.

    So, ‘Oumuamua may be a piece of a torn-apart comet, as my colleagues and I argue in two recent papers (here [The Astrophysical Journal] and here [MNRAS]): After the disruption, ‘Oumuamua passed close enough to its star enough times to lose its surface volatiles, becoming extinct. Then it was gravitationally launched into interstellar space and roamed the galaxy before dropping on our doorstep.

    ‘Oumuamua’s origin story? Sean Raymond / planetplanet.net.

    We could test this by cracking ‘Oumuamua open. Is there ice buried deep, too deep for the sun to vaporize it? It’s zooming away so fast that tracking it down—the goal of Project Lyra—is no small feat. If that fails we can bank on the Large Synoptic Survey Telescope, which is coming online around 2021, to help us find objects similar to what ‘Oumuamua seems to be. Our story predicts that extinct comets should outnumber “normal” ones about two-to-one, and almost all of these objects should be fragments of larger bodies, meaning they might bear some trace of their violent pasts, either in terms of their shapes and spins or something else.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    If we end up finding a lot of extinct comet fragments, we can be confident ‘Oumuamua is one, too.

    Sean Raymond is an astronomer studying the formation and evolution of planetary systems. He also blogs at http://www.planetplanet.net.

    See the full article here .


     
  • richardmitnick 12:51 pm on November 12, 2017 Permalink | Reply
    Tags: , , Caleb Scharf, , Nautilus, The Zoomable Universe, This Will Help You Grasp the Sizes of Things in the Universe   

    From Nautilus: “This Will Help You Grasp the Sizes of Things in the Universe” 

    Nautilus

    Nautilus

    Nov 08, 2017
    Dan Garisto

    In The Zoomable Universe, Scharf puts the notion of scale—in biology and physics—center-stage. “The start of your journey through this book and through all known scales of reality is at that edge between known and unknown,” he writes. Illustration by Ron Miller

    Caleb Scharf wants to take you on an epic tour. His latest book, The Zoomable Universe, starts from the ends of the observable universe, exploring its biggest structures, like groups of galaxies, and goes all the way down to the Planck length—less than a billionth of a billionth of a billionth of a meter. It is a breathtaking synthesis of the large and small. Readers journeying through the book are treated to pictures, diagrams, and illustrations all accompanied by Scharf’s lucid, conversational prose. These visual aids give vital depth and perspective to the phenomena that he points out like a cosmic safari guide. Did you know, he offers, that all the Milky Way’s stars can fit inside the volume of our solar system?

    Scharf, the director of Columbia University’s Astrobiology Center, is a suitably engaging guide. He’s the author of the 2012 book Gravity’s Engines: How Bubble-Blowing Black Holes Rule Galaxies, Stars, and Life in the Universe, and last year, he speculated in Nautilus about whether alien life could be so advanced as to be indistinguishable from physics.

    In The Zoomable Universe, Scharf puts the notion of scale—in biology and physics—center-stage. “The start of your journey through this book and through all known scales of reality is at that edge between known and unknown,” he writes. Nautilus caught up with him to talk about our experience with scale and why he thinks it’s mysterious. (Scharf is a member of Nautilus’ advisory board.)

    Why is scale interesting?

    Scale is fascinating. Scientifically it’s a fundamental property of reality. We don’t even think about it. We talk about space and time—and perhaps we puzzle more over the nature of time than we do over the nature of scale or space—but it’s equally mysterious.

    What’s mysterious about scale?

    It’s something we all have direct experience of, even intuitively. We learn to evaluate the size of things. But we’re operating as humans in a very, very narrow slice of what is out there. And we’re aware of a very narrow range of scales: In some sense, we know more about the very large than we do about the very small.

    We know about atoms, kind of, but if you go smaller, it gets more uncertain—not just because of intrinsic uncertainty, but the completeness of our physics gets worse. We don’t really know what’s happening here. That leads you to a mystery at the Planck scale. On the big scale, it’s stuff we can actually see, we can actually chart.


    Not an alien planet, but the faceted eye of a louse embedded in an elephant’s skin. The Zoomable Universe.

    At certain scales, there’s not much happening. Does that hint at some underlying mystery?

    I think that is something worth contemplating. There’s quarks and then there’s 20 orders of magnitude smaller where—what do you say about it? That was the experience for the very small, but on the larger scale there’s some of that too…the emptiness of interstellar space. It is striking how empty most of everything is on the big scale and the small scale.

    We have all this rich stuff going on in the scale of the solar system and the earth and our biological scale. That’s where we’ve gained the most insight, accumulated the most knowledge. It is the scale where matter seems to condense down, where things appear solid, when in fact, it’s equally empty on the inside. But is that a human cultural bias? Or is that telling us something profound about the nature of the universe? I don’t really know the answer to that. But there’s something about the way we’re built, the way we think about the world. We’re clearly not attuned to that emptiness.

    Yet we’re drawn to it.

    We are drawn to it—like the example in the book with the stars packed together. Taking all the stars from the galaxy put together and being able to fit them inside the volume of the solar system? It is shocking. Trust me, I had to run the numbers a couple of times just to go, “Oh wow, okay, that really does work.”
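
    Running those numbers is a short exercise; here is a back-of-envelope version in Python. The star count, the average stellar volume, and the choice of a ~40 AU boundary for “the solar system” are all rough assumptions, not Scharf’s figures:

```python
# Back-of-envelope check: can the Milky Way's stars fit inside the solar system?
import math

AU_cm = 1.5e13       # one astronomical unit in centimeters
R_sun_cm = 7.0e10    # solar radius in centimeters

n_stars = 2e11       # ~200 billion stars (assumed)
# Most stars are red dwarfs, so take an average volume ~10% of the Sun's.
avg_star_volume = 0.1 * (4 / 3) * math.pi * R_sun_cm**3
total_star_volume = n_stars * avg_star_volume

solar_system_radius = 40 * AU_cm   # out to roughly Pluto's orbit
solar_system_volume = (4 / 3) * math.pi * solar_system_radius**3

print(f"stars:        {total_star_volume:.1e} cm^3")
print(f"solar system: {solar_system_volume:.1e} cm^3")
print(f"filling fraction: {total_star_volume / solar_system_volume:.3f}")
```

    With these inputs the stars fill only a few percent of the volume, so the claim survives the sanity check comfortably.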

    As the Earth eclipses the Sun, the high wilderness of the lunar landscape is bathed in reddened light. Illustration by Ron Miller

    How did you represent things that we don’t have pictures of, like the surface of an exoplanet, or things at really small scales?

    That’s something we definitely talked a lot about in putting the book together. Ron Miller, the artist, would produce a landscape for an exoplanet. As a scientist, my inclination is to say, “We can’t do that—we can’t say what it looks like.” So we had this dialogue. We wanted an informed artistic approach. It became tricky when we got down to a small scale. I wanted to avoid the usual trope, which is an atom is a sphere, or a molecule is a sphere connected by things. You can’t have a picture of these things in the sense that we’re used to. We tried to compromise. We made something people kind of recognize, but we avoid the ball and stick models that are glued in everyone’s head.

    See the full article here .


     
  • richardmitnick 12:03 pm on November 9, 2017 Permalink | Reply
    Tags: , , Cosmologists have come to realize that our universe may be only one component of the multiverse, , Fred Adams, Mordehai Milgrom and MOND theory, Nautilus, , , The forces are not nearly as finely tuned as many scientists think, The parameters of our universe could have varied by large factors and still allowed for working stars and potentially habitable planets, The strong interaction- the weak interaction- electromagnetism- gravity   

    From Nautilus: “The Not-So-Fine Tuning of the Universe” 

    Nautilus

    Nautilus

    January 19, 2017 [Just found this referenced in another article.]
    Fred Adams
    Illustrations by Jackie Ferrentino

    Before there is life, there must be structure. Our universe synthesized atomic nuclei early in its history. Those nuclei ensnared electrons to form atoms. Those atoms agglomerated into galaxies, stars, and planets. At last, living things had places to call home. We take it for granted that the laws of physics allow for the formation of such structures, but that needn’t have been the case.

    Over the past several decades, many scientists have argued that, had the laws of physics been even slightly different, the cosmos would have been devoid of complex structures. In parallel, cosmologists have come to realize that our universe may be only one component of the multiverse, a vast collection of universes that makes up a much larger region of spacetime. The existence of other universes provides an appealing explanation for the apparent fine-tuning of the laws of physics. These laws vary from universe to universe, and we live in a universe that allows for observers because we couldn’t live anywhere else.

    Setting The Parameters: The universe would have been habitable even if the forces of electromagnetism and gravity had been stronger or weaker. The crosshatched area shows the range of values consistent with life. The asterisk shows the actual values in our universe; the axes are scaled to these values. The constraints are that stars must be able to undergo nuclear fusion (below black curve), live long enough for complex life to evolve (below red curve), be hot enough to support biospheres (left of blue curve), and not outgrow their host galaxies (right of the cyan curve). Fred C. Adams.

    Astrophysicists have discussed fine-tuning so much that many people take it as a given that our universe is preternaturally fit for complex structures. Even skeptics of the multiverse accept fine-tuning; they simply think it must have some other explanation. But in fact the fine-tuning has never been rigorously demonstrated. We do not really know what laws of physics are necessary for the development of astrophysical structures, which are in turn necessary for the development of life. Recent work on stellar evolution, nuclear astrophysics, and structure formation suggest that the case for fine-tuning is less compelling than previously thought. A wide variety of possible universes could support life. Our universe is not as special as it might seem.

    The first type of fine-tuning involves the strengths of the fundamental forces of nature in working stars. If the electromagnetic force had been too strong, the electrical repulsion of protons would shut down nuclear fusion in stellar cores, and stars would fail to shine. If electromagnetism had been too weak, nuclear reactions would run out of control, and stars would blow up in spectacular explosions. If gravity had been too strong, stars would either collapse into black holes or never ignite.

    On closer examination, though, stars are remarkably robust. The strength of the electric force could vary by a factor of nearly 100 in either direction before stellar operations would be compromised. The force of gravity would have to be 100,000 times stronger. Going in the other direction, gravity could be a billion times weaker and still allow for working stars. The allowed strengths for the gravitational and electromagnetic forces depend on the nuclear reaction rate, which in turn depends on the strengths of the nuclear forces. If the reaction rate were faster, stars could function over an even wider range of strengths for gravitation and electromagnetism. Slower nuclear reactions would narrow the range.

    In addition to these minimal operational requirements, stars must meet a number of other constraints that further restrict the allowed strength of the forces. They must be hot. The surface temperature of a star must be high enough to drive the chemical reactions necessary for life. In our universe, there are ample regions around most stars where planets are warm enough, about 300 kelvins, to support biology. In universes where the electromagnetic force is stronger, stars are cooler, making them less hospitable.

    Stars must also have long lives. The evolution of complex life forms takes place over enormous spans of time. Since life is driven by a complex ensemble of chemical reactions, the basic clock for biological evolution is set by the time scales of atoms. In other universes, these atomic clocks will tick at different rates, depending on the strength of electromagnetism, and this variation must be taken into account. When the force is weaker, stars burn their nuclear fuel faster, and their lifetimes decrease.

    Finally, stars must be able to form in the first place. In order for galaxies and, later, stars to condense out of primordial gas, the gas must be able to lose energy and cool down. The cooling rate depends (yet again) on the strength of electromagnetism. If this force is too weak, gas cannot cool down fast enough and would remain diffuse instead of condensing into galaxies. Stars must also be smaller than their host galaxies—otherwise star formation would be problematic. These effects put another lower limit on the strength of electromagnetism.

    Putting it all together, the strengths of the fundamental forces can vary by several orders of magnitude and still allow planets and stars to satisfy all the constraints (as illustrated in the figure above). The forces are not nearly as finely tuned as many scientists think.

    A second example of possible fine-tuning arises in the context of carbon production. After moderately large stars have fused the hydrogen in their central cores into helium, helium itself becomes the fuel. Through a complicated set of reactions, helium is burned into carbon and oxygen. Because of their important role in nuclear physics, helium nuclei are given a special name: alpha particles. The most common nuclei are composed of one, three, four, and five alpha particles. The nucleus with two alpha particles, beryllium-8, is conspicuously absent, and for a good reason: It is unstable in our universe.

    The instability of beryllium creates a serious bottleneck for the creation of carbon. As stars fuse helium nuclei together to become beryllium, the beryllium nuclei almost immediately decay back into their constituent parts. At any given time, the stellar core maintains a small but transient population of beryllium. These rare beryllium nuclei can interact with helium to produce carbon. Because the process ultimately involves three helium nuclei, it is called the triple-alpha reaction. But the reaction is too slow to produce the amount of carbon observed in our universe.
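
    Written out, the bottlenecked chain looks like this (standard nuclear-astrophysics notation, shown schematically; the asterisk marks an excited state of the carbon nucleus):

```latex
\alpha + \alpha \;\leftrightarrow\; {}^{8}\text{Be},
\qquad
{}^{8}\text{Be} + \alpha \;\to\; {}^{12}\text{C}^{*} \;\to\; {}^{12}\text{C} + \gamma
```

    Because beryllium-8 decays back almost as fast as it forms, the second step must catch a fleeting target, which is what makes the overall rate so slow.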

    To resolve this discrepancy, physicist Fred Hoyle predicted in 1953 that the carbon nucleus has to have a resonant state at a specific energy, as if it were a little bell that rang with a certain tone. Because of this resonance, the reaction rates for carbon production are much larger than they would be otherwise—large enough to explain the abundance of carbon found in our universe. The resonance was later measured in the laboratory at the predicted energy level.


    The worry is that, in other universes, with alternate strengths of the forces, the energy of this resonance could be different, and stars would not produce enough carbon. Carbon production is compromised if the energy level is changed by more than about 4 percent. This issue is sometimes called the triple-alpha fine-tuning problem.

    Fortunately, this problem has a simple solution. What nuclear physics takes away, it also gives. Suppose nuclear physics did change by enough to neutralize the carbon resonance. Among the possible changes of this magnitude, about half would have the side effect of making beryllium stable, so the loss of the resonance would become irrelevant. In such alternate universes, carbon would be produced in the more logical manner of adding together alpha particles one at a time. Helium could fuse into beryllium, which could then react with additional alpha particles to make carbon. There is no fine-tuning problem after all.

    A third instance of potential fine-tuning concerns the simplest nuclei composed of two particles: deuterium nuclei, which contain one proton and one neutron; diprotons, consisting of two protons; and dineutrons, consisting of two neutrons. In our universe, only deuterium is stable. The production of helium takes place by first combining two protons into deuterium.

    If the strong nuclear force had been even stronger, diprotons could have been stable. In this case, stars could have generated energy through the simplest and fastest of nuclear reactions, where protons combine to become diprotons and eventually other helium isotopes. It is sometimes claimed that stars would then burn through their nuclear fuel at catastrophic rates, resulting in lifetimes that are too short to support biospheres. Conversely, if the strong force had been weaker, then deuterium would be unstable, and the usual stepping stone on the pathway to heavy elements would not be available. Many scientists have speculated that the absence of stable deuterium would lead to a universe with no heavy elements at all and that such a universe would be devoid of complexity and life.

    As it turns out, stars are remarkably stable entities. Their structure adjusts automatically to burn nuclear fuel at exactly the right rate required to support themselves against the crush of their own gravity. If the nuclear reaction rates are higher, stars will burn their nuclear fuel at a lower central temperature, but otherwise they will not be so different. In fact, our universe has an example of this type of behavior. Deuterium nuclei can combine with protons to form helium nuclei through the action of the strong force. The cross section for this reaction, which quantifies the probability of its occurrence, is quadrillions of times larger than for ordinary hydrogen fusion. Nonetheless, stars in our universe burn their deuterium in a relatively uneventful manner. The stellar core has an operating temperature of 1 million kelvins, compared to the 15 million kelvins required to burn hydrogen under ordinary conditions. These deuterium-burning stars have cooler centers and are somewhat larger than the sun, but are otherwise unremarkable.

    Similarly, if the strong nuclear force were lower, stars could continue to operate in the absence of stable deuterium. A number of different processes provide paths by which stars can generate energy and synthesize heavy elements. During the first part of their lives, stars slowly contract, their central cores grow hotter and denser, and they glow with the power output of the sun. Stars in our universe eventually become hot and dense enough to ignite nuclear fusion, but in alternative universes they could continue this contraction phase and generate power by losing gravitational potential energy. The longest-lived stars could shine with a power output roughly comparable to the sun for up to 1 billion years, perhaps long enough for biological evolution to take place.

    For sufficiently massive stars, the contraction would accelerate and become a catastrophic collapse. These stellar bodies would basically go supernova. Their central temperatures and densities would increase to such large values that nuclear reactions would ignite. Many types of nuclear reactions would take place in the death throes of these stars. This process of explosive nucleosynthesis could supply the universe with heavy nuclei, in spite of the lack of deuterium.

    Once such a universe produces trace amounts of heavy elements, later generations of stars have yet another option for nuclear burning. This process, called the carbon-nitrogen-oxygen cycle, does not require deuterium as an intermediate state. Instead, carbon acts as a catalyst to instigate the production of helium. This cycle operates in the interior of the sun and provides a small fraction of its total power. In the absence of stable deuterium, the carbon-nitrogen-oxygen cycle would dominate the energy generation. And this does not exhaust the options for nuclear power generation. Stars could also produce helium through a triple-nucleon process that is roughly analogous to the triple-alpha process for carbon production. Stars thus have many channels for providing both energy and complex nuclei in alternate universes.

    A fourth example of fine-tuning concerns the formation of galaxies and other large-scale structures. They were seeded by small density fluctuations produced in the earliest moments of cosmic time. After the universe had cooled down enough, these fluctuations started to grow stronger under the force of gravity, and denser regions eventually become galaxies and galaxy clusters. The fluctuations started with a small amplitude, denoted Q, equal to 0.00001. The primeval universe was thus incredibly smooth: The density, temperature, and pressure of the densest regions and of the most rarefied regions were the same to within a few parts per 100,000. The value of Q represents another possible instance of fine-tuning in the universe.
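
    The danger in lowering Q can be seen from the standard linear-growth rule for such fluctuations: in a matter-dominated universe a density contrast grows roughly in proportion to the cosmic scale factor a, and a region collapses into a galaxy only once its contrast reaches order unity. Schematically, with a_i the scale factor when growth begins:

```latex
\delta(a) \;\sim\; Q\,\frac{a}{a_{i}}
\;\;\longrightarrow\;\;
\delta \sim 1 \quad\text{requires growth by a factor of}\;\; \frac{1}{Q} \sim 10^{5}
```

    The smaller Q is, the more growth is needed, and the longer structure formation takes.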

    If Q had been lower, it would have taken longer for fluctuations to grow strong enough to become cosmic structures, and galaxies would have had lower densities. If the density of a galaxy is too low, the gas in the galaxy is unable to cool. It might not ever condense into galactic disks or coalesce into stars. Low-density galaxies are not viable habitats for life. Worse, a long enough delay might have prevented galaxies from forming at all. Beginning about 4 billion years ago, the expansion of the universe began to accelerate and pull matter apart faster than it could agglomerate—a change of pace that is usually attributed to a mysterious dark energy. If Q had been too small, it could have taken so long for galaxies to collapse that the acceleration would have started before structure formation was complete, and further growth would have been suppressed. The universe could have ended up devoid of complexity, and lifeless. In order to avoid this fate, the value of Q cannot be smaller by more than a factor of 10.

    What if Q had been larger? Galaxies would have formed earlier and ended up denser. That, too, would have posed a danger for the prospects of habitability. Stars would have been much closer to one another and interacted more often. In so doing, they could have stripped planets out of their orbits and sent them hurtling into deep space. Furthermore, because stars would be closer together, the night sky would be brighter—perhaps as bright as day. If the stellar background were too dense, the combined starlight could boil the oceans of any otherwise suitable planets.

    Galactic What-If: A galaxy that formed in a hypothetical universe with large initial density fluctuations might be even more hospitable than our Milky Way. The central region is too bright and hot for life, and planetary orbits are unstable. But the outer region is similar to the solar neighborhood. In between, the background starlight from the galaxy is comparable in brightness to the sunlight received by Earth, so all planets, no matter their orbits, are potentially habitable. Credit: Fred C. Adams.

    In this case, the fine-tuning argument is not very constraining. The central regions of galaxies could indeed produce such intense background radiation that all planets would be rendered uninhabitable. But the outskirts of galaxies would always have a low enough density for habitable planets to survive. An appreciable fraction of galactic real estate remains viable even when Q is thousands of times larger than in our universe. In some cases, such a galaxy might be even more hospitable than our own. Throughout much of the galaxy, the night sky could have the same brightness as the sunshine we see during the day on Earth. Planets would receive their life-giving energy from the entire ensemble of background stars rather than from just their own sun. They could reside in almost any orbit. In an alternate universe with larger density fluctuations than our own, even Pluto would get as much daylight as Miami. As a result, a moderately dense galaxy could have more habitable planets than the Milky Way.
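
    The brightness claim lends itself to a similar order-of-magnitude sketch. For a roughly uniform sphere of stars of radius R and number density n, the summed starlight at the center works out to about n*L*R (integrating L/(4*pi*r^2) over the volume). The stellar density and galaxy size below are illustrative assumptions, used only to show the scaling, not figures from the essay.

        import math

        L_SUN = 3.828e26   # solar luminosity, W
        AU = 1.496e11      # astronomical unit, m
        PC = 3.086e16      # parsec, m

        # Sunlight at Earth's orbit, ~1361 W/m^2.
        solar_constant = L_SUN / (4 * math.pi * AU**2)

        def background_flux(stars_per_pc3, radius_pc):
            """Starlight (W/m^2) at the center of a uniform sphere of sun-like stars."""
            n = stars_per_pc3 / PC**3          # convert to stars per cubic meter
            return n * L_SUN * (radius_pc * PC)

        # Assumed Milky-Way-like numbers: ~0.1 stars/pc^3, radius ~10 kpc.
        here = background_flux(0.1, 1e4)
        print(f"Background starlight: {here:.1e} W/m^2 vs sunlight {solar_constant:.1e} W/m^2")

        # Stellar density (same radius) at which the night sky rivals daylight.
        print(f"Density needed: ~{0.1 * solar_constant / here:.0e} stars per pc^3")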

    In short, the parameters of our universe could have varied by large factors and still allowed for working stars and potentially habitable planets. The force of gravity could have been 1,000 times stronger or 1 billion times weaker, and stars would still function as long-lived nuclear burning engines. The electromagnetic force could have been stronger or weaker by factors of 100. Nuclear reaction rates could have varied over many orders of magnitude. Alternative stellar physics could have produced the heavy elements that make up the basic raw material for planets and people. Clearly, the parameters that determine stellar structure and evolution are not overly fine-tuned.

    Given that our universe does not seem to be particularly fine-tuned, can we still say that our universe is the best one for life to develop? Our current understanding suggests that the answer is no. One can readily envision a universe that is friendlier to life and perhaps more logical. A universe with stronger initial density fluctuations would make denser galaxies, which could support more habitable planets than our own. A universe with stable beryllium would have straightforward channels available for carbon production and would not need the complication of the triple-alpha process. Although these issues are still being explored, we can already say that other universes would have many pathways for the development of complexity and biology, and that some could be even more favorable for life than our own. In light of these generalizations, astrophysicists need to reexamine the possible implications of the multiverse, including the degree of fine-tuning in our universe.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Welcome to Nautilus. We are delighted you joined us. We are here to tell you about science and its endless connections to our lives. Each month we choose a single topic. And each Thursday we publish a new chapter on that topic online. Each issue combines the sciences, culture and philosophy into a single story told by the world’s leading thinkers and writers. We follow the story wherever it leads us. Read our essays, investigative reports, and blogs. Fiction, too. Take in our games, videos, and graphic stories. Stop in for a minute, or an hour. Nautilus lets science spill over its usual borders. We are science, connected.

    • stewarthoughblog 1:13 am on November 10, 2017 Permalink | Reply

      The proposition that long-lived stars could last 1 billion years, possibly long enough for life to evolve, is not consistent with what science has observed in our solar system, our planet, and the origin of life. It is estimated that the first life did not appear until almost 1 billion years after formation, making this wide speculation.

      The idea that an increased density of stars in the galaxy could support increased habitability of planets is inconsistent with the astrophysical understanding that stellar radiation, if too intense, would destroy all life and the biochemicals it requires.

      It is also widely speculative to propose that any of the fundamental constants and force tolerances can be virtually arbitrarily reassigned with minimal effect, without much more serious scientific analysis. In light of the fundamental fact that the naturalistic understanding of the origin of life is a chaotic mess, it is widely speculative to conjecture that the fine-tuning of the universe is not critical.


    • richardmitnick 10:13 am on November 10, 2017 Permalink | Reply

      Thanks for reading and commenting. I appreciate it.

