Tagged: Physics

  • richardmitnick 4:00 pm on September 16, 2018
    Tags: Physics, TheatreWorks

    From Symmetry: “A play in parallel universes”


    10/10/17 [From the past]
    Kathryn Jepsen

    Artwork by Sandbox Studio, Chicago with Ana Kova

    Constellations illustrates the many-worlds interpretation of quantum mechanics—with a love story.

    The play Constellations begins with two people, Roland and Marianne, meeting for the first time. It’s a short scene, and it doesn’t go well. Then the lights go down, come back up, and it’s as if the scene has reset itself. The characters meet for the first time, again, but with slightly different (still unfortunate) results.

    The entire play progresses this way, showing multiple versions of different scenes between Roland, a beekeeper, and Marianne, an astrophysicist.

    In the script, each scene is divided from the next by an indented line. As the stage notes explain: “An indented rule indicates a change in universe.”

    To scientist Richard Partridge, who recently served as a consultant for a production of Constellations at TheatreWorks Silicon Valley, it’s a play about quantum mechanics.

    “Quantum mechanics is about everything happening at once,” he says.

    We don’t experience our lives this way, but atoms and particles do.

    In 1927, physicists Niels Bohr and Werner Heisenberg wrote that, on the scale of atoms and smaller, the properties of physical systems remain undefined until they are measured. Light, for example, can behave as a particle or a wave. But until someone observes it to be one or the other, it exists in a state of quantum superposition: It is both a particle and a wave at the same time. When a scientist takes a measurement, the two possibilities collapse into a single truth.

    Physicist Erwin Schrödinger illustrated this with a cat. He created a thought experiment in which the decay of an atom—an event ruled by quantum mechanics—would trigger toxic gas to be released in a steel chamber with a cat inside. By the rules of quantum mechanics, until someone opened the chamber, the cat existed in a state of superposition: simultaneously alive and dead.
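    In textbook notation (an illustrative aside, not part of the original article), the unopened chamber is described by an equal superposition, and the Born rule assigns each outcome a 50 percent probability once a measurement is made:

    ```latex
    % Schrodinger-cat superposition and Born-rule probabilities (standard textbook form)
    |\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|\text{alive}\rangle + |\text{dead}\rangle\bigr),
    \qquad
    P(\text{alive}) = |\langle \text{alive}|\psi\rangle|^{2} = \tfrac{1}{2} = P(\text{dead}).
    ```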

    Some interpretations of quantum mechanics dispute the idea that observing a system can determine its true state. In the many-worlds interpretation, every possibility exists in a giant collection of parallel realities.

    In some, the cat lives. In others, it does not.

    In some Constellations universes, the astrophysicist and the beekeeper fall in love. In others, they do not. “So it’s not really about physics,” Partridge says.

    Constellations director Robert Kelley, who founded TheatreWorks in 1970, agrees. He says he was intimidated by the physics concepts in the play at first but that he was eventually drawn to the relationship at its core.

    “With all of these things swirling around in the play, what really counts is the relationship between two people and the love that grows between them,” he says. “I found that a very charming message for Silicon Valley. We’re surrounded by a whole lot of technology, but probably for most people what counts is when you get home and you’re on the couch and your one-and-a-half-year-old shows up.”

    _______________________________________________________________
    TheatreWorks in Silicon Valley production of Constellations

    Cosmologist Marianne (Carie Kawa) and beekeeper Roland (Robert Gilbert) explore the ever-changing mystery of “what ifs” in the regional premiere of Constellations presented by TheatreWorks Silicon Valley, August 23-September 17, at the Mountain View Center for the Performing Arts.
    Photo by Kevin Berne
    _______________________________________________________________

    Kelley says that he found something familiar in the many timelines of the play. “It’s really kind of fun to see all that happen because it’s common ground for us as human beings: You hang up the phone and think, ‘If only I’d said that or hadn’t said that.’ It’s a fascinating thought that every single thing that happens will then determine every single other thing that happens.”

    Constantly resetting and replaying the same scenes “was very acrobatic,” says Los Angeles-based actress Carie Kawa, who played Marianne in the TheatreWorks production, which concluded in September. “And there were emotional acrobatics—just jumping into different emotional states. Usually you get a little longer arc; this play is just all middles, almost like shooting a film.”

    To her, the repeats and jumps were familiar in a different way: They were an encapsulation of the experience of acting.

    “We do the play over and over again,” she says. “It’s the same scene, but it’s different every single time. And if we’re doing it right, we’re not thinking about the scene that just happened or the scene that’s to come, we’re in the moment.”

    The play will mean different things to different people, Kawa says.

    “A teacher once told me a story about theater and a perspective that he had,” she says. “At first he said, ‘Theater is important because everybody can come together and feel the same feeling at the same time and know that we’re all okay.’

    “But as he progressed in this artistry he realized that, no, what’s happening is everybody is feeling a slightly different feeling at the same time. And that’s OK. That’s what helps us experience our humanity and the humanity of the other people around us. We’re all alone in this together.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 9:55 am on September 16, 2018
    Tags: A new level of “self-awareness” to Earth’s self-regulation which is at the heart of the original Gaia theory, Creating transformative solutions to the global changes that humans are now causing is a key focus of the University of Exeter’s new Global Systems Institute, GAIA 2.0, Physics, Selection by survival alone, Self-regulating system, Stability comes from “sequential selection”

    From Astrobiology Magazine: “Famous theory of the living Earth upgraded to ‘Gaia 2.0’”


    Sep 15, 2018
    No writer credit

    The original Gaia Theory was developed in the late 1960s by James Lovelock, a British scientist and inventor. Credit: NASA

    James Lovelock. Credit: Bruno Comby, English Wikipedia.

    A time-honoured theory into why conditions on Earth have remained stable enough for life to evolve over billions of years has been given a new, innovative twist.

    For around half a century, the ‘Gaia’ hypothesis has provided a unique way of understanding how life has persisted on Earth.

    It champions the idea that living organisms and their inorganic surroundings evolved together as a single, self-regulating system that has kept the planet habitable for life – despite threats such as a brightening Sun, volcanoes and meteorite strikes.

    However, Professor Tim Lenton from the University of Exeter and famed French sociologist of science Professor Bruno Latour are now arguing that humans have the potential to ‘upgrade’ this planetary operating system to create “Gaia 2.0”.

    They believe that the evolution of both humans and their technology could add a new level of “self-awareness” to Earth’s self-regulation, which is at the heart of the original Gaia theory.

    As humans become more aware of the global consequences of their actions, including climate change, a new kind of deliberate self-regulation becomes possible where we limit our impacts on the planet.

    Professors Lenton and Latour suggest that this “conscious choice” to self-regulate introduces a “fundamental new state of Gaia” – which could help us achieve greater global sustainability in the future.

    However, such self-aware self-regulation relies on our ability to continually monitor and model the state of the planet and our effects upon it.

    Professor Lenton, Director of Exeter’s new Global Systems Institute, said: “If we are to create a better world for the growing human population this century then we need to regulate our impacts on our life-support system, and deliberately create a more circular economy that relies – like the biosphere – on the recycling of materials powered by sustainable energy.”

    The original Gaia Theory was developed in the late 1960s by James Lovelock, a British scientist and inventor. It suggested that both the organic and inorganic components of Earth evolved together as one single, self-regulating system which can control global temperature and atmospheric composition to maintain its own habitability.

    The new perspective article is published in the leading journal Science on September 14, 2018.

    It follows recent research, led by Professor Lenton, which offered a fresh solution to how the Gaia hypothesis works in real terms: Stability comes from “sequential selection” in which situations where life destabilises the environment tend to be short-lived and result in further change until a stable situation emerges, which then tends to persist.

    Once this happens, the system has more time to acquire further properties that help to stabilise and maintain it – a process known as “selection by survival alone”.
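    As a toy illustration of that idea (a sketch of my own, not the authors’ model), the loop below keeps drawing a random feedback strength until it happens upon one whose dynamics stay bounded; destabilising draws die quickly and are replaced, while the stable draw persists:

    ```python
    # Toy sketch of "sequential selection": unstable feedback configurations are
    # short-lived and get replaced; a self-stabilising one, once found, persists.
    import random

    random.seed(1)
    HORIZON = 10_000  # steps a configuration must survive to count as "stable"

    def lifetime(feedback: float) -> int:
        """Steps that x(t+1) = feedback * x(t) + noise stays in a habitable band."""
        x, steps = 1.0, 0
        while abs(x) < 100.0 and steps < HORIZON:
            x = feedback * x + random.uniform(-0.1, 0.1)  # environmental noise
            steps += 1
        return steps

    epochs = 0
    feedback = random.uniform(0.5, 1.5)      # random initial feedback strength
    while lifetime(feedback) < HORIZON:      # destabilising draws are short-lived...
        feedback = random.uniform(0.5, 1.5)  # ...and are replaced by a new draw
        epochs += 1

    print(f"stable feedback {feedback:.3f} found after {epochs} unstable epochs")
    ```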

    Creating transformative solutions to the global changes that humans are now causing is a key focus of the University of Exeter’s new Global Systems Institute.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 2:37 pm on September 15, 2018
    Tags: Physics, The End of Theoretical Physics As We Know It

    From Quanta Magazine: “The End of Theoretical Physics As We Know It” 


    August 27, 2018
    Sabine Hossenfelder

    James O’Brien for Quanta Magazine

    Computer simulations and custom-built quantum analogues are changing what it means to search for the laws of nature.

    Theoretical physics has a reputation for being complicated. I beg to differ. That we are able to write down natural laws in mathematical form at all means that the laws we deal with are simple — much simpler than those of other scientific disciplines.

    Unfortunately, actually solving those equations is often not so simple. For example, we have a perfectly fine theory that describes the elementary particles called quarks and gluons, but no one can calculate how they come together to make a proton. The equations just can’t be solved by any known methods. Similarly, a merger of black holes or even the flow of a mountain stream can be described in deceptively simple terms, but it’s hideously difficult to say what’s going to happen in any particular case.

    Of course, we are relentlessly pushing the limits, searching for new mathematical strategies. But in recent years much of the pushing has come not from more sophisticated math but from more computing power.

    When the first math software became available in the 1980s, it didn’t do much more than save someone a search through enormous printed lists of solved integrals. But once physicists had computers at their fingertips, they realized they no longer had to solve the integrals in the first place; they could just plot the solution.
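    To make that shift concrete (an illustrative example, not from the essay): the integrand exp(-x²)·cos(x) has no elementary antiderivative, yet a few lines of numerical code evaluate its integral directly.

    ```python
    # "Just plot it": evaluate an integral numerically instead of solving it.
    import numpy as np
    from scipy.integrate import quad

    f = lambda x: np.exp(-x**2) * np.cos(x)     # no elementary antiderivative

    ts = np.linspace(0.0, 4.0, 81)              # upper limits of integration
    Fs = [quad(f, 0.0, t)[0] for t in ts]       # F(t) = integral of f from 0 to t

    for t, F in list(zip(ts, Fs))[::20]:
        print(f"F({t:.1f}) = {F:.6f}")
    # Plotting ts against Fs (e.g. with matplotlib) replaces the search through
    # printed tables of solved integrals.
    ```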

    In the 1990s, many physicists opposed this “just plot it” approach. Many were not trained in computer analysis, and sometimes they couldn’t tell physical effects from coding artifacts. Maybe this is why I recall many seminars in which a result was disparaged as “merely numerical.” But over the past two decades, this attitude has markedly shifted, not least thanks to a new generation of physicists for whom coding is a natural extension of their mathematical skill.

    Accordingly, theoretical physics now has many subdisciplines dedicated to computer simulations of real-world systems, studies that would just not be possible any other way. Computer simulations are what we now use to study the formation of galaxies and supergalactic structures, to calculate the masses of particles that are composed of several quarks, to find out what goes on in the collision of large atomic nuclei, and to understand solar cycles, to name but a few areas of research that are mainly computer based.

    The next step of this shift away from purely mathematical modeling is already on the way: Physicists now custom-design laboratory systems that stand in for other systems they want to understand better. They observe the simulated system in the lab to draw conclusions about, and make predictions for, the system it represents.

    The best example may be the research area that goes by the name “quantum simulations.” These are systems composed of interacting, composite objects, like clouds of atoms. Physicists manipulate the interactions among these objects so the system resembles an interaction among more fundamental particles. For example, in circuit quantum electrodynamics, researchers use tiny superconducting circuits to simulate atoms, and then study how these artificial atoms interact with photons. Or in a lab in Munich, physicists use a superfluid of ultra-cold atoms to settle the debate over whether Higgs-like particles can exist in two dimensions of space (the answer is yes [Nature]).

    These simulations are not only useful to overcome mathematical hurdles in theories we already know. We can also use them to explore consequences of new theories that haven’t been studied before and whose relevance we don’t yet know.

    This is particularly interesting when it comes to the quantum behavior of space and time itself — an area where we still don’t have a good theory. In a recent experiment, for example, Raymond Laflamme, a physicist at the Institute for Quantum Computing at the University of Waterloo in Ontario, Canada, and his group used a quantum simulation to study so-called spin networks, structures that, in some theories, constitute the fundamental fabric of space-time. And Gia Dvali, a physicist at the University of Munich, has proposed a way to simulate the information processing of black holes with ultracold atom gases.

    A similar idea is being pursued in the field of analogue gravity, where physicists use fluids to mimic the behavior of particles in gravitational fields. Black hole space-times have attracted the bulk of attention, as with Jeff Steinhauer’s (still somewhat controversial) claim of having measured Hawking radiation in a black-hole analogue. But researchers have also studied the rapid expansion of the early universe, called “inflation,” with fluid analogues for gravity.

    In addition, physicists have studied hypothetical fundamental particles by observing stand-ins called quasiparticles. These quasiparticles behave like fundamental particles, but they emerge from the collective movement of many other particles. Understanding their properties allows us to learn more about their behavior, and thereby might also help us find ways of observing the real thing.

    This line of research raises some big questions. First of all, if we can simulate what we now believe to be fundamental by using composite quasiparticles, then maybe what we currently think of as fundamental — space and time and the 25 particles that make up the Standard Model of particle physics — is made up of an underlying structure, too. Quantum simulations also make us wonder what it means to explain the behavior of a system to begin with. Does observing, measuring, and making a prediction by use of a simplified version of a system amount to an explanation?

    But for me, the most interesting aspect of this development is that it ultimately changes how we do physics. With quantum simulations, the mathematical model is of secondary relevance. We currently use the math to identify a suitable system because the math tells us what properties we should look for. But that’s not, strictly speaking, necessary. Maybe, over the course of time, experimentalists will just learn which system maps to which other system, as they have learned which system maps to which math. Perhaps one day, rather than doing calculations, we will just use observations of simplified systems to make predictions.

    At present, I am sure, most of my colleagues would be appalled by this future vision. But in my mind, building a simplified model of a system in the laboratory is conceptually not so different from what physicists have been doing for centuries: writing down simplified models of physical systems in the language of mathematics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 2:36 pm on September 14, 2018
    Tags: Nuclear pasta in neutron stars may be the strongest material in the universe, Physics

    From Science News: “Nuclear pasta in neutron stars may be the strongest material in the universe” 


    September 14, 2018
    Emily Conover

    Simulations suggest the theoretical substance is 10 billion times as strong as steel.

    TOUGH STUFF An exotic substance thought to exist within a type of collapsed star called a neutron star (illustrated) may be stronger than any other known material.
    Casey Reed/Penn State University, Wikimedia Commons

    A strand of spaghetti snaps easily, but an exotic substance known as nuclear pasta is an entirely different story.

    Predicted to exist in ultradense dead stars called neutron stars, nuclear pasta may be the strongest material in the universe. Breaking the stuff requires 10 billion times the force needed to crack steel, for example, researchers report in a study accepted in Physical Review Letters.

    “This is a crazy-big figure, but the material is also very, very dense, so that helps make it stronger,” says study coauthor and physicist Charles Horowitz of Indiana University Bloomington.

    Neutron stars form when a dying star explodes, leaving behind a neutron-rich remnant that is squished to extreme pressures by powerful gravitational forces, resulting in materials with bizarre properties (SN: 12/23/17, p. 7).

    About a kilometer below the surface of a neutron star, atomic nuclei are squeezed together so close that they merge into clumps of nuclear matter, a dense mixture of neutrons and protons. These as-yet theoretical clumps are thought to be shaped like blobs, tubes or sheets, and are named after their noodle look-alikes, including gnocchi, spaghetti and lasagna. Even deeper in the neutron star, the nuclear matter fully takes over. The burnt-out star’s entire core is nuclear matter, like one giant atomic nucleus.

    Nuclear pasta is incredibly dense, about 100 trillion times the density of water. It’s impossible to study such an extreme material in the laboratory, says physicist Constança Providência of the University of Coimbra in Portugal, who was not involved with the research.
    ____________________________________________
    Al dente

    When atomic nuclei get squeezed together inside a neutron star, scientists think that globs of nuclear matter form into shapes reminiscent of various types of pasta, including gnocchi (left in these simulations of nuclear pasta), spaghetti (middle) and lasagna (right).

    M.E. Caplan and C.J. Horowitz/Reviews of Modern Physics 2017
    ____________________________________________

    Instead, the researchers used computer simulations to stretch nuclear lasagna sheets and explore how the material responded. Immense pressures were required to deform the material, and the pressure required to snap the pasta was greater than for any other known material.

    Earlier simulations had revealed that the outer crust of a neutron star was likewise vastly stronger than steel. But the inner crust, where nuclear pasta lurks, was unexplored territory. “Now, what [the researchers] see is that the inner crust is even stronger,” Providência says.

    Physicists are still aiming to find real-world evidence of nuclear pasta. The new results may provide a glimmer of hope. Neutron stars tend to spin very rapidly, and, as a result, might emit ripples in spacetime called gravitational waves, which scientists could detect at facilities like the Advanced Laser Interferometer Gravitational-wave Observatory, or LIGO. But the spacetime ripples will occur only if a neutron star’s crust is lumpy — meaning that it has “mountains,” or mounds of dense material either on the surface or within the crust.

    “The tricky part is, you need a big mountain,” says physicist Edward Brown of Michigan State University in East Lansing. A stiffer, stronger crust would support larger mountains, which could produce more powerful gravitational waves. But “large” is a relative term. Due to the intense gravity of neutron stars, their mountains would be a far cry from Mount Everest, rising centimeters tall, not kilometers. Previously, scientists didn’t know how large a mountain nuclear pasta could support.

    “That’s where these simulations come in,” Brown says. The results suggest that nuclear pasta could support mountains tens of centimeters tall — big enough that LIGO could spot neutron stars’ gravitational waves. If LIGO caught such signals, scientists could estimate the mountains’ size, and confirm that neutron stars have superstrong materials in their crusts.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:53 pm on September 14, 2018
    Tags: Physics

    From CERN: “The LHC prepares for the future” 


    From CERN

    14 Sep 2018
    Corinne Pralavorio

    View of the CERN Control Centre where the operators control the LHC (Image: Maximilien Brice/CERN)

    The Large Hadron Collider is stopping proton collisions for five days this week to undergo numerous tests.


    Accelerator specialists need to test the LHC when it is not in production mode, and only a few weeks remain in which they can do so. At the end of the year, CERN’s accelerators will be shut down for a major two-year upgrade programme that will result in a renovated accelerator complex using more intense beams and higher energy. Scientists are conducting research to prepare for this new stage and the next, the High-Luminosity LHC.

    “We have many requests from CERN’s teams because these periods of machine development allow components to be tested in real conditions and the results of simulations to be checked,” says Jan Uythoven, the head of the machine development programme. No fewer than twenty-four tests are scheduled for what will be this year’s third testing period.

    One of the major areas of research focuses on beam stability: perturbations are systematically tracked and corrected by the LHC operators. When instabilities arise, the operators stop the beams and dump them. “To keep high-intensity beams stable, we have to improve the fine-tuning of the LHC,” Jan Uythoven adds. Extensive research is therefore being carried out to better understand these instabilities, with operators causing them deliberately in order to study how the beams behave.

    The operators are also testing new optics for the High-Luminosity LHC – in other words, a new way of adjusting the magnets to increase the beam concentration at the collision points. Another area of study concerns the heat generated by the more intense future beams, which raises the temperature in the magnets’ cores toward the limit at which they can remain superconducting. Lastly, tests are also being carried out on new components. In particular, innovative collimators were installed at the start of the year. Collimators are protective devices that stop particles deviating from the beam trajectory, preventing them from damaging the accelerator.

    After this five-day test period, the LHC will stop running completely for a technical stop lasting another five days, during which teams will carry out repairs and maintenance. The technical stop will be followed by five weeks of proton collisions before the next period of machine development and the lead-ion run.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Meet CERN in a variety of places:

    Quantum Diaries

    Cern Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    ALICE

    CMS

    LHCb

    OTHER PROJECTS AT CERN

    CERN AEGIS
    CERN ALPHA
    CERN AMS
    CERN ASACUSA
    CERN ATRAP
    CERN AWAKE
    CERN CAST Axion Solar Telescope
    CERN CLOUD
    CERN COMPASS
    CERN DIRAC
    CERN ISOLDE
    CERN LHCf
    CERN NA62
    CERN NTOF
    CERN TOTEM
    CERN UA9
    CERN ProtoDUNE

     
  • richardmitnick 1:23 pm on September 14, 2018
    Tags: Physics

    From Lawrence Berkeley National Lab: “Gamma Rays, Watch Out: There’s a New Detector in Town” 


    From Lawrence Berkeley National Lab

    September 14, 2018
    Theresa Duque
    tnduque@lbl.gov
    (510) 495-2418

    Heather Crawford and her team of researchers are developing a prototype for an ultrahigh-rate high-purity germanium detector that can count 2 to 5 million gamma rays per second while maintaining high resolution. (Credit: Marilyn Chung/Berkeley Lab)

    Heather Crawford has always had a natural bent for science. When she was a high school student in her native Canada, she took all the science electives within reach without a second thought. She went into college thinking she would study biochemistry, but that all changed when she took her first class in nuclear science – the study of the subatomic world. Her professors noticed her talent for nuclear chemistry, and soon she found herself working as an undergraduate researcher in nuclear science at TRIUMF, the accelerator facility in Vancouver, Canada.

    Today, Crawford is a staff scientist in the Nuclear Science Division at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab). With funding from an Early Career Laboratory Directed Research and Development (LDRD) award announced last year, she and her team of researchers have been developing a prototype for an ultrahigh-rate high-purity germanium (HPGe) detector that can count 2 to 5 million gamma rays per second while maintaining high resolution, allowing them to accurately measure the energy spectrum under extreme conditions. A conventional HPGe detector loses resolution when it goes above 50,000 counts per second.

    Gamma rays hail from nuclear decays and reactions within neutron stars, supernova explosions, and regions around black holes. But they also have origins here on Earth: Gamma rays are generated by radioactive decay, or reactions in nuclear power plants, for example. Their ubiquity thus serves as an all-purpose clue for solving wide-ranging mysteries, from tracking down isotope “fingerprints” of elements in stars, to assessing the impact of a nuclear power plant disaster.

    Crawford said that the ultrafast, high-resolution detector will allow scientists to do more research in less time, collecting gamma-ray statistics at 10 to 100 times the rate previously possible. This opens up new possibilities for gamma-ray spectroscopy in the rarest nuclear systems, such as superheavy elements. “Whenever you’re doing gamma-ray spectroscopy, it’s about resolution and efficiency – ideally, you want an experiment to run for a couple of weeks, not years,” she added.

    With the design for the small yet mighty detector finalized last month – the device measures just 3 inches wide and 3 inches tall – Crawford and her team look forward to testing the prototype, which was fabricated at Berkeley Lab’s Semiconductor Detector Laboratory, as an individual detector, and then moving toward an array.

    “This LDRD gave us a unique opportunity to gain a deeper understanding of how germanium detectors work. Berkeley Lab has always been at the forefront of physics and nuclear science. If our prototype works, we will continue to move forward and push the science of both HPGe detectors and heavy elements,” she said.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California


     
  • richardmitnick 9:25 am on September 14, 2018
    Tags: Physics

    From Stanford University: “Exploring the landscape” 

    From Stanford University

    September 14, 2018
    Ker Than

    The String Theory Landscape is a divisive issue among physicists as they continue trying to prove or disprove its key elements. (Image credit: Eric Nyquist)

    Stanford physicists continue to survey the peaks and valleys of the String Theory Landscape that they helped discover nearly two decades ago, even as critics say the theory is ultimately untestable. This story is part 5 of a five-part series.

    Nearly two decades after its proposal, the String Theory Landscape remains divisive among physicists. “In the beginning there were people who hated it. Some hate it even now, and more strongly than before,” Andrei Linde said.

    Many view the Landscape as a kind of Faustian bargain: It elegantly explains why the universe appears to be so eerily fine-tuned for life – there are myriad universes and we just happen to live in one that’s tuned for us – but it does so by dashing Einstein’s dream of one day uncovering a “theory of everything” from which the precise values of nature’s laws and constants logically and inevitably arise.

    The idea that our universe must have laws suitable for life is called the anthropic principle, and it’s a notion many physicists despise. One U.S. Nobel laureate called it “defeatist” and “dangerous” and said it “smells of religion and intelligent design.”

    Even Landscape proponents accept anthropic selection only with a measure of resignation and ambivalence. “Anthropics is distasteful to most string theorists. You’d rather have a nice equation which you can solve to predict the mass of the electron, but that seems very unlikely to me,” Shamit Kachru said. “On the other hand, the Landscape offers in compensation a different kind of elegance: Over vast cosmological scales many solutions are realized; so in that sense, nothing is wasted.”

    Renata Kallosh conceded that the anthropic principle “would not be my first choice” for explaining why the universe is the way it is.

    Renata Kallosh.

    “In science, the preference will always be given to non-anthropic explanations – unless, however, there is nothing better,” she said.

    And there is nothing better at present, according to Leonard Susskind. “It’s not enough to say, ‘I hate the idea.’ You have to say, ‘Here’s a better idea.’ Every month or so somebody will come out with some screwball theory of why the cosmological constant is close to zero, but it won’t last for more than a week,” he said. “A legitimate controversy is when there are two more or less equally good ideas which are in conflict with each other. The simple fact is there is no competition.”

    Leonard Susskind. Photo: Linda Cicero/Stanford News Service

    Susskind also thinks that the reports of the death of Einstein’s dream of a unified field theory are greatly exaggerated, and he offers an analogy. “Imagine you live in a world where you only know about one or two different animal species,” he said. “If you were of a scientific bent, you might say, ‘I need to explain why those two species are exactly the way they are.’ Then it’s discovered that there are a lot more different kinds of species, zillions of them. Does that mean that the search for a grand unified theory of life is to be abandoned? No, it means that whatever the fundamental principles are, they shouldn’t be expected to only give rise to a very small number of possibilities. Whatever the ultimate theory for physics is, it should not lead to a conclusion that there’s only one universe. It should lead to the conclusion that there are lots of them.”

    After nearly 20 years, particle physicist Savas Dimopoulos is tired of the debate. “Nature doesn’t care about our wishes and hopes,” said Dimopoulos. “Our job is to find out what is true. The important thing is not whether we like these ideas or not, but how to test them. Because in the end, physics has two legs: theory and experiment. To find the truth, you need both.”

    Surveying the Landscape

    To that end, Dimopoulos’ group is currently designing ultra-precise “tabletop” experiments to search for millimeter-size extra dimensions that are consistent with string theory. “In string theory, there are six extra dimensions of space, and they can take a tremendous variety of forms,” Dimopoulos said. “This richness tells you that there are many universes, but it also predicts many new particles, including a class of light, weakly interacting massive particles, or WIMPs. If we discover several WIMPs, it will be the first observational evidence that we may have a very complex theory at work.”

    The String Theory Landscape will also inform the “Modern Inflationary Cosmology” project, a multi-institution endeavor funded by the Simons Foundation and coordinated by Eva Silverstein. A goal of the project will be to study the primordial seeds of galaxies and other cosmic structures for clues about physics in the early universe. According to inflationary cosmology, the early universe was filled with fields such as the inflaton field and the gravitational field. In some string theory-influenced models of inflation developed by Silverstein and others, fluctuations in these fields were frozen into patterns resembling triangles, rectangles and other shapes, which were preserved as the universe expanded and the fluctuations blossomed into galaxies and other cosmic structures. These patterns, or “non-Gaussianities,” could appear as unusual arrangements of hot spots in the all-sky temperature map produced by the Planck space observatory, as peculiar groupings of galaxies and galaxy clusters in other telescope surveys, or as deviations in the predicted number and type of black holes present in any given region of space.

    “Discovering or constraining non-Gaussianities in as systematic a way as possible will improve our knowledge of conditions in the universe roughly 14 billion years ago and help us distinguish between vastly different models of inflation,” Silverstein said, “including some classes that arose from the String Theory Landscape.”

    _______________________________________________
    “Inflation once seemed like a wild-eyed fantasy but it now motivates $1 billion experimental proposals and continues to interact with highly theoretical ideas such as the String Theory Landscape. It’s become a part of the cosmological establishment. Whether the String Theory Landscape eventually attains a similar status as part of the standard paradigm, motivating useful experiments, remains to be seen.”
    —Shamit Kachru, Professor of Physics

    Shamit Kachru, Professor of Physics. Stanford University

    _______________________________________________

    Meanwhile, the science of dark energy continues to evolve as part of astrophysics, independent of the question of whether or not dark energy is Einstein’s cosmological constant.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4m Telescope at Cerro Tololo, Chile, which houses DECam at an altitude of 7,200 feet

    Experiments like the European Space Agency’s upcoming Euclid space mission and the ground-based Large Synoptic Survey Telescope (LSST) being assembled in Chile will measure the acceleration of the universe with unprecedented precision and chart the history of cosmic expansion over the past 10 billion years.

    ESA/Euclid spacecraft

    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    “If we learn through Euclid and LSST that the cosmological constant is only approximately a constant, then we will have to rethink some of the underlying assumptions of the String Theory Landscape,” Kallosh said.

    In the end, there may always remain elements of the String Theory Landscape that are difficult or even impossible to test. But this is not unique in science, Silverstein said. “We never measure most of what any theory predicts, even empirically well-established ones,” she said. “It is logically possible to find local support that aspects of string theory are manifested in our observable universe, which would bolster the case that perhaps the more ‘out there’ predictions of the theory – like universes beyond our cosmic horizon – might also be plausible.”

    The history of science is littered with crazy-sounding ideas that have panned out, Kachru said. “Inflation once seemed like a wild-eyed fantasy but it now motivates $1 billion experimental proposals and continues to interact with highly theoretical ideas such as the String Theory Landscape. It’s become a part of the cosmological establishment,” he added. “Whether the String Theory Landscape eventually attains a similar status as part of the standard paradigm, motivating useful experiments, remains to be seen.”

    Perhaps, Kallosh reflected, physicists are just too impatient. She noted that it took 50 years to confirm the existence of the Higgs boson – the final piece of the Standard Model. “Maybe we just need more time,” she said.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Stanford University campus. No image credit

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.


     
  • richardmitnick 9:26 pm on September 13, 2018
    Tags: Because we only looked at one-millionth of the data that's out there. Perhaps the nightmare is one we've brought upon ourselves, Every 25 nanoseconds there's a chance of a collision, Has The Large Hadron Collider Accidentally Thrown Away The Evidence For New Physics?, Most of CERN's data from the LHC has been lost forever., Only 0.0001% of the total data can be saved for analysis, Out of every one million collisions that occurs at the LHC only one of them has all of its data written down and recorded., Physics, We think we're doing the smart thing by choosing to save what we're saving but we can't be sure

    From Ethan Siegel: “Has The Large Hadron Collider Accidentally Thrown Away The Evidence For New Physics?” 

    From Ethan Siegel
    Sep 13, 2018

    The Universe is out there, waiting for you to discover it.

    The ATLAS particle detector of the Large Hadron Collider (LHC) at the European Nuclear Research Center (CERN) in Geneva, Switzerland. Built inside an underground tunnel of 27 km (17 miles) in circumference, CERN’s LHC is the world’s largest and most powerful particle collider and the largest single machine in the world. It can only record a tiny fraction of the data it collects. No image credit.

    Over at the Large Hadron Collider, protons simultaneously circle clockwise and counterclockwise, smashing into one another while moving at 99.9999991% the speed of light apiece. At two specific points designed to have the greatest numbers of collisions, enormous particle detectors were constructed and installed: the CMS and ATLAS detectors. After billions upon billions of collisions at these enormous energies, the LHC has brought us further in our hunt for the fundamental nature of the Universe and our understanding of the elementary building blocks of matter.

    Earlier this month, the LHC celebrated 10 years of operation, with the discovery of the Higgs boson marking its crowning achievement. Yet despite these successes, no new particles, interactions, decays, or fundamental physics has been found. Worst of all is this: most of CERN’s data from the LHC has been lost forever.


    The CMS Collaboration, whose detector is shown prior to final assembly here, has released their latest, most comprehensive results ever. There is no indication of physics beyond the Standard Model in the results. CERN/Maximilien Brice.

    This is one of the least well-understood pieces of the high-energy physics puzzle, at least among the general public. The LHC hasn’t just lost most of its data: it’s lost a whopping 99.9999% of it. That’s right; out of every one million collisions that occur at the LHC, only one has all of its data written down and recorded.

    It’s something that happened out of necessity, due to the limitations imposed by the laws of nature themselves, as well as what technology can presently do. But in making that decision, there’s a tremendous fear made all the more palpable by the fact that, other than the much-anticipated Higgs, nothing new has been discovered. The fear is this: that there is new physics waiting to be discovered, but we’ve missed it by throwing this data away.

    A four-muon candidate event in the ATLAS detector at the Large Hadron Collider. The muon/anti-muon tracks are highlighted in red, as the long-lived muons travel farther than any other unstable particle. This is an interesting event, but for every event we record, a million others get discarded. ATLAS Collaboration/CERN

    We didn’t have a choice in the matter, really. Something had to be thrown away. The way the LHC works is by accelerating protons as close to the speed of light as possible in opposite directions and smashing them together. This is how particle accelerators have worked best for generations. According to Einstein, a particle’s energy is a combination of its rest mass (which you may recognize as E = mc²) and the energy of its motion, also known as its kinetic energy. The faster you go — or more accurately, the closer you get to the speed of light — the higher energy-per-particle you can achieve.

    At the LHC, we collide protons together at 299,792,455 m/s, just 3 m/s shy of the speed of light itself. By smashing them together at such high speeds, moving in opposite directions, we make it possible for otherwise impossible particles to exist.

    The reason is this: all particles (and antiparticles) that we can create have a certain amount of energy inherent to them, in the form of their mass-at-rest. When you smash two particles together, some of that energy has to go into the individual components of those particles, both their rest energy and their kinetic energy (i.e., their energy-of-motion).

    But if you have enough energy, some of that energy can also go into the production of new particles! This is where E = mc² gets really interesting: not only do all particles with a mass (m) have an energy (E) inherent to their existence, but if you have enough available energy, you can create new particles. At the LHC, humanity has achieved collisions with more available energy for the creation of new particles than in any other laboratory in history.

    The energy-per-particle is around 7 TeV, meaning each proton achieves approximately 7,000 times its rest-mass energy in the form of kinetic energy. But collisions are rare, and protons aren’t just tiny; they’re mostly empty space. In order to get a large probability of a collision, you need to put more than one proton in at a time; you inject your protons in bunches instead.

    At full intensity, this means that there are many tiny bunches of protons going clockwise and counterclockwise inside the LHC whenever it’s running. The LHC tunnels are approximately 26 kilometers long, with only 7.5 meters (or around 25 feet) separating each bunch. As these bunches of beams go around, they get squeezed as they interact at the mid-point of each detector. Every 25 nanoseconds, there’s a chance of a collision.
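    A quick sanity check on those figures (an illustrative calculation added here, not from the original post): the quoted proton speed corresponds to a Lorentz factor of roughly 7,000, matching the “approximately 7,000 times its rest-mass energy,” and light needs about 25 nanoseconds to cover the 7.5-meter bunch spacing.

    ```python
    # Back-of-the-envelope check of the numbers quoted above (illustration only).
    c = 299_792_458.0   # speed of light, m/s
    v = 299_792_455.0   # proton speed quoted in the article, 3 m/s shy of c

    gamma = 1.0 / (1.0 - (v / c) ** 2) ** 0.5   # Lorentz factor
    proton_rest_energy_GeV = 0.938              # proton rest mass-energy

    print(f"Lorentz factor: ~{gamma:,.0f}")                                     # ~7,000
    print(f"energy per proton: ~{gamma * proton_rest_energy_GeV / 1e3:.1f} TeV")

    bunch_spacing_m = 7.5                       # separation between bunches
    print(f"time between bunch crossings: {bunch_spacing_m / c * 1e9:.0f} ns")  # 25 ns
    ```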

    So what do you do? Do you have a small number of collisions and record every one? That’s a waste of energy and potential data.

    Instead, you pump in enough protons in each bunch to ensure you have a good collision every time two bunches pass through. And every time you have a collision, particles rip through the detector in all directions, triggering the complex electronics and circuitry that allow us to reconstruct what was created, when, and where in the detector. It’s like a giant explosion, and only by measuring all the pieces of shrapnel that come out can we reconstruct what happened (and what new things were created) at the point of ignition.


    The problem that then arises, however, is in taking all of that data and recording it. The detectors themselves are big: 22 meters for CMS and 46 meters long for ATLAS. At any given time, there are particles arising from three different collisions in CMS and six separate collisions in ATLAS. In order to record data, there are two steps that must occur:

    1. The data has to be moved into the detector’s memory, which is limited by the speed of your electronics. Even at the speed of light, we can only “remember” about 1-in-1,000 collisions.
    2. The data in memory has to be written to disk (or some other permanent device), and that’s a much slower process than storing data in memory. Only about 1-in-1,000 collisions that the memory stores can be written to disk.

    That’s why, with the necessity of taking both of these steps, only 0.0001% of the total data can be saved for analysis.
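    The compounding of those two stages is what yields the one-in-a-million figure. Here is a minimal sketch of two chained 1-in-1,000 selections (my own illustration; not CERN’s actual trigger code):

    ```python
    # Two-stage trigger sketch: each stage keeps roughly 1 in 1,000 events.
    import random

    random.seed(0)
    N = 1_000_000                          # simulated collisions

    in_memory = on_disk = 0
    for _ in range(N):
        if random.random() < 1e-3:         # stage 1: fast selection into memory
            in_memory += 1
            if random.random() < 1e-3:     # stage 2: slower write to disk
                on_disk += 1

    print(f"collisions: {N:,}  in memory: {in_memory:,}  on disk: {on_disk:,}")
    print(f"fraction saved: {on_disk / N:.6%}")   # ~0.000100%, i.e. 1 in a million
    ```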

    How do we know we’re saving the right pieces of data? The ones where it’s most likely we’re creating new particles, seeing the importance of new interactions, or observing new physics?

    When you have proton-proton collisions, most of what comes out are normal particles, in the sense that they’re made up almost exclusively of up-and-down quarks. (This means particles like protons, neutrons, and pions.) And most collisions are glancing collisions, meaning that most of the particles wind up hitting the detector in the forwards or backwards direction.

    So, to take that first step, we try to look for relatively high-energy particle tracks that go in the transverse direction, rather than forwards or backwards. We try to put into the detector’s memory the events that we think had the most available energy (E) for creating new particles, of the highest mass (m) possible. Then, we quickly perform a computational scan of what’s in the detector’s memory to see if it’s worth writing to disk or not. If we choose to do so, that’s the only thing that detector will be writing for approximately the next 1/40th of a second or so.

    1/40th of a second might not seem like much, but it’s approximately 25,000,000 nanoseconds: enough time for about a million bunches to collide.

    The particle tracks emanating from a high energy collision at the LHC in 2014. Only 1-in-1,000,000 such collisions have been written down and saved; the majority have been lost.

    We think we’re doing the smart thing by choosing to save what we’re saving, but we can’t be sure. In 2010, the CERN Data Centre passed an enormous data milestone: 10 Petabytes of data. By the end of 2013, they had passed 100 Petabytes of data; in 2017, they passed the 200 Petabyte milestone. Yet for all of it, we know that we’ve thrown away — or failed to record — about 1,000,000 times that amount. We may have collected hundreds of Petabytes, but we’ve discarded, and lost forever, hundreds of Zettabytes: more than the total amount of internet data created in a year.

    The total amount of data that’s been collected by the LHC far outstrips the total amount of data sent-and-received over the internet over the last 10 years. But only 0.0001% of that data has been written down and saved; the rest is gone for good. No image credit.

    It’s eminently possible that the LHC created new particles, saw evidence of new interactions, and observed and recorded all the signs of new physics. And it’s also possible, due to our ignorance of what we were looking for, we’ve thrown it all away, and will continue to do so. The nightmare scenario — of no new physics beyond the Standard Model — appears to be coming true. But the real nightmare is the very real possibility that the new physics is there, we’ve built the perfect machine to find it, we’ve found it, and we’ll never realize it because of the decisions and assumptions we’ve made. The real nightmare is that we’ve fooled ourselves into believing the Standard Model is right, because we only looked at one-millionth of the data that’s out there. Perhaps the nightmare is one we’ve brought upon ourselves.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    See the full article here.

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 12:21 pm on September 13, 2018
    Tags: Lambda leads the way, Physics, The Cosmic Landscape

    From Stanford University: “Lambda leads the way” 

    From Stanford University

    September 13, 2018
    Ker Than

    Most physicists think that dark energy, the cosmological constant, and lambda all refer to a repulsive energy infused in empty space itself. (Image credit: Eric Nyquist)

    The discovery of dark energy in the 1990s marked a time of reckoning for string theorists: Either their theory had to account for the newfound force that was pushing space-time apart or they had to admit that string theory may never describe the universe we actually live in. This story is part 4 of a five-part series.

    In 1998, astronomers hunting halfway across the universe for the ebbing light of exploded stars announced they had discovered evidence that the universe’s expansion is speeding up and not, as had been suspected since 1929, slowing down.

    The realization came as “a thunderbolt to physicists, something so shocking that we are still reeling from the impact,” Leonard Susskind wrote in his book The Cosmic Landscape.

    Leonard Susskind. Photo: Linda Cicero/Stanford News Service

    “Physicists everywhere were asking, ‘Is the experiment wrong?’” Renata Kallosh recalled.

    But with every passing year, new experiments confirmed the results: Expansion is accelerating, not slowing down. For those results to be true, an elusive force that physicists had come to refer to as “dark energy” must be real.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4m Telescope at Cerro Tololo, Chile, which houses DECam at an altitude of 7,200 feet

    Einstein had predicted the existence of dark energy in 1917 when he applied his general theory of relativity to the structure of space-time. He needed a hypothetical force to prevent the universe from collapsing, so he invented a repulsive, space-filling energy that he called the cosmological constant, or lambda. When astronomers discovered in the 1920s that the universe is expanding, Einstein realized that lambda was no longer necessary and he scrapped the idea, calling it his “biggest blunder.”

    But Einstein may have been too hard on himself. Today, most physicists think that dark energy, the cosmological constant and lambda all refer to a repulsive energy infused in empty space itself. Quantum mechanics predicts that the spontaneous creation and annihilation of ghostly “virtual particles” generates an anti-gravitational force whose influence grows with the age and size of the universe.

    When astronomers were able to measure lambda experimentally, they found it had a positive but bewilderingly tiny value that was about a trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion times weaker than theory predicted. The Nobel Prize-winning physicist Steven Weinberg called this humiliating mismatch between observation and theory “the bone in our throat.”
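    Spelled out (an editorial gloss, not part of the original article), that string of ten “trillions” is the famous 120-order-of-magnitude mismatch between the quantum-field-theory estimate of the vacuum energy density and the measured value:

    ```latex
    % Ten factors of a trillion span 120 orders of magnitude
    \left(10^{12}\right)^{10} = 10^{120},
    \qquad
    \frac{\rho_{\Lambda}^{\text{theory}}}{\rho_{\Lambda}^{\text{observed}}} \sim 10^{120}.
    ```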

    Equally perplexing, lambda’s tiny value lay just within the narrow range able to support life. If it were much larger, the universe would expand too quickly for galaxies and stars to form; much smaller, and creation would collapse back into a point.

    “Theoretical physics was upside down because of this experimental discovery,” Kallosh said. “We had no explanation whatsoever.”

    The cosmological constant problem

    The first tentative steps toward resolving what came to be known as the “cosmological constant problem” were taken in 2000 by theorists Joseph Polchinski of the University of California, Santa Barbara, and Raphael Bousso, a Stanford postdoc and a former student of Stephen Hawking. The pair published a paper showing that string theory could give rise to an enormous number of unique vacuum states – vastly more than previously thought. “The vacuum state is what remains if you remove all of the particles from the universe,” Andrei Linde explained. “The properties of a vacuum determine what its particles will look like and what the physics of their interactions will be if it were populated.”

    _______________________________________
    “Theoretical physics was upside down because of this experimental discovery. We had no explanation whatsoever.”
    —Renata Kallosh
    Professor of Physics
    _______________________________________

    Each vacuum described, in essence, a potential universe with its own singular take on particles and forces. “It was already known that string theory had lots of solutions,” Susskind said, “but their paper showed that it could have a vast number, and among them could be solutions that had these rare traits like a very low cosmological constant.”

    But despite offering tantalizing hints of string theory universes that could accommodate dark energy, Polchinski and Bousso, who is now at the University of California, Berkeley, stopped short of actually finding one. “They had a correct but imprecise collection of arguments for this diversity,” Susskind said. “They had no real examples of it.”

    In search of de Sitter

    The first reasonably concrete example was discovered by theoretical physicist Eva Silverstein, a professor at the Stanford Institute for Theoretical Physics who was motivated by dark energy’s discovery to search for a mechanism that could create a so-called “de Sitter” solution to string theory. De Sitter solutions (named after the Dutch astronomer Willem de Sitter) represent expanding universes with a positive cosmological constant similar to our own. Silverstein wanted to know if a solution existed in string theory that was compatible with the universe that astronomers actually observe. If none could be found, then string theorists had been wasting their time building castles in the air.

    Up to that point, string theorists had focused on solutions for universes with a negative lambda called anti-de Sitter space-time. “De Sitter solutions are more complex, and until the discovery of dark energy, no one bothered,” Silverstein said. “Some even argued that de Sitter solutions weren’t possible in string theory, and it remains a complicated subject. But these ‘no go’ arguments did not consider the leading contributions to the potential energy in string theory.”

    In 2001, Silverstein published a paper in which she proposed a mechanism for combining various ingredients from string theory – extra dimensions, orientifolds, fluxes and so on – in specific ways to create a de Sitter model. She also predicted that any de Sitter solutions would need to contain certain features. She argued, for example, that the path to positive lambda was indirect and would require making a negative contribution first. “One thing I pointed out early on is that negative contributions to the potential energy, in the right place to produce a local dip in it, would be needed,” Silverstein said, “and that this role could be played by orientifolds, which are defects in string theory’s extra dimensions that have a controlled amount of negative energy.”
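
    Silverstein's "local dip" is easiest to picture with a toy one-dimensional potential. The sketch below is purely illustrative (invented coefficients, one field instead of many): competing positive and negative terms carve out a shallow minimum at small positive energy, a caricature of a metastable de Sitter vacuum.

        import numpy as np

        # Toy potential (illustrative): positive terms sandwiching a negative,
        # orientifold-like contribution that digs a local dip
        def V(phi):
            x = np.exp(-phi)
            return x**2 - 2.0 * x**3 + 1.05 * x**4

        phi = np.linspace(-0.5, 8.0, 100_000)
        v = V(phi)

        # Find interior local minima: points lower than both neighbors
        is_min = (v[1:-1] < v[:-2]) & (v[1:-1] < v[2:])
        for i in np.where(is_min)[0] + 1:
            print(f"local minimum at phi ~ {phi[i]:.2f}, V ~ {v[i]:.4f}")
        # V is positive at the dip (de Sitter-like), and a barrier separates
        # the dip from the runaway to V = 0 at large phi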

    Shamit Kachru, Renata Kallosh and Andrei Linde are three of the four authors of an influential paper that came to be known as KKLT. The paper helped lay the groundwork for the String Theory Landscape. (Image credit: L.A. Cicero)

    KKLT

    Early in 2003, Kallosh and Linde received an email from Shamit Kachru, who had been visiting the string theorist Sandip Trivedi in India. The quartet of physicists was engaged in a long-distance brainstorming session and Kachru’s message contained the kernel of an idea that had come to him during a flight layover in New Delhi.

    When Kallosh plotted data that Kachru had sent, a chart popped up on her computer showing the same potential energy dip that Silverstein had predicted. This dip, however, had been generated using different string theory ingredients and assumptions. “I knew we were onto something then,” Kallosh said.

    Later that year, the four of them published their results in a famous paper that would come to be known simply as KKLT (after the authors’ last initials). KKLT described a class of de Sitter solutions that incorporated a certain symmetry, called supersymmetry, that many physicists were expecting to see confirmed in particle collider experiments.

    “KKLT was a very important paper,” said particle physicist Savas Dimopoulos, the Hamamoto Family Professor in the School of Humanities and Sciences. “We don’t see supersymmetric particles in nature, so if supersymmetry did exist in the early universe, it’s been broken. What KKLT did was point out a breaking mechanism.”

    KKLT was also important for psychological reasons. “It was written by members from different parts of the physics community,” Kachru said. “Renata was a supergravity person, Andrei was an inflation person, and Sandip and I were more mathematical string theorists. All of us were saying that this kind of solution of string theory, which allows accelerated expansion due to dark energy, is something to take seriously.”

    For these reasons, KKLT’s mathematical model, or “construction,” grabbed physicists’ attention in a way that earlier ones had not. Among those affected were Michael Douglas and Frederik Denef, both at Rutgers University at the time, who used the KKLT construction to famously calculate that there might exist as many as 10^500 unique “vacua,” or possible universes, with a small cosmological constant. (For perspective, the total number of particles in the observable universe is estimated to be about 10^90.)
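
    The origin of numbers like 10^500 is combinatorial. In the flux-counting picture Bousso and Polchinski introduced and Douglas and Denef made quantitative, the curled-up extra dimensions can thread hundreds of independent quantized fluxes, and every assignment of flux values defines a separate vacuum. A schematic version of the count, with illustrative round numbers rather than the actual calculation:

        import math

        n_fluxes = 500        # independent quantized fluxes (illustrative)
        values_per_flux = 10  # allowed quanta per flux (illustrative)

        # Each combination of flux values defines a distinct vacuum,
        # so the count grows exponentially with the number of fluxes
        log10_vacua = n_fluxes * math.log10(values_per_flux)
        print(f"vacua:     ~10^{log10_vacua:.0f}")   # ~10^500

        log10_particles = 90  # rough particle count of the observable universe
        print(f"particles: ~10^{log10_particles}")

        # With that many vacua, possible values of the cosmological constant
        # are scanned so finely that some vacua are expected to land inside
        # the tiny observed range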

    Around the same time, Susskind published a paper of his own expanding upon his colleagues’ findings. “I was more of a cheerleader than anything else,” Susskind said. “My paper was really just saying, ‘Hey guys, are you paying attention to this? This is happening.’”

    Susskind is also credited with naming the emerging concept within string theory of countless hypothetical universes with varying properties: He called it the “anthropic Landscape of string theory,” or the “String Theory Landscape” for short. “The Landscape doesn’t refer to a real place,” Susskind said. “It’s a scientific term borrowed from biology and physics that refers to an energy landscape with lots of hills and valleys. In string theory, the Landscape is incredibly rich, and our universe lies in one of the rare, habitable, low-lying valleys.”

    _______________________________________

    “In string theory, the Landscape is incredibly rich, and our universe lies in one of the rare, habitable, low-lying valleys.”
    —Leonard Susskind, Professor of Physics
    _______________________________________

    Susskind also reminded his fellow physicists that they already knew of a mechanism that could generate the tremendous diversity of universes predicted by string theory. This “natural candidate” had been pointed out by Bousso and Polchinski years earlier.

    Recalling his collaboration with Bousso in 2000, Polchinski, who died in February 2018, wrote in his memoir: “But when Bousso came back a few months later … he had added an important part of the story, the cosmology that allowed the theory to explore all these states. It was just Linde’s eternal chaotic inflation. … I had always assumed that such a thing would not be part of string theory, but in fact it arose quite naturally.”

    A Rube Goldberg construction

    If the measure of a theory’s beauty is the ratio of how many things it explains to how many assumptions it makes to explain them, then the constructions by Silverstein and KKLT are not pretty. Their authors rummaged through string theory’s pantry for exotic ingredients and combined them in wildly creative ways to concoct their imaginary universes. The KKLT construction in particular, Susskind said, was made up of “jury-rigged, Rube Goldberg contraptions” – a reference to the American inventor famous for his cartoon sketches of gadgets that performed simple tasks in convoluted ways.

    But the contrived nature of the de Sitter constructions mattered less to theorists than the fact that they existed at all. In a theory where infinite solutions are possible, Susskind argued, “simplicity and elegance are not considerations.” In all their long years of searching, KKLT and its kin were the clearest signs physicists had ever found that string theory could produce universes roughly resembling our own. The constructions the Stanford theorists produced gave powerful support to physicists’ hope that a mathematical version of our cosmos lay hidden somewhere within string theory’s labyrinthine equations and infinite solutions, and that – with ingenuity, luck and perhaps a late-night revelation or two – it might one day be found.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Stanford University campus. No image credit

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

     
  • richardmitnick 4:15 pm on September 12, 2018 Permalink | Reply
    Tags: Physics, Bowtie-funnel combo best for conducting light; team found answer in undergrad physics equation

    From Vanderbilt University: “Bowtie-funnel combo best for conducting light; team found answer in undergrad physics equation” 

    Vanderbilt U Bloc

    From Vanderbilt University

    Aug. 24, 2018
    Heidi Hall

    Running computers on virtually invisible beams of light rather than microelectronics would make them faster, lighter and more energy efficient. A version of that technology already exists in fiber optic cables, but they’re much too large to be practical inside a computer.

    A Vanderbilt team found the answer in a formula familiar to college physics students – a solution so simple and elegant, it was tough for reviewers to believe. Professor Sharon Weiss; her doctoral student, Shuren Hu; and collaborators at the IBM T. J. Watson Research Center and the University of Technology in Troyes, France, published the proof in today’s Science Advances, a peer-reviewed, open-access journal from AAAS.

    They developed a structure that’s part bowtie, part funnel, which concentrates light powerfully and nearly indefinitely, as measured by a scanning near-field optical microscope. Only 12 nanometers connect the points of the bowtie; for comparison, the diameter of a human hair is about 100,000 nanometers.

    The team combined a nanoscale air slot surrounded by silicon with a nanoscale silicon bar surrounded by air. (Vanderbilt University)

    “Light travels faster than electricity and doesn’t have the same heating issues as the copper wires currently carrying the information in computers,” said Weiss, Cornelius Vanderbilt Endowed Chair and Professor of Electrical Engineering, Physics and Materials Science and Engineering. “What is really special about our new research is that the use of the bowtie shape concentrates the light so that a small amount of input light becomes highly amplified in a small region. We can potentially use that for low-power manipulation of information on computer chips.”

    The team published its work as a theory two years ago in ACS Photonics, then partnered with Will Green’s silicon photonics team at IBM to fabricate a device that could prove it.

    The research began with Maxwell’s equations, which describe how light propagates in space and time. Using two principles from these equations and applying boundary conditions that account for the materials used, Weiss and Hu combined a nanoscale air slot surrounded by silicon with a nanoscale silicon bar surrounded by air to make the bowtie shape.
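
    The article doesn't spell out the equation, but the relevant "undergrad physics" is the electromagnetic boundary condition that the normal component of the displacement field D = εE is continuous across a dielectric interface, so the electric field jumps by the ratio of permittivities where light crosses from silicon into an air slot. A minimal sketch of that estimate, using typical telecom-wavelength values rather than numbers from the paper:

        # Boundary-condition estimate of field enhancement in an air slot
        # (illustrative values; not the paper's measured numbers)
        n_si  = 3.48        # refractive index of silicon near 1550 nm (typical)
        n_air = 1.0         # refractive index of air

        eps_si  = n_si**2   # relative permittivity, eps = n^2
        eps_air = n_air**2

        # Continuity of normal D across the interface:
        #   eps_si * E_si = eps_air * E_air
        enhancement = eps_si / eps_air
        print(f"E_air / E_si ~ {enhancement:.1f}")   # ~12x field enhancement

        # Energy density scales as eps * E^2, so the air slot also gains
        # roughly a factor of eps_si in energy density over the silicon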

    “To increase optical energy density, there are generally two ways: focus light down to a small tiny space and trap light in that space as long as possible,” Hu said. “The challenge is not only to squeeze a comparatively elephant-size photon into refrigerator-size space, but also to keep the elephant voluntarily in the refrigerator for a long time. It has been a prevailing belief in photonics that you have to compromise between trapping time and trapping space: the harder you squeeze photons, the more eager they are to escape.”
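
    The tradeoff Hu describes is conventionally quantified by two numbers: the cavity quality factor Q (how long light is trapped) and the mode volume V (how tightly it is squeezed); figures of merit such as the Purcell factor scale as Q/V. A hedged sketch with invented but plausible values, not the measurements from the Science Advances paper:

        import math

        wavelength = 1.55e-6             # telecom wavelength, m (assumed)
        n = 3.48                         # silicon refractive index (typical)
        c = 2.998e8                      # speed of light, m/s

        Q = 1e5                          # quality factor (illustrative)
        V = 1e-3 * (wavelength / n)**3   # deep-subwavelength mode volume (illustrative)

        # Purcell factor, the standard Q/V figure of merit
        F = (3 / (4 * math.pi**2)) * (wavelength / n)**3 * Q / V
        print(f"Purcell factor ~ {F:.1e}")          # ~7.6e6 for these numbers

        # Trapping time follows from Q alone: tau = Q * lambda / (2 * pi * c)
        tau = Q * wavelength / (2 * math.pi * c)
        print(f"photon lifetime ~ {tau:.1e} s")     # ~8e-11 s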

    The team developed a structure that’s part bowtie, part funnel, which conducts light powerfully and nearly indefinitely, as measured by a scanning near-field optical microscope. (Ella Maru Studio)

    Weiss said she and Hu will continue working to improve their device and explore its possible application in future computer platforms.

    This work was funded by National Science Foundation GOALI grant ECCS1407777.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Commodore Cornelius Vanderbilt was in his 79th year when he decided to make the gift that founded Vanderbilt University in the spring of 1873.

    The $1 million that he gave to endow and build the university was the commodore’s only major philanthropy. Methodist Bishop Holland N. McTyeire of Nashville, husband of Amelia Townsend who was a cousin of the commodore’s young second wife Frank Crawford, went to New York for medical treatment early in 1873 and spent time recovering in the Vanderbilt mansion. He won the commodore’s admiration and support for the project of building a university in the South that would “contribute to strengthening the ties which should exist between all sections of our common country.”

    McTyeire chose the site for the campus, supervised the construction of buildings and personally planted many of the trees that today make Vanderbilt a national arboretum. At the outset, the university consisted of one Main Building (now Kirkland Hall), an astronomical observatory and houses for professors. Landon C. Garland was Vanderbilt’s first chancellor, serving from 1875 to 1893. He advised McTyeire in selecting the faculty, arranged the curriculum and set the policies of the university.

    For the first 40 years of its existence, Vanderbilt was under the auspices of the Methodist Episcopal Church, South. The Vanderbilt Board of Trust severed its ties with the church in June 1914 as a result of a dispute with the bishops over who would appoint university trustees.

    Kirkland Hall

    From the outset, Vanderbilt met two definitions of a university: It offered work in the liberal arts and sciences beyond the baccalaureate degree and it embraced several professional schools in addition to its college. James H. Kirkland, the longest serving chancellor in university history (1893-1937), followed Chancellor Garland. He guided Vanderbilt to rebuild after a fire in 1905 that consumed the main building, which was renamed in Kirkland’s honor, and all its contents. He also navigated the university through the separation from the Methodist Church. Notable advances in graduate studies were made under the third chancellor, Oliver Cromwell Carmichael (1937-46). He also created the Joint University Library, brought about by a coalition of Vanderbilt, Peabody College and Scarritt College.

    Remarkable continuity has characterized the government of Vanderbilt. The original charter, issued in 1872, was amended in 1873 to make the legal name of the corporation “The Vanderbilt University.” The charter has not been altered since.

    The university is self-governing under a Board of Trust that, since the beginning, has elected its own members and officers. The university’s general government is vested in the Board of Trust. The immediate government of the university is committed to the chancellor, who is elected by the Board of Trust.

    The original Vanderbilt campus consisted of 75 acres. By 1960, the campus had spread to about 260 acres of land. When George Peabody College for Teachers merged with Vanderbilt in 1979, about 53 acres were added.

    Wyatt Center

    Vanderbilt’s student enrollment tended to double every 25 years during the first century of the university’s history: 307 in the fall of 1875; 754 in 1900; 1,377 in 1925; 3,529 in 1950; 7,034 in 1975. In the fall of 1999 the enrollment was 10,127.

    In the planning of Vanderbilt, the assumption seemed to be that it would be an all-male institution. Yet the board never enacted rules prohibiting women. At least one woman attended Vanderbilt classes every year from 1875 on. Most came to classes by courtesy of professors or as special or irregular (non-degree) students. From 1892 to 1901 women at Vanderbilt gained full legal equality except in one respect — access to dorms. In 1894 the faculty and board allowed women to compete for academic prizes. By 1897, four or five women entered with each freshman class. By 1913 the student body contained 78 women, or just more than 20 percent of the academic enrollment.

    National recognition of the university’s status came in 1949 with election of Vanderbilt to membership in the select Association of American Universities. In the 1950s Vanderbilt began to outgrow its provincial roots and to measure its achievements by national standards under the leadership of Chancellor Harvie Branscomb. By its 90th anniversary in 1963, Vanderbilt for the first time ranked in the top 20 private universities in the United States.

    Vanderbilt continued to excel in research, and the number of university buildings more than doubled under the leadership of Chancellors Alexander Heard (1963-1982) and Joe B. Wyatt (1982-2000), only the fifth and sixth chancellors in Vanderbilt’s long and distinguished history. Heard added three schools (Blair, the Owen Graduate School of Management and Peabody College) to the seven already existing and constructed three dozen buildings. During Wyatt’s tenure, Vanderbilt acquired or built one-third of the campus buildings and made great strides in diversity, volunteerism and technology.

    The university grew and changed significantly under its seventh chancellor, Gordon Gee, who served from 2000 to 2007. Vanderbilt led the country in the rate of growth of academic research funding, which increased to more than $450 million, and the university became one of the most selective undergraduate institutions in the country.

    On March 1, 2008, Nicholas S. Zeppos was named Vanderbilt’s eighth chancellor after serving as interim chancellor beginning Aug. 1, 2007. Prior to that, he spent 2002-2008 as Vanderbilt’s provost, overseeing undergraduate, graduate and professional education programs as well as development, alumni relations and research efforts in liberal arts and sciences, engineering, music, education, business, law and divinity. He first came to Vanderbilt in 1987 as an assistant professor in the law school. In his first five years, Zeppos led the university through the most challenging economic times since the Great Depression, while continuing to attract the best students and faculty from across the country and around the world. Vanderbilt got through the economic crisis notably less scathed than many of its peers, and during the same timespan adopted and remained committed to its much-praised enhanced financial aid policy for all undergraduates. The Martha Rivers Ingram Commons for first-year students opened in 2008 and College Halls, the next phase in the residential education system at Vanderbilt, is on track to open in the fall of 2014. During Zeppos’ first five years, Vanderbilt drew robust support from federal funding agencies, and the Medical Center entered into agreements with regional hospitals and health care systems in middle and east Tennessee that will bring Vanderbilt care to patients across the state.

    Today, Vanderbilt University is a private research university of about 6,500 undergraduates and 5,300 graduate and professional students. The university comprises 10 schools, a public policy center and The Freedom Forum First Amendment Center. Vanderbilt offers undergraduate programs in the liberal arts and sciences, engineering, music, education and human development as well as a full range of graduate and professional degrees. The university is consistently ranked as one of the nation’s top 20 universities by publications such as U.S. News & World Report, with several programs and disciplines ranking in the top 10.

    Cutting-edge research and liberal arts, combined with strong ties to a distinguished medical center, create an invigorating atmosphere where students tailor their education to meet their goals and researchers collaborate to solve complex questions affecting our health, culture and society.

    Vanderbilt, an independent, privately supported university, and the separate, non-profit Vanderbilt University Medical Center share a respected name and enjoy close collaboration through education and research. Together, the number of people employed by these two organizations exceeds that of the largest private employer in the Middle Tennessee region.

     