Tagged: Dr. Steven Weinberg

  • richardmitnick 2:10 pm on March 21, 2018
    Tags: Dr. Steven Weinberg

    From Quanta Magazine: “Science’s Path From Myth to Multiverse” 

    Quanta Magazine

    In his latest book, the Nobel Prize winner Steven Weinberg explores how science made the modern world, and where it might take us from here.

    March 17, 2015 [Just found this in social media.]
    Dan Falk

    Steven Weinberg, University of Texas at Austin

    Steven Weinberg, a physicist at the University of Texas, Austin, won a Nobel Prize in 1979 for work that became a cornerstone of particle physics.

    We can think of the history of physics as an attempt to unify the world around us: Gradually, over many centuries, we’ve come to see that seemingly unrelated phenomena are intimately connected. The physicist Steven Weinberg of the University of Texas, Austin, received his Nobel Prize in 1979 for a major breakthrough in that quest — showing how electromagnetism and the weak nuclear force are manifestations of the same underlying theory (he shared the prize with Abdus Salam and Sheldon Glashow). That work became a cornerstone of the Standard Model of particle physics, which describes how the fundamental building blocks of the universe come together to create the world we see.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    In his new book To Explain the World: The Discovery of Modern Science, Weinberg examines how modern science was born.


    By tracing the development of what we now call the “scientific method” — an approach, developed over centuries, that emphasizes experiments and observations rather than reasoning from first principles — he makes the argument that science, unlike other ways of interpreting the world around us, can offer true progress. Through science, our understanding of the world improves over time, building on what has come before. Mistakes can happen, but are eventually corrected. Weinberg spoke with Quanta Magazine about the past and future of physics, the role of philosophy within science, and the startling possibility that the universe we see around us is a tiny sliver of a much larger multiverse. An edited and condensed version of the interview follows.

    QUANTA MAGAZINE: As a physicist, how is your perspective on the history of science different from that of a historian?

    STEVEN WEINBERG: One difference, of course, is that they know more than I do — at least, in their particular field of specialization. Real historians have a much better grasp of the original sources than I could possibly have. If they’re historians of the ancient world, they’ll be experts in Greek and Latin, which I’m not even remotely knowledgeable about.

    But there’s also a difference in attitude. Many historians are strongly opposed to the so-called “Whig interpretation” of history, in which you look at the past and try to pick out the threads that lead to the present. They feel it’s much more important to get into the frame of mind of the people who lived at the time you’re writing about. And they have a point. But I would argue that, when it comes to the history of science, a Whig interpretation is much more justifiable. The reason is that science, unlike, say, politics or religion, is a cumulative branch of knowledge. You can say, not merely as a matter of taste, but with sober judgment, that Newton knew more about the world than Aristotle did, and Einstein knew more than Newton did. There really has been progress. And to trace that progress, it makes sense to look at the science of the past and try to pick out modes of thought that either led to progress, or impeded progress.

    Why did you focus on the history of physics and astronomy?

    Well, that’s what I know about; that’s where I have some competence. But there’s another reason: It’s in physics and astronomy that science first became “modern.” Actually, it’s physics as applied to astronomy. Newton gave us the modern approach to physics in the late 17th century. Other branches of science became modern only more recently: chemistry in the early 19th century; biology in the mid-19th century, or perhaps the early 20th century. So if you want to understand the discovery of modern science — which is the subtitle of my book — that discovery was made in the context of physics, especially as applied to astronomy.

    Theoretical physics is often seen as a quest for unification — we think of Newton, unifying terrestrial and celestial physics, or James Clerk Maxwell, unifying electricity, magnetism, and light. And of course your own work. Where does this quest for unification stand today?

    It hasn’t advanced very much, except for the fact that the theories we speculated about in the 1960s have been confirmed by observation. In the theory I developed in 1967 — Abdus Salam developed essentially the same theory, independently, in 1968 — a symmetry-breaking field played a fundamental role, manifest in a particle called the Higgs boson, whose properties we predicted, except for its mass. Now, thanks to experiments performed at CERN, the Higgs has been verified.

    CERN CMS Higgs Event


    CERN ATLAS Higgs Event

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    So we’re on much more solid ground. But we haven’t gone any further. There have been enormous efforts to take further steps, especially in the context of string theory. String theory would unify all of the forces — the strong and weak nuclear forces, and the electromagnetic force, together with gravity. String theory has provided some deep mathematical ideas about how that might work. But we’re far from being able to verify the theory — much further than we were from verifying the electroweak theory 40 years ago.

    The Large Hadron Collider (LHC) is scheduled to start up again this year [2015], with twice the power it had during its initial run. What do you hope it’ll find — I’m not sure if “hope” is the right word — when it’s turned on?

    “The Standard Model is so complex that it would be hard to put it on a T-shirt.”

    Hope is exactly the right word! It depends on what new particles might have masses in the range that the LHC can probe. There are certainly things to look for. The most obvious thing is the dark-matter particle. We know from astronomy that five-sixths of the matter in the universe is something that doesn’t fit in the Standard Model of particle physics. But we have no idea what its mass is. Astronomers can tell us the total mass of this dark matter, but not the mass carried by each particle. If it’s a conventional dark-matter particle, known as a WIMP — “weakly interacting massive particle” — then the LHC might find it. It depends on how heavy it is, and on how it decays, because you never see the particle itself, you only see the products of its decay.
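
    Weinberg's point that astronomers can measure the total dark-matter mass but not the mass per particle can be made concrete: for a fixed mass density, the number density of particles scales inversely with the particle mass. A minimal sketch, using the commonly quoted local dark-matter density of roughly 0.3 GeV/cm^3 (an assumed round number, not from the article):

    ```python
    # Number density of dark-matter particles implied by a fixed mass density.
    # The local density ~0.3 GeV/cm^3 is an assumed round value for illustration.
    LOCAL_DM_DENSITY_GEV_PER_CM3 = 0.3

    def number_density(particle_mass_gev):
        """Particles per cm^3 if each carries the given mass in GeV."""
        return LOCAL_DM_DENSITY_GEV_PER_CM3 / particle_mass_gev

    # The same measured density is compatible with wildly different particle masses:
    for mass in (1.0, 100.0, 1000.0):  # GeV: light candidate, WIMP-scale, heavy
        print(f"m = {mass:6.1f} GeV  ->  n = {number_density(mass):.4f} per cm^3")
    ```

    This is why astronomy alone cannot tell the LHC what mass to look for: any particle mass is consistent with the observed total, so long as the number density compensates.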

    The LHC might also find signs of supersymmetry, a theory positing that known particles each have a partner particle — but again, we don’t know what the mass of those partner particles would be. And here, there’s an even deeper uncertainty: We don’t know if supersymmetry has anything to do with the real world. There could also be heavier quarks, perhaps even heavier versions of the Higgs particle.

    It’s sometimes said that supersymmetry would be a kind of thumbs-up for string theory, which has been impossible to test in any direct way. If the LHC finds no evidence for supersymmetry, what happens to string theory?

    Standard model of Supersymmetry DESY

    Damned if I know! Unfortunately, string theory doesn’t make very specific predictions about physics at the energies that are accessible to us. The kind of energies of the structures that string theory deals with are so high, we’ll probably never be able to reproduce them in the lab. But those energies were common in the very early universe. So by making cosmological observations, we may get a handle on the physics of those incredibly high energies. For example, if the matter-energy density at the time of inflation was of the order of magnitude that is characteristic of string theory, then a great deal of gravitational radiation would have been produced at that time, and it would have left an imprint on the cosmic microwave background. Last year, scientists working with the BICEP2 telescope announced that they had found these gravitational waves; now it seems they were actually measuring interstellar dust. Further observations with the Planck satellite may be able to settle this question. I think that’s one of the most exciting things going on in all of physical science right now.

    BICEP 2

    Gravitational-wave background claim from BICEP2, which ultimately proved incorrect; the Planck team determined that the culprit was cosmic dust.

    For theorists, is the ultimate goal a set of equations we could put on a T-shirt?

    That’s the aim. The Standard Model is so complex that it would be hard to put it on a T-shirt — though not impossible; you’d just have to write kind of small. Now, it wouldn’t take gravity into account, so it wouldn’t be a “theory of everything.” But it would be a theory of all the other things we study in our physics laboratories. The Standard Model is sufficiently complicated, and has so many arbitrary features, that we know it’s not the final answer. The goal would be to have a much simpler theory with fewer arbitrary features — maybe even none at all — that would fit on a T-shirt. We’re not there yet.

    Some physicists suggest that we may have to settle for an array of different theories, perhaps representing different solutions to string theory’s equations. Maybe each solution represents a different universe — part of some larger “multiverse.”

    I am not a proponent of the idea that our Big Bang universe is just part of a larger multiverse. It has to be taken seriously as a possibility, though. And it does lead to interesting consequences. For example, it would explain why some constants of nature, particularly the dark energy, have values that seem to be very favorable to the appearance of life.

    Dark energy depiction. Image: Volker Springel/Max Planck Institute for Astrophysics.

    Suppose you have a multiverse in which constants like dark energy vary from one big bang to another. Then, if you ask why it takes the value it does in our Big Bang, you have to take into account that there’s a selection effect: It’s only in big bangs where the dark energy takes a value favorable to the appearance of life that there’s anybody around to ask the question.

    “You don’t have to verify every prediction to know that a theory is correct.”

    This is very closely analogous to a question that astronomers have discussed for thousands of years, concerning the Earth and the sun. Why is the sun the distance that it is from us? If it were closer, the Earth would be too hot to harbor life; if it were further away, the Earth would be too cold. Why is it at just the right distance? Most people, like Galen, the Roman physician, thought that it was due to the benevolence of the gods, that it was all arranged for our benefit. A much better answer — the answer we would give today — is that there are billions of planets in our galaxy, and billions of galaxies in the universe. And it’s not surprising that a few of them, out of all those billions, are positioned in a way that’s favorable for life.
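
    The selection-effect arithmetic behind this answer is simple: even if the chance that any one planet sits at a life-friendly distance is tiny, with billions of planets the chance that at least one does approaches certainty. A toy sketch with assumed, illustrative numbers (neither `p` nor `N` comes from the article):

    ```python
    import math

    # Toy selection-effect arithmetic: if each planet independently has a small
    # chance p of sitting at a life-friendly distance from its star, the chance
    # that at least one of N planets does is 1 - (1 - p)^N.
    p = 1e-6      # assumed per-planet probability (illustrative)
    N = 10**11    # roughly "billions of planets in our galaxy"

    # Work in log space so (1 - p)^N does not silently underflow mid-calculation.
    log_none = N * math.log1p(-p)          # log of P(no planet qualifies)
    p_at_least_one = 1.0 - math.exp(log_none)

    print(f"P(at least one life-friendly planet) = {p_at_least_one:.6f}")
    ```

    With these numbers the expected count of life-friendly planets is N*p = 100,000, so finding ourselves on one of them requires no benevolent arrangement at all.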

    But at least we can see some of those other planets. That’s not the case with the universes that are said to make up the multiverse.

    It’s not part of the requirement of a successful physical theory that everything it describes be observable, or that all possible predictions of the theory be verifiable. For example, we have a very successful theory of the strong nuclear forces, called quantum chromodynamics [QCD], which is based on the idea that quarks are bound together by forces that increase with distance, so that we will never, even in principle, be able to observe a quark in isolation.
    All we can observe are other successful predictions of QCD. We can’t actually detect quarks, but it doesn’t matter; we know QCD is correct, because it makes predictions that we can verify.

    Similarly, string theory, which predicts a multiverse, can’t be verified by detecting the other parts of the multiverse. But it might make other predictions that can be verified. For example, it may say that in all of the big bangs within the multiverse, certain things will always be true, and those things may be verifiable. It may say that certain symmetries will always be observed, or that they’ll always be broken according to a certain pattern that we can observe. If it made enough predictions like that, then we would say that string theory is correct. And if the theory predicted a multiverse, then we’d say that that’s correct too. You don’t have to verify every prediction to know that a theory is correct.

    When we talk about the multiverse, it seems as though physics is brushing up against philosophy. A number of physicists, including Stephen Hawking and Lawrence Krauss, have angered philosophers by describing philosophy as useless. In your new book, it sounds as if you agree with them. Is that right?

    I think academic philosophy is helpful only in a negative sense — that is, sometimes physicists get impressed with philosophical ideas, so that it can be helpful to hear from experts that those ideas have been challenged within the philosophical community. One example is positivism, which decrees that you should only talk about things that are directly detectable or observable. I think philosophers themselves have challenged that, and it’s good to know that.

    On the other hand, a kind of philosophical discussion does go on among physicists themselves. For example, the discussion we were having earlier about the multiverse raised the issue of what we expect from a scientific theory — when do we reject it as being outside of science; when do we accept it as being confirmed. Those are meta-scientific questions; they’re philosophical questions. The scientists never seem to reach an agreement about those things — like in the case of the multiverse — but then, neither do the professional philosophers.

    And sometimes, as with the example of positivism, the work of professional philosophers actually stands in the way of progress. That’s also the case with the approach known as constructivism — the idea that every society’s scientific theories are a social construct, like its political institutions, and have to be understood as coming out of a particular cultural milieu. I don’t know whether you’d call it a philosophical theory or a historical theory, but at any rate, I think that view is wrong, and I also think it could impede the work of science, because it takes away one of science’s great motivations, which is to discover something that, in an absolute sense, divorced from any cultural milieu, is actually true.

    You’re 81. Many people would be thinking about retirement, but you’re very active. What are you working on now?

    There’s something I’ve been working on for more than a year — maybe it’s just an old man’s obsession, but I’m trying to find an approach to quantum mechanics that makes more sense than existing approaches. I’ve just finished editing the second edition of my book, Lectures on Quantum Mechanics, in which I think I strengthen the argument that none of the existing interpretations of quantum mechanics are entirely satisfactory.

    I don’t intend to retire, because I enjoy doing what I’m doing. I enjoy teaching; I enjoy following research; and I enjoy doing a little research on my own. The year before last, before I got onto this quantum mechanics kick, I was writing papers about down-to-earth problems in elementary particle theory; I was also working on cosmology. I hope I go back to that.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 12:30 pm on December 31, 2017
    Tags: Complications in Physics - "Is Nature Unnatural?", Dr. Steven Weinberg, Nima Arkani-Hamed of the Institute for Advanced Study, The universe might not make sense

    From Quanta Magazine: Complications in Physics – “Is Nature Unnatural?” 2013 

    Quanta Magazine
    Quanta Magazine

    May 24, 2013 [Just brought forward in social media.]
    Natalie Wolchover

    Decades of confounding experiments have physicists considering a startling possibility: The universe might not make sense.

    Is the universe natural or do we live in an atypical bubble in a multiverse? Recent results at the Large Hadron Collider have forced many physicists to confront the latter possibility. Illustration by Giovanni Villadoro.

    On an overcast afternoon in late April, physics professors and students crowded into a wood-paneled lecture hall at Columbia University for a talk by Nima Arkani-Hamed, a high-profile theorist visiting from the Institute for Advanced Study in nearby Princeton, N.J.

    Nima Arkani-Hamed, Institute for Advanced Study Princeton, N.J., USA
    With his dark, shoulder-length hair shoved behind his ears, Arkani-Hamed laid out the dual, seemingly contradictory implications of recent experimental results at the Large Hadron Collider in Europe.

    “The universe is impossible,” said Nima Arkani-Hamed, 41, of the Institute for Advanced Study, during a recent talk at Columbia University. Natalie Wolchover/Quanta Magazine

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    “The universe is inevitable,” he declared. “The universe is impossible.”

    The spectacular discovery of the Higgs boson in July 2012 confirmed a nearly 50-year-old theory of how elementary particles acquire mass, which enables them to form big structures such as galaxies and humans.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    “The fact that it was seen more or less where we expected to find it is a triumph for experiment, it’s a triumph for theory, and it’s an indication that physics works,” Arkani-Hamed told the crowd.

    However, in order for the Higgs boson to make sense with the mass (or equivalent energy) it was determined to have, the LHC needed to find a swarm of other particles, too. None turned up.

    With the discovery of only one particle, the LHC experiments deepened a profound problem in physics that had been brewing for decades. Modern equations seem to capture reality with breathtaking accuracy, correctly predicting the values of many constants of nature and the existence of particles like the Higgs. Yet a few constants — including the mass of the Higgs boson — are exponentially different from what these trusted laws indicate they should be, in ways that would rule out any chance of life, unless the universe is shaped by inexplicable fine-tunings and cancellations.

    In peril is the notion of “naturalness,” Albert Einstein’s dream that the laws of nature are sublimely beautiful, inevitable and self-contained. Without it, physicists face the harsh prospect that those laws are just an arbitrary, messy outcome of random fluctuations in the fabric of space and time.

    The LHC will resume smashing protons in 2015 in a last-ditch search for answers. But in papers, talks and interviews, Arkani-Hamed and many other top physicists are already confronting the possibility that the universe might be unnatural. (There is wide disagreement, however, about what it would take to prove it.)

    “Ten or 20 years ago, I was a firm believer in naturalness,” said Nathan Seiberg, a theoretical physicist at the Institute, where Einstein taught from 1933 until his death in 1955. “Now I’m not so sure. My hope is there’s still something we haven’t thought about, some other mechanism that would explain all these things. But I don’t see what it could be.”

    Physicists reason that if the universe is unnatural, with extremely unlikely fundamental constants that make life possible, then an enormous number of universes must exist for our improbable case to have been realized. Otherwise, why should we be so lucky? Unnaturalness would give a huge lift to the multiverse hypothesis, which holds that our universe is one bubble in an infinite and inaccessible foam. According to a popular but polarizing framework called string theory, the number of possible types of universes that can bubble up in a multiverse is around 10^500. In a few of them, chance cancellations would produce the strange constants we observe.

    In such a picture, not everything about this universe is inevitable, rendering it unpredictable. Edward Witten, a string theorist at the Institute, said by email, “I would be happy personally if the multiverse interpretation is not correct, in part because it potentially limits our ability to understand the laws of physics. But none of us were consulted when the universe was created.”

    “Some people hate it,” said Raphael Bousso, a physicist at the University of California at Berkeley who helped develop the multiverse scenario. “But I just don’t think we can analyze it on an emotional basis. It’s a logical possibility that is increasingly favored in the absence of naturalness at the LHC.”

    What the LHC does or doesn’t discover in its next run is likely to lend support to one of two possibilities: Either we live in an overcomplicated but stand-alone universe, or we inhabit an atypical bubble in a multiverse.

    Multiverse. Image credit: public domain, retrieved from https://pixabay.com/

    “We will be a lot smarter five or 10 years from today because of the LHC,” Seiberg said. “So that’s exciting. This is within reach.”

    Cosmic Coincidence

    Einstein once wrote that for a scientist, “religious feeling takes the form of a rapturous amazement at the harmony of natural law” and that “this feeling is the guiding principle of his life and work.” Indeed, throughout the 20th century, the deep-seated belief that the laws of nature are harmonious — a belief in “naturalness” — has proven a reliable guide for discovering truth.

    “Naturalness has a track record,” Arkani-Hamed said in an interview. In practice, it is the requirement that the physical constants (particle masses and other fixed properties of the universe) emerge directly from the laws of physics, rather than resulting from improbable cancellations. Time and again, whenever a constant appeared fine-tuned, as if its initial value had been magically dialed to offset other effects, physicists suspected they were missing something. They would seek and inevitably find some particle or feature that materially dialed the constant, obviating a fine-tuned cancellation.

    This time, the self-healing powers of the universe seem to be failing. The Higgs boson has a mass of 126 giga-electron-volts, but interactions with the other known particles should add about 10,000,000,000,000,000,000 giga-electron-volts to its mass. This implies that the Higgs’ “bare mass,” or starting value before other particles affect it, just so happens to be the negative of that astronomical number, resulting in a near-perfect cancellation that leaves just a hint of Higgs behind: 126 giga-electron-volts.

    Physicists have gone through three generations of particle accelerators searching for new particles, posited by a theory called supersymmetry, that would drive the Higgs mass down exactly as much as the known particles drive it up. But so far they’ve come up empty-handed.

    The upgraded LHC will explore ever-higher energy scales in its next run, but even if new particles are found, they will almost definitely be too heavy to influence the Higgs mass in quite the right way. The Higgs will still seem at least 10 or 100 times too light. Physicists disagree about whether this is acceptable in a natural, stand-alone universe. “Fine-tuned a little — maybe it just happens,” said Lisa Randall, a professor at Harvard University. But in Arkani-Hamed’s opinion, being “a little bit tuned is like being a little bit pregnant. It just doesn’t exist.”

    If no new particles appear and the Higgs remains astronomically fine-tuned, then the multiverse hypothesis will stride into the limelight. “It doesn’t mean it’s right,” said Bousso, a longtime supporter of the multiverse picture, “but it does mean it’s the only game in town.”

    A few physicists — notably Joe Lykken of Fermi National Accelerator Laboratory in Batavia, Ill., and Alessandro Strumia of the University of Pisa in Italy — see a third option. They say that physicists might be misgauging the effects of other particles on the Higgs mass and that when calculated differently, its mass appears natural. This “modified naturalness” falters when additional particles, such as the unknown constituents of dark matter, are included in calculations — but the same unorthodox path could yield other ideas. “I don’t want to advocate, but just to discuss the consequences,” Strumia said during a talk earlier this month at Brookhaven National Laboratory.


    Brookhaven Forum 2013: David Curtin, left, a postdoctoral researcher at Stony Brook University, and Alessandro Strumia, a physicist at the National Institute for Nuclear Physics in Italy, discussing Strumia’s “modified naturalness” idea, which questions longstanding assumptions about how to calculate the natural value of the Higgs boson mass. Thomas Lin/Quanta Magazine.

    However, modified naturalness cannot fix an even bigger naturalness problem that exists in physics: The fact that the cosmos wasn’t instantly annihilated by its own energy the moment after the Big Bang.

    Dark Dilemma

    The energy built into the vacuum of space (known as vacuum energy, dark energy or the cosmological constant) is a baffling trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion times smaller than what is calculated to be its natural, albeit self-destructive, value. No theory exists about what could naturally fix this gargantuan disparity. But it’s clear that the cosmological constant has to be enormously fine-tuned to prevent the universe from rapidly exploding or collapsing to a point. It has to be fine-tuned in order for life to have a chance.
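
    The spelled-out ratio above is ten factors of a trillion (10^12 each), i.e. the famous ~10^120 discrepancy between the calculated "natural" vacuum energy and the observed value. A quick check that the wording and the compact notation agree:

    ```python
    # Ten "trillion"s multiplied together, as spelled out in the text,
    # equal the ~10^120 cosmological-constant discrepancy.
    trillion = 10**12
    discrepancy = trillion**10

    print(discrepancy == 10**120)       # same number written compactly
    print(len(str(discrepancy)) - 1)    # number of zeros in the ratio
    ```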

    To explain this absurd bit of luck, the multiverse idea has been growing mainstream in cosmology circles over the past few decades. It got a credibility boost in 1987 when the Nobel Prize-winning physicist Steven Weinberg, now a professor at the University of Texas at Austin, calculated that the cosmological constant of our universe is expected in the multiverse scenario [Physical Review Letters].

    Steven Weinberg, University of Texas at Austin

    Of the possible universes capable of supporting life — the only ones that can be observed and contemplated in the first place — ours is among the least fine-tuned. “If the cosmological constant were much larger than the observed value, say by a factor of 10, then we would have no galaxies,” explained Alexander Vilenkin, a cosmologist and multiverse theorist at Tufts University. “It’s hard to imagine how life might exist in such a universe.”

    Most particle physicists hoped that a more testable explanation for the cosmological constant problem would be found. None has. Now, physicists say, the unnaturalness of the Higgs makes the unnaturalness of the cosmological constant more significant. Arkani-Hamed thinks the issues may even be related. “We don’t have an understanding of a basic extraordinary fact about our universe,” he said. “It is big and has big things in it.”

    The multiverse turned into slightly more than just a hand-waving argument in 2000, when Bousso and Joe Polchinski, a professor of theoretical physics at the University of California at Santa Barbara, found a mechanism that could give rise to a panorama of parallel universes. String theory, a hypothetical “theory of everything” that regards particles as invisibly small vibrating lines, posits that space-time is 10-dimensional. At the human scale, we experience just three dimensions of space and one of time, but string theorists argue that six extra dimensions are tightly knotted at every point in the fabric of our 4-D reality. Bousso and Polchinski calculated that there are around 10^500 different ways for those six dimensions to be knotted (all tying up varying amounts of energy), making an inconceivably vast and diverse array of universes possible. In other words, naturalness is not required. There isn’t a single, inevitable, perfect universe.
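
    This count of vacua is why the multiverse can absorb even the cosmological-constant fine-tuning: if the constant were spread roughly uniformly across the possibilities (a strong assumption, made here purely for illustration), a 1-in-10^120 tuning still leaves an astronomical number of universes as lucky as ours. A back-of-envelope sketch:

    ```python
    from fractions import Fraction

    # Back-of-envelope: with ~10^500 string vacua and a required fine-tuning of
    # ~1 part in 10^120, how many vacua land in the life-friendly window?
    # Assumes a uniform spread of the constant across vacua (illustrative only).
    num_vacua = 10**500
    tuning_fraction = Fraction(1, 10**120)  # chance a given vacuum is tuned enough

    expected_tuned = num_vacua * tuning_fraction
    print(expected_tuned == 10**380)        # still an astronomically large count
    ```

    The design choice of `Fraction` keeps the arithmetic exact; these integers are far beyond what floating point can represent.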

    “It was definitely an aha-moment for me,” Bousso said. But the paper sparked outrage.

    “Particle physicists, especially string theorists, had this dream of predicting uniquely all the constants of nature,” Bousso explained. “Everything would just come out of math and pi and twos. And we came in and said, ‘Look, it’s not going to happen, and there’s a reason it’s not going to happen. We’re thinking about this in totally the wrong way.’ ”

    Life in a Multiverse

    The Big Bang, in the Bousso-Polchinski multiverse scenario, is a fluctuation. A compact, six-dimensional knot that makes up one stitch in the fabric of reality suddenly shape-shifts, releasing energy that forms a bubble of space and time. The properties of this new universe are determined by chance: the amount of energy unleashed during the fluctuation. The vast majority of universes that burst into being in this way are thick with vacuum energy; they either expand or collapse so quickly that life cannot arise in them. But some atypical universes, in which an improbable cancellation yields a tiny value for the cosmological constant, are much like ours.

    In a paper posted last month to the physics preprint website arXiv.org, Bousso and a Berkeley colleague, Lawrence Hall, argue that the Higgs mass makes sense in the multiverse scenario, too. They found that bubble universes that contain enough visible matter (compared to dark matter) to support life most often have supersymmetric particles beyond the energy range of the LHC, and a fine-tuned Higgs boson. Similarly, other physicists showed in 1997 that if the Higgs boson were five times heavier than it is, this would suppress the formation of atoms other than hydrogen, resulting, by yet another means, in a lifeless universe.

    Despite these seemingly successful explanations, many physicists worry that there is little to be gained by adopting the multiverse worldview. Parallel universes cannot be tested for; worse, an unnatural universe resists understanding. “Without naturalness, we will lose the motivation to look for new physics,” said Kfir Blum, a physicist at the Institute for Advanced Study. “We know it’s there, but there is no robust argument for why we should find it.” That sentiment is echoed again and again: “I would prefer the universe to be natural,” Randall said.

    But theories can grow on physicists. After spending more than a decade acclimating himself to the multiverse, Arkani-Hamed now finds it plausible — and a viable route to understanding the ways of our world. “The wonderful point, as far as I’m concerned, is basically any result at the LHC will steer us with different degrees of force down one of these divergent paths,” he said. “This kind of choice is a very, very big deal.”

    Naturalness could pull through. Or it could be a false hope in a strange but comfortable pocket of the multiverse.

    As Arkani-Hamed told the audience at Columbia, “stay tuned.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 10:29 am on November 20, 2017 Permalink | Reply
    Tags: "A Model of Leptons", Dr. Steven Weinberg

    From CERN: “50 years since iconic ‘A Model of Leptons’ published” 

    CERN

    20 Nov 2017
    Harriet Kim Jarlett

    This event shows the real tracks produced in the 1200 litre Gargamelle bubble chamber that provided the first confirmation of a neutral current interaction. (Image: CERN)

    Gargamelle

    Steven Weinberg

    Fifty years ago today, Steven Weinberg published the iconic paper A Model of Leptons [Physical Review Letters], which exemplifies the profound link between mathematics and nature.

    https://www.manhattanrarebooks.com/pages/books/222/steven-weinberg/a-model-of-leptons

    This paper lies at the core of the Standard Model, our most complete theory of how particles interact in our universe.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Just two and a half pages long, Weinberg’s elegant and simply written theory was revolutionary at the time, yet it was virtually ignored for many years. Today it is cited at least three times a week.

    The paper uses the idea of symmetry between particles called pions – the principle that the physics looks the same when certain particles are interchanged – to build Weinberg’s theory of the fundamental forces.

    From 1965 Weinberg had been building a mathematical structure and theorems based on this symmetry that explained why physicists had observed certain interactions between pions and nucleons and how pions behave when they are scattered from one another. This paved the way for a whole theory of hadronic physics at low energy.

    ____________________________________________________________________________
    “It’s what keeps you going as a theoretical physicist to hope that one of your squiggles will turn out to describe reality.”
    Steven Weinberg, Nobel prize winner and author of A Model of Leptons
    ____________________________________________________________________________

    Physicists had been using the concept of symmetry since the 1930s, but had not yet been able to unite the electromagnetic and weak forces. Uniting the two forces would bring physicists closer to a single theory describing how and why all the fundamental interactions in our universe occur. The mathematics required the particles carrying these two forces to be massless, but Weinberg and other physicists knew that if such particles really generated the short-range weak force in nature, they had to be very heavy.

    One day, as the 34-year-old Weinberg was driving his red Camaro to work, he had a flash of insight: he had been looking for massless particles in the wrong place. He applied his theory to a rarely mentioned and often disregarded particle, the massive W boson, and paired it with the massless photon. Theorists accounted for the mass of the W by introducing an unseen mass-generating mechanism. This later became known as the Higgs mechanism, which calls for the existence of a Higgs boson.

    Proving the validity of Weinberg’s theory inspired one of the biggest experimental science programmes ever seen, and CERN has built major projects with these discoveries at their heart: the Gargamelle bubble chamber found the first evidence of the weak neutral current in 1973; the Super Proton Synchrotron provided, in 1983, the first evidence of the W boson; and most recently the Large Hadron Collider, in 2012, confirmed the existence of the Higgs boson.


    Steven Weinberg visiting the ATLAS collaboration in 2009. (Image: Maximilien Brice/CERN)

    Speaking to the CERN Courier, Weinberg, now 84, describes what it’s like to see his work confirmed: “It’s what keeps you going as a theoretical physicist to hope that one of your squiggles will turn out to describe reality.” He received the Nobel Prize for this iconic, game-changing theory in 1979.

    Half a century after this publication, it’s hard to find a theory that explains fundamental physics as clearly as Weinberg’s, which brought together all the different pieces of the puzzle and assembled them into one, very simple idea.

    Read more about the original theory, and an interview with Steven Weinberg in this month’s CERN Courier.

    See the full article here.


    Meet CERN in a variety of places:

    Quantum Diaries

    CERN Courier

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS

    ALICE

    CMS

    LHCb

     
  • richardmitnick 5:03 pm on October 13, 2017 Permalink | Reply
    Tags: Dr. Steven Weinberg

    From CERN Courier: “Birth of a symmetry” 


    CERN Courier

    Oct 13, 2017
    Frank Close

    Model of Leptons

    Half a century ago, Steven Weinberg spent the summer at Cape Cod, working on a new theory of the strong interaction of pions.

    Steven Weinberg

    By October 1967, the idea had morphed into a theory of the weak and electromagnetic interactions, and the following month he published a paper that would revolutionise our understanding of the fundamental forces.

    Weinberg’s paper “A Model of Leptons”, published in Physical Review Letters (PRL) on 20 November 1967, determined the direction of high-energy particle physics through the final decades of the 20th century. Just two and a half pages long, it is one of the most highly cited papers in the history of theoretical physics. Its contents are the core of the Standard Model of particle physics, now almost half a century old and still passing every experimental test.

    Most particle physicists today have grown up with the Standard Model’s orderly account of the fundamental particles and interactions, but things were very different in the 1960s.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Quantum electrodynamics (QED) had been well established as the description of the electromagnetic interaction, but there were no mature theories of the strong and weak nuclear forces. By the 1960s, experimental discoveries showed that the weak force exhibits some common features with QED, in particular that it might be mediated by a vector boson analogous to the photon. Theoretical arguments also suggested that QED’s underlying “U(1)” group structure could be generalised to the larger group SU(2), but there was a serious problem with such a scheme: the W boson suspected to mediate the weak force would have to be very massive empirically, whereas the mathematical symmetry of the theory required it to be massless like the photon.

    The importance of symmetries in understanding the fundamental forces was already becoming clear at the time, in particular how nature might hide its symmetries. Could “hidden symmetry” lead to a massive W boson while preserving the mathematical consistency of the theory? It was arguably Weinberg’s developments, in 1967, that brought this concept to life.

    Strong inspiration

    Weinberg’s inspiration was an earlier idea of [Yoichiro] Nambu in which fermions – such as the proton or neutron – can behave like a left- or right-handed screw as they move. If mass is ignored, these two “chiral” states act independently and the theory leads to the existence of a particle with properties similar to those of the pion – specifically a pseudoscalar, which means that it has no spin and its wavefunction changes sign under mirror symmetry. Nambu’s original investigations, however, had not examined how the three versions of the pion, with positive, negative or zero charge, shared their common “pion-ness” when interacting with one another. This commonality, or symmetry, is mathematically expressed by the group SU(2), which had been known in nuclear physics since the 1930s and in mathematics for much longer.

    It was this symmetry that Weinberg used as his point of departure in building a theory of the strong force, where nucleons interact with pions of all charges and the proton and neutron themselves form two “faces” of the underlying SU(2) structure. Empirical observations of the interactions between pions and nucleons showed that the underlying symmetry of SU(2) tended to act on the left- or right-handed chiral possibilities independently. The mathematical structure of the resulting equations to describe this behaviour, as Weinberg discovered, is called SU(2)×SU(2).

    Original manuscript

    However, in nature this symmetry is not perfect because nucleons have mass. Had they been massless, they would have travelled at the speed of light, the left- and right-handed possibilities acting truly independently of one another and the symmetry left intact. That nucleons have a mass, so that the left and right states get mixed up when perceived by observers in different inertial frames, breaks the chiral symmetry. Nambu had investigated this effect as far back as 1959, but without the added richness of the SU(2)×SU(2) mathematical structure that Weinberg brought to the problem. Weinberg had been investigating this more sophisticated theory in around 1965, initially with considerable success. He derived theorems that explained the observed interactions of pions and nucleons at low energies, such as in nuclear physics. He was able to predict how pions behaved when they scattered from one another and, with a few well-defined assumptions, paved the way for a whole theory of hadronic physics at low energies.

    Meanwhile, in 1964, Brout and Englert, Higgs, Kibble, Guralnik and Hagen had demonstrated that the vector bosons of a Yang–Mills theory (one that is like QED but where attributes such as electric charge can be exchanged by the vector bosons themselves) put forward a decade earlier could become massive without spoiling the fundamental gauge symmetry. This “mass-generating mechanism” suggested that a complete Yang–Mills theory of the strong interaction might be possible. In addition to the well-known pion, examples of massive vector particles that feel the strong force had already been found, notably the rho-meson. Like the pion, this too occurs in three charged varieties: positive, negative and zero. Superficially these rho-mesons had the hallmarks of being the gauge bosons of the strong interactions, but they also have mass. Was the strong interaction the theatre for applying the mass-generating mechanism?

    Despite at first seeming so promising, the idea failed to fit the data. For some phenomena the SU(2)×SU(2) symmetry is empirically broken, but for others, where spin didn’t matter, it works perfectly. When these patterns were incorporated into the maths, the rho-meson stubbornly remained massless, contrary to reality.

    Epiphany on the road

    In the middle of September 1967, while driving his red Camaro to work at MIT, Weinberg realised that he had been applying the right ideas to the wrong problem. Instead of the strong interactions, for which the SU(2)×SU(2) idea refused to work, the massless photon and the hypothetical massive W boson of the electromagnetic and weak interactions fitted perfectly with this picture. To call this possibility “hypothetical” hardly does justice to the time: the W boson was not discovered until 1983, and in 1967 was so disregarded as to receive at best a passing mention, if any, in textbooks.

    Weinberg needed a concrete model to illustrate his general idea. The numerous strongly interacting hadrons that had been discovered in the 1950s and 1960s were, for him, a quagmire, so he restricted his attention to the electron and neutrino. Here too it is worth recalling the state of knowledge at the time. The constituent quark model with three flavours – up, down and strange – had been formulated in 1964, but was widely disregarded. The experiments at SLAC that would help establish these constituents were a year away from announcing their results, and Bjorken’s ideas of a quark model, articulated at conferences that summer, were not yet widely accepted either. Finally, with only three flavours of quark, Weinberg’s ideas would lead to empirically unwanted “strangeness-changing neutral currents”. All these problems would eventually be solved, but in 1967 Weinberg made a wise choice to focus on leptons and leave quarks well alone.

    Proving validity

    Following the discovery of parity violation in the 1950s, it was clear that the electron can spin like a left- or right-handed screw, whereas the massless neutrino is only left-handed. The left–right symmetry, which had been a feature of the strong interaction, was gone. Instead of two SU(2), the mathematics now only needed one, the second being replaced by the unitary group U(1). So Weinberg set up the equations of SU(2)×U(1) – the same structure that, unknown to him, had been proposed by Sheldon Glashow in 1961 and by Abdus Salam and John Ward in 1964 in attempts to marry the electromagnetic and weak interactions. His theory, like theirs, required two massive electrically charged bosons – the W+ and W– carriers of the weak force – and two neutral bosons: the massless photon and a massive Z0. If correct, it would show that the electromagnetic and weak forces are unified, taking physics a step closer to the goal of a single theory of all fundamental interactions.
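
    For readers who want the unification made explicit, here is the standard textbook form of the neutral-sector relations in SU(2)×U(1) (a summary of well-established results, not taken from Weinberg’s paper itself): the photon A and the Z boson emerge as orthogonal mixtures of the neutral SU(2) boson W³ and the U(1) boson B, parametrised by the weak mixing angle θ_W.

    ```latex
    % Neutral-boson mixing in SU(2) x U(1) electroweak theory
    \begin{align}
      A_\mu &= B_\mu \cos\theta_W + W^3_\mu \sin\theta_W   &&\text{(massless photon)} \\
      Z_\mu &= -B_\mu \sin\theta_W + W^3_\mu \cos\theta_W  &&\text{(massive } Z^0\text{)} \\
      m_W   &= m_Z \cos\theta_W, \qquad e = g \sin\theta_W &&\text{(after symmetry breaking)}
    \end{align}
    ```

    Experimentally, sin²θ_W ≈ 0.23, which is why the Z is somewhat heavier than the W while the photon stays exactly massless.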

    “The history of attempts to unify weak and electromagnetic interactions is very long, and will not be reviewed here.” So began the first footnote in Steven Weinberg’s seminal November 1967 paper, which led to him being awarded the 1979 Nobel Prize in Physics with Salam and Glashow. Weinberg’s footnote mentioned Fermi’s primitive idea for unification in 1934, and also the model that Glashow proposed in 1961.

    Clarity of thought

    Weinberg started his paper by articulating the challenge of unifying the electroweak forces as both an opportunity and a threat. He focused on the leptons – those fermions, such as the electron and neutrino, which do not feel the strong force. “Leptons interact only with photons, and with the [weak] bosons that presumably mediate weak interactions. What could be more natural than to unite these spin-one bosons [the photon and the weak bosons] into a multiplet,” he pondered. That was the opportunity. The threat was that “standing in the way of this synthesis are the obvious differences in the masses of the photon and [weak] boson.”

    Weinberg then suggests a solution: perhaps “the symmetries relating the weak and electromagnetic interactions are exact [at a fundamental level] but are [hidden in practice]”. He then draws attention to the ideas of Higgs, Brout, Englert, Guralnik, Hagen and Kibble, and uses these to give masses to the W and Z in his model. In a further important insight, Weinberg shows how this symmetry-breaking mechanism leaves the photon massless.

    His opening paragraph ended with the prescient observation that: “The model may be renormalisable.” The argument upon which this remark is based appears at the very end of the paper, although with somewhat less confidence than the promise hinted at in the opening. He begins the final paragraph with a question: “Is this model renormalisable?” The extent of his intuition is revealed in his argument: although the presence of a massive vector boson hitherto had been a scourge, the theory with which he had begun had no such mass and, as such, was “probably renormalisable”. So, he pondered: “The question is whether this renormalisability is lost [by the spontaneous breaking of the symmetry].” And the conclusion: “If this model is renormalisable, what happens when we extend it…to the hadrons?”

    By speculating that his model may be renormalisable, Weinberg was hugely prescient, as ’t Hooft and Veltman would prove four years later. And perhaps it was a chance encounter at the Solvay Congress in Belgium two weeks before his paper was submitted that helped convince Weinberg that he was on the right track.

    Solvay secrets

    By the end of September 1967, Weinberg had his ideas in place as he set off to Belgium to attend the 14th Solvay Congress on Fundamental Problems in Elementary Particle Physics, held in Brussels from 2 to 7 October. He did not speak about his forthcoming paper, but did make some remarks after other talks, in particular following a presentation by Hans-Peter Dürr about a theorem of Jeffrey Goldstone and spontaneous symmetry breaking. During a general discussion session following Dürr’s talk, Weinberg mused: “This raises a question I can’t answer: are such models renormalisable?” He continued with a similar argument to that which later appeared in his paper, ending with: “I hope someone will be able to find out whether or not [this] is a renormalisable theory of weak and electromagnetic interactions.”

    There was remarkably little reaction to Weinberg’s remarks, and he himself has recalled “a general lack of interest”. The only recorded statement came from François Englert, who insisted that the theory is renormalisable; then, remarkably, there is no further discussion. Englert and Robert Brout, then relatively junior scientists, had both attended the same Brussels meeting.

    Nobel prize

    At some point during the Solvay conference, Weinberg presented a hand-written draft of his paper to Dürr, and 40 years later I obtained a copy by a roundabout route. Weinberg himself had not seen it in all that time, and thought that all record of his Nobel-winning manuscript had been lost. The original manuscript is notable for there being no sign of second thoughts, or editing, which suggests that it was a provisional final draft of an idea that had been worked through in the preceding days. The only hint of modification after the first draft had been written is a memo squeezed in at the end of a reference to Higgs, to include references to Brout and Englert, and to Guralnik, Hagen and Kibble, for the idea of spontaneous symmetry breaking, on which the paper was based. Weinberg’s intuition about the renormalisability of the model is already present in this manuscript, and is identical to what appears in his PRL paper. There is no mention of Glashow’s SU(2)×U(1) model in the draft, but this is included in the version that was published in PRL the following month. This is the only substantial difference. This manuscript was submitted to the editors of PRL on Weinberg’s return to the US, and received by them on 17 October. It appeared in print on 20 November.

    Lasting impact

    Weinberg’s genius was to assemble together the various pieces of a jigsaw and display the whole picture. The basic idea of mass generation was due to the assorted theorists mentioned above, in the summer of 1964. However, a crucial feature of Weinberg’s model was the trick of being able to give masses to the W and Z while leaving the photon massless. This extension of the mass-generating mechanism was due to Tom Kibble, in 1967, which Weinberg recognises and credits.

    As was the case with his comments in Brussels the previous month, Weinberg’s paper appeared in November 1967 to a deafening silence. “Rarely has so great an accomplishment been so widely ignored,” wrote Sidney Coleman in Science in 1979. Today, Weinberg’s paper has been cited more than 10,000 times. Having been cited but twice in the four years from 1967 to 1971, suddenly it became so important that researchers have cited it three times every week throughout half a century. There is no parallel for this in the history of particle physics. The reason is that in 1971 an event took place that has defined the direction of the field ever since: Gerard ’t Hooft made his debut, and he and Martinus Veltman demonstrated the renormalisability of spontaneously broken Yang–Mills theories. A decade later the W and Z bosons were discovered by experiments at CERN’s Super Proton Synchrotron.

    CERN Super Proton Synchrotron

    A further 30 years were to pass before the discovery of the Higgs boson at the Large Hadron Collider completed the electroweak menu.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    And in the meantime, completing the Standard Model, quantum chromodynamics was established as the theory of the strong interactions, based on the group SU(3).

    This episode in particle physics is not only one of the seminal breakthroughs in our understanding of the physical world, but touches on the profound link between mathematics and nature. On one hand it shows how it is easier to be Beethoven or Shakespeare than to be Steven Weinberg: change a few notes in a symphony or a phrase in a play, and you can still have a wonderful work of art; change a few symbols in Weinberg’s equations and the edifice falls apart – for if nature does not read your creation, however beautiful it might be, its use for science is diminished. Like all great theorists, Weinberg revealed a new aspect of reality by writing symbols on a sheet of paper and manipulating them according to the logic of mathematics. It took decades of technological progress to enable the discoveries of W and Higgs bosons and other entities that were already “known” to mathematics 50 years ago.

    See the full article here.


     
  • richardmitnick 12:35 pm on June 17, 2017 Permalink | Reply
    Tags: Dr. Steven Weinberg, Helen Quinn and Roberto Peccei, Peccei-Quinn symmetry

    From Quanta: “Roberto Peccei and Helen Quinn, Driving Around Stanford in a Clunky Jeep” 

    Quanta Magazine
    Quanta Magazine

    June 15, 2017
    Thomas Lin
    Olena Shmahalo, Art Director
    Lucy Reading-Ikkanda, graphics

    Ryan Schude for Quanta Magazine
    Helen Quinn and Roberto Peccei walking toward Stanford University’s new science and engineering quad. Behind them is the main quad, the oldest part of the campus. “If you look at a campus map,” said Quinn, who along with Peccei proposed Peccei-Quinn symmetry, “you will see the axis that goes through the middle of both quadrangle areas. We are on that line between the two.”

    Four decades ago, Helen Quinn and Roberto Peccei took on one of the great problems in theoretical particle physics: the strong charge-parity (CP) problem. Why does the symmetry between matter and antimatter break in weak interactions, which are responsible for nuclear decay, but not in strong interactions, which hold matter together?

    “The academic year 1976-77 was particularly exciting for me because Helen Quinn and Steven Weinberg were visiting the Stanford department of physics,” Peccei told Quanta in an email. “Helen and I had similar interests and we soon started working together.”

    Encouraged by Weinberg, who would go on to win a Nobel Prize in physics in 1979 for his work on the unification of electroweak interactions, Quinn and Peccei zeroed in on a CP-violating interaction whose strength can be characterized by an angular variable, theta. They knew theta had to be small, but no one had an elegant mechanism for explaining its smallness.

    “Steve liked to discuss physics over lunch, and Helen and I often joined him,” Peccei said. “Steve invariably brought up the theta problem in our lunch discussions, urging us to find a natural solution for why it was so small.”

    Quinn said by email that she and Peccei knew two things: The problem goes away if any quarks have zero mass (which seems to make theta irrelevant), and “in the very early hot universe all the quarks have zero mass.” They wondered how it could be that “theta is irrelevant in the early universe but matters once it cools enough that the quarks get their masses?”

    They proceeded to draft a “completely wrong paper based on conclusions we drew from this set of facts,” Quinn said. They went to Weinberg, whose comments helped clarify their thinking and, she said, “put us on the right track.”

    They realized they could naturally arrive at a zero value for theta by requiring a new symmetry, now known as the Peccei-Quinn mechanism. Besides being one of the popular proposed solutions to the strong CP problem, Peccei-Quinn symmetry also predicts the existence of a hypothetical “axion” particle, which has become a mainstay in theories of supersymmetry and cosmic inflation and has been proposed as a candidate for dark matter.

    Peccei and Quinn discussing their proposed symmetry with the aid of a sombrero. Ryan Schude for Quanta Magazine

    That year at Stanford, Quinn and Peccei regularly interacted with the theory group at the Stanford Linear Accelerator Center (SLAC) as well as with another group from the University of California, Santa Cruz.

    “We formed a large and active group of theorists, which created a wonderful atmosphere of open discussion and collaboration,” Quinn said, adding that she recalls “riding with Roberto back and forth from Stanford to SLAC in his yellow and clunky Jeep, talking physics ideas as we went.”

    See the full article here.



     
  • richardmitnick 6:17 pm on December 7, 2014 Permalink | Reply
    Tags: Dr. Steven Weinberg

    From Harvard: “The ever-smaller future of physics” 

    Harvard University

    December 5, 2014
    Alvin Powell

    If physicists want to find their long-sought “theory of everything,” they have to get small. And Nobel Prize-winning theoretical physicist Steven Weinberg thinks he knows roughly how small.

    Nobel winner Steven Weinberg brought his thoughts on a “theory of everything” to the Physics Department’s Lee Historical Lecture. Jon Chase/Harvard Staff Photographer

    Weinberg, who spoke at a packed Geological Lecture Hall Monday evening, said there are hints that the answers to fundamental questions will reveal themselves at scales between 10^-17 and 10^-19 of the radius of the typical atomic nucleus.

    “It is in that range that we expect to find really new physics,” said Weinberg, a onetime Harvard professor now on the faculty at the University of Texas at Austin.

    Physicists understand that there are four fundamental forces of nature. Two are familiar in our everyday lives: those of gravity and electromagnetism. The two less-familiar forces operate at the atomic level. The strong force holds the nucleus together while the weak force is responsible for the radioactive decay that changes one type of particle to another and the nuclear fusion that powers the sun.

    For decades, physicists have toiled to create a single theory that explains how all four of these forces work, but without success, instead settling on one theory that explains how gravity acts on a macro scale and another to describe the other three forces and their interactions at the atomic level.

    Weinberg, who won the 1979 Nobel Prize in Physics, with Sheldon Glashow and Abdus Salam, for electroweak theory explaining how the weak force and electromagnetism are related, returned to Harvard to deliver the Physics Department’s annual David M. Lee Historical Lecture. He was introduced by department chair Masahiro Morii and by Andrew Strominger, the Gwill E. York Professor of Physics, who recalled taking Weinberg’s class on general relativity as a Harvard undergrad.

    “I wish I could say I remembered you in Physics 210,” Weinberg said to laughs as he took the podium.

    The event also recognized the outstanding work of four graduate students — two in experimental physics, Dennis Huang and Siyuan Sun, and two in theoretical physics, Shu-Heng Shao and Bo Liu — with the Gertrude and Maurice Goldhaber Prize.

    Weinberg pointed to several hints of something significant going on at the far extremes of tininess. One hint is that the strong force, which weakens at shorter scales, and the weak and electromagnetic forces, which get stronger across shorter distances, appear to converge at that scale.

    Gravity is so weak that it isn’t felt at the atomic scale, overpowered by the other forces that operate there. However, Weinberg said, if you calculate how much mass two protons or two electrons would need for gravity to balance their repulsive electrical force, it would have to be not just enormous, but on a similar scale to the other measurements: the equivalent of 1.04 x 10^18 gigaelectronvolts.
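
    As a quick sanity check on that figure (my own back-of-the-envelope sketch, not part of the lecture), setting the gravitational attraction Gm²/r² equal to the Coulomb repulsion ke²/r² gives m = e·√(k/G), independent of the separation r:

    ```python
    import math

    # Physical constants (SI units)
    G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
    k = 8.988e9       # Coulomb constant, N m^2 C^-2
    e = 1.602e-19     # elementary charge, C
    c = 2.998e8       # speed of light, m/s
    GEV = 1.602e-10   # joules per GeV

    # G m^2 / r^2 = k e^2 / r^2  =>  m = e * sqrt(k / G)
    m = e * math.sqrt(k / G)      # mass in kg
    m_GeV = m * c**2 / GEV        # rest energy in GeV

    print(f"mass: {m:.3e} kg = {m_GeV:.3e} GeV")
    # prints: mass: 1.859e-09 kg = 1.043e+18 GeV
    ```

    The result reproduces the 1.04 x 10^18 GeV Weinberg quoted. Equivalently, it is the Planck mass times √α, where α ≈ 1/137 is the fine-structure constant, which is why it lands just below the Planck scale.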

    “There is a strong suggestion that gravity is somehow unified with those other forces at these scales,” Weinberg said.

    Weinberg also said there are experimental hints in the extremely small masses of neutrinos and in possible proton decay that the tiniest scales are significant in ways that are fundamental to physics.

    “This is a very crude estimate, but the mass of neutrinos which are being observed are in the same ballpark that you would expect from new physics associated with a fundamental length,” Weinberg said. “It all seems to hang together.”

    A major challenge for physicists is that the energy needed to probe what is actually going on at the smallest levels is far beyond current technology, something like 10 trillion times the highest energy we can harness now. And new technology to explore the problem experimentally is not on the horizon. Even with all the wealth in the world, scientists wouldn’t know where to begin, Weinberg said.

    But the experiment may have already been done, by nature, and there may be a way to look back at it, Weinberg said. During the inflationary period immediately after the Big Bang, there was that kind of energy, he said, and it would be evident as gravitational waves in the cosmic microwave background, an echo of the Big Bang that astronomers study for hints of the early universe. In fact, astronomers announced they had found such waves earlier this year, though they are waiting for confirmation of the results.

    Gravitational waves in the cosmic microwave background [image: ESA/Planck]

    “The big question that we face … is, can we find a truly fundamental theory uniting all the forces, including gravitation … characterized by tiny lengths like 10^-17 to 10^-19 nuclear radii?” Weinberg said. “Is it a string theory? That seems like the most beautiful candidate, but we don’t have any direct evidence that it is a string theory. The only handle we have … on this to do further experiments is in cosmology.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 6:43 pm on April 21, 2012 Permalink | Reply
    Tags: Dr. Steven Weinberg, The New York Review of Books

    From The New York Review of Books: Steven Weinberg “The Crisis of Big Science” 

    In The New York Review of Books of May 10, 2012, Dr. Weinberg writes, as beautifully as ever, about some of the past and future of basic scientific research in physics. The article is copyright protected, so out of respect for Dr. Weinberg I will not quote from it; I will simply suggest that you follow the link provided below and read it.

    Dr. Steven Weinberg

    Suffice it for me to say that in this article he is concerned with the future of the U.S. budget for basic research, specifically in physics and astronomy. He does spend some time describing where we have been before talking about where we are or are not going; I have read him before in the NYRB, and he never fails to set a proper context for his main thesis. But while Dr. Weinberg is one of the most eminent people in our scientific community, in his account of our history of support, and lack of support, for basic research he seems to make the defining point the 1993 cancellation by the U.S. Congress of the Superconducting Super Collider, which was to have been built in Texas. I have seen this in his previous articles, and I have seen him speak about it in videos of his lectures. On the one hand, he is not wrong. On the other hand, let it go. The failure to proceed with a program in the State of Texas, where he has been at the University of Texas, is in no way a defining moment in his incredible, Nobel-winning career.

    If Dr. Weinberg can be criticized for anything in his writing, it is his too-quick mentions of the various subatomic particles and forces that make up the Standard Model. While it might be reasonable for him to expect that his readers are already familiar with these terms, he is still writing in a journal of the popular press, however erudite the journal and its readership. He might keep some quick descriptions of quarks, leptons, bosons, and the rest in his word-processing files and drop them in for his less learned readers.


    The “Standard Model” with the hypothetical Higgs boson

    The U.S. Department of Energy funds seventeen national laboratories, among them Berkeley Lab, Brookhaven, Argonne, and Fermilab, and there is a lot of concern about the future of many projects at these labs. At Fermilab, the Long-Baseline Neutrino Experiment (LBNE) has been pushed back for a more economical redesign, and that is a biggie. Dr. Weinberg comments that things are not rosy in Europe either. Yet the European Southern Observatory, an incredible organization in astronomy, seems to be pushing ahead with its long-range telescope-building program, and in a previous post here we saw that Director Oddone of Fermilab had recently returned from meetings at which he was quite impressed with the planning he saw in both Europe and Asia.

    I highly recommend that you read Dr. Weinberg’s article; I always recommend reading Dr. Weinberg. The article can be found here.

     