From Scientific American: “Interstellar Conversations”

Scientific American

From Scientific American

Could there be information networks across the galaxy?

October 19, 2019
Caleb A. Scharf

Credit: C. Scharf 2019

Let’s start by clearing something up. Whatever the ins and outs of the search for extraterrestrial intelligence over the years (which I’ll label SETI), the bottom line is that we have not yet done enough to tell whether the cosmos is devoid of communicative species or crammed with them. Nowhere has this been articulated better than in the work by Jason Wright, Shubham Kanodia, and Emily Lubar of Penn State and their ‘Haystack equation’ [The Astronomical Journal]. This shows, unequivocally, that to date we’ve searched about as much as if we’d stared into a modest hot tub’s worth of water from all of Earth’s oceans.
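To get a feel for that comparison, here is a minimal sketch of the arithmetic behind the analogy, with assumed, illustrative volumes; the actual Haystack calculation works in an eight-dimensional search space, not in water.

```python
# Back-of-the-envelope for the "hot tub vs. ocean" analogy.
# Volumes are rough assumptions, not figures from the Haystack paper,
# which compares multidimensional search volumes rather than water.

ocean_volume_liters = 1.335e21   # ~1.335 billion km^3 of ocean, in liters
hot_tub_volume_liters = 2.0e3    # a modest ~2,000-liter hot tub

fraction_searched = hot_tub_volume_liters / ocean_volume_liters
print(f"Fraction searched: {fraction_searched:.1e}")  # ~1.5e-18
```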

Consequently, to say that ‘there’s clearly nothing out there’ is like looking in that hot tub, not finding a dolphin, and concluding that dolphins therefore do not exist anywhere on the planet.

Given that fact, I think it’s perfectly reasonable to examine how communications across interstellar space might play out, should they exist. This does, of course, require a whole bunch of prior assumptions.

We have to assume that really long-distance communications, whether by radio, laser, beams of neutrinos, massive engineering of weird stellar transit signals, or other barely imagined options are actually possible at all. We have to assume, or at least posit, that information might flow across interstellar space either as inadvertent side effects of a busy species (noisily broadcasting or carelessly pointing lasers, among other things) or as deliberate signals – seeking replies, establishing communications, or tracking a species’ own kind.

We would also have to assume that technologically inclined species can arise and survive for long enough to expend time and energy on any of these things. That’s part of the depressing, although potentially realistic, Anthropocene mindset. But equally, simply shrugging our shoulders and saying that it’s all hopeless shuts down a discussion that could be very important.

That importance could stem from the relevance of information itself. At all levels, information appears to be not just an integral part of the phenomenon of life on Earth [Chaos: An Interdisciplinary Journal of Nonlinear Science], but the flow of information may represent a critical piece of what makes something alive versus not alive (that flow and informational influence might even be a defining feature [Journal of the Royal Society Interface] of what life is).

One small facet of this is very evident in how social animals deploy the flow of information. Imagine, for example, that humans didn’t communicate with each other in any way. It’s next to impossible to imagine that, right? We’re communicating even when we’re not speaking or touching. If I merely watch you walk down the street I’m accumulating information, adding that to my internal stash, analyzing, and incorporating it into my model of the world.

There’s a much bigger discussion to be had there, but let’s come back to SETI. It seems that there is a built-in inevitability for life to cause and participate in information flow, and we should assume that extends across interstellar distances too. We ourselves have taken baby steps towards this – from our transmissions to our SETI efforts, to the fact that we maintain communications with our most distant robotic spacecraft, the Voyagers.

As we’ve seen when studying the ideas behind the so-called Fermi Paradox, in principle it’s pretty ‘easy’ for interstellar explorers to spread across the galaxy given a few million years. It therefore should be even easier for an information-bearing network to spread across the galaxy too. Signals can move at up to the speed of light, so the bottlenecks come from issues like the fading of signal strength with distance, the timescale of development of the infrastructure to receive and transmit, and the choices made on directionality (perhaps).
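The first of those bottlenecks, fading signal strength, is plain inverse-square geometry. A minimal sketch, assuming an isotropic one-megawatt beacon at an arbitrary 100 light-years (both numbers illustrative):

```python
import math

LY_M = 9.461e15  # meters per light-year

def received_flux(power_w: float, distance_ly: float) -> float:
    """Flux from an isotropic transmitter via the inverse-square law, in W/m^2."""
    r = distance_ly * LY_M
    return power_w / (4.0 * math.pi * r * r)

# An assumed 1-megawatt omnidirectional beacon, 100 light-years away:
print(f"{received_flux(1e6, 100.0):.1e} W/m^2")  # ~8.9e-32 W/m^2 -- vanishingly faint
```

Directional beaming or huge collecting areas can claw back many orders of magnitude, which is why the directionality choices mentioned above matter so much.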

The beautiful thing is that we can model hypotheses for this galactic information flow – even if we don’t know all the possible ifs, buts, and maybes. We can, in principle, test hypotheticals about the structure of information-bearing interstellar networks, which will also relate to the known physical distribution and dynamics of star systems and planets in our galaxy.
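Purely as a toy illustration of what such a model might look like, here is a minimal sketch that scatters hypothetical communicative systems through a slab of space and links any pair within an assumed detection range; every parameter is invented for illustration, not drawn from real stellar catalogs.

```python
import random

random.seed(42)

# Toy model (all parameters invented): scatter communicative systems in a
# 1000 x 1000 x 100 light-year slab and link any pair within signal range.
N_SYSTEMS, LINK_RANGE_LY = 200, 150.0
stars = [(random.uniform(0, 1000), random.uniform(0, 1000), random.uniform(0, 100))
         for _ in range(N_SYSTEMS)]

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

links = [(i, j) for i in range(N_SYSTEMS) for j in range(i + 1, N_SYSTEMS)
         if dist(stars[i], stars[j]) <= LINK_RANGE_LY]
print(f"{len(links)} potential links among {N_SYSTEMS} systems")
```

A real study would weight positions by the galaxy’s stellar density and fold in signal physics and timing, but even this toy version lets one ask when a connected, galaxy-spanning network would percolate into existence.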

Perhaps somewhere in there are clues about where we stand in relation to conversations that could be skittering by us all the time. Perhaps there are also clues about what those conversations would entail, and what the most valuable interstellar informational currencies really are.

See the full article here.


Please help promote STEM in your local schools.

STEM Education Coalition

Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

#i-think-its-perfectly-reasonable-to-examine-how-communications-across-interstellar-space-might-play-out-should-they-exist, #it-seems-that-there-is-a-built-in-inevitability-for-life-to-cause-and-participate-in-information-flow, #scientific-american, #seti, #the-bottom-line-is-that-we-have-not-yet-done-enough-to-tell-whether-the-cosmos-is-devoid-of-communicative-species-or-crammed-with-them, #we-have-to-assume-that-really-long-distance-communications-are-actually-possible-at-all, #we-would-also-have-to-assume-that-technologically-inclined-species-can-arise-and-survive-for-long-enough-to-expend-time-and-energy-on-any-of-these-things

From Scientific American: Women in STEM- “In Support of the Vera C. Rubin Observatory”

Scientific American

From Scientific American

August 23, 2019
Megan Donahue

The House of Representatives has taken the first step toward honoring a pioneering woman in astronomy.

LSST the Vera C. Rubin Observatory

LSST Camera, built at SLAC



LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research telescopes.


LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

On July 23, the U.S. House of Representatives approved H.R. 3196, the Vera C. Rubin Observatory Designation Act, which was introduced by Representative Eddie Bernice Johnson of Texas and Representative Jenniffer González-Colón of Puerto Rico (at large). If the Senate agrees, it will name the facility housing the Large Synoptic Survey Telescope the Vera C. Rubin Observatory in honor of Carnegie Institution for Science researcher Vera Cooper Rubin, who died in 2016.

Fritz Zwicky discovered dark matter while observing the motion of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did much of the work on dark matter.

Fritz Zwicky, from http://palomarskies.blogspot.com

Coma cluster via NASA/ESA Hubble

Astronomer Vera Rubin, who worked on dark matter, at the Lowell Observatory in 1965 (The Carnegie Institution for Science)


Vera Rubin measuring spectra; she worked on dark matter (Emilio Segrè Visual Archives/AIP/SPL)


Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

As a woman astronomer working in the field of cosmology and galaxy studies, Rubin has always been a personal hero of mine. I can’t think of a more appropriate tribute to her memory and her incredible contributions to science, astronomy and future astronomers than this honor.

The text of the bill itself celebrates the milestones of Rubin’s scientific career. As a student and young professor, she studied how galaxies cluster and move inside such clusters. In 1970 she and astronomer W. Kent Ford, Jr., published measurements of the line-of-sight velocities and locations of individual ionized clouds of gas inside the nearby Andromeda galaxy (M31), showing that they were moving too fast to be gravitationally bound to the galaxy if the only matter binding it was the matter we can see (in the form of stars).

We call these kinds of observations “rotation curves,” because inside spiral galaxies such as Andromeda or our own Milky Way, the orbits of stars and gas circle the center of the galaxy inside a volume of space shaped like a disk. A typical rotation curve plots the velocities of gas clouds or stars toward or away from us as a function of distance from the center of the disk. These curves can be fit to models of where the matter is inside those orbits to work out how much matter is inside the galaxy and where it sits.
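To make the logic of that fit concrete, here is a minimal sketch comparing the speed predicted by the visible mass alone with a flat observed curve; the mass and the ~220 km/s figure are assumed, illustrative values rather than fits to Andromeda data.

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19         # meters per kiloparsec
M_VISIBLE = 1.0e41     # assumed visible (stars + gas) mass, kg -- illustrative only

def keplerian_speed(r_kpc: float) -> float:
    """Circular speed if only the visible mass, treated as central, binds the orbit."""
    return math.sqrt(G * M_VISIBLE / (r_kpc * KPC))

# Observed spiral-galaxy curves stay roughly flat (~220 km/s assumed here),
# while the visible-mass prediction falls off as 1/sqrt(r); the widening gap
# at large radius is the evidence for unseen ("dark") matter.
for r in (5, 10, 20, 40):
    print(f"r = {r:2d} kpc: predicted {keplerian_speed(r) / 1e3:5.0f} km/s, "
          f"observed ~220 km/s")
```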

In Rubin and Ford’s paper, they did not make much of a fuss about the interpretation. By 1980, however, Rubin, Ford and the late Norbert Thonnard had presented long-slit spectroscopy of a sample of 21 galaxies. They derived rotation curves from these data, and in this, the most cited work of Rubin’s career, they boldly posited that gravity caused by something other than stars and gas must be binding the galaxies together. These observations provided some of the first direct evidence of the existence of dark matter inside of galaxies.

Later observations of clusters of galaxies and of the cosmic microwave background confirm that dark matter exists in even larger structures, and it appears to outweigh the stars and gas in the universe by a factor of about seven. Rubin investigated questions related to the nature of spiral galaxies and dark matter for most of her life. We still don’t know exactly what dark matter is made out of, but her discoveries transformed our thinking about the universe and its contents.

Although many of us astronomers thought Rubin should have won a Nobel Prize in Physics for her work in finding dark matter in galaxies, it’s not as if she went unrecognized during her life. She was a very highly regarded scientist, and she was recognized by her fellow researchers. In 1993, she was awarded the National Medal of Science, which is based on nomination by one’s peers, submitted to the National Science Foundation, and subsequent selection by 12 presidentially appointed scientists.

This award was set up by John F. Kennedy in 1962. In the category of physical sciences, it was first given to a woman—Margaret Burbidge—20 years later, after more than 60 men had received that prize. After another 10 years and more than 30 male prizewinners, Rubin won it. (If you’re wondering: yes, an additional 14 years passed and 27 more men won the prize in the physical sciences category before any other women did so.)

In 1996 Rubin was the second woman ever to receive the Gold Medal of the Royal Astronomical Society. The first woman so honored was Caroline Herschel, nearly 170 years prior. Like many women of her generation (or any generation), Rubin faced many barriers in her career simply because she was a woman. For example, as a scientific staff member of the Carnegie Institution in the 1960s, she had institutional access to the world-class Palomar Observatory in California. But she was denied access to the observatory, with the excuse that there were limited bathroom facilities.

Caltech Palomar Observatory, located in San Diego County, California, US, at 1,712 m (5,617 ft)

Nevertheless, she persisted, and in 1965 she was finally allowed to observe at Palomar. She was the first woman to be officially allowed to do so. (Burbidge had gained access under the name of her husband Geoffrey.) Rubin carried on as an advocate for the equal treatment of women in science and helped many other women in their careers as astronomers. The Large Synoptic Survey Telescope, funded primarily by the NSF and the Department of Energy, will carry on her legacy and her work to study the nature of dark energy and dark matter and map out the structure of the universe as traced by billions of galaxies.

We have come a long way from the days when women weren’t allowed in the same buildings as men. But we still have a long way to travel, because it is still too easy, even in science and with our desire to avoid bias, for a man to cast doubt on the worth of a woman’s work. We also apparently have much to learn about the nature of dark matter—which may be an entire dark sector of particle species, for all we know so far. Because of Rubin’s pioneering work, we are all further along these journeys than we would be without her. By hearing her name and her story, along with the wonderful discoveries we all anticipate from the Vera C. Rubin Observatory, little girls everywhere can learn they, too, can contribute to our understanding of the universe.

See the full article here.


Please help promote STEM in your local schools.

STEM Education Coalition

Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

#astronomy, #astrophysics, #basic-research, #cosmology, #lsst-will-be-the-vera-c-rubin-observatory, #scientific-american, #vera-rubin, #women-in-stem

From Scientific American: “Oregon Is About to Get a Lot More Hazardous”

Scientific American

From Scientific American

June 29, 2019
Dana Hunter

State leadership is failing its citizens—and there will be a body count.

Credit: Dale Simonson (CC BY-SA 2.0)

When you live in an area at as much geologic risk as Oregon, you would expect that government officials would maybe, possibly, take those risks seriously. But the people who currently govern Oregon seem quite determined to ignore hazards and let the state languish unprepared.

It’s bad enough that legislators voted this month to allow “new schools, hospitals, jails, and police and fire stations” to be built in areas that will most certainly be inundated in the event of a tsunami. Both parties think it’s a good idea now; I doubt they’ll still be feeling great about locating schools right in the path of rampaging seawater when the big one hits. But short-term economic gain outweighs long-term planning, so here we are. What else can we expect from a statehouse where lawmakers would rather flee the state than be forced to deal with climate change?

People say they’re willing to accept the risks. However, the state government is now planning to make it far harder for residents to even know what those risks are, because Oregon’s Department of Geology and Mineral Industries (DOGAMI for short) is severely underfunded and will now lose three critically needed experts on staff as a punishment for going over budget. As if that weren’t bad enough, the governor’s office is considering whether the agency should even continue to exist:

In a note on the preliminary budget proposal for the agency, the Joint Ways and Means Committee said the Governor’s office would be “evaluating if the Department should continue to exist as an independent [agency] or recommendations to abolish and move the individual programs to other entities.”

That drastic of a move could come with big consequences, Avy said.

“It would be incredibly disruptive to staff and it is likely that some on-going studies would be discontinued,” he said. “Oregon would lose a valued agency and may lose talented staff in our Geological Survey and Services Program, which provides a focus on geologic and mineral mapping and natural hazard identification.”

Can we be real for a minute, here? Oregon is a geologically young state in an active subduction zone, located on an ocean that has subduction zones on both sides, which generate ocean-spanning tsunamis on a regular basis. The local subduction zone, plus Basin and Range crustal stretching and faulting, also produces active volcanoes. Many, many volcanoes. Also, too, all of this folding and faulting and uplifting and volcanoing leaves the state terribly landslide prone. This is not a place where you can safely starve your local geological survey of funds, and then shut it down when it needs extra money to identify and quantify the hazards you face.

So if you live in Oregon, or even if you just visit, I’d strongly encourage you to write a polite but serious missive to Governor Kate Brown, letting her know that it would perhaps be a good idea to look further into the possible repercussions of signing that deplorable tsunami bill (I mean, at least take the schools out of the mix!), and also to fully fund DOGAMI rather than further crippling it and then stripping it for parts.

Let’s have a brief tour of the Oregon geohazards that DOGAMI helps protect us from, then, shall we?

Tsunamis

The Oregon coast is extremely susceptible to tsunamis, both generated from Cascadia and from other subduction zones along the Pacific Ocean. You can see evidence of them everywhere.

Cascadia subduction zone, the site of recurring megathrust earthquakes at average intervals of about 500 years, including the Cascadia earthquake of 1700.

One of the starkest reminders in recent times was the dock that was ripped from the shoreline in Misawa, Japan, in the brutal 2011 Tōhoku Earthquake. The tsunami that sheared it loose and set it afloat also washed ashore in California and Oregon, causing millions of dollars in damage; loss of life in the United States was only avoided due to ample warnings.
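Those warnings were possible because a tsunami crossing the deep ocean, while fast, takes hours to arrive: its speed follows the shallow-water wave relation c = √(g·h). A minimal sketch with assumed round numbers for depth and distance:

```python
import math

G_ACCEL = 9.81        # gravitational acceleration, m/s^2
DEPTH_M = 4000.0      # assumed mean Pacific depth along the path
DIST_KM = 8000.0      # rough Japan-to-Oregon distance, assumed

speed = math.sqrt(G_ACCEL * DEPTH_M)          # shallow-water wave speed, ~198 m/s
hours = DIST_KM * 1e3 / speed / 3600.0
print(f"~{speed:.0f} m/s (~{speed * 3.6:.0f} km/h); crossing in ~{hours:.0f} hours")
```

A locally generated Cascadia tsunami offers no such cushion, which is the point of the paragraphs that follow.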

Ocean energy distribution forecast map for the 2011 Tōhoku (Sendai) earthquake, from the U.S. NOAA. Note the location of Australia for scale.

Just over a year later, the dock washed up on Agate Beach, Oregon.

At Agate Beach, homes and businesses are built right in the path of the next Cascadia tsunami. I can’t describe to you the eerie sensation you feel turning away from that dock to see vulnerable structures that will be piles of flooded rubble after the next tsunami hits.

Residences and businesses on Agate Beach. Even a modest tsunami will cause untold damage to these structures. Credit: Dana Hunter

The people here will have minutes to find high ground after the shaking stops, if that long. There is some high ground nearby, but not much, and perhaps not near enough. Roads will probably be destroyed or blocked in the quake. This is the sort of location where the legislature has decided it would be fine to site schools.

Earthquakes

The stump of a drowned spruce at Sunset Bay, Shore Acres, OR. Lockwood DeWitt for scale. Credit: Dana Hunter

Sunset Bay is the site of one of Oregon’s many ghost forests. Here, a Cascadia earthquake dropped the shoreline about 1,200 years ago, suddenly drowning huge, healthy trees in salt water. At least seven spectacular earthquakes have hit the Oregon coast in the past 3,500 years. It may not sound like much, or often… but look to Japan for the reason why we should take the threat extremely seriously. And Oregon doesn’t just have to worry about Cascadia quakes: the state is full of faults, stretching from north to south and from coast to interior.

Volcanoes

Huge swathes of Oregon are volcanic. As in, recently volcanic. As in, will definitely erupt again quite soon.

Mount Hood, a sibling to Mount St. Helens, is right outside of Portland and last erupted in the mid-1800s. It is hazardous as heck.

Mount Hood reflected in Trillium Lake, Oregon, United States

But Hood is very, very far from the only young volcano in the state, and evidence of recent eruptions is everywhere. Belknap shield volcano and its associated volcanoes on McKenzie Pass ceased erupting only 1,500 years ago, and the forces that created it are still active today.

Belknap Crater, Oregon. Credit: Cascades Volcano Observatory

Another volcanic center like it could emerge in the near future. And you see here just a tiny swath of the destruction such a volcanic center causes.

You know what you really don’t want to be caught unawares by? A volcano. And even once they’ve stopped erupting, the buggers can be dangerous. Sector collapses, lahars, and other woes plague old volcanoes. You need people who can keep a sharp eye on them. And I’m sorry, but the USGS can’t be everywhere at once. Local volcano monitoring is important!

Landslides and debris flows

If you’re an Oregon resident, you’ll probably remember how bloody long it took to finish the Eddyville Bypass due to the massive landslide that got reactivated during construction. Steep terrain plus plenty of rain equals lots of rock and soil going where we’d prefer it didn’t.

Debris flows and landslides regularly take out Oregon roads, including this stretch on a drainage by Mount Hood.

Construction equipment copes with damage caused by massive debris flows coming down from Mount Hood. Credit: Dana Hunter

We know from the Oso mudslide just how deadly these mass movements can be. Having experts out there who understand how to map the geology of an area and identify problem areas is critically important, especially in places where a lot of people want to live, work, and play.

Contact the governor’s office and let her know that you don’t think it’s worth letting a budget shortfall torpedo the agency that should be doing the most to identify these hazards and help us mitigate them.

See the full article here.


Please help promote STEM in your local schools.

STEM Education Coalition

Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

#oregon-is-about-to-get-a-lot-more-hazardous, #cascadia-subduction-zone, #earthquakes, #landslides-and-debris-flows, #scientific-american, #tsunamis, #vulcanology

From Scientific American: “Which Should Come First in Physics: Theory or Experiment?”

Scientific American

From Scientific American

June 17, 2019
Grigoris Panoutsopoulos
Frank Zimmermann

Plans for giant particle accelerators of the future focus attention on how scientific discoveries are really made.

The discovery of the Higgs particle at the Large Hadron Collider (LHC) over half a decade ago marked a milestone in the long journey towards understanding the deeper structure of matter. Today, particle physics strives to push forward a diverse range of experimental approaches from which we may glean new answers to fundamental questions regarding the creation of the universe and the nature of the mysterious and elusive dark matter.

Such an endeavor requires a post-LHC particle collider with an energy capability significantly greater than that of previous colliders. This is how the idea for the Future Circular Collider (FCC) at CERN came to be—a machine that could put the exploration of new physics in high gear.

CERN FCC Future Circular Collider map

To understand the validity of this proposal, we should, however, start at the beginning and once more ask ourselves: “How does physics progress?”

Many believe that grand revolutions are driven exclusively by new theories, whereas experiments play the parts of movie extras. The played-out story goes a little something like this: theorists form conjectures, and experiments are used solely for the purposes of testing them. After all, most of us proclaim our admiration for Einstein’s relativity or for quantum mechanics, but seldom do we pause and consider whether these awe-inspiring theories could have been attained without the contributions of the Michelson-Morley, Stern-Gerlach or black-body–radiation experiments.

This simplistic picture, despite being far removed from the creative, and often surprising, ways in which physics has developed over time, remains quite widespread even among scientists. Its pernicious influence can be seen in the discussion of future facilities like the proposed FCC at CERN.

In the wake of the discovery of the Higgs boson in 2012, we finally have all of the pieces of the puzzle of the Standard Model (SM) of particle physics in place. Nevertheless, the unknowns regarding dark matter, neutrino masses and the observed imbalance between matter and antimatter are among numerous indications that the SM is not the ultimate theory of elementary particles and their interactions.

Quite a number of theories have been developed to overcome the problems surrounding the SM, but so far none has been experimentally verified. This fact has left the world of physics brimming with anticipation. In the end, science has shown time and again that it can find new, creative ways to surmount any obstacles placed along its path. And one such way is for experiment to assume the leading role, so that it can help get the stuck wagon of particle physics moving and out of the mire.

In this regard, the FCC study was launched by CERN in 2013 as a global effort to explore different scenarios for particle colliders that could inaugurate the post-LHC era and to advance key technologies. A staged approach, it entails the construction of an electron-positron collider followed by a proton collider, which would represent an eightfold energy leap compared to the LHC and thus grant us direct access to a previously unexplored regime. Both colliders would be housed in a new 100-kilometer-circumference tunnel. The FCC study complements previous design studies for linear colliders in Europe (CLIC) and Japan (ILC), while China also has similar plans for a large-scale circular collider (CEPC).
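That energy leap follows from simple synchrotron scaling: a proton ring’s beam energy is roughly proportional to bending field times bending radius. A rough sketch using commonly quoted design parameters, which are assumptions here (including the ~0.66 dipole fill factor):

```python
import math

def beam_energy_tev(circumference_km: float, field_tesla: float,
                    dipole_fill: float = 0.66) -> float:
    """Max proton beam energy: p[GeV] ~ 0.3 * B[T] * rho[m],
    with bending radius rho = fill * circumference / (2*pi)."""
    rho = dipole_fill * circumference_km * 1e3 / (2 * math.pi)
    return 0.3 * field_tesla * rho / 1e3

print(f"LHC:    ~{beam_energy_tev(26.7, 8.33):.1f} TeV per beam")   # ~7 TeV
print(f"FCC-hh: ~{beam_energy_tev(97.8, 16.0):.1f} TeV per beam")   # ~49 TeV
```

With these assumed inputs the ratio comes out near sevenfold per beam, consistent with the roughly eightfold leap the article cites (~100 TeV collisions versus the LHC’s 14 TeV).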

CERN/CLIC

ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan

China Circular Electron Positron Collider (CEPC) map

Future colliders could offer a deep understanding of the Higgs properties, but even more importantly, they represent an opportunity for exploring uncharted territory in an unprecedented energy scale. As Gian Giudice, head of CERN’s Theoretical Physics Department, argues: “High-energy colliders remain an indispensable and irreplaceable tool to continue our exploration of the inner workings of the universe.”

Nevertheless, the FCC is seen by some as a questionable scientific investment in the absence of clear theoretical guidance about where the elusive new physics may lie. The history of physics, however, offers evidence in support of a different view: that experiments often play a leading and exploratory role in the progress of science.

As the eminent historian of physics Peter Galison puts it, we have to “step down from the aristocratic view of physics that treats the discipline as if all interesting questions are structured by high theory.” Besides, quite a few experiments have been realized without being guided by a well-established theory but were instead undertaken for the purposes of exploring new domains. Let us examine some illuminating examples.

In the 16th century, King Frederick II of Denmark financed Uraniborg, an early research center, where Tycho Brahe constructed large astronomical instruments, like a huge mural quadrant (unfortunately, the telescope was invented a few years later) and carried out many detailed observations that had not previously been possible. The realization of an enormous experimental structure, at a hitherto unprecedented scale, transformed our view of the world. Tycho Brahe’s precise astronomical measurements enabled Johannes Kepler to develop his laws of planetary motion and to make a significant contribution to the scientific revolution.

The development of electromagnetism serves as another apt example: many electrical phenomena were discovered by physicists, such as Charles Dufay, André-Marie Ampère and Michael Faraday, in the 18th and 19th centuries, through experiments that had not been guided by any developed theory of electricity.

Moving closer to the present day, we see that the entire history of particle physics is indeed full of similar cases. In the aftermath of World War II, a constant and laborious experimental effort characterized the field of particle physics, and it was what allowed the Standard Model to emerge from a “zoo” of newly discovered particles. As a prominent example, quarks, the fundamental constituents of the proton and neutron, were discovered through a number of exploratory experiments during the late 1960s at the Stanford Linear Accelerator Center (SLAC).

The majority of practicing physicists recognize the exceptional importance of experiment as an exploratory process. For instance, Victor “Viki” Weisskopf, the former director-general of CERN and an icon of modern physics, grasped clearly the dynamics of the experimental process in the context of particle physics:

“There are three kinds of physicists, namely the machine builders, the experimental physicists, and the theoretical physicists. If we compare those three classes, we find that the machine builders are the most important ones, because if they were not there, we would not get into this small-scale region of space. If we compare this with the discovery of America, the machine builders correspond to captains and ship builders who truly developed the techniques at that time. The experimentalists were those fellows on the ships who sailed to the other side of the world and then jumped upon the new islands and wrote down what they saw. The theoretical physicists are those fellows who stayed behind in Madrid and told Columbus that he was going to land in India.” (Weisskopf 1977)

Despite being a theoretical physicist himself, he was able to recognize the exploratory character of experimentation in particle physics. Thus, his words eerily foreshadow the present era. As one of the most respected theoretical physicists of our time, Nima Arkani-Hamed, claimed in a recent interview, “when theorists are more confused, it’s the time for more, not less experiments.”

The FCC, at present, strives to keep alive the exploratory spirit of the previous fabled colliders. It is not intended to be used as a verification tool for a specific theory but as a means of paving multiple experimental paths for the future. The experimental process should be allowed to develop its own momentum. This does not mean that experimentation and instrumentation should not maintain a close relationship with the theoretical community; at the end of the day, there is but one physics, and it must ensure its unity.

See the full article here.


Please help promote STEM in your local schools.

STEM Education Coalition

Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

#applied-research-technology, #basic-research, #physics, #scientific-american

From Scientific American: “Cosmology Has Some Big Problems”

Scientific American

From Scientific American

April 30, 2019
Bjørn Ekeberg

The field relies on a conceptual framework that has trouble accounting for new observations.

Credit: Thanapol Sisrang/Getty Images

What do we really know about our universe?

Born out of a cosmic explosion 13.8 billion years ago, the universe rapidly inflated and then cooled; it is still expanding at an increasing rate and is mostly made up of unknown dark matter and dark energy … right?

This well-known story is usually taken as a self-evident scientific fact, despite the relative lack of empirical evidence—and despite a steady crop of discrepancies arising with observations of the distant universe.

In recent months, new measurements of the Hubble constant, the rate of universal expansion, have suggested major differences between two independent methods of calculation. Discrepancies in the expansion rate have huge implications not simply for calculation but for the validity of cosmology’s current standard model at the extreme scales of the cosmos.
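For concreteness, here is a minimal sketch of that discrepancy using 2019-era numbers, assumed here for illustration: the early-universe value from the Planck CMB fit versus the late-universe SH0ES distance-ladder value, naively combined in quadrature.

```python
import math

# 2019-era values, assumed here for illustration (km/s/Mpc):
# an early-universe fit to the CMB (Planck) vs. a late-universe
# distance-ladder measurement (SH0ES).
h0_early, sig_early = 67.4, 0.5
h0_late,  sig_late  = 74.0, 1.4

gap = h0_late - h0_early
tension = gap / math.hypot(sig_early, sig_late)  # naive quadrature combination
print(f"Gap: {gap:.1f} km/s/Mpc, tension ~{tension:.1f} sigma")  # ~4.4 sigma
```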

Another recent probe found galaxies inconsistent with the theory of dark matter, which posits this hypothetical substance to be everywhere. But according to the latest measurements, it is not, suggesting the theory needs to be reexamined.

It’s perhaps worth stopping to ask why astrophysicists hypothesize dark matter to be everywhere in the universe. The answer lies in a peculiar feature of cosmological physics that is not often remarked upon. A crucial function of theories such as dark matter, dark energy and inflation, each of which is tied in its own way to the big bang paradigm, is not to describe known empirical phenomena but rather to maintain the mathematical coherence of the framework itself while accounting for discrepant observations. Fundamentally, they are names for something that must exist insofar as the framework is assumed to be universally valid.

Each new discrepancy between observation and theory can of course in and of itself be considered an exciting promise of more research, a progressive refinement toward the truth. But when it adds up, it could also suggest a more confounding problem that is not resolved by tweaking parameters or adding new variables.

Consider the context of the problem and its history. As a mathematically driven science, cosmological physics is usually thought to be extremely precise. But the cosmos is unlike any scientific subject matter on earth. A theory of the entire universe, based on our own tiny neighborhood as the only known sample of it, requires a lot of simplifying assumptions. When these assumptions are multiplied and stretched across vast distances, the potential for error increases, and this is further compounded by our very limited means of testing.

Historically, Newton’s physical laws made up a theoretical framework that worked for our own solar system with remarkable precision. Both Uranus and Neptune, for example, were discovered through predictions based on Newton’s model. But as the scales grew larger, its validity proved limited. Einstein’s general relativity framework provided an extended and more precise reach far beyond our own galaxy. But just how far could it go?

The big bang paradigm that emerged in the mid-20th century effectively stretches the model’s validity to a kind of infinity, defined either as the boundary of the radius of the universe (calculated at 46 billion light-years) or in terms of the beginning of time. This giant stretch is based on a few concrete discoveries, such as Edwin Hubble’s observation that the universe appears to be expanding (in 1929) and the detection of the microwave background radiation (in 1964).

The 15-meter Holmdel horn antenna at Bell Telephone Laboratories in Holmdel, New Jersey, was built in 1959 for pioneering work in communication satellites for NASA’s Project ECHO. The antenna was 50 feet in length and the entire structure weighed about 18 tons. It was composed of aluminum with a steel base. It was used to detect radio waves that bounced off Project ECHO balloon satellites. The horn was later modified to work with the Telstar communication satellite frequencies as a receiver for broadcast signals from the satellite. In 1964, radio astronomers Robert Wilson and Arno Penzias discovered the cosmic microwave background radiation with it, for which they were awarded the 1978 Nobel Prize in Physics. In 1990 the horn was dedicated to the National Park Service as a National Historic Landmark.

But considering the scale involved, these limited observations have had an outsized influence on cosmological theory.

Edwin Hubble at the 100-inch Hooker Telescope at Mount Wilson in Southern California, where in 1929 he discovered that the universe is expanding.

NASA Cosmic Background Explorer (COBE), 1989 to 1993.

Cosmic Microwave Background NASA/WMAP

NASA/WMAP 2001 to 2010

CMB per ESA/Planck

ESA/Planck 2009 to 2013

It is of course entirely plausible that the validity of general relativity breaks down much closer to our own home than at the edge of the hypothetical end of the universe. And if that were the case, today’s multilayered theoretical edifice of the big bang paradigm would turn out to be a confusing mix of fictional beasts invented to uphold the model along with empirically valid variables, mutually reliant on each other to the point of making it impossible to sort science from fiction.

Compounding this problem, most observations of the universe occur experimentally and indirectly. Today’s space telescopes provide no direct view of anything—they produce measurements through an interplay of theoretical predictions and pliable parameters, in which the model is involved every step of the way. The framework literally frames the problem; it determines where and how to observe. And so, despite the advanced technologies and methods involved, the profound limitations to the endeavor also increase the risk of being led astray by the kind of assumptions that cannot be calculated.

After spending many years researching the foundations of cosmological physics from a philosophy of science perspective, I have not been surprised to hear some scientists openly talking about a crisis in cosmology. In the big “inflation debate” in Scientific American a few years ago, a key piece of the big bang paradigm was criticized by one of the theory’s original proponents for having become indefensible as a scientific theory.

Why? Because inflation theory relies on ad hoc contrivances to accommodate almost any data, and because its proposed physical field is not based on anything with empirical justification. This is probably because a crucial function of inflation is to bridge the transition from an unknowable big bang to a physics we can recognize today. So, is it science or a convenient invention?

Inflation

Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation. Credit: HPHS Owls

Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe), 2010. Credit: Alex Mittelmann, Coldcreation

Alan Guth’s notes.

A few astrophysicists, such as Michael J. Disney, have criticized the big bang paradigm for its lack of demonstrated certainties. In his analysis, the theoretical framework has far fewer certain observations than free parameters to tweak them—a so-called “negative significance” that would be an alarming sign for any science. As Disney writes in American Scientist: “A skeptic is entitled to feel that a negative significance, after so much time, effort and trimming, is nothing more than one would expect of a folktale constantly re-edited to fit inconvenient new observations.”

As I discuss in my new book, Metaphysical Experiments, there is a deeper history behind the current problems. The big bang hypothesis itself originally emerged as an indirect consequence of general relativity undergoing remodeling. Einstein had made a fundamental assumption about the universe, that it was static in both space and time, and to make his equations add up, he added a “cosmological constant,” for which he freely admitted there was no physical justification.

But when Hubble observed that the universe was expanding and Einstein’s solution no longer seemed to make sense, some mathematical physicists tried to change a fundamental assumption of the model: that the universe was the same in all spatial directions but variant in time. Not insignificantly, this theory came with a very promising upside: a possible merger between cosmology and nuclear physics. Could the brave new model of the atom also explain our universe?

From the outset, the theory only spoke to the immediate aftermath of an explicitly hypothetical event, whose principal function was as a limit condition, the point at which the theory breaks down. Big bang theory says nothing about the big bang; it is rather a possible hypothetical premise for resolving general relativity.

On top of this undemonstrable but very productive hypothesis, floor upon floor has been added intact, with vastly extended scales and new discrepancies. To explain observations of galaxies inconsistent with general relativity, the existence of dark matter was posited as an unknown and invisible form of matter calculated to make up more than a quarter of all mass-energy content in the universe—assuming, of course, the framework is universally valid.

Fritz Zwicky discovered dark matter while observing the motion of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did much of the work on dark matter.

Fritz Zwicky, from http://palomarskies.blogspot.com

Astronomer Vera Rubin, who worked on dark matter, at the Lowell Observatory in 1965 (The Carnegie Institution for Science)

Coma cluster via NASA/ESA Hubble

In 1998, when a set of supernova measurements of accelerating galaxies seemed at odds with the framework, a new theory emerged of a mysterious force called dark energy, calculated to fill circa 70 percent of the mass-energy of the universe.

[The Supernova Cosmology Project is one of two research teams that determined the likelihood of an accelerating universe and therefore a positive cosmological constant, using data from the redshift of Type Ia supernovae. The project was headed by Saul Perlmutter at Lawrence Berkeley National Laboratory, with members from Australia, Chile, France, Portugal, Spain, Sweden, the United Kingdom, and the United States.

This discovery was named “Breakthrough of the Year for 1998” by Science magazine and, along with the High-z Supernova Search Team, the project team won the 2007 Gruber Prize in Cosmology and the 2015 Breakthrough Prize in Fundamental Physics. In 2011, Perlmutter was awarded the Nobel Prize in Physics for this work, alongside Adam Riess and Brian P. Schmidt of the High-z team.]

The crux of today’s cosmological paradigm is that in order to maintain a mathematically unified theory valid for the entire universe, we must accept that 95 percent of our cosmos is furnished by completely unknown elements and forces for which we have no empirical evidence whatsoever. For a scientist to be confident of this picture requires an exceptional faith in the power of mathematical unification.

In the end, the conundrum for cosmology is its reliance on the framework as a necessary presupposition for conducting research. For lack of a clear alternative, as astrophysicist Disney also notes, it is in a sense stuck with the paradigm. It seems more pragmatic to add new theoretical floors than to rethink the fundamentals.

Contrary to the scientific ideal of getting progressively closer to the truth, it looks rather like cosmology, to borrow a term from technology studies, has become path-dependent: overdetermined by the implications of its past inventions.

See the full article here.


Please help promote STEM in your local schools.

STEM Education Coalition

Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

#cosmology-has-some-big-problems, #new-measurements-of-the-hubble-constant-the-rate-of-universal-expansion-suggested-major-differences-between-two-independent-methods-of-calculation, #scientific-american

From Scientific American: “Cosmologist Lee Smolin says that at certain key points, the scientific worldview is based on fallacious reasoning”

Scientific American

From Scientific American

April 17, 2019
Jim Daley

Lee Smolin, author of six books about the philosophical issues raised by contemporary physics, says every time he writes a new one, the experience completely changes the direction his own research is taking. In his latest book, Einstein’s Unfinished Revolution: The Search for What Lies Beyond the Quantum, Smolin, a cosmologist and quantum theorist at the Perimeter Institute for Theoretical Physics in Ontario, tackles what he sees as the limitations in quantum theory.

Credit: Perimeter Institute

“I want to say the scientific worldview is based on fallacious reasoning at certain key points,” Smolin says. In Einstein’s Unfinished Revolution, he argues one of those key points was the assumption that quantum physics is a complete theory. This incompleteness, Smolin argues, is the reason quantum physics has not been able to solve certain questions about the universe.

“Most of what we do [in science] is take the laws that have been discovered by experiments to apply to parts of the universe, and just assume that they can be scaled up to apply to the whole universe,” Smolin says. “I’m going to be suggesting that’s wrong.”

Join Smolin at the Perimeter Institute as he discusses his book and takes the audience on a journey through the basics of quantum physics and the experiments and scientists who have changed our understanding of the universe. The discussion, “Einstein’s Unfinished Revolution,” is part of Perimeter’s public lecture series and will take place on Wednesday, April 17, at 7 P.M. Eastern time. Online viewers can participate in the discussion by tweeting to @Perimeter using the #piLIVE hashtag.

See the full article here.


Please help promote STEM in your local schools.

STEM Education Coalition

Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

#albert-einsteins-theory-of-general-relativity, #basic-research, #einsteins-unfinished-revolution-the-search-for-what-lies-beyond-the-quantum, #lee-smolin, #perimeter-institute-of-theoretical-physics, #quantum-physics, #scientific-american

From Scientific American: “Found: A Quadrillion Ways for String Theory to Make Our Universe”

Scientific American

From Scientific American

Mar 29, 2019
Anil Ananthaswamy

Stemming from the “F-theory” branch of string theory, each solution replicates key features of the standard model of particle physics.

Photo: dianaarturovna/Getty Images

Physicists who have been roaming the “landscape” of string theory — the space of zillions and zillions of mathematical solutions of the theory, where each solution provides the kinds of equations physicists need to describe reality — have stumbled upon a subset of such equations that have the same set of matter particles as exists in our universe.

String theory depiction: cross section of the quintic Calabi–Yau manifold. Credit: Jbourjai (using Mathematica output)

Standard Model of Supersymmetry via DESY

But this is no small subset: there are at least a quadrillion such solutions, making it the largest such set ever found in string theory.

According to string theory, all particles and fundamental forces arise from the vibrational states of tiny strings. For mathematical consistency, these strings vibrate in 10-dimensional spacetime. And for consistency with our familiar everyday experience of the universe, with three spatial dimensions and the dimension of time, the additional six dimensions are “compactified” so as to be undetectable.

Different compactifications lead to different solutions. In string theory, a “solution” implies a vacuum of spacetime that is governed by Einstein’s theory of gravity coupled to a quantum field theory. Each solution describes a unique universe, with its own set of particles, fundamental forces and other such defining properties.

Some string theorists have focused their efforts on trying to find ways to connect string theory to properties of our known, observable universe — particularly the standard model of particle physics, which describes all known particles and all their mutual forces except gravity.

Much of this effort has involved a version of string theory in which the strings interact weakly. However, in the past two decades, a new branch of string theory called F-theory has allowed physicists to work with strongly interacting, or strongly coupled, strings.

____________________________________________________
F-theory is a branch of string theory developed by Cumrun Vafa. The new vacua described by F-theory were discovered by Vafa and allowed string theorists to construct new realistic vacua — in the form of F-theory compactified on elliptically fibered Calabi–Yau four-folds. The letter “F” supposedly stands for “Father”.

F-theory is formally a 12-dimensional theory, but the only way to obtain an acceptable background is to compactify this theory on a two-torus. By doing so, one obtains type IIB superstring theory in 10 dimensions. The SL(2,Z) S-duality symmetry of the resulting type IIB string theory is manifest because it arises as the group of large diffeomorphisms of the two-dimensional torus.

More generally, one can compactify F-theory on an elliptically fibered manifold (elliptic fibration), i.e. a fiber bundle whose fiber is a two-dimensional torus (also called an elliptic curve). For example, a subclass of the K3 manifolds is elliptically fibered, and F-theory on a K3 manifold is dual to heterotic string theory on a two-torus. Also, the moduli spaces of those theories should be isomorphic.

The large number of semirealistic solutions to string theory referred to as the string theory landscape, with about 10^272,000 elements, is dominated by F-theory compactifications on Calabi–Yau four-folds. There are about 10^15 of those solutions consistent with the Standard Model of particle physics.

-Wikipedia

____________________________________________________
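For readers who want the notation behind “elliptically fibered,” such fibrations are commonly presented in Weierstrass form; the following is a standard sketch, with conventions assumed from the F-theory literature rather than taken from this article:

```latex
% Weierstrass model of an elliptically fibered Calabi-Yau:
% f and g are sections over the base B; the torus fiber degenerates
% (and 7-branes sit) where the discriminant vanishes.
\begin{align}
  y^2 &= x^3 + f\,x + g, \\
  \Delta &= 4f^3 + 27g^2, \qquad \text{7-branes located at } \Delta = 0 .
\end{align}
```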

“An intriguing, surprising result is that when the coupling is large, we can start describing the theory very geometrically,” says Mirjam Cvetic of the University of Pennsylvania in Philadelphia.

This means that string theorists can use algebraic geometry — which uses algebraic techniques to tackle geometric problems — to analyze the various ways of compactifying extra dimensions in F-theory and to find solutions. Mathematicians have been independently studying some of the geometric forms that appear in F-theory. “They provide us physicists a vast toolkit,” says Ling Lin, also of the University of Pennsylvania. “The geometry is really the key… it is the ‘language’ that makes F-theory such a powerful framework.”

Now, Cvetic, Lin, James Halverson of Northeastern University in Boston, and their colleagues have used such techniques to identify a class of solutions with string vibrational modes that lead to a similar spectrum of fermions (that is, particles of matter) as is described by the standard model — including the property that all fermions come in three generations (for example, the electron, muon and tau are the three generations of one type of fermion).

Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

The F-theory solutions found by Cvetic and colleagues have particles that also exhibit the handedness, or chirality, of the standard model particles. In particle physics lingo, the solutions reproduce the exact “chiral spectrum” of standard model particles. For example, the quarks and leptons in these solutions come in left- and right-handed versions, as they do in our universe.

The new work shows that there are at least a quadrillion solutions in which particles have the same chiral spectrum as the standard model, which is 10 orders of magnitude more solutions than had been found within string theory until now. “This is by far the largest domain of standard model solutions,” Cvetic says. “It’s somehow surprising and actually also rewarding that it turns out to be in the strongly coupled string theory regime, where geometry helped us.”

A quadrillion — while it’s much, much smaller than the size of the landscape of solutions in F-theory (which at last count was shown to be of the order of 10^272,000) — is a tremendously large number. “And because it’s a tremendously large number, and it gets something nontrivial in real world particle physics correct, we should take it seriously and study it further,” Halverson says.

Further study would involve uncovering stronger connections with the particle physics of the real world. The researchers still have to work out the couplings or interactions between particles in the F-theory solutions — which again depend on the geometric details of the compactifications of the extra dimensions.

It could be that within the space of the quadrillion solutions, there are some with couplings that could cause the proton to decay within observable timescales. This would clearly be at odds with the real world, as experiments have yet to see any sign of protons decaying. Alternatively, physicists could search for solutions that realize the spectrum of standard model particles that preserve a mathematical symmetry called R-parity. “This symmetry forbids certain proton decay processes and would be very attractive from a particle physics point of view, but is missing in our current models,” Lin says.

Also, the work assumes supersymmetry, which means that all the standard model particles have partner particles. String theory needs this symmetry in order to ensure the mathematical consistency of solutions.

But in order for any supersymmetric theory to tally with the observable universe, the symmetry has to be broken (much like how a diner’s choice of cutlery and drinking glass on her left or right side will “break” the symmetry of the place settings at a round dinner table). Otherwise, the partner particles would have the same mass as the standard model particles — and that is clearly not the case, since we don’t observe any such partner particles in our experiments.

Crucially, experiments at the Large Hadron Collider (LHC) have also shown that if supersymmetry is the correct description of nature, its breaking scale must lie beyond the energies probed so far, given that the LHC has yet to find any supersymmetric particles.

String theorists think that supersymmetry might be broken only at extremely high energies that are not within experimental reach anytime soon. “The expectation in string theory is that high-scale [supersymmetry] breaking, which is fully consistent with LHC data, is completely possible,” Halverson says. “It requires further analysis to determine whether or not it happens in our case.”

Despite these caveats, other string theorists are approving of the new work. “This is definitely a step forward in demonstrating that string theory gives rise to many solutions with features of the standard model,” says string theorist Washington Taylor of MIT.

“It’s very nice work,” says Cumrun Vafa, one of the developers of F-theory, at Harvard University. “The fact you can arrange the geometry and topology to fit with not only Einstein’s equations, but also with the [particle] spectrum that we want, is not trivial. It works out nicely here.”

But Vafa and Taylor both caution that these solutions are far from matching perfectly with the standard model. Getting solutions to match exactly with the particle physics of our world is one of the ultimate goals of string theory. Vafa is among those who think that, despite the immensity of the landscape of solutions, there exists a unique solution that matches our universe. “I bet there is exactly one,” he says. But, “to pinpoint this is not going to be easy.”

See the full article here.


Please help promote STEM in your local schools.

STEM Education Coalition

Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

#basic-research, #in-string-theory-a-solution-implies-a-vacuum-of-spacetime-that-is-governed-by-einsteins-theory-of-gravity-coupled-to-a-quantum-field-theory, #in-the-past-two-decades-a-new-branch-of-string-theory-called-f-theory-has-allowed-physicists-to-work-with-strongly-interacting-or-strongly-coupled-strings, #particle-physics, #scientific-american, #string-theorists-can-use-algebraic-geometry-to-analyze-the-various-ways-of-compactifying-extra-dimensions-in-f-theory-and-to-find-solutions, #string-theory, #supersymmetry-susy