Tagged: Dark Energy Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 3:23 pm on March 24, 2022 Permalink | Reply
    Tags: "What Can We Learn About the Universe from Just One Galaxy?", CAMELS: Cosmology and Astrophysics with MachinE Learning Simulations, Dark Energy, Omega matter: a cosmological parameter that describes how much dark matter is in the universe

    From The New Yorker: “What Can We Learn About the Universe from Just One Galaxy?” 



    From The New Yorker

    March 23, 2022
    Rivka Galchen

    Illustration by Nicholas Konrad / The New Yorker

    In new research, begun by an undergraduate, William Blake’s phrase “to see a world in a grain of sand” is suddenly relevant to astrophysics.

    Imagine if you could look at a snowflake at the South Pole and determine the size and the climate of all of Antarctica. Or study a randomly selected tree in the Amazon rain forest and, from that one tree—be it rare or common, narrow or wide, young or old—deduce characteristics of the forest as a whole. Or, what if, by looking at one galaxy among the hundred billion or so in the observable universe, one could say something substantial about the universe as a whole? A recent paper, whose lead authors include a cosmologist, a galaxy-formation expert, and an undergraduate named Jupiter (who did the initial work), suggests that this may be the case. The result at first seemed “crazy” to the paper’s authors. Now, having discussed their work with other astrophysicists and done various “sanity checks,” trying to find errors in their methods, the results are beginning to seem pretty clear. Francisco Villaescusa-Navarro, one of the lead authors of the work, said, “It does look like galaxies somehow retain a memory of the entire universe.”

    The research began as a sort of homework exercise. Jupiter Ding, while a freshman at Princeton University, wrote to the department of astrophysics, hoping to get involved in research. He mentioned that he had some experience with machine learning, a form of artificial intelligence that is adept at picking out patterns in very large data sets. Villaescusa-Navarro, an astrophysicist focused on cosmology, had an idea for what the student might work on. Villaescusa-Navarro had long wanted to look into whether machine learning could be used to help find relationships between galaxies and the universe. “I was thinking, What if you could look at only a thousand galaxies and from that learn properties about the entire universe? I wondered, What is the smallest number we could look at? What if you looked at only one hundred? I thought, O.K., we’ll start with one galaxy.”

    He had no expectation that one galaxy would provide much. But he thought that it would be a good way for Ding to practice using machine learning on a database known as CAMELS (Cosmology and Astrophysics with MachinE Learning Simulations). Shy Genel, an astrophysicist focussed on galaxy formation, who is another lead author on the paper, explained CAMELS this way: “We start with a description of reality shortly after the Big Bang. At that point, the universe is mostly hydrogen gas, and some helium and dark matter. And then, using what we know of the laws of physics, our best guess, we then run the cosmic history for roughly fourteen billion years.” Cosmological simulations have been around for about forty years, but they are increasingly sophisticated—and fast. CAMELS contains some four thousand simulated universes. Working with simulated universes, as opposed to our own, lets researchers ask questions that the gaps in our observational data preclude us from answering. They also let researchers play with different parameters, like the proportions of dark matter and hydrogen gas, to test their impact.

    Ding did the work on CAMELS from his dorm room, on his laptop. He wrote programs to work with the CAMELS data, then sent them to one of the university’s computing clusters, a collection of computers with far more power than his MacBook Air. That computing cluster contained the CAMELS data. Ding’s model trained itself by taking a set of simulated universes and looking at the galaxies within them. Once trained, the model would then be shown a sample galaxy and asked to predict features of the universe from which it was sampled.
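    The shape of Ding's exercise can be caricatured in a few lines. The sketch below is a hypothetical stand-in, not the actual CAMELS pipeline or his neural network: it fabricates "galaxies" whose properties correlate noisily with a universe-level parameter, then fits a plain least-squares model to recover that parameter from a single galaxy's properties.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for CAMELS: each "universe" has one parameter
# (omega_m), and each galaxy is summarized by a few properties that
# depend, noisily, on that parameter. All relationships here are invented.
n_universes = 2000
omega_m = rng.uniform(0.1, 0.5, n_universes)           # target parameter
galaxy_props = np.column_stack([
    2.0 * omega_m + rng.normal(0, 0.05, n_universes),  # a correlated property
    omega_m**2 + rng.normal(0, 0.05, n_universes),     # another correlated property
    rng.normal(0, 1.0, n_universes),                   # an uninformative property
])

# Train on most universes, hold out the rest for testing.
X_train, X_test = galaxy_props[:1500], galaxy_props[1500:]
y_train, y_test = omega_m[:1500], omega_m[1500:]

# Ordinary least squares (with a bias column) stands in for the network.
A = np.hstack([X_train, np.ones((len(X_train), 1))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Predict the parameter for unseen "galaxies" and check the typical error.
pred = np.hstack([X_test, np.ones((len(X_test), 1))]) @ coef
rel_err = np.median(np.abs(pred - y_test) / y_test)
print(f"median relative error: {rel_err:.1%}")
```

    Even this toy version recovers the hidden parameter to within a few per cent, which is the qualitative point: if a galaxy's properties carry an imprint of its universe's parameters, a trained model can read it back out.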

    Ding is very humble about his contribution to the research, but he knows far more about astrophysics than even an exceptional first-year student typically does. Ding, a middle child with two sisters, grew up in State College, Pennsylvania. In high school, he took a series of college-level astronomy courses at Penn State and worked on a couple of research projects that involved machine learning. “My dad was really interested in astronomy as a high schooler,” Ding told me. “He went another direction, though.” His father is a professor of marketing at Penn State’s business school.

    Artificial intelligence is an umbrella concept for various disciplines, including machine learning. A famous early machine-learning task was to get a computer to recognize an image of a cat. This is something that a human can do easily, but, for a computer, there are no simple parameters that define the visual concept of a cat. Machine learning is now used for detecting patterns or relationships that are nearly impossible for humans to see, in part because the data is often in many dimensions. The programmer remains the captain, telling the computer what to learn, and deciding what input it’s trained on. But the computer adapts, iteratively, as it learns, and in that way becomes the author of its own algorithms. It was machine learning, for example, that discovered, through analyzing language patterns, the alleged main authors of the posts by “Q” (the supposed high-ranking government official who sparked the QAnon conspiracy theory). It was also able to identify which of Q’s posts appeared to be written by Paul Furber, a South African software developer, and by Ron Watkins, the son of the former owner of 8chan. Machine-learning programs have also been applied in health care, using data to predict which patients are most at risk of falling. Compared with the intuition of doctors, the machine-learning-based assessments reduced falls by about forty per cent, an enormous margin of improvement for a medical intervention.

    Machine learning has catapulted astrophysics research forward, too. Villaescusa-Navarro said, “As a community, we have been dealing with super-hard problems for many, many years. Problems that the smartest people in the field have been working on for decades. And from one day to the next, these problems are getting solved with machine learning.” Even generating a single simulated universe used to take a very long time. You gave a computer some initial conditions and then had to wait while it worked out what those conditions would produce some fourteen billion years down the line. It took less than fourteen billion years, of course, but there was no way to build up a large database of simulated universes in a timely way. Machine-learning advances have sped up these simulations, making a project like CAMELS possible. An even more ambitious project, Learning the Universe, will use machine learning to create simulated universes millions of times faster than CAMELS can; it will then use what’s called simulation-based inference—along with real observational data from telescopes—to determine which starting parameters lead to a universe that most closely resembles our own.

    Ding told me that one of the reasons he chose astronomy has been the proximity he feels to breakthroughs in the field, even as an undergraduate. “For example, I’m in a cosmology class right now, and when my professor talks about dark matter, she talks about it as something ‘a good friend of mine, Vera Rubin, put on the map,’ ” he said. “And dark energy was discovered by a team at Harvard University about twenty years ago, and I did a summer program there. So here I am, learning about this stuff pretty much in the places where these things were happening.” Ding’s research produced something profoundly unexpected. His model used a single galaxy in a simulated universe to pretty accurately say something about that universe. The specific characteristic it was able to predict is called Omega matter, which relates to the density of a universe. Its value was accurately predicted to within ten per cent.

    Ding was initially unsure how meaningful his results were and was curious to hear Villaescusa-Navarro’s perspective. Villaescusa-Navarro was more than skeptical. “My first thought was, This is completely crazy, I don’t believe it, this is the work of an undergraduate, there must be a mistake,” he said. “I asked him to run the program in a few other ways to see if he would still come up with similar results.” The results held.

    Villaescusa-Navarro began to do his own calculations. His doubt focussed foremost on the way that the machine learning itself worked. “One thing about neural networks is that they are amazing at finding correlations, but they also can pick up on numerical artifacts,” he said. Was a parameter wrong? Was there a bug in the code? Villaescusa-Navarro wrote his own program, to ask the same sort of question that he had assigned to Ding: What could information about one galaxy say about the universe in which it resided? Even when asked by a different program, written from scratch, the answer was still coming out the same. This suggested that the result was catching something real.

    “But we couldn’t just publish that,” Villaescusa-Navarro said. “We needed to try and understand why this might be working.” It was working for small galaxies, and for large galaxies, and for galaxies with very different features; only for a small handful of eccentric galaxies did the work not hold. Why?

    The recipe for making a universe is to start with a lot of hydrogen, a little helium, some dark matter, and some dark energy. Dark matter has mass, like the matter we’re familiar with, but it doesn’t reflect or emit light, so we can’t see it. We also can’t see dark energy, but we can think of it as working in the opposite direction of gravity. The universe’s matter, via gravity, pushes it to contract; the universe’s dark energy pushes it to expand.

    Omega matter is a cosmological parameter that describes how much dark matter is in the universe. Along with other parameters, it controls how much the universe is expanding. The higher its value, the slower the universe would grow. One of the research group’s hypotheses to explain their results is, roughly, that the amount of dark matter in a universe has a very strong effect on a galaxy’s properties—a stronger effect than other characteristics. For this reason, even one galaxy could have something to say about the Omega matter of its parent universe, since Omega matter is correlated to what can be pictured as the density of matter that makes a galaxy clump together.
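    For concreteness, the bookkeeping behind Omega matter can be written down directly. This is standard textbook arithmetic, not part of the paper: Omega matter is the universe's matter density divided by the "critical density" 3H₀²/8πG that separates eternal expansion from recollapse (H₀ and the value 0.3 below are conventional round numbers, assumed for illustration).

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
H0 = 70e3 / 3.086e22   # Hubble constant: 70 km/s/Mpc, converted to 1/s

# Critical density: rho_crit = 3 H0^2 / (8 pi G), roughly 9e-27 kg/m^3,
# equivalent to a few hydrogen atoms per cubic meter.
rho_crit = 3 * H0**2 / (8 * math.pi * G)

omega_m = 0.3                      # roughly our universe's measured value
rho_matter = omega_m * rho_crit    # implied mean matter density

print(f"critical density ~ {rho_crit:.2e} kg/m^3")
print(f"matter density  ~ {rho_matter:.2e} kg/m^3")
```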

    In December, Genel, an expert on galaxy formation, presented the preliminary results of the paper to the galaxy-formation group he belongs to at The Flatiron Institute Center for Computational Astrophysics. “This was really one of the most fun things that happened to me,” he said. He told me that any galaxy-formation expert could have no other first reaction than to think, This is impossible. A galaxy is, on the scale of a universe, about as substantial as a grain of sand is, relative to the size of the Earth. To think that all by itself it can say something so substantial is, to the majority of the astrophysics community, extremely surprising, in a way analogous to the discovery that each of our cells—from a fingernail cell to a liver cell—contains coding describing our entire body. (Though maybe to the poetic way of thinking—to see the world in a grain of sand—the surprise is that this is surprising.)

    Rachel Somerville, an astrophysicist who was at the talk, recalled the initial reaction as “skepticism, but respectful skepticism, since we knew these were serious researchers.” She remembers being surprised that the approach had even been tried, since it seemed so tremendously unlikely that it would work. Since that time, the researchers have shared their coding and results with experts in the field; the results are taken to be credible and compelling, though the hesitations that the authors themselves have about the results remain.

    The results are not “robust”—for now, the computer can make valid predictions only on the type of universe that it has been trained on. Even within CAMELS, there are two varieties of simulations, and, if the machine is trained on one variety, it cannot be used to make predictions for galaxies in the other variety. That also means that the results cannot be used to make predictions about the universe we live in—at least not yet.

    Villaescusa-Navarro told me, “It is a very beautiful result—I know I shouldn’t say that about my own work.” But what is beauty to an astrophysicist? “It’s about an unexpected connection between two things that seemed not to be related. In this case, cosmology and galaxy formation. It’s about something hidden being revealed.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 2:10 pm on March 12, 2022 Permalink | Reply
    Tags: "Ask Ethan-Did our Universe really arise from nothing?", Dark Energy

    From Ethan Siegel: “Ask Ethan: Did our Universe really arise from nothing?”

    Mar 11, 2022

    The Big Bang was hot, dense, uniform, and filled with matter and energy. Before that? There was nothing. Here’s how that’s possible.

    The more curious we get about the great cosmic unknowns, the more unanswered questions our investigations of the Universe will reveal. Inquiring about the nature of anything — where it is, where it came from, and how it came to be — will inevitably lead you to the same great mysteries: about the ultimate nature and origin of the Universe and everything in it. Yet, no matter how far back we go, those same lingering questions always seem to remain: at some point, the entities that are our “starting point” didn’t necessarily exist, so how did they come to be? Eventually, you wind up at the ultimate question: how did something arise from nothing? As many recent questioners, including Luke Martin, Buzz Morse, Russell Blalack, John Heiss and many others have written:

    “Okay, you surely receive this question endlessly, but I shall ask nonetheless: How did something (the universe/big bang) come from nothing?”

    This is maybe one of the biggest questions of all, because it’s basically asking not only where did everything come from, but how did all of it arise in the first place. Here’s as far as science has gotten us, at least, so far.

    A detailed look at the Universe reveals that it’s made of matter and not antimatter, that dark matter and dark energy are required, and that we don’t know the origin of any of these mysteries. However, the fluctuations in the CMB, the formation and correlations between large-scale structure, and modern observations of gravitational lensing all point towards the same picture. (Credit: Chris Blake and Sam Moorfield)

    Today, when we look out at the Universe, the full suite of observations we’ve collected, even with the known uncertainties taken into account, all point towards a remarkably consistent picture. Our Universe is made of matter (rather than antimatter), obeys the same laws of physics everywhere and at all times, and began — at least, as we know it — with a hot Big Bang some 13.8 billion years ago. It’s governed by General Relativity, it’s expanding and cooling and gravitating, and it’s dominated by dark energy (68%) and dark matter (27%), with normal matter, neutrinos, and radiation making up the rest.

    Today, of course, it’s full of galaxies, stars, planets, heavy elements, and in at least one location, intelligent and technologically advanced life. These structures weren’t always there, but rather arose as a result of cosmic evolution. In a remarkable scientific leap, 20th century scientists were able to reconstruct the timeline for how our Universe went from a mostly uniform Universe, devoid of complex structure and consisting exclusively of hydrogen and helium, to the structure-rich Universe we observe today.

    Supernova remnants (L) and planetary nebulae (R) are both ways for stars to recycle their burned, heavy elements back into the interstellar medium and the next generation of stars and planets. These processes are two ways that the heavy elements necessary for chemical-based life to arise are generated, and it’s difficult (but not impossible) to imagine a Universe without them still giving rise to intelligent observers. (Credits: ESO/VLT/FORS Instrument & Team (L); NASA/ESA/C.R. O’Dell (Vanderbilt) and D. Thompson (LBT) (R))

    If we start from today, we can step backwards in time, and ask where any individual structure or component of that structure came from. For each answer we get, we can then ask, “ok, but where did that come from and how did that arise,” going back until we’re forced to answer, “we don’t know, at least not yet.” Then, at last, we can contemplate what we have, and ask, “how did that arise, and is there a way that it could have arisen from nothing?”

    So, let’s get started.

    The life we have today comes from complex molecules, which must have arisen from the atoms of the periodic table: the raw ingredients that make up all the normal matter we have in the Universe today. The Universe wasn’t born with these atoms; instead, they required multiple generations of stars living-and-dying, with the products of their nuclear reactions recycled into future generations of stars. Without this, planets and complex chemistry would be an impossibility.

    In order to form modern stars and galaxies, we need:

    gravitation to pull small galaxies and star clusters into one another, creating large galaxies and triggering new waves of star formation,
    which required pre-existing collections of mass, created from gravitational growth,
    which require dark matter haloes to form early on, preventing star forming episodes from ejecting that matter back into the intergalactic medium,
    which require the right balance of normal matter, dark matter, and radiation to give rise to the cosmic microwave background, the light elements formed in the hot Big Bang, and the abundances/patterns we see in them,
    which required initial seed fluctuations — density imperfections — to gravitationally grow into these structures,
    which require some way of creating these imperfections, along with some way of creating dark matter and creating the initial amounts of normal matter.

    Boiled down, three key ingredients are required in the early stages of the hot Big Bang to give rise to the Universe as we observe it today: initial seed fluctuations, dark matter, and normal matter. Assuming that we also require the laws of physics and spacetime itself to exist — along with matter/energy itself — we probably want to include those as necessary ingredients that must somehow arise.

    So, in short, when we ask whether we can get a Universe from nothing or not, these are the novel, hitherto unexplained entities that we need to somehow arise.

    An equally-symmetric collection of matter and antimatter (of X and Y, and anti-X and anti-Y) bosons could, with the right GUT properties, give rise to the matter/antimatter asymmetry we find in our Universe today. However, we assume that there is a physical, rather than a divine, explanation for the matter-antimatter asymmetry we observe today, but we do not yet know for certain. (Credit: E. Siegel/Beyond the Galaxy.)

    To get more matter than antimatter, we have to extrapolate back into the very early Universe, to a time when our physics is very much uncertain. The laws of physics as we know them are in some sense symmetric between matter and antimatter: every reaction we’ve ever created or observed can only create-or-destroy matter and antimatter in equal amounts. But the Universe we had, despite beginning in an incredibly hot and dense state where matter and antimatter could both be created in abundant, copious amounts, must have had some way to create a matter/antimatter asymmetry where none existed initially.

    There are many ways to accomplish this. Although we don’t know which scenario actually took place in our young Universe, all ways of doing so involve the following three elements:

    an out-of-equilibrium set of conditions, which naturally arise in an expanding, cooling Universe,
    a way to generate baryon-number-violating interactions, which the Standard Model allows through sphaleron interactions (and beyond-the-Standard-Model scenarios allow in additional ways),
    and a way to generate enough C and CP violation to create a matter/antimatter asymmetry in great enough amounts.

    The Standard Model has all of these ingredients, but not enough.

    If you consider a matter/antimatter symmetric Universe as “a Universe with nothing,” then it’s almost guaranteed that the Universe generated something from nothing, even though we aren’t quite certain exactly how it happened.

    The overdense regions from the early Universe grow and grow over time, but are limited in their growth by both the initial small sizes of the overdensities and also by the presence of radiation that’s still energetic, which prevents structure from growing any faster. It takes tens-to-hundreds of millions of years to form the first stars; clumps of matter exist long before that, however. (Credit: Aaron Smith/TACC/UT-Austin)

    Similarly, there are lots of viable ways to generate dark matter. We know — from extensive testing and searching — that whatever dark matter is, it can’t be composed of any particles that are present in the Standard Model. Whatever its true nature is, it requires new physics beyond what’s presently known. But there are many ways it could have been created, including:

    from being thermally created in the hot, early Universe, and then failing to completely annihilate away, remaining stable thereafter (like the lightest supersymmetric or Kaluza-Klein particle),
    or from a phase transition that spontaneously occurred as the Universe expanded and cooled, ripping massive particles out of the quantum vacuum (e.g., the axion),
    as a new form of a neutrino, which itself can either mix with the known neutrinos (i.e., a sterile neutrino), or as a heavy right-handed neutrino that exists in addition to the conventional neutrinos,
    or as a purely gravitational phenomenon that gives rise to an ultramassive particle (e.g., a WIMPzilla).

    Why is there dark matter, today, when the remainder of the Universe appears to work just fine early on without it? There must have been some way to generate this “thing” where there wasn’t such a thing beforehand, but all of these scenarios require energy. So, then, where did all that energy come from?

    The Universe as we observe it today began with the hot Big Bang: an early hot, dense, uniform, expanding state with specific initial conditions. But if we want to understand where the Big Bang comes from, we must not assume it’s the absolute beginning, and we must not assume that anything we can’t predict doesn’t have a mechanism to explain it. (Credit: C.-A. Faucher-Giguere, A. Lidz, and L. Hernquist, Science, 2008)

    Perhaps, according to cosmic inflation — our leading theory of the Universe’s pre-Big Bang origins — it really did come from nothing. This requires a little bit of an explanation, and is what is most frequently meant by “a Universe from nothing.” (Including, by the way, as it was used in the title of the book of the same name.)

    When you imagine the earliest stages of the hot Big Bang, you have to think of something incredibly hot, dense, high-energy, and almost perfectly uniform. When we ask, “how did this arise,” we typically have two options.

    We can go the Lady Gaga route and just claim the Universe must’ve been “born this way.” It was born with these properties, which we call initial conditions, and there’s no further explanation. Theoretical physicists call this approach “giving up.”
    Or we can do what theoretical physicists do best: try to concoct a theoretical mechanism that could explain the initial conditions, teasing out concrete predictions that differ from the standard, prevailing theory’s predictions, and then go out and measure the critical parameters.

    Cosmic inflation came about as a result of taking that second approach, and it literally changed our conception of how our Universe came to be.

    ___________________________________________________________________
    Inflation

    Alan Guth, from M.I.T., who first proposed cosmic inflation.

    Lambda Cold Dark Matter accelerated expansion of the Universe. Credit: Alex Mittelmann.

    Alan Guth’s original notes on inflation.
    ________________________________________________________________
    Exponential expansion, which takes place during inflation, is so powerful because it is relentless. With every ~10^-35 seconds (or so) that passes, the volume of any particular region of space doubles in each direction, causing any particles or radiation to dilute and causing any curvature to quickly become indistinguishable from flat. (Credit: E. Siegel (L); Ned Wright’s Cosmology Tutorial (R))

    Instead of extrapolating “hot and dense” back to an infinitely hot, infinitely dense singularity, inflation basically says, “perhaps the hot Big Bang was preceded by a period where an extremely large energy density was present in the fabric of space itself, causing the Universe to expand at a relentless (inflationary) rate, and then when inflation ended, that energy got transferred into matter-and-antimatter-and-radiation, creating what we see as the hot Big Bang: the aftermath of inflation.”
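    The relentlessness of that doubling is easy to quantify with back-of-the-envelope arithmetic. The numbers below are order-of-magnitude assumptions (a ~10^-35-second doubling time, as quoted above, sustained for a mere 10^-33 seconds), not values taken from any specific inflationary model.

```python
# If each linear dimension of space doubles every ~1e-35 s, even a
# vanishingly short inflationary epoch produces a staggering stretch.
doubling_time = 1e-35   # seconds per doubling (assumed round number)
elapsed = 1e-33         # an assumed, very brief stretch of inflation

doublings = elapsed / doubling_time   # 100 doublings in 1e-33 seconds
stretch = 2.0 ** doublings            # linear stretch factor: 2^100 ~ 1e30
volume_growth = stretch ** 3          # volume grows as the cube of that

print(f"{doublings:.0f} doublings -> linear stretch ~ {stretch:.2e}")
```

    A hundred doublings inflate any region by a factor of roughly 10^30 in each direction, which is why inflation so efficiently flattens curvature and dilutes away any pre-existing particles or relics.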

    In gory detail, this not only creates a Universe with the same temperature everywhere, spatial flatness, and no leftover relics from a hypothetical grand unified epoch, but also predicts a particular type and spectrum of seed (density) fluctuations, which we then went out and saw. From just empty space itself — although it is empty space filled with a large amount of field energy — a natural process has created the entire observable Universe, rich in structure, as we see it today.

    That’s the big idea of getting a Universe from nothing, but it isn’t satisfying to everyone.

    Even in empty space, the quantum fluctuations inherent to the field nature of the fundamental interactions cannot be removed. As the Universe inflates in the earliest stages, those fluctuations get stretched across the Universe, giving rise to seed density and temperature fluctuations that can still be observed today. (Credit: E. Siegel/Beyond the Galaxy)

    To a large fraction of people, a Universe where space-and-time still exist, along with the laws of physics, the fundamental constants, and some non-zero field energy inherent to the fabric of space itself, is very much divorced from the idea of nothingness. We can imagine, after all, a location outside of space; a moment beyond the confines of time; a set of conditions that have no physical reality to constrain them. And those imaginings — if we define these physical realities as things we need to eliminate to obtain true nothingness — are certainly valid, at least philosophically.

    But that’s the difference between philosophical nothingness and a more physical definition of nothingness. As I wrote back in 2018, there are four scientific definitions of nothing, and they’re all valid, depending on your context:

    A time when your “thing” of interest didn’t exist,
    Empty, physical space,
    Empty spacetime in the lowest-energy state possible, and
    Whatever you’re left with when you take away the entire Universe and the laws governing it.

    We can definitely say we obtained “a Universe from nothing” if we use the first two definitions; we cannot if we use the third; and quite unfortunately, we don’t know enough to say what happens if we use the fourth. Without a physical theory to describe what happens outside of the Universe and beyond the realm of physical laws, the concept of true nothingness is physically ill-defined.

    Fluctuations in spacetime itself at the quantum scale get stretched across the Universe during inflation, giving rise to imperfections in both density and gravitational waves. While inflating space can rightfully be called ‘nothing’ in many regards, not everyone agrees. (Credit: E. Siegel; ESA/Planck and the DOE/NASA/NSF Interagency Task Force on CMB research)

    In the context of physics, it’s impossible to make sense of an idea of absolute nothingness. What does it mean to be outside of space and time, and how can space and time sensibly, predictably emerge from a state of non-existence? How can spacetime emerge at a particular location or time, when there’s no definition of location or time without it? Where do the rules governing quanta — the fields and particles both — arise from?

    This line of thought even assumes that space, time, and the laws of physics themselves weren’t eternal, when in fact they may be. Any theorems or proofs to the contrary rely on assumptions whose validity is not soundly established under the conditions which we’d seek to apply them. If you accept a physical definition of “nothing,” then yes, the Universe as we know it very much appears to have arisen from nothing. But if you leave physical constraints behind, then all certainty about our ultimate cosmic origins disappears.

    Unfortunately for us all, inflation, by its very nature, erases any information that might be imprinted from a pre-existing state on our observable Universe. Despite the limitless nature of our imaginations, we can only draw conclusions about matters for which tests involving our physical reality can be constructed. No matter how logically sound any other consideration may be, including a notion of absolute nothingness, it’s merely a construct of our minds.

    See the full article here.


    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 4:45 pm on January 21, 2022 Permalink | Reply
    Tags: "Any Single Galaxy Reveals the Composition of an Entire Universe", A group of scientists may have stumbled upon a radical new way to do cosmology., Cosmic density of matter, Dark Energy, The Cosmology and Astrophysics with Machine Learning Simulations (CAMELS) project, Theoretical Astrophysics

    From Quanta Magazine (US): “Any Single Galaxy Reveals the Composition of an Entire Universe” 

    From Quanta Magazine (US)

    January 20, 2022
    Charlie Wood

    Credit: Kaze Wong / CAMELS collaboration.


    In the CAMELS project, coders simulated thousands of universes with diverse compositions, arrayed at the end of this video as cubes.

    A group of scientists may have stumbled upon a radical new way to do cosmology.

    Cosmologists usually determine the composition of the universe by observing as much of it as possible. But these researchers have found that a machine learning algorithm can scrutinize a single simulated galaxy and predict the overall makeup of the digital universe in which it exists — a feat analogous to analyzing a random grain of sand under a microscope and working out the mass of Eurasia. The machines appear to have found a pattern that might someday allow astronomers to draw sweeping conclusions about the real cosmos merely by studying its elemental building blocks.

    “This is a completely different idea,” said Francisco Villaescusa-Navarro, a theoretical astrophysicist at The Flatiron Institute Center for Computational Astrophysics (US) and lead author of the work. “Instead of measuring these millions of galaxies, you can just take one. It’s really amazing that this works.”

    It wasn’t supposed to. The improbable find grew out of an exercise Villaescusa-Navarro gave to Jupiter Ding, a Princeton University (US) undergraduate: Build a neural network that, knowing a galaxy’s properties, can estimate a couple of cosmological attributes. The assignment was meant merely to familiarize Ding with machine learning. Then they noticed that the computer was nailing the overall density of matter.

    “I thought the student made a mistake,” Villaescusa-Navarro said. “It was a little bit hard for me to believe, to be honest.”

    The results of the investigation that followed appeared in a paper posted on January 6 and submitted for publication. The researchers analyzed 2,000 digital universes generated by The Cosmology and Astrophysics with Machine Learning Simulations (CAMELS) project [The Astrophysical Journal]. These universes had a range of compositions, containing between 10% and 50% matter, with the rest made up of Dark Energy, which drives the universe to expand faster and faster. (Our actual cosmos consists of roughly one-third Dark Matter and visible matter and two-thirds Dark Energy.) As the simulations ran, Dark Matter and visible matter swirled together into galaxies. The simulations also included rough treatments of complicated events like supernovas and jets that erupt from supermassive black holes.

    Ding’s neural network studied nearly 1 million simulated galaxies within these diverse digital universes. From its godlike perspective, it knew each galaxy’s size, composition, mass, and more than a dozen other characteristics. It sought to relate this list of numbers to the density of matter in the parent universe.

    It succeeded. When tested on thousands of fresh galaxies from dozens of universes it hadn’t previously examined, the neural network was able to predict the cosmic density of matter to within 10%. “It doesn’t matter which galaxy you are considering,” Villaescusa-Navarro said. “No one imagined this would be possible.”
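    The kind of regression described above can be sketched in a few lines. The example below is purely illustrative: it trains a small neural network on synthetic "galaxies" whose invented properties carry a noisy imprint of their parent universe's matter fraction, standing in for the real CAMELS catalogues (the feature construction and noise levels here are assumptions, not the paper's setup).

    ```python
    # Illustrative sketch only: a tiny regression network trained on synthetic
    # galaxies whose properties depend (noisily) on the matter fraction of
    # their parent universe. The CAMELS analysis uses far richer simulated data.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    n_galaxies = 4000
    omega_m = rng.uniform(0.10, 0.50, n_galaxies)  # matter fraction per parent universe

    # Invent 5 galaxy properties that each carry a noisy imprint of omega_m
    # (stand-ins for rotation speed, stellar mass, size, and so on).
    signal = np.stack([omega_m * w for w in (1.0, 0.5, -0.8, 0.3, 1.5)], axis=1)
    features = signal + rng.normal(scale=0.05, size=signal.shape)

    X_train, X_test, y_train, y_test = train_test_split(
        features, omega_m, random_state=0)

    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
    net.fit(X_train, y_train)

    print(f"R^2 on unseen galaxies: {net.score(X_test, y_test):.2f}")
    ```

    With such a strong planted signal the network recovers omega_m easily; the surprise in the actual study is that real simulated galaxies, shaped by messy astrophysics, carry a comparably usable imprint.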

    “That one galaxy can get [the density to] 10% or so, that was very surprising to me,” said Volker Springel, an expert in simulating galaxy formation at The MPG Institute for Astrophysics [MPG Institut für Astrophysik](DE) who was not involved in the research.

    The algorithm’s performance astonished researchers because galaxies are inherently chaotic objects. Some form all in one go, and others grow by eating their neighbors. Giant galaxies tend to hold onto their matter, while supernovas and black holes in dwarf galaxies might eject most of their visible matter. Still, every galaxy had somehow managed to keep close tabs on the overall density of matter in its universe.

    One interpretation is “that the universe and/or galaxies are in some ways much simpler than we had imagined,” said Pauline Barmby, an astronomer at The Western University (CA). Another is that the simulations have unrecognized flaws.

    The team spent half a year trying to understand how the neural network had gotten so wise. They checked to make sure the algorithm hadn’t just found some way to infer the density from the coding of the simulation rather than the galaxies themselves. “Neural networks are very powerful, but they are super lazy,” Villaescusa-Navarro said.

    Through a series of experiments, the researchers got a sense of how the algorithm was divining the cosmic density. By repeatedly retraining the network while systematically obscuring different galactic properties, they zeroed in on the attributes that mattered most.
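    The retrain-while-obscuring procedure is a form of feature-importance analysis. A quick stand-in (not the paper's exact method) is permutation importance: shuffle one property at a time and measure how much the model's predictions degrade. The data and feature names below are invented for the example.

    ```python
    # Illustrative sketch: rank feature importance by shuffling one column at a
    # time and measuring the drop in a fitted model's score. The paper's authors
    # instead retrained their network with properties obscured; the data and
    # feature names here are made up.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(1)
    n = 2000
    omega_m = rng.uniform(0.1, 0.5, n)

    # Three made-up galaxy properties: only "rotation" tracks omega_m tightly.
    rotation = omega_m + rng.normal(scale=0.02, size=n)
    size = omega_m + rng.normal(scale=0.30, size=n)   # weakly informative
    color = rng.normal(size=n)                        # pure noise
    X = np.column_stack([rotation, size, color])

    model = RandomForestRegressor(random_state=0).fit(X, omega_m)
    result = permutation_importance(model, X, omega_m, n_repeats=5, random_state=0)

    for name, imp in zip(["rotation", "size", "color"], result.importances_mean):
        print(f"{name:8s} importance: {imp:.3f}")
    ```

    Here shuffling the informative "rotation" column wrecks the predictions while shuffling the noise column changes nothing, which is how obscuring properties one at a time reveals which ones the network actually relies on.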

    Near the top of the list was a property related to a galaxy’s rotation speed, which corresponds to how much matter (dark and otherwise) sits in the galaxy’s central zone. The finding matches physical intuition, according to Springel. In a universe overflowing with Dark Matter, you’d expect galaxies to grow heavier and spin faster. So you might guess that rotation speed would correlate with the cosmic matter density, although that relationship alone is too rough to have much predictive power.

    The neural network found a much more precise and complicated relationship between 17 or so galactic properties and the matter density. This relationship persists despite galactic mergers, stellar explosions and black hole eruptions. “Once you get to more than [two properties], you can’t plot it and squint at it by eye and see the trend, but a neural network can,” said Shaun Hotchkiss, a cosmologist at The University of Auckland (NZ).

    While the algorithm’s success raises the question of how many of the universe’s traits might be extracted from a thorough study of just one galaxy, cosmologists suspect that real-world applications will be limited. When Villaescusa-Navarro’s group tested their neural network on a different property — cosmic clumpiness — it found no pattern. And Springel expects that other cosmological attributes, such as the accelerating expansion of the universe due to Dark Energy, have little effect on individual galaxies.

    The research does suggest that, in theory, an exhaustive study of the Milky Way and perhaps a few other nearby galaxies could enable an exquisitely precise measurement of our universe’s matter density. Such an experiment, Villaescusa-Navarro said, could give clues to other numbers of cosmic import such as the sum of the unknown masses of the universe’s three types of neutrinos.

    3
    Neutrinos- Universe Today

    But in practice, the technique would have to first overcome a major weakness. The CAMELS collaboration cooks up its universes using two different recipes. A neural network trained on one of the recipes makes bad density guesses when given galaxies that were baked according to the other. The cross-prediction failure indicates that the neural network is finding solutions unique to the rules of each recipe. It certainly wouldn’t know what to do with the Milky Way, a galaxy shaped by the real laws of physics. Before applying the technique to the real world, researchers will need to either make the simulations more realistic or adopt more general machine learning techniques — a tall order.

    “I’m very impressed by the possibilities, but one needs to avoid being too carried away,” Springel said.

    But Villaescusa-Navarro takes heart that the neural network was able to find patterns in the messy galaxies of two independent simulations. The digital discovery raises the odds that the real cosmos may be hiding a similar link between the large and the small.

    “It’s a very beautiful thing,” he said. “It establishes a connection between the whole universe and a single galaxy.”

    _____________________________________________________________________________________
    The Dark Energy Survey

    Dark Energy Camera [DECam] built at DOE’s Fermi National Accelerator Laboratory(US).

    NOIRLab National Optical Astronomy Observatory(US) Cerro Tololo Inter-American Observatory(CL) Victor M Blanco 4m Telescope which houses the Dark-Energy-Camera – DECam at Cerro Tololo, Chile at an altitude of 7200 feet.

    NOIRLab(US)NSF NOIRLab NOAO (US) Cerro Tololo Inter-American Observatory(CL) approximately 80 km to the East of La Serena, Chile, at an altitude of 2200 meters.

    Timeline of the Inflationary Universe WMAP.

    The Dark Energy Survey is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. The Dark Energy Survey began searching the Southern skies on August 31, 2013.

    According to Albert Einstein’s Theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up.

    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called Dark Energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    The Dark Energy Survey is designed to probe the origin of the accelerating universe and help uncover the nature of Dark Energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the Dark Energy Survey collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    _____________________________________________________________________________________

    Fritz Zwicky discovered Dark Matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on Dark Matter some 30 years later.

    Fritz Zwicky.
    Coma cluster via NASA/ESA Hubble, the original example of Dark Matter discovered during observations by Fritz Zwicky and confirmed 30 years later by Vera Rubin.
    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as the regions near the center, whereas, if the visible matter were all there is, the outer stars should orbit more slowly, the way the outer planets of the Solar System do. The only way to explain this is if the visible galaxy is embedded in some much larger structure of unseen mass, whose gravity holds the rotation speed roughly constant from center to edge.
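    The discrepancy Rubin measured can be illustrated with a toy calculation in arbitrary units: the circular speed follows v = sqrt(G·M(<r)/r), so a galaxy whose mass all sits at the center gives a falling (Keplerian) curve, while a toy dark halo whose enclosed mass grows with radius gives a flat one. The mass models below are deliberately simplistic.

    ```python
    # Toy rotation curves (arbitrary units): circular speed v = sqrt(G * M(<r) / r)
    # for two mass models. With only a central visible mass, v falls off with
    # radius (Keplerian); embed the galaxy in a dark halo whose enclosed mass
    # grows with r and the curve flattens, as Rubin observed.
    import numpy as np

    G = 1.0                      # gravitational constant, arbitrary units
    r = np.linspace(1.0, 10.0, 50)

    M_visible = 1.0                          # all visible mass near the center
    v_keplerian = np.sqrt(G * M_visible / r)   # falls as 1/sqrt(r)

    M_halo = M_visible * r                   # toy halo: enclosed mass grows ~ r
    v_halo = np.sqrt(G * M_halo / r)         # constant with radius

    print("Keplerian v at r=1 and r=10:", v_keplerian[0], v_keplerian[-1])
    print("With halo  v at r=1 and r=10:", v_halo[0], v_halo[-1])
    ```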

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    Vera Rubin measuring spectra, worked on Dark Matter(Emilio Segre Visual Archives AIP SPL).
    Dark Matter Research

    LBNL LZ Dark Matter Experiment (US) xenon detector at Sanford Underground Research Facility(US) Credit: Matt Kapust.

    Lambda Cold Dark Matter: accelerated expansion of the universe. Credit: Alex Mittelmann.

    DAMA at Gran Sasso uses sodium iodide housed in copper to hunt for dark matter LNGS-INFN.

    Yale HAYSTAC axion dark matter experiment at Yale’s Wright Lab.

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB (CA) deep in Sudbury’s Creighton Mine.

    The LBNL LZ Dark Matter Experiment (US) Dark Matter project at SURF, Lead, SD, USA.

    DAMA-LIBRA Dark Matter experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) located in the Abruzzo region of central Italy.

    DARWIN Dark Matter experiment. A design study for a next-generation, multi-ton dark matter detector in Europe at The University of Zurich [Universität Zürich](CH).

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China.

    Inside the Axion Dark Matter eXperiment U Washington (US) Credit : Mark Stone U. of Washington. Axion Dark Matter Experiment.
    ______________________________________________________

    See the full article here.


    Formerly known as Simons Science News, Quanta Magazine (US) is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 9:28 pm on December 6, 2021 Permalink | Reply
    Tags: "The uneven universe", An uneven distribution of the mass in the universe may have an effect on the speed of cosmic expansion., , , , , Dark Energy, , In reality the universe is not uniform: in some places there are stars and planets and in others there is just a void., It is almost always assumed in cosmological calculations that there is a even distribution of matter in the universe., One of the most important applications of the theory is in describing the cosmic expansion of the universe since the Big Bang., , The scientists starting point was the Mori-Zwanzig formalism-a method for describing systems consisting of a large number of particles with a small number of measurands., The speed of this expansion is determined by the amount of energy in the universe., The University of Münster [Westfälische Wilhelms-Universität Münster] (DE)   

    From The University of Münster [Westfälische Wilhelms-Universität Münster] (DE): “The uneven universe” 

    1

    From The University of Münster [Westfälische Wilhelms-Universität Münster](DE)

    December 3, 2021

    Communication and Public Relations
    Schlossplatz 2
    48149 Münster
    Tel: +49 251 83-22232
    Fax: +49 251 83-22258
    communication@uni-muenster.de

    Timeline of the Inflationary Universe NASA WMAP (US)

    Researchers study cosmic expansion using methods from many-body physics / Article published in Physical Review Letters.

    It is almost always assumed in cosmological calculations that there is an even distribution of matter in the universe. This is because the calculations would be much too complicated if the position of every single star were to be included. In reality the universe is not uniform: in some places there are stars and planets, and in others there is just a void. Physicists Michael te Vrugt and Prof. Raphael Wittkowski from the Institute of Theoretical Physics and the Center for Soft Nanoscience (SoN) at the University of Münster have, together with physicist Dr. Sabine Hossenfelder from The Frankfurt Institute for Advanced Studies (DE), developed a new model for this problem. Their starting point was the Mori-Zwanzig formalism, a method for describing systems consisting of a large number of particles with a small number of measurands. The results of the study have now been published in the journal Physical Review Letters.

    Background: The theory of general relativity developed by Albert Einstein is one of the most successful theories in modern physics. Two of the last five Nobel Prizes for Physics had associations with it: in 2017 for the measurement of gravitational waves, and in 2020 for the discovery of a black hole at the centre of the Milky Way. One of the most important applications of the theory is in describing the cosmic expansion of the universe since the Big Bang. The speed of this expansion is determined by the amount of energy in the universe. In addition to the visible matter, it is above all the dark matter and dark energy which play a role here – at least, according to the Lambda-CDM model currently used in cosmology.
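    The statement that the expansion rate is set by the universe’s energy content is captured by the first Friedmann equation of general relativity, reproduced here for orientation (this is standard textbook material, not taken from the study itself):

    ```latex
    % First Friedmann equation: the expansion rate H is set by the energy content.
    H^2 \equiv \left(\frac{\dot{a}}{a}\right)^2
      = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3}
    ```

    Here a is the cosmic scale factor, ρ the average density of matter and radiation, k the spatial curvature, and Λ the cosmological constant representing Dark Energy. Inserting a mean density ρ into this equation is precisely the simplification the researchers question.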

    Lambda Cold Dark Matter: accelerated expansion of the universe. Credit: Alex Mittelmann.

    “Strictly speaking, it is mathematically wrong to include the mean value of the universe’s energy density in the equations of general relativity”, says Sabine Hossenfelder. The question is now how “bad” this mistake is. Some experts consider it to be irrelevant, others see in it the solution to the enigma of dark energy, whose physical nature is still unknown. An uneven distribution of the mass in the universe may have an effect on the speed of cosmic expansion.

    “The Mori-Zwanzig formalism is already being successfully used in many fields of research, from biophysics to particle physics,” says Raphael Wittkowski, “so it also offered a promising approach to this astrophysical problem.” The team generalised this formalism so that it could be applied to general relativity and, in doing so, derived a model for cosmic expansion while taking into consideration the uneven distribution of matter in the universe.

    The model makes a concrete prediction for the effect of these so-called inhomogeneities on the speed of the expansion of the universe. This prediction deviates slightly from that given by the Lambda-CDM model and thus provides an opportunity to test the new model experimentally. “At present, the astronomical data are not precise enough to measure this deviation,” says Michael te Vrugt, “but the great progress made – for example, in the measurement of gravitational waves – gives us reason to hope that this will change. Also, the new variant of the Mori-Zwanzig formalism can also be applied to other astrophysical problems – so the work is relevant not only to cosmology.”

    See the full article here.


    Headquarters of the WWU.
    Photo: MünsterView/Tronquet

    The University of Münster [Westfälische Wilhelms-Universität Münster](DE) is a public university located in the city of Münster, North Rhine-Westphalia in Germany.

    With more than 43,000 students and over 120 fields of study in 15 departments, it is Germany’s fifth largest university and one of the foremost centers of German intellectual life. The university offers a wide range of subjects across the sciences, social sciences and the humanities. Several courses are also taught in English, including PhD programmes as well as postgraduate courses in geoinformatics, geospatial technologies or information systems.

    Professors and former students have won ten Leibniz Prizes, the most prestigious as well as the best-funded prize in Europe, and one Fields Medal. The WWU has also been successful in the German government’s Excellence Initiative.

     
  • richardmitnick 1:47 pm on October 8, 2021 Permalink | Reply
    Tags: "Fermilab boasts new Theory Division", Astrophysics Theory, , , , Dark Energy, , , Fermilab experts on perturbative QCD use high-performance computing to tackle the complexity of simulations for experiments at the Large Hadron Collider., Muon g-2 Theory Initiative and the Muon g-2 experiment, , Particle Theory, , , Superconducting Systems,   

    From DOE’s Fermi National Accelerator Laboratory (US) : “Fermilab boasts new Theory Division” 

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From DOE’s Fermi National Accelerator Laboratory (US), an enduring source of strength for the US contribution to scientific research worldwide.

    October 8, 2021

    Theoretical physics research at Fermi National Accelerator Laboratory has always sparked new ideas and scientific opportunities, while at the same time supporting the large experimental group that conducts research at Fermilab. In recent years, the Theoretical Physics Department has further strengthened its position worldwide as a hub for the high-energy physics theoretical community. The department has now become Fermilab’s newest division, the Theory Division, which officially launched early this year with strong support from HEP.

    This new division seeks to:

    support strategic theory leadership;
    promote new initiatives, as well as strengthen existing ones;
    and leverage U.S. Department of Energy support through partnerships with universities and more.

    “Creating the Theory Division increases the lab’s abilities to stimulate and develop new pathways to discovery,” said Fermilab Director Nigel Lockyer.

    Led by Marcela Carena and her deputy Patrick Fox, this new division features three departments: Particle Theory, Astrophysics Theory and Quantum Theory. “This structure will help us focus our scientific efforts in each area and will allow for impactful contributions to existing and developing programs for the theory community,” said Carena.

    Particle Theory Department

    At the helm of the Particle Theory Department is Andreas Kronfeld. This department studies all aspects of theoretical particle physics, especially those areas inspired by the experimental program—at Fermilab and elsewhere. It coordinates leading national efforts, including the Neutrino Theory Network, and the migration of the lattice gauge theory program to Exascale computing platforms. Lattice quantum chromodynamics, or QCD, experts support the Muon g-2 Theory Initiative, providing a solid theory foundation for the recently announced results of the Muon g-2 experiment.

    Fermilab particle theorists, working with DOE’s Argonne National Laboratory (US) nuclear theorists, are using machine learning to develop novel event generators that precisely model neutrino-nuclear interactions, and employing lattice QCD to model multi-nucleon interactions; both are important for achieving the science goals of DUNE.

    Fermilab experts on perturbative QCD use high-performance computing to tackle the complexity of simulations for experiments at the Large Hadron Collider. Fermilab theorists are strongly involved in the exploration of physics beyond the Standard Model, through model-building, particle physics phenomenology, and formal aspects of quantum field theory.

    Astrophysics Theory Department

    Astrophysics Theory, led by Dan Hooper, consists of researchers who work at the confluence of astrophysics, cosmology and particle physics. Fermilab’s scientists have played a key role in the development of this exciting field worldwide and continue to be deeply involved in supporting the Fermilab cosmic frontier program.

    Key areas of research include dark matter, dark energy, the cosmic microwave background, large-scale structure, neutrino astronomy and axion astrophysics. A large portion of the department’s research involves numerical cosmological simulations of galaxy formation, large-scale structures and gravitational lensing. The department is developing machine-learning tools to help solve these challenging problems.

    Quantum Theory Department

    Led by Roni Harnik, the Quantum Theory Department has researchers working at the interface of quantum information science and high-energy physics. Fermilab theorists are working to harness the developing power of unique quantum information capabilities to address important physics questions, such as the simulation of QCD processes, dynamics in the early universe, and more generally simulating quantum field theories. Quantum-enhanced capabilities also open new opportunities to explore the universe and test theories of new particles, dark matter, gravitational waves and other new physics.

    Scientists in the Quantum Theory Department are developing new algorithms for quantum simulations, and they are proposing novel methods to search for new phenomena using quantum technology, including quantum optics, atomic physics, optomechanical sensors and superconducting systems. The department works in close collaboration with both the Fermilab Superconducting Quantum Materials and Systems Center and the Fermilab Quantum Institute, as well as leads a national QuantISED theory consortium.

    Looking ahead

    The new Theory Division also intends to play a strong role in attracting and inspiring the next generation of theorists, training them in a data-rich environment, as well as promoting an inclusive culture that values diversity.

    “The best part about being a Fermilab theorist,” said Marcela Carena, “is working with brilliant junior scientists and sharing their excitement about exploring new ideas.”

    See the full article here.




    Fermi National Accelerator Laboratory (US), located just outside Batavia, Illinois, near Chicago, is a United States Department of Energy national laboratory specializing in high-energy particle physics. Since 2007, Fermilab has been operated by the Fermi Research Alliance, a joint venture of the University of Chicago, and the Universities Research Association (URA). Fermilab is a part of the Illinois Technology and Research Corridor.

    Fermilab’s Tevatron was a landmark particle accelerator; until the startup in 2008 of the Large Hadron Collider(CH) near Geneva, Switzerland, it was the most powerful particle accelerator in the world, accelerating protons and antiprotons and producing proton-antiproton collisions with energies of up to 1.96 TeV, the first accelerator to reach one “tera-electron-volt” energy. At 3.9 miles (6.3 km), it was the world’s fourth-largest particle accelerator in circumference. One of its most important achievements was the 1995 discovery of the top quark, announced by research teams using the Tevatron’s CDF and DØ detectors. It was shut down in 2011.

    In addition to high-energy collider physics, Fermilab hosts fixed-target and neutrino experiments, such as MicroBooNE (Micro Booster Neutrino Experiment), NOνA (NuMI Off-Axis νe Appearance) and SeaQuest. Completed neutrino experiments include MINOS (Main Injector Neutrino Oscillation Search), MINOS+, MiniBooNE and SciBooNE (SciBar Booster Neutrino Experiment). The MiniBooNE detector was a 40-foot (12 m) diameter sphere containing 800 tons of mineral oil lined with 1,520 phototube detectors. An estimated 1 million neutrino events were recorded each year. SciBooNE sat in the same neutrino beam as MiniBooNE but had fine-grained tracking capabilities. The NOνA experiment uses, and the MINOS experiment used, Fermilab’s NuMI (Neutrinos at the Main Injector) beam, which is an intense beam of neutrinos that travels 455 miles (732 km) through the Earth to the Soudan Mine in Minnesota and the Ash River, Minnesota, site of the NOνA far detector. In 2017, the ICARUS neutrino experiment was moved from CERN to Fermilab.
    In the public realm, Fermilab is home to a native prairie ecosystem restoration project and hosts many cultural events: public science lectures and symposia, classical and contemporary music concerts, folk dancing and arts galleries. The site is open from dawn to dusk to visitors who present valid photo identification.
    Asteroid 11998 Fermilab is named in honor of the laboratory.
    Weston, Illinois, was a community next to Batavia voted out of existence by its village board in 1966 to provide a site for Fermilab.

    The laboratory was founded in 1969 as the National Accelerator Laboratory; it was renamed in honor of Enrico Fermi in 1974. The laboratory’s first director was Robert Rathbun Wilson, under whom the laboratory opened ahead of time and under budget. Many of the sculptures on the site are of his creation. He is the namesake of the site’s high-rise laboratory building, whose unique shape has become the symbol for Fermilab and which is the center of activity on the campus.
    After Wilson stepped down in 1978 to protest the lack of funding for the lab, Leon M. Lederman took on the job. It was under his guidance that the original accelerator was replaced with the Tevatron, an accelerator capable of colliding protons and antiprotons at a combined energy of 1.96 TeV. Lederman stepped down in 1989. The science education center at the site was named in his honor.
    The later directors include:

    John Peoples, 1989 to 1996
    Michael S. Witherell, July 1999 to June 2005
    Piermaria Oddone, July 2005 to July 2013
    Nigel Lockyer, September 2013 to the present

    Fermilab continues to participate in the work at the Large Hadron Collider (LHC); it serves as a Tier 1 site in the Worldwide LHC Computing Grid.

    DOE’s Fermi National Accelerator Laboratory(US)/MINERvA Reidar Hahn.

    FNAL Don Lincoln.

    FNAL Icon

     
  • richardmitnick 8:25 pm on July 18, 2021 Permalink | Reply
    Tags: "Curiosity and technology drive quest to reveal fundamental secrets of the universe", A very specific particle called a J/psi might provide a clearer picture of what’s going on inside a proton’s gluonic field., , Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together., , , , , , Computational Science, , Dark Energy, , , , Developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles., , , Exploring the hearts of protons and neutrons, , , Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle., , , , , , , SLAC National Accelerator Laboratory(US), , ,   

    From DOE’s Argonne National Laboratory (US) : “Curiosity and technology drive quest to reveal fundamental secrets of the universe” 

    Argonne Lab

    From DOE’s Argonne National Laboratory (US)

    July 15, 2021
    John Spizzirri

    Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together.

    Imagine the first of our species to lie beneath the glow of an evening sky. An enormous sense of awe, perhaps a little fear, fills them as they wonder at those seemingly infinite points of light and what they might mean. As humans, we evolved the capacity to ask big insightful questions about the world around us and worlds beyond us. We dare, even, to question our own origins.

    “The place of humans in the universe is important to understand,” said physicist and computational scientist Salman Habib. ​“Once you realize that there are billions of galaxies we can detect, each with many billions of stars, you understand the insignificance of being human in some sense. But at the same time, you appreciate being human a lot more.”

    The South Pole Telescope is part of a collaboration between Argonne and a number of national labs and universities to measure the CMB, considered the oldest light in the universe.

    The high altitude and extremely dry conditions of the South Pole keep water vapor from absorbing select light wavelengths.

    With no less a sense of wonder than most of us, Habib and colleagues at the U.S. Department of Energy’s (DOE) Argonne National Laboratory are actively researching these questions through an initiative that investigates the fundamental components of both particle physics and astrophysics.

    The breadth of Argonne’s research in these areas is mind-boggling. It takes us back to the very edge of time itself, to some infinitesimally small portion of a second after the Big Bang when random fluctuations in temperature and density arose, eventually forming the breeding grounds of galaxies and planets.

    It explores the heart of protons and neutrons to understand the most fundamental constructs of the visible universe, particles and energy once free in the early post-Big Bang universe, but later confined forever within a basic atomic structure as that universe began to cool.

    And it addresses slightly newer, more controversial questions about the nature of Dark Matter and Dark Energy, both of which play a dominant role in the makeup and dynamics of the universe but are little understood.
    _____________________________________________________________________________________
    Dark Energy Survey

    Dark Energy Camera [DECam] built at DOE’s Fermi National Accelerator Laboratory(US)

    NOIRLab National Optical Astronomy Observatory (US) Cerro Tololo Inter-American Observatory (CL) Victor M. Blanco 4-meter Telescope, which houses the Dark Energy Camera (DECam), at Cerro Tololo, Chile, at an altitude of 7200 feet.

    NSF NOIRLab NOAO Cerro Tololo Inter-American Observatory (CL), approximately 80 km to the east of La Serena, Chile, at an altitude of 2200 meters.

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    _____________________________________________________________________________________

    “And this world-class research we’re doing could not happen without advances in technology,” said Argonne Associate Laboratory Director Kawtar Hafidi, who helped define and merge the different aspects of the initiative.

    “We are developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles,” she added. ​“And because all of these detectors create big data that have to be analyzed, we are developing, among other things, artificial intelligence techniques to do that as well.”

    Decoding messages from the universe

    Fleshing out a theory of the universe on cosmic or subatomic scales requires a combination of observations, experiments, theories, simulations and analyses, which in turn requires access to the world’s most sophisticated telescopes, particle colliders, detectors and supercomputers.

    Argonne is uniquely suited to this mission, equipped as it is with many of those tools, the ability to manufacture others and collaborative privileges with other federal laboratories and leading research institutions to access other capabilities and expertise.

    As lead of the initiative’s cosmology component, Habib uses many of these tools in his quest to understand the origins of the universe and what makes it tick.

    And what better way to do that than to observe it, he said.

    “If you look at the universe as a laboratory, then obviously we should study it and try to figure out what it is telling us about foundational science,” noted Habib. ​“So, one part of what we are trying to do is build ever more sensitive probes to decipher what the universe is trying to tell us.”

    To date, Argonne is involved in several significant sky surveys, which use an array of observational platforms, like telescopes and satellites, to map different corners of the universe and collect information that furthers or rejects a specific theory.

    For example, the South Pole Telescope survey, a collaboration between Argonne and a number of national labs and universities, is measuring the cosmic microwave background (CMB) [above], considered the oldest light in the universe. Variations in CMB properties, such as temperature, signal the original fluctuations in density that ultimately led to all the visible structure in the universe.
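    The temperature variations in question are tiny. A minimal sketch, assuming the textbook mean CMB temperature of 2.725 K, a typical fractional anisotropy of about one part in 100,000, and Wien’s displacement law (all standard values, none from the article):

```python
T_CMB = 2.725                  # mean CMB temperature in kelvin (COBE/FIRAS value)
FRACTIONAL_ANISOTROPY = 1e-5   # typical delta-T / T of the primordial fluctuations
WIEN_B = 2.898e-3              # Wien displacement constant, m * K

# Size of the temperature variations a CMB telescope must resolve
delta_t_microkelvin = T_CMB * FRACTIONAL_ANISOTROPY * 1e6

# Wavelength at which a 2.725 K blackbody peaks -- why CMB telescopes
# observe millimeter-wavelength light
peak_mm = WIEN_B / T_CMB * 1000

print(f"Typical fluctuation: {delta_t_microkelvin:.0f} microkelvin")
print(f"Blackbody peak: {peak_mm:.2f} mm")
```

    Variations of a few tens of microkelvin on a 2.725 K background are why the detectors described later in the article must be so sensitive.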

    Additionally, the Dark Energy Spectroscopic Instrument and the forthcoming Vera C. Rubin Observatory are specially outfitted, ground-based telescopes designed to shed light on dark energy and dark matter, as well as the formation of luminous structure in the universe.

    DOE’s Lawrence Berkeley National Laboratory (US) DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory, in the Quinlan Mountains in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, at an altitude of 2,096 m (6,877 ft).

    NOAO Mayall 4-meter telescope at NSF NOIRLab Kitt Peak National Observatory.

    NSF NOIRLab NOAO Kitt Peak National Observatory, annotated.

    NSF NOIRLab NOAO Vera C. Rubin Observatory [LSST] Telescope, currently under construction on the El Peñón peak at Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region in northern Chile, alongside the existing Gemini South Telescope and the Southern Astrophysical Research Telescope.

    Darker matters

    All the data sets derived from these observations are connected to the second component of Argonne’s cosmology push, which revolves around theory and modeling. Cosmologists combine observations, measurements and the prevailing laws of physics to form theories that resolve some of the mysteries of the universe.

    But the universe is complex, and it has an annoying tendency to throw a curve ball just when we think we have a theory cinched. Discoveries within the past 100 years have revealed that the universe is both expanding and accelerating that expansion — realizations that came as separate but equally startling surprises.

    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    “To say that we understand the universe would be incorrect. To say that we sort of understand it is fine,” exclaimed Habib. ​“We have a theory that describes what the universe is doing, but each time the universe surprises us, we have to add a new ingredient to that theory.”

    Modeling helps scientists get a clearer picture of whether and how those new ingredients will fit a theory. They make predictions for observations that have not yet been made, telling observers what new measurements to take.

    Habib’s group is applying this same sort of process to gain an ever-so-tentative grasp on the nature of dark energy and dark matter. While scientists can tell us that both exist and that they comprise about 68% and 26% of the universe, respectively, beyond that not much else is known.
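    Those percentages can be turned into a quick consistency check. A minimal sketch, assuming a flat universe and the standard flat-Lambda-CDM deceleration parameter q0 = Omega_matter/2 − Omega_Lambda (the formula is textbook cosmology, not from the article):

```python
# A flat universe (total density = 1) with the article's rough budget:
OMEGA_LAMBDA = 0.68          # dark energy
OMEGA_DARK_MATTER = 0.26     # dark matter
omega_ordinary = 1.0 - OMEGA_LAMBDA - OMEGA_DARK_MATTER   # what's left for atoms

omega_matter = OMEGA_DARK_MATTER + omega_ordinary

# Deceleration parameter for a flat Lambda-CDM universe:
#   q0 = Omega_matter / 2 - Omega_Lambda
# q0 < 0 means the expansion is speeding up, matching the 1998 supernova result.
q0 = omega_matter / 2 - OMEGA_LAMBDA

print(f"Ordinary matter: {omega_ordinary:.0%}")
print(f"q0 = {q0:+.2f} -> {'accelerating' if q0 < 0 else 'decelerating'}")
```

    A negative q0 is the acceleration the 1998 supernova teams discovered; drop the dark-energy term and q0 turns positive, i.e., a decelerating universe.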

    ______________________________________________________________________________________________________________

    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on dark matter some thirty years later.

    Fritz Zwicky. Credit: palomarskies.blogspot.com.


    Coma cluster via NASA/ESA Hubble.


    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as their centers, whereas, if only the visible matter were present, the outskirts should rotate more slowly, the way distant planets circle the Sun more slowly than nearby ones. The only way to explain the flat rotation she measured is if each galaxy is embedded in some much larger, invisible structure whose gravity keeps the rotation speed consistent from center to edge, as if the visible disk were only the label on a much larger record, so to speak.
    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
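    Rubin’s flat rotation curves can be put into rough numbers. The sketch below is illustrative only: the visible mass and the flat 220 km/s speed are round, hypothetical values for a Milky-Way-like galaxy, not figures from the article. It compares the falling Keplerian speed predicted by the visible mass alone with the mass actually required inside each radius to sustain a flat curve:

```python
import math

G = 4.30091e-6   # Newton's G in kpc * (km/s)^2 / solar mass

VISIBLE_MASS = 6e10      # hypothetical visible (stellar) mass, in solar masses
V_OBSERVED = 220.0       # roughly flat observed rotation speed, km/s

def circular_speed(mass_enclosed_msun, radius_kpc):
    """Orbital speed (km/s) for mass enclosed within radius: v = sqrt(G M / r)."""
    return math.sqrt(G * mass_enclosed_msun / radius_kpc)

print(" r (kpc)  v_Keplerian  M_implied / M_visible")
for r in (5, 10, 20, 40):
    # Prediction from visible mass alone: v falls off with radius
    v_kep = circular_speed(VISIBLE_MASS, r)
    # Mass actually required inside r to sustain the flat observed speed
    m_implied = V_OBSERVED**2 * r / G
    print(f"{r:8d}  {v_kep:10.0f}  {m_implied / VISIBLE_MASS:6.1f}x")
```

    The implied mass grows roughly linearly with radius while the starlight does not — that mismatch is the rotation-curve argument for a dark halo.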

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970

    Dark Matter Research

    Inside the Axion Dark Matter eXperiment at the University of Washington (US). Credit: Mark Stone, University of Washington.
    _____________________________________________________________________________________

    Observations of cosmological structure — the distribution of galaxies and even of their shapes — provide clues about the nature of dark matter, which in turn feeds simple dark matter models and subsequent predictions. If observations, models and predictions aren’t in agreement, that tells scientists that there may be some missing ingredient in their description of dark matter.

    But there are also experiments that are looking for direct evidence of dark matter particles, which require highly sensitive detectors [above]. Argonne has initiated development of specialized superconducting detector technology for the detection of low-mass dark matter particles.

    This technology requires the ability to control the properties of layered materials and to tune the temperature at which the material transitions from finite to zero resistance and becomes a superconductor. And unlike other applications, where scientists would like this temperature to be as high as possible — room temperature, for example — here the transition needs to be very close to absolute zero.

    Habib refers to these dark matter detectors as traps, like those used for hunting — which, in essence, is what cosmologists are doing. Because it’s possible that dark matter doesn’t come in just one species, they need different types of traps.

    “It’s almost like you’re in a jungle in search of a certain animal, but you don’t quite know what it is — it could be a bird, a snake, a tiger — so you build different kinds of traps,” he said.

    Lab researchers are working on technologies to capture these elusive species through new classes of dark matter searches. Collaborating with other institutions, they are now designing and building a first set of pilot projects aimed at looking for dark matter candidates with low mass.

    Tuning in to the early universe

    Amy Bender is working on a different kind of detector — well, a lot of detectors — which are at the heart of a survey of the cosmic microwave background (CMB).

    “The CMB is radiation that has been around the universe for 13 billion years, and we’re directly measuring that,” said Bender, an assistant physicist at Argonne.

    The Argonne-developed detectors — all 16,000 of them — capture photons, or light particles, from that primordial sky through the aforementioned South Pole Telescope, to help answer questions about the early universe, fundamental physics and the formation of cosmic structures.

    Now, the CMB experimental effort is moving into a new phase, CMB-Stage 4 (CMB-S4).

    CMB-S4 is the next-generation ground-based cosmic microwave background experiment. With 21 telescopes at the South Pole and in the Chilean Atacama desert surveying the sky with 550,000 cryogenically cooled superconducting detectors for seven years, CMB-S4 will deliver transformative discoveries in fundamental physics, cosmology, astrophysics, and astronomy. CMB-S4 is supported by the Department of Energy Office of Science and the National Science Foundation.

    This larger project tackles even more complex topics like Inflationary Theory, which suggests that the universe expanded faster than the speed of light for a fraction of a second, shortly after the Big Bang.
    _____________________________________________________________________________________
    Inflation

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation. Credit: HPHS Owls.

    Lambda Cold Dark Matter accelerated expansion of the universe. Credit: Alex Mittelmann, Coldcreation; via scinotions.com.


    Alan Guth’s notes:

    Alan Guth’s original notes on inflation


    _____________________________________________________________________________________

    A section of a detector array with architecture suitable for future CMB experiments, such as the upcoming CMB-S4 project. Fabricated at Argonne’s Center for Nanoscale Materials, 16,000 of these detectors currently drive measurements collected from the South Pole Telescope. (Image by Argonne National Laboratory.)

    While the science is amazing, the technology to get us there is just as fascinating.

    Technically called transition edge sensing (TES) bolometers, the detectors on the telescope are made from superconducting materials fabricated at Argonne’s Center for Nanoscale Materials, a DOE Office of Science User Facility.

    Each of the 16,000 detectors acts as a combination of very sensitive thermometer and camera. As incoming radiation is absorbed on the surface of each detector, measurements are made by supercooling them to a fraction of a degree above absolute zero. (That’s over three times as cold as Antarctica’s lowest recorded temperature.)

    Changes in heat are measured and recorded as changes in electrical resistance and will help inform a map of the CMB’s intensity across the sky.
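    The idea of reading heat as resistance can be illustrated with a toy transition curve. This is a sketch only; the resistance, transition temperature, and width below are invented round numbers, not Argonne’s device parameters:

```python
import math

# Toy transition-edge sensor model: the superconducting transition is
# approximated by a steep logistic curve in temperature.
R_NORMAL = 1.0       # normal-state resistance, ohms (invented)
T_C = 0.5            # transition temperature, kelvin (invented)
WIDTH = 0.001        # transition width, kelvin -- the steepness is the point

def tes_resistance(temp_k):
    """Resistance vs temperature across the superconducting transition."""
    return R_NORMAL / (1.0 + math.exp(-(temp_k - T_C) / WIDTH))

# Bias the sensor mid-transition, then deposit a tiny bit of photon energy:
r_before = tes_resistance(T_C)             # mid-transition operating point
r_after = tes_resistance(T_C + 0.0005)     # heated by half a millikelvin

print(f"dR = {r_after - r_before:.3f} ohm for a 0.5 mK temperature rise")
```

    Because the transition is so narrow, a sub-millikelvin temperature rise from absorbed photons produces a comfortably measurable fractional change in resistance — which is why the detectors must be held at the transition, a fraction of a degree above absolute zero.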

    CMB-S4 will focus on newer technology that will allow researchers to distinguish very specific patterns in light, or polarized light. In this case, they are looking for what Bender calls the Holy Grail of polarization, a pattern called B-modes.

    Capturing this signal from the early universe — one far fainter than the intensity signal — will help to either confirm or disprove a generic prediction of inflation.

    It will also require the addition of 500,000 detectors distributed among 21 telescopes in two distinct regions of the world, the South Pole and the Chilean desert. There, the high altitude and extremely dry conditions keep water vapor in the atmosphere from absorbing millimeter wavelength light, like that of the CMB.

    While previous experiments have touched on this polarization, the large number of new detectors will improve sensitivity to that polarization and grow our ability to capture it.

    “Literally, we have built these cameras completely from the ground up,” said Bender. ​“Our innovation is in how to make these stacks of superconducting materials work together within this detector, where you have to couple many complex factors and then actually read out the results with the TES. And that is where Argonne has contributed, hugely.”

    Down to the basics

    Argonne’s capabilities in detector technology don’t just stop at the edge of time, nor do the initiative’s investigations just look at the big picture.

    Most of the visible universe, including galaxies, stars, planets and people, is made up of protons and neutrons. Understanding the most fundamental components of those building blocks and how they interact to make atoms and molecules and just about everything else is the realm of physicists like Zein-Eddine Meziani.

    “From the perspective of the future of my field, this initiative is extremely important,” said Meziani, who leads Argonne’s Medium Energy Physics group. ​“It has given us the ability to actually explore new concepts, develop better understanding of the science and a pathway to enter into bigger collaborations and take some leadership.”

    Taking the lead of the initiative’s nuclear physics component, Meziani is steering Argonne toward a significant role in the development of the Electron-Ion Collider, a new U.S. Nuclear Physics Program facility slated for construction at DOE’s Brookhaven National Laboratory (US).

    Argonne’s primary interest in the collider is to elucidate the role that quarks, anti-quarks and gluons play in giving mass and a quantum angular momentum, called spin, to protons and neutrons — nucleons — the particles that comprise the nucleus of an atom.


    EIC Electron Animation, Inner Proton Motion.
    Electrons colliding with ions will exchange virtual photons with the nuclear particles to help scientists ​“see” inside the nuclear particles; the collisions will produce precision 3D snapshots of the internal arrangement of quarks and gluons within ordinary nuclear matter; like a combination CT/MRI scanner for atoms. (Image by Brookhaven National Laboratory.)

    While we once thought nucleons were the smallest fundamental particles of an atom, the emergence of powerful particle colliders, like the Stanford Linear Accelerator Center at Stanford University and the former Tevatron at DOE’s Fermilab, proved otherwise.

    It turns out that quarks and gluons were independent of nucleons in the extreme energy densities of the early universe; as the universe expanded and cooled, they transformed into ordinary matter.

    “There was a time when quarks and gluons were free in a big soup, if you will, but we have never seen them free,” explained Meziani. ​“So, we are trying to understand how the universe captured all of this energy that was there and put it into confined systems, like these droplets we call protons and neutrons.”

    Some of that energy is tied up in gluons, which, despite the fact that they have no mass, confer the majority of mass to a proton. So, Meziani is hoping that the Electron-Ion Collider will allow science to explore — among other properties — the origins of mass in the universe through a detailed exploration of gluons.
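    The arithmetic behind that claim is striking. A minimal sketch, assuming rough textbook rest masses for the up and down quarks (the values below are approximate standard figures, my assumption rather than numbers from the article):

```python
# Rough particle-physics values, in MeV/c^2 (assumptions for illustration):
PROTON_MASS = 938.3
UP_QUARK = 2.2
DOWN_QUARK = 4.7

quark_rest_mass = 2 * UP_QUARK + DOWN_QUARK   # proton = uud
gluon_field_share = 1.0 - quark_rest_mass / PROTON_MASS

print(f"Quark rest masses: {quark_rest_mass:.1f} MeV of {PROTON_MASS} MeV")
print(f"Share from the gluonic field and quark motion: {gluon_field_share:.0%}")
```

    Since the three valence quarks account for only about 1% of the proton’s mass, the other ~99% must come from the energy of the gluon field and the quarks’ motion — the very thing the Electron-Ion Collider aims to map.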

    And just as Amy Bender is looking for the B-modes polarization in the CMB, Meziani and other researchers are hoping to use a very specific particle called a J/psi to provide a clearer picture of what’s going on inside a proton’s gluonic field.

    But producing and detecting the J/psi particle within the collider — while ensuring that the proton target doesn’t break apart — is a tricky enterprise, which requires new technologies. Again, Argonne is positioning itself at the forefront of this endeavor.

    “We are working on the conceptual designs of technologies that will be extremely important for the detection of these types of particles, as well as for testing concepts for other science that will be conducted at the Electron-Ion Collider,” said Meziani.

    Argonne also is producing detector and related technologies in its quest for a phenomenon called neutrinoless double beta decay. A neutrino is one of the particles emitted during the process of neutron radioactive beta decay and serves as a small but mighty connection between particle physics and astrophysics.

    “Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle,” said Hafidi. ​“If the existence of these very rare decays is confirmed, it would have important consequences in understanding why there is more matter than antimatter in the universe.”

    Argonne scientists from different areas of the lab are working on the Neutrino Experiment with Xenon Time Projection Chamber (NEXT) collaboration to design and prototype key systems for the collaboration’s next big experiment. This includes developing a one-of-a-kind test facility and an R&D program for new, specialized detector systems.

    “We are really working on dramatic new ideas,” said Meziani. ​“We are investing in certain technologies to produce some proof of principle that they will be the ones to pursue later, that the technology breakthroughs that will take us to the highest sensitivity detection of this process will be driven by Argonne.”

    The tools of detection

    Ultimately, fundamental science is science derived from human curiosity. And while we may not always see the reason for pursuing it, more often than not, fundamental science produces results that benefit all of us. Sometimes it’s a gratifying answer to an age-old question, other times it’s a technological breakthrough intended for one science that proves useful in a host of other applications.

    Through their various efforts, Argonne scientists are aiming for both outcomes. But it will take more than curiosity and brain power to solve the questions they are asking. It will take our skills at toolmaking, like the telescopes that peer deep into the heavens and the detectors that capture hints of the earliest light or the most elusive of particles.

    We will need to employ the ultrafast computing power of new supercomputers. Argonne’s forthcoming Aurora exascale machine will analyze mountains of data for help in creating massive models that simulate the dynamics of the universe or subatomic world, which, in turn, might guide new experiments — or introduce new questions.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, to be built at DOE’s Argonne National Laboratory.

    And we will apply artificial intelligence to recognize patterns in complex observations — on the subatomic and cosmic scales — far more quickly than the human eye can, or use it to optimize machinery and experiments for greater efficiency and faster results.

    “I think we have been given the flexibility to explore new technologies that will allow us to answer the big questions,” said Bender. ​“What we’re developing is so cutting edge, you never know where it will show up in everyday life.”

    Funding for research mentioned in this article was provided by Argonne Laboratory Directed Research and Development; Argonne program development; DOE Office of High Energy Physics: Cosmic Frontier, South Pole Telescope-3G project, Detector R&D; and DOE Office of Nuclear Physics.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Argonne National Laboratory (US) seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems. Argonne is a science and engineering research national laboratory operated by UChicago Argonne LLC for the United States Department of Energy. The facility is located in Lemont, Illinois, outside of Chicago, and is the largest national laboratory by size and scope in the Midwest.

    Argonne had its beginnings in the Metallurgical Laboratory of the University of Chicago, formed in part to carry out Enrico Fermi’s work on nuclear reactors for the Manhattan Project during World War II. After the war, it was designated as the first national laboratory in the United States on July 1, 1946. In the post-war era the lab focused primarily on non-weapon related nuclear physics, designing and building the first power-producing nuclear reactors, helping design the reactors used by the United States’ nuclear navy, and a wide variety of similar projects. In 1994, the lab’s nuclear mission ended, and today it maintains a broad portfolio in basic science research, energy storage and renewable energy, environmental sustainability, supercomputing, and national security.

    UChicago Argonne, LLC, the operator of the laboratory, “brings together the expertise of the University of Chicago (the sole member of the LLC) with Jacobs Engineering Group Inc.” Argonne is a part of the expanding Illinois Technology and Research Corridor. Argonne formerly ran a smaller facility called Argonne National Laboratory-West (or simply Argonne-West) in Idaho next to the Idaho National Engineering and Environmental Laboratory. In 2005, the two Idaho-based laboratories merged to become the DOE’s Idaho National Laboratory.
    What would become Argonne began in 1942 as the Metallurgical Laboratory at the University of Chicago, which had become part of the Manhattan Project. The Met Lab built Chicago Pile-1, the world’s first nuclear reactor, under the stands of the University of Chicago sports stadium. Considered unsafe, in 1943, CP-1 was reconstructed as CP-2, in what is today known as Red Gate Woods but was then the Argonne Forest of the Cook County Forest Preserve District near Palos Hills. The lab was named after the surrounding forest, which in turn was named after the Forest of Argonne in France where U.S. troops fought in World War I. Fermi’s pile was originally going to be constructed in the Argonne forest, and construction plans were set in motion, but a labor dispute brought the project to a halt. Since speed was paramount, the project was moved to the squash court under Stagg Field, the football stadium on the campus of the University of Chicago. Fermi told them that he was sure of his calculations, which said that it would not lead to a runaway reaction, which would have contaminated the city.

    Other activities were added to Argonne over the next five years. On July 1, 1946, the “Metallurgical Laboratory” was formally re-chartered as Argonne National Laboratory for “cooperative research in nucleonics.” At the request of the U.S. Atomic Energy Commission, it began developing nuclear reactors for the nation’s peaceful nuclear energy program. In the late 1940s and early 1950s, the laboratory moved to a larger location in unincorporated DuPage County, Illinois and established a remote location in Idaho, called “Argonne-West,” to conduct further nuclear research.

    In quick succession, the laboratory designed and built Chicago Pile 3 (1944), the world’s first heavy-water moderated reactor, and the Experimental Breeder Reactor I (Chicago Pile 4), built in Idaho, which lit a string of four light bulbs with the world’s first nuclear-generated electricity in 1951. A complete list of the reactors designed and, in most cases, built and operated by Argonne can be viewed on the Reactors Designed by Argonne page. The knowledge gained from the Argonne experiments conducted with these reactors 1) formed the foundation for the designs of most of the commercial reactors currently used throughout the world for electric power generation and 2) informs the current evolving designs of liquid-metal reactors for future commercial power stations.

    Conducting classified research, the laboratory was heavily secured; all employees and visitors needed badges to pass a checkpoint, many of the buildings were classified, and the laboratory itself was fenced and guarded. Such alluring secrecy drew visitors both authorized—including King Leopold III of Belgium and Queen Frederica of Greece—and unauthorized. Shortly past 1 a.m. on February 6, 1951, Argonne guards discovered reporter Paul Harvey near the 10-foot (3.0 m) perimeter fence, his coat tangled in the barbed wire. Searching his car, guards found a previously prepared four-page broadcast detailing the saga of his unauthorized entrance into a classified “hot zone”. He was brought before a federal grand jury on charges of conspiracy to obtain information on national security and transmit it to the public, but was not indicted.

    Not all nuclear technology went into developing reactors, however. While designing a scanner for reactor fuel elements in 1957, Argonne physicist William Nelson Beck put his own arm inside the scanner and obtained one of the first ultrasound images of the human body. Remote manipulators designed to handle radioactive materials laid the groundwork for more complex machines used to clean up contaminated areas, sealed laboratories or caves. In 1964, the “Janus” reactor opened to study the effects of neutron radiation on biological life, providing research for guidelines on safe exposure levels for workers at power plants, laboratories and hospitals. Scientists at Argonne pioneered a technique to analyze the moon’s surface using alpha radiation, which was launched aboard Surveyor 5 in 1967 and later used to analyze lunar samples from the Apollo 11 mission.

    In addition to nuclear work, the laboratory maintained a strong presence in the basic research of physics and chemistry. In 1955, Argonne chemists co-discovered the elements einsteinium and fermium, elements 99 and 100 in the periodic table. In 1962, laboratory chemists produced the first compound of the inert noble gas xenon, opening up a new field of chemical bonding research. In 1963, they discovered the hydrated electron.

    High-energy physics made a leap forward when Argonne was chosen as the site of the 12.5 GeV Zero Gradient Synchrotron, a proton accelerator that opened in 1963. A bubble chamber allowed scientists to track the motions of subatomic particles as they zipped through the chamber; in 1970, they observed the neutrino in a hydrogen bubble chamber for the first time.

    Meanwhile, the laboratory was also helping to design the reactor for the world’s first nuclear-powered submarine, the U.S.S. Nautilus, which steamed for more than 513,550 nautical miles (951,090 km). The next nuclear reactor model was the Experimental Boiling Water Reactor, the forerunner of many modern nuclear plants, and Experimental Breeder Reactor II (EBR-II), which was sodium-cooled and included a fuel recycling facility. EBR-II was later modified to test other reactor designs, including a fast-neutron reactor and, in 1982, the Integral Fast Reactor concept, a revolutionary design that reprocessed its own fuel, reduced its atomic waste and withstood safety tests simulating the same failures that triggered the Chernobyl and Three Mile Island disasters. In 1994, however, the U.S. Congress terminated funding for the bulk of Argonne’s nuclear programs.

    Argonne moved to specialize in other areas, while capitalizing on its experience in physics, chemical sciences and metallurgy. In 1987, the laboratory was the first to successfully demonstrate a pioneering technique called plasma wakefield acceleration, which accelerates particles in much shorter distances than conventional accelerators. It also cultivated a strong battery research program.

    Following a major push by then-director Alan Schriesheim, the laboratory was chosen as the site of the Advanced Photon Source, a major X-ray facility which was completed in 1995 and produced the brightest X-rays in the world at the time of its construction.

    On 19 March 2019, it was reported in the Chicago Tribune that the laboratory was constructing the world’s most powerful supercomputer. Costing $500 million, it will have a processing power of one quintillion floating-point operations per second (one exaflop). Applications will include the analysis of stars and improvements in the power grid.

    With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About the Advanced Photon Source

    The U. S. Department of Energy Office of Science’s Advanced Photon Source (APS) at Argonne National Laboratory is one of the world’s most productive X-ray light source facilities. The APS provides high-brightness X-ray beams to a diverse community of researchers in materials science, chemistry, condensed matter physics, the life and environmental sciences, and applied research. These X-rays are ideally suited for explorations of materials and biological structures; elemental distribution; chemical, magnetic, electronic states; and a wide range of technologically important engineering systems from batteries to fuel injector sprays, all of which are the foundations of our nation’s economic, technological, and physical well-being. Each year, more than 5,000 researchers use the APS to produce over 2,000 publications detailing impactful discoveries, and solve more vital biological protein structures than users of any other X-ray light source research facility. APS scientists and engineers innovate technology that is at the heart of advancing accelerator and light-source operations. This includes the insertion devices that produce extreme-brightness X-rays prized by researchers, lenses that focus the X-rays down to a few nanometers, instrumentation that maximizes the way the X-rays interact with samples being studied, and software that gathers and manages the massive quantity of data resulting from discovery research at the APS.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 9:05 am on May 31, 2021 Permalink | Reply
    Tags: "Looking deep into the universe", , Dark Energy, , , , HIRAX telescope in the Karoo semidesert in South Africa, , , , ,   

    From Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich](CH): “Looking deep into the universe” 

    From Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich](CH)

    31.05.2021
    Felix Würsten

    How is matter distributed within our universe? And what is the mysterious substance known as dark energy made of? HIRAX, a new large telescope array comprising hundreds of small radio telescopes, should provide some answers. Among those instrumental in developing the system are physicists from ETH Zürich.

    2
    Hartebeesthoek Radio Astronomy Observatory, located west of Johannesburg South Africa.
    How the final expansion of the HIRAX telescope in the Karoo semidesert in South Africa should look once completed. (Image: Cynthia Chiang / HIRAX.)

    “It’s an exciting project,” says Alexandre Refregier, Professor of Physics at ETH Zürich, as he considers the futuristic-looking visualisation from South Africa. The image shows a scene in the middle of the Karoo semidesert, far away from larger settlements, with rows upon rows of more than 1,000 parabolic reflectors all directed towards the same point. At first glance, one might assume this is a solar power station, but it’s actually a large radio telescope that over the coming years should provide cosmologists with new insights into the makeup and history of our universe.

    Key element: hydrogen

    HIRAX stands for Hydrogen Intensity and Real-​time Analysis eXperiment and marks the start of a new chapter in the exploration of the universe. The new large telescope will collect radio signals within a frequency range of 400 to 800 MHz. These signals will make it possible to measure the distribution of hydrogen in the universe on a large scale. “If we can use hydrogen, the most common element in the universe, to discover how matter is distributed in space, we could then draw conclusions about what dark matter and dark energy are made of,” Refregier explains.
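    The link between observing frequency and cosmic epoch is worth spelling out: neutral hydrogen emits at a rest frequency of about 1420.4 MHz (the 21 cm line), and cosmic expansion stretches this to lower frequencies, so each observed frequency corresponds to a redshift z = f_rest/f_obs − 1. A minimal sketch for the band quoted above (the rest frequency is a standard physical constant; the conversion to a redshift range is a gloss on the article, not a statement from it):

    ```python
    # Redshift range probed by a 21 cm survey observing 400-800 MHz,
    # the HIRAX band quoted in the article.
    F_REST_MHZ = 1420.405751  # rest frequency of the hydrogen 21 cm line

    def redshift(f_obs_mhz: float) -> float:
        """Redshift at which the 21 cm line is observed at f_obs_mhz."""
        return F_REST_MHZ / f_obs_mhz - 1.0

    z_near = redshift(800.0)  # high-frequency edge: nearest hydrogen
    z_far = redshift(400.0)   # low-frequency edge: most distant hydrogen

    print(f"Band covers roughly z = {z_near:.2f} to z = {z_far:.2f}")
    ```

    Lower frequencies thus probe hydrogen further back in cosmic history, which is why a fixed 400 to 800 MHz band maps a well-defined slice of the universe's expansion.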

    Dark Energy and Dark Matter are two mysterious components that together make up the vast majority of the universe. They play a major role in the formation of structures and in the universe’s accelerated expansion. But experts remain puzzled about exactly what dark energy and dark matter are made of. HIRAX should help home in on the precise nature of these two components. The researchers also hope that the new system will deliver insights into fast radio bursts and pulsars.

    Combining hundreds of individual signals

    Not only will Refregier and his team be involved in the scientific analysis of the data, the professor is also helping to develop the new system together with his postdoc Devin Crichton and engineer Thierry Viant. “HIRAX is a remarkable undertaking, not just from a scientific point of view, but also because it represents a significant technological challenge,” Refregier says. As part of their subproject in collaboration with scientists from the University of Geneva [Université de Genève](CH), the ETH researchers are developing what’s known as a digital correlator, which will combine the signals recorded by each of the approximately six-metre telescopes. “Rather than consisting of a single large telescope, the HIRAX array is made up of numerous smaller radio telescopes that are correlated with each other,” Refregier says. “This enables us to build a telescope with a collection surface and resolution much greater than a measuring device with only one parabolic reflector.”
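    To illustrate what “correlated with each other” involves, here is a toy sketch of an FX-style correlator, a common design for such arrays: each antenna’s voltage stream is Fourier-transformed into frequency channels (“F”), then every pair of antennas is cross-multiplied channel by channel (“X”). The antenna count and random data below are made up for illustration; this is not HIRAX’s actual pipeline.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_ant, n_samples = 8, 1024  # toy values; HIRAX plans ~1,000 dishes

    # Simulated voltage time streams, one row per antenna.
    voltages = rng.standard_normal((n_ant, n_samples))

    # "F" step: channelise each time stream with a real FFT.
    spectra = np.fft.rfft(voltages, axis=1)

    # "X" step: cross-multiply every antenna pair per frequency channel.
    visibilities = {}
    for i in range(n_ant):
        for j in range(i + 1, n_ant):
            visibilities[(i, j)] = spectra[i] * np.conj(spectra[j])

    # The number of correlated pairs (baselines) grows as N*(N-1)/2,
    # which is why correlating ~1,000 dishes is computationally demanding.
    print(len(visibilities))
    ```

    For 8 antennas there are 28 pairs; for roughly 1,000 dishes the same formula gives about half a million baselines per frequency channel, every time step, which is the scale of the computation the ETH group's hardware has to sustain.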

    Tested in Switzerland

    The physicists first tested the technology for the digital correlator in Switzerland using a pilot system. To do so, they used the two historic radio telescopes housed at the Bleien facility in the Swiss canton of Aargau. They will now use the results of these tests to develop a digital correlator capable of linking 256 reflectors. “The HIRAX telescope is being set up in stages, which allows us to develop and refine the technology we need as we go along,” Refregier says. The funding required for this subproject was recently secured.

    For their digital correlator, the ETH Zurich physicists are using high-performance graphics processing units that were originally developed for video and gaming applications. The researchers are also breaking new ground when it comes to calibration. To synchronise the measurement signals received by the individual antennas, they use a radio signal transmitted by a drone. Pinpointing the position of this signal is crucial for the telescope to deliver the required precision.

    An ideal location

    It’s no accident that the HIRAX telescope is being installed in the Karoo semidesert. As a protected area, it is still largely free of disruptive signals from mobile communications antennas. “It’s actually quite ironic,” Refregier says. “On the one hand, mobile communications technology is a massive help in developing telescopes. On the other, that same technology makes life difficult for radio astronomers because mobile communications antennas transmit within similar frequency ranges.”

    Another reason why the Karoo region is an ideal location is that this is also where part of the planned Square Kilometre Array will be erected.


    Once completed, this will be the world’s largest radio telescope, connecting systems in South Africa and Australia and representing yet another giant leap forward in radio astronomy. “Despite its remote position, the Karoo location is well connected by power and data lines,” Refregier says. Even so, the undertaking presents a challenge, because the new telescope will generate 6.5 terabytes of data every second. “This is why we’re going to install the digital correlator directly on site, so that the amount of data can first be reduced before it is sent somewhere else for further processing,” Refregier says.

    Opening the door for the next large-scale project

    A collaboration among numerous other universities from different countries, the HIRAX project is also important with respect to research policy. First, it strengthens the collaboration between South Africa and Switzerland, enabling young scientists from the former to conduct research in the latter. Second, Refregier is grateful that the work being done on the development of HIRAX is opening the door to Switzerland’s participation in the Square Kilometre Array: “This means that we can do our part to ensure that Swiss universities are involved in this pioneering project and can keep pace with the latest developments in radio astronomy.”

    _____________________________________________________________________________________
    Dark Energy Survey


    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.
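    The two possibilities can be stated compactly. In general relativity, the cosmic scale factor a(t) obeys the acceleration equation (this is standard textbook cosmology, not a formula from the article itself):

    ```latex
    % Acceleration equation for the cosmic scale factor a(t):
    \frac{\ddot{a}}{a} \;=\; -\,\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right)
    ```

    Ordinary matter and radiation have density ρ > 0 and pressure p ≥ 0, so they can only decelerate the expansion. A component with equation of state p = wρc² makes the expansion accelerate only if w < −1/3; a cosmological constant, with w = −1, is the simplest such dark energy. The alternative is that the equation itself, i.e., general relativity, fails on cosmic scales.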

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time is used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    _____________________________________________________________________________________

    Dark Matter Background
    Fritz Zwicky discovered Dark Matter in the 1930s when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did most of the work on Dark Matter some 30 years later.

    Fritz Zwicky from http://palomarskies.blogspot.com.


    Coma cluster via NASA/ESA Hubble.


    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outer regions of galaxies rotate just as fast as their centers, whereas, under the gravity of the visible matter alone, the outskirts should rotate more slowly, much as the outer planets of the Solar System orbit the Sun more slowly than the inner ones. The only way to explain this is if each visible galaxy is embedded in some much larger structure, as if it were only the label on a vinyl LP, so to speak, whose unseen mass keeps the rotation speed consistent from center to edge.
    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
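    Rubin’s argument can be made quantitative: if only the visible mass were present, orbital speeds beyond the luminous disk should fall off as v ∝ 1/√r, whereas a flat rotation curve implies an enclosed mass M(r) = v²r/G that keeps growing with radius. The sketch below uses made-up illustrative numbers, not real measurements:

    ```python
    import math

    G = 6.674e-11      # gravitational constant, SI units
    M_visible = 1.5e41  # made-up "visible" galaxy mass, kg
    v_flat = 220e3      # a typical flat rotation speed, m/s
    kpc = 3.086e19      # metres per kiloparsec

    def v_keplerian(r):
        """Orbital speed if all the mass sat inside radius r."""
        return math.sqrt(G * M_visible / r)

    for r_kpc in (5, 10, 20, 40):
        r = r_kpc * kpc
        m_enclosed = v_flat**2 * r / G  # mass implied by a flat curve
        print(f"r = {r_kpc:2d} kpc: Keplerian v = {v_keplerian(r)/1e3:6.1f} km/s, "
              f"implied M(r) = {m_enclosed/M_visible:5.2f} x visible mass")
    ```

    The Keplerian speed drops with radius while the mass implied by a flat curve grows linearly, which is exactly the mismatch that led Rubin to postulate unseen matter.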

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu.


    _____________________________________________________________________________________

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich](CH) is a public research university in the city of Zürich, Switzerland. Founded by the Swiss Federal Government in 1854 with the stated mission to educate engineers and scientists, the school focuses exclusively on science, technology, engineering and mathematics. Like its sister institution Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne](CH), it is part of the Swiss Federal Institutes of Technology Domain (ETH Domain), part of the Swiss Federal Department of Economic Affairs, Education and Research.

    The university is an attractive destination for international students thanks to low tuition fees of 809 CHF per semester, PhD and graduate salaries that are amongst the world’s highest, and a world-class reputation in academia and industry. There are currently 22,200 students from over 120 countries, of which 4,180 are pursuing doctoral degrees. In the 2021 edition of the QS World University Rankings ETH Zürich is ranked 6th in the world and 8th by the Times Higher Education World Rankings 2020. In the 2020 QS World University Rankings by subject it is ranked 4th in the world for engineering and technology (2nd in Europe) and 1st for earth & marine science.

    As of November 2019, 21 Nobel laureates, 2 Fields Medalists, 2 Pritzker Prize winners, and 1 Turing Award winner have been affiliated with the Institute, including Albert Einstein. Other notable alumni include John von Neumann and Santiago Calatrava. It is a founding member of the IDEA League and the International Alliance of Research Universities (IARU) and a member of the CESAER network.

    ETH Zürich was founded on 7 February 1854 by the Swiss Confederation and began giving its first lectures on 16 October 1855 as a polytechnic institute (eidgenössische polytechnische Schule) at various sites throughout the city of Zurich. It was initially composed of six faculties: architecture, civil engineering, mechanical engineering, chemistry, forestry, and an integrated department for the fields of mathematics, natural sciences, literature, and social and political sciences.

    It is locally still known as Polytechnikum, or simply as Poly, derived from the original name eidgenössische polytechnische Schule, which translates to “federal polytechnic school”.

    ETH Zürich is a federal institute (i.e., under direct administration by the Swiss government), whereas the University of Zürich is a cantonal institution. The decision for a new federal university was heavily disputed at the time; the liberals pressed for a “federal university”, while the conservative forces wanted all universities to remain under cantonal control, worried that the liberals would gain more political power than they already had. In the beginning, both universities were co-located in the buildings of the University of Zürich.

    From 1905 to 1908, under the presidency of Jérôme Franel, the course program of ETH Zürich was restructured to that of a real university and ETH Zürich was granted the right to award doctorates. In 1909 the first doctorates were awarded. In 1911, it was given its current name, Eidgenössische Technische Hochschule. In 1924, another reorganization structured the university in 12 departments. However, it now has 16 departments.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form the “ETH Domain” with the aim of collaborating on scientific projects.

    Reputation and ranking

    ETH Zürich is ranked among the top universities in the world. Typically, popular rankings place the institution as the best university in continental Europe and ETH Zürich is consistently ranked among the top 1-5 universities in Europe, and among the top 3-10 best universities of the world.

    Historically, ETH Zürich has achieved its reputation particularly in the fields of chemistry, mathematics and physics. There are 32 Nobel laureates who are associated with ETH Zürich, the most recent of whom is Richard F. Heck, awarded the Nobel Prize in chemistry in 2010. Albert Einstein is perhaps its most famous alumnus.

    In 2018, the QS World University Rankings placed ETH Zürich at 7th overall in the world. In 2015, ETH Zürich was ranked 5th in the world in Engineering, Science and Technology, just behind the Massachusetts Institute of Technology(US), Stanford University(US) and University of Cambridge(UK). In 2015, ETH Zürich also ranked 6th in the world in Natural Sciences, and in 2016 ranked 1st in the world for Earth & Marine Sciences for the second consecutive year.

    In 2016, Times Higher Education World University Rankings ranked ETH Zürich 9th overall in the world and 8th in the world in the field of Engineering & Technology, just behind the Massachusetts Institute of Technology(US), Stanford University(US), California Institute of Technology(US), Princeton University(US), University of Cambridge(UK), Imperial College London(UK) and

     
  • richardmitnick 11:19 am on May 17, 2021 Permalink | Reply
    Tags: "DESI Begins Creating 3D Map of the Universe", , , , Dark Energy, , , The Dark Energy Spectroscopic Instrument (DESI) will capture the light from tens of millions of galaxies and other cosmic objects.   

    From NSF’s NOIRLab (National Optical-Infrared Astronomy Research Laboratory) (US): “DESI Begins Creating 3D Map of the Universe” 

    From NSF’s NOIRLab (National Optical-Infrared Astronomy Research Laboratory) (US)

    17 May 2021

    Contacts

    Arjun Dey
    NSF’s NOIRLab
    Tel: +1 520-318-8429
    Email: arjun.dey@noirlab.edu

    Parker Fagrelius
    NSF’s NOIRLab
    Email: parker.fagrelius@noirlab.edu

    Amanda Kocz
    NSF’s NOIRLab
    Tel: +1 626-524-5884
    Email: amanda.kocz@noirlab.edu

    The Dark Energy Spectroscopic Instrument (DESI) completes trial run and begins survey to map the Universe and unravel mysterious dark energy.

    1
    A quest to map the Universe and unravel the mysteries of dark energy began officially today, 17 May 2021, at Kitt Peak National Observatory, a Program of NSF’s NOIRLab.

    Over the next five years, the Dark Energy Spectroscopic Instrument (DESI) will capture the light from tens of millions of galaxies and other cosmic objects. During its four-month trial run, which just concluded, the project already collected millions of observations.

    By gathering light from some 30 million galaxies, project scientists say that DESI will help them construct a 3D map of the Universe in unprecedented detail. DESI will do this by collecting spectra, which spread out the light from celestial objects into the colors of the rainbow, revealing information such as the chemical composition of the objects being observed and their relative distances and velocities. This data will help astronomers better understand the repulsive force associated with dark energy, which drives the acceleration of the Universe’s expansion across vast cosmic distances.
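    The redshift measurement behind those “relative distances and velocities” is simple to sketch: a spectral line’s observed wavelength, compared with its rest-frame wavelength, gives the redshift z; at low redshift the recession velocity is v ≈ cz, and distance then follows from Hubble’s law. The line wavelengths and Hubble-constant value below are illustrative assumptions, not numbers from the article:

    ```python
    C_KM_S = 299_792.458  # speed of light, km/s
    H0 = 70.0             # assumed Hubble constant, km/s per Mpc

    def redshift(lambda_obs: float, lambda_rest: float) -> float:
        """Redshift from an observed vs. rest-frame line wavelength."""
        return lambda_obs / lambda_rest - 1.0

    # Hypothetical galaxy: H-alpha line (rest 656.3 nm) observed at 721.9 nm.
    z = redshift(721.9, 656.3)
    v = C_KM_S * z  # low-redshift approximation v = c*z
    d = v / H0      # Hubble-law distance, in Mpc

    print(f"z = {z:.3f}, v ~ {v:.0f} km/s, d ~ {d:.0f} Mpc")
    ```

    Repeating this measurement for some 30 million galaxies, each placed at its own redshift, is what turns DESI’s flat images of the sky into a three-dimensional map.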

    DESI is an international science collaboration managed by the US Department of Energy’s Lawrence Berkeley National Laboratory (US) with primary funding from the Department’s Office of Science. DESI resides at the retrofitted Nicholas U. Mayall 4-meter Telescope at Kitt Peak National Observatory, a Program of NSF’s NOIRLab.

    Jim Siegrist, Associate Director for High Energy Physics at DOE, said, “We are excited to see the start of DESI, the first next-generation dark energy project to begin its science survey. We also congratulate Berkeley Lab, which continues to enhance our capabilities for studying the nature of dark energy, since leading the initial discovery in 1999. DOE’s Berkeley Lab successfully led the 13-nation DESI team, including US government, private, and international contributions, in the design, fabrication, and commissioning of the world’s premier multi-object spectrograph. The strong interagency collaboration with NSF has enabled DOE to install and operate DESI on their Mayall telescope, which is required to carry out this amazing experiment. Along with its primary mission of dark energy studies, the data set will be of use by the wider scientific community for a multitude of astrophysics studies.”

    “The combination of the Mayall telescope and DESI instrument is now the best astronomical survey machine on the planet,” said Arjun Dey, the DESI project scientist for NOIRLab and the DESI Observing Operations lead. “Its initial five-year mission, hopefully the first of many, will produce the most detailed cartographic map of our accelerating, expanding Universe ever created. I can’t wait to see what it will discover!”

    “The DESI experiment is an excellent example of the amazing science that can be achieved when government agencies collaborate to make the most of national observatory facilities like the Mayall telescope,” says Chris Davis, NSF Program Director for NOIRLab.

    What sets DESI apart from previous sky surveys? “We will measure ten times more galaxy spectra than ever obtained,” said the project director, Berkeley Lab’s Michael Levi. “These spectra get us a third dimension.” Instead of two-dimensional images of galaxies, quasars, and other distant objects, he explained, the instrument collects light, or spectra, from the cosmos such that it “becomes a time machine where we place those objects on a timeline that reaches as far back as 11 billion years ago.”

    “DESI is the most ambitious of a new generation of instruments aimed at better understanding the cosmos, in particular its dark energy component,” said project co-spokesperson Nathalie Palanque-Delabrouille, a cosmologist at France’s Alternative Energies and Atomic Energy Commission (CEA). She said the scientific program — including her own interest in quasars — will allow researchers to address with precision two primary questions: what is dark energy, and to what degree does gravity follow the laws of general relativity, which form the basis of our understanding of the cosmos.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    What is NOIRLab?

    NSF’s NOIRLab (National Optical-Infrared Astronomy Research Laboratory) (US), the US center for ground-based optical-infrared astronomy, operates the international Gemini Observatory (US) (a facility of National Science Foundation (US), NRC–Canada, ANID–Chile, MCTIC–Brazil, MINCyT–Argentina, and Korea Astronomy and Space Science Institute [한국천문연구원] (KR)), NOAO Kitt Peak National Observatory(US) (KPNO), Cerro Tololo Inter-American Observatory(CL) (CTIO), the Community Science and Data Center (CSDC), and Vera C. Rubin Observatory (in cooperation with DOE’s SLAC National Accelerator Laboratory (US)). It is managed by the Association of Universities for Research in Astronomy (AURA) (US) under a cooperative agreement with NSF and is headquartered in Tucson, Arizona. The astronomical community is honored to have the opportunity to conduct astronomical research on Iolkam Du’ag (Kitt Peak) in Arizona, on Maunakea in Hawaiʻi, and on Cerro Tololo and Cerro Pachón in Chile. We recognize and acknowledge the very significant cultural role and reverence that these sites have to the Tohono O’odham Nation, to the Native Hawaiian community, and to the local communities in Chile, respectively.

    National Science Foundation(US) NOIRLab (US) NOAO (US) Kitt Peak National Observatory (US) on Kitt Peak of the Quinlan Mountains in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, Altitude 2,096 m (6,877 ft). annotated.

    NOIRLab(US)NOAO Cerro Tololo Inter-American Observatory(CL) approximately 80 km to the East of La Serena, Chile, at an altitude of 2200 meters.

    The NOAO-Community Science and Data Center(US)

    The NSF NOIRLab Vera C. Rubin Observatory.

    NSF (US) NOIRLab (US) NOAO (US) Vera C. Rubin Observatory [LSST] Telescope currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing NSF (US) NOIRLab (US) NOAO (US) Gemini South Telescope and NSF (US) NOIRLab (US) NOAO (US) Southern Astrophysical Research Telescope.

     
  • richardmitnick 4:25 pm on May 3, 2021 Permalink | Reply
    Tags: "Search for 'dark energy' could illuminate origin and evolution and fate of universe", , , , , Dark Energy, HETDEX-the Hobby-Eberly Telescope Dark Energy Experiment., Hobby-Eberly 9.1 meter Telescope,   

    From Pennsylvania State University: “Search for ‘dark energy’ could illuminate origin and evolution and fate of universe” 

    Penn State Bloc

    From Pennsylvania State University

    May 03, 2021
    Seth Palmer

    The universe we see is only the very tip of the vast cosmic iceberg.

The universe contains hundreds of billions of galaxies, each of them home to billions of stars, planets and moons as well as massive star-and-planet-forming clouds of gas and dust. Add in all of the visible light and other energy we can detect in the form of electromagnetic radiation, such as radio waves, gamma rays and X-rays, and everything we’ve ever seen with our telescopes still amounts to only about 5% of all the mass and energy in the universe.

Along with this so-called normal matter there is also dark matter, which can’t be seen directly but reveals itself through its gravitational effect on normal, visible matter; it makes up another 27% of the universe. Add them together and they total only 32% of the mass and energy of the universe. So where’s the other 68%?

    Dark Energy.
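That budget is simple bookkeeping; a minimal sketch of the arithmetic, using the rounded percentages quoted above:

```python
# Mass-energy budget of the universe, using the article's rounded percentages.
normal_matter = 5    # stars, gas, dust, planets: everything telescopes can see
dark_matter = 27     # invisible, inferred from its gravitational pull
known = normal_matter + dark_matter
dark_energy = 100 - known
print(f"Known components: {known}%")              # prints 32%
print(f"Left over (dark energy): {dark_energy}%")  # prints 68%
```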

    Dark Energy Survey

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.

    This pie chart shows rounded values for the three known components of the universe: normal matter, dark matter, and dark energy. IMAGE: NASA’s Goddard Space Flight Center (US)

    So what exactly is dark energy? Put simply, it’s a mysterious force that’s pushing the universe outward and causing it to expand faster as it ages, engaged in a cosmic tug-of-war with dark matter, which is trying to pull the universe together. Beyond that, we don’t yet understand what dark energy is, but Penn State astronomers are at the core of a group that’s aiming to find out through a unique and ambitious project 16 years in the making: HETDEX, the Hobby-Eberly Telescope Dark Energy Experiment.

    HETDEX is a collaboration of The University of Texas at Austin (US), Pennsylvania State University (US), Texas A&M University (US), Universities-Sternwärte Munich [Universitäts-Sternwarte München] (DE), Leibniz Institute for Astrophysics [Leibniz-Institut für Astrophysik] (DE) (AIP), Max-Planck-Institut für Extraterrestrische Physik, Institut für Astrophysik Göttingen, and University of Oxford (UK). Financial support is provided by the State of Texas, the United States Air Force, the National Science Foundation and the generous contributions of many private foundations and individuals.

    “HETDEX has the potential to change the game,” said Associate Professor of Astronomy and Astrophysics Donghui Jeong.

    Dark energy and the expanding universe

    Today there is consensus among astronomers that the universe we inhabit is expanding, and that its expansion is accelerating, but the idea of an expanding universe is less than a century old, and the notion of dark energy (or anything else) accelerating that expansion has only been around for a little more than 20 years.

    In 1917 when Albert Einstein applied his general theory of relativity to describe the universe as a whole, laying the foundations for the big bang theory, he and other leading scientists at that time conceived of the cosmos as static and nonexpanding. But in order to keep that universe from collapsing under the attractive force of gravity, he needed to introduce a repulsive force to counteract it: the cosmological constant.

It wasn’t until 1929, when Edwin Hubble discovered that the universe is in fact expanding and that galaxies farther from Earth are moving away faster than those closer by, that the model of a static universe was finally abandoned.

    Even Einstein was quick to modify his theories, by the early 1930s publishing two new and distinct models of the expanding universe, both of them without the cosmological constant.

    But although astronomers had finally come to understand that the universe was expanding, and had more or less abandoned the concept of the cosmological constant, they also presumed that the universe was dominated by matter and that gravity would eventually cause its expansion to slow; the universe would either continue to expand forever, but ever-increasingly slowly, or it would at some point cease its expansion and then collapse, ending in a “big crunch.”

    “That’s the way we thought the universe worked, up until 1998,” said Professor of Astronomy and Astrophysics Robin Ciardullo, a founding member of HETDEX.

    That year, two independent teams — one led by Saul Perlmutter at DOE’s Lawrence Berkeley National Laboratory (US), and the other led by Brian Schmidt of the Australian National University (AU) and Adam Riess of the NASA Space Telescope Science Institute (US) — would nearly simultaneously publish astounding results showing that the expansion of the universe was in fact accelerating, driven by some mysterious antigravity force.

    Later that year, cosmologist Michael Turner of the University of Chicago (US) and DOE’s Fermi National Accelerator Laboratory (US) coined the term “dark energy” to describe this mysterious force.

The discovery would be named Science magazine’s “Breakthrough of the Year” for 1998, and in 2011 Perlmutter, Schmidt and Riess would be awarded the Nobel Prize in Physics.

    Saul Perlmutter [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt and Adam Riess [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    Competing theories

    More than 20 years after the discovery of dark energy, astronomers still don’t know what, exactly, it is.

    “Whenever astronomers say ‘dark,’ that means we don’t have any clue about it,” Jeong said with a wry grin. “Dark energy is just another way of saying that we don’t know what’s causing this accelerating expansion.”

    There are, however, a number of theories that attempt to explain dark energy, and a few major contenders.

    Perhaps the most favored explanation is the previously abandoned cosmological constant, which modern-day physicists describe as vacuum energy. “The vacuum in physics is not a state of nothing,” Jeong explained. “It is a place where particles and antiparticles are continuously created and destroyed.” The energy produced in this perpetual cycle could exert an outward-pushing force on space itself, causing its expansion, initiated in the big bang, to accelerate.

Unfortunately, the theoretical calculations of vacuum energy don’t match the observations — by a factor of as much as 10^120, or a one followed by 120 zeroes. “That’s very, very unusual,” Jeong said, “but that’s where we’ll be if dark energy turns out to be constant.” Clearly this discrepancy is a major issue, and it could necessitate a reworking of current theory, but the cosmological constant in the form of vacuum energy is nonetheless the leading candidate so far.
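The size of that mismatch can be illustrated with a back-of-the-envelope comparison: take the observed dark-energy density to be roughly 70% of the critical density, and take the Planck energy density as the naive quantum-field-theory estimate of the vacuum energy. These are standard textbook numbers, not figures from the article, and the crude estimate lands near the often-quoted factor of 10^120:

```python
import math

# Physical constants (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
hbar = 1.055e-34     # reduced Planck constant, J s

# Observed side: dark energy as ~70% of the critical energy density
H0 = 70e3 / 3.086e22                               # ~70 km/s/Mpc, converted to 1/s
rho_crit = 3 * H0**2 / (8 * math.pi * G) * c**2    # critical energy density, J/m^3
rho_obs = 0.7 * rho_crit

# Theoretical side: naive vacuum energy at the Planck scale
rho_planck = c**7 / (hbar * G**2)                  # Planck energy density, J/m^3

ratio = rho_planck / rho_obs
print(f"Theory/observation mismatch: ~10^{math.log10(ratio):.0f}")
```

The naive cutoff choice yields a ratio of roughly 10^122-10^123; more careful bookkeeping is what brings the commonly quoted figure down to about 10^120, but the absurd scale of the disagreement is the same either way.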

    Another possible explanation is a new, yet-undiscovered particle or field that would permeate all of space; but so far, there’s no evidence to support this.

    A third possibility is that Einstein’s theory of gravity is incorrect. “If you start from the wrong equation,” Jeong said, “then you get the wrong answer.” There are alternatives to general relativity, but each has its own issues and none has yet displaced it as the reigning theory. For now, it’s still the best description of gravity we’ve got.

    Ultimately, what’s needed is more and better observational data — precisely what HETDEX was designed to collect like no other survey has done before.

    A map of stars and sound

    “HETDEX is very ambitious,” Ciardullo said. “It’s going to observe a million galaxies to map out the structure of the universe going over two-thirds of the way back to the beginning of time. We’re the only ones going out that far to see the dark energy component of the universe and how it’s evolving.”

    Ciardullo, an observational astronomer who studies everything from nearby stars to faraway galaxies and dark matter, is HETDEX’s observations manager. He’s quick to note, though, that he’s got help in that role (from Jeong and others) and that he and everyone else on the project wears more than one hat. “This is a very big project,” he said. “It’s over $40 million. But if you count heads, it’s not very many people. And so we all do more than one thing.”

    Jeong, a theoretical astrophysicist and cosmologist who also studies gravitational waves, was instrumental in laying the groundwork for the study and is heavily involved in the project’s data analysis — and he’s also helping Ciardullo determine where to point the 10-meter Hobby-Eberly Telescope, the world’s third largest. “It’s kind of interesting,” he noted with a chuckle, “a theorist telling observers where to look.”

    While other studies measure the universe’s expansion using distant supernovae or a phenomenon known as gravitational lensing, where light is bent by the gravity of massive objects such as galaxies and black holes, HETDEX is focused on sound waves from the big bang, called baryonic acoustic oscillations. Although we can’t actually hear sounds in the vacuum of space, astronomers can see the effect of these primordial sound waves in the distribution of matter throughout the universe.

    During the first 400,000-or-so years following the big bang, the universe existed as dense, hot plasma — a particle soup of matter and energy. Tiny disturbances called quantum fluctuations in that plasma set off sound waves, like ripples from a pebble tossed into a pond, which helped matter begin to clump together and form the universe’s initial structure. The result of this clumping is evident in the cosmic microwave background (also called the “afterglow” of the big bang), which is the first light, and the farthest back, that we can see in the universe. And it’s also imprinted in the distribution of galaxies throughout the universe’s history — like the ripples on our pond, frozen into space.

    “The physics of sound waves is pretty well known,” Ciardullo said. “You see how far these things have gone, you know how fast the sound waves have traveled, so you know the distance. You have a standard ruler on the universe, throughout cosmic history.”

    As the universe has expanded so has the ruler, and those variances in the ruler will show how the universe’s rate of expansion, driven by dark energy, has changed over time.

    “Basically,” Jeong said, “we make a three-dimensional map of galaxies and then measure it.”
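The measurement the map enables can be sketched numerically. In a flat matter-plus-dark-energy model, the comoving distance to redshift z is the integral of c/H(z'), and the apparent size of the ~150-megaparsec BAO ruler at that distance is what surveys actually fit. A minimal sketch with illustrative parameters (H0 = 70 km/s/Mpc, a matter fraction of 0.3, and the 150 Mpc scale are common textbook values, not HETDEX results):

```python
import math

C = 2.998e5          # speed of light, km/s
H0 = 70.0            # Hubble constant, km/s/Mpc (illustrative)
OMEGA_M = 0.3        # matter fraction; flat universe => dark energy = 1 - OMEGA_M

def hubble(z):
    """Expansion rate H(z) in km/s/Mpc for a flat matter + dark energy universe."""
    return H0 * math.sqrt(OMEGA_M * (1 + z)**3 + (1 - OMEGA_M))

def comoving_distance(z, steps=10_000):
    """Comoving distance in Mpc: integral of c/H(z') from 0 to z (midpoint rule)."""
    dz = z / steps
    total = 0.0
    for i in range(steps):
        z_mid = (i + 0.5) * dz
        total += C / hubble(z_mid) * dz
    return total

# The BAO "ruler" is ~150 Mpc comoving. Its apparent angular size at redshift z
# pins down the distance, which in turn constrains H(z) and hence dark energy.
bao_scale = 150.0  # Mpc
for z in (0.5, 1.0, 3.0):
    d = comoving_distance(z)
    angle_deg = math.degrees(bao_scale / d)
    print(f"z={z}: distance ~{d:.0f} Mpc, ruler subtends ~{angle_deg:.1f} deg")
```

If dark energy behaved differently, H(z) would change, the computed distances would shift, and the ruler would subtend different angles: that is the signal a million-galaxy map is built to extract.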

    New discovery space

    To make their million-galaxy map, the HETDEX team needed a powerful new instrument.

    A set of more than 150 spectrographs called VIRUS (Visible Integral-Field Replicable Unit Spectrographs), mounted on the Hobby-Eberly Telescope, gathers the light from those galaxies into an array of some 35,000 optical fibers and then splits it into its component wavelengths in an ordered continuum known as a spectrum.

Galaxies’ spectra reveal, among other things, the speed at which they are moving away from us — a measurement known as “redshift.” Due to the Doppler effect, waves from an object moving away from their observer are stretched (think of a siren that gets lower in pitch as it speeds away), while waves from an object moving toward the observer are compressed, like that same siren rising in pitch as it draws nearer. In the case of receding galaxies, their light is stretched and thus shifted toward the red end of the spectrum.

    Measuring this redshift allows the HETDEX team to calculate the distance to those galaxies and produce a precise three-dimensional map of their positions.
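Concretely, redshift is read off by comparing an emission line’s observed wavelength with its laboratory value; at low redshift, Hubble’s law then converts it into a distance. A minimal sketch with illustrative numbers (the example wavelengths and H0 = 70 km/s/Mpc are assumptions, not survey data):

```python
C = 2.998e5   # speed of light, km/s
H0 = 70.0     # Hubble constant, km/s/Mpc (illustrative)

def redshift(lambda_observed, lambda_rest):
    """Redshift z from the stretch of a spectral line: 1 + z = obs / rest."""
    return lambda_observed / lambda_rest - 1.0

# Example: the H-alpha line (rest wavelength 656.3 nm) observed at 676.0 nm
z = redshift(676.0, 656.3)
print(f"z = {z:.3f}")          # ~0.030

# For small z, recession velocity ~ c*z, and Hubble's law gives the distance
velocity = C * z               # km/s
distance = velocity / H0       # Mpc
print(f"v ~ {velocity:.0f} km/s, d ~ {distance:.0f} Mpc")
```

At the redshifts HETDEX targets the simple v = cz shortcut breaks down and the full cosmological distance calculation takes over, but the principle is the same: the spectrum carries the distance.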

    Among the galaxies HETDEX is observing are what are known as Lyman-alpha galaxies — young star-forming galaxies that emit strong spectral lines at specific ultraviolet wavelengths.

    “We’re using Lyman-alpha-emitting galaxies as a ‘tracer particle,’” explained Research Professor of Astronomy and Astrophysics Caryl Gronwall, who is also a founding member of HETDEX. “They’re easy to find because they have a very strong emission line, which is easy to find spectroscopically with the VIRUS instrument. So we have this method that efficiently picks out galaxies at a fairly high redshift, and then we can measure where they are, measure their properties.”
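The Lyman-alpha line has a rest wavelength of about 121.6 nm, in the far ultraviolet; cosmological redshift is what carries it into the visible band where an optical spectrograph can catch it. A minimal sketch of the redshift window this implies, assuming an illustrative optical coverage of roughly 350-550 nm (the actual VIRUS bandpass is an assumption here, not quoted from this article):

```python
LYMAN_ALPHA_REST = 121.567   # nm, rest-frame Lyman-alpha wavelength

def observed_wavelength(z):
    """Where a rest-frame line lands after cosmological redshift."""
    return LYMAN_ALPHA_REST * (1 + z)

def redshift_window(lambda_min, lambda_max):
    """Range of Lyman-alpha redshifts visible in a wavelength window (nm)."""
    z_min = lambda_min / LYMAN_ALPHA_REST - 1
    z_max = lambda_max / LYMAN_ALPHA_REST - 1
    return z_min, z_max

# Assumed optical window of 350-550 nm
z_lo, z_hi = redshift_window(350.0, 550.0)
print(f"Lyman-alpha visible for z ~ {z_lo:.1f} to {z_hi:.1f}")  # roughly 1.9 to 3.5
```

That window corresponds to light emitted when the universe was a fraction of its present age, which is what lets the survey probe dark energy far back in cosmic history.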

    The universe is expanding, and that expansion stretches light traveling through space in a phenomenon known as cosmological redshift. The greater the redshift, the greater the distance the light has traveled. As a result, telescopes with infrared detectors are needed to see light from the first, most distant galaxies.
    IMAGE: National Aeronautics Space Agency (US), European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU), and L. Hustak (NASA Space Telescope Science Institute (US))

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The Pennsylvania State University is a public state-related land-grant research university with campuses and facilities throughout Pennsylvania. Founded in 1855 as the Farmers’ High School of Pennsylvania, Penn State became the state’s only land-grant university in 1863. Today, Penn State is a major research university which conducts teaching, research, and public service. Its instructional mission includes undergraduate, graduate, professional and continuing education offered through resident instruction and online delivery. In addition to its land-grant designation, it also participates in the sea-grant, space-grant, and sun-grant research consortia; it is one of only four such universities (along with Cornell University(US), Oregon State University(US), and University of Hawaiʻi at Mānoa(US)). Its University Park campus, which is the largest and serves as the administrative hub, lies within the Borough of State College and College Township. It has two law schools: Penn State Law, on the school’s University Park campus, and Dickinson Law, in Carlisle. The College of Medicine is in Hershey. Penn State is one university that is geographically distributed throughout Pennsylvania. There are 19 commonwealth campuses and 5 special mission campuses located across the state. The University Park campus has been labeled one of the “Public Ivies,” a publicly funded university considered as providing a quality of education comparable to those of the Ivy League.
    Annual enrollment at the University Park campus totals more than 46,800 graduate and undergraduate students, making it one of the largest universities in the United States. It has the world’s largest dues-paying alumni association. The university offers more than 160 majors among all its campuses.

    Annually, the university hosts the Penn State IFC/Panhellenic Dance Marathon (THON), which is the world’s largest student-run philanthropy. This event is held at the Bryce Jordan Center on the University Park campus. The university’s athletics teams compete in Division I of the NCAA and are collectively known as the Penn State Nittany Lions, competing in the Big Ten Conference for most sports. Penn State students, alumni, faculty and coaches have received a total of 54 Olympic medals.

    Early years

The school was sponsored by the Pennsylvania State Agricultural Society and founded as a degree-granting institution on February 22, 1855, by Pennsylvania’s state legislature as the Farmers’ High School of Pennsylvania. The use of “college” or “university” was avoided because of local prejudice against such institutions as being impractical in their courses of study. Centre County, Pennsylvania, became the home of the new school when James Irvin of Bellefonte, Pennsylvania, donated 200 acres (0.8 km²) of land – the first of 10,101 acres (41 km²) the school would eventually acquire. In 1862, the school’s name was changed to the Agricultural College of Pennsylvania, and with the passage of the Morrill Land-Grant Acts, Pennsylvania selected the school in 1863 to be the state’s sole land-grant college. The school’s name changed to the Pennsylvania State College in 1874; enrollment fell to 64 undergraduates the following year as the school tried to balance purely agricultural studies with a more classic education.

    George W. Atherton became president of the school in 1882, and broadened the curriculum. Shortly after he introduced engineering studies, Penn State became one of the ten largest engineering schools in the nation. Atherton also expanded the liberal arts and agriculture programs, for which the school began receiving regular appropriations from the state in 1887. A major road in State College has been named in Atherton’s honor. Additionally, Penn State’s Atherton Hall, a well-furnished and centrally located residence hall, is named not after George Atherton himself, but after his wife, Frances Washburn Atherton. His grave is in front of Schwab Auditorium near Old Main, marked by an engraved marble block in front of his statue.

    Early 20th century

    In the years that followed, Penn State grew significantly, becoming the state’s largest grantor of baccalaureate degrees and reaching an enrollment of 5,000 in 1936. Around that time, a system of commonwealth campuses was started by President Ralph Dorn Hetzel to provide an alternative for Depression-era students who were economically unable to leave home to attend college.

    In 1953, President Milton S. Eisenhower, brother of then-U.S. President Dwight D. Eisenhower, sought and won permission to elevate the school to university status as The Pennsylvania State University. Under his successor Eric A. Walker (1956–1970), the university acquired hundreds of acres of surrounding land, and enrollment nearly tripled. In addition, in 1967, the Penn State Milton S. Hershey Medical Center, a college of medicine and hospital, was established in Hershey with a $50 million gift from the Hershey Trust Company.

    Modern era

    In the 1970s, the university became a state-related institution. As such, it now belongs to the Commonwealth System of Higher Education. In 1975, the lyrics in Penn State’s alma mater song were revised to be gender-neutral in honor of International Women’s Year; the revised lyrics were taken from the posthumously-published autobiography of the writer of the original lyrics, Fred Lewis Pattee, and Professor Patricia Farrell acted as a spokesperson for those who wanted the change.

    In 1989, the Pennsylvania College of Technology in Williamsport joined ranks with the university, and in 2000, so did the Dickinson School of Law. The university is now the largest in Pennsylvania. To offset the lack of funding due to the limited growth in state appropriations to Penn State, the university has concentrated its efforts on philanthropy.

     
  • richardmitnick 11:49 pm on March 3, 2021 Permalink | Reply
Tags: "Will this solve the mystery of the expansion of the universe?", , , , , Dark Energy, , From the science paper: "We find the mean value of the present Hubble parameter in the NEDE model to be H0 = 71.4 ± 1.0 km s⁻¹ Mpc⁻¹ (68% C.L.).", , Proposed "New early dark energy (NEDE)", South Danish University [Syddansk Universitet](DK)

    From South Danish University [Syddansk Universitet](DK): “Will this solve the mystery of the expansion of the universe?” 

    From South Danish University [Syddansk Universitet](DK)

    Physicists’ new proposal that a new type of extra dark energy is involved is highlighted in scientific journal.

    3/2/2021
    Birgitte Svennevig


The universe began with a giant bang, the Big Bang, 13.8 billion years ago, and it has been expanding ever since: it is still being stretched out in all directions, like a balloon being inflated.

    Physicists agree on this much, but something is wrong. Measuring the expansion rate of the universe in different ways leads to different results.

    So, is something wrong with the methods of measurement? Or is something going on in the universe that physicists have not yet discovered and therefore have not taken into account?

It could very well be the latter, according to several physicists, among them Martin S. Sloth, Professor of Cosmology at SDU.

In a new scientific article, he and his SDU colleague, postdoc Florian Niedermann, propose the existence of a new type of dark energy in the universe. If you include it in the various calculations of the expansion of the universe, the results will be more alike.

    – “A new type of dark energy can solve the problem of the conflicting calculations” says Martin S. Sloth.

    Conflicting measurements

    When physicists calculate the expansion rate of the universe, they base the calculation on the assumption that the universe is made up of dark energy, dark matter and ordinary matter. Until recently, all types of observations fitted in with such a model of the universe’s composition of matter and energy, but this is no longer the case.

    Conflicting results arise when looking at the latest data from measurements of supernovae and the cosmic microwave background radiation; the two methods quite simply lead to different results for the expansion rate.

    – “In our model, we find that if there was a new type of extra dark energy in the early universe, it would explain both the background radiation and the supernova measurements simultaneously and without contradiction” says Martin S. Sloth.

    From one phase to another

– “We believe that in the early universe, dark energy existed in a different phase. You can compare it to when water is cooled and it undergoes a phase transition to ice, which has a lower density,” he explains, and continues:

    – “In the same way, dark energy in our model undergoes a transition to a new phase with a lower energy density, thereby changing the effect of the dark energy on the expansion of the universe”.

    According to Sloth and Niedermann’s calculations, the results add up if you imagine that dark energy thus underwent a phase transition triggered by the expansion of the universe.

    A very violent process

    – “It is a phase transition where many bubbles of the new phase suddenly appear, and when these bubbles expand and collide, the phase transition is complete. On a cosmic scale, it is a very violent quantum mechanical process” explains Martin S. Sloth.

Today we understand only about 20 per cent of the matter that the universe is made of: the ordinary matter that you and I, planets and galaxies consist of. The rest of the matter in the universe is dark matter, whose nature no one knows.

In addition, there is dark energy in the universe; it is the energy that causes the universe to expand, and it makes up approximately 70 per cent of the energy density of the universe.

    Science paper:
    New early dark energy
    Physical Review D

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The South Danish University [Syddansk Universitet] is a university in Denmark that has campuses located in Southern Denmark and on Zealand.

    The university offers a number of joint programmes in co-operation with the Europe University of Flensburg [Universität Flensburg](DE) and the Christian-Albrecht University of Kiel [Christian-Albrechts-Universität zu Kiel](DE). Contacts with regional industries and the international scientific community are strong.

    With its 29,674 enrolled students (as of 2016), the university is both the third-largest and, given its roots in Odense University, the third-oldest Danish university (fourth if one includes the Technical University of Denmark). Since the introduction of the ranking systems in 2012, the South Danish University has consistently been ranked as one of the top 50 young universities in the world by both the Times Higher Education World University Rankings of the Top 100 Universities Under 50 and the QS World University Rankings of the Top 50 Universities Under 50.

    The South Danish University was established in 1998 when Odense University, the Southern Denmark School of Business and Engineering and the South Jutland University Centre were merged. The University Library of Southern Denmark was also merged with the university in 1998. As the original Odense University was established in 1966, the South Danish University celebrated their 50-year anniversary on September 15, 2016.

    In 2006, the Odense University College of Engineering was merged into the university and renamed as the Faculty of Engineering. After being located in different parts of Odense for several years, a brand new Faculty of Engineering building physically connected to the main Odense Campus was established and opened in 2015. In 2007, the Business School Centre in Slagelse (Handelshøjskolecentret Slagelse) and the National Institute of Public Health (Statens Institut for Folkesundhed) were also merged into the South Danish University.

     