Tagged: “Science News (US)”

  • richardmitnick 11:56 am on March 10, 2022 Permalink | Reply
    Tags: "Science News (US)", "The mysterious Hiawatha crater in Greenland is 58 million years old", The crater was spotted in 2015 during a scan by NASA’s Operation IceBridge., The powerful impact that created a mysterious crater at the northwestern edge of Greenland’s ice sheet happened about 58 million years ago.

    From Science News: “The mysterious Hiawatha crater in Greenland is 58 million years old” 

    From Science News

    Carolyn Gramling

    Pebbles at the edge of Greenland’s ice sheet, shown here in 2019, contain zircon crystals that were altered by an impact about 58 million years ago. Credit: Pierre Beck.

    The powerful impact that created a mysterious crater at the northwestern edge of Greenland’s ice sheet happened about 58 million years ago, researchers report March 9 in Science Advances.

    That timing, confirmed by two separate dating methods, means that the asteroid or comet that carved the depression struck long before the Younger Dryas cold snap about 13,000 years ago. Some researchers have suggested the cold spell was caused by such an impact.

    Scientists spotted the crater in 2015 during a scan by NASA’s Operation IceBridge, which used airborne radar to measure the ice sheet’s thickness. Those and other data revealed that the crater, dubbed Hiawatha, is a round depression that spans 31 kilometers and is buried beneath a kilometer of ice (SN: 11/14/18).

    The next step was to determine how old the Hiawatha crater might be. Though the depression itself is unreachable, meltwater at the ice’s base had carried pebbles and other sediments out from beneath the ice, bearing telltale signs of alteration by an impact, including sand from partially melted rocks and pebbles containing intensely deformed, or “shocked,” zircon crystals.

    Pebbles near the Hiawatha impact crater in northwestern Greenland contain grains of zircon (one at left) that contain many tiny crystals, some altered by the impact (right). These zircon crystals act as tiny time capsules, helping researchers estimate when the impact occurred. Credit: G. Kenny.

    Geochemist Gavin Kenny of the Swedish Museum of Natural History in Stockholm and colleagues dated these alterations using two methods based on the radioactive decay of isotopes, or different forms of elements. For the zircons, the team measured the decay of uranium to lead, and in the sand, the researchers compared the abundances of radioactive argon isotopes with stable ones. Both methods suggest that the impact occurred about 57.99 million years ago.
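    The article doesn't show the arithmetic, but both methods rest on the standard radiometric age equation, t = ln(1 + D/P)/λ, where D/P is the measured daughter-to-parent isotope ratio and λ is the decay constant. A minimal sketch for the uranium-lead case (illustrative values only; the real analysis uses the full U-Pb concordia treatment):

```python
import math

# Half-life of uranium-238, which decays (through a chain) to lead-206.
U238_HALF_LIFE_YR = 4.468e9
LAMBDA = math.log(2) / U238_HALF_LIFE_YR  # decay constant, per year

def age_from_ratio(daughter_per_parent):
    """Age implied by a measured Pb-206/U-238 ratio, assuming the
    crystal started with no lead and stayed a closed system."""
    return math.log(1.0 + daughter_per_parent) / LAMBDA

# A zircon reset about 58 million years ago would show a ratio of roughly:
ratio = math.exp(LAMBDA * 58e6) - 1.0
print(f"Pb/U ratio: {ratio:.5f}")                     # ~0.00904
print(f"age: {age_from_ratio(ratio) / 1e6:.2f} Myr")  # 58.00
```

    The argon method works the same way in principle, just with a different parent-daughter pair and decay constant.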

    That makes the crater far too old to be the smoking gun long sought by proponents of the controversial Younger Dryas impact hypothesis (SN: 6/26/18). The timing also isn’t quite right to link it to a warm period called the Paleocene-Eocene Thermal Maximum, which began around 56 million years ago (SN: 9/28/16). For now, the researchers say, what impact this space punch may have had on Earth’s global climate remains a mystery.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 2:04 pm on February 10, 2022 Permalink | Reply
    Tags: "How the Human Genome Project revolutionized understanding of our DNA", "Science News (US)"

    From Science News (US): “How the Human Genome Project revolutionized understanding of our DNA” 

    From Science News (US)

    February 9, 2022
    Tina Hesman Saey

    A century of ingenuity and technology advances taught us to read the stories in our genes.

    The iconic double helix, the structure of our DNA, has graced the cover of Science News many times.
    Credit: Jeremy Leung.

    In October 1990, biologists officially embarked on one of the century’s most ambitious scientific efforts: reading the 3 billion pairs of genetic subunits — the A’s, T’s, C’s and G’s — that make up the human instruction book.

    The project promised to blow open our understanding of basic biology, reveal relationships between the myriad forms of life on the planet and transform medicine through insights into genetic diseases and potential cures. When a working draft was announced in 2000, President Bill Clinton called it a “stunning and humbling achievement” and predicted it would “revolutionize the diagnosis, prevention and treatment of most, if not all, human diseases.” By 2003, when the project was declared complete, the scientists had read essentially every letter.

    Even dreaming up such an endeavor depended on decades of previous discoveries. In 1905, English biologist William Bateson, who championed the work of Austrian monk Gregor Mendel, suggested the term “genetics” for a new field of study focused on heredity and variation. Early the next decade, American biologist Thomas Hunt Morgan and his colleagues showed that genes are carried on chromosomes. Biochemists had been studying DNA for nearly three-quarters of a century when Oswald Avery and his team at The Rockefeller Institute (US) in New York City helped establish in the 1940s that DNA is the genetic material. And perhaps most notable, and famous today, is the 1953 discovery of the double-helix structure of DNA, by James Watson and Francis Crick of The University of Cambridge (UK) and Rosalind Franklin and Maurice Wilkins of King’s College London (UK).

    But when the draft of the genetic instruction book was first published, independently by an international collective of academic and government labs called The Human Genome Project and by the private company Celera Genomics, led by J. Craig Venter, the text was “as striking for what we don’t see as for what we do,” Science News reported (SN: 2/17/01, p. 100). There were far fewer genes than expected, leaving a puzzle about what all the remaining DNA was for.

    In the decades since, scientists have filled in some of that puzzle — identifying a host of genes, for example, that don’t make proteins but are still essential in the body. Other researchers have searched the instruction book to find new treatments for diseases and to figure out how we’re all related — not just people, but all life on planet Earth, past and present.

    To explore how far our understanding of our DNA has come, Science News senior writer and molecular biology reporter Tina Hesman Saey talked with Eric Green, director of The National Human Genome Research Institute [NHGRI](US) at The National Institutes of Health (US) in Bethesda, Md. Green got his start in genomics in the lab of Maynard Olson at The Washington University in St. Louis (US), a pioneer in the field. At the same time, Saey was a graduate student in molecular genetics, working down the hall. She remembers as an undergraduate student sequencing the genes of bacteria 50 to 100 chemical subunits, or bases, at a time. “My mind was completely blown by the idea that you could put together 3 billion bases.” The conversation that follows, which has been edited for length and clarity, looks back on the project and ahead to all that’s left to learn. — Elizabeth Quill

    Ambitious beginnings

    Saey: My first memory of the Human Genome Project was when I was an undergraduate student at The University of Nebraska-Lincoln (US), and I remember Walter Gilbert, who is a Nobel Prize winner, coming and talking about the project. He proposed this really audacious idea of sequencing 3 billion pairs of bases in the human genome — all of our DNA. After Gilbert’s talk, I walked back to the lab with a couple of professors, and they were saying, “This can never happen. It’s going to cost way too much money. There’s just no way we can do this.” So how did you pull it off?

    Green: By the time the genome project started in October of 1990, I was working in a cutting-edge genomics lab at Washington University. We were one of the first funded groups to participate in the Human Genome Project. We had some ideas on how to start, and we had really no idea how we were going to pull it off.

    It was the overwhelmingly compelling vision for why this was so important that galvanized enthusiasm among not only a group of scientists like myself, but also the funding agencies, the governments, the private funders from around the world, who said, “This seems unimaginable, like putting a person on the moon, but it seems so important. We’ll figure it out.” So it was one of these circumstances where you just get the right people in the right place, get them resourced, get them organized, be willing to do things differently, and then figure it out as you go.

    Saey: I got to witness this because I was a graduate student at Washington University, in a lab sequencing the yeast genome. Robert Waterston’s lab, which received one of the first grants from the Human Genome Project, was right across the hall. They started with C. elegans, the roundworm genome. I remember they were starting very methodically, mapping out the genes and then sequencing each piece, marching along. But then, toward the end of the ’90s, there was this shotgun sequencing revolution spearheaded by kind of a controversial figure, Craig Venter. You just shred the genome, throw it all in a sequencing machine and then put it together in the computer. Did that help a lot?

    Green: There’s no question it sped things up. What Craig successfully did was to determine that there were approaches that could be used where you didn’t have to do piecemeal sequencing. The important nuance to point out is the only way you’re able to put [the pieces] back together then was by having many mapping elements that allow you to hang pieces together and organize them. It’s not like it all zipped together 3 billion letters. A lot of the meticulous mapping that had been done, painstaking mapping, helped provide organizing guideposts.

    The press covered it as a race, and the press covered it as option A versus option B. And the truth resided somewhere in between. What was driving the change, of course, was technology advances. If you chart the time since the end of the Human Genome Project, it’s the same phenomenon. Every single time there’s a technology surge, you find yourself doing things completely different than the way you used to.
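    Saey's description of shotgun sequencing, shred the genome and reassemble the reads in a computer, can be illustrated with a toy greedy assembler. This is a deliberate simplification, not Celera's actual algorithm: real assemblers must cope with sequencing errors and repeated sequences, which is exactly why the mapping guideposts Green mentions mattered.

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that matches a prefix of b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(reads):
    """Repeatedly merge the pair of reads with the largest overlap."""
    reads = list(reads)
    while len(reads) > 1:
        best = (0, None, None)
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    n = overlap(a, b)
                    if n > best[0]:
                        best = (n, i, j)
        n, i, j = best
        if n == 0:
            break  # no overlaps left; contigs stay separate
        merged = reads[i] + reads[j][n:]
        reads = [r for k, r in enumerate(reads) if k not in (i, j)]
        reads.append(merged)
    return reads

# Shredded, overlapping copies of a short made-up "genome":
reads = ["GATTACAGAT", "ACAGATTTCG", "TTTCGCATGC"]
print(greedy_assemble(reads))  # ['GATTACAGATTTCGCATGC']
```

    A repeated sequence longer than a read defeats this greedy strategy, because two different genomic locations produce indistinguishable overlaps; the painstaking physical maps resolved that ambiguity.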

    Saey: Technology has come a very long way from what I was doing. You can sequence thousands of bases at a time now.

    Green: The other part of the story that sometimes doesn’t get told: It’s not even just the laboratory bench–based technologies. It’s also the computational technologies. Some people don’t realize that when the Human Genome Project started, there was not really a widely functional internet. I was just barely starting to use e-mail.

    So here it was, we were one of the first funded groups for the Human Genome Project. We were considered state of the art. We were collaborating with an outside group generating some sequences, and the only practical way for my collaborator to get me the 300 to 400 bases of sequence was to handwrite it on a piece of paper and fax it to me. And I would analyze it by eye. It’s just remarkable that that was where we were when the project started.

    Eric Green (right) and his mentor, genomics pioneer Maynard Olson, were key players in the Human Genome Project. Below, the two review data to develop genome-mapping strategies slightly before the 1990 start of the project. Courtesy of NHGRI.

    Garbage to gold mine

    Saey: In 2000 was the big press conference to announce the rough draft of the human genome. I was just starting my journalism career at The St. Louis Post-Dispatch, and reported on this. At that time, it was a big revelation that there were these big deserts in between genes, and that we didn’t have nearly as many genes as we thought we were going to. Humans are such complex organisms, how could we not have many more genes than a fruit fly, or a worm? That just didn’t make sense.

    But now, I think, we are getting a better understanding, largely because of the way we can analyze the genome. Can you talk about how that evolution in thinking has progressed?

    Green: Before the genome project started, some [people] were quite critical, and really said it was a bad idea. Some argued that it was a waste of time to sequence the genome end to end; we should just focus and sequence the genes, as if all of humans’ biological richness was going to reside in the genes. Thank goodness we didn’t listen to those critics. Because if we would have done the shortcut and only focused on the genes, we would have only skimmed the biological complexity of humans.

    What we’ve come to learn is that while only 1.5 percent of the letters of the human genome directly encode for what are classically known as protein-coding genes — DNA that gets made into RNA, which gets made into protein — there’s a much larger fraction of the human genome that is biologically important and evolutionarily conserved. It’s widened our definition of a gene, because we now know that sometimes DNA may make RNA, and RNA may go off and do all sorts of biological things.

    Then there’s a whole set of sequences that are far more plentiful than gene sequences, that are really doing all the choreography in our genomes in terms of determining when, where and how much genes get turned on, in what cells and what tissues, at what developmental stages, under what conditions, and so on and so forth.

    It pushed us to think about all the other biological functions in DNA outside the genes. And as you accurately point out, we don’t really have a rulebook for that. And thank goodness the computer technology is helping us because the human eye would just fail miserably at figuring this out. And so as much as anything, computational biology, bioinformatics, data science are the dominant research tools to help bring clarity as to how noncoding sequences in the human genome function. And how they do that in a very carefully crafted choreography with the genes.

    Saey: Well, I’m glad you brought up those sequences, because those are some of my favorites. I’m a huge fan of noncoding RNAs [the RNAs that don’t go on to make proteins]. There are so many of them, and such a huge variety of them. And they work in so many important ways (SN: 4/13/19, p. 22).

    I don’t think that 20 years ago we could have conceived that RNAs that didn’t make proteins would actually be important for something. The genes those RNAs were copied from were considered broken genes or pseudogenes. They were junk.

    Green: Or sloppy transcription; that our enzymes are just going off and making a bunch of RNA because they don’t know how to control themselves. But, no. And I like your point about 20 years ago, we couldn’t imagine. I would propose that 20 years from now, we might look back at this conversation and say, “Oh, my goodness, think about all these other ways that the genome functions.” There’s no reason to think we have our hands around it all in terms of all the biological complexity of DNA; I’m quite sure we don’t.

    Saey: And even when you find a protein-coding gene, you’re not just making one protein. You’re making, on average, seven or eight different versions of this protein from the same gene. After RNA gets copied from DNA, you can mix and match the little parts of a gene to make completely new proteins. And then you can tack on all of these other little chemical groups to change the way things work.

    Green: When I was getting my Ph.D. at Washington University in the 1980s, I didn’t work on DNA, I didn’t work on molecular biology, I didn’t work on RNA. I was working on a set of proteins, studying how they had sugar molecules added to them after they were made, and how, depending upon what tissue they were made in, they got different structures of sugar molecules attached. So just as you point out, you start off with one gene, and you can end up with multiple RNAs that lead to multiple different proteins. And each of those proteins could have different modifications depending on what tissue, what conditions, what development stage, et cetera. This is the incredible amplification of complexity. It’s not in our gene number. We have a long way to go to fully understanding all this.

    Saey: Another thing that really surprises people is how much of our genome is made of extinct viruses and transposons — transposons being these jumping genes that still hop around in our genome. Those transposons can occasionally cause problems, but we also got a lot of innovations from them, including the human placenta, and maybe some things about the way our brain works. So, we’re not even completely human. If you want to view it that way, we’re a lot virus.

    Green: Right. We’re a lot virus. We’re also not all Homo sapiens. Many, many people carry Neandertal bits from a time when Neandertals and Homo sapiens coexisted, and actually interbred (SN: 5/8/21 & 5/22/21, p. 7). But not everybody in the world has that, which is also interesting. One of the aspects of genomics is that it not only has taught us and given us the biological instruction book, it’s also given us a fascinating record of evolution. We can use it to learn lots of things about our evolution, about human migrations, about aspects of humans on this globe.

    Focus on diversity

    Saey: Most people who are interacting with DNA and with the human genome these days do it through ancestry testing and consumer DNA testing. So you can identify the part of the world that people’s DNA came from. And that gets into a lot of discussion about race, and whether race has a biological basis, and what that might mean for medicine.

    There’s been a lot of criticism lately of genetics and genomics, because it’s based a lot on the DNA of people of European ancestry — white people like you and me. But there’s a huge amount of genetic diversity in the world among humans, and especially in Africa, where humans got started. So what are we doing about getting a handle on the vast array of diversity that humans have?

    Green: There’s no question that the successes in genomics that we’ve been discussing are worth talking about and worth showcasing. At the same time, as a field, we have not been perfect. One of the things that we just have to admit that we’ve really not been as successful on is making sure we’ve captured enough of the diversity of the human population with respect to the samples that we’ve used for doing genetic and genomic studies. We have got to fix this problem. It’s a very high priority.

    I really want to emphasize, it’s not even just that it’s the socially right thing to do, that everybody should have information about their genomes. This is very important medically. If the only populations we have a lot of genomic data on are people of European descent, we limit our ability to move genomic analyses and eventually genomic medicine into populations that are not of European descent. And so there’s a high priority through a number of efforts around the world, including in the U.S., to work hard to capture much more diversity of the world’s populations in all studies that we do.

    Saey: There’s been a lot of talk about racialized medicine, where you might have a person come in who is African American, and then you would say, “Oh, well, we should consider this to be the genome that we look at.” Is that a good approach to take? Or do you think it should be broader somehow?

    Green: The truth is, of course, there are certain diseases that tend to cluster in certain populations of common ancestry. And many times those are represented by racial groups.

    But racial grouping is really a social construct that has numerous imperfections. And so on the one hand, you can’t totally ignore some correlations that exist with certain diseases or certain responses to medications in certain groups. But it’s a very blunt tool to use. And we could do better. The way we could do this better is to track much more accurately to specific genomic features, as opposed to certain racial characteristics. So I think what we really want to pay attention to, and we will be doing this increasingly, is thinking about better ways of grouping and stratifying individuals and populations.

    Saey: I wanted to touch too on what we mean when we say genetic diversity. For the most part I think people are familiar with what scientists call SNPs, single nucleotide polymorphisms, and what other people might refer to as mutations. But there are lots of other ways that you can have diversity in the genome: You can be missing entire genes or entire chunks of chromosomes or you can have duplications of certain genes. Are we now able to look at that type of diversity as well? And do we know if that’s important?

    Green: There’s no question that all forms of genomic diversity — genomic variation is probably the word I would use — are not only biologically relevant, they’re proving to be medically relevant. Now, we don’t have a complete inventory of which ones are more relevant than others. But we already know of many examples where medically relevant variations in our genome can be a single letter, a string of letters, it could mean having extra letters or extra segments, or missing segments. It could be a rearrangement of segments. Every one of those [types of variations] is already known to be important in human disease, and eventually will be important for diagnostic medicine and the implementation of genomic medicine.

    Saey: Do you envision a time when we will be able to study and interpret these bigger changes?

    Green: I absolutely envision a time where people will get their complete genome sequenced end to end as part of their medical care, and maybe even at birth. I don’t think we’re there yet. But I truly believe that we will want that information as part of medical management. And I fully believe that technologies will become available and will be inexpensive enough to make it worthwhile. But those predictions are going to have to be based on evidence that indeed that’s feasible and valuable.

    What’s next?

    Saey: So where do we go from here? What does the National Human Genome Research Institute do now that researchers have generated end-to-end sequences of every human chromosome?

    Green: We recently finished a two-and-a-half-year strategic planning process to ask that very question for this coming decade. It was actually an overwhelming exercise because there were so many good ideas. We published these in Nature — our 2020 strategic vision. Some of it [is] applications of genomics to medicine. Of course, everybody’s going to be excited about that. But there are many other forefronts of genomics that are just as exciting.

    We still don’t have the perfect technologies that we can deploy anywhere in the world in any health setting, any medical study, that will get us end-to-end sequencing. We need better and cheaper technologies for letting us read human genome sequences inexpensively in clinical settings. We need complete end-to-end interpretation of every base of the human genome. We need to know not just about the genes, we need to know about all these noncoding regions. We need to understand every human variant that we can find in the world population. And we need to know: Is that variant biologically silent? Is it biologically relevant? Is it medically relevant? If it’s medically relevant, what’s the action that should be taken? That starts to point us to understanding the genomic basis of disease and also to think about how can we use information about genomic variation in the practice of medicine.

    Also, we will continue to think about the implications of these genomic advances to society. How are we going to make sure people understand this? How are we going to make sure things are applied equitably? How are we going to make sure it doesn’t exacerbate inequities in our society? How are we going to deal with a whole host of issues related to privacy?

    Saey: I’m glad that you brought up equity and privacy, because those are some of the things that people are most concerned about right now. There are a lot of historically marginalized people who don’t want any part of genetic research because of the way their groups have been treated in the past. There’s been this history of colonialism. These groups say, if we’re going to do genetics on our people, then it should be our people doing it for us. What is NHGRI doing to build capacity in these communities so that they can do their own research and, maybe, if they decide they want to, share that with other people?

    Green: I completely agree with the notion that if genomics is going to be a successful field, especially as we move this into medicine, we have got to make sure that we engage people from all different communities, all definitions of diversity, and make sure they benefit from it. We absolutely emphasize this point repeatedly in our 2020 strategic vision, so much so that the very first thing we did in 2021 was to release what we call an action agenda for enhancing the diversity of the genomics workforce.

    Another experience we’ve had at NIH that I think is very illustrative of this: We recognized that we wanted African scientists to get more involved in doing genomics. And through a program called H3Africa, the Human Heredity and Health in Africa program, that the NIH and the Wellcome Trust funded, the philosophical mantra is to empower African scientists to do all the studies and build capacity there. It’s been a success by almost any metric. But it’s exactly what you said: We want them to do the studies, we want them to engage with their local communities. We’ll never build the trust if we just come in and say, “We’re going to do all of this.”

    Saey: In terms of privacy, you’ve said a couple of times that you could have somebody’s genome completely sequenced, and then their doctor can use it. But don’t we get into a situation that could be like the movie Gattaca? Some people could be discriminated against if they don’t have their genetic flaws fixed? Are you somehow creating a class of lesser people and more perfect people who don’t have the genetic flaws that everybody else has?

    Green: You just laid out several major ethical dilemmas, and they’re all valid, and we could spend hours talking about each of them. What I would say about our field is, we’ve recognized that everything we are doing is a two-edged sword. On the one edge of that sword are these incredible opportunities for improving the practice of medicine. On the other edge of that sword, as with many technologies, it could be used in ways that would be societally unacceptable. It’s a reason why the field has from the beginning always embraced and invested in ethical, legal and social implications research, or ELSI research, which has attempted to anticipate these concerns and try to provide an evidence base to build policies, and in some cases, laws.

    We do have in the United States a major act called the Genetic Information Nondiscrimination Act, which offers some protection against genetic discrimination. We have laws and policies that protect people’s medical information.

    We should recognize that genomics is just part of a bigger set of societal issues, as more and more intimate information about us is electronically available. Trust me, we can learn a lot about you if we just reviewed your Visa card purchases. We as a society have to recognize that, yes, genomic information has some unique attributes, but it’s not totally exceptional. We need to be part of a broader framework for protecting people so that we can benefit from these incredible opportunities.

    We just need to make sure we don’t get too far out over our skis. Just because we can do something, doesn’t mean we should. We need to think about all the consequences. We should be constantly understanding what will society tolerate, what do people not want. We have some things that are going to be completely unacceptable, like doing genetic editing in unborn children. At this stage, we simply don’t think that’s a smart thing to do, we’re not ready to do it, the scientific community has condemned doing it (SN: 12/22/18 & 1/5/19, p. 20).

    Saey: I do want to circle back, because when we were talking about these noncoding sequences, a lot of them help control how genes are used. That may not be so obvious if you just get this string of somebody’s DNA letters. Can you tell from that how those genes will be used? And how those things will be put together? Or is that something you cannot tell by looking at DNA?

    Green: There’s no question that sometimes when you talk about genomics, and you talk about genetics, and you focus on the genes — you sometimes see the tree and you lose track of the forest. The forest is medical complexity and biological complexity. And for most things about ourselves, how tall we are, what we look like, and common diseases — hypertension, diabetes, Alzheimer’s, autism, et cetera — things are much more complicated than looking even for one gene. It’s multiple genes. And it’s almost always a greater choreography with our lifestyle, and our social experiences, and our exposures and everything from diet to exercise. There’s a lot more to health and disease than just our genes.

    The grand challenge in many ways for the coming decade or two is doing these very large-scale studies where we have as much data as possible, not just genomic data, but lifestyle data and electronic health record data, and environmental data and physiological data. There are absolutely going to be patterns. And we’ve just got to find those patterns.

    Saey: We’re almost out of time. It’s been wonderful talking with you. Did we miss anything?

    Green: We missed all sorts of wonderful things, but you can only spend so much time walking down memory lane.

    What I would say in closing are two things people need to remember: First of all, how incredibly exciting this field is, and how incredibly eager we are to build our tent with more and more people from all different disciplines. And we also want people of all different populations and ancestral groups from all parts of the world. It’s going to be so important to do that.

    The reason we want all these people involved is, we just touched on so many things that we still don’t understand. We need creativity. And we don’t have a playbook. Just like those days when we were bewildered about how we were going to get the Human Genome Project really done, I don’t really know how we’re going to get complete end-to-end understanding of the human genome. But I know if we get creative people working on it, we’ll make incredible progress.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 12:34 pm on February 10, 2022 Permalink | Reply
    Tags: "Science News (US)", "Weird ‘superionic’ matter could make up Earth’s inner core"

    From The Australian National University (AU) via Science News (US): “Weird ‘superionic’ matter could make up Earth’s inner core” 


    From The Australian National University (AU)


    Science News (US)

    February 9, 2022
    Emily Conover


    Earth’s core consists of a liquid outer core (yellow) surrounding an inner core (brighter yellow sphere). New computer simulations suggest that, instead of being a normal solid, the inner core may be superionic, a state of matter that has properties of both a solid and liquid. Credit: TUMEGGY/SCIENCE PHOTO LIBRARY/getty images plus.

    A quirky material that behaves like a mishmash of liquid and solid could be hidden deep in the Earth.

    Computer simulations described in two studies suggest that the material in Earth’s inner core, which includes iron and other, lighter elements, may be in a “superionic” state. That means that while the iron stays put, as in a solid, the lighter elements flow like a liquid.

    The research gives a potential peek at the inner workings of an enigmatic, inaccessible realm of the planet. According to conventional scientific wisdom, Earth’s core consists of a liquid outer core surrounding a solid inner core (SN: 1/28/19). But beyond knowing that the inner core is rich in iron, scientists don’t know exactly which other elements are present, and in what quantities.

    “The inner core is very difficult to scrutinize simply because it’s so deep beneath our feet,” says geophysicist Hrvoje Tkalčić of Australian National University in Canberra.

    Seismic waves stirred up by earthquakes can plow through the inner core, providing clues to what’s inside. But measurements of these waves have left researchers puzzled. The velocity of one type of wave, called a shear wave, is lower than expected for solid iron or for many types of iron alloys — mixtures of iron with other materials. “That is a mystery about the inner core,” says geophysicist Yu He of The Chinese Academy of Sciences [中国科学院](CN).

    In one new study, He and colleagues simulated a group of 64 iron atoms, along with various types of lighter elements — hydrogen, carbon and oxygen — under pressures and temperatures expected for the inner core. In a normal solid, atoms arrange themselves in an orderly grid, holding fast to their positions. In a superionic material, some of the atoms arrange neatly, as in a solid, while others are liquid-like free spirits that slip right through the solid lattice. In the simulation, the researchers found, the lighter elements moved about while the iron stayed in place.
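    In simulations like these, "stays in place" versus "moves about" is commonly diagnosed by each species' mean squared displacement: a lattice-bound atom's displacement stays bounded, while a diffusing atom's grows with time. The sketch below illustrates that diagnostic on synthetic trajectories; it is not the authors' code, and the trajectories, threshold and species labels are all invented:

    ```python
    import random

    def msd(traj):
        """Mean squared displacement of one particle from its starting point."""
        x0, y0, z0 = traj[0]
        return sum((x - x0) ** 2 + (y - y0) ** 2 + (z - z0) ** 2
                   for x, y, z in traj) / len(traj)

    def classify(trajectories, threshold=0.5):
        """Label each species 'solid-like' (small MSD) or 'liquid-like' (large MSD)."""
        result = {}
        for species, trajs in trajectories.items():
            mean_msd = sum(msd(t) for t in trajs) / len(trajs)
            result[species] = "liquid-like" if mean_msd > threshold else "solid-like"
        return result

    random.seed(0)

    def vibrate(n, amp=0.05):
        # A solid-like atom rattles around its lattice site: displacements stay bounded.
        return [(random.gauss(0, amp), random.gauss(0, amp), random.gauss(0, amp))
                for _ in range(n)]

    def diffuse(n, step=0.3):
        # A liquid-like atom random-walks: its displacement grows with time.
        x = y = z = 0.0
        traj = []
        for _ in range(n):
            traj.append((x, y, z))
            x += random.gauss(0, step)
            y += random.gauss(0, step)
            z += random.gauss(0, step)
        return traj

    trajs = {"Fe": [vibrate(200) for _ in range(10)],
             "H": [diffuse(200) for _ in range(10)]}
    print(classify(trajs))  # → {'Fe': 'solid-like', 'H': 'liquid-like'}
    ```

    In a real study the trajectories would come from a quantum molecular dynamics run at inner-core pressures, but the bounded-versus-growing displacement signature is the same.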

    That superionic status slowed shear waves, the researchers report February 9 in Nature, suggesting the weird phase of matter could explain the unexpected shear wave velocity measured in the inner core.

    Shear waves, also known as secondary or S waves, jiggle the Earth perpendicular to their direction of travel, like the undulations that move along a jump rope that’s wiggled up and down (SNS: 1/12/18). Other waves, called primary or P waves, compress and expand the Earth in a direction parallel to their travel, like an accordion being squeezed.
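    The two wave speeds follow from standard elasticity: a P wave travels at sqrt((K + 4μ/3)/ρ) and an S wave at sqrt(μ/ρ), where K is the bulk modulus, μ the shear modulus and ρ the density. That is why a material with softened shear stiffness slows S waves far more than P waves. A quick sketch with illustrative, order-of-magnitude numbers (not measured inner-core values):

    ```python
    import math

    def wave_speeds(K, mu, rho):
        """Elastic P- and S-wave speeds from bulk modulus K and shear modulus mu
        (both in Pa) and density rho (kg/m^3). S waves need nonzero shear
        stiffness; in a fluid (mu = 0) they cannot propagate at all."""
        v_p = math.sqrt((K + 4 * mu / 3) / rho)
        v_s = math.sqrt(mu / rho)
        return v_p, v_s

    # Illustrative values of roughly inner-core magnitude (assumed, not measured):
    K, mu, rho = 1.4e12, 1.7e11, 1.3e4
    v_p, v_s = wave_speeds(K, mu, rho)
    print(f"v_p = {v_p / 1000:.1f} km/s, v_s = {v_s / 1000:.1f} km/s")

    # Halving the shear modulus lowers v_s sharply while barely touching v_p:
    v_p2, v_s2 = wave_speeds(K, 0.5 * mu, rho)
    ```

    With these inputs v_s drops by about 30 percent when μ is halved, while v_p changes by only a few percent, which is the sense in which a liquid-like component could explain a low shear wave velocity without upsetting the P wave data.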

    To really explain the inner core, scientists must find a combination of elements that is consistent with everything known about it, including its S wave velocity, P wave velocity and its density. “You have to match all three things, otherwise it doesn’t work,” says mineral physicist John Brodholt of The University College London (UK).

    In a study published in August 2021 in Earth and Planetary Science Letters, Brodholt and colleagues did just that. A simulation of iron, silicon and hydrogen atoms reproduced the inner core’s known characteristics. In the simulation, the material was also superionic: The iron and silicon stayed in position while the hydrogen flowed like a liquid.

    But Brodholt notes that their result is just one possible explanation for the inner core’s properties. Brodholt and his colleagues have previously found [Earth and Planetary Science Letters] other combinations of elements that could explain the inner core without going superionic, he says, leaving unresolved the question of what lurks in Earth’s deepest depths.

    Another puzzle of Earth’s heart is the fact that the inner core’s structure seems to change over time. This has previously been interpreted as evidence that the inner core rotates at a different rate than the rest of the Earth. But He and colleagues suggest that it could instead result from the motions of liquid-like light elements swirling inside the inner core and changing the distribution of elements over time. “This paper sort of offers an explanation for both of these phenomena” — the slow shear wave velocity and the shifting structure — says Tkalčić, who was not involved with either new study.

    One thing missing is laboratory experiments showing how these combinations of elements behave under inner core conditions, says geophysicist Daniele Antonangeli of The University of Paris-Sorbonne [Université de Paris-Sorbonne](FR), who was not involved with the new research. Such tests could help confirm whether the simulations are correct.

    Previous experiments have found evidence that water ice can go superionic, perhaps under conditions found inside Uranus or Neptune (SN: 2/5/18). But researchers can’t yet probe the behavior of superionic materials under the conditions thought to exist inside Earth’s core. So scientists will have to keep pushing the tests to further extremes, Antonangeli says. “The experimentalist that is within me craves seeing experimental validation of this.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ANU Campus

    The Australian National University (AU) is a world-leading university in Australia’s capital city, Canberra. Our location points to our unique history, ties to the Australian Government and special standing as a resource for the Australian people.

    Our focus on research as an asset, and an approach to education, ensures our graduates are in demand the world over for their abilities to understand, and apply vision and creativity to addressing complex contemporary challenges.

    The Australian National University is regarded as one of the world’s leading research universities, and is ranked as the number one university in Australia and the Southern Hemisphere by the 2021 QS World University Rankings. It is ranked 31st in the world by the 2021 QS World University Rankings, and 59th in the world (third in Australia) by the 2021 Times Higher Education World University Rankings.

    In the 2020 Times Higher Education Global Employability University Ranking, an annual ranking of university graduates’ employability, Australian National University was ranked 15th in the world (first in Australia). According to the 2020 QS World University Rankings by Subject, the university was also ranked among the top 10 in the world for Anthropology, Earth and Marine Sciences, Geography, Geology, Philosophy, Politics, and Sociology.

    Established in 1946, The Australian National University is the only university to have been created by the Parliament of Australia. It traces its origins to Canberra University College, which was established in 1929 and was integrated into The Australian National University in 1960. The Australian National University enrolls 10,052 undergraduate and 10,840 postgraduate students and employs 3,753 staff. The university’s endowment stood at A$1.8 billion as of 2018.

    The Australian National University counts six Nobel laureates and 49 Rhodes scholars among its faculty and alumni. The university has educated two prime ministers, 30 current Australian ambassadors and more than a dozen current heads of government departments of Australia. The latest releases of ANU’s scholarly publications are held through ANU Press online.

    Calls for the establishment of a national university in Australia began as early as 1900. After the location of the nation’s capital, Canberra, was determined in 1908, land was set aside for the university at the foot of Black Mountain in Walter Burley Griffin’s design for the city. Planning for the university was disrupted by World War II but resumed with the creation of the Department of Post-War Reconstruction in 1942, ultimately leading to the passage of the Australian National University Act 1946 by the Chifley Government on 1 August 1946.

    A group of eminent Australian scholars returned from overseas to join The Australian National University, including Sir Howard Florey (co-developer of medicinal penicillin), Sir Mark Oliphant (a nuclear physicist who worked on the Manhattan Project), and Sir Keith Hancock (the Chichele Professor of Economic History at The University of Oxford (UK)). The group also included a New Zealander, Sir Raymond Firth (a professor of anthropology at LSE), who had earlier worked in Australia for some years. Economist Sir Douglas Copland was appointed as The Australian National University’s first Vice-Chancellor and former Prime Minister Stanley Bruce served as the first Chancellor. The Australian National University was originally organised into four centres—the Research Schools of Physical Sciences, Social Sciences and Pacific Studies and the John Curtin School of Medical Research.

    The first residents’ hall, University House, was opened in 1954 for faculty members and postgraduate students. The Mount Stromlo Observatory, established by the federal government in 1924, became part of The Australian National University in 1957.

    Mount Stromlo Observatory, just outside of Canberra, Altitude 770 m (2,530 ft).

    The first locations of The Australian National University Library, the Menzies and Chifley buildings, opened in 1963. The Australian Forestry School, located in Canberra since 1927, was amalgamated into The Australian National University in 1965.

    The Canberra School of Music and the Canberra School of Art combined in 1988 to form the Canberra Institute of the Arts, and amalgamated with the university as The Australian National University Institute of the Arts in 1992.

    The Australian National University established its Medical School in 2002, after obtaining federal government approval in 2000.

    On 18 January 2003, the Canberra bushfires largely destroyed the Mount Stromlo Observatory. The Australian National University astronomers now conduct research from the Siding Spring Observatory, which contains 10 telescopes including the Anglo-Australian Telescope.

    Siding Spring Mountain Observatory – Research School of Astronomy & Astrophysics (AU) with Anglo-Australian Telescope dome visible near centre of image in Coonabarabran, Warrumbungle National Park, New South Wales, Siding Spring Mountain [Mount Woorat] at an altitude of 1,165 m (3,822 ft).

    The Australian Astronomical Observatory AAT Anglo Australian Telescope, at Siding Spring Observatory, near Coonabarabran, New South Wales, Australia, at an altitude of 1,165 m (3,822 ft).

    In February 2013, financial entrepreneur and Australian National University graduate Graham Tuckwell made the largest university donation in Australian history by giving $50 million to fund an undergraduate scholarship program at The Australian National University.

    The Australian National University is well known for its history of student activism and, in recent years, its fossil fuel divestment campaign, which is one of the longest-running and most successful in the country. The decision of The Australian National University Council to divest from two fossil fuel companies in 2014 was criticized by ministers in the Abbott government, but defended by Vice Chancellor Ian Young, who noted:

    “On divestment, it is clear we were in the right and played a truly national and international leadership role. […] [W]e seem to have played a major role in a movement which now seems unstoppable.”

    As of 2014 The Australian National University still had investments in major fossil fuel companies.

    A survey conducted by the Australian Human Rights Commission in 2017 found that The Australian National University had the second highest incidence of sexual assault and sexual harassment. 3.5 per cent of respondents from The Australian National University reported being sexually assaulted in 2016. Vice Chancellor Brian Schmidt apologised to victims of sexual assault and harassment.

    In recent years The Australian National University has come under pressure with funding and staff cuts in the School of Music in 2011-15 and in the School of Culture, History and Language in 2016. However, there is a range of global (governmental) endowments available for Arts and Social Sciences, designated only for The Australian National University. Some courses are now delivered online.

    Today The Australian National University has exchange agreements in place for its students with many of the world’s leading universities, most notably in the Asia-Pacific region, including The National University of Singapore [சிங்கப்பூர் தேசிய பல்கலைக்கழகம்](SG), The University of Tokyo [東京大学](JP), The University of Hong Kong [香港大學](HK), The Peking University [北京大学](CN), The Tsinghua University [清华大学](CN) and The Seoul National University [서울대학교](KR). In other regions, notable universities include The Paris Sciences et Lettres University [Université Paris Sciences et Lettres Université PSL](FR), The George Washington University (US), The University of California (US), The University of Texas (US), The University of Toronto (CA) in North America and Imperial College London (UK), King’s College London (UK), Sciences Po (FR), The Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich](CH), Bocconi University [Università Commerciale Luigi Bocconi](IT), The University of Copenhagen [Københavns Universitet](DK) and Trinity College Dublin, the University of Dublin (IE) in Europe.

    In 2017, Chinese hackers infiltrated the computers of Australian National University, potentially compromising national security research conducted at the university.

    The Australian National University was ranked 27th in the world (first in Australia) by the 2022 QS World University Rankings, and equal 54th in the world, and equal 2nd in Australia (with The University of Queensland(AU)), by the 2022 Times Higher Education World University Rankings.

    In the QS World University Rankings by Subject 2020, The Australian National University was ranked 6th in the world for geology, 7th for philosophy, 8th in the world for politics, 9th in the world for sociology, 13th in the world for development studies and 15th in the world for linguistics.

    A 2017 study by Times Higher Education reported that The Australian National University was the world’s 7th (first in Australia) most international university.


  • richardmitnick 5:34 pm on January 27, 2022 Permalink | Reply
    Tags: "Machine learning points to prime places in Antarctica to find meteorites", "Science News (US)", A map of 613 probable meteorite hot spots including some near existing Antarctic research stations., The Free University of Brussels [Université libre de Bruxelles](BE)

    From The Free University of Brussels [Université libre de Bruxelles](BE) via Science News (US): “Machine learning points to prime places in Antarctica to find meteorites” 

    From The Free University of Brussels [Université libre de Bruxelles](BE)


    Science News (US)

    January 26, 2022
    Carolyn Gramling

    More than 600 spots in the icy continent may be prime locales for finding lots more space rocks.

    Researchers discover a meteorite in the Nansen blue ice area near Belgium’s Princess Elisabeth Antarctic research station during a 2019–2020 expedition. Field team of the BELARE 2019-2020 meteorite recovery expedition on the Nansen Ice Field.

    The hunt for meteorites may have just gotten some new leads. A powerful new machine learning algorithm has identified over 600 hot spots in Antarctica where scientists are likely to find a bounty of the fallen alien rocks, researchers report January 26 in Science Advances.

    Antarctica isn’t necessarily the No. 1 landing spot for meteorites, bits of extraterrestrial rock that offer a window into the birth and evolution of the solar system. Previous estimates suggest more meteorites probably land closer to the equator (SN: 5/29/20). But the southern continent is still the best place to find them, says Veronica Tollenaar, a glaciologist at the Université libre de Bruxelles in Belgium. Not only are the dark specks at the surface starkly visible against the white background, but quirks of the ice sheet’s flow can also concentrate meteorites in “stranding zones.”

    The trouble is that so far, meteorite stranding zones have been found by luck. Satellites help, but poring through the images is time-consuming, and field reconnaissance is costly. So Tollenaar and her colleagues trained computers to find these zones more quickly.

    Space rocks’ road
    This diagram shows what happens when a slowly creeping ice sheet, with meteorites (black dots) embedded in deeper layers, encounters a topographic rise such as a mountain. That obstacle bends the ice sheet’s layers upward, concentrating the space rocks embedded in them into a meteorite stranding zone. In regions where snow turns into water vapor (red arrows) faster than it accumulates, called blue ice areas, these stranding zones are particularly visible.
    Credit: Veronica Tollenaar.

    Such stranding zones form when the slow creep of the ice sheet over the land encounters a mountain or hidden rise in the ground. That barrier shifts the flow upward, carrying any embedded space rocks toward the surface.

    Combining a machine learning algorithm with data on the ice’s velocity and thickness, surface temperatures, the shape of the bedrock and known stranding zones, Tollenaar and colleagues created a map of 613 probable meteorite hot spots including some near existing Antarctic research stations.
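    The article doesn't specify the model, but the general recipe (gridded physical features plus a handful of labelled stranding zones feeding a classifier) can be sketched with a toy nearest-neighbour model. Every feature value and label below is invented for illustration:

    ```python
    import math

    # Hypothetical grid cells: (ice speed m/yr, ice thickness m, surface temp °C),
    # labelled True where a meteorite stranding zone is known, False elsewhere.
    TRAIN = [
        ((1.0, 400, -35), True),    # slow, thin ice over a bedrock rise
        ((2.0, 500, -38), True),
        ((1.5, 450, -36), True),
        ((80.0, 2500, -20), False),  # fast, thick interior ice
        ((60.0, 2200, -22), False),
        ((90.0, 2800, -18), False),
    ]

    def dist(a, b):
        # Scale each feature so no single unit dominates the distance.
        scales = (100.0, 3000.0, 40.0)
        return math.sqrt(sum(((x - y) / s) ** 2
                             for x, y, s in zip(a, b, scales)))

    def predict(cell, k=3):
        """Majority vote among the k labelled cells nearest to this one."""
        nearest = sorted(TRAIN, key=lambda item: dist(cell, item[0]))[:k]
        votes = sum(1 for _, label in nearest if label)
        return votes > k // 2

    print(predict((1.2, 420, -34)))    # resembles the known stranding zones
    print(predict((75.0, 2400, -21)))  # resembles deep interior ice
    ```

    The real study used far richer inputs (bedrock shape, known stranding-zone geometry) and a more capable algorithm, but the structure, labelled examples in, probability map out, is the same.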

    To date, about 45,000 meteorites have been plucked from the ice. But that’s a fraction of the 300,000 bits of space rock estimated to lie somewhere on the continent’s surface.

    The team has yet to test the map on the ground; a COVID-19 outbreak at the Belgian station in December halted plans to try it during the 2021–2022 field season. It will try again next year. Meanwhile, the team is making these data freely accessible to other researchers, hoping they’ll take up the hunt as well.

    Marking the spots
    Using a machine learning algorithm, researchers trained computers to identify the likeliest locations of meteorite stranding zones across Antarctica. Many of these zones (in red) are located near existing research stations (small tents). The team also created an interactive map to help researchers find these alien rocks.
    ‘Treasure map’ to find meteorites in Antarctica.
    Credit: Veronica Tollenaar.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Free University of Brussels [Université libre de Bruxelles] (BE), abbreviated ULB, is a French-speaking private research university in Brussels, Belgium.

    ULB is one of two institutions which trace their origins to the Free University of Brussels, founded in 1834 by Belgian lawyer Pierre-Théodore Verhaegen. This split along linguistic lines in 1969 into the French-speaking ULB and Dutch-speaking Vrije Universiteit Brussel (VUB), both founded in 1970. A major research center open to Europe and the world, it has about 24,200 students, 33% of whom come from abroad, and an equally cosmopolitan staff.

  • richardmitnick 11:41 am on January 15, 2022 Permalink | Reply
    Tags: "A century of quantum mechanics questions the fundamental nature of reality", "Science News (US)", At its roots reality is described by the mysterious set of mathematical rules known as quantum mechanics., , , Quantum mechanics is the math that explains matter., Quantum theory represents the ultimate outcome of superior logical reasoning., Reality isn’t what it seems., Science morphs from dictator to oddsmaker: quantum math tells only probabilities for different possible outcomes. Some uncertainty always remains., Scientists have dug deep enough to discover that reality’s foundations do not mirror the world of everyday appearances., The physics of the microworld, The quantum revolution upended our understanding of nature and a lot of uncertainty remains.   

    From California Institute of Technology (US) via Science News (US): “A century of quantum mechanics questions the fundamental nature of reality” 


    From California Institute of Technology (US)


    Science News (US)

    January 12, 2022
    Tom Siegfried

    Quantum theory describes a reality ruled by probabilities. How to reconcile that reality with everyday experiences is still unclear. Credit: Max Löffler.

    The quantum revolution upended our understanding of nature and a lot of uncertainty remains.

    Scientists are like prospectors, excavating the natural world seeking gems of knowledge about physical reality. And in the century just past, scientists have dug deep enough to discover that reality’s foundations do not mirror the world of everyday appearances. At its roots reality is described by the mysterious set of mathematical rules known as quantum mechanics.

    Conceived at the turn of the 20th century and then emerging in its full form in the mid-1920s, quantum mechanics is the math that explains matter. It’s the theory for describing the physics of the microworld, where atoms and molecules interact to generate the world of human experience. And it’s at the heart of everything that made the century just past so dramatically unlike the century preceding it. From cell phones to supercomputers, DVDs to pdfs, quantum physics fueled the present-day electronics-based economy, transforming commerce, communication and entertainment.

    But quantum theory taught scientists much more than how to make computer chips. It taught that reality isn’t what it seems.

    “The fundamental nature of reality could be radically different from our familiar world of objects moving around in space and interacting with each other,” physicist Sean Carroll suggested in a recent tweet. “We shouldn’t fool ourselves into mistaking the world as we experience it for the world as it really is.”

    In a technical paper [Reality as a Vector in Hilbert Space] backing up his tweet, Carroll notes that quantum theory consists of equations that describe mathematical entities roaming through an abstract realm of possible natural events. It’s plausible, Carroll argues, that this quantum realm of mathematical possibilities represents the true, fundamental nature of reality. If so, all the physical phenomena we perceive are just a “higher-level emergent description” of what’s really going on.

    “Emergent” events in ordinary space are real in their own way, just not fundamental, Carroll allows. Belief that the “spatial arena” is fundamental “is more a matter of convenience and convention than one of principle,” he says.

    Carroll’s perspective is not the only way of viewing the meaning of quantum math, he acknowledges, and it is not fully shared by most physicists. But everybody does agree that quantum physics has drastically remodeled humankind’s understanding of nature. In fact, a fair reading of history suggests that quantum theory is the most dramatic shift in science’s conception of reality since the ancient Greeks deposed mythological explanations of natural phenomena in favor of logic and reason. After all, quantum physics itself seems to defy logic and reason.

    It doesn’t, of course. Quantum theory represents the ultimate outcome of superior logical reasoning, arriving at truths that could never be discovered merely by observing the visible world.

    It turns out that in the microworld — beyond the reach of the senses — phenomena play a game with fantastical rules. Matter’s basic particles are not tiny rocks, but more like ghostly waves that maintain multiple possible futures until forced to assume the subatomic equivalent of substance. As a result, quantum math does not describe a relentless cause-and-effect sequence of events as Newtonian science had insisted. Instead science morphs from dictator to oddsmaker: quantum math tells only probabilities for different possible outcomes. Some uncertainty always remains.
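    The oddsmaking is quantitative: by Born's rule, the probability of each possible outcome is the squared magnitude of the complex amplitude the math assigns to it, and those probabilities always total 1. A minimal sketch with made-up amplitudes:

    ```python
    # A toy quantum state: complex amplitudes for three possible measurement outcomes.
    amplitudes = [0.6 + 0.0j, 0.0 + 0.48j, 0.64 + 0.0j]

    # Born's rule: normalise the state, then square each amplitude's magnitude.
    norm = sum(abs(a) ** 2 for a in amplitudes) ** 0.5
    probs = [abs(a / norm) ** 2 for a in amplitudes]

    print([round(p, 4) for p in probs])  # → [0.36, 0.2304, 0.4096]
    print(sum(probs))                    # probabilities always total 1
    ```

    Note that the amplitudes themselves can be complex; only their squared magnitudes are observable as frequencies of outcomes.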

    Quantum mechanics says that whether an electron behaves as particle or wave depends on how it is observed.Credit: Max Löffler.

    The quantum revolution

    The discovery of quantum uncertainty was what first impressed the world with the depth of the quantum revolution. German physicist Werner Heisenberg, in 1927, astounded the scientific community with the revelation that deterministic cause-and-effect physics failed when applied to atoms. It was impossible, Heisenberg deduced, to measure both the location and velocity of a subatomic particle at the same time. If you measured one precisely, some uncertainty remained for the other.

    “A particle may have an exact place or an exact speed, but it can not have both,” as Science News Letter, the predecessor of Science News, reported in 1929. “Crudely stated, the new theory holds that chance rules the physical world.” Heisenberg’s uncertainty principle “is destined to revolutionize the ideas of the universe held by scientists and laymen to an even greater extent than Einstein’s relativity.”
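    The principle is quantitative: the uncertainties obey Δx·Δp ≥ ħ/2, so confining a particle to a smaller region forces a larger spread in its momentum. A small worked example (the confinement length is a hypothetical choice, roughly one atomic diameter) shows why the effect dominates for electrons in atoms:

    ```python
    HBAR = 1.054571817e-34  # reduced Planck constant, J·s
    M_E = 9.1093837015e-31  # electron mass, kg

    def min_momentum_uncertainty(delta_x):
        """Smallest momentum spread allowed by Heisenberg: Δx·Δp >= ħ/2."""
        return HBAR / (2 * delta_x)

    delta_x = 1e-10  # confine an electron to ~one atomic diameter, in metres
    delta_p = min_momentum_uncertainty(delta_x)
    delta_v = delta_p / M_E  # corresponding spread in velocity
    print(f"minimum velocity uncertainty: {delta_v:.2e} m/s")
    ```

    The answer comes out to hundreds of kilometres per second, comparable to an electron's orbital speed, which is why atomic physics cannot ignore the principle; for a thrown baseball the same formula gives an utterly negligible spread.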

    Heisenberg’s breakthrough was the culmination of a series of quantum surprises. First came German physicist Max Planck’s discovery, in 1900, that light and other forms of radiation could be absorbed or emitted only in discrete packets, which Planck called quanta. A few years later Albert Einstein argued that light also traveled through space as packets, or particles, later called photons. Many physicists dismissed such early quantum clues as inconsequential. But in 1913, the Danish physicist Niels Bohr used quantum theory to explain the structure of the atom. Soon the world realized that reality needed reexamining.

    By 1921, awareness of the quantum revolution had begun to expand beyond the confines of physics conferences. In that year, Science News Bulletin, the first iteration of Science News, distributed what was “believed to be the first popular explanation” of the quantum theory of radiation, provided by American physical chemist William D. Harkins. He proclaimed that the quantum theory “is of much more practical importance” than Albert Einstein’s Theory of General Relativity.

    “Since it concerns itself with the relations between matter and radiation,” Harkins wrote, quantum theory “is of fundamental significance in connection with almost all processes which we know.” Electricity, chemical reactions and how matter responds to heat all require quantum-theoretic explanations.

    As for atoms, traditional physics asserts that atoms and their parts can move about “in a large number of different ways,” Harkins stated. But quantum theory maintains that “of all the states of motion (or ways of moving) prescribed by the older theory, only a certain number actually do occur.” Therefore, events previously believed “to occur as continuous processes, actually do occur in steps.”

    But in 1921 quantum physics remained embryonic. Some of its implications had been discerned, but its full form remained undeveloped in detail. It was Heisenberg, in 1925, who first transformed the puzzling jumble of clues into a coherent mathematical picture. His decisive advance was developing a way to represent the energies of electrons in atoms using matrix algebra. With aid from German physicists Max Born and Pascual Jordan, Heisenberg’s math became known as matrix mechanics. Shortly thereafter, Austrian physicist Erwin Schrödinger developed a competing equation for electron energies, viewing the supposed particles as waves described by a mathematical wave function. Schrödinger’s “wave mechanics” turned out to be mathematically equivalent to Heisenberg’s particle-based approach, and “quantum mechanics” became the general term for the math describing all subatomic systems.

    Still, some confusion remained. It wasn’t clear how an approach picturing electrons as particles could be equivalent to one supposing electrons to be waves. Bohr, by then regarded as the foremost of the world’s atomic physicists, pondered the question deeply and by 1927 arrived at a novel viewpoint he called complementarity.

    Bohr argued that the particle and wave views were complementary; both were necessary for a full description of subatomic phenomena. Whether a “particle” — say, an electron — exhibited its wave or particle nature depended on the experimental setup observing it. An apparatus designed to find a particle would find a particle; an apparatus geared to detect wave behavior would find a wave.

    At about the same time, Heisenberg derived his uncertainty principle. Just as wave and particle could not be observed in the same experiment, position and velocity could not both be precisely measured at the same time. As physicist Wolfgang Pauli commented, “Now it becomes day in quantum theory.”

    But the quantum adventure was really just beginning.

    In the many worlds interpretation of quantum mechanics, all possible realities exist, but humans perceive just one. Credit: Max Löffler.

    A great debate

    Many physicists, Einstein among them, deplored the implications of Heisenberg’s uncertainty principle. Its introduction in 1927 eliminated the possibility of precisely predicting the outcomes of atomic observations. As Born had shown, you could merely predict the probabilities for the various possible outcomes, using calculations informed by the wave function that Schrödinger had introduced. Einstein famously retorted that he could not believe that God would play dice with the universe. Even worse, in Einstein’s view, the wave-particle duality described by Bohr implied that a physicist could affect reality by deciding what kind of measurement to make. Surely, Einstein believed, reality existed independently of human observations.

    On that point, Bohr engaged Einstein in a series of discussions that came to be known as the Bohr-Einstein debate, a continuing dialog that came to a head in 1935. In that year, Einstein, with collaborators Nathan Rosen and Boris Podolsky, described a thought experiment supposedly showing that quantum mechanics could not be a complete theory of reality.

    In a brief summary in Science News Letter in May 1935, Podolsky explained that a complete theory must include a mathematical “counterpart for every element of the physical world.” In other words, there should be a quantum wave function for the properties of every physical system. Yet if two physical systems, each described by a wave function, interact and then fly apart, “quantum mechanics … does not enable us to calculate the wave function of each physical system after the separation.” (In technical terms, the two systems become “entangled,” a term coined by Schrödinger.) So quantum math cannot describe all elements of reality and is therefore incomplete.

    Bohr soon responded, as reported in Science News Letter in August 1935. He declared that Einstein and colleagues’ criterion for physical reality was ambiguous in quantum systems. Einstein, Podolsky and Rosen assumed that a system (say an electron) possessed definite values for certain properties (such as its momentum) before those values were measured. Quantum mechanics, Bohr explained, preserved different possible values for a particle’s properties until one of them was measured. You could not assume the existence of an “element of reality” without specifying an experiment to measure it.

    Einstein did not relent. He acknowledged that the uncertainty principle was correct with respect to what was observable in nature, but insisted that some invisible aspect of reality nevertheless determined the course of physical events. In the early 1950s physicist David Bohm developed such a theory of “hidden variables” that restored determinism to quantum physics, but made no predictions that differed from the standard quantum mechanics math. Einstein was not impressed with Bohm’s effort. “That way seems too cheap to me,” Einstein wrote to Born, a lifelong friend.

    Einstein died in 1955, Bohr in 1962, neither conceding to the other. In any case it seemed like an irresolvable dispute, since experiments would give the same results either way. But in 1964, physicist John Stewart Bell deduced a clever theorem about entangled particles, enabling experiments to probe the possibility of hidden variables. Beginning in the 1970s, and continuing to today, experiment after experiment confirmed the standard quantum mechanical predictions. Einstein’s objection was overruled by the court of nature.

    Still, many physicists expressed discomfort with Bohr’s view (commonly referred to as the Copenhagen interpretation of quantum mechanics). One particularly dramatic challenge came from the physicist Hugh Everett III in 1957. He insisted that an experiment did not create one reality from the many quantum possibilities, but rather identified only one branch of reality. All the other experimental possibilities existed on other branches, all equally real. Humans perceive only their own particular branch, unaware of the others just as they are unaware of the rotation of the Earth. This “many worlds interpretation” was widely ignored at first but became popular decades later, with many adherents today.

    Since Everett’s work, numerous other interpretations of quantum theory have been offered. Some emphasize the “reality” of the wave function, the mathematical expression used for predicting the odds of different possibilities. Others emphasize the role of the math as describing the knowledge about reality accessible to experimenters.

    Some interpretations attempt to reconcile the many worlds view with the fact that humans perceive only one reality. In the 1980s, physicists including H. Dieter Zeh and Wojciech Zurek identified the importance of a quantum system’s interaction with its external environment, a process called quantum decoherence. Some of a particle’s many possible realities rapidly evaporate as it encounters matter and radiation in its vicinity. Soon only one of the possible realities remains consistent with all the environmental interactions, explaining why on the human scale of time and size only one such reality is perceived.

    This insight spawned the “consistent histories” interpretation, pioneered by Robert Griffiths and developed in more elaborate form by Murray Gell-Mann and James Hartle. It is widely known among physicists but has received little wider popularity and has not deterred the pursuit of other interpretations. Scientists continue to grapple with what quantum math means for the very nature of reality.

    Using principles of quantum information theory, a particle’s quantum state can be replicated at a distant location, a feat known as quantum teleportation. Credit: Max Löffler.

    It from quantum bit

    In the 1990s, the quest for quantum clarity took a new turn with the rise of quantum information theory. Physicist John Archibald Wheeler, a disciple of Bohr, had long emphasized that specific realities emerged from the fog of quantum possibilities by irreversible amplifications — such as an electron definitely establishing its location by leaving a mark after hitting a detector. Wheeler suggested that reality as a whole could be built up from such processes, which he compared to yes or no questions — is the electron here? Answers corresponded to bits of information, the 1s and 0s used by computers. Wheeler coined the slogan “it from bit” to describe the link between existence and information.

    Taking the analogy further, one of Wheeler’s former students, Benjamin Schumacher, devised the notion of a quantum version of the classical bit of information. He introduced the quantum bit, or qubit, at a conference in Dallas in 1992.
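
Unlike a classical bit, which is definitely 0 or 1, a qubit holds two complex amplitudes and only yields 0 or 1 when measured, with probabilities given by the squared magnitudes of those amplitudes (the Born rule). A toy simulation, assuming nothing beyond standard quantum mechanics (the function name is illustrative):

```python
import random

# Toy model of a single qubit a|0> + b|1> with |a|^2 + |b|^2 = 1.
# Measurement returns 0 with probability |a|^2, else 1 (the Born rule).

def measure(a: complex, b: complex) -> int:
    """Simulate one measurement of the qubit a|0> + b|1>."""
    p0 = abs(a) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition: |a|^2 = |b|^2 = 1/2, like a fair quantum coin.
a = b = 2 ** -0.5
counts = [0, 0]
for _ in range(10_000):
    counts[measure(a, b)] += 1
print(counts)  # roughly [5000, 5000]
```

Quantum computers gain their power not from this randomness but from manipulating the amplitudes of many qubits at once before any measurement is made.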

    Schumacher’s qubit provided a basis for building computers that could process quantum information. Such “quantum computers” had previously been envisioned, in different ways, by physicists Paul Benioff, Richard Feynman and David Deutsch. In 1994, mathematician Peter Shor showed how a quantum computer manipulating qubits could crack the toughest secret codes, launching a quest to design and build quantum computers capable of that and other clever computing feats. By the early 21st century, rudimentary quantum computers had been built; the latest versions can perform some computing tasks but are not powerful enough yet to make current cryptography methods obsolete. For certain types of problems, though, quantum computing may soon achieve superiority over standard computers.


    Quantum computing’s realization has not resolved the debate over quantum interpretations. Deutsch believed that quantum computers would support the many worlds view. Hardly anyone else agrees, though. And decades of quantum experiments have not provided any support for novel interpretations; all the results comply with traditional quantum mechanical expectations. Quantum systems preserve different values for certain properties until one is measured, just as Bohr insisted. But nobody is completely satisfied, perhaps because the 20th century’s other pillar of fundamental physics, Einstein’s theory of gravity (general relativity), does not fit into quantum theory’s framework.

    For decades now, the quest for a quantum theory of gravity has fallen short of success, despite many promising ideas. Most recently, a new approach suggests that the geometry of spacetime, the source of gravity in Einstein’s theory, may in some way be built from the entanglement of quantum entities. If so, the mysterious behavior of the quantum world defies understanding in terms of ordinary events in space and time because quantum reality creates spacetime, rather than occupying it. Human observers would then witness an artificial, emergent reality that gives the impression of events happening in space and time, while the true, inaccessible reality doesn’t have to play by the spacetime rules.

    In a crude way this view echoes that of Parmenides, the ancient Greek philosopher who taught that all change is an illusion. Our senses show us the “way of seeming,” Parmenides declared; only logic and reason can reveal “the way of truth.” Parmenides didn’t reach that insight by doing the math, of course (he said it was explained to him by a goddess). But he was a crucial figure in the history of science, initiating the use of rigorous deductive reasoning and relying on it even when it led to conclusions that defied sensory experience.

    Yet as some of the other ancient Greeks realized, the world of the senses does offer clues about the reality we can’t see. “Phenomena are a sight of the unseen,” Anaxagoras said. As Carroll puts it, in modern terms, “the world as we experience it” is certainly related to “the world as it really is.”

    “But the relationship is complicated,” he says, “and it’s real work to figure it out.”

    In fact, it took two millennia of hard work for the Greek revolution in explaining nature to mature into Newtonian science’s mechanistic understanding of reality. Three centuries later quantum physics revolutionized science’s grasp of reality to a comparable extent. Yet the lack of agreement on what it all means suggests that perhaps science needs to dig a little deeper still.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Caltech campus

    The California Institute of Technology (US) is a private research university in Pasadena, California. The university is known for its strength in science and engineering, and is one among a small group of institutes of technology in the United States which is primarily devoted to the instruction of pure and applied sciences.

    The California Institute of Technology was founded as a preparatory and vocational school by Amos G. Throop in 1891 and began attracting influential scientists such as George Ellery Hale, Arthur Amos Noyes, and Robert Andrews Millikan in the early 20th century. The vocational and preparatory schools were disbanded and spun off in 1910 and the college assumed its present name in 1920. In 1934, The California Institute of Technology was elected to the Association of American Universities, and the antecedents of National Aeronautics and Space Administration (US)’s Jet Propulsion Laboratory, which The California Institute of Technology continues to manage and operate, were established between 1936 and 1943 under Theodore von Kármán.

    The California Institute of Technology has six academic divisions with strong emphasis on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. First-year students are required to live on campus, and 95% of undergraduates remain in the on-campus House System at The California Institute of Technology. Although The California Institute of Technology has a strong tradition of practical jokes and pranks, student life is governed by an honor code which allows faculty to assign take-home examinations. The California Institute of Technology Beavers compete in 13 intercollegiate sports in the NCAA Division III’s Southern California Intercollegiate Athletic Conference (SCIAC).

    As of October 2020, there are 76 Nobel laureates who have been affiliated with The California Institute of Technology, including 40 alumni and faculty members (41 prizes, with chemist Linus Pauling being the only individual in history to win two unshared prizes). In addition, 4 Fields Medalists and 6 Turing Award winners have been affiliated with The California Institute of Technology. There are 8 Crafoord Laureates and 56 non-emeritus faculty members (as well as many emeritus faculty members) who have been elected to one of the United States National Academies. Four Chief Scientists of the U.S. Air Force have been affiliated with the institute, and 71 affiliates have won the United States National Medal of Science or Technology. Numerous faculty members are associated with the Howard Hughes Medical Institute(US) as well as National Aeronautics and Space Administration(US). According to a 2015 Pomona College(US) study, The California Institute of Technology ranked number one in the U.S. for the percentage of its graduates who go on to earn a PhD.


    The California Institute of Technology is classified among “R1: Doctoral Universities – Very High Research Activity”. Caltech was elected to The Association of American Universities in 1934 and remains a research university with “very high” research activity, primarily in STEM fields. The largest federal agencies contributing to research are National Aeronautics and Space Administration(US); National Science Foundation(US); Department of Health and Human Services(US); Department of Defense(US), and Department of Energy(US).

    In 2005, The California Institute of Technology had 739,000 square feet (68,700 m^2) dedicated to research: 330,000 square feet (30,700 m^2) to physical sciences, 163,000 square feet (15,100 m^2) to engineering, and 160,000 square feet (14,900 m^2) to biological sciences.

    In addition to managing NASA-JPL/Caltech (US), The California Institute of Technology also operates the Caltech Palomar Observatory(US); the Owens Valley Radio Observatory(US); the Caltech Submillimeter Observatory(US); the W. M. Keck Observatory at the Mauna Kea Observatory(US); the Laser Interferometer Gravitational-Wave Observatory at Livingston, Louisiana and Richland, Washington; and Kerckhoff Marine Laboratory(US) in Corona del Mar, California. The Institute launched the Kavli Nanoscience Institute at The California Institute of Technology in 2006; the Keck Institute for Space Studies in 2008; and is also the current home for the Einstein Papers Project. The Spitzer Science Center(US), part of the Infrared Processing and Analysis Center(US) located on The California Institute of Technology campus, is the data analysis and community support center for NASA’s Spitzer Infrared Space Telescope [no longer in service].

    The California Institute of Technology partnered with University of California at Los Angeles(US) to establish a Joint Center for Translational Medicine (UCLA-Caltech JCTM), which conducts experimental research into clinical applications, including the diagnosis and treatment of diseases such as cancer.

    The California Institute of Technology operates several Total Carbon Column Observing Network(US) stations as part of an international collaborative effort of measuring greenhouse gases globally. One station is on campus.

  • richardmitnick 1:39 pm on December 26, 2021 Permalink | Reply
    Tags: "Science News (US)", "The most ancient supermassive black hole is bafflingly big", A quasar is a supermassive black hole in the core of a galaxy wrapped in a bright disk of material., , , , , Finding such a huge supermassive black hole so early in the universe’s history challenges astronomers’ understanding of how these cosmic beasts first formed., Quasar J0313-1806 is two times heavier and 20 million years older than the last record-holder for earliest known black hole., The most ancient black hole ever discovered is so big it defies explanation., The quasar is dubbed J0313-1806., , This active supermassive black hole-a quasar-boasts a mass of 1.6 billion suns and lies at the heart of a galaxy more than 13 billion light-years from Earth.   

    From The University of Arizona (US) via Science News (US) : “The most ancient supermassive black hole is bafflingly big” 

    From The University of Arizona (US)


    Science News (US)

    January 18, 2021 [Just found this in a year-end round up.]
    Maria Temming

    A quasar is a supermassive black hole in the core of a galaxy wrapped in a bright disk of material. The most distant quasar now known is J0313-1806, which dates back to when the universe was a mere 670 million years old. Credit: J. da Silva NOIRLab (US)/The National Science Foundation (US)/The Association of Universities for Research in Astronomy (AURA)(US).

    The most ancient black hole ever discovered is so big it defies explanation.

    This active supermassive black hole, or quasar, boasts a mass of 1.6 billion suns and lies at the heart of a galaxy more than 13 billion light-years from Earth. The quasar, dubbed J0313-1806, dates back to when the universe was just 670 million years old, or about 5 percent of the universe’s current age. That makes J0313-1806 two times heavier and 20 million years older than the last record-holder for earliest known black hole (SN: 12/6/17).

    Finding such a huge supermassive black hole so early in the universe’s history challenges astronomers’ understanding of how these cosmic beasts first formed, researchers reported January 12 at a virtual meeting of the American Astronomical Society and in a paper posted for The Astrophysical Journal Letters on January 8, 2021.

    Supermassive black holes are thought to grow from smaller seed black holes that gobble up matter. But astronomer Feige Wang of the University of Arizona and colleagues calculated that even if J0313-1806’s seed formed right after the first stars in the universe and grew as fast as possible, it would have needed a starting mass of at least 10,000 suns. The normal way seed black holes form — through the collapse of massive stars — can only make black holes up to a few thousand times as massive as the sun.

    A gargantuan seed black hole may have formed through the direct collapse of vast amounts of primordial hydrogen gas, says study coauthor Xiaohui Fan, also an astronomer at the University of Arizona in Tucson. Or perhaps J0313-1806’s seed started out small, forming through stellar collapse, and black holes can grow a lot faster than scientists think. “Both possibilities exist, but neither is proven,” Fan says. “We have to look much earlier [in the universe] and look for much less massive black holes to see how these things grow.”

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    As of 2019, The University of Arizona (US) enrolled 45,918 students in 19 separate colleges/schools, including The University of Arizona College of Medicine in Tucson and Phoenix and the James E. Rogers College of Law, and is affiliated with two academic medical centers (Banner – University Medical Center Tucson and Banner – University Medical Center Phoenix). The University of Arizona is one of three universities governed by the Arizona Board of Regents. The university is part of the Association of American Universities and is the only member from Arizona, and also part of the Universities Research Association(US). The university is classified among “R1: Doctoral Universities – Very High Research Activity”.

    Known as the Arizona Wildcats (often shortened to “Cats”), The University of Arizona’s intercollegiate athletic teams are members of the Pac-12 Conference of the NCAA. The University of Arizona athletes have won national titles in several sports, most notably men’s basketball, baseball, and softball. The official colors of the university and its athletic teams are cardinal red and navy blue.

    After the passage of the Morrill Land-Grant Act of 1862, the push for a university in Arizona grew. The Arizona Territory’s “Thieving Thirteenth” Legislature approved The University of Arizona in 1885 and selected the city of Tucson to receive the appropriation to build the university. Tucson hoped to receive the appropriation for the territory’s mental hospital, which carried a $100,000 allocation instead of the $25,000 allotted to the territory’s only university (Arizona State University(US) was also chartered in 1885, but it was created as Arizona’s normal school, and not a university). Flooding on the Salt River delayed Tucson’s legislators, and by the time they reached Prescott, back-room deals allocating the most desirable territorial institutions had been made. Tucson was largely disappointed with receiving what was viewed as an inferior prize.

    With no parties willing to provide land for the new institution, the citizens of Tucson prepared to return the money to the Territorial Legislature until two gamblers and a saloon keeper decided to donate the land to build the school. Construction of Old Main, the first building on campus, began on October 27, 1887, and classes met for the first time in 1891 with 32 students in Old Main, which is still in use today. Because there were no high schools in Arizona Territory, the university maintained separate preparatory classes for the first 23 years of operation.


    The University of Arizona is classified among “R1: Doctoral Universities – Very high research activity”. UArizona is the fourth most awarded public university by National Aeronautics and Space Administration(US) for research. The University of Arizona was awarded over $325 million for its Lunar and Planetary Laboratory (LPL) to lead NASA’s 2007–08 mission to Mars to explore the Martian Arctic, and $800 million for its OSIRIS-REx mission, the first in U.S. history to sample an asteroid.

    The LPL’s role in the Cassini mission orbiting Saturn is larger than that of any other university in the world. The University of Arizona laboratory designed and operated the atmospheric radiation investigations and imaging on the probe. The University of Arizona operates the HiRISE camera, a part of the Mars Reconnaissance Orbiter. While using the HiRISE camera in 2011, University of Arizona alumnus Lujendra Ojha and his team discovered proof of liquid water on the surface of Mars—a discovery confirmed by NASA in 2015. The University of Arizona receives more NASA grants annually than the next nine top NASA/JPL-Caltech(US)-funded universities combined. As of March 2016, The University of Arizona’s Lunar and Planetary Laboratory is actively involved in ten spacecraft missions: Cassini VIMS; Grail; the HiRISE camera orbiting Mars; the Juno mission orbiting Jupiter; Lunar Reconnaissance Orbiter (LRO); Maven, which will explore Mars’ upper atmosphere and interactions with the sun; Solar Probe Plus, a historic first mission into the Sun’s atmosphere; Rosetta’s VIRTIS; WISE; and OSIRIS-REx, the first U.S. sample-return mission to a near-earth asteroid, which launched on September 8, 2016.

    The University of Arizona students have been selected as Truman, Rhodes, Goldwater, and Fulbright Scholars. According to The Chronicle of Higher Education, UArizona is among the top 25 producers of Fulbright awards in the U.S.

    The University of Arizona is a member of the Association of Universities for Research in Astronomy(US), a consortium of institutions pursuing research in astronomy. The association operates observatories and telescopes, notably Kitt Peak National Observatory(US) just outside Tucson. Led by Roger Angel, researchers in the Steward Observatory Mirror Lab at The University of Arizona are working in concert to build the world’s most advanced telescope. Known as the Giant Magellan Telescope(CL), it will produce images 10 times sharper than those from the Earth-orbiting Hubble Telescope.

    Giant Magellan Telescope, 21 meters, to be at the NOIRLab(US) National Optical Astronomy Observatory(US) Carnegie Institution for Science’s(US) Las Campanas Observatory(CL), some 115 km (71 mi) north-northeast of La Serena, Chile, over 2,500 m (8,200 ft) high.

    The telescope is set to be completed in 2021. GMT will ultimately cost $1 billion. Researchers from at least nine institutions are working to secure the funding for the project. The telescope will include seven 18-ton mirrors capable of providing clear images of volcanoes and riverbeds on Mars and mountains on the moon at a rate 40 times faster than the world’s current large telescopes. The mirrors of the Giant Magellan Telescope will be built at The University of Arizona and transported to a permanent mountaintop site in the Chilean Andes where the telescope will be constructed.

    Reaching Mars in March 2006, the Mars Reconnaissance Orbiter contained the HiRISE camera, with Principal Investigator Alfred McEwen as the lead on the project. This National Aeronautics and Space Agency (US) mission to Mars carrying the UArizona-designed camera is capturing the highest-resolution images of the planet ever seen. The journey of the orbiter was 300 million miles. In August 2007, The University of Arizona, under the charge of Scientist Peter Smith, led the Phoenix Mars Mission, the first mission completely controlled by a university. Reaching the planet’s surface in May 2008, the mission’s purpose was to improve knowledge of the Martian Arctic. The Arizona Radio Observatory(US), a part of The University of Arizona Department of Astronomy Steward Observatory(US), operates the Submillimeter Telescope on Mount Graham.

    The National Science Foundation(US) funded the iPlant Collaborative in 2008 with a $50 million grant. In 2013, iPlant Collaborative received a $50 million renewal grant. Rebranded in late 2015 as “CyVerse”, the collaborative cloud-based data management platform is moving beyond life sciences to provide cloud-computing access across all scientific disciplines.

    In June 2011, the university announced it would assume full ownership of the Biosphere 2 scientific research facility in Oracle, Arizona, north of Tucson, effective July 1. Biosphere 2 was constructed by private developers (funded mainly by Texas businessman and philanthropist Ed Bass) with its first closed system experiment commencing in 1991. The university had been the official management partner of the facility for research purposes since 2007.

    U Arizona mirror lab-Where else in the world can you find an astronomical observatory mirror lab under a football stadium?

    University of Arizona’s Biosphere 2, located in the Sonoran desert. An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why the UA is a university unlike any other.

    University of Arizona Landscape Evolution Observatory at Biosphere 2.

  • richardmitnick 10:16 am on December 23, 2021 Permalink | Reply
    Tags: "Science News (US)", "Spacecraft in 2021 set their sights on Mars and asteroids and beyond", Mars 2020 Perseverance Rover, NASA DART and ESA Hera, ,   

    From Science News (US) : “Spacecraft in 2021 set their sights on Mars and asteroids and beyond” 

    From Science News (US)

    December 22, 2021
    Lisa Grossman

    New rovers started exploring the Red Planet while other missions launched to asteroids.

    While a flurry of missions crowded around Mars this year, some lesser-explored parts of the solar system are about to get fresh eyes.

    Three countries visited the Red Planet in 2021, sending orbiters, landers, rovers and even a helicopter. The United Arab Emirates successfully put its first interplanetary spacecraft, called Hope, into orbit in February, to study Mars’ climate. China’s Zhurong rover has been trundling around the planet’s surface since May, studying the local geology and searching for underground water ice (SN Online: 5/19/21). And NASA’s Perseverance rover, which landed in February, has been collaborating with a helicopter called Ingenuity to explore an ancient lake bed and collect rocks for a future delivery mission to Earth (SN Online: 2/17/21; SN Online: 4/30/21).

    But while all eyes were on Mars, other missions are embarking on journeys to study even more far-flung places. After years of delays and billions of dollars over budget, the James Webb Space Telescope is finally set to launch, no earlier than December 25, to probe the universe’s earliest galaxies, among other things (SN: 10/9/21 & 10/23/21, p. 26).

    Meanwhile, spacecraft are heading off to visit 11 asteroids in the solar system in search of clues to the origins of the planets, and water and life on Earth, as well as ways to keep our planet safe from errant space rocks.

    Let’s meet those rock explorers.

    Mars 2020 Perseverance Rover

    Mars 2020 Perseverance Rover – NASA Mars, annotated.
    NASA’s Perseverance rover arrived on Mars in February and has been exploring an ancient lake bed.
    Credit: JPL/Caltech-NASA(US)


    NASA’s Lucy spacecraft launched on October 16 on the first mission to explore Jupiter’s Trojan asteroids, two groups of space rocks that share an orbit with Jupiter around the sun (SN Online: 10/15/21; SN: 5/13/17, p. 5). The asteroids occupy areas known as Lagrange points, where the gravitational pulls of Jupiter and the sun cancel each other out. These regions are like cosmic dead zones and have been collecting planetary flotsam for billions of years. Those asteroids and other bits of debris are like fossils of the early solar system. (Fittingly, the mission was named for the famous hominid fossil Lucy.)

    Over the next 12 years, Lucy will make five flybys to observe seven Trojan asteroids, plus one asteroid in the main belt between Mars and Jupiter for good measure. By the end of its mission, Lucy will have visited more objects than any other NASA mission.

    NASA depiction of Lucy Mission to Jupiter’s Trojans.

    NASA DART and ESA Hera

    The next mission to head out was NASA’s Double Asteroid Redirection Test, or DART, which launched November 24 to deliberately ram into an asteroid in an attempt to alter its orbit. That collision will test a technique for deflecting dangerous asteroids that could crash into Earth in the future.

    The spacecraft’s destination is a pair of asteroids called Didymos and Dimorphos (SN: 8/15/20, p. 5). In late September 2022, DART will crash into Dimorphos while moving at about 6.6 kilometers per second, which hopefully will shift its orbit around Didymos. Astronomers on Earth will be able to tell if the test worked by looking for a change in Dimorphos’ orbital period, and the European Space Agency will send a follow-up probe called Hera in 2024.

    National Aeronautics Space Agency(US) DART in space depiction.

    European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU)’s Hera spacecraft depiction.


    Last up is Psyche, which is set to launch in August 2022. The NASA spacecraft will visit 16 Psyche, an asteroid that seems to be made up almost entirely of metal. It may be the exposed core of a protoplanet that lost its outer mantle and crust in cosmic collisions long ago. Since scientists can’t send a mission to Earth’s core, a trip to 16 Psyche may be the closest we can get to a journey to the center of the Earth.

    Psyche will arrive at its destination in 2026 and spend 21 months measuring the asteroid’s magnetic field and composition from orbit. One question is whether the asteroid truly is a planetary core. Even if not, the asteroid is a new kind of world that no spacecraft has visited before.
    NASA Psyche spacecraft depiction.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 9:45 am on December 23, 2021 Permalink | Reply
    Tags: "Science News (US)", "These discoveries from 2021-if true-could shake up science", A giant arc of galaxies stretching across more than 3 billion light-years., Antistarry night, Cosmic curve ball, Early arrival, Extragalactic planet, Fourteen celestial sources of gamma rays, Misbehaving muons, Oldest animal fossils?   

    From Science News (US) : “These discoveries from 2021-if true-could shake up science” 

    From Science News (US)

    December 21, 2021
    Aina Abell

    Hidden subatomic particles and the oldest animal fossils are among findings that need more evidence.

    Antistarry night

    Fourteen celestial sources of gamma rays (colored dots in this all-sky map of the Milky Way; yellow indicates bright sources and blue shows dim sources) may come from stars made of antimatter. Credit: Simon Dupourqué/The Astrophysics and Planetology Research Institute [Institut de Recherche en Astrophysique et Planétologie] at The Paul Sabatier University [Université Paul Sabatier, a.k.a. Toulouse III] (FR)/The National Centre for Scientific Research [Centre national de la recherche scientifique, CNRS] (FR) | VAMDC Consortium.

    Scientists may have spotted stars made of antimatter (SN: 6/5/21, p. 8). Finding antistars challenges a basic tenet of cosmology — that the vast majority of the universe’s antimatter, matter’s oppositely charged doppelgänger, was destroyed long ago. In 10 years of observations from the Fermi Gamma-ray Space Telescope, researchers found 14 points of light emitting gamma rays at energies that are expected when matter and antimatter meet and annihilate each other — a process that could happen on the surface of antistars. The discovery hints that substantial amounts of antimatter may have survived. But proving the existence of antistars will be extremely difficult because, aside from the studied gamma rays, the light such stars give off would look just like the light from normal stars.

    Misbehaving muons

    Nothing gets physicists more excited than evidence of a new fundamental particle. Researchers with the Muon g–2 experiment at DOE’s Fermi National Accelerator Laboratory (US) flung billions of muons around the lab’s giant magnet and found that the rate at which the orientation of the muons’ magnetic poles wobbled strayed from theoretical predictions. The odd behavior suggests that hidden particles are influencing the muons’ magnetic properties, challenging the standard model of particle physics describing the universe’s fundamental forces and elementary particles (SN: 5/8/21 & 5/21/21, p. 6). But it will take more data to convince physicists, who are still refining their predictions of muon behavior.

    DOE’s Fermi National Accelerator Laboratory(US) Muon g-2 studio. As muons race around a ring at the Muon g-2 studio, their spin axes twirl, reflecting the influence of unseen particles.

    Cosmic curve ball

    In other news that may upend our understanding of the cosmos, scientists detected a giant arc of galaxies stretching across more than 3 billion light-years. Such a finding is counter to the assumption that matter in the universe is evenly distributed on large scales (SN: 7/3/21 & 7/17/21, p. 9). The arc, invisible to the human eye, came to light in an analysis of about 40,000 quasars — very bright cores of distant galaxies. But some skeptics argue that the arc may be just an artifact of the human tendency to pick up patterns where none actually exist.

    Astronomers discovered what they say is a giant arc of galaxies (smile-shaped curve in the middle of the image) by using the light from distant quasars (tiny blue dots), which exposes the halos of galaxies (dark spots) in front of them. Credit: A. Lopez.

    Early arrival

    This year brought new evidence that humans arrived in the Americas more than 15,000 years earlier than traditionally thought, throwing support behind last year’s claim that humans reached North America by about 33,000 years ago (SN: 12/19/20 & 1/2/21, p. 35). In May, researchers reported that animal bones excavated at a cave in Mexico date to between about 33,000 and 28,000 years ago (SN: 7/3/21 & 7/17/21, p. 16). Chipped and sharp-edged stones that may have functioned as tools were found near the bones, hinting that humans had been in the area. And the discovery of fossilized human footprints suggests people roamed what’s now New Mexico around 23,000 to 21,000 years ago (SN: 11/6/21, p. 12). If the tracks’ age is verified, it would show that humans were in North America during the pinnacle of the last ice age.

    These human footprints from what’s now New Mexico may be between 23,000 and 21,000 years old, suggesting humans were in North America during the height of the last ice age. Credit: David Bustos/National Park Service, Bournemouth University (UK).

    Oldest animal fossils?

    Tiny tubes found in 890-million-year-old rocks might be remnants of sea sponges. If that claim holds up, the tubes would push animal origins back by about 350 million years to an oxygen-poor period considered unsuitable for animal life (SN: 8/28/21, p. 6). But some researchers aren’t convinced that the fossils are sea sponges. Skeptics point to the lack of mineralized skeletal parts, known as spicules, that are typical features of sea sponges, and the fact that many nonanimal organisms can make similar tubes.

    This microscope image shows the pale, wormlike tubes of putative fossils of ancient sea sponges, found in 890-million-year-old rock. Credit: E.C. Turner/Nature 2021.

    Extragalactic planet

    Astronomers may have detected the first known planet outside of the Milky Way, in a galaxy about 28 million light-years from Earth (SN: 11/20/21, p. 7). Traditional exoplanet-hunting techniques don’t work well for such distances, so researchers looked to a type of paired star system known as an X-ray binary, which emits bright X-rays. A planet crossing, or transiting, in front of such a system would temporarily block those X-rays, alerting astronomers to the planet’s presence.
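    The transit idea can be illustrated with a toy light curve — a hedged sketch, not the researchers’ actual analysis; the `find_dip` helper and all numbers here are invented for illustration:

```python
def find_dip(flux, baseline, threshold=0.5):
    """Return indices where the flux falls below a fraction of the
    baseline -- the signature of something passing in front of the source."""
    return [i for i, f in enumerate(flux) if f < threshold * baseline]

# Simulated X-ray brightness: steady emission from the binary,
# interrupted while a hypothetical planet crosses in front of it.
baseline = 100.0
flux = [baseline] * 20
for i in range(8, 12):   # the transit: X-rays almost fully blocked
    flux[i] = 5.0

print(find_dip(flux, baseline))  # → [8, 9, 10, 11]
```

    Because an X-ray binary is a far more compact light source than an ordinary star, a transiting planet can block nearly all of its emission at once, which is what makes this detection strategy feasible at such enormous distances.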

    The spiral-shaped Whirlpool galaxy may be the host of the first planet spotted outside of the Milky Way. Credit: S. Beckwith/The Space Telescope Science Institute (US), The Hubble Heritage Team/STScI/The Association of Universities for Research in Astronomy (AURA) (US), The National Aeronautics and Space Administration (US), The European Space Agency [Agence spatiale européenne] (EU).

    Planet transit. Credit: The NASA Ames Research Center (US).

    Some scientists are skeptical because the discovery relied, metaphorically, on many stars to align: The planet needed to transit the X-ray binary while its orbit was perfectly in line with Earth’s point of view, just when a telescope was looking.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 1:21 pm on December 16, 2021 Permalink | Reply
    Tags: "Quantum physics requires imaginary numbers to explain reality", "Science News (US)", Imaginary numbers -well explained in Stephen Hawking's “A Brief History of Time”, Imaginary numbers result from taking the square root of a negative number., Imaginary numbers turn out to be essential. They seem to be woven into the fabric of quantum mechanics-the math describing the realm of molecules; atoms; and subatomic particles., Quantum theory’s prominent use of complex numbers-sums of imaginary and real numbers-was disconcerting to its founders-including physicist Erwin Schrödinger.

    From Science News (US): “Quantum physics requires imaginary numbers to explain reality” 

    From Science News (US)

    December 15, 2021
    Emily Conover

    Theories based only on real numbers fail to explain the results of two new experiments.

    To explain the real world, imaginary numbers are necessary, according to a quantum experiment (shown) performed by a team of physicists including Yali Mao (pictured) of The Southern University of Science and Technology [南方科技大學](CN). Credit: Jingyun Fan.

    Imaginary numbers [well explained in Stephen Hawking’s A Brief History of Time] might seem like unicorns and goblins — interesting but irrelevant to reality.

    But for describing matter at its roots, imaginary numbers turn out to be essential. They seem to be woven into the fabric of quantum mechanics, the math describing the realm of molecules, atoms and subatomic particles. A theory obeying the rules of quantum physics needs imaginary numbers to describe the real world, two new experiments suggest.

    Imaginary numbers result from taking the square root of a negative number. They often pop up in equations as a mathematical tool to make calculations easier. But everything we can actually measure about the world is described by real numbers, the normal, nonimaginary figures we’re used to (SN: 5/8/18). That’s true in quantum physics too. Although imaginary numbers appear in the inner workings of the theory, all possible measurements generate real numbers.
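    That last point can be made concrete with a minimal Python sketch (illustrative only, not tied to either experiment): a quantum amplitude may be complex, but the measurable probability |ψ|² is always a plain real number.

```python
import cmath

# i is the square root of -1; squaring it recovers -1.
i = cmath.sqrt(-1)
print(i * i)                      # (-1+0j)

# A toy quantum amplitude: complex on the inside...
psi = (1 + 2j) / cmath.sqrt(5)    # normalized complex amplitude
# ...but the measurable probability |psi|^2 is real.
prob = abs(psi) ** 2
print(round(prob, 12))            # 1.0
```

    This mirrors how quantum theory works in general: complex amplitudes drive the machinery, yet every prediction compared against experiment comes out real.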

    Quantum theory’s prominent use of complex numbers, sums of imaginary and real numbers, was disconcerting to its founders, including physicist Erwin Schrödinger. “From the early days of quantum theory, complex numbers were treated more as a mathematical convenience than a fundamental building block,” says physicist Jingyun Fan of The Southern University of Science and Technology [南方科技大學](CN).

    Some physicists have attempted to build quantum theory using real numbers only, avoiding the imaginary realm with versions called “real quantum mechanics.” But without an experimental test of such theories, the question remained whether imaginary numbers were truly necessary in quantum physics, or just a useful computational tool.

    A type of experiment known as a Bell test resolved a different quantum quandary, proving that quantum mechanics really requires strange quantum linkages between particles called entanglement (SN: 8/28/15). “We started thinking about whether an experiment of this sort could also refute real quantum mechanics,” says theoretical physicist Miguel Navascués of The Institute for Quantum Optics and Quantum Information [Institut für Quantenoptik und Quanteninformation] of The Austrian Academy of Sciences [Österreichische Akademie der Wissenschaften](AT). He and colleagues laid out a plan for an experiment in a paper posted online at arXiv.org in January 2021 and published December 15 in Nature.

    In this plan, researchers would send pairs of entangled particles from two different sources to three different people, named according to conventional physics lingo as Alice, Bob and Charlie. Alice receives one particle, and can measure it using various settings that she chooses. Charlie does the same. Bob receives two particles and performs a special type of measurement to entangle the particles that Alice and Charlie receive. A real quantum theory, with no imaginary numbers, would predict different results than standard quantum physics, allowing the experiment to distinguish which one is correct.

    Fan and colleagues performed such an experiment using photons, or particles of light, they report in a paper to be published in Physical Review Letters. By studying how Alice, Charlie and Bob’s results compare across many measurements, Fan, Navascués and colleagues show that the data could be described only by a quantum theory with complex numbers.

    Another team of physicists conducted an experiment based on the same concept using a quantum computer made with superconductors, materials that conduct electricity without resistance. Those researchers, too, found that quantum physics requires complex numbers, they report in another paper to be published in Physical Review Letters. “We are curious about why complex numbers are necessary and play a fundamental role in quantum mechanics,” says quantum physicist Chao-Yang Lu of The University of Science and Technology of China [中国科学技术大学](CN) of the Chinese Academy of Sciences [中国科学院](CN), a coauthor of the study.

    But the results don’t rule out all theories that eschew imaginary numbers, notes theoretical physicist Jerry Finkelstein of DOE’s Lawrence Berkeley National Laboratory (US), who was not involved with the new studies. The study eliminated certain theories based on real numbers, namely those that still follow the conventions of quantum mechanics. It’s still possible to explain the results without imaginary numbers by using a theory that breaks standard quantum rules. But those theories run into other conceptual issues, making them “ugly,” he says. Still, “if you’re willing to put up with the ugliness, then you can have a real quantum theory.”

    Despite the caveat, other physicists agree that the quandaries raised by the new findings are compelling. “I find it intriguing when you ask questions about why is quantum mechanics the way it is,” says physicist Krister Shalm of The National Institute of Standards and Technology (US). Asking whether quantum theory could be simpler or if it contains anything unnecessary, “these are very interesting and thought-provoking questions.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 10:14 am on November 25, 2021 Permalink | Reply
    Tags: "Astronomers have found the Milky Way’s first known ‘feather’", "Science News (US)"

    From The University of Cologne [Universität zu Köln](DE) via Science News (US): “Astronomers have found the Milky Way’s first known ‘feather’” 

    From The University of Cologne [Universität zu Köln](DE)


    Science News (US)

    November 23, 2021
    Lisa Grossman

    The gaseous structure bridges two of the galaxy’s spiral arms.

    The Milky Way’s spiral arms, shown in this artist’s illustration, may have feathery bridges of gas connecting them, a new study suggests. Credit: JPL-Caltech (US)/The National Aeronautics and Space Administration (US).

    A long, thin filament of cold, dense gas extends jauntily from the galactic center, connecting two of the galaxy’s spiral arms, astronomers report November 11 in The Astrophysical Journal Letters.

    The team that discovered our galaxy’s feather named it the Gangotri wave, after the glacier that is the source of India’s longest river, the Ganges. In Hindi and other Indian languages, the Milky Way is called Akasha Ganga, “the river Ganga in the sky,” says astrophysicist Veena V.S. of the University of Cologne in Germany.

    She and colleagues found the Gangotri wave by looking for clouds of cold carbon monoxide gas, which is dense and easy to trace, in data from the APEX telescope in San Pedro de Atacama, Chile.

    ESO operates the Atacama Pathfinder Experiment, APEX, at one of the highest observatory sites on Earth, at an elevation of 5100 metres, high on the Chajnantor plateau in Chile’s Atacama region.

    The structure stretches 6,000 to 13,000 light-years from the Norma arm of the Milky Way to a minor arm near the galactic center called the 3-kiloparsec arm. So far, all other known gas tendrils in the Milky Way align with the spiral arms (SN: 12/30/15).

    The Gangotri wave has another unusual feature: waviness. The filament appears to wobble up and down like a sine wave over the course of thousands of light-years. Astronomers aren’t sure what could cause that, Veena says.

    Other galaxies have gaseous plumage, but when it comes to the Milky Way, “it’s very, very difficult” to map the galaxy’s structure from the inside out, she says. She hopes to find more galactic feathers and other bits of our galaxy’s structure. “One by one, we’ll be able to map the Milky Way.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Cologne [Universität zu Köln](DE) is a university in Cologne, Germany. It was the sixth university to be established in Central Europe and, although it closed in 1798 before being re-established in 1919, it is now one of the largest universities in Germany with more than 48,000 students. The University of Cologne is a German Excellence University.

    The University of Cologne was established in 1388 as the fourth university in the Holy Roman Empire, after the Charles University of Prague (1348), the University of Vienna (1365) and the Ruprecht Karl University of Heidelberg (1386). The charter was signed by Pope Urban VI. The university began teaching on 6 January 1389.

    In 1798, the university was abolished by the French, who had invaded Cologne in 1794, because under the new French constitution, many universities were abolished all over France. The last rector Ferdinand Franz Wallraf was able to preserve the university’s Great Seal, now once more in use.

    In 1919, the Prussian government endorsed a decision by the Cologne City Council to re-establish the university. This was considered to be a replacement for the loss of the University of Strasbourg on the west bank of the Rhine, which contemporaneously reverted to France with the rest of Alsace. On 29 May 1919, the Cologne Mayor Konrad Adenauer signed the charter of the modern university.

    At that point, the new university was located in Neustadt-Süd, but relocated to its current campus in Lindenthal on 2 November 1934. The old premises are now being used for the Cologne University of Applied Sciences.

    Initially, the university was composed of the Faculty of Business, Economics and Social Sciences (successor to the Institutes of Commerce and of Communal and Social Administration) and the Faculty of Medicine (successor to the Academy of Medicine). In 1920, the Faculty of Law and the Faculty of Arts were added, from which latter the School of Mathematics and Natural Sciences was split off in 1955 to form a separate Faculty. In 1980, the two Cologne departments of the Rhineland School of Education were attached to the university as the Faculties of Education and of Special Education. In 1988, the university became a founding member of the Community of European Management Schools and International Companies (CEMS), today’s Global Alliance in Management Education.

    The University is a leader in the area of economics and is regularly placed in top positions for law and business, both for national and international rankings.
