
  • richardmitnick 11:12 am on November 23, 2020
    Tags: "To explain away dark matter gravity would have to be really weird cosmologists say"

    From Science Magazine: “To explain away dark matter, gravity would have to be really weird, cosmologists say” 

    From Science Magazine

    Nov. 20, 2020
    Adrian Cho

    The spatial distribution of more than 4 million galaxies as measured by the Sloan Digital Sky Survey, which can’t be easily explained by modifying gravity. Credit: SDSS (CC-BY)

    Dark Matter, the invisible stuff whose gravity is thought to hold galaxies together, may be the least satisfying concept in physics. But if you want to get rid of it, a new study finds, you’ll need to replace it with something even more bizarre: a force of gravity that, at some distances, pulls massive objects together and, at other distances, pushes them apart. The analysis underscores how hard it is to explain away dark matter.

    Concocting such a theory of gravity “is so complicated that it seems very unlikely that anyone could come up with a scenario that would work,” says Scott Dodelson, a theoretical physicist at Carnegie Mellon University, who wasn’t involved in the new work. Still, some theorists say it may be possible to pass the test.

    According to cosmologists’ prevailing theory, dark matter pervades pretty much every galaxy, providing the extra gravity that keeps stars from swirling out into space, given the speeds at which astronomers see the galaxies rotating. A vast web of clumps and strands of the stuff served as the scaffolding on which the cosmos developed. Yet, after decades of trying, physicists haven’t spotted particles of dark matter floating around, and many would happily dismiss the idea—if it didn’t work so well.

    The XENON1T detector, which ran from 2016 to 2018, may have seen signs of exotic particles—or not.
    Enrico Sacchetti/Science Source

    XENON1T at the Laboratori Nazionali del Gran Sasso, located in the Abruzzo region of central Italy.

    Some scientists have tried to kick the dark matter habit. In 1983, Israeli physicist Mordehai Milgrom found he could account for the high speeds of stars swirling around the peripheries of galaxies by modifying Isaac Newton’s famous second law of motion: force equals mass times acceleration.
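
    In textbook form (a standard statement of Modified Newtonian Dynamics, or MOND, not quoted from the article), Milgrom's modification alters the second law only at very low accelerations:

    ```latex
    F = m\,\mu\!\left(\frac{a}{a_0}\right)a,
    \qquad \mu(x) \to 1 \ \text{for}\ x \gg 1,
    \qquad \mu(x) \to x \ \text{for}\ x \ll 1,
    ```

    where the acceleration scale is roughly $a_0 \approx 1.2 \times 10^{-10}\ \mathrm{m/s^2}$. In the low-acceleration limit, equating the gravitational force $GMm/r^2$ with $m a^2/a_0$ and using $a = v^2/r$ gives $v^4 = G M a_0$: the orbital speed becomes independent of radius, matching the unexpectedly high stellar speeds at galaxy peripheries that motivated the idea.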

    Mordehai Milgrom, MOND theorist, is an Israeli physicist and professor in the department of Condensed Matter Physics at the Weizmann Institute in Rehovot, Israel

    That insight suggested the need for dark matter could be obviated by changing the law of gravity, at least on the scale of individual galaxies. But theorists labored for decades to turn the idea into a coherent theory of gravity akin to Albert Einstein’s general theory of relativity, and to do so, they had to add new fields, cousins of the usual gravitational field.

    But to do away with dark matter, theorists would also need to explain away its effects on much larger, cosmological scales. And that is much harder, argue Kris Pardo, a cosmologist at NASA’s Jet Propulsion Laboratory, and David Spergel, a cosmologist at Princeton University. To make their case, they compare the distribution of ordinary matter in the early universe as revealed by measurements of the afterglow of the big bang—the cosmic microwave background (CMB)—with the distribution of the galaxies today.

    The evolution of the universe is a tale of two fluids: dark matter, which doesn’t interact with light, and ordinary matter, which does. The big bang left ripples in the dark matter, which under its own gravity began to coalesce into the denser spots. Ordinary matter—then, a hot soup of free-flying protons and electrons—also began to fall into the dark matter clumps. However, those charged particles themselves generated radiation that pushed them back out, creating sound waves known as baryon acoustic oscillations. The waves continued to spread until the universe cooled enough to form neutral atoms, 380,000 years after the big bang, when the CMB was born. The sound waves left their imprint on the CMB and, faintly, in the distribution of the galaxies.

    CMB per ESA/Planck

    Or could that evolution be explained with only ordinary matter interacting through modified gravity? To explore that possibility, Pardo and Spergel derived a mathematical function that describes how gravity would have had to work to get from the distribution of ordinary matter revealed by the CMB to the current distribution of the galaxies. They found something striking: That function must swing between positive and negative values, meaning gravity would be attractive at some length scales and repulsive at others, Pardo and Spergel report this week in Physical Review Letters. “And that’s superweird,” Pardo says.

    The strange behavior is required to explain how the larger baryon acoustic oscillation faded over cosmic time while the smaller galaxies emerged, Pardo says. Just as Milgrom did with individual galaxies, the new work shows how, without dark matter, gravity would have to change to explain the universe’s large-scale structure, Dodelson says. But that change would have to be radical, he says. “They’re demonstrating that to do that you have to jump through these 13 hoops,” he says.

    However, theorists already seem prepared to jump through those hoops. In a paper posted in June to the preprint server arXiv [A new relativistic theory for Modified Newtonian Dynamics], theoretical cosmologists Constantinos Skordis and Tom Złosnik of the Czech Academy of Sciences present a dark matter–less theory of modified gravity they say jibes with CMB data. To do that, researchers add to a theory like general relativity an additional, tunable field called a scalar field. It has energy, and through Einstein’s equivalence of mass and energy, it can behave like a form of mass. Set things up just right and at large spatial scales, the scalar field interacts only with itself and acts like dark matter.

    The team hasn’t explicitly shown that the theory, which isn’t meant to be a fundamental theory of gravity, passes Pardo’s and Spergel’s particular test. But because it’s designed to mimic dark matter, it ought to, Skordis says. “We engineered it to have that behavior.”

    Skordis’s and Złosnik’s paper is “very exciting,” Pardo says. But he notes that in some sense it merely replaces one mysterious thing—dark matter—with another—a carefully tuned scalar field. Given the complications, Pardo says, “dark matter is kind of the easier explanation.”

    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the motion of the Coma Cluster. Vera Rubin, a woman in STEM who was denied the Nobel Prize, did much of the foundational work on dark matter.

    Fritz Zwicky.

    Coma cluster via NASA/ESA Hubble.

    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
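
    Zwicky's style of inference can be sketched with the virial theorem, which relates a cluster's galaxy speeds to the mass needed to hold it together. The numbers below are modern, order-of-magnitude values chosen for illustration (the velocity dispersion, cluster radius, and prefactor are assumptions, not figures from the article or from Zwicky's own papers):

    ```python
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30     # solar mass, kg
    MPC = 3.086e22       # one megaparsec in meters

    def virial_mass(sigma_m_s, radius_m, prefactor=5.0):
        """Order-of-magnitude virial estimate: M ~ prefactor * sigma^2 * R / G."""
        return prefactor * sigma_m_s**2 * radius_m / G

    # Ballpark modern numbers for a Coma-like cluster:
    # line-of-sight velocity dispersion ~1000 km/s, radius ~1 Mpc.
    M_dyn = virial_mass(1.0e6, 1.0 * MPC)
    print(f"dynamical mass ~ {M_dyn / M_SUN:.1e} solar masses")
    ```

    An estimate on the order of 10^15 solar masses dwarfs the mass visible in a cluster's stars, which is the same kind of mismatch Zwicky reported.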

    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that the outskirts of galaxies rotate just as fast as their centers, whereas, like the outer planets of the solar system, stars far from the center should orbit more slowly. The only way to explain this is if the visible galaxy is merely the central part of some much larger structure (as if it were just the label on a vinyl LP, so to speak) whose unseen mass keeps the rotation speed consistent from center to edge.
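
    The contrast Rubin observed can be sketched numerically. This is a minimal illustration with toy numbers (the galaxy mass and radii are assumptions, not values from the article): if essentially all of a galaxy's visible mass sat near its center, Newtonian gravity predicts orbital speed falling as one over the square root of radius, whereas measured rotation curves stay roughly flat.

    ```python
    import math

    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30       # solar mass, kg
    KPC = 3.086e19         # one kiloparsec in meters

    def keplerian_speed(r_m, enclosed_mass_kg):
        """Circular-orbit speed when the enclosed mass dominates: v = sqrt(G*M/r)."""
        return math.sqrt(G * enclosed_mass_kg / r_m)

    # Toy galaxy: 1e11 solar masses of visible matter concentrated near the center.
    M_visible = 1e11 * M_SUN

    for r_kpc in (5, 10, 20, 40):
        v = keplerian_speed(r_kpc * KPC, M_visible) / 1000.0  # km/s
        print(f"r = {r_kpc:3d} kpc  ->  v = {v:6.1f} km/s")
    ```

    In this sketch the predicted speed halves each time the radius quadruples; real galaxies instead show nearly constant speeds out to large radii, which is the discrepancy that dark matter (or modified gravity) is invoked to explain.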

    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).

    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970.

    The Vera C. Rubin Observatory, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 10:33 am on November 23, 2020
    Tags: Aimy Bazylak, Chelsea Rochman, Diana Fu, Kelly O’Brien, Kevin Lewis O’Neill, Marc Cadotte, Royal Society of Canada, "Six ways U of T researchers are working to make our lives better"

    From University of Toronto (CA): “Six ways U of T researchers – and the Royal Society of Canada – are working to make our lives better” 

    U Toronto Bloc

    From University of Toronto (CA)

    Royal Society of Canada

    November 19, 2020
    Rahul Kalvapalle

    Clockwise from top left: Aimy Bazylak, Chelsea Rochman, Kelly O’Brien, Kevin Lewis O’Neill, Marc Cadotte and Diana Fu

    From clean energy and water to urban biodiversity, disability in HIV patients and labour activism in China, the next generation of researchers at the University of Toronto is engaged in tackling some of the most difficult – and pressing – questions of our time.

    Six of those researchers were recently recognized for their leadership, talent and the transformative potential of their research by being elected to the Royal Society of Canada (RSC)’s College of New Scholars, Artists and Scientists.

    Their appointment makes them members of an exclusive group who have demonstrated excellence and extraordinary productivity at an early stage in their career, and whose perspectives and expertise will strengthen the College of New Scholars, Artists and Scientists’ mission of harnessing interdisciplinary approaches to generate ideas and solutions.

    “Our researchers’ strength across the full breadth of areas of scholarship fuels an extraordinary range of cross-disciplinary collaborations – work that is vital to developing creative and sustainable solutions for the challenges facing Canada and the world,” says University Professor Ted Sargent, U of T’s vice-president, research and innovation, and strategic initiatives.

    The new members of the college will be honoured at the Royal Society of Canada’s 2020 Celebration of Excellence and Engagement, a week-long exploration of scholarly, scientific and artistic topics. They will be recognized alongside 12 leading U of T researchers, across a diverse array of disciplines, who were named fellows of the society this year, as well as two faculty members who won Royal Society of Canada medals: Barbara Sherwood Lollar of the department of Earth sciences and Marla Sokolowski of the department of ecology and evolutionary biology – both in the Faculty of Arts & Science.

    U of T is a presenting sponsor of the event, which runs from Nov. 23 to Nov. 29 and includes a virtual symposium examining the impact and legacy of the discovery of insulin by U of T scientists 100 years ago.

    “These researchers contribute to a culture of curiosity, creativity and collaboration across the university that is driving discovery and innovation,” Sargent says. “I’m confident they will bring that same energy and dedication to the Royal Society of Canada and that their leadership will have a significant impact on the lives of Canadians – both now and well into the future.”

    Meet the researchers

    Aimy Bazylak

    Aimy Bazylak, a professor in the department of mechanical and industrial engineering in the Faculty of Applied Science & Engineering, is advancing the development of clean energy technologies such as fuel cells, batteries and electrolyzers. These technologies, Bazylak says, “will help everyone have a cleaner society.”

    A fellow of the American Society of Mechanical Engineers, Bazylak holds a Canada Research Chair in Thermofluidics for Clean Energy and was named this year’s winner of the McLean award for her contributions to fuel cell and electrolyzer technology.

    She attributes her passion for green technologies to the enthusiasm and drive of her students.

    “What inspires me to do what I do is the students I work with every day – from undergraduate to graduate students – at the University of Toronto,” says Bazylak. “The students I work with are so driven for a clean energy society.

    “That inspiration really permeates through everything that they do and inspires me as well.”

    Marc Cadotte

    U of T Scarborough ecologist Marc Cadotte is a prolific researcher. Having published more than 150 articles and accumulated over 11,000 citations, he has been listed among Web of Science’s most cited environmental scientists since 2017.

    For Cadotte, appointment to the college represents an opportunity to share his expertise on an issue of universal concern: how human activity affects the ecosystems around us and how biodiversity can be preserved and its benefits maximized.

    “I grew up as a child in northern Ontario and then I moved to southern Ontario as a teenager,” says Cadotte, a professor in the department of biological sciences. “I went from seeing bears and moose in my backyard to moving to a place where the landscape was fundamentally transformed by human activity. I used to be able to find nature outside my back door, and then I had to go searching for it.

    “So, what inspired me was this fundamental change I saw in the environment around me. I wanted to understand why we have these impacts.”

    Kevin Lewis O’Neill

    Kevin Lewis O’Neill is a cultural anthropologist who, working principally in Guatemala City, explores “questions of religion and politics with a philosophical interest in matters of belonging and exclusion.”

    O’Neill, a professor in the department for the study of religion in the Faculty of Arts & Science, says his research is inspired and driven by two key factors. “One is the politics of the research – I study matters of deportation and citizenship and matters of security throughout the Americas and the political stakes of these issues have never been higher,” says O’Neill, who is also director of the Centre for Diaspora & Transnational Studies.

    “The other part is that there is a pleasure to ethnographic field work. I have found something exceedingly pleasurable about the intensity of the research, the kinds of relationships I’m able to establish with local stakeholders and the kinds of questions I’m able to pursue.”

    Diana Fu

    Diana Fu’s professorial career has involved considerable overseas field work. An associate professor in the department of political science at the University of Toronto Scarborough, Fu spent two years studying informal labour organizations in China. Her research examines various aspects of politics and activism in China, generating important insights for the discourse around Canada-China relations.

    “Canadians need China competency now more than ever, and this is what I hope my research contributes to,” says Fu, who is director of the East Asia seminar series at the Munk School of Global Affairs & Public Policy.

    Chelsea Rochman

    The work of Chelsea Rochman focuses on an environmental issue of national and global importance: plastic pollution and its impact on marine and freshwater ecosystems and wildlife. Rochman, an assistant professor in the department of ecology and evolutionary biology in the Faculty of Arts & Science, has studied the effects of microplastics – such as the billions of tiny fibres laundered from our clothing – on our water and has written about the dangers of this pollution.

    She notes that the Canadian government has increasingly been prioritizing programs to build a more sustainable plastic economy and mitigate existing plastic pollution.

    “For me, this is exciting,” she says. “I’ve been inspired by the idea or issue of waste for a long time – since I was a child – and I’m really excited to have a career being able to both research it as well as work within our own community locally, but also globally, on tackling this issue.”

    Kelly O’Brien

    Kelly O’Brien is an associate professor in the department of physical therapy in the Temerty Faculty of Medicine. Her research focuses on episodic disability and rehabilitation in the context of chronic disease and HIV.

    She is cross-appointed to the Rehabilitation Sciences Institute as well as the Institute of Health Policy, Management and Evaluation at the Dalla Lana School of Public Health.

    “Over the years,” O’Brien says, “I’ve had the opportunity to learn from and collaborate with a number of mentors and colleagues – both academic and community-based experts – including people living with HIV, clinicians, researchers and representatives of community organizations who are dedicated to identifying new and emerging research priorities in the field and advancing research in practice to improve health outcomes and access to rehabilitation for people with chronic disease.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    University of Toronto (CA) has evolved into Canada’s leading institution of learning, discovery and knowledge creation. We are proud to be one of the world’s top research-intensive universities, driven to invent and innovate.

    Our students have the opportunity to learn from and work with preeminent thought leaders through our multidisciplinary network of teaching and research faculty, alumni and partners.

    The ideas, innovations and actions of more than 560,000 graduates continue to have a positive impact on the world.

  • richardmitnick 9:59 am on November 23, 2020
    Tags: "Scientists organize to tackle crisis of coral bleaching", Marine ecology

    From Ohio State University: “Scientists organize to tackle crisis of coral bleaching” 

    From Ohio State University

    Nov 23, 2020
    Laura Arenschield
    Ohio State News

    Bleached coral in the Red Sea. Credit: Anna Roik

    New common framework will help research into climate-threatened ocean reefs.

    An international consortium of scientists has created the first-ever common framework for increasing comparability of research findings on coral bleaching.

    “Coral bleaching is a major crisis and we have to find a way to move the science forward faster,” said Andréa Grottoli, a professor of earth sciences at The Ohio State University and lead author of a paper on guidelines published online Saturday in the journal Ecological Applications.

    The common framework covers a broad range of variables that scientists generally monitor in their experiments, including temperature, water flow, light and others. It does not dictate what levels of each should be present during an experiment into the causes of coral bleaching; rather, it offers a common framework for increasing comparability of reported variables.

    “Our goal was to create a structure that would allow researchers to anchor their studies, so we would have a common language and common reference points for comparing among studies,” said Grottoli, who also is director of the consortium that developed the common framework.

    Coral bleaching is a significant problem for the world’s ocean ecosystems: When coral becomes bleached, it loses the algae that live inside it, turning it white. Coral can survive a bleaching but being bleached puts coral at higher risk for disease and death. And that can be catastrophic: Coral protects coastlines from erosion, offers a boost to tourism in coastal regions, and is an essential habitat to more than 25% of the world’s marine species.

    Bleaching events have been happening with greater frequency and in greater numbers as the world’s atmosphere — and oceans — have warmed because of climate change.

    “Reefs are in crisis,” Grottoli said. “And as scientists, we have a responsibility to do our jobs as quickly, cost-effectively, professionally and as well as we can. The proposed common framework is one mechanism for enhancing that.”

    The consortium leading this effort is the Coral Bleaching Research Coordination Network, an international group of coral researchers. Twenty-seven scientists from the network, representing 21 institutions around the world, worked together as part of a workshop at Ohio State in May 2019 to develop the common framework.

    The goal, Grottoli said, is to allow scientists to compare their work, make the most of the coral samples they collect, and find ways to create a common framework for coral experimentation.

    Their recommendations include guidelines for experiments that help scientists understand what happens when coral is exposed to changes in light or temperature over a short period of time, a moderate period, and long periods. The guidelines include a compendium of the most common methods used for recording and reporting physical and biological parameters in a coral bleaching experiment.

    That such a framework hasn’t already been established is not surprising: The scientific field that seeks to understand the causes of and solutions for coral bleaching is relatively young. The first reported bleaching occurred in 1971 in Hawaii; the first widespread bleaching event was reported in Panama and was connected with the 1982-83 El Niño.

    But experiments to understand coral bleaching didn’t really start in earnest until the 1990s — and a companion paper by many of the same authors found that two-thirds of the scientific papers about coral bleaching have been published in the last 10 years.

    Researchers are still trying to understand why some coral species seem to be more vulnerable to bleaching than others, Grottoli said, and setting up experiments with consistency will help the science move forward more quickly and economically.

    “Adopting a common framework for experiments around coral bleaching would make us more efficient as a discipline,” Grottoli said.

    “We’d be able to better collaborate, and to build on one another’s work more easily. It would help us progress in our understanding of coral bleaching — and because of climate change and the vulnerability of the coral, we need to progress more quickly.”

    Other Ohio State researchers who are co-authors on the paper are graduate students Rowan McLachlan, James T. Price and Kerri L. Dobson.

    This work was funded by the National Science Foundation.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    The Ohio State University (OSU, commonly referred to as Ohio State) is a public research university in Columbus, Ohio. Founded in 1870 as a land-grant university and the ninth university in Ohio with the Morrill Act of 1862, the university was originally known as the Ohio Agricultural and Mechanical College. The college originally focused on various agricultural and mechanical disciplines but it developed into a comprehensive university under the direction of then-Governor (later, U.S. President) Rutherford B. Hayes, and in 1878 the Ohio General Assembly passed a law changing the name to “The Ohio State University”. The main campus in Columbus, Ohio, has since grown into the third-largest university campus in the United States. The university also operates regional campuses in Lima, Mansfield, Marion, Newark, and Wooster.

    The university has an extensive student life program, with over 1,000 student organizations; intercollegiate, club and recreational sports programs; student media organizations and publications, fraternities and sororities; and three student governments. Ohio State athletic teams compete in Division I of the NCAA and are known as the Ohio State Buckeyes. As of the 2016 Summer Olympics, athletes from Ohio State have won 104 Olympic medals (46 gold, 35 silver, and 23 bronze). The university is a member of the Big Ten Conference for the majority of sports.

  • richardmitnick 9:36 am on November 23, 2020
    Tags: "Growing Interest in Moon Resources Could Cause Tension Scientists Find", Lunar reconnaissance

    From Harvard-Smithsonian Center for Astrophysics: “Growing Interest in Moon Resources Could Cause Tension, Scientists Find” 

    Harvard Smithsonian Center for Astrophysics

    From Harvard-Smithsonian Center for Astrophysics

    November 23, 2020

    Amy Oliver
    Public Affairs
    Center for Astrophysics | Harvard & Smithsonian
    Fred Lawrence Whipple Observatory

    Taken by NASA’s Lunar Reconnaissance Orbiter, this image of the moon is part of the collection of the highest resolution, near-global topographic maps of the moon ever created. Overlaid on this image are some of the hotspots identified for cosmology telescopes on the moon; few ideal locations for these telescopes exist on the moon, as others conflict with the radio quiet zone. Credit: NASA/Goddard Space Flight Center/DLR (DE)/ASU; Overlay: M. Elvis, A. Krolikowski, T. Milligan.

    NASA/Lunar Reconnaissance Orbiter

    Lunar cold traps, located at the South Pole of the moon, are critical to all moon-based operations because they contain frozen water molecules. Water is required for all moon-based operations because it is needed to grow food and to break down into oxygen for breathing and hydrogen for fuel. The four white-circled regions in this image contain the coldest terrain, with average annual near-surface temperatures of 25-50 K. They are about 50 km across. Credit: David Paige, reproduced with permission.

    An international team of scientists led by the Center for Astrophysics | Harvard & Smithsonian has identified a problem with the growing interest in extractable resources on the moon: there aren’t enough of them to go around. With no international policies or agreements to decide “who gets what from where,” scientists believe tensions, overcrowding, and quick exhaustion of resources to be one possible future for moon mining projects. The paper was published today in Philosophical Transactions of the Royal Society A.

    “A lot of people think of space as a place of peace and harmony between nations. The problem is there’s no law to regulate who gets to use the resources, and there are a significant number of space agencies and others in the private sector that aim to land on the moon within the next five years,” said Martin Elvis, astronomer at the Center for Astrophysics | Harvard & Smithsonian and the lead author on the paper. “We looked at all the maps of the Moon we could find and found that not very many places had resources of interest, and those that did were very small. That creates a lot of room for conflict over certain resources.”

    Resources like water and iron are important because they will enable future research to be conducted on, and launched from, the moon. “You don’t want to bring resources for mission support from Earth; you’d much rather get them from the moon. Iron is important if you want to build anything on the moon; it would be absurdly expensive to transport iron to the moon,” said Elvis. “You need water to survive; you need it to grow food—you don’t bring your salad with you from Earth—and to split into oxygen to breathe and hydrogen for fuel.”

    Interest in the moon as a location for extracting resources isn’t new. An extensive body of research dating back to the Apollo program has explored the availability of resources such as helium, water, and iron, with more recent research focusing on continuous access to solar power, cold traps and frozen water deposits, and even volatiles that may exist in shaded areas on the surface of the moon. Tony Milligan, a Senior Researcher with the Cosmological Visionaries project at King’s College London, and a co-author on the paper said, “Since lunar rock samples returned by the Apollo program indicated the presence of Helium-3, the moon has been one of several strategic resources which have been targeted.”

    Although some treaties do exist, like the 1967 Outer Space Treaty—prohibiting national appropriation—and the 2020 Artemis Accords—reaffirming the duty to coordinate and notify—neither is meant for robust protection. Much of the discussion surrounding the moon, including current and potential policy for governing missions to the satellite, has centered on scientific versus commercial activity, and who should be allowed to tap into the resources locked away in, and on, the moon. According to Milligan, it’s a very 20th century debate, and doesn’t tackle the actual problem. “The biggest problem is that everyone is targeting the same sites and resources: states, private companies, everyone. But they are limited sites and resources. We don’t have a second moon to move on to. This is all we have to work with.”

    Alanna Krolikowski, assistant professor of science and technology policy at Missouri University of Science and Technology (Missouri S&T) and a co-author on the paper, added that a framework for success already exists and, paired with good old-fashioned business sense, may set policy on the right path. “While a comprehensive international legal regime to manage space resources remains a distant prospect, important conceptual foundations already exist and we can start implementing, or at least deliberating, concrete, local measures to address anticipated problems at specific sites today,” said Krolikowski. “The likely first step will be convening a community of prospective users, made up of those who will be active at a given site within the next decade or so. Their first order of business should be identifying worst-case outcomes, the most pernicious forms of crowding and interference, that they seek to avoid at each site. Loss aversion tends to motivate actors.”

    There is still a risk that resource locations will turn out to be more scant than currently believed, and scientists want to go back and get a clearer picture of resource availability before anyone starts digging, drilling, or collecting. “We need to go back and map resource hot spots in better resolution. Right now, we only have a few miles at best. If the resources are all contained in a smaller area, the problem will only get worse,” said Elvis. “If we can map the smallest spaces, that will inform policymaking, allow for info-sharing and help everyone to play nice together so we can avoid conflict.”

    While more research on these lunar hot spots is needed to inform policy, the framework for possible solutions to potential crowding is already in view. “Examples of analogs on Earth point to mechanisms for managing these challenges. Common-pool resources on Earth, resources over which no single actor can claim jurisdiction or ownership, offer insights to glean. Some of these are global in scale, like the high seas, while others are local, like fish stocks or lakes to which several small communities share access,” said Krolikowski, adding that one of the first challenges for policymakers will be to characterize the resources at stake at each individual site. “Are these resources, say, areas of real estate at the high-value Peaks of Eternal Light, where the sun shines almost continuously, or are they units of energy to be generated from solar panels installed there? At what level can they realistically be exploited? How should the benefits from those activities be distributed? Developing agreement on those questions is a likely precondition to the successful coordination of activities at these uniquely attractive lunar sites.”

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory (SAO) is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory (HCO), founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

  • richardmitnick 12:23 pm on November 22, 2020 Permalink | Reply
    Tags: "NASA to test its SLS megarocket in the coming weeks", , ,   

    From NASA via EarthSky: “NASA to test its SLS megarocket in the coming weeks” 

    NASA image
    From NASA




    November 21, 2020
    Lia Rovira

    The ongoing pandemic has slowed testing for NASA’s Space Launch System megarocket, but the process is resuming and has checked off a key milestone: powering up the core stage.

    An artist’s rendering of the first Space Launch System (SLS) vehicle with Orion spacecraft on the pad before launch. The Orion spacecraft and SLS megarocket – together called Block 1 – and the ground systems at Kennedy Space Center in Cape Canaveral, Florida, will be part of the Artemis 1 lunar mission, scheduled for launch in 2021. Credit: NASA/Wikipedia.

    Although the coronavirus pandemic has slowed testing of NASA’s Space Launch System – a rocket more powerful than the Saturn V that propelled the first astronauts to the moon – the months-long process is finally resuming at the agency’s Stennis Space Center in Mississippi. Boeing, the company NASA contracted to lead the rocket’s construction, is now engaged in an eight-step core testing process dubbed the green run. It’ll culminate in a hot-fire test, in which the rocket will be tied down but will fire up its engines and endure each step of a launch as if it were really taking place. Originally scheduled for early to mid-November 2020, this final testing is now expected within the next three to six weeks, NASA says. The agency hopes to meet this goal to keep its schedule on track for the rocket’s debut launch on the Artemis 1 lunar mission in mid-to-late 2021.

    Clyde Sellers, a security specialist at the NASA center, told EarthSky:

    “It’s extremely gratifying to watch. It’s the first time this test has run and for a new, original rocket, the most powerful rocket ever built.”

    Although the green run series started with a modal test – a kind of vibration testing – conducted in January 2020, the process has been slowed considerably by the coronavirus that has swept the world. Agency leadership halted on-site work at Stennis after the pandemic struck the region in March. The center began reopening slowly in mid-May, and the green run team completed their second test on the core stage (the orange “body” of the rocket) in late June.

    That test ensured that the software and other electrical interfaces involved in the rocket and the testing stand work properly.

    The rocket has since undergone and passed the next four steps of the green run series:

    – Test 3, in which engineers inspected all the safety systems that shut down operations during testing. During this test, they simulated potential problems.

    – Test 4, the first test of each of the main propulsion system components that connect to the engines. Command and control operations were verified, and the core stage was checked for leaks in fluid or gas.

    – Test 5, in which engineers ensured the thrust vector control system can move the four engines and checked all the related hydraulic systems.

    – Test 6, which simulated the launch countdown, including step-by-step fueling procedures. Core stage avionics were powered on, and propellant loading and pressurization were simulated. The test team exercised and validated the countdown timeline and sequence of events.

    This helpful graphic illustrates what the 8 parts of the green run will test, as well as the individual components of the SLS Core Stage (orange rocket body). Credit: NASA.

    The final two tests scheduled for the next month or so – test 7 and test 8 – will be a “wet dress rehearsal,” in which the rocket stage is loaded with fuel, followed by the full hot-fire test to ensure the vehicle is truly ready for launch. It’s an intense procedure, but one that’s crucial for engineers to feel confident the vehicle is safe.

    After the hot-fire test, engineers will refurbish the core stage and configure it for its journey to NASA’s Kennedy Space Center in Florida, where still more tests await. But eventually, if all goes well, the next time the RS-25 engines fire will be for NASA’s uncrewed Artemis 1 – the first in a series of increasingly complex missions that will enable human exploration of the moon and Mars – and perhaps, one day, deep space.

    The orange SLS core stage is being tested on the B-2 test stand at NASA’s Stennis Space Center. Image via NASA.

    The core stage will later be assembled with the other parts of the rocket and the Orion spacecraft, the crew module designed to carry humans into space.

    Orion spacecraft. Credit: NASA

    Drawing from more than half a century of research and development, the Orion module is designed to be flexible enough to carry humans to a variety of destinations beyond our own moon. The abort system, which will provide the crew with the ability to escape if an emergency occurs on the launch pad, was successfully tested at White Sands Missile Range in New Mexico back in 2010. A series of launch and landing simulations at NASA’s Hydro Impact Basin tested how the module will fare when it splashes down in the ocean at the end of its mission.

    Orion’s testing wrapped up in 2018 after a series of parachute falls, and it is expected to fly in the first Artemis launch.

    Unlike previous human launch systems, SLS is designed to grow and evolve over time, with system flexibility that allows engineers to use one design today but adapt it later to future missions. Sellers added:

    “SLS will advance our understanding of our solar system and mankind’s capabilities.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories [Hubble, Chandra, Spitzer] and associated programs. NASA shares data with various national and international organizations, such as the [JAXA] Greenhouse Gases Observing Satellite.

  • richardmitnick 11:55 am on November 22, 2020 Permalink | Reply
    Tags: "Will SpaceX’s Starlink satellites ruin stargazing?", , , , ,   

    From EarthSky: “Will SpaceX’s Starlink satellites ruin stargazing?” 


    From EarthSky

    November 22, 2020
    Samantha Lawler, University of Regina (CA)

    A shooting star during the Perseid meteor shower. Soon, thousands of satellites will crowd the night sky. Credit: Shutterstock/ The Conversation.

    I walk outside my rural Saskatchewan house before dawn and look up, expecting to have my breath taken away by the sheer number of stars overhead. I’m a professional astronomer, but I still appreciate unaided-eye stargazing as much as an eager child. This is the first place I’ve lived that’s dark enough to easily see the Milky Way, and I’m stunned and awed every time I look up.

    This time though, I curse softly. There’s a bright satellite. And another following behind. And another. And another.

    I used to be excited about seeing artificial satellites, but now I know what’s coming. We’re about to undergo a dramatic transition in our experience of satellites. No longer will you escape your city for a camping trip and see the stars unobstructed: you will have to look through a grid of crawling, bright satellites no matter how remote your location.

    Crowded orbits

    If mega-constellations of satellites become reality, the night sky will become a mundane highway of moving lights, obscuring the stars. Now, every time I see the bright reflection of a satellite tracking across the stars, I am reminded of what has already been approved by the United States Federal Communications Commission, the agency that regulates frequencies broadcast by satellites over the U.S., effectively putting itself in charge of regulating every space launch on the planet.

    SpaceX has already received approval for 12,000 Starlink satellites and is seeking approval for 30,000 more. Other companies are not far behind.

    The Starlink mega-constellation itself would increase the number of active satellites more than tenfold: there are around 3,000 active satellites in orbit; current Starlinks are brighter than 99% of them because they are in lower orbits, closer to the surface of Earth, and more reflective than Starlink engineers predicted.
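    The brightness penalty from orbit height alone can be sketched with the inverse-square law. Here is a minimal illustration; the 550 km Starlink altitude and the ~1,200 km altitude of a hypothetical higher satellite are assumed figures for comparison, not numbers from the article, and real brightness also depends on reflectivity, as the article notes.

```python
import math

def magnitude_penalty(near_km, far_km):
    """Apparent-magnitude difference due to distance alone, assuming
    inverse-square dimming: delta_m = 2.5 * log10((far/near)**2)."""
    return 2.5 * math.log10((far_km / near_km) ** 2)

# A satellite at ~550 km (assumed Starlink altitude) vs. an otherwise
# identical satellite at an assumed ~1,200 km orbit.
print(round(magnitude_penalty(550, 1200), 1))   # ~1.7 magnitudes brighter up close
```

    Under these assumptions, distance alone makes the lower satellite almost two magnitudes brighter, before any difference in surface reflectivity is counted.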

    SpaceX is launching sets of 60 satellites every couple of weeks, and there will be a thousand Starlinks in orbit by Christmas 2020.

    Time-lapse photography of the Lyrid meteor shower from April 2020. At the 0:50 mark, a train of Starlink satellites zooms through the landscape.

    With the unaided eye, stargazing from a dark-sky location allows you to see about 4,500 stars. From a typical suburban location, you can see about 400. Simulations show that from 52 degrees north (the latitude of both Saskatoon and London, U.K.) hundreds of Starlinks will be visible for a couple of hours after sunset and before sunrise (comparable to the number of visible stars) and dozens of these will be visible all night during the summer months.
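    The “couple of hours after sunset and before sunrise” window follows from shadow geometry: a low satellite stays sunlit until the Sun sinks far enough below the horizon that the satellite falls into Earth’s shadow. A rough sketch for a satellite passing directly overhead, using a cylindrical-shadow approximation; the 550 km altitude is an assumed Starlink figure, not stated in the article:

```python
import math

def shadow_entry_depression_deg(altitude_km, earth_radius_km=6371.0):
    """Solar depression angle (degrees below the observer's horizon) at which
    a satellite passing directly overhead at the given altitude enters
    Earth's shadow, approximating the shadow as a cylinder of Earth's radius."""
    r = earth_radius_km
    return math.degrees(math.acos(r / (r + altitude_km)))

# For an assumed ~550 km Starlink orbit:
print(round(shadow_entry_depression_deg(550), 1))   # ~23.0 degrees
```

    Since the Sun descends at most ~15 degrees per hour (more slowly at high latitudes), an overhead satellite at this altitude can remain sunlit well over an hour past sunset, and at 52 degrees north in summer the Sun never gets that far below the horizon, consistent with the simulations’ all-night visibility.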

    Light pollution has long been a threat to stargazing, but at least that can be escaped by leaving urban centers.

    But satellites will be a global star-obscuring phenomenon, particularly bad at the latitudes of northern U.S. states, Canada and much of Europe.

    Stellar sacrifices

    To their credit, SpaceX and Amazon – which is also investing in satellite internet services – have voluntarily started participating in discussions with professional astronomers on possible ways to mitigate the effects of thousands of bright satellites on specific observations, like interstellar objects.

    SpaceX did also try a “darksat” coating, though preliminary measurements by astronomers showed that it was only marginally fainter than other Starlinks. Meanwhile, launches continue with unmitigated, bright Starlinks.

    Simulations show that professional astronomy [The Astronomical Journal] and amateur astrophotography will be severely affected by bright mega-constellations. Discoveries of hazardous near-Earth asteroids will be particularly devastated [The Astrophysical Journal Letters] by the hundreds of Starlinks confusing their targets, leaving Earth more vulnerable to world-altering impacts.

    The point of the Starlink mega-constellation is to provide global internet access. Starlink supporters often state that this will bring internet access to places on the globe not currently served by other communication technologies. But currently available information shows the cost of access will be too high in nearly every location that needs it. Thus, Starlink will likely only provide an alternative for residents of wealthy countries who already have other ways of accessing the internet.

    Crowding the night sky

    Even if SpaceX changes its plans, other companies are actively developing separate mega-constellations, and there are more in the works.

    Currently, there are no rules about satellite orbits or right-of-way, and if a collision (or multiple collisions) should occur, it’s not clear who would be at fault or who would have to clean up the debris (if that is even possible to do). The only international law that applies to satellite debris, from 1972, basically says that the country that launched the satellite has to clean up any mess it leaves on the surface of the Earth after crashing.

    Space Debris in Motion
    Space junk — or orbital debris — is a growing problem.

    Most satellites today are launched by private companies, not governments, and most satellite debris remains abandoned in orbit, because there are no rules about clean-up. There are thousands of pieces of this space junk, ranging in size from bolts to bus-sized dead satellites.

    With tens of thousands of new satellites approved for launch, and no laws about orbit crowding, right-of-way or space cleanup, the stage is set for the disastrous possibility of Kessler Syndrome, a runaway cascade of debris that could destroy most satellites in orbit and prevent launches for decades.

    Losing our connections

    As human beings, we have deep connections to the stars that extend back to the dawn of humanity and, indeed, we are made of material from ancient stars.

    The Native Skywatchers program celebrates humanity’s time-honored love of the night sky and shares Indigenous knowledge of astronomy. A Dakota Elder recently shared her traditional knowledge of the skies: the Blue Woman spirit To Win lives in Wichakiyuhapi (the Big Dipper), where she guides new babies from the Star Nation into our world and waits to greet our spirits at the door as we leave our world.

    Large corporations like SpaceX and Amazon will only respond to legislation – which is slow, especially for international legislation – and consumer pressure. Is having another source of internet worth losing access to unobstructed stargazing for yourself and nearly every other person on the planet? Our species has been stargazing for thousands of years; do we really want to lose access now for the profit of a few large corporations?

    On your next clear night, go outside and look up. Enjoy the stars that you can see now, because without big changes in the plans of corporations that want to launch mega-constellations, your view of the stars is about to change dramatically.

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Deborah Byrd created the EarthSky radio series in 1991 and founded EarthSky.org in 1994. Today, she serves as Editor-in-Chief of this website. She has won a galaxy of awards from the broadcasting and science communities, including having asteroid 3505 Byrd named in her honor. A science communicator and educator since 1976, Byrd believes in science as a force for good in the world and a vital tool for the 21st century. “Being an EarthSky editor is like hosting a big global party for cool nature-lovers,” she says.

  • richardmitnick 11:16 am on November 22, 2020 Permalink | Reply
    Tags: "Sea-level monitoring satellite in position for liftoff", Copernicus Sentinel-6 Michael Freilich,   

    From European Space Agency – United Space in Europe (EU): “Sea-level monitoring satellite in position for liftoff” 

    ESA Space For Europe Banner

    From European Space Agency – United Space in Europe (EU)


    Copernicus Sentinel-6 Michael Freilich atop a Falcon 9 rocket at the Vandenberg Air Force Base in California, US.

    ESA/NASA Sentinel-6 Michael Freilich

    Once launched, this new mission will take the role of radar altimetry reference mission, continuing the long-term record of measurements of sea-surface height started in 1992 by the French–US Topex Poseidon and then the Jason series of satellite missions.

    With liftoff set for today at 17:17 GMT (18:17 CET, 09:17 PST), the Copernicus Sentinel-6 Michael Freilich satellite stands poised atop a Falcon 9 rocket on the launch pad at the Vandenberg Air Force Base in California, US.


    Once commissioned in orbit, this new Copernicus satellite will take the reins of delivering measurements of sea-surface height to monitor sea-level rise, and more.

    With millions of people living in coastal communities around the world, rising seas are at the top of the list of major concerns linked to climate change. Monitoring sea-surface height is critical to understanding the changes taking place so that decision-makers have the evidence to implement appropriate policies to help curb climate change and for authorities to take action to protect vulnerable communities.

    Over the last three decades, the French–US Topex-Poseidon and Jason mission series served as reference missions, and in combination with ESA’s earlier ERS and Envisat satellites, as well as today’s CryoSat and Copernicus Sentinel-3, they have shown how sea level has risen about 3.2 mm on average every year. More alarmingly, this rate of rise has been accelerating; over the last few years, the average rate of rise has been 4.8 mm a year.
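    As a back-of-the-envelope check on those rates, the article’s long-term average implies roughly 9 cm of total rise over the altimetry record. Treating the rates as constant is of course a simplification, and the 1992–2020 span is taken from the record described above:

```python
avg_rate_mm_per_yr = 3.2     # long-term mean rate quoted in the article
recent_rate_mm_per_yr = 4.8  # rate over the last few years, per the article
record_years = 2020 - 1992   # span of the Topex-Poseidon/Jason record

total_mm = avg_rate_mm_per_yr * record_years
print(total_mm)              # roughly 90 mm, i.e. about 9 cm since 1992
```

    If the recent 4.8 mm/yr rate were to persist instead, the same rise would accumulate in under two decades, which is why the acceleration is the alarming part.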

    Copernicus Sentinel-6 is the next radar altimetry reference mission to extend this legacy of sea-surface height measurements until at least 2030.

    The mission comprises two identical satellites launched sequentially: Copernicus Sentinel-6 Michael Freilich today, and Copernicus Sentinel-6B in 2025.

    Copernicus Sentinel-6 Michael Freilich was named in honour of the former head of NASA’s Earth science division. The name change not only recognises Dr Freilich’s outstanding contribution to the mission, but also his lifelong dedication to understanding our planet.

    While Sentinel-6 is one of the European Union’s family of Copernicus missions, its implementation is the result of a unique cooperation between ESA, Eumetsat, NASA and NOAA, with contributions from the French space agency CNES.

    Since its arrival at the launch site at the end of September, Sentinel-6 Michael Freilich has been thoroughly tested, fuelled and joined to the launch adapter, encapsulated in the rocket fairing, and now rolled out to the launch tower and integrated into the rest of the rocket.

    With the launch dress rehearsal done and everyone and everything in place, it’s now almost time for liftoff.

    For live updates, follow: @esa_eo @esa @esaoperations

    See the full article here .

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The European Space Agency (ESA) (EU), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 22 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA50 Logo large

  • richardmitnick 2:35 pm on November 21, 2020 Permalink | Reply
    Tags: "Accelerator Makes Cross-Country Trek to Enable Laser Upgrade", Another upgrade to this accelerator may be on the horizon: LCLS-II HE (High Energy)., , , , Jefferson Lab is a key contributor to the upgrade project providing a total of 21 cryomodules for the new superconducting portion of LCLS-II since work began in 2013., Jefferson Lab is a world leader in superconducting radiofrequency accelerator technologies and is home to the first large-scale SRF accelerator., Jefferson Lab’s newly shipped cryomodule will travel almost 3000 miles to its home in the LCLS-II linear accelerator in Menlo Park California over the course of 72 hours., , SLAC/LCLS II-the world’s brightest X-ray laser., The HE upgrade is the culmination of work by staff at both Jefferson Lab and Fermilab., The LCLS-II cryomodules are the highest-performing cryomodules that anybody has ever built., The LCLS-II project is being built for SLAC by a multi-lab collaboration that includes four DOE national labs: Jefferson Lab; Argonne National Lab; Berkeley Lab and Fermilab and Cornell University.,   

    From Thomas Jefferson National Accelerator Facility: “Accelerator Makes Cross-Country Trek to Enable Laser Upgrade” 

    From DOE’s Thomas Jefferson National Accelerator Facility

    By Chris Patrick

    Kandice Carter
    Jefferson Lab

    Ali Sundermier,
    SLAC National Accelerator Laboratory

    Thomas Jefferson National Accelerator Facility has shipped the final new section of accelerator that it has built for an upgrade of the Linac Coherent Light Source (LCLS). The section of accelerator, called a cryomodule, has begun a cross-country road trip to DOE’s SLAC National Accelerator Laboratory, where it will be installed in LCLS-II, the world’s brightest X-ray laser. Credit: DOE’s Jefferson Lab.

    Above: File photos of cryomodule work for the LCLS-II project.

    Today, the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility has shipped the final new section of accelerator that it has built for an upgrade of the Linac Coherent Light Source (LCLS). The section of accelerator, called a cryomodule, has begun a cross-country road trip to DOE’s SLAC National Accelerator Laboratory, where it will be installed in LCLS-II, the world’s brightest X-ray laser.

    SLAC/LCLS II projected view.

    SLAC/LCLS II schematic.

    SLAC LCLS-II Undulators The Linac Coherent Light Source’s new undulators each use an intricately tuned series of magnets to convert electron energy into intense bursts of X-rays. The “soft” X-ray undulator stretches for 100 meters on the left side of this hall, with the “hard” x-ray undulator on the right. Credit: Alberto Gamazo/SLAC National Accelerator Laboratory.

    “This is the culmination of seven years of work,” said Naeem Huque, the cost account manager who led the cryomodule efforts at Jefferson Lab. “A lot of the staff in Jefferson Lab’s Superconducting Radiofrequency Institute came in right from the start of the project, and they’re still here seeing it off. We are happy to see this project conclude successfully.”

    LCLS-II is a project to upgrade the existing Linac Coherent Light Source (LCLS), the world’s first X-ray free-electron laser. The X-ray pulses generated by the machine act like a powerful microscope, allowing researchers to watch chemical reactions in real time, probe materials and more. Once complete, LCLS-II will begin its reign as the biggest and brightest X-ray free-electron laser in the world.

    LCLS-II will provide even better resolution than the original LCLS, which accelerated electrons at room temperature and generated 120 X-ray laser pulses per second. The upgraded machine will accelerate electrons at superconducting temperatures to generate 1 million X-ray laser pulses per second. Jefferson Lab is a key contributor to the upgrade project, providing a total of 21 cryomodules for the new superconducting portion of LCLS-II since work began in 2013.

    The superconducting accelerator that will power the upgraded machine is made up of cryomodules. Electrons zip through the cryomodules, where they are loaded up with extra energy. Then, magnets make the electrons zigzag to give off their energy as X-rays. The upgraded LCLS will boast 37 cryomodules in total. Of those, 18 are from Jefferson Lab (plus three spares), and the rest will come from Fermilab, another key contributor.

    “The LCLS-II cryomodules are the highest-performing cryomodules that anybody has ever built,” said Joe Preble, senior team leader for the LCLS-II project at Jefferson Lab. “We pushed out the performance frontier on this sort of technology and turned it into a regular, turnkey process.”

    Jefferson Lab is a world leader in superconducting radiofrequency accelerator technologies and is home to the first large-scale SRF accelerator. As the team at Jefferson Lab contributed to the design of, built, tested and shipped these record-breaking cryomodules for LCLS-II, they encountered unprecedented challenges to push the cryomodule technology’s performance.

    “These very high-performing cryomodules are sensitive to things that we never had to worry about before, like our assembly procedures, the way we treat materials, the way we build things,” Preble said.

    Jefferson Lab modified its facilities to accommodate the cryomodules, which were a different shape and size than those that came before. Jefferson Lab staff members even figured out a new way to ship the finished cryomodules, after some broke during shipment.

    “We explored a lot of different options, everything from hiring a NASA aircraft to take it over there, to trying to send it by train or ship,” Huque explained.

    In the end, they managed to improve safety without pulling the cryomodules off the road. Sitting in a bed of springs to prevent damage from jostling, Jefferson Lab’s newly shipped cryomodule will travel almost 3,000 miles to its home in the LCLS-II linear accelerator in Menlo Park, California, over the course of 72 hours.

    However, Jefferson Lab’s work on improving the LCLS is likely not yet done. Another upgrade to this accelerator may be on the horizon: LCLS-II HE (High Energy). If that project is greenlighted, Jefferson Lab will build between 10 and 13 more cryomodules with a newer procedure. It’s expected that those cryomodules will have even better performance than the 21 they just finished.

    “I think that’s one of the biggest signs that we’ve done really well, is that something that was already ambitious is now getting pushed even further,” Huque said. “The HE upgrade, which is the culmination of work by staff at both Jefferson Lab and Fermilab, will dramatically increase the performance capabilities of LCLS-II.”

    For now, this momentous final delivery closes the book on Jefferson Lab’s part in delivering new cryomodules for LCLS-II, even as R&D and prototyping for HE are already ongoing. Its conclusion comes thanks to the help of many.

    “From the procurement people to the engineers, the scientists, the technicians, and the administrators, it’s taken everybody working together across laboratories to get this done,” Preble said. “It’s a great success and demonstration of the way the DOE needs to continue to work in building these new big projects.”

    The LCLS-II project is being developed and built for SLAC by a multi-lab collaboration that includes four other DOE national labs: Jefferson Lab, Argonne National Lab, Berkeley Lab and Fermilab, along with additional collaboration from Cornell University. Jefferson Lab is providing the liquid helium cryogenics plant for the project, half of the SRF cryomodules and the systems that will allow its operators to control the cryomodules. Once complete, LCLS-II will be the longest continuous SRF linear accelerator in the country, boasting 280 accelerating cavities.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    JLab campus
    Jefferson Lab is supported by the Office of Science of the U.S. Department of Energy. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, visit
    Jefferson Science Associates, LLC, a joint venture of the Southeastern Universities Research Association, Inc. and PAE Applied Technologies, manages and operates the Thomas Jefferson National Accelerator Facility, or Jefferson Lab, for the U.S. Department of Energy’s Office of Science.

  • richardmitnick 1:55 pm on November 21, 2020 Permalink | Reply
    Tags: "We Need to Go Deeper – the Blueschists and Eclogites of Oman", , But what are blueschists and eclogites? Read the full blog post., , , My senior research project is focused on understanding the structural geometry of Oman’s blueschists and eclogites whose protoliths are approximately 110 million years old., Oman is well known for its Semail Ophiolite being the world’s largest and most complete ophiolite complex.,   

    From The College of William & Mary: “We Need to Go Deeper – the Blueschists and Eclogites of Oman” 

    From The College of William & Mary

    November 20, 2020
    Nick Carpenter ’21

    “Have you ever wondered what goes on deep within the Earth, far deeper than any human has ever ventured? Well, look no further, for the Earth leaves behind clues of what takes place down there, clues that are millions of years in the making and occur at special locations around the world. One such location is in northern Oman, along a spectacular and secluded beach.

    Figure 1. Eclogite beach to the north of As Sifah.

    Oman is well known for its Semail Ophiolite, the world’s largest and most complete ophiolite complex. Ophiolite, a sequence of rocks that comes from the Earth’s oceanic crust and upper mantle, or lithosphere, is especially valuable to geologists as a means of studying tectonic and magmatic processes from part of the Earth that is otherwise inaccessible. But much like the Minecraft achievement, we need to go deeper – to the high-pressure metamorphic rocks that occur structurally below the Semail Ophiolite and are exposed in the Saih Hatat tectonic window. To the southeast of Muscat, along the coast of the Gulf of Oman, lies the village of As Sifah, an area well known for its extraordinary metamorphic rocks – blueschist and eclogite.

    Figure 2. Geological map showing the rock units southeast of Muscat, Oman from the Mesozoic to the Tertiary. Sites where rock samples were collected for my research project are marked by dots (modified after Massonne et al. 2013).

    But what are blueschists and eclogites? To understand them, let’s look at a metamorphic facies diagram. Metamorphic facies are sets of mineral assemblages that form under similar temperature and pressure conditions in metamorphic rocks. Metamorphic rocks with certain minerals can therefore be linked to specific tectonic settings. Since they are gradational and approximate, the boundaries between the facies are shown as wide bands.

Figure 3. Metamorphic facies diagram in temperature–pressure–depth space (after Strekeisen). The black-and-white dot indicates the peak metamorphic conditions experienced by the As Sifah eclogites and blueschists (data from El-Shazly et al., 1990).

Blueschists are rocks formed by regional metamorphism in subduction zones, meaning their protolith (the parent rock, or original un-metamorphosed rock), likely a sea-floor basalt, was metamorphosed by heat and pressure acting over an extensive area, accompanied by deformation under differential stress. The resulting rock is strongly foliated, with mineral grains arranged in parallel that give it a “striped” appearance. If you look at the facies diagram, you’ll see that blueschist facies conditions are high pressure (P >6 kbar) and low temperature (T ~300˚ Celsius), corresponding to depths of ~20 to 40 km. For reference, oceanic crust is generally only 6-10 km thick. The tectonic environment in which blueschists form is the subduction zone. The presence of HP/LT (high-pressure/low-temperature) index minerals like glaucophane, lawsonite, aragonite, jadeite, and deerite characterizes blueschists. The “blue” color of the rock, and hence its name, comes from the predominant minerals glaucophane and lawsonite.
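
The depth figures quoted above follow from the lithostatic pressure relation P = ρgh. A quick back-of-envelope sketch (the density value is an assumed average for crust and uppermost mantle, not a figure from this article):

```python
# Convert lithostatic pressure to approximate burial depth via P = rho * g * h.
RHO = 2900.0   # assumed mean rock density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def depth_km(pressure_kbar: float) -> float:
    """Depth (km) at which lithostatic pressure reaches `pressure_kbar`."""
    pressure_pa = pressure_kbar * 1e8   # 1 kbar = 10^8 Pa
    return pressure_pa / (RHO * G) / 1000.0

print(f"6 kbar  -> ~{depth_km(6):.0f} km")   # blueschist-facies threshold
print(f"12 kbar -> ~{depth_km(12):.0f} km")  # eclogite-facies threshold
```

With these assumptions, 6 kbar lands at roughly 21 km and 12 kbar at roughly 42 km, consistent with the ~20-40 km and ~40 km depths cited for the two facies.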

    Figure 4. A blueschist thin section of a sample from As Sifah, Oman.

    Let’s say we take our rock from the blueschist facies and subduct it even deeper, descending further into the Earth’s mantle and subjecting it to higher pressures. What happens is the glaucophane reacts with albite (plagioclase feldspar) to form omphacite (clinopyroxene), and chlorite breaks down to form garnet (often pyrope or Mg-rich almandine), effectively transforming our blueschist facies into eclogite facies. This facies requires high pressure (P >12 kbar) and high temperatures (T >400˚ Celsius), corresponding with burial to a minimum depth of ~40 km.

Eclogite is known as the “Christmas” rock for its striking appearance: red to pink garnet in a green matrix of sodium-rich pyroxene (omphacite). Its accessory minerals form an alphabet soup: kyanite, rutile, quartz, lawsonite, coesite, amphibole, phengite, paragonite, zoisite, dolomite, corundum, and, rarely, diamond; of these, rutile, kyanite, and quartz are the most typical.

    Figure 5. An eclogite hand sample from As Sifah, Oman.

So we know that blueschists and eclogites have to make a journey deep within the earth, but how do they find themselves back on the surface for a geologist to collect and study? First of all, for blueschists and eclogites to be preserved, the rocks must be exhumed fast enough to prevent total thermal equilibration as they return toward the Earth’s surface. One way this can happen is through rapid flow and/or faulting in accretionary wedges or the upper parts of subducted crust. Another is through buoyancy, provided the rocks are associated with low-density continental crust.

    Figure 6. Large eclogite outcrop on the beach of As Sifah. Rock hammer for scale is approximately 30 cm long.

My senior research project is focused on understanding the structural geometry of Oman’s blueschists and eclogites, whose protoliths are approximately 110 million years old and which were metamorphosed ~80 million years ago, coinciding with the emplacement of the Semail Ophiolite during the Mid to Late Cretaceous. Our goal is to conduct strain and vorticity analyses to determine what type of deformation and kinematic processes these rocks experienced. When studying strain and vorticity, we ask what type of shear and 3D strain occurred. Shear can be divided into three categories: pure, simple, and general. Pure versus simple shear is a matter of squeezing versus sliding, and general shear is a combination of the two. Strain can likewise be divided into three categories: constrictional, plane, and flattening, with plane strain the middle ground between the other two. Strain describes whether deformation occurred by shortening in two directions and stretching in the third (constriction), or by flattening like a pancake, in which the rock stretches outward in two directions and shortens in the third. We hypothesize that these rocks enjoyed general shear and flattening strain that records the exhumation phase of deformation.
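
The pure/simple/general shear distinction is often quantified in structural geology with the kinematic vorticity number Wk, which is 0 for pure shear, 1 for simple shear, and in between for general shear. A minimal sketch of that standard measure (the tensors below are illustrative, not data from this study):

```python
import numpy as np

def kinematic_vorticity(L: np.ndarray) -> float:
    """Kinematic vorticity number Wk = ||W|| / ||D||, where D and W are
    the symmetric (stretching) and antisymmetric (spin) parts of the
    velocity gradient tensor L. Norms are Frobenius norms."""
    D = 0.5 * (L + L.T)
    W = 0.5 * (L - L.T)
    return np.linalg.norm(W) / np.linalg.norm(D)

pure    = np.array([[1.0, 0.0], [0.0, -1.0]])  # squeezing: Wk = 0
simple  = np.array([[0.0, 2.0], [0.0,  0.0]])  # sliding:   Wk = 1
general = pure + simple                         # combination: 0 < Wk < 1

for name, L in [("pure", pure), ("simple", simple), ("general", general)]:
    print(f"{name:8s} Wk = {kinematic_vorticity(L):.2f}")
```

Vorticity analyses on natural samples work backward: microstructures constrain Wk, and Wk in turn constrains the mix of pure and simple shear the rocks experienced.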

    Figure 7. Diagram of the types of shear (left) and types of strain (right) (from Hatcher & Bailey 2020).

    In January of 2020, I had the amazing opportunity to travel to Oman to experience its wonderfully breathtaking geology and culture through William & Mary’s Natural History & Contemporary Culture of Oman (a.k.a. Rock Music Oman) course (read more about the details of our trip). On the sunny day of January 17th, our group studied three outcrops at As Sifah and I collected ten oriented and unoriented rock samples of metamorphic rocks that were, after obtaining the proper permits, later shipped back to the States for my research.

    Figure 8. The ten samples of metamorphic rocks from As Sifah, Oman reposing in an office chair at William & Mary.

    After the samples arrived at W&M, we cut slabs and small billets to make petrographic thin sections. By looking at these thin sections we can describe the mineralogy and identify the microstructures of our samples, which should reveal the type of shear and strain present.

My love of geology stems from my love of the Earth and what it has to offer. Having a father who has dedicated his life to marine conservation, I developed a curiosity about the natural Earth. As a child, I had the habit of picking up rocks and shells that caught my eye in the places I traveled, to capture and treasure the beauty of the natural world. But that child was unaware of the true extent of the stories that can be told by such small things. It is through studying geology at William & Mary that I unlocked the ability to understand these stories that the Earth tells us in subtle ways. I can thank the department for cultivating an environment that encourages us to be curious and to unconditionally love geology. I could think of no better way of finishing off my William & Mary experience – conducting research on some of the most interesting and unique rocks that the world has to offer.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The College of William & Mary (also known as William & Mary, W&M, and officially The College of William and Mary in Virginia) is a public research university in Williamsburg, Virginia. Founded in 1693 by letters patent issued by King William III and Queen Mary II, it is the second-oldest institution of higher education in the United States, after Harvard University.

    William & Mary educated American Presidents Thomas Jefferson (third), James Monroe (fifth), and John Tyler (tenth) as well as other key figures important to the development of the nation, including the fourth U.S. Supreme Court Chief Justice John Marshall of Virginia, Speaker of the House of Representatives Henry Clay of Kentucky, sixteen members of the Continental Congress, and four signers of the Declaration of Independence, earning it the nickname “the Alma Mater of the Nation.” A young George Washington (1732–1799) also received his surveyor’s license through the college. W&M students founded the Phi Beta Kappa academic honor society in 1776, and W&M was the first school of higher education in the United States to install an honor code of conduct for students. The establishment of graduate programs in law and medicine in 1779 makes it one of the earliest higher level universities in the United States.

    In addition to its undergraduate program, W&M is home to several graduate programs (including computer science, public policy, physics, and colonial history) and four professional schools (law, business, education, and marine science). In his 1985 book Public Ivies: A Guide to America’s Best Public Undergraduate Colleges and Universities, Richard Moll categorized William & Mary as one of eight “Public Ivies”.

  • richardmitnick 12:58 pm on November 21, 2020 Permalink | Reply
    Tags: "A Solar-Powered Rocket Might Be Our Ticket to Interstellar Space", , ,   

    From JHU Applied Physics Lab via WIRED: “A Solar-Powered Rocket Might Be Our Ticket to Interstellar Space” 

    From Johns Hopkins University Applied Physics Lab





    The idea for solar thermal propulsion has been around for decades, but researchers tapped by NASA just conducted a first test.

    Credit: NASA.

If Jason Benkoski is right, the path to interstellar space begins in a shipping container tucked behind a laboratory high bay in Maryland. The setup looks like something out of a low-budget sci-fi film: one wall of the container is lined with thousands of LEDs, an inscrutable metal trellis runs down the center, and a thick black curtain partially obscures the apparatus. This is the Johns Hopkins University Applied Physics Laboratory solar simulator, a tool that can shine with the intensity of 20 suns. On Thursday afternoon, Benkoski mounted a small black-and-white tile onto the trellis and pulled a dark curtain around the setup before stepping out of the shipping container. Then he hit the light switch.

    Once the solar simulator was blistering hot, Benkoski started pumping liquid helium through a small embedded tube that snaked across the slab. The helium absorbed heat from the LEDs as it wound through the channel and expanded until it was finally released through a small nozzle. It might not sound like much, but Benkoski and his team just demonstrated solar thermal propulsion, a previously theoretical type of rocket engine that is powered by the sun’s heat. They think it could be the key to interstellar exploration.

    “It’s really easy for someone to dismiss the idea and say, ‘On the back of an envelope, it looks great, but if you actually build it, you’re never going to get those theoretical numbers,’” says Benkoski, a materials scientist at the Applied Physics Laboratory and the leader of the team working on a solar thermal propulsion system. “What this is showing is that solar thermal propulsion is not just a fantasy. It could actually work.”

    Only two spacecraft, Voyager 1 and Voyager 2, have left our solar system.

    Heliosphere-heliopause showing positions of Voyager spacecraft. Credit: NASA.

    But that was a scientific bonus after they completed their main mission to explore Jupiter and Saturn. Neither spacecraft was equipped with the right instruments to study the boundary between our star’s planetary fiefdom and the rest of the universe. Plus, the Voyager twins are slow. Plodding along at 30,000 miles per hour, it took them nearly a half century to escape the sun’s influence.

    But the data they have sent back from the edge is tantalizing. It showed that much of what physicists had predicted about the environment at the edge of the solar system was wrong. Unsurprisingly, a large group of astrophysicists, cosmologists, and planetary scientists are clamoring for a dedicated interstellar probe to explore this new frontier.

    In 2019, NASA tapped the Applied Physics Laboratory to study concepts for a dedicated interstellar mission. At the end of next year, the team will submit its research to the National Academies of Sciences, Engineering, and Medicine’s Heliophysics decadal survey, which determines sun-related science priorities for the next 10 years. APL researchers working on the Interstellar Probe program are studying all aspects of the mission, from cost estimates to instrumentation. But simply figuring out how to get to interstellar space in any reasonable amount of time is by far the biggest and most important piece of the puzzle.

    The edge of the solar system—called the heliopause—is extremely far away. By the time a spacecraft reaches Pluto, it’s only a third of the way to interstellar space. And the APL team is studying a probe that would go three times farther than the edge of the solar system, a journey of 50 billion miles, in about half the time it took the Voyager spacecraft just to reach the edge. To pull off that type of mission, they’ll need a probe unlike anything that’s ever been built. “We want to make a spacecraft that will go faster, further, and get closer to the sun than anything has ever done before,” says Benkoski. “It’s like the hardest thing you could possibly do.”

    In mid-November, the Interstellar Probe researchers met online for a weeklong conference to share updates as the study enters its final year. At the conference, teams from APL and NASA shared the results of their work on solar thermal propulsion, which they believe is the fastest way to get a probe into interstellar space. The idea is to power a rocket engine with heat from the sun, rather than combustion. According to Benkoski’s calculations, this engine would be around three times more efficient than the best conventional chemical engines available today. “From a physics standpoint, it’s hard for me to imagine anything that’s going to beat solar thermal propulsion in terms of efficiency,” says Benkoski. “But can you keep it from exploding?”
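
The efficiency argument comes down to exhaust velocity: for an idealized thermal rocket, a light propellant heated to high temperature beats any chemical exhaust. A lossless back-of-envelope sketch (the temperature and the chemical benchmark are assumed round numbers, not Benkoski’s figures):

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol K)

def exhaust_velocity(gamma: float, molar_mass_kg: float, temp_k: float) -> float:
    """Ideal-rocket exhaust velocity for full expansion to vacuum:
    v_e = sqrt(2*gamma/(gamma-1) * (R/M) * T)."""
    return math.sqrt(2 * gamma / (gamma - 1) * (R_GAS / molar_mass_kg) * temp_k)

# Solar-heated hydrogen: assumed ~3000 K, diatomic gamma ~1.4, M = 2.016 g/mol
v_h2 = exhaust_velocity(1.4, 2.016e-3, 3000)
print(f"hot hydrogen: ~{v_h2:,.0f} m/s (Isp ~ {v_h2/9.81:,.0f} s)")
# For comparison, the best chemical engines (LH2/LOX) reach roughly 4,400 m/s.
```

Even this crude estimate shows why hot hydrogen comfortably outperforms chemical propulsion; real figures depend on achievable temperature, nozzle losses, and dissociation effects.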

    Unlike a conventional engine mounted on the aft end of a rocket, the solar thermal engine that the researchers are studying would be integrated with the spacecraft’s shield. The rigid flat shell is made from a black carbon foam with one side coated in a white reflective material. Externally it would look very similar to the heat shield on the Parker Solar Probe. The critical difference is the tortuous pipeline hidden just beneath the surface. If the interstellar probe makes a close pass by the sun and pushes hydrogen into its shield’s vasculature, the hydrogen will expand and explode from a nozzle at the end of the pipe. The heat shield will generate thrust.

    It’s simple in theory, but incredibly hard in practice. A solar thermal rocket is only effective if it can pull off an Oberth maneuver, an orbital mechanics hack that turns the sun into a giant slingshot. The sun’s gravity acts like a force multiplier that dramatically increases the craft’s speed if a spacecraft fires its engines as it loops around the star. The closer a spacecraft gets to the sun during an Oberth maneuver, the faster it will go. In APL’s mission design, the interstellar probe would pass just a million miles from its roiling surface.
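
The payoff of an Oberth maneuver falls out of simple energy bookkeeping: a burn Δv applied at periapsis speed near the local escape speed yields a hyperbolic excess speed v∞ = sqrt((v_esc + Δv)² − v_esc²), which grows the deeper (and faster) the periapsis. A rough sketch with illustrative numbers, not the APL mission design:

```python
import math

MU_SUN = 1.327e20  # solar gravitational parameter, m^3/s^2
AU = 1.496e11      # astronomical unit, m

def v_infinity(r_peri_m: float, dv_ms: float) -> float:
    """Hyperbolic excess speed after a burn of `dv_ms` at periapsis,
    assuming the craft arrives on a near-parabolic orbit (periapsis
    speed approximately equal to local escape speed)."""
    v_esc = math.sqrt(2 * MU_SUN / r_peri_m)
    return math.sqrt((v_esc + dv_ms) ** 2 - v_esc ** 2)

for r in (0.03 * AU, 0.1 * AU, 1.0 * AU):   # deeper dives pay off more
    print(f"periapsis {r/AU:4.2f} AU -> v_inf = {v_infinity(r, 2000):,.0f} m/s")
```

The same 2 km/s burn buys far more departure speed at 0.03 AU than at Earth’s distance, which is why the mission design calls for skimming so close to the sun.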

To put this in perspective: by the time NASA’s Parker Solar Probe makes its closest approach in 2025, it will be within 4 million miles of the sun’s surface, booking it at nearly 430,000 miles per hour. That’s about twice the speed the interstellar probe aims to hit, and the Parker Solar Probe built up that speed with gravity assists from the sun and Venus over the course of seven years. The Interstellar Probe will have to accelerate from around 30,000 miles per hour to around 200,000 miles per hour in a single shot around the sun, which means getting close to the star. Really close.

    Cozying up to a sun-sized thermonuclear explosion creates all sorts of materials challenges, says Dean Cheikh, a materials technologist at NASA’s Jet Propulsion Laboratory who presented a case study on the solar thermal rocket during the recent conference. For the APL mission, the probe would spend around two-and-a-half hours in temperatures around 4,500 degrees Fahrenheit as it completed its Oberth maneuver. That’s more than hot enough to melt through the Parker Solar Probe’s heat shield, so Cheikh’s team at NASA found new materials that could be coated on the outside to reflect away thermal energy. Combined with the cooling effect of hydrogen flowing through channels in the heat shield, these coatings would keep the interstellar probe cool while it blitzed by the sun. “You want to maximize the amount of energy that you’re kicking back,” says Cheikh. “Even small differences in material reflectivity start to heat up your spacecraft significantly.”
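
Cheikh’s point about reflectivity can be illustrated with a toy radiative-equilibrium estimate: absorbed solar flux αF balances emitted radiation εσT⁴. Everything below is an assumption for illustration (one-sided flat plate, no conduction, no hydrogen cooling), not the team’s thermal model:

```python
import math

SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
L_SUN = 3.828e26       # solar luminosity, W
R = 6.96e8 + 1.609e9   # solar radius + 1 million miles, in metres

flux = L_SUN / (4 * math.pi * R ** 2)   # incident solar flux, W/m^2

def t_equilibrium(absorptivity: float, emissivity: float) -> float:
    """Temperature at which absorbed power equals radiated power."""
    return (absorptivity * flux / (emissivity * SIGMA)) ** 0.25

for alpha in (0.05, 0.10, 0.20):        # small reflectivity changes matter
    print(f"absorptivity {alpha:.2f} -> ~{t_equilibrium(alpha, 0.9):,.0f} K")
```

Because temperature scales as the fourth root of absorbed power, doubling the absorbed fraction raises equilibrium temperature by only about 19 percent; at these fluxes, though, that is still hundreds of kelvin.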

A still greater problem is how to handle the hot hydrogen flowing through the channels. At extremely high temperatures, the hydrogen would eat right through the carbon-based core of the heat shield, which means the inside of the channels will have to be coated in a stronger material. The team identified a few materials that could do the job, but there’s just not a lot of data on their performance, especially at extreme temperatures. “There’s not a lot of materials that can fill these demands,” says Cheikh. “In some ways that’s good, because we only have to look at these materials. But it’s also bad because we don’t have a lot of options.”

    The big takeaway from his research, says Cheikh, is there’s a lot of testing that needs to be done on heat shield materials before a solar thermal rocket is sent around the sun. But it’s not a dealbreaker. In fact, incredible advances in materials science make the idea finally seem feasible more than 60 years after it was first conceived by engineers in the US Air Force. “I thought I came up with this great idea independently, but people were talking about it in 1956,” says Benkoski. “Additive manufacturing is a key component of this, and we couldn’t do that 20 years ago. Now I can 3D-print metal in the lab.”

Even if Benkoski wasn’t the first to float the idea of solar thermal propulsion, he believes he’s the first to demonstrate a prototype engine. During his experiments with the channeled tile in the shipping container, Benkoski and his team showed that it was possible to generate thrust using sunlight to heat a gas as it passed through embedded ducts in a heat shield. These experiments had several limitations. They didn’t use the same materials or propellant that would be used on an actual mission, and the tests occurred at temperatures well below what an interstellar probe would experience. But the important thing, says Benkoski, is that the data from the low-temperature experiments matched the models that predict how an interstellar probe would perform on its actual mission, once adjustments are made for the different materials. “We did it on a system that would never actually fly. And now the second step is we start to substitute each of these components with the stuff that you would put on a real spacecraft for an Oberth maneuver,” Benkoski says.

    The concept has a long way to go before it’s ready to be used on a mission—and with only a year left in the Interstellar Probe study, there’s not enough time to launch a small satellite to do experiments in low Earth orbit. But by the time Benkoski and his colleagues at APL submit their report next year, they will have generated a wealth of data that lays the foundation for in-space tests. There’s no guarantee that the National Academies will select the interstellar probe concept as a top priority for the coming decade. But whenever we are ready to leave the sun behind, there’s a good chance we’ll have to use it for a boost on our way out the door.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    JHUAPL campus.

Founded on March 10, 1942—just three months after the United States entered World War II—the Applied Physics Laboratory was created as part of a federal government effort to mobilize scientific resources to address wartime challenges.

    APL was assigned the task of finding a more effective way for ships to defend themselves against enemy air attacks. The Laboratory designed, built, and tested a radar proximity fuze (known as the VT fuze) that significantly increased the effectiveness of anti-aircraft shells in the Pacific—and, later, ground artillery during the invasion of Europe. The product of the Laboratory’s intense development effort was later judged to be, along with the atomic bomb and radar, one of the three most valuable technology developments of the war.

    On the basis of that successful collaboration, the government, The Johns Hopkins University, and APL made a commitment to continue their strategic relationship. The Laboratory rapidly became a major contributor to advances in guided missiles and submarine technologies. Today, more than seven decades later, the Laboratory’s numerous and diverse achievements continue to strengthen our nation.

    APL continues to relentlessly pursue the mission it has followed since its first day: to make critical contributions to critical challenges for our nation.

Johns Hopkins University campus.

    Johns Hopkins University opened in 1876, with the inauguration of its first president, Daniel Coit Gilman. “What are we aiming at?” Gilman asked in his installation address. “The encouragement of research … and the advancement of individual scholars, who by their excellence will advance the sciences they pursue, and the society where they dwell.”

    The mission laid out by Gilman remains the university’s mission today, summed up in a simple but powerful restatement of Gilman’s own words: “Knowledge for the world.”

    What Gilman created was a research university, dedicated to advancing both students’ knowledge and the state of human knowledge through research and scholarship. Gilman believed that teaching and research are interdependent, that success in one depends on success in the other. A modern university, he believed, must do both well. The realization of Gilman’s philosophy at Johns Hopkins, and at other institutions that later attracted Johns Hopkins-trained scholars, revolutionized higher education in America, leading to the research university system as it exists today.
