Tagged: Cosmos Magazine

  • richardmitnick 12:08 pm on March 27, 2018 Permalink | Reply
    Tags: Cosmos Magazine, National Computational Infrastructure at the Australian National University in Canberra, SkyMapper telescope at Siding Spring Observatory, SN KSN 2015K

    From Space Telescope Science Institute via COSMOS: “Gone in a flash: supernova burns up in just 25 days” 

    Space Telescope Science Institute


    27 March 2018
    Lauren Fuge

    Huge, bright and incredibly violent, a new supernova provides new challenges for astronomers.

    An artist’s impression of how the explosive light of the supernova was hidden for a while behind a cocoon of ejected dust. Nature Astronomy.

    Astronomers have witnessed a blazing supernova explosion that faded away 10 times faster than expected.

    A supernova is the violent death of a massive star, typically occurring when it exhausts its fuel supply and collapses under its own weight, generating a powerful shockwave that blasts light and material out into space.

    Supernovae often blaze so brightly that they briefly outshine all the other stars in their host galaxy. They show off for months on end — in 1054, a supernova could be seen during the day for three weeks and only disappeared completely after two years. Its remnants are known as the Crab Nebula.

    The Crab Nebula in all its glory. NASA, ESA, NRAO/AUI/NSF and G. Dubner (University of Buenos Aires).

    Now an international team of astronomers, led by Armin Rest from the Space Telescope Science Institute in Baltimore, US, has observed a supernova that rapidly soared to its peak brightness in 2.2 days, then faded away in just 25 more.

    “When I first saw the Kepler data, and realised how short this transient is, my jaw dropped,” recalls Rest.

    The supernova, dubbed KSN 2015K, is part of a puzzling class of rare events called Fast-Evolving Luminous Transients (FELTs).

    KSN 2015K’s host is the star-forming spiral galaxy 2MASX-J13315109-1044061. Image credit: Rest et al: https://www.nature.com/articles/s41550-018-0423-2.

    FELTs don’t fit into existing supernova models, and astronomers are still debating their sources. Previous suggestions include the afterglow of a gamma-ray burst, a supernova turbo-boosted by a magnetically powerful neutron star, or a failed example of a special type of binary-star supernova known as a type Ia. KSN 2015K is the most extreme example found so far.

    In a paper published in the journal Nature Astronomy, the team says that KSN 2015K’s behaviour can most likely be explained by its surroundings: the star was swathed in dense gas and dust that it ejected in its old age, like a caterpillar spinning a cocoon. When the supernova detonated, the resulting shock wave took some time to slam into this shell of material and produce the burst of light that made the event visible to astronomers.

    KSN 2015K was captured by NASA’s Kepler Space Telescope, which is designed to hunt for planets by noticing the tiny, temporary dips in light from far-away stars when planets pass in front of them.

    NASA/Kepler Telescope

    Planet transit. NASA/Ames

    The same capability is also useful for studying supernovae and other brief, explosive events.
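    The dip Kepler watches for scales roughly with the square of the planet-to-star radius ratio. The following quick sketch uses the standard uniform-disc transit approximation (not from the article; the radii are rough published values):

    ```python
    # Approximate transit photometry: the fraction of starlight blocked when a
    # planet crosses its star's disc is roughly (R_planet / R_star)^2.
    R_SUN_KM = 696_000      # solar radius, km
    R_EARTH_KM = 6_371      # Earth radius, km
    R_JUPITER_KM = 69_911   # Jupiter radius, km

    def transit_depth(r_planet_km, r_star_km=R_SUN_KM):
        """Fractional dip in brightness during a transit (uniform-disc approximation)."""
        return (r_planet_km / r_star_km) ** 2

    # An Earth-sized planet dims a Sun-like star by roughly 0.008%;
    # a Jupiter-sized planet by roughly 1%.
    print(f"Earth-like transit depth:   {transit_depth(R_EARTH_KM):.6f}")
    print(f"Jupiter-like transit depth: {transit_depth(R_JUPITER_KM):.6f}")
    ```

    Spotting dips that small is why Kepler’s high-precision, high-cadence photometry also suits fast transients like KSN 2015K.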

    “Using Kepler’s high-speed light-measuring capabilities, we’ve been able to see this exotic star explosion in incredible detail,” says team member Brad Tucker, an astrophysicist from the Australian National University.

    Co-author David Khatami from the University of California, Berkeley, US, adds that this is the first time astronomers can test FELT models to a high degree of accuracy. “The fact that Kepler completely captured the rapid evolution really constrains the exotic ways in which stars die,” he says.

    Australian researchers and facilities were also key to this discovery. Follow-up observations were made with the SkyMapper telescope at Siding Spring Observatory, and then processed by the National Computational Infrastructure at the Australian National University in Canberra.

    ANU SkyMapper telescope, a fully automated 1.35 m (4.4 ft) wide-angle optical telescope at Siding Spring Observatory, near Coonabarabran, New South Wales, Australia; altitude 1,165 m (3,822 ft)

    Siding Spring Observatory, near Coonabarabran, New South Wales, Australia, Altitude 1,165 m (3,822 ft)

    The National Computational Infrastructure building at the Australian National University

    Tucker says that by learning more about how stars live and die, astronomers can better understand solar systems as a whole, including the potential life on orbiting planets.

    He concludes: “With the imminent launch of NASA’s new space telescope, TESS, we hope to find even more of these rare and violent explosions.”


    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    We are the Space Telescope Science Institute in Baltimore, Maryland, operated by the Association of Universities for Research in Astronomy. We help humanity explore the universe with advanced space telescopes and ever-growing data archives.

    Association of Universities for Research in Astronomy

    Founded in 1981, we have helped guide the most famous observatory in history, the Hubble Space Telescope.

    NASA/ESA Hubble Telescope

    Since its launch in 1990, we have performed the science operations for Hubble. We also lead the science and mission operations for the James Webb Space Telescope (JWST), scheduled for launch in 2019.

    NASA/ESA/CSA Webb Telescope annotated

    We will perform parts of the science operations for the Wide Field Infrared Survey Telescope (WFIRST), in formulation for launch in the mid-2020s, and we are partners on several other NASA missions.


    Our staff conducts world-class scientific research; our Barbara A. Mikulski Archive for Space Telescopes (MAST) curates and disseminates data from over 20 astronomical missions;

    Mikulski Archive For Space Telescopes

    and we bring science to the world through internationally recognized news, education, and public outreach programs. We value our diverse workforce and civility in the workplace, and seek to be an example for others to follow.

  • richardmitnick 8:27 am on February 19, 2018 Permalink | Reply
    Tags: Cosmos Magazine, Meteotsunami

    From COSMOS Magazine: “Prevalence and danger of little known tsunami type revealed” 

    Cosmos Magazine bloc

    COSMOS Magazine

    19 February 2018
    Richard A Lovett


    On 4 July 2003, beachgoers at Warren Dunes State Park, in the US state of Michigan, were enjoying America’s Independence Day holiday when a fast-moving line of thunderstorms blew in from Lake Michigan. They scurried for shelter, but the event passed so quickly it didn’t appear that their holiday was ruined.

    “In 15 minutes it was gone,” says civil engineer Alvaro Linares of the University of Wisconsin, Madison.

    But when swimmers re-entered the water, rip currents appeared seemingly from nowhere, pulling eight people out into the lake, where seven drowned.

    What these people had encountered, Linares says, was a meteotsunami — an aquatic hazard of which few people, including scientists, were aware until recently.

    Few scientists have researched the phenomenon. Many of those who have gathered recently at the annual American Geophysical Union Ocean Sciences meeting, held in Portland, Oregon, US, to compare notes.

    Conventional tsunamis are caused by underwater processes such as earthquakes and submarine landslides. Meteotsunamis, as the name indicates, are caused by weather. But while the catalysts are different, the effects are not.

    “The wave characteristics are very similar,” says Eric Anderson of the Great Lakes Environmental Research Laboratory of the National Oceanic and Atmospheric Administration (NOAA) in Ann Arbor, Michigan.

    To create a meteotsunami, what’s required is a combination of a strong, fast-moving storm and relatively shallow water. The sudden increase in winds along the storm front, possibly combined with changes in air pressure, starts the process by kicking up a tsunami-style wave that runs ahead of it. But the process would quickly fizzle out if the water was too deep, because in deep water, such waves propagate very quickly and would soon outrun the storm.

    What’s needed to produce a meteotsunami is a water depth at which the storm’s speed and the wave’s speed match, allowing the wave to build as it and the storm move in tandem. “The storm puts all its energy into that wave,” Anderson says.

    Furthermore, the wave can magnify even more when it hits shallower water or shoals. “That is when these become destructive,” Anderson says.

    In 2004, for example, a storm front 300 kilometres wide sped across the East China Sea at a speed of 31 metres per second (112 kilometres per hour), says Katsutoshi Fukuzawa of the University of Tokyo.

    Water there is shallow, he adds, with depths mostly under 100 metres. This limits wave speed to about 30 metres per second — a near-perfect match to the storm’s. As a result, parts of the island of Kyushu were hit with a tsunami as big as 1.6 metres.
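    The speed match Fukuzawa describes follows from the standard long-wave formula c = sqrt(g·h), which the article doesn’t quote but is the usual shallow-water approximation. A quick check against his numbers:

    ```python
    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def shallow_water_speed(depth_m):
        """Phase speed of a long (shallow-water) wave: c = sqrt(g * h)."""
        return math.sqrt(G * depth_m)

    # East China Sea example from the article: depths mostly under 100 m,
    # versus a storm front moving at 31 m/s.
    c = shallow_water_speed(100)
    print(f"wave speed at 100 m depth: {c:.1f} m/s ({c * 3.6:.0f} km/h)")
    ```

    At 100 metres’ depth the wave travels at about 31 metres per second, essentially identical to the storm’s speed, which is the resonance that lets the storm keep pumping energy into the wave.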

    Not that meteotsunamis have to be that big to be dangerous. The one at Warren Dunes was probably no more than 30 centimetres, says Linares — small enough not even to be visible in the lake’s normal chop.

    But unlike normal surf, meteotsunamis produce a sustained slosh that lasts several minutes between run-up and retreat. That means that even low-height waves carry a lot of water, creating the potential for strong rip currents when they withdraw. According to Linares’ models [Journal of Geophysical Research], these currents would have persisted for about an hour — plenty long enough to drag unwary swimmers far out into the lake, long after the storm had passed.

    It’s also possible for meteotsunamis to become “detached” from the storm front that created them, striking shores far away. Researchers reviewing records in the Great Lakes have concluded that that is what happened when such a wave hit Chicago in 1954, killing 10 people.

    “The wave came out of nowhere,” Anderson says. “It was a calm, sunny day.”

    It’s not just Japan and America’s Great Lakes that have seen such events. In May 2017, a storm raced up the English Channel, kicking up a metre-high wave that swept beaches in The Netherlands as bystanders looked on with awe, says Ap van Dongeren of the Deltares research institute in Delft, The Netherlands.

    Quirks of topography can magnify the effects of such tsunamis. On 13 June 2013, a group of spearfishermen in New Jersey were stunned when a surge of water threw them across a breakwater into the open ocean [nj.com]. A few minutes later, another surge threw them back where they had come from. And that came from a meteotsunami that measured well under a metre on local tide gauges, says Gregory Dusek, a NOAA oceanographer at Camp Springs, Maryland.

    Meteotsunamis have occurred on all inhabited continents, including one that hit the port of Fremantle, near the Australian city of Perth, in 2014, causing a ship to break free from its moorings and crash into a railroad bridge, Sarath Wijeratne of the University of Western Australia reported in a conference abstract. In fact, Wijeratne concluded, a look back at historical water level records indicates that Western Australia may have seen more than 15 such events each year between 2008 and 2016.

    Other researchers are also finding these events to be surprisingly frequent. By studying tide gauge records back to 1996, Dusek has concluded that they occur on America’s eastern seaboard at a rate of 23 per year — though most are small enough that nobody would ever notice. In the Netherlands, Van Dongeren says that a quick check of historical tide gauge records revealed at least three such events in the past decade that had gone unnoticed because they happened at low tide. “They’re not that rare,” he says.

    Fukuzawa says that Japan saw 37 meteotsunamis exceeding one metre from 1961 to 2005.

    Furthermore, bigger ones are possible. In June 2014, Croatia was hit by a two-to-three metre tsunami sweeping in from the Adriatic Sea, says Clea Denamiel, of the Croatian Institute of Oceanography and Fisheries.

    But the mother of all meteotsunamis came in 1978, when Vela Luka, at the southern end of Croatia’s scenic Dalmatian coast, was smashed by a meteotsunami measuring a full six metres, with giant waves surging and retreating about every 17 minutes, just as might have occurred in the aftermath of a large offshore earthquake.

    As of now, scientists don’t know enough about meteotsunamis to be able to predict them, though efforts are under way to create models that can do just that. But as they dig back through old records, they are increasingly realising that meteotsunamis might have been with us for a long time.

    Or as Linares puts it with typical scientific understatement, “meteotsunamis are a beach hazard that has been overlooked”.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 12:59 pm on February 9, 2018 Permalink | Reply
    Tags: Cosmos Magazine

    From ANU via COSMOS: “40-year cosmic theory confirmed” 

    ANU Australian National University Bloc

    Australian National University


    09 February 2018
    Andrew Masterson

    A stellar reaction long predicted but never seen has been demonstrated in the lab.

    After four decades of research, a theory is finally confirmed. CONEYL JAY/SCIENCE PHOTO LIBRARY/Getty Images.

    An abundant new energy supply could be derived from controlling a quantum reaction that takes place in stars, according to research from the Australian National University (ANU).

    The possibility arises because the ANU scientists, plus others from institutions including the US Army Research Laboratory and Poland’s National Centre for Nuclear Research, have succeeded in confirming the existence of a reaction first predicted four decades ago but unmeasured until now.

    In a paper published in the journal Nature, ANU physicist Greg Lane and colleagues report the confirmation of a phenomenon known as Nuclear Excitation by Electron Capture (NEEC). Confirming that NEEC actually happens supplies a key mechanism for understanding how evolving stars produce elements such as gold and platinum.

    NEEC can occur when an atom captures an electron. If the electron’s kinetic energy and the energy required to capture it add up to just the right amount, the atom’s nucleus is pushed to a higher state of excitation.

    The energy increase, however, comes at the cost of a shorter life. What was a long-lived nucleus must now decay, either through an electromagnetic process known as internal conversion, which spits out an electron, or by emitting a photon.

    Although discussed since the 1970s, experimental proof for NEEC has remained elusive.

    The new work, however, has now provided the necessary evidence. The researchers did so by creating an exotic isotope – molybdenum-93 – by firing a beam of zirconium atoms at lithium targets, using the ANU’s Heavy Ion Accelerator and the ATLAS Accelerator at Argonne National Laboratory in the United States.

    ANU’s Heavy Ion Accelerator

    ATLAS Accelerator at Argonne National Laboratory

    The resulting molybdenum atoms zipped around at as much as 10% of the speed of light, smashing into the remaining lithium, stripping off electrons and leaving highly charged ions behind.

    As the interactions continued, the molybdenum ions lost kinetic energy until they reached a state where they could capture an electron with just the right energy to push the molybdenum nuclei from their long-duration “isomer” states into higher level but shorter-lived intermediate ones. These intermediate states decayed, giving off a unique gamma-ray signature that proved NEEC had occurred.

    The research now provides a model against which other theoretical calculations for the NEEC effect in different elements can be tested, illuminating further the process by which nuclear interactions in stars produce certain metals.

    “The abundance of the different elements in a star depends primarily on the structure and behaviour of atomic nuclei,” says Lane.

    “The NEEC phenomenon modifies the nucleus lifetime so that it survives for a shorter amount of time in a star.”

    As well as cosmological implications, the confirmation of the NEEC effect opens the door to potentially accessing energy stored in longer-lived isomer nuclei. Lane suggests the technique could create energy sources 100,000 times more powerful than chemical batteries.

    It is a possible outcome that has not gone unnoticed by at least one of the ANU’s research partners.

    “Our study demonstrated a new way to release the energy stored in a long-lived nuclear state, which the US Army Research Laboratory is interested to explore further,” says Lane.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ANU Campus

    ANU is a world-leading university in Australia’s capital city, Canberra. Our location points to our unique history, ties to the Australian Government and special standing as a resource for the Australian people.

    Our focus on research as an asset, and an approach to education, ensures our graduates are in demand the world-over for their abilities to understand, and apply vision and creativity to addressing complex contemporary challenges.

  • richardmitnick 10:59 am on December 4, 2017 Permalink | Reply
    Tags: Australia seems on the brink of embracing space in a coordinated manner but how should we do it?, Australian universities made cubesats for an international research project, Cosmos Magazine, It is encouraging that Australian organisations have anticipated the growth areas, There are also emerging Australian capabilities in small satellites and potentially disruptive technologies with space applications, Three new reports add clarity to Australia’s space sector a ‘crowded and valuable high ground’

    From COSMOS: “Three new reports add clarity to Australia’s space sector, a ‘crowded and valuable high ground’” 

    Cosmos Magazine bloc

    COSMOS Magazine

    02 December 2017
    Anthony Wicht

    Three new reports examine Australia’s existing space capabilities, set them in the light of international developments, and identify growth areas and models for Australia to pursue. 136319147@N08/flickr. Telescope is not identified. Bad journalism.

    Australia seems on the brink of embracing space in a coordinated manner, but how should we do it?

    This week, the Australian government released three reports to help chart the future of Australia’s space industry. Their conclusions will feed into the review of Australia’s space industry underway by former CSIRO head Dr Megan Clark.

    The reports examine Australia’s existing space capabilities, set them in the light of international developments, and identify growth areas and models for Australia to pursue. The promise is there:

    Australia has scattered globally competitive capabilities in areas from space weather to deep-space communication but “by far the strongest areas” are applications of satellite data on Earth to industries like agriculture, communications and mining
    Australian research in other sectors like 3D printing and VR is being translated to space with potentially high payoffs
    global trends, including the demand for more space traffic management, play to our emerging strengths
    the prize for success is real – the UK currently has an A$8 billion space export industry, and anticipates further growth.

    While it is not the first time the government has commissioned this type of research, the updates are welcome given the fast pace of space innovation. Taken together they paint a picture of potential for the future of Australian space and a firm foundation for a space agency.

    The rules of the game

    The Global Space Industry Dynamics report from Bryce Space and Technology, a US-based space specialist consulting firm, sets out the “rules of the game” in the US$344 billion (A$450 billion) space sector.

    The global space economy at a glance. Figures are from 2016, and shown in US$.
    Marcella Cheng for The Conversation, adapted from Global Space Industry Dynamics Research Paper by Bryce Space and Technology

    It highlights that:

    three quarters of global revenues are made commercially, despite the prevailing perception that space is a government concern
    most commercial revenue is made from space-enabled services and applications (like satellite TV or GPS receivers) rather than the construction and launch of space hardware itself
    commercial launch and satellite manufacturing industries are still small in relative terms, at about US$20.5 billion (A$27 billion) of revenues, but show strong growth, particularly for smaller satellites and launch vehicles.

    The report also looks at the emerging trends that a smart space industry in Australia will try to run ahead of. Space is becoming cheaper, more attractive to investors, and increasingly important in our data-rich economy. These trends have not gone unnoticed by global competitors, though, and the report describes space as an increasingly “crowded and valuable high ground”.

    What is particularly useful about the report is its sharp focus on the three numbers that determine commercial attractiveness:

    market size

    The magic comes through matching these attractive sectors against areas where Australia can compete strongly because of existing capability or geographic advantage.

    The report suggests growth opportunities across traditional and emerging space sectors. In traditional sectors, it calls out satellite services, particularly commercial satellite radio and broadband, and ground infrastructure as prime opportunities. In emerging sectors, earth observation data analytics, space traffic management, and small satellite manufacturing are all tipped as potentially profitable growth areas where Australia could compete.

    The report adds the speculative area of space mining as an additional sector worth considering given Australia’s existing terrestrial capability.

    It is encouraging that Australian organisations have anticipated the growth areas, from UNSW’s off-earth mining research, to Geoscience Australia’s integrated satellite data to Mt Stromlo’s debris tracking capability.

    Australian capabilities

    Australian capabilities are the focus of a second report, by ACIL Allen consulting, Australian Space Industry Capability. The review highlights a smattering of world class Australian capabilities, particularly in the application of space data to activities on Earth like agriculture, transport and financial services.

    There are also emerging Australian capabilities in small satellites and potentially disruptive technologies with space applications, like 3D printing, AI and quantum computing. The report notes that basic research is strong, but challenges remain in “industrialising and commercialising the resulting products”.

    Australian universities made cubesats for an international research project.

    The concern about commercialisation prompts questions about the policies that will help Australian companies succeed.

    Should we embrace recent trends and rely wholly on market mechanisms and venture capital Darwinism, or buy into traditional international space projects?

    Do we send our brightest overseas for a few years’ training, or spin up a full suite of research and development programs domestically?

    Are there regulations that need to change to level the playing field for Australian space exports?

    Learning from the world

    Part of the answer is to be found in the third report, Global Space Strategies and Best Practices, which looks at global approaches to funding, capability development, and governance arrangements. The case studies illustrate a range of styles.

    The UK’s pragmatic approach developed a £5 billion (A$8 billion) export industry by focusing primarily on competitive commercial applications, including a satellite Australia recently bought a time-share on.

    A longer-term play is Luxembourg’s use of tax breaks and legal changes to attract space mining ventures. Before laughing, remember that Luxembourg has space clout: satellite giants SES and Intelsat are headquartered there thanks to similar forward thinking in the 1980s. Those two companies pulled in about A$3 billion of profit between them last year.

    Norway and Canada show a middle ground, combining international partnerships with clear focus areas that benefit research and the economy. Norway has taken advantage of its geography to build satellite ground stations for polar-orbiting satellites, in an interesting parallel with Australia’s longstanding ground capabilities. Canada used its relationship with the United States to build the robotic “Canadarm” for the Space Shuttle and International Space Station, developing a space robotics capability for the country.

    Canadarm played an important role in Canada-USA relations.

    The only caution is that confining the possible role models to the space sector is unnecessarily limiting. Commercialisation in technology fields is a broader policy question, and there is much to learn from recent innovations including CSIRO’s venture fund and the broader Cooperative Research Centre (CRC) program.

    As well as the three reports, the government recently released 140 public submissions to the panel.

    There is no shortage of advice for Dr Clark and the expert reference group; appropriate given it seems an industry of remarkable potential rests in their hands.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 10:05 am on December 4, 2017 Permalink | Reply
    Tags: Can the Great Barrier Reef regenerate?, Cosmos Magazine

    From COSMOS Magazine: “Can the Great Barrier Reef regenerate?” 

    Cosmos Magazine bloc

    COSMOS Magazine

    02 December 2017
    No writer credit

    Well-positioned “robust reefs” may provide coral larvae to help the Great Barrier Reef regenerate after catastrophic bleaching.
    Peter Mumby

    The Great Barrier Reef’s health could be boosted by just 3 per cent of its reefs, according to an Australian-led study.

    The authors found around 100 reefs that should have healthy adult corals, and be well connected enough to supply larvae to almost half of the Great Barrier Reef in a single year.

    By simulating the dispersal of larvae, the researchers could pinpoint which smaller reefs were best connected by ocean currents to the rest of the Great Barrier Reef and could top it up.

    They then used ocean and climate system models to show which reefs were less likely to be exposed to coral bleaching and the crown-of-thorns starfish – a pest that eats coral – and crosschecked that list against the first to come up with a ‘robust’ 3% of reefs.
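    The selection logic described above — rank reefs by how widely their larvae can spread, then keep only those with low bleaching and starfish risk — can be caricatured in a few lines. This is a toy sketch only, with made-up reefs and connectivity, not the paper’s actual larval-dispersal model:

    ```python
    # Hypothetical connectivity: reef -> set of reefs its larvae can reach
    # via simulated ocean currents (names and links are invented).
    connectivity = {
        "A": {"B", "C", "D"},
        "B": {"C"},
        "C": {"A", "B", "D", "E"},
        "D": {"E"},
        "E": set(),
    }

    # Hypothetical reefs judged to have low bleaching/crown-of-thorns risk.
    low_risk = {"A", "C", "D"}

    # Keep only low-risk reefs, then rank by how many others they can replenish.
    robust = sorted(low_risk, key=lambda r: len(connectivity[r]), reverse=True)
    print(robust)  # best-connected low-risk reefs first: ['C', 'A', 'D']
    ```

    The study’s real analysis works the same way in spirit, but over roughly 3,800 reefs, with larval dispersal driven by ocean-current simulations rather than a hand-written table.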

    The authors of the PLOS Biology paper say these 100 reefs could help desirable species recover – suggesting a level of widespread resilience for the Great Barrier Reef – and that these reefs are unlikely to spread crown-of-thorns starfish.

    “Finding these 100 reefs is a little like revealing the cardiovascular system of the Great Barrier Reef,” explained study author Professor Peter Mumby of the University of Queensland.

    “These refugia are critical as they maintain the healthy populations and diversity required to rebuild coral populations, and have the ability to repopulate other reefs,” Dr Andrew Lenton of CSIRO Oceans and Atmosphere told the Australian Science Media Centre.

    However, there’s reason to be sceptical, according to Associate Professor John Alroy of Macquarie University: “I think [the paper] makes a good case that corals will persist for a while on a fair number of reefs. But I think it’s optimistic.”

    Given the paper shows most of the robust reefs are in the south, Alroy said it made him wonder “whether reefs in the far north can really be kept alive by being replenished from the south.”

    He also pointed out that many of the species of animals living on the Great Barrier Reef are likely to be absent from the ‘robust’ reefs.

    Dr Karlo Hock, of the University of Queensland and also an author of the paper, suggested more does need to be done at different scales to rescue the reef.

    “Identifying only 100 reefs with this potential across the length of the entire 2300 km Great Barrier Reef emphasises the need for effective local protection of critical locations, and carbon emission reductions to support this ecosystem,” Hock said.

    Lenton explained that just protecting these robust reefs likely isn’t enough to ensure the long-term survival of the whole Great Barrier Reef.

    “[This] will need to be coupled with climate mitigation, local management and active management such as coral re-seeding,” he suggested.

    However, Alroy warned “the paper doesn’t really address the fact that global warming is just going to get worse and worse over the next few decades and centuries.”

    “So, even the ‘robust reefs’ might be wiped out in the not-too-distant future – unless we really get serious right now about mitigating global warming.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 5:10 am on November 15, 2017 Permalink | Reply
    Tags: Cosmos Magazine

    From COSMOS: “Need a better microscope? Add mirrors” 

    Cosmos Magazine bloc

    COSMOS Magazine

    15 November 2017
    Andrew Masterson

    Antonie van Leeuwenhoek’s first microscope, from the seventeenth century, looks nothing like a modern SPIM microscope, but both are products of a quest to improve optics. Stegerphoto.

    From pre-classical times onwards, it could be argued, lens-makers have been the unsung heroes of science.

    As early as 750 BCE the Assyrians were shaping lenses from quartz. From there, the history of optics both underpins and enables discovery in both the macro and micro worlds.

    Where would science be today had it not been for the patient work of myriad lens grinders and optics theorists, including Francis Bacon, Galileo, van Leeuwenhoek, right up to Roberts and Young – inventors in 1951 of photon scanning microscopy – and beyond?

    Even today, the quest for better, clearer, more detailed images from lenses continues apace, with the latest advance, declared in the journal Nature Communications, coming from the US National Institutes of Health and the University of Chicago.

    The images obtained by the combination of the new coverslip and computer algorithms show clearer views of small structures. Credit: Yicong Wu, National Institute of Biomedical Imaging and Bioengineering

    In this diagram, you can see how the mirrored coverslip allows for four simultaneous views. Credit: Yicong Wu, National Institute of Biomedical Imaging and Bioengineering

    A team of researchers, led by Hari Shroff, head of the National Institute of Biomedical Imaging and Bioengineering’s lab section on High Resolution Optical Imaging (HROI), report the solution to a mechanical problem in microscope optics that was, in a way, of their own making.

    Several years ago, Shroff and colleagues developed a new type of microscope that performed “selective plane illumination microscopy” or SPIM. These microscopes use light sheets to illuminate only sections of specimens being examined, thereby doing less damage and better preserving the sample.

    In 2013, Shroff’s team created a SPIM microscope that used two lenses instead of one, which improved image quality and depth perception. In 2016, a third lens was added, allowing improved resolution and 3D imagery.

    A fourth lens would have boosted matters even more, but at this point van Leeuwenhoek’s twenty-first century heirs hit a snag.

    “Once we incorporated three lenses, we found it became increasingly difficult to add more,” says Shroff. “Not because we reached the limit of our computational abilities, but because we ran out of physical space.”

    Proximity was a real issue. Not only were the three lenses crowded together, but all had to be positioned extremely close to the sample being examined to allow the imaging goal – detailed views of structures within a single cell, say – to be achieved.

    In their new paper, Shroff and his colleagues reveal a solution to the problem that is nothing if not elegant. Rather than try to cram an extra lens in, they have put mirrors on the coverslip – the thin piece of glass that sits on top of the sample.

    The result – especially when coupled with new algorithms in the computerised back-end of a SPIM microscope – is better speed, efficiency and resolution.

    “It’s a lot like looking into a mirror,” Shroff explains. “If you look at a scene in a mirror, you can view perspectives that are otherwise hidden. We used this same principle with the microscope.

    “We can see the sample conventionally using the usual views enabled by the lenses themselves, while at the same time recording the reflected images of the sample provided by the mirror.”

    The addition of the tiny mirrors was not without its own problems. Every microscope raw image contains unwanted data from the source of illumination used to light up the sample. With three lenses, there are three sources of this interference; with mirrors added, these too are multiplied.

    Shroff, however, took this problem to computational imaging researcher Patrick La Riviere at the University of Chicago, who, with his team, was able to modify the processing software to eliminate the extra noise and further improve the signal.

    Francis Bacon, one thinks, would have approved.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 2:22 pm on October 28, 2017 Permalink | Reply
    Tags: , , , Cosmos Magazine, Exoplanet research with optical telescopes, Infographic: a closer look at Extremely Large Telescopes   

    From COSMOS: “Infographic: a closer look at Extremely Large Telescopes” and “How Extremely Large Telescopes will reveal exoplanets” 

    Cosmos Magazine bloc

    COSMOS Magazine

    24 October 2017

    Infographic: a closer look at Extremely Large Telescopes


    Giant Magellan Telescope, to be built at Las Campanas Observatory, some 115 km (71 mi) north-northeast of La Serena, Chile, at over 2,500 m (8,200 ft) elevation

    Organisation: US-led partnership with Australia, Brazil, Chile and Korea.
    Telescope location: Las Campanas Observatory (2,550 metres high), Atacama Region, Chile.
    Mirror size: Seven 8.4-metre diameter circular mirrors mounted together to give the collecting area of a 24.5-metre telescope.
    Instruments: adaptive optics imaging cameras, spectrographs for high and low resolution, single and multiple targets.
    Status: Partially funded. Mirror casting began 2005, four completed. Mirror polishing completed for one mirror. Site construction began in 2015.
    Expected completion: 2022 with four mirrors, 2025 with seven mirrors.
    Quirky fact: A life-sized depiction of the telescope’s seven mirrors is painted on the car park at the offices of the Observatories of the Carnegie Institution in Pasadena, California.

    TMT – Thirty Meter Telescope, proposed and now approved for Mauna Kea, Hawaii, USA, 4,207 m (13,802 ft) above sea level

    Organisation: US-led partnership with Canada, China, India and Japan.
    Telescope location: Preferred site is Mauna Kea Observatory (4,205 metres high), Big Island, Hawaii, but has been subject to dispute [since resolved]. Alternative site is Observatorio del Roque de los Muchachos (2,396 metres high), La Palma, Canary Islands.
    Mirror size: 30-metre mirror composed of 492 hexagonal component mirrors (each about 1.4 metres in diameter) butted together under computer control to form a contiguous optical surface.
    Instruments: adaptive optics imaging cameras, wide-field optical and infrared spectrographs for high and low resolution with single and multiple targets.
    Status: Partially funded. Intended to be finished in 2022.
    Quirky fact: The 66-metre diameter dome proposed for the TMT has a moveable circular aperture rather than the usual opening slit.

    ESO/E-ELT, to be built on top of Cerro Armazones in the Atacama Desert of northern Chile, at the summit of the mountain at an altitude of 3,060 metres (10,040 ft).

    Organisation: European Southern Observatory (ESO), a consortium of 15 European countries plus Brazil. In July 2017, Australia entered a 10-year strategic partnership with ESO which does not include access to the telescope.
    Telescope location: Cerro Armazones Observatory (3,060 m high), Antofagasta, Chile
    Mirror size: 39.3-metre diameter mirror composed of 798 hexagonal component mirrors (each about 1.4 metres in diameter) butted together under computer control to form a contiguous optical surface.
    Instruments: adaptive optics imaging cameras, wide-field optical and infrared spectrographs for high and low resolution with single and multiple targets.
    Status: Fully funded. First stone laid at Cerro Armazones in May 2017.
    Expected completion: 2024.
    Quirky fact: When completed, the E-ELT will be the biggest optical telescope ever built, equivalent in light-gathering power to 264 Hubble Space Telescopes.
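    A quick back-of-envelope check of that Hubble comparison: light-gathering power scales with mirror area, so a minimal sketch (ignoring central obstructions and segment gaps) lands close to the quoted figure.

```python
# Rough check of the E-ELT vs Hubble light-gathering comparison.
# Assumes light-gathering power scales with primary-mirror area and
# ignores central obstructions and segment gaps, so the result is
# approximate rather than the official figure.
eelt_d = 39.3   # E-ELT primary mirror diameter, metres
hst_d = 2.4     # Hubble Space Telescope mirror diameter, metres

ratio = (eelt_d / hst_d) ** 2   # area ratio = (diameter ratio)^2
print(f"E-ELT gathers roughly {ratio:.0f}x as much light as Hubble")
```

    The simple area ratio comes out near 268; the quoted 264 presumably reflects the telescopes’ true collecting areas rather than this idealised circle-to-circle comparison.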

    How Extremely Large Telescopes will reveal exoplanets

    24 October 2017
    Fred Watson

    Andrew Grey discovered a four-planet solar system 600 light years from Earth. He’s not a professional astronomer. He’s a 26-year-old car mechanic from Darwin. His persistence in trawling through a thousand or so light curves – star brightness graphs – has been rewarded big-time. On live television, to boot.

    Stargazing Live, a three-night blockbuster on Australia’s ABC TV, sparked a frenzy of citizen science. The challenge: find the tell-tale signatures of exoplanets in a mass of data freshly downloaded from NASA’s Kepler space observatory. Kepler’s primary mission has been to stare at more than 150,000 stars, in the hope of recording minuscule dips in brightness that betray the passage of a planet across a star’s disc. This so-called ‘transit method’ is today’s gold standard for planet-finding, having netted the vast majority of the 3,633 exoplanets found so far. Grey’s contribution to this tally was to find a star with not one but four transiting planets.
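    The dip-hunting that Grey and other volunteers did by eye can be caricatured in a few lines of code. This is a toy sketch with made-up flux values – real Kepler light curves need detrending, noise modelling and period-folding – but it shows the principle of flagging transit-like dips in brightness.

```python
# Toy transit detection: flag points where a star's brightness dips
# below a threshold. Flux values here are invented for illustration;
# a 2% dip is far deeper than typical Kepler transits.
flux = [1.0, 1.0, 0.998, 1.0, 0.98, 0.98, 1.0, 1.0, 1.0, 0.98, 0.98, 1.0]

baseline = sorted(flux)[len(flux) // 2]   # median-ish value as the baseline
threshold = 0.995 * baseline              # flag dips deeper than about 0.5%
dips = [i for i, f in enumerate(flux) if f < threshold]
print("possible transit points at indices:", dips)
```

    Two groups of flagged points, recurring at a regular interval, would be the tell-tale signature of a transiting planet.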

    Planetary production-line

    It was the first exoplanet discovery in 1995 that triggered the current industrial-scale production line of exoplanet identification. A half-Jupiter-sized world with the uninspiring name of 51 Peg b, it was found not because it dimmed the light of its parent star but because of its motion around it. Professional astronomers with moderately large telescopes have the wherewithal to measure a star’s speed very accurately, typically to within a few metres per second. That is precise enough to gauge a star’s to-and-fro motion as it is pulled off-centre by an orbiting planet.

    Planet transit. NASA/Ames

    Astronomers use a device known as a spectrograph to reveal the rainbow spectrum of light from a star. Like a colourful bar code, the spectrum carries diagnostic information about the star. Its bands shift slightly as the star speeds up or slows down (relative to the point of measurement). Using detected shifts in the bands to reveal the presence of a planet is known as the ‘Doppler wobble’ technique.
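    To get a feel for the precision involved, the non-relativistic Doppler relation Δλ/λ = v/c turns a few tens of metres per second of stellar motion into a minuscule wavelength shift. The numbers below are illustrative, not measured values for any real star.

```python
# How tiny is the 'Doppler wobble'? Non-relativistic shift: dL/L = v/c.
# Illustrative values only: H-alpha line, a 50 m/s stellar reflex speed.
c = 299_792_458.0        # speed of light, m/s
wavelength_nm = 656.3    # H-alpha spectral line, nanometres
star_speed = 50.0        # star's to-and-fro speed, m/s (illustrative)

shift_nm = wavelength_nm * star_speed / c
print(f"line shift: {shift_nm:.2e} nm")  # around a ten-thousandth of a nanometre
```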

    Working towards a planetary bar code: Shown here are spectra from three star types. A hot blue giant (top) shows absorption lines for hydrogen only. A star like the Sun (middle) also shows lines representing He, O, C, Ne, N, Mg, Si, Fe, Ca and Na; a cool brown dwarf (bottom) emits light mostly in the infrared but its visible spectrum shows a complex mix of lines from molecules and elements. Once we have ELTs, spectra will be used to analyse exoplanets. The bars will show discrete wavelengths of light absorbed by specific molecules in their atmospheres. University of Cardiff

    In the first years of exoplanet discovery it was by far the most productive method, so long as you had access to a telescope with a spectrograph. Then, in 2009, along came Kepler and everything changed. The sole mission of NASA’s space telescope was to search for exoplanets by identifying sudden dips in the brightness of stars.

    The space observatory’s success spawned a new breed of ground-based exoplanet hunters, aided by the power and affordability of new technology. Using increasingly sensitive cameras combined with computer analysis, amateurs could exploit the transit technique with telescopes far smaller than those historically needed.

    So the pace of exoplanet discovery has exploded and shows no sign of slowing down. The large sample now available reveals a diversity of planetary systems that has staggered astronomers and shattered cherished notions about system formation. We had believed, for instance, that the line-up of our Solar System – with small rocky planets close in and big gassy ones further out – reflected fundamental laws about the way solar systems form, and our models backed that up. Many of the alien systems, however, have giant gas planets within scorching distance of their sun. While Jupiter takes 12 years to orbit the Sun, so-called ‘hot Jupiters’ take only a few days.

    These giant hot gas planets nestled close to their star were the easiest to find via the Doppler wobble technique, due to the degree to which they warped their star’s motion. Using the transit method, we have also found ‘super Neptunes’ (planets like Neptune that are gassy on the outside with a solid core), ‘super Earths’ (giant rocky planets whose mass is greater than our own but less than the likes of Neptune), and ‘Earth-like planets’ (roughly the same mass as our own, orbiting in the ‘goldilocks’ zone where liquid water can exist); about 5% of analysed stars have been found to have such planets, putting the number of possible Earth-like planets in our Galaxy well into the tens of billions. Smaller worlds, below the current level of detectability, must be there, too.
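    The ‘tens of billions’ figure follows from simple arithmetic, given an assumed Milky Way star count – itself only loosely known, here taken as 200–400 billion for illustration.

```python
# Back-of-envelope for the 'tens of billions' figure: ~5% of analysed
# stars host an Earth-like planet. The Milky Way star counts used here
# (200-400 billion) are assumed round numbers, not measured values.
earthlike_fraction = 0.05

for stars in (200e9, 400e9):
    planets_billions = earthlike_fraction * stars / 1e9
    print(f"{stars:.0e} stars -> about {planets_billions:.0f} billion Earth-like planets")
```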

    While newly found planets are rolling off the production line, the truth is we have really only scratched the surface of what we can learn. But that is about to change – very radically. Enter the age of the extremely large telescopes.

    The bigger the better

    The world’s optical astronomers suffer from aperture fever; they crave ever bigger mirrors for their telescopes. This is not mere megalomania, and not even merely the wish to see more distant celestial objects. It’s mostly about how much light is at your disposal, and what clever stuff you can do with it.

    One of the really clever things that can be done with larger telescopes is to see exoplanets directly, rather than relying on how they nudge or shade their star.

    In the 1970s and 1980s, the astronomical world saw a proliferation of telescopes in the 4-metre class. The 1990s saw the introduction of 8-10 metre giants. This new generation of so-called extremely large telescopes, or ELTs, now being built has mirrors more than 20 metres in diameter. Mirrors crafted from single pieces of glass would be impossible in these sizes; but by various techniques of segmenting, and aligning individual pieces of glass with computer-controlled fingers to replicate a single reflecting surface, the size problem can be solved.

    The ELTs will chart new territory. They will peer back into the early universe to reveal its history and shed new light on mysteries like the origin of black holes, dark matter and dark energy. Just as the 16th century explorer Ferdinand Magellan – after whom one of the telescopes is named – had no idea of what he was about to discover as his ships sailed into the Pacific, we don’t know what lies ahead. But among the exciting things that will come into view are the exoplanets.

    An individual ELT, of course, also comes with an ELPT – an extremely large price tag, typically in the region of a billion dollars. Funding at this level demands large international collaborations. Three groups are actively involved in building ELTs. Two are US-led: the Giant Magellan Telescope (GMT), with which Australia is partnered, and the Thirty Meter Telescope (TMT). The third is the European Extremely Large Telescope (E-ELT).

    But ownership does not in itself dictate where the telescopes will go. To perform properly an ELT needs exquisite atmospheric conditions, and that limits possible sites to a handful of mountain-top locations. The GMT and European ELT will peer into the Southern sky from the Atacama desert in Chile.

    Common to all these ELT projects is technology to reduce the effects of turbulence in the Earth’s atmosphere. The twinkling of stars may inspire poets but it puts a serious damper on observing exoplanets. Twinkling turns star images into inflated wobbling blobs of light that hide all the detail and reduce the concentration of precious photons. That makes it very hard to snap a crisp image of an exoplanet.


    Until a couple of decades ago the only way to eliminate the twinkle was to place an observatory above the atmosphere, as with the Kepler and Hubble space telescopes. Now a technique known as adaptive optics is able to sense the incoming light to quantify the interference caused by atmospheric turbulence. This information is fed back under computer control to thin reflecting membranes that can flex thousands of times per second. This counteracts the distortion by shifting the wobbling light back to its centre, so cancelling the twinkle. The corrective process, akin to that used in noise-cancelling headphones, has taken decades to perfect. With it, Earth-based ELTs will be able to reveal detail 10 times finer than the Hubble.
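    The feedback principle can be caricatured as a simple integrator loop: each cycle, the deformable mirror subtracts a fraction of the measured wavefront error, and the residual shrinks geometrically. Gains and units below are illustrative only, not parameters of any real adaptive-optics system.

```python
# Toy model of an adaptive-optics feedback loop: each cycle, the
# deformable mirror cancels a fraction of the measured wavefront
# error, like a simple integrator controller. Values are illustrative.
gain = 0.5               # fraction of the error corrected per cycle
error = 1.0              # initial wavefront distortion (arbitrary units)

for cycle in range(10):  # real systems run thousands of cycles per second
    correction = gain * error
    error -= correction  # mirror flexes to subtract the measured error

print(f"residual distortion after 10 cycles: {error:.4f}")
```

    After ten cycles the residual has fallen by a factor of about a thousand, which is why even a modest per-cycle gain, applied thousands of times a second, can effectively cancel the twinkle.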

    Looking for life

    There has been no end of speculation about the habitability of exoplanets but ELTs will be a game changer. Their ability to image exoplanets directly raises the possibility of using spectroscopy to analyse the make-up of their atmospheres.

    The light spectrum reflected by a planet contains the signatures of any gas through which that light has passed. Like a planetary bar code, this enables identification of the elements and molecules present in an exoplanet atmosphere. Some of these elements and molecules could reveal the prospect of life.

    One of the most telling is oxygen, because it accumulates in detectable quantities only through biological processes – most notably photosynthesis. Moreover, because it reacts so readily with other molecules, oxygen has to be continuously replenished to remain in circulation. Our own planet clearly signals the presence of life by the fact that oxygen accounts for almost 21% of the atmosphere.

    The presence of oxygen in a planet’s atmosphere, however, is by no means evidence of complex life forms; it can be produced by single-celled organisms, like the cyanobacteria thought responsible for the initial oxygenation of Earth’s atmosphere some 2.3 billion years ago. Biomarkers for multi-celled organisms are more subtle. Whether such signatures might be detectable at interstellar distances is a hot topic in astrobiology. Some possibilities do exist: chlorophyll, for example. Vegetation produces a characteristic spectral profile, and this so-called ‘vegetation red edge’ is already used to map our own planet’s resources from space.

    How might we react to the unequivocal detection of rudimentary life beyond our planet? Whether life exists elsewhere in space is one of the biggest questions of our time. Even the discovery of single-celled organisms would have far-reaching implications. But the finding that really would be overwhelming is unequivocal evidence of an intelligent civilisation. The sociocultural impacts of such a discovery would be profound. Science, technology, ethics, politics and religion – all will undergo major shifts as we come to terms with a completely new perspective: we are not alone.

    The way ELTs might reveal that knowledge is by finding so-called technomarkers. These are chemicals that can only be introduced into a planet’s atmosphere in significant amounts by industrial processes. They include well-known offenders such as chlorofluorocarbons. Eventually ELTs should allow us to detect these tell-tale pollutants in the atmospheres of distant planets. The irony is inescapable: extra-terrestrial intelligence discovered because aliens were trashing their planet, just as we are trashing ours.

    See the full infographic article here .
    See the full ‘How Extremely Large Telescopes will reveal exoplanets’ article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 12:37 pm on October 27, 2017 Permalink | Reply
    Tags: Cosmos Magazine, Is he feeling optimistic about the world right now?, , Lawrence Krauss eyes the clock   

    From COSMOS: “Lawrence Krauss eyes the clock” 

    Cosmos Magazine bloc

    COSMOS Magazine

    27 October 2017
    Andrew Masterson

    Cosmologist Lawrence Krauss: pessimistic, but not gloomy. Brian de Rivera Simon/WireImage

    [Krauss said once that all scientists should be militant atheists. I object always to militancy of any kind. Beyond that people are free to choose as they wish.]

    A few days from now, theoretical physicist, cosmologist and author Lawrence Krauss will meet with other distinguished scientists to decide the next move in a project that was started just after World War II by Albert Einstein and Robert Oppenheimer.

    In his day job Krauss is the Foundation Professor of the School of Earth and Space Exploration at Arizona State University in the US, but in his downtime he also heads up the board of sponsors for the Bulletin of the Atomic Scientists.

    In January every year the Bulletin folk garner a hefty burst of media coverage, because the organisation maintains the well-known Doomsday Clock: the visual and symbolic measure of how close the Earth is to the point of total environmental disaster.

    This year, citing among other matters new US president Donald Trump’s active antipathy to climate change mitigation and nuclear weapons abolition, the boffins twitched the big hand forward. It now sits at two-and-a-half minutes to midnight.

    Very soon, the Bulletin must decide what to do next time. Krauss has already made his mind up, but isn’t in a sharing mood.

    “If I told you I’d have to kill you,” he laughs.

    (And then, by the way, he eats a Halloween-themed candy that looks like a brain. It is faintly disturbing.)

    The Doomsday Clock’s current position suggests humanity’s prospects are parlous. The simplest explanation is to attribute it to the rise in influence of alt-right anti-science lobbies and the consequent abandonment of evidence as a basis for policy-making.

    Krauss agrees, but finds fault too with himself and fellow scientists, and with everyone who until recently wrote off fundamentalist religion and climate change denial as products of fringe communities.

    “We were complacent, for sure,” he says.

    “I don’t think they were so much on the fringe. They are not on the fringe. I wish they had been. People have been intimidated. Now we’re in a position where government leaders are obviously anti-science and in many cases religious fundamentalists.

    “And that’s a huge problem because they are making policies that are clearly ridiculous. That’s a new concern, but there’s always been another one, which is more pervasive.

    “There are people, millions of them, who feel they are bad people because they question the existence of god.”

    In the US, he explains, agnosticism, much less atheism, is rarely discussed. Those who question the existence of deities often, thus, feel isolated, alone and damaged.

    This needs to change, he says. The godless need to get together and get loud.

    “What we haven’t done enough of is encourage more people to openly ridicule stupid ideas,” he says.

    “Or at least encourage people to ask questions. I think we’ve been far too polite and far too lenient – at least in my country – on religious fundamentalism.”

    That, however, needs to be but one prong of a two-pronged assault.

    “We also have not done a good job of teaching science,” he continues. “These are intimately related, because how do you tell the difference between sense and nonsense in this modern political arena filled with ‘alternative facts’?

    “We have this problem, and I really do think it stems from teaching science the wrong way. We teach it as if it’s a bunch of facts, but it’s not: it’s a process for deriving facts.”

    Over the years, Krauss has made solid contributions not just to his own fields of research but also to the cause of popularising science. His mass-market books, such as The Physics of Star Trek (1995) and A Universe from Nothing (2012), have been best-sellers.

    Now he has embarked on a new type of teaching journey, teaming up with evolutionary biologist Richard Dawkins in a travelling two-man show called Science In The Soul.

    While neither scientist is a stranger to publicity or controversy, Krauss draws a distinction between the way they approach their tasks.

    “It’s one of the discussions I often have with Richard,” he says,

    “And I think it’s because Richard has lived in [the UK academic city of] Oxford his whole life, or almost.

    “He’s one of the most impatient persons with irrationality that I’ve ever known, whereas I live in the United States so I’m quite used to it.”

    Dawkins and Krauss are bringing their show to Australia in May next year. Tickets are on sale now.

    By then, of course, we and the rest of the world will know whether the Bulletin of the Atomic Scientists decided to move the hand of the Doomsday Clock closer to midnight.

    Krauss – being a media pro as much as a scientist – will never be tempted to break embargo and let us know the decision early, but perhaps we can seek a clue obliquely.

    Is he feeling optimistic about the world right now?

    “No,” he answers. “I think that’s the sensible answer. Like my friend [novelist] Cormac McCarthy once said, ‘I’m a pessimist but that’s no reason to be gloomy’.

    “We are living in dangerous times and certainly there are many indicators that suggest that the world is getting more dangerous against various existential threats.

    “So I’m not optimistic at this point, but that doesn’t mean we give up hope, and that doesn’t mean we give up acting. And part of the point of the Bulletin is to speak out and get people to act.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    • stewarthoughblog 10:43 pm on October 27, 2017 Permalink | Reply

      Krauss is becoming more irrational, bigoted and biased with each of his anti-intellectual surges. His none book is duplicitous usurpation of science and knowledge to suit his own ideological mandates.


    • richardmitnick 7:55 am on October 28, 2017 Permalink | Reply

      I only approved your comment for freedom of speech. I agree, on atheism, Krauss is a demigod. But he is entitled to his own opinions. He should keep them to himself. Yet, in Astronomy and Cosmology he is a rock star. I keep all non-science issues out of my blog.


  • richardmitnick 7:25 am on October 11, 2017 Permalink | Reply
    Tags: , Beginning of a new field of computational science, Cosmos Magazine, ,   

    From COSMOS: “Physicists solve extreme electron puzzle” 

    Cosmos Magazine bloc

    COSMOS Magazine

    11 October 2017
    Michael Lucy

    A better understanding of how electrons behave in extreme conditions will help scientists understand stars, lasers and planets.

    The behaviour of electrons has fascinated physicists since their discovery in 1897. Getty Images/Omrikon

    On Earth, electrons are mainly well-behaved creatures. Under extreme conditions – the kind you find in a white dwarf star, say, or in the chamber of a fusion reactor – they fall into a degenerate state, and their behaviour is entirely another matter.

    By creating a better model of electrons in one of these degenerate states – called “warm dense matter” – physicists have opened the way to a better understanding of some extreme corners of the universe.

    “This is the beginning of a new field of computational science,” says Matthew Foulkes of Imperial College London, who developed the model with colleagues at the University of Kiel, in Germany, and the Los Alamos and Lawrence Livermore national laboratories in the US.

    Electrons, the familiar tiny charged particles that flow through wires to produce an electric current, are quite well understood under everyday conditions. Physicists can predict their behaviour both at very small scales (in orbit around an atomic nucleus, say) and very large (the aforementioned electric currents).

    However, at very high temperatures (often in the tens of thousands of degrees) and under great pressure, their behaviour becomes fuzzier, ruled by the arcane laws of quantum mechanics.

    The equations that describe their behaviour in this state are extremely complex and, until now, no one had found an exact solution.

    Foulkes says it took five years to develop the new techniques necessary to describe warm dense matter accurately.

    The result is a complete description of the thermodynamic properties – the relationships between energy, temperature, pressure and polarisation – of electrons in a warm-dense-matter state.

    The new model, written up in a paper in Physical Review Letters and published online as freely available computer code, will enable other scientists to improve their understanding in a range of extreme situations such as inside stars and planets, in laser laboratories and in the quest for contained nuclear fusion reactions.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 7:59 am on September 27, 2017 Permalink | Reply
    Tags: , , , , , Cosmos Magazine, Dark energy may not exist, Standard candles,   

    From COSMOS: “Dark energy may not exist” 

    Cosmos Magazine bloc

    COSMOS Magazine

    27 September 2017
    Stuart Gary

    A model of the universe that takes into account the irregular distribution of galaxies may make dark energy disappear. NASA, H. Ford (JHU), G. Illingworth (UCSC/LO), M. Clampin (STScI), G. Hartig (STScI), the ACS Science Team and ESA

    The accelerating expansion of the universe due to a mysterious quantity called “dark energy” may not be real, according to research claiming it might simply be an artefact caused by the physical structure of the cosmos.

    The findings, reported in the Monthly Notices of the Royal Astronomical Society, claim that the fit of Type Ia supernovae to a model universe with no dark energy appears to be slightly better than the fit using the standard dark energy model.

    The study’s lead author David Wiltshire, from the University of Canterbury in New Zealand, says existing dark energy models are based on a homogenous universe in which matter is evenly distributed.

    CMB per ESA/Planck


    “The real universe has a far more complicated structure, comprising galaxies, galaxy clusters, and superclusters arranged in a cosmic web of giant sheets and filaments surrounding vast near-empty voids”, says Wiltshire.

    Universe map: Sloan Digital Sky Survey (SDSS) and 2dF Galaxy Redshift Survey

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al

    Current models of the universe require dark energy to explain the observed acceleration in the rate at which the universe is expanding.

    Scientists base this conclusion on measurements of the distances to Type Ia supernovae in distant galaxies, which appear to be farther away than they would be if the universe’s expansion were not accelerating.

    Type Ia supernovae are powerful explosions bright enough to briefly outshine an entire galaxy. They’re caused by the thermonuclear destruction of a type of star known as a white dwarf – the stellar corpse of a Sun-like star.

    All Type Ia supernovae are thought to explode at around the same mass – a figure known in astrophysics as the Chandrasekhar limit – which equates to about 1.44 times the mass of the Sun.

    Because they all explode at about the same mass, they also explode with about the same level of luminosity.

    This allows astronomers to use them as standard candles to measure cosmic distances across the universe – in the same way you can determine how far away a row of street lights is along a road by how bright each one appears from where you’re standing.
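    Concretely, astronomers turn a standard candle’s known luminosity into a distance via the distance modulus, m − M = 5 log₁₀(d / 10 pc). A minimal sketch, using the commonly quoted peak absolute magnitude of roughly −19.3 for a Type Ia supernova and an illustrative apparent magnitude:

```python
# Standard-candle distance estimate via the distance modulus:
#   m - M = 5 * log10(d / 10 pc)  =>  d = 10 ** ((m - M + 5) / 5)
# M ~ -19.3 is a commonly quoted peak absolute magnitude for a Type Ia
# supernova; the apparent magnitude m is illustrative, not a real datum.
M = -19.3      # absolute magnitude (brightness as seen from 10 parsecs)
m = 16.7       # observed apparent magnitude (illustrative)

d_parsecs = 10 ** ((m - M + 5) / 5)
print(f"distance: {d_parsecs / 1e6:.0f} megaparsecs")
```

    The fainter the supernova appears, the larger m and hence the larger the inferred distance – which is exactly how the anomalously faint distant supernovae hinted at accelerating expansion.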

    Standard candles. https://www.extremetech.com

    On a galactic scale, gravity appears to be stronger than scientists can account for using the normal matter of the universe – the material in the standard model of particle physics that makes up all the stars, planets, buildings and people.

    To explain their observations, scientists invented “dark matter”, a mysterious substance which seems to only interact gravitationally with normal matter.

    To explain these observations of how galaxies move, there must be about five times as much dark matter as normal matter.

    Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    It’s called dark because whatever it is, it cannot emit light. Scientists can only see its gravitational effects on normal matter.

    On the even larger cosmic scales of an expanding universe, gravity appears to be weaker than expected in a universe containing only normal matter and dark matter.

    And so, scientists invented a new force, called “dark energy”, a sort of anti-gravitational force causing an acceleration in the expansion of the universe out from the big bang 13.8 billion years ago.

    Dark energy isn’t noticeable on small scales, but becomes the dominating force of the universe on the largest cosmic scales: almost four times greater than the gravity of normal and dark matter combined.

    The idea of dark energy isn’t new. Albert Einstein first came up with it to explain a problem he was having when he applied his famous 1915 equations of general relativity theory to the whole universe.

    Like other scientists at the time, Einstein believed the universe was in a steady unchanging state. Yet, when applied to cosmology, his equations showed the universe wanted to expand or contract as matter interacts with the fabric of spacetime: matter tells spacetime how to curve, and spacetime tells matter how to move.

    To resolve the problem, Einstein introduced a dark energy force in 1917 which he called the “cosmological constant”.

    It was a mathematical invention, a fudge factor designed to solve the discrepancies between general relativity theory and the best observational evidence of the day, thus bringing the universe back into a steady state.

    Years later, when astronomer Edwin Hubble discovered that galaxies appeared to be moving away from each other, and the rate at which they were moving was proportional to their distance, Einstein realised his mistake, describing the cosmological constant as the biggest blunder of his life.
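    Hubble’s proportionality is simply v = H₀d. A sketch with a modern round value of H₀ ≈ 70 km/s per megaparsec (not the far larger figure Hubble himself originally measured):

```python
# Hubble's relation: recession velocity proportional to distance, v = H0 * d.
# H0 = 70 km/s/Mpc is a commonly quoted round modern value, used here
# purely for illustration.
H0 = 70.0                 # km/s per megaparsec

for d_mpc in (10, 100, 1000):
    v = H0 * d_mpc        # recession velocity in km/s
    print(f"{d_mpc:5d} Mpc -> {v:8.0f} km/s")
```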

    However, the idea has never really gone away, and keeps reappearing to explain strange observations.

    In the mid 1990s two teams of scientists, one led by Brian Schmidt and Adam Riess, and the other by Saul Perlmutter, independently measured distances to Type Ia supernovae in the distant universe, finding that they appeared to be further away than they should be if the universe’s rate of expansion was constant.

    The observations led to the hypothesis that some kind of dark energy anti-gravitational force has caused the expansion of the universe to accelerate over the past six billion years.
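    The “further away than expected” inference comes from the distance modulus, μ = m − M = 5 log₁₀(d/pc) − 5, which converts a supernova’s apparent brightness into a distance. A minimal sketch, assuming a typical Type Ia absolute magnitude of about −19.3 and hypothetical example magnitudes:

```python
import math

def distance_pc(apparent_mag, absolute_mag=-19.3):
    """Distance in parsecs from the distance modulus mu = m - M.

    M = -19.3 is a typical absolute magnitude for a Type Ia supernova.
    """
    mu = apparent_mag - absolute_mag
    return 10 ** (mu / 5 + 1)

# A supernova observed 0.25 magnitudes fainter than expected sits
# about 12% further away than a constant expansion rate predicts.
d_expected = distance_pc(24.0)   # at the expected brightness
d_observed = distance_pc(24.25)  # observed 0.25 mag fainter
print(d_observed / d_expected)   # ~1.12
```

    It was precisely this kind of systematic faintness across many supernovae that led the two teams to infer accelerating expansion.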

    Wiltshire and his colleagues now challenge that reasoning.

    “But these observations are based on an old model of expansion that has not changed since the 1920s”, he says.

    In 1922, Russian physicist Alexander Friedmann used Einstein’s field equations to develop a physical cosmology governing the expansion of space in homogeneous and isotropic models of the universe.

    “Friedmann’s equation assumes an expansion identical to that of a featureless soup, with no complicating structure”, says Wiltshire.

    This has become the basis of the standard Lambda Cold Dark Matter cosmology used to describe the universe.
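    In symbols, Friedmann’s equation for a homogeneous, isotropic universe (the standard textbook form) relates the expansion rate to the universe’s contents:

```latex
\[
\left(\frac{\dot{a}}{a}\right)^{2}
  = \frac{8\pi G}{3}\,\rho
  \;-\; \frac{kc^{2}}{a^{2}}
  \;+\; \frac{\Lambda c^{2}}{3}
\]
```

    Here a(t) is the scale factor of the universe, ρ the (assumed uniform) matter density, k the spatial curvature, and Λ Einstein’s cosmological constant. The uniform ρ is the “featureless soup” Wiltshire refers to, and a positive Λ is the dark energy term of the Lambda Cold Dark Matter model.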

    “In reality, today’s universe is not homogeneous”, says Wiltshire.

    The earliest snapshot of the universe – called cosmic microwave background radiation – displays only slight temperature variations caused by differences in densities present 370,000 years after the Big Bang.

    However, gravitational instability caused those tiny density variations to grow into the stars, galaxies, and clusters of galaxies that make up the large-scale structure of the universe today.

    “The universe has become a vast cosmic web dominated in volume by empty voids, surrounded by sheets of galaxies and threaded by wispy filaments”, says Wiltshire.

    Rather than comparing the supernova observations to the standard Lambda Cold Dark Matter cosmological model, Wiltshire and colleagues used a different model, called ‘timescape cosmology’.

    Timescape cosmology has no dark energy. Instead, it includes variations in the effects of gravity caused by the lumpiness in the structure in the universe.

    Clocks carried by observers in galaxies differ from the clock that best describes the average expansion once variations within the universe (known as “inhomogeneity” in the trade) become significant.

    Whether or not one infers accelerating expansion then depends crucially on the clock used.

    “Timescape cosmology gives a slightly better fit to the largest supernova data catalogue than Lambda Cold Dark Matter cosmology,” says Wiltshire.

    He admits the statistical evidence is not yet strong enough to definitively rule in favour of one model over the other, and adds that future missions such as the European Space Agency’s Euclid spacecraft will have the power to distinguish between differing cosmology models.

    ESA/Euclid spacecraft

    Another problem involves science’s understanding of Type Ia supernovae. They are not actually perfect standard candles, despite being treated as such in calculations.

    Since timescape cosmology uses a different equation for average expansion, it gives scientists a new way to test for changes in the properties of supernovae over distance.

    Regardless of which model ultimately fits best, a clearer understanding of Type Ia supernovae will increase the confidence with which scientists can use them as precise distance indicators.

    Answering questions like these will help scientists determine whether dark energy is real or not – an important step in determining the ultimate fate of the universe.

    See the full article here.

