Tagged: Cosmos Magazine

  • richardmitnick 10:59 am on December 4, 2017 Permalink | Reply
    Tags: Australia seems on the brink of embracing space in a coordinated manner but how should we do it?, Australian universities made cubesats for an international research project, Cosmos Magazine, It is encouraging that Australian organisations have anticipated the growth areas, There are also emerging Australian capabilities in small satellites and potentially disruptive technologies with space applications, Three new reports add clarity to Australia’s space sector a ‘crowded and valuable high ground’

    From COSMOS: “Three new reports add clarity to Australia’s space sector, a ‘crowded and valuable high ground’” 

    Cosmos Magazine bloc

    COSMOS Magazine

    02 December 2017
    Anthony Wicht

    Three new reports examine Australia’s existing space capabilities, set them in the light of international developments, and identify growth areas and models for Australia to pursue. Credit: 136319147@N08/flickr. The telescope pictured is not identified. Bad journalism.

    Australia seems on the brink of embracing space in a coordinated manner, but how should we do it?

    This week, the Australian government released three reports to help chart the future of Australia’s space industry. Their conclusions will feed into the review of Australia’s space industry underway by former CSIRO head Dr Megan Clark.

    The reports examine Australia’s existing space capabilities, set them in the light of international developments, and identify growth areas and models for Australia to pursue. The promise is there:

    Australia has scattered globally competitive capabilities in areas from space weather to deep-space communication but “by far the strongest areas” are applications of satellite data on Earth to industries like agriculture, communications and mining
    Australian research in other sectors like 3D printing and VR is being translated to space with potentially high payoffs
    global trends, including the demand for more space traffic management, play to our emerging strengths
    the prize for success is real – the UK currently has an A$8 billion space export industry, and anticipates further growth.

    While it is not the first time the government has commissioned this type of research, the updates are welcome given the fast pace of space innovation. Taken together they paint a picture of potential for the future of Australian space and a firm foundation for a space agency.

    The rules of the game

    The Global Space Industry Dynamics report from Bryce Space and Technology, a US-based space specialist consulting firm, sets out the “rules of the game” in the US$344 billion (A$450 billion) space sector.

    The global space economy at a glance. Figures are from 2016, and shown in US$.
    Marcella Cheng for The Conversation, adapted from Global Space Industry Dynamics Research Paper by Bryce Space and Technology

    It highlights that:

    three quarters of global revenues are made commercially, despite the prevailing perception that space is a government concern
    most commercial revenue is made from space-enabled services and applications (like satellite TV or GPS receivers) rather than the construction and launch of space hardware itself
    commercial launch and satellite manufacturing industries are still small in relative terms, at about US$20.5 billion (A$27 billion) of revenues, but show strong growth, particularly for smaller satellites and launch vehicles.

    The report also looks at the emerging trends that a smart space industry in Australia will try to run ahead of. Space is becoming cheaper, more attractive to investors and increasingly important in our data-rich economy. These trends have not gone unnoticed by global competitors, though, and the report describes space as an increasingly “crowded and valuable high ground”.

    What is particularly useful about the report is its sharp focus on the three numbers that determine commercial attractiveness:

    market size

    The magic comes through matching these attractive sectors against areas where Australia can compete strongly because of existing capability or geographic advantage.

    The report suggests growth opportunities across traditional and emerging space sectors. In traditional sectors, it calls out satellite services, particularly commercial satellite radio and broadband, and ground infrastructure as prime opportunities. In emerging sectors, earth observation data analytics, space traffic management, and small satellite manufacturing are all tipped as potentially profitable growth areas where Australia could compete.

    The report adds the speculative area of space mining as an additional sector worth considering given Australia’s existing terrestrial capability.

    It is encouraging that Australian organisations have anticipated the growth areas, from UNSW’s off-earth mining research to Geoscience Australia’s integrated satellite data and Mt Stromlo’s debris-tracking capability.

    Australian capabilities

    Australian capabilities are the focus of a second report, by ACIL Allen consulting, Australian Space Industry Capability. The review highlights a smattering of world class Australian capabilities, particularly in the application of space data to activities on Earth like agriculture, transport and financial services.

    There are also emerging Australian capabilities in small satellites and potentially disruptive technologies with space applications, like 3D printing, AI and quantum computing. The report notes that basic research is strong, but challenges remain in “industrialising and commercialising the resulting products”.

    Australian universities made cubesats for an international research project.

    The concern about commercialisation prompts questions about the policies that will help Australian companies succeed.

    Should we embrace recent trends and rely wholly on market mechanisms and venture capital Darwinism, or buy into traditional international space projects?

    Do we send our brightest overseas for a few years’ training, or spin up a full suite of research and development programs domestically?

    Are there regulations that need to change to level the playing field for Australian space exports?

    Learning from the world

    Part of the answer is to be found in the third report, Global Space Strategies and Best Practices, which looks at global approaches to funding, capability development, and governance arrangements. The case studies illustrate a range of styles.

    The UK’s pragmatic approach developed a £5 billion (A$8 billion) export industry by focusing primarily on competitive commercial applications, including a satellite Australia recently bought a time-share on.

    A longer-term play is Luxembourg’s use of tax breaks and legal changes to attract space mining ventures. Before laughing, remember that Luxembourg has space clout: satellite giants SES and Intelsat are headquartered there thanks to similar forward thinking in the 1980s. Those two companies pulled in about A$3 billion of profit between them last year.

    Norway and Canada show a middle ground, combining international partnerships with clear focus areas that benefit research and the economy. Norway has taken advantage of its geography to build satellite ground stations for polar-orbiting satellites, in an interesting parallel with Australia’s longstanding ground capabilities. Canada used its relationship with the United States to build the robotic “Canadarm” for the Space Shuttle and International Space Station, developing a space robotics capability for the country.

    Canadarm played an important role in Canada-USA relations.

    The only caution is that confining the possible role models to the space sector is unnecessarily limiting. Commercialisation in technology fields is a broader policy question, and there is much to learn from recent innovations including CSIRO’s venture fund and the broader Cooperative Research Centre (CRC) program.

    As well as the three reports, the government recently released 140 public submissions to the panel.

    There is no shortage of advice for Dr Clark and the expert reference group, which seems appropriate given that an industry of remarkable potential rests in their hands.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 10:05 am on December 4, 2017 Permalink | Reply
    Tags: Can the Great Barrier Reef regenerate?, Cosmos Magazine

    From COSMOS Magazine: “Can the Great Barrier Reef regenerate?” 


    02 December 2017
    No writer credit

    Well-positioned “robust reefs” may provide coral larvae to help the Great Barrier Reef regenerate after catastrophic bleaching.
    Peter Mumby

    The Great Barrier Reef’s health could be boosted by just 3 per cent of its reefs, according to an Australian-led study.

    The authors identified around 100 reefs that should harbour healthy adult corals and be well connected enough to supply larvae to almost half of the Great Barrier Reef in a single year.

    By simulating the dispersal of larvae, the researchers could pinpoint which smaller reefs were best connected by ocean currents to the rest of the Great Barrier Reef and could top it up.

    They then used ocean and climate system models to show which reefs were less likely to be exposed to coral bleaching and the crown-of-thorns starfish – a pest that eats coral – and crosschecked that list against the first to come up with a ‘robust’ 3% of reefs.
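    The two-step screening described here amounts to intersecting a “well connected” list with a “low risk” list. Below is a minimal sketch in Python with invented reef names and scores; the study itself used hydrodynamic simulations of larval dispersal and climate models, not toy scores like these.

    ```python
    # Step 1: how well ocean currents connect each reef to the rest of the system
    # (hypothetical scores for illustration only)
    connectivity = {"A": 0.9, "B": 0.8, "C": 0.7, "D": 0.2, "E": 0.1}

    # Step 2: reefs flagged as low-risk for bleaching and crown-of-thorns outbreaks
    low_risk = {"A", "C", "E"}

    # Cross-check the two lists: 'robust' reefs are well connected AND low-risk
    well_connected = {reef for reef, score in connectivity.items() if score >= 0.5}
    robust = sorted(well_connected & low_risk)

    print(robust)  # ['A', 'C']
    ```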

    The authors of the PLOS Biology paper say these 100 reefs could help desirable species recover – suggesting a level of widespread resilience for the Great Barrier Reef – and that these reefs are unlikely to spread crown-of-thorns starfish.

    “Finding these 100 reefs is a little like revealing the cardiovascular system of the Great Barrier Reef,” explained study author Professor Peter Mumby of the University of Queensland.

    “These refugia are critical as they maintain the healthy populations and diversity required to rebuild coral populations, and have the ability to repopulate other reefs,” Dr Andrew Lenton of CSIRO Oceans and Atmosphere told the Australian Science Media Centre.

    However, there’s reason to be sceptical, according to Associate Professor John Alroy of Macquarie University: “I think [the paper] makes a good case that corals will persist for a while on a fair number of reefs. But I think it’s optimistic.”

    Given the paper shows most of the robust reefs are in the south, Alroy said it made him wonder “whether reefs in the far north can really be kept alive by being replenished from the south.”

    He also pointed out that many of the species of animals living on the Great Barrier Reef are likely to be absent from the ‘robust’ reefs.

    Dr Karlo Hock, of the University of Queensland and also an author of the paper, suggested more does need to be done at different scales to rescue the reef.

    “Identifying only 100 reefs with this potential across the length of the entire 2300 km Great Barrier Reef emphasises the need for effective local protection of critical locations, and carbon emission reductions to support this ecosystem,” Hock said.

    Lenton explained that just protecting these robust reefs likely isn’t enough to ensure the long-term survival of the whole Great Barrier Reef.

    “[This] will need to be coupled with climate mitigation, local management and active management such as coral re-seeding,” he suggested.

    However, Alroy warned “the paper doesn’t really address the fact that global warming is just going to get worse and worse over the next few decades and centuries.”

    “So, even the ‘robust reefs’ might be wiped out in the not-too-distant future – unless we really get serious right now about mitigating global warming.”

    See the full article here.


  • richardmitnick 5:10 am on November 15, 2017 Permalink | Reply
    Tags: Cosmos Magazine

    From COSMOS: “Need a better microscope? Add mirrors” 


    15 November 2017
    Andrew Masterson

    Antonie van Leeuwenhoek’s first microscope, from the seventeenth century, looks nothing like a modern SPIM microscope, but both are products of a quest to improve optics. Credit: Stegerphoto.

    From pre-classical times onwards, it could be argued, lens-makers have been the unsung heroes of science.

    As early as 750 BCE the Assyrians were shaping lenses from quartz. From there, the history of optics both underpins and enables discovery in both the macro and micro worlds.

    Where would science be today had it not been for the patient work of myriad lens grinders and optics theorists, including Francis Bacon, Galileo, van Leeuwenhoek, right up to Roberts and Young – inventors in 1951 of photon scanning microscopy – and beyond?

    Even today, the quest for better, clearer, more detailed images from lenses continues apace, with the latest advance, declared in the journal Nature Communications, coming from the US National Institutes of Health and the University of Chicago.

    The images obtained by the combination of the new coverslip and computer algorithms show clearer views of small structures. Credit: Yicong Wu, National Institute of Biomedical Imaging and Bioengineering

    In this diagram, you can see how the mirrored coverslip allows for four simultaneous views. Credit: Yicong Wu, National Institute of Biomedical Imaging and Bioengineering

    A team of researchers, led by Hari Shroff, head of the National Institute of Biomedical Imaging and Bioengineering’s lab section on High Resolution Optical Imaging (HROI), report the solution to a mechanical problem in microscope optics that was, in a way, of their own making.

    Several years ago, Shroff and colleagues developed a new type of microscope that performed “selective plane illumination microscopy” or SPIM. These microscopes use light sheets to illuminate only sections of specimens being examined, thereby doing less damage and better preserving the sample.

    In 2013, Shroff’s team created a SPIM microscope that used two lenses instead of one, which improved image quality and depth perception. In 2016, a third lens was added, allowing improved resolution and 3D imagery.

    A fourth lens would have boosted matters even more, but at this point van Leeuwenhoek’s twenty-first century heirs hit a snag.

    “Once we incorporated three lenses, we found it became increasingly difficult to add more,” says Shroff. “Not because we reached the limit of our computational abilities, but because we ran out of physical space.”

    Proximity was a real issue. Not only were the three lenses crowded together, but all had to be positioned extremely close to the sample being examined to allow the imaging goal – detailed views of structures within a single cell, say – to be achieved.

    In their new paper, Shroff and his colleagues reveal a solution to the problem that is nothing if not elegant. Rather than try to cram an extra lens in, they have put mirrors on the coverslip – the thin piece of glass that sits on top of the sample.

    The result – especially when coupled with new algorithms in the computerised back-end of a SPIM microscope – is better speed, efficiency and resolution.

    “It’s a lot like looking into a mirror,” Shroff explains. “If you look at a scene in a mirror, you can view perspectives that are otherwise hidden. We used this same principle with the microscope.

    “We can see the sample conventionally using the usual views enabled by the lenses themselves, while at the same time recording the reflected images of the sample provided by the mirror.”

    The addition of the tiny mirrors was not without its own problems. Every microscope raw image contains unwanted data from the source of illumination used to light up the sample. With three lenses, there are three sources of this interference; with mirrors added, these too are multiplied.

    Shroff, however, took this problem to computational imaging researcher Patrick La Riviere at the University of Chicago, who, with his team, was able to modify the processing software to eliminate the extra noise and further improve the signal.

    Francis Bacon, one thinks, would have approved.

    See the full article here.


  • richardmitnick 2:22 pm on October 28, 2017 Permalink | Reply
    Tags: Cosmos Magazine, Exoplanet research with optical telescopes, Infographic: a closer look at Extremely Large Telescopes

    From COSMOS: “Infographic: a closer look at Extremely Large Telescopes” and “How Extremely Large Telescopes will reveal exoplanets” 


    24 October 2017

    Infographic: a closer look at Extremely Large Telescopes


    Giant Magellan Telescope, to be built at Las Campanas Observatory, some 115 km (71 mi) north-northeast of La Serena, Chile, at over 2,500 m (8,200 ft) elevation.

    Organisation: US-led partnership with Australia, Brazil, Chile and Korea.
    Telescope location: Las Campanas Observatory (2,550 metres high), Atacama Region, Chile.
    Mirror size: Seven 8.4-metre diameter circular mirrors mounted together to give the collecting area of a 24.5-metre telescope.
    Instruments: adaptive optics imaging cameras, spectrographs for high and low resolution, single and multiple targets.
    Status: Partially funded. Mirror casting began 2005, four completed. Mirror polishing completed for one mirror. Site construction began in 2015.
    Expected completion: 2022 with four mirrors, 2025 with seven mirrors.
    Quirky fact: A life-sized depiction of the telescope’s seven mirrors is painted on the car park at the offices of the Observatories of the Carnegie Institution in Pasadena, California.

    TMT (Thirty Meter Telescope), proposed and now approved for Mauna Kea, Hawaii, USA, 4,207 m (13,802 ft) above sea level.

    Organisation: US-led partnership with Canada, China, India and Japan.
    Telescope location: Preferred site is Mauna Kea Observatory (4,205 metres high), Big Island, Hawaii, but has been subject to dispute [since resolved]. Alternative site is Observatorio del Roque de los Muchachos (2,396 metres high), La Palma, Canary Islands.
    Mirror size: 30-metre mirror composed of 492 hexagonal component mirrors (each about 1.4 metres in diameter) butted together under computer control to form a contiguous optical surface.
    Instruments: adaptive optics imaging cameras, wide-field optical and infrared spectrographs for high and low resolution with single and multiple targets.
    Status: Partially funded. Intended to be finished in 2022.
    Quirky fact: The 66-metre diameter dome proposed for the TMT has a moveable circular aperture rather than the usual opening slit.

    ESO/E-ELT, to be built atop Cerro Armazones in the Atacama Desert of northern Chile, at the summit of the mountain at an altitude of 3,060 metres (10,040 ft).

    Organisation: European Southern Observatory (ESO), a consortium of 15 European countries plus Brazil. In July 2017, Australia entered a 10-year strategic partnership with ESO which does not include access to the telescope.
    Telescope location: Cerro Armazones Observatory (3,060 m high), Antofagasta, Chile
    Mirror size: 39.3-metre diameter mirror composed of 798 hexagonal component mirrors (each about 1.4-metres in diameter) butted together under computer control to form a contiguous optical surface.
    Instruments: adaptive optics imaging cameras, wide-field optical and infrared spectrographs for high and low resolution with single and multiple targets.
    Status: Fully funded. First stone laid at Cerro Armazones in May 2017.
    Expected completion: 2024.
    Quirky fact: When completed, the E-ELT will be the biggest optical telescope ever built, equivalent in light-gathering power to 264 Hubble Space Telescopes.
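    A rough cross-check on these light-gathering claims: collecting area scales with the square of mirror diameter. The sketch below treats each aperture as a filled circle, ignoring segment gaps and central obstructions, so real effective areas are somewhat lower; that is why the naive ratio comes out slightly above the 264-Hubbles figure quoted for the E-ELT.

    ```python
    import math

    def collecting_area(diameter_m):
        """Area of a filled circular aperture. Ignores segment gaps and central
        obstructions, so true effective areas are somewhat lower."""
        return math.pi * (diameter_m / 2) ** 2

    hubble = collecting_area(2.4)       # Hubble's 2.4 m mirror: ~4.5 m^2
    gmt = 7 * collecting_area(8.4)      # GMT's seven circular mirrors: ~388 m^2
    tmt = collecting_area(30.0)         # ~707 m^2
    eelt = collecting_area(39.3)        # ~1213 m^2

    print(f"E-ELT gathers roughly {eelt / hubble:.0f} times Hubble's light")
    ```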

    How Extremely Large Telescopes will reveal exoplanets

    24 October 2017
    Fred Watson

    Andrew Grey discovered a four-planet solar system 600 light years from Earth. He’s not a professional astronomer. He’s a 26-year-old car mechanic from Darwin. His persistence in trawling through a thousand or so light curves – star brightness graphs – has been rewarded big-time. On live television, to boot.

    Stargazing Live, a three-night blockbuster on Australia’s ABC TV, sparked a frenzy of citizen science. The challenge: find the tell-tale signatures of exoplanets in a mass of data freshly downloaded from NASA’s Kepler space observatory. Kepler’s primary mission has been to stare at more than 150,000 stars, in the hope of recording minuscule dips in brightness that betray the passage of a planet across a star’s disc. This so-called ‘transit method’ is today’s gold standard for planet-finding, having netted the vast majority of the 3,633 exoplanets found so far. Grey’s contribution to this tally was to find a star with not one but four transiting planets.
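    The transit method boils down to spotting small periodic dips in a star's brightness: the fractional dip is roughly the square of the planet-to-star radius ratio. A toy illustration with invented numbers, nothing like real Kepler data processing:

    ```python
    # Toy transit light curve: the fractional dip is (Rp/Rs)^2. All values invented.

    radius_ratio = 0.1                 # Jupiter-ish planet around a Sun-like star
    depth = radius_ratio ** 2          # fraction of starlight blocked: ~1%

    period, duration = 50, 3           # transit every 50 samples, lasting 3 samples
    light_curve = [1.0] * 200
    for t in range(len(light_curve)):
        if t % period < duration:
            light_curve[t] -= depth    # star dims while the planet crosses its disc

    # Detection: flag samples dipping below the out-of-transit baseline
    in_transit = [t for t, flux in enumerate(light_curve) if flux < 1.0 - depth / 2]
    print(len(in_transit), "in-transit samples found")   # 12
    ```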

    Planetary production-line

    It was the first exoplanet discovery, in 1995, that triggered the current industrial-scale production line of exoplanet identification. A half-Jupiter-sized world with the uninspiring name of 51 Peg b, it was found not because it dimmed the light of its parent star but because of its motion around it. Professional astronomers with moderately large telescopes have the wherewithal to measure a star’s speed very accurately, typically to a precision of a few metres per second. That is precise enough to gauge a star’s to-and-fro motion as it is pulled off-centre by an orbiting planet.

    Planet transit. NASA/Ames

    Astronomers use a device known as a spectrograph to reveal the rainbow spectrum of light from a star. Like a colourful bar code, the spectrum carries diagnostic information about the star. Its bands shift slightly as the star speeds up or slows down (relative to the point of measurement). Using detected shifts in the bands to reveal the presence of a planet is known as the ‘Doppler wobble’ technique.
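    The size of the shift is set by the ratio of the star's speed to the speed of light, which makes clear why spectrographs must be so precise. A back-of-envelope calculation with illustrative numbers:

    ```python
    # Doppler wobble back-of-envelope: fractional line shift = v / c.
    # The star's speed here is illustrative, roughly a 51 Peg b-class signal.

    c = 299_792_458.0        # speed of light, m/s
    wavelength_nm = 500.0    # a spectral line in the visible
    v = 50.0                 # star's to-and-fro speed, m/s

    shift_nm = wavelength_nm * v / c
    print(f"line shift: {shift_nm:.1e} nm")   # under a ten-thousandth of a nanometre
    ```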

    Working towards a planetary bar code: shown here are spectra from three star types. A hot blue giant (top) shows absorption lines for hydrogen only. A star like the Sun (middle) also shows lines representing He, O, C, Ne, N, Mg, Si, Fe, Ca and Na. A cool brown dwarf (bottom) emits light mostly in the infrared, but its visible spectrum shows a complex mix of lines from molecules and elements. Once we have ELTs, spectra will be used to analyse exoplanets: the bars will show discrete wavelengths of light absorbed by specific molecules in their atmospheres. Credit: University of Cardiff.

    In the first years of exoplanet discovery it was by far the most productive method, so long as you had access to a telescope with a spectrograph. Then, in 2009, along came Kepler and everything changed. The sole mission of NASA’s space telescope was to search for exoplanets by identifying sudden dips in the brightness of stars.

    The space observatory’s success spawned a new breed of ground-based exoplanet hunters, aided by the power and affordability of new technology. Using increasingly sensitive cameras combined with computer analysis, amateurs could exploit the transit technique with telescopes far smaller than those historically needed.

    So the pace of exoplanet discovery has exploded and shows no sign of slowing down. The large sample now available reveals a diversity of planetary systems that has staggered astronomers and shattered cherished notions about system formation. We had believed, for instance, that the line-up of our Solar System – with small rocky planets close in and big gassy ones further out – reflected fundamental laws about the way solar systems form, and our models backed that up. Many of the alien systems, however, have giant gas planets within scorching distance of their sun. While Jupiter takes 12 years to orbit the Sun, so-called ‘hot Jupiters’ take only a few days.

    These giant hot gas planets nestled close to their star were the easiest to find via the Doppler wobble technique, due to the degree they warped their star’s motion. Using the transit method, we have also found ‘super Neptunes’ (planets like Neptune that are gassy on the outside with a solid core), ‘super Earths’ (giant rocky planets whose mass is greater than our own but less than the likes of Neptune), and ‘Earth-like planets’ (roughly the same mass as our own, orbiting in the ‘goldilocks’ zone where liquid water can exist); about 5% of analysed stars have been found to have such planets, putting the number of possible Earth-like planets in our Galaxy well into the tens of billions. Smaller worlds, below the current level of detectability, must be there, too.

    While newly found planets are rolling off the production line, the truth is we have really only scratched the surface of what we can learn. But that is about to change – very radically. Enter the age of the extremely large telescopes.

    The bigger the better

    The world’s optical astronomers suffer from aperture fever; they crave ever bigger mirrors for their telescopes. This is not mere megalomania, and not even merely the wish to see more distant celestial objects. It’s mostly about how much light is at your disposal, and what clever stuff you can do with it.

    One of the really clever things that can be done with larger telescopes is to see exoplanets directly, rather than relying on how they nudge or shade their star.

    In the 1970s and 1980s, the astronomical world saw a proliferation of telescopes in the 4-metre class. The 1990s saw the introduction of 8-10 metre giants. This new generation of so-called extremely large telescopes, or ELTs, now being built have mirrors more than 20 metres in diameter. Mirrors made from single pieces of glass would be impossible at these sizes; but by various techniques of segmenting, and aligning individual pieces of glass with computer-controlled actuators to replicate a single reflecting surface, the size problem can be solved.

    The ELTs will chart new territory. They will peer back into the early universe to reveal its history and shed new light on mysteries like the origin of black holes, dark matter and dark energy. Just as the 16th century explorer Ferdinand Magellan – after whom one of the telescopes is named – had no idea of what he was about to discover as his ships sailed into the Pacific, we don’t know what lies ahead. But among the exciting things that will come into view are the exoplanets.

    An individual ELT, of course, also comes with an ELPT – an extremely large price tag, typically in the region of a billion dollars. Funding at this level demands large international collaborations. Three groups are actively involved in building ELTs. Two are US-led: the Giant Magellan Telescope (GMT), with which Australia is partnered, and the Thirty Metre Telescope (TMT). The third is the European Extremely Large Telescope (E-ELT).

    But ownership does not in itself dictate where the telescopes will go. To perform properly an ELT needs exquisite atmospheric conditions, and that limits possible sites to a handful of mountain-top locations. The GMT and European ELT will peer into the Southern sky from the Atacama desert in Chile.

    Common to all these ELT projects is technology to reduce the effects of turbulence in the Earth’s atmosphere. The twinkling of stars may inspire poets but it puts a serious damper on observing exoplanets. Twinkling turns star images into inflated wobbling blobs of light that hide all the detail and reduces the concentration of precious photons. That makes it very hard to snap a crisp image of an exoplanet.

    Until a couple of decades ago the only way to eliminate the twinkle was to place an observatory above the atmosphere, as with the Kepler and Hubble space telescopes. Now a technique known as adaptive optics is able to sense the incoming light to quantify the interference caused by atmospheric turbulence. This information is fed back under computer control to thin reflecting membranes that can flex thousands of times per second. This counteracts the distortion by shifting the wobbling light back to its centre, so cancelling the twinkle. The corrective process, akin to that used in noise-cancelling headphones, has taken decades to perfect. With it, Earth-based ELTs will be able to reveal detail 10 times finer than the Hubble.
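    The corrective loop can be caricatured in a few lines: measure the residual distortion, drive the corrector toward its opposite, repeat. A toy one-dimensional model with made-up numbers; real systems correct two-dimensional wavefronts thousands of times per second:

    ```python
    # Caricature of the adaptive-optics feedback loop. All numbers invented.

    wavefront_error = [0.30, -0.12, 0.05, -0.21]   # atmospheric distortion per mirror zone
    gain = 0.6                                     # partial correction per cycle, for loop stability
    mirror = [0.0] * len(wavefront_error)          # deformable-mirror shape

    for _ in range(20):                            # 20 correction cycles
        residual = [e + m for e, m in zip(wavefront_error, mirror)]
        mirror = [m - gain * r for m, r in zip(mirror, residual)]

    residual = [e + m for e, m in zip(wavefront_error, mirror)]
    print(max(abs(r) for r in residual))           # residual distortion is now ~0
    ```

    Each cycle shrinks the residual by a factor of (1 − gain), which is why the distortion converges toward zero rather than being cancelled in one step.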

    Looking for life

    There has been no end of speculation about the habitability of exoplanets but ELTs will be a game changer. Their ability to image exoplanets directly raises the possibility of using spectroscopy to analyse the make-up of their atmospheres.

    The light spectrum reflected by a planet contains the signatures of any gas through which that light has passed. Like a planetary bar code, this enables identification of the elements and molecules present in an exoplanet atmosphere. Some of these elements and molecules could reveal the prospect of life.

    One of the most telling is oxygen, because it accumulates in detectable quantities only through biological processes – most notably photosynthesis. Moreover, because it reacts so readily with other molecules, oxygen has to be continuously replenished to remain in circulation. Our own planet clearly signals the presence of life by the fact oxygen accounts for almost 21% of the atmosphere.

    The presence of oxygen in a planet’s atmosphere, however, is by no means evidence of complex life forms; it can be produced by single-celled organisms, like the cyanobacteria thought responsible for the initial oxygenation of Earth’s atmosphere some 2.3 billion years ago. Biomarkers for multi-celled organisms are more subtle. Whether such signatures might be detectable at interstellar distances is a hot topic in astrobiology. Some possibilities do exist: for example, the chlorophyll content. Vegetation produces a characteristic spectral profile. This so-called ‘vegetation red edge’ is already used to map our own planet’s resources from space.
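    Reading the planetary ‘bar code’ amounts to matching observed absorption features against a table of known molecular bands. In the sketch below the oxygen A-band near 760 nm and the water and methane bands are real features, but the ‘observed’ dips and the matching tolerance are invented for illustration:

    ```python
    # Match hypothetical absorption dips against known molecular bands (nm).

    known_bands_nm = {
        "O2 (a potential biomarker)": 760,
        "H2O": 940,
        "CH4": 890,
    }

    observed_dips_nm = [761, 942, 1400]   # hypothetical detected features
    tolerance_nm = 5

    results = {}
    for dip in observed_dips_nm:
        matches = [name for name, band in known_bands_nm.items()
                   if abs(dip - band) <= tolerance_nm]
        results[dip] = matches or ["unidentified"]
        print(dip, "->", results[dip])
    ```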

    How might we react to the unequivocal detection of rudimentary life beyond our planet? Whether life exists elsewhere in space is one of the biggest questions of our time. Even the discovery of single-celled organisms would have far-reaching implications. But the finding that really would be overwhelming is unequivocal evidence of an intelligent civilisation. The sociocultural impacts of such a discovery would be profound. Science, technology, ethics, politics and religion – all will undergo major shifts as we come to terms with a completely new perspective: we are not alone.

    The way ELTs might reveal that knowledge is by finding so-called technomarkers: chemicals that can only be introduced into a planet’s atmosphere in significant amounts by industrial processes. They include well-known offenders such as chlorofluorocarbons. Eventually ELTs should allow us to detect these tell-tale pollutants in the atmospheres of distant planets. The irony is inescapable: extra-terrestrial intelligence discovered because aliens were trashing their planet, just as we are trashing ours.

    See the full infographic article here.
    See the full ‘How Extremely Large Telescopes will reveal exoplanets’ article here.


  • richardmitnick 12:37 pm on October 27, 2017 Permalink | Reply
    Tags: Cosmos Magazine, Is he feeling optimistic about the world right now?, Lawrence Krauss eyes the clock

    From COSMOS: “Lawrence Krauss eyes the clock” 

    Cosmos Magazine bloc

    COSMOS Magazine

    27 October 2017
    Andrew Masterson

    Cosmologist Lawrence Krauss: pessimistic, but not gloomy. Brian de Rivera Simon/WireImage

    [Krauss said once that all scientists should be militant atheists. I object always to militancy of any kind. Beyond that people are free to choose as they wish.]

    A few days from now, theoretical physicist, cosmologist and author Lawrence Krauss will meet with other distinguished scientists to decide the next move in a project that was started just after World War II by Albert Einstein and Robert Oppenheimer.

    In his day job Krauss is the Foundation Professor of the School of Earth and Space Exploration at Arizona State University in the US, but in his downtime he also heads up the board of sponsors for the Bulletin of the Atomic Scientists.

    In January every year the Bulletin folk garner a hefty burst of media coverage, because the organisation maintains the well-known Doomsday Clock: the visual and symbolic measure of how close humanity is to global catastrophe.

    This year, citing among other matters new US president Donald Trump’s active antipathy to climate change mitigation and nuclear weapons abolition, the boffins twitched the big hand forward. It now sits at two-and-a-half minutes to midnight.

    Very soon, the Bulletin must decide what to do next time. Krauss has already made his mind up, but isn’t in a sharing mood.

    “If I told you I’d have to kill you,” he laughs.

    (And then, by the way, he eats a Halloween-themed candy that looks like a brain. It is faintly disturbing.)

    The Doomsday Clock’s current position suggests humanity’s prospects are parlous. The simplest explanation for this is to attribute it to the rise in influence of alt-right anti-science lobbies and the consequent abandonment of evidence as a basis for policy-making.

    Krauss agrees, but finds fault too with himself and fellow scientists, and with everyone who until recently wrote off fundamentalist religion and climate change denial as products of fringe communities.

    “We were complacent, for sure,” he says.

    “I don’t think they were so much on the fringe. They are not on the fringe. I wish they had been. People have been intimidated. Now we’re in a position where government leaders are obviously anti-science and in many cases religious fundamentalists.

    “And that’s a huge problem because they are making policies that are clearly ridiculous. That’s a new concern, but there’s always been another one, which is more pervasive.

    “There are people, millions of them, who feel they are bad people because they question the existence of god.”

    In the US, he explains, agnosticism, much less atheism, is rarely discussed. Those who question the existence of deities thus often feel isolated, alone and damaged.

    This needs to change, he says. The godless need to get together and get loud.

    “What we haven’t done enough of is encourage more people to openly ridicule stupid ideas,” he says.

    “Or at least encourage people to ask questions. I think we’ve been far too polite and far too lenient – at least in my country – on religious fundamentalism.”

    That, however, needs to be but one prong of a two-pronged assault.

    “We also have not done a good job of teaching science,” he continues. “These are intimately related, because how do you tell the difference between sense and nonsense in this modern political arena filled with ‘alternative facts’?

    “We have this problem, and I really do think it stems from teaching science the wrong way. We teach it as if it’s a bunch of facts, but it’s not: it’s a process for deriving facts.”

    Over the years, Krauss has made solid contributions not just to his own fields of research but also to the cause of popularising science. His mass market books, such as The Physics of Star Trek in 1995 and A Universe from Nothing in 2012 have been best-sellers.

    Now he has embarked on a new type of teaching journey, teaming up with evolutionary biologist Richard Dawkins in a travelling two-man show called Science In The Soul.

    While neither scientist is a stranger to publicity or controversy, Krauss draws a distinction between the way they approach their tasks.

    “It’s one of the discussions I often have with Richard,” he says,

    “And I think it’s because Richard has lived in [the UK academic city of] Oxford his whole life, or almost.

    “He’s one of the most impatient persons with irrationality that I’ve ever known, whereas I live in the United States so I’m quite used to it.”

    Dawkins and Krauss are bringing their show to Australia in May next year. Tickets are on sale now.

    By then, of course, we and the rest of the world will know whether the Bulletin of the Atomic Scientists decided to move the hand of the Doomsday Clock closer to midnight.

    Krauss – being a media pro as much as a scientist – will never be tempted to break embargo and let us know the decision early, but perhaps we can seek a clue obliquely.

    Is he feeling optimistic about the world right now?

    “No,” he answers. “I think that’s the sensible answer. Like my friend [novelist] Cormac McCarthy once said, ‘I’m a pessimist but that’s no reason to be gloomy’.

    “We are living in dangerous times and certainly there are many indicators that suggest that the world is getting more dangerous against various existential threats.

    “So I’m not optimistic at this point, but that doesn’t mean we give up hope, and that doesn’t mean we give up acting. And part of the point of the Bulletin is to speak out and get people to act.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    • stewarthoughblog 10:43 pm on October 27, 2017 Permalink | Reply

      Krauss is becoming more irrational, bigoted and biased with each of his anti-intellectual surges. His “Nothing” book is a duplicitous usurpation of science and knowledge to suit his own ideological mandates.


    • richardmitnick 7:55 am on October 28, 2017 Permalink | Reply

      I only approved your comment for freedom of speech. I agree, on atheism, Krauss is a demagogue. But he is entitled to his own opinions. He should keep them to himself. Yet, in Astronomy and Cosmology he is a rock star. I keep all non-science issues out of my blog.


  • richardmitnick 7:25 am on October 11, 2017 Permalink | Reply
    Tags: , Beginning of a new field of computational science, Cosmos Magazine, ,   

    From COSMOS: “Physicists solve extreme electron puzzle” 

    Cosmos Magazine bloc

    COSMOS Magazine

    11 October 2017
    Michael Lucy

    A better understanding of how electrons behave in extreme conditions will help scientists understand stars, lasers and planets.

    The behaviour of electrons has fascinated physicists since their discovery in 1897. Getty Images/Omrikon

    On Earth, electrons are mainly well-behaved creatures. Under extreme conditions – the kind you find in a white dwarf star, say, or in the chamber of a fusion reactor – they fall into a degenerate state, and their behaviour is entirely another matter.

    By creating a better model of electrons in one of these degenerate states – called “warm dense matter” – physicists have opened the way to a better understanding of some extreme corners of the universe.

    “This is the beginning of a new field of computational science,” says Matthew Foulkes of Imperial College London, who developed the model with colleagues at the University of Kiel, in Germany, and the Los Alamos and Lawrence Livermore national laboratories in the US.

    Electrons, the familiar tiny charged particles that flow through wires to produce an electric current, are quite well understood under everyday conditions. Physicists can predict their behaviour both at very small scales (in orbit around an atomic nucleus, say) and very large (the aforementioned electric currents).

    However, at very high temperatures (often in the tens of thousands of degrees) and under great pressure, their behaviour becomes fuzzier and ruled by arcane laws of quantum mechanics.

    The equations that describe their behaviour in this state are extremely complex, and until now no one had found an exact solution.

    Foulkes says it took five years to develop the new techniques necessary to describe warm dense matter accurately.

    The result is a complete description of the thermodynamic properties – the relationships between energy, temperature, pressure and polarisation – of electrons in a warm-dense-matter state.

    The new model, written up in a paper in Physical Review Letters and published online as freely available computer code, will enable other scientists to improve their understanding in a range of extreme situations such as inside stars and planets, in laser laboratories and in the quest for contained nuclear fusion reactions.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 7:59 am on September 27, 2017 Permalink | Reply
    Tags: , , , , , Cosmos Magazine, Dark energy may not exist, Standard candles,   

    From COSMOS: “Dark energy may not exist” 

    Cosmos Magazine bloc

    COSMOS Magazine

    27 September 2017
    Stuart Gary

    A model of the universe that takes into account the irregular distribution of galaxies may make dark energy disappear. NASA, H. Ford (JHU), G. Illingworth (UCSC/LO), M. Clampin (STScI), G. Hartig (STScI), the ACS Science Team and ESA

    The accelerating expansion of the universe due to a mysterious quantity called “dark energy” may not be real, according to research claiming it might simply be an artefact caused by the physical structure of the cosmos.

    The findings, reported in the Monthly Notices of the Royal Astronomical Society, claim that the fit of Type Ia supernovae to a model universe with no dark energy appears slightly better than the fit using the standard dark energy model.

    The study’s lead author David Wiltshire, from the University of Canterbury in New Zealand, says existing dark energy models are based on a homogenous universe in which matter is evenly distributed.

    CMB per ESA/Planck


    “The real universe has a far more complicated structure, comprising galaxies, galaxy clusters, and superclusters arranged in a cosmic web of giant sheets and filaments surrounding vast near-empty voids”, says Wiltshire.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Dark matter cosmic web and the large-scale structure it forms The Millenium Simulation, V. Springel et al

    Current models of the universe require dark energy to explain the observed acceleration in the rate at which the universe is expanding.

    Scientists base this conclusion on measurements of the distances to Type Ia supernovae in distant galaxies, which appear to be farther away than they would be if the universe’s expansion were not accelerating.

    Type Ia supernovae are powerful explosions bright enough to briefly outshine an entire galaxy. They’re caused by the thermonuclear destruction of a type of star known as a white dwarf – the stellar corpse of a Sun-like star.

    All Type Ia supernovae are thought to explode at around the same mass – a figure known in astrophysics as the Chandrasekhar limit – which equates to about 1.44 times the mass of the Sun.

    Because they all explode at about the same mass, they also explode with about the same level of luminosity.

    This allows astronomers to use them as standard candles to measure cosmic distances across the universe – in the same way you can determine how far away a row of street lights is along a road by how bright each one appears from where you’re standing.

    Standard candles. https://www.extremetech.com
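
    The street-light analogy is just the inverse-square law: observed flux falls off as the square of distance, so comparing a supernova’s apparent magnitude with its known absolute magnitude yields its distance. A minimal sketch, assuming the commonly quoted peak absolute magnitude of roughly −19.3 for a Type Ia supernova (the function and observed value here are illustrative, not from the study):

```python
# Standard-candle distances from the distance modulus:
#   m - M = 5 * log10(d / 10 pc)
# where m is apparent magnitude, M absolute magnitude, d distance.

M_TYPE_IA = -19.3  # approximate peak absolute magnitude of a Type Ia supernova

def distance_parsecs(apparent_mag, absolute_mag=M_TYPE_IA):
    """Distance in parsecs implied by the distance modulus."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A Type Ia supernova observed at apparent magnitude 5.7
# (about the naked-eye limit) would lie a megaparsec away:
d = distance_parsecs(5.7)
print(f"{d:.3e} pc")  # prints: 1.000e+06 pc
```

    Deviations between these inferred distances and those expected from a constant expansion rate are precisely the evidence that led to the dark energy hypothesis.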

    On a galactic scale, gravity appears to be stronger than scientists can account for using the normal matter of the universe – the material in the standard model of particle physics that makes up all the stars, planets, buildings and people.

    To explain their observations, scientists invented “dark matter”, a mysterious substance which seems to only interact gravitationally with normal matter.

    To explain observations of how galaxies move, scientists calculate there must be about five times as much dark matter as normal matter.

    Caterpillar Project A Milky-Way-size dark-matter halo and its subhalos circled, an enormous suite of simulations . Griffen et al. 2016

    It’s called dark because whatever it is, it cannot emit light; scientists can detect only its gravitational effects on normal matter.

    On the even larger cosmic scales of an expanding universe, gravity appears to be weaker than expected in a universe containing only normal matter and dark matter.

    And so, scientists invented a new force, called “dark energy”, a sort of anti-gravitational force causing an acceleration in the expansion of the universe out from the big bang 13.8 billion years ago.

    Dark energy isn’t noticeable on small scales, but becomes the dominating force of the universe on the largest cosmic scales: almost four times greater than the gravity of normal and dark matter combined.

    The idea of dark energy isn’t new. Albert Einstein first came up with it to explain a problem he was having when he applied his famous 1915 equations of general relativity theory to the whole universe.

    Like other scientists at the time, Einstein believed the universe was in a steady unchanging state. Yet, when applied to cosmology, his equations showed the universe wanted to expand or contract as matter interacts with the fabric of spacetime: matter tells spacetime how to curve, and spacetime tells matter how to move.

    To resolve the problem, Einstein introduced a dark energy force in 1917 which he called the “cosmological constant”.

    It was a mathematical invention, a fudge factor designed to solve the discrepancies between general relativity theory and the best observational evidence of the day, thus bringing the universe back into a steady state.

    Years later, when astronomer Edwin Hubble discovered that galaxies appeared to be moving away from each other, and the rate at which they were moving was proportional to their distance, Einstein realised his mistake, describing the cosmological constant as the biggest blunder of his life.

    However, the idea has never really gone away, and keeps reappearing to explain strange observations.

    In the mid-1990s two teams of scientists, one led by Brian Schmidt and Adam Riess, and the other by Saul Perlmutter, independently measured distances to Type Ia supernovae in the distant universe, finding that they appeared to be further away than they should be if the universe’s rate of expansion was constant.

    The observations led to the hypothesis that some kind of dark energy anti-gravitational force has caused the expansion of the universe to accelerate over the past six billion years.

    Wiltshire and his colleagues now challenge that reasoning.

    “But these observations are based on an old model of expansion that has not changed since the 1920s”, he says.

    In 1922, Russian physicist Alexander Friedmann used Einstein’s field equations to develop a physical cosmology governing the expansion of space in homogeneous and isotropic models of the universe.

    “Friedmann’s equation assumes an expansion identical to that of a featureless soup, with no complicating structure”, says Wiltshire.

    This has become the basis of the standard Lambda Cold Dark Matter cosmology used to describe the universe.
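
    For reference, the Friedmann equation for such a homogeneous, isotropic universe can be written in its standard form, where H is the Hubble parameter, a the scale factor, ρ the average energy density, k the spatial curvature and Λ Einstein’s cosmological constant:

```latex
% Friedmann equation for a homogeneous, isotropic universe
H^{2} \equiv \left(\frac{\dot{a}}{a}\right)^{2}
    = \frac{8\pi G}{3}\,\rho \;-\; \frac{kc^{2}}{a^{2}} \;+\; \frac{\Lambda c^{2}}{3}
```

    Wiltshire’s objection is to the assumption baked into this equation: that a single ρ and a single clock describe expansion everywhere at once.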

    “In reality, today’s universe is not homogeneous”, says Wiltshire.

    The earliest snapshot of the universe – called cosmic microwave background radiation – displays only slight temperature variations caused by differences in densities present 370,000 years after the Big Bang.

    However, gravitational instabilities caused those tiny density variations to evolve into the stars, galaxies and clusters of galaxies that make up the large-scale structure of the universe today.

    “The universe has become a vast cosmic web dominated in volume by empty voids, surrounded by sheets of galaxies and threaded by wispy filaments”, says Wiltshire.

    Rather than comparing the supernova observations to the standard Lambda Cold Dark Matter cosmological model, Wiltshire and colleagues used a different model, called ‘timescape cosmology’.

    Timescape cosmology has no dark energy. Instead, it includes variations in the effects of gravity caused by the lumpiness in the structure in the universe.

    Clocks carried by observers in galaxies differ from the clock that best describes average expansion once variations within the universe (known as “inhomogeneity” in the trade) become significant.

    Whether or not one infers accelerating expansion then depends crucially on the clock used.

    “Timescape cosmology gives a slightly better fit to the largest supernova data catalogue than Lambda Cold Dark Matter cosmology,” says Wiltshire.

    He admits the statistical evidence is not yet strong enough to definitively rule in favour of one model over the other, and adds that future missions such as the European Space Agency’s Euclid spacecraft will have the power to distinguish between differing cosmology models.

    ESA/Euclid spacecraft

    Another problem involves science’s understanding of Type Ia supernovae, which are not actually perfect standard candles, despite being treated as such in calculations.

    Since timescape cosmology uses a different equation for average expansion, it gives scientists a new way to test for changes in the properties of supernovae over distance.

    Regardless of which model ultimately fits better, a deeper understanding of supernova properties will increase the confidence with which scientists can use them as precise distance indicators.

    Answering questions like these will help scientists determine whether dark energy is real or not – an important step in determining the ultimate fate of the universe.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 10:09 am on July 19, 2017 Permalink | Reply
    Tags: "Top five places to look for extraterrestrial life, Cosmos Magazine, , , , The moon Titan,   

    From COSMOS: “Top five places to look for extraterrestrial life” 

    Cosmos Magazine bloc

    COSMOS Magazine

    19 July 2017
    Andrew Masterson

    For all the hope and expectation, it is sobering to recall that, despite the best efforts of scientists and engineers, there is still no evidence that life exists anywhere beyond our own planet. There are, however, some planetary prime suspects. Here are the five places astronomers and astrobiologists think are the best chances for harbouring ET.

    An artist’s impression of “rocky super-Earth” LHS 1140b and its red dwarf host. M. Weiss/CfA

    LHS 1140b

    News of this planet, a “rocky super-Earth”, was announced in the journal Nature in April. Orbiting a red dwarf 39 light-years from Earth, the planet sits in its star’s habitable zone and has an estimated mass almost seven times that of our own planet, leading to the assumption that it comprises rock encasing a solid iron core. According to Jason Dittmann of the Harvard-Smithsonian Center for Astrophysics in Massachusetts, US, LHS 1140b’s density means it might have survived the runaway global warming thought to denude many red dwarf planets. If so, it might now boast a stable atmosphere and liquid water. “This is the most exciting exoplanet I’ve seen in the past decade,” he said. “We could hardly hope for a better target to perform one of the biggest quests in science – searching for evidence of life beyond Earth.”

    Enceladus Curtains: Comparing Data and Simulation. http://photojournal.jpl.nasa.gov/catalog/PIA19061.


    Enceladus

    Thanks to data from NASA’s Cassini spacecraft, Saturn’s moon Enceladus has emerged as every ET-hunter’s favourite target – mainly due to the strong likelihood that it features a subterranean ocean. In April this year, a team of scientists from the Southwest Research Institute (SwRI) in Texas, US, revealed a plume of hydrogen erupting from the moon’s surface. The plume may well be evidence of hydrothermal vents in the subsurface ocean – the same type of vents that support extremophile life on Earth. “The discovery of hydrogen gas and the evidence for ongoing hydrothermal activity offer a tantalising suggestion that habitable conditions could exist beneath the moon’s icy crust,” says principal investigator Hunter Waite.

    In its final swoop close to the surface of Enceladus, NASA’s Cassini spacecraft has delivered a stunning cliffhanger by detecting the most remarkable hints yet that there may be life on Saturn’s sixth-largest moon.

    That swoop took place in October 2015, but research published this month in Science reveals that the spacecraft – which is due to end its 20-year mission by plunging into the planet’s atmosphere in a few months – detected hydrogen gas in a plume of material erupting from the moon’s surface.

    Hovering over Titan. NASA.


    Titan

    Another of Saturn’s 53 moons, Titan is known to have permanent hydrocarbon lakes, a nitrogen-heavy atmosphere, and possibly a subsurface ocean beneath a salty crust. It is a possible host for either water-dependent or methane-dependent life.



    Artist’s impression of the planet orbiting Proxima Centauri. ESO/M. KORNMESSER / GETTY.

    Proxima Centauri b

    This planet, discovered in August 2016, orbits the star Proxima Centauri, 4.2 light-years from our Sun, and is the nearest candidate beyond the solar system for hosting ET. Research in May’s Astronomy & Astrophysics journal found the chances of life existing on the planet may hinge on its orbital speed. Astrophysicists at the University of Exeter calculated that if Proxima b rotates on its axis three times for every two orbits of its star, then the chances of it being habitable are substantially improved.

    TRAPPIST-1 planet lineup. NASA.

    Trappist-1

    The announcement of the Trappist-1 system in February, with seven rocky planets orbiting an ultracool dwarf star, sent ripples of excitement through astrobiologists everywhere. At least three of the planets looked like they were within the star’s habitable zone. The latest analysis, by Eric Wolf from the Laboratory for Atmospheric and Space Physics at the University of Colorado, Boulder, US, has somewhat dampened expectations, suggesting that only one of the group has life-sustaining potential. But never mind: one chance in seven is still better than no chance at all.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 9:59 am on December 15, 2016 Permalink | Reply
    Tags: , , Cosmos Magazine, Protein HER2 a culprit   

    From COSMOS: “How breast cancer spreads before tumours can be detected” 

    Cosmos Magazine bloc

    COSMOS Magazine

    15 December 2016
    Anthea Batsakis

    Coloured scanning electron micrograph of a migrating – or metastasising – breast cancer cell. Science Photo Library / Getty Images

    Like a weed spreading seeds before it’s even sprouted from the soil, breast cancer cells can migrate around the body before any lumps can be felt or detected by a mammogram, two mouse studies show.

    Each proposes an explanation for why early disseminating cancer cells – cells that “spread” around the body when the tumour is only microscopic – are better at invading distant tissues than those from an advanced tumour.

    Both studies, published in Nature, could lead to new ways of monitoring cancer’s spread.

    “They have such firm support that early dissemination is really occurring much more than we thought,” says Rik Thompson, a breast cancer biologist from the Queensland University of Technology in Australia who was not involved in the studies.

    Metastasis – the formation of secondary tumours as a result of disseminating cells – is responsible for most cancer-related deaths.

    And while the idea that early disseminating cancer cells lead to metastasis is nothing new, the question of why hasn’t yet been fully answered.

    A protein called HER2 is overproduced in roughly 25% of breast cancer cases. In those patients, the chance their cancer will reappear increases three-fold.

    Both teams of researchers investigated HER2-positive cancer but told two different stories.

    Hedayatollah Hosseini from the University of Regensburg in Germany and his colleagues suggest
    [Nature] the female hormone progesterone drives the circulation of early cancer cells from microscopic tumours.

    Meanwhile, Kathryn Harper from the Icahn School of Medicine at Mount Sinai in the US and her colleagues showed [Nature] the HER2 protein itself helped early invasive cells enter the bloodstream.

    Thompson says that neither paper is more convincing than the other – they’re simply different, and challenge the common notion that cancer cells are better at spreading when they originate from an advanced tumour.

    Harper’s team hooked up a microscope to mouse mammary glands and watched cancer cells in the lining tissue. They also studied human bone marrow samples seeded with disseminated cancer cells.

    And they found that HER2 switches on another protein, which in turn subdues a cancer-halting enzyme called p38. The cancer cells were able to circulate the body unhindered.

    On the other hand, Hosseini and colleagues turned to progesterone.

    Using human tissue samples, the researchers showed that progesterone triggers a cell to release two proteins that target invasive cells and strengthen their ability to migrate.

    Thompson is curious about a possible connection between the two studies – specifically p38 from Harper’s study and progesterone from Hosseini’s.

    “Clearly they’re both working on the same model on early stage dissemination, but the connection between the two is an intriguing question for me,” he says. Perhaps progesterone regulates p38 – or the other way around.

    And in the short term, the researchers suggest HER2-positive breast cancer patients may benefit from close blood monitoring early on to catch any tumours that might grow from metastasising cells.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

  • richardmitnick 5:00 am on December 9, 2016 Permalink | Reply
    Tags: Cosmos Magazine, Einstein’s Greatest Mistake: The Life of a Flawed Genius" Book Review,   

    From COSMOS: “Einstein’s Greatest Mistake: The Life of a Flawed Genius” Book Review 

    Cosmos Magazine bloc

    COSMOS Magazine

    09 December 2016
    Bill Condie

    Einstein’s Greatest Mistake: The Life of a Flawed Genius
    By David Bodanis
    Little, Brown (2016)
    RRP $35.00

    We all make mistakes, for sure, but fallibility is not the first thing that comes to mind when thinking about the most recognisable genius the world has ever produced. David Bodanis, that talented explainer of complex physics to lay readers, whose E=mc2: A Biography of the World’s Most Famous Equation is among the clearest explanations of the famous formula, has come up with a perfect sequel.

    Described by the author as “the story of a fallible genius, but also the story of his mistakes”, the book tries to explain the anticlimactic later years of the great man’s life. Tourists may have still gawped as Einstein trudged home in Princeton, but during those final decades he was largely ignored by working scientists.

    The explanation lies, Bodanis argues, in the same characteristics of imagination and self-confidence that led the young Einstein to change the way we thought about physics forever. As he says, “genius and hubris, triumph and failure, can be inextricable”. To understand where Einstein went wrong, it is necessary to examine his earliest years to understand how his mind engaged with the mysteries of the universe.

    It began with Einstein’s discovery that mass and energy are different forms of the same stuff, expressed in the neat little formula E=mc2 – unheard of at the time, but so dramatically demonstrated as true in the skies over Hiroshima, where a tiny sliver of matter became a knockout blow of energy.

    Later came the theory of general relativity that proved energy and mass distort spacetime. The discovery unified gravity into a single view of the universe, no longer a separate force but the result of existing laws. Laws, Einstein thought, that were very clear and very exact. No wonder he considered the theory “the greatest satisfaction of my life”.

    Ironically though, it was this faith in the perfection of his theory – one could say a blind faith – that closed his mind to other emerging schools of thought, particularly those developing in theories of quantum mechanics. That the quantum world of subatomic particles was a place of inherent uncertainty and contradiction was anathema to Einstein’s belief in the underlying laws that guided his own theory. God, he said, “is not playing at dice”. And that, to Bodanis, was his greatest mistake. It was also a blindness that kept Einstein in the wilderness for the last 25 years of his life.

    With the centenary of Einstein’s general theory of relativity last year, there is no shortage of books about Einstein. But this one is still a welcome addition to the vast library. It comes, as mentioned, with Bodanis’ talent for explaining the maths and science of Einstein’s work. But the best part is the real feel it gives of Einstein the man, and his thinking.

    The poor, somewhat arrogant student of his youth – whose teachers thought he would amount to little thanks to his reluctance to take instruction – against the odds gives birth to the in-his-prime scientist combining wonderful imagination and rigour to shake our understanding of the world to its foundations. But that, in turn, leads to a dogmatism that locks him out of a world of new thought to which, had he approached the problem differently, he might have contributed so much.

    It’s a wonderful exposition of the life of Einstein – the man with the superhuman mind who was, in the end, all too human.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
