Tagged: Medium

  • richardmitnick 3:58 pm on February 17, 2019 Permalink | Reply
    Tags: Asgardia, Medium, See the full blog post for images of all of the spacecraft involved and the Heliopause and Heliosphere, Which Spacecraft Will Reach Interstellar Space Next?

    From Asgardia via Medium: “Which Spacecraft Will Reach Interstellar Space Next?” 

    From Asgardia

    via

    Medium


    NASA’s Voyager 2 spacecraft reached interstellar space in December 2018, following in the footsteps of its sister, Voyager 1. Only five spacecraft capable of making such a grand exit have ever been launched, the Voyagers among them. The other three are Pioneers 10 and 11 and New Horizons. Which one will make its great escape next?

    NASA/Voyager 2

    NASA/Voyager 1

    NASA Pioneer 10

    NASA Pioneer 11

    NASA/New Horizons spacecraft

    Reaching interstellar space is a milestone that, by one specific definition, marks leaving the solar system. In 1990, the New York Times reported that Pioneer left the solar system when it flew past Neptune’s orbit. But that’s not the definition Voyager 2’s scientists used. Instead, more recent work treats the crossing of the sun’s heliopause, the theoretical outer boundary of its heliosphere, as the determining factor for entering interstellar space.

    The heliosphere is a bubble of charged particles created by the sun and carried outward by the solar wind. Scientists use its edge to mark where interstellar space begins.

    NASA Heliosphere

    However, the heliosphere is tricky: it shrinks and grows with the sun’s 22-year solar cycle and with the strength of the solar wind, and it stretches out into a long tail behind the sun as the star travels through the galaxy. It’s not something that can be measured easily from Earth. Thus, NASA’s Interstellar Boundary Explorer (IBEX) mission is trying to map the edges of the bubble remotely.

    Observations from the Voyager probes indicate that they’ve pierced this bubble. However, researchers think the sun is also surrounded by the Oort Cloud, a region of icy bodies estimated to stretch from 1,000 to 100,000 astronomical units, far beyond the heliopause, so the Voyager probes cannot be considered entirely outside the solar system. (One astronomical unit, or AU, is the distance between the Earth and the sun: 93 million miles, or 150 million kilometres.)

    Oort cloud Image by TypePad, http://goo.gl/NWlQz6

    Oort Cloud, The layout of the solar system, including the Oort Cloud, on a logarithmic scale. Credit: NASA, Universe Today

    When Voyager 1 and 2 crossed the heliopause, their still-working particle instruments recorded the historic crossings. The heliosphere functions as a shield, keeping out many of the high-energy cosmic-ray particles generated by other stars.

    Magnetosphere of Earth, original bitmap from NASA. SVG rendering by Aaron Kaase

    By tracking both the low-energy particles found inside the solar system and the high-energy particles from outside of it, the instruments revealed a sudden surge of cosmic rays, alerting scientists that each spacecraft had crossed the heliopause.

    The ever-changing nature of the heliosphere makes it impossible to say exactly when Pioneer 10 and 11 will enter interstellar space. It’s even possible that one of them already has.

    According to NASA’s e-book Beyond Earth: A Chronicle of Deep Space Exploration, as of Nov. 5, 2017, Pioneer 10 was approximately 118.824 AU from Earth, farther than any craft besides Voyager 1. Although Pioneer 11 and the Voyager twins are all heading in the direction of the sun’s apparent travel, Pioneer 10 is headed toward the trailing side. Research published in 2017 showed that the tail of the heliosphere extends to around 220 AU from the sun. Since Pioneer 10 travels about 2.5 AU per year, it will take until roughly 2057, some 40 years after that estimate, to reach the shifting boundary.

    Pioneer 11 was approximately 97.6 AU from Earth as of Nov. 5, 2017, according to the same e-book. Unlike its twin, the spacecraft is travelling in roughly the same direction as the Voyagers. Voyager 2 crossed into the interstellar medium at approximately 120 AU. Since Pioneer 11 is moving at 2.3 AU per year, it should reach interstellar space in about eight years, around 2027, assuming the boundary doesn’t change, which it probably will.

    New Horizons, launched much later than the other four spacecraft, made its most recent flyby of a solar system object on Jan. 1, 2019. During this flyby, it was 43 AU from the sun. The mission’s principal investigator, Alan Stern, told Space.com that the spacecraft is travelling approximately 3.1 AU each year, or 31 AU per decade. In another two decades or so, the spacecraft has a good chance of reaching interstellar space. If New Horizons crossed at the same boundary as Voyager 2 (it won’t, but take it as a baseline), it would make the trip in roughly 24 years, around 2043. It’s also possible the interstellar boundary will move inward, allowing it to cross sooner.
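
    The arithmetic behind these estimates is easy to check. Below is a minimal sketch using only the distances, speeds, and boundary figures quoted above, and treating each boundary as fixed, which the article stresses it is not; the results land within a year or so of the dates given.

    # Rough crossing-year estimates from the figures quoted in the article.
    def crossing_year(epoch, distance_au, boundary_au, speed_au_per_yr):
        """Year a spacecraft reaches a fixed boundary at constant speed."""
        return epoch + (boundary_au - distance_au) / speed_au_per_yr

    # (epoch of distance figure, distance in AU, assumed boundary in AU, speed in AU/yr)
    probes = {
        "Pioneer 10":   (2017.8, 118.8, 220.0, 2.5),  # heading down the heliotail
        "Pioneer 11":   (2017.8,  97.6, 120.0, 2.3),  # boundary like Voyager 2's
        "New Horizons": (2019.0,  43.0, 120.0, 3.1),
    }

    for name, figures in probes.items():
        print(f"{name}: ~{crossing_year(*figures):.0f}")
    # Prints roughly 2058, 2027 and 2044 -- within a year of the article's 2057/2027/2043.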

    Because the Pioneer spacecraft are no longer transmitting, there won’t be a direct confirmation when they cross the heliopause. It’s possible, however, that New Horizons will still be working and will give us a detailed study of interstellar space. The particle detectors it carries are much more capable than the ones on the Voyagers, Stern said. Moreover, New Horizons carries a dust detector that would offer insight into the region beyond the heliosphere.

    Whether it will still be functioning by then remains to be seen. As per Stern, power is the limiting factor: New Horizons runs on the heat of decaying plutonium dioxide. The spacecraft currently has enough power to work until the late 2030s, said Stern, and it is in good working order.

    In the unlikely event that the ever-changing heliosphere remains static, Pioneer 11 will be the next to cross the heliopause, in 2027, followed by New Horizons in 2043. Pioneer 10, the first of the five spacecraft to launch, will be the last to leave the heliosphere, in 2057. Once again, this assumes the highly unrealistic scenario of the heliopause staying put for the next four decades.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Medium

    Medium is an online publishing platform developed by Evan Williams and launched in August 2012. It is owned by A Medium Corporation. The platform is an example of social journalism, hosting a hybrid collection of amateur and professional writers and publications, as well as blogs exclusive to Medium, and it is regularly regarded as a blog host.

    Williams developed Medium as a way to publish writings and documents longer than Twitter’s 140-character (now 280-character) maximum.

     
  • richardmitnick 1:54 pm on February 15, 2019 Permalink | Reply
    Tags: Is J1420–0545 the largest galaxy ever discovered?, Medium

    From Medium: “Is J1420–0545 the largest galaxy ever discovered?” 

    From Medium

    Jan 27, 2019
    Graham Doskoch

    An unassuming galaxy hides a secret 15 million light-years long.

    If we could get high-quality optical images of J1420–0545, they might look like this photograph of its closer cousin, the giant radio galaxy 3C 236. This Hubble image only shows the galaxy’s core; radio telescopes reveal a much larger structure. Image credit: NASA/ESA.

    The Milky Way is about 50 to 60 kiloparsecs in diameter — a moderately sized spiral galaxy.

    Milky Way Galaxy Credits: NASA/JPL-Caltech/R. Hurt

    It’s a few orders of magnitude larger than the smallest galaxies, ultra-compact dwarfs like M60–UCD1 that have most of their stars clustered in a sphere less than 50 to 100 parsecs across. At the extreme opposite end of the spectrum lie supergiant ellipticals, more formally known as cD galaxies, whose diffuse halos can be up to 1–2 megaparsecs wide. To put this in perspective, the Andromeda galaxy is 0.78 Mpc away.

    Andromeda Galaxy Adam Evans

    Andromeda Nebula Clean by Rah2005 on DeviantArt

    This means that the 2-megaparsec-long stellar halo of IC 1101 — sometimes hailed as the largest known galaxy in the observable universe — could stretch from the Milky Way to Andromeda and then some.

    IC 1101, possibly the largest known galaxy in the universe. Its diffuse halo might not look like much, but it extends about one megaparsec in each direction. Image credit: NASA/ESA/Hubble Space Telescope

    Yet IC 1101 pales in comparison to another class of objects: radio galaxies. Radio galaxies are sources of strong synchrotron emission, radiation from particles being accelerated along curved paths by magnetic fields. Active galactic nuclei are the culprits, supermassive black holes accreting matter and sending out jets of energetic electrons. In most cases, these jets are hundreds of kiloparsecs in length, and some are even longer.

    This week’s blog post talks about J1420–0545, currently the largest-known radio galaxy. To be more specific, it has the largest radio “cocoon” ever observed. These cocoons are structures formed by shocked plasma from the jets, which expands outward into the intergalactic medium (IGM) and encases the jets and the lobes they form. The entire radio structure around J1420–0545 is enormous, stretching 4.69 Mpc — 15 million light-years — from end to end. Read on to find out just how extraordinary this galaxy is and how we know so much about its enormous cocoon, despite knowing so little about the host galaxy itself.

    Initial observations and slight surprise

    J1420–0545 was discovered, like many unusual galaxies, in a survey scanning the sky. In particular, it showed up as two large radio lobes spaced 17.4′ apart on the FIRST and NVSS surveys observing at 1.4 GHz using the Very Large Array (VLA).

    NRAO/Karl V Jansky Expanded Very Large Array, on the Plains of San Agustin fifty miles west of Socorro, NM, USA, at an elevation of 6970 ft (2124 m)

    Follow-up observations made at Effelsberg and the Giant Metrewave Radio Telescope (GMRT) (Machalski et al. 2008) then confirmed that there was a radio-loud core located midway between them, and that it corresponded to a previously-known dim galaxy.

    MPIFR/Effelsberg Radio Telescope, in the Ahrgebirge (part of the Eifel) in Bad Münstereifel, Germany

    Giant Metrewave Radio Telescope, an array of thirty telescopes, located near Pune in India

    Fig. 1, Machalski et al. 2008. VLA/Effelsberg observations of J1420–0545 showed that the main sources of 1.4 GHz emission were two large radio-loud lobes and a weaker central source. The galaxy itself is in the crosshairs in the image, a speck among specks.

    Redshift values for that galaxy were available (z ~ 0.42–0.46), but they had large uncertainties, so the team performed their own optical observations at the Mount Suhora Observatory. The spectra derived from these proved useful in two ways. First, the spectroscopy allowed the team to figure out what sort of galaxy they were looking at. Unlike the radio lobes, the optical emission from the center couldn’t be resolved, and it wasn’t possible to image the galaxy in the same way that we could take a picture of, say, our neighbor Andromeda. Fortunately, there was a solution: the 4000 Å discontinuity.

    Elliptical galaxies are typically old, having formed over time from mergers and collisions of smaller galaxies of varying types. Star formation levels are low, meaning that there are relatively few young, hot, blue stars compared to star-forming spiral and lenticular galaxies. Now, at wavelengths a bit shorter than 4000 Å, there is a drop-off in emission thanks to absorption by metals in stellar atmospheres. In most galaxies, hot stars fill in this gap, when present. However, in elliptical galaxies, there are few hot stars, and so there is a “discontinuity” in the spectra around 4000 Å.
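
    A common way to quantify this feature is to compare the average flux just redward and just blueward of 4000 Å. The sketch below is a toy version of such a break-strength measurement on a synthetic, rest-frame spectrum; the exact wavelength windows vary between published definitions.

    import numpy as np

    def break_strength(wavelength, flux):
        """Ratio of mean flux just above 4000 A to just below it (rest frame).
        Old, red elliptical galaxies give noticeably larger values than blue,
        star-forming ones. The windows here are illustrative."""
        blue = (wavelength > 3850) & (wavelength < 3950)
        red = (wavelength > 4000) & (wavelength < 4100)
        return flux[red].mean() / flux[blue].mean()

    # Toy spectrum with a step at 4000 A plus a little noise
    rng = np.random.default_rng(0)
    wl = np.linspace(3600, 4400, 800)
    fl = np.where(wl < 4000, 0.6, 1.0) + rng.normal(0, 0.02, wl.size)
    print(f"break strength ~ {break_strength(wl, fl):.2f}")   # ~1.7 for this toy spectrum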

    The team found other spectral features corroborating the hypothesis that J1420–0545 is an elliptical galaxy. Now that they knew the sort of spectrum they expected to see, they could fit a model to it. Measurements of [O II] and Ca II lines yielded a new redshift of z ≈ 0.3067, placing the object closer than originally thought. Since the redshift (and therefore the distance) was known, as well as the angular size of the radio structure, its physical size could be estimated, assuming the inclination angle was 90°, as suggested by the weak emission from the core. A simple calculation showed that the radio structure must span 4.69 Mpc from end to end.
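
    That last step follows directly from the redshift and the 17.4′ lobe separation. Here is a rough sketch of the calculation using astropy; the cosmological parameters are generic assumptions, so the result only approximately reproduces the published 4.69 Mpc.

    import astropy.units as u
    from astropy.cosmology import FlatLambdaCDM

    cosmo = FlatLambdaCDM(H0=70, Om0=0.3)        # assumed cosmology
    z = 0.3067                                   # redshift of the host galaxy
    theta = (17.4 * u.arcmin).to(u.rad).value    # lobe-to-lobe angular separation

    d_a = cosmo.angular_diameter_distance(z)     # angular-diameter distance, ~0.9 Gpc
    size = theta * d_a                           # small-angle approximation, sky-plane size
    print(size.to(u.Mpc))                        # roughly 4.7 Mpc end to end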

    How did it get so big?

    A radio structure of this size isn’t unprecedented. The giant radio galaxy 3C 236 had already been discovered, and found to have a radio cocoon 4.4 Mpc in length. However, what was surprising about J1420–0545 wasn’t just its size, but its age. Best-fit models of the jet and ambient medium found the structure to have an age of about 47 million years; 3C 236, on the other hand, is thought to have been active for 110 million years — more than double that. So why is J1420–0545, a relatively young radio galaxy, so large?

    Fig. 1, Carilli & Barthel 1995. A radio galaxy’s narrow jets are surrounded by a bow shock at the boundary with the intergalactic medium, as well as a radio cocoon.

    The answer turned out to be the intergalactic medium itself, the hot plasma that fills the spaces between galaxies. The IGM density around J1420–0545 is lower than around 3C 236 by about a factor of 20, meaning that the gas pressure opposing the jets’ expansion was correspondingly lower. The power of the AGN in J1420–0545 is also 50% greater than that of the AGN in 3C 236; this, combined with the substantially lower ambient IGM density, meant that the jets experienced much less resistance as they plowed into intergalactic space, and could therefore expand faster and farther in a shorter amount of time.

    This of course raises the question: why is the local IGM so rarefied on so large a scale? Originally, the group thought that it was simply a naturally under-dense region of space, similar to a void, an underdensity dozens of megaparsecs across that formed shortly after the Big Bang. However, after additional VLA and GMRT measurements (Machalski et al. 2011), they considered an alternative possibility: that the jets were the result of more than one round of AGN activity.

    Double, double, radio bubbles

    The team suggested classifying J1420–0545 as a double-double radio galaxy (DDRG). DDRGs exhibit two pairs of lobes that are aligned to within a few degrees, indicating that the central AGN underwent a period of activity, shut down, and then restarted. The key piece of information from the old VLA and GMRT data that suggested that J1420–0545 might be an extreme DDRG was the shape of its jets. The narrow jets are characteristic of double-double radio galaxies undergoing their second period of activity.

    If the DDRG hypothesis is true, there should be a second faint outer radio cocoon surrounding the structure. After the first period of AGN activity, once the jets ceased, the cocoon should have quickly cooled through energy losses by synchrotron radiation and inverse-Compton scattering; with a suitable choice of parameters, it would be quite possible for it to be below the sensitivity of the VLA and GMRT. However, the team is hopeful that higher-sensitivity measurements in the future might be able to discover it.

    In an interesting twist, it was suggested around the same time that 3C 236 is also a DDRG — albeit one in the very early stages of its second period of AGN activity (Tremblay et al. 2010 The Astrophysical Journal). A group observed four bright “knots” near its core that were visible in the far ultraviolet. They appear to be associated with the AGN’s dust disk, and are about ten million years old.

    Fig. 4, Tremblay et al. The star-forming knots in the core of 3C 236. The nucleus itself, hiding a supermassive black hole, is surrounded by dust lanes

    3C 236’s two large radio lobes appear to be relics, and it has a smaller (~2 kpc) compact structure that seems to be much more recent. This is the key bit of evidence suggesting that it, too, might be a DDRG: the compact radio structure appears to be the same age as the knots, meaning that whatever event caused one likely caused the other. For instance, if a new reservoir of gas became available, it could fuel both AGN activity and a new round of star formation. If this is true, and the compact source ends up driving new jets, it’s possible that 3C 236 could end up the size of J1420–0545, or larger.

    I’ll end this post by discussing the question I posed in the title: does J1420–0545 deserve to be called the largest known galaxy? We don’t know quite how large its stellar halo is, but it’s assuredly much smaller than the giant radio cocoon that surrounds it. At the same time, the cocoon represents a very distinct boundary between the galaxy and the intergalactic medium, and the shocked plasma inside it should behave quite differently from plasma in the IGM. Ironically, unlike normal elliptical galaxies with their diffuse halos, here we can point to exactly where the giant ends and where intergalactic space begins.

    One day, perhaps, we’ll find a giant radio galaxy even larger than J1420–0545, and the question will be moot. For now, though, I leave the question open and wait for more VLA data. Clinching evidence of an outer cocoon could be around the corner. All we have to do is wait and see.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Medium

    Medium is an online publishing platform developed by Evan Williams and launched in August 2012. It is owned by A Medium Corporation. The platform is an example of social journalism, hosting a hybrid collection of amateur and professional writers and publications, as well as blogs exclusive to Medium, and it is regularly regarded as a blog host.

    Williams developed Medium as a way to publish writings and documents longer than Twitter’s 140-character (now 280-character) maximum.

     
  • richardmitnick 1:54 pm on February 13, 2019 Permalink | Reply
    Tags: Medium

    From Medium: “Here’s what I Zwicky 18 can tell us about the first stars in the universe” 

    From Medium

    Feb 10, 2019
    Graham Doskoch

    A blue dwarf galaxy only 59 million light-years away may harbor cousins of the mysterious Population III stars.

    A Hubble Space Telescope image of I Zwicky 18 shows gas illuminated by young blue stars. Image credit: NASA/ ESA/A. Aloisi.

    The first stars in the universe were unlike any we can see today. Known to astronomers as Population III stars, they were large, massive, and composed almost entirely of hydrogen and helium. Population III stars were important because they enriched the interstellar medium with metals — all the elements heavier than hydrogen and helium — and participated in reionization, an event a few hundred million years after the Big Bang that made the universe more transparent.

    Finding Population III stars could confirm important parts of our theories of cosmology and stellar evolution. However, they should all be gone from the Milky Way by now, having exploded as supernovae long ago. We can look into the distant universe to search for them at high redshifts — and indeed, the James Webb Space Telescope will do just that — but detecting individual stars at that distance is beyond our current capabilities. So far, telescopes have turned up nothing.

    Recent observations of a nearby dwarf galaxy named I Zwicky 18, however, have given us some hope. Only 59 million light-years away, the galaxy seems to contain clouds of hydrogen that are nearly metal-free. What’s more, it’s undergoing a burst of star formation that might be producing stars very similar to Population III stars. If we could learn more about this galaxy, it could provide us with clues as to what the earliest stars and galaxies in the universe were like.

    Is the current wave of star formation the first?

    The initial HI observations of I Zwicky 18 used the radio interferometer at Westerbork, in the Netherlands. Image credit: Wikipedia user Onderwijsgek, under the Creative Commons Attribution-Share Alike 2.5 Netherlands license.

    One of the first studies to draw attention to the possibility that I Zwicky 18 is forming Population III-analog stars was Lequeux & Viallefond 1980. They supplemented existing optical observations of HII regions — clouds of ionized gas that host young, hot, massive stars — with studies of HI regions via the 21-cm emission line, a key tool for mapping neutral hydrogen. They were trying to figure out whether the current round of massive star formation in the dwarf galaxy was its first, or whether it had been preceded by earlier episodes that polluted the hydrogen clouds with metals.

    Their radio observations with the Westerbork Synthesis Radio Telescope [above] found a total HI mass of about 70 million solar masses in six separate regions, three of which remained unresolved. They were unable to connect individual components to the maps of HII regions, but radial velocity measurements of the clouds showed that the total mass of the galaxy was greater by about a factor of ten, suggesting that some other form of mass was present.
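
    For context, an HI mass like this comes from the standard optically thin 21-cm relation, M_HI ≈ 2.36 × 10^5 × D² × ∫S dv solar masses, with D in Mpc and the integrated flux in Jy km/s. A small sketch is below; the flux value is a hypothetical one chosen to land near the quoted mass, not a number from the paper.

    # Standard optically thin 21-cm mass estimate:
    #   M_HI [Msun] ~ 2.36e5 * D[Mpc]**2 * integrated_flux[Jy km/s]
    def hi_mass(distance_mpc, integrated_flux_jy_kms):
        return 2.36e5 * distance_mpc**2 * integrated_flux_jy_kms

    d = 18.0   # ~59 million light-years, in Mpc
    s = 0.9    # hypothetical integrated 21-cm flux, in Jy km/s
    print(f"M_HI ~ {hi_mass(d, s):.1e} Msun")   # ~7e7 Msun, close to the quoted value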

    There were two possibilities: either the unseen mass was molecular hydrogen — which would not emit 21-cm radiation — or there was a dim population of older stars. The molecular hydrogen hypothesis couldn’t be ruled out, but the idea of an as-yet unseen group of stars was attractive. For one thing, the HI clouds appeared quite similar to the primordial clouds needed for galaxy formation. If these HI regions were actually primordial, then these dim stars could have supported them against gravitational collapse for billions of years.

    Figure 5, Lequeux & Viallefond 1980. A map of the HI regions in the galaxy shows that three (labeled 1, 2 and 5) are large enough to be resolved, while the others are point sources. Regions 1, 4 and 5 are the most massive.

    A picture began to emerge. Comparison of Lyman continuum emission with far-ultraviolet emission indicated that the burst of star formation must have begun a few million years ago, likely triggered by the collision of several hydrogen clouds. Before this, there would have been some formation of dim red stars on a smaller scale, but not enough to enrich the galaxy beyond the low observed oxygen abundances. Therefore, the stars forming in I Zwicky 18 should indeed be very close analogs of Population III stars.

    What sort of stars are we dealing with?

    Figure 1, Kehrig et al. 2015. A composite (hydrogen alpha + UV + r’-band) image of luminous knots in the dwarf galaxy that show intense helium emission.

    The idea caught on over the next few decades, and astronomers became interested in determining the nature of these young stars. One group (Kehrig et al. 2015, The Astrophysical Journal Letters) was particularly interested in determining what type of massive stars could best explain the He II λ4686 line, an indicator of hard radiation and hot stars ionizing material in HII star-forming regions. There were a few possible culprits:

    Early-type Wolf-Rayet stars, which are thought to be responsible for much of the He II λ4686 emission in star-forming galaxies.

    Shocks and x-ray binaries, which have also been found in extragalactic HII regions.

    Extremely metal-poor O stars, or — going one step further — entirely metal-free O stars, similar to Population III stars.

    The group ruled out the Wolf-Rayet stars quickly. Key signatures of metal-poor carbon Wolf-Rayet stars were clearly evident in the spectra, but the inferred number based on the C IV λ1550 line was too small to account for all of the helium emission. Similarly, the x-ray binary possibility was discarded because the sole x-ray binary found was too dim by a factor of 100.

    Figure 2, Kehrig et al. 2015. A region of high Hα and He II λ4686 emission shows little overlap with [OI] λ6300 emission and low [S II] contrast, ruling out the possibility of x-ray shocks.

    However, a group of maybe a dozen or so metal-free stars of a hundred solar masses or more could successfully reproduce the observed He II λ4686 line. There are pockets of gas near a knot in the northwest edge of the galaxy that are devoid of metals and would provide a suitable environment for these stars to form, although there are likely chemically enriched stars there, too. Certain models of extremely high-mass stars (~300 solar masses) offer an alternative to these metal-free stars, but in light of the previous observations, the metal-free models remain enticing.

    For the time being, our telescopes can’t detect Population III stars. Until they do, we can still learn a lot about the early universe by studying blue compact dwarf galaxies like I Zwicky 18. Low-redshift, metal-free analogs of the first stars in the universe are close enough for us to study today. The most metal-poor galaxy in the universe is a good place to start.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Medium

    Medium is an online publishing platform developed by Evan Williams and launched in August 2012. It is owned by A Medium Corporation. The platform is an example of social journalism, hosting a hybrid collection of amateur and professional writers and publications, as well as blogs exclusive to Medium, and it is regularly regarded as a blog host.

    Williams developed Medium as a way to publish writings and documents longer than Twitter’s 140-character (now 280-character) maximum.

     
  • richardmitnick 11:39 am on February 9, 2019 Permalink | Reply
    Tags: First observed in 2008 a binary system known as IGR J18245–2452 from its x-ray outbursts and PSR J1824–2452I for its radio emissions, Medium, The fastest millisecond pulsar PSR J1748–2446ad

    From Medium: “IGR J18245–2452: The most important neutron star you’ve never heard of” 

    From Medium

    Jan 21, 2019
    Graham Doskoch

    Astronomers have spent thirty years on the theory behind how millisecond pulsars form. Now we know they got it right.

    Neutron stars are known for their astonishing rotational speeds, with many spinning about their axes several times each second. The mechanism behind this is simple: when a fairly massive star several times the radius of the Sun collapses into a dense ball roughly twenty kilometres in diameter, conservation of angular momentum dictates that it must spin faster.
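
    The scale of this effect is easy to estimate: for a uniform sphere of fixed mass, the moment of inertia goes as R², so conserving angular momentum means the spin period scales as R². The sketch below is deliberately idealized: real collapses involve only the stellar core and shed angular momentum, so newborn pulsars spin far more slowly than this simple limit suggests.

    # Spin-up from angular momentum conservation: P_new = P_old * (R_new / R_old)**2
    # for a uniform-density sphere of fixed mass (a crude, idealized illustration).
    def collapsed_period(p_old_s, r_old_km, r_new_km):
        return p_old_s * (r_new_km / r_old_km) ** 2

    p_star = 25 * 86400.0   # a Sun-like rotation period, ~25 days, in seconds
    r_star = 7.0e5          # a Sun-like radius, in km
    r_ns = 10.0             # neutron-star radius, in km

    p_ns = collapsed_period(p_star, r_star, r_ns)
    print(f"{p_ns * 1e3:.2f} ms")   # a fraction of a millisecond in this idealized limit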

    However, one class of neutron stars can’t be explained this way: millisecond pulsars. These exotic objects spin hundreds of times each second, with the fastest, PSR J1748–2446ad, rotating at over 700 hertz! Since their discovery in the 1980s, astronomers have proposed a slightly different evolutionary path for them. After studying dozens of systems, they theorized that millisecond pulsars are very old — old enough that they’ve lost much of their original angular momentum to radiation. However, they’re also in binary systems, and under certain conditions a companion star can transfer matter — and thus angular momentum — to the pulsar, spinning it back up again.

    A plot of the periods and magnetic fields of pulsars. Millisecond pulsars have extremely short periods, and comparatively weak magnetic fields. Image credit: Swinburne University of Technology

    During this period of accretion, the system should become an x-ray binary, featuring strong emission from the hot plasma in the neutron star’s accretion disk. There should also be periods where the neutron star behaves like an ordinary radio pulsar, emitting radio waves we can detect on Earth. If we could detect both types of radiation from a single system, it might be the clinching bit of evidence for the spin-up model of millisecond pulsar formation.

    In 2013, astronomers discovered just that: a binary system known as IGR J18245–2452 from its x-ray outbursts, and PSR J1824–2452I for its radio emissions. First observed in 2008, it had exhibited both radio pulsations and x-ray outbursts within a short period of time, clear evidence of the sort of transitional stage everyone had been looking for. This was it: a confirmation of the ideas behind thirty years of work on how these strange systems form.

    INTEGRAL observations of IGR J18245–2452 from February 2013 (top) and March/April 2013 (bottom). The system is only visible in x-rays in the second period. Image credit: ESA/INTEGRAL/IBIS/Jörn Wilms.

    ESA/Integral

    The 2013 outburst

    Towards the end of March of 2013, the INTEGRAL and Swift space telescopes detected x-rays from an energetic event coming from the core of the globular cluster M28 (Papitto et al. 2013).

    NASA Neil Gehrels Swift Observatory

    It appeared to be an outburst of some kind — judging by the Swift observations, likely a thermonuclear explosion. A number of scenarios can lead to x-ray transients, including novae and certain types of supernovae. Binary systems are often the culprits, where mass can be transferred from one star or compact object to another.

    Fig. 7, Papitto et al. Swift data from observations of an outburst show its characteristic exponentially decreasing cooling.

    One thermonuclear burst observed by Swift followed the time-evolution profile expected for such a detonation: an increase in luminosity for 10 seconds, followed by an exponential decrease with a time constant of 38.9 seconds. This decrease marks the start of post-burst cooling. The other outbursts from the system should have had similar profiles characteristic of x-ray-producing thermonuclear explosions, and later observations confirmed that this is the case (De Falco et al. 2017, Astronomy and Astrophysics), albeit with slightly different rise times and decay constants.
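
    As a rough illustration of what such a profile looks like, and how a decay constant is pulled out of it, here is a minimal sketch that builds a synthetic rise-plus-exponential-decay light curve and fits it with least squares; the data are simulated, not actual Swift measurements.

    import numpy as np
    from scipy.optimize import curve_fit

    # Idealized burst profile from the text: a ~10 s rise to peak, then an
    # exponential decay with time constant tau ~ 38.9 s (post-burst cooling).
    def burst_model(t, t_peak, peak, tau):
        rise = peak * t / t_peak                        # linear ramp up to the peak
        decay = peak * np.exp(-(t - t_peak) / tau)      # exponential cooling
        return np.where(t < t_peak, rise, decay)

    rng = np.random.default_rng(0)
    t = np.linspace(0, 200, 400)                        # seconds since burst onset
    flux = burst_model(t, 10.0, 1.0, 38.9) + rng.normal(0, 0.02, t.size)

    popt, _ = curve_fit(burst_model, t, flux, p0=[8.0, 0.9, 30.0])
    print(f"fitted rise ~ {popt[0]:.1f} s, tau ~ {popt[2]:.1f} s")   # expect ~10 s and ~38.9 s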

    To determine the identity of the transient, now designated IGR J18245–2452, astronomers made follow-up observations using the XMM-Newton telescope.

    ESA/XMM Newton

    The nature of the outburst would determine how it evolved over time. For instance, supernovae (usually) decrease in brightness over the course of weeks or months. In this case, however, the x-rays were still detected — albeit a bit weaker. More surprisingly, the strength of the emission appeared to be modulated, varying with a period of 3.93 milliseconds.
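
    Spotting a modulation like that amounts to a periodicity search over the x-ray time series. The toy sketch below runs a Lomb–Scargle periodogram from astropy on synthetic data with a 3.93 ms signal buried in noise; it is purely illustrative and does not use real XMM-Newton counts.

    import numpy as np
    from astropy.timeseries import LombScargle

    period_true = 3.93e-3                                # seconds
    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0.0, 2.0, 20_000))           # irregular sample times, in s
    rate = 1.0 + 0.3 * np.sin(2 * np.pi * t / period_true)
    rate += rng.normal(0.0, 0.2, t.size)                 # measurement noise

    freq = np.linspace(240.0, 270.0, 30_000)             # search band, in Hz
    power = LombScargle(t, rate).power(freq)
    best_freq = freq[np.argmax(power)]
    print(f"best period ~ {1e3 / best_freq:.3f} ms")     # expect ~3.93 ms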

    Such a short period seemed to indicate that a pulsar might be responsible. The team checked databases of known radio pulsars and found one that matched the x-ray source: PSR J1824–2452I, a millisecond pulsar in a binary system. Even after this radio counterpart had been found, however, two questions remained: Were these x-ray pulses new or a long-term process, and how did they relate to the radio emission?

    Diving into the archives

    A handy tool for observational astronomers is archival images. By looking at observations taken months, years or decades before an event, scientists can — if they’re lucky — peek into the past to see what an object of interest looked like long before it became interesting. Archival data is often of use for teams studying supernovae, as even a previously uninteresting or unnoticed star can tell the story of a supernova’s progenitor.

    Fig. 3, Papitto et al. Chandra images from 2008, showing the system in quiescent (top) and active (bottom) states.

    NASA/Chandra X-ray Telescope

    In this case, Papitto et al. looked at Chandra observations from 2008, comparing them with new data from April 2013. They found x-ray variability occurring shortly after a period of radio activity by the pulsar, indicating that the system had switched off its radio emissions and started emitting x-rays. This was extremely interesting, because new observations with three sensitive radio telescopes — Green Bank, Parkes, and Westerbork — indicated that the pulsar was no longer active in radio waves.

    Green Bank Radio Telescope, West Virginia, USA, now the centerpiece of the GBO, Green Bank Observatory, being cut loose by the NSF

    CSIRO/Parkes Observatory, located 20 kilometres north of the town of Parkes, New South Wales, Australia

    Westerbork Synthesis Radio Telescope, an aperture synthesis interferometer near World War II Nazi detention and transit camp Westerbork, north of the village of Westerbork, Midden-Drenthe, in the northeastern Netherlands

    It was possible that the pulsar had been eclipsed while emission was ongoing, and this may indeed have happened at some points, but it was not likely to be the main factor behind the apparent quiescence.

    A few weeks later, however, the exact opposite happened: the pulsar exited its quiescent radio state and was again picked up by the three radio telescopes. In short, over a period of months, it had oscillated between behaving like an x-ray binary and a normal millisecond pulsar. Finally, x-ray observations had conclusively shown that this sort of bizarre transitional state was possible!

    The mechanism

    IGR J18245–2452 spends the vast majority of its time in what is known as a “quiescent” state, during which there is comparatively little x-ray activity. The pulsar’s magnetosphere exerts a pressure on the infalling gas, forming a disk at a suitable distance from the surface. Eventually, however, there is enough buildup that an x-ray outburst occurs, lasting for a few months. The outburst decreases the mass accretion rate, and the magnetosphere pushes away much of the transferred gas, allowing radio pulsations to take place once more.

    Fig. 2, De Falco et al. Over a period of a few weeks, IGR J18245–2452 underwent a number of individual x-ray outbursts, themselves indicative of a brief period of x-ray activity and radio silence.

    It’s expected that the pulsar will eventually be spun-up until its rotational period is on the order of a millisecond or so. It will cease x-ray emissions, and be visible mainly through radio pulses. All of this, however, is far in the future, and during our lifetimes, IGR J18245–2452 will stay in its current transitional state, halfway between an x-ray binary and a millisecond pulsar.

    Women in STEM – Dame Susan Jocelyn Bell Burnell

    Dame Susan Jocelyn Bell Burnell discovered pulsars with radio astronomy. Jocelyn Bell at the Mullard Radio Astronomy Observatory, Cambridge University, taken for the Daily Herald newspaper in 1968. She was denied the Nobel Prize.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Medium

    Medium is an online publishing platform developed by Evan Williams and launched in August 2012. It is owned by A Medium Corporation. The platform is an example of social journalism, hosting a hybrid collection of amateur and professional writers and publications, as well as blogs exclusive to Medium, and it is regularly regarded as a blog host.

    Williams developed Medium as a way to publish writings and documents longer than Twitter’s 140-character (now 280-character) maximum.

     
  • richardmitnick 1:05 pm on January 5, 2019 Permalink | Reply
    Tags: Blogs publishing falsehoods are bad enough but the rise of social media made the situation even worse, In 2014 only 14 percent of those surveyed showed “a great deal of confidence” in academia, It is incredibly rare for the scientific consensus as a whole to be wrong, Measles, Medium, Parents in the United States are fooled by the false claim that vaccines cause autism, Research shows that people lack the skills for differentiating misinformation from true information, Scientists get rewarded in money and reputation for finding fault with statements about reality made by other scientists, The Internet Is for…Misinformation, The lack of curation means thinking errors are causing us to choose information that fits our intuitions and preferences as opposed to the facts, The large gaps between what scientists and the public believe about issues such as climate change evolution GMOs and vaccination exemplify the problems caused by misinformation and lack of trust in science, The Pro-Truth Pledge combines the struggle against misinformation with science advocacy, The rise of the internet and more recently social media is key to explaining the declining public confidence in expert opinion, These problems result from the train wreck of human thought processes meeting the internet, This crumbling of trust in science and academia forms part of a broader pattern, This greater likelihood of experts being correct does not at all mean we should always defer to experts, We can uplift the role of science in our society. The March for Science movement is a great example of this effort, We’re In an Epidemic of Mistrust in Science

    From Medium: “We’re In an Epidemic of Mistrust in Science” 

    From Medium

    Jun 27, 2018
    Gleb Tsipursky

    A family physician prepares a measles vaccine during a consultation in Bucharest, Romania on April 16, 2018. Photo by Daniel Mihailescu/AFP via Getty

    Dozens of infants and children in Romania died recently in a major measles outbreak, as a result of prominent celebrities campaigning against vaccination. This trend parallels that of Europe as a whole, which suffered a 400 percent increase in measles cases from 2016 to 2017. Unvaccinated Americans traveling to the World Cup may well bring back the disease to the United States.

    Of course, we don’t need European travel to suffer from measles. Kansas just experienced its worst measles outbreak in decades. Children and adults in a few unvaccinated families were key to this widespread outbreak.

    Just like in Romania, parents in the United States are fooled by the false claim that vaccines cause autism. This belief has spread widely across the country and leads to a host of problems.

    Measles was practically eliminated in the United States by 2000. In recent years, however, outbreaks of measles have been on the rise, driven by parents failing to vaccinate their children in a number of communities. We should be especially concerned because our president has frequently expressed the false view that vaccines cause autism, and his administration has pushed against funding “science-based” policies at the Centers for Disease Control and Prevention.

    These illnesses and deaths are among many terrible consequences of the crisis of trust suffered by our institutions in recent years. While headlines focus on declining trust in the media and government, science and academia are not immune to this crisis of confidence, and the results can be deadly.

    Consider that in 2006, 41 percent of respondents in a nationwide poll expressed “a lot of confidence” in higher education. Fewer than 10 years later, in 2014, only 14 percent of those surveyed showed “a great deal of confidence” in academia.

    What about science as distinct from academia? Polling shows that the number of people who believe science has “made life more difficult” increased by 50 percent from 2009 to 2015. According to a 2017 survey, only 35 percent of respondents have “a lot” of trust in scientists; the number of people who trust scientists “not at all” increased by over 50 percent from a similar poll conducted in December 2013.

    This crumbling of trust in science and academia forms part of a broader pattern, what Tom Nichols called the death of expertise in his 2017 book of the same name. Growing numbers of people claim their personal opinions hold equal weight to the opinions of experts.

    Should We Actually Trust Scientific Experts?

    While we can all agree that we do not want people to get sick, what is the underlying basis for why the opinions of experts — including scientists — deserve more trust than the average person in evaluating the truth of reality?

    The term “expert” refers to someone who has extensive familiarity with a specific area, as shown by commonly recognized credentials, such as a certification, an academic degree, publication of a book, years of experience in a field, or some other way that a reasonable person may recognize an “expert.” Experts are able to draw on their substantial body of knowledge and experience to provide an opinion, often expressed as “expert analysis.”

    That doesn’t mean an expert opinion will always be right—it’s simply much more likely to be right than the opinion of a nonexpert. The underlying principle here is probabilistic thinking, our ability to predict the truth of current and future reality based on limited information. Thus, a scientist studying autism would be much more likely to predict accurately the consequences of vaccinations than someone who has spent 10 hours Googling “vaccines and autism.”

    This greater likelihood of experts being correct does not at all mean we should always defer to experts. First, research shows that experts do best in evaluating reality in environments that are relatively stable over time and thus predictable, and when the experts have a chance to learn about the predictable aspects of this environment. Second, other research suggests that ideological biases can have a strongly negative impact on the ability of experts to make accurate evaluations. Third, material motivations can sway experts to conduct an analysis favorable to their financial sponsor.

    However, while individual scientists may make mistakes, it is incredibly rare for the scientific consensus as a whole to be wrong. Scientists get rewarded in money and reputation for finding fault with statements about reality made by other scientists. Thus, when the large majority of them agree on something — when there is a scientific consensus — it is a clear indicator that whatever they agree on accurately reflects reality.

    The Internet Is for…Misinformation

    The rise of the internet and, more recently, social media, is key to explaining the declining public confidence in expert opinion.

    Before the internet, the information accessible to the general public about any given topic usually came from experts. For instance, scientific experts on autism were invited to talk on this topic on mainstream media, large publishers published books by the same experts, and they wrote encyclopedia articles on the topic.

    The internet has enabled anyone to be a publisher of content, connecting people around the world with any and all sources of information. On the one hand, this freedom is empowering and liberating, with Wikipedia being a great example of a highly curated and accurate source on the vast majority of subjects. On the other hand, anyone can publish a blog post making false claims about links between vaccines and autism. If they are skilled at search engine optimization or have money to invest in advertising, they can get their message spread widely.

    Unfortunately, research shows that people lack the skills for differentiating misinformation from true information. This lack of skills has clear real-world effects: Just consider that U.S. adults believed 75 percent of fake news stories about the 2016 U.S. presidential election. The more often someone sees a piece of misinformation, the more likely they are to believe it.


    Blogs publishing falsehoods are bad enough, but the rise of social media made the situation even worse. Most people reshare news stories without reading the actual article, judging the quality of the story by the headline and image alone. No wonder research indicates that misinformation spreads as much as 10 times faster and further on social media than true information. After all, the creator of a fake news item is free to devise the most appealing headline and image, while credible sources of information have to stick to factual headlines and images.

    These problems result from the train wreck of human thought processes meeting the internet. We all suffer from a series of thinking errors, such as confirmation bias, our tendency to look for and interpret information in ways that conform to our beliefs.

    Before the internet, we got our information from sources like mainstream media and encyclopedias, which curated the information for us to ensure it came from experts, minimizing the problem of confirmation bias. Today, the lack of curation means thinking errors are causing us to choose information that fits our intuitions and preferences, as opposed to the facts. Moreover, some unscrupulous foreign actors — such as the Russian government — and domestic politicians use misinformation as a tool to influence public discourse and public policy.

    The large gaps between what scientists and the public believe about issues such as climate change, evolution, GMOs, and vaccination exemplify the problems caused by misinformation and lack of trust in science. Such mistrust results in great harm to our society, from outbreaks of preventable diseases to highly damaging public policies.

    What Can We Do?

    Fortunately, there are proactive steps we can take to address the crisis of trust in science and academia.

    For example, we can uplift the role of science in our society. The March for Science movement is a great example of this effort. First held on Earth Day in 2017 and repeated in 2018, this effort involves people rallying in the streets to celebrate science and push for evidence-based policies. Another example is the Scholars Strategy Network, an effort to support scholars in popularizing their research for a broad audience and connecting scholars to policymakers.

    We can also fight the scourge of misinformation. Many world governments are taking steps to combat falsehoods. While the U.S. federal government has dropped the ball on this problem, a number of states have passed bipartisan efforts promoting media literacy. Likewise, many nongovernmental groups are pursuing a variety of efforts to fight misinformation.

    The Pro-Truth Pledge combines the struggle against misinformation with science advocacy. Founded by a group of behavioral science experts (including myself) and concerned citizens, the pledge calls on public figures, organizations, and private citizens to commit to 12 behaviors listed on the pledge website that research in behavioral science shows correlate with truthfulness. Signers are held accountable through a crowdsourced reporting and evaluation mechanism while getting reputational rewards because of their commitment. The scientific consensus serves as a key measure of credibility, and the pledge encourages pledge-takers to recognize the opinions of experts as more likely to be true when the facts are disputed. More than 500 politicians took the pledge, including state legislators Eric Nelson (PA) and Ogden Driskell (WY) and Congress members Beto O’Rourke (TX) and Marcia Fudge (OH).

    Two research studies at Ohio State University demonstrated the effectiveness of the pledge in changing the behavior of pledge-takers to be more truthful with a strong statistical significance. Thus, taking the pledge yourself and encouraging people you know and your elected representatives to take the pledge is an easy action to both fight misinformation and promote science.

    Conclusion

    I have a dream that, one day, children will not be getting sick with measles because their parents put their trust in a random blogger instead of extensive scientific studies. I have a dream that schools will be teaching media literacy, and people will know how to evaluate the firehose of information coming their way. I have a dream that we will all know that we suffer from thinking errors and will watch out for confirmation bias and other problems. I have a dream that the quickly growing distrust of experts and science will seem like a bad dream. I have a dream that our grandchildren will find it hard to believe our present reality when we tell them stories about the bad old days.

    To live these dreams requires all of us who care about truth and science to act now, before we fall further down the slippery slope. Our information ecosystem and credibility mechanisms are broken. Only a third of Americans trust scientists, and most people can’t tell the difference between truth and falsehood online. The lack of trust in science — and the excessive trust in persuasive purveyors of misinformation — is perhaps the biggest threat to our society right now. If we don’t turn back from the brink, our future will not be a dream: It will be a nightmare.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Medium

    Medium is an online publishing platform developed by Evan Williams and launched in August 2012. It is owned by A Medium Corporation. The platform is an example of social journalism, hosting a hybrid collection of amateur and professional writers and publications, as well as blogs exclusive to Medium, and it is regularly regarded as a blog host.

    Williams developed Medium as a way to publish writings and documents longer than Twitter’s 140-character (now 280-character) maximum.

     
  • richardmitnick 11:44 am on December 12, 2018 Permalink | Reply
    Tags: Medium

    From Medium: “Did We Detect a White Hole?” 

    From Medium

    Nov 19, 2018
    Ella Alderson

    Take the number 25. The square root of 25 can be 5, but it can just as easily be -5. As two solutions to the field equations, both black and white holes stirred and excited scientists at the time, though only one went on to be well known. Image: Olena Shmahalo/Quanta Magazine

    Black holes are fascinating monsters. They shred stars and planets as massive as Jupiter, spinning wildly enough to capture everything within their mysterious caverns. Once past their event horizons, nothing can escape. One even dwells at the center of our galaxy, and possibly at the center of almost every galaxy we see harbored in the night sky. They are, without doubt, among the most powerful objects in the universe. And yet, as little as we know about them and as fascinating as they are, it’s their counterparts which prove even more elusive and more exciting to consider.

    Not long after Einstein introduced the world to general relativity in the early 1900s, there emerged the foundation for black holes and their mathematical opposites — white holes. Einstein himself didn’t predict them; he thought the extreme nature of black holes was far too outlandish to investigate. Yet to other scientists they became big points of interest.

    At their very essence, black holes and white holes are composed of a singularity (where an immense amount of mass is condensed down to a small amount of space) and an event horizon. They are identical to one another except for their direction of passage. While black holes devour matter and let nothing escape, white holes emit huge amounts of matter and energy, allowing nothing to travel inside them. They could never be entered. If an intrepid crew did attempt to enter a white hole, the sheer force of the gamma rays would destroy them and their ship. But even if the ship was strong enough to withstand that amount of energy, space-time around the white hole is structured so that the amount of acceleration required to get inside gets higher and higher the closer you get. In short, getting inside a white hole requires more energy than there exists in the entire universe.

    But just because a white hole obeys general relativity and is mathematically sound doesn’t mean it’s practical. Many scientists call white holes “an impossible possibility”, meaning that while they can’t be completely ruled out, they also don’t expect to see one in our telescopes. This is because this phenomenon violates the second law of thermodynamics: entropy in the universe must always stay the same or increase.

    Entropy is often described as chaos but can be better explained as an increase in how many states are possible for particles in a certain system. For example, a house demolished into rubble is an increase in entropy because that rubble can go on to make many other structures — sheds, bookshelves, mounds and paper — whereas a house is only one very specific state of those particles. Small, local decreases in entropy can occur as long as the universe’s overall entropy is increasing. Black holes are excellent at this because they take matter low in entropy, such as planets, and disperse them across large spaces over time, increasing the chaos of space. White holes, with their outpours of matter, violate this law as they would decrease overall entropy. This is also why physicists argue that time cannot go backwards.

    But this still doesn’t make white holes impossible.

    A black hole interacts with a star. Stephen Hawking showed that black holes shed their mass over time, suggesting that information can’t be eternally stored inside them. But if that information can’t be destroyed, where does it go? That’s where white holes come in. Image: DVDP

    A rare dip in entropy could temporarily reverse time and form a white hole. The only problem is that once time resumed its normal course, the white hole would explode and vanish in a powerful burst of energy. Some scientists speculate that this is exactly what created our universe; the Big Bang does mathematically look a lot like a white hole, the only difference being that the Big Bang had no singularity and instead occurred everywhere at the same time. But it would explain why so much matter and energy suddenly appeared.

    Some researchers have cited white holes as an answer to the black hole information paradox — a contradiction that says that information swallowed by a black hole is permanently lost during Hawking radiation but that this would violate a law of quantum mechanics that says that no information can ever be destroyed.

    If a black hole was connected to a white hole, all matter and energy consumed by the black hole would emerge from the white hole either in a different part of the universe or in another universe altogether. This would solve the question of information conservation. Hawking supported this theory for many years.

    Similarly, in 2014, a team led by theoretical physicist Carlo Rovelli suggested that once black holes could no longer evaporate and shrink due to the constraints of space-time, the black hole would then experience a quantum bounce (an outward pressure) and transform into a white hole. This means that black holes become white holes almost at the instant they form. However, outside observers continue to see a black hole for billions of years because of gravity’s time dilation. If this theory is correct, black holes that formed in the early years of the universe could be ready to die and burst into cosmic rays or another form of radiation at any moment.

    In fact, we might have already seen one.

    The location of GRB 060614. This explosion was trillions of times more powerful than the sun. Image: Hubble Space Telescope

    On a balmy summer day in 2006, NASA’s Swift satellite captured an exceptionally powerful gamma-ray burst (called GRB 060614) in a very strange region of the sky.

    NASA Neil Gehrels Swift Observatory

    These kinds of bursts usually fall into one of two categories — short bursts and long bursts — and are usually associated with a supernova, but GRB 060614 fit neither pattern. It lasted a remarkable 102 seconds yet wasn’t associated with any stellar explosion. Most gamma-ray bursts, for comparison, last only 2–30 seconds.

    GRB 060614 took place in a galaxy that had very few stars able to produce explosions or long bursts. It appears to astronomers and astrophysicists that this gamma-ray burst came from nowhere and simply collapsed in on itself after just a few short moments. A few years later, scientists introduced the hypothesis that GRB 060614 could have been a white hole. This does, after all, describe perfectly what we would expect to see from a white hole — a powerful, unstable fountain of matter and energy that disappears shortly after forming, usually from a point too small to see. And while it can’t be concluded that GRB 060614 was in fact such a fantastic phenomenon, current scientific models have no explanation for what happened. NASA scientists do believe something entirely new was responsible for the gamma-ray burst, with many admitting that despite dedicating a great deal of time to observation and data, they simply don’t know what could have caused it. Since its discovery in 2006, dozens of telescopes, including Hubble, have studied the event.

    For now, no one can say with certainty that we've seen these fantastic objects in our universe. But we can say this: general relativity breaks down at a black hole's singularity, where the energy density and curvature are simply too extreme for it to describe what happens inside. Until we have a more complete understanding of physics, we can't rule out objects like white holes and wormholes, which live, for now, only in science fiction. But I suppose it's important to add that at one point, black holes were considered fiction too.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Medium

    Medium is an online publishing platform developed by Evan Williams and launched in August 2012. It is owned by A Medium Corporation. The platform is an example of social journalism, with a hybrid collection of amateur and professional people and publications, as well as exclusive blogs and publishers, and is regularly regarded as a blog host.

    Williams developed Medium as a way to publish writings and documents longer than Twitter’s 140-character (now 280-character) maximum.

     
  • richardmitnick 11:22 am on October 22, 2018 Permalink | Reply
    Tags: , , Medium,   

    From Medium: “Fusion Power, or Creating A Star On Earth” 

    From Medium

    Oct 20, 2018
    Ella Alderson

    1
    In this image, an accelerator shot brightens up the surface of the Z Machine in New Mexico. It’s the world’s most powerful source of laboratory radiation and holds the energy equivalent of up to 52 sticks of dynamite. Researchers at the Sandia National Laboratory hope to find an alternative approach to fusion power.

    Sandia Z machine

    Energy production today is a battlefield. It's a competition between the grime and pollution of fossil fuels, the unreliability of solar and wind power, and the looming towers of nuclear plants and their dangerous waste. With the world's energy consumption growing each year and the planet straining under our use of coal and oil, an innovative source of energy for our cities is quickly becoming a necessity. That's what makes fusion power so attractive; it promises to provide all the energy we need reliably, cheaply, and, most exciting of all, in an environmentally safe way with zero carbon emissions. It would be the ultimate source of power for our booming civilizations here on Earth and our craft setting out to explore the solar system. Even catastrophes at a fusion reactor would result in little more than the plasma expanding and cooling, with no chance of a huge, dangerous explosion.

    And it’s not impossible. Fusion is what drives every star dotting the endless skies — including our gorgeous, broiling sun. At its core, hydrogen is fused into helium and eventually escapes as electromagnetic radiation. That is, two hydrogen atoms are rammed together and produce a helium atom as a result. But the fusion of one element into another is not as easy as it sounds. Because both protons have the same charge, the only way to overcome their natural repellence is to bring them close enough together that they fuse. The sun is able to do this because of its immense mass (it claims 99.8% of all matter in our solar system) and, consequently, the immense amount of gravitational force made available. The heat, the pressure, and the gravity are what make solar fusion possible.

    That’s what makes fusion different from fission — while fission aims to split apart a heavier nucleus into two lighter ones, fusion brings together lighter nuclei into a heavier one. The fusion process means the resulting element has less mass and the remaining mass turns into energy. An enormous amount of energy. Our nuclear reactors today use fission, which unfortunately makes radioactive waste that lasts tens of thousands of years. Still, many see fission powered reactors as an improvement over fossil fuels since it’s less polluting than most other sources of energy and has helped us avoid 14 billion metric tons of carbon dioxide in the past 21 years. Nuclear power is affordable and provides about 20% of all electricity in the US.

    But it’s not fusion. It’s not everything fusion promises to be.

    2
    Oil from an explosion mars the waters of the Gulf of Mexico. Fusion energy could provide the same amount of power from a single glass of seawater as burning an entire barrel of oil. Hydrogen isotopes extracted from the seawater would provide limitless power. Image by Kari Goodnough.

    But it turns out recreating conditions of the sun here on Earth isn’t going to be a straightforward task. One of the saddest jokes regarding fusion is that it’s the energy of the future…and always will be. This type of remarkable energy has been 30 years away for the past 8 decades now. But this time could be different. Physicists are feeling more confident than ever in their ability to problem-solve and there’s been a great number of breakthroughs in the last few years.

    One of the problems they face is that the process requires temperatures in the hundreds of millions of degrees — temperatures up to 10 times higher than those at the core of the sun. Needless to say, no solid material could withstand that amount of heat, so scientists often use magnetic fields to suspend the scorching plasma in what's known as magnetic confinement. Magnets then press the plasma into higher densities, but that means the atoms aren't always stable enough to contain the energy. Plasma heated using lasers and ion beams requires too much energy going into the system.

    In short, the current goal of fusion research is to break even in terms of energy: researchers want to get at least as much energy out as they put in. Up until now we've been working at a deficit, where energy output is far less than energy input. The end goal, of course, is to get many times as much energy out as we put in.

    Some new approaches use a hybrid of electric and magnetic fields to beam atoms against a solid target until atoms from the beam fuse with those of the target. This process uses hydrogen, since lighter elements produce more energy during fusion. The trick here is to minimize the number of atoms that scatter and thus increase the amount of energy collected.

    As far as a commercial reach for fusion goes, estimates range from 60 years from today to a mere 15 depending on who you ask. Researchers from MIT are confident they can have a fusion reactor on the grid as soon as 2033, though even that brings up the question of whether or not we can afford to wait so long for clean energy.

    3
    The Joint European Torus is the world’s largest and most powerful tokamak — a machine that confines hot plasma into the shape of a torus by use of a magnetic field.

    Instead of focusing too much on the technicalities of fusion power itself, I was very interested in what this kind of energy would mean for space exploration. It turns out that NASA is funding a fusion-powered rocket with hopes of a working prototype by 2020. If successful, fusion-powered spacecraft would reach Mars twice as fast as anything we could send now. This means that instead of a trip to the red planet taking 7 months, it would only take a little over 3, greatly reducing the crew's exposure to radiation, psychological strain, and weightlessness. Not to mention they would need much less food, fuel, and oxygen onboard. In fact, a small grain of aluminum would provide the equivalent energy of a gallon of fuel in today's chemical rockets.

    Fusion rockets would have a specific impulse of 130,000 seconds — 300 times greater than that of modern rockets. Specific impulse describes the relationship between thrust and the amount of propellant used: a chemical rocket with a specific impulse of 450 seconds can produce 1 pound of thrust from 1 pound of fuel for about 450 seconds. Fusion rockets would also allow for a bigger payload, since not as much room is needed for fuel. Instead, magnets with lithium bands would push atoms together, resulting in fusion and energy to push the rocket forward. If fusion rockets use hydrogen as a propellant, they could replenish their stock by collecting hydrogen from the surface of planets.
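    To make those numbers concrete, here is a small illustrative sketch (again mine, not the article's) that converts specific impulse into exhaust velocity and runs both figures through the Tsiolkovsky rocket equation; the wet-to-dry mass ratio of 5 is an arbitrary assumption.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s: float, mass_ratio: float) -> float:
    """Tsiolkovsky rocket equation: delta-v = Isp * g0 * ln(initial mass / final mass)."""
    return isp_s * G0 * math.log(mass_ratio)

MASS_RATIO = 5.0  # assumed wet/dry mass ratio, purely for comparison

for name, isp in (("Chemical rocket, Isp ~450 s", 450.0),
                  ("Fusion rocket, Isp ~130,000 s", 130_000.0)):
    exhaust_kms = isp * G0 / 1000
    print(f"{name}: exhaust velocity ~{exhaust_kms:,.1f} km/s, "
          f"delta-v ~{delta_v(isp, MASS_RATIO) / 1000:,.0f} km/s")
```

    For the same mass ratio, the fusion rocket's achievable change in velocity is larger by the same factor of roughly 300, which is where the shorter Mars transit times come from.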

    It’s a fast, much more efficient method of interplanetary travel. And the foundations for it are being built today. Projects like VASIMR act as steps on the path to fusion rockets. VASIMR is a plasma rocket that heats and expels the plasma to create thrust but, because fusion rockets will also use plasma, anything researchers can learn from this craft would help them in the design and creation of a fusion drive.

    Recreating the power of a star is something that sounds uniquely human: a violent, tricky, seemingly fantastical goal. And yet achieving it would bring our civilization so much, both in terms of physical energy and introspection. How far can we truly advance, and can we reach the goals of our wildest ambitions?

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Medium

    Medium is an online publishing platform developed by Evan Williams and launched in August 2012. It is owned by A Medium Corporation. The platform is an example of social journalism, with a hybrid collection of amateur and professional people and publications, as well as exclusive blogs and publishers, and is regularly regarded as a blog host.

    Williams developed Medium as a way to publish writings and documents longer than Twitter’s 140-character (now 280-character) maximum.

     
  • richardmitnick 8:15 am on September 5, 2018 Permalink | Reply
    Tags: A Type IV civilization would be undetectable to us, A Type V master race that would function like gods able to harness energy not only from this universe but all universes in all dimensions, , At 100000 times the energy usage we have now we’d have access to 10¹⁷ watts of energy as a Type I civilization, , , , Dyson ring and Dyson bubble, Medium, Micro-scale developed by John D. Barrow, , Physicist Michio Kaku, The Kardashev scale designed by astrophysicist Nikolai Kardashev, The trick for a galactic species would be the constraints of the laws of physics, These feats are very sci-fi and as far as we know impossible to accomplish. But then again we’re a lowly Type 0 civilization with no idea what may lie ahead, To colonize all the stars we could use self-replicating robots that would assemble and maintain the Dyson swarms, We’re a Type Zero Civilization   

    From Medium: “We’re a Type Zero Civilization” 

    From Medium

    Aug 11, 2018
    Updated 9.5.18
    Ella Alderson

    When will we move up the scale?

    1
    Image: Juanmrgt/iStock/Getty Images Plus

    The Kardashev scale, designed by astrophysicist Nikolai Kardashev, was created to assess how advanced a civilization is by taking into consideration multiple factors, including population growth, technology, and energy demands. The idea is that the more advanced the people are, the higher and more complex their energy usage will be. When we first appeared on Earth 200,000 years ago, for example, our species was few in number, and the extent of our energy source was, really, just fire. We now number in the billions and use a combination of wind, solar, and nuclear energy sources, though our main energy supply comes from fossil fuels (it really seems like we just moved on to burning bigger and badder things). The International Energy Agency estimates that our societies now run on roughly 17.37 terawatts of power, averaged over the year.

    All of this may sound fairly advanced — we've come a long way from just using logs to fuel our everyday lives. Yet in reality, we're quite primitive compared to where we could be. We still get the majority of our energy from dead plants and animals, a source that will run out sooner or later and which is helping destroy our planet in the process.

    So where do we place on the Kardashev scale? We’re a zero: 0.72, to be more exact. Here’s what we need to move forward.
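    That 0.72 matches what Carl Sagan's continuous interpolation of the Kardashev scale, K = (log10 P − 6) / 10 with P in watts, gives for the roughly 17.4 terawatts quoted above. A minimal sketch (note that this formula pins Type I at 10¹⁶ watts, slightly below the ~10¹⁷-watt figure used later in this post):

```python
import math

def kardashev(power_watts: float) -> float:
    """Sagan's interpolation of the Kardashev scale: K = (log10(P) - 6) / 10, P in watts."""
    return (math.log10(power_watts) - 6) / 10

levels = [
    ("Humanity today (~17.4 TW)", 1.737e13),
    ("Type I   (1e16 W by this formula)", 1e16),
    ("Type II  (1e26 W)", 1e26),
    ("Type III (1e36 W)", 1e36),
]
for label, watts in levels:
    print(f"{label}: K = {kardashev(watts):.2f}")
```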

    Type I

    To become a Type I civilization we would have to harness all the available energy of our home planet at 100% efficiency. This means capturing the energy of every wave, every beam of sunlight, and every bit of fossil fuel we can dig up. To do that without rendering the entire planet uninhabitable, we'd have to use nuclear fusion. And to create all the energy we need via this method, we would require roughly 2.8 kilograms of hydrogen and helium every second, or 89 billion grams of hydrogen per year. You can gather more than that from one square km of ocean water.

    With this ability to harness all energy from Earth also comes the ability to control all of the planet’s natural forces, including volcanoes, geothermal vents, earthquakes, and climate. At 100,000 times the energy usage we have now, we’d have access to 10¹⁷ watts of energy as a Type I civilization. Consider, for example, the ability to control a hurricane. One such storm can release the power of hundreds of hydrogen bombs.

    While controlling the weather may sound very fantastical, physicist Michio Kaku theorizes that we'll reach Type I status in the next 100–200 years, as our energy production continues to grow at about 3% per year.

    2
    Dyson ring concept drawing (Source: Vedexent/Wikipedia)
    3
    Dyson bubble concept drawing (Source: PNG Crusade Bot/Wikipedia)/CC BY 2.5

    After we’ve been able to harness all the energy from our home planet, we’ll move on to harnessing all the energy of our home star, the sun. One way of doing this is to build a Dyson swarm around the star, or a group of panels capable of reflecting light into small solar power plants which could then send those light beams to Earth for our use. Similar to the work of controlling the forces here on Earth, we’d be able to control the star as well, including the manipulation of solar flares. Another way to get enough energy for a Type II civilization would be to build a fusion reactor on a huge scale or to use a reactor to essentially drain the hydrogen from a nearby gas giant, like Jupiter.

    At this point we’re a few thousand years into the future and using 10²⁶ watts of energy. A stellar civilization capable of gathering energy on this scale has become immune to extinction.

    Type III

    We’ve gone from controlling all the energy of our home planet to our home star and, now, our galaxy. Take the Dyson swarm proposed above and extend it to cover all 100 billion stars of the Milky Way. A civilization this advanced, and with access to this many resources, would truly be a master race, having at their disposal 10³⁶ watts of energy. Hundreds of thousands, even millions of years of evolution would mean that we as a race would look very different, both biologically and in terms of merging with our technology in becoming cyborgs or even fully robotic.

    To colonize all those stars we could use self-replicating robots that would assemble and maintain the Dyson swarms, though it's likely we'll have found a new energy source by then. This could include tapping into the energy of the black hole at the center of the Milky Way, or even using gamma-ray bursts. Another possibility, though they have yet to be detected, would be to find a white hole and use the energy that emanates from it.

    The trick for a galactic species would be the constraints of the laws of physics — how can they be united when their colonies are light years away? They’d have to find a way to move at the speed of light or, even better, create wormholes to other locations.

    Kardashev ended the scale here because he didn’t believe it could go any further, stating that any civilizations beyond Type III would be too advanced to even fathom. But other astronomers have since extended the scale to include Type IV and Type V.

    Type IV and V

    A Type IV civilization would be undetectable to us. It would be able to harness the entire energy of the universe and move across all of space, appearing as nothing more than a work of nature. Some speculate that giant voids in space, like the one 1.8 billion light years across and missing 90% of its galaxies, could be proof of a civilization making use of the universe. But a civilization this advanced might not even harness energy as we know it anymore, choosing instead to move into more exotic substances, like dark energy. They might also live inside black holes, controlling 10⁴⁶ watts of energy. These feats are very sci-fi and, as far as we know, impossible to accomplish. But then again we’re a lowly Type 0 civilization with no idea what may lie ahead.

    It gets even more fantastical when one considers a Type V master race that would function like gods, able to harness energy not only from this universe, but all universes in all dimensions. Its energy usage and access to knowledge would be incomprehensible.

    Micro-scale

    The micro-dimensional mastery extension to the Kardashev scale was proposed by John D. Barrow, a scientist who decided to take civilization ranking in the opposite direction, choosing instead to base his scale on how small an object a civilization can control. This scale is outlined differently:

    Type I-minus: controlling matter at the observable level, that is, being able to manipulate things we can see and touch.

    Type II-minus: controlling genes

    Type III-minus: controlling molecules

    Type IV-minus: controlling atoms

    Type V-minus: controlling protons

    Type VI-minus: controlling elementary particles, like quarks

    Type Omega-minus: controlling fundamental elements of spacetime

    Whether using the original or the micro version, the beautiful thing about the Kardashev scale is that it's not just full of fascinating and alien concepts; it's also a blueprint for where we could go if our species can just make it through the next 100 years. Will the human race emerge from our planet and thrive in the universe just as we emerged from Africa and grew to thrive around the world?

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 8:29 am on September 2, 2018 Permalink | Reply
    Tags: "Valley of Genius", Medium,   

    From Medium: “Did we learn the right things from Steve Jobs?” 

    From Medium

    Aug 19, 2018
    Marianne Bellotti

    Our quest for meaning ultimately distorts our understanding of what creates success.

    I’m going to take a break from writing about mainframes to do something a bit different: a book review! I’ve been spending the better part of a month working through Adam Fisher’s Valley of Genius. Usually, I plow through a book in a week or two, but this one is over 500 pages (ಠ_ಠ) … so it took a minute.

    1

    Valley of Genius is written in an oral history style, meaning rather than Fisher retelling stories like the rise and fall of Atari, the founding of the Homebrew Computer Club, or Larry and Sergey's Excellent Graduate School Adventure, Fisher strings together direct quotes from the people who were there. Five hundred and twelve pages' worth of quotes, one after the other, often without any attribution clarifying at what point in the years of interviewing necessary for this book the statements were made.

    I’m not a fan of this technique. There are various points where quotes are strung together to give the impression of one person responding directly to a remark made by someone else. For example, Mark Zuckerberg repeats the word “Domination” over and over again as various people affiliated with Facebook rattle off descriptions of early ethical dilemmas. It encourages the reader to imagine Zuckerberg at his most callous and smug while reading.

    That’s pretty manipulative. The interviewee’s own words used to editorialize against him.

    On the other hand, allowing the people on the ground to speak for themselves reveals how significant the things removed can be when the past is being shaped into a narrative. There's not much in this book that is completely new. You can read the same stories in any number of books or profiles specific to the individual companies or personalities. But Valley of Genius attempts to connect these separate casts of characters to examine the broader context of Silicon Valley's evolution. Most of these people knew each other; they influenced each other; they stole from each other.

    Because much of my career involves technology applied to complex social issues (before USDS I worked at the UN and overseas for various governments and NGOs), I'm often struck by how often people learn the wrong things from experience. How often the narrative constructed to explain a success or failure is edited to eliminate important details. Then you have new people coming in believing they understand the history of past projects and making the same mistakes, or failing to find success by applying a supposedly tried-and-true method.

    Surfacing those details — the stuff often omitted but critical to understanding what happened — is something Valley of Genius did really well.

    The curious case of Steve Jobs

    Steve Jobs via Matt Yohe/Wikipedia https://en.wikipedia.org/wiki/User:Matt_Yohe

    Before I read this book I knew Steve Jobs was an asshole. I knew that Apple stole the Mac from Xerox PARC from stories my father used to tell me growing up. I learned later on that the image I had in my head of Jobs and Wozniak in a garage building computers probably wasn’t true because Steve Jobs wasn’t an engineer (although Valley of Genius claims he was excellent at soldering). I knew he was abusive and manipulative with his staff. I knew he was an asshole.

    I just didn’t realize how much of an asshole.

    One of the first stories in Fisher’s book that really caught my attention was an anecdote about Jobs’s time at Atari:

    2

    A version of this story appears in other Steve Jobs biographies but framed very differently. Steve Jobs: The Man Who Thought Different tells the story this way:

    3

    4

    Implying that the bonus was a surprise, not something stated upfront, certainly puts Jobs in a slightly better light. Walter Isaacson's Steve Jobs acknowledges the bonus was on the table from the beginning but avoids specifying what it was. The reader can easily assume it was a couple hundred extra dollars, rather than that Steve Jobs fleeced his best friend out of thousands of dollars.

    Jobs is an interesting case because his life story seems to be filled with these sorts of situations, places where, depending on how you arrange the facts, things can look bad or very, very bad. For example, I knew that Jobs needed a liver transplant towards the end of his life. I remember the ethics debate it kicked off at the time, when Jobs's name appeared at the top of the list in record time. What I didn't know until this book was that nine months earlier he decided to forgo surgery that would have cured his pancreatic cancer in favor of acupuncture and other holistic treatments. Suddenly the idea that someone else died on that transplant list so that Steve Jobs could live two more years felt very different to me.

    And yet one cannot deny the significance of Steve Jobs’s contribution to society either. The man is a challenge to curate. So much about him is selfish, abusive, and cruel. But is the lesson we should learn from his life to be selfish, abusive and cruel?

    While Fisher’s tries his best to maintain the rhetoric of Jobs as a genius — describing him at one point as the native son who best epitomized the Valley’s nature — quotes from Jobs’s contemporaries are constantly driving home the point that his success was all smoke and mirrors. Steve Jobs wasn’t a genius. He wasn’t even very smart.

    The Apple I and II? Mostly the work of Wozniak built at Atari with Atari parts. The Macintosh? An existing project founded by other Apple engineers that Jobs took over when an internal coup kicked him off the Lisa project. And although the 1984 marketing campaign was groundbreaking, the actual product wasn’t successful until after Steve Jobs was forced out of the company. The iPod? The work of Tony Fadell. The iPhone? Based heavily on the work of General Magic. Valley of Genius quotes luminary after luminary on this point: everything that Steve Jobs tried to build himself was a failure.

    6
    Marc Porat’s original sketches for General Magic

    Still… the man died worth billions of dollars, having founded and led a company now worth a trillion dollars. Is that not success? Should people not use his life as an example?

    The Archipelago of Innovation

    When people like Steve Jobs rise to prominence, we're encouraged to overlook distasteful elements of their personalities. We construct narratives around how they accomplished what they accomplished that filter out all that troubling stuff, so that we can learn from the genius without the burden of ethical dilemmas. Steve Jobs wasn't an abusive boss who often sabotaged and derailed projects; he was a man with a vision and the guts to stick with it when no one else could see it. He didn't steal all the products people associate with him from other companies; taking something to market is the same as inventing it.

    7
    Xerox Alto (left) Mac’s first GUI (top-right) LISA’s GUI (bottom-right)


    What I found most interesting about Fisher’s Steve Jobs is the one characteristic that I’ve never heard anyone attribute to him: his cognitive flexibility.

    Look, Steve Jobs stole. He straight up stole. He didn't just build on the innovations of others, he often passed them off as his own work, even when it was clear that couldn't possibly be true. And while most people like to either gloss over that or romanticize it, I think it's worth asking: how did he know what to steal?

    One of the companies Fisher traces the life cycle of in Valley of Genius is Pixar. Toward the end of that chapter Steve Jobs, who had invested $10 million when the company spun off from Lucasfilm, is unpleasantly surprised to find that what he thought would become a technology company is in fact an animation studio. But he finds this out because of the buzz around Pixar’s first feature length film: Toy Story. He subsequently takes over leadership of the company and begins pushing the narrative that he was one of the original founders who had always been in charge.

    8
    The computer that Jobs thought he was investing in.

    I don’t think Steve Jobs’s defining characteristic was vision or determination or arrogance or even empathy for users. I think it was cognitive flexibility, the ability to abandon his own assumptions about computers, technology and business — even the assumptions that made him billions of dollars — and adopt new models as he encountered them. Computers needed GUIs and mice. Pixar wasn’t a computer company. An open platform for iPhone development. The remarkable thing is that Steve Jobs encountered the same ideas that dozens of his peers were also exposed to and he internalize them and embraced them as his own. In Fisher’s history of Silicon Valley we see these shifts play out over time. We see the common sense form after the winners and losers are declared then we see Steve Jobs realize how things are about to change and defy that common sense before anyone else does.

    The problem with calling that vision is that it actually encourages people to do the exact opposite of what they should. The Great Man Theory of Steve Jobs assumes that Steve Jobs could predict the future of technology all on his own. That he found the technology to steal because he knew what he was looking for. Instead, what I think we should learn from Steve Jobs is to be resourceful and opportunistic. Don't rely purely on your own knowledge and ability, because your knowledge and ability are not enough.

    It’s mentioned a few times in Fisher’s book: Steve Jobs didn’t really know anything about computers. He didn’t understand most of the engineering behind the things he stole. Far from a great mind asserting his will on the universe, his potential to innovate grew as he drew from a larger and larger pool of others.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 9:09 am on June 14, 2018 Permalink | Reply
    Tags: , Medium, The Start Up   

    From The Start Up: “The Hidden Universe” 

    From The Start Up

    Dark matter and dark energy explained.

    1
    Photo by Brett Ritchie on Unsplash

    Dark matter and dark energy make up the vast majority of our universe, and yet we can’t really perceive them.

    However, we have ways to detect their influence, and that influence is as wide-reaching as dark matter and dark energy are.

    All matter as we know it, from muons, electrons and atoms all the way up to planets, stars and galactic clusters, makes up less than a meager 5% of everything in the universe.

    As far as we know, roughly 25% is dark matter and 70% is dark energy. But the thing about those two massive chunks of everything is that they are invisible. Basically, all that we perceive is only a very small part of reality.

    To top it all off, we don’t REALLY understand what dark matter and dark energy are.

    So where do we even get the idea of those two huge bits of craziness from?

    When astrophysicists did the math to figure out how the universe was structured, it didn't hold up. There just wasn't enough visible matter in the universe to make it work. All of the gravity generated by the matter we can see, taken together, is not enough to form the massive and complex structures we see throughout the universe: without SOMETHING else to hold it all together, it looked like stars should just be scattered willy-nilly instead of ever clustering together to make galaxies.

    That something has to be woven all around the visible matter. But it can’t be seen, never emitting or reflecting light…a.k.a. dark matter.

    However, we can tell dark matter is there even if we can’t actually see it with telescopes. It does have a gravitational effect on space-time. Light passing around these dark matter mega-clumps is actually bent.
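    For a sense of the scale of that bending, general relativity predicts a deflection angle of roughly 4GM/(c²b) for light passing a compact mass M at a closest distance b. A small illustrative sketch (the masses and impact parameters below are my own assumed examples, not figures from the article):

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # mass of the sun, kg
MPC = 3.086e22      # one megaparsec in metres
ARCSEC_PER_RAD = 206_265

def deflection_arcsec(mass_kg: float, impact_parameter_m: float) -> float:
    """Light-bending angle for a ray passing a compact mass: alpha = 4GM / (c^2 b)."""
    return 4 * G * mass_kg / (C**2 * impact_parameter_m) * ARCSEC_PER_RAD

# Light grazing the edge of the sun: the classic ~1.75 arcsecond deflection
print(f"Sun: {deflection_arcsec(M_SUN, 6.96e8):.2f} arcsec")

# Light passing 1 Mpc from a cluster-sized clump of (mostly dark) matter
print(f"1e15 solar-mass clump: {deflection_arcsec(1e15 * M_SUN, MPC):.0f} arcsec")
```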

    At the same time, though, dark matter cannot be normal matter, because we WOULD be able to detect some kind of emissions if that were the case. It also doesn’t even react with normal matter. Antimatter makes gamma rays when it reacts with normal matter. Dark matter apparently does nothing.

    But it’s THERE.

    The most accepted possibility now is that dark matter is some kind of exotic particle that we simply don’t know anything about. Some possibilities for these particles could be GIMPs (gravitationally-interacting massive particles) and WIMPs (weakly-interacting massive particles). Perhaps it might be created in a particle accelerator experiment at some point, though we may not recognize it for what it is.

    And then we have dark energy, which is, believe it or not, even stranger.

    Just as with dark matter, we can't truly detect or examine it, but we can see how dark energy affects the universe around it. This started with Edwin Hubble in 1929, when he saw the red-shifting of light wavelengths as he looked deeper and deeper into space: distant (and therefore faint) galaxies showed a lot of red-shift, while galaxies that were more visible and closer red-shifted much less. This led to our understanding that the universe has always been expanding, which would stretch the wavelength of light as everything moved farther and farther apart.
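    The pattern Hubble spotted is summarized by Hubble's law, v ≈ H0 × d: the farther away a galaxy is, the faster it recedes and the more its light is red-shifted. A quick sketch (the value of H0 and the example distances are assumed round numbers, not from the article):

```python
H0 = 70.0            # Hubble constant, km/s per megaparsec (assumed; measured values ~67-74)
C_KMS = 299_792.458  # speed of light, km/s

def recession_velocity_kms(distance_mpc: float) -> float:
    """Hubble's law: recession velocity grows linearly with distance."""
    return H0 * distance_mpc

for d_mpc in (10, 100, 1000):
    v = recession_velocity_kms(d_mpc)
    z = v / C_KMS  # redshift, in the low-velocity approximation
    print(f"{d_mpc:5d} Mpc -> v ~ {v:8,.0f} km/s, redshift z ~ {z:.3f}")
```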

    Since then, research has demonstrated that this expansion has been accelerating. This throws out the idea that we will see an eventual contraction leading to a "big crunch". Currently, the belief is that space will just continue expanding until matter is basically stretched so thin that everything comes to a total standstill, even atomic movement — the "heat death of the universe" theory. Don't let that get you down, though. We're talking billions of years into the future.

    3
    http://hubblesite.org/newscenter/archive/releases/2001/09/image/g/

    So, as expansion goes on, the universe and all of space is essentially gaining in volume. Because we're held together in a nice tidy bundle of stars called the Milky Way galaxy, the expansion doesn't affect us directly, but we see other galaxies red-shifting away from us just as, from their vantage point, we are red-shifting away from them; the expansion has no single center.

    Whatever is propelling this expansion is what we call dark energy. It represents more energy than everything else taken together: all the stars, black holes, gas giants, quasars and everything else combined doesn’t come close to the sheer power dark energy represents.

    As to what dark energy is, again, we can’t be certain. It may be some intrinsic property of space. As the universe expands, more space is created, thus pushing expansion further and further with ever-increasing acceleration.

    Einstein’s “cosmological constant” concept from 1917 was similar. He proposed it as a force that countered gravity, but the mathematics didn’t produce any viable results and just confused things more. This is also known as “vacuum energy”, as in empty space is a vacuum.

    Another possibility is that spontaneously forming particles continuously pop up and disappear as new space forms, which in turn generates the dark energy. This continues on in a never-ending loop which keeps the expansion going.

    Ultimately, dark matter and dark energy are key components of the universe’s make-up that are currently beyond our technological ability to understand. However, unlike the mystery of what came before our universe’s creation, dark matter and energy actually exist in the here and now, and as we advance in our scientific understanding there is a great chance that we will some day grasp their full origins and purpose.

    For now, just keep looking up into the night sky and know that there are so many wonders still out there for us to find and comprehend. As long as we dream of the stars, we will discover more about them.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Dark Matter Research

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     