From ars technica: “The 2010s: Decade of the exoplanet”

John Timmer

Artist conception of Kepler-186f, the first Earth-size exoplanet found in a star’s “habitable zone.”

The Belgian-operated TRAPPIST robotic telescope at ESO's La Silla Observatory, Chile.

A size comparison of the planets of the TRAPPIST-1 system, lined up in order of increasing distance from their host star. The planetary surfaces are portrayed with an artist’s impression of their potential surface features, including water, ice, and atmospheres. NASA

Alpha Centauri A and B with Proxima Centauri, 27 February 2012. Skatebiker.

The last ten years will arguably be seen as the “decade of the exoplanet.” That might seem like an obvious thing to say, given that the discovery of the first exoplanet was honored with a Nobel Prize this year. But that discovery happened back in 1995—so what made the 2010s so pivotal?

One key event: 2009’s launch of the Kepler planet-hunting probe.

NASA's Kepler telescope, later operated as the K2 mission, ran from March 7, 2009 until November 15, 2018.

Kepler spawned a completely new scientific discipline, one that has moved from basic discovery—there are exoplanets!—to inferring exoplanetary composition, figuring out exoplanetary atmospheres, and pondering what exoplanets might tell us about prospects for life outside our Solar System.

To get a sense of how this happened, we talked to someone who was in the field when the decade started: Andrew Szentgyorgyi, currently at the Harvard-Smithsonian Center for Astrophysics, where he’s the principal investigator on the Giant Magellan Telescope’s Large Earth Finder instrument.

The Giant Magellan Telescope, a 21-meter instrument to be built at the Carnegie Institution for Science's Las Campanas Observatory, some 115 km (71 mi) north-northeast of La Serena, Chile, at over 2,500 m (8,200 ft) elevation.

In addition to being famous for having taught your author his “intro to physics” course, Szentgyorgyi was working on a similar instrument when the first exoplanet was discovered.

Two ways to find a planet

The Nobel-winning discovery of 51 Pegasi b came via the “radial velocity” method, which relies on the fact that a planet exerts a gravitational influence on its host star, causing the star to accelerate slightly toward the planet.

The radial velocity method. Las Cumbres Observatory.

Radial velocity curve. Image via SuperWASP.

Unless the planet’s orbit is oriented so that it’s perpendicular to the line of sight between Earth and the star, some of that acceleration will draw the star either closer to or farther from Earth. This acceleration can be detected via a blue or red shift in the star’s light, respectively.
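To get a feel for how small these shifts are, here is an order-of-magnitude sketch (the numbers are illustrative, not from the article): the non-relativistic Doppler relation says the fractional wavelength shift equals the radial velocity divided by the speed of light.

```python
# Order-of-magnitude sketch: how tiny the Doppler shift from a
# planet-induced stellar wobble is. delta_lambda = lambda * v / c.
C = 299_792_458.0  # speed of light, m/s

def doppler_shift_nm(rest_wavelength_nm, radial_velocity_ms):
    """Non-relativistic Doppler shift in nanometers."""
    return rest_wavelength_nm * radial_velocity_ms / C

# 51 Pegasi b induces a stellar wobble of roughly 55 m/s.
shift = doppler_shift_nm(500.0, 55.0)
print(f"Shift at 500 nm: {shift:.2e} nm")  # ~9e-5 nm
```

A shift of roughly a ten-thousandth of a nanometer is why radial velocity work demands such specialized, ultra-stable spectrographs.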

The surfaces of stars can expand and contract, which also produces red and blue shifts, but those shifts won't have the regularity of the acceleration produced by an orbiting body. This also explains why, back in the 1990s, people studying the surface changes in stars were already building the hardware necessary to study radial velocity.

“We had a group that was building instruments that I’ve worked with to study the pulsations of stars—astroseismology,” Szentgyorgyi told Ars, “but that turns out to be sort of the same instrumentation you would use” to discern exoplanets.

He called the discovery of 51 Pegasi b a “seismic event” and said that he and his collaborators began thinking about how to use their instruments “probably when I got the copy of Nature” that the discovery was published in. Because some researchers already had the right equipment, a steady if small flow of exoplanet announcements followed.

During this time, researchers developed an alternate way to find exoplanets, termed the “transit method.”

Planet transit. NASA/Ames

The transit method requires a more limited geometry from an exoplanet’s orbit: the plane has to cause the exoplanet to pass through the line of sight between its host star and Earth. During these transits, the planet will eclipse a small fraction of light from the host star, causing a dip in its brightness. This doesn’t require the specialized equipment needed for radial velocity detections, but it does require a telescope that can detect small brightness differences despite the flicker caused by the light passing through our atmosphere.
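The size of that brightness dip follows from simple geometry: the planet blocks a fraction of the stellar disk equal to the square of the planet-to-star radius ratio. A quick sketch (values are standard planetary radii, used here for illustration):

```python
# Transit depth: fraction of starlight blocked is (R_planet / R_star)^2.
def transit_depth(r_planet_km, r_star_km):
    return (r_planet_km / r_star_km) ** 2

R_SUN = 696_000.0      # km
R_JUPITER = 71_492.0   # km
R_EARTH = 6_371.0      # km

print(f"Jupiter transiting the Sun: {transit_depth(R_JUPITER, R_SUN):.4%}")  # ~1%
print(f"Earth transiting the Sun:   {transit_depth(R_EARTH, R_SUN):.4%}")    # ~0.008%
```

A Jupiter-sized planet dims a Sun-like star by about one percent, while an Earth-sized one dims it by less than a hundredth of a percent, which is why atmospheric flicker matters so much for ground-based transit searches.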

By 2009, transit detections were adding regularly to the growing list of exoplanets.

The tsunami

Within its first year of operation, Kepler started finding new planets. Given time and a better understanding of how to use the instrument, the early years of the 2010s saw thousands of new planets cataloged. In 2009, Szentgyorgyi said, “it was still ‘you’re finding handfuls of exoplanetary systems.’ And then with the launch of Kepler, there’s this tsunami of results which has transformed the field.”

Suddenly, rather than dozens of exoplanets, we knew about thousands.

The tsunami of Kepler planet discoveries.

The sheer numbers involved had a profound effect on our understanding of planet formation. Rather than simply having a single example to test our models against—our own Solar System—we suddenly had many systems to examine (containing over 4,000 currently known exoplanets). These include objects that don’t exist in our Solar System, things like hot Jupiters, super-Earths, warm Neptunes, and more. “You found all these crazy things that, you know, don’t make any sense from the context of what we knew about the Solar System,” Szentgyorgyi told Ars.

It’s one thing to have models of planet formation that say some of these planets can form; it’s quite another to know that hundreds of them actually exist. And, in the case of hot Jupiters, it suggests that many exosolar systems are dynamic, shuffling planets to places where they can’t form and, in some cases, can’t survive indefinitely.

But Kepler gave us more than new exoplanets; it provided a different kind of data. Radial velocity measurements only tell you how much the star is moving, but that motion could be caused by a relatively small planet with an orbital plane aligned with the line of sight from Earth. Or it could be caused by a massive planet with an orbit that’s highly inclined from that line of sight. Physics dictates that, from our perspective, these will produce the same acceleration of the star. Kepler helped us sort out the differences.

A massive planet orbiting at a steep angle (left) and a small one orbiting at a shallow one will both produce the same motion of a star relative to Earth.

“Kepler not only found thousands and thousands of exoplanets, but it found them where we know the geometry,” Szentgyorgyi told Ars. “If you know the geometry—if you know the planet transits—you know your orbital inclination is in the plane you’re looking.” This allows follow-on observations using radial velocity to provide a more definitive mass of the exoplanet. Kepler also gave us the radius of each exoplanet.

“Once you know the mass and radius, you can infer the density,” Szentgyorgyi said. “There’s a remarkable amount of science you can do with that. It doesn’t seem like a lot, but it’s really huge.”
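The arithmetic behind that inference is straightforward: a transit radius plus a radial-velocity mass give a bulk density. A minimal sketch (constants are standard Earth values, used for illustration):

```python
# Bulk density from a measured mass and radius:
# density = mass / (4/3 * pi * r^3), converted to g/cm^3.
import math

def bulk_density_g_cc(mass_kg, radius_m):
    volume_m3 = (4.0 / 3.0) * math.pi * radius_m ** 3
    return (mass_kg / volume_m3) / 1000.0  # kg/m^3 -> g/cm^3

M_EARTH, R_EARTH = 5.972e24, 6.371e6  # kg, m
print(f"Earth: {bulk_density_g_cc(M_EARTH, R_EARTH):.2f} g/cm^3")  # ~5.51
```

Rocky worlds like Earth come out around 5 g/cm^3, while gas- and water-rich worlds sit far lower, which is what makes this single number so informative.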

Density can tell us if a planet is rocky or watery—or whether it’s likely to have a large atmosphere or a small one. Sometimes, it can be tough to tell two possibilities apart; density consistent with a watery world could also be provided by a rocky core and a large atmosphere. But some combinations are either physically implausible or not consistent with planetary formation models, so knowing the density gives us good insight into the planetary type.

Beyond Kepler

Despite NASA’s heroic efforts, which kept Kepler going even after its hardware started to fail, its tsunami of discoveries slowed considerably before the decade was over. By that point, however, it had more than done its job. We had a new catalog of thousands of confirmed exoplanets, along with a new picture of our galaxy.

For instance, binary star systems are common in the Milky Way; we now know that their complicated gravitational environment isn’t a barrier to planet formation.

We also know that the most common type of star is the low-mass red dwarf. It was previously possible to think that a star's low mass would be matched by a low-mass planet-forming disk, preventing the formation of large planets or of large families of smaller planets. Neither turned out to be true.

“We’ve moved into a mode where we can actually say interesting, global, statistical things about exoplanets,” Szentgyorgyi told Ars. “Most exoplanets are small—they’re sort of Earth to sub-Neptune size. It would seem that probably most of the solar-type stars have exoplanets.” And, perhaps most important, there’s a lot of them. “The ubiquity of exoplanets certainly is a stunner… they’re just everywhere,” Szentgyorgyi added.

That ubiquity has provided the field with two things. First, it has given scientists the confidence to build new equipment, knowing that there are going to be planets to study. The most prominent piece of gear is NASA’s Transiting Exoplanet Survey Satellite, a space-based telescope designed to perform an all-sky exoplanet survey using methods similar to Kepler’s.

NASA/MIT's TESS, which replaced Kepler in the search for exoplanets.

But other projects are smaller, focused on finding exoplanets closer to Earth. If exoplanets are everywhere, they're also likely to be orbiting stars close enough that we can do detailed studies, including characterizing their atmospheres. One famous success in this area came courtesy of the TRAPPIST telescopes [above], which spotted a system hosting at least seven planets. More data should be coming soon, too; on December 17, the European Space Agency launched the first satellite dedicated to studying known exoplanets.


With future telescopes and associated hardware similar to what Szentgyorgyi is working on, we should be able to characterize the atmospheres of planets out to about 30 light years from Earth. One catch: this method requires that the planet passes in front of its host star from Earth’s point of view.

When an exoplanet transits in front of its star, most of the light that reaches Earth comes directly to us from the star. But a small percentage passes through the atmosphere of the exoplanet, allowing it to interact with the gases there. The molecules that make up the atmosphere can absorb light of specific wavelengths—essentially causing them to drop out of the light that makes its way to Earth. Thus, the spectrum of the light that we can see using a telescope can contain the signatures of various gases in the exoplanet’s atmosphere.
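A standard back-of-the-envelope estimate (the formula and all parameter values below are my illustrative assumptions, not from the article) puts the size of this transmission signal at roughly 2 × R_planet × H / R_star², where H is the atmospheric scale height, kT divided by the mean molecular mass times surface gravity:

```python
# Rough scale of a transmission-spectroscopy signal:
# signal ~ 2 * R_p * H / R_star^2, with scale height H = k*T / (mu * g).
K_B = 1.380649e-23    # Boltzmann constant, J/K
M_H = 1.6735575e-27   # hydrogen atom mass, kg

def transmission_signal(r_planet_m, r_star_m, temp_k, mu_amu, gravity_ms2):
    scale_height_m = K_B * temp_k / (mu_amu * M_H * gravity_ms2)
    return 2.0 * r_planet_m * scale_height_m / r_star_m ** 2

R_SUN = 6.96e8  # m
# A hot Jupiter: hydrogen-dominated atmosphere (mu ~ 2.3), ~1300 K, g ~ 25 m/s^2
hot_jup = transmission_signal(7.15e7, R_SUN, 1300.0, 2.3, 25.0)
# Earth: N2/O2 atmosphere (mu ~ 29), ~288 K, g = 9.8 m/s^2
earth = transmission_signal(6.371e6, R_SUN, 288.0, 29.0, 9.8)
print(f"Hot Jupiter: ~{hot_jup*1e6:.0f} ppm, Earth: ~{earth*1e6:.2f} ppm")
```

Tens of parts per million for a puffy hot Jupiter versus a fraction of a part per million for an Earth analog: this is why hot Jupiters were the first atmospheres characterized, and why Earth-like targets remain so demanding.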

There are some important caveats to this method, though. Since the fraction of light that passes through the exoplanet atmosphere is small compared to that which comes directly to us from the star, we have to image multiple transits for the signal to stand out. And the host star has to have a steady output at the wavelengths we’re examining in order to keep its own variability from swamping the exoplanetary signal. Finally, gases in the exoplanet’s atmosphere are constantly in motion, which can make their signals challenging to interpret. (Clouds can also complicate matters.) Still, the approach has been used successfully on a number of exoplanets now.

In the air

Understanding atmospheric composition can tell us critical things about an exoplanet. Much of the news about exoplanet discoveries has been driven by what’s called the “habitable zone.” That zone is defined as the orbital region around a star where the amount of light reaching a planet’s surface is sufficient to keep water liquid. Get too close to the star and there’s enough energy reaching the planet to vaporize the water; get too far away and the energy is insufficient to keep water liquid.
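The simplest version of this calculation just scales the Sun's habitable zone with stellar brightness: since the flux a planet receives goes as L/d², the zone's boundaries move in proportion to the square root of the star's luminosity. A first-order sketch (the solar boundary values here are illustrative assumptions):

```python
# First-order habitable zone: flux ~ L / d^2, so boundaries scale as sqrt(L).
import math

# Rough inner/outer edges for the Sun, in astronomical units (illustrative).
HZ_INNER_SUN_AU, HZ_OUTER_SUN_AU = 0.95, 1.67

def habitable_zone_au(luminosity_lsun):
    scale = math.sqrt(luminosity_lsun)
    return HZ_INNER_SUN_AU * scale, HZ_OUTER_SUN_AU * scale

# A red dwarf at ~0.05 L_sun has its habitable zone pulled far inward.
inner, outer = habitable_zone_au(0.05)
print(f"0.05 L_sun: {inner:.2f} to {outer:.2f} AU")  # ~0.21 to ~0.37 AU
```

For dim red dwarfs, the habitable zone sits closer to the star than Mercury does to the Sun, which is exactly where systems like TRAPPIST-1 keep their planets.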

These limits, however, assume an atmosphere that’s effectively transparent at all wavelengths. As we’ve seen in the Solar System, greenhouse gases can play an outsized role in altering the properties of planets like Venus, Earth, and Mars. At the right distance from a star, greenhouse gases can make the difference between a frozen rock and a Venus-like oven. The presence of clouds can also alter a planet’s temperature and can sometimes be identified by imaging the atmosphere. Finally, the reflectivity of a planet’s surface might also influence its temperature.

The net result is that we don’t know whether any of the planets in a star’s “habitable zone” are actually habitable. But understanding the atmosphere can give us good probabilities, at least.

The atmosphere can also open a window into the planet’s chemistry and history. On Venus, for example, the huge levels of carbon dioxide and the presence of sulfur dioxide clouds indicate that the planet has an oxidizing environment and that its atmosphere is dominated by volcanic activity. The composition of the gas giants in the outer Solar System likely reflects the gas that was present in the disk that formed the planets early in the Solar System’s history.

But the most intriguing prospect is that we could find something like Earth, where biological processes produce both methane and the oxygen that ultimately converts it to carbon dioxide. The presence of both in an atmosphere indicates that some process(es) are constantly producing the gases, maintaining a long-term balance. While some geological phenomena can produce both these chemicals, finding them together in an atmosphere would at least be suggestive of possible life.


Just the prospect of finding hints of life on other worlds has rapidly transformed the study of exoplanets, since it’s a problem that touches on nearly every area of science. Take the issue of atmospheres and habitability. Even if we understand the composition of a planet’s atmosphere, its temperature won’t just pop out of a simple equation. Distance from the star, type of star, the planet’s rotation, and the circulation of the atmosphere will all play a role in determining conditions. But the climate models that we use to simulate Earth’s atmosphere haven’t been capable of handling anything but the Sun and an Earth-like atmosphere. So extensive work has had to be done to modify them to work with the conditions found elsewhere.

Similar problems appear everywhere. Geologists and geochemists have to infer likely compositions given little more than a planet’s density and perhaps its atmospheric compositions. Their results need to be combined with atmospheric models to figure out what the surface chemistry of a planet might be. Biologists and biochemists can then take that chemistry and figure out what reactions might be possible there. Meanwhile, the planetary scientists who study our own Solar System can provide insight into how those processes have worked out here.

“I think it’s part of the Renaissance aspect of exoplanets,” Szentgyorgyi told Ars. “A lot of people now think a lot more broadly, there’s a lot more cross-disciplinary interaction. I find that I’m going to talks about geology, I’m going to talks about the atmospheric chemistry on Titan.”

The next decade promises incredible progress. A new generation of enormous telescopes is expected to come online, and the James Webb space telescope should devote significant time to imaging exosolar systems.

NASA/ESA/CSA James Webb Space Telescope, annotated.

Other giant 30-meter-class telescopes are planned:

ESO's E-ELT, a 39-meter telescope to be built atop Cerro Armazones in the Atacama Desert of northern Chile, at an altitude of 3,060 meters (10,040 ft).

TMT, the Thirty Meter Telescope, proposed and now approved for Mauna Kea, Hawaii, USA, at 4,207 m (13,802 ft) above sea level; the only giant 30-meter-class telescope planned for the Northern Hemisphere.


We’re likely to end up with much more detailed pictures of some intriguing bodies in our galactic neighborhood.

The data that will flow from new experiments and new devices will be interpreted by scientists who have already transformed their field. That transformation—from proving that exoplanets exist to establishing a vibrant, multidisciplinary discipline—really took place during the 2010s, which is why it deserves the title “decade of exoplanets.”

See the full article here.


Please help promote STEM in your local schools.

Stem Education Coalition

Ars Technica was founded in 1998 when Founder & Editor-in-Chief Ken Fisher announced his plans for starting a publication devoted to technology that would cater to what he called “alpha geeks”: technologists and IT professionals. Ken’s vision was to build a publication with a simple editorial mission: be “technically savvy, up-to-date, and more fun” than what was currently popular in the space. In the ensuing years, with formidable contributions by a unique editorial staff, Ars Technica became a trusted source for technology news, tech policy analysis, breakdowns of the latest scientific advancements, gadget reviews, software, hardware, and nearly everything else found in between layers of silicon.

Ars Technica innovates by listening to its core readership. Readers have come to demand devotedness to accuracy and integrity, flanked by a willingness to leave each day’s meaningless, click-bait fodder by the wayside. The result is something unique: the unparalleled marriage of breadth and depth in technology journalism. By 2001, Ars Technica was regularly producing news reports, op-eds, and the like, but the company stood out from the competition by regularly providing long thought-pieces and in-depth explainers.

And thanks to its readership, Ars Technica also accomplished a number of industry-leading moves. In 2001, Ars launched a digital subscription service when such things were non-existent for digital media. Ars was also the first IT publication to begin covering the resurgence of Apple, and the first to draw analytical and cultural ties between the world of high technology and gaming. Ars was also first to begin selling its long-form content in digitally distributable forms, such as PDFs and eventually eBooks (again, starting in 2001).