From Physics: “Unlocking the Hidden Information in Starlight”

August 29, 2016
Gabriel Durkin, Berkeley Quantum Information and Computation Center, University of California, Berkeley

Quantum metrology shows that it is always possible to estimate the separation of two stars, no matter how close together they are.

Figure 1: The Rayleigh criterion states that in direct imaging, two light sources are only discernible when the centers of their diffraction patterns, or the peaks of their point spread functions, are farther apart than their widths. (Top) The sources are farther apart than the Rayleigh criterion distance. (Middle) The sources are exactly at the Rayleigh criterion distance. (Bottom) The sources are closer than the Rayleigh criterion distance. Tsang and collaborators [1] used quantum metrology techniques to show that the Rayleigh criterion is not a fundamental limitation, finding that the separation between two objects can always be estimated with a precision that is independent of the size of the separation.

A provocative new result [1] by Mankei Tsang, Ranjith Nair, and Xiao-Ming Lu of the National University of Singapore suggests that the Rayleigh criterion, a long-standing limitation to the precision of astronomical imaging proposed in 1879 [2], is itself only an apparition. Using quantum metrology techniques, the researchers have shown that two uncorrelated point-like light sources, such as stars, can be discriminated to arbitrary precision even as their separation decreases to zero.

Quantum metrology, a field that originated in the late 1960s with the pioneering work of Carl Helstrom [3], is a peculiar hybrid of quantum mechanics and the classical estimation theory developed by statisticians in the 1940s. The methodology is powerful, quantifying the resources needed for optimal estimation of elementary variables and fundamental constants. These resources include the preparation of quantum systems in a characteristic (entangled) state, followed by judiciously chosen measurements, from which a desired parameter, itself not directly measurable, may be inferred.

In the context of remote sensing, for example, in the imaging of objects in the night sky, the ability to prepare a physical system in an optimal state does not exist. In the case of starlight, the typical assumption is that the source is classical thermal light, the state of maximum entropy or “uninformativeness.” Imaging such sources is plagued by the limits of diffraction when the objects are in close proximity. The wave-like nature of light causes it to spread as it moves through space, bending around obstacles, for example, when traversing a telescope aperture. This results in a diffraction pattern described by a so-called point spread function (PSF) in the image plane. The Rayleigh criterion states that two closely spaced objects are just resolvable—that is, discernible from one another—when the center of the diffraction pattern, or peak of the PSF, of one object is directly over the first minimum of the diffraction pattern of the other. Roughly, the PSF maxima must be farther apart than their widths (Fig. 1).
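
As a rough numerical illustration (my own sketch, not taken from Refs. [1, 2]), the snippet below approximates each PSF by a Gaussian of width sigma and reports how deep the intensity dip between two equal sources is at their midpoint. The function names and the Gaussian stand-in for a real diffraction pattern are choices made here for illustration.

```python
import numpy as np

def psf(x, x0, sigma=1.0):
    """Gaussian stand-in for a point spread function centered at x0."""
    return np.exp(-(x - x0) ** 2 / (2 * sigma ** 2))

def midpoint_dip(d, sigma=1.0):
    """Fractional intensity dip at the midpoint between two equal
    sources separated by d (0 means the peaks have merged into one)."""
    x = np.linspace(-6, 6, 2001)          # symmetric grid; x = 0 at center
    total = psf(x, -d / 2, sigma) + psf(x, d / 2, sigma)
    return 1.0 - total[len(x) // 2] / total.max()

for d in (3.0, 2.5, 2.0, 1.0):            # separations in units of sigma
    print(f"d = {d:3.1f} sigma -> midpoint dip = {midpoint_dip(d):.3f}")
# For Gaussian profiles the two maxima merge into a single blob at
# roughly d = 2 sigma; below that, direct imaging shows no dip at all.
```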

Some astronomers say they are able to resolve objects that are slightly closer than the Rayleigh limit allows. Yet inevitably, as the angular separation between the objects decreases, the information that can be obtained about that separation using direct detection becomes negligible, and even the most optimistic astronomer, utilizing the most sophisticated signal-processing techniques, must admit defeat. Correspondingly, as the separation approaches zero, the minimum error on any unbiased estimate of the separation blows up to infinity; this divergence is what has limited angular resolution in imaging since the time of Galileo. Typically, the mean-squared error on the estimation of a parameter scales with the number of repeated measurements or data points, ν, as 1/ν. Even for a large error per measurement, any desired precision can be attained by taking multiple data points. When, however, the lower bound on direct estimation of the separation diverges because of the Rayleigh limit, the 1/ν factor makes no impact. This is what Tsang and collaborators call Rayleigh’s curse.
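
The curse can be made concrete with a short calculation. The sketch below is my own illustration, assuming a Gaussian PSF of width SIGMA: it computes the per-photon classical Fisher information of direct imaging, F = ∫ (∂p/∂d)²/p dx, where p(x; d) is the image-plane photon density for two sources separated by d. The Cramér-Rao bound says the mean-squared error is at least 1/(νF), so as F collapses to zero with shrinking d, no number of photons ν can rescue the estimate.

```python
import numpy as np

SIGMA = 1.0  # PSF width (image-plane units)

def intensity(x, d):
    """Image-plane photon density for two equal, incoherent sources
    separated by d, each blurred by a Gaussian PSF of width SIGMA."""
    g = lambda u: np.exp(-u**2 / (2 * SIGMA**2)) / np.sqrt(2 * np.pi * SIGMA**2)
    return 0.5 * (g(x - d / 2) + g(x + d / 2))

def fisher_direct(d, eps=1e-6):
    """Per-photon classical Fisher information about d for direct
    imaging, F = integral of (dp/dd)^2 / p, via central differences."""
    x = np.linspace(-12, 12, 40001)
    dx = x[1] - x[0]
    p = intensity(x, d)
    dp = (intensity(x, d + eps) - intensity(x, d - eps)) / (2 * eps)
    return np.sum(dp**2 / np.maximum(p, 1e-300)) * dx

for d in (2.0, 1.0, 0.5, 0.1, 0.01):
    print(f"d = {d:5.2f} sigma -> direct-imaging Fisher info = {fisher_direct(d):.5f}")
# The information vanishes as d -> 0: Rayleigh's curse. The quantum
# limit derived in [1] stays constant at 1/(4 SIGMA^2) = 0.25.
```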

The initial achievement of their work has been to use the quantum metrology formalism, which minimizes the estimation error, to show that there is no fundamental obstacle to the estimation of the separation of two PSFs in one dimension (that is, for sources that sit on a line). As the separation of the two PSFs decreases to zero, the amount of obtainable information stays constant. This discovery is nicely summed up by Tsang, who says we should apologize to the starlight “as it travels billions of light years to reach us, yet our current technology and even our space telescopes turn out to be wasting a lot of the information it carries.” [4]

It could be suggested that this is merely a theoretical proof; the quantum metrology formalism indicates that there is always an optimal measurement, which minimizes the estimation error for the separation parameter. Paradoxically, however, this optimal measurement can depend on the value of the very parameter being estimated. To obviate such concerns, Tsang and his colleagues propose a strategy, based on state-of-the-art quantum optics technology, that produces a minimal error in the estimation of the separation variable—counterintuitively, this error remains constant for all separation values, under the assumption that the PSFs have a Gaussian shape. The method, which the authors call spatial mode demultiplexing (SPADE), splits the light from the two sources into optical waveguides that have a quadratic refractive-index lateral profile. Mathematically, this SPADE measurement decomposes the overlapping PSFs (a real function in one dimension) into the complete basis of Hermite functions, just as a Fourier transform decomposes a real function into a superposition of sine and cosine terms. A posteriori, one may be tempted to use intuition to explain why this Hermite-basis measurement does not suffer Rayleigh’s curse, but then again, were intuition forthcoming, the result might not have been hidden from view for so long. (This elusiveness relates to subtleties in the estimation of a single parameter extracted from the joint statistics of two incoherent light sources.)
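
For the Gaussian-PSF case the mode-counting statistics are simple enough to check numerically. The sketch below is an illustration rather than the authors’ code; it uses the result reported in Ref. [1] that, with the centroid known, the probability of a photon landing in the q-th Hermite-Gauss mode is Poissonian with mean Q = d²/(16σ²). The resulting Fisher information is the constant 1/(4σ²), whatever the separation d.

```python
import numpy as np
from scipy.stats import poisson

SIGMA = 1.0  # Gaussian PSF width

def spade_fisher(d, eps=1e-6, qmax=80):
    """Per-photon Fisher information about the separation d from
    photon counting in Hermite-Gauss modes (SPADE), assuming a
    Gaussian PSF and a known centroid. Mode occupation is Poissonian
    with mean Q = d^2 / (16 SIGMA^2), per Ref. [1]."""
    q = np.arange(qmax)
    probs = lambda s: poisson.pmf(q, s**2 / (16 * SIGMA**2))
    p = probs(d)
    dp = (probs(d + eps) - probs(d - eps)) / (2 * eps)
    return np.sum(dp**2 / np.maximum(p, 1e-300))

for d in (2.0, 1.0, 0.1, 0.001):
    print(f"d = {d:6.3f} sigma -> SPADE Fisher info = {spade_fisher(d):.5f}")
# Prints ~0.25 = 1/(4 SIGMA^2) for every separation: no Rayleigh curse.
```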

One minor caveat of the approach is that full imaging of two point sources at positions X1 and X2 requires estimation of two parameters: the separation X1 − X2 and the centroid (X1 + X2)/2. SPADE is only optimal when the centroid is already known to high precision. Centroid estimation, however, suffers no analog of Rayleigh’s curse; it may be made via direct imaging, with errors reduced by the usual 1/ν factor once the number of data points ν is much greater than 1.
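
That 1/ν behavior is easy to verify in simulation. The short sketch below uses hypothetical values for the PSF width, separation, and centroid, and estimates the centroid as the sample mean of the recorded photon positions:

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA, D, CENTROID = 1.0, 0.5, 0.3   # hypothetical ground-truth values

def sample_photons(nu):
    """Image-plane positions of nu photons from two equal incoherent
    sources at CENTROID +/- D/2, each blurred by a Gaussian PSF."""
    which = rng.integers(0, 2, size=nu)           # which source fired
    return rng.normal(CENTROID + (which - 0.5) * D, SIGMA)

for nu in (100, 10_000, 1_000_000):
    estimates = [sample_photons(nu).mean() for _ in range(100)]
    mse = np.mean((np.array(estimates) - CENTROID) ** 2)
    print(f"nu = {nu:>9,} -> centroid MSE = {mse:.2e}")
# The mean-squared error falls as 1/nu: no curse afflicts the centroid.
```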

A second detail worth pondering is that this work used techniques from the quantum domain to reveal a classical result. (All of the physical assumptions about starlight admit a classical model.) The quantum metrology formalism has been used to optimally estimate a parameter, but no quantum correlations exist in the system for any value of that parameter, that is, for any angular separation of the two stars. Even when no quantum correlations are present, the formalism still indicates the best possible measurement strategy and the smallest achievable estimation error.

An added blessing of quantum metrology is that it allows the development of generalized uncertainty relationships, for example between temperature and energy for a system at equilibrium [5], or photon number and path-length difference between the two arms of an interferometer. The result of Tsang and his colleagues can be presented as another type of generalized uncertainty, between source separation and “momentum.” The mean-squared error associated with separation estimation scales inversely with the momentum (Fourier) space variance of the overlapping PSFs.
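
Written schematically (the symbols here follow the description above rather than the paper’s exact notation), the bound reads:

```latex
% Separation-"momentum" uncertainty relation implied by the quantum
% Cramer-Rao bound of Ref. [1], in schematic notation: psi(x) is the
% PSF amplitude, nu the number of detected photons, and \Delta k^2
% the momentum-space (Fourier) variance of the PSF.
\[
  \mathrm{MSE}\bigl(\hat{d}\bigr) \;\ge\; \frac{1}{\nu\,\Delta k^{2}},
  \qquad
  \Delta k^{2} \;=\; \int \left|\frac{\partial \psi(x)}{\partial x}\right|^{2} dx
  \;-\; \left(\int \psi^{*}(x)\,\bigl(-i\,\partial_{x}\bigr)\,\psi(x)\,dx\right)^{2}.
\]
```

For a Gaussian PSF with amplitude proportional to exp(−x²/(4σ²)), the second term vanishes and Δk² = 1/(4σ²), reproducing the constant, separation-independent error attainable by SPADE.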

Regarding impact on the field, the authors’ study has produced a flurry of generalizations and other experimental proposals. During the past six months there have been four proof-of-principle experiments, the first in Singapore by Tsang’s colleague Alex Ling and collaborators [6], followed by others in Canada and Europe [7–9]. A subsequent theory paper from researchers at the University of York [10] extends Tsang and colleagues’ result, derived for incoherent thermal sources such as starlight, to any general quantum state shared between the two sources. This work exploits the roles of squeezing (of quantum fluctuations) and of quantum entanglement in improving measurement precision, extending applicability to domains in which control of the source light is possible, such as microscopy.

Tsang and his colleagues have provided a new perspective on the utility of quantum metrology, and they have reminded us that even in observational astronomy—one of the oldest branches of science—there are (sometimes) still new things to be learned, at the most basic level.

This research is published in Physical Review X.

See the full article here.

Please help promote STEM in your local schools.

Stem Education Coalition

Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments (physics@aps.org).
