Tagged: Quantum metrology

  • richardmitnick 12:23 pm on November 25, 2021 Permalink | Reply
    Tags: "Quantum Light for Cooler Sharper Imaging", A quantum-enhanced on-chip interference microscope, Enhancing imaging with entangled photons, Quantum metrology, The Q-MIC platform employs pairs of entangled photons as a sort of quantum light to enhance interference patterns during the imaging process.

    From Optics & Photonics : “Quantum Light for Cooler Sharper Imaging” 


    25 November 2021
    Sarah Michaud

    Left: A quantum-enhanced microscope developed and demonstrated by the Q-MIC consortium. Right: A standard protein sample in a silicon substrate was used to calibrate the quantum microscopy platform. [Image: The Institute of Photonic Sciences [Instituto de Ciencias Fotónicas] (ES)]

    Q-MIC, a European consortium for advancing research in quantum metrology, has demonstrated a quantum-enhanced on-chip interference microscope that improves imaging—without the potentially sample-damaging levels of radiation generated by light sources used in traditional microscopy [Science Advances].

    The Q-MIC platform employs pairs of entangled photons as a sort of quantum light to enhance interference patterns during the imaging process. Q-MIC researcher Alvaro Cuevas, of ICFO – The Institute of Photonic Sciences, Spain, said in a press release that by harnessing the quantum effect, “we are able to reduce the noise level and increase the sensitivity of the measurements by more than 25% when compared to classical measurements.”

    Potential applications of the quantum microscopy technology include biomedical diagnostics, materials science, and crystallography.

    Enhancing imaging with entangled photons

    In traditional microscopy, high light levels are needed to reveal structures in a sample. However, for organic, live, transparent or other sensitive materials, light can literally cook or bleach away important details. Lower illumination levels cause less damage but increase visual background noise, leading to blurry images.

    Q-MIC consortium researchers aimed to reduce sample photodamage and increase image clarity by using interference patterns of entangled photons to enhance microscopy images. Compared with photons from a traditional light source, the entangled photon pairs in the Q-MIC setup behave as if they had a shorter wavelength, which improves phase resolution and speeds up the measurement process.

    Setup of the Q-MIC quantum-enhanced microscope. SI, Sagnac interferometer; PBS, polarizing beam splitter; HWP, half-wave plate; L, lenses; P, polarizer; DM, dichroic mirror; M, mirror; ϕb, birefringent sample (SLM); ϕnb, nonbirefringent sample; SP, Savart plate; dPBS, lateral displacement polarizing beam splitter; BPF, band-pass filter. [Image: R. Camphausen et al., Sci. Adv., doi: 10.1126/sciadv.abj2155 (2021); CC BY-NC 4.0]

    Putting the pieces together

    Key components of the Q-MIC platform include a source of space-polarization hyperentangled N00N-state photons, a large field-of-view lens-free interferometric microscope (LIM), and a single-photon avalanche diode (SPAD) array camera. (N00N-state photons are photons in a particular many-body entangled state that has proved useful in quantum metrology.)

    In the setup, the hyperentangled N00N-state photon pairs are created by spontaneous parametric down-conversion in a Sagnac interferometer, and are then polarized and divided into vertical and horizontal paths by a Savart plate. The polarized photons next pass through a sample mounted on the LIM. The photon paths change depending on the structure of the sample. A second Savart plate with an opposite orientation to the first plate recombines the two photon paths.

    Phase differences in the photon paths caused by sample properties create an interference pattern when passed through a polarizer. The interference pattern is then captured by the SPAD array camera. After this process is repeated several times, a computer algorithm is used to combine the interference patterns to reconstruct a detailed image of the sample.
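    The article stops at "a computer algorithm," and the Q-MIC paper's actual reconstruction routine is not reproduced here. As a hedged sketch, the standard four-step phase-shifting formula (an assumed stand-in, not the consortium's method) shows how interference frames recorded at known reference-phase shifts can be combined into a phase image of the sample:

    ```python
    import numpy as np

    def reconstruct_phase(frames):
        # Standard four-step phase-shifting estimate. `frames` holds four
        # intensity images recorded at reference-phase shifts of
        # 0, pi/2, pi and 3*pi/2 (an illustrative choice, not necessarily
        # the shifts used by the Q-MIC platform).
        i0, i1, i2, i3 = frames
        return np.arctan2(i3 - i1, i0 - i2)

    # Synthetic sample: a 0.8-rad phase step, standing in for a thin specimen.
    x = np.linspace(-1, 1, 64)
    phi = 0.8 * (np.abs(x) < 0.3)

    shifts = (0, np.pi / 2, np.pi, 3 * np.pi / 2)
    frames = [1 + np.cos(phi + s) for s in shifts]   # ideal interference frames

    phi_est = reconstruct_phase(frames)
    print(np.allclose(phi_est, phi))   # True: the phase image is recovered
    ```

    Repeating such an acquisition over many low-photon-number frames and averaging is one common way to trade measurement time for noise.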

    The researchers used a standard protein sample in a silicon substrate to calibrate the quantum microscopy platform and confirmed that their new device created much smoother, more detailed images with fewer photons than traditional microscopy.

    Institutions involved in the EU-funded Q-MIC consortium include project coordinator ICFO, Spain; The Institute for Quantum Optics and Quantum Information [Institut für Quantenoptik und Quanteninformation] of the Austrian Academy of Sciences (AT); The Polytechnic University of Milan [Politecnico di Milano] (IT); The University of Glasgow (SCT); and The Institute for Applied Optics and Precision Engineering – Fraunhofer IOF (DE), with industry partners Carl Zeiss, Germany, and Micro Photon Devices, Italy.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Optics & Photonics News (OPN) is The Optical Society’s monthly news magazine. It provides in-depth coverage of recent developments in the field of optics and offers busy professionals the tools they need to succeed in the optics industry, as well as informative pieces on a variety of topics such as science and society, education, technology and business. OPN strives to make the various facets of this diverse field accessible to researchers, engineers, businesspeople and students. Contributors include scientists and journalists who specialize in the field of optics. We welcome your submissions.

  • richardmitnick 11:34 am on July 29, 2020 Permalink | Reply
    Tags: "‘Quantum negativity’ can power ultra-precise measurements", Metrology is the science of estimations and measurements., Quantum metrology

    From University of Cambridge: “‘Quantum negativity’ can power ultra-precise measurements” 


    29 Jul 2020
    Sarah Collins
    Communications team

    Artist’s impression of a quantum metrology device. Credit: Hugo Lepage

    The researchers, from the University of Cambridge, Harvard and MIT, have shown that quantum particles can carry an unlimited amount of information about things they have interacted with. The results, reported in the journal Nature Communications, could enable far more precise measurements and power new technologies, such as super-precise microscopes and quantum computers.

    Metrology is the science of estimations and measurements. If you weighed yourself this morning, you’ve done metrology. In the same way as quantum computing is expected to revolutionise the way complicated calculations are done, quantum metrology, using the strange behaviour of subatomic particles, may revolutionise the way we measure things.

    We are used to dealing with probabilities that range from 0% (never happens) to 100% (always happens). To explain results from the quantum world, however, the concept of probability needs to be expanded to include a so-called quasi-probability, which can be negative. This quasi-probability allows quantum concepts such as Einstein’s ‘spooky action at a distance’ and wave-particle duality to be explained in an intuitive mathematical language. For example, the probability of an atom being at a certain position and travelling with a specific speed might be a negative number, such as –5%.

    An experiment whose explanation requires negative probabilities is said to possess ‘quantum negativity.’ The scientists have now shown that this quantum negativity can help take more precise measurements.
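    Negativity is easy to exhibit numerically. The example below uses the Kirkwood-Dirac quasi-probability of a single qubit, a standard construction in this area (the choice of states here is illustrative, not the specific system of the paper): every entry plays the role of a probability and the entries sum to one, yet one of them is negative.

    ```python
    import numpy as np

    theta = 0.3
    i = np.array([1.0, 0.0])                        # initial qubit state |0>
    A = [np.array([1.0, 1.0]) / np.sqrt(2),         # intermediate basis |+>, |->
         np.array([1.0, -1.0]) / np.sqrt(2)]
    F = [np.array([np.cos(theta), np.sin(theta)]),  # final basis, rotated by theta
         np.array([-np.sin(theta), np.cos(theta)])]

    # Kirkwood-Dirac quasi-probability q(a, f) = <f|a><a|i><i|f>.
    q = np.array([[(f @ a) * (a @ i) * (i @ f) for a in A] for f in F])

    print(q.min())    # negative: a "probability" below zero
    print(q.sum())    # the whole distribution still sums to 1
    ```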

    All metrology needs probes, which can be simple scales or thermometers. In state-of-the-art metrology however, the probes are quantum particles, which can be controlled at the sub-atomic level. These quantum particles are made to interact with the thing being measured. Then the particles are analysed by a detection device.

    In theory, the greater the number of probing particles, the more information will be available to the detection device. But in practice, there is a cap on the rate at which detection devices can analyse particles. The same is true in everyday life: putting on sunglasses can filter out excess light and improve vision, but there is a limit to how much filtering can help; sunglasses that are too dark are detrimental.

    “We’ve adapted tools from standard information theory to quasi-probabilities and shown that filtering quantum particles can condense the information of a million particles into one,” said lead author Dr David Arvidsson-Shukur from Cambridge’s Cavendish Laboratory and Sarah Woodhead Fellow at Girton College. “That means that detection devices can operate at their ideal influx rate while receiving information corresponding to much higher rates. This is forbidden according to normal probability theory, but quantum negativity makes it possible.”

    An experimental group at the University of Toronto has already started building technology to use these new theoretical results. Their goal is to create a quantum device that uses single-photon laser light to provide incredibly precise measurements of optical components. Such measurements are crucial for creating advanced new technologies, such as photonic quantum computers.

    “Our discovery opens up exciting new ways to use fundamental quantum phenomena in real-world applications,” said Arvidsson-Shukur.

    Quantum metrology can improve measurements of things including distances, angles, temperatures and magnetic fields. These more precise measurements can lead to better and faster technologies, but also better resources to probe fundamental physics and improve our understanding of the universe. For example, many technologies rely on the precise alignment of components or the ability to sense small changes in electric or magnetic fields. Higher precision in aligning mirrors can allow for more precise microscopes or telescopes, and better ways of measuring the earth’s magnetic field can lead to better navigation tools.

    Quantum metrology is currently used to enhance the precision of gravitational-wave detection at the Nobel Prize-winning LIGO observatories. But for the majority of applications, quantum metrology has been overly expensive and unachievable with current technology. The newly published results offer a cheaper way of doing quantum metrology.

    “Scientists often say that ‘there is no such thing as a free lunch’, meaning that you cannot gain anything if you are unwilling to pay the computational price,” said co-author Aleksander Lasek, a PhD candidate at the Cavendish Laboratory. “However, in quantum metrology this price can be made arbitrarily low. That’s highly counterintuitive, and truly amazing!”

    Dr Nicole Yunger Halpern, co-author and ITAMP Postdoctoral Fellow at Harvard University, said: “Everyday multiplication commutes: Six times seven equals seven times six. Quantum theory involves multiplication that doesn’t commute. The lack of commutation lets us improve metrology using quantum physics.

    “Quantum physics enhances metrology, computation, cryptography, and more; but proving rigorously that it does is difficult. We showed that quantum physics enables us to extract more information from experiments than we could with only classical physics. The key to the proof is a quantum version of probabilities — mathematical objects that resemble probabilities but can assume negative and non-real values.”

    See the full article here.



    The University of Cambridge (abbreviated as Cantab in post-nominal letters) is a collegiate public research university in Cambridge, England. Founded in 1209, Cambridge is the second-oldest university in the English-speaking world and the world’s fourth-oldest surviving university. It grew out of an association of scholars who left the University of Oxford after a dispute with townsfolk. The two ancient universities share many common features and are often jointly referred to as “Oxbridge”.

    Cambridge is formed from a variety of institutions which include 31 constituent colleges and over 100 academic departments organised into six schools. The university occupies buildings throughout the town, many of which are of historical importance. The colleges are self-governing institutions founded as integral parts of the university. In the year ended 31 July 2014, the university had a total income of £1.51 billion, of which £371 million was from research grants and contracts. The central university and colleges have a combined endowment of around £4.9 billion, the largest of any university outside the United States. Cambridge is a member of many associations and forms part of the “golden triangle” of leading English universities and Cambridge University Health Partners, an academic health science centre. The university is closely linked with the development of the high-tech business cluster known as “Silicon Fen”.

  • richardmitnick 12:03 pm on August 29, 2016 Permalink | Reply
    Tags: Quantum metrology

    From Physics: “Unlocking the Hidden Information in Starlight” 



    August 29, 2016
    Gabriel Durkin, Berkeley Quantum Information and Computation Center, University of California, Berkeley

    Quantum metrology shows that it is always possible to estimate the separation of two stars, no matter how close together they are.

    The Rayleigh criterion states that in direct imaging, two light sources are only discernable when the centers of their diffraction patterns, or peaks of their point spread functions, are farther apart than their widths. (Top) The sources are farther apart than the Rayleigh criterion distance. (Middle) The sources meet the Rayleigh criterion distance. (Bottom) The sources are closer than the Rayleigh criterion distance. Tsang and collaborators [1] used quantum metrology techniques to show that the Rayleigh criterion is not a fundamental limitation, finding that the separation between two objects can always be estimated with a precision that is independent of the size of the separation.

    A provocative new result [1] by Mankei Tsang, Ranjith Nair, and Xiao-Ming Lu of the National University of Singapore suggests that a long-standing limitation to the precision of astronomical imaging, the Rayleigh criterion, proposed in 1879 [2], is itself only an apparition. Using quantum metrology techniques, the researchers have shown that two uncorrelated point-like light sources, such as stars, can be discriminated to arbitrary precision even as their separation decreases to zero.

    Quantum metrology, a field that has existed since the late 1960s with the pioneering work of Carl Helstrom [3], is a peculiar hybrid of quantum mechanics and the classical estimation theory developed by statisticians in the 1940s. The methodology is a powerful one, quantifying resources needed for optimal estimation of elementary variables and fundamental constants. These resources include preparation of quantum systems in a characteristic (entangled) state, followed by judiciously chosen measurements, from which a desired parameter, itself not directly measurable, may be inferred.

    In the context of remote sensing, for example, in the imaging of objects in the night sky, the ability to prepare a physical system in an optimal state does not exist. In the case of starlight, the typical assumption is that the source is classical thermal light, the state of maximum entropy or “uninformativeness.” Imaging such sources is plagued by the limits of diffraction when the objects are in close proximity. The wave-like nature of light causes it to spread as it moves through space, bending around obstacles, for example when traversing a telescope aperture. This results in a diffraction pattern described by a so-called point spread function (PSF) in the image plane. The Rayleigh criterion states that two closely spaced objects are just resolvable—that is, discernable from one another—when the center of the diffraction pattern, or peak of the PSF, of one object is directly over the first minimum of the diffraction pattern of the other. Roughly, the PSF maxima must be farther apart than their widths (Fig. 1).
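    The criterion is easy to check numerically. A minimal sketch using a 1-D slit diffraction pattern (sinc-squared; `np.sinc` puts its first zero at exactly 1, so the Rayleigh separation in these units is 1.0) tests whether the summed intensity of two equal sources still shows a dip between the two peaks:

    ```python
    import numpy as np

    def psf(x):
        # 1-D slit diffraction pattern; np.sinc(x) = sin(pi x)/(pi x) has its
        # first zero at x = 1, so the Rayleigh separation in these units is 1.0.
        return np.sinc(x) ** 2

    def has_dip(separation, n=4001):
        # True if the summed intensity of two equal sources still shows a
        # dip at the midpoint between the two peaks.
        x = np.linspace(-3.0, 3.0, n)
        total = psf(x - separation / 2) + psf(x + separation / 2)
        return total[n // 2] < total.max() - 1e-12

    print(has_dip(1.0))   # True: at the Rayleigh separation the dip survives
    print(has_dip(0.5))   # False: below it the two peaks merge into one blob
    ```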

    Some astronomers say they are able to resolve objects that are slightly closer than the Rayleigh limit allows. Yet inevitably, as the angular separation between the objects decreases, the information that can be obtained about that separation using direct detection becomes negligible, and even the most optimistic astronomer, utilizing the most sophisticated signal-processing techniques, must admit defeat. Correspondingly, as the separation approaches zero, the minimum error on any unbiased estimation of the separation blows up to infinity, which has limited angular resolution in imaging since the time of Galileo. Typically, the mean-squared error on the estimation of a parameter scales with the number of repeated measurements or data points, ν, as 1∕ν. Even for a large error per measurement, any desired precision is attained by taking multiple data points. When, however, the lower bound on direct estimation of the separation is divergent because of the Rayleigh limit, the 1∕ν factor makes no impact. This is what Tsang and collaborators call Rayleigh’s curse.
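    Rayleigh's curse can be reproduced in a few lines. The sketch below (assuming a Gaussian PSF with width sigma = 1) computes the per-photon Fisher information about the separation that direct imaging provides; it collapses toward zero as the separation closes, so the error bound on any unbiased estimate diverges:

    ```python
    import numpy as np

    sigma = 1.0
    x = np.linspace(-10.0, 10.0, 20001)
    dx = x[1] - x[0]

    def gauss(x):
        return np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

    def intensity(d):
        # Normalized photon-arrival density for two equal incoherent
        # sources separated by d (Gaussian PSF assumed).
        return 0.5 * (gauss(x - d / 2) + gauss(x + d / 2))

    def fisher(d, eps=1e-4):
        # Per-photon Fisher information about d available to direct imaging:
        # FI = integral of (dp/dd)^2 / p dx, via a central finite difference.
        dp = (intensity(d + eps) - intensity(d - eps)) / (2 * eps)
        p = intensity(d)
        return np.sum(dp**2 / p) * dx

    for d in (2.0, 0.5, 0.1):
        print(d, fisher(d))   # shrinks toward zero as the separation closes
    ```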

    Using a quantum metrology formalism to minimize the estimation error, the initial achievement of their work has been to show that there is no fundamental obstacle to the estimation of the separation of two PSFs in one dimension (that is, for sources that sit on a line). As the separation of two PSFs decreases to zero, the amount of obtainable information stays constant. This discovery is nicely summed up by Tsang, who says we should apologize to the starlight “as it travels billions of light years to reach us, yet our current technology and even our space telescopes turn out to be wasting a lot of the information it carries.” [4]

    It could be suggested that this is merely a theoretical proof; the quantum metrology formalism indicates that there is always an optimal measurement, which minimizes the estimation error for the separation parameter. Paradoxically, this optimal measurement can, however, depend on the value of the parameter. To obviate such concerns, Tsang and his colleagues propose a strategy, based on state-of-the-art quantum optics technology, that produces a minimal error in the estimation of the separation variable—counterintuitively, this error remains constant for all separation values, under the assumption that the PSFs have a Gaussian shape. The method, which the authors call spatial mode demultiplexing (SPADE), splits the light from the two sources into optical waveguides that have a quadratic refractive-index lateral profile. Mathematically, this SPADE measurement decomposes the overlapping PSFs (a real function in one dimension) into the complete basis of Hermite functions, just as a Fourier transform provides a decomposition of a real function into a superposition of sine and cosine terms. A posteriori, one may be tempted to use intuition to explain why this Hermite basis measurement seems not to suffer Rayleigh’s curse, but then again, were intuition forthcoming, the result may not have been hidden from view for so long. (This elusiveness relates to subtleties in the estimation of a single parameter extracted from the joint statistics of two incoherent light sources.)
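    For a Gaussian PSF, Tsang, Nair, and Lu's analysis gives the SPADE photon a Poisson distribution over the Hermite-Gauss modes, p_q = e^(-Q) Q^q / q! with Q = d^2/(16 sigma^2), and a per-photon Fisher information of 1/(4 sigma^2) regardless of d. A short numeric check (a sketch under that Gaussian-PSF assumption) confirms the information stays constant where direct imaging's collapses:

    ```python
    import numpy as np
    from math import factorial

    sigma = 1.0

    def mode_probs(d, qmax=60):
        # For a Gaussian PSF of width sigma, a detected photon lands in
        # Hermite-Gauss mode q with Poisson probability
        #   p_q = exp(-Q) Q^q / q!,   Q = d^2 / (16 sigma^2),
        # following Tsang, Nair and Lu's analysis of SPADE.
        Q = d**2 / (16 * sigma**2)
        return np.array([np.exp(-Q) * Q**q / factorial(q) for q in range(qmax)])

    def fisher(d, eps=1e-6):
        # Per-photon Fisher information about the separation d.
        p = mode_probs(d)
        dp = (mode_probs(d + eps) - mode_probs(d - eps)) / (2 * eps)
        keep = p > 1e-300          # skip underflowed high-order modes
        return np.sum(dp[keep]**2 / p[keep])

    for d in (2.0, 1.0, 0.1):
        print(d, fisher(d))   # stays near 1/(4 sigma^2) = 0.25 for every d
    ```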

    One minor caveat of the approach is that full imaging of two point sources at positions X1 and X2 requires estimation of both the separation X1−X2 and the centroid (X1+X2)∕2. SPADE is only optimal when the centroid parameter is already known to high precision. Centroid estimation, however, has no equivalent analog of the Rayleigh curse; it may be made via direct imaging, with errors reduced appropriately via the factor 1∕ν for ν much greater than 1 data points.

    A second detail worth pondering is that this result utilized techniques from the quantum domain to reveal a classical result. (All of the physical assumptions about starlight admit a classical model.) The quantum metrology formalism has been used to optimally estimate a parameter, but no quantum correlations exist in the system for any value of that parameter, that is, for any angular separation of two stars. When no quantum correlations are present, the formalism will still indicate the best possible measurement strategy and the smallest achievable estimation error.

    An added blessing of quantum metrology is that it allows the development of generalized uncertainty relationships, for example between temperature and energy for a system at equilibrium [5], or photon number and path-length difference between the two arms of an interferometer. The result of Tsang and his colleagues can be presented as another type of generalized uncertainty, between source separation and “momentum.” The mean-squared error associated with separation estimation scales inversely with the momentum (Fourier) space variance of the overlapping PSFs.

    Regarding impact on the field, the authors’ study produced a flurry of generalizations and other experimental proposals. During the past six months there have been four proof-of-principle experiments, first in Singapore by Tsang’s colleague Alex Ling and collaborators [6], and then elsewhere in Canada and Europe [7–9]. A subsequent theory paper from researchers at the University of York [10] extends Tsang and colleagues’ theory result, which was for incoherent thermal sources such as starlight, to any general quantum state existing jointly between the two sources. This work exploits the roles of squeezing (of quantum fluctuations) and of quantum entanglement to improve measurement precision, extending applicability to domains in which control of the source light is possible, such as microscopy.

    Tsang and his colleagues have provided a new perspective on the utility of quantum metrology, and they have reminded us that even in observational astronomy—one of the oldest branches of science—there are (sometimes) still new things to be learned, at the most basic level.

    This research is published in Physical Review X.

    See the full article here.


    Physicists are drowning in a flood of research papers in their own fields and coping with an even larger deluge in other areas of physics. How can an active researcher stay informed about the most important developments in physics? Physics highlights a selection of papers from the Physical Review journals. In consultation with expert scientists, the editors choose these papers for their importance and/or intrinsic interest. To highlight these papers, Physics features three kinds of articles: Viewpoints are commentaries written by active researchers, who are asked to explain the results to physicists in other subfields. Focus stories are written by professional science writers in a journalistic style and are intended to be accessible to students and non-experts. Synopses are brief editor-written summaries. Physics provides a much-needed guide to the best in physics, and we welcome your comments (physics@aps.org).
