Tagged: MIT Technology Review

  • richardmitnick 3:44 pm on February 23, 2017 Permalink | Reply
    Tags: Magnetic resonance imaging, MIT Technology Review, University of Melbourne   

    From MIT Tech Review: “This Microscope Reveals Human Biochemistry at Previously Unimaginable Scales” 

    MIT Technology Review

    February 23, 2017


    Magnetic resonance imaging is one of the miracles of modern science. It produces noninvasive 3-D images of the body using harmless magnetic fields and radio waves. And with a few additional tricks, it can also reveal details of the biochemical makeup of tissue.

    Atomic-scale MRI holds promise for new drug discovery | The Melbourne Newsroom

    That biochemical trick is called magnetic resonance spectroscopy, and it is a powerful tool for physicians and researchers studying the biochemistry of the body, including metabolic changes in tumors in the brain and in muscles.

    But this technique is not perfect. The resolution of magnetic resonance spectroscopy is limited to length scales of about 10 micrometers. And there is a world of chemical and biological activity at smaller scales that scientists simply cannot access in this way.

    So physicians and researchers would dearly love to have a magnetic resonance microscope that can study body tissue and the biochemical reactions within it at much smaller scales.

    Today, David Simpson and pals at the University of Melbourne in Australia say they have built a magnetic resonance microscope with a resolution of just 300 nanometers that can study biochemical reactions on previously unimaginable scales. Their key breakthrough is an exotic diamond sensor that creates magnetic resonance images in a similar way to a light-sensitive CCD chip in a camera.

    Magnetic resonance imaging works by placing a sample in a magnetic field so powerful that the atomic nuclei all become aligned; in other words, they all spin the same way. When these nuclei are zapped with radio waves, the nuclei become excited and then emit radio waves as they relax. By studying the pattern of re-emitted radio waves, it is possible to work out where they have come from and so build up a picture of the sample.
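
    As a rough illustration of the resonance at work here, the radio frequency that excites hydrogen nuclei is set by the Larmor relation, f = γB. A minimal sketch in Python (the proton gyromagnetic ratio and the field strengths below are standard textbook values, not figures from the article):

```python
# Illustrative sketch: the radio frequency that excites hydrogen nuclei
# in MRI is the Larmor frequency, f = gamma * B, where gamma is roughly
# 42.58 MHz per tesla for protons.
GYROMAGNETIC_RATIO_MHZ_PER_T = 42.58  # protons (hydrogen-1)

def larmor_frequency_mhz(field_tesla: float) -> float:
    """Resonant radio frequency (MHz) for protons in a given field."""
    return GYROMAGNETIC_RATIO_MHZ_PER_T * field_tesla

for b in (1.5, 3.0, 7.0):  # common clinical and research field strengths
    print(f"{b:.1f} T -> {larmor_frequency_mhz(b):.1f} MHz")
```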

    The signals also reveal how the atoms are bonded to each other and the biochemical processes at work. But the resolution of this technique is limited by how closely the radio receiver can get to the sample.

    Enter Simpson and co, who have built an entirely new kind of magnetic resonance sensor out of diamond film. The secret sauce in this sensor is an array of nitrogen atoms that have been embedded in a diamond film at a depth of about seven nanometers and about 10 nanometers apart.

    Nitrogen atoms are useful because when embedded in diamond, they can be made to fluoresce. And when in a magnetic field, the color they produce is highly sensitive to the spin of atoms and electrons nearby or, in other words, to the local biochemical environment.

    So in the new machine, Simpson and co place their sample on top of the diamond sensor in a powerful magnetic field and zap it with radio waves. Any change in the state of nearby nuclei causes the nitrogen array to fluoresce in various colors. And the array of nitrogen atoms produces a kind of image, just like a light-sensitive CCD chip. All Simpson and co do is monitor this fireworks display to see what’s going on.

    To put the new technique through its paces, Simpson and co study the behavior of hexaaqua copper(2+) complexes in aqueous solution. Hexaaqua copper is present in many enzymes, which use it to incorporate copper into metalloproteins. However, the distribution of copper during this process, and the role it plays in cell signaling, are poorly understood because they are impossible to visualize in vivo.

    Simpson and co show how this can now be done using their new technique, which they call quantum magnetic resonance microscopy. They show how their new sensor can reveal the spatial distribution of copper 2+ ions in volumes of just a few attolitres and at high resolution. “We demonstrate imaging resolution at the diffraction limit (~300 nm) with spin sensitivities in the zeptomol (10⁻²¹) range,” say Simpson and co. They also show how the technique reveals the redox reactions that the ions undergo. And they do all this at room temperature.
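
    To get a feel for those scales, a zeptomole (10⁻²¹ mol) corresponds to only a few hundred ions. A back-of-the-envelope sketch, using an assumed 1 mM concentration and a 1-attolitre voxel that do not come from the paper:

```python
# Back-of-the-envelope sketch (assumed numbers, not from the paper):
# how many copper ions sit in an attolitre of solution, and what that
# corresponds to in moles -- the zeptomole regime quoted above.
AVOGADRO = 6.022e23  # ions per mole

def ions_in_volume(volume_litres: float, concentration_molar: float) -> float:
    """Number of ions in a volume at a given molar concentration."""
    return concentration_molar * volume_litres * AVOGADRO

# Example: a 1 mM Cu(2+) solution in a 1-attolitre (1e-18 L) voxel.
n = ions_in_volume(1e-18, 1e-3)
print(f"{n:.0f} ions ~= {n / AVOGADRO * 1e21:.2f} zeptomol")
```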

    That’s impressive work that has important implications for the future study of biochemistry. “The work demonstrates that quantum sensing systems can accommodate the fluctuating Brownian environment encountered in ‘real’ chemical systems and the inherent fluctuations in the spin environment of ions undergoing ligand rearrangement,” say Simpson and co.

    That makes it a powerful new tool that could change the way we understand biological processes. Simpson and co are optimistic about its potential. “Quantum magnetic resonance microscopy is ideal for probing fundamental nanoscale biochemistry such as binding events on cell membranes and the intra‐cellular transition metal concentration in the periplasm of prokaryotic cells.”

    Ref: arxiv.org/abs/1702.04418: Quantum Magnetic Resonance Microscopy

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

  • richardmitnick 12:48 pm on January 8, 2017 Permalink | Reply
    Tags: MIT Technology Review, A test that will detect all of the major cancer types

    From MIT Tech Review: “Liquid Biopsies Are About to Get a Billion Dollar Boost”

    MIT Technology Review

    January 6, 2017
    Michael Reilly

    A billion dollars sounds like a lot of money. But when your ambitions are as big as the cancer-detection startup Grail Bio’s are, it might not be enough.

    As CEO and ex-Googler Jeff Huber puts it, Grail’s aim is to create “a test that will detect all of the major cancer types.” Already the recipient of $100 million in funding from DNA sequencing company Illumina and a series of tech luminaries, Grail believes that adding another zero to its cash balance will put its lofty goals within reach. The company announced Thursday that it plans to raise $1 billion, has “indications of interest” from investors, and would move quickly to secure the hefty cash infusion.

    Whether Grail succeeds turns on the company’s ability to dramatically expand an emerging technology known as the liquid biopsy. It works by sequencing DNA from someone’s blood and looking for tell-tale fragments that indicate the presence of cancer. Dennis Lo, a doctor in Hong Kong, was among the first to show the technique’s promise. He’d previously used it to detect fetal DNA in a mother’s bloodstream. That led to a much safer form of screening for Down’s syndrome that is now in wide use.

    Lo has experimented with liquid biopsy as a way to catch liver and nasopharyngeal cancers, with some encouraging results. But he urged caution in assuming the technique could be translated to all cancers.

    Grail, which was spun out of Illumina about a year ago, has launched its first trials to see whether liquid biopsies can spot cancers earlier and more reliably than other screening tests.

    For his part, Huber seems to understand that he’s got a mountain to climb. Having lost his wife to colorectal cancer, he finds Grail’s mission deeply personal. He acknowledges that detecting cancer DNA may be difficult, because the disease mutates rapidly as it advances, and varies immensely from one type to another. He says his company will rely on sequencing the DNA of tens of thousands of subjects to build a library of cancer DNA that computers can then decipher.

    Beyond the high-minded talk of turning the tide in the war against cancer, though, is a more cynical reading of the situation. As a unit within Illumina, Grail was an expensive, long-shot bet to create a new market for its gene sequencing machines. As a separate, now cash-rich company, Grail figures to become one of Illumina’s biggest customers. And venture capital will foot the bill, whether or not the experiment works.


  • richardmitnick 3:16 pm on December 23, 2016 Permalink | Reply
    Tags: MIT Technology Review, Intel and Competitors IBM and Google

    From MIT Tech Review: “Intel Bets It Can Turn Everyday Silicon into Quantum Computing’s Wonder Material” 

    MIT Technology Review

    December 21, 2016
    Tom Simonite

    The world’s largest chip company sees a novel path toward computers of immense power.

    Researchers at TU Delft in the Netherlands use equipment like this to test quantum computing devices at supercool temperatures, in a collaboration with chip maker Intel. No image credit.

    Sometimes the solution to a problem is staring you in the face all along. Chip maker Intel is betting that will be true in the race to build quantum computers—machines that should offer immense processing power by exploiting the oddities of quantum mechanics.

    Competitors IBM, Microsoft, and Google are all developing quantum components that are different from the ones crunching data in today’s computers. But Intel is trying to adapt the workhorse of existing computers, the silicon transistor, for the task.

    Intel has a team of quantum hardware engineers in Portland, Oregon, who collaborate with researchers in the Netherlands, at TU Delft’s QuTech quantum research institute, under a $50 million grant established last year. Earlier this month Intel’s group reported that they can now layer the ultra-pure silicon needed for a quantum computer onto the standard wafers used in chip factories.

    This strategy makes Intel an outlier among industry and academic groups working on qubits, as the basic components needed for quantum computers are known. Other companies can run code on prototype chips with several qubits made from superconducting circuits (see Google’s Quantum Dream Machine). No one has yet advanced silicon qubits that far.

    A quantum computer would need to have thousands or millions of qubits to be broadly useful, though. And Jim Clarke, who leads Intel’s project as director of quantum hardware, argues that silicon qubits are more likely to get to that point (although Intel is also doing some research on superconducting qubits). One thing in silicon’s favor, he says: the expertise and equipment used to make conventional chips with billions of identical transistors should allow work on perfecting and scaling up silicon qubits to progress quickly.

    Intel’s silicon qubits represent data in a quantum property called the “spin” of a single electron trapped inside a modified version of the transistors in its existing commercial chips. “The hope is that if we make the best transistors, then with a few material and design changes we can make the best qubits,” says Clarke.

    Another reason to work on silicon qubits is that they should be more reliable than the superconducting equivalents. Still, all qubits are error prone because they work on data using very weak quantum effects (see Google Researchers Make Quantum Components More Reliable).

    The new process that helps Intel experiment with silicon qubits on standard chip wafers, developed with the materials companies Urenco and Air Liquide, should help speed up its research, says Andrew Dzurak, who works on silicon qubits at the University of New South Wales in Australia. “To get to hundreds of thousands of qubits, we will need incredible engineering reliability, and that is the hallmark of the semiconductor industry,” he says.

    Companies developing superconducting qubits also make them using existing chip fabrication methods. But the resulting devices are larger than transistors, and there is no template for how to manufacture and package them up in large numbers, says Dzurak.

    Chad Rigetti, founder and CEO of Rigetti Computing, a startup working on superconducting qubits similar to those Google and IBM are developing, agrees that this presents a challenge. But he argues that his chosen technology’s head start will afford ample time and resources to tackle the problem.

    Google and Rigetti have both said that in just a few years they could build a quantum chip with tens or hundreds of qubits that dramatically outperforms conventional computers on certain problems, even doing useful work on problems in chemistry or machine learning.

    No science papers cited.


  • richardmitnick 3:38 pm on December 7, 2016 Permalink | Reply
    Tags: MIT Technology Review

    From MIT Tech Review: “Personalized Cancer Vaccine Prevents Leukemia Relapse in Patients” 

    MIT Technology Review

    December 7, 2016
    Emily Mullin

    Shortly after Ernest Levy of Cooperstown, New York, returned from a trip to South Africa with his son for the 2010 World Cup, he was diagnosed with acute myeloid leukemia. The prognosis didn’t look good for Levy, now 76. Just over a quarter of adult patients survive five years after developing the disease, a type of cancer that affects bone marrow.

    Levy joined a clinical trial led by the Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School in Boston, testing a cancer vaccine for acute myeloid leukemia. After an initial round of chemotherapy, he and the other trial participants received the experimental vaccine, a type of immunotherapy intended to “reëducate” the immune cells to see cancer cells as foreign and attack them, explains David Avigan, chief of Hematological Malignancies and director of the Cancer Vaccine Program at Beth Israel.

    Now results from the trial suggest that the vaccine was able to stimulate powerful immune responses against cancer cells and protect a majority of patients from relapse—including Levy. Out of 17 patients with an average age of 63 who received the vaccine, 12 are still in remission four years or more after receiving the vaccine, Avigan and his co-authors at the Dana-Farber Cancer Institute report. The researchers found expanded levels of immune cells that recognize acute myeloid leukemia cells after vaccination. The results appear today in the journal Science Translational Medicine.

    Acute myeloid leukemia is typically treated with a combination of chemotherapies, but the cancer often relapses after initial treatment, with older patients having a higher chance of relapse.

    Therapeutic cancer vaccines are designed to work by activating immune cells called T cells and directing them to recognize and act against cancer cells, or by spurring the production of antibodies that bind to certain molecules on the surface of cancer cells. But producing effective therapeutic vaccines has proved challenging, with many of these vaccines either failing outright or showing only marginal increases in survival rates in clinical trials.

    Avigan and his colleagues created a personalized vaccine by taking leukemia cells from patients and then freezing them for preservation while they received a traditional chemotherapy. Then scientists thawed the cancer cells and combined them with dendritic cells, immune cells that unleash tumor-fighting T cells. The vaccine took about 10 days to manufacture and another three to four weeks before it was ready for administration.

    Many cancer vaccine strategies have homed in on a single target, or antigen. When the antigen is introduced in the body via injection, it causes an immune response. The body begins to produce T cells that recognize and attack the same antigen on the surface of cancer cells. The vaccine Avigan and his team created uses a mixture of cells that contain many antigens in an attempt to generate a more potent approach.

    Though the number of patients in the trial was small, Avigan says, “this was enough of a provocative finding” that the researchers will be expanding the trial to include more patients. At the same time, the personalized vaccine approach is already being tested in other types of cancers.


  • richardmitnick 2:42 pm on August 18, 2016 Permalink | Reply
    Tags: MIT Technology Review

    From MIT Tech Review: “New Brain-Mapping Technique Captures Every Connection Between Neurons” 

    MIT Technology Review

    August 18, 2016
    Ryan Cross

    The human brain is among the universe’s greatest remaining uncharted territories. And as with any mysterious land, the secret to understanding it begins with a good map.

    Neuroscientists have now taken a huge step toward the goal of mapping the connections between neurons in the brain using bits of genetic material to bar-code each individual brain cell. The technique, called MAP-seq, could help researchers study disorders like autism and schizophrenia in unprecedented detail.

    “We’ve got the basis for a whole new technology with a gazillion applications,” says Anthony Zador, a neuroscientist at Cold Spring Harbor Laboratory who came up with the technique.

    Current methods for mapping neuronal connections, known as the brain’s connectome, commonly rely on fluorescent proteins and microscopes to visualize cells, but they are laborious and have difficulty following the connections of many neurons at once.

    MAP-seq works by first creating a library of viruses that contain randomized RNA sequences. This mixture is then injected into the brain, and approximately one virus enters each neuron in the injection area, granting each cell a unique RNA bar code. The brain is then sliced and diced into orderly sections for processing. A DNA sequencer reads the RNA bar codes, and researchers create a connectivity matrix that displays how individual neurons connect to other regions of the brain.
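
    The readout described above can be sketched as toy code: each barcode stands in for one neuron, and counting barcode hits per dissected region yields the connectivity matrix. The barcodes, region names, and reads below are made up for illustration; real MAP-seq data involves far longer random sequences and error correction.

```python
# Toy sketch of the MAP-seq readout (hypothetical data): each neuron's
# soma carries a unique RNA barcode, and sequencing each dissected brain
# region counts how many copies of each barcode it contains. Collecting
# the counts gives a neurons-by-regions connectivity matrix.
from collections import defaultdict

def connectivity_matrix(reads, regions):
    """reads: iterable of (barcode, region) sequencing hits."""
    counts = defaultdict(lambda: {r: 0 for r in regions})
    for barcode, region in reads:
        counts[barcode][region] += 1
    return {bc: [row[r] for r in regions] for bc, row in counts.items()}

regions = ["cortex", "hippocampus", "cerebellum"]
reads = [("AACG", "cortex"), ("AACG", "cortex"), ("AACG", "hippocampus"),
         ("TTGC", "cerebellum")]
matrix = connectivity_matrix(reads, regions)
print(matrix)  # {'AACG': [2, 1, 0], 'TTGC': [0, 0, 1]}
```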

    The newly published study, which appears Thursday in the journal Neuron, follows the sprawling outbound connections from 1,000 mouse neurons in a brain region called the locus coeruleus to show that the technique works. But Zador says the results actually reconcile previously conflicting findings about how those neurons connect across the brain.

    Justus Kebschull, who worked with Zador in developing MAP-seq, says the technique is getting better. “We’re now mapping out 100,000 cells at a time, in one week, in one experiment,” he says. “That was previously only possible if you put a ton of work in.”

    Both autism and schizophrenia are viewed as disorders that may arise from dysfunctional brain connectivity. There are perhaps hundreds of genetic mutations that may slightly alter the brain’s wiring as it develops. “We are looking at mouse models where something is mucked up. And now that the method is so fast, we can look at many mouse models,” Kebschull says. By comparing the brain circuitry in mice with different candidate genes for autism, researchers expect, they’ll get new insight into the condition.

    “I think it is a great method that has a lot of room to grow,” says Je Hyuk Lee, a molecular biologist at Cold Spring Harbor Laboratory, who was not part of the MAP-seq study. Although other groups have used similar bar-coding to study individual differences between cells, no one knew if the bar codes would be able to travel along the neuronal connections across the brain. “That had been conjectured but never shown, especially not at this scale,” Lee says.

    Zador says that as of now, his lab is the only one bar-coding the brain, but he hopes others will start using MAP-seq to chart the brain’s circuitry. “Because the cost of sequencing is continuing to plummet, we can envision doing this quickly and cheaply,” he said. It may not be long, then, before a complete map of the brain is ready for its first explorer to use.


  • richardmitnick 10:51 am on June 11, 2016 Permalink | Reply
    Tags: Genetic replacement therapy, MIT Technology Review, Strimvelis

    From MIT Tech Review: “Gene Therapy’s First Out-and-Out Cure Is Here” 

    MIT Technology Review

    May 6, 2016
    Antonio Regalado

    An artist’s illustration of gene therapy shows a retrovirus harboring a correct copy of a human gene. No image credit

    A treatment now pending approval in Europe will be the first commercial gene therapy to provide an outright cure for a deadly disease.

    The treatment is a landmark for gene-replacement technology, an idea that’s struggled for three decades to prove itself safe and practical.

    Called Strimvelis, and owned by drug giant GlaxoSmithKline, the treatment is for severe combined immune deficiency, a rare disease that leaves newborns with almost no defense against viruses, bacteria, or fungi and is sometimes called “bubble boy” disease after an American child whose short life inside a protective plastic shield was described in a 1976 movie.

    The treatment is different from any that’s come before because it appears to be an outright cure carried out through a genetic repair. The therapy was tested on 18 children, the first of them 15 years ago. All are still alive.

    “I would be hesitant to call it a cure, although there’s no reason to think it won’t last,” says Sven Kili, the executive who heads gene-therapy development at GSK.

    The British company licensed the treatment in 2010 from the San Raffaele Telethon Institute for Gene Therapy, in Milan, Italy, where it was developed and first tested on children.

    On April 1, European advisors recommended that Strimvelis be allowed on the market. If, as expected, GSK wins formal authorization, it can start selling the drug in 27 European countries. GSK plans to seek U.S. marketing approval next year.

    GSK is the first large drug company to seek to market a gene therapy to treat any genetic disease. If successful, the therapeutic could signal a disruptive new phase in medicine in which one-time gene fixes replace trips to the pharmacy or lifelong dependence on medication.

    “The idea that you don’t have to worry about it and can be normal is extremely exciting for people,” says Marcia Boyle, founder and president of the Immune Deficiency Foundation, whose son was born with a different immune disorder, one of more than 200 known to exist. “I am a little guarded on gene therapy because we were all excited a long time ago, and it was not as easy to fool Mother Nature as people had hoped.”

    Today, several hundred gene therapies are in development, and many aspire to be out-and-out cures for one of about 5,000 rare diseases caused by errors in a single gene.

    Children who lack correct copies of a gene called adenosine deaminase begin to get life-threatening infections days after birth. The current treatment for this immune deficiency, known as ADA-SCID, is a bone marrow transplant, which itself is sometimes fatal, or lifelong therapy with replacement enzymes that cost $5,000 a vial.

    Strimvelis uses a “repair and replace” strategy, so called because doctors first remove stem cells from a patient’s bone marrow then soak them with viruses to transfer a correct copy of the ADA gene.

    “What we are talking about is ex vivo gene therapy—you pull out the cells, correct them in test tube, and put the cells back,” says Maria-Grazia Roncarolo, a pediatrician and scientist at Stanford University who led the original Milan experiments. “If you want to fix a disease for life, you need to put the gene in the stem cells.”

    Some companies are trying to add corrected genes using direct injections into muscles or the eye. But the repair-and-replace strategy may have the larger impact. As soon as next year, companies like Novartis and Juno Therapeutics may seek approval for cancer treatments that also use a combination of gene and cell therapy to obliterate one type of leukemia.

    Overall, investment in gene therapy is booming. The Alliance for Regenerative Medicine says that globally, in 2015, public and private companies raised $10 billion, and about 70 treatments are in late-stage testing.

    GSK has never sold a product so drastically different from a bottle of pills. And because ADA-SCID is one of the rarest diseases on Earth, Strimvelis won’t be a blockbuster. GSK estimates there are only about 14 cases a year in Europe, and 12 in the U.S.

    Instead, the British company hopes to master gene-therapy technology, including virus manufacturing. “If we can first make products that change lives, then we can develop them into things that affect more people,” says Kili. “We believe gene therapy is an area of important future growth; we don’t want to rush or cut corners.”

    Markets will closely scrutinize how much GSK charges for Strimvelis. Kili says a final decision hasn’t been made. Another gene therapy, called Glybera, debuted with a $1 million price tag but is already considered a commercial flop. The dilemma is how to bill for a high-tech drug that people take only once.

    Kili says GSK’s price won’t be anywhere close to a million dollars, though it will be enough to meet a company policy of getting a 14 percent return on every dollar spent on R&D.

    The connection between immune deficiency and gene therapy isn’t new. In fact, the first attempt to correct genes in a living person occurred in 1990, also in a patient with ADA-SCID.

    By 2000, teams in London and France had cured some children of a closely related immune deficiency, X-linked SCID, the bubble boy disease. But some of those children developed leukemia after the viruses dropped their genetic payloads into the wrong part of the genome.

    In the U.S., the Food and Drug Administration quickly canceled 27 trials over safety concerns. “It was a major step back,” says Roncarolo, and probably a more serious red flag even than the death of a volunteer named Jesse Gelsinger in a U.S. trial in 1999, which also drew attention to gene therapy’s risks.

    The San Raffaele Telethon Institute for Gene Therapy presented its own results in ADA-SCID, which also affects girls, in 2002 in the journal Science. Like the French, they’d also apparently cured patients, and because of differences in their approach, they didn’t run the same cancer risk.

    GSK says it is moving toward commercializing several other gene therapies for rare disease developed by the Italian team, including treatments for metachromatic leukodystrophy, a rare but rapidly fatal birth defect, and for beta thalassemia.

    Kili says the general idea is to leapfrog from ultra-rare diseases to less rare ones, like beta thalassemia, hemophilia, and sickle cell disease. However, he doubts the technology will be used to treat common conditions such as arthritis or heart disease anytime soon. Those conditions are complex and aren’t caused by a defect in just one gene.

    “Honestly, as we stand at the moment, I don’t think gene therapy will address all the ills or ailments of humanity. We can address [single-gene] disease,” he says. “We are building a hammer that is not that big.”


  • richardmitnick 8:19 am on June 10, 2016 Permalink | Reply
    Tags: A Power Plant in Iceland Deals with Carbon Dioxide by Turning It into Rock, MIT Technology Review

    From MIT Tech Review: “A Power Plant in Iceland Deals with Carbon Dioxide by Turning It into Rock” 

    MIT Technology Review

    June 9, 2016
    Ryan Cross

    Photograph by Juerg Matter

    The world has a carbon dioxide problem. And while there are lots of ideas on how to curtail the nearly 40 billion tons of the gas that humanity spews into the atmosphere annually, one has just gotten a boost: burying it.

    Since 2012, Reykjavík Energy’s CarbFix project in Iceland has been injecting carbon dioxide underground in a way that converts it into rock so that it can’t escape. This kind of carbon sequestration has been tried before, but as researchers working on the project report today in the journal Science, the process of mineralizing the carbon dioxide happens far more quickly than expected, confirming previous reports and brightening the prospects for scaling up this technology.

    Iceland’s volcanic landscape is replete with basalt. Injecting carbon dioxide and water deep underground allows the mixture to react with calcium, magnesium, and iron in the basalt, turning it into carbonate minerals like limestone.

    Project leader Juerg Matter stands by the injection well during the CarbFix project’s initial injection. Photograph by Sigurdur Gislason

    Conventional methods for storing carbon dioxide underground pressurize and heat it to form a supercritical fluid, giving it the properties of both a liquid and a gas. While making the carbon dioxide easier to inject into the ground—usually in an old oil or gas reservoir—this carries a higher risk that it could escape back into the atmosphere through cracks in the rock.

    CarbFix takes carbon dioxide from the Hellisheidi geothermal power plant, the largest in the world, which uses volcanically heated water to power turbines. The process produces 40,000 tons of carbon dioxide a year, as well as hydrogen sulfide, both of which are naturally present in the water.

    The CarbFix pilot injection site in March 2011. Photograph by Martin Stute

    The new study shows that more than 95 percent of the injected material turned to rock in less than two years. “No one actually expected it to be this quick,” says Edda Aradóttir, CarbFix’s project manager. The project is already storing 5,000 tons underground per year, making it the largest of its kind. New equipment being installed this summer aims to double the rate of storage.
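
    A rough mass balance gives a sense of how much rock that makes. The sketch below assumes, purely for illustration, that all of the stored carbon dioxide ends up as calcium carbonate; in practice iron and magnesium carbonates form as well.

```python
# Rough mass balance (illustrative assumption: all stored CO2 ends up
# as calcium carbonate): each 44 g of CO2 binds into about 100 g of
# CaCO3, so 5,000 tons of CO2 per year becomes ~11,000 tons of rock.
M_CO2 = 44.01     # g/mol
M_CACO3 = 100.09  # g/mol

def tons_caco3_from_co2(tons_co2: float) -> float:
    """Tons of calcium carbonate formed when a mass of CO2 mineralizes."""
    return tons_co2 * M_CACO3 / M_CO2

print(f"{tons_caco3_from_co2(5_000):,.0f} tons of carbonate per year")
```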

    Aradóttir says CarbFix spends $30 per ton to capture and inject the carbon dioxide, versus $65 to $100 per ton for the conventional method. A lot of that savings comes from not having to purify the carbon dioxide; it and the hydrogen sulfide are simply mixed with additional water and injected underground.
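
    Applied to the plant's annual output, those per-ton figures translate into substantial savings. A simple calculation from the numbers quoted above:

```python
# Simple cost comparison from the figures quoted above: CarbFix's
# mineralization at $30/ton versus $65-$100/ton for conventional
# supercritical-CO2 storage, applied to the plant's 40,000 tons/year.
CARBFIX_COST = 30          # dollars per ton
CONVENTIONAL = (65, 100)   # dollars-per-ton range
ANNUAL_TONS = 40_000       # CO2 produced by the Hellisheidi plant per year

carbfix_total = CARBFIX_COST * ANNUAL_TONS
savings = tuple((c - CARBFIX_COST) * ANNUAL_TONS for c in CONVENTIONAL)
print(f"CarbFix: ${carbfix_total:,}/yr; savings vs conventional: "
      f"${savings[0]:,} to ${savings[1]:,} per year")
```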

    CarbFix team members handle the rock core recovered from drilling at the CarbFix pilot injection site in October 2014. Photograph by Juerg Matter


  • richardmitnick 12:13 pm on June 9, 2016 Permalink | Reply
    Tags: MIT Technology Review

    From MIT Tech Review: “Proof That Quantum Computers Will Change Everything” 

    MIT Technology Review

    First Demonstration of 10-Photon Quantum Entanglement

    June 9, 2016
    Emerging Technology from the arXiv

    The ability to entangle 10 photons should allow physicists to prove, once and for all, that quantum computers really can do things classical computers cannot.

    Entanglement is the strange phenomenon in which quantum particles become so deeply linked that they share the same existence. Once rare, entangling particles has become routine in labs all over the world.

    Quantum approach to big data. MIT

    Physicists have learned how to create entanglement, transfer it from one particle to another, and even distil it. Indeed, entanglement has become a resource in itself and a crucial one for everything from cryptography and teleportation to computing and simulation.

    But a significant problem remains. To carry out ever more complex and powerful experiments, physicists need to produce entanglement on ever-larger scales by entangling more particles at the same time.

    The current numbers are paltry, though. Photons are the quantum workhorses in most labs and the record for the number of entangled photons is a mere eight, produced at a rate of about nine events per hour.

    Using the same techniques, a 10-photon source would produce a count rate of only about 170 events per year, too few even to measure easily. So the prospects of improvement have seemed remote.

    Which is why the work of Xi-Lin Wang and pals at the University of Science and Technology of China in Hefei is impressive. Today, they announce that they’ve produced 10-photon entanglement for the first time, and they’ve done it at a count rate that is three orders of magnitude higher than anything possible until now.

    The biggest bottleneck in entangling photons is the way they are produced. This involves a process called spontaneous parametric down conversion, in which one energetic photon is converted into two photons of lower energy inside a crystal of beta-barium borate. These daughter photons are naturally entangled.

    Experimental setup for generating ten-photon polarization-entangled GHZ states, from the paper

    By zapping the crystal continuously with a laser beam, it is possible to create a stream of entangled photon pairs. However, the rate of down conversion is tiny, just one photon per trillion. So collecting the entangled pairs efficiently is hugely important.

    That’s no easy task, not least because the photons come out of the crystal in slightly different directions, neither of which can be easily predicted. Physicists collect the photons from the two points where they are most likely to appear, but most of the entangled photons are lost.

    Xi-Lin and co have tackled this problem by reducing the uncertainty in the photon directions. Indeed, they have been able to shape the beams of entangled photons so that they form two separate circular beams, which can be more easily collected.

    In this way, the team has generated entangled photon pairs at the rate of about 10 million per watt of laser power. This is brighter than previous entanglement generators by a factor of about four. It is this improvement that makes 10-photon entanglement possible.
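    The "three orders of magnitude" claim follows directly from the fourfold brightness gain. A ten-photon event needs five entangled pairs to arrive simultaneously, so the event rate scales as the per-pair brightness raised to the fifth power. A back-of-envelope sketch (my own arithmetic, not a figure from the paper):

    ```python
    # A ten-photon event requires five entangled pairs to coincide, so the
    # event rate scales as the per-pair brightness to the fifth power.
    brightness_gain = 4      # per-pair improvement reported above
    pairs_needed = 5         # five pairs -> ten photons
    rate_gain = brightness_gain ** pairs_needed
    print(rate_gain)         # 1024: roughly three orders of magnitude
    ```

    A factor of about 1,000 is exactly what turns a hopeless 170 events per year into a rate that can actually be measured.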

    Their method is to collect five successively generated pairs of entangled photons and pass them into an optical network of four beam splitters. The team then introduces time delays that ensure the photons arrive at the beam splitters simultaneously and so become entangled.

    This creates the 10-photon entangled state, albeit at a rate of about four per hour, which is low but finally measurable. “We demonstrate, for the first time, genuine and distillable entanglement of 10 single photons,” say Xi-Lin and co.

    That’s impressive work that immediately opens the prospect of a new generation of experiments. The most exciting of these is a technique called boson sampling that physicists hope will prove that quantum computers really are capable of things classical computers are not.

    That’s important because nobody has built a quantum computer more powerful than a pocket calculator (the controversial D-Wave results aside), and nobody is likely to in the near future. So boson sampling is quantum physicists’ greatest hope for showing off the mind-boggling power of quantum computation for the first time.

    Other things also become possible, such as the quantum teleportation of three degrees of freedom in a single photon and multi-photon experiments over very long distances.

    But it is the possibility of boson sampling that will send a frisson through the quantum physics community.

    Ref: arxiv.org/abs/1605.08547: Experimental Ten-Photon Entanglement


  • richardmitnick 8:54 am on June 6, 2016 Permalink | Reply
    Tags: Go Inside an Industrial Plant That Sucks Carbon Dioxide Straight Out of the Air, MIT Technology Review

    From MIT Tech Review: “Go Inside an Industrial Plant That Sucks Carbon Dioxide Straight Out of the Air” 

    MIT Technology Review

    June 6, 2016
    Peter Fairley


    Carbon dioxide emissions must decrease to nearly zero by 2040 if global warming by the end of this century is to be held to 2 °C. But we may well miss that target. A pilot plant started up last fall at Squamish, British Columbia, is testing a backup plan: sucking carbon dioxide directly out of the air.

    Capturing ambient carbon dioxide is a tall order because, for all the trouble it causes, the greenhouse gas makes up just 0.04 percent of the air we breathe. The Squamish plant can capture one ton of carbon dioxide a day. Significantly reducing atmospheric carbon dioxide levels would require thousands of far larger facilities, each sucking millions of tons of carbon per year out of the air.
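    The dilution problem can be made concrete with a rough estimate of my own (assuming CO2 is 0.04 percent of air by volume, a CO2 density of about 1.98 kg/m³ near 0 °C, and perfect capture):

    ```python
    # How much air must flow through a direct-air-capture plant to collect
    # one metric ton of CO2 per day, under the assumptions above.
    co2_fraction = 0.0004                        # volume fraction of CO2 in air
    co2_density = 1.98                           # kg per m^3 of pure CO2
    co2_per_m3_air = co2_fraction * co2_density  # ~0.0008 kg CO2 per m^3 of air

    target_kg = 1000.0                           # one metric ton per day
    air_m3_per_day = target_kg / co2_per_m3_air
    print(f"{air_m3_per_day:,.0f}")              # well over a million m^3 of air
    ```

    Over a million cubic meters of air must pass through the contactor every day for a single ton of CO2, and in practice more, since the plant captures only a fraction of what flows through.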

    The plant is the brainchild of Calgary-based Carbon Engineering and its founder, Harvard University physicist David Keith. While some scientists have estimated that direct air capture would cost $400 to $1,000 per ton of carbon dioxide, Keith projects that large plants could do it for about $100 per ton.

    “We’ve taken existing pieces of industrial equipment and thought about new chemistries to run through them,” says Adrian Corless, Carbon Engineering’s CEO. The company captures carbon dioxide in a refashioned cooling tower flowing with an alkali solution that reacts with acidic carbon dioxide. That yields dissolved carbon molecules that are then converted to pellets in equipment created to extract minerals in water treatment plants. And the plant can turn those carbonate solids into pure carbon dioxide gas for sale by heating them in a modified cement kiln.

    Carbon Engineering CEO Adrian Corless

    In May the company closed on $8 million of new financing in Canadian dollars ($6.2 million in U.S. dollars) from investors including Bill Gates. Keith also hopes to start winning over skeptics. “Most people in the energy expert space think that air capture is not particularly credible,” he says. “There won’t be incentives and funding in a serious way for these technologies unless people believe that they actually work.”

    Carbon dioxide is captured within the plant’s gas-liquid contactor, which is essentially a repurposed cooling tower. An alkaline solution in the contactor reacts with acidic carbon dioxide in air to enrich the capture solution with potassium carbonate.


    The contactor contains 80 cubic meters of plastic packing whose three-dimensional honeycomb structure offers 16,800 square meters of surface area. The setup removes 75 to 80 percent of the carbon dioxide in the air.


    Above top: The capture fluid, now rich with carbon dioxide from the air, circulates to a 13-meter-tall reactor. Above bottom: Calcium hydroxide is added to the capture fluid just before it enters the reactor, causing two products to be created inside. One is solid calcium carbonate containing the captured atmospheric carbon. The second, potassium hydroxide, flows back to the air contactor to capture more carbon dioxide.

    As fluid moves up through the reactor, growing pellets of calcium carbonate spread out in a gradient, with the smallest pellets at the top. Pellets can be removed via these sample ports and analyzed in order to optimize the process.

    The heaviest pellets settle at the bottom of the reactor and are periodically removed, washed to remove fine crystals and capture fluid, and dried. The finished product is solid grains of calcium carbonate that resemble a fine couscous.

    Dried pellets are fed into the calciner, in which a 900 °C inferno of natural gas burning in pure oxygen roasts a rolling mass of calcium oxide. The calcium carbonate pellets spontaneously break down, producing more calcium oxide and releasing carbon dioxide gas.
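    The kiln step described above is textbook calcination, CaCO3 -> CaO + CO2, and its mass balance is easy to verify with rounded standard atomic weights (a sketch of my own, not a figure from the article):

    ```python
    # Mass balance of the calciner step: heating calcium carbonate splits
    # it into calcium oxide (recycled) and carbon dioxide gas (released).
    # Rounded standard atomic weights, in g/mol: Ca 40.08, C 12.01, O 16.00.
    CaCO3 = 40.08 + 12.01 + 3 * 16.00   # 100.09 g/mol
    CaO = 40.08 + 16.00                 # 56.08 g/mol
    CO2 = 12.01 + 2 * 16.00             # 44.01 g/mol

    assert abs(CaCO3 - (CaO + CO2)) < 0.01   # mass is conserved

    # Fraction of each pellet's mass that leaves the kiln as CO2:
    print(round(CO2 / CaCO3, 2))             # ~0.44
    ```

    So a bit under half of each pellet’s mass comes off as recoverable carbon dioxide, with the calcium oxide presumably slaked back to calcium hydroxide to feed the pellet reactor again.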

    Next up at Squamish: turning captured carbon dioxide (now vented back to the air) into a low-carbon transportation fuel. By reacting carbon dioxide with hydrogen, Carbon Engineering plans to synthesize a fuel with less than one-third the carbon content of conventional gasoline. Corless estimates the fuels will cost $4 to $6 per gallon, but he expects to fetch a premium in places such as California and the European Union, where mandates require fuel suppliers to reduce their carbon content annually. Ultimately, says Corless, fuel from air capture may prove crucial to breaking fossil-fuel dependence everywhere.


  • richardmitnick 5:45 pm on April 26, 2016 Permalink | Reply
    Tags: "A Space Mission to the Gravitational Focus of the Sun", MIT Technology Review

    From MIT Tech Review: “A Space Mission to the Gravitational Focus of the Sun” 

    MIT Technology Review

    April 26, 2016
    by Emerging Technology from the arXiv

    The search for an Earth-like planet orbiting another star is one of astronomy’s greatest challenges. It’s a task that appears close to fruition. Since astronomers spotted the first exoplanet in 1988, they have found more than 2,000 others.

    Most of these planets are huge, because bigger objects are easier to spot. But as sensing techniques and technologies improve, astronomers are finding planets that match Earth’s vital statistics ever more closely.

    They have even begun to use a ranking system called the Earth Similarity Index to quantify how similar an exoplanet is to the mother planet. The exoplanet that currently ranks most highly is Kepler-438b, which orbits in the habitable zone of a red dwarf in the constellation of Lyra some 470 light years from here.

    Kepler-438b has an Earth Similarity Index of 0.88. By comparison, Mars has an ESI of 0.797, so it’s more Earth-like than our nearest neighbor. That’s exciting but it is inevitable that astronomers will find planets with even higher indices in the near future.

    And that raises an interesting question: how much can we ever know about these planets, given their size and distance from us? After all, the limited size of orbiting telescopes places severe restrictions on how much light and information we can gather from an Earth analogue.


    But there is another option—the gravitational field of the sun can focus light. Place a telescope at the focal point of this giant lens and it should become possible to study a distant object in unprecedented detail. But how good would such a lens be; what would it reveal that we couldn’t see with our own telescopes?

    Today we get an answer to these questions thanks to the work of Geoffrey Landis at NASA’s John Glenn Research Center in Cleveland. Landis has analyzed the resolving power of the solar lens and worked out just how good it could be.

    The basic physics is straightforward and has been worked out in some detail by astronomers in the past. General relativity predicts that light must bend around any massive object. The effect is tiny, however, and only observable with objects of truly enormous mass.

    Despite its size, the sun only bends light by a tiny amount. Consequently, the focal point of our solar lens is at least 550 astronomical units away. That’s beyond the orbit of Pluto and the Kuiper Belt, which extends a mere 50 AU.
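    The 550 AU figure follows from general relativity’s deflection angle for light grazing the solar limb, 4GM/(Rc²): a ray bent by that tiny angle crosses the optical axis at a distance of R divided by the angle. A quick check, using rounded standard constants (the arithmetic is my own):

    ```python
    # Minimum focal distance of the solar gravitational lens. Light grazing
    # the solar limb is deflected by theta = 4*G*M/(R*c^2) radians, so
    # grazing rays converge at d = R / theta = R^2 * c^2 / (4 * G * M).
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8        # speed of light, m/s
    M_sun = 1.989e30   # solar mass, kg
    R_sun = 6.957e8    # solar radius, m
    AU = 1.496e11      # astronomical unit, m

    d_focus = R_sun**2 * c**2 / (4 * G * M_sun)
    print(round(d_focus / AU))  # ~548 AU, matching the ~550 AU figure above
    ```

    Rays passing farther from the limb focus farther out, which is why 550 AU is a minimum rather than a single focal point.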

    Nevertheless, it is a tempting stepping stone given that there is little of interest between the Kuiper Belt and the next nearest star, Alpha Centauri, which is 280,000 AU distant. “There is thus a powerful incentive to find some plausible objective in visiting the gravitational focus, as a potential intermediate step toward a future interstellar mission,” says Landis.

    Kuiper Belt. Minor Planet Center

    Alpha, Beta, and Proxima Centauri. Skatebiker, 27 February 2012

    But there are significant challenges in using the sun as a gravitational lens. The first is related to pointing and focal length. The idea is to place a spacecraft on the opposite side of the sun from the exoplanet, but it cannot sit exactly at the focal point where the light from the exoplanet converges.

    That’s because any image would be drowned out by light from the sun, which would still be the brightest object in the sky. Instead, the spacecraft would sit beyond the focal point where the light from the exoplanet would form into an Einstein ring around the sun. It is this ring that the mission would have to sample.

    Einstein ring. NASA/ESA Hubble

    But it is not just the sun that can drown out the image. The solar corona, the aura of plasma that surrounds the sun, is also a problem, and this extends much further. To ensure that the Einstein ring is larger than the corona and not obscured by it, the mission would have to sit even further, at a distance of more than 2,000 AU, says Landis. That’s much further than the 550 AU that previous analyses have suggested.

    It is a simple matter to show that this mission could only have a single objective. To point at a different object just 1 degree away, the telescope would have to move at least 10 AU around the sun, equivalent to the distance from Earth to Saturn. “A significant difference of the solar gravitational lens from a conventional telescope is that the gravitational lens telescope is not in any practical sense pointable,” says Landis.
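    That 10 AU figure is just the arc length subtended by one degree at the lens’s focal distance (my own arithmetic):

    ```python
    import math

    # Repointing cost: shifting the line of sight by one degree from a
    # position ~550 AU out means moving along an arc of d * (1 degree in
    # radians) around the sun.
    d_au = 550
    arc_au = d_au * math.radians(1)
    print(round(arc_au, 1))   # ~9.6 AU, about the Earth-Saturn distance
    ```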

    But given a specific target, the focal power of the sun produces a hugely magnified view. To demonstrate its potential, Landis uses the hypothetical example of an exoplanet orbiting a star some 35 light years away. If this planet were the same size as the Earth, the image at the focal plane of the sun would be 12.5 kilometers across.

    So the mission could only ever see a small fraction of the planet’s surface. Indeed, a telescope with a one-meter detector would image a one-kilometer-square area on the surface of the planet—that’s smaller than New York’s Central Park.
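    Those image-scale numbers follow from simple similar-triangle geometry: the image is shrunk by the ratio of the telescope’s distance from the sun to the planet’s distance from it. A sketch of my own, assuming the telescope sits about 2,200 AU out (consistent with the more-than-2,000 AU corona constraint above):

    ```python
    # Image geometry by similar triangles: image size / object size equals
    # telescope distance / target distance.
    AU_PER_LY = 63_241
    d_star_au = 35 * AU_PER_LY       # exoplanet 35 light years away
    d_scope_au = 2_200               # assumed telescope distance from the sun
    scale = d_scope_au / d_star_au

    earth_diameter_km = 12_742
    image_km = earth_diameter_km * scale
    print(round(image_km, 1))        # ~12.7 km, close to the 12.5 km quoted

    patch_m = 1.0 / scale            # ground patch seen by a 1-meter detector
    print(round(patch_m))            # ~1,000 m: about one kilometer
    ```

    The exact numbers shift with the assumed telescope distance, but the scaling itself is fixed by the geometry.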

    Pointing a telescope at an area so small and distant is tricky. There can be no “finder scope” on such a telescope because the target would be invisible except when using the gravity lens. So the exoplanet’s position will have to be known with high precision.

    Even then, pointing it will not be trivial. “Finding a planet of diameter ~10^4 km at a distance of 10^14 km requires a pointing knowledge and pointing accuracy of 0.1 nanoradians,” says Landis. State-of-the-art pointing accuracy is today about 10 nanoradians.

    But that’s just the start. The exoplanet will be moving as it orbits its star. Landis analyzes what would happen if the exoplanet has the same orbital velocity as the Earth, 30 km/sec. In that case, a one-kilometer section of the planet will traverse a one-meter detector in just 33 milliseconds and the entire planet will slip past in 42 seconds.

    Preventing blur by moving the telescope to track the image will be hard. Landis says that the spacecraft will need to change its velocity by 30 meters per second to keep up, and that over the course of a year it would follow an ellipse with a semi-major axis of about 150,000 kilometers. It’s not clear what kind of propulsion system would be capable of this.
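    The size of that tracking ellipse can be sanity-checked: a point moving at roughly 30 meters per second around a near-circular path with a one-year period has a semi-major axis of about v·T/2π (my own arithmetic):

    ```python
    import math

    # Sanity check on the tracking orbit: speed times period gives the
    # circumference, and dividing by 2*pi gives the semi-major axis of a
    # near-circular path.
    v = 30.0                         # m/s, speed needed to track the image
    T = 365.25 * 24 * 3600           # one year, in seconds
    a_km = v * T / (2 * math.pi) / 1000
    print(f"{a_km:,.0f} km")         # ~150,000 km, consistent with Landis
    ```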

    The alternative, of course, is to use image processing techniques to remove the blur, which is increasingly doable with today’s technology.

    Another major problem is filtering out the light from the sun, not to mention the exoplanet’s parent star, which will be orders of magnitude brighter than the target. The telescope will also have to minimize interference from other sources such as zodiacal light. Much effort has been put into this for the current generation of planet-hunting telescopes. Nevertheless, Landis says, this is not a trivial problem.

    Given all these problems, how much better would the image from a gravitational lens be than an unlensed image? Landis’s estimate is that the lens increases the intensity of light from the exoplanet by a factor of 100,000.

    That’s a significant advantage. But it can only be realized if the exoplanet light can be well separated from the light from other sources such as the sun, the corona, the parent star, and so on. And this is a big unknown.

    The utility of the mission depends on this. “Given all the difficulties, is it worth traveling out to beyond 600 AU to merely gain a factor of 100,000? Is this enough?” asks Landis.

    That’s a question that astronomers, funding agencies, and the public at large will have to consider in some detail. Landis makes no suggestion that such a mission should be undertaken now or is even possible or affordable. But his analysis has certainly raised the stakes.

    Going further, it seems hard to overstate the significance of finding an Earth analogue that has the potential to support life. The idea of mapping areas on this planet just one kilometer in size will be a powerful motivation.

    On Earth, this kind of image would reveal islands, rivers, parks, Great Walls, freeways, cities, and so on. Perhaps a spacecraft sitting at the gravitational focus of a distant star is revealing these things right now to a spellbound alien population. Just imagine.

    Ref: arxiv.org/abs/1604.06351: Mission to the Gravitational Focus of the Sun: A Critical Analysis

