Tagged: MIT Technology Review

  • richardmitnick 4:02 pm on May 5, 2017 Permalink | Reply
    Tags: Astrophysicists Turn GPS Satellite Constellation into Giant Dark Matter Detector, MIT Technology Review

    From MIT Tech Review: “Astrophysicists Turn GPS Satellite Constellation into Giant Dark Matter Detector” 

    MIT Technology Review

    May 4, 2017
    Emerging Technology from the arXiv
    If Earth is sweeping through an ocean of dark matter, the effects should be visible in clock data from GPS satellites.


    The Global Positioning System consists of 31 Earth-orbiting satellites, each carrying an atomic clock that sends a highly accurate timing signal to the ground. Anybody with an appropriate receiver can work out their position to within a few meters by comparing the arrival time of signals from three or more satellites.
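
    To make the positioning idea concrete, here is a minimal sketch of the underlying multilateration math, with invented satellite coordinates and an invented receiver clock offset; real receivers also model satellite orbits, atmospheric delays, and relativistic corrections.

```python
# Minimal sketch of GPS-style multilateration (illustrative only: real receivers
# also model satellite orbits, atmospheric delay, and relativistic corrections).
# Satellite positions, the receiver location, and the clock bias are invented.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

sats = np.array([            # invented satellite positions (m), Earth-centered frame
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,  6_100e3, 18_390e3],
])
true_pos = np.array([6_371e3, 0.0, 0.0])   # receiver roughly on Earth's surface
true_bias = 1e-4                           # receiver clock offset, s

# Simulated pseudoranges: geometric range plus the clock-bias term.
rho = np.linalg.norm(sats - true_pos, axis=1) + C * true_bias

# Gauss-Newton solve for (x, y, z, clock bias), starting from Earth's center.
x = np.zeros(4)
for _ in range(10):
    d = np.linalg.norm(sats - x[:3], axis=1)
    J = np.hstack([-(sats - x[:3]) / d[:, None], C * np.ones((len(sats), 1))])
    residual = rho - (d + C * x[3])
    x += np.linalg.lstsq(J, residual, rcond=None)[0]

print("estimated position (m):", x[:3])      # should recover true_pos
print("estimated clock bias (s):", x[3])     # should recover true_bias
```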

    And this system can easily be improved. The accuracy of GPS signals can be made much higher by combining the signals with ones produced on the ground. Geophysicists, for example, use this technique to determine the position of ground stations to within a few millimeters. In this way, they can measure the tiny movements of entire continents.

    This is an impressive endeavor. Geophysicists routinely measure the difference between GPS signals and ground clocks with an accuracy of better than 0.1 nanoseconds. They also archive this data, providing a detailed record of how GPS signals have changed over time. That archive opens the possibility of using the data for other, more exotic studies.

    Today Benjamin Roberts at the University of Nevada and a few pals say they have used this data to find out whether GPS satellites may have been influenced by dark matter, the mysterious invisible stuff that astrophysicists think fills our galaxy. In effect, these guys have turned the Global Positioning System into an astrophysical observatory of truly planetary proportion.

    The theory behind dark matter is based on observations of the way galaxies rotate. This spinning motion is so fast that it should send stars flying off into extragalactic space.

    But this doesn’t happen. Instead, a mysterious force must somehow hold the stars in place. The theory is that this force is gravity generated by invisible stuff that doesn’t show up in astronomical observations. In other words, dark matter.

    If this theory is correct, dark matter should fill our galaxy, too, and as the sun makes its stately orbit round the galactic center, Earth should plough through a great ocean of dark matter.

    There’s no obvious sign of this stuff, which makes physicists think it must interact very weakly with ordinary visible matter. But they hypothesize that if dark matter exists in small, atom-sized lumps, it might occasionally hit an atomic nucleus head on, transferring its energy to visible matter.

    That’s why astrophysicists have built giant observatories in underground mines to look for the tell-tale energy released in these collisions. So far, they’ve seen nothing. Or at least, there is no consensus that anybody has seen evidence of dark matter. So other ways to look for dark matter are desperately needed.

    Enter Roberts and co. They start with a different vision of what dark matter may consist of. Instead of small particles, dark matter may take the form of topological defects in space-time left over from the Big Bang. These would be glitches in the fabric of the universe, like domain walls, that bend space-time in their vicinity.

    Should the Earth pass through such a defect, it would change the local gravitational field just slightly over a period of an hour or so.

    But how to detect such a change in the local field? To Roberts and co, the answer is clear. According to relativity, any change in gravity also changes the rate at which a clock ticks. That’s why orbiting clocks run at a slightly different rate than those on the surface (at GPS altitudes, the gravitational effect dominates, so the satellite clocks actually run a little fast).
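
    A back-of-the-envelope check of the size of this effect for a GPS orbit, using the standard weak-field formulas and approximate orbital parameters (not figures from the paper):

```python
# Back-of-the-envelope relativistic rate offset for a GPS clock, using the
# standard weak-field formulas; orbital parameters are approximate.
G  = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M  = 5.972e24    # Earth mass, kg
c  = 2.998e8     # speed of light, m/s
Re = 6.371e6     # Earth radius, m
r  = 2.6561e7    # GPS orbital radius (~20,200 km altitude), m

# Gravitational term: a clock higher in the potential ticks faster.
grav = (G * M / Re - G * M / r) / c**2
# Velocity term: a moving clock ticks slower (circular-orbit speed: v^2 = GM/r).
vel = -(G * M / r) / (2 * c**2)

day = 86_400
print(f"gravitational shift: {grav * day * 1e6:+.1f} microseconds/day")  # ~ +45.7
print(f"velocity shift:      {vel  * day * 1e6:+.1f} microseconds/day")  # ~  -7.2
print(f"net offset:          {(grav + vel) * day * 1e6:+.1f} microseconds/day")
```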

    If the Earth has passed through any topological defects in the recent past, the clock data from GPS satellites would have recorded this event. So by searching through geophysicists’ archived records of GPS clock timings, it ought to be possible to see such events.

    That’s the theory. In practice, the work is a little more complicated because GPS timing signals are also influenced by other factors, such as atmospheric conditions and random noise, and all of these need to be taken into account.

    But a key signature of a topological defect is that its influence should sweep through the fleet of satellites as the Earth passes through it. So any other kinds of local timing fluctuation can be ruled out.
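
    As a rough illustration of that signature (not the authors' actual analysis), the toy simulation below assumes a wall sweeping past at a galactic-scale speed of about 300 km/s and computes when it would cross each satellite; the positions and sweep direction are invented.

```python
# Toy illustration (not the authors' analysis): a domain wall sweeping through the
# GPS constellation hits each satellite at a time set by its position along the
# sweep direction. Satellite positions, speed, and direction are all invented.
import numpy as np

rng = np.random.default_rng(0)
v_sweep = 300e3                        # assumed sweep speed, m/s (galactic scale)
direction = np.array([1.0, 0.0, 0.0])  # assumed sweep direction (unit vector)

n_sats = 31
pos = rng.normal(size=(n_sats, 3))
pos = 2.66e7 * pos / np.linalg.norm(pos, axis=1, keepdims=True)  # ~26,600 km shell

# Wall-crossing time of each satellite, relative to crossing Earth's center.
t_cross = pos @ direction / v_sweep
order = np.argsort(t_cross)

print(f"sweep crosses the constellation in ~{t_cross.max() - t_cross.min():.0f} s")
print("first five satellites hit (index, crossing time in s):")
for i in order[:5]:
    print(f"  sat {i:2d}  {t_cross[i]:+7.1f}")
```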

    Roberts and co study the data over the last 16 years, and their results make for interesting reading. These guys say they have found no sign that Earth has passed through a topological defect in that time. “We find no evidence for dark matter clumps in the form of domain walls,” they say.

    Of course, that doesn’t rule out the existence of dark matter or even that dark matter exists in this form. But it does place strong limits on how common topological defects can be and how strong their influence is.

    Until now, the limits have been set using observations of the cosmic microwave background radiation, which should reveal topological defects, albeit at low resolution. The work of Roberts and co improves these limits by five orders of magnitude.

    And better data should be available soon. The best clocks in Earth laboratories are orders of magnitude more accurate than the atomic clocks on board GPS satellites. So a network of clocks on Earth should act as an even more sensitive observatory for topological defects. These clocks are only just becoming linked together in networks, so the data from them should be available in the coming years.

    This greater sensitivity should allow physicists to look for other types of dark matter, which may take the form of solitons or Q-balls, for example.

    All this is part of a fascinating process of evolution. The technology behind the GPS system can be traced directly back to the first attempts to track the Sputnik spacecraft after the Soviets launched it in 1957. Physicists soon realized they could determine its location by measuring the radio signals it generated at different places.

    It wasn’t long before they turned this idea on its head. Given the known location of a satellite, is it possible to determine your location on Earth using the signals it broadcasts? The GPS constellation is a direct descendant of that train of thought.

    Those physicists would surely be amazed to know that the technology they developed is also now being used as a planetary-sized astrophysical observatory.

    Ref: arxiv.org/abs/1704.06844: GPS as a Dark-Matter Detector: Orders-of-Magnitude Improvement on Couplings of Clumpy Dark Matter to Atomic Clocks

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 3:44 pm on February 23, 2017 Permalink | Reply
    Tags: Magnetic resonance imaging, MIT Technology Review, University of Melbourne   

    From MIT Tech Review: “This Microscope Reveals Human Biochemistry at Previously Unimaginable Scales” 

    MIT Technology Review

    February 23, 2017


    Magnetic resonance imaging is one of the miracles of modern science. It produces noninvasive 3-D images of the body using harmless magnetic fields and radio waves. And with a few additional tricks, it can also reveal details of the biochemical makeup of tissue.

    Atomic-scale MRI holds promise for new drug discovery | The Melbourne Newsroom

    That biochemical trick is called magnetic resonance spectroscopy, and it is a powerful tool for physicians and researchers studying the biochemistry of the body, including metabolic changes in tumors in the brain and in muscles.

    But this technique is not perfect. The resolution of magnetic resonance spectroscopy is limited to length scales of about 10 micrometers. And there is a world of chemical and biological activity at smaller scales that scientists simply cannot access in this way.

    So physicians and researchers would dearly love to have a magnetic resonance microscope that can study body tissue and the biochemical reactions within it at much smaller scales.

    Today, David Simpson and pals at the University of Melbourne in Australia say they have built a magnetic resonance microscope with a resolution of just 300 nanometers that can study biochemical reactions on previously unimaginable scales. Their key breakthrough is an exotic diamond sensor that creates magnetic resonance images in a similar way to a light sensitive CCD chip in a camera.

    Magnetic resonance imaging works by placing a sample in a magnetic field so powerful that the atomic nuclei all become aligned; in other words, they all spin the same way. When these nuclei are zapped with radio waves, the nuclei become excited and then emit radio waves as they relax. By studying the pattern of re-emitted radio waves, it is possible to work out where they have come from and so build up a picture of the sample.
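
    As a minimal worked example of the physics described above, the radio frequency that excites hydrogen nuclei (the Larmor frequency) is fixed by the field strength; the 3-tesla field below is a typical clinical value, not one from this study.

```python
# Minimal worked example: the radio frequency that excites hydrogen nuclei (the
# Larmor frequency) is set by the field strength, f = gamma * B / (2 * pi).
# A 3 T field is a typical clinical value, not one from this study.
import math

gamma_proton = 2.675e8   # 1H gyromagnetic ratio, rad s^-1 T^-1
B = 3.0                  # magnetic field, tesla

f_larmor = gamma_proton * B / (2 * math.pi)
print(f"1H Larmor frequency at {B} T: {f_larmor / 1e6:.1f} MHz")   # ~127.7 MHz
```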

    The signals also reveal how the atoms are bonded to each other and the biochemical processes at work. But the resolution of this technique is limited by how closely the radio receiver can get to the sample.

    Enter Simpson and co, who have built an entirely new kind of magnetic resonance sensor out of diamond film. The secret sauce in this sensor is an array of nitrogen atoms that have been embedded in a diamond film at a depth of about seven nanometers and about 10 nanometers apart.

    Nitrogen atoms are useful because when embedded in diamond, they can be made to fluoresce. And when in a magnetic field, the color they produce is highly sensitive to the spin of atoms and electrons nearby or, in other words, to the local biochemical environment.

    So in the new machine, Simpson and co place their sample on top of the diamond sensor in a powerful magnetic field and zap it with radio waves. Any change in the state of nearby nuclei causes the nitrogen array to fluoresce in various colors, and the array of nitrogen atoms produces a kind of image, just like a light-sensitive CCD chip. All Simpson and co do is monitor this fireworks display to see what’s going on.

    To put the new technique through its paces, Simpson and co study the behavior of hexaaqua copper(2+) complexes in aqueous solution. Hexaaqua copper is present in many enzymes, which use it to incorporate copper into metalloproteins. However, the distribution of copper during this process, and the role it plays in cell signaling, are poorly understood because they are impossible to visualize in vivo.

    Simpson and co show how this can now be done using their new technique, which they call quantum magnetic resonance microscopy. They show how their new sensor can reveal the spatial distribution of copper(2+) ions in volumes of just a few attoliters and at high resolution. “We demonstrate imaging resolution at the diffraction limit (~300 nm) with spin sensitivities in the zeptomol (10⁻²¹) range,” say Simpson and co. They also show how the technique reveals the redox reactions that the ions undergo. And they do all this at room temperature.
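
    A quick sanity check of those quoted numbers: a zeptomole is only a few hundred ions, roughly what a millimolar solution contains in a single attoliter (the 1 mM concentration below is an illustrative assumption, not a figure from the paper).

```python
# Sanity check of the quoted sensitivity: a zeptomole is only a few hundred ions,
# roughly what a millimolar solution contains in one attoliter. The 1 mM
# concentration is an illustrative assumption, not a figure from the paper.
N_A = 6.022e23                     # Avogadro's number, 1/mol

ions_per_zeptomole = 1e-21 * N_A
print(f"ions in 1 zeptomole: ~{ions_per_zeptomole:.0f}")            # ~600

conc = 1e-3                        # assumed Cu(2+) concentration, mol/L
volume_L = 1e-18                   # one attoliter, in liters
ions_in_voxel = conc * volume_L * N_A
print(f"Cu(2+) ions in 1 aL at 1 mM: ~{ions_in_voxel:.0f}")         # ~600
```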

    That’s impressive work that has important implications for the future study of biochemistry. “The work demonstrates that quantum sensing systems can accommodate the fluctuating Brownian environment encountered in ‘real’ chemical systems and the inherent fluctuations in the spin environment of ions undergoing ligand rearrangement,” say Simpson and co.

    That makes it a powerful new tool that could change the way we understand biological processes. Simpson and co are optimistic about its potential. “Quantum magnetic resonance microscopy is ideal for probing fundamental nanoscale biochemistry such as binding events on cell membranes and the intra‐cellular transition metal concentration in the periplasm of prokaryotic cells.”

    Ref: arxiv.org/abs/1702.04418: Quantum Magnetic Resonance Microscopy

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 12:48 pm on January 8, 2017 Permalink | Reply
    Tags: A test that will detect all of the major cancer types, MIT Technology Review

    From MIT Tech Review: “Liquid Biopsies Are About to Get a Billion Dollar Boost”

    MIT Technology Review

    January 6, 2017
    Michael Reilly

    A billion dollars sounds like a lot of money. But when your ambitions are as big as the cancer-detection startup Grail Bio’s are, it might not be enough.

    As CEO and ex-Googler Jeff Huber puts it, Grail’s aim is to create “a test that will detect all of the major cancer types.” Already the recipient of $100 million in funding from DNA sequencing company Illumina and a series of tech luminaries, Grail believes that adding another zero to its cash balance will put its lofty goals within reach. The company announced Thursday that it plans to raise $1 billion, has “indications of interest” from investors, and would move quickly to secure the hefty cash infusion.

    Whether Grail succeeds turns on the company’s ability to dramatically expand an emerging technology known as the liquid biopsy. It works by sequencing DNA from someone’s blood and looking for tell-tale fragments that indicate the presence of cancer. Dennis Lo, a doctor in Hong Kong, was among the first to show the technique’s promise. He’d previously used it to detect fetal DNA in a mother’s bloodstream. That led to a much safer form of screening for Down’s syndrome that is now in wide use.
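
    At its core, a liquid biopsy reduces to a counting problem: what fraction of cell-free DNA fragments in a blood draw carry tumor-associated mutations? The sketch below illustrates only that bookkeeping, with invented reads and invented variant labels; real pipelines add error correction, molecular barcoding, and far more statistics.

```python
# Toy sketch of the core liquid-biopsy question: what fraction of sequenced
# cell-free DNA fragments carry a known tumor-associated variant? The reads and
# variant labels are invented; real pipelines are far more involved.
from collections import Counter

tumor_variants = {"KRAS:G12D", "TP53:R175H"}          # hypothetical variant labels

# Each "read" has (hypothetically) been annotated with any variant it overlaps.
reads = (["wildtype"] * 9_970
         + ["KRAS:G12D"] * 20
         + ["TP53:R175H"] * 10)

counts = Counter(reads)
tumor_reads = sum(counts[v] for v in tumor_variants)
fraction = tumor_reads / len(reads)
print(f"apparent circulating tumor DNA fraction: {fraction:.2%}")   # ~0.30%
```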

    Lo has experimented with liquid biopsy as a way to catch liver and nasopharyngeal cancers, with some encouraging results. But he urged caution in assuming the technique could be translated to all cancers.

    Grail, which was spun out of Illumina about a year ago, has launched its first trials to see whether liquid biopsies can spot cancers earlier and more reliably than other screening tests.

    For his part, Huber seems to understand that he’s got a mountain to climb. Having lost his wife to colorectal cancer, he considers Grail’s mission deeply personal. He acknowledges that detecting cancer DNA may be difficult, because the disease mutates rapidly as it advances and varies immensely from one type to another. He says his company will rely on sequencing the DNA of tens of thousands of subjects to build a library of cancer DNA that computers can then decipher.

    Beyond the high-minded talk of turning the tide in the war against cancer, though, is a more cynical reading of the situation. As a unit within Illumina, Grail was an expensive, long-shot bet to create a new market for its gene sequencing machines. As a separate, now cash-rich company, Grail figures to become one of Illumina’s biggest customers. And venture capital will foot the bill, whether or not the experiment works.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 3:16 pm on December 23, 2016 Permalink | Reply
    Tags: Intel and Competitors IBM and Google, MIT Technology Review

    From MIT Tech Review: “Intel Bets It Can Turn Everyday Silicon into Quantum Computing’s Wonder Material” 

    MIT Technology Review

    December 21, 2016
    Tom Simonite

    The world’s largest chip company sees a novel path toward computers of immense power.

    Researchers at TU Delft in the Netherlands use equipment like this to test quantum computing devices at supercool temperatures, in a collaboration with chip maker Intel.

    Sometimes the solution to a problem is staring you in the face all along. Chip maker Intel is betting that will be true in the race to build quantum computers—machines that should offer immense processing power by exploiting the oddities of quantum mechanics.

    Competitors IBM, Microsoft, and Google are all developing quantum components that are different from the ones crunching data in today’s computers. But Intel is trying to adapt the workhorse of existing computers, the silicon transistor, for the task.

    Intel has a team of quantum hardware engineers in Portland, Oregon, who collaborate with researchers in the Netherlands, at TU Delft’s QuTech quantum research institute, under a $50 million grant established last year. Earlier this month Intel’s group reported that they can now layer the ultra-pure silicon needed for a quantum computer onto the standard wafers used in chip factories.

    This strategy makes Intel an outlier among industry and academic groups working on qubits, as the basic components needed for quantum computers are known. Other companies can run code on prototype chips with several qubits made from superconducting circuits (see Google’s Quantum Dream Machine). No one has yet advanced silicon qubits that far.

    A quantum computer would need to have thousands or millions of qubits to be broadly useful, though. And Jim Clarke, who leads Intel’s project as director of quantum hardware, argues that silicon qubits are more likely to get to that point (although Intel is also doing some research on superconducting qubits). One thing in silicon’s favor, he says: the expertise and equipment used to make conventional chips with billions of identical transistors should allow work on perfecting and scaling up silicon qubits to progress quickly.

    Intel’s silicon qubits represent data in a quantum property called the “spin” of a single electron trapped inside a modified version of the transistors in its existing commercial chips. “The hope is that if we make the best transistors, then with a few material and design changes we can make the best qubits,” says Clarke.

    Another reason to work on silicon qubits is that they should be more reliable than the superconducting equivalents. Still, all qubits are error prone because they work on data using very weak quantum effects (see Google Researchers Make Quantum Components More Reliable).

    The new process that helps Intel experiment with silicon qubits on standard chip wafers, developed with the materials companies Urenco and Air Liquide, should help speed up its research, says Andrew Dzurak, who works on silicon qubits at the University of New South Wales in Australia. “To get to hundreds of thousands of qubits, we will need incredible engineering reliability, and that is the hallmark of the semiconductor industry,” he says.

    Companies developing superconducting qubits also make them using existing chip fabrication methods. But the resulting devices are larger than transistors, and there is no template for how to manufacture and package them up in large numbers, says Dzurak.

    Chad Rigetti, founder and CEO of Rigetti Computing, a startup working on superconducting qubits similar to those Google and IBM are developing, agrees that this presents a challenge. But he argues that his chosen technology’s head start will afford ample time and resources to tackle the problem.

    Google and Rigetti have both said that in just a few years they could build a quantum chip with tens or hundreds of qubits that dramatically outperforms conventional computers on certain problems, even doing useful work on problems in chemistry or machine learning.

    No science papers cited.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 3:38 pm on December 7, 2016 Permalink | Reply
    Tags: MIT Technology Review

    From MIT Tech Review: “Personalized Cancer Vaccine Prevents Leukemia Relapse in Patients” 

    MIT Technology Review

    December 7, 2016
    Emily Mullin

    Shortly after Ernest Levy of Cooperstown, New York, returned from a trip to South Africa with his son for the 2010 World Cup, he was diagnosed with acute myeloid leukemia. The prognosis didn’t look good for Levy, now 76. Just over a quarter of adult patients survive five years after developing the disease, a type of cancer that affects bone marrow.

    Levy joined a clinical trial led by the Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School in Boston, testing a cancer vaccine for acute myeloid leukemia. After an initial round of chemotherapy, he and the other trial participants received the experimental vaccine, a type of immunotherapy intended to “reëducate” the immune cells to see cancer cells as foreign and attack them, explains David Avigan, chief of Hematological Malignancies and director of the Cancer Vaccine Program at Beth Israel.

    Now results from the trial suggest that the vaccine was able to stimulate powerful immune responses against cancer cells and protect a majority of patients from relapse—including Levy. Out of 17 patients with an average age of 63 who received the vaccine, 12 are still in remission four years or more after receiving the vaccine, Avigan and his co-authors at the Dana-Farber Cancer Institute report. The researchers found expanded levels of immune cells that recognize acute myeloid leukemia cells after vaccination. The results appear today in the journal Science Translational Medicine.

    Acute myeloid leukemia is typically treated with a combination of chemotherapies, but the cancer often relapses after initial treatment, with older patients having a higher chance of relapse.

    Therapeutic cancer vaccines are designed to work by activating immune cells called T cells and directing them to recognize and act against cancer cells, or by spurring the production of antibodies that bind to certain molecules on the surface of cancer cells. But producing effective therapeutic vaccines has proved challenging, with many of these vaccines either failing outright or showing only marginal increases in survival rates in clinical trials.

    Avigan and his colleagues created a personalized vaccine by taking leukemia cells from patients and then freezing them for preservation while they received a traditional chemotherapy. Then scientists thawed the cancer cells and combined them with dendritic cells, immune cells that unleash tumor-fighting T cells. The vaccine took about 10 days to manufacture and another three to four weeks before it was ready for administration.

    Many cancer vaccine strategies have homed in on a single target, or antigen. When the antigen is introduced in the body via injection, it causes an immune response. The body begins to produce T cells that recognize and attack the same antigen on the surface of cancer cells. The vaccine Avigan and his team created uses a mixture of cells that contain many antigens in an attempt to generate a more potent approach.

    Though the number of patients in the trial was small, Avigan says, “this was enough of a provocative finding” that the researchers will be expanding the trial to include more patients. At the same time, the personalized vaccine approach is already being tested in other types of cancers.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 2:42 pm on August 18, 2016 Permalink | Reply
    Tags: MIT Technology Review

    From MIT Tech Review: “New Brain-Mapping Technique Captures Every Connection Between Neurons” 

    MIT Technology Review

    August 18, 2016
    Ryan Cross

    The human brain is among the universe’s greatest remaining uncharted territories. And as with any mysterious land, the secret to understanding it begins with a good map.

    Neuroscientists have now taken a huge step toward the goal of mapping the connections between neurons in the brain using bits of genetic material to bar-code each individual brain cell. The technique, called MAP-seq, could help researchers study disorders like autism and schizophrenia in unprecedented detail.

    “We’ve got the basis for a whole new technology with a gazillion applications,” says Anthony Zador, a neuroscientist at Cold Spring Harbor Laboratory who came up with the technique.

    Current methods for mapping neuronal connections, known as the brain’s connectome, commonly rely on fluorescent proteins and microscopes to visualize cells, but they are laborious and have difficulty following the connections of many neurons at once.

    MAP-seq works by first creating a library of viruses that contain randomized RNA sequences. This mixture is then injected into the brain, and approximately one virus enters each neuron in the injection area, granting each cell a unique RNA bar code. The brain is then sliced and diced into orderly sections for processing. A DNA sequencer reads the RNA bar codes, and researchers create a connectivity matrix that displays how individual neurons connect to other regions of the brain.
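
    A toy version of that bookkeeping, with invented barcodes, regions, and read counts, shows how barcode tallies per dissected region become a neuron-by-region connectivity matrix.

```python
# Toy version of the MAP-seq bookkeeping: barcode read counts, tallied per
# dissected target region, become a neuron-by-region connectivity matrix.
# Barcodes, regions, and counts are invented for illustration.
import numpy as np

barcodes = ["ACGTTAGC", "TTGACCGA", "GGCATTAC"]        # one per source neuron
regions  = ["olfactory bulb", "cortex", "cerebellum"]  # dissected target regions

# (barcode, region) -> number of sequencing reads recovered from that region.
read_counts = {
    ("ACGTTAGC", "olfactory bulb"): 950, ("ACGTTAGC", "cortex"): 12,
    ("TTGACCGA", "cortex"): 400,         ("TTGACCGA", "cerebellum"): 380,
    ("GGCATTAC", "olfactory bulb"): 5,   ("GGCATTAC", "cerebellum"): 700,
}

matrix = np.zeros((len(barcodes), len(regions)))
for (bc, region), n in read_counts.items():
    matrix[barcodes.index(bc), regions.index(region)] = n

# Normalize each neuron's row so entries read as relative projection strengths.
matrix /= matrix.sum(axis=1, keepdims=True)
for bc, row in zip(barcodes, matrix):
    print(bc, np.round(row, 2))
```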

    The newly published study, which appears Thursday in the journal Neuron, follows the sprawling outbound connections from 1,000 mouse neurons in a brain region called the locus coeruleus to show that the technique works. But Zador says the results actually reconcile previously conflicting findings about how those neurons connect across the brain.

    Justus Kebschull, who worked with Zador in developing MAP-seq, says the technique is getting better. “We’re now mapping out 100,000 cells at a time, in one week, in one experiment,” he says. “That was previously only possible if you put a ton of work in.”

    Both autism and schizophrenia are viewed as disorders that may arise from dysfunctional brain connectivity. There are perhaps hundreds of genetic mutations that may slightly alter the brain’s wiring as it develops. “We are looking at mouse models where something is mucked up. And now that the method is so fast, we can look at many mouse models,” Kebschull says. By comparing the brain circuitry in mice with different candidate genes for autism, researchers expect, they’ll get new insight into the condition.

    “I think it is a great method that has a lot of room to grow,” says Je Hyuk Lee, a molecular biologist at Cold Spring Harbor Laboratory, who was not part of the MAP-seq study. Although other groups have used similar bar-coding to study individual differences between cells, no one knew if the bar codes would be able to travel along the neuronal connections across the brain. “That had been conjectured but never shown, especially not at this scale,” Lee says.

    Zador says that as of now, his lab is the only one bar-coding the brain, but he hopes others will start using MAP-seq to chart the brain’s circuitry. “Because the cost of sequencing is continuing to plummet, we can envision doing this quickly and cheaply,” he said. It may not be long, then, before a complete map of the brain is ready for its first explorer to use.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 10:51 am on June 11, 2016 Permalink | Reply
    Tags: Genetic replacement therapy, MIT Technology Review, Strimvelis

    From MIT Tech Review: “Gene Therapy’s First Out-and-Out Cure Is Here” 

    MIT Technology Review

    May 6, 2016
    Antonio Regalado

    An artist’s illustration of gene therapy shows a retrovirus harboring a correct copy of a human gene.

    A treatment now pending approval in Europe will be the first commercial gene therapy to provide an outright cure for a deadly disease.

    The treatment is a landmark for gene-replacement technology, an idea that’s struggled for three decades to prove itself safe and practical.

    Called Strimvelis, and owned by drug giant GlaxoSmithKline, the treatment is for severe combined immune deficiency, a rare disease that leaves newborns with almost no defense against viruses, bacteria, or fungi and is sometimes called “bubble boy” disease after an American child whose short life inside a protective plastic shield was described in a 1976 movie.

    The treatment is different from any that’s come before because it appears to be an outright cure carried out through a genetic repair. The therapy was tested on 18 children, the first of them 15 years ago. All are still alive.

    “I would be hesitant to call it a cure, although there’s no reason to think it won’t last,” says Sven Kili, the executive who heads gene-therapy development at GSK.

    The British company licensed the treatment in 2010 from the San Raffaele Telethon Institute for Gene Therapy, in Milan, Italy, where it was developed and first tested on children.

    On April 1, European advisors recommended that Strimvelis be allowed on the market. If, as expected, GSK wins formal authorization, it can start selling the drug in 27 European countries. GSK plans to seek U.S. marketing approval next year.

    GSK is the first large drug company to seek to market a gene therapy to treat any genetic disease. If successful, the therapeutic could signal a disruptive new phase in medicine in which one-time gene fixes replace trips to the pharmacy or lifelong dependence on medication.

    “The idea that you don’t have to worry about it and can be normal is extremely exciting for people,” says Marcia Boyle, founder and president of the Immune Deficiency Foundation, whose son was born with a different immune disorder, one of more than 200 known to exist. “I am a little guarded on gene therapy because we were all excited a long time ago, and it was not as easy to fool Mother Nature as people had hoped.”

    Today, several hundred gene therapies are in development, and many aspire to be out-and-out cures for one of about 5,000 rare diseases caused by errors in a single gene.

    Children who lack correct copies of a gene called adenosine deaminase begin to get life-threatening infections days after birth. The current treatment for this immune deficiency, known as ADA-SCID, is a bone marrow transplant, which itself is sometimes fatal, or lifelong therapy with replacement enzymes that cost $5,000 a vial.

    Strimvelis uses a “repair and replace” strategy, so called because doctors first remove stem cells from a patient’s bone marrow then soak them with viruses to transfer a correct copy of the ADA gene.

    “What we are talking about is ex vivo gene therapy—you pull out the cells, correct them in a test tube, and put the cells back,” says Maria-Grazia Roncarolo, a pediatrician and scientist at Stanford University who led the original Milan experiments. “If you want to fix a disease for life, you need to put the gene in the stem cells.”

    Some companies are trying to add corrected genes using direct injections into muscles or the eye. But the repair-and-replace strategy may have the larger impact. As soon as next year, companies like Novartis and Juno Therapeutics may seek approval for cancer treatments that also use a combination of gene and cell therapy to obliterate one type of leukemia.

    Overall, investment in gene therapy is booming. The Alliance for Regenerative Medicine says that globally, in 2015, public and private companies raised $10 billion, and about 70 treatments are in late-stage testing.

    GSK has never sold a product so drastically different from a bottle of pills. And because ADA-SCID is one of the rarest diseases on Earth, Strimvelis won’t be a blockbuster. GSK estimates there are only about 14 cases a year in Europe, and 12 in the U.S.

    Instead, the British company hopes to master gene-therapy technology, including virus manufacturing. “If we can first make products that change lives, then we can develop them into things that affect more people,” says Kili. “We believe gene therapy is an area of important future growth; we don’t want to rush or cut corners.”

    Markets will closely scrutinize how much GSK charges for Strimvelis. Kili says a final decision hasn’t been made. Another gene therapy, called Glybera, debuted with a $1 million price tag but is already considered a commercial flop. The dilemma is how to bill for a high-tech drug that people take only once.

    Kili says GSK’s price won’t be anywhere close to a million dollars, though it will be enough to meet a company policy of getting a 14 percent return on every dollar spent on R&D.

    The connection between immune deficiency and gene therapy isn’t new. In fact, the first attempt to correct genes in a living person occurred in 1990, also in a patient with ADA-SCID.

    By 2000, teams in London and France had cured some children of a closely related immune deficiency, X-linked SCID, the bubble boy disease. But some of those children developed leukemia after the viruses dropped their genetic payloads into the wrong part of the genome.

    In the U.S., the Food and Drug Administration quickly canceled 27 trials over safety concerns. “It was a major step back,” says Roncarolo, and probably a more serious red flag even than the death of a volunteer named Jesse Gelsinger in a U.S. trial in 1999, which also drew attention to gene therapy’s risks.

    The San Raffaele Telethon Institute for Gene Therapy presented its own results in ADA-SCID, which also affects girls, in 2002 in the journal Science. Like the French, they’d also apparently cured patients, and because of differences in their approach, they didn’t run the same cancer risk.

    GSK says it is moving toward commercializing several other gene therapies for rare diseases developed by the Italian team, including treatments for metachromatic leukodystrophy, a rare but rapidly fatal birth defect, and for beta thalassemia.

    Kili says the general idea is to leapfrog from ultra-rare diseases to less rare ones, like beta thalassemia, hemophilia, and sickle cell disease. However, he doubts the technology will be used to treat common conditions such as arthritis or heart disease anytime soon. Those conditions are complex and aren’t caused by a defect in just one gene.

    “Honestly, as we stand at the moment, I don’t think gene therapy will address all the ills or ailments of humanity. We can address [single-gene] disease,” he says. “We are building a hammer that is not that big.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 8:19 am on June 10, 2016 Permalink | Reply
    Tags: A Power Plant in Iceland Deals with Carbon Dioxide by Turning It into Rock, MIT Technology Review

    From MIT Tech Review: “A Power Plant in Iceland Deals with Carbon Dioxide by Turning It into Rock” 

    MIT Technology Review

    June 9, 2016
    Ryan Cross

    Photograph by Juerg Matter

    The world has a carbon dioxide problem. And while there are lots of ideas on how to curtail the nearly 40 billion tons of the gas that humanity spews into the atmosphere annually, one has just gotten a boost: burying it.

    Since 2012, Reykjavík Energy’s CarbFix project in Iceland has been injecting carbon dioxide underground in a way that converts it into rock so that it can’t escape. This kind of carbon sequestration has been tried before, but as researchers working on the project report today in the journal Science, the process of mineralizing the carbon dioxide happens far more quickly than expected, confirming previous reports and brightening the prospects for scaling up this technology.

    Iceland’s volcanic landscape is replete with basalt. Injecting carbon dioxide and water deep underground allows the mixture to react with calcium, magnesium, and iron in the basalt, turning it into carbonate minerals like limestone.

    Project leader Juerg Matter stands by the injection well during the CarbFix project’s initial injection. Photograph by Sigurdur Gislason

    Conventional methods for storing carbon dioxide underground pressurize and heat it to form a supercritical fluid, giving it the properties of both a liquid and a gas. While making the carbon dioxide easier to inject into the ground—usually in an old oil or gas reservoir—this carries a higher risk that it could escape back into the atmosphere through cracks in the rock.

    CarbFix takes carbon dioxide from the Hellisheidi geothermal power plant, the largest in the world, which uses volcanically heated water to power turbines. The process produces 40,000 tons of carbon dioxide a year, as well as hydrogen sulfide, both of which are naturally present in the water.

    The CarbFix pilot injection site in March 2011. Photograph by Martin Stute

    The new study shows that more than 95 percent of the injected material turned to rock in less than two years. “No one actually expected it to be this quick,” says Edda Aradóttir, CarbFix’s project manager. The project is already storing 5,000 tons underground per year, making it the largest of its kind. New equipment being installed this summer aims to double the rate of storage.

    Aradóttir says CarbFix spends $30 per ton to capture and inject the carbon dioxide, versus $65 to $100 per ton for the conventional method. A lot of that savings comes from not having to purify the carbon dioxide; it and the hydrogen sulfide are simply mixed with additional water and injected underground.
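
    A back-of-the-envelope comparison using just the figures quoted above (a sketch, not company accounting):

```python
# Back-of-the-envelope comparison using only the figures quoted above.
carbfix_cost = 30            # USD per ton captured and injected (quoted)
conventional = (65, 100)     # USD per ton, quoted range for conventional storage
tons_per_year = 5_000        # current CarbFix storage rate (quoted)

print(f"CarbFix at current rate:     ${carbfix_cost * tons_per_year:,.0f}/year")
print(f"Conventional, same tonnage:  ${conventional[0] * tons_per_year:,.0f}"
      f" to ${conventional[1] * tons_per_year:,.0f}/year")
print(f"CarbFix after doubling rate: ${carbfix_cost * 2 * tons_per_year:,.0f}/year")
```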

    CarbFix team members handle the rock core recovered from drilling at the CarbFix pilot injection site in October 2014. Photograph by Juerg Matter

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 12:13 pm on June 9, 2016 Permalink | Reply
    Tags: MIT Technology Review

    From MIT Tech Review: “Proof That Quantum Computers Will Change Everything” 

    MIT Technology Review

    First Demonstration of 10-Photon Quantum Entanglement

    June 9, 2016
    Emerging Technology from the arXiv

    The ability to entangle 10 photons should allow physicists to prove, once and for all, that quantum computers really can do things classical computers cannot.

    Entanglement is the strange phenomenon in which quantum particles become so deeply linked that they share the same existence. Once rare, entangling particles has become routine in labs all over the world.

    Quantum approach to big data. MIT

    Physicists have learned how to create entanglement, transfer it from one particle to another, and even distil it. Indeed, entanglement has become a resource in itself and a crucial one for everything from cryptography and teleportation to computing and simulation.

    But a significant problem remains. To carry out ever more complex and powerful experiments, physicists need to produce entanglement on ever-larger scales by entangling more particles at the same time.

    The current numbers are paltry, though. Photons are the quantum workhorses in most labs and the record for the number of entangled photons is a mere eight, produced at a rate of about nine events per hour.

    With the same techniques, 10-photon entanglement events would occur at a rate of only about 170 per year, too few even to measure easily. So the prospects of improvement have seemed remote.

    Which is why the work of Xi-Lin Wang and pals at the University of Science and Technology of China in Hefei is impressive. Today, they announce that they’ve produced 10-photon entanglement for the first time, and they’ve done it at a count rate that is three orders of magnitude higher than anything possible until now.

    The biggest bottleneck in entangling photons is the way they are produced. This involves a process called spontaneous parametric down conversion, in which one energetic photon is converted into two photons of lower energy inside a crystal of beta-barium borate. These daughter photons are naturally entangled.

    Experiment setup for generating ten-photon polarization-entangled GHZ states, from the science paper

    By zapping the crystal continuously with a laser beam, it is possible to create a stream of entangled photon pairs. However, the rate of down conversion is tiny, just one photon per trillion. So collecting the entangled pairs efficiently is hugely important.

    That’s no easy task, not least because the photons come out of the crystal in slightly different directions, neither of which can be easily predicted. Physicists collect the photons from the two points where they are most likely to appear, but most of the entangled photons are lost.

    Xi-Lin and co have tackled this problem by reducing the uncertainty in the photon directions. Indeed, they have been able to shape the beams of entangled photons so that they form two separate circular beams, which can be more easily collected.

    In this way, the team has generated entangled photon pairs at the rate of about 10 million per watt of laser power. This is brighter than previous entanglement generators by a factor of about four. It is this improvement that makes 10-photon entanglement possible.
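
    A rough scaling argument shows why that modest per-pair gain matters so much: a 10-photon state is built from five pairs, so the coincidence rate scales roughly as the pair brightness to the fifth power (collection and detection losses make the achieved gain smaller than this ideal figure).

```python
# Rough scaling: a 10-photon state is built from five photon pairs, so the
# coincidence rate goes roughly as (pair brightness)**5. Collection and
# detection losses make the achieved gain smaller than this ideal figure.
brightness_gain = 4                    # quoted per-pair brightness improvement
rate_gain = brightness_gain ** 5
print(f"ideal rate scaling: ~{rate_gain}x (about three orders of magnitude)")

old_rate_per_year = 170                # quoted estimate with earlier techniques
new_rate_per_hour = 4                  # rate reported in the new work
achieved = new_rate_per_hour * 365 * 24 / old_rate_per_year
print(f"achieved improvement: ~{achieved:.0f}x")                    # ~200x
```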

    Their method is to collect five successively generated pairs of entangled photons and pass them into an optical network of four beam splitters. The team then introduces time delays that ensure the photons arrive at the beam splitters simultaneously and so become entangled.

    This creates the 10-photon entangled state, albeit at a rate of about four per hour, which is low but measurable for the first time. “We demonstrate, for the first time, genuine and distillable entanglement of 10 single photons,” say Xi-Lin and co.

    That’s impressive work that immediately opens the prospect of a new generation of experiments. The most exciting of these is a technique called boson sampling that physicists hope will prove that quantum computers really are capable of things classical computers are not.

    That’s important because nobody has built a quantum computer more powerful than a pocket calculator (the controversial D-Wave results aside). Nor are they likely to in the near future. So boson sampling is quantum physicists’ greatest hope for showing off the mind-boggling power of quantum computation for the first time.

    Other things also become possible, such as the quantum teleportation of three degrees of freedom in a single photon and multi-photon experiments over very long distances.

    But it is the possibility of boson sampling that will send a frisson through the quantum physics community.

    Ref: arxiv.org/abs/1605.08547: Experimental Ten-Photon Entanglement

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 8:54 am on June 6, 2016 Permalink | Reply
    Tags: Go Inside an Industrial Plant That Sucks Carbon Dioxide Straight Out of the Air, MIT Technology Review

    From MIT Tech Review: “Go Inside an Industrial Plant That Sucks Carbon Dioxide Straight Out of the Air” 

    MIT Technology Review

    June 6, 2016
    Peter Fairley


    Carbon dioxide emissions must decrease to nearly zero by 2040 if global warming by the end of this century is to be held to 2 °C. But we may well miss that target. A pilot plant started up last fall at Squamish, British Columbia, is testing a backup plan: sucking carbon dioxide directly out of the air.

    Capturing ambient carbon dioxide is a tall order because, for all the trouble it causes, the greenhouse gas makes up just 0.04 percent of the air we breathe. The Squamish plant can capture one ton of carbon dioxide a day. Significantly reducing atmospheric carbon dioxide levels would require thousands of far larger facilities, each sucking millions of tons of carbon per year out of the air.
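
    A quick scale check using the figures quoted above; the one-megaton-per-year size of a "far larger" facility is an assumption for illustration.

```python
# Scale check using the figures quoted above; the 1-megaton-per-year size of a
# "far larger" facility is an assumption for illustration.
global_emissions = 40e9        # tons of CO2 emitted per year
pilot_rate = 1 * 365           # pilot plant: 1 ton/day, in tons/year
big_plant_rate = 1e6           # assumed large facility: 1 million tons/year

print(f"pilot-sized plants to capture everything: {global_emissions / pilot_rate:.1e}")
print(f"1-Mt/year plants for 10% of emissions: {0.1 * global_emissions / big_plant_rate:,.0f}")
```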

    The plant is the brainchild of Calgary-based Carbon Engineering and its founder, Harvard University physicist David Keith. While some scientists have estimated that direct air capture would cost $400 to $1,000 per ton of carbon dioxide, Keith projects that large plants could do it for about $100 per ton.

    “We’ve taken existing pieces of industrial equipment and thought about new chemistries to run through them,” says Adrian Corless, Carbon Engineering’s CEO. The company captures carbon dioxide in a refashioned cooling tower flowing with an alkali solution that reacts with acidic carbon dioxide. That yields dissolved carbon molecules that are then converted to pellets in equipment created to extract minerals in water treatment plants. And the plant can turn those carbonate solids into pure carbon dioxide gas for sale by heating them in a modified cement kiln.
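
    For readers who want the loop in one place, here is a minimal stoichiometric sketch of the chemistry described here and in the captions below; the reactions and molar masses are textbook values, not Carbon Engineering's process figures.

```python
# Minimal stoichiometric sketch of the capture loop (textbook reactions and molar
# masses, not Carbon Engineering's process figures):
#   capture:     2 KOH + CO2     -> K2CO3 + H2O
#   pelletizing: K2CO3 + Ca(OH)2 -> CaCO3 + 2 KOH     (the KOH is recycled)
#   calcining:   CaCO3 --900 C--> CaO + CO2           (pure CO2 is released)
M_CO2, M_CaCO3 = 44.01, 100.09     # molar masses, g/mol

tons_co2 = 1.0
tons_caco3 = tons_co2 * M_CaCO3 / M_CO2
print(f"pellets handled per ton of CO2 captured: ~{tons_caco3:.2f} tons CaCO3")
```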

    Carbon Engineering CEO Adrian Corless

    In May the company closed on $8 million of new financing in Canadian dollars ($6.2 million in U.S. dollars) from investors including Bill Gates. Keith also hopes to start winning over skeptics. “Most people in the energy expert space think that air capture is not particularly credible,” he says. “There won’t be incentives and funding in a serious way for these technologies unless people believe that they actually work.”

    Carbon dioxide is captured within the plant’s gas-liquid contactor, which is essentially a repurposed cooling tower. An alkaline solution in the contactor reacts with acidic carbon dioxide in air to enrich the capture solution with potassium carbonate.

    The contactor contains 80 cubic meters of plastic packing whose three-dimensional honeycomb structure offers 16,800 square meters of surface area. The setup removes 75 to 80 percent of the carbon dioxide in the air.

    Above top: The capture fluid, now rich with carbon dioxide from the air, circulates to a 13-meter-tall reactor. Above bottom: Calcium hydroxide is added to the capture fluid just before it enters the reactor, causing two products to be created inside. One is solid calcium carbonate containing the captured atmospheric carbon. The second, potassium hydroxide, flows back to the air contactor to capture more carbon dioxide.

    As fluid moves up through the reactor, growing pellets of calcium carbonate spread out in a gradient, with the smallest pellets at the top. Pellets can be removed via these sample ports and analyzed in order to optimize the process.

    The heaviest pellets settle at the bottom of the reactor and are periodically removed, washed to remove fine crystals and capture fluid, and dried. The finished product is solid grains of calcium carbonate that resemble a fine couscous.

    Dried pellets are fed into the calciner, in which a 900 °C inferno of natural gas burning in pure oxygen roasts a rolling mass of calcium oxide. The calcium carbonate pellets spontaneously break down, producing more calcium oxide and releasing carbon dioxide gas.

    Next up at Squamish: turning captured carbon dioxide (now vented back to the air) into a low-carbon transportation fuel. By reacting carbon dioxide with hydrogen, Carbon Engineering plans to synthesize a fuel with less than one-third the carbon content of conventional gasoline. Corless estimates the fuels will cost $4 to $6 per gallon, but he expects to fetch a premium in places such as California and the European Union, where mandates require fuel suppliers to reduce their carbon content annually. Ultimately, says Corless, fuel from air capture may prove crucial to break the fossil-fuel dependence everywhere.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     