Updates from June 2016

  • richardmitnick 3:33 pm on June 10, 2016
    Tags: Diane Souvaine-Vice Chair, Maria Zuber-Chair, National Science Board

    From NSF: Women in Science “The National Science Board taps Maria Zuber as its chairperson and Diane Souvaine for vice chairperson” 

    National Science Foundation

    National Science Board

    May 24, 2016 [Just appeared in social media.]

    Left, Maria Zuber, Chair; right, Diane Souvaine, Vice Chair

    For the first time in National Science Foundation (NSF) history, women hold the positions of director, National Science Board (NSB) chair, and vice chair. During its May meeting, the board, which serves as the governing body for NSF, elected Maria Zuber, vice president for research at the Massachusetts Institute of Technology, as chair and Diane Souvaine, vice provost for research at Tufts University, as vice chair. They replace Dan Arvizu and Kelvin Droegemeier, who both rotated off the board after serving 12 years, the last four as chair and vice chair, respectively.

    Zuber’s research bridges planetary geophysics and the technology of space-based laser and radio systems, and she has published over 200 papers. She has held leadership roles associated with scientific experiments or instrumentation on nine NASA missions and remains involved with six of these missions. She is a member of the National Academy of Sciences and American Philosophical Society and is a fellow for the American Academy of Arts and Sciences, the American Association for the Advancement of Science, the Geological Society and the American Geophysical Union. In 2002, Discover magazine named her one of the 50 most important women in science. Zuber served on the Presidential Commission on the Implementation of United States Space Exploration Policy in 2004.

    NSF Director and NSB member ex officio France Córdova said, “I am delighted to say, on behalf of NSF, that we are thrilled with Dr. Zuber’s election as chair and Dr. Souvaine’s election as vice chair of the National Science Board. A superb scientist and recognized university leader, Dr. Zuber has the skills needed to help guide the agency’s policies and programs. Coupled with Dr. Souvaine’s background in computer science, exemplary leadership skills, and expertise in budget oversight and strategy, NSB is well-positioned for the coming years. I look forward to working with both leaders as NSF launches new big ideas in science and engineering.”

    Zuber is in her fourth year on the board and has served on its Committee on Strategy and Budget, which advises on NSF’s strategic direction and reviews the agency’s budget submissions.

    “It is a privilege to lead the National Science Board and to promote NSF’s bold vision for research and education in science and engineering,” Zuber said. “The outcomes of discovery science inspire the next generation and yield the knowledge that drives innovation and national competitiveness, and contribute to our quality of life. NSB is committed to working with director Córdova and her talented staff to assure that the very best ideas based on merit review are supported and that exciting, emerging opportunities — many at the intersection of disciplines — are pursued.”

    Souvaine is in her second term on the NSB and has served as chair of its Committee on Strategy and Budget, chair of its Committee on Programs and Plans, and as a member of its Committee on Audit and Oversight, all of which provide strategic direction, and oversight and guidance on NSF projects and programs. In addition, she co-chaired NSB’s Task Force on Mid-Scale Research and served three years on the Executive Committee.

    A theoretical computer scientist, Souvaine conducts research in computational geometry with commercial applications in materials engineering, microchip design, robotics and computer graphics. She was elected a fellow of the Association for Computing Machinery for her research and for her service on behalf of the computing community. A founding member, Souvaine served for over two years in the directorate of the NSF Science and Technology Center on Discrete Mathematics and Theoretical Computer Science, which originally spanned Princeton University, Rutgers University, Bell Labs and Bell Communications Research. She also works to enhance pre-college mathematics and foundations-of-computing education and to advance opportunities for women and minorities in mathematics, science and engineering.

    “I am truly honored and humbled by this vote of confidence from such esteemed colleagues. I do not take this responsibility lightly,” Souvaine said. “The board is proud of NSF’s accomplishments over its 66 years, from the discovery of gravitational waves at LIGO to our biennial Science and Engineering Indicators report on the state of our nation’s science and engineering enterprise. I look forward to working with Congress, the Administration, the science and education communities, and NSF staff to continue the agency’s legacy in advancing the progress of science.”

    Jointly, the 24-member board and the director pursue the goals and function of the foundation. NSB establishes NSF policies within the framework of applicable national policies set forth by the President and Congress. NSB identifies issues critical to NSF’s future and approves the agency’s strategic budget directions, its annual budget submission to the Office of Management and Budget, and new major programs and awards. The board also serves as an independent body of advisers to both the President and Congress on policy matters related to science and engineering and to education in science and engineering. In addition to major reports, NSB publishes policy papers and statements on issues of importance to U.S. science and engineering.

    The President appoints board members, who are selected for their eminence in research, education or public service and their records of distinguished service, and who represent a variety of science and engineering disciplines and geographic areas. Board members serve six-year terms, and the President may reappoint members for a second term. NSF’s director is an ex officio 25th member of the board.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” NSF is the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields, such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.


     
  • richardmitnick 11:21 am on June 10, 2016
    Tags: Laura Shou

    From Caltech: Women in Science “Shou Receives Fellowship for Graduate Studies in Germany”

    Caltech

    06/09/2016
    Lori Dajose

    Laura Shou. Credit: Courtesy of L. Shou

    Laura Shou, a senior in mathematics, has received a Graduate Study Scholarship from the German Academic Exchange Service (DAAD) to pursue a master’s degree in Germany. She will spend one year at the Ludwig-Maximilians-Universität München and the Technische Universität München, studying in the theoretical and mathematical physics (TMP) program.

    The DAAD is the German national agency for the support of international academic cooperation. The organization aims to promote international academic relations and cooperation by offering mobility programs for students, faculty, administrators and others in the higher education realm. The Graduate Study Scholarship offers highly qualified American and Canadian students the opportunity to conduct independent research or complete a full master’s degree in Germany. Master’s scholarships are granted for 12 months and are eligible for up to a one-year extension in the case of two-year master’s programs. Recipients receive a living stipend, health insurance, and coverage of educational and travel costs.

    “As a math major, I was especially interested in the TMP course because of its focus on the interplay between theoretical physics and mathematics,” Shou says. “I would like to use mathematical rigor and analysis to work on problems motivated by physics. The TMP course at the LMU/TUM is one of the few programs focused specifically on mathematical physics. There are many people doing research in mathematical physics there, and the program also regularly offers mathematically rigorous physics classes.”

    At Caltech, Shou has participated in the Summer Undergraduate Research Fellowship (SURF) program three times, conducting research with Professor of Mathematics Yi Ni on knot theory and topology, with former postdoctoral fellow Chris Marx (PhD ’12) on mathematical physics, and with Professor of Mathematics Nets Katz on analysis. She was the president of the Dance Dance Revolution Club and a member of the Caltech NERF Club and the Caltech Math Club.

    Following her year in Germany, Shou will begin the mathematics PhD program at Princeton.

    See the full article here.



    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”

     
  • richardmitnick 10:09 am on June 10, 2016
    Tags: How honeybees do without males

    From phys.org: “How honeybees do without males” 

    phys.org

    June 9, 2016

    An isolated population of honeybees, the Cape bees, living in South Africa has evolved a strategy to reproduce without males. A research team from Uppsala University has sequenced the entire genomes of a sample of Cape bees and compared them with other populations of honeybees to find out the genetic mechanisms behind their asexual reproduction. Credit: Mike Allsopp

    Most animals reproduce sexually, which means that both males and females are required for the species to survive. Normally, the honeybee is no exception to this rule: the female queen bee produces new offspring by laying eggs that have been fertilised by sperm from male drones. However, one isolated population of honeybees living in the southern Cape of Africa has evolved a strategy to do without males.

    In the Cape bee, female worker bees are able to reproduce asexually: they lay eggs that are essentially fertilised by their own DNA, which develop into new worker bees. Such bees are also able to invade the nests of other bees and continue to reproduce in this fashion, eventually taking over the foreign nests, a behaviour called social parasitism.

    The explanation for this unique behaviour is unknown, but a research team from Uppsala University has come closer to uncovering the genetic mechanisms behind it. The team sequenced the entire genomes of a sample of Cape bees and compared them with other populations of honeybees that reproduce normally. They found striking differences at several genes, which can explain both the abnormal type of egg production that leads to reproduction without males, and the unique social parasitism behaviour.

    “The question of why this population of honeybees in South Africa has evolved to reproduce asexually is still a mystery. But understanding the genes involved brings us closer to understanding it. This study will help us to understand how genes control biological processes like cell division and behaviour. Furthermore, understanding why populations sometimes reproduce asexually may help us to understand the evolutionary advantage of sex, which is a major conundrum for evolutionary biologists,” says Matthew Webster, researcher at the Department of Medical Biochemistry and Microbiology at Uppsala University.

    See the full article here.


    About Phys.org in 100 Words

    Phys.org™ (formerly Physorg.com) is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004, Phys.org’s readership has grown steadily to include 1.75 million scientists, researchers, and engineers every month. Phys.org publishes approximately 100 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Quantcast 2009 includes Phys.org in its list of the Global Top 2,000 Websites. Phys.org community members enjoy access to many personalized features such as social networking, a personal home page set-up, RSS/XML feeds, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

     
  • richardmitnick 9:12 am on June 10, 2016
    Tags: Researchers demonstrate a 100x increase in the amount of information that can be 'packed into light', University of the Witwatersrand

    From phys.org: “Researchers demonstrate a 100x increase in the amount of information that can be ‘packed into light’”

    phys.org

    Data of the Rubik’s cube sent and received. Credit: Wits University

    The rise of big data and advances in information technology have serious implications for our ability to deliver sufficient bandwidth to meet the growing demand.

    Researchers at the University of the Witwatersrand in Johannesburg, South Africa, and the Council for Scientific and Industrial Research (CSIR) are looking at alternative approaches that will be able to take over where traditional optical communications systems are likely to fail in the future.

    In their latest research, published online today (10 June 2016) in the journal Scientific Reports, the team from South Africa and Tunisia demonstrate over 100 patterns of light used in an optical communication link, potentially increasing the bandwidth of communication systems by 100 times.

    The idea was conceived by Professor Andrew Forbes from Wits University, who led the collaboration. The key experiment was performed by Dr Carmelo Rosales-Guzman, a Research Fellow in the Structured Light group in the Wits School of Physics, and Dr Angela Dudley of the CSIR, an honorary academic at Wits.

    The first experiments on the topic were carried out by Abderrahmen Trichili of Sup’Com (Tunisia) as a visiting student to South Africa as part of an African Laser Centre funded research project. The other team members included Bienvenu Ndagano (Wits), Dr Amine Ben Salem (Sup’Com) and Professor Mourad Zghal (Sup’Com), all of whom contributed significantly to the work.

    Bracing for the bandwidth ceiling

    Traditional optical communication systems modulate the amplitude, phase, polarisation, colour and frequency of the light that is transmitted. Yet despite these technologies, we are predicted to reach a bandwidth ceiling in the near future.

    Dr. Carmelo Rosales-Guzman from Wits University. Credit: Wits University

    But light also has a “pattern” – the intensity distribution of the light, that is, how it looks on a camera or a screen.

    Since these patterns are unique, they can be used to encode information:

    pattern 1 = channel 1 or the letter A,
    pattern 2 = channel 2 or the letter B, and so on.

    What does this mean?

    Future bandwidth can be increased by precisely the number of patterns of light we are able to use.

    Ten patterns mean a 10x increase in existing bandwidth, as 10 new channels would emerge for data transfer.

    At the moment, modern optical communication systems use only one pattern. This is due to technical hurdles in how to pack information into these patterns of light, and how to get the information back out again.
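    The pattern-to-symbol mapping described above can be sketched in a few lines. The codebook below is invented for illustration (the real "patterns" are spatial light modes, not letters); the point is that with N distinguishable patterns each symbol carries log2(N) bits:

    ```python
    import math

    # With N distinguishable patterns, each transmitted symbol can carry
    # log2(N) bits: 100 patterns give ~6.6 bits per symbol versus 1 bit
    # for a simple on/off scheme.
    def bits_per_symbol(num_patterns: int) -> float:
        return math.log2(num_patterns)

    # Toy codebook in the spirit of "pattern 1 = the letter A, pattern 2 = B, ...".
    codebook = {i: chr(ord("A") + i) for i in range(26)}
    inverse = {ch: i for i, ch in codebook.items()}

    def encode(message: str) -> list[int]:
        """Map each letter to the index of the light pattern representing it."""
        return [inverse[ch] for ch in message]

    def decode(indices: list[int]) -> str:
        """Recover the message from the sequence of detected pattern indices."""
        return "".join(codebook[i] for i in indices)

    assert decode(encode("HELLO")) == "HELLO"
    print(round(bits_per_symbol(100), 2))  # 6.64
    ```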

    How the research was done

    In this latest work [not available until paper is published], the team showed data transmission with over 100 patterns of light, exploiting three degrees of freedom in the process.

    They used digital holograms written to a small liquid crystal display (LCD) and showed that it is possible to have a hologram encoded with over 100 patterns in multiple colours.

    “This is the highest number of patterns created and detected on such a device to date, far exceeding the previous state-of-the-art,” says Forbes.

    One of the novel steps was to make the device ‘colour blind’, so the same holograms can be used to encode many wavelengths.

    According to Rosales-Guzman to make this work “100 holograms were combined into a single, complex hologram. Moreover, each sub-hologram was individually tailored to correct for any optical aberrations due to the colour difference, angular offset and so on”.
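    One way to picture this multiplexing is as a sum of complex sub-hologram fields, each riding on its own angular carrier, with only the phase of the sum written to the LCD. This is a sketch under assumptions, not the authors' actual code; the helical phases and carrier frequencies below are illustrative:

    ```python
    import numpy as np

    # Illustrative sketch of hologram multiplexing: sum several sub-hologram
    # fields, each on a distinct tilted carrier, then keep only the phase of
    # the sum as the single hologram displayed on the device.
    N = 256
    x, y = np.meshgrid(np.linspace(-1, 1, N), np.linspace(-1, 1, N))

    def sub_hologram(phase_pattern, carrier_fx, carrier_fy):
        """Complex field of one sub-hologram on a tilted (angular-offset) carrier."""
        return np.exp(1j * (phase_pattern + 2 * np.pi * (carrier_fx * x + carrier_fy * y)))

    # A few example patterns: helical phases of different topological charge.
    patterns = [k * np.arctan2(y, x) for k in (1, 2, 3)]
    field = sum(sub_hologram(p, fx, 0) for p, fx in zip(patterns, (10, 20, 30)))

    # The phase-only multiplexed hologram, as written to the LCD.
    multiplexed_phase = np.angle(field)
    ```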

    What’s next?

    The next stage is to move out of the laboratory and demonstrate the technology in a real-world system.

    “We are presently working with a commercial entity to test in just such an environment,” says Forbes. The approach of the team could be used in both free-space and optical fibre networks.

    See the full article here.



     
  • richardmitnick 8:33 am on June 10, 2016
    Tags: Autism may stem—in part—from a disordered sense of touch

    From AAAS: “Autism may stem—in part—from a disordered sense of touch” 

    AAAS

    Jun. 9, 2016
    Teal Burrell

    A disrupted sense of touch causes autismlike behaviors in mice. Credit: ploughmann/iStock

    Sociability may be skin deep. The social impairments and high anxiety seen in people with autism or related disorders may be partly due to a disruption in the nerves of the skin that sense touch, a new study in mice suggests.

    Autism spectrum disorders are primarily thought of as disorders of the brain, generally characterized by repetitive behaviors and deficits in communication skills and social interaction. But a majority of people with autism spectrum disorders also have an altered tactile sense; they are often hypersensitive to light touch and can be overwhelmed by certain textures. “They tend to be very wary of social touch [like a hug or handshake], or if they go outside and feel a gust of wind, it can be very unnerving,” says neuroscientist Lauren Orefice from Harvard Medical School in Boston.

    An appreciation for this sensory aspect of autism has grown in recent years. The newest version of psychiatry’s bible, the Diagnostic and Statistical Manual of Mental Disorders, includes the sensory abnormalities of autism as core features of the disease. “That was a big nod and a recognition that this is a really important aspect of autism,” says Kevin Pelphrey, a cognitive neuroscientist at The George Washington University in Washington, D.C., who was not involved in the work.

    The sensation of touch starts in the peripheral nervous system—in receptors at the surface of the skin—and travels along nerves that connect into the central nervous system. Whereas many autism researchers focus on the end of the pathway—the brain—Orefice and colleagues wondered about the first leg of the trip. So the group introduced mutations that silenced genes associated with autism spectrum disorders in mice, adding them in a way that restricted the effects to peripheral nerve cells, they report today in Cell. The team singled out the gene Mecp2, which encodes a protein that regulates the expression of genes that help forge connections between nerve cells.

    The Mecp2 mutant mice were more sensitive to light touch; a small puff of air on their backs startled the rodents more than normal mice. Additionally, the mutants were unable to distinguish between rough and smooth textures. Just like normal mice—which love novelty—they played with new objects whenever given a choice between familiar and new ones that differed in shape and size. But when the objects differed by texture, they played just as much with familiar, rough blocks of wood and new, smooth ones—unlike the control mice. Orefice suggests that an increased sensitivity to touch in the mutant mice makes any texture overwhelming, so subtle differences are indistinguishable.

    The animals also displayed autismlike behaviors beyond touch. Even though the defective Mecp2 gene wasn’t present in brain cells, the mutant mice were also more anxious and less social, traits generally attributed to the central nervous system. When given the option to hang out with another mouse or an object like an empty cup, the mutant mice spent just as much time with the object as with the other mouse, unlike normal mice, which prefer a living companion. Tests of anxiety also revealed differences. Whereas normal mice will explore the entirety of an open area or venture onto the wall-less sides of an elevated platform, the mutant mice preferred to hug the edges of the open area and remain in the walled regions of the platform, suggesting heightened anxiety.

    When the researchers silenced the genes in the peripheral nerves of adult animals, they were still hypersensitive to light touch, but they didn’t display the behavioral abnormalities seen in the animals that had the gene silenced from birth. That suggests to Orefice’s team that there is a developmental window of time when touch influences behavior. “The way we navigate our world is largely with a sense of touch,” she says. During development, touch is key to learning how to interact with other animals and the environment. If a light touch from another mouse is uncomfortable, a mouse might learn to avoid its peers in the future. And if the environment itself feels abrasive, the mouse might stop exploring.

    The researchers also found that the peripheral nerves of Mecp2 mutant mice had low levels of a receptor for the neurotransmitter GABA (gamma-aminobutyric acid). Low GABA levels in the brain have previously been linked to autism, but the new finding opens up an unexpected treatment possibility: a drug that restores GABA function in the periphery. “If we can normalize the hypersensitivity to touch, it’s possible that this might help improve anxietylike behaviors and social interaction deficits. This is not to say that the brain is not important,” Orefice says. But targeting the periphery along with the brain may be a way to get at the disease from both ends.

    For now, the findings apply only to mice, which are an imperfect model for complex cognitive disorders such as autism. “For translation to humans, it would be important to know if pharmacological enhancement—ideally of the specific GABA receptor—can alleviate the peripheral hypersensitivity to touch, especially in young children who may be in a critical period of vulnerability,” says Takao Hensch, a neuroscientist at Harvard University who was not involved in the research. He also wonders whether the findings apply to other genetic forms of autism spectrum disorders. Mecp2 has been shown to have unique effects on GABA in the brain; perhaps its peripheral effects are unique as well.

    Still, the finding that dysfunction in the touch system can contribute to behavioral problems is exciting, Pelphrey says. “It gives you a sense of how fundamental these sensory features might be … in terms of mechanistically causing some of the other features,” he says. “It really opens up a different way of thinking about what’s going on.”

    See the full article here.

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.


     
  • richardmitnick 8:19 am on June 10, 2016
    Tags: A Power Plant in Iceland Deals with Carbon Dioxide by Turning It into Rock

    From MIT Tech Review: “A Power Plant in Iceland Deals with Carbon Dioxide by Turning It into Rock” 

    MIT Technology Review

    June 9, 2016
    Ryan Cross

    Photograph by Juerg Matter

    The world has a carbon dioxide problem. And while there are lots of ideas on how to curtail the nearly 40 billion tons of the gas that humanity spews into the atmosphere annually, one has just gotten a boost: burying it.

    Since 2012, Reykjavík Energy’s CarbFix project in Iceland has been injecting carbon dioxide underground in a way that converts it into rock so that it can’t escape. This kind of carbon sequestration has been tried before, but as researchers working on the project report today in the journal Science, the process of mineralizing the carbon dioxide happens far more quickly than expected, confirming previous reports and brightening the prospects for scaling up this technology.

    Iceland’s volcanic landscape is replete with basalt. Injecting carbon dioxide and water deep underground allows the mixture to react with calcium, magnesium, and iron in the basalt, turning it into carbonate minerals like limestone.

    Project leader Juerg Matter stands by the injection well during the CarbFix project’s initial injection. Photograph by Sigurdur Gislason

    Conventional methods for storing carbon dioxide underground pressurize and heat it to form a supercritical fluid, giving it the properties of both a liquid and a gas. While making the carbon dioxide easier to inject into the ground—usually in an old oil or gas reservoir—this carries a higher risk that it could escape back into the atmosphere through cracks in the rock.

    CarbFix takes carbon dioxide from the Hellisheidi geothermal power plant, the largest geothermal plant in the world, which uses volcanically heated water to power turbines. The process produces 40,000 tons of carbon dioxide a year, as well as hydrogen sulfide, both of which are naturally present in the water.

    The CarbFix pilot injection site in March 2011. Photograph by Martin Stute

    The new study shows that more than 95 percent of the injected material turned to rock in less than two years. “No one actually expected it to be this quick,” says Edda Aradóttir, CarbFix’s project manager. The project is already storing 5,000 tons underground per year, making it the largest of its kind. New equipment being installed this summer aims to double the rate of storage.

    Aradóttir says CarbFix spends $30 per ton to capture and inject the carbon dioxide, versus $65 to $100 per ton for the conventional method. A lot of that savings comes from not having to purify the carbon dioxide; it and the hydrogen sulfide are simply mixed with additional water and injected underground.

    CarbFix team members handle the rock core recovered from drilling at the CarbFix pilot injection site in October 2014. Photograph by Juerg Matter

    See the full article here.


    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 7:53 am on June 10, 2016
    Tags: Using big data, scientists discover biomarkers that could help give cancer patients better survival estimates

    From UCLA: “Using big data, scientists discover biomarkers that could help give cancer patients better survival estimates” 

    UCLA

    June 09, 2016
    Stuart Wolpert

    A SURVIV analysis of breast cancer isoforms developed at UCLA. Blue lines are associated with longer survival times, and magenta lines with shorter survival times. Courtesy of Yi Xing

    People with cancer are often told by their doctors approximately how long they have to live, and how well they will respond to treatments, but what if there were a way to improve the accuracy of doctors’ predictions?

    A new method developed by UCLA scientists could eventually lead to a way to do just that, using data about patients’ genetic sequences to produce more reliable projections for survival time and how they might respond to possible treatments. The technique is an innovative way of using biomedical big data — which gleans patterns and trends from massive amounts of patient information — to achieve precision medicine — giving doctors the ability to better tailor their care for each individual patient.

    The approach is likely to enable doctors to give more accurate predictions for people with many types of cancers. In this research, the UCLA scientists studied cancers of the breast, brain (glioblastoma multiforme, a highly malignant and aggressive form; and lower grade glioma, a less aggressive version), lung, ovary and kidney.

    In addition, it may allow scientists to analyze people’s genetic sequences and determine which are lethal and which are harmless.

    The new method analyzes various gene isoforms — combinations of genetic sequences that can produce an enormous variety of RNAs and proteins from a single gene — using data from RNA molecules in cancer specimens. That process, called RNA sequencing, or RNA-seq, reveals the presence and quantity of RNA molecules in a biological sample. In the method developed at UCLA, scientists analyzed the ratios of slightly different genetic sequences within the isoforms, enabling them to detect important but subtle differences in the genetic sequences. In contrast, the conventional analysis aggregates all of the isoforms together, meaning that the technique misses important differences within the isoforms.
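    A toy example makes the distinction concrete (the read counts below are invented, not from the study): two samples can have identical total gene abundance yet very different isoform ratios, and gene-level aggregation cannot tell them apart.

    ```python
    # Two hypothetical tumor samples with the same total expression of one
    # gene but opposite isoform usage. Gene-level analysis sees no difference;
    # isoform-ratio analysis does.
    counts_a = {"isoform1": 90, "isoform2": 10}  # illustrative read counts
    counts_b = {"isoform1": 10, "isoform2": 90}

    def gene_abundance(counts):
        """Conventional gene-level measure: aggregate all isoforms together."""
        return sum(counts.values())

    def isoform_ratios(counts):
        """Isoform-level measure: each isoform's share of the gene's reads."""
        total = gene_abundance(counts)
        return {iso: n / total for iso, n in counts.items()}

    assert gene_abundance(counts_a) == gene_abundance(counts_b)  # indistinguishable
    print(isoform_ratios(counts_a))  # {'isoform1': 0.9, 'isoform2': 0.1}
    print(isoform_ratios(counts_b))  # {'isoform1': 0.1, 'isoform2': 0.9}
    ```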

    Yi Xing. Courtesy of Yi Xing

    SURVIV (for “survival analysis of mRNA isoform variation”) is the first statistical method for conducting survival analysis on isoforms using RNA-seq data, said senior author Yi Xing, a UCLA associate professor of microbiology, immunology and molecular genetics. The research is published today in the journal Nature Communications.

    The researchers report having identified some 200 isoforms that are associated with survival time for people with breast cancer; some predict longer survival times, others are linked to shorter times. Armed with that knowledge, scientists might eventually be able to target the isoforms associated with shorter survival times in order to suppress them and fight disease, Xing said.

    The researchers evaluated the performance of survival predictors using a metric called C-index and found that across the six different types of cancer they analyzed, their isoform-based predictions performed consistently better than the conventional gene-based predictions.
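    The C-index itself is straightforward to compute. The sketch below is a deliberately minimal version with made-up survival times and risk scores; real survival analysis handles censoring more carefully than this:

    ```python
    # Minimal concordance-index (C-index) sketch: the fraction of comparable
    # patient pairs in which the patient predicted to be at higher risk actually
    # died earlier. 0.5 = random guessing, 1.0 = perfect ranking. Censoring is
    # handled crudely: a pair counts only if the shorter time is an observed death.

    def c_index(times, events, risks):
        concordant, comparable = 0.0, 0
        n = len(times)
        for i in range(n):
            for j in range(n):
                # i must have the earlier time and an observed death
                if times[i] < times[j] and events[i]:
                    comparable += 1
                    if risks[i] > risks[j]:
                        concordant += 1
                    elif risks[i] == risks[j]:
                        concordant += 0.5  # ties get half credit
        return concordant / comparable

    times  = [5, 10, 15, 20]       # survival times (months) -- invented
    events = [1, 1, 0, 1]          # 1 = death observed, 0 = censored
    risks  = [0.9, 0.7, 0.4, 0.2]  # a model's predicted risk scores
    print(c_index(times, events, risks))  # 1.0: risk order matches outcomes
    ```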

    The result was surprising because it suggests, contrary to conventional wisdom, that isoform ratios provide a more robust molecular signature of cancer patients than overall gene abundance, said Xing, director of UCLA’s bioinformatics doctoral program and a member of the UCLA Institute for Quantitative and Computational Biosciences.

    “Our finding suggests that isoform ratios provide a more robust molecular signature of cancer patients in large-scale RNA-seq datasets,” he said.

    The researchers studied tissues from 2,684 people with cancer whose samples were part of the National Institutes of Health’s Cancer Genome Atlas, and they spent more than two years developing the algorithm for SURVIV.

    According to Xing, a human gene typically produces seven to 10 isoforms.

    “In cancer, sometimes a single gene produces two isoforms, one of which promotes metastasis and one of which represses metastasis,” he said, adding that understanding the differences between the two is extremely important in combatting cancer.

    “We have just scratched the surface,” Xing said. “We will apply the method to much larger data sets, and we expect to learn a lot more.”

    Co-authors of the research are lead author Shihao Shen, a senior research scientist in Xing’s laboratory; Ying Nian Wu, a UCLA professor of statistics; and Yuanyuan Wang and Chengyang Wang, UCLA doctoral students.

    The research was funded by the National Institutes of Health (grants R01GM088342 and R01GM105431) and the National Science Foundation (grant DMS1310391). Xing’s research is also supported by an Alfred P. Sloan Research Fellowship.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UCLA Campus

    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any other university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

     
  • richardmitnick 12:46 pm on June 9, 2016 Permalink | Reply
    Tags: Another Piece of the Puzzle, , ,   

    From UNC: “Another Piece of the Puzzle” 


    University of North Carolina

    June 9, 2016
    Mark Derewicz

    UNC researchers continue to discover new pieces of the autism puzzle. Most recently, they’ve collaborated with the Simons Foundation to take part in its SPARK initiative — a genetic study that would recruit thousands of families for autism research. Illustration by Corina Cudebec

    UNC clinical researchers begin the largest-ever genetic study of autism to elucidate the complex genetics of the condition

    If a child has autism, the condition is uniquely their own. The genes involved, how those genes are expressed to give rise to the proteins their brain cells need to function, how the neurons are wired to articulate thoughts or navigate social interactions or think through a problem — all of these things are unique to that child.

    “No two children with autism are the same,” UNC-Chapel Hill researcher Gabriel Dichter says, “but the way we try to help kids now is with a one-size-fits-all approach. We use a trial-and-error approach and try to help them with the same interventions to see what works and what doesn’t.” It would be better to know more about what intervention would work best for each child as quickly as possible.

    That’s why UNC — along with 20 other research institutions — is taking on the largest genetic study of autism ever attempted. Researchers will collect DNA and other information from 50,000 people with autism and their immediate family members. UNC was one of three pilot institutions tasked with making sure such an ambitious project was even possible. “It will be the first opportunity the research community has had to understand autism genetics in a way that will allow us, in the future, to match a person’s specific genetic profile with a specific treatment plan,” Dichter says. “That’s the ultimate goal.”

    Dichter, Carolina Institute for Developmental Disabilities (CIDD) Director Joseph Piven, and colleagues across the state are recruiting families with children with autism to be part of this study, called the SPARK initiative (Simons Foundation Powering Autism Research for Knowledge). The UNC team hopes to recruit thousands of families, perhaps even 10,000, whose full genome sequences the researchers would then have access to.

    To date, approximately 50 genes have been identified that almost certainly play a role in autism, and scientists estimate hundreds more are involved. By studying these genes, their biological consequences, and how they interact with environmental factors, researchers could better understand the condition’s causes, and link possible underlying causes to the spectrum of symptoms, skills, and challenges of people affected.

    Piven’s team at CIDD, home to the federally funded Intellectual and Developmental Disabilities Research Center, is no stranger to this kind of work. For more than 15 years, his group — together with the UNC TEACCH Autism Program — has been building a research registry of families with at least one child with autism (who have all consented to being contacted by UNC researchers conducting studies).

    These North Carolina families, now more than 6,000 strong, have made it possible for UNC researchers to deepen their understanding of this complex condition and provide numerous intervention strategies, support systems, and diagnostic tools. Piven and Dichter’s team will now tap into that registry to recruit these families, while continually working to add more to the list.

    Mark Zylka is a cell biologist and the incoming director of the UNC Neuroscience Center, which is supporting the SPARK initiative with funds for personnel to boost recruitment efforts. “Those of us in the basic sciences want to partner with clinicians in research projects we hope will ultimately benefit people,” he says. Zylka knows better than most what access to genetic information can mean to a researcher and people with the condition.

    A previous group of scientists discovered through genetic analysis that nearly 1,000 genes are potentially linked to autism in some way. Of those genes, Zylka researched UBE3A, a protein coding gene associated with Angelman Syndrome — a neurodevelopmental disorder characterized by severe intellectual and developmental disability, sleep disturbance, seizures, jerky movements, and a typically happy demeanor. He observed cells from a child with a mutated UBE3A gene and cells from the child’s parents.

    Jason Yi, a postdoctoral fellow in Zylka’s lab, found that the child had a “hyperactive” version of UBE3A. It’s like a broken water faucet — the gene can’t be shut off. In normal brain development, the gene has to be turned on to produce an enzyme that targets proteins to be broken down within cells. It then has to be shut off to avoid overproduction of the enzyme. In a child with the UBE3A mutation, the faucet is never turned off. In the child’s parents, the gene works normally.

    “We think it may be possible to tamp down UBE3A in some autism patients to restore normal levels of the enzyme in the brain,” Zylka explains. It’s a long way from the clinic, but his and Yi’s work shows it’s possible to affect the basic biology that plays a role in autism.

    From one generation to the next

    Piven, along with the CIDD, has begun to study the link between autism and Parkinson’s disease. In two small, preliminary clinical studies, he and colleagues found that Parkinson’s disease may occur much more commonly in older adults with autism than in those without autism.

    He and his team identified 20 adults with autism who were not taking atypical neuroleptic drugs. Four of them were diagnosed with Parkinson’s disease. This 20 percent rate of diagnosis was 200-fold higher than the normal incidence rate — one in 1,000, or 0.1 percent — among the general population of people ages 45 to 65. There was an even higher rate of Parkinsonian symptoms among participants with autism who were taking neuroleptic drugs, which can cause the neurological problems seen in Parkinson’s disease.
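    The arithmetic behind the 200-fold figure is easy to check from the numbers quoted above:

    ```python
    # Checking the arithmetic of the preliminary study described above.
    diagnosed, cohort = 4, 20
    observed_rate = diagnosed / cohort    # 0.20, i.e. 20 percent
    baseline_rate = 1 / 1000              # roughly 0.1 percent at ages 45-65
    print(round(observed_rate / baseline_rate))  # 200 -> the "200-fold" figure
    ```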

    The study needs to be replicated in a larger pool of people with autism. “We think these findings are the tip of the iceberg,” Piven says. “Studying older populations of people with autism is a new frontier, and we think this continued work will uncover very important information all of us need in order to better care for people with autism as they age.”

    And that’s a big deal.

    “By and large, what autism is like for older adults is still a mystery,” Piven adds. “Many of these people were misdiagnosed years ago, and there’s nearly nothing in the medical literature about these older people with autism.”

    As UNC basic science researchers delve into the genetics of autism and the potential environmental triggers, UNC behavioral researchers are focused on developing and disseminating community-based services. UNC TEACCH Autism Program director Laura Klinger is busy documenting the needs of adults with autism.

    Research conducted by Julie Daniels at UNC in collaboration with the Centers for Disease Control shows that the prevalence of autism in 8-year-olds has risen from 1 in 150 in 2002 to 1 in 68 in 2012. The first cohort of 8-year-olds is now 22 years of age.
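    Those prevalence figures translate as follows, using only the numbers quoted above:

    ```python
    # Restating the autism prevalence figures per 10,000 children.
    p_2002 = 1 / 150
    p_2012 = 1 / 68
    print(f"2002: {p_2002 * 10_000:.0f} per 10,000")   # about 67
    print(f"2012: {p_2012 * 10_000:.0f} per 10,000")   # about 147
    print(f"increase: {p_2012 / p_2002:.1f}x")         # about a 2.2x rise
    ```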

    “So, we can look ahead and expect a large increase in the number of adults with autism in the coming decade,” Klinger said. “Yet, we know very little about how to support a good quality of life for adults with the disorder. We’ve learned so much about autism in children in the past decade. We can diagnose autism earlier than ever before, and we have witnessed firsthand how earlier interventions can make a difference in children’s lives. Now, we need to focus on supporting individuals with autism across the entire lifespan.”

    There is much work to be done in developing vocational, residential, medical, and mental health services to support adults with autism, Klinger adds. “The longevity of autism services and research at UNC gives us a unique opportunity to lead the world in understanding aging in autism.”

    Right now, scientists don’t understand the underlying genetics well enough and don’t have a good enough handle on potential environmental causes, according to Piven. Clinicians are limited in their ability to help some of the more severe cases, though they’ve made strides in the past decade. And the medical community doesn’t have a good concept of what it’s like to live with autism for many decades into old age.

    “The good news is that UNC is one of the few places in the world capable of tackling these and other issues facing the autism community,” Piven says. “We have the scientific and clinical expertise, and we’re making progress every day.”

    Gabriel Dichter is an associate professor of psychiatry and psychology, and also the director of the Clinical Affective Neuroscience Lab at UNC-Chapel Hill.

    Joseph Piven is the Thomas E. Castelloe Distinguished Professor of Psychiatry, Pediatrics, and Psychology; director of the Carolina Institute for Developmental Disabilities; and director of the Intellectual and Developmental Disabilities Research Center, which is funded through the National Institute of Child Health and Human Development.

    Mark Zylka is the incoming director of the UNC Neuroscience Center. He is also a professor of cell biology and physiology and an adjunct associate professor of pharmacy.

    Laura Klinger is an associate professor in the Department of Psychiatry and Neuroscience and the director of the TEACCH Autism Program.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition


    Carolina’s vibrant people and programs attest to the University’s long-standing place among leaders in higher education since it was chartered in 1789 and opened its doors for students in 1795 as the nation’s first public university. Situated in the beautiful college town of Chapel Hill, N.C., UNC has earned a reputation as one of the best universities in the world. Carolina prides itself on a strong, diverse student body, academic opportunities not found anywhere else, and a value unmatched by any public university in the nation.

     
  • richardmitnick 12:31 pm on June 9, 2016 Permalink | Reply
    Tags: , Google Moves Closer to a Universal Quantum Computer, ,   

    From SA: “Google Moves Closer to a Universal Quantum Computer” 

    Scientific American

    June 9, 2016
    Philip Ball, Nature Magazine

    Corporate headquarters complex of Google in Mountain View, California. Credit: Brooks Kraft LLC/Corbis via Getty Images

    For 30 years, researchers have pursued the universal quantum computer, a device that could solve any computational problem, with varying degrees of success. Now, a team in California and Spain has made an experimental prototype of such a device that can solve a wide range of problems in fields such as chemistry and physics, and has the potential to be scaled up to larger systems.

    Both IBM and a Canadian company called D-Wave have created functioning quantum computers using different approaches. But their devices are not easily scalable to the many quantum bits (qubits) needed for solving problems that classical computers cannot.

    Computer scientists at Google’s research laboratories in Santa Barbara, California, and physicists at the University of California at Santa Barbara and the University of the Basque Country in Bilbao, Spain, describe their new device online in Nature.

    “It’s terrific work in many respects, and is filled with valuable lessons for the quantum computing community,” says Daniel Lidar, a quantum-computing expert at the University of Southern California in Los Angeles.

    The Google prototype combines the two main approaches to quantum computing. One approach constructs the computer’s digital circuits using qubits in particular arrangements geared to solve a specific problem. This is analogous to a tailor-made digital circuit in a conventional microprocessor made from classical bits.

    Much of quantum computing theory is based on this approach, which includes methods for correcting errors that might otherwise derail a calculation. So far, practical implementations have been possible only with a handful of qubits.

    Analog approach

    The other approach is called adiabatic quantum computing (AQC). Here, the computer encodes a given problem in the states of a group of qubits, gradually evolving and adjusting the interactions between them to “shape” their collective quantum state and reach a solution. In principle, just about any problem can be encoded into the same group of qubits.
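    The interpolation at the heart of AQC can be illustrated with a toy numerical example. The sketch below is not Google's hardware, just the textbook picture: it slides a single-qubit Hamiltonian from an easy "driver" H0 to a "problem" H1 and diagonalizes it along the way; in a real anneal, the system tracks the instantaneous ground state as s sweeps slowly from 0 to 1:

    ```python
    # Toy adiabatic interpolation between a driver Hamiltonian H0 (easy ground
    # state) and a problem Hamiltonian H1 (whose ground state encodes the answer).
    import numpy as np

    X = np.array([[0, 1], [1, 0]], dtype=float)   # Pauli X
    Z = np.array([[1, 0], [0, -1]], dtype=float)  # Pauli Z
    H0, H1 = -X, -Z

    for s in [0.0, 0.5, 1.0]:
        H = (1 - s) * H0 + s * H1          # the adiabatic schedule
        evals, evecs = np.linalg.eigh(H)   # eigenvalues in ascending order
        print(f"s={s:.1f}  ground energy={evals[0]:+.3f}")
    # At s=1 the ground state is |0>, the "solution" the anneal settles into.
    ```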

    This analog approach is limited by the effects of random noise, which introduces errors that cannot be corrected as systematically as in digital circuits. And there’s no guarantee that this method can solve every problem efficiently, says computer scientist Rami Barends, a member of the Google team.

    Yet only AQC has furnished the first commercial devices — made by D-Wave in Burnaby, British Columbia — which sell for about $15 million apiece. Google owns a D-Wave device, but Barends and colleagues think that there’s a better way to do AQC.

    In particular, they want to find some way to implement error correction. Without it, scaling up AQC will be difficult, because errors accumulate more quickly in larger systems. The team thinks the first step to achieving that is to combine the AQC method with the digital approach’s error-correction capabilities.

    Virtual chemistry

    To do that, the Google team uses a row of nine solid-state qubits, fashioned from cross-shaped films of aluminium about 400 micrometers from tip to tip. These are deposited onto a sapphire surface. The researchers cool the aluminium to 0.02 kelvin, turning the metal into a superconductor with no electrical resistance. Information can then be encoded into the qubits in their superconducting state.

    The interactions between neighboring qubits are controlled by ‘logic gates’ that steer the qubits digitally into a state that encodes the solution to a problem. As a demonstration, the researchers instructed their array to simulate a row of magnetic atoms with coupled spin states — a problem thoroughly explored in condensed-matter physics. They could then look at the qubits to determine the lowest-energy collective state of the spins that the atoms represented.
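    For a system this small, the same kind of problem can be checked classically. The sketch below brute-forces the lowest-energy configuration of a nine-spin Ising chain, a standard textbook model of coupled spins; the coupling value is an arbitrary choice for illustration:

    ```python
    # Classically finding the lowest-energy state of a short row of coupled
    # spins (a ferromagnetic Ising chain) by checking all 2^N configurations.
    from itertools import product

    N = 9       # matches the nine-qubit row
    J = 1.0     # ferromagnetic coupling: neighboring spins prefer to align

    def energy(spins):
        return -J * sum(spins[i] * spins[i + 1] for i in range(len(spins) - 1))

    best = min(product([-1, 1], repeat=N), key=energy)
    print(best, energy(best))  # all-aligned spins minimize the energy
    ```

    Brute force works here because 2^9 is only 512 configurations; the point of a quantum simulator is to reach system sizes where this enumeration becomes impossible.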

    This is a fairly simple problem for a classical computer to solve. But the new Google device can also handle so-called ‘non-stoquastic’ problems, which classical computers cannot. These include simulations of the interactions between many electrons, which are needed for accurate computer simulations in chemistry. The ability to simulate molecules and materials at the quantum level could be one of the most valuable applications of quantum computing.

    This new approach should enable a computer with quantum error correction, says Lidar. Although the researchers did not demonstrate that here, the team has previously shown how that might be achieved on its nine-qubit device.

    “With error correction, our approach becomes a general-purpose algorithm that is, in principle, scalable to an arbitrarily large quantum computer,” says Alireza Shabani, another member of the Google team.

    The Google device is still very much a prototype. But Lidar says that in a couple of years, devices with more than 40 qubits could become a reality.

    “At that point,” he says, “it will become possible to simulate quantum dynamics that is inaccessible on classical hardware, which will mark the advent of ‘quantum supremacy’.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
  • richardmitnick 12:13 pm on June 9, 2016 Permalink | Reply
    Tags: , ,   

    From MIT Tech Review: “Proof That Quantum Computers Will Change Everything” 

    MIT Technology Review

    First Demonstration of 10-Photon Quantum Entanglement

    June 9, 2016
    Emerging Technology from the arXiv

    The ability to entangle 10 photons should allow physicists to prove, once and for all, that quantum computers really can do things classical computers cannot.

    Entanglement is the strange phenomenon in which quantum particles become so deeply linked that they share the same existence. Once rare, entangling particles has become routine in labs all over the world.

    Quantum approach to big data. MIT

    Physicists have learned how to create entanglement, transfer it from one particle to another, and even distil it. Indeed, entanglement has become a resource in itself and a crucial one for everything from cryptography and teleportation to computing and simulation.

    But a significant problem remains. To carry out ever more complex and powerful experiments, physicists need to produce entanglement on ever-larger scales by entangling more particles at the same time.

    The current numbers are paltry, though. Photons are the quantum workhorses in most labs and the record for the number of entangled photons is a mere eight, produced at a rate of about nine events per hour.

    Extending the same techniques to 10 photons would yield only about 170 entanglement events per year, too few even to measure easily. So the prospects for improvement have seemed remote.
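    The scaling behind those numbers is geometric: each additional entangled pair multiplies the event rate by that pair's collection efficiency. A rough estimate using only the figures quoted above:

    ```python
    # Rough scaling of multi-photon event rates, using the figures in the text:
    # 9 eight-photon events per hour, and a projected ~170 ten-photon events per
    # year with the same techniques. Going from 8 to 10 photons adds one pair.
    HOURS_PER_YEAR = 24 * 365

    rate_8 = 9.0                     # eight-photon events per hour
    rate_10 = 170.0 / HOURS_PER_YEAR # projected ten-photon events per hour

    # Implied cost of adding one more entangled pair:
    print(f"per-pair efficiency ~ {rate_10 / rate_8:.2e}")  # a few parts in a thousand
    ```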

    Which is why the work of Xi-Lin Wang and pals at the University of Science and Technology of China in Hefei is impressive. Today, they announce that they’ve produced 10-photon entanglement for the first time, and they’ve done it at a count rate that is three orders of magnitude higher than anything possible until now.

    The biggest bottleneck in entangling photons is the way they are produced. This involves a process called spontaneous parametric down conversion, in which one energetic photon is converted into two photons of lower energy inside a crystal of beta-barium borate. These daughter photons are naturally entangled.

    Experimental setup for generating ten-photon polarization-entangled GHZ states, from the science paper

    By zapping the crystal continuously with a laser beam, it is possible to create a stream of entangled photon pairs. However, the rate of down conversion is tiny: only about one photon in a trillion is converted. So collecting the entangled pairs efficiently is hugely important.

    That’s no easy task, not least because the photons come out of the crystal in slightly different directions, neither of which can be easily predicted. Physicists collect the photons from the two points where they are most likely to appear, but most of the entangled photons are lost.

    Xi-Lin and co have tackled this problem by reducing the uncertainty in the photon directions. Indeed, they have been able to shape the beams of entangled photons so that they form two separate circular beams, which can be more easily collected.

    In this way, the team has generated entangled photon pairs at the rate of about 10 million per watt of laser power. This is brighter than previous entanglement generators by a factor of about four. It is this improvement that makes 10-photon entanglement possible.

    Their method is to collect five successively generated pairs of entangled photons and pass them into an optical network of four beam splitters. The team then introduces time delays that ensure the photons arrive at the beam splitters simultaneously and so become entangled.

    This creates the 10-photon entangled state, albeit at a rate of about four per hour, which is low but measurable for the first time. “We demonstrate, for the first time, genuine and distillable entanglement of 10 single photons,” say Xi-Lin and co.

    That’s impressive work that immediately opens the prospect of a new generation of experiments. The most exciting of these is a technique called boson sampling that physicists hope will prove that quantum computers really are capable of things classical computers are not.

    That’s important because nobody has built a quantum computer more powerful than a pocket calculator (the controversial D-Wave results aside), nor is anyone likely to in the near future. So boson sampling is quantum physicists’ greatest hope for showing off the mind-boggling power of quantum computation for the first time.

    Other things also become possible, such as the quantum teleportation of three degrees of freedom in a single photon and multi-photon experiments over very long distances.

    But it is the possibility of boson sampling that will send a frisson through the quantum physics community.

    Ref: arxiv.org/abs/1605.08547: Experimental Ten-Photon Entanglement

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     