Tagged: Applied Research & Technology

  • richardmitnick 3:58 pm on September 17, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From physicsworld: “New plasmonic nanolaser is cavity-free” 


    Sep 17, 2014
    Tim Wogan

    A new design for a cavity-free nanolaser has been proposed by physicists at Imperial College London. The design builds on a proposal from the same team earlier this year to reduce the group velocity of light of a particular frequency to exactly zero in a metal–dielectric–metal waveguide. The laser, which has yet to be built, makes use of two such zero-velocity regions, and would achieve population inversion and create a laser beam without the need for an optical cavity. The researchers suggest that the design could have important applications in optical telecommunications and computing, as well as theoretical implications in reconciling the physics of lasers with plasmonics.

    Slowing light to a stop: nanolaser has no cavity

    The traditional design for a laser involves encasing a gain medium such as a gas in a cavity containing two opposing mirrors. The gain medium contains two electronic energy levels, and, in the natural state, the lower energy level is the more populated. However, by injecting electrical or light energy into the cavity, some electrons can be “pumped” into the upper state. At low pumping levels, atoms pushed to the upper level decay spontaneously back to the ground state with the emission of a photon. Above a certain threshold, however, transitions back to the ground state are predominantly stimulated: a photon passing an excited atom triggers it to emit a second photon. The two photons are perfectly in phase, and go on to stimulate emission from more atoms. The resulting beam of phase-coherent photons is the laser beam.
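
    The pumping-and-threshold behaviour described above can be captured by a toy pair of rate equations. This is an illustrative sketch only: the coefficients below are placeholders, not values from the article.

    ```python
    # Toy two-level laser model: inversion N and photon number phi.
    # All coefficients are illustrative placeholders.

    def simulate(pump, steps=200000, dt=1e-4):
        """Euler-integrate the coupled rate equations; return the final photon number."""
        N, phi = 0.0, 0.0
        B = 1.0      # stimulated-emission coefficient
        tau_n = 1.0  # inversion lifetime (spontaneous decay)
        tau_c = 0.5  # photon lifetime (loss from the cavity/trapping region)
        beta = 1e-4  # fraction of spontaneous emission seeding the lasing mode
        for _ in range(steps):
            dN = pump - N / tau_n - B * N * phi
            dphi = B * N * phi - phi / tau_c + beta * N / tau_n
            N += dN * dt
            phi += dphi * dt
        return phi

    below = simulate(pump=1.0)  # below threshold: only weak spontaneous emission
    above = simulate(pump=5.0)  # above threshold: stimulated emission dominates
    print(below, above)
    ```

    Below threshold the photon number stays near the spontaneous-emission floor; above it, stimulated emission takes over and the output jumps by orders of magnitude.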

    Lasers have revolutionized modern science and technology, with tiny lasers found everywhere from cheap pointers to state-of-the-art telecommunications systems. While much smaller nanoscale lasers would be useful for creating chip-based optical circuits, the need for a cavity means that it is difficult to miniaturize a conventional laser beyond the wavelength of the light it produces. This limit is about one micron for the light used in telecommunications technologies.

    Plasmonic interactions

    Now, Ortwin Hess and colleagues have devised a new way of producing a sub-wavelength laser by removing the cavity altogether. The researchers designed a layered metal–dielectric–metal waveguide structure that supports plasmonic interactions between light and conduction electrons at the metal–dielectric interfaces. Such a plasmonic waveguide supports two “zero-velocity singularities” at closely spaced but distinct frequencies. Light of other frequencies will propagate through the semiconductor very slowly – allowing it plenty of time to interact with the gain material. While slow and stopped light might sound like unphysical concepts, they can occur when light interacts with plasmons. Injecting a pulse of this slow light, the researchers calculated, will pump carriers from a lower energy state to a higher state. This higher state would then decay to an intermediate state, which would then decay to produce the laser light. The presence of the zero-velocity singularities causes the laser light to be trapped in the material, where it drives the desired coherent stimulated emission.
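
    The “zero-velocity singularity” idea can be illustrated numerically: the group velocity is the slope dω/dk of the dispersion relation, and a stopped-light mode sits where that slope vanishes. The polynomial dispersion below is a stand-in, not the real metal–dielectric–metal waveguide dispersion, which requires solving Maxwell's equations for the layered structure.

    ```python
    # Find where the group velocity v_g = dw/dk of a toy dispersion relation vanishes.
    # The polynomial omega(k) is a stand-in for the real plasmonic waveguide dispersion.
    import numpy as np

    k = np.linspace(0.0, 4.0, 4001)
    omega = 1.0 + 0.5 * (k - 1.0) ** 2 * (k - 3.0) ** 2  # two minima, at k = 1 and k = 3

    v_g = np.gradient(omega, k)  # numerical group velocity dw/dk
    stationary = k[np.isclose(v_g, 0.0, atol=5e-3)]
    print(stationary)
    # Stationary points appear near k = 1, 2 and 3; the two minima (k = 1 and k = 3)
    # play the role of the waveguide's two closely spaced zero-velocity frequencies.
    ```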

    To produce a laser beam, however, some of the laser light must be able to leave the device. In previous work (see “Plasmonic waveguide stops light in its tracks”), Hess and colleagues proposed exciting a zero-velocity mode by passing light through the cladding in the form of an evanescent wave – a non-propagating wave whose amplitude decays exponentially with distance. Radiation incident on the cladding would excite an evanescent wave, which would in turn excite the stopped-light mode in the dielectric inside. In their new paper, Hess and colleagues turn this idea on its head and use the evanescent field to allow laser light to escape. By varying the precise properties and thickness of the cladding layer, the proportion of light allowed to escape could be tuned, producing a laser beam of variable intensity.
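
    The tunable out-coupling can be sketched with the standard exponential decay of an evanescent field through a barrier; the decay constant used here is an assumed placeholder, not a property of the actual device.

    ```python
    import math

    def escape_fraction(thickness_nm, kappa_per_nm=0.02):
        """Fraction of trapped light tunnelling out through the cladding.
        Evanescent intensity decays as exp(-2*kappa*d); kappa here is a placeholder."""
        return math.exp(-2.0 * kappa_per_nm * thickness_nm)

    for d in (25, 50, 100, 200):
        print(d, escape_fraction(d))
    # Doubling the thickness squares the escaping fraction, so small changes in
    # cladding thickness tune the output beam intensity over a wide range.
    ```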

    Biomedical applications

    Nicholas Fang, a nanophotonics expert at the Massachusetts Institute of Technology, believes that, if such cavity-free nanolasers could be produced, they could have major practical implications not only in computation and signalling, but also in less-obvious fields such as prosthetics: he suggests they could be embedded in synthetic tissue to provide sensors with output signals detectable by the nervous system. “Here you’d have a laser source that could be directly implantable,” he explains.

    Hess, meanwhile, is excited by the potential theoretical implications of the work. While the current research focuses on using plasmonic interactions to produce coherent light, he believes that it should also be possible to keep the plasmons themselves confined within the waveguide to produce a miniature surface plasmon laser or “spaser”.

    The research is described in Nature Communications.

    See the full article here.

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.
    IOP Institute of Physics

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 11:03 am on September 17, 2014 Permalink | Reply
    Tags: Applied Research & Technology, Depression

    From Huff Post: “A Blood Test For Depression Shows The Illness Is Not A Matter Of Will” 

    The Huffington Post

    Anna Almendrala

    Screening for depression might soon be as easy as a blood test.


    A new test that identifies particular molecules in the blood could help doctors diagnose patients with clinical depression, according to a new study published in the journal Translational Psychiatry. The blood test can also predict which therapies would be most successful for patients, and lays the groundwork for one day identifying people who are especially vulnerable to depression — even before they’ve gone through a depressive episode.

    But perhaps just as important, said lead investigator Eva Redei, Ph.D., is the potential the test has for taking some of the stigma out of a depression diagnosis. When depression can be confirmed with a blood test like any other physical ailment, she said, there’s less stigma about having the disease and getting treatment.

    “I really believe that having an objective diagnosis will decrease stigma,” Redei, a neuroscientist and professor at the Northwestern University Feinberg School of Medicine, told The Huffington Post. “Once you have numbers in your hand, you can identify that [depression] is an illness — not a matter of will.”

    The most effective way to treat depression is to treat it early, but past studies show that it takes an average of two to 40 months to diagnose depression — if it gets diagnosed at all. Redei’s depression blood test could lead to faster and more accurate diagnoses, thereby transforming the way depression is treated.

    If Redei’s findings are independently replicated and confirmed, then approved by the Food and Drug Administration, laboratories across the U.S. could incorporate the test into their battery of routine exams. This is in contrast to MDDScore, a depression blood test owned by Ridge Diagnostics that was announced in 2012. Because the test is proprietary to Ridge Diagnostics, doctors have to submit samples to the company’s lab in North Carolina, where the company analyzes the blood and sends back results. Redei’s test, however, “can be done by any clinical laboratory anywhere, just like a cholesterol test,” Redei explained. “That is, assuming that we can go through the FDA approval [process] fast.”

    Redei’s study compared the blood samples of 32 patients who had been diagnosed with depression in the traditional way (a clinical interview) with samples taken from 32 people without depression. She found nine RNA blood markers — the molecules that carry out DNA’s instructions — that differed significantly between the two groups, which she then used as the basis for the depression diagnosis.
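
    The marker-selection step can be sketched as a per-marker two-sample comparison. Everything below is synthetic and illustrative: the data, the 20 candidate markers, and the crude |t| > 3 cutoff are assumptions, not details from the study.

    ```python
    # Sketch of selecting differentially expressed markers between two groups of 32.
    # Synthetic data: markers 0-8 are given a true shift, the rest are pure noise.
    import random
    import statistics

    random.seed(0)
    N = 32  # participants per group, as in the study

    def welch_t(a, b):
        """Welch's t-statistic for two independent samples."""
        va, vb = statistics.variance(a), statistics.variance(b)
        return (statistics.mean(a) - statistics.mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

    selected = []
    for i in range(20):
        shift = 1.5 if i < 9 else 0.0  # only the first nine markers truly differ
        depressed = [random.gauss(shift, 1.0) for _ in range(N)]
        controls = [random.gauss(0.0, 1.0) for _ in range(N)]
        if abs(welch_t(depressed, controls)) > 3.0:  # crude significance cutoff
            selected.append(f"marker_{i}")

    print(selected)
    ```

    With a real shift between groups, the truly differing markers dominate the selection, mirroring how nine markers separated the depressed and control groups in the study.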

    Then, the depressed patients went through 18 weeks of cognitive behavioral therapy, a common treatment for depression. Re-testing their blood, Redei was able to tell which patients had benefitted the most from therapy, just by examining the changes in their RNA markers. In other words, the test was also a biological way to tell if treatment had been effective.

    Finally, Redei also noticed that there were three RNA markers that didn’t change in depressed patients, whether or not they had benefitted from cognitive behavioral therapy. She suspects these may be markers that show if a person is predisposed to depression.

    “Being aware of people who are more susceptible to recurring depression allows us to monitor them more closely,” said David Mohr, Ph.D., co-lead author of the study, in a press release. “They can consider a maintenance dose of antidepressants or continued psychotherapy to diminish the severity of a future episode or prolong the intervals between episodes.”

    Zachary Kaminsky, Ph.D., of the Mood Disorders Center at Johns Hopkins Medicine, wasn’t involved with the study but is excited about its potential implications for depression treatment. Kaminsky is a pioneer in blood tests to predict suicide risk, and although he and Redei measure very different things in their tests, he sees that both researchers have similar goals when it comes to creating biological tests for mental illnesses.

    “It’s an exciting time — there is potential to find factors that are going to distinguish between various mental illnesses as well as responses to direct clinical treatment,” said Kaminsky to HuffPost. “Any finding that gets us closer to that is very interesting and worth following up.”

    But Kaminsky also pointed out that Redei and Mohr’s research still needs to be independently validated by other patient populations to confirm that it works. For instance, Kaminsky pointed out, the study would have been more scientifically rigorous if it had used a different patient group to confirm the blood test, as opposed to using the same participants to both create and then test the predictions.

    “I think this is very early stage and this model needs to be investigated in an independent sample,” Kaminsky said. “It will be important to test the predictability of these expression measures in independent cohorts.”

    Redei acknowledged that the next step in research would be to run the tests on larger samples in order to validate the models and then submit them for FDA approval.

    “The major question here is always funding,” said Redei. “We are really trying to gather as much funding from as many sources as possible so it can move ahead.”

    Major depressive disorder affects an estimated 6.7 percent of the U.S. population and is the leading cause of disability for Americans ages 15 to 44, according to the Anxiety and Depression Association of America. Despite the research hurdles she still needs to overcome, Redei is confident that her test can make a positive impact on the millions who struggle with depression — not only by making treatment more precise, but by bringing psychiatry “into the 21st century,” Redei said. “We’ll get to the point where there won’t be any discrimination between physical illness and mental illness.”
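
    As a back-of-envelope check on the scale of that prevalence figure (the 2014 US population used below is an assumption):

    ```python
    us_population = 318_000_000  # approximate 2014 US population (assumption)
    prevalence = 0.067           # 6.7% with major depressive disorder, per the ADAA
    affected = us_population * prevalence
    print(round(affected / 1e6, 1))  # on the order of 21 million people
    ```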

    See the full article here.




  • richardmitnick 9:39 am on September 17, 2014 Permalink | Reply
    Tags: Applied Research & Technology, Stem Cell Research

    From New Scientist: “Stem cells used in landmark therapy for failing sight” 


    New Scientist

    17 September 2014
    Andy Coghlan

    A woman in Japan has received the first medical treatment based on induced pluripotent stem cells, eight years after they were discovered.

    The iPS cells were made by reprogramming skin cells from the woman’s arm, then transformed into specialised eye cells to treat age-related macular degeneration (AMD), a condition that affects millions of elderly people worldwide, and often results in blindness. Last week, the woman, who is in her 70s, had a patch of the cells measuring 1.3 by 3 millimetres grafted into her eye in a two-hour operation.

    Grafts derived from stem cells could keep the retina in good working order (Image: Science Source/Science Photo Library)

    She is the first of six people lined up for the landmark treatment, developed by Masayo Takahashi and her colleagues at the RIKEN Center for Developmental Biology in Kobe, Japan. In a pilot study to test the safety of putting iPS-derived cells into humans, the six are all receiving a graft of new retinal pigment epithelial (RPE) cells, which serve to maintain the eye’s light-sensing cells.

    No embryos needed

    Since iPS cells can be made from adult tissue samples, the technique does not require the destruction of embryos, unlike stem-cell-based AMD treatments that are also being worked on – one such treatment is being trialled in the US and UK.

    “It’s an exciting development, and we await the outcome over the next year to see how well these cells integrate, and if there are any potential adverse reactions,” says Mike Cheetham of the Institute of Ophthalmology at University College London, one site which is also researching a human embryonic stem-cell treatment for AMD. “If it goes well, it could be the start of a new era in personalised medicine,” he says.

    Shinya Yamanaka and his colleagues at Kyoto University in Japan discovered iPS cells in 2006. In 2012, Yamanaka was awarded the Nobel prize for the work.

    See the full article here.




  • richardmitnick 11:39 am on September 16, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From M.I.T.: “Neuroscientists identify key role of language gene” 

    MIT News

    September 15, 2014
    Anne Trafton | MIT News Office

    Neuroscientists have found that a gene mutation that arose more than half a million years ago may be key to humans’ unique ability to produce and understand speech.


    Researchers from MIT and several European universities have shown that the human version of a gene called Foxp2 makes it easier to transform new experiences into routine procedures. When they engineered mice to express humanized Foxp2, the mice learned to run a maze much more quickly than normal mice.

    The findings suggest that Foxp2 may help humans with a key component of learning language — transforming experiences, such as hearing the word “glass” when we are shown a glass of water, into a nearly automatic association of that word with objects that look and function like glasses, says Ann Graybiel, an MIT Institute Professor, member of MIT’s McGovern Institute for Brain Research, and a senior author of the study.

    “This really is an important brick in the wall saying that the form of the gene that allowed us to speak may have something to do with a special kind of learning, which takes us from having to make conscious associations in order to act to a nearly automatic-pilot way of acting based on the cues around us,” Graybiel says.

    Wolfgang Enard, a professor of anthropology and human genetics at Ludwig-Maximilians University in Germany, is also a senior author of the study, which appears in the Proceedings of the National Academy of Sciences this week. The paper’s lead authors are Christiane Schreiweis, a former visiting graduate student at MIT, and Ulrich Bornschein of the Max Planck Institute for Evolutionary Anthropology in Germany.

    All animal species communicate with each other, but humans have a unique ability to generate and comprehend language. Foxp2 is one of several genes that scientists believe may have contributed to the development of these linguistic skills. The gene was first identified in a group of family members who had severe difficulties in speaking and understanding speech, and who were found to carry a mutated version of the Foxp2 gene.

    In 2009, Svante Pääbo, director of the Max Planck Institute for Evolutionary Anthropology, and his team engineered mice to express the human form of the Foxp2 gene, which encodes a protein that differs from the mouse version by only two amino acids. His team found that these mice had longer dendrites — the slender extensions that neurons use to communicate with each other — in the striatum, a part of the brain implicated in habit formation. They were also better at forming new synapses, or connections between neurons.

    Pääbo, who is also an author of the new PNAS paper, and Enard enlisted Graybiel, an expert in the striatum, to help study the behavioral effects of replacing Foxp2. They found that the mice with humanized Foxp2 were better at learning to run a T-shaped maze, in which the mice must decide whether to turn left or right at a T-shaped junction, based on the texture of the maze floor, to earn a food reward.

    The first phase of this type of learning requires using declarative memory, or memory for events and places. Over time, these memory cues become embedded as habits and are encoded through procedural memory — the type of memory necessary for routine tasks, such as driving to work every day or hitting a tennis forehand after thousands of practice strokes.

    Using another type of maze called a cross-maze, Schreiweis and her MIT colleagues were able to test the mice’s ability in each type of memory alone, as well as the interaction of the two types. They found that the mice with humanized Foxp2 performed the same as normal mice when just one type of memory was needed, but their performance was superior when the learning task required them to convert declarative memories into habitual routines. The key finding was therefore that the humanized Foxp2 gene makes it easier to turn mindful actions into behavioral routines.

    The protein produced by Foxp2 is a transcription factor, meaning that it turns other genes on and off. In this study, the researchers found that Foxp2 appears to turn on genes involved in the regulation of synaptic connections between neurons. They also found enhanced dopamine activity in a part of the striatum that is involved in forming procedures. In addition, the neurons of some striatal regions could be turned off for longer periods in response to prolonged activation — a phenomenon known as long-term depression, which is necessary for learning new tasks and forming memories.

    Together, these changes help to “tune” the brain differently to adapt it to speech and language acquisition, the researchers believe. They are now further investigating how Foxp2 may interact with other genes to produce its effects on learning and language.

    This study “provides new ways to think about the evolution of Foxp2 function in the brain,” says Genevieve Konopka, an assistant professor of neuroscience at the University of Texas Southwestern Medical Center who was not involved in the research. “It suggests that human Foxp2 facilitates learning that has been conducive for the emergence of speech and language in humans. The observed differences in dopamine levels and long-term depression in a region-specific manner are also striking and begin to provide mechanistic details of how the molecular evolution of one gene might lead to alterations in behavior.”

    The research was funded by the Nancy Lurie Marks Family Foundation, the Simons Foundation Autism Research Initiative, the National Institutes of Health, the Wellcome Trust, the Fondation pour la Recherche Médicale, and the Max Planck Society.

    See the full article here.




  • richardmitnick 3:56 pm on September 15, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From Triumf: “Postdoc Publishes Theory Breakthrough” 

    Triumf Lab

    15 September 2014
    Nick Leach, Outreach Assistant

    The basic interaction between the constituents of an atomic nucleus (‘nucleons’, i.e. protons and neutrons) has been well understood for decades; however, the interaction’s strength has meant that calculations for all but the very simplest nuclear systems (e.g. the deuteron = 1 proton + 1 neutron) were initially too complex to do from basic principles, necessitating various approximation methods to make reliable predictions. Nonetheless, theoretical groups worldwide have persevered in their attempts at establishing reliable “ab initio” techniques, and much progress has been made in describing ever-larger nuclei starting from the fundamental nucleon forces. TRIUMF theorists have been at the forefront of many of these advances.

    The quark structure of the neutron. The color assignment of individual quarks is arbitrary, but all three colors must be present. Forces between quarks are mediated by gluons.

    The quark structure of the proton. The color assignment of individual quarks is arbitrary, but all three colors must be present. Forces between quarks are mediated by gluons.

    Recently, scientists from TRIUMF and the Lawrence Livermore National Laboratory (LLNL) published a paper in the prestigious Physical Review Letters outlining a technique which, for the first time, enables researchers to analyze systems of three nuclear clusters in relative motion while treating the individual nucleons as fundamental components interacting via accurate nucleon-nucleon interactions. The work by Theory postdoc Carolina Romero-Redondo and Petr Navratil (Theory Department, TRIUMF), in conjunction with Sofia Quaglioni and Guillaume Hupin (LLNL), has produced the first successful ab initio analysis describing energy states in the He-6 nucleus (2 protons + 4 neutrons).


    Carolina Romero-Redondo

    He-6 is an exotic nucleus that can be described as a three-cluster “halo” nucleus – a tightly bound He-4 core orbited by two neutrons. Part of what makes this particular nucleus so fascinating is that although the trio are bound together when all three bodies are present, removing just one renders the whole structure unstable. Such nuclei are known as “Borromean” nuclei, after the similarly named Borromean rings, which exhibit the same all-or-nothing structure. The He-6 nucleus is difficult to study experimentally, and as such its energy spectrum is not yet firmly established. Excitingly, the results by Romero-Redondo et al. are consistent with recent experiments, correctly identifying some known energy states (‘resonances’). They also predict new energy states and do not find others (e.g. a low-energy “1−” state) predicted by other formalisms.

    In the future, this new approach will be applied to study systems such as H-5 as a 3H+n+n trio, and Li-11 as a 9Li+n+n configuration.

    Having completed her term at TRIUMF, Carolina will continue her innovative research on three-cluster systems at her new appointment at LLNL.

    Congratulations to Romero-Redondo and her colleagues for this excellent contribution!

    See the full article here.

    World Class Science at Triumf Lab, British Columbia, Canada
    Canada’s national laboratory for particle and nuclear physics
    Member Universities:
    University of Alberta, University of British Columbia, Carleton University, University of Guelph, University of Manitoba, Université de Montréal, Simon Fraser University,
    Queen’s University, University of Toronto, University of Victoria, York University. Not too shabby, eh?

    Associate Members:
    University of Calgary, McMaster University, University of Northern British Columbia, University of Regina, Saint Mary’s University, University of Winnipeg. How bad is that!!



  • richardmitnick 3:37 pm on September 15, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From Rutgers: “Railway Platform Gap Bridging Unit” 

    Rutgers University


    Invention Summary

    In a recent report sponsored by the New Jersey Department of Transportation, it was found that car/platform gap injuries, including fatalities, accounted for 25% of all injuries on NJ TRANSIT rail lines. Similar findings have been reported by other agencies, including the Long Island Rail Road and the Metropolitan Transportation Authority of New York. These results prompted the Federal Railroad Safety Advisory Committee to address the seriousness of the car/platform gap issue.

    Researchers at Rutgers University have developed an apparatus that dynamically fills the gap between the door sill of a commuter rail train car and the station platform. This innovative trapdoor unit incorporates an automated slide for bridging a train/platform gap up to 15 inches wide. The unit is specific to train cars fitted with a trapdoor system for high/low-level platform service. It is based on the profile/footprint of the existing trapdoors of multi-level cars currently in service on NJ Transit and other lines, which allows the unit to be retrofitted to an existing train car with no structural modifications. The gap bridging slide is actuated by a simple motor/power-screw arrangement for high reliability and direct operation at 72 volts DC, consistent with carriage equipment supply voltage.

    The design puts passenger safety first: the gap bridging slide extends until a proximity sensor detects engagement with any object, at which point extension promptly ceases. The extended slide forms a rigid cantilever structure designed to handle traffic up to 1000 lbs, and extends up to 12 inches. In addition, the unit is fitted with a motion sensor that measures the actual size of the gap each time the slide is extended, allowing real-time measurements to be recorded and a history of gap size at each station to be built up. The unit weighs 140 lbs (less than the currently fitted trapdoors, which weigh 180 lbs) and costs about $2800 per unit. The unit has a built-in, hard-wired control requiring only an extension/retraction signal for operation and synchronization with the rail carriage doors’ opening.
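
    The extend-until-engagement behaviour can be sketched as a simple control loop. The sensor interfaces and the 0.1-inch step size are hypothetical; only the 12-inch travel limit and the idea of logging the measured gap come from the description above.

    ```python
    # Hypothetical sketch of the slide's extend-until-engagement logic.

    MAX_EXTENSION_IN = 12.0  # maximum cantilever travel, per the description
    STEP_IN = 0.1            # hypothetical increment per control cycle

    def extend_slide(proximity_engaged, log_gap):
        """Extend until the proximity sensor reports engagement or travel runs out.

        proximity_engaged(position) -> bool; log_gap(size) records the measured gap.
        """
        position = 0.0
        while position < MAX_EXTENSION_IN and not proximity_engaged(position):
            position += STEP_IN
        log_gap(position)  # build up the per-station gap-size history
        return position

    # Simulate a platform edge 4.3 inches away:
    history = []
    gap = extend_slide(lambda pos: pos >= 4.3, history.append)
    print(gap, history)
    ```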

    The apparatus serves two current needs: 1) The Americans with Disabilities Act (ADA) requirement that passengers in wheelchairs and other mobility-impaired travelers have accessibility to get on and off passenger trains and 2) Protection against slip and fall injuries to passengers due to excessive gaps between the train and the platform.
    Market Application
    Can be used for passenger protection on railcars of a particular design in which high level platforms are serviced by a trap door, which is a common configuration on many U.S. commuter railcars.

    Minimum implementation cost
    Simple control/maintenance requirements
    No modification to carriage structure

    Intellectual Property & Development Status
    Patent pending

    See the full article here.

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    Rutgers Seal




  • richardmitnick 3:27 pm on September 15, 2014 Permalink | Reply
    Tags: Applied Research & Technology, PETRA III

    From DESY: “Double topping-out celebrations at DESY” 


    Two new experimental halls for research light source PETRA III

    Today DESY celebrates the topping-out of two large experimental halls for the research light source PETRA III. Ten additional beamlines, which will serve the PETRA III particle accelerator’s high-intensity X-ray experiments, are under construction in a space measuring approximately 6000 square metres; the facility will also include en-suite offices and laboratory spaces for scientists. The experimentation capabilities at the PETRA III synchrotron radiation source will be considerably increased by the expansion project. The first new beamlines of the 80-million-euro project will be ready for operation beginning in autumn 2015.


    “With the new experimental stations, we are significantly expanding the research capabilities of PETRA III, for example, with new nanospectroscopy and materials research technologies,” says Chairman of the DESY Board of Directors Professor Helmut Dosch at the event. “At the same time, we will be fulfilling the enormous worldwide scientific demand for the best synchrotron radiation source in the world.”

    Hamburg’s Science Senator Dr. Dorothee Stapelfeldt says: “The senate’s aim is to develop Hamburg into one of the leading locations for research and innovation in Europe. In order to do so, it is essential to further raise the profiles of universities and research institutions in close dialogue with all stakeholders. Hamburg already occupies a leading position in structural research. The ground-breaking cooperation between DESY, the university and their partners at the Bahrenfeld research campus has been clearly recognized internationally. With the two new experimental halls, PETRA’s synchrotron radiation will be made available to even more researchers from all over the world in the future.”

    “With a total of ten new beamlines, the allure of Hamburg as a location for cutting-edge research will continue to increase, nationally and internationally,” says Dr. Beatrix Vierkorn-Rudolph (BMBF), Chairperson of the DESY Foundation Council. “With its excellent research opportunities, PETRA III contributes to rapidly transferring the results of basic research into application while also strengthening the innovative power of Germany.”

    DESY’s 2.3-kilometre-long PETRA III ring accelerator produces high-intensity, highly collimated X-ray pulses for a diverse range of physical, biological and chemical experiments. Fourteen measuring stations, which can accommodate up to thirty experiments, already exist in an approximately 300-metre-long experimental hall. The properties of the light pulses that PETRA delivers to the different measuring stations are precisely attuned to the different research disciplines. Using the extremely brilliant X-rays, researchers study, for example, innovative solar cells, observe the dynamics of cell membranes and analyse fossilised dinosaur eggs.

    PETRA III, the world’s best X-ray source of its kind, has been heavily over-booked since it began operations in 2009. The PETRA III Extension Project was begun in December 2013 to give more scientists access to the unique experimental possibilities of this research light source and to broaden PETRA III’s research portfolio in experimental technologies: measuring approximately 6000 square metres in their entirety, the two new experimental halls house enough space for technical installations of up to ten additional beamlines, and an additional 1400 square metres provide office and laboratory space for the scientists. The beamlines and measuring instruments in the new halls are under construction in close cooperation with the future user community and are, in part, collaborative research projects. Three of the future PETRA beamlines will be constructed in international partnership with Sweden, India and Russia.

    Altogether, approximately 170 metres of the PETRA tunnel and accelerator have been dismantled since February to build the new experimental halls. Since August, the accelerator, equipped with special magnets for producing X-ray radiation, has been under reconstruction within the new tunnel sections that have already been completed. After the preliminary construction phase, the experimental halls are to be fitted out further from December 2014 onward, while the accelerator resumes operation at the same time. Experiments will restart in the PETRA III experimental hall “Max von Laue” in April 2015, and the first measuring stations in the new, still unnamed halls should gradually become ready for operation between autumn 2015 and the start of 2016.

    The extension’s total budget of approximately 80 million euros stems in large part from the Helmholtz Association’s expansion funds as well as funds from the Federal Ministry of Research, the Free and Hanseatic City of Hamburg and DESY. Collaborative partners from Germany and abroad cover approximately one third of the costs.

    See the full article here.


    DESY is one of the world’s leading accelerator centres. Researchers use the large-scale facilities at DESY to explore the microcosm in all its variety – from the interactions of tiny elementary particles and the behaviour of new types of nanomaterials to biomolecular processes that are essential to life. The accelerators and detectors that DESY develops and builds are unique research tools. The facilities generate the world’s most intense X-ray light, accelerate particles to record energies and open completely new windows onto the universe. 
That makes DESY not only a magnet for more than 3000 guest researchers from over 40 countries every year, but also a coveted partner for national and international cooperations. Committed young researchers find an exciting interdisciplinary setting at DESY. The research centre offers specialized training for a large number of professions. DESY cooperates with industry and business to promote new technologies that will benefit society and encourage innovations. This also benefits the metropolitan regions of the two DESY locations, Hamburg and Zeuthen near Berlin.

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 3:10 pm on September 15, 2014 Permalink | Reply
    Tags: Applied Research & Technology, ,   

    From BNL: “Elusive Quantum Transformations Found Near Absolute Zero” 

    Brookhaven Lab

    September 15, 2014
    Justin Eure, (631) 344-2347 or Peter Genzer, (631) 344-3174

    Brookhaven Lab and Stony Brook University researchers measure the quantum fluctuations behind a novel magnetic material’s ultra-cold ferromagnetic phase transition.

    Heat drives classical phase transitions—think solid, liquid, and gas—but much stranger things can happen when the temperature drops. If phase transitions occur at the coldest temperatures imaginable, where quantum mechanics reigns, subtle fluctuations can dramatically transform a material.

    Scientists from the U.S. Department of Energy’s Brookhaven National Laboratory and Stony Brook University have explored this frigid landscape of absolute zero to isolate and probe these quantum phase transitions with unprecedented precision.

    Liusuo Wu, a Stony Brook University Ph.D. student and lead author on the study, with his postdoctoral advisor (and study coauthor) Meigan Aronson, a Brookhaven Lab physicist and Stony Brook professor

    “Under these cold conditions, the electronic, magnetic, and thermodynamic performance of metallic materials is defined by these elusive quantum fluctuations,” said study coauthor Meigan Aronson, a physicist at Brookhaven Lab and professor at Stony Brook. “For the first time, we have a picture of one of the most fundamental electron states without ambient heat obscuring or complicating those properties.”

    The scientists explored the onset of ferromagnetism—the same magnetic polarization exploited in advanced electronic devices, electrical motors, and even refrigerator magnets—in a custom-synthesized iron compound as it approached absolute zero.

    The research provides new methods to identify and understand novel materials with powerful and unexpected properties, including superconductivity—the ability to conduct electricity with perfect efficiency. The study will be published online Sept. 15, 2014, in the journal Proceedings of the National Academy of Sciences.

    “Exposing this quantum phase transition allows us to predict and potentially boost the performance of new materials in practical ways that were previously only theoretical,” said study coauthor and Brookhaven Lab physicist Alexei Tsvelik.

    Mapping Quantum Landscapes

    Rendering of the near–perfect crystal structure of the yttrium–iron–aluminum compound used in the study. The two–dimensional layers of the material allowed the scientists to isolate the magnetic ordering that emerged near absolute zero.

    The presence of heat complicates or overpowers the so-called quantum critical fluctuations, so the scientists conducted experiments at the lowest possible temperatures.

    “The laws of thermodynamics make absolute zero unreachable, but the quantum phase transitions can actually be observed at nonzero temperatures,” Aronson said. “Even so, in order to deduce the full quantum mechanical nature, we needed to reach temperatures as low as 0.06 Kelvin—much, much colder than liquid helium or even interstellar space.”
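A quick comparison puts that 0.06 kelvin in context. The reference temperatures below are standard textbook values, not figures from the article:

```python
t_experiment = 0.06   # kelvin, lowest temperature reached in the study
t_helium = 4.2        # kelvin, boiling point of liquid helium at 1 atm
t_cmb = 2.725         # kelvin, cosmic microwave background ("deep space")

print(f"{t_helium / t_experiment:.0f}x colder than boiling liquid helium")
print(f"{t_cmb / t_experiment:.0f}x colder than deep space")
```

Even the cosmic microwave background, the coldest "ambient" temperature in nature, is some 45 times warmer than the conditions of these measurements.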

    The researchers used a novel compound of yttrium, iron, and aluminum (YFe2Al10), which they discovered while searching for new superconductors. This layered, metallic material sits poised on the threshold of ferromagnetic order, a key and very rare property.

    “Our thermodynamic and magnetic measurements proved that YFe2Al10 becomes ferromagnetic exactly at absolute zero—a sharp contrast to iron, which is ferromagnetic well above room temperature,” Aronson said. “Further, we used magnetic fields to reverse this ferromagnetic order, proving that quantum fluctuations were responsible.”

    The collaboration produced near-perfect samples to prove that material defects could not impact the results. They were also the first group to prepare YFe2Al10 in single-crystal form, which allowed them to show that the emergent magnetism resided within two-dimensional layers.

    “As the ferromagnetism decayed with heat or applied magnetic fields, we used theory to identify the spatial and temporal fluctuations that drove the transition,” Tsvelik said. “That fundamental information provides insight into countless other materials.”

    Quantum Clues to New Materials

    The scientists plan to modify the composition of YFe2Al10 so that it becomes ferromagnetic at nonzero temperatures, opening another window onto the relationship between temperature, quantum transitions, and material performance.

    “Robust magnetic ordering generally blocks superconductivity, but suppressing this state might achieve the exact balance of quantum fluctuations needed to realize unconventional superconductivity,” Tsvelik said. “It is a matter of great experimental and theoretical interest to isolate these competing quantum interactions that favor magnetism in one case and superconductivity in the other.”

    Added Aronson, “Having more examples displaying this zero-temperature interplay of superconductivity and magnetism is crucial as we develop a holistic understanding of how these phenomena are related and how we might ultimately control these properties in new generations of materials.”

    Other authors on this study include Liusuo Wu, Moosung Kim, and Keeseong Park, all of Stony Brook University’s Department of Physics and Astronomy.

    The research was conducted at Brookhaven Lab’s Condensed Matter Physics and Materials Science Department and supported by the U.S. Department of Energy’s Office of Science (BES).

    BNL Campus

    See the full article here.

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.




  • richardmitnick 2:12 pm on September 15, 2014 Permalink | Reply
    Tags: Applied Research & Technology, , Harvard University,   

    From NOVA: “‘Biospleen Device’ Uses Magnetic Nanoparticles to Filter Pathogens from Blood” 



    Mon, 15 Sep 2014
    Tim De Chant

    When a patient succumbs to an infection, it’s not the mere presence of the pathogen that kills them, but rather the sheer quantity of it. With many deadly diseases, the immune system simply can’t keep up. So bioengineers figured that outsourcing some of those duties could help keep patients alive.

    A team of bioengineers led by Donald Ingber at Harvard’s Wyss Institute devised a device to filter pathogens from a patient’s blood. Inspired by the spleen, an organ which filters antibody-coated pathogens from the bloodstream, the “biospleen” works by first injecting specially treated, magnetic nanoparticles into the blood flowing through it. The nanoparticles have a protein attached to their surfaces which adheres to bacteria, viruses, and fungi; the protein-coated nanoparticles work like antibodies, which glom onto foreign objects. The biospleen then uses a magnet to pull out the nanoparticles and the pathogens they’re attached to.

    The filtering section of the biospleen

    The biospleen is similar in concept to dialysis, which mimics the function of the kidneys, but works on pathogens instead of typical bodily waste.

    Sara Reardon, reporting for Nature News, has more details:

    To test the device, Ingber and his team infected rats with either E. coli or Staphylococcus aureus and filtered blood from some of the animals through the biospleen. Five hours after infection, 89% of the rats whose blood had been filtered were still alive, compared with only 14% of those that were infected but not treated. The researchers found that the device had removed more than 90% of the bacteria from the rats’ blood. The rats whose blood had been filtered also had less inflammation in their lungs and other organs, suggesting they would be less prone to sepsis.

    Ingber and his team also tested the device at human blood volumes, finding that it took about five hours to filter most pathogens from five liters of blood.
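A back-of-the-envelope sketch of what those figures imply. The exponential-clearance assumption below is a simplification for illustration, not a claim from the article; the input numbers are the ones quoted above:

```python
import math

# Figures quoted in the article: ~5 litres of blood (roughly an adult's
# total blood volume) filtered in ~5 hours, with >90% of bacteria
# removed in the rat experiments.
blood_volume_l = 5.0
treatment_h = 5.0
removal_fraction = 0.90

# Average throughput in litres per hour
flow_rate = blood_volume_l / treatment_h
print(f"Average throughput: {flow_rate:.1f} L/h")

# If clearance were roughly exponential (an assumed model), the implied
# first-order rate constant for pathogen removal would be:
k = -math.log(1 - removal_fraction) / treatment_h
print(f"Implied clearance rate constant: {k:.2f} per hour")
```

Under that simple model, each additional hour of filtering removes roughly the same *fraction* of the remaining pathogens, which is why extended treatment could keep driving the load down in a prolonged infection.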

    If the device makes it into trials, which could happen in just a few years, it could give doctors the upper hand in a number of intractable infections, including HIV and Ebola. The biospleen could reduce the pathogen load in a patient’s blood, leaving the drugs that normally treat the infection to clear the virus or bacteria from the patient’s organs. In acute infections, like Ebola, it would also buy doctors valuable time in their efforts to eliminate the virus before the patient succumbs.

    See the full article here.

    NOVA is the highest rated science series on television and the most watched documentary series on public television. It is also one of television’s most acclaimed series, having won every major television award, most of them many times over.




  • richardmitnick 1:54 pm on September 15, 2014 Permalink | Reply
    Tags: Applied Research & Technology, , SLAC SUNCAT, Statistical Analysis   

    From D.O.E. Pulse: “Uncertainty gives scientists new confidence in search for novel materials” 


    DOE Pulse

    September 15, 2014
    Angela Anderson, 650.926.3505,

    Scientists at Stanford University and DOE’s SLAC National Accelerator Laboratory have found a way to estimate uncertainties in computer calculations that are widely used to speed the search for new materials for industry, electronics, energy, drug design and a host of other applications.

    This image shows the results of calculations aimed at determining which of six chemical elements would make the best catalyst for promoting an ammonia synthesis reaction. Researchers at SLAC and Stanford used Density Functional Theory (DFT) to calculate the strength of the bond between nitrogen atoms and the surfaces of the catalysts. The bond strength, plotted on the horizontal axis, is a key factor in determining the reaction speed, plotted on the vertical axis. Based on thousands of these calculations, which yielded a range of results (colored dots) that reveal the uncertainty involved, researchers estimated an 80 percent chance that ruthenium (Ru, in red) will be a better catalyst than iron (Fe, in orange). (Andrew Medford and Aleksandra Vojvodic/SUNCAT, Callie Cullum)

    The technique, reported in a recent issue of Science, should quickly be adopted in studies that produce some 30,000 scientific papers per year.

    “Over the past 10 years our ability to calculate the properties of materials and chemicals, such as reactivity and mechanical strength, has increased enormously. It’s totally exploded,” said Jens Nørskov, a professor at SLAC and Stanford and director of the SUNCAT Center for Interface Science and Catalysis, who led the research.

    “As more and more researchers use computer simulations to predict which materials have the interesting properties we’re looking for — part of a process called ‘materials by design’ — knowing the probability for error in these calculations is essential,” he said. “It tells us exactly how much confidence we can put in our results.”

    Nørskov and his colleagues have been at the forefront of developing this approach, using it to find better and cheaper catalysts to speed ammonia synthesis and generate hydrogen gas for fuel, among other things. But the technique they describe in the paper can be broadly applied to all kinds of scientific studies.

    Speeding the Material Design Cycle

    The set of calculations involved in this study is known as DFT, for Density Functional Theory. It predicts bond energies between atoms based on the principles of quantum mechanics. DFT calculations allow scientists to predict hundreds of chemical and materials properties, from the electronic structures of compounds to density, hardness, optical properties and reactivity.

    Because researchers use approximations to simplify the calculations — otherwise they’d take too much computer time — each of these calculated material properties could be off by a fairly wide margin.

    To estimate the size of those errors, the team applied a statistical method: They calculated each property thousands of times, each time tweaking one of the variables to produce slightly different results. That variation in results represents the possible range of error.

    “Even with the estimated uncertainties included, when we compared the calculated properties of different materials we were able to see clear trends,” said Andrew J. Medford, a graduate student with SUNCAT and first author of the study. “We could predict, for instance, that ruthenium would be a better catalyst for synthesizing ammonia than cobalt or nickel, and say what the likelihood is of our prediction being right.”
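The resampling procedure described above can be sketched in a few lines. Everything in this toy model is illustrative: the “volcano” rate function, the binding-energy means and spreads, and the sample count are assumptions standing in for the real DFT ensemble, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def reaction_rate(binding_energy, optimum=-0.8):
    """Toy 'volcano' relation: the rate peaks when the nitrogen binding
    energy hits an assumed optimum. Purely illustrative."""
    return -(binding_energy - optimum) ** 2

# Stand-ins for thousands of perturbed calculations: each sample is one
# run with a slightly tweaked variable. Means and spreads (in eV) are
# invented for this sketch.
n = 10_000
ru = rng.normal(loc=-0.9, scale=0.15, size=n)   # hypothetical "Ru"
fe = rng.normal(loc=-1.4, scale=0.15, size=n)   # hypothetical "Fe"

# The spread of predicted rates across the ensemble is the uncertainty
# estimate; comparing the two ensembles sample by sample turns a bare
# ranking into a probability.
p_ru_better = float(np.mean(reaction_rate(ru) > reaction_rate(fe)))
print(f"Estimated probability that Ru outperforms Fe: {p_ru_better:.2f}")
```

The key design point is that the output is a probability rather than a single number, which is exactly what lets the team make statements like “an 80 percent chance that ruthenium will be a better catalyst than iron.”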

    An Essential New Tool for Thousands of Studies

    DFT calculations are used in the Materials Genome Initiative to search through millions of solids and compounds, and are also widely used in drug design, said Kieron Burke, a professor of chemistry and physics at the University of California-Irvine who was not involved in the study.

    “There were roughly 30,000 papers published last year using DFT,” he said. “I believe the technique they’ve developed will become absolutely necessary for these kinds of calculations in all fields in a very short period of time.”

    Thomas Bligaard, a senior staff scientist in charge of theoretical method development at SUNCAT, said the team has a lot of work ahead in implementing these ideas, especially in calculations attempting to make predictions of new phenomena or new functional materials.

    Other researchers involved in the study were Jess Wellendorff, Aleksandra Vojvodic, Felix Studt, and Frank Abild-Pedersen of SUNCAT and Karsten W. Jacobsen of the Technical University of Denmark. Funding for the research came from the DOE Office of Science.

    See the full article here.

    DOE Pulse highlights work being done at the Department of Energy’s national laboratories. DOE’s laboratories house world-class facilities where more than 30,000 scientists and engineers perform cutting-edge research spanning DOE’s science, energy, national security and environmental quality missions. DOE Pulse is distributed twice each month.

    DOE Banner



