Tagged: Applied Research & Technology

  • richardmitnick 12:40 pm on November 22, 2014 Permalink | Reply
    Tags: Applied Research & Technology, HI-SEAS

    From SPACE.com: “Life in an 8-Month Mars Sim: A Q+A With the Hi-SEAS Team” 



    November 21, 2014
    SPACE.com Staff

    With support from NASA, the Hawai’i Space Exploration Analog and Simulation (HI-SEAS) program launched in 2013 to study how astronauts might interact during long deployments in isolation from the rest of Earth, such as those required for a manned trip to Mars.

    HI-SEAS is led by researchers at the University of Hawaii at Manoa, and the current mission is focused on the social, interpersonal and cognitive factors that affect team performance over time. HI-SEAS crew members are required to have “astronaut-like characteristics,” including the ability to pass a Class 2 flight physical examination and undergraduate training as a scientist or engineer. Like the astronaut mission specialists they represent, each participant brings a significant research project or other scholarly work of his or her own to complete while inside the space-analog habitat.

    The program is now on its third mission, which began on October 15 and continues until June 15. The current HI-SEAS crew includes: Martha Lenio (Commander), Jocelyn Dunn (Chief Scientist), Sophie Milam (Executive Officer), Allen Mirkadyrov (Crew Engineer), Neil Scheibelhut (Medical Officer) and Zak Wilson (Chief Engineer).

    To give Space.com readers a better sense of life on “Mars,” the crew provided the following exclusive Q+A.

    SDC: What motivated you to put your life on hold and join this project?

    Sophie Milam: I remember being five years old and telling everyone I wanted to be an astronaut, and throughout my life I’ve just never stopped wanting that — although I did have to face some harsh realities that I might not be able to go when I became gluten intolerant. When I got the HI-SEAS opportunity I had so many people supporting me and cheering me on, but what really made me want to go was the thought that I might be one of the rare people who got to fulfill a promise to their five-year-old self. I want to know I’ve contributed to human space exploration and that my time in this dome will make astronauts’ lives better in the future, but I also want to do this for the kid inside me who never let me give up the outlandish (outworldish?) dream of space travel, and to encourage kids everywhere to hold on to their dreams and make them come true.

    Martha Lenio: Like the others involved, I dream of one day becoming an astronaut. In addition to that though, I’m very interested in sustainable living and feel that a mission to Mars would be maximizing our ability to live within our means. I was interested to see how sustainable this project was already, and how I could help to improve it over the course of the mission. I also want to see what aspects of our life in the Dome will be able to translate to regular life on Earth.

    Zak Wilson: I’ve been interested in this type of thing for many years. About five years ago, I did a two-week stay at the Mars Desert Research Station (MDRS), another Mars analog program, located in Utah, which made me interested in doing this longer version. I first applied to HI-SEAS a couple of years ago and got accepted this summer. I think sending people to Mars would be an inspiring and valuable thing to do, so I’m happy to be able to contribute to the knowledge necessary for that to happen. I’d also love to be one of the people who get to go to Mars, so this is about as good a test/approximation as I can manage for how I would handle that while remaining on Earth. I also think this is just a generally interesting experience, and I always would have wondered what it would have been like if I had declined a spot on the crew.

    Jocelyn Dunn: Signing off from the world and living sustainably on “Mars” is both a technical and personal challenge. I was attracted to the idea of growing stronger intellectually and spiritually. In this confined environment, there are limited factors impacting our psychology, so it’s a great place to discover what is the core of my being, what makes me happy, what makes me stressed, and how can I better take care of myself and my relationships. Finding baselines of human behavior is also my research interest here. Again, being in this limited, semi-controlled environment provides an opportunity to collect data about our physiology and develop technology that can quantitatively decipher our health and mood states in an automated manner.

    Neil Scheibelhut: In a word … Pride. Sure I could have kept grinding away in Los Angeles, always looking for a better opportunity, always looking forward to that next paycheck so maybe I could treat myself to a concert or a nice dinner out. But that’s boring. And it’s mundane. It may be good enough for the average American, but I’m not the average American. I need to do something with my life that I can be proud of. So, here I am. I’m a part of something that could help put man on Mars. THAT’s something to be proud of. THAT’s living a meaningful life, even if it is only for eight months.

    Neil Scheibelhut, Medical Officer
    Credit: Hawai’i Space Exploration Analog and Simulation (HI-SEAS)

    Allen Mirkadyrov: My main reason for participating in this project and for “putting my life on hold,” as quoted, is to contribute valuable input to the overall space exploration endeavor. I am extremely passionate about space exploration, and I would want nothing more in my career than to provide some useful or valuable input toward humankind’s outward expansion. Even if I contribute a very small amount, I would consider it an honor and a small success. The other reason I joined this project is probably similar to the reasons given by the other participants — we all want to become astronauts someday, and this will surely be a big step in the right direction toward that goal.

    The HI-SEAS crew poses with the TARDIS. The crew includes: Martha Lenio (Commander), Jocelyn Dunn (Chief Scientist), Sophie Milam (Executive Officer), Allen Mirkadyrov (Crew Engineer), Neil Scheibelhut (Medical Officer), and Zak Wilson (Chief Engineer).
    Credit: Hawai’i Space Exploration Analog and Simulation (HI-SEAS)

    SDC: So how are you feeling this far into the mission?

    S.M.: I feel great; we’re finally hitting a groove as a team and getting our own personal schedules worked out. There are lots of things I miss — showering whenever I want, cooking dinner at night, going outside — but there are so many things that I’m really enjoying here. At the top of that list is how good I feel with the crew. Regardless of whether we’re playing board games or preparing for a geology extra-vehicular activity (EVA), everyone is positive, helpful, and fun.

    Z.W.: I’m generally liking things in the dome, and the time seems to be flying by. Having time to do personal research is a great opportunity that doesn’t come around very often. I’m also really enjoying being around smart people who have different areas of expertise than I do. On the negative side, I’ve only been outside for maybe a total of 30 minutes since we started four weeks ago (and all of that was in a simulated spacesuit that effectively cuts you off from the environment). That is definitely getting to me a little.

    A.M.: To put this into perspective, we have been here less than a month, so it still hasn’t kicked in completely (at least for me). I am feeling very well thus far. I feel great physically, and I feel great psychologically and emotionally. I am very impressed with the caliber of people I am fortunate to work with, and I could not find a more helpful, unselfish, accommodating and team-oriented group of people if I tried. I truly believe that. Perhaps that’s what has made this project run smoothly so far, and I sincerely hope that it continues throughout our stay here.

    SDC: Good food can make more of a difference in field work and exploration than some people realize — are you eating well, and what comforts do you have around you?

    S.M.: WE EAT SO WELL! Our cooking schedule is that everyone cooks dinner one day a week, and then we have one day of eating leftovers and thus cleaning out our fridge. You’re on your own for breakfast and lunch unless you find someone else who wants to eat with you, but it’s nice to be in control of your own diet. An idea has emerged from this that when it’s your turn to cook you really have to go all out, so we end up with chicken cacciatore on cornbread (gluten-free for me and Neil), Ghanaian brown nut soup with rice balls (our commander is a world traveler and is exposing us to some awesome food from other cultures), red beans and rice (Zak made this dish, and I ate four full servings and then a half serving after our dessert of homemade vanilla ice cream), taco extravaganza (Jocelyn and I made our own crispy corn shells as well as beef and chicken fillings, Spanish rice, salsa and pineapple-cabbage slaw), and tuna cakes with barley pearls and Russian pickles (Allen’s home cooking is quickly gaining a huge hold on my heart). I’ve come to consider pickles a luxury, and any real cheese; other than that, all of our food, from ham to kale to milk and anything in between, is either freeze-dried, dehydrated or powdered. The real luxury is the time and effort that everyone puts into making food. I think everyone on this crew agrees that good food can definitely make the difference, and we are absolutely dedicated to making sure we don’t drop the ball on that.

    Z.W.: I worried a bit about the food that we would have here on “Mars” in the lead-up to the mission. I previously did a two-week stint at the Mars Desert Research Station (MDRS), another Mars analog program, and the food there left something to be desired. The first mission at HI-SEAS was a study of the palatability of food for Mars missions, so it can be an issue. The first week we were all a little tentative, learning to cook with the ingredients we have here. Gallon jugs of freeze-dried vegetables, big cans of freeze-dried meat, and then things like pasta, sugar, rice, flour and other grains make up the basis of most of our meals. Our ability to cook with the stuff we have has vastly improved over the past few weeks. I am unquestionably eating better here than I did living on my own. The cooking schedule we decided on is that each dinner has a chef, a sous chef and a couple of people on cleanup. This means, on average, each person is in charge of cooking only one meal a week and helping with another. Since each person doesn’t cook very often, we’ve gotten into the habit of doing pretty elaborate (and delicious) meals. We haven’t been wanting for variety either: we’ve done Indian, Thai, Persian, Ghanaian, Chinese, Hawaiian, Greek/Turkish, Russian/Azerbaijani a few times, Mexican a couple of times, a few Italian-ish dishes, plus some more American stuff, including some solid Southern and comfort food.

    J.D.: Food here is more than sustenance. We socialize and bond through cooking and sharing meals together each evening. At first, we were all intimidated by the freeze-dried, shelf-stable ingredients, but now we are talking about having Iron Chef cooking challenges — that’s what happens when you have a group of high-performing, competitive individuals. Our meals keep getting better and better.

    N.S.: I am actually really surprised at how well we are eating. This crew is going all out with our meals. I didn’t eat this well before coming into the dome. I miss fresh food, but we are eating very well.

    SDC: What experiments are showing promise already?

    S.M.: Martha’s garden is showing some good progress; the cilantro is starting to have real cilantro leaves instead of just the baby sprouts. Zak has made a range of great things with his 3D printer, including parts for fixing my watchband. Jocelyn has been taking all kinds of measurements about our general health and wellbeing, and we have Jawbone Up bands that allow us to keep track of our sleep, mood and eating habits, which I’ve found incredibly useful. My area of research is tensegrity robotics, and when I’m not actively doing research I’m writing up my final research paper for my MEME degree, and it is looking promising; hopefully I will be receiving my diploma and stole from Martha in December.

    Z.W.: The main experiment I brought was a 3D printer, with the idea that I would be able to print replacements for things that broke or design new things as we came up with them, rather than having to wait for our resupply every two months. I’ve used a few 3D printers before, but I bought my own (an Up mini) for this mission. I’ve been working with Made in Space (http://www.madeinspace.us), who gave me some training before the mission began, and they have been providing me with some technical support. I’ve managed to print a few useful items already: a whiteboard marker clip, a replacement watch buckle for one of my crewmates’ watches, and a holder for our shower timer (our water is rationed). I think this capability has great potential, and I’m really enjoying it so far. It is also great to be able to solve more of our own problems rather than relying on mission support.

    J.D.: For my research, I’ve been using Hexoskin biometric shirts and Jawbone wristbands to collect data about our health and performance. From Hexoskin, I can collect rich electrocardiogram (ECG) data along with heart and respiration rates to track our performance during workouts and EVAs. With Jawbone, we are tracking our activity, sleep, diet and mood. The goal is to develop technology for automatically inferring stress states from wearable device data. As validation, I’m collecting hair and urine samples to analyze for biomarkers of stress. The data collection stage of this work is largely established, so I’m now working on the analytics for harvesting information from these rich data about our health and activities.
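    The kind of analytics Dunn describes can be illustrated with a toy calculation. The following is a minimal sketch, not the crew’s actual pipeline: the `rmssd` function and the sample R-R values are hypothetical, but RMSSD (root mean square of successive differences between heartbeats) is a widely used heart-rate-variability measure, and lower variability is commonly treated as a proxy for higher physiological stress.

```python
# Illustrative only: estimating RMSSD heart-rate variability from
# ECG-derived R-R intervals (the gaps between heartbeats, in ms).
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between R-R intervals."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two R-R intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical R-R series such as a chest-worn ECG might yield:
rested = [812, 798, 845, 790, 830, 805, 851]
stressed = [760, 758, 762, 759, 761, 757, 760]  # low beat-to-beat variability

print(rmssd(rested) > rmssd(stressed))  # prints True: more variability at rest
```

    A real system would of course combine many such features (activity, sleep, respiration) and validate them against independent stress markers, as the hair and urine sampling above is meant to do.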

    SDC: What experiments are growing problematic?

    M.L.: A general comment on some of the psych experiments is that the hardware or software is not entirely reliable. We’ve had some problems with tablets not working, or software crashing. We’ve managed to find work-arounds, but it’s not ideal. One aspect of the study that’s likely contributing to the difficulty is the 20-minute delay on the internet here [a requirement of the simulation]. Any app that relies on live internet is going to have issues, and things start to become flaky.
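    The 20-minute delay Lenio mentions can be mimicked in software with a simple hold-until-due message queue. This is a minimal sketch assuming nothing about the actual simulation infrastructure; the `DelayedLink` class and the timings are purely illustrative.

```python
# Illustrative only: a one-way communication link that holds each message
# until its delivery time arrives, as a Mars-analog simulation might.
import heapq

ONE_WAY_DELAY_S = 20 * 60  # 20 minutes, roughly a Mars-distance light delay

class DelayedLink:
    def __init__(self, delay_s=ONE_WAY_DELAY_S):
        self.delay_s = delay_s
        self._queue = []  # min-heap of (deliver_at, message)

    def send(self, now_s, message):
        heapq.heappush(self._queue, (now_s + self.delay_s, message))

    def receive(self, now_s):
        """Return all messages whose delivery time has passed."""
        out = []
        while self._queue and self._queue[0][0] <= now_s:
            out.append(heapq.heappop(self._queue)[1])
        return out

link = DelayedLink()
link.send(0, "status report")
print(link.receive(600))    # prints []: only 10 minutes have elapsed
print(link.receive(1200))   # prints ['status report']: 20 minutes, delivered
```

    Any application expecting a live round trip will stall under such a link, which is consistent with the flakiness the crew reports.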

    Z.W.: The other experiment I brought with me is an Oculus Rift virtual reality headset. The idea was it might be useful for escapism/relaxation, as it can be hard to get away from the reality of being confined in a 1,300 sq ft dome with five other people during our eight months here. This was something that I bought specifically for this mission and had never tried or set up before. Getting it working is proving more difficult than I had anticipated. Mainly this is because of the time delay/lack of internet. Downloading drivers and demos is difficult since I have to get mission support to send me stuff and also get answers to questions. I’m hoping I’ll get it working better, but it hasn’t been very useful thus far.

    N.S.: Nothing yet.

    SDC: What do you miss most about the outside world?

    S.M.: Of course my family, friends and being able to go outside any time I want, but the crew is becoming very much like family and friends; we joke and make fun, and get competitive over board games. It really reminds me of being home with my own family. The one thing there is no analogue for is pets; I miss my dog Charly and my hedgehog Slim Pricklins.

    M.L.: Sun, wind, just being outside.

    Z.W.: Variety in exercise. I’m a pretty big runner and rock climber. We have a treadmill that I’ve been using, but it just doesn’t do it for me. We’ve also been doing P90X as a group which is fun, but still lacking in variety. I’m guessing as time passes, family and friends will be the thing I miss most — but since we are only about four weeks in, that isn’t so bad yet.

    N.S.: You know, the Cleveland Browns are in first place, and I’m sure that city is rocking. I wish I was there to help cheer them on.

    A.M.: What I miss most is of course my family and friends. I love everyone here at the habitat, but at the same time I do miss hearing the voices of loved ones or watching their facial expressions in real time. Other than people, I miss the fresh air, sea, sunshine on my face, rain and other natural wonders that we take for granted.

    SDC: What have you learned about living in close quarters with your crewmates?

    S.M.: Other than the obvious stuff like doing dishes on your night and staying tidy, one of the most important lessons I’ve found is to take the time to talk with people one-on-one. Often I feel like I’m surrounded by everyone, and it’s difficult to form a real connection without having that personal time away from the group.

    Z.W.: Obviously it is still pretty early in the mission, but we have been getting along great. I think the selection committee did a good job picking people with the right variety of skills and personalities that mesh well. We seem to have found a good balance between being around each other all the time and not being in each other’s faces. We are mostly doing our own things during the day (with the exception of EVAs and a few required group tasks), but in the evening we hang out. We’ve all been working out together, and most nights after dinner we watch a movie or play a board game together. I think as time passes it will get harder; the inability to be alone isn’t something I’ve ever really dealt with before.

    J.D.: I am trying to find a balance between private and group time. It is sometimes difficult to say “no” to a movie or game night, but personally, it is important for me to have adequate time for self-reflection.

    N.S.: You know, as of now, we are all getting along so well, it kind of feels like a college dorm. We hang out and watch movies together. It actually doesn’t seem that different from being outside. However, I feel like the dynamic of this crew is not a coincidence, and maybe we are all the kind of people who don’t mind roughing it a little, and are very social.

    A.M.: I am learning how to read people’s moods, how to interject my thoughts into helpful comments (when necessary), and how to be as helpful and accommodating to people as they are to me. This question may get a fuller response as time goes on, but with less than a month under our belts here, it’s really difficult to fully grasp our isolation and limitations (at least for me).

    SDC: Would you go on a mission to Mars?

    S.M.: Absolutely, where do I sign?!

    M.L.: Yes! In a heartbeat!

    Z.W.: Yes, I would love to. That is really why I’m doing this. It is either good practice for the real thing or maybe the closest I’ll ever get.

    J.D.: Yes, I dream of being a part of pioneering human space exploration. I’ve found my work on this simulated Mars mission deeply interesting and compelling, and consequently intend to apply for NASA’s next round of astronaut selection. Regardless, I will continue research in these fields making contributions to the development and expansion of humanity’s exploration of space.

    N.S.: If I got to go with the other five people I’m with, I’d sign up right now.

    A.M.: Absolutely. I would not think twice about it and I would volunteer to go immediately. As I mentioned in my first answer, my real passion in life has always been space exploration. If I could contribute to improving life on Earth for the rest of humanity by going to Mars, I would consider it my duty, honor and privilege. I hope someday that such an opportunity comes my way, so I can be one of the first volunteers.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

  • richardmitnick 5:08 pm on November 21, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From NSF: “A foundation for robotics” 

    National Science Foundation

    November 21, 2014
    Aaron Dubrow, NSF (703) 292-4489 adubrow@nsf.gov

    The fundamental research in computing and engineering that enabled robotics to develop in the U.S. has been supported by the National Science Foundation (NSF) since its inception.
    Yet despite these early investments in sensors, machine movement and computer vision, it wasn’t until 1972 that the first grant with “robot” in the title was funded.

    1970s: Robots for the factory floor

    In the mid-1970s, robotics began to gather steam at NSF. Among the first research projects classified as robotics were mechanical arms (seen [below]) that could pick a part out of a box and visually identify it and orient it properly for the next step on an assembly line, as well as computer-controlled welding robots. These and other NSF-funded projects were aimed at improving the productivity of American manufacturing processes–a goal for roboticists that continues to this day.

    This image of a robot arm, developed by the Stanford Research Institute, is similar to the one that appeared in the 1976 NSF Annual Report. The robotic system used computer vision to identify and make decisions about parts on an assembly line. This is one of several projects from that era aimed at improving the productivity of American manufacturing processes. Credit: SRI International

    1980s: Rise of the walking machines

    The 1980s brought an increased diversification in the types of robots being explored and the ways they could be used.

    At Ohio State, electrical engineer Robert McGhee and mechanical engineer Kenneth Waldron, along with a 60-member team of students and technical assistants, developed the Adaptive Suspension Vehicle (ASV), nicknamed the “Walker,” with support from NSF and the Defense Advanced Research Projects Agency (DARPA).

    What do you get when you combine 20 years of research, $5 million, and a Star Wars Imperial all-terrain vehicle? Ohio State’s Adaptive Suspension Vehicle (ASV), nicknamed the “Walker.” Built by electrical engineer Robert McGhee and mechanical engineer Kenneth Waldron, along with a 60-member team of students and technical assistants, the “Walker” was developed under a research contract from the Defense Advanced Research Projects Agency (DARPA).

    The ASV was 17 feet long, 8 feet wide, and 10.5 feet high, and had six legs to support its three-ton aluminum body. It was designed to carry cargo for industrial and military applications over rough, mountainous, icy or muddy terrain, and was capable of crossing 9-foot-wide ditches or 7-foot-high walls.

    The walker used a forward mounted radar system to scan the terrain ahead and feed that data, along with instructions from the operator’s joystick, into the 16 onboard computers that coordinated and controlled the ASV’s legs. Computers moved each leg individually, up and down, forward and back, and closer or farther from the ASV’s body, for a clunky but serviceable ride.

    1990s: Robots explore new environments

    Not long afterward, researchers supported by NSF were developing robots for a very different environment: underwater. First built in 1991 as a remotely operated vehicle, the Omni-Directional Intelligent Navigator (ODIN) was a sphere-shaped underwater robot capable of instantaneous movement in all six directions; in 1995 it was upgraded to ODIN II, a fully autonomous version. Sentry, a successor robot developed through a grant from NSF, plies the deep waters today, locating and quantifying hydrothermal fluxes.

    First built in 1991, the Omni-Directional Intelligent Navigator (ODIN) was a sphere-shaped, autonomous underwater robot capable of instantaneous movement in six directions. Credit: Autonomous Systems Laboratory, University of Hawaii

    In the 1990s, roboticists began turning their attention to day-to-day tasks with which a robot could assist. For instance, researchers from the University of Pittsburgh, University of Michigan and Carnegie Mellon University developed a series of mobile, personal service robots, such as Nursebot, that were designed to assist elderly people in their everyday life.

    Researchers from the University of Pittsburgh, University of Michigan and Carnegie Mellon University have developed mobile, personal service robots, such as Nursebot, that assist elderly people in their everyday life. Credit: Carnegie Mellon University

    An autonomous mobile robot that “lives” in the home of a chronically ill elderly person could remind its owner to take medicine, provide videoconferencing with doctors, collect patient data or watch for accidents, manipulate objects for arthritis sufferers, and provide some social interaction. New versions have evolved over the years, and a hospital robot developed by General Electric is expected to be tested at a Veterans Affairs hospital in 2015.

    2000s: Miniaturization and mobility

    Researchers have always envisioned a future where robots could serve the general good in disaster recovery and search-and-rescue operations, but it wasn’t until 9/11 that robots were broadly put to that use.

    Robotics expert Robin Murphy, then an associate professor of computer science at the University of South Florida, arrived on site the morning after the collapse of the World Trade Center. Murphy’s research on experimental mixed-initiative robots for urban rescue operations was originally funded by NSF. She brought with her a response team that included three graduate students–Jenn Casper, Mark Micire and Brian Minten–and software-guided “marsupial” robot systems. These intelligent “marsupial” robots are especially useful in rubble because the “mother” robot releases smaller robots to explore tight spaces unreachable by other means.

    Over the next 11 days, the teams made five insertions onto the rubble piles, often at the request of the Federal Emergency Management Agency (FEMA) task force teams or sector chiefs. Murphy’s mechanized prowlers had tethers with a range of 100 feet, far outstripping the fire department’s seven-foot camera wands. The robots helped find five victims and another set of remains, though Murphy expressed regret that they hadn’t been more successful.

    As the 2000s progressed, efforts by engineers to miniaturize components led to robots that were significantly smaller than those that came before. One startling example of this trend is the RoboBee project, which was awarded an Expeditions in Computing award from NSF’s Directorate for Computer and Information Science and Engineering in 2009.

    Researchers in this expedition are creating robotic bees that fly autonomously and coordinate activities amongst themselves and the hive, much like real bees. The research team aims to drive research in compact, high-energy power sources, ultra-low-powered computing and the design of distributed algorithms for multi-agent systems. Most recently, RoboBees were pollinating young minds at the Boston Museum of Science in an exhibition dedicated to their complex design.

    2010s: Investing in co-robots

    In June 2011, the Obama administration launched the National Robotics Initiative (NRI) to develop robots that work with or beside people to extend or augment human capabilities, taking advantage of the different strengths of humans and robots. This provided focus and funding for robotics research. The NRI is led by NSF and supported by multiple agencies, including the National Aeronautics and Space Administration (NASA), the National Institutes of Health (NIH), the U.S. Department of Agriculture (USDA) and the U.S. Department of Defense (DOD).

    Since 2011, NSF and its partners in the NRI have invested more than $120 million in robotics research.

    Today, robots impact our lives in a myriad of ways. Robots are being used in classrooms across the nation to capture the excitement of students and help them learn STEM (and non-STEM) principles. They’re helping doctors perform surgeries, providing assistance to individuals with disabilities and inspecting bridges and roads to ensure our safety.

    “Robots, which once were limited to the realm of science fiction, are now a transformative technology, a demonstration of how NSF-funded basic research can bring about changes in human life,” said Marc Rothenberg, former NSF historian.

    Though robots may seem like an emerging area of research, many of the underpinnings of today’s robots began in fundamental research in sensing, computer vision, artificial intelligence, mechanical engineering and many other areas that some might not immediately recognize as being related to robots.

    What’s next

    NSF-supported researchers are making incredible advances in robotics, creating a new generation of co-robots that can handle critical tasks in close proximity to humans, safely and with greater resilience than previous intelligent machines.

    Check out our YouTube Robot playlist to learn more about the research of NSF-funded roboticists.

    Junku Yuh
    Song Choi
    Gu-Yeon Wei
    Robert Wood
    Robin Murphy
    Andrea Thomaz
    Charles Klein
    Robert McGhee
    Radhika Nagpal
    Blake Hannaford
    Judith Matthews
    Nilanjan Sarkar
    Donald Chiarulli
    Nikolaus Correll
    Said Koozekanani
    J. Gregory Morrisett
    Jacqueline Dunbar-Jacob

    Related Institutions/Organizations
    Harvard University
    University of Hawaii
    Ohio State University
    Colorado School of Mines
    University of Pittsburgh
    University of Washington
    Georgia Tech Research Corporation
    University of Colorado at Boulder

    Related Programs
    National Robotics Initiative

    Related Awards
    #9157896 Presidential Young Investigators Award
    #0953181 CAREER: Socially Guided Machine Learning
    #9320318 Reactive Sensing for Autonomous Mobile Robots
    #0085796 ITR: Personal Robotic Assistants for the Elderly
    #1150223 CAREER: Modeling and Design of Composite Swarming Behaviors
    #0926148 Collaborative Research: RoboBees: A Convergence of Body, Brain and Colony
    #0958441 II New: A Network of Open Experimental Testbeds for Surgical Robotics Research
    #7818957 Dynamics and Control of Industrial Manipulators and Legged Locomotion Systems
    #9701614 Intelligent Coordinated Motion Control of Underwater Robotic Vehicles with Manipulator Workpackages (Collaborative Research)
    #9603043 U.S.-Japan Cooperative Science: Virtual Collaborative World Simulator for Underwater Robots using Multi-Dimensional, Synthetic Environment

    Years Research Conducted
    1972 – 2014

    See the full article here.


    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense…” NSF is the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields, such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.





  • richardmitnick 4:40 pm on November 21, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From SLAC: “Robotics Meet X-ray Lasers in Cutting-edge Biology Studies” 

    SLAC Lab

    November 21, 2014

    Platform Brings Speed, Precision in Determining 3-D Structure of Challenging Biological Molecules

    Scientists at the Department of Energy’s SLAC National Accelerator Laboratory are combining the speed and precision of robots with one of the brightest X-ray lasers on the planet for pioneering studies of proteins important to biology and drug discovery.

    The new system uses robotics and other automated components to precisely maneuver delicate samples for study with the X-ray laser pulses at SLAC’s Linac Coherent Light Source (LCLS). This will speed efforts to map the 3-D structures of nanoscale crystallized proteins, which are important for designing targeted drugs and synthesizing natural systems and processes.

    This illustration shows an experimental setup used in crystallography experiments at SLAC’s Linac Coherent Light Source X-ray laser. The drum-shaped container at left stores supercooled crystal samples that are fetched by a robotic arm and delivered to another device, called a goniometer. The goniometer moves individual crystals through the X-ray beam, which travels from the pipe at upper left toward the lower right. A detector, right, captures X-ray diffraction patterns produced as the X-rays pass through the crystal samples. (SLAC National Accelerator Laboratory)

    Equipment used in a highly automated X-ray crystallography system at SLAC’s Linac Coherent Light Source X-ray laser. The metal drum at lower left contains liquid nitrogen for cooling crystallized samples studied with LCLS’s intense X-ray pulses. (SLAC National Accelerator Laboratory)


    A New Way to Study Biology

    “This is an efficient, highly reliable and automated way to obtain high-resolution 3-D structural information from small sizes and volumes of samples, and from samples that are too delicate to study using other X-ray sources and techniques,” said Aina Cohen, who oversaw the development of the platform in collaboration with staff at LCLS and at SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL), both DOE Office of Science User Facilities.


    She is co-leader of the Macromolecular Crystallography group in the Structural Molecular Biology (SMB) program at SSRL, which has used robotic sample-handling systems to run remote-controlled experiments for a decade.

    The new setup at LCLS is described in the Oct. 31 edition of Proceedings of the National Academy of Sciences. It includes a modified version of a “goniometer,” a sample-handling device in use at SSRL and many other synchrotrons, as well as a custom version of an SSRL-designed software package that pinpoints the position of crystals in arrays of samples.

    LCLS, with X-ray pulses a billion times brighter than more conventional sources, has already allowed scientists to explore biological samples too small or fragile to study in detail with other tools. The new system provides added flexibility in the type of samples and sample-holders that can be used in experiments.

    Rather than injecting millions of tiny, randomly tumbling crystallized samples into the path of the pulses in a thin liquid stream – common in biology experiments at LCLS – the goniometer-based system places crystals one at a time into the X-ray pulses. This greatly reduces the number of crystals needed for structural studies on rare and important samples that require a more controlled approach.
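The sample-economy argument above can be made concrete with a back-of-the-envelope calculation. The numbers below (jet flux, hit rate, snapshots per mounted crystal) are illustrative assumptions for the sketch, not measured figures from the SLAC experiments; only the 120-pulses-per-second rate comes from the article.

```python
# Illustrative comparison of crystal consumption for liquid-jet injection
# versus goniometer-based delivery at an X-ray free-electron laser.

PULSE_RATE = 120.0          # LCLS X-ray pulses per second (from the article)

def crystals_per_image_injector(crystal_flux, hit_fraction):
    """Liquid jet: crystals stream past continuously; only a fraction of
    pulses intersect a crystal, and un-hit crystals are wasted."""
    useful_images_per_s = PULSE_RATE * hit_fraction
    return crystal_flux / useful_images_per_s

def crystals_per_image_goniometer(images_per_crystal):
    """Goniometer: each mounted crystal is positioned in the beam and can
    yield several usable snapshots before radiation damage."""
    return 1.0 / images_per_crystal

# Assumed numbers: 10,000 crystals/s in the jet, 5% hit rate,
# 20 usable snapshots per mounted crystal.
jet = crystals_per_image_injector(crystal_flux=10_000, hit_fraction=0.05)
gonio = crystals_per_image_goniometer(images_per_crystal=20)
print(f"jet: ~{jet:.0f} crystals per useful image")       # ~1667
print(f"goniometer: {gonio} crystals per useful image")   # 0.05
```

Even with generous assumptions for the jet, the controlled one-at-a-time approach consumes orders of magnitude fewer crystals per useful diffraction image, which is the point of the paragraph above.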

    Early Successes

    “This system adapts common synchrotron techniques for use at LCLS, which is very important,” said Henrik Lemke, staff scientist at LCLS. “There is a large community of scientists who are familiar with the goniometer technique.”

    The system has already been used to provide a complete picture of a protein’s structure in about 30 minutes using only five crystallized samples of an enzyme, moved one at a time into the X-rays for a sequence of atomic-scale “snapshots.”

    It has also helped to determine the atomic-scale structures of an oxygen-binding protein found in muscles, and another protein that regulates heart and other muscle and organ functions.

    “We have shown that this system works, and we can further automate it,” Cohen said. “Our goal is to make it easy for everyone to use.”

    Many biological experiments at LCLS are conducted in air-tight chambers. The new setup is designed to work in the open air and can also be used to study room-temperature samples, although most of the samples used in the system so far have been deeply chilled to preserve their structure. One goal is to speed up the system so it delivers samples and measures the resulting diffraction patterns as fast as possible, ideally as fast as LCLS delivers pulses: 120 times a second.

    The goniometer setup is the latest addition to a large toolkit of systems that deliver a variety of samples to the LCLS beam, and a new experimental station called MFX that is planned at LCLS will incorporate a permanent version.

    Team Effort

    Developed through a collaboration between SSRL’s Structural Molecular Biology program and the Stanford University School of Medicine, the LCLS goniometer system reflects growing scientific cooperation between SSRL and LCLS, Cohen said, drawing on SSRL’s key areas of expertise and the unique capabilities of LCLS. “The combined effort of staff at both experimental facilities was key in this success,” she said.

    In addition to staff at SLAC’s SSRL and LCLS and at Stanford University’s School of Medicine, researchers from SLAC’s Photon Science Directorate, the University of Pittsburgh School of Medicine, Howard Hughes Medical Institute, Montana State University, Lawrence Berkeley National Laboratory and the University of California, San Francisco also participated in this effort.

    The work was supported by the Department of Energy Office of Basic Energy Sciences, the SSRL Structural Molecular Biology Program via the DOE Office of Biological and Environmental Research, and the Biomedical Technology Research Resources program at the National Institute of General Medical Sciences, National Institutes of Health.

    See the full article here.


    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.




  • richardmitnick 5:32 pm on November 20, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From NSF: “A deep dive into plasma” 

    National Science Foundation

    November 20, 2014
    No Writer Credit

    Renowned physicist uses NSF-supported supercomputer and visualization resources to gain insight into plasma dynamics

    Studying the intricacies and mysteries of the sun is physicist Wendell Horton’s life’s work. A widely known authority on plasma physics, Horton studies the sun’s high-temperature ionized gas, or plasma, and that work consistently leads him around the world to a diverse range of high-impact projects.

    Fusion energy is one such key scientific issue that Horton is investigating and one that has intrigued researchers for decades.

    “Fusion energy involves the same thermonuclear reactions that take place on the sun,” Horton said. “Fusing two isotopes of hydrogen to create helium releases a tremendous amount of energy–10 times greater than that of nuclear fission.”
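The energy release Horton describes can be checked directly from atomic masses: in deuterium-tritium fusion, the products weigh slightly less than the reactants, and the missing mass appears as kinetic energy via E = mc². The mass values below are standard tabulated figures.

```python
# Energy released by D + T -> He-4 + n, computed from the mass defect.

U_TO_MEV = 931.494          # energy equivalent of 1 atomic mass unit, in MeV

masses = {                  # atomic masses in unified atomic mass units (u)
    "D": 2.014102,          # deuterium
    "T": 3.016049,          # tritium
    "He4": 4.002602,        # helium-4
    "n": 1.008665,          # neutron
}

# The reactants outweigh the products; the difference is released as energy.
mass_defect = masses["D"] + masses["T"] - masses["He4"] - masses["n"]
energy_mev = mass_defect * U_TO_MEV
print(f"D-T fusion releases about {energy_mev:.1f} MeV")  # ~17.6 MeV
```

Per unit of fuel mass this is several times the energy yield of fission, which is the comparison Horton draws in the quote above.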

    It’s no secret that the demand for energy around the world is outpacing the supply. Fusion energy has tremendous potential. However, harnessing the power of the sun for this burgeoning energy source requires extensive work.

    Through the Institute for Fusion Studies at The University of Texas at Austin, Horton collaborates with researchers at ITER, a fusion lab in France, and at the National Institute for Fusion Science in Japan to address these challenges. At ITER, Horton is working with researchers to build the world’s largest tokamak, the device that is leading the way toward producing fusion energy in the laboratory.

    ITER tokamak

    “Inside the tokamak, we inject 10 to 100 megawatts of power to recreate the conditions of burning hydrogen as it occurs in the sun,” Horton said. “Our challenge is confining the plasma, since temperatures are up to 10 times hotter than the center of the sun inside the machine.”

    Perfecting the design of the tokamak is essential to producing fusion energy, and since it is not fully developed, Horton performs supercomputer simulations on the Stampede supercomputer at the Texas Advanced Computing Center (TACC) to model plasma flow and turbulence inside the device.

    “Simulations give us information about plasma in three dimensions and in time, so that we are able to see details beyond what we would get with analytic theory and probes and high-tech diagnostic measurements,” Horton said.

    The simulations also give researchers a more holistic picture of what is needed to improve the tokamak design. Comparing simulations with fusion experiments in nuclear labs around the world helps Horton and other researchers move even closer to this breakthrough energy source.

    Plasma in the ionosphere

    Because the mathematical theories used to understand fusion reactions have numerous applications, Horton is also investigating space plasma physics, which has important implications in GPS communications.

    GPS signaling, a complex form of communication, relies on signal transmission from satellites in space, through the ionosphere, to GPS devices located on Earth.

    “The ionosphere is a layer of the atmosphere that is subject to solar radiation,” Horton explained. “Due to the sun’s high-energy solar radiation plasma wind, nitrogen and oxygen atoms are ionized, or stripped of their electrons, creating plasma gas.”

    These plasma structures can scatter signals sent between global navigation satellites and ground-based receivers, resulting in a “loss-of-lock” and large errors in the data used for navigational systems.

    Most people who use GPS navigation have experienced “loss-of-lock,” an instance of system inaccuracy. Although it is usually a minor inconvenience for the casual GPS user, loss-of-lock can be devastating for emergency response teams in disaster situations or where national security is concerned.
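A sense of the magnitude of these ionospheric errors comes from the standard first-order delay model used in GPS processing, in which the excess path length scales as 40.3 · TEC / f², with TEC the total electron content along the signal path. This single formula is only a sketch; the plasma-turbulence modeling described in the article is far more involved.

```python
# First-order ionospheric range error for a GPS signal.

GPS_L1_HZ = 1575.42e6       # GPS L1 carrier frequency, Hz
TECU = 1.0e16               # 1 TEC unit = 1e16 electrons per square meter

def iono_range_error_m(tec_units, freq_hz=GPS_L1_HZ):
    """Excess range (meters) from the first-order ionospheric group delay."""
    return 40.3 * (tec_units * TECU) / freq_hz**2

# A disturbed ionosphere (~50 TECU) adds several meters of range error:
print(f"{iono_range_error_m(50):.1f} m")   # ~8.1 m
```

Meter-scale range errors of this size are why uncorrected ionospheric plasma activity matters so much for the navigation applications described above.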

    To better understand how plasma in the ionosphere scatters signals and affects GPS communications, Horton is modeling plasma turbulence as it occurs in the ionosphere on Stampede. He is also sharing this knowledge with research institutions in the United States and abroad including the UT Space and Geophysics Laboratory.

    Seeing is believing

    Although Horton is a long-time TACC partner and Stampede user, he only recently began using TACC’s visualization resources to gain deeper insight into plasma dynamics.

    “After partnering with TACC for nearly 10 years, Horton inquired about creating visualizations of his research,” said Greg Foss, TACC Research Scientist Associate. “I teamed up with TACC research scientist Anne Bowen to develop visualizations from the wealth of data Horton had accumulated on plasmas.”

    Since plasma behaves similarly inside a fusion-generating tokamak and in the ionosphere, Foss and Bowen developed visualizations representing generalized plasma turbulence. The team used Maverick, TACC’s interactive visualization and data analysis system, to create the visualizations, allowing Horton to see the full 3-D structure and dynamics of plasma for the first time in his 40-year career.

    This image visualizes the effect of gravity waves on an initially relatively stable rotating column of electron density, twisting into a turbulent vortex on the verge of complete chaotic collapse. These computer-generated graphics are visualizations of data from a simulation of plasma turbulence in Earth’s ionosphere. The same physics is also applied to the research team’s investigations of turbulence in the tokamak, a device used in nuclear fusion experiments. Credit: Visualization: Greg Foss, TACC; visualization software support: Anne Bowen and Greg Abram, TACC; science: Wendell Horton and Lee Leonard, U. of Texas at Austin

    “It was very exciting and revealing to see how complex these plasma structures really are,” said Horton. “I also began to appreciate how the measurements we get from laboratory diagnostics are not adequate enough to give us an understanding of the full three-dimensional plasma structure.”

    Word of the plasma visualizations soon spread and Horton received requests from physics researchers in Brazil and researchers at AMU in France to share the visualizations and work to create more. The visualizations were also presented at the XSEDE’14 Visualization Showcase and will be featured at the upcoming SC’14 conference.

    Horton plans to continue working with Bowen and Foss to learn even more about these complex plasma structures and to share that knowledge nationally and internationally. It is proof that, whatever your experience level, it’s never too late to learn something new.
    — Makeda Easter, Texas Advanced Computing Center (512) 471-8217 makeda@tacc.utexas.edu
    — Aaron Dubrow, NSF (703) 292-4489 adubrow@nsf.gov

    Wendell Horton
    Daniel Stanzione

    Related Institutions/Organizations
    Texas Advanced Computing Center
    University of Texas at Austin

    Austin, Texas

    Related Programs
    Leadership-Class System Acquisition – Creating a Petascale Computing Environment for Science and Engineering

    See the full article here.




  • richardmitnick 5:10 pm on November 20, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From Caltech: “Caltech Geologists Discover Ancient Buried Canyon in South Tibet” 

    Caltech Logo

    Kimm Fesenmaier

    A team of researchers from Caltech and the China Earthquake Administration has discovered an ancient, deep canyon buried along the Yarlung Tsangpo River in south Tibet, north of the eastern end of the Himalayas. The geologists say that the ancient canyon—thousands of feet deep in places—effectively rules out a popular model used to explain how the massive and picturesque gorges of the Himalayas became so steep, so fast.

    The general location of the Himalayan range

    This photo shows the Yarlung Tsangpo Valley close to the Tsangpo Gorge, where it is rather narrow and underlain by only about 250 meters of sediments. The mountains in the upper left corner belong to the Namche Barwa massif. Previously, scientists had suspected that the debris deposited by a glacier in the foreground was responsible for the formation of the steep Tsangpo Gorge—the new discoveries falsify this hypothesis. Credit: Ping Wang

    The wide valley floor of the Nyang River, a tributary of the Yarlung Tsangpo. Here, the valley floor of the paleocanyon lies at a depth of about 800 meters below the present-day river.
    Credit: Ping Wang

    This Google Earth image looks down the Yarlung Tsangpo Valley towards the Namche Barwa (right) and Gyala Peri massifs (left). The confluence with the Nyang River (joining from the right) is shown in the foreground. Here, the valley floor is about 4 kilometers wide and the paleocanyon lies about 800-900 meters below the present-day river.
    Credit: Map data: Google, Mapabc.com, DigitalGlobe, and Cnes/Spot Image

    “I was extremely surprised when my colleagues, Jing Liu-Zeng and Dirk Scherler, showed me the evidence for this canyon in southern Tibet,” says Jean-Philippe Avouac, the Earle C. Anthony Professor of Geology at Caltech. “When I first saw the data, I said, ‘Wow!’ It was amazing to see that the river once cut quite deeply into the Tibetan Plateau because it does not today. That was a big discovery, in my opinion.”

    Geologists like Avouac and his colleagues, who are interested in tectonics—the study of the earth’s surface and the way it changes—can use tools such as GPS and seismology to study crustal deformation that is taking place today. But if they are interested in studying changes that occurred millions of years ago, such tools are not useful because the activity has already happened. In those cases, rivers become a main source of information because they leave behind geomorphic signatures that geologists can interrogate to learn about the way those rivers once interacted with the land—helping them to pin down when the land changed and by how much, for example.

    World plate tectonics

    “In tectonics, we are always trying to use rivers to say something about uplift,” Avouac says. “In this case, we used a paleocanyon that was carved by a river. It’s a nice example where by recovering the geometry of the bottom of the canyon, we were able to say how much the range has moved up and when it started moving.”

    The team reports its findings in the current issue of Science.

    Last year, civil engineers from the China Earthquake Administration collected cores by drilling into the valley floor at five locations along the Yarlung Tsangpo River. Shortly after, former Caltech graduate student Jing Liu-Zeng, who now works for that administration, returned to Caltech as a visiting associate and shared the core data with Avouac and Dirk Scherler, then a postdoc in Avouac’s group. Scherler had previously worked in the far western Himalayas, where the Indus River has cut deeply into the Tibetan Plateau, and immediately recognized that the new data suggested the presence of a paleocanyon.

    Liu-Zeng and Scherler analyzed the core data and found that at several locations there were sedimentary conglomerates (rounded gravel and larger rocks cemented together, of the kind associated with flowing rivers) down to a depth of about 800 meters, at which point the record clearly indicated bedrock. This suggested that the river once carved deeply into the plateau.

    To establish when the river switched from incising bedrock to depositing sediments, they measured two isotopes, beryllium-10 and aluminum-26, in the lowest sediment layer. The isotopes are produced when rocks and sediment are exposed to cosmic rays at the surface and decay at different rates once buried, and so allowed the geologists to determine that the paleocanyon started to fill with sediment about 2.5 million years ago.
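The burial-dating logic above can be sketched in a few lines: once sediment is shielded from cosmic rays, aluminum-26 decays faster than beryllium-10, so the Al-26/Be-10 ratio falls at a known rate and acts as a clock. The half-lives and surface production ratio below are standard literature values; the measured ratio is a made-up illustration, not data from the Science paper.

```python
import math

# Cosmogenic-nuclide burial dating from the decline of the Al-26/Be-10 ratio.

T_HALF_BE10 = 1.387e6   # Be-10 half-life, years
T_HALF_AL26 = 0.705e6   # Al-26 half-life, years
R0 = 6.75               # assumed Al-26/Be-10 production ratio at the surface

lam_be = math.log(2) / T_HALF_BE10   # decay constants, per year
lam_al = math.log(2) / T_HALF_AL26

def burial_age(measured_ratio, initial_ratio=R0):
    """Years since burial: R(t) = R0 * exp(-(lam_al - lam_be) * t)."""
    return math.log(initial_ratio / measured_ratio) / (lam_al - lam_be)

# A hypothetical measured ratio of 2.0 corresponds to roughly 2.5 Myr,
# the burial age the team reports for the paleocanyon fill:
print(f"{burial_age(2.0) / 1e6:.2f} Myr")   # ~2.5 Myr
```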

    The researchers’ reconstruction of the former valley floor showed that the slope of the river once increased gradually from the Gangetic Plain to the Tibetan Plateau, with no sudden changes, or knickpoints. Today, the river, like most others in the area, has a steep knickpoint where it meets the Himalayas, at a place known as the Namche Barwa massif. There, the uplift of the mountains is extremely rapid (on the order of 1 centimeter per year, whereas in other areas 5 millimeters per year is more typical) and the river drops by 2 kilometers in elevation as it flows through the famous Tsangpo Gorge, known by some as the Yarlung Tsangpo Grand Canyon because it is so deep and long.

    Combining the depth and age of the paleocanyon with the geometry of the valley, the geologists surmised that the river existed in this location prior to about 3 million years ago, but at that time, it was not affected by the Himalayas. However, as the Indian and Eurasian plates continued to collide and the mountain range pushed northward, it began impinging on the river. Suddenly, about 2.5 million years ago, a rapidly uplifting section of the mountain range got in the river’s way, damming it, and the canyon subsequently filled with sediment.

    “This is the time when the Namche Barwa massif started to rise, and the gorge developed,” says Scherler, one of two lead authors on the paper and now at the GFZ German Research Center for Geosciences in Potsdam, Germany.

    That picture of the river and the Tibetan Plateau, which involves the river incising deeply into the plateau millions of years ago, differs quite a bit from the typically accepted geologic vision. Typically, geologists believe that when rivers start to incise into a plateau, they eat at the edges, slowly making their way into the plateau over time. However, the rivers flowing across the Himalayas all have strong knickpoints and have not incised much at all into the Tibetan Plateau. Therefore, the thought has been that the rapid uplift of the Himalayas has pushed the rivers back, effectively pinning them, so that they have not been able to make their way into the plateau. But that explanation does not work with the newly discovered paleocanyon.

    The team’s new hypothesis also rules out a model that has been around for about 15 years, called tectonic aneurysm, which suggests that the rapid uplift seen at the Namche Barwa massif was triggered by intense river incision. In tectonic aneurysm, a river cuts down through the earth’s crust so fast that it causes the crust to heat up, making a nearby mountain range weaker and facilitating uplift.

    The model is popular among geologists, and indeed Avouac himself published a modeling paper in 1996 that showed the viability of the mechanism. “But now we have discovered that the river was able to cut into the plateau way before the uplift happened,” Avouac says, “and this shows that the tectonic aneurysm model was actually not at work here. The rapid uplift is not a response to river incision.”

    The other lead author on the paper, “Tectonic control of Yarlung Tsangpo Gorge revealed by a buried canyon in Southern Tibet,” is Ping Wang of the State Key Laboratory of Earthquake Dynamics, in Beijing, China. Additional authors include Jürgen Mey, of the University of Potsdam, in Germany; and Yunda Zhang and Dingguo Shi of the Chengdu Engineering Corporation, in China. The work was supported by the National Natural Science Foundation of China, the State Key Laboratory for Earthquake Dynamics, and the Alexander von Humboldt Foundation.

    See the full article here.


    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
    Caltech buildings



  • richardmitnick 4:28 pm on November 20, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From MIT: “Controlling a material with voltage” 

    MIT News

    November 20, 2014
    David L. Chandler | MIT News Office

    Technique could let a small electrical signal change materials’ electrical, thermal, and optical characteristics.

    A new way of switching the magnetic properties of a material using just a small applied voltage, developed by researchers at MIT and collaborators elsewhere, could signal the beginning of a new family of materials with a variety of switchable properties, the researchers say.

    This diagram shows the principle behind using voltage to change material properties. In this sandwich of materials, applying a voltage results in movement of ions — electrically charged atoms — from the middle, functional layer of material into the target layer. This modifies some of the properties — magnetic, thermal, or optical — of the target material, and the changes remain after the voltage is removed. Diagram courtesy of the researchers; edited by Jose-Luis Olivares/MIT

    The technique could ultimately be used to control properties other than magnetism, including reflectivity or thermal conductivity, they say. The first application of the new finding is likely to be a new kind of memory chip that requires no power to maintain data once it’s written, drastically lowering its overall power needs. This could be especially useful for mobile devices, where battery life is often a major limitation.

    The findings were published this week in the journal Nature Materials by MIT doctoral student Uwe Bauer, associate professor Geoffrey Beach, and six other co-authors.

    Beach, the Class of ’58 Associate Professor of Materials Science and Engineering, says the work is the culmination of Bauer’s PhD thesis research on voltage-programmable materials. The work could lead to a new kind of nonvolatile, ultralow-power memory chips, Beach says.

    The concept of using an electrical signal to control a magnetic memory element is the subject of much research by chip manufacturers, Beach says. But the MIT-based team has made important strides in making the technique practical, he says.

    The structure of these devices is similar to that of a capacitor, Beach explains, with two thin layers of conductive material separated by an insulating layer. The insulating layer is so thin that under certain conditions, electrons can tunnel right through it.

    But unlike in a capacitor, the conductive layers in these low-power chips are magnetized. In the new device, one conductive layer has fixed magnetization, but the other can be toggled between two magnetic orientations by applying a voltage to it. When the magnetic orientations are aligned, it is easier for electrons to tunnel from one layer to the other; when they have opposite orientations, the device is more insulating. These states can be used to represent “zero” and “one.”
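The two-state readout described above can be captured in a toy model: the stored bit is inferred from whether tunneling resistance is low (layers parallel) or high (antiparallel). The resistance values and read threshold below are invented for illustration only; they are not figures from the MIT device.

```python
# Toy model of a voltage-switched magnetic memory cell.

R_PARALLEL = 1_000.0        # ohms: aligned layers, easy electron tunneling
R_ANTIPARALLEL = 2_500.0    # ohms: opposed layers, tunneling suppressed

class MagneticBit:
    def __init__(self):
        self.aligned = True             # free layer starts parallel ("1")

    def write(self, bit):
        """A small voltage pulse toggles the free layer's orientation;
        the state then persists with no power applied (nonvolatile)."""
        self.aligned = bool(bit)

    def read(self):
        """Infer the stored bit from the tunneling resistance."""
        r = R_PARALLEL if self.aligned else R_ANTIPARALLEL
        return 1 if r < (R_PARALLEL + R_ANTIPARALLEL) / 2 else 0

cell = MagneticBit()
cell.write(0)
print(cell.read())   # 0
cell.write(1)
print(cell.read())   # 1
```

The gap between the two resistance values is what sets the read margin, which is why the 100-fold stronger voltage-induced magnetic change the MIT team reports matters for long-term stability.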

    The work at MIT shows that it takes just a small voltage to flip the state of the device — which then retains its new state even after power is switched off. Conventional memory devices require a continuous source of power to maintain their state.

    The MIT team was able to design a system in which voltage changes the magnetic properties 100 times more powerfully than other groups have been able to achieve; this strong change in magnetism makes possible the long-term stability of the new memory cells.

    They achieved this by using an insulating layer made of an oxide material in which the applied voltage can rearrange the locations of the oxygen ions. They showed that the properties of the magnetic layer could be changed dramatically by moving the oxygen ions back and forth near the interface.

    The team is now working to ramp up the speed at which these changes can be made to the memory elements. They have already reached rates of a megahertz (millions of times per second) in switching, but a fully competitive memory module will require further increase on the order of a hundredfold to a thousandfold, they say.

    The team also found that the magnetic properties could be changed using a pulse of laser light that heats the oxide layer, helping the oxygen ions to move more easily. The laser beam used to alter the state of the material can scan across its surface, making changes as it goes.

    The same techniques could be used to alter other properties of materials, Beach explains, such as reflectivity or thermal conductivity. Such properties can ordinarily be changed only through mechanical or chemical processing. “All these properties could come under electrical control, to be switched on and off, and even ‘written’ using a beam of light,” Beach says. This ability to make such changes on the fly essentially produces “an Etch-a-Sketch for material properties,” he says.

    The new findings “started as a fluke,” Beach says: Bauer was experimenting with the layered material, expecting to see standard temporary capacitive effects from an applied voltage. “But he turned off the voltage and it stayed that way,” with a reversed magnetic state, Beach says, leading to further investigation.

    “I think this will have broad applications,” Beach says, adding that it uses methods and materials that are already standard in microchip manufacturing.

    In addition to Bauer and Beach, the team included Lide Yao and Sebastiaan van Dijken of Aalto University in Finland and, at MIT, graduate students Aik Jun Tan, Parnika Agrawal, and Satoru Emori and professor of ceramics and electronic materials Harry Tuller. The work was supported by the National Science Foundation and Samsung.

    See the full article here.




  • richardmitnick 4:12 pm on November 20, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From MIT: “New 2-D quantum materials for nanoelectronics” 

    MIT News

    November 20, 2014
    David L. Chandler | MIT News Office

    MIT team provides theoretical roadmap to making 2-D electronics with novel properties.

    Researchers at MIT say they have carried out a theoretical analysis showing that a family of two-dimensional materials exhibits exotic quantum properties that may enable a new type of nanoscale electronics.

    These materials are predicted to show a phenomenon called the quantum spin Hall (QSH) effect, and belong to a class of materials known as transition metal dichalcogenides, with layers a few atoms thick. The findings are detailed in a paper appearing this week in the journal Science, co-authored by MIT postdocs Xiaofeng Qian and Junwei Liu; assistant professor of physics Liang Fu; and Ju Li, a professor of nuclear science and engineering and materials science and engineering.

    This diagram illustrates the concept behind the MIT team’s vision of a new kind of electronic device based on 2-D materials. The 2-D material is at the middle of a layered “sandwich,” with layers of another material, boron nitride, at top and bottom (shown in gray). When an electric field is applied to the material, by way of the rectangular areas at top, it switches the quantum state of the middle layer (yellow areas). The boundaries of these “switched” regions act as perfect quantum wires, potentially leading to new electronic devices with low losses. Illustration: Yan Liang

    QSH materials have the unusual property of being electrical insulators in the bulk of the material, yet highly conductive on their edges. This could potentially make them a suitable material for new kinds of quantum electronic devices, many researchers believe.

    But only two materials with QSH properties have been synthesized, and potential applications of these materials have been hampered by two serious drawbacks: Their bandgap, a property essential for making transistors and other electronic devices, is too small, giving a low signal-to-noise ratio; and they lack the ability to switch rapidly on and off. Now the MIT researchers say they have found ways to potentially circumvent both obstacles using 2-D materials that have been explored for other purposes.
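The bandgap drawback above follows from a textbook estimate: the population of thermally excited carriers across a gap Eg scales roughly as exp(-Eg / 2kT), which bounds a switch's achievable on/off ratio. This is a generic rule of thumb, not a calculation from the MIT paper.

```python
import math

# Rough thermal-activation limit on a transistor's on/off ratio.

K_B_EV = 8.617e-5           # Boltzmann constant, eV per kelvin

def on_off_ratio(gap_ev, temp_k):
    """Crude on/off ratio limit, exp(Eg / 2kT), from thermal leakage."""
    return math.exp(gap_ev / (2 * K_B_EV * temp_k))

# A tiny gap gives a poor ratio at room temperature; a larger gap helps:
for gap in (0.01, 0.1, 0.3):
    print(f"Eg = {gap:.2f} eV -> on/off ~ {on_off_ratio(gap, 300):.0f}")
```

The exponential dependence is why a bandgap of only a few millielectronvolts, as in the previously synthesized QSH materials, yields a low signal-to-noise ratio except at cryogenic temperatures.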

    Existing QSH materials only work at very low temperatures and under difficult conditions, Fu says, adding that “the materials we predicted to exhibit this effect are widely accessible. … The effects could be observed at relatively high temperatures.”

    “What is discovered here is a true 2-D material that has this [QSH] characteristic,” Li says. “The edges are like perfect quantum wires.”

    The MIT researchers say this could lead to new kinds of low-power quantum electronics, as well as spintronics devices — a kind of electronics in which the spin of electrons, rather than their electrical charge, is used to carry information.

    Graphene, a two-dimensional, one-atom-thick form of carbon with unusual electrical and mechanical properties, has been the subject of much research, which has led to further research on similar 2-D materials. But until now, few researchers have examined these materials for possible QSH effects, the MIT team says. “Two-dimensional materials are a very active field for a lot of potential applications,” Qian says — and this team’s theoretical work now shows that at least six such materials do share these QSH properties.

    Graphene is an atomic-scale honeycomb lattice made of carbon atoms.

    The MIT researchers studied materials known as transition metal dichalcogenides, a family of compounds made from the transition metals molybdenum or tungsten and the nonmetals tellurium, selenium, or sulfur. These compounds naturally form thin sheets, just atoms thick, that can spontaneously develop a dimerization pattern in their crystal structure. It is this lattice dimerization that produces the effects studied by the MIT team.

    While the new work is theoretical, the team produced a design for a new kind of transistor based on the calculated effects. Called a topological field-effect transistor, or TFET, the design is based on a single layer of the 2-D material sandwiched by two layers of 2-D boron nitride. The researchers say such devices could be produced at very high density on a chip and have very low losses, allowing high-efficiency operation.

    By applying an electric field to the material, the QSH state can be switched on and off, making possible a host of electronic and spintronic devices, they say.

    In addition, these are among the most promising known materials for possible use in quantum computers, the researchers say. Quantum computing is highly susceptible to disruption — technically, a loss of coherence — from even very small perturbations. But, Li says, topological quantum computers “cannot lose coherence from small perturbations. It’s a big advantage for quantum information processing.”

    Because so much research is already under way on these 2-D materials for other purposes, methods of making them efficiently may be developed by other groups and could then be applied to the creation of new QSH electronic devices, Qian says.

    Nai Phuan Ong, a professor of physics at Princeton University who was not connected to this work, says, “Although some of the ideas have been mentioned before, the present system seems especially promising. This exciting result will bridge two very active subfields of condensed matter physics, topological insulators and dichalcogenides.”

    The research was supported by the National Science Foundation, the U.S. Department of Energy, and the STC Center for Integrated Quantum Materials. Qian and Liu contributed equally to the work.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ScienceSprings relies on technology from

    MAINGEAR computers



  • richardmitnick 12:07 pm on November 20, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From FNAL: “Physics in a Nutshell – Heisenberg’s uncertainty principle and Wi-Fi” 

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    Thursday, Nov. 20, 2014
    Jim Pivarski

    When I first started teaching, I was stumped by a student who asked me if quantum mechanics affected anything in daily life. I said that the universe is fundamentally quantum mechanical and therefore it affects everything, but this didn’t satisfy him. Since then, I’ve been noticing examples everywhere.

    Bandwidth, or the spreading of a radio station onto multiple, neighboring frequencies, is related to uncertainty in quantum mechanics.

    One surprising example is the effect of Heisenberg’s uncertainty principle on Wi-Fi communication (wireless internet). Heisenberg’s uncertainty principle is usually described as a limit on knowledge of a particle’s position and speed: The better you know its position, the worse you know its speed. However, it is a general principle with many consequences. The most common in particle physics is that the shorter a particle’s lifetime, the worse you know its mass. Both of these formulations are far removed from everyday life, though.

    In everyday life, the wave nature of most particles is too small to see. The biggest exception is radio and light, which are wave-like in daily life and only particle-like (photons) in the quantum realm. In radio terminology, Heisenberg’s uncertainty principle is called the bandwidth theorem, and it states that the rate at which information is carried over a radio band is proportional to the width of that band. Bandwidth is the reason that radio stations with nearly the same central frequency can sometimes be heard simultaneously: Each is broadcasting over a range of frequencies, and those ranges overlap. If you try to send shorter pulses of data at a higher rate, the range of frequencies broadens.

    Although this theorem was developed in the context of Morse code over telegraph systems, it applies just as well to computer data over Wi-Fi networks. A typical Wi-Fi network transmits 54 million bits per second, or 18.5 nanoseconds per bit (zero or one). Through the bandwidth theorem, this implies a frequency spread of about 25 MHz, but the whole Wi-Fi radio dial is only 72 MHz across. In practice, only three bands can be distinguished, so only three different networks can fill the same airwaves at the same time. As the bit rate of Wi-Fi gets faster, the bandwidth gets broader, crowding the radio dial even more.
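    The Wi-Fi numbers above can be checked with quick arithmetic. This is a rough sketch: the exact numerical factor relating pulse duration to frequency spread depends on the pulse shape, so Δf ≈ 1/(2Δt) is used here only as an order-of-magnitude estimate.

```python
# Back-of-the-envelope check of the Wi-Fi bandwidth figures quoted above.
bit_rate = 54e6                       # bits per second (802.11g)
bit_time = 1 / bit_rate               # seconds per bit
spread = 1 / (2 * bit_time)           # implied frequency spread, Hz (pulse-shape dependent)
dial_width = 72e6                     # approximate width of the 2.4 GHz Wi-Fi dial, Hz

print(f"time per bit: {bit_time * 1e9:.1f} ns")         # ~18.5 ns
print(f"frequency spread: {spread / 1e6:.0f} MHz")      # ~27 MHz (article: ~25 MHz)
print(f"networks that fit: {dial_width / spread:.1f}")  # roughly three
```

    The estimate lands within a few MHz of the quoted 25 MHz, and dividing the 72 MHz dial by the spread recovers the “only three networks at once” conclusion.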

    Mathematically, the Heisenberg uncertainty principle is just a special case of the bandwidth theorem, and we can see this relationship by comparing units. The lifetime of a particle can be measured in nanoseconds, just like the time for a computer to emit a zero or a one. A particle’s mass, which is a form of energy, can be expressed as a frequency (for example, 1 GeV is a quarter of a trillion trillion Hz). Uncertainty in mass is therefore a frequency spread, which is to say, bandwidth.
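    The mass-to-frequency conversion above follows from E = hf. As a sanity check on the quoted figure:

```python
# Express 1 GeV as a frequency via E = h*f.
h = 6.62607015e-34        # Planck constant, J*s
eV = 1.602176634e-19      # joules per electronvolt
f = (1e9 * eV) / h        # frequency corresponding to 1 GeV, Hz
print(f"1 GeV = {f:.2e} Hz")   # ~2.42e23 Hz, i.e. about a quarter of a trillion trillion Hz
```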

    Although it’s fundamentally the same thing, the numerical scale is staggering. A computer network comprising decaying Z bosons could emit 75 million petabytes per second, and its bandwidth would be 600 trillion GHz wide.
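    Both figures follow from the conversion just described, taking the Z boson’s decay width to be about 2.5 GeV (the measured value, supplied here as an input) and reading the bandwidth in Hz directly as a bit rate in bits per second:

```python
# Scale check for the Z-boson "network" described above.
h = 6.62607015e-34              # Planck constant, J*s
eV = 1.602176634e-19            # joules per electronvolt
width = 2.5e9 * eV              # Z decay width as energy, J (~2.5 GeV, assumed input)

bandwidth_hz = width / h        # frequency spread of the decaying Z
petabytes_per_s = bandwidth_hz / 8 / 1e15   # same number read as bits/s, in PB/s

print(f"bandwidth: {bandwidth_hz / 1e9:.0e} GHz")   # ~6e14 GHz = 600 trillion GHz
print(f"data rate: {petabytes_per_s:.1e} PB/s")     # ~7.6e7, i.e. ~75 million PB/s
```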



    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics.




  • richardmitnick 9:05 am on November 20, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From AAAS: “Body’s bacteria may keep our brains healthy” 



    19 November 2014
    Elizabeth Pennisi

    The microbes that live in your body outnumber your cells 10 to one. Recent studies suggest these tiny organisms help us digest food and maintain our immune system. Now, researchers have discovered yet another way microbes keep us healthy: They are needed for closing the blood-brain barrier, a molecular fence that shuts out pathogens and molecules that could harm the brain.

    Lacking a strong blood-brain barrier, germ-free mice (left) can’t prevent a radioactive tracer (yellow) from entering the brain the way that mice with microbes (middle) can. But adding microbes to germ-free mice (right) restores the blood-brain barrier. (Miklós Tóth/Karolinska Institutet)

    The findings suggest that a woman’s diet or exposure to antibiotics during pregnancy may influence the development of this barrier. The work could also lead to a better understanding of multiple sclerosis, in which a leaky blood-brain barrier may set the stage for a decline in brain function.

    The first evidence that bacteria may help fortify the body’s biological barriers came in 2001. Researchers discovered that microbes in the gut activate genes that code for tight junction proteins, which are critical to building the gut wall. Without these proteins, gut pathogens can enter the bloodstream and cause disease.

    In the new study, intestinal biologist Sven Pettersson and his postdoc Viorica Braniste of the Karolinska Institute in Stockholm decided to look at the blood-brain barrier, which also has tight junction proteins. They tested how leaky the blood-brain barrier was in developing and adult mice. Some of the rodents were brought up in a sterile environment and thus were germ-free, with no detectable microbes in their bodies. Braniste then injected antibodies—which are too big to get through the blood-brain barrier—into embryos developing within either germ-free moms or moms with the typical microbes, or microbiota.

    The studies showed that the blood-brain barrier typically forms a tight seal a little more than 17 days into development. Antibodies infiltrated the brains of all the embryos younger than 17 days, but they continued to enter the brains of embryos of germ-free mothers well beyond day 17, the team reports online today in Science Translational Medicine. Embryos from germ-free mothers also had fewer intact tight junction proteins, and tight junction protein genes in their brains were less active, which may explain the persistent leakiness. (The researchers didn’t look at the mice’s guts.)

    Germ-free mice even have leaky blood-brain barriers as adults. But those leaks closed after the researchers gave the animals the microbes from normal mice for 2 weeks, Pettersson says.

    The microbes have “a striking effect,” says Elaine Hsiao, a neurobiologist at the California Institute of Technology in Pasadena who was not involved in the study. The work suggests “a role for the [microbes] in regulating brain development and function.”

    But how? In the gut, bacteria may influence the gut wall’s integrity through one of their byproducts, energy-laden molecules called short-chain fatty acids. So Pettersson and his colleagues infected germ-free mice with either bacteria that made these fatty acids or ones that did not. The blood-brain barrier improved only when the bacteria made these fatty acids, Pettersson says. He thinks that these molecules may get into the blood and stimulate gene activity that leads to the closure of the barrier.

    The study is not perfect, Hsiao says. “Germ-free mice are useful tools for studying the microbiota, but the germ-free condition is artificial and involves widespread disruptions” in how the body functions, such as impaired immunity and loss of gut integrity. So these results in germ-free mice need to be confirmed in humans, she says.

    But at the very least, the findings point toward a new understanding of human health and disease, says Lora Hooper, an immunologist at the University of Texas Southwestern Medical Center in Dallas who was not involved in the work. With multiple sclerosis, neurobiologists are at a loss to explain why the disease progresses so erratically, so the idea that changes in the body’s microbes may alter the blood-brain barrier to make the brain more vulnerable to damage is appealing, Pettersson notes.

    Scientists, Hooper adds, should also investigate whether microbes help spur the development of the human fetus’s blood-brain barrier. It could be that taking antibiotics at the wrong time during pregnancy is creating abnormalities in the blood-brain barrier of the child, she says.


    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.




  • richardmitnick 5:16 pm on November 19, 2014 Permalink | Reply
    Tags: Applied Research & Technology

    From Princeton: “Unique sense of ‘touch’ gives a prolific bacterium its ability to infect anything” 

    Princeton University

    November 19, 2014
    Morgan Kelly, Office of Communications

    New research has found that one of the world’s most prolific bacteria manages to afflict humans, animals and even plants by way of a mechanism not before seen in any infectious microorganism — a sense of touch. This unique ability helps make the bacteria Pseudomonas aeruginosa ubiquitous, but it also might leave these antibiotic-resistant organisms vulnerable to a new form of treatment.

    Pseudomonas is the first pathogen found to initiate infection after merely attaching to the surface of a host, Princeton University and Dartmouth College researchers report in the journal Proceedings of the National Academy of Sciences. This mechanism means that the bacteria, unlike most pathogens, do not rely on a chemical signal specific to any one host; they need only make contact with any organism that’s ripe for infection.

    The researchers found, however, that the bacteria could not infect another organism when a protein on their surface known as PilY1 was disabled. This suggests a possible treatment that, instead of attempting to kill the pathogen, targets the bacteria’s own mechanisms for infection.

    A study led by Princeton University researchers found that one of the world’s most prolific bacteria, Pseudomonas aeruginosa, manages to afflict humans, animals and even plants by way of a mechanism not before seen in any infectious microorganism — a sense of touch. This technique means the bacteria, unlike most pathogens, do not rely on a chemical signal specific to any one host. To demonstrate the bacteria’s versatility, the researchers infected ivy cells (blue rings) with the bacteria (green areas) then introduced amoebas (yellow) to the same sample. Pseudomonas immediately detected and quickly overwhelmed the amoebas. (Image by Albert Siryaporn, Department of Molecular Biology)

    Corresponding author Zemer Gitai, a Princeton associate professor of molecular biology, explained that the majority of bacteria, viruses and other disease-causing agents depend on “taste,” as in they respond to chemical signals unique to the hosts with which they typically co-evolved. Pseudomonas, however, through their sense of touch, are able to thrive on humans, plants, animals, numerous human-made surfaces, and in water and soil. They can cause potentially fatal organ infections in humans, and are the culprit in many hospital-acquired illnesses such as sepsis. The bacteria are largely unfazed by antibiotics.

    “Pseudomonas’ ability to infect anything was known before. What was not known was how it’s able to detect so many types of hosts,” Gitai said. “That’s the key piece of this research — by using this sense of touch, as opposed to taste, Pseudomonas can equally identify any kind of suitable host and initiate infection in an attempt to kill it.”

    The researchers found that only two conditions must be satisfied for Pseudomonas to launch an infection: Surface attachment and “quorum sensing,” a common bacterial mechanism wherein the organisms can detect that a large concentration of their kind is present. The researchers focused on the surface-attachment cue because it truly sets Pseudomonas apart, said Gitai, who worked with first author Albert Siryaporn, a postdoctoral researcher in Gitai’s group; George O’Toole, a professor of microbiology and immunology at Dartmouth; and Sherry Kuchma, a senior scientist in O’Toole’s laboratory.

    To demonstrate the bacteria’s wide-ranging lethality, Siryaporn infected ivy cells with the bacteria, then introduced amoebas to the same sample; Pseudomonas immediately detected and quickly overwhelmed the single-celled organisms. “The bacteria don’t know what kind of host it’s sitting on,” Siryaporn said. “All they know is that they’re on something, so they’re on the offensive. It doesn’t draw a distinction between one host or another.”

    When Siryaporn deleted the protein PilY1 from the bacteria’s surface, however, the bacteria lost their ability to infect and thus kill the test host, an amoeba. “We believe that this protein is the sensor of surfaces,” Siryaporn said. “When we deleted the protein, the bacteria were still on a surface, but they didn’t know they were on a surface, so they never initiated virulence.”

    Because PilY1 is on a Pseudomonas bacterium’s surface and required for virulence, it presents a comprehensive and easily accessible target for developing drugs to treat Pseudomonas infection, Gitai said. Many drugs are developed to target components in a pathogen’s more protected interior, he said.

    The video [included], captured during a span of 113 minutes, shows that Pseudomonas (gray tubes) grow exponentially — doubling their numbers roughly every 30 minutes — and establish large populations of cells over the course of a few hours. In contrast, eukaryotic organisms such as the amoeba (large organisms) grow much more slowly and can be quickly overwhelmed by a bacterial population. The bacteria’s ability to rapidly multiply in a variety of hosts makes a Pseudomonas infection difficult to treat using antibiotics. (Video by Albert Siryaporn, Department of Molecular Biology)
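    The doubling arithmetic in the caption can be checked directly: a population that doubles every 30 minutes grows as 2^(t/30), so over the 113-minute span of the video the bacteria multiply roughly 13-fold. A minimal sketch of that growth-rate contrast (not part of the study’s own analysis):

```python
# Exponential growth with a fixed doubling time: N(t) = N0 * 2**(t / T_double).
doubling_time = 30.0   # minutes (from the caption)
span = 113.0           # minutes (length of the video)
growth_factor = 2 ** (span / doubling_time)
print(f"growth over {span:.0f} min: {growth_factor:.1f}x")  # ~13.6x
```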

    KC Huang, a Stanford University associate professor of bioengineering, said that the research is an important demonstration of an emerging approach to treating pathogens — by disabling rather than killing them.

    “This work indicates that the PilY1 sensor is a sort of lynchpin for the entire virulence response, opening the door to therapeutic designs that specifically disrupt the mechanical cues for activating virulence,” said Huang, who is familiar with the research but had no role in it.

    “This is a key example of what I think will become the paradigm in antivirals and antimicrobials in the future — that trying to kill the microbes is not necessarily the best strategy for dealing with an infection,” Huang said. “[The researchers’] discovery of the molecular factor that detects the mechanical cues is critical for designing such compounds.”

    Targeting proteins such as PilY1 offers an avenue for combating the growing problem of antibiotic resistance among bacteria, Gitai said. Disabling the protein in Pseudomonas did not hinder the bacteria’s ability to multiply, only to infect.

    Antibiotic resistance results when a drug kills all of its target organisms, but leaves behind bacteria that developed a resistance to the drug. These mutants, previously in the minority, multiply at an astounding rate — doubling their numbers roughly every 30 minutes — and become the dominant strain of pathogen, Gitai said. If bacteria had their ability to infect disabled, but were not killed, the mutant organisms would be unlikely to take over, he said.

    “I’m very optimistic that we can use drugs that target PilY1 to inhibit the whole virulence process instead of killing off bacteria piecemeal,” Gitai said. “This could be a whole new strategy. Really what people should be doing is screening drugs that inhibit virulence but preserve growth. This protein presents a possible route by which to do that.”

    PilY1 also is found in other bacteria with a range of hosts, Gitai said, including Neisseria gonorrhoeae, which causes gonorrhea in humans, and the large bacterial genus Burkholderia, which, along with Pseudomonas, is a leading cause of lung infection in people with cystic fibrosis. It is possible that PilY1 has a similar role in detecting surfaces and initiating infection for these other bacteria, and thus could be a treatment target.

    Frederick Ausubel, a professor of genetics at Harvard Medical School, said that the research could help explain how opportunistic pathogens are able to infect multiple types of hosts. Recent research has revealed a lot about how bacteria initiate an infection, particularly via quorum sensing and chemical signals, but the question about how that’s done across a spectrum of unrelated hosts has remained unanswered, said Ausubel, who is familiar with the research but had no role in it.

    “A broad host-range pathogen such as Pseudomonas cannot rely solely on chemical cues to alert it to the presence of a suitable host,” Ausubel said.

    “It makes sense that Pseudomonas would use surface attachment as one of the major inputs to activating virulence, especially if attachment to surfaces in general rather than to a particular surface is the signal,” he said. “There is probably an advantage to activating virulence only when attached to a host cell, and it is certainly possible that other broad host-range opportunistic pathogens utilize a similar strategy.”

    The paper, “Surface attachment induces Pseudomonas aeruginosa virulence,” was published online Nov. 10 by the Proceedings of the National Academy of Sciences. The work was supported by a National Institutes of Health Director’s New Innovator Award (grant no. 1DP2OD004389); the National Science Foundation (grant no. 1330288); an NIH National Institute of Allergy and Infectious Diseases postdoctoral fellowship (no. F32AI095002) and grant (no. R37-AI83256-06); and the Human Frontiers in Science Program.


    About Princeton: Overview

    Princeton University is a vibrant community of scholarship and learning that stands in the nation’s service and in the service of all nations. Chartered in 1746, Princeton is the fourth-oldest college in the United States. Princeton is an independent, coeducational, nondenominational institution that provides undergraduate and graduate instruction in the humanities, social sciences, natural sciences and engineering.

    As a world-renowned research university, Princeton seeks to achieve the highest levels of distinction in the discovery and transmission of knowledge and understanding. At the same time, Princeton is distinctive among research universities in its commitment to undergraduate teaching.

    Today, more than 1,100 faculty members instruct approximately 5,200 undergraduate students and 2,600 graduate students. The University’s generous financial aid program ensures that talented students from all economic backgrounds can afford a Princeton education.



