Tagged: Robotics

  • richardmitnick 8:34 am on December 16, 2014
    Tags: Robotics

    From Sandia: “Getting bot responders into shape” 


    Sandia Lab

    December 16, 2014
    Stephanie Holinka, slholin@sandia.gov, (505) 284-9227

    Sandia National Laboratories is tackling one of the biggest barriers to the use of robots in emergency response: energy efficiency.

    Through a project supported by the Defense Advanced Research Projects Agency (DARPA), Sandia is developing technology that will dramatically improve the endurance of legged robots, helping them operate for long periods while performing the types of locomotion most relevant to disaster response scenarios.

    Steve Buerger is leading a Sandia National Laboratories project to demonstrate how energy efficient biped walking robots could become. Increased efficiency could enable bots to operate for much longer periods of time without recharging batteries, an important factor in emergency situations. (Photo by Randy Montoya)
    One of Sandia’s new robots that showcases this technology will be demonstrated at an exposition to be held in conjunction with the DARPA Robotics Challenge Finals next June.

    As the finals draw closer, some of the most advanced robotics research and development organizations in the world are racing to develop emergency response robots that can complete a battery of tasks specified by DARPA. Competing robots will face degraded physical environments that simulate conditions likely to occur in a natural or man-made disaster. Many robots will walk on legs to allow them to negotiate challenging terrain.

    Sandia’s robots won’t compete in the finals next June, but they could ultimately help the winning robots extend their battery life until their life-saving work is done.

    “We’ll demonstrate how energy efficient biped walking robots could become. Increased efficiency could allow robots similar to those used for the competition to operate for much longer periods of time without recharging batteries,” said project lead Steve Buerger of Sandia’s Intelligent Systems Control Dept.

    Batteries need to last for emergency response robots

    Battery life is an important concern in the usefulness of robots for emergency response.

    “You can have the biggest, baddest, toughest robot on the planet, but if its battery life is 10 or 20 minutes, as many are right now, that robot cannot possibly function in an emergency situation, when lives are at stake,” said Buerger.

    The first robot Sandia is developing in support of the DARPA Challenge is known as STEPPR, for Sandia Transmission Efficient Prototype Promoting Research. It is a fully functional research platform that allows developers to try different joint-level mechanisms, which function like elbows and knees, and to quantify how much energy each uses.

    Sandia’s second robot, WANDERER, for Walking Anthropomorphic Novelly Driven Efficient Robot for Emergency Response, will be a more optimized and better-packaged prototype.

    Energy-efficient actuators key to testing

    The key to the testing is Sandia’s novel, energy-efficient actuators, which move the robots’ joints. The actuation system uses efficient, brushless DC motors with very high torque-to-weight ratios, very efficient low-ratio transmissions and specially designed passive mechanisms customized for each joint to ensure energy efficiency.

    “We take advantage of dynamic characteristics that are common to a wide variety of legged behaviors and add a set of ‘support elements,’ including springs and variable transmissions, that keep the motors operating at more efficient speed-torque conditions, reducing losses,” Buerger said.

    Electric motors are particularly inefficient when providing large torques at low speeds, for example, to a crouching robot, Buerger said. A simple support element, such as a spring, would provide torque, reducing the load on the motor.
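    As a rough, back-of-the-envelope illustration of why this helps: resistive (copper) loss in a motor grows with the square of the torque it must produce, so offloading most of a static holding torque onto a parallel spring cuts the loss dramatically. The short Python sketch below is only a toy model with invented numbers (motor constants, spring stiffness, crouch torque), not Sandia's actuator design.

        # Illustrative sketch of the parallel-spring idea (not Sandia's actual design).
        # Motor copper loss scales as (torque / Kt)^2 * R, so letting a spring carry
        # most of a static crouch torque cuts that loss sharply. All numbers below
        # are invented for illustration.

        KT = 2.0          # effective torque constant at the joint, N*m per A (assumed)
        R = 1.2           # winding resistance, ohms (assumed)
        K_SPRING = 40.0   # parallel spring stiffness, N*m per rad (assumed)
        THETA0 = 0.0      # spring rest angle, rad (assumed)

        def copper_loss(motor_torque_nm: float) -> float:
            """I^2 * R heating in the windings for a given motor torque."""
            current = motor_torque_nm / KT
            return current ** 2 * R

        def motor_torque_with_spring(joint_torque_nm: float, theta_rad: float) -> float:
            """Torque left for the motor once the spring supplies k * (theta - theta0)."""
            spring_torque = K_SPRING * (theta_rad - THETA0)
            return joint_torque_nm - spring_torque

        joint_torque = 30.0   # torque needed to hold a crouch, N*m (assumed)
        theta = 0.6           # knee flexion angle in the crouch, rad (assumed)

        print(f"copper loss, motor alone:    {copper_loss(joint_torque):6.1f} W")
        print(f"copper loss, motor + spring: {copper_loss(motor_torque_with_spring(joint_torque, theta)):6.1f} W")

    With these made-up numbers the spring carries most of the crouch torque and the heating loss drops by more than an order of magnitude, which is the effect the support elements are designed to exploit.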

    “The support elements also allow robots to self-adjust when they change behaviors. When they change from level walking to uphill walking, for example, they can make subtle adjustments to their joint dynamics to optimize efficiency under the new condition,” Buerger said.

    Robots must adapt to the diverse kinds of conditions expected in emergency response scenarios.

    “Certain legged robot designs are extremely efficient when walking on level ground, but function extremely inefficiently under other conditions or cannot walk over different types of terrains. Robots need an actuation system to enable efficient locomotion in many different conditions,” Buerger said. “That is what the adjustable support elements can do.”

    Early testing has shown STEPPR to operate efficiently and quietly.

    “Noise is lost energy, so being quiet goes hand-in-hand with being efficient. Most robots make a lot of noise, and that can be a major drawback for some applications,” Buerger said.

    Robots’ electronics, certain software to be publicly released

    STEPPR’s and WANDERER’s electronics and low-level software are being developed by the Open Source Robotics Foundation. The designs will be publicly released, allowing engineers and designers all over the world to take advantage of advances.

    The Florida Institute for Human and Machine Cognition is developing energy-efficient walking control algorithms for both robots. The Massachusetts Institute of Technology and Globe Motors also are contributing to the project.

    Sandia’s robotic work will be demonstrated in the technology exposition section of the DARPA Robotics Challenge, scheduled for June 5-6 at Fairplex in Pomona, Calif.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 5:08 pm on November 21, 2014
    Tags: Robotics

    From NSF: “A foundation for robotics” 

    National Science Foundation

    November 21, 2014
    Aaron Dubrow, NSF (703) 292-4489 adubrow@nsf.gov

    The fundamental research in computing and engineering that enabled robotics to develop in the U.S. has been supported by the National Science Foundation (NSF) since its inception.
    Yet despite these early investments in sensors, machine movement and computer vision, it wasn’t until 1972 that the first grant with “robot” in the title was funded.

    1970s: Robots for the factory floor

    In the mid-1970s, robotics began to gather steam at NSF. Among the first research projects classified as robotics were mechanical arms (seen [below]) that could pick a part out of a box and visually identify it and orient it properly for the next step on an assembly line, as well as computer-controlled welding robots. These and other NSF-funded projects were aimed at improving the productivity of American manufacturing processes–a goal for roboticists that continues to this day.

    This image of a robot arm, developed by the Stanford Research Institute, is similar to the one that appeared in the 1976 NSF Annual Report. The robotic system used computer vision to identify and make decisions about parts on an assembly line. This is one of several projects from that era aimed at improving the productivity of American manufacturing processes. Credit: SRI International

    1980s: Rise of the walking machines

    The 1980s brought an increased diversification in the types of robots being explored and the ways they could be used.

    At Ohio State, electrical engineer Robert McGhee and mechanical engineer Kenneth Waldron, along with a 60-member team of students and technical assistants, developed the Adaptive Suspension Vehicle (ASV), nicknamed the “Walker,” with support from NSF and the Defense Advanced Research Projects Agency (DARPA).

    What do you get when you combine 20 years of research, $5 million, and a Star Wars Imperial all-terrain vehicle? Ohio State’s Adaptive Suspension Vehicle (ASV), nicknamed the “Walker.” Developed by electrical engineer Robert McGhee and mechanical engineer Kenneth Waldron, along with a 60-member team of students and technical assistants, the ‘Walker’ was built under a research contract from the Defense Advanced Research Projects Agency (DARPA).

    The ASV was 17 feet long, 8 feet wide, and 10.5 feet high, and had six legs to support its three-ton aluminum body. It was designed to carry cargo for industrial and military applications over rough, mountainous, icy or muddy terrain, and was capable of crossing 9-foot-wide ditches or 7-foot-high walls.

    The Walker used a forward-mounted radar system to scan the terrain ahead and feed that data, along with instructions from the operator’s joystick, into the 16 onboard computers that coordinated and controlled the ASV’s legs. The computers moved each leg individually, up and down, forward and back, and closer to or farther from the ASV’s body, for a clunky but serviceable ride.

    1990s: Robots explore new environments

    Not long afterward, researchers supported by NSF were developing robots for a very different environment: underwater. First built in 1991 as a remotely operated vehicle, the Omni-Directional Intelligent Navigator (ODIN) was a sphere-shaped underwater robot capable of instantaneous movement in all six directions; in 1995 it was upgraded to ODIN II, a fully autonomous version. Sentry, a successor robot developed through a grant from NSF, plies the deep waters today, locating and quantifying hydrothermal fluxes.

    First built in 1991, the Omni-Directional Intelligent Navigator (ODIN) was a sphere-shaped, autonomous underwater robot capable of instantaneous movement in six directions. Credit: Autonomous Systems Laboratory, University of Hawaii

    In the 1990s, roboticists began turning their attention to day-to-day tasks with which a robot could assist. For instance, researchers from the University of Pittsburgh, University of Michigan and Carnegie Mellon University developed a series of mobile, personal service robots, such as Nursebot, that were designed to assist elderly people in their everyday life.

    Researchers from the University of Pittsburgh, University of Michigan and Carnegie Mellon University have developed mobile, personal service robots, such as Nursebot, that assist elderly people in their everyday life. An autonomous mobile robot that “lives” in the home of a chronically ill elderly person might remind its owner to take medicine, provide videoconferencing with doctors, collect patient data or watch for accidents, manipulate objects for arthritis sufferers, and provide some social interaction. Credit: Carnegie Mellon University

    An autonomous mobile robot that “lives” in the home of a chronically ill elderly person could remind its owner to take medicine, provide videoconferencing with doctors, collect patient data or watch for accidents, manipulate objects for arthritis sufferers, and provide some social interaction. New versions have evolved over the years, and a General Electric-developed hospital robot is expected to be tested at a Veterans Affairs hospital in 2015.

    2000s: Miniaturization and mobility

    Researchers have always envisioned a future where robots could serve the general good in disaster recovery and search-and-rescue operations, but it wasn’t until 9/11 that robots were broadly put to that use.

    Robotics expert Robin Murphy, then an associate professor of computer science at the University of South Florida, arrived on site the morning after the collapse of the World Trade Center. Murphy’s research on experimental mixed-initiative robots for urban rescue operations was originally funded by NSF. She brought with her a response team that included three graduate students–Jenn Casper, Mark Micire and Brian Minten–and software-guided “marsupial” robot systems. These intelligent, autonomous “marsupial” robots are especially useful in rubble because the “mother” robot releases smaller robots to explore tight spaces unreachable by other means.

    Over the next 11 days, the teams made five insertions onto the rubble piles, often at the request of the Federal Emergency Management Agency (FEMA) task force teams or sector chiefs. Murphy’s mechanized prowlers had tethers with a range of 100 feet, far outstripping the fire department’s seven-foot camera wands. The robots helped find five victims and another set of remains, though Murphy expressed regret that they hadn’t been more successful.

    As the 2000s progressed, efforts by engineers to miniaturize components led to robots that were significantly smaller than those that came before. One startling example of this trend is the RoboBee project, which was awarded an Expeditions in Computing award from NSF’s Directorate for Computer and Information Science and Engineering in 2009.

    Researchers in this expedition are creating robotic bees that fly autonomously and coordinate activities amongst themselves and the hive, much like real bees. The research team aims to drive research in compact, high-energy power sources, ultra-low-powered computing and the design of distributed algorithms for multi-agent systems. Most recently, RoboBees were pollinating young minds at the Boston Museum of Science in an exhibition dedicated to their complex design.

    2010s: Investing in co-robots

    In June 2011, the Obama administration launched the National Robotics Initiative (NRI) to develop robots that work with or beside people to extend or augment human capabilities, taking advantage of the different strengths of humans and robots. The initiative provided focus and funding for robotics research. The NRI is led by NSF and supported by multiple agencies including the National Aeronautics and Space Administration (NASA), the National Institutes of Health (NIH), the U.S. Department of Agriculture (USDA), and the U.S. Department of Defense (DOD).

    Since 2011, NSF and its partners in the NRI have invested more than $120 million in robotics research.

    Today, robots impact our lives in a myriad of ways. Robots are being used in classrooms across the nation to capture the excitement of students and help them learn STEM (and non-STEM) principles. They’re helping doctors perform surgeries, providing assistance to individuals with disabilities and inspecting bridges and roads to ensure our safety.

    “Robots, which once were limited to the realm of science fiction, are now a transformative technology, a demonstration of how NSF-funded basic research can bring about changes in human life,” said Marc Rothenberg, former NSF historian.

    Though robotics may seem an emerging area of research, many of the underpinnings of today’s robots began in fundamental research in sensing, computer vision, artificial intelligence, mechanical engineering and many other areas that some might not immediately recognize as being related to robots.

    What’s next

    NSF-supported researchers are making incredible advances in robotics, creating a new generation of co-robots that can handle critical tasks in close proximity to humans, safely and with greater resilience than previous intelligent machines.

    Check out our YouTube Robot playlist to learn more about the research of NSF-funded roboticists.

    Investigators
    Junku Yuh
    Song Choi
    Gu-Yeon Wei
    Robert Wood
    Robin Murphy
    Andrea Thomaz
    Charles Klein
    Robert McGhee
    Radhika Nagpal
    Blake Hannaford
    Judith Matthews
    Nilanjan Sarkar
    Donald Chiarulli
    Nikolaus Correll
    Said Koozekanani
    J. Gregory Morrisett
    Jacqueline Dunbar-Jacob

    Related Institutions/Organizations
    Harvard University
    University of Hawaii
    Ohio State University
    Colorado School of Mines
    University of Pittsburgh
    University of Washington
    Georgia Tech Research Corporation
    University of Colorado at Boulder

    Related Programs
    National Robotics Initiative

    Related Awards
    #9157896 Presidential Young Investigators Award
    #0953181 CAREER: Socially Guided Machine Learning
    #9320318 Reactive Sensing for Autonomous Mobile Robots
    #0085796 ITR: Personal Robotic Assistants for the Elderly
    #1150223 CAREER: Modeling and Design of Composite Swarming Behaviors
    #0926148 Collaborative Research: RoboBees: A Convergence of Body, Brain and Colony
    #0958441 II New: A Network of Open Experimental Testbeds for Surgical Robotics Research
    #7818957 Dynamics and Control of Industrial Manipulators and Legged Locomotion Systems
    #9701614 Intelligent Coordinated Motion Control of Underwater Robotic Vehicles with Manipulator Workpackages (Collaborative Research)
    #9603043 U.S.-Japan Cooperative Science: Virtual Collaborative World Simulator for Underwater Robots using Multi-Dimensional, Synthetic Environment

    Years Research Conducted
    1972 – 2014

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense.” We are the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.



     
  • richardmitnick 5:14 pm on August 25, 2014
    Tags: Robotics

    From SPACE.com: “NASA’s Robot Army of ‘Swarmies’ Could Explore Other Planets” 


    SPACE.com

    August 25, 2014
    Kelly Dickerson

    They may look like remote-controlled toy trucks, but a troop of new NASA robots could one day race across distant planets as a sort of space exploration vanguard.


    The autonomous robots, which engineers have dubbed “swarmies,” are much smaller than other NASA robots like the Mars rover Curiosity. Each comes equipped with a webcam, Wi-Fi antenna, and GPS system for navigation. The self-driving swarmie robots could be used to search alien surfaces one day. Credit: NASA/Dmitri Gerondidakis

    The swarmies function in a way similar to an ant colony. When one ant stumbles across a food source, it sends out a signal to the rest of the colony, and then the ants work together to cart the food back to the nest. Engineers from NASA’s Kennedy Space Center in Florida developed software that directs the swarmies to fan out in different directions and search for a specific, predetermined material, like water ice on Mars. Once one of the rovers finds something interesting, it can use radio communication to call its robotic brethren over to help collect samples.
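    To make that division of labor concrete, the ant-like behavior amounts to a per-robot loop of “wander until you find something, then broadcast your position so the others can converge.” Below is a toy Python sketch of that loop; the grid size, robot count and message format are all invented, and none of this is NASA's actual swarmie software.

        # Toy sketch of the ant-colony search-and-recruit behavior described above;
        # this is not NASA's swarmie software. Robots random-walk a grid, the first
        # one to reach the target broadcasts its position over a shared "radio"
        # channel, and the others then drive straight to it to help.
        import random

        GRID = 20                  # assumed size of the search area, in cells
        TARGET = (17, 4)           # assumed location of the material of interest
        radio = []                 # stands in for the shared radio channel

        class Swarmie:
            def __init__(self, name):
                self.name = name
                self.pos = [GRID // 2, GRID // 2]   # every robot starts at the lander

            def step(self):
                if radio:                            # a find has been broadcast
                    tx, ty = radio[0]
                    self.pos[0] += (tx > self.pos[0]) - (tx < self.pos[0])
                    self.pos[1] += (ty > self.pos[1]) - (ty < self.pos[1])
                else:                                # otherwise keep searching at random
                    axis = random.randint(0, 1)
                    self.pos[axis] = max(0, min(GRID - 1,
                                                self.pos[axis] + random.choice((-1, 1))))
                if tuple(self.pos) == TARGET and not radio:
                    radio.append(TARGET)             # call the rest of the swarm over
                    print(f"{self.name} found the target and called for help")

        robots = [Swarmie(f"swarmie-{i}") for i in range(4)]
        for t in range(5000):
            for r in robots:
                r.step()
            if all(tuple(r.pos) == TARGET for r in robots):
                print(f"all four robots gathered at the target after {t + 1} steps")
                break

    The point of the sketch is the one Leucht makes below: any single robot can fail without ending the mission, because the behavior lives in the group rather than in one machine.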

    “For a while people were interested in putting as much smarts and capability as they could on their one robot,” Kurt Leucht, one of the engineers working on the project, said in a statement. “Now people are realizing you can have much smaller, much simpler robots that can work together and achieve a task. One of them can roll over and die and it’s not the end of the mission because the others can still accomplish the task.”

    Working out a way to send humans on lunar or Martian exploration missions is complicated and expensive, and those kinds of missions are likely still a long way off. Sending robots is an easier alternative, and NASA is working on a whole new generation of autonomous robotic explorers. NASA engineers have already dreamed up slithering snake-like robots that could explore Mars and deep-diving robots that could explore the oceans of Jupiter’s moon Europa.

    The RASSOR robot is programmed for digging and mining and will be incorporated into the swarmie test drives. Credit: NASA

    The swarmie tests are still in the preliminary stages, and NASA engineers are only driving the swarmies around the parking lots surrounding Kennedy’s Launch Control Center. Right now the robots are only programmed to hunt for barcoded slips of paper. Over the next few months, swarmie tests will also include RASSOR — a mining robot specially designed to dig into alien surfaces and search for interesting or valuable materials. The test will determine how well the swarming software translates to control other robotic vehicles.

    Swarmies might also find a use on Earth, NASA officials said. The robots could aid in rescue missions following natural disasters or building collapses, and could survey crashes and other wreckage sites. The robots would also make perfect pipeline inspectors.

    “This would give you something smaller and cheaper that could always be running up and down the length of the pipeline so you would always know the health of your pipelines,” Cheryle Mako, a NASA engineer who is leading the project, said in a statement. “If we had small swarming robots that had a couple sensors and knew what they were looking for, you could send them out to a leak site and find which area was at greatest risk.”

    See the full article here.



     
  • richardmitnick 3:08 pm on August 15, 2014
    Tags: Robotics

    From Harvard: “A self-organizing thousand-robot swarm” 

    Harvard School of Engineering and Applied Sciences

    August 14, 2014
    Caroline Perry

    Following simple programmed rules, autonomous robots arrange themselves into vast, complex shapes

    The first thousand-robot flash mob has assembled at Harvard University.

    “Form a sea star shape,” directs a computer scientist, sending the command to 1,024 little bots simultaneously via an infrared light. The robots begin to blink at one another and then gradually arrange themselves into a five-pointed star. “Now form the letter K.”

    The ‘K’ stands for Kilobots, the name given to these extremely simple robots, each just a few centimeters across, standing on three pin-like legs. Instead of one highly complex robot, a “kilo” of robots collaborate, providing a simple platform for enacting complex behaviors.

    Just as trillions of individual cells can assemble into an intelligent organism, or a thousand starlings can form a great flowing murmuration across the sky, the Kilobots demonstrate how complexity can arise from very simple behaviors performed en masse (see video). To computer scientists, they also represent a significant milestone in the development of collective artificial intelligence (AI).

    Given a two-dimensional image, the Kilobots follow simple rules to form the same shape. Visually, the effect is similar to a flock of birds wheeling across the sky. “At some level you no longer even see the individuals; you just see the collective as an entity to itself,” says Radhika Nagpal.

    (Image courtesy of Mike Rubenstein and Science/AAAS.)

    This self-organizing swarm was created in the lab of Radhika Nagpal, Fred Kavli Professor of Computer Science at the Harvard School of Engineering and Applied Sciences (SEAS) and a Core Faculty Member at the Wyss Institute for Biologically Inspired Engineering at Harvard University. The advance is described in the August 15 issue of Science.

    “The beauty of biological systems is that they are elegantly simple—and yet, in large numbers, accomplish the seemingly impossible,” says Nagpal. “At some level you no longer even see the individuals; you just see the collective as an entity to itself.”

    “Biological collectives involve enormous numbers of cooperating entities—whether you think of cells or insects or animals—that together accomplish a single task that is a magnitude beyond the scale of any individual,” says lead author Michael Rubenstein, a research associate at Harvard SEAS and the Wyss Institute.

    He cites, for example, the behavior of a colony of army ants. By linking together, they can form rafts and bridges to cross difficult terrain. Social amoebas do something similar at a microscopic scale: when food is scarce, they join together to create a fruiting body capable of escaping the local environment. In cuttlefish, color changes at the level of individual cells can help the entire organism blend into its surroundings. (And as Nagpal points out—with a smile—a school of fish in the movie Finding Nemo also collaborate when they form the shape of an arrow to point Nemo toward the jet stream.)

    “We are especially inspired by systems where individuals can self-assemble together to solve problems,” says Nagpal. Her research group made news in February 2014 with a group of termite-inspired robots that can collaboratively perform construction tasks using simple forms of coordination.

    But the algorithm that instructs those TERMES robots has not yet been demonstrated in a very large swarm. In fact, only a few robot swarms to date have exceeded 100 individuals, because of the algorithmic limitations on coordinating such large numbers, and the cost and labor involved in fabricating the physical devices.

    The research team overcame both of these challenges through thoughtful design.

    Most notably, the Kilobots require no micromanagement or intervention once an initial set of instructions has been delivered. Four robots mark the origin of a coordinate system; all the other robots receive a 2D image that they should mimic and then, using very primitive behaviors—following the edge of a group, tracking a distance from the origin, and maintaining a sense of relative location—take turns moving toward an acceptable position. Working with coauthor Alejandro Cornejo, a postdoctoral fellow at Harvard SEAS and the Wyss Institute, the team developed a mathematical proof that the individual behaviors would lead to the correct global result.
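    One of those primitive behaviors, tracking a distance from the origin, is commonly built from a hop-count gradient: the seed robots hold the value zero, and every other robot sets its own value to one more than the smallest value it hears from its neighbors. The Python sketch below illustrates only that single primitive, with made-up positions and a synchronous update loop; the real Kilobots update asynchronously over infrared, so treat this as a simplified illustration rather than the published implementation.

        # Simplified sketch of the hop-count "gradient" primitive behind tracking a
        # distance from the origin. Positions, communication radius and the
        # synchronous update are all simplifications of the real asynchronous system.
        import math
        import random

        COMM_RADIUS = 3.0    # assumed IR communication range, in robot body lengths

        random.seed(1)
        # A rough line of 12 robots with a little placement jitter; robot 0 is a seed.
        positions = [(i + random.uniform(-0.3, 0.3), random.uniform(-0.3, 0.3))
                     for i in range(12)]
        gradient = [0 if i == 0 else math.inf for i in range(len(positions))]

        def neighbors(i):
            """Indices of robots close enough to hear robot i's infrared messages."""
            xi, yi = positions[i]
            return [j for j, (xj, yj) in enumerate(positions)
                    if j != i and math.hypot(xi - xj, yi - yj) <= COMM_RADIUS]

        changed = True
        while changed:                  # repeat until no robot updates its value
            changed = False
            for i in range(1, len(positions)):
                best = min((gradient[j] for j in neighbors(i)), default=math.inf)
                if best + 1 < gradient[i]:
                    gradient[i] = best + 1
                    changed = True

        print("gradient values along the line:", gradient)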

    The Kilobots also correct their own mistakes. If a traffic jam forms or a robot moves off-course—errors that become much more common in a large group—nearby robots sense the problem and cooperate to fix it.

    In a swarm of a thousand simple robots, errors like traffic jams (second from left) and imprecise positioning (far right) are common, so the algorithm incorporates rules that can help correct for these. (Photo courtesy of Mike Rubenstein and Science/AAAS.)

    To keep the cost of the Kilobot down, each robot moves using two vibrating motors that allow it to slide across a surface on its rigid legs. An infrared transmitter and receiver allow it to communicate with a few of its neighbors and measure their proximity—but the robots are myopic and have no access to a bird’s-eye view. These design decisions come with tradeoffs, as Rubenstein explains: “These robots are much simpler than many conventional robots, and as a result, their abilities are more variable and less reliable,” he says. “For example, the Kilobots have trouble moving in a straight line, and the accuracy of distance sensing can vary from robot to robot.”

    Yet, at scale, the smart algorithm overcomes these individual limitations and guarantees—both physically and mathematically—that the robots can complete a human-specified task, in this case assembling into a particular shape. That’s an important demonstration for the future of distributed robotics, says Nagpal.

    “Increasingly, we’re going to see large numbers of robots working together, whether it’s hundreds of robots cooperating to achieve environmental cleanup or a quick disaster response, or millions of self-driving cars on our highways,” she says. “Understanding how to design ‘good’ systems at that scale will be critical.”

    For now, the Kilobots provide an essential test bed for AI algorithms.

    The thousand-Kilobot swarm provides a valuable platform for testing future collective AI algorithms. (Photo courtesy of Mike Rubenstein and Science/AAAS.)

    “We can simulate the behavior of large swarms of robots, but a simulation can only go so far,” says Nagpal. “The real-world dynamics—the physical interactions and variability—make a difference, and having the Kilobots to test the algorithm on real robots has helped us better understand how to recognize and prevent the failures that occur at these large scales.”

    The Kilobot robot design and software, originally created in Nagpal’s group at Harvard, are available open-source for non-commercial use. The Kilobots have also been licensed by Harvard’s Office of Technology Development to K-Team, a manufacturer of small mobile robots.

    This research was supported in part by the Wyss Institute and by the National Science Foundation (CCF-0926148, CCF-0643898).

    See the full article here.

    Through research and scholarship, the Harvard School of Engineering and Applied Sciences (SEAS) will create collaborative bridges across Harvard and educate the next generation of global leaders. By harnessing the power of engineering and applied sciences we will address the greatest challenges facing our society.

    Specifically, that means that SEAS will provide to all Harvard College students an introduction to and familiarity with engineering and technology as this is essential knowledge in the 21st century.

    Moreover, our concentrators will be immersed in the liberal arts environment and be able to understand the societal context for their problem solving, capable of working seamlessly with others, including those in the arts, the sciences, and the professional schools. They will focus on the fundamental engineering and applied science disciplines for the 21st century; we will not teach legacy 20th-century engineering disciplines.

    Instead, our curriculum will be rigorous but inviting to students, and be infused with active learning, interdisciplinary research, entrepreneurship and engineering design experiences. For our concentrators and graduate students, we will educate “T-shaped” individuals – with depth in one discipline but capable of working seamlessly with others, including arts, humanities, natural science and social science.

    To address current and future societal challenges, knowledge from fundamental science, art, and the humanities must all be linked through the application of engineering principles with the professions of law, medicine, public policy, design and business practice.

    In other words, solving important issues requires a multidisciplinary approach.

    With the combined strengths of SEAS, the Faculty of Arts and Sciences, and the professional schools, Harvard is ideally positioned to both broadly educate the next generation of leaders who understand the complexities of technology and society and to use its intellectual resources and innovative thinking to meet the challenges of the 21st century.

    Ultimately, we will provide to our graduates a rigorous quantitative liberal arts education that is an excellent launching point for any career and profession.


     
  • richardmitnick 8:24 am on July 14, 2014
    Tags: Robotics

    From M.I.T.: “Squishy robots” 

    M.I.T.

    July 14, 2014
    Helen Knight

    Phase-changing material could allow even low-cost robots to switch between hard and soft states.


    In the movie Terminator 2, the shape-shifting T-1000 robot morphs into a liquid state to squeeze through tight spaces or to repair itself when harmed.

    Now a phase-changing material built from wax and foam, and capable of switching between hard and soft states, could allow even low-cost robots to perform the same feat.

    The material — developed by Anette Hosoi, a professor of mechanical engineering and applied mathematics at MIT, and her former graduate student Nadia Cheng, alongside researchers at the Max Planck Institute for Dynamics and Self-Organization and Stony Brook University — could be used to build deformable surgical robots. The robots could move through the body to reach a particular point without damaging any of the organs or vessels along the way.

    Robots built from the material, which is described in a new paper in the journal Macromolecular Materials and Engineering, could also be used in search-and-rescue operations to squeeze through rubble looking for survivors, Hosoi says.

    Follow that octopus

    Working with robotics company Boston Dynamics, based in Waltham, Mass., the researchers began developing the material as part of the Chemical Robots program of the Defense Advanced Research Projects Agency (DARPA). The agency was interested in “squishy” robots capable of squeezing through tight spaces and then expanding again to move around a given area, Hosoi says — much as octopuses do.

    But if a robot is going to perform meaningful tasks, it needs to be able to exert a reasonable amount of force on its surroundings, she says. “You can’t just create a bowl of Jell-O, because if the Jell-O has to manipulate an object, it would simply deform without applying significant pressure to the thing it was trying to move.”

    What’s more, controlling a very soft structure is extremely difficult: It is much harder to predict how the material will move, and what shapes it will form, than it is with a rigid robot.

    So the researchers decided that the only way to build a deformable robot would be to develop a material that can switch between a soft and hard state, Hosoi says. “If you’re trying to squeeze under a door, for example, you should opt for a soft state, but if you want to pick up a hammer or open a window, you need at least part of the machine to be rigid,” she says.

    Compressible and self-healing

    To build a material capable of shifting between squishy and rigid states, the researchers coated a foam structure in wax. They chose foam because it can be squeezed into a small fraction of its normal size, but once released will bounce back to its original shape.

    The wax coating, meanwhile, can change from a hard outer shell to a soft, pliable surface with moderate heating. This could be done by running a wire along each of the coated foam struts and then applying a current to heat up and melt the surrounding wax. Turning off the current again would allow the material to cool down and return to its rigid state.
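    For a rough sense of the energy budget in that scheme, the heat needed to soften one strut is approximately the heat required to bring its wax coating up to the melting point plus the latent heat of fusion, and dividing by the heater power gives a ballpark switching time. The figures in the sketch below (wax mass, heater power, even the material constants) are assumptions for illustration and do not come from the paper.

        # Back-of-the-envelope estimate of the heat-to-soften step described above.
        # All numbers are assumptions for illustration, not values from the paper.

        wax_mass_kg = 0.002        # wax coating on one foam strut (assumed: 2 g)
        specific_heat = 2500.0     # J/(kg*K), a typical value for paraffin wax
        latent_heat = 200_000.0    # J/kg, a typical heat of fusion for paraffin
        ambient_c = 25.0           # starting temperature, deg C
        melt_point_c = 60.0        # a typical paraffin melting point, deg C
        heater_power_w = 5.0       # resistive-wire heating power per strut (assumed)

        # Warm the wax to its melting point, then supply the heat of fusion to melt it.
        energy_j = wax_mass_kg * (specific_heat * (melt_point_c - ambient_c) + latent_heat)
        switch_time_s = energy_j / heater_power_w

        print(f"energy to soften one strut: about {energy_j:.0f} J")
        print(f"time at {heater_power_w:.0f} W of heater power: about {switch_time_s:.0f} s")

    With these assumed numbers the transition takes on the order of a couple of minutes per strut, which is why the hard-to-soft switch is best thought of as a mode change rather than something done continuously during motion.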

    In addition to switching the material to its soft state, heating the wax in this way would also repair any damage sustained, Hosoi says. “This material is self-healing,” she says. “So if you push it too far and fracture the coating, you can heat it and then cool it, and the structure returns to its original configuration.”

    To build the material, the researchers simply placed the polyurethane foam in a bath of melted wax. They then squeezed the foam to encourage it to soak up the wax, Cheng says. “A lot of materials innovation can be very expensive, but in this case you could just buy really low-cost polyurethane foam and some wax from a craft store,” she says.

    To study the properties of the material in more detail, they then used a 3-D printer to build a second version of the foam lattice structure, which allowed them to carefully control the position of each of the struts and pores.

    When they tested the two materials, they found that the printed lattice was more amenable to analysis than the polyurethane foam, although the latter would still be fine for low-cost applications, Hosoi says.

    The wax coating could also be replaced by a stronger material, such as solder, she adds.

    Hosoi is now investigating the use of other unconventional materials for robotics, such as magnetorheological and electrorheological fluids. These materials consist of a liquid with particles suspended inside, and can be made to switch from a soft to a rigid state with the application of a magnetic or electric field.

    When it comes to artificial muscles for soft and biologically inspired robots, we tend to think of controlling shape through bending or contraction, says Carmel Majidi, an assistant professor of mechanical engineering in the Robotics Institute at Carnegie Mellon University, who was not involved in the research. “But for a lot of robotics tasks, reversibly tuning the mechanical rigidity of a joint can be just as important,” he says. “This work is a great demonstration of how thermally controlled rigidity-tuning could potentially be used in soft robotics.”

    See the full article, with video, here.



     