Tagged: Robotics

  • richardmitnick 8:44 am on April 28, 2016 Permalink | Reply
    Tags: Robotics

    From Stanford: “Maiden voyage of Stanford’s humanoid robotic diver recovers treasures from King Louis XIV’s wrecked flagship” 

    Stanford University

    OceanOne, a new humanoid robotic diver from Stanford, explores a 17th century shipwreck. (Image credit: Frederic Osada and Teddy Seguin/DRASSM)

    April 27, 2016
    Bjorn Carey

    Access the mp4 video here.
    Video by Kurt Hickman

    Oussama Khatib held his breath as he swam through the wreck of La Lune, 100 meters below the Mediterranean. The flagship of King Louis XIV sank here in 1664, 20 miles off the southern coast of France, and no human had touched the ruins – or the countless treasures and artifacts the ship once carried – in the centuries since.

    With guidance from a team of skilled deep-sea archaeologists who had studied the site, Khatib, a professor of computer science at Stanford, spotted a grapefruit-size vase. He hovered precisely over the vase, reached out, felt its contours and weight, and stuck a finger inside to get a good grip. He swam over to a recovery basket, gently laid down the vase and shut the lid. Then he stood up and high-fived the dozen archaeologists and engineers who had been crowded around him.

    This entire time Khatib had been sitting comfortably in a boat, using a set of joysticks to control OceanOne, a humanoid diving robot outfitted with human vision, haptic force feedback and an artificial brain – in essence, a virtual diver.

    When the vase returned to the boat, Khatib was the first person to touch it in hundreds of years. It was in remarkably good condition, though it showed every day of its time underwater: The surface was covered in ocean detritus, and it smelled like raw oysters. The team members were overjoyed, and when they popped bottles of champagne, they made sure to give their heroic robot a celebratory bath.

    The expedition to La Lune was OceanOne’s maiden voyage. Based on its astonishing success, Khatib hopes that the robot will one day take on highly skilled underwater tasks too dangerous for human divers, as well as open up a whole new realm of ocean exploration.

    “OceanOne will be your avatar,” Khatib said. “The intent here is to have a human diving virtually, to put the human out of harm’s way. Having a machine that has human characteristics that can project the human diver’s embodiment at depth is going to be amazing.”
    Anatomy of a robo-mermaid

    The concept for OceanOne was born from the need to study coral reefs deep in the Red Sea, far below the comfortable range of human divers. No existing robotic submarine can dive with the skill and care of a human diver, so OceanOne was conceived and built from the ground up, a successful marriage of robotics, artificial intelligence and haptic feedback systems.

    OceanOne looks something like a robo-mermaid. Roughly five feet long from end to end, its torso features a head with stereoscopic vision that shows the pilot exactly what the robot sees, and two fully articulated arms. The “tail” section houses batteries, computers and eight multi-directional thrusters.

    The body looks nothing like conventional boxy robotic submersibles, but it’s the hands that really set OceanOne apart. Each fully articulated wrist is fitted with force sensors that relay haptic feedback to the pilot’s controls, so the human can feel whether the robot is grasping something firm and heavy, or light and delicate. (Eventually, each finger will be covered with tactile sensors.) The ‘bot’s brain also reads the data and makes sure that its hands keep a firm grip on objects without damaging them by squeezing too tightly. Beyond exploring shipwrecks, this makes it adept at delicate coral reef research and at precisely placing underwater sensors.
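    As a rough illustration of the grip regulation described above, the logic amounts to steering the commanded force toward what is needed to hold the object while never exceeding a safety ceiling. This is a hypothetical sketch (names, gains and units are illustrative, not OceanOne’s actual control code):

```python
def grip_command(f_sensed, f_target, f_max, k_p=0.5):
    """Move the commanded grip force toward a target, but clamp it.

    f_sensed -- force currently read from the wrist sensor (newtons)
    f_target -- force judged sufficient to hold the object (newtons)
    f_max    -- hard ceiling that protects fragile objects (newtons)
    """
    command = f_sensed + k_p * (f_target - f_sensed)  # proportional step toward target
    return max(0.0, min(command, f_max))              # never over-squeeze

# A fragile vase: we'd like 2 N of grip, but cap the squeeze at 1.4 N.
print(grip_command(1.0, 2.0, 1.4))  # → 1.4
```

    The same sensor readings drive both sides of the interface: the pilot feels them through the haptic controls, while the clamp keeps the hands from crushing whatever they hold.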

    “You can feel exactly what the robot is doing,” Khatib said. “It’s almost like you are there; with the sense of touch you create a new dimension of perception.”


    The pilot can take control at any moment, but most frequently won’t need to lift a finger. Sensors throughout the robot gauge current and turbulence, automatically activating the thrusters to keep the robot in place. And even as the body moves, quick-firing motors adjust the arms to keep its hands steady as it works. Navigation relies on perception of the environment, from both sensors and cameras, and these data run through smart algorithms that help OceanOne avoid collisions. If it senses that its thrusters won’t slow it down quickly enough, it can quickly brace for impact with its arms, an advantage of its humanoid build.
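    The station-keeping behavior described here is feedback control in miniature: measure the drift, fire the thrusters against it. A minimal one-axis sketch (hypothetical gains; not the actual OceanOne software) looks like:

```python
def thruster_force(pos_error, velocity, k_p=8.0, k_d=3.0, u_max=20.0):
    """PD station-keeping on one axis: push back toward the hold point.

    pos_error -- displacement from the desired hold position (m)
    velocity  -- current drift velocity (m/s)
    """
    u = -k_p * pos_error - k_d * velocity   # spring-like restoring force plus damping
    return max(-u_max, min(u, u_max))       # real thrusters saturate
```

    A steady current shows up as a growing position error, so the commanded thrust ramps up until the disturbance is balanced; the damping term keeps the robot from oscillating around the hold point.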

    A human touch

    The humanoid form also means that when OceanOne dives alongside actual humans, its pilot can communicate through hand gestures during complex tasks or scientific experiments. Ultimately, though, Khatib designed OceanOne with an eye toward getting human divers out of harm’s way. Every aspect of the robot’s design is meant to allow it to take on tasks that are either dangerous – deep-water mining, oil-rig maintenance or underwater disaster situations like the Fukushima Daiichi power plant – or simply beyond the physical limits of human divers.

    “We connect the human to the robot in a very intuitive and meaningful way. The human can provide intuition and expertise and cognitive abilities to the robot,” Khatib said. “The two bring together an amazing synergy. The human and robot can do things in areas too dangerous for a human, while the human is still there.”

    Khatib was forced to showcase this attribute while recovering the vase. As OceanOne swam through the wreck, it wedged itself between two cannons. Firing the thrusters in reverse wouldn’t extricate it, so Khatib took control of the arms, motioned for the bot to perform a sort of pushup, and OceanOne was free.

    The expedition to La Lune was made possible in large part thanks to the efforts of Michel L’Hour, the director of underwater archaeology research in France’s Ministry of Culture. Previous remote studies of the shipwreck conducted by L’Hour’s team made it possible for OceanOne to navigate the site. Vincent Creuze of the Université de Montpellier in France commanded the support underwater vehicle that provided third-person visuals of OceanOne and held its support tether at a safe distance.

    Several students played key roles in OceanOne’s success, including graduate students Gerald Brantner, Xiyang Yeh, Boyeon Kim and Brian Soe, who joined Khatib in France for the expedition, as well as Shameek Ganguly, Mikael Jorda, and a number of undergraduate and graduate students. Khatib also drew on the expertise of Mark Cutkosky, a professor of mechanical engineering, for designing and building the robotic arms.

    Next month, OceanOne will return to the Stanford campus, where Khatib and his students will continue iterating on the platform. The prototype robot is a fleet of one, but Khatib hopes to build more units, which would work in concert during a dive.

    In addition to Stanford, the development of the robot was supported by Meka Robotics and the King Abdullah University of Science and Technology (KAUST) in Saudi Arabia.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

  • richardmitnick 4:33 pm on March 3, 2016 Permalink | Reply
    Tags: Robotics

    From Cornell: “Light-up skin stretches boundaries of robotics” 

    Cornell University

    March 3, 2016
    Tom Fleischman

    The research group of Rob Shepherd, assistant professor of mechanical and aerospace engineering, has developed a highly stretchable electroluminescent skin capable of stretching to nearly six times its original size while still emitting light. The group’s work is documented in a paper published online March 3 in the journal Science. (Image credit: Chris Larson)

    A health care robot that displays a patient’s temperature and pulse, and even reacts to a patient’s mood.

    An autonomous vehicle with an information display interface that can be changed based on the passenger’s needs.

    Even in this age of smartphones and other electronics wonders, these ideas sound quite futuristic. But a team of Cornell graduate students – led by Rob Shepherd, assistant professor of mechanical and aerospace engineering – has developed an electroluminescent “skin” that stretches to more than six times its original size while still emitting light. The discovery could lead to significant advances in health care, transportation, electronic communication and other areas.

    “This material can stretch with the body of a soft robot, and that’s what our group does,” Shepherd said, noting that the material has two key properties: “It allows robots to change their color, and it also allows displays to change their shape.”

    This hyper-elastic light-emitting capacitor (HLEC), which can endure more than twice the strain of previously tested stretchable displays, consists of layers of transparent hydrogel electrodes sandwiching a dielectric (insulating) elastomer sheet. The elastomer changes luminance and capacitance (the ability to store an electrical charge) when stretched, rolled and otherwise deformed.
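    The capacitance change under stretch follows from the parallel-plate relation C = εA/d: stretching an incompressible sheet grows its electrode area and thins the dielectric at the same time, and both effects raise C. A back-of-the-envelope sketch (an idealized model, not the paper’s measured device):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(eps_r, area, thickness):
    """Parallel-plate capacitance C = eps_r * eps0 * A / d."""
    return eps_r * EPS0 * area / thickness

def stretched_capacitance(c0, lam):
    """Equal-biaxial stretch by factor lam of an incompressible sheet:
    area scales by lam**2, thickness by 1/lam**2, so C scales by lam**4."""
    return c0 * lam**4

# Doubling the planar stretch (lam = 2) multiplies capacitance 16-fold.
print(stretched_capacitance(1.0, 2.0))  # → 16.0
```

    That strong dependence of capacitance on deformation is what makes the same skin usable as a tactile sensor as well as a display.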

    “We can take these pixels that change color and put them on these robots, and now we have the ability to change their color,” Shepherd said. “Why is that important? For one thing, when robots become more and more a part of our lives, the ability for them to have emotional connection with us will be important. So to be able to change their color in response to mood or the tone of the room we believe is going to be important for human-robot interactions.”

    In addition to its ability to emit light while stretched to more than 480 percent of its original size, the group’s HLEC was shown to be capable of being integrated into a soft robotic system. Three six-layer HLEC panels were bound together to form a crawling soft robot, with the top four layers making up the light-up skin and the bottom two the pneumatic actuators.

    The chambers were alternately inflated and deflated, with the resulting curvature creating an undulating, “walking” motion.

    Shepherd credited a group of four graduate students – Bryan Peele, Chris Larson, Shuo Li and Sanlin Robinson – with coming up with the idea for the material. All but Li were in Shepherd’s Rheology and Processing of Soft Materials class in spring 2014, when the seeds for this discovery were planted.

    “They would say something like, ‘OK, we have a single pixel that can stretch 500 percent in length.’ And so I’d say, ‘That’s cool, but what is the application for it?’” Shepherd said. “And that’s the biggest thing – you can have something cool, but you need to find a reason to use it.”

    In addition to the four graduate students, all members of the Shepherd Group, contributors included Massimo Tottaro, Lucia Beccai and Barbara Mazzolai of the Italian Institute of Technology’s Center for Micro-BioRobotics, a world leader in robotics study. Shepherd met Beccai and Mazzolai at a conference two years ago; this was their first research collaboration.

    The group’s paper, “Highly Stretchable Electroluminescent Skin for Optical Signaling and Tactile Sensing,” is published in the March 3 online edition of the journal Science.

    Although Shepherd admitted to “not being very fashion-forward,” another application involves wearable electronics. While wearable technology today involves putting hard electronics onto a soft base (think Apple Watch or Fitbit), this discovery paves the way for devices that fully conform to the wearer’s shape.

    “You could have a rubber band that goes around your arm that also displays information,” Larson said. “You could be in a meeting and have a rubber band-like device on your arm and could be checking your email. That’s obviously in the future, but that’s the direction we’re looking in.”

    The Shepherd Group has also developed a lightweight, stretchable material with the consistency of memory foam, with the potential for use in prosthetic body parts, artificial organs and soft robotics.

    The group’s latest work was supported by a grant from the Army Research Office, a 2015 award from the Air Force Office of Scientific Research, and two grants from the National Science Foundation.

    See the full article here.

    Once called “the first American university” by educational historian Frederick Rudolph, Cornell University represents a distinctive mix of eminent scholarship and democratic ideals. Adding practical subjects to the classics and admitting qualified students regardless of nationality, race, social circumstance, gender, or religion was quite a departure when Cornell was founded in 1865.

    Today’s Cornell reflects this heritage of egalitarian excellence. It is home to the nation’s first colleges devoted to hotel administration, industrial and labor relations, and veterinary medicine. Both a private university and the land-grant institution of New York State, Cornell University is the most educationally diverse member of the Ivy League.

    On the Ithaca campus alone nearly 20,000 students representing every state and 120 countries choose from among 4,000 courses in 11 undergraduate, graduate, and professional schools. Many undergraduates participate in a wide range of interdisciplinary programs, play meaningful roles in original research, and study in Cornell programs in Washington, New York City, and the world over.

  • richardmitnick 7:45 am on October 27, 2015 Permalink | Reply
    Tags: Robotics

    From TUM: “Robots learn to walk” 

    Technische Universität München


    Dr. Daniel Renjewski
    Technical University of Munich (TUM)
    Chair of Robotics and Embedded Systems (Prof. Alois Knoll)
    +49 (0)89 289 18133

    Prof. Jonathan Hurst
    Oregon State University
    Associate Professor of Mechanical Engineering

    Springs instead of muscles: The “ATRIAS” robot walks like a human

    Illustration of “ATRIAS” by Mikhail Jones

    Download the mp4 video here.

    Humanoid robots are intended to become more and more like people. But walking on two legs – one of the characteristic features of the human being – continues to pose particular problems to these machines. Dr. Daniel Renjewski of the Technical University of Munich (TUM), together with his colleagues at Oregon State University, has developed a robot whose gait comes closer than ever before to that of humans. The results of their study could also be used to develop better prostheses.

    When we walk, we are not consciously aware of the structure of the ground. Our body has the ability to automatically compensate for small uneven patches without tripping or coming to a standstill. Walking robots, such as the humanoid “ASIMO” from Japan, closely resemble humans in appearance, but tend to walk slowly and stiffly. They also use up a lot of energy in the process.


    Humans and animals do not think about walking, explains Dr. Daniel Renjewski of the Chair of Robotics and Embedded Systems at the TUM. “The intelligence lies in the mechanics.” Tendons and muscles cushion the impact of any uneven patches in the ground. “When we walk, one might say that we fall from one step into the other,” says Renjewski. This means that our gait is sometimes unstable. Were we to interrupt our stride in mid-movement, we would fall.

    Previous walking robots: stable but stiff

    This type of dynamic movement is difficult to control in a robot designed using conventional approaches. To guarantee that the machine is always stable and does not fall, the engineers therefore have to measure where the robot is located at every point in time, and how its center of gravity shifts as it moves. The price for this accurate steering is that its movements are by necessity controlled and stiff. In most cases, the machines walk on level terrain in the laboratory and are only required to avoid defined obstacles.

    The goal for Renjewski and his colleagues at Oregon State University was to develop a robot whose gait is the same as that of a human. The name they gave to this robot, which they describe in a report in the journal IEEE Transactions on Robotics, was “ATRIAS” (Assume The Robot Is A Sphere).

    Theory and practice

    The development of ATRIAS is based on the so-called spring-mass model, first presented in 1989. It describes the underlying principle of walking on two legs. In this model, the entire mass of the body is concentrated into one point, which is attached to a massless spring. The spring is a simplified representation of the muscles, bones and tendons on which the forces generated during walking act in the real world.

    The researchers had to make a few more adjustments to enable them to implement this theoretical model technically: in reality, the springs also have mass, and motors are needed to compensate for unavoidable damping in the system.
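    The spring-mass (“SLIP”) model described above is compact enough to simulate directly: during stance the point-mass body feels only gravity and the axial force of the massless leg spring. One Euler integration step can be sketched as follows (illustrative parameters, not ATRIAS’s actual values):

```python
import math

def slip_stance_step(x, z, vx, vz, foot_x,
                     m=80.0, k=20000.0, l0=1.0, g=9.81, dt=1e-4):
    """One Euler step of stance-phase spring-mass dynamics.

    The body is a point mass at (x, z); a massless spring of rest
    length l0 and stiffness k connects it to a fixed foot at (foot_x, 0).
    """
    dx, dz = x - foot_x, z
    l = math.hypot(dx, dz)            # current leg length
    f = k * (l0 - l)                  # spring force along the leg axis
    ax = (f * dx / l) / m
    az = (f * dz / l) / m - g         # gravity always pulls straight down
    return (x + vx * dt, z + vz * dt, vx + ax * dt, vz + az * dt)
```

    With the leg compressed (l < l0) the spring pushes the body back up, which is exactly the “falling from one step into the other” dynamic Renjewski describes: the model is never statically balanced, yet the bounce from step to step is stable.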

    Keeping steady

    ATRIAS has three motors in each leg for this purpose. Two of the motors act directly on the two leg springs. The third motor ensures lateral stability. ATRIAS’s legs make up ten percent of its total mass, to get as close as possible to the theoretical lack of mass.

    Trials showed that it walks three times more efficiently than other human-sized biped robots. Even outside forces, such as being hit by a ball or crossing rough terrain, cannot unbalance it. Prof. Jonathan Hurst of Oregon State University, the initiator of the study, is sure that this type of locomotion will catch on in walking robots of the future. If the technology is improved even further, these robots could, for instance, be used to assist in firefighting.

    Better prostheses

    However, these research results are also meaningful for people. In the next research step, Renjewski, who moved to the TUM in May of this year, is working on transferring these findings to robots for gait rehabilitation and prostheses.

    Original publication: Daniel Renjewski, Alexander Sprowitz, Andrew Peekema, Mikhail Jones, Jonathan Hurst: Exciting Engineered Passive Dynamics in a Bipedal Robot; IEEE Transactions on Robotics, Volume 31, Issue 5; DOI: 10.1109/TRO.2015.2473456

    The work was supported by the National Science Foundation, the Defense Advanced Research Projects Agency and the Human Frontier Science Program.

    See the full article here.


    Technische Universität München Campus

    Technische Universität München (TUM) is one of Europe’s top universities. It is committed to excellence in research and teaching, interdisciplinary education and the active promotion of promising young scientists. The university also forges strong links with companies and scientific institutions across the world. TUM was one of the first universities in Germany to be named a University of Excellence. Moreover, TUM regularly ranks among the best European universities in international rankings.

  • richardmitnick 8:27 am on August 21, 2015 Permalink | Reply
    Tags: Robotics

    From Caltech: “Crush, the RoboSub, Places in International Competition” 


    Lori Dajose

    Crush, the RoboSub Credit: Caltech Robotics Team

    The Caltech Robotics Team—composed of 30 Caltech undergrads and recent alumni—placed fourth in the 18th Annual International RoboSub Competition, held July 20–26 in San Diego, California. The competition, hosted by the Association for Unmanned Vehicle Systems International (AUVSI) Foundation and cosponsored by the U.S. Office of Naval Research, challenges teams of student engineers to perform realistic missions with autonomous underwater vehicles (AUVs) in an underwater environment. Thirty-seven teams from across the globe competed in this year’s event.

    The challenge was to build a robotic submarine that could autonomously navigate an obstacle course, completing tasks such as driving through a gate, bumping into colored buoys, shooting torpedoes through holes, and dropping markers into designated bins. The only human involvement during the competition was the initial placement of the vehicle into the water.

    The Caltech team was divided into three groups, responsible for the mechanical, electrical, and software systems of the robot, which they named Crush. A fourth group managed the team’s fund-raising and outreach efforts. The mechanical team, led by Edward Fouad, a senior in mechanical engineering, was responsible for building grippers, a propulsion system, and a pressure hull to house the robot’s electronics. The autonomous capabilities of the robot were programmed from scratch by the software team, led by Kushal Agarwal, a junior in computer science. The electrical team, led by Torkom Pailevanian, a senior in electrical engineering, designed an inertial measurement unit consisting of gyroscopes and accelerometers that allow the robot to orient itself in 3-D space.

    Started in 1998, the Annual RoboSub Competition is designed to introduce young students to high-tech STEM fields such as maritime robotics. This year’s team from Caltech was led by Justin Koch—who graduated in June with his BS in mechanical engineering—and advised by Joel Burdick, the Richard L. and Dorothy M. Hayman Professor of Mechanical Engineering and Bioengineering.

    “Last year, as a first-year team, we placed seventh overall and were awarded Best New Entry,” says Koch. “I’m definitely very excited with how we did as only a second-year team!”

    See the full article here.


    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
    Caltech buildings

  • richardmitnick 5:07 pm on August 15, 2015 Permalink | Reply
    Tags: NASA Swamp Works, Robotics   

    From NASA: “Extreme Access Flyer to Take Planetary Exploration Airborne” 



    July 30, 2015
    Steven Siceloff

    A prototype built to test Extreme Access Flyer systems in different environments.
    Credits: NASA/Swamp Works

    Swamp Works engineers at NASA’s Kennedy Space Center in Florida are inventing a flying robotic vehicle that can gather samples on other worlds in places inaccessible to rovers. The vehicles – similar to quad-copters but designed for the thin atmosphere of Mars and the airless voids of asteroids and the moon – would use a lander as a base to replenish batteries and propellants between flights.

    “This is a prospecting robot,” said Rob Mueller, senior technologist for advanced projects at Swamp Works. “The first step in being able to use resources on Mars or an asteroid is to find out where the resources are. They are most likely in hard-to-access areas where there is permanent shadow. Some of the crater walls are angled 30 degrees or more, and that’s far too steep for a traditional rover to navigate and climb.”

    The machines being built fall under the name Extreme Access Flyers, and their designers intend to create vehicles that can travel into the shaded regions of a crater and pull out small amounts of soil to see whether it holds the water-ice promised by readings from orbiting spacecraft. Running on propellants made from resources on the distant worlds, the machines would be able to execute hundreds of explorative sorties during their mission. They also would be small enough for a lander to bring several of them to the surface at once, so if one fails, the mission isn’t lost.

    If that sounds a lot like a job for a quad-copter, it kind of is. On Earth, a quad-copter with its four rotors and outfitted with a digger or sampling device of some sort would be able to execute many missions with no problem. On other worlds, though, the machine would require very large rotors since the atmosphere on Mars is thin and there is no air on an asteroid or the moon. Also, the flyer would have to operate autonomously, figuring out on its own where it is and where it is going since there is no GPS to help it navigate and the communications delays are too large to control it directly from Earth.

    Cold-gas jets using oxygen or water vapor will take on the lifting and maneuvering duties performed by the rotors on Earth. For navigation, the team is programming the flyer to recognize terrain and landmarks, guide itself to areas controllers on Earth designate, or even scout out the best sampling sites on its own.

    “It would have enough propellant to fly for a number of minutes on Mars or on the moon, hours on an asteroid,” said DuPuis.

    For the sampling itself, designers currently envision a modular approach that would let the flyer take one tool at a time to a sample area to gather about seven grams of material at a time. That’s enough for instruments to analyze and, throughout the course of many flights, is enough to gather samples that would show Earth-bound scientists a complete geological picture of an area.

    It’s work that would’ve been too complicated to research even five years ago, particularly with off-the-shelf components. Now though, the advent of autonomous flight controllers, laser-guidance and mapping systems combined with innovations in 3-D printing make the chances of developing a successful prototype flyer much more likely. Also, a partnership with Embry-Riddle Aeronautical University and Honeybee Robotic Spacecraft Mechanisms is providing more expertise.

    “The flight control systems of commercially available small, unmanned multi-rotor aerial vehicles are not too dissimilar to a spacecraft controller,” said Mike DuPuis, co-investigator of the Extreme Access Flyer project. “That was the starting point for developing a controller.”

    In the Swamp Works laboratory, the team has assembled several models designed to test aspects of the final machine. A large quad-copter about five feet across that uses ducted fans is about the size of the prototype the team has in mind for an operational mission in space. It has been tested at the planetary-surface analog test site built for the Morpheus lander project at the north end of the Shuttle Landing Facility’s runway.

    A smaller ducted-fan flyer, about the size of a person’s palm, is routinely flown inside a 10-by-10-foot cube to test software and control abilities. Another, built primarily with asteroid exploration in mind, is suspended inside a gimbal device that lets it maneuver much as it would in zero gravity, using high-pressure nitrogen cold-gas thrusters to tilt and spin while the team judges its behavior against a computer simulation of flight around an asteroid.

    The team started at a low level of technological readiness two years ago and is steadily pushing the mission and design closer to a state where it can be made into a flight-ready craft.

    The uses for the sampling vehicle may not be solely extraterrestrial, Mueller said. On Earth, an aerial vehicle that can pull a few grams of dirt from an area potentially brimming with toxins would be very valuable for first responders or those researching a new area who do not want to risk humans. Mueller said the effects of a nuclear radiation leak on surrounding areas, for example, could be measured with soil gathered quickly by a vehicle like the Extreme Access Flyer.

    “We’re an innovations lab, so in everything we do, we try to come up with new solutions,” Mueller said.

    In addition to scouting craters for water and other elements that can be processed into fuel for large spacecraft and air for humans, the flyer would be capable of exploring lava tubes that are known to exist on Mars and the moon and are found in many volcanic areas on Earth. Because some are thought to be 30 feet or bigger in diameter, an extreme access flyer could navigate autonomously during a robotic precursor mission and find a safe place for astronauts during their journey to Mars.

    “You could put a whole habitat inside a lava tube to shelter astronauts from radiation, thermal extremes, weather and micrometeorites,” Mueller said.

    See the full article here.


    The National Aeronautics and Space Administration (NASA) is the agency of the United States government that is responsible for the nation’s civilian space program and for aeronautics and aerospace research.

    President Dwight D. Eisenhower established the National Aeronautics and Space Administration (NASA) in 1958 with a distinctly civilian (rather than military) orientation encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA’s predecessor, the National Advisory Committee for Aeronautics (NACA). The new agency became operational on October 1, 1958.

    Since that time, most U.S. space exploration efforts have been led by NASA, including the Apollo moon-landing missions, the Skylab space station, and later the Space Shuttle. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle and Commercial Crew vehicles. The agency is also responsible for the Launch Services Program (LSP) which provides oversight of launch operations and countdown management for unmanned NASA launches. Most recently, NASA announced a new Space Launch System that it said would take the agency’s astronauts farther into space than ever before and lay the cornerstone for future human space exploration efforts by the U.S.

    NASA science is focused on better understanding Earth through the Earth Observing System, advancing heliophysics through the efforts of the Science Mission Directorate’s Heliophysics Research Program, exploring bodies throughout the Solar System with advanced robotic missions such as New Horizons, and researching astrophysics topics, such as the Big Bang, through the Great Observatories (Hubble, Chandra, Spitzer) and associated programs. NASA shares data with various national and international organizations, such as data from JAXA’s Greenhouse Gases Observing Satellite.

  • richardmitnick 5:24 pm on July 31, 2015 Permalink | Reply
    Tags: , Robotics,   

    From U Hawaii: “UH Hilo Robotics Team Tests School’s First Rover” 

    U Hawaii

    University of Hawaii

    July 30, 2015
    No Writer Credit

    UH-Hilo’s rover “Spock” shown during its very first field test at a PISCES Martian simulation site on Hawaii Island.

    The University of Hawaii at Hilo’s Space Robotics Team has built the school’s very first planetary mining rover from scratch. And by competition standards, it’s a contender.

    Ethan Paguirigan, Carli Hand, and Daryl Albano comprise the core team that designed and built the rover – appropriately named “Spock” after UH-Hilo’s mascot, “The Vulcans” – over the course of a semester. They intended to enter NASA’s annual Robotic Mining Competition (RMC) at Kennedy Space Center, which challenges college teams to build a space-worthy mining rover that can effectively mine and haul regolith, or dirt.

    Though they ran short on time, the students were able to test Spock’s prowess on July 28 for the very first time at a PISCES Martian-simulation site as part of the 2015 PRISM (PISCES Robotic International Space Mining) event.

    “It went spectacular,” said Ethan Paguirigan, a UH student and team leader of the robotics group, who tackled the mechanical design of the rover. “The entire system was untested… it was all a big mystery.”

    Operating the robot remotely from Gemini Observatory Headquarters in Hilo, some 30 miles away, students initially faced some challenges. But any uncertainty about Spock’s capabilities was soon put to rest after the rover hauled 2.5 pounds of dirt and gravel, qualifying it by NASA competition standards as a contender.

    Following their first successful field test, Ethan, Carli, and Daryl are looking at an ambitious upgrade for Spock – autonomous operation. Their goal is to integrate sensors into the rover that will allow it to know where it is and what it is doing without a driver. With this in mind, the team intends to enter Spock in NASA’s 2016 RMC – boldly going where no UH students have gone before.

    Ethan, a mechanical engineering major, says he got into robots after seeing Marvel’s first installment of Ironman on the big screen. Carli is double-majoring in math and electrical engineering; Daryl is the programming wiz behind Spock with his studies directed in computer science.

    Ethan Paguirigan, Carli Hand, and Daryl Albano stand in front of “Spock” at UH-Hilo’s robotics lab.

    Spock’s simple design and unique features speak to its performance. The battery-operated, 125-pound rover is about the size of a large lawn mower, but would eat any yard maintenance device alive with its rugged four-wheel-drive design. Using “wegs,” or spoked “wheel legs” made of wooden pegs, the rover has superior traction and mobility on rugged, rocky surfaces. The frame is made of lightweight aluminum and houses a cleanly welded shovel that scoops dirt and gravel using an actuator from an electric wheelchair.

    Besides designing and building a really cool robot, UH-Hilo’s space robotics team is also advancing the technology of ISRU – in-situ resource utilization. ISRU involves “living off the land” by using local materials like regolith to create usable resources and infrastructure. On other planets, this might look like space shelters, breathable oxygen, and rocket fuel. All from dirt, you ask? Yes. And it’s becoming more of a reality thanks to the hard work of scientists and students alike.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    System Overview

    The University of Hawai‘i System includes 10 campuses and dozens of educational, training and research centers across the Hawaiian Islands. As the public system of higher education in Hawai‘i, UH offers opportunities as unique and diverse as our Island home.

    The 10 UH campuses and educational centers on six Hawaiian Islands provide unique opportunities for both learning and recreation.

    UH is the State’s leading engine for economic growth and diversification, stimulating the local economy with jobs, research and skilled workers.

  • richardmitnick 1:12 pm on April 11, 2015 Permalink | Reply
    Tags: , , Robotics   

    From PPPL: “Synthetic muscle developed with PPPL scientists’ help ready for launch” 


    April 8, 2015
    Jeanne Jackson DeVoe

    Lenore Rasmussen examines a titanium coupon used in her synthetic muscle being treated in an oxygen plasma at PPPL. (Photo by Elle Starkman/PPPL Office of Communications)

    Lenore Rasmussen’s dream of developing a synthetic muscle that could be used to make better prosthetic limbs and more responsive robots will literally become airborne on Monday, April 13, at 4:33 p.m., when her experiment rockets off to the International Space Station from Cape Canaveral in Florida.

    Rasmussen developed the material at RAS Labs and has worked closely with researchers and engineers at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory (PPPL) to develop the material’s ability to adhere to metal. The Synthetic Muscle™ could be used in robotics in deep space travel such as travel to Mars because of its radiation resistance.

    “Based on the good results we had on planet Earth, the next step is to see how it behaves in a space environment,” said Charles Gentile, an engineer at PPPL who has worked closely with Rasmussen. “From there the next step might be to use it on a mission to Mars.”

    Early Connection with PPPL

    Rasmussen began working with PPPL in 2007, just four years after she started Ras Labs. She received her first patent for a synthetic muscle in 1998. The synthetic muscle is a gel-like material called an electroactive polymer that can potentially mimic human movement because it can expand and contract like human muscle. That ability would make it very useful in robotics and in developing better prosthetic limbs.

    “We can’t explore space without robots,” Rasmussen said. “Humans can only withstand a certain amount of radiation so that limits the time that people can be in space, whereas robots particularly if they’re radiation-resistant can be up there for long periods of time without being replaced.”

    Lew Meixler, the long-time head of Technology Transfer at PPPL, who retired in March, said he has enjoyed helping Rasmussen follow her quest. “That’s what entrepreneurs are,” he said. “They’re the dreamers who devote all their time, energy and resources to following their dreams.”

    Rasmussen credits PPPL with providing help and support during critical points in her project. “It was and continues to be a wonderful resource not just because of the plasma physics but the people,” she said. “Charlie and Lew found ways to make things happen.”

    At PPPL, Rasmussen solved a crucial problem: getting the gel, which can be as soft as jelly or as hard as rubber, to adhere to the metal electrodes. Initially working with Lew Meixler under a federal Cooperative Research and Development Agreement in the Plasma Surface Laboratory, she solved the problem by treating the metal (steel or titanium) with a plasma. This changed the metal’s surface and made the gel adhere more closely to the metal.

    PPPL was also involved with tests of the material last summer, when the material was exposed to over 300,000 rads of gamma radiation. That is 20 times the amount that would be lethal to a human and was equivalent to a trip from Earth to Mars and back. A second test, of 45 hours, was equivalent to a trip to Jupiter and beyond.

    Rasmussen and Gentile found that there was no change in the strength, electroactivity, or durability of the material due to the radiation, although there was a slight change in color. Tests on selected samples found the material was unaffected by extreme temperatures down to -271 degrees Celsius, which is close to absolute zero, the coldest temperature possible in the universe.

    Preparing for launch

    Since then, PPPL staff members have been involved in planning for the launch, which involves mapping out each detail with military precision. Several PPPL staff members, along with Rasmussen and her staff, signed the back of the metal container, or coupon, holding the material. “All of the people who worked on the lab signed it and the coupon will go into space,” said Gentile. “So I’ll be up there with Gene Roddenberry.”

    The Synthetic Muscle™ material will be launched on the Falcon 9, a rocket carrying the Dragon spacecraft, both produced by SpaceX, which will carry 4,300 pounds of supplies and payloads, including material for research experiments, to the International Space Station’s U.S. National Laboratory. The nine-engine rocket will propel Dragon into orbit, where it will rendezvous with the Space Station 33 hours after launch. Astronauts will use the station’s 57-foot arm to reach out and capture Dragon at 7:15 a.m. on April 15. Additional information about the launch is available at https://blogs.nasa.gov/spacex/2015/03/31/spacex-targeting-april-13-for-station-resupply-launch/.

    The material will be kept in a zero gravity storage rack in the U.S. National Laboratory on the space station for 90 days. The astronauts will photograph the materials every three weeks. When the material returns to Earth in July, it will be tested and compared with identical materials that remained on Earth.

    The International Space Station is an international science laboratory in low Earth orbit where astronauts conduct scientific research in biology, human biology, astronomy, meteorology and other fields in a microgravity environment. It has operated since November of 2000 with the cooperation of the U.S., Russia, many European nations, Japan, Canada, and Brazil. It is currently staffed by two astronauts from NASA, three cosmonauts from Russia and an astronaut from the European Space Agency.

    Use as a prosthetic

    Rasmussen is also exploring whether Synthetic Muscle™ could be used as a prosthetic liner. The residual limbs of amputees can expand and contract during the day, and the Ras Labs material is designed to expand and contract, so it could make prosthetics more comfortable. She recently received a grant from the Pediatric Medical Device Consortium at the Children’s Hospital of Philadelphia to research this possibility.

    Ras Labs is a high-tech, woman-owned small business. Visit http://www.raslabs.com.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University.

  • richardmitnick 7:20 pm on January 12, 2015 Permalink | Reply
    Tags: , , Robotics   

    From Carnegie-Mellon: “Carnegie Mellon’s Six-legged “Snake Monster” Is First of New Breed of Reconfigurable Modular Robots” 

    Carnegie Mellon University logo
    Carnegie Mellon university

    January 12, 2015
    Byron Spice / 412-268-9068

    Carnegie Mellon University’s latest robot is called Snake Monster, but with six legs, it looks more like an insect than a snake. It really doesn’t matter what you call it, says its inventor, Howie Choset – the whole point of the project is to make modular robots that can easily be reconfigured to meet a user’s needs.

    Choset, a professor in CMU’s Robotics Institute, said the walking robot, developed in just six months, is only one example of the robots that eventually can be built using this modular system. His team already is working on modules such as force-sensing feet, wheels and tank-like treads that will enable the assembly of totally different robots.

    “By creating a system that can be readily reconfigured and that also is easy to program, we believe we can build robots that are not only robust and flexible, but also inexpensive,” Choset said. “Modularity has the potential to rapidly accelerate the development of traditional industrial robots, as well as all kinds of new robots.”

    The Defense Advanced Research Projects Agency sponsored this work through its Maximum Mobility and Manipulation (M3) program, which focuses on ways to design and build robots more rapidly and enhance their ability to manipulate objects and move in natural environments. Snake Monster, as well as some of Choset’s other robots, will be demonstrated at the finals of the DARPA Robotics Challenge, June 5-6 in Pomona, Calif.

    For years, Choset’s lab has concentrated on building and operating snake-like robots — chains of repeated component joints. By careful coordination of these joints, the robots can be made to move in ways that are similar to a snake’s natural undulations and in other ways not seen in nature, such as rolling. Applications for these robots include urban search and rescue, archaeological exploration and, thanks to the robots’ ability to move through pipes, inspection of power plants, refineries and sewers.

    The name Snake Monster harks back to those research origins and to the similarity between the snake robots and the Snake Monster’s legs. The six legs each have a reach of 12 inches (30 cm) and are connected to a rectangular body; the whole robot weighs 18 pounds (8 kg). The robot moves with an alternating tripod gait, with three legs in the air at all times – two on one side and one on the other. A YouTube video of the walking robot is available.
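    The alternating tripod gait described above can be sketched in a few lines of Python. This is a minimal illustration only; the leg names and scheduling are assumptions, not CMU’s actual control software.

```python
# Minimal sketch of an alternating tripod gait (illustrative only;
# leg names and scheduling are assumptions, not CMU's software).

# Two tripods: each has two legs on one side and one on the other.
TRIPOD_A = ["L1", "R2", "L3"]  # front-left, middle-right, rear-left
TRIPOD_B = ["R1", "L2", "R3"]  # front-right, middle-left, rear-right

def swing_legs(step):
    """Legs in the air on a given step; the other tripod supports the body."""
    return TRIPOD_A if step % 2 == 0 else TRIPOD_B

def stance_legs(step):
    """Legs on the ground on a given step."""
    return TRIPOD_B if step % 2 == 0 else TRIPOD_A
```

    On each step the two tripods swap roles, so the robot is always statically supported by three legs while the other three swing forward.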

    To build it, Choset’s team used the hardware expertise developed in snake robots to build small, powerful modules and used the lessons learned in controlling the snakebots to create a system architecture that can be easily programmed to control robots with a wide variety of configurations.

    “The architecture is built on Ethernet computer networking technology,” Choset said. “Ethernet doesn’t require that the computers connected to it be of a specific type, but that they all communicate with each other in the same way.” The interfaces used in the modular architecture allow robot designers to focus on specific capabilities without having to worry about detailed systems issues or having to modify the robot later, he added.

    Also key to the modular approach was the team’s development of a series elastic actuator — a motor that has a spring in series with its output shaft. The spring helps protect the motor from high impacts but, more importantly, allows the actuator to measure and regulate the force it exerts, as well as the forces exerted on it.

    “When we push the Snake Monster forward, the joints in the leg ‘feel’ the force of the robot being pushed and, then, in an effort to zero-out the force it feels, the robot walks in the direction it is being pushed,” said Choset, who noted the force feedback allows very simple controls to adapt to a wide range of terrains. “When the robot goes over bumpy terrain, the springs in the series elastic actuators allow us to not perfectly plan the foot steps, but rather let the robot automatically conform to the environment the way animals do.”
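    The series-elastic principle Choset describes (measuring torque from the spring’s deflection, then moving to cancel it) can be sketched roughly as follows. The stiffness, gain, and function names here are illustrative assumptions, not CMU’s implementation.

```python
# Illustrative sketch of a series elastic actuator's force sensing and
# "zero-force" response; constants and names are assumed, not CMU's code.

K_SPRING = 350.0  # series spring stiffness, N*m/rad (assumed value)

def joint_torque(theta_motor, theta_output):
    """Estimate joint torque from the spring's deflection (Hooke's law)."""
    return K_SPRING * (theta_motor - theta_output)

def zero_force_step(theta_motor, theta_output, gain=0.001):
    """Nudge the motor to cancel the torque it feels: an external push
    deflects the spring, and the joint yields in the push's direction."""
    tau = joint_torque(theta_motor, theta_output)
    return theta_motor - gain * tau  # new motor position setpoint
```

    Iterating `zero_force_step` drives the deflection, and hence the felt torque, toward zero, which is what lets a push on the robot’s body translate into compliant motion in that direction.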

    The Robotics Institute is part of Carnegie Mellon’s top-ranked School of Computer Science.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Carnegie Mellon Campus

    Carnegie Mellon University (CMU) is a global research university with more than 12,000 students, 95,000 alumni, and 5,000 faculty and staff.
    CMU has been a birthplace of innovation since its founding in 1900.
    Today, we are a global leader bringing groundbreaking ideas to market and creating successful startup businesses.
    Our award-winning faculty members are renowned for working closely with students to solve major scientific, technological and societal challenges. We put a strong emphasis on creating things—from art to robots. Our students are recruited by some of the world’s most innovative companies.
    We have campuses in Pittsburgh, Qatar and Silicon Valley, and degree-granting programs around the world, including Africa, Asia, Australia, Europe and Latin America.

  • richardmitnick 8:34 am on December 16, 2014 Permalink | Reply
    Tags: , Robotics   

    From Sandia: “Getting bot responders into shape” 

    Sandia Lab

    December 16, 2014
    Stephanie Holinka, slholin@sandia.gov, (505) 284-9227

    Sandia National Laboratories is tackling one of the biggest barriers to the use of robots in emergency response: energy efficiency.

    Through a project supported by the Defense Advanced Research Projects Agency (DARPA), Sandia is developing technology that will dramatically improve the endurance of legged robots, helping them operate for long periods while performing the types of locomotion most relevant to disaster response scenarios.

    Steve Buerger is leading a Sandia National Laboratories project to demonstrate how energy efficient biped walking robots could become. Increased efficiency could enable bots to operate for much longer periods of time without recharging batteries, an important factor in emergency situations. (Photo by Randy Montoya)
    One of Sandia’s new robots that showcases this technology will be demonstrated at an exposition to be held in conjunction with the DARPA Robotics Challenge Finals next June.

    As the finals draw closer, some of the most advanced robotics research and development organizations in the world are racing to develop emergency response robots that can complete a battery of tasks specified by DARPA. Competing robots will face degraded physical environments that simulate conditions likely to occur in a natural or man-made disaster. Many robots will walk on legs to allow them to negotiate challenging terrain.

    Sandia’s robots won’t compete in the finals next June, but they could ultimately help the winning robots extend their battery life until their life-saving work is done.

    “We’ll demonstrate how energy efficient biped walking robots could become. Increased efficiency could allow robots similar to those used for the competition to operate for much longer periods of time without recharging batteries,” said project lead Steve Buerger of Sandia’s Intelligent Systems Control Dept.

    Batteries need to last for emergency response robots

    Battery life is an important concern in the usefulness of robots for emergency response.

    “You can have the biggest, baddest, toughest robot on the planet, but if its battery life is 10 or 20 minutes, as many are right now, that robot cannot possibly function in an emergency situation, when lives are at stake,” said Buerger.

    The first robot Sandia is developing in support of the DARPA Challenge is known as STEPPR, for Sandia Transmission Efficient Prototype Promoting Research. It is a fully functional research platform that allows developers to try different joint-level mechanisms – which function like elbows and knees – and to quantify how much energy is used.

    Sandia’s second robot, WANDERER, for Walking Anthropomorphic Novelly Driven Efficient Robot for Emergency Response, will be a more optimized and better-packaged prototype.

    Energy-efficient actuators key to testing

    The key to the testing is Sandia’s novel, energy-efficient actuators, which move the robots’ joints. The actuation system uses efficient, brushless DC motors with very high torque-to-weight ratios, very efficient low-ratio transmissions and specially designed passive mechanisms customized for each joint to ensure energy efficiency.

    “We take advantage of dynamic characteristics that are common to a wide variety of legged behaviors and add a set of ‘support elements,’ including springs and variable transmissions, that keep the motors operating at more efficient speed-torque conditions, reducing losses,” Buerger said.

    Electric motors are particularly inefficient when providing large torques at low speeds – for example, in a crouching robot, Buerger said. A simple support element, such as a spring, would provide torque, reducing the load on the motor.

    “The support elements also allow robots to self-adjust when they change behaviors. When they change from level walking to uphill walking, for example, they can make subtle adjustments to their joint dynamics to optimize efficiency under the new condition,” Buerger said.
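    Buerger’s spring example can be made concrete with a small sketch. All numbers and names here are assumptions for illustration, not Sandia’s design: a spring in parallel with the joint supplies part of the holding torque, and because a motor’s resistive (copper) losses grow with the square of its torque, offloading torque to the spring cuts losses sharply.

```python
# Illustrative sketch of a parallel-spring "support element" (assumed
# values; not Sandia's actual design).

K_SPRING = 80.0    # parallel spring stiffness, N*m/rad (assumed)
THETA_REST = 0.9   # spring rest angle, rad, tuned to the crouch posture

def motor_torque(theta, tau_required):
    """Torque the motor must supply once the spring contributes its share."""
    tau_spring = K_SPRING * (THETA_REST - theta)
    return tau_required - tau_spring

def relative_copper_loss(theta, tau_required):
    """Motor resistive loss with the spring, as a fraction of the loss
    without it (copper loss scales with torque squared)."""
    return (motor_torque(theta, tau_required) / tau_required) ** 2
```

    With these assumed values, holding a crouch at 0.5 rad against 40 N·m of gravity torque, the spring supplies 32 N·m and the motor only 8 N·m, so resistive losses fall to 4 percent of the unassisted case.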

    Robots must adapt to the diverse kinds of conditions expected in emergency response scenarios.

    “Certain legged robot designs are extremely efficient when walking on level ground, but function extremely inefficiently under other conditions or cannot walk over different types of terrains. Robots need an actuation system to enable efficient locomotion in many different conditions,” Buerger said. “That is what the adjustable support elements can do.”

    Early testing has shown STEPPR to operate efficiently and quietly.

    “Noise is lost energy, so being quiet goes hand-in-hand with being efficient. Most robots make a lot of noise, and that can be a major drawback for some applications,” Buerger said.

    Robots’ electronics, certain software to be publicly released

    STEPPR’s and WANDERER’s electronics and low-level software are being developed by the Open Source Robotics Foundation. The designs will be publicly released, allowing engineers and designers all over the world to take advantage of advances.

    The Florida Institute for Human and Machine Cognition is developing energy-efficient walking control algorithms for both robots. The Massachusetts Institute of Technology and Globe Motors also are contributing to the project.

    Sandia’s robotic work will be demonstrated in the technology exposition section of the DARPA Robotics Challenge, scheduled for June 5-6 at Fairplex in Pomona, Calif.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

  • richardmitnick 5:08 pm on November 21, 2014 Permalink | Reply
    Tags: , , Robotics   

    From NSF: “A foundation for robotics” 

    National Science Foundation

    November 21, 2014
    Aaron Dubrow, NSF (703) 292-4489 adubrow@nsf.gov

    The fundamental research in computing and engineering that enabled robotics to develop in the U.S. has been supported by the National Science Foundation (NSF) since its inception. Yet despite these early investments in sensors, machine movement and computer vision, it wasn’t until 1972 that the first grant with “robot” in the title was funded.

    1970s: Robots for the factory floor

    In the mid-1970s, robotics began to gather steam at NSF. Among the first research projects classified as robotics were mechanical arms (seen [below]) that could pick a part out of a box and visually identify it and orient it properly for the next step on an assembly line, as well as computer-controlled welding robots. These and other NSF-funded projects were aimed at improving the productivity of American manufacturing processes–a goal for roboticists that continues to this day.

    This image of a robot arm, developed by the Stanford Research Institute, is similar to the one that appeared in the 1976 NSF Annual Report. The robotic system used computer vision to identify and make decisions about parts on an assembly line. This is one of several projects from that era aimed at improving the productivity of American manufacturing processes. Credit: SRI International

    1980s: Rise of the walking machines

    The 1980s brought an increased diversification in the types of robots being explored and the ways they could be used.

    At Ohio State, electrical engineer Robert McGhee and mechanical engineer Kenneth Waldron, along with a 60-member team of students and technical assistants, developed the Adaptive Suspension Vehicle (ASV), nicknamed the “Walker,” with support from NSF and the Defense Advanced Research Projects Agency (DARPA).

    What do you get when you combine 20 years of research, $5 million, and a Star Wars Imperial all-terrain vehicle? Ohio State’s Adaptive Suspension Vehicle (ASV), nicknamed the “Walker,” developed under a research contract from the Defense Advanced Research Projects Agency (DARPA).

    The ASV was 17 feet long, 8 feet wide, and 10.5 feet high, and had six legs to support its three-ton aluminum body. It was designed to carry cargo for industrial and military applications over rough, mountainous, icy or muddy terrain, and was capable of crossing 9-foot-wide ditches or 7-foot-high walls.

    The Walker used a forward-mounted radar system to scan the terrain ahead and feed that data, along with instructions from the operator’s joystick, into the 16 onboard computers that coordinated and controlled the ASV’s legs. The computers moved each leg individually – up and down, forward and back, and closer to or farther from the ASV’s body – for a clunky but serviceable ride.

    1990s: Robots explore new environments

    Not long afterward, researchers supported by NSF were developing robots for a very different environment: underwater. First built in 1991, the Omni-Directional Intelligent Navigator (ODIN) was a sphere-shaped underwater robot capable of instantaneous movement in all six directions. Originally remotely operated, it was upgraded in 1995 to ODIN II, an autonomous underwater robot. Sentry, a successor robot developed through a grant from NSF, plies the deep waters today, locating and quantifying hydrothermal fluxes.

    The sphere-shaped ODIN underwater robot. Credit: Autonomous Systems Laboratory, University of Hawaii

    In the 1990s, roboticists began turning their attention to day-to-day tasks with which a robot could assist. For instance, researchers from the University of Pittsburgh, University of Michigan and Carnegie Mellon University developed a series of mobile, personal service robots, such as Nursebot, that were designed to assist elderly people in their everyday life.

    Researchers from the University of Pittsburgh, University of Michigan and Carnegie Mellon University have developed mobile, personal service robots, such as Nursebot, that assist elderly people in their everyday life. Credit: Carnegie Mellon University

    An autonomous mobile robot that “lives” in the home of a chronically ill elderly person could remind its owner to take medicine, provide videoconferencing with doctors, collect patient data or watch for accidents, manipulate objects for arthritis sufferers, and provide some social interaction. New versions have evolved over the years and a General Electric developed hospital robot is expected to be tested at a Veterans Affairs hospital in 2015.

    2000s: Miniaturization and mobility

    Researchers have always envisioned a future where robots could serve the general good in disaster recovery and search-and-rescue operations, but it wasn’t until 9/11 that robots were broadly put to that use.

    Robotics expert Robin Murphy, then an associate professor of computer science at the University of South Florida, arrived on site the morning after the collapse of the World Trade Center. Murphy’s research on experimental mixed-initiative robots for urban rescue operations was originally funded by NSF. She brought with her a response team that included three graduate students – Jenn Casper, Mark Micire and Brian Minten – and software-guided “marsupial” robot systems. These intelligent autonomous “marsupial” robots are especially useful in rubble because the “mother” robot releases smaller robots to explore tight spaces unreachable by other means.

    Over the next 11 days, the teams made five insertions onto the rubble piles, often at the request of Federal Emergency Management Agency (FEMA) task force teams or sector chiefs. Murphy’s mechanized prowlers had tethers with a range of 100 feet, far outstripping the fire department’s seven-foot camera wands. They helped find five victims and another set of remains, though Murphy expressed regret that they hadn’t been more successful.

    As the 2000s progressed, efforts by engineers to miniaturize components led to robots that were significantly smaller than those that came before. One startling example of this trend is the RoboBee project, which was awarded an Expeditions in Computing award from NSF’s Directorate for Computer and Information Science and Engineering in 2009.

    Researchers in this expedition are creating robotic bees that fly autonomously and coordinate activities amongst themselves and the hive, much like real bees. The research team aims to drive research in compact, high-energy power sources, ultra-low-powered computing and the design of distributed algorithms for multi-agent systems. Most recently, RoboBees were pollinating young minds at the Boston Museum of Science in an exhibition dedicated to their complex design.

    2010s: Investing in co-robots

    In June 2011, the administration launched the National Robotics Initiative (NRI) to develop robots that work with or beside people to extend or augment human capabilities, taking advantage of the different strengths of humans and robots. This provided focus and funding for robotics research. The NRI is led by NSF and supported by multiple agencies including the National Aeronautics and Space Administration (NASA), the National Institutes of Health (NIH), the U.S. Department of Agriculture (USDA), and the U.S. Department of Defense (DOD).

    Since 2011, NSF and its partners in the NRI have invested more than $120 million in robotics research.

    Today, robots impact our lives in a myriad of ways. Robots are being used in classrooms across the nation to capture the excitement of students and help them learn STEM (and non-STEM) principles. They’re helping doctors perform surgeries, providing assistance to individuals with disabilities and inspecting bridges and roads to ensure our safety.

    “Robots, which once were limited to the realm of science fiction, are now a transformative technology, a demonstration of how NSF-funded basic research can bring about changes in human life,” said Marc Rothenberg, former NSF historian.

    Though robots may be an emerging area of research, many of the underpinnings of today’s robots began in fundamental research in sensing, computer vision, artificial intelligence, mechanical engineering and many other areas that some might not immediately recognize as being related to robots.

    What’s next

    NSF-supported researchers are making incredible advances in robotics, creating a new generation of co-robots that can handle critical tasks in close proximity to humans, safely and with greater resilience than previous intelligent machines.

    Check out our YouTube Robot playlist to learn more about the research of NSF-funded roboticists.

    Junku Yuh
    Song Choi
    Gu-Yeon Wei
    Robert Wood
    Robin Murphy
    Andrea Thomaz
    Charles Klein
    Robert McGhee
    Radhika Nagpal
    Blake Hannaford
    Judith Matthews
    Nilanjan Sarkar
    Donald Chiarulli
    Nikolaus Correll
    Said Koozekanani
    J. Gregory Morrisett
    Jacqueline Dunbar-Jacob

    Related Institutions/Organizations
    Harvard University
    University of Hawaii
    Ohio State University
    Colorado School of Mines
    University of Pittsburgh
    University of Washington
    Georgia Tech Research Corporation
    University of Colorado at Boulder

    Related Programs
    National Robotics Initiative

    Related Awards
    #9157896 Presidential Young Investigators Award
    #0953181 CAREER: Socially Guided Machine Learning
    #9320318 Reactive Sensing for Autonomous Mobile Robots
    #0085796 ITR: Personal Robotic Assistants for the Elderly
    #1150223 CAREER: Modeling and Design of Composite Swarming Behaviors
    #0926148 Collaborative Research: RoboBees: A Convergence of Body, Brain and Colony
    #0958441 II New: A Network of Open Experimental Testbeds for Surgical Robotics Research
    #7818957 Dynamics and Control of Industrial Manipulators and Legged Locomotion Systems
    #9701614 Intelligent Coordinated Motion Control of Underwater Robotic Vehicles with Manipulator Workpackages (Collaborative Research)
    #9603043 U.S.-Japan Cooperative Science: Virtual Collaborative World Simulator for Underwater Robots using Multi-Dimensional, Synthetic Environment

    Years Research Conducted
    1972 – 2014

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The National Science Foundation (NSF) is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; and to secure the national defense.” NSF is the funding source for approximately 24 percent of all federally supported basic research conducted by America’s colleges and universities. In many fields, such as mathematics, computer science and the social sciences, NSF is the major source of federal backing.


