Tagged: Robotics

  • richardmitnick 12:02 pm on February 20, 2019
    Tags: "Robots track moving objects with unprecedented precision", , , RFID tags, Robotics   

    From MIT News: “Robots track moving objects with unprecedented precision” 


    February 18, 2019
    Rob Matheson

    System uses RFID tags to home in on targets; could benefit robotic manufacturing, collaborative drones, and other applications.

    MIT Media Lab researchers are using RFID tags to help robots home in on moving objects with unprecedented speed and accuracy, potentially enabling greater collaboration in robotic packaging and assembly and among swarms of drones. Photo courtesy of the researchers.

    A novel system developed at MIT uses RFID tags to help robots home in on moving objects with unprecedented speed and accuracy. The system could enable greater collaboration and precision by robots working on packaging and assembly, and by swarms of drones carrying out search-and-rescue missions.

    In a paper being presented next week at the USENIX Symposium on Networked Systems Design and Implementation, the researchers show that robots using the system can locate tagged objects within 7.5 milliseconds, on average, and with an error of less than a centimeter.

    In the system, called TurboTrack, an RFID (radio-frequency identification) tag can be applied to any object. A reader sends a wireless signal that reflects off the RFID tag and other nearby objects, and rebounds to the reader. An algorithm sifts through all the reflected signals to find the RFID tag’s response. Final computations then leverage the RFID tag’s movement — even though this usually decreases precision — to improve its localization accuracy.

    The researchers say the system could replace computer vision for some robotic tasks. As with its human counterpart, computer vision is limited by what it can see, and it can fail to notice objects in cluttered environments. Radio frequency signals have no such restrictions: They can identify targets without visualization, within clutter and through walls.

    To validate the system, the researchers attached one RFID tag to a cap and another to a bottle. A robotic arm located the cap and placed it onto the bottle, held by another robotic arm. In another demonstration, the researchers tracked RFID-equipped nanodrones during docking, maneuvering, and flying. In both tasks, the system was as accurate and fast as traditional computer-vision systems, while working in scenarios where computer vision fails, the researchers report.

    “If you use RF signals for tasks typically done using computer vision, not only do you enable robots to do human things, but you can also enable them to do superhuman things,” says Fadel Adib, an assistant professor and principal investigator in the MIT Media Lab, and founding director of the Signal Kinetics Research Group. “And you can do it in a scalable way, because these RFID tags are only 3 cents each.”

    In manufacturing, the system could enable robot arms to be more precise and versatile in, say, picking up, assembling, and packaging items along an assembly line. Another promising application is using handheld “nanodrones” for search and rescue missions. Nanodrones currently rely on computer vision and image-stitching methods for localization. These drones often get confused in chaotic areas, lose each other behind walls, and can’t uniquely identify each other. This all limits their ability to, say, spread out over an area and collaborate to search for a missing person. Using the researchers’ system, nanodrones in swarms could better locate each other, for greater control and collaboration.

    “You could enable a swarm of nanodrones to form in certain ways, fly into cluttered environments, and even environments hidden from sight, with great precision,” says first author Zhihong Luo, a graduate student in the Signal Kinetics Research Group.

    The other Media Lab co-authors on the paper are visiting student Qiping Zhang, postdoc Yunfei Ma, and Research Assistant Manish Singh.

    Super resolution

    Adib’s group has been working for years on using radio signals for tracking and identification purposes, such as detecting contamination in bottled foods, communicating with devices inside the body, and managing warehouse inventory.

    Similar systems have attempted to use RFID tags for localization tasks. But these come with trade-offs in either accuracy or speed. To be accurate, it may take them several seconds to find a moving object; to increase speed, they lose accuracy.

    The challenge was achieving both speed and accuracy simultaneously. To do so, the researchers drew inspiration from an imaging technique called “super-resolution imaging.” These systems stitch together images from multiple angles to achieve a finer-resolution image.

    “The idea was to apply these super-resolution systems to radio signals,” Adib says. “As something moves, you get more perspectives in tracking it, so you can exploit the movement for accuracy.”

    The system combines a standard RFID reader with a “helper” component that’s used to localize radio frequency signals. The helper shoots out a wideband signal comprising multiple frequencies, building on a modulation scheme used in wireless communication, called orthogonal frequency-division multiplexing.
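    To make the OFDM idea concrete, here is a minimal Python sketch of how one wideband transmission carries many frequencies at once, so the channel response (including the tag’s reflection) can be read out at every frequency simultaneously. The subcarrier count and data are invented for illustration; this is the textbook scheme, not TurboTrack’s actual implementation.

    ```python
    import numpy as np

    n_subcarriers = 64  # illustrative number of simultaneous frequencies
    # one unit-power complex symbol per subcarrier
    symbols = np.exp(2j * np.pi * np.random.rand(n_subcarriers))

    # the inverse FFT sums all subcarriers into a single time-domain waveform
    time_signal = np.fft.ifft(symbols)

    # at the receiver, a forward FFT separates the frequencies again,
    # recovering the per-frequency response in one shot
    received = np.fft.fft(time_signal)
    assert np.allclose(received, symbols)
    ```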

    The system captures all the signals rebounding off objects in the environment, including off the RFID tag. One of those signals carries a pattern that’s specific to that particular RFID tag, because the tag reflects and absorbs the incoming signal in a certain pattern, corresponding to bits of 0s and 1s, that the system can recognize.

    Because these signals travel at the speed of light, the system can compute a “time of flight” — measuring distance by calculating the time it takes a signal to travel between a transmitter and receiver — to gauge the location of the tag, as well as the other objects in the environment. But this provides only a ballpark localization figure, not subcentimeter precision.
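    The time-of-flight arithmetic itself is straightforward; the engineering challenge is timing precisely enough. A minimal sketch of the basic calculation (the numbers are illustrative):

    ```python
    C = 299_792_458.0  # speed of light, m/s

    def one_way_distance(round_trip_seconds: float) -> float:
        """Distance to a reflector implied by a round-trip time of flight."""
        return C * round_trip_seconds / 2.0

    # a 20-nanosecond round trip puts the reflector about 3 m away
    print(one_way_distance(20e-9))  # ~2.998 m
    ```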

    Leveraging movement

    To zoom in on the tag’s location, the researchers developed what they call a “space-time super-resolution” algorithm.

    The algorithm combines the location estimations for all rebounding signals, including the RFID signal, which it determined using time of flight. Using some probability calculations, it narrows down that group to a handful of potential locations for the RFID tag.

    As the tag moves, its signal angle slightly alters — a change that also corresponds to a certain location. The algorithm then can use that angle change to track the tag’s distance as it moves. By constantly comparing that changing distance measurement to all other distance measurements from other signals, it can find the tag in a three-dimensional space. This all happens in a fraction of a second.

    “The high-level idea is that, by combining these measurements over time and over space, you get a better reconstruction of the tag’s position,” Adib says.
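    The paper’s space-time super-resolution algorithm is far more sophisticated, but the core geometric step, turning several distance estimates into one position, can be illustrated with a generic least-squares multilateration sketch. The anchor layout and Gauss-Newton solver below are illustrative assumptions, not the researchers’ method.

    ```python
    import numpy as np

    def locate(anchors, dists, iters=50):
        """Least-squares position from distances to known anchor points."""
        p = anchors.mean(axis=0)              # initial guess: the centroid
        for _ in range(iters):
            diff = p - anchors                # vectors from each anchor to the guess
            r = np.linalg.norm(diff, axis=1)  # predicted distances
            J = diff / r[:, None]             # Jacobian of the range residuals
            p -= np.linalg.lstsq(J, r - dists, rcond=None)[0]  # Gauss-Newton step
        return p

    anchors = np.array([[0.0, 0, 0], [2, 0, 0], [0, 2, 0], [0, 0, 2]])
    true_p = np.array([0.7, 0.4, 0.9])
    dists = np.linalg.norm(anchors - true_p, axis=1)
    print(locate(anchors, dists))  # recovers ~[0.7, 0.4, 0.9]
    ```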

    The work was sponsored, in part, by the National Science Foundation.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.


     
  • richardmitnick 10:18 am on January 9, 2019
    Tags: Robotics

    From CSIROscope: “Robots of the future: it’s about to get weird”


    9 January 2019

    The word “robot” was coined almost a hundred years ago by Czech writer Karel Čapek, to refer to the artificial life forms in his play “Rossum’s Universal Robots”. Ever since, humanoid-shaped robots have dominated concepts of what a robot should look like.

    Think of Star Wars’ C-3PO, The Terminator, The Iron Giant, or even Marvin the Paranoid Android from “The Hitchhiker’s Guide to the Galaxy”.

    In the real world there are also machines like Boston Dynamics’ incredibly agile “Atlas”. Or Sophia, the first robot to receive citizenship.

    More often than not, though, our shape isn’t the best one for robots faced with challenging assignments in extreme environments.

    In a just-published paper [Nature Machine Intelligence] our scientists have offered a bold glimpse into what the robots of the future could look like – and it’s not “Robby the Robot”.

    Robot evolution revolution

    Our Active Integrated Matter Future Science Platform (AIM FSP) says that within 20 years robots could look unpredictably different. Scientific breakthroughs in areas like materials discovery, advanced manufacturing, 3D printing, and artificial intelligence will allow robots to be designed from the molecular level up to perform their specific mission, resulting in unusual and unexpected shapes, limbs and behaviours.

    An artist’s impression of a robot for use in the Amazon. Based on tree-crawling lizards and geckos, it would have articulated legs for more flexibility and climbing.

    Central to this all is a concept known as Multi-Level Evolution (MLE). It argues that robots should be taking their engineering cues from the one tried and true design philosophy that’s survived millennia on Earth: evolution.

    Evolution has seen animals undergo incredibly diverse adaptations to survive challenging environments. It creates effective solutions that are often totally different to any a human engineer would come up with. Kangaroos, for instance, probably wouldn’t have made it off the drawing board, but they have survived and thrived for eons in Australia.

    How would MLE work?

    A robot’s mission, as well as details about the relevant terrain and environment, would be entered into a computer. It would then run algorithms based on evolution to automatically design robots.

    The computer would do this by exploring a diverse range of materials, components, sensors and behaviours. Advanced, computer-based modelling could rapidly test prototypes in simulated “real world” scenarios to decide which works best.
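    As a toy illustration of that evolutionary loop, the Python sketch below scores candidate designs with a made-up fitness function and mutates the best performers each generation. The design representation and fitness are invented placeholders; CSIRO’s MLE would evolve far richer descriptions, down to the materials themselves.

    ```python
    import random

    def random_design():
        return {"legs": random.randint(2, 8), "leg_len": random.uniform(0.1, 1.0)}

    def fitness(d):
        # hypothetical score: prefer ~6 short legs for rough terrain
        return -abs(d["legs"] - 6) - abs(d["leg_len"] - 0.3)

    def mutate(d):
        return {"legs": max(2, d["legs"] + random.choice([-1, 0, 1])),
                "leg_len": max(0.05, d["leg_len"] + random.gauss(0, 0.05))}

    population = [random_design() for _ in range(30)]
    for generation in range(100):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]                        # selection
        population = parents + [mutate(random.choice(parents)) for _ in range(20)]

    print(max(population, key=fitness))  # converges near 6 legs, ~0.3 m long
    ```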

    Once that’s done, 3D printing and other technologies would be used to create and physically test prototype robots.

    The end result? Small, simple, highly specialised robots that can automatically adapt to their environment and are tough enough to survive their mission.

    An artist’s impression of an ocean-, coastal- or river-based amphibious robot. It would travel in water like an eel, but have legs in order to crawl and climb.

    Do the robot

    Say, in the future, you need to design robots for environmental monitoring in extreme environments. They’d all need to move across difficult landscapes while gathering data. Eventually, to avoid polluting the environment, they’d have to return to base or degrade away to nothing. How could you do this?

    MLE would come up with remarkably different results, depending on the terrain, climate and other factors.

    To cope with the Sahara Desert a robot would need materials designed to survive punishing heat, sand and dust. Given the amount of sun the Sahara receives, the robot could be solar powered and slide across sand dunes. The harsh UV light could also be used as the trigger to eventually wear the robot away.

    In the Amazon a robot would have entirely different challenges to face. Thick, low lying vegetation and fallen trees would hamper its movement so it would need to be flexible enough to climb over or go round obstacles. It could perhaps be powered by biomass such as the leaves covering the jungle floor, and degrade with humidity.

    An artist’s impression of an Antarctic-based robot. Turtle-like, it would be strong and robust for extreme conditions. It could also suit desert applications.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition


    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 9:44 am on May 23, 2018
    Tags: Meet Our Robot Family, Robotics

    From CSIROscope: “Meet Our Robot Family” 


    23 May 2018
    Ketan Joshi

    We’re developing robotic systems that help humans perform dangerous tasks, and expanding the Australian robotics industry. Above, Weaver. No image credit.

    The family of robots that live at our Data61 are incredibly diverse. They’ve got legs, wheels, cameras, sensors, fins, blades and magnets. They sense the world, navigate it autonomously, and they traverse places too dangerous and dirty for human work. They’re as varied as the challenges they’re designed to resolve, but the common DNA is a focus on the use of cutting-edge data science.

    This isn’t something we go at alone: our partners include DARPA (the US Defense Advanced Research Projects Agency), Rockwell Collins, Boeing, Woodside, Queensland University of Technology, and many other governments, universities and enterprises. We recently announced the Sixth Wave Alliance to develop a national robotics R&D strategy and create the critical mass required to address large-scale Australian and international challenges using robotics technologies.

    This week, we’re also at the International Conference on Robotics and Automation (ICRA 2018), where we’re showcasing the best of our bots.


    Meet the family below, and read more about our robotics research here.

    Machines that see – Sensing and mapping the world

    Sucking up information from the world is a capability we fleshy humans take for granted. Data61’s robotic and autonomous devices are particularly good at sensing and mapping – two capabilities that are of high importance for modern robotics and industries like mining, exploration and environmental conservation.

    Hovermap and Zebedee – moving without GPS

    Drones are increasingly common as consumer goods, but they’re reliant on direct access to global positioning satellites (GPS).

    Hovermap is a 3D mapping system that uses LIDAR (light detection and ranging) technology, combined with Data61’s proprietary Simultaneous Localisation and Mapping (SLAM) solution. Hovermap works in conjunction with a UAV (uncrewed aerial vehicle), and can map both indoor and outdoor locations without relying on GPS.

    Zebedee, our high-accuracy 3D laser mapping technology, was commercialised and is already being used around the world by 25 multinational organisations. It was recently trialled by the International Atomic Energy Agency in nuclear safeguards inspections.

    Camazotz – the bat god tech

    Camazotz, named after a Mayan bat god, is a small, portable device that is used to monitor flying foxes across Australia, helping ecologists understand and predict the spread of disease. The Wireless Ad hoc System for Positioning (WASP) uses similar tags to track vehicles and mine workers relative to reference nodes – assisting with safety and boosting productivity.

    Legged Robots

    You’ve probably seen videos of animal-like robots doing clever tasks, shared with a tone of alarm. Legged robots aren’t a reason for alarm – these systems are well suited to navigating environments that are too dangerous or dirty for safe human work, such as a chemical spill in a plant or a ceiling beam in a factory.

    Gizmo

    Gizmo dancing

    Gizmo is Data61’s newest bot – a small, smooth hexapod designed for versatility and small spaces. One of the motivating applications for this robot is to inspect and map confined spaces such as ceiling cavities and underfloor areas.

    Zee

    Zee

    Zee is a prototype hexapod robot equipped with a streaming camera sensor and a real-time 3D scanning LIDAR. You’ve probably seen Zee around – it’s an older machine but still an excellent demonstration of six-legged robotics.

    Weaver

    Zee’s big sister, Weaver, features five joints per leg and 30 degrees of freedom. Weaver can self-stabilise through ‘exteroceptive’ sensing – enabling the robot to walk up gradients of 30°, and remain stable on inclines up to 50°.

    MaX


    MaX (Multi-legged autonomous explorer) is even bigger – 2.25m tall when standing up straight. But MaX weighs only 60kg, around 5 to 20 times lighter than comparable robots. MaX is a research vehicle designed to help our scientists understand how to traverse and explore challenging indoor and outdoor environments.

    Magnapod


    Magnapods are Data61’s wall-climbing, electro-magnetic inspection robots, useful in confined space inspection tasks and capable of carrying a 10 kilogram sensor payload.

    You can read more about the scientific goals of our legged robot research program here.

    Autonomous vehicles

    Creating systems that can navigate and respond without human intervention is a key component in removing the human element from tasks that are dangerous or poorly suited for human control. We’ve developed several ground vehicles normally used in industrial environments that can operate without human intervention, including the Gator, the load haul dump vehicle and the 20 tonne hot metal carrier.


    Our Science Rover enables the complicated process of satellite calibration: the autonomous vehicle collects measurements at the same time an Earth observation satellite passes overhead; the two datasets are compared, and the satellite is calibrated. Our underwater autonomous vehicle, Starbug, uses underwater sensor networks to locate itself (GPS signals cannot be used underwater), enabling smart underwater data collection for the protection and tracking of ecosystems.

    Our family of robots is, as you can see, pretty diverse. It’s the broad nature of the challenges they’re addressing that gives them these shapes, from small to big, wheeled to legged.

    See the full article here.



    Please help promote STEM in your local schools.
    Stem Education Coalition


    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

     
  • richardmitnick 9:02 am on January 3, 2018
    Tags: Bimorph, Graphene-based Bimorphs for Micron-sized Autonomous Origami Machines, Physicists take first step toward cell-sized robots, Robotics, You could put the computational power of the spaceship Voyager onto an object the size of a cell

    From Cornell Chronicle: “Physicists take first step toward cell-sized robots” 


    January 2, 2018
    Tom Fleischman
    tjf85@cornell.edu

    Charles Walcott

    An electricity-conducting, environment-sensing, shape-changing machine the size of a human cell? Is that even possible?

    Cornell physicists Paul McEuen and Itai Cohen not only say yes, but they’ve actually built the “muscle” for one.

    With postdoctoral researcher Marc Miskin at the helm, the team has made a robot exoskeleton that can rapidly change its shape upon sensing chemical or thermal changes in its environment. And, they claim, these microscale machines – equipped with electronic, photonic and chemical payloads – could become a powerful platform for robotics at the size scale of biological microorganisms.

    “You could put the computational power of the spaceship Voyager onto an object the size of a cell,” Cohen said. “Then, where do you go explore?”

    “We are trying to build what you might call an ‘exoskeleton’ for electronics,” said McEuen, the John A. Newman Professor of Physical Science and director of the Kavli Institute at Cornell for Nanoscale Science. “Right now, you can make little computer chips that do a lot of information-processing … but they don’t know how to move or cause something to bend.”

    Their work is outlined in Graphene-based Bimorphs for Micron-sized, Autonomous Origami Machines, published Jan. 2 in Proceedings of the National Academy of Sciences. Miskin is lead author; other contributors included David Muller, the Samuel B. Eckert Professor of Engineering, and doctoral students Kyle Dorsey, Baris Bircan and Yimo Han.

    The machines move using a motor called a bimorph. A bimorph is an assembly of two materials – in this case, graphene and glass – that bends when driven by a stimulus like heat, a chemical reaction or an applied voltage. The shape change happens because, in the case of heat, two materials with different thermal responses expand by different amounts over the same temperature change.

    As a consequence, the bimorph bends to relieve some of this strain, allowing one layer to stretch out longer than the other. By adding rigid flat panels that cannot be bent by bimorphs, the researchers localize bending to take place only in specific places, creating folds. With this concept, they are able to make a variety of folding structures ranging from tetrahedra (triangular pyramids) to cubes.
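    The amount of bending can be estimated with the classic Timoshenko bimetal-strip formula, sketched below in Python. The layer properties in the example are rough illustrative numbers, not measurements of the Cornell devices.

    ```python
    def bimorph_curvature(alpha1, alpha2, dT, t1, t2, E1, E2):
        """Curvature (1/m) of a two-layer strip heated by dT,
        from Timoshenko's bimetal-strip analysis."""
        m = t1 / t2        # thickness ratio
        n = E1 / E2        # Young's modulus ratio
        h = t1 + t2        # total thickness
        num = 6.0 * (alpha2 - alpha1) * dT * (1 + m) ** 2
        den = h * (3 * (1 + m) ** 2 + (1 + m * n) * (m ** 2 + 1 / (m * n)))
        return num / den

    # nanometer-thin layers give a tiny bend radius, which is why such
    # thin bimorphs can fold micron-sized structures (values illustrative)
    kappa = bimorph_curvature(alpha1=1e-6, alpha2=5e-7, dT=50,
                              t1=2e-9, t2=2e-9, E1=1e12, E2=70e9)
    print("bend radius:", 1 / abs(kappa), "m")
    ```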

    In the case of graphene and glass, the bimorphs also fold in response to chemical stimuli by driving large ions into the glass, causing it to expand. Typically this chemical activity only occurs on the very outer edge of glass when submerged in water or some other ionic fluid. Since their bimorph is only a few nanometers thick, the glass is basically all outer edge and very reactive.

    “It’s a neat trick,” Miskin said, “because it’s something you can do only with these nanoscale systems.”

    The bimorph is built using atomic layer deposition – chemically “painting” atomically thin layers of silicon dioxide onto aluminum over a cover slip – then wet-transferring a single atomic layer of graphene on top of the stack. The result is the thinnest bimorph ever made.

    One of their machines was described as being “three times larger than a red blood cell and three times smaller than a large neuron” when folded. Folding scaffolds of this size have been built before, but this group’s version has one clear advantage.

    “Our devices are compatible with semiconductor manufacturing,” Cohen said. “That’s what’s making this compatible with our future vision for robotics at this scale.”

    And due to graphene’s relative strength, Miskin said, it can handle the types of loads necessary for electronics applications.

    “If you want to build this electronics exoskeleton,” he said, “you need it to be able to produce enough force to carry the electronics. Ours does that.”

    For now, these tiniest of tiny machines have no commercial application in electronics, biological sensing or anything else. But the research pushes the science of nanoscale robots forward, McEuen said.

    “Right now, there are no ‘muscles’ for small-scale machines,” he said, “so we’re building the small-scale muscles.”

    This work was performed at the Cornell NanoScale Facility for Science and Technology and supported by the Cornell Center for Materials Research, the National Science Foundation, the Air Force Office of Scientific Research and the Kavli Institute at Cornell.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Once called “the first American university” by educational historian Frederick Rudolph, Cornell University represents a distinctive mix of eminent scholarship and democratic ideals. Adding practical subjects to the classics and admitting qualified students regardless of nationality, race, social circumstance, gender, or religion was quite a departure when Cornell was founded in 1865.

    Today’s Cornell reflects this heritage of egalitarian excellence. It is home to the nation’s first colleges devoted to hotel administration, industrial and labor relations, and veterinary medicine. Both a private university and the land-grant institution of New York State, Cornell University is the most educationally diverse member of the Ivy League.

    On the Ithaca campus alone nearly 20,000 students representing every state and 120 countries choose from among 4,000 courses in 11 undergraduate, graduate, and professional schools. Many undergraduates participate in a wide range of interdisciplinary programs, play meaningful roles in original research, and study in Cornell programs in Washington, New York City, and the world over.

     
  • richardmitnick 11:46 am on December 31, 2017
    Tags: Daniel Vogt, Falkor research vessel, NOAA’s Office of Ocean Exploration and Research, PIPA-Phoenix Islands Protected Area, Robotics, ROV-remotely operated underwater vehicle, Schmidt Ocean Institute, Squishy fingers help scientists probe the watery depths

    From Wyss Institute: “Squishy fingers help scientists probe the watery depths” 2017 


    October 28, 2017
    Lindsay Brownell

    Wyss researcher Daniel Vogt tests out soft robotics on deep sea corals in the South Pacific.

    As an engineer with degrees in Computer Science and Microengineering, Wyss researcher Daniel Vogt usually spends most of his time in his lab building and testing robots, surrounded by jumbles of cables, wires, bits of plastic, and circuit boards. But for the last month, he’s spent nearly every day in a room that resembles NASA ground control surrounded by marine biologists on a ship in the middle of the Pacific Ocean, intently watching them use joysticks and buttons to maneuver a remotely operated underwater vehicle (ROV) to harvest corals, crabs, and other sea life from the ocean floor.

    The squishy fingers are made of a soft, flexible material that is more dexterous and gentle than ROVs’ conventional grippers. Credit: Schmidt Ocean Institute.


    Deep corals of the Phoenix Islands Protected Area: How Wyss Institute researchers are changing underwater exploration. Credit: Schmidt Ocean Institute.

    This particular ROV’s robotic metal arm is holding the reason why Vogt is here: what looks like a large, floppy toy starfish made of blue and yellow foam. “Devices like this are extremely soft – you can compare them to rubber bands or gummy bears – and this allows them to grasp things that you wouldn’t be able to grasp with a hard device like the ROV gripper,” says Vogt, watching the TV screen as the “squishy fingers” gently close around a diaphanous bright pink sea cucumber and lift it off the sand. The biologists applaud as the fingers cradle the sea cucumber safely on its journey to the ROV’s collection box. “Nicely done,” Vogt says to the ROV operators.

    This shipful of scientists is the latest in a series of research voyages co-funded by NOAA’s Office of Ocean Exploration and Research and the Schmidt Ocean Institute, a nonprofit founded by Eric and Wendy Schmidt in 2009 to support high-risk marine exploration that expands humans’ understanding of our planet’s oceans. The Institute provides marine scientists access to the ship, Falkor, and expert technical shipboard support in exchange for a commitment to openly share and communicate the outcomes of their research.

    Falkor is equipped with both wet and dry lab spaces, the ROV SuBastian, echosounders, water sampling systems, and many other instruments to gather data about the ocean. Credit: Schmidt Ocean Institute.

    Vogt’s shipmates are studying the mysterious deep-sea coral communities that live below 138 meters (450 feet) on seamounts that are mostly unexplored.

    The best place to find those corals is the Phoenix Islands Protected Area (PIPA), a smattering of tiny islands, atolls, coral reefs, and great swaths of their surrounding South Pacific ocean almost 3,000 miles from the nearest continent. PIPA is the largest (the size of California) and deepest (average water column depth of 4 km/2.5 mi) UNESCO World Heritage Site on Earth and, thanks to its designation as a Marine Protected Area in 2008, represents one of Earth’s last intact oceanic coral archipelago ecosystems. With over 500 species of reef fishes, 250 shallow coral species, and large numbers of sharks and other marine life, PIPA’s reefs resemble what a reef might have looked like a thousand years ago, before human activity began to severely affect oceanic communities. The team on board Falkor is conducting the first deep water biological surveys in PIPA, assessing what species of deep corals are present and any new, undescribed species, while also evaluating the effect of seawater acidification (caused by an increase in the amount of CO2 in the water) on deep coral ecosystems.

    The deep ocean is about as inhospitable to human life as outer space, so scientists largely rely on ROVs to be their eyes, legs, and hands underwater, controlling them remotely from the safety of the surface. Most ROVs used in deep-sea research were designed for use in the oil and gas industries and are built to accomplish tasks like lifting heavy weights, drilling into rock, and installing machinery. When it comes to plucking a sea cucumber off the ocean floor or snipping a piece off a delicate sea fan, however, existing ROVs are like bulls in a china shop, often crushing the samples they’re meant to be taking.

    This problem led to a collaboration between Wyss Core Faculty member Rob Wood, Ph.D. and City University of New York (CUNY) marine biologist David Gruber, Ph.D. back in 2014 that produced the first version of the soft robotic “squishy fingers,” which were successfully tested in the Red Sea in 2015. PIPA offered a unique opportunity to test the squishy fingers in more extreme conditions and evaluate a series of improvements that Vogt and other members of Wood’s lab have been making to them, such as integrating sensors into the robots’ soft bodies. “The Phoenix Islands are very unexplored. We’re looking for new species of corals that nobody has ever seen anywhere else. We don’t know what our graspers will have to pick up on a given day, so it’s a great opportunity to see how they fare against different challenges in the field.”

    Daniel Vogt holds the ‘squishy finger’ soft robots aboard Falkor. Credit: Schmidt Ocean Institute.

    Vogt, ever the tinkerer, also brought with him something that the Red Sea voyage did not have on board: two off-the-shelf 3D printers. Taking feedback directly from the biologists and the ROV pilots about what the soft robot could and could not do, Vogt was able to print new components overnight and try them in the field the next day – something that rarely happens even on land. “It’s really a novel thing, to be able to iterate based on input in the middle of the Pacific Ocean, with no lab in sight. We noticed, for example, that the samples we tried to grasp were often on rock instead of sand, making it difficult for the soft fingers to reach underneath the sample for a good grip. In the latest iteration of the gripper, ‘fingernails’ were added to improve grasping in these situations.” The ultimate goal of building better and better underwater soft robots is to be able to conduct research on samples underwater at their natural depth and temperature, rather than bringing them up to the surface, as this will paint a more accurate picture of what is happening out of sight in the world’s oceans.

    PIPA may be somewhat insulated from the threats of warming oceans and pollution thanks to its remoteness and deep waters, but the people of Kiribati, the island nation that contains and administers PIPA, are not. The researchers visited the island of Kanton, population 25, a few days into their trip to meet the local people and learn about their lives in a country where dry land makes up less than 1% of its total area – a true oceanic nation. “The people were very nice, very welcoming. There is one ship that comes every six months to deliver supplies; everything else they get from the sea,” says Vogt (locals are allowed to fish for subsistence). “They’re also going to be one of the first nations affected by rising sea levels, because the highest point on the whole island is three meters (ten feet). They know that they live in a special place, but they’re preparing for the day when they’ll have to leave their home. The whole community has bought land on Fiji, where they’ll move once Kanton becomes uninhabitable.”

    Daniel Vogt tests the squishy fingers on the forearm of CUNY biologist David Gruber, who spearheaded their development along with Wyss Faculty member Rob Wood. Credit: Schmidt Ocean Institute.

    Research that brings scientists from different fields together to elucidate the world’s remaining unknowns and solve its toughest problems is gaining popularity, and may be the best chance humanity has to ensure its own survival. “One of the most eye-opening part of the trip has been interacting with people from different backgrounds and seeing the scientific challenges they face, which are very different from the challenges that the mechanical and electrical engineers I’m with most of the time have to solve,” says Vogt. “I’ve been amazed by the technology that’s on Falkor related to the ROV and all the scientific tools aboard. The ROV SuBastian is one-of-a-kind, with numerous tools, cameras and sensors aboard as well as an advanced underwater positioning system. It takes a lot of engineers to create and operate something like that, and then a lot of biologists to interpret the results and analyze the 400+ samples which were collected during the cruise.”

    Vogt says he spent a lot of time listening to the biologists and the ROV pilots in order to modify the gripper’s design according to their feedback. The latest version of the gripper was fully designed and manufactured on the boat, and was used during the last dive to successfully sample a variety of sea creatures. He and Wood plan to write several papers detailing the results of his experiments in the coming months.

    “We’re very excited that what started as a conversation between a roboticist and a marine biologist at a conference three years ago has blossomed into a project that solves a significant problem in the real world, and can aid researchers in understanding and preserving our oceans’ sea life,” says Wood.

    Additional videos detailing Vogt’s voyage, including the ship’s log, can be found here.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition


    The Wyss (pronounced “Veese”) Institute for Biologically Inspired Engineering uses Nature’s design principles to develop bioinspired materials and devices that will transform medicine and create a more sustainable world.

    Working as an alliance among Harvard’s Schools of Medicine, Engineering, and Arts & Sciences, and in partnership with Beth Israel Deaconess Medical Center, Boston Children’s Hospital, Brigham and Women’s Hospital, Dana Farber Cancer Institute, Massachusetts General Hospital, the University of Massachusetts Medical School, Spaulding Rehabilitation Hospital, Tufts University, and Boston University, the Institute crosses disciplinary and institutional barriers to engage in high-risk research that leads to transformative technological breakthroughs.

     
  • richardmitnick 1:23 pm on November 8, 2017
    Tags: CSAIL-MIT’s Computer Science and Artificial Intelligence Lab, Daniela Rus, More Evidence that Humans and Machines Are Better When They Team Up, Robotics

    From MIT Technology Review: Women in STEM – Daniela Rus, “More Evidence that Humans and Machines Are Better When They Team Up”


    November 8, 2017
    Will Knight

    By worrying about job displacement, we might end up missing a huge opportunity for technological amplification.

    MIT computer scientist Daniela Rus. Photo: Justin Saglio.

    Instead of just fretting about how robots and AI will eliminate jobs, we should explore new ways for humans and machines to collaborate, says Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Lab (CSAIL).

    “I believe people and machines should not be competitors, they should be collaborators,” Rus said during her keynote at EmTech MIT 2017, an annual event hosted by MIT Technology Review.

    How technology will impact employment in coming years has become a huge question for economists, policy-makers, and technologists. And, as one of the world’s preeminent centers of robotics and artificial intelligence, CSAIL has a big stake in driving coming changes.

    There is some disagreement among experts about how significantly jobs will be affected by automation and AI, and about how this will be offset by the creation of new business opportunities. Last week, Rus and others at MIT organized an event called AI and the Future of Work, where some speakers gave more dire warnings about the likely upheaval ahead (see “Is AI About to Decimate White Collar Jobs?”).

    The potential for AI to augment human skills is often mentioned, but it has been researched relatively little. Rus talked about a study by researchers from Harvard University comparing the ability of expert doctors and AI software to diagnose cancer in patients. They found that doctors perform significantly better than the software, but doctors together with software were better still.

    Rus pointed to the potential for AI to augment human capabilities in law and in manufacturing, where smarter automated systems might enable the production of goods to be highly customized and more distributed.

    Robotics might end up augmenting human abilities in some surprising ways. For instance, Rus pointed to a project at MIT that involves using the technology in self-driving cars to help people with visual impairment to navigate. She also speculated that brain-computer interfaces, while still relatively crude today, might have a huge impact on future interactions with robots.

    Although Rus is bullish on the future of work, she said two economic phenomena do give her cause for concern. One is the decreasing quality of many jobs, something that is partly shaped by automation; and the other is the flat gross domestic product of the United States, which impacts the emergence of new economic opportunities.

    But because AI is still so limited, she said she expects it to mostly eliminate routine and boring elements of work. “There is still a lot to be done in this space,” Rus said. “I am wildly excited about offloading my routine tasks to machines so I can focus on things that are interesting.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 11:49 am on October 20, 2017
    Tags: Endless thirst and search for research knowledge, Low-cost self-driving technology for power wheelchairs, Maya Burhanpurkar, Robotics, Writing algorithms, Writing code

    From Paulson: Women in STEM – “Driven to discover” Maya Burhanpurkar

    Harvard John A. Paulson School of Engineering and Applied Sciences

    October 18, 2017
    Adam Zewe

    Harvard freshman Maya Burhanpurkar spent the past year developing software for a low-cost, self-driving technology for power wheelchairs, and writing super-fast algorithms to process data from one of the world’s most powerful telescopes. (Photo by Adam Zewe/SEAS Communications.)

    Before she arrived on campus this fall, Harvard freshman Maya Burhanpurkar already had a notch in her belt typically reserved for Ph.D. candidates. First author of a paper on a low-cost, self-driving technology for power wheelchairs, Burhanpurkar presented the research at the 2017 IEEE International Conference on Rehabilitation Robotics.

    Sharing her work with hundreds of scientists from around the world was exhilarating, Burhanpurkar said, and another milestone in a research career that began at age 7.

    “I was always asking questions. Eventually, I started asking questions people didn’t have answers to, so I started doing my own projects,” she said.

    Maya Burhanpurkar discusses the low-cost, self-driving technology for power wheelchairs she and a University of Toronto team developed. (Image courtesy of Reuters.)

    The intrepid elementary school student, who grew up in a rural town 100 miles north of Toronto, set out to determine if herbs could kill pathogenic bacteria. She commandeered a piece of raw chicken meant for that night’s dinner, left it on the deck for a few days, and then swabbed it onto Petri dishes. In her basement microbiology laboratory, she piled herbs onto the Petri dishes and put them into a homemade incubator she built with a cooler and electric blanket.

    At Canada’s National Science Fair, Burhanpurkar showcased her incredible results—no bacterial growth meant the herbs must have killed the bacteria. A Science Fair judge quickly, but kindly, pointed out that the bacteria actually died due to suffocation.

    That experience only fueled Burhanpurkar’s desire to conduct more research. She began contacting professors and, while in ninth grade, joined a University of Toronto lab to build an apparatus that can physically detect the time integral of distance. The project earned her a second Grand Platinum award at the National Science Fair.

    Through middle and high school, she built a quantum key distribution system for cryptography at the Institute for Quantum Computing, tracked near-earth asteroids for the Harvard-Smithsonian Center for Astrophysics, and embarked on an expedition to study the impact of climate change on the Canadian and Greenlandic Arctic. The latter project led her to write and produce an award-winning climate change documentary titled “400 PPM.”

    Burhanpurkar at Jakobshavn fjord on the west coast of Greenland. (Photo provided by Maya Burhanpurkar.)

    “Research really drew me in because of the opportunity to answer unanswered questions,” she said. “It is so fascinating that you can make fundamental discoveries about the universe around us.”

    Not even an early acceptance by Harvard could disrupt her focus on research; Burhanpurkar deferred admission to work on the self-driving wheelchair technology during a gap year. Her University of Toronto team sought to develop a hardware and software package that would make it easier for people with severe physical disabilities to use power wheelchairs.

    “People with hand tremors or more severe Parkinson’s Disease or ALS really struggle with a joystick or an alternate input device, like a sip-and-puff switch,” she said. “These people often have degraded mobility and a degraded quality of life.”

    Burhanpurkar developed a core part of the software for the semi-autonomous system that is capable of localization, mapping, and obstacle avoidance. The software utilizes off-the-shelf computer vision and odometry sensors rather than expensive 3D laser scanners and high-performance hardware, so it is more cost-effective than other devices, Burhanpurkar said.

    Despite her lack of coding experience, she wrote a specialized path-planning algorithm that enables autonomous doorway detection and traversal simply by placing the wheelchair in front of a door. She also helped develop software for autonomously traveling down long corridors, and docking at a desk, typically very difficult tasks for users with upper body mobility impairments.
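    The article doesn’t spell out the algorithm, but one common ingredient of doorway detection can be sketched simply: scan a 2D laser range profile for a wide run of far readings between near wall readings, then steer toward its midpoint. Everything below (thresholds, scan geometry, the small-angle chord approximation) is an invented illustration, not the team’s actual code.

    ```python
    import numpy as np

    def find_doorway(angles, ranges, wall_dist=1.5, min_width=0.8):
        """Bearing (rad) of the widest gap in a range scan, or None."""
        far = ranges > wall_dist          # beams passing through the wall plane
        best, i = None, 0
        while i < len(far):
            if far[i]:
                j = i
                while j < len(far) and far[j]:
                    j += 1                # j is one past the end of this gap
                width = wall_dist * abs(angles[j - 1] - angles[i])  # chord approx.
                if width >= min_width and (best is None or width > best[0]):
                    best = (width, 0.5 * (angles[i] + angles[j - 1]))
                i = j
            else:
                i += 1
        return None if best is None else best[1]

    angles = np.linspace(-np.pi / 2, np.pi / 2, 181)  # one beam per degree
    ranges = np.full_like(angles, 1.2)                # a wall ahead of the chair
    ranges[75:110] = 4.0                              # an open doorway in it
    print(find_doorway(angles, ranges))               # bearing ~0.03 rad
    ```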

    “The challenge was what I enjoyed the most. I got thrown off the deep end in this project and I had to swim my way up, which was really fun,” she said. “It was intellectually interesting, but it was also emotionally interesting. Working on something that can directly impact people’s lives in the near future, not decades away, is really exciting.”

    But that was only half of Burhanpurkar’s gap year. She spent the other half as the youngest paid researcher at the Perimeter Institute for Theoretical Physics (where Stephen Hawking keeps an office), writing super-fast algorithms for a novel telescope in British Columbia. The telescope will continually map the entire northern hemisphere in an effort to learn more about cosmic fast radio bursts.

    Burhanpurkar working on code at the Canadian Hydrogen Intensity Mapping Experiment telescope in British Columbia. (Photo by Richard Bowden.)

    Each day, the planet is bombarded by high-energy millisecond duration bursts of radio waves, each having the energy of 500 million of our suns, but scientists remain puzzled about their origins. This new telescope will enable researchers to gather data on thousands of these bursts, opening the door for more detailed analysis.

    Terabytes of astronomical data will be generated each second, so the super-fast algorithms Burhanpurkar and the team wrote are necessary to efficiently process the mass of information.
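    As a flavor of the per-channel work such algorithms do, here is a textbook incoherent-dedispersion sketch: lower radio frequencies arrive later, so each frequency channel is shifted back by its dispersion delay before summing. This is the standard formulation, not the experiment’s optimized code.

    ```python
    import numpy as np

    K_DM = 4.1488e3  # dispersion constant, s * MHz^2 per (pc/cm^3)

    def dedisperse(data, freqs_mhz, dm, dt):
        """data: (n_channels, n_samples) intensities; dt: sample time in s.
        Returns the dedispersed time series at trial dispersion measure dm."""
        f_ref = freqs_mhz.max()
        delays = K_DM * dm * (freqs_mhz ** -2.0 - f_ref ** -2.0)  # seconds
        shifts = np.round(delays / dt).astype(int)
        aligned = np.vstack([np.roll(row, -s)       # undo each channel's delay
                             for row, s in zip(data, shifts)])
        return aligned.sum(axis=0)
    ```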

    “I know that right now my code is running on a new 128-node supercomputer in British Columbia, and it is going to help detect one of the most enigmatic phenomena of the universe,” she said. “That’s pretty cool.”

    Now at Harvard, Burhanpurkar is not planning to slow down. She is interested in continuing her robotics research and working with the i-Lab to bring the cost-effective self-driving wheelchair technology to consumers.

    While she hasn’t decided on a concentration, she is considering computer science and physics (or both), and looks forward to pursuing her passion for research down new avenues.

    “Taking a gap year was great for perspective,” she said. “Before, I wasn’t sure what I wanted to do. I hadn’t really done long-term research projects, but now, I have actual experience and I know what the end goal is. I can use that to motivate me.”

    Burhanpurkar interviewing Canadian author and environmental activist Margaret Atwood for her climate change documentary titled “400 PPM.” (Photo provided by Maya Burhanpurkar.)

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Through research and scholarship, the Harvard School of Engineering and Applied Sciences (SEAS) will create collaborative bridges across Harvard and educate the next generation of global leaders. By harnessing the power of engineering and applied sciences we will address the greatest challenges facing our society.

    Specifically, that means that SEAS will provide to all Harvard College students an introduction to and familiarity with engineering and technology as this is essential knowledge in the 21st century.

    Moreover, our concentrators will be immersed in the liberal arts environment and be able to understand the societal context for their problem solving, capable of working seamlessly with others, including those in the arts, the sciences, and the professional schools. They will focus on the fundamental engineering and applied science disciplines for the 21st century, as we will not teach legacy 20th-century engineering disciplines.

    Instead, our curriculum will be rigorous but inviting to students, and be infused with active learning, interdisciplinary research, entrepreneurship and engineering design experiences. For our concentrators and graduate students, we will educate “T-shaped” individuals – with depth in one discipline but capable of working seamlessly with others, including arts, humanities, natural science and social science.

    To address current and future societal challenges, knowledge from fundamental science, art, and the humanities must all be linked through the application of engineering principles with the professions of law, medicine, public policy, design and business practice.

    In other words, solving important issues requires a multidisciplinary approach.

    With the combined strengths of SEAS, the Faculty of Arts and Sciences, and the professional schools, Harvard is ideally positioned to both broadly educate the next generation of leaders who understand the complexities of technology and society and to use its intellectual resources and innovative thinking to meet the challenges of the 21st century.

    Ultimately, we will provide to our graduates a rigorous quantitative liberal arts education that is an excellent launching point for any career and profession.

     
  • richardmitnick 3:19 pm on October 6, 2017
    Tags: “Primer” - a new cube-shaped robot can be controlled via magnets to make it walk roll sail and glide., Robotics

    From MIT: “‘Superhero’ robot wears different outfits for different tasks”


    September 27, 2017
    Adam Conner-Simons
    Rachel Gordon

    Dubbed “Primer,” a new cube-shaped robot can be controlled via magnets to make it walk, roll, sail, and glide. It carries out these actions by wearing different exoskeletons, which start out as sheets of plastic that fold into specific shapes when heated. After Primer finishes its task, it can shed its “skin” by immersing itself in water, which dissolves the exoskeleton. Courtesy of the researchers.

    From butterflies that sprout wings to hermit crabs that switch their shells, many animals must adapt their exterior features in order to survive. While humans don’t undergo that kind of metamorphosis, we often try to create functional objects that are similarly adaptive — including our robots.

    Despite what you might have seen in “Transformers” movies, though, today’s robots are still pretty inflexible. Each of their parts usually has a fixed structure and a single defined purpose, making it difficult for them to perform a wide variety of actions.

    Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) are aiming to change that with a new shape-shifting robot that’s something of a superhero: It can transform itself with different “outfits” that allow it to perform different tasks.

    Dubbed “Primer,” the cube-shaped robot can be controlled via magnets to make it walk, roll, sail, and glide. It carries out these actions by wearing different exoskeletons, which start out as sheets of plastic that fold into specific shapes when heated. After Primer finishes its task, it can shed its “skin” by immersing itself in water, which dissolves the exoskeleton.

    “If we want robots to help us do things, it’s not very efficient to have a different one for each task,” says Daniela Rus, CSAIL director and principal investigator on the project. “With this metamorphosis-inspired approach, we can extend the capabilities of a single robot by giving it different ‘accessories’ to use in different situations.”

    Primer’s various forms have a range of advantages. For example, “Wheel-bot” has wheels that allow it to move twice as fast as “Walk-bot.” “Boat-bot” can float on water and carry nearly twice its weight. “Glider-bot” can soar across longer distances, which could be useful for deploying robots or switching environments.

    Primer can even wear multiple outfits at once, like a Russian nesting doll. It can add one exoskeleton to become “Walk-bot,” and then interface with another, larger exoskeleton that allows it to carry objects and move two body lengths per second. To deploy the second exoskeleton, “Walk-bot” steps onto the sheet, which then blankets the bot with its four self-folding arms.

    “Imagine future applications for space exploration, where you could send a single robot with a stack of exoskeletons to Mars,” says postdoc Shuguang Li, one of the co-authors of the study. “The robot could then perform different tasks by wearing different ‘outfits.’”

    The project was led by Rus and Shuhei Miyashita, a former CSAIL postdoc who is now director of the Microrobotics Group at the University of York. Their co-authors include Li and graduate student Steven Guitron. An article about the work appears in the journal Science Robotics on Sept. 27.

    Robot metamorphosis

    Primer builds on several previous projects from Rus’ team, including magnetic blocks that can assemble themselves into different shapes and centimeter-long microrobots that can be precisely customized from sheets of plastic.

    While robots that can change their form or function have been developed at larger sizes, it’s generally been difficult to build such structures at much smaller scales.

    “This work represents an advance over the authors’ previous work in that they have now demonstrated a scheme that allows for the creation of five different functionalities,” says Eric Diller, a microrobotics expert and assistant professor of mechanical engineering at the University of Toronto, who was not involved in the paper. “Previous work at most shifted between only two functionalities, such as ‘open’ or ‘closed’ shapes.”

    The team outlines many potential applications for robots that can perform multiple actions with just a quick costume change. For example, say some equipment needs to be moved across a stream. A single robot with multiple exoskeletons could potentially sail across the stream and then carry objects on the other side.

    “Our approach shows that origami-inspired manufacturing allows us to have robotic components that are versatile, accessible, and reusable,” says Rus, the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT.

    Designed in a matter of hours, the exoskeletons fold into shape after being heated for just a few seconds, suggesting a new approach to rapid fabrication of robots.

    “I could envision devices like these being used in ‘microfactories’ where prefabricated parts and tools would enable a single microrobot to do many complex tasks on demand,” Diller says.

    As a next step, the team plans to explore giving the robots an even wider range of capabilities, from driving through water and burrowing in sand to camouflaging their color. Guitron pictures a future robotics community that shares open-source designs for parts much the way 3-D-printing enthusiasts trade ideas on sites such as Thingiverse.

    “I can imagine one day being able to customize robots with different arms and appendages,” says Rus. “Why update a whole robot when you can just update one part of it?”

    This project was supported, in part, by the National Science Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

  • richardmitnick 5:16 am on July 19, 2017 Permalink | Reply
    Tags: , , CSIRO blogs, Robotics   

    From CSIRO blog: “Legged robots walk the walk” 

    CSIRO blog

    19th July 2017
    Eliza Keck

    In an emergency, first responders often have to make a very tough call: can I enter the area safely or is it too dangerous? It’s the most extreme risk vs reward analysis anyone could ever face, and the call is often made in mere moments and with very little information. In the future, this decision will hopefully be much easier with the help of some six-legged robots: hexapods. Creating robots that can go into an unpredictable, unstable environment and help people escape it would be a literal life-saver.

    You know how people are always talking about how robots are going to steal our jobs and take over the world? Well, this is one job we wouldn’t mind them taking.

    Wheel what have we here?

    There are some pretty amazing robots on wheels. Case in point: NASA’s Curiosity.

    NASA/Mars Curiosity Rover

    Wheels are great for moving fast: they’re stable and easy to build. So why the obsession with legs? Well, wheels have their drawbacks: they can’t move side to side (well, most of them can’t!), they can’t cross over gaps, they can’t climb over obstacles, and they’re basically turtles; flip them on their back and they’re useless. Legs are the answer. So why haven’t we done it already? Because legs are significantly more complicated.

    Balancing act

    Humans have been trying to create humanoid robots for centuries. But being able to walk on two legs is a significant achievement, one that took us millions of years to perfect. Simply to balance, many complex body systems work together (and even compensate for each other when required). There’s our vestibulo-ocular reflex (our eyes and inner ear working together), our nervous system, and the body’s sense of where it is in space: proprioception. We also have baroreceptors, sensors in our blood vessels that detect blood pressure (much as a barometer detects air pressure) and tell our heart to pump blood faster when we stand so we don’t faint.

    When designing a robot, scientists have to decide what kind of stability it will use: dynamic or static. As its name suggests, a statically stable robot is stable when standing still. Basically, any robot with three or more legs can do this without trying. Dynamically stable robots are stable when moving (think about how much easier it is to hop on one leg than to stand still on one leg). Obviously, a dynamically stable robot is much harder to control and significantly more complex; however, it is also faster and more energy efficient. Most scientists are working to create something that is the best of both worlds. For us, we’re doing this with hexapods.

    Model of a humanoid robot based on drawings by Leonardo da Vinci. Photo by Erik Möller.

    The invention of sensors like accelerometers and gyroscopes has helped scientists take the next *step* forwards in balance and stability, but that’s only the start of the many complex problems scientists have to solve before our robot dreams can turn into reality.
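As a rough illustration of how those two sensors complement each other, here is the textbook “complementary filter” for estimating tilt. This is a generic technique, not CSIRO’s code; the sensor values and blend factor below are made up.

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse a gyroscope (smooth but drifts over time) with an accelerometer
    (noisy but drift-free) to estimate the current tilt angle in radians."""
    gyro_angle = angle + gyro_rate * dt          # integrate angular velocity
    accel_angle = math.atan2(accel_x, accel_z)   # tilt implied by gravity
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# One update at 100 Hz: rotating at 0.1 rad/s, body tilted about 0.05 rad.
angle = complementary_filter(0.0, 0.1, 0.05, 1.0, dt=0.01)
print(angle)
```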

    Casing the joint

    Do you enjoy scrambling around rock pools at the beach? Ever notice how you move when climbing? It isn’t the same as walking on the footpath, is it? You slow down, use your hands for stability, and test the movement of each rock before committing all your weight to it. Your joints play a vital role in stability on rough terrain. Toes, ankles, knees, hips and your back all make minute and major adjustments to keep you stable.

    Having flexible legs with multiple joints improves stability on rough terrain. Our first hexapod models had three ‘joints’ per leg. They were fantastic at walking on a flat surface, but as soon as they encountered a steep hill they lost their grip.

    Our latest hexapod models have two extra ‘joints’ per leg and can now tackle inclines of up to 50 per cent. This is because they can widen their stance, creating a larger support polygon and shifting their centre of gravity to within this polygon.
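The stability rule behind this is simple enough to sketch: project the centre of gravity onto the ground and check whether it falls inside the polygon formed by the supporting feet. The code below is a minimal illustration with invented coordinates, not the hexapods’ actual controller.

```python
def inside_polygon(point, polygon):
    """Ray-casting point-in-polygon test for a 2-D polygon (list of (x, y))."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge crosses the ray
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

# Widening the stance enlarges the support polygon, so the same centre of
# gravity that was outside the narrow footprint is now safely inside.
narrow = [(-0.1, -0.1), (0.1, -0.1), (0.1, 0.1), (-0.1, 0.1)]
wide   = [(-0.3, -0.3), (0.3, -0.3), (0.3, 0.3), (-0.3, 0.3)]
cog = (0.2, 0.0)
print(inside_polygon(cog, narrow), inside_polygon(cog, wide))  # False True
```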

    How useful are diagrams when trying to understand support polygons!? Credit: Stability During Arboreal Locomotion; Andrew Lammers and Ulrich Zurcher, Cleveland State University, USA.

    Walk this way

    The gait (walking style) of a robot plays a big role in how fast and efficient it will be. When deciding which gait a robot uses, you’ve got two options: fast and efficient but unstable, or slow and stable but inefficient. Neither option would work in real life. So what’s the solution? The robot needs to be able to change how it moves depending on the situation. This is called ‘dynamic movement’. Our hexapods constantly test the surface and will automatically change their gait and speed to stay stable and energy efficient.
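A minimal sketch of that decision logic might look like the following. The gait names anticipate the tripod and wave gaits described later in this post; the thresholds and stride frequencies are invented for illustration, not values from the CSIRO hexapods.

```python
def choose_gait(slope_deg, slip_detected):
    """Tripod gait is fast on easy ground; the wave gait keeps more feet
    planted at once, trading speed for stability on rough terrain."""
    if slip_detected or slope_deg > 20:
        return "wave", 0.1      # gait name, stride frequency (Hz): slow, stable
    return "tripod", 1.0        # fast and efficient on flat ground

gait, freq = choose_gait(slope_deg=25, slip_detected=False)
print(gait, freq)  # wave 0.1
```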

    Our legged robots have got all the right moves; click here to learn more about them.

    Getting around is no easy feat, unless you have six of them

    When disaster strikes, who’s first on the scene?

    Emergency response teams often need to enter dangerous or confined spaces. But accessing unknown or unstable areas involves risk.

    Our legged robots are designed to go where no other robot or human can easily venture, for example a collapsed building. These nifty bots are able to safely explore and assess dangerous areas, such as searching for survivors before rescue teams are sent in.

    Introducing the legged robots

    Our hexapods are modelled on insects with the same number and configuration of legs, such as ants and cockroaches. The hexapods are programmed with different gaits inspired by their natural counterparts.

    One of the most popular gaits, inspired by running ants and cockroaches, is called the “alternating tripod gait”. The “wave gait”, closer to a caterpillar’s pattern, is slower but more stable. It’s much more useful when navigating sloped or slippery terrain.

    One of our hexapods, Weaver, has five joints on each of its six legs, enabling it to move freely and negotiate uneven terrain easily.

    It is also fitted with a pair of stereo cameras, allowing it to create a digital elevation map of an area and detect any physical obstacles in its path. Thanks to sensors in each of its leg joints, this nifty insect-like bot can measure the forces felt at its foot tips. When each foot touches the ground, the robot feeds this information about ground conditions back through a sequence of algorithms.

    In combination with its elevation map, the hexapod can interpret the stability of the surface and adjust the stiffness of its legs as it travels. By tuning the flexibility of its leg joints to the roughness of the terrain, the legged robot avoids getting stuck or losing its balance.
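As a rough sketch of that terrain-adaptive behaviour (not Weaver’s actual control code), leg stiffness can be scaled down from a baseline as the estimated roughness goes up, with foot-tip forces deciding when a leg has made contact. All constants below are assumptions.

```python
def leg_stiffness(terrain_roughness, base_stiffness=100.0):
    """Soften the legs on rough ground so the feet comply with the surface;
    stiffen them on flat ground for efficient, precise stepping.
    terrain_roughness is assumed normalised to [0, 1]."""
    roughness = min(max(terrain_roughness, 0.0), 1.0)
    return base_stiffness * (1.0 - 0.7 * roughness)   # up to 70% softer

def foot_contact(force_newtons, threshold=2.0):
    """Declare ground contact once the foot-tip force crosses a threshold."""
    return force_newtons > threshold

for roughness in (0.0, 0.5, 1.0):
    print(roughness, leg_stiffness(roughness))  # 100.0, 65.0, 30.0
```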

    Getting into those hard-to-reach spaces

    Hexapods: Legged Robots

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

    The CSIRO blog is designed to entertain, inform and inspire by generally digging around in the work being done by our terrific scientists, and leaving the techie speak and jargon for the experts.

    We aim to bring you stories from across the vast breadth and depth of our organisation: from the wild sea voyages of our Research Vessel Investigator to the mind-blowing astronomy of our Space teams, right through all the different ways our scientists solve national challenges in areas as diverse as Health, Farming, Tech, Manufacturing, Energy, Oceans, and our Environment.

    If you have any questions about anything you find on our blog, we’d love to hear from you. You can reach us at socialmedia@csiro.au.

    And if you’d like to find out more about us, our science, or how to work with us, head over to CSIRO.au

     
  • richardmitnick 7:58 am on July 14, 2017 Permalink | Reply
    Tags: , Biomechatronics, , Developing designs for exoskeletons and prosthetic limbs, New software algorithms, Robotics   

    From Carnegie Mellon: “Carnegie Mellon Develops Landmark Achievement in Walking Technology” 

    Carnegie Mellon University

    July 11, 2017
    Lisa Kulick
    lkulick@andrew.cmu.edu

    Researchers in Carnegie Mellon University’s College of Engineering are using feedback from the human body to develop designs for exoskeletons and prosthetic limbs.

    Published in Science, their technique, called human-in-the-loop optimization, customizes walking assistance for individuals and significantly reduces the energy needed to walk. The algorithm that enables this optimization represents a landmark achievement in the field of biomechatronics.

    “Existing exoskeleton devices, despite their potential, have not improved walking performance as much as we think they should,” said Steven Collins, a professor of mechanical engineering. “We’ve seen improvements related to computing, hardware and sensors, but the biggest challenge has remained the human element — we just haven’t been able to guess how they will respond to new devices.”

    The software algorithm is combined with versatile emulator hardware that automatically identifies optimal assistance strategies for individuals.

    During experiments, each user received a unique pattern of assistance from an exoskeleton worn on one ankle. The algorithm tested responses to 32 patterns over the course of an hour, making adjustments based on measurements of the user’s energy use with each pattern. The optimized assistance pattern produced larger benefits than any exoskeleton to date, including devices acting at all joints on both legs.
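In outline, the optimization loop is easy to sketch: propose an assistance pattern, measure the user’s energy use while walking with it, and keep whatever works best. The toy version below uses random search over two normalised parameters and a fake, noisy cost function standing in for the metabolic measurements; the real system used a more sophisticated optimizer and actual respirometry data.

```python
import random

def measure_metabolic_cost(pattern):
    """Placeholder for measuring the user's energy use while walking with
    this assistance pattern (here: normalised peak torque and its timing)."""
    peak_torque, peak_time = pattern
    # Toy cost surface with a minimum near (0.6, 0.5); real data is noisy.
    return (peak_torque - 0.6) ** 2 + (peak_time - 0.5) ** 2 + random.gauss(0, 0.01)

best_pattern, best_cost = None, float("inf")
for _ in range(32):                       # 32 patterns over about an hour
    candidate = (random.uniform(0, 1),    # normalised peak torque
                 random.uniform(0, 1))    # normalised peak-torque timing
    cost = measure_metabolic_cost(candidate)
    if cost < best_cost:
        best_pattern, best_cost = candidate, cost

print(best_pattern, best_cost)
```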

    “When we walk, we naturally optimize coordination patterns for energy efficiency,” Collins said. “Human-in-the-loop optimization acts in a similar way to optimize the assistance provided by wearable devices. We are really excited about this approach because we think it will dramatically improve energy economy, speed and balance for millions of people, especially those with disabilities.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Carnegie Mellon University (CMU) is a global research university with more than 12,000 students, 95,000 alumni, and 5,000 faculty and staff. CMU has been a birthplace of innovation since its founding in 1900. Today, we are a global leader bringing groundbreaking ideas to market and creating successful startup businesses. Our award-winning faculty members are renowned for working closely with students to solve major scientific, technological and societal challenges. We put a strong emphasis on creating things, from art to robots. Our students are recruited by some of the world’s most innovative companies. We have campuses in Pittsburgh, Qatar and Silicon Valley, and degree-granting programs around the world, including Africa, Asia, Australia, Europe and Latin America.

     