Tagged: Robotics

  • richardmitnick 9:44 am on May 23, 2018 Permalink | Reply
    Tags: Meet Our Robot Family, Robotics

    From CSIROscope: “Meet Our Robot Family” 


    23 May 2018
    Ketan Joshi

    We’re developing robotic systems that help humans perform dangerous tasks, and expanding the Australian robotics industry. Above, Weaver. No image credit.

    The family of robots that live at our Data61 are incredibly diverse. They’ve got legs, wheels, cameras, sensors, fins, blades and magnets. They sense the world, navigate it autonomously, and they traverse places too dangerous and dirty for human work. They’re as varied as the challenges they’re designed to resolve, but the common DNA is a focus on the use of cutting-edge data science.

    This isn’t something we go at alone: our partners include DARPA (the Defense Advanced Research Projects Agency), Rockwell Collins, Boeing, Woodside, Queensland University of Technology, and many other governments, universities and enterprises. We recently announced the Sixth Wave Alliance, which aims to develop a national robotics R&D strategy and create the critical mass required to address large-scale Australian and international challenges using robotics technologies.

    This week, we’re also at the International Conference on Robotics and Automation (ICRA 2018), where we’re showcasing the best of our bots.


    Meet the family below, and read more about our robotics research here.

    Machines that see – Sensing and mapping the world

    Sucking up information from the world is a capability we fleshy humans take for granted. Data61’s robotic and autonomous devices are particularly good at sensing and mapping – two capabilities that are of high importance for modern robotics and industries like mining, exploration and environmental conservation.

    Hovermap and Zebedee – moving without GPS

    Drones are increasingly common as consumer goods, but they’re reliant on direct access to global positioning system (GPS) satellites.

    Hovermap is a 3D mapping system that uses LIDAR (light detection and ranging) technology, combined with Data61’s proprietary Simultaneous Localisation and Mapping (SLAM) solution. Hovermap works in conjunction with a UAV (unmanned aerial vehicle), and can map both indoor and outdoor locations without relying on GPS.
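Data61's SLAM solution is proprietary and not described here, but the core geometric step in many LIDAR scan-matching pipelines can be sketched. Assuming point correspondences between two scans are already known (in practice they must be estimated, e.g. by iterative closest point), the best rigid alignment is a small least-squares problem:

```python
import numpy as np

def align_scans(prev, curr):
    """Estimate the rigid 2-D transform (rotation R, translation t) mapping
    the current LIDAR scan onto the previous one, given matched points.
    This is the classic least-squares (Kabsch) step at the heart of
    scan-matching odometry."""
    mu_p, mu_c = prev.mean(axis=0), curr.mean(axis=0)
    H = (curr - mu_c).T @ (prev - mu_p)        # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mu_p - R @ mu_c
    return R, t                                # R @ curr_i + t ~ prev_i
```

Chaining this alignment scan-to-scan gives an odometry estimate; fusing those poses with loop closures is what turns the idea into full SLAM.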

    Zebedee, our high-accuracy 3D laser mapping technology, was commercialised and is already being used around the world by 25 multinational organisations. It was recently trialled by the International Atomic Energy Agency in nuclear safeguards inspections.

    Camazotz – the bat god tech

    Camazotz, named after a Mayan bat god, is a small, portable device that is used to monitor flying foxes across Australia, helping ecologists understand and predict the spread of disease. The Wireless Ad hoc System for Positioning (WASP) uses similar tags to track vehicles and mine workers relative to reference nodes – assisting with safety and boosting productivity.
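WASP's internals aren't detailed here, but the basic idea of positioning a tag relative to fixed reference nodes can be illustrated with a standard least-squares trilateration sketch (the node layout and ranges in the example are invented for illustration):

```python
import numpy as np

def trilaterate(anchors, dists):
    """Estimate a 2-D tag position from range measurements to fixed
    reference nodes. Subtracting the first range equation from the others
    linearises the problem, which is then solved by least squares."""
    a0, d0 = anchors[0], dists[0]
    A = 2 * (anchors[1:] - a0)
    b = (d0**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With three or more non-collinear reference nodes the system is overdetermined, so noisy ranges are averaged out rather than trusted individually.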

    Legged Robots

    You’ve probably seen videos of animal-like robots performing clever tasks, often shared with a tone of alarm. Legged robots aren’t a cause for alarm – these systems are well suited to navigating environments that are too dangerous or dirty for safe human work, such as a chemical spill in a plant or a ceiling beam in a factory.

    Gizmo

    Gizmo dancing

    Gizmo is Data61’s newest bot – a small, smooth hexapod designed for versatility and small spaces. One of the motivating applications for this robot is inspecting and mapping confined spaces such as ceiling cavities and underfloor areas.

    Zee

    Zee

    Zee is a prototype hexapod robot equipped with a streaming camera sensor and a real-time 3D scanning LIDAR. You’ve probably seen Zee around – it’s an older machine but still an excellent demonstration of six-legged robotics.

    Weaver

    Zee’s big sister, Weaver, features five joints per leg and 30 degrees of freedom. Weaver can self-stabilise through ‘exteroceptive’ sensing – enabling the robot to walk up gradients of 30°, and remain stable on inclines up to 50°.
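Those incline figures distinguish walking up a slope from merely remaining stable on one. A toy static check (purely illustrative; Weaver's actual controller adapts its posture using sensor feedback rather than a fixed geometric test) is whether the gravity vector through the centre of mass stays inside the support polygon:

```python
import math

def is_statically_stable(incline_deg, support_halfwidth, com_height):
    """Toy static-stability test: a body on a slope tips once the gravity
    vector through its centre of mass leaves the support polygon, i.e. when
    tan(incline) exceeds (support half-width / CoM height). Dimensions are
    hypothetical, in metres."""
    return math.tan(math.radians(incline_deg)) < support_halfwidth / com_height
```

The test makes the design intuition visible: a wide stance and a low centre of mass buy stability on steep ground, which is exactly what a sprawled hexapod posture provides.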

    MaX


    MaX (Multi-legged autonomous explorer) is even bigger – 2.25m tall when standing up straight. But MaX weighs only 60kg, around five to 20 times lighter than comparable robots. MaX is a research vehicle designed to help our scientists understand how to traverse and explore challenging indoor and outdoor environments.

    Magnapod


    Magnapods are Data61’s wall-climbing, electro-magnetic inspection robots, useful in confined space inspection tasks and capable of carrying a 10 kilogram sensor payload.

    You can read more about the scientific goals of our legged robot research program here.

    Autonomous vehicles

    Creating systems that can navigate and respond without human intervention is a key component in removing the human element from tasks that are dangerous or poorly suited for human control. We’ve developed several ground vehicles normally used in industrial environments that can operate without human intervention, including the Gator, the load haul dump vehicle and the 20 tonne hot metal carrier.


    Our Science Rover automates the complicated process of satellite calibration: the autonomous vehicle collects ground measurements at the same time an Earth observation satellite passes overhead; the two datasets are then compared, and the satellite is calibrated. Our autonomous underwater vehicle, Starbug, uses underwater sensor networks to locate itself (GPS signals cannot be used underwater), enabling smart underwater data collection for the protection and tracking of ecosystems.

    Our family of robots is, as you can see, pretty diverse. It’s the broad nature of the challenges they’re addressing that gives them these shapes, from small to big, wheeled to legged.

    See the full article here.



    Please help promote STEM in your local schools.
    Stem Education Coalition


    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

  • richardmitnick 9:02 am on January 3, 2018 Permalink | Reply
    Tags: Bimorph, Graphene-based Bimorphs for Micron-sized Autonomous Origami Machines, Physicists take first step toward cell-sized robots, Robotics, You could put the computational power of the spaceship Voyager onto an object the size of a cell

    From Cornell Chronicle: “Physicists take first step toward cell-sized robots” 


    January 2, 2018
    Tom Fleischman
    tjf85@cornell.edu


    An electricity-conducting, environment-sensing, shape-changing machine the size of a human cell? Is that even possible?

    Cornell physicists Paul McEuen and Itai Cohen not only say yes, but they’ve actually built the “muscle” for one.

    With postdoctoral researcher Marc Miskin at the helm, the team has made a robot exoskeleton that can rapidly change its shape upon sensing chemical or thermal changes in its environment. And, they claim, these microscale machines – equipped with electronic, photonic and chemical payloads – could become a powerful platform for robotics at the size scale of biological microorganisms.

    “You could put the computational power of the spaceship Voyager onto an object the size of a cell,” Cohen said. “Then, where do you go explore?”

    “We are trying to build what you might call an ‘exoskeleton’ for electronics,” said McEuen, the John A. Newman Professor of Physical Science and director of the Kavli Institute at Cornell for Nanoscale Science. “Right now, you can make little computer chips that do a lot of information-processing … but they don’t know how to move or cause something to bend.”

    Their work is outlined in Graphene-based Bimorphs for Micron-sized, Autonomous Origami Machines, published Jan. 2 in Proceedings of the National Academy of Sciences. Miskin is lead author; other contributors included David Muller, the Samuel B. Eckert Professor of Engineering, and doctoral students Kyle Dorsey, Baris Bircan and Yimo Han.

    The machines move using a motor called a bimorph. A bimorph is an assembly of two materials – in this case, graphene and glass – that bends when driven by a stimulus like heat, a chemical reaction or an applied voltage. The shape change happens because, in the case of heat, two materials with different thermal responses expand by different amounts over the same temperature change.

    As a consequence, the bimorph bends to relieve some of this strain, allowing one layer to stretch out longer than the other. By adding rigid flat panels that cannot be bent by bimorphs, the researchers localize bending to take place only in specific places, creating folds. With this concept, they are able to make a variety of folding structures ranging from tetrahedra (triangular pyramids) to cubes.
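The thermal half of this mechanism is captured by Timoshenko's classical bimorph formula: the strip's curvature grows with the expansion mismatch and the temperature change, and shrinks with total thickness. A small calculator (a generic textbook result, not taken from the paper; material numbers in any example are illustrative):

```python
def bimorph_curvature(dalpha, dT, t1, t2, E1, E2):
    """Timoshenko's formula for the curvature (1/m) of a two-layer strip
    whose layers expand by different amounts under a temperature change.
    dalpha: difference in thermal expansion coefficients (1/K)
    dT: temperature change (K)
    t1, t2: layer thicknesses (m);  E1, E2: Young's moduli (Pa)."""
    m = t1 / t2          # thickness ratio
    n = E1 / E2          # stiffness ratio
    h = t1 + t2          # total thickness
    return (6 * dalpha * dT * (1 + m) ** 2) / (
        h * (3 * (1 + m) ** 2 + (1 + m * n) * (m ** 2 + 1 / (m * n)))
    )
```

Because curvature scales as 1/h, shaving the stack down to a few nanometres is what lets these machines fold sharply with tiny stimuli.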

    In the case of graphene and glass, the bimorphs also fold in response to chemical stimuli by driving large ions into the glass, causing it to expand. Typically this chemical activity only occurs on the very outer edge of glass when submerged in water or some other ionic fluid. Since their bimorph is only a few nanometers thick, the glass is basically all outer edge and very reactive.

    “It’s a neat trick,” Miskin said, “because it’s something you can do only with these nanoscale systems.”

    The bimorph is built using atomic layer deposition – chemically “painting” atomically thin layers of silicon dioxide onto aluminum over a cover slip – then wet-transferring a single atomic layer of graphene on top of the stack. The result is the thinnest bimorph ever made.

    One of their machines was described as being “three times larger than a red blood cell and three times smaller than a large neuron” when folded. Folding scaffolds of this size have been built before, but this group’s version has one clear advantage.

    “Our devices are compatible with semiconductor manufacturing,” Cohen said. “That’s what’s making this compatible with our future vision for robotics at this scale.”

    And due to graphene’s relative strength, Miskin said, it can handle the types of loads necessary for electronics applications.

    “If you want to build this electronics exoskeleton,” he said, “you need it to be able to produce enough force to carry the electronics. Ours does that.”

    For now, these tiniest of tiny machines have no commercial application in electronics, biological sensing or anything else. But the research pushes the science of nanoscale robots forward, McEuen said.

    “Right now, there are no ‘muscles’ for small-scale machines,” he said, “so we’re building the small-scale muscles.”

    This work was performed at the Cornell NanoScale Facility for Science and Technology and supported by the Cornell Center for Materials Research, the National Science Foundation, the Air Force Office of Scientific Research and the Kavli Institute at Cornell.

    See the full article here.


    Once called “the first American university” by educational historian Frederick Rudolph, Cornell University represents a distinctive mix of eminent scholarship and democratic ideals. Adding practical subjects to the classics and admitting qualified students regardless of nationality, race, social circumstance, gender, or religion was quite a departure when Cornell was founded in 1865.

    Today’s Cornell reflects this heritage of egalitarian excellence. It is home to the nation’s first colleges devoted to hotel administration, industrial and labor relations, and veterinary medicine. Both a private university and the land-grant institution of New York State, Cornell University is the most educationally diverse member of the Ivy League.

    On the Ithaca campus alone nearly 20,000 students representing every state and 120 countries choose from among 4,000 courses in 11 undergraduate, graduate, and professional schools. Many undergraduates participate in a wide range of interdisciplinary programs, play meaningful roles in original research, and study in Cornell programs in Washington, New York City, and the world over.

     
  • richardmitnick 11:46 am on December 31, 2017 Permalink | Reply
    Tags: Daniel Vogt, Falkor research vessel, NOAA’s Office of Ocean Exploration and Research, PIPA-Phoenix Islands Protected Area, Robotics, ROV-remotely operated underwater vehicle, Schmidt Ocean Institute, Squishy fingers help scientists probe the watery depths

    From Wyss Institute: “Squishy fingers help scientists probe the watery depths” 2017 


    October 28, 2017
    Lindsay Brownell

    Wyss researcher Daniel Vogt tests out soft robotics on deep sea corals in the South Pacific.

    As an engineer with degrees in Computer Science and Microengineering, Wyss researcher Daniel Vogt usually spends most of his time in his lab building and testing robots, surrounded by jumbles of cables, wires, bits of plastic, and circuit boards. But for the last month, he’s spent nearly every day in a room that resembles NASA ground control surrounded by marine biologists on a ship in the middle of the Pacific Ocean, intently watching them use joysticks and buttons to maneuver a remotely operated underwater vehicle (ROV) to harvest corals, crabs, and other sea life from the ocean floor.

    The squishy fingers are made of a soft, flexible material that is more dexterous and gentle than ROVs’ conventional grippers. Credit: Schmidt Ocean Institute.


    Deep corals of the Phoenix Islands Protected Area: How Wyss Institute researchers are changing underwater exploration. Credit: Schmidt Ocean Institute.

    This particular ROV’s robotic metal arm is holding the reason why Vogt is here: what looks like a large, floppy toy starfish made of blue and yellow foam. “Devices like this are extremely soft – you can compare them to rubber bands or gummy bears – and this allows them to grasp things that you wouldn’t be able to grasp with a hard device like the ROV gripper,” says Vogt, watching the TV screen as the “squishy fingers” gently close around a diaphanous bright pink sea cucumber and lift it off the sand. The biologists applaud as the fingers cradle the sea cucumber safely on its journey to the ROV’s collection box. “Nicely done,” Vogt says to the ROV operators.

    This shipful of scientists is the latest in a series of research voyages co-funded by NOAA’s Office of Ocean Exploration and Research and the Schmidt Ocean Institute, a nonprofit founded by Eric and Wendy Schmidt in 2009 to support high-risk marine exploration that expands humans’ understanding of our planet’s oceans. The Institute provides marine scientists access to the ship, Falkor, and expert technical shipboard support in exchange for a commitment to openly share and communicate the outcomes of their research.

    Falkor is equipped with both wet and dry lab spaces, the ROV SuBastian, echosounders, water sampling systems, and many other instruments to gather data about the ocean. Credit: Schmidt Ocean Institute.

    Vogt’s shipmates are studying the mysterious coral communities of the deep ocean, which live below 138 meters (450 feet) on largely unexplored seamounts.

    The best place to find those corals is the Phoenix Islands Protected Area (PIPA), a smattering of tiny islands, atolls, coral reefs, and great swaths of their surrounding South Pacific ocean almost 3,000 miles from the nearest continent. PIPA is the largest (the size of California) and deepest (average water column depth of 4 km/2.5 mi) UNESCO World Heritage Site on Earth and, thanks to its designation as a Marine Protected Area in 2008, represents one of Earth’s last intact oceanic coral archipelago ecosystems. With over 500 species of reef fishes, 250 shallow coral species, and large numbers of sharks and other marine life, PIPA’s reefs resemble what a reef might have looked like a thousand years ago, before human activity began to severely affect oceanic communities. The team on board Falkor is conducting the first deep water biological surveys in PIPA, assessing what species of deep corals are present and any new, undescribed species, while also evaluating the effect of seawater acidification (caused by an increase in the amount of CO2 in the water) on deep coral ecosystems.

    The deep ocean is about as inhospitable to human life as outer space, so scientists largely rely on ROVs to be their eyes, legs, and hands underwater, controlling them remotely from the safety of the surface. Most ROVs used in deep-sea research were designed for use in the oil and gas industries and are built to accomplish tasks like lifting heavy weights, drilling into rock, and installing machinery. When it comes to plucking a sea cucumber off the ocean floor or snipping a piece off a delicate sea fan, however, existing ROVs are like bulls in a china shop, often crushing the samples they’re meant to be taking.

    This problem led to a collaboration between Wyss Core Faculty member Rob Wood, Ph.D. and City University of New York (CUNY) marine biologist David Gruber, Ph.D. back in 2014 that produced the first version of the soft robotic “squishy fingers,” which were successfully tested in the Red Sea in 2015. PIPA offered a unique opportunity to test the squishy fingers in more extreme conditions and evaluate a series of improvements that Vogt and other members of Wood’s lab have been making to them, such as integrating sensors into the robots’ soft bodies. “The Phoenix Islands are very unexplored. We’re looking for new species of corals that nobody has ever seen anywhere else. We don’t know what our graspers will have to pick up on a given day, so it’s a great opportunity to see how they fare against different challenges in the field.”

    Daniel Vogt holds the ‘squishy finger’ soft robots aboard Falkor. Credit: Schmidt Ocean Institute.

    Vogt, ever the tinkerer, also brought with him something that the Red Sea voyage did not have on board: two off-the-shelf 3D printers. Taking feedback directly from the biologists and the ROV pilots about what the soft robot could and could not do, Vogt was able to print new components overnight and try them in the field the next day – something that rarely happens even on land. “It’s really a novel thing, to be able to iterate based on input in the middle of the Pacific Ocean, with no lab in sight. We noticed, for example, that the samples we tried to grasp were often on rock instead of sand, making it difficult for the soft fingers to reach underneath the sample for a good grip. In the latest iteration of the gripper, ‘fingernails’ were added to improve grasping in these situations.” The ultimate goal of building better and better underwater soft robots is to be able to conduct research on samples underwater at their natural depth and temperature, rather than bringing them up to the surface, as this will paint a more accurate picture of what is happening out of sight in the world’s oceans.

    PIPA may be somewhat insulated from the threats of warming oceans and pollution thanks to its remoteness and deep waters, but the people of Kiribati, the island nation that contains and administers PIPA, are not. The researchers visited the island of Kanton, population 25, a few days into their trip to meet the local people and learn about their lives in a country where dry land makes up less than 1% of its total area – a true oceanic nation. “The people were very nice, very welcoming. There is one ship that comes every six months to deliver supplies; everything else they get from the sea,” says Vogt (locals are allowed to fish for subsistence). “They’re also going to be one of the first nations affected by rising sea levels, because the highest point on the whole island is three meters (ten feet). They know that they live in a special place, but they’re preparing for the day when they’ll have to leave their home. The whole community has bought land on Fiji, where they’ll move once Kanton becomes uninhabitable.”

    Daniel Vogt tests the squishy fingers on the forearm of CUNY biologist David Gruber, who spearheaded their development along with Wyss Faculty member Rob Wood. Credit: Schmidt Ocean Institute.

    Research that brings scientists from different fields together to elucidate the world’s remaining unknowns and solve its toughest problems is gaining popularity, and may be the best chance humanity has to ensure its own survival. “One of the most eye-opening parts of the trip has been interacting with people from different backgrounds and seeing the scientific challenges they face, which are very different from the challenges that the mechanical and electrical engineers I’m with most of the time have to solve,” says Vogt. “I’ve been amazed by the technology that’s on Falkor related to the ROV and all the scientific tools aboard. The ROV SuBastian is one-of-a-kind, with numerous tools, cameras and sensors aboard as well as an advanced underwater positioning system. It takes a lot of engineers to create and operate something like that, and then a lot of biologists to interpret the results and analyze the 400+ samples which were collected during the cruise.”

    Vogt says he spent a lot of time listening to the biologists and the ROV pilots in order to modify the gripper’s design according to their feedback. The latest version of the gripper was fully designed and manufactured on the boat, and was used during the last dive to successfully sample a variety of sea creatures. He and Wood plan to write several papers detailing the results of his experiments in the coming months.

    “We’re very excited that what started as a conversation between a roboticist and a marine biologist at a conference three years ago has blossomed into a project that solves a significant problem in the real world, and can aid researchers in understanding and preserving our oceans’ sea life,” says Wood.

    Additional videos detailing Vogt’s voyage, including the ship’s log, can be found here.

    See the full article here.


    Wyss Institute campus

    The Wyss (pronounced “Veese”) Institute for Biologically Inspired Engineering uses Nature’s design principles to develop bioinspired materials and devices that will transform medicine and create a more sustainable world.

    Working as an alliance among Harvard’s Schools of Medicine, Engineering, and Arts & Sciences, and in partnership with Beth Israel Deaconess Medical Center, Boston Children’s Hospital, Brigham and Women’s Hospital, Dana Farber Cancer Institute, Massachusetts General Hospital, the University of Massachusetts Medical School, Spaulding Rehabilitation Hospital, Tufts University, and Boston University, the Institute crosses disciplinary and institutional barriers to engage in high-risk research that leads to transformative technological breakthroughs.

     
  • richardmitnick 1:23 pm on November 8, 2017 Permalink | Reply
    Tags: CSAIL-MIT’s Computer Science and Artificial Intelligence Lab, Daniela Rus, More Evidence that Humans and Machines Are Better When They Team Up, Robotics

    From MIT Technology Review: Women in STEM – Daniela Rus, “More Evidence that Humans and Machines Are Better When They Team Up” 


    November 8, 2017
    Will Knight

    By worrying about job displacement, we might end up missing a huge opportunity for technological amplification.

    MIT computer scientist Daniela Rus. Justin Saglio

    Instead of just fretting about how robots and AI will eliminate jobs, we should explore new ways for humans and machines to collaborate, says Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Lab (CSAIL).

    “I believe people and machines should not be competitors, they should be collaborators,” Rus said during her keynote at EmTech MIT 2017, an annual event hosted by MIT Technology Review.

    How technology will impact employment in coming years has become a huge question for economists, policy-makers, and technologists. And, as one of the world’s preeminent centers of robotics and artificial intelligence, CSAIL has a big stake in driving coming changes.

    There is some disagreement among experts about how significantly jobs will be affected by automation and AI, and about how this will be offset by the creation of new business opportunities. Last week, Rus and others at MIT organized an event called AI and the Future of Work, where some speakers gave more dire warnings about the likely upheaval ahead (see “Is AI About to Decimate White Collar Jobs?”).

    The potential for AI to augment human skills is often mentioned, but it has been researched relatively little. Rus talked about a study by researchers from Harvard University comparing the ability of expert doctors and AI software to diagnose cancer in patients. They found that doctors perform significantly better than the software, but doctors together with software were better still.
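The study's methodology isn't detailed here, but the qualitative finding (human plus model beats either alone) can be illustrated with a toy probability-averaging ensemble. All numbers below are invented for illustration, not taken from the Harvard study:

```python
import numpy as np

def accuracy(y_true, probs):
    """Fraction of cases where thresholding the probability at 0.5
    matches the true label."""
    preds = (np.asarray(probs) > 0.5).astype(int)
    return float(np.mean(preds == np.asarray(y_true)))

def combined(p_doctor, p_model):
    """Average the clinician's and the model's calibrated probabilities --
    the simplest way the two can 'team up'."""
    return (np.asarray(p_doctor) + np.asarray(p_model)) / 2
```

When the human and the model err on different cases, averaging their (calibrated) confidences lets each one's certainty override the other's borderline mistakes, which is one plausible mechanism behind the "better together" result.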

    Rus pointed to the potential for AI to augment human capabilities in law and in manufacturing, where smarter automated systems might enable the production of goods to be highly customized and more distributed.

    Robotics might end up augmenting human abilities in some surprising ways. For instance, Rus pointed to a project at MIT that involves using the technology in self-driving cars to help people with visual impairment to navigate. She also speculated that brain-computer interfaces, while still relatively crude today, might have a huge impact on future interactions with robots.

    Although Rus is bullish on the future of work, she said two economic phenomena do give her cause for concern. One is the decreasing quality of many jobs, something that is partly shaped by automation; and the other is the flat gross domestic product of the United States, which impacts the emergence of new economic opportunities.

    But because AI is still so limited, she said she expects it to mostly eliminate routine and boring elements of work. “There is still a lot to be done in this space,” Rus said. “I am wildly excited about offloading my routine tasks to machines so I can focus on things that are interesting.”

    See the full article here.


    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

     
  • richardmitnick 11:49 am on October 20, 2017 Permalink | Reply
    Tags: Endless thirst and search for research knowledge, Low-cost self-driving technology for power wheelchairs, Maya Burhanpurkar, Robotics, Writing algorithms, Writing code

    From Paulson: Women in STEM – “Driven to discover,” Maya Burhanpurkar 

    Harvard John A. Paulson School of Engineering and Applied Sciences

    October 18, 2017
    Adam Zewe

    Harvard freshman Maya Burhanpurkar spent the past year developing software for a low-cost, self-driving technology for power wheelchairs, and writing super-fast algorithms to process data from one of the world’s most powerful telescopes. (Photo by Adam Zewe/SEAS Communications.)

    Before she arrived on campus this fall, Harvard freshman Maya Burhanpurkar already had a notch in her belt typically reserved for Ph.D. candidates. First author of a paper on a low-cost, self-driving technology for power wheelchairs, Burhanpurkar presented the research at the 2017 IEEE International Conference on Rehabilitative Robotics.

    Sharing her work with hundreds of scientists from around the world was exhilarating, Burhanpurkar said, and another milestone in a research career that began at age 7.

    “I was always asking questions. Eventually, I started asking questions people didn’t have answers to, so I started doing my own projects,” she said.

    Maya Burhanpurkar discusses the low-cost, self-driving technology for power wheelchairs she and a University of Toronto team developed. (Image courtesy of Reuters.)

    The intrepid elementary school student, who grew up in a rural town 100 miles north of Toronto, set out to determine if herbs could kill pathogenic bacteria. She commandeered a piece of raw chicken meant for that night’s dinner, left it on the deck for a few days, and then swabbed it onto Petri dishes. In her basement microbiology laboratory, she piled herbs onto the Petri dishes and put them into a homemade incubator she built with a cooler and electric blanket.

    At Canada’s National Science Fair, Burhanpurkar showcased her incredible results—no bacterial growth meant the herbs must have killed the bacteria. A Science Fair judge quickly, but kindly, pointed out that the bacteria actually died due to suffocation.

    That experience only fueled Burhanpurkar’s desire to conduct more research. She began contacting professors and, while in ninth grade, joined a University of Toronto lab to build an apparatus that can physically detect the time integral of distance. The project earned her a second Grand Platinum award at the National Science Fair.

    Through middle and high school, she built a quantum key distribution system for cryptography at the Institute for Quantum Computing, tracked near-earth asteroids for the Harvard-Smithsonian Center for Astrophysics, and embarked on an expedition to study the impact of climate change on the Canadian and Greenlandic Arctic. The latter project led her to write and produce an award-winning climate change documentary titled “400 PPM.”

    Burhanpurkar at Jakobshavn fjord on the west coast of Greenland. (Photo provided by Maya Burhanpurkar.)

    “Research really drew me in because of the opportunity to answer unanswered questions,” she said. “It is so fascinating that you can make fundamental discoveries about the universe around us.”

    Not even an early acceptance by Harvard could disrupt her focus on research; Burhanpurkar deferred admission to work on the self-driving wheelchair technology during a gap year. Her University of Toronto team sought to develop a hardware and software package that would make it easier for people with severe physical disabilities to use power wheelchairs.

    “People with hand tremors or more severe Parkinson’s disease or ALS really struggle with a joystick or an alternate input device, like a sip-and-puff switch,” she said. “These people often have degraded mobility and a degraded quality of life.”

    Burhanpurkar developed a core part of the software for the semi-autonomous system that is capable of localization, mapping, and obstacle avoidance. The software utilizes off-the-shelf computer vision and odometry sensors rather than expensive 3D laser scanners and high-performance hardware, so it is more cost-effective than other devices, Burhanpurkar said.

    Despite her lack of coding experience, she wrote a specialized path-planning algorithm that enables autonomous doorway detection and traversal simply by placing the wheelchair in front of a door. She also helped develop software for autonomously traveling down long corridors and docking at a desk, tasks that are typically very difficult for users with upper body mobility impairments.
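    To make the doorway-detection idea concrete, here is a deliberately simplified sketch (entirely hypothetical, not the team's algorithm): treat one sweep of a depth sensor as a 1-D array of range readings, find the widest "far" opening in an otherwise nearby wall, and aim for its centre. The function names, thresholds, and sensor model are all invented for this example.

    ```python
    # Hypothetical sketch: detect a doorway as a wide run of "far" readings in a
    # 1-D array of depths (metres), one reading per scan angle, then steer for
    # the centre of that gap. Not the team's actual algorithm.

    def find_doorway(depths, wall_dist=1.0, min_width=8):
        """Return (start, end) indices of the widest run of readings deeper than
        wall_dist (an opening), or None if no run is at least min_width wide."""
        best = None
        run_start = None
        for i, d in enumerate(depths + [0.0]):      # sentinel closes a trailing run
            if d > wall_dist and run_start is None:
                run_start = i
            elif d <= wall_dist and run_start is not None:
                width = i - run_start
                if width >= min_width and (best is None or width > best[1] - best[0]):
                    best = (run_start, i)
                run_start = None
        return best

    def doorway_heading(depths, fov_deg=90.0, **kw):
        """Heading (degrees, 0 = straight ahead) toward the doorway centre."""
        gap = find_doorway(depths, **kw)
        if gap is None:
            return None
        centre = (gap[0] + gap[1] - 1) / 2
        return (centre / (len(depths) - 1) - 0.5) * fov_deg

    # A wall 0.8 m away with an opening (readings of 3 m) dead ahead:
    scan = [0.8] * 20 + [3.0] * 10 + [0.8] * 20
    print(doorway_heading(scan))   # -> 0.0, i.e. drive straight through
    ```

    A real chair would of course work on 2-D laser or depth-camera data, fuse it with odometry, and plan a full collision-free path rather than a single heading, but the gap-finding step captures the core idea.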

    “The challenge was what I enjoyed the most. I got thrown off the deep end in this project and I had to swim my way up, which was really fun,” she said. “It was intellectually interesting, but it was also emotionally interesting. Working on something that can directly impact people’s lives in the near future, not decades away, is really exciting.”

    But that was only half of Burhanpurkar’s gap year. She spent the other half as the youngest paid researcher at the Perimeter Institute for Theoretical Physics (where Stephen Hawking keeps an office), writing super-fast algorithms for a novel telescope in British Columbia. The telescope will continually map the entire northern hemisphere in an effort to learn more about cosmic fast radio bursts.

    Burhanpurkar working on code at the Canadian Hydrogen Intensity Mapping Experiment telescope in British Columbia. (Photo by Richard Bowden.)

    Each day, the planet is bombarded by high-energy, millisecond-duration bursts of radio waves, each carrying the energy of 500 million suns, but scientists remain puzzled about their origins. This new telescope will enable researchers to gather data on thousands of these bursts, opening the door for more detailed analysis.

    Terabytes of astronomical data will be generated each second, so the super-fast algorithms Burhanpurkar and the team wrote are necessary to efficiently process the mass of information.
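    The standard trick behind such searches is worth sketching. A radio burst is "dispersed" by interstellar plasma: lower frequencies arrive later, with a delay proportional to the dispersion measure (DM) times 1/f². "Incoherent dedispersion" undoes those per-channel delays and sums across channels so the burst stacks back into a single time bin. The toy below illustrates only that idea; the real CHIME pipeline handles enormously larger data rates with highly optimized code.

    ```python
    # Illustrative only (not the CHIME/FRB pipeline): incoherent dedispersion.
    # Delay of a channel at f MHz relative to f_ref, in seconds, is roughly
    # K * DM * (f^-2 - f_ref^-2) with K = 4.149e3 MHz^2 s per (pc cm^-3).

    K = 4.149e3

    def delay_samples(f_mhz, f_ref_mhz, dm, dt):
        """Dispersion delay of channel f_mhz relative to f_ref_mhz, in samples."""
        return round(K * dm * (f_mhz ** -2 - f_ref_mhz ** -2) / dt)

    def dedisperse(data, freqs_mhz, dm, dt):
        """Shift each frequency channel back by its delay and sum across channels.
        data[channel][sample] is intensity; returns the summed time series."""
        n = len(data[0])
        out = [0.0] * n
        for chan, f in zip(data, freqs_mhz):
            d = delay_samples(f, max(freqs_mhz), dm, dt)
            for t in range(n):
                if 0 <= t + d < n:
                    out[t] += chan[t + d]
        return out

    # Fake dispersed burst: 4 channels, 64 samples; lower channels lag behind.
    freqs, dt, true_dm, n = [800.0, 700.0, 600.0, 500.0], 0.01, 5.0, 64
    data = [[0.0] * n for _ in freqs]
    for i, f in enumerate(freqs):
        data[i][10 + delay_samples(f, max(freqs), true_dm, dt)] = 1.0

    series = dedisperse(data, freqs, true_dm, dt)
    print(series.index(max(series)), max(series))   # burst realigned at sample 10
    ```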

    “I know that right now my code is running on a new 128-node supercomputer in British Columbia, and it is going to help detect one of the most enigmatic phenomena of the universe,” she said. “That’s pretty cool.”

    Now at Harvard, Burhanpurkar is not planning to slow down. She is interested in continuing her robotics research and working with the i-Lab to bring the cost-effective self-driving wheelchair technology to consumers.

    While she hasn’t decided on a concentration, she is considering computer science and physics (or both), and looks forward to pursuing her passion for research down new avenues.

    “Taking a gap year was great for perspective,” she said. “Before, I wasn’t sure what I wanted to do. I hadn’t really done long-term research projects, but now, I have actual experience and I know what the end goal is. I can use that to motivate me.”

    Burhanpurkar interviewing Canadian author and environmental activist Margaret Atwood for her climate change documentary titled “400 PPM.” (Photo provided by Maya Burhanpurkar.)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Through research and scholarship, the Harvard School of Engineering and Applied Sciences (SEAS) will create collaborative bridges across Harvard and educate the next generation of global leaders. By harnessing the power of engineering and applied sciences we will address the greatest challenges facing our society.

    Specifically, that means that SEAS will provide to all Harvard College students an introduction to and familiarity with engineering and technology as this is essential knowledge in the 21st century.

    Moreover, our concentrators will be immersed in the liberal arts environment and be able to understand the societal context for their problem solving, capable of working seamlessly with others, including those in the arts, the sciences, and the professional schools. They will focus on the fundamental engineering and applied science disciplines for the 21st century, as we will not teach legacy 20th-century engineering disciplines.

    Instead, our curriculum will be rigorous but inviting to students, and be infused with active learning, interdisciplinary research, entrepreneurship and engineering design experiences. For our concentrators and graduate students, we will educate “T-shaped” individuals – with depth in one discipline but capable of working seamlessly with others, including arts, humanities, natural science and social science.

    To address current and future societal challenges, knowledge from fundamental science, art, and the humanities must all be linked through the application of engineering principles with the professions of law, medicine, public policy, design and business practice.

    In other words, solving important issues requires a multidisciplinary approach.

    With the combined strengths of SEAS, the Faculty of Arts and Sciences, and the professional schools, Harvard is ideally positioned to both broadly educate the next generation of leaders who understand the complexities of technology and society and to use its intellectual resources and innovative thinking to meet the challenges of the 21st century.

    Ultimately, we will provide to our graduates a rigorous quantitative liberal arts education that is an excellent launching point for any career and profession.

     
  • richardmitnick 3:19 pm on October 6, 2017 Permalink | Reply
    Tags: , “Primer” - a new cube-shaped robot can be controlled via magnets to make it walk roll sail and glide., , Robotics   

    From MIT: ““Superhero” robot wears different outfits for different tasks” 

    MIT News


    September 27, 2017
    Adam Conner-Simons
    Rachel Gordon

    Dubbed “Primer,” a new cube-shaped robot can be controlled via magnets to make it walk, roll, sail, and glide. It carries out these actions by wearing different exoskeletons, which start out as sheets of plastic that fold into specific shapes when heated. After Primer finishes its task, it can shed its “skin” by immersing itself in water, which dissolves the exoskeleton. Courtesy of the researchers.

    From butterflies that sprout wings to hermit crabs that switch their shells, many animals must adapt their exterior features in order to survive. While humans don’t undergo that kind of metamorphosis, we often try to create functional objects that are similarly adaptive — including our robots.

    Despite what you might have seen in “Transformers” movies, though, today’s robots are still pretty inflexible. Each of their parts usually has a fixed structure and a single defined purpose, making it difficult for them to perform a wide variety of actions.

    Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) are aiming to change that with a new shape-shifting robot that’s something of a superhero: It can transform itself with different “outfits” that allow it to perform different tasks.

    Dubbed “Primer,” the cube-shaped robot can be controlled via magnets to make it walk, roll, sail, and glide. It carries out these actions by wearing different exoskeletons, which start out as sheets of plastic that fold into specific shapes when heated. After Primer finishes its task, it can shed its “skin” by immersing itself in water, which dissolves the exoskeleton.

    “If we want robots to help us do things, it’s not very efficient to have a different one for each task,” says Daniela Rus, CSAIL director and principal investigator on the project. “With this metamorphosis-inspired approach, we can extend the capabilities of a single robot by giving it different ‘accessories’ to use in different situations.”

    Primer’s various forms have a range of advantages. For example, “Wheel-bot” has wheels that allow it to move twice as fast as “Walk-bot.” “Boat-bot” can float on water and carry nearly twice its weight. “Glider-bot” can soar across longer distances, which could be useful for deploying robots or switching environments.

    Primer can even wear multiple outfits at once, like a Russian nesting doll. It can add one exoskeleton to become “Walk-bot,” and then interface with another, larger exoskeleton that allows it to carry objects and move two body lengths per second. To deploy the second exoskeleton, “Walk-bot” steps onto the sheet, which then blankets the bot with its four self-folding arms.

    “Imagine future applications for space exploration, where you could send a single robot with a stack of exoskeletons to Mars,” says postdoc Shuguang Li, one of the co-authors of the study. “The robot could then perform different tasks by wearing different ‘outfits.’”

    The project was led by Rus and Shuhei Miyashita, a former CSAIL postdoc who is now director of the Microrobotics Group at the University of York. Their co-authors include Li and graduate student Steven Guitron. An article about the work appears in the journal Science Robotics on Sept. 27.

    Robot metamorphosis

    Primer builds on several previous projects from Rus’ team, including magnetic blocks that can assemble themselves into different shapes and centimeter-long microrobots that can be precisely customized from sheets of plastic.

    While robots that can change their form or function have been developed at larger sizes, it’s generally been difficult to build such structures at much smaller scales.

    “This work represents an advance over the authors’ previous work in that they have now demonstrated a scheme that allows for the creation of five different functionalities,” says Eric Diller, a microrobotics expert and assistant professor of mechanical engineering at the University of Toronto, who was not involved in the paper. “Previous work at most shifted between only two functionalities, such as ‘open’ or ‘closed’ shapes.”

    The team outlines many potential applications for robots that can perform multiple actions with just a quick costume change. For example, say some equipment needs to be moved across a stream. A single robot with multiple exoskeletons could potentially sail across the stream and then carry objects on the other side.

    “Our approach shows that origami-inspired manufacturing allows us to have robotic components that are versatile, accessible, and reusable,” says Rus, the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT.

    Designed in a matter of hours, the exoskeletons fold into shape after being heated for just a few seconds, suggesting a new approach to rapid fabrication of robots.

    “I could envision devices like these being used in ‘microfactories’ where prefabricated parts and tools would enable a single microrobot to do many complex tasks on demand,” Diller says.

    As a next step, the team plans to explore giving the robots an even wider range of capabilities, from driving through water and burrowing in sand to camouflaging their color. Guitron pictures a future robotics community that shares open-source designs for parts much the way 3-D-printing enthusiasts trade ideas on sites such as Thingiverse.

    “I can imagine one day being able to customize robots with different arms and appendages,” says Rus. “Why update a whole robot when you can just update one part of it?”

    This project was supported, in part, by the National Science Foundation.

    See the full article here.


    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 5:16 am on July 19, 2017 Permalink | Reply
    Tags: , , CSIRO blogs, Robotics   

    From CSIRO blog: “Legged robots walk the walk” 


    CSIRO blog

    19th July 2017
    Eliza Keck

    In an emergency, first responders often have to make a very tough call: can I enter the area safely or is it too dangerous? It’s the most extreme risk vs reward analysis anyone could ever face, and the call is often made in mere moments and with very little information. In the future, this decision will hopefully be much easier with the help of some six legged robots: hexapods. Creating robots that can go into an unpredictable, unstable environment and help people escape it would be a literal life-saver.

    You know how people are always talking about how robots are going to steal our jobs and take over the world? Well this is one job we wouldn’t mind them taking.

    Wheel what have we here?

    There are some pretty amazing robots on wheels. Case in point: NASA’s Curiosity.

    NASA/Mars Curiosity Rover

    Wheels are great for moving fast, they’re stable and they are easy to build. So why the obsession with legs? Well, wheels have their drawbacks: they can’t go side to side (well, most of them can’t!), they can’t cross over gaps, can’t climb over obstacles and they’re basically turtles; flip them on their back and they’re useless. Legs are the answer. So why haven’t we done it already? Because legs are significantly more complicated.

    Balancing act

    Humans have been trying to create humanoid robots for centuries. But being able to walk on two legs is a significant achievement that took us millions of years to perfect. To simply balance, many complex body systems work together (and even compensate for each other when required). There’s our vestibulo-ocular reflex (our eyes and inner-ear working together), our nervous system and the body’s sense of where it is in space: proprioception. We also have baroreceptors, sensors in our blood vessels that sense blood pressure (like a barometer and air pressure), that tell our heart to pump blood faster when we stand so we don’t faint.

    When designing a robot, scientists have to decide what kind of stability it will use: dynamic or static. As its name suggests, a statically stable robot will be stable when standing still. Basically, any robot with three legs or more can do this without trying. Dynamically stable robots are stable when moving (think about how much easier it is to hop on one leg than to stand still on one leg). Obviously, a dynamically stable robot is much harder to control and significantly more complex; however, it is also more energy efficient and faster. Most scientists are working to create something that is the best of both worlds. For us, we’re doing this with hexapods.

    Model of a humanoid robot based on drawings by Leonardo da Vinci. Photo by Erik Möller.

    The invention of sensors like accelerometers and gyroscopes has helped scientists take the next *step* forwards in balance and stability, but that’s only the start of the many complex problems scientists have to solve before our robot dreams can turn into reality.

    Casing the joint

    Do you enjoy scrambling around rock pools at the beach? Ever notice how you moved when climbing? It wasn’t the same as if you were walking on the footpath was it? You slow down, use your hands for stability and test the movement of each rock before committing all your weight on it. Your joints play a vital role in stability in rough terrain. Toes, ankles, knees, hips and your back all make minute and major adjustments to keep you stable.

    Having flexible legs with multiple joints improves stability on rough terrain. Our first hexapod models had three ‘joints’ per leg. They were fantastic at walking on a flat surface, but as soon as they encountered a steep hill they lost their grip.

    Our latest hexapod models have two extra ‘joints’ per leg and can now tackle up to 50 per cent inclines. This is because they can widen their stance, creating a larger support polygon and shifting their centre of gravity to be within this polygon.
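    The support-polygon idea can be made concrete with a small sketch (illustrative only, not CSIRO's code): project the centre of gravity onto the ground plane and test whether it falls inside the convex polygon formed by the feet in contact. A wider stance makes that polygon bigger, so a tilted or shifted centre of gravity is more likely to stay inside it.

    ```python
    # Hypothetical illustration: a robot is statically stable when the vertical
    # projection of its centre of gravity (CoG) lies inside the support polygon,
    # here the convex polygon of ground-contact feet listed counter-clockwise.

    def inside_support_polygon(cog, feet):
        """True if 2-D point cog lies inside the convex polygon 'feet'
        (vertices counter-clockwise), via cross-product sign tests."""
        x, y = cog
        for (x1, y1), (x2, y2) in zip(feet, feet[1:] + feet[:1]):
            # cross product of the edge vector with the vector to the point
            if (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1) < 0:
                return False    # point is to the right of this edge: outside
        return True

    # A narrow stance vs the widened stance the extra joints allow:
    narrow = [(-0.1, -0.1), (0.1, -0.1), (0.1, 0.1), (-0.1, 0.1)]
    wide   = [(-0.3, -0.3), (0.3, -0.3), (0.3, 0.3), (-0.3, 0.3)]
    cog_on_slope = (0.2, 0.0)   # on an incline the projected CoG shifts downhill

    print(inside_support_polygon(cog_on_slope, narrow))  # False: would tip over
    print(inside_support_polygon(cog_on_slope, wide))    # True: stable stance
    ```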

    How useful are diagrams when trying to understand support polygons!? Credit: Stability During Arboreal Locomotion; Andrew Lammers, and Ulrich Zurcher, Cleveland State University, USA.

    Walk this way

    The gait (walking style) of the robot plays a big role in how fast and efficient it will be. When deciding which gait a robot uses you’ve got two options: fast, efficient but unstable or slow, stable, but inefficient. Neither option would work in real-life. So what’s the solution? The robot needs to be able to change how it moves depending on the situation. This is called ‘dynamic movement.’ Our hexapods constantly test the surface and will automatically change their gait and speed to stay stable and energy efficient.
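    A toy version of that gait-selection logic might look like the following. The gait names come from this article; the numbers and the decision rule are invented for illustration: pick the fastest gait whose stability requirement the sensed surface still meets.

    ```python
    # Assumed sketch, not CSIRO's controller: choose the fastest gait that the
    # currently sensed terrain can support.

    # name -> (relative speed, required surface "flatness" in [0, 1]):
    # the tripod gait is fast but needs flat ground; the wave gait keeps five
    # feet down at a time, so it tolerates rough or slippery terrain.
    GAITS = {
        "alternating_tripod": (1.0, 0.8),
        "ripple":             (0.6, 0.5),
        "wave":               (0.3, 0.0),
    }

    def choose_gait(flatness):
        """flatness in [0, 1]: 1 = smooth lab floor, 0 = steep, loose rubble.
        Return the fastest gait the current surface supports."""
        usable = [(speed, name) for name, (speed, need) in GAITS.items()
                  if flatness >= need]
        return max(usable)[1]   # wave's requirement is 0, so usable is never empty

    print(choose_gait(0.9))   # smooth floor -> alternating_tripod
    print(choose_gait(0.2))   # loose rubble -> wave
    ```

    A real controller would estimate "flatness" continuously from foot-force and elevation data and also modulate speed within a gait, but the same pick-the-fastest-safe-option structure applies.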

    Our legged robots have got all the right moves, click here to learn more about them.

    Getting around is no easy feat, unless you have six of them

    When disaster strikes, who’s first on the scene?

    Emergency response teams often need to enter dangerous or confined spaces. But accessing unknown or unstable areas involves risk.

    Our legged robots are designed to go where no other robot or human can easily access – for example, a collapsed building. These nifty bots are able to safely explore and assess dangerous areas, such as when looking for survivors before sending in rescue teams.

    Introducing the legged robots

    Our hexapods are modelled off insects with the same number and configuration of legs, like ants and cockroaches. The hexapods are programmed with different gaits inspired by their natural counterparts.



    One of the most popular gaits, inspired by running ants and cockroaches, is called the “alternating tripod gait”. The “wave gait”, closer to a caterpillar’s pattern, is slower but more stable. It’s much more useful when navigating sloped or slippery terrain.

    One of our hexapods, Weaver, has five joints on each of its six legs, enabling it to move freely and negotiate uneven terrain easily.

    It is also fitted with a pair of stereo cameras, allowing it to create a digital elevation map of an area, and detect any physical obstacles in its path. Thanks to sensors in each of its leg joints, this nifty insect-like bot can measure the forces felt at its foot tips. When each foot touches the ground, it feeds this information on the ground conditions back through a sequence of algorithms.

    In combination with its elevation map, the hexapod can interpret the stability of the surface and then adjust the stiffness of its legs as it travels. This allows the legged robot to avoid getting stuck or losing balance, by adjusting the flexibility of its leg joints depending on the roughness of the terrain.
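    The feedback loop described here can be caricatured in a few lines (an assumed sketch, not the published controller): when the measured foot-tip force runs above a target, soften the leg; when it runs below, stiffen it; clamp the result to a safe range.

    ```python
    # Hedged sketch of force-based stiffness adaptation (assumed, not CSIRO's
    # algorithm): a simple proportional update per control step, per leg.

    def update_stiffness(stiffness, force, target_force, gain=0.1,
                         lo=0.2, hi=1.0):
        """One control step: nudge normalized joint stiffness so the measured
        foot-tip force tracks target_force, clamped to [lo, hi]."""
        stiffness -= gain * (force - target_force)
        return max(lo, min(hi, stiffness))

    k = 1.0                                   # start stiff, as on flat ground
    for measured in [5.0, 4.0, 3.2, 3.0]:     # force spikes on rock, then settles
        k = update_stiffness(k, measured, target_force=3.0)
    print(round(k, 2))                        # leg has softened to about 0.68
    ```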

    Getting into those hard to reach spaces

    Hexapods: Legged Robots

    See the full article here.


    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

    The CSIRO blog is designed to entertain, inform and inspire by generally digging around in the work being done by our terrific scientists, and leaving the techie speak and jargon for the experts.

    We aim to bring you stories from across the vast breadth and depth of our organisation: from the wild sea voyages of our Research Vessel Investigator to the mind-blowing astronomy of our Space teams, right through all the different ways our scientists solve national challenges in areas as diverse as Health, Farming, Tech, Manufacturing, Energy, Oceans, and our Environment.

    If you have any questions about anything you find on our blog, we’d love to hear from you. You can reach us at socialmedia@csiro.au.

    And if you’d like to find out more about us, our science, or how to work with us, head over to CSIRO.au

     
  • richardmitnick 7:58 am on July 14, 2017 Permalink | Reply
    Tags: , Biomechatronics, , Developing designs for exoskeletons and prosthetic limbs, New software algorithms, Robotics   

    From Carnegie Mellon: “Carnegie Mellon Develops Landmark Achievement in Walking Technology” 

    Carnegie Mellon University

    July 11, 2017
    Lisa Kulick
    lkulick@andrew.cmu.edu

    Researchers in Carnegie Mellon University’s College of Engineering are using feedback from the human body to develop designs for exoskeletons and prosthetic limbs.

    Published in Science, their technique, called human-in-the-loop optimization, customizes walking assistance for individuals and significantly lessens the amount of energy needed when walking. The algorithm that enables this optimization represents a landmark achievement in the field of biomechatronics.

    “Existing exoskeleton devices, despite their potential, have not improved walking performance as much as we think they should,” said Steven Collins, a professor of mechanical engineering. “We’ve seen improvements related to computing, hardware and sensors, but the biggest challenge has remained the human element — we just haven’t been able to guess how they will respond to new devices.”

    The software algorithm is combined with versatile emulator hardware that automatically identifies optimal assistance strategies for individuals.

    During experiments, each user received a unique pattern of assistance from an exoskeleton worn on one ankle. The algorithm tested responses to 32 patterns over the course of an hour, making adjustments based on measurements of the user’s energy use with each pattern. The optimized assistance pattern produced larger benefits than any exoskeleton to date, including devices acting at all joints on both legs.
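    In spirit, human-in-the-loop optimization is a black-box search in which the objective function is a live measurement of the wearer's energy use. The sketch below is assumed, not the published algorithm: it proposes assistance patterns, "measures" a cost for each, and narrows the search around the best one so far, with a made-up quadratic standing in for the respirometry data.

    ```python
    # Simplified human-in-the-loop search (assumed, not the published method).
    import random

    def metabolic_cost(peak_torque, peak_time):
        """Stand-in for respirometry: in reality this number comes from the
        person wearing the emulator; here it is an invented quadratic bowl."""
        return (peak_torque - 0.6) ** 2 + (peak_time - 0.5) ** 2 + 1.0

    def optimize(n_patterns=32, seed=0):
        """Evaluate n_patterns assistance patterns (each ~2 minutes of walking
        in the real experiment), keeping and refining the best so far."""
        rng = random.Random(seed)
        best = ((0.0, 0.0), metabolic_cost(0.0, 0.0))   # (pattern, cost)
        spread = 0.5
        for _ in range(n_patterns):
            cand = tuple(min(1.0, max(0.0, b + rng.gauss(0, spread)))
                         for b in best[0])
            cost = metabolic_cost(*cand)
            if cost < best[1]:
                best = (cand, cost)
            spread *= 0.9                               # narrow the search
        return best

    pattern, cost = optimize()
    print(round(cost, 3))   # lower than the unassisted baseline of 1.61
    ```

    The real experiment measured energy use from breath-by-breath gas exchange and used a more principled optimizer, but the propose-measure-refine loop is the essence of the approach.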

    “When we walk, we naturally optimize coordination patterns for energy efficiency,” Collins said. “Human-in-the-loop optimization acts in a similar way to optimize the assistance provided by wearable devices. We are really excited about this approach because we think it will dramatically improve energy economy, speed and balance for millions of people, especially those with disabilities.”

    See the full article here.


    Carnegie Mellon Campus

    Carnegie Mellon University (CMU) is a global research university with more than 12,000 students, 95,000 alumni, and 5,000 faculty and staff.
    CMU has been a birthplace of innovation since its founding in 1900.
    Today, we are a global leader bringing groundbreaking ideas to market and creating successful startup businesses.
    Our award-winning faculty members are renowned for working closely with students to solve major scientific, technological and societal challenges. We put a strong emphasis on creating things—from art to robots. Our students are recruited by some of the world’s most innovative companies.
    We have campuses in Pittsburgh, Qatar and Silicon Valley, and degree-granting programs around the world, including Africa, Asia, Australia, Europe and Latin America.

     
  • richardmitnick 6:53 am on June 22, 2017 Permalink | Reply
    Tags: , , Robotics, Surgeons Want Robots - if They Know They Will Help Cut Down Their Human Errors   

    From Science Alert: “Surgeons Want Robots – if They Know They Will Help Cut Down Their Human Errors” 


    Science Alert

    22 JUN 2017
    Anjali Jaiprakash
    Advance Queensland Fellow, Medical Robotics, Queensland University of Technology

    Jonathan Roberts
    Professor in Robotics, Queensland University of Technology

    Ross Crawford
    Professor of Orthopaedic Research, Queensland University of Technology

    QUT, The Conversation

    How good are humans at performing manual surgery? Major surgical errors must be reported, and there has been research into surgeons’ attitudes toward reporting such errors.

    But there is no requirement or legislation in place to report minor unintentional damage, and how that is even defined is a grey area.

    Very little research exists into the frequency of unintentional surgical damage, the challenges that cause this damage, or understanding of the long-term effects.

    We are developing semi-autonomous robotic tools to help surgeons, especially for knee surgery. It’s estimated that around 4 million knee arthroscopies are performed each year worldwide.

    In our recent study, some surgeons said they found that such knee procedures could be physically challenging and could cause unintentional damage to their patients.

    But a majority said they would be prepared to use robotic tools if they could be shown to help in the surgery and reduce the risks of injury to patients.

    Unintentional damage in surgery

    Osteoarthritis is by far the leading cause of pain in joints, especially knees.

    Following X-ray and MRI scans, the first line of minimally invasive diagnostic and treatment procedures is known as knee arthroscopy. It is a procedure in which a surgeon slides a camera and a range of instruments into the joint through small incisions.

    This procedure is somewhat controversial as the evidence of its effectiveness for some patients has been questioned. But it is still one of the most common surgical procedures carried out in the world.

    With our colleagues, we asked 93 surgeons in Australia with a range of experience how often they observed unintentional damage occurring during a knee arthroscopy. The survey was anonymous and the results were published earlier this year in the Journal of Orthopaedic Surgery.

    Half the surgeons (49.5 percent) said unintentional damage to articular cartilage, the tissue that covers the ends of the bones that make up your joints, occurred in at least one in ten procedures.

    A third (34.4 percent) of them said the damage rate was at least one in five procedures. Incredibly, seven of the surgeons (7.5 percent) said such damage occurred in every procedure carried out.

    Damage to cartilage is probably one of the causes of osteoarthritis and your body does not repair cartilage if damaged, which can then result in knee pain.

    So patients who suffer unintentional cartilage damage during an arthroscopy have an additional risk of developing osteoarthritis. This is somewhat ironic, given that the motivation for many arthroscopic procedures is to try to treat osteoarthritis.

    A pain for the surgeon

    Knee arthroscopy is considered straightforward, and a skilled surgeon will make it look easy. But it is actually very difficult and requires considerable skill and experience.

    During the procedure, the leg must be manipulated to create the space for the camera and the tools. This means that the surgeon has to continually lift and hold the leg, while at the same time hold the camera and the tools and operate by looking at the video on a screen.

    We asked the surgeons whether they found knee arthroscopy to be physically challenging, and whether they had experienced pain themselves after performing this surgery.

    Nearly 59 percent reported they found the procedure to be physically challenging, and more than a fifth (22.6 percent) said they had experienced physical pain afterwards. It is in the interests of patients that their surgeons remain in good health.

    Robots to the rescue

    So how can we reduce the risk of any unintentional damage during knee arthroscopy surgery and make the procedure less challenging for the surgeon?

    At the moment there are no robotically assisted technologies used in knee arthroscopy. All the surgery is performed manually.

    Our current research focuses on how robots can be used by surgeons to improve patient and surgeon safety, to reduce the need for future medical treatment, and to lower the costs of healthcare.

    We are exploring how robots can be used to hold and move the leg during a knee arthroscopy, freeing the surgeon to focus on observing the interior of the knee.

    We are also developing new types of flexible robots and tiny stereo cameras to replace existing arthroscopes; these will feed into robotic vision systems that map the 3D structure of the knee.

    These 3D knee maps will be used by other tool holding robots to avoid colliding with the cartilage.

    Our aim is to give surgeons semi-autonomous robotic tools so they can concentrate on what they are best at: deciding what is wrong with the patient and how to treat it.

    About a third (32.3 percent) of surgeons we surveyed said they were nervous about the introduction of any semi-autonomous arthroscopic systems.

    But about three-quarters (76.3 percent) said they would use a robotic assist system if it improved the efficiency of the procedure, and 86 percent said they would use a robot if it decreased the rate of unintentional damage to cartilage.

    Overall, 47.3 percent of the surgeons said they saw a future role for semi-autonomous arthroscopic systems.

    All surgeons will tell you that surgery carries a risk. As a patient, you must balance the benefits of a given surgery against those risks.

    Future upgrades to their toolkit in the form of robotic manipulators, scopes and tools, will hopefully allow surgeons to reduce the risks for both the patients and themselves.

    See the full article here.


     
  • richardmitnick 5:02 pm on May 19, 2017 Permalink | Reply
    Tags: 3D-printed Soft Four Legged Robot Can Walk on Sand and Stone, , , Robotics,   

    From UCSD: “3D-printed Soft Four Legged Robot Can Walk on Sand and Stone” 


    UC San Diego

    May 17, 2017
    Ioana Patringenaru

    UC San Diego Jacobs School of Engineering mechanical engineering graduate student Dylan Drotman from the Tolley Lab with the 3D-printed, four-legged robot being presented at the 2017 IEEE International Conference on Robotics and Automation (ICRA). The entire photo set is on Flickr. Photo credit: UC San Diego Jacobs School of Engineering / David Baillot

    Engineers at the University of California San Diego have developed the first soft robot that is capable of walking on rough surfaces, such as sand and pebbles. The 3D-printed, four-legged robot can climb over obstacles and walk on different terrains.

    Researchers led by Michael Tolley, a mechanical engineering professor at the University of California San Diego, will present the robot at the IEEE International Conference on Robotics and Automation from May 29 to June 3 in Singapore. The robot could be used to capture sensor readings in dangerous environments or for search and rescue.

    The breakthrough was possible thanks to a high-end printer that allowed researchers to print soft and rigid materials together within the same components. This made it possible for researchers to design more complex shapes for the robot’s legs.

    Bringing together soft and rigid materials will help create a new generation of fast, agile robots that are more adaptable than their predecessors and can safely work side by side with humans, said Tolley. The idea of blending soft and hard materials into the robot’s body came from nature, he added. “In nature, complexity has a very low cost,” Tolley said. “Using new manufacturing techniques like 3D printing, we’re trying to translate this to robotics.”

    3D printing soft and rigid robots, rather than relying on molds to manufacture them, is much cheaper and faster, Tolley pointed out. So far, soft robots have only been able to shuffle or crawl on the ground without being able to lift their legs. This robot is actually able to walk.

    Researchers successfully tested the tethered robot on large rocks, inclined surfaces and sand (see video). The robot also was able to transition from walking to crawling into an increasingly confined space, much like a cat wiggling into a crawl space.

    Dylan Drotman, a Ph.D. student at the Jacobs School of Engineering at UC San Diego, led the effort to design the legs and the robot’s control systems. He also developed models to predict how the robot would move, which he then compared to how the robot actually behaved in a real-life environment.

    How it’s made

    The legs are made up of three parallel, connected sealed inflatable chambers, or actuators, 3D-printed from a rubber-like material. The chambers are hollow on the inside, so they can be inflated. On the outside, the chambers are bellowed, which allows engineers to better control the legs’ movements. For example, when one chamber is inflated and the other two aren’t, the leg bends. The legs are laid out in the shape of an X and connected to a rigid body.
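The chamber arrangement described above can be captured in a toy geometric model. This sketch is hypothetical and not taken from the paper: the 120-degree chamber spacing and the sign convention (the leg bending away from the inflated side) are assumptions made for illustration.

```python
import math

# Hypothetical model: three parallel sealed chambers spaced 120 degrees
# apart around the leg's axis. Inflating a subset bends the leg away from
# the inflated side, so the bend direction is the negated sum of the
# inflated chambers' radial unit vectors.
CHAMBER_ANGLES_DEG = [0.0, 120.0, 240.0]  # one angle per chamber

def bend_direction(inflated):
    """Return a unit (x, y) bend direction for a set of inflated chamber indices."""
    x = -sum(math.cos(math.radians(CHAMBER_ANGLES_DEG[i])) for i in inflated)
    y = -sum(math.sin(math.radians(CHAMBER_ANGLES_DEG[i])) for i in inflated)
    norm = math.hypot(x, y)
    if norm < 1e-9:  # no chambers inflated, or all three: leg extends straight
        return (0.0, 0.0)
    return (x / norm, y / norm)
```

Inflating one chamber bends the leg directly away from it; inflating all three (or none) extends the leg straight, matching the behavior described in the paragraph above.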

    The robot’s gait depends on the timing, the amount of pressure, and the order in which the chambers in its four legs are inflated. The robot’s walking behavior in real life also closely matched the researchers’ predictions. This will allow engineers to make better-informed decisions when designing soft robots.
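As a rough illustration of how such a gait could be specified, here is a hypothetical inflation schedule for the four legs. The leg pairing, pressure values, and phase durations are invented for the example and are not taken from the paper.

```python
# Hypothetical gait table for an X-layout quadruped (legs indexed 0-3).
# Each phase names the legs receiving inflation pressure; changing the
# phase order, pressures, or durations changes the resulting gait.
TROT_GAIT = [
    {"inflate": [0, 3], "pressure_kpa": 55, "duration_s": 0.4},  # diagonal pair A
    {"inflate": [1, 2], "pressure_kpa": 55, "duration_s": 0.4},  # diagonal pair B
]

def gait_schedule(gait, n_cycles):
    """Yield (time_s, legs_to_inflate, pressure_kpa) commands for n_cycles of the gait."""
    t = 0.0
    for _ in range(n_cycles):
        for phase in gait:
            yield (round(t, 3), phase["inflate"], phase["pressure_kpa"])
            t += phase["duration_s"]
```

A controller consuming this schedule would open the pump valves for the listed legs at each timestamp, which is one simple way to realize the timing-and-order dependence the article describes.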

    The current quadruped robot prototype is tethered to an open source board and an air pump. Researchers are now working on miniaturizing both the board and the pump so that the robot can walk independently. The challenge here is to find the right design for the board and the right components, such as power sources and batteries, Tolley said.

    3D Printed Soft Actuators for a Legged Robot Capable of Navigating Unstructured Terrain

    Authors: Dylan Drotman, Saurabh Jadhav, Mahmood Karimi, Philip deZonia, Michael T. Tolley

    This work is supported by the UC San Diego Frontiers of Innovation Scholarship Program and the Office of Naval Research grant number N000141712062.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UC San Diego Campus

    The University of California, San Diego (also referred to as UC San Diego or UCSD), is a public research university located in the La Jolla area of San Diego, California, in the United States. The university occupies 2,141 acres (866 ha) near the coast of the Pacific Ocean, with the main campus resting on approximately 1,152 acres (466 ha). Established in 1960 near the pre-existing Scripps Institution of Oceanography, UC San Diego is the seventh oldest of the 10 University of California campuses and offers over 200 undergraduate and graduate degree programs, enrolling about 22,700 undergraduate and 6,300 graduate students. UC San Diego is one of America’s Public Ivy universities, which recognizes top public research universities in the United States. UC San Diego was ranked 8th among public universities and 37th among all universities in the United States, and rated the 18th Top World University by U.S. News & World Report’s 2015 rankings.

     