Tagged: Robotics

  • richardmitnick 9:19 am on July 5, 2019 Permalink | Reply
    Tags: Robotics, SpaceBok robot

    From European Space Agency: “Jumping space robot ‘flies’ like a spacecraft” 

    From European Space Agency

    SpaceBok jumping in simulated lunar gravity

    4 July 2019

    Astronauts on the Moon found themselves hopping around, rather than simply walking. Switzerland’s SpaceBok planetary exploration robot has followed their example, launching all four legs off the ground during tests at ESA’s technical heart.

    SpaceBok is a quadruped robot designed and built by a Swiss student team from ETH Zurich and ZHAW Zurich. It is currently being tested using robotic facilities at ESA’s ESTEC technical centre in the Netherlands.

    Work is proceeding under the leadership of PhD student Hendrik Kolvenbach from ETH Zurich’s Robotic Systems Lab, currently based at ESTEC. The robot is being used to investigate the potential of ‘dynamic walking’ to get around in low gravity environments.

    Hendrik explains: “Instead of static walking, where at least three legs stay on the ground at all times, dynamic walking allows for gaits with full flight phases during which all legs stay off the ground. Animals make use of dynamic gaits due to their efficiency, but until recently, the computational power and algorithms required for control made it challenging to realise them on robots.

    “For the lower gravity environments of the Moon, Mars or asteroids, jumping off the ground like this turns out to be a very efficient way to get around.”

    Hendrik Kolvenbach with SpaceBok.

    Simulating low-gravity conditions

    “Astronauts moving in the one-sixth gravity of the Moon adopted jumping instinctively. SpaceBok could potentially go up to 2 m high in lunar gravity, although such a height poses new challenges. Once it comes off the ground the legged robot needs to stabilise itself to come down again safely – it’s basically behaving like a mini-spacecraft at this point,” says team member Alexander Dietsche.

    “So what we’ve done is harness one of the methods a conventional satellite uses to control its orientation, called a reaction wheel. It can be accelerated and decelerated to trigger an equal and opposite reaction in SpaceBok itself,” explains team member Philip Arm.
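
    The principle is the same one satellites rely on. Below is a minimal, single-axis sketch of that reaction-wheel idea: the motor torques the wheel, the body feels the equal and opposite torque, and a simple PD law drives the body’s attitude error toward zero during the flight phase. The inertias, gains and flight time are illustrative assumptions, not SpaceBok’s actual parameters or controller.

    ```python
    import numpy as np

    # Minimal single-axis sketch of reaction-wheel attitude control during a jump.
    # All values (inertias, gains, flight time) are illustrative, not SpaceBok's.
    I_body = 0.5      # body moment of inertia about the pitch axis [kg m^2]
    I_wheel = 0.01    # reaction wheel moment of inertia [kg m^2]
    dt = 0.001        # integration step [s]

    theta = np.deg2rad(40.0)   # body pitched 40 degrees after take-off
    omega = 0.0                # body angular rate [rad/s]
    wheel_speed = 0.0          # reaction wheel rate [rad/s]
    Kp, Kd = 20.0, 6.0         # hand-tuned PD gains

    for _ in range(int(0.8 / dt)):        # a short flight phase (~0.8 s)
        # Torque we want on the body: drive attitude error and body rate to zero
        tau_body = -Kp * theta - Kd * omega
        # The motor applies the opposite torque to the wheel (action = reaction),
        # so the total angular momentum of body + wheel stays constant.
        tau_wheel = -tau_body
        omega += (tau_body / I_body) * dt
        theta += omega * dt
        wheel_speed += (tau_wheel / I_wheel) * dt

    print(f"residual pitch error: {np.degrees(theta):.2f} deg, "
          f"wheel rate: {wheel_speed:.1f} rad/s")
    ```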

    “Additionally, SpaceBok’s legs incorporate springs to store energy during landing and release it at take-off, significantly reducing the energy needed to achieve those jumps,” adds another team member, Benjamin Sun.

    The team is slowly increasing the height of the robot’s repetitive jumps, up to 1.3 m in simulated lunar gravity conditions so far.
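
    A quick back-of-envelope calculation gives a feel for the energetics of such a jump. The jump height and lunar gravity come from the article; the robot mass and the fraction of energy the springs recover are assumed values for illustration only.

    ```python
    # Back-of-envelope energetics of a vertical jump in lunar gravity.
    # Robot mass and spring recovery fraction are assumed, illustrative values.
    g_moon = 1.62          # lunar surface gravity [m/s^2]
    mass = 20.0            # assumed robot mass [kg]
    jump_height = 1.3      # height reached so far in the simulated tests [m]

    energy_per_jump = mass * g_moon * jump_height        # potential energy at apex [J]
    takeoff_speed = (2.0 * g_moon * jump_height) ** 0.5  # v = sqrt(2 g h) [m/s]

    spring_recovery = 0.6  # assumed fraction of landing energy returned by the leg springs
    motor_energy = energy_per_jump * (1.0 - spring_recovery)

    print(f"energy per jump: {energy_per_jump:.1f} J, take-off speed: {takeoff_speed:.2f} m/s")
    print(f"motor top-up per jump with springs: {motor_energy:.1f} J")
    ```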

    Test rigs have been set up to simulate various gravity environments, mimicking not only lunar conditions but also the very low gravities of asteroids. The lower the gravity the longer the flight phase can be for each robot jump, but effective control is needed for both take-off and landing.

    To simulate the vanishingly low gravity of asteroids, the SpaceBok team made use of the flattest floor in the Netherlands – a 4.8 x 9 m epoxy floor smoothed to an overall flatness within 0.8 mm, called the Orbital Robotics Bench for Integrated Technology (ORBIT), part of ESA’s Orbital Robotics and Guidance Navigation and Control Laboratory.

    Robot mounted sideways

    SpaceBok was placed on its side, then attached to a free-floating platform to reproduce zero-G conditions in two dimensions. When jumping off a wall, its reaction wheel allowed it to twirl around mid-jump, letting it land feet first again on the other side of the chamber – as if it were hopping across a scaled-down, low-gravity surface.

    Hendrik added: “The testing went sufficiently well that we even used SpaceBok to play a live-action game of Pong, the video game classic.”

    SpaceBok robot

    Testing will continue in more realistic conditions, with jumps made over obstacles, hilly terrain, and realistic soil, eventually moving outdoors.

    Hendrik is studying at ESTEC through ESA’s Networking Partnering Initiative, intended to harness advanced academic research for space applications.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 22 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through participation in the International Space Station program; the launch and operation of unmanned exploration missions to other planets and the Moon; Earth observation, science and telecommunication; maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana; and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands; Earth observation missions at ESRIN in Frascati, Italy; ESA Mission Control (ESOC) is in Darmstadt, Germany; the European Astronaut Centre (EAC), which trains astronauts for future missions, is situated in Cologne, Germany; and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

  • richardmitnick 9:34 am on May 5, 2019 Permalink | Reply
    Tags: "America's infrastructure is like a third-world country" said Ray LaHood transportation secretary under President Obama., But the next generation of these machines it seems clear will gain more autonomy and machine learning technologies, In short the infrastructure robots are coming; in fact some of them are already here., Infrastructure spending, Robotics, Robots and artificial intelligence can help us build the infrastructure we need here and around the world., The early infrastructure robots don't use much AI.

    From WIRED: “Spend Part of the $2 Trillion Infrastructure Plan on Robots” 

    From WIRED

    Gretchen Greene

    Alexis Rosenfeld/Getty Images

    This week, the Democrats and President Trump are talking about a $2 trillion infrastructure plan, a number in line with the American Society of Civil Engineers’ estimates of the country’s infrastructure needs, but it isn’t clear where the money will come from or if a bipartisan plan will actually move forward.

    The ASCE’s 2017 report card gave America’s infrastructure a D+ with scant progress these last 20 years. “America’s infrastructure is like a third-world country,” said Ray LaHood, transportation secretary under President Obama. If we don’t make a major infrastructure investment, our enormous infrastructure needs will just keep growing. We need good new ideas to make the most of whatever money is approved by the federal government or local governments.

    New technologies are threatening jobs but they also offer the possibility of completing projects we otherwise couldn’t afford, minimizing disruption, improving safety and optimizing systems in ways humans working alone could not. Robots and artificial intelligence can help us build the infrastructure we need, here and around the world. In short, the infrastructure robots are coming; in fact, some of them are already here.

    In Minnesota, spider-like bridge inspection drones crawl along high abutments and into narrow gaps while hovering drones inspect the undersides of bridge decks. Their access is better, cheaper, and safer with less disruption of traffic. Gas pipe repair robots allow utility crews in Boston, NYC, and Edinburgh, Scotland, to finish a job in a third of the time, without digging up the street at every joint or interrupting service because the robots can safely work inside pressurized lines. In Saudi Arabia and Mexico, water pipe inspection robots are inserted in one fire hydrant, carried by the water flow and captured with a net at another fire hydrant down the line, reporting the locations of leaks a tenth to a third the size old methods could find. In Connecticut, drones are replacing low flying helicopters for power line inspections.

    In Oslo, Norway, submarine drones are mapping the landscape of underwater garbage – old tires and toys, plastic bags and the carcasses of abandoned cars – so that boats with cranes and human divers can be deployed to clean up the fjords.

    In Fukushima, Japan, engineers have embarked on a half-century project that one expert called more challenging than putting a man on the moon: designing and building robots that can operate in an extremely challenging environment to find, recover, and seal the lost radioactive fuel, in the biggest nuclear plant disaster cleanup effort in history.

    The early infrastructure robots don’t use much AI. They are remotely controlled or tethered, relaying video to human operators to interpret, carrying tools a human operator can use from a distance, and relying on a human operator to tell them where to go. Their genius lies in their ability to squeeze into small spaces, levitate in the sky, or dive into the water and survive in harsh environments, going places humans can’t go easily, safely, cheaply, or at all.

    But the next generation of these machines, it seems clear, will gain more autonomy, adopting computer vision, autonomous vehicle navigation and machine learning technologies. Semi-autonomous drones and robots are in testing and early commercial deployment for inspection of industrial assets, land surveying and sidewalk snow clearing.

    It’s not a big step to imagine robots creeping through the gas and water lines all day and night, mapping their own course, quietly fixing leaks, docking at charging and maintenance stations as needed, like a Roomba underground. Above ground robots could patrol the roads, the power grid and the waterways, cleaning up trash and fixing potholes, electrical wires and bridges or reporting what they can’t fix, directing a human crew to the spot.

    Machine learning software systems are learning to predict code violations, safety incidents, mechanical failures and natural disasters, directing robotic or human resources to intervene. They are being used for fire code, health code and industrial safety inspection prioritization in Pittsburgh, New York, New Orleans, Boston, Chicago and British Columbia, Canada. Rolls Royce is testing machine learning to predict engine failures. The oil and gas industry is automating the detection of serious pipeline corrosion, adding machine learning to the pipeline robot pigs it has used for decades. British Columbia is trying to predict elevator problems. Pittsburgh is trying to predict landslides on roads.

    Robots, sensors and machine learning are being used to direct water to where we want it before it ever hits a pipeline and to reduce pollution. Tech startups in Boston and San Francisco are using sensors and machine learning to create hyperlocal air quality and weather data and predictions. In crowded industrial cities in Guangzhou, China, pollution-detecting airborne drones help law enforcement identify which factory should be punished for emissions. China has used chemical carrying drones to disperse smog and to make rain and is considering the creation of a vast network of fuel burning chambers, planes, drones and artillery, guided by real-time data from satellites, to seed clouds over the Tibetan plateau, the source of most of Asia’s biggest rivers, an area three times the size of Spain.

    Advances in robotics, hardware and artificial intelligence have combined to make a new vision possible for how infrastructure maintenance and repair is carried out. More importantly, they offer a vision for how we might be able to afford to do the work we can’t put off forever.

    There’s a rising army of robots, ready to serve.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 1:19 pm on April 12, 2019 Permalink | Reply
    Tags: Robotics

    From University of New South Wales: “Sky’s the limit: celebrating engineering that’s out of this world” 

    From University of New South Wales

    12 Apr 2019
    Cecilia Duong

    Researchers from UNSW Engineering are harnessing new technologies to help build Australia’s space future.

    An impression of UNSW Cubesat in orbit. Image: Jamie Tufrey

    On International Day of Human Space Flight – an annual celebration of the beginning of the space era for mankind that’s designed to reaffirm the important contribution of space science and technology in today’s world – UNSW Engineering is looking at some of its own space-related research highlights.

    Whether it’s finding ways to mine water on the Moon or developing space cells with the highest efficiencies, researchers from UNSW Engineering are harnessing new technologies to help build Australia’s space future. Our student-led projects, such as BLUEsat and the American Institute of Aeronautics and Astronautics (AIAA) Rocketry team, are also providing students with real-world experience in multi-disciplinary space engineering projects to continue to promote space technology in Australia.

    Here are a few highlights of how UNSW Engineering research is innovating both on Earth and in space.

    Mining water on the Moon
    Image: Shutterstock

    A team of UNSW engineers has put together a multi-university, agency and industry project team to investigate the possibilities of mining water on the Moon to produce rocket fuel.

    Find out more.

    Satellite solar technology comes down to Earth
    Solar cells used in space are achieving higher efficiencies than those used at ground level, and now there are ways to have them working on Earth without breaking the bank.

    Researchers from the School of Photovoltaic and Renewable Energy Engineering are no strangers to setting new records for solar cell efficiency, but Associate Professor Ned Ekins-Daukes has made it his mission to develop space cells with the highest efficiencies at the lowest weight.

    Find out more.

    Students shine in off-world robotics competition
    UNSW’s Off-World Robotics team – part of the long-running BLUEsat student-led project – achieved their best placing in the competition to date.

    A team of eight UNSW Engineering students came eighth in the European Rover Challenge (ERC) in Poland, one of the world’s biggest international space and robotics events, defeating 57 teams from around the globe.

    Find out more.

    Exploring a little-understood region above Earth
    Associate Professor Elias Aboutanios with UNSW-EC0. Photo: Grant Turner

    UNSW-EC0, a CubeSat built by a team led by Australian Centre for Space Engineering Research (ACSER) deputy director Associate Professor Elias Aboutanios, is studying the atomic composition of the thermosphere using an on-board ion neutral mass spectrometer.

    Find out more.

    Rocketing into an internship
    Third-year Aerospace Engineering student Sam Wilkinson scored an internship at Rocket Lab in New Zealand.

    Wilkinson describes how he landed the internship at an international aerospace company that works with organisations such as NASA, without going through the usual application process.

    Find out more.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U NSW Campus

    Welcome to UNSW Australia (The University of New South Wales), one of Australia’s leading research and teaching universities. At UNSW, we take pride in the broad range and high quality of our teaching programs. Our teaching gains strength and currency from our research activities, strong industry links and our international nature; UNSW has strong regional and global engagement.

    In developing new ideas and promoting lasting knowledge we are creating an academic environment where outstanding students and scholars from around the world can be inspired to excel in their programs of study and research. Partnerships with both local and global communities allow UNSW to share knowledge, debate and research outcomes. UNSW’s public events include concert performances, open days and public forums on issues such as the environment, healthcare and global politics. We encourage you to explore the UNSW website so you can find out more about what we do.

  • richardmitnick 12:02 pm on February 20, 2019 Permalink | Reply
    Tags: "Robots track moving objects with unprecedented precision", RFID tags, Robotics

    From MIT News: “Robots track moving objects with unprecedented precision” 

    MIT News

    From MIT News

    February 18, 2019
    Rob Matheson

    System uses RFID tags to home in on targets; could benefit robotic manufacturing, collaborative drones, and other applications.

    MIT Media Lab researchers are using RFID tags to help robots home in on moving objects with unprecedented speed and accuracy, potentially enabling greater collaboration in robotic packaging and assembly and among swarms of drones. Photo courtesy of the researchers.

    A novel system developed at MIT uses RFID tags to help robots home in on moving objects with unprecedented speed and accuracy. The system could enable greater collaboration and precision by robots working on packaging and assembly, and by swarms of drones carrying out search-and-rescue missions.

    In a paper being presented next week at the USENIX Symposium on Networked Systems Design and Implementation, the researchers show that robots using the system can locate tagged objects within 7.5 milliseconds, on average, and with an error of less than a centimeter.

    In the system, called TurboTrack, an RFID (radio-frequency identification) tag can be applied to any object. A reader sends a wireless signal that reflects off the RFID tag and other nearby objects, and rebounds to the reader. An algorithm sifts through all the reflected signals to find the RFID tag’s response. Final computations then leverage the RFID tag’s movement — even though this usually decreases precision — to improve its localization accuracy.

    The researchers say the system could replace computer vision for some robotic tasks. As with its human counterpart, computer vision is limited by what it can see, and it can fail to notice objects in cluttered environments. Radio frequency signals have no such restrictions: They can identify targets without visualization, within clutter and through walls.

    To validate the system, the researchers attached one RFID tag to a cap and another to a bottle. A robotic arm located the cap and placed it onto the bottle, held by another robotic arm. In another demonstration, the researchers tracked RFID-equipped nanodrones during docking, maneuvering, and flying. In both tasks, the system was as accurate and fast as traditional computer-vision systems, while working in scenarios where computer vision fails, the researchers report.

    “If you use RF signals for tasks typically done using computer vision, not only do you enable robots to do human things, but you can also enable them to do superhuman things,” says Fadel Adib, an assistant professor and principal investigator in the MIT Media Lab, and founding director of the Signal Kinetics Research Group. “And you can do it in a scalable way, because these RFID tags are only 3 cents each.”

    In manufacturing, the system could enable robot arms to be more precise and versatile in, say, picking up, assembling, and packaging items along an assembly line. Another promising application is using handheld “nanodrones” for search and rescue missions. Nanodrones currently rely on computer vision and image-stitching methods for localization. These drones often get confused in chaotic areas, lose each other behind walls, and can’t uniquely identify each other. This all limits their ability to, say, spread out over an area and collaborate to search for a missing person. Using the researchers’ system, nanodrones in swarms could better locate each other, for greater control and collaboration.

    “You could enable a swarm of nanodrones to form in certain ways, fly into cluttered environments, and even environments hidden from sight, with great precision,” says first author Zhihong Luo, a graduate student in the Signal Kinetics Research Group.

    The other Media Lab co-authors on the paper are visiting student Qiping Zhang, postdoc Yunfei Ma, and Research Assistant Manish Singh.

    Super resolution

    Adib’s group has been working for years on using radio signals for tracking and identification purposes, such as detecting contamination in bottled foods, communicating with devices inside the body, and managing warehouse inventory.

    Similar systems have attempted to use RFID tags for localization tasks. But these come with trade-offs in either accuracy or speed. To be accurate, it may take them several seconds to find a moving object; to increase speed, they lose accuracy.

    The challenge was achieving both speed and accuracy simultaneously. To do so, the researchers drew inspiration from an imaging technique called “super-resolution imaging.” These systems stitch together images from multiple angles to achieve a finer-resolution image.

    “The idea was to apply these super-resolution systems to radio signals,” Adib says. “As something moves, you get more perspectives in tracking it, so you can exploit the movement for accuracy.”

    The system combines a standard RFID reader with a “helper” component that’s used to localize radio frequency signals. The helper shoots out a wideband signal comprising multiple frequencies, building on a modulation scheme used in wireless communication, called orthogonal frequency-division multiplexing.
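
    As a rough illustration of what such a multi-frequency probe looks like, an OFDM-style symbol can be generated with a single inverse FFT. The subcarrier count, payload and cyclic-prefix length below are arbitrary choices, not TurboTrack’s actual waveform.

    ```python
    import numpy as np

    # Toy OFDM-style probe: many simultaneous subcarriers built with one inverse FFT.
    # Parameters are arbitrary illustrations, not TurboTrack's actual waveform.
    n_subcarriers = 64
    rng = np.random.default_rng(0)

    # One QPSK symbol per subcarrier (random payload, purely for illustration)
    bits = rng.integers(0, 2, size=(n_subcarriers, 2))
    symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

    # The IFFT sums all subcarriers into one time-domain OFDM symbol
    time_signal = np.fft.ifft(symbols)

    # A cyclic prefix (a copy of the tail) makes the symbol robust to multipath echoes
    cp_len = 16
    tx = np.concatenate([time_signal[-cp_len:], time_signal])
    print(tx.shape)   # (80,) complex baseband samples
    ```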

    The system captures all the signals rebounding off objects in the environment, including the RFID tag. One of those signals carries a response that’s specific to the RFID tag, because RFID tags reflect and absorb an incoming signal in a certain pattern, corresponding to bits of 0s and 1s, that the system can recognize.

    Because these signals travel at the speed of light, the system can compute a “time of flight” — measuring distance by calculating the time it takes a signal to travel between a transmitter and receiver — to gauge the location of the tag, as well as the other objects in the environment. But this provides only a ballpark localization figure, not subcentimeter precision.
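
    The time-of-flight arithmetic itself is simple, as the sketch below shows; the numbers are made up for illustration and are not measurements from the paper. It also shows why timing alone struggles to reach sub-centimeter precision.

    ```python
    # Time-of-flight gives only a coarse ("ballpark") range estimate.
    # The round-trip time below is an assumed number, not a measurement from the paper.
    c = 299_792_458.0            # propagation speed of the radio signal [m/s]

    round_trip_time = 20.0e-9    # reader -> tag -> reader, 20 nanoseconds (assumed)
    distance = c * round_trip_time / 2.0
    print(f"estimated range: {distance:.2f} m")

    # A 1 cm range error corresponds to only ~67 picoseconds of round-trip timing error,
    # which is why time of flight alone cannot deliver sub-centimeter localization.
    timing_for_1cm = 2 * 0.01 / c
    print(f"timing resolution needed for 1 cm: {timing_for_1cm * 1e12:.0f} ps")
    ```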

    Leveraging movement

    To zoom in on the tag’s location, the researchers developed what they call a “space-time super-resolution” algorithm.

    The algorithm combines the location estimates for all rebounding signals, including the RFID signal, which it determined using time of flight. Using some probability calculations, it narrows down that group to a handful of potential locations for the RFID tag.

    As the tag moves, its signal angle slightly alters — a change that also corresponds to a certain location. The algorithm then can use that angle change to track the tag’s distance as it moves. By constantly comparing that changing distance measurement to all other distance measurements from other signals, it can find the tag in a three-dimensional space. This all happens in a fraction of a second.

    “The high-level idea is that, by combining these measurements over time and over space, you get a better reconstruction of the tag’s position,” Adib says.
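
    The toy example below is a cartoon of that intuition in two dimensions, not the paper’s algorithm: a single noisy range only constrains the tag to a circle, but scoring candidate positions against ranges gathered from a few antennas (space) and across several instants of the tag’s motion (time, here assumed known) singles out one location. Antenna positions, motion and noise level are all assumptions.

    ```python
    import numpy as np

    # Cartoon of "combining measurements over time and space" (not the paper's algorithm).
    # Antenna positions, tag motion, and noise level are all assumed for illustration.
    rng = np.random.default_rng(1)

    antennas = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # assumed receiver positions [m]
    true_start = np.array([2.0, 1.5])       # tag's true starting position [m]
    step = np.array([0.05, 0.0])            # tag displacement between snapshots (assumed known)
    noise = 0.02                            # range measurement noise [m]

    # Candidate grid for the tag's starting position
    xs, ys = np.meshgrid(np.linspace(-3.0, 3.0, 121), np.linspace(-3.0, 3.0, 121))
    candidates = np.stack([xs.ravel(), ys.ravel()], axis=1)
    log_score = np.zeros(len(candidates))

    for k in range(20):                     # time: successive snapshots as the tag moves
        tag = true_start + k * step
        for ant in antennas:                # space: several receive antennas
            measured = np.linalg.norm(tag - ant) + rng.normal(0.0, noise)
            predicted = np.linalg.norm(candidates + k * step - ant, axis=1)
            log_score += -((measured - predicted) ** 2) / (2.0 * noise ** 2)

    best = candidates[np.argmax(log_score)]
    print("estimated start:", best, " true start:", true_start)
    ```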

    The work was sponsored, in part, by the National Science Foundation.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

  • richardmitnick 10:18 am on January 9, 2019 Permalink | Reply
    Tags: Robotics

    From CSIROscope: “Robots of the future: it’s about to get weird” 

    From CSIROscope

    9 January 2019

    The word “robot” was coined almost a hundred years ago by Czech writer Karel Čapek to refer to the artificial life forms in his play “Rossum’s Universal Robots”. Ever since, humanoid-shaped robots have dominated concepts of what a robot should look like.

    Think of Star Wars’ C-3PO, The Terminator, The Iron Giant, or even Marvin the Paranoid Android from “The Hitchhiker’s Guide to the Galaxy”.

    In the real world there are also machines like Boston Dynamics’ incredibly agile “Atlas”. Or Sophia, the first robot to receive citizenship.

    More often than not, though, our shape isn’t the best one for robots faced with challenging assignments in extreme environments.

    In a just-published paper [Nature Machine Intelligence], our scientists have offered a bold glimpse into what the robots of the future could look like – and it’s not “Robby the Robot”.

    Robot evolution revolution

    Our Active Integrated Matter Future Science Platform (AIM FSP) says that within 20 years robots could look unpredictably different. Scientific breakthroughs in areas like materials discovery, advanced manufacturing, 3D printing, and artificial intelligence will allow robots to be designed from the molecular level up to perform their specific mission, resulting in unusual and unexpected shapes, limbs and behaviours.

    An artist’s impression of a robot for use in the Amazon. Based on tree-crawling lizards and geckos, it would have articulated legs for more flexibility and climbing.

    Central to this all is a concept known as Multi-Level Evolution (MLE). It argues that robots should be taking their engineering cues from the one tried and true design philosophy that’s survived millennia on Earth: evolution.

    Evolution has seen animals undergo incredibly diverse adaptation to survive challenging environments. It creates effective solutions that are often totally different to any a human engineer would come up with. Kangaroos, for instance, probably wouldn’t have made it off the drawing board but have survived and thrived for eons in Australia.

    How would MLE work?

    A robot’s mission, as well as details about the relevant terrain and environment, would be entered into a computer. It would then run algorithms based on evolution to automatically design robots.

    The computer would do this by exploring a diverse range of materials, components, sensors and behaviours. Advanced, computer-based modelling could rapidly test prototypes in simulated “real world” scenarios to decide which works best.

    Once that’s done, 3D printing and other technologies would be used to create and physically test prototype robots.

    The end result? Small, simple, highly specialised robots that can automatically adapt to their environment and are tough enough to survive their mission.
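
    As a cartoon of how such an evolutionary design loop might be wired up: the “designs”, fitness function and parameters below are invented stand-ins for illustration, not CSIRO’s MLE system or its simulator.

    ```python
    import random

    # Toy evolutionary design loop in the spirit of Multi-Level Evolution.
    # The "design" is just three numbers and the fitness function is a stand-in
    # for the physics simulation the paper envisions; everything is illustrative.
    def random_design():
        return {"leg_length": random.uniform(0.05, 0.5),    # m
                "body_mass":  random.uniform(0.5, 10.0),    # kg
                "solar_area": random.uniform(0.01, 0.3)}    # m^2

    def fitness(design):
        # Stand-in for "test the prototype in a simulated desert mission":
        # reward long legs and solar area, penalise mass.
        return 2.0 * design["leg_length"] + design["solar_area"] - 0.1 * design["body_mass"]

    def mutate(design):
        child = dict(design)
        key = random.choice(list(child))
        child[key] *= random.uniform(0.8, 1.2)   # small random tweak to one trait
        return child

    population = [random_design() for _ in range(30)]
    for generation in range(50):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]                                    # selection
        population = survivors + [mutate(random.choice(survivors)) for _ in range(20)]

    print("best design found:", max(population, key=fitness))
    ```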

    An artist’s impression of an ocean, coastal or river based amphibious robot. It would travel in water like an eel, but have legs in order to crawl and climb.

    Do the robot

    Say, in the future, you need to design robots for environmental monitoring in extreme environments. They’d all need to move across difficult landscapes while gathering data. Eventually, to avoid polluting the environment, they’d have to return to base or degrade away to nothing. How could you do this?

    MLE would come up with remarkably different results, depending on the terrain, climate and other factors.

    To cope with the Sahara Desert a robot would need materials designed to survive punishing heat, sand and dust. Given the amount of sun the Sahara receives, the robot could be solar-powered and slide across sand dunes. The harsh UV light could also be used as the trigger to eventually wear the robot away.

    In the Amazon a robot would have entirely different challenges to face. Thick, low lying vegetation and fallen trees would hamper its movement so it would need to be flexible enough to climb over or go round obstacles. It could perhaps be powered by biomass such as the leaves covering the jungle floor, and degrade with humidity.

    An artist’s impression of an Antarctic-based robot. Turtle-like, it would be strong and robust for extreme conditions. It could also suit desert applications.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

  • richardmitnick 9:44 am on May 23, 2018 Permalink | Reply
    Tags: Meet Our Robot Family, Robotics

    From CSIROscope: “Meet Our Robot Family” 

    From CSIROscope

    23 May 2018
    Ketan Joshi

    We’re developing robotic systems that help humans perform dangerous tasks, and expanding the Australian robotics industry. Above, Weaver. No image credit.

    The family of robots that lives at our Data61 is incredibly diverse. They’ve got legs, wheels, cameras, sensors, fins, blades and magnets. They sense the world, navigate it autonomously, and they traverse places too dangerous and dirty for human work. They’re as varied as the challenges they’re designed to resolve, but the common DNA is a focus on the use of cutting-edge data science.

    This isn’t something we go at alone—our partners include: DARPA (Defense Advanced Research Projects Agency), Rockwell Collins, Boeing, Woodside, Queensland University of Technology, and many other government agencies, universities and enterprises. We recently announced the Sixth Wave Alliance, to develop a national robotics R&D strategy and create the critical mass required to address large-scale Australian and international challenges using robotics technologies.

    This week, we’re also at the International Conference on Robotics and Automation (ICRA 2018), where we’re showcasing the best of our bots.

    Meet the family below, and read more about our robotics research here.

    Machines that see – Sensing and mapping the world

    Sucking up information from the world is a capability we fleshy humans take for granted. Data61’s robotic and autonomous devices are particularly good at sensing and mapping – two capabilities that are of high importance for modern robotics and industries like mining, exploration and environmental conservation.

    Hovermap and Zebedee – moving without GPS

    Drones are increasingly common as consumer goods, but they’re reliant on direct access to global positioning satellites (GPS).

    Hovermap is a 3D mapping system that uses LIDAR (light detection and ranging) technology, combined with Data61’s proprietary Simultaneous Localisation and Mapping (SLAM) solution. Hovermap works in conjunction with a UAV (uncrewed aerial vehicle), and can map both indoor and outdoor locations without relying on GPS.
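
    Hovermap’s actual SLAM pipeline is proprietary, but the toy snippet below shows one small ingredient of LIDAR mapping: projecting a single range scan, taken from a known pose, into an occupancy grid. A real SLAM system must estimate that pose at the same time as it builds the map; here the pose and the scan are simply assumed.

    ```python
    import numpy as np

    # One ingredient of LIDAR mapping: project a single range scan, taken from a
    # *known* pose, into an occupancy grid.  A full SLAM system (as in Hovermap)
    # must also estimate the pose itself; here pose and scan are assumed.
    cell = 0.1                                    # grid resolution [m]
    grid = np.zeros((200, 200), dtype=np.uint8)   # 20 m x 20 m map, 0 = free/unknown

    pose_xy = np.array([10.0, 10.0])              # sensor position in the map frame [m]
    pose_yaw = np.deg2rad(30.0)                   # sensor heading [rad]

    angles = np.linspace(-np.pi, np.pi, 360, endpoint=False)   # one horizontal sweep
    ranges = np.full(360, 4.0)                    # fake scan: a wall 4 m away all around

    # Convert each return to map coordinates and mark the hit cell as occupied
    hits_x = pose_xy[0] + ranges * np.cos(angles + pose_yaw)
    hits_y = pose_xy[1] + ranges * np.sin(angles + pose_yaw)
    ix = np.clip((hits_x / cell).astype(int), 0, grid.shape[0] - 1)
    iy = np.clip((hits_y / cell).astype(int), 0, grid.shape[1] - 1)
    grid[ix, iy] = 1

    print("occupied cells:", int(grid.sum()))
    ```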

    Zebedee, our high-accuracy 3D laser mapping technology, was commercialised and is already being used around the world by 25 multinational organisations. It was recently trialled by the International Atomic Energy Agency in nuclear safeguards inspections.

    Camazotz – the bat god tech

    Camazotz, named after a Mayan bat god, is a small, portable device that is used to monitor flying foxes across Australia, helping ecologists understand and predict the spread of disease. The Wireless Ad hoc System for Positioning (WASP) uses similar tags to track vehicles and mine workers relative to reference nodes – assisting with safety and boosting productivity.

    Legged Robots

    You’ve probably seen videos of animal robots doing clever tasks, shared with a tone of alarm. Legged robots aren’t a reason for alarm – these systems are well suited to navigating environments that are too dangerous or dirty for safe human work, such as a chemical spill in a plant or the ceiling beam in a factory.


    Gizmo dancing

    Gizmo is Data61’s newest bot – a small, smooth hexapod designed for versatility and small spaces. One of the motivating applications for this robot is to inspect and map ceiling cavity and underfloor-type confined spaces.



    Zee is a prototype hexapod robot equipped with a streaming camera sensor and a real-time 3D scanning LIDAR. You’ve probably seen Zee around – it’s an older machine but still an excellent demonstration of six-legged robotics.


    Zee’s big sister, Weaver, features five joints per leg and 30 degrees of freedom. Weaver can self-stabilise through ‘exteroceptive’ sensing – enabling the robot to walk up gradients of 30°, and remain stable on inclines up to 50°.



    MaX (Multi-legged autonomous explorer) is even bigger – 2.25 m tall when standing up straight. But MaX only weighs 60 kg, around 5 to 20 times lighter than comparable robots. MaX is a research vehicle designed to help our scientists understand how to traverse and explore challenging indoor and outdoor environments.



    Magnapods are Data61’s wall-climbing, electro-magnetic inspection robots, useful in confined space inspection tasks and capable of carrying a 10 kilogram sensor payload.

    You can read more about the scientific goals of our legged robot research program here.

    Autonomous vehicles

    Creating systems that can navigate and respond without human intervention is a key component in removing the human element from tasks that are dangerous or poorly suited for human control. We’ve developed several ground vehicles normally used in industrial environments that can operate without human intervention, including the Gator, the load haul dump vehicle and the 20 tonne hot metal carrier.


    Our Science Rover enabled the complicated process of satellite calibration: the autonomous vehicle collects measurements at the same time an Earth observation satellite passes overhead, the two datasets are compared, and the satellite is calibrated. Our underwater autonomous vehicle, Starbug, uses underwater sensor networks to locate itself (GPS signals cannot be used underwater), enabling smart underwater data collection for protection and tracking of ecosystems.

    Our family of robots is, as you can see, pretty diverse. It’s the broad nature of the challenges they’re addressing that gives them these shapes, from small to big, wheeled to legged.

    See the full article here.


    Please help promote STEM in your local schools.
    Stem Education Coalition

    CSIRO campus

    CSIRO, the Commonwealth Scientific and Industrial Research Organisation, is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

  • richardmitnick 9:02 am on January 3, 2018 Permalink | Reply
    Tags: Bimorph, Graphene-based Bimorphs for Micron-sized Autonomous Origami Machines, Physicists take first step toward cell-sized robots, Robotics, You could put the computational power of the spaceship Voyager onto an object the size of a cell

    From Cornell Chronicle: “Physicists take first step toward cell-sized robots” 

    Cornell Chronicle

    January 2, 2018
    Tom Fleischman

    Charles Walcott

    An electricity-conducting, environment-sensing, shape-changing machine the size of a human cell? Is that even possible?

    Cornell physicists Paul McEuen and Itai Cohen not only say yes, but they’ve actually built the “muscle” for one.

    With postdoctoral researcher Marc Miskin at the helm, the team has made a robot exoskeleton that can rapidly change its shape upon sensing chemical or thermal changes in its environment. And, they claim, these microscale machines – equipped with electronic, photonic and chemical payloads – could become a powerful platform for robotics at the size scale of biological microorganisms.

    “You could put the computational power of the spaceship Voyager onto an object the size of a cell,” Cohen said. “Then, where do you go explore?”

    “We are trying to build what you might call an ‘exoskeleton’ for electronics,” said McEuen, the John A. Newman Professor of Physical Science and director of the Kavli Institute at Cornell for Nanoscale Science. “Right now, you can make little computer chips that do a lot of information-processing … but they don’t know how to move or cause something to bend.”

    Their work is outlined in Graphene-based Bimorphs for Micron-sized, Autonomous Origami Machines, published Jan. 2 in Proceedings of the National Academy of Sciences. Miskin is lead author; other contributors included David Muller, the Samuel B. Eckert Professor of Engineering, and doctoral students Kyle Dorsey, Baris Bircan and Yimo Han.

    The machines move using a motor called a bimorph. A bimorph is an assembly of two materials – in this case, graphene and glass – that bends when driven by a stimulus like heat, a chemical reaction or an applied voltage. The shape change happens because, in the case of heat, two materials with different thermal responses expand by different amounts over the same temperature change.

    As a consequence, the bimorph bends to relieve some of this strain, allowing one layer to stretch out longer than the other. By adding rigid flat panels that cannot be bent by bimorphs, the researchers localize bending to take place only in specific places, creating folds. With this concept, they are able to make a variety of folding structures ranging from tetrahedra (triangular pyramids) to cubes.
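
    The classical bimetal-strip result (Timoshenko, 1925) gives a feel for why a stack only nanometers thick curls so sharply when heated. The material values in the sketch below are rough, illustrative numbers, not the graphene-glass parameters reported in the paper.

    ```python
    # Classical bimetal/bimorph curvature (Timoshenko, 1925).  Material values are
    # rough illustrative numbers, not the graphene-glass parameters from the paper.
    def bimorph_curvature(t1, t2, E1, E2, alpha1, alpha2, dT):
        """Return the curvature 1/rho [1/m] of a two-layer strip heated by dT."""
        m = t1 / t2          # thickness ratio
        n = E1 / E2          # stiffness (Young's modulus) ratio
        h = t1 + t2          # total thickness
        num = 6.0 * (alpha2 - alpha1) * dT * (1.0 + m) ** 2
        den = h * (3.0 * (1.0 + m) ** 2 + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))
        return num / den

    # Layer 1: graphene-like sheet; layer 2: glass-like oxide (illustrative values)
    kappa = bimorph_curvature(
        t1=0.34e-9, t2=2.0e-9,         # thicknesses [m]
        E1=1000e9, E2=70e9,            # Young's moduli [Pa]
        alpha1=-8e-6, alpha2=0.5e-6,   # thermal expansion coefficients [1/K]
        dT=10.0)                       # temperature change [K]

    print(f"curvature: {kappa:.3g} 1/m  ->  bend radius: {1.0 / kappa * 1e6:.1f} micrometres")
    ```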

    In the case of graphene and glass, the bimorphs also fold in response to chemical stimuli by driving large ions into the glass, causing it to expand. Typically this chemical activity only occurs on the very outer edge of glass when submerged in water or some other ionic fluid. Since their bimorph is only a few nanometers thick, the glass is basically all outer edge and very reactive.

    “It’s a neat trick,” Miskin said, “because it’s something you can do only with these nanoscale systems.”

    The bimorph is built using atomic layer deposition – chemically “painting” atomically thin layers of silicon dioxide onto aluminum over a cover slip – then wet-transferring a single atomic layer of graphene on top of the stack. The result is the thinnest bimorph ever made.

    One of their machines was described as being “three times larger than a red blood cell and three times smaller than a large neuron” when folded. Folding scaffolds of this size have been built before, but this group’s version has one clear advantage.

    “Our devices are compatible with semiconductor manufacturing,” Cohen said. “That’s what’s making this compatible with our future vision for robotics at this scale.”

    And due to graphene’s relative strength, Miskin said, it can handle the types of loads necessary for electronics applications.

    “If you want to build this electronics exoskeleton,” he said, “you need it to be able to produce enough force to carry the electronics. Ours does that.”

    For now, these tiniest of tiny machines have no commercial application in electronics, biological sensing or anything else. But the research pushes the science of nanoscale robots forward, McEuen said.

    “Right now, there are no ‘muscles’ for small-scale machines,” he said, “so we’re building the small-scale muscles.”

    This work was performed at the Cornell NanoScale Facility for Science and Technology and supported by the Cornell Center for Materials Research, the National Science Foundation, the Air Force Office of Scientific Research and the Kavli Institute at Cornell.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Once called “the first American university” by educational historian Frederick Rudolph, Cornell University represents a distinctive mix of eminent scholarship and democratic ideals. Adding practical subjects to the classics and admitting qualified students regardless of nationality, race, social circumstance, gender, or religion was quite a departure when Cornell was founded in 1865.

    Today’s Cornell reflects this heritage of egalitarian excellence. It is home to the nation’s first colleges devoted to hotel administration, industrial and labor relations, and veterinary medicine. Both a private university and the land-grant institution of New York State, Cornell University is the most educationally diverse member of the Ivy League.

    On the Ithaca campus alone nearly 20,000 students representing every state and 120 countries choose from among 4,000 courses in 11 undergraduate, graduate, and professional schools. Many undergraduates participate in a wide range of interdisciplinary programs, play meaningful roles in original research, and study in Cornell programs in Washington, New York City, and the world over.

  • richardmitnick 11:46 am on December 31, 2017 Permalink | Reply
    Tags: Daniel Vogt, Falkor research vessel, NOAA’s Office of Ocean Exploration and Research, PIPA-Phoenix Islands Protected Area, Robotics, ROV-remotely operated underwater vehicle, Squishy fingers help scientists probe the watery depths

    From Wyss Institute: “Squishy fingers help scientists probe the watery depths” 2017 

    Wyss Institute

    October 28, 2017
    Lindsay Brownell

    Wyss researcher Daniel Vogt tests out soft robotics on deep sea corals in the South Pacific.

    As an engineer with degrees in Computer Science and Microengineering, Wyss researcher Daniel Vogt usually spends most of his time in his lab building and testing robots, surrounded by jumbles of cables, wires, bits of plastic, and circuit boards. But for the last month, he’s spent nearly every day in a room that resembles NASA ground control surrounded by marine biologists on a ship in the middle of the Pacific Ocean, intently watching them use joysticks and buttons to maneuver a remotely operated underwater vehicle (ROV) to harvest corals, crabs, and other sea life from the ocean floor.

    The squishy fingers are made of a soft, flexible material that is more dexterous and gentle than ROVs’ conventional grippers. Credit: Schmidt Ocean Institute.

    Deep corals of the Phoenix Islands Protected Area: How Wyss Institute researchers are changing underwater exploration. Credit: Schmidt Ocean Institute.

    This particular ROV’s robotic metal arm is holding the reason why Vogt is here: what looks like a large, floppy toy starfish made of blue and yellow foam. “Devices like this are extremely soft – you can compare them to rubber bands or gummy bears – and this allows them to grasp things that you wouldn’t be able to grasp with a hard device like the ROV gripper,” says Vogt, watching the TV screen as the “squishy fingers” gently close around a diaphanous bright pink sea cucumber and lift it off the sand. The biologists applaud as the fingers cradle the sea cucumber safely on its journey to the ROV’s collection box. “Nicely done,” Vogt says to the ROV operators.

    This shipful of scientists is the latest in a series of research voyages co-funded by NOAA’s Office of Ocean Exploration and Research and the Schmidt Ocean Institute, a nonprofit founded by Eric and Wendy Schmidt in 2009 to support high-risk marine exploration that expands humans’ understanding of our planet’s oceans. The Institute provides marine scientists access to the ship, Falkor, and expert technical shipboard support in exchange for a commitment to openly share and communicate the outcomes of their research.

    Falkor is equipped with both wet and dry lab spaces, the ROV SuBastian, echosounders, water sampling systems, and many other instruments to gather data about the ocean. Credit: Schmidt Ocean Institute.

    Vogt’s shipmates are studying the mysterious deep-sea coral communities, which live below 138 meters (450 feet) on seamounts that are mostly unexplored.

    The best place to find those corals is the Phoenix Islands Protected Area (PIPA), a smattering of tiny islands, atolls, coral reefs, and great swaths of their surrounding South Pacific ocean almost 3,000 miles from the nearest continent. PIPA is the largest (the size of California) and deepest (average water column depth of 4 km/2.5 mi) UNESCO World Heritage Site on Earth and, thanks to its designation as a Marine Protected Area in 2008, represents one of Earth’s last intact oceanic coral archipelago ecosystems. With over 500 species of reef fishes, 250 shallow coral species, and large numbers of sharks and other marine life, PIPA’s reefs resemble what a reef might have looked like a thousand years ago, before human activity began to severely affect oceanic communities. The team on board Falkor is conducting the first deep water biological surveys in PIPA, assessing what species of deep corals are present and any new, undescribed species, while also evaluating the effect of seawater acidification (caused by an increase in the amount of CO2 in the water) on deep coral ecosystems.

    The deep ocean is about as inhospitable to human life as outer space, so scientists largely rely on ROVs to be their eyes, legs, and hands underwater, controlling them remotely from the safety of the surface. Most ROVs used in deep-sea research were designed for use in the oil and gas industries and are built to accomplish tasks like lifting heavy weights, drilling into rock, and installing machinery. When it comes to plucking a sea cucumber off the ocean floor or snipping a piece off a delicate sea fan, however, existing ROVs are like bulls in a china shop, often crushing the samples they’re meant to be taking.

    This problem led to a collaboration between Wyss Core Faculty member Rob Wood, Ph.D. and City University of New York (CUNY) marine biologist David Gruber, Ph.D. back in 2014 that produced the first version of the soft robotic “squishy fingers,” which were successfully tested in the Red Sea in 2015. PIPA offered a unique opportunity to test the squishy fingers in more extreme conditions and evaluate a series of improvements that Vogt and other members of Wood’s lab have been making to them, such as integrating sensors into the robots’ soft bodies. “The Phoenix Islands are very unexplored. We’re looking for new species of corals that nobody has ever seen anywhere else. We don’t know what our graspers will have to pick up on a given day, so it’s a great opportunity to see how they fare against different challenges in the field.”

    Daniel Vogt holds the ‘squishy finger’ soft robots aboard Falkor. Credit: Schmidt Ocean Institute.

    Vogt, ever the tinkerer, also brought with him something that the Red Sea voyage did not have on board: two off-the-shelf 3D printers. Taking feedback directly from the biologists and the ROV pilots about what the soft robot could and could not do, Vogt was able to print new components overnight and try them in the field the next day – something that rarely happens even on land. “It’s really a novel thing, to be able to iterate based on input in the middle of the Pacific Ocean, with no lab in sight. We noticed, for example, that the samples we tried to grasp were often on rock instead of sand, making it difficult for the soft fingers to reach underneath the sample for a good grip. In the latest iteration of the gripper, ‘fingernails’ were added to improve grasping in these situations.” The ultimate goal of building better and better underwater soft robots is to be able to conduct research on samples underwater at their natural depth and temperature, rather than bringing them up to the surface, as this will paint a more accurate picture of what is happening out of sight in the world’s oceans.

    PIPA may be somewhat insulated from the threats of warming oceans and pollution thanks to its remoteness and deep waters, but the people of Kiribati, the island nation that contains and administers PIPA, are not. The researchers visited the island of Kanton, population 25, a few days into their trip to meet the local people and learn about their lives in a country where dry land makes up less than 1% of its total area – a true oceanic nation. “The people were very nice, very welcoming. There is one ship that comes every six months to deliver supplies; everything else they get from the sea,” says Vogt (locals are allowed to fish for subsistence). “They’re also going to be one of the first nations affected by rising sea levels, because the highest point on the whole island is three meters (ten feet). They know that they live in a special place, but they’re preparing for the day when they’ll have to leave their home. The whole community has bought land on Fiji, where they’ll move once Kanton becomes uninhabitable.”

    Daniel Vogt tests the squishy fingers on the forearm of CUNY biologist David Gruber, who spearheaded their development along with Wyss Faculty member Rob Wood. Credit: Schmidt Ocean Institute.

    Research that brings scientists from different fields together to elucidate the world’s remaining unknowns and solve its toughest problems is gaining popularity, and may be the best chance humanity has to ensure its own survival. “One of the most eye-opening part of the trip has been interacting with people from different backgrounds and seeing the scientific challenges they face, which are very different from the challenges that the mechanical and electrical engineers I’m with most of the time have to solve,” says Vogt. “I’ve been amazed by the technology that’s on Falkor related to the ROV and all the scientific tools aboard. The ROV SuBastian is one-of-a-kind, with numerous tools, cameras and sensors aboard as well as an advanced underwater positioning system. It takes a lot of engineers to create and operate something like that, and then a lot of biologists to interpret the results and analyze the 400+ samples which were collected during the cruise.”

    Vogt says he spent a lot of time listening to the biologists and the ROV pilots in order to modify the gripper’s design according to their feedback. The latest version of the gripper was fully designed and manufactured on the boat, and was used during the last dive to successfully sample a variety of sea creatures. He and Wood plan to write several papers detailing the results of his experiments in the coming months.

    “We’re very excited that what started as a conversation between a roboticist and a marine biologist at a conference three years ago has blossomed into a project that solves a significant problem in the real world, and can aid researchers in understanding and preserving our oceans’ sea life,” says Wood.

    Additional videos detailing Vogt’s voyage, including the ship’s log, can be found here.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Wyss Institute campus

    The Wyss (pronounced “Veese”) Institute for Biologically Inspired Engineering uses Nature’s design principles to develop bioinspired materials and devices that will transform medicine and create a more sustainable world.

    Working as an alliance among Harvard’s Schools of Medicine, Engineering, and Arts & Sciences, and in partnership with Beth Israel Deaconess Medical Center, Boston Children’s Hospital, Brigham and Women’s Hospital, Dana Farber Cancer Institute, Massachusetts General Hospital, the University of Massachusetts Medical School, Spaulding Rehabilitation Hospital, Tufts University, and Boston University, the Institute crosses disciplinary and institutional barriers to engage in high-risk research that leads to transformative technological breakthroughs.

  • richardmitnick 1:23 pm on November 8, 2017 Permalink | Reply
    Tags: CSAIL-MIT’s Computer Science and Artificial Intelligence Lab, Daniela Rus, More Evidence that Humans and Machines Are Better When They Team Up, Robotics

    From M.I.T Technology Review: Women in STEM- Daniela Rus “More Evidence that Humans and Machines Are Better When They Team Up” 

    M.I.T Technology Review

    November 8, 2017
    Will Knight

    By worrying about job displacement, we might end up missing a huge opportunity for technological amplification.

    MIT computer scientist Daniela Rus. Justin Saglio

    Instead of just fretting about how robots and AI will eliminate jobs, we should explore new ways for humans and machines to collaborate, says Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Lab (CSAIL).

    “I believe people and machines should not be competitors, they should be collaborators,” Rus said during her keynote at EmTech MIT 2017, an annual event hosted by MIT Technology Review.

    How technology will impact employment in coming years has become a huge question for economists, policy-makers, and technologists. And, as one of the world’s preeminent centers of robotics and artificial intelligence, CSAIL has a big stake in driving coming changes.

    There is some disagreement among experts about how significantly jobs will be affected by automation and AI, and about how this will be offset by the creation of new business opportunities. Last week, Rus and others at MIT organized an event called AI and the Future of Work, where some speakers gave more dire warnings about the likely upheaval ahead (see “Is AI About to Decimate White Collar Jobs?”).

    The potential for AI to augment human skills is often mentioned, but it has been researched relatively little. Rus talked about a study by researchers from Harvard University comparing the ability of expert doctors and AI software to diagnose cancer in patients. They found that doctors perform significantly better than the software, but doctors together with software were better still.

    Rus pointed to the potential for AI to augment human capabilities in law and in manufacturing, where smarter automated systems might enable the production of goods to be highly customized and more distributed.

    Robotics might end up augmenting human abilities in some surprising ways. For instance, Rus pointed to a project at MIT that involves using the technology in self-driving cars to help people with visual impairment to navigate. She also speculated that brain-computer interfaces, while still relatively crude today, might have a huge impact on future interactions with robots.

    Although Rus is bullish on the future of work, she said two economic phenomena do give her cause for concern. One is the decreasing quality of many jobs, something that is partly shaped by automation; and the other is the flat gross domestic product of the United States, which impacts the emergence of new economic opportunities.

    But because AI is still so limited, she said she expects it to mostly eliminate routine and boring elements of work. “There is still a lot to be done in this space,” Rus said. “I am wildly excited about offloading my routine tasks to machines so I can focus on things that are interesting.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of MIT Technology Review is to equip its audiences with the intelligence to understand a world shaped by technology.

  • richardmitnick 11:49 am on October 20, 2017 Permalink | Reply
    Tags: Endless thirst and search for research knowledge, , Low-cost self-driving technology for power wheelchairs, Maya Burhanpurkar, Robotics, , Writing algorithms, Writing code   

    From Paulson: Women in STEM - “Driven to discover,” Maya Burhanpurkar 

    Harvard John A. Paulson School of Engineering and Applied Sciences

    October 18, 2017
    Adam Zewe

    Harvard freshman Maya Burhanpurkar spent the past year developing software for a low-cost, self-driving technology for power wheelchairs, and writing super-fast algorithms to process data from one of the world’s most powerful telescopes. (Photo by Adam Zewe/SEAS Communications.)

    Before she arrived on campus this fall, Harvard freshman Maya Burhanpurkar already had a notch in her belt typically reserved for Ph.D. candidates. First author of a paper on a low-cost, self-driving technology for power wheelchairs, Burhanpurkar presented the research at the 2017 IEEE International Conference on Rehabilitative Robotics.

    Sharing her work with hundreds of scientists from around the world was exhilarating, Burhanpurkar said, and another milestone in a research career that began at age 7.

    “I was always asking questions. Eventually, I started asking questions people didn’t have answers to, so I started doing my own projects,” she said.

    Maya Burhanpurkar discusses the low-cost, self-driving technology for power wheelchairs she and a University of Toronto team developed. (Image courtesy of Reuters.)

    The intrepid elementary school student, who grew up in a rural town 100 miles north of Toronto, set out to determine if herbs could kill pathogenic bacteria. She commandeered a piece of raw chicken meant for that night’s dinner, left it on the deck for a few days, and then swabbed it onto Petri dishes. In her basement microbiology laboratory, she piled herbs onto the Petri dishes and put them into a homemade incubator she built with a cooler and electric blanket.

    At Canada’s National Science Fair, Burhanpurkar showcased her incredible results—no bacterial growth meant the herbs must have killed the bacteria. A Science Fair judge quickly, but kindly, pointed out that the bacteria actually died due to suffocation.

    That experience only fueled Burhanpurkar’s desire to conduct more research. She began contacting professors and, while in ninth grade, joined a University of Toronto lab to build an apparatus that can physically detect the time integral of distance. The project earned her a second Grand Platinum award at the National Science Fair.

    Through middle and high school, she built a quantum key distribution system for cryptography at the Institute for Quantum Computing, tracked near-earth asteroids for the Harvard-Smithsonian Center for Astrophysics, and embarked on an expedition to study the impact of climate change on the Canadian and Greenlandic Arctic. The latter project led her to write and produce an award-winning climate change documentary titled “400 PPM.”

    Burhanpurkar at Jakobshavn fjord on the west coast of Greenland. (Photo provided by Maya Burhanpurkar.)

    “Research really drew me in because of the opportunity to answer unanswered questions,” she said. “It is so fascinating that you can make fundamental discoveries about the universe around us.”

    Not even an early acceptance by Harvard could disrupt her focus on research; Burhanpurkar deferred admission to work on the self-driving wheelchair technology during a gap year. Her University of Toronto team sought to develop a hardware and software package that would make it easier for people with severe physical disabilities to use power wheelchairs.

    “People with hand tremors or more severe Parkinson’s disease or ALS really struggle with a joystick or an alternate input device, like a sip-and-puff switch,” she said. “These people often have degraded mobility and a degraded quality of life.”

    Burhanpurkar developed a core part of the software for the semi-autonomous system that is capable of localization, mapping, and obstacle avoidance. The software utilizes off-the-shelf computer vision and odometry sensors rather than expensive 3D laser scanners and high-performance hardware, so it is more cost-effective than other devices, Burhanpurkar said.

    Despite her lack of coding experience, she wrote a specialized path-planning algorithm that enables autonomous doorway detection and traversal simply by placing the wheelchair in front of a door. She also helped develop software for autonomously traveling down long corridors, and docking at a desk, typically very difficult tasks for users with upper body mobility impairments.
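
    The article doesn’t spell out how the doorway routine works, so the following is only a rough, hypothetical Python sketch of the general idea: scan a forward-facing planar range profile for a doorway-sized opening and return a waypoint through its midpoint. The function name, threshold values, and scan format are assumptions for illustration, not the team’s implementation.

        import numpy as np

        # Illustrative sketch only (not the published system): find a doorway-sized
        # opening in a forward-facing planar range scan and return a waypoint
        # through its midpoint. All thresholds are assumed values.

        DOOR_MIN_WIDTH = 0.8   # metres: narrower openings are ignored
        DOOR_MAX_WIDTH = 1.2   # metres: wider openings are treated as open space
        RANGE_JUMP = 0.5       # metres: range discontinuity marking an opening edge

        def find_doorway(ranges, angles):
            """ranges: measured distances (m); angles: corresponding bearings (rad)."""
            # Scan points expressed in the robot frame.
            pts = np.stack([ranges * np.cos(angles), ranges * np.sin(angles)], axis=1)

            # A sudden increase in range marks the left door post; a sudden
            # decrease marks the right post.
            jumps = np.diff(ranges)
            left_posts = np.flatnonzero(jumps > RANGE_JUMP)        # point before the jump
            right_posts = np.flatnonzero(jumps < -RANGE_JUMP) + 1  # point after the jump

            for i in left_posts:
                for j in right_posts:
                    if j <= i:
                        continue
                    width = np.linalg.norm(pts[j] - pts[i])
                    if DOOR_MIN_WIDTH <= width <= DOOR_MAX_WIDTH:
                        return 0.5 * (pts[i] + pts[j])  # waypoint between the posts
                    break  # only test the nearest right post for this left post
            return None  # no doorway-sized opening in this scan

    Run on each new scan, a waypoint like this could be handed to the path planner, which would steer the chair through the opening while the obstacle-avoidance layer keeps it clear of the door frame.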

    “The challenge was what I enjoyed the most. I got thrown off the deep end in this project and I had to swim my way up, which was really fun,” she said. “It was intellectually interesting, but it was also emotionally interesting. Working on something that can directly impact people’s lives in the near future, not decades away, is really exciting.”

    But that was only half of Burhanpurkar’s gap year. She spent the other half as the youngest paid researcher at the Perimeter Institute for Theoretical Physics (where Stephen Hawking keeps an office), writing super-fast algorithms for a novel telescope in British Columbia. The telescope will continually map the entire northern hemisphere in an effort to learn more about cosmic fast radio bursts.

    Burhanpurkar working on code at the Canadian Hydrogen Intensity Mapping Experiment telescope in British Columbia. (Photo by Richard Bowden.)

    Each day, the planet is bombarded by high-energy, millisecond-duration bursts of radio waves, each carrying the energy of 500 million of our suns, but scientists remain puzzled about their origins. This new telescope will enable researchers to gather data on thousands of these bursts, opening the door for more detailed analysis.

    Terabytes of astronomical data will be generated each second, so the super-fast algorithms Burhanpurkar and the team wrote are necessary to efficiently process the mass of information.
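
    The article doesn’t say which algorithms the team wrote, but one standard step in searching for dispersed radio bursts is incoherent dedispersion: lower frequencies arrive later, so each frequency channel is shifted back by its dispersion delay before the channels are summed. Purely as a toy illustration (array shapes, names, and the single trial dispersion measure are assumptions), a minimal Python sketch might look like this:

        import numpy as np

        # Toy sketch of incoherent dedispersion at a single trial dispersion
        # measure (DM). Not the telescope's actual pipeline.

        K_DM = 4.148808e3  # dispersion constant, MHz^2 s cm^3 / pc

        def dedisperse(dynamic_spectrum, freqs_mhz, dt_s, dm):
            """dynamic_spectrum: array of shape (n_channels, n_samples);
            freqs_mhz: channel centre frequencies (MHz);
            dt_s: sample time (s); dm: trial dispersion measure (pc cm^-3)."""
            f_ref = freqs_mhz.max()
            # Delay of each channel relative to the highest-frequency channel.
            delays_s = K_DM * dm * (freqs_mhz**-2.0 - f_ref**-2.0)
            shifts = np.round(delays_s / dt_s).astype(int)

            series = np.zeros(dynamic_spectrum.shape[1])
            for chan, shift in enumerate(shifts):
                # Shift the delayed channel earlier in time, then accumulate.
                series += np.roll(dynamic_spectrum[chan], -shift)
            return series  # a real burst shows up as a sharp peak in this sum

    In a real-time search the spectrum must be dedispersed over many trial DMs as the data stream in, which is where the need for highly optimised code and a supercomputer comes from.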

    “I know that right now my code is running on a new 128-node supercomputer in British Columbia, and it is going to help detect one of the most enigmatic phenomena of the universe,” she said. “That’s pretty cool.”

    Now at Harvard, Burhanpurkar is not planning to slow down. She is interested in continuing her robotics research and working with the i-Lab to bring the cost-effective self-driving wheelchair technology to consumers.

    While she hasn’t decided on a concentration, she is considering computer science and physics (or both), and looks forward to pursuing her passion for research down new avenues.

    “Taking a gap year was great for perspective,” she said. “Before, I wasn’t sure what I wanted to do. I hadn’t really done long-term research projects, but now, I have actual experience and I know what the end goal is. I can use that to motivate me.”

    Burhanpurkar interviewing Canadian author and environmental activist Margaret Atwood for her climate change documentary titled “400 PPM.” (Photo provided by Maya Burhanpurkar.)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Through research and scholarship, the Harvard School of Engineering and Applied Sciences (SEAS) will create collaborative bridges across Harvard and educate the next generation of global leaders. By harnessing the power of engineering and applied sciences we will address the greatest challenges facing our society.

    Specifically, that means that SEAS will provide to all Harvard College students an introduction to and familiarity with engineering and technology as this is essential knowledge in the 21st century.

    Moreover, our concentrators will be immersed in the liberal arts environment and be able to understand the societal context for their problem solving, capable of working seamlessly with others, including those in the arts, the sciences, and the professional schools. They will focus on the fundamental engineering and applied science disciplines for the 21st century; we will not teach legacy 20th-century engineering disciplines.

    Instead, our curriculum will be rigorous but inviting to students, and be infused with active learning, interdisciplinary research, entrepreneurship and engineering design experiences. For our concentrators and graduate students, we will educate “T-shaped” individuals – with depth in one discipline but capable of working seamlessly with others, including arts, humanities, natural science and social science.

    To address current and future societal challenges, knowledge from fundamental science, art, and the humanities must all be linked through the application of engineering principles with the professions of law, medicine, public policy, design and business practice.

    In other words, solving important issues requires a multidisciplinary approach.

    With the combined strengths of SEAS, the Faculty of Arts and Sciences, and the professional schools, Harvard is ideally positioned to both broadly educate the next generation of leaders who understand the complexities of technology and society and to use its intellectual resources and innovative thinking to meet the challenges of the 21st century.

    Ultimately, we will provide to our graduates a rigorous quantitative liberal arts education that is an excellent launching point for any career and profession.
