(January 20, 2022; IEEE Spectrum)
Description: "The research that the Robotic Systems Lab at ETH Zürich has published showcases a control system for a legged robot to evaluate how reliable the exteroceptive information that it's getting is. When the data are good, the robot plans ahead and moves quickly."
Robots, like humans, generally use two different sensory modalities when interacting with the world. There’s exteroceptive perception (or exteroception), which comes from external sensing systems like lidar, cameras, and eyeballs. And then there’s proprioceptive perception (or proprioception), which is internal sensing, involving things like touch and force sensing. Generally, we humans use both of these sensing modalities at once to move around, with exteroception helping us plan ahead and proprioception kicking in when things get tricky. You use proprioception in the dark, for example, where movement is still totally possible—you just do it slowly and carefully, relying on balance and feeling your way around.
For legged robots, exteroception is what enables them to do all the cool stuff—with really good external sensing and the time (and compute) to do some awesome motion planning, robots can move dynamically and fast. Legged robots are much less comfortable in the dark, however, or really under any circumstances where the exteroception they need either doesn’t come through (because a sensor is not functional for whatever reason) or just totally sucks because of robot-unfriendly things like reflective surfaces or thick undergrowth or whatever. This is a problem because the real world is frustratingly full of robot-unfriendly things.
- Humans and robots use two types of perception, exteroception and proprioception.
- Exteroception comes from external sensing: things like sight and sound in humans, cameras and lidar in robots.
- Proprioception comes from internal sensing: things like touch, force, and balance.
Humans use both exteroception and proprioception when moving around, while robots have typically relied on exteroception alone. Now, though, the Robotic Systems Lab at ETH Zürich has published work in Science Robotics on integrating exteroception and proprioception into what they call a "belief state". Their technique relies on exteroception first, but falls back to proprioception when the exteroceptive data are found to be untrustworthy—in cases like walking on surfaces that are unstable or covered with snow or underbrush, for example. This enables the robot to move quickly when exteroception is working, and still move reliably when it's not.
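The fallback idea can be sketched as a confidence-weighted blend of the two estimates. This is a minimal illustration of the concept only—the scalar terrain-height estimates, the confidence score, and the linear blend are my own assumptions, not the paper's implementation:

```python
# Illustrative sketch (not the paper's method): fuse an exteroceptive
# terrain-height estimate with a proprioceptive one, weighted by a
# confidence score for the exteroceptive reading.
def fuse_belief(extero_height, proprio_height, extero_confidence):
    """Blend two height estimates; confidence must be in [0, 1].

    With confidence 1 the belief trusts the camera/lidar estimate;
    with confidence 0 it falls back entirely to leg-based sensing.
    """
    if not 0.0 <= extero_confidence <= 1.0:
        raise ValueError("confidence must be in [0, 1]")
    return (extero_confidence * extero_height
            + (1.0 - extero_confidence) * proprio_height)

# Trustworthy exteroception: the belief tracks the external sensor.
print(round(fuse_belief(0.30, 0.10, 0.9), 2))  # 0.28
# Snow or underbrush degrades the reading: fall back to proprioception.
print(round(fuse_belief(0.30, 0.10, 0.1), 2))  # 0.12
```

In the actual system the belief state is produced by a learned network rather than a hand-tuned blend, but the same trade-off—trust the cameras when they agree with the legs, trust the legs otherwise—is what the fusion is doing.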
The embedded video shows ANYmal in action, making use of the integrated belief state. The robot encodes its belief state with a neural network, which is trained by having a "student policy" attempt to follow the actions of a teacher.
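A minimal sketch of that teacher-student training idea, under the assumption (common in this line of work, though not spelled out above) that the teacher acts on clean terrain information while the student sees only noisy observations. The one-weight linear policy and all names here are illustrative, not ETH's code:

```python
# Hedged sketch of teacher-student imitation: a teacher policy sees
# ground-truth terrain; a student policy sees only noisy observations
# and is trained to reproduce the teacher's actions.
import random

random.seed(0)

def teacher_policy(true_terrain):
    # Privileged policy: acts on perfect terrain information.
    return 2.0 * true_terrain

def train_student(samples, lr=0.1, epochs=200):
    w = 0.0  # student's single weight: action = w * noisy_observation
    for _ in range(epochs):
        for true_terrain in samples:
            obs = true_terrain + random.gauss(0.0, 0.01)  # noisy sensing
            target = teacher_policy(true_terrain)          # imitation target
            pred = w * obs
            w -= lr * (pred - target) * obs                # squared-error step
    return w

w = train_student([0.2, 0.5, 1.0])
print(round(w, 1))  # ≈ 2.0: the student recovers the teacher's behavior
```

The point of the scheme is that the student never needs the privileged information at deployment time—it has learned to act well from the imperfect observations alone.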
Read the rest from IEEE Spectrum: Legged Robots Learn to Hike Harsh Terrain
Check the #penny4thoughts tag to find other active conversations.
###### This markdown/html was auto-generated by the Java Steem Links Creator prototype application.
Walking robots can easily move around on a flat surface, but in the field there are countless obstacles and stretches of uneven ground. In a new paper, the researchers present a control system that lets the robot use sensory information to assess not just the ground itself but the accuracy of that sensory data. If the data become too inaccurate or misleading, the robot can slow down and move more carefully until its perception improves. This is just one way to help robots traverse unfamiliar terrain, making slow but robust progress.
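That "slow down when perception degrades" behavior can be sketched as a speed command scaled by perception confidence. This is my own hypothetical illustration of the concept, not the paper's controller; the speed limits and the linear interpolation are assumptions:

```python
# Illustrative sketch: scale the commanded walking speed by an estimate
# of perception accuracy, so the robot slows to a careful crawl when its
# sensing degrades and speeds up again when perception improves.
def commanded_speed(max_speed, min_speed, perception_confidence):
    """Interpolate between a careful minimum and full speed (m/s)."""
    c = max(0.0, min(1.0, perception_confidence))  # clamp to [0, 1]
    return min_speed + c * (max_speed - min_speed)

print(commanded_speed(1.2, 0.2, 1.0))  # full speed on well-perceived ground
print(commanded_speed(1.2, 0.2, 0.0))  # careful crawl when perception fails
```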
The power of coding and sensors is incredible. This robot reminded me of a cartoon where a robo-dog was used to mine and extract gold and diamonds, but I can't remember which cartoon, since it's very old.
Sometimes I think that many science-fiction things become reality as time passes; the only thing left is to make interstellar travel a reality too.
This is a great advance for robotics.
With the development of artificial intelligence, and through deep learning techniques, the robot learns to improve its decision process: if the result is good, that decision is learned for the future; if it is harmful, the robot will avoid it and replace the decision with a different one.
Interesting to see the way it moves; the robot's locomotion is great, and it climbs up and down stairs very quickly. It seems great to me, but what I like most is that it is directed by a human, and I hope it will always be that way. To be honest, I don't like robots; let me explain: we humans come first, not the other way around.
In the near future, robots will be mankind's biggest helpers and servants. Robotics is an amazing technology that can change all our social aspects and disciplines.
An interesting post, with a good explanation of the interaction between the robot and humans. Thank you.
I saw this kind of robot in a video about machinery. They are much more efficient.