In the last three years, laser-based optical radar, or LIDAR, has come a long way. LIDAR became famous when Google started using it on its self-driving cars, but it was big, clumsy, and slightly threatening. That spinning cylinder on top of a car or robot might help it navigate, but it looks so mechanical!
Then along came digital 3D laser scanners. With a diamond-studded microchip at their core, the invisible light beam is steered electronically, with no moving parts. The LIDAR chip is the size of a grain of rice and uses only a tiny amount of battery power. Now manufacturers are making robotic eyes the size of human eyes, and they are fully LIDAR-equipped.
This means that robots (and cars and drones) can truly see the world around them, with full depth perception and real-time feedback. Movements are smoother, and working with, and around, humans is much friendlier, almost human-like.
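That depth perception rests on a simple time-of-flight principle: the scanner times how long a pulse of light takes to bounce off an object and return, and since light travels at a known speed, that round-trip time gives the distance. Here is a minimal sketch of the arithmetic (the function name is my own illustration, not part of any LIDAR chip's actual interface):

```python
# Speed of light in a vacuum, in metres per second.
SPEED_OF_LIGHT = 299_792_458.0

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a target, given the laser pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half of (speed of light x round-trip time).
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after roughly 66.7 nanoseconds implies a
# target about 10 metres away.
distance_m = range_from_round_trip(66.7e-9)
```

Doing this millions of times per second across a scene is what turns a single light beam into the full 3D picture described above.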
But the most important thing is that robots’ eyes now look like normal eyes, and they watch your face for emotional cues, just like your pet dog does. Artificial intelligence gives these robots the ability to read your body language, too. Humanoid robots that help out at work or in the home are all the rage.
But the creepy part is, they can also respond when they are facing away from you. Your robot helper literally has eyes in the back of its head!