Evolution of Sensor Technology: Drones Don’t Need Eyes to See

Is there a way for a machine with artificial intelligence to get visual signals from the outside world other than through a camera?

In fact, there are many other ways to receive visual signals, and some of them provide a far richer view of the world. Much like nocturnal animals that perceive their surroundings through ultrasound, infrared radiation and the physical characteristics of the environment, autonomous systems can use the same kinds of information to gather data about the objects around them. Thermal imaging is already used in a variety of fields, from security to agriculture, and LIDAR systems rely on lasers to map the surroundings even in total darkness.
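To make the LIDAR idea concrete, here is a minimal sketch of how a laser scanner's raw range readings might be turned into obstacle information. The scan layout, beam spacing and safety radius are assumptions chosen for illustration, not values from any particular sensor.

```python
# Illustrative only: convert raw LIDAR range readings into 2D points in the
# sensor frame and flag the ones close enough to count as obstacles.
import math

def scan_to_points(ranges, angle_min=-math.pi / 2, angle_increment=math.radians(1.0)):
    """Convert a list of range readings (metres) into (x, y) points in the sensor frame."""
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue  # no laser return on this beam
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

def nearby_obstacles(points, safety_radius=1.5):
    """Return the points closer than the safety radius -- candidates for avoidance."""
    return [(x, y) for x, y in points if math.hypot(x, y) < safety_radius]

# Example: a 180-beam scan with a single close return straight ahead.
scan = [float("inf")] * 180
scan[90] = 0.8  # 0.8 m away on the centre beam
print(nearby_obstacles(scan_to_points(scan)))
```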

Broader Possibilities for Image Construction

Researchers at Stanford University and the University of California San Diego are working to broaden the ways in which robots can see their surroundings. They have designed a single-lens light field camera that not only covers a wider field of view but also records a four-dimensional signal, capturing the direction of incoming light as an extra layer of data. The invention is particularly well suited to aerial systems such as drones, which need high-quality visual information for safety reasons. Because the light field camera gathers light over a much larger surface, a drone can capture imagery from a wider area of its surroundings at once.
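To give a rough sense of what those extra two dimensions buy, the sketch below performs generic shift-and-sum refocusing on a 4D light field. This is a textbook light field operation, not the Stanford/UCSD team's actual processing pipeline; the array shapes and the integer-shift simplification are assumptions made for brevity.

```python
# Generic shift-and-sum refocusing of a 4D light field L[u, v, s, t].
# (u, v) index the viewing direction, (s, t) the image pixels.
import numpy as np

def refocus(light_field, slope):
    """Shift each sub-aperture view in proportion to its angular offset and average.

    Varying the slope brings different depth planes into focus, which is one way
    the extra angular data can be turned into depth and obstacle cues.
    """
    U, V, S, T = light_field.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            du = int(round(slope * (u - cu)))
            dv = int(round(slope * (v - cv)))
            out += np.roll(light_field[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)

# Example: a random 5x5 grid of 64x64 views, refocused at two different depths.
lf = np.random.rand(5, 5, 64, 64)
near = refocus(lf, slope=2.0)
far = refocus(lf, slope=0.0)
```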

“It is a much more advanced technology that allows the autonomous system to cope with its most difficult tasks, like scene shaping, determining camera motion, and noticing barriers and other moving bodies around it,” said Donald Dansereau, a PhD researcher at Stanford. “The approach we have adopted to these issues is power-saving and much more practical than the attempts that have been undertaken before us. We have fixed deadlines for the project and we are doing our best to provide a viable solution quickly, given the demand.”

The team’s invention should improve drone navigation software by decreasing latency and eliminating many of the flight errors that can prevent a device from landing when the landing area is complex and multi-layered. By enhancing the drone’s visual system, Dansereau’s innovation helps it fly faster and more smoothly, and land safely regardless of the characteristics of the environment.
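As one hedged illustration of the “complex, multi-layered landing area” problem, the snippet below rejects a candidate landing patch whose depth varies too much to be flat. The depth map, patch and threshold are invented for the example and do not come from Dansereau’s work.

```python
# If the vision system yields a depth map, a simple safety check is to accept a
# landing patch only when its depth spread is small (i.e. the surface is flat).
import numpy as np

def is_flat_enough(depth_patch, max_spread_m=0.10):
    """Accept a candidate landing patch only if its depth spread stays under the limit."""
    return float(depth_patch.max() - depth_patch.min()) <= max_spread_m

depth_map = np.random.uniform(1.98, 2.02, size=(120, 160))  # fake depth map in metres
candidate = depth_map[40:80, 60:100]                        # hypothetical landing patch
print("safe to land" if is_flat_enough(candidate) else "keep searching")
```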

Other alternative vision inventions can have similar uses. CrowdOptic, for example, in collaboration with Hewlett-Packard Enterprise, has recently designed a camera system that uses GPS to detect objects within its field of view. It does more than provide a camera image of the object it sees: it also reports the exact coordinates of the object’s location.
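The general geometry behind such a system can be sketched with plain trigonometry: two cameras that know their own GPS positions and the compass bearings they are pointing along can triangulate the coordinates of an object they both see. The example below is a generic flat-earth approximation, not CrowdOptic’s actual method, and every position and bearing in it is invented.

```python
# Triangulate an object from two known camera positions and their lines of sight.
import math

EARTH_RADIUS_M = 6371000.0

def to_local_xy(lat, lon, lat0, lon0):
    """Project lat/lon (degrees) to metres east/north of a reference point."""
    x = math.radians(lon - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_RADIUS_M
    return x, y

def to_lat_lon(x, y, lat0, lon0):
    lat = lat0 + math.degrees(y / EARTH_RADIUS_M)
    lon = lon0 + math.degrees(x / (EARTH_RADIUS_M * math.cos(math.radians(lat0))))
    return lat, lon

def triangulate(cam_a, bearing_a_deg, cam_b, bearing_b_deg):
    """Intersect two lines of sight. Cameras are (lat, lon); bearings are degrees clockwise from north."""
    lat0, lon0 = cam_a
    ax, ay = 0.0, 0.0
    bx, by = to_local_xy(cam_b[0], cam_b[1], lat0, lon0)
    dax, day = math.sin(math.radians(bearing_a_deg)), math.cos(math.radians(bearing_a_deg))
    dbx, dby = math.sin(math.radians(bearing_b_deg)), math.cos(math.radians(bearing_b_deg))
    denom = dax * dby - day * dbx
    if abs(denom) < 1e-9:
        return None  # lines of sight are parallel -- no unique intersection
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return to_lat_lon(ax + t * dax, ay + t * day, lat0, lon0)

# Example: two observers a few hundred metres apart looking at the same object.
print(triangulate((37.4275, -122.1697), 45.0, (37.4275, -122.1650), 315.0))
```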

“This is a unique invention,” said CrowdOptic CEO Jon Fisher. “We’ve managed to resolve every important trigonometry issue with drones and their visual systems.” This is an encouraging statement, given that drones need not only to detect the objects around them but also to “see” each other. That is a matter of public safety just as much as of drone efficiency.

Sensor technology is now booming, and sensor devices have never been as cheap and efficient as they are today. Alternative vision is the next step for drones, and a great opportunity for them to conquer an even larger share of the market. We can only hope that they will learn to fly peacefully alongside one another and start delivering pizza and shopping to our homes within the next 5 to 10 years.