
Animal brain inspired AI game changer for autonomous robots: First neuromorphic vision and control of a flying drone

www.sciencedaily.com

A team of researchers has developed a drone that flies autonomously using neuromorphic image processing and control based on the workings of animal brains. Animal brains use less data and energy than current deep neural networks running on GPUs (graphics chips). Neuromorphic processors are the...

  • This is the best summary I could come up with:


    The processors made for running deep neural networks (Graphics Processing Units, GPUs) consume a substantial amount of energy.

    In an article published in Science Robotics on May 15, 2024, researchers from Delft University of Technology, the Netherlands, demonstrate for the first time a drone that uses neuromorphic vision and control for autonomous flight.

    Specifically, they developed a spiking neural network that processes the signals from a neuromorphic camera and outputs control commands that determine the drone's pose and thrust (a rough sketch of this kind of pipeline appears after this summary).

    Over generations of artificial evolution, the spiking neural networks became increasingly good at control and were eventually able to fly in any direction at different speeds (see the evolution sketch below).

    The drone can even fly under flickering lights, which cause the pixels of the neuromorphic camera to send large numbers of signals to the network that are unrelated to motion.

    At Delft University of Technology's Faculty of Aerospace Engineering, we work on tiny autonomous drones that can be used for applications ranging from monitoring crops in greenhouses to keeping track of stock in warehouses.


    The original article contains 1,032 words, the summary contains 169 words. Saved 84%. I'm a bot and I'm open source!
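
The pipeline described in the summary (event-camera signals in, spiking network in the middle, pose and thrust commands out) can be pictured with a minimal sketch. This is not the authors' code: the leaky integrate-and-fire model, the layer sizes, and names like `SpikingController` are assumptions for illustration only.

```python
# Minimal sketch (not the paper's implementation) of a spiking controller
# that turns event-camera activity into thrust and attitude commands.
import numpy as np

class LIFLayer:
    """One layer of leaky integrate-and-fire neurons."""
    def __init__(self, n_in, n_out, leak=0.9, threshold=1.0, rng=None):
        rng = rng or np.random.default_rng(0)
        self.w = rng.normal(0.0, 0.3, size=(n_out, n_in))
        self.leak = leak
        self.threshold = threshold
        self.v = np.zeros(n_out)          # membrane potentials

    def step(self, spikes_in):
        # Integrate weighted input spikes, leak, fire, then reset.
        self.v = self.leak * self.v + self.w @ spikes_in
        spikes_out = (self.v >= self.threshold).astype(float)
        self.v[spikes_out > 0] = 0.0      # reset neurons that fired
        return spikes_out

class SpikingController:
    """Event-camera spike counts in, (thrust, roll, pitch, yaw) out."""
    def __init__(self, n_pixels=64, n_hidden=32):
        self.hidden = LIFLayer(n_pixels, n_hidden)
        self.out = LIFLayer(n_hidden, 4)
        self.trace = np.zeros(4)          # low-pass filter of output spikes

    def step(self, event_counts):
        h = self.hidden.step(event_counts)
        o = self.out.step(h)
        self.trace = 0.95 * self.trace + 0.05 * o
        thrust, roll, pitch, yaw = self.trace
        return thrust, roll, pitch, yaw

# Usage: feed per-pixel event counts from one camera time slice each step.
ctrl = SpikingController()
events = np.random.poisson(0.2, size=64)   # stand-in for real event data
print(ctrl.step(events))
```

The point of the low-pass trace at the output is only to show how discrete spikes can be turned into smooth control signals; the real network's readout may work differently.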
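
The "artificial evolution" mentioned above can likewise be sketched as a simple mutate-and-select loop over the network's weights. The fitness function below is a placeholder; in the actual work it would score simulated or real flight performance, and the hypothetical `SpikingController` from the previous sketch is only referenced for illustration.

```python
# Rough sketch of evolving controller weights by mutation and selection.
# fitness_of() is a placeholder, not the paper's evaluation setup.
import numpy as np

def mutate(weights, sigma=0.05, rng=None):
    rng = rng or np.random.default_rng()
    return [w + rng.normal(0.0, sigma, size=w.shape) for w in weights]

def fitness_of(weights):
    # Placeholder: a real fitness would fly the network in simulation and
    # score how well it tracks commanded velocities.
    return -float(sum(np.sum(w ** 2) for w in weights))

def evolve(init_weights, generations=100, population=20, elite=4):
    rng = np.random.default_rng(0)
    pop = [mutate(init_weights, rng=rng) for _ in range(population)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness_of, reverse=True)
        parents = scored[:elite]                      # keep the best flyers
        pop = parents + [mutate(parents[rng.integers(elite)], rng=rng)
                         for _ in range(population - elite)]
    return max(pop, key=fitness_of)

# Usage with the hypothetical controller from the previous sketch:
# best_weights = evolve([ctrl.hidden.w, ctrl.out.w])
```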