NXP drone accompanied by HAWK vision

Photo by rawpixel on Unsplash

It was the summer of 2020, after the first COVID-19 wave in Slovenia, when we decided to participate in NXP HoverGames Challenge 2.

Our project uses the drone as a self-flying robot that follows us wherever we go (master-slave functionality). When we want to take control, a special hand gesture switches the drone from follow-me mode to search mode. With this functionality (master tool), the drone becomes our sixth sense of perception.

The goal of our project is to create a platform that can be used for a variety of use cases:

- Pandemic: helping to locate large groups of people, acting as a messenger of information
- Natural disaster: identifying potentially dangerous points of interest (POI)
- Search on difficult terrain: people, pets, objects
- Hobbies: searching for mushrooms or special vegetation, getting an overview of the landscape
- Fun: families and friends

Summary (practical example): The drone follows us as we move. If we want to extend our vision 100 meters to the left or right, we extend our hand in that direction. The drone detects the hand gesture and initializes a pre-defined mission for an object of interest. If the object is found, it notifies the master user and lets them observe it through FPV goggles; if not, it returns to the master and continues in follow-me mode.
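To make that behavior concrete, here is a minimal sketch of the mode-switching loop in Python. It only illustrates the idea described above; the `drone`, `gesture_detector`, and `mission` objects and all of their methods are hypothetical placeholders, not the API of any particular flight stack.

```python
from enum import Enum, auto

class Mode(Enum):
    FOLLOW_ME = auto()   # drone keeps trailing the master
    SEARCH = auto()      # drone flies a pre-defined mission toward a point of interest

def control_loop(drone, gesture_detector, mission):
    """Hypothetical mode-switching logic: follow the master until a hand
    gesture triggers a search mission, then return to follow-me mode."""
    mode = Mode.FOLLOW_ME
    while True:
        if mode is Mode.FOLLOW_ME:
            drone.follow(drone.master_position())      # stay with the master
            gesture = gesture_detector.read()          # e.g. arm extended left/right
            if gesture in ("LEFT", "RIGHT"):
                mission.set_direction(gesture)         # search roughly 100 m that way
                mode = Mode.SEARCH
        else:  # Mode.SEARCH
            found = mission.run(drone)                 # fly the pre-defined search pattern
            if found:
                drone.notify_master(found)             # master can inspect it via FPV goggles
            mode = Mode.FOLLOW_ME                      # return to the master and resume following
```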

Continue reading on the original post

Sašo Pavlič
PhD student in computer science & informatics

My research interests include artificial intelligence, machine learning, neural architecture search, anomaly detection, …
