Control a Parrot AR.Drone with the Emotiv EPOC

The Parrot AR.Drone is a Wi-Fi-controlled quadcopter that now has development SDKs for all major platforms.

With the Emotiv EPOC SDK, a preliminary flight-control server was designed to run on a Windows PC and interact with any Web-capable mobile device.

https://github.com/Ruslan-B/AR.Drone

Use the headset to control the drone's flight while watching the drone's video feed on an iPod or iPad.

If you are interested in this project and would like to know more, please contact me.

AR.Drone Technical Details
Embedded computer system
ARM9 processor, 128 MB RAM, Wi-Fi b/g, USB, Linux OS
Inertial guidance systems
3-axis accelerometer
2-axis gyrometer
1-axis yaw precision gyrometer
Specs:
Speed: 5 m/s (18 km/h)
Weight: less than 1 pound
Flying time: ~12 mins
Ultrasound altimeter
Range: 6 meters – vertical stabilization
Camera
Vertical high-speed camera: up to 60 fps – allows stabilization
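These on-board systems are commanded over the drone's Wi-Fi link by plain-text AT commands sent via UDP to port 5556 (per the AR.Drone SDK). Below is a minimal sketch of such a sender, assuming the drone's default access-point address; the class and helper names are ours, while the REF constants are the SDK's takeoff/land values:

```python
import socket
import struct

DRONE_IP = "192.168.1.1"  # default address when joined to the drone's access point
AT_PORT = 5556            # UDP port the drone listens on for AT commands


def f2i(value):
    """Progressive-command floats travel as the 32-bit int sharing their bit pattern."""
    return struct.unpack("<i", struct.pack("<f", value))[0]


def format_at(cmd, seq, *args):
    """Build one AT command string, e.g. AT*REF=1,290718208<CR>."""
    return "AT*%s=%d,%s\r" % (cmd, seq, ",".join(str(a) for a in args))


class ATClient:
    def __init__(self, ip=DRONE_IP, port=AT_PORT):
        self.addr = (ip, port)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.seq = 0  # the drone ignores commands whose sequence number does not increase

    def send(self, cmd, *args):
        self.seq += 1
        self.sock.sendto(format_at(cmd, self.seq, *args).encode("ascii"), self.addr)

    def takeoff(self):
        self.send("REF", 290718208)  # bit 9 set -> take off

    def land(self):
        self.send("REF", 290717696)  # bit 9 clear -> land

    def move(self, roll, pitch, gaz, yaw):
        # AT*PCMD with flag=1 enables progressive movement; each axis is in [-1, 1]
        self.send("PCMD", 1, f2i(roll), f2i(pitch), f2i(gaz), f2i(yaw))
```

Movement commands have to be resent continuously; if the drone stops hearing from the client, its watchdog makes it hover in place.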

Emotiv EPOC Technical Details
Based on EEG; 14 sensors positioned for accurate spatial resolution
Facial expression detection is very fast (<10 ms)
The wireless chip is proprietary and operates at 2.4 GHz
Hacked for use from Python (Emokit):
https://github.com/daeken/Emokit/blob/master/Announcement.md
https://github.com/daeken/Emokit

http://dsc.discovery.com/videos/prototype-this-mind-controlled-car.html
http://www.autonomos.inf.fu-berlin.de/subprojects/braindriver
http://sensorlab.cs.dartmouth.edu/pubs/neurophone.pdf
http://www.engadget.com/2011/02/19/german-researchers-take-mind-controlled-car-for-a-carefully-cont/
Our goal is to control the AR.Drone using thoughts via the Emotiv EPOC.

Control the AR.Drone using a computer
Get commands from the Emotiv EPOC and process them
Design an architecture that connects the two and is extensible to multiple devices
Establish connections and fine-tune the data for smooth control

Map headset signals to reasonable commands
Create a channel between the headset interface commands and the AR.Drone
Client/server architecture:
allows us to control multiple AR.Drones remotely
programs can be extended to run in different environments
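A first cut at the signal-to-command mapping can be a lookup table with a power threshold to reject weak detections. This is a sketch under our own naming assumptions: the detection names imitate Emotiv's cognitive/expressive events, and the command strings are whatever the drone controller consumes:

```python
# Hypothetical mapping from Emotiv detections to drone commands.
SIGNAL_TO_COMMAND = {
    "push":   "pitch_forward",
    "pull":   "pitch_backward",
    "lift":   "gaz_up",
    "drop":   "gaz_down",
    "blink":  "takeoff",
    "clench": "land",
}


def map_signal(detection, power, threshold=0.3):
    """Translate one headset detection into a drone command.

    Detections below the power threshold are treated as noise and
    map to 'hover' so the drone holds its position.
    """
    if power < threshold:
        return "hover"
    return SIGNAL_TO_COMMAND.get(detection, "hover")
```

Defaulting to "hover" for weak or unrecognized signals keeps the failure mode safe: when in doubt, the drone holds still rather than acting on noise.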
 

[System architecture diagram: the Emotiv EPOC headset feeds the EmoEngine core (via Emotiv Connect / EmoComposer); a server applies the configured command mappings; the client polls it every 500 ms and drives a virtual input device that updates every 20 ms, whose buffered output goes to the Win32 AR.Drone controller and on to the quad-copter.]
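Since the client produces a command roughly every 500 ms while the virtual input device must emit one every 20 ms, a small buffer that repeats the latest command bridges the two rates. A minimal sketch of that idea (the class and method names are ours):

```python
class CommandBuffer:
    """Bridges a slow producer (client, ~500 ms) and a fast consumer
    (virtual input device, ~20 ms) by repeating the most recent
    command until a new one arrives."""

    def __init__(self, default="hover"):
        self.latest = default  # safe default until the first real command

    def push(self, command):
        """Called by the client whenever a new command is decoded."""
        self.latest = command

    def poll(self):
        """Called every 20 ms by the input device; always returns a command."""
        return self.latest
```

In a threaded server, `push` and `poll` would run on different threads; for a single attribute assignment this is safe in CPython, but a lock would be the cautious choice.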

Challenges:
The Emotiv SDK is platform-dependent
The headset sends many signals
States change very rapidly, causing noisy intermediate states
Training requires focus and is not transferable from person to person
There is no universal training method that yields the same results
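One way to tame those rapid state changes, not part of the Emotiv SDK but a common smoothing trick, is a majority vote over a short sliding window of detections:

```python
from collections import Counter, deque


class StateSmoother:
    """Suppress rapid state flicker by emitting the majority state
    over the last `window` detections."""

    def __init__(self, window=5):
        self.history = deque(maxlen=window)

    def update(self, state):
        """Record one raw detection and return the smoothed state."""
        self.history.append(state)
        return Counter(self.history).most_common(1)[0][0]
```

A single spurious detection in an otherwise steady stream is outvoted by its neighbors, at the cost of `window` samples of latency before a genuine state change is reported.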

Future Enhancements:
Enhance the system to connect to multiple clients
Enable the system to work remotely over the Internet
The client could be made more intelligent to handle emergency situations

The technology could be used to control devices we use in our daily routine, like cars, phones, and other electronics.
In the long run, EEG devices could improve to the point where controlling devices becomes as natural as controlling one's own body parts.

We were able to fully control the AR.Drone, at first using facial expressions and the gyrometer, and later using only cognitive commands. Given time, this system could be further enhanced to control multiple devices simultaneously with higher accuracy.

The available technology for reading and processing thoughts is good enough to control a system with a limited command set, but it needs substantial improvement before it can drive complex systems.
Great learning experience

Liam Broza

Co-Founder at Companion Intelligence

https://liambroza.com