Motion tracking with the Parrot AR.Drone Quadrocopter

One of the toys that the Computing Club has to play with is an AR.Drone 1.0. This is a pre-built WiFi-enabled quadrocopter manufactured by Parrot. There are official iOS and Android applications for remotely controlling the quadrocopter. The AR.Drone also streams a live video feed from its onboard camera to the controller. Flying the drone around from an app is fun enough, but where things get really interesting for the Computing Club is programming it ourselves! Over the last few months, undergraduates have been tinkering with the drone, making it do various things using the open-source javadrone API.

Kirill Sidorov and I, organisers of the Computing Club this academic year, were asked to prepare a demo for an upcoming School of Computer Science & Informatics Open Day. The aim of these open days is to enthuse A-Level students who are considering studying Computer Science. We needed something that was interactive and fun, but that also allowed us to highlight some of the concepts of computer science and what makes it interesting. We decided on a motion-tracking AR.Drone demo. We’d use the onboard camera to have the drone follow an individual holding a target. There’s some neat computer science here – control and computer vision in particular – and it also demonstrates the power of using software to program real-world devices. It also meant we could build on the work done by Computing Club students and bring them in to chat to visitors at the Open Day.

Conveniently, a few days before the first Open Day (17 April) was the two-day “Open Sauce” Hackathon. Kirill and I were attending anyway to help with the student-organised event, so we took advantage of the fruitful combination of hackathon ambience, energy drinks, and free food to build the demo over those two days. The repository is hosted on GitHub. The original output from the Hackathon is in this branch (warning: gnarled, hackathon-quality code). This was tweaked and (slightly) refactored over the following days in preparation for the Open Day, resulting in this.

Building the AR.Drone demo at the 2013 “Open Sauce” hackathon.

The target we used during the hackathon was a ping-pong paddle wrapped in an A4 sheet of paper coloured with pink highlighter. In hindsight, the lighting conditions of the venue were very consistent, making it a favourable test environment. Kirill prototyped some frame-by-frame video processing in MATLAB to extract the target, and then translated it to native Java. I handled the interaction with the AR.Drone and the control loop. We also implemented a fairly crude but useful GUI to view the raw and processed image streams, debug some control parameters, and initiate take-off and landing (emergency, typically). The javadrone API made controlling the drone straightforward, and even allowed us to implement some nifty features like changing the drone’s LED colours when the target is lost.
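
For a flavour of the image side, here is a minimal sketch of the kind of per-frame colour thresholding involved; the class name, thresholds, and “pink” hue band are illustrative rather than the actual MATLAB-derived code in the repository:

```java
import java.awt.Color;
import java.awt.image.BufferedImage;

/** Illustrative per-frame target extraction by hue/saturation/brightness thresholding. */
public class TargetDetector {

    /** Result: centroid (in pixels) and extent (number of matching pixels). */
    public static class Detection {
        public final double x, y;
        public final int extent;

        public Detection(double x, double y, int extent) {
            this.x = x;
            this.y = y;
            this.extent = extent;
        }
    }

    /** Returns null when too few pixels match (target lost). */
    public static Detection detect(BufferedImage frame, int minPixels) {
        long sumX = 0, sumY = 0;
        int count = 0;
        float[] hsb = new float[3];
        for (int y = 0; y < frame.getHeight(); y++) {
            for (int x = 0; x < frame.getWidth(); x++) {
                int rgb = frame.getRGB(x, y);
                Color.RGBtoHSB((rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF, hsb);
                // Rough "highlighter pink" band: hue near magenta, reasonably saturated and bright.
                if (hsb[0] > 0.85f && hsb[1] > 0.4f && hsb[2] > 0.5f) {
                    sumX += x;
                    sumY += y;
                    count++;
                }
            }
        }
        if (count < minPixels) {
            return null; // target lost
        }
        return new Detection((double) sumX / count, (double) sumY / count, count);
    }
}
```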

The image component outputs the location (a pixel coordinate) and extent (a measure proportional to the target’s size in view) of the target in the camera’s view. This information drives our three control variables (sketched in code after the list):

  1. Forward/back tilt for moving forwards and backwards to maintain a particular distance from the target.
  2. Left/right rotation to keep the target horizontally centred.
  3. Vertical ascent/descent to keep the camera and target at the same height.
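
As a rough sketch of how the detection output feeds those three variables (the frame dimensions are as streamed by the drone, but the desired-extent set-point below is illustrative, not a tuned value):

```java
/** Illustrative mapping from the detected target's centroid and extent to the three control errors. */
public class ControlErrors {
    static final int FRAME_W = 320, FRAME_H = 240;  // resolution actually streamed by the drone
    static final double DESIRED_EXTENT = 1500;      // target size (pixels) at the desired distance; made up

    public final double pitchError;  // 1. forward/back: positive when the target appears too small (too far away)
    public final double yawError;    // 2. left/right rotation: horizontal offset of the target from the image centre
    public final double heightError; // 3. ascent/descent: vertical offset from the image centre (image y grows downwards)

    public ControlErrors(double targetX, double targetY, double extent) {
        pitchError  = DESIRED_EXTENT - extent;
        yawError    = targetX - FRAME_W / 2.0;
        heightError = FRAME_H / 2.0 - targetY;
    }
}
```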

We didn’t have much time to fully explore the handling of the drone with respect to these control variables, but experimenting with a few simple linear controllers and a PID or two resulted in decent tracking, as undergraduate George Sale demonstrates in this video:

(As shown in the video, as well as this other one, pretty much every flight ended up with a haywire drone and me initiating a forced landing.)
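
For the curious, here is a minimal sketch of the kind of per-channel PID controller we were playing with; the structure and clamping are placeholders rather than the code used on the day:

```java
/** Minimal PID controller; output clamped to a [-1, 1] range suitable for tilt/speed commands. */
public class Pid {
    private final double kp, ki, kd;
    private double integral = 0.0;
    private double previousError = Double.NaN;

    public Pid(double kp, double ki, double kd) {
        this.kp = kp;
        this.ki = ki;
        this.kd = kd;
    }

    /** error: current error on one channel; dt: seconds since the last frame. */
    public double update(double error, double dt) {
        integral += error * dt;
        double derivative = Double.isNaN(previousError) ? 0.0 : (error - previousError) / dt;
        previousError = error;
        double output = kp * error + ki * integral + kd * derivative;
        return Math.max(-1.0, Math.min(1.0, output)); // clamp to the drone's command range
    }
}
```

One such controller per control variable, fed the corresponding error each frame, with the outputs handed on to javadrone’s movement command.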

That was the hackathon; the Open Day proved much more challenging. In our hackathon experiments, the specificity of our target detection was excellent. Specificity was our primary concern, since a false-positive target detection puts bystanders wearing unfortunately coloured clothing on the receiving end of multi-bladed drone fury. The Open Day venue had very uneven lighting: patchy artificial lights, and a large window in one corner that would temporarily flood the camera depending on the drone’s angle. As a result, the colour profile of the paddle changed drastically with the angle of the drone, the location of the target, and the location of the drone.

To deal with this, our first trick was to change the target. Significant variation in light reflection between dimly lit and brightly lit areas meant large changes in the target’s brightness and hue. By switching to a backlit target we could ensure fairly consistent brightness, irrespective of ambient light. Using a bike light, a home-made filter (highlighted A4 paper), a diffuser (coffee filter paper), and a filter assembly (a polystyrene cup), we hacked together the following target:

The assembled target: light diffuser, polystyrene cup, and green filter.

(Yes, we effectively built a cheap PlayStation Move controller.)

The resulting target had a very consistent and distinct appearance. After this, there were just a few camera-related issues to tackle; in particular:

  • Although the camera resolution is 640x480, the drone only streams 320x240 back to the laptop. Nothing much to say here, except that it’s surprising (802.11g can handle the required bandwidth and latency) and inconvenient.
  • Either the camera hardware or the drone firmware was doing some unwanted brightness auto-adjustment, which we had to un-adjust back on the laptop.
  • The lens quality is poor. We had to discard everything outside a centre 320px-wide circle to cull corner artefacts.
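
A sketch of that last fix, blanking everything outside a 320px-wide centre circle before detection runs (the helper name is made up):

```java
import java.awt.image.BufferedImage;

/** Blanks pixels outside a centre circle to discard lens artefacts near the frame corners. */
public class CentreMask {
    public static void apply(BufferedImage frame, int circleDiameterPx) {
        double cx = frame.getWidth() / 2.0;
        double cy = frame.getHeight() / 2.0;
        double r2 = (circleDiameterPx / 2.0) * (circleDiameterPx / 2.0);
        for (int y = 0; y < frame.getHeight(); y++) {
            for (int x = 0; x < frame.getWidth(); x++) {
                double dx = x - cx, dy = y - cy;
                if (dx * dx + dy * dy > r2) {
                    frame.setRGB(x, y, 0xFF000000); // opaque black; never matches the target threshold
                }
            }
        }
    }
}
```

Something like CentreMask.apply(frame, 320) would run on each incoming frame before thresholding.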

And then, finally, we were left with a superb signal and a negligible false-positive rate.

Target triumph! Left panel: raw stream. Right panel: processed video stream; red pixels and white circle indicate detected target.

The control still needs a lot of work, but the drone flies and reacts well. It’s enjoyable watching people have a go at it. Initially, people are very tentative. This is unsurprising; the drone’s forward/back lunging can be vicious at first (although it usually stabilises before quite reaching the volunteer). After a few goes, they’re able to start taking it on tours around the demo area, almost like walking a dog; albeit a dog that is noisier, less well behaved, and hovering in mid-air.