Sunday, January 17, 2010

Arduino_Processing Face Follower

Here is another Arduino and Processing project I built this weekend.
A face follower detects people's faces in a scene captured through a webcam and tracks them as they move.

My face follower is simple. It is built on the previous project LaserGun, and the code is very similar. The Arduino part is exactly the same, actually.









The differences lie in the use of a webcam, to continuously acquire the scene, and in the use of a computer vision library, OpenCV, which is the actual engine behind the magic.

Here you can find some more information about it: Wikipedia OpenCV entry, OpenCV Wiki.


Requirements: 
This project requires:
  1. A Windows XP PC with a webcam. I tried on Windows Vista but without success. Windows 7 probably does not work either.
  2. An Arduino board.
  3. Two servos.
  4. Some wires.
  5. One toy laser (optional, for a more accurate calibration).

Building:
Building is exactly the same as for the LaserGun project.
In place of the laser you can put a toy figure face or a puppet face, so that when the servos move you will see the puppet face panning and tilting.


The Software
The Arduino board software is exactly the same as in the LaserGun project.

The PC software requires some additional components, but starts from the same base.
I followed the instructions I found at http://ubaa.net/shared/processing/opencv/, which is the main site for OpenCV integration with Processing.

On your PC, you will need:
  1. The Processing environment. Enough said.

  2. OpenCV library: go download it and be sure NOT to download the wrong version. We need version 1.0. I suggest installing it in c:\opencv10 or in another simple path without spaces in the name. I do not recommend installing it in the default location, which probably depends on your Windows system language and may contain spaces (as in "Program Files"). I also suggest using an all-lowercase name. During installation, answer yes when prompted to alter the path by adding the c:\opencv10\bin folder.
    After installation, I suggest editing the system path so that this folder is at the beginning of the path, as shown in this picture (you can get there via right click on My Computer, then Properties, then the Advanced tab, then the Environment Variables button).














    A reboot is not needed.








  3. OpenCV Processing library: this interfaces your Processing environment with OpenCV. You can download it from here, expand it, and put the contents inside the "libraries" folder in your Processing installation root. In my case I put it in c:\inst\processing-1.0.9\processing-1.0.9\libraries


Optionally, you can also install the Processing OpenCV examples. Get them from http://ubaa.net/shared/processing/opencv/download/opencv_examples.zip (5 MB)
and copy the opencv-examples folder accordingly into C:\inst\processing-1.0.9\processing-1.0.9\examples


Checking the environment installation
When starting Processing you should be able to open and run the example code. If you do not see the camera feed, it is probably because your OS is not XP (I got a pitch-black camera feed on my Vista laptop).
For the face detection demos to work, you need to copy the proper Haar recognizer data file into the sketch folder. Get the data files from C:\openCV10\data\haarcascades
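For the FRONTALFACE demo used below, the file to copy into the sketch folder would be something like this (the exact sketch folder path depends on where you saved your sketch):

```
copy C:\openCV10\data\haarcascades\haarcascade_frontalface_alt.xml <your sketch folder>
```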



Here is the code (you can also download it from my Google Docs):

-----------

//
// Processing code for a two axis Face-Following interacting with arduino firmata
// to control servos connected to analog 9 and analog 10 pins
// project "facefollow" by Marco Guardigli, email: mgua@tomware.it  twitter: @mgua
//
// ATTENTION! This software makes a laser pointer track a face.
//            Use with extreme caution.
//
//
// see
// http://marco.guardigli.it/2010/01/arduinoprocessing-face-follower.html
// 
// this code is free software, released under the GNU GPL license
// see www.gnu.org for license details
//
// copyleft Marco Guardigli
//    2010 jan 18: first version
//
//
import hypermedia.video.*;   // OpenCV Processing library
import processing.serial.*;
import cc.arduino.*;         // Arduino Firmata client
import java.awt.Rectangle;   // opencv.detect() returns java.awt.Rectangle objects
OpenCV opencv;
Arduino arduino;

int maxx = 640;          // windows sizes, I suggest not to go over 640x480
int maxy = 480;

int calibrating = 0;     // nonzero during calibration, states are 0,1,2

int calibrateminx=80;    // recalibration window
int calibrateminy=80;
int calibratemaxx=100;
int calibratemaxy=100;
int cx1, cy1, cx2, cy2;  // screen mousex/y coords of the two calibration points

float dzx, dzy;
int cfacex, cfacey;    // center of the first face detected

int maxservox=170;     // maximum servo excursions - to be redefined in recalibration
int minservox=10;
int maxservoy=170;
int minservoy=10;
int initialservox = 90;
int initialservoy = 90;
int servox, servoy;    // current servos positions -in servo range-
int laseroff = 0;      // laser is controlled (improperly) as a servo
int laseron = 100;     // zero is not actually turned off, but is dimmer


void setup() {
  println(Arduino.list());

// IMPORTANT! This code will not work if you do not write the correct 
// id of the serial interface in the next line (in my case 2)
  arduino = new Arduino(this, Arduino.list()[2], 57600);
  arduino.analogWrite(9,initialservox);
  arduino.analogWrite(10,initialservoy);
  arduino.analogWrite(11,laseroff);  // laser off
  size(maxx,maxy);
 
  opencv = new OpenCV(this);
  opencv.capture( width, height );
  opencv.cascade( OpenCV.CASCADE_FRONTALFACE_ALT );    // load the FRONTALFACE description file

  opencv.read();
  image( opencv.image(), 0, 0 );
}


void draw() {
  opencv.read();
  image( opencv.image(), 0, 0 );
  Rectangle[] faces = opencv.detect();     // detect anything resembling a FRONTALFACE
  noFill(); stroke(255,0,0);
  for( int i=0; i < faces.length; i++ ) {
      rect( faces[i].x, faces[i].y, faces[i].width, faces[i].height );
  }
 
  switch (calibrating) {
    case 0: {  // no calibration in course: pointer follows face 
      if (faces.length > 0) {
        cfacex = int(faces[0].x + (faces[0].width / 2));
        cfacey = int(faces[0].y + (faces[0].height / 2));
        servox = int(map(cfacex,0,maxx,minservox,maxservox));
        servoy = int(map(cfacey,0,maxy,minservoy,maxservoy));
        arduino.analogWrite(9,servox);
        arduino.analogWrite(10,servoy);
      }
      break;    // without this break, execution would fall through into the calibration cases
    }
    case 1: {   // need to read first calibration point
      cx1 = mouseX;
      cy1 = mouseY;
      break;
    }
    case 2: {   // need to read second calibration point
      cx2 = mouseX;
      cy2 = mouseY;
      break;
    }
  }
}


void mousePressed() {
    if (mouseButton == LEFT) {
      if (calibrating == 0) {                // draw shot on screen
        arduino.analogWrite(11,laseron);     // and intensify laser
        stroke(200,200,0);                 
        fill(200,0,0);
        ellipse(cfacex,cfacey,5,5);
        delay(500);
        ellipse(cfacex,cfacey,10,10); 
        arduino.analogWrite(11,laseroff); 
      }
    }
}   
   

void mouseReleased() {
  if (mouseButton == RIGHT) {
    switch (calibrating) {
      case 0: { 
            calibrating = 1;   // stops laser following mouse pointer
            arduino.analogWrite(9,calibrateminx);
            arduino.analogWrite(10,calibrateminy);
            arduino.analogWrite(11,laseron);     // and intensify laser           
            println("cx1/cy1: point mouse to where laser pointer is and RCLICK");
            break;
      }
      case 1: {  // arriving here after rclick release in calibration point 1
            calibrating = 2;
            arduino.analogWrite(9,calibratemaxx);
            arduino.analogWrite(10,calibratemaxy); 
            arduino.analogWrite(11,laseron);     // and intensify laser                       
            print("  calibration point1: "); print(cx1); print(" , ");  println(cy1);
            println("cx2/cy2: point mouse to where laser pointer is and RCLICK");
            break;
      }
      case 2: {  // arriving here after rclick release in calibration point 2
            print("  calibration point2: "); print(cx2); print(" , ");  println(cy2);
            // (cx1,cy1) corresponds to (calibrateminx, calibrateminy)
            // (cx2,cy2) corresponds to (calibratemaxx, calibratemaxy)
            // i will recalculate minservox, minservoy and maxservox, maxservoy
            if (((cx2-cx1) != 0) && ((cy2-cy1) != 0)) {
              stroke(200);
              line (cx1,cy1,cx1,cy2);
              line (cx1,cy2,cx2,cy2);
              line (cx2,cy2,cx2,cy1);
              line (cx2,cy1,cx1,cy1);
 
              dzx = (calibratemaxx - calibrateminx);

              dzx = dzx / (cx2 - cx1);              // dzx is how much servo per pixel
              dzy = calibratemaxy - calibrateminy;
              dzy = dzy / (cy2 - cy1);

              float leftx = calibrateminx - ( dzx * cx1 );
              float rightx = calibratemaxx + ( dzx * (maxx-cx2) );
              float upy = calibrateminy - ( dzy * cy1 );
              float downy = calibratemaxy + ( dzy * (maxy-cy2) );

              minservox = int(leftx);
              maxservox = int(rightx);
              minservoy = int(upy);
              maxservoy = int(downy);
            } else {
              println("Invalid calibration points selected.");
            }           
            calibrating = 0;
            arduino.analogWrite(11,laseroff);     // and dim laser           
            break;
          } // end case 2
      default: {
            break;
          } // end case default
      } // end switch
    } // end if mousebutton right
} // end mouseReleased

 



------------
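The pointer-follows-face step in draw() is just a linear rescale: the face center in camera pixels is mapped onto the servo excursion range. Here is the same arithmetic as a standalone Java sketch (class and helper names are mine, reimplementing Processing's map()):

```java
public class ServoMap {
    // Linear rescale, equivalent to Processing's map()
    static float map(float value, float inMin, float inMax, float outMin, float outMax) {
        return outMin + (outMax - outMin) * (value - inMin) / (inMax - inMin);
    }

    public static void main(String[] args) {
        int maxx = 640;                       // capture width, as in the sketch
        int minservox = 10, maxservox = 170;  // servo excursion limits

        // A face centered in the frame should drive the servo to mid-range
        int cfacex = 320;
        int servox = (int) map(cfacex, 0, maxx, minservox, maxservox);
        System.out.println(servox);  // 90
    }
}
```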


Calibration
As in the LaserGun project, it is important to properly calibrate the system so that it works decently.
To minimize errors, try to keep the laser and the webcam as close together as possible.
Calibration is started with a right click, after which you right-click on the screen where you see the laser pointer. Repeat for the two points asked.
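The calibration math in case 2 of mouseReleased() computes how many servo units correspond to one screen pixel, then extrapolates the servo limits out to the screen borders. A standalone sketch of that arithmetic along the x axis, using made-up click coordinates rather than real measurements:

```java
public class CalibrationMath {
    public static void main(String[] args) {
        int maxx = 640;                               // screen width in pixels
        int calibrateminx = 80, calibratemaxx = 100;  // servo positions used for the two test shots
        int cx1 = 160, cx2 = 480;                     // hypothetical clicks on the two laser dots

        // servo units per pixel along x
        float dzx = (float) (calibratemaxx - calibrateminx) / (cx2 - cx1);

        // extrapolate from the calibration window out to the screen borders
        int minservox = (int) (calibrateminx - dzx * cx1);           // servo value for x = 0
        int maxservox = (int) (calibratemaxx + dzx * (maxx - cx2));  // servo value for x = maxx

        System.out.println(dzx);        // 0.0625
        System.out.println(minservox);  // 70
        System.out.println(maxservox);  // 110
    }
}
```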

Multiple faces: 
Currently, the OpenCV library is able to detect more than one face in the scene, but the detected faces are not always returned in the same order. If you present two faces to the current system, it will get confused. More accurate movement detection and tracking over time would be needed.
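One simple workaround, not implemented in the sketch above, is to always follow the largest detected rectangle, which usually belongs to the face closest to the camera. A hypothetical helper:

```java
import java.awt.Rectangle;

public class LargestFace {
    // Pick the detection with the biggest area; returns null when nothing was detected
    static Rectangle largest(Rectangle[] faces) {
        Rectangle best = null;
        for (Rectangle r : faces) {
            if (best == null || r.width * r.height > best.width * best.height) {
                best = r;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        Rectangle[] faces = {
            new Rectangle(10, 10, 40, 40),    // small, distant face
            new Rectangle(300, 120, 90, 90),  // big, close face
        };
        Rectangle target = largest(faces);
        System.out.println(target.x + "," + target.y);  // 300,120
    }
}
```

In draw(), faces[0] would then be replaced by the rectangle this helper returns.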


Caution
This code could be dangerous if improperly used. Never play with a laser by pointing it at people's eyes or faces.
Be smart, and always think before doing.



Marco  ( @mgua on Twitter )


3 comments:

Rajeshchintha said...

How can I use this for classroom attendance? Can you give some ideas? I am using Arduino as well as OpenCV.

Marco Guardigli said...

Rajesh,

First of all, I would suggest not using, in the classroom, any laser that could be dangerous for people's eyes.

A nice project could be to arrange a PTZ (pan tilt zoom) webcam and a video projector, drawing a canvas on the wall so that everyone can see. You could train a Haar classifier to identify colored fluo highlighters that people could hold in their hands. Detected object size could be used to estimate distances, or fed back to the camera zoom function so as to normalize scale.

People would then make gestures and draw on the projected screen simply waving the highlighter in front of them, in the air.

I am sure this would be a great project, very entertaining and very effective in the class!

You could improve highlighter detection by setting up some Wood's ultraviolet lamps in the room.

Do it and give me some feedback!

enjoy!

Marco (@mgua)



Sayeed Mohammed said...

Hello, I want to use the OpenCV library, but whenever I run any examples they say that the library is not compatible with the version of Processing. Do you know what to do?