Saturday, January 23, 2010

Growing transplantable organs with tissue engineering

Here is a great piece from TEDMED, October 2009, published in January 2010:

Anthony Atala describes research on how to build or regenerate organs for use in transplants.

This TED talk is powerful, fascinating, and enthusiastic.
Here is the original link: http://www.ted.com/talks/anthony_atala_growing_organs_engineering_tissue.html








These topics and the current state of research in these fields really seem like science fiction, but they are real.

People will be able to access cell banks where they can deposit healthy cells, taken as biopsies from important organs, to be preserved for possible later use.
When needed, the saved cells could be used to rebuild failed organs for transplantation.

Simply Amazing.


Marco ( @mgua on twitter )





.

Sunday, January 17, 2010

Arduino_Processing Face Follower

Here is another Arduino and Processing project I built this weekend.
A face follower detects people's faces and tracks them in a scene captured through a webcam.

My face follower is simple. It is built on the previous project LaserGun, and the code is very similar. The Arduino part is exactly the same, actually.









The differences lie in the use of a webcam, which continuously acquires the scene, and in the use of a computer vision library, OpenCV, which is the actual engine behind the magic.

Here you can find some more information about it: the Wikipedia OpenCV entry and the OpenCV Wiki.


Requirements: 
This project requires:
  1. A Windows XP PC with a webcam. I tried on Windows Vista without success, and Windows 7 probably does not work either.
  2. An Arduino board.
  3. Two servos.
  4. Some wires.
  5. One toy laser (optional, for more accurate calibration).

Building:
Building is exactly the same as in the LaserGun project.
In place of the laser you can mount a toy figure or puppet face, so that when the servos move you will see the puppet face turning and tilting.


The Software
The Arduino board software is exactly the same as in the LaserGun project.

The PC software requires some additional components, but starts from the same base.
I followed the instructions found at http://ubaa.net/shared/processing/opencv/, which is the main site for OpenCV integration with Processing.

On your PC, you will need:
  1. Processing environment. Enough said.

  2. The OpenCV library: go download it and be sure NOT to download the wrong version; we need version 1.0. I suggest installing it in c:\opencv10 or in another simple path without spaces in the name. I do not recommend installing it in the default location, which probably depends on your Windows system language and may contain spaces (as in "Program Files"). I also suggest using an all-lowercase name. During installation, answer yes when prompted to alter the path by adding the c:\opencv10\bin folder.
    After installation, I suggest editing the system path so that this folder is at the beginning of the path, as shown in this picture (you can get there via right-click on My Computer, then Properties, then the Advanced tab, then the Environment Variables button).














    A reboot is not needed.








  3. The OpenCV Processing library: this interfaces your Processing environment with OpenCV. You can download it from here, expand it, and put the contents into the "libraries" folder in your Processing installation root. In my case I put it in c:\inst\processing-1.0.9\processing-1.0.9\libraries.


Optionally you can also install the Processing OpenCV examples. Get them from http://ubaa.net/shared/processing/opencv/download/opencv_examples.zip (5Mb)
and copy the opencv-examples folder into C:\inst\processing-1.0.9\processing-1.0.9\examples.


Checking the environment installation
When starting Processing, you should be able to open and run the example code. If you do not see the camera feed, it is probably because your OS is not XP (I had a pitch-black camera feed on my Vista laptop).
In order for the face detection demos to work, you need to copy the proper Haar recognizer data file into the sketch folder. Get the data files from c:\opencv10\data\haarcascades.
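
If you want a quick check of the whole chain before running the full project, a minimal sketch along these lines (assuming the frontal-face Haar cascade file has already been copied into the sketch folder, as described above) should display the camera feed and outline any detected faces. It uses only calls that also appear in the full code below.

// minimal OpenCV environment check: webcam capture plus face detection
import hypermedia.video.*;
import java.awt.Rectangle;

OpenCV opencv;

void setup() {
  size(320, 240);
  opencv = new OpenCV(this);
  opencv.capture(width, height);                    // open the webcam
  opencv.cascade(OpenCV.CASCADE_FRONTALFACE_ALT);   // load the frontal face classifier
}

void draw() {
  opencv.read();                                    // grab a frame
  image(opencv.image(), 0, 0);                      // show it
  Rectangle[] faces = opencv.detect();              // detect faces
  noFill();
  stroke(255, 0, 0);
  for (int i = 0; i < faces.length; i++) {          // outline each detection
    rect(faces[i].x, faces[i].y, faces[i].width, faces[i].height);
  }
}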



Here is the code (you can also download it from my Google Docs).

-----------

//
// Processing code for a two axis Face-Following interacting with arduino firmata
// to control servos connected to analog 9 and analog 10 pins
// project "facefollow" by Marco Guardigli, email: mgua@tomware.it  twitter: @mgua
//
// ATTENTION! This software makes a laser pointer track a face.
//            Use with extreme caution.
//
//
// see
// http://marco.guardigli.it/2010/01/arduinoprocessing-face-follower.html
// 
// this code is free software, released under the GNU GPL license
// see www.gnu.org for license details
//
// copyleft Marco Guardigli
//    2010 jan 18: first version
//
//
import hypermedia.video.*;   
import processing.serial.*;
import cc.arduino.*;       
OpenCV opencv;
Arduino arduino;

int maxx = 640;          // windows sizes, I suggest not to go over 640x480
int maxy = 480;

int calibrating = 0;     // nonzero during calibration, states are 0,1,2

int calibrateminx=80;    // recalibration window
int calibrateminy=80;
int calibratemaxx=100;
int calibratemaxy=100;
int cx1, cy1, cx2, cy2;  // screen mousex/y coords of the two calibration points

float dzx, dzy;
int cfacex, cfacey;    // center of the first face detected

int maxservox=170;     // maximum servo excursions - to be redefined in recalibration
int minservox=10;
int maxservoy=170;
int minservoy=10;
int initialservox = 90;
int initialservoy = 90;
int servox, servoy;    // current servos positions -in servo range-
int laseroff = 0;      // laser is controlled (improperly) as a servo
int laseron = 100;     // zero is not actually turned off, but is dimmer


void setup() {
  println(Arduino.list());

// IMPORTANT! This code will not work if you do not write the correct 
// id of the serial interface in the next line (in my case 2)
  arduino = new Arduino(this, Arduino.list()[2], 57600);
  arduino.analogWrite(9,initialservox);
  arduino.analogWrite(10,initialservoy);
  arduino.analogWrite(11,laseroff);  // laser off
  size(maxx,maxy);
 
  opencv = new OpenCV(this);
  opencv.capture( width, height );
  opencv.cascade( OpenCV.CASCADE_FRONTALFACE_ALT );    // load the FRONTALFACE description file

  opencv.read();
  image( opencv.image(), 0, 0 );
}


void draw() {
  opencv.read();
  image( opencv.image(), 0, 0 );
  Rectangle[] faces = opencv.detect();     // detect anything resembling a FRONTALFACE
  noFill(); stroke(255,0,0);
  for( int i=0; i < faces.length; i++ ) {
      rect( faces[i].x, faces[i].y, faces[i].width, faces[i].height );
  }
 
  switch (calibrating) {
    case 0: {  // no calibration in course: pointer follows face 
      if (faces.length > 0) {
        cfacex = int(faces[0].x + (faces[0].width / 2));
        cfacey = int(faces[0].y + (faces[0].height / 2));
        servox = int(map(cfacex,0,maxx,minservox,maxservox));
        servoy = int(map(cfacey,0,maxy,minservoy,maxservoy));
        arduino.analogWrite(9,servox);
        arduino.analogWrite(10,servoy);
      }
      break;
    }
    case 1: {   // need to read first calibration point
      cx1 = mouseX;
      cy1 = mouseY;
      break;
    }
    case 2: {   // need to read second calibration point
      cx2 = mouseX;
      cy2 = mouseY;
      break;
    }
  }
}


void mousePressed() {
    if (mouseButton == LEFT) {
      if (calibrating == 0) {                // draw shot on screen
        arduino.analogWrite(11,laseron);     // and intensify laser
        stroke(200,200,0);                 
        fill(200,0,0);
        ellipse(cfacex,cfacey,5,5);
        delay(500);
        ellipse(cfacex,cfacey,10,10); 
        arduino.analogWrite(11,laseroff); 
      }
    }
}   
   

void mouseReleased() {
  if (mouseButton == RIGHT) {
    switch (calibrating) {
      case 0: { 
            calibrating = 1;   // stops laser following mouse pointer
            arduino.analogWrite(9,calibrateminx);
            arduino.analogWrite(10,calibrateminy);
            arduino.analogWrite(11,laseron);     // and intensify laser           
            println("cx1/cy1: point mouse to where laser pointer is and RCLICK");
            break;
      }
      case 1: {  // arriving here after rclick release in calibration point 1
            calibrating = 2;
            arduino.analogWrite(9,calibratemaxx);
            arduino.analogWrite(10,calibratemaxy); 
            arduino.analogWrite(11,laseron);     // and intensify laser                       
            print("  calibration point1: "); print(cx1); print(" , ");  println(cy1);
            println("cx2/cy2: point mouse to where laser pointer is and RCLICK");
            break;
      }
      case 2: {  // arriving here after rclick release in calibration point 2
            print("  calibration point2: "); print(cx2); print(" , ");  println(cy2);
            // (cx1,cy1) corresponds to (calibrateminx, calibrateminy)
            // (cx2,cy2) corresponds to (calibratemaxx, calibratemaxy)
            // i will recalculate minservox, minservoy and maxservox, maxservoy
            if (((cx2-cx1) != 0) && ((cy2-cy1) != 0)) {
              stroke(200);
              line (cx1,cy1,cx1,cy2);
              line (cx1,cy2,cx2,cy2);
              line (cx2,cy2,cx2,cy1);
              line (cx2,cy1,cx1,cy1);
 
              dzx = (calibratemaxx - calibrateminx);

              dzx = dzx / (cx2 - cx1);              // dzx is how much servo per pixel
              dzy = calibratemaxy - calibrateminy;
              dzy = dzy / (cy2 - cy1);

              float leftx = calibrateminx - ( dzx * cx1 );
              float rightx = calibratemaxx + ( dzx * (maxx-cx2) );
              float upy = calibrateminy - ( dzy * cy1 );
              float downy = calibratemaxy + ( dzy * (maxy-cy2) );

              minservox = int(leftx);
              maxservox = int(rightx);
              minservoy = int(upy);
              maxservoy = int(downy);
            } else {
              println("Invalid calibration points selected.");
            }           
            calibrating = 0;
            arduino.analogWrite(11,laseroff);     // and dim laser           
            break;
          } // end case 2
      default: {
            break;
          } // end case default
      } // end switch
    } // end if mousebutton right
} // end mouseReleased

 



------------


Calibration
As in the LaserGun project, it is important to calibrate the system properly so that it works decently.
To minimize errors, try to keep the laser and the webcam as close together as possible.
Calibration is performed by pressing the right mouse button, then right-clicking on the screen where you see the pointer. Repeat for the two points requested.

Multiple faces: 
The OpenCV library can detect more than one face in the scene, but the detected faces are not always returned in the same order. If you present two faces to the current system, it will get confused; more accurate movement detection and tracking over time would be needed (a simple workaround is sketched below).
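
One simple workaround, not implemented in the code above but easy to add, is to keep following the detection whose center is closest to the previously tracked position instead of always taking faces[0]. A minimal sketch of such a helper (the name nearestFace is just an example) could be:

// pick, among the current detections, the one whose center is nearest
// to the last tracked position (lastx, lasty); returns null if no face was found
Rectangle nearestFace(Rectangle[] faces, int lastx, int lasty) {
  Rectangle best = null;
  float bestdist = Float.MAX_VALUE;
  for (int i = 0; i < faces.length; i++) {
    float fx = faces[i].x + faces[i].width * 0.5f;    // center of this detection
    float fy = faces[i].y + faces[i].height * 0.5f;
    float d = dist(fx, fy, lastx, lasty);             // Processing built-in distance
    if (d < bestdist) {
      bestdist = d;
      best = faces[i];
    }
  }
  return best;
}

In draw(), case 0 would then call nearestFace(faces, cfacex, cfacey), update cfacex and cfacey from the returned rectangle, and simply keep the previous position when it returns null.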


Caution
This code could be dangerous if improperly used. Never point the laser at people's eyes or faces.
Be smart, and always think before acting.



Marco  ( @mgua on Twitter )

.

Friday, January 15, 2010

Processing code for building multi-frame animated images

Barrier grid animations (or Scanimations®, as referenced elsewhere; see the endnote) are fun.

See this nice video by brusspup on youtube to quickly understand the concept.










I wrote a software tool, written in Processing, that produces the multi-frame picture and the related mask needed to view it.

My software takes a number of images as input, which are treated as the frames of the animation to be built. You can change a parameter to define how many images you want to use. Typically you can go with 4, and 6 is probably the maximum; beyond that the animation is too dark, because the final effect reduces the image brightness considerably.

If you use 4 frames, only 1 in 4 pixel columns is visible at any given time, reducing overall brightness to 25% of the original.
If you use 6 frames, final brightness drops to about 17% (1/6) of the original.

As source pictures, it is best to use high-contrast images, for example dark shapes on a white background. I tried photos taken from my webcam, but the results were quite poor.

Some simple parametrization is needed in the source to adapt it to your input image sequence and output resolution.


Printing the mask transparency and the multi-frame picture
Another tricky problem can be printing the mask bitmap. I used a standard laser printer and printed on A4-sized transparencies. Printers usually perform dithering and anti-aliasing and introduce their own "improvements" on printed data, but for this print job we do not need any halftoning.

I performed some tests and was not satisfied with any of the normal printing results from standard applications. I resorted to Adobe Photoshop and scaled the image by multiplying its original size by an integer (I multiplied my original size by 3, keeping proportions).
It is critical that, when scaling, you multiply the image size by an integer, so that the spacing between the resulting pixel columns stays even (this way the scaling algorithm does not need to introduce new columns via interpolation).
In the resample image option of the Image/Image Size menu, I then selected "Nearest Neighbor". This option produces no dithering or halftoning when resizing the image.
If someone knows how to obtain the same result without using Photoshop, please let me know.
(july 2010 note: paint.net has a similar option which is working fine)

Of course, you need to scale the picture following exactly the same rules. The pixel column width must be preserved exactly and must be the same in the mask and in the image. (A Processing alternative to Photoshop is sketched below.)
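
If you prefer to stay inside Processing rather than using Photoshop or paint.net, a minimal sketch of the same idea could look like the following (the file name here corresponds to the mask that the code further down saves for the "giulio" example project; adapt it to yours). Every source pixel is replicated into an n by n block, so no new columns are introduced by interpolation.

// integer nearest-neighbor upscaling: each source pixel becomes an n x n block,
// preserving the exact column spacing needed by the mask and the picture
void setup() {
  int n = 3;                                        // integer scale factor
  PImage src = loadImage("giulio_mask.png");        // file to enlarge
  PImage dst = createImage(src.width * n, src.height * n, RGB);
  src.loadPixels();
  dst.loadPixels();
  for (int y = 0; y < src.height; y++) {
    for (int x = 0; x < src.width; x++) {
      color c = src.pixels[y * src.width + x];      // one source pixel
      for (int dy = 0; dy < n; dy++) {
        for (int dx = 0; dx < n; dx++) {
          dst.pixels[(y * n + dy) * dst.width + (x * n + dx)] = c;
        }
      }
    }
  }
  dst.updatePixels();
  dst.save("giulio_mask_x3.png");                   // enlarged copy for printing
  exit();
}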


Detail view
For better understanding, here is a detail zoom of a multi-frame image portion, prepared for 6 animation frames:

And here is a corresponding detail zoom of the individual pixels of the mask for 6 animation frames. You can see 1 transparent column and 5 opaque columns.




Code
To use this code you need a Processing development environment. You can download and install it from the Processing web site. It is open source and multiplatform, for Windows/Linux/Mac.
Then create a new sketch, paste the following code, and save the new project.
You then have to put the pictures you want to create the animation from into the sketch folder, naming each file so that it ends in a progressive digit starting from 0 (0, 1, 2, 3, ...), as shown in the example below and in the source code.
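
For example, with prjname = "giulio" and nframes = 6, as in the code below, the sketch expects to find these six files in its folder:

giulio-0.jpg
giulio-1.jpg
giulio-2.jpg
giulio-3.jpg
giulio-4.jpg
giulio-5.jpg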

Here is the Processing code.

//------------
// scanimator
//
// a processing sketch for generating barrier grid animations

// see http://marco.guardigli.it/2010/01/scanimation-builder-processing-code.html
//
// this code is Free Software, released under GNU GPL license. See www.gnu.org for license details
// copyleft Marco Guardigli
//
//    email: mgua@tomware.it
//    twitter: @mgua
//
//
//  2009 dec 20  first draft
//  2010 jan 15  1.0
//
//

int nframes = 6;
int maxx, maxy;
int cframe = 1;
int xsize=320;  //resized image
int ysize=240;

String[] fname = new String[nframes];
String prjname = "giulio";       // project name: common initial part of the input frame filenames

PImage[] frame = new PImage[nframes];            // sequence of the initial frame to process
PImage scanimage;                                // resulting multi-frame scanimage
PImage maskimage;                                // mask

void setup() {
  for (int i=0; i < nframes; i++) {              // cycle on input frames
    fname[i] = prjname + "-" + i + ".jpg";
    println(fname[i]);
    frame[i] = loadImage(fname[i]);              // read frame
    frame[i].resize(xsize,ysize);
  }
  maxx = frame[0].width;
  maxy = frame[0].height;
  scanimage = createImage(maxx,maxy,ARGB);        // output scanimage
 
  for (int f = 0; f < nframes; f++) {             // cycle on input frames
    for (int c = 0; c < maxx; c += nframes ) {    // columns to keep of this frame
        for (int y = 0; y < maxy-1; y++) {        // cycle on each pixel of the column
          scanimage.pixels[maxx * y + f + c] = frame[f].pixels[maxx * y + f + c];
        }
    }
  }
  scanimage.save(prjname + "_scanimage.png");   // save resulting scanimage
 
 
  // mask preparation and file generation
  maskimage = createImage(maxx,maxy,RGB);
  for (int x = 0; x < maskimage.width; x += nframes ) {
    for (int y = 0; y < maskimage.height; y++ ) {
      maskimage.pixels[xsize * y + x] = color(255,255,255);
    }
  }
  maskimage.save(prjname + "_mask.png");
 
  // prepare for on screen display of scanimage
  size(maxx,maxy);
}


void draw() {
  background(255);
  image(scanimage,0,0);
  stroke(0);                    // draws a mask on top of the image
  for (int c = cframe; c < scanimage.width; c += nframes ) {
    for (int b=0; b < (nframes -1); b++) {
      line(c+b,0,c+b,scanimage.height);
    }
  }
}



void mousePressed() {          // when click shift mask
 cframe = (cframe + 1) % nframes;
 println("cframe=" + cframe);
}
// ---------------


Example
As an example, here are some webcam photos I made with my kids and the resulting mask and multi-frame image. It is a set of six frames:





 

 

And here is the processed multi-frame picture:

 
with the related mask


These on-screen pictures will probably be resized by your browser, so do not print them expecting nice results; they will probably not work.

Related websites:
  Rufus Butler Seder website (Eye Think inc)
  dudecraft's blog
  http://animbar.mnim.org/  
  A similar project from MIT, made with Scratch

Have fun and happy scanimaging.


Marco ( @mgua on twitter )






NOTE on copyright, added on 2010 July 27 after receiving a request from the trademark owner, which resulted in the removal of every occurrence of the words "scanimation" and "scanimations" in relation to my work:

Scanimation® is a federally registered trademark owned by Eye Think, Inc. and bearing U.S. Registration No. 2,614,549. The mark was federally registered in the United States on September 3, 2002.
http://www.eyethinkinc.com/





Monday, January 11, 2010

Arduino+PC two axis controlled Laser Gun

If you are a Maker (it seems politically incorrect to say Hacker) or a DIY fan, here is a project that I developed over the weekend, playing with an Arduino board and Processing.

I built this hack with my kids, Luca and Giulio, and we had lots of fun.


Our Laser Gun uses two servos connected to an Arduino board. The Arduino is connected to the PC via USB. The whole thing is controlled by mouse movements on the PC, over a window showing a picture taken from the gun's position. The laser normally works in a low-intensity "pointing mode" and becomes much more powerful when fired.
The Arduino microcontroller and the PC communicate over the USB serial interface. A custom Processing program handles the mouse interaction and provides the guidance.

Parts used:
Arduino 2009 board
2 light servos (I used two standard hitec hs55 8gr)
1 laser pointer (salvaged from a broken toy gun)
1 breadboard (not strictly necessary)
some adhesive putty (I used UHU Patafix)
some wires


Circuit description:
  • the black cable of both servos is connected to ground
  • the red cable of both servos is connected to +5Vcc
  • x-servo (the lower one, moving its head in the horizontal left-right plane): its yellow control wire is connected to Arduino digital pin 9
  • y-servo (the upper one, moving its head in the vertical up-down plane): its yellow control wire is connected to Arduino digital pin 10
  • laser pointer negative is connected to ground
  • laser pointer positive is connected to Arduino digital pin 11 (the two laser intensities are managed by controlling it, improperly, like a servo, but it works)


Software:
The project requires two pieces of software: one written in the Arduino language (based on Wiring) that runs on the ATmega328 microcontroller of the Arduino 2009 board, and a Processing sketch (Processing programs are called sketches) that runs on the PC.
The two programs communicate over a standard serial protocol called Firmata, which is implemented in libraries on both the Arduino side and the PC Processing side.
I am currently using Microsoft Windows Vista Ultimate 64-bit. This is not the best platform for development, and I do not recommend such a setup; a better choice would be Microsoft XP 32-bit.


On the PC, I installed the standard Arduino Development Environment (currently I run arduino-0017), with no additions. This allows you to write, edit, compile, download to the Arduino board, and test programs written in Wiring.
The Arduino Development Environment is based on a version of Processing.


On the PC I also installed the latest Processing environment (1.0.9 in my case). An add-on library is needed to support the Firmata serial protocol. This library zip file ( http://www.arduino.cc/playground/uploads/Interfacing/processing-arduino-0017.zip ) has to be expanded, and the three included directories (examples, libraries, src) must be copied to the following folder: \libraries\arduino
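
Once the Firmata "servo" sketch shown just below has been loaded onto the board, a very small test sketch like the following (a minimal sketch; the index into Arduino.list(), here 0, must be adapted to your system exactly as in the full code further down) can be used to verify that the library and the Firmata firmware are talking to each other: it should slowly sweep the servo connected to pin 9 back and forth.

// minimal Firmata check: sweeps the servo on digital pin 9 back and forth
import processing.serial.*;
import cc.arduino.*;

Arduino arduino;
int pos = 10;                                   // current servo position
int dir = 1;                                    // sweep direction

void setup() {
  size(200, 200);
  println(Arduino.list());                      // list the available serial ports
  arduino = new Arduino(this, Arduino.list()[0], 57600);   // adapt the index [0]
}

void draw() {
  arduino.analogWrite(9, pos);                  // Firmata routes this to servo9.write()
  pos += dir;
  if (pos > 170 || pos < 10) dir = -dir;        // stay within the safe excursion
  delay(20);
}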





Here is the software to be downloaded to the Arduino controller. It is the standard Firmata "servo" template (you can find it among the included examples) with a very simple addition for managing the laser, connected on pin 11.

Note: Unfortunately, blogger tends to mangle the "<" and ">" characters inside listings, so double-check the first two #include lines if you copy the code from this page.
Additionally, I uploaded the code also on this posterous entry, and on these two scribd entries: 1 and 2.

#include "<"Firmata.h">"
#include "<"Servo.h">"
Servo servo9;
Servo servo10;
Servo laser11;

void analogWriteCallback(byte pin, int value)
{
    if(pin == 9)
      servo9.write(value);
    if(pin == 10)
      servo10.write(value);
    if(pin == 11)
      laser11.write(value);
     
}


void setup()
{
    Firmata.setFirmwareVersion(0, 2);
    Firmata.attach(ANALOG_MESSAGE, analogWriteCallback);

    servo9.attach(9);
    servo10.attach(10);
    laser11.attach(11);
  
    Firmata.begin(57600);
}

void loop()
{
    while(Firmata.available())
        Firmata.processInput();
}



And here follows the code to be run on the PC Processing environment

I am using a small font so as not to cause unwanted line breaks.
The Processing code will have to be adapted to your environment: define the correct identifier for the USB serial interface (see the beginning of the setup procedure), and set the proper name of the picture you want to display on screen while operating the laser (also in setup). I suggest using a picture resized to 800x600, taken from the place where the laser will sit, looking towards the target.
The picture file has to be put in the Processing sketch directory in which you save the project. To access that directory, just select the Sketch menu in the Processing environment and then "Show Sketch Folder".


//
// Processing code for interacting with arduino firmata
// to control servos connected to analog 9 and analog 10 pins
// project "lasergun" by Marco Guardigli, mgua@tomware.it 

//

// see
// http://marco.guardigli.it/2010/01/arduinopc-two-axis-controlled-laser-gun.html

// 
// this code is free software, released under the GNU GPL license
// see www.gnu.org for license details
//
// copyleft Marco Guardigli
//    2010 jan 11: first version
//
//

import processing.serial.*;
import cc.arduino.*;       
Arduino arduino;

int maxx;              // windows sizes
int maxy;
int calibrating = 0;   // nonzero during calibration, states are 0,1,2

int calibrateminx=80;  // recalibration window
int calibrateminy=80;
int calibratemaxx=100;
int calibratemaxy=100;
int cx1, cy1, cx2, cy2;  // screen mousex/y coords of the two calibration points

float dzx, dzy;

int maxservox=170;  // maximum servo excursions - to be redefined in recalibration
int minservox=10;
int maxservoy=170;
int minservoy=10;
int initialservox = 90;
int initialservoy = 90;
int servox, servoy;    // current servos positions -in servo range-
int laseroff = 0;      // laser is controlled (improperly) as a servo
int laseron = 170;     // zero is not actually turned off, but is dimmer

PImage room;           //  room background (photo shot from laser initial position)


void setup() {
  println(Arduino.list()); 

// IMPORTANT! This code will not work if you do not write the correct 
// id of the serial interface in the next line (in my case 3)
  arduino = new Arduino(this, Arduino.list()[3], 57600);
  arduino.analogWrite(9,initialservox);
  arduino.analogWrite(10,initialservoy);
  arduino.analogWrite(11,laseroff);  // laser off
// put in the next line the file name of your ambient picture

// taken placing the camera in the place where the lasergun is 
// so to have a correct perspective on the screen

  room = loadImage("myplace.jpg");   // put here the picture of your place
  maxx = room.width;                 // I suggest to resize the picture to 800x600
  maxy = room.height;
  size(maxx,maxy);
  background(120);
  image(room,0,0);  //show background room
}


void draw() {
  switch (calibrating) {
    case 0: {  // no calibration in course: laser follows pointer 
      servox = int(map(mouseX,0,maxx,minservox,maxservox));
      servoy = int(map(mouseY,0,maxy,minservoy,maxservoy));
      arduino.analogWrite(9,servox);
      arduino.analogWrite(10,servoy);
      break;
    }
    case 1: {   // need to read first calibration point
      cx1 = mouseX;
      cy1 = mouseY;
      break;
    }
    case 2: {   // need to read second calibration point
      cx2 = mouseX;
      cy2 = mouseY;
      break;
    }
  }
}


void mousePressed() {
    if (mouseButton == LEFT) {
      if (calibrating == 0) {                // draw shot on screen
        arduino.analogWrite(11,laseron);     // and intensify laser
        stroke(200,200,0);                 
        fill(200,0,0);
        ellipse(mouseX,mouseY,5,5);
        delay(500);
        ellipse(mouseX,mouseY,10,10); 
        arduino.analogWrite(11,laseroff); 
      }
    }
}   
   

void mouseReleased() {
  if (mouseButton == RIGHT) {
    switch (calibrating) {
      case 0: { 
            calibrating = 1;   // stops laser following mouse pointer
            arduino.analogWrite(9,calibrateminx);
            arduino.analogWrite(10,calibrateminy);
            arduino.analogWrite(11,laseron);     // and intensify laser           
            println("cx1/cy1: point mouse to where laser pointer is and RCLICK");
            break;
      }
      case 1: {  // arriving here after rclick release in calibration point 1
            calibrating = 2;
            arduino.analogWrite(9,calibratemaxx);
            arduino.analogWrite(10,calibratemaxy); 
            arduino.analogWrite(11,laseron);     // and intensify laser                       
            print("  calibration point1: "); print(cx1); print(" , ");  println(cy1);
            println("cx2/cy2: point mouse to where laser pointer is and RCLICK");
            break;
      }
      case 2: {  // arriving here after rclick release in calibration point 2
            print("  calibration point2: "); print(cx2); print(" , ");  println(cy2);
            // (cx1,cy1) corresponds to (calibrateminx, calibrateminy)
            // (cx2,cy2) corresponds to (calibratemaxx, calibratemaxy)
            // i will recalculate minservox, minservoy and maxservox, maxservoy
            if (((cx2-cx1) != 0) && ((cy2-cy1) != 0)) {
              stroke(200);
              line (cx1,cy1,cx1,cy2);
              line (cx1,cy2,cx2,cy2);
              line (cx2,cy2,cx2,cy1);
              line (cx2,cy1,cx1,cy1);
 
              dzx = (calibratemaxx - calibrateminx);
// for some reason dzx=(calibratemaxx-calibrateminx)/(cx2-cx1)
// was not calculated well ( !!! )
// so I had to split the calculation int two statements
              dzx = dzx / (cx2 - cx1);              // dzx is now "how much servo per pixel"
              dzy = calibratemaxy - calibrateminy;
              dzy = dzy / (cy2 - cy1);

              float leftx = calibrateminx - ( dzx * cx1 );
              float rightx = calibratemaxx + ( dzx * (maxx-cx2) );
              float upy = calibrateminy - ( dzy * cy1 );
              float downy = calibratemaxy + ( dzy * (maxy-cy2) );

              minservox = int(leftx);
              maxservox = int(rightx);
              minservoy = int(upy);
              maxservoy = int(downy);
            } else {
              println("Invalid calibration points selected.");
            }           
            calibrating = 0;
            arduino.analogWrite(11,laseroff);     // and dim laser           
            break;
          } // end case 2
      default: {
            break;
          } // end case default
      } // end switch
    } // end if mousebutton right
} // end mouseReleased




Features:
Once you have loaded the software onto the Arduino board, it will run automatically at boot, so you will not need to load it again unless you decide to change it.
Operating the Laser Gun just requires connecting the USB cable to the PC and launching the Processing environment. The Firmata library is designed to expose all of the board's features, allowing complete control from the PC.

Once initialization has been completed, you will see a window with the picture you loaded in the sketch folder. If you move the mouse, the laser gun should follow your mouse movements.
When you press the left mouse button, the laser intensity will increase, to represent "fire". I will probably add some audio features, because the thing is too silent now :-).

You will soon notice a mismatch between what you point at on the screen and the actual position of the laser. This happens because you need to calibrate the system.
Calibrating means that the servo minimum and maximum boundaries have to be redefined so that they match the space in which you use the LaserGun.

Calibration is started by clicking the right button; it requires you to point with the mouse at the place on the screen corresponding to the spot where you actually see the laser in your room. When positioned, right-click again, and repeat for a second point that the system will ask for. After this procedure has been completed, you should have a (reasonable) correspondence between what you aim at and what you kill. ( :-)
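
As a made-up numeric example of what the calibration computes: suppose the window is 800x600 and the two calibration shots, fired at servo positions (80,80) and (100,100), show up on screen at (200,150) and (600,450). Then dzx = (100 - 80) / (600 - 200) = 0.05 servo units per pixel, so minservox = 80 - 0.05 * 200 = 70 and maxservox = 100 + 0.05 * (800 - 600) = 110; the vertical limits are extrapolated the same way. With these new limits, mouse positions over the whole window map onto the portion of servo travel that actually covers the room.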



Safety precautions
Be careful not to point the laser at people's or animals' eyes.
If you use more powerful servos or a more powerful laser, it is better to power the system from an external power supply rather than through USB.






Feel free to improve and enjoy at will




Marco  ( @mgua on twitter )

Sunday, January 3, 2010

Evaporating Borders: Need of new politics

What is the need for local, country-level politics in a completely connected world?

We really need to rethink politics, countries, borders, and social differentiation.
 
Current trends in online interactions are gradually and progressively changing the traditional sense of nation.
People influence each other through new communication systems and tools.

If we examine all the usual bounds keeping a country together, we see that all these are gradually being released.
  • Religion: it is gradually losing importance. Current religious terrorism is doing a lot to accelerate the process.
  • Culture: younger people do not perceive strong local cultural ties. Dialects are gradually disappearing. Most art crosses borders.
  • Language: the de facto language of the net is English. Period.
  • History: historical reasons are less keenly felt in a globally accelerating present. Young people are more interested in the future. (Are they to blame?)
  • Geography and borders: easy travel is gradually reducing geography's role as a fence. Besides that, many communities do not need to be located in a physical place to prosper and evolve.
  • Currency: money transactions will shift to "plastic money", i.e. credit cards, or to some other currency not bound to the local political environment (Linden dollars? whuffies? PayPal or eBay credits?). Strangely, efforts to build an independent "internet currency" have not succeeded, probably because the bank lobbies were not able to find a suitable agreement.
So the traditional "we are together" because "we belong" to the "same nation" is not going to work anymore.
Our identities are being revealed on the net, and we allow it without complaint. We share personal information more easily online than with a policeman.

Along with the evaporation of borders, our identities are progressively mixing and melting, and social media are becoming the main drivers of education and community building.

We quickly need to evolve the traditional meaning of politics.
We need young politicians, and newer ideas for an up-to-date political science.

Consensus will no longer be built through broadcasting, but by leveraging active online communities and promoting autonomous critical thinking.

Feedback cycles will be much faster. Traditional elections are slow and expensive.

New politics will be tough.


Marco ( @mgua on twitter )


.

Saturday, January 2, 2010

Engineered plants as solar power generators? (Actually Solar Power Plants!)

This post is just a cool idea I came up with yesterday.

I was thinking about a solar photovoltaic array, in which the panels must be kept oriented towards the sun to maximize the collected energy. I thought about possible sun-following rotating designs, and then I considered sunflowers, which do exactly the same thing.

Many plants actually turn their leaves to the sun to maximize photosynthetic reactions.

So maybe it could be possible to develop a genetically engineered plant that could be connected to the electrical grid through a ground electrode in the roots and through an aerial "electrical vine" entwined around a metal wire, which could also support the plant itself.

The underground electrode could be grown by the plant itself towards a specific substance that could attract electric roots.

Probably there are many plant species that could be used for this.


A simpler idea is to mount lightweight photovoltaic panels on the leaves, without covering them completely, and let the plant's movement orient them properly (this would probably be very easy to develop). We would need to find the most robust sun-orienting plants.

Any botany expert or crazy inventor who could comment on this?



Marco ( @mgua on twitter )


.