User Testing

Before the IM show I asked two people who weren't familiar with the concept of my final project to test it and give feedback on what could be improved. Unfortunately, the project was not fully working yet, because one of its components was broken, but I still got feedback on the rest of them.

The idea of my project is that a person stands in front of a screen and sees themselves on it. In front of the person there is a map with a small figure of a person that can be moved around to different places on the map. Once the figure is placed in a certain location, that place appears as the background behind the person on the screen. On the floor there is a little carpet with the shape of two feet drawn on it; by stepping on this carpet, the person starts playing a video, which is essentially their background. The speed of the video depends on how fast the person walks on the carpet. Once the figure is moved to a different location, the journey continues in a different place.

The component that didn't yet work was the green screen, which substitutes the background with a video of a certain place; as a result, the person only saw a video of the place instead of seeing themselves in it.


The first person to try out my project was Isabella. After watching her interact with it, I decided to change or add several things:

1) She asked whether she should take her shoes off. I realized I hadn't completely made up my mind about it, but after Isabella tried it both ways, I decided to let people keep their shoes on, because the pressure then spreads more evenly across the force-sensitive resistors hidden below the carpet.

2) I will decrease the time after which the image on the screen stops once the sensors no longer detect movement. What had seemed fine before the user testing turned out to be too long: Isabella was sometimes confused about why the image was still moving even though she had stopped walking a good while earlier.

3) I will adjust how the video speeds up from walking pace to running pace, because the transition was sometimes either too slow or too fast.
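The timing changes in (2) and (3) can be sketched as a small helper that maps the interval between two sensor hits to a playback rate. This is plain Java rather than the project's actual sensor code, and the thresholds (a 1.5 s stop timeout, step intervals between 250 ms and 800 ms) are illustrative guesses, not values from the project:

```java
public class StepSpeed {
    static final long STOP_TIMEOUT_MS = 1500; // shortened timeout, per Isabella's feedback
    static final long RUN_INTERVAL_MS = 250;  // steps this close together = running
    static final long WALK_INTERVAL_MS = 800; // steps this far apart = slow walking

    // Map the time since the last sensor hit to a playback rate:
    // 0.0 = paused, 1.0 = walking speed, 2.0 = running speed.
    static double playbackRate(long msSinceLastStep) {
        if (msSinceLastStep >= STOP_TIMEOUT_MS) return 0.0; // stop the video promptly
        if (msSinceLastStep <= RUN_INTERVAL_MS) return 2.0;
        if (msSinceLastStep >= WALK_INTERVAL_MS) return 1.0;
        // Interpolate smoothly between walking and running for in-between cadences,
        // so the speed-up is neither too abrupt nor too sluggish
        double t = (double) (WALK_INTERVAL_MS - msSinceLastStep)
                 / (WALK_INTERVAL_MS - RUN_INTERVAL_MS);
        return 1.0 + t;
    }

    public static void main(String[] args) {
        System.out.println(playbackRate(2000)); // stopped
        System.out.println(playbackRate(800));  // walking
        System.out.println(playbackRate(250));  // running
    }
}
```

The interpolation in the middle band is the part addressing point (3): instead of jumping between two fixed speeds, the rate scales continuously with cadence.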



The second person to try out my project was Erika. After her interaction I concluded the following:

1) I will move the sensors further back, under the foot shapes on the carpet, because Erika tended to drift away from the front of the carpet, where the sensors were located during the user testing, especially when she started running. When you are running and paying attention to the screen rather than your feet, it is very easy to step out of the sensors' range, so I'll try to center them as much as possible.


Overall, it was helpful to have users actually try the project and see what they did with it. Even though it is never possible to predict all of the ways an interaction might go, it is still useful to see at least some of them, because other people will inevitably use the project differently than you do: you most likely use it the way it is supposed to be used (according to you). Because you intended it to work a certain way, it is hard to put yourself in another person's shoes and imagine what you would do without knowing all of the logic behind the project.

Divine Intervention Switch v2

For our computer vision/image manipulation assignment, I chose to go back to basics and produce another iteration of my handless switch, where an LED is turned on whenever God's and Adam's hands touch.

Adam's hand is linked to a color tracker coded in Processing, such that whenever the tracked color comes near God's hand, Processing communicates with the Arduino to turn on the LED. In order to do this, I rely on serial communication. Moreover, whenever Adam's hand touches God's, 'Hallelujah' from Handel's 'Messiah' plays in the background.

In order to do this project, I relied on Dan Shiffman's tutorials, Aaron's color-tracking example code, and the Minim library for playing sound files in Processing.


const int ledPin1 = 3;

void setup() {
  // put your setup code here, to run once:
  pinMode(ledPin1, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    int inByte = Serial.read(); // one byte from Processing: 1 = touching, 0 = not
    if (inByte == 1) {
      digitalWrite(ledPin1, HIGH); // hands touching: LED on
    } else {
      digitalWrite(ledPin1, LOW);  // hands apart: LED off
    }
  }
}


import processing.serial.*;
import processing.video.*;
import ddf.minim.*;

Capture video;
Serial myPort;
PImage adam;     // background image of The Creation of Adam
PImage hand;     // cut-out of Adam's hand that follows the tracked color
color trackColor;
int locX, locY;  // location of the best color match in the frame
boolean touch = false;
Minim minim;
AudioPlayer hallelujah;

void setup() {
  size(640, 480);
  video = new Capture(this, 640, 480, 30);
  video.start();
  adam = loadImage("creation_adam1.png");
  hand = loadImage("adam_hand.png");

  minim = new Minim(this);
  hallelujah = minim.loadFile("hallelujah_short.mp3");

  String portname = Serial.list()[5];
  myPort = new Serial(this, portname, 9600);

  trackColor = color(255, 0, 0); // placeholder until a color is clicked
}

void draw() {
  if (video.available()) {
    video.read();
  }
  video.loadPixels();

  // Draw the webcam frame mirrored so the viewer's movements match the screen
  pushMatrix();
  translate(width, 0);
  scale(-1, 1);
  image(video, 0, 0);
  popMatrix();

  // Find the pixel closest to the tracked color (within a generous threshold)
  float dist = 500;
  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      int loc = (video.width-x-1) + (y*width); // mirrored index, matches the flipped display
      color pix = video.pixels[loc];
      float r1 = red(pix);
      float g1 = green(pix);
      float b1 = blue(pix);
      float r2 = red(trackColor);
      float g2 = green(trackColor);
      float b2 = blue(trackColor);
      float diff = dist(r1, g1, b1, r2, g2, b2);

      if (diff < dist) {
        dist = diff;
        locX = x;
        locY = y;
      }
    }
  }

  image(adam, 100, 50, 600, 480);
  image(hand, locX, locY);

  // "Touch" when the tracked color reaches God's hand (these coordinates are approximate)
  touch = dist(locX, locY, 420, 170) < 40;
  myPort.write(int(touch)); // one byte to the Arduino: 1 = touching, 0 = not

  if (touch && !hallelujah.isPlaying()) {
    hallelujah.rewind();
    hallelujah.play();
  }
  if (hallelujah.isPlaying()) {
    println("audio is playing");
  }
}

void mousePressed() {
  // Click on Adam's hand in the video to pick the color to track
  int loc = (video.width-mouseX-1) + (mouseY*width);
  trackColor = video.pixels[loc];
}

// Leftover from an earlier experiment that read serial data back from the Arduino;
// it is not used in this version:
//void serialEvent(Serial myPort) {
//  String s = myPort.readStringUntil('\n');
//  s = trim(s);
//  println(s);
//  if (s != null) {
//    int value[] = int(split(s, ','));
//    if (value.length == 2) {
//    }
//  }
//}
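Two pieces of arithmetic carry the tracker: the pixel index `(video.width-x-1) + (y*width)` looks up the horizontally mirrored pixel so the on-screen hand moves the same way the viewer does, and the Euclidean distance between two RGB triples scores how close each pixel is to the tracked color. Both can be restated in plain Java for clarity (the 640-pixel frame width matches the sketch; the class and method names are mine, not Processing's):

```java
public class TrackerMath {
    static final int W = 640; // frame width, as in the sketch

    // Index of the horizontally mirrored pixel at (x, y) in a row-major pixel array
    static int mirroredIndex(int x, int y) {
        return (W - x - 1) + (y * W);
    }

    // Euclidean distance between two RGB colors, as Processing's dist() computes it
    static double colorDistance(int r1, int g1, int b1, int r2, int g2, int b2) {
        int dr = r1 - r2, dg = g1 - g2, db = b1 - b2;
        return Math.sqrt(dr * dr + dg * dg + db * db);
    }

    public static void main(String[] args) {
        // The leftmost pixel of row 0 maps to the rightmost array slot
        System.out.println(mirroredIndex(0, 0));   // 639
        // A pixel identical to the tracked color scores 0
        System.out.println(colorDistance(200, 30, 30, 200, 30, 30)); // 0.0
    }
}
```

The distance threshold in the sketch (500) is deliberately loose: the maximum possible RGB distance is about 441, so the search effectively picks the single closest pixel each frame rather than rejecting poor matches.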