Thursday, January 08, 2009

Where I'm going, where I've been

Near the end of the first semester of my Master's, my research topic was becoming clearer, along the lines of "teaching the use of gestural interfaces". This was motivated by the proliferation of gestural interaction in devices of many form factors. Much of my thinking on this was driven by my improv and theatre background, especially with respect to miming and mimicry. After some emails were sent and calls made, I have been hired as an intern at Microsoft Research in Redmond, Washington from January to April, to work with the Microsoft Surface team on teaching gestural interface use and on measuring how well a user has learned it. With that in mind, here are the courses I took last semester and the projects I did.

Computational Biology
This course takes problems solved in computer science and maps them onto problems in biology that are becoming harder as more data becomes available. I initially took this course only because I am required to take "breadth" in my courses, but it turned out to be pretty enjoyable. I was part of a sub-group that examined current research into codon bias. My final project was on modifying a gene so that it performs better when moved from one organism to another, with respect to the host organism's codon bias. I wanted a picture from each course I did, but since I don't really have a good one for Comp. Bio., here's a bunch of completely unexplained equations I made for my final project.
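
For a sense of what that kind of modification looks like in practice, here is a minimal sketch of naive codon optimization: swap each codon for the host's most frequent synonymous codon. The frequency table is a made-up illustration rather than real usage data, and this isn't the actual method from my project.

```python
# Naive codon optimization sketch: replace each codon with the host
# organism's most frequent synonymous codon. The table below is a
# hypothetical illustration, not real codon-usage data.

# Hypothetical host codon frequencies, grouped by amino acid.
HOST_CODON_FREQ = {
    "L": {"CTG": 0.50, "CTC": 0.20, "CTT": 0.10, "CTA": 0.05, "TTG": 0.10, "TTA": 0.05},
    "K": {"AAA": 0.75, "AAG": 0.25},
    "F": {"TTT": 0.40, "TTC": 0.60},
}

# Invert the table so we can look up which amino acid a codon encodes.
CODON_TO_AA = {codon: aa for aa, table in HOST_CODON_FREQ.items() for codon in table}

def optimize(gene: str) -> str:
    """Replace every codon with the host's most frequent synonym."""
    out = []
    for i in range(0, len(gene), 3):
        codon = gene[i:i + 3]
        aa = CODON_TO_AA[codon]
        best = max(HOST_CODON_FREQ[aa], key=HOST_CODON_FREQ[aa].get)
        out.append(best)
    return "".join(out)

print(optimize("TTAAAGTTT"))  # -> "CTGAAATTC" under the toy table above
```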


Ubiquitous Computing
Ubicomp helped me get more coverage of the literature in my field, although more on the ubiquitous than the interactive side. I was interested in ambient displays, and my project partner was interested in household devices, so we came up with the humourous yet sincere project title "Devices that Bruise: Battered Device Syndrome". The idea was to give everyday items the ability to give feedback when they receive damage, whether that damage causes immediate harm or would only cause harm if the behaviour continued over the long run. Below is what we imagined a door would look like if it were slammed shut.


And here is a storyboard of some devices that have been "bruised".

In addition to giving the user feedback about unintentional misuse, these devices could let you know when you are in a bad mood, as the user above realizes, much as a good friend would alert you to your mood. For our prototype, we wired an accelerometer up to some LEDs, which isn't visually interesting enough to show here. One concern with this sort of behavioural feedback is that it might encourage the opposite response to the one we are looking for, as in the art project love hate punch, which was one of our inspirations. User testing would have to be done.
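
For the curious, the prototype's logic was roughly the following sketch: an impact above a threshold raises a "bruise" level, which slowly decays and drives the LEDs. The read_accelerometer() and set_leds() functions here are hypothetical stand-ins for the actual wiring, not code we shipped.

```python
import time

# Sketch of the bruising logic: impacts above a threshold add to a
# "bruise" level, which decays over time and is shown on the LEDs.

IMPACT_THRESHOLD = 2.5   # acceleration (in g) above which we count a slam
DECAY_PER_TICK = 0.005   # how quickly the bruise "heals" each loop tick

def read_accelerometer() -> float:
    """Hypothetical placeholder: return current acceleration magnitude in g."""
    raise NotImplementedError

def set_leds(level: float) -> None:
    """Hypothetical placeholder: map a 0..1 bruise level to LED output."""
    raise NotImplementedError

def run() -> None:
    bruise = 0.0
    while True:
        g = read_accelerometer()
        if g > IMPACT_THRESHOLD:
            # Harder slams bruise more, capped at full intensity.
            bruise = min(1.0, bruise + (g - IMPACT_THRESHOLD) * 0.2)
        bruise = max(0.0, bruise - DECAY_PER_TICK)
        set_leds(bruise)
        time.sleep(0.1)
```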

Machine Learning
This was mentioned in an earlier post. Upon learning that Restricted Boltzmann Machines, which are pretty good digit classifiers, can also generate new digits, I immediately wanted to apply this to HCI somehow. These animations were a large part of my inspiration. Given a user's ambiguous digit, I wanted to show them how they could improve their writing to make it easier for the device to recognize. The philosophy here isn't to show the user the "perfect" digit, but rather to show how their given digit could be improved. Personalized feedback. I was afraid that the animations wouldn't be smooth enough to be pedagogical, but my professor said he would be surprised if they weren't. It turns out that the animations were very smooth, but they did not always work perfectly. Below are examples of digits that were improved, starting with the original digit at the left and evolving to the right.
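
For anyone wondering where the animation frames come from, here is a rough sketch of the general idea: starting from the user's digit, each step of alternating Gibbs sampling in a trained RBM pulls the image a little closer to the model's learned notion of a clean digit, and the intermediate reconstructions become the frames. The trained parameters (W, b_v, b_h) are assumed to exist already; this illustrates the technique, not the exact code from my project.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def improve_digit(v0, W, b_v, b_h, steps=10, rng=None):
    """Return a list of frames evolving v0 toward the RBM's manifold.

    v0:  flattened digit image with values in [0, 1], shape (784,)
    W:   trained weight matrix, shape (784, n_hidden)
    b_v: trained visible biases, shape (784,)
    b_h: trained hidden biases, shape (n_hidden,)
    """
    rng = rng or np.random.default_rng()
    frames = [v0.copy()]
    v = v0.copy()
    for _ in range(steps):
        # Up pass: sample binary hidden states given the current image.
        p_h = sigmoid(v @ W + b_h)
        h = (rng.random(p_h.shape) < p_h).astype(float)
        # Down pass: reconstruct the visibles; keeping the probabilities
        # (rather than sampling) makes the animation frames smooth.
        v = sigmoid(h @ W.T + b_v)
        frames.append(v.copy())
    return frames
```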

For the above digits, the classifier can easily tell what the digits are, but improvement is still possible. However, for some digits in the dataset, it isn't really clear what was meant, and feedback on how to disambiguate towards either alternative must be shown. Below, we see an ambiguous digit in the centre. To make it a better 1, the evolution goes left. To make it a better 2, the evolution goes right.
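
One common way to steer the evolution toward a chosen class, and my best guess at how to frame the disambiguation above, is to train the RBM jointly on the pixels plus a one-hot label vector and then clamp the label units during sampling. Again, this is a sketch under that assumption, not the project's actual model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def improve_toward_class(v0, label, W, b_v, b_h, steps=10, rng=None):
    """Evolve digit v0 (shape (784,)) toward the digit class `label`.

    Assumes an RBM trained on a visible layer of 784 pixels plus 10
    one-hot label units, so W has shape (794, n_hidden) and b_v (794,).
    """
    rng = rng or np.random.default_rng()
    one_hot = np.zeros(10)
    one_hot[label] = 1.0
    v = np.concatenate([v0, one_hot])  # visible layer = pixels + label
    frames = [v0.copy()]
    for _ in range(steps):
        p_h = sigmoid(v @ W + b_h)
        h = (rng.random(p_h.shape) < p_h).astype(float)
        v = sigmoid(h @ W.T + b_v)
        v[784:] = one_hot              # clamp the label units every step
        frames.append(v[:784].copy())
    return frames
```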


Although it doesn't look that cool, I'm pretty excited by these results. I haven't seen classifiers in machine learning used in this way before, and I would like to think of more interactive ways to apply them. Generally, Artificial Intelligence and Human-Computer Interaction are not as closely connected as they could be, and I think they should be if we want to make computers worth caring about.
