
On Sat, Oct 24, 2009 at 5:39 PM, Stjepan Rajko <stjepan.rajko@gmail.com> wrote:
Nice video. Did you work on that project or are you working on something similar?
I worked with the software and hardware shown in the video, but I did not work on that particular demonstration.
You could use the library in its current state to provide a richer set of gestures. Here are some examples showing a vocabulary of gestures being trained and then recognized:
http://www.youtube.com/watch?v=LAX5qgzYHjU (iPhone gesture classification)
http://www.youtube.com/watch?v=mjjwhK4Dxt4 (on-line mouse gesture recognition)
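To make the train-then-recognize workflow concrete, here is a minimal sketch of the general idea, not the library's actual API: a nearest-neighbor classifier over 2D traces using dynamic time warping (DTW). All names here (GestureClassifier, train, recognize) are hypothetical.

```python
# Illustrative sketch only -- not the library's real interface.
# Each gesture example is a list of (x, y) points; classification
# picks the stored template with the smallest DTW distance.

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two (x, y) point sequences."""
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = ((a[i - 1][0] - b[j - 1][0]) ** 2
                 + (a[i - 1][1] - b[j - 1][1]) ** 2) ** 0.5
            cost[i][j] = d + min(cost[i - 1][j],      # skip a point of a
                                 cost[i][j - 1],      # skip a point of b
                                 cost[i - 1][j - 1])  # match both points
    return cost[n][m]

class GestureClassifier:
    """Stores labeled example traces; recognizes by nearest DTW neighbor."""

    def __init__(self):
        self.templates = []  # list of (label, trace) pairs

    def train(self, label, trace):
        self.templates.append((label, trace))

    def recognize(self, trace):
        return min(self.templates,
                   key=lambda t: dtw_distance(t[1], trace))[0]

clf = GestureClassifier()
clf.train("swipe_right", [(0, 0), (1, 0), (2, 0), (3, 0)])
clf.train("swipe_up", [(0, 0), (0, 1), (0, 2), (0, 3)])
# A noisy rightward trace should match the "swipe_right" template.
print(clf.recognize([(0, 0), (1.1, 0.1), (2.0, -0.1), (3.2, 0.0)]))
```

The real library presumably offers richer models (e.g. probabilistic on-line recognition, as in the videos), but the same train/recognize split applies.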
You could do something similar in the system shown in your video, at least with single-touch gestures (maybe they could be used as shortcuts to some of the functions otherwise accessible through the menu).
There was also a nuicode GSoC project this year that used the library for multi-touch gestures:
http://nuicode.com/projects/gsoc-gesture-models
http://code.google.com/p/multitouch-gestr/
The system works pretty well, but I don't think there is much documentation at this point.
Would any of this be useful to you?
Yes, I had seen the nuicode project, but didn't follow it very closely. It's good to hear that your library would be useful in this context. Thanks!

--Michael Fawcett