
On Fri, Oct 23, 2009 at 10:56 AM, Michael Fawcett <michael.fawcett@gmail.com> wrote:
> On Wed, Oct 21, 2009 at 3:53 PM, Stjepan Rajko <stjepan.rajko@gmail.com> wrote:
>> However, the documentation is lacking, so I'm curious to know which parts of the functionality (if any) are of interest to the Boost community, so I know what parts of the documentation to focus on. I am also considering proposing a presentation of this library to BoostCon.
> We would be interested if it could make something like this (video links to demonstration of multi-touch touch table):
Nice video. Did you work on that project or are you working on something similar?
> easier to code for. Currently the gesture recognition is pretty primitive, and none of it is generic. Would love to hear that this library could help.
You could use the library in its current state to provide a richer set of gestures. Here are some examples showing a vocabulary of gestures being trained and then recognized:

http://www.youtube.com/watch?v=LAX5qgzYHjU (iPhone gesture classification)
http://www.youtube.com/watch?v=mjjwhK4Dxt4 (on-line mouse gesture recognition)

You could do something similar in the system shown in your video, at least with single-touch gestures (maybe they could be used as shortcuts to some of the functions otherwise accessible through the menu). There was also a nuicode GSoC project this year that used the library for multi-touch gestures:

http://nuicode.com/projects/gsoc-gesture-models
http://code.google.com/p/multitouch-gestr/

That system works pretty well, but I don't think there is much documentation at this point. Would any of this be useful to you?

Best,

Stjepan