We will use an IR-filtered camera to read the “blobs” that appear when a user touches the screen: the surface is flooded with infrared light, and each point of contact scatters that light toward the camera, showing up as a bright blob. A computer program then processes these blobs as “touch events” and assigns each one an ID for tracking purposes. A tracking layer maintains the coordinates of each touch event by matching every ID to the nearest blob in each new camera frame and predicting its next position from its recent motion, smoothing out camera noise along the way. With this collected information we can recognize “gestures” and build a more “natural” interface.

Our specific design is unique in that it will incorporate an Artificially Intelligent (AI) user control point to aid in navigating the table. This is revolutionary because, to our knowledge, it is the first time an AI has been incorporated into a multi-touch system. Additionally, our AI, designed by our own Chief Software Engineer, Joshua Deaton, will be the first AI built in this manner. Our AI, named MARVIS, will be given the ability to simulate and “detect” emotion by using a system that reads words as emotional values arranged in a hierarchical syntax. Because of this, every human user can have a different experience and “relationship” with MARVIS based on the values they express. More interestingly, since MARVIS is not the essential core of our system, we have cast him as an ordinary user of the table. In other words, MARVIS will have his own login and can serve as a guide or assistant in any project or application that incorporates his code; for example, a member of the OIDC or a guest can call on MARVIS to initiate a task or log in.
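The ID-assignment step of the tracker can be sketched roughly as follows. This is a minimal illustration only, assuming blob centroids have already been extracted from the camera image; the function name, data layout, and distance threshold are hypothetical and not our actual tracking code:

```python
import math

def assign_ids(prev_tracks, blobs, next_id, max_dist=50.0):
    """Match each detected blob to the nearest existing track (greedy
    nearest-neighbor); blobs with no nearby track start new tracks.

    prev_tracks: dict of id -> (x, y) from the previous frame
    blobs: list of (x, y) centroids detected in the current frame
    Returns (tracks, next_id): updated id -> (x, y) map and next unused ID.
    """
    tracks = {}
    unmatched = dict(prev_tracks)
    for bx, by in blobs:
        # Find the closest still-unmatched track within max_dist pixels.
        best_id, best_d = None, max_dist
        for tid, (tx, ty) in unmatched.items():
            d = math.hypot(bx - tx, by - ty)
            if d < best_d:
                best_id, best_d = tid, d
        if best_id is not None:
            tracks[best_id] = (bx, by)   # same finger, position updated
            del unmatched[best_id]
        else:
            tracks[next_id] = (bx, by)   # a new touch event
            next_id += 1
    return tracks, next_id
```

A real tracker would also predict each track's next position from its velocity before matching, which is what lets IDs survive noisy or briefly missing blobs.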
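As a rough illustration of what reading words as emotional values can mean, here is a toy sketch. The flat valence table and negator handling below are invented placeholders for this example; MARVIS's actual hierarchical-syntax system is considerably more elaborate:

```python
# Toy valence lexicon: each word maps to an emotional value.
# These entries are illustrative only, not MARVIS's real data.
VALENCE = {"great": 2, "happy": 2, "good": 1, "bad": -1, "terrible": -2}
NEGATORS = {"not", "never", "no"}

def emotional_value(sentence):
    """Score a sentence by summing per-word emotional values,
    flipping the sign of any word preceded by a negator."""
    words = sentence.lower().strip(".!?").split()
    score = 0
    for i, word in enumerate(words):
        value = VALENCE.get(word, 0)
        if i > 0 and words[i - 1] in NEGATORS:
            value = -value
        score += value
    return score
```

Tracking scores like these per user is one simple way an assistant could give each person a different “relationship,” as described above.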
Additionally, our project will focus on developing apps for our multi-touch table. Our OS of choice will be Windows 8, and the apps we create will supplement the standard gesture set with our own as well as enhance the user's experience with the table. Our apps will be developed in Kivy, a multi-touch framework built on the Python programming language. Since developing this table relies heavily on material rarely taught in schools, the OIDC will also give free classes to students in all areas of the design and development. And since the project is a student research initiative, we want to help anyone interested in recreating, collaborating on, and improving our design. For this reason, so long as credit is given where it is due, the project will be open to all. Our programs and code will be entirely open source, and the design/build process will be fully documented, photographed, and split into video developer diaries so that anyone interested in NUI technology can recreate our work and learn more about multi-touch technologies. We want to open up new and interesting fields for students, eventually leading to cutting-edge research in sixth-sense technology, holography, and other innovative NUI projects.
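As a minimal example of the kind of gesture logic such apps can layer over raw touch events, here is a sketch of a swipe classifier. The helper name and pixel threshold are illustrative assumptions, not part of our actual gesture set:

```python
def classify_swipe(points, min_dist=30.0):
    """Classify a touch trace (a list of (x, y) points from one touch ID)
    as a swipe direction.

    Returns 'left', 'right', 'up', or 'down', or None when the finger
    moved less than min_dist pixels overall (i.e., a tap). Assumes the
    y axis grows downward, as in typical screen coordinates.
    """
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_dist:
        return None  # too little movement to count as a swipe
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

In a Kivy app, the point list would be accumulated across a touch's move events and classified when the touch is released.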
Our website/blog, with major development updates and a video of our detection-test program running on our first rudimentary prototype, can be found here: http://opticalinterface.wordpress.com/
More updates are coming soon, as our project officially launches on Wednesday, August 29th, 2012.
The entrant for this contest, Joshua Deaton, is indeed 21 years of age.