A Multi-Touch Table with an AI User Control Point
Intro: A Multi-Touch Table with an AI User Control Point
The Optical Interface Design Club (OIDC) is a research- and design-based student organization founded at The University of Alabama in Huntsville in the Fall Semester of
2012. The aim of the OIDC is to give all students, especially undergraduates, research-level experience and a chance to apply classroom knowledge in Natural User
Interface (NUI) technologies, a branch of human-computer interaction. Inspired by the recent resurgence of touchscreen technology, our goal is to leave our mark on the
multi-touch community. Our dream project is to build a 52” multi-user, multi-touch table based on the Rear Diffused Illumination (Rear DI) technique, which belongs to
the planar infrared touchscreen category. Our design will feature an enclosed box “flooded” with infrared (IR) light. The light source will point downward so that the
light disperses and illuminates the screen evenly.
We will use an IR-filtered camera to read the “blobs” that appear when a user touches the screen; in Rear DI, a fingertip on the surface reflects IR light back toward
the camera and shows up as a bright blob. These blobs are then processed by a computer program as “touch events” and are given IDs for tracking purposes. A tracking
layer maintains the coordinates of each touch event by matching an ID's last known position against a prediction of where it should move next, derived from noise
coefficients and the track's orientation. With this information we can recognize “gestures”, map them onto functions, and create a more “natural” interface.
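As a rough illustration of that tracking step, the Python sketch below (not our actual tracker; the nearest-neighbor matching, distance threshold, and constant-velocity prediction are all simplifying assumptions) assigns each frame's blob centroids to persistent track IDs:

import math
from itertools import count

_next_id = count(1)

class Track:
    """One tracked touch: position plus an estimated per-frame velocity."""
    def __init__(self, x, y):
        self.id = next(_next_id)
        self.x, self.y = x, y
        self.vx = self.vy = 0.0

    def predict(self):
        # Where the blob should be this frame if it keeps its velocity.
        return self.x + self.vx, self.y + self.vy

    def update(self, x, y):
        self.vx, self.vy = x - self.x, y - self.y
        self.x, self.y = x, y

def step(tracks, blobs, max_dist=40.0):
    """Match a frame's blob centroids (x, y) to live tracks. Unmatched
    blobs become new touch-down events; unmatched tracks end (touch-up)."""
    unmatched = list(blobs)
    survivors = []
    for t in tracks:
        if not unmatched:
            continue
        px, py = t.predict()
        bx, by = min(unmatched, key=lambda b: math.hypot(b[0] - px, b[1] - py))
        if math.hypot(bx - px, by - py) <= max_dist:
            unmatched.remove((bx, by))
            t.update(bx, by)
            survivors.append(t)
    survivors += [Track(x, y) for x, y in unmatched]
    return survivors

# Usage: feed one frame of centroids at a time; IDs stay stable as fingers move,
# which is what lets gesture recognition run on top.
tracks = step([], [(100, 100), (300, 200)])   # two touch-down events
tracks = step(tracks, [(104, 102), (305, 198)])  # same fingers, slightly moved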
Our specific design is unique in that it will incorporate an Artificially Intelligent (AI) user control point to aid in navigating the table. This is revolutionary: it
is the first time an AI has been incorporated into a multi-touch system. Our AI, named MARVIS and designed by our own Chief Software Engineer, Joshua Deaton, is also
built in a new way: MARVIS will be given the ability to simulate and “detect” emotion using a system that reads words as emotional values in a hierarchical syntax.
Based on this concept, every human user can have a different experience and “relationship” with MARVIS depending on the values they express. More interestingly, since
MARVIS is not the essential core of our system, we have crafted his duties as those of a normal user of the table. In other words, MARVIS will have his own login and
can serve as a guide or assistant in any project or application we incorporate his code into; for example, a member of the OIDC or a guest can call upon MARVIS to
initiate a task or log in.
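MARVIS's internals are not published here, so the following is only a deliberately simplified illustration of the word-as-emotional-value idea; the lexicon, the modifier weights, and the scoring rule are all placeholder assumptions rather than MARVIS's actual design:

# Toy emotional-value scorer: words carry signed values, and simple
# syntactic modifiers scale or flip the value of the word that follows.
LEXICON = {"great": 2.0, "happy": 1.5, "fine": 0.5, "slow": -1.0, "awful": -2.0}
MODIFIERS = {"very": 1.5, "slightly": 0.5, "not": -1.0}

def emotional_value(sentence):
    score, weight = 0.0, 1.0
    for word in sentence.lower().split():
        if word in MODIFIERS:          # modifier applies to the next word
            weight *= MODIFIERS[word]
        elif word in LEXICON:
            score += weight * LEXICON[word]
            weight = 1.0
    return score

print(emotional_value("MARVIS is very great"))   # 3.0
print(emotional_value("the table is not slow"))  # 1.0

A running total of such values per user is one way a system like this could give each person a different “relationship” with the AI.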
Additionally, our project will focus on developing apps for our multi-touch table. Our OS of choice will be Windows 8, and the apps we create will overlay the stock
gesture set with our own as well as enrich the user's experience with the table. Our apps will be developed in Kivy, a multi-touch framework based on the Python
programming language.
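To give a feel for Kivy's multi-touch event model, here is a minimal, self-contained demo; it is a sketch of the kind of app described above rather than one of our actual applications. It draws a dot under every active touch ID and moves it with the finger:

# Minimal Kivy multi-touch demo: each touch event arrives with its own id
# and coordinates, matching the "touch event + ID" model described earlier.
from kivy.app import App
from kivy.uix.widget import Widget
from kivy.graphics import Color, Ellipse

class TouchCanvas(Widget):
    def on_touch_down(self, touch):
        with self.canvas:
            Color(0, 0.7, 1)
            touch.ud["dot"] = Ellipse(pos=(touch.x - 25, touch.y - 25),
                                      size=(50, 50))

    def on_touch_move(self, touch):
        if "dot" in touch.ud:
            touch.ud["dot"].pos = (touch.x - 25, touch.y - 25)

    def on_touch_up(self, touch):
        if "dot" in touch.ud:
            self.canvas.remove(touch.ud["dot"])

class TouchDemoApp(App):
    def build(self):
        return TouchCanvas()

if __name__ == "__main__":
    TouchDemoApp().run()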
Since the knowledge needed to develop this table relies heavily on material rarely taught in schools, the OIDC will also be giving free classes to students in all areas
of the design and development. And since the project is a student research initiative, we want to help anyone interested in recreating, collaborating on, or improving
our design. For this reason, as long as credit is given where it is due, the project will be open to everyone. Our programs and code will be entirely open source, and
the design and build process will be fully documented, photographed, and split into video developer diaries so that anyone interested in NUI technology can recreate it
and learn more about multi-touch technologies. We want to open new and interesting fields for students that will eventually lead to cutting-edge research in sixth-sense
technology, holography, and other innovative NUI projects.
Our website/blog, with major development updates and a video of our detection-test program running on our first rudimentary prototype, can be found here:
http://opticalinterface.wordpress.com/
More updates are coming soon: our project officially launches Wednesday, August 29th, 2012.
The entrant for this contest, Joshua Deaton, is indeed 21 years of age.
DOB: 4/18/1991