Here is a video of my display in action:
Step 1: Theory of operation
Han, J. Y. 2005. Low-Cost Multi-Touch Sensing through Frustrated Total Internal Reflection. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST '05).
The figure below comes from his web site.
An acrylic panel is edge-lit with infrared LEDs. As long as nothing touches the acrylic, very little of that light escapes; it just reflects around inside the panel (total internal reflection). When your finger comes in contact with the surface, it frustrates that reflection and scatters infrared light out the back, where it is visible to an infrared camera. Image processing then takes care of detecting the tips of fingers and relaying their locations to application software. Since the camera "reads" the whole display in parallel, it is easy to detect multiple fingertips at once, even those belonging to multiple users. All of this sensing goes on in the infrared spectrum, leaving the visible spectrum free for displaying interactive software.
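To make the detection step concrete, here is a minimal sketch of what the image-processing stage does: threshold the IR camera frame, group bright pixels into blobs, and report each blob's centroid as a fingertip. This is an illustrative toy (a flood-fill connected-components pass in plain NumPy), not the code the DIY community toolkits actually use; the function name, threshold, and minimum blob area are all assumptions.

```python
# Toy fingertip detector for a grayscale IR frame (hypothetical sketch).
# Real trackers add background subtraction, smoothing, and frame-to-frame
# tracking; this shows only the threshold-and-find-blobs core idea.
from collections import deque

import numpy as np


def find_fingertips(frame, thresh=200, min_area=20):
    """Return (x, y) centroids of bright blobs in a 2-D uint8 frame."""
    bright = frame >= thresh                 # pixels lit by a touch
    visited = np.zeros(frame.shape, dtype=bool)
    h, w = frame.shape
    tips = []
    for y in range(h):
        for x in range(w):
            if not bright[y, x] or visited[y, x]:
                continue
            # Flood-fill (BFS) to collect every pixel in this blob.
            queue = deque([(y, x)])
            visited[y, x] = True
            pixels = []
            while queue:
                cy, cx = queue.popleft()
                pixels.append((cy, cx))
                for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                               (cy, cx + 1), (cy, cx - 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and bright[ny, nx] and not visited[ny, nx]):
                        visited[ny, nx] = True
                        queue.append((ny, nx))
            # Ignore tiny specks of noise; keep finger-sized blobs.
            if len(pixels) >= min_area:
                ys, xs = zip(*pixels)
                tips.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return tips


# Synthetic frame: a dark display with two bright "fingertip" patches.
frame = np.zeros((120, 160), dtype=np.uint8)
frame[20:30, 40:50] = 255
frame[70:80, 100:110] = 255
print(sorted(find_fingertips(frame)))
```

Because each touch shows up as an independent bright region, detecting two, five, or ten fingers is the same loop, which is exactly why this sensing approach scales to multiple users for free.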
Since most hobbyists can't afford multiple projectors (I don't even own one; I borrowed this one from dr.eel), my design uses a swiveling ceiling mount so the projector can either be used in standard mode (say, for watching movies) or aimed downward, bouncing off a reflector and onto the multitouch display screen.
The screen itself can be constructed from hardware-store materials with hand tools. Excluding the projector and the modified webcam (both commodity items these days), the only complicated part is the software. Halfway through this project, I was happy to discover a thriving DIY community that has already undertaken the task of writing the image-processing code, along with several cool open source demos, which can be found here: