So now all I need is $100k and I'm ready to rock.

Jefferson Y. Han likes big computer monitors. If a screen is large enough, four or five people can work at it together, rearranging blueprints, say, or editing photos. But they can't do that if they're taking turns at a keyboard and mouse. The answer, which Han demonstrates on a 3 x 8-ft. monitor in his lab at New York University, is multitouch input. It allows any number of users to lay hands on the screen as if they were manipulating real objects. On the monitor, recently dubbed the Media Wall, Han uses his hands to spin a virtual globe and then zoom into the canyons of Manhattan. "A mouse is an indirect pointing device," Han says. "You're working with an object that's not on the screen. Multitouch computing is direct manipulation."

Han, the son of Korean immigrants, drew inspiration from the way light diffuses when you touch a glass of water, a phenomenon known as "frustrated total internal reflection." Han attached LEDs to the side of a piece of clear acrylic (his screen) and mounted an infrared camera on the back. Light traveled through the acrylic. When Han touched the screen, some light diffused out and was captured by the camera. He then designed software that allows the screen to be used for operating mapping programs, handling photos and drawing animated figures. Perceptive Pixel, the business Han started with an NYU colleague, builds custom multitouch computers for private industry and military clients (and, now, for anyone with $100,000).

Multitouch computing isn't new: the concept has been kicking around since the 1980s, and two Breakthrough Award-winning products, Apple's iPhone and Microsoft's Surface, use versions of it. Han's special contribution: beyond his varied applications for multitouch computing lies a broad vision of how it can empower people to work together in new ways.
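To make the frustrated-total-internal-reflection trick concrete: in a rig like Han's, each fingertip pressed against the acrylic scatters infrared light toward the camera, so touches show up as bright blobs on an otherwise dark camera frame. Here's a minimal sketch (my own illustration, not Perceptive Pixel's software) of how that camera image could be turned into touch points: threshold the frame, flood-fill each connected bright blob, and report its centroid. The function name and the threshold value of 200 are assumptions for the sketch.

```python
def find_touches(frame, threshold=200):
    """Return (row, col) centroids of bright blobs in an infrared frame.

    In an FTIR setup, a touch scatters IR light into the camera, so each
    fingertip appears as a bright spot. `threshold` (assumed value) marks
    the brightness above which a pixel counts as part of a touch.
    """
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill this connected blob of bright pixels.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob's centroid is the reported touch point.
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                touches.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return touches

# A synthetic 16x16 camera frame with two "fingertips" on the screen.
frame = [[0] * 16 for _ in range(16)]
for rr in range(2, 4):
    for cc in range(2, 4):
        frame[rr][cc] = 255      # first touch
for rr in range(10, 13):
    for cc in range(9, 12):
        frame[rr][cc] = 255      # second touch

print(find_touches(frame))  # [(2.5, 2.5), (11.0, 10.0)]
```

Because every touch is just another blob, the same pass handles one finger or ten, which is what lets several people work on the screen at once.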