Apple is developing new display technology that embeds thousands of microscopic optical sensors into the screen, allowing the device to capture images through the display itself, and thus both display and capture images simultaneously.
I'd like to see this used to track hand and pointer-device movements: by comparing what the different sensors see from multiple angles, the device could calculate where anything pointing at the screen is and how far away it is. It might even be able to deduce the angle, size, and color of whatever you're interacting with, and respond accordingly.
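The multi-angle idea boils down to triangulation: if two sensors at known positions each report the direction in which they see the pointer, intersecting those two rays gives its position. Here's a minimal 2D sketch of that geometry; the sensor layout and the idea that each sensor reports a bearing angle are my assumptions, not anything Apple has described.

```python
import math

def triangulate(sensor_a, angle_a, sensor_b, angle_b):
    """Estimate a pointer's 2D position from two sensors.

    Hypothetical setup: each sensor sits at a known (x, y) position on
    the screen plane and reports the angle (radians, measured from the
    +x axis) at which it sees the pointer. Intersecting the two rays
    yields the pointer's position.
    """
    ax, ay = sensor_a
    bx, by = sensor_b
    # Unit direction vectors of the two sight rays
    dax, day = math.cos(angle_a), math.sin(angle_a)
    dbx, dby = math.cos(angle_b), math.sin(angle_b)
    # Solve a + t*da = b + s*db for t via the 2D cross product
    denom = dax * dby - day * dbx
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; no unique intersection")
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)

# Sensors at two screen corners, pointer hovering at (2, 2):
# sensor A sees it at 45 degrees, sensor B at 135 degrees.
x, y = triangulate((0, 0), math.pi / 4, (4, 0), 3 * math.pi / 4)
```

With thousands of sensors rather than two, the same calculation could be repeated across many sensor pairs and averaged, which is where the extra precision (and the extra processing cost) would come from.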
Imagine a nice new 30-inch wall-mounted display from Apple in your office that you can draw on in whatever color your marker cap happens to be, or where you can 'grasp' items on the screen and move them around.
The biggest advantage here is that you could get extremely high-precision touch sensitivity without adding any hardware beyond what Apple is already building into these displays, and it could offer better resolution than traditional touch methods while providing richer input information.
Of course, processing all that sensor data might be expensive, but processing costs always go down.