The word of 2007 is “multi-touch”. It’s everywhere, from the iPhone to the Microsoft Surface. We weren’t happy with just one finger on the touch-sensitive screens, we wanted to put all 10 fingers and probably some toes on too, and multi-touch was born.
One of the most memorable and widespread demonstrations of multi-touch technology is probably the TED presentation by Jeff Han in 2006. I don’t know exactly why, but his prototype has become the benchmark for multi-touch technologies. However, if I recall correctly, the iPhone is the only portable device that could achieve the desired effect without the bulky equipment required in many other setups such as the Surface. The downside is that it is skin-sensitive only, an issue for people with big hands who want to use their nails, and for some users with disabilities.
The idea of having a multi-touch-capable laptop sounds pretty sweet, but of course no one would want to carry a cathode-ray tube in their backpack, so how do you come up with a solution for a screen less than an inch thick? Microsoft Research Cambridge’s Steve Hodges might have the answer.
Steve’s solution is remarkably simple yet effective. He took an off-the-shelf laptop and retrofitted some infrared sensors to the back of the screen, and together with the magic of software, you have multi-touch! Check out the following video clip (an excerpt from an MSR Cambridge video) to see his demonstration. It has all the dragging and pinching demos you’d expect to see in every multi-touch display, so don’t expect to be blown away.
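For a rough idea of what that software side might involve (purely a speculative sketch, not anything from MSR’s demo), the infrared sensors effectively give you a low-resolution brightness map of the screen, and picking out individual fingers boils down to thresholding that map and grouping neighbouring bright cells into blobs, one blob per finger:

```python
# Toy sketch of turning an infrared intensity frame into touch points.
# The frame values and threshold are made up for illustration.
from collections import deque

def find_touch_points(frame, threshold=0.6):
    """Return (row, col) centroids of connected bright regions in `frame`,
    a 2D list of infrared intensities in the range 0..1."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] < threshold or seen[r][c]:
                continue
            # Flood-fill one blob of bright cells and average its coordinates.
            queue, blob = deque([(r, c)]), []
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                blob.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx] and frame[ny][nx] >= threshold):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            cy = sum(y for y, _ in blob) / len(blob)
            cx = sum(x for _, x in blob) / len(blob)
            touches.append((cy, cx))
    return touches

# Two fingers on the screen show up as two separate blobs:
frame = [
    [0.1, 0.9, 0.8, 0.1, 0.1],
    [0.1, 0.9, 0.1, 0.1, 0.7],
    [0.1, 0.1, 0.1, 0.8, 0.9],
]
print(find_touch_points(frame))  # -> two centroids, one per finger
```

A real system would also track those centroids from frame to frame, so a pinch can be told apart from two separate drags.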
Granted, the sensors are a little thick at the moment and only cover a portion of the screen, but the approach is dead simple and presumably just as cheap. Obviously with some more work on the production and aesthetics side of things (no one wants a hole in the back of their screen), this could become a mainstream solution.
In addition, because the setup uses infrared technology, an added enhancement is the ability to pick up infrared signals from a standard remote. There’s nothing exactly new about remotes and laptops; several models on the market already use built-in remotes for media browsing. But again, this could drive the technology mainstream and spark a whole wave of do-it-yourself infrared hacks for computers.
Having said all that, the rate at which Microsoft Research projects are realized is not exactly fast or even promising for that matter. I hope this one picks up some pace and interest from computer manufacturers so all of us can twist and pinch pictures on our laptops in the near future.
Just wondering, what happens if someone behind you is trying to interfere with your computer by pointing their remote control at your laptop screen?
Great-looking tech; I get the impression that it would be nice and cheap to implement and robust in practice. Let’s just hope they pull their finger out (pun intended) and get some laptop/display manufacturers on board.
Surface presumably uses fewer IR cameras because they can be placed farther away and so have a wider field of view. Nonetheless, this might mean you could get a Surface-like computer that’s only a few inches thick, using an LCD screen instead of a projector.
@RC If I remember correctly Surface uses 5 cameras.
@John: Good point! Didn’t think of that.
@ClientEastman: I think you’re right, Surface’s pretty complicated.
For all of these multi-touch technologies, is there a way to keep the screen from getting greasy? I am not the type of person to wash my hands every five minutes, and I have enough trouble trying to keep my non-touch-sensitive cell phone screen clean.
Microsoft should make more hardware!
Hey Long…thanks for the linkage!
Dave – Lifekludger
@Dave: You are an inspiration. For those who don’t know, Dave is actually a quadriplegic who’s doing amazing things with computers. http://lifekludger.net/?page_id=5
Anyone else notice they took a Dell laptop and turned the screen backwards? Heh.
Cool idea. Something tells me that the Surface will be very expensive. This may provide some cheap competition.
I love how he says that it’s a “brand new technology” they’ve “developed” at Microsoft when that is a flat out lie. Multi-Touch has been around for a long time and was demoed by NYU years ago. It became mainstream with the iPhone.
It’s amusing to me that Apple can get Multi-Touch to work just perfectly on a small phone but MS still can’t on a large laptop screen. They need to connect all sorts of hardware to the back to rig it. The refresh rate is also slow on every MS demo I’ve seen. The images on the screen need time to catch up to the users’ fingers. It looks very sloppy.
Jared, the touch screen was in development 5 years ago, and the applications of Surface can do waaayyy more than what Apple has with the iPhone, which is only touch to execute a command.
@ Jared
What is wrong with you? NO ONE OWNS MULTI-TOUCH. It is a concept, not a technology. Just like no one owns the concept of an operating system or a mouse. This is a new concept, and many organizations are taking different approaches to implementing it. Isn’t this a good sign of innovation and advancement?
Hey, just don’t replace the mouse with this new technology; maybe some people won’t want to raise their arm off their desk and reach up out of their comfortable position just to navigate what’s displayed on the screen. For the tired or just plain lazy this might not be user-friendly; I think the mouse is still easier since you don’t have to use both hands. lol! Just kidding, I’m sure the mouse will still be an option. :)
I can’t believe people are missing the point. This demo isn’t claiming multi-touch is a new technology; it’s about a particular method of adapting an already existing laptop display to have multi-touch.
Microsoft does it again! Pointing a remote control and making a carousel of 4 icons spin… now where have I seen THAT before? Front Row on Intel Macs, maybe?
With multi-touch you don’t need a carousel spinning around with 4 items… if you must rip something off, just look at how the iPhone lets you choose items from Cover Flow.
Hey Long, nice find!! You should check out http://www.skinkers.com to see an example of how MSR technology is finding its way out of the labs faster. I posted a video with their main man, Matteo Berulcchi, talking about the IP they have licensed from MSR and how they’re bringing out products based on it already. All part of http://www.microsoft.com/about/legal/intellectualproperty/ipventures/default.mspx
videos at blogs.msdn.com/ptstv
Wow, looking forward to seeing what this can do in the future.
Looks like an iPhone!
How much time do you think it’ll take before multi-touch tablets are available??
How about multi-touch on an iMac 🙂
LOL, pretty cool.
But my grandfather has the same mouse as him…
Haha
The iPhone uses capacitive technology, which has been around for many, many years. With a capacitive system you need a conductive medium, like your finger, to change the sensor capacitance. The iPhone can’t see anything which is not conductive. The difference with the MSFT technology is that you can see anything which is on the surface, be it a finger, a credit card, or a plastic game piece. It is also different from the NYU technology, which requires the object placed on the screen surface to break the TIR condition of the light guide. The MSFT technology does not need the object to be in “optical contact” with the screen. Again, if you throw a credit card on the NYU system you won’t see it. With the Surface you will! That’s big, and it opens up many more possibilities for the Surface than any other “Multi-Touch” system under development.
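(As a rough back-of-the-envelope illustration of the “optical contact” point: total internal reflection in an FTIR light guide only holds while the material outside the acrylic is optically thinner, which is why a fingertip pressed onto the surface lets the infrared light leak out while a card hovering just above it does not. The refractive indices below are approximate textbook values, used purely for illustration.)

```python
# Approximate critical angles for total internal reflection (TIR) inside an
# acrylic light guide. Index values are rough textbook figures, not
# measurements from any of the systems discussed above.
import math

def critical_angle_deg(n_inside, n_outside):
    """Smallest angle of incidence (degrees) at which TIR occurs, or None
    if the outside medium is too dense for TIR to happen at all."""
    if n_outside >= n_inside:
        return None
    return math.degrees(math.asin(n_outside / n_inside))

n_acrylic = 1.49
print(critical_angle_deg(n_acrylic, 1.00))  # against air: ~42 deg, the IR stays trapped in the guide
print(critical_angle_deg(n_acrylic, 1.47))  # against skin (approx.): ~81 deg, a touching finger "frustrates" the TIR
```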
“eat the red pill”
It is a shame everyone here missed FingerWorks, a multi-touch keyboard that does all this and more today. I have been using them for the last 3 years, and even had one as a keyboard replacement in my Mac. Rumor has it that Apple bought the company and has used the technology in the iPhone. The keyboard is amazing: all the scrolling, zooming, and switching stuff, here today, and in many of my conventional apps. The only difference is that it isn’t on the screen.
http://www.fingerworks.com is all that is left.
GC
Jared,
iPhone didn’t make multitouch mainstream, it’s not even out yet. It wasn’t developed by NYU either; it was developed 20 years ago at Bell Labs.
The big difference here is that with every multitouch platform Microsoft has developed, we can develop for it. With every multitouch platform Apple has developed, you can’t build anything that utilizes the multitouch.
And your iPhone won’t work with gloves on. That’s why it went on sale in the summer.
GC, I am pretty sure it was Apple who bought FingerWorks, since it’s the same technology in the iPhone… capacitance. FingerWorks were the best around at the time, so it makes sense for Apple to buy them.
The Lenovo ThinkPad X series tablet PC has been able to do Multitouch since the X60 came out.
So this is not really something new.
Hi.
Good design, who makes it?
Andreas:
The X60 isn’t actually multitouch – it’s Multitouch – as in marketspeak. They’re referring to the fact that it’s touchscreen AND Wacom digitiser. The touchscreen part is a standard resistive touchscreen and can’t do multitouch.
The only full-screen multitouch laptop out there at the moment is the Dell Latitude XT (well, ok – it’s not actually out NOW… but it’s up on their site and should be out shortly – and you can’t actually use the multitouch yet because, of course, there’s no software for Windows that knows how to do multitouch – but hey – someone has to be first…)
gaothebao:
Way to completely miss the point of a post. BTW, there were PC media center apps using that rotating carousel of icons before Apple used it – heck, there were PCs using remotes before Apple as well.
Awesome. Flat out awesome find..
Can’t wait till I can get one (Insert drool here)
“We weren’t happy with just one finger on the touch-sensitive screens, we wanted to put all 10 fingers and probably some toes on too, and multi-touch was born.”
How true is it?