Wednesday, February 10, 2010

Looking beyond the screen: Text-to-speech and eyes-free interaction

Question:

How do you build a touch-screen application, on a device with no keyboard, for someone who has limited sight or simply cannot look at the screen (for example, because they are driving)?

I was fortunate enough to attend Google I/O last year (each year it just gets better and better - if you missed it, register now). Among the great sessions I attended was one that deals with two of my favourite subjects - user interaction and voice technology. Presented by T.V. Raman and Charles L. Chen, it covers the Eyes-Free Project: a project that aims to enable fluent eyes-free use of mobile devices running Android. Target uses range from eyes-busy environments like in-car use to users who can't, or don't want to, look at the visual display. When you watch the video you will see some of the UI innovations for taking advantage of the touch screen without needing to actually look at it.
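One idea demonstrated in the talk is relative positioning: instead of hunting for a fixed on-screen keypad, wherever your finger lands becomes the centre (the digit 5), and a short stroke in any of the eight compass directions selects the surrounding digit, so you never need to see the screen. The sketch below is my own minimal illustration of that idea in Python, not the project's actual code; the function names and the tap-radius threshold are assumptions for the example.

```python
import math

# Digits of a phone keypad arranged clockwise around 5, starting
# from "straight up": stroke up = 2, up-right = 3, right = 6, etc.
DIGITS_CLOCKWISE_FROM_UP = [2, 3, 6, 9, 8, 7, 4, 1]

def stroke_to_digit(down, up, tap_radius=20):
    """Map a touch gesture to a keypad digit using relative positioning.

    `down` and `up` are (x, y) screen points, with y growing downward
    as on most touch screens. A movement shorter than `tap_radius`
    pixels counts as a tap, which always means the centre digit, 5.
    """
    dx = up[0] - down[0]
    dy = up[1] - down[1]
    if math.hypot(dx, dy) < tap_radius:
        return 5  # a tap anywhere on the screen is the keypad centre
    # Angle of the stroke, measured clockwise from "straight up".
    angle = math.degrees(math.atan2(dx, -dy)) % 360
    # Quantise into eight 45-degree sectors centred on the compass points.
    sector = int((angle + 22.5) % 360 // 45)
    return DIGITS_CLOCKWISE_FROM_UP[sector]
```

Because every gesture is interpreted relative to where the finger first touches, the interface works identically at any spot on the screen - exactly the property that makes it usable without looking.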


You can download the applications to your Android phone. It's great to see such innovation around user interactivity - especially on a mobile device. The mobile device has the ability to sense more about its environment than your PC can.

You can download the presentation here

Going beyond the mobile use case, it does not take much imagination to see the same principles being applied to the next generation of touch-screen tablet devices, or to the humble trackpad on your existing laptop. Imagine a data capture or input mechanism driven by your gestures and not just the final click - think about the journey and not just the final destination. Could you optimise an operation in such a way as to make it more intuitive and yet more productive? Watch the video and find out...

The Eyes Free project is here