What Comes After the Touch Screen?

In a few short years, the technologies found in today's mobile devices—touch screens, gyroscopes, and voice-control software, to name a few—have radically transformed how we access computers. To glimpse what new ideas might have a similar impact in the next few years, you needed only to walk into the Marriott Hotel in Cambridge, Massachusetts, this week. There, researchers from around the world demonstrated new ideas for computer interaction at the ACM Symposium on User Interface Software and Technology. Many focused on taking mobile devices in directions that feel strange and new today but could before long be as normal as swiping the screen of an iPhone or Android device.

    "We see new hardware, like devices activated by tongue movement or muscle-flexing, or prototypes that build on technology we already have in our hands, like Kinect, Wii, or the sensors built into existing phones," said Rob Miller, a professor at MIT's Computer Science and Artificial Intelligence Lab (CSAIL) and the chair of the conference.

One of the most eye-catching, and potentially promising, ideas on show makes it possible to perform complex tasks with a flick of the wrist or a snap of the fingers.

The interface, called Digits, created by David Kim, a U.K. researcher at both Microsoft Research and Newcastle University, is worn around the wrist and consists of a motion sensor, an infrared light source, and an infrared camera. Like a portable version of Kinect, Microsoft's motion-sensing device for the Xbox, Digits can follow arm and finger movements with enough accuracy to replicate them on screen or allow control of a complex computer game. "We envision a smaller device that could be worn like a watch that allows users to communicate with their surroundings and personal computing devices with simple hand gestures," said Kim.

Projects like Kim's could be a glimpse into the future of mobile computing. After all, prior to the iPhone's launch, multi-touch interfaces were found only at this kind of event. Researchers believe that mobile computers are still held back by the limitations of existing control methods, and that better ones could make these devices even more powerful.

    "We have an increasing desire and need to access and work with our computing devices anywhere and everywhere we are," Kim said. "Productive input and interaction on mobile devices is, however, still challenging due to the trade-offs we have to make regarding a device's form factor and input capacity."

    The advance of mobile technology has also given researchers easy ways to experiment. Several groups at the conference showed off modifications of existing mobile interfaces designed to give them new capabilities.

Hong Tan, a professor at Purdue University currently working at Microsoft Research Asia, demonstrated a way to add the feel of buttons and other physical controls to a touch screen: vibrating piezoelectric actuators installed on the side of a normal screen generate friction at the point of contact with a finger. The design, dubbed SlickFeel, can make an ordinary sheet of glass feel as if it has physical buttons, or even a physical slider with varying levels of resistance. Such haptic feedback could help users find the right control on compact devices like smartphones, or enable the use of a touch screen without looking at it—for example, while driving.
