
What Next After The Touch Screen?

by James L. Developer

In just a few years, the technologies in today's mobile devices (touch screens, gyroscopes, voice-control software, and so on) have radically transformed the way we access computers. To envision which new ideas could have a similar impact in the years ahead, one only had to visit the Marriott Hotel in Cambridge, Massachusetts (USA) this week. Researchers from around the world gathered there to demonstrate new ideas in human-computer interaction at the ACM Symposium on User Interface Software and Technology. Many of them focused on taking mobile devices in directions that currently feel strange and new, but that could become as normal as swiping a finger across the screen of an iPhone or Android device. "We are seeing new kinds of hardware, such as devices activated by tongue movement or muscle flexing, and prototypes built on technology we already have in our hands, such as the Kinect, the Wii, or the sensors embedded in existing phones," said Rob Miller, a professor at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and chair of the conference.

One of the most striking and potentially promising ideas on the program made it possible to perform complex tasks with a flick of the wrist or a snap of the fingers. The interface, called Digits, created by David Kim, a UK researcher who works at both Microsoft Research and Newcastle University, is worn on the wrist and carries a motion sensor, an infrared light source, and a camera. Like a portable version of Kinect, Microsoft's motion-sensing device for the Xbox, Digits can follow the movements of the arm and fingers accurately enough to control what happens on screen or to play a complex computer game. "We have in mind a smaller device that could be worn like a watch and would allow users to interact with their surroundings and their personal computing devices through simple hand gestures," Kim said (see a video of Digits in action). Projects like Kim's could offer a glimpse of the future of mobile computing. After all, before the launch of the iPhone, multi-touch interfaces were found only at events like this one. The researchers believe that mobile computers are being held back by the limitations of existing control methods, and that without those limits they could become even more powerful. "We have a growing desire and need to access and work with our computing devices anywhere," said Kim. "However, achieving a productive level of input and interaction on mobile devices remains a challenge because of the trade-offs we have to make in terms of form factor and input capability."
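Digits reconstructs a hand pose from its infrared camera and motion sensor; one simple way such a pose could drive an interface is by thresholding per-finger flexion angles into discrete gestures. The Python sketch below is purely illustrative, with made-up angle thresholds and a hypothetical joint-angle feed; it is not Kim's actual recognition pipeline.

# Illustrative sketch: mapping recovered finger flexion angles to simple
# gestures. The thresholds and the sensor feed are hypothetical; this is
# NOT Kim's actual Digits pipeline.

# One flexion angle (degrees) per finger: thumb, index, middle, ring, pinky.
# 0 = fully extended, ~90 = fully curled.
JointAngles = list[float]

def classify_pose(angles: JointAngles) -> str:
    """Map a coarse hand pose to a gesture label using simple thresholds."""
    thumb, index, middle, ring, pinky = angles
    curled = [a > 60 for a in (index, middle, ring, pinky)]
    if all(curled):
        return "fist"        # e.g., grab an on-screen object
    if not any(curled):
        return "open_hand"   # e.g., release
    if thumb > 45 and index > 45 and not any(curled[1:]):
        return "pinch"       # e.g., select or anchor a zoom
    return "unknown"

if __name__ == "__main__":
    print(classify_pose([50.0, 70.0, 15.0, 10.0, 5.0]))  # -> pinch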

The advance of mobile technology has given researchers easy ways to experiment. Several groups at the conference showed modifications of existing mobile interfaces designed to give them new capabilities. Hong Tan, a professor at Purdue University who currently works at Microsoft Research Asia, demonstrated a way to add the feel of physical buttons and other controls to a touch screen: piezoelectric actuators capable of vibrating, installed along the sides of an ordinary screen, create friction at the point where the finger makes contact. The design, called Slick Feel, can make a piece of ordinary glass feel like physical buttons, or even like a physical slider with varying levels of resistance. This kind of tactile feedback could help users find the right control on compact devices such as smartphones, or allow a touch screen to be used without looking, for example while driving.
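As a rough illustration of the software side of such feedback, the sketch below modulates an actuator's drive level as a finger approaches virtual "detents" on a slider. The detent positions, falloff width, and driver call are all invented for demonstration; the article does not describe Slick Feel's actual control scheme.

# Illustrative sketch of friction-style feedback for a virtual slider:
# drive a vibration actuator harder near virtual "detents" so the finger
# feels resistance at those points. The driver interface is hypothetical.

DETENTS_MM = [0.0, 10.0, 20.0, 30.0, 40.0]  # virtual notch positions
PEAK_AMPLITUDE = 1.0                        # normalized actuator drive
FALLOFF_MM = 2.0                            # how wide each notch feels

def actuator_amplitude(finger_x_mm: float) -> float:
    """Return the drive level for the current finger position."""
    nearest = min(DETENTS_MM, key=lambda d: abs(d - finger_x_mm))
    distance = abs(nearest - finger_x_mm)
    # Full drive at the notch, tapering linearly to zero at FALLOFF_MM.
    return max(0.0, PEAK_AMPLITUDE * (1.0 - distance / FALLOFF_MM))

print(actuator_amplitude(9.0))   # -> 0.5, halfway into the 10 mm notch

# On real hardware this would run inside the touch-event loop, e.g.:
#   while touching:
#       set_piezo_drive(actuator_amplitude(read_finger_x()))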

In an attempt to get more out of touch screens, Chris Harrison of Carnegie Mellon University has introduced a way for devices to recognize individual people's touches and swipes. His interface, a capacitive touch screen connected to a sensor that measures impedance, identifies the 'impedance profile' of a person's body through their fingers. Users have to hold a finger on the device for a few seconds the first time they use it, and subsequent touches are attributed to them. This could allow applications to do things such as track the changes different people make to a document while working together on the same tablet (see a video of the screen). "It's similar to technology that is already in smartphones," said Harrison. "This has many implications for games (no more split screens) and collaborative applications."
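The article doesn't spell out Harrison's matching algorithm, but one plausible pattern for attributing touches to enrolled users is nearest-neighbor matching of impedance profiles, treated here as vectors of sensor readings. The sketch below uses made-up profile data and a made-up distance threshold; it shows only the general enroll-then-match idea.

# Illustrative sketch: attributing touches to enrolled users by
# nearest-neighbor matching of impedance profiles. The profiles here are
# hypothetical vectors of sensor responses; not Harrison's actual method.

import math

def distance(a: list[float], b: list[float]) -> float:
    """Euclidean distance between two impedance profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class TouchAttributor:
    def __init__(self, threshold: float = 5.0):
        self.enrolled: dict[str, list[float]] = {}
        self.threshold = threshold  # reject matches worse than this

    def enroll(self, user: str, profile: list[float]) -> None:
        """Store the profile captured while the user holds a finger down."""
        self.enrolled[user] = profile

    def attribute(self, profile: list[float]) -> str | None:
        """Return the enrolled user whose stored profile is closest, if any."""
        if not self.enrolled:
            return None
        user, d = min(
            ((u, distance(p, profile)) for u, p in self.enrolled.items()),
            key=lambda item: item[1],
        )
        return user if d <= self.threshold else None

attributor = TouchAttributor()
attributor.enroll("alice", [1.0, 2.1, 3.0, 2.4])
attributor.enroll("bob",   [4.0, 1.2, 0.5, 3.3])
print(attributor.attribute([1.1, 2.0, 2.9, 2.5]))  # -> alice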

The motion and touch sensors in current phones are another target for experimentation. Mayank Goel, a PhD student at the University of Washington (USA), together with his colleagues, has modified the software of an Android phone so that it automatically determines which hand the person is holding it in. The software works this out by monitoring the angle at which the device is tilted, as revealed by its motion sensor, and the precise shape of presses on the touch screen. Goel said this could make it possible to create keyboards that automatically adjust to whether a person is typing with the left or right hand. In his experiments, this adjustment reduced typing errors by 30 percent.
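As a toy version of the tilt half of that heuristic, the sketch below guesses the holding hand from the device's roll angle and nudges keyboard targets toward the inferred thumb. The thresholds, sign convention, and pixel offsets are invented for illustration and are not Goel's actual model.

# Illustrative sketch: inferring which hand holds the phone from its roll
# angle, then shifting keyboard targets toward that thumb. Thresholds and
# the keyboard-adjustment step are made up for demonstration.

def infer_holding_hand(roll_degrees: float) -> str:
    """A phone held one-handed tends to roll toward the gripping hand.

    roll_degrees: device roll from the motion sensor; negative = leaning
    left, positive = leaning right (sign convention assumed here).
    """
    if roll_degrees < -5.0:
        return "left"
    if roll_degrees > 5.0:
        return "right"
    return "unknown"  # near-flat: possibly two hands, or lying on a table

def shifted_key_center(key_x: float, hand: str, shift_px: float = 4.0) -> float:
    """Bias a key's touch target toward the reaching thumb."""
    if hand == "left":
        return key_x + shift_px
    if hand == "right":
        return key_x - shift_px
    return key_x

print(infer_holding_hand(8.2))             # -> right
print(shifted_key_center(100.0, "right"))  # -> 96.0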

Other prototypes on display had a less obvious connection to today's pocket devices. One was a malleable interface that can be shaped like clay, developed by a team at the MIT Media Lab. Sean Follmer, a PhD student in the laboratory of Professor Hiroshi Ishii, showed several versions, including one set on a transparent table: a flexible touch surface made of a plastic material containing glass beads and oil, with a projector and a 3D sensor positioned below it. Twisting and tweaking the flexible screen changed the colors appearing on it, and the material's shape was also shown as a 3D model on a nearby computer screen.

It is hard to imagine such an interface fitting in a pocket. However, Desney Tan, who heads Microsoft's Computational User Experiences group in Redmond, Washington (USA), and the company's Human-Computer Interaction group in Beijing, China, believes that choosing among multiple modes of interaction will be an important part of the future of computing. "Let's stop thinking about mobile 'devices' and focus on mobile 'computing'," said Tan, who was also named to Technology Review's list of 35 innovators under 35 in 2011. "As I see it, no single input or output mode will dominate the landscape in the same way the visual display, mouse, and keyboard have done so far."
