I have been working as a QA Engineer at Intelligaia Technologies Pvt. Ltd. for the last one and a half years. Here I got a chance to work on touchscreen applications. It was a great experience when I found a touch-enabled machine at my desk and was asked to work on it, and I want to share my experience of testing touchscreen applications.
A touchscreen is a display device that can detect the presence and the point of contact on its display area. The term defines itself: touching a touch-sensitive device with a finger, a hand, or another object such as a stylus. A touchscreen is an input device that receives input signals as the user interacts with the computer by touching the display screen; this way of letting the user interact directly with a computer is what Windows refers to as 'Touch'. Touch interactions may be single-touch or multi-touch. A single touch is a movement of one finger on the screen that the computer interprets as a command. Windows 7 adds new multi-touch features (input using two or more fingers) such as pan, zoom, rotate, multi-finger tap, and press and tap. One of the quickest and easiest movements to perform is a flick: a short, quick movement of one finger that can trigger navigation (forward/backward), scrolling (up/down), copy, paste, and other editing commands.
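The single-touch versus multi-touch distinction above can be sketched in code. This is a minimal toy model, not any real Windows 7 API: the function name and gesture labels are my own illustrative assumptions, and the only idea it captures is that the number of simultaneous contact points decides which family of gestures the system considers.

```python
# Toy sketch: classify a touch interaction by the number of contact
# points the hardware reports. Names are illustrative assumptions,
# not a real Windows Touch API.

def classify_touch(contact_points: int) -> str:
    """Return a coarse interaction category based on contact count."""
    if contact_points <= 0:
        return "none"
    if contact_points == 1:
        return "single-touch"  # e.g. tap, drag, flick
    return "multi-touch"       # e.g. pan, zoom, rotate, press & tap

print(classify_touch(1))  # single-touch
print(classify_touch(2))  # multi-touch
```

A tester can use the same mental model when designing cases: every gesture under test falls into one of these buckets, and each bucket exercises a different input path in the application.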
Some of these screens use infrared light beams that are projected across the screen surface. Whenever the user touches the screen, some of these beams are interrupted, and an electronic signal identifying the location on the screen is generated. These signals are then interpreted by software, and the required operation is performed. A number of different technologies are now in use to make touch-enabled screens.
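The infrared-grid idea above can be illustrated with a short sketch. This is purely a toy model under my own assumptions (beams run in rows and columns, a touch breaks at least one beam in each axis), not real driver code; the intersection of the broken beams gives the touch location.

```python
# Toy model of an infrared-grid touchscreen: beams cross the screen in
# rows and columns; a touch interrupts beams in each axis, and the
# intersection of the interrupted beams gives the touch location.
# Illustrative assumption only, not actual controller firmware.

def locate_touch(broken_columns, broken_rows):
    """Return the (x, y) intersection of interrupted beams, or None."""
    if not broken_columns or not broken_rows:
        return None  # no touch detected on one or both axes
    # A fingertip usually blocks several adjacent beams; take the centre.
    x = sum(broken_columns) / len(broken_columns)
    y = sum(broken_rows) / len(broken_rows)
    return (x, y)

print(locate_touch([4, 5], [10]))  # (4.5, 10.0)
print(locate_touch([], []))       # None
```

This also hints at a test angle: touches near the screen edges or with wide contact areas (a full fingertip rather than a stylus) stress exactly this averaging step, which is why edge and fat-finger cases matter in touchscreen testing.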