

MULTI TOUCH TABLE

Mohd Saad Ather 1604-09-737-037 Ahmed Bin Zubair 1604-09-737-043 Syed Rahmat Kaleem 1604-09-737-044

DEPARTMENT OF INFORMATION TECHNOLOGY MUFFAKHAM JAH COLLEGE OF ENGINEERING AND TECHNOLOGY (AFFILIATED TO OSMANIA UNIVERSITY) HYDERABAD

YEAR: 2012

Mini Project report submitted in partial fulfillment of the requirement for the award of the Degree of B.E By

Syed Rahmat Kaleem 1604-09-737-044



CERTIFICATE

This is to certify that the project report entitled Multi Touch Table being submitted by Mr Syed Rahmat Kaleem in partial fulfillment for the award of the Degree of Bachelor of Engineering in IT to the Osmania University is a record of bonafide work carried out by him under my guidance and supervision.


(Head of the Department)

Guide Name Designation

ACKNOWLEDGEMENT

I would like to extend my gratitude to our mini-project guide, Mr. M. A. Rasheed, for his invaluable support and for the useful insight he provided in accomplishing the arduous task of carrying out this project. I would also like to thank the Department of Information Technology and the Head of the Department, Ms. Arshia Azam, for encouraging us and providing us with the LCD projector.


ABSTRACT

Table of Contents
1. Introduction
2. A. SOFTWARE REQUIREMENTS
3. Overall Description
4. System Features
5. External Interface Requirements
6. Other Nonfunctional Requirements


1. Introduction

1.1 Purpose

The purpose of this document is to present a detailed description of touchlib, a library for creating multi-touch interaction surfaces. It explains the purpose and features of the system, its interfaces, what the system will do, the constraints under which it must operate, and how the system reacts to external stimuli.

When interacting with a regular desktop computer, indirect devices such as a mouse or keyboard are used to control the computer, and the results of the interaction are displayed on a monitor. Current operating systems are restricted to one pointing device. With the introduction of multi-touch, a new form of human-computer interaction is introduced. Multi-touch combines display technology with sensors capable of tracking multiple points of input. The idea is that this allows users to interact with the computer in a natural way.

Through a set of experiments we evaluate how multi-touch input performs on tasks compared to conventional mouse input. Unlike interaction on a desktop computer, multi-touch allows multiple users to interact with the same device at the same time. We present measurements that show how collaboration on a multi-touch table can improve performance for specific tasks.

1.2 Intended Audience and Reading Suggestions

Applications such as the real-time fluid dynamics simulation demonstrated that scientific applications can benefit from multi-user input. NASA WorldWind is a demonstration of how existing applications can benefit from multi-touch input using gesture-based interaction.

1. To encourage the audience to participate in experiments, the system should be attractive.
2. The system needs to be suitable for an audience aged 7 up to 70 years.
3. The system should encourage users to play together.
4. The system needs to be a standalone device.
5. The hardware needs to be 'child proof', which means it is robust and easy to use.

1.3 Project Scope

Touchlib is a library for creating multi-touch interaction surfaces. It handles tracking blobs of infrared light for you and sends your programs multi-touch events, such as 'finger down', 'finger moved', and 'finger released'. You can build applications in C++ and take advantage of touchlib's simple programming interface. Touchlib does not provide any graphical or front-end abilities; it simply passes you touch events. The software facilitates communication between the user and the machine through infrared signal transmission techniques.

2. A. SOFTWARE REQUIREMENTS

Microsoft Visual Studio 2005
- MS Visual Studio 2005
- MS Visual Studio 2005 SP1
- MS Visual Studio 2005 SP1 Update for Windows Vista (only needed when using Vista)

TortoiseSVN
- TortoiseSVN latest binaries

Miscellaneous Libraries
- OpenCV (download OpenCV_1.0.exe)
- DSVideoLib (download dsvideolib-0.0.8c)
- VideoWrapper (download VideoWrapper_0.2.5.zip)
- GLUT (download glut-3.7.6-bin.zip)
- OSCpack (download oscpack_1_0_2.zip)
- CMU 1394 Digital Camera Driver (download 1394camera645.exe)
- Windows SDK, which has replaced the Platform SDK (download the Web Install or DVD ISO image)
- DirectX SDK (download March 2009 or newer)

B. HARDWARE REQUIREMENTS

2.1 User Interfaces

Multi-touch technology allows users to interact with a computer in a new way. When using a multi-touch device, the available interaction methods depend on the application. Current operating systems do not support multi-touch natively, so multi-touch callbacks must be implemented manually in the application code. The touch pad becomes an interactive surface through which the user gives input to the touchlib software; the input consists of hand movements on the touch pad, and the output is produced by the application in use.

2.2 Hardware Interfaces

The software requires a camera with a wide-angle lens to ensure the entire projected image is covered. Set up your computer so that the main monitor is the video projector, so that the app comes up on that screen. Then:

1. Run the configuration app. Press 'b' at any time to recapture the background.
2. Tweak the sliders until you get the desired results. The last step (rectify) should show only light coming from your fingertips (no background noise, etc.).
3. When you are satisfied, press 'enter'. This launches the app in full-screen mode, and you'll see a grid of points (green pluses).
4. Press 'c' to start calibrating. The current point turns red. Press on your FTIR screen where the point is; a press should be detected (you can check by looking in the debug window). Note that the screen may not indicate where you are pressing.
5. Press 'space' to calibrate the next point, and continue until all points are calibrated.
6. When you are done, press 'ESC' to quit. All your changes (slider tweaks and calibration points) are saved to the config.xml file.

Now when you run any touchlib app it will be calibrated. Note that any change to where the projector is pointing, or to your webcam, will require a re-calibration.

References:
http://nuigroup.com/touchlib/
http://nuigroup.com/touchlib/downloads/
http://www.whitenoiseaudio.com/touchlib/
http://nuigroup.com/forums/
http://nuicode.com/projects/wiki-book/files

3. Overall Description

3.1 Product Perspective

3.2 Product Features

Our multi-touch system uses Touchlib [38], a free, open-source, cross-platform multi-touch framework that provides video processing and blob tracking for multi-touch devices based on FTIR and DI. Video processing in Touchlib is done through Intel's OpenCV graphics library [16]. Touchlib currently runs on MS Windows, Linux, and Mac OS X. In Touchlib, the blob tracker handles blob detection and tracking. Blob detection detects touches in a camera image; to follow the movement of a touch, the blob tracker compares the detected touch locations in a frame with the positions from the previous frame.

3.3 Operating Environment

Touchlib is written in C++ and has a Visual Studio 2005 Solution ready to compile. It currently depends on OpenCV, DirectShow (you'll need the Microsoft Platform SDK), VideoWrapper and the DSVideoLib. The source code includes the main library which can be linked to any application to start capturing touch events. It has support for most major camera/webcam types.

3.4 Design and Implementation Constraints

Camera-based multi-touch devices share the same concept of processing captured images and filtering them for patterns. In general, the interaction can be described as the pipeline in the figure. The pipeline begins when the user views the scene on the panel and decides to interact. To interact with the device, the user touches the device panel; at the hardware level, the camera registers the touch.

Because the captured frame might include not only the contact points but also a (static) background, image processing must be performed on each captured frame. The captured frame is converted to a grayscale image, and the static background is removed by subtracting a reference frame from the current frame. As a result, the frame shows only white contours (blobs), which are the points of contact. By tracking these blobs the system becomes capable of sensing touch. To track the blobs, their positions are transmitted to the blob tracker, which matches blobs from earlier frames with the current frame. After processing, events are triggered which can be used in a multi-touch capable application. The application modifies objects in the current scene according to the new blob positions, and the result is returned to the user through the display.

The performance of a camera-based multi-touch device depends on the hardware and software used. When a user touches a multi-touch device, he or she expects the device to respond directly. The responsiveness of the device depends on the time it needs to process the user's input and present a result through the display. In the interaction pipeline two levels are important: the hardware level and the software level. A camera capable of 30 frames per second allows smooth interaction with the system; however, this requires a system that can handle image processing and blob tracking in 1/30th of a second. A combination of smart algorithms implemented in software and fast hardware helps to minimize the latency and increase the responsiveness of the device.

3.5 User Documentation

The following documentation and online help are offered with the product:
1. http://nuigroup.com/forums - online forums for troubleshooting and development information
2. http://www.whitenoiseaudio.com/touchlib/ - step-by-step information for handling touchlib

4. System Features

This template illustrates organizing the functional requirements for the product by system features, the major services provided by the product. You may prefer to organize this section by use case, mode of operation, user class, object class, functional hierarchy, or combinations of these, whatever makes the most logical sense for the product.

5. External Interface Requirements

6. Other Nonfunctional Requirements

6.1 Safety Requirements
- The IR signal should be spread evenly.
- The texture of the surface should be uniform.
- The interface should be evenly illuminated.
- The less shiny the surface, the better the result. White surfaces are preferred as they give the best projection.
- The images captured by the camera should be of good quality.
- All the connections should be carefully done.
