This application claims priority to New Zealand Provisional Patent Application No. 554,416, entitled "Touch Screen Hover and Click Input Method," filed with the New Zealand Patent Office on April 11, 2007.
Detailed Description of Embodiments
The present invention provides touch screen systems and methods for approximating at least four interaction states, including: (1) an out-of-bounds state; (2) a tracking state; (3) a select state; and (4) a drag state. The systems and methods of the present invention provide functionality for distinguishing between these interaction states regardless of the orientation of the user's finger, stylus, or other touch object, and without relying on direct sensing of touch pressure or area. Illustrative embodiments of the present invention are described below with reference to the accompanying drawings, in which like reference numerals denote like elements.
FIG. 1 is a diagram of an exemplary touch screen system 100. As used herein, the term "touch screen system" means a touch screen 110 together with the hardware and/or software elements that provide touch detection functionality. The exemplary touch screen system 100 is shown adjacent to a display device (i.e., a video monitor) 190. The display device 190 may be connected to a personal computer or other computing device (see FIG. 2) that executes software for detecting touches on or near the touch screen 110. The illustration in FIG. 1 of the touch screen system 100 adjacent to the display device 190 represents one exemplary application of the touch screen system 100. For example, the touch screen system 100 may be positioned and/or secured in front of the display device 190, so that a user can view the visual output of the display device 190 and interact with it through the touch screen 110.
The touch screen system 100 may therefore serve as an overlay or retrofit for an existing display device 190. It should be understood, however, that other applications of the exemplary touch screen system 100 are contemplated by the present invention. For example, the touch screen system 100 may be an integrated component of the display device 190, in which case it may also serve as the display screen of the display device 190. The exemplary touch screen system 100 may be used in combination with display devices 190 of all sizes and dimensions, including but not limited to the display screens of smaller handheld devices such as mobile phones, personal digital assistants (PDAs), pagers, and the like.
At least a portion of the touch screen 110 is typically transparent and/or translucent, so that images or other objects can be viewed through the touch screen 110 and so that light and/or other forms of energy can be transmitted within or through the touch screen 110 (e.g., by reflection or refraction). For example, the touch screen 110 may be constructed of a plastic or thermoplastic material (e.g., acrylic, Plexiglas, polycarbonate, or the like) and/or a glass-type material. In some embodiments, the touch screen may be polycarbonate or a glass material bonded to an acrylic material. The touch screen 110 may also be constructed of other materials, as will be apparent to those skilled in the art. The touch screen 110 may also be configured with a durable (e.g., scratch- and/or shatter-resistant) coating. The touch screen 110 may or may not include a frame or bezel, i.e., an outer casing or housing that surrounds the perimeter of the touch screen 110.
The touch screen system 100 includes an energy source 120 that is configured to emit energy, for example in the form of pulses, waves, beams, or the like (referred to collectively herein as "energy beams" for simplicity). The energy source 120 is typically positioned within or adjacent to (e.g., near) one or more edges of the touch screen 110. The energy source 120 may emit one or more types of energy. For example, the energy source 120 may emit infrared (IR) energy. Alternatively, the energy source 120 may emit visible light energy (e.g., at one or more frequencies or spectra).
The energy source 120 may include one or more separate emission sources (emitters, generators, or the like). For example, the energy source 120 may include one or more infrared light-emitting diodes (LEDs). As another example, the energy source 120 may include one or more microwave energy transmitters or one or more acoustic wave generators. The energy source 120 is positioned and configured so that it emits energy beams 140 across the surface of the touch screen 110, thereby creating an energized plane adjacent to the touch screen surface. For example, suitable reflective or refractive elements (such as reflective tape, paint, metal or plastic, mirrors, prisms, or the like) may be used to form and position the energized plane.
Energy beams 150 that are reflected across the front surface 111 of the touch screen 110 are detected by detectors 130, 131. The detectors 130, 131 may be configured to monitor and/or detect variations (changes, or the like) in the energy beams 150. Depending on the orientation of the energy source 120 and the detectors 130, 131, the energy beams 150 may have a "back-lighting" or "fore-lighting" effect on a finger, stylus, or other object that touches the touch screen 110. In the back-lighting case, a touch on or near the front surface of the touch screen 110 may interrupt the reflected energy beams 150, and the interruption appears to the detectors 130, 131 as a shadow or silhouette (i.e., an absence of energy), allowing the touch location to be detected. In the fore-lighting case, energy reflected by the finger, stylus, or other object appears to the detectors 130, 131 as an area of increased energy intensity.
In some embodiments, filtering may be applied by the detectors 130, 131 and/or by software to enhance the detection of changes in energy beam intensity. However, the intensity difference between the energy beams 150 and ambient noise may be sufficient to eliminate the need for filtering. As discussed below with reference to FIG. 2, the information signals generated by the detectors 130, 131 may be processed by a video processing unit (e.g., a digital signal processor) and/or a computing device.
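By way of illustration only, the following minimal sketch (written in Python; the function names, window size, and drop threshold are assumptions for illustration and are not part of this disclosure) shows one way such filtering and intensity-change detection could be implemented in software:

```python
# Illustrative sketch only: smooth a detector's per-frame beam-intensity
# samples with a moving average, then flag a significant drop of the kind a
# touch would cause by casting a shadow on the reflected energy beams.

def smooth(samples, window=5):
    """Simple moving-average filter over a list of intensity readings."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

def intensity_change_detected(baseline, current, drop_fraction=0.2):
    """True if the filtered intensity has fallen more than drop_fraction
    below the no-touch baseline (both values in arbitrary assumed units)."""
    return current < baseline * (1.0 - drop_fraction)

if __name__ == "__main__":
    baseline = 100.0                       # typical no-touch intensity (assumed)
    raw = [100, 99, 98, 40, 39, 38]        # a shadow appears mid-sequence
    filtered = smooth(raw)
    print([intensity_change_detected(baseline, f) for f in filtered])
```

As noted above, when the contrast between the energy beams 150 and ambient noise is large, the raw readings may be compared against the baseline directly and the smoothing step omitted.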
The detectors 130, 131 may be positioned within or adjacent to (e.g., near) the touch screen 110, so that they can monitor and/or detect the energy beams 150 in the energized plane adjacent to the touch screen surface. Depending on the position of the detectors 130, 131, reflectors and/or prisms may be used, as needed, to allow the detectors 130, 131 to detect the energy beams 150. In the example shown in FIG. 1, the detectors 130, 131 are positioned within or along the bottom edge of the touch screen 110, one in each corner. In preferred embodiments, at least two separate detectors are included, so that the location of a touch can be determined using triangulation techniques, as described below.
A detector 130, 131 may be any device capable of detecting (e.g., imaging, monitoring, or the like) variations in the energy beams 150 reflected across the front surface of the touch screen 110. For example, a suitable detector 130, 131 may be one of various types of cameras, such as an area-scan or line-scan (e.g., digital) camera. Such area-scan or line-scan cameras may be based on complementary metal-oxide-semiconductor (CMOS) or charge-coupled device (CCD) technology, both of which are known in the art. Furthermore, because the detectors 130, 131 need not capture detailed color images, monochrome (e.g., grayscale) cameras may be sufficient.
Although cameras are generally more expensive than other types of detector devices that may be used in a touch screen system 100 (for example, photodetectors such as photodiodes or phototransistors), cameras provide greater touch detection accuracy. As is known in the art, area-scan or line-scan cameras (particularly those with monochrome capability) are typically less expensive than cameras configured to capture detailed images and/or having color detection capability. Relatively cost-effective area-scan or line-scan cameras can therefore provide the touch screen system 100 with accurate touch detection capability. It should be appreciated, however, that other devices may be used to provide the function of the detectors 130, 131 in accordance with other embodiments of the present invention.
The touch screen system 100 of the present invention is thus configured to detect a touch (e.g., by a finger, stylus, or other object) based on detected variations in the energy beams 150 that form the energized plane adjacent to the touch screen surface. The energy beams 150 are monitored by the detectors 130, 131. The detectors 130, 131 may be configured to detect variations (e.g., a decrease or increase) in the intensity of the energy beams 150. As will be understood by those skilled in the art, the output capability required of the energy source 120 to permit adequate detection by the detectors may depend on various factors, such as the size of the touch screen 110, the expected losses within the touch screen system 100 (e.g., 1/distance² losses) and the expected losses caused by the surrounding medium (e.g., air), the speed or exposure characteristics of the detectors 130, 131, ambient light characteristics, and the like. As discussed with reference to the following figures, the detectors 130, 131 transmit data regarding the energy beams 150 (or variations therein) to a computing device (not shown) that executes software for processing the data and calculating the location of a touch relative to the touch screen 110.
FIG. 2 is a block diagram showing the exemplary touch screen system 100 connected to an exemplary computing device 201, in accordance with certain embodiments of the present invention. The computing device 201 may be functionally coupled to the touch screen system 100 by a wired or wireless connection. The exemplary computing device 201 may be any type of processor-driven device, such as a personal computer, laptop computer, handheld computer, personal digital assistant (PDA), digital and/or cellular telephone, pager, video game device, or the like. These and other types of processor-driven devices will be apparent to those skilled in the art. As used herein, the term "processor" can mean any type of programmable logic device, including a microprocessor or any other type of similar device.
The computing device 201 may include, for example, a processor 202, a system memory 204, and various system interface components 206. The processor 202, the system memory 204, a digital signal processing (DSP) unit 205, and the system interface components 206 may be functionally connected via a system bus 208. The system interface components 206 may enable the processor 202 to communicate with peripheral devices. For example, a storage device interface 210 can provide an interface between the processor 202 and a storage device 211 (e.g., removable and/or non-removable), such as a disk drive. A network interface 212 may also be provided as an interface between the processor 202 and a network communication device (not shown), so that the computing device 201 can be connected to a network.
A display screen interface 214 can provide an interface between the processor 202 and the display device 190 (shown in FIG. 1). The touch screen 110 of the touch screen system 100 may be positioned in front of a display device 190 that has its own display screen 192, or may otherwise be attached or mounted to the display device 190. Alternatively, the touch screen 110 may itself function as the display screen 192 of the display device 190. One or more input/output ("I/O") port interfaces 216 may be provided as interfaces between the processor 202 and various input and/or output devices. For example, the detectors 130, 131 of the touch screen system 100, or other suitable components, may be connected to the computing device 201 via an input port and may provide input signals to the processor 202 via an input port interface 216. Similarly, the energy source 120 of the touch screen system 100 may be connected to the computing device 201 via an output port and may receive output signals from the processor 202 via an output port interface 216.
A number of program modules may be stored in the system memory 204 and/or in any other computer-readable medium associated with the storage device 211 (e.g., a hard disk drive). The program modules may include an operating system 217. The program modules may also include an information display program module 219 comprising computer-executable instructions for displaying images or other information on the display screen 192. Other aspects of this illustrative embodiment of the invention may be embodied in a touch screen control program module 221 for controlling the energy source 120 and/or the detectors 130, 131 of the touch screen system 100 and/or for calculating touch locations relative to the touch screen 110 and identifying interaction states based on signals received from the detectors 130, 131.
Certain embodiments of the invention may include a DSP unit for performing some or all of the functions ascribed to the touch screen control program module 221. As is known in the art, a DSP unit 205 may be configured to perform many types of calculations, including filtering, data sampling, triangulation, and other calculations, and to control the modulation of the energy source 120. The DSP unit 205 may include a series of scanning imagers, digital filters, and comparators implemented in software. The DSP unit 205 may therefore be programmed to calculate touch locations relative to the touch screen 110 and to identify interaction states, as described herein.
The processor 202, which may be controlled by the operating system 217, can be configured to execute the computer-executable instructions of the various program modules. The methods of the present invention may be embodied in such computer-executable instructions. Furthermore, the images or other information displayed by the information display program module 219 may be stored in one or more information data files 223, which may be stored on any computer-readable medium associated with the computing device 201.
As discussed above, when a user touches on or near the touch screen 110, a variation will occur in the intensity of the energy beams 150 that are directed across the surface of the touch screen 110. The detectors 130, 131 are configured to detect the intensity of the energy beams 150 reflected across the surface of the touch screen 110, and should be sensitive enough to detect variations in that intensity. The information signals produced by the detectors 130, 131 and/or other components of the touch screen system 100 may be used by the computing device 201 to determine the location of the touch relative to the touch screen 110 (and therefore relative to the display screen 192) and to identify whether the touch represents a select state, a tracking state, or a drag state. The computing device 201 may also determine the appropriate response to a touch on or near the touch screen 110.
In accordance with some embodiments of the invention, data from the detectors 130, 131 may be processed periodically by the computing device 201 to monitor the typical intensity level of the energy beams 150 directed across the surface of the touch screen 110 when no touch is present. This allows the system to account for, and thereby reduce the effects of, changes in ambient light levels and other environmental conditions. The computing device 201 may optionally increase or decrease the intensity of the energy beams 150 emitted by the energy source 120, as needed. Subsequently, if the detectors 130, 131 detect a variation in the intensity of the energy beams 150, the computing device 201 can process this information to determine that a touch has occurred on or near the touch screen 110.
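A minimal sketch of this kind of baseline monitoring is shown below for illustration only; the class name, smoothing constant, and tolerance are assumptions, not values taken from this disclosure:

```python
# Illustrative sketch: maintain a running no-touch baseline so that slow
# ambient-light drift is not mistaken for a touch, and report a touch only
# when a reading departs from that baseline by more than a tolerance.

class BaselineTracker:
    def __init__(self, initial, alpha=0.01, tolerance=0.15):
        self.baseline = initial     # typical beam intensity with no touch present
        self.alpha = alpha          # how quickly the baseline follows slow drift
        self.tolerance = tolerance  # fractional change treated as a touch

    def update(self, reading):
        """Return True if 'reading' indicates a touch; otherwise fold the
        reading into the baseline to track changing ambient conditions."""
        if abs(reading - self.baseline) > self.tolerance * self.baseline:
            return True
        self.baseline = (1 - self.alpha) * self.baseline + self.alpha * reading
        return False

tracker = BaselineTracker(initial=100.0)
print([tracker.update(r) for r in (100, 101, 99, 60, 100)])
```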
The location of a touch relative to the touch screen 110 may be determined, for example, by processing the information received from each detector 130, 131 and performing one or more well-known triangulation calculations. By way of example, the computing device 201 may receive information from each detector 130, 131 that can be used to identify, relative to each detector 130, 131, the location of the region in which the energy beam intensity has increased or decreased. The location of that region relative to each detector 130, 131 may be determined in terms of the coordinates of one or more pixels, or virtual pixels, of the touch screen 110. The locations of the region of increased or decreased energy beam intensity relative to each detector may then be triangulated, based on the geometry between the detectors 130, 131, to determine the actual location of the touch relative to the touch screen 110. Calculations for determining the interaction state represented by a touch are explained with reference to the following figures. Any such calculations of touch location and/or interaction state may include algorithms to compensate for discrepancies (e.g., lens distortions, ambient conditions, damage to or impediments on the touch screen 110, and the like), as applicable.
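For illustration only, the following sketch shows the basic triangulation idea; the coordinate frame (Camera0 assumed at the origin, Camera1 assumed one unit away along y = 0) and the function name are assumptions introduced for this example:

```python
# Illustrative sketch: each detector reports the slope of the sight line from
# its own corner toward the centre of the region where beam intensity changed;
# the touch position is the intersection of those two sight lines.

def triangulate(m0, m1, camera_separation=1.0):
    """Camera0 assumed at (0, 0), Camera1 assumed at (camera_separation, 0).
    m0 and m1 are the slopes of the sight lines to the touch centre."""
    # Line 0: y = m0 * x ;  Line 1: y = m1 * (x - camera_separation)
    x = m1 * camera_separation / (m1 - m0)
    y = m0 * x
    return x, y

print(triangulate(m0=1.0, m1=-1.0))   # a touch midway between the two cameras
```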
FIG. 3, which includes FIG. 3A and FIG. 3B, illustrates a user's interaction with the exemplary touch screen 110. The user interaction in the illustrated embodiment is intended to represent the tracking state. A portion of the user's finger 302 (or another object) enters the energized plane adjacent to the touch screen surface (formed by the energy beams 150), either "hovering" near the touch screen surface without contacting it or contacting the touch screen with relatively little pressure. The two detectors 130, 131 (referred to for convenience as Camera0 and Camera1) generate information signals indicating a variation in the intensity of the energized plane and thus indicating the presence of a touch.

The image data captured by the detectors 130, 131 may be processed and interpreted to approximate the interaction state indicated by the touch. For example, the output of Camera0 may be processed in a known manner to determine the slopes (m0a and m0b) of the lines extending from a first reference point (e.g., a corner 303 of the touch screen 110) to a first pair of outer edges 304, 306 of the portion of the user's finger 302 within the field of view of the detector 130. Similarly, the output of Camera1 may be processed to determine the slopes (m1a and m1b) of the lines extending from a second reference point (e.g., a corner 305 of the touch screen 110) to a second pair of outer edges 308, 310 of the portion of the user's finger 302 within the field of view of the detector 131. The choice of reference points (e.g., corners 303 and 305) of course depends on the geometry of the detectors 130, 131 relative to the touch screen 110. The intersections of the four calculated slope lines (m0a, m0b, m1a, and m1b) can then be used to approximate the surface area (S) of the portion of the user's finger 302 within the field of view of the detectors 130, 131. The surface area (S) of the portion of the user's finger 302 within the field of view of the detectors 130, 131 is referred to herein as the "touch area," although it should be appreciated that, as noted above, actual contact between the finger 302 (or other object) and the touch screen 110 is not necessarily required.
In contrast to the tracking state embodiment shown in FIG. 3, the user interaction shown in FIG. 4A and FIG. 4B is intended to represent a select or "click" state. A portion of the user's finger 302 (or another object) enters (or remains within) the energized plane adjacent to the touch screen surface and contacts the touch screen surface with greater pressure than in the embodiment of FIG. 3. The two detectors 130, 131 again generate information signals indicating a variation in the intensity of the energized plane and thus indicating the presence of a touch. In the embodiment of FIG. 4, the user's finger 302 may enter the energized plane from an out-of-bounds position. Alternatively, the position of the user's finger within the energized plane may change such that, from a previous hover (non-contact) position, it comes into contact with the touch screen surface or increases the pressure applied to the touch screen surface.
Again, the output of Camera0 may be processed in a known manner to determine the slopes (m′0a and m′0b) of the lines extending from the first reference point (e.g., the corner 303 of the touch screen 110) to a first pair of outer edges 304′, 306′ of the portion of the user's finger 302 within the field of view of the detector 130. Similarly, the output of Camera1 may be processed to determine the slopes (m′1a and m′1b) of the lines extending from the second reference point (e.g., the corner 305 of the touch screen 110) to a second pair of outer edges 308′, 310′ of the portion of the user's finger 302 within the field of view of the detector 131. The intersections of the four calculated slope lines (m′0a, m′0b, m′1a, and m′1b) can then be used to approximate the touch area (S′).
For comparison, FIG. 4A shows in solid lines the slope lines (m′0a, m′0b, m′1a, and m′1b) and touch area (S′) representing the select state, and shows in broken lines the slope lines (m0a, m0b, m1a, and m1b) and touch area (S) representing the tracking state (FIG. 3). As shown, the touch area (S′) representing the select state is larger than the touch area (S) representing the tracking state. This is because the user's finger 302 is soft: when the user contacts the touch screen (or increases the pressure on the touch screen) to make a selection, the user's finger 302 deforms at the point of contact (or deforms further as the contact pressure increases) so as to cover a larger area of the touch screen surface.
The computing device 201 may be used to calibrate the touch screen system 100 so as to designate a threshold touch area representing the tracking state. Following calibration, the computing device 201 may be programmed to designate as a "select" any detected touch for which the calculated touch area exceeds the threshold touch area. As will be appreciated by those skilled in the art, an exemplary calibration method involves prompting the user to perform a tracking operation relative to the touch screen 110, calculating the touch area while the user performs the tracking operation, and then storing the calculated touch area, plus an optional error or "hysteresis" value, as the threshold touch area.
In some embodiments, the calibration method may be executed automatically while the user's finger 302 or stylus is stationary. Such a calibration method assumes that, for some period of time before the additional pressure representing a "select" operation is applied, the user's finger or stylus remains in a steady "tracking" mode. Other methods for calibrating the exemplary touch screen system 100 will be apparent to those skilled in the art and are therefore considered to be within the scope of the present invention.
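A minimal sketch of such a calibration step follows, for illustration only; the function name and the hysteresis value are assumptions:

```python
# Illustrative sketch: while the finger or stylus is held still in a tracking
# pose, average the computed touch areas and store that average plus a
# hysteresis margin as the threshold separating "tracking" from "select".

def calibrate_threshold(tracked_areas, hysteresis=0.15):
    """tracked_areas: touch areas sampled while the user merely tracks/hovers.
    Returns the threshold area above which a touch is treated as a select."""
    baseline = sum(tracked_areas) / len(tracked_areas)
    return baseline * (1.0 + hysteresis)

samples = [0.0011, 0.0012, 0.0010, 0.0013]   # areas from a stationary tracking pose
threshold_area = calibrate_threshold(samples)
print(round(threshold_area, 5))
```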
In certain embodiments, the following exemplary triangulation calculations may be used to approximate the touch area. These formulas are best understood with reference to FIG. 5. It should be noted, however, that FIG. 5 is provided as an exemplary reference only.
First, let:

the cameras lie on y = 0, spaced 1 unit of distance apart,
m0a = the slope of the first edge observed by Camera0,
m0b = the slope of the second edge observed by Camera0,
m0c = the average of m0a and m0b,
m1a = the slope of the first edge observed by Camera1,
m1b = the slope of the second edge observed by Camera1,
m1c = the average of m1a and m1b,
(x0a, y0a) = the intersection of m0a and m1c,
(x0b, y0b) = the intersection of m0b and m1c,
(x0c, y0c) = the intersection of m0c and m1c, i.e., the center of the touch,
(x1a, y1a) = the intersection of m1a and m0c,
(x1b, y1b) = the intersection of m1b and m0c,
(x1c, y1c) = the intersection of m1c and m0c, identical to (x0c, y0c),
r0 = the distance from Camera0 to the touch center,
r1 = the distance from Camera1 to the touch center,
w0 = the width, or distance, from point (x0a, y0a) to point (x0b, y0b), and
w1 = the width, or distance, from point (x1a, y1a) to point (x1b, y1b).
Then, the width (w0) of the touch area observed by Camera0 is calculated using the following formulas:

x0a = m1c / (m0a − m1c)
y0a = m0a · x0a
x0b = m1c / (m0b − m1c)
y0b = m0b · x0b
x0c = m1c / (m0c − m1c)
y0c = m0c · x0c
r0 = sqrt(x0c² + y0c²)
Similar formulas may be used to calculate the width (w1) of the touch area observed by Camera1. Once the widths have been determined, the touch area (S) may be calculated using the following formula:

S = w0 · w1,

where w0 is the width of the touch area detected by Camera0 and w1 is the width of the touch area detected by Camera1.
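For illustration only, the following sketch restates the exemplary calculations above in executable form. It follows the stated conventions (cameras on y = 0, one unit apart, with the intersection formulas used exactly as given); because the description leaves the final width step implicit, the widths w0 and w1 are computed directly from their definitions as the distances between the paired intersection points. Treat this as a sketch under those assumptions, not a definitive implementation:

```python
# Illustrative sketch of the exemplary touch-area triangulation above.
from math import hypot

def touch_area(m0a, m0b, m1a, m1b):
    """m0a, m0b: slopes of the two edge sight lines from Camera0.
    m1a, m1b: slopes of the two edge sight lines from Camera1.
    Returns the approximated touch area S = w0 * w1."""
    m0c = (m0a + m0b) / 2.0            # centre sight line from Camera0
    m1c = (m1a + m1b) / 2.0            # centre sight line from Camera1

    # Intersections of Camera0's edge lines with Camera1's centre line,
    # using the form x = m1c / (m - m1c), y = m * x from the description.
    x0a = m1c / (m0a - m1c)
    y0a = m0a * x0a
    x0b = m1c / (m0b - m1c)
    y0b = m0b * x0b

    # Intersections of Camera1's edge lines with Camera0's centre line
    # (the symmetric counterparts of the formulas above).
    x1a = m1a / (m0c - m1a)
    y1a = m0c * x1a
    x1b = m1b / (m0c - m1b)
    y1b = m0c * x1b

    w0 = hypot(x0a - x0b, y0a - y0b)   # w0: distance from (x0a, y0a) to (x0b, y0b)
    w1 = hypot(x1a - x1b, y1a - y1b)   # w1: distance from (x1a, y1a) to (x1b, y1b)
    return w0 * w1                     # S = w0 * w1

print(touch_area(-1.05, -0.95, 0.95, 1.05))
```

A wider apparent finger or stylus (edge slopes spread further apart at either camera) produces larger w0 and w1 and hence a larger S, which is what allows the tracking and select states to be distinguished by comparing S against the calibrated threshold touch area.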
FIG. 6, which includes FIG. 6A and FIG. 6B, illustrates a simple stylus 602 that has been modified to allow multiple touch areas depending on the applied pressure. The stylus 602 includes a spring-loaded plunger 604 that is designed to retract into the tip 606 of the stylus 602 when sufficient pressure is applied to the spring 608. Thus, when the stylus 602 hovers near the touch screen 110, or contacts the touch screen 110 with insufficient pressure to compress the spring 608, the plunger 604 remains protruding from the tip 606. The detectors 130, 131 will detect the presence of the plunger 604, and the computing device 201 will calculate a touch area (S) based on the detected size of the plunger. Conversely, when the stylus 602 contacts the touch screen 110 with sufficient pressure to compress the spring 608, the plunger 604 retracts into the tip 606 and the tip 606 contacts the touch screen 110. The computing device 201 will then calculate an enlarged touch area (S′) based on the detected size of the stylus tip 606.
The stylus 602 of FIG. 6 is designed to operate in a manner similar to the finger 302, in that it produces an enlarged touch area when pressure is applied. Other stylus designs can accomplish a similar function. For example, a stylus having a rubber tip that broadens (in area) when pressure is applied to it can provide similar functionality. Accordingly, any stylus or other object capable of indicating both a smaller area and a larger area may be used in accordance with embodiments of the present invention.
FIG. 7 is a flow chart illustrating an exemplary method 700 for distinguishing between a tracking state, a select state, and an out-of-bounds state. The method 700 begins at starting block 701 and proceeds to step 702, where a determination is made as to whether a finger or stylus has been detected in the energized plane adjacent to the touch screen. If no finger or stylus is detected, the method proceeds to step 704, where the out-of-bounds state is indicated. Following step 704, the method returns to step 702 for further processing. Once a finger or stylus is detected at step 702, the method proceeds to step 706, where the image captured by the first detector is processed to determine the approximate coordinates of a first pair of outer edges of the finger or stylus. For example, these coordinates may be determined using slope line calculations. Next, at step 708, the image captured by the second detector is processed to determine the approximate coordinates of a second pair of outer edges of the finger or stylus. At step 710, the two pairs of outer edge coordinates of the finger or stylus are used to calculate an approximated touch area.
After the approximated touch area has been calculated at step 710, the method proceeds to step 712 for a determination of whether the approximated touch area is greater than the threshold touch area. The threshold touch area may be determined by calibration of the touch screen system 100, or may be specified by a system operator or administrator. If the approximated touch area is greater than the threshold touch area, the select state is indicated at step 712. If the approximated touch area is not greater than the threshold touch area, the tracking state is indicated at step 714. From step 712 or step 714, the method returns to step 702 for further processing.
As will be apparent to those skilled in the art, touch location calculations may be performed in sequence with, or in parallel with, the calculations for approximating the interaction state. Thus, if successive iterations of the exemplary method 700 indicate a continuing select state and movement of the finger or stylus is detected, that continuing select state may be interpreted as a drag state. An indication of a continuing tracking state combined with movement of the finger or stylus may be interpreted, for example, as an indication that the cursor should follow the finger or stylus.
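For illustration only, the following sketch paraphrases the per-iteration decision logic of method 700, including the drag interpretation described above; the function and state names are assumptions introduced for this example:

```python
# Illustrative sketch: classify each processed frame as out-of-bounds,
# tracking, select, or drag, based on whether a touch object is detected,
# how its approximated area compares to the threshold, and whether movement
# was detected while a select state was already in effect.

def classify_frame(detected, touch_area, moving, threshold_area, was_selecting):
    if not detected:
        return "OUT_OF_BOUNDS"
    if touch_area > threshold_area:
        return "DRAG" if (moving and was_selecting) else "SELECT"
    return "TRACKING"

# A hover that presses down and then drags:
states = []
was_selecting = False
observations = [(True, 0.8, False), (True, 1.6, False),
                (True, 1.6, True), (False, 0.0, False)]
for detected, area, moving in observations:
    s = classify_frame(detected, area, moving,
                       threshold_area=1.0, was_selecting=was_selecting)
    was_selecting = s in ("SELECT", "DRAG")
    states.append(s)
print(states)   # ['TRACKING', 'SELECT', 'DRAG', 'OUT_OF_BOUNDS']
```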
FIG. 8 is a state diagram showing the operational sequence of certain embodiments of the present invention. When a user's finger or stylus is detected in the energized plane adjacent to the touch screen 110 and the calculated touch area is determined to be less than or equal to the threshold touch area, the tracking state 802 is indicated. If the finger or stylus is not moving (i.e., the detected velocity is approximately zero), a steady state 804 is indicated. During the steady state 804, the threshold touch area may optionally be calibrated, for example as a background process. From the steady state 804, if the finger or stylus begins to move (i.e., the detected velocity is greater than zero) and the calculated touch area remains less than or equal to the threshold touch area, the tracking state 802 is again indicated.
From the steady state 804, if the calculated touch area is determined to be greater than the threshold touch area, the select state 806 is indicated. If the finger or stylus begins to move while the select state 806 is indicated, the drag state 808 is indicated. When the select state 806 or the drag state 808 is indicated, if the calculated touch area is determined to be less than or equal to the threshold touch area (i.e., the finger or stylus has been lifted at least partially away from the touch screen 110), a stop select state 810 is indicated. From the stop select state 810, if the finger or stylus remains in the energized plane, the steady state 804 is again indicated. From the tracking state 802, the steady state 804, or the stop select state 810, if the finger or stylus is lifted completely away from the touch screen 110, the out-of-bounds state 812 is indicated.
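The transitions described above for FIG. 8 can be paraphrased in executable form as follows; this sketch is an illustration of that paraphrase only, transitions not mentioned in the description are handled as assumptions, and, as noted below, alternative state machines are possible:

```python
# Illustrative sketch of the FIG. 8 state transitions as described above.
TRACKING, STEADY, SELECT, DRAG, STOP_SELECT, OUT_OF_BOUNDS = (
    "tracking", "steady", "select", "drag", "stop_select", "out_of_bounds")

def next_state(state, in_plane, area_over_threshold, moving):
    """One step of the state machine; unmentioned transitions are assumptions."""
    if not in_plane:
        return OUT_OF_BOUNDS              # lifted completely away from the screen
    if state == OUT_OF_BOUNDS:
        return TRACKING if not area_over_threshold else state
    if state == TRACKING:
        return TRACKING if moving else STEADY
    if state == STEADY:
        if area_over_threshold:
            return SELECT                 # pressed down from a steady hover
        return TRACKING if moving else STEADY
    if state == SELECT:
        if not area_over_threshold:
            return STOP_SELECT            # lifted at least partially away
        return DRAG if moving else SELECT
    if state == DRAG:
        return DRAG if area_over_threshold else STOP_SELECT
    if state == STOP_SELECT:
        return STEADY                     # still within the energized plane
    return state

# Hover, settle, press, drag, then lift away:
s = OUT_OF_BOUNDS
for obs in [(True, False, False), (True, False, False),
            (True, True, False), (True, True, True), (False, False, False)]:
    s = next_state(s, *obs)
    print(s)    # tracking, steady, select, drag, out_of_bounds
```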
Those skilled in the art will recognize that the state diagram of FIG. 8 is exemplary only, and that additional and/or alternative states and state transitions are possible. For example, other embodiments of the invention may be configured to transition directly from the tracking state 802 to the select state 806 or the drag state 808. Similarly, in some embodiments, the invention may be configured to transition directly from the select state 806 or the drag state 808 to the tracking state 802 or the out-of-bounds state 812. Accordingly, the scope of the present invention is not intended to be limited by the exemplary state diagram of FIG. 8, nor by the exemplary flow diagram of FIG. 7.
Those skilled in the art will also appreciate that certain functionality of the exemplary embodiments of the invention may be provided by way of any type and number of program modules, created in any programming language, which may or may not be stored locally at the computing device 201. For example, the computing device 201 may comprise a network server, client, or appliance configured to execute program modules that are stored on another network device and/or configured to control a remotely located touch screen system.
Based on the foregoing, it can be seen that the present invention provides an improved touch screen system that can approximate tracking and drag states regardless of the orientation of the touch, and without reliance on direct sensing of touch pressure or area. Many other modifications, features, and embodiments of the present invention will become evident to those of skill in the art. For example, those skilled in the art will recognize that embodiments of the present invention are useful and applicable to a variety of touch screens, including but not limited to optical touch screens, IR touch screens, and capacitive touch screens. It should therefore be appreciated that many aspects of the present invention as described above are provided by way of example only and are not intended to be required or essential elements of the invention unless explicitly stated otherwise.
It should therefore be understood that the foregoing relates only to certain embodiments of the present invention, and that numerous changes may be made therein without departing from the spirit and scope of the invention as defined by the claims. It should also be understood that the invention is not restricted to the illustrated embodiments, and that various modifications can be made within the scope of the invention.