CN101615100B - Computer and notebook computer - Google Patents
- Publication number
- CN101615100B
- Authority
- CN
- China
- Prior art keywords
- touch point
- touch
- module
- shortcut icon
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The invention provides a computer and a notebook computer. The computer comprises an icon acquisition module, a display module, a touch input surface, a first sensing module, a processing module, and an execution module. The icon acquisition module acquires shortcut icons. The display module displays the shortcut icons in corresponding display subregions; the display area formed by the set of display subregions is located at a first physical position. The touch input surface is located at a second position and is formed by a plurality of touch points; each display subregion corresponds to at least one of the touch points, and the first and second positions do not overlap. The first sensing module acquires position information of a first touch point contacted when an input device touches the touch input surface. The processing module obtains, from the position information of the first touch point and the correspondence between the first touch point and the display subregions, a first shortcut icon in the first display subregion corresponding to the first touch point. The execution module executes a first computer processing object corresponding to the first shortcut icon. The computer thus realizes human-computer interaction in combination with touch control.
Description
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a computer and a notebook computer.
Background technology
A computer holds a vast amount of content and functionality, so a user needs sufficiently effective means of interacting with it: an effective window for exploring, accessing, and using that content, so that finding the desired content and functions does not leave the user helpless before the mass of content, cost excessive effort, or require endless switching between different spaces.
To address this problem, operating systems offer many ways to explore and access the content of a computer, mainly the following: desktop shortcuts, the start menu, the file explorer, and the quick launch bar.
In the course of making the present invention, the inventor found that prior-art desktop-shortcut interaction has at least the following shortcoming:
with existing interaction modes, exploring and accessing content in the computer is done mainly with the mouse and keyboard. Touch control, with its many advantages, will become the mainstream direction for future computing-system input devices; it is therefore necessary to propose a new human-computer interaction mode that incorporates touch control.
Summary of the invention
The object of the present invention is to provide a computer and a notebook computer that realize human-computer interaction in combination with touch control.
To achieve this object, an embodiment of the invention provides a computer, comprising:
an icon acquisition module, configured to acquire shortcut icons corresponding to computer processing objects;
a display module, configured to display the shortcut icons in corresponding display subregions, wherein the display area formed by the set of display subregions is located at a first physical position;
a touch input surface, located at a second physical position and formed by a plurality of touch points, wherein each display subregion corresponds to at least one of the plurality of touch points of the touch input surface, and the first physical position and the second physical position do not geographically overlap and are distinct;
a first sensing module, configured to obtain position information of a first touch point contacted when an input device touches the touch input surface;
a processing module, configured to obtain, according to the position information of the first touch point and the correspondence between the first touch point and the display subregions, a first shortcut icon shown in the first display subregion corresponding to the first touch point;
an execution module, configured to execute a first computer processing object corresponding to the first shortcut icon.
Preferably, the above computer further comprises:
a second sensing module, configured to obtain contact information between the input device and the first touch point when the input device touches the touch input surface;
a judgment module, configured to judge, according to the contact information, whether an execution condition is satisfied and to obtain a judgment result;
a trigger module, configured to trigger the execution module to perform the operation when the judgment result indicates that the execution condition is satisfied.
Preferably, in the above computer, the display subregions form a display area, and the computer further comprises:
a display control module, configured to show the display area fully when a display condition is satisfied, and otherwise to hide the display area.
Preferably, the above computer further comprises:
light emitting diodes arranged in correspondence with the touch points;
a control module, configured to light, according to the position information of the first touch point, a first light emitting diode corresponding to the first touch point.
Preferably, the above computer further comprises:
a magnification module, configured to magnify the first shortcut icon.
To achieve the above object, an embodiment of the invention also provides a notebook computer, comprising:
a display;
a host connected to the display, the host comprising:
a housing;
a mainboard arranged in the housing;
a central processing unit arranged on the mainboard;
a chipset arranged on the mainboard and connected to the central processing unit;
a video card arranged on the mainboard, connected to the chipset, and connected to the display through an image data transmission interface on the mainboard;
wherein the notebook computer further comprises:
a touch input surface, formed by touch points and arranged on the upper surface of the housing;
a first sensing module, located below the touch input surface, arranged on the mainboard, connected to the chipset, and configured to obtain position information of a first touch point contacted when an input device touches the touch input surface;
wherein the central processing unit is configured to acquire shortcut icons corresponding to computer processing objects; to display, through the video card, the shortcut icons in a plurality of corresponding display subregions within a partial display area of the display screen, each display subregion corresponding to touch points of the touch input surface; to obtain, according to the position information of the first touch point and the correspondence between the first touch point and the display subregions, a first shortcut icon shown in the first display subregion corresponding to the first touch point; and, when an execution condition is satisfied, to execute a first computer processing object corresponding to the first shortcut icon.
Preferably, the above notebook computer further comprises:
a second sensing module, arranged on the mainboard, connected to the chipset, and configured to obtain contact information between the input device and the first touch point when the input device touches the touch input surface;
wherein the central processing unit judges, according to the contact information, whether the execution condition is satisfied.
Preferably, the display subregions form a display area, and the central processing unit is further configured to show the display area fully when a display condition is satisfied, and otherwise to hide the display area.
Preferably, the above notebook computer further comprises:
light emitting diodes arranged in correspondence with the touch points;
a control module, connected to the first sensing module and configured to light, according to the position information of the first touch point, a first light emitting diode corresponding to the first touch point.
Preferably, in the above notebook computer, the central processing unit is further configured to magnify the first shortcut icon.
Embodiments of the invention have the following beneficial effect:
by obtaining the position information of a touch point and using the correspondence between touch points and display subregions, the embodiment accesses the shortcut icon in the display subregion corresponding to the touch point and executes the computer processing object corresponding to that icon, realizing quick and convenient desktop-shortcut human-computer interaction.
Description of drawings
Fig. 1 is a structural schematic diagram of the computer of an embodiment of the invention;
Fig. 2 is a schematic diagram of the correspondence between touch points and display subregions in an embodiment of the invention;
Fig. 3 is a schematic flowchart of the computer processing method of an embodiment of the invention.
Embodiment
The computer and notebook computer of the embodiments of the invention combine a quick-launch mechanism with touch control, providing a new human-computer interaction mode adapted to the coming change in input methods.
As shown in Fig. 1, the computer of an embodiment of the invention comprises:
an icon acquisition module, configured to acquire shortcut icons corresponding to computer processing objects (such as the "My Computer" icon, the "My Documents" icon, the "Network Neighborhood" icon, browser icons, Word document icons, and so on), where the computer processing objects include but are not limited to the following types: applications, files, folders, etc.;
a display module, configured to display the acquired shortcut icons in corresponding display subregions, where the set of display subregions forms a display area located at a first physical position;
a touch input surface, located at a second physical position and formed by a plurality of touch points, where each display subregion corresponds to at least one of the plurality of touch points of the touch input surface, and the first physical position and the second physical position do not geographically overlap and are distinct;
a first sensing module, configured to obtain position information of a first touch point contacted when an input device (such as a user's finger or a stylus) touches the touch input surface;
a processing module, configured to obtain, according to the position information of the first touch point and the correspondence between the first touch point and the display subregions, a first shortcut icon shown in the first display subregion corresponding to the first touch point;
an execution module, configured to execute a first computer processing object corresponding to the first shortcut icon.
The computer of this embodiment further comprises:
a second sensing module, configured to obtain contact information between the input device and the first touch point when the input device touches the touch input surface;
a judgment module, configured to judge, according to the contact information, whether an execution condition is satisfied and to obtain a judgment result;
a trigger module, configured to trigger the execution module to perform the operation when the judgment result indicates that the execution condition is satisfied.
Each module of the computer of the embodiment is described in detail below.
The icon acquisition module obtains the shortcut icons of the computer processing objects; its manner of acquisition includes but is not limited to the following:
obtaining, according to a user's instruction, the shortcut icons of the computer processing objects designated by the user;
directly obtaining the shortcut icons of the computer processing objects on the computer desktop; in this mode, the computer of the embodiment further comprises:
a release module, configured to release the shortcut icons shown by the display module onto the desktop.
In a specific embodiment of the invention, the touch input surface is formed by touch points, and the touch points correspond to display subregions (each display subregion corresponds to at least one of the plurality of touch points of the touch input surface). This correspondence is described in detail below.
Fig. 2 is a schematic diagram of one such correspondence. The display area comprises a plurality of display subregions, each showing a shortcut icon, and the touch input surface below comprises a plurality of touch points (the small squares in the figure), with several touch points corresponding to one display subregion. When the user touches the touch input surface with an input device (such as a finger or a stylus), a touch point is contacted; the touch point is located, the display subregion corresponding to it is found according to the correspondence, and the corresponding shortcut icon is obtained.
The correspondence can be established in two ways, by a one-dimensional coordinate or by a two-dimensional coordinate; in a specific embodiment of the invention, when the display subregions are arranged in horizontal order, only a one-dimensional horizontal coordinate is used, as described below.
With a one-dimensional coordinate, for example, the following correspondence is established:
[0, 10) corresponds to display subregion 1;
[10, 20) corresponds to display subregion 2;
[20, 30) corresponds to display subregion 3.
In this case only the horizontal coordinate of the touch point needs to be obtained. For example, if the horizontal coordinate of the current touch point is 25, the correspondence shows that the corresponding display subregion is display subregion 3; if the icon currently in display subregion 3 is a folder icon, the execution module opens the folder corresponding to that icon once the trigger condition is satisfied.
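As a non-authoritative sketch, the one-dimensional interval lookup above can be implemented with a sorted boundary list. The boundary values (0, 10, 20, 30) and the 1-based subregion numbering come from the example in the text; the function name and the use of `bisect` are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch of the one-dimensional touch-point-to-subregion lookup.
import bisect

# Right-open intervals [0,10), [10,20), [20,30) map to subregions 1, 2, 3.
BOUNDARIES = [0, 10, 20, 30]

def subregion_for_x(x):
    """Return the 1-based display subregion for a touch-point x coordinate."""
    if not (BOUNDARIES[0] <= x < BOUNDARIES[-1]):
        return None  # touch point lies outside the mapped range
    return bisect.bisect_right(BOUNDARIES, x)

print(subregion_for_x(25))  # touch point at x = 25 falls in [20, 30) -> 3
```

With the example coordinate 25 from the text, the lookup yields display subregion 3, matching the correspondence table.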
When the display subregions are arranged both horizontally and vertically, a two-dimensional coordinate is needed to distinguish them; establishing this correspondence differs from the one-dimensional case only in the coordinate dimension and is not described further here.
Of course, for convenience of description the display subregions in Fig. 2 are distinguished by rectangular blocks; in practical application the subregions may also be distinguished directly by coordinates rather than by such blocks.
The above uses a directly established correspondence; the correspondence can also be obtained by calculation. For example, if the length of the touch input surface is 10 and the length of the display area is 30, multiplying the horizontal coordinate X of the touch point by 3 gives the horizontal coordinate of the corresponding display position, from which the display subregion is determined.
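The computed variant can be sketched the same way. The lengths 10 and 30 are the example values from the text; the subregion width of 10 display units matches the earlier interval table, and all names are illustrative assumptions.

```python
# Hedged sketch of the computed correspondence: scale the touch coordinate
# onto the display, then bucket it into a subregion.
TOUCH_LENGTH = 10
DISPLAY_LENGTH = 30
SUBREGION_WIDTH = 10  # display units per subregion, as in the interval example

def display_x(touch_x):
    """Map a touch-surface x coordinate onto the display area."""
    return touch_x * (DISPLAY_LENGTH / TOUCH_LENGTH)  # scale factor 30/10 = 3

def subregion_for_touch(touch_x):
    """Scale the touch coordinate, then return the 1-based subregion index."""
    return int(display_x(touch_x) // SUBREGION_WIDTH) + 1

print(subregion_for_touch(8.5))  # 8.5 * 3 = 25.5 -> subregion 3
```

The advantage of the computed form is that no lookup table has to be stored; the scale factor alone defines the whole correspondence.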
Of course, the above assumes that the touch input surface and the display area are rectangular. Those skilled in the art will appreciate that they may be of any shape (such as circular, elliptical, star-shaped, or heart-shaped), as long as the touch points have a correspondence with the display subregions.
In a specific embodiment of the invention, the display area of the display module is hidden in the normal state, and the whole display area is presented only after being triggered. The computer of this embodiment therefore further comprises:
a display control module, configured to show the display area fully on the screen when a display condition is satisfied, and otherwise to hide the display area.
In a specific embodiment, the display condition may be that the input device is in contact with the touch surface.
Hiding of the display area may be carried out as follows:
not hiding while the input device is detected to be in contact with the touch surface within the predetermined time;
hiding when the input device is detected to have left a predetermined touch point.
The above hiding and full showing can be realized in the following ways:
Way one: set the position of the display area so that its upper edge extends only a predetermined number of pixels, such as 1-2 pixels, above the lower edge of the display screen, which counts as hidden; otherwise move the area up so that its lower edge lies at or above the lower edge of the screen and the display area is fully shown. A sliding animation may of course be used to move the area in and out.
Way two: set the transparency of the display area to a predetermined value, for example between 100% and 80%, or set its visibility directly to 0%, which counts as hidden; otherwise lower the transparency below another threshold, for example below 60% (any other value may be used, as long as the user can recognize the area), so that the display area is fully shown.
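Both hiding techniques can be modeled as simple state changes. The sketch below assumes no particular GUI toolkit; the class, its fields, and the pixel and opacity values are illustrative assumptions loosely following the numbers in the text.

```python
# Illustrative sketch of the two hide/show techniques for the display area.

class DisplayArea:
    def __init__(self, height, screen_height):
        self.height = height
        self.screen_height = screen_height
        self.top = screen_height  # y coordinate of the area's upper edge
        self.opacity = 1.0        # 1.0 = fully opaque/visible

    # Way one: positioning. Hidden = only a few pixels peek above the
    # screen's lower edge; shown = the whole area is on screen.
    def hide_by_position(self, peek_pixels=2):
        self.top = self.screen_height - peek_pixels

    def show_by_position(self):
        self.top = self.screen_height - self.height

    # Way two: transparency. Hidden = fully transparent;
    # shown = opaque enough for the user to recognize the area.
    def hide_by_opacity(self):
        self.opacity = 0.0

    def show_by_opacity(self, opacity=0.6):
        self.opacity = opacity

area = DisplayArea(height=100, screen_height=768)
area.hide_by_position()
print(area.top)  # 766: only 2 px of the area remain visible
area.show_by_position()
print(area.top)  # 668: the whole 100 px area is on screen
```

A real implementation would animate these transitions (the "sliding" mentioned above) rather than jump between states.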
The judgment module judges, according to the contact information, whether the execution condition is satisfied. In specific embodiments of the invention this is handled in the following ways:
Way one: judge whether the contact duration between the input device and the first touch point exceeds a preset time threshold; if it does, the execution condition is judged satisfied, otherwise it is not.
Way two: judge whether the contact duration between the input device and the first touch point exceeds the preset time threshold; when it does, further detect whether the input device has left the first touch point, and judge the execution condition satisfied when the departure is detected.
Way three: when the input device contacts the first touch point, leaves it, and contacts the first touch point again within the preset time threshold, judge the execution condition satisfied; this resembles a mouse double-click.
Of course, whether the execution condition is satisfied may also be judged in other ways, as long as the judgment is made from the contact information between the input device and the first touch point; the contact information may be the contact duration, the number of touches within a predetermined period, and so on.
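The three ways can be sketched as simple predicates over timing data. The threshold values (0.5 s and 0.3 s) and all function names are illustrative assumptions; the patent only specifies "preset time thresholds".

```python
# Hedged sketch of the three execution conditions ("ways") described above.
# Timestamps and durations are in seconds; threshold values are assumed.

LONG_PRESS_THRESHOLD = 0.5   # preset time threshold for ways one and two
DOUBLE_TAP_THRESHOLD = 0.3   # preset threshold for way three

def condition_way_one(press_duration):
    # Way one: contact duration exceeds the preset threshold.
    return press_duration > LONG_PRESS_THRESHOLD

def condition_way_two(press_duration, released):
    # Way two: duration exceeds the threshold AND the device has then left.
    return press_duration > LONG_PRESS_THRESHOLD and released

def condition_way_three(first_release_time, second_press_time):
    # Way three: a second contact on the same touch point within the
    # threshold after release, similar to a mouse double-click.
    return (second_press_time - first_release_time) <= DOUBLE_TAP_THRESHOLD

print(condition_way_one(0.8))          # True: held longer than 0.5 s
print(condition_way_two(0.8, False))   # False: still in contact
print(condition_way_three(1.0, 1.2))   # True: re-pressed within 0.3 s
```

Way two avoids accidental activation while the finger is still resting on the touch point, since execution only fires on release.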
Meanwhile, in a specific embodiment of the invention, for the user's convenience of observation, the computer of the invention further comprises:
a plurality of LEDs (light emitting diodes) arranged in correspondence with the touch points;
a control module, configured to light, according to the position information of the first touch point, the LED corresponding to the first touch point.
With this arrangement, when the input device slides across the touch surface, the LEDs light up following the finger.
Of course, when no input device is in contact with the touch surface, the LEDs may glow one by one in a breathing pattern.
The computer of the embodiment of the invention further comprises:
a setting module, configured to set the size of the display area, the size of the shortcut icons, the position of the display area, the transparency of the display area, and so on.
When there are many shortcut icons, the computer of the embodiment of the invention handles the situation in the following ways:
Way one: expand the area of the display area, that is, increase the number of display subregions.
Way two: reduce the area of the shortcut icons and display subregions so that the display area can show more shortcut icons.
Way three: hide some of the shortcut icons; when the input device contacts a preset touch point, hide the current shortcut icons and show the others.
Of course, this preset touch point is preferably arranged at an end of the touch input surface.
For the user's convenience of observation, the computer of the embodiment of the invention further comprises:
a magnification module, configured to magnify the first shortcut icon.
When the input device contacts the first touch point, the magnification module magnifies the first shortcut icon shown in the first display subregion corresponding to the first touch point, serving as a visual cue; the magnification factor can be set in advance by the user.
Of course, the magnification module may also magnify a predetermined number of shortcut icons adjacent to the first shortcut icon, though their magnification factor should not exceed that of the first shortcut icon; the factor may be inversely proportional to each icon's distance from the first shortcut icon.
Meanwhile, the magnification module may also control the size variation of each icon according to the movement of the input device relative to the touch surface.
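The inverse-distance rule for neighboring icons can be sketched as follows. The maximum factor, the decay constant, and the exact fall-off formula are illustrative assumptions, since the text only requires that the factor decrease with distance and never exceed the touched icon's own factor.

```python
# Hedged sketch of dock-style neighbor magnification: adjacent icons are
# enlarged by a factor that falls off inversely with their distance from
# the touched icon, and is capped at the touched icon's factor.

def magnification(distance, max_factor=2.0, decay=1.0):
    """Scale factor for an icon `distance` subregions from the touched one.

    distance 0 is the touched icon itself (full magnification); farther
    icons get a factor that decreases toward 1.0 (no magnification).
    """
    if distance == 0:
        return max_factor
    factor = 1.0 + (max_factor - 1.0) / (decay * distance + 1.0)
    return min(factor, max_factor)

for d in range(4):
    print(d, round(magnification(d), 2))
# the factor starts at max_factor and shrinks toward 1.0 with distance
```

Tying `max_factor` to a user preference would implement the text's note that the magnification factor can be set in advance by the user.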
Fig. 3 is a schematic flowchart of the computer processing method of the embodiment of the invention.
The computer of the embodiment of the invention may be a notebook computer, which comprises:
a display;
a host connected to the display, the host in turn comprising:
a housing;
a mainboard arranged in the housing;
a central processing unit arranged on the mainboard;
a chipset arranged on the mainboard and connected to the central processing unit;
a video card arranged on the mainboard, connected to the chipset, and connected to the display through an image data transmission interface on the mainboard;
wherein the notebook computer further comprises:
a touch input surface, formed by touch points and arranged on the upper surface of the housing; and a sensing module, located below the touch input surface, arranged on the mainboard, connected to the chipset, and configured to obtain position information of a first touch point contacted when an input device (such as a user's finger or a stylus) touches the touch input surface, as well as contact information between the input device and the first touch point;
wherein the central processing unit is configured to acquire shortcut icons corresponding to computer processing objects; to display, through the video card, the shortcut icons in a plurality of corresponding display subregions within a partial display area of the display screen, each display subregion corresponding to touch points of the touch input surface; to obtain, according to the position information of the first touch point and the correspondence between the first touch point and the display subregions, a first shortcut icon shown in the first display subregion corresponding to the first touch point; and, when the execution condition is judged satisfied according to the contact information, to execute a first computer processing object corresponding to the first shortcut icon.
The above are merely preferred embodiments of the present invention. It should be pointed out that those skilled in the art can make several improvements and modifications without departing from the principles of the invention, and such improvements and modifications should also be regarded as falling within the protection scope of the invention.
Claims (10)
1. A computer, characterized in that it comprises:
an icon acquisition module, configured to directly obtain shortcut icons of the computer processing objects on the computer desktop;
a display module, configured to display the shortcut icons in a plurality of corresponding display subregions within a partial display area of a display screen, wherein the display area formed by the set of display subregions is located at a first physical position;
a touch input surface, located at a second physical position and formed by a plurality of touch points, wherein each display subregion corresponds to at least one of the plurality of touch points of the touch input surface, and the first physical position and the second physical position do not geographically overlap and are distinct;
a first sensing module, located below the touch input surface and configured to obtain position information of a first touch point contacted when an input device touches the touch input surface;
a processing module, configured to obtain, according to the position information of the first touch point and the correspondence between the first touch point and the display subregions, a first shortcut icon shown in the first display subregion corresponding to the first touch point;
an execution module, configured to execute a first computer processing object corresponding to the first shortcut icon;
wherein the computer further comprises:
a release module, configured to release the shortcut icons shown by the display module onto the desktop.
2. The computer according to claim 1, characterized in that it further comprises:
a second sensing module, configured to obtain contact information between the input device and the first touch point when the input device touches the touch input surface;
a judgment module, configured to judge, according to the contact information, whether an execution condition is satisfied and to obtain a judgment result;
a trigger module, configured to trigger the execution module to perform the operation when the judgment result indicates that the execution condition is satisfied.
3. The computer according to claim 1 or 2, characterized in that the display subregions form a display area, and the computer further comprises:
a display control module, configured to show the display area fully when a display condition is satisfied, and otherwise to hide the display area.
4. The computer according to claim 2, characterized in that it further comprises:
light emitting diodes arranged in correspondence with the touch points;
a control module, configured to light, according to the position information of the first touch point, a first light emitting diode corresponding to the first touch point.
5. The computer according to claim 2, characterized in that it further comprises:
a magnification module, configured to magnify the first shortcut icon.
6. A notebook computer, characterized in that it comprises:
a display;
a host connected to the display, the host comprising:
a housing;
a mainboard arranged in the housing;
a central processing unit arranged on the mainboard;
a chipset arranged on the mainboard and connected to the central processing unit;
a video card arranged on the mainboard, connected to the chipset, and connected to the display through an image data transmission interface on the mainboard;
wherein the notebook computer further comprises:
a touch input surface, formed by touch points and arranged on the upper surface of the housing;
a first sensing module, located below the touch input surface, arranged on the mainboard, connected to the chipset, and configured to obtain position information of a first touch point contacted when an input device touches the touch input surface;
wherein the central processing unit is configured to directly obtain shortcut icons of the computer processing objects on the computer desktop; to display, through the video card, the shortcut icons in a plurality of corresponding display subregions within a partial display area of the display screen, each display subregion corresponding to at least one of the plurality of touch points of the touch input surface; to obtain, according to the position information of the first touch point and the correspondence between the first touch point and the display subregions, a first shortcut icon shown in the first display subregion corresponding to the first touch point; and, when an execution condition is satisfied, to execute a first computer processing object corresponding to the first shortcut icon;
and wherein the central processing unit is further configured to release the shown shortcut icons onto the desktop.
7. notebook computer according to claim 6 is characterized in that, also comprises:
Second sensing module is arranged on the described mainboard, is connected with described chipset, when being used for obtaining input media and touching described touch input face and the contact information of described first touch point;
Described central processing unit judges according to described contact information whether executive condition is set up.
8. according to claim 6 or 7 described notebook computers, it is characterized in that described demonstration subregion is formed the viewing area, described central processing unit also is used for controlling described viewing area and showing fully when display condition is set up, otherwise hides described viewing area.
9. The notebook computer according to claim 6, characterized by further comprising:
light emitting diodes arranged in correspondence with the touch points; and
a control module, connected to the first sensing module, configured to light, according to the position information of the first touch point, a first light emitting diode corresponding to the first touch point.
10. The notebook computer according to claim 6, characterized in that the central processing unit is further configured to enlarge the first shortcut icon.
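The feedback behavior of claims 9 and 10, lighting the LED paired with the touched point and enlarging the matching shortcut icon, can be sketched as below. This is an illustrative model only, not from the patent; the class `TouchFeedback`, its attributes, and the scale factor are all hypothetical.

```python
# Illustrative sketch (not from the patent) of claims 9-10: on a first
# touch, light only the LED set in correspondence with the touched point
# and enlarge the matching shortcut icon.

class TouchFeedback:
    def __init__(self, num_points: int) -> None:
        self.leds = [False] * num_points      # one LED per touch point
        self.icon_scale = [1.0] * num_points  # display scale per icon

    def on_first_touch(self, point: int) -> None:
        """Light only the touched point's LED and enlarge its icon."""
        self.leds = [i == point for i in range(len(self.leds))]
        self.icon_scale[point] = 1.5          # enlarge first shortcut icon

fb = TouchFeedback(num_points=4)
fb.on_first_touch(2)
print(fb.leds)        # [False, False, True, False]
print(fb.icon_scale)  # [1.0, 1.0, 1.5, 1.0]
```

Rebuilding the full LED list on each touch guarantees that exactly one LED is lit at a time, mirroring the control module's per-touch-point correspondence in claim 9.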
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2008101155822A CN101615100B (en) | 2008-06-25 | 2008-06-25 | Computer and notebook computer |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101615100A CN101615100A (en) | 2009-12-30 |
CN101615100B (en) | 2013-07-03 |
Family
ID=41494765
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2008101155822A Active CN101615100B (en) | 2008-06-25 | 2008-06-25 | Computer and notebook computer |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101615100B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102789353A (en) * | 2011-05-20 | 2012-11-21 | Tencent Technology (Shenzhen) Co., Ltd. | Method and device for operating desktop objects on touch screen mobile devices |
CN103377020A (en) * | 2012-04-23 | 2013-10-30 | Baidu Online Network Technology (Beijing) Co., Ltd. | Display method and device for mobile terminal |
CN104536660A (en) * | 2014-12-16 | 2015-04-22 | Xiaomi Inc. | Interface displaying method and device |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1664767A (en) * | 2004-03-04 | 2005-09-07 | Inventec Corporation | Portable data processing equipment with single button touch control operation and switching method thereof |
Non-Patent Citations (2)
Title |
---|
JP Laid-Open Publication No. 2005-122271 A, 2005.05.12 |
JP Laid-Open Publication No. 2006-244041 A, 2006.09.14 |
Also Published As
Publication number | Publication date |
---|---|
CN101615100A (en) | 2009-12-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10444989B2 (en) | Information processing apparatus, and input control method and program of information processing apparatus | |
AU2007100827A4 (en) | Multi-event input system | |
RU2523169C2 (en) | Panning content using drag operation | |
KR101361214B1 (en) | Interface Apparatus and Method for setting scope of control area of touch screen | |
KR101117481B1 (en) | Multi-touch type input controlling system | |
CN103064627B (en) | A kind of application management method and device | |
CN111475097B (en) | Handwriting selection method and device, computer equipment and storage medium | |
US20120306788A1 (en) | Electronic apparatus with touch input system | |
TWI463355B (en) | Signal processing apparatus, signal processing method and selecting method of user-interface icon for multi-touch interface | |
KR20150014083A (en) | Method For Sensing Inputs of Electrical Device And Electrical Device Thereof | |
WO2014029043A1 (en) | Method and device for simulating mouse input | |
CN103218044B (en) | A kind of touching device of physically based deformation feedback and processing method of touch thereof | |
JP2011221640A (en) | Information processor, information processing method and program | |
CN104866225A (en) | Electronic device having touch display screen and control method therefor | |
JP2009151718A (en) | Information processing device and display control method | |
US20140015785A1 (en) | Electronic device | |
CN103309482A (en) | Electronic equipment and touch control method and touch control device thereof | |
CN102768595B (en) | A kind of method and device identifying touch control operation instruction on touch-screen | |
KR20120023405A (en) | Method and apparatus for providing user interface | |
CN102830893A (en) | Screen display control method and system | |
JPH10228350A (en) | Input device | |
CN103389876A (en) | Function switching method based on touch display equipment and touch display equipment | |
US20140298275A1 (en) | Method for recognizing input gestures | |
CN103365444A (en) | Terminal equipment and touch method for touch screen thereon | |
CN101615100B (en) | Computer and notebook computer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |