WO2014178800A1 - User interface for multiple-users collaboration surface - Google Patents
- Publication number
- WO2014178800A1 WO2014178800A1 PCT/TH2013/000023 TH2013000023W WO2014178800A1 WO 2014178800 A1 WO2014178800 A1 WO 2014178800A1 TH 2013000023 W TH2013000023 W TH 2013000023W WO 2014178800 A1 WO2014178800 A1 WO 2014178800A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tools
- workspace
- same time
- unbounded
- user tools
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Abstract
Software embedded in a multi-user touch-display computer, consisting of tools, functions, and programs that optimize collaboration among individuals. The hardware and the software are designed to work together cooperatively and fluidly: in addition to touchscreen hardware that can detect multiple touch points, the touchscreen technology must include software that supports multiple users working together.
Description
USER INTERFACE FOR MULTIPLE-USERS COLLABORATION SURFACE FIELD OF THE INVENTION
Aspects of the invention generally relate to software designed for touchscreen devices, or computers with touchscreen functions, that allow multiple users to work together at the same time.
BACKGROUND OF INVENTION
A variety of touchscreen software and devices have been developed throughout the evolution of computer science to enhance usability and make computers more versatile. Touchscreen software and devices change how users interact with computers: many applications become easier to use, and touchscreens open up capabilities that an ordinary keyboard and mouse cannot provide.
Examples of touchscreen devices include the iPhone and tablet computers. Touchscreen software has developed rapidly: touchscreens once worked only with a stylus pen and could register only one touch point at a time, whereas today's touchscreen software is fluid, responds well to finger movement, and handles many simultaneous touch points. Numerous touchscreen devices and applications allow more than one user to use the device at the same time; one example is the iPad running the Tap Tap Revolution game. However, such devices and software are usually not designed for multiple users to use the device at the same time, and many of the devices are too small. The Sandbox addresses these issues: it is software built on a large touchscreen interface, designed so that many users can use the touchscreen at once with ease. The embedded software is essential because it is designed for many simultaneous users: the tools provided suit a variety of environments and many groups of people, all of the tools can be customized, and many users can use different tools, or even the same tool, all at once.
SUMMARY OF THE INVENTION
One aspect of the invention is to optimize the multi-user touchscreen experience. The software allows more than one person to work together on a touchscreen device at the same time. Each user has their own workspace and is able to interact with another person's work; all of the work done in a workspace constitutes a project.
Another aspect of the invention is the specialized user tools: every user can simply draw a geometric shape to pull out the specialized user tools at any time, and each user can use any tool individually at the same time. The tools are flexible, and changes or adjustments to anything created with them are always available.
Yet another aspect of the invention is the unbounded, indefinite workspace. The zooming function lets the workspace expand without bound or limit, and the panning function lets each user move which part of the workspace is displayed on the screen.
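The zoom-and-pan behavior described above can be pictured as a viewport transform from an unbounded workspace coordinate system onto the finite screen. The following is a minimal illustrative sketch, not part of the patent; the class and method names (`Viewport`, `pan`, `to_screen`) are assumptions chosen for this example.

```python
class Viewport:
    """Maps unbounded workspace coordinates to the finite screen.

    Panning shifts the viewport origin; zooming changes the scale,
    so any region of the indefinite workspace can be brought on screen.
    """

    def __init__(self):
        self.origin_x = 0.0  # workspace point shown at the screen's top-left
        self.origin_y = 0.0
        self.zoom = 1.0      # zoom < 1 shows more of the workspace at once

    def pan(self, dx, dy):
        # Move which part of the unbounded workspace is displayed.
        self.origin_x += dx
        self.origin_y += dy

    def to_screen(self, wx, wy):
        # Project a workspace coordinate into screen coordinates.
        return ((wx - self.origin_x) * self.zoom,
                (wy - self.origin_y) * self.zoom)

vp = Viewport()
vp.pan(100.0, 50.0)   # scroll toward a distant region
vp.zoom = 2.0         # zoom in on it
```

Because the origin and zoom are unconstrained, the workspace itself never needs a fixed size; only the mapping onto the screen changes.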
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing Summary, as well as the following Detailed Description, will be better understood when read in conjunction with the accompanying drawings.

FIG. 1 is a structure of the architecture of the touchscreen technology system.
FIG. 2 is a scenario of multiple users working together.
FIG. 3 is a structure of the gesture analysis.
FIG. 4 is a scenario of many individualized user tools on a workspace.
FIG. 5 is a scenario of the unbounded workspace in use.
FIG. 6 is a scenario of object creation and modification.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
This invention was created to help multiple users collaborate better while working together on computers. For multiple users to work together well on a touchscreen device, not only must the hardware (the computer with a touchscreen) be robust, but the multi-user software must also be well built, so that hardware and software operate cooperatively.
Sandbox is the combination of a robust multi-user touchscreen computer and software created to help multiple users collaborate. The functions and tools of Sandbox are designed to help multiple users share and work together on the same project while each has an individual workspace.
The Gesture Analysis enables multiple users to use different tools at the same time; for example, one user can draw a line while another creates a shape. The specialized user tools are pulled out by drawing a geometric shape; they enable many features, can be pulled out as many times as needed by each user, and are movable. The unbounded, indefinite workspace provides unlimited area for maximum creativity: the displayed area can be moved around with the panning tool, and the zooming function zooms in and out of an area.
Illustrative Computing Environment
FIG. 1 illustrates a diagram of the architecture of the touchscreen technology system. The sensor accurately detects the location of each point touched on the touchscreen. The ID lets the computer recognize each touch individually. The driver calibrates the raw reading to estimate the point's location accurately and sends that point to the Sandbox computer.
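The sensor-to-application pipeline described above can be sketched in a few lines. This is an illustrative assumption, not the patent's implementation: the affine calibration, and the names `TouchPoint` and `calibrate`, are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    touch_id: int  # ID assigned by the sensor so each touch is tracked individually
    raw_x: float   # raw sensor reading before driver calibration
    raw_y: float

def calibrate(point, scale=1.0, dx=0.0, dy=0.0):
    """Driver-side calibration: map a raw sensor reading to an estimated
    screen coordinate before forwarding it to the application."""
    return (point.raw_x * scale + dx, point.raw_y * scale + dy)

# Two simultaneous touches keep distinct IDs through the pipeline,
# so the application can tell the users' fingers apart.
touches = [TouchPoint(1, 100.0, 200.0), TouchPoint(2, 300.0, 50.0)]
calibrated = {t.touch_id: calibrate(t, scale=0.5, dx=10.0, dy=10.0)
              for t in touches}
```

The essential point is that the ID survives calibration, which is what lets later stages attribute each touch to a particular user.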
FIG. 2 illustrates a Sandbox being used by multiple users at the same time.
FIG. 3 illustrates a diagram of how each individual touch point is analyzed by the gesture analysis and then interpreted into different commands.
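One way the gesture analysis might turn tracked touch points into commands is by classifying each touch independently, so two users' gestures can yield different commands at the same time. The tap/drag rule and threshold below are illustrative assumptions; the patent does not specify the classification logic.

```python
def interpret(start, end, threshold=10.0):
    """Classify one tracked touch into a command from its start and end
    positions: a small displacement is a 'tap', a larger one a 'drag'."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return "tap" if distance < threshold else "drag"

# Each user's touch is analyzed on its own, so simultaneous gestures
# are interpreted into different commands without interfering.
commands = {
    "user_a": interpret((0, 0), (2, 1)),      # barely moved
    "user_b": interpret((0, 0), (120, 80)),   # long stroke
}
```

A real gesture analyzer would also consider timing and multi-finger contact groups, but per-touch classification is the core idea that makes simultaneous multi-user input possible.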
Tools Function
FIG. 4 explains the functions of the tools used on the workspace. The individualized user tools are called out by drawing a geometric shape anywhere on the workspace. Each user can freely select and use any tool from their individualized user tools without interfering with another user's individualized tools. Each attribute of a tool in the individualized user tools can be adjusted as desired; for example, the pen tool can be used with different color attributes by different users at the same time. A tool from one user's individualized tools can also be used on a creation made with another user's tools; for example, the eraser tool of one set of individualized user tools can erase lines drawn with another.
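The per-user, non-interfering attributes described above amount to each user holding their own tool palette with its own attribute state. The sketch below is an assumption made for illustration (the `ToolPalette` class and attribute names are not from the patent).

```python
class ToolPalette:
    """One individualized tool palette per user. Attributes such as pen
    color live on the palette, so one user's adjustment never affects
    another user's tools."""

    def __init__(self, user):
        self.user = user
        self.attributes = {"pen_color": "black", "pen_width": 1}

    def set_attribute(self, name, value):
        # Adjust any attribute of this user's tools independently.
        self.attributes[name] = value

# Two users call out their own palettes and use the same pen tool
# with different color attributes at the same time.
palettes = {u: ToolPalette(u) for u in ("alice", "bob")}
palettes["alice"].set_attribute("pen_color", "red")
palettes["bob"].set_attribute("pen_color", "blue")
```

Keeping attribute state per palette, rather than global, is what allows the same tool to be in simultaneous use with different settings.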
Workspace Features
FIG. 5 illustrates scenarios of the workspace software in use. One feature of the software is showing many workspaces at the same time; another is the use of a workspace by multiple users at the same time. The workspace is unbounded and indefinite and extends beyond the screen. The zooming tool and the panning tool are used to choose which part of the unbounded, indefinite workspace is shown on the screen.

Object Functions
FIG. 6 illustrates the usage of the object functions on the workspace. The first function is to open different types of files at the same time and to select part of a file, or the full file, to be used as an object on the workspace an unlimited number of times. Objects can also be created with each user's individualized tools, and the attributes of each object, including shape, size, and color, can be changed at any time.
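The claim that an object's shape, size, and color remain editable after creation can be sketched as objects whose attributes are plain mutable state. The `WorkspaceObject` class below is an illustrative assumption, not the patent's data model.

```python
class WorkspaceObject:
    """An object placed on the workspace. Its shape, size, and color
    are ordinary mutable attributes, so any of them can be changed
    at any time after creation."""

    def __init__(self, shape, size, color):
        self.shape = shape
        self.size = size
        self.color = color

# Create an object, then change its attributes later, as the
# description says is always possible.
obj = WorkspaceObject("circle", 10, "green")
obj.shape = "square"    # reshape an existing creation
obj.color = "yellow"    # recolor it; size is left unchanged
```

Nothing about the object is frozen at creation time, which is the design property the description emphasizes.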
Claims
1. A method comprising: calling out at least one set of individualized user tools; displaying the at least one set of individualized user tools; receiving user input from the touch screen; performing more than one function of the individualized user tools at the same time; and displaying the work results of the various functions of the individualized user tools.
2. The method of claim 1, wherein the individualized user tools can be called out for multiple users at the same time.
3. The method of claim 1, wherein the calling out of the individualized user tools can be performed anywhere on the touch screen.
4. The method of claim 1, wherein the calling out can be repeated to create an unlimited number of individualized user tools.
5. The method of claim 1, wherein the individualized user tools can be moved freely anywhere on the touch screen.
6. The method of claim 1, wherein the different tools on the individualized user tools brought out by the method of claim 1 can each perform different functions at the same time for the same or different users.
7. The method of claim 1, wherein more than one user can use the same tool with different attributes at the same time.
8. The method of claim 1, wherein the object and line tools can be used individually by more than one person at the same time, each individual can change or adjust the attributes of their line-creation tool, and the eraser tool can be used to erase any creation made by any user.
9. The method of claim 1, wherein the workspace is unbounded and expandable by zooming out indefinitely to create space and by panning to move to any location in the indefinite workspace.
10. The method of claim 1, wherein a screen capture can be used to capture all or any part of the unbounded workspace of claim 9 in any manner and at any size.
11. The method of claim 1, wherein the unbounded workspace of claim 9 can be created more than once, an unlimited number of times.
12. The method of claim 1, wherein one or more of the unbounded workspaces created can be viewed on the screen at the same time.
13. The method of claim 1, wherein any geometric shape that has previously been adjusted or altered, or used as part of something else, can be adjusted and changed into another geometric shape an unlimited number of times.
14. The method of claim 1, wherein any file in any format can be pulled out or duplicated, in any format, as a separate file, object, or item to work with.
15. The method of claim 1, wherein any object can be created an unlimited number of times on the unbounded workspace.
16. The method of claim 1, wherein any image or video can be pulled out in unlimited numbers onto the unbounded workspace.
17. A system comprising:
a visual display with touchscreen capabilities; and
a processor executing software configured to:
call out at least one set of individualized user tools;
display the at least one set of individualized user tools;
receive user input from the touch screen;
perform more than one function of the individualized user tools at the same time; and
display the work results of the various functions of the individualized user tools.
18. The system of claim 17, wherein the individualized user tools can be called out for multiple users at the same time.
19. The system of claim 17, wherein the calling out of the individualized user tools can be performed anywhere on the touch screen.
20. The system of claim 17, wherein the calling out can be repeated to create an unlimited number of individualized user tools.
21. The system of claim 17, wherein the individualized user tools can be moved freely anywhere on the touch screen.
22. The system of claim 17, wherein the different tools on the individualized user tools brought out by the system can each perform different functions at the same time for the same or different users.
23. The system of claim 17, wherein more than one user can use the same tool with different attributes at the same time.
24. The system of claim 17, wherein the object and line tools can be used individually by more than one person at the same time, each individual can change or adjust the attributes of their line-creation tool, and the eraser tool can be used to erase any creation made by any user.
25. The system of claim 17, wherein the workspace is unbounded and expandable by zooming out indefinitely to create space and by panning to move to any location in the indefinite workspace.
26. The system of claim 17, wherein a screen capture can be used to capture all or any part of the unbounded workspace of claim 25 in any manner and at any size.
27. The system of claim 17, wherein the unbounded workspace of claim 25 can be created more than once, an unlimited number of times.
28. The system of claim 17, wherein one or more of the unbounded workspaces created can be viewed on the screen at the same time.
29. The system of claim 17, wherein any geometric shape that has previously been adjusted or altered, or used as part of something else, can be adjusted and changed into another geometric shape an unlimited number of times.
30. The system of claim 17, wherein any file in any format can be pulled out or duplicated, in any format, as a separate file, object, or item to work with.
31. The system of claim 17, wherein any object can be created an unlimited number of times on the unbounded workspace.
32. The system of claim 17, wherein any image or video can be pulled out in unlimited numbers onto the unbounded workspace.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/TH2013/000023 WO2014178800A1 (en) | 2013-05-02 | 2013-05-02 | User interface for multiple-users collaboration surface |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014178800A1 true WO2014178800A1 (en) | 2014-11-06 |
Family
ID=51843789
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/TH2013/000023 WO2014178800A1 (en) | 2013-05-02 | 2013-05-02 | User interface for multiple-users collaboration surface |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2014178800A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10698505B2 (en) | 2016-01-13 | 2020-06-30 | Hewlett-Packard Development Company, L.P. | Executing multiple pen inputs |
US10749701B2 (en) | 2017-09-22 | 2020-08-18 | Microsoft Technology Licensing, Llc | Identification of meeting group and related content |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090193345A1 (en) * | 2008-01-28 | 2009-07-30 | Apeer Inc. | Collaborative interface |
WO2012050946A2 (en) * | 2010-09-29 | 2012-04-19 | Bae Systems Information Solutions Inc. | A method of collaborative computing |
US20120110431A1 (en) * | 2010-11-02 | 2012-05-03 | Perceptive Pixel, Inc. | Touch-Based Annotation System with Temporary Modes |
US20130047093A1 (en) * | 2011-05-23 | 2013-02-21 | Jeffrey Jon Reuschel | Digital whiteboard collaboration apparatuses, methods and systems |
US20130093708A1 (en) * | 2011-10-13 | 2013-04-18 | Autodesk, Inc. | Proximity-aware multi-touch tabletop |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13883670 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13883670 Country of ref document: EP Kind code of ref document: A1 |