US20020057270A1 - Virtual reality method - Google Patents
Virtual reality method
- Publication number
- US20020057270A1
- Authority
- US
- United States
- Prior art keywords
- image
- pointer
- direction signal
- matrix
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05G—CONTROL DEVICES OR SYSTEMS INSOFAR AS CHARACTERISED BY MECHANICAL FEATURES ONLY
- G05G1/00—Controlling members, e.g. knobs or handles; Assemblies or arrangements thereof; Indicating position of controlling members
- G05G1/02—Controlling members for hand actuation by linear movement, e.g. push buttons
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0338—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Automation & Control Theory (AREA)
- Position Input By Displaying (AREA)
- Processing Or Creating Images (AREA)
Abstract
A virtual reality method first provides a plurality of images, which are connected in series as an image sequence. A pointer pointing to a target-image in the image sequence is then provided. When a direction signal is received, the pointer moves to the adjacent image next to the target-image if the direction signal is a first direction, and to the adjacent image previous to the target-image if the direction signal is a second direction.
Description
- 1. Field of the Invention
- The present invention relates to a virtual reality (VR) method, and particularly to a virtual reality method in which users can build up a virtual reality system by directly adopting the images arranged in a matrix form.
- 2. Description of the Related Art
- Conventionally, there are two methods to build up a virtual reality system. The first method is to photograph an object or the environment using a panoramic camera, so as to build up a virtual reality system. The second method is to establish three-dimensional (3D) digital information of an object or the environment so as to achieve the effect of virtual reality.
- The first method requires a panoramic camera and related software (or plug-in software) for producing and playing the photos, as well as technical personnel to operate the camera and software. However, the panoramic camera and the software are expensive, and users often do not have time to learn the required skills. As a result, using the first method to build up a virtual reality system is unrealistic for general users.
- On the other hand, establishing the 3D digital information of an object or the environment requires familiarity with a software tool such as AUTOCAD. For art designers or marketing personnel, the time and money needed to learn such a tool are also unrealistic.
- It is therefore an object of the present invention to provide a virtual reality method in which users can build up a virtual reality system by adopting images arranged in a matrix form. In addition, the present invention also helps art designers or marketing personnel to design an interactive VR object or VR character through simple concepts and operations.
- To achieve the above object, the present invention provides a virtual reality method which operates over different horizontal angles. First, a plurality of images are provided, and these images are connected in series as an image sequence. Then, a pointer pointing to a target-image in the image sequence is provided, wherein the target-image is one of the images in the image sequence. Finally, a direction signal is received; the pointer moves to the adjacent image next to the target-image when the direction signal is a first direction, and to the adjacent image previous to the target-image when the direction signal is a second direction.
- Furthermore, the present invention also provides a virtual reality method which operates over different horizontal angles and different overlooking angles. First, a plurality of images are provided, and these images are arranged into a matrix. Then, a pointer pointing to a target-image in the matrix is provided, wherein the target-image is one of the images in the matrix. Finally, a direction signal is received; the pointer moves to the adjacent image next to the target-image when the direction signal is a first direction, to the adjacent image previous to the target-image when the direction signal is a second direction, to the adjacent image above the target-image when the direction signal is a third direction, and to the adjacent image below the target-image when the direction signal is a fourth direction.
- The aforementioned objects, features and advantages of this invention will become apparent by referring to the following detailed description of the preferred embodiment with reference to the accompanying drawings, wherein:
- FIG. 1 is a flow chart illustrating the operation of a virtual reality method according to the first embodiment of the present invention;
- FIG. 2 is a schematic diagram showing an example of photographing the images of an object from different horizontal angles as disclosed in the first embodiment;
- FIG. 3 is a flow chart illustrating the operation of a virtual reality method according to the second embodiment of the present invention;
- FIG. 4 is a schematic diagram showing an example of photographing the images of an object from different horizontal angles and different overlooking angles as disclosed in the second embodiment; and
- FIG. 5 is a schematic diagram showing an example of the image matrix in the second embodiment.
- Referring to the accompanying figures, the preferred embodiments according to the present invention follow.
- [First Embodiment]
- FIG. 1 illustrates the operation of a virtual reality method according to the first embodiment of the present invention, and FIG. 2 shows an example of photographing the images of an object from different horizontal angles in the first embodiment. Referring to FIGS. 1 and 2, the first embodiment of the present invention is described as follows.
- The first embodiment of the present invention represents a virtual reality method that operates over different horizontal angles. First, in step S100, a plurality of images is provided, and these images are connected in series as an image sequence. These images are photos of an object taken at different positions on a circle having a fixed radius, and there is a predetermined angle difference between one image and its adjacent image in the image sequence.
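- As a rough, non-authoritative sketch of this capture setup, the following Python fragment computes camera positions on a circle of fixed radius for a given number of shots and angle step; the function name and the sixteen-shot, 24-degree values mirror the example below and are assumptions for illustration only.

```python
import math

def capture_positions(radius, count, step_deg):
    """Hypothetical helper: (x, y) camera positions on a circle of fixed radius.

    Position i lies at i * step_deg degrees from the starting point, matching
    the "predetermined angle difference" between adjacent images.
    """
    positions = []
    for i in range(count):
        angle = math.radians(i * step_deg)
        positions.append((radius * math.cos(angle), radius * math.sin(angle)))
    return positions

# Values taken from the FIG. 2 example: sixteen shots spaced 24 degrees apart.
points = capture_positions(radius=1.0, count=16, step_deg=24.0)
```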
- For example, referring to FIG. 2, the images described above are sixteen photos of an object 20 shot by a camera 30 at the different positions (1 to 16) on a circle, and the predetermined angle difference between two adjacent positions is a 24-degree horizontal angle. The sixteen images are then connected in series as an image sequence.
- In step S105, a pointer pointing to a target-image in the image sequence is provided, wherein the target-image is one of the images in the image sequence. In step S110, a direction signal is received. Then, in step S115, it is determined whether the direction signal is a right signal.
- If the direction signal is a right signal (namely, the first direction), then in step S120 it is determined whether the pointer is pointing to the last image of the image sequence. If it is, then in step S125 the pointer is altered to point to the first image of the image sequence; if it is not, then in step S130 the pointer is altered to point to the adjacent image next to the target-image in the image sequence.
- For example, if the direction signal is a right signal and the pointer is pointing to the third image of the sixteen images in FIG. 2, then the pointer is altered to point to the fourth image. If the direction signal is a right signal and the pointer is pointing to the sixteenth image of the sixteen images in FIG. 2, then the pointer is altered to point to the first image.
- In step S155, the image indicated by the pointer is displayed, and the process returns to step S110, to receive another direction signal.
- In addition, if the direction signal is not a right signal, then in step S135 it is determined whether the direction signal is a left signal.
- If the direction signal is a left signal (namely, the second direction), then in step S140 it is determined whether the pointer is pointing to the first image of the image sequence. If it is, then in step S145 the pointer is altered to point to the last image of the image sequence; if it is not, then in step S150 the pointer is altered to point to the adjacent image previous to the target-image in the image sequence.
- For example, if the direction signal is a left signal and the pointer is pointing to the third image of the sixteen images in FIG. 2, then the pointer is altered to point to the second image. If the direction signal is a left signal and the pointer is pointing to the first image of the sixteen images in FIG. 2, then the pointer is altered to point to the sixteenth image.
- Then, in step S155, the image pointed to by the pointer is displayed, and the process returns to step S110, to receive another direction signal.
- Likewise, if the direction signal is not a left signal, the process goes directly to step S110 to receive another direction signal.
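- The loop of steps S100 through S155 can be summarized by the minimal Python sketch below. It is an illustrative reading of the flow chart rather than code from the patent; the class name ImageSequence and the idea of holding the images in a plain list are assumptions.

```python
class ImageSequence:
    """Sketch of the first embodiment: a ring of images plus a pointer."""

    def __init__(self, images):
        self.images = list(images)   # step S100: images connected in series
        self.pointer = 0             # step S105: pointer to the target-image

    def handle(self, direction):
        """Process one direction signal (steps S110 to S150) and return the image to display."""
        last = len(self.images) - 1
        if direction == "right":                       # first direction (S115)
            # Wrap from the last image to the first (S120/S125), else advance (S130).
            self.pointer = 0 if self.pointer == last else self.pointer + 1
        elif direction == "left":                      # second direction (S135)
            # Wrap from the first image to the last (S140/S145), else step back (S150).
            self.pointer = last if self.pointer == 0 else self.pointer - 1
        # Any other signal leaves the pointer unchanged (back to S110).
        return self.images[self.pointer]               # step S155: display this image
```

- With the sixteen images of FIG. 2, sixteen consecutive "right" signals return the viewer to the starting image, which is how the continuous horizontal rotation effect is produced.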
- [Second Embodiment]
- FIG. 3 illustrates the operation of a virtual reality method according to the second embodiment of the present invention, and FIG. 4 shows an example of photographing the images of an object from different horizontal angles and different overlooking angles. Referring to FIGS. 3 and 4, the second embodiment of the present invention is described in detail as follows.
- The second embodiment of the present invention represents a virtual reality method which operates over different horizontal angles and different overlooking angles. First, in step S200, a plurality of images are provided, and these images are arranged into a matrix.
- These images are photos of an object taken at different positions on a virtual spherical surface. The images in the same row of the matrix are photographed from the same overlooking angle but different horizontal angles, and there is a predetermined horizontal angle difference between one image and its adjacent image in any row. In addition, the images in the same column of the matrix are photographed from the same horizontal angle but different overlooking angles, and there is a predetermined overlooking angle difference between one image and its adjacent image in any column.
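- To make the row/column convention concrete, the short sketch below (an assumption for illustration, not text from the patent) maps a matrix position to its pair of capture angles; only the 24-degree horizontal step appears in the example that follows, so the overlooking step used here is a placeholder.

```python
def capture_angles(row, col, horizontal_step_deg=24.0, overlook_step_deg=15.0):
    """Hypothetical mapping from matrix position (row, col) to capture angles.

    Images in the same row share an overlooking angle; images in the same
    column share a horizontal angle. The 15-degree overlooking step is an
    arbitrary placeholder; the document only specifies the horizontal step.
    """
    return row * overlook_step_deg, col * horizontal_step_deg

# Row B (index 1), third column (index 2) of a 6 x 16 matrix, for example:
overlooking, horizontal = capture_angles(row=1, col=2)
```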
- For example, referring to FIG. 4, the images described above are photos of an object 20 shot by a camera 30 at different positions on a virtual spherical surface. For example, at position B of one predetermined overlooking angle, sixteen photos of the object 20 can be shot by the camera 30 at the different positions (1 to 16) on a circle, with a predetermined horizontal angle difference of 24 degrees between two adjacent positions. The same applies to groups A, B, C, . . . , and F. Therefore, there are 96 (6×16) images in this example, and these 96 images are arranged into a matrix 40, as shown in FIG. 5.
- Then, in step S205, a pointer pointing to a target-image in the matrix 40 is provided, wherein the target-image is one of the images in the matrix 40. Also, in step S210, a direction signal is received. Then, in step S215, it is determined whether the direction signal is a right signal.
- If the direction signal is a right signal (namely, the first direction), then in step S220 it is determined whether the pointer is pointing to an image in the last column of the matrix 40. If it is, then in step S225 the pointer is altered to point to the image in the first column of the matrix 40 (in the same row); if it is not, then in step S230 the pointer is altered to point to the adjacent image next to the target-image in the same row.
- For example, if the direction signal is a right signal and the pointer is pointing to the third image of the sixteen images in row B of the matrix 40 in FIG. 5, the pointer is altered to point to the fourth image in row B. If the direction signal is a right signal and the pointer is pointing to the sixteenth image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the first image in row B.
- In step S295, the image pointed to by the pointer is displayed, and the process returns to step S210 to receive another direction signal.
- In addition, if the direction signal is not a right signal, then in step S235 it is determined whether the direction signal is a left signal.
- If the direction signal is a left signal (namely, the second direction), then in step S240 it is determined whether the pointer is pointing to an image in the first column of the matrix 40. If it is, then in step S245 the pointer is altered to point to the image in the last column of the matrix 40 (in the same row); if it is not, then in step S250 the pointer is altered to point to the adjacent image previous to the target-image in the same row.
- For example, if the direction signal is a left signal and the pointer is pointing to the third image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the second image in row B. If the direction signal is a left signal and the pointer is pointing to the first image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the sixteenth image in row B.
- Then, in step S295, the image pointed to by the pointer is displayed, and the process returns to step S210 to receive another direction signal.
- Furthermore, if the direction signal is not a left signal, then in step S255 it is determined whether the direction signal is an up signal.
- If the direction signal is an up signal (namely, the third direction), then in step S260 it is determined whether the pointer is pointing to an image in the first row of the matrix 40. If it is, then in step S265 the pointer is kept pointing to the image in the first row of the matrix 40 (that is, the pointer is unchanged); if it is not, then in step S270 the pointer is altered to point to the adjacent image above the target-image in the same column.
- For example, if the direction signal is an up signal and the pointer is pointing to the third image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the third image in row A. If the direction signal is an up signal and the pointer is pointing to the third image of the sixteen images in row A of the matrix 40 in FIG. 5, then the pointer remains pointing to the third image in row A.
- Then, in step S295, the image pointed to by the pointer is displayed, and the process returns to step S210 to receive another direction signal.
- In addition, if the direction signal is not an up signal, then in step S275 it is determined whether the direction signal is a down signal.
- If the direction signal is a down signal (namely, the fourth direction), then in step S280 it is determined whether the pointer is pointing to an image in the last row of the matrix 40. If it is, then in step S285 the pointer is kept pointing to the image in the last row of the matrix 40 (that is, the pointer is unchanged); if it is not, then in step S290 the pointer is altered to point to the adjacent image below the target-image in the same column.
- For example, if the direction signal is a down signal and the pointer is pointing to the third image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the third image in row C. If the direction signal is a down signal and the pointer is pointing to the third image of the sixteen images in row F of the matrix 40 in FIG. 5, then the pointer remains pointing to the third image in row F.
- Then, in step S295, the image pointed to by the pointer is displayed, and the process returns to step S210 to receive another direction signal.
- Likewise, if the direction signal is not a down signal, the process goes directly to step S210 to receive another direction signal.
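- The matrix navigation of steps S200 through S295 can likewise be sketched in a few lines of Python. This is an illustrative reading of the flow chart, not the patent's own code; the class name ImageMatrix and the list-of-lists representation are assumptions. Note that, per steps S265 and S285 and the examples above, the pointer wraps around horizontally but is clamped at the top and bottom rows.

```python
class ImageMatrix:
    """Sketch of the second embodiment: rows share an overlooking angle,
    columns share a horizontal angle (e.g., the 6 x 16 matrix 40 of FIG. 5)."""

    def __init__(self, rows):
        self.rows = [list(r) for r in rows]    # step S200: images arranged into a matrix
        self.row, self.col = 0, 0              # step S205: the pointer (target-image)

    def handle(self, direction):
        """Process one direction signal (steps S210 to S290) and return the image to display."""
        last_row = len(self.rows) - 1
        last_col = len(self.rows[0]) - 1
        if direction == "right":     # first direction (S215): wrap within the same row
            self.col = 0 if self.col == last_col else self.col + 1
        elif direction == "left":    # second direction (S235): wrap within the same row
            self.col = last_col if self.col == 0 else self.col - 1
        elif direction == "up":      # third direction (S255): clamp at the first row (S265)
            self.row = max(self.row - 1, 0)
        elif direction == "down":    # fourth direction (S275): clamp at the last row (S285)
            self.row = min(self.row + 1, last_row)
        # Any other signal leaves the pointer unchanged (back to S210).
        return self.rows[self.row][self.col]   # step S295: display this image
```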
- As a result, users can photograph the images of an object or the environment and use a simple image editing tool, such as MS Paint, to connect or arrange these images into an image sequence or a matrix. Users can then utilize the image sequence or the matrix with the present invention to build up a virtual reality system. In addition, designers or marketing personnel can use the virtual reality method of the present invention to create interactive VR objects or VR characters with reduced learning time and expense.
- Although the present invention has been described in its preferred embodiment, it is not intended to limit the invention to the precise embodiment disclosed herein. Those who are skilled in this technology can still make various alterations and modifications without departing from the scope and spirit of this invention. Therefore, the scope of the present invention shall be defined and protected by the following claims and their equivalents.
Claims (20)
1. A virtual reality method, comprising the steps of:
providing a plurality of images and connecting the images in series as an image sequence;
providing a pointer pointed to a target-image in the image sequence, wherein the target-image is one of the images in the image sequence;
receiving a direction signal;
determining the direction signal;
altering the pointer to point to an adjacent image next to the target-image in the image sequence when the direction signal is a first direction signal; and
altering the pointer to point to an adjacent image previous to the target-image in the image sequence when the direction signal is a second direction signal.
2. The method as claimed in claim 1 further comprising:
determining whether the pointer is pointing to the last image of the image sequence; and
altering the pointer to point to the first image of the image sequence when the direction signal is the first direction signal and the pointer is pointing to the last image of the image sequence.
3. The method as claimed in claim 1 further comprising:
determining whether the pointer is pointing to the first image of the image sequence; and
altering the pointer to point to the last image of the image sequence when the direction signal is the second direction signal and the pointer is pointing to the first image of the image sequence.
4. The method as claimed in claim 1 further comprising displaying the image pointed to by the pointer.
5. The method as claimed in claim 2 wherein the first direction signal is a right signal.
6. The method as claimed in claim 3 wherein the second direction signal is a left signal.
7. The method as claimed in claim 1 wherein the images are the photos of an object at different positions on a circle having a fixed radius, and there is a predetermined angle difference between one image and its adjacent image in the image sequence.
8. The method as claimed in claim 7 wherein the predetermined angle difference is a 24 degree horizontal angle.
9. A virtual reality method, comprising the steps of:
providing a plurality of images and arranging the images into a matrix;
providing a pointer pointed to a target-image in the matrix, wherein the target-image is one of the images in the matrix;
receiving a direction signal;
determining the direction signal;
altering the pointer to point to an adjacent image next to the target-image in the matrix when the direction signal is a first direction signal;
altering the pointer to point to an adjacent image previous to the target-image in the matrix when the direction signal is a second direction signal;
altering the pointer to point to an adjacent image above the target-image in the matrix when the direction signal is a third direction signal; and
altering the pointer to point to an adjacent image below the target-image in the matrix when the direction signal is a fourth direction signal.
10. The method as claimed in claim 9 further comprising:
determining whether the pointer is pointing to the image in the last column of the matrix; and
altering the pointer to point to the image in the first column of the matrix when the direction signal is the first direction signal and the pointer is pointing to the image in the last column of the matrix.
11. The method as claimed in claim 9 further comprising:
determining whether the pointer is pointing to the image in the first column of the matrix; and
altering the pointer to point to the image in the last column of the matrix when the direction signal is the second direction signal and the pointer is pointing to the image in the first column of the matrix.
12. The method as claimed in claim 9 further comprising:
determining whether the pointer is pointing to the image in the first row of the matrix; and
altering the pointer to point to the image in the first row of the matrix when the direction signal is the third direction signal and the pointer is pointing to the image in the first row of the matrix.
13. The method as claimed in claim 9 further comprising:
determining whether the pointer is pointing to the image in the last row of the matrix; and
altering the pointer to point to the image in the last row of the matrix when the direction signal is the fourth direction signal and the pointer is pointing to the image in the last row of the matrix.
14. The method as claimed in claim 9 further comprising displaying the image pointed to by the pointer.
15. The method as claimed in claim 10 wherein the first direction signal is a right signal.
16. The method as claimed in claim 11 wherein the second direction signal is a left signal.
17. The method as claimed in claim 12 wherein the third direction signal is an up signal.
18. The method as claimed in claim 13 wherein the fourth direction signal is a down signal.
19. The method as claimed in claim 9 wherein the images are the photos of an object at different positions on a virtual spherical surface, and the images in the same row of the matrix represent the images photographed from the same overlooking angle but different horizontal angles, and there is a predetermined horizontal angle difference between one image and its adjacent image in one row, and the images in the same column of the matrix represent the images photographed from the same horizontal angle but different overlooking angles, and there is a predetermined overlooking angle difference between one image and its adjacent image in one column.
20. The method as claimed in claim 19 wherein the predetermined horizontal angle difference is a 24 degree horizontal angle.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW89204033 | 2000-03-14 | ||
TW089204033U TW516685U (en) | 2000-03-14 | 2000-03-14 | Touch-type pointing rod |
Publications (1)
Publication Number | Publication Date |
---|---|
US20020057270A1 (en) | 2002-05-16 |
Family
ID=21665241
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/793,461 US6515652B2 (en), Expired - Fee Related | Tactile pointing stick | | 2001-02-26 |
US10/035,532 US20020057270A1 (en), Abandoned | Virtual reality method | | 2001-11-06 |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/793,461 US6515652B2 (en), Expired - Fee Related | Tactile pointing stick | | 2001-02-26 |
Country Status (2)
Country | Link |
---|---|
US (2) | US6515652B2 (en) |
TW (1) | TW516685U (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7298378B1 (en) | 2004-12-13 | 2007-11-20 | Hagenbuch Andrew M | Virtual reality universe realized as a distributed location network |
US20100283796A1 (en) * | 2004-03-03 | 2010-11-11 | Gary Kramer | System for Delivering and Enabling Interactivity with Images |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090153481A1 (en) * | 2007-12-12 | 2009-06-18 | Gunther Adam M | Data output device having a plurality of key stick devices configured for reading out data to a user and method thereof |
US20090239665A1 (en) * | 2007-12-31 | 2009-09-24 | Michael Minuto | Brandable thumbstick cover for game controllers |
JP4968694B2 (en) * | 2008-06-19 | 2012-07-04 | 富士通コンポーネント株式会社 | Input device |
US20190138044A1 (en) * | 2016-04-12 | 2019-05-09 | Mark Slotta | Control Stick Cap with Retention Features |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6014142A (en) * | 1995-11-13 | 2000-01-11 | Platinum Technology Ip, Inc. | Apparatus and method for three dimensional manipulation of point of view and object |
US6195122B1 (en) * | 1995-01-31 | 2001-02-27 | Robert Vincent | Spatial referenced photography |
US6329963B1 (en) * | 1996-06-05 | 2001-12-11 | Cyberlogic, Inc. | Three-dimensional display system: apparatus and method |
US6597380B1 (en) * | 1998-03-16 | 2003-07-22 | Nec Corporation | In-space viewpoint control device for use in information visualization system |
US6636246B1 (en) * | 2000-03-17 | 2003-10-21 | Vizible.Com Inc. | Three dimensional spatial user interface |
US6668082B1 (en) * | 1997-08-05 | 2003-12-23 | Canon Kabushiki Kaisha | Image processing apparatus |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5889508A (en) * | 1996-09-26 | 1999-03-30 | Slotta; Mark R. | Cushion for keyboard cursor control stick |
US6271834B1 (en) * | 1998-05-29 | 2001-08-07 | International Business Machines Corporation | Integrated pointing device having tactile feedback |
TW526446B (en) * | 1999-08-27 | 2003-04-01 | Darfon Electronics Corp | Pointing stick device capable of sensing adapter pressure acutely |
- 2000
  - 2000-03-14 TW TW089204033U patent/TW516685U/en unknown
- 2001
  - 2001-02-26 US US09/793,461 patent/US6515652B2/en not_active Expired - Fee Related
  - 2001-11-06 US US10/035,532 patent/US20020057270A1/en not_active Abandoned
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6195122B1 (en) * | 1995-01-31 | 2001-02-27 | Robert Vincent | Spatial referenced photography |
US6014142A (en) * | 1995-11-13 | 2000-01-11 | Platinum Technology Ip, Inc. | Apparatus and method for three dimensional manipulation of point of view and object |
US6329963B1 (en) * | 1996-06-05 | 2001-12-11 | Cyberlogic, Inc. | Three-dimensional display system: apparatus and method |
US6668082B1 (en) * | 1997-08-05 | 2003-12-23 | Canon Kabushiki Kaisha | Image processing apparatus |
US6597380B1 (en) * | 1998-03-16 | 2003-07-22 | Nec Corporation | In-space viewpoint control device for use in information visualization system |
US6636246B1 (en) * | 2000-03-17 | 2003-10-21 | Vizible.Com Inc. | Three dimensional spatial user interface |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100283796A1 (en) * | 2004-03-03 | 2010-11-11 | Gary Kramer | System for Delivering and Enabling Interactivity with Images |
US7956872B2 (en) * | 2004-03-03 | 2011-06-07 | Virtual Iris Studios, Inc. | System for delivering and enabling interactivity with images |
US9087413B2 (en) | 2004-03-03 | 2015-07-21 | Engram Networking Llc | System for delivering and enabling interactivity with images |
US7298378B1 (en) | 2004-12-13 | 2007-11-20 | Hagenbuch Andrew M | Virtual reality universe realized as a distributed location network |
Also Published As
Publication number | Publication date |
---|---|
US6515652B2 (en) | 2003-02-04 |
TW516685U (en) | 2003-01-01 |
US20010022576A1 (en) | 2001-09-20 |
Similar Documents
Publication | Title |
---|---|
CN108062776B (en) | Camera Attitude Tracking method and apparatus | |
CN102831401B (en) | To following the tracks of without specific markers target object, three-dimensional overlay and mutual method and system | |
EP0583060A2 (en) | Method and system for creating an illusion of three-dimensionality | |
US6822648B2 (en) | Method for occlusion of movable objects and people in augmented reality scenes | |
EP0684059B1 (en) | Method and apparatus for the display of video images | |
CN1698357B (en) | Method for displaying an output image on an object | |
US5790713A (en) | Three-dimensional computer graphics image generator | |
Thomas et al. | Using augmented reality to visualise architecture designs in an outdoor environment | |
US6219444B1 (en) | Synthesizing virtual two dimensional images of three dimensional space from a collection of real two dimensional images | |
US20020190987A1 (en) | Image display | |
CN105869216A (en) | Method and apparatus for presenting object target | |
GB2256567A (en) | Modelling system for imaging three-dimensional models | |
GB2099269A (en) | Method of and apparatus for compiling digital image information | |
GB2100100A (en) | A computer generated imagery system | |
CA2126921A1 (en) | Apparatus and method for producing picture data based on two-dimensional and three-dimensional picture data producing instructions | |
Tatzgern et al. | Exploring real world points of interest: Design and evaluation of object-centric exploration techniques for augmented reality | |
CN104537705A (en) | Augmented reality based mobile platform three-dimensional biomolecule display system and method | |
US4831557A (en) | Image composing apparatus | |
IL112940A (en) | Apparatus and method for simulating a terrain and objects thereabove | |
JPH0793579A (en) | Formation system of three-dimensional simulation image | |
CN111161398A (en) | Image generation method, device, equipment and storage medium | |
CN114092670A (en) | Virtual reality display method, equipment and storage medium | |
US20020057270A1 (en) | Virtual reality method | |
CN1979508A (en) | Simulated humanbody channel collateral cartoon presenting system capable of excuting by computer and method therefor | |
KR101940499B1 (en) | Puzzle completion type augmented reality application system, and application, articles and electronic device therefor |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: ACER INCORPORATED, TAIWAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAO-YU, TSAO;REEL/FRAME:012451/0048; Effective date: 20011026 |
AS | Assignment | Owner name: WISTRON CORP., TAIWAN; Free format text: 1/2 RIGHT, TITLE & INTEREST;ASSIGNOR:ACER INC.;REEL/FRAME:013254/0417; Effective date: 20021106 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |