CN102989174B - Obtaining input for controlling the running of a game program - Google Patents
Obtaining input for controlling the running of a game program
- Publication number
- CN102989174B CN102989174B CN201210496712.8A CN201210496712A CN102989174B CN 102989174 B CN102989174 B CN 102989174B CN 201210496712 A CN201210496712 A CN 201210496712A CN 102989174 B CN102989174 B CN 102989174B
- Authority
- CN
- China
- Prior art keywords
- controller
- information
- input
- input information
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
- H04R3/005—Circuits for transducers, loudspeakers or microphones for combining the signals of two or more microphones
Abstract
A method of obtaining input for controlling the running of a game program is disclosed. In an embodiment of the present invention, controller path information obtained from inertial, image-capture, and acoustic sources may be mixed prior to analysis for gesture recognition.
Description
Claim of Priority
This application claims the benefit of the following applications: U.S. Patent Application No. 11/381,729, to Xiao Dong Mao, entitled "Ultra Small Microphone Array" (attorney docket SCEA05062US00), filed May 4, 2006; U.S. Patent Application No. 11/381,728, to Xiao Dong Mao, entitled "Echo and Noise Cancellation" (attorney docket SCEA05064US00), filed May 4, 2006; U.S. Patent Application No. 11/381,725, to Xiao Dong Mao, entitled "Methods and Apparatus for Targeted Sound Detection" (attorney docket SCEA05072US00), filed May 4, 2006; U.S. Patent Application No. 11/381,727, to Xiao Dong Mao, entitled "Noise Removal for Electronic Device with Far Field Microphone on Console" (attorney docket SCEA05073US00), filed May 4, 2006; U.S. Patent Application No. 11/381,724, to Xiao Dong Mao, entitled "Methods and Apparatus for Targeted Sound Detection and Characterization" (attorney docket SCEA05079US00), filed May 4, 2006; and U.S. Patent Application No. 11/381,721, to Xiao Dong Mao, entitled "Selective Sound Source Listening in Conjunction with Computer Interactive Processing" (attorney docket SCEA04005JUMBOUS), filed May 4, 2006; all of which are incorporated herein by reference.
This application also claims the benefit of: co-pending Application No. 11/418,988, to Xiao Dong Mao, entitled "Methods and Apparatuses for Adjusting a Listening Area for Capturing Sounds" (attorney docket SCEA-00300), filed May 4, 2006; co-pending Application No. 11/418,989, to Xiao Dong Mao, entitled "Methods and Apparatuses for Capturing an Audio Signal Based on a Visual Image" (attorney docket SCEA-00400), filed May 4, 2006; co-pending Application No. 11/429,047, to Xiao Dong Mao, entitled "Methods and Apparatuses for Capturing an Audio Signal Based on a Location of the Signal" (attorney docket SCEA-00500), filed May 4, 2006; co-pending Application No. 11/429,133, to Richard Marks et al., entitled "Selective Sound Source Listening in Conjunction with Computer Interactive Processing" (attorney docket SCEA04005US01-SONYP045), filed May 4, 2006; and co-pending Application No. 11/429,414, to Richard Marks et al., entitled "Computer Image and Audio Processing of Intensity and Input Devices for Interfacing with a Computer Program" (attorney docket SONYP052), filed May 4, 2006; the entire disclosures of all of which are incorporated herein by reference.
This application also claims the benefit of: U.S. Patent Application No. 11/382,031, entitled "Multi-Input Game Control Mixer" (attorney docket SCEA06MXR1), filed May 6, 2006; U.S. Patent Application No. 11/382,032, entitled "System for Tracking User Manipulations Within an Environment" (attorney docket SCEA06MXR2), filed May 6, 2006; U.S. Patent Application No. 11/382,033, entitled "System, Method, and Apparatus for Three-Dimensional Input Control" (attorney docket SCEA06INRT1), filed May 6, 2006; U.S. Patent Application No. 11/382,035, entitled "Inertially Trackable Hand-Held Controller" (attorney docket SCEA06INRT2), filed May 6, 2006; U.S. Patent Application No. 11/382,036, entitled "Method and System for Applying Gearing Effects to Visual Tracking" (attorney docket SONYP058A), filed May 6, 2006; U.S. Patent Application No. 11/382,041, entitled "Method and System for Applying Gearing Effects to Inertial Tracking" (attorney docket SONYP058B), filed May 7, 2006; U.S. Patent Application No. 11/382,038, entitled "Method and System for Applying Gearing Effects to Acoustic Tracking" (attorney docket SONYP058C), filed May 6, 2006; U.S. Patent Application No. 11/382,040, entitled "Method and System for Applying Gearing Effects to Multi-Channel Mixed Input" (attorney docket SONYP058D), filed May 7, 2006; U.S. Patent Application No. 11/382,034, entitled "Scheme for Detecting and Tracking User Manipulation of a Game Controller Body" (attorney docket 86321 SCEA05082US00), filed May 6, 2006; U.S. Patent Application No. 11/382,037, entitled "Scheme for Translating Movements of a Hand-Held Controller into Inputs for a System" (attorney docket 86324), filed May 6, 2006; U.S. Patent Application No. 11/382,043, entitled "Detectable and Trackable Hand-Held Controller" (attorney docket 86325), filed May 7, 2006; U.S. Patent Application No. 11/382,039, entitled "Method for Mapping Movements of a Hand-Held Controller to Game Commands" (attorney docket 86326), filed May 7, 2006; U.S. Design Patent Application No. 29/259,349, entitled "Controller with Infrared Port" (attorney docket SCEA06007US00), filed May 6, 2006; U.S. Design Patent Application No. 29/259,350, entitled "Controller with Tracking Sensors" (attorney docket SCEA06008US00), filed May 6, 2006; U.S. Patent Application No. 60/798,031, entitled "Dynamic Target Interface" (attorney docket SCEA06009US00), filed May 6, 2006; U.S. Design Patent Application No. 29/259,348, entitled "Tracked Controller Device" (attorney docket SCEA06010US00), filed May 6, 2006; and U.S. Patent Application No. 11/382,250, entitled "Obtaining Input for Controlling Execution of a Game Program" (attorney docket SCEA06018US00), filed May 8, 2006; all of which are incorporated herein by reference in their entirety.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/430,594, to Gary Zalewski and Riley R. Russell, entitled "System and Method for Using User's Audio Environment to Select Advertising" (attorney docket SCEA05059US00), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/430,593, to Gary Zalewski and Riley R. Russell, entitled "Using Audio/Visual Environment to Select Ads on Game Platform" (attorney docket SCEAUS3.0-011), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,259, to Gary Zalewski et al., entitled "Method and Apparatus for Determining Lack of User Activity in Relation to a System" (attorney docket 86327), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,258, to Gary Zalewski et al., entitled "Method and Apparatus for Determining an Activity Level of a User in Relation to a System" (attorney docket 86328), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,251, to Gary Zalewski et al., entitled "Hand-Held Controller Having Detectable Elements for Tracking Purposes" (attorney docket 86329), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,252, entitled "Tracking Device for Use in Obtaining Information for Controlling Game Program Execution" (attorney docket SCEA06INRT3), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of co-pending U.S. Patent Application No. 11/382,256, entitled "Tracking Device with Sound Emitter for Use in Obtaining Information for Controlling Game Program Execution" (attorney docket SCEA06ACRA2), filed May 8, 2006, the entire disclosure of which is incorporated herein by reference.
This application also claims the benefit of the following co-pending U.S. design patent applications, all filed May 8, 2006, the entire disclosures of which are incorporated herein by reference: No. 29/246,744, entitled "Video Game Controller Front Face" (attorney docket SCEACTR-D3); No. 29/246,743, entitled "Video Game Controller" (attorney docket SCEACTRL-D2); No. 29/246,767, entitled "Video Game Controller" (attorney docket SONYP059A); No. 29/246,768, entitled "Video Game Controller" (attorney docket SONYP059B); No. 29/246,763, entitled "Ergonomic Game Controller Device with LEDs and Optical Ports" (attorney docket PA3760US); No. 29/246,759, entitled "Game Controller Device with LEDs and Optical Ports" (attorney docket PA3761US); No. 29/246,765, entitled "Design for an Optical Game Controller Interface" (attorney docket PA3762US); No. 29/246,766, entitled "Dual Grip Game Control Device with LEDs and Optical Ports" (attorney docket PA3763US); No. 29/246,764, entitled "Game Interface Device with LEDs and Optical Ports" (attorney docket PA3764US); and No. 29/246,762, entitled "Ergonomic Game Interface Device with LEDs and Optical Ports" (attorney docket PA3765US).
Cross-Reference to Related Applications
This application is related to U.S. Provisional Patent Application No. 60/718,145, entitled "Audio, Video, Simulation, and User Interface Paradigms", filed September 15, 2005, which is incorporated herein by reference.
This application is related to: U.S. Patent Application No. 10/207,677, entitled "Man-Machine Interface Using a Deformable Device", filed July 27, 2002; U.S. Patent Application No. 10/650,409, entitled "Audio Input System", filed August 27, 2003; U.S. Patent Application No. 10/663,236, entitled "Method and Apparatus for Adjusting a View of a Scene Being Displayed According to Tracked Head Motion", filed September 15, 2003; U.S. Patent Application No. 10/759,782, entitled "Method and Apparatus for Light Input Device", filed January 16, 2004; U.S. Patent Application No. 10/820,469, entitled "Method and Apparatus to Detect and Remove Audio Disturbances", filed April 7, 2004; U.S. Patent Application No. 11/301,673, entitled "Method for Using Relative Head and Hand Positions to Enable a Pointing Interface via Camera Tracking", filed December 12, 2005; and U.S. Patent Application No. 11/165,473, entitled "Delay Matching in Audio/Video Systems", filed June 22, 2005; all of which are incorporated herein by reference.
This application is also related to co-pending U.S. Patent Application No. 11/400,997, filed April 10, 2006, entitled "System and Method for Obtaining User Information from Voices" (attorney docket SCEA05040US00), the entire disclosure of which is incorporated herein by reference.
Technical Field
The present invention relates generally to man-machine interfaces, and more particularly to processing multi-channel input used to track user manipulation of one or more controllers.
Background
Computer entertainment systems typically include a hand-held controller, game controller, or other controller. A user or player uses the controller to send commands or other instructions to the entertainment system in order to control a video game or other simulation being played. For example, the controller may be equipped with manipulators, such as a joystick, that are operated by the user. The manipulated variable of the joystick is converted from an analog value into a digital value, which is sent to the game host. The controller may also be equipped with buttons that can be operated by the user.
It is with respect to these and other background information factors that the present invention has been developed.
Brief Description of the Drawings
The teachings of the present invention can readily be understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
Fig. 1 is a pictorial diagram illustrating a video game system that operates in accordance with an embodiment of the present invention;
Fig. 2 is a perspective view of a controller made in accordance with an embodiment of the present invention;
Fig. 3 is a three-dimensional schematic diagram illustrating an accelerometer that may be used with a controller according to an embodiment of the present invention;
Fig. 4 is a block diagram of a system for mixing various control inputs according to an embodiment of the present invention;
Fig. 5A is a block diagram of a portion of the video game system of Fig. 1;
Fig. 5B is a flow diagram of a method for tracking a controller of a video game system according to an embodiment of the present invention;
Fig. 5C is a flow diagram illustrating a method of utilizing position and/or orientation information during game play on a video game system according to an embodiment of the present invention;
Fig. 6 is a block diagram illustrating a video game system according to an embodiment of the present invention; and
Fig. 7 is a block diagram of a Cell processor implementation of a video game system according to an embodiment of the present invention.
Description of Specific Embodiments
Although the following detailed description contains many specific details for purposes of illustration, one of ordinary skill in the art will appreciate that many variations and alterations to the following details are within the scope of the invention. Accordingly, the exemplary embodiments of the invention described below are set forth without any loss of generality to, and without imposing limitations upon, the claimed invention.
Various embodiments of the methods, apparatus, schemes and systems described herein provide for the detection, capture and tracking of the movements, motions and/or manipulations of the entire controller body itself by the user. The detected movements, motions and/or manipulations of the entire controller body by the user may be used as additional commands to control various aspects of the game or other simulation being played.
Detecting and tracking a user's manipulations of a game controller body may be implemented in different ways. For example, an inertial sensor, such as an accelerometer or gyroscope, or an image capture unit, such as a digital camera, can be used with the computer entertainment system to detect motions of the hand-held controller body and transfer them into actions in a game. Examples of tracking a controller with an inertial sensor are described in U.S. Patent Application No. 11/382,033, entitled "System, Method, and Apparatus for Three-Dimensional Input Control" (attorney docket SCEA06INRT1), which is incorporated herein by reference. Examples of tracking a controller using image capture are described in U.S. Patent Application No. 11/382,034, entitled "Scheme for Detecting and Tracking User Manipulation of a Game Controller Body" (attorney docket SCEA05082US00), which is incorporated herein by reference. In addition, the controller and/or the user may also be tracked acoustically using a microphone array and appropriate signal processing. Examples of such acoustic tracking are described in U.S. Patent Application No. 11/381,721, which is incorporated herein by reference.
Acoustic sensing, inertial sensing and image capture can be used individually or in any combination to detect many different types of motions of the controller, such as up-and-down movements, twisting movements, side-to-side movements, jerking movements, wand-like motions, plunging motions, and so on. Such motions may correspond to various commands such that the motions are transferred into actions in a game. Detecting and tracking the user's manipulations of a game controller body can be used to implement many different types of games, simulations, and the like, which allow the user to, for example, engage in a sword or light-saber fight, use a wand to trace the shape of items, engage in many different types of sporting events, engage in on-screen fights or other encounters, and so forth. A game program may be configured to track the motion of the controller and recognize certain pre-recorded gestures from the tracked motion. Recognition of one or more of these gestures may trigger a change in game state.
In embodiments of the present invention, the controller path information obtained from these different sources may be mixed prior to analysis for gesture recognition. The tracking data from the different sources (e.g., acoustic, inertial and image capture) may be mixed in a way that improves the likelihood of recognizing a gesture.
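One simple way to picture this kind of mixing is a confidence-weighted blend of per-channel estimates, so that a noisy channel contributes less to the value handed to the gesture recognizer. The sketch below is purely illustrative (the function name, the use of scalar estimates, and the confidence weights are all assumptions, not the patent's method):

```python
# Hypothetical sketch of pre-recognition mixing: each tracking channel
# (e.g. acoustic, inertial, image capture) reports an estimate and a
# confidence; the mixer returns the confidence-weighted average.

def mix_channels(estimates):
    """estimates: list of (value, confidence) pairs, one per channel.
    Returns the confidence-weighted average, or None if all channels
    report zero confidence."""
    total = sum(conf for _, conf in estimates)
    if total == 0:
        return None
    return sum(value * conf for value, conf in estimates) / total

# Example: the inertial and image channels roughly agree; the acoustic
# channel is an outlier but carries low confidence, so the mixed value
# stays close to the agreeing pair.
mixed = mix_channels([(1.0, 0.9), (1.1, 0.8), (3.0, 0.1)])
```

In a real system each channel would report vectors (position, orientation) rather than scalars, but the weighting idea carries over component-wise.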
Referring to Fig. 1, a system 100 operating in accordance with an embodiment of the present invention is shown. As illustrated, a computer entertainment console 102 may be coupled to a television or other video display 104 to display the images of a video game or other simulation thereon. The game or other simulation may be stored on a DVD, CD, flash memory, USB memory, or other storage medium 106 that is inserted into the console 102. A user or player 108 manipulates a game controller 110 to control the video game or other simulation. As seen in Fig. 2, the game controller 110 includes an inertial sensor 112 that produces signals in response to the position, motion, orientation or change in orientation of the game controller 110. In addition to the inertial sensor, the game controller 110 may include conventional control input devices, e.g., joysticks 111, buttons 113, R1, L1, and the like.
During operation, the user 108 physically moves the controller 110. For example, the controller 110 may be moved in any direction by the user 108, e.g., up, down, to one side, to the other side, twisted, rolled, shaken, jerked, plunged, and so on. These movements of the controller 110 itself may be detected and captured by tracking through analysis of signals from the inertial sensor 112, in a manner described below.
Referring again to Fig. 1, the system 100 may optionally include a camera or other video image capturing device 114, which may be positioned so that the controller 110 is within the camera's field of view 116. Analysis of images from the image capturing device 114 may be used in conjunction with analysis of the data from the inertial sensor 112. As shown in Fig. 2, the controller 110 may optionally be equipped with light sources such as light emitting diodes (LEDs) 202, 204, 206, 208 to facilitate tracking by video analysis. These may be mounted to the body of the controller 110. As used herein, the term "body" describes the part of the game controller 110 that one would hold (or wear, if it were a wearable game controller).
Analysis of such video images for the purpose of tracking the controller 110 is described, e.g., in U.S. Patent Application No. 11/382,034, to inventor Gary M. Zalewski, entitled "Scheme for Detecting and Tracking User Manipulation of a Game Controller Body" (attorney docket SCEA05082US00), which is incorporated herein by reference. The console 102 may include an acoustic transducer, such as a microphone array 118. The controller 110 may also include an acoustic signal generator 210 (e.g., a speaker) to provide a source of sound to facilitate acoustic tracking of the controller 110 with the microphone array 118 and appropriate acoustic signal processing, e.g., as described in U.S. Patent Application No. 11/381,724, which is incorporated herein by reference.
In general, signals from the inertial sensor 112 are used to generate position and orientation data for the controller 110. Such data may be used to calculate many physical aspects of the movement of the controller 110, such as its acceleration and velocity along any axis, its tilt, pitch, yaw, roll, and any telemetry points of the controller 110. As used herein, "telemetry" generally refers to remote measurement of information of interest and reporting of that information to a system or to the system's designer or operator.
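As a rough illustration of how velocity and displacement telemetry can be derived from inertial-sensor samples, one can numerically integrate acceleration twice along an axis. This is a generic numerical sketch under assumed sampling conditions (fixed time step `dt`, known starting velocity/position), not the patent's specific computation:

```python
# Derive per-axis velocity and displacement from acceleration samples
# (m/s^2) taken every dt seconds, using trapezoidal integration.

def integrate(accels, dt, v0=0.0, x0=0.0):
    """Return final (velocity, displacement) after integrating the
    acceleration samples; v0 and x0 are the initial conditions."""
    v, x = v0, x0
    for a0, a1 in zip(accels, accels[1:]):
        v_next = v + 0.5 * (a0 + a1) * dt    # dv = mean accel * dt
        x += 0.5 * (v + v_next) * dt         # dx = mean velocity * dt
        v = v_next
    return v, x
```

In practice, raw accelerometer data would first be filtered and gravity-compensated, since integration accumulates sensor bias as drift.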
The ability to detect and track the movements of the controller 110 makes it possible to determine whether any predefined movements of the controller 110 are performed. That is, certain movement patterns or gestures of the controller 110 may be predefined and used as input commands for the game or other simulation. For example, a plunging downward gesture of the controller 110 may be defined as one command, a twisting gesture of the controller 110 may be defined as another command, a shaking gesture of the controller 110 may be defined as yet another command, and so on. In this way, the manner in which the user 108 physically moves the controller 110 serves as another input for controlling the game, which provides a more stimulating and entertaining experience for the user.
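The idea of predefined gestures acting as commands can be sketched as a lookup from motion patterns to game commands. Everything below (the token vocabulary, pattern table, and command names) is invented for illustration; real gesture recognition over noisy sensor data would use statistical matching rather than exact token comparison:

```python
# Toy mapping of predefined controller gestures to input commands:
# each gesture is a named sequence of coarse motion tokens, and a
# tracked motion sequence triggers the first command whose pattern
# appears in it as a contiguous subsequence.

GESTURE_COMMANDS = {
    ("down", "down"): "PLUNGE_ATTACK",     # plunging downward gesture
    ("twist_l", "twist_r"): "OPEN_DOOR",   # twisting gesture
    ("left", "right", "left"): "SHAKE",    # shaking gesture
}

def recognize(motions):
    """Return the command for the first defined pattern found as a
    contiguous subsequence of the tracked motion tokens, else None."""
    for pattern, command in GESTURE_COMMANDS.items():
        n = len(pattern)
        for i in range(len(motions) - n + 1):
            if tuple(motions[i:i + n]) == pattern:
                return command
    return None
```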
Exemplarily rather than restriction, inertial sensor 112 can be accelerometer.Fig. 3 illustrate take such as to be determined by the simple quality of spring 306,308,310,312 and framework 304 Elastic Coupling at four points 302 the example of accelerometer 300 of form.Pitch axis and roll axis (being represented by X and Y respectively) are arranged in the plane with frame intersection.Yaw axis Z is orientated vertical with the plane comprising pitch axis X and roll axis Y.Framework 304 can be installed to controller 110 by any appropriate ways.When framework 304 (and game console 110) accelerates and/or rotates, quality is determined 302 can relative to framework 304 displacement, and spring 306,308,310,312 can extend in the following manner or compress, translation which depends on pitching and/or rolling and/or driftage and/or the quantity spun up and direction and/or angle.The displacement of mass 302 and/or the compression of spring 306,308,310,312 or elongation can adopt such as suitable sensor 314,316,318,320 to sense, and are converted into known or that predetermined way is relevant to the acceleration amount of pitching and/or rolling signal.
There are many different ways to track the position of the mass and/or the forces exerted on it, including resistive strain gauge material, photonic sensors, magnetic sensors, Hall-effect devices, piezoelectric devices, capacitive sensors, and the like. Embodiments of the invention may include any number and type of sensors, or combinations of sensor types. By way of example and not limitation, the sensors 314, 316, 318, 320 may be gap-closing capacitive electrodes placed above the mass 302. A capacitance between the mass and each electrode changes as the position of the mass changes relative to that electrode. Each electrode may be connected to a circuit that produces a signal related to the capacitance of the mass 302 relative to the electrode (and thus to the proximity of the mass to the electrode). In addition, the springs 306, 308, 310, 312 may include resistive strain gauge sensors that produce signals related to the compression or elongation of the springs.
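The relationship between spring deflection and sensed acceleration described above can be sketched in a few lines. This is a hedged illustration, not code from the description; the spring constant and mass values are arbitrary assumptions chosen only to make the arithmetic concrete.

```python
# Minimal sketch of recovering acceleration from the displacement of the
# spring-mounted mass of FIG. 3. The spring constant and mass below are
# illustrative assumptions, not values given in the description.

def acceleration_from_displacement(displacement_m: float,
                                   spring_k_n_per_m: float = 50.0,
                                   mass_kg: float = 0.001) -> float:
    """Hooke's law (F = k*x) combined with F = m*a gives a = k*x / m."""
    return spring_k_n_per_m * displacement_m / mass_kg
```

In practice the sensors 314, 316, 318, 320 would report a capacitance or strain value that is first converted to a displacement; the linear relationship above then yields the acceleration along the corresponding axis.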
In certain embodiments, the frame 304 may be gimbal-mounted to the controller 110 so that the accelerometer 300 maintains a fixed orientation with respect to the pitch and/or roll and/or yaw axes. In such a manner, the controller axes X, Y, Z may be mapped directly to corresponding axes in real space without having to take into account a tilt of the controller axes relative to the real-space coordinate axes.
As discussed above, data from inertial, image capture, and acoustic sources may be analyzed to generate a path that tracks the position and/or orientation of the controller 110. As shown in the block diagram of FIG. 4, a system 400 according to an embodiment of the invention may include an inertial analyzer 402, an image analyzer 404, and an acoustic analyzer 406. Each of these analyzers receives signals from a sensed environment 401. The analyzers 402, 404, 406 may be implemented in hardware, in software (or firmware), or in some combination of two or more of these. Each of the analyzers generates tracking information related to the position and/or orientation of an object of interest. By way of example, the object of interest may be the controller 110 referred to above. The image analyzer 404 may be implemented in conjunction with, and operate according to, the methods described in U.S. Patent Application 11/382,034 (Attorney Docket SCEA05082US00). The inertial analyzer 402 may be implemented in conjunction with, and operate according to, the methods described in U.S. Patent Application 11/382,033, entitled "System, Method and Apparatus for Three-Dimensional Input Control" (Attorney Docket SCEA06INRT1). The acoustic analyzer 406 may be implemented in conjunction with, and operate according to, the methods described in U.S. Patent Application 11/381,724.
The analyzers 402, 404, 406 may be regarded as being associated with different channels of input of position and/or orientation information. The mixer 408 may accept multiple input channels, and such channels may contain sample data characterizing the sensed environment 401, typically from the perspective of the channel. The position and/or orientation information generated by the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406 may be coupled into the input of the mixer 408. The mixer 408 and the analyzers 402, 404, 406 may be queried by a game software program 410, and may be configured to interrupt the game software in response to events. Events may include gesture recognition events, gearing changes, configuration changes, setting noise levels, setting sampling rates, changing mapping chains, etc., examples of which are discussed below. The mixer 408 may be implemented in conjunction with, and operate according to, the methods described herein.
As noted above, signals from different input channels, e.g., the inertial sensor, video images, and/or acoustic sensors, may be analyzed by the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406, respectively, to determine the motion and/or orientation of the controller 110 during play of a video game according to methods of the invention. Such methods may be implemented as a series of processor-executable program code instructions stored in a processor-readable medium and executed on a digital processor. For example, as shown in FIG. 5A, the video game system 100 may include a console 102 having the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406 implemented either in hardware or in software. By way of example, the analyzers 402, 404, 406 may be implemented as software instructions running on a suitable processor unit 502. By way of example, the processor unit 502 may be a digital processor, e.g., a microprocessor of a type commonly used in video game consoles. A portion of the instructions may be stored in a memory 506. Alternatively, the inertial analyzer 402, the image analyzer 404, and the acoustic analyzer 406 may be implemented in hardware, e.g., as an application-specific integrated circuit (ASIC). Such analyzer hardware may be located on the controller 110 or on the console 102, or may be remotely located elsewhere. In hardware implementations, the analyzers 402, 404, 406 may be programmable in response to external signals, e.g., from the processor 502 or some other remotely located source connected by a USB cable, a wireless connection, or over a network.
The inertial analyzer 402 may include or implement instructions that analyze the signals generated by the inertial sensor 112 and utilize information regarding the position and/or orientation of the controller 110. Similarly, the image analyzer 404 may implement instructions that analyze images captured by the image capture unit 114. In addition, the acoustic analyzer may implement instructions that analyze sounds captured by the microphone array 118. As shown in the flow diagram 510 of FIG. 5B, these signals and/or images may be received by the analyzers 402, 404, 406, as indicated at block 512. The signals and/or images may be analyzed by the analyzers 402, 404, 406 to determine inertial tracking information 403, image tracking information 405, and acoustic tracking information 407 regarding the position and/or orientation of the controller 110, as indicated at block 514. The tracking information 403, 405, 407 may relate to one or more degrees of freedom. Six degrees of freedom are preferably tracked to characterize the manipulation of the controller 110 or other tracked object. Such degrees of freedom may relate to the tilt, yaw, and roll of the controller, and to its position, velocity, or acceleration along the x, y, and z axes.
As indicated at block 516, the mixer 408 mixes the inertial information 403, the image information 405, and the acoustic information 407 to generate refined position and/or orientation information 409. By way of example, the mixer 408 may apply different weights to the inertial, image, and acoustic tracking information 403, 405, 407 according to game or environmental conditions, and take a weighted average. In addition, the mixer 408 may include its own mixer analyzer 412, which analyzes the combined position/orientation information and generates its own resulting "mixer" information comprising combinations of the information generated by the other analyzers.
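The weighted-average mixing just described can be sketched roughly as follows. This is a hedged illustration only; the description does not prescribe an implementation, and the function name, parameter shape, and default weights are assumptions.

```python
def mix_channels(inertial: float, image: float, acoustic: float,
                 weights: tuple = (0.5, 0.3, 0.2)) -> float:
    """Weighted average of one scalar tracking parameter (e.g., an x
    position estimate) reported by the inertial, image, and acoustic
    channels. The weights may vary with game or environmental conditions."""
    w_inertial, w_image, w_acoustic = weights
    total = w_inertial + w_image + w_acoustic
    return (w_inertial * inertial + w_image * image + w_acoustic * acoustic) / total
```

A mixer along these lines would apply such a function per tracking parameter, with a different weight triple for each, to produce the refined information 409.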
In one embodiment of the invention, the mixer 408 may assign distribution values to the tracking information 403, 405, 407 from the analyzers 402, 404, 406. As noted above, certain sets of input control data may be averaged. In the present embodiment, however, the input control data are assigned values before being averaged, whereby input control data from some analyzers are given greater analytical importance than input control data from other analyzers.
The mixer 408 may undertake a number of functions in the context of the present system, including observation, correction, stabilization, derivation, combination, routing, mixing, reporting, buffering, interrupting other processes, and analysis. These may be performed with respect to one or more of the tracking information 403, 405, 407 received from the analyzers 402, 404, 406. Although each of the analyzers 402, 404, 406 may receive and/or derive certain tracking information, the mixer 408 may be implemented to optimize the use of the received tracking information 403, 405, 407 and to generate refined tracking information 409.
The analyzers 402, 404, 406 and the mixer 408 are preferably configured to provide tracking information in a similar output format. Tracking information parameters from any of the analyzer elements 402, 404, 406 may be mapped to a single parameter in an analyzer. Alternatively, the mixer 408 may form tracking information for any of the analyzers 402, 404, 406 by processing one or more tracking information parameters from one or more of the analyzers 402, 404, 406. The mixer may combine two or more elements of tracking information of the same parameter type taken from the analyzers 402, 404, 406, and/or perform functions across multiple parameters of tracking information generated by the analyzers, to create a synthetic set of outputs having the benefit of having been generated from multiple input channels.
The refined tracking information 409 may be used during play of a video game with the system 100, as indicated at block 518. In certain embodiments, the position and/or orientation information may be used in relation to gestures made by the user 108 during game play. In certain embodiments, the mixer 408 may operate in conjunction with a gesture recognizer 505 to associate at least one action in a game environment with one or more user actions from the user (e.g., manipulation of the controller in space).
As shown in the flow diagram 520 of FIG. 5C, the position and/or orientation information may be used to track a path of the controller 110, as indicated at block 522. By way of example and not limitation, the path may include a set of points representing the position of the center of mass of the controller with respect to some coordinate system. Each position point may be represented by one or more coordinates, e.g., X, Y, and Z coordinates in a Cartesian coordinate system. A time may be associated with each point on the path, so that both the shape of the path and the progress of the controller along the path may be monitored. In addition, each point in the set may be associated with data representing an orientation of the controller, e.g., one or more angles of rotation of the controller about its center of mass. Furthermore, each point on the path may be associated with values of the velocity and acceleration of the controller's center of mass, and of the angular velocity and angular acceleration of the controller's rotation about its center of mass.
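One sample on such a path could be represented as a small record like the following. This is a hedged sketch; the class name and field layout are assumptions for illustration, since the description specifies only which quantities are associated with each point.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PathPoint:
    """One sample on the tracked controller path: a time, the position of
    the center of mass, the orientation angles, and the motion values
    associated with the point, as described in the text."""
    t: float                                    # time associated with the point
    position: Tuple[float, float, float]        # X, Y, Z of the center of mass
    orientation: Tuple[float, float, float]     # pitch, roll, yaw in radians
    velocity: Tuple[float, float, float]        # linear velocity of the center of mass
    angular_velocity: Tuple[float, float, float]  # rotation rates about the center of mass
```

A tracked path would then simply be a time-ordered list of such records, which is the form a gesture comparison step can walk through index by index.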
As indicated at block 524, the tracked path may be compared to one or more stored paths corresponding to known and/or pre-recorded gestures 508 that are relevant to the context of the video game being played. The recognizer 505 may be configured to recognize a user, to process audio-authenticated gestures, and the like. For example, a user may be identified by the recognizer 505 through a gesture, and the gesture may be specific to that user. Such a specific gesture may be recorded and included among the pre-recorded gestures 508 stored in the memory 506. The recording process may optionally store audio generated during the recording of the gesture. The sensed environment is sampled into a multi-channel analyzer and processed. The processor may refer to gesture models to determine and authenticate and/or identify a user or object based on voice or acoustic patterns, with high accuracy and performance.
As shown in FIG. 5A, data 508 representing gestures may be stored in the memory 506. Examples of gestures include, but are not limited to: throwing an object, such as a ball; swinging an object, such as a bat or golf club; pumping a hand pump; opening or closing a door or window; turning a steering wheel or other vehicle control; martial arts moves, such as punches; sanding movements; waxing on and waxing off; painting a house; shaking hands; making laughing sounds; rolling; throwing a football; crank-turning movements; 3D mouse movements; scrolling movements; movements with known profiles; any recordable movement; movement back and forth along any vector, i.e., pumping a tire, but at some arbitrary orientation in space; movement along a path; movement with precise stop and start times; any time-based user manipulation that can be recorded, tracked, and repeated within a noise floor or along splines; and the like. Each of these gestures may be pre-recorded from path data and stored as a time-based model. The comparison of the path to the stored gestures may begin with an assumption of steady state; if the path deviates from steady state, the path may be compared to the stored gestures by a process of elimination. At block 526, if there is no match, the analyzer may continue tracking the path of the controller 110 at block 522. If there is a sufficient match between the path (or a portion thereof) and a stored gesture, the state of the game may be changed, as indicated at 528. Changes of game state may include, but are not limited to, interrupts, sending control signals, changing variables, and the like.
Here is one example of how this may occur. Upon determining that the controller 110 has left a steady state, the analyzer 402, 404, 406, or 412 tracks the movement of the controller 110. As long as the path of the controller 110 conforms to a path defined in the stored gesture models 508, those gestures remain possible "hits." If the path of the controller 110 deviates (within the noise tolerance setting) from any gesture model 508, that gesture model is removed from the hit list. Each gesture reference model includes the time base on which the gesture was recorded. The analyzer 402, 404, 406, or 412 compares the controller path data to the stored gestures 508 at the appropriate time index. The occurrence of a steady state condition resets the clock. When deviating from steady state (i.e., when movement is tracked outside of the noise threshold), the hit list is loaded with all potential gesture models. The clock is started, and the movement of the controller is compared against the hit list. Again, the comparison is a walk-through in time. If any gesture in the hit list reaches the end of its gesture model, then it is a hit.
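The hit-list walk-through above can be sketched roughly as follows. This is a hedged illustration using scalar path samples; the description does not specify a data representation, so the function name, model format, and tolerance handling are assumptions.

```python
def match_gestures(path, gesture_models, tolerance=0.1):
    """Prune a hit list of candidate gestures as the controller path unfolds.

    path           -- list of sampled path values, one per time index
    gesture_models -- dict mapping gesture name to a recorded list of
                      values on the same time base
    Returns the names of gestures whose models were walked through to the
    end without ever deviating beyond the noise tolerance.
    """
    hits = set(gesture_models)            # load all models into the hit list
    completed = []
    for i, sample in enumerate(path):
        for name in list(hits):
            model = gesture_models[name]
            if i >= len(model):
                continue                   # model already fully walked through
            if abs(sample - model[i]) > tolerance:
                hits.discard(name)         # deviated: remove from the hit list
            elif i == len(model) - 1:
                completed.append(name)     # reached the end of the gesture: a hit
    return completed
```

A real implementation would compare multi-dimensional path points rather than scalars and would reset on a return to steady state, but the prune-and-complete structure would be the same.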
In certain embodiments, the mixer 408 and/or the individual analyzers 402, 404, 406, 412 may inform a game program when certain events occur. Examples of such events include the following:
Interrupt upon reaching an acceleration point (X and/or Y and/or Z axis). In certain game situations, the analyzer may notify or interrupt a routine within the game program when the acceleration of the controller changes at an inflection point. For example, the user 108 may use the controller 110 to control a game avatar representing a quarterback in a football simulation game. The analyzer may track the controller (representing the football) via a path generated from the signals from the inertial sensor 112. A particular change in the acceleration of the controller 110 may signal the release of the ball. At that point, the analyzer may trigger another routine within the program (e.g., a physics simulation package) to simulate the trajectory of the football based on the position and/or velocity and/or orientation of the controller at the point of release.
Interrupt upon recognition of a new gesture.
In addition, the analyzer may be configured by one or more inputs. Examples of such inputs include, but are not limited to:
Set noise level (X, Y, or Z axis). The noise level may be a reference tolerance used when analyzing jitter of the user's hands in the game.
Set sampling rate. As used herein, "sampling rate" may refer to the frequency at which the analyzer samples the signals from the inertial sensor. The sampling rate may be set to oversample the signal or to average it.
Set gearing. As used herein, "gearing" generally refers to the ratio of controller movement to the movement occurring within the game. Examples of controlling such "gearing" in the context of a video game may be found in U.S. Patent Application No. 11/382,040, filed May 7, 2006 (Attorney Docket No. SONYP058D), which is incorporated herein by reference.
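The gearing ratio just defined amounts to a simple scaling of controller motion into in-game motion. The following one-liner is a hedged sketch of that idea only; the referenced application describes richer, possibly time-varying gearing schemes.

```python
def apply_gearing(controller_delta: float, gearing_ratio: float) -> float:
    """Scale a controller movement into an in-game movement. A ratio above
    1.0 amplifies the user's motion; a ratio below 1.0 damps it."""
    return controller_delta * gearing_ratio
```

For example, with a gearing ratio of 0.5, a 2-unit controller motion would translate into a 1-unit in-game motion.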
Set mapping chain. As used herein, a "mapping chain" refers to a map of gesture models. The gesture model maps may be adapted to a particular input channel (e.g., path data generated from inertial sensor signals alone) or to a hybrid channel formed in a mixer unit.
Three input channels may be served by two or more different analyzers similar to the inertial analyzer 402. Specifically, these may include: the inertial analyzer 402 as described herein; a video analyzer as described, e.g., in U.S. Patent Application 11/382,034, to inventor Gary M. Zalewski, entitled "Scheme for Detecting and Tracking User Manipulation of a Game Controller Body" (Attorney Docket SCEA05082US00), which is incorporated herein by reference; and an acoustic analyzer, e.g., as described in U.S. Patent Application 11/381,721, which is incorporated herein by reference. The analyzers may be configured with mapping chains. The mapping chains may be swapped out by the game during game play, e.g., as settings for the analyzer or for the mixer.
Referring again to block 512 of FIG. 5B, those of skill in the art will recognize that there are many ways to generate signals from the inertial sensor 112. A few examples are described herein. Referring to block 514, there are many ways to analyze the sensor signals generated in block 512 to obtain tracking information relating to the position and/or orientation of the controller 110. By way of example and not limitation, the tracking information may include, but is not limited to, information regarding the following parameters, individually or in any combination:
Controller orientation. The orientation of the controller 110 may be expressed in terms of pitch, roll, or yaw angles relative to some reference orientation, e.g., in radians. Rates of change of controller orientation (e.g., angular velocities or angular accelerations) may also be included in the position and/or orientation information. For example, where the inertial sensor 112 includes a gyroscopic sensor, controller orientation information may be obtained directly in the form of one or more output values proportional to the pitch, roll, or yaw angle.
Controller position (e.g., Cartesian coordinates X, Y, Z of the controller 110 in some frame of reference)
Controller X-axis velocity
Controller Y-axis velocity
Controller Z-axis velocity
Controller X-axis acceleration
Controller Y-axis acceleration
Controller Z-axis acceleration
It is noted that with respect to position, velocity, and acceleration, the position and/or orientation information may be expressed in terms of coordinate systems other than Cartesian. For example, cylindrical or spherical coordinates may be used for position, velocity, and acceleration. Acceleration information with respect to the X, Y, and Z axes may be obtained directly from an accelerometer-type sensor, as described herein. The X, Y, and Z accelerations may be integrated with respect to time from some initial instant to determine changes in the X, Y, and Z velocities. These velocities may be computed by adding the velocity changes to known values of the X, Y, and Z velocities at the initial instant. The X, Y, and Z velocities may in turn be integrated with respect to time to determine X, Y, and Z displacements of the controller. The X, Y, and Z positions may be determined by adding the displacements to known X, Y, and Z positions at the initial instant.
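The double integration described above can be sketched for a single axis with a simple Euler scheme. The function name and the fixed-step assumption are illustrative; real implementations would cope with variable sample intervals and drift correction.

```python
def integrate_axis(accelerations, dt, v0=0.0, x0=0.0):
    """Euler-integrate sampled acceleration on one axis: each velocity
    change is added to the last known velocity, and each resulting
    displacement is added to the last known position, as in the text."""
    v, x = v0, x0
    for a in accelerations:
        v += a * dt   # change in velocity from this sample
        x += v * dt   # displacement accumulated at the updated velocity
    return v, x
```

For instance, four samples of 1 unit/s^2 at half-second intervals, starting from rest at the origin, yield a velocity of 2 units/s.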
Steady state Y/N. This particular information indicates whether the controller is in a steady state, which may be defined as any position, and which may be subject to change. In a preferred embodiment, the steady state position may be one in which the controller is held in an approximately horizontal orientation at a level approximately even with the user's waist.
"Time since last steady state" generally refers to data related to how long a period of time has elapsed since a steady state (as referenced above) was last detected. That determination of time may, as noted previously, be calculated in real time, by processor cycles, or by sampling periods. "Time since last steady state" can be important with regard to resetting the tracking of the controller relative to an initial point, to ensure the accuracy of a character or object being mapped in the game environment. This data can also be important with regard to determining which actions/gestures might subsequently be available (whether excluded or included) in the game environment.
"Last gesture recognized" generally refers to the last gesture recognized by the gesture recognizer 505 (which may be implemented in hardware or software). The identification of the last gesture recognized can be important with regard to the fact that a previous gesture may be related to subsequently recognizable gestures, or to some other action that occurs in the game environment.
Time of the last gesture recognized.
The above outputs may be sampled at any time by a game program or software.
In one embodiment of the invention, the mixer 408 may assign distribution values to the tracking information 403, 405, 407 from the analyzers 402, 404, 406. As noted above, certain sets of input control data may be averaged. In the present embodiment, however, the input control data are assigned values before being averaged, whereby the input control data from some analyzers have greater analytical importance than the input control data from other analyzers.
For example, the mixer 408 may require tracking information related to acceleration and steady state. The mixer 408 would then receive the tracking information 403, 405, 407, as described above. The tracking information may include parameters relating to acceleration and steady state, e.g., as described above. Before averaging the data representing this information, the mixer 408 may assign distribution values to the tracking information data sets 403, 405, 407. For example, the x- and y-acceleration parameters from the inertial analyzer 402 may be weighted at a value of 90%. The x- and y-acceleration parameters from the image analyzer 404, however, may be weighted at only 10%. The acoustic analyzer tracking information 407 may be weighted at 0% as it pertains to the acceleration parameters, i.e., that data has no value.
Similarly, the Z-axis tracking information parameter from the inertial analyzer 402 may be weighted at 10%, while the image analyzer Z-axis tracking information may be weighted at 90%. The acoustic analyzer tracking information 407 may again be weighted at 0%, but steady state tracking information from the acoustic analyzer 406 may be weighted at 100%, with the tracking information from the remaining analyzers weighted at 0%.
After the appropriate distribution weights have been assigned, the input control data may be averaged in conjunction with those weights to arrive at a weighted-average input control data set, which is subsequently analyzed by the gesture recognizer 505 and associated with a particular action in the game environment. The values associated therewith may be predefined by the mixer 408 or by a particular game title. The values may also be the result of the mixer 408 identifying the particular quality of data coming from the various analyzers and thus making a dynamic adjustment, as is further discussed below. The adjustment may also be the result of building a historical knowledge base of when particular data is of a particular value in a particular environment and/or in response to the particulars of a given game title.
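The 90%/10%/0% assignments in the example above can be written out as a per-parameter weight table. This is a hedged sketch; the parameter names and the dictionary layout are illustrative assumptions, not structures from the description.

```python
# Hypothetical weight table mirroring the example above: one weight per
# analyzer for each tracking parameter. Names are illustrative only.
WEIGHTS = {
    "xy_acceleration": {"inertial": 0.9, "image": 0.1, "acoustic": 0.0},
    "z_axis":          {"inertial": 0.1, "image": 0.9, "acoustic": 0.0},
    "steady_state":    {"inertial": 0.0, "image": 0.0, "acoustic": 1.0},
}

def weighted_value(parameter: str, readings: dict) -> float:
    """Weighted average of one parameter across the three analyzers,
    using the distribution values assigned before averaging."""
    weights = WEIGHTS[parameter]
    return sum(weights[src] * readings[src] for src in weights)
```

Because the acoustic weight for acceleration is 0%, any acoustic acceleration reading simply drops out of the average, matching the "that data has no value" behavior described above.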
The mixer 408 may be configured to operate dynamically during game play. For example, as the mixer 408 receives various input control data, it may recognize that certain data consistently falls outside an acceptable range or quality of data, or reflects corrupt data that may be indicative of a processing error at the related input device.
In addition, certain conditions in the real-world environment may change. For example, natural light in the user's at-home game environment may increase steadily as the morning progresses toward midday, causing problems with image data capture. Further, a neighborhood or household may become increasingly noisy as the day progresses, causing problems with audio data capture. Likewise, if a user has been playing for several hours, their reflexes may become less sharp, causing problems with the interpretation of inertial data.
In these instances, or in any other instance in which the quality of a particular form of input control data becomes an issue, the mixer 408 may dynamically reassign distribution weights to particular sets of data coming from particular devices, such that particular input control data are given more or less importance, as described above. Similarly, the game environment may change over the course of the game such that the needs of a particular game change, thereby requiring a reassignment of values or a need for particular input control data.
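The dynamic reassignment just described can be sketched as zeroing out a degraded channel and renormalizing the remaining weights. This is an illustrative assumption about one way a mixer might do it, not a procedure given in the description.

```python
def reassign_weights(weights: dict, degraded: set) -> dict:
    """Zero the weight of channels flagged as degraded (e.g., noisy audio
    or washed-out images) and renormalize the rest so the distribution
    weights still sum to 1.0."""
    adjusted = {ch: (0.0 if ch in degraded else w) for ch, w in weights.items()}
    total = sum(adjusted.values()) or 1.0   # avoid dividing by zero
    return {ch: w / total for ch, w in adjusted.items()}
```

For example, if bright midday light degrades the image channel, its weight drops to zero and the inertial channel absorbs the remaining importance.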
Similarly, the mixer 408 may recognize, based on processing errors or feedback data that may be generated by the gesture recognizer 505, that certain data being passed to the gesture recognizer 505 is being processed incorrectly, slowly, or not at all. In response to this feedback, or in recognition of these processing difficulties (e.g., errors being generated by the gesture recognizer 505 when making associations even though the image analysis data is within an acceptable range), the mixer 408 may adjust which input control data it seeks from which analyzers, and when, if at all. The mixer 408 may also require that certain analysis and processing of the input control data be performed by the appropriate analyzer before the input control data is delivered to the mixer 408, which may re-process the data (e.g., average the data) so that a further layer of assurance is provided that the data passed to the gesture recognizer 505 is processed effectively and appropriately.
In certain embodiments, the mixer 408 may recognize that certain data is corrupt, invalid, or outside a particular variable range, and may request particular input control data or variables related to that data, so that the incorrect data may be replaced, or so that certain data may be properly analyzed and calculated with respect to the necessary variables.
According to embodiments of the invention, a video game system and method of the type described above may be implemented as depicted in FIG. 6. A video game system 600 may include a processor 601 and a memory 602 (e.g., RAM, DRAM, ROM, and the like). In addition, the video game system 600 may have multiple processors 601 if parallel processing is to be implemented. The memory 602 includes data and game program code 604, which may include portions configured as described above. Specifically, the memory 602 may include inertial signal data 606, which may include stored controller path information as described above. The memory 602 may also contain stored gesture data 608, e.g., data representing one or more gestures relevant to the game program 604. Coded instructions executed on the processor 601 may implement a multi-input mixer 605, which may be configured and may function as described above.
The system 600 may also include well-known support functions 610, such as input/output (I/O) elements 611, power supplies (P/S) 612, a clock (CLK) 613, and a cache 614. The system 600 may optionally include a mass storage device 615, such as a disk drive, CD-ROM drive, tape drive, or the like, to store programs and/or data. The system 600 may also optionally include a display unit 616 and a user interface unit 618 to facilitate interaction between the system 600 and a user. The display unit 616 may take the form of a cathode ray tube (CRT) or flat-panel screen that displays text, numerals, graphical symbols, or images. The user interface 618 may include a keyboard, mouse, joystick, light pen, or other device. In addition, the user interface 618 may include a microphone, video camera, or other signal transducing device to provide for direct capture of a signal to be analyzed. The processor 601, memory 602, and other components of the system 600 may exchange signals (e.g., code instructions and data) with each other via a system bus 620, as shown in FIG. 6.
A microphone array 622 may be coupled to the system 600 through the I/O functions 611. The microphone array may include between about 2 and about 8 microphones, preferably about 4 microphones, with neighboring microphones separated by a distance of less than about 4 centimeters, preferably between about 1 centimeter and about 2 centimeters. Preferably, the microphones in the array 622 are omni-directional microphones. An optional image capture unit 623 (e.g., a digital camera) may be coupled to the system 600 through the I/O functions 611. One or more pointing actuators 625 mechanically coupled to the camera may exchange signals with the processor 601 via the I/O functions 611.
As used herein, the term "I/O" generally refers to any program, operation, or device that transfers data to or from the system 600 and to or from a peripheral device. Every data transfer may be regarded as an output from one device and an input into another. Peripheral devices include input-only devices such as keyboards and mice, output-only devices such as printers, and devices such as a writable CD-ROM that can serve as both an input and an output device. The term "peripheral device" includes external devices such as a mouse, keyboard, printer, monitor, microphone, game controller, camera, external Zip drive, or scanner, internal devices such as a CD-ROM drive, CD-R drive, or internal modem, and other peripherals such as a flash memory reader/writer or hard disk drive.
In certain embodiments of the invention, the system 600 may be a video game unit, which may include a controller 630 coupled to the processor via the I/O functions 611, either through wires (e.g., a USB cable) or wirelessly. The controller 630 may have analog joystick controls 631 and conventional buttons 633 that provide control signals commonly used during the playing of video games. Such video games may be implemented as processor-readable data and/or instructions of the program 604, which may be stored in the memory 602 or in another processor-readable medium, e.g., one associated with the mass storage device 615. In some embodiments, the mixer 605 may receive inputs from the analog joystick controls 631 and the buttons 633.
The joystick controls 631 are generally configured so that moving a control stick left or right signals movement along the X axis, and moving it forward (up) or back (down) signals movement along the Y axis. In joysticks configured for three-dimensional movement, twisting the stick to the left (counter-clockwise) or to the right (clockwise) may signal movement along the Z axis. These three axes (X, Y, and Z) are often referred to as roll, pitch, and yaw, respectively, particularly in relation to an aircraft.
The game controller 630 may include a communication interface operable to conduct digital communications with at least one of the processor 601, another game controller 630, or both. The communication interface may include a universal asynchronous receiver transmitter ("UART"). The UART may be operable to receive control signals for controlling the operation of the tracking device, or to transmit signals from the tracking device for communication with another device. Alternatively, the communication interface may include a universal serial bus ("USB") controller. The USB controller may be operable to receive control signals for controlling the operation of the tracking device, or to transmit signals from the tracking device for communication with another device.
In addition, the controller 630 may include one or more inertial sensors 632, which may provide position and/or orientation information to the processor 601 via inertial signals. The orientation information may include angular information, such as the tilt, roll, or yaw of the controller 630. By way of example, the inertial sensors 632 may include any number and/or combination of accelerometers, gyroscopes, or tilt sensors. In a preferred embodiment, the inertial sensors 632 include: a tilt sensor adapted to sense the orientation of the game controller 630 with respect to tilt and roll axes; a first accelerometer adapted to sense acceleration along a yaw axis; and a second accelerometer adapted to sense angular acceleration with respect to the yaw axis. An accelerometer may be implemented, e.g., as a MEMS device including a mass mounted by one or more springs, with sensors for sensing displacement of the mass relative to one or more directions. Signals from the sensors that depend on the displacement of the mass may be used to determine the acceleration of the game controller 630. Such techniques may be implemented by instructions of the game program 604, which may be stored in the memory 602 and executed by the processor 601.
By way of example, an accelerometer suitable as the inertial sensor 632 may be a simple mass elastically coupled at three or four points to a frame, e.g., by springs. Pitch and roll axes lie in a plane that intersects the frame, which is mounted to the game controller 630. As the frame (and the game controller 630) rotates about the pitch and roll axes, the mass will be displaced under the influence of gravity, and the springs will elongate or compress in a way that depends on the angle of pitch and/or roll. The displacement of the mass can be sensed and converted into a signal that depends on the amount of pitch and/or roll. Angular acceleration about the yaw axis or linear acceleration along the yaw axis may also produce characteristic patterns of compression and/or elongation of the springs or motion of the mass, and these can be sensed and converted into signals that depend on the amount of angular or linear acceleration. Such an accelerometer device can measure tilt, roll, angular acceleration about the yaw axis and linear acceleration along the yaw axis by tracking movement of the mass or the compressive and expansive forces of the springs. There are many different ways to track the position of the mass and/or the forces exerted on it, including resistive strain gauge material, photonic sensors, magnetic sensors, Hall-effect devices, piezoelectric devices, capacitive sensors and the like.
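The tilt-sensing principle described above can be sketched in software: given a static accelerometer reading of the gravity vector, pitch and roll angles follow from simple trigonometry. The function below is an illustrative sketch only, not the patented spring-mass implementation, and its axis conventions are assumptions.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from a static accelerometer
    reading of gravity. Assumed convention: z points down through the
    controller at rest, x forward, y to the right."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll
```

With the controller at rest (gravity entirely on z) both angles are zero; tipping the controller nose-down shifts gravity onto x and yields a nonzero pitch.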
In addition, the game controller 630 may include one or more light sources 634, such as light emitting diodes (LEDs). The light sources 634 may be used to distinguish one controller from another, e.g., by having one or more LEDs remain lit or blink in a coded pattern. By way of example, five LEDs may be provided on the game controller 630 in a linear or two-dimensional pattern. Although a linear array of LEDs is preferred, the LEDs may alternatively be arranged in a rectangular or arcuate pattern to facilitate determining the image plane of the LED array when analyzing an image of the LED pattern obtained by the image capture unit 623. Furthermore, the LED pattern codes may also be used to determine the positioning of the game controller 630 during game play. For instance, the LEDs can assist in identifying the tilt, yaw and roll of the controller. This detection pattern can assist in providing a better user feel in games such as aircraft flying games. The image capture unit 623 may capture images containing the game controller 630 and the light sources 634. Analysis of such images can determine the position and/or orientation of the game controller. Such analysis may be implemented by program code instructions 604 stored in the memory 602 and run by the processor 601. To facilitate capture of images of the light sources 634 by the image capture unit 623, the light sources 634 may be placed on two or more different sides of the game controller 630, e.g., on the front and on the back (as shown in phantom). Such placement allows the image capture unit 623 to obtain images of the light sources 634 for different orientations of the game controller 630, depending on how the user holds it.
In addition, the light sources 634 may provide telemetry signals to the processor 601, e.g., in pulse-coded, amplitude-modulated or frequency-modulated form. Such telemetry signals may indicate which joystick buttons are being pressed and/or how hard such buttons are being pressed. Telemetry signals may be encoded into the optical signal, e.g., by pulse coding, pulse-width modulation, frequency modulation or light intensity (amplitude) modulation. The processor 601 may decode the telemetry signal from the optical signal and execute a game command in response to the decoded telemetry signal. Telemetry signals may be decoded from analysis of images of the game controller 630 obtained by the image capture unit 623. Alternatively, the apparatus 600 may include a separate optical sensor dedicated to receiving telemetry signals from the light sources 634. The use of LEDs in conjunction with determining an intensity amount for interfacing with a computer program is described, e.g., in U.S. Patent Application No. 11/429,414, to Richard L. Marks et al., entitled "Computer Image and Audio Processing of Intensity and Input Devices for Interfacing with a Computer Program" (Attorney Docket No. SONYP052), filed May 4, 2006, which is incorporated herein by reference in its entirety. In addition, analysis of images containing the light sources 634 may be used both for telemetry and for determining the position and/or orientation of the game controller 630. Such techniques may be implemented by instructions of the program 604, which may be stored in the memory 602 and run by the processor 601.
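One way the pulse-coded telemetry described above could be decoded is by measuring the lengths of "on" runs in the sampled optical signal. The run-length encoding below (long run encodes 1, short run encodes 0) and its thresholds are assumptions for illustration, not the format of the patent.

```python
def decode_pwm_telemetry(samples, threshold=0.5):
    """Decode a pulse-width-modulated optical stream into bits.
    A run of >= 3 bright samples encodes 1, a shorter run encodes 0.
    Hypothetical scheme for illustration only."""
    bits = []
    run = 0
    for s in list(samples) + [0.0]:   # trailing sentinel flushes the last run
        if s >= threshold:
            run += 1
        elif run:
            bits.append(1 if run >= 3 else 0)
            run = 0
    return bits
```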
The processor 601 may use inertial signals from the inertial sensor 632 in conjunction with optical signals from the light sources 634 detected by the image capture unit 623 and/or sound source location and characterization information from acoustic signals detected by the microphone array 622 to deduce information about the position and/or orientation of the controller 630 and/or its user. For example, "acoustic radar" sound source location and characterization may be used in conjunction with the microphone array 622 to track a moving voice while motion of the game controller is independently tracked (through the inertial sensor 632 and/or the light sources 634). In acoustic radar, a pre-calibrated listening zone is selected at runtime and sounds originating from sources outside the pre-calibrated listening zone are filtered out. The pre-calibrated listening zone may include a listening zone that corresponds to a volume of focus or field of view of the image capture unit 623. Examples of acoustic radar are described in detail in U.S. Patent Application No. 11/381,724, to Xiadong Mao, entitled "Methods and Apparatus for Targeted Sound Detection and Characterization", filed May 4, 2006, which is incorporated herein by reference. Any number of different combinations of different modes of providing control signals to the processor 601 may be used in conjunction with embodiments of the invention. Such techniques may be implemented by program code instructions 604, which may be stored in the memory 602 and run by the processor 601, and may optionally include one or more instructions that direct one or more processors to select a pre-calibrated listening zone at runtime and filter out sounds originating from sources outside the pre-calibrated listening zone.
The program 604 may optionally include one or more instructions that direct one or more processors to produce discrete time-domain input signals x_m(t) from the microphones M_0 ... M_M of the microphone array 622, determine a listening sector, and use the listening sector in a semi-blind source separation to select finite impulse response filter coefficients to separate out different sound sources from the input signals x_m(t). The program 604 may also include instructions to apply one or more fractional delays to selected input signals x_m(t), other than an input signal x_0(t) from a reference microphone M_0. Each fractional delay may be selected to optimize a signal-to-noise ratio of a discrete time-domain output signal y(t) from the microphone array. The fractional delays may be selected such that a signal from the reference microphone M_0 is first in time relative to the signals from the other microphones of the array. The program 604 may also include instructions to introduce a fractional time delay Δ into the output signal y(t) of the microphone array such that: y(t+Δ) = x(t+Δ)*b0 + x(t-1+Δ)*b1 + x(t-2+Δ)*b2 + ... + x(t-N+Δ)*bN, where Δ is between 0 and ±1. Examples of such techniques are described in detail in U.S. Patent Application No. 11/381,729, to Xiadong Mao, entitled "Ultra Small Microphone Array", filed May 4, 2006, the entire disclosure of which is incorporated by reference.
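The filter expression above, y(t+Δ) = Σ x(t-k+Δ)*b_k, can be sketched by approximating the fractional delay Δ with linear interpolation between adjacent samples. This is an illustrative approximation only, not the method of the referenced application.

```python
def fir_with_fractional_delay(x, b, delta):
    """Apply FIR taps b to signal x with a fractional delay delta
    (0 <= delta < 1), approximated by linear interpolation.
    Out-of-range samples are treated as zero."""
    y = []
    for t in range(len(x)):
        acc = 0.0
        for k, bk in enumerate(b):
            i = t - k
            # x(i + delta) ~ (1 - delta) * x[i] + delta * x[i + 1]
            x0 = x[i] if 0 <= i < len(x) else 0.0
            x1 = x[i + 1] if 0 <= i + 1 < len(x) else 0.0
            acc += bk * ((1.0 - delta) * x0 + delta * x1)
        y.append(acc)
    return y
```

With a single unity tap and delta of zero the filter is an identity, which provides a quick sanity check of the indexing.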
The program 604 may include one or more instructions which, when executed, cause the system 600 to select a pre-calibrated listening sector containing a sound source. Such instructions may cause the apparatus to determine whether the sound source lies within an initial sector or on a particular side of the initial sector. If the sound source does not lie within the default sector, the instructions may, when executed, select a different sector on the particular side of the default sector. The different sector may be characterized by an attenuation of the input signals that is closest to an optimum value. These instructions may, when executed, calculate the attenuations of the input signals from the microphone array 622 and the attenuation relative to the optimum value. The instructions may, when executed, cause the apparatus 600 to determine attenuation values of the input signals for one or more sectors and select the sector for which the attenuation is closest to the optimum value. Examples of such a technique are described, e.g., in U.S. Patent Application No. 11/381,725, to Xiadong Mao, entitled "Methods and Apparatus for Targeted Sound Detection", filed May 4, 2006, the disclosure of which is incorporated herein by reference.
Signals from the inertial sensor 632 may provide part of a tracking information input, and signals generated by the image capture unit 623 from tracking the one or more light sources 634 may provide another part of the tracking information input. By way of example and without limitation, such "mixed mode" signals may be used in a football-type video game in which a quarterback fakes his head to the left and then throws the ball to the right. Specifically, a game player holding the controller 630 may turn his head to the left and, while making a sound, make a throwing motion with the controller, swinging it to the right as if the controller were a football. The microphone array 622, used in conjunction with "acoustic radar" program code, can track the user's voice. The image capture unit 623 can track the motion of the user's head, or track other commands that do not require sound or use of the controller. The sensor 632 may track the motion of the game controller (which represents the football). The image capture unit 623 may also track the light sources 634 on the controller 630. The user may release the "ball" upon reaching a certain amount and/or direction of acceleration of the game controller 630, or by a key command triggered by pressing a button on the controller 630.
In certain embodiments of the present invention, an inertial signal, e.g., from an accelerometer or gyroscope, may be used to determine the position of the controller 630. Specifically, an acceleration signal from the accelerometer may be integrated once with respect to time to determine a change in velocity, and the velocity may be integrated with respect to time to determine a change in position. If the values of the initial position and velocity at some time are known, the absolute position may be determined using these values and the changes in velocity and position. Although position determination using the inertial sensor may be made more quickly than using the image capture unit 623 and the light sources 634, the inertial sensor 632 may be subject to a type of error known as "drift", in which errors that accumulate over time can lead to a discrepancy between the position of the joystick 631 computed from the inertial signal (shown in phantom) and the actual position of the game controller 630. Embodiments of the invention allow a number of ways to deal with such errors.
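The double integration described above can be sketched with simple Euler integration. The helper below is illustrative; in practice, any bias in the acceleration samples compounds through both integrations, producing the "drift" error noted above.

```python
def integrate_position(accels, dt, v0=0.0, p0=0.0):
    """Twice-integrate accelerometer samples to recover a 1-D position
    track. Euler-integration sketch; not a production filter."""
    v, p = v0, p0
    positions = []
    for a in accels:
        v += a * dt          # first integration: acceleration -> velocity
        p += v * dt          # second integration: velocity -> position
        positions.append(p)
    return positions
```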
For example, the drift may be cancelled manually by resetting the initial position of the controller 630 to be equal to the current calculated position. The user may use one or more of the buttons on the controller 630 to trigger a command that resets the initial position. Alternatively, image-based drift compensation may be implemented by resetting the current position to a position determined, as a reference, from an image obtained from the image capture unit 623. Such image-based drift compensation may be implemented manually, e.g., when the user triggers one or more of the buttons on the game controller 630. Alternatively, image-based drift compensation may be implemented automatically, e.g., at regular intervals of time or in response to game play. Such techniques may be implemented by program code instructions 604, which may be stored in the memory 602 and run by the processor 601.
In some embodiments it may be desirable to compensate for spurious signals in the inertial sensor signal. For example, the signal from the inertial sensor 632 may be oversampled, and a sliding average may be computed from the oversampled signal to remove spurious signals from the inertial sensor signal. In some situations it may be desirable to oversample the signal, reject high and/or low values from some subset of data points, and compute the sliding average from the remaining data points. Furthermore, other data sampling and manipulation techniques may be used to adjust the signal from the inertial sensor to remove or reduce the significance of spurious signals. The choice of technique may depend on the nature of the signal, the computations to be performed on the signal, the nature of game play, or some combination of two or more of these. Such techniques may be implemented by instructions of the program 604, which may be stored in the memory 602 and run by the processor 601.
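The oversample-then-trim-then-average technique described above can be sketched as follows; the window size and trim count below are assumed parameters, chosen only for illustration.

```python
def trimmed_sliding_average(samples, window=5, trim=1):
    """Smooth an oversampled sensor stream with a sliding average that
    discards the `trim` highest and lowest values in each window,
    rejecting spurious spikes before averaging."""
    out = []
    for i in range(len(samples) - window + 1):
        w = sorted(samples[i:i + window])
        kept = w[trim:len(w) - trim]
        out.append(sum(kept) / len(kept))
    return out
```

A single spurious spike of 100 in an otherwise flat stream is rejected entirely, where a plain sliding average would smear it across several output samples.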
The processor 601 may perform analysis of the inertial signal data 606 as described above in response to the data 606 and program code instructions of the program 604, which are stored in the memory 602, retrieved and executed by the processor module 601. Code portions of the program 604 may conform to any one of a number of different programming languages such as Assembly, C++, JAVA and a number of other languages. The processor module 601 forms a general-purpose computer, which becomes a specific-purpose computer when executing programs such as the program code 604. Although the program code 604 is described herein as being implemented in software and executed on a general-purpose computer, those skilled in the art will realize that the method of task management could alternatively be implemented using hardware such as an application-specific integrated circuit (ASIC) or other hardware circuitry. As such, it is to be understood that embodiments of the invention can be implemented, in whole or in part, in software, hardware or some combination of both.
In one embodiment, among others, the program code 604 may include a set of processor-readable instructions that implement a method having features in common with the method 510 of FIG. 5B and the method 520 of FIG. 5C, or some combination of two or more of these. The program code 604 may generally include one or more instructions that direct the one or more processors to analyze signals from the inertial sensor 632 to generate position and/or orientation information, and to utilize that information during play of a video game.
The program code 604 may optionally include processor-executable instructions including one or more instructions which, when executed, cause the image capture unit 623 to monitor a field of view in front of the image capture unit 623, identify one or more of the light sources 634 within the field of view, detect a change in the light emitted from the light source(s) 634, and, in response to detecting the change, trigger an input command to the processor 601. The use of LEDs in conjunction with an image capture device to trigger actions in a game controller is described, e.g., in U.S. Patent Application No. 10/759,782, to Richard L. Marks, entitled "Method and Apparatus for Light Input Device", filed January 16, 2004, which is incorporated herein by reference in its entirety.
The program code 604 may optionally include processor-executable instructions including one or more instructions which, when executed, use signals from the inertial sensor and signals generated by the image capture unit from tracking the one or more light sources as inputs to a game system, e.g., as described above. The program code 604 may optionally include processor-executable instructions including one or more instructions which, when executed, compensate for drift in the inertial sensor 632.
In addition, the program code 604 may optionally include processor-executable instructions including one or more instructions which, when executed, adjust the gearing and mapping of controller manipulations to the game environment. Such a feature allows a user to change the "gearing" of manipulations of the game controller 630 to game state. For example, a 45-degree rotation of the game controller 630 may be geared to a 45-degree rotation of a game object. However, this 1:1 gearing ratio may be modified so that an X-degree rotation (or tilt or yaw or "manipulation") of the controller translates to a Y-degree rotation (or tilt or yaw or "manipulation") of the game object. Gearing may be a 1:1 ratio, a 1:2 ratio, a 1:X ratio or an X:Y ratio, where X and Y can take on arbitrary values. Additionally, the mapping of input channel to game control may be modified over time or instantly. Modifications may include changing gesture trajectory models, modifying the positioning, scale, thresholds of gestures, etc. Such mapping may be programmed, random, tiered, staggered, etc., so as to provide a user with a dynamic range of manipulations. Modification of the mapping, gearing or ratios can be adjusted by the game program 604 according to game play or game state, through a user modifier button (keypad, etc.) located on the game controller 630, or broadly in response to the input channel. The input channel may include, but is not limited to, user audio, controller-generated audio, controller-generated tracking audio, controller button state, video camera output, controller telemetry data including accelerometer data, tilt, yaw, roll, position and acceleration, and any other data from sensors capable of tracking the user or the user's manipulation of an object.
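The gearing ratios described above reduce to a simple scaling of the controller manipulation. A minimal sketch follows, with an assumed X:Y ratio convention of controller degrees to game-object degrees:

```python
def apply_gearing(controller_degrees, ratio=(1, 1)):
    """Map a controller rotation (degrees) to a game-object rotation
    through a gearing ratio X:Y (controller:object)."""
    x, y = ratio
    return controller_degrees * y / x
```

Under the default 1:1 gearing a 45-degree controller rotation yields a 45-degree object rotation; under 1:2 gearing the same input is amplified to twice the object rotation.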
In certain embodiments, the game program 604 may change the mapping or gearing over time from one scheme or ratio to another, respectively, in a predetermined time-dependent manner. Gearing and mapping changes can be applied to a game environment in various ways. In one example, a video game character may be controlled under one gearing scheme when the character is healthy; as the character's health deteriorates, the system may adjust the overall control commands, thus forcing the user to exaggerate movements of the controller in order to gesture commands to the character. A video game character who becomes disoriented may force a change in the mapping of the input channel, e.g., requiring the user to adjust his input to regain control of the character under the new mapping. Mapping schemes that modify the translation of the input channel to game commands may also change during game play. This translation may occur in various ways in response to one or more elements of game state, or in response to modifier commands issued under the input channel. Gearing and mapping may also be configured to influence the configuration and/or processing of one or more elements of the input channel.
In addition, an acoustic emitter 636, such as a speaker, a buzzer, a horn, a pipe or the like, may be mounted to the joystick controller 630. In certain embodiments the acoustic emitter may be detachably mounted to a "body" of the joystick controller 630. In "acoustic radar" embodiments in which the program code 604 locates and characterizes sounds detected with the microphone array 622, the acoustic emitter 636 may provide an audio signal that can be detected by the microphone array 622 and used by the program code 604 to track the position of the game controller 630. The acoustic emitter 636 may also be used to provide an additional "input channel" from the game controller 630 to the processor 601. Audio signals from the acoustic emitter 636 may be periodically pulsed to provide a beacon by which the acoustic radar tracks location. The audio signals (pulsed or otherwise) may be audible or ultrasonic. The acoustic radar may track a user's manipulation of the game controller 630, and such manipulation tracking may include information about the position and orientation (e.g., pitch, roll or yaw angle) of the game controller 630. The pulses may be triggered at an appropriate duty cycle, as one skilled in the art is capable of applying. Pulses may be initiated based on a control signal arbitrated from the system. The system 600 (through the program code 604) may coordinate the dispatch of control signals among two or more joystick controllers 630 coupled to the processor 601 to assure that multiple controllers can be tracked.
In certain embodiments, the mixer 605 may be configured to obtain inputs for controlling the execution of the game program 604 using inputs received from conventional controls on the game controller 630, such as the analog joystick controls 631 and the buttons 633. Specifically, the mixer 605 may receive controller input information from the controller 630. The controller input information may include at least one of: a) information identifying a current position of a user-movable control stick of the game controller in relation to a rest position of the control stick, or b) information identifying whether a switch included in the game controller is active. The mixer 605 may also receive supplemental input information from the environment in which the controller 630 is being used. By way of example and not limitation, the supplemental input information may include one or more of: i) information obtained from an image capture device in the environment (e.g., the image capture unit 623); and/or ii) information from at least one inertial sensor (e.g., the inertial sensor 632) associated with the game controller or the user; and/or iii) acoustic information obtained from an acoustic transducer in the environment (e.g., from the microphone array 622, possibly in conjunction with acoustic signals generated by the acoustic emitter 636).
The controller input information may also include information identifying whether a pressure-sensitive button is active. By processing the controller input information and the supplemental input information to produce a combined input, the mixer 605 may obtain a combined input for controlling the execution of the game program 604.
The combined input may include individual merged inputs, each for controlling a corresponding individual function during execution of the game program 604. At least some of the individual merged inputs may be obtained by merging controller input information relating to a particular individual function with supplemental input information relating to that same function. Thus, the combined input may include a merged input for controlling a certain function during execution of the game program 604, and at least some merged inputs may be obtained by merging the controller input information relating to that function with the supplemental input information relating to that function. In such a case, the merging may be performed by averaging a value representing the controller input information with a value representing the supplemental input information. By way of example, the value of the controller input information and the value of the supplemental input information may be averaged in a one-to-one ratio. Alternatively, the controller input information and the supplemental input information may be assigned different weights, and the averaging may be performed as a weighted average of the values of the controller input information and the supplemental input information in accordance with the assigned weights.
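The averaging and weighted-averaging merges described above can be sketched as a single helper; the parameter names and the default one-to-one weights below are illustrative only.

```python
def merge_inputs(controller_value, supplemental_value,
                 w_controller=0.5, w_supplemental=0.5):
    """Merge a controller input value with a supplemental input value
    by a weighted average. Equal weights give the one-to-one case."""
    total = w_controller + w_supplemental
    return (controller_value * w_controller
            + supplemental_value * w_supplemental) / total
```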
In certain embodiments, the value of a first one of the controller input information and the supplemental input information may be used as a modifying input to the game program, for modifying the control of a function that remains active in accordance with a second one of the controller input information and the supplemental input information. The supplemental input information may include inertial sensor information obtained by operation of the inertial sensor 632 and/or orientation information representing an orientation of a user-movable object. Alternatively, the supplemental input information may include information indicating at least one of a position or an orientation of a user-movable object. As used herein, a "user-movable object" may refer to the controller 630 or to an article mounted to the body of the controller 630, and the supplemental input information may include information indicating the orientation of the user-movable object. By way of example, such orientation information may include information indicating at least one of pitch, yaw or roll.
In certain embodiments, the combined input may be obtained by merging a value of the controller input information representing the position of a control stick (e.g., one of the analog joysticks 631) with a value of the supplemental input information representing the orientation of the user-movable object. As discussed above, the user-movable object may include an object mounted to the game controller 630 and/or the game controller 630 itself. Thus, when the control stick is moved backward while the pitch is increasing toward a positive (nose-up) value, the combined input may reflect an enhanced nose-up input. Similarly, when the control stick is moved forward while the pitch is decreasing toward a negative (nose-down) value, the combined input may reflect an enhanced dive input.
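The nose-up/nose-down reinforcement described above can be sketched as a signed sum of stick deflection and controller pitch. The sign conventions and the gain below are assumptions made purely for illustration.

```python
def combined_pitch_input(stick_y, controller_pitch, gain=0.5):
    """Combine stick deflection (stick_y: -1.0 back ... +1.0 forward)
    with controller pitch (radians, positive = nose-up) into a single
    pitch command (positive = nose-up)."""
    # Backward stick (-1) and positive controller pitch reinforce
    # each other to yield an enhanced nose-up command.
    return -stick_y + gain * controller_pitch
```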
The combined input may be obtained by designating the value of the controller input information representing the position of the control stick as coarse control information and designating the value of the supplemental input information representing the orientation of the user-movable object as fine control information. Alternatively, the combined input may be obtained by designating the value of the controller input information identifying whether a switch of the game controller is active as coarse control information and designating the value of the supplemental input information representing the orientation of the user-movable object as fine control information. In addition, the combined input may be obtained by designating the value of the supplemental input information representing the orientation of the user-movable object as coarse control information and designating the value of the controller input information representing the position of the control stick as fine control information. Furthermore, the combined input may also be obtained by designating the value of the controller input information identifying whether a switch of the game controller is active as fine control information and designating the value of the supplemental input information representing the orientation of the user-movable object as coarse control information. In any or all of these cases, the combined input may represent the value of the coarse control information adjusted by a relatively small amount in accordance with the fine control information.
In certain embodiments, the combined input may be obtained by additively combining the value represented by the controller input information and the value represented by the supplemental input information, such that the combined input provides a signal to the game program 604 having a higher or lower value than either the value taken by the controller input information alone or the value taken by the supplemental input information alone. Alternatively, the combined input may provide a smoothed value signal to the game program 604, the smoothed value signal changing less rapidly over time than the value taken by either the controller input information alone or the supplemental input information alone. The combined input may also provide a high-definition signal with increased signal content to the game program. The high-definition signal may change more rapidly over time than the value taken by either the controller input information alone or the supplemental input information alone.
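The smoothed-value variant described above can be sketched with an exponential moving average, one common smoothing technique (the text does not specify which one is used); `alpha` is an assumed smoothing constant.

```python
def smooth_combined(values, alpha=0.2):
    """Produce a smoothed combined-input stream via an exponential
    moving average, so the output changes less rapidly than the raw
    values. Smaller alpha means heavier smoothing."""
    out, level = [], values[0]
    for v in values:
        level = alpha * v + (1.0 - alpha) * level
        out.append(level)
    return out
```

A step from 0 to 10 in the raw input appears in the smoothed stream only gradually, each output moving a fraction `alpha` of the way toward the new value.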
Although embodiments of the present invention are described in terms of examples related to a video game controller 630, embodiments of the invention, including the system 600, may be used with any user-manipulated body, molded object, knob, structure, etc., having inertial sensing capability and the capability of transmitting inertial sensor signals, wirelessly or otherwise.
By way of example, embodiments of the present invention may be implemented on parallel processing systems. Such parallel processing systems typically include two or more processor elements configured to execute parts of a program in parallel using separate processors. By way of example, and without limitation, FIG. 7 illustrates a type of cell processor 700 according to an embodiment of the present invention. The cell processor 700 may be used as the processor 601 of FIG. 6 or the processor 502 of FIG. 5A. In the example depicted in FIG. 7, the cell processor 700 includes a main memory 702, a power processor element (PPE) 704 and a number of synergistic processor elements (SPEs) 706. In the example depicted in FIG. 7, the cell processor 700 includes a single PPE 704 and eight SPEs 706. In such a configuration, seven of the SPEs 706 may be used for parallel processing and one may be reserved as a back-up in case one of the other seven fails. A cell processor may alternatively include multiple groups of PPEs (PPE groups) and multiple groups of SPEs (SPE groups). In such a case, hardware resources can be shared between units within a group. However, the SPEs and PPEs must appear to software as independent elements. As such, embodiments of the present invention are not limited to use with the configuration shown in FIG. 7.
The main memory 702 typically includes both general-purpose and nonvolatile storage, as well as special-purpose hardware registers or arrays used for functions such as system configuration, data-transfer synchronization, memory-mapped I/O and I/O subsystems. In embodiments of the present invention, a video game program 703 may be resident in the main memory 702. The memory 702 may also contain signal data 709. The video program 703 may include inertial, image and acoustic analyzers and a mixer configured as described above with respect to FIG. 4, FIG. 5A, FIG. 5B or FIG. 5C, or some combination of these. The program 703 may run on the PPE. The program 703 may be divided up into multiple signal-processing tasks that can be executed on the SPEs and/or the PPE.
By way of example, the PPE 704 may be a 64-bit PowerPC Processor Unit (PPU) with associated L1 and L2 caches. The PPE 704 is a general-purpose processing unit which can access system management resources (such as the memory-protection tables, for example). Hardware resources may be mapped explicitly to a real address space as seen by the PPE. Therefore, the PPE can address any of these resources directly by using an appropriate effective address value. A primary function of the PPE 704 is the management and allocation of tasks for the SPEs 706 in the cell processor 700.
Although only a single PPE is shown in FIG. 7, in some cell processor implementations, such as the cell broadband engine architecture (CBEA), the cell processor 700 may have multiple PPEs organized into PPE groups, of which there may be more than one. These PPE groups may share access to the main memory 702. Furthermore, the cell processor 700 may include two or more groups of SPEs. The SPE groups may also share access to the main memory 702. Such configurations are within the scope of the present invention.
Each SPE 706 includes a synergistic processor unit (SPU) and its own local storage area LS. The local storage LS may include one or more separate areas of memory storage, each associated with a specific SPU. Each SPU may be configured to execute only instructions (including data load and data store operations) from within its own associated local storage domain. In such a configuration, data transfers between the local storage LS and elsewhere in the system 700 may be performed by issuing direct memory access (DMA) commands from a memory flow controller (MFC) to transfer data to or from the local storage domain (of an individual SPE). The SPUs are less complex computational units than the PPE 704 in that they do not perform any system management functions. The SPUs generally have a single instruction, multiple data (SIMD) capability and typically process data and initiate any required data transfers (subject to access properties set up by the PPE) in order to perform their allocated tasks. The purpose of the SPU is to enable applications that require a higher computational unit density and can effectively use the provided instruction set. A significant number of SPEs in a system, managed by the PPE 704, allows for cost-effective processing over a wide range of applications.
Each SPE 706 may include a dedicated memory flow controller (MFC) that includes an associated memory management unit capable of holding and processing memory protection and access permission information. The MFC provides the primary method for data transfer, protection, and synchronization between the main storage of the cell processor and the local storage of an SPE. An MFC command describes the transfer to be performed. Commands for transferring data are sometimes referred to as MFC direct memory access (DMA) commands (or MFC DMA commands).
Each MFC may support multiple DMA transfers at the same time and can maintain and process multiple MFC commands. Each MFC DMA data transfer command request may involve both a local storage address (LSA) and an effective address (EA). The local storage address may directly address only the local storage area of its associated SPE. The effective address may have a more general application, e.g., it may be able to reference main storage, including all the SPE local storage areas, if they are aliased into the real address space.
To facilitate communication between the SPEs 706 and/or between the SPEs 706 and the PPE 704, the SPEs 706 and PPE 704 may include signal notification registers that are tied to signaling events. The PPE 704 and SPEs 706 may be coupled by a star topology in which the PPE 704 acts as a router to transmit messages to the SPEs 706. Alternatively, each SPE 706 and the PPE 704 may have a one-way signal notification register referred to as a mailbox. The mailbox can be used by an SPE 706 to host operating system (OS) synchronization.
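The one-way mailbox idea can be sketched with threads and a bounded queue: one side posts short notices, the other drains them to synchronize. This is an illustrative model only, not the actual Cell mailbox hardware interface; the names `spe_side` and `ppe_side` are assumptions made for this sketch.

```python
# Illustrative model of one-way "mailbox" signaling between processing
# elements, using a bounded FIFO queue between two threads.
import queue
import threading

mailbox = queue.Queue(maxsize=4)   # one-way channel: "SPE" -> "PPE"

def spe_side():
    # The SPE posts completion notices into its outbound mailbox.
    for task_id in range(3):
        mailbox.put(("done", task_id))

received = []

def ppe_side():
    # The PPE drains the mailbox to synchronize on SPE progress.
    for _ in range(3):
        received.append(mailbox.get())

producer = threading.Thread(target=spe_side)
consumer = threading.Thread(target=ppe_side)
producer.start(); consumer.start()
producer.join(); consumer.join()
# received holds the three notices in FIFO order
```

Because the channel has a single writer and preserves FIFO order, the reader sees the notices exactly in the order they were posted, which is what makes a mailbox usable for synchronization.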
The cell processor 700 may include an input/output (I/O) function 708 through which the cell processor 700 may interface with peripheral devices, such as a microphone array 712, an optional image capture unit 713, and a game controller 730. The game controller unit may include an inertial sensor 732 and light sources 734. In addition, an element interconnect bus 710 may connect the various components listed above. Each SPE and the PPE can access the bus 710 through a bus interface unit BIU. The cell processor 700 may also include two controllers typically found in a processor: a memory interface controller MIC that controls the flow of data between the bus 710 and the main memory 702, and a bus interface controller BIC that controls the flow of data between the I/O 708 and the bus 710. Although the requirements for the MIC, BIC, BIUs and bus 710 may vary widely for different implementations, those of skill in the art will be familiar with their functions and with circuits for implementing them.
The cell processor 700 may also include an internal interrupt controller IIC. The IIC component manages the priority of the interrupts presented to the PPE. The IIC allows interrupts from the other components of the cell processor 700 to be handled without using a main system interrupt controller. The IIC may be regarded as a second-level controller. The main system interrupt controller may handle interrupts originating external to the cell processor.
In embodiments of the present invention, certain computations, such as the fractional delays described above, may be performed in parallel using the PPE 704 and/or one or more of the SPEs 706. Each fractional delay calculation may be run as one or more separate tasks, which different SPEs 706 may take on as they become available.
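The task distribution described above can be sketched by farming independent fractional-delay computations out to a pool of parallel workers. The linear-interpolation delay below is a common textbook formulation assumed for illustration; the patent does not specify the delay filter, and `fractional_delay` is a hypothetical name.

```python
# Illustrative sketch: each fractional-delay computation is an independent
# task handed to whichever worker is available, mirroring the SPE task model.
import math
from concurrent.futures import ThreadPoolExecutor

def fractional_delay(signal, delay):
    """Delay `signal` by a non-integer number of samples via linear interpolation."""
    n0 = math.floor(delay)          # integer part of the delay
    frac = delay - n0               # fractional part
    out = []
    for n in range(len(signal)):
        a = signal[n - n0] if 0 <= n - n0 < len(signal) else 0.0
        b = signal[n - n0 - 1] if 0 <= n - n0 - 1 < len(signal) else 0.0
        out.append((1 - frac) * a + frac * b)
    return out

signal = [0.0, 1.0, 0.0, 0.0]
delays = [0.5, 1.0, 1.5]           # one independent task per delay value
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(lambda d: fractional_delay(signal, d), delays))
# results[1] (integer delay 1.0) is simply the signal shifted by one sample:
# [0.0, 0.0, 1.0, 0.0]
```

Since each delay value produces its own task with no shared state, the tasks can complete in any order without synchronization, which is what makes them a good fit for the available-SPE scheduling described in the text.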
While the above is a complete description of the preferred embodiments of the present invention, it is possible to use various alternatives, modifications and equivalents. Therefore, the scope of the present invention should be determined not with reference to the above description but should, instead, be determined with reference to the appended claims, along with their full scope of equivalents. Any feature described herein, whether preferred or not, may be combined with any other feature described herein, whether preferred or not. In the claims that follow, the indefinite article "a" refers to a quantity of one or more of the item following the article, except where expressly stated otherwise. The appended claims are not to be interpreted as including means-plus-function limitations, unless such a limitation is explicitly recited in a given claim using the phrase "means for".
Claims (27)
1. A controller for controlling execution of a program, comprising:
a source of controller input information from the controller manipulable by a user, the controller input information including information identifying a current state of a user-movable switch or control stick on the controller;
a source of supplementary input information from the controller, wherein the supplementary input information includes information indicating three-dimensional movement of the controller; and
wherein the controller input information and the supplementary input information are configured to be combined by processing the controller input information and the supplementary input information to obtain a combined input for controlling execution of the program, or wherein the combined input is obtained by designating a value of the controller input information as coarse control information and designating a value of the supplementary input information representing an orientation of a user-movable object as fine control information.
2. The controller of claim 1, wherein the controller input information and the supplementary input information are configured such that the combined input includes a merged input for controlling a certain function during execution of the program, and at least some of the merged input is obtained by merging the controller input information relating to the function with the supplementary input information relating to the function.
3. The controller of claim 2, wherein the merging is performed by averaging a value representing the controller input information and a value representing the supplementary input information.
4. The controller of claim 3, wherein the value of the controller input information and the value of the supplementary input information are averaged in a one-to-one ratio.
5. The controller of claim 3, wherein the controller input information and the supplementary input information are each assigned different weights, and the averaging step is performed as a weighted average of the values of the controller input information and the supplementary input information according to the assigned weights.
6. The controller of claim 1, wherein a value of a first one of the controller input information or the supplementary input information is configured for use as a modifying input to the program, for modifying control of at least one function activated according to a second one of the controller input information or the supplementary input information.
7. The controller of claim 1, wherein the source of the supplementary information includes an LED light source on the controller and a diffuser, wherein the diffuser is configured to diffuse light from the LED light source, and wherein the supplementary input information is obtained from an image of the diffused light from the LED light source within an image obtained from an image capture device.
8. The controller of claim 1, wherein the supplementary input information further includes at least one of inertial sensor information obtained through operation of an inertial sensor or orientation information representing an orientation of a user-movable object.
9. The controller of claim 8, wherein the inertial sensor is mounted to the controller and includes at least one of an accelerometer or a gyroscope.
10. The controller of claim 1, wherein the supplementary input information includes information indicating at least one of a position or an orientation of a user-movable object.
11. The controller of claim 10, wherein the user-movable object includes at least one of the controller or an article mounted to a body of the controller, and the supplementary input information includes information indicating an orientation of the user-movable object.
12. The controller of claim 10, wherein the supplementary input information includes information indicating at least one of pitch, yaw, or roll.
13. The controller of claim 12, wherein the supplementary input information includes information indicating pitch, yaw, and roll.
14. The controller of claim 10, wherein the combined input is obtained by merging a value of the controller input information representing a state of the switch or control stick with a value of the supplementary input information representing the orientation of the user-movable object.
15. The controller of claim 14, wherein the combined input reflects an enhanced climb input when the control stick is moved backward while pitch is increasing to a positive (nose-up) value.
16. The controller of claim 15, wherein the combined input reflects an enhanced dive input when the control stick is moved forward while pitch is decreasing to a negative (nose-down) value.
17. The controller of claim 14, wherein the combined input represents a value of the coarse control information adjusted by a relatively small amount according to the fine control information.
18. The controller of claim 14, wherein the combined input is obtained by designating a value of the controller input information identifying whether the switch or control stick on the controller is active as the coarse control information, and designating a value of the supplementary input information representing the orientation of the user-movable object as the fine control information, wherein the combined input represents a value of the coarse control information adjusted by a relatively small amount according to the fine control information.
19. The controller of claim 14, wherein the controller input information and the supplementary information are configured such that the combined input is obtained by designating a value of the controller input information identifying whether the switch or control stick on the controller is active as the fine control information, and designating a value of the supplementary input information representing the orientation of the user-movable object as the coarse control information, wherein the combined input represents a value of the coarse control information adjusted by a relatively small amount according to the fine control information.
20. The controller of claim 1, wherein the controller input information and the supplementary information are configured such that the combined input is obtained by additively combining a value represented by the controller input information with a value represented by the supplementary input information, so that the combined input provides the program with a signal having a larger value than either the controller input information or the supplementary input information takes on individually.
21. The controller of claim 1, wherein the controller input information and the supplementary information are configured such that the combined input is obtained by subtractively combining a value represented by the controller input information with a value represented by the supplementary input information, so that the combined input provides the program with a signal having a smaller value than either the controller input information or the supplementary input information takes on individually.
22. The controller of claim 1, wherein the controller input information and the supplementary information are configured such that the combined input provides the program with a signal having a smoothed value, the smoothed signal changing more slowly over time than either of the values individually taken on by the controller input information or the supplementary input information.
23. The controller of claim 1, wherein the controller input information and the supplementary information are configured such that the combined input provides the program with a higher-resolution signal having increased signal content, the higher-resolution signal changing more rapidly over time than either of the values individually taken on by the controller input information or the supplementary input information.
24. The controller of claim 1, wherein the supplementary input information includes acoustic information obtained by an acoustic transducer on the controller in response to sound emitted from a source in the environment.
25. The controller of claim 1, wherein the controller input information includes information identifying whether a pressure-sensitive button is active.
26. The controller of claim 1, wherein the supplementary input information includes at least one of: i) information obtained from an image capture device in the environment, ii) information from at least one inertial sensor associated with the controller or the user, or iii) information from an acoustic transducer in the environment.
27. The controller of claim 1, wherein the supplementary input information includes information obtained from an image capture device in the environment, information from at least one inertial sensor associated with the controller or the user, and information from an acoustic transducer in the environment.
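The combination modes recited in the claims above can be sketched as small numeric operations: plain and weighted averaging (claims 3-5), additive and subtractive combination (claims 20-21), smoothing (claim 22), and coarse/fine mixing (claims 14-19). The function names, the smoothing constant, and the fine-scale factor below are assumptions made for this sketch, not the patent's implementation.

```python
# Illustrative sketch of the claimed ways of combining a controller input
# value (e.g. stick state) with a supplementary input value (e.g. orientation).

def average(ctrl, supp):
    """Claim 4: one-to-one average of the two input values."""
    return (ctrl + supp) / 2.0

def weighted_average(ctrl, supp, w_ctrl, w_supp):
    """Claim 5: average weighted by the weights assigned to each source."""
    return (w_ctrl * ctrl + w_supp * supp) / (w_ctrl + w_supp)

def additive(ctrl, supp):
    """Claim 20: combined signal larger than either input alone."""
    return ctrl + supp

def subtractive(ctrl, supp):
    """Claim 21: combined signal smaller than either input alone."""
    return ctrl - supp

def smoothed(prev, ctrl, supp, alpha=0.1):
    """Claim 22: combined value that changes more slowly than either raw input
    (simple exponential smoothing; alpha is an assumed constant)."""
    return prev + alpha * (average(ctrl, supp) - prev)

def coarse_fine(coarse, fine, fine_scale=0.05):
    """Claims 14-19: coarse value adjusted by a relatively small amount
    according to the fine value (e.g. stick state coarse, orientation fine)."""
    return coarse + fine_scale * fine

print(average(1.0, 0.0))                 # 0.5
print(weighted_average(1.0, 0.0, 3, 1))  # 0.75
print(coarse_fine(10.0, 2.0))            # 10.1
```

In the coarse/fine case, the stick supplies the large-scale command while the orientation signal nudges it by a small fraction, which matches the claims' "adjusted by a relatively small amount" language.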
Applications Claiming Priority (137)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2006/017483 WO2006121896A2 (en) | 2005-05-05 | 2006-05-04 | Microphone array based selective sound source listening and video game control |
USPCT/US2006/017483 | 2006-05-04 | ||
US11/429,047 US8233642B2 (en) | 2003-08-27 | 2006-05-04 | Methods and apparatuses for capturing an audio signal based on a location of the signal |
US11/429,133 US7760248B2 (en) | 2002-07-27 | 2006-05-04 | Selective sound source listening in conjunction with computer interactive processing |
US11/429,414 | 2006-05-04 | ||
US11/381,728 | 2006-05-04 | ||
US11/418989 | 2006-05-04 | ||
US11/418,989 US8139793B2 (en) | 2003-08-27 | 2006-05-04 | Methods and apparatus for capturing audio signals based on a visual image |
US11/418,989 | 2006-05-04 | ||
US11/429414 | 2006-05-04 | ||
US11/381,725 | 2006-05-04 | ||
US11/429,133 | 2006-05-04 | ||
US11/381,727 US7697700B2 (en) | 2006-05-04 | 2006-05-04 | Noise removal for electronic device with far field microphone on console |
US11/429,414 US7627139B2 (en) | 2002-07-27 | 2006-05-04 | Computer image and audio processing of intensity and input devices for interfacing with a computer program |
US11/381,721 US8947347B2 (en) | 2003-08-27 | 2006-05-04 | Controlling actions in a video game unit |
US11/381,729 | 2006-05-04 | ||
US11/418,988 | 2006-05-04 | ||
US11/381724 | 2006-05-04 | ||
US11/381,724 US8073157B2 (en) | 2003-08-27 | 2006-05-04 | Methods and apparatus for targeted sound detection and characterization |
US11/418,988 US8160269B2 (en) | 2003-08-27 | 2006-05-04 | Methods and apparatuses for adjusting a listening area for capturing sounds |
US11/429,047 | 2006-05-04 | ||
US11/429047 | 2006-05-04 | ||
US11/381,721 | 2006-05-04 | ||
US11/381,725 US7783061B2 (en) | 2003-08-27 | 2006-05-04 | Methods and apparatus for the targeted sound detection |
US11/381,729 US7809145B2 (en) | 2006-05-04 | 2006-05-04 | Ultra small microphone array |
US11/418988 | 2006-05-04 | ||
US11/381729 | 2006-05-04 | ||
US11/381,728 US7545926B2 (en) | 2006-05-04 | 2006-05-04 | Echo and noise cancellation |
US11/429133 | 2006-05-04 | ||
US11/381725 | 2006-05-04 | ||
US11/381728 | 2006-05-04 | ||
US11/381727 | 2006-05-04 | ||
US11/381721 | 2006-05-04 | ||
US11/381,727 | 2006-05-04 | ||
US11/381,724 | 2006-05-04 | ||
US79803106P | 2006-05-06 | 2006-05-06 | |
US11/382034 | 2006-05-06 | ||
US11/382,035 | 2006-05-06 | ||
US11/382033 | 2006-05-06 | ||
US11/382,037 | 2006-05-06 | ||
US11/382031 | 2006-05-06 | ||
US11/382,038 | 2006-05-06 | ||
US11/382,031 | 2006-05-06 | ||
US11/382,032 | 2006-05-06 | ||
US11/382,037 US8313380B2 (en) | 2002-07-27 | 2006-05-06 | Scheme for translating movements of a hand-held controller into inputs for a system |
US11/382,033 US8686939B2 (en) | 2002-07-27 | 2006-05-06 | System, method, and apparatus for three-dimensional input control |
US11/382037 | 2006-05-06 | ||
US11/382,031 US7918733B2 (en) | 2002-07-27 | 2006-05-06 | Multi-input game control mixer |
US29/259350 | 2006-05-06 | ||
US11/382,034 US20060256081A1 (en) | 2002-07-27 | 2006-05-06 | Scheme for detecting and tracking user manipulation of a game controller body |
US29/259,350 | 2006-05-06 | ||
US11/382035 | 2006-05-06 | ||
US11/382,033 | 2006-05-06 | ||
US11/382,034 | 2006-05-06 | ||
US29259349 | 2006-05-06 | ||
US29259348 | 2006-05-06 | ||
US29/259,348 | 2006-05-06 | ||
US11/382,036 | 2006-05-06 | ||
US11/382,032 US7850526B2 (en) | 2002-07-27 | 2006-05-06 | System for tracking user manipulations within an environment |
US29/259,350 USD621836S1 (en) | 2006-05-06 | 2006-05-06 | Controller face with tracking sensors |
US11/382,038 US7352358B2 (en) | 2002-07-27 | 2006-05-06 | Method and system for applying gearing effects to acoustical tracking |
US29/259348 | 2006-05-06 | ||
US11/382032 | 2006-05-06 | ||
US29/259,349 | 2006-05-06 | ||
US11/382,035 US8797260B2 (en) | 2002-07-27 | 2006-05-06 | Inertially trackable hand-held controller |
US11/382,036 US9474968B2 (en) | 2002-07-27 | 2006-05-06 | Method and system for applying gearing effects to visual tracking |
US60/798031 | 2006-05-06 | ||
US29/259349 | 2006-05-06 | ||
US60/798,031 | 2006-05-06 | ||
US11/382036 | 2006-05-06 | ||
US11/382038 | 2006-05-06 | ||
US11/382043 | 2006-05-07 | ||
US11/382,041 US7352359B2 (en) | 2002-07-27 | 2006-05-07 | Method and system for applying gearing effects to inertial tracking |
US11/382039 | 2006-05-07 | ||
US11/382,043 | 2006-05-07 | ||
US11/382,039 | 2006-05-07 | ||
US11/382,039 US9393487B2 (en) | 2002-07-27 | 2006-05-07 | Method for mapping movements of a hand-held controller to game commands |
US11/382,043 US20060264260A1 (en) | 2002-07-27 | 2006-05-07 | Detectable and trackable hand-held controller |
US11/382041 | 2006-05-07 | ||
US11/382,040 | 2006-05-07 | ||
US11/382,041 | 2006-05-07 | ||
US11/382,040 US7391409B2 (en) | 2002-07-27 | 2006-05-07 | Method and system for applying gearing effects to multi-channel mixed input |
US11/382040 | 2006-05-07 | ||
US11/382,252 | 2006-05-08 | ||
US29/246762 | 2006-05-08 | ||
US11/430,593 | 2006-05-08 | ||
US29/246,744 | 2006-05-08 | ||
US29/246,759 | 2006-05-08 | ||
US11/382,250 US7854655B2 (en) | 2002-07-27 | 2006-05-08 | Obtaining input for controlling execution of a game program |
US29/246768 | 2006-05-08 | ||
US29246765 | 2006-05-08 | ||
US29/246764 | 2006-05-08 | ||
US29/246743 | 2006-05-08 | ||
US11/382,256 | 2006-05-08 | ||
US29/246763 | 2006-05-08 | ||
US11/430593 | 2006-05-08 | ||
US29/246,767 USD572254S1 (en) | 2006-05-08 | 2006-05-08 | Video game controller |
US11/382250 | 2006-05-08 | ||
US11/382,250 | 2006-05-08 | ||
US11/430,594 US20070260517A1 (en) | 2006-05-08 | 2006-05-08 | Profile detection |
US11/382,259 US20070015559A1 (en) | 2002-07-27 | 2006-05-08 | Method and apparatus for use in determining lack of user activity in relation to a system |
US29/246767 | 2006-05-08 | ||
US11/382256 | 2006-05-08 | ||
US29/246,744 USD630211S1 (en) | 2006-05-08 | 2006-05-08 | Video game controller front face |
US11/430,593 US20070261077A1 (en) | 2006-05-08 | 2006-05-08 | Using audio/visual environment to select ads on game platform |
US29246766 | 2006-05-08 | ||
US29/246,743 USD571367S1 (en) | 2006-05-08 | 2006-05-08 | Video game controller |
US29246759 | 2006-05-08 | ||
US29/246,743 | 2006-05-08 | ||
US29/246,768 USD571806S1 (en) | 2006-05-08 | 2006-05-08 | Video game controller |
US29246762 | 2006-05-08 | ||
US29/246,768 | 2006-05-08 | ||
US11/382,251 | 2006-05-08 | ||
US11/382,256 US7803050B2 (en) | 2002-07-27 | 2006-05-08 | Tracking device with sound emitter for use in obtaining information for controlling game program execution |
US11/382251 | 2006-05-08 | ||
US11/382259 | 2006-05-08 | ||
US29/246,764 USD629000S1 (en) | 2006-05-08 | 2006-05-08 | Game interface device with optical port |
US11/430,594 | 2006-05-08 | ||
US11/382,258 US7782297B2 (en) | 2002-07-27 | 2006-05-08 | Method and apparatus for use in determining an activity level of a user in relation to a system |
US29246763 | 2006-05-08 | ||
US29/246,766 | 2006-05-08 | ||
US29/246765 | 2006-05-08 | ||
US29/246,765 | 2006-05-08 | ||
US29/246766 | 2006-05-08 | ||
US11/382,251 US20060282873A1 (en) | 2002-07-27 | 2006-05-08 | Hand-held controller having detectable elements for tracking purposes |
US11/382252 | 2006-05-08 | ||
US11/382258 | 2006-05-08 | ||
US11/430594 | 2006-05-08 | ||
US29/246,764 | 2006-05-08 | ||
US29/246744 | 2006-05-08 | ||
US29/246,763 | 2006-05-08 | ||
US11/382,252 US10086282B2 (en) | 2002-07-27 | 2006-05-08 | Tracking device for use in obtaining information for controlling game program execution |
US11/382,258 | 2006-05-08 | ||
US29/246,762 | 2006-05-08 | ||
US29/246759 | 2006-05-08 | ||
US11/382,259 | 2006-05-08 | ||
US29/246,767 | 2006-05-08 |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200780025400.6A Division CN101484221B (en) | 2006-05-04 | 2007-04-14 | Obtaining input for controlling execution of a game program |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102989174A CN102989174A (en) | 2013-03-27 |
CN102989174B true CN102989174B (en) | 2016-06-29 |
Family
ID=46469882
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210037498.XA Active CN102580314B (en) | 2006-05-04 | 2007-04-14 | Obtaining input for controlling execution of a game program |
CN201210496712.8A Active CN102989174B (en) | 2006-05-04 | 2007-04-14 | Obtain the input being used for controlling the operation of games |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210037498.XA Active CN102580314B (en) | 2006-05-04 | 2007-04-14 | Obtaining input for controlling execution of a game program |
Country Status (3)
Country | Link |
---|---|
JP (3) | JP2009535173A (en) |
CN (2) | CN102580314B (en) |
WO (2) | WO2007130793A2 (en) |
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10279254B2 (en) * | 2005-10-26 | 2019-05-07 | Sony Interactive Entertainment Inc. | Controller having visually trackable object for interfacing with a gaming system |
JP5659453B2 (en) | 2007-11-15 | 2015-01-28 | セイコーエプソン株式会社 | Ink composition |
US20090221368A1 (en) * | 2007-11-28 | 2009-09-03 | Ailive Inc., | Method and system for creating a shared game space for a networked game |
US8419545B2 (en) * | 2007-11-28 | 2013-04-16 | Ailive, Inc. | Method and system for controlling movements of objects in a videogame |
GB2458297B (en) * | 2008-03-13 | 2012-12-12 | Performance Designed Products Ltd | Pointing device |
JP4628483B2 (en) * | 2008-07-15 | 2011-02-09 | パナソニック株式会社 | Portable device and position specifying method thereof |
KR101250513B1 (en) * | 2008-10-27 | 2013-04-03 | 소니 컴퓨터 엔터테인먼트 인코포레이티드 | Spherical ended controller with configurable modes |
KR20100138725A (en) | 2009-06-25 | 2010-12-31 | 삼성전자주식회사 | Method and apparatus for processing virtual world |
JP5534729B2 (en) * | 2009-07-16 | 2014-07-02 | 株式会社タイトー | Screen coordinate position detection method, screen coordinate position detection apparatus and gun game apparatus using double circle index |
CN106943742B (en) * | 2011-02-11 | 2024-04-26 | 漳州市阿思星谷电子科技有限公司 | Action amplifying system |
US20120277001A1 (en) * | 2011-04-28 | 2012-11-01 | Microsoft Corporation | Manual and Camera-based Game Control |
US8870654B2 (en) * | 2011-11-23 | 2014-10-28 | Sony Computer Entertainment America Llc | Gaming controller |
US10525347B2 (en) | 2012-03-13 | 2020-01-07 | Sony Interactive Entertainment America Llc | System and method for capturing and sharing console gaming data |
US8672765B2 (en) * | 2012-03-13 | 2014-03-18 | Sony Computer Entertainment America Llc | System and method for capturing and sharing console gaming data |
US9116555B2 (en) | 2011-11-23 | 2015-08-25 | Sony Computer Entertainment America Llc | Gaming controller |
US10486064B2 (en) | 2011-11-23 | 2019-11-26 | Sony Interactive Entertainment America Llc | Sharing buffered gameplay in response to an input request |
US10960300B2 (en) | 2011-11-23 | 2021-03-30 | Sony Interactive Entertainment LLC | Sharing user-initiated recorded gameplay with buffered gameplay |
CN103974752B (en) * | 2011-12-19 | 2016-05-18 | 英派尔科技开发有限公司 | Be used for the time-out of the game based on posture and restart scheme |
BR112015008168B1 (en) * | 2012-10-15 | 2021-11-23 | Sony Computer Entertainment Inc | OPERATING DEVICE |
US9690392B2 (en) | 2012-10-15 | 2017-06-27 | Sony Corporation | Operating device including a touch sensor |
GB2533394A (en) * | 2014-12-19 | 2016-06-22 | Gen Electric | Method and system for generating a control signal for a medical device |
CN107548503B (en) * | 2015-06-17 | 2022-01-11 | 克朗设备公司 | Dynamic vehicle performance analyzer with smoothing filter |
CN108604454B (en) * | 2016-03-16 | 2020-12-15 | 华为技术有限公司 | Audio signal processing apparatus and input audio signal processing method |
RU2642394C1 (en) * | 2017-05-05 | 2018-01-24 | Андрей Валерьевич Груздев | Device for control of the movement system |
JP6957218B2 (en) * | 2017-06-12 | 2021-11-02 | 株式会社バンダイナムコエンターテインメント | Simulation system and program |
JP6822906B2 (en) | 2017-06-23 | 2021-01-27 | 株式会社東芝 | Transformation matrix calculation device, position estimation device, transformation matrix calculation method and position estimation method |
KR102480310B1 (en) * | 2017-11-06 | 2022-12-23 | 삼성전자주식회사 | Display apparatus and control method of the same |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5554980A (en) * | 1993-03-12 | 1996-09-10 | Mitsubishi Denki Kabushiki Kaisha | Remote control system |
US6069594A (en) * | 1991-07-29 | 2000-05-30 | Logitech, Inc. | Computer input device with multiple switches using single line |
US6489948B1 (en) * | 2000-04-20 | 2002-12-03 | Benny Chi Wah Lau | Computer mouse having multiple cursor positioning inputs and method of operation |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ES2056580T3 (en) * | 1990-05-18 | 1994-10-01 | British Aerospace | INERTIAL SENSORS. |
US5181181A (en) | 1990-09-27 | 1993-01-19 | Triton Technologies, Inc. | Computer apparatus input device for three-dimensional information |
JP3907213B2 (en) * | 1992-09-11 | 2007-04-18 | 伸壹 坪田 | Game control device |
US6022274A (en) * | 1995-11-22 | 2000-02-08 | Nintendo Co., Ltd. | Video game system using memory module |
CN1177634C (en) * | 1996-03-05 | 2004-12-01 | 世嘉企业股份有限公司 | Controller and extension unit for controller |
US5992233A (en) * | 1996-05-31 | 1999-11-30 | The Regents Of The University Of California | Micromachined Z-axis vibratory rate gyroscope |
JPH1021000A (en) * | 1996-06-28 | 1998-01-23 | Sumitomo Metal Ind Ltd | Signal input device |
US6400374B2 (en) * | 1996-09-18 | 2002-06-04 | Eyematic Interfaces, Inc. | Video superposition system and method |
US6720949B1 (en) * | 1997-08-22 | 2004-04-13 | Timothy R. Pryor | Man machine interfaces and applications |
JPH11253656A (en) | 1998-03-09 | 1999-09-21 | Omron Corp | Attachment of game controller |
JP4805433B2 (en) * | 1999-03-31 | 2011-11-02 | 株式会社カプコン | Signal input device and regulating member |
US6417836B1 (en) * | 1999-08-02 | 2002-07-09 | Lucent Technologies Inc. | Computer input device having six degrees of freedom for controlling movement of a three-dimensional object |
JP3847058B2 (en) * | 1999-10-04 | 2006-11-15 | 任天堂株式会社 | GAME SYSTEM AND GAME INFORMATION STORAGE MEDIUM USED FOR THE SAME |
JP2002090384A (en) * | 2000-09-13 | 2002-03-27 | Microstone Corp | Structure of motion sensor and internal connecting method |
JP3611807B2 (en) * | 2001-07-19 | 2005-01-19 | コナミ株式会社 | Video game apparatus, pseudo camera viewpoint movement control method and program in video game |
JP2003131796A (en) * | 2001-10-22 | 2003-05-09 | Sony Corp | Information input device, its method and computer program |
CN1692401B (en) * | 2002-04-12 | 2011-11-16 | 雷斯里·R·奥柏梅尔 | Multi-axis input transducer apparatus and joystick |
JP4179162B2 (en) * | 2003-12-26 | 2008-11-12 | 株式会社セガ | Information processing device, game device, image generation method, and game image generation method |
JP2006031515A (en) * | 2004-07-20 | 2006-02-02 | Vodafone Kk | Mobile communication terminal, application program, image display control device, and image display control method |
JP4610971B2 (en) * | 2004-09-07 | 2011-01-12 | 任天堂株式会社 | Game program |
-
2007
- 2007-04-14 CN CN201210037498.XA patent/CN102580314B/en active Active
- 2007-04-14 CN CN201210496712.8A patent/CN102989174B/en active Active
- 2007-04-14 WO PCT/US2007/067010 patent/WO2007130793A2/en active Application Filing
- 2007-04-19 JP JP2009509932A patent/JP2009535173A/en active Pending
- 2007-04-19 WO PCT/US2007/067005 patent/WO2007130792A2/en active Application Filing
- 2007-05-02 JP JP2007121964A patent/JP4553917B2/en active Active
-
2009
- 2009-08-07 JP JP2009185086A patent/JP5465948B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6069594A (en) * | 1991-07-29 | 2000-05-30 | Logitech, Inc. | Computer input device with multiple switches using single line |
US5554980A (en) * | 1993-03-12 | 1996-09-10 | Mitsubishi Denki Kabushiki Kaisha | Remote control system |
US6489948B1 (en) * | 2000-04-20 | 2002-12-03 | Benny Chi Wah Lau | Computer mouse having multiple cursor positioning inputs and method of operation |
Also Published As
Publication number | Publication date |
---|---|
CN102580314A (en) | 2012-07-18 |
WO2007130793A3 (en) | 2008-12-11 |
WO2007130793A2 (en) | 2007-11-15 |
CN102989174A (en) | 2013-03-27 |
WO2007130792A2 (en) | 2007-11-15 |
JP4553917B2 (en) | 2010-09-29 |
JP2009254888A (en) | 2009-11-05 |
JP5465948B2 (en) | 2014-04-09 |
JP2009535173A (en) | 2009-10-01 |
CN102580314B (en) | 2015-05-20 |
JP2007296367A (en) | 2007-11-15 |
WO2007130792A3 (en) | 2008-09-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102989174B (en) | Obtain the input being used for controlling the operation of games | |
CN101484221B (en) | Obtaining input for controlling execution of a game program | |
KR101036403B1 (en) | Object detection using video input combined with tilt angle information | |
US7854655B2 (en) | Obtaining input for controlling execution of a game program | |
CN101438340B (en) | System, method, and apparatus for three-dimensional input control | |
US10086282B2 (en) | Tracking device for use in obtaining information for controlling game program execution | |
US7918733B2 (en) | Multi-input game control mixer | |
US7850526B2 (en) | System for tracking user manipulations within an environment | |
US9682320B2 (en) | Inertially trackable hand-held controller | |
JP5022385B2 (en) | Gesture catalog generation and recognition | |
US20060287084A1 (en) | System, method, and apparatus for three-dimensional input control | |
JP2012164330A (en) | System for tracking user operation in environment | |
KR101020510B1 (en) | Multi-input game control mixer | |
EP2351604A2 (en) | Obtaining input for controlling execution of a game program | |
KR101020509B1 (en) | Obtaining input for controlling execution of a program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |