
CN109144256B - Virtual reality behavior interaction method and device - Google Patents

Virtual reality behavior interaction method and device

Info

Publication number
CN109144256B
Authority
CN
China
Prior art keywords
behavior
trigger
virtual
target object
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810947092.2A
Other languages
Chinese (zh)
Other versions
CN109144256A (en)
Inventor
程大鹏
梁景辉
罗云环
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Field River Guangzhou Three Culture Science And Technology Co Ltd
Original Assignee
Field River Guangzhou Three Culture Science And Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Field River Guangzhou Three Culture Science And Technology Co Ltd filed Critical Field River Guangzhou Three Culture Science And Technology Co Ltd
Priority to CN201810947092.2A priority Critical patent/CN109144256B/en
Publication of CN109144256A publication Critical patent/CN109144256A/en
Application granted granted Critical
Publication of CN109144256B publication Critical patent/CN109144256B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 - Eye tracking input arrangements
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 - Indexing scheme relating to G06F3/01
    • G06F 2203/012 - Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Embodiments of the invention provide a virtual reality behavior interaction method and device. The method comprises: creating a virtual scene according to a scene selection instruction input by a user, and creating virtual objects in the virtual scene, wherein each virtual object is provided with a first trigger; after capturing interaction information between the user's second trigger and the first trigger on a virtual object, determining that the virtual object is a virtual target object and displaying all operable behaviors of the virtual target object; after obtaining the user's behavior selection instruction among all operable behaviors, determining the target behavior, and calling and running the script corresponding to the target behavior from a preset behavior toolkit, so that the virtual target object performs the target behavior. The preset behavior toolkit packages the scripts corresponding to all operable behaviors of every virtual object, and there may be one or more virtual target objects. Embodiments of the invention improve the real-time responsiveness of interaction between the user and virtual target objects and improve the user experience.

Description

Virtual reality behavior interaction method and device
Technical field
The present invention relates to the technical field of virtual reality, and more particularly to a virtual reality behavior interaction method and device.
Background art
With the rapid development of science and technology, virtual reality technology has found more and more applications in people's lives.
In the prior art, a user can interact with a virtual target object in a virtual scene: by selecting an action for the virtual target object, the user can make the virtual target object perform the corresponding action, giving the user a lifelike experience of interacting with the virtual target object.
However, because the actions of each virtual target object are developed individually during project development, when the user selects an action for a particular virtual target object, the system must first locate the memory block where that object stores the corresponding action data, and then retrieve the data so that the object can perform the action. As a result, interaction between the user and virtual target objects in the prior art suffers a certain delay, which degrades the user experience.
Summary of the invention
Embodiments of the invention provide a virtual reality behavior interaction method and device that improve the real-time responsiveness of interaction between the user and virtual target objects and improve the user experience.
According to one aspect of the present invention, a virtual reality behavior interaction method is provided, comprising:
Creating a virtual scene according to a scene selection instruction input by the user, and creating virtual objects in the virtual scene, wherein each virtual object is provided with a first trigger;
After capturing interaction information between the user's second trigger and the first trigger on a virtual object, determining that the virtual object is a virtual target object, and displaying all operable behaviors of the virtual target object;
After obtaining the user's behavior selection instruction among all of the operable behaviors, determining the target behavior, and calling and running the script corresponding to the target behavior from a preset behavior toolkit, so that the virtual target object performs the target behavior;
Wherein the preset behavior toolkit packages the scripts corresponding to all operable behaviors of every virtual object, and there may be one or more virtual target objects.
Preferably, creating the virtual scene according to the scene selection instruction input by the user and creating the virtual objects in the virtual scene specifically comprises:
Obtaining the scene selection instruction input by the user, and establishing the virtual scene based on a preset space coordinate origin according to the scene selection instruction;
Determining the preset coordinates of each virtual object, and creating each virtual object in the virtual scene according to the preset coordinates.
Preferably, the operable behaviors are push-pull, scaling, picking up, patting, or moving.
Preferably, the interaction information includes, when the user interacts with the virtual target object, the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger.
Preferably, the virtual target object performing the target behavior specifically comprises:
When the target behavior is push-pull, the virtual target object moves a preset first distance along the horizontal direction away from or toward the user;
When the target behavior is scaling, the virtual target object is reduced to a preset first multiple of its original size or enlarged to a preset second multiple of its original size;
When the target behavior is picking up, the virtual target object follows the second trigger as it moves;
When the target behavior is patting, the virtual target object shakes back and forth about its geometric center a preset number of times;
When the target behavior is moving, the virtual target object moves accordingly, according to the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger.
According to another aspect of the present invention, a virtual reality behavior interaction device is provided, comprising:
A creation module, configured to create a virtual scene according to a scene selection instruction input by the user, and to create virtual objects in the virtual scene, wherein each virtual object is provided with a first trigger;
An interaction module, configured to, after capturing interaction information between the user's second trigger and the first trigger on a virtual object, determine that the virtual object is a virtual target object and display all operable behaviors of the virtual target object;
An execution module, configured to, after obtaining the user's behavior selection instruction among all of the operable behaviors, determine the target behavior, and call and run the script corresponding to the target behavior from a preset behavior toolkit, so that the virtual target object performs the target behavior;
Wherein the preset behavior toolkit includes the scripts corresponding to all operable behaviors of every virtual object, and there may be one or more virtual target objects.
Preferably, the creation module comprises:
A first creating unit, configured to obtain the scene selection instruction input by the user, and to establish the virtual scene based on a preset space coordinate origin according to the scene selection instruction;
A second creating unit, configured to determine the preset coordinates of each virtual object, and to create each virtual object in the virtual scene according to the preset coordinates.
Preferably, the operable behaviors are push-pull, scaling, picking up, patting, or moving.
Preferably, the interaction information includes, when the user interacts with the virtual target object, the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger.
Preferably, the execution module comprises:
A determination unit, configured to determine the target behavior after obtaining the user's behavior selection instruction among all of the operable behaviors;
An execution unit, configured to call and run the script corresponding to the target behavior from the preset behavior toolkit, so that the virtual target object performs the target behavior;
Wherein the execution unit comprises:
A running subunit, configured to call and run the script corresponding to the target behavior from the preset behavior toolkit;
A first execution subunit, configured to make the virtual target object move a preset first distance along the horizontal direction away from or toward the user;
A second execution subunit, configured to make the virtual target object be reduced to a preset first multiple of its original size or enlarged to a preset second multiple of its original size;
A third execution subunit, configured to make the virtual target object follow the second trigger as it moves;
A fourth execution subunit, configured to make the virtual target object shake back and forth about its geometric center a preset number of times;
A fifth execution subunit, configured to make the virtual target object move accordingly, according to the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger.
As can be seen from the above technical solutions, the embodiments of the present invention have the following advantages:
Embodiments of the invention provide a virtual reality behavior interaction method and device. The method comprises: creating a virtual scene according to a scene selection instruction input by the user, and creating virtual objects in the virtual scene, wherein each virtual object is provided with a first trigger; after capturing interaction information between the user's second trigger and the first trigger on a virtual object, determining that the virtual object is a virtual target object and displaying all operable behaviors of the virtual target object; after obtaining the user's behavior selection instruction among all operable behaviors, determining the target behavior, and calling and running the script corresponding to the target behavior from a preset behavior toolkit, so that the virtual target object performs the target behavior; wherein the preset behavior toolkit packages the scripts corresponding to all operable behaviors of every virtual object, and there may be one or more virtual target objects. The invention determines, from the interaction information between the triggers, the target behavior the user has selected for a virtual target object. When there are multiple virtual target objects, the script corresponding to each object's target behavior can be called from the same preset behavior toolkit and run, so that each virtual target object performs its corresponding behavior. This improves the real-time responsiveness of interaction between the user and virtual target objects and improves the user experience.
Furthermore, the invention also helps developers of interactive functionality quickly build an interaction-related development environment, improving their working efficiency.
Brief description of the drawings
In order to explain the embodiments of the invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of one embodiment of the virtual reality behavior interaction method provided by the invention;
Fig. 2 is a schematic flowchart of another embodiment of the virtual reality behavior interaction method provided by the invention;
Fig. 3 is a schematic structural diagram of one embodiment of the virtual reality behavior interaction device provided by the invention;
Fig. 4 is a schematic structural diagram of another embodiment of the virtual reality behavior interaction device provided by the invention.
Detailed description of the embodiments
Embodiments of the invention provide a virtual reality behavior interaction method and device that improve the real-time responsiveness of interaction between the user and virtual target objects and improve the user experience.
To make the objectives, features and advantages of the invention more obvious and easier to understand, the technical solutions in the embodiments of the invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the embodiments described below are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the invention.
Referring to Fig. 1, one embodiment of the virtual reality behavior interaction method provided by the invention comprises:
101. A virtual scene is created according to a scene selection instruction input by the user, and virtual objects are created in the virtual scene, wherein each virtual object is provided with a first trigger.
In this embodiment, after the scene selection instruction input by the user is obtained, a virtual scene corresponding to the instruction is created, for example an urban environment such as a street, a room, an office or a shop, or an outdoor environment such as a forest, the ocean or a grassland. Each virtual scene is associated with a certain number of virtual objects (set in advance by the developer, with the quantity chosen according to actual needs), so once the user selects a particular virtual scene, the virtual objects that need to be created for that scene are determined accordingly. After the virtual scene has been created, its corresponding virtual objects are created in the scene one by one, forming a complete virtual model. It should be noted that in the created virtual scene, a first trigger is provided on each virtual object.
102. After the interaction information between the user's second trigger and the first trigger on a virtual object is captured, the virtual object is determined to be a virtual target object, and all operable behaviors of the virtual target object are displayed.
In this embodiment, the user wears an operating tool with a built-in second trigger. The second trigger is essentially the same as the first trigger, and the two triggers have a mutual-induction correspondence. When the user makes a hand motion toward a virtual object, for example sweeping across it, the second trigger interacts with the first trigger on that virtual object; the interaction information between the two triggers can then be captured, and the virtual object is determined to be a virtual target object. At the same time, all operable behaviors of the virtual target object are displayed near it, facing the user.
It should be noted that the user can also make a motion toward multiple virtual objects at the same time. For the virtual objects the user has acted on, after the interaction information between the first trigger on each virtual object and the user's second trigger is captured one by one, each virtual object can be determined to be a virtual target object, and all operable behaviors of each virtual target object can then be displayed one by one in chronological order.
103. After the user's behavior selection instruction among all operable behaviors is obtained, the target behavior is determined, and the script corresponding to the target behavior is called from the preset behavior toolkit and run, so that the virtual target object performs the target behavior.
After viewing all operable behaviors displayed on the virtual target object, the user selects one of them. Once the user's behavior selection instruction is obtained, the behavior corresponding to that instruction is determined to be the target behavior, the script corresponding to the target behavior is called from the preset behavior toolkit, and the script is run. It can be understood that a certain algorithm is built into the script; after the algorithm is run, the corresponding virtual target object performs the action corresponding to that algorithm.
In this embodiment, the preset behavior toolkit packages the scripts corresponding to all operable behaviors of every virtual object. Therefore, even if the user makes a hand motion toward multiple virtual objects, once the target behavior of each virtual target object is determined, the corresponding scripts can all be called from the same preset behavior toolkit, so that each virtual target object performs its corresponding behavior. This improves the real-time responsiveness of interaction between the user and the virtual target objects and improves the user experience.
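To make the role of the shared toolkit concrete, here is a minimal Python sketch of one possible "preset behavior toolkit": a single registry that packages one script per operable behavior and is reused by every virtual target object. The patent does not specify an implementation language or API; all names below (BehaviorToolkit, the behavior keys, the example targets) are illustrative assumptions.

```python
# Minimal sketch of a shared "preset behavior toolkit": one registry packages a
# script for every operable behavior, so any selected target object calls into
# the same toolkit instead of its own per-object action data.
# All names here are assumptions made for illustration only.

from typing import Any, Callable, Dict

class BehaviorToolkit:
    def __init__(self) -> None:
        self._scripts: Dict[str, Callable[..., Any]] = {}

    def register(self, behavior: str, script: Callable[..., Any]) -> None:
        # Package the script corresponding to one operable behavior.
        self._scripts[behavior] = script

    def run(self, behavior: str, target: Any, **interaction: Any) -> Any:
        # Call and run the script for the selected target behavior.
        return self._scripts[behavior](target, **interaction)

toolkit = BehaviorToolkit()
toolkit.register("pat", lambda obj, times=3: f"{obj} shakes back and forth {times} times")
toolkit.register("scale", lambda obj, factor=1.5: f"{obj} scaled by {factor}x")

# Several target objects selected at once reuse the same toolkit:
for obj in ["chair", "box"]:
    print(toolkit.run("pat", obj))
```

The point of this design is that selecting behaviors for several target objects at once only requires look-ups in this one registry, rather than locating separate per-object action data.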
The above is one embodiment of the virtual reality behavior interaction method. A more specific description is given below through another embodiment. Referring to Fig. 2, another embodiment of the virtual reality behavior interaction method provided by the invention comprises:
201. The scene selection instruction input by the user is obtained, the virtual scene is established based on a preset space coordinate origin according to the scene selection instruction, the preset coordinates of each virtual object are determined, and each virtual object is created in the virtual scene according to its preset coordinates, wherein each virtual object is provided with a first trigger.
In this embodiment, after the scene selection instruction input by the user is obtained, a virtual scene corresponding to the instruction is created. Each virtual scene is associated with a certain number of virtual objects (set in advance by the developer, with the quantity chosen according to actual needs), so once the user selects a particular virtual scene, the virtual objects that need to be created for that scene are determined accordingly.
The user's gaze is captured by the head-mounted display the user wears, and the virtual scene is then created on the basis of a preset space coordinate origin (the origin is usually aligned with the user's visual focus point, or the two can be made to approximately overlap, as long as the deviation between them is within a preset acceptable error range). The scene can then be regarded as a three-dimensional space coordinate system. Each corresponding virtual object is provided with preset coordinates, and the virtual objects corresponding to the scene are created one by one according to their preset coordinates, forming a complete virtual model, which is then presented through the head-mounted display for the user to watch and experience. It should be noted that in the created virtual scene, a first trigger is provided on each virtual object.
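As a rough illustration of step 201's scene setup, the following Python sketch creates a scene anchored at a preset space coordinate origin and instantiates each virtual object at its preset coordinates, attaching a first trigger to each. The class names, the scene presets and the coordinate values are assumptions made up for the example, not part of the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class FirstTrigger:
    owner: str  # name of the virtual object this trigger is attached to

@dataclass
class VirtualObject:
    name: str
    position: Vec3
    trigger: Optional[FirstTrigger] = None

@dataclass
class VirtualScene:
    origin: Vec3                                   # preset space coordinate origin
    objects: List[VirtualObject] = field(default_factory=list)

# Preset coordinates of the objects belonging to each scene, configured in advance
# by the developer (the scene names and coordinates here are made up).
SCENE_PRESETS: Dict[str, Dict[str, Vec3]] = {
    "room": {"chair": (1.0, 0.0, 2.0), "box": (-0.5, 0.0, 1.5)},
}

def create_scene(scene_selection: str, origin: Vec3 = (0.0, 0.0, 0.0)) -> VirtualScene:
    """Create the selected scene and its virtual objects, each with a first trigger."""
    scene = VirtualScene(origin=origin)
    for name, pos in SCENE_PRESETS[scene_selection].items():
        scene.objects.append(
            VirtualObject(name=name, position=pos, trigger=FirstTrigger(owner=name))
        )
    return scene

scene = create_scene("room")
print([obj.name for obj in scene.objects])  # ['chair', 'box']
```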
202. After the interaction information between the user's second trigger and the first trigger on a virtual object is captured, the virtual object is determined to be a virtual target object, and all operable behaviors of the virtual target object are displayed.
In this embodiment, the user wears an operating tool with a built-in second trigger. The second trigger is essentially the same as the first trigger, and the two triggers have a mutual-induction correspondence. When the user makes a hand motion toward a virtual object, for example sweeping across it, the second trigger interacts with the first trigger on that virtual object; the interaction information between the two triggers can then be captured, and the virtual object is determined to be a virtual target object. At the same time, all operable behaviors of the virtual target object are displayed near it, facing the user. It should be noted that in this embodiment the operable behaviors are push-pull, scaling, picking up, patting, or moving.
It should be noted that the user can also make a motion toward multiple virtual objects at the same time. For the virtual objects the user has acted on, after the interaction information between the first trigger on each virtual object and the user's second trigger is captured one by one (in this embodiment, the interaction information includes, when the user interacts with the virtual target object, the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger), each virtual object can be determined to be a virtual target object, and all operable behaviors of each virtual target object can then be displayed one by one in chronological order.
For example, when the virtual object is a chair, the user can sweep an arm motion across the chair so that the user's second trigger interacts with the first trigger on the chair. After the interaction information between the two triggers is captured, all of the chair's operable behaviors are displayed on the chair, for example as a list containing push-pull, scaling, picking up, patting and moving. Similarly, when the user sweeps across multiple virtual objects, the process of displaying the operable behaviors of each virtual object is similar to that of the chair above and is not repeated here.
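The capture step can be pictured with the following Python sketch: each interaction record carries the angle between the two triggers and the distance the second trigger moved, the swept objects are handled in time order, and each becomes a target object whose operable behaviors are listed. The data layout and field names are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List

# The five operable behaviors enumerated in the description.
OPERABLE_BEHAVIORS = ["push-pull", "scale", "pick up", "pat", "move"]

@dataclass
class Interaction:
    object_name: str        # virtual object whose first trigger was touched
    angle_deg: float        # angle between the second trigger and the first trigger
    moved_distance: float   # distance the second trigger moved relative to the first
    timestamp: float        # when the interaction was captured

def select_targets(interactions: List[Interaction]) -> List[str]:
    """Turn every swept object into a target object, in chronological order."""
    targets: List[str] = []
    for info in sorted(interactions, key=lambda i: i.timestamp):
        targets.append(info.object_name)
        # A real system would render this list next to the object, facing the user.
        print(f"{info.object_name}: operable behaviors -> {OPERABLE_BEHAVIORS}")
    return targets

select_targets([
    Interaction("chair", angle_deg=12.0, moved_distance=0.3, timestamp=0.10),
    Interaction("box", angle_deg=45.0, moved_distance=0.1, timestamp=0.25),
])
```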
203. After the user's behavior selection instruction among all operable behaviors is obtained, the target behavior is determined, and the script corresponding to the target behavior is called from the preset behavior toolkit and run, so that the virtual target object performs the target behavior.
After viewing all operable behaviors displayed on the virtual target object, the user selects one of them. Once the user's behavior selection instruction is obtained, the behavior corresponding to that instruction is determined to be the target behavior, the script corresponding to the target behavior is called from the preset behavior toolkit, and the script is run. It can be understood that a certain algorithm is built into the script; after the algorithm is run, the corresponding virtual target object performs the action corresponding to that algorithm.
In this embodiment, the preset behavior toolkit packages the scripts corresponding to all operable behaviors of every virtual object. Since the operable behaviors comprise the five behaviors push-pull, scaling, picking up, patting and moving, the process of the virtual target object performing the target behavior can be any one of the following cases:
When the target behavior is push-pull, the virtual target object moves a preset first distance along the horizontal direction away from or toward the user. For example, if the user chooses to pull a chair toward himself, the displayed effect is that the chair moves the preset first distance along the horizontal direction toward the user.
When the target behavior is scaling, the virtual target object is reduced to a preset first multiple of its original size or enlarged to a preset second multiple of its original size. For example, if the user chooses to enlarge a box, the displayed effect is that the box is enlarged to the preset second multiple of its original size (e.g. 2 times or 1.5 times, which can be set according to actual needs); if the user chooses to reduce it, the process is similar and is not described again here.
When the target behavior is picking up, the virtual target object follows the second trigger as it moves. It can be understood that the virtual object following the second trigger means that the first trigger on the virtual target object moves according to the second trigger; the displayed effect is that the virtual target object follows the user's hand.
When the target behavior is patting, the virtual target object shakes back and forth about its geometric center a preset number of times. For example, when the user chooses to pat a sphere, the displayed effect is that the sphere shakes about its geometric center with a certain amplitude and a certain number of times.
When the target behavior is moving, the virtual target object moves accordingly, according to the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger. It can be understood that when the target behavior is moving, the corresponding script calculates the distance and direction the target virtual object should move from the angle between the two triggers and the relative movement distance; the target virtual object then moves according to that distance and direction, simulating the effect of the user moving the virtual target object.
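The five cases above can be summarized in a small Python sketch, one function per behavior script. The concrete numbers (the preset first distance, the scaling multiples, the shake count and amplitude, the axis conventions) stand in for the "preset" values the description mentions and are assumptions chosen for the example.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class TargetObject:
    name: str
    position: Vec3
    scale: float = 1.0

def push_pull(obj: TargetObject, toward_user: bool, first_distance: float = 0.5) -> None:
    # Move the preset first distance along the horizontal axis toward or away from
    # the user (the user is assumed to sit in the -z direction).
    x, y, z = obj.position
    dz = -first_distance if toward_user else first_distance
    obj.position = (x, y, z + dz)

def scale(obj: TargetObject, enlarge: bool,
          first_multiple: float = 0.5, second_multiple: float = 2.0) -> None:
    # Reduce to the preset first multiple or enlarge to the preset second multiple.
    obj.scale *= second_multiple if enlarge else first_multiple

def pick_up(obj: TargetObject, second_trigger_pos: Vec3) -> None:
    # The object (and its first trigger) follows the second trigger, i.e. the user's hand.
    obj.position = second_trigger_pos

def pat(obj: TargetObject, times: int = 3, amplitude: float = 0.05) -> List[Vec3]:
    # Shake back and forth about the geometric center a preset number of times;
    # returns the keyframe positions of the shake.
    x, y, z = obj.position
    return [(x + amplitude * ((-1) ** k), y, z) for k in range(2 * times)]

def move(obj: TargetObject, angle_deg: float, distance: float) -> None:
    # Translate by the distance the second trigger moved, in the direction given by
    # the angle between the two triggers.
    rad = math.radians(angle_deg)
    x, y, z = obj.position
    obj.position = (x + distance * math.cos(rad), y, z + distance * math.sin(rad))

chair = TargetObject("chair", position=(1.0, 0.0, 2.0))
push_pull(chair, toward_user=True)   # the chair moves the preset first distance toward the user
print(chair.position)                # (1.0, 0.0, 1.5)
```

Used together with the toolkit sketched earlier, each function would be registered under its behavior name and invoked once the target behavior is determined.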
Therefore, even if the user makes a hand motion toward multiple virtual objects, once the target behavior of each virtual target object is determined, the corresponding scripts can all be called from the same preset behavior toolkit, so that each virtual target object performs its corresponding behavior. This improves the real-time responsiveness of interaction between the user and the virtual target objects and improves the user experience.
The above is a detailed description of the virtual reality behavior interaction method provided by the invention. The structure and connection relationships of the virtual reality behavior interaction device provided by the invention are described below. Referring to Fig. 3, one embodiment of the virtual reality behavior interaction device provided by the invention comprises:
A creation module 301, configured to create a virtual scene according to a scene selection instruction input by the user, and to create virtual objects in the virtual scene, wherein each virtual object is provided with a first trigger;
An interaction module 302, configured to, after capturing the interaction information between the user's second trigger and the first trigger on a virtual object, determine that the virtual object is a virtual target object and display all operable behaviors of the virtual target object;
An execution module 303, configured to, after obtaining the user's behavior selection instruction among all operable behaviors, determine the target behavior, and call and run the script corresponding to the target behavior from a preset behavior toolkit, so that the virtual target object performs the target behavior;
Wherein the preset behavior toolkit includes the scripts corresponding to all operable behaviors of every virtual object, and there may be one or more virtual target objects.
The above is one embodiment of the virtual reality behavior interaction device. A more specific description is given below through another embodiment. Referring to Fig. 4, another embodiment of the virtual reality behavior interaction device provided by the invention comprises:
A creation module 401, configured to create a virtual scene according to a scene selection instruction input by the user, and to create virtual objects in the virtual scene, wherein each virtual object is provided with a first trigger;
An interaction module 402, configured to, after capturing the interaction information between the user's second trigger and the first trigger on a virtual object, determine that the virtual object is a virtual target object and display all operable behaviors of the virtual target object;
An execution module 403, configured to, after obtaining the user's behavior selection instruction among all operable behaviors, determine the target behavior, and call and run the script corresponding to the target behavior from a preset behavior toolkit, so that the virtual target object performs the target behavior;
Wherein the preset behavior toolkit includes the scripts corresponding to all operable behaviors of every virtual object, and there may be one or more virtual target objects.
In this embodiment, the creation module 401 comprises:
A first creating unit 4011, configured to obtain the scene selection instruction input by the user, and to establish the virtual scene based on a preset space coordinate origin according to the scene selection instruction;
A second creating unit 4012, configured to determine the preset coordinates of each virtual object, and to create each virtual object in the virtual scene according to the preset coordinates.
In this embodiment, the operable behaviors are push-pull, scaling, picking up, patting, or moving.
In this embodiment, the interaction information includes, when the user interacts with the virtual target object, the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger.
In this embodiment, the execution module 403 comprises the following units (a minimal sketch of this module structure follows the list):
A determination unit 4031, configured to determine the target behavior after obtaining the user's behavior selection instruction among all operable behaviors;
An execution unit 4032, configured to call and run the script corresponding to the target behavior from the preset behavior toolkit, so that the virtual target object performs the target behavior.
The execution unit 4032 comprises:
A running subunit 40321, configured to call and run the script corresponding to the target behavior from the preset behavior toolkit; when the target behavior is push-pull, the first execution subunit 40322 is triggered; when the target behavior is scaling, the second execution subunit 40323 is triggered; when the target behavior is picking up, the third execution subunit 40324 is triggered; when the target behavior is patting, the fourth execution subunit 40325 is triggered; and when the target behavior is moving, the fifth execution subunit 40326 is triggered;
A first execution subunit 40322, configured to make the virtual target object move a preset first distance along the horizontal direction away from or toward the user;
A second execution subunit 40323, configured to make the virtual target object be reduced to a preset first multiple of its original size or enlarged to a preset second multiple of its original size;
A third execution subunit 40324, configured to make the virtual target object follow the second trigger as it moves;
A fourth execution subunit 40325, configured to make the virtual target object shake back and forth about its geometric center a preset number of times;
A fifth execution subunit 40326, configured to make the virtual target object move accordingly, according to the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger.
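Here is a compact sketch of how the modules listed above could fit together, assuming a Python structure: the execution module holds the preset behavior toolkit, its determination unit maps the selection instruction to a target behavior, and its execution unit dispatches to the per-behavior script (playing the role of the running subunit and the first to fifth execution subunits). All class and method names are invented for illustration.

```python
from typing import Any, Callable, Dict, List

class CreationModule:
    def create(self, scene_selection: str) -> List[str]:
        # Would build the scene at the preset origin and place objects with first triggers.
        return [f"{scene_selection}:chair", f"{scene_selection}:box"]

class InteractionModule:
    def capture(self, swept_objects: List[str]) -> List[str]:
        # Every swept object becomes a target object; its behaviors would be displayed here.
        return list(swept_objects)

class ExecutionModule:
    def __init__(self, scripts: Dict[str, Callable[..., Any]]) -> None:
        self.scripts = scripts            # the preset behavior toolkit

    def determine(self, selection_instruction: str) -> str:
        # Determination unit: the selection instruction names the target behavior.
        return selection_instruction

    def execute(self, behavior: str, target: str, **interaction: Any) -> Any:
        # Running subunit calls the script; the per-behavior subunit applies it.
        return self.scripts[behavior](target, **interaction)

# Wiring the modules together, as the control terminal mentioned below would do:
execution = ExecutionModule({"pat": lambda obj, times=3: f"{obj} pats {times}x"})
targets = InteractionModule().capture(CreationModule().create("room"))
for t in targets:
    print(execution.execute(execution.determine("pat"), t))
```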
It should be noted that the virtual reality behavior interaction device provided in embodiments of the invention can be understood as a control terminal; the control terminal can be connected to the head-mounted display worn by the operating user and to the operating tool on the user's hand, so as to exchange information with them.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, devices and units described above may refer to the corresponding processes in the foregoing method embodiments and are not described again here.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices and methods may be implemented in other ways. For example, the device embodiments described above are merely exemplary; the division into units is only a logical functional division, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the various embodiments of the invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the invention in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or some of the steps of the methods of the various embodiments of the invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above embodiments are only used to illustrate the technical solutions of the invention and do not limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the various embodiments of the invention.

Claims (4)

1. A virtual reality behavior interaction method, characterized by comprising:
Creating a virtual scene according to a scene selection instruction input by a user, and creating virtual objects in the virtual scene, wherein each virtual object is provided with a first trigger;
After capturing interaction information between the user's second trigger and the first trigger on a virtual object, determining that the virtual object is a virtual target object, and displaying all operable behaviors of the virtual target object;
After obtaining the user's behavior selection instruction among all of the operable behaviors, determining the target behavior, and calling and running the script corresponding to the target behavior from a preset behavior toolkit, so that the virtual target object performs the target behavior;
Wherein, when the user makes a motion toward multiple virtual objects at the same time, then for the virtual objects the user has acted on, the interaction information between the first trigger on each virtual object and the user's second trigger is obtained one by one, each virtual object is determined to be a virtual target object, and all operable behaviors of each virtual target object are displayed one by one in chronological order;
The preset behavior toolkit packages the scripts corresponding to all operable behaviors of every virtual object, and there may be one or more virtual target objects;
The operable behaviors are push-pull, scaling, picking up, patting, or moving;
The interaction information includes, when the user interacts with the virtual target object, the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger;
The virtual target object performing the target behavior specifically comprises:
When the target behavior is push-pull, the virtual target object moves a preset first distance along the horizontal direction away from or toward the user;
When the target behavior is scaling, the virtual target object is reduced to a preset first multiple of its original size or enlarged to a preset second multiple of its original size;
When the target behavior is picking up, the virtual target object follows the second trigger as it moves;
When the target behavior is patting, the virtual target object shakes back and forth about its geometric center a preset number of times;
When the target behavior is moving, the virtual target object moves accordingly, according to the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger.
2. The virtual reality behavior interaction method according to claim 1, characterized in that creating the virtual scene according to the scene selection instruction input by the user and creating the virtual objects in the virtual scene specifically comprises:
Obtaining the scene selection instruction input by the user, and establishing the virtual scene based on a preset space coordinate origin according to the scene selection instruction;
Determining the preset coordinates of each virtual object, and creating each virtual object in the virtual scene according to the preset coordinates.
3. A virtual reality behavior interaction device, characterized by comprising:
A creation module, configured to create a virtual scene according to a scene selection instruction input by a user, and to create virtual objects in the virtual scene, wherein each virtual object is provided with a first trigger;
An interaction module, configured to, after capturing interaction information between the user's second trigger and the first trigger on a virtual object, determine that the virtual object is a virtual target object and display all operable behaviors of the virtual target object;
An execution module, configured to, after obtaining the user's behavior selection instruction among all of the operable behaviors, determine the target behavior, and call and run the script corresponding to the target behavior from a preset behavior toolkit, so that the virtual target object performs the target behavior;
Wherein, when the user makes a motion toward multiple virtual objects at the same time, then for the virtual objects the user has acted on, the interaction information between the first trigger on each virtual object and the user's second trigger is obtained one by one, each virtual object is determined to be a virtual target object, and all operable behaviors of each virtual target object are displayed one by one in chronological order;
The preset behavior toolkit includes the scripts corresponding to all operable behaviors of every virtual object, and there may be one or more virtual target objects;
The operable behaviors are push-pull, scaling, picking up, patting, or moving;
The interaction information includes, when the user interacts with the virtual target object, the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger;
The execution module comprises:
A determination unit, configured to determine the target behavior after obtaining the user's behavior selection instruction among all of the operable behaviors;
An execution unit, configured to call and run the script corresponding to the target behavior from the preset behavior toolkit, so that the virtual target object performs the target behavior;
Wherein the execution unit comprises:
A running subunit, configured to call and run the script corresponding to the target behavior from the preset behavior toolkit;
A first execution subunit, configured to make the virtual target object move a preset first distance along the horizontal direction away from or toward the user;
A second execution subunit, configured to make the virtual target object be reduced to a preset first multiple of its original size or enlarged to a preset second multiple of its original size;
A third execution subunit, configured to make the virtual target object follow the second trigger as it moves;
A fourth execution subunit, configured to make the virtual target object shake back and forth about its geometric center a preset number of times;
A fifth execution subunit, configured to make the virtual target object move accordingly, according to the angle between the second trigger and the first trigger and the distance the second trigger has moved relative to the first trigger.
4. The virtual reality behavior interaction device according to claim 3, characterized in that the creation module comprises:
A first creating unit, configured to obtain the scene selection instruction input by the user, and to establish the virtual scene based on a preset space coordinate origin according to the scene selection instruction;
A second creating unit, configured to determine the preset coordinates of each virtual object, and to create each virtual object in the virtual scene according to the preset coordinates.
CN201810947092.2A 2018-08-20 2018-08-20 Virtual reality behavior interaction method and device Active CN109144256B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810947092.2A CN109144256B (en) 2018-08-20 2018-08-20 Virtual reality behavior interaction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810947092.2A CN109144256B (en) 2018-08-20 2018-08-20 Virtual reality behavior interaction method and device

Publications (2)

Publication Number Publication Date
CN109144256A CN109144256A (en) 2019-01-04
CN109144256B (en) 2019-08-23

Family

ID=64790409

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810947092.2A Active CN109144256B (en) 2018-08-20 2018-08-20 Virtual reality behavior interaction method and device

Country Status (1)

Country Link
CN (1) CN109144256B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10948978B2 (en) 2019-04-23 2021-03-16 XRSpace CO., LTD. Virtual object operating system and virtual object operating method
CN112152894B (en) * 2020-08-31 2022-02-18 青岛海尔空调器有限总公司 Household appliance control method based on virtual reality and virtual reality system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794752A (en) * 2015-04-30 2015-07-22 山东大学 Collaborative modeling method and system based on mobile terminal and holographic displayed virtual scene
CN107643820A (en) * 2016-07-20 2018-01-30 郎焘 The passive humanoid robots of VR and its implementation method
CN107918949A (en) * 2017-12-11 2018-04-17 网易(杭州)网络有限公司 Rendering intent, storage medium, processor and the terminal of virtual resource object
CN107957774A (en) * 2016-10-18 2018-04-24 阿里巴巴集团控股有限公司 Exchange method and device in virtual reality space environment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105894570A (en) * 2015-12-01 2016-08-24 乐视致新电子科技(天津)有限公司 Virtual reality scene modeling method and device
CN107515674B (en) * 2017-08-08 2018-09-04 山东科技大学 It is a kind of that implementation method is interacted based on virtual reality more with the mining processes of augmented reality
CN107624627A (en) * 2017-08-11 2018-01-26 骆秀菊 A kind of agricultural irrigation systems based on virtual reality
CN107862580A (en) * 2017-11-22 2018-03-30 纽世纪(广东)电子商务有限公司 A kind of commodity method for pushing and system
CN108108018A (en) * 2017-12-12 2018-06-01 歌尔科技有限公司 Commanding and training method, equipment and system based on virtual reality
CN108287483B (en) * 2018-01-17 2021-08-20 北京航空航天大学 Immersive virtual maintenance simulation method and system for product maintainability verification

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794752A (en) * 2015-04-30 2015-07-22 山东大学 Collaborative modeling method and system based on mobile terminal and holographic displayed virtual scene
CN107643820A (en) * 2016-07-20 2018-01-30 郎焘 The passive humanoid robots of VR and its implementation method
CN107957774A (en) * 2016-10-18 2018-04-24 阿里巴巴集团控股有限公司 Exchange method and device in virtual reality space environment
CN107918949A (en) * 2017-12-11 2018-04-17 网易(杭州)网络有限公司 Rendering intent, storage medium, processor and the terminal of virtual resource object

Also Published As

Publication number Publication date
CN109144256A (en) 2019-01-04

Similar Documents

Publication Publication Date Title
KR102590841B1 (en) virtual object driving Method, apparatus, electronic device, and readable storage medium
KR101918262B1 (en) Method and system for providing mixed reality service
CN107145227B (en) The exchange method and device of virtual reality scenario
Stafford et al. Implementation of god-like interaction techniques for supporting collaboration between outdoor AR and indoor tabletop users
CN112148189A (en) Interaction method and device in AR scene, electronic equipment and storage medium
CN112076473B (en) Control method and device of virtual prop, electronic equipment and storage medium
US20200097732A1 (en) Markerless Human Movement Tracking in Virtual Simulation
CN106774872A (en) Virtual reality system, virtual reality exchange method and device
US20160114243A1 (en) Image processing program, server device, image processing system, and image processing method
CN105617658A (en) Multiplayer moving shooting game system based on real indoor environment
CN109144256B (en) Virtual reality behavior interaction method and device
CN111882674A (en) Virtual object adjusting method and device, electronic equipment and storage medium
CN110427100A (en) A kind of movement posture capture system based on depth camera
CN113230654B (en) Shooting display method and device of virtual gun, computer equipment and storage medium
CN105892650A (en) Information processing method and electronic equipment
CN109840946A (en) Virtual objects display methods and device
CN202159302U (en) Augment reality system with user interaction and input functions
KR102057658B1 (en) Apparatus for providing virtual reality-based game interface and method using the same
CN109983424A (en) Object selected method and apparatus and virtual reality device in virtual reality scenario
CN109636888B (en) 2D special effect manufacturing method and device, electronic equipment and storage medium
JP2016115328A (en) Method for calculation execution, calculation processing system, and program
CN113052753B (en) Panoramic topological structure generation method, device and equipment and readable storage medium
KR102010023B1 (en) Method and system for providing mixed reality service
CN107728811A (en) Interface control method, apparatus and system
CN116251349A (en) Method and device for prompting target position in game and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant