
CN214251480U - Virtual reality test equipment with position sensing structure - Google Patents


Info

Publication number
CN214251480U
Authority
CN
China
Prior art keywords
virtual reality
position sensing
testing
station
user
Prior art date
Legal status
Active
Application number
CN202023292138.3U
Other languages
Chinese (zh)
Inventor
刘洛希
哈乐宇
王索刚
秦路
Current Assignee
Zhe Jiang Fan Ju Technology Co ltd
Original Assignee
Zhe Jiang Fan Ju Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhe Jiang Fan Ju Technology Co ltd filed Critical Zhe Jiang Fan Ju Technology Co ltd
Priority to CN202023292138.3U
Application granted
Publication of CN214251480U
Legal status: Active (current)
Anticipated expiration

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The utility model provides virtual reality testing equipment with a position sensing structure, comprising a test bench on which a standing-position sensing network is distributed. The sensing network is connected to a main control module, the main control module is used for connecting a head-mounted module, and the sensors are temperature sensors or photoelectric sensors. The scheme provides a test bench for virtual reality testing with a standing-position sensing network distributed on it, so that the user's standing position can be obtained at any time.

Description

Virtual reality test equipment with position sensing structure
Technical Field
The utility model belongs to the technical field of virtual reality, and in particular relates to virtual reality testing equipment with a position sensing structure.
Background
Virtual reality technology is currently in a development stage, and its development inevitably requires performance testing of virtual reality equipment and virtual reality software.
Some performance tests, such as vertigo tests, require the user to stay within a certain range. At present, however, there is no equipment structure that can sense the user's position: during a performance test the user's position is difficult to obtain, and it cannot be guaranteed that the user remains within the required range at all times, so the test results are affected.
SUMMARY OF THE UTILITY MODEL
In view of the above problem, the utility model provides virtual reality testing equipment with a position sensing structure.
In order to achieve the above purpose, the utility model adopts the following technical solution:
The virtual reality testing equipment with a position sensing structure comprises a test bench on which a standing-position sensing network is distributed; the standing-position sensing network is connected to a main control module, and the main control module is used for connecting a head-mounted module.
In the virtual reality testing equipment with the position sensing structure, the station sensing network comprises a plurality of sensors which are distributed on the testing platform and can sense users on the testing platform, and the sensors are connected to the main control module.
In the virtual reality testing equipment with the position sensing structure, the sensor is a temperature sensor or a photoelectric sensor.
In the virtual reality testing device with the position sensing structure, the main control module is connected with a memory, and the distribution positions of the temperature sensors/photoelectric sensors on the testing platform are stored in the memory.
In the virtual reality testing equipment with the position sensing structure, a plurality of sensors are distributed on the testing platform, spreading outward from the center, to form a station acquisition area where the user stands.
In the virtual reality testing device with the position sensing structure, the station acquisition area formed by the sensors is of a rectangular structure or a circular structure.
In the virtual reality testing device with the position sensing structure, a plurality of sensors are distributed on the testing platform in a checkerboard-shaped rectangular structure.
In the above virtual reality testing apparatus with a position sensing structure, the station acquisition area is a rectangular structure with the front-back direction of the user as the length and the left-right direction as the width, and the length is greater than 40cm and less than 60cm, and the width is greater than 30cm and less than 50 cm.
In the virtual reality testing equipment with the position sensing structure, the testing platform is at least 3cm higher than the ground.
In the above virtual reality testing equipment with the position sensing structure, limiting plates are installed on the testing platform outside the station acquisition area, on the left and right sides of the user.
In the virtual reality testing equipment with the position sensing structure, an image acquisition device for acquiring images of a test user is arranged in front of the test board, and the image acquisition device is connected to the main control module.
The utility model has the advantages that: a test bench is provided, and a standing position sensing network is distributed on the test bench, so that the standing position of a user can be acquired at any time.
Drawings
Fig. 1 is a schematic diagram of the distribution of the station sensing network of the present invention on the test board;
FIG. 2 is a schematic view of the apparatus of the present invention in a front view;
FIG. 3 is a schematic side view of the apparatus of the present invention in use;
FIG. 4 is a flow chart of a vertigo testing method using the apparatus of the present invention;
fig. 5 is a schematic diagram of the positions of the characteristic points on the human body in the vertigo testing method using the device of the present invention.
Reference numerals: a test bench 1; a head-mounted module 2; a sensor 3; a station acquisition area 4; a limiting plate 5; an image acquisition device 6; a central feature point 7; the auxiliary feature points 8.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
The embodiment discloses virtual reality test equipment with a position sensing structure, which comprises a test bench 1, wherein a station sensing network is distributed on the test bench 1 and connected to a main control module, and the main control module is used for connecting a head-mounted module 2.
Specifically, as shown in fig. 1, the station sensing network includes a plurality of sensors 3 distributed on the test bench 1 and capable of sensing a user located on the test bench 1. Each sensor 3 may be a temperature sensor or a photoelectric sensor and is connected to the main control module. The main control module is connected to a memory in which the distribution positions of the sensors 3 on the test bench 1 are stored, and it determines the user's current position from those stored distribution positions and the current sensing data of the sensors 3. The main control module can also be connected to an alarm module: when the user triggers the peripheral sensors 3, the head-mounted module 2 prompts the user, preventing the user from leaving the specified range and affecting the performance test.
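For illustration only (the patent text contains no code), the following Python sketch shows one way the main control module could estimate the user's standing position from the stored sensor distribution and the current sensing data, and raise a prompt when a perimeter sensor is triggered; the Sensor type, the centroid method and all function names are assumptions rather than the patent's own implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Sensor:
    x_cm: float          # stored distribution position on the test bench
    y_cm: float
    on_perimeter: bool   # True for sensors at the edge of the station acquisition area

def estimate_position(sensors: List[Sensor], triggered: List[bool]) -> Optional[Tuple[float, float]]:
    """Centroid of all currently triggered sensors, or None if no sensor fires."""
    hits = [(s.x_cm, s.y_cm) for s, hit in zip(sensors, triggered) if hit]
    if not hits:
        return None
    return (sum(x for x, _ in hits) / len(hits),
            sum(y for _, y in hits) / len(hits))

def should_prompt_user(sensors: List[Sensor], triggered: List[bool]) -> bool:
    """True when a perimeter sensor fires, i.e. the user is about to leave the specified range."""
    return any(s.on_perimeter and hit for s, hit in zip(sensors, triggered))
```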
Specifically, the sensors 3 are distributed on the test bench 1, spreading outward from the center, to form a station acquisition area 4 where the user stands. The station acquisition area 4 formed by the sensors 3 is rectangular or circular; in this embodiment the sensors 3 are preferably distributed on the test bench 1 in a checkerboard-like rectangular configuration.
Specifically, the station acquisition area 4 is rectangular, with the user's front-back direction as its length and the left-right direction as its width; the length is more than 40cm and less than 60cm, and the width is more than 30cm and less than 50cm.
Further, as shown in fig. 2, the test bench 1 is at least 3cm above the ground so that the station sensing network can be arranged. Limiting plates 5 are installed on the test bench 1 outside the station acquisition area 4, on the left and right sides of the user; they both prevent the user from walking out of the station acquisition area 4 and prevent the user from falling off the test bench 1.
Further, as shown in fig. 3, an image acquisition device 6 for acquiring an image of the test user is arranged in front of the test bench 1, and the image acquisition device 6 is connected to the main control module.
Specifically, as shown in fig. 4, the method of using the present device together with the vertigo-testing software in the main control module is as follows (of course, in use the equipment is not limited to vertigo testing and can also be used for other kinds of performance tests):
S1, acquiring and processing an initial state image of a user, specifically:
s11, acquiring an initial state image of a user through an image acquisition device 6;
s12, extracting auxiliary feature points and central feature points in the initial state image, and constructing a standard feature vector;
specifically, as shown in fig. 5, the skeleton point at the center of the left or right shoulder is extracted as the auxiliary feature point 8, and the midpoint between the two shoulders is extracted as the central feature point 7;
s2, continuously acquiring and processing a user test state image after the user enters virtual reality, specifically:
s21, collecting a user test state image;
s22, extracting auxiliary characteristic points and central characteristic points in the image in the test state, judging whether the station positions in the initial state and the test state have displacement or not, if so, extracting station position offset, correcting the current auxiliary characteristic points and central characteristic points according to the station position offset, and constructing an offset characteristic vector;
otherwise, the offset feature vector is directly constructed according to the auxiliary feature points and the central feature point.
Here, the user's standing position in the initial state and in the test state are obtained from the sensing readings of each sensor 3 in those two states, and the standing-position offset is the difference between them (a sketch of this correction is given below).
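Purely as an illustration of step S22 (the patent gives no code; the 2-D coordinate convention, the helper names and the definition of the feature vector as running from the central to the auxiliary feature point are assumptions), the standing-position offset measured by the sensor network can be subtracted from the test-state feature points before the offset feature vector is formed:

```python
import numpy as np

def correct_feature_points(center_pt: np.ndarray, aux_pt: np.ndarray,
                           station_initial: np.ndarray, station_test: np.ndarray):
    """Shift the test-state feature points back by the standing-position offset (2-D points, in cm)."""
    station_offset = station_test - station_initial
    return center_pt - station_offset, aux_pt - station_offset

def feature_vector(center_pt: np.ndarray, aux_pt: np.ndarray) -> np.ndarray:
    """Feature vector from the central feature point 7 to the auxiliary feature point 8."""
    return aux_pt - center_pt
```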
S3, comparing the initial state image of the user with the test state image, and outputting the vertigo degree according to the comparison result, which specifically comprises the following steps:
s31, fusing the offset characteristic vector and the standard characteristic vector, and recording an offset parameter of the offset characteristic vector; the offset parameter describes the degree of body imbalance swing, corresponding to the degree of vertigo of the user. The offset parameter may serve as an objective index quantifying the degree of body tilt, which in turn corresponds to the vertigo degree of the user, so the index is used to evaluate quantifying the vertigo degree of the user;
and S32, comparing the offset parameters with the standard values in the database, and generating a report containing the vertigo degree according to the comparison result.
The database stores, for users with different characteristics, standard values in the form of offset-parameter ranges corresponding to different vertigo degrees, and the standard values matching the characteristics of the current user are extracted for comparison;
The user characteristics include gender, age group and height. In this embodiment, for ages 6 to 18 each year is one age-group characteristic; for ages 19 to 24, every two years forms one age group; for ages 25 to 50, every five years forms one age group; ages 51 to 60 form one age group, and ages 61 and above form another. By surveying the vertigo sensations of a large number of users and combining them with the measured offset parameters, the offset-parameter ranges corresponding to different age groups, genders and vertigo degrees are obtained statistically and used as the standard values. When the offset feature vector is fused with the standard feature vector, the position offset of the central feature point in the test state relative to the central feature point in the initial state is obtained, together with the included angle between the offset feature vector and the standard feature vector; the sine of this angle is multiplied by the magnitude of the standard feature vector, and the offset parameter, in centimeters, is the position offset plus half of this product. For example, the offset parameter ranges for each vertigo grade of a 25-30-year-old man are:
no vertigo: (-2, 4); mild vertigo: (-8, -2) and [4, 18); moderate vertigo: [-14, -8] and [18, 34]; severe vertigo: less than -14 or greater than 34.
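A hedged sketch of this computation follows; the signed-angle convention, the use of the vector magnitude and the handling of boundary values are assumptions, and the thresholds in `vertigo_grade` are the example values for a 25-30-year-old man quoted above.

```python
import math
import numpy as np

def offset_parameter(center_initial: np.ndarray, center_test: np.ndarray,
                     std_vec: np.ndarray, off_vec: np.ndarray) -> float:
    """Position offset of the central feature point plus half of sin(angle) * |standard vector|, in cm."""
    displacement = float(np.linalg.norm(center_test - center_initial))
    # signed angle from the standard feature vector to the offset feature vector (2-D vectors)
    cross = std_vec[0] * off_vec[1] - std_vec[1] * off_vec[0]
    angle = math.atan2(float(cross), float(np.dot(std_vec, off_vec)))
    return displacement + 0.5 * math.sin(angle) * float(np.linalg.norm(std_vec))

def vertigo_grade(p: float) -> str:
    """Grade an offset parameter using the example thresholds for a 25-30-year-old male."""
    if -2 < p < 4:
        return "none"
    if -8 < p < -2 or 4 <= p < 18:
        return "mild"
    if -14 <= p <= -8 or 18 <= p <= 34:
        return "moderate"
    if p < -14 or p > 34:
        return "severe"
    return "mild"  # boundary values not explicitly covered by the listed ranges (e.g. p == -2)
```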
Here, four grades are used: no vertigo, mild, moderate and severe; when the device is put into use, the results can also be divided into more grades, which is not limited herein.
Specifically, in step S22, only when swings occur continuously at least a preset number of times within a preset time period, for example 3 swings within 5 minutes, is the behavior treated as unbalanced tilt swinging and the vertigo degree calculated; otherwise the user is judged to be swaying voluntarily, the vertigo degree is not calculated, and measurement errors caused by the user's voluntary swaying are eliminated. A swing is the case where the offset feature vector deviates from the standard feature vector; to avoid reacting to slight fluctuations of the user's body, a swing is attributed to vertigo only when the absolute value of the offset parameter exceeds a certain threshold.
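A minimal sketch of this gating rule follows, assuming each sample is a (timestamp, offset parameter) pair; the swing threshold value is an assumption, while the 3-swings-within-5-minutes figures come from the text.

```python
from typing import List, Tuple

def is_unbalanced_swinging(samples: List[Tuple[float, float]],
                           swing_threshold_cm: float = 2.0,   # assumed threshold
                           min_swings: int = 3,
                           window_s: float = 300.0) -> bool:
    """True when at least `min_swings` swings, i.e. samples whose |offset parameter| exceeds
    `swing_threshold_cm`, occur within one `window_s`-second window."""
    swing_times = sorted(t for t, p in samples if abs(p) > swing_threshold_cm)
    for i, start in enumerate(swing_times):
        if i + min_swings <= len(swing_times) and swing_times[i + min_swings - 1] - start <= window_s:
            return True
    return False
```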
Preferably, when the behavior is judged to be unbalanced tilt swinging, the offset feature vectors of the several test-state images within that period are obtained. The offset parameter produced by fusing the most severely swung offset feature vector with the standard feature vector can be selected directly as the offset parameter used for the vertigo measurement. Alternatively, the offset parameters produced by fusing each offset feature vector with the standard feature vector can all be used: if they all fall within the same vertigo grade, the result is output directly; if not, the number of offset parameters covered by each grade is checked in turn from the highest grade to the lowest. When the number of offset parameters covered by the highest grade reaches a preset value, that grade is reported; otherwise the grade is lowered until some grade covers at least the preset number of offset parameters, and the vertigo degree is judged to be that grade. The preset value is determined by factors such as the length of the preset time period; in this embodiment it is 3. For example, if a 28-year-old man has an offset parameter greater than 34 or less than -14 at least 3 times, he is judged to be severely dizzy; if only 2 times, the grade is lowered by one and the number of his offset parameters falling within [-14, -8] and [18, 34] is checked; if that count is 3 or more he is judged to be moderately dizzy, otherwise the grade is lowered again.
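The grade-voting rule can be sketched as below, reusing the `vertigo_grade` helper from the earlier sketch; the fallback when no grade reaches the preset count is an assumption, since the text does not cover that case.

```python
GRADES = ["severe", "moderate", "mild", "none"]   # highest grade first

def vote_vertigo_grade(offset_params, min_count: int = 3) -> str:
    """Report the highest grade whose range covers at least `min_count` of the offset parameters."""
    grades = [vertigo_grade(p) for p in offset_params]
    if len(set(grades)) == 1:          # all samples fall in the same grade
        return grades[0]
    for g in GRADES:                   # otherwise walk down from the most severe grade
        if grades.count(g) >= min_count:
            return g
    return "none"                      # assumed fallback: no grade reached the preset count
```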
The virtual reality testing equipment with a position sensing structure provided by this embodiment can acquire the user's position during the test. During a vertigo test it acquires the standing position in the initial state and in the test state, from which the standing-position offset is obtained, and it ensures that the user moves only within the allowed range.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications, additions and substitutions for the specific embodiments described herein may be made by those skilled in the art without departing from the spirit of the invention or exceeding the scope of the invention as defined in the accompanying claims.
Although the terms test bench 1, head-mounted module 2, sensor 3, station acquisition area 4, limiting plate 5, image acquisition device 6, central feature point 7, auxiliary feature point 8 and so on are used frequently herein, the possibility of using other terms is not excluded. These terms are used merely to describe and explain the nature of the utility model more conveniently; construing them as any kind of additional limitation would be contrary to the spirit of the utility model.

Claims (10)

1. Virtual reality testing equipment with a position sensing structure, characterized in that it comprises a testing platform (1), a standing-position sensing network being distributed on the testing platform (1), the standing-position sensing network being connected to a main control module, and the main control module being used for connecting a head-mounted module (2).
2. The virtual reality testing apparatus with a position sensing structure according to claim 1, wherein the station sensing network comprises a plurality of sensors (3) distributed on the testing platform (1) and capable of sensing users located on the testing platform (1), and the sensors (3) are connected to the main control module.
3. The virtual reality testing apparatus with a position sensing structure according to claim 2, wherein the sensor (3) is a temperature sensor or a photoelectric sensor.
4. The virtual reality testing device with the position sensing structure as claimed in claim 3, wherein the main control module is connected with a memory, and the memory stores the distribution positions of the temperature sensors/photoelectric sensors on the testing platform (1).
5. The virtual reality testing apparatus with a position sensing structure according to claim 4, wherein several sensors (3) are distributed on the testing platform (1) from the center to the outside in a spread manner to form a standing collection area (4) for users to stand.
6. The virtual reality testing apparatus with a position sensing structure according to claim 5, wherein several sensors (3) are distributed on the testing table (1) in a checkerboard-like rectangular configuration.
7. The virtual reality testing apparatus with a position sensing structure according to claim 6, wherein the station collection area (4) is a rectangular structure with the length in the front-back direction of the user and the width in the left-right direction, and the length is more than 40cm and less than 60cm, and the width is more than 30cm and less than 50 cm.
8. The virtual reality testing apparatus with a position sensing structure according to claim 1, wherein the testing platform (1) is at least 3cm above the ground.
9. The virtual reality testing equipment with the position sensing structure according to claim 8, wherein limiting plates (5) are installed on the testing platform (1) outside the station acquisition area (4), on the left and right sides of the user.
10. The virtual reality testing equipment with the position sensing structure is characterized in that an image acquisition device (6) for acquiring images of a test user is arranged in front of the testing platform (1), and the image acquisition device is connected to the main control module.
CN202023292138.3U 2020-12-30 2020-12-30 Virtual reality test equipment with position sensing structure Active CN214251480U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202023292138.3U CN214251480U (en) 2020-12-30 2020-12-30 Virtual reality test equipment with position sensing structure

Publications (1)

Publication Number Publication Date
CN214251480U true CN214251480U (en) 2021-09-21

Family

ID=77722472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202023292138.3U Active CN214251480U (en) 2020-12-30 2020-12-30 Virtual reality test equipment with position sensing structure

Country Status (1)

Country Link
CN (1) CN214251480U (en)

Similar Documents

Publication Publication Date Title
CN104436597A (en) Exercise support device and exercise support method
CN107578019B (en) A gait recognition system and recognition method based on visual and tactile fusion
CN106934773B (en) Video moving target and Mac address matching method
Jin et al. Neural system identification model of human sound localization
WO2004027734B1 (en) Test of parietal lobe function and associated methods
CN109376686A (en) A kind of various dimensions human face data acquisition scheme, acquisition system and acquisition method
CN114022035A (en) Method for evaluating carbon emission of building in urban heat island effect
CN214251480U (en) Virtual reality test equipment with position sensing structure
CN114532986B (en) Human body balance measurement method and system based on three-dimensional space motion capture
CN117456280A (en) Rock mass structural plane identification method, device and equipment and readable storage medium
Cen et al. Gross error diagnostics before least squares adjustment of observations
CN111330213A (en) Anti-falling pre-judging method and system for VR (virtual reality) treadmill and readable storage medium
CN112656404B (en) System and method for measuring virtual reality dizziness degree based on image processing
CN106419826A (en) Vision detecting system based on fingerprint recognition
CN105630858B (en) Display method and device of heat index, server and intelligent equipment
CN111596594B (en) A panoramic big data application monitoring and control system
CN110342363B (en) Method and device for testing safety performance of elevator, terminal equipment and storage medium
CN110753299A (en) Positioning precision pre-judging method and device, equipment and storage medium
CN109257756B (en) Indoor wireless network quality evaluation method and server
CN108896070B (en) Method and device for detecting sensor error in mobile equipment and terminal
CN108646101A (en) Mobile phone radiation monitoring method and its device
CN112754472B (en) Calibration method and device for sensor in motion capture system
CN114549590A (en) Target object detection method and device
CN113807197A (en) Fall detection method and device
CN112600724A (en) CAN bus performance test method and test system

Legal Events

Date Code Title Description
GR01 Patent grant