CN107610044A - Image processing method, computer-readable recording medium and virtual reality helmet - Google Patents
- Publication number
- CN107610044A (application CN201710758206.4A)
- Authority
- CN
- China
- Prior art keywords
- image
- angle
- visual field
- virtual reality
- displayed
- Prior art date
- 2017-08-29
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention discloses an image processing method, a computer-readable recording medium and a virtual reality helmet. The method includes: obtaining a first field-of-view angle corresponding to an image to be displayed, a second field-of-view angle of the optical component of the virtual reality helmet, and a second field-of-view range size on the screen of the virtual reality helmet corresponding to the second field-of-view angle; calculating, using the first field-of-view angle, the second field-of-view angle and the second field-of-view range size, a first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle; calculating the size of the image to be displayed using the first field-of-view range size; and performing optical distortion correction on the image to be displayed according to the size of the image to be displayed. According to one embodiment of the present invention, the user experience is improved.
Description
Technical field
The present invention relates to the technical field of virtual reality, and more particularly to an image processing method, a computer-readable recording medium and a virtual reality helmet.
Background technology
A virtual reality device is equipped with optical lenses. An optical lens has a specific FOV (Field of View) which, together with the structural design of the virtual reality device, determines the field-of-view range on the screen of the virtual reality device that the user sees through the optical lens. Fig. 1 is a schematic diagram of the field-of-view range on the screen of a virtual reality device as seen by a user through an optical lens.
To improve the realism of the user experience, VR (Virtual Reality) applications are designed according to the FOV of the optical lens, so that the display area on the screen of the VR device matches the field-of-view range of the optical lens; distortion and chromatic-aberration correction is applied when adjusting the display area, which keeps the VR scene consistent with the optical lens.
When the FOV of a picture input to the VR device does not match the FOV of the optical lens of the VR device, the user experience is seriously affected. For example, an external camera of the VR device needs to input its picture to the VR device for display. Different cameras have different FOVs, and these cannot be guaranteed to match the FOV of the optical lens of the VR device. In that case, if the camera content is displayed within the field-of-view range of the lens, the content seen through the lens will not match the FOV of the lens, degrading the experience.
Accordingly, it is desirable to provide a new technical solution that addresses the above problem in the prior art.
Summary of the invention
It is an object of the present invention to provide a new image processing solution.
According to a first aspect of the present invention, an image processing method is provided, including:
obtaining a first field-of-view angle corresponding to an image to be displayed, a second field-of-view angle of an optical component of a virtual reality helmet, and a second field-of-view range size on the screen of the virtual reality helmet corresponding to the second field-of-view angle;
calculating, using the first field-of-view angle, the second field-of-view angle and the second field-of-view range size, a first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle;
calculating the size of the image to be displayed using the first field-of-view range size;
performing optical distortion correction on the image to be displayed according to the size of the image to be displayed.
Optionally, the first field-of-view angle corresponding to the image to be displayed is the field-of-view angle of an optical component of an external device of the virtual reality helmet.
Optionally, obtaining the first field-of-view angle corresponding to the image to be displayed includes: when the virtual reality helmet establishes a connection with the external device, the virtual reality helmet obtains device parameters of the external device, where the device parameters include the field-of-view angle of the optical component of the external device.
Optionally, calculating, using the first field-of-view angle, the second field-of-view angle and the second field-of-view range size, the first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle includes:
calculating the first field-of-view range size based on the formula
r = R * [tan(FOV1/2) / tan(FOV2/2)],
where r is the region radius size corresponding to the first field-of-view range, R is the region radius size corresponding to the second field-of-view range, FOV1 is the first field-of-view angle, and FOV2 is the second field-of-view angle.
Optionally, calculating the size of the image to be displayed using the first field-of-view range size includes:
calculating the size of the image to be displayed based on the formula
r = K0 + K1*h + K2*h^2 + K3*h^3 + … + Kn*h^n,
where h is the side length of the square region corresponding to the image to be displayed, r is the region radius size corresponding to the first field-of-view range, and K0, K1, K2, K3, …, Kn are coefficients corresponding to the optical component of the virtual reality helmet.
Optionally, performing optical distortion correction on the image to be displayed according to the size of the image to be displayed includes:
determining a plurality of feature points from the image to be displayed, and determining, according to the size of the image to be displayed, the distance from each feature point to the center point of the image to be displayed;
performing optical distortion correction on the image to be displayed based on the formula
l' = K0 + K1*l + K2*l^2 + K3*l^3 + … + Kn*l^n,
where l is the distance from a feature point to the center point of the image to be displayed, K0, K1, K2, K3, …, Kn are coefficients corresponding to the optical component of the virtual reality helmet, and l' is the distance from the feature point to the center point of the image to be displayed after the optical distortion correction.
Optionally, the method further includes: displaying the image to be displayed after optical distortion correction on the screen of the virtual reality helmet.
Optionally, after the image to be displayed after optical distortion correction is displayed on the screen of the virtual reality helmet:
when the first field-of-view angle is smaller than the second field-of-view angle, the region of the corrected image viewed by the user through the optical component of the virtual reality helmet is the first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle;
when the first field-of-view angle is larger than the second field-of-view angle, the region of the corrected image viewed by the user through the optical component of the virtual reality helmet is the second field-of-view range size on the screen of the virtual reality helmet corresponding to the second field-of-view angle;
when the first field-of-view angle is equal to the second field-of-view angle, the region of the corrected image viewed by the user through the optical component of the virtual reality helmet is the second field-of-view range size on the screen of the virtual reality helmet corresponding to the second field-of-view angle.
According to a second aspect of the present invention, a computer-readable recording medium is provided. The computer-readable recording medium stores a computer program, and the computer program is executed by one or more processors to implement any of the image processing methods described above.
According to a third aspect of the present invention, a virtual reality helmet is provided, including a memory, a processor and a computer program stored on the memory and executable on the processor, where the processor, when executing the computer program, implements any of the image processing methods described above.
With the image processing method, computer-readable recording medium and virtual reality helmet provided by the present invention, the first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle is obtained from the first field-of-view angle corresponding to the image to be displayed, the second field-of-view angle of the optical component of the virtual reality helmet, and the second field-of-view range size on the screen corresponding to the second field-of-view angle; the size of the image to be displayed is then calculated using the first field-of-view range size; finally, optical distortion correction is performed on the image to be displayed according to its size, so that the scene conveyed by the image to be displayed matches the second field-of-view angle and the realism of the experience is improved.
Further features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a schematic diagram of the field-of-view range on the screen of a virtual reality device as seen by a user through an optical lens.
Fig. 2 is a flowchart of an image processing method according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an optical distortion correction process according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of display areas corresponding respectively to the first field-of-view angle and the second field-of-view angle according to an embodiment of the present invention.
Fig. 5 is another schematic diagram of display areas corresponding respectively to the first field-of-view angle and the second field-of-view angle according to an embodiment of the present invention.
Fig. 6 is yet another schematic diagram of display areas corresponding respectively to the first field-of-view angle and the second field-of-view angle according to an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of a virtual reality helmet according to an embodiment of the present invention.
Embodiment
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the invention.
The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the invention, its application or its uses.
Techniques, methods and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but, where appropriate, such techniques, methods and apparatus should be considered part of the specification.
In all the examples shown and discussed herein, any specific value should be interpreted as merely exemplary rather than limiting. Therefore, other examples of the exemplary embodiments may have different values.
It should be noted that similar reference numerals and letters denote similar items in the following figures; therefore, once an item is defined in one figure, it need not be further discussed in subsequent figures.
An embodiment of the present invention provides an image processing method. Fig. 2 shows a flowchart of the image processing method according to an embodiment of the present invention. Referring to Fig. 2, the method includes at least steps S201 to S204.
Step S201: obtain a first field-of-view angle corresponding to an image to be displayed, a second field-of-view angle of the optical component of the virtual reality helmet, and a second field-of-view range size on the screen of the virtual reality helmet corresponding to the second field-of-view angle.
Step S202: calculate, using the first field-of-view angle, the second field-of-view angle and the second field-of-view range size, a first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle.
Step S203: calculate the size of the image to be displayed using the first field-of-view range size.
Step S204: perform optical distortion correction on the image to be displayed according to the size of the image to be displayed.
In an embodiment of the present invention, the image to be displayed is input to the virtual reality helmet by a device external to the virtual reality helmet, and the first field-of-view angle corresponding to the image to be displayed is the field-of-view angle of the optical component of that external device. The external device of the virtual reality helmet may be, for example, a camera or a video camera. When the virtual reality helmet establishes a connection with its external device, the virtual reality helmet can obtain the device parameters of the external device, which include at least the field-of-view angle of the optical component of the external device. Taking a video camera as an example, before the video captured by the camera is delivered to the virtual reality helmet, the camera needs to establish a connection with the virtual reality helmet; after the connection is established, the virtual reality helmet can obtain the relevant parameters of the camera, for example the field-of-view angle of its lens.
The image the user views through the optical component of the virtual reality helmet is distorted, for example by pincushion distortion. To make the image viewed through the optical component of the virtual reality helmet consistent with the original image, optical distortion correction is applied to the original image before it is displayed. The original image referred to here is the image to be displayed.
In an embodiment of the present invention, step S202 is specifically: using the first field-of-view angle, the second field-of-view angle and the second field-of-view range size, the first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle is calculated based on the following formula (1),
r = R * [tan(FOV1/2) / tan(FOV2/2)]   — formula (1),
where r is the region radius size corresponding to the first field-of-view range, R is the region radius size corresponding to the second field-of-view range, FOV1 is the first field-of-view angle, and FOV2 is the second field-of-view angle.
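As a concrete illustration of formula (1), the following Python sketch computes the first field-of-view range radius from the second-range radius and the two field-of-view angles. It is not part of the patent, and the numeric values in the example are hypothetical.

```python
import math

def first_fov_radius(R, fov1_deg, fov2_deg):
    """Formula (1): r = R * tan(FOV1/2) / tan(FOV2/2).

    R        -- region radius (in pixels) of the second field-of-view range on the screen
    fov1_deg -- first field-of-view angle (image source), in degrees
    fov2_deg -- second field-of-view angle (headset optics), in degrees
    """
    fov1 = math.radians(fov1_deg)
    fov2 = math.radians(fov2_deg)
    return R * math.tan(fov1 / 2.0) / math.tan(fov2 / 2.0)

# Example: a 90-degree camera image shown through 100-degree optics whose
# field-of-view range has a radius of 600 px on the screen.
r = first_fov_radius(R=600, fov1_deg=90, fov2_deg=100)
print(round(r, 1))  # roughly 503.5 px
```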
After the region radius corresponding to the first field-of-view range is calculated, in an embodiment of the present invention the size of the image to be displayed is calculated based on the following formula (2),
r = K0 + K1*h + K2*h^2 + K3*h^3 + … + Kn*h^n   — formula (2),
where h is the side length of the square region corresponding to the image to be displayed, r is the region radius size corresponding to the first field-of-view range, and K0, K1, K2, K3, …, Kn are coefficients corresponding to the optical component of the virtual reality helmet.
It should be noted that before the optical distortion correction the image to be displayed is a square image whose side length is h. After the optical distortion correction, the image to be displayed becomes barrel-shaped with radius r, where r is the region radius corresponding to the first field-of-view range calculated from formula (1). In addition, K0, K1, K2, K3, …, Kn, the coefficients corresponding to the optical component of the virtual reality helmet, are determined by the optical component itself: when the model of the optical component changes, the corresponding coefficients change accordingly. K0, K1, K2, K3, …, Kn can be obtained by fitting experimental measurements.
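Formula (2) expresses the known radius r as a polynomial in the unknown side length h, so in practice h has to be recovered numerically. The following Python sketch inverts the polynomial by bisection; the coefficients K0..K3 are hypothetical illustration values rather than values from the patent, and the bisection assumes the polynomial increases monotonically over the search interval.

```python
def poly_eval(coeffs, x):
    """Evaluate K0 + K1*x + K2*x^2 + ... + Kn*x^n."""
    return sum(k * x ** i for i, k in enumerate(coeffs))

def solve_side_length(coeffs, r, lo=0.0, hi=4096.0, iters=60):
    """Invert formula (2) numerically: find h such that
    K0 + K1*h + ... + Kn*h^n == r, using bisection.
    Assumes the polynomial is monotonically increasing on [lo, hi]."""
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if poly_eval(coeffs, mid) < r:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Hypothetical lens coefficients (K0..K3) obtained by fitting; not from the patent.
K = [0.0, 0.52, 3.1e-5, -2.0e-9]
h = solve_side_length(K, r=503.5)
print(round(h, 1))  # side length of the square source image, in pixels
```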
After the size of the image to be displayed is calculated, optical distortion correction is performed on the image to be displayed according to its size. Specifically, a plurality of feature points are first determined from the image to be displayed; then, according to the size of the image to be displayed, the distance from each feature point to the center point of the image to be displayed is determined; then, optical distortion correction is performed on the image to be displayed based on the following formula (3),
l' = K0 + K1*l + K2*l^2 + K3*l^3 + … + Kn*l^n   — formula (3),
where l is the distance from a feature point to the center point of the image to be displayed, K0, K1, K2, K3, …, Kn are coefficients corresponding to the optical component of the virtual reality helmet, and l' is the distance from the feature point to the center point of the image to be displayed after the optical distortion correction. In this embodiment of the invention, the feature points determined from the image to be displayed are preferably distributed evenly over the image, and each determined feature point is corrected according to formula (3).
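A minimal Python sketch of formula (3) applied to a single feature point is given below: the point is moved radially with respect to the image center so that its new distance equals the polynomial of its old distance. The coefficients and image size are hypothetical illustration values.

```python
def correct_feature_point(point, center, coeffs):
    """Apply formula (3) to one feature point.

    point, center -- (x, y) pixel coordinates
    coeffs        -- K0..Kn of the headset optics
    Returns the corrected (x, y) position: the point is moved along the line
    through the image center so that its distance becomes
    l' = K0 + K1*l + K2*l^2 + ... + Kn*l^n.
    """
    dx, dy = point[0] - center[0], point[1] - center[1]
    l = (dx * dx + dy * dy) ** 0.5
    if l == 0.0:
        return point  # the center point maps to itself
    l_new = sum(k * l ** i for i, k in enumerate(coeffs))
    scale = l_new / l
    return (center[0] + dx * scale, center[1] + dy * scale)

# Hypothetical values for illustration: a corner feature point of a square image of side h.
h = 920.0
center = (h / 2.0, h / 2.0)
K = [0.0, 0.52, 3.1e-5, -2.0e-9]
print(correct_feature_point((h, h), center, K))  # the corner is pulled toward the center
```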
Fig. 3 shows a schematic diagram of an optical distortion correction process according to an embodiment of the present invention. Referring to Fig. 3, the left half of Fig. 3 shows two images to be displayed (image a and image b) before optical distortion correction, and the right half shows the two barrel-shaped images (image c and image d) after optical distortion correction. Image c is the image obtained from image a after optical distortion correction, and image d is the image obtained from image b after optical distortion correction. A center point and a feature point are determined in image a; according to the size of image a, the distance from the feature point to the center point of image a is determined; then, based on formula (3), optical distortion correction is applied to the feature point, yielding the distance from the corrected feature point to the center point. Likewise, a center point and a feature point are determined in image b; according to the size of image b, the distance from the feature point to the center point is determined; then, based on formula (3), optical distortion correction is applied to the feature point, yielding the distance from the corrected feature point to the center point. It should be noted that the above example determines only one feature point from the image to be displayed, which does not limit the present invention in any way. The number of feature points determined from the image to be displayed may be several, tens, hundreds or even more; the more feature points are determined from the image to be displayed, the more accurate the image obtained after optical distortion correction.
After optical distortion correction is performed on the image to be displayed, the corrected image is displayed on the screen of the virtual reality helmet. The display area of the corrected image on the screen of the virtual reality helmet is the first field-of-view range size on the screen corresponding to the first field-of-view angle. Three relationships are possible between the first field-of-view angle corresponding to the image to be displayed and the second field-of-view angle of the optical component of the virtual reality helmet: the first field-of-view angle is smaller than the second field-of-view angle; the first field-of-view angle is larger than the second field-of-view angle; the first field-of-view angle is equal to the second field-of-view angle.
When the first field-of-view angle corresponding to the image to be displayed is smaller than the second field-of-view angle of the optical component of the virtual reality helmet, referring to Fig. 4, display area a is the first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle, and display area b is the second field-of-view range size on the screen corresponding to the second field-of-view angle. The display area of the corrected image on the screen is the field-of-view range size corresponding to the first field-of-view angle. The region of the corrected image viewed by the user through the optical component of the virtual reality helmet is the first field-of-view range size on the screen corresponding to the first field-of-view angle.
When the first field-of-view angle corresponding to the image to be displayed is larger than the second field-of-view angle of the optical component of the virtual reality helmet, referring to Fig. 5, display area c is the first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle, and display area b is the second field-of-view range size on the screen corresponding to the second field-of-view angle. The display area of the corrected image on the screen is the field-of-view range size corresponding to the first field-of-view angle. The region of the corrected image viewed by the user through the optical component of the virtual reality helmet is the second field-of-view range size on the screen corresponding to the second field-of-view angle.
When the first field-of-view angle corresponding to the image to be displayed is equal to the second field-of-view angle of the optical component of the virtual reality helmet, referring to Fig. 6, the first field-of-view range size on the screen corresponding to the first field-of-view angle equals the second field-of-view range size on the screen corresponding to the second field-of-view angle. The display area of the corrected image on the screen is the field-of-view range size corresponding to the first field-of-view angle. The region of the corrected image viewed by the user through the optical component of the virtual reality helmet is the second field-of-view range size on the screen corresponding to the second field-of-view angle.
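The three cases above can be summarized as follows: the region the user actually sees through the optics is the smaller of the two field-of-view ranges. A trivial Python sketch, illustrative only and using hypothetical radii:

```python
def visible_region_radius(r_first, R_second):
    """Radius on the screen of the region the user actually sees through the optics.

    Per the three cases above, the visible region is the first field-of-view
    range when FOV1 < FOV2 (so r_first < R_second) and the second field-of-view
    range otherwise, i.e. the smaller of the two radii.
    """
    return min(r_first, R_second)

print(visible_region_radius(503.5, 600.0))  # 503.5 -> first field-of-view range is seen
print(visible_region_radius(650.0, 600.0))  # 600.0 -> second field-of-view range is seen
```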
Based on the same inventive concept, the present invention provides a computer-readable recording medium. The computer-readable recording medium stores a computer program, and the computer program can be executed by one or more processors to implement the image processing method provided by any of the embodiments described above.
Based on the same inventive concept, the present invention provides a virtual reality helmet. Fig. 7 shows a schematic structural diagram of the virtual reality helmet according to an embodiment of the present invention. Referring to Fig. 7, the virtual reality helmet 700 includes a display unit 701, a virtual image optical unit 702, an input operation unit 703, a state information acquisition unit 704, a communication unit 705, a memory 706, a processor 707, a graphics processing unit 708, a display driving unit 709, a sound processing unit 710 and a sound input/output unit 711.
The memory 706 is configured as a mass storage device such as a solid-state drive. The memory 706 can store application programs or various types of data. For example, the memory 706 may store the computer program of the image processing method provided by the present invention that is executed by the processor 707, or the content that the user watches with the virtual reality helmet 700, and so on.
The processor 707 may include a central processing unit (CPU) or another device with similar functions. In some embodiments, the processor 707 is used to execute the image processing method provided by the present invention, that is, to perform the following operating steps: obtaining a first field-of-view angle corresponding to an image to be displayed, a second field-of-view angle of the optical component of the virtual reality helmet, and a second field-of-view range size on the screen of the virtual reality helmet corresponding to the second field-of-view angle; calculating, using the first field-of-view angle, the second field-of-view angle and the second field-of-view range size, a first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle; calculating the size of the image to be displayed using the first field-of-view range size; and performing optical distortion correction on the image to be displayed according to the size of the image to be displayed.
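To show how steps S201 to S204 fit together, the following Python sketch chains the helper functions from the earlier sketches (first_fov_radius, solve_side_length and correct_feature_point). The scaling of the source feature points to the computed side length h is an illustrative assumption, not something specified by the patent.

```python
def process_image_for_display(image_points, h_source, R, fov1_deg, fov2_deg, K):
    """Minimal sketch of steps S201-S204 applied to a set of feature points.

    image_points -- (x, y) feature points of the image to be displayed
    h_source     -- current side length of the square source image
    R            -- second field-of-view range radius on the screen
    fov1_deg, fov2_deg -- first/second field-of-view angles in degrees
    K            -- optics coefficients K0..Kn
    Returns the target side length h and the distortion-corrected feature points.
    """
    r = first_fov_radius(R, fov1_deg, fov2_deg)        # step S202, formula (1)
    h = solve_side_length(K, r)                        # step S203, formula (2)
    scale = h / h_source                               # resize source points to side h (assumption)
    center = (h / 2.0, h / 2.0)
    corrected = [correct_feature_point((x * scale, y * scale), center, K)
                 for x, y in image_points]             # step S204, formula (3)
    return h, corrected
```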
The display unit 701 may include a display panel disposed on the side surface of the virtual reality helmet 700 facing the user's face; it may be a single panel, or a left panel and a right panel corresponding respectively to the user's left eye and right eye. The display panel may be an electroluminescent (EL) element, a liquid crystal display or a micro-display with a similar structure, or a retinal display or a similar laser scanning display.
The virtual image optical unit 702 presents the image displayed by the display unit 701 in a magnified manner and allows the user to observe the displayed image as a magnified virtual image. The display image output to the display unit 701 may be an image of a virtual scene provided by a content reproduction device (a Blu-ray disc or DVD player) or a streaming media server, or an image of a real scene shot by the external camera 110. In some embodiments, the virtual image optical unit 702 may include a lens unit, such as a spherical lens, an aspheric lens or a Fresnel lens.
It should be noted that the external camera 110 may be arranged on the front surface of the body of the virtual reality helmet 700, and there may be one or more external cameras 110. The external camera 110 can obtain three-dimensional information and can also be used as a distance sensor. In addition, a position-sensitive detector (PSD) or another type of distance sensor that detects signals reflected from objects may be used together with the external camera 110. The external camera 110 and the distance sensor can be used to detect the body position, posture and shape of the user wearing the virtual reality helmet 700. In addition, under certain conditions the user can directly view or preview the real scene through the external camera 110.
The input operation unit 703 includes at least one functional component for performing input operations, such as a key, a button, a switch or another component with a similar function; it receives user instructions through the functional component and outputs the instructions to the processor 707.
The state information acquisition unit 704 is used to obtain status information of the user wearing the virtual reality helmet 700. The state information acquisition unit 704 may include various types of sensors for detecting status information itself, and may obtain status information from external devices (such as a smartphone, a wristwatch or another multi-function terminal worn by the user) through the communication unit 705. The state information acquisition unit 704 can obtain position information and/or attitude information of the user's head. The state information acquisition unit 704 may include one or more of a gyro sensor, an acceleration sensor, a Global Positioning System (GPS) sensor, a geomagnetic sensor, a Doppler-effect sensor, an infrared sensor and a radio-frequency field intensity sensor. In addition, the state information acquisition unit 704 obtains status information of the user wearing the virtual reality helmet 700, for example the operating state of the user (whether the user is wearing the virtual reality helmet 700), the action state of the user (a moving state such as standing, walking or running, the posture of the hands or fingertips, the open or closed state of the eyes, the gaze direction, the pupil size), the mental state (whether the user is immersed in observing the displayed image, and the like), and even the physiological state.
The communication unit 705 performs communication processing with external devices, modulation and demodulation processing, and encoding and decoding of communication signals. In addition, the processor 707 can send transmission data to external devices through the communication unit 705. The communication may be wired or wireless, for example Mobile High-Definition Link (MHL) or Universal Serial Bus (USB), High-Definition Multimedia Interface (HDMI), Wireless Fidelity (Wi-Fi), Bluetooth or Bluetooth Low Energy communication, and the mesh network of the IEEE 802.11s standard. In addition, the communication unit 705 may be a cellular radio transceiver operating according to Wideband Code Division Multiple Access (W-CDMA), Long Term Evolution (LTE) and similar standards.
The graphics processing unit 708 is used to perform signal processing, for example image-quality correction related to the image signal output from the processor 707, and to convert its resolution to the resolution of the screen of the display unit 701. Then, the display driving unit 709 selects each row of pixels of the display unit 701 in turn and scans the rows of pixels of the display unit 701 line by line, thereby providing pixel signals based on the signal-processed image signal.
The sound processing unit 710 can perform sound-quality correction or sound amplification of the audio signal output from the processor 707, signal processing of the input audio signal, and so on. Then, the sound input/output unit 711 outputs the processed sound to the outside and inputs sound from a microphone.
It should be noted that the structures or components shown in dashed boxes in Fig. 7 may be located outside the virtual reality helmet 700, for example arranged in an external processing system (such as a computer system) used in cooperation with the virtual reality helmet 700; alternatively, the structures or components shown in dashed boxes may be arranged inside or on the surface of the virtual reality helmet 700.
The present invention may be a system, a method and/or a computer program product. The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to carry out aspects of the present invention.
The computer-readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer-readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer-readable storage medium includes: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or another transmission medium (for example, a light pulse passing through a fiber-optic cable), or an electrical signal transmitted through a wire.
The computer-readable program instructions described herein can be downloaded to respective computing/processing devices from a computer-readable storage medium, or to an external computer or external storage device via a network, for example the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter or network interface in each computing/processing device receives the computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium within the respective computing/processing device.
Computer program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. Where a remote computer is involved, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, for example programmable logic devices, field-programmable gate arrays (FPGA) or programmable logic arrays (PLA), may be personalized by utilizing state information of the computer-readable program instructions, and the electronic circuitry may execute the computer-readable program instructions so as to implement aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowchart and/or block diagram. These computer-readable program instructions may also be stored in a computer-readable storage medium and can direct a computer, a programmable data processing apparatus and/or other devices to function in a particular manner, such that the computer-readable medium having the instructions stored therein comprises an article of manufacture including instructions which implement aspects of the functions/acts specified in one or more blocks of the flowchart and/or block diagram.
The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus or other device, causing a series of operational steps to be performed on the computer, the other programmable apparatus or the other device to produce a computer-implemented process, such that the instructions executed on the computer, the other programmable apparatus or the other device implement the functions/acts specified in one or more blocks of the flowchart and/or block diagram.
The flowchart and block diagrams in the accompanying drawings illustrate the architecture, functionality and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, a program segment or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical functions. In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the drawings. For example, two successive blocks may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functionality involved. It should also be noted that each block of the block diagrams and/or flowchart, and combinations of blocks in the block diagrams and/or flowchart, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
Various embodiments of the present invention have been described above. The foregoing description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present invention is defined by the appended claims.
Claims (10)
- 1. An image processing method, characterized by comprising:
obtaining a first field-of-view angle corresponding to an image to be displayed, a second field-of-view angle of an optical component of a virtual reality helmet, and a second field-of-view range size on the screen of the virtual reality helmet corresponding to the second field-of-view angle;
calculating, using the first field-of-view angle, the second field-of-view angle and the second field-of-view range size, a first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle;
calculating the size of the image to be displayed using the first field-of-view range size;
performing optical distortion correction on the image to be displayed according to the size of the image to be displayed.
- 2. The method according to claim 1, characterized in that the first field-of-view angle corresponding to the image to be displayed is the field-of-view angle of an optical component of an external device of the virtual reality helmet.
- 3. The method according to claim 2, characterized in that obtaining the first field-of-view angle corresponding to the image to be displayed comprises:
when the virtual reality helmet establishes a connection with the external device, the virtual reality helmet obtains device parameters of the external device, wherein the device parameters include the field-of-view angle of the optical component of the external device.
- 4. The method according to claim 1, characterized in that calculating, using the first field-of-view angle, the second field-of-view angle and the second field-of-view range size, the first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle comprises:
calculating the first field-of-view range size based on the formula
r = R * [tan(FOV1/2) / tan(FOV2/2)],
wherein r is the region radius size corresponding to the first field-of-view range, R is the region radius size corresponding to the second field-of-view range, FOV1 is the first field-of-view angle, and FOV2 is the second field-of-view angle.
- 5. The method according to claim 1, characterized in that calculating the size of the image to be displayed using the first field-of-view range size comprises:
calculating the size of the image to be displayed based on the formula
r = K0 + K1*h + K2*h^2 + K3*h^3 + … + Kn*h^n,
wherein h is the side length of the square region corresponding to the image to be displayed, r is the region radius size corresponding to the first field-of-view range, and K0, K1, K2, K3, …, Kn are coefficients corresponding to the optical component of the virtual reality helmet.
- 6. The method according to any one of claims 1 to 5, characterized in that performing optical distortion correction on the image to be displayed according to the size of the image to be displayed comprises:
determining a plurality of feature points from the image to be displayed, and determining, according to the size of the image to be displayed, the distance from each feature point to the center point of the image to be displayed;
performing optical distortion correction on the image to be displayed based on the formula
l' = K0 + K1*l + K2*l^2 + K3*l^3 + … + Kn*l^n,
wherein l is the distance from a feature point to the center point of the image to be displayed, K0, K1, K2, K3, …, Kn are coefficients corresponding to the optical component of the virtual reality helmet, and l' is the distance from the feature point to the center point of the image to be displayed after the optical distortion correction.
- 7. The method according to claim 1, characterized in that the method further comprises:
displaying the image to be displayed after optical distortion correction on the screen of the virtual reality helmet.
- 8. The method according to claim 7, characterized in that, after the image to be displayed after optical distortion correction is displayed on the screen of the virtual reality helmet:
when the first field-of-view angle is smaller than the second field-of-view angle, the region of the corrected image to be displayed viewed by the user through the optical component of the virtual reality helmet is the first field-of-view range size on the screen of the virtual reality helmet corresponding to the first field-of-view angle;
when the first field-of-view angle is larger than the second field-of-view angle, the region of the corrected image to be displayed viewed by the user through the optical component of the virtual reality helmet is the second field-of-view range size on the screen of the virtual reality helmet corresponding to the second field-of-view angle;
when the first field-of-view angle is equal to the second field-of-view angle, the region of the corrected image to be displayed viewed by the user through the optical component of the virtual reality helmet is the second field-of-view range size on the screen of the virtual reality helmet corresponding to the second field-of-view angle.
- 9. A computer-readable recording medium storing a computer program, characterized in that the computer program is executed by one or more processors to implement the image processing method according to any one of claims 1 to 8.
- 10. A virtual reality helmet, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the image processing method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710758206.4A CN107610044A (en) | 2017-08-29 | 2017-08-29 | Image processing method, computer-readable recording medium and virtual reality helmet |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710758206.4A CN107610044A (en) | 2017-08-29 | 2017-08-29 | Image processing method, computer-readable recording medium and virtual reality helmet |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107610044A true CN107610044A (en) | 2018-01-19 |
Family
ID=61056559
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710758206.4A Pending CN107610044A (en) | 2017-08-29 | 2017-08-29 | Image processing method, computer-readable recording medium and virtual reality helmet |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107610044A (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109189215A (en) * | 2018-08-16 | 2019-01-11 | 腾讯科技(深圳)有限公司 | A kind of virtual content display methods, device, VR equipment and medium |
CN109410140A (en) * | 2018-10-24 | 2019-03-01 | 京东方科技集团股份有限公司 | A kind of distortion correction method, device, system and computer readable storage medium |
CN109523481A (en) * | 2018-11-09 | 2019-03-26 | 歌尔股份有限公司 | Antidote, device and the computer readable storage medium of projector image distortion |
WO2019205744A1 (en) * | 2018-04-28 | 2019-10-31 | 京东方科技集团股份有限公司 | Image distortion correction method and apparatus, display device, computer readable medium, and electronic device |
CN110490820A (en) * | 2019-08-07 | 2019-11-22 | Oppo广东移动通信有限公司 | Image processing method and device, electronic equipment, storage medium |
CN112348751A (en) * | 2020-10-27 | 2021-02-09 | 京东方科技集团股份有限公司 | Anti-distortion method and device for near-eye display equipment |
CN112433607A (en) * | 2020-11-17 | 2021-03-02 | 歌尔光学科技有限公司 | Image display method and device, electronic equipment and storage medium |
CN113398596A (en) * | 2021-07-30 | 2021-09-17 | 广州边在晓峰网络科技有限公司 | AR processing system based on multidimensional game |
CN113780414A (en) * | 2021-09-10 | 2021-12-10 | 京东方科技集团股份有限公司 | Eye movement behavior analysis method, image rendering method, component, device and medium |
CN114145011A (en) * | 2019-07-18 | 2022-03-04 | 微软技术许可有限责任公司 | Dynamic detection and correction of light field camera array miscalibration |
CN114356081A (en) * | 2021-12-20 | 2022-04-15 | 歌尔光学科技有限公司 | Image correction method and device, electronic equipment and head-mounted display equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101305595A (en) * | 2005-11-11 | 2008-11-12 | 索尼株式会社 | Image processing device, image processing method, program thereof, recording medium containing the program |
CN105455285A (en) * | 2015-12-31 | 2016-04-06 | 北京小鸟看看科技有限公司 | Virtual reality helmet adaptation method |
CN106127714A (en) * | 2016-07-01 | 2016-11-16 | 南京睿悦信息技术有限公司 | A kind of measuring method of virtual reality head-mounted display equipment distortion parameter |
CN106406536A (en) * | 2016-09-29 | 2017-02-15 | 努比亚技术有限公司 | Head device, display device and image display method |
CN106569654A (en) * | 2016-10-09 | 2017-04-19 | 深圳市金立通信设备有限公司 | Virtual reality interface display method and virtual reality device |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101305595A (en) * | 2005-11-11 | 2008-11-12 | 索尼株式会社 | Image processing device, image processing method, program thereof, recording medium containing the program |
CN105455285A (en) * | 2015-12-31 | 2016-04-06 | 北京小鸟看看科技有限公司 | Virtual reality helmet adaptation method |
CN106127714A (en) * | 2016-07-01 | 2016-11-16 | 南京睿悦信息技术有限公司 | A kind of measuring method of virtual reality head-mounted display equipment distortion parameter |
CN106406536A (en) * | 2016-09-29 | 2017-02-15 | 努比亚技术有限公司 | Head device, display device and image display method |
CN106569654A (en) * | 2016-10-09 | 2017-04-19 | 深圳市金立通信设备有限公司 | Virtual reality interface display method and virtual reality device |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019205744A1 (en) * | 2018-04-28 | 2019-10-31 | 京东方科技集团股份有限公司 | Image distortion correction method and apparatus, display device, computer readable medium, and electronic device |
US11423518B2 (en) | 2018-04-28 | 2022-08-23 | Beijing Boe Optoelectronics Technology Co., Ltd. | Method and device of correcting image distortion, display device, computer readable medium, electronic device |
CN109189215B (en) * | 2018-08-16 | 2021-08-20 | 腾讯科技(深圳)有限公司 | Virtual content display method and device, VR equipment and medium |
CN109189215A (en) * | 2018-08-16 | 2019-01-11 | 腾讯科技(深圳)有限公司 | A kind of virtual content display methods, device, VR equipment and medium |
CN109410140A (en) * | 2018-10-24 | 2019-03-01 | 京东方科技集团股份有限公司 | A kind of distortion correction method, device, system and computer readable storage medium |
CN109523481A (en) * | 2018-11-09 | 2019-03-26 | 歌尔股份有限公司 | Antidote, device and the computer readable storage medium of projector image distortion |
CN109523481B (en) * | 2018-11-09 | 2021-07-13 | 歌尔光学科技有限公司 | Method and device for correcting projector image distortion and computer readable storage medium |
CN114145011A (en) * | 2019-07-18 | 2022-03-04 | 微软技术许可有限责任公司 | Dynamic detection and correction of light field camera array miscalibration |
CN110490820A (en) * | 2019-08-07 | 2019-11-22 | Oppo广东移动通信有限公司 | Image processing method and device, electronic equipment, storage medium |
CN110490820B (en) * | 2019-08-07 | 2022-04-12 | Oppo广东移动通信有限公司 | Image processing method and device, electronic equipment and storage medium |
CN112348751A (en) * | 2020-10-27 | 2021-02-09 | 京东方科技集团股份有限公司 | Anti-distortion method and device for near-eye display equipment |
CN112433607A (en) * | 2020-11-17 | 2021-03-02 | 歌尔光学科技有限公司 | Image display method and device, electronic equipment and storage medium |
CN113398596A (en) * | 2021-07-30 | 2021-09-17 | 广州边在晓峰网络科技有限公司 | AR processing system based on multidimensional game |
CN113780414A (en) * | 2021-09-10 | 2021-12-10 | 京东方科技集团股份有限公司 | Eye movement behavior analysis method, image rendering method, component, device and medium |
CN113780414B (en) * | 2021-09-10 | 2024-08-23 | 京东方科技集团股份有限公司 | Eye movement behavior analysis method, image rendering method, component, device and medium |
CN114356081A (en) * | 2021-12-20 | 2022-04-15 | 歌尔光学科技有限公司 | Image correction method and device, electronic equipment and head-mounted display equipment |
WO2023115460A1 (en) * | 2021-12-20 | 2023-06-29 | 歌尔股份有限公司 | Image correction method and apparatus, electronic device, and head-mounted display device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107610044A (en) | Image processing method, computer-readable recording medium and virtual reality helmet | |
US10939034B2 (en) | Imaging system and method for producing images via gaze-based control | |
KR102330090B1 (en) | Method and device for compositing images | |
US10908421B2 (en) | Systems and methods for personal viewing devices | |
US9762791B2 (en) | Production of face images having preferred perspective angles | |
US11176747B2 (en) | Information processing apparatus and information processing method | |
US20180082479A1 (en) | Virtual fitting method, virtual fitting glasses and virtual fitting system | |
JPWO2013175923A1 (en) | Simulation device | |
CN109002248B (en) | VR scene screenshot method, equipment and storage medium | |
CN107678539A (en) | For wearing the display methods of display device and wearing display device | |
US20160180498A1 (en) | Image display device, image processing device, and image processing method | |
US10572764B1 (en) | Adaptive stereo rendering to reduce motion sickness | |
KR20200056721A (en) | Method and apparatus for measuring optical properties of augmented reality device | |
CN108124150A (en) | Virtual reality wears display device and observes the method for real scene by it | |
CN107560637A (en) | Wear display device calibration result verification method and wear display device | |
CN108270971B (en) | Mobile terminal focusing method and device and computer readable storage medium | |
CN108108018A (en) | Commanding and training method, equipment and system based on virtual reality | |
CN107644228A (en) | Image processing method | |
CN114945943A (en) | Estimating depth based on iris size | |
US11010865B2 (en) | Imaging method, imaging apparatus, and virtual reality device involves distortion | |
CN108021346A (en) | VR helmets show method, VR helmets and the system of image | |
CN107704397A (en) | Applied program testing method, device and electronic equipment | |
CN107589841A (en) | Wear the operating method of display device, wear display device and system | |
KR20170044319A (en) | Method for extending field of view of head mounted display | |
CN111736692B (en) | Display method, display device, storage medium and head-mounted device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20201009 Address after: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building) Applicant after: GoerTek Optical Technology Co.,Ltd. Address before: 266104 Laoshan Qingdao District North House Street investment service center room, Room 308, Shandong Applicant before: GOERTEK TECHNOLOGY Co.,Ltd. |
|
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20180119 |