US20120140033A1 - Displaying 3d content on low frame-rate displays - Google Patents
- Publication number
- US20120140033A1
- Authority
- US
- United States
- Prior art keywords
- display device
- video frames
- frame
- blanking
- sending
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/22—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
- G02B30/24—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/341—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2310/00—Command of the display device
- G09G2310/06—Details of flat display driving waveforms
- G09G2310/061—Details of flat display driving waveforms for resetting or blanking
- PCT/US2011/027981 filed Mar. 10, 2011, entitled “SHUTTERING THE DISPLAY OF INTER-FRAME TRANSITIONS;”
- PCT Patent Application No. PCT/US2011/032549 filed Apr. 14, 2011, entitled “ADAPTIVE 3-D SHUTTERING DEVICES;”
- the entire content of each of the foregoing applications is incorporated by reference herein.
- This invention relates to systems, methods, and computer program products related to conversion and presentation of three-dimensional video content.
- Three-dimensional (3D) display technology involves presenting two-dimensional images in such a manner that the images appear to the human brain to be 3D. The process typically involves presenting “left” image data to the left eye, and “right” image data to the right eye. When received, the brain perceives this data as a 3D image.
- 3D display technology generally incorporates the use of a filtering or blanking device, such as glasses, which filter displayed image data to the correct eye. Filtering devices can be passive, meaning that image data is filtered passively (e.g., by color code or by polarization), or active, meaning that the image data is filtered actively (e.g., by shuttering).
- Traditional display devices, such as computer monitors, television sets, and portable display devices, have been either incapable of producing suitable image data for 3D viewing, or have produced an inferior 3D viewing experience using known devices and processes.
- Viewing 3D content from traditional display devices generally results in blurry images and/or images that have “ghosting” effects, both of which may cause dizziness, headache, discomfort, and even nausea in the viewer.
- This holds true even for display devices that incorporate more recent display technologies, such as Liquid Crystal Display (LCD), Plasma, Light Emitting Diode (LED), and Organic Light Emitting Diode (OLED) displays.
- 3D display devices designed specifically for displaying 3D content have become increasingly popular. These 3D display devices are generally used in connection with active filtering devices (e.g., shuttering glasses) to produce 3D image quality not previously available from traditional display devices. These 3D display devices, however, are relatively expensive when compared to traditional display devices.
- Implementations of the present invention solve one or more problems in the art with systems, methods, and computer program products configured to send three-dimensional (3D) content to a broad range of display devices.
- When sending 3D content using one or more implementations of the present invention, the viewer at the display device can experience a level of quality that can match or even exceed the quality of specialized 3D display devices. Accordingly, implementations of the present invention can alleviate or eliminate the need to purchase a 3D-specific display device by enabling traditional display devices to display 3D content in a high quality manner.
- a method of sending 3D content to a display device can involve sending one or more first video frames that include a first image for viewing by a user's first eye to a display device.
- the method can also involve transmitting an inter-frame blanking signal to a blanking device.
- the inter-frame blanking signal instructs the blanking device to concurrently blank both the user's first eye and the user's second eye during a display of a transition between the one or more first video frames and the one or more second video frames.
- the display device concurrently displays at least a portion of the one or more first video frames and at least a portion of the one or more second video frames.
- Another implementation can include a method of sending 3D content to a display device while synchronously sending an inter-frame blanking signal to a blanking device.
- the method involves receiving a 3D input signal including one or more input video frames.
- the input video frames include a first image for viewing by a user's first eye and a second image for viewing by the user's second eye.
- the method also includes determining frame-rate capabilities of a display device.
- After determining frame-rate capabilities of the display device, the method includes generating a 3D output signal for the display device.
- the 3D output signal comprises first output video frame(s) which include the first image and second output video frame(s) which include the second image.
- the method further includes transmitting the 3D output signal to the display device at a frame-rate based on the determined frame-rate capabilities.
- the method also includes transmitting a blanking instruction to a blanking device. The blanking instruction directs the blanking device to blank the user's view of the display device while the display device transitions between the first output video frame(s) and the second output video frame(s).
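The sequence of acts in this second method can be sketched in code. This is an illustrative sketch only; the function name `send_3d_content`, the instruction strings, and the `max_frame_rate` field are assumptions for demonstration, not terms drawn from the claims.

```python
def send_3d_content(input_frame_pairs, display_caps):
    """Sketch of the claimed method: emit, in order, a 'left' frame
    (right eye blanked), a transition (both eyes blanked), a 'right'
    frame (left eye blanked), and another transition, at a frame-rate
    taken from the display's determined capabilities."""
    rate = display_caps["max_frame_rate"]
    schedule = []
    for left_img, right_img in input_frame_pairs:
        schedule.append(("left", left_img, "blank_right"))
        schedule.append(("transition", None, "blank_both"))
        schedule.append(("right", right_img, "blank_left"))
        schedule.append(("transition", None, "blank_both"))
    return rate, schedule
```

Each tuple pairs what the display shows with the blanking instruction sent to the blanking device for that interval.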
- FIG. 1 illustrates a schematic diagram of a three-dimensional (3D) content conversion system for sending 3D content to a variety of display devices in accordance with one or more implementations of the present invention
- FIG. 2 illustrates a plurality of flow diagrams which demonstrate output video frames customized to physical characteristics of a destination display device in accordance with one or more implementations of the present invention
- FIG. 3 illustrates a schematic diagram of the shuttering of the display of 3D video content in response to a blanking signal in accordance with one or more implementations of the present invention
- FIG. 4 illustrates a timing diagram which demonstrates the relative timing of transmitted output 3D content, a corresponding blanking signal, and resulting display states in accordance with one or more implementations of the present invention
- FIG. 5 illustrates a schematic diagram of a system for sending 3D content to low frame-rate devices in accordance with one or more implementations of the present invention
- FIG. 6 illustrates a flowchart of a series of acts in a method of sending 3D content to a display device, in accordance with an implementation of the present invention
- FIG. 7 illustrates a flowchart of a series of acts in a method of sending 3D content to a display device while synchronously sending an inter-frame blanking signal to a blanking device, in accordance with an implementation of the present invention.
- Implementations of the present invention solve one or more problems in the art with systems, methods, and computer program products configured to send three-dimensional (3D) content to a broad range of display devices.
- When sending 3D content using one or more implementations of the present invention, the viewer at the display device can experience a level of quality that can match or even exceed the quality of specialized 3D display devices. Accordingly, implementations of the present invention can alleviate or eliminate the need to purchase a 3D-specific display device by enabling traditional display devices to display 3D content in a high quality manner.
- Specialized 3D display devices attempt to provide an enhanced 3D viewing experience by modifying physical characteristics of the display device, such as by increasing the frame-rate and decreasing a frame overlap interval.
- the frame-rate refers to the number of unique video frames the display device can render in a given amount of time (e.g., one second).
- Frame overlap interval refers to the period of time that elapses when transitioning between two frames.
- the display device displays at least a portion of two or more video frames concurrently.
- Longer frame overlap intervals are perceptible to the human eye, and can lead to a degraded viewing experience. For example, longer frame overlap intervals can cause motion blurring or ghosting. These effects are a particular problem when viewing 3D video content.
- One or more implementations of the present invention provide for sending 3D content to lower frame-rate display devices in a manner customized for the display device. This can include, for example, customizing the frame-rate or the frame size of the 3D content for the device, and compensating for the frame overlap interval. Compensating for the frame overlap interval can involve blanking of some or all of the frame overlap interval from the user's view. “Inter-frame” blanking can involve sending a blanking instruction to a blanking device which instructs the blanking device to block all or part of the frame overlap interval from the user's view. In one or more implementations, the system can send the inter-frame blanking instruction synchronously with sending the 3D content to the display device. Thus, one or more implementations allow for sending 3D content to a broad range of display devices, including devices that have lower frame-rates and longer frame overlap intervals, while overcoming problems such as motion blurring and ghosting.
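The synchronous pairing of 3D content and blanking instructions described above can be modeled with a small simulation. The transport here is a pair of logs standing in for the video and blanking channels; the function name and log structure are assumptions, not part of the disclosure.

```python
def transmit_synchronously(schedule, frame_period_s):
    """Simulate synchronous transmission: at each tick, a video frame
    goes to the display device and the matching blanking instruction
    goes to the blanking device, stamped with the same time."""
    display_log, blanker_log = [], []
    for tick, (frame, instruction) in enumerate(schedule):
        t = tick * frame_period_s
        display_log.append((t, frame))
        blanker_log.append((t, instruction))
    return display_log, blanker_log
```

Because both logs share timestamps, the blanking device can shutter in lockstep with whatever the display is showing at that instant.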
- FIG. 1 illustrates a schematic diagram of a 3D content conversion system 100 for sending 3D content to a variety of display devices in accordance with one or more implementations of the invention.
- the 3D content conversion system 100 includes a video processing device 102 .
- the video processing device 102 can receive input 3D content 104 via a video receiver 114 , and can transmit output 3D content 108 via a video transmitter 130 .
- the video processing device 102 can optimize, tailor, or customize the output 3D content 108 for a particular destination display device (e.g., destination display device 310 , FIG. 3 ) by considering physical characteristics of the destination display device. Customization or tailoring of the output 3D content 108 can include customizing the encoding format, the frame-rate, the frame size, etc. of the output 3D content 108 for the destination display device. To assist with 3D viewing, the video processing device 102 can also generate a blanking signal 136 and transmit the generated blanking signal 136 to one or more blanking devices (e.g., blanking device(s) 312 , FIG. 3 ).
- the video processing device 102 includes a processing component 116 which can include a plurality of sub-components or modules (which can be separate or combined).
- the processing component 116 can include a decoder 118 , frame buffers 122 , 124 , and an encoder 120 .
- the decoder 118 can receive the input 3D content 104 , which can include one or more input video frames 106 that comprise left eye image data and right eye image data.
- the decoder 118 can detect the 3D encoding format of the input 3D content 104 .
- the decoder 118 can decode left eye image data of the input video frame(s) 106 into one frame buffer (e.g., frame buffer 122 ) and decode right eye image data of the input video frame(s) into the other frame buffer (e.g., frame buffer 124 ).
- decoding may involve decoding image data from a plurality of input video frames 106 to construct complete image data for each frame buffer 122 , 124 . This may be the case, for example, if the input video frames 106 encode each image using a plurality of interlaced video frames. In other circumstances, decoding may involve decoding image data from a single input video frame 106 to construct complete image data for both frame buffers 122 , 124 . This may be the case, for example, if the input video frames 106 encode both images on a single frame (e.g., using spatial compression, or interleaving). As discussed more fully herein after, decoding can include modifying the left eye image data and the right eye image data based on physical characteristics of the destination display device.
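A single-frame spatial compression, as mentioned above, can be decoded into the two frame buffers by splitting the packed image. The side-by-side layout used here is one common packing chosen for illustration; the patent does not mandate this specific format.

```python
def decode_side_by_side(frame):
    """Split a side-by-side packed frame (a list of pixel rows) into
    left and right frame buffers: the left half of every row feeds one
    buffer, the right half feeds the other."""
    half = len(frame[0]) // 2
    left_buffer = [row[:half] for row in frame]
    right_buffer = [row[half:] for row in frame]
    return left_buffer, right_buffer
```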
- the encoder 120 can encode image data previously decoded into the frame buffers 122 , 124 to generate the output 3D content 108 .
- the encoder 120 can encode the left and right eye image data from the frame buffers 122 , 124 into alternating “left” and “right” output video frames 110 , 112 .
- the left video frame(s) can encode only left image data from a corresponding frame buffer (e.g., frame buffer 122 ).
- the right video frame(s) can encode only right image data from a corresponding frame buffer (e.g., frame buffer 124 ).
- the video transmitter 130 can first send one or more output video frames 110 for one eye (e.g., one or more left video frames) to the destination display device, and then send one or more output video frames 112 for the other eye (e.g., one or more right video frames).
- the decoder 118 can then decode new image data into the frame buffers 122 , 124 , and the encoder 120 can then encode new output video frames 110 , 112 from the frame buffers 122 , 124 into the output 3D content 108 .
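The encoder's alternation of left-only and right-only output frames can be sketched as follows. The tuple representation of an output frame is an assumption made for the sketch.

```python
def encode_alternating(buffer_pairs):
    """Encode decoded frame-buffer pairs into a stream of alternating
    'left'-only and 'right'-only output video frames, mirroring how
    the encoder drains the two frame buffers in turn."""
    output = []
    for left_buf, right_buf in buffer_pairs:
        output.append(("left", left_buf))
        output.append(("right", right_buf))
    return output
```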
- the encoding and decoding process can include customizing the output 3D content 108 to physical characteristics of the destination display device.
- Both the decoder 118 and the encoder 120 can consider physical characteristics of the destination display device to generate customized output 3D content 108 that is appropriate for the particular destination display device.
- This can include any combination of generating customized types of output video frames (e.g., interlaced or progressive) and/or generating customized sizes of output video frames (e.g., 480, 720, or 1080 vertical lines).
- This can also include generating output video frames at a target frame-rate (e.g., 60 Hz, 120 Hz).
- the 3D content conversion system 100 can send the frames to the destination display device at a rate that would cause the display device to receive a target number of frames per second.
- FIG. 2 illustrates a plurality of flow diagrams 202 , 204 , 206 , in accordance with one or more implementations, which demonstrate output video frames 110 , 112 customized to physical characteristics of a destination display device.
- Flow diagrams 202 and 204 illustrate output video frames 110 , 112 customized to destination display devices that alternately receive progressive or interlaced video frames.
- flow diagram 202 illustrates that sending one or more “left” video frames can involve sending a single progressive video frame 110 that includes left image data 208 (e.g., left image data from frame buffer 122 ).
- the progressive case can also involve sending a single progressive video frame 112 that includes right image data 210 (e.g., right image data from frame buffer 124 ).
- flow diagram 204 illustrates that sending one or more “left” video frames can involve sending two (or more) “left” interlaced video frames ( 110 a , 110 b ). These frames can include left image data 208 (e.g., left image data from frame buffer 122 ). Similarly, sending one or more “right” video frames can involve sending two (or more) “right” interlaced video frames ( 112 a , 112 b ). These frames can include right image data 210 (e.g., right image data from frame buffer 124 ).
- each of the interlaced video frames encodes only partial image data (e.g., odd lines or even lines).
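Splitting a full frame buffer into two interlaced fields, each carrying only partial image data, might look like the following sketch (a buffer is modeled as a list of rows; the field ordering is an assumption).

```python
def to_interlaced_fields(buffer):
    """Split a full frame buffer into two interlaced fields: one with
    the even-indexed lines, one with the odd-indexed lines. Each field
    carries only partial image data, as in flow diagram 204."""
    even_field = buffer[0::2]
    odd_field = buffer[1::2]
    return even_field, odd_field
```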
- Flow diagram 206 illustrates output video frames 110 , 112 that would upscale the frame-rate of the output 3D content 108 in accordance with one or more implementations.
- the output 3D content 108 includes a progressive “left” video frame 110 (corresponding to the left image data 208 ) and a subsequent progressive “right” video frame 112 (corresponding to right image data 210 ). Then, the “left” video frame 110 and the “right” video frame 112 repeat, using the same image data ( 208 , 210 ).
- repeating the same image data twice can double the frame-rate.
- If the input 3D content 104 had a frame-rate of 60 Hz (i.e., the decoder 118 decoded sixty complete frames per second), then repeating the same image data twice can result in output 3D content 108 having an upscaled frame-rate of 120 Hz.
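The frame-rate upscaling described here, repeating each frame to multiply the rate, reduces to a one-line transformation (an illustrative sketch; the function name is not from the patent):

```python
def upscale_frame_rate(frames, factor=2):
    """Upscale the frame-rate by repeating each frame `factor` times,
    e.g. turning sixty decoded frames per second into one hundred
    twenty output frames per second when factor is 2."""
    return [frame for frame in frames for _ in range(factor)]
```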
- Downscaling can involve reducing the number of video frames in the output 3D content 108 . This may be useful when sending the output 3D content 108 to destination display devices that may not be optimal for, or capable of, displaying higher frame-rates. Downscaling can involve omitting some frames of left and right image data stored in the frame buffers 122 , 124 . Downscaling can also involve detecting differences between sequential video frames and generating new frames that capture these differences. Thus, the video processing device 102 can generate new frames at a lower frame-rate than the original frames, thereby reducing the frame-rate in the output 3D content 108 .
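The simpler of the two downscaling approaches, omitting frames, can be sketched as straightforward decimation (the difference-detection approach would require an image-comparison step not shown here):

```python
def downscale_frame_rate(frames, keep_every=2):
    """Downscale by omitting frames: keep one of every `keep_every`
    frames, e.g. halving a 120 Hz stream to 60 Hz when keep_every=2."""
    return frames[::keep_every]
```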
- the output 3D content 108 generated for one destination display device may comprise progressive video frames having 1080 lines of vertical resolution sent to the destination display device at 120 Hz.
- the output 3D content 108 may alternatively comprise interlaced video frames having 480 lines of vertical resolution sent to the destination display device at 60 Hz.
- these examples are merely illustrative and are not limiting.
- customizing the output 3D content 108 can involve the use of additional components or modules, such as a detection module 126 .
- the detection module 126 can detect physical characteristics of the destination display device and provide this information to the other modules or components, such as the decoder 118 and/or the encoder 120 .
- the detection module 126 can receive the physical characteristic information via an input receiver 132 .
- the detection module 126 can receive physical characteristic information directly from the destination display device (e.g., via a High Definition Media Interface (HDMI) connection) or manually (e.g., via user input).
- the physical characteristic information can include frame size and frame-rate capabilities of the destination display device, an inter-frame overlap interval of the destination display device, etc.
- receiving physical characteristic information via user input can involve receiving user feedback about output 3D content 108 displayed on the destination display device.
- the video processing device 102 can generate and transmit “configuration” output 3D content 108 and a corresponding “configuration” blanking signal 136 in a manner intended to elicit user feedback.
- the user can then provide any appropriate user feedback about his or her perception of the “configuration” output 3D content 108 and blanking signal 136 .
- the video processing device 102 can then adjust the output 3D content 108 and/or the blanking signal 136 until optimized for the physical characteristics of the destination display device.
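The configuration feedback loop above can be modeled as iterating over candidate output formats until the viewer confirms a clear image. Here `user_sees_clear` is a stand-in for the viewer's button press; the format strings are hypothetical.

```python
def calibrate(candidate_formats, user_sees_clear):
    """Cycle through candidate output formats, displaying each as
    'configuration' content, until the viewer reports a clear image
    (e.g. by pressing a button on the shuttering device). Returns the
    accepted format, or None if no candidate was accepted."""
    for fmt in candidate_formats:
        if user_sees_clear(fmt):
            return fmt
    return None
```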
- the video processing device 102 can send the output 3D content 108 in various different formats to the display device.
- the user can provide feedback to the video processing device 102 .
- the user can press a button on either a shuttering device, the display device, or the video processing device 102 to signify to the input module 132 that the image is clear.
- the input module 132 can forward this input to the detection module 126 , which can then determine the physical characteristics of the destination display device.
- the user can use the input module 132 to enter or select a make and model of the destination display device.
- the detection module 126 can then determine the physical characteristics of the destination display device based on the user input.
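The make/model lookup performed by the detection module can be sketched as a table query. The table contents below are hypothetical placeholders; real characteristics would come from an HDMI/EDID query or a vendor database.

```python
# Hypothetical characteristics database; the makes, models, and values
# are invented for illustration only.
DISPLAY_DB = {
    ("AcmeVision", "TV-1080"): {"lines": 1080, "progressive": True,
                                "max_frame_rate": 120, "overlap_ms": 2.0},
    ("AcmeVision", "TV-480i"): {"lines": 480, "progressive": False,
                                "max_frame_rate": 60, "overlap_ms": 8.0},
}

def detect_characteristics(make, model):
    """Return the physical characteristics for a user-selected display,
    mirroring the detection module's make/model lookup."""
    return DISPLAY_DB[(make, model)]
```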
- the decoder 118 and the encoder 120 can each be capable of encoding and/or decoding both analog and digital content.
- the video processing device 102 can convert digital input 3D content 104 into analog output 3D content 108 .
- the video processing device 102 can convert analog input 3D content 104 into digital output 3D content 108 .
- the video processing device 102 can also receive digital content and output digital content.
- the processing component 116 can also include a blanking signal generator 128 , which can generate a blanking signal 136 comprising a plurality of blanking instructions.
- the blanking signal transmitter 134 can transmit the blanking signal 136 to one or more blanking devices (e.g., blanking device(s) 312 , FIG. 3 ) prior to or concurrently with the transmission of the output 3D content 108 to the destination display device.
- the blanking instructions in the blanking signal 136 can instruct the blanking device(s) to shutter the display of the output 3D content 108 .
- the blanking device(s) can respond to the blanking instructions synchronously with the display of the output video frames 110 , 112 at the destination display device to shutter the displayed output video frames 110 , 112 from the user's view.
- the video processing device 102 can include any number of additional components or modules, or can contain a fewer number of components or modules. Accordingly, the video processing device 102 can depart from the illustrated form without departing from the scope of this disclosure. Furthermore, the video processing device 102 can implement any combination of the components or modules in hardware, software, or a combination thereof. For example, the video processing device 102 can implement one or more components or modules using Field Programmable Gate Arrays (FPGAs).
- FIG. 3 illustrates a schematic diagram of the shuttering of the display of 3D video content in response to a blanking signal, according to one or more implementations.
- FIG. 3 illustrates a destination display device 310 and one or more blanking devices 312 in each of three distinct states 302 , 304 , 306 .
- the destination display device 310 displays either unique 3D image data 208 , 210 or an inter-frame overlap 308 , while the blanking device 312 responds to an appropriate blanking instruction 318 , 320 , 322 .
- the destination display device 310 may be displaying one or more of the output video frames 110 , 112 of customized output 3D content 108 , while the blanking device(s) 312 may be responding to the blanking signal 136 .
- Each blanking device 312 can be a “shuttering device” that can blank (or block) one or more portions of a viewer's view of the destination display device 310 to provide the illusion of 3D content display.
- the video processing device 102 can transmit one or more “left” output video frames (e.g., output video frames 110 ) to the destination display device 310 and can transmit a blanking instruction 318 (blank right) to the blanking device(s) 312 .
- the blanking instruction can include a data packet.
- the video processing device 102 can send a blanking signal to the blanking device 312 including one or more data packets 318 .
- the data packet 318 can include instructions to use shuttering component 316 to blank the viewer's right eye view of the display device 310 .
- the blanking device 312 can blank or occlude the viewer's right eye view of the display device 310 using shuttering component 316 .
- the destination display device 310 can uniquely display left image data 208 and each blanking device 312 can use a “right” blanking component 316 to blank the viewer's right eye view of the displayed left image data 208 .
- the video processing device 102 can transmit one or more “right” output video frames (e.g., output video frames 112 ) to the destination display device 310 and can transmit a blanking instruction 322 (blank left) to the blanking device(s) 312 .
- the data packet or blanking instruction 322 can include instructions to use shuttering component 314 to blank the viewer's left eye view of the display device 310 .
- the blanking device 312 can blank or occlude the viewer's left eye view of the display device 310 using shuttering component 314 .
- the destination display device 310 can uniquely display right image data 210 and each blanking device 312 can use a “left” blanking component 314 to blank the viewer's left eye view of the displayed right image data 210 .
- states 302 and 306 , when combined with the synchronous display of right and left image data, can provide the illusion that the two-dimensional left and right images are 3D.
- states 302 and 306 are not limited to displaying “left” and “right” video frames in the manner illustrated.
- the destination display device 310 can display right image data 210 , and each blanking device 312 can use the “left” blanking component 314 to blank the viewer's left eye.
- the destination display device 310 can display left image data 208 , and each blanking device 312 can use the “right” blanking component 316 to blank the viewer's right eye.
- State 304 illustrates an inter-frame overlap interval occurring after the video processing device 102 transmits the one or more “right” output video frames (e.g., output video frames 112 ) subsequent to transmitting the “left” frame(s).
- inter-frame overlap 308 may occur, whereby the destination display device 310 concurrently displays portions of image data from two or more video frames (e.g., image data 208 from video frame 110 and image data 210 from video frame 112 ).
- the video processing device 102 can transmit a blanking instruction 320 (blank both) to the blanking device(s) 312 .
- the blanking instruction or data packet 320 can include instructions to use shuttering components 314 , 316 to blank the viewer's entire view of the display device 310 .
- the blanking device(s) 312 can concurrently use both blanking components 314 , 316 to blank both the viewer's left eye view and the viewer's right eye view during the inter-frame overlap 308 .
- the blanking device(s) 312 can prevent the viewer(s) from viewing at least a portion of the inter-frame overlap 308 during at least a portion of the inter-frame overlap interval.
- This “inter-frame blanking,” or the synchronous blanking of both eyes during inter-frame overlap intervals, can enhance the clarity of the perceived 3D image. Inter-frame blanking can reduce or eliminate the undesirable effects common to 3D content display, such as motion blurring and ghosting.
- the disclosed inter-frame blanking techniques when synchronously combined with the customized output 3D content 108 , can allow for viewing of 3D content on display devices that may have lower frame-rates and/or longer frame overlap intervals.
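One way to model the timing of inter-frame blanking is to treat the tail of each frame period as the transition interval and blank both eyes during it. That tail-of-period placement is an assumption made for this sketch; the patent only requires that the blanking cover the overlap.

```python
def blanking_windows(frame_rate_hz, overlap_ms, n_frames=2):
    """Compute (start_ms, end_ms, instruction) windows. Each frame
    period is split into a unique-display window, during which the
    opposite eye is blanked, and a trailing overlap window, during
    which both eyes are blanked."""
    period_ms = 1000.0 / frame_rate_hz
    windows = []
    for i in range(n_frames):
        start = i * period_ms
        eye = "blank_right" if i % 2 == 0 else "blank_left"
        windows.append((start, start + period_ms - overlap_ms, eye))
        windows.append((start + period_ms - overlap_ms,
                        start + period_ms, "blank_both"))
    return windows
```

At 50 Hz with a 5 ms overlap, for example, each 20 ms period ends with a 5 ms blank-both window covering the transition.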
- FIG. 4 illustrates a timing diagram which demonstrates the relative timing of transmitted output 3D content 108 , a corresponding blanking signal 136 , and resulting display states, consistent with one or more implementations. Illustrated is a snapshot 400 of time during the transmission of the output 3D content 108 to the destination display device 310 , and the transmission of the blanking signal 136 to the blanking device(s) 312 .
- the display states 402 indicate the states 302 , 304 , 306 discussed herein above in connection with FIG. 3 .
- the horizontal ellipses to the left and right of the snapshot 400 indicate that the snapshot 400 may extend to any point in the past or in the future.
- the video processing device 102 can transmit left output video frame(s) 110 to the destination display device 310 .
- time 406 can correspond to the beginning of state 302 , in which the destination display device 310 uniquely displays left image data ( 208 , FIG. 3 ) from the left video frame(s) 110 .
- the video processing device 102 may have started transmission of the left video frame(s) 110 prior to time 406 , and a state 304 of inter-frame overlap may have occurred.
- the video processing device 102 may also have started transmission at the beginning of time 406 .
- FIG. 4 illustrates that during the time period between time 406 and a time 408 , the output 3D content 108 includes the left output video frame(s), 110 and that the blanking signal 136 includes an appropriate blanking instruction 318 (blank right).
- the video processing device 102 can cease transmitting the left output video frame(s) 110 and begin transmitting right output video frame(s) 112 .
- the video processing device 102 can base the timing of the transition between the left and right video frames on a target frame-rate of the output 3D content 108 tailored for the destination display device 310 . For example, if the destination display device 310 would optimally receive sixty progressive frames per second, then the video processing device 102 can transmit a progressive left video frame for 1/60 th of a second. Subsequently, the video processing device 102 can transmit a progressive right video frame for another 1/60 th of a second.
- the video processing device 102 can transmit a plurality of left video frames and then a plurality of right video frames, each for an appropriate period of time.
- the transition between transmitting two video frames can occur immediately after the video processing device 102 transmits the last line of a video frame (e.g., after transmitting the 720th line, in the case of “720p” video frames).
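As an illustrative sketch only (not part of the specification), the pacing described above, in which each progressive frame is held for one frame period (e.g., 1/60th of a second at sixty frames per second) before the transition to the next frame, could be modeled as follows. The function name and the "L"/"R" labels are hypothetical:

```python
from fractions import Fraction

def frame_schedule(frame_rate_hz, sequence):
    """Yield (frame_label, start_time, duration) tuples for a sequence of
    output video frames, each held for exactly one frame period."""
    period = Fraction(1, frame_rate_hz)  # e.g. 1/60 s at 60 Hz
    t = Fraction(0)
    for label in sequence:
        yield label, t, period
        t += period

# Alternating left/right progressive frames at 60 Hz; the transition to
# each next frame begins immediately when the previous period elapses.
schedule = list(frame_schedule(60, ["L", "R", "L", "R"]))
```

Exact rational times (via `Fraction`) avoid the drift that accumulating floating-point frame periods would introduce over a long transmission.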
- the video processing device 102 can determine a state 304 from time 408 to a time 410 .
- the destination display device 310 would display inter-frame overlap ( 308 , FIG. 3 ) as the display device transitions between uniquely displaying the left output video frame(s) 110 and the right output video frame(s) 112 .
- FIG. 4 illustrates that from time 408 to time 410 the blanking signal can include an inter-frame blanking instruction 320 (blank both).
- the inter-frame blanking instruction 320 can blank the inter-frame overlap ( 308 , FIG. 3 ) from the user's view.
- the destination display device 310 will have transitioned past the inter-frame overlap and will uniquely display the right output video frame(s) 112 .
- the video processing device 102 can send an appropriate blanking instruction 322 (blank left) to the blanking device(s) 312 .
- the video processing device 102 can send another one or more left frames, another one or more right frames, etc. These frames can include new image data decoded into the frame buffers, or can include the same data sent previously (i.e., when increasing the frame-rate) in the output 3D content 108 .
- while FIG. 4 illustrates a series of alternating left and right video frames (in any order), one or more implementations extend to any sequence of video frames.
- the output 3D content 108 can comprise differing sequences of left and right video frames (e.g., left, left, right, right).
- the output 3D content 108 can include only video frames intended for viewing with both eyes.
- the output 3D content 108 can comprise a combination of different video frame types. One combination, for instance, can include both video frames intended for viewing with both eyes, as well as video frames intended for viewing with a single eye.
- the blanking signal 136 can instruct the blanking device(s) 312 to blank an entire time period. In other instances, however, the blanking signal 136 can also instruct the blanking device(s) 312 to blank only a portion of a corresponding time period. Furthermore, the blanking signal 136 can instruct the blanking device(s) 312 to blank more than a corresponding time period. In addition, the blanking signal 136 can also include other blanking instructions, such as a blanking instruction that causes the blanking device to refrain from blanking.
- the blanking signal 136 can include any appropriate sequence of blanking instructions that correspond to the output 3D content 108 . For instance, if the output 3D content 108 includes a different sequence of left and right video frames, the blanking signal 136 can include a correspondingly different sequence of blanking instructions. Furthermore, the blanking signal 136 can depart from the illustrated implementations. For example, the blanking signal 136 can refrain from blanking during one or more time periods corresponding to a transition. Furthermore, the blanking signal 136 can include any number of other blanking instructions, such as blanking instructions that do no blanking (e.g., when displaying a video frame intended for viewing with both eyes).
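The correspondence described above between a sequence of output frames and its blanking instructions can be sketched as follows. This is an illustrative mapping, not the patent's implementation; the "L"/"R"/"BOTH" frame types and the "BLANK_*" instruction labels are invented for the example:

```python
def blanking_sequence(frames):
    """For each output frame type ('L', 'R', or 'BOTH'), emit the blanking
    instruction for its unique-display period: blank the opposite eye for a
    single-eye frame, or refrain from blanking for a both-eyes frame.
    Between consecutive frames, emit a blank-both instruction covering the
    inter-frame overlap."""
    per_frame = {"L": "BLANK_RIGHT", "R": "BLANK_LEFT", "BOTH": "NO_BLANK"}
    out = []
    for i, frame in enumerate(frames):
        out.append(per_frame[frame])
        if i < len(frames) - 1:  # a transition follows every frame but the last
            out.append("BLANK_BOTH")
    return out

blanking_sequence(["L", "R"])
# → ["BLANK_RIGHT", "BLANK_BOTH", "BLANK_LEFT"]
```

A differing frame sequence such as left, left, right, right simply yields a correspondingly different instruction sequence from the same mapping.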
- FIG. 5 illustrates a schematic diagram of a system 500 for sending 3D video content to lower frame-rate devices.
- the system 500 can include the video processing device 102 , one or more blanking devices 312 , and a destination display device 310 . These devices can be separate or combined. For instance, in one or more implementations the video processing device 102 and the destination display device 310 are separate units, while in one or more other implementations these devices form a single unit.
- the video processing device 102 receives the input 3D content 104 from a media device.
- the media device can comprise any number of devices capable of transmitting 3D video content to the video processing device 102 .
- FIG. 5 illustrates that the media device can comprise a streaming source 502 (e.g., a satellite box, cable box, the Internet), a gaming device (e.g., XBOX 504 , PLAYSTATION 506 ), a player device (e.g., Blu-Ray player 508 , DVD player 510 ) capable of reading media 512 , and the like.
- the video processing device 102 can, itself, comprise one or more media devices.
- the video receiver 114 can comprise one or more media devices (e.g., media devices 502 , 504 , 506 , 508 , 510 ).
- the video processing device 102 can communicate with the destination display device 310 and the blanking device(s) 312 in any appropriate manner.
- an appropriate wired mechanism such as HDMI, component, composite, coaxial, network, optical, and the like can couple the video processing device 102 and the destination display device 310 together.
- an appropriate wireless mechanism such as BLUETOOTH, Wi-Fi, etc., can couple the video processing device 102 and the destination display device 310 together.
- any appropriate wired or wireless mechanism (e.g., BLUETOOTH, infrared, etc.) can couple the video processing device 102 and the blanking device(s) 312 together.
- the video processing device 102 can generate any appropriate output signal comprising output 3D content 108 .
- when the video processing device 102 and the destination display device 310 are coupled via a digital mechanism (e.g., HDMI), the video processing device 102 can generate a digital signal that includes the output 3D content 108 .
- when the video processing device 102 and the destination display device 310 are coupled via an analog mechanism (e.g., component, composite or coaxial), the video processing device 102 can generate an analog signal that includes the output 3D content 108 .
- the video processing device 102 can take any of a variety of forms.
- the video processing device 102 may be a set-top box or other customized computing system.
- the video processing device 102 may also be a general purpose computing system (e.g., a laptop computer, a desktop computer, a tablet computer, etc.).
- the video processing device 102 can be a special purpose computing system (e.g., a gaming console, a set-top box, etc.) that has been adapted to implement one or more disclosed features.
- the destination display device 310 can be any one of a broad range of display devices that incorporate a variety of display technologies, both current and future (e.g., Cathode Ray, Plasma, LCD, LED, OLED). Furthermore, the destination display device 310 can take any of a number of forms, such as a television set, a computer display (e.g., desktop computer monitor, laptop computer display, tablet computer display), a handheld display (e.g., cellular telephone, PDA, handheld gaming device, handheld multimedia device), or any other appropriate form. While the destination display device 310 can be a display device designed specifically to display 3D content, the destination display device 310 can also be a more traditional display device, such as a lower frame-rate device. One will appreciate in light of the disclosure herein that the destination display device 310 can include both digital and analog display devices.
- the blanking device(s) 312 can be any blanking device(s) configured to interoperate with video processing device 102 and to respond to one or more blanking instructions received via the blanking signal 136 .
- the blanking device(s) 312 comprise shuttering components ( 314 , 316 ) that include one or more liquid crystal layers.
- the liquid crystal layers can have the property of becoming opaque (or substantially opaque) when voltage is applied (or, alternatively, when voltage is removed). Conversely, the liquid crystal layers can have the property of being transparent (or substantially transparent) when voltage is removed (or, alternatively, when voltage is applied).
- the blanking device(s) 312 can apply or remove voltage from the shuttering components to block the user's view, as instructed by the blanking signal.
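A toy model of the shutter behavior just described might look like the following. The class and the "BLANK_*" instruction labels are invented for illustration; the specification does not prescribe any particular control logic, and the voltage polarity could equally be reversed:

```python
class ShutterComponent:
    """Model of a liquid-crystal shuttering component: here, applied
    voltage makes the layer opaque and removed voltage makes it
    transparent (the opposite polarity is equally possible)."""

    def __init__(self):
        self.voltage_applied = False

    @property
    def opaque(self):
        return self.voltage_applied


def apply_instruction(left, right, instruction):
    """Drive the two shuttering components from a blanking instruction
    by applying or removing voltage on each one."""
    left.voltage_applied = instruction in ("BLANK_LEFT", "BLANK_BOTH")
    right.voltage_applied = instruction in ("BLANK_RIGHT", "BLANK_BOTH")


left, right = ShutterComponent(), ShutterComponent()
apply_instruction(left, right, "BLANK_BOTH")   # inter-frame overlap: both opaque
apply_instruction(left, right, "BLANK_RIGHT")  # left frame shown: right eye blocked
```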
- FIGS. 1-5 provide a number of components and mechanisms for sending 3D content to display devices synchronously with an inter-frame blanking signal.
- the 3D content is customized to particular destination display devices and the inter-frame blanking signal can block inter-frame overlap from a user's view.
- one or more disclosed implementations allow for viewing of 3D content on a broad range of display devices, even when that content is not encoded for viewing on those devices.
- FIGS. 6-7 illustrate flowcharts of computerized methods of sending 3D content to a display device.
- FIG. 6 illustrates a flowchart of a method of sending 3D content to a display device.
- FIG. 7 illustrates a flowchart of a method of sending 3D content to a display device while synchronously sending an inter-frame blanking signal to a blanking device.
- the acts of FIGS. 6 and 7 are described herein below with respect to the schematics, diagrams, devices and components shown in FIGS. 1-5 .
- FIG. 6 shows that a method of sending 3D content to a display device can comprise an act 602 of sending first video frame(s) to a display device.
- Act 602 can include sending one or more first video frames that include a first image for viewing by a user's first eye to a display device.
- the act can include the video processing device 102 transmitting output video frames 110 of output 3D content 108 to the destination display device 310 via the video transmitter 130 .
- sending one or more first video frames can include sending a plurality of interlaced first video frames (e.g., video frames 110 a , 110 b ) or sending a single progressive first video frame (e.g., video frame 110 ).
- sending the one or more first video frames can include sending the one or more first video frames at a frame-rate customized to the display device.
- FIG. 6 also shows that the method can comprise an act 604 of transmitting an inter-frame blanking signal to a blanking device.
- Act 604 can include transmitting an inter-frame blanking signal to a blanking device that instructs the blanking device to concurrently blank both of the user's first eye and the user's second eye during a display of a transition during which at least a portion of the one or more first video frames and at least a portion of the one or more second video frames are to be displayed concurrently at the display device.
- the act can include the video processing device sending the blanking instruction 320 (blank both) to the blanking device(s) 312 via the blanking signal 136 .
- the blanking signal 136 can include a blanking instruction 320 that instructs the blanking device to blank both of the user's first eye and the user's second eye during less than an entire display of the transition.
- the inter-frame blanking signal can also instruct the blanking device to blank the user's first eye during individual display of the second image at the display device.
- the inter-frame blanking signal can also instruct the blanking device to blank the user's second eye during individual display of the first image at the display device.
- These instructions may correspond with blanking instructions 318 or 322 (in any order), for example.
- FIG. 6 shows that the method can comprise an act 606 of sending second video frame(s) to the display device.
- Act 606 can include sending the one or more second video frames that include a second image for viewing by the user's second eye to the display device.
- the act can include the video processing device 102 transmitting output video frames 112 of output 3D content 108 to the destination display device 310 via the video transmitter 130 .
- sending one or more second video frames can include sending a plurality of interlaced second video frames (e.g., video frames 112 a , 112 b ) or sending a single progressive second video frame (e.g., video frame 112 ).
- sending the one or more second video frames can include sending the one or more second video frames at a frame-rate customized to the display device.
- the method can include any number of additional acts.
- the method can include acts of generating the one or more first video frames and generating the one or more second video frames based on one or more physical characteristics of the display device, including a frame-rate and a frame size of the display device.
- the generating can include generating output video frames 110 , 112 of the output 3D content 108 having a number of lines customized to the destination display device 310 (e.g., 480, 720, 1080).
- Generating video frames can include generating a number of video frames based on the target frame-rate for the destination display device 310 .
- the method can include an act of generating the inter-frame blanking signal based on one or more physical characteristics of the display device, including an inter-frame overlap interval of the display device, which can be a time period corresponding to the display of the transition.
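The act of generating the inter-frame blanking signal from the display's physical characteristics (frame-rate and inter-frame overlap interval) might be sketched as follows. The function and the "BLANK_*" labels are illustrative assumptions, not the specification's implementation:

```python
def blanking_signal(frame_rate_hz, overlap_interval_s, frames):
    """Timestamped blanking instructions for an alternating L/R frame
    sequence: at each frame boundary, blank both eyes for the display's
    inter-frame overlap interval, then blank only the eye opposite the
    newly displayed frame for the remainder of the frame period."""
    period = 1.0 / frame_rate_hz
    per_frame = {"L": "BLANK_RIGHT", "R": "BLANK_LEFT"}
    events = []
    for i, frame in enumerate(frames):
        t = i * period              # transmission of this frame begins here
        if i > 0:
            events.append((t, "BLANK_BOTH"))   # cover the overlap interval
            t += overlap_interval_s
        events.append((t, per_frame[frame]))   # unique display begins
    return events

# 60 Hz display with a hypothetical 4 ms inter-frame overlap interval:
events = blanking_signal(60, 0.004, ["L", "R"])
```

Widening or narrowing the overlap window (blanking more or less than the corresponding time period, as discussed above) would amount to adjusting the interval added at each boundary.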
- FIG. 7 illustrates a method of sending three-dimensional (3D) content to a display device while synchronously sending an inter-frame blanking signal to a blanking device.
- the method can comprise an act 702 of receiving a 3D input signal.
- Act 702 can include receiving a 3D input signal including one or more input video frames that include a first image for viewing by a user's first eye and a second image for viewing by the user's second eye.
- the act can include the video processing device 102 receiving, via the video receiver 114 , the input 3D content 104 , which includes one or more input video frame(s) 106 .
- the one or more input video frames 106 comprise a single video frame (e.g., when the video frame encodes left and right image data using spatial compression or interleaving). In other instances, the one or more input video frames 106 comprise a plurality of video frames (e.g., when separate progressive or interlaced frames encode the left and right image data).
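For the single-frame case, one common spatial-compression layout packs the left and right images side by side within one frame. A minimal sketch of recovering the two images, assuming a side-by-side layout and representing a frame as a list of pixel rows (both assumptions for illustration only):

```python
def split_side_by_side(frame):
    """Split a spatially compressed (side-by-side) input video frame into
    its left-eye and right-eye images by halving each row of pixels."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right

frame = [["L1", "L2", "R1", "R2"],
         ["L3", "L4", "R3", "R4"]]
left, right = split_side_by_side(frame)
# left  → [["L1", "L2"], ["L3", "L4"]]
# right → [["R1", "R2"], ["R3", "R4"]]
```

An interleaved layout (alternating lines or columns) would be handled analogously, by de-interleaving rather than halving rows.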
- FIG. 7 illustrates that the method can comprise an act 704 of determining capabilities of the display device.
- Act 704 can include determining frame-rate capabilities of a display device.
- the act can include the video processing device 102 receiving physical characteristic information of the destination display device 310 via the input receiver 132 .
- the physical characteristic information can include, for instance, frame-rate capabilities, frame size capabilities, frame overlap interval(s), etc.
- the act can also include determining frame size capabilities of the display device, or determining a frame overlap interval for the display device.
- the act can comprise receiving physical characteristic information (e.g., frame-rate capabilities) directly from the display device or via manual user input.
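The physical characteristic information gathered in this act could be represented as a small record. The structure below, and the comma-separated input format it parses, are hypothetical conveniences for illustration; an implementation could instead query the display directly over its video interface:

```python
from dataclasses import dataclass

@dataclass
class DisplayCharacteristics:
    frame_rate_hz: int         # e.g. 60
    frame_lines: int           # e.g. 480, 720, or 1080
    overlap_interval_s: float  # inter-frame overlap interval, in seconds

def characteristics_from_user_input(text):
    """Parse manually entered characteristics of the (assumed) form
    'rate,lines,overlap', e.g. '60,720,0.004'."""
    rate, lines, overlap = text.split(",")
    return DisplayCharacteristics(int(rate), int(lines), float(overlap))

caps = characteristics_from_user_input("60,720,0.004")
```

The generating and transmitting acts that follow can then consume these fields when customizing the 3D output signal.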
- FIG. 7 also illustrates that the method can comprise an act 706 of generating a 3D output signal.
- Act 706 can include generating a 3D output signal for the display device, comprising one or more first output video frames including the first image and one or more second output video frames including the second image.
- the act can include the video processing device 102 using the encoder 120 to encode a plurality of output video frames 110 , 112 from the frame buffers 122 , 124 .
- the act can take physical capabilities of the display device into account.
- the act can also include generating the one or more first output video frames and the one or more second output video frames based on determined capabilities (e.g., frame size, frame-rate).
- FIG. 7 illustrates that the method can comprise an act 708 of transmitting the 3D output signal to the display device.
- Act 708 can include transmitting the 3D output signal to the display device at a frame-rate based on the determined frame-rate capabilities.
- the act can include the video processing device 102 using the video transmitter 130 to send the output 3D content 108 to a destination display device 310 .
- the act can include sending each video frame for a specific time period appropriate for the frame-rate. For example, if the frame-rate is 60 Hz, the act can include sending each frame for 1/60th of a second.
- FIG. 7 also shows that the method can include an act 710 of transmitting a blanking instruction to a blanking device.
- Act 710 can include transmitting a blanking instruction to a blanking device which directs the blanking device to blank the user's view of the display device while the display device transitions between the one or more first output video frames and the one or more second output video frames.
- the act can include the video processing device 102 transmitting the blanking signal 136 via the blanking signal transmitter 134 .
- the blanking signal 136 can include a first blanking instruction (e.g., blanking instruction 320 ) which instructs the blanking device to blank both of a user's eyes.
- the method can include transmitting any number of additional blanking instructions.
- the method can include transmitting a second blanking instruction to the blanking device which directs the blanking device to blank the user's first eye view of the display device while the display device uniquely displays the one or more second output video frames (e.g., blanking instruction 318 ).
- the method can also include transmitting a third blanking instruction to the blanking device which directs the blanking device to blank the user's second eye view of the display device while the display device uniquely displays the one or more first output video frames (e.g., blanking instruction 322 ).
- the method can also include transmitting other blanking instructions, such as a blanking instruction which directs the blanking device to refrain from blanking.
- FIGS. 1-7 provide a number of components and mechanisms for sending 3D video content to a broad range of display devices.
- One or more disclosed implementations allow for viewing of 3D video content on a broad range of display devices, including devices that may have lower frame-rates and longer frame overlap intervals, or that are not otherwise specifically designed for displaying 3D video content.
- the implementations of the present invention can comprise a special purpose or general-purpose computing systems.
- Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered computing systems, such as DVD players, Blu-Ray players, gaming systems, and video converters.
- the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor.
- the memory may take any form and may depend on the nature and form of the computing system.
- a computing system may be distributed over a network environment and may include multiple constituent computing systems. In its most basic configuration, a computing system typically includes at least one processing unit and memory.
- the memory may be physical system memory, which may be volatile, non-volatile, or some combination of the two.
- the term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory and/or storage capability may be distributed as well.
- the term “module” or “component” can refer to software objects or routines that execute on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).
- Implementations of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below.
- Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures.
- Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
- Computer-readable media that store computer-executable instructions are physical storage media.
- Computer-readable media that carry computer-executable instructions are transmission media.
- embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
- Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- a “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
- a network or another communications connection can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa).
- computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system.
- computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
- the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
- the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like.
- the invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
- program modules may be located in both local and remote memory storage devices.
Description
- The present application is a U.S. National Stage Application corresponding to PCT Patent Application No. PCT/US2011/027933, filed Mar. 10, 2011, which claims priority to U.S. Provisional Application No. 61/416,708, filed Nov. 23, 2010, entitled “3D VIDEO CONVERTER.” The present application is also a continuation-in-part of: PCT Patent Application No. PCT/US2011/025262, filed Feb. 17, 2011, entitled “BLANKING INTER-FRAME TRANSITIONS OF A 3D SIGNAL;” PCT Patent Application No. PCT/US2011/027175, filed Mar. 4, 2011, entitled “FORMATTING 3D CONTENT FOR LOW FRAME-RATE DISPLAYS;” PCT Patent Application No. PCT/US2011/027981, filed Mar. 10, 2011, entitled “SHUTTERING THE DISPLAY OF INTER-FRAME TRANSITIONS;” PCT Patent Application No. PCT/US2011/032549, filed Apr. 14, 2011, entitled “ADAPTIVE 3-D SHUTTERING DEVICES;” and PCT Patent Application No. PCT/US2011/031115, filed Apr. 4, 2011, entitled “DEVICE FOR DISPLAYING 3D CONTENT ON LOW FRAME-RATE DISPLAYS.” The entire content of each of the foregoing applications is incorporated by reference herein.
- 1. The Field of the Invention
- This invention relates to systems, methods, and computer program products related to conversion and presentation of three-dimensional video content.
- 2. Background and Relevant Art
- Three-dimensional (3D) display technology involves presenting two-dimensional images in such a manner that the images appear to the human brain to be 3D. The process typically involves presenting “left” image data to the left eye, and “right” image data to the right eye. When received, the brain perceives this data as a 3D image. 3D display technology generally incorporates the use of a filtering or blanking device, such as glasses, which filter displayed image data to the correct eye. Filtering devices can be passive, meaning that image data is filtered passively (e.g., by color code or by polarization), or active, meaning that the image data is filtered actively (e.g., by shuttering).
- Traditional display devices, such as computer monitors, television sets, and portable display devices, have been either incapable of producing suitable image data for 3D viewing, or have produced an inferior 3D viewing experience using known devices and processes. For instance, viewing 3D content from traditional display devices generally results in blurry images and/or images that have “ghosting” effects, both of which may cause dizziness, headache, discomfort, and even nausea in the viewer. This is true even for display devices that incorporate more recent display technologies, such as Liquid Crystal Display (LCD), Plasma, Light Emitting Diode (LED), Organic Light Emitting Diode (OLED), etc.
- Recently, 3D display devices designed specifically for displaying 3D content have become increasingly popular. These 3D display devices are generally used in connection with active filtering devices (e.g., shuttering glasses) to produce 3D image quality not previously available from traditional display devices. These 3D display devices, however, are relatively expensive when compared to traditional display devices.
- As a result, consumers who desire to view 3D content face the purchase of expensive 3D display devices, even when they may already have traditional display devices available. Accordingly, there are a number of considerations to be made regarding the display of 3D content.
- Implementations of the present invention solve one or more problems in the art with systems, methods, and computer program products configured to send three-dimensional (3D) content to a broad range of display devices. When sending 3D content using one or more implementations of the present invention, the viewer at the display device can experience a level of quality that can match or even exceed the quality of specialized 3D display devices. Accordingly, implementations of the present invention can alleviate or eliminate the need to purchase a 3D-specific display device by enabling traditional display devices to display 3D content in a high quality manner.
- For example, a method of sending 3D content to a display device can involve sending one or more first video frames that include a first image for viewing by a user's first eye to a display device. The method can also involve transmitting an inter-frame blanking signal to a blanking device. The inter-frame blanking signal instructs the blanking device to concurrently blank both the user's first eye and the user's second eye during a display of a transition between the one or more first video frames and the one or more second video frames. During the transition the display device concurrently displays at least a portion of the one or more first video frames and at least a portion of the one or more second video frames.
- Another implementation can include a method of sending 3D content to a display device while synchronously sending an inter-frame blanking signal to a blanking device. The method involves receiving a 3D input signal including one or more input video frames. The input video frames include a first image for viewing by a user's first eye and a second image for viewing by the user's second eye. The method also includes determining frame-rate capabilities of a display device.
- After determining frame-rate capabilities of the display device, the method includes generating a 3D output signal for the display device. The 3D output signal comprises first output video frame(s) which include the first image and second output video frame(s) which include the second image. Then, the method further includes transmitting the 3D output signal to the display device at a frame-rate based on the determined frame-rate capabilities. The method also includes transmitting a blanking instruction to a blanking device. The blanking instruction directs the blanking device to blank the user's view of the display device while the display device transitions between the first output video frame(s) and the second output video frame(s).
- This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- Additional features and advantages of exemplary implementations of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of such exemplary implementations. The features and advantages of such implementations may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features will become more fully apparent from the following description and appended claims, or may be learned by the practice of such exemplary implementations as set forth hereinafter.
- In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
- FIG. 1 illustrates a schematic diagram of a three-dimensional (3D) content conversion system for sending 3D content to a variety of display devices in accordance with one or more implementations of the present invention;
- FIG. 2 illustrates a plurality of flow diagrams which demonstrate output video frames customized to physical characteristics of a destination display device in accordance with one or more implementations of the present invention;
- FIG. 3 illustrates a schematic diagram of the shuttering of the display of 3D video content in response to a blanking signal in accordance with one or more implementations of the present invention;
- FIG. 4 illustrates a timing diagram which demonstrates the relative timing of transmitted output 3D content, a corresponding blanking signal, and resulting display states in accordance with one or more implementations of the present invention;
- FIG. 5 illustrates a schematic diagram of a system for sending 3D content to low frame-rate devices in accordance with one or more implementations of the present invention;
- FIG. 6 illustrates a flowchart of a series of acts in a method, in accordance with an implementation of the present invention, of sending 3D content to a display device; and
- FIG. 7 illustrates a flowchart of a series of acts in a method, in accordance with an implementation of the present invention, of sending 3D content to a display device while synchronously sending an inter-frame blanking signal to a blanking device.
- Implementations of the present invention solve one or more problems in the art with systems, methods, and computer program products configured to send three-dimensional (3D) content to a broad range of display devices. When sending 3D content using one or more implementations of the present invention, the viewer at the display device can experience a level of quality that can match or even exceed the quality of specialized 3D display devices. Accordingly, implementations of the present invention can alleviate or eliminate the need to purchase a 3D-specific display device by enabling traditional display devices to display 3D content in a high quality manner.
- Specialized 3D display devices attempt to provide an enhanced 3D viewing experience by modifying physical characteristics of the display device, such as by increasing the frame-rate and decreasing a frame overlap interval. The frame-rate refers to the number of unique video frames the display device can render in a given amount of time (e.g., one second). Frame overlap interval refers to the period of time that elapses when transitioning between two frames. During the frame overlap interval, the display device displays at least a portion of two or more video frames concurrently. Longer frame overlap intervals are perceptible to the human eye, and can lead to a degraded viewing experience. For example, longer frame overlap intervals can cause motion blurring or ghosting. These effects are a particular problem when viewing 3D video content.
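For concreteness, the relationship between frame-rate and frame overlap interval can be sketched numerically. The 60 Hz rate and 4 ms overlap below are hypothetical example values chosen for illustration, not figures taken from this disclosure:

```python
# Hypothetical numbers illustrating the frame-rate and frame overlap
# interval concepts described above; no values here come from the
# disclosure itself.

def frame_period_ms(frame_rate_hz):
    """Time available to display one unique frame, in milliseconds."""
    return 1000.0 / frame_rate_hz

def overlap_fraction(frame_rate_hz, overlap_ms):
    """Fraction of each frame period during which the display shows
    portions of two frames at once (the frame overlap interval)."""
    return overlap_ms / frame_period_ms(frame_rate_hz)

# A 60 Hz display has ~16.67 ms per frame; with an assumed 4 ms
# overlap interval, roughly 24% of every frame period shows blended
# frames, which is long enough to produce visible ghosting.
print(round(frame_period_ms(60), 2))        # 16.67
print(round(overlap_fraction(60, 4.0), 2))  # 0.24
```

Doubling the frame-rate halves the frame period, which is one reason specialized 3D displays pursue higher rates.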
- One or more implementations of the present invention provide for sending 3D content to lower frame-rate display devices in a manner customized for the display device. This can include, for example, customizing the frame-rate or the frame size of the 3D content for the device, and compensating for the frame overlap interval. Compensating for the frame overlap interval can involve blanking of some or all of the frame overlap interval from the user's view. “Inter-frame” blanking can involve sending a blanking instruction to a blanking device which instructs the blanking device to block all or part of the frame overlap interval from the user's view. In one or more implementations, the system can send the inter-frame blanking instruction synchronously with sending the 3D content to the display device. Thus, one or more implementations allow for sending 3D content to a broad range of display devices, including devices that have lower frame-rates and longer frame overlap intervals, while overcoming problems such as motion blurring and ghosting.
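The frame-rate customization mentioned above can be sketched as simple frame repetition and frame dropping. This is a simplified model under assumed conditions; the disclosure does not prescribe a particular conversion algorithm at this level of detail:

```python
# Simplified sketch of frame-rate customization: upscale by repeating
# each frame, downscale by keeping every Nth frame. A real device
# could also resize frames or re-interlace them; this models only the
# frame-rate aspect.

def upscale(frames, factor):
    """Repeat each frame `factor` times (e.g., 60 Hz -> 120 Hz)."""
    return [f for f in frames for _ in range(factor)]

def downscale(frames, keep_every):
    """Keep every `keep_every`-th frame (e.g., 120 Hz -> 60 Hz)."""
    return frames[::keep_every]

print(upscale(["L1", "L2"], 2))                # ['L1', 'L1', 'L2', 'L2']
print(downscale(["L1", "L2", "L3", "L4"], 2))  # ['L1', 'L3']
```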
- FIG. 1, for example, illustrates a schematic diagram of a 3D content conversion system 100 for sending 3D content to a variety of display devices in accordance with one or more implementations of the invention. As illustrated, the 3D content conversion system 100 includes a video processing device 102. The video processing device 102 can receive input 3D content 104 via a video receiver 114, and can transmit output 3D content 108 via a video transmitter 130. - The
video processing device 102 can optimize, tailor, or customize the output 3D content 108 for a destination display device (e.g., destination display device 310, FIG. 3) by considering physical characteristics of the destination display device. Customization or tailoring of the output 3D content 108 can include, for example, customizing a frame-rate or a frame size of the output 3D content 108. The video processing device 102 can also generate a blanking signal 136 and transmit the generated blanking signal 136 to one or more blanking devices (e.g., blanking device(s) 312, FIG. 3). - In one or more implementations, the
video processing device 102 includes a processing component 116 which can include a plurality of sub-components or modules (which can be separate or combined). For example, the processing component 116 can include a decoder 118, frame buffers 122, 124, and an encoder 120. The decoder 118 can receive the input 3D content 104, which can include one or more input video frames 106 that comprise left eye image data and right eye image data. The decoder 118 can detect the 3D encoding format of the input 3D content 104. Then, the decoder 118 can decode left eye image data of the input video frame(s) 106 into one frame buffer (e.g., frame buffer 122) and decode right eye image data of the input video frame(s) into the other frame buffer (e.g., frame buffer 124). - In some circumstances, decoding may involve decoding image data from a plurality of input video frames 106 to construct complete image data for each
frame buffer 122, 124. In other circumstances, decoding may involve decoding image data from a single input video frame 106 to construct complete image data for both frame buffers 122, 124. - Regardless of the specific decoding process, the
encoder 120 can encode image data previously decoded into the frame buffers 122, 124 into output 3D content 108. For example, the encoder 120 can encode the left and right eye image data from the frame buffers 122, 124 into one or more output video frames 110, 112. The video transmitter 130 can first send one or more output video frames 110 for one eye (e.g., one or more left video frames) to the destination display device, and then send one or more output video frames 112 for the other eye (e.g., one or more right video frames). The decoder 118 can then decode new image data into the frame buffers 122, 124, and the encoder 120 can then encode new output video frames 110, 112 from the frame buffers 122, 124 into the output 3D content 108. - The encoding and decoding process can include customizing the
output 3D content 108 for the destination display device. For example, the decoder 118 and the encoder 120 can consider physical characteristics of the destination display device to generate customized output 3D content 108. For instance, the 3D content conversion system 100 can send the frames to the destination display device at a rate that would cause the display device to receive a target number of frames per second. - Turning briefly to
FIG. 2, for example, illustrated are a plurality of flow diagrams 202, 204, 206, in accordance with one or more implementations, which demonstrate output video frames 110, 112 customized to physical characteristics of a destination display device. Flow diagrams 202 and 204 illustrate output video frames 110, 112 customized to destination display devices that receive either progressive or interlaced video frames. Taking the progressive case, flow diagram 202 illustrates that sending one or more "left" video frames can involve sending a single progressive video frame 110 that includes left image data 208 (e.g., left image data from frame buffer 122). Similarly, the progressive case can also involve sending a single progressive video frame 112 that includes right image data 210 (e.g., right image data from frame buffer 124). - In the interlaced case, on the other hand, flow diagram 204 illustrates that sending one or more "left" video frames can involve sending two (or more) "left" interlaced video frames (110 a, 110 b). These frames can include left image data 208 (e.g., left image data from frame buffer 122). Similarly, sending one or more "right" video frames can involve sending two (or more) "right" interlaced video frames (112 a, 112 b). These frames can include right image data 210 (e.g., right image data from frame buffer 124). Of course, one of ordinary skill in the art would understand in view of this disclosure that each of the interlaced video frames encodes only partial image data (e.g., odd lines or even lines).
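The progressive and interlaced cases above can be sketched with a toy model in which an image is simply a list of scan lines. The field ordering (even lines first) is an assumption made for illustration; a real device may expect the opposite order:

```python
# Toy model of the progressive vs. interlaced output cases: a
# progressive display gets the whole image as one frame, while an
# interlaced display gets two fields, each with half the scan lines.
# Field order (even lines first) is assumed for illustration only.

def progressive_frames(image):
    return [image]                 # single frame (e.g., frame 110)

def interlaced_frames(image):
    even_field = image[0::2]       # e.g., frame 110 a
    odd_field = image[1::2]        # e.g., frame 110 b
    return [even_field, odd_field]

image = ["line0", "line1", "line2", "line3"]
print(progressive_frames(image))  # [['line0', 'line1', 'line2', 'line3']]
print(interlaced_frames(image))   # [['line0', 'line2'], ['line1', 'line3']]
```

As the passage notes, each interlaced field carries only partial image data; both fields together reconstruct the full image.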
- Flow diagram 206 illustrates output video frames 110, 112 that would upscale the frame-rate of the
output 3D contentoutput 3D contentvideo frame 110 and the “right”video frame 112 repeat, using the same image data (208, 210). One will appreciate in light of the disclosure herein that repeating the same image data twice can double the frame-rate. For example, if theinput 3D content 104 had a frame-rate of 60 Hz (i.e., thedecoder 118 decoded sixty complete frames per second), then introducing the same image data twice can result inoutput 3D content - Downscaling, on the other hand, can involve reducing the number of video frames in the
output 3D content 108. For example, when generating output video frames 110, 112 from the image data in the frame buffers 122, 124, the video processing device 102 can generate new frames at a lower frame-rate than the original frames, thereby reducing the frame-rate in the output 3D content 108. - Any combinations of customization of the
output 3D content 108 are possible. Thus, when generating output 3D content 108 from the same input 3D content 104, but for a different destination display device, the output 3D content 108 can differ to match the physical characteristics of that display device. - Returning to
FIG. 1, in one or more implementations customizing the output 3D content 108 can involve use of a detection module 126. The detection module 126 can detect physical characteristics of the destination display device and provide this information to the other modules or components, such as the decoder 118 and/or the encoder 120. In one or more implementations, the detection module 126 can receive the physical characteristic information via an input receiver 132. The detection module 126 can receive physical characteristic information directly from the destination display device (e.g., via a High Definition Media Interface (HDMI) connection) or manually (e.g., via user input). The physical characteristic information can include frame size and frame-rate capabilities of the destination display device, an inter-frame overlap interval of the destination display device, etc. - In one or more implementations, receiving physical characteristic information via user input can involve receiving user feedback about
output 3D content 108 displayed at the destination display device. For example, the video processing device 102 can generate and transmit "configuration" output 3D content 108 and a corresponding blanking signal 136 in a manner intended to elicit user feedback. The user can then provide any appropriate user feedback about his or her perception of the "configuration" output 3D content 108 and blanking signal 136. The video processing device 102 can then adjust the output 3D content 108 and/or the blanking signal 136 until optimized for the physical characteristics of the destination display device. - For example, the
video processing device 102 can send the output 3D content 108 using varying parameters until the user indicates a clear image to the video processing device 102. In one or more implementations, the user can press a button on either a shuttering device, the display device, or the video processing device 102 to signify to the input module 132 that the user perceives a clear image. The input module 132 can forward this input to the detection module 126, which can then determine the physical characteristics of the destination display device. Alternatively, the user can use the input module 132 to enter or select a make and model of the destination display device. The detection module 126 can then determine the physical characteristics of the destination display device based on the user input. - The
decoder 118 and the encoder 120 can each be capable of encoding and/or decoding both analog and digital content. Thus, the video processing device 102 can convert digital input 3D content 104 into analog output 3D content 108. Alternatively, the video processing device 102 can convert analog input 3D content 104 into digital output 3D content 108. Of course, the video processing device 102 can also receive digital content and output digital content. - The
processing component 116 can also include a blanking signal generator 128, which can generate a blanking signal 136 comprising a plurality of blanking instructions. The blanking signal transmitter 134 can transmit the blanking signal 136 to one or more blanking devices (e.g., blanking device(s) 312, FIG. 3) prior to or concurrently with the transmission of the output 3D content 108. The blanking signal 136 can instruct the blanking device(s) to shutter the display of the output 3D content 108. - One will appreciate in light of the disclosure herein that the
video processing device 102 can include any number of additional components or modules, or can contain a fewer number of components or modules. Accordingly, the video processing device 102 can depart from the illustrated form without departing from the scope of this disclosure. Furthermore, the video processing device 102 can implement any combination of the components or modules in hardware, software, or a combination thereof. For example, the video processing device 102 can implement one or more components or modules using Field Programmable Gate Arrays (FPGAs). - Turning to
FIG. 3, illustrated is a schematic diagram of the shuttering of the display of 3D video content in response to a blanking signal, according to one or more implementations. FIG. 3 illustrates a destination display device 310 and one or more blanking devices 312 in each of three distinct states 302, 304, 306. In each state, the destination display device 310 displays either unique 3D image data 208, 210 or inter-frame overlap 308, while the blanking device 312 responds to an appropriate blanking instruction 318, 320, 322. Thus, the destination display device 310 may be displaying one or more of the output video frames 110, 112 of customized output 3D content 108 while the blanking device(s) 312 respond to the blanking signal 136. - Each blanking
device 312 can be a "shuttering device" that can blank (or block) one or more portions of a viewer's view of the destination display device 310 to provide the illusion of 3D content display. In state 302, for example, the video processing device 102 can transmit one or more "left" output video frames (e.g., output video frames 110) to the destination display device 310 and can transmit a blanking instruction 318 (blank right) to the blanking device(s) 312. In one or more implementations, the blanking instruction can include a data packet. Thus, when the display device 310 displays "left eye content" 208 in state 302, the video processing device 102 can send a blanking signal to the blanking device 312 including one or more data packets 318. The data packet 318 can include instructions to use shuttering component 316 to blank the viewer's right eye view of the display device 310. Upon receipt of data packet 318, the blanking device 312 can blank or occlude the viewer's right eye view of the display device 310 using shuttering component 316. In other words, the destination display device 310 can uniquely display left image data 208 and each blanking device 312 can use a "right" blanking component 316 to blank the viewer's right eye view of the displayed left image data 208. - Similarly, in
state 306, the video processing device 102 can transmit one or more "right" output video frames (e.g., output video frames 112) to the destination display device 310 and can transmit a blanking instruction 322 (blank left) to the blanking device(s) 312. The data packet or blanking instruction 322 can include instructions to use shuttering component 314 to blank the viewer's left eye view of the display device 310. Thus, upon receipt of data packet 322, the blanking device 312 can blank or occlude the viewer's left eye view of the display device 310 using shuttering component 314. In other words, the destination display device 310 can uniquely display right image data 210 and each blanking device 312 can use a "left" blanking component 314 to blank the viewer's left eye view of the displayed right image data 210.
states state 302, thedestination display device 310 can displayright image data 210, and each blankingdevice 312 can use the “left” blankingcomponent 314 to blank the viewer's left eye. Instate 306, on the other hand, thedestination display device 310 can displayleft image data 208, and each blankingdevice 312 can use the “right”blanking component 316 blank the viewer's right eye. - In addition to blanking left and right eyes individually, one or more implementations provide an enhanced 3D viewing experience by introducing a third state that blanks both the viewer's eyes during inter-frame overlap.
State 304 illustrates an inter-frame overlap interval occurring after the video processing device 102 transmits the one or more "right" output video frames (e.g., output video frames 112) subsequent to transmitting the "left" frame(s). During this interval, inter-frame overlap 308 may occur, whereby the destination display device 310 concurrently displays portions of image data from two or more video frames (e.g., image data 208 from video frame 110 and image data 210 from video frame 112). During this inter-frame overlap interval, the video processing device 102 can transmit a blanking instruction 320 (blank both) to the blanking device(s) 312. The blanking instruction or data packet 320 can include instructions to use shuttering components 314 and 316 to blank both of the viewer's eye views of the display device 310. Thus, the blanking device(s) 312 can concurrently use both blanking components 314, 316 to blank the user's view of the inter-frame overlap 308. - By blanking both eyes during
state 304, the blanking device(s) 312 can prevent the viewer(s) from viewing at least a portion of the inter-frame overlap 308 during at least a portion of the inter-frame overlap interval. This "inter-frame blanking," or the synchronous blanking of both eyes during inter-frame overlap intervals, can enhance the clarity of the perceived 3D image. Inter-frame blanking can reduce or eliminate the undesirable effects common to 3D content display, such as motion blurring and ghosting. Thus, the disclosed inter-frame blanking techniques, when synchronously combined with the customized output 3D content 108, can provide a high-quality 3D viewing experience even on lower frame-rate display devices. -
FIG. 4 illustrates a timing diagram which demonstrates the relative timing of transmitted output 3D content 108, a corresponding blanking signal 136, and resulting display states, consistent with one or more implementations. Illustrated is a snapshot 400 of time during the transmission of the output 3D content 108 to the destination display device 310, and the transmission of the blanking signal 136 to the blanking device(s) 312. The display states 402 indicate the states 302, 304, 306 of FIG. 3. The horizontal ellipses to the left and right of the snapshot 400 indicate that the snapshot 400 may extend to any point in the past or in the future. - At a
time 406, the video processing device 102 can transmit left output video frame(s) 110 to the destination display device 310. As illustrated, time 406 can correspond to the beginning of state 302, in which the destination display device 310 uniquely displays left image data (208, FIG. 3) from the left video frame(s) 110. The video processing device 102 may have started transmission of the left video frame(s) 110 prior to time 406, and a state 304 of inter-frame overlap may have occurred. The video processing device 102 may also have started transmission at the beginning of time 406. Regardless of when transmission began, FIG. 4 illustrates that during the time period between time 406 and a time 408, the output 3D content 108 includes the left video frame(s) 110 and the blanking signal 136 includes an appropriate blanking instruction 318 (blank right). - At
time 408, the video processing device 102 can cease transmitting the left output video frame(s) 110 and begin transmitting right output video frame(s) 112. The video processing device 102 can base the timing of the transition between the left and right video frames on a target frame-rate of the output 3D content 108, as customized for the destination display device 310. For example, if the destination display device 310 would optimally receive sixty progressive frames per second, then the video processing device 102 can transmit a progressive left video frame for 1/60th of a second. Subsequently, the video processing device 102 can transmit a progressive right video frame for another 1/60th of a second. Of course, if the destination display device 310 receives interlaced frames, then the video processing device 102 can transmit a plurality of left video frames and then a plurality of right video frames, each for an appropriate period of time. The transition between transmitting two video frames can occur immediately after the video processing device 102 transmits the last line of a video frame (e.g., after transmitting the 720th line, in the case of "720p" video frames). - Based on the physical characteristic information of the
destination display device 310, the video processing device 102 can determine a state 304 from time 408 to a time 410. During this time period, the destination display device 310 would display inter-frame overlap (308, FIG. 3) as the display device transitions between uniquely displaying the left output video frame(s) 110 and the right output video frame(s) 112. Thus, FIG. 4 illustrates that from time 408 to time 410 the blanking signal can include an inter-frame blanking instruction 320 (blank both). As discussed, the inter-frame blanking instruction 320 can blank the inter-frame overlap (308, FIG. 3) from the user's view. - Next, during
state 306, the destination display device 310 will have transitioned past the inter-frame overlap and will uniquely display the right output video frame(s) 112. Thus, the video processing device 102 can send an appropriate blanking instruction 322 (blank left) to the blanking device(s) 312. Subsequently, the video processing device 102 can send another one or more left frames, another one or more right frames, etc. These frames can include new image data decoded into the frame buffers, or can include the same data sent previously (i.e., when increasing the frame-rate) in the output 3D content 108. - One will also appreciate that while
FIG. 4 illustrates a series of alternating left and right video frames (in any order), one or more implementations extend to any sequence of video frames. In one implementation, for example, the output 3D content 108 can include a plurality of left video frames followed by a plurality of right video frames. In another implementation, the output video content 108 can include only video frames intended for viewing with both eyes. In yet another implementation, the output 3D content 108 can include any combination of such video frames. - Furthermore, in some instances, the blanking
signal 136 can instruct the blanking device(s) 312 to blank an entire time period. In other instances, however, the blanking signal 136 can also instruct the blanking device(s) 312 to blank only a portion of a corresponding time period. Furthermore, the blanking signal 136 can instruct the blanking device(s) 312 to blank more than a corresponding time period. In addition, the blanking signal 136 can also include other blanking instructions, such as a blanking instruction that causes the blanking device to refrain from blanking. - One will appreciate in light of the disclosure herein that the blanking
signal 136 can include any appropriate sequence of blanking instructions that correspond to the output 3D content 108. Thus, for different output 3D content 108, the blanking signal 136 can include an appropriately different sequence of blanking instructions. Furthermore, the blanking signal 136 can depart from the illustrated implementations. For example, the blanking signal 136 can refrain from blanking during one or more time periods corresponding to a transition. Furthermore, the blanking signal 136 can include any number of other blanking instructions, such as blanking instructions that do no blanking (e.g., when displaying a video frame intended for viewing with both eyes). -
FIG. 5 illustrates a schematic diagram of a system 500 for sending 3D video content to lower frame-rate devices. FIG. 5 illustrates that the system 500 can include the video processing device 102, one or more blanking devices 312, and a destination display device 310. These devices can be separate or combined. For instance, in one or more implementations the video processing device 102 and the destination display device 310 are separate units, while in one or more other implementations these devices form a single unit. - In one or more implementations the
video processing device 102 receives the input 3D content 104 from a media device. The media device can comprise any number of devices capable of transmitting 3D video content to the video processing device 102. For example, FIG. 5 illustrates that the media device can comprise a streaming source 502 (e.g., a satellite box, cable box, the Internet), a gaming device (e.g., XBOX 504, PLAYSTATION 506), a player device (e.g., Blu-Ray player 506, DVD player 508) capable of reading media 512, and the like. Of course, the video processing device 102 can, itself, comprise one or more media devices. In this instance, the video receiver 114 can comprise one or more of the media devices illustrated in FIG. 5. - The
video processing device 102 can communicate with the destination display device 310 and the blanking device(s) 312 in any appropriate manner. For instance, an appropriate wired mechanism, such as HDMI, component, composite, coaxial, network, optical, and the like can couple the video processing device 102 and the destination display device 310 together. Additionally, or alternatively, an appropriate wireless mechanism, such as BLUETOOTH, Wi-Fi, etc., can couple the video processing device 102 and the destination display device 310 together. Likewise, any appropriate wired or wireless mechanism (e.g., BLUETOOTH, infrared, etc.) can couple the video processing device 102 and the blanking device(s) 312 together. - One will appreciate that the
video processing device 102 can generate any appropriate output signal comprising the output 3D content 108. For example, when the video processing device 102 and the destination display device 310 are coupled via a digital mechanism (e.g., HDMI), the video processing device 102 can generate a digital signal that includes the output 3D content 108. Alternatively, when the video processing device 102 and the destination display device 310 are coupled via an analog mechanism (e.g., component, composite or coaxial), the video processing device 102 can generate an analog signal that includes the output 3D content 108. - One will appreciate in view of the disclosure herein that the
video processing device 102 can take any of a variety of forms. For example, the video processing device 102 may be a set-top box or other customized computing system. The video processing device 102 may also be a general purpose computing system (e.g., a laptop computer, a desktop computer, a tablet computer, etc.). Alternatively, the video processing device 102 can be a special purpose computing system (e.g., a gaming console, a set-top box, etc.) that has been adapted to implement one or more disclosed features. - The
destination display device 310 can be any one of a broad range of display devices that incorporate a variety of display technologies, both current and future (e.g., Cathode Ray, Plasma, LCD, LED, OLED). Furthermore, the destination display device 310 can take any of a number of forms, such as a television set, a computer display (e.g., desktop computer monitor, laptop computer display, tablet computer display), a handheld display (e.g., cellular telephone, PDA, handheld gaming device, handheld multimedia device), or any other appropriate form. While the destination display device 310 can be a display device designed specifically to display 3D content, the destination display device 310 can also be a more traditional display device, such as a lower frame-rate device. One will appreciate in light of the disclosure herein that the destination display device 310 can include both digital and analog display devices. - The blanking device(s) 312 can be any blanking device(s) configured to interoperate with
video processing device 102 and to respond to one or more blanking instructions received via the blanking signal 136. In one or more implementations, the blanking device(s) 312 comprise shuttering components (314, 316) that include one or more liquid crystal layers. The liquid crystal layers can have the property of becoming opaque (or substantially opaque) when voltage is applied (or, alternatively, when voltage is removed). Otherwise, the liquid crystal layers can have the property of being transparent (or substantially transparent) when voltage is removed (or, alternatively, when voltage is applied). Thus, the blanking device(s) 312 can apply or remove voltage from the shuttering components to block the user's view, as instructed by the blanking signal. - Accordingly,
FIGS. 1-5 provide a number of components and mechanisms for sending 3D content to display devices synchronously with an inter-frame blanking signal. The 3D content is customized to particular destination display devices, and the inter-frame blanking signal can block inter-frame overlap from a user's view. Thus, one or more disclosed implementations allow for viewing of 3D content on a broad range of display devices, even when that content is not encoded for viewing on those devices. - Additionally, implementations of the present invention can also be described in terms of flowcharts comprising one or more acts in a method for accomplishing a particular result. Along these lines,
FIGS. 6-7 illustrate flowcharts of computerized methods of sending 3D content to a display device. For example, FIG. 6 illustrates a flowchart of a method of sending 3D content to a display device. Similarly, FIG. 7 illustrates a flowchart of a method of sending 3D content to a display device while synchronously sending an inter-frame blanking signal to a blanking device. The acts of FIGS. 6 and 7 are described herein below with respect to the schematics, diagrams, devices and components shown in FIGS. 1-5. - For example,
FIG. 6 shows that a method of sending 3D content to a display device can comprise an act 602 of sending first video frame(s) to a display device. Act 602 can include sending, to a display device, one or more first video frames that include a first image for viewing by a user's first eye. For example, the act can include the video processing device 102 transmitting output video frames 110 of output 3D content 108 to the destination display device 310 via the video transmitter 130. Also, as illustrated in FIG. 2, sending one or more first video frames can include sending a plurality of interlaced first video frames (e.g., video frames 110 a, 110 b) or sending a single progressive first video frame (e.g., video frame 110). Furthermore, sending the one or more first video frames can include sending the one or more first video frames at a frame-rate customized to the display device. -
FIG. 6 also shows that the method can comprise an act 604 of transmitting an inter-frame blanking signal to a blanking device. Act 604 can include transmitting an inter-frame blanking signal to a blanking device that instructs the blanking device to concurrently blank both the user's first eye and the user's second eye during a display of a transition during which at least a portion of the one or more first video frames and at least a portion of the one or more second video frames are to be displayed concurrently at the display device. For example, the act can include the video processing device sending the blanking instruction 320 (blank both) to the blanking device(s) 312 via the blanking signal 136. Of course, the blanking signal 136 can include a blanking instruction 320 that instructs the blanking device to blank both the user's first eye and the user's second eye during less than an entire display of the transition.
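Acts 602-606 can be pictured as an interleaved timeline of frame transmission and blanking instructions. The sketch below assumes a 60 Hz progressive display and a hypothetical 4 ms frame overlap interval; neither value comes from the disclosure, and the instruction strings are illustrative:

```python
# Sketch of one left/right cycle of acts 602-606: send left frame(s)
# with a blank-right instruction, emit blank-both for the (assumed)
# overlap interval at each transition, then send right frame(s) with
# blank-left. Times are in milliseconds; all values are illustrative.

FRAME_MS = 1000.0 / 60.0   # 60 Hz progressive display (assumed)
OVERLAP_MS = 4.0           # assumed frame overlap interval

def blanking_timeline(cycles=1):
    """Return (start_ms, instruction) events for each left/right cycle."""
    events, t = [], 0.0
    for _ in range(cycles):
        events.append((t, "BLANK_RIGHT"))            # left frame shown
        events.append((t + FRAME_MS, "BLANK_BOTH"))  # overlap at the switch
        events.append((t + FRAME_MS + OVERLAP_MS, "BLANK_LEFT"))  # right frame
        t += 2 * FRAME_MS
    return events

for start, instruction in blanking_timeline():
    print(round(start, 2), instruction)
```

For brevity the sketch blanks both eyes only at the left-to-right switch; a fuller model would also do so at the right-to-left switch that begins the next cycle.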
instructions 318 or 322 (in any order), for example. - Additionally,
FIG. 6 shows that the method can comprise an act 606 of sending second video frame(s) to the display device. Act 606 can include sending, to the display device, the one or more second video frames that include a second image for viewing by the user's second eye. For example, the act can include the video processing device 102 transmitting output video frames 112 of output 3D content 108 to the destination display device 310 via the video transmitter 130. Similar to act 602, sending one or more second video frames can include sending a plurality of interlaced second video frames (e.g., video frames 112 a, 112 b) or sending a single progressive second video frame (e.g., video frame 112). Furthermore, sending the one or more second video frames can include sending the one or more second video frames at a frame-rate customized to the display device. - Although not illustrated, the method can include any number of additional acts. For example, the method can include acts of generating the one or more first video frames and generating the one or more second video frames based on one or more physical characteristics of the display device, including a frame-rate and a frame size of the display device. Illustratively, the generating can include generating output video frames 110, 112 of the
output 3D contentdestination display device 310. As well, the method can include an act of generating the inter-frame blanking signal based on one or more physical characteristics of the display device, including an inter-frame overlap interval of the display device, which can be a time period corresponding to the display of the transition. - In addition to the foregoing,
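Because the inter-frame overlap interval is a physical characteristic of the display, the blank-both portion of the blanking signal can be derived from it. The sketch below assumes a simplified timing model in which each frame-to-frame transition is centered on a frame boundary and lasts for the overlap interval; the actual timing of blanking signal 136 would depend on the particular display.

```python
def blank_both_windows(frame_period_s, overlap_interval_s, num_transitions):
    """Return (start, end) times, in seconds, of the blank-both windows.

    Assumes each inter-frame transition straddles a frame boundary and
    lasts for the display's inter-frame overlap interval -- a simplified
    model for illustration, not the disclosed signal format.
    """
    windows = []
    for n in range(1, num_transitions + 1):
        boundary = n * frame_period_s  # time of the nth frame boundary
        windows.append((boundary - overlap_interval_s / 2.0,
                        boundary + overlap_interval_s / 2.0))
    return windows
```

For example, at 60 Hz (a frame period of 1/60th of a second) with a 4 ms overlap interval, each window is 4 ms wide and centered on a frame boundary.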
FIG. 7 illustrates a method of sending three-dimensional (3D) content to a display device while synchronously sending an inter-frame blanking signal to a blanking device. The method can comprise an act 702 of receiving a 3D input signal. Act 702 can include receiving a 3D input signal including one or more input video frames that include a first image for viewing by a user's first eye and a second image for viewing by the user's second eye. For example, the act can include the video processing device 102 receiving, via the video receiver 114, the input 3D content 104, which includes one or more input video frame(s) 106. In some instances, the one or more input video frames 106 comprise a single video frame (e.g., when the video frame encodes left and right image data using spatial compression or interleaving). In other instances, the one or more input video frames 106 comprise a plurality of video frames (e.g., when separate progressive or interlaced frames encode the left and right image data).
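For the single-frame case, one common spatial compression is a side-by-side layout, which could be separated as sketched below. The rows-of-pixels representation is an assumption for illustration; top-and-bottom or line-interleaved layouts would be split differently.

```python
def split_side_by_side(frame):
    """Split a side-by-side spatially compressed frame into two images.

    `frame` is a list of rows, each a list of pixel values; the left half
    of each row carries the first-eye image and the right half carries
    the second-eye image.
    """
    half = len(frame[0]) // 2
    first_image = [row[:half] for row in frame]
    second_image = [row[half:] for row in frame]
    return first_image, second_image
```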
FIG. 7 illustrates that the method can comprise an act 704 of determining capabilities of the display device. Act 704 can include determining frame-rate capabilities of a display device. For example, the act can include the video processing device 102 receiving physical characteristic information of the destination display device 310 via the input receiver 132. The physical characteristic information can include, for instance, frame-rate capabilities, frame size capabilities, frame overlap interval(s), etc. Thus, the act can also include determining frame size capabilities of the display device, or determining a frame overlap interval for the display device. Furthermore, the act can comprise receiving physical characteristic information (e.g., frame-rate capabilities) directly from the display device or via manual user input.
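The physical characteristic information gathered in act 704 might be modeled as a simple record, populated either from display-reported metadata or, failing that, from manual user input. The field and key names below are assumptions for illustration; real display metadata (e.g., EDID) has its own format.

```python
from dataclasses import dataclass

@dataclass
class DisplayCapabilities:
    """Physical characteristics relevant to generating 3D output."""
    frame_rate_hz: float        # frame-rate capability
    frame_width: int            # frame size capability (pixels)
    frame_height: int
    overlap_interval_s: float   # inter-frame overlap interval

def determine_capabilities(display_reported, user_input):
    """Prefer display-reported characteristics; fall back to user input.

    Both arguments are dicts with assumed keys; either may be None.
    """
    source = display_reported if display_reported is not None else user_input
    return DisplayCapabilities(
        frame_rate_hz=source["frame_rate"],
        frame_width=source["width"],
        frame_height=source["height"],
        overlap_interval_s=source.get("overlap_s", 0.0),
    )
```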
FIG. 7 also illustrates that the method can comprise an act 706 of generating a 3D output signal. Act 706 can include generating a 3D output signal for the display device, comprising one or more first output video frames including the first image and one or more second output video frames including the second image. For example, the act can include the video processing device 102 using the encoder 120 to encode a plurality of output video frames 110, 112 from the frame buffers.
FIG. 7 illustrates that the method can comprise an act 708 of transmitting the 3D output signal to the display device. Act 708 can include transmitting the 3D output signal to the display device at a frame-rate based on the determined frame-rate capabilities. For example, the act can include the video processing device 102 using the video transmitter 130 to send the output 3D content to the destination display device 310. To transmit at a specific frame-rate, the act can include sending each video frame for a specific time period appropriate for the frame-rate. For example, if the frame-rate is 60 Hz, the act can include sending each frame for 1/60th of a second.
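The per-frame time period follows directly from the frame-rate, and a transmitter loop can hold each frame for that period. The `send` and `wait` callables below are injected placeholders rather than disclosed components, so the pacing logic can be shown and tested without real video hardware.

```python
def frame_period_s(frame_rate_hz):
    """Time each frame is sent to achieve the target frame-rate;
    at 60 Hz this is 1/60th of a second, as in the example above."""
    return 1.0 / frame_rate_hz

def transmit_paced(frames, frame_rate_hz, send, wait):
    """Send each frame, then hold it for one full frame period."""
    period = frame_period_s(frame_rate_hz)
    for frame in frames:
        send(frame)   # e.g., hand the frame to the video transmitter
        wait(period)  # hold the frame on the link for its full period
    return period
```

In a real transmitter `wait` would be a hardware-timed delay; injecting it keeps the timing arithmetic observable.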
FIG. 7 also shows that the method can include an act 710 of transmitting a blanking instruction to a blanking device. Act 710 can include transmitting a blanking instruction to a blanking device which directs the blanking device to blank the user's view of the display device while the display device transitions between the one or more first output video frames and the one or more second output video frames. For example, the act can include the video processing device 102 transmitting the blanking signal 136 via the blanking signal transmitter 134. The blanking signal 136 can include a first blanking instruction (e.g., blanking instruction 320) which instructs the blanking device to blank both of a user's eyes. - Of course, the method can include transmitting any number of additional blanking instructions. For example, the method can include transmitting a second blanking instruction to the blanking device which directs the blanking device to blank the user's first eye view of the display device while the display device uniquely displays the one or more second output video frames (e.g., blanking instruction 318). The method can also include transmitting a third blanking instruction to the blanking device which directs the blanking device to blank the user's second eye view of the display device while the display device uniquely displays the one or more first output video frames (e.g., blanking instruction 322). The method can also include transmitting other blanking instructions, such as a blanking instruction which directs the blanking device to refrain from blanking.
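Acts 708 and 710 together imply an interleaved schedule of frame transmissions and blanking instructions. The sketch below emits that schedule as (event, detail) pairs; the event names are illustrative assumptions, and the instruction numbers in the comments refer to the examples above.

```python
def synchronized_schedule(num_cycles):
    """Build one event list interleaving frame output with the first,
    second, and third blanking instructions for each left/right cycle."""
    events = []
    for _ in range(num_cycles):
        events += [
            ("send", "first-output-frames"),
            ("blank", "second-eye"),   # third instruction (e.g., 322)
            ("blank", "both-eyes"),    # first instruction (e.g., 320)
            ("send", "second-output-frames"),
            ("blank", "first-eye"),    # second instruction (e.g., 318)
            ("blank", "both-eyes"),    # transition back to first frames
        ]
    return events
```

A real implementation would time each blanking event against the frame boundaries rather than emitting a flat list, but the ordering is the same.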
- Accordingly,
FIGS. 1-7 provide a number of components and mechanisms for sending 3D video content to a broad range of display devices. One or more disclosed implementations allow for viewing of 3D video content on a broad range of display devices, including devices that may have lower frame-rates and longer frame overlap intervals, or that are not otherwise specifically designed for displaying 3D video content. - The implementations of the present invention can comprise special purpose or general-purpose computing systems. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered computing systems, such as DVD players, Blu-Ray players, gaming systems, and video converters. In this description and in the claims, the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor.
- The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems. In its most basic configuration, a computing system typically includes at least one processing unit and memory. The memory may be physical system memory, which may be volatile, non-volatile, or some combination of the two. The term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory and/or storage capability may be distributed as well. As used herein, the term “module” or “component” can refer to software objects or routines that execute on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).
- Implementations of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
- Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
- Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
- The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/378,981 US20120140033A1 (en) | 2010-11-23 | 2011-03-10 | Displaying 3d content on low frame-rate displays |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US41670810P | 2010-11-23 | 2010-11-23 | |
PCT/US2011/025262 WO2012071063A1 (en) | 2010-11-23 | 2011-02-17 | Blanking inter-frame transitions of a 3d signal |
PCT/US2011/027175 WO2012071064A1 (en) | 2010-11-23 | 2011-03-04 | Formatting 3d content for low frame-rate displays |
PCT/US2011/027933 WO2012071066A1 (en) | 2010-11-23 | 2011-03-10 | Displaying 3d content on low frame-rate displays |
US13/378,981 US20120140033A1 (en) | 2010-11-23 | 2011-03-10 | Displaying 3d content on low frame-rate displays |
PCT/US2011/027981 WO2012071067A1 (en) | 2010-11-23 | 2011-03-10 | Shuttering the display of inter-frame transitions |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2011/025262 Continuation-In-Part WO2012071063A1 (en) | 2010-11-23 | 2011-02-17 | Blanking inter-frame transitions of a 3d signal |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120140033A1 true US20120140033A1 (en) | 2012-06-07 |
Family
ID=46146151
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/377,132 Expired - Fee Related US8553072B2 (en) | 2010-11-23 | 2011-02-17 | Blanking inter-frame transitions of a 3D signal |
US13/378,649 Abandoned US20120140032A1 (en) | 2010-11-23 | 2011-03-04 | Formatting 3d content for low frame-rate displays |
US13/378,981 Abandoned US20120140033A1 (en) | 2010-11-23 | 2011-03-10 | Displaying 3d content on low frame-rate displays |
US13/378,975 Abandoned US20120140051A1 (en) | 2010-11-23 | 2011-03-10 | Shuttering the display of inter-frame transitions |
US13/379,613 Abandoned US20120140034A1 (en) | 2010-11-23 | 2011-04-04 | Device for displaying 3d content on low frame-rate displays |
US13/379,317 Abandoned US20120147160A1 (en) | 2010-11-23 | 2011-04-14 | Adaptive 3-d shuttering devices |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/377,132 Expired - Fee Related US8553072B2 (en) | 2010-11-23 | 2011-02-17 | Blanking inter-frame transitions of a 3D signal |
US13/378,649 Abandoned US20120140032A1 (en) | 2010-11-23 | 2011-03-04 | Formatting 3d content for low frame-rate displays |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/378,975 Abandoned US20120140051A1 (en) | 2010-11-23 | 2011-03-10 | Shuttering the display of inter-frame transitions |
US13/379,613 Abandoned US20120140034A1 (en) | 2010-11-23 | 2011-04-04 | Device for displaying 3d content on low frame-rate displays |
US13/379,317 Abandoned US20120147160A1 (en) | 2010-11-23 | 2011-04-14 | Adaptive 3-d shuttering devices |
Country Status (2)
Country | Link |
---|---|
US (6) | US8553072B2 (en) |
WO (6) | WO2012071063A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10334223B2 (en) * | 2015-01-30 | 2019-06-25 | Qualcomm Incorporated | System and method for multi-view video in wireless devices |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110134218A1 (en) * | 2009-12-08 | 2011-06-09 | Darren Neuman | Method and system for utilizing mosaic mode to create 3d video |
US9494975B1 (en) * | 2011-03-28 | 2016-11-15 | Amazon Technologies, Inc. | Accessory device identification method |
JP5821259B2 (en) * | 2011-04-22 | 2015-11-24 | セイコーエプソン株式会社 | Image display system, image display device, 3D glasses, and image display method |
EP2706755A4 (en) * | 2011-05-27 | 2014-07-16 | Huawei Tech Co Ltd | Media transmission method, media reception method, client and system thereof |
US8724662B2 (en) * | 2012-06-25 | 2014-05-13 | Johnson & Johnson Vision Care, Inc. | Wireless communication protocol for low power receivers |
DE102012108685B4 (en) * | 2012-09-17 | 2017-02-16 | Karlheinz Gelhardt | Multi-converter for digital, high-resolution, stereoscopic video signals |
GB2508413A (en) * | 2012-11-30 | 2014-06-04 | Nordic Semiconductor Asa | Stereoscopic viewing apparatus and display synchronization |
US9873233B2 (en) | 2013-03-15 | 2018-01-23 | Johnson & Johnson Vision Care, Inc. | Ophthalmic lens viewing sets for three-dimensional perception of stereoscopic media |
EP2778747A3 (en) * | 2013-03-15 | 2014-11-26 | Johnson & Johnson Vision Care, Inc. | Ophthalmic lens viewing sets for three-dimensional perception of stereoscopic media |
US10194163B2 (en) | 2014-05-22 | 2019-01-29 | Brain Corporation | Apparatus and methods for real time estimation of differential motion in live video |
US9713982B2 (en) | 2014-05-22 | 2017-07-25 | Brain Corporation | Apparatus and methods for robotic operation using video imagery |
US9939253B2 (en) * | 2014-05-22 | 2018-04-10 | Brain Corporation | Apparatus and methods for distance estimation using multiple image sensors |
US9842551B2 (en) | 2014-06-10 | 2017-12-12 | Apple Inc. | Display driver circuitry with balanced stress |
US10057593B2 (en) * | 2014-07-08 | 2018-08-21 | Brain Corporation | Apparatus and methods for distance estimation using stereo imagery |
US10032280B2 (en) | 2014-09-19 | 2018-07-24 | Brain Corporation | Apparatus and methods for tracking salient features |
CN104994372B (en) * | 2015-07-03 | 2017-03-29 | 深圳市华星光电技术有限公司 | A kind of 3D display systems |
US10197664B2 (en) | 2015-07-20 | 2019-02-05 | Brain Corporation | Apparatus and methods for detection of objects using broadband signals |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070229395A1 (en) * | 2006-03-29 | 2007-10-04 | Nvidia Corporation | System, method, and computer program product for controlling stereo glasses shutters |
US20100026794A1 (en) * | 2008-07-30 | 2010-02-04 | Sin-Min Chang | Method, System and Apparatus for Multiuser Display of Frame-Sequential Images |
US20110032330A1 (en) * | 2009-06-05 | 2011-02-10 | Lg Electronics Inc. | Image display apparatus and method for operating the same |
Family Cites Families (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5821989A (en) * | 1990-06-11 | 1998-10-13 | Vrex, Inc. | Stereoscopic 3-D viewing system and glasses having electrooptical shutters controlled by control signals produced using horizontal pulse detection within the vertical synchronization pulse period of computer generated video signals |
JPH06350978A (en) | 1993-06-11 | 1994-12-22 | Sanyo Electric Co Ltd | Video signal converter |
US6057811A (en) * | 1993-09-28 | 2000-05-02 | Oxmoor Corporation | 3-D glasses for use with multiplexed video images |
JPH0879799A (en) | 1994-09-05 | 1996-03-22 | Sony Corp | Stereoscopic display system, and its synchronizing signal transmitter and synchronizing signal receiver |
US5572250A (en) | 1994-10-20 | 1996-11-05 | Stereographics Corporation | Universal electronic stereoscopic display |
US5610661A (en) | 1995-05-19 | 1997-03-11 | Thomson Multimedia S.A. | Automatic image scanning format converter with seamless switching |
JPH09139957A (en) | 1995-11-14 | 1997-05-27 | Mitsubishi Electric Corp | Graphic display device |
EP0774850B1 (en) * | 1995-11-16 | 2004-10-27 | Ntt Mobile Communications Network Inc. | Digital signal detecting method and detector |
US6088052A (en) | 1997-01-08 | 2000-07-11 | Recherches Point Lab Inc. | 3D stereoscopic video display system |
DE19806547C2 (en) | 1997-04-30 | 2001-01-25 | Hewlett Packard Co | System and method for generating stereoscopic display signals from a single computer graphics pipeline |
JPH1169384A (en) | 1997-08-08 | 1999-03-09 | Olympus Optical Co Ltd | Video signal type decision processor |
JP3448467B2 (en) | 1997-09-19 | 2003-09-22 | 三洋電機株式会社 | LCD shutter glasses driving device |
KR100381817B1 (en) | 1999-11-17 | 2003-04-26 | 한국과학기술원 | Generating method of stereographic image using Z-buffer |
DE10016074B4 (en) | 2000-04-01 | 2004-09-30 | Tdv Technologies Corp. | Method and device for generating 3D images |
JP2004511824A (en) * | 2000-10-12 | 2004-04-15 | レベオ, インコーポレイティッド | Digital light processing 3D projection system and method |
US20020070932A1 (en) | 2000-12-10 | 2002-06-13 | Kim Jesse Jaejin | Universal three-dimensional graphics viewer for resource constrained mobile computers |
US20040218269A1 (en) | 2002-01-14 | 2004-11-04 | Divelbiss Adam W. | General purpose stereoscopic 3D format conversion system and method |
CA2380105A1 (en) * | 2002-04-09 | 2003-10-09 | Nicholas Routhier | Process and system for encoding and playback of stereoscopic video sequences |
US7511714B1 (en) | 2003-11-10 | 2009-03-31 | Nvidia Corporation | Video format conversion using 3D graphics pipeline of a GPU |
US20090051759A1 (en) * | 2005-05-27 | 2009-02-26 | Adkins Sean M | Equipment and methods for the synchronization of stereoscopic projection displays |
CN101001320A (en) | 2006-01-12 | 2007-07-18 | 万里科技股份有限公司 | System of automatic 3D image generating and automatic image format conversion |
JP2007200116A (en) | 2006-01-27 | 2007-08-09 | Manri Kagi Kofun Yugenkoshi | System for 3d image automatic generation and image format automatic conversion |
EP1999740A4 (en) | 2006-03-29 | 2009-11-11 | Nvidia Corp | System, method, and computer program product for controlling stereo glasses shutters |
JP2007324830A (en) * | 2006-05-31 | 2007-12-13 | Toshiba Corp | Frame rate converting device, and frame rate converting method |
US8717348B2 (en) * | 2006-12-22 | 2014-05-06 | Texas Instruments Incorporated | System and method for synchronizing a viewing device |
US20090207167A1 (en) * | 2008-02-18 | 2009-08-20 | International Business Machines Corporation | Method and System for Remote Three-Dimensional Stereo Image Display |
JP5338166B2 (en) | 2008-07-16 | 2013-11-13 | ソニー株式会社 | Transmitting apparatus, stereoscopic image data transmitting method, receiving apparatus, and stereoscopic image data receiving method |
JP4606502B2 (en) | 2008-08-07 | 2011-01-05 | 三菱電機株式会社 | Image display apparatus and method |
CA2684513A1 (en) * | 2008-11-17 | 2010-05-17 | X6D Limited | Improved performance 3d glasses |
JP2010139855A (en) | 2008-12-12 | 2010-06-24 | Sharp Corp | Display device, method for controlling display device, and control program |
US8233035B2 (en) * | 2009-01-09 | 2012-07-31 | Eastman Kodak Company | Dual-view stereoscopic display using linear modulator arrays |
TWI408947B (en) * | 2009-02-13 | 2013-09-11 | Mstar Semiconductor Inc | Image adjusting apparatus and image adjusting method |
US20110001805A1 (en) * | 2009-06-18 | 2011-01-06 | Bit Cauldron Corporation | System and method of transmitting and decoding stereoscopic sequence information |
JP5273478B2 (en) | 2009-07-07 | 2013-08-28 | ソニー株式会社 | Video display device and video display system |
KR20110040378A (en) * | 2009-10-14 | 2011-04-20 | 삼성전자주식회사 | Image providing method and image providing apparatus, display apparatus and image providing system using the same |
US20110090324A1 (en) * | 2009-10-15 | 2011-04-21 | Bit Cauldron Corporation | System and method of displaying three dimensional images using crystal sweep with freeze tag |
WO2011052918A2 (en) | 2009-10-30 | 2011-05-05 | Samsung Electronics Co., Ltd. | Two-dimensional/three-dimensional image display apparatus and method of driving the same |
KR101659575B1 (en) | 2009-10-30 | 2016-09-26 | 삼성전자주식회사 | Display apparatus for both 2D and 3D image and method of driving the same |
US9179136B2 (en) * | 2009-11-20 | 2015-11-03 | Broadcom Corporation | Method and system for synchronizing 3D shutter glasses to a television refresh rate |
CN101765024A (en) | 2009-12-22 | 2010-06-30 | 周晓民 | 3D stereoscopic video signal converter and technical application thereof |
TWI420503B (en) * | 2010-02-04 | 2013-12-21 | Chunghwa Picture Tubes Ltd | Three dimensional display |
KR20110102758A (en) * | 2010-03-11 | 2011-09-19 | 삼성전자주식회사 | 3-dimension glasses, rechargeable cradle, 3-dimension display apparatus and system for charging 3-dimension glasses |
US20120050462A1 (en) * | 2010-08-25 | 2012-03-01 | Zhibing Liu | 3d display control through aux channel in video display devices |
US20120050154A1 (en) * | 2010-08-31 | 2012-03-01 | Adil Jagmag | Method and system for providing 3d user interface in 3d televisions |
2011
- 2011-02-17 WO PCT/US2011/025262 patent/WO2012071063A1/en active Application Filing
- 2011-02-17 US US13/377,132 patent/US8553072B2/en not_active Expired - Fee Related
- 2011-03-04 WO PCT/US2011/027175 patent/WO2012071064A1/en active Application Filing
- 2011-03-04 US US13/378,649 patent/US20120140032A1/en not_active Abandoned
- 2011-03-10 WO PCT/US2011/027981 patent/WO2012071067A1/en active Application Filing
- 2011-03-10 US US13/378,981 patent/US20120140033A1/en not_active Abandoned
- 2011-03-10 US US13/378,975 patent/US20120140051A1/en not_active Abandoned
- 2011-03-10 WO PCT/US2011/027933 patent/WO2012071066A1/en active Application Filing
- 2011-04-04 US US13/379,613 patent/US20120140034A1/en not_active Abandoned
- 2011-04-04 WO PCT/US2011/031115 patent/WO2012071072A1/en active Application Filing
- 2011-04-14 US US13/379,317 patent/US20120147160A1/en not_active Abandoned
- 2011-04-14 WO PCT/US2011/032549 patent/WO2012071073A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20120140034A1 (en) | 2012-06-07 |
US20120140031A1 (en) | 2012-06-07 |
US20120140032A1 (en) | 2012-06-07 |
WO2012071072A1 (en) | 2012-05-31 |
US20120147160A1 (en) | 2012-06-14 |
WO2012071063A1 (en) | 2012-05-31 |
WO2012071066A1 (en) | 2012-05-31 |
WO2012071067A1 (en) | 2012-05-31 |
US20120140051A1 (en) | 2012-06-07 |
WO2012071073A1 (en) | 2012-05-31 |
WO2012071064A1 (en) | 2012-05-31 |
US8553072B2 (en) | 2013-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20120140033A1 (en) | Displaying 3d content on low frame-rate displays | |
US11843759B2 (en) | Systems and method for virtual reality video conversion and streaming | |
WO2016098411A1 (en) | Video display device, video display system, and video display method | |
JP2011090079A (en) | Display device, display method and computer program | |
US20110254829A1 (en) | Wearable electronic device, viewing system and display device as well as method for operating a wearable electronic device and method for operating a viewing system | |
US20090002482A1 (en) | Method for displaying three-dimensional (3d) video and video apparatus using the same | |
WO2012058236A1 (en) | Methods and systems for presenting adjunct content during a presentation of a media content instance | |
JP5702063B2 (en) | Display device, display method, and computer program | |
JP2012070032A (en) | Display device and display method | |
WO2013044733A1 (en) | Display method and display device | |
EP2547114A2 (en) | Display apparatus and method for displaying 3D image thereon | |
US8547418B2 (en) | Method and system for processing and displaying video in three dimensions using a liquid crystal display | |
CN101193322B (en) | 3D video display method and display system using this method | |
US9137522B2 (en) | Device and method for 3-D display control | |
JP2011239148A (en) | Video display apparatus, video display method and computer program | |
JP5367031B2 (en) | Information processing method and information processing apparatus | |
JP2014027351A (en) | Image processing device, image processing method, and program | |
JP2011103504A (en) | Stereoscopic display device, stereoscopic display method and stereoscopic display system | |
WO2011034497A2 (en) | Shutter glass controller, shutter glass apparatus and display apparatus | |
KR101342784B1 (en) | Device for displaying stereoscopic image, and method thereof | |
KR101357373B1 (en) | Device for displaying stereoscopic image, and method thereof | |
KR101671033B1 (en) | Apparatus, method and recording medium for displaying 3d video | |
JP5341942B2 (en) | Video playback device | |
US20150156483A1 (en) | Providing a capability to simultaneously view and/or listen to multiple sets of data on one or more endpoint device(s) associated with a data processing device | |
JP2014002308A (en) | Video processing device, video processing method and computer program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CIRCA3D, LLC, UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABOR, TIMOTHY A.;REEL/FRAME:025936/0986 Effective date: 20110310 |
|
AS | Assignment |
Owner name: CIRCA3D, LLC, UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABOR, TIMOTHY A.;REEL/FRAME:026136/0387 Effective date: 20110128 |
|
AS | Assignment |
Owner name: CIRCA3D, LLC, UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABOR, TIMOTHY A.;REEL/FRAME:027400/0573 Effective date: 20110310 |
|
AS | Assignment |
Owner name: CIRCA3D, LLC, UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABOR, TIM;REEL/FRAME:027571/0323 Effective date: 20110128 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |