US20170359607A1 - Distributed and synchronized media switching - Google Patents
Distributed and synchronized media switching
- Publication number
- US20170359607A1 (application US15/179,476)
- Authority
- US
- United States
- Prior art keywords
- variant
- time
- media
- media item
- switchover
- Prior art date
- 2016-06-10
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/75—Media network packet handling
- H04L65/752—Media network packet handling adapting media to network capabilities
-
- H04L65/4084—
-
- H04L65/602—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/61—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
- H04L65/612—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for unicast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/75—Media network packet handling
- H04L65/762—Media network packet handling at the source
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/80—Responding to QoS
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/233—Processing of audio elementary streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/23439—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements for generating different versions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/24—Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
- H04N21/2402—Monitoring of the downstream path of the transmission network, e.g. bandwidth available
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/242—Synchronization processes, e.g. processing of PCR [Program Clock References]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/266—Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
- H04N21/2662—Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/433—Content storage operation, e.g. storage operation in response to a pause request, caching operations
- H04N21/4331—Caching operations, e.g. of an advertisement for later insertion during playback
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/43615—Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/633—Control signals issued by server directed to the network components or client
- H04N21/6332—Control signals issued by server directed to the network components or client directed to client
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8106—Monomedia components thereof involving special audio data, e.g. different tracks for different languages
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Databases & Information Systems (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
- HTTP Live Streaming (HLS) is a media streaming protocol which is adaptive to multi-bit media playback. Media may exist in multiple quality tiers which can be rendered across multiple devices depending on a variety of circumstances. When circumstances change, it may be desirable to change an output from one quality tier to another. When using multiple output devices, it is often desirable to have synchronized rendering of common media distributed across each device. Televisions, speakers, computers, and other devices may render the same media or portions of the same media at the same time, but synchronization can be lost, for example, when switching from one quality tier to another.
- Some media rendering systems and methods may fail to properly render media as bitrates for the connections delivering the media change, system or device parameters change, or other reasons causing a desired change in media format or quality. Such failures can lead to unsynchronized media output. Unsynchronized media output could result in audio and video playing at different times across multiple devices, causing poor mixing of sound, sound not synchronized with corresponding video, spoilers of video output on multiple devices, and other undesirable outcomes.
- Accordingly, what is needed is a system and method for distributed and synchronized media switching.
- FIG. 1 illustrates a system for switching and synchronizing media output, according to an aspect of the disclosure.
- FIG. 2 is a functional block diagram of a device for switching and synchronizing media output, according to an aspect of the disclosure.
- FIG. 3 shows a method for switching and synchronizing media output, according to an aspect of the disclosure.
- FIG. 4 illustrates a possible use case for the system of FIG. 1, according to an aspect of the disclosure.
- The disclosure includes a method of switching media output, the method including receiving a first variant of a media item with a player, transmitting the first variant to a secondary device, and upon determining a change in operating conditions, switching from the first variant of the media item to a second variant of the media item by estimating a time to perform the switch to the second variant, transmitting to the secondary device a notification of a time to switch from the first variant to the second variant, and transmitting the second variant to the secondary device. The systems and methods disclosed herein allow for synchronized switching of different media variants.
- FIG. 1 illustrates a system 100 for switching and synchronized media output. The system 100 may include a distribution server 110, a primary device (such as a player) 120, and one or more secondary devices 130 provided in communication by a network 140.
- The distribution server 110 may store or have access to media items such as video, audio, images, and the like, which can be stored in one or more databases 112 associated with the distribution server 110. A single media item 114 may include different coded variants (ver. 116.1-116.N) of the media, each variant representing source media content having different types of coding applied thereto. For example, the different variants 1-N may represent source media using different coded bitrates (which often impose different qualities to recovered video), different frame sizes, or different frame rates, among other characteristics. The variants may be parsed into “chunks” of predetermined duration, which may be requested individually by devices 120, 130 and delivered by the distribution server 110. Thus, if a given device 120, 130 experiences changing conditions in the network 140 or resource constraints at the device 120, 130, the device 120, 130 may request another variant that has lower bitrate or decoding complexity. Alternatively, the device 120, 130 may request a higher-bitrate variant of a media item 114, which typically yields recovered video data of higher quality.
- The distribution server 110 may also store a manifest file 118 for each media item 114 which describes the available variants (ver. 1-N) stored for the media item 114. The distribution server 110 may deliver the manifest file 118 to a player 120 and/or secondary device 130 on request, prior to delivery of chunks. Based on this information, the player 120 may select a particular variant to receive over the network 140. The selection of the variant may be made by a user or by the player 120 in response to a user command or changing condition (e.g., changes in network bandwidth or processor availability of the device). When a request for a new or different variant of the media is issued, the player 120 or the distribution server 110 may determine how to transition from the current variant being transmitted to the player 120 to the requested variant. In making the determination, the player 120 or the distribution server 110 may select which channels and chunks of the variant should be sent to the player 120.
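- As an illustrative aside (not taken from the patent text), the following Swift sketch shows one way a player could model a parsed manifest and pick a variant against a measured link bitrate; the Variant and Manifest types, their field names, and the 20% headroom are assumptions, not details from the disclosure.

```swift
// Hypothetical model of a parsed manifest: each variant advertises the
// bitrate it needs; chunks are requested individually once a variant is chosen.
struct Variant {
    let id: Int                       // e.g. 1...N, mirroring "ver. 116.1-116.N"
    let bitrateBitsPerSecond: Double
    let frameWidth: Int
    let frameHeight: Int
}

struct Manifest {
    let mediaItemID: String
    let variants: [Variant]
}

/// Pick the highest-bitrate variant that fits within the measured link
/// capacity, leaving some headroom (an assumed 20% safety margin).
/// Falls back to the lowest-bitrate variant if nothing fits.
func selectVariant(from manifest: Manifest,
                   measuredBitsPerSecond: Double,
                   headroom: Double = 0.8) -> Variant? {
    let budget = measuredBitsPerSecond * headroom
    let sorted = manifest.variants.sorted { $0.bitrateBitsPerSecond < $1.bitrateBitsPerSecond }
    return sorted.last(where: { $0.bitrateBitsPerSecond <= budget }) ?? sorted.first
}

// Example: a 6 Mbit/s link with an assumed three-variant manifest.
let manifest = Manifest(mediaItemID: "item-114", variants: [
    Variant(id: 1, bitrateBitsPerSecond: 1_500_000, frameWidth: 640,  frameHeight: 360),
    Variant(id: 2, bitrateBitsPerSecond: 4_000_000, frameWidth: 1280, frameHeight: 720),
    Variant(id: 3, bitrateBitsPerSecond: 8_000_000, frameWidth: 1920, frameHeight: 1080),
])
let chosen = selectVariant(from: manifest, measuredBitsPerSecond: 6_000_000)  // -> variant 2
```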
- The player 120 may be a user device such as a computer, phone, tablet, stereo, television, speaker and receiver adapter, or any type of device or controller able to transmit and/or render media. The player 120 may determine, based on a bitrate between the player 120 and network 140 and/or between the player 120 and any secondary device 130, which variant of media to render. For example, a higher bitrate may allow rendering a higher quality variant of the media than a lower bitrate may allow. Changing bitrates can cause the player 120 to switch variants of the media to render. Media may be rendered at the player 120 and/or any secondary devices 130 associated with the player 120. The player 120 may receive media from the distribution server 110 and transmit it to the secondary devices 130. Alternatively, the player 120 and secondary device 130 may be integrated into a common unit such as a computer, phone, tablet, or the like. In another alternative, the player 120 may be a “silent primary” device which does not render any media, but transmits media to the secondary devices 130 to be rendered, and which controls the switching of media variants.
- Secondary devices 130.1-130.M may be speakers, televisions, tablets, smartphones, and other devices capable of rendering media item 114. A secondary device 130 may communicate directly with the network 140 or may be controlled by a player 120. Different secondary devices may connect directly or indirectly with the network 140. For example, secondary devices 130.1 and 130.2 may be completely controlled by the player 120 while secondary device 130.M may communicate directly with the network 140.
- The distribution server 110 and the player 120 may connect to the network 140 and/or to each other via a communication channel. The network 140 may include the network time reference 145, which may be available to each player 120 and secondary device 130. The player 120 may connect to the distribution server 110 via the network 140 to receive the manifest file 118 and variants of the media items available at the distribution server 110.
- In another aspect, multiple devices may communicate using peer-to-peer connections. For example, multiple phones, tablets, or computers may communicate with each other when no WiFi connection is available. In such an example, one device 120 may be the player 120 distributing content to the other devices which serve as secondary devices 130.
- FIG. 2 is a functional block diagram of a device 200 according to an embodiment of the disclosure. The device 200 could be a player 120 or secondary device 130 (FIG. 1). The device 200 may include a processing system 280, memory system 220, display 230, transceiver (TX/RX) 240, and input/output (I/O) units 250.
- The processing system 280 may control operation of the device 200 by causing the device 200 to interact with other entities, such as players 120 and/or secondary devices 130 (FIG. 1), to synchronize playback. The memory system 220 may store instructions that the processing system 280 may execute and also may store media item data generated therefrom. The architecture of the processing system 280 may include a central processing unit; it may also include graphics processors, digital signal processors, and application specific integrated circuits (not shown) as may be suitable for individual media item 114 needs. The architecture of the memory system 220 may be suitable for individual media item 114 needs. The architecture of the memory system 220 may also vary from device to device. Typically, the memory system 220 may include one or more electrical, optical, and/or magnetic storage devices (not shown). The memory system 220 may be distributed throughout the processing system 280. For example, the memory system 220 may include a cache memory provided on a common integrated circuit with a central processor of the processing system 280. The memory system 220 also may include a random access main memory coupled to the processing system 280 via a memory controller, and it may also include non-volatile memory device(s) for long-term storage.
- The processing system 280 may execute a variety of programs during operation, including an operating system 210 and one or more media items 114. For example, the device 200 may execute an item rendering application 272 and possibly other applications. The item rendering application 272 may manage download, decoding, and synchronized output of media item 114. The item rendering application 272 may define a set of synchronization controls 278 for management of the application. Thus, synchronization controls may vary according to the output use case for which the device 200 is applied.
- FIG. 3 shows a method 300 for synchronizing media switches, according to an embodiment of the present disclosure. A player 120 may download a first variant of a media item 114 from the distribution server 110 (box 302). The player 120 may decode and, if appropriate to the player's type, render the media item locally (box 304). The player 120 also may transmit the coded first variant of the media item 116.1 to a secondary device 130 that is to play the media item (box 306). In parallel, the secondary device 130 may receive the coded media data from the player 120 and may cache the first variant of the media item 116.1 in a memory (box 308). The secondary device 130 may decode received coded data from the player 120 (box 310) and render the media (box 312). The player 120 and secondary device 130 may repeat operation of boxes 304 and 308-312 until a switchover event occurs.
- A switchover event may occur when the player 120 decides to switch to another variant of the media (box 314). When the player 120 decides to switch to a second variant of the media (e.g., 116.2), it may estimate a time at which the switchover is to be performed (box 316) and may communicate the time to the secondary device(s) (msg. 318). The player 120 may request (box 322) and begin a download of the next variant from the distribution server 110 (box 324). Upon receiving the next variant, the player 120 may decode and render that variant locally (box 326). The player 120 also may transmit the coded second variant of the media item 116.2 to a secondary device 130 that is to play the media item (box 328).
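- A rough, non-authoritative sketch of the player-side flow around boxes 314-328 follows; the SwitchoverCommand type, the SecondaryDeviceLink protocol, and the fixed two-second lead time are invented for illustration and are not part of the claimed method.

```swift
import Foundation

// Hypothetical switchover announcement, loosely modeled on "msg. 318":
// it carries a network time at which to switch and the media time that
// should be playing at that instant.
struct SwitchoverCommand {
    let switchAtNetworkTime: TimeInterval   // seconds on the shared network clock
    let mediaTimeAtSwitch: TimeInterval     // position within the media item
    let newVariantID: Int
}

protocol SecondaryDeviceLink {
    func send(_ command: SwitchoverCommand)
    func send(chunk: Data, variantID: Int)
}

final class Player {
    var secondaries: [SecondaryDeviceLink] = []
    var networkClock: () -> TimeInterval = { Date().timeIntervalSince1970 }

    /// Boxes 314-318: decide to switch, estimate when, and tell every secondary.
    /// The fixed 2-second lead time stands in for an estimate that would
    /// account for download, decode, and buffering latency (1x playback assumed).
    func beginSwitch(to newVariantID: Int, currentMediaTime: TimeInterval) -> SwitchoverCommand {
        let leadTime: TimeInterval = 2.0
        let command = SwitchoverCommand(
            switchAtNetworkTime: networkClock() + leadTime,
            mediaTimeAtSwitch: currentMediaTime + leadTime,
            newVariantID: newVariantID)
        secondaries.forEach { $0.send(command) }   // msg. 318
        return command
    }

    /// Boxes 322-328: after requesting the new variant, forward its chunks.
    func forward(chunk: Data, ofVariant variantID: Int) {
        secondaries.forEach { $0.send(chunk: chunk, variantID: variantID) }
    }
}
```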
- As mentioned, the secondary device 130 may repeat the operation of boxes 308-312 until a switchover event occurs, which is represented by the switchover command message 318. When the secondary device 130 determines that a switchover command has been received (box 320), it may begin to receive coded media generated from the new variant of the media item (msg. 328). The secondary device 130 may receive and store the second variant in a cache (box 330). The secondary device 130 may determine whether the switchover time has been reached (box 332). If the switchover time has not been reached, the secondary device 130 may render the cached data of the first variant of the media item (box 334). After the switchover time has been reached, the secondary device 130 may decode the coded data of the second variant 116.2 (box 336), and render the cached data of the second variant of the media item 116.2 (box 338).
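- The corresponding secondary-device behavior (boxes 320 and 330-338) might look like the sketch below; the cache layout, clock closure, and render callback are assumptions made only for illustration.

```swift
import Foundation

/// Minimal model of the secondary device's cache and render loop: buffer both
/// variants, play the first until the announced switchover time on the shared
/// network clock, then play the second. Decoding details are elided.
final class SecondaryDevice {
    private var firstVariantCache: [Data] = []
    private var secondVariantCache: [Data] = []
    private var switchAtNetworkTime: TimeInterval?

    var networkClock: () -> TimeInterval = { Date().timeIntervalSince1970 }
    var render: (Data) -> Void = { _ in /* decode + output elided */ }

    // Box 320 / msg. 318: remember when to switch.
    func receiveSwitchoverCommand(switchAt: TimeInterval) {
        switchAtNetworkTime = switchAt
    }

    // Boxes 308 and 330: cache whichever variant is arriving.
    func receive(chunk: Data, isSecondVariant: Bool) {
        if isSecondVariant { secondVariantCache.append(chunk) }
        else { firstVariantCache.append(chunk) }
    }

    // Boxes 332-338: called once per chunk interval by a playback timer.
    func renderNextChunk() {
        let switchoverReached = switchAtNetworkTime.map { networkClock() >= $0 } ?? false
        if switchoverReached, !secondVariantCache.isEmpty {
            render(secondVariantCache.removeFirst())      // boxes 336/338
        } else if !firstVariantCache.isEmpty {
            render(firstVariantCache.removeFirst())       // box 334
        }
    }
}
```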
- In one aspect, the determination to switch to another variant of the media item 114 may be made based on a variety of operating factors. For example, the player 120 may determine to switch variants based on a change in communication bitrate between the player 120 and the network 140 and/or in the connection between the player 120 and any secondary devices 130; if the player 120 detects that the bitrate connection has dropped to a level insufficient to support the rendering of the first variant of the media data 116.1, it may switch to a lower bitrate. Similarly, if processing resources at the player 120 (or a secondary device) change due to the start or conclusion of other processes executing on the player 120, the player 120 may switch to a variant that is a better match to the new level of processing resources that are available for decoding.
- In another aspect, the message 318 may identify a network time, as established with a network time reference 145, when the switchover is to be performed, and may identify a media time that is to be played at the switchover time. In another aspect, the secondary devices 130 may use the media time and shared time identifiers to correlate a point in the media item 114 to a network time. This correlation, used in conjunction with the playback rate, may allow the secondary device 130 to identify which elements of the media item 114 are to be rendered in the future.
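- To make the correlation concrete, the following sketch (illustrative only; the anchor-pair representation and function names are assumptions) maps a network time to a media position using one (network time, media time) pair from the switchover message plus the playback rate.

```swift
/// An anchor pairing a shared network clock reading with the media position
/// that should be playing at that instant, as conveyed in a message like 318.
struct TimeAnchor {
    let networkTime: Double   // seconds on the shared network clock
    let mediaTime: Double     // seconds into the media item
}

/// Given the anchor and a playback rate (1.0 = normal speed), compute which
/// point of the media item should be rendered at any future network time.
func mediaTime(at networkTime: Double, anchor: TimeAnchor, playbackRate: Double = 1.0) -> Double {
    return anchor.mediaTime + (networkTime - anchor.networkTime) * playbackRate
}

// Example: the switchover message says media time 42.0 s plays at network
// time 1000.0 s; half a second later the device should be at 42.5 s.
let anchor = TimeAnchor(networkTime: 1000.0, mediaTime: 42.0)
let position = mediaTime(at: 1000.5, anchor: anchor)   // 42.5
```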
- In another embodiment, the switchover command may include a notification of the time at which the secondary device 130 may begin receiving a second variant of the media (e.g., 116.2) and at which to begin rendering the second variant of the media. The switch time may be estimated based on a network-to-media time translation, and may use an algorithm to determine, for example, where on an audio ramp curve to execute the switch to another media variant. The switching of media variants may be executed as a crossfade from one variant to another, meaning the first media variant may be scheduled to ramp down during a period when another media variant is scheduled to ramp up. The secondary devices 130 may send an acknowledgment to the player 120, or the switching may be done with brute force by the player 120 without any acknowledgment or handshaking between the player 120 and any secondary devices 130.
- In an embodiment, the second variant of the media item may be rendered at zero volume until the switchover time and/or until any remaining first variant of the media in the memory has been rendered. Alternatively, the first variant may be rendered at a decreasing volume while the second variant is rendered at an increasing volume until the switchover time is reached (box 332), and only the second variant is rendered (box 336). The time when the second variant media begins to be rendered at a positive volume may be associated with the network time reference 145.
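- A simple linear crossfade consistent with this description could compute per-variant gains as below; the ramp duration, the linear shape, and the clamping are illustrative assumptions (the disclosure mentions an audio ramp curve without fixing its form).

```swift
/// Linear crossfade gains around a switchover time: the first variant ramps
/// from 1 to 0 and the second from 0 to 1 over `rampDuration` seconds ending
/// at `switchTime`. Before the ramp the first variant plays alone; after the
/// switchover time only the second variant is audible.
func crossfadeGains(now: Double, switchTime: Double, rampDuration: Double)
    -> (firstVariantGain: Double, secondVariantGain: Double) {
    let rampStart = switchTime - rampDuration
    if now <= rampStart { return (1.0, 0.0) }
    if now >= switchTime { return (0.0, 1.0) }
    let progress = (now - rampStart) / rampDuration   // 0...1 within the ramp
    return (1.0 - progress, progress)
}

// Example: with a 1-second ramp ending at network time 1000.0,
// halfway through the ramp both variants play at half volume.
let gains = crossfadeGains(now: 999.5, switchTime: 1000.0, rampDuration: 1.0)  // (0.5, 0.5)
```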
- In another embodiment, the player 120 and/or secondary devices 130 may not immediately render the second variant of the media item. For example, the secondary devices 130 may begin to receive the second variant media 116.2 from the player 120 while there is still some amount of first variant media 116.1 stored in memory of the secondary devices 130. In addition, the player 120 may wait enough time for the switchover so that the secondary devices 130 may render the remaining first variant media 116.1. The switchover time may also allow the secondary devices 130 enough time to store some amount of second variant media in a memory before rendering the second variant media 116.2 at all or at a volume greater than zero.
- FIG. 4 illustrates a possible use case 400 for system 100 (FIG. 1). The use case 400 may include the player 120 and multiple secondary devices 130 arranged in different locations, according to an aspect of the disclosure. In one aspect, the player 120 is the primary device which receives all media to be rendered from the distribution server 110, and controls the rendering of the media at the secondary devices 130 by transmitting the media individually to each secondary device 130. In one aspect, the player 120 may be a control node such as an Apple TV, an iPhone, a stereo receiver, a smart TV, a wireless transceiver device, or any other device capable of sending and receiving media.
- The secondary devices 130 may be speakers, televisions, tablets, smart phones, or other output devices, and may connect wirelessly to the player 120 via an area connection, which could be Wi-Fi, Wide Area Network, Bluetooth, or the like. The player 120 may transmit the media to each secondary device 130 in various locations and in a manner which produces a synchronized output in each location of a secondary device 130. The player 120 may send coded data to the secondary devices 130 or may decode the data before sending it to the secondary devices 130. When the player 120 sends coded data to the secondary device 130, the secondary device 130 may decode the data with a decoder before rendering the data.
- In one embodiment, the secondary devices 130 may all render the same, complete media output. For example, if the media output is music, each secondary device 130 could be in a different room playing the same song so that the listener may walk from room to room while hearing a continuous output of the song. Likewise, if an output is paused as a listener walks from one room to another with a secondary device 130, when audio play recommences, the output will be at the same point as it was in another room when it was paused. Alternatively, each secondary device could be rendering a different portion of the media. For example, one secondary device 130 could be rendering drums while another renders a guitar, simulating a synchronized live experience in which each secondary device 130 is playing different instruments or sounds of the same song at the same time. Such an aspect may represent a surround sound output.
- Similarly, in another example, the media could be both audio and video. If a user walks from one room with a television to another, he/she may pause the media rendering while walking to the other room, and recommence output at another secondary device 130 without missing or repeating any media output. Similarly, a mix of video and audio may be synchronized. For example, at least one of the secondary devices 130 could be a video display such as a television, and at least one of the secondary devices 130 could be an audio output such as a speaker. In such an aspect, a user may be watching the video on one secondary device 130 in one room, and then walk to another room with a secondary device 130 functioning as a speaker only. The audio for the speaker should synchronize with the video and/or audio of the television in the other room. Such aspects may allow a viewer to avoid seeing or hearing an output on any secondary device 130 before any other secondary device 130.
- In another aspect, the player 120 may control multiple secondary devices 130 which may render the media in sync. For example, the player 120 may be an Apple TV device which may output to one or more output displays in a bar, gym, conference room, stadium, airplane, or other location with multiple audio and/or video outputs. To avoid asynchronous output from any combination of secondary devices 130, the player 120 may use an adaptive bitrate which it may control. The player 120 may therefore alter the bitrate to ensure that each secondary device 130 is able to render the media at the exact same time. In addition, using the player 120 as a controller for multiple secondary devices 130 may have the benefit of reducing the number of connections to the distribution server 110 required to produce multiple outputs on various secondary devices 130. In such an aspect, the player 120 may alter the bitrate based on network bandwidth, latency, output capabilities, and other system parameters so that no secondary device 130 is rendering the media before or after any other secondary device 130. Likewise, playback may be synchronized across each secondary device 130. For example, if audio is paused at the player 120, it will be paused simultaneously at each secondary device 130. If playback continues, it will recommence simultaneously at each secondary device 130. Another example of such synchronization may involve enhanced playback functionality like replay, skipping forward and backward, rendering icons, widgets, and other add-ons for each secondary device 130 at the same time.
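- One way to read the controller's bitrate choice, shown here as an illustrative sketch only (the per-device capacity model and names are assumptions), is to limit the shared variant to what the weakest secondary device can sustain.

```swift
/// Hypothetical per-secondary constraint: the usable bits per second after
/// accounting for that device's link bandwidth and decode capability.
struct SecondaryCapacity {
    let name: String
    let sustainableBitsPerSecond: Double
}

/// Pick the highest variant bitrate that every secondary device can sustain,
/// so no device falls behind (and none renders ahead of the others).
/// Returns nil if there are no devices or no variant fits.
func sharedVariantBitrate(variantBitrates: [Double],
                          secondaries: [SecondaryCapacity]) -> Double? {
    guard let weakest = secondaries.map({ $0.sustainableBitsPerSecond }).min() else { return nil }
    return variantBitrates.sorted().last(where: { $0 <= weakest })
}

// Example: a kitchen speaker on a weak link caps the whole group at 4 Mbit/s.
let bitrate = sharedVariantBitrate(
    variantBitrates: [1_500_000, 4_000_000, 8_000_000],
    secondaries: [SecondaryCapacity(name: "living-room-tv", sustainableBitsPerSecond: 20_000_000),
                  SecondaryCapacity(name: "kitchen-speaker", sustainableBitsPerSecond: 5_000_000)])
// bitrate == Optional(4000000.0)
```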
- In another aspect, the secondary devices 130 may render different portions of the media. For example, in a surround sound context, each secondary device 130 may render a portion of audio output which is not necessarily the entire variant of the media. In such an example, the player 120 may need to synchronize the rendering of media even though the portions of the media are being rendered at different locations with different secondary devices 130. In this aspect, the player 120 may distribute the portions of the media to the secondary devices 130 under the same timing method of FIG. 3 so that the synchronized rendering of the various portions of the media at the secondary devices 130 results in the fully rendered media across multiple secondary devices 130.
- Aspects of the disclosure may include a server executing an instance of an application or software configured to accept requests from a client and giving responses accordingly. The server may run on any computer including dedicated computers. The computer may include at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element may carry out arithmetic and logic operations, and a sequencing and control unit may change the order of operations in response to stored information. The server may include peripheral devices that may allow information to be retrieved from an external source, and the result of operations saved and retrieved. The server may operate within a client-server architecture. The server may perform some tasks on behalf of clients. The clients may connect to the server through the network on a communication channel as defined herein. The server may use memory with error detection and correction, redundant disks, redundant power supplies and so on.
- Aspects of the disclosure may include communication channels that may be any type of wired or wireless electronic communications network, such as, e.g., a wired/wireless local area network (LAN), a wired/wireless personal area network (PAN), a wired/wireless home area network (HAN), a wired/wireless wide area network (WAN), a campus network, a metropolitan network, an enterprise private network, a virtual private network (VPN), an internetwork, a backbone network (BBN), a global area network (GAN), the Internet, an intranet, an extranet, an overlay network, Wireless Fidelity (Wi-Fi), Bluetooth, and/or the like, and/or a combination of two or more thereof.
- Aspects of the disclosure may be web-based. For example, a server may operate a web application in conjunction with a database. The web application may be hosted in a browser-controlled environment (e.g., a Java applet and/or the like), coded in a browser-supported language (e.g., JavaScript combined with a browser-rendered markup language (e.g., Hyper Text Markup Language (HTML) and/or the like)), and/or the like, such that any device running a common web browser (e.g., Safari™ or the like) may render the application executable. A web-based service may be more beneficial due to the ubiquity of web browsers and the convenience of using a web browser as a client (i.e., a thin client). Further, with inherent support for cross-platform compatibility, the web application may be maintained and updated without distributing and installing software on each client device.
- Aspects of the disclosure may be implemented on any type of mobile smartphone operated by any type of advanced mobile data processing and communication operating system, such as, e.g., the Apple™ iOS™ operating system.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/179,476 US9843825B1 (en) | 2016-06-10 | 2016-06-10 | Distributed and synchronized media switching |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/179,476 US9843825B1 (en) | 2016-06-10 | 2016-06-10 | Distributed and synchronized media switching |
Publications (2)
Publication Number | Publication Date |
---|---|
US9843825B1 US9843825B1 (en) | 2017-12-12 |
US20170359607A1 true US20170359607A1 (en) | 2017-12-14 |
Family
ID=60516411
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/179,476 Active US9843825B1 (en) | 2016-06-10 | 2016-06-10 | Distributed and synchronized media switching |
Country Status (1)
Country | Link |
---|---|
US (1) | US9843825B1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108093293A (en) * | 2018-01-15 | 2018-05-29 | 北京奇艺世纪科技有限公司 | A kind of Video Rendering method and system |
US20180310075A1 (en) * | 2017-04-21 | 2018-10-25 | Alcatel-Lucent Espana S.A. | Multimedia content delivery with reduced delay |
WO2020055802A1 (en) | 2018-09-12 | 2020-03-19 | Roku, Inc. | Adaptive switching in a whole home entertainment system |
WO2020151399A1 (en) * | 2019-01-23 | 2020-07-30 | 上海哔哩哔哩科技有限公司 | Method and device for pseudo-seamless switching between different video sources played by web and medium |
US10951954B2 (en) * | 2016-07-05 | 2021-03-16 | Vishare Technology Limited | Methods and systems for video streaming |
EP3850859A4 (en) * | 2018-09-12 | 2022-05-11 | Roku, Inc. | Dynamically adjusting video to improve synchronization with audio |
US11616993B1 (en) * | 2021-10-22 | 2023-03-28 | Hulu, LLC | Dyanamic parameter adjustment for adaptive bitrate algorithm |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10728305B2 (en) | 2018-07-24 | 2020-07-28 | At&T Intellectual Property I, L.P. | Adaptive bitrate streaming techniques |
US10728630B2 (en) | 2018-07-24 | 2020-07-28 | At&T Intellectual Property I, L.P. | Adaptive bitrate streaming techniques |
US11089346B2 (en) | 2018-07-24 | 2021-08-10 | At&T Intellectual Property I, L.P. | Adaptive bitrate streaming techniques |
US10728588B2 (en) | 2018-07-24 | 2020-07-28 | At&T Intellectual Property I, L.P. | Adaptive bitrate streaming techniques |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140189052A1 (en) * | 2012-12-28 | 2014-07-03 | Qualcomm Incorporated | Device timing adjustments and methods for supporting dash over broadcast |
US20160234078A1 (en) * | 2015-02-11 | 2016-08-11 | At&T Intellectual Property I, Lp | Method and system for managing service quality according to network status predictions |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7136398B1 (en) * | 2002-03-14 | 2006-11-14 | Cisco Technology, Inc. | Method and apparatus for adding functionality to an existing conference call |
JP4544190B2 (en) * | 2006-03-31 | 2010-09-15 | ソニー株式会社 | VIDEO / AUDIO PROCESSING SYSTEM, VIDEO PROCESSING DEVICE, AUDIO PROCESSING DEVICE, VIDEO / AUDIO OUTPUT DEVICE, AND VIDEO / AUDIO SYNCHRONIZATION METHOD |
JP5479823B2 (en) * | 2009-08-31 | 2014-04-23 | ローランド株式会社 | Effect device |
JP2015084468A (en) * | 2013-10-25 | 2015-04-30 | ソニー株式会社 | Video processing device, control method and effect switcher |
US10476926B2 (en) * | 2015-06-12 | 2019-11-12 | Telefonaktiebolaget Lm Ericsson (Publ) | System and method for managing ABR bitrate delivery responsive to video buffer characteristics of a client |
2016
- 2016-06-10: US application US 15/179,476 filed (patent US9843825B1), status: Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140189052A1 (en) * | 2012-12-28 | 2014-07-03 | Qualcomm Incorporated | Device timing adjustments and methods for supporting dash over broadcast |
US20160234078A1 (en) * | 2015-02-11 | 2016-08-11 | At&T Intellectual Property I, Lp | Method and system for managing service quality according to network status predictions |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10951954B2 (en) * | 2016-07-05 | 2021-03-16 | Vishare Technology Limited | Methods and systems for video streaming |
US11297395B2 (en) * | 2016-07-05 | 2022-04-05 | Vishare Technology Limited | Methods and systems for video streaming |
US11924522B2 (en) * | 2017-04-21 | 2024-03-05 | Nokia Solutions And Networks Oy | Multimedia content delivery with reduced delay |
US20180310075A1 (en) * | 2017-04-21 | 2018-10-25 | Alcatel-Lucent Espana S.A. | Multimedia content delivery with reduced delay |
US20220360861A1 (en) * | 2017-04-21 | 2022-11-10 | Alcatel-Lucent Espana S.A. | Multimedia content delivery with reduced delay |
US11968431B2 (en) * | 2017-04-21 | 2024-04-23 | Nokia Solutions And Networks Oy | Multimedia content delivery with reduced delay |
CN108093293A (en) * | 2018-01-15 | 2018-05-29 | 北京奇艺世纪科技有限公司 | A kind of Video Rendering method and system |
WO2020055802A1 (en) | 2018-09-12 | 2020-03-19 | Roku, Inc. | Adaptive switching in a whole home entertainment system |
EP3850860A4 (en) * | 2018-09-12 | 2022-05-04 | Roku, Inc. | Adaptive switching in a whole home entertainment system |
EP3850859A4 (en) * | 2018-09-12 | 2022-05-11 | Roku, Inc. | Dynamically adjusting video to improve synchronization with audio |
US11611788B2 (en) | 2018-09-12 | 2023-03-21 | Roku, Inc. | Adaptive switching in a whole home entertainment system |
WO2020151399A1 (en) * | 2019-01-23 | 2020-07-30 | 上海哔哩哔哩科技有限公司 | Method and device for pseudo-seamless switching between different video sources played by web and medium |
US11589119B2 (en) | 2019-01-23 | 2023-02-21 | Shanghai Bilibili Technology Co., Ltd. | Pseudo seamless switching method, device and media for web playing different video sources |
US11616993B1 (en) * | 2021-10-22 | 2023-03-28 | Hulu, LLC | Dyanamic parameter adjustment for adaptive bitrate algorithm |
Also Published As
Publication number | Publication date |
---|---|
US9843825B1 (en) | 2017-12-12 |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
US9843825B1 (en) | Distributed and synchronized media switching | |
US11006168B2 (en) | Synchronizing internet (“over the top”) video streams for simultaneous feedback | |
US11627351B2 (en) | Synchronizing playback of segmented video content across multiple video playback devices | |
US10419513B2 (en) | Methods and apparatus for reducing latency shift in switching between distinct content streams | |
US10405026B2 (en) | Methods, devices and systems for audiovisual synchronization with multiple output devices | |
RU2627303C2 (en) | System and method for adaptive streaming in medium with several transmitting paths | |
EP3203754A1 (en) | Method and system for realizing streaming media data seamlessly connecting in intelligent home | |
KR20190068613A (en) | Systems and methods for discontinuing streaming content provided through an inviolatory manifest protocol | |
WO2011127312A1 (en) | Real-time or near real-time streaming | |
EP3571848A1 (en) | Content streaming system and method | |
KR20170141677A (en) | Receiving device, transmitting device and data processing method | |
WO2014010444A1 (en) | Content transmission device, content playback device, content delivery system, control method for content transmission device, control method for content playback device, data structure, control program, and recording medium | |
WO2014010445A1 (en) | Content transmission device, content playback device, content delivery system, control method for content transmission device, control method for content playback device, data structure, control program, and recording medium | |
US10433023B1 (en) | Heuristics for streaming live content | |
US11750859B2 (en) | Methods and systems for separate delivery of segments of content items | |
WO2011123821A1 (en) | Real-time or near real-time streaming | |
US20220408140A1 (en) | Moving image reproduction apparatus, moving image reproduction system, and moving image reproduction method | |
CN107852523B (en) | Method, terminal and equipment for synchronizing media rendering between terminals | |
EP3360332A1 (en) | Client and method for playing a sequence of video streams, and corresponding server and computer program product | |
JP6513054B2 (en) | Client device of content delivery system, method and program for acquiring content | |
CN114760485B (en) | Video carousel method, system and related equipment | |
US11856242B1 (en) | Synchronization of content during live video stream | |
CN113612728A (en) | Streaming media playing method, transmission equipment and system | |
KR20090111972A (en) | Method and set-top box for performing skip function in download and play type iptv service | |
JP2015167400A (en) | Information processor, data management method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: APPLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LI, ZHENHENG;SARACINO, DAVID P.;PANTOS, ROGER N.;SIGNING DATES FROM 20160610 TO 20160922;REEL/FRAME:039846/0077 |
|
AS | Assignment |
Owner name: APPLE INC., CALIFORNIA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ADD FOURTH ASSIGNOR'S DATA PREVIOUSLY RECORDED ON REEL 039846 FRAME 0077. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:LI, ZHENHENG;SARACINO, DAVID P.;PANTOS, ROGER N.;AND OTHERS;SIGNING DATES FROM 20160610 TO 20160922;REEL/FRAME:040562/0486 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |