
TWI337043B - Data transmission method and audio/video system capable of splitting and synchronizing audio/video data - Google Patents

Data transmission method and audio/video system capable of splitting and synchronizing audio/video data

Info

Publication number
TWI337043B
Authority
TW
Taiwan
Prior art keywords
data
end system
image data
sound
image
Prior art date
Application number
TW096111460A
Other languages
Chinese (zh)
Other versions
TW200840373A (en)
Inventor
Cheng Te Tseng
Chang Hung Lee
Original Assignee
Qisda Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qisda Corp filed Critical Qisda Corp
Priority to TW096111460A priority Critical patent/TWI337043B/en
Priority to US12/057,378 priority patent/US8379150B2/en
Publication of TW200840373A publication Critical patent/TW200840373A/en
Application granted granted Critical
Publication of TWI337043B publication Critical patent/TWI337043B/en

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4341Demultiplexing of audio and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2368Multiplexing of audio and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/242Synchronization processes, e.g. processing of PCR [Program Clock References]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6377Control signals issued by the client directed to the server or network components directed to server
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6377Control signals issued by the client directed to the server or network components directed to server
    • H04N21/6379Control signals issued by the client directed to the server or network components directed to server directed to encoder, e.g. for requesting a lower encoding rate
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Description

IX. Description of the Invention:

[Technical Field]

The present invention provides an audio/video (A/V) data transmission method and A/V system, and more particularly a data transmission method and A/V system in which the audio and video streams are split yet kept synchronized.

[Prior Art]

Audio/video (A/V) transmission technology has a very wide range of functions and applications, and is commonly used in A/V systems such as security surveillance systems, projectors, and home theaters. Prior-art A/V systems generally adopt one of two transmission techniques: synchronized and merged A/V, or unsynchronized and split A/V.

A security surveillance system usually comprises a plurality of monitors and a monitoring center, while projectors are often used for large conferences or multi-presenter briefings. During a conference or briefing, whenever the presenter changes, a traditional wired projector often requires cables to be plugged and unplugged and computers and the projector to be switched on and off repeatedly, which not only wastes time but is also inconvenient. With the popularity of wireless networking (WiFi) and the increasing speed of embedded central processing units (embedded CPUs), wireless projectors have become more and more common. A wireless projector can connect wirelessly to every participant's computer, so the presenter can be switched at any time without repeatedly plugging and unplugging the display cable.

Please refer to FIG. 1, a functional block diagram of a prior-art A/V system 100. The A/V system 100 adopts a synchronized, merged transmission technique and comprises a wireless transmitting-end system 110 and a wireless receiving-end system 120. The wireless transmitting-end system 110 receives video source data I_VIDEO and audio source data I_AUDIO from an A/V input device 112, compresses and encodes them with an A/V processing unit 114 to produce the corresponding A/V bitstream data S_A/V, and finally outputs the A/V bitstream data S_A/V through a wireless output 116.
The wireless receiving-end system 120 receives the A/V bitstream data S_A/V transmitted from the wireless transmitting-end system 110 through a wireless receiver 126, decompresses and decodes the A/V bitstream data S_A/V with an A/V processing unit 124 to produce the corresponding video output data O_VIDEO and audio output data O_AUDIO, and finally transmits the video output data O_VIDEO and the audio output data O_AUDIO to an A/V output device 122. The prior-art A/V system 100 adopts a synchronized A/V architecture: the audio and video data are merged before transmission and output wirelessly as a single stream, so only their handling at a single receiving-end system needs to be considered. When the A/V input device 112 is a monitor and the A/V output device 122 is a screen of a monitoring center, the A/V system 100 can serve as a security surveillance system; when the A/V input device 112 is a notebook computer and the A/V output device 122 is a wireless projector, the A/V system 100 can serve as a data projection system.

On the other hand, as wireless network bandwidth has grown (for example under the IEEE 802.11a and 802.11g wireless network standards), the application of wireless projectors has expanded from the wireless data projector used in offices, conferences, and briefings toward the wireless video projector used in the home. In a home theater environment, the user wants the video signal to be output through the projector while the audio signal is fed to the audio system.

Please refer to FIG. 2, a functional block diagram of a prior-art A/V system 200. The A/V system 200 adopts an unsynchronized, split transmission technique and comprises a wireless transmitting-end system 210 and a wireless receiving-end system 220. The wireless transmitting-end system 210 receives video source data I_VIDEO and audio source data I_AUDIO from an A/V input device 212, compresses and encodes the video source data I_VIDEO with an image processing unit 214 to produce the corresponding video bitstream data S_V, outputs the video bitstream data S_V through a wireless output 216, and at the same time outputs the audio source data I_AUDIO through a wired output 218. The wireless receiving-end system 220 receives the video bitstream data S_V and the audio source data I_AUDIO transmitted from the wireless transmitting-end system 210 through a wireless receiver 226 and a wired receiver 228 respectively, decompresses and decodes the video bitstream data S_V with an image processing unit 224 to produce the corresponding video output data O_VIDEO, and finally transmits the video output data O_VIDEO and the audio source data I_AUDIO to a video output device 222 and an audio output device 223 respectively. The prior-art A/V system 200 can be a home theater, in which the video output device 222 and the audio output device 223 are respectively a projector and an audio system. Because the architecture is unsynchronized and split, with the video data and the audio data delivered separately over wireless and wired links, the video bitstream data S_V may suffer varying degrees of interference during transmission, so the video bitstream data S_V and the audio source data I_AUDIO received by the receiving-end system 220 may not be synchronized with each other when played back on the different output devices.

[Summary of the Invention]

The present invention provides a data transmission method with A/V splitting and synchronization. The method comprises: initializing a transmitting-end system and a receiving-end system to obtain a predetermined compression ratio and a first predetermined capacity related to a first video data buffer in the transmitting-end system; the transmitting-end system processing video data and transmitting the processed video data according to a video data playback notification message sent from the receiving-end system; the transmitting-end system processing audio data and transmitting the audio data according to an error signal sent from the receiving-end system and the video data playback notification message; the receiving-end system receiving and processing the video data and the audio data to produce corresponding video output data and audio output data; and the receiving-end system outputting the video output data to a video output device and outputting the audio output data to an audio output device.

The present invention further provides an A/V data transmission system with A/V splitting and synchronization, comprising a transmitting-end system and a receiving-end system. The transmitting-end system comprises a first image processing unit for receiving and processing video data sent from an A/V input device, a first audio processing unit for receiving and processing audio data sent from the A/V input device, a first video data buffer for storing the video data, an audio data buffer for storing the audio data, a wireless output for transmitting the video data wirelessly, a wired output for transmitting the audio data over a wired link, and a control device for controlling the transmission of the video data and the audio data according to an error signal and a video data playback notification message. The receiving-end system comprises a wireless receiver for receiving the video data wirelessly, a wired receiver for receiving the audio data over the wired link, a second video data buffer for storing the video data, a second image processing unit for processing the video data to produce corresponding video output data, and a second audio processing unit for processing the audio data to produce corresponding audio output data.

[Embodiments]

The present invention provides an A/V data transmission method and A/V system with A/V splitting and synchronization, so that video and audio data sent from the same source can be played back synchronously on different systems. The method of the present invention comprises initializing the transmitting end and the receiving end, and then starting synchronized playback of the A/V data.

Please refer to FIG. 3, a functional block diagram of an A/V system 300 of the present invention.

The A/V system 300 adopts a transmission architecture in which the audio and video are split yet synchronized, and comprises a transmitting-end system 310 and a receiving-end system 320. The transmitting-end system 310 receives video source data I_VIDEO and audio source data I_AUDIO from an A/V input device 312. An image processing unit 314 compresses and encodes the video source data I_VIDEO to produce the corresponding video bitstream data S_V, and stores the video bitstream data S_V into a video data buffer 311. An audio processing unit 315 controls the flow and timing with which the audio source data I_AUDIO is output, and the audio data is stored in an audio data buffer 313. A control device 317 calculates the amounts of data stored in the video data buffer 311 and the audio data buffer 313: when the amount of data stored in the video data buffer 311 is within a predetermined range, the video bitstream data S_V is output through a wireless output 316, and when the amount of data stored in the audio data buffer 313 is within a predetermined range, the audio source data I_AUDIO is output through a wired output 318.

The receiving-end system 320 receives the video bitstream data S_V and the audio source data I_AUDIO transmitted from the transmitting-end system 310 through a wireless receiver 326 and a wired receiver 328 respectively, and stores the video bitstream data S_V into a video data buffer 321. When the amount of data stored in the video data buffer 321 reaches a predetermined range, an image processing unit 324 decompresses and decodes the video bitstream data S_V to produce the corresponding video output data O_VIDEO; finally, the video output data O_VIDEO and the audio source data I_AUDIO are transmitted to a video output device 322 and an audio output device 323 respectively.

The A/V system 300 of the present invention can be a home theater, in which the video output device 322 and the audio output device 323 are respectively a projector and an audio system. Because the architecture splits the audio and video streams over wireless and wired links, the video bitstream data S_V may suffer varying degrees of interference during transmission, and the video bitstream data S_V and the audio source data I_AUDIO received by the receiving-end system 320 may not arrive in step with each other. The A/V system 300 of the present invention therefore uses the image processing unit 314, the audio processing unit 315, and the control device 317 to control parameters such as the transmission time, transmission flow, and compression ratio of the A/V data, and adjusts these parameters according to the A/V data and timing actually received by the receiving-end system 320, so that the video output data O_VIDEO played by the video output device 322 and the audio data played by the audio output device 323 remain synchronized with each other.
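To make the gating just described concrete, the short Python sketch below models how a control device in the spirit of element 317 might decide, on each pass, whether the wireless and wired outputs are allowed to send. The range values and function names are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch of the FIG. 3 gating: the control device measures how much data sits in
# the video buffer (311) and the audio buffer (313) and releases each stream only while
# its fill level stays inside a predetermined range. All numbers and names are illustrative.
def within(fill: int, low: int, high: int) -> bool:
    return low <= fill <= high

def control_tick(video_fill: int, audio_fill: int,
                 video_range=(8_000, 64_000), audio_range=(1_000, 16_000)):
    """Return (send_video, send_audio) flags for one control pass."""
    return within(video_fill, *video_range), within(audio_fill, *audio_range)

# Example: plenty of buffered video, audio still filling up.
print(control_tick(video_fill=32_000, audio_fill=500))   # -> (True, False)
```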
Please refer to FIG. 4. The flowchart of FIG. 4 illustrates the method used by the A/V system 300 of the present invention to initialize the transmitting-end system 310, comprising the following steps:

Step 400: Start.
Step 410: Initialize the network of the transmitting-end system 310.
Step 420: Determine whether a connection can be successfully established between the transmitting-end system 310 and the receiving-end system 320; if the connection is successfully established, go to step 430; if not, go to step 410.
Step 430: Transmit dummy data to the receiving-end system 320 at a predetermined communication bandwidth.
Step 440: Receive the bandwidth information sent from the receiving-end system 320.
Step 450: Determine, according to the bandwidth information, whether the system times of the transmitting-end system 310 and the receiving-end system 320 are already synchronized with each other; if they are synchronized, go to step 470; if not, go to step 460.
Step 460: Update the predetermined communication bandwidth according to the bandwidth information, then go to step 430.
Step 470: Set the predetermined compression ratio of the video data and the predetermined capacity of the video data buffer 311 according to the bandwidth information.
Step 480: Transmit a video data reception notification message to the receiving-end system 320.
Step 490: Prepare to capture the A/V data.

Please refer to FIG. 5. The flowchart of FIG. 5 illustrates the method used by the A/V system 300 of the present invention to initialize the receiving-end system 320, comprising the following steps:

Step 500: Start.
Step 510: Initialize the network of the receiving-end system 320.
Step 520: Determine whether a connection can be successfully established between the transmitting-end system 310 and the receiving-end system 320; if the connection is successfully established, go to step 530; if not, go to step 510.
Step 530: Receive the dummy data sent from the transmitting-end system 310.
Step 540: Generate the related bandwidth information according to the reception status of the dummy data.
Step 550: Transmit the bandwidth information to the transmitting-end system 310.
Step 560: Determine whether a video data reception notification message sent from the transmitting-end system 310 has been received; if it has, go to step 570; if not, go to step 560.
Step 570: Prepare to receive the video data output by the transmitting-end system 310.

The present invention first initializes the networks of the transmitting-end system 310 and the receiving-end system 320 in steps 410 and 510 respectively, and determines in steps 420 and 520 whether a connection can be successfully established between the transmitting-end system 310 and the receiving-end system 320 of the A/V system 300. Once the connection has been successfully established, the system confirms whether the system times of the transmitting-end system 310 and the receiving-end system 320 are synchronized with each other, and confirms the predetermined communication bandwidth to be used for data transmission. First, the transmitting-end system 310 transmits dummy data at the initial predetermined communication bandwidth in step 430. Next, the receiving-end system 320 receives the dummy data in step 530 and, in step 540, generates the related communication bandwidth information according to the reception status of the dummy data. For example, if the transmitting-end system 310 transmits 100 packets of 8000 bits each within 100 microseconds, but after 100 microseconds the receiving-end system 320 has received only 90 correct packets of 8000 bits, the receiving-end system 320 can calculate the corresponding bandwidth information in step 540 from the packet transmission time, the packet length, and the number of packets received correctly. The transmitting-end system 310 then adjusts the number and size of the transmitted packets according to the bandwidth information until the receiving-end system 320 can receive all packets completely and correctly. For example, after updating the predetermined communication bandwidth in step 460, the transmitting-end system 310 may transmit only 90 packets of 8000 bits, or 100 packets of 7000 bits, within 100 microseconds; once the receiving-end system 320 receives the expected number of correct packets and reports the corresponding bandwidth information, the synchronization of the system times of the transmitting-end system 310 and the receiving-end system 320 and the confirmation of the predetermined communication bandwidth are complete. At this point, the transmitting-end system 310 has finished preparing to capture the A/V data, and the receiving-end system 320 has finished preparing to receive the A/V data.
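As a rough, non-authoritative illustration of the probe-and-adjust exchange described above (steps 430-470 and 530-550), the Python sketch below shows how a receiving end might summarize a probe burst into bandwidth information and how a transmitting end might shrink the next burst until everything arrives intact. The class and function names, and the shrinking rule, are assumptions made for the example.

```python
# Hedged sketch of the dummy-data bandwidth probe; nothing here is the patent's API.
from dataclasses import dataclass

@dataclass
class BandwidthInfo:
    elapsed_us: float       # duration of the probe burst
    packet_bits: int        # size of each probe packet
    packets_sent: int
    packets_received: int   # correct packets counted by the receiving end (step 540)

    @property
    def achieved_bps(self) -> float:
        # Bits that actually arrived per second of probe time.
        return self.packets_received * self.packet_bits / (self.elapsed_us * 1e-6)

    @property
    def loss_free(self) -> bool:
        return self.packets_received == self.packets_sent

def next_probe(info: BandwidthInfo) -> tuple[int, int]:
    """Transmitter side (step 460): shrink the burst until it fits the link.
    Returns (packets_to_send, packet_bits) for the next probe round."""
    if info.loss_free:
        return info.packets_sent, info.packet_bits   # bandwidth confirmed; step 470 may proceed
    ratio = info.packets_received / info.packets_sent
    # The description's example shrinks 100 x 8000-bit packets to 90 x 8000 (or 100 x 7000).
    return max(1, int(info.packets_sent * ratio)), info.packet_bits

# Numbers from the worked example: 100 packets of 8000 bits, 90 received within 100 us.
probe = BandwidthInfo(elapsed_us=100, packet_bits=8000, packets_sent=100, packets_received=90)
print(probe.achieved_bps)      # 7.2e9 bits per second for these illustrative numbers
print(next_probe(probe))       # -> (90, 8000)
```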
Please refer to FIG. 6. The flowchart of FIG. 6 illustrates the method used by the transmitting-end system 310 of the present invention to process the video data, comprising the following steps:

Step 600: Start capturing the video data.
Step 610: Compress the video data according to the predetermined compression ratio.
Step 620: Store the compressed video data into the video data buffer 311.
Step 630: Determine whether the amount of data stored in the video data buffer 311 has reached a first predetermined level; if it has, go to step 640; if not, go to step 620.
Step 640: Determine whether a video data playback notification message has been received; if it has, go to step 660; if not, go to step 650.
Step 650: Do not allow the audio data to be transmitted, then go to step 640.
Step 660: Allow the audio data to be transmitted, then go to step 670.
Step 670: Determine whether the amount of data stored in the video data buffer 311 has exceeded a second predetermined level; if it has, go to step 680; if not, go to step 690.
Step 680: Stop storing the compressed video data into the video data buffer 311, then go to step 690.
Step 690: Start transmitting the video data, then go to step 630.
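The flow of FIG. 6 can be read as a small buffering state machine; the loose Python rendering below is only an illustration of that reading. The callables, the byte-count thresholds, and the use of an event object for the audio gate are assumptions, not the patent's implementation.

```python
# Hedged rendering of FIG. 6 (steps 600-690); every name here is an illustrative stand-in.
def run_video_side(capture, compress, send, playback_notified, audio_gate,
                   first_level: int, second_level: int):
    """Buffer compressed video until `first_level`, open the audio gate once the
    receiving end reports playback, pause intake past `second_level`, and stream."""
    buffer = []                                   # stands in for video data buffer 311
    storing = True
    while True:
        if storing:
            buffer.append(compress(capture()))    # steps 600-620
        filled = sum(len(chunk) for chunk in buffer)
        if filled < first_level:                  # step 630
            storing = True
            continue
        if not playback_notified():               # steps 640-650: audio stays blocked
            audio_gate.clear()
            continue
        audio_gate.set()                          # step 660: audio transmission allowed
        storing = filled <= second_level          # steps 670-680: pause intake when over-full
        if buffer:
            send(buffer.pop(0))                   # step 690, then back to step 630
```

Here `audio_gate` is assumed to be something like a `threading.Event` shared with the audio loop of FIG. 7.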

Please refer to FIG. 7. The flowchart of FIG. 7 illustrates the method used by the transmitting-end system 310 of the present invention to process the audio data, comprising the following steps:

Step 700: Start capturing the audio data.
Step 710: Store the audio data into the audio data buffer 313.
Step 720: Determine whether transmission of the audio data is allowed; if it is allowed, go to step 740; if it is not allowed, go to step 730.
Step 730: Stop transmitting the audio data, then go to step 720.
Step 740: Receive the notification message related to the video playback system time.
Step 750: Receive the error signal related to the difference between the A/V playback system times.
Step 760: Adjust the predetermined flow and the predetermined data amount used when transmitting the audio data according to the error signal.
Step 770: Transmit the notification message related to the audio playback system time to the receiving-end system 320.
Step 780: Transmit the audio data at the predetermined flow and the predetermined data amount, then go to step 720.
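A loose Python rendering of this audio loop is sketched below. The hook functions, the deque standing in for buffer 313, and the specific correction rule in step 760 (scaling the flow by 10% per unit of skew) are all invented for the illustration; the patent only states that the flow and data amount are adjusted according to the error signal.

```python
# Hedged rendering of FIG. 7 (steps 700-780); all names and the correction rule are illustrative.
import collections

def run_audio_side(capture_audio, send_audio, audio_allowed,
                   recv_video_play_time, recv_error_signal, send_audio_play_time,
                   base_flow: int):
    audio_buffer = collections.deque()             # stands in for audio data buffer 313
    while True:
        audio_buffer.append(capture_audio())       # steps 700-710
        if not audio_allowed():                    # steps 720-730: gate opened by the video side
            continue
        recv_video_play_time()                     # step 740: receiver's video playback time
        skew = recv_error_signal()                 # step 750: video minus audio playback time
        # Step 760: adjust the predetermined flow according to the error signal (toy rule).
        flow = max(1, int(base_flow * (1.0 + 0.1 * skew)))
        send_audio_play_time()                     # step 770: report the audio playback time
        budget = flow                              # step 780: deliver at the adjusted flow
        while audio_buffer and budget > 0:
            chunk = audio_buffer.popleft()
            send_audio(chunk)
            budget -= len(chunk)
```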

Please refer to FIG. 8. The flowchart of FIG. 8 illustrates the method used by the receiving-end system 320 of the present invention to process the A/V data, comprising the following steps:

Step 800: Start receiving the compressed video data and the audio data sent from the transmitting-end system 310.
Step 810: Store the compressed video data into the video data buffer 321.
Step 820: Calculate the amount of data stored in the video data buffer 321.
Step 830: Determine whether the amount of data stored in the video data buffer 321 has reached a predetermined level; if it has, go to step 840; if not, go to step 810.
Step 840: Decompress and decode the video data stored in the video data buffer 321.
Step 850: Output the decompressed video data to the video output device 322.
Step 860: Output the audio data to the audio output device 323.
Step 870: Transmit the notification message related to the video playback system time to the transmitting-end system 310.
Step 880: Generate the error signal related to the difference between the A/V playback system times according to the notification message related to the audio playback system time.
Step 890: Report the error signal to the transmitting-end system 310.
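The bookkeeping on the receiving side can be pictured with the small Python class below; it is only a sketch of steps 800-890, and the clock source, message tuples, and back-channel callable are assumptions introduced for the example.

```python
# Hedged sketch of the receiver side of FIG. 8; names, clocks and message formats are illustrative.
import time

class ReceiverSync:
    def __init__(self, send_to_transmitter, video_threshold: int):
        self.video_buffer = []                    # stands in for video data buffer 321
        self.video_threshold = video_threshold    # the "predetermined level" of step 830
        self.send = send_to_transmitter           # hypothetical back-channel (steps 870/890)
        self.video_play_time = None
        self.audio_play_time = None

    def on_video(self, compressed_chunk, decode, display):
        self.video_buffer.append(compressed_chunk)                 # steps 800-810
        filled = sum(len(c) for c in self.video_buffer)            # step 820
        if filled >= self.video_threshold:                         # step 830
            display(decode(self.video_buffer.pop(0)))              # steps 840-850
            self.video_play_time = time.monotonic()
            self.send(("video_play_time", self.video_play_time))   # step 870

    def on_audio(self, chunk, play):
        play(chunk)                                                # step 860
        self.audio_play_time = time.monotonic()
        if self.video_play_time is not None:
            error = self.video_play_time - self.audio_play_time    # step 880
            self.send(("error_signal", error))                     # step 890
```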
After the A/V system 300 of the present invention has completed initialization, the transmitting-end system 310 starts processing the A/V data (as shown in FIG. 6 and FIG. 7). For the video data, after capturing the video data in step 600, the transmitting-end system compresses it in step 610 and stores the compressed video data into the video data buffer 311 in step 620. The control device 317 of the transmitting-end system 310 determines the amount of data stored in the video data buffer 311 in steps 630 and 670, and determines in step 640 whether the video data playback notification message has been received. If the amount of data stored in the video data buffer 311 has not yet reached the first predetermined level, step 620 is executed to continue storing the compressed video data into the video data buffer 311. If the amount of data stored in the video data buffer 311 has reached the first predetermined level and the video data playback notification message has been received, step 660 is executed to allow the audio data to be transmitted; if the amount of data has reached the first predetermined level but the video data playback notification message has not yet been received, the audio data is not allowed to be transmitted. After the audio data is allowed to be transmitted, if the amount of data stored in the video data buffer 311 has exceeded the second predetermined level, step 680 is executed to stop storing the compressed video data into the video data buffer 311 and then step 690 is executed to start transmitting the video data; if the amount of data has not yet exceeded the second predetermined level, step 690 is executed directly to start transmitting the video data.

For the audio data, on the other hand, after capturing the audio data in step 700, the transmitting-end system stores the audio data into the audio data buffer 313 in step 710. When the control device 317 of the transmitting-end system 310 determines in step 720 that transmission of the audio data is allowed, it receives the notification message related to the video playback system time sent from the receiving-end system 320 and the error signal related to the difference between the A/V playback system times. The transmitting-end system 310 therefore knows the difference between the system times at which the A/V data is played back at the receiving-end system 320, and can adjust, in step 760, the predetermined flow and the predetermined data amount used when transmitting the audio data, so that the A/V data can be played back synchronously at the receiving-end system 320.

Once the transmitting-end system 310 of the A/V system 300 of the present invention has completed the processing and transmission of the A/V data (as shown in FIG. 6 and FIG. 7), the receiving-end system 320 starts receiving the A/V data (as shown in FIG. 8). The received compressed video data is stored into the video data buffer 321, and the amount of data stored in the video data buffer 321 is calculated in step 820. When the amount of data stored in the video data buffer 321 has reached the predetermined level, step 840 is executed to decompress and decode the video data, the video output data is played on the video output device 322 and the audio data on the audio output device 323, while the notification messages related to the actual playback times and the error signal are reported back to the transmitting-end system 310.
The transmitting-end system 310 can therefore adjust the transmission of the A/V data according to the A/V data and playback times actually received by the receiving-end system 320, so that the video and audio remain synchronized at playback.

[Brief Description of the Drawings]

FIG. 1 is a functional block diagram of a prior-art A/V system.
FIG. 2 is a functional block diagram of a prior-art A/V system.
FIG. 3 is a functional block diagram of an A/V system of the present invention.
FIG. 4 is a flowchart of the present invention when initializing the transmitting-end system.
FIG. 5 is a flowchart of the present invention when initializing the receiving-end system.
FIG. 6 is a flowchart of the transmitting-end system of the present invention when processing video data.
FIG. 7 is a flowchart of the transmitting-end system of the present invention when processing audio data.
FIG. 8 is a flowchart of the receiving-end system of the present invention when processing A/V data.

[Main Component Symbol Description]

100, 200, 300: A/V system
110, 210, 310: wireless transmitting-end system
112, 212, 312: A/V input device
114, 124: A/V processing unit
116, 216, 316: wireless output
120, 220, 320: wireless receiving-end system
122: A/V output device
126, 226, 326: wireless receiver
214, 224, 314, 324: image processing unit
218, 318: wired output
222, 322: video output device
223, 323: audio output device
228, 328: wired receiver
311, 321: video data buffer
313: audio data buffer
315: audio processing unit
317: control device
400-490, 500-570, 600-690, 700-780, 800-890: steps


Claims (1)

X. Claims:

1. A data transmission method with audio/video splitting and synchronization, comprising the following steps:
(a) initializing a transmitting-end system and a receiving-end system to obtain a predetermined compression ratio and a first predetermined capacity related to a first video data buffer in the transmitting-end system;
(b) the transmitting-end system processing video data and transmitting the processed video data according to a video data playback notification message sent from the receiving-end system;
(c) the transmitting-end system processing audio data and transmitting the audio data according to an error signal sent from the receiving-end system and the video data playback notification message;
(d) the receiving-end system receiving and processing the video data and the audio data to produce corresponding video output data and audio output data; and
(e) the receiving-end system outputting the video output data to a video output device and outputting the audio output data to an audio output device.

2. The method of claim 1, further comprising the receiving-end system generating the error signal according to the data time at which the video output device plays the video output data and the data time at which the audio output device plays the audio output data.

3. The method of claim 1, wherein initializing the transmitting-end system in step (a) comprises:
initializing a network of the transmitting-end system;
the transmitting-end system transmitting dummy data to the receiving-end system;
the transmitting-end system receiving bandwidth information corresponding to a reception status of the dummy data;
the transmitting-end system setting the predetermined compression ratio and the first predetermined capacity according to the bandwidth information; and
the transmitting-end system transmitting a video data reception notification message to the receiving-end system.

4. The method of claim 3, wherein initializing the transmitting-end system in step (a) further comprises: …

5. The method of claim 4, further comprising re-initializing the network of the transmitting-end system when a connection between the transmitting-end system and the receiving-end system has not been successfully established.

6. The method of claim 3, wherein initializing the transmitting-end system in step (a) further comprises:
determining whether a connection can be successfully established between the transmitting-end system and the receiving-end system.

7. The method of claim 3, wherein initializing the transmitting-end system in step (a) further comprises:
the transmitting-end system setting the predetermined compression ratio and the first predetermined capacity according to the bandwidth information when the system times of the transmitting-end system and the receiving-end system are synchronized with each other.

8. The method of claim 7, further comprising the transmitting-end system transmitting the dummy data to the receiving-end system according to the bandwidth information when the system times of the transmitting-end system and the receiving-end system are not yet synchronized with each other.

9. The method of claim 3, wherein initializing the transmitting-end system in step (a) further comprises:
determining whether the system times of the transmitting-end system and the receiving-end system can be synchronized with each other.

10. The method of claim 3, wherein initializing the transmitting-end system in step (a) further comprises:
preparing to capture the video data and the audio data after the video data notification message has been transmitted to the receiving-end system.

11. The method of claim 1, wherein initializing the receiving-end system in step (a) comprises:
initializing a network of the receiving-end system;
the receiving-end system receiving dummy data transmitted from the transmitting-end system;
the receiving-end system generating bandwidth information according to a reception status of the dummy data;
the receiving-end system transmitting the bandwidth information to the transmitting-end system; and
the receiving-end system starting to receive the video data and the audio data after receiving a video data notification message transmitted from the transmitting-end system.

12. The method of claim 11, wherein initializing the receiving-end system in step (a) further comprises:
re-initializing the network of the receiving-end system when a connection between the transmitting-end system and the receiving-end system has not been successfully established.

13. The method of claim 11, wherein initializing the receiving-end system in step (a) further comprises:
determining whether a connection can be successfully established between the transmitting-end system and the receiving-end system; and
determining whether the receiving-end system can receive the video data notification message.

14. The method of claim 1, wherein the transmitting-end system processing the video data in step (b) comprises:
compressing the captured video data according to the predetermined compression ratio;
storing the compressed video data into the first video data buffer; and
starting to transmit the video data when the video data playback notification message has been received and the amount of data stored in the first video data buffer is higher than a first predetermined level.

15. The method of claim 14, further comprising setting the first predetermined level according to the first predetermined capacity.

16. The method of claim 14, wherein the transmitting-end system processing the video data in step (b) further comprises:
allowing the audio data to be transmitted when the video data playback notification message has been received and the amount of data stored in the first video data buffer is higher than the first predetermined level.

17. The method of claim 14, wherein the transmitting-end system processing the video data in step (b) further comprises:
stopping storing the compressed video data into the first video data buffer when the amount of data stored in the first video data buffer is higher than a second predetermined level.

18. The method of claim 17, further comprising setting the second predetermined level according to the first predetermined capacity.

19. The method of claim 1, wherein the transmitting-end system processing the audio data in step (c) comprises:
storing the captured audio data into an audio data buffer;
receiving a notification message related to a video playback system time and the error signal when transmission of the audio data is allowed;
adjusting a predetermined flow and a predetermined data amount for transmitting the audio data according to the notification message related to the video playback system time and the error signal;
transmitting a notification message related to an audio playback system time to the receiving-end system; and
starting to transmit the audio data at the predetermined flow and the predetermined data amount.

20. The method of claim 19, wherein the transmitting-end system processing the audio data in step (c) further comprises:
stopping transmitting the audio data when transmission of the audio data is not allowed.

21. The method of claim 1, wherein the receiving-end system processing the video data in step (d) comprises:
storing the received video data into a second video data buffer in the receiving-end system; and
decompressing the video data to produce the video output data when the amount of data stored in the second video data buffer is higher than a third predetermined level.

22. The method of claim 21, wherein step (d) further comprises:
the receiving-end system generating the error signal, the video data playback notification message, and a notification message related to the video playback system time.

23. An audio/video system with audio/video splitting and synchronization, comprising:
a transmitting-end system comprising:
a first image processing unit for receiving and processing video data sent from an A/V input device;
a first audio processing unit for receiving and processing audio data sent from the A/V input device;
a first video data buffer for storing the video data;
an audio data buffer for storing the audio data;
a wireless output for transmitting the video data wirelessly;
a wired output for transmitting the audio data in a wired manner; and
a control device for controlling the transmission of the video data and the audio data according to an error signal and a video data playback notification message; and
a receiving-end system comprising:
a wireless receiver for receiving the video data wirelessly;
a wired receiver for receiving the audio data in a wired manner;
a second video data buffer for storing the video data;
a second image processing unit for processing the video data to produce corresponding video output data; and
a second audio processing unit for processing the audio data to produce corresponding audio output data.
TW096111460A 2007-03-30 2007-03-30 Data transmission method and audio/video system capable of splitting and synchronizing audio/video data TWI337043B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
TW096111460A TWI337043B (en) 2007-03-30 2007-03-30 Data transmission method and audio/video system capable of splitting and synchronizing audio/video data
US12/057,378 US8379150B2 (en) 2007-03-30 2008-03-28 Data transmission method and audio/video system capable of splitting and synchronizing audio/video data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW096111460A TWI337043B (en) 2007-03-30 2007-03-30 Data transmission method and audio/video system capable of splitting and synchronizing audio/video data

Publications (2)

Publication Number Publication Date
TW200840373A TW200840373A (en) 2008-10-01
TWI337043B true TWI337043B (en) 2011-02-01

Family

ID=39794544

Family Applications (1)

Application Number Title Priority Date Filing Date
TW096111460A TWI337043B (en) 2007-03-30 2007-03-30 Data transmission method and audio/video system capable of splitting and synchronizing audio/video data

Country Status (2)

Country Link
US (1) US8379150B2 (en)
TW (1) TWI337043B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2625461T3 (en) 1998-07-17 2017-07-19 Rovi Guides, Inc. Interactive television program guide with remote access
US8343072B2 (en) 2007-07-16 2013-01-01 Inrad, Inc. Coaxial needle assembly
US20120214416A1 (en) * 2011-02-23 2012-08-23 Jonathan Douglas Kent Methods and apparatuses for communication between devices
KR20130058962A (en) * 2011-11-28 2013-06-05 한국전자통신연구원 Apparatus and method for communicating wireless multimedia based on zigbee network standard
TWI619384B (en) * 2016-02-03 2018-03-21 明基電通股份有限公司 Video transmitting device, video receiving device and associated mode switchign method
US10432988B2 (en) * 2016-04-15 2019-10-01 Ati Technologies Ulc Low latency wireless virtual reality systems and methods

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6421733B1 (en) * 1997-03-25 2002-07-16 Intel Corporation System for dynamically transcoding data transmitted between computers
US6175871B1 (en) * 1997-10-01 2001-01-16 3Com Corporation Method and apparatus for real time communication over packet networks
US6522352B1 (en) * 1998-06-22 2003-02-18 Motorola, Inc. Self-contained wireless camera device, wireless camera system and method
JP2004056397A (en) * 2002-07-18 2004-02-19 Canon Inc Image processing apparatus and method
FR2849328A1 (en) * 2002-12-20 2004-06-25 St Microelectronics Sa METHOD AND DEVICE FOR SYNCHRONIZING THE PRESENTATION OF AUDIO FRAMES AND / OR VIDEO FRAMES
JP2005123789A (en) * 2003-10-15 2005-05-12 Matsushita Electric Ind Co Ltd Av synchronization system
JP4579710B2 (en) * 2004-02-20 2010-11-10 フルカワ エレクトリック ノース アメリカ インコーポレーテッド Modification, enhancement and adjustment of light generation in highly nonlinear fibers by post-processing
US7636126B2 (en) * 2005-06-22 2009-12-22 Sony Computer Entertainment Inc. Delay matching in audio/video systems
CA2541560C (en) * 2006-03-31 2013-07-16 Leitch Technology International Inc. Lip synchronization system and method
US7765315B2 (en) * 2007-01-08 2010-07-27 Apple Inc. Time synchronization of multiple time-based data streams with independent clocks

Also Published As

Publication number Publication date
US20080240684A1 (en) 2008-10-02
TW200840373A (en) 2008-10-01
US8379150B2 (en) 2013-02-19

Similar Documents

Publication Publication Date Title
WO2018082284A1 (en) 3d panoramic audio and video live broadcast system and audio and video acquisition method
JP5989779B2 (en) Synchronized wireless display device
TWI337043B (en) Data transmission method and audio/video system capable of splitting and synchronizing audio/video data
WO2016124101A1 (en) Information display method, apparatus and system
JP4702397B2 (en) Content server, information processing apparatus, network device, content distribution method, information processing method, and content distribution system
WO2014161402A2 (en) Distributed video conference method, system, terminal, and audio-video integrated device
WO2012097549A1 (en) Method and system for sharing audio and/or video
US8704868B2 (en) Video conferencing system, video conferencing apparatus, video conferencing control method, and video conferencing control program
WO2012079424A1 (en) Distributed video processing method, system and multipoint control unit
WO2012041117A1 (en) Method, system and related device for centralized monitoring of video conference terminal
CN110062268A (en) A kind of audio-video sends and receives processing method and processing device with what screen played
US9088690B2 (en) Video conference system
WO2014187062A1 (en) Method and device for playing conference signal, video conference terminal and mobile device
WO2018068481A1 (en) Binocular 720-degree panoramic acquisition system
CN103856809A (en) Method, system and terminal equipment for multipoint at the same screen
CN108366044B (en) VoIP remote audio/video sharing method
CN110557346A (en) Wireless audio-video transmission system and method
TW201138464A (en) Network device, information processing apparatus, stream switching method, information processing method, program, and content distribution system
CN106302377B (en) Media session processing method and related equipment and communication system
JP2008311969A (en) Receiver, transmitter, communication system, control method for the receiver, communication method, control program for the receiver, and recording medium with the communication program stored
TWI526080B (en) Video conferencing system
CN101340589B (en) Video and audio shunting and synchronizing data transmission method and video and audio system
WO2025010818A1 (en) System for screen sharing among conference-participating devices
CN113055636B (en) Data processing method and conference system
CN105791367A (en) Auxiliary media information sharing method, system and related equipment in screen sharing

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees