CN112235649A - Distributed panoramic fusion system - Google Patents

Info

Publication number
CN112235649A
Authority
CN
China
Prior art keywords: fusion, display, video file, terminal, distributed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011110575.0A
Other languages
Chinese (zh)
Inventor
睢常明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xingluo Technology Co ltd
Original Assignee
Guangzhou Xingluo Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xingluo Technology Co ltd filed Critical Guangzhou Xingluo Technology Co ltd
Priority to CN202011110575.0A priority Critical patent/CN112235649A/en
Publication of CN112235649A publication Critical patent/CN112235649A/en
Pending legal-status Critical Current

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 - Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 - Monomedia components thereof
    • H04N21/816 - Monomedia components thereof involving special video data, e.g. 3D video
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/4104 - Peripherals receiving signals from specially adapted client devices
    • H04N21/4122 - Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 - Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 - Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44016 - Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 - Details of colour television systems
    • H04N9/12 - Picture reproducers
    • H04N9/31 - Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 - Constructional details thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure provides a distributed panoramic fusion system comprising: a cloud server for sending a video file and a display instruction to a router, wherein the display instruction is used to determine the position of each fusion terminal and the correspondence between that position and the video file; a router for receiving the video file and the display instruction sent by the cloud server and sending them to the fusion terminals; fusion terminals, each for receiving the video file and the display instruction sent by the router, parsing them, determining its own position, determining a display file according to that position, and sending the display file to the projector connected to it; and projectors, each for receiving the display file sent by the fusion terminal connected to it and projecting and displaying it, wherein the number of fusion terminals corresponds to the number of projectors and is at least one.

Description

Distributed panoramic fusion system
Technical Field
The present disclosure relates to a panoramic fusion system, and more particularly, to a distributed panoramic fusion system.
Background
In corridor settings such as KTV corridors, a panoramic fusion system can project, for example, a beach scene onto multiple screens along the corridor, so that people walking through it feel as if they are walking on a beach. Current panoramic fusion systems use a centralized processing scheme consisting of a fusion server, multiple graphics cards, and multiple projectors, with the HDMI output of each graphics card connected to one projector. The fusion server processes the video information using software decoding and video fusion techniques. However, such a system requires many graphics cards, and driving each projection from a graphics card is expensive, so the total cost of the system is high. In addition, because the architecture is centralized, the whole system stops working once the fusion server fails, and devices cannot easily be added or removed at the edge. Finally, each projector is wired to a graphics card, and when there are many projectors the cabling work becomes difficult.
Accordingly, there is a need for an improved distributed panoramic fusion system.
Disclosure of Invention
An object of exemplary embodiments of the present disclosure is to overcome the above and/or other problems in the prior art.
Thus, according to one aspect of the present disclosure, there is provided a distributed panoramic fusion system comprising:
a cloud server for sending a video file and a display instruction to a router, wherein the display instruction is used to determine the position of each fusion terminal and the correspondence between that position and the video file;
a router for receiving the video file and the display instruction sent by the cloud server and sending them to the fusion terminals;
fusion terminals, each for receiving the video file and the display instruction sent by the router, parsing them, determining its own position, determining a display file according to that position, and sending the display file to the projector connected to it; and
projectors, each for receiving the display file sent by the fusion terminal connected to it and projecting and displaying it, wherein the number of fusion terminals corresponds to the number of projectors and is at least one.
Optionally, the router is further configured to receive a control instruction sent by a mobile phone and to send the control instruction to the cloud server, where the control instruction is used to determine the video file and the display instruction.
Optionally, the fusion terminal receives the video file and the display instruction sent by the router over UDP, in a wired or wireless manner.
Optionally, the fusion terminal uses an Android operating system to receive and parse the video file and the display instruction, determine the display file, and send the display file to the projector connected to it.
Optionally, after receiving the video file, the fusion terminal parses the video file through on-chip hardware decoding.
Optionally, the fusion terminal performs panoramic fusion through OpenGL after parsing the video file.
Optionally, according to the determined terminal position, the fusion terminal applies gamma correction and white-balance adjustment through OpenGL to the data at the edges of its display picture in the video file, so as to implement panoramic fusion.
Optionally, the video files received by all the fusion terminals in the distributed panoramic fusion system are the same.
Optionally, the fusion terminal further obtains software versions through the router to implement software updates.
Optionally, the fusion terminal is disposed close to the projector to which it is connected.
According to the exemplary embodiments, each fusion terminal in the distributed panoramic fusion system determines its position and the content of its display file by receiving and parsing the video file and the display instruction sent by the cloud server, thereby realizing synchronized panoramic-fusion projection of the same video file. Because the system uses a distributed architecture, devices can easily be added or removed; placing each fusion terminal close to its projector reduces the cabling work; and the video file and the display instruction can be set remotely from a mobile phone.
Drawings
The disclosure may be better understood by describing exemplary embodiments thereof in conjunction with the following drawings, in which:
Fig. 1 is a schematic block diagram of a fusion terminal according to an embodiment of the present disclosure;
Fig. 2 is a schematic view of a panoramic fusion system according to an embodiment of the present disclosure; and
Fig. 3 is a schematic view of a projection display of a panoramic fusion system according to an embodiment of the present disclosure.
Detailed Description
In the following description of the embodiments of the present disclosure, it is noted that, in the interest of brevity, not all features of an actual implementation are described in this specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
Unless otherwise defined, technical or scientific terms used in the claims and the specification have the ordinary meaning understood by those of ordinary skill in the art to which this disclosure belongs. The terms "first," "second," and similar terms in the description and claims do not indicate any order, quantity, or importance, but are used to distinguish one element from another. "A" or "an" and similar terms do not denote a limitation of quantity, but rather the presence of at least one. "Comprise," "include," and similar words mean that the elements or items listed before the word cover the elements or items listed after it and their equivalents, without excluding other elements or items. "Connected," "coupled," and similar terms are not restricted to physical or mechanical connections, and may be direct or indirect.
Fig. 1 shows a block diagram of a fusion terminal 100 according to an embodiment of the present specification. The components of the fusion terminal 100 include, but are not limited to, a memory 110 and a processor 120. The processor 120 is coupled to the memory 110 via a bus 130, and a database 150 is used to store data.
The fusion terminal 100 further includes an access device 140 that enables the fusion terminal 100 to communicate via one or more networks 160. Examples of such networks include the public switched telephone network (PSTN), a local area network (LAN), a wide area network (WAN), a personal area network (PAN), or a combination of communication networks such as the Internet. The access device 140 may include one or more of any type of wired or wireless network interface (e.g., a network interface card (NIC)), such as an IEEE 802.11 wireless local area network (WLAN) interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, or a Near Field Communication (NFC) interface.
In one embodiment of the present specification, the above-mentioned components of the fusion terminal 100, as well as other components not shown in Fig. 1, may also be connected to each other, for example through a bus. It should be understood that the block diagram of the fusion terminal shown in Fig. 1 is for illustrative purposes only and does not limit the scope of this specification; those skilled in the art may add or replace components as needed.
The fusion terminal 100 may be used to implement the functions of the fusion terminal in Fig. 2.
Fig. 2 shows a distributed panoramic fusion system according to an embodiment of the present application, which includes a cloud server 201, a router 202, a fusion terminal 203, and a projector 204.
The cloud server 201 is used to send a video file and a display instruction to the router 202. The display instruction is used to determine the position of the fusion terminal 203 and the correspondence between that position and the video file. The cloud server can send different video files according to the needs of different solar terms and holidays, for example to create a festive atmosphere. The display instruction can set the positions of different terminals, where a terminal position refers to the part of the full picture of the video file that the terminal displays; for example, the fusion terminal with IP address 192.168.1.10 may be set to display 10% of the picture content centered on the upper-left corner of the full picture, as shown by projection 1 in Fig. 3. The fusion terminal then parses the video file according to the display instruction to obtain the corresponding display file and sends it to the connected projector for projection.
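As a rough illustration of how such a display instruction might be organized, the following Python sketch pairs each fusion terminal's IP address with the picture region it should display. The field names, the JSON-style structure, and the 10% upper-left region are hypothetical and only mirror the example above; the patent does not define a concrete message format.

```python
# Hypothetical display instruction: maps each fusion terminal (by IP address)
# to the part of the full video picture it is responsible for.
display_instruction = {
    "video_file": "beach_scene.mp4",   # illustrative file name
    "regions": {
        "192.168.1.10": {              # terminal from the example above
            "origin": (0.0, 0.0),      # normalized top-left corner of the region
            "size": (0.1, 0.1),        # rough reading of "10% of the picture content"
        },
        # ... one entry per fusion terminal / projector
    },
}

def region_for(instruction: dict, terminal_ip: str) -> dict:
    """Return the picture region assigned to one fusion terminal."""
    return instruction["regions"][terminal_ip]
```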
The router 202 is configured to receive the video file and the display instruction sent by the cloud server 201 and to send them to the fusion terminal 203.
The fusion terminal 203 is configured to receive the video file and the display instruction sent by the router 202, parse them, determine its own position, determine the display file according to that position, and send the display file to the projector 204 connected to it. For example, when the display instruction specifies that the terminal with IP address 192.168.1.10 displays 10% of the picture content centered on the upper-left corner of the full picture, that fusion terminal extracts the part of the video corresponding to this 10% of the picture, performs video fusion processing on it to determine the display file, and sends the display file to the projector connected to it.
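Concretely, the cropping step could look like the minimal sketch below, which assumes the decoded frame is available as a NumPy array and the region is given in the normalized form used in the previous sketch; the helper name and array layout are illustrative, not taken from the patent.

```python
import numpy as np

def crop_display_frame(frame: np.ndarray, origin, size) -> np.ndarray:
    """Cut out the sub-picture this fusion terminal is responsible for.

    frame  : full decoded video frame, shape (height, width, 3)
    origin : normalized (x, y) of the region's top-left corner
    size   : normalized (width, height) of the region
    """
    h, w = frame.shape[:2]
    x0, y0 = int(origin[0] * w), int(origin[1] * h)
    x1, y1 = x0 + int(size[0] * w), y0 + int(size[1] * h)
    return frame[y0:y1, x0:x1]
```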
The projector 204 is configured to receive the display file sent by the fusion terminal 203 connected to it and to project and display it. The number of fusion terminals 203 corresponds to the number of projectors 204 and is at least one. Each projector is connected to its fusion terminal through HDMI.
The fusion terminal receives the video file and the display instruction sent by the router over UDP, in a wired or wireless manner. When the fusion terminal communicates wirelessly, the cabling work of the panoramic fusion system is therefore greatly reduced.
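A minimal sketch of the UDP reception path on the fusion terminal is shown below. The port number, the assumption that a display instruction fits in a single JSON datagram, and the handler are all illustrative; the patent only states that UDP is used.

```python
import json
import socket

LISTEN_PORT = 5005  # hypothetical port chosen for this sketch


def handle_display_instruction(message: dict) -> None:
    """Stub: record the region assignment carried by a display instruction."""
    print("display instruction regions:", message.get("regions"))


sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # UDP, wired or Wi-Fi
sock.bind(("0.0.0.0", LISTEN_PORT))

while True:
    datagram, sender = sock.recvfrom(65535)  # one datagram from the router
    message = json.loads(datagram.decode("utf-8"))
    if message.get("type") == "display_instruction":
        handle_display_instruction(message)
    # Video file data would arrive in further datagrams and be reassembled here.
```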
The fusion terminal uses an Android operating system to receive and parse the video file and the display instruction, determine the display file, and send the display file to the projector connected to it. This greatly reduces the cost of the fusion terminal; with the terminal cost reduced, the cost of the whole panoramic fusion system can be about 70% lower than that of a conventional panoramic fusion system.
After receiving the video file, the fusion terminal parses it through on-chip hardware decoding. Compared with the conventional software decoding approach, this hardware decoding greatly shortens the video processing time.
After parsing the video file, the fusion terminal performs panoramic fusion through OpenGL. Specifically, according to the determined terminal position, the fusion terminal uses OpenGL to apply gamma correction and white-balance adjustment to the data at the edges of its display picture in the video file, thereby determining the display file and realizing panoramic fusion. For example, as illustrated in Fig. 3, the fusion terminal corresponding to projector 1 applies gamma correction and white balance to the region adjoining the display content of projector 2 and sharpens that adjoining region, thereby blending the pictures and achieving panoramic fusion.
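To make the edge-fusion idea concrete, the sketch below shows one common way to blend overlapping projector edges: a linear cross-fade applied in linearized (de-gammaed) space so the overlap does not appear as a bright band. It is a generic NumPy illustration under assumed parameters (display gamma 2.2, blend zone given as a fraction of the frame width), not the patent's actual OpenGL implementation.

```python
import numpy as np

def blend_right_edge(frame: np.ndarray, blend_fraction: float = 0.1,
                     gamma: float = 2.2) -> np.ndarray:
    """Fade out the right edge of this terminal's picture so that it can
    overlap the neighbouring projector's picture without a visible seam.

    frame          : this terminal's sub-picture, uint8, shape (h, w, 3)
    blend_fraction : width of the overlap zone as a fraction of the frame width
    gamma          : assumed display gamma used to linearize before blending
    """
    h, w = frame.shape[:2]
    blend_w = max(1, int(w * blend_fraction))

    # Work in linear light so the cross-fade does not produce a bright band.
    linear = (frame.astype(np.float32) / 255.0) ** gamma

    # Ramp from 1 (fully this projector) down to 0 at the outer edge.
    ramp = np.linspace(1.0, 0.0, blend_w, dtype=np.float32)
    linear[:, w - blend_w:, :] *= ramp[None, :, None]

    # Back to gamma-encoded 8-bit values for output over HDMI.
    return np.clip(linear ** (1.0 / gamma) * 255.0, 0, 255).astype(np.uint8)
```

An OpenGL version would typically perform the same arithmetic per fragment in a shader, with the blend ramp derived from the fragment coordinate or supplied as a texture.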
The video files received by all the fusion terminals in the distributed panoramic fusion system are the same. Because every fusion terminal receives the same video file and each terminal displays a different part of it, the video remains synchronized across terminals and panoramic fusion can be achieved.
The fusion terminal also obtains software versions through the router to carry out software updates.
The fusion terminal and the projector connected to it can be placed close to each other, which reduces the cabling work of the whole panoramic fusion system.
The router is further used to receive a control instruction sent by a mobile phone and to send the control instruction to the cloud server. The control instruction is used to determine the video file and the display instruction, so the mobile phone can decide which video file the cloud server sends to the fusion terminals for display on the projectors, thereby realizing remote control.
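For illustration, a phone-side control instruction could be as simple as the sketch below, which sends a JSON datagram to the router for forwarding to the cloud server. The message fields, router address, port, and the use of UDP for this hop are assumptions; the patent does not specify the control protocol.

```python
import json
import socket

# Hypothetical control instruction assembled by a phone app.
control_instruction = {
    "type": "control",
    "video_file": "beach_scene.mp4",  # which scene the projectors should show
}

ROUTER_ADDR = ("192.168.1.1", 5006)   # illustrative router address and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(json.dumps(control_instruction).encode("utf-8"), ROUTER_ADDR)
```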
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In summary, according to the exemplary embodiments, each fusion terminal in the distributed panoramic fusion system determines its position and the content of its display file by receiving and parsing the video file and the display instruction sent by the cloud server, thereby realizing synchronized panoramic-fusion projection of the same video file. Because the system uses a distributed architecture, devices can easily be added or removed; placing each fusion terminal close to its projector reduces the cabling work; and the video file and the display instruction can be set remotely from a mobile phone.
It is noted that in the systems, devices, and methods of the present disclosure, it is apparent that individual components or steps may be broken down and/or re-combined. These decompositions and/or recombinations are to be considered equivalents of the present disclosure. Also, the steps of executing the series of processes described above may naturally be executed chronologically in the order described, but need not necessarily be executed chronologically. Some steps may be performed in parallel or independently of each other.
The above detailed description should not be construed as limiting the scope of the disclosure. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (10)

1. A distributed panoramic fusion system, comprising:
a cloud server for sending a video file and a display instruction to a router, wherein the display instruction is used to determine the position of each fusion terminal and the correspondence between that position and the video file;
a router for receiving the video file and the display instruction sent by the cloud server and sending them to the fusion terminals;
fusion terminals, each for receiving the video file and the display instruction sent by the router, parsing them, determining its own position, determining a display file according to that position, and sending the display file to the projector connected to it; and
projectors, each for receiving the display file sent by the fusion terminal connected to it and projecting and displaying it, wherein the number of fusion terminals corresponds to the number of projectors and is at least one.
2. The distributed panoramic fusion system of claim 1, wherein the router is further configured to receive a control instruction sent by a mobile phone and send the control instruction to the cloud server, where the control instruction is used to determine the video file and the display instruction.
3. The distributed panoramic fusion system of claim 1, wherein the fusion terminal receives the video file and the display instruction sent by the router over UDP, in a wired or wireless manner.
4. The distributed panoramic fusion system of claim 1, wherein the fusion terminal uses an Android operating system to receive and parse the video file and the display instruction, determine the display file, and send the display file to the projector connected to it.
5. The distributed panoramic fusion system of claim 1, wherein the fusion terminal parses the video file by on-chip hardware decoding after receiving the video file.
6. The distributed panoramic fusion system of claim 5, wherein the fusion terminal performs panoramic fusion through OpenGL after parsing the video file.
7. The distributed panoramic fusion system of claim 6, wherein, according to the determined terminal position, the fusion terminal applies gamma correction and white-balance adjustment through OpenGL to the data at the edges of its display picture in the video file to determine the display file, so as to implement panoramic fusion.
8. The distributed panoramic fusion system of claim 1, wherein the video files received by all the fusion terminals in the distributed panoramic fusion system are the same.
9. The distributed panoramic fusion system of claim 1, wherein the fusion terminal further obtains software versions through the router to implement software updates.
10. The distributed panoramic fusion system of claim 1, wherein the fusion terminal is disposed close to the projector to which it is connected.
CN202011110575.0A 2020-10-16 2020-10-16 Distributed panoramic fusion system Pending CN112235649A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011110575.0A CN112235649A (en) 2020-10-16 2020-10-16 Distributed panoramic fusion system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011110575.0A CN112235649A (en) 2020-10-16 2020-10-16 Distributed panoramic fusion system

Publications (1)

Publication Number Publication Date
CN112235649A true CN112235649A (en) 2021-01-15

Family

ID=74117677

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011110575.0A Pending CN112235649A (en) 2020-10-16 2020-10-16 Distributed panoramic fusion system

Country Status (1)

Country Link
CN (1) CN112235649A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103516887A (en) * 2012-06-29 2014-01-15 中国移动通信集团公司 Display method, device and system of multiple terminal screens
CN104836964A (en) * 2015-05-08 2015-08-12 中国科学院自动化研究所 Control device for video fusion equipment in distributed video fusion system
CN104954769A (en) * 2015-06-15 2015-09-30 中国科学院自动化研究所 Immersion type ultra-high-definition video processing system and method
WO2017080206A1 (en) * 2015-11-13 2017-05-18 深圳大学 Video panorama generation method and parallel computing system
CN110109634A (en) * 2019-04-02 2019-08-09 视联动力信息技术股份有限公司 A kind of display methods and device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114189698A (en) * 2021-12-07 2022-03-15 广州慧联网络科技有限公司 Video file display control method and system based on cloud server

Legal Events

PB01 - Publication
SE01 - Entry into force of request for substantive examination
RJ01 - Rejection of invention patent application after publication (Application publication date: 20210115)