
WO2021132739A1 - Multi-channel image transmission system and method - Google Patents

Multi-channel image transmission system and method Download PDF

Info

Publication number
WO2021132739A1
Authority
WO
WIPO (PCT)
Prior art keywords
video transmission
channel
camera
video
image
Prior art date
Application number
PCT/KR2019/018317
Other languages
French (fr)
Korean (ko)
Inventor
유재형
김중구
Original Assignee
(주)비전에스티
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)비전에스티 filed Critical (주)비전에스티
Publication of WO2021132739A1 publication Critical patent/WO2021132739A1/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2385Channel allocation; Bandwidth allocation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/27Server based end-user applications
    • H04N21/274Storing end-user multimedia data in response to end-user request, e.g. network recorder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Internal components of the client ; Characteristics thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4305Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Definitions

  • The present invention relates to a multi-channel video transmission system and method, and more particularly to a multi-channel video transmission system and method that simplify the transmission of large volumes of video by receiving multiple camera inputs, synchronizing them, and integrating them in real time into a single video for transmission.
  • Cameras are among the core sensors of an autonomous driving platform; because many cameras are required for autonomous driving, a large amount of data must be transmitted in real time.
  • Connecting a camera to a PC usually requires a Gigabit Ethernet or USB 2.0/USB 3.0 interface.
  • Various cameras are on the market, but connecting and synchronizing multiple cameras is limited by the number of available interface ports and by the synchronization method, and each camera must be connected separately to the deep learning platform.
  • Connecting a camera to an embedded system is even more difficult because such systems generally have fewer interfaces than a PC. Some systems, such as the Nvidia Jetson Nano, Nvidia Jetson TX2, Nvidia Jetson Xavier, and Raspberry Pi 4 Model B, support the MIPI interface, but they support few cameras, the cables are difficult to lengthen, and the manufacturer's SDK (Software Development Kit) must be used.
  • To solve these problems, an object of the present invention is to provide a multi-channel image transmission system and method that allow a multi-channel camera to be applied to a PC or various embedded systems by synchronizing, integrating, and transmitting the outputs of a plurality of cameras in real time.
  • A multi-channel image transmission system according to the present invention includes: a camera unit including a plurality of cameras for photographing a subject; a camera interface for receiving and forwarding the images captured by the camera unit; an image synchronization, integration, and transmission unit for synchronizing, integrating, and transmitting the captured images; a video transmission interface for video transmission; and a host that receives the synchronized and integrated image.
  • The multi-channel image transmission method according to the present invention is characterized in that images captured by a plurality of cameras are received through a camera interface, synchronized and integrated, and transmitted to a host through an image transmission interface.
  • According to the multi-channel video transmission system of the present invention, when a plurality of cameras are synchronized and integrated in real time and transmitted over USB 3.0 UVC (USB Video Class), a multi-channel camera can easily be applied to a PC or various embedded systems without a separate SDK or driver.
  • With UVC, image transmission is possible through the standard driver included in the OS, without a separate driver, and various UVC-compatible programs can be used.
  • The basic image-processing examples of deep learning platforms typically use a UVC camera, so the system can easily be applied to various platforms.
  • FIG. 1 is a block diagram for explaining the concept of a multi-channel video transmission system according to the present invention.
  • FIG. 2 is a data flow diagram for explaining an image synchronization and integration process of a multi-channel image transmission system according to the present invention.
  • FIGS. 3A and 3B are photographs illustrating an embodiment in which the image synchronization and integration unit of the multi-channel image transmission system according to the present invention is implemented.
  • FIGS. 4A to 4C are diagrams illustrating an example of an operation result of an image synchronization and integration unit of a multi-channel image transmission system according to the present invention.
  • FIG. 1 is a block diagram for explaining the concept of a multi-channel video transmission system according to the present invention.
  • Referring to FIG. 1, a multi-channel image transmission system 100 includes a camera unit 110, a camera interface 120, an image synchronization, integration, and transmission unit 130, an image transmission interface 140, and a host 150.
  • The multi-channel image transmission system 100 receives images captured by four cameras 111, 112, 113, and 114 through the camera interface 120, performs synchronization and real-time image integration in the image synchronization, integration, and transmission unit 130, and transmits the resulting image to the host 150 (a PC or an embedded system) through a USB 3.0 transceiver serving as the image transmission interface 140.
  • A plurality of (1 to N) cameras 111, 112, 113, and 114 for photographing a specific subject may be disposed in the camera unit 110 of the multi-channel image transmission system 100 according to the present invention.
  • FIG. 1 shows four cameras, but the number is not limited thereto; at least two cameras may be used as circumstances require.
  • The N cameras may be arranged in a line on the same plane with respect to an arbitrary reference camera, or they may be arranged sequentially on a circumference spaced a predetermined distance from the subject.
  • The camera interface 120 of the multi-channel video transmission system 100 may use a high-speed serial link such as FPD LINK 3 or Gigabit Multimedia Serial Link (GMSL).
  • FPD LINK 3 and Gigabit Multimedia Serial Link (GMSL) are technologies used to extend camera cable length.
  • In the case of the DS90UB913/DS90UB914, the video signal and the I2C control signal for camera control can be extended by about 10 meters over a coaxial cable or a single LVDS pair, thereby extending the distance between the synchronization board and the camera.
  • The image synchronization, integration, and transmission unit 130 receives the images captured by the plurality of cameras of the camera unit 110 and delivered through the camera interface 120, synchronizes them, integrates them in real time, and transmits the result to the host 150 through the transmission interface 140.
  • Synchronizing, integrating, and transmitting the images in real time is performed by a Field Programmable Gate Array (FPGA); even a low-cost FPGA can process images at high speed and in large volumes through parallel processing.
  • Image data captured by the multiple cameras is written line by line into dual-port memory in the FPGA and then read out sequentially in the order in which the cameras are connected. As a result, the synchronized and merged image looks as if the input images were placed side by side.
  • To use this method, the timing difference between the camera image signals must not exceed one line; to satisfy this condition, the same trigger signal is applied to every camera.
  • By using line buffering instead of the frame buffering employed in conventional approaches, the time required to synchronize and integrate images in real time can be reduced from tens of milliseconds (ms) to tens of microseconds (µs).
  • For a 1280x720 camera with a frame rate of 30 fps, buffering one frame takes 33.33 ms, while buffering one line under the same conditions takes about 47 µs; the multi-channel image transmission system according to the present invention therefore significantly reduces the time required for image synchronization and integration compared to the prior art.
  • The video transmission interface 140 uses a USB 3.0 transceiver conforming to the USB 3.0 UVC (USB Video Class) standard. This interface is operating-system independent and can be used widely with hosts 150 such as PCs and embedded systems.
  • The image synchronized by the image synchronization, integration, and transmission unit 130 is itself a single image, with the camera images joined in the horizontal direction.
  • This image is output through the USB 3.0 transceiver serving as the image transmission interface 140 to the host 150, such as a PC or embedded system.
  • FIG. 2 is a data flow diagram for explaining a video synchronization integration and transmission process of the multi-channel video transmission system according to the present invention.
  • Referring to FIG. 2, in the case of a four-channel camera, four images (Image of camera 1 to Image of camera 4), each with a horizontal resolution of X pixels and a vertical resolution of Y lines, are converted using dual-port memory into a single image with a horizontal resolution of X * 4 pixels and a vertical resolution of Y lines.
  • FIGS. 3A and 3B are photographs illustrating an embodiment in which the image synchronization and integration unit of the multi-channel image transmission system according to the present invention is implemented.
  • FIG. 3A is a photograph showing the FPGA hardware of the video synchronization, integration, and transmission unit of the multi-channel video transmission system according to the present invention, and FIG. 3B is a photograph showing the FPD LINK 3 of the camera interface unit.
  • FIGS. 4A to 4C are diagrams illustrating an example of the operation result of the video synchronization, integration, and transmission unit of the multi-channel video transmission system according to the present invention.
  • FIG. 4A shows the transmission screen of an image captured with a 4-channel camera, FIG. 4B shows the camera image transmission information, and FIG. 4C shows the image synchronization, integration, and transmission unit in operation.
  • In the case of a four-channel camera, the output image has four times the horizontal resolution: here, the resolution of each camera is 1280x720, the resolution of the synchronized and integrated image is 5120x720, and the frame rate is 30 fps.
  • The maximum video transmission bandwidth can be verified by generating video at a frame rate of 78 fps and a resolution of 1920*1080; if the performance of the chip is improved, the maximum transmission bandwidth may increase further.
  • If the code that receives and stores or processes the images is written in an OS-independent programming language (for example, Python) and the cameras are configured as UVC devices, there is no need to program differently for each camera or OS.
  • As described above, the present invention relates to a multi-channel video synchronization and real-time transmission method in which a high-speed serial link (FPD LINK 3, GMSL, etc.) is used for multi-channel video input to extend the cable distance, the received video is synchronized and integrated in real time in the FPGA, and the USB 3.0 UVC standard is used so that various cameras with various resolutions and frame rates can be used selectively.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to a multi-channel image transmission system and method that receive a plurality of camera inputs, synchronize and integrate them in real time, and transmit them as a single image, thereby facilitating the transmission of a large volume of images. According to the multi-channel image transmission system and method of the present invention, when a plurality of cameras are synchronized and integrated in real time and transmitted through USB 3.0 USB Video Class (UVC), a multi-channel camera can easily be applied to a PC or various embedded systems without a separate SDK or driver.

Description

Multi-channel video transmission system and method
The present invention relates to a multi-channel video transmission system and method, and more particularly to a multi-channel video transmission system and method that simplify the transmission of large volumes of video by receiving multiple camera inputs, synchronizing them, and integrating them in real time into a single video for transmission.
Cameras are among the core sensors of an autonomous driving platform; because many cameras are required for autonomous driving, a large amount of data must be transmitted in real time.
Connecting a camera to a PC usually requires a Gigabit Ethernet or USB 2.0/USB 3.0 interface. Various cameras are on the market, but connecting and synchronizing multiple cameras is limited by the number of available interface ports and by the synchronization method, and each camera must be connected separately to the deep learning platform.
Connecting a camera to an embedded system is even more difficult because such systems generally have fewer interfaces than a PC. Some systems, such as the Nvidia Jetson Nano, Nvidia Jetson TX2, Nvidia Jetson Xavier, and Raspberry Pi 4 Model B, support the MIPI interface, but they support few cameras, the cables are difficult to lengthen, and the manufacturer's SDK (Software Development Kit) must be used.
Therefore, the need for a multi-channel camera that can easily be connected to PCs and various embedded systems has grown.
To solve these problems, an object of the present invention is to provide a multi-channel image transmission system and method that allow a multi-channel camera to be applied to a PC or various embedded systems by synchronizing, integrating, and transmitting the outputs of a plurality of cameras in real time.
A multi-channel image transmission system according to the present invention includes: a camera unit including a plurality of cameras for photographing a subject; a camera interface for receiving and forwarding the images captured by the camera unit; an image synchronization, integration, and transmission unit for synchronizing, integrating, and transmitting the captured images; a video transmission interface for video transmission; and a host that receives the synchronized and integrated image.
In addition, the multi-channel image transmission method according to the present invention is characterized in that images captured by a plurality of cameras are received through a camera interface, synchronized and integrated, and transmitted to a host through an image transmission interface.
According to the multi-channel video transmission system of the present invention, when a plurality of cameras are synchronized and integrated in real time and transmitted over USB 3.0 UVC (USB Video Class), a multi-channel camera can easily be applied to a PC or various embedded systems without a separate SDK or driver.
With UVC, image transmission is possible through the standard driver included in the OS, without a separate driver, and various UVC-compatible programs can be used. The basic image-processing examples of deep learning platforms typically use a UVC camera, so the system can easily be applied to various platforms.
In particular, it can be used for multi-channel image analysis on low-cost deep learning systems equipped with a GPU.
FIG. 1 is a block diagram for explaining the concept of the multi-channel video transmission system according to the present invention.
FIG. 2 is a data flow diagram for explaining the image synchronization and integration process of the multi-channel image transmission system according to the present invention.
FIGS. 3A and 3B are photographs illustrating an embodiment of the image synchronization and integration unit of the multi-channel image transmission system according to the present invention.
FIGS. 4A to 4C are diagrams illustrating an example of the operation result of the image synchronization and integration unit of the multi-channel image transmission system according to the present invention.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The terms used in this specification and in the claims should not be interpreted as limited to their ordinary or dictionary meanings, but should be interpreted with meanings and concepts consistent with the technical idea of the present invention.
The configurations shown in the embodiments and drawings described in this specification are preferred embodiments of the present invention and do not represent the entire technical spirit of the invention; various equivalents and modifications capable of replacing them may exist at the time of filing.
FIG. 1 is a block diagram for explaining the concept of the multi-channel video transmission system according to the present invention.
Referring to FIG. 1, a multi-channel image transmission system 100 according to the present invention includes a camera unit 110, a camera interface 120, an image synchronization, integration, and transmission unit 130, an image transmission interface 140, and a host 150.
The multi-channel image transmission system 100 according to the present invention receives images captured by the four cameras 111, 112, 113, and 114 through the camera interface 120, performs synchronization and real-time image integration in the image synchronization, integration, and transmission unit 130, and transmits the resulting image to the host 150 (a PC or an embedded system) through a USB 3.0 transceiver serving as the image transmission interface 140.
A plurality of (1 to N) cameras 111, 112, 113, and 114 for photographing a specific subject may be disposed in the camera unit 110 of the multi-channel image transmission system 100 according to the present invention. FIG. 1 shows four cameras, but the number is not limited thereto; at least two cameras may be used as circumstances require.
The N cameras may be arranged in a line on the same plane with respect to an arbitrary reference camera, or they may be arranged sequentially on a circumference spaced a predetermined distance from the subject.
The camera interface 120 of the multi-channel video transmission system 100 according to the present invention may use a high-speed serial link such as FPD LINK 3 or Gigabit Multimedia Serial Link (GMSL). FPD LINK 3 and GMSL are technologies used to extend camera cable length. In the case of the DS90UB913/DS90UB914, the video signal and the I2C control signal for camera control can be extended by about 10 meters over a coaxial cable or a single LVDS pair, thereby extending the distance between the synchronization board and the camera.
The image synchronization, integration, and transmission unit 130 receives the images captured by the plurality of cameras of the camera unit 110 and delivered through the camera interface 120, synchronizes them, integrates them in real time, and transmits the result to the host 150 through the transmission interface 140.
Synchronizing, integrating, and transmitting the images in real time is performed by a Field Programmable Gate Array (FPGA). FPGAs are used for real-time image and data processing; even a low-cost FPGA can process images at high speed and in large volumes through parallel processing.
Image data captured by the multiple cameras is written line by line into dual-port memory in the FPGA and then read out sequentially in the order in which the cameras are connected. As a result, the synchronized and merged image looks as if the input images were placed side by side.
To use this method, the timing difference between the camera image signals must not exceed one line; to satisfy this condition, the same trigger signal is applied to every camera.
In the multi-channel video transmission system according to the present invention, line buffering is used instead of the frame buffering employed in the conventional approach to synchronizing, integrating, and transmitting images, so the time required to synchronize and integrate images in real time can be reduced from tens of milliseconds (ms) to tens of microseconds (µs).
For a 1280x720 camera with a frame rate of 30 fps, buffering one frame takes 33.33 ms, while buffering one line under the same conditions takes about 47 µs; the multi-channel image transmission system according to the present invention therefore significantly reduces the time required for image synchronization and integration compared to the prior art.
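As a rough cross-check of these figures (an illustrative sketch, not part of the original disclosure), the frame and line periods follow directly from the resolution and frame rate; the small gap between the computed line time and the quoted 47 µs can be attributed to line blanking and rounding.

```python
# Rough latency comparison between frame buffering and line buffering
# for a 1280x720 camera at 30 fps (illustrative only; the ~47 us figure
# quoted above additionally reflects blanking intervals and rounding).
HEIGHT, FPS = 720, 30   # vertical resolution (lines) and frame rate

frame_time_ms = 1000.0 / FPS                 # time to accumulate one full frame
line_time_us = 1_000_000.0 / (FPS * HEIGHT)  # time to accumulate one video line

print(f"1 frame buffered: {frame_time_ms:.2f} ms")  # ~33.33 ms
print(f"1 line buffered:  {line_time_us:.1f} us")   # ~46.3 us
```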
The video transmission interface 140 uses a USB 3.0 transceiver conforming to the USB 3.0 UVC (USB Video Class) standard. This interface is operating-system independent and can be used widely with hosts 150 such as PCs and embedded systems.
The image synchronized by the image synchronization, integration, and transmission unit 130 is itself a single image, with the camera images joined in the horizontal direction. This image is output through the USB 3.0 transceiver serving as the image transmission interface 140 to the host 150, such as a PC or embedded system.
FIG. 2 is a data flow diagram for explaining the video synchronization, integration, and transmission process of the multi-channel video transmission system according to the present invention.
Referring to FIG. 2, in the case of a four-channel camera, four images (Image of camera 1 to Image of camera 4), each with a horizontal resolution of X pixels and a vertical resolution of Y lines, are converted using dual-port memory into a single image with a horizontal resolution of X * 4 pixels and a vertical resolution of Y lines.
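A software analogue of this dual-port-memory line interleaving can be written in a few lines of Python (a minimal sketch for illustration only, with NumPy arrays standing in for the FPGA line buffers; none of the names below come from the patent): one line is taken from each camera in connection order and copied into a single output line of width X * 4.

```python
import numpy as np

def merge_line_by_line(frames):
    """Software model of the FPGA line buffer: for each video line, copy that
    line from every camera, in connection order, into one wide output line."""
    n = len(frames)
    y, x = frames[0].shape[:2]              # each camera delivers Y lines of X pixels
    merged = np.empty((y, x * n) + frames[0].shape[2:], dtype=frames[0].dtype)
    for line in range(y):                   # line by line, as the dual-port memory is read
        for cam, frame in enumerate(frames):
            merged[line, cam * x:(cam + 1) * x] = frame[line]
    return merged

# Example: four synthetic 1280x720 frames in a YUY2-like 2-byte-per-pixel layout
cams = [np.full((720, 1280, 2), i, dtype=np.uint8) for i in range(4)]
wide = merge_line_by_line(cams)
print(wide.shape)  # (720, 5120, 2) -> a single 5120x720 composite image
```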
FIGS. 3A and 3B are photographs illustrating an embodiment of the image synchronization and integration unit of the multi-channel image transmission system according to the present invention.
FIG. 3A is a photograph showing the FPGA hardware of the video synchronization, integration, and transmission unit of the multi-channel video transmission system according to the present invention, and FIG. 3B is a photograph showing the FPD LINK 3 of the camera interface unit.
FIGS. 4A to 4C are diagrams illustrating an example of the operation result of the video synchronization, integration, and transmission unit of the multi-channel video transmission system according to the present invention.
FIG. 4A shows the transmission screen of an image captured with a 4-channel camera, FIG. 4B shows the camera image transmission information, and FIG. 4C shows the image synchronization, integration, and transmission unit in operation.
Referring to FIGS. 4A to 4C, it can be confirmed with a general-purpose UVC capture program that the synchronized and real-time integrated image was transmitted to the PC through the USB 3.0 transceiver.
As shown in FIGS. 4A and 4B, in the case of a four-channel camera, the output image has four times the horizontal resolution. Here, the resolution of each camera is 1280x720, the resolution of the synchronized and integrated image is 5120x720, and the frame rate is 30 fps.
The data bandwidth of a single 1280x720/30fps camera in the YUY2 format (16 bits per pixel) is 1280*720*16*30/1024/1024 = 422 Mbps = 52.75 MBps, and the data bandwidth of four 1280x720/30fps cameras in the same format is 1280*720*16*30*4/1024/1024 = 1688 Mbps = 211 MBps.
The maximum video transmission bandwidth can be verified by generating video at a frame rate of 78 fps and a resolution of 1920*1080; if the performance of the chip is improved, the maximum transmission bandwidth may increase further.
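These bandwidth figures can be reproduced with simple arithmetic; the helper below is a sketch for checking the numbers (not part of the original text) and uses the same 1024*1024 divisor as the calculation above, covering both the four-camera 1280x720/30fps case and the 1920*1080/78fps maximum-bandwidth test.

```python
def video_bandwidth_mbps(width, height, fps, bits_per_pixel=16, cameras=1):
    """Raw video data rate in Mbps, using the 1024*1024 divisor from the text."""
    return width * height * bits_per_pixel * fps * cameras / 1024 / 1024

cases = [
    ("1x 1280x720 @ 30 fps", video_bandwidth_mbps(1280, 720, 30)),             # ~422 Mbps
    ("4x 1280x720 @ 30 fps", video_bandwidth_mbps(1280, 720, 30, cameras=4)),  # ~1688 Mbps
    ("1x 1920x1080 @ 78 fps", video_bandwidth_mbps(1920, 1080, 78)),           # ~2468 Mbps
]
for label, mbps in cases:
    print(f"{label}: {mbps:.0f} Mbps = {mbps / 8:.2f} MBps")
```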
According to the multi-channel image transmission system and method of the present invention, if the code that receives and stores or processes the images is written in an OS-independent programming language (for example, Python) and the cameras are configured as UVC devices, there is no need to program differently for each camera or OS.
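As a concrete example of such OS-independent host code, the sketch below uses OpenCV's standard UVC capture path to open the composite stream and slice it back into per-camera views. This is a hedged illustration only: the device index, the 1280x720 per-camera resolution, and the four-camera count are assumptions about a particular setup, not values fixed by the patent.

```python
import cv2  # OpenCV reads UVC devices through the OS's standard driver; no vendor SDK needed

NUM_CAMERAS = 4                              # assumed camera count
CAM_WIDTH, CAM_HEIGHT = 1280, 720            # assumed per-camera resolution

cap = cv2.VideoCapture(0)                    # device index 0 is an assumption
cap.set(cv2.CAP_PROP_FRAME_WIDTH, CAM_WIDTH * NUM_CAMERAS)  # request the 5120x720 composite
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, CAM_HEIGHT)

ok, composite = cap.read()                   # one synchronized wide frame
if ok:
    # Slice the side-by-side composite back into individual camera views.
    views = [composite[:, i * CAM_WIDTH:(i + 1) * CAM_WIDTH]
             for i in range(NUM_CAMERAS)]
    for i, view in enumerate(views):
        cv2.imwrite(f"camera_{i + 1}.png", view)
cap.release()
```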
As described above, the present invention relates to a multi-channel video synchronization and real-time transmission method in which a high-speed serial link (FPD LINK 3, GMSL, etc.) is used for multi-channel video input to extend the cable distance, the received video is synchronized and integrated in real time in the FPGA, and the USB 3.0 UVC standard is used so that various cameras with various resolutions and frame rates can be used selectively.
Through this, a single system can both acquire the high-resolution, low-frame-rate video data needed for deep learning training and perform the real-time processing of low-resolution, high-frame-rate video needed for deep learning inference.
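To make the idea of selecting cameras, resolutions, and frame rates within the maximum bandwidth concrete, a host-side helper might check whether a requested configuration fits before switching modes. The function below is a sketch under the assumption that the 1920*1080/78fps measurement above is treated as the available budget; it is not a limit stated in the patent.

```python
# Assumed budget: the maximum measured payload quoted above (1920*1080, 78 fps, 16 bits/pixel).
MAX_BANDWIDTH_MBPS = 1920 * 1080 * 16 * 78 / 1024 / 1024   # ~2468 Mbps

def config_fits(cameras, width, height, fps, bits_per_pixel=16):
    """Return True if the requested multi-camera mode stays within the assumed
    maximum video transmission bandwidth (illustrative sketch only)."""
    required = cameras * width * height * bits_per_pixel * fps / 1024 / 1024
    return required <= MAX_BANDWIDTH_MBPS

print(config_fits(4, 1280, 720, 30))    # True  (~1688 Mbps)
print(config_fits(4, 1920, 1080, 30))   # False (~3797 Mbps)
```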

Claims (12)

  1. A multi-channel image transmission system comprising:
    a camera unit including a plurality of cameras for photographing a subject;
    a camera interface for receiving and forwarding the images captured by the camera unit;
    an image synchronization, integration, and transmission unit for synchronizing, integrating, and transmitting the images captured by the camera unit;
    a video transmission interface for video transmission; and
    a host that receives the synchronized and integrated image.
  2. The multi-channel video transmission system of claim 1, wherein the camera interface
    uses a high-speed serial link of FPD LINK 3 or Gigabit Multimedia Serial Link (GMSL).
  3. The multi-channel video transmission system of claim 1, wherein the image synchronization, integration, and transmission unit
    is a Field Programmable Gate Array (FPGA).
  4. The multi-channel video transmission system of claim 3, wherein the image synchronization, integration, and transmission unit
    performs line buffering using dual-port memory.
  5. The multi-channel video transmission system of claim 4,
    wherein the horizontal resolution of the image synchronized by the image synchronization, integration, and transmission unit is increased by a factor equal to the number of cameras.
  6. The multi-channel video transmission system of claim 1, wherein the video transmission interface
    performs video transmission using USB 3.0 UVC (USB Video Class).
  7. The multi-channel video transmission system of claim 6, wherein the video transmission interface
    is a USB 3.0 transceiver.
  8. The multi-channel video transmission system of claim 1, wherein the host
    is a PC or an embedded system.
  9. A multi-channel video transmission method comprising:
    receiving images captured by a plurality of cameras through a camera interface, synchronizing and integrating the images, and transmitting the result to a host through a video transmission interface.
  10. The multi-channel video transmission method of claim 9,
    wherein the images are synchronized and integrated through line buffering using dual-port memory.
  11. The multi-channel video transmission method of claim 10,
    wherein the camera interface uses FPD LINK 3 or Gigabit Multimedia Serial Link (GMSL), and
    the video transmission interface uses USB 3.0 UVC (USB Video Class).
  12. The multi-channel video transmission method of claim 11,
    wherein various resolutions can be implemented using the plurality of cameras, and the camera, resolution, and frame rate to be used can be changed within the maximum video data bandwidth.
PCT/KR2019/018317 2019-12-23 2019-12-23 Multi-channel image transmission system and method WO2021132739A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190172636A KR20210080753A (en) 2019-12-23 2019-12-23 TRANSMITTING SYSTEM AND METHOD FOR Multi Channel Image
KR10-2019-0172636 2019-12-23

Publications (1)

Publication Number Publication Date
WO2021132739A1 true WO2021132739A1 (en) 2021-07-01

Family

ID=76574819

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/018317 WO2021132739A1 (en) 2019-12-23 2019-12-23 Multi-channel image transmission system and method

Country Status (2)

Country Link
KR (1) KR20210080753A (en)
WO (1) WO2021132739A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006325068A (en) * 2005-05-20 2006-11-30 Auto Network Gijutsu Kenkyusho:Kk Electronic controller for image processing, image processing system, image pickup processing system, image display system and image pickup display system
KR101239740B1 (en) * 2011-12-08 2013-03-18 아진산업(주) An apparatus for generating around view image of vehicle using polygon mapping and multi look-up table
KR101809727B1 (en) * 2016-09-27 2017-12-15 주식회사 켐트로닉스 Surround View Monitoring System and Image Signal Processing Method thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006325068A (en) * 2005-05-20 2006-11-30 Auto Network Gijutsu Kenkyusho:Kk Electronic controller for image processing, image processing system, image pickup processing system, image display system and image pickup display system
KR101239740B1 (en) * 2011-12-08 2013-03-18 아진산업(주) An apparatus for generating around view image of vehicle using polygon mapping and multi look-up table
KR101809727B1 (en) * 2016-09-27 2017-12-15 주식회사 켐트로닉스 Surround View Monitoring System and Image Signal Processing Method thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
KIM JUNG-GU; YOO JAE-HYUNG: "HW Implementation of Real-Time Road & Lane Detection in FPGA-Based Stereo Camera", 2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP), 27 February 2019 (2019-02-27), pages 1 - 4, XP033533427, DOI: 10.1109/BIGCOMP.2019.8679333 *
WIRSCHEM, T., MCCORMACK, P.: "Multi-camera design for parking assistants", ATZELEKTRONIC WORLDWIDE EMAGAZINE, vol. 6, no. 5, 7 October 2011 (2011-10-07), pages 34 - 37, XP009529680, ISSN: 1862-6211, DOI: 10.1365/s38314-011-0051-4 *

Also Published As

Publication number Publication date
KR20210080753A (en) 2021-07-01

Similar Documents

Publication Publication Date Title
US20210174481A1 (en) Processing apparatus, image sensor, and system
CN102724433A (en) Method and device for realizing multi-video signal image composition
WO2018070803A1 (en) Method and apparatus for session control support for field of view virtual reality streaming
EP1465401A2 (en) Transferring data from a digital imaging apparatus
EP3466045B1 (en) Processing apparatus, image sensor, and system
US10484732B2 (en) Data processing backplane with serial bus communication loop
KR101541771B1 (en) Displayport FPGA module of display test equipment
JP2023179491A (en) System, program and the like
CN113890977A (en) Airborne video processing device and unmanned aerial vehicle with same
WO2015060572A1 (en) Semiconductor automation facility real-time remote control system
WO2021132739A1 (en) Multi-channel image transmission system and method
KR102107299B1 (en) Method of transmitting image data and apparatuses performing the same
WO2020085571A1 (en) Image converting apparatus and system for generating 360 vr image in real time
WO2012044061A2 (en) Method and device for transmitting/receiving image data at high speed
CN208890933U (en) Industrial camera
Grunnet-Jepsen et al. Using the realsense d4xx depth sensors in multi-camera configurations
US20070076123A1 (en) Digital multi-source multi-destination video multiplexer and crossbar device
WO2019045245A1 (en) Synchronizing image captures in multiple sensor devices
WO2018062599A1 (en) Svm system, and image inputting and processing method therefor
EP4184913A1 (en) Fusion apparatus for multiple data transmission channels, and electronic device
CN108270960A (en) Image capturing device and its control method
WO2021066324A1 (en) Method of setting mirroring state between master device and client device, and electronic device for performing same
CN117156073B (en) Video data transmission device and system
CN117412187B (en) Image processing apparatus and processing method
WO2014073807A1 (en) Image management apparatus and image management method, and image management system including same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19957122

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08.12.2022)

122 Ep: pct application non-entry in european phase

Ref document number: 19957122

Country of ref document: EP

Kind code of ref document: A1