WO2023100415A1 - Information processing device, mobile body, information processing method, and program - Google Patents
Information processing device, mobile body, information processing method, and program
- Publication number
- WO2023100415A1 PCT/JP2022/028507 JP2022028507W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- image
- view image
- control unit
- bird
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/37—Details of the operation on graphic patterns
- G09G5/377—Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present invention relates to an information processing device, a mobile object, an information processing method, and a program.
- Japanese Patent Laid-Open No. 2002-200001 discloses a technique aimed at providing a display that makes it easy for a passenger to understand the traveling direction of a moving object and to concentrate on monitoring the surroundings.
- the surrounding information of the moving object is presented in an easy-to-understand manner.
- an information processing device has a control unit.
- the control unit generates one image based on a bird's-eye view image regarding the traveling direction of the moving object and a top view image including the moving object and its surroundings.
- The one image includes a first area displaying the bird's-eye view image related to the traveling direction and a second area displaying the top view image. Control is performed so that the one image is displayed on a display unit visible to the operator of the moving object.
- FIG. 1 is a diagram showing an example of a system configuration of an information processing system 1000.
- FIG. 2 is a diagram showing an example of the hardware configuration of the information processing device 110.
- FIG. 3 is a diagram showing an example of the functional configuration of the information processing device 110.
- FIG. 4 is an activity diagram showing an example of information processing by the information processing apparatus 110.
- FIG. 5 is a diagram (part 1) showing an example of a bird's-eye view image.
- FIG. 6 is a diagram (part 1) showing an example of a top view image.
- FIG. 7 is a diagram (part 1) showing an example of a synthesized image.
- FIG. 8 is a diagram (part 1) for explaining adjustment of the aspect ratio.
- FIG. 9 is a diagram (part 2) for explaining adjustment of the aspect ratio.
- FIG. 10 is a diagram (part 3) for explaining adjustment of the aspect ratio.
- FIG. 11 is a diagram (part 2) showing an example of a bird's-eye view image.
- FIG. 12 is a diagram (part 2) showing an example of a top view image.
- FIG. 13 is a diagram (part 2) showing an example of a synthesized image.
- FIG. 14 is a diagram showing an example of a displacement curve for the case where per decreases as the vehicle moves away from an object.
- FIG. 15 is a diagram showing an example of a displacement curve for the case where per increases as the vehicle approaches an object.
- FIG. 16 is a diagram showing an example of a top view image and a bird's eye view image when the vehicle is in reverse.
- FIG. 17 is a diagram showing an example of a synthesized image.
- FIG. 18 is a diagram illustrating an example of image processing for a top view image and a bird's eye view image.
- FIG. 19 is a diagram showing an example in which projective transformation is performed so that a rectangle becomes an inverted trapezoid in a bird's-eye view image.
- FIGS. 20A and 20B are diagrams illustrating an example of image processing for a top view image and a bird's-eye view image in Modification 3.
- FIG. 21 is a diagram showing an example in which a distance sensor is provided in automobile 100.
- FIG. 22 is a diagram illustrating an example of a display mode of a bird's-eye view image.
- the term "unit" may include, for example, a combination of hardware resources implemented by circuits in a broad sense and software information processing that can be concretely realized by these hardware resources.
- various types of information are handled in the present embodiment; this information is represented, for example, by physical signal values such as voltage and current, by high and low signal levels forming binary bit aggregates of 0s and 1s, or by quantum superposition (so-called quantum bits), and communication and computation can be performed on a circuit in a broad sense.
- a circuit in a broad sense is a circuit realized by appropriately combining at least circuits, circuitry, processors, memories, and the like; that is, it includes application-specific integrated circuits (ASIC) and programmable logic devices (for example, simple programmable logic devices (SPLD), complex programmable logic devices (CPLD), and field-programmable gate arrays (FPGA)).
- FIG. 1 is a diagram showing an example of the system configuration of an information processing system 1000.
- an information processing system 1000 includes an automobile 100 as a system configuration.
- a car is an example of a mobile object.
- Automobile 100 includes information processing device 110 , display 120 , and multiple cameras 130 .
- the information processing device 110 is a device that executes the processing of this embodiment. In the present embodiment, the information processing device 110 is described as being included in the vehicle 100, but may not be included in the vehicle 100 as long as it can communicate with the vehicle 100 or the like.
- the display 120 is a display device that displays a 3D synthesized image and the like, which will be described later, under the control of the information processing device 110 .
- Camera 130 is a camera that captures images of the surroundings of the automobile 100. More specifically, the camera 130 is an RGB camera or the like that captures images with color information. Camera 130-1 is provided at the rear of the automobile 100. Camera 130-2 is provided at the front of the automobile 100. Although not shown in FIG. 1, in addition to the cameras 130-1 and 130-2, cameras 130 are provided on the left and right sides of the automobile 100, respectively. These cameras are hereinafter simply referred to as the cameras 130. In addition, although not shown in FIG. 1 for simplicity of explanation, the automobile 100 has a plurality of depth sensors. The depth sensors measure the shape of objects around the automobile 100 and the distance from the automobile 100 to those objects using laser light or the like. However, the depth sensors are not limited to laser light; they may measure distance using ultrasonic waves, or a camera with a depth sensor may be used.
- FIG. 2 is a diagram illustrating an example of the hardware configuration of the information processing apparatus 110.
- the information processing apparatus 110 includes a control unit 201, a storage unit 202, and a communication unit 203 as a hardware configuration.
- the control unit 201 is a CPU (Central Processing Unit) or the like, and controls the entire information processing apparatus 110 .
- the storage unit 202 is any one of an HDD (Hard Disk Drive), ROM (Read Only Memory), RAM (Random Access Memory), SSD (Solid State Drive), or the like, or any combination thereof, and stores programs and data used when the control unit 201 executes processing based on a program.
- the control unit 201 executes processing based on a program stored in the storage unit 202, thereby realizing the functional configuration of the information processing device 110 shown in FIG. 3, the processing of the activity diagram shown in FIG. 4, and the like.
- in the present embodiment, data used when the control unit 201 executes processing based on a program is stored in the storage unit 202, but the data may instead be stored in another device with which the information processing device 110 can communicate.
- the communication unit 203 is a NIC (Network Interface Card) or the like, connects the information processing apparatus 110 to a network, and controls communication with other apparatuses (eg, display 120, camera 130, etc.).
- FIG. 3 is a diagram illustrating an example of the functional configuration of the information processing device 110 .
- the information processing apparatus 110 includes a peripheral information receiving unit 301, an object recognition unit 302, a behavior information receiving unit 303, and a display control unit 304 as functional configurations.
- the peripheral information receiving unit 301 receives information on objects around the vehicle 100, information on colors around the vehicle 100, and the like from the camera 130 as peripheral information. Further, the peripheral information receiving unit 301 receives the shapes of objects around the vehicle 100 and the distance from the vehicle 100 to the objects as peripheral information from the plurality of depth sensors.
- the object recognition unit 302 recognizes objects around the automobile 100 based on the surrounding information received by the surrounding information receiving unit 301 .
- Surrounding objects include, for example, buildings, roads, other vehicles, parking lots, and the like.
- the behavior information receiving unit 303 receives behavior information of the moving body. More specifically, the behavior information receiving unit 303 acquires information detected by a wheel speed sensor, a steering angle sensor, an inertial measurement unit (IMU), and the like included in the automobile 100 as the behavior information of the automobile 100. An inertial measurement unit is a device that detects three-dimensional inertial motion (translational motion and rotational motion in the three orthogonal axis directions); rotational motion is detected as an angular velocity in [deg/sec].
- the behavior information receiving unit 303 may receive information about the behavior of the automobile 100 from SLAM (Simultaneous Localization and Mapping), LiDAR, a stereo camera, a gyro sensor, or the like included in the automobile 100.
- the behavior information includes, for example, position information and orientation information of the vehicle 100 .
- the display control unit 304 converts a surrounding object into a 3D model based on information on the object recognized by the object recognition unit 302, information from the SLAM, and the like.
- the display control unit 304 also generates a 3D model of the automobile 100 based on information from the camera 130, information from SLAM, behavior information of the automobile 100 received by the behavior information receiving unit 303, and the like.
- the display control unit 304 also generates a top view image including the automobile 100 and its surroundings based on the 3D model of the surrounding objects, the 3D model of the automobile 100, information from the camera 130, and the like.
- the top view image is an image that has undergone perspective transformation so that the scene is viewed from a viewpoint above the road surface, and is an image that includes the entire automobile 100 or at least half of the automobile 100.
- for example, an image of the ground viewed from directly above is used as a base, and the top view image is obtained by synthesizing and morphing it with an image of the automobile 100 and the like.
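As a concrete illustration of this kind of perspective transformation, the following is a minimal sketch (not the patent's implementation) of warping a camera frame of the ground plane into a top view with OpenCV; the calibration points are invented placeholders and are assumed to come from a prior calibration of the camera 130.

```python
import cv2
import numpy as np

# Hypothetical calibration: four ground-plane points in the camera image (src_pts)
# and where they should land in the top view (dst_pts). Values are placeholders.
src_pts = np.float32([[420, 560], [860, 560], [1180, 720], [100, 720]])
dst_pts = np.float32([[300, 0], [500, 0], [500, 400], [300, 400]])
H = cv2.getPerspectiveTransform(src_pts, dst_pts)  # ground-plane homography

def to_top_view(frame: np.ndarray, size=(800, 600)) -> np.ndarray:
    """Warp a camera frame so that the road surface is seen from directly above."""
    return cv2.warpPerspective(frame, H, size)  # size is (width, height)

# The vehicle itself is not visible from its own cameras, so a rendered 3D model
# (or a stored sprite) of the automobile 100 would be composited onto the result.
```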
- the display control unit 304 generates one composite image based on the bird's-eye view image regarding the traveling direction of the automobile 100 and the top view image including the automobile 100 and its surroundings.
- the bird's-eye view image regarding the traveling direction of the automobile 100 is, for example, a bird's-eye view image of the area in front of the automobile 100 when the automobile 100 is moving forward, and a bird's-eye view image of the area behind the automobile 100 when the automobile 100 is moving backward.
- the forward bird's-eye view image is an image obtained by morphing the image captured by the camera 130-2 based on the peripheral information of the automobile 100 acquired by the peripheral information receiving unit 301.
- the rear bird's-eye view image is obtained by morphing the image captured by the camera 130-1 based on the peripheral information of the automobile 100 acquired by the peripheral information receiving unit 301.
- the synthesized image includes a first area displaying a bird's-eye view image related to the traveling direction and a second area displaying a top view image.
- the composite image is shown in FIG. 7 and the like, which will be described later.
- the display control unit 304 controls to display the generated composite image on the display 120 of the automobile 100 or the like.
- the display 120 of the automobile 100 is an example of a display visible to the operator of the automobile 100 .
- part or all of the functional configuration of FIG. 3 may be implemented in the information processing device 110 or the automobile 100 as a hardware configuration. Also, part or all of the functional configuration of the information processing device 110 may be implemented in the vehicle 100 .
- FIG. 4 is an activity diagram showing an example of information processing of the information processing apparatus 110 .
- the peripheral information receiving unit 301 receives information on objects around the vehicle 100, information on colors around the vehicle 100, etc. from the camera 130 as peripheral information. Further, the peripheral information receiving unit 301 receives the shapes of objects around the vehicle 100 and the distance from the vehicle 100 to the objects as peripheral information from the plurality of depth sensors.
- the object recognition unit 302 recognizes objects around the automobile 100 based on the surrounding information received by the surrounding information receiving unit 301 .
- the behavior information receiving unit 303 receives information about the behavior of the automobile 100.
- the display control unit 304 converts the surrounding objects into a 3D model based on the information on the object recognized by the object recognition unit 302, the information from the SLAM, and the like.
- the display control unit 304 also generates a 3D model of the automobile 100 based on information from the camera 130, information from SLAM, behavior information of the automobile 100 received by the behavior information receiving unit 303, and the like.
- the display control unit 304 also generates a top view image including the automobile 100 and its surroundings based on the 3D model of the surrounding objects, the 3D model of the automobile 100, information from the camera 130, and the like.
- the display control unit 304 also generates a bird's-eye view image regarding the traveling direction of the automobile 100 based on information from the camera 130 and the like.
- in A405, the display control unit 304 generates one composite image based on the bird's-eye view image regarding the traveling direction of the automobile 100 and the top view image including the automobile 100 and its surroundings. In A406, the display control unit 304 performs control to display the generated composite image on the display 120 of the automobile 100 or the like.
- FIG. 5 is a diagram (Part 1) showing an example of a bird's-eye view image regarding the traveling direction of the automobile 100.
- An image 501 is an example of a bird's-eye view image of the traveling direction of the automobile 100.
- FIG. 5 is a diagram of a scene in which the vehicle 100 is about to be parked in a parking space in reverse.
- the display control unit 304 identifies the driving scene of the automobile 100 based on the distance to objects around the automobile 100 obtained from a depth sensor or the like.
- the display control unit 304 changes the ratio of the first area and the ratio of the second area in the synthesized image based on the distance between the automobile 100 and the object existing in the traveling direction.
- the first display area and the second display area will be described later with reference to FIG. 7 and the like.
- the display control unit 304 changes the proportion of the first area and the proportion of the second area in the composite image in stages based on the distance between the automobile 100 and an object existing in the traveling direction. An example of this change is explained below.
- An example of an object existing in the direction of travel is an obstacle such as a car stop.
- the display control unit 304 may recognize and extract an object present in the traveling direction by image processing from the bird's-eye view image related to the traveling direction, or may extract an object around the automobile 100 recognized by the object recognition unit 302.
- the display control unit 304 may change the proportions not in two stages but in three stages (for example, 5 m or more; less than 5 m and 3 m or more; less than 3 m), in four stages (for example, 5 m or more; less than 5 m and 4 m or more; less than 4 m and 3 m or more; less than 3 m), or in more stages.
- in the following, a two-stage example with a threshold of 3 m is described; 3 m is merely an example.
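A minimal sketch of such a stepwise switch; the 3 m threshold and the 1:1 / 1:2 splits follow the example in the text, while the intermediate split in the three-stage variant is an assumed value.

```python
def select_split(distance_m: float) -> tuple[int, int]:
    """Return the (Hd, Hf) split of the composite image for the two-stage example."""
    if distance_m >= 3.0:
        return (1, 1)   # 3 m or more: bird's-eye view area equals top view area
    return (1, 2)       # less than 3 m: top view area is twice the bird's-eye area

def select_split_three_stage(distance_m: float) -> tuple[int, int]:
    """Three-stage variant with thresholds of 5 m and 3 m; the middle split is assumed."""
    if distance_m >= 5.0:
        return (1, 1)
    if distance_m >= 3.0:
        return (2, 3)   # assumed intermediate split
    return (1, 2)
```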
- let H and V be the vertical and horizontal lengths of the screen presented on the display 120 by the display control unit 304, let Hf and Vf be the vertical and horizontal lengths of the top view image that has undergone image processing such as cropping to generate the composite image, and let Hd and Vd be the vertical and horizontal lengths of the bird's-eye view image regarding the traveling direction of the automobile 100 that has undergone image processing such as cropping to generate the composite image.
- when the distance L between the automobile 100 and the object behind the automobile 100 is 3 m or more, the display control unit 304 controls Hf and Hd to be one to one. More specifically, the display control unit 304 cuts out the portion of the frame 502 from the image 501 in FIG. 5 to obtain an image 503.
- that is, in the bird's-eye view image regarding the traveling direction of the automobile 100, the display control unit 304 presents the portion far from the automobile 100 and deletes the portion near the automobile 100.
- the display control unit 304 always keeps the aspect ratio of the cropped image constant.
- the display control unit 304 cuts out the portion of the frame 602 from the top view image 601 in FIG. 6 to obtain an image 604.
- FIG. 6 is a diagram (part 1) showing an example of a top view image.
- the display control unit 304 combines the image 503 and the image 604 to generate the composite image 701 shown in FIG. 7.
- FIG. 7 is a diagram (part 1) showing an example of a synthesized image.
- Hd and Hf are one-to-one.
- a region 702 is the first area, and a region 703 is the second area. As shown in FIG. 7, the first area is placed at the top of the composite image 701, and the second area is placed at the bottom of the composite image 701.
- the first area is an area where a bird's-eye view image regarding the traveling direction of the automobile 100 is displayed.
- the second area is an area in which the top view image including the automobile 100 and its surroundings is displayed. In the composite image, the flow of the surrounding scenery in the first area and the second area matches the movement of the automobile, so the operator of the automobile 100 can intuitively understand the surrounding conditions and the like.
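The composition itself, with the bird's-eye view in the first area at the top and the top view in the second area at the bottom, can be sketched as a vertical stack; the cropping below is only a schematic stand-in for the frame-502/frame-602 cut-outs described above, and all sizes are illustrative.

```python
import numpy as np

def compose(birds_eye: np.ndarray, top_view: np.ndarray,
            split=(1, 1), out_h=720, out_w=1280) -> np.ndarray:
    """Stack the bird's-eye view (first area) above the top view (second area).

    split is the Hd:Hf division of the output height, e.g. (1, 1) or (1, 2)."""
    hd = out_h * split[0] // (split[0] + split[1])
    hf = out_h - hd

    def crop_resize(img: np.ndarray, h: int, w: int) -> np.ndarray:
        # Crop to the target aspect ratio (centred horizontally, keeping the rows
        # nearest the top of the frame -- schematic only), then nearest-neighbour resize.
        src_h, src_w = img.shape[:2]
        new_w = min(src_w, int(round(src_h * w / h)))
        new_h = min(src_h, int(round(new_w * h / w)))
        x0 = (src_w - new_w) // 2
        cropped = img[:new_h, x0:x0 + new_w]
        ys = np.linspace(0, new_h - 1, h).astype(int)
        xs = np.linspace(0, new_w - 1, w).astype(int)
        return cropped[ys][:, xs]

    first = crop_resize(birds_eye, hd, out_w)   # first area: traveling direction
    second = crop_resize(top_view, hf, out_w)   # second area: vehicle and surroundings
    return np.vstack([first, second])
```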
- FIG. 8 is a diagram (part 1) for explaining adjustment of the aspect ratio.
- 704 represents the automobile 100 .
- the lines in the first area 702 and the second area 703 are added for explanation and represent lines on the road surface.
- the display control unit 304 performs image processing to compress the portion corresponding to the automobile 100 in the top view image so that the left and right sides of the automobile 100 can be seen, as shown in FIG. 9.
- FIG. 9 is a diagram (part 2) for explaining adjustment of the aspect ratio.
- in addition, the display control unit 304 performs projective transformation so that a rectangle in the top view image becomes a trapezoid, as shown in FIG. 10, and synthesizes the top view image and the bird's-eye view image.
- by performing image processing in this manner, the lines of the top view image and the bird's-eye view image can be connected so that they appear as continuous straight lines.
- FIG. 10 is a diagram (part 3) for explaining adjustment of the aspect ratio.
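A sketch of this trapezoid-style projective transformation using OpenCV. The patent only specifies "rectangle to trapezoid" (and, elsewhere, "rectangle to inverted trapezoid"); which edge is narrowed and by how much are illustrative choices here.

```python
import cv2
import numpy as np

def warp_to_trapezoid(img: np.ndarray, inset: float = 0.15,
                      invert: bool = False) -> np.ndarray:
    """Map the rectangular image onto a trapezoid.

    invert=False narrows the top edge; invert=True narrows the bottom edge
    (an "inverted" trapezoid). The inset fraction is an arbitrary example value."""
    h, w = img.shape[:2]
    dx = inset * w
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    if invert:
        dst = np.float32([[0, 0], [w, 0], [w - dx, h], [dx, h]])
    else:
        dst = np.float32([[dx, 0], [w - dx, 0], [w, h], [0, h]])
    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(img, M, (w, h))
```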
- when the distance L between the automobile 100 and the object behind the automobile 100 is less than 3 m, the display control unit 304 controls Hf and Hd to be 1:2. More specifically, the display control unit 304 cuts out the portion of the frame 802 from the image 801 in FIG. 11 to obtain an image 803. That is, because distant portions are not important in the bird's-eye view image regarding the traveling direction of the automobile 100 in this situation, the display control unit 304 cuts out the portion near the automobile 100. FIG. 11 is a diagram (part 2) showing an example of a bird's-eye view image regarding the traveling direction of the automobile 100. The display control unit 304 then changes the scale of the aspect ratio of the cropped image 803 to obtain an image 804.
- FIG. 12 is a diagram (part 2) showing an example of a top view image.
- the display control unit 304 changes the horizontal scale of the image 904 to obtain an image 905.
- the display control unit 304 combines the image 804 and the image 905 to generate the composite image 1001 in FIG. 13 .
- FIG. 13 is a diagram (part 2) showing an example of a synthesized image. In FIG. 13, Hd and Hf are 1:2.
- the display control unit 304 continuously changes the proportion of the first area and the proportion of the second area in the composite image. More specifically, the display control unit 304 performs control such that the closer the automobile 100 is to the object, the smaller the proportion of the first area in the composite image becomes relative to the proportion of the second area.
- the display control unit 304 changes the area of the image displayed in each region using the area change rate (per) of the images displayed in the first area and the second area, together with the displacement curves shown in FIGS. 14 and 15, which will be described later.
- the display control unit 304 obtains the display area division (dif) from the displacement curve using the obtained per.
- the displacement curve is set as shown in FIG. 14 for the case where the automobile 100 moves away from the object (curveDec) and as shown in FIG. 15 for the case where the automobile 100 approaches the object (curveInc); that is, the automobile 100 is configured to have the two displacement curves of FIGS. 14 and 15. Also, curveInc is always located on the right side of curveDec, so that the display division does not switch back and forth when the distance changes slightly.
- FIG. 14 is a diagram showing an example of the displacement curve used when per decreases as the automobile 100 moves away from the object.
- FIG. 15 is a diagram showing an example of the displacement curve used when per increases as the automobile 100 approaches the object. In both FIGS. 14 and 15, the horizontal axis is per and the vertical axis is dif.
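The per-to-dif lookup with two curves (curveDec for moving away, curveInc for approaching) can be sketched with simple interpolation; the breakpoints below are invented for illustration, since the patent only specifies that curveInc lies to the right of curveDec.

```python
import numpy as np

# Illustrative displacement curves: per (horizontal axis) -> dif (vertical axis).
# curveInc is shifted to the right of curveDec, which keeps the display division
# from flickering when the measured distance wobbles around a threshold.
PER_DEC = np.array([0.0, 0.3, 0.7, 1.0])   # assumed breakpoints (curveDec)
DIF_DEC = np.array([0.5, 0.5, 0.8, 0.8])
PER_INC = np.array([0.0, 0.4, 0.8, 1.0])   # same shape, shifted right (curveInc)
DIF_INC = np.array([0.5, 0.5, 0.8, 0.8])

def display_division(per: float, approaching: bool) -> float:
    """Return dif, the division of the display area, for the current per value."""
    if approaching:
        return float(np.interp(per, PER_INC, DIF_INC))   # curveInc
    return float(np.interp(per, PER_DEC, DIF_DEC))       # curveDec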
- the display control unit 304 changes the depression angle with respect to the viewpoint of the top view image based on the distance between the automobile 100 and an object existing in the traveling direction. More specifically, the display control unit 304 performs control such that the closer the vehicle 100 is to the object, the greater the angle of depression with respect to the viewpoint of the top view image. At this time, the display control unit 304 uses the displacement curve described above to perform control such that the closer the vehicle 100 is to the object, the greater the angle of depression with respect to the viewpoint of the top view image.
- the display control unit 304 may change the rate of change in the angle of depression depending on whether the vehicle 100 is moving closer to the object or when the vehicle 100 is moving away from the object.
- the display control unit 304 may change the display ratio described above when the automobile 100 is moving backward, and may change the viewpoint when the automobile 100 is moving forward.
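The viewpoint change can be sketched in the same spirit: the depression angle of the virtual camera used for the top view is interpolated from the distance to the object and increases as the automobile gets closer, with a slightly different profile when receding; all angles and distances below are assumptions, not values from the patent.

```python
import numpy as np

def depression_angle_deg(distance_m: float, approaching: bool) -> float:
    """Depression angle of the virtual viewpoint as a function of obstacle distance.

    Closer to the object -> larger depression angle (view closer to straight down).
    A slightly different profile is used while moving away, so the rate of change
    differs between approaching and receding (values are illustrative)."""
    dist = np.array([0.5, 3.0, 8.0])            # assumed distances [m]
    if approaching:
        angle = np.array([85.0, 60.0, 40.0])    # assumed angles [deg]
    else:
        angle = np.array([85.0, 65.0, 45.0])
    return float(np.interp(distance_m, dist, angle))
```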
- as described above, the information processing device 110 presents the composite image on one screen and changes the viewpoint and/or the presentation area of the perspective transformation according to the surrounding conditions, so that the bird's-eye view image of the traveling direction and the top view image are represented continuously.
- intuitive surrounding information can be presented to the operator.
- the information processing device 110 considers the distance to objects such as obstacles around the automobile 100, and changes the viewpoint and/or the presentation area, thereby realizing a display method that makes the obstacles more recognizable.
- the operator can more intuitively understand the forward and backward movements of the vehicle when the vehicle is backing up, and can drive the automobile 100 safely.
- the automobile 100 has been described as an example of a moving object.
- the mobile object may be, for example, a so-called drone such as an unmanned aerial vehicle.
- the information processing device 110 may be included in the drone or may be included in the controller of the operator who operates the drone.
- when the moving body is a drone, a display unit that can be visually observed by the operator is provided in the controller.
- examples of the object include a mark indicating the departure and arrival location of the drone. According to the modified example, it is also possible for the operator to easily understand the moving direction of the moving object, and to present the peripheral information of the moving object in an easy-to-understand manner.
- FIG. 16 is a diagram showing an example of a top view image and a bird's eye view image when the vehicle is in reverse.
- Arrow 1601 indicates the traveling direction of automobile 100 .
- Image 1602 is a top view image.
- An image 1603 is a bird's-eye view image.
- FIG. 17 is a diagram showing an example of a synthesized image.
- a synthesized image 1701 is an image created by synthesizing the top view image 1602 and the bird's-eye view image 1603 shown in FIG. 16.
- FIG. 18 is a diagram illustrating an example of image processing for a top view image and a bird's eye view image.
- the display control unit 304 turns the top view image upside down so that the vehicle faces downward.
- the display control unit 304 performs rectangular clipping on the vertically inverted top view image so as to have a predetermined area Sa. That is, the display control unit 304 does not draw a portion other than the predetermined area Sa of the top view image that is vertically inverted.
- the display control unit 304 projectively transforms the rectangle-clipped top view image so that the rectangle becomes a trapezoid.
- the display control unit 304 performs rectangular clipping so that the bird's-eye view image has a predetermined area Sb. That is, the display control unit 304 does not draw a portion other than the predetermined area Sb of the predetermined bird's-eye view image.
- the display control unit 304 changes the ratio of Sa to Sb described above according to the distance between the automobile 100 and the obstacle.
- the display control unit 304 performs projective transformation so as to correct the distortion of the bird's-eye view image subjected to rectangular clipping.
- the display control unit 304 synthesizes the projectively transformed top view image and the projectively transformed bird's eye view image to generate one synthesized image. Note that while the automobile 100 is moving, the processing shown in FIG. 18 is repeatedly executed.
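Putting the steps of FIG. 18 together, one per-frame iteration might be sketched as follows; the clip fractions, the Sa/Sb policy, and the taper directions are placeholders chosen only to make the sequence concrete.

```python
import cv2
import numpy as np

def modification2_frame(top_view: np.ndarray, birds_eye: np.ndarray,
                        sa_frac: float = 0.5, sb_frac: float = 0.5,
                        inset: float = 0.15) -> np.ndarray:
    """One iteration of FIG. 18 style processing (illustrative parameters only)."""
    h, w = top_view.shape[:2]

    # 1) Flip the top view vertically so the vehicle faces downward.
    top = cv2.flip(top_view, 0)

    # 2) Rectangular clipping: keep only the area Sa of the flipped top view and
    #    the area Sb of the bird's-eye view (here: a fraction of each height).
    top = top[: int(h * sa_frac)]
    bev = birds_eye[int(birds_eye.shape[0] * (1 - sb_frac)):]

    # 3) Projective transformation: top view -> trapezoid,
    #    bird's-eye view -> inverted trapezoid (taper directions are assumed).
    def warp(img, narrow_top):
        ih, iw = img.shape[:2]
        dx = inset * iw
        src = np.float32([[0, 0], [iw, 0], [iw, ih], [0, ih]])
        dst = (np.float32([[dx, 0], [iw - dx, 0], [iw, ih], [0, ih]]) if narrow_top
               else np.float32([[0, 0], [iw, 0], [iw - dx, ih], [dx, ih]]))
        return cv2.warpPerspective(img, cv2.getPerspectiveTransform(src, dst), (iw, ih))

    top = warp(top, narrow_top=True)
    bev = warp(bev, narrow_top=False)

    # 4) Synthesize: bird's-eye view above, top view below, widths matched.
    bev = cv2.resize(bev, (w, bev.shape[0]))
    top = cv2.resize(top, (w, top.shape[0]))
    return np.vstack([bev, top])
```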
- FIG. 19 is a diagram showing an example in which projective transformation is performed so that a rectangle becomes an inverted trapezoid in a bird's-eye view image.
- Projective transformation is performed so that the rectangle becomes an inverted trapezoid, and one image is generated by synthesizing the top view image subjected to the projective transformation and the overhead view image subjected to the projective transformation.
- by transforming the top view image and the bird's-eye view image as shown in FIG. 19, it is possible to connect straight lines smoothly between the two images.
- FIGS. 20A and 20B are diagrams illustrating an example of image processing for a top view image and a bird's-eye view image in Modification 3.
- In S2001, the display control unit 304 turns the top view image upside down so that the vehicle faces downward.
- the display control unit 304 sets an ROI (Region Of Interest).
- the display control unit 304 sets the ROI so that the upper side of the top view image and the lower side of the bird's eye view image match in three-dimensional position.
- the display control unit 304 performs projective transformation so that the rectangle of the top view image becomes a trapezoid.
- the display control unit 304 sets an ROI (Region Of Interest).
- the display control unit 304 sets the ROI so that the lower side of the bird's-eye view image and the upper side of the top view image match at three-dimensional positions. Note that when the automobile 100 approaches an obstacle, the display control unit 304 sets the ROI so that the near area is enlarged in the vertical direction and the display area in front is enlarged in the horizontal direction.
- the display control unit 304 performs projective transformation so that the rectangle of the overhead view image becomes an inverted trapezoid.
- the display control unit 304 combines the top view image that has undergone the projective transformation and the overhead view image that has undergone the projective transformation.
- in S2007, the display control unit 304 enlarges or reduces the combined image to fit the display 120. The display control unit 304 also clips the combined image. Note that while the automobile 100 is moving, the processing shown in FIGS. 20A and 20B is repeatedly executed.
- FIG. 21 is a diagram showing an example in which distance sensors are provided in the automobile 100. FIG. 21 shows an example in which distance sensors are provided on the rear left and right sides of the automobile 100. However, the number of distance sensors is not limited to two. Also, the distance sensors may measure the distance to an object existing on the side of or in front of the automobile 100.
- as shown in FIG. 21, for example, when an object exists at the right rear of the automobile 100, the display control unit 304 of Modification 4 performs projective transformation on the left portion of the bird's-eye view image so that the rectangle becomes an inverted trapezoid, as shown in FIG. 22. Note that in FIG. 22, the right portion of the bird's-eye view image is not subjected to projective transformation such that the rectangle becomes an inverted trapezoid.
- conversely, when an object exists at the left rear of the automobile 100, the display control unit 304 performs projective transformation on the right portion of the bird's-eye view image so that the rectangle becomes an inverted trapezoid, and the left portion of the bird's-eye view image is not subjected to such projective transformation. That is, the display control unit 304 performs control so that the display form of the bird's-eye view image is changed depending on whether the object is on the left or the right of the moving body.
- FIG. 22 is a diagram illustrating an example of a display mode of a bird's-eye view image.
- projective transformation is performed on the left side of the bird's eye view image so that the rectangle becomes an inverted trapezoid, and projective transformation is not performed on the right side of the bird's eye view image.
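Modification 4's left/right switching can be expressed as a small conditional around the same kind of warp; which half is warped follows the FIG. 22 description for an object at the right rear, while the inset amount and the corner that is pulled in are illustrative choices.

```python
import cv2
import numpy as np

def warp_half_inverted_trapezoid(birds_eye: np.ndarray, object_on_right: bool,
                                 inset: float = 0.2) -> np.ndarray:
    """Warp one lateral half of the bird's-eye view toward an inverted trapezoid.

    Per the FIG. 22 example, an object detected at the right rear leads to the left
    half being warped (and vice versa); the other half stays rectangular."""
    out = birds_eye.copy()
    h, w = out.shape[:2]
    half = w // 2
    x0, x1 = (0, half) if object_on_right else (half, w)   # half opposite the object
    region = out[:, x0:x1]
    rh, rw = region.shape[:2]
    dx = inset * rw
    src = np.float32([[0, 0], [rw, 0], [rw, rh], [0, rh]])
    if object_on_right:
        dst = np.float32([[0, 0], [rw, 0], [rw, rh], [dx, rh]])      # pull bottom-left in
    else:
        dst = np.float32([[0, 0], [rw, 0], [rw - dx, rh], [0, rh]])  # pull bottom-right in
    M = cv2.getPerspectiveTransform(src, dst)
    out[:, x0:x1] = cv2.warpPerspective(region, M, (rw, rh))
    return out
```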
- (1) An information processing device having a control unit, wherein the control unit generates one image based on a bird's-eye view image related to the traveling direction of a moving body and a top view image including the moving body and the periphery of the moving body, the one image includes a first area for displaying the bird's-eye view image related to the traveling direction and a second area for displaying the top view image, and the control unit performs control to display the one image on a display unit that is visible to an operator of the moving body.
- (2) The information processing device, wherein the first area is arranged at the top of the one image and the second area is arranged at the bottom of the one image.
- (3) The information processing device, wherein the control unit changes the ratio of the first area and the ratio of the second area in the one image based on the distance between the moving body and an object existing in the traveling direction.
- (4) The information processing device, wherein the control unit continuously changes the ratio of the first area and the ratio of the second area in the one image based on the distance.
- (5) The information processing device, wherein the control unit changes the ratio of the first area and the ratio of the second area in the one image step by step based on the distance.
- (6) The information processing device, wherein the control unit performs control so that the ratio of the first area in the one image becomes smaller than the ratio of the second area as the moving body approaches the object.
- (7) The information processing device, wherein, when an object exists in the traveling direction, the control unit performs projective transformation so that a rectangle in the top view image becomes a trapezoid, and generates the one image by synthesizing the transformed top view image with the bird's-eye view image.
- (8) The information processing device, wherein, when an object exists in the traveling direction, the control unit performs projective transformation so that a rectangle in the top view image becomes a trapezoid, performs projective transformation so that a rectangle in the bird's-eye view image becomes an inverted trapezoid, and generates the one image by synthesizing the projectively transformed top view image and the projectively transformed bird's-eye view image.
- (9) The information processing device, wherein the control unit performs control so that the display form of the bird's-eye view image is changed depending on whether the object is on the left or the right of the moving body.
- (10) The information processing device, wherein the control unit changes the depression angle related to the viewpoint of the bird's-eye view image based on the distance between the moving body and the object.
- (11) The information processing device, wherein the control unit performs control so that the depression angle with respect to the viewpoint of the bird's-eye view image increases as the moving body approaches the object.
- (12) The information processing device, wherein the control unit changes the rate of change of the depression angle with respect to the viewpoint of the bird's-eye view image between when the moving body approaches the object and when the moving body moves away from the object.
- (13) The information processing device, wherein the moving body is an automobile and the object is an object related to a parking position of the automobile.
- (14) A moving body comprising the information processing device according to any one of (1) to (13) above.
- (15) An information processing method executed by an information processing device, the method comprising: generating one image based on a bird's-eye view image related to the traveling direction of a moving body and a top view image including the moving body and the periphery of the moving body, the one image including a first area for displaying the bird's-eye view image related to the traveling direction and a second area for displaying the top view image; and controlling the one image to be displayed on a display unit of the moving body.
- in the above description, the back view has mainly been explained as an example, but the above-described effects can also be obtained by executing the same processing for the front view.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Hardware Design (AREA)
- Signal Processing (AREA)
- Mechanical Engineering (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
Abstract
According to one aspect of the present invention, an information processing device is provided. The information processing device includes a control unit. The control unit generates a single image on the basis of a bird's-eye view image related to the traveling direction of a moving body and a top view image including both the moving body and the periphery of the moving body. The single image includes a first area for displaying the bird's-eye view image related to the traveling direction and a second area for displaying the top view image. The control unit performs control such that the single image is displayed on a display unit that can be viewed by the operator of the moving body.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280075968.3A CN118511505A (zh) | 2021-11-30 | 2022-07-22 | Information processing device, mobile body, information processing method, and program |
JP2023564741A JPWO2023100415A1 (fr) | 2021-11-30 | 2022-07-22 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021194729 | 2021-11-30 | ||
JP2021-194729 | 2021-11-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023100415A1 true WO2023100415A1 (fr) | 2023-06-08 |
Family
ID=86611853
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/028507 WO2023100415A1 (fr) | 2021-11-30 | 2022-07-22 | Dispositif de traitement de l'information, corps mobile, procédé et programme de traitement de l'information |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2023100415A1 (fr) |
CN (1) | CN118511505A (fr) |
WO (1) | WO2023100415A1 (fr) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2000064175A1 (fr) * | 1999-04-16 | 2000-10-26 | Matsushita Electric Industrial Co., Ltd. | Image processing device and monitoring system |
JP2003158736A (ja) * | 2000-07-19 | 2003-05-30 | Matsushita Electric Ind Co Ltd | Monitoring system |
JP2010215027A (ja) * | 2009-03-13 | 2010-09-30 | Fujitsu Ten Ltd | Driving support device for vehicle |
JP2011030078A (ja) * | 2009-07-28 | 2011-02-10 | Toshiba Alpine Automotive Technology Corp | Image display device for vehicle |
JP2012147285A (ja) * | 2011-01-13 | 2012-08-02 | Alpine Electronics Inc | Back monitor device |
JP2014110604A (ja) * | 2012-12-04 | 2014-06-12 | Denso Corp | Vehicle periphery monitoring device |
JP2017098932A (ja) * | 2015-11-17 | 2017-06-01 | 株式会社Jvcケンウッド | Vehicle display device and vehicle display method |
WO2021131481A1 (fr) * | 2019-12-24 | 2021-07-01 | 株式会社Jvcケンウッド | Display device, display method, and display program |
-
2022
- 2022-07-22 JP JP2023564741A patent/JPWO2023100415A1/ja active Pending
- 2022-07-22 WO PCT/JP2022/028507 patent/WO2023100415A1/fr active Application Filing
- 2022-07-22 CN CN202280075968.3A patent/CN118511505A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
JPWO2023100415A1 (fr) | 2023-06-08 |
CN118511505A (zh) | 2024-08-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22900850; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2023564741; Country of ref document: JP |
WWE | Wipo information: entry into national phase | Ref document number: 202280075968.3; Country of ref document: CN |
NENP | Non-entry into the national phase | Ref country code: DE |