
CN105526916B - Dynamic image masking system and method - Google Patents

Info

Publication number
CN105526916B
Authority
CN
China
Prior art keywords
see
image
gatekeeper
imaging
dynamic image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510634267.0A
Other languages
Chinese (zh)
Other versions
CN105526916A (en)
Inventor
Charles B. Spinelli
Robert W. Turner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Boeing Co
Original Assignee
Boeing Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Boeing Co filed Critical Boeing Co
Publication of CN105526916A
Application granted
Publication of CN105526916B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/86 Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0047 Navigation or guidance aids for a single aircraft
    • G08G 5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 UAVs specially adapted for particular uses or applications
    • B64U 2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U 2101/32 UAVs specially adapted for particular uses or applications for imaging, photography or videography for cartography or topography
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20004 Adaptive image processing
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2358/00 Arrangements for display data security
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2380/00 Specific applications
    • G09G 2380/12 Avionics applications
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Burglar Alarm Systems (AREA)

Abstract

The application discloses a dynamic image masking system and method. A dynamic image masking system for providing a filtered autonomous remotely sensed image through a dynamic image masking process is provided. The dynamic image masking system has a remote sensing platform and an imaging system associated with the remote sensing platform. The imaging system has an optical system and an image sensing system. The dynamic image masking system further has a multi-level security system associated with the imaging system, and one or more image change locations in the imaging system and the multi-level security system, wherein changes to one or more images occur via the dynamic image masking process. The dynamic image masking system further has a computer system associated with the imaging system. The computer system has a gatekeeper algorithm configured to send gatekeeper commands to one or more controllers that control the one or more image change locations through the dynamic image masking process.

Description

Dynamic image masking system and method
Technical Field
The present disclosure relates generally to systems and methods for remote sensing image acquisition, and more particularly to automated dynamic image masking systems and methods for remote sensing image acquisition, such as airborne remote sensing image acquisition.
Background
Remote sensing, such as airborne remote sensing, involves the use of sensors and imaging techniques, such as radar imaging systems, camera imaging systems, light detection and ranging (LIDAR) systems, and other sensors and imaging systems, to obtain images of the ground and earth surfaces and remote objects. Analog aerial photography, video recording, and digital photography are commonly used in aerial remote sensing to capture images. Digital photography allows remotely sensed data to be transmitted in real time to the ground or a base station for direct analysis, and digital images can be analyzed and interpreted with the aid of a computer.
However, aerial remote sensing image acquisition can be burdensome and can produce a large amount of minimally useful information if the imaging system is not properly positioned at the time of the image capture event. For Precision Agriculture (PA) type aerial remote sensing tasks, where images of farmland and crops are acquired to determine plant health and vigor, the operator must deal with large areas of land, unique and well-known distributions of image acquisition sites, flight profiles that are well defined in terms of range, time of flight, altitude, position, and speed, and the exclusion of images of certain areas; the last of these, if not taken into account, can substantially hinder Precision Agriculture (PA) type aerial remote sensing operations.
For example, when flying to perform precision agriculture-type airborne remote sensing tasks, an aircraft, such as an Unmanned Aerial Vehicle (UAV), may enter the first farm by flying over other populated areas. It may not be desirable to begin imaging until positioned above the first farm, so it is desirable to integrate the autopilot and imaging system of the UAV and allow autonomous operation.
Known systems and methods for aerial remote sensing image acquisition may include flying under pilot control to position the pilot in a local field with a view of the entire planting area, and thus do not allow the autonomous operation preferred for the precision agriculture market. In addition, without a well-defined acquisition zone, there may be too much land or area to image, and the amount of image data acquired may overwhelm the sensor and imaging system. Further, data that is restricted or out of bounds and not in a defined acquisition zone may be inadvertently imaged and acquired.
Further, known systems and methods for aerial remote sensing image acquisition may include a manually operated shutter control that may be pre-programmed (e.g., to trigger every n seconds) or operator triggered. However, for such manual operation in the precision agriculture market, access to a particular field designated to be imaged may be required. Such an option may require beyond-line-of-sight flight to reach the proper destination, especially when flying at low altitudes below 400 feet above ground level. Such options can be labor intensive and expensive, and may not produce the results needed to address the precision agriculture market.
Another option could be a live link from the imaging camera system to the ground controllers (the pilot and ground control station operators) that provides a bird's-eye view of the area. This view can be used to cue the operator when to maneuver and when to take a photograph. However, this option may also be labor intensive and does not meet all requirements of the precision agriculture market.
Accordingly, there is a need in the art for improved systems and methods for dynamic image masking that provide filtered autonomous remotely sensed images through a dynamic image masking process, and for methods of masking or changing pixels that are not needed or relevant to an image acquisition event or task, such as a precision agriculture task, that provide advantages over known systems and methods.
Disclosure of Invention
Exemplary embodiments of the present disclosure provide an improved system and method for a dynamic image masking system that provides filtered autonomous remotely sensed images through dynamic image masking processing, thereby overcoming the limitations of the prior art solutions. As described in the detailed description that follows, the improved system and method of dynamic image masking systems that provide filtered autonomous remotely sensed images through dynamic image masking processing may provide significant advantages over existing systems and methods.
In an embodiment of the present disclosure, a dynamic image masking system for providing a filtered autonomous remote sensing image by a dynamic image masking process is provided. The dynamic image masking system includes a remote sensing platform.
The dynamic image masking system further comprises an imaging system associated with the remote sensing platform. The imaging system includes an optical system and an image sensing system.
The dynamic image masking system further includes a multi-level security system associated with the imaging system. The dynamic image masking system further comprises one or more image change locations located in the imaging system and the multi-level security system, wherein changes to one or more images occur via the dynamic image masking process.
The dynamic image masking system further comprises a computer system associated with the imaging system. The computer system includes a gatekeeper algorithm configured to send gatekeeper commands to one or more controllers that control the one or more image change locations through the dynamic image masking process.
In another embodiment of the present disclosure, a method for providing filtered autonomous remote sensing images through dynamic image occlusion processing is provided. The method comprises the step of equipping the remote sensing platform with an imaging system. The method further comprises the step of designating a region for imaging to obtain a designated region to be imaged. The method further includes the step of establishing a plurality of fiducial points on the surface of the designated area to be imaged.
The method further includes the step of designating, with reference to the plurality of fiducial points, a plurality of specific surface areas as exclusion areas that are not imaged. The method further comprises the step of controlling a pre-established acquisition planning process covering the designated area to be imaged.
The method further includes the step of positioning the imaging system using a navigation system comprising a Global Positioning System (GPS), a radio-based navigation system, an optical-based navigation system, an Inertial Measurement Unit (IMU) system, a magnetometer-equipped Inertial Measurement Unit (IMU) system, or a combination thereof, to image the specified area to be imaged. The method further comprises the step of using the imaging system to image the specified area to be imaged covered by the pre-established acquisition planning process.
The method further includes the step of dynamically invalidating one or more pixels in the one or more images of the exclusion area. The method further comprises the step of obtaining a filtered autonomous remote sensing image by dynamic image masking of the designated area to be imaged.
In another embodiment of the present disclosure, a method for providing filtered autonomous remote sensing images through dynamic image occlusion processing is provided. The method includes the step of equipping an Unmanned Aerial Vehicle (UAV) with an imaging system. The method further comprises the step of designating a region for imaging to obtain a designated region to be imaged. The method further includes the step of establishing a plurality of fiducial points on the surface of the designated area to be imaged.
The method further includes the step of designating, with reference to the plurality of fiducial points, a plurality of specific surface areas as exclusion areas that are not imaged. The method further includes the step of controlling a pre-established UAV flight plan that covers the designated area to be imaged.
The method further includes the step of positioning the imaging system using a navigation system comprising a Global Positioning System (GPS), a radio-based navigation system, an optical-based navigation system, an Inertial Measurement Unit (IMU) system, a magnetometer-equipped Inertial Measurement Unit (IMU) system, or a combination thereof, to image the specified area to be imaged. The method further includes the steps of flying the UAV over a designated area to be imaged, and using an imaging system to image the designated area to be imaged covered by the pre-established UAV flight plan.
The method further includes the step of dynamically invalidating one or more pixels in the one or more images of the exclusion area. The method further comprises the step of obtaining a filtered autonomous remote sensing image by dynamic image masking of the designated area to be imaged.
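By way of illustration only, the following sketch shows how these steps might fit together for a nadir-pointing camera over flat ground, with rectangular exclusion zones and zero-fill pixel invalidation. The helper names, the simple flat-ground geometry, and the fill value are assumptions made for this sketch, not the patented implementation.

```python
# Hedged sketch of the claimed method flow; all names and the flat-ground
# camera model are assumptions, not the patent's implementation.
import numpy as np

def ground_coords(rows, cols, center_en, gsd):
    """Map pixel indices to ground (east, north) meters; gsd is the
    ground sample distance of one pixel."""
    e0, n0 = center_en
    r = np.arange(rows) - rows / 2.0
    c = np.arange(cols) - cols / 2.0
    north = np.broadcast_to(n0 - r[:, None] * gsd, (rows, cols))
    east = np.broadcast_to(e0 + c[None, :] * gsd, (rows, cols))
    return east, north

def dynamic_mask(image, center_en, gsd, exclusion_zones, fill=0):
    """Dynamically invalidate pixels whose ground footprint falls in an
    exclusion zone (east_min, east_max, north_min, north_max)."""
    east, north = ground_coords(*image.shape[:2], center_en, gsd)
    out = image.copy()
    for e0, e1, n0, n1 in exclusion_zones:
        inside = (east >= e0) & (east <= e1) & (north >= n0) & (north <= n1)
        out[inside] = fill                      # invalidated pixels
    return out                                  # filtered autonomous image

# Usage: one 8-bit frame centered at (500 m E, 200 m N), 0.1 m pixels,
# with a 20 m x 20 m excluded plot.
frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
filtered = dynamic_mask(frame, (500.0, 200.0), 0.1,
                        [(495.0, 515.0, 190.0, 210.0)])
```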
The features, functions, and advantages that are discussed can be achieved independently in various embodiments of the present disclosure or may be combined in yet other embodiments, further details of which can be seen with reference to the following description and drawings.
Drawings
The disclosure may be understood more readily by reference to the following detailed description taken in conjunction with the accompanying drawings, which illustrate preferred and exemplary embodiments, but which are not necessarily drawn to scale, and in which:
FIG. 1 is a schematic diagram of a system block diagram of an imaging system that may be used in embodiments of the dynamic image masking system and method of the present disclosure;
FIG. 2 is a system block diagram of an embodiment of the dynamic image masking system of the present disclosure with the imaging system of FIG. 1, showing a gatekeeper algorithm and a plurality of image change locations in the dynamic image masking system;
FIG. 3 is a schematic diagram of a functional block diagram of a gatekeeper algorithm used in embodiments of the dynamic image masking system and methods of the present disclosure;
FIG. 4A is a schematic diagram of a system block diagram of one of the embodiments of the dynamic image masking system of the present disclosure;
FIG. 4B is a schematic diagram of a system block diagram of one of the embodiments of dynamic image masking processing of the present disclosure;
FIG. 5A is a schematic illustration of a flow chart of a method embodiment of the present disclosure;
FIG. 5B is a schematic illustration of a flow chart of another embodiment of a method of the present disclosure;
FIG. 6 is a schematic diagram of a schematic representation of an Unmanned Aerial Vehicle (UAV) that may be used in embodiments of the dynamic image masking systems and methods of the present disclosure;
FIG. 7 is a schematic illustration of a flow chart of an embodiment of an aircraft manufacturing and service method; and
Fig. 8 is a schematic diagram of a functional block diagram of an embodiment of an aircraft.
Detailed Description
The disclosed embodiments will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, disclosed embodiments are shown. Indeed, many different embodiments may be provided, and this disclosure should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and will fully convey the scope of the disclosure to those skilled in the art.
Referring to the drawings, FIG. 1 is a schematic diagram of a system block diagram of an embodiment of an imaging system 12 that may be used in embodiments of the dynamic image masking system 10 (see FIG. 2), method 150 (see FIG. 5A), and method 170 (see FIG. 5B) of the present disclosure.
As shown in fig. 1, imaging system 12 is associated with remote sensing platform 14 and includes an optical system 20 and an image sensing system 22. The optical system 20 (see fig. 1), such as a camera 20a (see also fig. 4A), provides the optical view of the outside world. The acquisition planning process 16 (see fig. 1) provides an acquisition planning process output 18 (see fig. 1) to the optical system 20 (see fig. 1). The optical system 20 (see fig. 1) outputs raw image data output 24 to a focal plane array 26 (see fig. 1) of the focal plane array subsystem 22a (see fig. 1).
As shown in fig. 1, image sensing system 22 (see fig. 1) may include a focal plane array subsystem 22a (see fig. 1), and focal plane array subsystem 22a includes a focal plane array 26, an analog-to-digital converter (A/D) 30, a volatile temporary memory 34, a digital signal processor 38, and a digital-to-analog converter (D/A) 54.
The focal plane array 26 (see fig. 1) reads the raw image data 24 (see fig. 1) and passes focal plane array output 28 to an analog-to-digital converter 30 (see fig. 1). The analog-to-digital converter 30 (see fig. 1) outputs an analog-to-digital converter output 32 (see fig. 1) to a volatile temporary memory 34 (see fig. 1), where the current image 122 (see fig. 4A) is temporarily stored (each subsequent image overwrites the previous one). The volatile temporary memory 34 (see fig. 1) then outputs a volatile temporary memory output 36 (see fig. 1) to a digital signal processor 38 (see fig. 1). Several actions may be performed in the digital signal processor 38 (see fig. 1), including, for example, reading the digital signal 37 (see fig. 4B), adjusting the gain, processing the digital signal 37 (see fig. 4B) through a Bayer filter (i.e., a Color Filter Array (CFA) that arranges RGB (red, green, blue) color filters on a square grid of photosensors, to obtain, for example, a jpg file format), and performing image enhancement techniques such as edge sharpening. After the digital signal 37 (see fig. 4B) is processed into a readable image format 39 (see fig. 4B) by the digital signal processor 38 (see fig. 1), the digital signal processor 38 (see fig. 1) outputs a digital output 40 (see fig. 1) for storage in a non-volatile results memory 44 (see fig. 1) of the multi-level security system 42 (see fig. 1). If necessary, a non-volatile results memory output 46 (see FIG. 1) may be output from the non-volatile results memory 44 (see FIG. 1) to a post-processing process 48 (see FIG. 1) of the multi-level security system 42 (see FIG. 1) for post-processing. The post-processing process 48 (see fig. 1) outputs a post-processed output product 49 (see fig. 1).
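As a rough illustration of the digital signal processor steps named above (gain adjustment, Bayer filtering, edge sharpening), the following toy stand-in for digital signal processor 38 assumes an RGGB mosaic and even image dimensions; it is a sketch, not the patent's implementation.

```python
# Toy DSP stage: gain, crude RGGB demosaic, 3x3 unsharp-mask sharpening.
import numpy as np

def dsp_stage(raw, gain=1.2):
    """raw: 2-D uint8 sensor readout with an RGGB Bayer mosaic."""
    x = np.clip(raw.astype(np.float32) * gain, 0, 255)   # gain adjustment

    # Crude demosaic: collapse each 2x2 RGGB cell into one RGB pixel.
    r = x[0::2, 0::2]
    g = (x[0::2, 1::2] + x[1::2, 0::2]) / 2.0
    b = x[1::2, 1::2]
    rgb = np.stack([r, g, b], axis=-1)

    # Edge sharpening with the kernel [[0,-1,0],[-1,5,-1],[0,-1,0]].
    pad = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    neighbors = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
                 pad[1:-1, :-2] + pad[1:-1, 2:])
    sharp = np.clip(5.0 * pad[1:-1, 1:-1] - neighbors, 0, 255)
    return sharp.astype(np.uint8)        # readable image format, e.g. for JPEG
```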
If an analog output is required, the digital signal processor 38 (see FIG. 1) outputs a digital signal processor output signal 52 (see FIG. 1) to a digital-to-analog converter 54 (see FIG. 1), and the digital-to-analog converter 54 (see FIG. 1) converts the signal to analog and outputs an analog output 56 (see FIG. 1). The analog output 56 (see fig. 1) may be used or stored in a video editing system 58 (see fig. 1).
In an embodiment of the present disclosure, there is provided a dynamic image masking system 10 (see fig. 2, 4A) for providing a filtered autonomous remote sensing image 51 (see fig. 2, 4A) by a dynamic image masking process 11 (see fig. 4A to 4B). Fig. 2 is a system block diagram of an embodiment of the dynamic image masking system 10 of the present disclosure with the imaging system 12 of fig. 1, and shows a Gatekeeper (GK) algorithm 60 and a plurality of image change locations 90 in the dynamic image masking system 10.
Fig. 2 shows an image change location 90 at which one or more pixels 126 (see fig. 4B) in one or more images 122 (see fig. 4A) may be changed to produce a masked image 50, such as a filtered autonomous remotely sensed image 51. FIG. 4A is a schematic diagram of a system block diagram of one of the embodiments of the dynamic image masking system 10 of the present disclosure. Fig. 4B is a schematic diagram of a system block diagram of one of the embodiments of the dynamic image masking process 11 of the present disclosure.
Before discussing the dynamic image masking system 10 shown in fig. 2 and 4A in detail, the gatekeeper algorithm 60 (see fig. 2, 3, 4B) will be discussed. FIG. 3 is a schematic diagram of a functional block diagram of an embodiment of gatekeeper algorithm 60 used in embodiments of dynamic image masking system 10 (see FIG. 2), method 150 (see FIG. 5A), and method 170 (see FIG. 5B) of the present disclosure. Fig. 3 shows a Gatekeeper (GK) function 61. An "algorithm," as used herein, refers to a set of instructions or a series of steps for performing a task or solving a problem.
The gatekeeper algorithm 60 (see fig. 3) calculates where the pixel 126 (see fig. 4B) originates, for example, on the ground and determines whether the pixel 126 (see fig. 4B) is in the region 118 (see fig. 4A) for imaging. If the pixel 126 (see FIG. 4B) is in the region 118 (see FIG. 4A) for imaging, then the pixel 126 (see FIG. 4B) of the image 122 (see FIG. 4B) is captured. If the pixel 126 (see FIG. 4B) is not in the region 118 (see FIG. 4A) for imaging, the pixel 126 (see FIG. 4B) is replaced with an appropriate value, where the exact value depends on the pixel replacement method used.
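One plausible realization of this test is sketched below: given a pixel's computed ground point and the imaging region expressed as a polygon, a standard ray-casting test decides whether to keep the captured value or substitute a replacement. The function names and the zero default are assumptions of this sketch, not the patent's interface.

```python
# Hedged sketch of the gatekeeper's keep-or-replace decision per pixel.

def point_in_polygon(pt, poly):
    """Ray-casting test: pt=(x, y), poly=[(x0, y0), (x1, y1), ...]."""
    x, y = pt
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > y) != (y2 > y):                 # edge straddles the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def gatekeep_pixel(value, ground_pt, imaging_region, replacement=0):
    """Keep the captured value inside the region; substitute it outside.
    The exact replacement depends on the pixel replacement method used."""
    return value if point_in_polygon(ground_pt, imaging_region) else replacement
```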
As shown in fig. 3, gatekeeper algorithm 60 preferably takes the position (GPS) 108 (such as obtained with a Global Positioning System (GPS)) and attitude (IMU) 104 (such as obtained with an Inertial Measurement Unit (IMU)) of remote sensing platform 14 (e.g., airborne platform 14a (see fig. 4B) in the form of unmanned aerial vehicle 200 (see fig. 4B, 6)). Preferably, the GPS and IMU data are high fidelity, to avoid any problems with the attitude (IMU) 104 or position (GPS) 108 that may affect the designated region 118a to be imaged (see fig. 4A).
As further shown in fig. 3, gatekeeper algorithm 60 may also retrieve information such as time 102, ranging sensor 106 data, altitude, speed, flight profile, or other information of remote sensing platform 14 (see fig. 2). As further shown in fig. 3, the gatekeeper algorithm 60 preferably applies a rule set 92 to generate the masked image 50 (see fig. 2); the rule set 92 may contain a camera model 94 including parameters 95 (see fig. 4B), such as a field of view 95a (see fig. 4B) and a focal length 95b (see fig. 4B), masking commands 96, information related to an acquisition plan 98, information related to an acquisition strategy 100, or other suitable information. In this way, gatekeeper algorithm 60 (see fig. 3) provides Gatekeeper (GK) commands 62 (see fig. 2, 3) to one or more controllers 63 (see fig. 2) specifying which of the one or more pixels 126 (see fig. 4B) are to be changed.
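The inputs enumerated in FIG. 3 lend themselves to a simple data structure. The sketch below mirrors the reference numerals in the text; the field names and types are assumptions of this sketch rather than definitions from the patent.

```python
# Data-structure sketch of the FIG. 3 inputs; names are illustrative.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class CameraModel:                      # camera model 94
    field_of_view_deg: float            # field of view 95a
    focal_length_mm: float              # focal length 95b

@dataclass
class RuleSet:                          # rule set 92
    camera: CameraModel
    masking_commands: List[str] = field(default_factory=list)   # 96
    acquisition_plan: dict = field(default_factory=dict)        # 98
    acquisition_strategy: dict = field(default_factory=dict)    # 100

@dataclass
class PlatformState:                    # other gatekeeper inputs
    time_s: float                       # time 102
    attitude_rpy: Tuple[float, float, float]    # attitude (IMU) 104
    range_m: float                      # ranging sensor 106
    position_lla: Tuple[float, float, float]    # position (GPS) 108

@dataclass
class GatekeeperCommand:                # gatekeeper command 62
    pixels_to_change: List[Tuple[int, int]]     # (row, col) targets
    method: str = "overwrite"           # e.g. blank, saturate, not-capture
```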
In one embodiment, the masking commands 96 (see FIG. 3) may include moving masking commands 96a (see FIG. 4B) that are dynamically updated for fixed or moving objects or people that publicly broadcast their position or location. For example, this embodiment allows a first person who does not want to be photographed publicly to broadcast his or her location or position, using a device such as a mobile phone 97 (see FIG. 4B), to a second person taking the photograph. The optical system 20 (see fig. 2) of the second person, such as the camera 20a (see fig. 2) or sensor, will receive the location of the first person and determine whether the first person can be identified in the camera frame based on the camera model 94 (see fig. 4B) and camera parameters 95 (see fig. 4B), such as the field of view 95a (see fig. 4B), the focal length 95b (see fig. 4B), settings, or other suitable camera parameters 95 (see fig. 4B). The optical system 20 (see fig. 2), such as the camera 20a (see fig. 2) or sensor, will then obscure or blur the image of the first person if possible.
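A minimal sketch of this moving-masking idea follows: the broadcast position is mapped into the frame of a nadir camera and, if visible, the surrounding patch is obscured by averaging. The flat-ground mapping, the patch radius, and all names are assumptions of this sketch.

```python
# Hedged sketch: test a broadcast location against the camera frame,
# then obscure that region. Not the patent's camera model.
import numpy as np

def broadcast_position_to_pixel(person_en, cam_en, gsd, shape):
    """Return (row, col) of a broadcast ground position, or None when the
    person cannot be identified in the camera frame."""
    rows, cols = shape
    col = int(cols / 2 + (person_en[0] - cam_en[0]) / gsd)
    row = int(rows / 2 - (person_en[1] - cam_en[1]) / gsd)
    return (row, col) if 0 <= row < rows and 0 <= col < cols else None

def obscure_region(image, center, radius=12):
    """Obscure a square patch around the broadcast location by averaging."""
    r, c = center
    r0, r1 = max(r - radius, 0), min(r + radius, image.shape[0])
    c0, c1 = max(c - radius, 0), min(c + radius, image.shape[1])
    image[r0:r1, c0:c1] = image[r0:r1, c0:c1].mean()
    return image
```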
The dynamic image masking system 10 (see fig. 2, 4A) is preferably an automated image acquisition system 148 (see fig. 4A) that includes a gatekeeper algorithm 60 (see fig. 2, 3, 4B), the gatekeeper algorithm 60 providing gatekeeper commands 62 (see fig. 2, 3, 4B) to one or more controllers 63 (see fig. 2, 4B) that control one or more image change locations 90 (see fig. 2) positioned in the dynamic image masking system 10 (see fig. 2), via the dynamic image masking process 11 (see fig. 4A-4B).
As used herein, "dynamic image masking" refers to masking, blanking (blinding out), blocking (blanking out), overwriting, light saturation (blinding), not capturing, eliminating, limiting, or otherwise altering one or more pixels 126 (see fig. 4B) in one or more images 122 (see fig. 4A) of an exclusion region 124 (see fig. 4A) in which one or more pixels 126 (see fig. 4B) are unwanted, irrelevant, or limited. The dynamic image masking system 10 (see fig. 2, 4A) and the dynamic image masking process 11 (see fig. 4A-4B) produce a masked image 50 (see fig. 4A), such as a filtered autonomous remote sensed image 51 (see fig. 4A), the masked image 50 being a reliable and repeatable and preferably only a resultant set of acquired pixels of interest 126 (see fig. 4B). One or more pixels 126 (see fig. 4B) undergoing the dynamic image masking process 11 may result in, for example, masked pixels 126a (see fig. 4B), blanked pixels 126B (see fig. 4B), non-captured pixels 126c (see fig. 4B), overwritten pixels 126d (see fig. 4B), light saturated pixels 126e (see fig. 4B), or other suitably altered pixels.
As shown in fig. 2, 4A, the dynamic image masking system 10 includes an imaging system 12 associated with a remote sensing platform 14, as shown in fig. 1. The imaging system 12 (see fig. 2, 4A) may include a two-dimensional imaging system 12a (see fig. 4A), a three-dimensional imaging system 12b (see fig. 4A), such as stereoscopic imaging, or other suitable imaging system 12 (see fig. 4A). As shown in fig. 2 and 4A, the imaging system 12 includes an optical system 20 and an image sensing system 22.
As shown in fig. 4A, remote sensing platform 14 may include an airborne platform 14a, such as an unmanned aerial vehicle 200 (see fig. 6), a ground-based platform 14b, a space-based platform 14c, or a water-based platform 14d. The remote sensing platform 14 (see FIG. 4A) may also comprise other suitable platforms.
As shown in fig. 2 and 4A, the dynamic image masking system 10 further includes a multi-level security system 42 associated with the imaging system 12. Multi-level security system 42 (see fig. 2) includes non-volatile results memory 44 (see fig. 2) and post-processing process 48 (see fig. 2). The non-volatile results memory 44 (see fig. 2) may include any suitable computer-readable storage medium, such as Read Only Memory (ROM), Random Access Memory (RAM), video memory (VRAM), a hard disk, a floppy disk, a Compact Disk (CD), a tape, a combination thereof, or other suitable computer-readable storage device.
The multi-level security system 42 (see fig. 2, 4A) preferably maintains the integrity of the data of the image 122 (see fig. 4B). The multi-level security system 42 (see fig. 2, 4A) controls access to the dynamic image masking system 10 and to information related to individual pixels 126 (see fig. 4B).
As shown in fig. 2, 4B, the dynamic image masking system 10 further includes one or more image change locations 90, preferably positioned in the imaging system 12 and the multi-level security system 42. Changes to one or more images occur at these locations via the dynamic image masking process 11. One or more image change locations 90 (see fig. 2) may also be positioned external to imaging system 12 (see fig. 2) and multi-level security system 42 (see fig. 2).
The dynamic image masking system 10 (see fig. 2, 4A) preferably includes an acquisition planning phase 112 (see fig. 4A), an acquisition phase 114 (see fig. 4A), and a post-processing phase 116 (see fig. 4A) for planning, acquiring, and post-processing one or more images 122 (see fig. 4A) acquired during an acquisition event or task. For the acquisition planning phase 112 (see fig. 4A), the dynamic image masking system 10 (see fig. 2, 4A) may preferably include a pre-established acquisition planning process 16 (see fig. 1, 2, 4A). For example, the pre-established acquisition planning process 16 (see fig. 4A) may include a pre-established flight plan 17 (see fig. 4A) of an airborne platform 14a (see fig. 4A), such as an unmanned aerial vehicle 200 (see fig. 6).
The pre-established acquisition planning process 16 (see fig. 1, 2, 4A) preferably includes determining an exclusion region 124 (see fig. 4A) that is not to be imaged with the imaging system 12 (see fig. 1, 2, 4A) prior to the acquisition event or task (e.g., prior to flying the unmanned aerial vehicle 200 (see fig. 6) over the region 118 (see fig. 4A) to be imaged). The pre-established acquisition planning process 16 (see fig. 1, 2, 4A) allows for dynamically planning, before an acquisition event or task begins, the regions for which images 122 (see fig. 4A) are not to be acquired, and excluding such regions from the acquisition or task plan.
The pre-established acquisition planning process 16 (see fig. 2, 4A) may be performed as a manual process or an automated process. The automated process preferably uses a Gatekeeper (GK) algorithm 60 (see fig. 2), such as Gatekeeper (GK) algorithm 60a (see fig. 2), configured to send a gatekeeper command 62 (see fig. 2), such as gatekeeper command 62a (see fig. 2), to the pre-established acquisition planning process 16 (see fig. 2) at an image change location 90 (see fig. 2), such as image change location 90a (see fig. 2). During this acquisition planning phase 112 (see fig. 4A), the gatekeeper command 62 (see fig. 2), such as gatekeeper command 62a (see fig. 2), may preferably include a rule set 92 (see fig. 3), with the rule set 92 including the acquisition plan 98 (see fig. 3) and acquisition strategy 100 (see fig. 3), or other suitable rules and strategies. The acquisition plan 98 (see fig. 3) and acquisition strategy 100 (see fig. 3) preferably include the specific privacy policies and rules currently in effect in the area, region, state, country, and/or nation where the acquisition event or task occurs.
As shown in fig. 2, an image change location 90 (such as image change location 90a) is positioned prior to input to the optical system 20 of the imaging system 12. As further shown in fig. 2, the acquisition planning process output 18 is output from the acquisition planning process 16 and input into the optical system 20.
For the acquisition phase 114 (see fig. 4A) of the dynamic image masking system 10 (see fig. 2, 4A), the imaging system 12 (see fig. 2, 4A) is preferably used to designate the area 118 (see fig. 4A) for imaging, thereby obtaining a designated area 118a (see fig. 4A) to be imaged. A plurality of fiducials 120 (see fig. 4A) may be established on a surface 118b (see fig. 4A) of the designated area 118a (see fig. 4A) to be imaged. A plurality of specific surface areas 124a (see fig. 4A) may be designated, with reference to the plurality of fiducial points 120 (see fig. 4A), as exclusion areas 124 (see fig. 4A) that are not imaged.
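One way to express such fiducial-referenced exclusion areas, as a hedged sketch with assumed names and a simplified planar model: zone corners are given as metric offsets from a surveyed fiducial point 120 and resolved to absolute ground coordinates.

```python
# Hedged sketch: resolve exclusion-zone corners relative to a fiducial.

def resolve_zone(fiducial_en, corner_offsets_m):
    """fiducial_en: (east, north); offsets: [(de, dn), ...] in meters."""
    fe, fn = fiducial_en
    return [(fe + de, fn + dn) for de, dn in corner_offsets_m]

# Usage: a 30 m x 30 m excluded plot whose corners are referenced to a
# surveyed fiducial at (1000 m E, 2000 m N).
zone = resolve_zone((1000.0, 2000.0),
                    [(-15, -15), (15, -15), (15, 15), (-15, 15)])
```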
The dynamic image masking system 10 (see fig. 4A) may further include a navigation system 110 (see fig. 4A) to position the imaging system 12 (see fig. 2, 4A) to image the designated area 118a (see fig. 4A) to be imaged. The navigation system 110 (see fig. 4A) may include a Global Positioning System (GPS) 110a (see fig. 4A), a radio-based navigation system 110b (see fig. 4A), an optical-based navigation system 110c (see fig. 4A), an Inertial Measurement Unit (IMU) system 110d (see fig. 4A), a magnetometer-equipped Inertial Measurement Unit (IMU) system 110e (see fig. 4A), a combination thereof, or other suitable navigation system 110 (see fig. 4A).
As shown in fig. 1, 2, 4A, the optical system 20 may include a camera 20a. Preferably, the camera 20a (see fig. 1, 2, 4A) is a digital camera 20b (see fig. 4A). The optical system 20 (see fig. 1, 2, 4A) may also comprise other suitable camera devices or advanced optical devices. As described above, the optical system 20 provides the optical view of the outside world.
As shown in fig. 2 and 4B, the dynamic image masking system 10 further includes an optical blinding system 64 located between the optical system 20 and the image sensing system 22. As further shown in fig. 2, 4B, a Gatekeeper (GK) algorithm 60, such as Gatekeeper (GK) algorithm 60b, is configured to send a gatekeeper command 62, such as gatekeeper command 62b, to a controller 63, such as the optical blinding system 64, to control an image change location 90, such as image change location 90b. Gatekeeper algorithm 60b (see fig. 2) is configured to send gatekeeper command 62b (see fig. 2) to the optical blinding system 64 (see fig. 2), which mechanically or optically controls the image change location 90b (see fig. 2) located between the optical system 20 (see fig. 2) and the image sensing system 22 (see fig. 2).
With this optical blinding system 64 (see fig. 2) embodiment, no irrelevant (extra) pixels are processed, because the pixels 126 (see fig. 4B) are changed before being recorded on the focal plane array 26 (see fig. 2) of the image sensing system 22 (see fig. 2). A pixel 126 (see fig. 4B) may be inhibited from collecting photons, or a pixel 126 (see fig. 4B) may be saturated by illuminating it at 100%, causing "blindness" to occur.
The optical blinding system 64 (see fig. 2, 4B) may be used with image alteration hardware 65 (see fig. 4B). The image alteration hardware 65 (see fig. 4B) may include a mechanical device 66 (see fig. 4B), such as a shutter control mechanical device 66a (see fig. 4B), that may be used to inhibit the collection of photons by a plurality of pixels 126 (see fig. 4B). Alternatively, the image alteration hardware 65 (see fig. 4B) may include optical devices 67 (see fig. 4B), such as laser optical devices 67a (see fig. 4B) and micromirror optical devices 67b (see fig. 4B), that may be used to illuminate particular pixels 126 (see fig. 4B) on the focal plane array 26 (see fig. 2), causing those pixels 126 (see fig. 4B) to be blinded.
The gatekeeper command 62 (see fig. 2), such as gatekeeper command 62b (see fig. 2), may preferably include a rule set 92 (see fig. 3) at this acquisition phase 114 (see fig. 4A), the rule set 92 including a camera model 94 (see fig. 3), masking commands 96 (see fig. 3), acquisition plan 98, acquisition strategy 100, or other suitable rules and strategies. The gatekeeper command 62 (see fig. 2), such as the gatekeeper command 62b (see fig. 2), may preferably further include a time 102 (see fig. 3), an attitude (IMU) 104 (see fig. 3), a ranging sensor 106 (see fig. 3), and/or a position (GPS) 108 (see fig. 3) in this acquisition phase 114 (see fig. 4A).
As shown in fig. 1, the optical system 20 outputs raw image data 24 obtained with the optical system 20, and the raw image data 24 is input to the image sensing system 22 of the imaging system 12. As shown in fig. 2, one or more pixels 126 (see fig. 4B) are masked or altered using a gatekeeper algorithm 60 (such as gatekeeper algorithm 60b) and the optical blinding system 64, which uses the optical blinding system output 68 to control an image change location 90 (such as image change location 90b) through the dynamic image masking process 11 (see fig. 4A-4B). Thus, using a gatekeeper algorithm 60 (see fig. 2), such as gatekeeper algorithm 60b (see fig. 2), and the optical blinding system 64 (see fig. 2), the masked raw image data 24a (see fig. 2) is input to the focal plane array 26 (see fig. 2) of the image sensing system 22 (see fig. 2).
As shown in fig. 2 and 4A, image sensing system 22 includes a focal plane array 26, an analog-to-digital converter (A/D) 30, a volatile temporary memory 34, a digital signal processor 38, and a digital-to-analog converter (D/A) 54. Because the focal plane array subsystem 22a (see fig. 2) may be an integrated circuit, the focal plane array subsystem 22a (see fig. 2) may require some decomposition to interrupt the signal at the desired image change locations 90 (see fig. 2), such as image change locations 90c, 90d, and 90e, in the focal plane array subsystem 22a (see fig. 2) of the image sensing system 22 (see fig. 2).
As shown in fig. 4A, the image sensing system 22 may further include a radar imaging system 22b, a sonar imaging system 22c, an infrared imaging system 22d, an x-ray imaging system 22e, a light detection and ranging system (LIDAR)22f, or other suitable image sensing system 22.
As shown in fig. 2, gatekeeper algorithm 60 (such as in the form of gatekeeper algorithm 60c) is configured to send gatekeeper command 62 (such as in the form of gatekeeper command 62c) to controller 63 (such as pixel controller 69). The pixel controller 69 (see fig. 2) uses the pixel controller output 70 to control the image change location 90 (such as the image change location 90c) through the dynamic image masking process 11 (see fig. 4A-4B), by overwriting one or more pixels 126 (see fig. 4B) on the focal plane array 26 (see fig. 2) with zero saturation 140 (see fig. 4A) or one hundred percent saturation 142 (see fig. 4A).
With respect to this pixel controller 69 (see fig. 2) embodiment, the pixel controller 69 (see fig. 2) feeds the focal plane array 26 (see fig. 2) and essentially overwrites the pixels 126 (see fig. 4B) on the focal plane array 26 (see fig. 2) with either 0 (zero) or 100% (one hundred percent) saturation (for an 8-bit system, which has 256 possible values, 100% saturation corresponds to the maximum value of 255).
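This overwrite can be made bit-depth aware, as in the sketch below (illustrative names, not the patent's interface): an 8-bit system has 256 possible values, so full scale is 255, and np.iinfo supplies the equivalent full-scale value for other depths.

```python
# Hedged sketch: dtype-aware zero or full-scale overwrite of target pixels.
import numpy as np

def overwrite_on_fpa(fpa, targets, saturate):
    """Overwrite target pixels with 0 or with the full-scale value."""
    full_scale = np.iinfo(fpa.dtype).max   # 255 for uint8, 65535 for uint16
    fpa[targets] = full_scale if saturate else 0
    return fpa
```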
As shown in fig. 1, the focal plane array 26 outputs a focal plane array output 28, and the focal plane array output 28 is input to an analog-to-digital converter 30. As shown in fig. 2, the pixel controller 69 uses the pixel controller output 70 to control an image change location 90 (such as image change location 90c) through the dynamic image masking process 11 (see fig. 4A-4B), masking or changing one or more pixels 126 (see fig. 4B) by overwriting them, using the gatekeeper algorithm 60 (such as gatekeeper algorithm 60c) and the pixel controller 69. Thus, by using the gatekeeper algorithm 60 (see fig. 2), such as gatekeeper algorithm 60c (see fig. 2), and the pixel controller 69 (see fig. 2), the masked focal plane array output 28a (see fig. 2) is input to the analog-to-digital converter 30.
As further shown in FIG. 2, the analog-to-digital converter 30 receives the masked focal plane array output 28a (see FIG. 2), preferably in the form of the masked raw image data 24a, from the focal plane array 26. The analog-to-digital converter 30 (see fig. 2) converts the masked raw image data 24a from an analog signal to a digital signal 37 (see fig. 4B).
As shown in fig. 2, gatekeeper algorithm 60 (such as in the form of gatekeeper algorithm 60d) is configured to send gatekeeper command 62 (such as in the form of gatekeeper command 62d) to controller 63 (such as digitizing controller 72). The digitizing controller 72 (see fig. 2) uses the digitizing controller output 74 to control an image change location 90 (such as image change location 90d) through the dynamic image masking process 11 (see fig. 4A-4B). The image change location 90d (see fig. 2) is located between the analog-to-digital converter 30 (see fig. 2) and the volatile temporary memory 34 (see fig. 2). One or more pixels 126 (see fig. 4B) are preferably changed at the image change location 90d by setting the digitized value 146 (see fig. 4B) of the one or more pixels 126 (see fig. 4B) to the minimum value 146a (see fig. 4B) or the maximum value 146b (see fig. 4B).
With this digitizing controller 72 (see fig. 2) embodiment, after the analog-to-digital converter 30 (see fig. 2), the digitizing controller 72 (see fig. 2) controls the digitizing by setting the digitized value 146 (see fig. 4B) either low (minimum value 146a (see fig. 4B)) or high (maximum value 146b (see fig. 4B)). Thus, the signal for some pixels 126 (see fig. 4B) is essentially shorted (value set low) or maxed out (value set high). This embodiment may be used in custom interface electronics, such as a wired-OR function 144 (see FIG. 4B), a hardware embodiment of Boolean arithmetic. The wired-OR function 144 (see fig. 4B) uses a pull-down resistor and one diode for each input to electrically perform the Boolean logic operation of the OR gate.
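A loose software analogue of this wired-OR behavior, sketched below with assumed names: forcing a line high ORs the digitized sample toward full scale, while shorting it low ANDs the sample to zero.

```python
# Hedged sketch emulating the force-high / force-low gating of digitized
# values; this is a software analogy, not the described circuit.
import numpy as np

def digitizer_gate(samples, force_high, force_low):
    """samples: uint8 digitized values; force_*: boolean masks."""
    out = samples.copy()
    out[force_high] |= 0xFF        # wired-OR up to the maximum value 146b
    out[force_low] &= 0x00         # pulled down to the minimum value 146a
    return out
```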
As shown in fig. 1, analog-to-digital converter 30 outputs analog-to-digital converter output 32, and analog-to-digital converter output 32 is input to volatile temporary memory 34. As shown in fig. 2, the digitizing controller 72 uses the digitizing controller output 74 to control an image change location 90 (such as image change location 90d) through the dynamic image masking process 11 (see fig. 4A-4B), masking or changing one or more pixels 126 (see fig. 4B) by overwriting them, using the gatekeeper algorithm 60 (such as gatekeeper algorithm 60d) and the digitizing controller 72. Thus, by using the gatekeeper algorithm 60 (see fig. 2), such as gatekeeper algorithm 60d (see fig. 2), and the digitizing controller 72 (see fig. 2), the masked analog-to-digital converter output 32a (see fig. 2) is input to the volatile temporary memory 34 (see fig. 2).
As further shown in fig. 2, volatile temporary memory 34 receives masked analog-to-digital converter output 32a from analog-to-digital converter 30, preferably in the form of digital signal 37 (see fig. 4B). The volatile temporary memory 34 temporarily stores the digital signal 37 from the analog-to-digital converter 30 (see fig. 4B).
As shown in fig. 2, gatekeeper algorithm 60 (such as in the form of gatekeeper algorithm 60e) is configured to send gatekeeper command 62 (such as in the form of gatekeeper command 62e) to controller 63 (such as digital flow controller 76). The digital flow controller 76 (see fig. 2) uses the digital flow controller output 78 to control an image change location 90 (such as image change location 90e) through the dynamic image masking process 11 (see fig. 4A-4B). The image change location 90e (see fig. 2) is located between the volatile temporary memory 34 (see fig. 2) and the digital signal processor 38 (see fig. 2). One or more pixels 126 (see fig. 4B) are preferably changed at the image change location 90e by changing a single image 122 (see fig. 4A) at a time and masking the one or more pixels 126 (see fig. 4B) in that single image 122 (see fig. 4A).
With this digital flow controller 76 (see fig. 2) embodiment, the volatile temporary memory 34 (see fig. 2) outputs a single image 122 (see fig. 4A) at a time to the digital signal processor 38 (see fig. 2). This occurs because the storage of the volatile temporary memory 34 (see fig. 2) is written over for each single image 122 (see fig. 4A) that is processed.
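This one-image-at-a-time flow can be pictured as a generator over a single volatile buffer, as in the sketch below; capture_frame and gatekeeper_mask are assumed callables standing in for the hardware, not the patent's interfaces.

```python
# Hedged sketch of the single-buffer streaming flow; frames are assumed to
# be numpy-like arrays, and the mask callable returns a boolean array.

def stream_frames(capture_frame, gatekeeper_mask, n_frames):
    """Yield masked frames one at a time from a single volatile buffer."""
    buffer = None                          # volatile temporary memory 34
    for _ in range(n_frames):
        buffer = capture_frame()           # overwrites the previous image
        frame = buffer.copy()
        frame[gatekeeper_mask(frame)] = 0  # mask pixels in this image only
        yield frame                        # passed on to the DSP stage
```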
As shown in fig. 1, the volatile temporary memory 34 outputs a volatile temporary memory output 36, and the volatile temporary memory output 36 is input to a digital signal processor 38. As shown in fig. 2, the digital flow controller 76 uses the digital flow controller output 78 to control an image change location 90 (such as image change location 90e) through the dynamic image masking process 11 (see fig. 4A-4B), masking or changing one or more pixels 126 (see fig. 4B) by overwriting them, using the gatekeeper algorithm 60 (such as gatekeeper algorithm 60e) and the digital flow controller 76. Thus, the masked volatile temporary memory output 36a (see FIG. 2) is input to the digital signal processor 38 using the gatekeeper algorithm 60 (see FIG. 2), such as gatekeeper algorithm 60e (see FIG. 2), and the digital flow controller 76 (see FIG. 2).
The digital signal processor 38 (see fig. 2) receives the digital signal 37 (see fig. 4B) from the volatile temporary memory 34 (see fig. 2) and processes the digital signal 37 (see fig. 4B) into a readable image format 39 (see fig. 4B). When the imaging system 12 (see fig. 2) uses analog output, the digital-to-analog converter 54 (see fig. 2) receives a readable digital signal from the digital signal processor 38 (see fig. 2) and converts the readable digital signal to an analog signal.
As shown in fig. 2, gatekeeper algorithm 60 (such as in the form of gatekeeper algorithm 60f) is configured to send gatekeeper command 62 (such as in the form of gatekeeper command 62f) to controller 63 (such as control storage controller 80). The control storage controller 80 (see fig. 2) uses the control storage controller output 82 to control an image change location 90 (such as image change location 90f) through the dynamic image masking process 11 (see fig. 4A-4B). The image change location 90f (see fig. 2) is positioned at the digital signal processor output 40 (see fig. 1, 2) of the focal plane array subsystem 22a (see fig. 2) of the imaging system 12 (see fig. 2), prior to input to the non-volatile results memory 44 (see fig. 2) of the multi-level security system 42 (see fig. 2). At the image change location 90f, one or more pixels 126 (see FIG. 4B) may be changed by masking, so that they are not written to the non-volatile results memory 44 (see FIG. 2).
With respect to this control storage controller 80 (see FIG. 2) embodiment, the control storage controller 80 (see FIG. 2) changes the image 122 (see FIG. 4A) at the output of the focal plane array subsystem 22a (see FIG. 2). The one or more pixels 126 (see fig. 4B) that need to be restricted or eliminated (masked) are determined by the gatekeeper algorithm 60 (see fig. 2), such as the gatekeeper algorithm 60f (see fig. 2), and this pixel information is then correlated with locations in the image 122 (see fig. 4A). The result is that the unwanted pixels are masked and not written to the non-volatile results memory 44 (see fig. 2).
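Sketching this storage gate in software (the file name and helper are assumptions of this sketch): pixels the gatekeeper flags are dropped before anything reaches the non-volatile results store.

```python
# Hedged sketch: excluded pixel content never reaches non-volatile storage.
import numpy as np

def store_result(image, excluded_mask, path="results.npz"):
    """Persist the image with gatekeeper-excluded pixels removed; the
    excluded content itself is never written to non-volatile storage."""
    out = image.copy()
    out[excluded_mask] = 0                 # unwanted pixels masked pre-write
    np.savez(path, image=out, mask=excluded_mask)
```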
As shown in fig. 1, digital signal processor 38 outputs digital output 40, and digital output 40 is input to non-volatile results memory 44 of multi-level security system 42. As shown in fig. 2, one or more pixels 126 (see fig. 4B) are masked, and not written to the non-volatile results memory 44, by using the gatekeeper algorithm 60 (such as the gatekeeper algorithm 60f) and the control storage controller 80, which uses the control storage controller output 82 to control an image change location 90 (such as the image change location 90f) through the dynamic image masking process 11 (see fig. 4A-4B). Thus, masked digital output 40a (see fig. 2) is output to non-volatile results memory 44 (see fig. 2) of multi-level security system 42 (see fig. 2) using gatekeeper algorithm 60 (see fig. 2), such as gatekeeper algorithm 60f (see fig. 2), and the control storage controller 80 (see fig. 2).
As shown in fig. 2, gatekeeper algorithm 60 (such as in the form of gatekeeper algorithm 60g) is configured to send gatekeeper command 62 (such as in the form of gatekeeper command 62g) to control image change location 90 (such as image change location 90g) through the dynamic image masking process 11 (see fig. 4A-4B). Image change location 90g (see fig. 2) is located in multi-level security system 42 (see fig. 2) between non-volatile results memory 44 (see fig. 2) and post-processing process 48 (see fig. 2). At the image change location 90g, one or more pixels 126 (see fig. 4B) may be changed by overwriting the one or more pixels 126 (see fig. 4B) with zero saturation 140 (see fig. 4B) or one hundred percent saturation 142 (see fig. 4B).
With this embodiment, the image 122 (see FIG. 4A) is changed after it is output from the non-volatile results memory 44 (see FIG. 2) but before it is post-processed in the post-processing process 48 (see FIG. 2). Unwanted pixels, as determined by the gatekeeper algorithm 60 (see fig. 2), such as the gatekeeper algorithm 60g (see fig. 2), are masked by overwriting the digitized value 146 (see fig. 4B) of each unwanted pixel with a known value: 0 (zero) or a value representing 100% (one hundred percent) of the allowed value of the pixel 126 (see fig. 4B).
As shown in FIG. 1, the non-volatile results memory 44 outputs a non-volatile results memory output 46, and the non-volatile results memory output 46 is input to a post-processing process 48. As shown in fig. 2, one or more pixels 126 (see fig. 4B) are masked or changed by using a gatekeeper algorithm 60 (such as gatekeeper algorithm 60g) to control an image change location 90 (such as image change location 90g) through the dynamic image masking process 11 (see fig. 4A-4B). Thus, by using the gatekeeper algorithm 60 (see FIG. 2), such as gatekeeper algorithm 60g (see FIG. 2), the masked non-volatile results memory output 46a (see FIG. 2) is output to the post-processing process 48 (see FIG. 2).
As shown in fig. 2, gatekeeper algorithm 60 (such as in the form of gatekeeper algorithm 60h) is configured to send gatekeeper command 62 (such as in the form of gatekeeper command 62h) to control image change location 90 (such as image change location 90h) through the dynamic image masking process 11 (see fig. 4A-4B). Image change location 90h (see fig. 2) is located in multi-level security system 42 (see fig. 2) at the post-processing process 48 (see fig. 2). At the image change location 90h, one or more pixels 126 (see fig. 4B) may be changed by editing or ignoring one or more pixels 126 (see fig. 4B) representing the exclusion area 124 (see fig. 4B) of the designated area 118a (see fig. 4B) to be imaged.
With this embodiment, which constitutes the post-processing stage 116 (see FIG. 4A), one or more pixels 126 (see FIG. 4B) are masked at the post-processing process 48 (see FIG. 2). The image 122 (see fig. 4A) is essentially altered by editing or simply ignoring the one or more pixels 126 (see fig. 4B) that represent an unwanted portion of the image 122, such as the exclusion area 124 (see fig. 4B).
As shown in fig. 1, the post-processing process 48 outputs post-processed output 49 out of the multi-level security system 42. As shown in fig. 2, one or more pixels 126 (see fig. 4B) are masked or changed using a gatekeeper algorithm 60 (such as gatekeeper algorithm 60h), and the gatekeeper algorithm 60 controls an image change location 90 (such as image change location 90h) through the dynamic image masking process 11 (see fig. 4A-4B). Thus, using gatekeeper algorithm 60 (see fig. 2), such as gatekeeper algorithm 60h (see fig. 2), the post-processing process 48 outputs masked post-processed output 49a out of the multi-level security system 42 to obtain the masked image 50 (see fig. 2), such as the filtered autonomous remote sensed image 51 (see fig. 2).
As shown in fig. 2, the gatekeeper algorithm 60 (such as in the form of gatekeeper algorithm 60i) is configured to send a gatekeeper command 62 (such as in the form of gatekeeper command 62i) to a controller 63 (such as the analog signal controller 84). The analog signal controller 84 (see fig. 2) uses the analog signal controller output 86 to control an image change location 90 (such as image change location 90i) through the dynamic image masking process 11 (see fig. 4A-4B).
Image change location 90i (see fig. 2) is located at the analog output 56 (see fig. 2) of the digital-to-analog converter 54 (see fig. 2) of the focal plane array subsystem 22a (see fig. 2), prior to input to the video editing system 58 (see fig. 2) located outside the focal plane array subsystem 22a (see fig. 2). One or more pixels 126 (see fig. 4B) are preferably changed at image change location 90i by masking the one or more pixels 126 (see fig. 4B) so that they are not written to the video editing system 58 (see fig. 2).
As shown in fig. 1, the digital signal processor 38 provides the digital signal processor output 52 to a digital-to-analog converter 54, and the digital-to-analog converter 54 outputs the analog output 56 out of the focal plane array subsystem 22a and inputs the analog output 56 to the video editing system 58. As shown in fig. 2, one or more pixels 126 (see fig. 4B) are masked or changed using the gatekeeper algorithm 60 (such as gatekeeper algorithm 60i) and the analog signal controller 84, which uses the analog signal controller output 86 to control image change location 90 (such as image change location 90i) through the dynamic image masking process 11 (see fig. 4A-4B). Thus, using the gatekeeper algorithm 60 (see fig. 2), such as gatekeeper algorithm 60i (see fig. 2), and the analog signal controller 84 (see fig. 2), the masked analog output 56a (see FIG. 2) is input to the video editing system 58 (see FIG. 2).
As shown in FIG. 4B, the dynamic image masking system 10 further includes a computer system 130 associated with the imaging system 12. The computer system 130 (see fig. 4B) includes the gatekeeper algorithm 60 (see fig. 2, 4B) configured to send a gatekeeper command 62 (see fig. 2, 4B) to one or more controllers 63 (see fig. 2, 4B), the one or more controllers 63 controlling one or more image change locations 90 (see fig. 2, 4B) using the dynamic image masking process 11 (see fig. 2, 4B).
As shown in FIG. 4B, computer system 130 preferably includes a computer 132 and one or more of software 134, firmware 136, and hardware 138. Gatekeeper algorithm 60 and controller 63 may preferably be a combination of hardware 138 and firmware 136, or a combination of hardware 138 and software 134.
Software 134 (see fig. 4B) or firmware 136 (see fig. 4B) may implement the gatekeeper algorithm 60 (see fig. 3) and is designed for use with the computer 132 (see fig. 4B) or other hardware 138 (see fig. 4B) of the computer system 130 (see fig. 4B).
In another embodiment of the present disclosure, a method 150 (see fig. 5A) is provided for providing a filtered autonomous remotely sensed image 51 (see fig. 4A) through the dynamic image masking process 11 (see fig. 4B). Fig. 5A is a schematic illustration of a flow chart of an embodiment of a method 150 of the present disclosure.
As shown in fig. 5A, the method 150 includes a step 152 of equipping the remote sensing platform 14 (see fig. 2, 4A) with the imaging system 12 (see fig. 2, 4A). The step 152 of equipping the remote sensing platform 14 (see fig. 2, 4A) with the imaging system 12 (see fig. 2, 4A) includes equipping the remote sensing platform 14 (see fig. 2, 4A) with an imaging system 12 (see fig. 2, 4A) that includes an optical system 20 (see fig. 2) that includes a digital camera 20a (see fig. 2) and an image sensing system 22 (see fig. 2, 4A) that includes a focal plane array subsystem 22a (see fig. 4A), a radar imaging system 22b (see fig. 4A), a sonar imaging system 22c (see fig. 4A), an infrared imaging system 22d (see fig. 4A), an x-ray imaging system 22e (see fig. 4A), or a light detection and ranging (LIDAR) system 22f (see fig. 4A).
The step 152 of equipping the remote sensing platform 14 (see fig. 2, 4A) with the imaging system 12 (see fig. 2, 4A) further comprises equipping a remote sensing platform 14 (see fig. 2, 4A) comprising an airborne platform 14a (see fig. 4A), a ground-based platform 14b (see fig. 4A), a space-based platform 14c (see fig. 4A), or a water-based platform 14d (see fig. 4A).
As shown in fig. 5A, the method 150 further includes a step 154 of designating the region 118 for imaging (see fig. 4A) to obtain a designated region 118a to be imaged (see fig. 4A). As shown in fig. 5A, the method 150 further includes a step 156 of establishing a plurality of fiducials 120 (see fig. 4A) on a surface 118b (see fig. 4A) of the designated area 118a (see fig. 4A) to be imaged.
As shown in fig. 5A, the method 150 further includes a step 158 of designating, with reference to the plurality of fiducial points 120 (see fig. 4A), a plurality of specific surface regions 124a (see fig. 4A) as exclusion regions 124 (see fig. 4A) that are not to be imaged. As shown in fig. 5A, the method 150 further includes a step 160 of controlling the pre-established acquisition planning process 16 (see fig. 2, 4A) that covers the designated area 118a (see fig. 4A) to be imaged.
As shown in fig. 5A, the method 150 includes a step 162 of positioning the imaging system 12 (see fig. 2, 4A) to image the designated area 118a (see fig. 4A) to be imaged using a navigation system 110 (see fig. 4A) that includes a Global Positioning System (GPS)110a (see fig. 4A), a radio-based navigation system 110b (see fig. 4A), an optical-based navigation system 110c (see fig. 4A), an Inertial Measurement Unit (IMU) system 110d (see fig. 4A), a magnetometer-equipped Inertial Measurement Unit (IMU) system 110e (see fig. 4A), or a combination thereof.
As shown in fig. 5A, the method 150 further includes a step 164 of imaging the designated area 118a (see fig. 4A) to be imaged covered by the pre-established acquisition planning process 16 (see fig. 2, 4A) using the imaging system 12 (see fig. 2, 4A).
As shown in fig. 5A, the method 150 includes a step 166 of dynamically invalidating one or more pixels 126 (see fig. 4B) in one or more images 122 (see fig. 4A) of the exclusion area 124 (see fig. 4A). The step 166 of dynamically invalidating one or more pixels 126 (see fig. 4B) in one or more images 122 (see fig. 4A) of the exclusion area 124 (see fig. 4A) includes altering one or more captured images 124b (see fig. 4B) of the exclusion area 124 (see fig. 4A) to make them illegible.
In one embodiment, the changing of the one or more captured images 124b (see fig. 4A) of the exclusion area 124 (see fig. 4A) is preferably performed in real time during the imaging of the designated area 118a (see fig. 4A) to be imaged. In another embodiment, the changing of the one or more captured images 124b (see fig. 4A) of the exclusion area 124 (see fig. 4A) is performed after the overall imaging of the designated area 118a (see fig. 4A) to be imaged is completed and before the filtered autonomous remotely sensed image 51 (see fig. 2, 4A) is obtained by subjecting the designated area 118a (see fig. 4A) to be imaged to the dynamic image masking process 11 (see fig. 4B).
As shown in fig. 5A, the method 150 includes a step 168 of obtaining the filtered autonomous remotely sensed image 51 (see fig. 2, 4A) by subjecting the designated area 118a (see fig. 4A) to be imaged to the dynamic image masking process 11 (see fig. 4B).
In another embodiment of the present disclosure, a method 170 (see fig. 5B) is provided for providing a filtered autonomous remotely sensed image 51 (see fig. 4A) through the dynamic image masking process 11 (see fig. 4B). Fig. 5B is a schematic illustration of a flow chart of another embodiment of a method 170 of the present disclosure.
As shown in fig. 5B, the method 170 includes a step 172 of equipping an Unmanned Aerial Vehicle (UAV) 200 (see fig. 6) with the imaging system 12 (see fig. 2, 4A).
As shown in fig. 5B, the method 170 further includes a step 174 of designating the region 118 for imaging (see fig. 4A) to obtain a designated region 118a to be imaged (see fig. 4A).
As shown in fig. 5B, the method 170 further includes the step 176 of establishing a plurality of fiducials 120 (see fig. 4A) on the surface 118B (see fig. 4A) of the designated area 118a (see fig. 4A) to be imaged.
As shown in fig. 5B, the method 170 further includes a step 178 of designating, with reference to the plurality of fiducial points 120 (see fig. 4A), a plurality of specific surface regions 124a (see fig. 4A) as exclusion regions 124 (see fig. 4A) that are not to be imaged.
As shown in fig. 5B, the method 170 further includes a step 180 of controlling a pre-established flight plan 17 (see fig. 4A) for a UAV 200 (see fig. 6) covering the designated area 118a (see fig. 4A) to be imaged.
As shown in fig. 5B, the method 170 further includes a step 182 of positioning the imaging system 12 (see fig. 2, 4A) to image the specified area 118a (see fig. 4A) to be imaged using a navigation system 110 (see fig. 4A) that includes a Global Positioning System (GPS)110a (see fig. 4A), a radio-based navigation system 110B (see fig. 4A), an optical-based navigation system 110c (see fig. 4A), an Inertial Measurement Unit (IMU) system 110d (see fig. 4A), a magnetometer-equipped Inertial Measurement Unit (IMU) system 110e (see fig. 4A), or a combination thereof.
As shown in fig. 5B, the method 170 further includes a step 184 of flying the UAV 200 (see fig. 6) over the designated area 118a (see fig. 4A) to be imaged, and imaging the designated area 118a (see fig. 4A) to be imaged covered by the pre-established flight plan 17 (see fig. 4A) of the UAV 200 (see fig. 6) using the imaging system 12 (see fig. 2, 4A).
As shown in fig. 5B, the method 170 further includes a step 186 of dynamically invalidating one or more pixels 126 (see fig. 4B) in the one or more images 122 (see fig. 4A) of the exclusion area 124 (see fig. 4A). The step 186 of dynamically invalidating one or more pixels 126 (see fig. 4B) in the one or more images 122 (see fig. 4A) of the exclusion area 124 (see fig. 4A) includes directing a pre-established flight plan 17 (see fig. 4A) of the UAV 200 (see fig. 6) to avoid flying over the exclusion area 124 (see fig. 4A).
The step 186 of dynamically invalidating one or more pixels 126 (see fig. 4B) in the one or more images 122 (see fig. 4A) of the exclusion area 124 (see fig. 4A) further includes dynamically eliminating, in real time, imaging by the image sensing system 22 (see fig. 2, 4A) while the UAV 200 (see fig. 6) flies over the exclusion area 124 (see fig. 4A).
The step 186 of dynamically invalidating one or more pixels 126 (see fig. 4B) in one or more images 122 (see fig. 4A) of the exclusion area 124 (see fig. 4A) further comprises altering one or more captured images 124b (see fig. 4B) of the exclusion area 124 (see fig. 4A) to make them illegible.
In one embodiment, the changing of the one or more captured images 124b (see fig. 4A) of the exclusion area 124 (see fig. 4A) is preferably performed in real time during the imaging of the designated area 118a (see fig. 4A) to be imaged. In another embodiment, the changing of the one or more captured images 124b (see fig. 4A) of the exclusion area 124 (see fig. 4A) is performed after the overall imaging of the designated area 118a (see fig. 4A) to be imaged is completed and before the filtered autonomous remotely sensed image 51 (see fig. 2, 4A) is obtained by subjecting the designated area 118a (see fig. 4A) to be imaged to the dynamic image masking process 11 (see fig. 4B).
As shown in fig. 5B, the method 170 further includes a step 188 of obtaining the filtered autonomous remotely sensed image 51 (see fig. 2, 4A) by subjecting the designated area 118a (see fig. 4A) to be imaged to the dynamic image masking process 11 (see fig. 4B).
Fig. 6 is a schematic representation of an embodiment of a remote sensing platform 14 in the form of an Unmanned Aerial Vehicle (UAV) 200, such as an airborne platform 14a, that may be used in embodiments of the dynamic image masking system 10, method 150 (see fig. 5A), and method 170 (see fig. 5B) of the present disclosure. As shown in fig. 6, the remote sensing platform 14 in the form of the UAV 200 (such as an airborne platform 14a) includes the dynamic image masking system 10. As further shown in fig. 6, the UAV 200 includes a nose 202, a fuselage 204, wings 206, and an empennage 208.
FIG. 7 is a schematic illustration of a flow chart of an embodiment of an aircraft manufacturing and service method 300. Fig. 8 is a schematic diagram of a functional block diagram of an embodiment of an aircraft 320. Referring to figs. 7-8, embodiments of the present disclosure are described in the context of aircraft manufacturing and service method 300, as shown in fig. 7, and aircraft 320, as shown in fig. 8. During pre-production, exemplary aircraft manufacturing and service method 300 (see FIG. 7) may include specification and design 302 (see FIG. 7) of the aircraft 320 (see FIG. 8) and material procurement 304 (see FIG. 7). During production, component and subassembly manufacturing 306 (see FIG. 7) and system integration 308 (see FIG. 7) of the aircraft 320 (see FIG. 8) take place. Thereafter, the aircraft 320 (see FIG. 8) may go through certification and delivery 310 (see FIG. 7) in order to be placed in service 312 (see FIG. 7). While in service 312 (see FIG. 7) with a customer, the aircraft 320 (see FIG. 8) may be scheduled for routine maintenance and service 314 (see FIG. 7), which may also include modification, reconfiguration, refurbishment, and other suitable services.
A system integrator, a third party, and/or an operator (e.g., a customer) may perform or conduct the various processes of aircraft manufacturing and service method 300 (see fig. 7). For purposes of this specification, a system integrator may include (without limitation) any number of aircraft manufacturers and major-system subcontractors; the third party may include (without limitation) any number of sellers, subcontractors, and suppliers; and the operators may include airlines, leasing companies, military entities, service organizations, and other suitable operators.
As shown in FIG. 8, an aircraft 320 produced by exemplary aircraft manufacturing and service method 300 may include an airframe 322 with a plurality of systems 324 and an interior compartment 326. As further shown in fig. 8, examples of the system 324 may include one or more of a propulsion system 328, an electrical system 330, a hydraulic system 332, and an environmental system 334. Any number of other systems may be included. Although an aerospace example is shown, the principles of the disclosure may be applied to other industries, such as the automotive industry.
The methods and systems embodied herein may be employed during any one or more of the stages of aircraft manufacturing and service method 300 (see FIG. 7). For example, components or subassemblies corresponding to component and subassembly manufacturing 306 (see fig. 7) may be fabricated or manufactured in a manner similar to components or subassemblies produced while the aircraft 320 (see fig. 8) is in service 312 (see fig. 7). Additionally, one or more apparatus embodiments, method embodiments, or a combination thereof may be utilized during component and subassembly manufacturing 306 (see fig. 7) and system integration 308 (see fig. 7), for example, to substantially expedite assembly of or reduce the cost of the aircraft 320 (see fig. 8). Similarly, while the aircraft 320 (see fig. 8) is in service 312 (see fig. 7), one or more of the apparatus embodiments, the method embodiments, or a combination thereof may be utilized, for example and without limitation, during routine maintenance and service 314 (see fig. 7).
The disclosed embodiments of the dynamic image masking system 10 (see fig. 2, 4A-4B), method 150 (see fig. 5A), and method 170 (see fig. 5B) provide a number of advantages over known systems and methods, including imaging only useful and desirable data, and not imaging areas or data that are restricted, out of range, or out of scope for the remote sensing platform task, such as an airborne platform task. This "guaranteed shutter control" addresses potential privacy-intrusion concerns and ensures that the dynamic image masking system 10 (see fig. 2, 4A-4B) is not burdened with unused data, such as data collected over non-customer areas.
In addition, the disclosed embodiments of the dynamic image masking system 10 (see fig. 2, 4A-4B), method 150 (see fig. 5A), and method 170 (see fig. 5B) provide a well-defined acquisition area for image acquisition and provide the autonomous operations typically required for aerial remote-sensing image acquisition in the precision agriculture market, such as flying over agricultural fields to determine plant health and robustness. Further, the disclosed embodiments of the dynamic image masking system 10 (see fig. 2, 4A-4B), the method 150 (see fig. 5A), and the method 170 (see fig. 5B) integrate the imaging system 12 (see fig. 2, 4A) with an autopilot of the remote sensing platform 14, such as an Unmanned Aerial Vehicle (UAV) 200 (see fig. 6), and may perform flight and shutter control operations for multiple UAVs 200 simultaneously.
Furthermore, the disclosed embodiments of the dynamic image masking system 10 (see fig. 2, 4A-4B), the method 150 (see fig. 5A), and the method 170 (see fig. 5B) produce reliable, repeatable masked images 50 (see fig. 2, 4A), which are preferably produced using only the pixels of interest 126 (see fig. 4B). Unwanted pixels 126 (see fig. 4B) may be left unacquired, blanked out, overwritten, light-saturated, or otherwise altered, thereby rendering those pixels 126 (see fig. 4B) useless in the production process. Moreover, this can occur anywhere from "cancelling" in the acquisition planning stage 112 (see fig. 4A), to overwriting in the acquisition stage 114 (see fig. 4A), to the production of post-processed products in the post-processing stage 116 (see fig. 4A) after one or more images 122 (see fig. 4A) have been acquired.
Further, the present disclosure includes embodiments according to the following items:
Item 1. A dynamic image masking system (10) for providing a filtered autonomous remotely sensed image (51) by a dynamic image masking process (11), the system comprising:
a remote sensing platform (14);
an imaging system (12) associated with the remote sensing platform (14), the imaging system (12) comprising:
an optical system (20);
an image sensing system (22);
a multi-level security system (42) associated with the imaging system (12);
one or more image change locations (90) in the imaging system (12) and the multi-level security system (42), wherein the change of one or more images is made via the dynamic image masking process (11); and
a computer system (130) associated with the imaging system (12), the computer system (130) comprising a gatekeeper algorithm (60) configured to send a gatekeeper command (62) to one or more controllers (63), the controllers (63) controlling the one or more image change positions (90) through the dynamic image masking process (11).
Item 2. The dynamic image masking system (10) according to item 1, further comprising a navigation system (110) for positioning the imaging system (12) to image a designated area (118a) to be imaged, the navigation system (110) comprising a Global Positioning System (GPS) (110a), a radio-based navigation system (110b), an optical-based navigation system (110c), an Inertial Measurement Unit (IMU) system (110d), a magnetometer-equipped Inertial Measurement Unit (IMU) system (110e), or a combination thereof.
Item 3. The dynamic image masking system (10) according to item 1 or item 2, wherein the remote sensing platform (14) comprises an airborne platform (14a), a ground-based platform (14b), a space-based platform (14c), or a water-based platform (14d).
Item 4. The dynamic image masking system (10) of item 1, item 2, or item 3, wherein the optical system (20) comprises a camera (20a) including a digital camera (20b), and wherein the image sensing system (22) comprises a focal plane array subsystem (22a), a radar imaging system (22b), a sonar imaging system (22c), an infrared imaging system (22d), an x-ray imaging system (22e), or a light detection and ranging (LIDAR) system (22f).
Item 5. The dynamic image masking system (10) of item 1, item 2, item 3, or item 4, wherein the gatekeeper algorithm (60) is further configured to send a gatekeeper command (62) to a pre-established acquisition planning process (16), controlling an image change location (90) located at the pre-established acquisition planning process (16) prior to input to the imaging system (12), by determining an exclusion region (124) that is not to be imaged with the imaging system (12).
Item 6. The dynamic image masking system (10) of item 1, item 2, item 3, item 4, or item 5, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62), mechanically or optically, to an optical blinding system (64), the optical blinding system (64) controlling an image change location (90) located between the optical system (20) and the image sensing system (22), the optical blinding system (64) including a shutter control mechanism (66a) to inhibit one or more pixels (126) from collecting photons, or a laser optical device (67a) and a micromirror optical device (67b) to illuminate one or more pixels (126) to blind the one or more pixels (126).
Item 7. The dynamic image masking system (10) of item 1, item 2, item 3, item 4, item 5, or item 6, wherein the image sensing system (22) comprises a focal plane array subsystem (22a) comprising:
a focal plane array (26) reading raw image data (24) from the optical system (20);
an analog-to-digital converter (30) that receives the raw image data (24) from the focal plane array (26) and converts the raw image data (24) from an analog signal to a digital signal;
a volatile temporary memory (34) receiving the digital signal (37) from the analog-to-digital converter (30) and temporarily storing the digital signal (37);
a digital signal processor (38) receiving the digital signal (37) from the volatile temporary memory (34) and processing the digital signal (37) into a readable image format (39); and
a digital-to-analog converter (54) that, when the imaging system (12) uses analog output, receives a readable digital signal from the digital signal processor (38) and converts the readable digital signal to an analog signal.
Item 8. The dynamic image masking system (10) of item 7, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62) to a pixel controller that controls an image change location (90) on the focal plane array (26) by overwriting one or more pixels on the focal plane array (26) with zero saturation (140) or one hundred percent saturation (142).
Item 9. The dynamic image masking system (10) of item 7, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62) to a digitizing controller (72), the digitizing controller (72) controlling an image change location (90) located between the analog-to-digital converter (30) and the volatile temporary memory (34) by setting a digitized value (146) of one or more pixels (126) to a minimum value (146a) or a maximum value (146b).
Item 10. The dynamic image masking system (10) of item 7, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62) to a digital flow controller (73), the digital flow controller (73) controlling an image change location (90) located between the volatile temporary memory (34) and the digital signal processor (38) by changing a single image (122) at a time and obscuring one or more pixels (126) in the single image (122).
Item 11. The dynamic image masking system (10) of item 7, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62) to a control storage controller (80), the control storage controller (80) controlling an image change location (90) located at the digital signal processor output (40) of the focal plane array subsystem (22a) and prior to input to a non-volatile results memory (44) of a multi-level security system (42), by obscuring one or more pixels (126) so that they are not written to the non-volatile results memory (44).
Item 12. The dynamic image masking system (10) of item 7, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62) to an analog signal controller (84), the analog signal controller (84) controlling an image change location (90) located at the digital-to-analog converter output (56) of the focal plane array subsystem (22a) and prior to input to a video editing system (58), by obscuring one or more pixels (126) so that they are not written to the video editing system (58).
Item 13. The dynamic image masking system (10) of item 1, item 2, item 3, item 4, item 5, item 6, item 7, item 8, item 9, item 10, item 11, or item 12, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62), the gatekeeper command (62) controlling an image change location (90) positioned between a non-volatile results memory (44) and a post-processing process (48) in the multi-level security system (42), by overwriting one or more pixels (126) with zero saturation (140) or one hundred percent saturation (142).
Item 14. The dynamic image masking system (10) of item 1, item 2, item 3, item 4, item 5, item 6, item 7, item 8, item 9, item 10, item 11, item 12, or item 13, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62), the gatekeeper command (62) controlling an image change location (90) in a post-processing process (48) of the multi-level security system (42) by editing or ignoring one or more pixels (126) of an exclusion area (124) within a designated area (118a) to be imaged.
Item 15. A method for providing a filtered autonomous remotely sensed image (51) by a dynamic image masking process (11), the method comprising the steps of:
equipping a remote sensing platform (14) with an imaging system (12);
designating a region (118) for imaging to obtain a designated region (118a) to be imaged;
establishing a plurality of reference points (120) on a surface (118b) of the designated area (118a) to be imaged;
designating, with reference to the plurality of fiducial points (120), a plurality of specific surface regions (124a) as excluded regions (124) that are not to be imaged;
controlling a pre-established acquisition planning process (16) covering the designated area (118a) to be imaged;
positioning the imaging system (12) using a navigation system (110) including a Global Positioning System (GPS) (110a), a radio-based navigation system (110b), an optical-based navigation system (110c), an Inertial Measurement Unit (IMU) system (110d), a magnetometer-equipped Inertial Measurement Unit (IMU) system (110e), or a combination thereof, to image the designated area (118a) to be imaged;
imaging, using the imaging system (12), the designated area (118a) to be imaged covered by the pre-established acquisition planning process (16);
dynamically invalidating one or more pixels (126) in one or more images (122) of the exclusion area (124); and
obtaining a filtered autonomous remotely sensed image (51) by performing the dynamic image masking process (11) on the designated area (118a) to be imaged.
Item 16. The method of item 15, wherein the step of equipping the remote sensing platform (14) with the imaging system (12) comprises equipping the remote sensing platform (14) with the imaging system (12) comprising an optical system (20) including a digital camera (20a) and an image sensing system (22) including a focal plane array subsystem (22a), a radar imaging system (22b), a sonar imaging system (22c), an infrared imaging system (22d), an x-ray imaging system (22e), or a light detection and ranging (LIDAR) system (22f).
Item 17. The method of item 15 or item 16, wherein the step of equipping the remote sensing platform (14) with the imaging system (12) comprises equipping a remote sensing platform (14) comprising an airborne platform (14a), a ground-based platform (14b), a space-based platform (14c), or a water-based platform (14d).
Item 18. The method of item 15, item 16, or item 17, wherein the step of dynamically invalidating the one or more pixels (126) in the one or more images (122) of the exclusion region (124) comprises altering one or more captured images (124b) of the exclusion region (124) to make them illegible.
Item 19. The method of item 18, wherein the changing of one or more captured images (124b) of the exclusion region (124) is effected in real time during imaging of the designated region (118a) to be imaged.
Item 20. A method for providing a filtered autonomous remotely sensed image (51) by a dynamic image masking process (11), the method comprising the steps of:
equipping an Unmanned Aerial Vehicle (UAV) (200) with an imaging system (12);
designating a region (118) for imaging to obtain a designated region (118a) to be imaged;
establishing a plurality of reference points (120) on a surface (118b) of the designated area (118a) to be imaged;
designating, with reference to the plurality of fiducial points (120), a plurality of specific surface regions (124a) as excluded regions (124) that are not to be imaged;
controlling a pre-established flight plan (17) of the UAV (200) covering the designated area (118a) to be imaged;
positioning the imaging system (12) using a navigation system (110) including a Global Positioning System (GPS) (110a), a radio-based navigation system (110b), an optical-based navigation system (110c), an Inertial Measurement Unit (IMU) system (110d), a magnetometer-equipped Inertial Measurement Unit (IMU) system (110e), or a combination thereof, to image the designated area (118a) to be imaged;
flying the UAV (200) over the designated area (118a) to be imaged, and imaging the designated area (118a) to be imaged covered by the pre-established flight plan (17) of the UAV (200) using the imaging system (12);
dynamically invalidating one or more pixels (126) in one or more images (122) of the exclusion area (124); and
obtaining a filtered autonomous remotely sensed image (51) by performing the dynamic image masking process (11) on the designated area (118a) to be imaged.
Item 21. The method of item 20, wherein the step of dynamically invalidating the one or more pixels (126) in one or more images (122) of the exclusion area (124) comprises directing the pre-established flight plan (17) of the UAV (200) to avoid flying over the exclusion area (124).
Item 22. The method of item 20 or item 21, wherein the step of dynamically invalidating the one or more pixels (126) in the one or more images (122) of the exclusion area (124) comprises dynamically eliminating, in real time, imaging by the image sensing system (22) while the UAV (200) flies over the exclusion area (124).
Item 23. The method of item 20, item 21, or item 22, wherein the step of dynamically invalidating the one or more pixels (126) in the one or more images (122) of the exclusion region (124) comprises altering one or more captured images (124b) of the exclusion region (124) to make them illegible.
Item 24. The method of item 23, wherein the changing of the one or more captured images (124b) of the exclusion region (124) is effected in real time during imaging of the designated region (118a) to be imaged.
Item 25. The method of item 23, wherein the changing of one or more captured images (124b) of the exclusion area (124) is effected after completing the overall imaging of the designated area (118a) to be imaged and before obtaining the filtered autonomous remotely sensed image (51) by subjecting the designated area (118a) to be imaged to the dynamic image masking process (11).
Many modifications and other embodiments of the disclosure will come to mind to one skilled in the art to which this disclosure pertains having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. The described embodiments in this disclosure are intended to be illustrative and are not intended to be limiting or exhaustive. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims (14)

1. A dynamic image masking system (10) for providing a filtered autonomous remotely sensed image (51) by a dynamic image masking process (11), the dynamic image masking system comprising:
a remote sensing platform (14);
an imaging system (12) associated with the remote sensing platform (14), the imaging system (12) comprising:
an optical system (20) that outputs raw image data obtained with the optical system;
an image sensing system (22) receiving the raw image data from the optical system, the image sensing system comprising a focal plane array subsystem;
a multi-level security system (42) associated with the imaging system (12);
one or more image change locations (90) in the imaging system (12) and the multi-level security system (42), wherein the change of one or more images is made via the dynamic image masking process (11);
an optical blinding system that controls an image change position between the optical system and the image sensing system, the optical blinding system including a shutter control mechanism to inhibit one or more pixels from collecting photons, or laser optics and micromirror optics to illuminate one or more pixels to blind the one or more pixels; and
a computer system (130) associated with the imaging system (12), the computer system (130) comprising a gatekeeper algorithm (60) configured to send a gatekeeper command (62) to one or more controllers (63), the one or more controllers (63) controlling the one or more image change positions (90) through the dynamic image masking process (11), the one or more controllers (63) comprising the optical blinding system,
wherein the gatekeeper algorithm is configured to calculate where a pixel originates and to determine whether the pixel is in a region for imaging, and wherein, if the pixel is in the region for imaging, the pixel is captured, and, if the pixel is not in the region for imaging, the pixel is changed.
2. The dynamic image masking system (10) according to claim 1, further comprising a navigation system (110) for positioning the imaging system (12) to image a designated area (118a) to be imaged, the navigation system (110) comprising a global positioning system (110a), a radio-based navigation system (110b), an optical-based navigation system (110c), an inertial measurement unit system (110d), a magnetometer-equipped inertial measurement unit system (110e), or a combination thereof.
3. The dynamic image masking system (10) according to claim 1 or 2, wherein the remote sensing platform (14) comprises an airborne platform (14a), a ground-based platform (14b), a space-based platform (14c), or a water-based platform (14d).
4. The dynamic image masking system (10) according to claim 1 or 2, wherein the optical system (20) comprises a camera (20a) including a digital camera (20b), and wherein the image sensing system (22) further comprises a radar imaging system (22b), a sonar imaging system (22c), an infrared imaging system (22d), an x-ray imaging system (22e), or a light detection and ranging (LIDAR) system (22f).
5. The dynamic image masking system (10) according to claim 1 or 2, wherein the gatekeeper algorithm (60) is further configured to send a gatekeeper command (62) to a pre-established acquisition planning process (16), controlling an image change location (90) located at the pre-established acquisition planning process (16) prior to input to the imaging system (12), by determining an exclusion region (124) that is not to be imaged with the imaging system (12).
6. The dynamic image masking system (10) according to claim 1 or 2, wherein the gatekeeper algorithm (60) is configured to send gatekeeper commands (62) mechanically or optically to the optical blinding system (64).
7. The dynamic image masking system (10) according to claim 1 or 2, wherein the focal plane array subsystem (22a) includes:
a focal plane array (26) reading raw image data (24) from the optical system (20);
an analog-to-digital converter (30) that receives the raw image data (24) from the focal plane array (26) and converts the raw image data (24) from an analog signal to a digital signal;
a volatile temporary memory (34) receiving the digital signal (37) from the analog-to-digital converter (30) and temporarily storing the digital signal (37);
a digital signal processor (38) receiving the digital signal (37) from the volatile temporary memory (34) and processing the digital signal (37) into a readable image format (39); and
a digital-to-analog converter (54) that, when the imaging system (12) uses analog output, receives a readable digital signal from the digital signal processor (38) and converts the readable digital signal to an analog signal.
8. The dynamic image masking system (10) according to claim 1 or 2, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62) that controls an image change location (90) located between a non-volatile results memory (44) and a post-processing process (48) in the multi-level security system (42) by overwriting one or more pixels (126) with zero saturation (140) or one hundred percent saturation (142).
9. The dynamic image masking system (10) according to claim 1 or 2, wherein the gatekeeper algorithm (60) is configured to send a gatekeeper command (62) controlling an image change location (90) in a post-processing process (48) of the multi-level security system (42) by editing or ignoring one or more pixels (126) of an excluded area (124) within a designated area (118a) to be imaged.
10. A method for providing a filtered autonomous remotely sensed image (51) by a dynamic image masking process (11), the method comprising the steps of:
equipping a remote sensing platform (14) with an imaging system (12);
designating a region (118) for imaging to obtain a designated region (118a) to be imaged;
establishing a plurality of reference points (120) on a surface (118b) of the designated area (118a) to be imaged;
designating, with reference to the plurality of fiducial points (120), a plurality of specific surface regions (124a) as excluded regions (124) that are not to be imaged;
controlling a pre-established acquisition planning process (16) covering the designated area (118a) to be imaged;
positioning the imaging system (12) using a navigation system (110) including a Global Positioning System (GPS) (110a), a radio-based navigation system (110b), an optical-based navigation system (110c), an Inertial Measurement Unit (IMU) system (110d), a magnetometer-equipped Inertial Measurement Unit (IMU) system (110e), or a combination thereof, to image the designated area (118a) to be imaged;
imaging, using the imaging system (12), the designated area (118a) to be imaged covered by the pre-established acquisition planning process (16);
calculating where a pixel originates and determining whether the pixel is in the region (118) for imaging,
capturing the pixel if the pixel is in the region (118) for imaging, and
dynamically invalidating one or more pixels (126) in one or more images (122) of the exclusion area (124) if the pixel is not in the region (118) for imaging; and
obtaining a filtered autonomous remotely sensed image (51) by performing the dynamic image masking process (11) on the designated area (118a) to be imaged,
the imaging system includes:
an optical system (20) that outputs raw image data obtained with the optical system;
an image sensing system (22) receiving the raw image data from the optical system, the image sensing system comprising a focal plane array subsystem;
an optical blinding system that controls an image change position between the optical system and the image sensing system, the optical blinding system including a shutter control mechanism to inhibit one or more pixels from collecting photons, or laser optics and micromirror optics to illuminate one or more pixels to blind the one or more pixels; and
a computer system (130) associated with the imaging system (12), the computer system (130) comprising a gatekeeper algorithm (60) configured to send a gatekeeper command (62) to one or more controllers (63), the one or more controllers (63) controlling the one or more image change positions (90) through the dynamic image masking process (11), the one or more controllers (63) comprising the optical blinding system.
11. The method of claim 10, wherein the step of equipping the remote sensing platform (14) with the imaging system (12) comprises equipping the remote sensing platform (14) with the imaging system (12) including an optical system (20) including a digital camera (20a) and an image sensing system (22) including a focal plane array subsystem (22a), a radar imaging system (22b), a sonar imaging system (22c), an infrared imaging system (22d), an x-ray imaging system (22e), or a light detection and ranging (LIDAR) system (22f).
12. The method of claim 10 or 11, wherein the step of equipping the remote sensing platform (14) with the imaging system (12) comprises equipping a remote sensing platform (14) comprising an airborne platform (14a), a ground-based platform (14b), a space-based platform (14c), or a water-based platform (14d).
13. The method of claim 10 or 11, wherein the step of dynamically invalidating the one or more pixels (126) in the one or more images (122) of the exclusion area (124) comprises altering one or more captured images (124b) of the exclusion area (124) to make them illegible.
14. The method of claim 13, wherein the changing of one or more captured images (124b) of the exclusion region (124) is performed in real time during imaging of the designated region (118a) to be imaged.
CN201510634267.0A 2014-09-29 2015-09-29 Dynamic image masking system and method Active CN105526916B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/500,589 US9846921B2 (en) 2014-09-29 2014-09-29 Dynamic image masking system and method
US14/500,589 2014-09-29

Publications (2)

Publication Number Publication Date
CN105526916A CN105526916A (en) 2016-04-27
CN105526916B true CN105526916B (en) 2020-10-02

Family

ID=55485416

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510634267.0A Active CN105526916B (en) 2014-09-29 2015-09-29 Dynamic image masking system and method

Country Status (4)

Country Link
US (1) US9846921B2 (en)
JP (1) JP6629019B2 (en)
CN (1) CN105526916B (en)
FR (1) FR3026540B1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3241205A4 (en) * 2014-12-31 2018-11-07 Airmap Inc. System and method for controlling autonomous flying vehicle flight paths
US10029804B1 (en) * 2015-05-14 2018-07-24 Near Earth Autonomy, Inc. On-board, computerized landing zone evaluation system for aircraft
TW201836020A (en) 2017-02-17 2018-10-01 日商半導體能源研究所股份有限公司 Semiconductor device and method for manufacturing semiconductor device
WO2018170857A1 (en) * 2017-03-23 2018-09-27 深圳市大疆创新科技有限公司 Method for image fusion and unmanned aerial vehicle
US10606271B2 (en) 2017-07-17 2020-03-31 The Boeing Company Magnetic navigation and positioning system
US10922431B2 (en) 2017-12-27 2021-02-16 Honeywell International Inc. Systems and methods for dynamically masking video and images captured by a drone device camera
EP4009226B1 (en) * 2020-12-04 2024-05-29 Axis AB Tag for indicating a region of interest and method for finding a region of interest in an image
CA3137651A1 (en) 2020-12-19 2022-06-19 The Boeing Company Combined multi-spectral and polarization sensor
CN114596683A (en) * 2022-02-09 2022-06-07 青岛海信日立空调系统有限公司 Intrusion detection method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001023072A (en) * 1999-07-05 2001-01-26 Nippon Signal Co Ltd:The Road traffic information providing system
JP2003173449A (en) * 2001-12-06 2003-06-20 Dowa Koei Kk Remote sensing supporting device, its program and program recording medium
JP2004030460A (en) * 2002-06-27 2004-01-29 Starlabo Corp Image processing method, image processing program and recording medium with the same program recorded thereon

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7019777B2 (en) * 2000-04-21 2006-03-28 Flight Landata, Inc. Multispectral imaging system with spatial resolution enhancement
US6909997B2 (en) * 2002-03-26 2005-06-21 Lockheed Martin Corporation Method and system for data fusion using spatial and temporal diversity between sensors
JP4508753B2 (en) * 2003-07-12 2010-07-21 エルジー エレクトロニクス インコーポレイティド Camera photographing restriction system and method for portable terminal
KR100652619B1 (en) * 2003-07-18 2006-12-01 엘지전자 주식회사 Usage restriction system and method for digital camera adapted to mobile terminal
US20070070894A1 (en) * 2005-09-26 2007-03-29 Fan Wang Method to determine a scheduling priority value for a user data connection based on a quality of service requirement
US8918540B2 (en) * 2005-09-26 2014-12-23 The Boeing Company Unmanned air vehicle interoperability agent
JP4356733B2 (en) * 2006-11-09 2009-11-04 アイシン精機株式会社 In-vehicle image processing apparatus and control method thereof
US8218868B2 (en) * 2007-06-13 2012-07-10 Sensors Unlimited, Inc. Method and apparatus for enhancing images
US9041915B2 (en) * 2008-05-09 2015-05-26 Ball Aerospace & Technologies Corp. Systems and methods of scene and action capture using imaging system incorporating 3D LIDAR
US8731234B1 (en) * 2008-10-31 2014-05-20 Eagle View Technologies, Inc. Automated roof identification systems and methods
US20140347482A1 (en) * 2009-02-20 2014-11-27 Appareo Systems, Llc Optical image monitoring system and method for unmanned aerial vehicles
US8713215B2 (en) * 2009-05-29 2014-04-29 Z Microsystems, Inc. Systems and methods for image stream processing
US8266333B1 (en) * 2009-05-29 2012-09-11 Z Microsystems, Inc. System and method for parallel image processing and routing
US9163909B2 (en) 2009-12-11 2015-10-20 The Boeing Company Unmanned multi-purpose ground vehicle with different levels of control
US8965598B2 (en) * 2010-09-30 2015-02-24 Empire Technology Development Llc Automatic flight control for UAV based solid modeling
US9086484B2 (en) * 2011-06-30 2015-07-21 The Boeing Company Context-based target recognition
WO2014031557A1 (en) * 2012-08-20 2014-02-27 Drexel University Dynamically focusable multispectral light field imaging
JP6055274B2 (en) * 2012-10-31 2016-12-27 株式会社トプコン Aerial photograph measuring method and aerial photograph measuring system
DE102013019488A1 (en) * 2012-11-19 2014-10-09 Mace Wolf PHOTO WITH PROTECTION OF THE PRIVACY
US20140312165A1 (en) * 2013-03-15 2014-10-23 Armen Mkrtchyan Methods, apparatus and systems for aerial assessment of ground surfaces
EP3069509A4 (en) * 2013-11-14 2017-09-20 KSI Data Sciences, Inc. A system and method for managing and analyzing multimedia information

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001023072A (en) * 1999-07-05 2001-01-26 Nippon Signal Co Ltd:The Road traffic information providing system
JP2003173449A (en) * 2001-12-06 2003-06-20 Dowa Koei Kk Remote sensing supporting device, its program and program recording medium
JP2004030460A (en) * 2002-06-27 2004-01-29 Starlabo Corp Image processing method, image processing program and recording medium with the same program recorded thereon

Also Published As

Publication number Publication date
CN105526916A (en) 2016-04-27
JP6629019B2 (en) 2020-01-15
JP2017027571A (en) 2017-02-02
FR3026540B1 (en) 2019-03-29
US9846921B2 (en) 2017-12-19
FR3026540A1 (en) 2016-04-01
US20170018058A1 (en) 2017-01-19

Similar Documents

Publication Publication Date Title
CN105526916B (en) Dynamic image masking system and method
US11086324B2 (en) Structure from motion (SfM) processing for unmanned aerial vehicle (UAV)
US20200051264A1 (en) Image processing apparatus, ranging apparatus and processing apparatus
CN107240063A (en) A kind of autonomous landing method of rotor wing unmanned aerial vehicle towards mobile platform
JP7152836B2 (en) UNMANNED AIRCRAFT ACTION PLAN CREATION SYSTEM, METHOD AND PROGRAM
US20210009270A1 (en) Methods and system for composing and capturing images
US20200320293A1 (en) Texture classification of digital images in aerial inspection
US20140347482A1 (en) Optical image monitoring system and method for unmanned aerial vehicles
US20180365839A1 (en) Systems and methods for initialization of target object in a tracking system
JP6232502B2 (en) Artificial vision system
CN111225855A (en) Unmanned plane
US11741571B2 (en) Voronoi cropping of images for post field generation
CN108414454A (en) The synchronized measurement system and measurement method of a kind of plant three-dimensional structure and spectral information
DE202016007867U1 (en) Control the line of sight angle of an image processing platform
CN112163483A (en) Target quantity detection system
CN106846385A (en) Many sensing Remote Sensing Images Matching Methods, device and system based on unmanned plane
Gehrke et al. Multispectral image capturing with foveon sensors
Gehrke et al. RGBI images with UAV and off-the-shelf compact cameras: An investigation of linear sensor characteristics
US10553022B2 (en) Method of processing full motion video data for photogrammetric reconstruction
US20240095911A1 (en) Estimating properties of physical objects, by processing image data with neural networks
US11127165B1 (en) Registration of single channel image sensors
US12126942B2 (en) Active camouflage detection systems and methods
BD et al. A new rapid, low-cost and GPS-centric unmanned aerial vehicle incorporating in-situ multispectral oil palm trees health detection
DE202014011010U1 (en) Target tracking systems
CN117401194A (en) Unmanned aerial vehicle with adjustable accessory direction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant