
IL276014B1 - Satellite Imaging System with Reduced Cloud Obstruction - Google Patents

Satellite Imaging System with Reduced Cloud Obstruction

Info

Publication number
IL276014B1
Authority
IL
Israel
Prior art keywords
satellite
leading
trailing
cloud
targets
Prior art date
Application number
IL276014A
Other languages
Hebrew (he)
Other versions
IL276014A (en)
Inventor
Rosenthal Eran
Original Assignee
Israel Aerospace Ind Ltd
Rosenthal Eran
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Israel Aerospace Ind Ltd, Rosenthal Eran filed Critical Israel Aerospace Ind Ltd
Priority to IL276014A priority Critical patent/IL276014B1/en
Publication of IL276014A publication Critical patent/IL276014A/en
Publication of IL276014B1 publication Critical patent/IL276014B1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/10 Artificial satellites; Systems of such satellites; Interplanetary vehicles
    • B64G1/1021 Earth observation satellites
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/10 Artificial satellites; Systems of such satellites; Interplanetary vehicles
    • B64G1/1085 Swarms and constellations
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64G COSMONAUTICS; VEHICLES OR EQUIPMENT THEREFOR
    • B64G1/00 Cosmonautic vehicles
    • B64G1/22 Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles
    • B64G1/222 Parts of, or equipment specially adapted for fitting in or to, cosmonautic vehicles for deploying structures between a stowed and deployed state
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S1/00 Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith
    • G01S1/02 Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith using radio waves
    • G01S1/08 Systems for determining direction or position line
    • G01S1/14 Systems for determining direction or position line using amplitude comparison of signals transmitted simultaneously from antennas or antenna systems having differently oriented overlapping directivity-characteristics
    • G01S1/18 Elevational guidance systems, e.g. system for defining aircraft glide path

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Astronomy & Astrophysics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Description

SATELLITE IMAGING SYSTEM WITH REDUCED CLOUD OBSTRUCTION

FIELD OF THE INVENTION

The presently disclosed subject matter relates to Earth satellite imagery.
BACKGROUND OF THE INVENTION

Earth satellite imagery includes images which are collected by an imaging system operating onboard a satellite orbiting the Earth. Satellites configured for obtaining imagery of the Earth include a category of Low Earth Orbit (LEO) satellites, whose orbits occupy a region of space from about 180 kilometers to 2000 kilometers above the Earth's surface. LEO imagery-acquiring satellites are used for capturing images of the Earth for various purposes, such as environmental monitoring, cartography, intelligence, etc.
GENERAL DESCRIPTION

Following launch, some LEO satellites orbit the Earth while operating an electro-optic image sensor (EOS) for capturing images of target areas on Earth. While a high resolution EOS can provide high quality images of the Earth, at any given time, on average, about 50% of the Earth's surface is covered by clouds which are opaque to electromagnetic radiation in the frequency range detectable by an EOS. Accordingly, during an Earth imagery mission, many areas on Earth which are obscured by clouds cannot be captured by an EOS onboard a satellite located above the clouds. Execution of electro-optic (EO) imaging by a satellite while ignoring the cloud conditions results in a considerably degraded imaging output. One method applied nowadays to overcome this problem includes repeatedly capturing an area of interest during multiple passes over the same area and fusing the captured images. This method is cumbersome in terms of both time and consumption of the satellite's limited resources. Another method includes using a synthetic aperture radar (SAR), a costly and bulky radar that provides imaging with less information as compared to EO sensors.

The presently disclosed subject matter includes a satellite system and method of operation which are dedicated to mitigating target obstruction by clouds during EO imaging of the Earth.
According to one aspect of the presently disclosed subject matter, there is provided a satellite system for electro-optic imaging, the system comprising: a trailing satellite comprising a first imaging subsystem with a first electro-optic image sensor (EOS), configured and operable to capture images of one or more targets located on Earth; a leading satellite comprising a second imaging subsystem comprising a second electro-optic image sensor (EOS); wherein the leading satellite is characterized by smaller dimensions and smaller mass compared to the trailing satellite; while the leading satellite is flying along an orbit around the Earth, the second imaging subsystem is configured and operable to operate the second electro-optic image sensor for capturing images in the direction of the Earth; the system further comprising at least one processing circuitry configured and operable to: process the captured images and identify at least one cloud in the captured images; determine positioning data of the at least one cloud relative to a certain frame of reference; determine whether any of the one or more targets are bound to be obstructed by the at least one identified cloud, thereby identifying a group comprising one or more targets unobstructed by the clouds; generate an image acquisition scheme including instructions for controlling the first imaging subsystem onboard the trailing satellite for imaging targets in the group; while the trailing satellite is following the leading satellite along the orbit, the first imaging subsystem is configured and operable to operate the first electro-optic image sensor for capturing images of one or more targets according to the instructions prescribed by the image acquisition scheme, thereby mitigating obstruction of the one or more targets by the at least one cloud.
In addition to the above features, the system according to the above aspect can optionally comprise one or more of features (i) to (xxii) below, in any technically possible combination or permutation:

I. wherein the first EOS comprises a first optical aperture and the second EOS comprises a second optical aperture that is smaller than the first optical aperture, thereby providing a degraded imaging output as compared to the first EOS.
II. wherein the second EOS is characterized by a resolution at nadir that is lower than the resolution at nadir of the first EOS.
III. wherein the positioning data is three-dimensional positioning data that includes altitude.
IV. wherein the at least one processing circuitry is further configured to: determine a future position of the at least one cloud relative to the one or more targets during a future respective acquisition period of the acquisition scheme; and identify the group of one or more targets using the cloud future position.
V. wherein the future position of the at least one cloud and the one or more targets is determined using data on the Earth's rotation.
VI. wherein the at least one processing circuitry includes a first processing circuitry onboard the trailing satellite; wherein the first processing circuitry is further configured to receive the images from the leading satellite.
VII. wherein the at least one processing circuitry includes a second processing circuitry located at a ground control station connected over a communication link with the leading satellite and the trailing satellite; and wherein the second processing circuitry is further configured to receive the images from the leading satellite and transmit the image acquisition scheme to the trailing satellite.
VIII. wherein the image acquisition scheme is generated while striving to increase the respective number of image acquisitions, while considering the agility of the trailing satellite.
IX. wherein the system comprises two or more leading satellites configured to fly along the orbit; wherein each leading satellite of the two or more leading satellites comprises a respective second electro-optic image sensor; wherein the at least one processing circuitry is further configured to determine the positioning data of the at least one cloud based on a plurality of images captured by imaging subsystems onboard the two or more leading satellites.
X. wherein the system comprises two or more leading satellites configured to advance along the orbit with a respective distance between each pair of leading satellites; wherein each leading satellite of the two or more leading satellites comprises a respective second electro-optic image sensor; wherein the at least one processing circuitry is further configured to: process the plurality of images captured simultaneously by imaging subsystems onboard the two or more leading satellites for the purpose of determining a velocity of the at least one cloud; and estimate the future positioning of the at least one cloud based on the so-determined velocity.
XI. wherein the system comprises a multispectral or a hyperspectral sensor onboard the leading satellite and/or the trailing satellite configured to provide a spectral response of the at least one cloud, the at least one processing circuitry being configured to determine data indicative of the transparency of the at least one cloud based on the spectral response and to exclude the at least one cloud from obstructing targets in case the transparency complies with one or more conditions.
XII. wherein the trailing satellite and the leading satellite are capable of transmitting data to the ground control station during one or more communication periods.
XIII. wherein the trailing satellite is configured to transmit to the leading satellite data generated onboard the trailing satellite during times other than the respective communication period, the generated data including captured images of the one or more targets; the leading satellite is configured to transmit the data to a ground control station during a respective communication period, thereby increasing an overall volume of data that is transmitted to the ground control station during the respective communication period.
XIV. wherein the trailing satellite is a LEO satellite.
XV. wherein the leading satellite has a mass of 100 kg or less.
XVI. wherein the leading satellite has a mass of 20 kg or less.
XVII. wherein the leading satellite is a microsatellite or a nanosatellite.
XVIII. wherein the time lag between the leading satellite and the trailing satellite along the orbit is equal to or greater than 2 minutes.
XIX. wherein the time lag between the leading satellite and the trailing satellite along the orbit is equal to or greater than 3 minutes.
XX. wherein the time lag between the leading satellite and the trailing satellite along the orbit is equal to or greater than 5 minutes.
XXI. wherein the time lag between the leading satellite and the trailing satellite along the orbit is equal to or greater than 7 minutes.
XXII. wherein the time lag between the leading satellite and the trailing satellite along the orbit is equal to or greater than 10 minutes.
According to another aspect of the presently disclosed subject matter there is provided a computer-readable non-transitory memory device tangibly embodying a program of instructions executable by a machine for executing a method of electro-optic imaging of the Earth, the method comprising: operating a trailing satellite and at least one leading satellite, wherein the at least one leading satellite advances along an orbit around the Earth and the trailing satellite follows the leading satellite along the orbit at a predefined distance; wherein the at least one leading satellite is characterized by smaller dimensions and smaller mass compared to the trailing satellite; while the trailing satellite and leading satellite are in orbit: operating a second electro-optic image sensor (EOS), onboard the at least one leading satellite orbiting the Earth, for capturing images in the direction of the Earth; processing the captured images and identifying at least one cloud in the captured images; determining positioning data of the at least one cloud relative to a certain frame of reference; determining whether any of the one or more targets are bound to be obstructed by the at least one identified cloud, thereby identifying a group comprising one or more targets unobstructed by the clouds; generating an image acquisition scheme including instructions for controlling the first imaging subsystem onboard the trailing satellite for imaging targets in the group; operating a first EOS onboard the trailing satellite, while the trailing satellite is following the leading satellite along the orbit, for capturing images of one or more targets according to the instructions of the image acquisition scheme, thereby avoiding obstruction of the one or more targets by the identified clouds.
According to another aspect of the presently disclosed subject matter there is provided a method of using satellites for electro-optic imaging of the Earth, the method comprising: providing a trailing satellite comprising a first imaging subsystem with a first electro-optic image sensor (EOS) having a first optical aperture, and configured and operable to capture images of one or more targets located on Earth; providing a leading satellite comprising a second imaging subsystem with a second electro-optic image sensor (EOS) and a second optical aperture that is smaller than the first optical aperture; wherein the leading satellite is characterized by smaller dimensions and smaller mass compared to the trailing satellite; wherein the leading satellite and trailing satellite are configured to be launched into space such that the leading satellite advances along an orbit around the Earth and the trailing satellite follows the leading satellite along the orbit, at a predefined distance; configuring the second imaging subsystem to be capable of operating the second electro-optic image sensor for capturing images in the direction of the Earth; configuring a processing circuitry to be capable of executing a process, while the leading and trailing satellites are orbiting the Earth, the process comprising: processing the captured images and identifying at least one cloud in the captured images; determining positioning data of the at least one cloud relative to a certain frame of reference; determining whether any of the one or more targets are bound to be obstructed by the at least one identified cloud, thereby identifying a group comprising one or more targets unobstructed by the clouds; generating an image acquisition scheme including instructions for controlling the first imaging subsystem onboard the trailing satellite for imaging targets in the group; and configuring the first imaging subsystem to be capable of operating the first electro-optic image sensor, while the trailing satellite is following the leading satellite along the orbit, for capturing images of one or more targets according to the instructions of the image acquisition scheme, thereby avoiding obstruction of the one or more targets by the identified clouds.
According to another aspect of the presently disclosed subject matter there is provided a method of electro-optic imaging of the Earth, the method comprising: operating a trailing satellite and at least one leading satellite, wherein the at least one leading satellite advances along an orbit around the Earth and the trailing satellite follows the leading satellite along the orbit at a predefined distance; wherein the at least one leading satellite is characterized by smaller dimensions and smaller mass compared to the trailing satellite; while the trailing satellite and leading satellite are in orbit: operating a second electro-optic image sensor (EOS), onboard the at least one leading satellite orbiting the Earth, for capturing images in the direction of the Earth; operating a processing circuitry for: processing the captured images and identifying at least one cloud in the captured images; determining positioning data of the at least one cloud relative to a certain frame of reference; determining whether any of the one or more targets are bound to be obstructed by the at least one identified cloud, thereby identifying a group comprising one or more targets unobstructed by the clouds; generating an image acquisition scheme including instructions for controlling the first imaging subsystem onboard the trailing satellite for imaging targets in the group; operating a first EOS onboard the trailing satellite, while the trailing satellite is following the leading satellite along the orbit, for capturing images of one or more targets according to the instructions of the image acquisition scheme, thereby avoiding obstruction of the one or more targets by the identified clouds.
The presently disclosed subject matter further contemplates a leading satellite configured to operate in a satellite formation together with a trailing satellite as disclosed herein.
The presently disclosed subject matter further contemplates a trailing satellite configured to operate in a satellite formation together with at least one leading satellite as disclosed herein.
In addition, the methods, the computer-readable non-transitory memory device, the trailing satellite and the leading satellite may each comprise one or more of features (i) to (xxii) above, in any technically possible combination or permutation.
BRIEF DESCRIPTION OF THE DRAWINGS

In order to understand the invention and to see how it may be carried out in practice, a preferred embodiment will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:

Fig. 1 is a schematic illustration of a satellite formation orbiting the Earth, in accordance with examples of the presently disclosed subject matter;
Fig. 2a is a block diagram schematically illustrating a system onboard a leading satellite, in accordance with examples of the presently disclosed subject matter;
Fig. 2b is a block diagram schematically illustrating a system onboard a trailing satellite, in accordance with examples of the presently disclosed subject matter;
Fig. 3 is a flowchart illustrating operations performed in accordance with examples of the presently disclosed subject matter;
Fig. 4 is a schematic illustration of a leading satellite, in accordance with examples of the presently disclosed subject matter;
Fig. 5 is a schematic illustration of a trailing satellite, in accordance with examples of the presently disclosed subject matter;
Fig. 6 is another flowchart illustrating operations performed in accordance with examples of the presently disclosed subject matter; and
Fig. 7 is a schematic illustration of a satellite formation orbiting the Earth, in accordance with examples of the presently disclosed subject matter.
It should be noted that the drawings are schematic drawings which are not drawn to scale.
DETAILED DESCRIPTION

As used herein, the phrases "for example", "such as" and variants thereof, describing exemplary implementations of the present invention, are exemplary in nature and not limiting.
The system and satellites disclosed herein comprise at least one computerized device. The terms "computer", "computer device/system", "computerized device/system" or the like as disclosed herein should be broadly construed to include any kind of electronic device (or a collection of interconnected devices) with data processing circuitry, which includes at least one computer processor configured and operable to execute computer instructions. The computer instructions can be stored, for example, on a computer memory operatively connected to the device. Examples of such a device include: a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a laptop computer, a personal computer, a smartphone, etc.
As apparent from the following discussions, it is appreciated that throughout the specification, discussions utilizing terms such as "processing", "determining", "operating", "generating", "capturing" or the like include actions and/or processes of a computerized device that manipulate and/or transform data into other data, said data represented as physical quantities, such as electronic quantities, and/or said data representing physical objects.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination. While the invention has been shown and described with respect to particular embodiments, it is not thus limited. Numerous modifications, changes and improvements within the scope of the invention will now occur to the reader.
In embodiments of the invention, fewer, more and/or different stages than those shown in Figs. 3 and 6 may be executed. In embodiments of the invention, one or more stages illustrated in Figs. 3 and 6 may be executed in a different order and/or one or more groups of stages may be executed simultaneously. Figs. 2a and 2b illustrate schematics of a system architecture in accordance with embodiments of the invention. Components in Figs. 2a and 2b can be made up of any combination of software, hardware and/or firmware that performs the functions as defined and explained herein. Components in Figs. 2a and 2b may be centralized in one location or dispersed over more than one location. According to some examples of the presently disclosed subject matter, the system may comprise fewer, more and/or different modules than those shown in Figs. 2a and 2b.
Furthermore, for the sake of clarity, the terms "substantially" or "approximately" may be used herein to imply the possibility of variations from a prescribed value or state (examples of states including: "perpendicular", "same", "identical", "parallel", etc.) within an acceptable range. According to one example, the term "substantially" used herein should be interpreted to imply possible variations of up to 15% over or under any specified value. According to another example, the term "substantially" used herein should be interpreted to imply possible variation of up to 10% over or under any specified value. According to yet another example, the term "substantially" used herein should be interpreted to imply possible variation of up to 5% over or under any specified value. According to a further example, the term "substantially" used herein should be interpreted to imply possible variation of up to 2.5% over or under any specified value.
Bearing the above in mind, attention is now drawn to Fig. 1 schematically showing satellites in flight formation according to examples of the presently disclosed subject matter. As shown, the flight formation includes a first satellite 110 (referred to herein as a "leading satellite") advancing along an orbit around the Earth, and a second satellite 120 (referred to herein as a "trailing satellite") that follows the leading satellite 110 along a common orbit.
As would be apparent to any person skilled in the art, while the leading and trailing satellites advance on the common orbit, some variations between the orbit of the leading and trailing satellite may exist. These variations are maintained within certain boundaries in order to enable proper operation of the satellite system. As known in the art, deviation of a satellite from orbit can be fixed by operating thrusters. Thus, while it is described herein that one or more leading satellites and the trailing satellite all advance on a common orbit, it should be understood that this means that they all advance on a (substantially) common orbit within an acceptable variation.
The distance between the leading and trailing satellite is substantially fixed, such that the trailing satellite 120 reaches a previous position of the leading satellite 110 (in an inertial frame) after a certain time lag (dT). The satellite flight formation disclosed herein can be implemented according to the principles of "spacecraft formation flying", which are well known in the art and include utilizing various methods (e.g. thruster activation) for maintaining the relative orbit. In different implementations of the presently disclosed subject matter, the time lag between the leading and trailing satellite may vary. In some examples, the time lag between the leading satellite and trailing satellite along the orbit is equal to or greater than 2 minutes. In some examples, the time lag between the leading satellite and trailing satellite along the orbit is equal to or greater than 3 minutes. In some examples, the time lag between the leading satellite and trailing satellite along the orbit is equal to or greater than 5 minutes. In some examples, the time lag between the leading satellite and trailing satellite along the orbit is equal to or greater than 7 minutes. In some examples, the time lag between the leading satellite and trailing satellite along the orbit is equal to or greater than 10 minutes.
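For orientation, the along-track separation corresponding to a given time lag can be approximated from the circular-orbit speed. The sketch below is illustrative only; the 500 km altitude and the listed time lags are taken from the examples in this description, while the circular-orbit assumption is a simplification.

```python
import math

MU_EARTH = 3.986004418e14    # Earth's gravitational parameter [m^3/s^2]
R_EARTH = 6_371_000.0        # mean Earth radius [m]

def along_track_separation(altitude_m: float, time_lag_s: float) -> float:
    """Approximate along-track distance between two satellites on the same
    circular orbit that are separated by a given time lag."""
    r = R_EARTH + altitude_m
    v = math.sqrt(MU_EARTH / r)      # circular orbital speed [m/s]
    return v * time_lag_s            # arc length covered during the lag

# Illustrative values for a 500 km orbit and the time lags listed above.
for dt_min in (2, 3, 5, 7, 10):
    d_km = along_track_separation(500e3, dt_min * 60) / 1000.0
    print(f"dT = {dt_min:2d} min  ->  ~{d_km:.0f} km along-track separation")
```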
According to some examples, the trailing satellite 120 is a Low Earth Orbit (LEO) satellite comprising an imaging system having a high resolution image sensor configured to capture high resolution images of the Earth.
The leading satellite 110 is characterized by a considerably smaller mass as compared to the trailing satellite. For example, the leading satellite can be a microsatellite or a nanosatellite. Microsatellites are artificial satellites of low mass and size, usually under 100 kg. Nanosatellite is a term applied to a sub-category of satellites commonly having a mass of less than 20 kg, or, according to other more restrictive approaches, 10 kg or less. According to some examples, the leading satellite is characterized by a 6U CubeSat form factor, where each U represents a unit of 10×10×11.35 cm. Reducing satellite size and mass can provide various advantages, including reduced costs, achieved due to less costly design and manufacturing, less costly launching owing to the use of smaller launching rockets, and the ability to launch several satellites simultaneously. A microsatellite or a nanosatellite may be launched and operated together in a group of multiple satellites, sometimes providing equal or even better performance at a lower cost as compared to that of a single larger satellite. In particular, here, since the size/mass and volume of the leading satellite are significantly smaller than the size/mass and volume of the trailing satellite, the leading and trailing satellites can be launched within the same launcher, at a cost similar to that of launching the trailing satellite alone. This reduces the mission cost and, in addition, in this manner, separation of the satellites from the launcher deploys them to substantially the same orbit, as desired.
According to examples of the presently disclosed subject matter, the leading satellite 110 comprises an EOS having a resolution lower than that of the camera onboard the trailing satellite. In general, the EOS onboard the leading satellite is characterized by a resolution which is sufficient to detect clouds and determine whether they obstruct specific areas from being imaged by the trailing satellite. In some examples, the low resolution of the EOS onboard the leading satellite 110 is insufficient for detecting detailed information of the Earth.
Resolution can be characterized by a "ground sample distance" (GSD) value measured for a certain satellite altitude and a certain camera look angle. According to some non-limiting examples, the high resolution imaging system onboard the trailing satellite is characterized by a ground sample distance (GSD) at a given camera look angle (e.g. nadir, i.e. where the camera is pointing towards the Earth's center) between 0.meters and 20 meters, and the low resolution imaging system onboard the leading satellite is characterized by a GSD at nadir between 0.5 kilometers and 3 kilometers.
According to other examples, high resolution and low resolution are defined one with respect to the other, where, given a certain GSD at nadir of the leading satellite, the trailing satellite has a GSD at nadir which is X times smaller. According to one example, X is between 5 and 10; according to other examples, X is between 50 and 100; according to a further example, X is between 500 and 1000; and according to yet another example, X is greater than 1000.
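To make the GSD relation concrete, a pinhole-camera approximation gives GSD at nadir as altitude × pixel pitch / focal length. The sketch below is only illustrative; the pixel pitch and focal lengths are assumed values chosen to reproduce a ratio X of roughly 1000, and are not taken from the patent.

```python
def gsd_at_nadir(altitude_m: float, pixel_pitch_m: float, focal_length_m: float) -> float:
    """Ground sample distance at nadir under a simple pinhole-camera model:
    one detector pixel maps to (altitude * pixel_pitch / focal_length) on the ground."""
    return altitude_m * pixel_pitch_m / focal_length_m

# Assumed, illustrative parameters (not taken from the patent):
altitude = 500e3                                   # 500 km LEO
gsd_trailing = gsd_at_nadir(altitude, 7e-6, 5.0)   # large telescope -> ~0.7 m GSD
gsd_leading = gsd_at_nadir(altitude, 7e-6, 0.005)  # miniature optics -> ~700 m GSD
print(f"trailing: {gsd_trailing:.2f} m, leading: {gsd_leading:.0f} m, "
      f"X = {gsd_leading / gsd_trailing:.0f}")
```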
In addition, the difference between imaging subsystem 240a onboard the leading satellite, and imaging system 240b onboard the trailing satellite, can be also defined as a difference in the camera optical aperture installed in each subsystem, as further explained below.
Turning to the description of the satellites, generally, a satellite comprises a number of subsystems, including for example, a structural subsystem, a communication (telemetry) subsystem, a power subsystem, a thermal control subsystem, an attitude and orbit control subsystem, and an imaging subsystem. Notably, not all subsystems are shown in the figures.
The structural subsystem provides the mechanical infrastructure for holding together all the satellite functional subsystems, and provides the required durability to withstand the mechanical stress and extreme temperatures during launch and while in orbit.
The communication (telemetry) subsystem is configured to enable the satellite to transmit (downlink) and receive (uplink) data, including imaging data and equipment operation data.
The power subsystem can comprise solar panels and batteries, the panels being configured to absorb and convert solar energy into electrical power and charge the batteries onboard the satellite, configured in turn to store electric power and supply the power to the satellite subsystems.
The thermal control subsystem is configured to regulate the internal temperature of the satellite subsystems and protect the onboard equipment from the extreme temperatures to which the satellite is exposed.
The attitude and orbit control subsystem is a computerized system that comprises sensors configured to determine the position and attitude of the satellite, actuators configured to modify the orbit and attitude of the satellite, and a flight control system (including a computer) configured to obtain and maintain the desired attitude of the satellite, keep the satellite in the desired orbital position, and maintain the antennas/cameras directed to the desired position. The attitude and orbit control subsystem can comprise, or be otherwise operatively connected to, various positioning devices dedicated to determining position data of the satellite, including for example a GPS receiver and a star tracker. The attitude and orbit control subsystem can further comprise, or be otherwise operatively connected to, various actuators, such as reaction wheels and thrusters, which are controlled for example by the flight control system and enable adjusting and controlling the attitude and orbital position of the satellite.

The imaging subsystem comprises various imaging devices and is configured to capture images, e.g. of the Earth.
Fig. 2a schematically illustrates some components of the leading satellite 110, including a mechanical subsystem (structure) 200 that comprises the various components mentioned above. Specifically, Fig. 2a illustrates a communication subsystem 210, an attitude and orbit control subsystem 220 (including one or more positioning devices) and an imaging subsystem 240a comprising at least one EOS. The imaging subsystem 240a further comprises, or is otherwise operatively connected to, processing circuitry 230a. EOS refers to the entire imaging assembly, including, for example, lenses, mirrors, sensors, telescope, aperture, etc.
As will be further detailed with reference to the figures below, processing circuitry 230a can be configured to execute several functional modules in accordance with computer-readable instructions implemented on a non-transitory computer-readable storage medium. Such functional modules are referred to hereinafter as comprised in the processing circuitry.
Fig. 2a also shows ground control station 260 located remotely from the satellite and configured in general to monitor and control operation of the satellite. Ground control station 260 is configured to communicate with the satellite e.g. for receiving telemetry data and imaging output from the satellite and transmitting instructions and other data to the satellite. The ground control station can be located at some location on Earth.
Fig. 2b schematically illustrates some components of the trailing satellite 120. As mentioned above, the trailing satellite is different from the leading satellite, as the former is a larger and more costly satellite, carrying a high quality camera that is capable of capturing high resolution images of the Earth and has a larger optical aperture as compared to the optical aperture of the EOS onboard the leading satellite (notably, the satellite's size and mass are sensitive to the magnitude of the optical aperture, and typically increase as the optical aperture increases). Nonetheless, many of the basic components in the two satellites are the same, although there may be differences in quality, size and performance between equivalent components in the two satellites. For the sake of simplicity, the same reference numerals were used for identifying some of the equivalent components in both the leading and trailing satellites, while other components were given different reference numerals for each satellite.
Specifically, Fig. 2b includes an imaging system 240b that comprises a high resolution EOS, and a processing circuitry 230b configured to execute processing related to the tasks performed by the trailing satellite, which are at least partially different from the tasks performed by the leading satellite and executed by processing circuitry 230a. Notably, part or all of the components 231 to 234 of the processing circuitry 230b can alternatively be located at the leading satellite, in which case the respective processing is performed at the leading satellite and the processing output is transmitted to the trailing satellite. Likewise, in other examples, components 231 to 234 of the processing circuitry 230b can be located at the GCS 260, in which case image data is transmitted to GCS 260, the respective processing is performed at GCS 260, and the processing output is transmitted from GCS 260 to the trailing satellite.
Fig. 3 is a flowchart showing operations carried out according to some examples of the presently disclosed subject matter.
As explained above, a leading satellite and a trailing satellite are launched such that both satellites revolve about the Earth along a common orbit, while a distance is maintained between the leading satellite and the trailing satellite, such that the trailing satellite reaches a previous location of the leading satellite after a certain time period dT (in an inertial frame of reference).
During flight, a low resolution camera 240a onboard the leading satellite is operated for capturing images of regions on Earth where targets are located or assumed to be located (block 301). Images captured by the leading satellite, along with respective metadata, can be stored onboard the leading satellite, e.g. in computer data-repository 250a. Metadata (referred to below also as "image acquisition metadata") includes, for example, the position of the leading satellite at the time an image is captured (respective image acquisition position) and the time of capturing (respective image acquisition time).
The captured images are processed for the purpose of detecting clouds (block 303). Cloud detection in captured images can be done by various methods of image processing. For example, machine learning models (including deep learning models) dedicated to detecting cloud features, and/or spectral signatures that identify clouds in images obtained by a multispectral or a hyperspectral sensor onboard the leading satellite, can be applied for this purpose. According to one example, processing of the captured images is performed by processing circuitry 230a onboard the leading satellite (e.g. by cloud detection module 231).
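As a toy illustration of the detection step (not the ML-based detectors mentioned above, which the patent leaves unspecified), clouds in a normalized optical image tend to be bright and spectrally flat, so even a simple per-pixel threshold produces a rough cloud mask. The function name and thresholds below are assumptions for illustration only.

```python
import numpy as np

def simple_cloud_mask(image: np.ndarray,
                      brightness_thr: float = 0.6,
                      whiteness_thr: float = 0.1) -> np.ndarray:
    """Toy per-pixel cloud detector for a normalized multi-band image in [0, 1]:
    clouds tend to be bright and spectrally flat ("white"). Returns a boolean
    mask that is True where a pixel is classified as cloud."""
    brightness = image.mean(axis=-1)
    flatness = image.max(axis=-1) - image.min(axis=-1)   # small for white/gray pixels
    return (brightness > brightness_thr) & (flatness < whiteness_thr)

# Example on a synthetic 4x4 RGB frame with a bright, flat "cloud" patch.
frame = np.random.default_rng(0).random((4, 4, 3)) * 0.5
frame[:2, :2] = 0.9
print(simple_cloud_mask(frame))
```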
According to other examples, the captured images are transmitted to the trailing satellite 120 or to a ground control station 260 (e.g. using communication subsystem 210), and the processing of the images is performed by processing circuitry 230b onboard the trailing satellite or by the ground control station 260, which is configured to identify the cloud mask.
Once clouds are identified, their positioning data is determined. According to one example, the positioning data includes a 2-dimensional cloud map (referred to herein also as a "cloud mask"), created e.g. by computing the projection of the clouds observed in the captured images onto a 2-dimensional surface at an average cloud altitude (of a few kilometers) above the Earth's surface. As discussed below, the lack of altitude information in the cloud mask may generate difficulties in computing cloud obstruction for the trailing satellite.
According to some examples, once a cloud mask (over a certain area) is determined, the processing output is transmitted to the trailing satellite (e.g. using communication subsystem 210 onboard the leading satellite).
At block 305, unobstructed targets are identified based on the position of the detected clouds and on target data. The term "target data" refers to information indicating the position of regions of interest on Earth which represent targets desired to be captured by the high resolution camera 240b onboard the trailing satellite during a respective acquisition period. Target data can be retrieved from a data-storage device 250 onboard the trailing satellite (and/or the leading satellite) and/or received from some other source, e.g. a ground control station 260. The position of the targets is compared to the position of the clouds; targets located in regions covered by clouds that obstruct the line of sight (LOS) extending from the high resolution EOS onboard the trailing satellite to the targets are marked as not available and are removed from the pool of targets destined to be captured during the respective acquisition period.
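A minimal sketch of this screening step is given below, under simplifying assumptions that are not stated in the patent: a flat-Earth local frame, a single assumed cloud altitude (the cloud-mask case, before altitude is resolved), and a boolean cloud-mask grid. All names and parameters are illustrative.

```python
import numpy as np

def unobstructed_targets(targets_xy: np.ndarray,
                         sat_xy: np.ndarray,
                         sat_alt_m: float,
                         cloud_mask: np.ndarray,
                         cloud_alt_m: float,
                         cell_size_m: float) -> np.ndarray:
    """For each ground target, find where the line of sight to the (future)
    trailing-satellite position crosses an assumed cloud layer, and keep the
    target only if that crossing cell is cloud-free. `cloud_mask` is a boolean
    grid (True = cloud) in the same local x/y frame, with origin at (0, 0)."""
    frac = cloud_alt_m / sat_alt_m           # LOS interpolation factor at cloud altitude
    crossing = targets_xy + frac * (sat_xy[None, :] - targets_xy)
    cols = (crossing[:, 0] // cell_size_m).astype(int)
    rows = (crossing[:, 1] // cell_size_m).astype(int)
    inside = (rows >= 0) & (rows < cloud_mask.shape[0]) & \
             (cols >= 0) & (cols < cloud_mask.shape[1])
    clear = np.ones(len(targets_xy), dtype=bool)
    clear[inside] = ~cloud_mask[rows[inside], cols[inside]]
    return clear                              # True = keep target in the acquisition pool

# Illustrative use: a 10x10 km mask with 1 km cells, one cloudy cell, two targets.
mask = np.zeros((10, 10), dtype=bool)
mask[5, 5] = True                             # cloud over the cell around (5.5 km, 5.5 km)
targets = np.array([[5500.0, 5500.0], [2000.0, 2000.0]])
print(unobstructed_targets(targets, sat_xy=np.array([6000.0, 6000.0]),
                           sat_alt_m=500e3, cloud_mask=mask,
                           cloud_alt_m=5e3, cell_size_m=1000.0))
```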
According to further examples, the term "acquisition period" is used to refer to the period of time during which a specific image acquisition scheme can be executed, i.e. targets pertaining to the specific image acquisition scheme can be captured from the orbiting satellite. During the acquisition period, the satellite is located within an area (the "acquisition area") from which the specific image acquisition scheme can be executed. Thus, each acquisition scheme has a respective acquisition period. In some examples the acquisition period is between t+dT-x and t+dT+x, where t is the time of acquisition by the leading satellite, dT is the time difference between the leading satellite and trailing satellite, and x is a time period of a few minutes.
At block 307, an image acquisition scheme is generated (e.g. with the help of scheme generation module 234). The image acquisition scheme is generated in real-time with the help of a target acquisition scheduling algorithm, and includes data identifying the targets unobstructed by clouds that should be captured, as well as the order in which these targets are to be captured. In some examples, the scheme is translated into specific instructions to be executed by the imaging subsystem (specifically the high resolution EOS) and, in some cases, the attitude and orbit control subsystem 220, during the respective acquisition period.
According to some examples, processing circuitry 230b onboard the trailing satellite is configured to generate the image acquisition scheme, once the trailing satellite obtains the information identifying the unobstructed targets (either from the leading satellite, or by processing the images received from the leading satellite). According to other examples, a processing circuitry onboard the leading satellite or GCS 260 generates the image acquisition scheme and transmits it to the trailing satellite.
According to some examples, the image acquisition scheme is generated while striving to make it as efficient as possible by maximizing the number of image acquisitions (i.e. the number of images captured during execution of an image acquisition scheme). In order to compute an efficient image acquisition scheme, the position of the non-obstructed targets, relative to the orbit and relative to one another, is considered, along with the agility of the satellite. As known in the art, the agility of a satellite refers to the performance of its attitude control system. Given the relative positions of a collection of unobstructed targets, the agility of the satellite has an impact on the number of image acquisitions the satellite is able to execute in a given time period. This means that, given a collection of desired targets unobstructed by clouds, their relative positions and the agility constraints of the satellite are considered in order to select a subset of targets (possibly including all unobstructed targets) representing the maximum number of targets that can be captured during the acquisition period of the image acquisition scheme. During selection of the subset, the order of capturing the different targets is also considered, for the purpose of determining a capturing order that maximizes the number of image acquisitions.
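The patent does not specify the scheduling algorithm; the sketch below only illustrates how an agility constraint (slew rate and settling time) prunes a time-ordered pool of cloud-free targets using a greedy pass. Real schedulers optimize over orders and acquisition positions; all names and numbers here are assumed.

```python
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    t_acq: float           # nominal acquisition time along the pass [s]
    off_nadir_deg: float   # required camera off-nadir angle [deg]

def greedy_schedule(targets: list[Target], slew_rate_deg_s: float, settle_s: float) -> list[Target]:
    """Walk through the cloud-free targets in time order and keep a target only
    if the satellite can slew from the previously scheduled attitude and settle
    before the target's acquisition time."""
    scheduled: list[Target] = []
    for tgt in sorted(targets, key=lambda t: t.t_acq):
        if not scheduled:
            scheduled.append(tgt)
            continue
        prev = scheduled[-1]
        slew_time = abs(tgt.off_nadir_deg - prev.off_nadir_deg) / slew_rate_deg_s + settle_s
        if tgt.t_acq - prev.t_acq >= slew_time:
            scheduled.append(tgt)
    return scheduled

# Illustrative use with assumed agility figures (1 deg/s slew, 5 s settle time).
pool = [Target("A", 10, -20), Target("B", 18, 25), Target("C", 60, 0), Target("D", 63, 30)]
print([t.name for t in greedy_schedule(pool, slew_rate_deg_s=1.0, settle_s=5.0)])  # ['A', 'C']
```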
In general, generation of an efficient image acquisition scheme is computationally time consuming, even in the absence of clouds, since there are typically numerous positions along the satellite orbit from which the imaging subsystem can capture any one of the unobstructed targets, and many different orders of capturing the different targets; yet, out of the many options of combining these individual target acquisitions to form a full image acquisition scheme, only those possibilities that comply with satellite agility constraints (and possibly other constraints, such as lighting constraints) are allowed. This computation may require significant time, which entails a sufficiently large separation between the leading and trailing satellites to provide sufficient time from the time images are captured by the leading satellite to the time of execution of the image acquisition scheme by the trailing satellite.
The trailing satellite follows the leading satellite, and once it reaches an area from which the respective image acquisition scheme can be executed, it executes the scheme and operates the high resolution camera 240b for capturing high resolution images of non-obstructed targets as prescribed by the image acquisition scheme (block 309 ).
Due to rotation of the Earth, the targets and clouds continuously move, relative to the orbit of the satellites. Thus, the position of the clouds, detected by processing of images captured by the leading satellite, is bound to change during the time lag extending between a time when images of certain targets on Earth are captured by EOS 240a onboard the leading satellite 110 , and a time when the targets are being captured by EOS 240b onboard the trailing satellite 120 according to a respective acquisition scheme. Accordingly, in order to provide an accurate image acquisition scheme which reflects target obstruction by clouds at the time the respective acquisition period commences, the position of the clouds at that time should be estimated.
According to some examples of the presently disclosed subject matter, a processing circuitry (e.g. processing circuitry 230b onboard the trailing satellite or processing circuitry 230a onboard the leading satellite) is configured to calculate an estimated future position of the clouds at various possible times (e.g. all possible times) during the respective acquisition period, and to generate an efficient image acquisition scheme while taking into consideration the estimated future position of the clouds.
In order to provide a good estimation of future obstruction of targets by clouds, positioning data that includes the altitude of the clouds is needed. Firstly, if the trailing satellite attempts to acquire an image of a target from an orbital acquisition position different than the orbital acquisition position from which the leading satellite imaged the same target, then, if there is cloud altitude ambiguity, it is not possible to determine with sufficient certainty whether the clouds obscure the target from view for the trailing satellite. Secondly, as the following example shows, even if the trailing satellite and the leading satellite acquire an image of a target from the exact same orbital position (separated by the time lag dT), the cloud altitude ambiguity may still prevent determination of cloud obstruction.
Consider, for example, the scenario depicted in Fig. 4, schematically illustrating the leading satellite 110 at time t, with its velocity vector perpendicular to the figure plane (i.e. the orbit going into the drawing's plane), and the onboard EOS 240a capturing images in the direction of target T while its LOS is obstructed by a cloud. Fig. 4 further shows the same cloud at two possible altitudes, C1 and C2, among the range of cloud altitudes that would appear as the same obstructing cloud in an image captured from the leading satellite at time t, where C1 is a cloud at low altitude (near ground) and C2 is a cloud at a higher altitude.
As further demonstrated in Fig. 5 , according to the specific case exemplified in the figure, at time t+dT the trailing satellite 120 is located at the same position (in an inertial coordinate frame) as the leading satellite had been at time t. Assuming that at time t+dT an image acquisition scheme is being executed by the trailing satellite, by this time, due to the Earth’s rotation, the target T and cloud have rotated eastward. Notably, according to this example, the position of the cloud and target, relative to each other, is assumed to remain fixed. As shown in Fig. 5 , at time t+dT, if the obstructing cloud is C1, having a ground level altitude, it may continue to obstruct the LOS between EOS 240b and target T. If, however, the obstructing cloud is C2, at a higher altitude, it may not obstruct the LOS between EOS 240b and target T.
Consider the following quantitative example. Suppose that the orbit altitude is 500 km, and the target is at ground level, at zero latitude, located at time t at a distance D = 278 km to the west (azimuth 270°) from the sub-satellite point P of the leading satellite 110, and obstructed by the center of a cloud when imaged by the leading satellite. It is desired to determine, during generation of the image acquisition scheme, whether the target would still be obstructed from view when viewed by the trailing satellite during execution of the image acquisition scheme. However, in the case where, for example, dT = 10 minutes, after this time, due to the Earth's rotation, the target will be at the sub-satellite point P below the trailing satellite. Since the altitude of the cloud is not known, then, due to cloud altitude ambiguity (clouds are typically between 0 and 12 km above ground), a ground level cloud would still obstruct the target from view, while a cloud at an altitude of 12 km at time t would be shifted about 6.7 km to the east of the target at time t+dT and would not obstruct the target from being viewed by the trailing satellite (this may also depend on the horizontal size of the cloud).
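A short numeric check of this example, using standard Earth constants (equatorial radius and sidereal rotation rate) and simple similar-triangle geometry; the script is illustrative and is not part of the patent:

```python
import math

R_EQUATOR_KM = 6378.0
OMEGA_EARTH = 2 * math.pi / 86164.0    # sidereal rotation rate [rad/s]

D_KM = 278.0       # target's initial distance west of the sub-satellite point
H_SAT_KM = 500.0   # orbit altitude
H_CLOUD_KM = 12.0  # upper end of typical cloud altitudes

# Time for the Earth's rotation to carry the target to the sub-satellite point.
ground_speed = OMEGA_EARTH * R_EQUATOR_KM                # ~0.465 km/s at the equator
print(f"dT ~ {D_KM / ground_speed / 60:.1f} minutes")    # ~10 minutes

# Where the leading satellite's LOS to the target crossed a 12 km cloud layer:
# by similar triangles the crossing lies east of the target by D * h_cloud / h_sat.
print(f"offset ~ {D_KM * H_CLOUD_KM / H_SAT_KM:.1f} km east of the target")  # ~6.7 km
```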
Thus, as demonstrated above, the two dimensional information (the cloud mask) extracted from an image captured by the camera onboard the leading satellite is insufficient for uniquely determining whether or not a desired target is likely to be obstructed, during execution of the image acquisition scheme by the trailing satellite, by a cloud detected in an image captured by the leading satellite. This difficulty persists even in cases where the trailing satellite attempts to acquire an image of a target from the exact same orbital position from which the leading satellite captured an image of the same target (with the respective image acquisition times of the two satellites separated by a time lag of dT).
Thus, according to examples of the presently disclosed subject matter, an image acquisition scheme is calculated while taking into consideration positioning data that includes the altitude of detected clouds.
Fig. 6 is a flowchart showing additional operations carried out according to some examples of the presently disclosed subject matter. Fig. 6 includes some operations described above with reference to Fig. 3 , which are not described again in detail.
Following detection of clouds in images captured by a low resolution camera onboard the leading satellite (blocks 301-303), the position of the boundary surface of the detected clouds is calculated (block 601). The term "cloud boundary surface" (or CBS in short) is used herein to refer to the three dimensional position of the two dimensional surfaces of the clouds. A cloud has volume and, accordingly, an internal height, extending from a minimal altitude at the bottom of the cloud to a maximal altitude at the top of the cloud. The CBS position provides an estimation of the position of various parts of the cloud's surface, as viewed from space, including information extracted from the images with respect to the top and perimeter of the clouds. The "CBS position" pertains to a three dimensional position, including altitude, of the cloud relative to some known frame of reference (e.g. Earth-Centered, Earth-Fixed, abbreviated ECEF).
CBS position can be determined using a stereo-photogrammetry method. This method uses multiple images of the same cloud acquired from the leading satellite 110 taken at different satellite orbital positions. Alternatively, in case more than one leading satellite is used, CBS position can be determined using simultaneous (or substantially simultaneous) images taken from two or more leading satellites separated along their orbit. In addition, this method uses data of the satellite positions and satellite attitudes (e.g. available from positioning devices onboard the satellites), at the time of acquisition of each of the multiple images. According to some examples, processing circuitry 230b includes CBS module 232 configured to determine CBS position as well as future CBS position.
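As a simplified illustration of the stereo-photogrammetry principle (flat Earth, near-nadir viewing, and an assumed known baseline between the two camera positions), the altitude of a cloud can be recovered from the apparent displacement of its ground projection between the two views. The function and numbers below are assumptions for illustration:

```python
def cloud_altitude_from_parallax(baseline_m: float,
                                 sat_alt_m: float,
                                 ground_disparity_m: float) -> float:
    """Flat-Earth, near-nadir stereo relation: the same cloud, ortho-projected to
    the ground from two camera positions separated by `baseline_m`, appears
    displaced by `ground_disparity_m`. By similar triangles d = B*h/(H - h),
    hence h = d*H / (B + d)."""
    return ground_disparity_m * sat_alt_m / (baseline_m + ground_disparity_m)

# Assumed numbers: viewpoints 100 km apart at 500 km altitude, 2 km disparity.
print(f"{cloud_altitude_from_parallax(100e3, 500e3, 2e3) / 1e3:.1f} km")   # ~9.8 km
```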
At block 603, the future CBS position during the acquisition period of a certain image acquisition scheme is estimated. Calculation of the estimated future CBS position can be done based on the current CBS position and the Earth's angular velocity.
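Under the assumption used above, that the cloud co-rotates with the Earth, propagating the CBS position forward amounts to rotating its position vector about the Earth's spin axis by the rotation angle accumulated over the time lag. The following sketch expresses this in an inertial frame; names and frames are illustrative assumptions:

```python
import numpy as np

OMEGA_EARTH = 2 * np.pi / 86164.0    # Earth rotation rate [rad/s]

def propagate_cloud_position(cloud_pos_inertial_m: np.ndarray, dt_s: float) -> np.ndarray:
    """Assuming the cloud co-rotates with the Earth, its inertial-frame position
    dt_s seconds after it was imaged is obtained by rotating the position it had
    at the imaging time about the Earth's spin axis (z) by OMEGA_EARTH * dt_s."""
    theta = OMEGA_EARTH * dt_s
    c, s = np.cos(theta), np.sin(theta)
    rot_z = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
    return rot_z @ cloud_pos_inertial_m

# A cloud over the equator at 10 km altitude, propagated by a 10-minute time lag.
cloud = np.array([6378e3 + 10e3, 0.0, 0.0])
print(propagate_cloud_position(cloud, 600.0) / 1e3)   # position [km] after 10 min
```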
At block 605 an image acquisition scheme is calculated based on the known target location and the estimated future CBS position of the clouds at future times (e.g. all times) during the acquisition period (i.e. when the trailing satellite will be at the respective area from which it will have a LOS to a target and will be able to image the target according to various predefined requirements, including, for example, minimal resolution and minimal solar illumination).
When the trailing satellite reaches the area of acquisition and the acquisition period commences, the imaging system is operated for capturing images of targets according to the image acquisition scheme (block 309 ).
As explained above, the ambiguity of the clouds' altitude reduces the ability to determine cloud obstruction of targets. This problem intensifies as the distance between the leading satellite and the trailing satellite increases (i.e. the larger the dT value becomes) and may therefore limit the value of dT. A shorter dT, however, may give rise to target-acquisition optimization difficulties. As mentioned above, the efficiency of a satellite image acquisition depends on the total number of images acquired by the satellite during a certain fixed time; the optimization of this total number is done by a scheduling algorithm (executed for example by module 234) that generates the image acquisition scheme, which in turn prescribes maneuvers as well as an image acquisition sequence to be executed by the trailing satellite. In order to generate the image acquisition scheme, this algorithm combines the information on possible obstructions by clouds with the desired target positioning data and the agility constraints of the trailing satellite. When the clouds' altitude is unknown, dT is limited in order to avoid a significant shift in cloud positions. By using CBS information, a larger value of dT is enabled, providing a longer time for implementing the scheduling algorithm and thereby improving the efficiency and accuracy of its output. Furthermore, knowing the CBS position also allows imaging of targets by the trailing satellite 120 from a wider range of positions, without being restricted to the exact position of the leading satellite at the time images of the targets were acquired (the exact same acquisition position of the leading satellite), thus possibly increasing the opportunities during which the target can be imaged.
According to further examples of the presently disclosed subject matter, instead of using a formation of satellites that includes only one leading satellite, the formation may include two or more leading satellites, e.g. a plurality of nanosatellites. All leading satellites travel along a common orbit (substantially the same), with time lags dT1, dT2, etc. separating each pair of consecutive satellites (where the time lags can be equal or different, and can be selected for improving the stereo-photogrammetry output).
Using more than one leading satellite enables calculation of a more accurate CBS position, based on stereo images of clouds captured simultaneously by two or more cameras, each camera operated from a different leading satellite.
Furthermore, according to some examples of the presently disclosed subject matter, it is suggested to use the imaging output from multiple satellites in order to calculate cloud velocity. While in the description above clouds were assumed to rotate together with the Earth and their velocity was approximated accordingly, this may not always be true, and may lead to inaccuracies in the predicted motion of clouds and in the future CBS position of detected clouds, as described above.
Thus, it is suggested to implement a cross correlation algorithm (e.g. by cross correlation module 233 ) based on a sequence of stereo-images captured by cameras onboard multiple leading satellites. By identifying the pattern of clouds in multiple satellite images (e.g. by implementing an appropriate machine learning algorithm), the clouds’ apparent horizontal velocity can be calculated (similar to an optical flow algorithm). By combining this information together with the measured satellite velocities (e.g. obtained from a GPS sensor), the actual cloud horizontal velocity relative to the Earth, can be determined. By doing so, various atmospheric effects that influence the clouds' motion can be accounted for, thus providing a more accurate future CBS position, and, as a result, more accurate prediction of obstruction of targets by clouds during trailing satellite imaging. Notably, obtaining a more accurate cloud velocity helps in increasing the time lag dT between the trailing satellite and the last satellite in a group of leading satellites. As explained above, this provides more time to calculate the image acquisition scheme, and accordingly improves its accuracy and efficiency.
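A crude stand-in for this cross-correlation step is sketched below: it estimates the integer-pixel shift of the cloud pattern between two co-registered frames using FFT-based cross correlation, and converts it to an apparent velocity given an assumed ground sample distance and frame interval. Subtracting the Earth-rotation and parallax contributions, as the text describes, is omitted; all names and values are illustrative:

```python
import numpy as np

def apparent_cloud_shift_px(frame_a: np.ndarray, frame_b: np.ndarray) -> tuple[int, int]:
    """Estimate the integer-pixel shift of frame_b relative to frame_a using
    FFT-based cross correlation of the mean-subtracted frames."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:     # wrap large shifts back to negative values
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

# Illustrative use: two frames dt_s apart, gsd_m meters per pixel.
rng = np.random.default_rng(0)
f1 = rng.random((128, 128))
f2 = np.roll(f1, shift=(3, -5), axis=(0, 1))      # synthetic cloud motion
dy, dx = apparent_cloud_shift_px(f1, f2)
gsd_m, dt_s = 1000.0, 30.0                        # assumed values
print(dy, dx, "->", (dx * gsd_m / dt_s, dy * gsd_m / dt_s), "m/s apparent velocity")
```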
According to an example, images captured by the plurality of leading satellites are transmitted to one satellite, where the collective processing of images for determining CBS position takes place. According to another example, all images are transmitted to the trailing satellite or to a ground station where the processing is executed. According to yet another example, processing tasks are distributed to different satellites which can process the information substantially in parallel, thereby improving processing efficiency.
In addition to determination of the CBS position, the presently disclosed subject matter further contemplates using a multispectral or a hyperspectral sensor onboard the leading satellite for calculating an estimated optical thickness of the clouds (clouds' optical thickness; COT) at the spectral band of the EOS onboard the trailing satellite. Clouds may have various degrees of transparency in the spectrum detected by the EOS operated from the trailing satellite, such that in some cases the EOS may be able to detect targets covered by semi-transparent clouds. According to one example, cloud imaging performed by the leading satellite is done using a multispectral or hyperspectral sensor. During the processing of the images captured by the leading satellite, as part of the generation of the image acquisition scheme, the output of the multispectral or hyperspectral image sensor is processed in order to determine a spectral response of the clouds detected in the images. A COT threshold is then applied to the detected spectral response to determine whether the cloud's opacity would obstruct the target from being viewed in images generated by the EOS onboard the trailing satellite; this also depends on the specific features of the EOS. In case it is determined that the clouds are sufficiently transparent (e.g. if the values obtained from the multispectral or hyperspectral sensor are within a certain predefined range), they are not considered obstructive and can be ignored during generation of the image acquisition scheme. Thus, it is suggested to use multispectral or hyperspectral output and a respective processing circuitry as a screening tool in order to identify those clouds which actually obstruct the targets from being captured by the cameras onboard the trailing satellite.
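The following sketch illustrates one possible form of such a screening rule, thresholding a per-cloud mean spectral response in bands assumed to overlap the trailing satellite's EOS; the band names, threshold value and example reflectances are hypothetical, and an operational system would use a calibrated COT retrieval.

```python
import numpy as np

def is_obstructive(cloud_spectral_response: dict[str, float],
                   opacity_threshold: float = 0.35) -> bool:
    """Crude screening rule: treat the cloud as obstructive only if its mean
    reflectance in the bands overlapping the trailing satellite's EOS exceeds
    the threshold (i.e. the cloud is too opaque to see through)."""
    relevant_bands = ("red", "nir")          # assumed overlap with the trailing EOS
    mean_response = np.mean([cloud_spectral_response[b] for b in relevant_bands])
    return mean_response > opacity_threshold

thin_cirrus   = {"red": 0.18, "nir": 0.22, "swir": 0.10}   # semi-transparent
thick_cumulus = {"red": 0.72, "nir": 0.78, "swir": 0.55}   # opaque
print(is_obstructive(thin_cirrus), is_obstructive(thick_cumulus))   # False True
```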
In addition to the above, it is further suggested to utilize the leading satellite for the purpose of extending the communication time (for uplink and downlink) between the trailing satellite and a ground control station (GCS) 260. Generally, the time during which communication between a LEO satellite and the GCS-antenna 261 is possible is less than or equal to the time intervals during which the line-of-sight (LOS) between the satellite and the GCS antenna is not obstructed by the Earth, due to the Earth's curvature. These intervals, referred to herein as "communication periods" (or "ground communication periods"), are typically several minutes in duration and are limited to those revolutions in which the satellite passes in sufficient proximity to the GCS-antenna 261 for an unobstructed LOS to be established. Such limited communication periods may be insufficient for transmitting all the data (including, for example, imaging output and telemetry) gathered by the trailing satellite 120 during the time periods between communication periods (data can be stored, for example, in data-repository 250b onboard the satellite). Since the trailing satellite can be deployed at a significant distance from the leading satellite while still maintaining an unobstructed LOS with it at all times (e.g. with a time lag dT of 10 min at an altitude of 500 km), a continuous communication link can be maintained between the two satellites. Using this "inter-satellite" data link, the trailing satellite can transmit some of the data it gathers (e.g. Earth imaging output) to the leading satellite (or satellites) while orbiting the Earth.
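For orientation, the following sketch bounds the duration of a single ground communication period from the orbit geometry, assuming a circular orbit, a pass directly over the antenna and a minimum elevation mask, and neglecting Earth rotation; the numbers are illustrative only (real passes are usually shorter).

```python
import math

MU, R_EARTH = 3.986004418e14, 6371e3   # m^3/s^2, m

def max_pass_duration_s(altitude_m: float, min_elevation_deg: float = 5.0) -> float:
    """Upper bound on a single ground-contact window for a circular LEO orbit,
    assuming a pass directly over the antenna and neglecting Earth rotation."""
    r = R_EARTH + altitude_m
    eps = math.radians(min_elevation_deg)
    # Earth-central angle from the antenna to the satellite at minimum elevation
    lam = math.acos((R_EARTH / r) * math.cos(eps)) - eps
    omega = math.sqrt(MU / r**3)          # orbital angular rate, rad/s
    return 2 * lam / omega

print(f"{max_pass_duration_s(500e3) / 60:.1f} min")   # on the order of nine minutes
```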
According to examples of the presently disclosed subject matter, while in orbit, prior to an incoming communication period, the trailing satellite transmits to the leading satellite at least part of the data it generates (including imaging data of the Earth), and the leading satellite stores the received data in onboard data storage devices 250a. The leading satellite is configured to transmit the stored data to a ground control station via the GCS-antenna during its communication periods. Due to the time lag between the leading and trailing satellites, which may be significant as discussed above, they may communicate with a single GCS-antenna in a sequential manner, such that the combined communication time of the GCS-antenna with the trailing satellite and with the leading satellite is significantly larger than the communication time of the GCS-antenna with the trailing satellite alone. This significantly increases the amount of data that can be downlinked in a single communication pass.
This approach helps to overcome the difficulty that may arise from the fact that the inter-satellite data link typically has a significantly lower channel capacity than the satellite-to-ground link, which benefits from the significantly larger antennas commonly used at the ground control station.
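A simple data-budget sketch can make this concrete: even a relatively slow inter-satellite link, running during the long interval between ground passes, can supply enough relayed data to fill the leading satellite's additional pass time. All data rates and durations below are hypothetical assumptions, not values from the disclosure.

```python
def downlink_gain_gbit(pass_s: float, extra_pass_s: float,
                       ground_link_mbps: float, isl_mbps: float,
                       relay_window_s: float) -> dict:
    """Compare what the trailing satellite alone can downlink per pass with what
    the pair can downlink when the leading satellite relays stored data.

    The relayed volume is capped both by the (slower) inter-satellite link running
    during the relay window and by the leading satellite's extra pass time."""
    alone = pass_s * ground_link_mbps / 1e3                       # Gbit
    relayed = min(relay_window_s * isl_mbps,
                  extra_pass_s * ground_link_mbps) / 1e3          # Gbit
    return {"trailing alone": alone, "with leading relay": alone + relayed}

# Illustrative numbers only: 8 min pass, 8 min extra pass via the leading satellite,
# 300 Mbps ground downlink, 50 Mbps inter-satellite link, ~90 min of relay time.
print(downlink_gain_gbit(pass_s=480, extra_pass_s=480,
                         ground_link_mbps=300, isl_mbps=50,
                         relay_window_s=90 * 60))
```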
It will also be understood that some components of the system according to the presently disclosed subject matter may be a suitably programmed computer. Likewise, the presently disclosed subject matter contemplates a computer program on a non-transitory computer memory device being readable by a computer for executing the method of the presently disclosed subject matter. The presently disclosed subject matter further contemplates a machine-readable non-transitory computer memory tangibly embodying a program of instructions executable by the machine for executing the method of the presently disclosed subject matter.
It is to be understood that the presently disclosed subject matter is not limited in its application to the details set forth in the description contained herein or illustrated in the drawings. The presently disclosed subject matter is capable of other embodiments and of being practiced and carried out in various ways. Hence, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting. As such, those skilled in the art will appreciate that the conception upon which this disclosure is based may readily be utilized as basis for designing other structures, methods, and systems for carrying out the several purposes of the presently disclosed subject matter.
The present invention has been described with a certain degree of particularity, but those versed in the art will readily appreciate that various alterations and modifications may be carried out without departing from the scope of the following claims.

Claims (43)

CLAIMS:
1. A satellite system for electro-optic imaging of Earth, the system comprising: a trailing satellite comprising a first imaging subsystem with a first electro-optic image sensor (first EOS) with a first optical aperture, configured and operable to capture images of one or more targets located on Earth; a leading satellite comprising a second imaging subsystem comprising a second electro-optic image sensor (second EOS) with a second optical aperture; wherein the leading satellite is characterized by a smaller mass compared to the trailing satellite; while the leading satellite is flying along an orbit around the Earth, the second imaging subsystem is configured and operable to operate the second electro-optic image sensor for capturing images in the direction of the Earth; the system further comprising at least one processing circuitry configured and operable during a time-lag between the leading satellite and trailing satellite, to: process the captured images and identify at least one cloud in the captured images; determine positioning data of the at least one cloud relative to a certain frame of reference; determine whether any of the one or more targets are bound to be obstructed by the at least one identified cloud, thereby identifying a group of one or more targets unobstructed by the clouds; generate an image acquisition scheme including instructions for controlling the first imaging subsystem for imaging targets in the group; while the trailing satellite is following the leading satellite along the orbit, the first imaging subsystem is configured and operable to operate the first electro-optic image sensor for capturing images of one or more targets, according to the instructions prescribed by the image acquisition scheme, thereby mitigating obstruction of the one or more targets by the at least one cloud.
2. The system of claim 1, wherein the second EOS is characterized by a resolution at nadir that is lower than the resolution at nadir of the first EOS.
3. The system of any one of claims 1 and 2, wherein the positioning data is a three-dimensional positioning data that includes altitude.
4. The system of any one of the preceding claims, wherein the at least one processing circuitry is further configured to: determine a future position of the at least one cloud relative to the one or more targets during a future respective acquisition period; and identify the group of one or more targets, using the future position of the at least one cloud.
5. The system of claim 4, wherein the future position of the at least one cloud and the one or more targets is determined using data on Earth's rotation.
6. The system of any one of the preceding claims, wherein the at least one processing circuitry includes a first processing circuitry onboard the trailing satellite; wherein the first processing circuitry is further configured to receive the images from the leading satellite.
7. The system of any one of claims 1 to 5, wherein the at least one processing circuitry includes a second processing circuitry located at a ground control station connected over a communication link with the leading satellite and trailing satellite; and wherein the second processing circuitry is further configured to receive the images from the leading satellite and transmit the image acquisition scheme to the trailing satellite.
8. The system of any one of the preceding claims, wherein the image acquisition scheme is generated while striving to increase a respective number of image acquisitions, while considering agility of the trailing satellite.
9. The system of any one of the preceding claims comprising two or more leading satellites configured to fly along the orbit; wherein each leading satellite of the two or more leading satellites comprises a respective second electro-optic image sensor; wherein the at least one processing circuitry is further configured to determine the positioning data of the at least one cloud, based on a plurality of images captured by imaging subsystems onboard the two or more leading satellites.
10. The system of any one of claims 2 to 8 comprising two or more leading satellites configured to advance along the orbit with a respective distance between each pair of leading satellites; wherein each leading satellite of the two or more leading satellites comprises a respective second electro-optic image sensor; wherein the at least one processing circuitry is further configured to: process a plurality of images captured simultaneously by imaging subsystems onboard the two or more leading satellites for determining a velocity of the at least one cloud; and estimate the future positioning of the at least one cloud based on the velocity.
11. The system of claim 9, wherein the at least one processing circuitry is further configured to: process the plurality of images captured simultaneously by imaging subsystems onboard the two or more leading satellites for determining a velocity of the at least one cloud; and estimate a future positioning of the at least one cloud based on the velocity.
12. The system of any one of the preceding claims comprising a multispectral or a hyperspectral sensor onboard the leading satellite and/or the trailing satellite configured to provide an optical response of the at least one cloud; wherein the at least one processing circuitry is configured to determine data indicative of transparency of the at least one cloud based on the optical response and to exclude the at least one cloud from obstructing targets in case the transparency complies with one or more conditions.
13. The system of any one of the preceding claims, wherein the trailing satellite and the leading satellite are capable of transmitting data to a ground control station during one or more communication periods; wherein the trailing satellite is configured to transmit to the leading satellite, data generated onboard the trailing satellite during times other than the respective communication period, the generated data including captured images of the one or more targets; the leading satellite is configured to transmit the data to a ground control station during a respective communication period, thereby increasing an overall volume of data that is transmitted to the ground control station by the leading satellite and trailing satellite.
14. The system of any one of the preceding claims, wherein the trailing satellite is a LEO satellite.
15. The system of any one of the preceding claims, wherein the leading satellite has a mass of 100 kg or less.
16. The system of any one of the preceding claims, wherein the leading satellite has a mass of 20 kg or less.
17. The system of any one of the preceding claims, wherein the leading satellite is a microsatellite or a nanosatellite.
18. The system of any one of the preceding claims, wherein the time-lag between the leading satellite and trailing satellite along the orbit is equal to or greater than 3 minutes.
19. The system of any one of the preceding claims, wherein the time-lag between the leading satellite and trailing satellite along the orbit is equal to or greater than 10 minutes.
20. The system of any one of the preceding claims, wherein the second optical aperture is smaller than the first optical aperture.
21. A method of electro-optic imaging of Earth, the method comprising: operating a trailing satellite and at least one leading satellite, wherein the at least one leading satellite advances along an orbit around the Earth and the trailing satellite follows the leading satellite along the orbit at a predefined distance; wherein the at least one leading satellite is characterized by a smaller mass compared to the trailing satellite; while the trailing satellite and leading satellite are in orbit: operating a first electro-optic image sensor (EOS), onboard the at least one leading satellite orbiting the Earth, for capturing images in the direction of the Earth; during a time-lag between the leading satellite and trailing satellite, operating a processing circuitry for: processing the captured images and identifying at least one cloud in the captured images; determining positioning data of the at least one cloud relative to a certain frame of reference; determining whether any of the one or more targets are bound to be obstructed by the at least one identified cloud, thereby identifying a group comprising one or more targets unobstructed by the clouds; generating an image acquisition scheme including instructions for controlling a second EOS onboard the trailing satellite, for imaging targets in the group; while the trailing satellite is following the leading satellite along the orbit, operating the second EOS for capturing images of one or more targets according to the instructions of the image acquisition scheme, thereby avoiding obstruction of the one or more targets by the identified clouds.
22. The method of claim 21, wherein the second EOS is characterized by a resolution at nadir that is lower than the resolution at nadir of the first EOS.
23. The method of any one of claims 21 and 22, wherein the positioning data is a three-dimensional positioning data that includes altitude.
24. The method of any one of claims 21 to 23, further comprising: determining a future position of the at least one cloud relative to the one or more targets during a future respective acquisition period; and identifying the group of one or more targets, using the future position of the at least one cloud.
25. The method of claim 24, wherein the future position of the at least one cloud and the one or more targets is determined using data on Earth's rotation.
26. The method of any one of claims 21 to 25, further comprising: transmitting the images from the leading satellite to the trailing satellite and performing the generating of the image acquisition scheme at the trailing satellite.
27. The method of any one of claims 21 to 25, further comprising transmitting the images to a ground control station connected over a communication link with the leading satellite and trailing satellite, to enable generation of the image acquisition scheme at the ground control station; and receiving, at the trailing satellite, the image acquisition scheme from the ground control station.
28. The method of any one of claims 21 to 27, wherein the generating of the image acquisition scheme comprises striving to increase a respective number of image acquisitions, while considering agility of the trailing satellite.
29. The method of any one of claims 21 to 28 further comprising: operating two or more leading satellites configured to travel along the orbit; operating a respective EOS, onboard each leading satellite, for capturing images of Earth; wherein a respective optical aperture of each respective EOS is smaller than the optical aperture of the first EOS, onboard the trailing satellite; and determining the positioning data of the at least one cloud, based on a plurality of images captured by imaging subsystems onboard the two or more leading satellites.
30. The method of any one of claims 21 to 28 further comprising: operating two or more leading satellites configured to travel along the orbit; operating a respective EOS, onboard each leading satellite, for capturing images of Earth; wherein a respective optical aperture of each respective EOS is smaller than the optical aperture of the second EOS, onboard the trailing satellite; and processing a plurality of images captured simultaneously by imaging subsystems onboard the two or more leading satellites for determining a velocity of the at least one cloud; and estimating the future positioning of the at least one cloud based on the velocity.
31. The method of claim 30 further comprising: processing the plurality of images captured simultaneously by imaging subsystems onboard the two or more leading satellites for determining a velocity of the at least one cloud; and estimating a future positioning of the at least one cloud based on the velocity.
32. The method of any one of claims 21 to 31 further comprising: operating a multispectral or a hyperspectral sensor onboard the leading satellite and/or the trailing satellite configured to provide an optical response of the at least one cloud; determining data indicative of transparency of the at least one cloud based on the optical response and excluding the at least one cloud from obstructing targets in case the transparency complies with one or more conditions.
33. The method of any one of claims 21 to 32, further comprising: transmitting from the trailing satellite and/or the leading satellite, data to a ground control station during one or more communication periods; transmitting to the leading satellite data, generated onboard the trailing satellite, during times other than the respective communication period, the generated data including captured images of the one or more targets; transmitting the data from the leading satellite to a ground control station during a respective communication period, thereby increasing an overall volume of data that is transmitted to the ground control station by the leading satellite and trailing satellite.
34. The method of any one of claims 21 to 33, wherein the trailing satellite is a LEO satellite.
35. The method of any one of claims 21 to 34, wherein the leading satellite has a mass of 100 kg or less.
36. The method of any one of claims 21 to 35, wherein the leading satellite has a mass of 20 kg or less.
37. The method of any one of claims 21 to 36, wherein the leading satellite is a microsatellite or a nanosatellite.
38. The method of any one of claims 21 to 37, wherein the time-lag between the leading satellite and trailing satellite along the orbit is equal to or greater than 3 minutes.
39. The method of any one of claims 21 to 38, wherein the time-lag between the leading satellite and trailing satellite along the orbit is equal to or greater than 10 minutes.
40. The method of any one of claims 21 to 39 further comprising: launching the trailing satellite and at least one leading satellite such that the at least one leading satellite advances along the orbit around the Earth and the trailing satellite follows the leading satellite along the orbit, at a certain distance.
41. The method of any one of claims 21 to 40, wherein the first EOS comprises a first optical aperture and the second EOS comprises a second optical aperture that is smaller than the first optical aperture.
42. A machine-readable non-transitory memory device tangibly embodying a program of instructions executable by the machine for executing a method of electro-optic imaging of Earth by a satellite system comprising a trailing satellite and at least one leading satellite, wherein the at least one leading satellite advances along an orbit around the Earth and the trailing satellite follows the leading satellite along the orbit at a predefined distance; wherein the at least one leading satellite is characterized by smaller dimensions and smaller mass compared to the trailing satellite, the method comprising: while the trailing satellite and leading satellite are in orbit: during a time-lag between the leading satellite and trailing satellite, operating a first electro-optic image sensor (EOS), onboard the at least one leading satellite orbiting the Earth, for capturing images in the direction of the Earth; processing the captured images and identifying at least one cloud in the captured images; determining positioning data of the at least one cloud relative to a certain frame of reference; determining whether any of the one or more targets are bound to be obstructed by the at least one identified cloud, thereby identifying a group comprising one or more targets unobstructed by the clouds; generating an image acquisition scheme including instructions for controlling a second EOS onboard the trailing satellite for imaging targets in the group; while the trailing satellite is following the leading satellite along the orbit, operating the second EOS onboard the trailing satellite for capturing images of one or more targets according to the instructions of the image acquisition scheme, thereby avoiding obstruction of the one or more targets by the identified clouds.
43. A method of configuring a satellite system for electro-optic imaging of Earth, the method comprising: providing a trailing satellite comprising a first imaging subsystem with a first electro-optic image sensor (EOS) having a first optical aperture, and configured and operable to capture images of one or more targets located on Earth; providing a leading satellite comprising a second imaging subsystem with a second electro-optic image sensor (EOS) having a second optical aperture; wherein the leading satellite is characterized by smaller dimensions and smaller mass compared to the trailing satellite; preparing the leading satellite and trailing satellite to be launched into space such that the leading satellite advances along an orbit around the Earth and the trailing satellite follows the leading satellite along the orbit, at a predefined distance; configuring the second imaging subsystem to be capable of: while the leading satellite is in orbit around the Earth, operating the second electro-optic image sensor for capturing images in the direction of the Earth; configuring a processing circuitry to be capable of executing a process, during a time-lag between the leading satellite and trailing satellite while the leading and trailing satellite are orbiting the Earth, the process comprising: processing the captured images and identifying at least one cloud in the captured images; determining positioning data of the at least one cloud relative to a certain frame of reference; determining an estimated future position of the at least one cloud and the one or more targets during a future respective acquisition period; determining whether any of the one or more targets are bound to be obstructed by the at least one identified cloud, thereby identifying a group comprising one or more targets unobstructed by the clouds; generating an image acquisition scheme including instructions for controlling a first imaging subsystem onboard the trailing satellite for imaging targets in the group; configuring the first imaging subsystem to be capable of: while the trailing satellite is following the leading satellite along the orbit, operating the first imaging subsystem for operating the first electro-optic image sensor for capturing images of one or more targets, according to the instructions of the image acquisition scheme, thereby avoiding obstruction of the one or more targets by the identified clouds.
IL276014A 2020-07-13 2020-07-13 Satellite Imaging System with Reduced Cloud Obstruction IL276014B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
IL276014A IL276014B1 (en) 2020-07-13 2020-07-13 Satellite Imaging System with Reduced Cloud Obstruction

Publications (2)

Publication Number Publication Date
IL276014A IL276014A (en) 2022-02-01
IL276014B1 true IL276014B1 (en) 2024-08-01

Family

ID=80469094

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4145259B2 (en) * 2004-02-26 2008-09-03 富士通株式会社 Program, satellite adjustment device, control method
JP2014172555A (en) * 2013-03-12 2014-09-22 Mitsubishi Electric Corp Satellite observation system
US9126700B2 (en) * 2010-01-25 2015-09-08 Tarik Ozkul Autonomous decision system for selecting target in observation satellites
US20180172823A1 (en) * 2015-06-16 2018-06-21 Urthecast Corp Systems and methods for remote sensing of the earth from space
WO2018146220A1 (en) * 2017-02-08 2018-08-16 Klaus Schilling Small satellite capable of formation flying, and formation of multiple small satellites
