CN113838309A - Collision perception using historical data of vehicles - Google Patents
- Publication number
- CN113838309A (application number CN202110157742.5A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- predicted
- data
- airport
- location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0004—Transmission of traffic-related information to or from an aircraft
- G08G5/0013—Transmission of traffic-related information to or from an aircraft with a ground station
- G08G5/0017—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
- G08G5/0021—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
- G08G5/0073—Surveillance aids
- G08G5/04—Anti-collision systems
- G08G5/045—Navigation or guidance aids, e.g. determination of anti-collision manoeuvers
- G08G5/06—Traffic control systems for aircraft, e.g. air-traffic control [ATC] for control when on the ground
- G08G5/065—Navigation or guidance aids, e.g. for taxiing or rolling
Abstract
The invention provides collision perception using historical data of vehicles. The present disclosure relates to methods, computer program products, and systems for providing ground vehicle tracking data to an airport map display system onboard an aircraft, the ground vehicle tracking data including an indication of a potential collision zone. In one example, a method includes identifying historical navigation route data, airport guidance features, and a predicted path of a first vehicle. The method also includes determining a predicted position along the predicted path, determining a predicted position of a second vehicle, and comparing the vehicle envelopes of the two vehicles to determine a predicted collision zone for the vehicles.
Description
This application claims priority to Indian provisional application No. 202011006508, filed on 14 February 2020, which is hereby incorporated by reference in its entirety.
The present disclosure relates to collision sensing for vehicles.
Background
Airports have become increasingly busy as commercial air traffic continues to grow over the years. Accordingly, collision avoidance systems have been implemented that use various sensors, imaging devices, radar, and other hardware components mounted on an aircraft to help prevent a potential collision of the aircraft with another aircraft. Such hardware components increase the weight, maintenance complexity, and generally the overall cost of such vehicles.
Growing air traffic also involves very large passenger aircraft with very long wingspans, which can reduce wingtip clearance margins as the aircraft move over the airport surface. Further, at any given time, multiple aircraft in an area may be powered down or otherwise not transmitting the tracking beacons that could be used to reduce the likelihood of a collision with another vehicle. An aircraft may be powered down while being pulled by a tow vehicle, in which case a ground collision between the aircraft and other vehicles is even more likely to occur. Furthermore, in airfield areas where aircraft or aircraft tow vehicles navigate unmarked routes, such as airport tarmac areas or hangar bays, wingtip collisions may occur at even higher rates because of the seemingly unrestricted routes the aircraft or tow vehicle may take to reach the intended destination.
Disclosure of Invention
The present disclosure relates to methods, systems, and computer program products for predicting potential collision zones for vehicles at an expected time using historical vehicle data, clearance information for one or more vehicles, and/or airport guidance features. In some examples, a vehicle may transmit its current location to a User Interface (UI) device (e.g., an Electronic Flight Bag (EFB)) or to a remote data server (e.g., a cloud-based data server). The remote data server or EFB may use one or more of historical navigation route data, clearance information for one or more vehicles, and/or airport guidance features to predict a potential collision zone and provide an indication of the potential collision zone to a user. The historical navigation route data may be based on transponder location data and stored in a database of historical vehicle data. Further, the airport guidance features may include data stored in a database that provides information about the location of guidance markers, such as guide lines painted on the ground, guidance signs, architectural features, and other information that provides guidance to vehicles throughout a particular airport location. The collision awareness system may use the historical navigation route data and airport guidance features to predict a route, and predict vehicle locations along the route, to determine potential collision zones. In some cases, the collision awareness system may provide potential collision zone data for display on the EFB, such as on an Airport Moving Map Display (AMMD) application executing on the EFB.
In some examples, a ground vehicle tracking system may be used to determine airport surface transit object data using, for example, multilateration sensors or other airport system sensors. That data may be used to confirm or verify a potential collision zone predicted by the collision awareness system using one or more of historical navigation route data, clearance information for one or more vehicles, and/or airport guidance features. Some transiting ground objects, or some types of transiting ground objects, may not actively transmit messages or signals that can be received by some types of multilateration sensors or other airport system sensors, or may not respond to some types of interrogation signals transmitted by those sensors, such as when the airport system sensors rely on cooperative surveillance with which objects other than aircraft are not typically configured to cooperate. In some examples, a transiting aircraft may be pulled by an aircraft tractor (i.e., a tow vehicle that transports other vehicles). In such cases, the towed aircraft may be powered off, so that the aircraft does not transmit signals that could be used to track its position. Further, a vehicle may be located in an airport area that provides less guidance via airport guidance features. For example, the apron region of an airport may not include painted guidance features on the ground that can be referenced in the airport guidance database. Thus, complex maneuvers and high-traffic areas at various airport locations increase the likelihood of potential vehicle collisions (e.g., wingtip collisions, etc.).
According to various techniques of this disclosure, a collision awareness system may utilize one or more of historical navigation route data and/or airport guidance features to predict potential collision zones between vehicles. In addition, the collision awareness system may utilize vehicle clearance information, such as clearance information from Air Traffic Control (ATC), to predict potential collision zones between vehicles transiting the airport surface, where at least one vehicle is moving, whether under its own power or under tow. In some examples, the collision awareness system may execute on a remote server that collects data (such as vehicle locations), updates the databases, and predicts collision zones. In other examples, the collision awareness system may be implemented at least partially on the EFB or another user interface device.
In some examples, the collision awareness system may receive clearance information as text or speech, process the clearance information, and determine navigation information for the vehicle or predict a current location of the vehicle based on the clearance information. For example, if a vehicle receives clearance to a particular gate of an apron area, but then powers off its avionics system, the collision awareness system can determine how much time has passed since the vehicle received the clearance, how much time vehicles have historically taken to reach the destination point or another target marker on the path toward the destination point, and predict the location of the vehicle at any particular point in time. In any case, the collision awareness system may use historical navigation route data and airport guidance features to predict the location of a vehicle, predict its trajectory, and predict the trajectories of other vehicles to determine whether overlap between the envelopes of two or more vehicles indicates a potential collision at an expected or future time.
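The elapsed-time prediction described above can be sketched as follows. The waypoints, function names, and historical segment times are illustrative assumptions for the sketch, not data defined by this disclosure:

```python
# Hypothetical sketch: estimate a powered-down vehicle's position from the
# time elapsed since it received clearance, using historical per-segment
# traversal times along the cleared route. All names and values here are
# illustrative assumptions.

def predict_position(route_points, historical_segment_secs, elapsed_secs):
    """route_points: list of (lat, lon) waypoints from clearance point to
    destination. historical_segment_secs: typical seconds to traverse each
    segment (len(route_points) - 1 entries). Returns an interpolated
    (lat, lon) estimate for the elapsed time."""
    assert len(historical_segment_secs) == len(route_points) - 1
    t = 0.0
    for i, seg_time in enumerate(historical_segment_secs):
        if elapsed_secs <= t + seg_time:
            frac = (elapsed_secs - t) / seg_time  # fraction of segment covered
            (la1, lo1), (la2, lo2) = route_points[i], route_points[i + 1]
            return (la1 + frac * (la2 - la1), lo1 + frac * (lo2 - lo1))
        t += seg_time
    return route_points[-1]  # vehicle has historically reached the destination

# Example: a two-segment taxi route that historically takes 60 s then 90 s.
route = [(41.9726, -87.8922), (41.9726, -87.8930), (41.9730, -87.8930)]
print(predict_position(route, [60.0, 90.0], 30.0))
```

A real system would also account for speed variations and the guidance markers along the path, as described above; this sketch only shows the time-interpolation idea.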
In this way, the collision awareness system may be implemented without the need to install additional hardware on the aircraft, and may provide a reliable indication of a predicted collision zone in the airport by overlaying data (such as airport surface data overlaid with historical navigation route data) with a particular computing system. Further, the collision awareness system may provide such predictions using machine learning models trained on specific data inputs, which allow for continuous modeling and updating of a predicted route as a vehicle traverses it. For example, the collision awareness system may predict a route of a first vehicle, but when the first vehicle begins to travel the predicted route, an updated predicted route may be determined, such as based on data received from the vehicles (e.g., speed information, location information, etc.), allowing the collision awareness system to provide dynamic predictions on the fly while objects move throughout the airport and while the historical navigation route data evolves with changing conditions. Further, the collision awareness system may predict collision zones based on aircraft characteristics and airport characteristics, while referencing general and specific information derived from multiple vehicle types and airport locations.
In one example, a method includes obtaining, by processing circuitry of a ground collision awareness system, historical navigation route data for one or more reference vehicles, the historical navigation route data based on transponder location data. The method also includes identifying, by the processing circuit, a plurality of airport guidance features for a particular airport location, the airport guidance features including guidance marker information. The method also includes determining, by the processing circuit, a predicted path for the first vehicle, the predicted path including a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information, and the second portion of the predicted path being predicted using the historical navigation route data. The method also includes determining, by the processing circuit, a predicted position of the first vehicle along the predicted path at the expected time. The method also includes determining, by the processing circuit, a predicted position of the second vehicle relative to approximately the same expected time. The method also includes performing, by the processing circuit, a comparison of a first vehicle envelope of the first vehicle and a second vehicle envelope of the second vehicle at the predicted location. The method also includes identifying, by the processing circuit, an overlap of the first vehicle envelope and the second vehicle envelope. The method also includes determining, by the processing circuit, a predicted collision zone of the first vehicle and the second vehicle at an expected time based at least in part on an overlap of the first vehicle envelope and the second vehicle envelope.
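The envelope-comparison and overlap steps of the method above can be illustrated with a minimal sketch. The disclosure does not specify envelope geometry; circular safe-zone envelopes in a local meter grid are an assumption made purely for illustration:

```python
import math

# Illustrative sketch of the envelope comparison, assuming each vehicle's
# safe-zone envelope is a circle of radius r (meters) centered on its
# predicted position. Real envelopes (e.g., wingtip polygons) would be more
# complex; this only shows the overlap test that yields a predicted
# collision zone.

def envelopes_overlap(pos1, r1, pos2, r2):
    """pos1/pos2: (x, y) predicted positions in a local meter grid."""
    dx, dy = pos1[0] - pos2[0], pos1[1] - pos2[1]
    return math.hypot(dx, dy) < (r1 + r2)

def predicted_collision_zone(pos1, r1, pos2, r2):
    """Return the midpoint between the vehicles as a coarse collision-zone
    marker, or None if the envelopes do not intersect at the expected time."""
    if not envelopes_overlap(pos1, r1, pos2, r2):
        return None
    return ((pos1[0] + pos2[0]) / 2.0, (pos1[1] + pos2[1]) / 2.0)

# A 40 m-wingspan aircraft (20 m envelope) and a tug 20 m away (5 m envelope):
print(predicted_collision_zone((0.0, 0.0), 20.0, (20.0, 0.0), 5.0))  # → (10.0, 0.0)
```

Both positions here stand for the predicted positions at approximately the same expected time, matching the sequence of steps in the claimed method.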
In another example, a ground collision awareness system is disclosed that includes a processor and a memory. The memory is configured to store: historical navigation route data for one or more reference vehicles, wherein the historical navigation route data is based on transponder location data; and a plurality of airport guidance features for one or more airport locations, wherein the airport guidance features include guidance marker information. The processor of the ground collision awareness system is configured to: determine a predicted path for a first vehicle, the predicted path including a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information and the second portion of the predicted path being predicted using the historical navigation route data; determine a predicted position of the first vehicle along the predicted path at an expected time; determine a predicted position of a second vehicle at approximately the same expected time; perform a comparison of a first vehicle envelope of the first vehicle and a second vehicle envelope of the second vehicle at the predicted positions; identify an overlap of the first vehicle envelope and the second vehicle envelope; and determine a predicted collision zone for the first vehicle and the second vehicle at the expected time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
In another example, a non-transitory computer-readable storage medium having instructions stored thereon is disclosed. The instructions, when executed, cause one or more processors to: obtain historical navigation route data for one or more reference vehicles, the historical navigation route data based on transponder location data; identify a plurality of airport guidance features for a particular airport location, the airport guidance features including guidance marker information; determine a predicted path for a first vehicle, the predicted path including a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information and the second portion of the predicted path being predicted using the historical navigation route data; determine a predicted position of the first vehicle along the predicted path at an expected time; determine a predicted position of a second vehicle at approximately the same expected time; perform a comparison of a first vehicle envelope of the first vehicle and a second vehicle envelope of the second vehicle at the predicted positions; identify an overlap of the first vehicle envelope and the second vehicle envelope; and determine a predicted collision zone for the first vehicle and the second vehicle at the expected time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
The present disclosure also relates to an article of manufacture including a computer-readable storage medium. The computer readable storage medium includes computer readable instructions executable by a processor. The instructions cause the processor to perform any portion of the techniques described herein. The instructions may be, for example, software instructions, such as those used to define a software or computer program. The computer-readable medium may be a computer-readable storage medium, such as a storage device (e.g., a disk drive or optical drive), a memory (e.g., a flash memory, a read-only memory (ROM), or a Random Access Memory (RAM)), or any other type of volatile or non-volatile memory or storage element that stores instructions (e.g., in the form of a computer program or other executable file) to cause a processor to perform the techniques described herein. The computer readable medium may be a non-transitory storage medium.
The details of one or more examples are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
Drawings
Fig. 1 is a conceptual block diagram depicting an example collision awareness system interacting with various components to determine a predicted collision zone of a vehicle, in accordance with aspects of the present disclosure.
Fig. 2 is a conceptual block diagram of an example computing system having an example computer-executable collision awareness system in accordance with aspects of the present disclosure.
Fig. 3 depicts a flow diagram of an example process for identifying a predicted collision zone that a collision awareness system may implement using one or more of historical navigation route data and/or airport guidance features in accordance with aspects of the present disclosure.
Fig. 4 depicts an exemplary technique for aligning historical navigation routes with airport guidance features that may be implemented by a collision awareness system according to aspects of the present disclosure.
Fig. 5A-5C depict conceptual diagrams of a portion of an airport with various aircraft and ground vehicles on the airport runways, taxiways, and other airport grounds, and an exemplary collision awareness system predicting the location of various aircraft and ground vehicles on the airport grounds, in accordance with aspects of the present disclosure.
Fig. 6 is a diagram of an exemplary graphical output display of an airport ground map that may be implemented by a two-dimensional airport moving map display (2D AMMD) application that may be implemented on an Electronic Flight Bag (EFB), such as a tablet computer of a flight crew member in a cockpit of a particular aircraft on an airport surface, in accordance with aspects of the present disclosure.
Fig. 7 is a conceptual block diagram depicting an example airport network system with an example ground sensor, in accordance with aspects of the present disclosure.
Detailed Description
Various examples are generally described below that relate to methods, computer program products, and electronic systems that may provide collision awareness data (including indications of potential collision zones) from a collision awareness system. The collision awareness system may provide such data to an application onboard the aircraft (e.g., an Airport Moving Map Display (AMMD) application) over a wireless network. For example, a collision awareness system may provide such an indication of potential collision zones on an Electronic Flight Bag (EFB), which may be implemented on a tablet computer or similar user interface device. The flight crew may view and use the AMMD augmented with information from the collision awareness system when the pilot is controlling the aircraft on the airport ground, such as during taxiing, parking, etc. In some examples, the tow vehicle operator may view and use the AMMD augmented with information from the collision awareness system when towing the aircraft to the destination location according to ATC clearance information. The collision awareness system may determine potential collision zones with one or more other ground vehicles (e.g., other aircraft or ground vehicles) and transmit a warning of the potential collision zones to the EFB. Accordingly, implementations of the present disclosure may provide better situational awareness for controlling ground movement of aircraft on airport taxiways, including in weather conditions with limited visibility, without requiring any new hardware to be installed in the aircraft itself (and thus without requiring certification of the new hardware by the relevant aviation authority), and without requiring cooperative participation by other aircraft. 
Implementations of the present disclosure may not only reduce the likelihood of an aircraft colliding with another aircraft or ground vehicle, but may also provide additional benefits to an airport, such as smoother taxiing and less disruption or delay due to confusion or lack of situational awareness of ground traffic.
FIG. 1 is a conceptual block diagram depicting exemplary components of a collision awareness system environment 102. The collision awareness system may operate in such an exemplary collision awareness system environment 102, including the various exemplary components of fig. 1. In the illustrated example, the collision awareness system environment 102 includes various components, including ground and/or air vehicles 111, traffic controllers 114, one or more data servers 132, various databases or data repositories 105, and user interface devices 104. Thus, the collision awareness system may be implemented as software installed on one or more of the components of the collision awareness system environment 102.
Although vehicles 112A-112N may sometimes be referred to as aircraft having various configurations, the techniques of this disclosure are not so limited, and vehicles 112A-112N may include other vehicles, such as helicopters, hybrid tiltrotor aircraft, urban air vehicles, jet planes, quadcopters, hovercraft, space shuttles, Unmanned Aerial Vehicles (UAVs), flying robots, and the like. Further, although vehicles 113A-113N may sometimes be referred to as tow vehicles, the techniques of this disclosure are not so limited, and vehicles 113A-113N may include other vehicles, such as unmanned ground vehicles, transiting ground vehicles, unmanned tow vehicles (e.g., remote-controlled vehicles), baggage cart trains with multiple carts attached via linkages, refueling trucks, airport buses, container loaders, belt loaders, catering vehicles, emergency vehicles, snowplows, or ground maintenance equipment, among others. In some examples, the vehicles 111 may receive direct communications from the traffic controller 114, such as via radio or cellular communications. For example, the traffic controller 114 may transmit clearance information directly to one of the aircraft 112, or to a tow vehicle 113, indicating the destination gate for parking the aircraft.
In some examples, the user interface devices 104A-104N may include a variety of user interface devices. For example, the user interface device 104 may include a tablet, laptop, telephone, EFB, augmented reality headset, virtual reality headset, or other type of user interface device. The user interface device 104 may be configured to receive ground vehicle movement data with an indication of a potential collision zone from the collision awareness system. According to exemplary aspects of the present disclosure, such as those of figs. 5A-5C, the user interface device 104 may also be configured to generate (e.g., render) and present an AMMD showing an indication of transiting ground vehicles and potential collision zones.
The network 130 may include any number of different types of network connections, including satellite connections and Wi-Fi™ connections. For example, the network 130 may include networks established using geostationary satellites 105A, low earth orbit satellites 105B, a global navigation satellite system 105C, cellular base station transceivers 160 (e.g., for 3G, 4G, LTE, and/or 5G cellular network access), and/or Wi-Fi™ access points. In turn, the geostationary satellites 105A and the low earth orbit satellites 105B may communicate with a gateway that provides access to the network 130 for one or more devices implementing a collision awareness system. The cellular base station transceivers may have connections that provide access to the network 130. Further, the global navigation satellite system may communicate directly with the vehicles 111, for example, to triangulate (or otherwise calculate) the current position of a vehicle 111. These various satellite, cellular, and Wi-Fi network connections may be managed by different third-party entities, referred to herein as "operators." In some examples, the network 130 may include a wired system. For example, the network 130 may include an Ethernet system, such as the redundant Ethernet system shown in fig. 7 of the present disclosure. In some examples, the network 130 may include a multilateration system Local Area Network (LAN), such as the multilateration system LAN shown in fig. 7 of the present disclosure.
In some examples, any of the devices of collision awareness system environment 102 executing one or more techniques of a collision awareness system may be configured to communicate with any of a variety of components via network 130. In other cases, a single component of the collision awareness system environment 102 may be configured to perform all of the techniques of the collision awareness system. For example, the collision-awareness system may include a system residing on the vehicle 111, the data server 132, the traffic controller 114, or the user-interface devices 104A/104N. In some examples, the collision awareness system may operate as part of a software package installed on one or more computing devices. For example, the traffic controller 114 may operate software that performs one or more of the various techniques of the disclosed collision awareness system. For example, a software version of the collision awareness system may be installed on the computing device of the traffic controller 114. Also, the disclosed collision awareness system may be included in the user interface device 104 or one or more data servers 132. For example, the data server 132 may comprise a cloud-based data server implementing the disclosed collision awareness system.
In some examples, the one or more data servers 132 may be configured to receive input data (e.g., vehicle location data, airport guidance features, clearance information, etc.) from the network 130, determine a predicted collision zone, and output the predicted collision zone data to one or more components of fig. 1, such as the user interface device 104, the traffic controller 114, or the vehicles 111, in accordance with one or more techniques of this disclosure. In some examples, data server 132 may include data store 105. In other examples, some or all of data store 105 may be embodied as a separate device that interacts with other components of the collision awareness system environment 102, either directly or via network 130. For example, where the collision awareness system is implemented at least in part on one or more of the data servers 132, the data repository 105 may interact with the data servers 132 directly or via the network 130.
In some examples, the databases may include historical vehicle data 106, airport guidance data 108, and in some cases clearance data 110. Although shown as a single data store 105, the databases shown as part of data store 105 may be embodied as separate objects. In some examples, a database included in or external to the vehicle may be or include a key-value data store, such as an object-based database or dictionary. In a non-limiting example, a database can include any data structure (and/or combination of multiple data structures) for storing and/or organizing data, including but not limited to a relational database (e.g., an Oracle database, a MySQL database, etc.), a non-relational database (e.g., a NoSQL database, an in-memory database, etc.), or a file-based store, such as a Comma-Separated Values (CSV) file, an eXtensible Markup Language (XML) file, a TeXT (TXT) file, a flat file, a spreadsheet file, and/or any other widely used or proprietary format for data storage.
Databases are typically stored in one or more data repositories. Accordingly, each database referred to herein (e.g., in the description herein and/or the figures of the present application) should be understood as being stored in one or more data stores. In various examples, the outgoing request and/or the incoming response may be transmitted in any suitable format. For example, XML, JSON, and/or any other suitable format may be used for API requests and responses or otherwise. As described herein, data transfer refers to transmitting data from one of the vehicle 111, the traffic controller 114, the data server 132, or the user interface device 104 over the network 130, and receiving data at the user interface device 104, the data server 132, the traffic controller 114, or the vehicle 111 over the network 130. The data transfer may follow various formats, such as a database format, file, XML, HTML, RDF, JSON, system-specific file format, or data object format, and may be encrypted and may carry any available type of data.
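As a non-authoritative illustration of the JSON data transfer mentioned above, a single navigation-route sample (of the kind shown in Table 1 below) might be serialized before transmission over the network 130; the payload field names here are assumptions for illustration, not part of this disclosure:

```python
import json

def encode_position_report(vehicle_id, speed, lat, lon, epoch):
    """Serialize one navigation-route sample as a JSON payload.

    Field names are hypothetical; any consistent schema would do.
    """
    payload = {
        "vehicleId": vehicle_id,   # hypothetical identifier, e.g. "A1"
        "speed": speed,            # ground speed in the stored unit
        "latitude": lat,
        "longitude": lon,
        "epochTime": epoch,        # seconds since the Unix epoch
    }
    return json.dumps(payload)

def decode_position_report(text):
    """Parse a received payload back into a dict for the data store."""
    return json.loads(text)

# Round-trip one sample from Table 1 below:
msg = encode_position_report("A1", 10, 41.97267, -87.89229, 1496021663)
record = decode_position_report(msg)
```

A receiving component (e.g., the data server 132) could append such decoded records to the historical vehicle data 106.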
In some examples, the historical vehicle data 106 may store historical navigation route data and vehicle data (e.g., maintenance logs, safe zone envelope data, etc.). An exemplary visual depiction of certain historical navigation route data may be as shown in table 1 below.
TABLE 1
Speed | Latitude | Longitude | Epoch time | Time (hh:mm:ss) |
10 | 41.97267 | -87.89229 | 1496021663 | 1:34:23 |
11 | 41.97268 | -87.89236 | 1496021664 | 1:34:24 |
10 | 41.97266 | -87.8924 | 1496021665 | 1:34:25 |
9 | 41.97268 | -87.89246 | 1496021666 | 1:34:26 |
13 | 41.9727 | -87.89256 | 1496021667 | 1:34:27 |
8 | 41.97268 | -87.89257 | 1496021668 | 1:34:28 |
10 | 41.97268 | -87.89264 | 1496021669 | 1:34:29 |
9 | 41.97266 | -87.89269 | 1496021670 | 1:34:30 |
11 | 41.97267 | -87.89278 | 1496021671 | 1:34:31 |
11 | 41.97265 | -87.89284 | 1496021672 | 1:34:32 |
The historical navigation route data of one data set may correspond to data obtained for a particular one of the vehicles 111. For example, table 1 above may include data regarding a particular one of vehicles 111 (such as vehicle 112A or vehicle 113A). Further, the historical navigation route data may be relative to a particular location, such as a particular airport location. Thus, the historical navigation route data may include additional data entries for "vehicle ID" and "airport identifier". In any event, the above table is merely one exemplary representation of some historical navigation route data that the data store 105 may manage and store over time. The navigation route data may be based on data received directly from each aircraft, such as from transponder data, or may include tracking data obtained in other ways, such as by external sensors. In some examples, the data entry associated with "speed" as shown may be associated with ground speed. The historical vehicle data 106 may be stored in any suitable unit of speed, such as nautical miles per hour, meters per second, and so forth. The historical vehicle data 106 may also store acceleration data determined from speed data or received directly from one of the vehicles 111 or an external sensor.
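As a hedged sketch of how acceleration data might be determined from stored speed data, consecutive speed and epoch-time samples in the layout of Table 1 can simply be differenced; the units (here, speed unit per second) and the two-field sample layout are illustrative assumptions:

```python
def ground_accelerations(samples):
    """Derive acceleration from consecutive (speed, epoch_time) samples.

    `samples` follows the layout of Table 1: each entry holds a ground
    speed and an epoch timestamp. The result is in the speed unit per
    second (e.g., kt/s if speeds are in knots).
    """
    accels = []
    for (v0, t0), (v1, t1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt > 0:  # skip duplicate or out-of-order timestamps
            accels.append((v1 - v0) / dt)
    return accels

# First three rows of Table 1: speed 10 -> 11 -> 10 at one-second intervals
data = [(10, 1496021663), (11, 1496021664), (10, 1496021665)]
accel = ground_accelerations(data)  # [1.0, -1.0]
```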
In some examples, airport guidance data 108 may include a map or other data representation of the airport ground, including guidance features configured to guide vehicles through a particular airport location. The ground may be, for example, a taxiway at an airport, a runway, a gate area, an apron, a hangar bay, or another traffic lane or surface. For the purposes of this disclosure, the description of an "airport" is equally applicable to an air base, an aircraft landing site, or any other type of permanent or temporary airport. In some examples, airport guidance data 108 may include multiple databases specific to a particular airport or to nearby airports. For example, the airport guidance data 108 may include an airport-specific fixed ground object database, including real-time or near-term imaging or detection, or a combination of both, to provide fixed ground object information and to provide airport guidance features, such as guide line coordinates fixed to the airport ground. In any case, the data server 132 or other components of the collision awareness system environment 102 may be configured to access one or more of the airport guidance data 108. For example, the data server 132 may identify a particular airport location, such as a particular airport in a particular city, and may access airport guidance data 108 specific to the identified airport location. In some examples, airport guidance data 108 for multiple airports may be included in a single data store 105; in other examples, it may be held in separate data stores 105.
In some examples, data store 105 may also include permission data 110. In some examples, the permission data 110 may be included as a separate data store 105. For example, the permission data 110 may reside in a data store stored on a computing system of the traffic controller 114. For example, a traffic controller 114 for a particular airport may include the permission data 110. Traffic controller 114 may also include other data contained in data store 105. The permission data 110 may include textual or audible clearance information generated by the traffic controller 114 and/or the vehicle 111, as well as communications between the traffic controller 114 and the receiving vehicle 111.
In some examples, the traffic controller 114 may transmit taxiway or runway clearance information to one of the vehicles 111 in text format or in a voice message. In some examples, one of the vehicles 111 may retrieve taxiway or runway clearance information from the traffic controller 114. For example, one of the vehicles 111 may perform a database query on the permission data 110, or otherwise request the permission data 110 from the traffic controller 114 or the data store 105 storing the permission data 110. In some examples, a text or voice message may be transmitted directly from the traffic controller 114 (e.g., live communication or from the permission data 110) to one of the vehicles 111. One of the vehicles 111 may then transmit the clearance information to one or more external systems (e.g., cloud systems) via an aircraft data gateway (ADG) communication unit. For example, the vehicle 111 or the traffic controller 114 may transmit the permission information to the device executing the collision awareness system.
In various scenarios, the traffic controller 114 may send a copy of the permit message (text or voice message) to the data server 132 (e.g., cloud system) via a secure communication protocol. Once the data is available on the data server 132, the data server 132 may convert any voice-related taxiway or runway clearance information to textual information and store the clearance information to a predefined location on the data repository 105.
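As an illustrative sketch only, storing converted clearance text at a predefined location keyed by vehicle might resemble the following; the class and field names are assumptions, and the voice-to-text conversion step is elided (a clearance is assumed to arrive already as text):

```python
import time

class ClearanceStore:
    """Minimal sketch of the permission data 110: clearance messages
    keyed by vehicle ID, stored as text records."""

    def __init__(self):
        self._records = {}  # vehicle_id -> list of clearance records

    def store(self, vehicle_id, clearance_text, received_epoch=None):
        """Append a clearance record for a vehicle."""
        rec = {
            "text": clearance_text,
            "received": received_epoch if received_epoch is not None
                        else int(time.time()),
        }
        self._records.setdefault(vehicle_id, []).append(rec)

    def latest(self, vehicle_id):
        """Most recent clearance text for a vehicle, e.g. for use in
        path prediction; None if no clearance is on record."""
        recs = self._records.get(vehicle_id, [])
        return recs[-1]["text"] if recs else None

store = ClearanceStore()
store.store("A1", "Taxi to runway 27 via taxiway B", 100)
store.store("A1", "Hold short of runway 27", 101)
```

A collision awareness system could then query `latest("A1")` when deriving target markers from the most recent clearance.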
A collision awareness system implemented on one or more components of collision awareness system environment 102 may utilize data from data store 105 to determine a predicted collision zone for a vehicle. As such, when the aircraft 112 is taxiing, taking off, landing, or stationary on the airport ground, the collision awareness system may help to mitigate or reduce collisions between the vehicle 111 (including the body, wingtips, or other portions of the vehicle 111) and other aircraft, ground vehicles, or other transiting or moving objects (collectively, "transiting ground objects") on the airport ground. For purposes of this disclosure, a "transiting ground object" may refer to any aircraft, ground vehicle, or other object on the airport ground, including objects that are permanently fixed in place and that may be monitored by the collision awareness system.
FIG. 2 is a conceptual block diagram of an exemplary computing system 138 having an exemplary computer-executable collision awareness system 140. As described with reference to fig. 1, the collision awareness system 140 may be embodied in any number of different devices, such as one or more of the components of the collision awareness system environment 102 described with reference to fig. 1. For simplicity, computing system 138 implementing collision awareness system 140 may be described as performing various techniques of the present disclosure across one or more data servers 132, such as on a cloud server. However, it should be understood that computing system 138 may be implemented on traffic controller 114, user interface device 104, vehicle 111, or other network devices designed to provide vehicle collision awareness. That is, the collision awareness system 140 may execute on any one or more of the processing circuits 142 of the computing devices corresponding to the traffic controller 114, the user interface device 104, the vehicle 111, or other network devices, and combinations thereof. Further, where one or more of databases 106, 108, or 110 are implemented as a storage device separate from storage device 146, collision awareness system 140 may execute based on data from storage device 146 and/or data store 105 included in any one or more of processing circuits 142 of a computing device corresponding to traffic controller 114, user interface device 104, vehicle 111, or other network device. In some examples, storage device 146 may include one or more of databases 106, 108, or 110.
Thus, the computing system 138 may implement the collision awareness system 140 via the processing circuitry 142, the communication circuitry 144, and/or the storage device 146. In some examples, computing system 138 may include a display device 150. For example, where computing system 138 is embodied in one of user interface device 104, vehicle 111, or traffic controller 114, computing system 138 may include display device 150 that is integral to the particular device. In some examples, the display device 150 may include any display device, such as a Liquid Crystal Display (LCD) or Light Emitting Diode (LED) display or other type of screen, which the processing circuitry 142 may utilize to present information related to the predicted collision zone. In some examples, display device 150 may not be included in computing system 138. For example, the computing system 138 may be one of the data servers 132 configured to perform the various techniques of this disclosure and transmit the collision zone data to another device (such as one of the user interface devices 104) for display.
In examples that include the display device 150, the display device 150 may be configured to graphically present the collision zone information on a ground navigation application implemented by the aircraft system. Further, the display device 150 may be configured to graphically present the position/velocity information of the one or more transiting ground objects on a ground navigation application implemented by the aircraft system. For example, the display device may generate graphical display format data based on the position and velocity information that is configured to be compatible with the graphical output of the AMMD application, such that the AMMD application may overlay, superimpose, or otherwise integrate the graphical display format data with the existing AMMD graphical display output. The display device 150 generates output that includes or is in the form of graphical display format data such that the output can be readily received and graphically presented by an AMMD application executing on an EFB (e.g., on a tablet computer) in the cockpit of an aircraft moving on the ground at an airport, as described further below. In this way, the collision awareness system 140 may provide immediately available outputs, including alerts or warnings, via the display device 150 to inform the pilot or other flight crew of the potential danger of an impending collision so that the pilot or flight crew can take appropriate action.
In some examples, the display device 150 including one or more display processors may be incorporated in an integrated collision avoidance logic and display processing subsystem in a single processor, electronic system and/or device, or software system with an integrated implementation of the collision awareness system 140. For example, the user interface device 104 may include the collision awareness system 140 and the display device 150 as a single device, such as an EFB.
In some examples, the processing circuitry 142 may include fixed function circuitry and/or programmable processing circuitry. The processing circuitry 142 may comprise any one or more of a microprocessor, controller, Digital Signal Processor (DSP), Application Specific Integrated Circuit (ASIC), Field Programmable Gate Array (FPGA), or equivalent discrete or analog logic circuitry. In some examples, the processing circuitry 142 may include multiple components, such as any combination of one or more microprocessors, one or more controllers, one or more DSPs, one or more ASICs, one or more FPGAs, and other discrete or integrated logic circuitry. The functionality attributed to processing circuit 142 herein may be embodied as software, firmware, hardware, or any combination thereof.
In some examples, the communication circuitry 144 may include a Wireless Network Interface Card (WNIC) or other type of communication module. In some examples, the communication circuitry 144 may have an Internet Protocol (IP) port coupled to an Ethernet connection or to an output port such that the communication circuitry 144 receives an output from the processing circuitry 142. The communication circuitry 144 may be configured to connect to Wi-Fi™ or other wireless network connections. In some examples, the communication circuitry 144 may be separate from the collision awareness system 140. For example, the collision awareness system 140 may include the processing circuitry 142 of the computing system 138, while the communication circuitry may be included in a separate computing system.
In some examples, the collision awareness system 140 may include one or more storage devices 146. In some examples, storage device 146 may include one or more of data store 105, and may be similarly configured to store data. For example, the storage device 146 may include any data structure (and/or combination of data structures) for storing and/or organizing data, including but not limited to a relational database (e.g., an Oracle database, a MySQL database, etc.), a non-relational database (e.g., a NoSQL database), an in-memory database, or a file, such as a Comma Separated Value (CSV) file, an Extensible Markup Language (XML) file, a text (TXT) file, a flat file, a spreadsheet file, and/or any other widely used or proprietary format for data storage.
In some examples, the storage device 146 may include executable instructions that, when executed, cause the processing circuitry 142 to perform the various techniques of this disclosure. Further, the storage device 146 may include a Machine Learning (ML) model 148. In some examples, the ML model 148 may be included on a separate storage device. For example, the ML model 148 may be stored in the data repository 105 on the data server 132. In such examples, processing circuitry 142 may execute ML model 148 via network connection 130.
In some examples, in accordance with certain examples of the present disclosure, a path, vehicle location, or collision zone may be processed and predicted using a trained ML model where an ML model is deemed advantageous (e.g., for predictive modeling, inference detection, context matching, natural language processing, etc.). Examples of ML models that may be used with aspects of the present disclosure include classifier and non-classifier ML models, artificial neural networks ("NN"), linear regression models, logistic regression models, decision trees, support vector machines ("SVM"), naïve Bayes or other Bayesian networks, K nearest neighbor ("KNN") models, K-means models, clustering models, random forest models, or any combination thereof. These models may be trained based on data stored in data store 105. For example, for purposes of illustration only, certain aspects of the disclosure will be described using a predicted path generated by training an ML model on data from data store 105.
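For illustration only, a K nearest neighbor ("KNN") style next-position predictor over historical navigation route data could look like the following minimal sketch in pure Python (no ML library); the data layout, the one-step horizon, and the planar distance metric are assumptions, and a deployed ML model 148 would instead be trained offline on the data store 105:

```python
import math

def knn_predict_next(history, current, k=3):
    """Predict where a vehicle at `current` (lat, lon) will be one step
    later by averaging the successor positions of its k closest
    historical positions.

    `history` is a list of (position, next_position) pairs mined from
    historical navigation route data such as Table 1.
    """
    def dist(p, q):
        # Planar distance is a simplification; short taxi distances only.
        return math.hypot(p[0] - q[0], p[1] - q[1])

    nearest = sorted(history, key=lambda pair: dist(pair[0], current))[:k]
    lat = sum(nxt[0] for _, nxt in nearest) / len(nearest)
    lon = sum(nxt[1] for _, nxt in nearest) / len(nearest)
    return (lat, lon)

# Toy history: vehicles near (0, 0) historically moved to (0, 1).
hist = [((0.0, 0.0), (0.0, 1.0)),
        ((0.1, 0.0), (0.0, 1.0)),
        ((5.0, 5.0), (9.0, 9.0))]
nxt = knn_predict_next(hist, (0.0, 0.0), k=2)  # (0.0, 1.0)
```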
The collision awareness system 140 may access or incorporate an ML system or a pattern recognition system. Based on a large training data set of past motions of aircraft, ground vehicles, and other objects, as statistically sampled over time in a representative set of particular airports or airports, the ML models 148 may incorporate knowledge of predictable future motions of aircraft, ground vehicles, or other objects based on statistical training of one or more ML models 148 or pattern recognition systems. For example, such ML systems or pattern recognition systems may also incorporate statistical training of observed motion of aircraft, ground vehicles, and other transit objects on the airport ground that is associated with various conditions, such as traffic levels, weather and visibility conditions, and time of day. The one or more initially trained ML models 148 may be further refined with a large amount of data of the motion of aircraft, ground vehicles, and other transit objects on the airport ground, as compared to the motion predicted by the one or more ML models 148. Further, the collision awareness system 140 of the computing system 138 may implement an expert rule system that may incorporate general airline gate assignments, specific gate assignments for a specific aircraft or a given flight, and knowledge of data about assigned taxi routes between gate areas and runways, which the ML model 148 may use to predict routes. For example, in accordance with techniques of this disclosure, the processing circuitry 142 may deploy an ML model 148 that trains historical navigation route data from the historical vehicle data 106 and that trains general airline gate assignments to predict a route of one of the vehicles 111, including worst-case and best-case scenario routes that may be combined to determine a single predicted route.
In some examples, collision awareness system 140 may be enabled and implemented in existing airport systems, vehicles 111, and/or user interface devices 104 with only minimal hardware or software changes. Further, in some examples, collision awareness system 140 may be configured to provide reliable false alarm mitigation and the ability to use various data inputs, such as from automatic dependent surveillance-broadcast (ADS-B) sources, and to provide coverage of any type of vehicle 111 that may potentially collide with a fixed structure or another of the vehicles 111.
Fig. 3 depicts a flowchart of an exemplary process 300 that may be implemented by the collision sensing system 140 for providing collision zone predictions, according to an exemplary aspect of the present disclosure. In some examples, process 300 may include some features that may be optional. In this example, the process 300 includes obtaining (e.g., by the processing circuitry 142 of the collision perception system 140) historical navigation route data for one or more vehicles 111 (302). For example, the processing circuit 142 may identify historical navigation route data from the historical carrier data 106. The historical navigation route data may be based at least in part on transponder location data from the vehicle 111. For example, the vehicle 111 may transmit navigation route data via the network 130, which may be stored in the historical vehicle data store 106 over time as historical navigation route data.
The process 300 also includes identifying (e.g., by the processing circuitry 142) a plurality of airport guidance features for a particular airport location (304). For example, the processing circuitry 142 may identify airport guidance features for a particular airport from the airport guidance data 108. In some examples, the airport guidance features may include guidance marker information, such as guidance signs and guide lines for the airport location. For example, the guide line may include coordinates of a line marker fixed to the airport ground (such as by being painted to the ground).
In some examples, the predicted path of the first vehicle is based on a combination of initial predicted paths of the first vehicle, the initial predicted paths including a likelihood of the first vehicle traveling the first initial predicted path and a likelihood of the first vehicle traveling the second initial predicted path. For example, the processing circuit 142 may combine or average predicted paths representing multiple best-case historical paths and worst-case historical path segments to generate a predicted path. The predicted path may be specific to a particular one of the vehicles 111, and may include a path connecting points of the particular one of the vehicles 111 from a first time (e.g., a1(t1)) to one or more other expected times (e.g., a1(t1 + n), where n represents an integer number of time steps added to the first time t1).
In such examples, the processing circuitry 142 may determine at least two initial predicted paths for the first vehicle. The processing circuitry 142 may also identify a likelihood that the first vehicle travels any of the initial predicted paths. For example, the likelihood may include an indication of a best-case path, meaning that the predicted path is most likely to occur; an indication of a worst-case path, meaning that the predicted path is least likely to occur; or other paths having a likelihood falling between the best-case path and the worst-case path. In any case, the processing circuit 142 may determine the predicted path of the first vehicle based on a combination of at least two initial predicted paths of the first vehicle. The combining may be based on the processing circuitry 142 deploying an ML model that is capable of determining an initial predicted path, combining the predicted paths, or both, to determine a combined predicted path. In some cases, the combining may be based on the likelihood information such that more weight is given to paths that are more likely to occur in that case, and less weight to paths that are less likely to occur in the same case. For example, the processing circuitry 142 may determine a weighted average of the initial predicted paths, or deploy one of the ML models 148 to determine a weighted average or other combination of the initial predicted paths.
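A minimal sketch of the likelihood-weighted combination described above, assuming each initial predicted path is sampled at the same expected times t1..t1+n; the point format and the example weights are illustrative assumptions:

```python
def combine_predicted_paths(paths, likelihoods):
    """Combine initial predicted paths into one predicted path by a
    likelihood-weighted average, so a more likely (best-case) path
    contributes more than a less likely (worst-case) path.

    Each path is a list of (lat, lon) points; paths must be sampled at
    the same expected times, so points are combined index by index.
    """
    total = sum(likelihoods)
    combined = []
    for points in zip(*paths):  # points for the same expected time
        lat = sum(w * p[0] for w, p in zip(likelihoods, points)) / total
        lon = sum(w * p[1] for w, p in zip(likelihoods, points)) / total
        combined.append((lat, lon))
    return combined

best = [(0.0, 0.0), (0.0, 2.0)]    # most likely path
worst = [(0.0, 0.0), (2.0, 2.0)]   # least likely path
# Best-case path weighted three times the worst-case path:
path = combine_predicted_paths([best, worst], [0.75, 0.25])
```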
In some examples, the processing circuitry 142 may sort and transmit the predicted path data to the data store 106. The processing circuitry 142 may classify the predicted path based on the aircraft type such that the predicted path may be referenced for future use by similarly located vehicles 111.
In some examples, processing circuitry 142 may retrieve certain historical data from historical vehicle data 106 to identify a predicted path based on the particular vehicle type and/or current state information of the vehicle. For example, a particular vehicle may be the ownship performing the techniques of this disclosure on an EFB located in the ownship cockpit. In such examples, the processing circuitry 142 of the EFB may select the best-fit historical data to determine the predicted path. In some examples, the best-fit historical data selected for identifying predicted paths using ownship parameters may vary dynamically from location to location. In some examples, an ML model may be used for best-fit path selection based on current data and historical data.
In some examples, the processing circuitry 142 may identify permission information from a traffic controller defining one or more target markers for the first vehicle 111. For example, the processing circuitry 142 may query a database for permission information. The permission information may include the destination location as one target marker of the first vehicle, but may also include a plurality of target markers along the route to the destination location, such that the vehicle will follow a path along the target markers to the destination location.
In some examples, the processing circuitry 142 may identify one or more targeting features from a plurality of airport guidance features based at least in part on the permission information. In such examples, the one or more targeting features may be configured to provide guidance through a particular portion of the airport toward the one or more target markers. For example, a targeting feature may include an airport guidance feature that orients or guides the vehicle toward the target, such as by providing an arrow (whether virtual or real) that guides the vehicle 111 to the target. Thus, the processing circuit 142 may use the one or more targeting features to identify the first portion of the predicted path. For example, the first portion may include airport guidance features such that historical navigation data and airport guidance features may be used in conjunction with one another to determine a predicted path for the vehicle 111 through the first portion of the predicted path. In such examples, the processing circuit 142 may use the permission information and the historical navigation route data to identify a second portion of the predicted path that includes the destination location of the first vehicle, defined by the permission information as a target marker of the one or more target markers. A second portion of the predicted path may pass through an apron area of the airport that does not include guidance features, and thus historical navigation data may be used to predict that portion of the path. It should be noted that in some examples, the first and second portions may be switched, such as when the vehicle leaves the apron or gate area toward the runway.
That is, in some examples, the second portion of the predicted path may include an airport apron area, or may include a taxiway with ground guide markers, depending on the intended direction of travel of the vehicle 111 (e.g., toward a gate, toward a runway, toward an hangar space, or somewhere in between, etc.).
The process 300 also includes determining (e.g., by the processing circuitry 142) a predicted location of the first vehicle along the predicted path at the expected time (308). In some examples, the processing circuitry 142 may implement a regression algorithm to predict an accurate instantaneous position using previous position, speed, and heading information (e.g., a1(t + 1) through a1(t + 2), where time t is in seconds). In some scenarios, a regression model is used to minimize the position bias error in the historical data. In some examples, data points (such as those shown in table 1 above) may be used to calculate a cumulative distance for a particular vehicle 111. For example, the processing circuit 142 may calculate the cumulative distance from a1(t1) to a1(t + n). In such an example, assuming that all intermediate points are locally linear, processing circuitry 142 may determine all intermediate path points of the path segment from time 't' to 't + n'. In some examples, the processing circuitry 142 may utilize a function such as a great circle distance formula, a great circle earth model, or an equivalent projection system formula to determine location information based on the calculated distance and direction. The processing circuitry 142 may determine the directionality of one of the vehicles 111 from the predicted position along the predicted path. As shown in figs. 5A to 5C, a1(t1) may be determined based on a function using the current position and the accumulated distance of the particular vehicle 111 according to equation [1] or [2] below.
In such examples, the processing circuitry 142 may determine movement information of the first vehicle at the current location. In some examples, the movement information may include speed information of the first vehicle. For example, the processing circuitry 142 may receive sensor data or transponder data from one of the vehicles 111 indicating the rate at which the particular vehicle 111 is traveling. Thus, the processing circuit 142 may use the movement information and the historical navigation route data to identify a predicted location of the first vehicle. For example, the processing circuitry 142 may determine how much distance the particular vehicle 111 will travel along the predicted path based on the rate at which the particular vehicle 111 is traveling at the current location along the predicted path.
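The distance-along-path computation described above can be sketched as follows, assuming locally linear segments (as stated for step 308) and using the haversine great-circle formula for segment lengths as one concrete choice of great-circle distance function; the mean Earth radius and the metric units are assumptions:

```python
import math

EARTH_R = 6371000.0  # mean Earth radius in metres (assumption)

def haversine(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_R * math.asin(math.sqrt(a))

def predicted_position(path, speed_mps, dt_s):
    """Advance a vehicle speed_mps * dt_s metres along its predicted
    path, treating each segment as locally linear, and return the
    interpolated (lat, lon) at the expected time."""
    remaining = speed_mps * dt_s
    for p, q in zip(path, path[1:]):
        seg = haversine(p, q)
        if remaining <= seg:
            f = remaining / seg if seg else 0.0
            return (p[0] + f * (q[0] - p[0]), p[1] + f * (q[1] - p[1]))
        remaining -= seg
    return path[-1]  # vehicle runs past the end of the predicted path
```

For example, a vehicle at 10 m/s projected 30 s ahead along a 1.1 km straight segment lands roughly a quarter of the way along it.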
The process 300 also includes determining (e.g., by the processing circuit 142) a predicted position of the second vehicle at approximately the same expected time (310). For example, the processing circuitry 142 may determine a predicted position of a first one of the vehicles 111 at a time 15 seconds into the future, and thus may determine a predicted position of another one of the vehicles 111 at a time 15 seconds into the future. In essence, the processing circuitry 142 may determine the predicted location of each vehicle at any number of expected times, and the processing circuitry 142 will likely find a future time or range of times (e.g., 14-15 seconds into the future) that indicates when a vehicle collision is likely to occur. In some examples, the expected time corresponding to the predicted position of the second vehicle may be the same as the expected time corresponding to the predicted position of the first vehicle. For example, the predicted positions of the two vehicles may correspond to an expected time of 15 seconds into the future. However, in some cases, the predicted locations may not correspond to exactly the same expected time. For example, due to the size of the vehicles, the predicted positions at T = 14 seconds and at T = 15 seconds may indicate an overlap of the vehicle envelopes at 14 or 15 seconds. In another example, the predicted locations may be determined at different intervals. For example, the predicted position of the first vehicle may be determined on a second-by-second basis, while the predicted position of the second vehicle may be determined on a half-second or every-other-second basis. In such cases, the processing circuitry 142 may perform interpolation techniques to predict collision zones at times that are approximately the same (e.g., within half a second or a few seconds of each other), but may not be exactly the same.
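One way to compare predicted positions sampled at different intervals, as described above, is to linearly interpolate one vehicle's predicted track onto the other's timestamps and test the separation against a combined safe-zone envelope radius; the planar metre coordinates and the envelope value are simplifying assumptions for illustration:

```python
import math

def interpolate_at(track, t):
    """Linearly interpolate a predicted track at time t.
    `track` is a time-sorted list of (t, (x, y)) predicted positions."""
    for (t0, p0), (t1, p1) in zip(track, track[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (p0[0] + f * (p1[0] - p0[0]),
                    p0[1] + f * (p1[1] - p0[1]))
    return track[-1][1]  # t outside the track: clamp to last position

def collision_times(track_a, track_b, envelope_m):
    """Return the expected times at which the two vehicles' predicted
    positions (track_b interpolated onto track_a's timestamps) fall
    within a combined safe-zone envelope radius, in metres."""
    hits = []
    for t, pa in track_a:
        pb = interpolate_at(track_b, t)
        if math.hypot(pa[0] - pb[0], pa[1] - pb[1]) <= envelope_m:
            hits.append(t)
    return hits

# Vehicle A sampled every second, vehicle B every two seconds;
# both converge on the origin at t = 15.
track_a = [(13, (0.0, 20.0)), (14, (0.0, 10.0)), (15, (0.0, 0.0))]
track_b = [(13, (30.0, 0.0)), (15, (0.0, 0.0))]
times = collision_times(track_a, track_b, 20.0)  # [14, 15]
```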
In some examples, the processing circuitry 142 performing the process 300 may include the processing circuitry 142 of one or more remote data servers 132. In such examples, the processing circuitry 142 of the one or more remote data servers 132 may receive the current location of the first vehicle 111 or an indication of the current location. The processing circuitry 142 may perform all or some of the process 300 in order to determine a predicted collision zone. In some examples, the processing circuitry 142 of the remote server may transmit the predicted collision zone from the remote server to the first vehicle 111, such as to the EFB or other user interface device 104 corresponding to the vehicle 111.
Fig. 4 depicts an exemplary technique that may be implemented by collision awareness system 140 for aligning historical navigation routes with airport guidance features when determining a predicted position of vehicle 111. In some examples, the processing circuitry 142 may align the historical navigation route so as to coincide with an airport guidance feature (such as a ground guide line). In an example, the processing circuit 142 may predict a path of the first vehicle 111 from the temporary source point 404A to the temporary destination point 404B (e.g., a target marker). The source and destination points 404 may be located in an airport area with airport guidance features. For example, the ground guide line 406 may be located between the source point and the destination point 404. In some examples, the processing circuitry 142 may determine the source and destination points 404 based on various predicted points along the predicted path, where the predicted path may be updated as the vehicle 111 approaches each predicted point along the predicted path. In some examples, the source and destination points 404 may be based on vehicle permission data received from the traffic controller 114. In some examples, the processing circuitry 142 may utilize a combination of the clearance data and the historical navigation route data to determine points 404A and 404B that are configured to guide the vehicle 111 to a final destination point that may be offset from an airport area having ground guide lines (such as the ground guide lines 406).
As shown in fig. 4, processing circuitry 142 may determine historical navigation route data 410 between points 404A and 404B. The processing circuitry 142 may determine the historical navigation route data 410 from the airport guidance data 108. In some examples, the historical navigation route data 410 of fig. 4 may be a combination (e.g., an average) of multiple predicted paths merged into a single predicted path 410 composed of various predicted points along the predicted path 410 (e.g., a weighted average based on the likelihood of each predicted path).
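The likelihood-weighted combination of candidate paths described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the point lists and likelihood values are hypothetical:

```python
def combine_paths(paths, likelihoods):
    """Combine candidate predicted paths into a single predicted path by
    taking a likelihood-weighted average of corresponding (x, y) points."""
    total = sum(likelihoods)
    weights = [w / total for w in likelihoods]  # normalize the likelihoods
    combined = []
    for pts in zip(*paths):  # corresponding points across all candidate paths
        x = sum(w * p[0] for w, p in zip(weights, pts))
        y = sum(w * p[1] for w, p in zip(weights, pts))
        combined.append((x, y))
    return combined

# Two candidate predicted paths with likelihoods 0.75 and 0.25
path_a = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
path_b = [(0.0, 0.0), (10.0, 4.0), (20.0, 8.0)]
print(combine_paths([path_a, path_b], [0.75, 0.25]))
# → [(0.0, 0.0), (10.0, 1.0), (20.0, 2.0)]
```

The combined path leans toward the more likely candidate, mirroring the weighted-average behavior described for path 410.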
The processing circuitry 142 may align the historical data points 410 along the predicted path between the points 404A and 404B to determine an aligned predicted path 412. The aligned predicted path 412 may be aligned along a ground guidance feature, such as ground guide line 406. In accordance with various techniques of this disclosure, the processing circuitry 142 may use the aligned predicted path 412 to determine a predicted point (e.g., a target marker) along the aligned predicted path 412 at an expected time in order to determine a collision zone. It should be understood that a target marker refers to any of various points on the ground that the vehicle may target as it progresses along the path, so that the vehicle navigates toward the final target marker or final destination. For example, a target marker may change over time as the processing circuitry 142 updates the predicted path. The processing circuitry 142 may further update the predicted position along the updated predicted path, for example, based on changes in vehicle speed over time.
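One simple way to align historical track points with a ground guide line, as in the alignment of path 410 to guide line 406, is to project each point onto the nearest point of the guide-line segment. The sketch below assumes a straight-line guide segment and hypothetical noisy track points; it is illustrative only:

```python
def project_to_segment(p, a, b):
    """Project point p onto segment a-b; return the nearest point on the segment."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return a  # degenerate segment: both endpoints coincide
    # Projection parameter t, clamped so the result stays on the segment
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return (ax + t * dx, ay + t * dy)

def align_path(points, guide_a, guide_b):
    """Snap each historical track point onto the guide-line segment."""
    return [project_to_segment(p, guide_a, guide_b) for p in points]

# Guide line along y = 0 from x = 0 to x = 100; noisy historical track points
noisy = [(25.0, 1.5), (50.0, -2.0), (75.0, 0.7)]
print(align_path(noisy, (0.0, 0.0), (100.0, 0.0)))
# → [(25.0, 0.0), (50.0, 0.0), (75.0, 0.0)]
```

Each noisy track point is snapped onto the guide line, analogous to producing the aligned predicted path 412 from the raw historical data points 410.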
Fig. 5A-5C depict conceptual views of a portion of an airport 500 with various aircraft and ground vehicles on airport runways, taxiways, and other airport surfaces. The various aircraft and ground vehicles shown in fig. 5A-5C include aircraft 112 and ground vehicles 113 (designated A1-A4 in fig. 5A-5C for simplicity). As discussed above with respect to fig. 1, collision awareness system 140 may be configured to determine a predicted path and position of vehicle 111, determine an actual position and speed of vehicle 111, determine an alert envelope, and predict a collision zone for vehicle 111. The collision awareness system 140 may then output the position and speed information of one or more transient ground objects and an indication of the potential collision zone to the network 130, so that these outputs from the collision awareness system 140 may be received by the EFB on at least one of the vehicles 111 among the transient ground objects on the surface of the airport 500.
Fig. 5A shows a simplified example of two vehicles A1 and A2, which may be aircraft among vehicles 112A-112N, but for simplicity will be referred to as vehicles A1 and A2 when describing the time course using the tX notation. In the examples of fig. 5A to 5C, t0-tX indicates time in seconds. For example, t0 indicates an initial starting point at time 0, and t5 indicates a predicted position after 5 seconds have elapsed. In fig. 5A, vehicle A1 has received clearance to park at a particular destination location. Processing circuitry 142 may predict path 502 of vehicle A1 in accordance with various techniques of this disclosure. For example, the processing circuitry 142 may deploy the ML model to determine a combined predicted path determined from best-case and worst-case path predictions, as informed by historical navigation route data at the airport 500 or other airports. Processing circuitry 142 may determine a predicted position of vehicle A1 along predicted path 502 at any time interval. Although the example of fig. 5A-5C shows 5-second intervals, the techniques of this disclosure are not so limited, and any time interval, including a variable time interval, may be used. In the example of fig. 5A, four predicted positions of vehicle A1 at 5 seconds, 10 seconds, 15 seconds, and 30 seconds are predicted. The processing circuitry 142 may use historical navigation route data that is aligned with the airport guidance features, if available, to determine the predicted location or, in some examples, may use historical navigation route data without the airport guidance features if the airport guidance features are not available, such as in areas of the airport 500 where no guide lines are present (e.g., apron areas and other airport areas).
Thus, the processing circuitry 142 may predict positions of vehicle A2 along the predicted route 503, the predicted positions including at least one time (e.g., t10, t15, t30) that coincides with a predicted-position time for vehicle A1. In some examples, the processing circuitry 142 may determine the predicted location of the second vehicle using the predicted current location of the second vehicle and one or more of: historical navigation route data, a plurality of airport guidance features, or permission information for the second vehicle. In the example of fig. 5A, processing circuitry 142 may determine that an overlap of the safe zone envelopes of vehicles A1 and A2 will occur at an expected time 15 seconds in the future unless some change is made to the system, such as deceleration or acceleration of one or the other vehicle.
As information becomes available, processing circuitry 142 may perform another prediction at various intervals using the new information, such as speed or acceleration data for vehicles A1 and A2. In any case, the processing circuitry 142 may identify the predicted collision zone as the overlap region 508. Although the safe zone envelopes are shown as circles in fig. 5A-5C, a safe zone envelope may be any shape and may be specific to the shape of a particular vehicle 111. For example, where vehicle 112A (e.g., A1) is a particular aircraft having a particular size and shape, the safe zone envelope for vehicle 112A may be similar to the size and shape of vehicle 112A, such that a detected overlap would indicate the location on the vehicle where the overlap is predicted to occur. In the example of fig. 5A, processing circuitry 142 may determine that the overlap of envelope 508 indicates that the nose of vehicle A2 is predicted to collide with the left wing of vehicle A1 at expected time t15.
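The envelope-overlap check at a common expected time can be sketched with circular safe-zone envelopes. The positions, radii, and time grid below are hypothetical (the disclosure notes that envelopes may instead be vehicle-shaped):

```python
import math

def circles_overlap(c1, r1, c2, r2):
    """Two circular envelopes overlap if the distance between their centers
    is less than the sum of their radii."""
    return math.dist(c1, c2) < r1 + r2

def first_predicted_conflict(pred1, pred2, r1, r2):
    """pred1/pred2 map expected time (seconds) -> predicted (x, y) position.
    Return the earliest common expected time at which the two safe-zone
    envelopes overlap, or None if no overlap is predicted."""
    for t in sorted(set(pred1) & set(pred2)):  # only compare matching times
        if circles_overlap(pred1[t], r1, pred2[t], r2):
            return t
    return None

# Predicted positions for vehicles A1 and A2 at expected times t5, t10, t15
a1 = {5: (0.0, 0.0), 10: (50.0, 0.0), 15: (100.0, 0.0)}
a2 = {5: (100.0, 80.0), 10: (100.0, 40.0), 15: (100.0, 5.0)}
print(first_predicted_conflict(a1, a2, 20.0, 20.0))  # → 15
```

With 20-meter envelopes, the earliest overlap is found at t15, analogous to the predicted collision zone 508 at expected time t15 in fig. 5A.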
If any collision zones lie ahead of the ownship, the processing circuitry 142 may generate a visual or text notification for display on the user interface device 104. For example, equation [1] below may be used to calculate the cumulative distance S along the track from time 't' to time 't + n', where 't' is in seconds and 'n' is a positive integer value:

S = Σ (u·Δt + ½·a·Δt²), summed over each one-second step from 't' to 't + n'    [1]

where 'u' refers to velocity or speed, Δt refers to the change in time (here, one second), and 'a' refers to acceleration (change in speed).

As shown in fig. 3, the historical performance/tracking data has speed and velocity information. The acceleration may be calculated from the velocity difference at each position. If the speed is constant (i.e., 'a' is 0), then equation [1] (above) reduces to equation [2] (below):

S = u·n    [2]
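As a sketch, the cumulative-distance computation of equations [1] and [2] with a one-second step might look like the following; the per-second speed and acceleration samples are hypothetical:

```python
def cumulative_distance(speeds, accels, dt=1.0):
    """Equation [1]: sum u*dt + 0.5*a*dt^2 over each one-second step,
    using per-step speed and acceleration samples from tracking data."""
    return sum(u * dt + 0.5 * a * dt * dt for u, a in zip(speeds, accels))

def cumulative_distance_const(u, n):
    """Equation [2]: with constant speed (a = 0) the sum reduces to u * n."""
    return u * n

# Constant 10 m/s for 5 seconds: both forms agree
print(cumulative_distance([10.0] * 5, [0.0] * 5))  # → 50.0
print(cumulative_distance_const(10.0, 5))          # → 50.0

# Accelerating case: speeds 0, 2, 4 m/s with a = 2 m/s^2 each second
print(cumulative_distance([0.0, 2.0, 4.0], [2.0, 2.0, 2.0]))  # → 9.0
```

The constant-speed case verifies that equation [1] collapses to equation [2] when 'a' is zero.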
In the example of fig. 5B, vehicle A3 represents an aircraft tug pulling an aircraft. Processing circuitry 142 may predict the path of vehicle A3 and determine that vehicles A1 and A3 are not predicted to collide because, at t5, vehicle A1 is predicted to already be past the intersection of the predicted paths of vehicles A1 and A3. In the example of fig. 5B, processing circuitry 142 may use historical navigation route data, such as in an apron area of airport 500, to predict a first portion of the predicted path of vehicle A3. Processing circuitry 142 may use the historical navigation route data and airport guidance features to predict a second portion of the predicted path of vehicle A3, such as in an area of airport 500 that includes guidance features that vehicle 111 is expected to follow to reach a predefined destination.
The example of fig. 5C is similar, except that vehicle A4 is predicted to be located near the path of A3. In the example of fig. 5C, vehicle A4 may be a parked vehicle with its avionics system powered off. Thus, processing circuitry 142 may use historical navigation route data to determine a predicted location of vehicle A4 and to determine a predicted path for vehicle A4. In this example, vehicle A4 does not have a predicted path in the foreseeable future based on clearance information, historical navigation route data, and/or other aircraft information available to processing circuitry 142 (such as flight times associated with vehicle A4, etc.). In such examples, processing circuitry 142 may determine whether vehicle A3 will have sufficient clearance to follow the predicted path without clipping parked vehicle A4. In some examples, processing circuitry 142 may determine a predicted collision zone between vehicles A3 and A4 and provide a notification for display on one of user interface devices 104.
FIG. 6 is a diagram of an exemplary graphical output display 610 of an airport ground map that may be implemented by a two-dimensional airport moving map display (2D AMMD) application, which may be implemented on an EFB. For example, the 2D AMMD application may be implemented on one of the user interface devices 104, such as an EFB tablet in the cockpit of a particular aircraft (e.g., aircraft 112 in fig. 1), for example, while the aircraft is on the ground and, for example, taxiing or otherwise moving. In other examples, the 2D AMMD application may be implemented on a device other than one of the user interface devices 104. In other examples, a graphical output display similar to graphical output display 610 may be implemented by a three-dimensional AMMD application, which may be implemented on an EFB or in another application package executing on a tablet or other type of computing and display device.
AMMD application graphical output display 610 (or "AMMD 610") includes representations (e.g., graphical icons) of transient ground objects, which may be received and/or decoded by a transient ground object overlay module of the AMMD application executing on the user interface device 104 that provides AMMD 610. In the display example shown in fig. 6, AMMD 610 thus includes a graphical icon of ownship 612 (e.g., the aircraft in whose cockpit user interface device 104 is being used, which may correspond to vehicle 111 in fig. 1 and one of vehicles A1-A4 in fig. 5A-5C); graphical icons of other moving vehicles 614, 616, and 618 (e.g., corresponding to other vehicles 111); and graphical icons of ground vehicles 622, 624 (e.g., corresponding to ground vehicles 113).
AMMD 610 also includes representations of airport guidance features, such as ground guide markers 642 and 644. AMMD 610 may also include representations of taxiways 634, 636, 638 and tarmac areas 652, 654, 656 near airport terminal portions 662, 664, 666. AMMD 610 may include indications of potential collision zones based on predicted collision zones determined and transmitted by collision awareness system 140, such as warning graphic 670 and text warning notification 672 between ownship icon 612 and aircraft icon 614. In one example, ownship 612 may have a predicted route from between the ground guide markers 642 to the tarmac area 656. In such cases, a first portion of the predicted route may include a portion of the taxiway 634 and a second portion of the predicted route may include a portion of the tarmac area 656. In any case, the first portion may correspond to a region of the particular airport location that includes airport guidance features, such as ground guide markers, while the second portion may correspond to a region of the particular airport location that does not include airport guidance features, such as apron regions.
In some examples, vehicle 614 and/or vehicle 612 may not be physically present at the locations shown on AMMD 610. That is, AMMD 610 may display the predicted locations of vehicle 614 and/or vehicle 612 at or near predicted collision zone 670. AMMD 610 may also display the predicted route over time, along with the current location and one or more predicted locations. In another example, AMMD 610 may display one or more predicted locations of second vehicle 614 so that a user may view the predicted routes and predicted locations that contribute to the predicted collision zone of ownship 612 and second vehicle 614. The user may toggle various aspects of the displayed information, such as toggling the predicted positions on or off. In one example, the predicted location and/or predicted route may be displayed as a hologram or other form of ghosted depiction of the vehicle at a moving or stationary location, to indicate to the user that the location or route is not an actual route, but rather represents a predicted route that is subject to change over time based on predictions from collision awareness system 140.
In this example, the collision awareness system 140 may connect to an aircraft system (e.g., an EFB application running on the user interface device 104) of a particular aircraft 612 over an extended-range wireless network (via a wireless router, such as wireless router 710 of fig. 7), where the particular aircraft may be among the transient ground objects that the collision awareness system 140 is monitoring. In some examples, the collision awareness system 140 may establish a secure wireless communication channel with an EFB application running on the user interface device 104, or with another aircraft system on a particular aircraft, over the extended-range wireless network, and then transmit its information, including location and velocity information of one or more transient ground objects, over the secure wireless communication channel.
In various examples, an EFB application executing on the user interface device 104 may thus receive all of the information transmitted by the collision awareness system 140, and receive all of the benefits of the collision awareness system 140, simply through a software upgrade to the EFB application implementing the examples of the present disclosure. Thus, in various examples, the pilot or flight crew may obtain the benefits of the present disclosure without requiring any new hardware (as the EFB application of the present disclosure may be executed on an EFB tablet or other EFB that the flight crew already has), and without requiring any hardware or software changes to the installed equipment of the aircraft itself, and thus without having to go through the certification process of any newly installed aircraft systems. Pilots, flight crew members, or aircraft operators may also enjoy the benefits of particular implementations of the present disclosure without relying on new hardware or software from Original Equipment Manufacturers (OEMs) of installed hardware or software systems installed in the aircraft.
Fig. 7 is a conceptual block diagram depicting an exemplary airport network system with exemplary ground sensors that may be used in conjunction with collision awareness system 140. In the example shown in fig. 7, the exemplary airport network system includes collision awareness system 140 connected to a wireless router 710 via communication circuitry 144. The collision awareness system 140 may be communicatively connected to multiple airport ground sensors of various types, including surface movement radar (SMR) transceivers 720, multilateration sensors 722, and multilateration reference transmitters 724, and/or to additional types of airport ground sensors, via a multilateration system LAN 770 or redundant airport system Ethernet local area networks (LANs) 716A and 716B ("airport system LAN 716"). The multilateration sensors 722 may collect data regarding the movement of the ground vehicles 111 and provide the data to the collision awareness system 140 via the network 130 (e.g., airport system Ethernet LAN 716, etc.), so that the collision awareness system 140 may use such data to confirm various predictions based on non-sensor data.
In some examples, processing circuitry 142 may receive ground sensor data for a first vehicle and/or a second vehicle among vehicles 111. The processing circuitry 142 may receive surface sensor data collected as described in various techniques of U.S. patent publication No. 2016/0196754 to Lawrence J. City, filed January 6, 2015, which is hereby incorporated by reference in its entirety. For example, SMR transceiver 720 is connected to at least one SMR antenna 726. The SMR transceiver 720 and SMR antenna 726 may be configured to detect, monitor, and collect data from various airport surfaces, and to detect transient ground objects on the airport surface, including aircraft, ground vehicles, and any other moving or temporary objects ("transient ground objects") on the surface. In such examples, the processing circuitry 142 may use data from one or more SMR transceivers 720, multilateration sensors 722, or other airport ground sensors, and combine the data from these multiple airport ground sensors to generate position and velocity information for one or more transient ground objects on the airport surface. The processing circuitry 142 may use the location and velocity information of one or more transient ground objects on the airport surface to determine a predicted location along the predicted path by extrapolating the location using the current location, the velocity information, and predicted changes in velocity or location as informed by the historical navigation route data.
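The extrapolation step above can be sketched as simple dead reckoning from a current position and velocity vector. This is a simplification under a constant-velocity assumption; the disclosure also uses historical route data to predict changes in speed or heading:

```python
def extrapolate(position, velocity, seconds):
    """Dead-reckon a predicted position from the current position and a
    constant velocity vector (x and y components, meters per second)."""
    x, y = position
    vx, vy = velocity
    return (x + vx * seconds, y + vy * seconds)

# Vehicle at (0, 0) moving 8 m/s east: predicted positions at t5, t10, t15
for t in (5, 10, 15):
    print(t, extrapolate((0.0, 0.0), (8.0, 0.0), t))
# → 5 (40.0, 0.0) / 10 (80.0, 0.0) / 15 (120.0, 0.0)
```

In practice, each extrapolated point would then be adjusted (e.g., snapped to guide lines or blended with historical routes) rather than used as-is.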
In any case, the processing circuitry 142 may then determine the current location of the first vehicle and/or the second vehicle from the ground sensor data. In some cases, the processing circuitry 142 may determine the current location of one vehicle while another vehicle is parked and out of range of the ground sensors. In such cases, the processing circuitry 142 may use the historical navigation route data to predict the current location of the other vehicle. In such examples, the processing circuitry may use the current locations of the first and second vehicles 111 to identify both the predicted location of the first vehicle and the predicted location of the second vehicle.
The multilateration sensors 722 may be configured to detect, monitor, and collect data from various airport surfaces, and to detect transient ground objects on the airport surface, in a manner that may complement the detection performed by the SMR transceiver 720. Exemplary multilateration sensor data collection techniques are described in U.S. patent publication No. 2016/0196754. For example, the multilateration sensors 722 may be implemented as omnidirectional antenna sensors disposed at various remote locations around an airport.
In some examples, the collision awareness system 140 may be connected to any one or more of various other types of sensors configured to detect transient ground objects on the airport surface. For example, the processing circuitry 142 of the collision awareness system 140 may be communicatively connected to, and configured to receive data from, one or more microwave sensors, optical imaging sensors, ultrasonic sensors, lidar transceivers, infrared sensors, and/or magnetic sensors. In some examples, the collision awareness system 140 may incorporate features and/or components of an airport surface monitoring system, such as an Advanced Surface Movement Guidance and Control System (A-SMGCS) or an Airport Surface Detection Equipment, Model X (ASDE-X) system. For example, the collision awareness system 140 may incorporate one or more of the SMR transceiver 720 and SMR antenna 726, multilateration sensors 722, and/or other airport ground sensors.
In some examples, collision awareness system 140 may incorporate or integrate signals or sensor inputs from a combination of surface movement radar, multilateration sensors, and satellites. One or more types of airport ground sensors may be configured to generate a signal indicating the position of a transient ground object to within a selected accuracy (such as five meters), for example, enabling the processing circuitry 142 to generate position and velocity information for transient ground objects with a similar level of accuracy. The processing circuitry 142 may also be communicatively connected, at least at times, to sensors located outside the vicinity of the airport, such as imaging sensors hosted on satellites, airships, or drones with imaging and communication capabilities.
The processing circuitry 142 may be configured to use data from the SMR transceiver 720 and/or the multilateration sensors 722 and the multilateration reference transmitters 724 to estimate or determine the position and velocity of a transient ground object on the airport surface, and to generate position and velocity information for one or more transient ground objects on the airport surface based at least in part on the data from the SMR transceiver 720 and/or from the multilateration sensors 722 and/or from one or more other airport ground sensors. In some examples, the processing circuitry 142 may, at one or more times, generate the position and speed of one or more airport ground vehicles or other ground support equipment (such as a fuelling vehicle, a pushback tug, an airport shuttle, a container loader, a belt loader, a luggage cart, a catering vehicle, an emergency vehicle, a snow plow, or ground maintenance equipment).
In some examples, the multilateration sensors 722 may perform active cooperative interrogation of aircraft moving on the airport surface. For example, the multilateration sensors 722 may transmit interrogation signals on the 1030/1090 megahertz (MHz) Traffic Collision Avoidance System (TCAS) surveillance band. In some examples, the multilateration sensors 722 may include an automatic dependent surveillance-broadcast (ADS-B) transceiver (e.g., a Mode S ADS-B transceiver) configured to receive ADS-B messages from aircraft on the airport surface. Various aircraft moving on the airport surface (at least aircraft that have their ADS-B systems active while on the ground) may automatically transmit ADS-B messages that may be received by the multilateration sensors 722. A multilateration sensor 722 using ADS-B may receive the ADS-B messages and transmit them to the processing circuitry 142, potentially along with additional data (such as time of receipt), facilitating the processing circuitry 142 in determining and generating position and velocity information for the responding aircraft.
In some examples, the processing circuitry 142 of the collision awareness system 140 may be configured to output the position and velocity information generated for the transient ground objects to the communication circuitry 144, and thus to the extended-range wireless router 710, for transmission over the wireless local area network. For example, processing circuitry 142 of collision awareness system 140 may output the location and velocity information of one or more transient ground objects at a selected Ethernet connection or output port, e.g., to an IP address connected to the communication circuitry 144 via a wireless network interface controller (WNIC).
The extended-range wireless network established by the wireless router 710 (and potentially additional wireless routers communicatively coupled to the communication circuitry 144) may extend its range throughout an airport and include all taxiways, runways, airport areas, tarmac areas, hangar bays, and other traffic lanes within its range. Thus, the extended-range wireless network provided by wireless router 710 may reach all aircraft on the airport surface within range, and may potentially provide wireless connectivity in the flight decks of all such aircraft, including wireless connectivity to the EFBs of the pilots or flight crews of the various aircraft. In some examples, extended-range wireless router 710 may be combined with collision awareness system 140 in a single unit or component.
Thus, in various examples, in the collision awareness system 140 including the processing circuitry 142 and the communication circuitry 144, the processing circuitry 142 is configured to receive data from one or more airport ground sensors (e.g., one or both of the SMR transceiver 720 and the multilateration sensors 722) configured to detect transient ground objects on the airport surface. The processing circuitry 142 of the collision awareness system 140 may be further configured to generate position and velocity information for one or more transient ground objects on the airport surface based at least in part on the data from the one or more airport ground sensors. The communication circuitry 144 of the collision awareness system 140 may be configured to receive the position and velocity information of the one or more transient ground objects from the processing circuitry 142 and output the position and velocity information of the one or more transient ground objects to the wireless router 710 for transmission over the wireless local area network.
Any of a variety of processing devices (such as collision awareness system 140, other components that interact with collision awareness system 140 and/or implement one or more techniques of collision awareness system 140, or other central processing units, ASICs, graphics processing units, computing devices, or any other type of processing device) may perform process 300, or portions or aspects thereof. The collision awareness system 140, and/or other components that interact with the collision awareness system 140 and/or implement one or more of its techniques as disclosed above, may be implemented in any of various types of circuit elements. For example, a processor of the collision awareness system 140, or of other components that interact with the collision awareness system 140 and/or implement one or more of its techniques, may be implemented as one or more ASICs, magnetic non-volatile random-access memory (RAM) or other types of memory, mixed-signal integrated circuits, central processing units (CPUs), field-programmable gate arrays (FPGAs), microcontrollers, programmable logic controllers (PLCs), systems on a chip (SoCs), a sub-portion of any of the above, an interconnected or distributed combination of any of the above, or any other type of component or one or more components that can be configured to use airport guidance features, historical data, and/or confidence information to predict a collision zone at an expected time and perform other functions in accordance with any of the examples disclosed herein.
The functions performed by the electronics associated with the equipment systems described herein may be implemented, at least in part, by hardware, software, firmware, or any combination thereof. For example, various aspects of these techniques may be implemented within one or more processors (including one or more microprocessors, DSPs, ASICs, FPGAs, or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components), embodied in electronics included in components of system 140 or other systems described herein. The terms "processor," "processing device," or "processing circuitry" may generally refer to any of the preceding logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry.
Such hardware, software, firmware may be implemented within the same device or within separate devices to support the various operations and functions described in this disclosure. Furthermore, any of the described units, modules or components may be implemented together or separately as discrete but interoperable logic devices. The description of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware or software components, or integrated within common or separate hardware or software components. For example, although collision awareness system 140 is shown in fig. 1 as a separate system, collision awareness system 140 may execute on one or more data servers 132, vehicles 111, traffic controllers 114, user interface devices 104, data stores 105, or any combination thereof. In one example, collision awareness system 140 may be implemented simultaneously on multiple devices (such as data server 132 and vehicle 111).
When implemented in software, the functionality attributed to the apparatus and systems described herein may be embodied as instructions on a computer-readable medium, such as Random Access Memory (RAM), Read Only Memory (ROM), non-volatile random access memory (NVRAM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory, magnetic data storage media, optical data storage media, and so forth. The instructions may be executable to support one or more aspects of the functionality described in this disclosure. The computer readable medium may be non-transitory.
Various aspects of the present disclosure have been described. These and other aspects are within the scope of the following claims.
Claims (10)
1. A ground collision awareness system, the system comprising:
a memory configured to store:
historical navigation route data for one or more reference vehicles, wherein the historical navigation route data is based on transponder location data; and
a plurality of airport guidance features for one or more airport locations, wherein the airport guidance features comprise guidance marker information; and
a processor in communication with the memory, wherein the processor is configured to:
determining a predicted path for a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path being predicted using the guidance marker information and the second portion of the predicted path being predicted using the historical navigation route data;
determining a predicted position of the first vehicle along the predicted path at an expected time;
determining a predicted position of a second vehicle at approximately the same expected time;
performing a comparison of a first vehicle envelope of the first vehicle and a second vehicle envelope of the second vehicle at the predicted positions;
identifying an overlap of the first vehicle envelope and the second vehicle envelope; and
determining a predicted collision zone for the first vehicle and the second vehicle at the expected time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
2. The system of claim 1, wherein to determine the predicted path of the first vehicle, the processor is further configured to:
determining a first initial predicted path for the first vehicle using the guidance marker information and the historical navigation route data;
identifying a first likelihood that the first vehicle traveled the first initial predicted path;
determining a second initial predicted path for the first vehicle using the guidance marker information and the historical navigation route data;
identifying a second likelihood that the first vehicle traveled the second initial predicted path; and
determining the predicted path of the first vehicle based on a combination of the first initial predicted path of the first vehicle and the second initial predicted path of the first vehicle, the combination based at least in part on the first likelihood and the second likelihood.
3. The system of claim 1, wherein the processor is further configured to:
identifying permission information from a traffic controller, the permission information defining one or more target markers for the first vehicle;
identifying one or more targeting features from the plurality of airport guidance features based at least in part on the permission information, the one or more targeting features configured to provide guidance through a particular portion of a particular airport location toward the one or more target markers;
identifying the first portion of the predicted path using the one or more target-aiming features; and is
Identifying the second portion of the predicted path using the permission information and the historical navigation route data, the second portion of the predicted path including a destination location of the first vehicle defined by the permission information as a target marker of the one or more target markers.
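Claim 3's selection of targeting features can be pictured as a search over connected airport guidance features toward the cleared target marker. The taxiway graph and feature names below are hypothetical illustrations, not taken from the patent:

```python
from collections import deque

# Hypothetical taxiway connectivity for one airport surface; the node
# names (A1, A2, B1, RWY27) are illustrative guidance-feature
# identifiers, not drawn from the patent.
TAXIWAY_GRAPH = {
    "A1": ["A2"],
    "A2": ["A3", "B1"],
    "A3": ["RWY27"],
    "B1": ["RWY27"],
}

def features_toward_target(start, target, graph):
    """Breadth-first search for a chain of guidance features leading
    from the vehicle's current feature to the cleared target marker."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # target marker not reachable from the current feature
```

The returned chain would correspond to the first, guidance-marker-driven portion of the predicted path; the historical-data-driven second portion takes over where markings end.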
4. The system of claim 1, wherein the processor is further configured to:
receiving ground sensor data for the first vehicle or the second vehicle;
determining a current location of the first vehicle or the second vehicle from the ground sensor data; and
identifying the predicted location of the first vehicle or the predicted location of the second vehicle using the current location of the first vehicle or the second vehicle.
5. The system of claim 1, wherein the processor is further configured to:
determining movement information of the first vehicle at a current location, the movement information including speed information of the first vehicle; and
identifying the predicted location of the first vehicle using the movement information and the historical navigation route data.
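Claim 5's use of movement information admits a simple dead-reckoning reading: project the current location forward using speed and heading. This sketch assumes constant speed and heading over the prediction horizon, which the claim does not require:

```python
import math

def predict_position(x, y, speed_mps, heading_deg, dt_s):
    """Dead-reckon a vehicle's surface position dt_s seconds ahead.

    (x, y) is the current location in a local east/north frame (metres),
    speed_mps is ground speed, and heading_deg is degrees clockwise from
    north. Assumes the vehicle holds speed and heading for dt_s seconds.
    """
    heading = math.radians(heading_deg)
    return (
        x + speed_mps * dt_s * math.sin(heading),  # east component
        y + speed_mps * dt_s * math.cos(heading),  # north component
    )
```

In the claimed system this kinematic estimate would be blended with the historical navigation route data rather than used alone.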
6. The system of claim 1, wherein the processor is further configured to:
receiving a current position of the first vehicle; and
transmitting the predicted collision zone from a remote server to the first vehicle using a wireless network.
7. The system of claim 1, wherein the processor is further configured to:
identifying a predicted path for the second vehicle using a predicted current location of the second vehicle, the predicted current location of the second vehicle based on the historical navigation route data; and
determining the predicted location of the second vehicle using the predicted current location of the second vehicle and one or more of: the historical navigation route data, the plurality of airport guidance features, or permission information for the second vehicle.
8. The system of claim 1, wherein the processor is further configured to:
transmitting the predicted collision zone to a device corresponding to at least one of: the first vehicle, the second vehicle, a third vehicle configured to transport the first vehicle or the second vehicle, or a remote server.
9. The system of claim 1, wherein the first vehicle comprises the ground collision awareness system, wherein the first portion comprises a taxiway comprising ground guidance markers, and wherein the second portion comprises an airport apron area.
10. A method of predicting a vehicle collision zone, the method comprising:
obtaining, by a processing circuit of a ground collision awareness system, historical navigation route data for one or more reference vehicles, the historical navigation route data based on transponder location data;
identifying, by the processing circuit, a plurality of airport guidance features for a particular airport location, the airport guidance features comprising guidance marker information;
determining, by the processing circuit, a predicted path for a first vehicle, the predicted path comprising a first portion and a second portion, the first portion of the predicted path predicted using the guidance marker information and the second portion of the predicted path predicted using the historical navigation route data;
determining, by the processing circuit, a predicted position of the first vehicle along the predicted path at an expected time;
determining, by the processing circuit, a predicted position of a second vehicle at approximately the same expected time;
performing, by the processing circuit, a comparison of a first vehicle envelope of the first vehicle and a second vehicle envelope of the second vehicle at the predicted location;
identifying, by the processing circuit, an overlap of the first vehicle envelope and the second vehicle envelope; and
determining, by the processing circuit, a predicted collision zone of the first vehicle and the second vehicle at the expected time based at least in part on the overlap of the first vehicle envelope and the second vehicle envelope.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN202011006508 | 2020-02-14 | ||
IN202011006508 | 2020-02-14 | ||
US17/070,830 | 2020-10-14 | ||
US17/070,830 US11854418B2 (en) | 2020-02-14 | 2020-10-14 | Collision awareness using historical data for vehicles |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113838309A true CN113838309A (en) | 2021-12-24 |
Family
ID=74418171
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110157742.5A Pending CN113838309A (en) | 2020-02-14 | 2021-02-04 | Collision perception using historical data of vehicles |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3866139A1 (en) |
CN (1) | CN113838309A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220366794A1 (en) * | 2021-05-11 | 2022-11-17 | Honeywell International Inc. | Systems and methods for ground-based automated flight management of urban air mobility vehicles |
US11508244B2 (en) * | 2020-03-13 | 2022-11-22 | Saab Ab | Method, computer program product, system and craft for collision avoidance |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12055951B2 (en) | 2022-03-01 | 2024-08-06 | Rockwell Collins, Inc. | High fidelity teammate state estimation for coordinated autonomous operations in communications denied environments |
US12007774B2 (en) | 2022-03-25 | 2024-06-11 | Rockwell Collins, Inc. | System and method for guidance integrity monitoring for low-integrity modules |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102629423A (en) * | 2011-02-07 | 2012-08-08 | 霍尼韦尔国际公司 | Airport taxiway collision alerting system |
CN103514760A (en) * | 2012-06-26 | 2014-01-15 | 霍尼韦尔国际公司 | Methods and systems for taxiway traffic alerting |
US20160196754A1 (en) * | 2015-01-06 | 2016-07-07 | Honeywell International Inc. | Airport surface monitoring system with wireless network interface to aircraft surface navigation system |
CN108074010A (en) * | 2016-11-15 | 2018-05-25 | 波音公司 | Motor-driven prediction to surrounding traffic |
US20190228668A1 (en) * | 2018-01-24 | 2019-07-25 | Honeywell International Inc. | Method and system for automatically predicting a surface movement path for an aircraft based on historical trajectory data |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11260838B2 (en) * | 2018-06-15 | 2022-03-01 | Honeywell International Inc. | Methods and systems for vehicle contact prediction and auto brake activation |
- 2021-01-28: EP application EP21154155.2A filed, published as EP3866139A1 (pending)
- 2021-02-04: CN application CN202110157742.5A filed, published as CN113838309A (pending)
Also Published As
Publication number | Publication date |
---|---|
EP3866139A1 (en) | 2021-08-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11854418B2 (en) | Collision awareness using historical data for vehicles | |
EP3866139A1 (en) | Collision awareness using historical data for vehicles | |
US11900823B2 (en) | Systems and methods for computing flight controls for vehicle landing | |
US9355564B1 (en) | Position determination systems and methods for a plurality of aircraft | |
EP3043331A2 (en) | Airport surface monitoring system with wireless network interface to aircraft surface navigation system | |
EP3866138A1 (en) | Systems and methods for automated cross-vehicle navigation using sensor data fusion | |
US8400347B2 (en) | Device and method for monitoring the location of aircraft on the ground | |
US20060214816A1 (en) | Airport runway collision avoidance system and method | |
US20220335841A1 (en) | Systems and methods for strategic smart route planning service for urban airspace users | |
US11847925B2 (en) | Systems and methods to display an elevated landing port for an urban air mobility vehicle | |
US12067889B2 (en) | Systems and methods for detect and avoid system for beyond visual line of sight operations of urban air mobility in airspace | |
KR101301169B1 (en) | Navigation system used in transportation for airport or harbor | |
US11763555B2 (en) | System and method for ground obstacle detection and database management | |
US20180181125A1 (en) | On-ground vehicle collision avoidance utilizing unmanned aerial vehicles | |
US20220309931A1 (en) | Systems and methods for guiding vehicles to charging points | |
WO2004114252A1 (en) | Airfield vehicle monitoring system and respective vehicle | |
CN111512354B (en) | Aircraft traffic control method | |
EP4063987A1 (en) | Systems and methods for identifying landing zones for unmanned aircraft | |
US20230410667A1 (en) | Autonomous air taxi separation system and method | |
EP4080482A1 (en) | System and method for obstacle detection and database management | |
US11994880B2 (en) | Methods and systems for unmanned aerial vehicles to detect and avoid other flying machines | |
EP3859712A1 (en) | Collision awareness using cameras mounted on a vehicle | |
EP4064245A1 (en) | Systems and methods for detect and avoid system for beyond visual line of sight operations of urban air mobility in airspace | |
CA3178300C (en) | Autonomous air taxi separation system and method |
EP4080481A1 (en) | Systems and methods to display an elevated landing port for an urban air mobility vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||