US20210342750A1 - Information processing apparatus, information processing method, and storage medium
- Publication number
- US20210342750A1 (application US16/645,966)
- Authority
- US
- United States
- Prior art keywords
- information
- boarding
- passenger
- management server
- person
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06Q10/02—Reservations, e.g. for tickets, services or events
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
- G06K9/00288
- G06V40/172—Human faces: classification, e.g. identification
- G06V40/70—Multimodal biometrics, e.g. combining information from different biometric modalities
- G07B15/00—Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points
- G07C9/37—Individual registration on entry or exit, not involving the use of a pass, in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
- G07C9/38—Individual registration on entry or exit, not involving the use of a pass, with central registration
Definitions
- the present invention relates to an information processing apparatus, an information processing method, and a storage medium.
- Patent Literature 1 discloses a ticketless boarding system that performs a procedure with face authentication at a plurality of checkpoints (a check-in lobby, a security inspection area, a boarding gate, or the like) by using biometric information (face image) of a passenger.
- As illustrated in Patent Literature 1 as an example, it is expected that throughput in an airport can be improved by facilitating the use of terminals having a face authentication function. However, the conventional system illustrated as an example in Patent Literature 1 does not assume supporting a procedure at a boarding gate for a passenger waiting around the boarding gate.
- the present invention intends to provide an information processing apparatus, an information processing method, and a storage medium that support a procedure of a passenger at a boarding gate.
- an information processing apparatus including: an acquisition unit that acquires, from a captured image obtained by capturing a passenger who is boarding an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger; a specifying unit that specifies boarding reservation information regarding the boarding by using the acquired biometric information; and an output unit that outputs information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
- an information processing method including steps of: acquiring, from a captured image obtained by capturing a passenger who is boarding an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger; specifying boarding reservation information regarding the boarding by using the acquired biometric information; and outputting information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
- a storage medium storing a program that causes a computer to perform steps of: acquiring, from a captured image obtained by capturing a passenger who is boarding an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger; specifying boarding reservation information regarding the boarding by using the acquired biometric information; and outputting information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
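- The three steps summarized above can be pictured, very roughly, by the following Python sketch. The class and method names (BoardingGateSupport, face_matcher.extract_feature, reservation_store.lookup, and so on) are illustrative assumptions and do not appear in the disclosure; the matcher and reservation store are assumed to be supplied by the surrounding system.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class BoardingReservation:
    reservation_number: str
    airline_code: str
    flight_number: str
    seat_class: str
    boarding_group: int


class BoardingGateSupport:
    """Composes the three units summarized above: acquisition, specifying, output."""

    def __init__(self, face_matcher, reservation_store):
        self.face_matcher = face_matcher            # extracts and identifies face features
        self.reservation_store = reservation_store  # maps an identified person to a reservation

    def support_procedure(self, captured_image) -> Optional[dict]:
        # Acquisition: obtain biometric information from the captured image of a passenger
        # who has not yet passed through the boarding gate.
        feature = self.face_matcher.extract_feature(captured_image)
        if feature is None:
            return None
        # Specifying: determine the boarding reservation information by biometric matching.
        person_id = self.face_matcher.identify(feature)
        reservation: Optional[BoardingReservation] = self.reservation_store.lookup(person_id)
        if reservation is None:
            return None
        # Output: information used for supporting the procedure at the boarding gate,
        # e.g. the flight and the boarding group to which the passenger should be guided.
        return {"person_id": person_id,
                "flight": reservation.flight_number,
                "boarding_group": reservation.boarding_group}
```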
- an information processing apparatus, an information processing method, and a storage medium that support a procedure of a passenger at a boarding gate can be provided.
- FIG. 1 is a block diagram illustrating an example of an overall configuration of an information processing system in a first example embodiment.
- FIG. 2 is a diagram illustrating one example of the arrangement of boarding gate apparatuses, operation terminals, and cameras in the first example embodiment.
- FIG. 3 is a diagram illustrating one example of information stored in a token ID information database in the first example embodiment.
- FIG. 4 is a diagram illustrating one example of information stored in a passage history information database in the first example embodiment.
- FIG. 5 is a diagram illustrating one example of information stored in an operation information database in the first example embodiment.
- FIG. 6 is a diagram illustrating one example of information stored in a reservation information database in the first example embodiment.
- FIG. 7 is a block diagram illustrating one example of a hardware configuration of a management server in the first example embodiment.
- FIG. 8 is a block diagram illustrating one example of a hardware configuration of a check-in terminal in the first example embodiment.
- FIG. 9 is a block diagram illustrating one example of a hardware configuration of an automatic baggage drop-off machine in the first example embodiment.
- FIG. 10 is a block diagram illustrating one example of a hardware configuration of a security inspection apparatus in the first example embodiment.
- FIG. 11 is a block diagram illustrating one example of a hardware configuration of an automated gate apparatus in the first example embodiment.
- FIG. 12 is a block diagram illustrating one example of a hardware configuration of a boarding gate apparatus in the first example embodiment.
- FIG. 13 is a block diagram illustrating one example of a hardware configuration of an operation terminal in the first example embodiment.
- FIG. 14 is a sequence diagram illustrating one example of the process in the reservation system, the check-in terminal, and the management server in the first example embodiment.
- FIG. 15 is a sequence diagram illustrating one example of the process in the reservation system, the automatic baggage drop-off machine, and the management server in the first example embodiment.
- FIG. 16 is a sequence diagram illustrating one example of the process in the reservation system, the security inspection apparatus, and the management server in the first example embodiment.
- FIG. 17 is a sequence diagram illustrating one example of the process in the reservation system, the automated gate apparatus, and the management server in the first example embodiment.
- FIG. 18 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment.
- FIG. 19 is a diagram illustrating one example of the operation screen displayed on the operation terminal in the first example embodiment.
- FIG. 20 is a diagram illustrating one example of a passenger list screen displayed on the operation terminal in the first example embodiment.
- FIG. 21 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment.
- FIG. 22 is a diagram illustrating one example of a guidance instruction screen for priority boarding displayed on the operation terminal in the first example embodiment.
- FIG. 23 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment.
- FIG. 24 is a diagram illustrating one example of a guide instruction screen for a waiting lane displayed on the operation terminal in the first example embodiment.
- FIG. 25 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment.
- FIG. 26 is a diagram illustrating one example of a passenger inquiry screen displayed on the operation terminal in the first example embodiment.
- FIG. 27 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment.
- FIG. 28 is a diagram illustrating one example of an extraction condition entry screen displayed on the operation terminal in the first example embodiment.
- FIG. 29 is a diagram illustrating one example of a passenger extraction result screen displayed on the operation terminal in the first example embodiment.
- FIG. 30 is a sequence diagram illustrating one example of the process in the reservation system, the boarding gate apparatus, and the management server in the first example embodiment.
- FIG. 31 is a flowchart illustrating one example of a check process of an accompanying person in a second example embodiment.
- FIG. 32 is a diagram illustrating one example of a check instruction screen of an accompanying person displayed on the operation terminal in the second example embodiment.
- FIG. 33 is a diagram illustrating one example of a check instruction screen of an accompanying person displayed on the operation terminal in the second example embodiment.
- FIG. 34 is a block diagram illustrating an example of an overall configuration of an information processing system in a third example embodiment.
- FIG. 35 is a sequence diagram illustrating one example of the process in the operation terminal, the management server, and the camera in the third example embodiment.
- FIG. 36 is a diagram illustrating one example of a passenger search result screen displayed on the operation terminal in the third example embodiment.
- FIG. 37 is a block diagram illustrating an example of an overall configuration of an information processing apparatus in a fourth example embodiment.
- FIG. 1 is a block diagram illustrating an example of the overall configuration of an information processing system 1 in the present example embodiment.
- the information processing system 1 is a computer system that manages and supports operations regarding immigration inspection procedures performed on a user (hereafter referred to as a “passenger”) U at an airport A.
- the information processing system 1 is operated by, for example, a public institution such as an immigration administration office, or by a contractor entrusted with the operation by such an institution.
- a reservation system 2 is a computer system provided in an airline company.
- the reservation system 2 includes a reservation information database (DB) 3 that manages boarding reservation information. Note that, although only one reservation system 2 is illustrated for simplified illustration in FIG. 1 , the reservation system 2 is provided for each of a plurality of airline companies.
- a check-in terminal 20 , an automatic baggage drop-off machine 30 , a security inspection apparatus 40 , an automated gate apparatus 50 , and a boarding gate apparatus 60 are connected to a common management server 10 via the network NW 1 , respectively.
- the security inspection apparatus 40 , the automated gate apparatus 50 , and the boarding gate apparatus 60 are installed in a security area SA illustrated by a dashed line.
- the check-in terminal 20 , the automatic baggage drop-off machine 30 , the security inspection apparatus 40 , the automated gate apparatus 50 , and the boarding gate apparatus 60 are connected to a server (not illustrated) via the network NW 2 , respectively.
- an operation terminal 70 used by a staff member S is connected to the networks NW 1 and NW 2 via access points (not illustrated).
- the networks NW 1 and NW 2 are formed of a local area network (LAN) including LAN of the airport A, a wide area network (WAN), a mobile communication network, or the like.
- the connection scheme is not limited to a wired scheme but may be a wireless scheme.
- the networks NW 1 and NW 2 are different networks from each other. That is, in the present example embodiment, the information processing system 1 is not directly connected to the reservation system 2 .
- the management server 10 is provided in a facility of an airport company that runs the airport A, for example.
- the management server 10 may be a cloud server instead of a server installed in a facility where an operation is actually provided.
- the management server 10 is not necessarily required to be a single server but may be formed as a server group including a plurality of servers.
- inspection procedures at departure from a country at the airport A are sequentially performed at five touch points.
- the relationship between each apparatus and each touch point will be described below.
- the check-in terminal 20 is installed in a check-in lobby (hereafter, referred to as “touch point P 1 ”) in the airport A.
- the check-in terminal 20 is a self-service terminal by which the user U performs a check-in procedure (boarding procedure) by himself/herself. Upon completion of the procedure at the touch point P 1 , the user U moves to a baggage counter or a security inspection area.
- the automatic baggage drop-off machine 30 is installed at a baggage counter (hereafter, referred to as “touch point P 2 ”) in the airport A.
- the automatic baggage drop-off machine 30 is a self-service terminal which is operated by the user U by himself/herself to perform a procedure of dropping off baggage that will not be carried into the cabin of the airplane (baggage drop-off procedure).
- Upon completion of the procedure at the touch point P 2, the user U moves to a security inspection area. Note that, when the user U does not drop off his/her baggage, the procedure at the touch point P 2 is omitted.
- the security inspection apparatus 40 is installed in a security inspection area (hereafter, referred to as “touch point P 3 ”) in the airport A.
- the security inspection apparatus 40 is an apparatus that uses a metal detector to check whether or not the user U wears a metal that may be a dangerous object.
- the term “security inspection apparatus” in the present example embodiment is not limited to a metal detector, and also encompasses an X-ray inspection apparatus that uses X-rays to check whether or not there is a dangerous object in carry-on baggage or the like, a terminal apparatus of a passenger reconciliation system (PRS) that determines whether or not to permit passage of the user U at the entrance of a security inspection area, and the like.
- the user U who completes a check-in procedure or an automatic baggage drop-off procedure goes through a security inspection procedure by the security inspection apparatus 40 in the security inspection area. Upon completion of the procedure at the touch point P 3 , the user U moves to an immigration area.
- the automated gate apparatus 50 is installed in an immigration area (hereafter, referred to as “touch point P 4 ”) in the airport A.
- the automated gate apparatus 50 is an apparatus that automatically performs an immigration procedure of the user U. Upon completion of the procedure at the touch point P 4 , the user U moves to a departure area in which a duty-free shop and a boarding gate are provided.
- the boarding gate apparatus 60 is a passage control apparatus installed for each boarding gate (hereafter, referred to as “touch point P 5 ”) in the departure area.
- the boarding gate apparatus 60 confirms that the user U is a passenger of the airplane who is allowed to board via the boarding gate. Upon completion of the procedure at the touch point P 5, the user U boards the airplane and departs from the country.
- the operation terminal 70 is a terminal apparatus used by the staff member S for its operation.
- Examples of the operation terminal 70 include a personal computer, a tablet terminal, a smartphone, and the like, but the operation terminal 70 is not limited thereto.
- the operation terminal 70 receives information transmitted from the management server 10 and displays the information on a screen.
- the management server 10 of the present example embodiment transmits information used for supporting a procedure performed by a passenger at a boarding gate to the operation terminal 70 . Details of the information will be described later.
- the cameras 80 are image capture devices arranged in the peripheral regions (adjacent regions) of the boarding gates, and each of the plurality of cameras 80 captures the peripheral region of the corresponding boarding gate.
- Each camera 80 is attached to a ceiling, a wall, a pillar, or the like, for example, so as to be able to capture a face of the user U who is present in the peripheral region of the boarding gate and waiting for boarding.
- the type of the camera 80 may be any of a fixed type and a movable type.
- FIG. 2 is a diagram illustrating one example of the arrangement of the boarding gate apparatuses 60 , the operation terminals 70 , and the cameras 80 .
- This example illustrates a case where the plurality of cameras 80 are installed in the peripheral region of the boarding gate and capture passengers at various capturing angles. It is preferable that each operation terminal 70 can selectively switch captured images captured by the plurality of cameras 80 on an operation screen.
- In this example, two types of the operation terminal 70 used by the staff member S are illustrated: a mobile type and a fixed type.
- L 1 to L 3 each represent an example of a waiting area (waiting lane) in which the users U wait in line before boarding.
- an airline company categorizes the users U into a plurality of groups in advance based on various conditions such as a seat class, a membership category, accompanying person information, a seat position, or the like and guides the users U to the boarding gate in predetermined group order. For example, a passenger accompanying an infant or a young child, a passenger whose seat class is the first class, a passenger recognized as an upper class member by the airline company, or the like is categorized into a group having relatively higher priority in boarding guide. The group having high priority is to receive a priority boarding service.
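- As a hedged illustration of the grouping described above, the following sketch assigns a boarding group from a few of the listed conditions. The group numbers, condition values, and precedence are assumptions for illustration only; an actual airline defines its own rules.

```python
def boarding_group(seat_class: str, membership: str, accompanying: str) -> int:
    """Return a boarding group number; smaller numbers are guided to board earlier."""
    if accompanying in ("infant", "young child"):
        return 1                      # passengers needing assistance receive priority boarding
    if seat_class == "first":
        return 1
    if seat_class == "business" or membership == "upper":
        return 2
    return 3                          # general boarding


# Example: a business-class passenger without an accompanying infant falls into group 2.
assert boarding_group("business", "regular", "none") == 2
```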
- the management server 10 has a token ID information DB 11 , a passage history information DB 12 , and an operation information DB 13 .
- the database included in the management server 10 is not limited to these databases.
- FIG. 3 is a diagram illustrating one example of information stored in the token ID information DB 11 .
- the token ID information DB 11 has data items of “token ID”, “group ID”, “feature amount”, “registered face image”, “token issuance time”, “token issuance device name”, “invalid flag”, “invalidated time”, and “accompanying person ID”.
- the token ID is an identifier that uniquely identifies ID information.
- the token ID is temporarily issued on the condition that a passport face image read from a passport at the touch point P 1 matches a face image obtained by capturing the user U holding the passport. Then, when the user U finishes the procedure at the touch point P 5 (boarding gate), the token ID is invalidated. That is, the token ID is a one-time ID having a limited lifetime.
- the group ID is an identifier used for grouping ID information.
- the feature amount is a value extracted from biometric information.
- the registered face image is a face image registered for the user U.
- the term of biometric information in the present example embodiment means a face image and a feature amount extracted from the face image.
- the biometric information is not limited to a face image and a face feature amount. That is, a fingerprint image, a palm-print image, a pinna image, an iris image, or the like may be used as the biometric information of the user U to perform biometric authentication.
- the token issuance time is the time when the management server 10 issues a token ID.
- the token issuance device name is a name of a device from which a registered face image that triggers issuance of a token ID is acquired.
- the invalid flag is flag information indicating whether or not the token ID is currently valid. Once a token ID is issued, the invalid flag becomes a value of “1” indicating that the token ID is valid. Further, once a predetermined condition is satisfied, the invalid flag is updated to a value of “0” indicating that the token ID is invalid.
- the invalidated time is a timestamp of when the invalid flag was updated to indicate that the token ID is invalid.
- the accompanying person ID is a token ID issued for a person who boards an airplane with support from another passenger, for example, a person such as an infant or a young child (hereafter referred to as a “supported person”).
- a person whose record is associated with an accompanying person ID is a person on the supporting side, such as a guardian (for example, a father or a mother) or a helper (hereafter referred to as a “supporting person”).
- the token ID for a supported person is issued, for example, at the same time as when the supporting person first performs a procedure.
- In the present example embodiment, a case where a single supported person is associated with a single supporting person will be described. However, a plurality of supported persons may be associated with a single supporting person.
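- The data items of the token ID information DB 11 described above (FIG. 3) can be pictured as the following record sketch. The field types and default values are assumptions; the description does not specify a storage format.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class TokenRecord:
    token_id: str                    # one-time ID, invalidated after the boarding gate (P5)
    group_id: Optional[str]          # identifier used for grouping ID information
    feature_amount: bytes            # feature value extracted from the registered face image
    registered_face_image: bytes     # face image captured at token issuance (touch point P1)
    token_issuance_time: datetime
    token_issuance_device: str       # device that acquired the registered face image
    invalid_flag: int = 1            # "1" while the token ID is valid, "0" once invalidated
    invalidated_time: Optional[datetime] = None
    accompanying_person_id: Optional[str] = None  # token ID of a supported person (e.g. an infant)
```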
- FIG. 4 is a diagram illustrating one example of information stored in the passage history information DB 12 .
- the passage history information DB 12 has data items of “passage history ID”, “token ID”, “passage time”, “device name”, “operation system type”, and “passage touch point”.
- the passage history ID is an identifier that uniquely identifies passage history information.
- the passage time is a timestamp when a passenger passes through the touch points P 1 to P 5 .
- the device name is a machine name of a terminal used in procedures at the touch points P 1 to P 5 .
- the operation system type is a type of the operation system to which a terminal belongs.
- the passage touch point is each name of the touch points P 1 to P 5 that a passenger has passed through. Note that the management server 10 can extract the passage history information on a token ID basis to recognize up to which touch point the user U completed the procedure.
- FIG. 5 is a diagram illustrating one example of information stored in the operation information DB 13 .
- the operation information DB 13 has data items of “token ID”, “reservation number”, “airline code”, and “operation information”.
- the reservation number is an identifier that uniquely identifies reservation information of boarding to an airplane.
- the airline code is an identifier that uniquely identifies an airline company.
- the operation information is arbitrary information obtained by an operation at each touch point.
- FIG. 6 is a diagram illustrating one example of information stored in the reservation information DB 3 .
- the reservation information DB 3 has data items of “reservation number”, “airline code”, “passenger name”, “departure place”, “destination place”, “flight number”, “date of flight”, “seat number”, “seat class” (for example, first class/business class/economy class), “nationality”, “passport number”, “family name”, “first name”, “date of birth”, “sex”, “membership category”, “with or without accompanying person”, and “accompanying person category”.
- the operation information DB 13 and the reservation information DB 3 are associated with each other by a reservation number and an airline code. Specifically, once a terminal apparatus at each touch point (the check-in terminal 20 or the like) reads a reservation number and an airline code from an airline ticket medium presented by a passenger, the terminal apparatus can inquire boarding reservation information from the reservation system 2 of an airline company corresponding to an airline code based on the reservation number. Note that a method of inquiring boarding reservation information from the reservation system 2 is not limited thereto.
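- The association described above (a token ID resolves to a reservation number and an airline code in the operation information DB 13, and the pair is then used to query the reservation system 2 of the corresponding airline) can be sketched as follows. The in-memory dictionaries and sample values are stand-ins for the databases and are assumptions for illustration.

```python
# Illustrative resolution chain: token ID -> (reservation number, airline code) in the
# operation information DB 13, then an inquiry to the reservation system 2 of the airline
# identified by the airline code. Dictionaries and sample values are assumptions.
OPERATION_INFO_DB = {
    "T-0001": {"reservation_number": "RSV123456", "airline_code": "NA"},
}

RESERVATION_SYSTEMS = {
    "NA": {"RSV123456": {"passenger_name": "(sample passenger)", "flight_number": "NA101",
                         "seat_number": "12A", "seat_class": "business"}},
}


def inquire_boarding_reservation(token_id: str):
    """Return boarding reservation information for a token ID, or None if it cannot be resolved."""
    operation = OPERATION_INFO_DB.get(token_id)
    if operation is None:
        return None
    airline = RESERVATION_SYSTEMS.get(operation["airline_code"], {})
    return airline.get(operation["reservation_number"])


# Example: resolving the sample token ID yields the reservation held by airline "NA".
print(inquire_boarding_reservation("T-0001"))
```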
- FIG. 7 is a block diagram illustrating one example of a hardware configuration of the management server 10 .
- the management server 10 has a central processing unit (CPU) 101 , a random access memory (RAM) 102 , a storage device 103 , and a communication I/F 104 . Each device is connected to a bus line 105 .
- the CPU 101 is a processor that has a function of performing a predetermined operation in accordance with a program stored in the storage device 103 and controlling each component of the management server 10 .
- the RAM 102 is formed of a volatile storage medium and provides a temporary memory area required for the operation of the CPU 101 .
- the storage device 103 is formed of a storage medium such as a nonvolatile memory, a hard disk drive, or the like and functions as a storage unit.
- the storage device 103 stores a program executed by the CPU 101 , data referenced by the CPU 101 when the program is executed, or the like.
- the communication I/F 104 is a communication interface based on a standard such as Ethernet (registered trademark), Wi-Fi (registered trademark), 4G, or the like, which is a module used for communicating with the check-in terminal 20 or the like.
- FIG. 8 is a block diagram illustrating one example of a hardware configuration of the check-in terminal 20 .
- the check-in terminal 20 has a CPU 201 , a RAM 202 , a storage device 203 , a communication I/F 204 , an input device 206 , a display device 207 , a medium reading device 208 , and a biometric information acquisition device 209 .
- Each device is connected to a bus line 205 .
- the input device 206 is a pointing device such as a touch panel, a keyboard, or the like, for example.
- the display device 207 is a liquid crystal display device, an organic light emitting diode (OLED) display device, or the like, and is used for displaying moving images, still images, text, and the like.
- the input device 206 and the display device 207 are integrally formed as a touch panel.
- the medium reading device 208 is a device that reads a passport or an airline ticket medium of the user U and acquires information recorded on the passport or the airline ticket.
- the airline ticket medium may be, for example, a paper airline ticket, a mobile terminal that displays a duplicate of an e-ticket, or the like.
- the medium reading device 208 is formed of a code reader, an image scanner, a contactless integrated circuit (IC) reader, an optical character reader (OCR) device, or the like, for example, and acquires information from various media held over the reading unit thereof.
- the biometric information acquisition device 209 is a device that acquires a face image of the user U as biometric information of the user U.
- the biometric information acquisition device 209 is a digital camera that captures a face of the user U standing in front of the check-in terminal 20 , and the biometric information acquisition device 209 captures a face of the user U and acquires a face image.
- FIG. 9 is a block diagram illustrating one example of a hardware configuration of the automatic baggage drop-off machine 30 .
- the automatic baggage drop-off machine 30 has a CPU 301 , a RAM 302 , a storage device 303 , a communication I/F 304 , an input device 306 , a display device 307 , a medium reading device 308 , a biometric information acquisition device 309 , a baggage transport device 310 , and an output device 311 . Each device is connected to a bus line 305 .
- when the identity verification of the user U is successful, the baggage transport device 310 transports the baggage of the user U in order to load the baggage onto the airplane that the user U will board.
- the baggage transport device 310 transports baggage that is placed on a reception part by the user U and attached with a baggage tag to a cargo handling section.
- the output device 311 is a device that outputs a baggage tag to be attached to dropped-off baggage. Further, the output device 311 outputs a baggage claim tag required for claiming baggage after arriving at the destination. Note that a baggage tag or a baggage claim tag is associated with at least one of passport information and boarding information.
- FIG. 10 is a block diagram illustrating one example of a hardware configuration of the security inspection apparatus 40 .
- the security inspection apparatus 40 has a CPU 401 , a RAM 402 , a storage device 403 , a communication I/F 404 , an input device 406 , a display device 407 , a medium reading device 408 , a biometric information acquisition device 409 , and a metal detector gate 410 .
- Each device is connected to a bus line 405 .
- the metal detector gate 410 is a gate type metal detector and detects a metal worn by the user U passing through the metal detector gate 410 .
- FIG. 11 is a block diagram illustrating one example of a hardware configuration of the automated gate apparatus 50 .
- the automated gate apparatus 50 has a CPU 501 , a RAM 502 , a storage device 503 , a communication I/F 504 , an input device 506 , a display device 507 , a medium reading device 508 , a biometric information acquisition device 509 , and a gate 511 . Each device is connected to a bus line 505 .
- the automated gate apparatus 50 arranged in an entry inspection site has the same hardware configuration as the automated gate apparatus 50 arranged in the immigration area.
- under the control of the CPU 501, the gate 511 transitions from a closed state, in which it blocks passage of the user U during standby, to an opened state, in which it permits passage of the user U, when the identity verification of the user U at the automated gate apparatus 50 is successful and the user U completes the immigration procedure.
- the scheme of the gate 511 is not particularly limited and may be, for example, a flapper gate in which a flapper provided on one side of the pathway, or flappers provided on both sides of the pathway, are opened and closed, a turnstile gate in which three bars rotate, or the like.
- FIG. 12 is a block diagram illustrating one example of a hardware configuration of the boarding gate apparatus 60 .
- the boarding gate apparatus 60 has a CPU 601 , a RAM 602 , a storage device 603 , a communication I/F 604 , an input device 606 , a display device 607 , a biometric information acquisition device 609 , and a gate 611 .
- Each device is connected to a bus line 605 .
- FIG. 13 is a block diagram illustrating one example of a hardware configuration of the operation terminal 70 .
- the operation terminal 70 has a CPU 701 , a RAM 702 , a storage device 703 , a communication I/F 704 , an input device 706 , and a display device 707 . Each device is connected to a bus line 705 .
- FIG. 14 is a sequence diagram illustrating one example of the process in the reservation system 2 , the check-in terminal 20 , and the management server 10 . This process is performed when the user U uses the check-in terminal 20 to perform a check-in procedure.
- the check-in terminal 20 determines whether or not a passport of the user U is held over a reading unit (not illustrated) of the medium reading device 208 (step S 101 ), and stands by until a passport is held over (step S 101 , NO).
- the check-in terminal 20 acquires passport information on the user U from the passport that is held over (step S 102 ).
- the acquired passport information includes a passport face image of the user U, identity verification information, a passport number, information on a passport issuance country, or the like.
- the check-in terminal 20 captures a face of the user U by using the biometric information acquisition device 209 (step S 103 ) and transmits the face image and the passport information to the management server 10 (step S 104 ). Note that it is preferable to display a screen used for obtaining consent of the user U before capturing a face image.
- the management server 10 matches, at 1:1, a face image recorded on the passport of the user U (hereafter, referred to as “passport face image”) with a face image captured by the check-in terminal 20 (hereafter, referred to as “target face image”) (step S 105 ).
- the management server 10 issues a token ID (step S 107 ).
- the token ID is set to a unique value based on date and time or a sequence number at a process, for example.
- the management server 10 uses the target face image as a registered face image and registers a relationship between the token ID and the registered face image in the token ID information DB 11 (step S 108 ).
- the reasons why a face image captured on site (target face image) is used as the registered face image are, for example, that the lifecycle of a token ID ends within the day and that a captured image is closer in quality (appearance) to the images captured in the subsequent authentication processes than a passport face image is.
- a passport face image may be set as a registered face image (registered biometric information) instead of a target face image (captured face image).
- When the lifecycle of a token ID spans a long term (for example, when a token ID remains valid for a certain lifecycle because the user U holds a membership or the like in airline services), a face image from a passport or a license card may be set as the registered face image.
- If the matching is successful, the management server 10 transmits the issued token ID and the matching result of the successful matching to the check-in terminal 20 (step S 109).
- Contrarily, if the matching is unsuccessful, the management server 10 transmits the matching result of the unsuccessful matching to the check-in terminal 20 (step S 110).
- Based on the matching result of the successful matching received from the management server 10, if the check-in terminal 20 determines that the check-in procedure can be performed (step S 111, YES), the process proceeds to step S 112. Contrarily, based on the matching result of the unsuccessful matching received from the management server 10, if the check-in terminal 20 determines that the check-in procedure cannot be performed (step S 111, NO), the check-in terminal 20 notifies the user U of an error message (step S 113).
- In step S 112, the check-in terminal 20 determines whether or not an airline ticket medium of the user U is held over the reading unit of the medium reading device 208.
- the check-in terminal 20 stands by until an airline ticket medium is held over (step S 112 , NO).
- the check-in terminal 20 acquires recorded data such as a reservation number, an airline code, and the like from the airline ticket medium that is held over (step S 114 ).
- the check-in terminal 20 transmits the recorded data to the reservation system 2 of an airline company corresponding to the airline code (step S 115 ) and requests matching between the recorded data and boarding reservation information.
- In response to receiving the recorded data from the check-in terminal 20, the reservation system 2 matches the recorded data with the boarding reservation information stored in the reservation information DB 3 (step S 116) and transmits the matching result to the check-in terminal 20 (step S 117).
- the check-in terminal 20 performs a check-in procedure such as confirmation of an itinerary, selection of a seat, or the like based on the input information on the user U (step S 118 ). If there is no matching in the matching result in the reservation system 2 , the check-in terminal 20 may notify the user U of an error without performing a check-in procedure.
- the check-in terminal 20 transmits, to the management server 10 , a token ID, operation information, and passage history information indicating completion of procedure at the check-in terminal 20 (step S 119 ).
- the operation information includes at least a reservation number and an airline code.
- the passage history information includes information such as a passage time at the touch point P 1 , a device name of a terminal used for the procedure, or the like.
- the management server 10 registers passage history information indicating a relationship between the token ID and the passage information at the touch point P 1 in the passage history information DB 12 (step S 120 ). The management server 10 then registers the operation information received from the check-in terminal 20 in the operation information DB 13 if necessary (step S 121 ).
- a target face image (captured face image) successfully matched with a passport face image acquired from a passport in a check-in procedure is registered in the token ID information DB 11 as a registered face image, and a registered face image and operation information in the operation information DB 13 are associated with each other by the issued token ID. This enables biometric authentication by using face matching between a captured face image and a registered face image at each subsequent touch point.
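- The check-in flow of FIG. 14 (1:1 matching of the passport face image with the captured face image, followed by token ID issuance and registration of the captured image as the registered face image) might be reduced to a sketch like the one below. The matcher, the similarity threshold, and the use of a UUID for the token ID are assumptions; the description only states that the token ID is set to a unique value based on, for example, date and time or a sequence number.

```python
import uuid
from datetime import datetime


def check_in(passport_face, target_face, token_db: dict, match_score, threshold=0.8):
    """Return a newly issued token ID on a successful 1:1 match, otherwise None."""
    score = match_score(passport_face, target_face)   # similarity in [0, 1] (assumed scale)
    if score < threshold:
        return None   # unsuccessful matching: the check-in procedure is not performed

    token_id = uuid.uuid4().hex   # stand-in for a unique value based on date/time or a sequence
    token_db[token_id] = {
        "registered_face_image": target_face,       # the captured image, not the passport image
        "token_issuance_time": datetime.now(),
        "token_issuance_device": "check-in terminal 20",
        "invalid_flag": 1,                           # remains "1" (valid) until the boarding gate
    }
    return token_id
```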
- FIG. 15 is a sequence diagram illustrating one example of the process in the reservation system 2 , the automatic baggage drop-off machine 30 , and the management server 10 . This process is performed when the user U who completed a check-in procedure performs a baggage drop-off procedure if necessary.
- the automatic baggage drop-off machine 30 continuously or periodically captures the area in front of the apparatus and determines whether or not a face of the user U standing in front of the automatic baggage drop-off machine 30 is detected in the captured image (step S 201 ).
- the automatic baggage drop-off machine 30 stands by until a face of the user U is detected in an image by the biometric information acquisition device 309 (step S 201 , NO).
- the automatic baggage drop-off machine 30 captures the face of the user U and acquires the face image of the user U as a target face image (step S 202 ).
- the automatic baggage drop-off machine 30 transmits the target face image of the user U captured by the biometric information acquisition device 309 to the management server 10 together with a matching request (step S 203 ). Thereby, the automatic baggage drop-off machine 30 requests the management server 10 to match, at 1:N, the target face image of the user U captured by the biometric information acquisition device 309 with a plurality of registered face images registered in the token ID information DB 11 of the management server 10 .
- In response to receiving the target face image and the matching request from the automatic baggage drop-off machine 30, the management server 10 performs matching of the face image of the user U (step S 204). That is, the management server 10 matches, at 1:N, the target face image received from the automatic baggage drop-off machine 30 with a plurality of registered face images registered in the token ID information DB 11. Note that the registered face images to be matched are limited to images associated with a token ID whose invalid flag has a value of “1” (valid).
- If the management server 10 determines that the matching result indicates an unsuccessful matching (step S 205), the management server 10 transmits the unsuccessful matching result to the automatic baggage drop-off machine 30 (step S 208), and the process proceeds to step S 209.
- Contrarily, if the management server 10 determines that the matching result indicates a successful matching (step S 205), the process proceeds to step S 206. In step S 206, the management server 10 acquires the token ID associated with the successfully matched registered face image in the token ID information DB 11 and acquires a reservation number and an airline code from the operation information DB 13 by using the token ID as a key. The management server 10 then transmits the token ID, the reservation number, the airline code, and the matching result to the automatic baggage drop-off machine 30 (step S 207), and the process proceeds to step S 209.
- the automatic baggage drop-off machine 30 transmits the reservation number to the reservation system 2 of an airline company corresponding to the airline code (step S 210 ) and inquires boarding reservation information.
- If the received matching result indicates an unsuccessful matching (step S 209), the automatic baggage drop-off machine 30 notifies the user U of an error message (step S 213).
- In response to receiving the reservation number from the automatic baggage drop-off machine 30, the reservation system 2 transmits the corresponding boarding reservation information to the automatic baggage drop-off machine 30 (step S 211).
- the automatic baggage drop-off machine 30 performs a baggage drop-off procedure of the user U (step S 212 ).
- the automatic baggage drop-off machine 30 transmits, to the management server 10 , the token ID, the operation information, and passage history information indicating that the baggage drop-off procedure of the user U completed after the matching of the face image (step S 214 ).
- the passage history information includes information such as the passage time at the touch point P 2 , a device name of the used terminal, or the like.
- In response to receiving the information from the automatic baggage drop-off machine 30, the management server 10 registers, in the passage history information DB 12, passage history information indicating the relationship between the token ID and the passage information at the touch point P 2 for the user U (step S 215). The management server 10 then registers the operation information received from the automatic baggage drop-off machine 30 in the operation information DB 13 if necessary (step S 216).
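- The 1:N matching used here by the management server 10 (and again at the subsequent touch points) compares the target face image only with registered face images whose token ID is still valid, and the matched token ID is then used as the key into the operation information DB 13. A simplified sketch, assuming a similarity function and threshold that are not specified in the description:

```python
from typing import Optional


def match_one_to_n(target_face, token_db: dict, similarity, threshold=0.8) -> Optional[str]:
    """Return the token ID of the best-matching valid registration, or None if none matches."""
    best_token, best_score = None, threshold
    for token_id, record in token_db.items():
        if record["invalid_flag"] != 1:       # only token IDs whose invalid flag is "1" (valid)
            continue
        score = similarity(target_face, record["registered_face_image"])
        if score > best_score:
            best_token, best_score = token_id, score
    return best_token
```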
- FIG. 16 is a sequence diagram illustrating one example of the process in the reservation system 2 , the security inspection apparatus 40 , and the management server 10 . This process is performed when the user U who completed a check-in procedure or a baggage drop-off procedure goes through a security inspection procedure.
- the security inspection apparatus 40 continuously or periodically captures the front area of the metal detector gate 410 and determines whether or not a face of the user U standing in front of the metal detector gate 410 is detected in a captured image (step S 301 ).
- the security inspection apparatus 40 stands by until a face of the user U is detected in an image by the biometric information acquisition device 409 (step S 301 , NO).
- the security inspection apparatus 40 captures the face of the user U and acquires the face image of the user U as a target face image (step S 302 ).
- the security inspection apparatus 40 transmits the target face image of the user U captured by the biometric information acquisition device 409 to the management server 10 together with a matching request (step S 303 ). Thereby, the security inspection apparatus 40 requests the management server 10 to match, at 1:N, the target face image of the user U captured by the biometric information acquisition device 409 with a plurality of registered face images registered in the token ID information DB 11 of the management server 10 .
- In response to receiving the target face image and the matching request from the security inspection apparatus 40, the management server 10 performs matching of the face image of the user U (step S 304). That is, the management server 10 matches, at 1:N, the target face image received from the security inspection apparatus 40 with a plurality of registered face images registered in the token ID information DB 11. Note that the registered face images to be matched are limited to images associated with a token ID whose invalid flag has a value of “1” (valid).
- If the management server 10 determines that the matching result indicates an unsuccessful matching (step S 305), the management server 10 transmits the matching result of the unsuccessful matching to the security inspection apparatus 40 (step S 308), and the process proceeds to step S 309.
- Contrarily, if the management server 10 determines that the matching result indicates a successful matching (step S 305), the process proceeds to step S 306. In step S 306, the management server 10 acquires the token ID associated with the successfully matched registered face image in the token ID information DB 11 and acquires a reservation number and an airline code from the operation information DB 13 by using the token ID as a key. The management server 10 then transmits the token ID, the reservation number, the airline code, and the matching result to the security inspection apparatus 40 (step S 307), and the process proceeds to step S 309.
- the security inspection apparatus 40 transmits the reservation number to the reservation system 2 of an airline company corresponding to the airline code (step S 310 ) and inquires boarding reservation information.
- If the received matching result indicates an unsuccessful matching (step S 309), the security inspection apparatus 40 notifies the user U of an error message (step S 313).
- In response to receiving the reservation number from the security inspection apparatus 40, the reservation system 2 transmits the corresponding boarding reservation information to the security inspection apparatus 40 (step S 311).
- the security inspection apparatus 40 performs a security inspection procedure of the user U (step S 312 ).
- the CPU 401 controls each component of the security inspection apparatus 40 .
- the security inspection apparatus 40 detects a metal worn by the user U passing through the metal detector gate 410 .
- the user U who has passed through the metal detector gate 410 moves to an immigration area.
- the security inspection apparatus 40 transmits, to the management server 10 , the token ID, the operation information, and passage history information indicating that the security inspection procedure of the user U completed after the matching of the face image (step S 314 ).
- the passage history information includes information such as the passage time at the touch point P 3 , a device name of the used terminal, or the like.
- In response to receiving the information from the security inspection apparatus 40, the management server 10 registers, in the passage history information DB 12, passage history information indicating the relationship between the token ID and the passage information at the touch point P 3 for the user U (step S 315). The management server 10 then registers the operation information received from the security inspection apparatus 40 in the operation information DB 13 if necessary (step S 316).
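- The passage history registration performed after each procedure (for example, steps S 120, S 215, and S 315) could be sketched as below, following the data items of FIG. 4. The record layout mirrors the description; the ID generation and the in-memory list are assumptions.

```python
import uuid
from datetime import datetime


def register_passage(history_db: list, token_id: str, touch_point: str,
                     device_name: str, operation_system_type: str) -> dict:
    """Append one passage history record (FIG. 4 data items) and return it."""
    record = {
        "passage_history_id": uuid.uuid4().hex,
        "token_id": token_id,
        "passage_time": datetime.now(),
        "device_name": device_name,
        "operation_system_type": operation_system_type,
        "passage_touch_point": touch_point,    # one of P1 to P5
    }
    history_db.append(record)
    return record


def last_completed_touch_point(history_db: list, token_id: str) -> str:
    """Return the most recent touch point recorded for a token ID, or "" if there is none."""
    records = [r for r in history_db if r["token_id"] == token_id]
    if not records:
        return ""
    return max(records, key=lambda r: r["passage_time"])["passage_touch_point"]
```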
- FIG. 17 is a sequence diagram illustrating one example of the process in the reservation system 2 , the automated gate apparatus 50 , and the management server 10 . This process is performed when the user U goes through an immigration procedure.
- the automated gate apparatus 50 continuously or periodically captures the front area of the automated gate apparatus 50 and determines whether or not a face of the user U standing in front of the automated gate apparatus 50 is detected in a captured image (step S 401 ).
- the automated gate apparatus 50 stands by until a face of the user U is detected in an image by the biometric information acquisition device 509 (step S 401 , NO).
- the automated gate apparatus 50 captures the face of the user U and acquires the face image of the user U as a target face image (step S 402 ).
- the automated gate apparatus 50 transmits the target face image of the user U captured by the biometric information acquisition device 509 to the management server 10 together with a matching request (step S 403 ). Thereby, the automated gate apparatus 50 requests the management server 10 to match, at 1:N, the target face image of the user U captured by the biometric information acquisition device 509 with a plurality of registered face images registered in the token ID information DB 11 of the management server 10 .
- In response to receiving the target face image and the matching request from the automated gate apparatus 50, the management server 10 performs matching of the face image of the user U (step S 404). That is, the management server 10 matches, at 1:N, the target face image received from the automated gate apparatus 50 with a plurality of registered face images registered in the token ID information DB 11. Note that the registered face images to be matched are limited to images associated with a token ID whose invalid flag has a value of “1” (valid).
- If the management server 10 determines that the matching result indicates an unsuccessful matching (step S 405), the management server 10 transmits the matching result of the unsuccessful matching to the automated gate apparatus 50 (step S 408), and the process proceeds to step S 409.
- Contrarily, if the management server 10 determines that the matching result indicates a successful matching (step S 405), the process proceeds to step S 406. In step S 406, the management server 10 acquires the token ID associated with the successfully matched registered face image in the token ID information DB 11 and acquires a reservation number and an airline code from the operation information DB 13 by using the token ID as a key. The management server 10 then transmits the token ID, the reservation number, the airline code, and the matching result to the automated gate apparatus 50 (step S 407), and the process proceeds to step S 409.
- the automated gate apparatus 50 transmits the reservation number to the reservation system 2 of an airline company corresponding to the airline code (step S 410 ) and inquires boarding reservation information.
- If the received matching result indicates an unsuccessful matching (step S 409), the automated gate apparatus 50 notifies the user U of an error message (step S 413). For example, a notification screen including a message such as “Please move to immigration procedure at the manned counter” is displayed on the display device 507.
- In response to receiving the reservation number from the automated gate apparatus 50, the reservation system 2 transmits the corresponding boarding reservation information to the automated gate apparatus 50 (step S 411).
- the automated gate apparatus 50 performs an immigration procedure of the user U (step S 412 ) and opens the gate 511 (step S 414 ).
- the user U who has passed through the touch point P 4 moves to a departure area in which a boarding gate is provided.
- the automated gate apparatus 50 transmits, to the management server 10 , the token ID, the operation information, and passage history information indicating that the immigration procedure of the user U completed after the matching of the face image (step S 415 ).
- the passage history information includes information such as the passage time at the touch point P 4 , a device name of the used terminal, or the like.
- In response to receiving the information from the automated gate apparatus 50, the management server 10 registers, in the passage history information DB 12, passage history information indicating the relationship between the token ID and the passage information at the touch point P 4 for the user U (step S 416). The management server 10 then registers the operation information received from the automated gate apparatus 50 in the operation information DB 13 if necessary (step S 417).
- FIG. 18 is a sequence diagram illustrating one example of the process of the reservation system 2 , the operation terminal 70 , the management server 10 , and the camera 80 . This process is performed when the staff member S references information on a passenger included in a monitoring image of a peripheral region of a boarding gate, for example.
- the operation terminal 70 transmits a display request for an operation screen to the management server 10 based on input information obtained from the staff member S (step S 501 ).
- the management server 10 acquires a captured image of the peripheral region of a boarding gate from the plurality of cameras 80 arranged in a distributed manner around the boarding gate (step S 502 ).
- the management server 10 transmits an operation screen including the captured image to the operation terminal 70 (step S 503 ).
- the operation terminal 70 displays the operation screen received from the management server 10 on the display device 707 (step S 504 ).
- FIG. 19 is a diagram illustrating one example of the operation screen displayed on the operation terminal 70 .
- a captured image is displayed in the center area of the screen.
- Each dashed line section in the captured image indicates a detected region of a face image of each of a plurality of users U 1 to U 12 .
- A camera select button used for switching the displayed image to a captured image of another camera 80 having a different capturing angle, a passenger list screen button used for displaying a list of passengers included in the captured image, and an end button used for closing the screen are also displayed. Note that it is preferable that a passenger for whom no token ID is registered or a passenger who is not authenticated be distinguished on the screen from a registered (authenticated) passenger by a different display form.
- the operation terminal 70 transmits a display request for a passenger list screen to the management server 10 based on input information from the staff member S (step S 505 ).
- the management server 10 detects a person from the captured image of the peripheral region of the boarding gate (step S 506 ). When a plurality of persons are included in a captured image, the management server 10 detects face images of all the persons as the target face images, respectively.
- the management server 10 matches, at 1:N, the target face image with a plurality of registered face images registered in the token ID information DB 11 (step S 507 ). That is, the management server 10 specifies registered face images which are the same as the target face images of passengers included in the captured image, respectively. Note that the registered face images to be matched are limited to images associated with the token ID whose invalid flag has a value of “1” (valid). Furthermore, the passage history information DB 12 may be referenced, and only the image of the person who has passed through the immigration area may be a target to be matched.
- the management server 10 acquires the token ID associated with the successfully matched registered face image in the token ID information DB 11 and acquires a reservation number and an airline code of each person from the operation information DB 13 by using the token ID as a key (step S 508 ). The management server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S 509 ).
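- A minimal sketch of the 1:N matching and lookup described in steps S 507 to S 509, assuming face images are compared as embedding vectors under a cosine-similarity threshold. The function name, similarity measure, threshold, and the dictionary stand-ins for the token ID information DB 11 and the operation information DB 13 are assumptions for illustration.

```python
import numpy as np

def one_to_n_match(target_embedding, registered, threshold=0.6):
    """Return the token ID whose registered face embedding best matches the target, or None.
    `registered` maps token_id -> (embedding, invalid_flag); only entries whose invalid
    flag is "1" (valid) are candidates, mirroring the note above. Cosine similarity and
    the threshold value are illustrative assumptions."""
    best_token, best_score = None, threshold
    for token_id, (emb, invalid_flag) in registered.items():
        if invalid_flag != "1":            # skip expired token IDs
            continue
        score = float(np.dot(target_embedding, emb) /
                      (np.linalg.norm(target_embedding) * np.linalg.norm(emb)))
        if score > best_score:
            best_token, best_score = token_id, score
    return best_token

# Dictionary stand-ins for the token ID information DB 11 and the operation information DB 13.
registered = {"token-0001": (np.random.rand(128), "1")}
operation_info = {"token-0001": ("ABC123", "NX")}   # token ID -> (reservation number, airline code)

detected_faces = [registered["token-0001"][0]]      # embeddings of faces found in the captured image
for face in detected_faces:
    token = one_to_n_match(face, registered)
    if token is not None:
        reservation_number, airline_code = operation_info[token]
        print(token, reservation_number, airline_code)
```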
- the operation terminal 70 sequentially transmits reservation numbers to the reservation system 2 of an airline company corresponding to the airline code (step S 510 ) and inquires boarding reservation information for each person.
- In response to receiving the reservation number of each person from the operation terminal 70, the reservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S 511).
- the operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S 512 ).
- the management server 10 transmits a passenger list screen created based on the data set to the operation terminal 70 (step S 513 ).
- The passenger list screen includes the boarding reservation information associated with each registered face image specified by biometric authentication.
- the operation terminal 70 then displays the passenger list screen received from the management server 10 on the display device 707 (step S 514 ).
- FIG. 20 is a diagram illustrating one example of the passenger list screen displayed on the operation terminal 70 .
- boarding reservation information (“name”, “nationality”, “flight number”, “departure place”, “destination place”, “departure time”, “boarding gate”) and face images of a plurality of persons detected from the captured image are displayed in a list form.
- displayed data items are not limited to the above.
- The staff member S can thus easily find a passenger who might have mistaken the boarding gate or the waiting lane and guide such a passenger to the correct place.
- FIG. 21 is a sequence diagram illustrating one example of the process in the reservation system 2 , the operation terminal 70 , the management server 10 , and the camera 80 . This process is performed for each boarding gate in order to notify the staff member S of a target person for priority boarding.
- The management server 10 determines whether or not the current time has reached a predetermined period before the start time of the priority boarding guidance (step S 601). If it is determined that the current time has reached the predetermined period before the start time (step S 601, YES), the management server 10 acquires a captured image of the peripheral region of the boarding gate from the camera 80 (step S 602). The process then proceeds to step S 603. Contrarily, if it is determined that the current time has not yet reached the predetermined period before the start time (step S 601, NO), the management server 10 stands by until the predetermined period before the start time is reached.
- In step S 603, the management server 10 detects a person from the captured image of the peripheral region of the boarding gate.
- the management server 10 detects face images of all the persons as target face images, respectively.
- the management server 10 matches, at 1:N, the target face image with a plurality of registered face images registered in the token ID information DB 11 (step S 604 ).
- the registered face images to be matched are limited to images associated with the token ID whose invalid flag has a value of “1” (valid).
- the passage history information DB 12 may be referenced, and only the image of the person who has passed through the immigration area may be a target to be matched.
- the management server 10 acquires the token ID associated with the successfully matched registered face image in the token ID information DB 11 and acquires a reservation number and an airline code of each person from the operation information DB 13 by using the token ID as a key (step S 605 ). The management server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S 606 ).
- the operation terminal 70 sequentially transmits reservation numbers to the reservation system 2 of an airline company corresponding to the airline code (step S 607 ) and inquires boarding reservation information for each person.
- In response to receiving the reservation number of each person from the operation terminal 70, the reservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S 608).
- the operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S 609 ).
- The management server 10 determines the priority in boarding guidance based on information such as the seat class, the membership category, and the accompanying person of each person included in the data set, and extracts a passenger having high priority (step S 610).
- the management server 10 transmits a guide instruction screen of priority boarding for the extracted passenger to the operation terminal 70 (step S 611 ).
- the operation terminal 70 then displays the guide instruction screen of priority boarding received from the management server 10 on the display device 707 (step S 612 ).
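- The priority determination of step S 610 could, for example, be realized as a simple ranking over the boarding reservation information. The concrete ranking rules, cutoff, and field names in the sketch below are assumptions for illustration, not the disclosed method.

```python
def boarding_priority(reservation: dict) -> int:
    """Derive a guidance priority from boarding reservation information, as in step S610.
    Lower numbers board earlier; the rules and the field names ("seat_class",
    "membership", "accompanying_person") are illustrative assumptions."""
    if reservation.get("accompanying_person") in ("young child", "wheelchair"):
        return 1                                   # passengers needing assistance first
    if reservation.get("seat_class") == "first":
        return 2
    if reservation.get("membership") in ("diamond", "gold"):
        return 3
    if reservation.get("seat_class") == "business":
        return 4
    return 5                                       # general boarding

def extract_high_priority(data_sets: list[dict], cutoff: int = 1) -> list[dict]:
    """Pick out the passengers that the staff member should guide first."""
    return [d for d in data_sets if boarding_priority(d) <= cutoff]

passengers = [
    {"name": "A", "seat_class": "economy", "membership": "-", "accompanying_person": "young child"},
    {"name": "B", "seat_class": "business", "membership": "gold", "accompanying_person": None},
]
print(extract_high_priority(passengers))   # the passenger travelling with a young child
```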
- FIG. 22 is a diagram illustrating one example of a guide instruction screen of priority boarding displayed on the operation terminal 70 .
- a captured image is displayed in the center area of the screen.
- the dashed line section in the image indicates a detected region of a parent and a child determined as passengers having the highest priority for boarding guide.
- a message instructing the staff member S for priority boarding guide (“Priority boarding is starting soon. Please guide the two passengers to the boarding gate”) is displayed in the lower side region in the screen.
- FIG. 23 is a sequence diagram illustrating one example of the process in the reservation system 2 , the operation terminal 70 , the management server 10 , and the camera 80 . This process is performed for guiding a passenger to an appropriate waiting lane in accordance with the priority of the passenger before starting priority boarding guide.
- The management server 10 determines whether or not the current time has reached a predetermined period before the start time of the priority boarding guidance (step S 701). If it is determined that the current time has reached the predetermined period before the start time (step S 701, YES), the management server 10 acquires a captured image including a line of passengers waiting for boarding from the camera 80 (step S 702). The process then proceeds to step S 703. Contrarily, if it is determined that the current time has not yet reached the predetermined period before the start time (step S 701, NO), the management server 10 stands by until the predetermined period before the start time is reached.
- In step S 703, the management server 10 detects all the persons from the captured image including a line of passengers waiting for boarding.
- the management server 10 matches, at 1:N, the target face image with a plurality of registered face images registered in the token ID information DB 11 (step S 704 ).
- the management server 10 acquires the token ID associated with a registered face image of a successful matching in the token ID information DB 11 and acquires a reservation number and an airline code of each person from the operation information DB 13 by using the token ID as a key (step S 705 ). The management server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S 706 ).
- the operation terminal 70 sequentially transmits reservation numbers to the reservation system 2 of an airline company corresponding to the airline code (step S 707 ) and inquires boarding reservation information for each person.
- In response to receiving the reservation number of each person from the operation terminal 70, the reservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S 708).
- the operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S 709 ).
- The management server 10 determines the boarding priority of each person based on information such as the seat class, the membership category, and the accompanying person of each person included in the data set, and extracts a passenger whose priority does not match the priority assigned to the waiting lane, that is, a passenger who is waiting in the wrong waiting lane (step S 710).
- The management server 10 transmits a guide instruction screen of a waiting lane for the extracted passenger to the operation terminal 70 (step S 711). That is, when a passenger whose priority is different from the priority assigned to the waiting place is included in the passengers waiting in the waiting place, the management server 10 outputs information suggesting guidance to the correct waiting place corresponding to the priority of that passenger.
- the operation terminal 70 then displays the guide instruction screen received from the management server 10 on the display device 707 (step S 712 ).
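- A sketch of the waiting-lane mismatch detection of step S 710, assuming each waiting lane is associated with exactly one priority level and each detected passenger has already been assigned a boarding priority; the data layout and field names are assumptions.

```python
def find_misplaced_passengers(waiting, lane_priority):
    """Return passengers standing in a lane whose priority differs from their own.
    Each entry in `waiting` pairs a passenger's boarding priority with the lane the
    camera observed them in; `lane_priority` maps lane number to the priority that
    lane is reserved for. Both structures are illustrative assumptions."""
    misplaced = []
    for p in waiting:
        expected_lane = next(lane for lane, prio in lane_priority.items()
                             if prio == p["priority"])
        if p["lane"] != expected_lane:
            misplaced.append({**p, "correct_lane": expected_lane})
    return misplaced

lane_priority = {1: 1, 2: 2, 3: 3}                 # lane No. -> priority it serves
waiting = [
    {"name": "E", "priority": 3, "lane": 1},       # waiting in the wrong lane
    {"name": "F", "priority": 1, "lane": 1},
]
for p in find_misplaced_passengers(waiting, lane_priority):
    print(f'Guide {p["name"]} from lane {p["lane"]} to lane {p["correct_lane"]}')
```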
- FIG. 24 is a diagram illustrating one example of a guide instruction screen of a waiting lane displayed on the operation terminal 70 .
- a captured image in which a plurality of passengers before boarding are waiting in a waiting lane provided around the boarding gate is displayed in the center region of the screen.
- the dashed line section in the screen indicates a detected region of a passenger E waiting in a wrong waiting lane.
- an instruction message that instructs the staff member S to guide the passenger E to a correct waiting lane corresponding to the priority of the passenger E (“There is a passenger waiting in a wrong waiting lane (No. 1). Please guide the passenger to the correct waiting lane (No. 3)”) is displayed in the lower side region in the screen.
- FIG. 25 is a sequence diagram illustrating one example of the process in the reservation system 2 , the operation terminal 70 , the management server 10 , and the camera 80 . This process is performed when the staff member S arbitrarily designates a person included in a captured image and inquires information on the designated person.
- the operation terminal 70 transmits a display request for an operation screen to the management server 10 based on input information from the staff member S (step S 801 ).
- the management server 10 acquires a captured image of the peripheral region of a boarding gate from the plurality of cameras 80 arranged in a distributed manner around the boarding gate (step S 802 ).
- the management server 10 transmits the operation screen including a captured image to the operation terminal 70 (step S 803 ).
- the operation terminal 70 displays the operation screen received from the management server 10 on the display device 707 (step S 804 ).
- The operation terminal 70 transmits, to the management server 10, coordinate information on the person designated by the staff member S within the operation screen (step S 805).
- the management server 10 detects the designated person from the captured image based on the coordinate information (step S 806 ).
- a face image of the designated person cut out from the captured image is a target face image.
- the management server 10 matches, at 1:N, the target face image of the designated person with a plurality of registered face images registered in the token ID information DB 11 (step S 807 ). That is, the management server 10 specifies a registered face image which is the same as the target face image of the passenger specified on the screen of the operation terminal 70 by the staff member S.
- the management server 10 acquires the token ID associated with the successfully matched registered face image in the token ID information DB 11 and acquires a reservation number and an airline code of the designated person from the operation information DB 13 by using the token ID as a key (step S 808 ). The management server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S 809 ).
- The operation terminal 70 transmits the reservation number to the reservation system 2 of the airline company corresponding to the airline code (step S 810) and inquires boarding reservation information.
- In response to receiving the reservation number of the designated person from the operation terminal 70, the reservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S 811).
- the operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S 812 ).
- the management server 10 transmits a passenger inquiry screen created based on the data set to the operation terminal 70 (step S 813 ). That is, the management server 10 outputs boarding reservation information associated with a registered face image of a designated person to the operation terminal 70 .
- the operation terminal 70 displays the passenger inquiry screen received from the management server 10 on the display device 707 (step S 814 ).
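- The coordinate-based designation of steps S 805 and S 806 amounts to picking the detected face whose region contains the clicked point. The sketch below assumes rectangular detection boxes in image pixels, which is an illustrative assumption.

```python
def person_at(coordinate, detections):
    """Return the detected face whose bounding box contains the designated coordinate.
    `detections` is a list of dicts with a "box" given as (left, top, right, bottom)
    in image pixels; the structure is an assumption."""
    x, y = coordinate
    for det in detections:
        left, top, right, bottom = det["box"]
        if left <= x <= right and top <= y <= bottom:
            return det
    return None        # clicked outside every detected face region

detections = [
    {"person": "T", "box": (100, 50, 180, 150)},
    {"person": "U", "box": (300, 60, 370, 160)},
]
clicked = (140, 90)                     # coordinate sent from the operation terminal
print(person_at(clicked, detections))   # the designated person T
```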
- FIG. 26 is a diagram illustrating one example of a passenger inquiry screen displayed on the operation terminal 70 .
- The captured image, the target face image of the designated person extracted from the captured image, and the registered face image successfully matched with the target face image are displayed.
- the dashed line section in the captured image is a detected region of the designated person T.
- Boarding reservation information on the designated person is displayed in a list form as an inquiry result.
- The management server 10 specifies a passenger based on the face image of the designated person and displays the boarding reservation information on the screen. Thereby, the staff member S can confirm boarding reservation information on a passenger included in a captured image in advance and use the confirmed boarding reservation information in the operation.
- FIG. 27 is a sequence diagram illustrating one example of the process in the reservation system 2 , the operation terminal 70 , the management server 10 , and the camera 80 . This process is performed when the staff member S extracts a desired passenger from passengers present around a boarding gate.
- FIG. 28 is a diagram illustrating one example of the extraction condition entry screen displayed on the operation terminal 70 .
- extraction conditions “boarding gate”, “flight number”, “priority”, “membership category”, “seat class”, and “with or without token ID” are illustrated as an example.
- the staff member S may set a checkbox regarding priority (“1”) to ON.
- the operation terminal 70 transmits the extraction condition input by the staff member S in the extraction condition entry screen to the management server 10 (step S 902 ).
- the management server 10 acquires a captured image of the peripheral region of a boarding gate from the plurality of cameras 80 arranged in a distributed manner around the boarding gate (step S 903 ).
- the management server 10 detects all the persons from the captured image (step S 904 ).
- the management server 10 matches, at 1:N, the target face image of the detected person with a plurality of registered face images registered in the token ID information DB 11 (step S 905 ).
- the management server 10 acquires the token ID associated with a registered face image of a successful matching in the token ID information DB 11 and acquires a reservation number and an airline code of each person from the operation information DB 13 by using the token ID as a key (step S 906 ). The management server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S 907 ).
- the operation terminal 70 sequentially transmits reservation numbers to the reservation system 2 of an airline company corresponding to the airline code (step S 908 ) and inquires boarding reservation information for each person.
- In response to receiving the reservation number of each person from the operation terminal 70, the reservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S 909).
- the operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S 910 ).
- The management server 10 extracts a passenger matching the extraction conditions (step S 911).
- the management server 10 transmits a passenger extraction result screen regarding the extracted passenger to the operation terminal 70 (step S 912 ).
- the operation terminal 70 displays the passenger extraction result screen received from the management server 10 on the display device 707 (step S 913 ).
- FIG. 29 is a diagram illustrating one example of a passenger extraction result screen displayed on the operation terminal 70 .
- extraction conditions designated by the staff member S are displayed.
- three conditions of “boarding gate”, “flight number”, and “membership category” are designated.
- a captured image is displayed in the center region of the screen.
- a plurality of dashed line sections in the captured image are detected regions of passengers satisfying the extraction conditions. That is, the management server 10 can extract a passenger corresponding to boarding reservation information satisfying the extraction conditions input by the staff member S out of the passengers included in the captured image.
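- The extraction of step S 911 can be viewed as filtering the matched passengers' boarding reservation information against the conditions entered in the screen of FIG. 28. The condition keys and data layout in this sketch are assumptions for illustration.

```python
def satisfies(reservation: dict, conditions: dict) -> bool:
    """True if a passenger's boarding reservation information matches every designated
    extraction condition; unset conditions are ignored. The keys loosely mirror FIG. 28
    ("boarding_gate", "flight_number", "priority", "membership", "seat_class") but are
    illustrative assumptions."""
    return all(reservation.get(key) == value
               for key, value in conditions.items() if value is not None)

def extract_passengers(detected: list[dict], conditions: dict) -> list[dict]:
    """Keep only the detected passengers whose reservation satisfies the conditions."""
    return [p for p in detected if satisfies(p, conditions)]

detected = [
    {"name": "A", "boarding_gate": "21", "flight_number": "NX101", "membership": "gold"},
    {"name": "B", "boarding_gate": "22", "flight_number": "NX102", "membership": "-"},
]
conditions = {"boarding_gate": "21", "flight_number": "NX101", "membership": "gold"}
print(extract_passengers(detected, conditions))    # only passenger A remains
```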
- FIG. 30 is a sequence diagram illustrating one example of the process in the reservation system 2 , the boarding gate apparatus 60 , and the management server 10 . This process is performed when the user U passes through a boarding gate.
- the boarding gate apparatus 60 continuously or periodically captures the front area of the apparatus and determines whether or not a face of the user U standing in front of the boarding gate apparatus 60 is detected in a captured image (step S 1001 ).
- the boarding gate apparatus 60 stands by until a face of the user U is detected in an image by the biometric information acquisition device 609 (step S 1001 , NO).
- the boarding gate apparatus 60 captures the face of the user U and acquires the face image of the user U as a target face image (step S 1002 ).
- the boarding gate apparatus 60 transmits the target face image of the user U captured by the biometric information acquisition device 609 to the management server 10 together with a matching request (step S 1003 ). Thereby, the boarding gate apparatus 60 requests the management server 10 to match, at 1:N, the target face image of the user U captured by the biometric information acquisition device 609 with a plurality of registered face images registered in the token ID information DB 11 of the management server 10 .
- In response to receiving the target face image and the matching request from the boarding gate apparatus 60, the management server 10 performs matching of the face image of the user U (step S 1004). That is, the management server 10 matches, at 1:N, the target face image received from the boarding gate apparatus 60 with a plurality of registered face images registered in the token ID information DB 11. Note that the registered face images to be matched are limited to images associated with the token ID whose invalid flag has a value of "1" (valid).
- If the management server 10 determines that the matching result indicates an unsuccessful matching (step S 1005, NO), the management server 10 transmits the matching result of the unsuccessful matching to the boarding gate apparatus 60 (step S 1008), and the process proceeds to step S 1009.
- Contrarily, if the management server 10 determines that the matching result indicates a successful matching (step S 1005, YES), the process proceeds to step S 1006.
- In step S 1006, the management server 10 acquires the token ID associated with the successfully matched registered face image in the token ID information DB 11 and acquires a reservation number and an airline code from the operation information DB 13 by using the token ID as a key.
- The management server 10 then transmits the token ID, the reservation number, the airline code, and the matching result to the boarding gate apparatus 60 (step S 1007). The process then proceeds to step S 1009.
- If the matching is successful, the boarding gate apparatus 60 transmits the reservation number to the reservation system 2 of the airline company corresponding to the airline code (step S 1010) and inquires boarding reservation information.
- If the matching is unsuccessful, the boarding gate apparatus 60 notifies the user U of an error message (step S 1012).
- For example, the boarding gate apparatus 60 displays a notification screen including a message such as "Please move to procedure at the manned counter" on the display device 607.
- In response to receiving the reservation number from the boarding gate apparatus 60, the reservation system 2 transmits the corresponding boarding reservation information to the boarding gate apparatus 60 (step S 1011).
- the boarding gate apparatus 60 performs check at the boarding gate for the user U based on the flight number, the gate number, the boarding start time, and the like included in the boarding reservation information (step S 1013 ) and when permitting the boarding, opens the gate 611 (step S 1014 ).
- the user U who has passed through the touch point P 5 boards on an airplane.
- If the boarding gate apparatus 60 determines not to permit boarding in step S 1013, it is preferable to notify the user U of an error message without opening the gate 611.
- For example, the gate 611 is not opened when the user U has mistaken the number of the gate 611 or when the current time is before the boarding start time.
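- A sketch of the gate-side check of step S 1013, assuming the decision depends only on the gate number and the boarding start time named above; the field names and exact rules are illustrative assumptions.

```python
from datetime import datetime

def permit_boarding(reservation: dict, gate_number: str, now: datetime) -> bool:
    """Boarding-gate check corresponding to step S1013: permit boarding only when the
    passenger is at the gate written in the reservation and boarding has started."""
    if reservation["boarding_gate"] != gate_number:
        return False                       # passenger is at the wrong gate
    if now < reservation["boarding_start_time"]:
        return False                       # boarding has not started yet
    return True

reservation = {
    "flight_number": "NX101",
    "boarding_gate": "21",
    "boarding_start_time": datetime(2021, 4, 1, 10, 30),
}
print(permit_boarding(reservation, gate_number="21", now=datetime(2021, 4, 1, 10, 45)))  # True
print(permit_boarding(reservation, gate_number="22", now=datetime(2021, 4, 1, 10, 45)))  # False
```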
- the boarding gate apparatus 60 transmits, to the management server 10 , the token ID, the operation information, and passage history information indicating that the user U completed the boarding on the airplane after the matching of the face image (step S 1015 ).
- The passage history information includes information such as the passage time at the touch point P 5 and the device name of the terminal used.
- In response to receiving the information from the boarding gate apparatus 60, the management server 10 registers, in the passage history information DB 12, passage history information at the touch point P 5 for the user U (step S 1016). The management server 10 then registers the operation information received from the boarding gate apparatus 60 in the operation information DB 13 if necessary (step S 1017).
- the management server 10 then updates the token ID information DB 11 (step S 1018 ). Specifically, the management server 10 updates the invalid flag in the token ID information DB 11 to a value of invalidity (“0”). Thereby, the lifecycle of the token ID expires.
- the management server 10 stores boarding reservation information on a passenger who is boarding on an airplane and a registered face image of the passenger in association with each other. Further, the management server 10 acquires a target face image of a passenger who has not yet boarded on an airplane from a captured image of a peripheral region of a boarding gate and specifies a registered face image which is the same as the target face image. Then, in response to specifying boarding reservation information associated with the specified registered face image, the management server 10 outputs information used for supporting a procedure of a passenger at a boarding gate to the operation terminal 70 used by the staff member S based on the boarding reservation information. Accordingly, a procedure at a boarding gate can be efficiently performed.
- the management server 10 outputs priority in a procedure performed by a passenger at a boarding gate as information used for supporting the procedure of the passenger at the boarding gate. Accordingly, the staff member S can efficiently provide a service of priority boarding based on the priority of each passenger obtained from the captured image without checking a storage medium such as a passport, a boarding ticket, or the like.
- The information processing system 1 in the present example embodiment will be described below. Note that references common to those in the drawings of the first example embodiment represent the same objects. Description of features common to the first example embodiment will be omitted, and different features will be described in detail.
- the present example embodiment is different from the first example embodiment in that a check process regarding an accompanying person of a passenger is further performed in the check at the time of boarding at a boarding gate (touch point P 5 ).
- FIG. 31 is a flowchart illustrating one example of a check process of an accompanying person. This process is performed by the management server 10 between step S 1011 and step S 1013 of FIG. 30 described above, for example.
- the management server 10 determines whether or not a target person under a boarding check at a boarding gate is a passenger having an accompanying person based on boarding reservation information received from the reservation system 2 (step S 1101 ). In this step, if the management server 10 determines that the target person is a passenger having an accompanying person (step S 1101 , YES), the process proceeds to step S 1102 .
- Contrarily, if the management server 10 determines that the target person is a passenger having no accompanying person (step S 1101, NO), the process proceeds to step S 1013.
- In step S 1102, the management server 10 determines whether or not another person is included in the captured image. If the management server 10 determines that another person is included in the captured image (step S 1102, YES), the process proceeds to step S 1103.
- Contrarily, if the management server 10 determines that another person is not included in the captured image (step S 1102, NO), the process proceeds to step S 1105.
- In step S 1103, the management server 10 determines whether or not the person included in the captured image is a person registered as an accompanying person. If the management server 10 determines that the person included in the captured image is a person registered as an accompanying person (step S 1103, YES), the process proceeds to step S 1013.
- Contrarily, if the management server 10 determines that the person included in the captured image is not a person registered as an accompanying person (step S 1103, NO), the process proceeds to step S 1106.
- In step S 1105, the management server 10 outputs a check instruction screen indicating the absence of an accompanying person to the operation terminal 70 and ends the process.
- FIG. 32 is a diagram illustrating one example of a check instruction screen regarding an accompanying person displayed on the operation terminal 70 .
- a captured image captured at a boarding gate is displayed.
- a person T in the captured image is a passenger on which face authentication has been performed.
- boarding reservation information regarding the person T is displayed in a list form.
- This example represents that the accompanying person of the person T is “female/young child”.
- a child C registered as an accompanying person of the person T is not included in the captured image.
- a message instructing the staff member S to check an accompanying person (“Accompanying person information is registered for this passenger. Please check the accompanying person”) is displayed in the lower side region in the screen.
- In step S 1106, the management server 10 outputs a check instruction screen indicating an unsuccessful matching of an accompanying person to the operation terminal 70 and ends the process.
- FIG. 33 is a diagram illustrating one example of a check instruction screen of an accompanying person displayed on the operation terminal 70 .
- a captured image captured at a boarding gate is displayed.
- the person T in the captured image is a passenger on which face authentication has been performed.
- boarding reservation information regarding the person T is displayed in a list form.
- This example represents that the accompanying person of the person T is “male/young child”.
- the child C included in the captured image is different from the registered accompanying person.
- a message instructing the staff member S to check an accompanying person (“This accompanying person is different from registered information. Please check the accompanying person”) is displayed in the lower side region in the screen.
- the management server 10 can analyze a captured image at a boarding gate and efficiently instruct the staff member S to perform a check operation regarding an accompanying person of a passenger. Specifically, the management server 10 acquires a target face image of a passenger from a captured image obtained by capturing the passenger at a boarding gate and specifies a registered face image which is the same as a target face image. When the accompanying person associated with the specified registered face image is not included in the captured image at the boarding gate, the management server 10 outputs information that suggests a check operation of an accompanying person.
- Similarly, when the person included in the captured image at the boarding gate is different from the registered accompanying person, the management server 10 outputs information that suggests a check operation of an accompanying person.
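- The decision logic of FIG. 31 can be summarized in a short sketch. The representation of the registered accompanying person attribute and of the persons detected in the captured image is an assumption made only for illustration.

```python
def check_accompanying_person(reservation, persons_in_image):
    """Compact form of the FIG. 31 flow: returns which check instruction screen, if any,
    should be sent to the operation terminal. `reservation["accompanying_person"]` holds
    the registered attribute (e.g. "female/young child") and each detected companion is
    assumed to carry a comparable attribute; both representations are assumptions."""
    registered = reservation.get("accompanying_person")
    if not registered:                       # S1101 NO: nothing to check
        return None
    companions = [p for p in persons_in_image if not p.get("is_main_passenger")]
    if not companions:                       # S1102 NO: companion absent
        return "absence_check_screen"        # corresponds to S1105
    if any(p.get("attribute") == registered for p in companions):
        return None                          # S1103 YES: proceed to the gate check
    return "mismatch_check_screen"           # corresponds to S1106

reservation = {"accompanying_person": "female/young child"}
persons = [{"is_main_passenger": True, "attribute": "male/adult"},
           {"is_main_passenger": False, "attribute": "male/young child"}]
print(check_accompanying_person(reservation, persons))   # mismatch_check_screen
```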
- FIG. 34 is a block diagram illustrating an example of the overall configuration of the information processing system 4 in the present example embodiment.
- the information processing system 4 is different from the information processing system 1 of the first example embodiment in that a function of searching for the position of a designated passenger in an airport by using captured images of the cameras 80 arranged at various places such as the touch points P 1 to P 5 , a lounge (not illustrated), a duty-free shop (not illustrated), a pathway, and the like in the airport is further provided.
- FIG. 35 is a sequence diagram illustrating one example of the process in the operation terminal 70 , the management server 10 , and the camera 80 . This process is performed when the staff member S searches for the current position of a desired passenger out of passengers present in the airport. Note that the passenger to be searched for is a passenger for which a token ID has been issued.
- the operation terminal 70 displays a search condition entry screen (step S 1201 ) and then transmits, to the management server 10 , a search condition input by the staff member S in the condition entry screen (step S 1202 ).
- a search condition may be a reservation number, an airline code, a seat class, a membership category, a name of a passenger, or the like.
- boarding reservation information is already stored as operation information in the operation information DB 13 at the time of issuance of a token ID. Thereby, an inquiry process of boarding reservation information to the reservation system 2 based on a reservation number can be omitted.
- the management server 10 specifies a token ID of a searching person associated with operation information matched to a search condition in the operation information DB 13 (step S 1203 ).
- the management server 10 acquires a registered face image of a searching person from the token ID information DB 11 based on the specified token ID (step S 1204 ).
- the management server 10 acquires captured images from the plurality of cameras 80 arranged in various places in the airport A in a distributed manner, respectively (step S 1205 ).
- the management server 10 detects all the persons from respective captured images (step S 1206 ).
- the management server 10 matches, at 1:N, a registered face image of the searching person with a plurality of face images detected from the captured image (step S 1207 ).
- the management server 10 specifies the current position of the searching person based on the captured image including the person of the face image successfully matched with the registered face image of the searching person (step S 1208 ) and then transmits a passenger search result screen including information on the current position of the searching person to the operation terminal 70 (step S 1209 ).
- the operation terminal 70 then displays the passenger search result screen received from the management server 10 on the display device 707 (step S 1210 ).
- FIG. 36 is a diagram illustrating one example of the passenger search result screen displayed on the operation terminal 70 .
- A captured image of the place in which the searching person is found, a target face image of the searching person extracted from the captured image, and a registered face image successfully matched with the target face image are displayed.
- the dashed line section in the captured image is a detection region of the designated person T.
- information on the current location of the searching person (“First Terminal, 4F, XXX lounge”) is displayed above the captured image.
- boarding reservation information regarding the person T is displayed in a list form.
- the management server 10 selects an image including a face image that is the same as a registered face image of a passenger to be searched for out of images captured by the plurality of cameras 80 (image capture device) arranged in the airport.
- The management server 10 then outputs position information of the passenger in the airport based on the selected image. That is, the management server 10 can easily search for a passenger in various places in the airport, without being limited to the peripheral region of a boarding gate, and notify the operation terminal 70 of the position information (current location) of the found passenger. Thereby, information on a desired passenger can be shared among the staff members S.
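- A sketch of the camera-based search of steps S 1205 to S 1208, assuming each camera feed yields a set of face representations and a 1:N matcher is available. The matcher and the data structures below are stand-ins, not the disclosed implementation.

```python
def locate_passenger(registered_face, camera_feeds, match):
    """Search the airport for the passenger. `camera_feeds` maps a human-readable
    location to the face representations detected in that camera's latest image,
    and `match` is the 1:N face matcher; both are illustrative assumptions."""
    for location, faces in camera_feeds.items():
        if any(match(registered_face, face) for face in faces):
            return location                # current position of the searched passenger
    return None                            # not visible on any camera right now

# Example with a trivial matcher that compares pre-computed identity labels.
match = lambda a, b: a == b
feeds = {"First Terminal, 4F, XXX lounge": ["passenger-42", "passenger-7"],
         "Gate 21 waiting lane": ["passenger-13"]}
print(locate_passenger("passenger-42", feeds, match))   # "First Terminal, 4F, XXX lounge"
```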
- FIG. 37 is a block diagram illustrating an example of the overall configuration of the information processing apparatus 100 in the present example embodiment.
- the information processing apparatus 100 has an acquisition unit 100 A, a specifying unit 100 B, and an output unit 100 C.
- the acquisition unit 100 A acquires biometric information of a passenger from a captured image obtained by capturing the passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane.
- the specifying unit 100 B specifies boarding reservation information regarding the boarding by using the biometric information acquired by the acquisition unit 100 A.
- the output unit 100 C outputs information used for supporting a procedure of the passenger at the boarding gate based on the boarding reservation information specified by the specifying unit 100 B. According to the information processing apparatus 100 in the present example embodiment, it is possible to support a procedure of a passenger at a boarding gate.
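- For illustration only, the three units of FIG. 37 could be composed as in the following Python sketch; the interfaces, method names, and types are assumptions and do not correspond to any disclosed implementation.

```python
from typing import Optional, Protocol

class AcquisitionUnit(Protocol):
    def acquire(self, captured_image: bytes) -> Optional[bytes]:
        """Extract the passenger's biometric information (e.g. a face image)."""

class SpecifyingUnit(Protocol):
    def specify(self, biometric_info: bytes) -> Optional[dict]:
        """Return the boarding reservation information matched to the biometric info."""

class OutputUnit(Protocol):
    def output(self, reservation: dict) -> None:
        """Emit information that supports the procedure at the boarding gate."""

class InformationProcessingApparatus:
    """Thin composition of the acquisition, specifying, and output units."""
    def __init__(self, acq: AcquisitionUnit, spec: SpecifyingUnit, out: OutputUnit):
        self.acq, self.spec, self.out = acq, spec, out

    def handle(self, captured_image: bytes) -> None:
        bio = self.acq.acquire(captured_image)
        if bio is None:
            return
        reservation = self.spec.specify(bio)
        if reservation is not None:
            self.out.output(reservation)
```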
- the configuration of the present invention is applicable not only to an international flight but also to a domestic flight.
- In that case, in addition to the immigration procedure, the 1:1 matching process between a passport face image and a captured face image may be omitted.
- A face image captured at the time of purchasing a boarding ticket can be used as a registered biometric image.
- When a terminal such as a smartphone or a personal computer is used to purchase a boarding ticket or perform check-in online, if a face image captured by the terminal is registered, the user can also board an airplane through face authentication at the airport A.
- Although the check-in terminal 20 reads a passport face image from a passport and thereby issuance of a token ID is requested of the management server 10 in the first example embodiment described above, such issuance may instead be requested via the automatic baggage drop-off machine 30 or the security inspection apparatus 40, taking the case of an online check-in procedure into consideration. That is, the management server 10 may acquire a passport face image and a target biometric image from any one of the terminal apparatuses that perform the operations at the time of departure of the user U. Issuance of a token ID may be requested in the first procedure operation performed out of a series of procedure operations performed at the time of departure.
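- A sketch of token ID issuance at the first touch point, assuming a 1:1 face verification between the passport face image and the face captured on the spot, followed by registration in the two databases. The verification function, identifiers, and dictionary stores are assumptions for illustration.

```python
import uuid

def issue_token(passport_face, captured_face, reservation, token_db, operation_db, verify):
    """Issue a token ID after a 1:1 match between the passport face image and the
    captured face image. `verify` is the 1:1 face matcher and the two dicts stand in
    for the token ID information DB 11 and the operation information DB 13."""
    if not verify(passport_face, captured_face):
        return None                                     # identity not confirmed
    token_id = str(uuid.uuid4())
    token_db[token_id] = {"registered_face": captured_face, "invalid_flag": "1"}
    operation_db[token_id] = {"reservation_number": reservation["reservation_number"],
                              "airline_code": reservation["airline_code"]}
    return token_id

token_db, operation_db = {}, {}
verify = lambda a, b: a == b                            # trivial stand-in matcher
tid = issue_token("face-A", "face-A",
                  {"reservation_number": "ABC123", "airline_code": "NX"},
                  token_db, operation_db, verify)
print(tid is not None, operation_db[tid])
```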
- the operation information DB 13 of the management server 10 stores only some of the items of boarding reservation information (a reservation number and an airline code).
- A terminal apparatus (the check-in terminal 20 or the like) at each touch point inquires boarding reservation information of the reservation system 2 based on a reservation number at the time of performing a procedure.
- However, all pieces of boarding reservation information acquired from the reservation system 2 may be copied into the operation information DB 13 as operation information.
- In that case, since the terminal apparatus at a subsequent touch point can acquire boarding reservation information from the management server 10 (the operation information DB 13), the inquiry to the reservation system (airline system) 2 may be omitted, or the inquiry method may be changed if necessary.
- the process between the boarding gate apparatus 60 , the management server 10 , and the reservation system 2 can be performed in the following procedures at a boarding gate (touch point P 5 ).
- the boarding gate apparatus 60 captures a face of a passenger and then transmits the face image to the management server 10 .
- the management server 10 performs face matching with a registered face image registered in the token ID information DB 11 and acquires the token ID corresponding to a registered face image of a successful matching.
- the management server 10 transmits, to the boarding gate apparatus 60 , boarding reservation information (boarding ticket data) acquired from the operation information DB 13 by using the token ID as a key.
- the boarding gate apparatus 60 transmits the acquired boarding reservation information to the reservation system 2 .
- The reservation system 2 matches the acquired boarding reservation information (boarding ticket data) with the boarding reservation information stored in the reservation information DB 3, determines whether or not to permit boarding of the passenger, and transmits the determination result to the boarding gate apparatus 60. Accordingly, the boarding gate apparatus 60 can control opening of the gate 611 based on the determination result received from the reservation system 2.
- Although the information used for supporting a procedure is output to the operation terminal 70 used by the staff member S in the example embodiments described above, the terminal apparatus to which such information is output is not limited thereto.
- such information may be output to a signage terminal arranged in a peripheral region of a boarding gate or on a pathway or to a mobile terminal carried by a passenger.
- If a contact address such as a mail address of a mobile terminal of a passenger is registered in a database (not illustrated), such information can be output to the mobile terminal of the passenger by using the contact address.
- each of the example embodiments further includes a processing method that stores, in a storage medium, a program that causes the configuration of each of the example embodiments to operate so as to implement the function of each of the example embodiments described above, reads the program stored in the storage medium as a code, and executes the program in a computer. That is, the scope of each of the example embodiments also includes a computer readable storage medium. Further, each of the example embodiments includes not only the storage medium in which the program described above is stored but also the program itself.
- As the storage medium, for example, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a magnetic tape, a nonvolatile memory card, or the like can be used.
- The scope of each of the example embodiments also includes an example that operates on an OS to perform a process in cooperation with other software or a function of an add-in board, without being limited to an example that performs a process by an individual program stored in the storage medium.
- An information processing apparatus comprising:
- an acquisition unit that acquires, from a captured image obtained by capturing a passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger;
- a specifying unit that specifies boarding reservation information regarding the boarding by using the acquired biometric information; and
- an output unit that outputs information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
- The information processing apparatus according to supplementary note 1, further comprising a control unit that issues a token ID corresponding to registered biometric information of the passenger for each passenger and associates the registered biometric information with the boarding reservation information in advance via the token ID, wherein the specifying unit specifies the boarding reservation information on the passenger based on the token ID corresponding to the registered biometric information successfully matched with the biometric information.
- the information processing apparatus according to supplementary note 1 or 2, wherein the information includes priority in a procedure of the passenger performed at the boarding gate.
- the boarding reservation information includes a class of a seat in the airplane or a category of the passenger set by an airline company
- the output unit outputs the priority based on the class or the category.
- the information processing apparatus according to supplementary note 4, wherein the output unit outputs a waiting place prepared for boarding corresponding to the priority.
- the information processing apparatus wherein when a person having different priority from the priority corresponding to the waiting place is included in passengers waiting in the waiting place, the output unit outputs the information that suggests guide to another waiting place corresponding to the priority of the person.
- the boarding reservation information further includes registered information regarding a predetermined accompanying person
- the output unit outputs the priority based on whether or not the accompanying person is present.
- the acquisition unit acquires another biometric information of the passenger from another captured image obtained by capturing the passenger at the boarding gate
- the specifying unit specifies the boarding reservation information on the passenger by using the another biometric information
- the output unit outputs the information that suggests a check operation with respect to the accompanying person.
- the specifying unit specifies each boarding reservation information on the passenger by using the biometric information of all passengers included in the captured image
- the output unit outputs a list screen including the specified boarding reservation information.
- the specifying unit specifies the boarding reservation information by using the biometric information of the passenger specified on a screen displaying the captured image
- the output unit outputs the specified boarding reservation information on the screen.
- an extraction unit that extracts the passenger corresponding to the boarding reservation information satisfying the extraction condition out of passengers included in the captured image.
- the information processing apparatus according to any one of supplementary notes 1 to 12 further comprising a selection unit that, out of images captured by an image capture device arranged in an airport, selects an image from which the biometric information of the passenger to be searched for is acquired,
- the output unit outputs positional information on the passenger in the airport based on the selected image.
- An information processing method comprising steps of: acquiring, from a captured image obtained by capturing a passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger; specifying boarding reservation information regarding the boarding by using the acquired biometric information; and outputting information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
- A storage medium storing a program that causes a computer to perform: acquiring, from a captured image obtained by capturing a passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger; specifying boarding reservation information regarding the boarding by using the acquired biometric information; and outputting information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
Abstract
Description
- The present invention relates to an information processing apparatus, an information processing method, and a storage medium.
-
Patent Literature 1 discloses a ticketless boarding system that performs a procedure with face authentication at a plurality of checkpoints (a check-in lobby, a security inspection area, a boarding gate, or the like) by using biometric information (face image) of a passenger. -
- PTL 1: Japanese Patent Application Laid-Open No. 2007-79656
- As illustrated in Patent Literature 1 as an example, it is expected that throughput in an airport is improved by facilitating the use of terminals having a face authentication function. In the conventional system illustrated as an example in Patent Literature 1, however, supporting a procedure at a boarding gate for a passenger waiting around the boarding gate is not assumed.
- Accordingly, in view of the above problem, the present invention intends to provide an information processing apparatus, an information processing method, and a storage medium that support a procedure of a passenger at a boarding gate.
- According to one example aspect of the present invention, provided is an information processing apparatus including: an acquisition unit that acquires, from a captured image obtained by capturing a passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger; a specifying unit that specifies boarding reservation information regarding the boarding by using the acquired biometric information; and an output unit that outputs information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
- According to another example aspect of the present invention, provided is an information processing method including steps of: acquiring, from a captured image obtained by capturing a passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger; specifying boarding reservation information regarding the boarding by using the acquired biometric information; and outputting information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
- According to yet another example aspect of the present invention, provided is a storage medium storing a program that causes a computer to perform steps of: acquiring, from a captured image obtained by capturing a passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger; specifying boarding reservation information regarding the boarding by using the acquired biometric information; and outputting information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
- According to the present invention, an information processing apparatus, an information processing method, and a storage medium that support a procedure of a passenger at a boarding gate can be provided.
-
FIG. 1 is a block diagram illustrating an example of an overall configuration of an information processing system in a first example embodiment. -
FIG. 2 is a diagram illustrating one example of the arrangement of boarding gate apparatuses, operation terminals, and cameras in the first example embodiment. -
FIG. 3 is a diagram illustrating one example of information stored in a token ID information database in the first example embodiment. -
FIG. 4 is a diagram illustrating one example of information stored in a passage history information database in the first example embodiment. -
FIG. 5 is a diagram illustrating one example of information stored in an operation information database in the first example embodiment. -
FIG. 6 is a diagram illustrating one example of information stored in a reservation information database in the first example embodiment. -
FIG. 7 is a block diagram illustrating one example of a hardware configuration of a management server in the first example embodiment. -
FIG. 8 is a block diagram illustrating one example of a hardware configuration of a check-in terminal in the first example embodiment. -
FIG. 9 is a block diagram illustrating one example of a hardware configuration of an automatic baggage drop-off machine in the first example embodiment. -
FIG. 10 is a block diagram illustrating one example of a hardware configuration of a security inspection apparatus in the first example embodiment. -
FIG. 11 is a block diagram illustrating one example of a hardware configuration of an automated gate apparatus in the first example embodiment. -
FIG. 12 is a block diagram illustrating one example of a hardware configuration of a boarding gate apparatus in the first example embodiment. -
FIG. 13 is a block diagram illustrating one example of a hardware configuration of an operation terminal in the first example embodiment. -
FIG. 14 is a sequence diagram illustrating one example of the process in the reservation system, the check-in terminal, and the management server in the first example embodiment. -
FIG. 15 is a sequence diagram illustrating one example of the process in the reservation system, the automatic baggage drop-off machine, and the management server in the first example embodiment. -
FIG. 16 is a sequence diagram illustrating one example of the process in the reservation system, the security inspection apparatus, and the management server in the first example embodiment. -
FIG. 17 is a sequence diagram illustrating one example of the process in the reservation system, the automated gate apparatus, and the management server in the first example embodiment. -
FIG. 18 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment. -
FIG. 19 is a diagram illustrating one example of the operation screen displayed on the operation terminal in the first example embodiment. -
FIG. 20 is a diagram illustrating one example of a passenger list screen displayed on the operation terminal in the first example embodiment. -
FIG. 21 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment. -
FIG. 22 is a diagram illustrating one example of a guidance instruction screen for priority boarding displayed on the operation terminal in the first example embodiment. -
FIG. 23 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment. -
FIG. 24 is a diagram illustrating one example of a guide instruction screen for a waiting lane displayed on the operation terminal in the first example embodiment. -
FIG. 25 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment. -
FIG. 26 is a diagram illustrating one example of a passenger inquiry screen displayed on the operation terminal in the first example embodiment. -
FIG. 27 is a sequence diagram illustrating one example of the process in the reservation system, the operation terminal, the management server, and the camera in the first example embodiment. -
FIG. 28 is a diagram illustrating one example of an extraction condition entry screen displayed on the operation terminal in the first example embodiment. -
FIG. 29 is a diagram illustrating one example of a passenger extraction result screen displayed on the operation terminal in the first example embodiment. -
FIG. 30 is a sequence diagram illustrating one example of the process in the reservation system, the boarding gate apparatus, and the management server in the first example embodiment. -
FIG. 31 is a flowchart illustrating one example of a check process of an accompanying person in a second example embodiment. -
FIG. 32 is a diagram illustrating one example of a check instruction screen of an accompanying person displayed on the operation terminal in the second example embodiment. -
FIG. 33 is a diagram illustrating one example of a check instruction screen of an accompanying person displayed on the operation terminal in the second example embodiment. -
FIG. 34 is a block diagram illustrating an example of an overall configuration of an information processing system in a third example embodiment. -
FIG. 35 is a sequence diagram illustrating one example of the process in the operation terminal, the management server, and the camera in the third example embodiment. -
FIG. 36 is a diagram illustrating one example of a passenger search result screen displayed on the operation terminal in the third example embodiment. -
FIG. 37 is a block diagram illustrating an example of an overall configuration of an information processing apparatus in a fourth example embodiment. - Illustrative example embodiments of the present invention will be described below with reference to the drawings. Throughout the drawings, the same components or corresponding components are labeled with the same references, and the description thereof may be omitted or simplified.
-
FIG. 1 is a block diagram illustrating an example of the overall configuration of an information processing system 1 in the present example embodiment. The information processing system 1 is a computer system that manages and supports operations regarding immigration inspection procedures for a user (hereafter, referred to as a “passenger”) U at an airport A. The information processing system 1 is operated by, for example, a public institution such as an immigration administration office, or by a contractor entrusted with the operations of such an institution. Unlike the information processing system 1, a reservation system 2 is a computer system provided in an airline company. The reservation system 2 includes a reservation information database (DB) 3 that manages boarding reservation information. Note that, although only one reservation system 2 is illustrated in FIG. 1 for simplicity, a reservation system 2 is provided for each of a plurality of airline companies. - In the
information processing system 1 of the present example embodiment, a check-in terminal 20, an automatic baggage drop-off machine 30, a security inspection apparatus 40, an automated gate apparatus 50, and a boarding gate apparatus 60 are each connected to a common management server 10 via the network NW1. The security inspection apparatus 40, the automated gate apparatus 50, and the boarding gate apparatus 60 are installed in a security area SA illustrated by a dashed line. Similarly, in the reservation system 2 of each airline company, the check-in terminal 20, the automatic baggage drop-off machine 30, the security inspection apparatus 40, the automated gate apparatus 50, and the boarding gate apparatus 60 are each connected to a server (not illustrated) via the network NW2. Note that an operation terminal 70 used by a staff member S is connected to the networks NW1 and NW2 via access points (not illustrated). - The networks NW1 and NW2 are each formed of a local area network (LAN) including a LAN of the airport A, a wide area network (WAN), a mobile communication network, or the like. The connection scheme is not limited to a wired scheme and may be a wireless scheme. The networks NW1 and NW2 are networks different from each other. That is, in the present example embodiment, the
information processing system 1 is not directly connected to the reservation system 2. - The
management server 10 is provided in a facility of an airport company that runs the airport A, for example. Note that the management server 10 may be a cloud server instead of a server installed in a facility where an operation is actually provided. Further, the management server 10 is not necessarily required to be a single server but may be formed as a server group including a plurality of servers. - As illustrated in
FIG. 1 , inspection procedures at departure from a country at the airport A are sequentially performed at five touch points. The relationship between each apparatus and each touch point will be described below. - The check-in terminal 20 is installed in a check-in lobby (hereafter, referred to as “touch point P1”) in the airport A. The check-in terminal 20 is a self-service terminal by which the user U performs a check-in procedure (boarding procedure) by himself/herself. Upon completion of the procedure at the touch point P1, the user U moves to a baggage counter or a security inspection area.
- The automatic baggage drop-
off machine 30 is installed at a baggage counter (hereafter, referred to as “touch point P2”) in the airport A. The automatic baggage drop-off machine 30 is a self-service terminal which is operated by the user U by himself/herself to perform a procedure of dropping off baggage that is not carried in an airplane (baggage drop-off procedure). Upon completion of the procedure at the touch point P2, the user U moves to a security inspection area. Note that, when the user U does not drop off his/her baggage, the procedure at the touch point P2 is omitted. - The
security inspection apparatus 40 is installed in a security inspection area (hereafter, referred to as “touch point P3”) in the airport A. The security inspection apparatus 40 is an apparatus that uses a metal detector to check whether or not the user U is wearing a metal object that may be dangerous. Note that the term “security inspection apparatus” in the present example embodiment is not limited to a metal detector and also covers an X-ray inspection apparatus that uses an X-ray to check whether or not there is a dangerous object in carry-on baggage or the like, a terminal apparatus of a passenger reconciliation system (PRS) that determines whether or not to permit passage of the user U at the entrance of the security inspection area, and the like. The user U who has completed a check-in procedure or an automatic baggage drop-off procedure goes through a security inspection procedure performed by the security inspection apparatus 40 in the security inspection area. Upon completion of the procedure at the touch point P3, the user U moves to an immigration area. - The
automated gate apparatus 50 is installed in an immigration area (hereafter, referred to as “touch point P4”) in the airport A. The automated gate apparatus 50 is an apparatus that automatically performs an immigration procedure of the user U. Upon completion of the procedure at the touch point P4, the user U moves to a departure area in which a duty-free shop and a boarding gate are provided. - The
boarding gate apparatus 60 is a passage control apparatus installed at each boarding gate (hereafter, referred to as “touch point P5”) in the departure area. The boarding gate apparatus 60 confirms that the user U is a passenger who is allowed to board the airplane via the boarding gate. Upon completion of the procedure at the touch point P5, the user U boards the airplane and departs from the country. - The
operation terminal 70 is a terminal apparatus used by the staff member S for his or her operations. The operation terminal 70 may be, for example, a personal computer, a tablet terminal, a smartphone, or the like, but is not limited thereto. The operation terminal 70 receives information transmitted from the management server 10 and displays the information on a screen. The management server 10 of the present example embodiment transmits, to the operation terminal 70, information used for supporting a procedure performed by a passenger at a boarding gate. Details of the information will be described later. - A plurality of
cameras 80 are image capture devices arranged in peripheral regions (adjacent regions) of the boarding gates, each capturing the peripheral region of its boarding gate. Each camera 80 is attached to a ceiling, a wall, a pillar, or the like, for example, so as to be able to capture the face of a user U who is present in the peripheral region of the boarding gate and is waiting for boarding. The camera 80 may be either a fixed type or a movable type. -
FIG. 2 is a diagram illustrating one example of the arrangement of theboarding gate apparatuses 60, theoperation terminals 70, and thecameras 80. This example illustrates a case where the plurality ofcameras 80 are installed in the peripheral region of the boarding gate and capture passengers at various capturing angles. It is preferable that eachoperation terminal 70 can selectively switch captured images captured by the plurality ofcameras 80 on an operation screen. Further, two types of mobile type and fixed type are illustrated as theoperation terminal 70 used by the staff member S. Further, L1 to L3 each represent a waiting area (waiting lane) as an example where the users U wait in a line before boarding. - In general, an airline company categorizes the users U into a plurality of groups in advance based on various conditions such as a seat class, a membership category, accompanying person information, a seat position, or the like and guides the users U to the boarding gate in predetermined group order. For example, a passenger accompanying an infant or a young child, a passenger whose seat class is the first class, a passenger recognized as an upper class member by the airline company, or the like is categorized into a group having relatively higher priority in boarding guide. The group having high priority is to receive a priority boarding service.
- Further, as illustrated in
FIG. 1 , themanagement server 10 has a tokenID information DB 11, a passagehistory information DB 12, and anoperation information DB 13. Note that the database included in themanagement server 10 is not limited to these databases. -
FIG. 3 is a diagram illustrating one example of information stored in the token ID information DB 11. The token ID information DB 11 has data items of “token ID”, “group ID”, “feature amount”, “registered face image”, “token issuance time”, “token issuance device name”, “invalid flag”, “invalidated time”, and “accompanying person ID”. The token ID is an identifier that uniquely identifies ID information. In the present example embodiment, the token ID is issued temporarily on condition that the matching between a passport face image read from a passport at the touch point P1 and a face image obtained by capturing the user U holding the passport is successful. Then, when the user U finishes the procedure at the touch point P5 (boarding gate), the token ID is invalidated. That is, the token ID is a one-time ID having a limited lifetime. - The group ID is an identifier used for grouping ID information. For example, the same group ID is set for a user U who has an accompanying person or for users U traveling as a party. The feature amount is a value extracted from biometric information. The registered face image is a face image registered for the user U. The term biometric information in the present example embodiment means a face image and a feature amount extracted from the face image. Note that the biometric information is not limited to a face image and a face feature amount. That is, a fingerprint image, a palm-print image, a pinna image, an iris image, or the like may be used as the biometric information of the user U to perform biometric authentication.
- The token issuance time is the time when the
management server 10 issues a token ID. The token issuance device name is a name of a device from which a registered face image that triggers issuance of a token ID is acquired. The invalid flag is flag information indicating whether or not the token ID is currently valid. Once a token ID is issued, the invalid flag becomes a value of “1” indicating that the token ID is valid. Further, once a predetermined condition is satisfied, the invalid flag is updated to a value of “0” indicating that the token ID is invalid. The invalidated time is a timestamp when the invalid flag is disabled. - The accompanying person ID is a token ID issued for a person who boards on an airplane with support from another passenger, for example, a person such as an infant, a young child, or the like (hereafter, referred to as “supported person”). A person associated with the accompanying person ID is a supporting side person such as a guardian, for example, a father or a mother or a helper (hereafter, referred to as “supporting person”). The accompanying person ID is issued to a supported person at the same time when a supporting person first performs a procedure, for example. In the present example embodiment, a case where a single supported person is associated with a single supporting person will be described. However, a plurality of supported persons may be associated with a single supporting person.
-
FIG. 4 is a diagram illustrating one example of information stored in the passage history information DB 12. The passage history information DB 12 has data items of “passage history ID”, “token ID”, “passage time”, “device name”, “operation system type”, and “passage touch point”. The passage history ID is an identifier that uniquely identifies passage history information. The passage time is a timestamp of when a passenger passes through one of the touch points P1 to P5. The device name is the machine name of the terminal used in the procedure at the touch points P1 to P5. The operation system type is the type of the operation system to which the terminal belongs. The passage touch point is the name of the touch point, among P1 to P5, that the passenger has passed through. Note that the management server 10 can extract the passage history information on a token ID basis to recognize up to which touch point the user U has completed the procedures. -
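As a concrete illustration of the two databases just described, the following is a minimal Python sketch (hypothetical record classes and an in-memory list standing in for the token ID information DB 11 and the passage history information DB 12; none of these names come from the embodiment itself) of how passage history records keyed by a token ID could be used to recognize up to which touch point a passenger has completed the procedures.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class TokenRecord:
    """Hypothetical, simplified record mirroring the data items of FIG. 3."""
    token_id: str
    group_id: Optional[str]
    feature_amount: bytes              # value extracted from the registered face image
    registered_face_image: bytes
    token_issuance_time: datetime
    token_issuance_device_name: str
    invalid_flag: str = "1"            # "1" = valid, "0" = invalidated
    invalidated_time: Optional[datetime] = None
    accompanying_person_id: Optional[str] = None

@dataclass
class PassageRecord:
    """Hypothetical, simplified record mirroring the data items of FIG. 4."""
    passage_history_id: str
    token_id: str
    passage_time: datetime
    device_name: str
    operation_system_type: str
    passage_touch_point: str           # "P1" to "P5"

TOUCH_POINT_ORDER = ["P1", "P2", "P3", "P4", "P5"]

def furthest_completed_touch_point(token_id: str,
                                   passage_db: list[PassageRecord]) -> Optional[str]:
    """Return the furthest touch point recorded for the given token ID, or None
    if the passenger has not yet passed any touch point."""
    passed = [r.passage_touch_point for r in passage_db if r.token_id == token_id]
    if not passed:
        return None
    return max(passed, key=TOUCH_POINT_ORDER.index)
```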
FIG. 5 is a diagram illustrating one example of information stored in the operation information DB 13. The operation information DB 13 has data items of “token ID”, “reservation number”, “airline code”, and “operation information”. The reservation number is an identifier that uniquely identifies reservation information for boarding an airplane. The airline code is an identifier that uniquely identifies an airline company. The operation information is arbitrary information obtained through an operation at each touch point. -
FIG. 6 is a diagram illustrating one example of information stored in the reservation information DB 3. The reservation information DB 3 has data items of “reservation number”, “airline code”, “passenger name”, “departure place”, “destination place”, “flight number”, “date of flight”, “seat number”, “seat class” (for example, first class/business class/economy class), “nationality”, “passport number”, “family name”, “first name”, “date of birth”, “gender”, “membership category”, “with or without accompanying person”, and “accompanying person category”. - In the present example embodiment, the
operation information DB 13 and the reservation information DB 3 are associated with each other by a reservation number and an airline code. Specifically, once a terminal apparatus at each touch point (the check-in terminal 20 or the like) reads a reservation number and an airline code from an airline ticket medium presented by a passenger, the terminal apparatus can query the reservation system 2 of the airline company corresponding to the airline code for boarding reservation information based on the reservation number. Note that the method of obtaining boarding reservation information from the reservation system 2 is not limited thereto. - Next, with reference to
FIG. 7 to FIG. 13 , an example of a hardware configuration of each apparatus forming the information processing system 1 will be described. Note that, since devices having the same name but having different references in FIG. 7 to FIG. 13 are devices having a similar function, detailed description thereof will be omitted in the subsequent drawings. -
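Before turning to the hardware, the association described above between the operation information DB 13 and the reservation information DB 3 can be illustrated with a short, hypothetical sketch: a token ID is resolved to a reservation number and an airline code, which in turn select the airline's reservation system and the boarding reservation information. The in-memory dictionaries and names below are assumptions for illustration only, standing in for the databases and the per-airline reservation systems 2.

```python
# Hypothetical in-memory stand-ins; in the embodiment these correspond to the
# operation information DB 13 on the management server 10 and the reservation
# system 2 of each airline company (reached over the network NW2).
OPERATION_INFO_DB: dict[str, dict] = {}    # token ID -> {"reservation_number", "airline_code", ...}
RESERVATION_SYSTEMS: dict[str, dict] = {}  # airline code -> {reservation number -> boarding reservation info}

def resolve_boarding_reservation(token_id: str):
    """Resolve a token ID to boarding reservation information by way of the
    reservation number and airline code stored in the operation information DB."""
    record = OPERATION_INFO_DB.get(token_id)
    if record is None:
        return None  # no operation information registered for this token ID
    airline_db = RESERVATION_SYSTEMS.get(record["airline_code"], {})
    return airline_db.get(record["reservation_number"])
```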
FIG. 7 is a block diagram illustrating one example of a hardware configuration of the management server 10. As illustrated in FIG. 7 , the management server 10 has a central processing unit (CPU) 101, a random access memory (RAM) 102, a storage device 103, and a communication I/F 104. Each device is connected to a bus line 105. - The
CPU 101 is a processor that has a function of performing a predetermined operation in accordance with a program stored in the storage device 103 and controlling each component of the management server 10. The RAM 102 is formed of a volatile storage medium and provides a temporary memory area required for the operation of the CPU 101. - The
storage device 103 is formed of a storage medium such as a nonvolatile memory, a hard disk drive, or the like and functions as a storage unit. The storage device 103 stores a program executed by the CPU 101, data referenced by the CPU 101 when the program is executed, or the like. - The communication I/
F 104 is a communication interface based on a standard such as Ethernet (registered trademark), Wi-Fi (registered trademark), 4G, or the like, which is a module used for communicating with the check-in terminal 20 or the like. -
FIG. 8 is a block diagram illustrating one example of a hardware configuration of the check-in terminal 20. As illustrated in FIG. 8 , the check-in terminal 20 has a CPU 201, a RAM 202, a storage device 203, a communication I/F 204, an input device 206, a display device 207, a medium reading device 208, and a biometric information acquisition device 209. Each device is connected to a bus line 205. - The
input device 206 is, for example, a touch panel, a keyboard, or another input device. The display device 207 is a liquid crystal display device, an organic light emitting diode (OLED) display device, or the like and is used for displaying a moving image, a still image, a text, or the like. In the check-in terminal 20 of the present example embodiment, the input device 206 and the display device 207 are integrally formed as a touch panel. - The
medium reading device 208 is a device that reads a passport or an airline ticket medium of the user U and acquires information recorded on the passport or the airline ticket. The airline ticket medium may be, for example, a paper airline ticket, a mobile terminal that displays a copy of an e-ticket, or the like. The medium reading device 208 is formed of, for example, a code reader, an image scanner, a contactless integrated circuit (IC) reader, an optical character reader (OCR) device, or the like, and acquires information from various media held over its reading unit. - The biometric
information acquisition device 209 is a device that acquires a face image of the user U as biometric information of the user U. For example, the biometric information acquisition device 209 is a digital camera that captures the face of the user U standing in front of the check-in terminal 20 and thereby acquires a face image. -
FIG. 9 is a block diagram illustrating one example of a hardware configuration of the automatic baggage drop-off machine 30. As illustrated in FIG. 9 , the automatic baggage drop-off machine 30 has a CPU 301, a RAM 302, a storage device 303, a communication I/F 304, an input device 306, a display device 307, a medium reading device 308, a biometric information acquisition device 309, a baggage transport device 310, and an output device 311. Each device is connected to a bus line 305. - The
baggage transport device 310 transports the baggage of the user U in order to load it onto the airplane that the user U will board, when the identity verification of the user U is successful. The baggage transport device 310 transports the baggage that the user U has placed on a reception part and to which a baggage tag has been attached to a cargo handling section. - The
output device 311 is a device that outputs a baggage tag to be attached to dropped-off baggage. Further, the output device 311 outputs a baggage claim tag required for claiming the baggage after arriving at the destination. Note that a baggage tag or a baggage claim tag is associated with at least one of passport information and boarding information. -
FIG. 10 is a block diagram illustrating one example of a hardware configuration of the security inspection apparatus 40. As illustrated in FIG. 10 , the security inspection apparatus 40 has a CPU 401, a RAM 402, a storage device 403, a communication I/F 404, an input device 406, a display device 407, a medium reading device 408, a biometric information acquisition device 409, and a metal detector gate 410. Each device is connected to a bus line 405. - The
metal detector gate 410 is a gate-type metal detector and detects metal objects worn by the user U passing through the metal detector gate 410. -
FIG. 11 is a block diagram illustrating one example of a hardware configuration of the automated gate apparatus 50. The automated gate apparatus 50 has a CPU 501, a RAM 502, a storage device 503, a communication I/F 504, an input device 506, a display device 507, a medium reading device 508, a biometric information acquisition device 509, and a gate 511. Each device is connected to a bus line 505. Note that an automated gate apparatus 50 arranged at an entry inspection site has the same hardware configuration as the automated gate apparatus 50 arranged in the immigration area. - The
gate 511 is closed during standby to block passage of the user U, and transitions to an opened state to permit passage of the user U under the control of the CPU 501 when the identity verification of the user U at the automated gate apparatus 50 is successful and the user U has completed the immigration procedure. The scheme of the gate 511 is not particularly limited and may be, for example, a flapper gate in which a flapper provided on one side of the pathway, or flappers provided on both sides of the pathway, are opened and closed, a turnstile gate in which three bars rotate, or the like. -
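The open/close behavior of the gate 511 described above amounts to a simple condition. The following minimal Python sketch (purely illustrative; not the actual control program of the apparatus) summarizes that rule.

```python
from enum import Enum

class GateState(Enum):
    CLOSED = "closed"  # standby: passage of the user U is blocked
    OPEN = "open"      # passage of the user U is permitted

def gate_state(identity_verified: bool, immigration_completed: bool) -> GateState:
    """Open the gate only when identity verification succeeded and the
    immigration procedure has been completed; otherwise remain closed."""
    if identity_verified and immigration_completed:
        return GateState.OPEN
    return GateState.CLOSED
```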
FIG. 12 is a block diagram illustrating one example of a hardware configuration of the boarding gate apparatus 60. As illustrated in FIG. 12 , the boarding gate apparatus 60 has a CPU 601, a RAM 602, a storage device 603, a communication I/F 604, an input device 606, a display device 607, a biometric information acquisition device 609, and a gate 611. Each device is connected to a bus line 605. -
FIG. 13 is a block diagram illustrating one example of a hardware configuration of the operation terminal 70. As illustrated in FIG. 13 , the operation terminal 70 has a CPU 701, a RAM 702, a storage device 703, a communication I/F 704, an input device 706, and a display device 707. Each device is connected to a bus line 705. - Next, the operation of each apparatus in the
information processing system 1 in the present example embodiment will be described based on FIG. 14 to FIG. 30 . -
FIG. 14 is a sequence diagram illustrating one example of the process in thereservation system 2, the check-in terminal 20, and themanagement server 10. This process is performed when the user U uses the check-in terminal 20 to perform a check-in procedure. - First, the check-in terminal 20 determines whether or not a passport of the user U is held over a reading unit (not illustrated) of the medium reading device 208 (step S101), and stands by until a passport is held over (step S101, NO).
- Next, if it is determined that a passport is held over the reading unit of the medium reading device 208 (step S101, YES), the check-in terminal 20 acquires passport information on the user U from the passport that is held over (step S102). The acquired passport information includes a passport face image of the user U, identity verification information, a passport number, information on a passport issuance country, or the like.
- Next, the check-in terminal 20 captures a face of the user U by using the biometric information acquisition device 209 (step S103) and transmits the face image and the passport information to the management server 10 (step S104). Note that it is preferable to display a screen used for obtaining consent of the user U before capturing a face image.
- In response to receiving information from the check-in terminal 20, the
management server 10 matches, at 1:1, a face image recorded on the passport of the user U (hereafter, referred to as “passport face image”) with a face image captured by the check-in terminal 20 (hereafter, referred to as “target face image”) (step S105). - Next, if it is determined that the matching result between the passport face image and the target face image indicates a successful matching (step S106, YES), the
management server 10 issues a token ID (step S107). The token ID is set to a unique value based on date and time or a sequence number at a process, for example. - Next, the
management server 10 uses the target face image as a registered face image and registers a relationship between the token ID and the registered face image in the token ID information DB 11 (step S108). - In the present example embodiment, the reason why a face image captured on site (target face image) is used as the registered face image is that the lifecycle of a token ID is terminated within the day, that a captured image is closer to an image captured in the subsequent authentication process than a passport face image in a quality (appearance), or the like. However, a passport face image may be set as a registered face image (registered biometric information) instead of a target face image (captured face image). For example, when a lifecycle of a token ID spans a long term (for example, when a token ID is validated for a certain lifecycle if the user U has a membership, or the like in airline services), a face image of a passport or a license card may be set as a registered face image.
- Next, the
management server 10 transmits the issued token ID and a matching result of a successful matching to the check-in terminal 20 (step S109). - On the other hand, if it is determined the matching result between the passport face image and the target face image indicates an unsuccessful matching (step S106, NO), the
management server 10 transmits the matching result of the unsuccessful matching to the check-in terminal 20 (step S110). - Next, based on the matching result of the successful matching received from the
management server 10, if the check-in terminal 20 determines that the check-in procedure can be performed (step S111, YES), the process proceeds to step S112. Contrarily, based on the matching result of the unsuccessful matching received from themanagement server 10, if the check-in terminal 20 determines that the check-in procedure is not performed (step S111, NO), the check-in terminal 20 notifies the user U of an error message (step S113). - In step S112, the check-in terminal 20 determines whether or not an airline ticket medium of the user U is held over the reading unit of the
medium reading device 208. The check-in terminal 20 stands by until an airline ticket medium is held over (step S112, NO). - Next, if it is determined that an airline ticket medium is held over the reading unit of the medium reading device 208 (step S112, YES), the check-in terminal 20 acquires recorded data such as a reservation number, an airline code, and the like from the airline ticket medium that is held over (step S114).
- Next, the check-in terminal 20 transmits the recorded data to the
reservation system 2 of an airline company corresponding to the airline code (step S115) and requests matching between the recorded data and boarding reservation information. - In response to receiving recorded data from the check-in terminal 20, the
reservation system 2 matches the recorded data with boarding reservation information stored in the reservation information DB 3 (step S116) and transmits the matching result to the check-in terminal 20 (step S117). - Next, in response to receiving the matching result of a successful matching from the
reservation system 2, the check-in terminal 20 performs a check-in procedure such as confirmation of an itinerary, selection of a seat, or the like based on the input information on the user U (step S118). If there is no matching in the matching result in thereservation system 2, the check-in terminal 20 may notify the user U of an error without performing a check-in procedure. The check-in terminal 20 then transmits, to themanagement server 10, a token ID, operation information, and passage history information indicating completion of procedure at the check-in terminal 20 (step S119). Note that the operation information includes at least a reservation number and an airline code. Further, the passage history information includes information such as a passage time at the touch point P1, a device name of a terminal used for the procedure, or the like. - Next, in response to receiving information from the check-in terminal 20, the
management server 10 registers passage history information indicating a relationship between the token ID and the passage information at the touch point P1 in the passage history information DB 12 (step S120). Themanagement server 10 then registers the operation information received from the check-in terminal 20 in theoperation information DB 13 if necessary (step S121). - As described above, a target face image (captured face image) successfully matched with a passport face image acquired from a passport in a check-in procedure is registered in the token
ID information DB 11 as a registered face image, and the registered face image and the operation information in the operation information DB 13 are associated with each other by the issued token ID. This enables biometric authentication by using face matching between a captured face image and a registered face image at each subsequent touch point. -
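The check-in flow of FIG. 14 can thus be summarized as: a 1:1 match between the passport face image and the captured target face image triggers issuance of a token ID and registration of the target face image as the registered face image. The following hypothetical Python sketch outlines that step; match_faces() and TOKEN_ID_DB are illustrative stand-ins for the face matching engine and the token ID information DB 11, and the token ID here is simply a UUID rather than the date/time- or sequence-based value mentioned above.

```python
import uuid
from datetime import datetime
from typing import Optional

TOKEN_ID_DB: dict[str, dict] = {}  # stand-in for the token ID information DB 11

def match_faces(face_a: bytes, face_b: bytes) -> bool:
    # Placeholder for 1:1 face matching; a real engine would compute a similarity
    # score between face feature amounts and compare it against a threshold.
    return face_a == face_b

def issue_token_on_check_in(passport_face_image: bytes,
                            target_face_image: bytes,
                            device_name: str) -> Optional[str]:
    """Issue a one-time token ID only when the passport face image and the
    captured (target) face image match; register the target image as the
    registered face image."""
    if not match_faces(passport_face_image, target_face_image):
        return None  # unsuccessful matching: no token ID is issued
    token_id = uuid.uuid4().hex
    TOKEN_ID_DB[token_id] = {
        "registered_face_image": target_face_image,
        "token_issuance_time": datetime.now(),
        "token_issuance_device_name": device_name,
        "invalid_flag": "1",  # "1" indicates the token ID is currently valid
    }
    return token_id
```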
FIG. 15 is a sequence diagram illustrating one example of the process in thereservation system 2, the automatic baggage drop-off machine 30, and themanagement server 10. This process is performed when the user U who completed a check-in procedure performs a baggage drop-off procedure if necessary. - The automatic baggage drop-
off machine 30 continuously or periodically captures the area in front of the apparatus and determines whether or not a face of the user U standing in front of the automatic baggage drop-off machine 30 is detected in the captured image (step S201). The automatic baggage drop-off machine 30 stands by until a face of the user U is detected in an image by the biometric information acquisition device 309 (step S201, NO). - If it is determined that a face of the user U is detected by the biometric information acquisition device 309 (step S201, YES), the automatic baggage drop-
off machine 30 captures the face of the user U and acquires the face image of the user U as a target face image (step S202). - Next, the automatic baggage drop-
off machine 30 transmits the target face image of the user U captured by the biometricinformation acquisition device 309 to themanagement server 10 together with a matching request (step S203). Thereby, the automatic baggage drop-off machine 30 requests themanagement server 10 to match, at 1:N, the target face image of the user U captured by the biometricinformation acquisition device 309 with a plurality of registered face images registered in the tokenID information DB 11 of themanagement server 10. - In response to receiving the target face image and the matching request from the automatic baggage drop-
off machine 30, themanagement server 10 performs matching of the face image of the user U (step S204). That is, themanagement server 10 matches, at 1:N, the target face image received from the automatic baggage drop-off machine 30 with a plurality of registered face images registered in the tokenID information DB 11. Note that the registered face images to be matched are limited to images associated with the token ID whose invalid flag has a value of “1” (valid). - Herein, if the
management server 10 determines that the matching result indicates an unsuccessful matching (step S205, NO), themanagement server 10 transmits the unsuccessful matching result to the automatic baggage drop-off machine 30 (step S208), and the process proceeds to step S209. Contrarily, if themanagement server 10 determines that the matching result indicates a successful matching (step S205, YES), the process proceeds to step S206. - In step S206, the
management server 10 acquires the token ID associated with the successfully matched registered face image in the tokenID information DB 11 and acquires a reservation number and an airline code from theoperation information DB 13 by using the token ID as a key. Themanagement server 10 then transmits the token ID, the reservation number, the airline code, and the matching result to the automatic baggage drop-off machine 30 (step S207). - Next, if it is determined based on the matching result that the procedure can be performed (step S209, YES), the automatic baggage drop-
off machine 30 transmits the reservation number to thereservation system 2 of an airline company corresponding to the airline code (step S210) and inquires boarding reservation information. - Contrarily, if it is determined based on the matching result that the procedure is not performed (step S209, NO), the automatic baggage drop-
off machine 30 notifies the user U of an error message (step S213). - In response to receiving the reservation number from the automatic baggage drop-
off machine 30, thereservation system 2 transmits the corresponding boarding reservation information to the automatic baggage drop-off machine 30 (step S211). - Next, in response to receiving the boarding reservation information from the
reservation system 2, the automatic baggage drop-off machine 30 performs a baggage drop-off procedure of the user U (step S212). - Next, the automatic baggage drop-
off machine 30 transmits, to themanagement server 10, the token ID, the operation information, and passage history information indicating that the baggage drop-off procedure of the user U completed after the matching of the face image (step S214). Note that the passage history information includes information such as the passage time at the touch point P2, a device name of the used terminal, or the like. - In response to receiving the information from the automatic baggage drop-
off machine 30, themanagement server 10 registers, in the passagehistory information DB 12, passage history information indicating the relationship between the token ID and the passage information at the touch point P2 on the user U (step S215). Themanagement server 10 then registers the operation information received from the automatic baggage drop-off machine 30 in theoperation information DB 13 if necessary (step S216). -
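Each subsequent touch point repeats the same server-side pattern: the captured target face image is matched, at 1:N, against the registered face images of token IDs that are still valid. A minimal, hypothetical sketch of that matching step is shown below; the similarity function and threshold are placeholders, not part of the embodiment.

```python
from typing import Optional

def similarity(face_a: bytes, face_b: bytes) -> float:
    # Placeholder for the face matching engine's similarity score.
    return 1.0 if face_a == face_b else 0.0

def match_one_to_n(target_face_image: bytes,
                   token_id_db: dict[str, dict],
                   threshold: float = 0.9) -> Optional[str]:
    """Return the token ID whose registered face image best matches the target
    face image, considering only token IDs whose invalid flag is "1" (valid)."""
    best_token_id, best_score = None, threshold
    for token_id, record in token_id_db.items():
        if record.get("invalid_flag") != "1":
            continue  # invalidated token IDs are excluded from matching
        score = similarity(target_face_image, record["registered_face_image"])
        if score >= best_score:
            best_token_id, best_score = token_id, score
    return best_token_id
```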
FIG. 16 is a sequence diagram illustrating one example of the process in thereservation system 2, thesecurity inspection apparatus 40, and themanagement server 10. This process is performed when the user U who completed a check-in procedure or a baggage drop-off procedure goes through a security inspection procedure. - The
security inspection apparatus 40 continuously or periodically captures the front area of themetal detector gate 410 and determines whether or not a face of the user U standing in front of themetal detector gate 410 is detected in a captured image (step S301). Thesecurity inspection apparatus 40 stands by until a face of the user U is detected in an image by the biometric information acquisition device 409 (step S301, NO). - If it is determined that a face of the user U is detected by the biometric information acquisition device 409 (step S301, YES), the
security inspection apparatus 40 captures the face of the user U and acquires the face image of the user U as a target face image (step S302). - Next, the
security inspection apparatus 40 transmits the target face image of the user U captured by the biometricinformation acquisition device 409 to themanagement server 10 together with a matching request (step S303). Thereby, thesecurity inspection apparatus 40 requests themanagement server 10 to match, at 1:N, the target face image of the user U captured by the biometricinformation acquisition device 409 with a plurality of registered face images registered in the tokenID information DB 11 of themanagement server 10. - In response to receiving the target face image and the matching request from the
security inspection apparatus 40, themanagement server 10 performs matching of the face image of the user U (step S304). That is, themanagement server 10 matches, at 1:N, the target face image received from thesecurity inspection apparatus 40 with a plurality of registered face images registered in the tokenID information DB 11. Note that the registered face images to be matched are limited to images associated with the token ID whose invalid flag has a value of “1” (valid). - Herein, if the
management server 10 determines that the matching result indicates an unsuccessful matching (step S305, NO), themanagement server 10 transmits the matching result of the unsuccessful matching to the security inspection apparatus 40 (step S308), and the process proceeds to step S309. Contrarily, if themanagement server 10 determines that the matching result indicates a successful matching (step S305, YES), the process proceeds to step S306. - In step S306, the
management server 10 acquires the token ID associated with the successfully matched registered face image in the tokenID information DB 11 and acquires a reservation number and an airline code from theoperation information DB 13 by using the token ID as a key. Themanagement server 10 then transmits the token ID, the reservation number, the airline code, and the matching result to the security inspection apparatus 40 (step S307). The process proceeds to step S309. - Next, if it is determined based on the matching result that the procedure can be performed (step S309, YES), the
security inspection apparatus 40 transmits the reservation number to thereservation system 2 of an airline company corresponding to the airline code (step S310) and inquires boarding reservation information. - Contrarily, if it is determined based on the matching result that the procedure is not performed (step S309, NO), the
security inspection apparatus 40 notifies the user U of an error message (step S313). - In response to receiving the reservation number from the
security inspection apparatus 40, thereservation system 2 transmits the corresponding boarding reservation information to the security inspection apparatus 40 (step S311). - Next, in response to receiving the boarding reservation information from the
reservation system 2, thesecurity inspection apparatus 40 performs a security inspection procedure of the user U (step S312). In the security inspection process, theCPU 401 controls each component of thesecurity inspection apparatus 40. Thereby, thesecurity inspection apparatus 40 detects a metal worn by the user U passing through themetal detector gate 410. The user U who has passed through themetal detector gate 410 moves to an immigration area. - Next, the
security inspection apparatus 40 transmits, to themanagement server 10, the token ID, the operation information, and passage history information indicating that the security inspection procedure of the user U completed after the matching of the face image (step S314). Note that the passage history information includes information such as the passage time at the touch point P3, a device name of the used terminal, or the like. - In response to receiving the information from the
security inspection apparatus 40, themanagement server 10 registers, in the passagehistory information DB 12, passage history information indicating the relationship between the token ID and the passage information at the touch point P3 on the user U (step S315). Themanagement server 10 then registers the operation information received from thesecurity inspection apparatus 40 in theoperation information DB 13 if necessary (step S316). -
FIG. 17 is a sequence diagram illustrating one example of the process in thereservation system 2, theautomated gate apparatus 50, and themanagement server 10. This process is performed when the user U goes through an immigration procedure. - The
automated gate apparatus 50 continuously or periodically captures the front area of theautomated gate apparatus 50 and determines whether or not a face of the user U standing in front of theautomated gate apparatus 50 is detected in a captured image (step S401). Theautomated gate apparatus 50 stands by until a face of the user U is detected in an image by the biometric information acquisition device 509 (step S401, NO). - If it is determined that a face of the user U is detected by the biometric information acquisition device 509 (step S401, YES), the
automated gate apparatus 50 captures the face of the user U and acquires the face image of the user U as a target face image (step S402). - Next, the
automated gate apparatus 50 transmits the target face image of the user U captured by the biometricinformation acquisition device 509 to themanagement server 10 together with a matching request (step S403). Thereby, theautomated gate apparatus 50 requests themanagement server 10 to match, at 1:N, the target face image of the user U captured by the biometricinformation acquisition device 509 with a plurality of registered face images registered in the tokenID information DB 11 of themanagement server 10. - In response to receiving the target face image and the matching request from the
automated gate apparatus 50, themanagement server 10 performs matching of the face image of the user U (step S404). That is, themanagement server 10 matches, at 1:N, the target face image received from theautomated gate apparatus 50 with a plurality of registered face images registered in the tokenID information DB 11. Note that the registered face images to be matched are limited to images associated with the token ID whose invalid flag has a value of “1” (valid). - Herein, if the
management server 10 determines that the matching result indicates an unsuccessful matching (step S405, NO), themanagement server 10 transmits the matching result of the unsuccessful matching to the automated gate apparatus 50 (step S408), and the process proceeds to step S409. Contrarily, if themanagement server 10 determines that the matching result indicates a successful matching (step S405, YES), the process proceeds to step S406. - In step S406, the
management server 10 acquires the token ID associated with the successfully matched registered face image in the tokenID information DB 11 and acquires a reservation number and an airline code from theoperation information DB 13 by using the token ID as a key. Themanagement server 10 then transmits the token ID, the reservation number, the airline code, and the matching result to the automated gate apparatus 50 (step S407). The process proceeds to step S409. - Next, if it is determined based on the matching result that the procedure can be performed (step S409, YES), the
automated gate apparatus 50 transmits the reservation number to thereservation system 2 of an airline company corresponding to the airline code (step S410) and inquires boarding reservation information. - Contrarily, if it is determined based on the matching result that the procedure is not performed (step S409, NO), the
automated gate apparatus 50 notifies the user U of an error message (step S413). For example, a notification screen including a message such as “Please move to immigration procedure at the manned counter” is displayed on thedisplay device 507. - In response to receiving the reservation number from the
automated gate apparatus 50, thereservation system 2 transmits the corresponding boarding reservation information to the automated gate apparatus 50 (step S411). - Next, in response to receiving the boarding reservation information from the
reservation system 2, theautomated gate apparatus 50 performs an immigration procedure of the user U (step S412) and opens the gate 511 (step S414). The user U who has passed through the touch point P4 moves to a departure area in which a boarding gate is provided. - Next, the
automated gate apparatus 50 transmits, to themanagement server 10, the token ID, the operation information, and passage history information indicating that the immigration procedure of the user U completed after the matching of the face image (step S415). Note that the passage history information includes information such as the passage time at the touch point P4, a device name of the used terminal, or the like. - In response to receiving the information from the
automated gate apparatus 50, themanagement server 10 registers, in the passagehistory information DB 12, passage history information indicating the relationship between the token ID and the passage information at the touch point P4 on the user U (step S416). Themanagement server 10 then registers the operation information received from theautomated gate apparatus 50 in theoperation information DB 13 if necessary (step S417). -
FIG. 18 is a sequence diagram illustrating one example of the process of thereservation system 2, theoperation terminal 70, themanagement server 10, and thecamera 80. This process is performed when the staff member S references information on a passenger included in a monitoring image of a peripheral region of a boarding gate, for example. - First, the
operation terminal 70 transmits a display request for an operation screen to themanagement server 10 based on input information obtained from the staff member S (step S501). In response to receiving the display request from theoperation terminal 70, themanagement server 10 acquires a captured image of the peripheral region of a boarding gate from the plurality ofcameras 80 arranged in a distributed manner around the boarding gate (step S502). - Next, in response to acquiring the captured image from the
cameras 80, themanagement server 10 transmits an operation screen including the captured image to the operation terminal 70 (step S503). - Next, the
operation terminal 70 displays the operation screen received from themanagement server 10 on the display device 707 (step S504). -
FIG. 19 is a diagram illustrating one example of the operation screen displayed on theoperation terminal 70. A captured image is displayed in the center area of the screen. Each dashed line section in the captured image indicates a detected region of a face image of each of a plurality of users U1 to U12. Further, in the lower side region in the screen, a camera select button used for switching the displayed image to a captured image of anothercamera 80 having different capturing angle, a passenger list screen button used for displaying a list of passengers included in a captured image, and an end button used for closing the screen are displayed. Note that it is preferable that a passenger for which no token ID is registered or a passenger who is not authenticated be distinguished in a screen from a registered passenger (authenticated passenger) by a different display form. - Next, the
operation terminal 70 transmits a display request for a passenger list screen to themanagement server 10 based on input information from the staff member S (step S505). In response to receiving the display request from theoperation terminal 70, themanagement server 10 detects a person from the captured image of the peripheral region of the boarding gate (step S506). When a plurality of persons are included in a captured image, themanagement server 10 detects face images of all the persons as the target face images, respectively. - Next, the
management server 10 matches, at 1:N, the target face image with a plurality of registered face images registered in the token ID information DB 11 (step S507). That is, themanagement server 10 specifies registered face images which are the same as the target face images of passengers included in the captured image, respectively. Note that the registered face images to be matched are limited to images associated with the token ID whose invalid flag has a value of “1” (valid). Furthermore, the passagehistory information DB 12 may be referenced, and only the image of the person who has passed through the immigration area may be a target to be matched. - Next, the
management server 10 acquires the token ID associated with the successfully matched registered face image in the tokenID information DB 11 and acquires a reservation number and an airline code of each person from theoperation information DB 13 by using the token ID as a key (step S508). Themanagement server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S509). - Next, in response to receiving the data set of each person from the
management server 10, theoperation terminal 70 sequentially transmits reservation numbers to thereservation system 2 of an airline company corresponding to the airline code (step S510) and inquires boarding reservation information for each person. - In response to receiving a reservation number of each person from the
operation terminal 70, thereservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S511). - Next, in response to receiving boarding reservation information on each person, the
operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S512). - Next, in response to receiving the data set of each person from the
operation terminal 70, themanagement server 10 transmits a passenger list screen created based on the data set to the operation terminal 70 (step S513). The passenger list screen includes boarding reservation information associated with registered face images specified by biometric authentication, respectively. - The
operation terminal 70 then displays the passenger list screen received from themanagement server 10 on the display device 707 (step S514). -
FIG. 20 is a diagram illustrating one example of the passenger list screen displayed on the operation terminal 70. In this example, boarding reservation information (“name”, “nationality”, “flight number”, “departure place”, “destination place”, “departure time”, “boarding gate”) and face images of a plurality of persons detected from the captured image are displayed in a list form. Note that the displayed data items are not limited to the above. For example, it is preferable to additionally display information on a membership category, the priority in boarding guidance, and the waiting place (waiting lane) before boarding corresponding to that priority. In such a case, by referencing this passenger list screen, the staff member S can easily find a passenger who might have mistaken the boarding gate or the waiting lane and guide such a passenger to the correct place. -
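Conceptually, the passenger list screen is produced by joining each matched token ID with its reservation number and airline code from the operation information DB 13 and then with the boarding reservation information returned by the airline's reservation system 2. The following hypothetical sketch (in-memory dictionaries and illustrative field names only) shows that join.

```python
def build_passenger_list(matched_token_ids: list[str],
                         operation_info_db: dict[str, dict],
                         boarding_reservations: dict[tuple[str, str], dict]) -> list[dict]:
    """Assemble one row per matched passenger for the passenger list screen."""
    rows = []
    for token_id in matched_token_ids:
        op = operation_info_db.get(token_id)
        if op is None:
            continue  # no operation information registered for this token ID
        key = (op["airline_code"], op["reservation_number"])
        reservation = boarding_reservations.get(key, {})
        rows.append({
            "token_id": token_id,
            "name": reservation.get("passenger_name"),
            "nationality": reservation.get("nationality"),
            "flight_number": reservation.get("flight_number"),
            "boarding_gate": reservation.get("boarding_gate"),
        })
    return rows
```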
FIG. 21 is a sequence diagram illustrating one example of the process in thereservation system 2, theoperation terminal 70, themanagement server 10, and thecamera 80. This process is performed for each boarding gate in order to notify the staff member S of a target person for priority boarding. - First, the
management server 10 determines whether or not the current time is a predetermined period before the time to start guide of priority boarding (step S601). In this step, if it is determined that the current time is the predetermined period before the time to start guide (step S601, YES), themanagement server 10 acquires a captured image of the peripheral region of a boarding gate from the camera 80 (step S602). The process then proceeds to step S603. Contrarily, if it is determined that the current time is not the predetermined period before the time to start guide (step S601, NO), themanagement server 10 stands by until the current time becomes the predetermined period before the time to start guide. - In step S603, the
management server 10 detects a person from the captured image of the peripheral region of the boarding gate. When a plurality of persons are included in a captured image, themanagement server 10 detects face images of all the persons as target face images, respectively. - Next, the
management server 10 matches, at 1:N, the target face image with a plurality of registered face images registered in the token ID information DB 11 (step S604). Note that the registered face images to be matched are limited to images associated with the token ID whose invalid flag has a value of “1” (valid). Furthermore, the passagehistory information DB 12 may be referenced, and only the image of the person who has passed through the immigration area may be a target to be matched. - Next, the
management server 10 acquires the token ID associated with the successfully matched registered face image in the tokenID information DB 11 and acquires a reservation number and an airline code of each person from theoperation information DB 13 by using the token ID as a key (step S605). Themanagement server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S606). - Next, in response to receiving the data set of each person from the
management server 10, theoperation terminal 70 sequentially transmits reservation numbers to thereservation system 2 of an airline company corresponding to the airline code (step S607) and inquires boarding reservation information for each person. - In response to receiving a reservation number of each person from the
operation terminal 70, thereservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S608). - Next, in response to receiving boarding reservation information on each person, the
operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S609). - Next, in response to receiving the data set of each person from the
operation terminal 70, themanagement server 10 determines priority in boarding guide based on information on a seat class, a membership category, and an accompanying person or the like on each person included in the data set and extracts a passenger having high priority (step S610). - Next, the
management server 10 transmits a guide instruction screen of priority boarding for the extracted passenger to the operation terminal 70 (step S611). - The
operation terminal 70 then displays the guide instruction screen of priority boarding received from themanagement server 10 on the display device 707 (step S612). -
FIG. 22 is a diagram illustrating one example of a guide instruction screen for priority boarding displayed on the operation terminal 70. A captured image is displayed in the center area of the screen. The dashed line section in the image indicates the detected region of a parent and a child determined to be the passengers having the highest priority in boarding guidance. Further, a message instructing the staff member S to guide priority boarding (“Priority boarding is starting soon. Please guide the two passengers to the boarding gate”) is displayed in the lower side region of the screen. -
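The extraction of passengers for priority boarding in step S610 rests on a priority ordering derived from the boarding reservation information (seat class, membership category, presence of an accompanying person, and so on). A hypothetical Python sketch of such an ordering is shown below; the concrete group numbers and field values are assumptions for illustration, not values defined by the embodiment.

```python
def boarding_priority(reservation: dict) -> int:
    """Smaller numbers board earlier; the grouping below is illustrative only."""
    if reservation.get("accompanying_person_category") in ("infant", "young child"):
        return 1
    if reservation.get("seat_class") == "first class":
        return 2
    if reservation.get("membership_category") == "upper class member":
        return 3
    return 4  # general boarding group

def extract_priority_passengers(reservations: list[dict], group: int) -> list[dict]:
    """Extract the passengers that belong to the given boarding group."""
    return [r for r in reservations if boarding_priority(r) == group]
```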
- FIG. 23 is a sequence diagram illustrating one example of the process in the reservation system 2, the operation terminal 70, the management server 10, and the camera 80. This process is performed to guide a passenger to an appropriate waiting lane in accordance with the priority of the passenger before priority boarding guide is started.
- First, the management server 10 determines whether or not the current time is a predetermined period before the time to start guide of priority boarding (step S701). In this step, if it is determined that the current time is the predetermined period before the time to start guide (step S701, YES), the management server 10 acquires a captured image including a line of passengers waiting for boarding from the camera 80 (step S702). The process then proceeds to step S703. Contrarily, if it is determined that the current time is not the predetermined period before the time to start guide (step S701, NO), the management server 10 stands by until the current time is the predetermined period before the time to start guide.
- In step S703, the management server 10 detects all the persons from the captured image including the line of passengers waiting for boarding.
- Next, the management server 10 matches, at 1:N, each target face image with a plurality of registered face images registered in the token ID information DB 11 (step S704).
- Next, the management server 10 acquires the token ID associated with the registered face image of a successful matching in the token ID information DB 11 and acquires a reservation number and an airline code of each person from the operation information DB 13 by using the token ID as a key (step S705). The management server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S706).
- Next, in response to receiving the data set of each person from the management server 10, the operation terminal 70 sequentially transmits the reservation numbers to the reservation system 2 of the airline company corresponding to the airline code (step S707) and inquires about the boarding reservation information for each person.
- In response to receiving the reservation number of each person from the operation terminal 70, the reservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S708).
- Next, in response to receiving the boarding reservation information on each person, the operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S709).
- Next, in response to receiving the data set of each person from the operation terminal 70, the management server 10 determines the priority at boarding of each person based on information on the seat class, the membership category, the accompanying person, and the like of each person included in the data set, and extracts a passenger whose priority does not match the priority of the waiting lane, that is, a passenger who is waiting in the wrong waiting lane (step S710).
- Next, the management server 10 transmits a guide instruction screen of a waiting lane for the extracted passenger to the operation terminal 70 (step S711). That is, when a passenger whose priority is different from the priority for the waiting place is included in the passengers waiting in the waiting place, the management server 10 outputs information suggesting guide to the correct waiting place corresponding to the priority of the passenger.
- The operation terminal 70 then displays the guide instruction screen received from the management server 10 on the display device 707 (step S712).
- FIG. 24 is a diagram illustrating one example of a guide instruction screen of a waiting lane displayed on the operation terminal 70. A captured image in which a plurality of passengers before boarding are waiting in a waiting lane provided around the boarding gate is displayed in the center region of the screen. The dashed line section in the screen indicates the detected region of a passenger E waiting in a wrong waiting lane. Further, an instruction message that instructs the staff member S to guide the passenger E to the correct waiting lane corresponding to the priority of the passenger E (“There is a passenger waiting in a wrong waiting lane (No. 1). Please guide the passenger to the correct waiting lane (No. 3)”) is displayed in the lower side region of the screen.
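For illustration only, the lane check of step S710 could be sketched in Python as follows; the lane numbering and the mapping from priority to lane are assumptions for this example.

```python
# Hypothetical sketch of step S710: flag passengers standing in a waiting lane that
# does not correspond to their boarding priority. The priority-to-lane mapping and
# the record fields are illustrative assumptions.
PRIORITY_TO_LANE = {3: 1, 2: 2, 1: 3}   # e.g. priority 3 -> lane No. 1

def find_misplaced_passengers(detections):
    """detections: list of dicts with 'token_id', 'priority', 'observed_lane'.

    Returns (record, correct_lane) pairs for everyone waiting in the wrong lane,
    which the server would turn into the guide instruction screen of step S711.
    """
    misplaced = []
    for d in detections:
        correct_lane = PRIORITY_TO_LANE.get(d["priority"])
        if correct_lane is not None and d["observed_lane"] != correct_lane:
            misplaced.append((d, correct_lane))
    return misplaced

if __name__ == "__main__":
    waiting = [
        {"token_id": "E", "priority": 1, "observed_lane": 1},   # should be in lane 3
        {"token_id": "F", "priority": 3, "observed_lane": 1},   # already correct
    ]
    for d, lane in find_misplaced_passengers(waiting):
        print(f"Passenger {d['token_id']}: guide from lane {d['observed_lane']} to lane {lane}")
```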
- FIG. 25 is a sequence diagram illustrating one example of the process in the reservation system 2, the operation terminal 70, the management server 10, and the camera 80. This process is performed when the staff member S arbitrarily designates a person included in a captured image and inquires about information on the designated person.
- First, the operation terminal 70 transmits a display request for an operation screen to the management server 10 based on input information from the staff member S (step S801). In response to receiving the display request from the operation terminal 70, the management server 10 acquires a captured image of the peripheral region of a boarding gate from the plurality of cameras 80 arranged in a distributed manner around the boarding gate (step S802).
- Next, in response to acquiring the captured image from the camera 80, the management server 10 transmits the operation screen including the captured image to the operation terminal 70 (step S803).
- Next, the operation terminal 70 displays the operation screen received from the management server 10 on the display device 707 (step S804).
- Next, the operation terminal 70 transmits, to the management server 10, coordinate information on the person designated by the staff member S within the operation screen (step S805). In response to receiving the coordinate information on the designated person from the operation terminal 70, the management server 10 detects the designated person from the captured image based on the coordinate information (step S806). A face image of the designated person cut out from the captured image serves as the target face image.
- Next, the management server 10 matches, at 1:N, the target face image of the designated person with a plurality of registered face images registered in the token ID information DB 11 (step S807). That is, the management server 10 specifies a registered face image which is the same as the target face image of the passenger designated on the screen of the operation terminal 70 by the staff member S.
- Next, the management server 10 acquires the token ID associated with the successfully matched registered face image in the token ID information DB 11 and acquires a reservation number and an airline code of the designated person from the operation information DB 13 by using the token ID as a key (step S808). The management server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S809).
- Next, in response to receiving the data set of the designated person from the management server 10, the operation terminal 70 transmits the reservation number to the reservation system 2 of the airline company corresponding to the airline code (step S810) and inquires about the boarding reservation information.
- In response to receiving the reservation number of the designated person from the operation terminal 70, the reservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S811).
- Next, in response to receiving the boarding reservation information on the designated person, the operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S812).
- Next, in response to receiving the data set of the designated person from the operation terminal 70, the management server 10 transmits a passenger inquiry screen created based on the data set to the operation terminal 70 (step S813). That is, the management server 10 outputs the boarding reservation information associated with the registered face image of the designated person to the operation terminal 70.
- Next, the operation terminal 70 displays the passenger inquiry screen received from the management server 10 on the display device 707 (step S814).
- FIG. 26 is a diagram illustrating one example of a passenger inquiry screen displayed on the operation terminal 70. In the left side region of the screen, the captured image, the target face image of the designated person extracted from the captured image, and the registered face image successfully matched with the target face image are displayed. The dashed line section in the captured image is the detected region of the designated person T. Further, in the right side region of the screen, the boarding reservation information on the designated person is displayed in a list form as an inquiry result.
- For example, when the staff member S designates any person in the operation screen, the management server 10 specifies the passenger based on the face image of the designated person and displays the boarding reservation information on the screen. Thereby, the staff member S can confirm the boarding reservation information on a passenger included in a captured image in advance and use the confirmed boarding reservation information in the operation.
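For illustration only, steps S805 to S813 could be sketched as follows in Python; the helper callables (detect_faces, match_face, find_reservation) are assumptions supplied by the caller, and no concrete face-recognition library is implied by the disclosure.

```python
# Hypothetical sketch of the designated-person inquiry (steps S805-S813): map the
# point the staff member clicked to a detected face, run 1:N matching against the
# registered face images, and look up the reservation via the token ID.
def inquire_designated_person(image, click_xy, detect_faces, match_face, find_reservation):
    """Return the boarding reservation information for the clicked person, or None."""
    x, y = click_xy
    # detect_faces(image) -> list of (bounding_box, face_crop), bounding_box = (x0, y0, x1, y1)
    for (x0, y0, x1, y1), face_crop in detect_faces(image):
        if x0 <= x <= x1 and y0 <= y <= y1:       # the click falls inside this face box (step S806)
            token_id = match_face(face_crop)      # 1:N match -> token ID or None (step S807)
            if token_id is None:
                return None
            return find_reservation(token_id)     # reservation number, airline code, etc. (step S808)
    return None
```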
- FIG. 27 is a sequence diagram illustrating one example of the process in the reservation system 2, the operation terminal 70, the management server 10, and the camera 80. This process is performed when the staff member S extracts a desired passenger from the passengers present around a boarding gate.
- First, the operation terminal 70 displays an extraction condition entry screen (step S901). FIG. 28 is a diagram illustrating one example of the extraction condition entry screen displayed on the operation terminal 70. Herein, “boarding gate”, “flight number”, “priority”, “membership category”, “seat class”, and “with or without token ID” are illustrated as examples of extraction conditions. When the staff member S intends to extract only the passenger having the highest priority for boarding guide out of a plurality of passengers included in a captured image, for example, the staff member S may set the checkbox for priority (“1”) to ON.
- Next, the operation terminal 70 transmits the extraction condition input by the staff member S in the extraction condition entry screen to the management server 10 (step S902).
- Next, in response to receiving the extraction condition from the operation terminal 70, the management server 10 acquires a captured image of the peripheral region of a boarding gate from the plurality of cameras 80 arranged in a distributed manner around the boarding gate (step S903).
- Next, in response to acquiring the captured image from the cameras 80, the management server 10 detects all the persons from the captured image (step S904).
- Next, the management server 10 matches, at 1:N, the target face image of each detected person with a plurality of registered face images registered in the token ID information DB 11 (step S905).
- Next, the management server 10 acquires the token ID associated with the registered face image of a successful matching in the token ID information DB 11 and acquires a reservation number and an airline code of each person from the operation information DB 13 by using the token ID as a key (step S906). The management server 10 then transmits a data set of the token ID, the reservation number, and the airline code to the operation terminal 70 (step S907).
- Next, in response to receiving the data set of each person from the management server 10, the operation terminal 70 sequentially transmits the reservation numbers to the reservation system 2 of the airline company corresponding to the airline code (step S908) and inquires about the boarding reservation information for each person.
- In response to receiving the reservation number of each person from the operation terminal 70, the reservation system 2 transmits the corresponding boarding reservation information to the operation terminal 70 (step S909).
- Next, in response to receiving the boarding reservation information on each person, the operation terminal 70 transmits the data set of the token ID, the reservation number, the airline code, and the boarding reservation information to the management server 10 (step S910).
- Next, in response to receiving the data set of each person from the operation terminal 70, the management server 10 extracts the passengers matching the extraction condition (step S911).
- Next, the management server 10 transmits a passenger extraction result screen regarding the extracted passengers to the operation terminal 70 (step S912).
- Next, the operation terminal 70 displays the passenger extraction result screen received from the management server 10 on the display device 707 (step S913).
- FIG. 29 is a diagram illustrating one example of a passenger extraction result screen displayed on the operation terminal 70. In the upper side region of the screen, the extraction conditions designated by the staff member S are displayed. Herein, three conditions, “boarding gate”, “flight number”, and “membership category”, are designated. Further, in the center region of the screen, a captured image is displayed. The plurality of dashed line sections in the captured image are the detected regions of the passengers satisfying the extraction conditions. That is, the management server 10 can extract, out of the passengers included in the captured image, the passengers whose boarding reservation information satisfies the extraction conditions input by the staff member S.
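For illustration only, the filtering of step S911 could be sketched in Python as below; the condition keys and record fields mirror the screen of FIG. 28 but their exact spellings are assumptions.

```python
# Hypothetical sketch of step S911: keep only the detected passengers whose boarding
# reservation information satisfies every condition entered on the screen of FIG. 28.
def extract_passengers(records, conditions):
    """records: list of dicts built from the data sets received in step S910.
    conditions: dict such as {"boarding gate": "34", "membership category": "gold"};
    empty values are treated as 'no constraint'."""
    def satisfies(record):
        return all(
            record.get(key) == wanted
            for key, wanted in conditions.items()
            if wanted not in (None, "")
        )
    return [r for r in records if satisfies(r)]

if __name__ == "__main__":
    records = [
        {"token_id": "T1", "boarding gate": "34", "flight number": "NE105", "membership category": "gold"},
        {"token_id": "T2", "boarding gate": "34", "flight number": "NE105", "membership category": "none"},
    ]
    print(extract_passengers(records, {"boarding gate": "34", "membership category": "gold"}))
```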
- FIG. 30 is a sequence diagram illustrating one example of the process in the reservation system 2, the boarding gate apparatus 60, and the management server 10. This process is performed when the user U passes through a boarding gate.
- The boarding gate apparatus 60 continuously or periodically captures the front area of the apparatus and determines whether or not a face of the user U standing in front of the boarding gate apparatus 60 is detected in a captured image (step S1001). The boarding gate apparatus 60 stands by until a face of the user U is detected in an image by the biometric information acquisition device 609 (step S1001, NO).
- If it is determined that a face of the user U is detected by the biometric information acquisition device 609 (step S1001, YES), the boarding gate apparatus 60 captures the face of the user U and acquires the face image of the user U as a target face image (step S1002).
- Next, the boarding gate apparatus 60 transmits the target face image of the user U captured by the biometric information acquisition device 609 to the management server 10 together with a matching request (step S1003). Thereby, the boarding gate apparatus 60 requests the management server 10 to match, at 1:N, the target face image of the user U captured by the biometric information acquisition device 609 with the plurality of registered face images registered in the token ID information DB 11 of the management server 10.
- In response to receiving the target face image and the matching request from the boarding gate apparatus 60, the management server 10 performs matching of the face image of the user U (step S1004). That is, the management server 10 matches, at 1:N, the target face image received from the boarding gate apparatus 60 with the plurality of registered face images registered in the token ID information DB 11. Note that the registered face images to be matched are limited to images associated with a token ID whose invalid flag has a value of “1” (valid).
- Herein, if the management server 10 determines that the matching result indicates an unsuccessful matching (step S1005, NO), the management server 10 transmits the matching result of the unsuccessful matching to the boarding gate apparatus 60 (step S1008), and the process proceeds to step S1009. Contrarily, if the management server 10 determines that the matching result indicates a successful matching (step S1005, YES), the process proceeds to step S1006.
- In step S1006, the management server 10 acquires the token ID associated with the successfully matched registered face image in the token ID information DB 11 and acquires a reservation number and an airline code from the operation information DB 13 by using the token ID as a key. The management server 10 then transmits the token ID, the reservation number, the airline code, and the matching result to the boarding gate apparatus 60 (step S1007). The process proceeds to step S1009.
- Next, if it is determined based on the matching result that the procedure can be performed (step S1009, YES), the boarding gate apparatus 60 transmits the reservation number to the reservation system 2 of the airline company corresponding to the airline code (step S1010) and inquires about the boarding reservation information.
- Contrarily, if it is determined based on the matching result that the procedure cannot be performed (step S1009, NO), the boarding gate apparatus 60 notifies the user U of an error message (step S1012). For example, the boarding gate apparatus 60 displays a notification screen including a message such as “Please move to procedure at the manned counter” on the display device 607.
- In response to receiving the reservation number from the boarding gate apparatus 60, the reservation system 2 transmits the corresponding boarding reservation information to the boarding gate apparatus 60 (step S1011).
- Next, in response to receiving the boarding reservation information from the reservation system 2, the boarding gate apparatus 60 performs the check at the boarding gate for the user U based on the flight number, the gate number, the boarding start time, and the like included in the boarding reservation information (step S1013) and, when permitting the boarding, opens the gate 611 (step S1014). The user U who has passed through the touch point P5 boards the airplane. Note that, if the boarding gate apparatus 60 determines not to permit boarding in step S1013, it is preferable to notify the user U of an error message without opening the gate 611. For example, the gate 611 is not opened when the user U has mistaken the number of the gate 611, when the current time is before the boarding start time, or the like.
- Next, the boarding gate apparatus 60 transmits, to the management server 10, the token ID, the operation information, and passage history information indicating that the user U completed boarding of the airplane after the matching of the face image (step S1015). Note that the passage history information includes information such as the passage time at the touch point P5, the device name of the used terminal, or the like.
- In response to receiving the information from the boarding gate apparatus 60, the management server 10 registers, in the passage history information DB 12, the passage history information at the touch point P5 on the user U (step S1016). The management server 10 then registers the operation information received from the boarding gate apparatus 60 in the operation information DB 13 if necessary (step S1017).
- The management server 10 then updates the token ID information DB 11 (step S1018). Specifically, the management server 10 updates the invalid flag in the token ID information DB 11 to a value of invalidity (“0”). Thereby, the lifecycle of the token ID expires.
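For illustration only, the gate-passage flow of FIG. 30 can be condensed into a single server-side decision as in the following Python sketch; the injected helpers (match_face, get_reservation, record_passage, invalidate_token) and the fields of the reservation record are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the flow of FIG. 30: 1:N matching, reservation lookup via the
# token ID, the gate check, and token invalidation after boarding.
def handle_gate_passage(target_face, now, gate_number,
                        match_face, get_reservation, record_passage, invalidate_token):
    token_id = match_face(target_face)                      # steps S1004-S1005: 1:N matching
    if token_id is None:
        return {"open_gate": False, "reason": "matching failed"}     # steps S1008, S1012

    resv = get_reservation(token_id)                        # steps S1006-S1011: reservation lookup
    if now < resv["boarding_start_time"] or resv["gate_number"] != gate_number:
        return {"open_gate": False, "reason": "gate check failed"}   # step S1013: boarding refused

    record_passage(token_id, passed_at=now)                 # steps S1015-S1016: passage history
    invalidate_token(token_id)                              # step S1018: invalid flag set to "0"
    return {"open_gate": True, "reason": "boarding permitted"}        # step S1014: open the gate
```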
- According to the present example embodiment, the management server 10 stores boarding reservation information on a passenger who is boarding an airplane and a registered face image of the passenger in association with each other. Further, the management server 10 acquires a target face image of a passenger who has not yet boarded the airplane from a captured image of the peripheral region of a boarding gate and specifies a registered face image which is the same as the target face image. Then, in response to specifying the boarding reservation information associated with the specified registered face image, the management server 10 outputs, based on the boarding reservation information, information used for supporting a procedure of the passenger at the boarding gate to the operation terminal 70 used by the staff member S. Accordingly, the procedure at the boarding gate can be performed efficiently.
- The
management server 10 outputs priority in a procedure performed by a passenger at a boarding gate as information used for supporting the procedure of the passenger at the boarding gate. Accordingly, the staff member S can efficiently provide a service of priority boarding based on the priority of each passenger obtained from the captured image without checking a storage medium such as a passport, a boarding ticket, or the like. - The
information processing system 1 in the present example embodiment will be described below. Note that references common to those provided in the drawings of the first example embodiment denote the same objects. Description of features common to the first example embodiment will be omitted, and different features will be described in detail.
- The present example embodiment is different from the first example embodiment in that a check process regarding an accompanying person of a passenger is further performed in the check at the time of boarding at the boarding gate (touch point P5).
- FIG. 31 is a flowchart illustrating one example of a check process of an accompanying person. This process is performed by the management server 10 between step S1011 and step S1013 of FIG. 30 described above, for example.
- The management server 10 determines whether or not a target person under a boarding check at a boarding gate is a passenger having an accompanying person based on the boarding reservation information received from the reservation system 2 (step S1101). In this step, if the management server 10 determines that the target person is a passenger having an accompanying person (step S1101, YES), the process proceeds to step S1102.
- Contrarily, if the management server 10 determines that the target person is a passenger having no accompanying person (step S1101, NO), the process proceeds to step S1013.
- In step S1102, the management server 10 determines whether or not another person is included in the captured image. In this step, if the management server 10 determines that another person is included in the captured image (step S1102, YES), the process proceeds to step S1103.
- Contrarily, if the management server 10 determines that no other person is included in the captured image (step S1102, NO), the process proceeds to step S1105.
- In step S1103, the management server 10 determines whether or not the person included in the captured image is a person registered as an accompanying person. In this step, if the management server 10 determines that the person included in the captured image is a person registered as an accompanying person (step S1103, YES), the process proceeds to step S1013.
- Contrarily, if the management server 10 determines that the person included in the captured image is not a person registered as an accompanying person (step S1103, NO), the process proceeds to step S1106.
- In step S1105, the management server 10 outputs a check instruction screen indicating the absence of the accompanying person to the operation terminal 70 and ends the process.
- FIG. 32 is a diagram illustrating one example of a check instruction screen regarding an accompanying person displayed on the operation terminal 70. In the left side region of the screen, a captured image captured at a boarding gate is displayed. The person T in the captured image is a passenger on whom face authentication has been performed. Further, in the right side region of the screen, the boarding reservation information regarding the person T is displayed in a list form. This example represents that the accompanying person of the person T is “female/young child”. As represented by the dashed line in the captured image, however, the child C registered as an accompanying person of the person T is not included in the captured image. Thus, a message instructing the staff member S to check the accompanying person (“Accompanying person information is registered for this passenger. Please check the accompanying person”) is displayed in the lower side region of the screen.
- In step S1106, the management server 10 outputs a check instruction screen indicating an unsuccessful matching of the accompanying person to the operation terminal 70 and ends the process.
- FIG. 33 is a diagram illustrating one example of a check instruction screen of an accompanying person displayed on the operation terminal 70. In the left side region of the screen, a captured image captured at a boarding gate is displayed. The person T in the captured image is a passenger on whom face authentication has been performed. Further, in the right side region of the screen, the boarding reservation information regarding the person T is displayed in a list form. This example represents that the accompanying person of the person T is “male/young child”. However, the child C included in the captured image is different from the registered accompanying person. Thus, a message instructing the staff member S to check the accompanying person (“This accompanying person is different from registered information. Please check the accompanying person”) is displayed in the lower side region of the screen.
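For illustration only, the branch structure of FIG. 31 (steps S1101 to S1106) could be sketched in Python as follows; the record field accompanying_person and the injected face-comparison callable are assumptions made for this example.

```python
# Hypothetical sketch of the accompanying-person check of FIG. 31.
PROCEED, CHECK_ABSENT, CHECK_MISMATCH = "proceed", "check_absent", "check_mismatch"

def check_accompanying_person(reservation, other_faces_in_image, is_registered_companion):
    """reservation: boarding reservation information for the authenticated passenger.
    other_faces_in_image: face crops of the other persons detected at the gate.
    is_registered_companion(face): True if the face matches the registered companion."""
    if not reservation.get("accompanying_person"):        # step S1101: no companion registered
        return PROCEED
    if not other_faces_in_image:                          # step S1102: nobody else in the image
        return CHECK_ABSENT                               # step S1105: companion not present
    if any(is_registered_companion(f) for f in other_faces_in_image):  # step S1103
        return PROCEED
    return CHECK_MISMATCH                                 # step S1106: different person detected
```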
- As described above, according to the present example embodiment, the management server 10 can analyze a captured image at a boarding gate and efficiently instruct the staff member S to perform a check operation regarding an accompanying person of a passenger. Specifically, the management server 10 acquires a target face image of a passenger from a captured image obtained by capturing the passenger at a boarding gate and specifies a registered face image which is the same as the target face image. When the accompanying person associated with the specified registered face image is not included in the captured image at the boarding gate, the management server 10 outputs information that suggests a check operation of the accompanying person.
- Furthermore, when the accompanying person associated with the specified registered face image and another passenger included in another captured image together with the passenger are not the same, the management server 10 outputs information that suggests a check operation of the accompanying person.
- An information processing system 4 in the present example embodiment will be described below. Note that references common to those provided in the drawings of the first example embodiment denote the same objects. Description of features common to the first example embodiment will be omitted, and different features will be described in detail.
- FIG. 34 is a block diagram illustrating an example of the overall configuration of the information processing system 4 in the present example embodiment. The information processing system 4 is different from the information processing system 1 of the first example embodiment in that a function of searching for the position of a designated passenger in the airport is further provided, by using captured images of the cameras 80 arranged at various places in the airport such as the touch points P1 to P5, a lounge (not illustrated), a duty-free shop (not illustrated), and a pathway.
- FIG. 35 is a sequence diagram illustrating one example of the process in the operation terminal 70, the management server 10, and the camera 80. This process is performed when the staff member S searches for the current position of a desired passenger out of the passengers present in the airport. Note that the passenger to be searched for is a passenger for whom a token ID has been issued.
- First, the operation terminal 70 displays a search condition entry screen (step S1201) and then transmits, to the management server 10, the search condition input by the staff member S in the condition entry screen (step S1202). In the search condition entry screen, the same conditions as those in the extraction condition entry screen illustrated in FIG. 28 may be input, or other conditions may be further input. Examples of a search condition are a reservation number, an airline code, a seat class, a membership category, a name of a passenger, and the like. In the present example embodiment, such boarding reservation information is already stored as operation information in the operation information DB 13 at the time of issuance of a token ID. Thereby, the inquiry process for boarding reservation information to the reservation system 2 based on a reservation number can be omitted.
- Next, in response to receiving the search condition from the operation terminal 70, the management server 10 specifies the token ID of the person being searched for that is associated with operation information matching the search condition in the operation information DB 13 (step S1203).
- Next, the management server 10 acquires the registered face image of the person being searched for from the token ID information DB 11 based on the specified token ID (step S1204).
- Next, the management server 10 acquires captured images from the plurality of cameras 80 arranged in a distributed manner in various places in the airport A (step S1205).
- Next, in response to acquiring the captured images from the cameras 80, the management server 10 detects all the persons from the respective captured images (step S1206).
- Next, the management server 10 matches, at 1:N, the registered face image of the person being searched for with the plurality of face images detected from the captured images (step S1207).
- Next, the management server 10 specifies the current position of the person being searched for based on the captured image including the person whose face image is successfully matched with the registered face image (step S1208) and then transmits a passenger search result screen including information on the current position of that person to the operation terminal 70 (step S1209).
- The operation terminal 70 then displays the passenger search result screen received from the management server 10 on the display device 707 (step S1210).
- FIG. 36 is a diagram illustrating one example of the passenger search result screen displayed on the operation terminal 70. In the left side region of the screen, a captured image of the place in which the person being searched for was found, the target face image of that person extracted from the captured image, and the registered face image successfully matched with the target face image are displayed. The dashed line section in the captured image is the detected region of the person T. Further, information on the current location of the person being searched for (“First Terminal, 4F, XXX lounge”) is displayed above the captured image. In the right side region of the screen, the boarding reservation information regarding the person T is displayed in a list form.
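For illustration only, the search of steps S1205 to S1208 could be sketched in Python as follows; the camera metadata format and the injected detection/matching callables are assumptions made for this example.

```python
# Hypothetical sketch of steps S1205-S1208: look for the registered face of the person
# being searched for in the images of cameras placed around the airport and report
# where a match was found.
def locate_passenger(registered_face, cameras, detect_faces, same_person):
    """cameras: list of dicts {"location": str, "image": ...}.
    detect_faces(image) -> list of face crops.
    same_person(face_a, face_b) -> bool (the per-pair decision inside the 1:N search)."""
    for cam in cameras:
        for face in detect_faces(cam["image"]):
            if same_person(registered_face, face):
                return {"location": cam["location"], "matched_face": face}
    return None   # not found in any current camera image

# Example use (with caller-supplied stub callables):
# hit = locate_passenger(reg_face, cameras, detect_faces, same_person)
# if hit: print("Passenger found at", hit["location"])
```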
- According to the present example embodiment, the management server 10 selects an image including a face image that is the same as the registered face image of a passenger to be searched for out of the images captured by the plurality of cameras 80 (image capture devices) arranged in the airport. The management server 10 then outputs position information of the passenger in the airport based on the selected image. That is, the management server 10 can easily search for a passenger in various places in the airport, without being limited to the peripheral region of a boarding gate, and notify the operation terminal 70 of the position information (current location) of the searched passenger. Thereby, information on a desired passenger can be shared by the staff members S.
- FIG. 37 is a block diagram illustrating an example of the overall configuration of the information processing apparatus 100 in the present example embodiment. The information processing apparatus 100 has an acquisition unit 100A, a specifying unit 100B, and an output unit 100C. The acquisition unit 100A acquires biometric information of a passenger from a captured image obtained by capturing the passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane. The specifying unit 100B specifies boarding reservation information regarding the boarding by using the biometric information acquired by the acquisition unit 100A. The output unit 100C outputs information used for supporting a procedure of the passenger at the boarding gate based on the boarding reservation information specified by the specifying unit 100B. According to the information processing apparatus 100 in the present example embodiment, it is possible to support a procedure of a passenger at a boarding gate.
- Although the present invention has been described above with reference to the example embodiments, the present invention is not limited to the example embodiments described above. Various modifications that may be appreciated by those skilled in the art can be made to the configuration and details of the present invention within the scope not departing from the spirit of the present invention. For example, it should be understood that an example embodiment in which a part of the configuration of any of the example embodiments is added to another example embodiment or an example embodiment in which a part of the configuration of any of the example embodiments is replaced with a part of the configuration of another example embodiment is also one of the example embodiments to which the present invention may be applied.
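Referring back to the configuration of FIG. 37, the following Python sketch outlines, for illustration only, one possible decomposition into the acquisition unit 100A, the specifying unit 100B, and the output unit 100C; the constructor arguments and return shapes are assumptions, since the disclosure defines only the roles of the units.

```python
# Hypothetical sketch of the three-unit structure of FIG. 37.
class AcquisitionUnit:
    def __init__(self, face_extractor):
        self._extract = face_extractor           # e.g. a face detector returning crops

    def acquire(self, captured_image):
        """Acquire biometric information (here: face crops) of passengers who have
        not yet passed through the boarding gate."""
        return self._extract(captured_image)

class SpecifyingUnit:
    def __init__(self, match_face, reservation_db):
        self._match_face = match_face            # 1:N matcher -> token ID or None
        self._reservations = reservation_db      # token ID -> boarding reservation info

    def specify(self, biometric_info):
        token_id = self._match_face(biometric_info)
        return self._reservations.get(token_id) if token_id else None

class OutputUnit:
    def output(self, reservation):
        """Turn the specified reservation into support information for the staff terminal."""
        return {"support_info": reservation}

class InformationProcessingApparatus:
    def __init__(self, acquisition, specifying, output):
        self.acquisition, self.specifying, self.output = acquisition, specifying, output

    def handle(self, captured_image):
        results = []
        for bio in self.acquisition.acquire(captured_image):
            reservation = self.specifying.specify(bio)
            if reservation is not None:
                results.append(self.output.output(reservation))
        return results
```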
- Further, the configuration of the present invention is applicable not only to an international flight but also to a domestic flight. In the case of a domestic flight, a 1:1 matching process between a passport face image and a captured face image in addition to an immigration procedure may be omitted. In such a case, for example, a captured face image at the time of purchasing a boarding ticket can be used as a registered biometric image. When a terminal such as a smartphone or a personal computer is used to purchase a boarding ticket or perform check-in online, if a face image captured by a terminal is registered, the user can also board on an airplane through face authentication at the airport A.
- Although the check-in terminal 20 reads a passport face image from a passport and thereby issuance of a token ID is applied to the
management server 10 in the first example embodiment described above, such issuance may be applied to the automatic baggage drop-off machine 30 or thesecurity inspection apparatus 40 taking a case of an online check-in procedure into consideration. That is, themanagement server 10 acquires a passport face image and a target biometric image from any one of a terminal apparatus that performs the operation regarding the time of departure of the user U. Issuance of a token ID may be applied in a first performed procedure operation out of a series of procedure operations performed at the time of departure. - Further, in the first example embodiment described above, unlike the
reservation information DB 3 of thereservation system 2, theoperation information DB 13 of themanagement server 10 stores only some of the items of boarding reservation information (a reservation number and an airline code). Thus, a terminal apparatus (the check-in terminal 20 or the like) at each touch point inquires boarding reservation information based on a reservation number for thereservation system 2 at the time of performing a procedure. At the time of the first procedure, however, all pieces of boarding reservation information acquired from thereservation system 2 may be copied in theoperation information DB 13 as operation information. In such a case, since the terminal apparatus at the subsequent touch point can acquire boarding reservation information from the management server 10 (operation information DB 13), the inquiry to the reservation system (airline system) 2 may be omitted, or the inquiry method may be changed if necessary. - For example, the process between the boarding
gate apparatus 60, themanagement server 10, and thereservation system 2 can be performed in the following procedures at a boarding gate (touch point P5). First, theboarding gate apparatus 60 captures a face of a passenger and then transmits the face image to themanagement server 10. Next, themanagement server 10 performs face matching with a registered face image registered in the tokenID information DB 11 and acquires the token ID corresponding to a registered face image of a successful matching. Next, themanagement server 10 transmits, to theboarding gate apparatus 60, boarding reservation information (boarding ticket data) acquired from theoperation information DB 13 by using the token ID as a key. Next, theboarding gate apparatus 60 transmits the acquired boarding reservation information to thereservation system 2. Next, in response to receiving the boarding reservation information, thereservation system 2 matches the acquired boarding reservation information (boarding ticket data) with boarding reservation information stored in thereservation information DB 3 and, in response to determining whether or not to permit boarding of the passenger, transmits the determination result to theboarding gate apparatus 60. Accordingly, theboarding gate apparatus 60 can control opening of thegate 611 based on the determination result received from thereservation system 2. - Furthermore, although a case where information used for supporting a procedure of a passenger at a boarding gate is output to the
operation terminal 70 has been described in the above first example embodiment, the terminal apparatus to which such information is output is not limited thereto. For example, such information may be output to a signage terminal arranged in a peripheral region of a boarding gate or on a pathway or to a mobile terminal carried by a passenger. When a contact address such as a mail address of a mobile terminal of a passenger is registered in a database (not illustrated), it is possible to output guide information indicating a correct waiting lane to the contact address when it is detected that the passenger is waiting in a wrong waiting lane. - The scope of each of the example embodiments further includes a processing method that stores, in a storage medium, a program that causes the configuration of each of the example embodiments to operate so as to implement the function of each of the example embodiments described above, reads the program stored in the storage medium as a code, and executes the program in a computer. That is, the scope of each of the example embodiments also includes a computer readable storage medium. Further, each of the example embodiments includes not only the storage medium in which the program described above is stored but also the program itself.
- As the storage medium, for example, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a magnetic tape, a nonvolatile memory card, or the like can be used. Further, the scope of each of the example embodiments includes an example that operates on OS to perform a process in cooperation with another software or a function of an add-in board without being limited to an example that performs a process by an individual program stored in the storage medium.
- The whole or part of the example embodiments disclosed above can be described as, but not limited to, the following supplementary notes.
- (Supplementary Note 1)
- An information processing apparatus comprising:
- an acquisition unit that acquires, from a captured image obtained by capturing a passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger;
- a specifying unit that specifies boarding reservation information regarding the boarding by using the acquired biometric information; and
- an output unit that outputs information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
- (Supplementary Note 2)
- The information processing apparatus according to
supplementary note 1 further comprising a control unit that issues a token ID corresponding to registered biometric information of the passenger for each passenger and associates the registered biometric information with the boarding reservation information in advance via the token ID, wherein the specifying unit specifies the boarding reservation information on the passenger based on the token ID corresponding to the registered biometric information successfully matched with the biometric information. - (Supplementary Note 3)
- The information processing apparatus according to
supplementary note - (Supplementary Note 4)
- The information processing apparatus according to
supplementary note 3, - wherein the boarding reservation information includes a class of a seat in the airplane or a category of the passenger set by an airline company, and
- wherein the output unit outputs the priority based on the class or the category.
- (Supplementary Note 5)
- The information processing apparatus according to
supplementary note 4, wherein the output unit outputs a waiting place prepared for boarding corresponding to the priority. - (Supplementary Note 6)
- The information processing apparatus according to
supplementary note 5, wherein when a person having different priority from the priority corresponding to the waiting place is included in passengers waiting in the waiting place, the output unit outputs the information that suggests guide to another waiting place corresponding to the priority of the person. - (Supplementary Note 7)
- The information processing apparatus according to any one of
supplementary notes 3 to 6, - wherein the boarding reservation information further includes registered information regarding a predetermined accompanying person, and
- wherein the output unit outputs the priority based on whether or not the accompanying person is present.
- (Supplementary Note 8)
- The information processing apparatus according to
supplementary note 7, - wherein the acquisition unit acquires another biometric information of the passenger from another captured image obtained by capturing the passenger at the boarding gate,
- wherein the specifying unit specifies the boarding reservation information on the passenger by using the another biometric information, and
- wherein when the accompanying person recorded in the specified boarding reservation information is not included in the another captured image, the output unit outputs the information that suggests a check operation with respect to the accompanying person.
- (Supplementary Note 9)
- The information processing apparatus according to
supplementary note 7, - wherein the acquisition unit acquires another biometric information of the passenger from another captured image obtained by capturing the passenger at the boarding gate,
- wherein the specifying unit specifies the boarding reservation information on the passenger by using the another biometric information, and
- wherein when the accompanying person recorded in the specified boarding reservation information and a detected person detected together with the passenger in the another captured image are not the same, the output unit outputs the information that suggests a check operation with respect to the accompanying person.
- (Supplementary Note 10)
- The information processing apparatus according to any one of
supplementary notes 1 to 9, - wherein the specifying unit specifies each boarding reservation information on the passenger by using the biometric information of all passengers included in the captured image, and
- wherein the output unit outputs a list screen including the specified boarding reservation information.
- (Supplementary Note 11)
- The information processing apparatus according to any one of
supplementary notes 1 to 10, - wherein the specifying unit specifies the boarding reservation information by using the biometric information of the passenger specified on a screen displaying the captured image, and
- wherein the output unit outputs the specified boarding reservation information on the screen.
- (Supplementary Note 12)
- The information processing apparatus according to any one of
supplementary notes 1 to 11 further comprising: - an input unit used for inputting an extraction condition of the passenger; and
- an extraction unit that extracts the passenger corresponding to the boarding reservation information satisfying the extraction condition out of passengers included in the captured image.
- (Supplementary Note 13)
- The information processing apparatus according to any one of
supplementary notes 1 to 12 further comprising a selection unit that, out of images captured by an image capture device arranged in an airport, selects an image from which the biometric information of the passenger to be searched for is acquired, - wherein the output unit outputs positional information on the passenger in the airport based on the selected image.
- (Supplementary Note 14)
- An information processing method comprising steps of:
- acquiring, from a captured image obtained by capturing a passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger;
- specifying boarding reservation information regarding the boarding by using the acquired biometric information; and
- outputting information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
- (Supplementary Note 15)
- A storage medium storing a program that causes a computer to perform:
- acquiring, from a captured image obtained by capturing a passenger who is boarding on an airplane and has not yet passed through a boarding gate corresponding to the airplane, biometric information of the passenger;
- specifying boarding reservation information regarding the boarding by using the acquired biometric information; and
- outputting information used for supporting a procedure of the passenger at the boarding gate based on the specified boarding reservation information.
-
- NW1, NW2 network
- 1, 4 information processing system
- 2 reservation system
- 3 reservation information DB
- 10 management server
- 11 token ID information DB
- 12 passage history information DB
- 13 operation information DB
- 20 check-in terminal
- 30 automatic baggage drop-off machine
- 40 security inspection apparatus
- 50 automated gate apparatus
- 60 boarding gate apparatus
- 70 operation terminal
- 80 camera
- 100 information processing apparatus
- 100A acquisition unit
- 100B specifying unit
- 100C output unit
Claims (15)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/031978 WO2021029046A1 (en) | 2019-08-14 | 2019-08-14 | Information processing device, information processing method, and recording medium |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/031978 A-371-Of-International WO2021029046A1 (en) | 2019-08-14 | 2019-08-14 | Information processing device, information processing method, and recording medium |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/391,918 Continuation US20240127131A1 (en) | 2019-08-14 | 2023-12-21 | Information processing apparatus, information processing method, and storage medium |
US18/394,191 Continuation US20240127132A1 (en) | 2019-08-14 | 2023-12-22 | Information processing apparatus, information processing method, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210342750A1 true US20210342750A1 (en) | 2021-11-04 |
Family
ID=74571074
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/645,966 Pending US20210342750A1 (en) | 2019-08-14 | 2019-08-14 | Information processing apparatus, information processing method, and storage medium |
US18/391,918 Pending US20240127131A1 (en) | 2019-08-14 | 2023-12-21 | Information processing apparatus, information processing method, and storage medium |
US18/394,191 Pending US20240127132A1 (en) | 2019-08-14 | 2023-12-22 | Information processing apparatus, information processing method, and storage medium |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/391,918 Pending US20240127131A1 (en) | 2019-08-14 | 2023-12-21 | Information processing apparatus, information processing method, and storage medium |
US18/394,191 Pending US20240127132A1 (en) | 2019-08-14 | 2023-12-22 | Information processing apparatus, information processing method, and storage medium |
Country Status (5)
Country | Link |
---|---|
US (3) | US20210342750A1 (en) |
EP (1) | EP4016437A4 (en) |
JP (2) | JP7235123B2 (en) |
CN (1) | CN114175087A (en) |
WO (1) | WO2021029046A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220284837A1 (en) * | 2021-03-03 | 2022-09-08 | Fujitsu Limited | Computer-readable recording medium storing display control program, display control method, and display control apparatus |
WO2023159525A1 (en) * | 2022-02-25 | 2023-08-31 | 京东方科技集团股份有限公司 | Customer service method, apparatus and system, and storage medium |
WO2023227483A1 (en) * | 2022-05-23 | 2023-11-30 | Amadeus S.A.S. | Biometric data access |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7266071B2 (en) * | 2021-08-02 | 2023-04-27 | 株式会社日立ソリューションズ西日本 | Online authenticator, method and program |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070046426A1 (en) * | 2005-08-26 | 2007-03-01 | Kabushiki Kaisha Toshiba | Admittance management system and admittance management method |
US20080122578A1 (en) * | 2006-06-27 | 2008-05-29 | Hoyos Hector T | Ensuring the provenance of passengers at a transportation facility |
JP2015222459A (en) * | 2014-05-22 | 2015-12-10 | 株式会社日立製作所 | Immigration examination system and method |
US20180373936A1 (en) * | 2017-06-22 | 2018-12-27 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US20200047346A1 (en) * | 2016-10-13 | 2020-02-13 | Lg Electronics Inc. | Robot and robot system comprising same |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001256517A (en) * | 2000-03-10 | 2001-09-21 | Kazuo Fujimoto | Electronic ticket system and entrance check method using electronic ticket |
JP2006127322A (en) * | 2004-10-29 | 2006-05-18 | Mitsubishi Heavy Ind Ltd | Customer movement control system and customer movement control method |
JP2006164073A (en) * | 2004-12-09 | 2006-06-22 | Oki Electric Ind Co Ltd | Customer guidance system |
JP2007079656A (en) | 2005-09-12 | 2007-03-29 | Hitachi Ltd | Ticketless boarding system and method |
JP5066956B2 (en) * | 2007-03-13 | 2012-11-07 | 富士通株式会社 | Search support method, mobile terminal for search, server device, search support system, and computer program |
JP5851651B2 (en) * | 2013-03-21 | 2016-02-03 | 株式会社日立国際電気 | Video surveillance system, video surveillance method, and video surveillance device |
JP5568153B1 (en) * | 2013-03-25 | 2014-08-06 | 空港情報通信株式会社 | Passenger detection system |
JPWO2015136938A1 (en) * | 2014-03-14 | 2017-04-06 | 株式会社東芝 | Information processing method and information processing system |
US10275587B2 (en) * | 2015-05-14 | 2019-04-30 | Alclear, Llc | Biometric ticketing |
JP6534597B2 (en) * | 2015-10-20 | 2019-06-26 | 株式会社エクサ | Airport passenger tracking system |
JP2018017924A (en) * | 2016-07-28 | 2018-02-01 | 日本電気株式会社 | Information display system, server, information display device, screen generation method, information display method, and program |
CN110100261A (en) * | 2016-12-22 | 2019-08-06 | 日本电气方案创新株式会社 | Non- boarding passengers search equipment, non-boarding passengers searching method and recording medium |
CN107909683A (en) * | 2017-10-25 | 2018-04-13 | 平安科技(深圳)有限公司 | Realize method, terminal device and the computer-readable recording medium of boarding |
JP2019082450A (en) * | 2017-10-31 | 2019-05-30 | キヤノン株式会社 | Information processing apparatus, notification system, method and program |
CN109102615A (en) * | 2018-11-09 | 2018-12-28 | 昆明船舶设备集团有限公司 | A kind of terminal passenger shunting guidance system with face information acquisition |
-
2019
- 2019-08-14 US US16/645,966 patent/US20210342750A1/en active Pending
- 2019-08-14 EP EP19941458.2A patent/EP4016437A4/en active Pending
- 2019-08-14 JP JP2021539773A patent/JP7235123B2/en active Active
- 2019-08-14 WO PCT/JP2019/031978 patent/WO2021029046A1/en unknown
- 2019-08-14 CN CN201980098888.8A patent/CN114175087A/en active Pending
-
2023
- 2023-02-21 JP JP2023025108A patent/JP7540521B2/en active Active
- 2023-12-21 US US18/391,918 patent/US20240127131A1/en active Pending
- 2023-12-22 US US18/394,191 patent/US20240127132A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070046426A1 (en) * | 2005-08-26 | 2007-03-01 | Kabushiki Kaisha Toshiba | Admittance management system and admittance management method |
US20080122578A1 (en) * | 2006-06-27 | 2008-05-29 | Hoyos Hector T | Ensuring the provenance of passengers at a transportation facility |
JP2015222459A (en) * | 2014-05-22 | 2015-12-10 | 株式会社日立製作所 | Immigration examination system and method |
US20200047346A1 (en) * | 2016-10-13 | 2020-02-13 | Lg Electronics Inc. | Robot and robot system comprising same |
US20180373936A1 (en) * | 2017-06-22 | 2018-12-27 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220284837A1 (en) * | 2021-03-03 | 2022-09-08 | Fujitsu Limited | Computer-readable recording medium storing display control program, display control method, and display control apparatus |
US11854441B2 (en) * | 2021-03-03 | 2023-12-26 | Fujitsu Limited | Computer-readable recording medium storing display control program, display control method, and display control apparatus |
WO2023159525A1 (en) * | 2022-02-25 | 2023-08-31 | 京东方科技集团股份有限公司 | Customer service method, apparatus and system, and storage medium |
WO2023227483A1 (en) * | 2022-05-23 | 2023-11-30 | Amadeus S.A.S. | Biometric data access |
Also Published As
Publication number | Publication date |
---|---|
JP7540521B2 (en) | 2024-08-27 |
EP4016437A4 (en) | 2022-08-17 |
WO2021029046A1 (en) | 2021-02-18 |
JPWO2021029046A1 (en) | 2021-02-18 |
CN114175087A (en) | 2022-03-11 |
JP7235123B2 (en) | 2023-03-08 |
US20240127132A1 (en) | 2024-04-18 |
EP4016437A1 (en) | 2022-06-22 |
JP2023054187A (en) | 2023-04-13 |
US20240127131A1 (en) | 2024-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240127132A1 (en) | Information processing apparatus, information processing method, and storage medium | |
US20240028682A1 (en) | Information processing apparatus, information processing method, and storage medium | |
JP7380723B2 (en) | Information processing device, information processing method and program | |
US20240311944A1 (en) | Program, information processing apparatus, and information processing method | |
US20240127248A1 (en) | Information processing apparatus, server device, information processing method, and storage medium | |
US20220058760A1 (en) | Information processing apparatus, information processing method, and storage medium | |
WO2021059526A1 (en) | Information processing device, information processing method, and recording medium | |
AU2023203252A1 (en) | Information processing apparatus, information processing method, and storage medium | |
US12141259B2 (en) | Information processing apparatus, information processing method, and storage medium | |
JP2024159787A (en) | Information processing device, information processing method, and program | |
JP7487827B2 (en) | Information processing device, information processing method, and recording medium | |
JP7327651B2 (en) | Information processing device, information processing method and program | |
US20220414195A1 (en) | Information processing apparatus, information processing method, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NEC CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HIRAMOTO, NORIYUKI;TAKAHASHI, KAZUYOSHI;REEL/FRAME:052157/0703 Effective date: 20200210 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |