US20160381412A1 - User centric adaptation of vehicle entertainment system user interfaces - Google Patents
- Publication number
- US20160381412A1 (application US 14/751,963)
- Authority
- US
- United States
- Prior art keywords
- passenger
- metric
- display unit
- video display
- control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- G06K9/00268
-
- G06K9/00315
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/176—Dynamic expression
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H20/00—Arrangements for broadcast or for distribution combined with broadcast
- H04H20/53—Arrangements specially adapted for specific applications, e.g. for traffic information or for mobile receivers
- H04H20/61—Arrangements specially adapted for specific applications, e.g. for traffic information or for mobile receivers for local area broadcast, e.g. instore broadcast
- H04H20/62—Arrangements specially adapted for specific applications, e.g. for traffic information or for mobile receivers for local area broadcast, e.g. instore broadcast for transportation systems, e.g. in vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H60/00—Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
- H04H60/29—Arrangements for monitoring broadcast services or broadcast-related services
- H04H60/33—Arrangements for monitoring the users' behaviour or opinions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H60/00—Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
- H04H60/61—Arrangements for services using the result of monitoring, identification or recognition covered by groups H04H60/29-H04H60/54
- H04H60/65—Arrangements for services using the result of monitoring, identification or recognition covered by groups H04H60/29-H04H60/54 for using the result on users' side
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/214—Specialised server platform, e.g. server located in an airplane, hotel, hospital
- H04N21/2146—Specialised server platform, e.g. server located in an airplane, hotel, hospital located in mass transportation means, e.g. aircraft, train or bus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/258—Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
- H04N21/25866—Management of end-user data
- H04N21/25883—Management of end-user data being end-user demographical data, e.g. age, family status or address
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/41422—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance located in transportation means, e.g. personal vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42201—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/4508—Management of client data or end-user data
- H04N21/4532—Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/466—Learning process for intelligent management, e.g. learning user preferences for recommending movies
- H04N21/4668—Learning process for intelligent management, e.g. learning user preferences for recommending movies for recommending content, e.g. movies
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/482—End-user interface for program selection
- H04N21/4825—End-user interface for program selection using a list of items to be played back in a given order, e.g. playlists
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/485—End-user interface for client configuration
- H04N21/4858—End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/812—Monomedia components thereof involving advertisement data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
Definitions
- Embodiments described herein relate generally to electronic entertainment systems and, more particularly, to man-machine interfaces for controlling vehicle and other entertainment systems.
- In-flight entertainment (IFE) systems are deployed onboard aircraft to provide entertainment for passengers in a passenger cabin. IFE systems typically provide passengers with television, movies, games, audio entertainment programming, and other content.
- IFE systems typically provide a Graphical User Interface (GUI) on an in-seat Video Display Unit (VDU) that can include an interactive touch selection interface.
- the GUI includes graphical elements that can include buttons, text, icons, and images.
- a user can touch select a displayed graphical element to initiate an action or navigate to another layer of graphical elements associated with the selected graphical element.
- the GUI is programmatically implemented to provide a same interface on all VDUs within an aircraft and across all aircraft using that type of VDU.
- Some embodiments of the present disclosure are directed to a vehicle entertainment system that includes a video display unit and a user interface (UI) control processor.
- the UI control processor is configured to receive passenger attributes sensed from the passenger operating the video display unit, generate a passenger metric that characterizes the passenger based on the attributes, and control a UI of the video display unit based on the passenger metric.
- the vehicle entertainment system may be an in-flight entertainment system configured to be mounted within an aircraft fuselage.
- the UI can be controlled responsive to the passenger's demographics which are sensed via a camera, responsive to the passenger's emotions, responsive to the passenger's attentiveness, responsive to the passenger's determined UI operational effectiveness, etc.
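- As a concrete illustration of this receive, characterize, and control loop, consider the following minimal sketch. It is not part of the patent disclosure; the attribute names, thresholds, and UI profile labels are all invented for the example:

```python
# Minimal sketch of the receive -> characterize -> control loop.
# All names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PassengerAttributes:
    estimated_age: int            # e.g., inferred from a camera signal
    gaze_on_screen_ratio: float   # 0.0..1.0, from eye tracking
    mis_selection_rate: float     # fraction of erroneous touch selections

def generate_passenger_metric(attrs: PassengerAttributes) -> dict:
    """Collapse raw sensed attributes into coarse passenger metrics."""
    if attrs.estimated_age < 13:
        age_band = "child"
    elif attrs.estimated_age > 65:
        age_band = "senior"
    else:
        age_band = "adult"
    return {
        "age_band": age_band,
        "attentive": attrs.gaze_on_screen_ratio > 0.5,
        "struggling": attrs.mis_selection_rate > 0.2,
    }

def choose_ui_profile(metric: dict) -> str:
    """Map passenger metrics to one of several predefined UI profiles."""
    if metric["age_band"] != "adult" or metric["struggling"]:
        return "simplified"  # larger widgets, fewer layers, more guidance
    return "standard" if metric["attentive"] else "low-distraction"

attrs = PassengerAttributes(estimated_age=71,
                            gaze_on_screen_ratio=0.8,
                            mis_selection_rate=0.05)
print(choose_ui_profile(generate_passenger_metric(attrs)))  # -> simplified
```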
- FIG. 1 illustrates a block diagram of an in-flight entertainment system that generates passenger metrics which are used to control the user interface of a video display unit (VDU) according to some embodiments;
- FIG. 2 illustrates example information and operations used to characterize a passenger and example operations performed to control the UI of a VDU according to some embodiments
- FIGS. 3-10 illustrate flowcharts of operations and methods that may be performed by a processor of a VDU, another component of the IFE, and/or a component off-board the aircraft to characterize a passenger and example operations performed to control the UI according to some embodiments;
- FIG. 11 illustrates a VDU configured according to some embodiments.
- FIG. 12 is a block diagram of components that may be included in the VDU 100 configured to operate according to some embodiments.
- Various embodiments of the present disclosure may arise from the present realization that providing a same fixed user interface (UI) to all passengers may satisfy some passengers, but will not sufficiently satisfy all passengers.
- Some passengers have a high level of computer sophistication and are more likely to desire increased complexity UIs that provide expanded capability and passenger-interaction efficient functionality.
- in sharp contrast, some other passengers have a lower level of computer sophistication and are more likely to desire less complex, more intuitive UIs that may exhibit less passenger-interaction efficient functionality but provide greater guidance through more UI queries and passenger responses to perform functionality.
- an IFE system includes a UI control processor that obtains attributes of an individual passenger, generates passenger metrics that characterize the individual passenger, and controls the UI of a video display unit (VDU) used by that individual passenger based on the passenger metrics.
- FIG. 1 is a block diagram of an IFE system 10 that includes video display units (VDUs) 100 a - n, a head end content server 40 , and distribution components 20 .
- the system 10 further includes a UI control processor 120 that may reside at least partially within each VDU 100 , reside at least partially within another component of the IFE system 10 separate from the VDU 100 (e.g., within the head end content server 40 or elsewhere), and/or reside at least partially off-board the aircraft such as on a land based server.
- the UI control processor 120 may, for example, be incorporated within a land based server that is communicatively connected through a data network 82 (e.g., a private or public network, such as the Internet) and a Radio Access Network 80 (e.g., a satellite communication system transceiver and/or cellular communication system transceiver) to a wireless transceiver of a network interface 50 to communicate with the VDUs 100 .
- FIG. 3 illustrates operations and methods that may be performed at least in part by the UI control processor 120 .
- the UI control processor 120 receives (block 300 ) passenger attributes that have been sensed. Various approaches for sensing and identifying passenger attributes are described in detail regarding FIGS. 2 and 4-11 .
- the UI control processor 120 generates (block 302 ) passenger metrics characterizing the passenger based on the attributes, and controls the UI of one of the VDUs 100 which is being operated by that passenger, based on the passenger metrics.
- the head end content server 40 stores a set of content and is configured to separately deliver content to the VDUs 100 a - n responsive to content selection commands separately received from the VDUs 100 a - n through a data network 30 and the distribution components 20 .
- the distribution components 20 may include seat electronics boxes 22 , each of which can be spaced apart adjacent to different groups of seats, and/or one or more wireless communication routers 24 .
- Example content that can be provided by the head end content server 40 to selected ones of the VDUs 100 a - n can include, but is not limited to, movies, TV programs, audio programs, application programs (e.g., games, news, etc.), informational videos and/or textual descriptions (e.g., regarding destination cities, services, and products), and advertisements.
- the wireless router 24 may be a WLAN router (e.g., IEEE 802.11, WiMAX, etc.), a cellular-based network (e.g., a pico cell radio base station), etc.
- the VDUs 100 a - n are connected to the head end content server 40 to request and receive content through wired and/or wireless network connections through the network 30 and/or the distribution components 20 .
- functionality described herein as being performed by the UI control processor 120 may actually be performed in part by more than one UI control processor and may involve other components, such as sensors which sense passenger attributes, personal devices of the passengers which they have carried on-board the aircraft, etc.
- the video display units 100 a - 100 n may alternatively be configured to store content in internal/local mass memory for access by users and/or may stream and/or download content from other devices, such as from other video display units (e.g., peer-to-peer sharing) and/or from off-board devices such as ground based content servers accessed via a satellite and/or cellular communication system.
- although embodiments herein are primarily described in the context of an IFE system within an aircraft cabin, the invention is not limited thereto. Instead, the embodiments may be used to provide other types of vehicle entertainment systems for trains, automobiles, cruise ships, buses, and other vehicles.
- the VDUs 100 a - n can be attached to seatbacks so they face passengers/users in adjacent rearward seats.
- the VDUs 100 a - n may alternatively be mounted to bulkheads, movable support assemblies connected to seat armrests and/or seat frames, etc.
- Embodiments herein may be used with vehicle electronic systems other than entertainment systems, such as with vehicle digital signage display systems, etc.
- FIG. 2 illustrates example information and operations used to characterize a passenger and example operations performed to control the UI of a VDU according to some embodiments.
- the UI control processor 110 includes a user centric UI adaptation controller 200 that controls user interface elements 230 displayable on a display device 240 of the VDU 100 and/or controls content from the head end content server 40 based on passenger attribute data that is sensed by electronic sensors and characterizes one or more attributes of the passenger.
- the passenger attribute data can include, but is not limited to, one or more of the following: passenger demographic data 210 , passenger emotion data 212 , passenger attentiveness data 214 , passenger biometric data 216 , passenger fitness tracking data 218 , passenger UI operational effectiveness data 220 , content consumption data 222 , displayed advertisements data 224 which may include passenger eye tracking data, and data identifying cabin events directed to the passenger and/or flight phase data 226 .
- the UI adaptation controller 200 controls or adapts the UI elements so that they are more optimized for use by the passenger responsive to the passenger attribute data characterizing the passenger.
- the user interface elements 230 that can be controlled or adapted can include, but are not limited to, one or more of the following: information layout in UI layers; selectable widget layout in UI layers; operational flow between UI layers; operational flow between selectable widgets in a UI layer; font size, shape, and/or color; selectable widget size, shape, and/or color; applications made available through the VDU; movies, television programs, and/or audio programs made available through the VDU; shuffle order in which UI elements are displayed; timing and/or selection of advertisements and/or advertisement display format; and timing and/or selection of help items.
- FIG. 4 illustrates further operations that may be performed by the UI control processor 110 , e.g., via the user centric UI adaptation controller 200 , to generate passenger metrics that can be used to control one or more of the UI elements.
- although FIG. 4 illustrates a sequence of operations, it is to be understood that the UI control processor 110 may be configured to perform only one of the operations, or to perform a plurality of the operations and select among them based on the passenger metrics and/or based on a defined rule which may respond to a defined event occurring at the time.
- the operations may include determining (block 400 ) a passenger demographic metric based on the passenger data.
- the controller 200 may analyze the camera signal to estimate the passenger's age based on hair color, sensed facial wrinkles or skin tightness, complexion, height, weight, etc. The controller 200 may similarly determine the gender and/or ethnicity of the passenger based on skin color, eye color, etc.
- the VDU 100 includes a camera configured to output a camera signal containing data representing the passenger's face.
- the UI control processor 110 is configured to process the camera signal to identify (block 500 ) facial features of the passenger, and compare (block 502 ) the facial features to defined demographics rules.
- the UI control processor 110 determines (block 504 ) a passenger demographic metric based on the comparison of the facial features to the defined demographics rules, and controls (block 506 ) the UI of the VDU 100 based on the passenger demographic metric.
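- A hypothetical sketch of this block 500 - 506 flow follows. The facial-feature names, rule predicates, and thresholds are assumptions chosen for illustration, not the patent's defined demographics rules:

```python
# Compare extracted facial features against defined demographics rules
# (blocks 502-504). Feature names and thresholds are assumptions.
def demographic_metric(features: dict) -> dict:
    rules = [
        ("senior", lambda f: f["wrinkle_density"] > 0.5
                             and f["skin_tightness"] < 0.4),
        ("child",  lambda f: f["face_height_cm"] < 16.0),
    ]
    for label, rule in rules:  # first matching rule wins
        if rule(features):
            return {"age_band": label}
    return {"age_band": "adult"}

features = {"wrinkle_density": 0.6, "skin_tightness": 0.3,
            "face_height_cm": 20.0}
print(demographic_metric(features))  # -> {'age_band': 'senior'}
```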
- the operations may include determining (block 402 ) a passenger emotion metric based on the passenger data.
- the UI control processor 110 is configured to process the camera signal to identify (block 600 ) facial features of the passenger, and compare (block 602 ) the facial features to defined emotion rules, such as by identifying facial expressions that are compared to the emotion rules.
- the UI control processor 110 determines (block 604 ) a passenger emotion metric based on the comparison of the facial features to the defined emotion rules, and controls (block 606 ) the UI of the VDU 100 based on the passenger emotion metric.
- the facial analysis operations performed on the camera signal may identify occurrences of facial expressions that are classified based on the emotion rules as being, for example, neutral, smiling, laughing, sad, bored, sleeping, or surprised.
- the processor 110 may correlate changes in emotion to a timeline of the content, such as by correlating identified emotions to a timeline of content dialog or scene changes within content. Changes in emotion may be correlated to introduction or removal of advertisements which may be displayed in-line during commercial breaks in the content or displayed in parallel (e.g., within advertisement bars adjacent to the content, within picture-in-picture windows overlaid on the content, etc.) and/or correlated to tracked locations within advertisement dialogs or advertisement scene changes within content. The correlations may be identified by information included in the enhanced content usage metrics. Thus, for example, the UI of the VDU 100 may be controlled based on an average level of happiness, sadness, boredom, surprise, inattentiveness, etc. exhibited by the passenger during identified scenes and/or times within the content.
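- The emotion classification and timeline correlation described above might be sketched as follows. The expression measurements, rule thresholds, and scene-boundary format are assumptions:

```python
# Classify per-frame facial expressions with simple emotion rules and tally
# the resulting emotions per content scene. All input formats are assumed.
from collections import defaultdict

def classify(m: dict) -> str:
    if m["mouth_curve"] > 0.3 and m["mouth_open"] > 0.5:
        return "laughing"
    if m["mouth_curve"] > 0.3:
        return "smiling"
    if m["mouth_curve"] < -0.3:
        return "sad"
    return "neutral"

def emotions_per_scene(frames, scene_starts):
    """frames: [(t_seconds, measurements)]; scene_starts: sorted start times."""
    tally = defaultdict(lambda: defaultdict(int))
    for t, m in frames:
        scene = sum(1 for s in scene_starts if s <= t) - 1
        tally[scene][classify(m)] += 1
    return tally

frames = [(1.0, {"mouth_curve": 0.4, "mouth_open": 0.6}),
          (62.0, {"mouth_curve": -0.5, "mouth_open": 0.1})]
print(dict(emotions_per_scene(frames, [0.0, 60.0])))
# scene 0 saw one laughing frame; scene 1 saw one sad frame
```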
- the operations may include determining (block 404 ) a passenger attentiveness metric based on the passenger data.
- the UI control processor 110 is configured to process the camera signal to identify (block 700 ) facial features of the passenger, and compare (block 702 ) the facial features to defined attentiveness rules.
- the UI control processor 110 determines (block 704 ) a passenger attentiveness metric based on the comparison of the facial features to the defined attentiveness rules, and controls (block 706 ) the UI of the VDU 100 based on the passenger attentiveness metric.
- the operations may include determining (block 406 ) a passenger's UI operational effectiveness metric based on the passenger data.
- the UI control processor 110 is configured to monitor (block 800 ) a passenger's operation of the UI of the VDU 100 to identify passenger selections that indicate that the passenger has incorrectly selected displayed elements of the UI and/or has selected an ineffective (e.g., inefficient) sequence of displayed elements to perform an operation.
- a passenger UI operational effectiveness metric is determined (block 802 ) based on the monitoring, and the UI of the VDU 100 is controlled (block 804 ) based on the passenger UI operational effectiveness metric.
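- Under an assumed interaction-log format, the block 800 - 804 monitoring could compute an effectiveness metric along these lines (the log schema and the notion of a shortest selection path are illustrative):

```python
# Estimate UI operational effectiveness from a trace of touch selections.
# events: (task_id, widget_id, was_error); shortest: task_id -> minimum
# number of selections needed for the task. Both formats are assumptions.
def ui_effectiveness(events, shortest):
    errors = sum(1 for _, _, was_error in events if was_error)
    error_rate = errors / len(events)
    taken = {}
    for task, _, _ in events:
        taken[task] = taken.get(task, 0) + 1
    # 1.0 means the passenger used the shortest possible selection sequence
    efficiency = sum(shortest[t] / n for t, n in taken.items()) / len(taken)
    return {"error_rate": error_rate, "efficiency": efficiency}

events = [("play_movie", "menu", False), ("play_movie", "settings", True),
          ("play_movie", "movies", False), ("play_movie", "play", False)]
print(ui_effectiveness(events, {"play_movie": 3}))
# -> {'error_rate': 0.25, 'efficiency': 0.75}
```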
- the operations may include determining (block 408 ) if a passenger's biometric data satisfies a defined rule.
- the UI control processor 110 is configured to receive (block 900 ) passenger biometric data from a biosensor, compare (block 902 ) the passenger biometric data to a defined rule, and control (block 904 ) the UI of the VDU 100 based on the comparison.
- the operations may include determining (block 410 ) a passenger metric based on fitness data received from the passenger's electronic device.
- the UI control processor 110 is configured to receive (block 1000 ) passenger fitness tracking data through a wireless connection to a passenger fitness tracking device (e.g., mobile phone, fitness tracking device worn by passenger, etc.).
- a passenger fitness metric is determined (block 1002 ) based on the passenger fitness tracking data from the passenger fitness tracking device.
- the UI control processor 110 can compare the passenger fitness metric to a defined rule, and control (block 1006 ) the UI of the VDU 100 based on the comparison.
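- Both the biometric comparison (blocks 900 - 904 ) and the fitness-metric comparison (blocks 1000 - 1006 ) reduce to checking readings against defined rules. A sketch with invented reading names and ranges:

```python
# Compare biometric / fitness-tracker readings against defined rules.
# The reading names and ranges are placeholders, not values from the patent.
RULES = {
    "heart_rate_bpm":   lambda v: 50 <= v <= 100,
    "respiration_rate": lambda v: 10 <= v <= 20,
}

def passenger_state(readings: dict) -> dict:
    flagged = [k for k, ok in RULES.items()
               if k in readings and not ok(readings[k])]
    return {"relaxed": not flagged, "out_of_range": flagged}

print(passenger_state({"heart_rate_bpm": 118, "respiration_rate": 14}))
# -> {'relaxed': False, 'out_of_range': ['heart_rate_bpm']}
```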
- the UI control processor 110 may perform operations to generate (block 412 ) a passenger metric based on content that has been tracked and determined to have been consumed through the VDU 100 .
- the passenger metric may characterize which content items were consumed during a flight and how much passenger attention was given to various portions of the content items.
- the passenger metric may be generated (block 414 ) based on advertisements that have been displayed on the VDU 100 , and may be further generated based on tracking where on the display device the passenger has looked and for how long, and if and how long the passenger has looked at the displayed advertisements.
- the passenger metric may be generated (block 416 ) based on a present flight phase of the aircraft, and/or may be generated (block 418 ) based on cabin events (e.g., announcements, food/beverage service, etc.) that are directed to the passenger individually or collectively to the passengers.
- Metrics may be generated based on flight phase information, which may include one or more of boarding phase, departure taxi phase, take off phase, cruise phase, landing phase, arrival taxi phase, and disembarking phase.
- Metrics may be generated based on flight itinerary information, which may include one or more of the departure city, the destination city, departure time, arrival time, and flight duration.
- Metrics may be generated based on one or more of providing food (e.g., meal and/or beverage) service, crew announcements to passengers, occurrence of aircraft turbulence, and other events that may affect a passenger's attentiveness to information and operational widgets displayed through the UI, and which may affect that ability of the passenger to correctly operate the UI.
- the processor 110 can be further configured to process the camera signal to determine an eye viewing location on a display device of the VDU 100 , and to correlate the eye viewing location to a timeline of content consumed by the user through the VDU 100 .
- the processor 110 may control the UI based on the correlation of the eye viewing location to the timeline of content consumed by the user through the VDU 100 .
- the processor 110 may identify particular locations within a timeline of the content that triggered changes in passenger emotion, such as laughing, smiling, surprise, etc.
- the processor 110 may determine whether and how long a passenger looked at an advertisement displayed in a border area of a movie, how long a passenger looked at content of the movie and what content, how long the passenger looked elsewhere while the movie was playing and what content was missed, how long a passenger looked at defined items contained within the movie (e.g., products placed within a movie for advertisement), etc., and generate metrics based thereon.
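- One way to realize such eye-tracking metrics is to accumulate gaze dwell time per screen region. In this sketch the region geometry and gaze-sample format are assumptions:

```python
# Accumulate how long the eye viewing location fell on each screen region,
# e.g., the movie area vs. an advertisement border. Formats are assumed.
def gaze_dwell(samples, regions):
    """samples: time-ordered [(t, x, y)]; regions: name -> (x0, y0, x1, y1)."""
    dwell = {name: 0.0 for name in regions}
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x < x1 and y0 <= y < y1:
                dwell[name] += t1 - t0  # gaze held here until next sample
                break
    return dwell

regions = {"movie": (0, 0, 1280, 620), "ad_banner": (0, 620, 1280, 720)}
samples = [(0.0, 400, 300), (1.0, 640, 680), (2.5, 640, 680), (3.0, 5, 5)]
print(gaze_dwell(samples, regions))  # -> {'movie': 1.0, 'ad_banner': 2.0}
```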
- the UI control processor 110 can control (block 420 ) one or more of the UI elements of the VDU 100 based on one or more of these and/or other metrics.
- the UI control processor 110 can compare (block 1100 ) one or more of the determined passenger metrics to one or more defined rules to determine how to control the UI of the VDU 100 .
- the layout of information in UI layers of the VDU 100 can be controlled (block 1102 ) based on the comparison.
- the UI control processor 110 selects a layout of information for UI layers from among a plurality of different defined layouts of information for UI layers displayable on the VDU 100 based on the passenger metric, and displays the defined layout of information for the UI layers through the UI of the VDU 100 .
- the layout of selectable widgets in UI layers of the VDU 100 can be controlled (block 1104 ) based on the comparison.
- the UI control processor 110 selects a layout of selectable widgets for a UI layer from among a plurality of different defined layouts of selectable widgets for a UI layer displayable on the VDU 100 based on the passenger metric, and displays the defined layout of selectable widgets for the UI layer through the UI of the VDU 100 operated by that passenger.
- the operational flow between UI layers of the VDU 100 can be controlled (block 1106 ) based on the comparison.
- the UI control processor 110 selects an operational flow between passenger selectable widgets of different UI layers from among a plurality of different defined operational flows between passenger selectable widgets of different UI layers displayable on the VDU 100 based on the passenger metric, and controls the operational flow between the passenger selectable widgets of the different UI layers displayed through the UI of the VDU 100 operated by the passenger, based on the operational flow that was selected.
- the list of menus and/or operational features presented to passengers and/or content of menus presented for triggering operational flows may be simplified for passengers who are estimated to be younger or older than a threshold age. Passengers within a defined age range may be provided more menu options for more advanced functionality and/or provided a greater array of operational features of the video display unit 100 , while other passengers outside the defined age range may be precluded from accessing such advanced functionality and/or provided a subset of the array of operational features.
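- A toy example of this age-range gating, with invented menu contents and age bounds:

```python
# Select a full or simplified menu tree based on an estimated age range.
FULL_MENU = {"Movies": ["Browse", "Search", "Watchlist", "Advanced filters"],
             "Apps":   ["Games", "Maps", "Shopping"]}
SIMPLE_MENU = {"Movies": ["Browse"], "Apps": ["Games"]}

def select_menu(estimated_age: int, lo: int = 13, hi: int = 65) -> dict:
    """Passengers estimated outside [lo, hi] get the simplified tree."""
    return FULL_MENU if lo <= estimated_age <= hi else SIMPLE_MENU

print(select_menu(8))  # -> simplified tree for an estimated child passenger
```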
- the font size, font shape, and/or color of text displayed through the UI of the VDU 100 operated by the passenger can be controlled (block 1108 ) based on the passenger metric.
- the size, the shape, and/or the color of passenger selectable widgets displayed through the UI of the VDU 100 operated by the passenger can be controlled (block 1110 ) based on the passenger metric.
- the processor 110 may determine a size or layout of text and/or passenger selectable widgets to be displayed on the VDU 100 responsive to an estimate of the passenger's age. For example, passengers having an age beyond a threshold value may be displayed text or selectable widgets having a greater minimum size and/or less graphically complex screen layouts than younger passengers, or vice versa. Similarly, passengers who are exhibiting tiredness or inattentiveness may be displayed text or selectable widgets having a greater minimum size and/or less graphically complex screen layouts than passengers not exhibiting those identifiable conditions, or vice versa.
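- A sketch of this block 1108 / 1110 style of control, with illustrative base sizes and scale factors (none of these numbers are specified by the patent):

```python
# Scale font and widget sizes from demographic and attentiveness metrics.
def display_params(age_band: str, tired: bool) -> dict:
    base_font_px, base_widget_px = 16, 48  # assumed defaults
    scale = 1.0
    if age_band == "senior":
        scale *= 1.4  # larger minimum sizes for older passengers
    if tired:
        scale *= 1.2  # and for tired or inattentive passengers
    return {"font_px": round(base_font_px * scale),
            "widget_px": round(base_widget_px * scale)}

print(display_params("senior", tired=True))
# -> {'font_px': 27, 'widget_px': 81}
```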
- the processor 110 can control what content is offered for consumption through the VDU 100 based on the passenger demographics, passenger emotion, and/or attentiveness. For example, the processor 110 may filter a set of content stored on the content server 40 to generate a filtered list based on the determined user demographics, passenger emotion, and/or attentiveness, and communicate the filtered list of available content to the VDU 100 for display to that passenger. Thus, the content made available to a passenger can be filtered based on an estimate of the passenger's age and/or emotion. Passengers who exhibit a threshold level of attentiveness or, alternatively, less than a threshold level of attentiveness, may be provided an expanded variety of content for consumption through their respective VDUs 100 .
- the processor 110 may filter the list of movies, television programs, audio programs, and/or applications to generate a recommended list based on correlating the passenger attributes to historical data indicating the preferences of passengers who have certain determined attributes.
- the processor 110 may use a repository of information that identifies various content preferences that have been observed for passengers having identified attributes, to generate a list of movies, television programs, audio programs, and/or applications that are predicted to be preferred by the present passenger based on the present passenger's attributes.
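- The filtering and preference-based ordering could look like the following sketch; the catalog fields, rating rule, and preference weights are all invented for illustration:

```python
# Filter the head-end catalog by an age-band rating rule, then order the
# result by historical genre preferences of similar passengers.
CATALOG = [
    {"title": "Action Movie", "rating": "R",  "genre": "action"},
    {"title": "Cartoon",      "rating": "G",  "genre": "animation"},
    {"title": "Documentary",  "rating": "PG", "genre": "documentary"},
]
ALLOWED = {"child": {"G"},
           "adult": {"G", "PG", "R"},
           "senior": {"G", "PG", "R"}}
PREFS = {"child":  {"animation": 0.9, "documentary": 0.2},
         "senior": {"documentary": 0.8, "action": 0.3}}

def recommend(age_band: str) -> list:
    allowed = [c for c in CATALOG if c["rating"] in ALLOWED[age_band]]
    weights = PREFS.get(age_band, {})  # 0.5 default for unknown genres
    return sorted(allowed, key=lambda c: -weights.get(c["genre"], 0.5))

print([c["title"] for c in recommend("senior")])
# -> ['Documentary', 'Cartoon', 'Action Movie']
```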
- the UI control processor is configured to control the UI of the VDU 100 based on the passenger metric by controlling an order with which content is identified in a list based on the passenger metric.
- the content is available for display through the UI of the VDU 100 responsive to passenger selection among the ordered list of content. At least a portion of the ordered list of content is then displayed through the UI of the VDU 100 .
- a subset of movies, television programs, and/or audio programs that are available on the head end content server 40 can be selected (block 1114 ) to be made available through the UI of the VDU 100 operated by that passenger, based on the passenger metric.
- the UI control processor 110 generates a list of movies, television programs, and/or audio programs that are available from the memory within the content server 40 to be displayed through the UI of the VDU, and filters the list of movies, television programs, and/or audio programs based on the passenger metric to generate a filtered list of movies, television programs, and/or audio programs.
- the processor 110 displays the filtered list of movies, television programs, and/or audio programs through the UI of the video display unit 100 .
- the filtering may furthermore be based on metadata that has been associated with the content.
- the metadata may include, but is not limited to, a description of the content, poster art representing the content, etc.
- a subset of applications that are available on the head end content server 40 can be selected (block 1114 ) to be made available through the UI of the VDU 100 for execution by a processor of the VDU 100 , based on the passenger metric.
- the UI control processor 110 generates a list of applications that are available from the memory within the content server 40 to be selectable through the UI of the VDU 100 operated by the passenger for execution by the UI control processor 110 or another processor of the VDU 100 .
- the list of applications is filtered based on the passenger metric to generate a filtered list of applications, and the filtered list of applications is displayed through the UI of the VDU 100 .
- the events that trigger display of help items through the UI of the VDU 100 can be controlled (block 1116 ) based on the passenger metric.
- the UI control processor 110 selects a help item trigger event from among a plurality of different defined help item trigger events based on the passenger metric, and initiates display of a help item through the UI of the VDU 100 operated by the passenger, based on the help item trigger event that was selected becoming satisfied.
- the content of the help items displayed through the UI of the VDU 100 is controlled (block 1116 ) based on the passenger metric.
- the UI control processor 110 selects a help content item from among a plurality of different defined help content items based on the passenger metric, and displays the help content item that was selected through the UI of the VDU 100 operated by the passenger.
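- A sketch of selecting a help trigger event and help content from the passenger metric; the trigger predicates and help content strings are invented:

```python
# Pick a help trigger event and help content based on the passenger metric.
HELP_TRIGGERS = {
    # fire help quickly for passengers flagged as struggling
    "eager":   lambda s: s["errors_in_row"] >= 2 or s["idle_seconds"] > 30,
    # otherwise only after sustained trouble
    "patient": lambda s: s["errors_in_row"] >= 5,
}
HELP_CONTENT = {"eager": "step-by-step walkthrough",
                "patient": "short tooltip"}

def maybe_show_help(metric: dict, session: dict):
    mode = "eager" if metric.get("struggling") else "patient"
    return HELP_CONTENT[mode] if HELP_TRIGGERS[mode](session) else None

print(maybe_show_help({"struggling": True},
                      {"errors_in_row": 2, "idle_seconds": 5}))
# -> step-by-step walkthrough
```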
- the timing and/or selection of advertisements for display through the UI of the VDU 100 can be controlled (block 1118 ) based on the passenger metric.
- the UI control processor 110 selects an advertisement from among a plurality of different defined advertisements based on the passenger metric, and displays the advertisement that was selected through the UI of the VDU 100 .
- the UI control processor 110 selects an advertisement trigger event from among a plurality of different defined advertisement trigger events based on the passenger metric. The UI control processor 110 then initiates display of an advertisement through the UI of the VDU 100 operated by the passenger, based on the advertisement trigger event that was selected becoming satisfied.
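- A sketch of this block 1118 style of advertisement control; the ad inventory, target age bands, and trigger events are invented:

```python
# Choose an advertisement, and when to show it, from the passenger metric.
ADS = [
    {"name": "toy ad",    "bands": {"child"},
     "trigger": "content_pause"},
    {"name": "resort ad", "bands": {"adult", "senior"},
     "trigger": "menu_idle"},
]

def select_ad(metric: dict, event: str):
    """Return the first ad targeting this passenger whose trigger fired."""
    for ad in ADS:
        if metric["age_band"] in ad["bands"] and ad["trigger"] == event:
            return ad["name"]
    return None

print(select_ad({"age_band": "senior"}, "menu_idle"))  # -> resort ad
```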
- FIG. 12 is a block diagram of components that may be included in the VDU 100 configured to operate according to some embodiments.
- One or more of the components illustrated in FIG. 12 may be included in one or more elements of the system 10 other than the VDU 100 , such as within the head end content server 40 , a UI control processor 110 that is separate from the VDU 100 , or elsewhere.
- the VDU 100 includes a processor 1200 and a memory 1210 containing program code 1212 .
- the processor 1200 includes one or more data processing circuits, such as a general purpose and/or special purpose processor (e.g., microprocessor and/or digital signal processor).
- the processor 1200 is configured to execute the program code 1212 in the memory 1210 , described below as a computer readable medium, to perform some or all of the operations and methods that are described herein for one or more of the embodiments.
- the VDU 100 can include a camera 1220 , a microphone 1230 , a biometric sensor 1240 , a network interface 1250 (e.g., wired or RF wireless communication interface), an audio output interface 1260 , a display device 240 , and a user input interface 1280 (e.g., touch screen interface, switches, control wheels, buttons, keypad, keyboard, etc.).
- the camera 1220 is positioned to view a passenger who is operating the VDU 100 and is configured to generate a camera signal.
- the camera 1220 may be any type of sensor that can generate data representing observable characteristics of a passenger.
- the display device 240 is configured to display images to a passenger and display user selectable widgets (e.g., programmatic soft-buttons, etc.) that the passenger can select to control various functional operations of the IFE system 10 .
- the processor 1200 processes the camera signal using image detection algorithms (e.g., facial feature expression detection algorithms) and defined rules to identify passenger demographics, passenger emotions, passenger attentiveness, passenger eye viewing locations, and other tracked passenger characteristics, which can be used to generate passenger metrics.
- the processor 1200 may further operate to track content that was consumed or particular portions of content that was consumed through the VDU 100 by a passenger.
- the processor 1200 may further operate to correlate the passenger metric(s) (e.g., facial expression, mood, attentiveness, etc.) to particular portions of content that are consumed through the video display units 100 a - n , and may generate the enhanced content usage metrics based on correlations determined for particular portions of content that was consumed through the video display units 100 a - n.
- consumption of content can include viewing the content (e.g., movie, TV program, textual information, informational video), running an application program (e.g., game), listening to audio programming, etc.
- the network interface 1250 may connect through a RF transceiver to a fitness tracking device (e.g., mobile phone, fitness tracking device worn by passenger, etc.) carried by the passenger to receive fitness data.
- the biometric sensor 1240 may be contained within the VDU 100 or may be communicatively connected to the processor 1200 through the network interface 1250 .
- the biometric sensor 1240 may be configured to sense biometric data that can include, but is not limited to, passenger's temperature, passenger's heart rate, passenger's respiration rate, passenger's weight, passenger's fingerprint, passenger's iris features, and/or other biometric attributes of the passenger.
- the biometric sensor 1240 may be contained within a headphone worn by the passenger, contained in an armrest or other seat surface of a seat occupied by the passenger, or be contained or provided by one or more other components of the system 10 .
- the microphone 1230 may be configured to receive spoken commands from a passenger, which can be interpreted by the processor 1200 to control the UI of the VDU 100 .
- the microphone 1230 may alternatively or additionally be configured to sense ambient noise, such as engine noise and/or announcements in the cabin, which the processor 1200 can use to control the UI of the VDU 100 .
- the processor 1200 may pause operation of content being consumed through the display device 240 and/or pause user selection of the widgets displayed on the display device 240 while an announcement by a crew member is detected through the microphone 1230 .
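- A toy sketch of this announcement-pause behavior; the microphone classifier output and the player interface are assumptions:

```python
# Pause playback and widget selection while an announcement is detected.
class Player:
    def __init__(self):
        self.playing = True
        self.widgets_enabled = True
    def pause(self):
        self.playing = self.widgets_enabled = False
    def resume(self):
        self.playing = self.widgets_enabled = True

def monitor(player: Player, announcement_samples):
    """announcement_samples: per-poll booleans from a microphone classifier."""
    for announcing in announcement_samples:
        if announcing and player.playing:
            player.pause()
        elif not announcing and not player.playing:
            player.resume()

player = Player()
monitor(player, [True, True, False])  # paused during announcement, resumed after
print(player.playing)  # -> True
```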
- the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof.
- the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item.
- the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
- Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
- These computer program instructions may be provided to a processor of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
- These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks.
- a tangible, non-transitory computer-readable medium may include an electronic, magnetic, optical, electromagnetic, or semiconductor data storage system, apparatus, or device. More specific examples of the computer-readable medium would include the following: a portable computer diskette, a random access memory (RAM) circuit, a read-only memory (ROM) circuit, an erasable programmable read-only memory (EPROM or Flash memory) circuit, a portable compact disc read-only memory (CD-ROM), and a portable digital video disc read-only memory (DVD/Blu-ray).
- the computer program instructions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
- embodiments of the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Business, Economics & Management (AREA)
- Marketing (AREA)
- Social Psychology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Graphics (AREA)
- Neurosurgery (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Analytical Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A vehicle entertainment system includes a video display unit and a user interface (UI) control processor. The UI control processor is configured to receive passenger attributes sensed from the passenger operating the video display unit, generate a passenger metric that characterizes the passenger based on the attributes, and control a UI of the video display unit based on the passenger metric. The passenger metric may be a passenger demographic metric determined based on facial features of the passenger. The passenger metric may alternatively be a passenger emotion metric, a passenger attentiveness metric, a passenger UI operational effectiveness metric, a passenger biometric feature metric, etc.
Description
- Embodiments described herein relate generally to electronic entertainment systems and, more particularly, to man-machine interfaces for controlling vehicle and other entertainment systems.
- In-flight entertainment (IFE) systems are deployed onboard aircraft to provide entertainment for passengers in a passenger cabin. IFE systems typically provide passengers with television, movies, games, audio entertainment programming, and other content.
- IFE systems typically provide a Graphical User Interface (GUI) on an in-seat Video Display Unit (VDU) that can include an interactive touch selection interface. The GUI includes graphical elements that can include buttons, text, icons, and images. A user can touch select a displayed graphical element to initiate an action or navigate to another layer of graphical elements associated with the selected graphical element.
- Designers invest significant time and effort into producing a GUI that is both intuitive and efficient for a typical passenger to use. Once the layout of graphical elements and the interconnect of graphical elements on different hierarchical layers is selected, the GUI is programmatically implemented to provide a same interface on all VDUs within an aircraft and across all aircraft using that type of VDU.
- Passenger satisfaction with a flight experience and, ultimately, with an airline can be significantly impacted by the assumptions made by the designers during the layout and configuration design for the GUI of VDUs for an IFE system.
- Some embodiments of the present disclosure are directed to a vehicle entertainment system that includes a video display unit and a user interface (UI) control processor. The UI control processor is configured to receive passenger attributes sensed from the passenger operating the video display unit, generate a passenger metric that characterizes the passenger based on the attributes, and control a UI of the video display unit based on the passenger metric. The vehicle entertainment system may be an in-flight entertainment system configured to be mounted within an aircraft fuselage.
- As will be explained in detail below in the context of numerous example embodiments, the UI can be controlled responsive to the passenger's demographics which are sensed via a camera, responsive to the passenger's emotions, responsive to the passenger's attentiveness, responsive to the passenger's determined UI operational effectiveness, etc.
- Other vehicle entertainment systems and methods according to embodiments of the invention will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional vehicle entertainment systems and methods be included within this description, be within the scope of the present invention, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.
- The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate certain non-limiting embodiments of the invention. In the drawings:
- FIG. 1 illustrates a block diagram of an in-flight entertainment system that generates passenger metrics which are used to control the user interface of a video display unit (VDU) according to some embodiments;
- FIG. 2 illustrates example information and operations used to characterize a passenger and example operations performed to control the UI of a VDU according to some embodiments;
- FIGS. 3-10 illustrate flowcharts of operations and methods that may be performed by a processor of a VDU, another component of the IFE, and/or a component off-board the aircraft to characterize a passenger and example operations performed to control the UI according to some embodiments;
- FIG. 11 illustrates a VDU configured according to some embodiments; and
- FIG. 12 is a block diagram of components that may be included in the VDU 100 configured to operate according to some embodiments.
- The following detailed description discloses various non-limiting example embodiments of the invention. The invention can be embodied in many different forms and is not to be construed as limited to the embodiments set forth herein.
- Various embodiments of the present disclosure may arise from the present realization that providing a same fixed user interface (UI) to all passengers may satisfy some passengers, but will not sufficiently satisfy all passengers. As passengers have become more sophisticated in their preferences, such a one-size-fits-all approach to UI layout and operation is not an optimal approach for achieving passenger satisfaction. Some passengers have a high level of computer sophistication and are more likely to desire increased complexity UIs that provide expanded capability and passenger-interaction efficient functionality. In sharp contrast, some other passengers have a lower level of computer sophistication and are more likely to desire less complex, more intuitive UIs that may exhibit less passenger-interaction efficient functionality but provide greater guidance through more UI queries and passenger responses to perform functionality.
- In accordance with various embodiments disclosed herein, an IFE system is provided that includes a UI control processor that obtains attributes of an individual passenger, generates passenger metrics that characterize the individual passenger, and controls the UI of a video display unit (VDU) used by that individual passenger based on the passenger metrics.
-
FIG. 1 is a block diagram of anIFE system 10 that includes video display units (VDUs) 100 a-n, a headend content server 40, anddistribution components 20. Thesystem 10 further includes aUI control processor 120 that may reside at least partially within eachVDU 100, reside at least partially within another component of theIFE system 10 separate from the VDU 100 (e.g., within the headend content server 40 or elsewhere), and/or reside at least partially off-board the aircraft such as on a land based server. TheUI control processor 120 may, for example, be incorporated within a land based server that is communicatively connected through a data network 82 (e.g., private or public network, such as the Internet), a Radio Access Network 80 (e.g., satellite communication system transceiver and/or cellular communication system transceiver) to a wireless transceiver of anetwork interface 50 to communicate with theVDUs 100. -
- FIG. 3 illustrates operations and methods that may be performed at least in part by the UI control processor 120. Referring to FIGS. 1 and 3, the UI control processor 120 receives (block 300) passenger attributes that have been sensed. Various approaches for sensing and identifying passenger attributes are described in detail with regard to FIGS. 2 and 4-11. The UI control processor 120 generates (block 302) passenger metrics characterizing the passenger based on the attributes, and controls the UI of the one of the VDUs 100 that is being operated by that passenger, based on the passenger metrics.
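- In software terms, the FIG. 3 flow is a sense-characterize-adapt loop. The following minimal Python sketch illustrates that loop; all class, method, and rule names are illustrative assumptions rather than an API defined by this disclosure.

```python
# Illustrative sketch of the FIG. 3 flow; names are assumptions.

class UIControlProcessor:
    def __init__(self, metric_rules):
        # metric_rules: mapping of metric name -> function(attributes) -> value
        self.metric_rules = metric_rules

    def receive_attributes(self, sensors):
        # Block 300: collect sensed passenger attributes from each sensor.
        return {name: sensor.read() for name, sensor in sensors.items()}

    def generate_metrics(self, attributes):
        # Block 302: characterize the passenger by applying each defined rule.
        return {name: rule(attributes) for name, rule in self.metric_rules.items()}

    def control_ui(self, vdu, metrics):
        # Adapt the UI of the VDU operated by this passenger to the metrics.
        vdu.apply_ui_profile(metrics)
```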
- The headend content server 40 stores a set of content and is configured to separately deliver content to the VDUs 100 a-n responsive to content selection commands separately received from the VDUs 100 a-n through a data network 30 and the distribution components 20. The distribution components 20 may include seat electronics boxes 22, each of which can be spaced apart adjacent to different groups of seats, and/or one or more wireless communication routers 24.
- Example content that can be provided by the headend content server 40 to selected ones of the VDUs 100 a-n can include, but is not limited to, movies, TV programs, audio programs, application programs (e.g., games, news, etc.), informational videos and/or textual descriptions (e.g., regarding destination cities, services, and products), and advertisements. The wireless router 24 may be a WLAN router (e.g., IEEE 802.11, WIMAX, etc.), a cellular-based network router (e.g., a pico cell radio base station), etc.
- The VDUs 100 a-n are connected to the headend content server 40 to request and receive content through wired and/or wireless network connections through the network 30 and/or the distribution components 20. Although only three VDUs 100 a-100 n and one content server 40 are shown in FIG. 1 for ease of illustration, any number of VDUs and content servers may be used with embodiments herein. Functionality described herein as being performed by the UI control processor 120 may actually be performed in part by more than one UI control processor and may involve other components, such as sensors which sense passenger attributes, personal devices that the passengers have carried on-board the aircraft, etc.
- Although the system of FIG. 1 includes a headend content server 40, the vehicle entertainment systems disclosed herein are not limited thereto. The video display units 100 a-100 n may alternatively be configured to store content in internal/local mass memory for access by users and/or may stream and/or download content from other devices, such as from other video display units (e.g., peer-to-peer sharing) and/or from off-board devices such as ground based content servers accessed via a satellite and/or cellular communication system.
- Although embodiments herein are primarily described in the context of an IFE system within an aircraft cabin, the invention is not limited thereto. Instead, the embodiments may be used to provide other types of vehicle entertainment systems for trains, automobiles, cruise ships, buses, and other vehicles. When used in an aircraft, a bus, a train, or other vehicles where seats are arranged in columns, the VDUs 100 a-n can be attached to seatbacks so they face passengers/users in adjacent rearward seats. The VDUs 100 a-n may alternatively be mounted to bulkheads, movable support assemblies connected to seat armrests and/or seat frames, etc. Embodiments herein may be used with vehicle electronic systems other than entertainment systems, such as with vehicle digital signage display systems, etc.
- FIG. 2 illustrates example information and operations used to characterize a passenger and example operations performed to control the UI of a VDU according to some embodiments. Referring to FIG. 2, as will be explained in further detail below, the UI control processor 110 includes a user centric UI adaptation controller 200 that controls user interface elements 230 displayable on a display device 240 of the VDU 100 and/or controls content from the headend content server 40 based on passenger attribute data that is sensed by electronic sensors and characterizes one or more attributes of the passenger. The passenger attribute data can include, but is not limited to, one or more of the following: passenger demographic data 210, passenger emotion data 212, passenger attentiveness data 214, passenger biometric data 216, passenger fitness tracking data 218, passenger UI operational effectiveness data 220, content consumption data 222, displayed advertisements data 224 which may include passenger eye tracking data, and data identifying cabin events directed to the passenger and/or flight phase data 226.
- The UI adaptation controller 200 controls or adapts the UI elements so that they are better optimized for use by the passenger, responsive to the passenger attribute data characterizing the passenger. As will be explained in further detail below, the user interface elements 230 that can be controlled or adapted can include, but are not limited to, one or more of the following: information layout in UI layers; selectable widget layout in UI layers; operational flow between UI layers; operational flow between selectable widgets in a UI layer; font size, shape, and/or color; selectable widget size, shape, and/or color; applications made available through the VDU; movies, television programs, and/or audio programs made available through the VDU; shuffled order of UI element display; timing and/or selection of advertisements and/or advertisement display format; and timing and/or selection of help items.
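- One way to picture these adaptation targets is as a single per-passenger UI profile record that the controller 200 rewrites before rendering. The sketch below is a hypothetical data layout covering the elements listed above, not a structure defined by the disclosure.

```python
from dataclasses import dataclass, field

# Hypothetical per-passenger UI profile; field names are assumptions.
@dataclass
class UIProfile:
    info_layout: str = "standard"     # information layout in UI layers
    widget_layout: str = "standard"   # selectable widget layout in UI layers
    layer_flow: str = "default"       # operational flow between UI layers
    font_size_pt: int = 14            # font size (shape/color handled similarly)
    widget_scale: float = 1.0         # selectable widget size
    available_apps: list = field(default_factory=list)  # apps offered via the VDU
    ad_policy: str = "default"        # timing/selection of advertisements
    help_policy: str = "default"      # timing/selection of help items
```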
- FIG. 4 illustrates further operations that may be performed by the UI control processor 110, e.g., via the user centric UI adaptation controller 200, to generate passenger metrics that can be used to control one or more of the UI elements. Although FIG. 4 illustrates a sequence of operations, it is to be understood that the UI control processor 110 may be configured to perform only one of the operations, or to perform a plurality of the operations and select among the plurality based on the passenger metrics and/or based on a defined rule which may respond to a defined event occurring at the time.
- Referring to FIG. 4, the operations may include determining (block 400) a passenger demographic metric based on the passenger data. In one non-limiting example approach, the controller 200 may analyze the camera signal to estimate the passenger's age based on hair color, sensed facial wrinkles or skin tightness, complexion, height, weight, etc. The controller 200 may similarly determine the gender and/or ethnicity of the passenger based on skin color, eye color, etc. In the embodiment of FIG. 5, the VDU 100 includes a camera configured to output a camera signal containing data representing the passenger's face. The UI control processor 110 is configured to process the camera signal to identify (block 500) facial features of the passenger, and to compare (block 502) the facial features to defined demographics rules. The UI control processor 110 determines (block 504) a passenger demographic metric based on the comparison of the facial features to the defined demographics rules, and controls (block 506) the UI of the VDU 100 based on the passenger demographic metric.
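- A defined demographics rule can be as simple as threshold tests over numeric facial-feature scores produced by the face-analysis step of blocks 500-502. The feature names and thresholds in this sketch are assumptions for illustration only.

```python
# features: hypothetical facial-feature scores in [0, 1] from face analysis.
def demographic_metric(features):
    wrinkle = features["wrinkle_score"]
    tightness = features["skin_tightness"]
    # Blocks 502/504: compare features to defined demographics rules to
    # produce a coarse age-band metric.
    if wrinkle > 0.6 and tightness < 0.4:
        return {"age_band": "senior"}
    if wrinkle < 0.2 and tightness > 0.7:
        return {"age_band": "youth"}
    return {"age_band": "adult"}
```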
- With further reference to FIG. 4, the operations may include determining (block 402) a passenger emotion metric based on the passenger data. In the embodiment of FIG. 6, the UI control processor 110 is configured to process the camera signal to identify (block 600) facial features of the passenger, and to compare (block 602) the facial features to defined emotion rules, such as by identifying facial expressions that are compared to the emotion rules. The UI control processor 110 determines (block 604) a passenger emotion metric based on the comparison of the facial features to the defined emotion rules, and controls (block 606) the UI of the VDU 100 based on the passenger emotion metric. The facial analysis operations performed on the camera signal may identify occurrences of facial expressions that are classified based on the emotion rules as being, for example, neutral, smiling, laughing, sad, bored, sleeping, or surprised.
- The processor 110 may correlate changes in emotion to a timeline of the content, such as by correlating identified emotions to a timeline of content dialog or scene changes within the content. Changes in emotion may be correlated to the introduction or removal of advertisements, which may be displayed in-line during commercial breaks in the content or displayed in parallel (e.g., within advertisement bars adjacent to the content, within picture-in-picture windows overlaid on the content, etc.), and/or correlated to tracked locations within advertisement dialogs or advertisement scene changes within the content. The correlations may be identified by information included in enhanced content usage metrics. Thus, for example, the UI of the VDU 100 may be controlled based on an average level of happiness, sadness, boredom, surprise, inattentiveness, etc. exhibited by the passenger during identified scenes and/or times within the content.
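- Correlating emotion changes to a content timeline amounts to joining time-stamped emotion events against time-stamped scene or advertisement markers. A minimal sketch, assuming both streams carry offsets in seconds:

```python
import bisect

# emotion_events: [(t_seconds, "laughing"), ...]
# scene_marks: sorted [(t_seconds, "scene-12"), ...] start times of scenes/ads
def correlate_emotions(emotion_events, scene_marks):
    starts = [t for t, _ in scene_marks]
    per_scene = {}
    for t, emotion in emotion_events:
        # Find the scene whose start time most recently precedes this event.
        i = bisect.bisect_right(starts, t) - 1
        if i >= 0:
            per_scene.setdefault(scene_marks[i][1], []).append(emotion)
    return per_scene  # e.g. {"scene-12": ["smiling", "laughing"]}
```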
- With further reference to FIG. 4, the operations may include determining (block 404) a passenger attentiveness metric based on the passenger data. In the embodiment of FIG. 7, the UI control processor 110 is configured to process the camera signal to identify (block 700) facial features of the passenger, and to compare (block 702) the facial features to defined attentiveness rules. The UI control processor 110 determines (block 704) a passenger attentiveness metric based on the comparison of the facial features to the defined attentiveness rules, and controls (block 706) the UI of the VDU 100 based on the passenger attentiveness metric.
- With further reference to FIG. 4, the operations may include determining (block 406) a passenger's UI operational effectiveness metric based on the passenger data. In the embodiment of FIG. 8, the UI control processor 110 is configured to monitor (block 800) a passenger's operation of the UI of the VDU 100 to identify passenger selections that indicate that the passenger has incorrectly selected displayed elements of the UI and/or has selected an ineffective (e.g., inefficient) sequence of displayed elements to perform an operation. A passenger UI operational effectiveness metric is determined (block 802) based on the monitoring, and the UI of the VDU 100 is controlled (block 804) based on the passenger UI operational effectiveness metric.
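- One hypothetical way to score the effectiveness of blocks 800-804 is to count corrective actions, such as an immediate back-navigation after a selection, relative to total selections. The event schema below is an assumption.

```python
# events: ordered UI events, e.g. {"type": "select", "widget": "w3"} or {"type": "back"}
def ui_effectiveness_metric(events):
    selections = corrections = 0
    for prev, cur in zip(events, events[1:]):
        if prev["type"] == "select":
            selections += 1
            # A back-press immediately after a selection suggests a mis-selection
            # or an inefficient path through the UI layers.
            if cur["type"] == "back":
                corrections += 1
    return 1.0 if selections == 0 else 1.0 - corrections / selections
```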
- With further reference to FIG. 4, the operations may include determining (block 408) whether a passenger's biometric data satisfies a defined rule. In the embodiment of FIG. 9, the UI control processor 110 is configured to receive (block 900) passenger biometric data from a biosensor, compare (block 902) the passenger biometric data to a defined rule, and control (block 904) the UI of the VDU 100 based on the comparison.
- With further reference to FIG. 4, the operations may include determining (block 410) a passenger metric based on fitness data received from the passenger's electronic device. In the embodiment of FIG. 10, the UI control processor 110 is configured to receive (block 1000) passenger fitness tracking data through a wireless connection to a passenger fitness tracking device (e.g., a mobile phone, a fitness tracking device worn by the passenger, etc.). A passenger fitness metric is determined (block 1002) based on the passenger fitness tracking data from the passenger fitness tracking device. The UI control processor 110 can compare the passenger fitness metric to a defined rule, and control (block 1006) the UI of the VDU 100 based on the comparison.
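- As one hypothetical example, the comparison of the passenger fitness metric to a defined rule could look like the sketch below; the payload keys and thresholds are assumptions, not part of the disclosure.

```python
# payload: hypothetical fitness-tracking data received over the wireless link.
def fitness_metric(payload, resting_hr_threshold=80, step_threshold=8000):
    heart_rate = payload.get("resting_heart_rate")
    steps = payload.get("daily_steps", 0)
    # Block 1002: derive a coarse metric; the defined rule is a threshold test.
    return {
        "relaxed": heart_rate is not None and heart_rate < resting_hr_threshold,
        "active": steps > step_threshold,
    }
```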
- With further reference to FIG. 4, the UI control processor 110 may perform operations to generate (block 412) a passenger metric based on content that has been tracked and determined to have been consumed through the VDU 100. The passenger metric may characterize which content items were consumed during a flight and how much passenger attention was given to various portions of the content items. The passenger metric may be generated (block 414) based on advertisements that have been displayed on the VDU 100, and may be further generated based on tracking where on the display device the passenger has looked and for how long, and whether and for how long the passenger has looked at the displayed advertisements. The passenger metric may be generated (block 416) based on a present flight phase of the aircraft, and/or may be generated (block 418) based on cabin events (e.g., announcements, food/beverage service, etc.) that are directed to the passenger individually or collectively to the passengers. Metrics may be generated based on flight phase information, which may include one or more of a boarding phase, a departure taxi phase, a take off phase, a cruise phase, a landing phase, an arrival taxi phase, and a disembarking phase. Metrics may be generated based on flight itinerary information, which may include one or more of the departure city, the destination city, the departure time, the arrival time, and the flight duration. Metrics may be generated based on one or more of food (e.g., meal and/or beverage) service being provided, crew announcements to passengers, occurrence of aircraft turbulence, and other events that may affect a passenger's attentiveness to information and operational widgets displayed through the UI, and which may affect the ability of the passenger to correctly operate the UI.
- The processor 110 can be further configured to process the camera signal to determine an eye viewing location on a display device of the VDU 100, and to correlate the eye viewing location to a timeline of content consumed by the user through the VDU 100. The processor 110 may control the UI based on the correlation of the eye viewing location to the timeline of content consumed by the user through the VDU 100. The processor 110 may identify particular locations within a timeline of the content that triggered changes in passenger emotion, such as laughing, smiling, surprise, etc. The processor 110 may determine whether and for how long a passenger looked at an advertisement displayed in a border area of a movie, how long a passenger looked at content of the movie and what content, how long the passenger looked elsewhere while the movie was playing and what content was missed, how long a passenger looked at defined items contained within the movie (e.g., products placed within a movie for advertisement), etc., and may generate metrics based thereon.
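- Dwell-time measures such as whether and for how long a passenger looked at an advertisement displayed in a border area reduce to accumulating gaze samples that fall inside a screen region. A sketch under the assumption of periodic (x, y) gaze samples:

```python
# gaze_samples: [(t_seconds, x, y), ...] captured at a fixed sampling period.
# region: (x0, y0, x1, y1) bounding box of the advertisement area on screen.
def dwell_time_seconds(gaze_samples, region, period_s=0.1):
    x0, y0, x1, y1 = region
    hits = sum(1 for _, x, y in gaze_samples if x0 <= x <= x1 and y0 <= y <= y1)
    return hits * period_s  # approximate seconds spent looking at the region
```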
- The UI control processor 110 can control (block 420) one or more of the UI elements of the VDU 100 based on one or more of these and/or other metrics.
- In the embodiments of FIG. 11, the UI control processor 110 can compare (block 1100) one or more of the determined passenger metrics to one or more defined rules to determine how to control the UI of the VDU 100. The layout of information in UI layers of the VDU 100 can be controlled (block 1102) based on the comparison. In one embodiment, the UI control processor 110 selects a layout of information for UI layers from among a plurality of different defined layouts of information for UI layers displayable on the VDU 100 based on the passenger metric, and displays the selected layout of information for the UI layers through the UI of the VDU 100.
- The layout of selectable widgets in UI layers of the VDU 100 can be controlled (block 1104) based on the comparison. In one embodiment, the UI control processor 110 selects a layout of selectable widgets for a UI layer from among a plurality of different defined layouts of selectable widgets for a UI layer displayable on the VDU 100 based on the passenger metric, and displays the selected layout of selectable widgets for the UI layer through the UI of the VDU 100 operated by that passenger.
- The operational flow between UI layers of the VDU 100 can be controlled (block 1106) based on the comparison. In one embodiment, the UI control processor 110 selects an operational flow between passenger selectable widgets of different UI layers from among a plurality of different defined operational flows between passenger selectable widgets of different UI layers displayable on the VDU 100 based on the passenger metric, and controls the operational flow between the passenger selectable widgets of the different UI layers displayed through the UI of the VDU 100 operated by the passenger, based on the operational flow that was selected.
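- Blocks 1100-1106 all follow the same pattern: compare the metrics to defined rules, then pick one of several predefined layouts or flows. A table-driven sketch, with predicates and layout names invented for illustration:

```python
# Each rule pairs a predicate over the metrics with a predefined layout name.
LAYOUT_RULES = [
    (lambda m: m.get("age_band") == "senior", "simplified"),
    (lambda m: m.get("attentiveness", 1.0) < 0.3, "simplified"),
    (lambda m: m.get("ui_effectiveness", 0.0) > 0.9, "advanced"),
]

def select_layout(metrics, default="standard"):
    # Block 1100: compare passenger metrics to defined rules; first match wins.
    for predicate, layout in LAYOUT_RULES:
        if predicate(metrics):
            return layout
    return default
```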
video display unit 100, while other passenger outside the defined age range may be precluded from accessing such advanced functionality and/or provided a subset of the array of operational features. - The font size, font shape, and/or color of text displayed through the UI of the
- The font size, font shape, and/or color of text displayed through the UI of the VDU 100 operated by the passenger can be controlled (block 1108) based on the passenger metric. Alternatively or additionally, the size, shape, and/or color of passenger selectable widgets displayed through the UI of the VDU 100 operated by the passenger can be controlled (block 1110) based on the passenger metric.
- For example, the processor 110 may determine a size or layout of text and/or passenger selectable widgets to be displayed on the VDU 100 responsive to an estimate of the passenger's age. For example, passengers having an age beyond a threshold value may be displayed text or selectable widgets having a greater minimum size and/or less graphically complex screen layouts than younger passengers, or vice versa. Similarly, passengers who are exhibiting tiredness or inattentiveness may be displayed text or selectable widgets having a greater minimum size and/or less graphically complex screen layouts than passengers not exhibiting those identifiable conditions, or vice versa.
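- The font-size control of block 1108 can be expressed as a threshold mapping over the estimated age and attentiveness metrics; the sizes and thresholds below are illustrative assumptions.

```python
def select_font_size_pt(estimated_age, attentiveness, age_threshold=60):
    # Larger minimum text size for passengers past the age threshold or
    # exhibiting low attentiveness; the specific values are assumptions.
    if estimated_age >= age_threshold or attentiveness < 0.3:
        return 20
    return 14
```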
- The processor 110 can control what content is offered for consumption through the VDU 100 based on the passenger demographics, passenger emotion, and/or attentiveness. For example, the processor 110 may filter a set of content stored on the content server 40 to generate a filtered list based on the determined passenger demographics, emotion, and/or attentiveness, and communicate the filtered list of available content to the VDU 100 for display to that passenger. Thus, the content made available to a passenger can be filtered based on an estimate of the passenger's age and/or emotion. Passengers who exhibit a threshold level of attentiveness or, alternatively, less than a threshold level of attentiveness, may be provided an expanded variety of content to be consumed through their respective VDUs 100.
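- Filtering the stored content set against a demographic metric such as an estimated age band could look like the following sketch; the metadata keys and rating scheme are assumptions.

```python
# catalog: [{"title": "...", "rating": "G" | "PG" | "R"}, ...]
ALLOWED_RATINGS = {
    "youth": {"G", "PG"},
    "adult": {"G", "PG", "R"},
    "senior": {"G", "PG", "R"},
}

def filtered_catalog(catalog, age_band):
    allowed = ALLOWED_RATINGS.get(age_band, {"G"})
    # Keep only items whose rating is permitted for the estimated age band.
    return [item for item in catalog if item["rating"] in allowed]
```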
- The processor 110 may filter the list of movies, television programs, audio programs, and/or applications to generate a recommended list based on correlating the passenger attributes to historical data indicating the preferences of passengers who have certain determined attributes. Thus, for example, the processor 110 may use a repository of information that identifies various content preferences that have been observed for passengers having identified attributes, to generate a list of movies, television programs, audio programs, and/or applications that are predicted to be preferred by the present passenger based on the present passenger's attributes.
- In a related embodiment, the UI control processor is configured to control the UI of the VDU 100 based on the passenger metric by controlling an order with which content is identified in a list based on the passenger metric. The content is available for display through the UI of the VDU 100 responsive to passenger selection among the ordered list of content. At least a portion of the ordered list of content is then displayed through the UI of the VDU 100.
- In some embodiments, a subset of the movies, television programs, and/or audio programs that are available on the headend content server 40 can be selected (block 1114) to be made available through the UI of the VDU 100 operated by that passenger, based on the passenger metric. In one embodiment, the UI control processor 110 generates a list of movies, television programs, and/or audio programs that are available from the memory within the content server 40 to be displayed through the UI of the VDU, and filters the list of movies, television programs, and/or audio programs based on the passenger metric to generate a filtered list of movies, television programs, and/or audio programs. The processor 110 then displays the filtered list of movies, television programs, and/or audio programs through the UI of the video display unit 100. The filtering may furthermore be based on metadata that has been associated with the content. The metadata may include, but is not limited to, a description of the content, poster art representing the content, etc.
- A subset of the applications that are available on the headend content server 40 can be selected (block 1114) to be made available through the UI of the VDU 100 for operation by a processor of the VDU 100, based on the passenger metric. In one embodiment, the UI control processor 110 generates a list of applications that are available from the memory within the content server 40 to be selectable through the UI of the VDU 100 operated by the passenger for execution by the UI control processor 110 or another processor of the VDU 100. The list of applications is filtered based on the passenger metric to generate a filtered list of applications, and the filtered list of applications is displayed through the UI of the VDU 100.
- The events that trigger display of help items through the UI of the VDU 100 can be controlled (block 1116) based on the passenger metric. In one embodiment, the UI control processor 110 selects a help item trigger event from among a plurality of different defined help item trigger events based on the passenger metric, and initiates display of a help item through the UI of the VDU 100 operated by the passenger, based on the help item trigger event that was selected becoming satisfied.
- The content of the help items displayed through the UI of the VDU 100 can be controlled (block 1116) based on the passenger metric. In one embodiment, the UI control processor 110 selects a help content item from among a plurality of different defined help content items based on the passenger metric, and displays the help content item that was selected through the UI of the VDU 100 operated by the passenger.
- The timing and/or selection of advertisements for display through the UI of the VDU 100 can be controlled (block 1118) based on the passenger metric. In one embodiment, the UI control processor 110 selects an advertisement from among a plurality of different defined advertisements based on the passenger metric, and displays the advertisement that was selected through the UI of the VDU 100.
- In another embodiment, the UI control processor 110 selects an advertisement trigger event from among a plurality of different defined advertisement trigger events based on the passenger metric. The UI control processor 110 then initiates display of an advertisement through the UI of the VDU 100 operated by the passenger, based on the advertisement trigger event that was selected becoming satisfied.
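- The help-item triggers of block 1116 and the advertisement triggers of block 1118 share the same shape: select a trigger event per passenger metric, then fire when that event becomes satisfied. A combined sketch with hypothetical event names and state keys:

```python
# Hypothetical trigger events, each a predicate over current system state.
TRIGGER_EVENTS = {
    "on_pause": lambda state: state.get("content_paused", False),
    "on_meal_service": lambda state: state.get("cabin_event") == "meal_service",
}

def select_ad_trigger(metrics):
    # E.g., defer advertisements to meal service for inattentive passengers.
    if metrics.get("attentiveness", 1.0) < 0.3:
        return "on_meal_service"
    return "on_pause"

def maybe_display_ad(vdu, trigger_name, state):
    # Initiate display once the selected trigger event becomes satisfied.
    if TRIGGER_EVENTS[trigger_name](state):
        vdu.show_advertisement()
```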
- FIG. 12 is a block diagram of components that may be included in the VDU 100 configured to operate according to some embodiments. One or more of the components illustrated in FIG. 12 may be included in one or more elements of the system 10 other than the VDU 100, such as within the headend content server 40, a UI control processor 110 that is separate from the VDU 100, or elsewhere. The VDU 100 includes a processor 1200 and a memory 1210 containing program code 1212. The processor 1200 includes one or more data processing circuits, such as a general purpose and/or special purpose processor (e.g., a microprocessor and/or a digital signal processor). The processor 1200 is configured to execute the program code 1212 in the memory 1210, described below as a computer readable medium, to perform some or all of the operations and methods that are described herein for one or more of the embodiments.
- The VDU 100 can include a camera 1220, a microphone 1230, a biometric sensor 1240, a network interface 1250 (e.g., a wired or RF wireless communication interface), an audio output interface 1260, a display device 240, and a user input interface 1280 (e.g., touch screen interface, switches, control wheels, buttons, keypad, keyboard, etc.). The camera 1220 is positioned to view a passenger who is operating the VDU 100 and is configured to generate a camera signal. The camera 1220 may be any type of sensor that can generate data representing observable characteristics of a passenger. The display device 240 is configured to display images to a passenger and to display user selectable widgets (e.g., programmatic soft-buttons, etc.) that the passenger can select to control various functional operations of the IFE system 10.
- The processor 1200 processes the camera signal using image detection algorithms (e.g., facial feature expression detection algorithms) and defined rules to identify passenger demographics, passenger emotions, passenger attentiveness, passenger eye viewing locations, and other tracked passenger characteristics, which can be used to generate passenger metrics. The processor 1200 may further operate to track content that was consumed, or particular portions of content that were consumed, through the VDU 100 by a passenger. The processor 1200 may further operate to correlate the passenger metric(s) (e.g., facial expression, mood, attentiveness, etc.) to particular portions of content that is consumed through the video display units 100 a-n, and may generate the enhanced content usage metrics based on correlations determined for particular portions of content that was consumed through the video display units 100 a-n. As explained above, consumption of content can include viewing the content (e.g., a movie, TV program, textual information, or informational video), running an application program (e.g., a game), listening to audio programming, etc.
- The network interface 1250 may connect through an RF transceiver to a fitness tracking device (e.g., a mobile phone, a fitness tracking device worn by the passenger, etc.) carried by the passenger to receive fitness data. The biometric sensor 1240 may be contained within the VDU 100 or may be communicatively connected to the processor 1200 through the network interface 1250. The biometric sensor 1240 may be configured to sense biometric data that can include, but is not limited to, the passenger's temperature, heart rate, respiration rate, weight, fingerprint, iris features, and/or other biometric attributes of the passenger. The biometric sensor 1240 may be contained within a headphone worn by the passenger, contained in an armrest or other seat surface of a seat occupied by the passenger, or be contained in or provided by one or more other components of the system 10.
- The microphone 1230 may be configured to receive spoken commands from a passenger, which can be interpreted by the processor 1200 to control the UI of the VDU 100. The microphone 1230 may alternatively or additionally be configured to sense ambient noise, such as engine noise and/or announcements in the cabin, which the processor 1200 can use to control the UI of the VDU 100. For example, the processor 1200 may pause operation of content being consumed through the display device 240 and/or pause user selection of the widgets displayed on the display device 240 while an announcement by a crew member is detected through the microphone 1230.
- In the above-description of various embodiments of the present invention, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
- When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element, or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.
- As used herein, the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
- Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
- These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks.
- A tangible, non-transitory computer-readable medium may include an electronic, magnetic, optical, electromagnetic, or semiconductor data storage system, apparatus, or device. More specific examples of the computer-readable medium would include the following: a portable computer diskette, a random access memory (RAM) circuit, a read-only memory (ROM) circuit, an erasable programmable read-only memory (EPROM or Flash memory) circuit, a portable compact disc read-only memory (CD-ROM), and a portable digital video disc read-only memory (DVD/BlueRay).
- The computer program instructions may also be loaded onto a computer and/or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer and/or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
- It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
- Many different embodiments have been disclosed herein, in connection with the above description and the drawings. It will be understood that it would be unduly repetitious and obfuscating to literally describe and illustrate every combination and subcombination of these embodiments. Accordingly, the present specification, including the drawings, shall be construed to constitute a complete written description of various example combinations and subcombinations of embodiments and of the manner and process of making and using them, and shall support claims to any such combination or subcombination.
- Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present invention. All such variations and modifications are intended to be included herein within the scope of the present invention.
Claims (22)
1. A vehicle entertainment system comprising:
a video display unit configured to provide content to a passenger;
a user interface (UI) control processor configured to:
receive passenger attributes sensed from the passenger operating the video display unit;
generate a passenger metric that characterizes the passenger based on the attributes; and
control a UI of the video display unit based on the passenger metric.
2. The vehicle entertainment system of claim 1 , further comprising:
a camera configured to output a camera signal containing data representing the passenger's face,
wherein the UI control processor is further configured to:
process the camera signal to identify facial features of the passenger;
compare the facial features to defined demographics rules;
determine a passenger demographic metric based on the comparison of the facial features to the defined demographics rules; and
control the UI of the video display unit based on the passenger demographic metric.
3. The vehicle entertainment system of claim 2 , wherein the UI control processor is further configured to:
compare the facial features to defined emotion rules;
determine a passenger emotion metric based on the comparison of the facial features to the defined emotion rules; and
control the UI of the video display unit based on the passenger emotion metric.
4. The vehicle entertainment system of claim 2 , wherein the UI control processor is further configured to:
compare the facial features to defined attentiveness rules;
determine a passenger attentiveness metric based on the comparison of the facial features to the defined attentiveness rules; and
control the UI of the video display unit based on the passenger attentiveness metric.
5. The vehicle entertainment system of claim 1 , wherein the UI control processor is further configured to:
monitor a passenger's operation of the UI of the video display unit to identify passenger selections that indicate that the passenger has incorrectly selected displayed elements of the UI;
determine a passenger UI operational effectiveness metric based on the monitoring; and
control the UI of the video display unit based on the passenger UI operational effectiveness metric.
6. The vehicle entertainment system of claim 1 , wherein the UI control processor is further configured to:
receive passenger biometric data from a biosensor;
compare the passenger biometric data to a defined rule; and
control the UI of the video display unit based on the comparison.
7. The vehicle entertainment system of claim 1 , wherein the UI control processor is further configured to:
receive passenger fitness tracking data through a wireless connection to a passenger fitness tracking device;
generate a passenger fitness metric based on the passenger fitness tracking data from the passenger fitness tracking device; and
control the UI of the video display unit based on the passenger fitness metric.
8. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
selecting a layout of information for UI layers from among a plurality of different defined layouts of information for UI layers displayable on the video display unit based on the passenger metric; and
displaying the defined layout of information for the UI layers through the UI of the video display unit operated by the passenger.
9. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
controlling layout of graphical elements of a UI layer displayed on the video display unit based on the passenger metric.
10. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
selecting a layout of selectable widgets for a UI layer from among a plurality of different defined layouts of selectable widgets for a UI layer displayable on the video display unit based on the passenger metric; and
displaying the defined layout of selectable widgets for the UI layer through the UI of the video display unit operated by the passenger.
11. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
selecting an operational flow between passenger selectable widgets of different UI layers from among a plurality of different defined operational flows between passenger selectable widgets of different UI layers displayable on the video display unit based on the passenger metric; and
controlling the operational flow between the passenger selectable widgets of the different UI layers displayed through the UI of the video display unit operated by the passenger, based on the operational flow that was selected.
12. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
controlling a font size, font shape, and/or color of text displayed through the UI of the video display unit operated by the passenger, based on the passenger metric.
13. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
controlling a size, shape, and/or color of passenger selectable widgets displayed through the UI of the video display unit operated by the passenger, based on the passenger metric.
14. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
generating a list of content that is available in a memory to be displayed through the UI of the video display unit operated by the passenger;
filtering the list of content based on the passenger metric to generate a filtered list of content; and
displaying the filtered list of content through the UI of the video display unit.
15. The vehicle entertainment system of claim 14 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
generating a list of movies, television programs, audio programs, and/or applications that are available from the memory within a content server to be displayed through the UI of the video display unit operated by the passenger;
filtering the list of movies, television programs, audio programs, and/or applications based on the passenger metric to generate a filtered list of movies, television programs, audio programs, and/or applications; and
displaying the filtered list of movies, television programs, audio programs, and/or applications through the UI of the video display unit.
16. The vehicle entertainment system of claim 15 , wherein filtering the list of movies, television programs, audio programs, and/or applications based on the passenger metric to generate a filtered list of movies, television programs, audio programs, and/or applications comprises:
generating a recommended list of movies, television programs, audio programs, and/or applications for display to the passenger based on correlating the passenger attributes to historical data indicating preferences of passengers having attributes corresponding to the passenger attributes.
17. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
controlling an order with which content is identified in a list based on the passenger metric, the content being available for display through the UI of the video display unit responsive to passenger selection among the ordered list of content; and
displaying at least a portion of the ordered list of content through the UI of the video display unit.
18. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
selecting a help item trigger event from among a plurality of different defined help item trigger events based on the passenger metric; and
initiating display of a help item through the UI of the video display unit operated by the passenger, based on the help item trigger event that was selected becoming satisfied.
19. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
selecting a help content item from among a plurality of different defined help content items based on the passenger metric; and
displaying the help content item that was selected through the UI of the video display unit operated by the passenger.
20. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
selecting an advertisement from among a plurality of different defined advertisements based on the passenger metric; and
displaying the advertisement that was selected through the UI of the video display unit.
21. The vehicle entertainment system of claim 1 , wherein the UI control processor is configured to control the UI of the video display unit based on the passenger metric, by operations that comprise:
selecting an advertisement trigger event from among a plurality of different defined advertisement trigger events based on the passenger metric; and
initiating display of an advertisement through the UI of the video display unit operated by the passenger, based on the advertisement trigger event that was selected becoming satisfied.
22. The vehicle entertainment system of claim 1 , wherein:
the vehicle entertainment system comprises an in-flight entertainment system configured to be mounted within an aircraft fuselage; and
the video display unit is configured to be mounted to a seatback surface of a seat or mounted to an armrest of a seat.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/751,963 US20160381412A1 (en) | 2015-06-26 | 2015-06-26 | User centric adaptation of vehicle entertainment system user interfaces |
US15/864,535 US10306294B2 (en) | 2015-06-26 | 2018-01-08 | User centric adaptation of vehicle entertainment system user interfaces |
US16/418,459 US10893318B2 (en) | 2015-06-26 | 2019-05-21 | Aircraft entertainment systems with chatroom server |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/751,963 US20160381412A1 (en) | 2015-06-26 | 2015-06-26 | User centric adaptation of vehicle entertainment system user interfaces |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/864,535 Continuation US10306294B2 (en) | 2015-06-26 | 2018-01-08 | User centric adaptation of vehicle entertainment system user interfaces |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160381412A1 true US20160381412A1 (en) | 2016-12-29 |
Family
ID=57601460
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/751,963 Abandoned US20160381412A1 (en) | 2015-06-26 | 2015-06-26 | User centric adaptation of vehicle entertainment system user interfaces |
US15/864,535 Active US10306294B2 (en) | 2015-06-26 | 2018-01-08 | User centric adaptation of vehicle entertainment system user interfaces |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/864,535 Active US10306294B2 (en) | 2015-06-26 | 2018-01-08 | User centric adaptation of vehicle entertainment system user interfaces |
Country Status (1)
Country | Link |
---|---|
US (2) | US20160381412A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9864559B2 (en) * | 2016-03-29 | 2018-01-09 | Panasonic Avionics Corporation | Virtual window display system |
US20190020923A1 (en) * | 2015-12-31 | 2019-01-17 | Thomson Licensing | Method and apparatus for inhibiting the interruption of content being consumed by a user |
US20190208254A1 (en) * | 2017-12-29 | 2019-07-04 | ANI Technologies Private Limited | System and method for providing in-vehicle services to commuters |
WO2019232440A1 (en) * | 2018-06-01 | 2019-12-05 | Systems And Software Enterprises, Llc | Systems and methods for recommendation system based on implicit feedback |
CN110997418A (en) * | 2017-07-31 | 2020-04-10 | 福特全球技术公司 | Vehicle occupancy management system and method |
CN111523364A (en) * | 2019-02-05 | 2020-08-11 | 丰田自动车株式会社 | Information processing system, readable storage medium, and vehicle |
US20220086345A1 (en) * | 2020-09-11 | 2022-03-17 | Airbus (S.A.S.) | Visual display system and method in an aircraft |
US20230276105A1 (en) * | 2018-03-15 | 2023-08-31 | Saturn Licensing Llc | Information processing apparatus, information processing apparatus, and program |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102689884B1 (en) * | 2018-11-13 | 2024-07-31 | 현대자동차주식회사 | Vehicle and control method for the same |
US11442753B1 (en) * | 2020-10-14 | 2022-09-13 | Wells Fargo Bank, N.A. | Apparatuses, computer-implemented methods, and computer program products for displaying dynamic user interfaces to multiple users on the same interface |
CN113656609A (en) * | 2021-08-13 | 2021-11-16 | 阿波罗智联(北京)科技有限公司 | Method and device for recommending multimedia information, electronic equipment and storage medium |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120083668A1 (en) * | 2010-09-30 | 2012-04-05 | Anantha Pradeep | Systems and methods to modify a characteristic of a user device based on a neurological and/or physiological measurement |
US20140026156A1 (en) * | 2012-07-18 | 2014-01-23 | David Deephanphongs | Determining User Interest Through Detected Physical Indicia |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5715415A (en) * | 1996-06-05 | 1998-02-03 | Microsoft Corporation | Computer application with help pane integrated into workspace |
US20060184800A1 (en) * | 2005-02-16 | 2006-08-17 | Outland Research, Llc | Method and apparatus for using age and/or gender recognition techniques to customize a user interface |
US20100131983A1 (en) * | 2006-09-29 | 2010-05-27 | Steve Shannon | Systems and methods for a modular media guidance dashboard application |
US20090138805A1 (en) * | 2007-11-21 | 2009-05-28 | Gesturetek, Inc. | Media preferences |
US20130063612A1 (en) * | 2011-09-12 | 2013-03-14 | Howard Isham Royster | In-flight system |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190020923A1 (en) * | 2015-12-31 | 2019-01-17 | Thomson Licensing | Method and apparatus for inhibiting the interruption of content being consumed by a user |
US10616653B2 (en) * | 2015-12-31 | 2020-04-07 | Interdigital Ce Patent Holdings | Method and apparatus for inhibiting the interruption of content being consumed by a user |
US9864559B2 (en) * | 2016-03-29 | 2018-01-09 | Panasonic Avionics Corporation | Virtual window display system |
CN110997418A (en) * | 2017-07-31 | 2020-04-10 | 福特全球技术公司 | Vehicle occupancy management system and method |
US11479147B2 (en) * | 2017-07-31 | 2022-10-25 | Ford Global Technologies, Llc | Vehicle occupancy management systems and methods |
US20190208254A1 (en) * | 2017-12-29 | 2019-07-04 | ANI Technologies Private Limited | System and method for providing in-vehicle services to commuters |
US10560736B2 (en) * | 2017-12-29 | 2020-02-11 | ANI Technologies Private Limited | System and method for providing in-vehicle services to commuters |
US20230276105A1 (en) * | 2018-03-15 | 2023-08-31 | Saturn Licensing Llc | Information processing apparatus, information processing apparatus, and program |
WO2019232440A1 (en) * | 2018-06-01 | 2019-12-05 | Systems And Software Enterprises, Llc | Systems and methods for recommendation system based on implicit feedback |
CN111523364A (en) * | 2019-02-05 | 2020-08-11 | 丰田自动车株式会社 | Information processing system, readable storage medium, and vehicle |
US20220086345A1 (en) * | 2020-09-11 | 2022-03-17 | Airbus (S.A.S.) | Visual display system and method in an aircraft |
US11785343B2 (en) * | 2020-09-11 | 2023-10-10 | Airbus (S.A.S.) | Visual display system and method in an aircraft |
Also Published As
Publication number | Publication date |
---|---|
US10306294B2 (en) | 2019-05-28 |
US20180131992A1 (en) | 2018-05-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10306294B2 (en) | User centric adaptation of vehicle entertainment system user interfaces | |
US9852355B2 (en) | Facial analysis for vehicle entertainment system metrics | |
US10757451B2 (en) | User centric service and content curation through in-flight entertainment system | |
US10893318B2 (en) | Aircraft entertainment systems with chatroom server | |
US12084045B2 (en) | Systems and methods for operating a vehicle based on sensor data | |
US20200238933A1 (en) | Recommendation and selection of personalized output actions in a vehicle | |
US20190028847A1 (en) | Methods and systems for encouraging behaviour while occupying vehicles | |
US11518517B2 (en) | Predictive preference selection for in-vehicle entertainment systems | |
US10659470B2 (en) | Methods and systems for establishing communication with users based on biometric data | |
GB2538339A (en) | Travel environment control | |
US11329942B2 (en) | Methods, systems, and media for presenting messages related to notifications | |
US10911832B2 (en) | Methods, systems, and media for facilitating interaction between viewers of a stream of content | |
US10992620B2 (en) | Methods, systems, and media for generating a notification in connection with a video content item | |
US20200342040A1 (en) | Onboard entertainment systems and methods | |
US20220020053A1 (en) | Apparatus, systems and methods for acquiring commentary about a media content event | |
US9542567B2 (en) | Methods and systems for enabling media guidance application operations based on biometric data | |
US20220360641A1 (en) | Dynamic time-based playback of content in a vehicle | |
US11169664B2 (en) | Interactive mapping for passengers in commercial passenger vehicle | |
CN115309285A (en) | Method and device for controlling display and mobile carrier | |
US20220174131A1 (en) | Variable-intensity immersion for extended reality media | |
US20230335138A1 (en) | Onboard aircraft system with artificial human interface to assist passengers and/or crew members | |
CN113656609A (en) | Method and device for recommending multimedia information, electronic equipment and storage medium | |
US11910043B2 (en) | Engagement measurement in in-flight entertainment systems | |
CN106062804B (en) | Initiating an activity using a badge |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: THALES AVIONICS, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:COULEAUD, JEAN-YVES;DECUIR, TRACY;REEL/FRAME:035915/0821 Effective date: 20150626 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |